Compare commits

...

252 Commits

Author SHA1 Message Date
Gregory Becker
17c3d6ab19 colima wip 2023-05-30 16:11:36 -07:00
Todd Gamblin
e1bcefd805 Update CHANGELOG.md for v0.20.0 2023-05-21 01:48:34 +02:00
Manuela Kuhn
2159b0183d py-argcomplete: add 3.0.8 (#37797)
* py-argcomplete: add 3.0.8

* Update var/spack/repos/builtin/packages/py-argcomplete/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* [@spackbot] updating style on behalf of manuelakuhn

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-05-20 11:31:28 -05:00
kwryankrattiger
078fd225a9 Mochi-Margo: Add patch for pthreads detection (#36109) 2023-05-20 07:43:27 -07:00
Manuela Kuhn
83974828c7 libtiff: disable use of sphinx (#37803) 2023-05-19 21:37:19 -05:00
Manuela Kuhn
2412f74557 py-anyio: add 3.6.2 (#37796) 2023-05-19 21:36:39 -05:00
Manuela Kuhn
db06d3621d py-alabaster: add 0.7.13 (#37798) 2023-05-19 21:34:00 -05:00
Jose E. Roman
c25170d2f9 New patch release SLEPc 3.19.1 (#37675)
* New patch release SLEPc 3.19.1

* py-slepc4py: add explicit dependency on py-numpy
2023-05-19 21:33:21 -05:00
Vanessasaurus
b3dfe13670 Automated deployment to update package flux-security 2023-05-16 (#37696)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-05-19 12:07:57 -07:00
Harmen Stoppels
6358e84b48 fix binutils dep of spack itself (#37738) 2023-05-19 12:02:36 -07:00
Swann Perarnau
8e634d8e49 aml: v0.2.1 (#37621)
* aml: v0.2.1

* add version 0.2.1
* fix hip variant bug

* [fix] pkgconf required for all builds

On top of needing pkgconf for autoreconf builds, the release configure
script needs pkgconf to detect dependencies if any of the hwloc, ze, or
opencl variants are active.

* Remove deprecation for v0.2.0 based on PR advice.
2023-05-19 11:58:28 -07:00
Mark W. Krentel
1a21376515 intel-xed: add version 2023.04.16 (#37582)
* intel-xed: add version 2023.04.16
 1. add version 2023.04.16
 2. adjust the mbuild resource to better match the xed version at the time
 3. replace three conflicts() with one new requires() for x86_64 target
 4. add patch for libxed-ild for some new avx512 instructions
    * [@spackbot] updating style on behalf of mwkrentel
    * Fix the build for 2023.04.16.  XED requires its source directory to be exactly 'xed', so add a symlink.
 5. move the mbuild resource up one level, xed wants it to be in the same directory as the xed source dir
 6. deprecate 10.2019.03
    * semantic style fix: add OSError to except
    * [@spackbot] updating style on behalf of mwkrentel

---------

Co-authored-by: mwkrentel <mwkrentel@users.noreply.github.com>
2023-05-19 10:28:18 -07:00
Harmen Stoppels
bf45a2b6d3 spack env create: generate a view when newly created env has concrete specs (#37799) 2023-05-19 18:44:54 +02:00
Thomas-Ulrich
475ce955e7 hipsycl: add v0.9.4 (#37247) 2023-05-19 18:29:45 +02:00
Robert Underwood
5e44289787 updates for the libpressio ecosystem (#37764)
* updates for the libpressio ecosystem

* [@spackbot] updating style on behalf of robertu94

* style fix: remove FIXME

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-05-19 09:24:31 -07:00
Massimiliano Culpo
e66888511f archspec: fix entry in the JSON file (#37793) 2023-05-19 09:57:57 -04:00
Tamara Dahlgren
e9e5beee1f fortrilinos: convert to new stand-alone test process (#37783) 2023-05-19 08:06:33 -04:00
Tamara Dahlgren
ffd134c09d formetis: converted to new stand-alone test process (#37785) 2023-05-19 08:05:23 -04:00
Massimiliano Culpo
bfadd5c9a5 lmod: allow core compiler to be specified with a version range (#37789)
Use CompilerSpec with satisfies instead of string equality tests

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-05-19 13:21:40 +02:00
Greg Becker
16e9279420 compiler specs: do not print '@=' when clear from context (#37787)
Ensure that spack compiler add/find/list and lists of concrete specs
print the compiler effectively as {compiler.name}{@compiler.version}.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-05-19 11:31:27 +02:00
Satish Balay
ac0903ef9f llvm: add version 16.0.3 (#37472) 2023-05-19 03:37:57 -04:00
Cyrus Harrison
648839dffd add conduit 0.8.8 release (#37776) 2023-05-19 00:34:19 -04:00
Pieter Ghysels
489a604920 Add STRUMPACK versions 7.1.2 and 7.1.3 (#37779) 2023-05-18 23:18:50 -04:00
eugeneswalker
2ac3435810 legion +rocm: apply patch for --offload-arch (#37775)
* legion +rocm: apply patch for --offload-arch

* constrain to latest version
2023-05-18 23:03:50 -04:00
Alec Scott
69ea180d26 fzf: add v0.40.0 and refactor package (#37569)
* fzf: add v0.40.0 and refactor package
* Remove unused imports
2023-05-18 15:23:20 -07:00
Alec Scott
f52f217df0 roctracer-dev-api: add v5.5.0 (#37484) 2023-05-18 15:11:36 -07:00
Alec Scott
df74aa5d7e amqp-cpp: add v4.3.24 (#37504) 2023-05-18 15:09:30 -07:00
Alec Scott
41932c53ae libjwt: add v1.15.3 (#37521) 2023-05-18 15:05:27 -07:00
Alec Scott
4296db794f rdkit: add v2023_03_1 (#37529) 2023-05-18 15:05:07 -07:00
H. Joe Lee
9ab9302409 py-jarvis-util: add a new package (#37729)
* py-jarvis-util: add a new package

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-05-18 17:28:59 -04:00
Benjamin Meyers
0187376e54 Update py-nltk (#37703)
* Update py-nltk

* [@spackbot] updating style on behalf of meyersbs

* Update var/spack/repos/builtin/packages/py-nltk/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-05-18 17:14:20 -04:00
Aditya Bhamidipati
7340d2cb83 update package nccl version 2.17, 2.18 (#37721) 2023-05-18 15:47:01 -05:00
Benjamin Meyers
641d4477d5 New package py-tensorly (#37705) 2023-05-18 15:38:23 -05:00
Benjamin Meyers
3ff2fb69af New package py-scikit-tensor-py3 (#37706) 2023-05-18 15:37:43 -05:00
vucoda
e3024b1bcb Update py-jedi 0.17.2 checksum in package.py (#37700)
The PyPI checksum for jedi-0.17.2.tar.gz does not match the one in the package.py file.

https://pypi.org/project/jedi/0.17.2/#copy-hash-modal-60e943e3-1192-4b12-90e2-4d639cb5b4f7
2023-05-18 15:15:47 -05:00
Dom Heinzeller
e733b87865 Remove references to gmake executable, only use make (#37280) 2023-05-18 19:03:03 +00:00
Victor Lopez Herrero
919985dc1b dlb: add v3.3.1 (#37759) 2023-05-18 10:29:41 -07:00
Michael Kuhn
d746f7d427 harfbuzz: add 7.3.0 (#37753)
Do not prefer 5.3.1 anymore since it does not build with newer compilers. Linux distributions have also moved to 6.x and newer.
2023-05-18 10:27:48 -07:00
kaanolgu
b6deab515b [babelstream] FIX: maintainers list missing a comma (#37761) 2023-05-18 10:25:04 -07:00
Glenn Johnson
848220c4ba update R: add version 4.3.0 (#37090) 2023-05-18 10:13:28 -07:00
Glenn Johnson
98462bd27e Cran updates (#37296)
* add version 1.28.3 to r-hexbin
* add version 5.0-1 to r-hmisc
* add version 0.5.5 to r-htmltools
* add version 1.6.2 to r-htmlwidgets
* add version 1.4.2 to r-igraph
* add version 0.42.19 to r-imager
* add version 1.0-5 to r-inum
* add version 0.9-14 to r-ipred
* add version 1.3.2 to r-irkernel
* add version 2.2.0 to r-janitor
* add version 0.1-10 to r-jpeg
* add version 1.2.2 to r-jsonify
* add version 0.9-32 to r-kernlab
* add version 1.7-2 to r-klar
* add version 1.42 to r-knitr
* add version 1.14.0 to r-ks
* add version 2.11.0 to r-labelled
* add version 1.7.2.1 to r-lava
* add version 0.6-15 to r-lavaan
* add version 2.1.2 to r-leaflet
* add version 2.9-0 to r-lfe
* add version 1.1.6 to r-lhs
* add version 1.1-33 to r-lme4
* add version 1.5-9.7 to r-locfit
* add version 0.4.3 to r-log4r
* add version 5.6.18 to r-lpsolve
* add version 0.2-11 to r-lwgeom
* add version 2.7.4 to r-magick
* add version 1.22.1 to r-maldiquant
* add version 1.2.11 to r-mapproj
* add version 1.6 to r-markdown
* add version 7.3-59 to r-mass
* add version 1.5-4 to r-matrix
* add version 0.63.0 to r-matrixstats
* add version 4.2-3 to r-memuse
* add version 4.0-0 to r-metafor
* add version 1.8-42 to r-mgcv
* add version 3.15.0 to r-mice
* add version 0.4-5 to r-mitml
* add version 2.0.0 to r-mixtools
* add version 0.1.11 to r-modelr
* add version 1.4-23 to r-multcomp
* add version 0.1-9 to r-multcompview
* add version 0.1-13 to r-mutoss
* add version 1.18.1 to r-network
* add version 3.3.4 to r-nleqslv
* add version 3.1-162 to r-nlme
* add version 0.26 to r-nmf
* add version 0.60-17 to r-np
* add version 4.2.5.2 to r-openxlsx
* add version 2022.11-16 to r-ordinal
* add version 0.6.0.8 to r-osqp
* add version 0.9.1 to r-packrat
* add version 1.35.0 to r-parallelly
* add version 1.3-13 to r-party
* add version 1.2-20 to r-partykit
* add version 1.7-0 to r-pbapply
* add version 0.3-9 to r-pbdzmq
* add version 1.2 to r-pegas
* add version 1.5-1 to r-phytools
* add version 1.9.0 to r-pillar
* add version 1.4.0 to r-pkgbuild
* add version 2.1.0 to r-pkgcache
* add version 0.5.0 to r-pkgdepends
* add version 2.0.7 to r-pkgdown
* add version 1.3.2 to r-pkgload
* add version 0.1-8 to r-png
* add version 1.1.22 to r-polspline
* add version 1.0.1 to r-pool
* add version 1.4.1 to r-posterior
* add version 3.8.1 to r-processx
* add version 2023.03.31 to r-prodlim
* add version 1.0-12 to r-proj4
* add version 2.5.0 to r-projpred
* add version 0.1.6 to r-pryr
* add version 1.7.5 to r-ps
* add version 1.0.1 to r-purrr
* add version 1.3.2 to r-qqconf
* add version 0.25.5 to r-qs
* add version 1.60 to r-qtl
* add version 0.4.22 to r-quantmod
* add version 5.95 to r-quantreg
* add version 0.7.8 to r-questionr
* add version 1.2.5 to r-ragg
* add version 0.15.1 to r-ranger
* add version 3.6-20 to r-raster
* add version 2.2.13 to r-rbibutils
* add version 1.0.10 to r-rcpp
* add version 0.12.2.0.0 to r-rcpparmadillo
* add version 0.1.7 to r-rcppde
* add version 0.3.13 to r-rcppgsl
* add version 1.98-1.12 to r-rcurl
* add version 1.2-1 to r-rda
* add version 2.1.4 to r-readr
* add version 1.4.2 to r-readxl
* add version 1.0.6 to r-recipes
* add version 1.1.6 to r-repr
* add version 1.2.16 to r-reproducible
* add version 0.3.0 to r-require
* add version 1.28 to r-reticulate
* add version 2.0.7 to r-rfast
* add version 1.6-6 to r-rgdal
* add version 0.6-2 to r-rgeos
* add version 1.1.3 to r-rgl
* add version 0.2.18 to r-rinside
* add version 4-14 to r-rjags
* add version 1.3-1.8 to r-rjsonio
* add version 2.21 to r-rmarkdown
* add version 0.9-2 to r-rmpfr
* add version 0.7-1 to r-rmpi
* add version 6.6-0 to r-rms
* add version 0.10.25 to r-rmysql
* add version 0.8.7 to r-rncl
* add version 2.4.11 to r-rnexml
* add version 0.95-1 to r-robustbase
* add version 1.3-20 to r-rodbc
* add version 7.2.3 to r-roxygen2
* add version 1.4.5 to r-rpostgres
* add version 0.7-5 to r-rpostgresql
* add version 0.8.29 to r-rsconnect
* add version 0.4-15 to r-rsnns
* add version 2.3.1 to r-rsqlite
* add version 0.7.2 to r-rstatix
* add version 1.1.2 to r-s2
* add version 0.4.5 to r-sass
* add version 0.1.9 to r-scatterpie
* add version 0.3-43 to r-scatterplot3d
* add version 3.2.4 to r-scs
* add version 1.6-4 to r-segmented
* add version 4.2-30 to r-seqinr
* add version 0.26 to r-servr
* add version 4.3.0 to r-seurat
* add version 1.0-12 to r-sf
* add version 0.4.2 to r-sfheaders
* add version 1.1-15 to r-sfsmisc
* add version 1.7.4 to r-shiny
* add version 1.9.0 to r-signac
* add version 1.6.0.3 to r-smoof
* add version 0.1.7-1 to r-sourcetools
* add version 1.6-0 to r-sp
* add version 1.3-0 to r-spacetime
* add version 7.3-16 to r-spatial
* add version 2.0-0 to r-spatialeco
* add version 1.2-8 to r-spatialreg
* add version 3.0-5 to r-spatstat
* add version 3.0-1 to r-spatstat-data
* add version 3.1-0 to r-spatstat-explore
* add version 3.1-0 to r-spatstat-geom
* add version 3.1-0 to r-spatstat-linnet
* add version 3.1-4 to r-spatstat-random
* add version 3.0-1 to r-spatstat-sparse
* add version 3.0-2 to r-spatstat-utils
* add version 2.2.2 to r-spdata
* add version 1.2-8 to r-spdep
* add version 0.6-1 to r-stars
* add version 1.5.0 to r-statmod
* add version 4.8.0 to r-statnet-common
* add version 1.7.12 to r-stringi
* add version 1.5.0 to r-stringr
* add version 1.9.1 to r-styler
* add version 3.5-5 to r-survival
* add version 1.5-4 to r-tclust
* add version 1.7-29 to r-terra
* add version 3.1.7 to r-testthat
* add version 1.1-2 to r-th-data
* add version 1.2 to r-tictoc
* add version 1.3.2 to r-tidycensus
* add version 1.2.3 to r-tidygraph
* add version 1.3.0 to r-tidyr
* add version 2.0.0 to r-tidyverse
* add version 0.2.0 to r-timechange
* add version 0.45 to r-tinytex
* add version 0.4.1 to r-triebeard
* add version 1.0-9 to r-truncnorm
* add version 0.10-53 to r-tseries
* add version 0.8-1 to r-units
* add version 4.3.0 to r-v8
* add version 1.4-11 to r-vcd
* add version 1.14.0 to r-vcfr
* add version 0.6.2 to r-vctrs
* add version 1.1-8 to r-vgam
* add version 0.4.0 to r-vioplot
* add version 1.6.1 to r-vroom
* add version 1.72-1 to r-wgcna
* add version 0.4.1 to r-whisker
* add version 0.7.2 to r-wk
* add version 0.39 to r-xfun
* add version 1.7.5.1 to r-xgboost
* add version 1.0.7 to r-xlconnect
* add version 3.99-0.14 to r-xml
* add version 0.13.1 to r-xts
* add version 2.3.7 to r-yaml
* add version 2.3.0 to r-zip
* add version 1.8-12 to r-zoo
* r-bigmem: dependency on uuid
* r-bio3d: dependency on zlib
* r-devtools: dependency cleanup
* r-dose: dependency cleanup
* r-dss: dependency cleanup
* r-enrichplot: dependency cleanup
* r-fgsea: dependency cleanup
* r-geor: dependency cleanup
* r-ggridges: dependency cleanup
* r-lobstr: dependency cleanup
* r-lubridate: dependency cleanup
* r-mnormt: dependency cleanup
* r-sctransform: version format correction
* r-seuratobject: dependency cleanup
* r-tidyselect: dependency cleanup
* r-tweenr: dependency cleanup
* r-uwot: dependency cleanup
* new package: r-clock
* new package: r-conflicted
* new package: r-diagram
* new package: r-doby
* new package: r-httr2
* new package: r-kableextra
* new package: r-mclogit
* new package: r-memisc
* new package: r-spatstat-model
* r-rmysql: use mariadb-client
* r-snpstats: add zlib dependency
* r-qs: add zstd dependency
* r-rcppcnpy: add zlib dependency
* black reformatting
* Revert "r-dose: dependency cleanup"
  This reverts commit 4c8ae8f5615ee124fff01ce43eddd3bb5d06b9bc.
* Revert "r-dss: dependency cleanup"
  This reverts commit a6c5c15c617a9a688fdcfe2b70c501c3520d4706.
* Revert "r-enrichplot: dependency cleanup"
  This reverts commit 65e116c18a94d885bc1a0ae667c1ef07d1fe5231.
* Revert "r-fgsea: dependency cleanup"
  This reverts commit ffe2cdcd1f73f69d66167b941970ede0281b56d7.
* r-rda: this package is back in CRAN
* r-sctransform: fix copyright
* r-seurat: fix copyright
* r-seuratobject: fix copyright
* Revert "add version 6.0-94 to r-caret"
  This reverts commit 236260597de97a800bfc699aec1cd1d0e3d1ac60.
* add version 6.0-94 to r-caret
* Revert "add version 1.8.5 to r-emmeans"
  This reverts commit 64a129beb0bd88d5c88fab564cade16c03b956ec.
* add version 1.8.5 to r-emmeans
* Revert "add version 5.0-1 to r-hmisc"
  This reverts commit 517643f4fd8793747365dfcfc264b894d2f783bd.
* add version 5.0-1 to r-hmisc
* Revert "add version 1.42 to r-knitr"
  This reverts commit 2a0d9a4c1f0ba173f7423fed59ba725bac902c37.
* add version 1.42 to r-knitr
* Revert "add version 1.6 to r-markdown"
  This reverts commit 4b5565844b5704559b819d2e775fe8dec625af99.
* add version 1.6 to r-markdown
* Revert "add version 0.26 to r-nmf"
  This reverts commit 4c44a788b17848f2cda67b32312a342c0261caec.
* add version 0.26 to r-nmf
* Revert "add version 2.3.1 to r-rsqlite"
  This reverts commit 5722ee2297276e4db8beee461d39014b0b17e420.
* add version 2.3.1 to r-rsqlite
* Revert "add version 1.0-12 to r-sf"
  This reverts commit ee1734fd62cc02ca7a9359a87ed734f190575f69.
* add version 1.0-12 to r-sf
* fix syntax error
2023-05-18 09:57:43 -07:00
Cameron Stanavige
2e2515266d unifyfs: new v1.1 release (#37756)
Add v1.1 release
Update compatible versions for the mochi-margo dependency
Update the version range of the libfabric conflict
2023-05-18 09:42:27 -07:00
Chris Green
776ab13276 [xrootd] New variants, new version, improve build config (#37682)
* Add FNAL Spack team to maintainers

* New variants and configuration improvements

* Version dependent "no-systemd" patches.

* New variants `client_only`, and `davix`

* Better handling of `cxxstd` for different versions, including
  improved patching and CMake options.

* Version-specific CMake requirements.

* Better version-specific handling of `openssl` dependency.

* `py-setuptools` required for `+python` build.

* Specific enable/disable of CMake options and use of
  `-DFORCE_ENABLED=TRUE` to prevent unwanted/non-portable activation
  of features.

* Better handling of `+python` configuration.

* New version 5.5.5
2023-05-18 10:49:18 -05:00
Massimiliano Culpo
c2ce9a6d93 Bump Spack version on develop to 0.21.0.dev0 (#37760) 2023-05-18 12:47:55 +02:00
Peter Scheibel
4e3ed56dfa Bugfix: allow preferred new versions from externals (#37747) 2023-05-18 09:40:26 +02:00
Tamara Dahlgren
dcfcc03497 maintainers: switch from list to directive (#37752) 2023-05-17 22:25:57 +00:00
Stephen Sachs
125c20bc06 Add aws-pcluster[-aarch64] stacks (#37627)
Add aws-pcluster[-aarch64] stacks.  These stacks build packages defined in
https://github.com/spack/spack-configs/tree/main/AWS/parallelcluster

They use a custom container from https://github.com/spack/gitlab-runners which
includes the necessary ParallelCluster software to link and build, as well as an
upstream spack installation with current GCC and dependencies.

Intel and ARM software is installed and used during the build stage but removed
from the buildcache before the signing stage.

Files `configs/linux/{arch}/ci.yaml` select the necessary providers in order to
build for specific architectures (icelake, skylake, neoverse_{n,v}1).
2023-05-17 16:21:10 -06:00
Brian Van Essen
f7696a4480 Added version 1.3.1 (#37735) 2023-05-17 14:51:02 -07:00
Harmen Stoppels
a5d7667cb6 lmod: fix build, bump patch version (#37744) 2023-05-17 13:18:02 -04:00
Massimiliano Culpo
d45818ccff Limit deepcopy to just the initial "all" section (#37718)
Modifications:
- [x] Limit the scope of the deepcopy when initializing module file writers
2023-05-17 10:17:41 -07:00
Scott Wittenburg
bcb7af6eb3 gitlab ci: no copy-only pipelines w/ deprecated config (#37720)
Make it clear that copy-only pipelines are not supported while still
using the deprecated ci config format. Also ensure that the deprecated
stack does not fail on spack pipelines for tags.
2023-05-17 09:46:30 -06:00
Juan Miguel Carceller
f438fb6c79 whizard: build newer versions in parallel (#37422) 2023-05-17 17:15:50 +02:00
Harmen Stoppels
371a8a361a libxcb: depend on python, remove releases that need python 2 (#37698) 2023-05-17 17:05:30 +02:00
Tamara Dahlgren
86b9ce1c88 spack test: fix stand-alone test suite status reporting (#37602)
* Fix reporting of packageless specs as having no tests

* Add test_test_output_multiple_specs with update to simple-standalone-test (and tests)

* Refactored test status summary; added more tests or checks
2023-05-17 16:03:21 +02:00
Seth R. Johnson
05232034f5 celeritas: new version 0.2.2 (#37731)
* celeritas: new version 0.2.2

* [@spackbot] updating style on behalf of sethrj
2023-05-17 05:38:09 -04:00
Peter Scheibel
7a3da0f606 Tk/Tcl packages: speed up file search (#35902) 2023-05-17 09:27:05 +02:00
Yoshiaki Senda
d96406a161 Add recently added Spack Docker Images to documentation (#37732)
Signed-off-by: Yoshiaki Senda <yoshiaki@live.it>
2023-05-17 08:48:27 +02:00
Tamara Dahlgren
ffa5962356 emacs: convert to new stand-alone test process (#37725) 2023-05-17 00:25:35 -04:00
Massimiliano Culpo
67e74da3ba Fix spack find not able to display version ranges in compilers (#37715) 2023-05-17 00:24:38 -04:00
Chris Green
9ee2d79de1 libxpm package: fix RHEL8 build with libintl (#37713)
Set LDFLAGS rather than LDLIBS
2023-05-16 13:32:26 -05:00
John W. Parent
79e4a13eee Windows: fix MSVC version handling (#37711)
MSVC compiler logic was using string parsing to extract the version
from the compiler spec, which was fragile. This broke in #37572, so it has
been fixed and made more robust by using attribute access.
2023-05-16 11:00:55 -07:00
kwryankrattiger
4627438373 CI: Expand E4S ROCm stack to include missing DaV packages (#36843)
* CI: Expand E4S ROCm stack to include missing DaV packages

Ascent: Fixup for VTK-m with Kokkos backend

* DaV SDK: Removed duplicated openmp variant for ascent

* Drop visit and add conflict for Kokkos

* E4S: Drop ascent from CUDA builds
2023-05-16 09:34:52 -05:00
Harmen Stoppels
badaaf7092 gha rhel8-platform-python: configure git safe.directory (#37708) 2023-05-16 16:31:13 +02:00
Harmen Stoppels
815ac000cc Revert "hdf5: fix showconfig (#34920)" (#37707)
This reverts commit 192e564e26.
2023-05-16 15:57:15 +02:00
Peter Scheibel
7bc5b26c52 Requirements and preferences should not define (non-git) versions (#37687)
Ensure that requirements `packages:*:require:@x` and preferences `packages:*:version:[x]`
fail concretization when no version defined in the package satisfies `x`. This always holds
except for git versions -- they are defined on the fly.
2023-05-16 15:45:11 +02:00
Harmen Stoppels
a0e7ca94b2 gha bootstrap-dev-rhel8: configure git safe.directory (#37702)
git has been updated to something more recent
2023-05-16 15:21:42 +02:00
Harmen Stoppels
e56c90d839 check_modules_set_name: do not check for "enable" key (#37701) 2023-05-16 11:51:52 +02:00
Ye Luo
54003d4d72 Update llvm recipe regarding libomptarget. (#36675) 2023-05-16 11:20:02 +02:00
QuellynSnead
c47b554fa1 libxcb/xcb-proto: Enable internal Python dependency (#37575)
In the past, Spack did not allow two different versions of the
same package within a DAG. That led to difficulties with packages
that still required Python 2 while other packages had already
switched to Python 3.

The libxcb and xcb-proto packages did not have Python 3 support
for a time. To get around this issue, Spack maintainers disabled
their dependency on an internal (i.e., Spack-provided) Python
(see #4145), forcing these packages to look for a system-provided
Python (see #7646).

This has worked for us all right, but with the arrival of our most
recent platform we seem to be missing the critical xcbgen Python
module on the system. Since most software has largely moved on to
Python 3 now, let's re-enable internal Spack dependencies for the
libxcb and xcb-proto packages.
2023-05-16 10:00:01 +02:00
Mikael Simberg
b027f64a7f Add conflict for pika with fmt@10 and +cuda/rocm (#37679) 2023-05-16 09:24:02 +02:00
Greg Becker
3765a5f7f8 unify: when_possible and unify: true -- Bugfix for error in 37438 (#37681)
Two bugs came in from #37438

1. `unify: when_possible` was broken because of an incorrect assertion: abstract/concrete
   spec pairs were compared against the results that were in the process of being computed,
   rather than against the previous results.
2. `unify: true` had an ordering bug that could mix the association between abstract and
   concrete specs

- [x] 1 is resolved by creating a lookup from old concrete specs to old abstract specs,
      and we use that to associate the "new" concrete specs that happen to be the old
      ones with their abstract specs (since those are stripped out for concretization).
- [x] 2 is resolved by combining the new and old abstract as lists instead of combining
      them as sets. This is important because `set() | set()` does not make any ordering
      promises, even though set ordering is otherwise guaranteed in `python@3.7:`
2023-05-16 01:08:34 -04:00
Robert Blake
690661eadd Upgrading kosh to 3.0 (#37471)
* Upgrading kosh to 3.0.

* Accidentally regressed the package, changing back.

* Updating py-hdbscan versions for kosh.

* Fixing bug in patch.

* Adding 3.0.1

* Removing 3.0.

* Updating package deps for hdbscan to match requirements.txt.

* Version reqs for 3.0.*, need newer numpy and networkx

* spack style

* Reordering to match setup.py, adding "type" to python depends.
2023-05-16 01:08:20 -04:00
eugeneswalker
f7bbc326e4 trilinos: @develop fixes (#37615)
* trilinos@develop fixes

* Update var/spack/repos/builtin/packages/trilinos/package.py

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

---------

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2023-05-15 17:25:14 -07:00
Scott Wittenburg
a184bfc1a6 gitlab ci: reduce job name length of build_systems pipeline (#37686) 2023-05-16 00:26:37 +02:00
Alec Scott
81634440fb circos: add v0.69-9 (#37479) 2023-05-15 14:43:44 -07:00
Alec Scott
711d7683ac alluxio: add v2.9.3 (#37488) 2023-05-15 14:42:48 -07:00
Alec Scott
967356bcf5 codec2: add v1.1.0 (#37480) 2023-05-15 14:42:08 -07:00
Alec Scott
c006ed034a coinutils: add v2.11.9 (#37481) 2023-05-15 14:41:25 -07:00
Alec Scott
d065c65d94 g2c: add v1.7.0 (#37482) 2023-05-15 14:40:41 -07:00
Alec Scott
e23c372ff1 shadow: add v4.13 (#37485) 2023-05-15 14:38:32 -07:00
Alec Scott
25d2de5629 yoda: add v1.9.8 (#37487) 2023-05-15 14:37:31 -07:00
Alec Scott
d73a23ce35 cpp-httplib: add v0.12.3 (#37490) 2023-05-15 14:35:32 -07:00
Alec Scott
a62cb3c0f4 entt: add v3.11.1 (#37491) 2023-05-15 14:34:47 -07:00
Alec Scott
177da4595e harfbuzz: add v7.2.0 (#37492) 2023-05-15 14:34:06 -07:00
Alec Scott
e4f05129fe libconfuse: add v3.3 (#37493) 2023-05-15 14:33:19 -07:00
Alec Scott
c25b994917 libnsl: add v2.0.0 (#37494) 2023-05-15 14:32:49 -07:00
Alec Scott
95c4c5270a p11-kit: add v0.24.1 (#37495) 2023-05-15 14:31:43 -07:00
Alec Scott
1cf6a15a08 packmol: add v20.0.0 (#37496)
* packmol: add v20.0.0
* Fix zoltan homepage url
2023-05-15 14:29:01 -07:00
Alec Scott
47d206611a perl-module-build-tiny: add v0.044 (#37497) 2023-05-15 14:25:53 -07:00
Alec Scott
a6789cf653 zoltan: add v3.901 (#37498) 2023-05-15 14:25:00 -07:00
Alec Scott
933cd858e0 bdii: add v6.0.1 (#37499) 2023-05-15 14:24:14 -07:00
Alec Scott
8856361076 audit-userspace: add v3.1.1 (#37505) 2023-05-15 14:15:48 -07:00
Alec Scott
d826df7ef6 babl: add v0.1.106 (#37506) 2023-05-15 14:15:28 -07:00
Alec Scott
d8a9b42da6 actsvg: add v0.4.33 (#37503) 2023-05-15 14:14:45 -07:00
Alec Scott
7d926f86e8 bat: add v0.23.0 (#37507) 2023-05-15 14:09:51 -07:00
Alec Scott
1579544d57 beast-tracer: add v1.7.2 (#37508) 2023-05-15 14:09:18 -07:00
Alec Scott
1cee3fb4a5 cronie: add v1.6.1 (#37509) 2023-05-15 14:08:41 -07:00
Alec Scott
a8e2ad53dd cups: add v2.3.3 (#37510) 2023-05-15 14:08:09 -07:00
Alec Scott
6821fa7246 diamond: add v2.1.6 (#37511) 2023-05-15 14:07:31 -07:00
Alec Scott
09c68da1bd dust: add v0.8.6 (#37513) 2023-05-15 14:06:31 -07:00
Alec Scott
73064d62cf f3d: add v2.0.0 (#37514) 2023-05-15 14:05:37 -07:00
Alec Scott
168ed2a782 fullock: add v1.0.50 (#37515) 2023-05-15 14:02:36 -07:00
Alec Scott
9f60b29495 graphviz: add v8.0.5 (#37517) 2023-05-15 14:00:50 -07:00
Alec Scott
7abcd78426 krakenuniq: add v1.0.4 (#37519) 2023-05-15 13:59:15 -07:00
Alec Scott
d5295301de libfyaml: add v0.8 (#37520) 2023-05-15 13:58:13 -07:00
Alec Scott
beccc49b81 libluv: add v1.44.2-1 (#37522) 2023-05-15 13:55:57 -07:00
Alec Scott
037e7ffe33 libvterm: add v0.3.1 (#37524) 2023-05-15 13:54:15 -07:00
Alec Scott
293da8ed20 lighttpd: add v1.4.69 (#37525) 2023-05-15 13:53:30 -07:00
Alec Scott
2780ab2f6c mrchem: add v1.1.2 (#37526) 2023-05-15 13:51:54 -07:00
Alec Scott
1ed3c81b58 mutationpp: add v1.0.5 (#37527) 2023-05-15 13:50:48 -07:00
Alec Scott
50ce0a25b2 preseq: add v2.0.3 (#37528) 2023-05-15 13:49:47 -07:00
Alec Scott
d784227603 shtools: add v4.10.2 (#37530) 2023-05-15 13:47:15 -07:00
Alec Scott
ab9ed91539 tig: add v2.5.8 (#37531) 2023-05-15 13:46:03 -07:00
Alec Scott
421256063e trimgalore: add v0.6.9 (#37532) 2023-05-15 13:44:44 -07:00
Alec Scott
75459bc70c vdt: add v0.4.4 (#37533) 2023-05-15 13:43:40 -07:00
Carson Woods
33752eabb8 Improve package source code context display on error (#37655)
Spack displays package code context when it shouldn't (e.g., on `FetchError`s)
and doesn't display it when it should (e.g., when errors occur in builder classes).
The line attribution can sometimes be off by one, as well.

- [x] Display package context when errors occur in a subclass of `PackageBase`
- [x] Display package context when errors occur in a subclass of `BaseBuilder`
- [x] Do not display package context when errors occur in `PackageBase`,
      `BaseBuilder` or other core code that is not in a `package.py` file.
- [x] Fix off-by-one error for core code (don't subtract one from the line number *unless*
      it's in an actual `package.py` file).

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2023-05-15 13:38:11 -07:00
Alec Scott
f1d1bb9167 gsl-lite: add v0.41.0 (#37483) 2023-05-15 13:36:03 -07:00
Alec Scott
68eaff24b0 crtm-fix: correct invalid checksum for v2.4.0 (#37500) 2023-05-15 13:34:32 -07:00
Alec Scott
862024cae1 dos2unix: add v7.4.4 (#37512) 2023-05-15 13:31:19 -07:00
Adam J. Stewart
9d6bcd67c3 Update PyTorch ecosystem (#37562) 2023-05-15 13:29:44 -07:00
Chris White
d97ecfe147 SUNDIALS: new version of sundials and guard against examples being installed (#37576)
* add new version of sundials and guard against examples not installing
* fix flipping of variant
* fix directory not being there when writing a file
2023-05-15 13:21:37 -07:00
Alec Scott
0d991de50a subversion: add v1.14.2 (#37543) 2023-05-15 13:16:04 -07:00
Alec Scott
4f278a0255 go: add v1.20.4 (#37660)
* go: add v1.20.4
* Deprecate v1.20.2 and v1.19.7 due to CVE-2023-24538
2023-05-15 13:10:02 -07:00
Chris Green
6e72a3cff1 [davix] Enable third party copies with gSOAP (#37648)
* [davix] Enable third party copies with gSOAP

* Add FNAL Spack team to maintainers
2023-05-15 14:46:52 -05:00
snehring
1532c77ce6 micromamba: adding version 1.4.2 (#37594)
* micromamba: adding version 1.4.2
* micromamba: change to micromamba-1.4.2 tag artifacts
2023-05-15 10:40:54 -07:00
Mikael Simberg
5ffbce275c Add ut (#37603) 2023-05-15 10:35:55 -07:00
Carsten Uphoff
0e2ff2dddb Add double-batched FFT library v0.4.0 (#37616)
Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2023-05-15 10:28:52 -07:00
Mikael Simberg
c0c446a095 stdexec: Add 23.03 (#37638) 2023-05-15 10:20:12 -07:00
snehring
33dbd44449 tmscore: adding new package (#37644) 2023-05-15 10:17:50 -07:00
Sean Koyama
7b0979c1e9 hwloc: explicitly disable building netloc for ~netloc (#35604)
* hwloc: explicitly disable building netloc for ~netloc

* hwloc: update syntax for netloc variant configure argument

---------

Co-authored-by: Sean Koyama <skoyama@anl.gov>
2023-05-15 12:16:21 -05:00
snehring
c9849dd41d tmalign: new version 20220412 (#37645) 2023-05-15 10:14:58 -07:00
Chris Green
d44e97d3f2 [scitokens-cpp] New variant cxxstd, depend on standalone jwt-cpp (#37643)
* Add FNAL Spack team to maintainers
* New variant `cxxstd`
* Depend on `jwt-cpp`
* New versions: 0.7.2, 0.7.3
2023-05-15 13:08:00 -04:00
Adam J. Stewart
8713ab0f67 py-timm: add v0.9 (#37654)
* py-timm: add v0.9
* add v0.9.1 and v0.9.2
* add new package py-safetensors (v0.3.1)
2023-05-15 09:41:58 -07:00
Harmen Stoppels
6a47339bf8 oneapi: before script load modules (#37678) 2023-05-15 18:39:58 +02:00
Alec Scott
1c0fb6d641 amrfinder: add v3.11.8 (#37656) 2023-05-15 09:38:29 -07:00
Alec Scott
b45eee29eb canal: add v1.1.6 (#37657) 2023-05-15 09:36:17 -07:00
Alec Scott
6d26274459 code-server: add v4.12.0 (#37658) 2023-05-15 09:35:17 -07:00
Alec Scott
2fb07de7bc fplll: add v5.4.4 (#37659) 2023-05-15 09:34:13 -07:00
Alec Scott
7678dc6b49 iso-codes: add v4.15.0 (#37661) 2023-05-15 09:27:05 -07:00
Frank Willmore
1944dd55a7 Update package.py for maker (#37662) 2023-05-15 09:25:43 -07:00
Adam J. Stewart
0b6c724743 py-sphinx: add v7.0.1 (#37665) 2023-05-15 09:23:22 -07:00
eugeneswalker
fa98023375 new pkg: py-psana (#37666) 2023-05-15 09:19:54 -07:00
Todd Gamblin
e79a911bac bugfix: allow reuse of packages from foreign namespaces
We currently throw a nasty error if you try to reuse packages from some other namespace
(e.g., OLCF), but we should be able to reuse patched local versions of builtin packages.

Right now the only obstacle to that is that we try to look up virtual info for unknown
namespaces, and we can't get the package from the repo to do that. We *can* assume that
a package with a known namespace is similar, and that its virtual provider information
is reasonably accurate, so we now do that. This isn't 100% accurate, but neither is
relying on the package itself, as it may have gone out of date.

The real solution here is virtual edge information, but this is a stopgap until we have
that.
2023-05-15 09:15:49 -07:00
Todd Gamblin
fd3efc71fd bugfix: don't look up virtual information for unknown packages
`spec_clauses()` attempts to look up package information for concrete specs in order to
determine which virtuals they may provide. This fails for renamed/deleted dependencies
of buildcaches and installed packages.

This will eventually be fixed by #35258, which adds virtual information on edges, but we
need a workaround to make older buildcaches usable.

- [x] make an exception for renamed packages and omit their virtual constraints
- [x] add a note that this will be solved by adding virtuals to edges
2023-05-15 09:15:49 -07:00
Todd Gamblin
0458de18de bugfix: don't look up patches from packages for concrete specs
The concretizer can fail with `reuse:true` if a buildcache or installation contains a
package with a dependency that has been renamed or deleted in the main repo (e.g.,
`netcdf` was refactored to `netcdf-c`, `netcdf-fortran`, etc., but there are still
binary packages with dependencies called `netcdf`).

We should still be able to install things for which we are missing `package.py` files.

`Spec.inject_patches_variant()` was failing this requirement by attempting to look up
the package class for concrete specs.  This isn't needed -- we can skip it.

- [x] swap two conditions in `Spec.inject_patches_variant()`
2023-05-15 09:15:49 -07:00
Vanessasaurus
f94ac8c770 add new package flux-security (#37668)
I will follow this up with a variant for flux-core to add flux-security, and then automation in the flux-framework/spack repository.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2023-05-15 09:14:44 -07:00
Andrew W Elble
a03c28a916 routinator: update, deprecate old version (#37676) 2023-05-15 09:10:40 -07:00
Victor Lopez Herrero
7b7fdf27f3 dlb: add v3.3 (#37677) 2023-05-15 09:08:57 -07:00
Sergey Kosukhin
192e564e26 hdf5: fix showconfig (#34920)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2023-05-15 11:03:03 -05:00
Chris Green
b8c5099cde [jwt-cpp] New package (#37641)
* [jwt-cpp] New package

* Update homepage

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* [@spackbot] updating style on behalf of greenc-FNAL

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
Co-authored-by: greenc-FNAL <greenc-FNAL@users.noreply.github.com>
2023-05-15 10:16:11 -05:00
Stephen Sachs
ea5bca9067 palace: add v0.11.1 and explicit BLAS support (#37605) 2023-05-15 16:11:50 +02:00
Harmen Stoppels
e33eafd34f Bump tutorial command (#37674) 2023-05-15 13:54:52 +02:00
Xavier Delaruelle
e1344b5497 environment-modules: add version 5.3.0 (#37671) 2023-05-15 09:32:53 +02:00
Todd Gamblin
cf9dc3fc81 spack find: get rid of @= in arch/compiler headers (#37672)
The @= in `spack find` output adds a bit of noise. Remove it as we
did for `spack spec` and `spack concretize`.

This modifies display_specs so that it also covers the other places we use that
routine, e.g., `spack buildcache list`.

before:

```
-- linux-ubuntu20.04-aarch64 / gcc@=11.1.0 -----------------------
ofdlcpi libpressio@0.88.0
```

after:

```
-- linux-ubuntu20.04-aarch64 / gcc@11.1.0 -----------------------
ofdlcpi libpressio@0.88.0
```
2023-05-15 09:08:50 +02:00
Bruno Turcksin
d265dd2487 Kokkos: add new release and new architectures (#37650) 2023-05-14 13:21:40 -07:00
Greg Becker
a2a6e65e27 concretizer: don't change concrete environments without --force (#37438)
If a user does not explicitly `--force` the concretization of an entire environment,
Spack will try to reuse the concrete specs that are already in the lockfile.

---------

Co-authored-by: becker33 <becker33@users.noreply.github.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2023-05-14 13:36:03 +02:00
Paul R. C. Kent
0085280db8 gcc: add 12.3.0 (#37553) 2023-05-14 12:08:41 +02:00
Andrew W Elble
6e07bf149d freecad: new package w/ dependencies/updates (#37557)
* freecad: new package w/ dependencies/updates

* review

* symbols/debug variants only when autotools
2023-05-13 21:14:50 -05:00
dale-mittleman
811cd5e7ef Adding librdkafka versions 1.9.2, 2.0.2 (#37501)
Co-authored-by: Alec Scott <hi@alecbcs.com>
2023-05-13 16:00:12 -07:00
Adam J. Stewart
081e21f55e py-lightly: py-torch~distributed supported in next release (#37558) 2023-05-13 15:49:45 -07:00
Todd Gamblin
c5a24675a1 spack spec: remove noisy @= from output (#37663)
@= is accurate, but noisy. Other UI commands tend not to
print the redundant `@=` for known concrete versions;
make `spack spec` consistent with them.
2023-05-13 11:34:15 -07:00
eugeneswalker
e9bfe5cd35 new pkg: py-psmon (#37652) 2023-05-13 11:16:44 -07:00
eugeneswalker
ca84c96478 new pkg: py-psalg (#37653) 2023-05-13 09:02:57 -07:00
Chris Green
c9a790bce9 [gsoap] New package gSOAP (#37647) 2023-05-13 11:01:50 -05:00
eugeneswalker
91c5b4aeb0 e4s ci stacks: add: hdf5-vol-{log,cache} (#37651) 2023-05-13 04:54:44 +00:00
Larry Knox
c2968b4d8c Add HDF5 version 1.14.1 (#37579)
* Add HDF5 version 1.14.1
* Update to version HDF5 1.14.1-2.
2023-05-12 20:54:08 -04:00
Scott Wittenburg
c08be95d5e gitlab ci: release fixes and improvements (#37601)
* gitlab ci: release fixes and improvements

  - use rules to reduce boilerplate in .gitlab-ci.yml
  - support copy-only pipeline jobs
  - make pipelines for release branches rebuild everything
  - make pipelines for protected tags copy-only

* gitlab ci: remove url changes used in testing

* gitlab ci: tag mirrors need public key

Make sure that mirrors associated with release branches and tags
contain the public key needed to verify the signed binaries.  This
also ensures that when stack-specific mirror contents are copied
to the root, the root mirror has the public key as well.

* review: be more specific about tags, curl flags

* Make the check in ci.yaml consistent with the .gitlab-ci.yml

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
2023-05-12 15:22:42 -05:00
Lehman Garrison
4e5fb62679 py-asdf: add 2.15.0 and dependencies (#37642)
* py-asdf: add 2.15.0 and dependencies

* py-asdf: PR review
2023-05-12 15:35:22 -04:00
Adam J. Stewart
cafc21c43d py-lightly: add v1.4.5 (#37625) 2023-05-12 11:39:03 -07:00
Adam J. Stewart
72699b43ab py-dill: add v0.3.1.1 (#37415) 2023-05-12 11:37:51 -07:00
MatthewLieber
6c85f59a89 Osu/mvapich2.3.7 1 (#37636)
* add 3.0b release

* adding mvapich2 version 2.3.7-1

---------

Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2023-05-12 11:30:05 -07:00
Nathan Hanford
eef2536055 Allow buildcache specs to be referenced by hash (#35042)
Currently, specs on buildcache mirrors must be referenced by their full description. This PR allows buildcache specs to be referenced by their hashes, rather than their full description.

### How it works

Hash resolution has been moved from `SpecParser` into `Spec`, and now includes the ability to execute a `BinaryCacheQuery` after checking the local store, but before concluding that the hash doesn't exist.

### Side-effects of Proposed Changes

Failures will take longer when nonexistent hashes are parsed, as mirrors will now be scanned.

### Other Changes

- `BinaryCacheIndex.update` has been modified to fail appropriately only when mirrors have been configured.
- Tests of hash failures have been updated to use `mutable_empty_config` so they don't needlessly search mirrors.
- Documentation has been clarified for `BinaryCacheQuery`, and more documentation has been added to the hash resolution functions added to `Spec`.
2023-05-12 10:27:42 -07:00
Massimiliano Culpo
e2ae60a3b0 Update archspec to v0.2.1 (#37633) 2023-05-12 18:59:58 +02:00
Chris Green
d942fd62b5 [root] New version 6.28.04 with C++20 support (#37640)
* Add FNAL Spack team to maintainers.
* New version 6.28/04.
* Support C++20 with ROOT >= 6.28.04.
2023-05-12 09:50:48 -07:00
Andrey Parfenov
99d511d3b0 Add more variants for STREAM to customize build (#37283)
* Added STREAM builds customization

* Changed stream_type to enum

* fix code style issues

Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>

* remove unnecessary optimization

Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>

---------

Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>
Co-authored-by: iermolae <igor.ermolaev@intel.com>
2023-05-12 12:17:59 -04:00
Adam J. Stewart
ab8661533b GDAL: add v3.7.0 (#37598) 2023-05-12 12:13:10 -04:00
Robert Cohn
f423edd526 intel-oneapi-mkl: support gnu openmp (#37637)
* intel-oneapi-mkl: support gnu openmp

* intel-oneapi-mkl: support gnu openmp
2023-05-12 12:03:19 -04:00
Manuela Kuhn
0a4d4da5ce py-rsatoolbox: add 0.0.5, 0.1.0 and 0.1.2 (#37595)
* py-rsatoolbox: add 0.0.5, 0.1.0 and 0.1.2 from wheels

* py-setuptools: add 63.4.3

* remove wheels and open up requirements

* Fix style

* Update var/spack/repos/builtin/packages/py-rsatoolbox/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-rsatoolbox/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Change version for python restriction

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-05-12 10:32:12 -05:00
Manuela Kuhn
845187f270 py-mne: add 1.4.0 and py-importlib-resources: add 5.12.0 (#37624)
* py-mne: add 1.4.0 and py-importlib-resources: add 5.12.0

* Fix style

* Update var/spack/repos/builtin/packages/py-mne/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-05-12 10:31:35 -05:00
Lehman Garrison
9b35c3cdcc Update tensorflow variant defaults to match upstream defaults (#37610)
* Update tensorflow variant defaults to match project's defaults

* Apply code style
2023-05-12 10:27:51 -05:00
Robert Cohn
fe8734cd52 Fix logic in setting oneapi microarchitecture flags (#37634) 2023-05-12 10:58:08 -04:00
Chris Green
40b1aa6b67 [geant4,geant4-data] New version 10.7.4 (#37382) 2023-05-12 15:20:50 +01:00
Eduardo Rothe
ced8ce6c34 cudnn: add versions 8.5.0, 8.6.0, 8.7.0 (#35998) 2023-05-12 07:38:11 -04:00
Tamara Dahlgren
9201b66792 AML: Convert to new stand-alone test process (#35701) 2023-05-12 13:22:11 +02:00
Massimiliano Culpo
fd45839c04 Improve error message for buildcaches (#37626) 2023-05-12 11:55:13 +02:00
Mikael Simberg
2e25db0755 Add pika 0.15.1 (#37628) 2023-05-12 11:45:30 +02:00
Massimiliano Culpo
ebfc706c8c Improve error messages when Spack finds a too new DB / lockfile (#37614)
This PR ensures that we'll get a comprehensible error message whenever an old
version of Spack tries to use a DB or a lockfile that is "too new".

* Fix error message when using a too new DB
* Add a unit-test to ensure we have a comprehensible error message
2023-05-12 08:13:10 +00:00
Steven R. Brandt
644a10ee35 Coastal Codes (#37176)
* Coastal codes installation
* Finished debugging swan.
* Fix formatting errors identified by flake8
* Another attempt to fix formatting.
* Fixed year in header.
* Fixed maintainers syntax and other details from review comments.
* Remove redundant url.

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-05-12 00:57:59 -04:00
snehring
bb96e4c9cc py-pysam: adding version 0.21.0 (#37623)
* py-pysam: adding version 0.21.0

* Update var/spack/repos/builtin/packages/py-pysam/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-05-12 00:47:53 -04:00
Tamara Dahlgren
d204a08aea Install/update the qt dependency (#37600) 2023-05-11 22:58:33 -05:00
Tamara Dahlgren
8e18297cf2 Environments: store spack version/commit in spack.lock (#32801)
Add a section to the lock file to track the Spack version/commit that produced
an environment. This should (eventually) enhance reproducibility, though we
do not currently do anything with the information. It just adds to provenance
at the moment.

Changes include:
- [x] adding the version/commit to `spack.lock`
- [x] refactor `spack.main.get_version()`
- [x] fix a couple of environment lock file-related typos
2023-05-11 23:13:36 -04:00
MatthewLieber
b06d20be19 add 3.0b release (#37599)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2023-05-11 17:10:15 -07:00
Alec Scott
a14d6fe56d gegl: add v0.4.44 (#37516) 2023-05-11 15:52:22 -07:00
eugeneswalker
47ec6a6ae5 e4s ci: trilinos +rocm: enable belos to fix build failure (#37617) 2023-05-11 14:02:20 -07:00
Massimiliano Culpo
5c7dda7e14 Allow using -j to control the parallelism of concretization (#37608)
fixes #29464

This PR allows you to use
```
$ spack concretize -j X
```
to set a cap on the parallelism of concretization from the command line
2023-05-11 13:29:17 -07:00
Dom Heinzeller
0e87243284 libpng package: fix build error on macOS arm64 (#37613)
Turn off ARM NEON support on MacOS arm64

Co-authored-by: Stephen Herbener <stephen.herbener@gmail.com>
2023-05-11 16:27:43 -04:00
Nichols A. Romero
384f5f9960 Update Intel Pin package up to 3.27 (#37470) 2023-05-11 19:06:03 +02:00
Andrey Parfenov
c0f020d021 add openmp_max_threads variant and enable avx 512 optimizations for icelake (#37379)
* add openmp_max_threads variant and enable avx 512 optimizations for icelake and cascadelake

Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>

* revert manual enabling of avx512 for icelake and cascadelake

Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>

---------

Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>
2023-05-11 09:23:04 -05:00
Tamara Dahlgren
dc58449bbf caliper: convert to new stand-alone test process (#35691) 2023-05-11 14:40:02 +02:00
Tamara Dahlgren
d8a72b68dd bricks: convert to new stand-alone test process (#35694) 2023-05-11 14:39:09 +02:00
Mosè Giordano
040c6e486e julia: Fix llvm shlib symbol version for v1.9 (#37606) 2023-05-11 08:22:40 -04:00
Harmen Stoppels
4fa7880b19 lmod: fix CompilerSpec concrete version / range (#37604) 2023-05-11 12:00:07 +02:00
Nisarg Patel
f090b05346 Update providers of virtual packages related to Intel OneAPI (#37412)
* add a virtual dependency name instead of complete package name

* add OneAPI components as providers of virtual packages

* Revert the default of tbb

---------

Co-authored-by: Nisarg Patel <nisarg.patel@lrz.de>
2023-05-11 05:58:24 -04:00
Mikael Simberg
0c69e5a442 Add fmt 10.0.0 (#37591) 2023-05-11 04:57:47 -04:00
Massimiliano Culpo
8da29d1231 Improve the message for errors in package recipes (#37589)
fixes #30355
2023-05-11 10:34:39 +02:00
Massimiliano Culpo
297329f4b5 Improve error message for missing "command" entry in containerize (#37590)
fixes #21242
2023-05-11 10:33:51 +02:00
Mosè Giordano
1b6621a14b julia: Add v1.9.0 (#35631) 2023-05-11 10:30:52 +02:00
Peter Scheibel
bfa54da292 Allow clingo to enforce flags when they appear in requirements (#37584)
Flags are encoded differently from other variants, and they need a choice rule to
ensure clingo has a choice to impose (or not) a constraint.
2023-05-11 09:17:16 +02:00
Jaelyn Litzinger
730ab1574f Upgrade exago's petsc dependency to v3.19.0 (#37092)
* add petsc 3.19 for exago@develop
* simplify version syntax
2023-05-10 18:25:11 -07:00
Harmen Stoppels
2c17c4e632 ci: remove --mirror-url flag (#37457)
The flags --mirror-name / --mirror-url / --directory were deprecated in 
favor of just passing a positional name, url or directory, and letting spack
figure it out.

---------

Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2023-05-10 16:34:29 -06:00
John W. Parent
ec800cccbb Windows: Fix external detection for service accounts (#37293)
Prior to this PR, the HOMEDRIVE environment variable was used to
detect what drive we are operating in. This variable is not available
for service account logins (like what is used for CI), so switch to
extracting the drive from PROGRAMFILES (which is more widely defined).
2023-05-10 18:12:58 -04:00
John W. Parent
85cc9097cb Windows: prefer Python decompression support (#36507)
On Windows, several commonly available system tools for decompression
are unreliable (gz/bz2/xz). This commit refactors `decompressor_for`
to call out to a Windows or Unix-specific method:

* The decompressor_for_nix method behaves the same as before and
  generally treats the Python/system support options for decompression
  as interchangeable (although it avoids using Python's built-in tar
  support since that has had issues with permissions).
* The decompressor_for_win method can only use Python support for
  gz/bz2/xz, although for a tar.gz it does use system support for
  untar (after the decompression step). .zip uses the system tar
  utility, and .Z depends on external support (i.e. that the user
  has installed 7zip).

A naming scheme has been introduced for the various _decompression
methods:

* _system_gunzip means to use a system tool (and fail if it's not
    available)
* _py_gunzip means to use Python's built-in support for decompressing
    .gzip files (and fail if it's not available)
* _gunzip is a method that can do either
2023-05-10 18:07:56 -04:00
snehring
830ee6a1eb py-gtdbtk: adding version 2.3.0 (#37581)
* py-gtdbtk: adding version 2.3.0

* py-gtdbtk: adding missing pydantic dep

* py-gtdbtk: restrict pydantic dep
2023-05-10 16:58:42 -05:00
Alec Scott
0da7b83d0b fd: merge fd-find with fd (#37580) 2023-05-10 14:29:13 -07:00
SoniaScard
f51a4a1ae1 Ophidia-analytics-framework, ophidia-io-server: Work (#36801)
* ophidia-io-server: new package at v1.7
* ophidia-io-server: Fix package
* ophidia-analytics-framework: new package at v1.7
* Fix code style in ophidia-analytics-framework
* Merge
* ophidia-analytics-framework: update package to v1.7.3
* Update package.py
* Fix style

---------

Co-authored-by: SoniaScard <SoniaScard@users.noreply.github.com>
Co-authored-by: Donatello Elia <eldoo@users.noreply.github.com>
2023-05-10 08:38:43 -07:00
H. Joe Lee
e49f10a28e fix(hdf5): h5pfc link failure (#37468)
* fix(hdf5): h5pfc link failure
  develop branch doesn't need linking any more.
  See: acb186f6e5
* [@spackbot] updating style on behalf of hyoklee

---------

Co-authored-by: hyoklee <hyoklee@users.noreply.github.com>
2023-05-10 08:33:27 -07:00
Harmen Stoppels
1d96fdc74a Fix compiler version issues (concrete vs range) (#37572) 2023-05-10 17:26:22 +02:00
Robert Cohn
8eb1829554 intel-oneapi-mkl: add threading support (#37586) 2023-05-10 10:47:57 -04:00
matteo-chesi
e70755f692 cuda: add versions 12.0.1, 12.1.0 and 12.1.1 (#37083) 2023-05-10 15:31:07 +02:00
G-Ragghianti
ebb40ee0d1 New option "--first" for "spack location" (#36283) 2023-05-10 12:26:29 +02:00
Robert Cohn
a2ea30aceb Create include/lib in prefix for oneapi packages (#37552) 2023-05-10 06:25:00 -04:00
Tamara Dahlgren
9a37c8fcb1 Stand-alone testing: make recipe support and processing spack-/pytest-like (#34236)
This is a refactor of Spack's stand-alone test process to be more spack- and pytest-like. 

It is more spack-like in that test parts are no longer "hidden" in a package's run_test()
method, and pytest-like in that any package method whose name starts with test_ 
(i.e., a "test" method) is a test part. We also support the ability to embed test parts in a
test method when that makes sense.

Test methods are now implicit test parts. The docstring is the purpose for the test part. 
The name of the method is the name of the test part. The working directory is the active
spec's test stage directory. You can embed test parts using the test_part context manager.

Functionality added by this commit:
* Adds support for multiple test_* stand-alone package test methods, each of which is 
   an implicit test_part for execution and reporting purposes;
* Deprecates package use of run_test();
* Exposes some functionality from run_test() as optional helper methods;
* Adds a SkipTest exception that can be used to flag stand-alone tests as being skipped;
* Updates the packaging guide section on stand-alone tests to provide more examples;
* Restores the ability to run tests "inherited" from provided virtual packages;
* Prints the test log path (like we currently do for build log paths);
* Times and reports the post-install process (since it can include post-install tests);
* Corrects context-related error message to distinguish test recipes from build recipes.
2023-05-10 11:34:54 +02:00
Alec Scott
49677b9be5 squashfs-mount: add v0.4.0 (#37478) 2023-05-10 10:49:21 +02:00
Alec Scott
6fc44eb540 shared-mime-info: add v1.10 (#37477) 2023-05-10 10:49:00 +02:00
Alec Scott
234febe545 kinesis: add v2.4.8 (#37476) 2023-05-10 10:48:44 +02:00
Alec Scott
83a1245bfd unifdef: add v2.12 (#37456) 2023-05-10 10:48:20 +02:00
Alec Scott
241c37fcf7 conmon: add v2.1.7 (#37320) 2023-05-10 10:47:54 +02:00
Alec Scott
a8114ec52c runc: add v1.1.6 (#37308) 2023-05-10 10:47:44 +02:00
Manuela Kuhn
f92b5d586f py-datalad: add 0.18.3 (#37411)
* py-datalad: add 0.18.3

* [@spackbot] updating style on behalf of manuelakuhn

* Remove metadata variant

* Fix dependencies

* Remove redundant version restriction
2023-05-10 03:57:59 -04:00
Alec Scott
492d68c339 r-knitr: add v1.42 (#37203) 2023-05-09 17:43:58 -05:00
Alec Scott
2dcc55d6c5 ssht: add v1.5.2 (#37542) 2023-05-09 11:37:40 -07:00
eugeneswalker
dc897535df py-loguru: add v0.2.5, v0.3.0 (#37574)
* py-loguru: add v0.2.5

* py-loguru: add v0.3.0
2023-05-09 11:16:02 -07:00
kwryankrattiger
45e1d3498c CI: Backwards compatibility requires script override behavior (#37015) 2023-05-09 10:42:06 -06:00
eugeneswalker
af0f094292 memkind: parallel = false (#37566) 2023-05-09 09:04:54 -07:00
Alec Scott
44b51acb7b z-checker: add v0.9.0 (#37534) 2023-05-09 06:52:43 -07:00
eugeneswalker
13dd05e5ec hip: get_paths for hipify-clang (#37559)
* hip: get_paths for hipify-clang

* fix: need to actually use get_paths now to get hipify-clang path

* set hipify-clang path differently for external vs spack-installed case

* [@spackbot] updating style on behalf of eugeneswalker
2023-05-09 06:51:04 -07:00
Massimiliano Culpo
89520467e0 Use single quotes to inline manifest in Dockerfiles (#37571)
fixes #22341

Using double quotes creates issues with shell variable substitutions,
in particular when the manifest has "definitions:" in it. Use single
quotes instead.
2023-05-09 13:20:25 +02:00
Harmen Stoppels
9e1440ec7b spack view copy: relocate symlinks (#32306) 2023-05-09 12:17:16 +02:00
Alec Scott
71cd94e524 gh: add conflict for v2.28.0 and macos (#37563) 2023-05-09 08:55:54 +02:00
Alec Scott
ba696de71b breseq: add v0.38.1 (#37535) 2023-05-08 14:39:45 -07:00
Alec Scott
06b63cfce3 exiv2: add v0.27.6 (#37536) 2023-05-08 14:25:11 -07:00
Alec Scott
5be1e6e852 hazelcast: add v5.2.3 (#37537) 2023-05-08 14:22:46 -07:00
Alec Scott
e651c2b122 libjpeg-turbo: add v2.1.5 (#37539) 2023-05-08 14:19:24 -07:00
Alec Scott
ec2a4869ef mlst: add v2.23.0 (#37540) 2023-05-08 14:03:57 -07:00
Alec Scott
082fb1f6e9 scitokens-cpp: add v1.0.1 (#37541) 2023-05-08 14:00:33 -07:00
Alec Scott
95a65e85df delta: add v2.3.0 (#37545) 2023-05-08 13:46:08 -07:00
Alec Scott
d9e7aa4253 fd-find: add v8.7.0 (#37547) 2023-05-08 13:43:13 -07:00
Alec Scott
5578209117 druid: add v1.2.8 (#37546) 2023-05-08 13:42:10 -07:00
Alec Scott
b013a2de50 fd-find: add v8.7.0 (#37547) 2023-05-08 13:28:50 -07:00
Mark W. Krentel
d1c722a49c hpcviewer: add version 2023.04 (#37556) 2023-05-08 12:30:10 -07:00
eugeneswalker
3446feff70 use latest trilinos for +cuda variants (#37164) 2023-05-08 12:29:54 -07:00
eugeneswalker
41afeacaba new package: psalg (#37357)
* new package: psalg

* use new maintainer syntax

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-05-08 12:26:19 -07:00
698 changed files with 9998 additions and 3794 deletions

@@ -137,6 +137,7 @@ jobs:
 - name: Setup repo and non-root user
   run: |
     git --version
+    git config --global --add safe.directory /__w/spack/spack
     git fetch --unshallow
     . .github/workflows/setup_git.sh
     useradd spack-test

@@ -72,6 +72,7 @@ jobs:
 - name: Setup repo and non-root user
   run: |
     git --version
+    git config --global --add safe.directory /__w/spack/spack
     git fetch --unshallow
     . .github/workflows/setup_git.sh
     useradd spack-test

@@ -1,3 +1,221 @@
# v0.20.0 (2023-05-21)
`v0.20.0` is a major feature release.
## Features in this release
1. **`requires()` directive and enhanced package requirements**
We've added some more enhancements to requirements in Spack (#36286).
There is a new `requires()` directive for packages. `requires()` is the opposite of
`conflicts()`. You can use it to impose constraints on this package when certain
conditions are met:
```python
requires(
"%apple-clang",
when="platform=darwin",
msg="This package builds only with clang on macOS"
)
```
More on this in [the docs](
https://spack.rtfd.io/en/latest/packaging_guide.html#conflicts-and-requirements).
You can also now add a `when:` clause to `require:` in your `packages.yaml`
configuration or in an environment:
```yaml
packages:
openmpi:
require:
- any_of: ["%gcc"]
when: "@:4.1.4"
message: "Only OpenMPI 4.1.5 and up can build with fancy compilers"
```
More details can be found [here](
https://spack.readthedocs.io/en/latest/build_settings.html#package-requirements)
2. **Exact versions**
Spack did not previously have a way to distinguish a version if it was a prefix of
some other version. For example, `@3.2` would match `3.2`, `3.2.1`, `3.2.2`, etc. You
can now match *exactly* `3.2` with `@=3.2`. This is useful, for example, if you need
to patch *only* the `3.2` version of a package. The new syntax is described in [the docs](
https://spack.readthedocs.io/en/latest/basic_usage.html#version-specifier).
Generally, when writing packages, you should prefer to use ranges like `@3.2` over
the specific versions, as this allows the concretizer more leeway when selecting
versions of dependencies. More details and recommendations are in the [packaging guide](
https://spack.readthedocs.io/en/latest/packaging_guide.html#ranges-versus-specific-versions).
See #36273 for full details on the version refactor.
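As a minimal sketch of the difference (the package name `foo` here is just a placeholder):
```console
# matches only version 3.2 itself
$ spack install foo@=3.2

# a range: also matches 3.2.1, 3.2.2, and so on
$ spack install foo@3.2
```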
3. **New testing interface**
Writing package tests is now much simpler with a new [test interface](
https://spack.readthedocs.io/en/latest/packaging_guide.html#stand-alone-tests).
Writing a test is now as easy as adding a method that starts with `test_`:
```python
class MyPackage(Package):
...
def test_always_fails(self):
"""use assert to always fail"""
assert False
def test_example(self):
"""run installed example"""
example = which(self.prefix.bin.example)
example()
```
You can use Python's native `assert` statement to implement your checks -- no more
need to fiddle with `run_test` or other test framework methods. Spack will
introspect the class and run `test_*` methods when you run `spack test`.
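To run them, a hedged sketch (`mypackage` stands in for any installed spec):
```console
# run stand-alone tests for an installed spec
$ spack test run mypackage

# review the results of completed test suites
$ spack test results
```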
4. **More stable concretization**
* Now, `spack concretize` will *only* concretize the new portions of the environment
and will not change existing parts of an environment unless you specify `--force`.
This has always been true for `unify:false`, but not for `unify:true` and
`unify:when_possible` environments. Now it is true for all of them (#37438, #37681).
* The concretizer has a new `--reuse-deps` argument that *only* reuses dependencies.
That is, it will always treat the *roots* of your environment as it would with
`--fresh`. This allows you to upgrade just the roots of your environment while
keeping everything else stable (#30990).
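A sketch of how these behaviors look on the command line (flag spelling as described above; exact combinations may vary):
```console
# concretize only the newly added specs in the active environment
$ spack concretize

# re-concretize everything, including previously concretized specs
$ spack concretize --force

# re-concretize roots as if fresh, while reusing installed dependencies
$ spack concretize --force --reuse-deps
```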
5. **Weekly develop snapshot releases**
Since last year, we have maintained a buildcache of `develop` at
https://binaries.spack.io/develop, but the cache can grow to contain so many builds
as to be unwieldy. When we get a stable `develop` build, we snapshot the release and
add a corresponding tag to the Spack repository. So, you can use a stack from a specific
day. There are now tags in the spack repository like:
* `develop-2023-05-14`
* `develop-2023-05-18`
that correspond to build caches like:
* https://binaries.spack.io/develop-2023-05-14/e4s
* https://binaries.spack.io/develop-2023-05-18/e4s
We plan to store these snapshot releases weekly.
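One way to consume a snapshot, as a hedged sketch (the mirror name is an arbitrary label):
```console
# register a snapshot buildcache as a mirror
$ spack mirror add develop-2023-05-18 https://binaries.spack.io/develop-2023-05-18/e4s

# install and trust the signing keys used by the cache
$ spack buildcache keys --install --trust
```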
6. **Specs in buildcaches can be referenced by hash.**
* Previously, you could run `spack buildcache list` and see the hashes in
buildcaches, but referring to them by hash would fail.
* You can now run commands like `spack spec` and `spack install` and refer to
buildcache hashes directly, e.g. `spack install /abc123` (#35042)
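For example (sketch; `/abc123` is the placeholder hash from above):
```console
# list hashes available in configured build caches
$ spack buildcache list --long

# install a spec directly by its hash prefix
$ spack install /abc123
```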
7. **New package and buildcache index websites**
Our public websites for searching packages have been completely revamped and updated.
You can check them out here:
* *Package Index*: https://packages.spack.io
* *Buildcache Index*: https://cache.spack.io
Both are searchable and more interactive than before. Currently major releases are
shown; UI for browsing `develop` snapshots is coming soon.
8. **Default CMake and Meson build types are now Release**
Spack has historically defaulted to building with optimization and debugging, but
packages like `llvm` can be enormous with debug turned on. Our default build type for
all Spack packages is now `Release` (#36679, #37436). This has a number of benefits:
* much smaller binaries;
* higher default optimization level; and
* defining `NDEBUG` disables assertions, which may lead to further speedups.
You can still get the old behavior back through requirements and package preferences.
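As a sketch, one way to restore a debug-friendly default via package preferences (a `packages.yaml` entry; `RelWithDebInfo` is one valid CMake build type):
```yaml
packages:
  all:
    variants: build_type=RelWithDebInfo
```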
## Other new commands and directives
* `spack checksum` can automatically add new versions to packages (#24532)
* new command: `spack pkg grep` to easily search package files (#34388)
* New `maintainers` directive (#35083)
* Add `spack buildcache push` (alias to `buildcache create`) (#34861)
* Allow using `-j` to control the parallelism of concretization (#37608)
* Add `--exclude` option to 'spack external find' (#35013)
## Other new features of note
* editing: add higher-precedence `SPACK_EDITOR` environment variable
* Many YAML formatting improvements from updating `ruamel.yaml` to the latest version
supporting Python 3.6. (#31091, #24885, #37008).
* Requirements and preferences should not define (non-git) versions (#37687, #37747)
* Environments now store spack version/commit in `spack.lock` (#32801)
* User can specify the name of the `packages` subdirectory in repositories (#36643)
* Add container images supporting RHEL alternatives (#36713)
* make version(...) kwargs explicit (#36998)
## Notable refactors
* buildcache create: reproducible tarballs (#35623)
* Bootstrap most of Spack dependencies using environments (#34029)
* Split `satisfies(..., strict=True/False)` into two functions (#35681)
* spack install: simplify behavior when inside environments (#35206)
## Binary cache and stack updates
* Major simplification of CI boilerplate in stacks (#34272, #36045)
* Many improvements to our CI pipeline's reliability
## Removals, Deprecations, and disablements
* Module file generation is disabled by default; you'll need to enable it to use it (#37258)
* Support for Python 2 was deprecated in `v0.19.0` and has been removed. `v0.20.0` only
supports Python 3.6 and higher.
* Deprecated target names are no longer recognized by Spack. Use generic names instead:
* `graviton` is now `cortex_a72`
* `graviton2` is now `neoverse_n1`
* `graviton3` is now `neoverse_v1`
* `blacklist` and `whitelist` in module configuration were deprecated in `v0.19.0` and are
removed in this release. Use `exclude` and `include` instead.
* The `ignore=` parameter of the `extends()` directive has been removed. It was not used by
any builtin packages and is no longer needed to avoid conflicts in environment views (#35588).
* Support for the old YAML buildcache format has been removed. It was deprecated in `v0.19.0` (#34347).
* `spack find --bootstrap` has been removed. It was deprecated in `v0.19.0`. Use `spack
--bootstrap find` instead (#33964).
* `spack bootstrap trust` and `spack bootstrap untrust` are now removed, having been
deprecated in `v0.19.0`. Use `spack bootstrap enable` and `spack bootstrap disable`.
* The `--mirror-name`, `--mirror-url`, and `--directory` options to buildcache and
mirror commands were deprecated in `v0.19.0` and have now been removed. They have been
replaced by positional arguments (#37457).
* Deprecate `env:` as top level environment key (#37424)
* deprecate buildcache create --rel, buildcache install --allow-root (#37285)
* Support for very old perl-like spec format strings (e.g., `$_$@$%@+$+$=`) has been
removed (#37425). This was deprecated in `v0.15` (#10556).
## Notable Bugfixes
* bugfix: don't fetch package metadata for unknown concrete specs (#36990)
* Improve package source code context display on error (#37655)
* Relax environment manifest filename requirements and lockfile identification criteria (#37413)
* `installer.py`: drop build edges of installed packages by default (#36707)
* Bugfix: package requirements with git commits (#35057, #36347)
* Package requirements: allow single specs in requirement lists (#36258)
* conditional variant values: allow boolean (#33939)
* spack uninstall: follow run/link edges on --dependents (#34058)
## Spack community stats
* 7,179 total packages, 499 new since `v0.19.0`
* 329 new Python packages
* 31 new R packages
* 336 people contributed to this release
* 317 committers to packages
* 62 committers to core
# v0.19.1 (2023-02-07)
### Spack Bugfixes

View File

@@ -20,7 +20,7 @@ packages:
awk: [gawk]
blas: [openblas, amdblis]
D: [ldc]
daal: [intel-daal]
daal: [intel-oneapi-daal]
elf: [elfutils]
fftw-api: [fftw, amdfftw]
flame: [libflame, amdlibflame]
@@ -30,7 +30,7 @@ packages:
golang: [go, gcc]
go-or-gccgo-bootstrap: [go-bootstrap, gcc]
iconv: [libiconv]
ipp: [intel-ipp]
ipp: [intel-oneapi-ipp]
java: [openjdk, jdk, ibm-java]
jpeg: [libjpeg-turbo, libjpeg]
lapack: [openblas, amdlibflame]
@@ -40,7 +40,7 @@ packages:
lua-lang: [lua, lua-luajit-openresty, lua-luajit]
luajit: [lua-luajit-openresty, lua-luajit]
mariadb-client: [mariadb-c-client, mariadb]
mkl: [intel-mkl]
mkl: [intel-oneapi-mkl]
mpe: [mpe2]
mpi: [openmpi, mpich]
mysql-client: [mysql, mariadb-c-client]

View File

@@ -217,6 +217,7 @@ def setup(sphinx):
# Spack classes that intersphinx is unable to resolve
("py:class", "spack.version.StandardVersion"),
("py:class", "spack.spec.DependencySpec"),
("py:class", "spack.install_test.Pb"),
]
# The reST default role (used for this markup: `text`) to use for all documents.

View File

@@ -143,6 +143,26 @@ The OS that are currently supported are summarized in the table below:
* - Amazon Linux 2
- ``amazonlinux:2``
- ``spack/amazon-linux``
* - AlmaLinux 8
- ``almalinux:8``
- ``spack/almalinux8``
* - AlmaLinux 9
- ``almalinux:9``
- ``spack/almalinux9``
* - Rocky Linux 8
- ``rockylinux:8``
- ``spack/rockylinux8``
* - Rocky Linux 9
- ``rockylinux:9``
- ``spack/rockylinux9``
* - Fedora Linux 37
- ``fedora:37``
- ``spack/fedora37``
* - Fedora Linux 38
- ``fedora:38``
- ``spack/fedora38``
All the images are tagged with the corresponding release of Spack:

View File

@@ -347,7 +347,7 @@ the Environment and then install the concretized specs.
(see :ref:`build-jobs`). To speed up environment builds further, independent
packages can be installed in parallel by launching more Spack instances. For
example, the following will build at most four packages in parallel using
three background jobs:
three background jobs:
.. code-block:: console
@@ -395,7 +395,7 @@ version (and other constraints) passed as the spec argument to the
For packages with ``git`` attributes, git branches, tags, and commits can
also be used as valid concrete versions (see :ref:`version-specifier`).
This means that for a package ``foo``, ``spack develop foo@git.main`` will clone
This means that for a package ``foo``, ``spack develop foo@git.main`` will clone
the ``main`` branch of the package, and ``spack install`` will install from
that git clone if ``foo`` is in the environment.
Further development on ``foo`` can be tested by reinstalling the environment,
@@ -589,10 +589,11 @@ user support groups providing a large software stack for their HPC center.
.. admonition:: Re-concretization of user specs
When using *unified* concretization (when possible), the entire set of specs will be
re-concretized after any addition of new user specs, to ensure that
the environment remains consistent / minimal. When instead unified concretization is
disabled, only the new specs will be concretized after any addition.
The ``spack concretize`` command without additional arguments will *not* change any
previously concretized specs. This may prevent it from finding a solution when using
``unify: true``, and it may prevent it from finding a minimal solution when using
``unify: when_possible``. You can force Spack to ignore the existing concrete environment
with ``spack concretize -f``.
^^^^^^^^^^^^^
Spec Matrices
@@ -1121,19 +1122,19 @@ index once every package is pushed. Note how this target uses the generated
SPACK ?= spack
BUILDCACHE_DIR = $(CURDIR)/tarballs
.PHONY: all
all: push
include env.mk
example/push/%: example/install/%
@mkdir -p $(dir $@)
$(info About to push $(SPEC) to a buildcache)
$(SPACK) -e . buildcache create --allow-root --only=package --directory $(BUILDCACHE_DIR) /$(HASH)
@touch $@
push: $(addprefix example/push/,$(example/SPACK_PACKAGE_IDS))
$(info Updating the buildcache index)
$(SPACK) -e . buildcache update-index --directory $(BUILDCACHE_DIR)

File diff suppressed because it is too large

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.0-dev (commit d02dadbac4fa8f3a60293c4fbfd59feadaf546dc)
* Version: 0.2.1 (commit 9e1117bd8a2f0581bced161f2a2e8d6294d0300b)
astunparse
----------------

View File

@@ -2083,18 +2083,28 @@
"compilers": {
"gcc": [
{
"versions": "10.3:",
"versions": "10.3:13.0",
"name": "znver3",
"flags": "-march={name} -mtune={name} -mavx512f -mavx512dq -mavx512ifma -mavx512cd -mavx512bw -mavx512vl -mavx512vbmi -mavx512vbmi2 -mavx512vnni -mavx512bitalg"
},
{
"versions": "13.1:",
"name": "znver4",
"flags": "-march={name} -mtune={name}"
}
],
"clang": [
{
"versions": "12.0:",
"versions": "12.0:15.9",
"name": "znver3",
"flags": "-march={name} -mtune={name} -mavx512f -mavx512dq -mavx512ifma -mavx512cd -mavx512bw -mavx512vl -mavx512vbmi -mavx512vbmi2 -mavx512vnni -mavx512bitalg"
},
{
"versions": "16.0:",
"name": "znver4",
"flags": "-march={name} -mtune={name}"
}
],
],
"aocc": [
{
"versions": "3.0:3.9",
@@ -2793,7 +2803,7 @@
"flags" : "-march=armv8.2-a+fp16+dotprod+crypto -mtune=cortex-a72"
},
{
"versions": "10.2",
"versions": "10.2:10.2.99",
"flags" : "-mcpu=zeus"
},
{

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.20.0.dev0"
__version__ = "0.21.0.dev0"
spack_version = __version__

View File

@@ -289,9 +289,14 @@ def _check_build_test_callbacks(pkgs, error_cls):
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
test_callbacks = getattr(pkg_cls, "build_time_test_callbacks", None)
if test_callbacks and "test" in test_callbacks:
msg = '{0} package contains "test" method in ' "build_time_test_callbacks"
instr = 'Remove "test" from: [{0}]'.format(", ".join(test_callbacks))
# TODO (post-34236): "test*"->"test_*" once remove deprecated methods
# TODO (post-34236): "test"->"test_" once remove deprecated methods
has_test_method = test_callbacks and any([m.startswith("test") for m in test_callbacks])
if has_test_method:
msg = '{0} package contains "test*" method(s) in ' "build_time_test_callbacks"
instr = 'Remove all methods whose names start with "test" from: [{0}]'.format(
", ".join(test_callbacks)
)
errors.append(error_cls(msg.format(pkg_name), [instr]))
return errors

View File

@@ -193,10 +193,17 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
db_root_dir = os.path.join(tmpdir, "db_root")
db = spack_db.Database(None, db_dir=db_root_dir, enable_transaction_locking=False)
self._index_file_cache.init_entry(cache_key)
cache_path = self._index_file_cache.cache_path(cache_key)
with self._index_file_cache.read_transaction(cache_key):
db._read_from_file(cache_path)
try:
self._index_file_cache.init_entry(cache_key)
cache_path = self._index_file_cache.cache_path(cache_key)
with self._index_file_cache.read_transaction(cache_key):
db._read_from_file(cache_path)
except spack_db.InvalidDatabaseVersionError as e:
msg = (
f"you need a newer Spack version to read the buildcache index for the "
f"following mirror: '{mirror_url}'. {e.database_version_message}"
)
raise BuildcacheIndexError(msg) from e
spec_list = db.query_local(installed=False, in_buildcache=True)
@@ -419,7 +426,7 @@ def update(self, with_cooldown=False):
self._write_local_index_cache()
if all_methods_failed:
if configured_mirror_urls and all_methods_failed:
raise FetchCacheError(fetch_errors)
if fetch_errors:
tty.warn(
@@ -2408,6 +2415,10 @@ def __init__(self, all_architectures):
self.possible_specs = specs
def __call__(self, spec, **kwargs):
"""
Args:
spec (str): The spec being searched for in its string representation or hash.
"""
matches = []
if spec.startswith("/"):
# Matching a DAG hash
@@ -2429,6 +2440,10 @@ def __str__(self):
return "{}, due to: {}".format(self.args[0], self.args[1])
class BuildcacheIndexError(spack.error.SpackError):
"""Raised when a buildcache cannot be read for any reason"""
FetchIndexResult = collections.namedtuple("FetchIndexResult", "etag hash data fresh")

View File

@@ -43,6 +43,7 @@
from typing import List, Tuple
import llnl.util.tty as tty
from llnl.util.filesystem import join_path
from llnl.util.lang import dedupe
from llnl.util.symlink import symlink
from llnl.util.tty.color import cescape, colorize
@@ -53,7 +54,6 @@
import spack.build_systems.python
import spack.builder
import spack.config
import spack.install_test
import spack.main
import spack.package_base
import spack.paths
@@ -66,6 +66,7 @@
import spack.util.path
import spack.util.pattern
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import spack_install_test_log
from spack.installer import InstallError
from spack.util.cpus import cpus_available
from spack.util.environment import (
@@ -588,7 +589,6 @@ def set_module_variables_for_package(pkg):
# TODO: make these build deps that can be installed if not found.
m.make = MakeExecutable("make", jobs)
m.gmake = MakeExecutable("gmake", jobs)
m.ninja = MakeExecutable("ninja", jobs, supports_jobserver=False)
# TODO: johnwparent: add package or builder support to define these build tools
# for now there is no entrypoint for builders to define these on their
@@ -1075,19 +1075,18 @@ def _setup_pkg_and_run(
# 'pkg' is not defined yet
pass
elif context == "test":
logfile = os.path.join(
pkg.test_suite.stage, spack.install_test.TestSuite.test_log_name(pkg.spec)
)
logfile = os.path.join(pkg.test_suite.stage, pkg.test_suite.test_log_name(pkg.spec))
error_msg = str(exc)
if isinstance(exc, (spack.multimethod.NoSuchMethodError, AttributeError)):
process = "test the installation" if context == "test" else "build from sources"
error_msg = (
"The '{}' package cannot find an attribute while trying to build "
"from sources. This might be due to a change in Spack's package format "
"The '{}' package cannot find an attribute while trying to {}. "
"This might be due to a change in Spack's package format "
"to support multiple build-systems for a single package. You can fix this "
"by updating the build recipe, and you can also report the issue as a bug. "
"by updating the {} recipe, and you can also report the issue as a bug. "
"More information at https://spack.readthedocs.io/en/latest/packaging_guide.html#installation-procedure"
).format(pkg.name)
).format(pkg.name, process, context)
error_msg = colorize("@*R{{{}}}".format(error_msg))
error_msg = "{}\n\n{}".format(str(exc), error_msg)
@@ -1216,6 +1215,9 @@ def child_fun():
return child_result
CONTEXT_BASES = (spack.package_base.PackageBase, spack.build_systems._checks.BaseBuilder)
def get_package_context(traceback, context=3):
"""Return some context for an error message when the build fails.
@@ -1244,32 +1246,38 @@ def make_stack(tb, stack=None):
stack = make_stack(traceback)
basenames = tuple(base.__name__ for base in CONTEXT_BASES)
for tb in stack:
frame = tb.tb_frame
if "self" in frame.f_locals:
# Find the first proper subclass of PackageBase.
# Find the first proper subclass of the PackageBase or BaseBuilder, but
# don't provide context if the code is actually in the base classes.
obj = frame.f_locals["self"]
if isinstance(obj, spack.package_base.PackageBase):
func = getattr(obj, tb.tb_frame.f_code.co_name, "")
if func:
typename, *_ = func.__qualname__.partition(".")
if isinstance(obj, CONTEXT_BASES) and typename not in basenames:
break
else:
return None
# We found obj, the Package implementation we care about.
# Point out the location in the install method where we failed.
lines = [
"{0}:{1:d}, in {2}:".format(
inspect.getfile(frame.f_code),
frame.f_lineno - 1, # subtract 1 because f_lineno is 0-indexed
frame.f_code.co_name,
)
]
filename = inspect.getfile(frame.f_code)
lineno = frame.f_lineno
if os.path.basename(filename) == "package.py":
# subtract 1 because we inject a magic import at the top of package files.
# TODO: get rid of the magic import.
lineno -= 1
lines = ["{0}:{1:d}, in {2}:".format(filename, lineno, frame.f_code.co_name)]
# Build a message showing context in the install method.
sourcelines, start = inspect.getsourcelines(frame)
# Calculate lineno of the error relative to the start of the function.
# Subtract 1 because f_lineno is 0-indexed.
fun_lineno = frame.f_lineno - start - 1
fun_lineno = lineno - start
start_ctx = max(0, fun_lineno - context)
sourcelines = sourcelines[start_ctx : fun_lineno + context + 1]
@@ -1360,6 +1368,13 @@ def long_message(self):
out.write("See {0} log for details:\n".format(self.log_type))
out.write(" {0}\n".format(self.log_name))
# Also output the test log path IF it exists
if self.context != "test":
test_log = join_path(os.path.dirname(self.log_name), spack_install_test_log)
if os.path.isfile(test_log):
out.write("\nSee test log for details:\n")
out.write(" {0}n".format(test_log))
return out.getvalue()
def __str__(self):

View File

@@ -108,7 +108,10 @@ def execute_build_time_tests(builder: spack.builder.Builder):
builder: builder prescribing the test callbacks. The name of the callbacks is
stored as a list of strings in the ``build_time_test_callbacks`` attribute.
"""
builder.pkg.run_test_callbacks(builder, builder.build_time_test_callbacks, "build")
if not builder.pkg.run_tests or not builder.build_time_test_callbacks:
return
builder.pkg.tester.phase_tests(builder, "build", builder.build_time_test_callbacks)
def execute_install_time_tests(builder: spack.builder.Builder):
@@ -118,7 +121,10 @@ def execute_install_time_tests(builder: spack.builder.Builder):
builder: builder prescribing the test callbacks. The name of the callbacks is
stored as a list of strings in the ``install_time_test_callbacks`` attribute.
"""
builder.pkg.run_test_callbacks(builder, builder.install_time_test_callbacks, "install")
if not builder.pkg.run_tests or not builder.install_time_test_callbacks:
return
builder.pkg.tester.phase_tests(builder, "install", builder.install_time_test_callbacks)
class BaseBuilder(spack.builder.Builder):

View File

@@ -137,11 +137,12 @@ def cuda_flags(arch_list):
conflicts("%gcc@11:", when="+cuda ^cuda@:11.4.0")
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.0")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.1")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -4,13 +4,16 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Common utilities for managing intel oneapi packages."""
import getpass
import os
import platform
import shutil
from os.path import basename, dirname, isdir
from llnl.util.filesystem import find_headers, find_libraries, join_path
from llnl.util.link_tree import LinkTree
from spack.directives import conflicts, variant
from spack.package import mkdirp
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
@@ -125,6 +128,31 @@ def setup_run_environment(self, env):
)
)
def symlink_dir(self, src, dest):
# Taken from: https://github.com/spack/spack/pull/31285/files
# oneapi bin/lib directories are 2 or 3 levels below the
# prefix, but it is more typical to have them directly in the
# prefix. The location has changed over time. Rather than make
# every package that needs to know where include/lib are
# located be aware of this, put in symlinks to conform. This
# is good enough for some, but not all packages.
# If we symlink top-level directories directly, files won't
# show up in views. Create real dirs and symlink files instead
# Create a real directory at dest
mkdirp(dest)
# Symlink all files in src to dest keeping directories as dirs
for entry in os.listdir(src):
src_path = join_path(src, entry)
dest_path = join_path(dest, entry)
if isdir(src_path) and os.access(src_path, os.X_OK):
link_tree = LinkTree(src_path)
link_tree.merge(dest_path)
else:
os.symlink(src_path, dest_path)
class IntelOneApiLibraryPackage(IntelOneApiPackage):
"""Base class for Intel oneAPI library packages.

View File

@@ -130,9 +130,11 @@ def __init__(self, wrapped_pkg_object, root_builder):
bases,
{
"run_tests": property(lambda x: x.wrapped_package_object.run_tests),
"test_log_file": property(lambda x: x.wrapped_package_object.test_log_file),
"test_failures": property(lambda x: x.wrapped_package_object.test_failures),
"test_requires_compiler": property(
lambda x: x.wrapped_package_object.test_requires_compiler
),
"test_suite": property(lambda x: x.wrapped_package_object.test_suite),
"tester": property(lambda x: x.wrapped_package_object.tester),
},
)
new_cls.__module__ = package_cls.__module__

View File

@@ -531,7 +531,7 @@ def __init__(self, ci_config, phases, staged_phases):
"""
self.ci_config = ci_config
self.named_jobs = ["any", "build", "cleanup", "noop", "reindex", "signing"]
self.named_jobs = ["any", "build", "copy", "cleanup", "noop", "reindex", "signing"]
self.ir = {
"jobs": {},
@@ -634,17 +634,13 @@ def generate_ir(self):
# Reindex script
{
"reindex-job": {
"script:": [
"spack buildcache update-index --keys --mirror-url {index_target_mirror}"
]
"script:": ["spack buildcache update-index --keys {index_target_mirror}"]
}
},
# Cleanup script
{
"cleanup-job": {
"script:": [
"spack -d mirror destroy --mirror-url {mirror_prefix}/$CI_PIPELINE_ID"
]
"script:": ["spack -d mirror destroy {mirror_prefix}/$CI_PIPELINE_ID"]
}
},
# Add signing job tags
@@ -760,6 +756,7 @@ def generate_gitlab_ci_yaml(
# Get the joined "ci" config with all of the current scopes resolved
ci_config = cfg.get("ci")
config_deprecated = False
if not ci_config:
tty.warn("Environment does not have `ci` a configuration")
gitlabci_config = yaml_root.get("gitlab-ci")
@@ -772,6 +769,7 @@ def generate_gitlab_ci_yaml(
)
translate_deprecated_config(gitlabci_config)
ci_config = gitlabci_config
config_deprecated = True
# Default target is gitlab...and only target is gitlab
if not ci_config.get("target", "gitlab") == "gitlab":
@@ -835,6 +833,14 @@ def generate_gitlab_ci_yaml(
# Values: "spack_pull_request", "spack_protected_branch", or not set
spack_pipeline_type = os.environ.get("SPACK_PIPELINE_TYPE", None)
copy_only_pipeline = spack_pipeline_type == "spack_copy_only"
if copy_only_pipeline and config_deprecated:
tty.warn(
"SPACK_PIPELINE_TYPE=spack_copy_only is not supported when using\n",
"deprecated ci configuration, a no-op pipeline will be generated\n",
"instead.",
)
if "mirrors" not in yaml_root or len(yaml_root["mirrors"].values()) < 1:
tty.die("spack ci generate requires an env containing a mirror")
@@ -1211,7 +1217,7 @@ def main_script_replacements(cmd):
).format(c_spec, release_spec)
tty.debug(debug_msg)
if prune_dag and not rebuild_spec:
if prune_dag and not rebuild_spec and not copy_only_pipeline:
tty.debug(
"Pruning {0}/{1}, does not need rebuild.".format(
release_spec.name, release_spec.dag_hash()
@@ -1302,8 +1308,9 @@ def main_script_replacements(cmd):
max_length_needs = length_needs
max_needs_job = job_name
output_object[job_name] = job_object
job_id += 1
if not copy_only_pipeline:
output_object[job_name] = job_object
job_id += 1
if print_summary:
for phase in phases:
@@ -1333,6 +1340,17 @@ def main_script_replacements(cmd):
"when": ["runner_system_failure", "stuck_or_timeout_failure", "script_failure"],
}
if copy_only_pipeline and not config_deprecated:
stage_names.append("copy")
sync_job = copy.deepcopy(spack_ci_ir["jobs"]["copy"]["attributes"])
sync_job["stage"] = "copy"
if artifacts_root:
sync_job["needs"] = [
{"job": generate_job_name, "pipeline": "{0}".format(parent_pipeline_id)}
]
output_object["copy"] = sync_job
job_id += 1
if job_id > 0:
if temp_storage_url_prefix:
# There were some rebuild jobs scheduled, so we will need to
@@ -1466,12 +1484,18 @@ def main_script_replacements(cmd):
sorted_output = cinw.needs_to_dependencies(sorted_output)
else:
# No jobs were generated
tty.debug("No specs to rebuild, generating no-op job")
noop_job = spack_ci_ir["jobs"]["noop"]["attributes"]
noop_job["retry"] = service_job_retries
sorted_output = {"no-specs-to-rebuild": noop_job}
if copy_only_pipeline and config_deprecated:
tty.debug("Generating no-op job as copy-only is unsupported here.")
noop_job["script"] = [
'echo "copy-only pipelines are not supported with deprecated ci configs"'
]
sorted_output = {"unsupported-copy": noop_job}
else:
tty.debug("No specs to rebuild, generating no-op job")
sorted_output = {"no-specs-to-rebuild": noop_job}
if known_broken_specs_encountered:
tty.error("This pipeline generated hashes known to be broken on develop:")
@@ -2456,7 +2480,16 @@ def populate_buildgroup(self, job_names):
msg = "Error response code ({0}) in populate_buildgroup".format(response_code)
tty.warn(msg)
def report_skipped(self, spec, directory_name, reason):
def report_skipped(self, spec: spack.spec.Spec, report_dir: str, reason: Optional[str]):
"""Explicitly report skipping testing of a spec (e.g., it's CI
configuration identifies it as known to have broken tests or
the CI installation failed).
Args:
spec: spec being tested
report_dir: directory where the report will be written
reason: reason the test is being skipped
"""
configuration = CDashConfiguration(
upload_url=self.upload_url,
packages=[spec.name],
@@ -2466,7 +2499,7 @@ def report_skipped(self, spec, directory_name, reason):
track=None,
)
reporter = CDash(configuration=configuration)
reporter.test_skipped_report(directory_name, spec, reason)
reporter.test_skipped_report(report_dir, spec, reason)
def translate_deprecated_config(config):
@@ -2481,12 +2514,14 @@ def translate_deprecated_config(config):
build_job["tags"] = config.pop("tags")
if "variables" in config:
build_job["variables"] = config.pop("variables")
# Scripts always override in old CI
if "before_script" in config:
build_job["before_script"] = config.pop("before_script")
build_job["before_script:"] = config.pop("before_script")
if "script" in config:
build_job["script"] = config.pop("script")
build_job["script:"] = config.pop("script")
if "after_script" in config:
build_job["after_script"] = config.pop("after_script")
build_job["after_script:"] = config.pop("after_script")
signing_job = None
if "signing-job-attributes" in config:
@@ -2510,8 +2545,25 @@ def translate_deprecated_config(config):
for section in mappings:
submapping_section = {"match": section["match"]}
if "runner-attributes" in section:
submapping_section["build-job"] = section["runner-attributes"]
remapped_attributes = {}
if match_behavior == "first":
for key, value in section["runner-attributes"].items():
# Scripts always override in old CI
if key == "script":
remapped_attributes["script:"] = value
elif key == "before_script":
remapped_attributes["before_script:"] = value
elif key == "after_script":
remapped_attributes["after_script:"] = value
else:
remapped_attributes[key] = value
else:
# Handle "merge" behavior be allowing scripts to merge in submapping section
remapped_attributes = section["runner-attributes"]
submapping_section["build-job"] = remapped_attributes
if "remove-attributes" in section:
# Old format only allowed tags in this section, so no extra checks are needed
submapping_section["build-job-remove"] = section["remove-attributes"]
submapping.append(submapping_section)
pipeline_gen.append({"submapping": submapping, "match_behavior": match_behavior})

View File

@@ -347,7 +347,7 @@ def iter_groups(specs, indent, all_headers):
spack.spec.architecture_color,
architecture if architecture else "no arch",
spack.spec.compiler_color,
compiler if compiler else "no compiler",
f"{compiler.display_str}" if compiler else "no compiler",
)
# Sometimes we want to display specs that are not yet concretized.

View File

@@ -98,7 +98,7 @@ def compiler_find(args):
config = spack.config.config
filename = config.get_config_filename(args.scope, "compilers")
tty.msg("Added %d new compiler%s to %s" % (n, s, filename))
colify(reversed(sorted(c.spec for c in new_compilers)), indent=4)
colify(reversed(sorted(c.spec.display_str for c in new_compilers)), indent=4)
else:
tty.msg("Found no new compilers")
tty.msg("Compilers are defined in the following files:")
@@ -112,13 +112,13 @@ def compiler_remove(args):
tty.die("No compilers match spec %s" % cspec)
elif not args.all and len(compilers) > 1:
tty.error("Multiple compilers match spec %s. Choose one:" % cspec)
colify(reversed(sorted([c.spec for c in compilers])), indent=4)
colify(reversed(sorted([c.spec.display_str for c in compilers])), indent=4)
tty.msg("Or, use `spack compiler remove -a` to remove all of them.")
sys.exit(1)
for compiler in compilers:
spack.compilers.remove_compiler_from_config(compiler.spec, scope=args.scope)
tty.msg("Removed compiler %s" % compiler.spec)
tty.msg("Removed compiler %s" % compiler.spec.display_str)
def compiler_info(args):
@@ -130,7 +130,7 @@ def compiler_info(args):
tty.die("No compilers match spec %s" % cspec)
else:
for c in compilers:
print(str(c.spec) + ":")
print(c.spec.display_str + ":")
print("\tpaths:")
for cpath in ["cc", "cxx", "f77", "fc"]:
print("\t\t%s = %s" % (cpath, getattr(c, cpath, None)))
@@ -188,7 +188,7 @@ def compiler_list(args):
os_str += "-%s" % target
cname = "%s{%s} %s" % (spack.spec.compiler_color, name, os_str)
tty.hline(colorize(cname), char="-")
colify(reversed(sorted(c.spec for c in compilers)))
colify(reversed(sorted(c.spec.display_str for c in compilers)))
def compiler(parser, args):

View File

@@ -29,6 +29,7 @@ def setup_parser(subparser):
)
spack.cmd.common.arguments.add_concretizer_args(subparser)
spack.cmd.common.arguments.add_common_arguments(subparser, ["jobs"])
def concretize(parser, args):

View File

@@ -302,7 +302,7 @@ def env_create(args):
# the environment should not include a view.
with_view = None
_env_create(
env = _env_create(
args.create_env,
init_file=args.envfile,
dir=args.dir,
@@ -310,6 +310,9 @@ def env_create(args):
keep_relative=args.keep_relative,
)
# Generate views, only really useful for environments created from spack.lock files.
env.regenerate_views()
def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep_relative=False):
"""Create a new environment, with an optional yaml description.

View File

@@ -5,7 +5,6 @@
from __future__ import print_function
import inspect
import textwrap
from itertools import zip_longest
@@ -15,9 +14,10 @@
import spack.cmd.common.arguments as arguments
import spack.fetch_strategy as fs
import spack.install_test
import spack.repo
import spack.spec
from spack.package_base import has_test_method, preferred_version
from spack.package_base import preferred_version
description = "get detailed information on a particular package"
section = "basic"
@@ -261,41 +261,7 @@ def print_tests(pkg):
# if it has been overridden and, therefore, assumed to be implemented.
color.cprint("")
color.cprint(section_title("Stand-Alone/Smoke Test Methods:"))
names = []
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
if has_test_method(pkg_cls):
pkg_base = spack.package_base.PackageBase
test_pkgs = [
str(cls.test)
for cls in inspect.getmro(pkg_cls)
if issubclass(cls, pkg_base) and cls.test != pkg_base.test
]
test_pkgs = list(set(test_pkgs))
names.extend([(test.split()[1]).lower() for test in test_pkgs])
# TODO Refactor START
# Use code from package_base.py's test_process IF this functionality is
# accepted.
v_names = list(set([vspec.name for vspec in pkg.virtuals_provided]))
# hack for compilers that are not dependencies (yet)
# TODO: this all eventually goes away
c_names = ("gcc", "intel", "intel-parallel-studio", "pgi")
if pkg.name in c_names:
v_names.extend(["c", "cxx", "fortran"])
if pkg.spec.intersects("llvm+clang"):
v_names.extend(["c", "cxx"])
# TODO Refactor END
v_specs = [spack.spec.Spec(v_name) for v_name in v_names]
for v_spec in v_specs:
try:
pkg_cls = spack.repo.path.get_pkg_class(v_spec.name)
if has_test_method(pkg_cls):
names.append("{0}.test".format(pkg_cls.name.lower()))
except spack.repo.UnknownPackageError:
pass
names = spack.install_test.test_function_names(pkg, add_virtuals=True)
if names:
colify(sorted(names), indent=4)
else:

View File

@@ -76,6 +76,14 @@ def setup_parser(subparser):
help="location of the named or current environment",
)
subparser.add_argument(
"--first",
action="store_true",
default=False,
dest="find_first",
help="use the first match if multiple packages match the spec",
)
arguments.add_common_arguments(subparser, ["spec"])
@@ -121,7 +129,7 @@ def location(parser, args):
# install_dir command matches against installed specs.
if args.install_dir:
env = ev.active_environment()
spec = spack.cmd.disambiguate_spec(specs[0], env)
spec = spack.cmd.disambiguate_spec(specs[0], env, first=args.find_first)
print(spec.prefix)
return

View File

@@ -116,21 +116,23 @@ def one_spec_or_raise(specs):
def check_module_set_name(name):
modules_config = spack.config.get("modules")
valid_names = set(
[
key
for key, value in modules_config.items()
if isinstance(value, dict) and value.get("enable", [])
]
)
if "enable" in modules_config and modules_config["enable"]:
valid_names.add("default")
modules = spack.config.get("modules")
if name != "prefix_inspections" and name in modules:
return
if name not in valid_names:
msg = "Cannot use invalid module set %s." % name
msg += " Valid module set names are %s" % list(valid_names)
raise spack.config.ConfigError(msg)
names = [k for k in modules if k != "prefix_inspections"]
if not names:
raise spack.config.ConfigError(
f"Module set configuration is missing. Cannot use module set '{name}'"
)
pretty_names = "', '".join(names)
raise spack.config.ConfigError(
f"Cannot use invalid module set '{name}'.",
f"Valid module set names are: '{pretty_names}'.",
)
_missing_modules_warning = (

View File

@@ -140,12 +140,15 @@ def _process_result(result, show, required_format, kwargs):
def solve(parser, args):
# these are the same options as `spack spec`
name_fmt = "{namespace}.{name}" if args.namespaces else "{name}"
fmt = "{@versions}{%compiler}{compiler_flags}{variants}{arch=architecture}"
install_status_fn = spack.spec.Spec.install_status
fmt = spack.spec.display_format
if args.namespaces:
fmt = "{namespace}." + fmt
kwargs = {
"cover": args.cover,
"format": name_fmt + fmt,
"format": fmt,
"hashlen": None if args.very_long else 7,
"show_types": args.types,
"status_fn": install_status_fn if args.install_status else None,

View File

@@ -80,12 +80,15 @@ def setup_parser(subparser):
def spec(parser, args):
name_fmt = "{namespace}.{name}" if args.namespaces else "{name}"
fmt = "{@versions}{%compiler}{compiler_flags}{variants}{arch=architecture}"
install_status_fn = spack.spec.Spec.install_status
fmt = spack.spec.display_format
if args.namespaces:
fmt = "{namespace}." + fmt
tree_kwargs = {
"cover": args.cover,
"format": name_fmt + fmt,
"format": fmt,
"hashlen": None if args.very_long else 7,
"show_types": args.types,
"status_fn": install_status_fn if args.install_status else None,

View File

@@ -11,6 +11,7 @@
import re
import shutil
import sys
from collections import Counter
from llnl.util import lang, tty
from llnl.util.tty import colify
@@ -236,9 +237,8 @@ def test_list(args):
tagged = set(spack.repo.path.packages_with_tags(*args.tag)) if args.tag else set()
def has_test_and_tags(pkg_class):
return spack.package_base.has_test_method(pkg_class) and (
not args.tag or pkg_class.name in tagged
)
tests = spack.install_test.test_functions(pkg_class)
return len(tests) and (not args.tag or pkg_class.name in tagged)
if args.list_all:
report_packages = [
@@ -358,18 +358,17 @@ def _report_suite_results(test_suite, args, constraints):
tty.msg("test specs:")
failed, skipped, untested = 0, 0, 0
counts = Counter()
for pkg_id in test_specs:
if pkg_id in results:
status = results[pkg_id]
if status == "FAILED":
failed += 1
elif status == "NO-TESTS":
untested += 1
elif status == "SKIPPED":
skipped += 1
# Backward-compatibility: NO-TESTS => NO_TESTS
status = "NO_TESTS" if status == "NO-TESTS" else status
if args.failed and status != "FAILED":
status = spack.install_test.TestStatus[status]
counts[status] += 1
if args.failed and status != spack.install_test.TestStatus.FAILED:
continue
msg = " {0} {1}".format(pkg_id, status)
@@ -381,7 +380,7 @@ def _report_suite_results(test_suite, args, constraints):
msg += "\n{0}".format("".join(f.readlines()))
tty.msg(msg)
spack.install_test.write_test_summary(failed, skipped, untested, len(test_specs))
spack.install_test.write_test_summary(counts)
else:
msg = "Test %s has no results.\n" % test_suite.name
msg += " Check if it is running with "

View File

@@ -25,7 +25,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.19"
tutorial_branch = "releases/v0.20"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")

View File

@@ -187,7 +187,9 @@ def remove_compiler_from_config(compiler_spec, scope=None):
filtered_compiler_config = [
comp
for comp in compiler_config
if spack.spec.CompilerSpec(comp["compiler"]["spec"]) != compiler_spec
if not spack.spec.parse_with_version_concrete(
comp["compiler"]["spec"], compiler=True
).satisfies(compiler_spec)
]
# Update the cache for changes
@@ -724,7 +726,7 @@ def make_compiler_list(detected_versions):
def _default_make_compilers(cmp_id, paths):
operating_system, compiler_name, version = cmp_id
compiler_cls = spack.compilers.class_for_compiler_name(compiler_name)
spec = spack.spec.CompilerSpec(compiler_cls.name, version)
spec = spack.spec.CompilerSpec(compiler_cls.name, f"={version}")
paths = [paths.get(x, None) for x in ("cc", "cxx", "f77", "fc")]
# TODO: johnwparent - revist the following line as per discussion at:
# https://github.com/spack/spack/pull/33385/files#r1040036318
@@ -802,8 +804,10 @@ def name_matches(name, name_list):
toolchains.add(compiler_cls.__name__)
if len(toolchains) > 1:
if toolchains == set(["Clang", "AppleClang", "Aocc"]) or toolchains == set(
["Dpcpp", "Oneapi"]
if (
toolchains == set(["Clang", "AppleClang", "Aocc"])
# Msvc toolchain uses Intel ifx
or toolchains == set(["Msvc", "Dpcpp", "Oneapi"])
):
return False
tty.debug("[TOOLCHAINS] {0}".format(toolchains))

View File

@@ -30,7 +30,7 @@
def get_valid_fortran_pth(comp_ver):
cl_ver = str(comp_ver).split("@")[1]
cl_ver = str(comp_ver)
sort_fn = lambda fc_ver: StrictVersion(fc_ver)
sort_fc_ver = sorted(list(avail_fc_version), key=sort_fn)
for ver in sort_fc_ver:
@@ -75,7 +75,7 @@ class Msvc(Compiler):
# file based on compiler executable path.
def __init__(self, *args, **kwargs):
new_pth = [pth if pth else get_valid_fortran_pth(args[0]) for pth in args[3]]
new_pth = [pth if pth else get_valid_fortran_pth(args[0].version) for pth in args[3]]
args[3][:] = new_pth
super(Msvc, self).__init__(*args, **kwargs)
if os.getenv("ONEAPI_ROOT"):

View File

@@ -10,10 +10,12 @@
from typing import Optional
import spack.environment as ev
import spack.error
import spack.schema.env
import spack.tengine as tengine
import spack.util.spack_yaml as syaml
from spack.container.images import (
from ..images import (
bootstrap_template_for,
build_info,
checkout_command,
@@ -205,12 +207,20 @@ def manifest(self):
@tengine.context_property
def os_packages_final(self):
"""Additional system packages that are needed at run-time."""
return self._os_packages_for_stage("final")
try:
return self._os_packages_for_stage("final")
except Exception as e:
msg = f"an error occurred while rendering the 'final' stage of the image: {e}"
raise spack.error.SpackError(msg) from e
@tengine.context_property
def os_packages_build(self):
"""Additional system packages that are needed at build-time."""
return self._os_packages_for_stage("build")
try:
return self._os_packages_for_stage("build")
except Exception as e:
msg = f"an error occurred while rendering the 'build' stage of the image: {e}"
raise spack.error.SpackError(msg) from e
@tengine.context_property
def os_package_update(self):
@@ -243,13 +253,24 @@ def _package_info_from(self, package_list):
if image is None:
os_pkg_manager = os_package_manager_for(image_config["os"])
else:
os_pkg_manager = self.container_config["os_packages"]["command"]
os_pkg_manager = self._os_pkg_manager()
update, install, clean = commands_for(os_pkg_manager)
Packages = collections.namedtuple("Packages", ["update", "install", "list", "clean"])
return Packages(update=update, install=install, list=package_list, clean=clean)
def _os_pkg_manager(self):
try:
os_pkg_manager = self.container_config["os_packages"]["command"]
except KeyError:
msg = (
"cannot determine the OS package manager to use.\n\n\tPlease add an "
"appropriate 'os_packages:command' entry to the spack.yaml manifest file\n"
)
raise spack.error.SpackError(msg)
return os_pkg_manager
@tengine.context_property
def extra_instructions(self):
Extras = collections.namedtuple("Extra", ["build", "final"])

View File

@@ -2,6 +2,8 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import shlex
import spack.tengine as tengine
from . import PathContext, writer
@@ -17,14 +19,15 @@ class DockerContext(PathContext):
@tengine.context_property
def manifest(self):
manifest_str = super(DockerContext, self).manifest
# Docker doesn't support HEREDOC so we need to resort to
# Docker doesn't support HEREDOC, so we need to resort to
# a horrible echo trick to have the manifest in the Dockerfile
echoed_lines = []
for idx, line in enumerate(manifest_str.split("\n")):
quoted_line = shlex.quote(line)
if idx == 0:
echoed_lines.append('&& (echo "' + line + '" \\')
echoed_lines.append("&& (echo " + quoted_line + " \\")
continue
echoed_lines.append('&& echo "' + line + '" \\')
echoed_lines.append("&& echo " + quoted_line + " \\")
echoed_lines[-1] = echoed_lines[-1].replace(" \\", ")")

View File

@@ -798,9 +798,8 @@ def check(cond, msg):
# TODO: better version checking semantics.
version = vn.Version(db["version"])
spec_reader = reader(version)
if version > _db_version:
raise InvalidDatabaseVersionError(_db_version, version)
raise InvalidDatabaseVersionError(self, _db_version, version)
elif version < _db_version:
if not any(old == version and new == _db_version for old, new in _skip_reindex):
tty.warn(
@@ -814,6 +813,8 @@ def check(cond, msg):
for k, v in self._data.items()
)
spec_reader = reader(version)
def invalid_record(hash_key, error):
return CorruptDatabaseError(
f"Invalid record in Spack database: hash: {hash_key}, cause: "
@@ -1642,7 +1643,7 @@ class CorruptDatabaseError(SpackError):
class NonConcreteSpecAddError(SpackError):
"""Raised when attemptint to add non-concrete spec to DB."""
"""Raised when attempting to add non-concrete spec to DB."""
class MissingDependenciesError(SpackError):
@@ -1650,8 +1651,17 @@ class MissingDependenciesError(SpackError):
class InvalidDatabaseVersionError(SpackError):
def __init__(self, expected, found):
super(InvalidDatabaseVersionError, self).__init__(
"Expected database version %s but found version %s." % (expected, found),
"`spack reindex` may fix this, or you may need a newer " "Spack version.",
"""Exception raised when the database metadata is newer than current Spack."""
def __init__(self, database, expected, found):
self.expected = expected
self.found = found
msg = (
f"you need a newer Spack version to read the DB in '{database.root}'. "
f"{self.database_version_message}"
)
super(InvalidDatabaseVersionError, self).__init__(msg)
@property
def database_version_message(self):
return f"The expected DB version is '{self.expected}', but '{self.found}' was found."

View File

@@ -218,8 +218,10 @@ def update_configuration(detected_packages, scope=None, buildable=True):
def _windows_drive():
"""Return Windows drive string"""
return os.environ["HOMEDRIVE"]
"""Return Windows drive string extracted from PROGRAMFILES
env var, which is guaranteed to be defined for all logins"""
drive = re.match(r"([a-zA-Z]:)", os.environ["PROGRAMFILES"]).group(1)
return drive
class WindowsCompilerExternalPaths(object):

View File

@@ -16,18 +16,24 @@
The high-level format of a Spack lockfile hasn't changed much between versions, but the
contents have. Lockfiles are JSON-formatted and their top-level sections are:
1. ``_meta`` (object): this contains deatails about the file format, including:
1. ``_meta`` (object): this contains details about the file format, including:
* ``file-type``: always ``"spack-lockfile"``
* ``lockfile-version``: an integer representing the lockfile format version
* ``specfile-version``: an integer representing the spec format version (since
``v0.17``)
2. ``roots`` (list): an ordered list of records representing the roots of the Spack
2. ``spack`` (object): optional, this identifies information about Spack
used to concretize the environment:
* ``type``: required, identifies the form the Spack version took (e.g., ``git``, ``release``)
* ``commit``: the commit if the version is from git
* ``version``: the Spack version
3. ``roots`` (list): an ordered list of records representing the roots of the Spack
environment. Each has two fields:
* ``hash``: a Spack spec hash uniquely identifying the concrete root spec
* ``spec``: a string representation of the abstract spec that was concretized
3. ``concrete_specs``: a dictionary containing the specs in the environment.
4. ``concrete_specs``: a dictionary containing the specs in the environment.
Compatibility
-------------
@@ -271,6 +277,8 @@
Dependencies are keyed by ``hash`` (DAG hash) as well. There are no more ``build_hash``
fields in the specs, and there are no more issues with lockfiles being able to store
multiple specs with the same DAG hash (because the DAG hash is now finer-grained).
An optional ``spack`` property may be included to track version information, such as
the commit or version.
.. code-block:: json
@@ -278,8 +286,8 @@
{
"_meta": {
"file-type": "spack-lockfile",
"lockfile-version": 3,
"specfile-version": 2
"lockfile-version": 4,
"specfile-version": 3
},
"roots": [
{
@@ -326,7 +334,6 @@
}
}
}
"""
from .environment import (

View File

@@ -16,7 +16,7 @@
import urllib.parse
import urllib.request
import warnings
from typing import Any, Dict, List, Optional, Union
from typing import Any, Dict, List, Optional, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -31,6 +31,7 @@
import spack.error
import spack.hash_types as ht
import spack.hooks
import spack.main
import spack.paths
import spack.repo
import spack.schema.env
@@ -1118,9 +1119,9 @@ def add(self, user_spec, list_name=user_speclist_name):
raise SpackEnvironmentError(f"No list {list_name} exists in environment {self.name}")
if list_name == user_speclist_name:
if not spec.name:
if spec.anonymous:
raise SpackEnvironmentError("cannot add anonymous specs to an environment")
elif not spack.repo.path.exists(spec.name):
elif not spack.repo.path.exists(spec.name) and not spec.abstract_hash:
virtuals = spack.repo.path.provider_index.providers.keys()
if spec.name not in virtuals:
msg = "no such package: %s" % spec.name
@@ -1364,67 +1365,110 @@ def concretize(self, force=False, tests=False):
msg = "concretization strategy not implemented [{0}]"
raise SpackEnvironmentError(msg.format(self.unify))
def _concretize_together_where_possible(self, tests=False):
def _get_specs_to_concretize(
self,
) -> Tuple[Set[spack.spec.Spec], Set[spack.spec.Spec], List[spack.spec.Spec]]:
"""Compute specs to concretize for unify:true and unify:when_possible.
This includes new user specs and any already concretized specs.
Returns:
Tuple of new user specs, user specs to keep, and the specs to concretize.
"""
# Exit early if the set of concretized specs is the set of user specs
new_user_specs = set(self.user_specs) - set(self.concretized_user_specs)
kept_user_specs = set(self.user_specs) & set(self.concretized_user_specs)
if not new_user_specs:
return new_user_specs, kept_user_specs, []
concrete_specs_to_keep = [
concrete
for abstract, concrete in self.concretized_specs()
if abstract in kept_user_specs
]
specs_to_concretize = list(new_user_specs) + concrete_specs_to_keep
return new_user_specs, kept_user_specs, specs_to_concretize
def _concretize_together_where_possible(
self, tests: bool = False
) -> List[Tuple[spack.spec.Spec, spack.spec.Spec]]:
# Avoid cyclic dependency
import spack.solver.asp
# Exit early if the set of concretized specs is the set of user specs
user_specs_did_not_change = not bool(
set(self.user_specs) - set(self.concretized_user_specs)
)
if user_specs_did_not_change:
new_user_specs, _, specs_to_concretize = self._get_specs_to_concretize()
if not new_user_specs:
return []
# Proceed with concretization
old_concrete_to_abstract = {
concrete: abstract for (abstract, concrete) in self.concretized_specs()
}
self.concretized_user_specs = []
self.concretized_order = []
self.specs_by_hash = {}
result_by_user_spec = {}
solver = spack.solver.asp.Solver()
for result in solver.solve_in_rounds(self.user_specs, tests=tests):
for result in solver.solve_in_rounds(specs_to_concretize, tests=tests):
result_by_user_spec.update(result.specs_by_input)
result = []
for abstract, concrete in sorted(result_by_user_spec.items()):
# If the "abstract" spec is a concrete spec from the previous concretization
# translate it back to an abstract spec. Otherwise, keep the abstract spec
abstract = old_concrete_to_abstract.get(abstract, abstract)
if abstract in new_user_specs:
result.append((abstract, concrete))
self._add_concrete_spec(abstract, concrete)
result.append((abstract, concrete))
return result
def _concretize_together(self, tests=False):
def _concretize_together(
self, tests: bool = False
) -> List[Tuple[spack.spec.Spec, spack.spec.Spec]]:
"""Concretization strategy that concretizes all the specs
in the same DAG.
"""
# Exit early if the set of concretized specs is the set of user specs
user_specs_did_not_change = not bool(
set(self.user_specs) - set(self.concretized_user_specs)
)
if user_specs_did_not_change:
new_user_specs, kept_user_specs, specs_to_concretize = self._get_specs_to_concretize()
if not new_user_specs:
return []
# Proceed with concretization
self.concretized_user_specs = []
self.concretized_order = []
self.specs_by_hash = {}
try:
concrete_specs = spack.concretize.concretize_specs_together(
*self.user_specs, tests=tests
concrete_specs: List[spack.spec.Spec] = spack.concretize.concretize_specs_together(
*specs_to_concretize, tests=tests
)
except spack.error.UnsatisfiableSpecError as e:
# "Enhance" the error message for multiple root specs, suggest a less strict
# form of concretization.
if len(self.user_specs) > 1:
e.message += ". "
if kept_user_specs:
e.message += (
"Couldn't concretize without changing the existing environment. "
"If you are ok with changing it, try `spack concretize --force`. "
)
e.message += (
". Consider setting `concretizer:unify` to `when_possible` "
"or `false` to relax the concretizer strictness."
"You could consider setting `concretizer:unify` to `when_possible` "
"or `false` to allow multiple versions of some packages."
)
raise
concretized_specs = [x for x in zip(self.user_specs, concrete_specs)]
# set() | set() does not preserve ordering, even though sets are ordered
ordered_user_specs = list(new_user_specs) + list(kept_user_specs)
concretized_specs = [x for x in zip(ordered_user_specs, concrete_specs)]
for abstract, concrete in concretized_specs:
self._add_concrete_spec(abstract, concrete)
return concretized_specs
# zip truncates the longer list, which is exactly what we want here
return list(zip(new_user_specs, concrete_specs))
def _concretize_separately(self, tests=False):
"""Concretization strategy that concretizes separately one
@@ -1475,7 +1519,10 @@ def _concretize_separately(self, tests=False):
# Solve the environment in parallel on Linux
start = time.time()
max_processes = min(len(arguments), 16) # Number of specs # Cap on 16 cores
max_processes = min(
len(arguments), # Number of specs
spack.config.get("config:build_jobs"), # Cap on build jobs
)
# TODO: revisit this print as soon as darwin is parallel too
msg = "Starting concretization"
@@ -2069,6 +2116,14 @@ def _to_lockfile_dict(self):
hash_spec_list = zip(self.concretized_order, self.concretized_user_specs)
spack_dict = {"version": spack.spack_version}
spack_commit = spack.main.get_spack_commit()
if spack_commit:
spack_dict["type"] = "git"
spack_dict["commit"] = spack_commit
else:
spack_dict["type"] = "release"
# this is the lockfile we'll write out
data = {
# metadata about the format
@@ -2077,6 +2132,8 @@ def _to_lockfile_dict(self):
"lockfile-version": lockfile_format_version,
"specfile-version": spack.spec.SPECFILE_FORMAT_VERSION,
},
# spack version information
"spack": spack_dict,
# users specs + hashes are the 'roots' of the environment
"roots": [{"hash": h, "spec": str(s)} for h, s in hash_spec_list],
# Concrete specs by hash, including dependencies
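With the hunks above, the lockfile metadata gains a "spack" section recording how this Spack instance was obtained. Two illustrative shapes it can take; the version and commit values below are made up:

{"version": "0.21.0.dev0", "type": "git",
 "commit": "0123456789abcdef0123456789abcdef01234567"}  # running from a git checkout
{"version": "0.20.0", "type": "release"}                 # running from a release tarball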
@@ -2113,10 +2170,12 @@ def _read_lockfile_dict(self, d):
reader = READER_CLS[current_lockfile_format]
except KeyError:
msg = (
f"Spack {spack.__version__} cannot read environment lockfiles using the "
f"v{current_lockfile_format} format"
f"Spack {spack.__version__} cannot read the lockfile '{self.lock_path}', using "
f"the v{current_lockfile_format} format."
)
raise RuntimeError(msg)
if lockfile_format_version < current_lockfile_format:
msg += " You need to use a newer Spack version."
raise SpackEnvironmentError(msg)
# First pass: Put each spec in the map ignoring dependencies
for lockfile_key, node_dict in json_specs_by_hash.items():
@@ -2308,6 +2367,7 @@ def display_specs(concretized_specs):
def _tree_to_display(spec):
return spec.tree(
recurse_dependencies=True,
format=spack.spec.display_format,
status_fn=spack.spec.Spec.install_status,
hashlen=7,
hashes=True,

View File

@@ -3,13 +3,14 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import functools as ft
import itertools
import os
import re
import shutil
import stat
import sys
from typing import Optional
from llnl.util import tty
from llnl.util.filesystem import (
@@ -32,12 +33,14 @@
import spack.config
import spack.projections
import spack.relocate
import spack.schema.projections
import spack.spec
import spack.store
import spack.util.spack_json as s_json
import spack.util.spack_yaml as s_yaml
from spack.error import SpackError
from spack.hooks import sbang
__all__ = ["FilesystemView", "YamlFilesystemView"]
@@ -57,50 +60,47 @@ def view_hardlink(src, dst, **kwargs):
os.link(src, dst)
def view_copy(src, dst, view, spec=None):
def view_copy(src: str, dst: str, view, spec: Optional[spack.spec.Spec] = None):
"""
Copy a file from src to dst.
Use spec and view to generate relocations
"""
shutil.copy2(src, dst)
if spec and not spec.external:
# Not metadata, we have to relocate it
shutil.copy2(src, dst, follow_symlinks=False)
# Get information on where to relocate from/to
# No need to relocate if no metadata or external.
if not spec or spec.external:
return
# This is vestigial code for the *old* location of sbang. Previously,
# sbang was a bash script, and it lived in the spack prefix. It is
# now a POSIX script that lives in the install prefix. Old packages
# will have the old sbang location in their shebangs.
# TODO: Not sure which one to use...
import spack.hooks.sbang as sbang
# Order of this dict is somewhat irrelevant
prefix_to_projection = {
s.prefix: view.get_projection_for_spec(s)
for s in spec.traverse(root=True, order="breadth")
if not s.external
}
# Break a package include cycle
import spack.relocate
src_stat = os.lstat(src)
orig_sbang = "#!/bin/bash {0}/bin/sbang".format(spack.paths.spack_root)
new_sbang = sbang.sbang_shebang_line()
# TODO: change this into a bulk operation instead of a per-file operation
prefix_to_projection = collections.OrderedDict(
{spec.prefix: view.get_projection_for_spec(spec)}
)
if stat.S_ISLNK(src_stat.st_mode):
spack.relocate.relocate_links(links=[dst], prefix_to_prefix=prefix_to_projection)
elif spack.relocate.is_binary(dst):
spack.relocate.relocate_text_bin(binaries=[dst], prefixes=prefix_to_projection)
else:
prefix_to_projection[spack.store.layout.root] = view._root
for dep in spec.traverse():
if not dep.external:
prefix_to_projection[dep.prefix] = view.get_projection_for_spec(dep)
# This is vestigial code for the *old* location of sbang.
prefix_to_projection[
"#!/bin/bash {0}/bin/sbang".format(spack.paths.spack_root)
] = sbang.sbang_shebang_line()
if spack.relocate.is_binary(dst):
spack.relocate.relocate_text_bin(binaries=[dst], prefixes=prefix_to_projection)
else:
prefix_to_projection[spack.store.layout.root] = view._root
prefix_to_projection[orig_sbang] = new_sbang
spack.relocate.relocate_text(files=[dst], prefixes=prefix_to_projection)
try:
stat = os.stat(src)
os.chown(dst, stat.st_uid, stat.st_gid)
except OSError:
tty.debug("Can't change the permissions for %s" % dst)
spack.relocate.relocate_text(files=[dst], prefixes=prefix_to_projection)
try:
os.chown(dst, src_stat.st_uid, src_stat.st_gid)
except OSError:
tty.debug("Can't change the permissions for %s" % dst)
def view_func_parser(parsed_name):

File diff suppressed because it is too large

View File

@@ -278,6 +278,19 @@ def _print_installed_pkg(message):
print(colorize("@*g{[+]} ") + spack.util.path.debug_padded_filter(message))
def print_install_test_log(pkg: "spack.package_base.PackageBase"):
"""Output install test log file path but only if have test failures.
Args:
pkg: instance of the package under test
"""
if not pkg.run_tests or not (pkg.tester and pkg.tester.test_failures):
# The tests were not run or there were no test failures
return
pkg.tester.print_log_path()
def _print_timer(pre, pkg_id, timer):
phases = ["{}: {}.".format(p.capitalize(), _hms(timer.duration(p))) for p in timer.phases]
phases.append("Total: {}".format(_hms(timer.duration())))
@@ -536,6 +549,25 @@ def install_msg(name, pid):
return pre + colorize("@*{Installing} @*g{%s}" % name)
def archive_install_logs(pkg, phase_log_dir):
"""
Copy install logs to their destination directory(ies)
Args:
pkg (spack.package_base.PackageBase): the package that was built and installed
phase_log_dir (str): path to the archive directory
"""
# Archive the whole stdout + stderr for the package
fs.install(pkg.log_path, pkg.install_log_path)
# Archive all phase log paths
for phase_log in pkg.phase_log_files:
log_file = os.path.basename(phase_log)
fs.install(phase_log, os.path.join(phase_log_dir, log_file))
# Archive the install-phase test log, if present
pkg.archive_install_test_log()
def log(pkg):
"""
Copy provenance into the install directory on success
@@ -553,22 +585,11 @@ def log(pkg):
# FIXME : this potentially catches too many things...
tty.debug(e)
# Archive the whole stdout + stderr for the package
fs.install(pkg.log_path, pkg.install_log_path)
# Archive all phase log paths
for phase_log in pkg.phase_log_files:
log_file = os.path.basename(phase_log)
log_file = os.path.join(os.path.dirname(packages_dir), log_file)
fs.install(phase_log, log_file)
archive_install_logs(pkg, os.path.dirname(packages_dir))
# Archive the environment modifications for the build.
fs.install(pkg.env_mods_path, pkg.install_env_path)
# Archive the install-phase test log, if present
if pkg.test_install_log_path and os.path.exists(pkg.test_install_log_path):
fs.install(pkg.test_install_log_path, pkg.install_test_install_log_path)
if os.path.exists(pkg.configure_args_path):
# Archive the args used for the build
fs.install(pkg.configure_args_path, pkg.install_configure_args_path)
@@ -1932,14 +1953,17 @@ def run(self):
self._real_install()
# Run post install hooks before build stage is removed.
self.timer.start("post-install")
spack.hooks.post_install(self.pkg.spec, self.explicit)
self.timer.stop("post-install")
# Stop the timer and save results
self.timer.stop()
with open(self.pkg.times_log_path, "w") as timelog:
self.timer.write_json(timelog)
# Run post install hooks before build stage is removed.
spack.hooks.post_install(self.pkg.spec, self.explicit)
print_install_test_log(self.pkg)
_print_timer(pre=self.pre, pkg_id=self.pkg_id, timer=self.timer)
_print_installed_pkg(self.pkg.prefix)

View File

@@ -126,6 +126,36 @@ def add_all_commands(parser):
parser.add_command(cmd)
def get_spack_commit():
"""Get the Spack git commit sha.
Returns:
(str or None) the commit sha if available, otherwise None
"""
git_path = os.path.join(spack.paths.prefix, ".git")
if not os.path.exists(git_path):
return None
git = spack.util.git.git()
if not git:
return None
rev = git(
"-C",
spack.paths.prefix,
"rev-parse",
"HEAD",
output=str,
error=os.devnull,
fail_on_error=False,
)
if git.returncode != 0:
return None
match = re.match(r"[a-f\d]{7,}$", rev)
return match.group(0) if match else None
def get_version():
"""Get a descriptive version of this instance of Spack.
@@ -134,25 +164,9 @@ def get_version():
The commit sha is only added when available.
"""
version = spack.spack_version
git_path = os.path.join(spack.paths.prefix, ".git")
if os.path.exists(git_path):
git = spack.util.git.git()
if not git:
return version
rev = git(
"-C",
spack.paths.prefix,
"rev-parse",
"HEAD",
output=str,
error=os.devnull,
fail_on_error=False,
)
if git.returncode != 0:
return version
match = re.match(r"[a-f\d]{7,}$", rev)
if match:
version += " ({0})".format(match.group(0))
commit = get_spack_commit()
if commit:
version += " ({0})".format(commit)
return version
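A quick illustration of the refactored behavior; the sha is hypothetical:

import spack.main
spack.main.get_spack_commit()  # "0123456789ab..." in a git checkout, else None
spack.main.get_version()       # "0.20.0 (0123456789ab...)" or plain "0.20.0"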

View File

@@ -170,17 +170,12 @@ def merge_config_rules(configuration, spec):
Returns:
dict: actions to be taken on the spec passed as an argument
"""
# Get the top-level configuration for the module type we are using
module_specific_configuration = copy.deepcopy(configuration)
# Construct a dictionary with the actions we need to perform on the spec
# passed as a parameter
# Construct a dictionary with the actions we need to perform on the spec passed as a parameter
spec_configuration = {}
# The keyword 'all' is always evaluated first, all the others are
# evaluated in order of appearance in the module file
spec_configuration = module_specific_configuration.pop("all", {})
for constraint, action in module_specific_configuration.items():
spec_configuration.update(copy.deepcopy(configuration.get("all", {})))
for constraint, action in configuration.items():
if spec.satisfies(constraint):
if hasattr(constraint, "override") and constraint.override:
spec_configuration = {}
@@ -200,14 +195,14 @@ def merge_config_rules(configuration, spec):
# configuration
# Hash length in module files
hash_length = module_specific_configuration.get("hash_length", 7)
hash_length = configuration.get("hash_length", 7)
spec_configuration["hash_length"] = hash_length
verbose = module_specific_configuration.get("verbose", False)
verbose = configuration.get("verbose", False)
spec_configuration["verbose"] = verbose
# module defaults per-package
defaults = module_specific_configuration.get("defaults", [])
defaults = configuration.get("defaults", [])
spec_configuration["defaults"] = defaults
return spec_configuration
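A small illustration of the merge order implemented above, using a hypothetical module configuration:

configuration = {
    "all": {"autoload": "direct"},
    "^mpi": {"environment": {"set": {"HAS_MPI": "1"}}},
    "hash_length": 7,
}
# For a spec satisfying "^mpi", the actions start from the "all" entry, then
# each matching constraint is applied in order of appearance, and finally the
# top-level hash_length/verbose/defaults keys are copied into the result.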

View File

@@ -7,7 +7,7 @@
import itertools
import os.path
import posixpath
from typing import Any, Dict
from typing import Any, Dict, List
import llnl.util.lang as lang
@@ -56,7 +56,7 @@ def make_context(spec, module_set_name, explicit):
return LmodContext(conf)
def guess_core_compilers(name, store=False):
def guess_core_compilers(name, store=False) -> List[spack.spec.CompilerSpec]:
"""Guesses the list of core compilers installed in the system.
Args:
@@ -64,21 +64,19 @@ def guess_core_compilers(name, store=False):
modules.yaml configuration file
Returns:
List of core compilers, if found, or None
List of found core compilers
"""
core_compilers = []
for compiler_config in spack.compilers.all_compilers_config():
for compiler in spack.compilers.all_compilers():
try:
compiler = compiler_config["compiler"]
# A compiler is considered to be a core compiler if any of the
# C, C++ or Fortran compilers reside in a system directory
is_system_compiler = any(
os.path.dirname(x) in spack.util.environment.SYSTEM_DIRS
for x in compiler["paths"].values()
if x is not None
os.path.dirname(getattr(compiler, x, "")) in spack.util.environment.SYSTEM_DIRS
for x in ("cc", "cxx", "f77", "fc")
)
if is_system_compiler:
core_compilers.append(str(compiler["spec"]))
core_compilers.append(compiler.spec)
except (KeyError, TypeError, AttributeError):
continue
@@ -89,10 +87,10 @@ def guess_core_compilers(name, store=False):
modules_cfg = spack.config.get(
"modules:" + name, {}, scope=spack.config.default_modify_scope()
)
modules_cfg.setdefault("lmod", {})["core_compilers"] = core_compilers
modules_cfg.setdefault("lmod", {})["core_compilers"] = [str(x) for x in core_compilers]
spack.config.set("modules:" + name, modules_cfg, scope=spack.config.default_modify_scope())
return core_compilers or None
return core_compilers
class LmodConfiguration(BaseConfiguration):
@@ -104,7 +102,7 @@ class LmodConfiguration(BaseConfiguration):
default_projections = {"all": posixpath.join("{name}", "{version}")}
@property
def core_compilers(self):
def core_compilers(self) -> List[spack.spec.CompilerSpec]:
"""Returns the list of "Core" compilers
Raises:
@@ -112,14 +110,18 @@ def core_compilers(self):
specified in the configuration file or the sequence
is empty
"""
value = configuration(self.name).get("core_compilers") or guess_core_compilers(
self.name, store=True
)
compilers = [
spack.spec.CompilerSpec(c) for c in configuration(self.name).get("core_compilers", [])
]
if not value:
if not compilers:
compilers = guess_core_compilers(self.name, store=True)
if not compilers:
msg = 'the key "core_compilers" must be set in modules.yaml'
raise CoreCompilersNotFoundError(msg)
return value
return compilers
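After this change an explicitly configured list wins, and entries are parsed into CompilerSpec objects up front. A sketch with a hypothetical configuration:

import spack.spec
# modules.yaml equivalent: modules:default:lmod:core_compilers: ["gcc@4.8.5"]
configured = ["gcc@4.8.5"]
compilers = [spack.spec.CompilerSpec(c) for c in configured]
# Only if this list is empty does the code fall back to guess_core_compilers(),
# which now also returns CompilerSpec objects rather than plain strings.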
@property
def core_specs(self):
@@ -197,11 +199,11 @@ def provides(self):
# If it is in the list of supported compilers family -> compiler
if self.spec.name in spack.compilers.supported_compilers():
provides["compiler"] = spack.spec.CompilerSpec(self.spec.format("{name}{@version}"))
provides["compiler"] = spack.spec.CompilerSpec(self.spec.format("{name}{@versions}"))
elif self.spec.name in spack.compilers.package_name_to_compiler_name:
# If it is the package for a supported compiler, but of a different name
cname = spack.compilers.package_name_to_compiler_name[self.spec.name]
provides["compiler"] = spack.spec.CompilerSpec("%s@%s" % (cname, self.spec.version))
provides["compiler"] = spack.spec.CompilerSpec(cname, self.spec.versions)
# All the other tokens in the hierarchy must be virtual dependencies
for x in self.hierarchy_tokens:
@@ -283,16 +285,18 @@ def token_to_path(self, name, value):
# If we are dealing with a core compiler, return 'Core'
core_compilers = self.conf.core_compilers
if name == "compiler" and str(value) in core_compilers:
if name == "compiler" and any(
spack.spec.CompilerSpec(value).satisfies(c) for c in core_compilers
):
return "Core"
# CompilerSpec does not have an hash, as we are not allowed to
# CompilerSpec does not have a hash, as we are not allowed to
# use different flavors of the same compiler
if name == "compiler":
return path_part_fmt.format(token=value)
# In case the hierarchy token refers to a virtual provider
# we need to append an hash to the version to distinguish
# we need to append a hash to the version to distinguish
# among flavors of the same library (e.g. openblas~openmp vs.
# openblas+openmp)
path = path_part_fmt.format(token=value)

View File

@@ -161,7 +161,7 @@ def make_compilers(self, compiler_id, paths):
compilers = []
for v in compiler_id.version:
comp = cmp_cls(
spack.spec.CompilerSpec(name + "@" + v),
spack.spec.CompilerSpec(name + "@=" + v),
self,
"any",
["cc", "CC", "ftn"],

View File

@@ -69,7 +69,15 @@
from spack.builder import run_after, run_before
from spack.dependency import all_deptypes
from spack.directives import *
from spack.install_test import get_escaped_text_output
from spack.install_test import (
SkipTest,
cache_extra_test_sources,
check_outputs,
find_required_file,
get_escaped_text_output,
install_test_root,
test_part,
)
from spack.installer import (
ExternalPackageError,
InstallError,
@@ -100,6 +108,5 @@
# These are just here for editor support; they will be replaced when the build env
# is set up.
make = MakeExecutable("make", jobs=1)
gmake = MakeExecutable("gmake", jobs=1)
ninja = MakeExecutable("ninja", jobs=1)
configure = Executable(join_path(".", "configure"))

View File

@@ -25,13 +25,12 @@
import textwrap
import time
import traceback
import types
import warnings
from typing import Any, Callable, Dict, Iterable, List, Optional, Tuple, Type
from typing import Any, Callable, Dict, Iterable, List, Optional, Tuple, Type, TypeVar
import llnl.util.filesystem as fsys
import llnl.util.tty as tty
from llnl.util.lang import classproperty, memoized, nullcontext
from llnl.util.lang import classproperty, memoized
from llnl.util.link_tree import LinkTree
import spack.compilers
@@ -55,12 +54,18 @@
import spack.util.path
import spack.util.web
from spack.filesystem_view import YamlFilesystemView
from spack.install_test import TestFailure, TestSuite
from spack.install_test import (
PackageTest,
TestFailure,
TestStatus,
TestSuite,
cache_extra_test_sources,
install_test_root,
)
from spack.installer import InstallError, PackageInstaller
from spack.stage import ResourceStage, Stage, StageComposite, compute_stage_name
from spack.util.executable import ProcessError, which
from spack.util.package_hash import package_hash
from spack.util.prefix import Prefix
from spack.util.web import FetchError
from spack.version import GitVersion, StandardVersion, Version
@@ -73,24 +78,21 @@
_ALLOWED_URL_SCHEMES = ["http", "https", "ftp", "file", "git"]
# Filename for the Spack build/install log.
#: Filename for the Spack build/install log.
_spack_build_logfile = "spack-build-out.txt"
# Filename for the Spack build/install environment file.
#: Filename for the Spack build/install environment file.
_spack_build_envfile = "spack-build-env.txt"
# Filename for the Spack build/install environment modifications file.
#: Filename for the Spack build/install environment modifications file.
_spack_build_envmodsfile = "spack-build-env-mods.txt"
# Filename for the Spack install phase-time test log.
_spack_install_test_log = "install-time-test-log.txt"
# Filename of json with total build and phase times (seconds)
_spack_times_log = "install_times.json"
# Filename for the Spack configure args file.
#: Filename for the Spack configure args file.
_spack_configure_argsfile = "spack-configure-args.txt"
#: Filename of json with total build and phase times (seconds)
spack_times_log = "install_times.json"
def deprecated_version(pkg, version):
"""Return True if the version is deprecated, False otherwise.
@@ -181,8 +183,7 @@ class DetectablePackageMeta(object):
def __init__(cls, name, bases, attr_dict):
if hasattr(cls, "executables") and hasattr(cls, "libraries"):
msg = "a package can have either an 'executables' or 'libraries' attribute"
msg += " [package '{0.name}' defines both]"
raise spack.error.SpackError(msg.format(cls))
raise spack.error.SpackError(f"{msg} [package '{name}' defines both]")
# On windows, extend the list of regular expressions to look for
# filenames ending with ".exe"
@@ -423,17 +424,7 @@ def remove_files_from_view(self, view, merge_map):
view.remove_files(merge_map.values())
def test_log_pathname(test_stage, spec):
"""Build the pathname of the test log file
Args:
test_stage (str): path to the test stage directory
spec (spack.spec.Spec): instance of the spec under test
Returns:
(str): the pathname of the test log file
"""
return os.path.join(test_stage, "test-{0}-out.txt".format(TestSuite.test_pkg_id(spec)))
Pb = TypeVar("Pb", bound="PackageBase")
class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
@@ -638,19 +629,13 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
"tags",
]
#: Boolean. If set to ``True``, the smoke/install test requires a compiler.
#: This is currently used by smoke tests to ensure a compiler is available
#: to build a custom test code.
test_requires_compiler = False
#: Set to ``True`` to indicate the stand-alone test requires a compiler.
#: It is used to ensure a compiler and build dependencies like 'cmake'
#: are available to build a custom test code.
test_requires_compiler: bool = False
#: List of test failures encountered during a smoke/install test run.
test_failures = None
#: TestSuite instance used to manage smoke/install tests for one or more specs.
test_suite = None
#: Path to the log file used for tests
test_log_file = None
#: TestSuite instance used to manage stand-alone tests for 1+ specs.
test_suite: Optional["TestSuite"] = None
def __init__(self, spec):
# this determines how the package should be built.
@@ -672,6 +657,7 @@ def __init__(self, spec):
# init internal variables
self._stage = None
self._fetcher = None
self._tester: Optional["PackageTest"] = None
# Set up timing variables
self._fetch_time = 0.0
@@ -736,9 +722,9 @@ def possible_dependencies(
for name, conditions in cls.dependencies.items():
# check whether this dependency could be of the type asked for
types = [dep.type for cond, dep in conditions.items()]
types = set.union(*types)
if not any(d in types for d in deptype):
deptypes = [dep.type for cond, dep in conditions.items()]
deptypes = set.union(*deptypes)
if not any(d in deptypes for d in deptype):
continue
# expand virtuals if enabled, otherwise just stop at virtuals
@@ -1148,30 +1134,41 @@ def configure_args_path(self):
"""Return the configure args file path associated with staging."""
return os.path.join(self.stage.path, _spack_configure_argsfile)
@property
def test_install_log_path(self):
"""Return the install phase-time test log file path, if set."""
return getattr(self, "test_log_file", None)
@property
def install_test_install_log_path(self):
"""Return the install location for the install phase-time test log."""
return fsys.join_path(self.metadata_dir, _spack_install_test_log)
@property
def times_log_path(self):
"""Return the times log json file."""
return os.path.join(self.metadata_dir, _spack_times_log)
return os.path.join(self.metadata_dir, spack_times_log)
@property
def install_configure_args_path(self):
"""Return the configure args file path on successful installation."""
return os.path.join(self.metadata_dir, _spack_configure_argsfile)
# TODO (post-34236): Update tests and all packages that use this as a
# TODO (post-34236): package method to the function already available
# TODO (post-34236): to packages. Once done, remove this property.
@property
def install_test_root(self):
"""Return the install test root directory."""
return os.path.join(self.metadata_dir, "test")
tty.warn(
"The 'pkg.install_test_root' property is deprecated with removal "
"expected v0.21. Use 'install_test_root(pkg)' instead."
)
return install_test_root(self)
def archive_install_test_log(self):
"""Archive the install-phase test log, if present."""
if getattr(self, "tester", None):
self.tester.archive_install_test_log(self.metadata_dir)
@property
def tester(self):
if not self.spec.versions.concrete:
raise ValueError("Cannot retrieve tester for package without concrete version.")
if not self._tester:
self._tester = PackageTest(self)
return self._tester
@property
def installed(self):
@@ -1208,7 +1205,7 @@ def _make_fetcher(self):
@property
def fetcher(self):
if not self.spec.versions.concrete:
raise ValueError("Cannot retrieve fetcher for" " package without concrete version.")
raise ValueError("Cannot retrieve fetcher for package without concrete version.")
if not self._fetcher:
self._fetcher = self._make_fetcher()
return self._fetcher
@@ -1842,6 +1839,9 @@ def do_install(self, **kwargs):
builder = PackageInstaller([(self, kwargs)])
builder.install()
# TODO (post-34236): Update tests and all packages that use this as a
# TODO (post-34236): package method to the routine made available to
# TODO (post-34236): packages. Once done, remove this method.
def cache_extra_test_sources(self, srcs):
"""Copy relative source paths to the corresponding install test subdir
@@ -1856,45 +1856,13 @@ def cache_extra_test_sources(self, srcs):
be copied to the corresponding location(s) under the install
testing directory.
"""
paths = [srcs] if isinstance(srcs, str) else srcs
for path in paths:
src_path = os.path.join(self.stage.source_path, path)
dest_path = os.path.join(self.install_test_root, path)
if os.path.isdir(src_path):
fsys.install_tree(src_path, dest_path)
else:
fsys.mkdirp(os.path.dirname(dest_path))
fsys.copy(src_path, dest_path)
@contextlib.contextmanager
def _setup_test(self, verbose, externals):
self.test_failures = []
if self.test_suite:
self.test_log_file = self.test_suite.log_file_for_spec(self.spec)
self.tested_file = self.test_suite.tested_file_for_spec(self.spec)
pkg_id = self.test_suite.test_pkg_id(self.spec)
else:
self.test_log_file = fsys.join_path(self.stage.path, _spack_install_test_log)
self.test_suite = TestSuite([self.spec])
self.test_suite.stage = self.stage.path
pkg_id = self.spec.format("{name}-{version}-{hash:7}")
fsys.touch(self.test_log_file) # Otherwise log_parse complains
with tty.log.log_output(self.test_log_file, verbose) as logger:
with logger.force_echo():
tty.msg("Testing package {0}".format(pkg_id))
# use debug print levels for log file to record commands
old_debug = tty.is_debug()
tty.set_debug(True)
try:
yield logger
finally:
# reset debug level
tty.set_debug(old_debug)
msg = (
"'pkg.cache_extra_test_sources(srcs) is deprecated with removal "
"expected in v0.21. Use 'cache_extra_test_sources(pkg, srcs)' "
"instead."
)
warnings.warn(msg)
cache_extra_test_sources(self, srcs)
def do_test(self, dirty=False, externals=False):
if self.test_requires_compiler:
@@ -1909,15 +1877,31 @@ def do_test(self, dirty=False, externals=False):
)
return
kwargs = {"dirty": dirty, "fake": False, "context": "test", "externals": externals}
if tty.is_verbose():
kwargs["verbose"] = True
spack.build_environment.start_build_process(self, test_process, kwargs)
kwargs = {
"dirty": dirty,
"fake": False,
"context": "test",
"externals": externals,
"verbose": tty.is_verbose(),
}
self.tester.stand_alone_tests(kwargs)
# TODO (post-34236): Remove this deprecated method when eliminating test,
# TODO (post-34236): run_test, etc.
@property
def _test_deprecated_warning(self):
alt = f"Use any name starting with 'test_' instead in {self.spec.name}."
return f"The 'test' method is deprecated. {alt}"
# TODO (post-34236): Remove this deprecated method when eliminating test,
# TODO (post-34236): run_test, etc.
def test(self):
# Defer tests to virtual and concrete packages
pass
warnings.warn(self._test_deprecated_warning)
# TODO (post-34236): Remove this deprecated method when eliminating test,
# TODO (post-34236): run_test, etc.
def run_test(
self,
exe,
@@ -1925,7 +1909,7 @@ def run_test(
expected=[],
status=0,
installed=False,
purpose="",
purpose=None,
skip_missing=False,
work_dir=None,
):
@@ -1947,22 +1931,56 @@ def run_test(
in the install prefix bin directory or the provided work_dir
work_dir (str or None): path to the smoke test directory
"""
def test_title(purpose, test_name):
if not purpose:
return f"test: {test_name}: execute {test_name}"
match = re.search(r"test: ([^:]*): (.*)", purpose)
if match:
# The test title has all the expected parts
return purpose
match = re.search(r"test: (.*)", purpose)
if match:
reason = match.group(1)
return f"test: {test_name}: {reason}"
return f"test: {test_name}: {purpose}"
base_exe = os.path.basename(exe)
alternate = f"Use 'test_part' instead for {self.spec.name} to process {base_exe}."
warnings.warn(f"The 'run_test' method is deprecated. {alternate}")
extra = re.compile(r"[\s,\- ]")
details = (
[extra.sub("", options)]
if isinstance(options, str)
else [extra.sub("", os.path.basename(opt)) for opt in options]
)
details = "_".join([""] + details) if details else ""
test_name = f"test_{base_exe}{details}"
tty.info(test_title(purpose, test_name), format="g")
wdir = "." if work_dir is None else work_dir
with fsys.working_dir(wdir, create=True):
try:
runner = which(exe)
if runner is None and skip_missing:
self.tester.status(test_name, TestStatus.SKIPPED, f"{exe} is missing")
return
assert runner is not None, "Failed to find executable '{0}'".format(exe)
assert runner is not None, f"Failed to find executable '{exe}'"
self._run_test_helper(runner, options, expected, status, installed, purpose)
print("PASSED")
self.tester.status(test_name, TestStatus.PASSED, None)
return True
except BaseException as e:
except (AssertionError, BaseException) as e:
# print a summary of the error to the log file
# so that cdash and junit reporters know about it
exc_type, _, tb = sys.exc_info()
print("FAILED: {0}".format(e))
self.tester.status(test_name, TestStatus.FAILED, str(e))
import traceback
# remove the current call frame to exclude the extract_stack
@@ -1991,7 +2009,7 @@ def run_test(
if exc_type is spack.util.executable.ProcessError:
out = io.StringIO()
spack.build_environment.write_log_summary(
out, "test", self.test_log_file, last=1
out, "test", self.tester.test_log_file, last=1
)
m = out.getvalue()
else:
@@ -1999,7 +2017,8 @@ def run_test(
# stack instead of from traceback.
# The traceback is truncated here, so we can't use it to
# traverse the stack.
m = "\n".join(spack.build_environment.get_package_context(tb))
context = spack.build_environment.get_package_context(tb)
m = "\n".join(context) if context else ""
exc = e # e is deleted after this block
@@ -2007,28 +2026,27 @@ def run_test(
if spack.config.get("config:fail_fast", False):
raise TestFailure([(exc, m)])
else:
self.test_failures.append((exc, m))
self.tester.add_failure(exc, m)
return False
# TODO (post-34236): Remove this deprecated method when eliminating test,
# TODO (post-34236): run_test, etc.
def _run_test_helper(self, runner, options, expected, status, installed, purpose):
status = [status] if isinstance(status, int) else status
expected = [expected] if isinstance(expected, str) else expected
options = [options] if isinstance(options, str) else options
if purpose:
tty.msg(purpose)
else:
tty.debug("test: {0}: expect command status in {1}".format(runner.name, status))
if installed:
msg = "Executable '{0}' expected in prefix".format(runner.name)
msg += ", found in {0} instead".format(runner.path)
msg = f"Executable '{runner.name}' expected in prefix, "
msg += f"found in {runner.path} instead"
assert runner.path.startswith(self.spec.prefix), msg
tty.msg(f"Expecting return code in {status}")
try:
output = runner(*options, output=str.split, error=str.split)
assert 0 in status, "Expected {0} execution to fail".format(runner.name)
assert 0 in status, f"Expected {runner.name} execution to fail"
except ProcessError as err:
output = str(err)
match = re.search(r"exited with status ([0-9]+)", output)
@@ -2037,8 +2055,8 @@ def _run_test_helper(self, runner, options, expected, status, installed, purpose
for check in expected:
cmd = " ".join([runner.name] + options)
msg = "Expected '{0}' to match output of `{1}`".format(check, cmd)
msg += "\n\nOutput: {0}".format(output)
msg = f"Expected '{check}' to match output of `{cmd}`"
msg += f"\n\nOutput: {output}"
assert re.search(check, output), msg
def unit_test_check(self):
@@ -2068,21 +2086,23 @@ def build_log_path(self):
return self.install_log_path if self.spec.installed else self.log_path
@classmethod
def inject_flags(cls: Type, name: str, flags: Iterable[str]) -> FLAG_HANDLER_RETURN_TYPE:
def inject_flags(cls: Type[Pb], name: str, flags: Iterable[str]) -> FLAG_HANDLER_RETURN_TYPE:
"""
flag_handler that injects all flags through the compiler wrapper.
"""
return flags, None, None
@classmethod
def env_flags(cls: Type, name: str, flags: Iterable[str]):
def env_flags(cls: Type[Pb], name: str, flags: Iterable[str]) -> FLAG_HANDLER_RETURN_TYPE:
"""
flag_handler that adds all flags to canonical environment variables.
"""
return None, flags, None
@classmethod
def build_system_flags(cls: Type, name: str, flags: Iterable[str]) -> FLAG_HANDLER_RETURN_TYPE:
def build_system_flags(
cls: Type[Pb], name: str, flags: Iterable[str]
) -> FLAG_HANDLER_RETURN_TYPE:
"""
flag_handler that passes flags to the build system arguments. Any
package using `build_system_flags` must also implement
@@ -2170,7 +2190,7 @@ def flag_handler(self) -> FLAG_HANDLER_TYPE:
return self._flag_handler
@flag_handler.setter
def flag_handler(self, var: FLAG_HANDLER_TYPE):
def flag_handler(self, var: FLAG_HANDLER_TYPE) -> None:
self._flag_handler = var
# The flag handler method is called for each of the allowed compiler flags.
@@ -2417,165 +2437,6 @@ def rpath_args(self):
def builder(self):
return spack.builder.create(self)
@staticmethod
def run_test_callbacks(builder, method_names, callback_type="install"):
"""Tries to call all of the listed methods, returning immediately
if the list is None."""
if not builder.pkg.run_tests or method_names is None:
return
fail_fast = spack.config.get("config:fail_fast", False)
with builder.pkg._setup_test(verbose=False, externals=False) as logger:
# Report running each of the methods in the build log
print_test_message(logger, "Running {0}-time tests".format(callback_type), True)
builder.pkg.test_suite.current_test_spec = builder.pkg.spec
builder.pkg.test_suite.current_base_spec = builder.pkg.spec
if "test" in method_names:
_copy_cached_test_files(builder.pkg, builder.pkg.spec)
for name in method_names:
try:
fn = getattr(builder, name)
msg = "RUN-TESTS: {0}-time tests [{1}]".format(callback_type, name)
print_test_message(logger, msg, True)
fn()
except AttributeError as e:
msg = "RUN-TESTS: method not implemented [{0}]".format(name)
print_test_message(logger, msg, True)
builder.pkg.test_failures.append((e, msg))
if fail_fast:
break
# Raise any collected failures here
if builder.pkg.test_failures:
raise TestFailure(builder.pkg.test_failures)
def has_test_method(pkg):
"""Determine if the package defines its own stand-alone test method.
Args:
pkg (str): the package being checked
Returns:
(bool): ``True`` if the package overrides the default method; else
``False``
"""
if not inspect.isclass(pkg):
tty.die("{0}: is not a class, it is {1}".format(pkg, type(pkg)))
return (issubclass(pkg, PackageBase) and pkg.test != PackageBase.test) or (
isinstance(pkg, PackageBase) and pkg.test.__func__ != PackageBase.test
)
def print_test_message(logger, msg, verbose):
if verbose:
with logger.force_echo():
tty.msg(msg)
else:
tty.msg(msg)
def _copy_cached_test_files(pkg, spec):
"""Copy any cached stand-alone test-related files."""
# copy installed test sources cache into test cache dir
if spec.concrete:
cache_source = spec.package.install_test_root
cache_dir = pkg.test_suite.current_test_cache_dir
if os.path.isdir(cache_source) and not os.path.exists(cache_dir):
fsys.install_tree(cache_source, cache_dir)
# copy test data into test data dir
data_source = Prefix(spec.package.package_dir).test
data_dir = pkg.test_suite.current_test_data_dir
if os.path.isdir(data_source) and not os.path.exists(data_dir):
# We assume data dir is used read-only
# maybe enforce this later
shutil.copytree(data_source, data_dir)
def test_process(pkg, kwargs):
verbose = kwargs.get("verbose", False)
externals = kwargs.get("externals", False)
with pkg._setup_test(verbose, externals) as logger:
if pkg.spec.external and not externals:
print_test_message(logger, "Skipped tests for external package", verbose)
return
if not pkg.spec.installed:
print_test_message(logger, "Skipped not installed package", verbose)
return
# run test methods from the package and all virtuals it
# provides. Virtuals have to be deduped by name
v_names = list(set([vspec.name for vspec in pkg.virtuals_provided]))
# hack for compilers that are not dependencies (yet)
# TODO: this all eventually goes away
c_names = ("gcc", "intel", "intel-parallel-studio", "pgi")
if pkg.name in c_names:
v_names.extend(["c", "cxx", "fortran"])
if pkg.spec.satisfies("llvm+clang"):
v_names.extend(["c", "cxx"])
test_specs = [pkg.spec] + [spack.spec.Spec(v_name) for v_name in sorted(v_names)]
ran_actual_test_function = False
try:
with fsys.working_dir(pkg.test_suite.test_dir_for_spec(pkg.spec)):
for spec in test_specs:
pkg.test_suite.current_test_spec = spec
# Fail gracefully if a virtual has no package/tests
try:
spec_pkg = spec.package
except spack.repo.UnknownPackageError:
continue
_copy_cached_test_files(pkg, spec)
# grab the function for each method so we can call
# it with the package
test_fn = spec_pkg.__class__.test
if not isinstance(test_fn, types.FunctionType):
test_fn = test_fn.__func__
# Skip any test methods consisting solely of 'pass'
# since they do not contribute to package testing.
source = (inspect.getsource(test_fn)).splitlines()[1:]
lines = (ln.strip() for ln in source)
statements = [ln for ln in lines if not ln.startswith("#")]
if len(statements) > 0 and statements[0] == "pass":
continue
# Run the tests
ran_actual_test_function = True
context = logger.force_echo if verbose else nullcontext
with context():
test_fn(pkg)
# If fail-fast was on, we error out above
# If we collect errors, raise them in batch here
if pkg.test_failures:
raise TestFailure(pkg.test_failures)
finally:
# flag the package as having been tested (i.e., ran one or more
# non-pass-only methods)
if ran_actual_test_function:
fsys.touch(pkg.tested_file)
# log one more test message to provide a completion timestamp
# for CDash reporting
tty.msg("Completed testing")
else:
print_test_message(logger, "No tests to run", verbose)
inject_flags = PackageBase.inject_flags
env_flags = PackageBase.env_flags
@@ -2663,16 +2524,6 @@ def __init__(self, message, long_msg=None):
super(PackageError, self).__init__(message, long_msg)
class PackageVersionError(PackageError):
"""Raised when a version URL cannot automatically be determined."""
def __init__(self, version):
super(PackageVersionError, self).__init__(
"Cannot determine a URL automatically for version %s" % version,
"Please provide a url for this version in the package.py file.",
)
class NoURLError(PackageError):
"""Raised when someone tries to build a URL for a package with no URLs."""

View File

@@ -241,6 +241,9 @@ def accept(self, kind: TokenType):
return True
return False
def expect(self, *kinds: TokenType):
return self.next_token and self.next_token.kind in kinds
class SpecParser:
"""Parse text into specs"""
@@ -257,7 +260,9 @@ def tokens(self) -> List[Token]:
"""
return list(filter(lambda x: x.kind != TokenType.WS, tokenize(self.literal_str)))
def next_spec(self, initial_spec: Optional[spack.spec.Spec] = None) -> spack.spec.Spec:
def next_spec(
self, initial_spec: Optional[spack.spec.Spec] = None
) -> Optional[spack.spec.Spec]:
"""Return the next spec parsed from text.
Args:
@@ -267,13 +272,16 @@ def next_spec(self, initial_spec: Optional[spack.spec.Spec] = None) -> spack.spe
Return
The spec that was parsed
"""
if not self.ctx.next_token:
return initial_spec
initial_spec = initial_spec or spack.spec.Spec()
root_spec = SpecNodeParser(self.ctx).parse(initial_spec)
while True:
if self.ctx.accept(TokenType.DEPENDENCY):
dependency = SpecNodeParser(self.ctx).parse(spack.spec.Spec())
dependency = SpecNodeParser(self.ctx).parse()
if dependency == spack.spec.Spec():
if dependency is None:
msg = (
"this dependency sigil needs to be followed by a package name "
"or a node attribute (version, variant, etc.)"
@@ -292,7 +300,7 @@ def next_spec(self, initial_spec: Optional[spack.spec.Spec] = None) -> spack.spe
def all_specs(self) -> List[spack.spec.Spec]:
"""Return all the specs that remain to be parsed"""
return list(iter(self.next_spec, spack.spec.Spec()))
return list(iter(self.next_spec, None))
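A hedged sketch of the new termination behavior: next_spec() now returns None at end of input, so iteration stops on None instead of on an empty Spec sentinel.

parser = SpecParser("zlib@1.2.13 ^cmake")
parser.all_specs()          # -> [Spec("zlib@1.2.13 ^cmake")]
SpecParser("").next_spec()  # -> None (previously an empty Spec())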
class SpecNodeParser:
@@ -306,7 +314,7 @@ def __init__(self, ctx):
self.has_version = False
self.has_hash = False
def parse(self, initial_spec: spack.spec.Spec) -> spack.spec.Spec:
def parse(self, initial_spec: Optional[spack.spec.Spec] = None) -> Optional[spack.spec.Spec]:
"""Parse a single spec node from a stream of tokens
Args:
@@ -315,7 +323,10 @@ def parse(self, initial_spec: spack.spec.Spec) -> spack.spec.Spec:
Return
The object passed as argument
"""
import spack.environment # Needed to retrieve by hash
if not self.ctx.next_token or self.ctx.expect(TokenType.DEPENDENCY):
return initial_spec
initial_spec = initial_spec or spack.spec.Spec()
# If we start with a package name we have a named spec, we cannot
# accept another package name afterwards in a node
@@ -390,27 +401,11 @@ def parse(self, initial_spec: spack.spec.Spec) -> spack.spec.Spec:
name = name.strip("'\" ")
value = value.strip("'\" ")
initial_spec._add_flag(name, value, propagate=True)
elif not self.has_hash and self.ctx.accept(TokenType.DAG_HASH):
dag_hash = self.ctx.current_token.value[1:]
matches = []
active_env = spack.environment.active_environment()
if active_env:
matches = active_env.get_by_hash(dag_hash)
if not matches:
matches = spack.store.db.get_by_hash(dag_hash)
if not matches:
raise spack.spec.NoSuchHashError(dag_hash)
if len(matches) != 1:
raise spack.spec.AmbiguousHashError(
f"Multiple packages specify hash beginning '{dag_hash}'.", *matches
)
spec_by_hash = matches[0]
if not spec_by_hash.satisfies(initial_spec):
raise spack.spec.InvalidHashError(initial_spec, spec_by_hash.dag_hash())
initial_spec._dup(spec_by_hash)
self.has_hash = True
elif self.ctx.expect(TokenType.DAG_HASH):
if initial_spec.abstract_hash:
break
self.ctx.accept(TokenType.DAG_HASH)
initial_spec.abstract_hash = self.ctx.current_token.value[1:]
else:
break
@@ -488,6 +483,11 @@ def parse_one_or_raise(
message += color.colorize(f"@*r{{{underline}}}")
raise ValueError(message)
if result is None:
message = "a single spec was requested, but none was parsed:"
message += f"\n{text}"
raise ValueError(message)
return result
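With the parser change above, a trailing "/<hash>" is no longer resolved against the database at parse time; it is stored on the spec for later lookup. A small sketch, hash value hypothetical:

spec = SpecParser("zlib/abcdef1").next_spec()
spec.abstract_hash  # "abcdef1" -- resolved later, e.g. against an active
                    # environment or the store database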

View File

@@ -1063,14 +1063,21 @@ def dump_provenance(self, spec, path):
"Repository %s does not contain package %s." % (self.namespace, spec.fullname)
)
# Install patch files needed by the package.
package_path = self.filename_for_package_name(spec.name)
if not os.path.exists(package_path):
# Spec has no files (e.g., package, patches) to copy
tty.debug(f"{spec.name} does not have a package to dump")
return
# Install patch files needed by the (concrete) package.
fs.mkdirp(path)
for patch in itertools.chain.from_iterable(spec.package.patches.values()):
if patch.path:
if os.path.exists(patch.path):
fs.install(patch.path, path)
else:
tty.warn("Patch file did not exist: %s" % patch.path)
if spec.concrete:
for patch in itertools.chain.from_iterable(spec.package.patches.values()):
if patch.path:
if os.path.exists(patch.path):
fs.install(patch.path, path)
else:
tty.warn("Patch file did not exist: %s" % patch.path)
# Install the package.py file itself.
fs.install(self.filename_for_package_name(spec.name), path)
@@ -1233,6 +1240,9 @@ def get_pkg_class(self, pkg_name):
module = importlib.import_module(fullname)
except ImportError:
raise UnknownPackageError(pkg_name)
except Exception as e:
msg = f"cannot load package '{pkg_name}' from the '{self.namespace}' repository: {e}"
raise RepoError(msg) from e
cls = getattr(module, class_name)
if not inspect.isclass(cls):

View File

@@ -133,8 +133,9 @@ def wrapper(instance, *args, **kwargs):
# Everything else is an error (the installation
# failed outside of the child process)
package["result"] = "error"
package["stdout"] = self.fetch_log(pkg)
package["message"] = str(exc) or "Unknown error"
package["stdout"] = self.fetch_log(pkg)
package["stdout"] += package["message"]
package["exception"] = traceback.format_exc()
raise

View File

@@ -12,7 +12,7 @@
import socket
import time
import xml.sax.saxutils
from typing import Dict
from typing import Dict, Optional
from urllib.parse import urlencode
from urllib.request import HTTPHandler, Request, build_opener
@@ -113,14 +113,14 @@ def report_build_name(self, pkg_name):
else self.base_buildname
)
def build_report_for_package(self, directory_name, package, duration):
def build_report_for_package(self, report_dir, package, duration):
if "stdout" not in package:
# Skip reporting on packages that did not generate any output.
# Skip reporting on packages that do not generate output.
return
self.current_package_name = package["name"]
self.buildname = self.report_build_name(self.current_package_name)
report_data = self.initialize_report(directory_name)
report_data = self.initialize_report(report_dir)
for phase in CDASH_PHASES:
report_data[phase] = {}
report_data[phase]["loglines"] = []
@@ -215,7 +215,7 @@ def clean_log_event(event):
report_file_name = package["name"] + "_" + report_name
else:
report_file_name = report_name
phase_report = os.path.join(directory_name, report_file_name)
phase_report = os.path.join(report_dir, report_file_name)
with codecs.open(phase_report, "w", "utf-8") as f:
env = spack.tengine.make_environment()
@@ -231,7 +231,7 @@ def clean_log_event(event):
f.write(t.render(report_data))
self.upload(phase_report)
def build_report(self, directory_name, specs):
def build_report(self, report_dir, specs):
# Do an initial scan to determine if we are generating reports for more
# than one package. When we're only reporting on a single package we
# do not explicitly include the package's name in the CDash build name.
@@ -260,7 +260,7 @@ def build_report(self, directory_name, specs):
if "time" in spec:
duration = int(spec["time"])
for package in spec["packages"]:
self.build_report_for_package(directory_name, package, duration)
self.build_report_for_package(report_dir, package, duration)
self.finalize_report()
def extract_standalone_test_data(self, package, phases, report_data):
@@ -273,13 +273,13 @@ def extract_standalone_test_data(self, package, phases, report_data):
testing["generator"] = self.generator
testing["parts"] = extract_test_parts(package["name"], package["stdout"].splitlines())
def report_test_data(self, directory_name, package, phases, report_data):
def report_test_data(self, report_dir, package, phases, report_data):
"""Generate and upload the test report(s) for the package."""
for phase in phases:
# Write the report.
report_name = phase.capitalize() + ".xml"
report_file_name = package["name"] + "_" + report_name
phase_report = os.path.join(directory_name, report_file_name)
report_file_name = "_".join([package["name"], package["id"], report_name])
phase_report = os.path.join(report_dir, report_file_name)
with codecs.open(phase_report, "w", "utf-8") as f:
env = spack.tengine.make_environment()
@@ -297,7 +297,7 @@ def report_test_data(self, directory_name, package, phases, report_data):
tty.debug("Preparing to upload {0}".format(phase_report))
self.upload(phase_report)
def test_report_for_package(self, directory_name, package, duration):
def test_report_for_package(self, report_dir, package, duration):
if "stdout" not in package:
# Skip reporting on packages that did not generate any output.
tty.debug("Skipping report for {0}: No generated output".format(package["name"]))
@@ -311,14 +311,14 @@ def test_report_for_package(self, directory_name, package, duration):
self.buildname = self.report_build_name(self.current_package_name)
self.starttime = self.endtime - duration
report_data = self.initialize_report(directory_name)
report_data = self.initialize_report(report_dir)
report_data["hostname"] = socket.gethostname()
phases = ["testing"]
self.extract_standalone_test_data(package, phases, report_data)
self.report_test_data(directory_name, package, phases, report_data)
self.report_test_data(report_dir, package, phases, report_data)
def test_report(self, directory_name, specs):
def test_report(self, report_dir, specs):
"""Generate reports for each package in each spec."""
tty.debug("Processing test report")
for spec in specs:
@@ -326,21 +326,33 @@ def test_report(self, directory_name, specs):
if "time" in spec:
duration = int(spec["time"])
for package in spec["packages"]:
self.test_report_for_package(directory_name, package, duration)
self.test_report_for_package(report_dir, package, duration)
self.finalize_report()
def test_skipped_report(self, directory_name, spec, reason=None):
def test_skipped_report(
self, report_dir: str, spec: spack.spec.Spec, reason: Optional[str] = None
):
"""Explicitly report spec as being skipped (e.g., CI).
Examples are the installation failed or the package is known to have
broken tests.
Args:
report_dir: directory where the report is to be written
spec: spec being tested
reason: optional reason the test is being skipped
"""
output = "Skipped {0} package".format(spec.name)
if reason:
output += "\n{0}".format(reason)
package = {"name": spec.name, "id": spec.dag_hash(), "result": "skipped", "stdout": output}
self.test_report_for_package(directory_name, package, duration=0.0)
self.test_report_for_package(report_dir, package, duration=0.0)
def concretization_report(self, directory_name, msg):
def concretization_report(self, report_dir, msg):
self.buildname = self.base_buildname
report_data = self.initialize_report(directory_name)
report_data = self.initialize_report(report_dir)
report_data["update"] = {}
report_data["update"]["starttime"] = self.endtime
report_data["update"]["endtime"] = self.endtime
@@ -350,7 +362,7 @@ def concretization_report(self, directory_name, msg):
env = spack.tengine.make_environment()
update_template = posixpath.join(self.template_dir, "Update.xml")
t = env.get_template(update_template)
output_filename = os.path.join(directory_name, "Update.xml")
output_filename = os.path.join(report_dir, "Update.xml")
with open(output_filename, "w") as f:
f.write(t.render(report_data))
# We don't have a current package when reporting on concretization
@@ -360,9 +372,9 @@ def concretization_report(self, directory_name, msg):
self.success = False
self.finalize_report()
def initialize_report(self, directory_name):
if not os.path.exists(directory_name):
os.mkdir(directory_name)
def initialize_report(self, report_dir):
if not os.path.exists(report_dir):
os.mkdir(report_dir)
report_data = {}
report_data["buildname"] = self.buildname
report_data["buildstamp"] = self.buildstamp

View File

@@ -9,17 +9,23 @@
import llnl.util.tty as tty
from spack.install_test import TestStatus
# The keys here represent the only recognized (ctest/cdash) status values
completed = {"failed": "Completed", "passed": "Completed", "notrun": "No tests to run"}
completed = {
"failed": "Completed",
"passed": "Completed",
"skipped": "Completed",
"notrun": "No tests to run",
}
log_regexp = re.compile(r"^==> \[([0-9:.\-]*)(?:, [0-9]*)?\] (.*)")
returns_regexp = re.compile(r"\[([0-9 ,]*)\]")
skip_msgs = ["Testing package", "Results for", "Detected the following"]
skip_msgs = ["Testing package", "Results for", "Detected the following", "Warning:"]
skip_regexps = [re.compile(r"{0}".format(msg)) for msg in skip_msgs]
status_values = ["FAILED", "PASSED", "NO-TESTS"]
status_regexps = [re.compile(r"^({0})".format(stat)) for stat in status_values]
status_regexps = [re.compile(r"^({0})".format(str(stat))) for stat in TestStatus]
def add_part_output(part, line):
@@ -36,12 +42,14 @@ def elapsed(current, previous):
return diff.total_seconds()
# TODO (post-34236): Should be removed with the deprecated test methods since we don't
# TODO (post-34236): have an XFAIL mechanism with the new test_part() approach.
def expected_failure(line):
if not line:
return False
match = returns_regexp.search(line)
xfail = "0" not in match.group(0) if match else False
xfail = "0" not in match.group(1) if match else False
return xfail
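A small example of what the two regex groups hold for a typical status line; the fix makes the membership test read the captured status list rather than the whole bracketed match:

import re
returns_regexp = re.compile(r"\[([0-9 ,]*)\]")
m = returns_regexp.search("Expecting return code in [1, 2]")
m.group(0)             # "[1, 2]" -- entire match, brackets included
m.group(1)             # "1, 2"   -- just the captured list of status codes
"0" not in m.group(1)  # True -> this part was expected to fail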
@@ -54,12 +62,12 @@ def new_part():
"name": None,
"loglines": [],
"output": None,
"status": "passed",
"status": None,
}
# TODO (post-34236): Remove this when remove deprecated methods
def part_name(source):
# TODO: Should be passed the package prefix and only remove it
elements = []
for e in source.replace("'", "").split(" "):
elements.append(os.path.basename(e) if os.sep in e else e)
@@ -73,10 +81,14 @@ def process_part_end(part, curr_time, last_time):
stat = part["status"]
if stat in completed:
# TODO (post-34236): remove the expected failure mapping when
# TODO (post-34236): remove deprecated test methods.
if stat == "passed" and expected_failure(part["desc"]):
part["completed"] = "Expected to fail"
elif part["completed"] == "Unknown":
part["completed"] = completed[stat]
elif stat is None or stat == "unknown":
part["status"] = "passed"
part["output"] = "\n".join(part["loglines"])
@@ -96,16 +108,16 @@ def status(line):
match = regex.search(line)
if match:
stat = match.group(0)
stat = "notrun" if stat == "NO-TESTS" else stat
stat = "notrun" if stat == "NO_TESTS" else stat
return stat.lower()
def extract_test_parts(default_name, outputs):
parts = []
part = {}
testdesc = ""
last_time = None
curr_time = None
for line in outputs:
line = line.strip()
if not line:
@@ -115,12 +127,16 @@ def extract_test_parts(default_name, outputs):
if skip(line):
continue
# Skipped tests start with "Skipped" and end with "package"
# The spec was explicitly reported as skipped (e.g., installation
# failed, package known to have failing tests, won't test external
# package).
if line.startswith("Skipped") and line.endswith("package"):
stat = "skipped"
part = new_part()
part["command"] = "Not Applicable"
part["completed"] = line
part["completed"] = completed[stat]
part["elapsed"] = 0.0
part["loglines"].append(line)
part["name"] = default_name
part["status"] = "notrun"
parts.append(part)
@@ -137,40 +153,53 @@ def extract_test_parts(default_name, outputs):
if msg.startswith("Installing"):
continue
# New command means the start of a new test part
if msg.startswith("'") and msg.endswith("'"):
# TODO (post-34236): Remove this check when remove run_test(),
# TODO (post-34236): etc. since no longer supporting expected
# TODO (post-34236): failures.
if msg.startswith("Expecting return code"):
if part:
part["desc"] += f"; {msg}"
continue
# Terminate without further parsing if no more test messages
if "Completed testing" in msg:
# Process last lingering part IF it didn't generate status
process_part_end(part, curr_time, last_time)
return parts
# New test parts start "test: <name>: <desc>".
if msg.startswith("test: "):
# Update the last part processed
process_part_end(part, curr_time, last_time)
part = new_part()
part["command"] = msg
part["name"] = part_name(msg)
desc = msg.split(":")
part["name"] = desc[1].strip()
part["desc"] = ":".join(desc[2:]).strip()
parts.append(part)
# Save off the optional test description if it was
# tty.debug'd *prior to* the command and reset
if testdesc:
part["desc"] = testdesc
testdesc = ""
# There is no guarantee of a 1-to-1 mapping of a test part and
# a (single) command (or executable) since the introduction of
# PR 34236.
#
# Note that tests where the package does not save the output
# (e.g., output=str.split, error=str.split) will not have
# a command printed to the test log.
elif msg.startswith("'") and msg.endswith("'"):
if part:
if part["command"]:
part["command"] += "; " + msg.replace("'", "")
else:
part["command"] = msg.replace("'", "")
else:
part = new_part()
part["command"] = msg.replace("'", "")
else:
# Update the last part processed since a new log message
# means a non-test action
process_part_end(part, curr_time, last_time)
if testdesc:
# We had a test description but no command so treat
# as a new part (e.g., some import tests)
part = new_part()
part["name"] = "_".join(testdesc.split())
part["command"] = "unknown"
part["desc"] = testdesc
parts.append(part)
process_part_end(part, curr_time, curr_time)
# Assuming this is a description for the next test part
testdesc = msg
else:
tty.debug("Did not recognize test output '{0}'".format(line))
@@ -197,12 +226,14 @@ def extract_test_parts(default_name, outputs):
# If no parts, create a skeleton to flag that the tests are not run
if not parts:
part = new_part()
stat = "notrun"
part["command"] = "Not Applicable"
stat = "failed" if outputs[0].startswith("Cannot open log") else "notrun"
part["command"] = "unknown"
part["completed"] = completed[stat]
part["elapsed"] = 0.0
part["name"] = default_name
part["status"] = stat
part["output"] = "\n".join(outputs)
parts.append(part)
return parts

View File

@@ -89,6 +89,11 @@
"additionalProperties": False,
"properties": {"build-job": attributes_schema, "build-job-remove": attributes_schema},
},
{
"type": "object",
"additionalProperties": False,
"properties": {"copy-job": attributes_schema, "copy-job-remove": attributes_schema},
},
{
"type": "object",
"additionalProperties": False,

View File

@@ -861,9 +861,9 @@ class SpackSolverSetup(object):
def __init__(self, tests=False):
self.gen = None # set by setup()
self.declared_versions = {}
self.possible_versions = {}
self.deprecated_versions = {}
self.declared_versions = collections.defaultdict(list)
self.possible_versions = collections.defaultdict(set)
self.deprecated_versions = collections.defaultdict(set)
self.possible_virtuals = None
self.possible_compilers = []
@@ -1669,9 +1669,34 @@ class Body(object):
if concrete_build_deps or dtype != "build":
clauses.append(fn.attr("depends_on", spec.name, dep.name, dtype))
# Ensure Spack will not co-concretize this with another provider
# for the same virtual
for virtual in dep.package.virtuals_provided:
# TODO: We have to look up info from package.py here, but we'd
# TODO: like to avoid this entirely. We should not need to look
# TODO: up potentially wrong info if we have virtual edge info.
try:
try:
pkg = dep.package
except spack.repo.UnknownNamespaceError:
# Try to look up the package of the same name and use its
# providers. This is as good as we can do without edge info.
pkg_class = spack.repo.path.get_pkg_class(dep.name)
spec = spack.spec.Spec(f"{dep.name}@{dep.version}")
pkg = pkg_class(spec)
virtuals = pkg.virtuals_provided
except spack.repo.UnknownPackageError:
# Skip virtual node constraints for renamed/deleted packages,
# so their binaries can still be installed.
# NOTE: with current specs (which lack edge attributes) this
# can allow concretizations with two providers, but it's unlikely.
continue
# Don't concretize with two providers of the same virtual.
# See above for exception for unknown packages.
# TODO: we will eventually record provider information on edges,
# TODO: which avoids the need for the package lookup above.
for virtual in virtuals:
clauses.append(fn.attr("virtual_node", virtual.name))
clauses.append(fn.provider(dep.name, virtual.name))
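
The nested try above is easiest to follow as a standalone sketch: attempt the edge's package lookup, fall back to a by-name class lookup, and skip provider facts entirely when the package no longer exists. Stand-in exception types and a toy repo are used here; the real code uses spack.repo errors and clingo function symbols:

class UnknownNamespaceError(Exception):
    pass

class UnknownPackageError(Exception):
    pass

def lookup_by_namespace(name):
    raise UnknownNamespaceError(name)  # simulate a stale namespace

def lookup_by_name(name):
    known = {"openmpi": ["mpi"]}  # package -> virtuals it provides
    if name not in known:
        raise UnknownPackageError(name)
    return known[name]

clauses = []
try:
    try:
        virtuals = lookup_by_namespace("openmpi")
    except UnknownNamespaceError:
        virtuals = lookup_by_name("openmpi")
except UnknownPackageError:
    virtuals = []  # renamed/deleted package: emit no virtual constraints
for v in virtuals:
    clauses.append('attr("virtual_node", "{0}")'.format(v))
    clauses.append('provider("openmpi", "{0}")'.format(v))
# clauses == ['attr("virtual_node", "mpi")', 'provider("openmpi", "mpi")']
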
@@ -1697,10 +1722,6 @@ class Body(object):
def build_version_dict(self, possible_pkgs):
"""Declare any versions in specs not declared in packages."""
self.declared_versions = collections.defaultdict(list)
self.possible_versions = collections.defaultdict(set)
self.deprecated_versions = collections.defaultdict(set)
packages_yaml = spack.config.get("packages")
packages_yaml = _normalize_packages_yaml(packages_yaml)
for pkg_name in possible_pkgs:
@@ -1734,13 +1755,47 @@ def key_fn(item):
# All the preferred version from packages.yaml, versions in external
# specs will be computed later
version_preferences = packages_yaml.get(pkg_name, {}).get("version", [])
for idx, v in enumerate(version_preferences):
# v can be a string so force it into an actual version for comparisons
ver = vn.Version(v)
version_defs = []
pkg_class = spack.repo.path.get_pkg_class(pkg_name)
for vstr in version_preferences:
v = vn.ver(vstr)
if isinstance(v, vn.GitVersion):
version_defs.append(v)
else:
satisfying_versions = self._check_for_defined_matching_versions(pkg_class, v)
# Amongst all defined versions satisfying this specific
# preference, the highest-numbered version is the
# most-preferred: therefore sort satisfying versions
# from greatest to least
version_defs.extend(sorted(satisfying_versions, reverse=True))
for weight, vdef in enumerate(llnl.util.lang.dedupe(version_defs)):
self.declared_versions[pkg_name].append(
DeclaredVersion(version=ver, idx=idx, origin=Provenance.PACKAGES_YAML)
DeclaredVersion(version=vdef, idx=weight, origin=Provenance.PACKAGES_YAML)
)
self.possible_versions[pkg_name].add(ver)
self.possible_versions[pkg_name].add(vdef)
def _check_for_defined_matching_versions(self, pkg_class, v):
"""Given a version specification (which may be a concrete version,
range, etc.), determine if any package.py version declarations
or externals define a version which satisfies it.
This is primarily for determining whether a version request (e.g.
version preferences, which should not themselves define versions)
refers to a defined version.
This function raises an exception if no satisfying versions are
found.
"""
pkg_name = pkg_class.name
satisfying_versions = list(x for x in pkg_class.versions if x.satisfies(v))
satisfying_versions.extend(x for x in self.possible_versions[pkg_name] if x.satisfies(v))
if not satisfying_versions:
raise spack.config.ConfigError(
"Preference for version {0} does not match any version"
" defined for {1} (in its package.py or any external)".format(str(v), pkg_name)
)
return satisfying_versions
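
A self-contained sketch of what this filter does for a preference like "3.5" (stand-in data; string startswith is a crude stand-in for Version.satisfies):

defined = ["3.4.0", "3.5.0", "3.5.1", "3.6.0"]  # from package.py plus externals

def matching_versions(defined, preference):
    satisfying = [v for v in defined if v.startswith(preference)]
    if not satisfying:
        raise ValueError("Preference {0} matches no defined version".format(preference))
    # the highest-numbered satisfying version is the most preferred
    return sorted(satisfying, reverse=True)

assert matching_versions(defined, "3.5") == ["3.5.1", "3.5.0"]
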
def add_concrete_versions_from_specs(self, specs, origin):
"""Add concrete versions to possible versions from lists of CLI/dev specs."""
@@ -2173,14 +2228,6 @@ def setup(self, driver, specs, reuse=None):
# get possible compilers
self.possible_compilers = self.generate_possible_compilers(specs)
# traverse all specs and packages to build dict of possible versions
self.build_version_dict(possible)
self.add_concrete_versions_from_specs(specs, Provenance.SPEC)
self.add_concrete_versions_from_specs(dev_specs, Provenance.DEV_SPEC)
req_version_specs = _get_versioned_specs_from_pkg_requirements()
self.add_concrete_versions_from_specs(req_version_specs, Provenance.PACKAGE_REQUIREMENT)
self.gen.h1("Concrete input spec definitions")
self.define_concrete_input_specs(specs, possible)
@@ -2208,6 +2255,14 @@ def setup(self, driver, specs, reuse=None):
self.provider_requirements()
self.external_packages()
# traverse all specs and packages to build dict of possible versions
self.build_version_dict(possible)
self.add_concrete_versions_from_specs(specs, Provenance.SPEC)
self.add_concrete_versions_from_specs(dev_specs, Provenance.DEV_SPEC)
req_version_specs = self._get_versioned_specs_from_pkg_requirements()
self.add_concrete_versions_from_specs(req_version_specs, Provenance.PACKAGE_REQUIREMENT)
self.gen.h1("Package Constraints")
for pkg in sorted(self.pkgs):
self.gen.h2("Package rules: %s" % pkg)
@@ -2254,55 +2309,78 @@ def literal_specs(self, specs):
if self.concretize_everything:
self.gen.fact(fn.concretize_everything())
def _get_versioned_specs_from_pkg_requirements(self):
"""If package requirements mention versions that are not mentioned
elsewhere, then we need to collect those to mark them as possible
versions.
"""
req_version_specs = list()
config = spack.config.get("packages")
for pkg_name, d in config.items():
if pkg_name == "all":
continue
if "require" in d:
req_version_specs.extend(self._specs_from_requires(pkg_name, d["require"]))
return req_version_specs
def _get_versioned_specs_from_pkg_requirements():
"""If package requirements mention versions that are not mentioned
elsewhere, then we need to collect those to mark them as possible
versions.
"""
req_version_specs = list()
config = spack.config.get("packages")
for pkg_name, d in config.items():
if pkg_name == "all":
continue
if "require" in d:
req_version_specs.extend(_specs_from_requires(pkg_name, d["require"]))
return req_version_specs
def _specs_from_requires(pkg_name, section):
if isinstance(section, str):
spec = spack.spec.Spec(section)
if not spec.name:
spec.name = pkg_name
extracted_specs = [spec]
else:
spec_strs = []
for spec_group in section:
if isinstance(spec_group, str):
spec_strs.append(spec_group)
else:
# Otherwise it is an object. The object can contain a single
# "spec" constraint, or a list of them with "any_of" or
# "one_of" policy.
if "spec" in spec_group:
new_constraints = [spec_group["spec"]]
else:
key = "one_of" if "one_of" in spec_group else "any_of"
new_constraints = spec_group[key]
spec_strs.extend(new_constraints)
extracted_specs = []
for spec_str in spec_strs:
spec = spack.spec.Spec(spec_str)
def _specs_from_requires(self, pkg_name, section):
"""Collect specs from requirements which define versions (i.e. those that
have a concrete version). Requirements can define *new* versions if
they are included as part of an equivalence (hash=number) but not
otherwise.
"""
if isinstance(section, str):
spec = spack.spec.Spec(section)
if not spec.name:
spec.name = pkg_name
extracted_specs.append(spec)
extracted_specs = [spec]
else:
spec_strs = []
for spec_group in section:
if isinstance(spec_group, str):
spec_strs.append(spec_group)
else:
# Otherwise it is an object. The object can contain a single
# "spec" constraint, or a list of them with "any_of" or
# "one_of" policy.
if "spec" in spec_group:
new_constraints = [spec_group["spec"]]
else:
key = "one_of" if "one_of" in spec_group else "any_of"
new_constraints = spec_group[key]
spec_strs.extend(new_constraints)
version_specs = [x for x in extracted_specs if x.versions.concrete]
for spec in version_specs:
spec.attach_git_version_lookup()
return version_specs
extracted_specs = []
for spec_str in spec_strs:
spec = spack.spec.Spec(spec_str)
if not spec.name:
spec.name = pkg_name
extracted_specs.append(spec)
version_specs = []
for spec in extracted_specs:
if spec.versions.concrete:
# Note: this includes git versions
version_specs.append(spec)
continue
# Prefer spec's name if it exists, in case the spec is
# requiring a specific implementation inside of a virtual section
# e.g. packages:mpi:require:openmpi@4.0.1
pkg_class = spack.repo.path.get_pkg_class(spec.name or pkg_name)
satisfying_versions = self._check_for_defined_matching_versions(
pkg_class, spec.versions
)
# Version ranges ("@1.3" without the "=", "@1.2:1.4") and lists
# will end up here
ordered_satisfying_versions = sorted(satisfying_versions, reverse=True)
vspecs = list(spack.spec.Spec("@{0}".format(x)) for x in ordered_satisfying_versions)
version_specs.extend(vspecs)
for spec in version_specs:
spec.attach_git_version_lookup()
return version_specs
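
The shapes a "require:" section can take are easiest to see in a standalone sketch (sample data only; the real method additionally resolves non-concrete versions via _check_for_defined_matching_versions):

# A requirement section is either one spec string, or a list mixing plain
# strings and objects carrying "spec", "one_of", or "any_of" constraints.
def spec_strings_from_requires(section):
    if isinstance(section, str):
        return [section]
    out = []
    for group in section:
        if isinstance(group, str):
            out.append(group)
        elif "spec" in group:
            out.append(group["spec"])
        else:
            key = "one_of" if "one_of" in group else "any_of"
            out.extend(group[key])
    return out

section = ["@=1.1", {"one_of": ["+mpi", "~mpi"]}, {"spec": "%gcc"}]
assert spec_strings_from_requires(section) == ["@=1.1", "+mpi", "~mpi", "%gcc"]
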
class SpecBuilder(object):
@@ -2463,11 +2541,16 @@ def reorder_flags(self):
# add flags from each source, lowest to highest precedence
for name in sorted_sources:
source = self._specs[name] if name in self._hash_specs else cmd_specs[name]
extend_flag_list(from_sources, source.compiler_flags.get(flag_type, []))
all_src_flags = list()
per_pkg_sources = [self._specs[name]]
if name in cmd_specs:
per_pkg_sources.append(cmd_specs[name])
for source in per_pkg_sources:
all_src_flags.extend(source.compiler_flags.get(flag_type, []))
extend_flag_list(from_sources, all_src_flags)
# compiler flags from compilers config are lowest precedence
ordered_compiler_flags = from_compiler + from_sources
ordered_compiler_flags = list(llnl.util.lang.dedupe(from_compiler + from_sources))
compiler_flags = spec.compiler_flags.get(flag_type, [])
msg = "%s does not equal %s" % (set(compiler_flags), set(ordered_compiler_flags))

View File

@@ -494,6 +494,19 @@ requirement_group_satisfied(Package, X) :-
activate_requirement(Package, X),
requirement_group(Package, X).
% TODO: the following two choice rules allow the solver to add compiler
% flags if their only source is from a requirement. This is overly-specific
% and should use a more-generic approach like in https://github.com/spack/spack/pull/37180
{ attr("node_flag", A1, A2, A3) } :-
requirement_group_member(Y, Package, X),
activate_requirement(Package, X),
imposed_constraint(Y,"node_flag_set", A1, A2, A3).
{ attr("node_flag_source", A1, A2, A3) } :-
requirement_group_member(Y, Package, X),
activate_requirement(Package, X),
imposed_constraint(Y,"node_flag_source", A1, A2, A3).
requirement_weight(Package, Group, W) :-
W = #min {
Z : requirement_has_weight(Y, Z), condition_holds(Y), requirement_group_member(Y, Package, Group);

View File

@@ -110,7 +110,6 @@
"UnsatisfiableDependencySpecError",
"AmbiguousHashError",
"InvalidHashError",
"NoSuchHashError",
"RedundantSpecError",
"SpecDeprecatedError",
]
@@ -145,9 +144,20 @@
#: ``color_formats.keys()``.
_separators = "[\\%s]" % "\\".join(color_formats.keys())
default_format = "{name}{@versions}"
default_format += "{%compiler.name}{@compiler.versions}{compiler_flags}"
default_format += "{variants}{arch=architecture}"
#: Default format for Spec.format(). This format can be round-tripped, so that:
#: Spec(Spec("string").format()) == Spec("string)"
default_format = (
"{name}{@versions}"
"{%compiler.name}{@compiler.versions}{compiler_flags}"
"{variants}{arch=architecture}{/abstract_hash}"
)
#: Display format, which eliminates extra `@=` in the output, for readability.
display_format = (
"{name}{@version}"
"{%compiler.name}{@compiler.version}{compiler_flags}"
"{variants}{arch=architecture}{/abstract_hash}"
)
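
A quick check of the round-trip property documented above, run inside `spack python` (the spec string is arbitrary):

import spack.spec

s = spack.spec.Spec("zlib@=1.2.13")
# default_format keeps "@=", so the formatted string parses back to an
# equal spec; display_format drops the "=" purely for readability.
assert spack.spec.Spec(s.format()) == s
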
#: Regular expression to pull spec contents out of clearsigned signature
#: file.
@@ -669,6 +679,16 @@ def from_dict(d):
d = d["compiler"]
return CompilerSpec(d["name"], vn.VersionList.from_dict(d))
@property
def display_str(self):
"""Equivalent to {compiler.name}{@compiler.version} for Specs, without extra
@= for readability."""
if self.concrete:
return f"{self.name}@{self.version}"
elif self.versions != vn.any_version:
return f"{self.name}@{self.versions}"
return self.name
def __str__(self):
out = self.name
if self.versions and self.versions != vn.any_version:
@@ -1249,6 +1269,7 @@ def copy(self, *args, **kwargs):
class Spec(object):
#: Cache for spec's prefix, computed lazily in the corresponding property
_prefix = None
abstract_hash = None
@staticmethod
def default_arch():
@@ -1556,7 +1577,7 @@ def _set_compiler(self, compiler):
def _add_dependency(self, spec: "Spec", *, deptypes: dp.DependencyArgument):
"""Called by the parser to add another spec as a dependency."""
if spec.name not in self._dependencies:
if spec.name not in self._dependencies or not spec.name:
self.add_dependency_edge(spec, deptypes=deptypes)
return
@@ -1617,6 +1638,10 @@ def fullname(self):
else (self.name if self.name else "")
)
@property
def anonymous(self):
return not self.name and not self.abstract_hash
@property
def root(self):
"""Follow dependent links and find the root of this spec's DAG.
@@ -1632,7 +1657,9 @@ def root(self):
@property
def package(self):
assert self.concrete, "Spec.package can only be called on concrete specs"
assert self.concrete, "{0}: Spec.package can only be called on concrete specs".format(
self.name
)
if not self._package:
self._package = spack.repo.path.get(self)
return self._package
@@ -1713,14 +1740,14 @@ def traverse_edges(self, **kwargs):
def short_spec(self):
"""Returns a version of the spec with the dependencies hashed
instead of completely enumerated."""
spec_format = "{name}{@version}{%compiler}"
spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
spec_format += "{variants}{arch=architecture}{/hash:7}"
return self.format(spec_format)
@property
def cshort_spec(self):
"""Returns an auto-colorized version of ``self.short_spec``."""
spec_format = "{name}{@version}{%compiler}"
spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
spec_format += "{variants}{arch=architecture}{/hash:7}"
return self.cformat(spec_format)
@@ -1823,6 +1850,73 @@ def process_hash_bit_prefix(self, bits):
"""Get the first <bits> bits of the DAG hash as an integer type."""
return spack.util.hash.base32_prefix_bits(self.process_hash(), bits)
def _lookup_hash(self):
"""Lookup just one spec with an abstract hash, returning a spec from the the environment,
store, or finally, binary caches."""
import spack.environment
matches = []
active_env = spack.environment.active_environment()
if active_env:
env_matches = active_env.get_by_hash(self.abstract_hash) or []
matches = [m for m in env_matches if m._satisfies(self)]
if not matches:
db_matches = spack.store.db.get_by_hash(self.abstract_hash) or []
matches = [m for m in db_matches if m._satisfies(self)]
if not matches:
query = spack.binary_distribution.BinaryCacheQuery(True)
remote_matches = query("/" + self.abstract_hash) or []
matches = [m for m in remote_matches if m._satisfies(self)]
if not matches:
raise InvalidHashError(self, self.abstract_hash)
if len(matches) != 1:
raise spack.spec.AmbiguousHashError(
f"Multiple packages specify hash beginning '{self.abstract_hash}'.", *matches
)
return matches[0]
def lookup_hash(self):
"""Given a spec with an abstract hash, return a copy of the spec with all properties and
dependencies by looking up the hash in the environment, store, or finally, binary caches.
This is non-destructive."""
if self.concrete or not any(node.abstract_hash for node in self.traverse()):
return self
spec = self.copy(deps=False)
# root spec is replaced
if spec.abstract_hash:
new = self._lookup_hash()
spec._dup(new)
return spec
# Get dependencies that need to be replaced
for node in self.traverse(root=False):
if node.abstract_hash:
new = node._lookup_hash()
spec._add_dependency(new, deptypes=())
# reattach nodes that were not otherwise satisfied by new dependencies
for node in self.traverse(root=False):
if not any(n._satisfies(node) for n in spec.traverse()):
spec._add_dependency(node.copy(), deptypes=())
return spec
def replace_hash(self):
"""Given a spec with an abstract hash, attempt to populate all properties and dependencies
by looking up the hash in the environment, store, or finally, binary caches.
This is destructive."""
if not any(node for node in self.traverse(order="post") if node.abstract_hash):
return
spec_by_hash = self.lookup_hash()
self._dup(spec_by_hash)
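
Usage looks like the following inside `spack python` (the hash prefix here is hypothetical and must match something in the active environment, the local store, or a configured buildcache):

import spack.spec

s = spack.spec.Spec("zlib/abc1234")      # abstract hash; a bare "/abc1234" also parses
s.replace_hash()                         # destructive: s becomes the concrete match
t = spack.spec.Spec("/abc1234").lookup_hash()  # non-destructive copy instead
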
def to_node_dict(self, hash=ht.dag_hash):
"""Create a dictionary representing the state of this Spec.
@@ -2581,6 +2675,8 @@ def _old_concretize(self, tests=False, deprecation_warning=True):
)
warnings.warn(msg)
self.replace_hash()
if not self.name:
raise spack.error.SpecError("Attempting to concretize anonymous spec")
@@ -2703,11 +2799,11 @@ def inject_patches_variant(root):
# Also record all patches required on dependencies by
# depends_on(..., patch=...)
for dspec in root.traverse_edges(deptype=all, cover="edges", root=False):
pkg_deps = dspec.parent.package_class.dependencies
if dspec.spec.name not in pkg_deps:
if dspec.spec.concrete:
continue
if dspec.spec.concrete:
pkg_deps = dspec.parent.package_class.dependencies
if dspec.spec.name not in pkg_deps:
continue
patches = []
@@ -2779,8 +2875,13 @@ def ensure_no_deprecated(root):
def _new_concretize(self, tests=False):
import spack.solver.asp
if not self.name:
raise spack.error.SpecError("Spec has no name; cannot concretize an anonymous spec")
self.replace_hash()
for node in self.traverse():
if not node.name:
raise spack.error.SpecError(
f"Spec {node} has no name; cannot concretize an anonymous spec"
)
if self._concrete:
return
@@ -3363,6 +3464,11 @@ def constrain(self, other, deps=True):
raise spack.error.UnsatisfiableSpecError(self, other, "constrain a concrete spec")
other = self._autospec(other)
if other.abstract_hash:
if not self.abstract_hash or other.abstract_hash.startswith(self.abstract_hash):
self.abstract_hash = other.abstract_hash
elif not self.abstract_hash.startswith(other.abstract_hash):
raise InvalidHashError(self, other.abstract_hash)
if not (self.name == other.name or (not self.name) or (not other.name)):
raise UnsatisfiableSpecNameError(self.name, other.name)
@@ -3521,6 +3627,12 @@ def intersects(self, other: "Spec", deps: bool = True) -> bool:
"""
other = self._autospec(other)
lhs = self.lookup_hash() or self
rhs = other.lookup_hash() or other
return lhs._intersects(rhs, deps)
def _intersects(self, other: "Spec", deps: bool = True) -> bool:
if other.concrete and self.concrete:
return self.dag_hash() == other.dag_hash()
@@ -3586,9 +3698,18 @@ def intersects(self, other: "Spec", deps: bool = True) -> bool:
else:
return True
def _intersects_dependencies(self, other):
def satisfies(self, other, deps=True):
"""
This checks constraints on common dependencies against each other.
"""
other = self._autospec(other)
lhs = self.lookup_hash() or self
rhs = other.lookup_hash() or other
return lhs._satisfies(rhs, deps=deps)
def _intersects_dependencies(self, other):
if not other._dependencies or not self._dependencies:
# one spec *could* eventually satisfy the other
return True
@@ -3623,7 +3744,7 @@ def _intersects_dependencies(self, other):
return True
def satisfies(self, other: "Spec", deps: bool = True) -> bool:
def _satisfies(self, other: "Spec", deps: bool = True) -> bool:
"""Return True if all concrete specs matching self also match other, otherwise False.
Args:
@@ -3768,6 +3889,7 @@ def _dup(self, other, deps=True, cleardeps=True):
and self.external_path != other.external_path
and self.external_modules != other.external_modules
and self.compiler_flags != other.compiler_flags
and self.abstract_hash != other.abstract_hash
)
self._package = None
@@ -3810,6 +3932,8 @@ def _dup(self, other, deps=True, cleardeps=True):
self._concrete = other._concrete
self.abstract_hash = other.abstract_hash
if self._concrete:
self._dunder_hash = other._dunder_hash
self._normal = other._normal
@@ -3999,6 +4123,7 @@ def _cmp_node(self):
yield self.compiler
yield self.compiler_flags
yield self.architecture
yield self.abstract_hash
# this is not present on older specs
yield getattr(self, "_package_hash", None)
@@ -4009,7 +4134,10 @@ def eq_node(self, other):
def _cmp_iter(self):
"""Lazily yield components of self for comparison."""
for item in self._cmp_node():
cmp_spec = self.lookup_hash() or self
for item in cmp_spec._cmp_node():
yield item
# This needs to be in _cmp_iter so that no specs with different process hashes
@@ -4020,10 +4148,10 @@ def _cmp_iter(self):
# TODO: they exist for speed. We should benchmark whether it's really worth
# TODO: having two types of hashing now that we use `json` instead of `yaml` for
# TODO: spec hashing.
yield self.process_hash() if self.concrete else None
yield cmp_spec.process_hash() if cmp_spec.concrete else None
def deps():
for dep in sorted(itertools.chain.from_iterable(self._dependencies.values())):
for dep in sorted(itertools.chain.from_iterable(cmp_spec._dependencies.values())):
yield dep.spec.name
yield tuple(sorted(dep.deptypes))
yield hash(dep.spec)
@@ -4144,7 +4272,7 @@ def write_attribute(spec, attribute, color):
raise SpecFormatSigilError(sig, "versions", attribute)
elif sig == "%" and attribute not in ("compiler", "compiler.name"):
raise SpecFormatSigilError(sig, "compilers", attribute)
elif sig == "/" and not re.match(r"hash(:\d+)?$", attribute):
elif sig == "/" and not re.match(r"(abstract_)?hash(:\d+)?$", attribute):
raise SpecFormatSigilError(sig, "DAG hashes", attribute)
elif sig == " arch=" and attribute not in ("architecture", "arch"):
raise SpecFormatSigilError(sig, "the architecture", attribute)
@@ -4264,7 +4392,9 @@ def cformat(self, *args, **kwargs):
return self.format(*args, **kwargs)
def __str__(self):
sorted_nodes = [self] + sorted(self.traverse(root=False), key=lambda x: x.name)
sorted_nodes = [self] + sorted(
self.traverse(root=False), key=lambda x: x.name or x.abstract_hash
)
spec_str = " ^".join(d.format() for d in sorted_nodes)
return spec_str.strip()
@@ -5064,14 +5194,9 @@ def __init__(self, msg, *specs):
class InvalidHashError(spack.error.SpecError):
def __init__(self, spec, hash):
super(InvalidHashError, self).__init__(
"The spec specified by %s does not match provided spec %s" % (hash, spec)
)
class NoSuchHashError(spack.error.SpecError):
def __init__(self, hash):
super(NoSuchHashError, self).__init__("No installed spec matches the hash: '%s'" % hash)
msg = f"No spec with hash {hash} could be found to match {spec}."
msg += " Either the hash does not exist, or it does not match other spec constraints."
super(InvalidHashError, self).__init__(msg)
class SpecFilenameError(spack.error.SpecError):

View File

@@ -21,7 +21,7 @@
(["wrong-variant-in-depends-on"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has a GitHub patch URL without full_index=1
(["invalid-github-patch-url"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has a stand-alone 'test' method in build-time callbacks
# This package has a stand-alone 'test*' method in build-time callbacks
(["fail-test-audit"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has no issues
(["mpileaks"], None),

View File

@@ -7,6 +7,8 @@
import pytest
from llnl.util.filesystem import touch
import spack.paths
@@ -125,6 +127,7 @@ def test_build_time_tests_are_executed_from_default_builder():
@pytest.mark.regression("34518")
@pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
def test_monkey_patching_wrapped_pkg():
"""Confirm 'run_tests' is accessible through wrappers."""
s = spack.spec.Spec("old-style-autotools").concretized()
builder = spack.builder.create(s.package)
assert s.package.run_tests is False
@@ -139,12 +142,29 @@ def test_monkey_patching_wrapped_pkg():
@pytest.mark.regression("34440")
@pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
def test_monkey_patching_test_log_file():
"""Confirm 'test_log_file' is accessible through wrappers."""
s = spack.spec.Spec("old-style-autotools").concretized()
builder = spack.builder.create(s.package)
assert s.package.test_log_file is None
assert builder.pkg.test_log_file is None
assert builder.pkg_with_dispatcher.test_log_file is None
s.package.test_log_file = "/some/file"
assert builder.pkg.test_log_file == "/some/file"
assert builder.pkg_with_dispatcher.test_log_file == "/some/file"
s.package.tester.test_log_file = "/some/file"
assert builder.pkg.tester.test_log_file == "/some/file"
assert builder.pkg_with_dispatcher.tester.test_log_file == "/some/file"
# Windows context manager's __exit__ fails with ValueError ("I/O operation
# on closed file").
@pytest.mark.skipif(sys.platform == "win32", reason="Does not run on windows")
def test_install_time_test_callback(tmpdir, config, mock_packages, mock_stage):
"""Confirm able to run stand-alone test as a post-install callback."""
s = spack.spec.Spec("py-test-callback").concretized()
builder = spack.builder.create(s.package)
builder.pkg.run_tests = True
s.package.tester.test_log_file = tmpdir.join("install_test.log")
touch(s.package.tester.test_log_file)
for phase_fn in builder:
phase_fn.execute()
with open(s.package.tester.test_log_file, "r") as f:
results = f.read().replace("\n", " ")
assert "PyTestCallback test" in results

View File

@@ -566,8 +566,7 @@ def test_ci_run_standalone_tests_not_installed_cdash(
ci.run_standalone_tests(**args)
out = capfd.readouterr()[0]
# CDash *and* log file output means log file ignored
assert "xml option is ignored" in out
assert "0 passed of 0" in out
assert "xml option is ignored with CDash" in out
# copy test results (though none)
artifacts_dir = tmp_path / "artifacts"
@@ -595,9 +594,10 @@ def test_ci_skipped_report(tmpdir, mock_packages, config):
reason = "Testing skip"
handler.report_skipped(spec, tmpdir.strpath, reason=reason)
report = fs.join_path(tmpdir, "{0}_Testing.xml".format(pkg))
expected = "Skipped {0} package".format(pkg)
with open(report, "r") as f:
reports = [name for name in tmpdir.listdir() if str(name).endswith("Testing.xml")]
assert len(reports) == 1
expected = f"Skipped {pkg} package"
with open(reports[0], "r") as f:
have = [0, 0]
for line in f:
if expected in line:

View File

@@ -234,7 +234,7 @@ def test_ci_generate_with_env(
assert "rebuild-index" in yaml_contents
rebuild_job = yaml_contents["rebuild-index"]
expected = "spack buildcache update-index --keys --mirror-url {0}".format(mirror_url)
expected = "spack buildcache update-index --keys {0}".format(mirror_url)
assert rebuild_job["script"][0] == expected
assert rebuild_job["custom_attribute"] == "custom!"
@@ -2392,7 +2392,7 @@ def test_gitlab_ci_deprecated(
assert "rebuild-index" in yaml_contents
rebuild_job = yaml_contents["rebuild-index"]
expected = "spack buildcache update-index --keys --mirror-url {0}".format(mirror_url)
expected = "spack buildcache update-index --keys {0}".format(mirror_url)
assert rebuild_job["script"][0] == expected
assert "variables" in yaml_contents

View File

@@ -107,10 +107,10 @@ def test_compiler_find_no_apple_gcc(no_compilers_yaml, working_env, tmpdir):
def test_compiler_remove(mutable_config, mock_packages):
assert spack.spec.CompilerSpec("gcc@=4.5.0") in spack.compilers.all_compiler_specs()
args = spack.util.pattern.Bunch(all=True, compiler_spec="gcc@4.5.0", add_paths=[], scope=None)
spack.cmd.compiler.compiler_remove(args)
compilers = spack.compilers.all_compiler_specs()
assert spack.spec.CompilerSpec("gcc@4.5.0") not in compilers
assert spack.spec.CompilerSpec("gcc@=4.5.0") not in spack.compilers.all_compiler_specs()
@pytest.mark.skipif(
@@ -208,8 +208,8 @@ def test_compiler_find_mixed_suffixes(no_compilers_yaml, working_env, clangdir):
assert "gcc@8.4.0" in output
config = spack.compilers.get_compiler_config("site", False)
clang = next(c["compiler"] for c in config if c["compiler"]["spec"] == "clang@11.0.0")
gcc = next(c["compiler"] for c in config if c["compiler"]["spec"] == "gcc@8.4.0")
clang = next(c["compiler"] for c in config if c["compiler"]["spec"] == "clang@=11.0.0")
gcc = next(c["compiler"] for c in config if c["compiler"]["spec"] == "gcc@=8.4.0")
gfortran_path = str(clangdir.join("gfortran-8"))
@@ -250,7 +250,7 @@ def test_compiler_find_prefer_no_suffix(no_compilers_yaml, working_env, clangdir
assert "gcc@8.4.0" in output
config = spack.compilers.get_compiler_config("site", False)
clang = next(c["compiler"] for c in config if c["compiler"]["spec"] == "clang@11.0.0")
clang = next(c["compiler"] for c in config if c["compiler"]["spec"] == "clang@=11.0.0")
assert clang["paths"]["cc"] == str(clangdir.join("clang"))
assert clang["paths"]["cxx"] == str(clangdir.join("clang++"))
@@ -277,7 +277,7 @@ def test_compiler_find_path_order(no_compilers_yaml, working_env, clangdir):
config = spack.compilers.get_compiler_config("site", False)
gcc = next(c["compiler"] for c in config if c["compiler"]["spec"] == "gcc@8.4.0")
gcc = next(c["compiler"] for c in config if c["compiler"]["spec"] == "gcc@=8.4.0")
assert gcc["paths"] == {
"cc": str(clangdir.join("first_in_path", "gcc-8")),

View File

@@ -7,6 +7,7 @@
import pytest
import spack.environment as ev
from spack import spack_version
from spack.main import SpackCommand
pytestmark = pytest.mark.usefixtures("config", "mutable_mock_repo")
@@ -54,3 +55,6 @@ def test_concretize_root_test_dependencies_are_concretized(unify, mutable_mock_e
add("b")
concretize("--test", "root")
assert e.matching_spec("test-dependency")
data = e._to_lockfile_dict()
assert data["spack"]["version"] == spack_version

View File

@@ -701,6 +701,7 @@ def test_env_with_config(environment_from_manifest):
def test_with_config_bad_include(environment_from_manifest):
"""Confirm missing include paths raise expected exception and error."""
e = environment_from_manifest(
"""
spack:
@@ -709,14 +710,10 @@ def test_with_config_bad_include(environment_from_manifest):
- no/such/file.yaml
"""
)
with pytest.raises(spack.config.ConfigFileError) as exc:
with pytest.raises(spack.config.ConfigFileError, match="2 missing include path"):
with e:
e.concretize()
err = str(exc)
assert "missing include" in err
assert "/no/such/directory" in err
assert os.path.join("no", "such", "file.yaml") in err
assert ev.active_environment() is None
@@ -909,7 +906,7 @@ def test_env_config_precedence(environment_from_manifest):
mpileaks:
version: ["2.2"]
libelf:
version: ["0.8.11"]
version: ["0.8.10"]
"""
)
@@ -2407,7 +2404,11 @@ def test_concretize_user_specs_together():
# Concretize a second time using 'mpich2' as the MPI provider
e.remove("mpich")
e.add("mpich2")
e.concretize()
# Concretizing without invalidating the concrete spec for mpileaks fails
with pytest.raises(spack.error.UnsatisfiableSpecError):
e.concretize()
e.concretize(force=True)
assert all("mpich2" in spec for _, spec in e.concretized_specs())
assert all("mpich" not in spec for _, spec in e.concretized_specs())
@@ -2438,7 +2439,7 @@ def test_duplicate_packages_raise_when_concretizing_together():
e.add("mpich")
with pytest.raises(
spack.error.UnsatisfiableSpecError, match=r"relax the concretizer strictness"
spack.error.UnsatisfiableSpecError, match=r"You could consider setting `concretizer:unify`"
):
e.concretize()
@@ -3298,3 +3299,22 @@ def test_environment_created_in_users_location(mutable_config, tmpdir):
assert dir_name in out
assert env_dir in ev.root(dir_name)
assert os.path.isdir(os.path.join(env_dir, dir_name))
def test_environment_created_from_lockfile_has_view(mock_packages, tmpdir):
"""When an env is created from a lockfile, a view should be generated for it"""
env_a = str(tmpdir.join("a"))
env_b = str(tmpdir.join("b"))
# Create an environment and install a package in it
env("create", "-d", env_a)
with ev.Environment(env_a):
add("libelf")
install("--fake")
# Create another environment from the lockfile of the first environment
env("create", "-d", env_b, os.path.join(env_a, "spack.lock"))
# Make sure the view was created
with ev.Environment(env_b) as e:
assert os.path.isdir(e.view_path_default)

View File

@@ -357,3 +357,18 @@ def test_find_loaded(database, working_env):
output = find("--loaded")
expected = find()
assert output == expected
@pytest.mark.regression("37712")
def test_environment_with_version_range_in_compiler_doesnt_fail(tmp_path):
"""Tests that having an active environment with a root spec containing a compiler constrained
by a version range (i.e. @X.Y rather than the single version @=X.Y) doesn't result in an error
when invoking "spack find".
"""
test_environment = ev.create_in_dir(tmp_path)
test_environment.add("zlib %gcc@12.1.0")
test_environment.write()
with test_environment:
output = find()
assert "zlib%gcc@12.1.0" in output

View File

@@ -1072,11 +1072,18 @@ def test_install_empty_env(
],
)
def test_installation_fail_tests(install_mockery, mock_fetch, name, method):
"""Confirm build-time tests with unknown methods fail."""
output = install("--test=root", "--no-cache", name, fail_on_error=False)
# Check that there is a single test failure reported
assert output.count("TestFailure: 1 test failed") == 1
# Check that the method name appears twice: in the 'not implemented' error and in the message
assert output.count(method) == 2
assert output.count("method not implemented") == 1
assert output.count("TestFailure: 1 tests failed") == 1
# Check that the path to the test log file is also output
assert "See test log for details" in output
def test_install_use_buildcache(

View File

@@ -37,6 +37,15 @@ def mock_spec():
shutil.rmtree(s.package.stage.path)
def test_location_first(install_mockery, mock_fetch, mock_archive, mock_packages):
"""Test with and without the --first option"""
install = SpackCommand("install")
install("libelf@0.8.12")
install("libelf@0.8.13")
# This would normally return an error without --first
assert location("--first", "--install-dir", "libelf")
def test_location_build_dir(mock_spec):
"""Tests spack location --build-dir."""
spec, pkg = mock_spec

View File

@@ -41,7 +41,7 @@ def _module_files(module_type, *specs):
["rm", "doesnotexist"], # Try to remove a non existing module
["find", "mpileaks"], # Try to find a module with multiple matches
["find", "doesnotexist"], # Try to find a module with no matches
["find", "--unkown_args"], # Try to give an unknown argument
["find", "--unknown_args"], # Try to give an unknown argument
]
)
def failure_args(request):

View File

@@ -85,7 +85,15 @@ def mock_pkg_git_repo(git, tmpdir_factory):
@pytest.fixture(scope="module")
def mock_pkg_names():
repo = spack.repo.path.get_repo("builtin.mock")
names = set(name for name in repo.all_package_names() if not name.startswith("pkg-"))
# Be sure to include virtual packages since packages with stand-alone
# tests may inherit additional tests from the virtuals they provide,
# such as packages that implement `mpi`.
names = set(
name
for name in repo.all_package_names(include_virtuals=True)
if not name.startswith("pkg-")
)
return names

View File

@@ -24,12 +24,12 @@
def test_spec():
output = spec("mpileaks")
assert "mpileaks@=2.3" in output
assert "callpath@=1.0" in output
assert "dyninst@=8.2" in output
assert "libdwarf@=20130729" in output
assert "libelf@=0.8.1" in output
assert "mpich@=3.0.4" in output
assert "mpileaks@2.3" in output
assert "callpath@1.0" in output
assert "dyninst@8.2" in output
assert "libdwarf@20130729" in output
assert "libelf@0.8.1" in output
assert "mpich@3.0.4" in output
def test_spec_concretizer_args(mutable_config, mutable_database):
@@ -86,10 +86,9 @@ def test_spec_parse_unquoted_flags_report():
# cflags, we just explain how to fix it for the immediate next arg.
spec("gcc cflags=-Os -pipe -other-arg-that-gets-ignored cflags=-I /usr/include")
# Verify that the generated error message is nicely formatted.
assert str(cm.value) == dedent(
'''\
No installed spec matches the hash: 'usr'
expected_message = dedent(
'''\
Some compiler or linker flags were provided without quoting their arguments,
which now causes spack to try to parse the *next* argument as a spec component
such as a variant instead of an additional compiler or linker flag. If the
@@ -100,6 +99,8 @@ def test_spec_parse_unquoted_flags_report():
(2) cflags=-I /usr/include => cflags="-I /usr/include"'''
)
assert expected_message in str(cm.value)
# Verify that the same unquoted cflags report is generated in the error message even
# if it fails during concretization, not just during parsing.
with pytest.raises(spack.error.SpackError) as cm:
@@ -196,12 +197,12 @@ def test_env_aware_spec(mutable_mock_env_path):
with env:
output = spec()
assert "mpileaks@=2.3" in output
assert "callpath@=1.0" in output
assert "dyninst@=8.2" in output
assert "libdwarf@=20130729" in output
assert "libelf@=0.8.1" in output
assert "mpich@=3.0.4" in output
assert "mpileaks@2.3" in output
assert "callpath@1.0" in output
assert "dyninst@8.2" in output
assert "libdwarf@20130729" in output
assert "libelf@0.8.1" in output
assert "mpich@3.0.4" in output
@pytest.mark.parametrize(

View File

@@ -16,6 +16,7 @@
import spack.package_base
import spack.paths
import spack.store
from spack.install_test import TestStatus
from spack.main import SpackCommand
install = SpackCommand("install")
@@ -59,15 +60,14 @@ def test_test_dup_alias(
"""Ensure re-using an alias fails with suggestion to change."""
install("libdwarf")
# Run the tests with the alias once
out = spack_test("run", "--alias", "libdwarf", "libdwarf")
assert "Spack test libdwarf" in out
# Run the (no) tests with the alias once
spack_test("run", "--alias", "libdwarf", "libdwarf")
# Try again with the alias but don't let it fail on the error
with capfd.disabled():
out = spack_test("run", "--alias", "libdwarf", "libdwarf", fail_on_error=False)
assert "already exists" in out
assert "already exists" in out and "Try another alias" in out
def test_test_output(
@@ -83,51 +83,39 @@ def test_test_output(
# Grab test stage directory contents
testdir = os.path.join(mock_test_stage, stage_files[0])
testdir_files = os.listdir(testdir)
testlogs = [name for name in testdir_files if str(name).endswith("out.txt")]
assert len(testlogs) == 1
# Grab the output from the test log
testlog = list(filter(lambda x: x.endswith("out.txt") and x != "results.txt", testdir_files))
outfile = os.path.join(testdir, testlog[0])
# Grab the output from the test log to confirm expected result
outfile = os.path.join(testdir, testlogs[0])
with open(outfile, "r") as f:
output = f.read()
assert "BEFORE TEST" in output
assert "true: expect command status in [" in output
assert "AFTER TEST" in output
assert "FAILED" not in output
assert "test_print" in output
assert "PASSED" in output
def test_test_output_on_error(
mock_packages, mock_archive, mock_fetch, install_mockery_mutable_config, capfd, mock_test_stage
@pytest.mark.parametrize(
"pkg_name,failure", [("test-error", "exited with status 1"), ("test-fail", "not callable")]
)
def test_test_output_fails(
mock_packages,
mock_archive,
mock_fetch,
install_mockery_mutable_config,
mock_test_stage,
pkg_name,
failure,
):
install("test-error")
# capfd interferes with Spack's capturing
with capfd.disabled():
out = spack_test("run", "test-error", fail_on_error=False)
"""Confirm stand-alone test failure with expected outputs."""
install(pkg_name)
out = spack_test("run", pkg_name, fail_on_error=False)
# Confirm package-specific failure is in the output
assert failure in out
# Confirm standard failure tagging AND test log reference also output
assert "TestFailure" in out
assert "Command exited with status 1" in out
def test_test_output_on_failure(
mock_packages, mock_archive, mock_fetch, install_mockery_mutable_config, capfd, mock_test_stage
):
install("test-fail")
with capfd.disabled():
out = spack_test("run", "test-fail", fail_on_error=False)
assert "Expected 'not in the output' to match output of `true`" in out
assert "TestFailure" in out
def test_show_log_on_error(
mock_packages, mock_archive, mock_fetch, install_mockery_mutable_config, capfd, mock_test_stage
):
"""Make sure spack prints location of test log on failure."""
install("test-error")
with capfd.disabled():
out = spack_test("run", "test-error", fail_on_error=False)
assert "See test log" in out
assert mock_test_stage in out
assert "See test log for details" in out
@pytest.mark.usefixtures(
@@ -136,11 +124,12 @@ def test_show_log_on_error(
@pytest.mark.parametrize(
"pkg_name,msgs",
[
("test-error", ["FAILED: Command exited", "TestFailure"]),
("test-fail", ["FAILED: Expected", "TestFailure"]),
("test-error", ["exited with status 1", "TestFailure"]),
("test-fail", ["not callable", "TestFailure"]),
],
)
def test_junit_output_with_failures(tmpdir, mock_test_stage, pkg_name, msgs):
"""Confirm stand-alone test failure expected outputs in JUnit reporting."""
install(pkg_name)
with tmpdir.as_cwd():
spack_test(
@@ -173,6 +162,7 @@ def test_cdash_output_test_error(
mock_test_stage,
capfd,
):
"""Confirm stand-alone test error expected outputs in CDash reporting."""
install("test-error")
with tmpdir.as_cwd():
spack_test(
@@ -183,12 +173,10 @@ def test_cdash_output_test_error(
fail_on_error=False,
)
report_dir = tmpdir.join("cdash_reports")
print(tmpdir.listdir())
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("test-error_Testing.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "FAILED: Command exited with status 1" in content
reports = [name for name in report_dir.listdir() if str(name).endswith("Testing.xml")]
assert len(reports) == 1
content = reports[0].open().read()
assert "Command exited with status 1" in content
def test_cdash_upload_clean_test(
@@ -203,10 +191,12 @@ def test_cdash_upload_clean_test(
with tmpdir.as_cwd():
spack_test("run", "--log-file=cdash_reports", "--log-format=cdash", "printing-package")
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("printing-package_Testing.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
reports = [name for name in report_dir.listdir() if str(name).endswith("Testing.xml")]
assert len(reports) == 1
content = reports[0].open().read()
assert "passed" in content
assert "Running test_print" in content, "Expected first command output"
assert "second command" in content, "Expected second command output"
assert "</Test>" in content
assert "<Text>" not in content
@@ -226,17 +216,19 @@ def test_test_help_cdash(mock_test_stage):
def test_test_list_all(mock_packages):
"""make sure `spack test list --all` returns all packages with tests"""
"""Confirm `spack test list --all` returns all packages with test methods"""
pkgs = spack_test("list", "--all").strip().split()
assert set(pkgs) == set(
[
"fail-test-audit",
"mpich",
"printing-package",
"py-extension1",
"py-extension2",
"py-test-callback",
"simple-standalone-test",
"test-error",
"test-fail",
"fail-test-audit",
]
)
@@ -248,15 +240,6 @@ def test_test_list(mock_packages, mock_archive, mock_fetch, install_mockery_muta
assert pkg_with_tests in output
@pytest.mark.skipif(sys.platform == "win32", reason="Not supported on Windows (yet)")
def test_has_test_method_fails(capsys):
with pytest.raises(SystemExit):
spack.package_base.has_test_method("printing-package")
captured = capsys.readouterr()[1]
assert "is not a class" in captured
def test_read_old_results(mock_packages, mock_test_stage):
"""Take test data generated before the switch to full hash everywhere
and make sure we can still read it in"""
@@ -276,7 +259,7 @@ def test_read_old_results(mock_packages, mock_test_stage):
# The results command should still print the old test results
results_output = spack_test("results")
assert "PASSED" in results_output
assert str(TestStatus.PASSED) in results_output
def test_test_results_none(mock_packages, mock_test_stage):
@@ -291,15 +274,10 @@ def test_test_results_none(mock_packages, mock_test_stage):
@pytest.mark.parametrize(
"status,expected",
[
("FAILED", "1 failed"),
("NO-TESTS", "1 no-tests"),
("SKIPPED", "1 skipped"),
("PASSED", "1 passed"),
],
"status", [TestStatus.FAILED, TestStatus.NO_TESTS, TestStatus.SKIPPED, TestStatus.PASSED]
)
def test_test_results_status(mock_packages, mock_test_stage, status, expected):
def test_test_results_status(mock_packages, mock_test_stage, status):
"""Confirm 'spack test results' returns expected status."""
name = "trivial"
spec = spack.spec.Spec("trivial-smoke-test").concretized()
suite = spack.install_test.TestSuite([spec], name)
@@ -313,11 +291,11 @@ def test_test_results_status(mock_packages, mock_test_stage, status, expected):
args.insert(1, opt)
results = spack_test(*args)
if opt == "--failed" and status != "FAILED":
assert status not in results
if opt == "--failed" and status != TestStatus.FAILED:
assert str(status) not in results
else:
assert status in results
assert expected in results
assert str(status) in results
assert "1 {0}".format(status.lower()) in results
@pytest.mark.regression("35337")
@@ -341,3 +319,17 @@ def test_report_filename_for_cdash(install_mockery_mutable_config, mock_fetch):
spack.cmd.common.arguments.sanitize_reporter_options(args)
filename = spack.cmd.test.report_filename(args, suite)
assert filename != "https://blahblah/submit.php?project=debugging"
def test_test_output_multiple_specs(
mock_test_stage, mock_packages, mock_archive, mock_fetch, install_mockery_mutable_config
):
"""Ensure proper reporting for suite with skipped, failing, and passed tests."""
install("test-error", "simple-standalone-test@0.9", "simple-standalone-test@1.0")
out = spack_test("run", "test-error", "simple-standalone-test", fail_on_error=False)
# Note that a spec with passing *and* skipped tests is still considered
# to have passed at this level. If you want to see the spec-specific
# part result summaries, you'll have to look at the "test-out.txt" files
# for each spec.
assert "1 failed, 2 passed of 3 specs" in out

View File

@@ -152,7 +152,9 @@ def test_preferred_versions(self):
assert spec.version == Version("2.2")
def test_preferred_versions_mixed_version_types(self):
update_packages("mixedversions", "version", ["2.0"])
if spack.config.get("config:concretizer") == "original":
pytest.skip("This behavior is not enforced for the old concretizer")
update_packages("mixedversions", "version", ["=2.0"])
spec = concretize("mixedversions")
assert spec.version == Version("2.0")
@@ -228,6 +230,29 @@ def test_preferred(self):
spec.concretize()
assert spec.version == Version("3.5.0")
def test_preferred_undefined_raises(self):
"""Preference should not specify an undefined version"""
if spack.config.get("config:concretizer") == "original":
pytest.xfail("This behavior is not enforced for the old concretizer")
update_packages("python", "version", ["3.5.0.1"])
spec = Spec("python")
with pytest.raises(spack.config.ConfigError):
spec.concretize()
def test_preferred_truncated(self):
"""Versions without "=" are treated as version ranges: if there is
a satisfying version defined in the package.py, we should use that
(don't define a new version).
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("This behavior is not enforced for the old concretizer")
update_packages("python", "version", ["3.5"])
spec = Spec("python")
spec.concretize()
assert spec.satisfies("@3.5.1")
def test_develop(self):
"""Test concretization with develop-like versions"""
spec = Spec("develop-test")

View File

@@ -66,6 +66,28 @@ class V(Package):
)
_pkgt = (
"t",
"""\
class T(Package):
version('2.1')
version('2.0')
depends_on('u', when='@2.1:')
""",
)
_pkgu = (
"u",
"""\
class U(Package):
version('1.1')
version('1.0')
""",
)
@pytest.fixture
def create_test_repo(tmpdir, mutable_config):
repo_path = str(tmpdir)
@@ -79,7 +101,7 @@ def create_test_repo(tmpdir, mutable_config):
)
packages_dir = tmpdir.join("packages")
for pkg_name, pkg_str in [_pkgx, _pkgy, _pkgv]:
for pkg_name, pkg_str in [_pkgx, _pkgy, _pkgv, _pkgt, _pkgu]:
pkg_dir = packages_dir.ensure(pkg_name, dir=True)
pkg_file = pkg_dir.join("package.py")
with open(str(pkg_file), "w") as f:
@@ -144,6 +166,45 @@ def test_requirement_isnt_optional(concretize_scope, test_repo):
Spec("x@1.1").concretize()
def test_require_undefined_version(concretize_scope, test_repo):
"""If a requirement specifies a numbered version that isn't in
the associated package.py and isn't part of a Git hash
equivalence (hash=number), then Spack should raise an error
(it is assumed this is a typo, and raising the error here
avoids a likely error when Spack attempts to fetch the version).
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not support configuration requirements")
conf_str = """\
packages:
x:
require: "@1.2"
"""
update_packages_config(conf_str)
with pytest.raises(spack.config.ConfigError):
Spec("x").concretize()
def test_require_truncated(concretize_scope, test_repo):
"""A requirement specifies a version range, with satisfying
versions defined in the package.py. Make sure we choose one
of the defined versions (vs. allowing the requirement to
define a new version).
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not support configuration requirements")
conf_str = """\
packages:
x:
require: "@1"
"""
update_packages_config(conf_str)
xspec = Spec("x").concretized()
assert xspec.satisfies("@1.1")
def test_git_user_supplied_reference_satisfaction(
concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
@@ -220,6 +281,40 @@ def test_requirement_adds_new_version(
assert s1.version.ref == a_commit_hash
def test_requirement_adds_version_satisfies(
concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
"""Make sure that new versions added by requirements are factored into
conditions. In this case create a new version that satisfies a
depends_on condition and make sure it is triggered (i.e. the
dependency is added).
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not support configuration" " requirements")
repo_path, filename, commits = mock_git_version_info
monkeypatch.setattr(
spack.package_base.PackageBase, "git", path_to_file_url(repo_path), raising=False
)
# Sanity check: early version of T does not include U
s0 = Spec("t@2.0").concretized()
assert not ("u" in s0)
conf_str = """\
packages:
t:
require: "@{0}=2.2"
""".format(
commits[0]
)
update_packages_config(conf_str)
s1 = Spec("t").concretized()
assert "u" in s1
assert s1.satisfies("@2.2")
def test_requirement_adds_git_hash_version(
concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
@@ -272,8 +367,11 @@ def test_requirement_adds_multiple_new_versions(
def test_preference_adds_new_version(
concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
"""Normally a preference cannot define a new version, but that constraint
is ignored if the version is a Git hash-based version.
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not support configuration requirements")
pytest.skip("Original concretizer does not enforce this constraint for preferences")
repo_path, filename, commits = mock_git_version_info
monkeypatch.setattr(
@@ -296,6 +394,29 @@ def test_preference_adds_new_version(
assert not s3.satisfies("@2.3")
def test_external_adds_new_version_that_is_preferred(concretize_scope, test_repo):
"""Test that we can use a version, not declared in package recipe, as the
preferred version if that version appears in an external spec.
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not enforce this constraint for preferences")
conf_str = """\
packages:
y:
version: ["2.7"]
externals:
- spec: y@2.7 # Not defined in y
prefix: /fake/nonexistent/path/
buildable: false
"""
update_packages_config(conf_str)
spec = Spec("x").concretized()
assert spec["y"].satisfies("@2.7")
assert spack.version.Version("2.7") not in spec["y"].package.versions
def test_requirement_is_successfully_applied(concretize_scope, test_repo):
"""If a simple requirement can be satisfied, make sure the
concretization succeeds and the requirement spec is applied.
@@ -381,6 +502,22 @@ def test_one_package_multiple_oneof_groups(concretize_scope, test_repo):
assert s2.satisfies("%gcc+shared")
@pytest.mark.regression("34241")
def test_require_cflags(concretize_scope, test_repo):
"""Ensures that flags can be required from configuration."""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not support configuration" " requirements")
conf_str = """\
packages:
y:
require: cflags="-g"
"""
update_packages_config(conf_str)
spec = Spec("y").concretized()
assert spec.satisfies("cflags=-g")
def test_requirements_for_package_that_is_not_needed(concretize_scope, test_repo):
"""Specify requirements for specs that are not concretized or
a dependency of a concretized spec (in other words, none of

View File

@@ -2,6 +2,8 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import re
import pytest
import spack.container.writers as writers
@@ -149,3 +151,14 @@ def test_not_stripping_all_symbols(minimal_configuration):
content = writers.create(minimal_configuration)()
assert "xargs strip" in content
assert "xargs strip -s" not in content
@pytest.mark.regression("22341")
def test_using_single_quotes_in_dockerfiles(minimal_configuration):
"""Tests that Dockerfiles written by Spack use single quotes in manifest, to avoid issues
with shell substitution. This may happen e.g. when users have "definitions:" they want to
expand in dockerfiles.
"""
manifest_in_docker = writers.create(minimal_configuration).manifest
assert not re.search(r"echo\s*\"", manifest_in_docker, flags=re.MULTILINE)
assert re.search(r"echo\s*'", manifest_in_docker)

View File

@@ -4,7 +4,7 @@ lmod:
hash_length: 0
core_compilers:
- 'clang@3.3'
- 'clang@12.0.0'
core_specs:
- 'mpich@3.0.1'

View File

@@ -0,0 +1,5 @@
enable:
- lmod
lmod:
core_compilers:
- 'clang@12.0.0'

View File

@@ -0,0 +1,5 @@
enable:
- lmod
lmod:
core_compilers:
- 'clang@=12.0.0'

View File

@@ -1,6 +1,7 @@
==> Testing package printing-package-1.0-hzgcoow
BEFORE TEST
==> [2022-02-28-20:21:46.510616] test: true: expect command status in [0]
==> [2022-02-28-20:21:46.510937] '/bin/true'
PASSED
AFTER TEST
==> [2022-12-06-20:21:46.550943] test: test_print: Test python print example.
==> [2022-12-06-20:21:46.553219] '/usr/tce/bin/python' '-c' 'print("Running test_print")'
Running test_print
==> [2022-12-06-20:21:46.721077] '/usr/tce/bin/python' '-c' 'print("Running test_print")'
PASSED: test_print
==> [2022-12-06-20:21:46.822608] Completed testing

View File

@@ -31,194 +31,164 @@ class Amdfftw(FftwBase):
Example: spack install amdfftw precision=float
"""
_name = 'amdfftw'
_name = "amdfftw"
homepage = "https://developer.amd.com/amd-aocl/fftw/"
url = "https://github.com/amd/amd-fftw/archive/3.0.tar.gz"
git = "https://github.com/amd/amd-fftw.git"
maintainers = ['amd-toolchain-support']
maintainers("amd-toolchain-support")
version('3.1', sha256='3e777f3acef13fa1910db097e818b1d0d03a6a36ef41186247c6ab1ab0afc132')
version('3.0.1', sha256='87030c6bbb9c710f0a64f4f306ba6aa91dc4b182bb804c9022b35aef274d1a4c')
version('3.0', sha256='a69deaf45478a59a69f77c4f7e9872967f1cfe996592dd12beb6318f18ea0bcd')
version('2.2', sha256='de9d777236fb290c335860b458131678f75aa0799c641490c644c843f0e246f8')
version("3.1", sha256="3e777f3acef13fa1910db097e818b1d0d03a6a36ef41186247c6ab1ab0afc132")
version("3.0.1", sha256="87030c6bbb9c710f0a64f4f306ba6aa91dc4b182bb804c9022b35aef274d1a4c")
version("3.0", sha256="a69deaf45478a59a69f77c4f7e9872967f1cfe996592dd12beb6318f18ea0bcd")
version("2.2", sha256="de9d777236fb290c335860b458131678f75aa0799c641490c644c843f0e246f8")
variant('shared', default=True,
description='Builds a shared version of the library')
variant('openmp', default=True,
description='Enable OpenMP support')
variant('threads', default=False,
description='Enable SMP threads support')
variant('debug', default=False,
description='Builds a debug version of the library')
variant("shared", default=True, description="Builds a shared version of the library")
variant("openmp", default=True, description="Enable OpenMP support")
variant("threads", default=False, description="Enable SMP threads support")
variant("debug", default=False, description="Builds a debug version of the library")
variant(
'amd-fast-planner',
"amd-fast-planner",
default=False,
description='Option to reduce the planning time without much '
'tradeoff in the performance. It is supported for '
'Float and double precisions only.')
description="Option to reduce the planning time without much "
"tradeoff in the performance. It is supported for "
"Float and double precisions only.",
)
variant("amd-top-n-planner", default=False, description="Build with amd-top-n-planner support")
variant(
'amd-top-n-planner',
default=False,
description='Build with amd-top-n-planner support')
variant(
'amd-mpi-vader-limit',
default=False,
description='Build with amd-mpi-vader-limit support')
variant(
'static',
default=False,
description='Build with static support')
variant(
'amd-trans',
default=False,
description='Build with amd-trans support')
variant(
'amd-app-opt',
default=False,
description='Build with amd-app-opt support')
"amd-mpi-vader-limit", default=False, description="Build with amd-mpi-vader-limit support"
)
variant("static", default=False, description="Build with static suppport")
variant("amd-trans", default=False, description="Build with amd-trans suppport")
variant("amd-app-opt", default=False, description="Build with amd-app-opt suppport")
depends_on('texinfo')
depends_on("texinfo")
provides('fftw-api@3', when='@2:')
provides("fftw-api@3", when="@2:")
conflicts(
'precision=quad',
when='@2.2 %aocc',
msg='Quad precision is not supported by AOCC clang version 2.2')
"precision=quad",
when="@2.2 %aocc",
msg="Quad precision is not supported by AOCC clang version 2.2",
)
conflicts(
'+debug',
when='@2.2 %aocc',
msg='debug mode is not supported by AOCC clang version 2.2')
"+debug", when="@2.2 %aocc", msg="debug mode is not supported by AOCC clang version 2.2"
)
conflicts("%gcc@:7.2", when="@2.2:", msg="GCC version above 7.2 is required for AMDFFTW")
conflicts(
'%gcc@:7.2',
when='@2.2:',
msg='GCC version above 7.2 is required for AMDFFTW')
"+amd-fast-planner ", when="+mpi", msg="mpi thread is not supported with amd-fast-planner"
)
conflicts(
'+amd-fast-planner',
when='+mpi',
msg='mpi thread is not supported with amd-fast-planner')
"+amd-fast-planner", when="@2.2", msg="amd-fast-planner is supported from 3.0 onwards"
)
conflicts(
'+amd-fast-planner',
when='@2.2',
msg='amd-fast-planner is supported from 3.0 onwards')
"+amd-fast-planner",
when="precision=quad",
msg="Quad precision is not supported with amd-fast-planner",
)
conflicts(
'+amd-fast-planner',
when='precision=quad',
msg='Quad precision is not supported with amd-fast-planner')
"+amd-fast-planner",
when="precision=long_double",
msg="long_double precision is not supported with amd-fast-planner",
)
conflicts(
'+amd-fast-planner',
when='precision=long_double',
msg='long_double precision is not supported with amd-fast-planner')
"+amd-top-n-planner",
when="@:3.0.0",
msg="amd-top-n-planner is supported from 3.0.1 onwards",
)
conflicts(
'+amd-top-n-planner',
when='@:3.0.0',
msg='amd-top-n-planner is supported from 3.0.1 onwards')
"+amd-top-n-planner",
when="precision=long_double",
msg="long_double precision is not supported with amd-top-n-planner",
)
conflicts(
'+amd-top-n-planner',
when='precision=long_double',
msg='long_double precision is not supported with amd-top-n-planner')
"+amd-top-n-planner",
when="precision=quad",
msg="Quad precision is not supported with amd-top-n-planner",
)
conflicts(
'+amd-top-n-planner',
when='precision=quad',
msg='Quad precision is not supported with amd-top-n-planner')
"+amd-top-n-planner",
when="+amd-fast-planner",
msg="amd-top-n-planner cannot be used with amd-fast-planner",
)
conflicts(
'+amd-top-n-planner',
when='+amd-fast-planner',
msg='amd-top-n-planner cannot be used with amd-fast-planner')
"+amd-top-n-planner", when="+threads", msg="amd-top-n-planner works only for single thread"
)
conflicts(
'+amd-top-n-planner',
when='+threads',
msg='amd-top-n-planner works only for single thread')
"+amd-top-n-planner", when="+mpi", msg="mpi thread is not supported with amd-top-n-planner"
)
conflicts(
'+amd-top-n-planner',
when='+mpi',
msg='mpi thread is not supported with amd-top-n-planner')
"+amd-top-n-planner",
when="+openmp",
msg="openmp thread is not supported with amd-top-n-planner",
)
conflicts(
'+amd-top-n-planner',
when='+openmp',
msg='openmp thread is not supported with amd-top-n-planner')
"+amd-mpi-vader-limit",
when="@:3.0.0",
msg="amd-mpi-vader-limit is supported from 3.0.1 onwards",
)
conflicts(
'+amd-mpi-vader-limit',
when='@:3.0.0',
msg='amd-mpi-vader-limit is supported from 3.0.1 onwards')
"+amd-mpi-vader-limit",
when="precision=quad",
msg="Quad precision is not supported with amd-mpi-vader-limit",
)
conflicts("+amd-trans", when="+threads", msg="amd-trans works only for single thread")
conflicts("+amd-trans", when="+mpi", msg="mpi thread is not supported with amd-trans")
conflicts("+amd-trans", when="+openmp", msg="openmp thread is not supported with amd-trans")
conflicts(
'+amd-mpi-vader-limit',
when='precision=quad',
msg='Quad precision is not supported with amd-mpi-vader-limit')
"+amd-trans",
when="precision=long_double",
msg="long_double precision is not supported with amd-trans",
)
conflicts(
'+amd-trans',
when='+threads',
msg='amd-trans works only for single thread')
"+amd-trans", when="precision=quad", msg="Quad precision is not supported with amd-trans"
)
conflicts("+amd-app-opt", when="@:3.0.1", msg="amd-app-opt is supported from 3.1 onwards")
conflicts("+amd-app-opt", when="+mpi", msg="mpi thread is not supported with amd-app-opt")
conflicts(
'+amd-trans',
when='+mpi',
msg='mpi thread is not supported with amd-trans')
"+amd-app-opt",
when="precision=long_double",
msg="long_double precision is not supported with amd-app-opt",
)
conflicts(
'+amd-trans',
when='+openmp',
msg='openmp thread is not supported with amd-trans')
conflicts(
'+amd-trans',
when='precision=long_double',
msg='long_double precision is not supported with amd-trans')
conflicts(
'+amd-trans',
when='precision=quad',
msg='Quad precision is not supported with amd-trans')
conflicts(
'+amd-app-opt',
when='@:3.0.1',
msg='amd-app-opt is supported from 3.1 onwards')
conflicts(
'+amd-app-opt',
when='+mpi',
msg='mpi thread is not supported with amd-app-opt')
conflicts(
'+amd-app-opt',
when='precision=long_double',
msg='long_double precision is not supported with amd-app-opt')
conflicts(
'+amd-app-opt',
when='precision=quad',
msg='Quad precision is not supported with amd-app-opt')
"+amd-app-opt",
when="precision=quad",
msg="Quad precision is not supported with amd-app-opt",
)
def configure(self, spec, prefix):
"""Configure function"""
# Base options
options = [
'--prefix={0}'.format(prefix),
'--enable-amd-opt'
]
options = ["--prefix={0}".format(prefix), "--enable-amd-opt"]
# Check if compiler is AOCC
if '%aocc' in spec:
options.append('CC={0}'.format(os.path.basename(spack_cc)))
options.append('FC={0}'.format(os.path.basename(spack_fc)))
options.append('F77={0}'.format(os.path.basename(spack_fc)))
if "%aocc" in spec:
options.append("CC={0}".format(os.path.basename(spack_cc)))
options.append("FC={0}".format(os.path.basename(spack_fc)))
options.append("F77={0}".format(os.path.basename(spack_fc)))
if '+debug' in spec:
options.append('--enable-debug')
if "+debug" in spec:
options.append("--enable-debug")
if '+mpi' in spec:
options.append('--enable-mpi')
options.append('--enable-amd-mpifft')
if "+mpi" in spec:
options.append("--enable-mpi")
options.append("--enable-amd-mpifft")
else:
options.append('--disable-mpi')
options.append('--disable-amd-mpifft')
options.append("--disable-mpi")
options.append("--disable-amd-mpifft")
options.extend(self.enable_or_disable('shared'))
options.extend(self.enable_or_disable('openmp'))
options.extend(self.enable_or_disable('threads'))
options.extend(self.enable_or_disable('amd-fast-planner'))
options.extend(self.enable_or_disable('amd-top-n-planner'))
options.extend(self.enable_or_disable('amd-mpi-vader-limit'))
options.extend(self.enable_or_disable('static'))
options.extend(self.enable_or_disable('amd-trans'))
options.extend(self.enable_or_disable('amd-app-opt'))
options.extend(self.enable_or_disable("shared"))
options.extend(self.enable_or_disable("openmp"))
options.extend(self.enable_or_disable("threads"))
options.extend(self.enable_or_disable("amd-fast-planner"))
options.extend(self.enable_or_disable("amd-top-n-planner"))
options.extend(self.enable_or_disable("amd-mpi-vader-limit"))
options.extend(self.enable_or_disable("static"))
options.extend(self.enable_or_disable("amd-trans"))
options.extend(self.enable_or_disable("amd-app-opt"))
if not self.compiler.f77 or not self.compiler.fc:
options.append('--disable-fortran')
options.append("--disable-fortran")
# Cross compilation is supported in amd-fftw by making use of target
# variable to set AMD_ARCH configure option.
@@ -226,17 +196,16 @@ class Amdfftw(FftwBase):
# use target variable to set appropriate -march option in AMD_ARCH.
arch = spec.architecture
options.append(
'AMD_ARCH={0}'.format(
arch.target.optimization_flags(
spec.compiler).split('=')[-1]))
"AMD_ARCH={0}".format(arch.target.optimization_flags(spec.compiler).split("=")[-1])
)
# Specific SIMD support.
# float and double precisions are supported
simd_features = ['sse2', 'avx', 'avx2']
simd_features = ["sse2", "avx", "avx2"]
simd_options = []
for feature in simd_features:
msg = '--enable-{0}' if feature in spec.target else '--disable-{0}'
msg = "--enable-{0}" if feature in spec.target else "--disable-{0}"
simd_options.append(msg.format(feature))
# When enabling configure option "--enable-amd-opt", do not use the
@@ -246,20 +215,19 @@ class Amdfftw(FftwBase):
# Double is the default precision, for all the others we need
# to enable the corresponding option.
enable_precision = {
'float': ['--enable-float'],
'double': None,
'long_double': ['--enable-long-double'],
'quad': ['--enable-quad-precision']
"float": ["--enable-float"],
"double": None,
"long_double": ["--enable-long-double"],
"quad": ["--enable-quad-precision"],
}
# Different precisions must be configured and compiled one at a time
configure = Executable('../configure')
configure = Executable("../configure")
for precision in self.selected_precisions:
opts = (enable_precision[precision] or []) + options[:]
# SIMD optimizations are available only for float and double
if precision in ('float', 'double'):
if precision in ("float", "double"):
opts += simd_options
with working_dir(precision, create=True):
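# Hypothetical illustration of the AMD_ARCH derivation above: Spack's
# optimization_flags() typically returns a flag such as "-march=znver2",
# and splitting on "=" keeps only the target name that amd-fftw expects.
flags = "-march=znver2"  # assumed return value, for illustration only
print("AMD_ARCH={0}".format(flags.split("=")[-1]))  # -> AMD_ARCH=znver2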

View File

@@ -34,7 +34,7 @@ class Legion(CMakePackage):
homepage = "https://legion.stanford.edu/"
git = "https://github.com/StanfordLegion/legion.git"
maintainers = ['pmccormick', 'streichler']
maintainers('pmccormick', 'streichler')
tags = ['e4s']
version('21.03.0', tag='legion-21.03.0')
version('stable', branch='stable')
@@ -355,7 +355,7 @@ class Legion(CMakePackage):
def cache_test_sources(self):
"""Copy the example source files after the package is installed to an
install test subdirectory for use during `spack test run`."""
self.cache_extra_test_sources([join_path('examples', 'local_function_tasks')])
cache_extra_test_sources(self, [join_path('examples', 'local_function_tasks')])
def run_local_function_tasks_test(self):
"""Run stand alone test: local_function_tasks"""

View File

@@ -16,21 +16,21 @@ from spack.package import *
class Llvm(CMakePackage, CudaPackage):
"""The LLVM Project is a collection of modular and reusable compiler and
toolchain technologies. Despite its name, LLVM has little to do
with traditional virtual machines, though it does provide helpful
libraries that can be used to build them. The name "LLVM" itself
is not an acronym; it is the full name of the project.
toolchain technologies. Despite its name, LLVM has little to do
with traditional virtual machines, though it does provide helpful
libraries that can be used to build them. The name "LLVM" itself
is not an acronym; it is the full name of the project.
"""
homepage = "https://llvm.org/"
url = "https://github.com/llvm/llvm-project/archive/llvmorg-7.1.0.tar.gz"
list_url = "https://releases.llvm.org/download.html"
git = "https://github.com/llvm/llvm-project"
maintainers = ['trws', 'haampie']
maintainers("trws", "haampie")
tags = ['e4s']
tags = ["e4s"]
generator = 'Ninja'
generator = "Ninja"
family = "compiler" # Used by lmod
@@ -80,13 +80,12 @@ class Llvm(CMakePackage, CudaPackage):
# to save space, build with `build_type=Release`.
variant(
"clang",
default=True,
description="Build the LLVM C/C++/Objective-C compiler frontend",
"clang", default=True, description="Build the LLVM C/C++/Objective-C compiler frontend"
)
variant(
"flang",
default=False, when='@11: +clang',
default=False,
when="@11: +clang",
description="Build the LLVM Fortran compiler frontend "
"(experimental - parser only, needs GCC)",
)
@@ -95,27 +94,23 @@ class Llvm(CMakePackage, CudaPackage):
default=False,
description="Include debugging code in OpenMP runtime libraries",
)
variant("lldb", default=True, when='+clang', description="Build the LLVM debugger")
variant("lldb", default=True, when="+clang", description="Build the LLVM debugger")
variant("lld", default=True, description="Build the LLVM linker")
variant("mlir", default=False, when='@10:', description="Build with MLIR support")
variant("mlir", default=False, when="@10:", description="Build with MLIR support")
variant(
"internal_unwind",
default=True, when='+clang',
description="Build the libcxxabi libunwind",
"internal_unwind", default=True, when="+clang", description="Build the libcxxabi libunwind"
)
variant(
"polly",
default=True,
description="Build the LLVM polyhedral optimization plugin, "
"only builds for 3.7.0+",
description="Build the LLVM polyhedral optimization plugin, " "only builds for 3.7.0+",
)
variant(
"libcxx",
default=True, when='+clang',
description="Build the LLVM C++ standard library",
"libcxx", default=True, when="+clang", description="Build the LLVM C++ standard library"
)
variant(
"compiler-rt", when='+clang',
"compiler-rt",
when="+clang",
default=True,
description="Build LLVM compiler runtime, including sanitizers",
)
@@ -124,11 +119,7 @@ class Llvm(CMakePackage, CudaPackage):
default=(sys.platform != "darwin"),
description="Add support for LTO with the gold linker plugin",
)
variant(
"split_dwarf",
default=False,
description="Build with split dwarf information",
)
variant("split_dwarf", default=False, description="Build with split dwarf information")
variant(
"llvm_dylib",
default=True,
@@ -136,18 +127,40 @@ class Llvm(CMakePackage, CudaPackage):
)
variant(
"link_llvm_dylib",
default=False, when='+llvm_dylib',
default=False,
when="+llvm_dylib",
description="Link LLVM tools against the LLVM shared library",
)
variant(
"targets",
default="none",
description=("What targets to build. Spack's target family is always added "
"(e.g. X86 is automatically enabled when targeting znver2)."),
values=("all", "none", "aarch64", "amdgpu", "arm", "avr", "bpf", "cppbackend",
"hexagon", "lanai", "mips", "msp430", "nvptx", "powerpc", "riscv",
"sparc", "systemz", "webassembly", "x86", "xcore"),
multi=True
description=(
"What targets to build. Spack's target family is always added "
"(e.g. X86 is automatically enabled when targeting znver2)."
),
values=(
"all",
"none",
"aarch64",
"amdgpu",
"arm",
"avr",
"bpf",
"cppbackend",
"hexagon",
"lanai",
"mips",
"msp430",
"nvptx",
"powerpc",
"riscv",
"sparc",
"systemz",
"webassembly",
"x86",
"xcore",
),
multi=True,
)
variant(
"build_type",
@@ -157,51 +170,52 @@ class Llvm(CMakePackage, CudaPackage):
)
variant(
"omp_tsan",
default=False, when='@6:',
default=False,
when="@6:",
description="Build with OpenMP capable thread sanitizer",
)
variant(
"omp_as_runtime",
default=True,
when='+clang @12:',
when="+clang @12:",
description="Build OpenMP runtime via ENABLE_RUNTIME by just-built Clang",
)
variant('code_signing', default=False,
when='+lldb platform=darwin',
description="Enable code-signing on macOS")
variant("python", default=False, description="Install python bindings")
variant('version_suffix', default='none', description="Add a symbol suffix")
variant(
'shlib_symbol_version',
default='none',
"code_signing",
default=False,
when="+lldb platform=darwin",
description="Enable code-signing on macOS",
)
variant("python", default=False, description="Install python bindings")
variant("version_suffix", default="none", description="Add a symbol suffix")
variant(
"shlib_symbol_version",
default="none",
description="Add shared library symbol version",
when='@13:'
when="@13:",
)
variant(
'z3',
default=False,
when='+clang @8:',
description='Use Z3 for the clang static analyzer'
"z3", default=False, when="+clang @8:", description="Use Z3 for the clang static analyzer"
)
provides('libllvm@14', when='@14.0.0:14')
provides('libllvm@13', when='@13.0.0:13')
provides('libllvm@12', when='@12.0.0:12')
provides('libllvm@11', when='@11.0.0:11')
provides('libllvm@10', when='@10.0.0:10')
provides('libllvm@9', when='@9.0.0:9')
provides('libllvm@8', when='@8.0.0:8')
provides('libllvm@7', when='@7.0.0:7')
provides('libllvm@6', when='@6.0.0:6')
provides('libllvm@5', when='@5.0.0:5')
provides('libllvm@4', when='@4.0.0:4')
provides('libllvm@3', when='@3.0.0:3')
provides("libllvm@14", when="@14.0.0:14")
provides("libllvm@13", when="@13.0.0:13")
provides("libllvm@12", when="@12.0.0:12")
provides("libllvm@11", when="@11.0.0:11")
provides("libllvm@10", when="@10.0.0:10")
provides("libllvm@9", when="@9.0.0:9")
provides("libllvm@8", when="@8.0.0:8")
provides("libllvm@7", when="@7.0.0:7")
provides("libllvm@6", when="@6.0.0:6")
provides("libllvm@5", when="@5.0.0:5")
provides("libllvm@4", when="@4.0.0:4")
provides("libllvm@3", when="@3.0.0:3")
extends("python", when="+python")
# Build dependency
depends_on("cmake@3.4.3:", type="build")
depends_on('cmake@3.13.4:', type='build', when='@12:')
depends_on("cmake@3.13.4:", type="build", when="@12:")
depends_on("ninja", type="build")
depends_on("python@2.7:2.8", when="@:4 ~python", type="build")
depends_on("python", when="@5: ~python", type="build")
@@ -242,7 +256,7 @@ class Llvm(CMakePackage, CudaPackage):
# clang/lib: a lambda parameter cannot shadow an explicitly captured entity
conflicts("%clang@8:", when="@:4")
# Internal compiler error on gcc 8.4 on aarch64 https://bugzilla.redhat.com/show_bug.cgi?id=1958295
conflicts('%gcc@8.4:8.4.9', when='@12: target=aarch64:')
conflicts("%gcc@8.4:8.4.9", when="@12: target=aarch64:")
# When these versions are concretized, but not explicitly with +libcxx, these
# conflicts will enable clingo to set ~libcxx, making the build successful:
@@ -252,17 +266,17 @@ class Llvm(CMakePackage, CudaPackage):
# GCC 11 - latest stable release per GCC release page
# Clang: 11, 12 - latest two stable releases per LLVM release page
# AppleClang 12 - latest stable release per Xcode release page
conflicts("%gcc@:10", when="@13:+libcxx")
conflicts("%clang@:10", when="@13:+libcxx")
conflicts("%gcc@:10", when="@13:+libcxx")
conflicts("%clang@:10", when="@13:+libcxx")
conflicts("%apple-clang@:11", when="@13:+libcxx")
# libcxx-4 and compiler-rt-4 fail to build with "newer" clang and gcc versions:
conflicts('%gcc@7:', when='@:4+libcxx')
conflicts('%clang@6:', when='@:4+libcxx')
conflicts('%apple-clang@6:', when='@:4+libcxx')
conflicts('%gcc@7:', when='@:4+compiler-rt')
conflicts('%clang@6:', when='@:4+compiler-rt')
conflicts('%apple-clang@6:', when='@:4+compiler-rt')
conflicts("%gcc@7:", when="@:4+libcxx")
conflicts("%clang@6:", when="@:4+libcxx")
conflicts("%apple-clang@6:", when="@:4+libcxx")
conflicts("%gcc@7:", when="@:4+compiler-rt")
conflicts("%clang@6:", when="@:4+compiler-rt")
conflicts("%apple-clang@6:", when="@:4+compiler-rt")
# cuda_arch value must be specified
conflicts("cuda_arch=none", when="+cuda", msg="A value for cuda_arch must be specified.")
@@ -270,27 +284,27 @@ class Llvm(CMakePackage, CudaPackage):
# LLVM bug https://bugs.llvm.org/show_bug.cgi?id=48234
# CMake bug: https://gitlab.kitware.com/cmake/cmake/-/issues/21469
# Fixed in upstream versions of both
conflicts('^cmake@3.19.0', when='@6:11.0.0')
conflicts("^cmake@3.19.0", when="@6:11.0.0")
# Github issue #4986
patch("llvm_gcc7.patch", when="@4.0.0:4.0.1+lldb %gcc@7.0:")
# sys/ustat.h has been removed in favour of statfs from glibc-2.28. Use fixed sizes:
patch('llvm5-sanitizer-ustat.patch', when="@4:6.0.0+compiler-rt")
patch("llvm5-sanitizer-ustat.patch", when="@4:6.0.0+compiler-rt")
# Fix lld templates: https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=230463
patch('llvm4-lld-ELF-Symbols.patch', when="@4+lld%clang@6:")
patch('llvm5-lld-ELF-Symbols.patch', when="@5+lld%clang@7:")
patch("llvm4-lld-ELF-Symbols.patch", when="@4+lld%clang@6:")
patch("llvm5-lld-ELF-Symbols.patch", when="@5+lld%clang@7:")
# Fix missing std:size_t in 'llvm@4:5' when built with '%clang@7:'
patch('xray_buffer_queue-cstddef.patch', when="@4:5+compiler-rt%clang@7:")
patch("xray_buffer_queue-cstddef.patch", when="@4:5+compiler-rt%clang@7:")
# https://github.com/llvm/llvm-project/commit/947f9692440836dcb8d88b74b69dd379d85974ce
patch('sanitizer-ipc_perm_mode.patch', when="@5:7+compiler-rt%clang@11:")
patch('sanitizer-ipc_perm_mode.patch', when="@5:9+compiler-rt%gcc@9:")
patch("sanitizer-ipc_perm_mode.patch", when="@5:7+compiler-rt%clang@11:")
patch("sanitizer-ipc_perm_mode.patch", when="@5:9+compiler-rt%gcc@9:")
# github.com/spack/spack/issues/24270: MicrosoftDemangle for %gcc@10: and %clang@13:
patch('missing-includes.patch', when='@8')
patch("missing-includes.patch", when="@8")
# Backport from llvm master + additional fix
# see https://bugs.llvm.org/show_bug.cgi?id=39696
@@ -315,33 +329,33 @@ class Llvm(CMakePackage, CudaPackage):
patch("llvm_python_path.patch", when="@:11")
# Workaround for issue https://github.com/spack/spack/issues/18197
patch('llvm7_intel.patch', when='@7 %intel@18.0.2,19.0.0:19.1.99')
patch("llvm7_intel.patch", when="@7 %intel@18.0.2,19.0.0:19.1.99")
# Remove cyclades support to build against newer kernel headers
# https://reviews.llvm.org/D102059
patch('no_cyclades.patch', when='@10:12.0.0')
patch('no_cyclades9.patch', when='@6:9')
patch("no_cyclades.patch", when="@10:12.0.0")
patch("no_cyclades9.patch", when="@6:9")
patch('llvm-gcc11.patch', when='@9:11%gcc@11:')
patch("llvm-gcc11.patch", when="@9:11%gcc@11:")
# add -lpthread to build OpenMP libraries with Fujitsu compiler
patch('llvm12-thread.patch', when='@12 %fj')
patch('llvm13-thread.patch', when='@13 %fj')
patch("llvm12-thread.patch", when="@12 %fj")
patch("llvm13-thread.patch", when="@13 %fj")
# avoid build failed with Fujitsu compiler
patch('llvm13-fujitsu.patch', when='@13 %fj')
patch("llvm13-fujitsu.patch", when="@13 %fj")
# patch for missing hwloc.h include for libompd
patch('llvm14-hwloc-ompd.patch', when='@14')
patch("llvm14-hwloc-ompd.patch", when="@14")
# make libflags a list in openmp subproject when ~omp_as_runtime
patch('libomp-libflags-as-list.patch', when='@3.7:')
patch("libomp-libflags-as-list.patch", when="@3.7:")
# The functions and attributes below implement external package
# detection for LLVM. See:
#
# https://spack.readthedocs.io/en/latest/packaging_guide.html#making-a-package-discoverable-with-spack-external-find
executables = ['clang', 'flang', 'ld.lld', 'lldb']
executables = ["clang", "flang", "ld.lld", "lldb"]
@classmethod
def filter_detected_exes(cls, prefix, exes_in_prefix):
@@ -351,7 +365,7 @@ class Llvm(CMakePackage, CudaPackage):
# on some port and would hang Spack during detection.
# clang-cl and clang-cpp are dev tools that we don't
# need to test
if any(x in exe for x in ('vscode', 'cpp', '-cl', '-gpu')):
if any(x in exe for x in ("vscode", "cpp", "-cl", "-gpu")):
continue
result.append(exe)
return result
@@ -360,20 +374,20 @@ class Llvm(CMakePackage, CudaPackage):
def determine_version(cls, exe):
version_regex = re.compile(
# Normal clang compiler versions are left as-is
r'clang version ([^ )\n]+)-svn[~.\w\d-]*|'
r"clang version ([^ )\n]+)-svn[~.\w\d-]*|"
# Don't include hyphenated patch numbers in the version
# (see https://github.com/spack/spack/pull/14365 for details)
r'clang version ([^ )\n]+?)-[~.\w\d-]*|'
r'clang version ([^ )\n]+)|'
r"clang version ([^ )\n]+?)-[~.\w\d-]*|"
r"clang version ([^ )\n]+)|"
# LLDB
r'lldb version ([^ )\n]+)|'
r"lldb version ([^ )\n]+)|"
# LLD
r'LLD ([^ )\n]+) \(compatible with GNU linkers\)'
r"LLD ([^ )\n]+) \(compatible with GNU linkers\)"
)
try:
compiler = Executable(exe)
output = compiler('--version', output=str, error=str)
if 'Apple' in output:
output = compiler("--version", output=str, error=str)
if "Apple" in output:
return None
match = version_regex.search(output)
if match:
@@ -387,38 +401,39 @@ class Llvm(CMakePackage, CudaPackage):
@classmethod
def determine_variants(cls, exes, version_str):
variants, compilers = ['+clang'], {}
variants, compilers = ["+clang"], {}
lld_found, lldb_found = False, False
for exe in exes:
if 'clang++' in exe:
compilers['cxx'] = exe
elif 'clang' in exe:
compilers['c'] = exe
elif 'flang' in exe:
variants.append('+flang')
compilers['fc'] = exe
compilers['f77'] = exe
elif 'ld.lld' in exe:
if "clang++" in exe:
compilers["cxx"] = exe
elif "clang" in exe:
compilers["c"] = exe
elif "flang" in exe:
variants.append("+flang")
compilers["fc"] = exe
compilers["f77"] = exe
elif "ld.lld" in exe:
lld_found = True
compilers['ld'] = exe
elif 'lldb' in exe:
compilers["ld"] = exe
elif "lldb" in exe:
lldb_found = True
compilers['lldb'] = exe
compilers["lldb"] = exe
variants.append('+lld' if lld_found else '~lld')
variants.append('+lldb' if lldb_found else '~lldb')
variants.append("+lld" if lld_found else "~lld")
variants.append("+lldb" if lldb_found else "~lldb")
return ''.join(variants), {'compilers': compilers}
return "".join(variants), {"compilers": compilers}
@classmethod
def validate_detected_spec(cls, spec, extra_attributes):
# For LLVM 'compilers' is a mandatory attribute
msg = ('the extra attribute "compilers" must be set for '
'the detected spec "{0}"'.format(spec))
assert 'compilers' in extra_attributes, msg
compilers = extra_attributes['compilers']
for key in ('c', 'cxx'):
msg = '{0} compiler not found for {1}'
msg = 'the extra attribute "compilers" must be set for ' 'the detected spec "{0}"'.format(
spec
)
assert "compilers" in extra_attributes, msg
compilers = extra_attributes["compilers"]
for key in ("c", "cxx"):
msg = "{0} compiler not found for {1}"
assert key in compilers, msg.format(key, spec)
@property
@@ -426,10 +441,10 @@ class Llvm(CMakePackage, CudaPackage):
msg = "cannot retrieve C compiler [spec is not concrete]"
assert self.spec.concrete, msg
if self.spec.external:
return self.spec.extra_attributes['compilers'].get('c', None)
return self.spec.extra_attributes["compilers"].get("c", None)
result = None
if '+clang' in self.spec:
result = os.path.join(self.spec.prefix.bin, 'clang')
if "+clang" in self.spec:
result = os.path.join(self.spec.prefix.bin, "clang")
return result
@property
@@ -437,10 +452,10 @@ class Llvm(CMakePackage, CudaPackage):
msg = "cannot retrieve C++ compiler [spec is not concrete]"
assert self.spec.concrete, msg
if self.spec.external:
return self.spec.extra_attributes['compilers'].get('cxx', None)
return self.spec.extra_attributes["compilers"].get("cxx", None)
result = None
if '+clang' in self.spec:
result = os.path.join(self.spec.prefix.bin, 'clang++')
if "+clang" in self.spec:
result = os.path.join(self.spec.prefix.bin, "clang++")
return result
@property
@@ -448,10 +463,10 @@ class Llvm(CMakePackage, CudaPackage):
msg = "cannot retrieve Fortran compiler [spec is not concrete]"
assert self.spec.concrete, msg
if self.spec.external:
return self.spec.extra_attributes['compilers'].get('fc', None)
return self.spec.extra_attributes["compilers"].get("fc", None)
result = None
if '+flang' in self.spec:
result = os.path.join(self.spec.prefix.bin, 'flang')
if "+flang" in self.spec:
result = os.path.join(self.spec.prefix.bin, "flang")
return result
@property
@@ -459,27 +474,25 @@ class Llvm(CMakePackage, CudaPackage):
msg = "cannot retrieve Fortran 77 compiler [spec is not concrete]"
assert self.spec.concrete, msg
if self.spec.external:
return self.spec.extra_attributes['compilers'].get('f77', None)
return self.spec.extra_attributes["compilers"].get("f77", None)
result = None
if '+flang' in self.spec:
result = os.path.join(self.spec.prefix.bin, 'flang')
if "+flang" in self.spec:
result = os.path.join(self.spec.prefix.bin, "flang")
return result
@property
def libs(self):
return LibraryList(self.llvm_config("--libfiles", "all",
result="list"))
return LibraryList(self.llvm_config("--libfiles", "all", result="list"))
@run_before('cmake')
@run_before("cmake")
def codesign_check(self):
if self.spec.satisfies("+code_signing"):
codesign = which('codesign')
mkdir('tmp')
llvm_check_file = join_path('tmp', 'llvm_check')
copy('/usr/bin/false', llvm_check_file)
codesign = which("codesign")
mkdir("tmp")
llvm_check_file = join_path("tmp", "llvm_check")
copy("/usr/bin/false", llvm_check_file)
try:
codesign('-f', '-s', 'lldb_codesign', '--dryrun',
llvm_check_file)
codesign("-f", "-s", "lldb_codesign", "--dryrun", llvm_check_file)
except ProcessError:
# Newer LLVM versions have a simple script that sets up
@@ -489,32 +502,32 @@ class Llvm(CMakePackage, CudaPackage):
setup()
except Exception:
raise RuntimeError(
'spack was unable to either find or set up '
'code-signing on your system. Please refer to '
'https://lldb.llvm.org/resources/build.html#'
'code-signing-on-macos for details on how to '
'create this identity.'
"spack was unable to either find or set up"
"code-signing on your system. Please refer to"
"https://lldb.llvm.org/resources/build.html#"
"code-signing-on-macos for details on how to"
"create this identity."
)
def flag_handler(self, name, flags):
if name == 'cxxflags':
if name == "cxxflags":
flags.append(self.compiler.cxx11_flag)
return(None, flags, None)
elif name == 'ldflags' and self.spec.satisfies('%intel'):
flags.append('-shared-intel')
return(None, flags, None)
return(flags, None, None)
return (None, flags, None)
elif name == "ldflags" and self.spec.satisfies("%intel"):
flags.append("-shared-intel")
return (None, flags, None)
return (flags, None, None)
def setup_build_environment(self, env):
"""When using %clang, add only its ld.lld-$ver and/or ld.lld to our PATH"""
if self.compiler.name in ['clang', 'apple-clang']:
for lld in 'ld.lld-{0}'.format(self.compiler.version.version[0]), 'ld.lld':
if self.compiler.name in ["clang", "apple-clang"]:
for lld in "ld.lld-{0}".format(self.compiler.version.version[0]), "ld.lld":
bin = os.path.join(os.path.dirname(self.compiler.cc), lld)
sym = os.path.join(self.stage.path, 'ld.lld')
sym = os.path.join(self.stage.path, "ld.lld")
if os.path.exists(bin) and not os.path.exists(sym):
mkdirp(self.stage.path)
os.symlink(bin, sym)
env.prepend_path('PATH', self.stage.path)
env.prepend_path("PATH", self.stage.path)
def setup_run_environment(self, env):
if "+clang" in self.spec:
@@ -531,7 +544,7 @@ class Llvm(CMakePackage, CudaPackage):
define = CMakePackage.define
from_variant = self.define_from_variant
python = spec['python']
python = spec["python"]
cmake_args = [
define("LLVM_REQUIRES_RTTI", True),
define("LLVM_ENABLE_RTTI", True),
@@ -544,14 +557,13 @@ class Llvm(CMakePackage, CudaPackage):
define("LIBOMP_HWLOC_INSTALL_DIR", spec["hwloc"].prefix),
]
version_suffix = spec.variants['version_suffix'].value
if version_suffix != 'none':
cmake_args.append(define('LLVM_VERSION_SUFFIX', version_suffix))
version_suffix = spec.variants["version_suffix"].value
if version_suffix != "none":
cmake_args.append(define("LLVM_VERSION_SUFFIX", version_suffix))
shlib_symbol_version = spec.variants.get('shlib_symbol_version', None)
if shlib_symbol_version is not None and shlib_symbol_version.value != 'none':
cmake_args.append(define('LLVM_SHLIB_SYMBOL_VERSION',
shlib_symbol_version.value))
shlib_symbol_version = spec.variants.get("shlib_symbol_version", None)
if shlib_symbol_version is not None and shlib_symbol_version.value != "none":
cmake_args.append(define("LLVM_SHLIB_SYMBOL_VERSION", shlib_symbol_version.value))
if python.version >= Version("3"):
cmake_args.append(define("Python3_EXECUTABLE", python.command.path))
@@ -562,47 +574,56 @@ class Llvm(CMakePackage, CudaPackage):
runtimes = []
if "+cuda" in spec:
cmake_args.extend([
define("CUDA_TOOLKIT_ROOT_DIR", spec["cuda"].prefix),
define("LIBOMPTARGET_NVPTX_COMPUTE_CAPABILITIES",
",".join(spec.variants["cuda_arch"].value)),
define("CLANG_OPENMP_NVPTX_DEFAULT_ARCH",
"sm_{0}".format(spec.variants["cuda_arch"].value[-1])),
])
cmake_args.extend(
[
define("CUDA_TOOLKIT_ROOT_DIR", spec["cuda"].prefix),
define(
"LIBOMPTARGET_NVPTX_COMPUTE_CAPABILITIES",
",".join(spec.variants["cuda_arch"].value),
),
define(
"CLANG_OPENMP_NVPTX_DEFAULT_ARCH",
"sm_{0}".format(spec.variants["cuda_arch"].value[-1]),
),
]
)
if "+omp_as_runtime" in spec:
cmake_args.extend([
define("LIBOMPTARGET_NVPTX_ENABLE_BCLIB", True),
# work around bad libelf detection in libomptarget
define("LIBOMPTARGET_DEP_LIBELF_INCLUDE_DIR",
spec["libelf"].prefix.include),
])
cmake_args.extend(
[
define("LIBOMPTARGET_NVPTX_ENABLE_BCLIB", True),
# work around bad libelf detection in libomptarget
define(
"LIBOMPTARGET_DEP_LIBELF_INCLUDE_DIR", spec["libelf"].prefix.include
),
]
)
else:
# still build libomptarget but disable cuda
cmake_args.extend([
define("CUDA_TOOLKIT_ROOT_DIR", "IGNORE"),
define("CUDA_SDK_ROOT_DIR", "IGNORE"),
define("CUDA_NVCC_EXECUTABLE", "IGNORE"),
define("LIBOMPTARGET_DEP_CUDA_DRIVER_LIBRARIES", "IGNORE"),
])
cmake_args.extend(
[
define("CUDA_TOOLKIT_ROOT_DIR", "IGNORE"),
define("CUDA_SDK_ROOT_DIR", "IGNORE"),
define("CUDA_NVCC_EXECUTABLE", "IGNORE"),
define("LIBOMPTARGET_DEP_CUDA_DRIVER_LIBRARIES", "IGNORE"),
]
)
cmake_args.append(from_variant("LIBOMPTARGET_ENABLE_DEBUG", "omp_debug"))
if "+lldb" in spec:
projects.append("lldb")
cmake_args.append(define('LLDB_ENABLE_LIBEDIT', True))
cmake_args.append(define('LLDB_ENABLE_NCURSES', True))
cmake_args.append(define('LLDB_ENABLE_LIBXML2', False))
if spec.version >= Version('10'):
cmake_args.append(from_variant("LLDB_ENABLE_PYTHON", 'python'))
cmake_args.append(define("LLDB_ENABLE_LIBEDIT", True))
cmake_args.append(define("LLDB_ENABLE_NCURSES", True))
cmake_args.append(define("LLDB_ENABLE_LIBXML2", False))
if spec.version >= Version("10"):
cmake_args.append(from_variant("LLDB_ENABLE_PYTHON", "python"))
else:
cmake_args.append(define("LLDB_DISABLE_PYTHON", '~python' in spec))
cmake_args.append(define("LLDB_DISABLE_PYTHON", "~python" in spec))
if spec.satisfies("@5.0.0: +python"):
cmake_args.append(define("LLDB_USE_SYSTEM_SIX", True))
if "+gold" in spec:
cmake_args.append(
define("LLVM_BINUTILS_INCDIR", spec["binutils"].prefix.include)
)
cmake_args.append(define("LLVM_BINUTILS_INCDIR", spec["binutils"].prefix.include))
if "+clang" in spec:
projects.append("clang")
@@ -612,10 +633,10 @@ class Llvm(CMakePackage, CudaPackage):
else:
projects.append("openmp")
if '@8' in spec:
cmake_args.append(from_variant('CLANG_ANALYZER_ENABLE_Z3_SOLVER', 'z3'))
elif '@9:' in spec:
cmake_args.append(from_variant('LLVM_ENABLE_Z3_SOLVER', 'z3'))
if "@8" in spec:
cmake_args.append(from_variant("CLANG_ANALYZER_ENABLE_Z3_SOLVER", "z3"))
elif "@9:" in spec:
cmake_args.append(from_variant("LLVM_ENABLE_Z3_SOLVER", "z3"))
if "+flang" in spec:
projects.append("flang")
@@ -634,26 +655,26 @@ class Llvm(CMakePackage, CudaPackage):
projects.append("polly")
cmake_args.append(define("LINK_POLLY_INTO_TOOLS", True))
cmake_args.extend([
define("BUILD_SHARED_LIBS", False),
from_variant("LLVM_BUILD_LLVM_DYLIB", "llvm_dylib"),
from_variant("LLVM_LINK_LLVM_DYLIB", "link_llvm_dylib"),
from_variant("LLVM_USE_SPLIT_DWARF", "split_dwarf"),
# By default on Linux, libc++.so is an ldscript. CMake fails to add
# CMAKE_INSTALL_RPATH to it, so the build fails. Statically link libc++abi.a
# into libc++.so; then linking with -lc++ or -stdlib=libc++ is enough.
define('LIBCXX_ENABLE_STATIC_ABI_LIBRARY', True)
])
cmake_args.extend(
[
define("BUILD_SHARED_LIBS", False),
from_variant("LLVM_BUILD_LLVM_DYLIB", "llvm_dylib"),
from_variant("LLVM_LINK_LLVM_DYLIB", "link_llvm_dylib"),
from_variant("LLVM_USE_SPLIT_DWARF", "split_dwarf"),
# By default on Linux, libc++.so is an ldscript. CMake fails to add
# CMAKE_INSTALL_RPATH to it, so the build fails. Statically link libc++abi.a
# into libc++.so; then linking with -lc++ or -stdlib=libc++ is enough.
define("LIBCXX_ENABLE_STATIC_ABI_LIBRARY", True),
]
)
cmake_args.append(define(
"LLVM_TARGETS_TO_BUILD",
get_llvm_targets_to_build(spec)))
cmake_args.append(define("LLVM_TARGETS_TO_BUILD", get_llvm_targets_to_build(spec)))
cmake_args.append(from_variant("LIBOMP_TSAN_SUPPORT", "omp_tsan"))
if self.compiler.name == "gcc":
compiler = Executable(self.compiler.cc)
gcc_output = compiler('-print-search-dirs', output=str, error=str)
gcc_output = compiler("-print-search-dirs", output=str, error=str)
for line in gcc_output.splitlines():
if line.startswith("install:"):
@@ -665,7 +686,7 @@ class Llvm(CMakePackage, CudaPackage):
cmake_args.append(define("GCC_INSTALL_PREFIX", gcc_prefix))
if self.spec.satisfies("~code_signing platform=darwin"):
cmake_args.append(define('LLDB_USE_SYSTEM_DEBUGSERVER', True))
cmake_args.append(define("LLDB_USE_SYSTEM_DEBUGSERVER", True))
# Semicolon-separated list of projects to enable
cmake_args.append(define("LLVM_ENABLE_PROJECTS", projects))
@@ -689,20 +710,24 @@ class Llvm(CMakePackage, CudaPackage):
# rebuild libomptarget to get bytecode runtime library files
with working_dir(ompdir, create=True):
cmake_args = [
'-G', 'Ninja',
define('CMAKE_BUILD_TYPE', spec.variants['build_type'].value),
"-G",
"Ninja",
define("CMAKE_BUILD_TYPE", spec.variants["build_type"].value),
define("CMAKE_C_COMPILER", spec.prefix.bin + "/clang"),
define("CMAKE_CXX_COMPILER", spec.prefix.bin + "/clang++"),
define("CMAKE_INSTALL_PREFIX", spec.prefix),
define('CMAKE_PREFIX_PATH', prefix_paths)
define("CMAKE_PREFIX_PATH", prefix_paths),
]
cmake_args.extend(self.cmake_args())
cmake_args.extend([
define("LIBOMPTARGET_NVPTX_ENABLE_BCLIB", True),
define("LIBOMPTARGET_DEP_LIBELF_INCLUDE_DIR",
spec["libelf"].prefix.include),
self.stage.source_path + "/openmp",
])
cmake_args.extend(
[
define("LIBOMPTARGET_NVPTX_ENABLE_BCLIB", True),
define(
"LIBOMPTARGET_DEP_LIBELF_INCLUDE_DIR", spec["libelf"].prefix.include
),
self.stage.source_path + "/openmp",
]
)
cmake(*cmake_args)
ninja()
@@ -717,22 +742,22 @@ class Llvm(CMakePackage, CudaPackage):
install_tree("bin", join_path(self.prefix, "libexec", "llvm"))
def llvm_config(self, *args, **kwargs):
lc = Executable(self.prefix.bin.join('llvm-config'))
if not kwargs.get('output'):
kwargs['output'] = str
lc = Executable(self.prefix.bin.join("llvm-config"))
if not kwargs.get("output"):
kwargs["output"] = str
ret = lc(*args, **kwargs)
if kwargs.get('result') == "list":
if kwargs.get("result") == "list":
return ret.split()
else:
return ret
def get_llvm_targets_to_build(spec):
targets = spec.variants['targets'].value
targets = spec.variants["targets"].value
# Build everything?
if 'all' in targets:
return 'all'
if "all" in targets:
return "all"
# Convert targets variant values to CMake LLVM_TARGETS_TO_BUILD array.
spack_to_cmake = {
@@ -753,10 +778,10 @@ def get_llvm_targets_to_build(spec):
"systemz": "SystemZ",
"webassembly": "WebAssembly",
"x86": "X86",
"xcore": "XCore"
"xcore": "XCore",
}
if 'none' in targets:
if "none" in targets:
llvm_targets = set()
else:
llvm_targets = set(spack_to_cmake[target] for target in targets)
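# Standalone sketch (hypothetical values) of the mapping above: variant values
# such as "x86" and "nvptx" become the CMake target names LLVM expects, and the
# resulting set is joined into a semicolon-separated CMake list.
spack_to_cmake = {"x86": "X86", "nvptx": "NVPTX", "amdgpu": "AMDGPU"}
targets = ("x86", "nvptx")
llvm_targets = set(spack_to_cmake[t] for t in targets)
print(";".join(sorted(llvm_targets)))  # -> NVPTX;X86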

View File

@@ -27,8 +27,7 @@ class Mfem(Package, CudaPackage, ROCmPackage):
homepage = 'http://www.mfem.org'
git = 'https://github.com/mfem/mfem.git'
maintainers = ['v-dobrev', 'tzanio', 'acfisher',
'goxberry', 'markcmiller86']
maintainers('v-dobrev', 'tzanio', 'acfisher', 'goxberry', 'markcmiller86')
test_requires_compiler = True
@@ -815,8 +814,7 @@ class Mfem(Package, CudaPackage, ROCmPackage):
def cache_test_sources(self):
"""Copy the example source files after the package is installed to an
install test subdirectory for use during `spack test run`."""
self.cache_extra_test_sources([self.examples_src_dir,
self.examples_data_dir])
cache_extra_test_sources(self, [self.examples_src_dir, self.examples_data_dir])
def test(self):
test_dir = join_path(

View File

@@ -22,127 +22,140 @@ class PyTorch(PythonPackage, CudaPackage):
with strong GPU acceleration."""
homepage = "https://pytorch.org/"
git = "https://github.com/pytorch/pytorch.git"
git = "https://github.com/pytorch/pytorch.git"
maintainers = ['adamjstewart']
maintainers("adamjstewart")
# Exact set of modules is version- and variant-specific; just attempt to import the
# core libraries to ensure that the package was successfully installed.
import_modules = ['torch', 'torch.autograd', 'torch.nn', 'torch.utils']
import_modules = ["torch", "torch.autograd", "torch.nn", "torch.utils"]
version('master', branch='master', submodules=True)
version('1.10.1', tag='v1.10.1', submodules=True)
version('1.10.0', tag='v1.10.0', submodules=True)
version('1.9.1', tag='v1.9.1', submodules=True)
version('1.9.0', tag='v1.9.0', submodules=True)
version('1.8.2', tag='v1.8.2', submodules=True)
version('1.8.1', tag='v1.8.1', submodules=True)
version('1.8.0', tag='v1.8.0', submodules=True)
version('1.7.1', tag='v1.7.1', submodules=True)
version('1.7.0', tag='v1.7.0', submodules=True)
version('1.6.0', tag='v1.6.0', submodules=True)
version('1.5.1', tag='v1.5.1', submodules=True)
version('1.5.0', tag='v1.5.0', submodules=True)
version('1.4.1', tag='v1.4.1', submodules=True)
version('1.4.0', tag='v1.4.0', submodules=True, deprecated=True,
submodules_delete=['third_party/fbgemm'])
version('1.3.1', tag='v1.3.1', submodules=True)
version('1.3.0', tag='v1.3.0', submodules=True)
version('1.2.0', tag='v1.2.0', submodules=True)
version('1.1.0', tag='v1.1.0', submodules=True)
version('1.0.1', tag='v1.0.1', submodules=True)
version('1.0.0', tag='v1.0.0', submodules=True)
version('0.4.1', tag='v0.4.1', submodules=True, deprecated=True,
submodules_delete=['third_party/nervanagpu'])
version('0.4.0', tag='v0.4.0', submodules=True, deprecated=True)
version('0.3.1', tag='v0.3.1', submodules=True, deprecated=True)
version("master", branch="master", submodules=True)
version("1.10.1", tag="v1.10.1", submodules=True)
version("1.10.0", tag="v1.10.0", submodules=True)
version("1.9.1", tag="v1.9.1", submodules=True)
version("1.9.0", tag="v1.9.0", submodules=True)
version("1.8.2", tag="v1.8.2", submodules=True)
version("1.8.1", tag="v1.8.1", submodules=True)
version("1.8.0", tag="v1.8.0", submodules=True)
version("1.7.1", tag="v1.7.1", submodules=True)
version("1.7.0", tag="v1.7.0", submodules=True)
version("1.6.0", tag="v1.6.0", submodules=True)
version("1.5.1", tag="v1.5.1", submodules=True)
version("1.5.0", tag="v1.5.0", submodules=True)
version("1.4.1", tag="v1.4.1", submodules=True)
version(
"1.4.0",
tag="v1.4.0",
submodules=True,
deprecated=True,
submodules_delete=["third_party/fbgemm"],
)
version("1.3.1", tag="v1.3.1", submodules=True)
version("1.3.0", tag="v1.3.0", submodules=True)
version("1.2.0", tag="v1.2.0", submodules=True)
version("1.1.0", tag="v1.1.0", submodules=True)
version("1.0.1", tag="v1.0.1", submodules=True)
version("1.0.0", tag="v1.0.0", submodules=True)
version(
"0.4.1",
tag="v0.4.1",
submodules=True,
deprecated=True,
submodules_delete=["third_party/nervanagpu"],
)
version("0.4.0", tag="v0.4.0", submodules=True, deprecated=True)
version("0.3.1", tag="v0.3.1", submodules=True, deprecated=True)
is_darwin = sys.platform == 'darwin'
is_darwin = sys.platform == "darwin"
# All options are defined in CMakeLists.txt.
# Some are listed in setup.py, but not all.
variant('caffe2', default=True, description='Build Caffe2')
variant('test', default=False, description='Build C++ test binaries')
variant('cuda', default=not is_darwin, description='Use CUDA')
variant('rocm', default=False, description='Use ROCm')
variant('cudnn', default=not is_darwin, description='Use cuDNN')
variant('fbgemm', default=True, description='Use FBGEMM (quantized 8-bit server operators)')
variant('kineto', default=True, description='Use Kineto profiling library')
variant('magma', default=not is_darwin, description='Use MAGMA')
variant('metal', default=is_darwin, description='Use Metal for Caffe2 iOS build')
variant('nccl', default=not is_darwin, description='Use NCCL')
variant('nnpack', default=True, description='Use NNPACK')
variant('numa', default=not is_darwin, description='Use NUMA')
variant('numpy', default=True, description='Use NumPy')
variant('openmp', default=True, description='Use OpenMP for parallel code')
variant('qnnpack', default=True, description='Use QNNPACK (quantized 8-bit operators)')
variant('valgrind', default=not is_darwin, description='Use Valgrind')
variant('xnnpack', default=True, description='Use XNNPACK')
variant('mkldnn', default=True, description='Use MKLDNN')
variant('distributed', default=not is_darwin, description='Use distributed')
variant('mpi', default=not is_darwin, description='Use MPI for Caffe2')
variant('gloo', default=not is_darwin, description='Use Gloo')
variant('tensorpipe', default=not is_darwin, description='Use TensorPipe')
variant('onnx_ml', default=True, description='Enable traditional ONNX ML API')
variant('breakpad', default=True, description='Enable breakpad crash dump library')
variant("caffe2", default=True, description="Build Caffe2")
variant("test", default=False, description="Build C++ test binaries")
variant("cuda", default=not is_darwin, description="Use CUDA")
variant("rocm", default=False, description="Use ROCm")
variant("cudnn", default=not is_darwin, description="Use cuDNN")
variant("fbgemm", default=True, description="Use FBGEMM (quantized 8-bit server operators)")
variant("kineto", default=True, description="Use Kineto profiling library")
variant("magma", default=not is_darwin, description="Use MAGMA")
variant("metal", default=is_darwin, description="Use Metal for Caffe2 iOS build")
variant("nccl", default=not is_darwin, description="Use NCCL")
variant("nnpack", default=True, description="Use NNPACK")
variant("numa", default=not is_darwin, description="Use NUMA")
variant("numpy", default=True, description="Use NumPy")
variant("openmp", default=True, description="Use OpenMP for parallel code")
variant("qnnpack", default=True, description="Use QNNPACK (quantized 8-bit operators)")
variant("valgrind", default=not is_darwin, description="Use Valgrind")
variant("xnnpack", default=True, description="Use XNNPACK")
variant("mkldnn", default=True, description="Use MKLDNN")
variant("distributed", default=not is_darwin, description="Use distributed")
variant("mpi", default=not is_darwin, description="Use MPI for Caffe2")
variant("gloo", default=not is_darwin, description="Use Gloo")
variant("tensorpipe", default=not is_darwin, description="Use TensorPipe")
variant("onnx_ml", default=True, description="Enable traditional ONNX ML API")
variant("breakpad", default=True, description="Enable breakpad crash dump library")
conflicts('+cuda', when='+rocm')
conflicts('+cudnn', when='~cuda')
conflicts('+magma', when='~cuda')
conflicts('+nccl', when='~cuda~rocm')
conflicts('+nccl', when='platform=darwin')
conflicts('+numa', when='platform=darwin', msg='Only available on Linux')
conflicts('+valgrind', when='platform=darwin', msg='Only available on Linux')
conflicts('+mpi', when='~distributed')
conflicts('+gloo', when='~distributed')
conflicts('+tensorpipe', when='~distributed')
conflicts('+kineto', when='@:1.7')
conflicts('+valgrind', when='@:1.7')
conflicts('~caffe2', when='@0.4.0:1.6') # no way to disable caffe2?
conflicts('+caffe2', when='@:0.3.1') # caffe2 did not yet exist?
conflicts('+tensorpipe', when='@:1.5')
conflicts('+xnnpack', when='@:1.4')
conflicts('~onnx_ml', when='@:1.4') # no way to disable ONNX?
conflicts('+rocm', when='@:0.4')
conflicts('+cudnn', when='@:0.4')
conflicts('+fbgemm', when='@:0.4,1.4.0')
conflicts('+qnnpack', when='@:0.4')
conflicts('+mkldnn', when='@:0.4')
conflicts('+breakpad', when='@:1.9') # Option appeared in 1.10.0
conflicts('+breakpad', when='target=ppc64:', msg='Unsupported')
conflicts('+breakpad', when='target=ppc64le:', msg='Unsupported')
conflicts("+cuda", when="+rocm")
conflicts("+cudnn", when="~cuda")
conflicts("+magma", when="~cuda")
conflicts("+nccl", when="~cuda~rocm")
conflicts("+nccl", when="platform=darwin")
conflicts("+numa", when="platform=darwin", msg="Only available on Linux")
conflicts("+valgrind", when="platform=darwin", msg="Only available on Linux")
conflicts("+mpi", when="~distributed")
conflicts("+gloo", when="~distributed")
conflicts("+tensorpipe", when="~distributed")
conflicts("+kineto", when="@:1.7")
conflicts("+valgrind", when="@:1.7")
conflicts("~caffe2", when="@0.4.0:1.6") # no way to disable caffe2?
conflicts("+caffe2", when="@:0.3.1") # caffe2 did not yet exist?
conflicts("+tensorpipe", when="@:1.5")
conflicts("+xnnpack", when="@:1.4")
conflicts("~onnx_ml", when="@:1.4") # no way to disable ONNX?
conflicts("+rocm", when="@:0.4")
conflicts("+cudnn", when="@:0.4")
conflicts("+fbgemm", when="@:0.4,1.4.0")
conflicts("+qnnpack", when="@:0.4")
conflicts("+mkldnn", when="@:0.4")
conflicts("+breakpad", when="@:1.9") # Option appeared in 1.10.0
conflicts("+breakpad", when="target=ppc64:", msg="Unsupported")
conflicts("+breakpad", when="target=ppc64le:", msg="Unsupported")
conflicts('cuda_arch=none', when='+cuda',
msg='Must specify CUDA compute capabilities of your GPU, see '
'https://developer.nvidia.com/cuda-gpus')
conflicts(
"cuda_arch=none",
when="+cuda",
msg="Must specify CUDA compute capabilities of your GPU, see "
"https://developer.nvidia.com/cuda-gpus",
)
# Required dependencies
depends_on('cmake@3.5:', type='build')
depends_on("cmake@3.5:", type="build")
# Use Ninja generator to speed up build times, automatically used if found
depends_on('ninja@1.5:', when='@1.1.0:', type='build')
depends_on("ninja@1.5:", when="@1.1.0:", type="build")
# See python_min_version in setup.py
depends_on('python@3.6.2:', when='@1.7.1:', type=('build', 'link', 'run'))
depends_on('python@3.6.1:', when='@1.6.0:1.7.0', type=('build', 'link', 'run'))
depends_on('python@3.5:', when='@1.5.0:1.5', type=('build', 'link', 'run'))
depends_on('python@2.7:2.8,3.5:', when='@1.4.0:1.4', type=('build', 'link', 'run'))
depends_on('python@2.7:2.8,3.5:3.7', when='@:1.3', type=('build', 'link', 'run'))
depends_on('py-setuptools', type=('build', 'run'))
depends_on('py-future', when='@1.5:', type=('build', 'run'))
depends_on('py-future', when='@1.1: ^python@:2', type=('build', 'run'))
depends_on('py-pyyaml', type=('build', 'run'))
depends_on('py-typing', when='@0.4: ^python@:3.4', type=('build', 'run'))
depends_on('py-typing-extensions', when='@1.7:', type=('build', 'run'))
depends_on('py-pybind11@2.6.2', when='@1.8.0:', type=('build', 'link', 'run'))
depends_on('py-pybind11@2.3.0', when='@1.1.0:1.7', type=('build', 'link', 'run'))
depends_on('py-pybind11@2.2.4', when='@1.0.0:1.0', type=('build', 'link', 'run'))
depends_on('py-pybind11@2.2.2', when='@0.4.0:0.4', type=('build', 'link', 'run'))
depends_on('py-dataclasses', when='@1.7: ^python@3.6.0:3.6', type=('build', 'run'))
depends_on('py-tqdm', type='run')
depends_on('py-protobuf', when='@0.4:', type=('build', 'run'))
depends_on('protobuf', when='@0.4:')
depends_on('blas')
depends_on('lapack')
depends_on('eigen', when='@0.4:')
depends_on("python@3.6.2:", when="@1.7.1:", type=("build", "link", "run"))
depends_on("python@3.6.1:", when="@1.6.0:1.7.0", type=("build", "link", "run"))
depends_on("python@3.5:", when="@1.5.0:1.5", type=("build", "link", "run"))
depends_on("python@2.7:2.8,3.5:", when="@1.4.0:1.4", type=("build", "link", "run"))
depends_on("python@2.7:2.8,3.5:3.7", when="@:1.3", type=("build", "link", "run"))
depends_on("py-setuptools", type=("build", "run"))
depends_on("py-future", when="@1.5:", type=("build", "run"))
depends_on("py-future", when="@1.1: ^python@:2", type=("build", "run"))
depends_on("py-pyyaml", type=("build", "run"))
depends_on("py-typing", when="@0.4: ^python@:3.4", type=("build", "run"))
depends_on("py-typing-extensions", when="@1.7:", type=("build", "run"))
depends_on("py-pybind11@2.6.2", when="@1.8.0:", type=("build", "link", "run"))
depends_on("py-pybind11@2.3.0", when="@1.1.0:1.7", type=("build", "link", "run"))
depends_on("py-pybind11@2.2.4", when="@1.0.0:1.0", type=("build", "link", "run"))
depends_on("py-pybind11@2.2.2", when="@0.4.0:0.4", type=("build", "link", "run"))
depends_on("py-dataclasses", when="@1.7: ^python@3.6.0:3.6", type=("build", "run"))
depends_on("py-tqdm", type="run")
depends_on("py-protobuf", when="@0.4:", type=("build", "run"))
depends_on("protobuf", when="@0.4:")
depends_on("blas")
depends_on("lapack")
depends_on("eigen", when="@0.4:")
# https://github.com/pytorch/pytorch/issues/60329
# depends_on('cpuinfo@2020-12-17', when='@1.8.0:')
# depends_on('cpuinfo@2020-06-11', when='@1.6.0:1.7')
@@ -152,30 +165,30 @@ class PyTorch(PythonPackage, CudaPackage):
# depends_on('sleef@3.4.0_2019-07-30', when='@1.6.0:1.7')
# https://github.com/Maratyszcza/FP16/issues/18
# depends_on('fp16@2020-05-14', when='@1.6.0:')
depends_on('pthreadpool@2021-04-13', when='@1.9.0:')
depends_on('pthreadpool@2020-10-05', when='@1.8.0:1.8')
depends_on('pthreadpool@2020-06-15', when='@1.6.0:1.7')
depends_on('psimd@2020-05-17', when='@1.6.0:')
depends_on('fxdiv@2020-04-17', when='@1.6.0:')
depends_on('benchmark', when='@1.6:+test')
depends_on("pthreadpool@2021-04-13", when="@1.9.0:")
depends_on("pthreadpool@2020-10-05", when="@1.8.0:1.8")
depends_on("pthreadpool@2020-06-15", when="@1.6.0:1.7")
depends_on("psimd@2020-05-17", when="@1.6.0:")
depends_on("fxdiv@2020-04-17", when="@1.6.0:")
depends_on("benchmark", when="@1.6:+test")
# Optional dependencies
depends_on('cuda@7.5:', when='+cuda', type=('build', 'link', 'run'))
depends_on('cuda@9:', when='@1.1:+cuda', type=('build', 'link', 'run'))
depends_on('cuda@9.2:', when='@1.6:+cuda', type=('build', 'link', 'run'))
depends_on('cudnn@6.0:7', when='@:1.0+cudnn')
depends_on('cudnn@7.0:7', when='@1.1.0:1.5+cudnn')
depends_on('cudnn@7.0:', when='@1.6.0:+cudnn')
depends_on('magma', when='+magma')
depends_on('nccl', when='+nccl')
depends_on('numactl', when='+numa')
depends_on('py-numpy', when='+numpy', type=('build', 'run'))
depends_on('llvm-openmp', when='%apple-clang +openmp')
depends_on('valgrind', when='+valgrind')
depends_on("cuda@7.5:", when="+cuda", type=("build", "link", "run"))
depends_on("cuda@9:", when="@1.1:+cuda", type=("build", "link", "run"))
depends_on("cuda@9.2:", when="@1.6:+cuda", type=("build", "link", "run"))
depends_on("cudnn@6.0:7", when="@:1.0+cudnn")
depends_on("cudnn@7.0:7", when="@1.1.0:1.5+cudnn")
depends_on("cudnn@7.0:", when="@1.6.0:+cudnn")
depends_on("magma", when="+magma")
depends_on("nccl", when="+nccl")
depends_on("numactl", when="+numa")
depends_on("py-numpy", when="+numpy", type=("build", "run"))
depends_on("llvm-openmp", when="%apple-clang +openmp")
depends_on("valgrind", when="+valgrind")
# https://github.com/pytorch/pytorch/issues/60332
# depends_on('xnnpack@2021-02-22', when='@1.8.0:+xnnpack')
# depends_on('xnnpack@2020-03-23', when='@1.6.0:1.7+xnnpack')
depends_on('mpi', when='+mpi')
depends_on("mpi", when="+mpi")
# https://github.com/pytorch/pytorch/issues/60270
# depends_on('gloo@2021-05-04', when='@1.9.0:+gloo')
# depends_on('gloo@2020-09-18', when='@1.7.0:1.8+gloo')
@@ -183,31 +196,35 @@ class PyTorch(PythonPackage, CudaPackage):
# https://github.com/pytorch/pytorch/issues/60331
# depends_on('onnx@1.8.0_2020-11-03', when='@1.8.0:+onnx_ml')
# depends_on('onnx@1.7.0_2020-05-31', when='@1.6.0:1.7+onnx_ml')
depends_on('mkl', when='+mkldnn')
depends_on("mkl", when="+mkldnn")
# Test dependencies
depends_on('py-hypothesis', type='test')
depends_on('py-six', type='test')
depends_on('py-psutil', type='test')
depends_on("py-hypothesis", type="test")
depends_on("py-six", type="test")
depends_on("py-psutil", type="test")
# Fix BLAS being overridden by MKL
# https://github.com/pytorch/pytorch/issues/60328
patch('https://patch-diff.githubusercontent.com/raw/pytorch/pytorch/pull/59220.patch',
sha256='e37afffe45cf7594c22050109942370e49983ad772d12ebccf508377dc9dcfc9',
when='@1.2.0:')
patch(
"https://patch-diff.githubusercontent.com/raw/pytorch/pytorch/pull/59220.patch",
sha256="e37afffe45cf7594c22050109942370e49983ad772d12ebccf508377dc9dcfc9",
when="@1.2.0:",
)
# Fixes build on older systems with glibc <2.12
patch('https://patch-diff.githubusercontent.com/raw/pytorch/pytorch/pull/55063.patch',
sha256='e17eaa42f5d7c18bf0d7c37d7b0910127a01ad53fdce3e226a92893356a70395',
when='@1.1.0:1.8.1')
patch(
"https://patch-diff.githubusercontent.com/raw/pytorch/pytorch/pull/55063.patch",
sha256="e17eaa42f5d7c18bf0d7c37d7b0910127a01ad53fdce3e226a92893356a70395",
when="@1.1.0:1.8.1",
)
# Fixes CMake configuration error when XNNPACK is disabled
# https://github.com/pytorch/pytorch/pull/35607
# https://github.com/pytorch/pytorch/pull/37865
patch('xnnpack.patch', when='@1.5.0:1.5')
patch("xnnpack.patch", when="@1.5.0:1.5")
# Fixes build error when ROCm is enabled for pytorch-1.5 release
patch('rocm.patch', when='@1.5.0:1.5+rocm')
patch("rocm.patch", when="@1.5.0:1.5+rocm")
# Fixes fatal error: sleef.h: No such file or directory
# https://github.com/pytorch/pytorch/pull/35359
@@ -216,47 +233,56 @@ class PyTorch(PythonPackage, CudaPackage):
# Fixes compilation with Clang 9.0.0 and Apple Clang 11.0.3
# https://github.com/pytorch/pytorch/pull/37086
patch(
"https://github.com/pytorch/pytorch/commit/e921cd222a8fbeabf5a3e74e83e0d8dfb01aa8b5.patch",
sha256="17561b16cd2db22f10c0fe1fdcb428aecb0ac3964ba022a41343a6bb8cba7049",
when="@1.1:1.5",
)
# Removes duplicate definition of getCusparseErrorString
# https://github.com/pytorch/pytorch/issues/32083
patch("cusparseGetErrorString.patch", when="@0.4.1:1.0^cuda@10.1.243:")
# Fixes 'FindOpenMP.cmake'
# to detect openmp settings used by Fujitsu compiler.
patch("detect_omp_of_fujitsu_compiler.patch", when="%fj")
# Fix compilation of +distributed~tensorpipe
# https://github.com/pytorch/pytorch/issues/68002
patch(
"https://github.com/pytorch/pytorch/commit/c075f0f633fa0136e68f0a455b5b74d7b500865c.patch",
sha256="e69e41b5c171bfb00d1b5d4ee55dd5e4c8975483230274af4ab461acd37e40b8",
when="@1.10.0+distributed~tensorpipe",
)
# Both build and install run cmake/make/make install
# Only run once to speed up build times
phases = ["install"]
@property
def libs(self):
root = join_path(
self.prefix, self.spec["python"].package.site_packages_dir, "torch", "lib"
)
return find_libraries("libtorch", root)
@property
def headers(self):
root = join_path(
self.prefix, self.spec["python"].package.site_packages_dir, "torch", "include"
)
headers = find_all_headers(root)
headers.directories = [root]
return headers
@when("@1.5.0:")
def patch(self):
# https://github.com/pytorch/pytorch/issues/52208
filter_file(
"torch_global_deps PROPERTIES LINKER_LANGUAGE C",
"torch_global_deps PROPERTIES LINKER_LANGUAGE CXX",
"caffe2/CMakeLists.txt",
)
def setup_build_environment(self, env):
"""Set environment variables used to control the build.
@@ -269,7 +295,8 @@ class PyTorch(PythonPackage, CudaPackage):
most flags defined in ``CMakeLists.txt`` can be specified as
environment variables.
"""
def enable_or_disable(variant, keyword="USE", var=None, newer=False):
"""Set environment variable to enable or disable support for a
particular variant.
@@ -284,137 +311,135 @@ class PyTorch(PythonPackage, CudaPackage):
# Version 1.1.0 switched from NO_* to USE_* or BUILD_*
# But some newer variants have always used USE_* or BUILD_*
if self.spec.satisfies("@1.1:") or newer:
if "+" + variant in self.spec:
env.set(keyword + "_" + var, "ON")
else:
env.set(keyword + "_" + var, "OFF")
else:
if "+" + variant in self.spec:
env.unset("NO_" + var)
else:
env.set("NO_" + var, "ON")
# Build in parallel to speed up build times
env.set("MAX_JOBS", make_jobs)
# Spack logs have trouble handling colored output
env.set("COLORIZE_OUTPUT", "OFF")
if self.spec.satisfies("@0.4:"):
enable_or_disable("test", keyword="BUILD")
if self.spec.satisfies("@1.7:"):
enable_or_disable("caffe2", keyword="BUILD")
enable_or_disable("cuda")
if "+cuda" in self.spec:
# cmake/public/cuda.cmake
# cmake/Modules_CUDA_fix/upstream/FindCUDA.cmake
env.unset("CUDA_ROOT")
torch_cuda_arch = ";".join(
"{0:.1f}".format(float(i) / 10.0) for i in self.spec.variants["cuda_arch"].value
)
env.set("TORCH_CUDA_ARCH_LIST", torch_cuda_arch)
enable_or_disable("rocm")
enable_or_disable("cudnn")
if "+cudnn" in self.spec:
# cmake/Modules_CUDA_fix/FindCUDNN.cmake
env.set("CUDNN_INCLUDE_DIR", self.spec["cudnn"].prefix.include)
env.set("CUDNN_LIBRARY", self.spec["cudnn"].libs[0])
enable_or_disable("fbgemm")
if self.spec.satisfies("@1.8:"):
enable_or_disable("kineto")
enable_or_disable("magma")
enable_or_disable("metal")
if self.spec.satisfies("@1.10:"):
enable_or_disable("breakpad")
enable_or_disable("nccl")
if "+nccl" in self.spec:
env.set("NCCL_LIB_DIR", self.spec["nccl"].libs.directories[0])
env.set("NCCL_INCLUDE_DIR", self.spec["nccl"].prefix.include)
# cmake/External/nnpack.cmake
enable_or_disable("nnpack")
enable_or_disable("numa")
if "+numa" in self.spec:
# cmake/Modules/FindNuma.cmake
env.set("NUMA_ROOT_DIR", self.spec["numactl"].prefix)
# cmake/Modules/FindNumPy.cmake
enable_or_disable("numpy")
# cmake/Modules/FindOpenMP.cmake
enable_or_disable("openmp", newer=True)
enable_or_disable("qnnpack")
if self.spec.satisfies("@1.3:"):
enable_or_disable("qnnpack", var="PYTORCH_QNNPACK")
if self.spec.satisfies("@1.8:"):
enable_or_disable("valgrind")
if self.spec.satisfies("@1.5:"):
enable_or_disable("xnnpack")
enable_or_disable("mkldnn")
enable_or_disable("distributed")
enable_or_disable("mpi")
# cmake/Modules/FindGloo.cmake
enable_or_disable("gloo", newer=True)
if self.spec.satisfies("@1.6:"):
enable_or_disable("tensorpipe")
if "+onnx_ml" in self.spec:
env.set("ONNX_ML", "ON")
else:
env.set("ONNX_ML", "OFF")
if not self.spec.satisfies("@master"):
env.set("PYTORCH_BUILD_VERSION", self.version)
env.set("PYTORCH_BUILD_NUMBER", 0)
# BLAS to be used by Caffe2
# Options defined in cmake/Dependencies.cmake and cmake/Modules/FindBLAS.cmake
if self.spec["blas"].name == "atlas":
env.set("BLAS", "ATLAS")
env.set("WITH_BLAS", "atlas")
elif self.spec["blas"].name in ["blis", "amdblis"]:
env.set("BLAS", "BLIS")
env.set("WITH_BLAS", "blis")
elif self.spec["blas"].name == "eigen":
env.set("BLAS", "Eigen")
elif self.spec["lapack"].name in ["libflame", "amdlibflame"]:
env.set("BLAS", "FLAME")
env.set("WITH_BLAS", "FLAME")
elif self.spec["blas"].name in ["intel-mkl", "intel-parallel-studio", "intel-oneapi-mkl"]:
env.set("BLAS", "MKL")
env.set("WITH_BLAS", "mkl")
elif self.spec["blas"].name == "openblas":
env.set("BLAS", "OpenBLAS")
env.set("WITH_BLAS", "open")
elif self.spec["blas"].name == "veclibfort":
env.set("BLAS", "vecLib")
env.set("WITH_BLAS", "veclib")
else:
env.set("BLAS", "Generic")
env.set("WITH_BLAS", "generic")
# Don't use vendored third-party libraries when possible
env.set("BUILD_CUSTOM_PROTOBUF", "OFF")
env.set("USE_SYSTEM_NCCL", "ON")
env.set("USE_SYSTEM_EIGEN_INSTALL", "ON")
if self.spec.satisfies("@0.4:"):
env.set("pybind11_DIR", self.spec["py-pybind11"].prefix)
env.set("pybind11_INCLUDE_DIR", self.spec["py-pybind11"].prefix.include)
if self.spec.satisfies("@1.10:"):
env.set("USE_SYSTEM_PYBIND11", "ON")
# https://github.com/pytorch/pytorch/issues/60334
# if self.spec.satisfies('@1.8:'):
# env.set('USE_SYSTEM_SLEEF', 'ON')
if self.spec.satisfies("@1.6:"):
# env.set('USE_SYSTEM_LIBS', 'ON')
# https://github.com/pytorch/pytorch/issues/60329
# env.set('USE_SYSTEM_CPUINFO', 'ON')
@@ -422,27 +447,26 @@ class PyTorch(PythonPackage, CudaPackage):
# env.set('USE_SYSTEM_GLOO', 'ON')
# https://github.com/Maratyszcza/FP16/issues/18
# env.set('USE_SYSTEM_FP16', 'ON')
env.set("USE_SYSTEM_PTHREADPOOL", "ON")
env.set("USE_SYSTEM_PSIMD", "ON")
env.set("USE_SYSTEM_FXDIV", "ON")
env.set("USE_SYSTEM_BENCHMARK", "ON")
# https://github.com/pytorch/pytorch/issues/60331
# env.set('USE_SYSTEM_ONNX', 'ON')
# https://github.com/pytorch/pytorch/issues/60332
# env.set('USE_SYSTEM_XNNPACK', 'ON')
@run_before("install")
def build_amd(self):
if "+rocm" in self.spec:
python(os.path.join("tools", "amd_build", "build_amd.py"))
@run_after("install")
@on_package_attributes(run_tests=True)
def install_test(self):
with working_dir("test"):
python("run_test.py")
# Tests need to be re-added since `phases` was overridden
run_after("install")(PythonPackage._run_default_install_time_test_callbacks)
run_after("install")(PythonPackage.sanity_check_prefix)

File diff suppressed because it is too large


@@ -30,6 +30,7 @@
import spack.repo
import spack.spec
import spack.store
import spack.version as vn
from spack.schema.database_index import schema
from spack.util.executable import Executable
@@ -1051,3 +1052,16 @@ def test_query_installed_when_package_unknown(database, tmpdir):
assert not s.installed_upstream
with pytest.raises(spack.repo.UnknownNamespaceError):
s.package
def test_error_message_when_using_too_new_db(database, monkeypatch):
"""Sometimes the database format needs to be bumped. When that happens, we have forward
incompatibilities that need to be reported in a clear way to the user, in case we moved
back to an older version of Spack. This test ensures that the error message for a too
new database version stays comprehensible across refactoring of the database code.
"""
monkeypatch.setattr(spack.database, "_db_version", vn.Version("0"))
with pytest.raises(
spack.database.InvalidDatabaseVersionError, match="you need a newer Spack version"
):
spack.database.Database(database.root)._read()


@@ -474,3 +474,29 @@ def test_initialize_from_random_file_as_manifest(tmp_path, filename):
assert not os.path.exists(env_dir / ev.lockfile_name)
assert os.path.exists(env_dir / ev.manifest_name)
assert filecmp.cmp(env_dir / ev.manifest_name, init_file, shallow=False)
def test_error_message_when_using_too_new_lockfile(tmp_path):
"""Sometimes the lockfile format needs to be bumped. When that happens, we have forward
incompatibilities that need to be reported in a clear way to the user, in case we moved
back to an older version of Spack. This test ensures that the error message for a too
new lockfile version stays comprehensible across refactoring of the environment code.
"""
init_file = tmp_path / ev.lockfile_name
env_dir = tmp_path / "env_dir"
init_file.write_text(
"""
{
"_meta": {
"file-type": "spack-lockfile",
"lockfile-version": 100,
"specfile-version": 3
},
"roots": [],
"concrete_specs": {}
}\n
"""
)
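# A lockfile-version of 100 is assumed to exceed any current lockfile format
# version, so reading the environment below must fail as "too new".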
ev.initialize_environment_dir(env_dir, init_file)
with pytest.raises(ev.SpackEnvironmentError, match="You need to use a newer Spack version."):
ev.Environment(env_dir)


@@ -23,6 +23,7 @@
_spack_build_envfile,
_spack_build_logfile,
_spack_configure_argsfile,
spack_times_log,
)
from spack.spec import Spec
@@ -243,7 +244,7 @@ def test_install_times(install_mockery, mock_fetch, mutable_mock_repo):
spec.package.do_install()
# Ensure dependency directory exists after the installation.
install_times = os.path.join(spec.package.prefix, ".spack", "install_times.json")
install_times = os.path.join(spec.package.prefix, ".spack", spack_times_log)
assert os.path.isfile(install_times)
# Ensure the phases are included
@@ -252,7 +253,7 @@ def test_install_times(install_mockery, mock_fetch, mutable_mock_repo):
# The order should be maintained
phases = [x["name"] for x in times["phases"]]
assert phases == ["stage", "one", "two", "three", "install", "post-install"]
assert all(isinstance(x["seconds"], float) for x in times["phases"])


@@ -1384,3 +1384,39 @@ def test_single_external_implicit_install(install_mockery, explicit_args, is_exp
s.external_path = "/usr"
create_installer([(s, explicit_args)]).install()
assert spack.store.db.get_record(pkg).explicit == is_explicit
@pytest.mark.parametrize("run_tests", [True, False])
def test_print_install_test_log_skipped(install_mockery, mock_packages, capfd, run_tests):
"""Confirm printing of install log skipped if not run/no failures."""
name = "trivial-install-test-package"
s = spack.spec.Spec(name).concretized()
pkg = s.package
pkg.run_tests = run_tests
spack.installer.print_install_test_log(pkg)
out = capfd.readouterr()[0]
assert out == ""
def test_print_install_test_log_failures(
tmpdir, install_mockery, mock_packages, ensure_debug, capfd
):
"""Confirm expected outputs when there are test failures."""
name = "trivial-install-test-package"
s = spack.spec.Spec(name).concretized()
pkg = s.package
# Missing test log is an error
pkg.run_tests = True
pkg.tester.test_log_file = str(tmpdir.join("test-log.txt"))
pkg.tester.add_failure(AssertionError("test"), "test-failure")
spack.installer.print_install_test_log(pkg)
err = capfd.readouterr()[1]
assert "no test log file" in err
# Having test log results in path being output
fs.touch(pkg.tester.test_log_file)
spack.installer.print_install_test_log(pkg)
out = capfd.readouterr()[0]
assert "See test results at" in out


@@ -8,6 +8,8 @@
import pytest
import spack.cmd.modules
import spack.config
import spack.error
import spack.modules.tcl
import spack.package_base
@@ -187,3 +189,31 @@ def find_nothing(*args):
assert module_path
spack.package_base.PackageBase.uninstall_by_spec(spec)
@pytest.mark.regression("37649")
def test_check_module_set_name(mutable_config):
"""Tests that modules set name are validated correctly and an error is reported if the
name we require does not exist or is reserved by the configuration."""
# Minimal modules.yaml config.
spack.config.set(
"modules",
{
"prefix_inspections": {"./bin": ["PATH"]},
# module sets
"first": {},
"second": {},
},
)
# Valid module set name
spack.cmd.modules.check_module_set_name("first")
# Invalid module set names
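# ("prefix_inspections" is reserved because it is a top-level modules.yaml
# key rather than a module set; "third" was never defined above.)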
msg = "Valid module set names are"
with pytest.raises(spack.config.ConfigError, match=msg):
spack.cmd.modules.check_module_set_name("prefix_inspections")
with pytest.raises(spack.config.ConfigError, match=msg):
spack.cmd.modules.check_module_set_name("third")


@@ -45,6 +45,18 @@ def provider(request):
@pytest.mark.usefixtures("config", "mock_packages")
class TestLmod(object):
@pytest.mark.regression("37788")
@pytest.mark.parametrize("modules_config", ["core_compilers", "core_compilers_at_equal"])
def test_layout_for_specs_compiled_with_core_compilers(
self, modules_config, module_configuration, factory
):
"""Tests that specs compiled with core compilers are in the 'Core' folder. Also tests that
we can use both ``compiler@version`` and ``compiler@=version`` to specify a core compiler.
"""
module_configuration(modules_config)
module, spec = factory("libelf%clang@12.0.0")
assert "Core" in module.layout.available_path_parts
def test_file_layout(self, compiler, provider, factory, module_configuration):
"""Tests the layout of files in the hierarchy is the one expected."""
module_configuration("complex_hierarchy")
@@ -61,7 +73,7 @@ def test_file_layout(self, compiler, provider, factory, module_configuration):
# is transformed to r"Core" if the compiler is listed among core
# compilers
# Check that specs listed as core_specs are transformed to "Core"
if compiler == "clang@=3.3" or spec_string == "mpich@3.0.1":
if compiler == "clang@=12.0.0" or spec_string == "mpich@3.0.1":
assert "Core" in layout.available_path_parts
else:
assert compiler.replace("@=", "/") in layout.available_path_parts
@@ -95,7 +107,7 @@ def test_compilers_provided_different_name(self, factory, module_configuration):
provides = module.conf.provides
assert "compiler" in provides
assert provides["compiler"] == spack.spec.CompilerSpec("oneapi@3.0")
assert provides["compiler"] == spack.spec.CompilerSpec("oneapi@=3.0")
def test_simple_case(self, modulefile_content, module_configuration):
"""Tests the generation of a simple Lua module file."""


@@ -17,8 +17,11 @@
import llnl.util.filesystem as fs
import spack.install_test
import spack.package_base
import spack.repo
from spack.build_systems.generic import Package
from spack.installer import InstallError
@pytest.fixture(scope="module")
@@ -117,14 +120,14 @@ def test_possible_dependencies_with_multiple_classes(mock_packages, mpileaks_pos
assert expected == spack.package_base.possible_dependencies(*pkgs)
def setup_install_test(source_paths, test_root):
"""
Set up the install test by creating sources and install test roots.
The convention used here is to create an empty file if the path name
ends with an extension; otherwise, a directory is created.
"""
fs.mkdirp(test_root)
for path in source_paths:
if os.path.splitext(path)[1]:
fs.touchp(path)
@@ -159,10 +162,11 @@ def test_cache_extra_sources(install_mockery, spec, sources, extras, expect):
"""Test the package's cache extra test sources helper function."""
s = spack.spec.Spec(spec).concretized()
s.package.spec.concretize()
source_path = s.package.stage.source_path
srcs = [fs.join_path(source_path, src) for src in sources]
test_root = spack.install_test.install_test_root(s.package)
setup_install_test(srcs, test_root)
emsg_dir = "Expected {0} to be a directory"
emsg_file = "Expected {0} to be a file"
@@ -173,10 +177,10 @@ def test_cache_extra_sources(install_mockery, spec, sources, extras, expect):
else:
assert os.path.isdir(src), emsg_dir.format(src)
spack.install_test.cache_extra_test_sources(s.package, extras)
src_dests = [fs.join_path(test_root, src) for src in sources]
exp_dests = [fs.join_path(test_root, e) for e in expect]
poss_dests = set(src_dests) | set(exp_dests)
msg = "Expected {0} to{1} exist"
@@ -192,3 +196,146 @@ def test_cache_extra_sources(install_mockery, spec, sources, extras, expect):
# Perform a little cleanup
shutil.rmtree(os.path.dirname(source_path))
def test_cache_extra_sources_fails(install_mockery):
s = spack.spec.Spec("a").concretized()
s.package.spec.concretize()
with pytest.raises(InstallError) as exc_info:
spack.install_test.cache_extra_test_sources(s.package, ["/a/b", "no-such-file"])
errors = str(exc_info.value)
assert "'/a/b') must be relative" in errors
assert "'no-such-file') for the copy does not exist" in errors
def test_package_exes_and_libs():
with pytest.raises(spack.error.SpackError, match="defines both"):
class BadDetectablePackage(spack.package.Package):
executables = ["findme"]
libraries = ["libFindMe.a"]
def test_package_url_and_urls():
class URLsPackage(spack.package.Package):
url = "https://www.example.com/url-package-1.0.tgz"
urls = ["https://www.example.com/archive"]
s = spack.spec.Spec("a")
with pytest.raises(ValueError, match="defines both"):
URLsPackage(s)
def test_package_license():
class LicensedPackage(spack.package.Package):
extendees = None # currently a required attribute for is_extension()
license_files = None
s = spack.spec.Spec("a")
pkg = LicensedPackage(s)
assert pkg.global_license_file is None
pkg.license_files = ["license.txt"]
assert os.path.basename(pkg.global_license_file) == pkg.license_files[0]
class BaseTestPackage(Package):
extendees = None # currently a required attribute for is_extension()
def test_package_version_fails():
s = spack.spec.Spec("a")
pkg = BaseTestPackage(s)
with pytest.raises(ValueError, match="does not have a concrete version"):
pkg.version()
def test_package_tester_fails():
s = spack.spec.Spec("a")
pkg = BaseTestPackage(s)
with pytest.raises(ValueError, match="without concrete version"):
pkg.tester()
def test_package_fetcher_fails():
s = spack.spec.Spec("a")
pkg = BaseTestPackage(s)
with pytest.raises(ValueError, match="without concrete version"):
pkg.fetcher
def test_package_no_extendees():
s = spack.spec.Spec("a")
pkg = BaseTestPackage(s)
assert pkg.extendee_args is None
def test_package_test_no_compilers(mock_packages, monkeypatch, capfd):
def compilers(compiler, arch_spec):
return None
monkeypatch.setattr(spack.compilers, "compilers_for_spec", compilers)
s = spack.spec.Spec("a")
pkg = BaseTestPackage(s)
pkg.test_requires_compiler = True
pkg.do_test()
error = capfd.readouterr()[1]
assert "Skipping tests for package" in error
assert "test requires missing compiler" in error
# TODO (post-34236): Remove when remove deprecated run_test(), etc.
@pytest.mark.parametrize(
"msg,installed,purpose,expected",
[
("do-nothing", False, "test: echo", "do-nothing"),
("not installed", True, "test: echo not installed", "expected in prefix"),
],
)
def test_package_run_test_install(
install_mockery_mutable_config, mock_fetch, capfd, msg, installed, purpose, expected
):
"""Confirm expected outputs from run_test for installed/not installed exe."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
pkg.run_test(
"echo", msg, expected=[expected], installed=installed, purpose=purpose, work_dir="."
)
output = capfd.readouterr()[0]
assert expected in output
# TODO (post-34236): Remove when remove deprecated run_test(), etc.
@pytest.mark.parametrize(
"skip,failures,status",
[
(True, 0, str(spack.install_test.TestStatus.SKIPPED)),
(False, 1, str(spack.install_test.TestStatus.FAILED)),
],
)
def test_package_run_test_missing(
install_mockery_mutable_config, mock_fetch, capfd, skip, failures, status
):
"""Confirm expected results from run_test for missing exe when skip or not."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
pkg.run_test("no-possible-program", skip_missing=skip)
output = capfd.readouterr()[0]
assert len(pkg.tester.test_failures) == failures
assert status in output
# TODO (post-34236): Remove when remove deprecated run_test(), etc.
def test_package_run_test_fail_fast(install_mockery_mutable_config, mock_fetch):
"""Confirm expected exception when run_test with fail_fast enabled."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
with spack.config.override("config:fail_fast", True):
with pytest.raises(spack.install_test.TestFailure, match="Failed to find executable"):
pkg.run_test("no-possible-program")


@@ -312,14 +312,6 @@ def test_fetch_options(version_str, digest_end, extra_options):
assert fetcher.extra_options == extra_options
def test_has_test_method_fails(capsys):
with pytest.raises(SystemExit):
spack.package_base.has_test_method("printing-package")
captured = capsys.readouterr()[1]
assert "is not a class" in captured
def test_package_deprecated_version(mock_packages, mock_fetch, mock_stage):
spec = Spec("deprecated-versions")
pkg_cls = spack.repo.path.get_pkg_class(spec.name)


@@ -152,3 +152,18 @@ def test_repo_path_handles_package_removal(tmpdir, mock_packages):
with spack.repo.use_repositories(builder.root, override=False) as repos:
r = repos.repo_for_pkg("c")
assert r.namespace == "builtin.mock"
def test_repo_dump_virtuals(tmpdir, mutable_mock_repo, mock_packages, ensure_debug, capsys):
# Start with a package-less virtual
vspec = spack.spec.Spec("something")
mutable_mock_repo.dump_provenance(vspec, tmpdir)
captured = capsys.readouterr()[1]
assert "does not have a package" in captured
# Now with a virtual with a package
vspec = spack.spec.Spec("externalvirtual")
mutable_mock_repo.dump_provenance(vspec, tmpdir)
captured = capsys.readouterr()[1]
assert "Installing" in captured
assert "package.py" in os.listdir(tmpdir), "Expected the virtual's package to be copied"


@@ -2,6 +2,8 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import pytest
import llnl.util.filesystem as fs
@@ -9,10 +11,12 @@
import spack.reporters.extract
import spack.spec
from spack.install_test import TestStatus
from spack.reporters import CDash, CDashConfiguration
# Use a path variable to appease Spack style line length checks
fake_install_prefix = fs.join_path(
os.sep,
"usr",
"spack",
"spack",
@@ -28,17 +32,41 @@
)
def test_reporters_extract_basics():
# This test has a description, command, and status
fake_bin = fs.join_path(fake_install_prefix, "bin", "fake")
name = "test_no_status"
desc = "basic description"
status = TestStatus.PASSED
outputs = """
==> Testing package fake-1.0-abcdefg
==> [2022-02-15-18:44:21.250165] test: {0}: {1}
==> [2022-02-15-18:44:21.250200] '{2}'
{3}: {0}
""".format(
name, desc, fake_bin, status
).splitlines()
parts = spack.reporters.extract.extract_test_parts("fake", outputs)
assert len(parts) == 1
assert parts[0]["command"] == "{0}".format(fake_bin)
assert parts[0]["desc"] == desc
assert parts[0]["loglines"] == ["{0}: {1}".format(status, name)]
assert parts[0]["status"] == status.lower()
def test_reporters_extract_no_parts(capfd):
# This test ticks three boxes:
# 1) has Installing, which is skipped;
# 2) does not define any test parts;
# 3) has a status value without a part so generates a warning
status = TestStatus.NO_TESTS
outputs = """
==> Testing package fake-1.0-abcdefg
==> [2022-02-11-17:14:38.875259] Installing {0} to {1}
NO-TESTS
{2}
""".format(
fake_install_test_root, fake_test_cache, status
).splitlines()
parts = spack.reporters.extract.extract_test_parts("fake", outputs)
@@ -49,61 +77,67 @@ def test_reporters_extract_no_parts(capfd):
assert "No part to add status" in err
def test_reporters_extract_no_command():
# This test ticks 2 boxes:
# 1) has a test description with no command or status
# 2) has a test description, command, and status
fake_bin = fs.join_path(fake_install_prefix, "bin", "fake")
outputs = """
==> Testing package fake-1.0-abcdefg
==> [2022-02-15-18:44:21.250165] command with no status
==> [2022-02-15-18:44:21.250175] running test program
==> [2022-02-15-18:44:21.250200] '{0}'
PASSED
""".format(
fake_bin
).splitlines()
parts = spack.reporters.extract.extract_test_parts("fake", outputs)
assert len(parts) == 2
assert parts[0]["command"] == "unknown"
assert parts[1]["loglines"] == ["PASSED"]
assert parts[1]["elapsed"] == 0.0
def test_reporters_extract_missing_desc():
# This tests parts with and without descriptions *and* a test part that has
# multiple commands
fake_bin = fs.join_path(fake_install_prefix, "bin", "importer")
names = ["test_fake_bin", "test_fake_util", "test_multiple_commands"]
descs = ["", "import fake util module", ""]
failed = TestStatus.FAILED
passed = TestStatus.PASSED
results = [passed, failed, passed]
outputs = """
==> Testing package fake-1.0-abcdefg
==> [2022-02-15-18:44:21.250165] '{0}' '-c' 'import fake.bin'
PASSED
==> [2022-02-15-18:44:21.250200] '{0}' '-c' 'import fake.util'
PASSED
==> [2022-02-15-18:44:21.250165] test: {0}: {1}
==> [2022-02-15-18:44:21.250170] '{5}' '-c' 'import fake.bin'
{2}: {0}
==> [2022-02-15-18:44:21.250185] test: {3}: {4}
==> [2022-02-15-18:44:21.250200] '{5}' '-c' 'import fake.util'
{6}: {3}
==> [2022-02-15-18:44:21.250205] test: {7}: {8}
==> [2022-02-15-18:44:21.250210] 'exe1 1'
==> [2022-02-15-18:44:21.250250] 'exe2 2'
{9}: {7}
""".format(
names[0],
descs[0],
results[0],
names[1],
descs[1],
fake_bin,
results[1],
names[2],
descs[2],
results[2],
).splitlines()
parts = spack.reporters.extract.extract_test_parts("fake", outputs)
assert len(parts) == 3
for i, (name, desc, status) in enumerate(zip(names, descs, results)):
assert parts[i]["name"] == name
assert parts[i]["desc"] == desc
assert parts[i]["status"] == status.lower()
assert parts[2]["command"] == "exe1 1; exe2 2"
# TODO (post-34236): Remove this test when removing deprecated run_test(), etc.
def test_reporters_extract_xfail():
fake_bin = fs.join_path(fake_install_prefix, "bin", "fake-app")
outputs = """
==> Testing package fake-1.0-abcdefg
==> [2022-02-15-18:44:21.250165] Expecting return code in [3]
==> [2022-02-15-18:44:21.250165] test: test_fake: Checking fake imports
==> [2022-02-15-18:44:21.250175] Expecting return code in [3]
==> [2022-02-15-18:44:21.250200] '{0}'
{1}
""".format(
fake_bin, str(TestStatus.PASSED)
).splitlines()
parts = spack.reporters.extract.extract_test_parts("fake", outputs)
assert len(parts) == 1
parts[0]["command"] == fake_bin
parts[0]["completed"] == "Expected to fail"
@@ -123,6 +157,7 @@ def test_reporters_extract_skipped(state):
parts[0]["completed"] == expected
# TODO (post-34236): Remove this test when removing deprecated run_test(), etc.
def test_reporters_skip():
# This test ticks 3 boxes:
# 1) covers as-yet-uncovered skip messages
@@ -134,7 +169,7 @@ def test_reporters_skip():
==> Testing package fake-1.0-abcdefg
==> [2022-02-15-18:44:21.250165, 123456] Detected the following modules: fake1
==> {0}
==> [2022-02-15-18:44:21.250175, 123456] running fake program
==> [2022-02-15-18:44:21.250175, 123456] test: test_fake: running fake program
==> [2022-02-15-18:44:21.250200, 123456] '{1}'
INVALID
Results for test suite abcdefghijklmn
@@ -150,6 +185,27 @@ def test_reporters_skip():
assert parts[0]["elapsed"] == 0.0
def test_reporters_skip_new():
outputs = """
==> [2023-04-06-15:55:13.094025] test: test_skip:
SKIPPED: test_skip: Package must be built with +python
==> [2023-04-06-15:55:13.540029] Completed testing
==> [2023-04-06-15:55:13.540275]
======================= SUMMARY: fake-1.0-abcdefg ========================
fake::test_skip .. SKIPPED
=========================== 1 skipped of 1 part ==========================
""".splitlines()
parts = spack.reporters.extract.extract_test_parts("fake", outputs)
assert len(parts) == 1
part = parts[0]
assert part["name"] == "test_skip"
assert part["status"] == "skipped"
assert part["completed"] == "Completed"
assert part["loglines"][0].startswith("SKIPPED:")
def test_reporters_report_for_package_no_stdout(tmpdir, monkeypatch, capfd):
class MockCDash(CDash):
def upload(*args, **kwargs):

View File

@@ -239,6 +239,22 @@ def test_abstract_specs_can_constrain_each_other(self, lhs, rhs, expected):
assert c1 == c2
assert c1 == expected
def test_constrain_specs_by_hash(self, default_mock_concretization, database):
"""Test that Specs specified only by their hashes can constrain eachother."""
mpich_dag_hash = "/" + database.query_one("mpich").dag_hash()
spec = Spec(mpich_dag_hash[:7])
assert spec.constrain(Spec(mpich_dag_hash)) is False
assert spec.abstract_hash == mpich_dag_hash[1:]
def test_mismatched_constrain_spec_by_hash(self, default_mock_concretization, database):
"""Test that Specs specified only by their incompatible hashes fail appropriately."""
lhs = "/" + database.query_one("callpath ^mpich").dag_hash()
rhs = "/" + database.query_one("callpath ^mpich2").dag_hash()
with pytest.raises(spack.spec.InvalidHashError):
Spec(lhs).constrain(Spec(rhs))
with pytest.raises(spack.spec.InvalidHashError):
Spec(lhs[:7]).constrain(Spec(rhs))
@pytest.mark.parametrize(
"lhs,rhs", [("libelf", Spec()), ("libelf", "@0:1"), ("libelf", "@0:1 %gcc")]
)

View File

@@ -23,7 +23,7 @@
FAIL_ON_WINDOWS = pytest.mark.xfail(
sys.platform == "win32",
raises=(SpecTokenizationError, spack.spec.InvalidHashError),
reason="Unix style path on Windows",
)
@@ -631,21 +631,34 @@ def test_spec_by_hash_tokens(text, tokens):
@pytest.mark.db
def test_spec_by_hash(database, monkeypatch, config):
mpileaks = database.query_one("mpileaks ^zmpi")
b = spack.spec.Spec("b").concretized()
monkeypatch.setattr(spack.binary_distribution, "update_cache_and_get_specs", lambda: [b])
hash_str = f"/{mpileaks.dag_hash()}"
parsed_spec = SpecParser(hash_str).next_spec()
parsed_spec.replace_hash()
assert parsed_spec == mpileaks
short_hash_str = f"/{mpileaks.dag_hash()[:5]}"
parsed_spec = SpecParser(short_hash_str).next_spec()
parsed_spec.replace_hash()
assert parsed_spec == mpileaks
name_version_and_hash = f"{mpileaks.name}@{mpileaks.version} /{mpileaks.dag_hash()[:5]}"
parsed_spec = SpecParser(name_version_and_hash).next_spec()
parsed_spec.replace_hash()
assert parsed_spec == mpileaks
b_hash = f"/{b.dag_hash()}"
parsed_spec = SpecParser(b_hash).next_spec()
parsed_spec.replace_hash()
assert parsed_spec == b
@pytest.mark.db
def test_dep_spec_by_hash(database, config):
mpileaks_zmpi = database.query_one("mpileaks ^zmpi")
zmpi = database.query_one("zmpi")
fake = database.query_one("fake")
@@ -653,26 +666,25 @@ def test_dep_spec_by_hash(database):
assert "fake" in mpileaks_zmpi
assert "zmpi" in mpileaks_zmpi
mpileaks_hash_fake = SpecParser(f"mpileaks ^/{fake.dag_hash()}").next_spec()
mpileaks_hash_fake = SpecParser(f"mpileaks ^/{fake.dag_hash()} ^zmpi").next_spec()
mpileaks_hash_fake.replace_hash()
assert "fake" in mpileaks_hash_fake
assert mpileaks_hash_fake["fake"] == fake
assert "zmpi" in mpileaks_hash_fake
assert mpileaks_hash_fake["zmpi"] == spack.spec.Spec("zmpi")
mpileaks_hash_zmpi = SpecParser(
f"mpileaks %{mpileaks_zmpi.compiler} ^ /{zmpi.dag_hash()}"
).next_spec()
mpileaks_hash_zmpi.replace_hash()
assert "zmpi" in mpileaks_hash_zmpi
assert mpileaks_hash_zmpi["zmpi"] == zmpi
# notice: the round-trip str -> Spec loses specificity when
# since %gcc@=x gets printed as %gcc@x. So stick to satisfies
# here, unless/until we want to differentiate between ranges
# and specific versions in the future.
# assert mpileaks_hash_zmpi.compiler == mpileaks_zmpi.compiler
assert mpileaks_zmpi.compiler.satisfies(mpileaks_hash_zmpi.compiler)
mpileaks_hash_fake_and_zmpi = SpecParser(
f"mpileaks ^/{fake.dag_hash()[:4]} ^ /{zmpi.dag_hash()[:5]}"
).next_spec()
mpileaks_hash_fake_and_zmpi.replace_hash()
assert "zmpi" in mpileaks_hash_fake_and_zmpi
assert mpileaks_hash_fake_and_zmpi["zmpi"] == zmpi
@@ -681,7 +693,7 @@ def test_dep_spec_by_hash(database):
@pytest.mark.db
def test_multiple_specs_with_hash(database, config):
mpileaks_zmpi = database.query_one("mpileaks ^zmpi")
callpath_mpich2 = database.query_one("callpath ^mpich2")
@@ -713,7 +725,7 @@ def test_multiple_specs_with_hash(database):
@pytest.mark.db
def test_ambiguous_hash(mutable_database, default_mock_concretization, config):
x1 = default_mock_concretization("a")
x2 = x1.copy()
x1._hash = "xyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy"
@@ -723,31 +735,43 @@ def test_ambiguous_hash(mutable_database, default_mock_concretization):
# ambiguity in first hash character
with pytest.raises(spack.spec.AmbiguousHashError):
SpecParser("/x").next_spec()
parsed_spec = SpecParser("/x").next_spec()
parsed_spec.replace_hash()
# ambiguity in first hash character AND spec name
with pytest.raises(spack.spec.AmbiguousHashError):
SpecParser("a/x").next_spec()
parsed_spec = SpecParser("a/x").next_spec()
parsed_spec.replace_hash()
@pytest.mark.db
def test_invalid_hash(database, config):
zmpi = database.query_one("zmpi")
mpich = database.query_one("mpich")
# name + incompatible hash
with pytest.raises(spack.spec.InvalidHashError):
SpecParser(f"zmpi /{mpich.dag_hash()}").next_spec()
parsed_spec = SpecParser(f"zmpi /{mpich.dag_hash()}").next_spec()
parsed_spec.replace_hash()
with pytest.raises(spack.spec.InvalidHashError):
SpecParser(f"mpich /{zmpi.dag_hash()}").next_spec()
parsed_spec = SpecParser(f"mpich /{zmpi.dag_hash()}").next_spec()
parsed_spec.replace_hash()
# name + dep + incompatible hash
with pytest.raises(spack.spec.InvalidHashError):
SpecParser(f"mpileaks ^zmpi /{mpich.dag_hash()}").next_spec()
parsed_spec = SpecParser(f"mpileaks ^zmpi /{mpich.dag_hash()}").next_spec()
parsed_spec.replace_hash()
def test_invalid_hash_dep(database, config):
mpich = database.query_one("mpich")
hash = mpich.dag_hash()
with pytest.raises(spack.spec.InvalidHashError):
spack.spec.Spec(f"callpath ^zlib/{hash}").replace_hash()
@pytest.mark.db
def test_nonexistent_hash(database, config):
"""Ensure we get errors for non existent hashes."""
specs = database.query()
@@ -756,26 +780,40 @@ def test_nonexistent_hash(database):
hashes = [s._hash for s in specs]
assert no_such_hash not in [h[: len(no_such_hash)] for h in hashes]
with pytest.raises(spack.spec.InvalidHashError):
parsed_spec = SpecParser(f"/{no_such_hash}").next_spec()
parsed_spec.replace_hash()
@pytest.mark.db
@pytest.mark.parametrize(
"query_str,text_fmt",
"spec1,spec2,constraint",
[
("mpileaks ^zmpi", r"/{hash}%{0.compiler}"),
("callpath ^zmpi", r"callpath /{hash} ^libelf"),
("dyninst", r'/{hash} cflags="-O3 -fPIC"'),
("mpileaks ^mpich2", r"mpileaks/{hash} @{0.version}"),
("zlib", "hdf5", None),
("zlib+shared", "zlib~shared", "+shared"),
("hdf5+mpi^zmpi", "hdf5~mpi", "^zmpi"),
("hdf5+mpi^mpich+debug", "hdf5+mpi^mpich~debug", "^mpich+debug"),
],
)
def test_disambiguate_hash_by_spec(spec1, spec2, constraint, mock_packages, monkeypatch, config):
spec1_concrete = spack.spec.Spec(spec1).concretized()
spec2_concrete = spack.spec.Spec(spec2).concretized()
spec1_concrete._hash = "spec1"
spec2_concrete._hash = "spec2"
monkeypatch.setattr(
spack.binary_distribution,
"update_cache_and_get_specs",
lambda: [spec1_concrete, spec2_concrete],
)
# Ordering is tricky -- for constraints we want after, for names we want before
if not constraint:
spec = spack.spec.Spec(spec1 + "/spec")
else:
spec = spack.spec.Spec("/spec" + constraint)
assert spec.lookup_hash() == spec1_concrete
@pytest.mark.parametrize(
@@ -858,11 +896,13 @@ def test_error_conditions(text, exc_cls):
"libfoo ^./libdwarf.yaml", spack.spec.NoSuchSpecFileError, marks=FAIL_ON_WINDOWS
),
pytest.param(
"/bogus/path/libdwarf.yamlfoobar", spack.spec.SpecFilenameError, marks=FAIL_ON_WINDOWS
"/bogus/path/libdwarf.yamlfoobar",
spack.spec.NoSuchSpecFileError,
marks=FAIL_ON_WINDOWS,
),
pytest.param(
"libdwarf^/bogus/path/libelf.yamlfoobar ^/path/to/bogus.yaml",
spack.spec.NoSuchSpecFileError,
marks=FAIL_ON_WINDOWS,
),
pytest.param(
@@ -895,7 +935,7 @@ def test_error_conditions(text, exc_cls):
)
def test_specfile_error_conditions_windows(text, exc_cls):
with pytest.raises(exc_cls):
SpecParser(text).all_specs()
@pytest.mark.parametrize(

View File

@@ -2,13 +2,18 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import os
import sys
import pytest
from llnl.util.filesystem import join_path, mkdirp, touch
import spack.install_test
import spack.spec
from spack.install_test import TestStatus
from spack.util.executable import which
def _true(*args, **kwargs):
@@ -16,7 +21,7 @@ def _true(*args, **kwargs):
return True
def ensure_results(filename, expected, present=True):
assert os.path.exists(filename)
with open(filename, "r") as fd:
lines = fd.readlines()
@@ -25,10 +30,13 @@ def ensure_results(filename, expected):
if expected in line:
have = True
break
if present:
assert have, f"Expected '{expected}' in the file"
else:
assert not have, f"Expected '{expected}' NOT to be in the file"
def test_test_log_name(mock_packages, config):
"""Ensure test log path is reasonable."""
spec = spack.spec.Spec("libdwarf").concretized()
@@ -74,8 +82,8 @@ def test_write_test_result(mock_packages, mock_test_stage):
assert spec.name in msg
def test_test_not_installed(mock_packages, install_mockery, mock_test_stage):
"""Attempt to perform a stand-alone test for a package that is not installed."""
spec = spack.spec.Spec("trivial-smoke-test").concretized()
test_suite = spack.install_test.TestSuite([spec])
@@ -87,7 +95,7 @@ def test_test_uninstalled(mock_packages, install_mockery, mock_test_stage):
@pytest.mark.parametrize(
"arguments,status,msg",
[({}, "SKIPPED", "Skipped"), ({"externals": True}, "NO-TESTS", "No tests")],
[({}, TestStatus.SKIPPED, "Skipped"), ({"externals": True}, TestStatus.NO_TESTS, "No tests")],
)
def test_test_external(
mock_packages, install_mockery, mock_test_stage, monkeypatch, arguments, status, msg
@@ -101,7 +109,7 @@ def test_test_external(
test_suite = spack.install_test.TestSuite([spec])
test_suite(**arguments)
ensure_results(test_suite.results_file, str(status))
if arguments:
ensure_results(test_suite.log_file_for_spec(spec), msg)
@@ -149,6 +157,7 @@ def test_test_spec_passes(mock_packages, install_mockery, mock_test_stage, monke
ensure_results(test_suite.results_file, "PASSED")
ensure_results(test_suite.log_file_for_spec(spec), "simple stand-alone")
ensure_results(test_suite.log_file_for_spec(spec), "standalone-ifc", present=False)
def test_get_test_suite():
@@ -181,3 +190,313 @@ def add_suite(package):
with pytest.raises(spack.install_test.TestSuiteNameError) as exc_info:
spack.install_test.get_test_suite(name)
assert "many suites named" in str(exc_info)
@pytest.mark.parametrize(
"virtuals,expected",
[(False, ["Mpich.test_mpich"]), (True, ["Mpi.test_hello", "Mpich.test_mpich"])],
)
def test_test_function_names(mock_packages, install_mockery, virtuals, expected):
"""Confirm test_function_names works as expected with/without virtuals."""
spec = spack.spec.Spec("mpich").concretized()
tests = spack.install_test.test_function_names(spec.package, add_virtuals=virtuals)
assert sorted(tests) == sorted(expected)
def test_test_functions_fails():
"""Confirm test_functions raises error if no package."""
with pytest.raises(ValueError, match="Expected a package"):
spack.install_test.test_functions(str)
def test_test_functions_pkgless(mock_packages, install_mockery, ensure_debug, capsys):
"""Confirm works for package providing a package-less virtual."""
spec = spack.spec.Spec("simple-standalone-test").concretized()
fns = spack.install_test.test_functions(spec.package, add_virtuals=True)
out = capsys.readouterr()
assert len(fns) == 2, "Expected two test functions"
for f in fns:
assert f[1].__name__ in ["test_echo", "test_skip"]
assert "virtual does not appear to have a package file" in out[1]
# TODO: This test should go away when compilers as dependencies is supported
def test_test_virtuals():
"""Confirm virtuals picks up non-unique, provided compilers."""
# This is an unrealistic case but it is set up to retrieve all possible
# virtual names in a single call.
def satisfies(spec):
return True
# Ensure spec will pick up the llvm+clang virtual compiler package names.
VirtualSpec = collections.namedtuple("VirtualSpec", ["name", "satisfies"])
vspec = VirtualSpec("llvm", satisfies)
# Ensure the package name is in the list that provides c, cxx, and fortran
# to pick up the three associated compilers and that virtuals provided will
# be deduped.
MyPackage = collections.namedtuple("MyPackage", ["name", "spec", "virtuals_provided"])
pkg = MyPackage("gcc", vspec, [vspec, vspec])
# This check assumes the method will not provide a unique set of compilers
v_names = spack.install_test.virtuals(pkg)
for name, number in [("c", 2), ("cxx", 2), ("fortran", 1), ("llvm", 1)]:
assert v_names.count(name) == number, "Expected {0} of '{1}'".format(number, name)
def test_package_copy_test_files_fails(mock_packages):
"""Confirm copy_test_files fails as expected without package or test_suite."""
vspec = spack.spec.Spec("something")
# Try without a package
with pytest.raises(spack.install_test.TestSuiteError) as exc_info:
spack.install_test.copy_test_files(None, vspec)
assert "without a package" in str(exc_info)
# Try with a package without a test suite
MyPackage = collections.namedtuple("MyPackage", ["name", "spec", "test_suite"])
pkg = MyPackage("SomePackage", vspec, None)
with pytest.raises(spack.install_test.TestSuiteError) as exc_info:
spack.install_test.copy_test_files(pkg, vspec)
assert "test suite is missing" in str(exc_info)
def test_package_copy_test_files_skips(mock_packages, ensure_debug, capsys):
"""Confirm copy_test_files errors as expected if no package class found."""
# Try with a non-concrete spec and package with a test suite
MockSuite = collections.namedtuple("TestSuite", ["specs"])
MyPackage = collections.namedtuple("MyPackage", ["name", "spec", "test_suite"])
vspec = spack.spec.Spec("something")
pkg = MyPackage("SomePackage", vspec, MockSuite([]))
spack.install_test.copy_test_files(pkg, vspec)
out = capsys.readouterr()[1]
assert "skipping test data copy" in out
assert "no package class found" in out
def test_process_test_parts(mock_packages):
"""Confirm process_test_parts fails as expected without package or test_suite."""
# Try without a package
with pytest.raises(spack.install_test.TestSuiteError) as exc_info:
spack.install_test.process_test_parts(None, [])
assert "without a package" in str(exc_info)
# Try with a package without a test suite
MyPackage = collections.namedtuple("MyPackage", ["name", "test_suite"])
pkg = MyPackage("SomePackage", None)
with pytest.raises(spack.install_test.TestSuiteError) as exc_info:
spack.install_test.process_test_parts(pkg, [])
assert "test suite is missing" in str(exc_info)
def test_test_part_fail(tmpdir, install_mockery_mutable_config, mock_fetch, mock_test_stage):
"""Confirm test_part with a ProcessError results in FAILED status."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
pkg.tester.test_log_file = str(tmpdir.join("test-log.txt"))
touch(pkg.tester.test_log_file)
name = "test_fail"
with spack.install_test.test_part(pkg, name, "fake ProcessError"):
raise spack.util.executable.ProcessError("Mock failure")
for part_name, status in pkg.tester.test_parts.items():
assert part_name.endswith(name)
assert status == TestStatus.FAILED
def test_test_part_pass(install_mockery_mutable_config, mock_fetch, mock_test_stage):
"""Confirm test_part that succeeds results in PASSED status."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
name = "test_echo"
msg = "nothing"
with spack.install_test.test_part(pkg, name, "echo"):
echo = which("echo")
echo(msg)
for part_name, status in pkg.tester.test_parts.items():
assert part_name.endswith(name)
assert status == TestStatus.PASSED
def test_test_part_skip(install_mockery_mutable_config, mock_fetch, mock_test_stage):
"""Confirm test_part that raises SkipTest results in test status SKIPPED."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
name = "test_skip"
with spack.install_test.test_part(pkg, name, "raise SkipTest"):
raise spack.install_test.SkipTest("Skipping the test")
for part_name, status in pkg.tester.test_parts.items():
assert part_name.endswith(name)
assert status == TestStatus.SKIPPED
def test_test_part_missing_exe_fail_fast(
tmpdir, install_mockery_mutable_config, mock_fetch, mock_test_stage
):
"""Confirm test_part with fail fast enabled raises exception."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
pkg.tester.test_log_file = str(tmpdir.join("test-log.txt"))
touch(pkg.tester.test_log_file)
name = "test_fail_fast"
with spack.config.override("config:fail_fast", True):
with pytest.raises(spack.install_test.TestFailure, match="object is not callable"):
with spack.install_test.test_part(pkg, name, "fail fast"):
missing = which("no-possible-program")
missing()
test_parts = pkg.tester.test_parts
assert len(test_parts) == 1
for part_name, status in test_parts.items():
assert part_name.endswith(name)
assert status == TestStatus.FAILED
def test_test_part_missing_exe(
tmpdir, install_mockery_mutable_config, mock_fetch, mock_test_stage
):
"""Confirm test_part with missing executable fails."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
pkg.tester.test_log_file = str(tmpdir.join("test-log.txt"))
touch(pkg.tester.test_log_file)
name = "test_missing_exe"
with spack.install_test.test_part(pkg, name, "missing exe"):
missing = which("no-possible-program")
missing()
test_parts = pkg.tester.test_parts
assert len(test_parts) == 1
for part_name, status in test_parts.items():
assert part_name.endswith(name)
assert status == TestStatus.FAILED
# TODO (embedded test parts): Update this once embedded test part tracking
# TODO (embedded test parts): properly handles the nested context managers.
@pytest.mark.parametrize(
"current,substatuses,expected",
[
(TestStatus.PASSED, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.PASSED),
(TestStatus.FAILED, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.FAILED),
(TestStatus.SKIPPED, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.SKIPPED),
(TestStatus.NO_TESTS, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.NO_TESTS),
(TestStatus.PASSED, [TestStatus.PASSED, TestStatus.SKIPPED], TestStatus.PASSED),
(TestStatus.PASSED, [TestStatus.PASSED, TestStatus.FAILED], TestStatus.FAILED),
(TestStatus.PASSED, [TestStatus.SKIPPED, TestStatus.SKIPPED], TestStatus.SKIPPED),
],
)
def test_embedded_test_part_status(
install_mockery_mutable_config, mock_fetch, mock_test_stage, current, substatuses, expected
):
"""Check to ensure the status of the enclosing test part reflects summary of embedded parts."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
base_name = "test_example"
part_name = f"{pkg.__class__.__name__}::{base_name}"
pkg.tester.test_parts[part_name] = current
for i, status in enumerate(substatuses):
pkg.tester.test_parts[f"{part_name}_{i}"] = status
pkg.tester.status(base_name, current)
assert pkg.tester.test_parts[part_name] == expected
@pytest.mark.parametrize(
"statuses,expected",
[
([TestStatus.PASSED, TestStatus.PASSED], TestStatus.PASSED),
([TestStatus.PASSED, TestStatus.SKIPPED], TestStatus.PASSED),
([TestStatus.PASSED, TestStatus.FAILED], TestStatus.FAILED),
([TestStatus.SKIPPED, TestStatus.SKIPPED], TestStatus.SKIPPED),
([], TestStatus.NO_TESTS),
],
)
def test_write_tested_status(
tmpdir, install_mockery_mutable_config, mock_fetch, mock_test_stage, statuses, expected
):
"""Check to ensure the status of the enclosing test part reflects summary of embedded parts."""
s = spack.spec.Spec("trivial-smoke-test").concretized()
pkg = s.package
for i, status in enumerate(statuses):
pkg.tester.test_parts[f"test_{i}"] = status
pkg.tester.counts[status] += 1
pkg.tester.tested_file = tmpdir.join("test-log.txt")
pkg.tester.write_tested_status()
with open(pkg.tester.tested_file, "r") as f:
status = int(f.read().strip("\n"))
assert TestStatus(status) == expected
def test_check_special_outputs(tmpdir):
"""This test covers two related helper methods"""
contents = """CREATE TABLE packages (
name varchar(80) primary key,
has_code integer,
url varchar(160));
INSERT INTO packages VALUES('sqlite',1,'https://www.sqlite.org');
INSERT INTO packages VALUES('readline',1,'https://tiswww.case.edu/php/chet/readline/rltop.html');
INSERT INTO packages VALUES('xsdk',0,'http://xsdk.info');
COMMIT;
"""
filename = tmpdir.join("special.txt")
with open(filename, "w") as f:
f.write(contents)
expected = spack.install_test.get_escaped_text_output(filename)
spack.install_test.check_outputs(expected, contents)
# Let's also cover case where something expected is NOT in the output
expected.append("should not find me")
with pytest.raises(RuntimeError, match="Expected"):
spack.install_test.check_outputs(expected, contents)
def test_find_required_file(tmpdir):
filename = "myexe"
dirs = ["a", "b"]
for d in dirs:
path = tmpdir.join(d)
mkdirp(path)
touch(join_path(path, filename))
path = join_path(tmpdir.join("c"), "d")
mkdirp(path)
touch(join_path(path, filename))
# First just find a single path
results = spack.install_test.find_required_file(
tmpdir.join("c"), filename, expected=1, recursive=True
)
assert isinstance(results, str)
# Ensure no file is found if we do not search that directory recursively
with pytest.raises(spack.install_test.SkipTest, match="Expected 1"):
spack.install_test.find_required_file(
tmpdir.join("c"), filename, expected=1, recursive=False
)
# Now make sure we get all of the files
results = spack.install_test.find_required_file(tmpdir, filename, expected=3, recursive=True)
assert isinstance(results, list) and len(results) == 3
def test_packagetest_fails(mock_packages):
MyPackage = collections.namedtuple("MyPackage", ["spec"])
s = spack.spec.Spec("a")
pkg = MyPackage(s)
with pytest.raises(ValueError, match="require a concrete package"):
spack.install_test.PackageTest(pkg)


@@ -5,6 +5,7 @@
import os
import shutil
import sys
import pytest
@@ -62,6 +63,7 @@ def test_native_unpacking(tmpdir_factory, archive_file):
assert "TEST" in contents
@pytest.mark.skipif(sys.platform == "win32", reason="Only Python unpacking available on Windows")
@pytest.mark.parametrize("archive_file", ext_archive.keys(), indirect=True)
def test_system_unpacking(tmpdir_factory, archive_file, compr_support_check):
# actually run test


@@ -337,15 +337,15 @@ def test_remove_complex_package_logic_filtered():
("grads", "rrlmwml3f2frdnqavmro3ias66h5b2ce"),
("llvm", "nufffum5dabmaf4l5tpfcblnbfjknvd3"),
# has @when("@4.1.0") and raw unicode literals
("mfem", "tiiv7uq7v2xtv24vdij5ptcv76dpazrw"),
("mfem@4.0.0", "tiiv7uq7v2xtv24vdij5ptcv76dpazrw"),
("mfem@4.1.0", "gxastq64to74qt4he4knpyjfdhh5auel"),
("mfem", "qtneutm6khd6epd2rhyuv2y6zavsxbed"),
("mfem@4.0.0", "qtneutm6khd6epd2rhyuv2y6zavsxbed"),
("mfem@4.1.0", "uit2ydzhra3b2mlvnq262qlrqqmuwq3d"),
# has @when("@1.5.0:")
("py-torch", "qs7djgqn7dy7r3ps4g7hv2pjvjk4qkhd"),
("py-torch@1.0", "qs7djgqn7dy7r3ps4g7hv2pjvjk4qkhd"),
("py-torch@1.6", "p4ine4hc6f2ik2f2wyuwieslqbozll5w"),
# has a print with multiple arguments
("legion", "zdpawm4avw3fllxcutvmqb5c3bj5twqt"),
("legion", "sffy6vz3dusxnxeetofoomlaieukygoj"),
# has nested `with when()` blocks and loops
("trilinos", "vqrgscjrla4hi7bllink7v6v6dwxgc2p"),
],


@@ -18,7 +18,7 @@
def sort_edges(edges):
-edges.sort(key=lambda edge: edge.spec.name)
+edges.sort(key=lambda edge: (edge.spec.name or "", edge.spec.abstract_hash or ""))
return edges


@@ -14,6 +14,7 @@
from llnl.util import tty
import spack.util.path as spath
+from spack.error import SpackError
from spack.util.executable import CommandNotFoundError, which
# Supported archive extensions.
@@ -68,12 +69,9 @@ def allowed_archive(path):
return False if not path else any(path.endswith(t) for t in ALLOWED_ARCHIVE_TYPES)
-def _untar(archive_file):
-"""Untar archive. Prefer native Python `tarfile`
-but fall back to system utility if there is a failure
-to find the native Python module (tar on Unix).
-Filters archives through native support gzip and xz
-compression formats.
+def _system_untar(archive_file):
+"""Returns path to unarchived tar file.
+Untars archive via system tar.
Args:
archive_file (str): absolute path to the archive to be extracted.
@@ -88,32 +86,50 @@ def _untar(archive_file):
def _bunzip2(archive_file):
"""Use Python's bz2 module to decompress bz2 compressed archives
"""Returns path to decompressed file.
Uses Python's bz2 module to decompress bz2 compressed archives
Fall back to system utility failing to find Python module `bz2`
Args:
archive_file (str): absolute path to the bz2 archive to be decompressed
"""
if is_bz2_supported():
return _py_bunzip(archive_file)
else:
return _system_bunzip(archive_file)
def _py_bunzip(archive_file):
"""Returns path to decompressed file.
Decompresses bz2 compressed archives/files via python's bz2 module"""
decompressed_file = os.path.basename(strip_compression_extension(archive_file, "bz2"))
working_dir = os.getcwd()
archive_out = os.path.join(working_dir, decompressed_file)
f_bz = bz2.BZ2File(archive_file, mode="rb")
with open(archive_out, "wb") as ar:
shutil.copyfileobj(f_bz, ar)
f_bz.close()
return archive_out
def _system_bunzip(archive_file):
"""Returns path to decompressed file.
Decompresses bz2 compressed archives/files via system bzip2 utility"""
compressed_file_name = os.path.basename(archive_file)
decompressed_file = os.path.basename(strip_extension(archive_file, "bz2"))
decompressed_file = os.path.basename(strip_compression_extension(archive_file, "bz2"))
working_dir = os.getcwd()
archive_out = os.path.join(working_dir, decompressed_file)
copy_path = os.path.join(working_dir, compressed_file_name)
if is_bz2_supported():
f_bz = bz2.BZ2File(archive_file, mode="rb")
with open(archive_out, "wb") as ar:
shutil.copyfileobj(f_bz, ar)
f_bz.close()
else:
shutil.copy(archive_file, copy_path)
bunzip2 = which("bunzip2", required=True)
bunzip2.add_default_arg("-q")
return bunzip2(copy_path)
shutil.copy(archive_file, copy_path)
bunzip2 = which("bunzip2", required=True)
bunzip2.add_default_arg("-q")
bunzip2(copy_path)
return archive_out
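
The refactor in this hunk settles on one pattern per format: a thin dispatcher that checks Python-module support and then picks a _py_* or _system_* implementation. A minimal sketch of that shape (names hypothetical, mirroring the dispatchers in this hunk):

def _decompress(archive_file, py_supported, py_impl, system_impl):
    # Hypothetical dispatcher: prefer the stdlib implementation,
    # fall back to the system tool when the module is unavailable.
    if py_supported():
        return py_impl(archive_file)
    return system_impl(archive_file)
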
def _gunzip(archive_file):
"""Decompress `.gz` extensions. Prefer native Python `gzip` module.
"""Returns path to gunzip'd file
Decompresses `.gz` extensions. Prefer native Python `gzip` module.
Failing back to system utility gunzip.
Like gunzip, but extracts in the current working directory
instead of in-place.
@@ -121,34 +137,42 @@ def _gunzip(archive_file):
Args:
archive_file (str): absolute path of the file to be decompressed
"""
decompressed_file = os.path.basename(strip_extension(archive_file, "gz"))
if is_gzip_supported():
return _py_gunzip(archive_file)
else:
return _system_gunzip(archive_file)
def _py_gunzip(archive_file):
"""Returns path to gunzip'd file
Decompresses `.gz` compressed archvies via python gzip module"""
decompressed_file = os.path.basename(strip_compression_extension(archive_file, "gz"))
working_dir = os.getcwd()
destination_abspath = os.path.join(working_dir, decompressed_file)
if is_gzip_supported():
f_in = gzip.open(archive_file, "rb")
with open(destination_abspath, "wb") as f_out:
shutil.copyfileobj(f_in, f_out)
f_in.close()
else:
_system_gunzip(archive_file)
f_in = gzip.open(archive_file, "rb")
with open(destination_abspath, "wb") as f_out:
shutil.copyfileobj(f_in, f_out)
f_in.close()
return destination_abspath
def _system_gunzip(archive_file):
decompressed_file = os.path.basename(strip_extension(archive_file, "gz"))
"""Returns path to gunzip'd file
Decompresses `.gz` compressed files via system gzip"""
decompressed_file = os.path.basename(strip_compression_extension(archive_file, "gz"))
working_dir = os.getcwd()
destination_abspath = os.path.join(working_dir, decompressed_file)
compressed_file = os.path.basename(archive_file)
copy_path = os.path.join(working_dir, compressed_file)
shutil.copy(archive_file, copy_path)
gzip = which("gzip")
gzip = which("gzip", required=True)
gzip.add_default_arg("-d")
gzip(copy_path)
return destination_abspath
def _unzip(archive_file):
"""
"""Returns path to extracted zip archive
Extract Zipfile, searching for unzip system executable
If unavailable, search for 'tar' executable on system and use instead
@@ -157,7 +181,7 @@ def _unzip(archive_file):
"""
extracted_file = os.path.basename(strip_extension(archive_file, "zip"))
if sys.platform == "win32":
-return _untar(archive_file)
+return _system_untar(archive_file)
else:
exe = "unzip"
arg = "-q"
@@ -167,66 +191,76 @@ def _unzip(archive_file):
return extracted_file
-def _unZ(archive_file):
+def _system_unZ(archive_file):
"""Returns path to decompressed file
Decompress UNIX compress style compression
Utilizes gunzip on unix and 7zip on Windows
"""
if sys.platform == "win32":
-result = _7zip(archive_file)
+result = _system_7zip(archive_file)
else:
result = _system_gunzip(archive_file)
return result
def _lzma_decomp(archive_file):
"""Decompress lzma compressed files. Prefer Python native
"""Returns path to decompressed xz file.
Decompress lzma compressed files. Prefer Python native
lzma module, but fall back on command line xz tooling
to find available Python support. This is the xz command
on Unix and 7z on Windows"""
to find available Python support."""
if is_lzma_supported():
decompressed_file = os.path.basename(strip_extension(archive_file, "xz"))
archive_out = os.path.join(os.getcwd(), decompressed_file)
with open(archive_out, "wb") as ar:
with lzma.open(archive_file) as lar:
shutil.copyfileobj(lar, ar)
return _py_lzma(archive_file)
else:
if sys.platform == "win32":
return _7zip(archive_file)
else:
return _xz(archive_file)
return _xz(archive_file)
-def _win_compressed_tarball_handler(archive_file):
-"""Decompress and extract compressed tarballs on Windows.
-This method uses 7zip in conjunction with the tar utility
-to perform decompression and extraction in a two step process
-first using 7zip to decompress, and tar to extract.
+def _win_compressed_tarball_handler(decompressor):
+"""Returns function pointer to two stage decompression
+and extraction method
+Decompress and extract compressed tarballs on Windows.
+This method uses a decompression method in conjunction with
+the tar utility to perform decompression and extraction in
+a two step process first using decompressor to decompress,
+and tar to extract.
-The motivation for this method is the inability of 7zip
-to directly decompress and extract compressed archives
-in a single shot without undocumented workarounds, and
-the Windows tar utility's lack of access to the xz tool (unsupported on Windows)
+The motivation for this method is Windows tar utility's lack
+of access to the xz tool (unsupported natively on Windows) but
+can be installed manually or via spack
"""
-# perform intermediate extraction step
-# record name of new archive so we can extract
-# and later clean up
-decomped_tarball = _7zip(archive_file)
-# 7zip is able to one shot extract compressed archives
-# that have been named .txz. If that is the case, there will
-# be no intermediate archive to extract.
-if check_extension(decomped_tarball, "tar"):
-# run tar on newly decomped archive
-outfile = _untar(decomped_tarball)
-# clean intermediate archive to mimic end result
-# produced by one shot decomp/extraction
-os.remove(decomped_tarball)
-return outfile
-return decomped_tarball
+def unarchive(archive_file):
+# perform intermediate extraction step
+# record name of new archive so we can extract
+# and later clean up
+decomped_tarball = decompressor(archive_file)
+if check_extension(decomped_tarball, "tar"):
+# run tar on newly decomped archive
+outfile = _system_untar(decomped_tarball)
+# clean intermediate archive to mimic end result
+# produced by one shot decomp/extraction
+os.remove(decomped_tarball)
+return outfile
+return decomped_tarball
+return unarchive
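
Since `_win_compressed_tarball_handler` now returns a closure rather than doing the work itself, a caller binds the decompressor first and then applies the resulting one-argument function. A usage sketch (path illustrative):

# Two-stage handling of a .tar.xz on Windows: _py_lzma strips the xz layer,
# then the returned closure untars the intermediate tarball and removes it.
unarchive = _win_compressed_tarball_handler(_py_lzma)
extracted = unarchive(r"C:\tmp\foo.tar.xz")  # illustrative path
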
def _py_lzma(archive_file):
"""Returns path to decompressed .xz files
Decompress lzma compressed .xz files via python lzma module"""
decompressed_file = os.path.basename(strip_extension(archive_file, "xz"))
archive_out = os.path.join(os.getcwd(), decompressed_file)
with open(archive_out, "wb") as ar:
with lzma.open(archive_file) as lar:
shutil.copyfileobj(lar, ar)
return archive_out
def _xz(archive_file):
"""Decompress lzma compressed .xz files via xz command line
tool. Available only on Unix
"""Returns path to decompressed xz files
Decompress lzma compressed .xz files via xz command line
tool.
"""
if sys.platform == "win32":
raise RuntimeError("XZ tool unavailable on Windows")
decompressed_file = os.path.basename(strip_extension(archive_file, "xz"))
working_dir = os.getcwd()
destination_abspath = os.path.join(working_dir, decompressed_file)
@@ -239,21 +273,20 @@ def _xz(archive_file):
return destination_abspath
-def _7zip(archive_file):
-"""Unpack/decompress with 7z executable
+def _system_7zip(archive_file):
+"""Returns path to decompressed file
+Unpack/decompress with 7z executable
7z is able to handle a number file extensions however
it may not be available on system.
Without 7z, Windows users with certain versions of Python may
be unable to extract .xz files, and all Windows users will be unable
to extract .Z files. If we cannot find 7z either externally or a
Spack installed copy, we fail, but inform the user that 7z can
be installed via `spack install 7zip`
Args:
archive_file (str): absolute path of file to be unarchived
"""
-outfile = os.path.basename(strip_last_extension(archive_file))
+outfile = os.path.basename(strip_compression_extension(archive_file))
_7z = which("7z")
if not _7z:
raise CommandNotFoundError(
@@ -267,12 +300,10 @@ def _7zip(archive_file):
def decompressor_for(path, extension=None):
"""Returns a function pointer to appropriate decompression
algorithm based on extension type.
Args:
path (str): path of the archive file requiring decompression
"""
"""Returns appropriate decompression/extraction algorithm function pointer
for provided extension. If extension is none, it is computed
from the `path` and the decompression function is derived
from that information."""
if not extension:
extension = extension_from_file(path, decompress=True)
@@ -282,14 +313,28 @@ def decompressor_for(path, extension=None):
unrecognized file extension: '%s'"
% extension
)
if sys.platform == "win32":
return decompressor_for_win(extension)
else:
return decompressor_for_nix(extension)
if re.match(r"\.?zip$", extension) or path.endswith(".zip"):
def decompressor_for_nix(extension):
"""Returns a function pointer to appropriate decompression
algorithm based on extension type and unix specific considerations
i.e. a reasonable expectation system utils like gzip, bzip2, and xz are
available
Args:
path (str): path of the archive file requiring decompression
"""
if re.match(r"zip$", extension):
return _unzip
if re.match(r"gz", extension):
if re.match(r"gz$", extension):
return _gunzip
if re.match(r"bz2", extension):
if re.match(r"bz2$", extension):
return _bunzip2
# Python does not have native support
@@ -297,21 +342,80 @@ def decompressor_for(path, extension=None):
# we rely on external tools such as tar,
# 7z, or uncompressZ
if re.match(r"Z$", extension):
-return _unZ
+return _system_unZ
# Python and platform may not have support for lzma
# compression. If no lzma support, use tools available on systems
# 7zip on Windows and the xz tool on Unix systems.
-if re.match(r"xz", extension):
+if re.match(r"xz$", extension):
return _lzma_decomp
-# Catch tar.xz/tar.Z files here for Windows
-# as the tar utility on Windows cannot handle such
-# compression types directly
-if ("xz" in extension or "Z" in extension) and sys.platform == "win32":
-return _win_compressed_tarball_handler
-return _untar
+return _system_untar
def _determine_py_decomp_archive_strategy(extension):
"""Returns appropriate python based decompression strategy
based on extension type"""
# Only rely on Python decompression support for gz
if re.match(r"gz$", extension):
return _py_gunzip
# Only rely on Python decompression support for bzip2
if re.match(r"bz2$", extension):
return _py_bunzip
# Only rely on Python decompression support for xz
if re.match(r"xz$", extension):
return _py_lzma
return None
def decompressor_for_win(extension):
"""Returns a function pointer to appropriate decompression
algorithm based on extension type and Windows specific considerations
Windows natively vendors *only* tar, no other archive/compression utilities,
so we must rely exclusively on Python module support for all compression
operations, tar for tarballs and zip files, and 7zip for Z compressed archives
and files, as Python does not provide support for the UNIX compress algorithm
Args:
extension (str): extension of the archive file requiring decompression
"""
extension = expand_contracted_extension(extension)
# Windows native tar can handle .zip extensions, use standard
# unzip method
if re.match(r"zip$", extension):
return _unzip
# if extension is standard tarball, invoke Windows native tar
if re.match(r"tar$", extension):
return _system_untar
# Python does not have native support
# of any kind for .Z files. In these cases,
# we rely on 7zip, which must be installed outside
# of spack and added to the PATH or externally detected
if re.match(r"Z$", extension):
return _system_unZ
# Windows vendors no native decompression tools, attempt to derive
# python based decompression strategy
# Expand extension from contracted extension i.e. tar.gz from .tgz
# no-op on non contracted extensions
compression_extension = compression_ext_from_compressed_archive(extension)
decompressor = _determine_py_decomp_archive_strategy(compression_extension)
if not decompressor:
raise SpackError(
"Spack was unable to determine a proper decompression strategy for "
f"valid extension: {extension}. "
"This is a bug, please file an issue at https://github.com/spack/spack/issues"
)
if "tar" not in extension:
return decompressor
return _win_compressed_tarball_handler(decompressor)
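
End to end, callers only touch `decompressor_for`, which hides the platform split introduced above. A usage sketch (path illustrative):

import spack.util.compression as compression

# Selects _unzip/_gunzip/... on Unix, or a Python-module strategy on Windows.
decompress = compression.decompressor_for("foo.tar.gz")  # illustrative path
decompress("foo.tar.gz")  # extracts into the current working directory
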
class FileTypeInterface:
@@ -589,7 +693,7 @@ def extension_from_file(file, decompress=False):
def extension_from_path(path):
"""Get the allowed archive extension for a path.
"""Returns the allowed archive extension for a path.
If path does not include a valid archive extension
(see`spack.util.compression.ALLOWED_ARCHIVE_TYPES`) return None
"""
@@ -602,19 +706,23 @@ def extension_from_path(path):
return None
-def strip_last_extension(path):
-"""Strips last supported archive extension from path"""
-if path:
-for ext in ALLOWED_SINGLE_EXT_ARCHIVE_TYPES:
-mod_path = check_and_remove_ext(path, ext)
-if mod_path != path:
-return mod_path
+def strip_compression_extension(path, ext=None):
+"""Returns path with last supported or provided archive extension stripped"""
+path = expand_contracted_extension_in_path(path)
+exts_to_check = EXTS
+if ext:
+exts_to_check = [ext]
+for ext_check in exts_to_check:
+mod_path = check_and_remove_ext(path, ext_check)
+if mod_path != path:
+return mod_path
+return path
def strip_extension(path, ext=None):
"""Get the part of a path that does not include its compressed
type extension."""
"""Returns the part of a path that does not include extension.
If ext is given, only attempts to remove that extension. If no
extension given, attempts to strip any valid extension from path"""
if ext:
return check_and_remove_ext(path, ext)
for t in ALLOWED_ARCHIVE_TYPES:
@@ -625,7 +733,8 @@ def strip_extension(path, ext=None):
def check_extension(path, ext):
"""Check if extension is present in path"""
"""Returns true if extension is present in path
false otherwise"""
# Strip sourceforge suffix.
prefix, _ = spath.find_sourceforge_suffix(path)
if not ext.startswith(r"\."):
@@ -636,7 +745,7 @@ def check_extension(path, ext):
def reg_remove_ext(path, ext):
"""Regex remove ext from path"""
"""Returns path with ext remove via regex"""
if path and ext:
suffix = r"\.%s$" % ext
return re.sub(suffix, "", path)
@@ -644,8 +753,41 @@ def reg_remove_ext(path, ext):
def check_and_remove_ext(path, ext):
"""If given extension is present in path, remove and return,
otherwise just return path"""
"""Returns path with extension removed if extension
is present in path. Otherwise just returns path"""
if check_extension(path, ext):
return reg_remove_ext(path, ext)
return path
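
Composing check_extension, reg_remove_ext, and check_and_remove_ext, one would expect behavior along these lines (illustrative values, not fixtures from this diff):

assert check_extension("foo.tar.gz", "gz")
assert check_and_remove_ext("foo.tar.gz", "gz") == "foo.tar"
assert check_and_remove_ext("foo.tar", "gz") == "foo.tar"  # no-op when ext is absent
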
def _substitute_extension(path, old_ext, new_ext):
"""Returns path with old_ext replaced with new_ext.
old_ext and new_ext can be extension strings or regexes"""
return re.sub(rf"{old_ext}", rf"{new_ext}", path)
def expand_contracted_extension_in_path(path, ext=None):
"""Returns path with any contraction extension (i.e. tgz) expanded
(i.e. tar.gz). If ext is specified, only attempt to expand that extension"""
if not ext:
ext = extension_from_path(path)
expanded_ext = expand_contracted_extension(ext)
if expanded_ext != ext:
return _substitute_extension(path, ext, expanded_ext)
return path
def expand_contracted_extension(extension):
"""Return expanded version of contracted extension
i.e. .tgz -> .tar.gz, no op on non contracted extensions"""
extension = extension.strip(".")
contraction_map = {"tgz": "tar.gz", "txz": "tar.xz", "tbz": "tar.bz2", "tbz2": "tar.bz2"}
return contraction_map.get(extension, extension)
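
Per the contraction map above, for example:

assert expand_contracted_extension("tgz") == "tar.gz"
assert expand_contracted_extension(".txz") == "tar.xz"  # leading dot is stripped
assert expand_contracted_extension("tar.gz") == "tar.gz"  # no-op when not contracted
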
def compression_ext_from_compressed_archive(extension):
"""Returns compression extension for a compressed archive"""
extension = expand_contracted_extension(extension)
for ext in [*EXTS]:
if ext in extension:
return ext
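
Assuming EXTS holds the bare compression extensions (gz, bz2, xz, Z), this would give, for example:

assert compression_ext_from_compressed_archive("tgz") == "gz"  # expanded to tar.gz first
assert compression_ext_from_compressed_archive("tar.bz2") == "bz2"
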
