Compare commits


1516 Commits

Author SHA1 Message Date
Gregory Becker
0e39c8b1b7 bugfix for relative dev path 2021-12-07 15:06:32 -08:00
Harmen Stoppels
3d1b9e4dbc "spack buildcache install": don't catch exception (#27674)
Remove a try/catch for an error with no handling. If the affected
code doesn't execute successfully, then the associated variable
is undefined and another (more-obscure) error occurs shortly after.
2021-12-07 13:17:17 -08:00
Massimiliano Culpo
f81d84dfc6 Release procedure: add a step to update docs (#27734) 2021-12-07 11:30:14 +00:00
Asher Mancinelli
7254d0fe94 HiOp: add versions, variants for rocm (#27824) 2021-12-07 11:38:10 +01:00
Glenn Johnson
3160ab66db 3dtk: set boost constraint (#27806) 2021-12-07 11:28:41 +01:00
Axel Huebl
941e6505d0 WarpX: 21.12 (#27800)
Update `warpx` & `py-warpx` to the latest release, `21.12`.
2021-12-06 17:16:46 -08:00
iarspider
687bc7d40e py-rich: add version 10.9.0 (#27826) 2021-12-06 15:32:01 -07:00
Manuela Kuhn
949923c03f r-blob: add 1.2.2 (#27797) 2021-12-06 22:00:59 +00:00
Manuela Kuhn
1d8575cd83 r-colorspace: add 2.0-2 (#27815) 2021-12-06 15:43:26 -06:00
Manuela Kuhn
3949cd567d r-jquerylib: add new package (#27716) 2021-12-06 15:42:20 -06:00
Manuela Kuhn
4ece86c64a r-brio: add 1.1.3 (#27801) 2021-12-06 15:25:43 -06:00
Manuela Kuhn
1a2d522eaa r-conquer: add 1.2.1 (#27802) 2021-12-06 15:24:56 -06:00
Manuela Kuhn
c18e91d771 r-crayon: 1.4.2 (#27803) 2021-12-06 15:23:32 -06:00
Manuela Kuhn
451a4f2cfa r-brms: add 2.16.3 (#27809) 2021-12-06 15:22:34 -06:00
Manuela Kuhn
88175417bb r-cli: add 3.1.0 (#27810) 2021-12-06 15:20:26 -06:00
Manuela Kuhn
4de7c6a901 r-car: add 3.0-12 (#27811) 2021-12-06 15:14:21 -06:00
Manuela Kuhn
0d5eb2657a r-colourpicker: add 1.1.1 (#27816) 2021-12-06 14:09:58 -06:00
Manuela Kuhn
9b3ac00c2d r-backports: add 1.4.0 (#27710) 2021-12-06 14:08:31 -06:00
Manuela Kuhn
b4f47cbdec r-bh: add 1.75.0-0 (#27711) 2021-12-06 14:07:01 -06:00
Manuela Kuhn
ccda43673a r-rcpparmadillo: add 0.10.7.3.0 (#27712) 2021-12-06 14:04:55 -06:00
Jen Herting
3b8a34240e [py-pyspellchecker] New package (#27788)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-12-06 10:57:50 -06:00
Jen Herting
d0fffa9212 apktool: new package (#27782) 2021-12-06 17:20:31 +01:00
Hector Barrios
0106e6ec9c mumps: fix problem when using Intel OneAPI mkl static lib (#27725) 2021-12-06 17:14:33 +01:00
Glenn Johnson
37a4c0ff59 r-prettydoc: fix dependency (#27791)
The markdown dependency is `r-rmarkdown` rather than `r-markdown`.
2021-12-06 17:13:44 +01:00
Glenn Johnson
30007f7897 r-affy: set r version constraint (#27792) 2021-12-06 17:00:03 +01:00
Glenn Johnson
caba5c4692 Remove the autotools resources (#27793)
This essentially reverts #18917 as it seems to no longer be necessary
due to recent autotools changes in core spack.
2021-12-06 16:59:25 +01:00
Wouter Deconinck
bc3a3d9249 opencascade: add v7.6.0 (#27799) 2021-12-06 16:44:19 +01:00
Glenn Johnson
599b8c3533 suite-sparse: Add CXSparse and SPQR to targets (#27804) 2021-12-06 16:40:07 +01:00
Michael Kuhn
e6432d2380 glib: add 2.70.2 (#27812) 2021-12-06 16:34:26 +01:00
Glenn Johnson
39ebf9c5a7 Set preferred version of vtk to 9.0.3 (#27817)
There is a library symbol issue with libioss and python with version
9.1.0, so set the preferred version to 9.0.3.
2021-12-06 16:28:35 +01:00
Wouter Deconinck
a332c26780 acts: use variants['cuda_arch'] only when +cuda (#27813)
Since #27185, the cuda_arch variant values are conditional on +cuda. This means that for -cuda specs, the installation fails with:
```
==> acts: Executing phase: 'cmake'
==> Error: KeyError: 'cuda_arch'

/home/wdconinc/git/spack/var/spack/repos/builtin/packages/acts/package.py:222, in cmake_args:
        219        log_failure_threshold = spec.variants['log_failure_threshold'].value
        220        args.append("-DACTS_LOG_FAILURE_THRESHOLD={0}".format(log_failure_threshold))
        221
  >>    222        cuda_arch = spec.variants['cuda_arch'].value
        223        if cuda_arch != 'none':
        224            args.append('-DCUDA_FLAGS=-arch=sm_{0}'.format(cuda_arch[0]))
        225
```
2021-12-06 16:25:27 +01:00
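A minimal sketch of the guard this fix implies for a package's `cmake_args` (illustrative only, not the exact acts patch):
```
def cmake_args(self):
    spec = self.spec
    args = []
    # 'cuda_arch' only exists on +cuda specs now that the variant is
    # conditional (#27185), so guard the lookup instead of always reading it.
    if spec.satisfies('+cuda'):
        cuda_arch = spec.variants['cuda_arch'].value
        if cuda_arch != 'none':
            args.append('-DCUDA_FLAGS=-arch=sm_{0}'.format(cuda_arch[0]))
    return args
```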
Glenn Johnson
a8a226b748 bohrium: missing dependencies (#27807) 2021-12-06 16:23:25 +01:00
Wouter Deconinck
9b1b38d2de librsvg,libcroco: add doc variant to avoid unsolvable constraints due to gtkplus (#27790)
* [libcroco] add variant doc, depends_on gtk-doc when +doc
* [librsvg] add variant doc, depends_on gtk-doc when +doc
2021-12-06 14:08:22 +01:00
Wouter Deconinck
2bb075c850 rivet: hepmc=3: Fix prefix of --with-hepmc3 (#27814) 2021-12-06 10:42:54 +01:00
Glenn Johnson
235edd0742 new package: py-tensorflow-datasets (#27746)
* new package: py-tensorflow-datasets

- includes new dependency package: py-tensorflow-metadata

* Update var/spack/repos/builtin/packages/py-tensorflow-datasets/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-tensorflow-metadata/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-12-05 12:06:43 -06:00
Glenn Johnson
252cd48b9c py-astropy: set version constraint on cfitsio (#27805) 2021-12-05 12:06:09 -06:00
Ben Bergen
c2def5c2f4 cgdb: add depends_on('gdb', type='run') (#27753)
* Added gdb Dependency

When using spack to install cgdb, a spack-built gdb is necessary to
avoid dynamic link errors.

- Added maintainer: tuxfan
- Set preferred to 'master' (best version for spack currently)

* Update: The gdb dependency added by this PR is for runtime

Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-12-05 08:02:23 -07:00
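A rough sketch of the recipe changes described above, in the Spack package DSL (the build system and git URL are assumptions):
```
from spack import *


class Cgdb(AutotoolsPackage):
    """Curses interface to gdb (sketch of the relevant directives only)."""

    git = 'https://github.com/cgdb/cgdb'  # URL assumed for illustration
    maintainers = ['tuxfan']

    # 'master' marked preferred, as described in the commit message
    version('master', branch='master', preferred=True)

    # run-type dependency: a spack-built gdb avoids dynamic link errors
    depends_on('gdb', type='run')
```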
Manuela Kuhn
25a9888102 r-v8: add 3.6.0 (#27714)
* r-v8: add 3.6.0

* Add conflict and deprecate version 3.4.0
2021-12-05 00:56:47 -06:00
Manuela Kuhn
1eec99ce1d py-imagesize: add 1.3.0 (#27787) 2021-12-03 21:05:05 -07:00
Manuela Kuhn
c0dab7617f py-jeepney: add 0.7.1 (#27786) 2021-12-03 20:16:56 -07:00
Chung ShinYee
041973d1d5 julia: update versions and add znver3 to Julia's CPU target (#27770)
* julia: new versions.

* julia: Support AMD Milan CPU znver3.
2021-12-03 18:31:47 -07:00
kwryankrattiger
203ccdd976 LLVM: Revert build_llvm_dylib to llvm_dylib (#27761) 2021-12-03 23:47:05 +01:00
Tamara Dahlgren
6f9dc1c926 berkeley-db: Add minimal external detection (#27752) 2021-12-03 23:42:45 +01:00
Manuela Kuhn
324e046f06 r-tictoc: add 1.0.1 (#27715) 2021-12-03 16:14:29 -06:00
Asher Mancinelli
21d1e03e37 ExaGO: add new versions (#27768) 2021-12-03 13:59:16 -08:00
Mark W. Krentel
34c1d196a4 hpctoolkit: add support for smoke tests (#27783)
This adds support in spack for both build/install tests (spack install
--run-tests) and post-install smoke tests (spack test run).

Hpctoolkit itself only recently added tests, so for now, this only
applies to branch master.
2021-12-03 12:52:34 -08:00
Jen Herting
bd2b5cc7aa [libzmq] added version 4.3.4 (#27780) 2021-12-03 13:44:04 -07:00
Simo Tuomisto
cfb8d6f8ac New package: mujoco (#27656) 2021-12-03 11:40:09 -08:00
Weiqun Zhang
0a656de60b amrex: add new release 21.12 (#27781) 2021-12-03 11:00:14 -08:00
Hans Pabst
23bd0f3178 LIBXSMM 1.17 (#27775) 2021-12-03 10:48:40 -08:00
Hans Pabst
0e7220b2f2 Fixed typo (NAMD). (#27774) 2021-12-03 10:40:34 -08:00
Bernhard Kaindl
53eb24af9b singularity, singularityce: Add 3.8.5 and ce-3.9.1 (#27778) 2021-12-03 11:23:08 -07:00
Richarda Butler
86b17193de Parallel-netcdf: Update Spack test dir (#27232)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-12-03 10:03:14 -08:00
Filippo Spiga
96a50db344 CUDA: add v11.5.1 (#27689) 2021-12-03 06:13:51 -07:00
Ethan Stam
4c922ec572 ParaView: add dependencies needed in paraview@master (#27745)
* The proj and nlohmann-json packages are needed in paraview@master
* ParaView: bump master subdir version to 5.10
2021-12-02 17:18:23 -08:00
Terry Cojean
b80b085575 Ginkgo package: add oneAPI support (#26868)
To use this, you can "spack install intel-oneapi-compilers" and then
"spack compiler add" the new compiler. You would need to install with
"spack install ginkgo+oneapi%dpcpp"
2021-12-02 17:13:25 -08:00
Michael Kuhn
fbd67feead nss: add 3.73 (#27767) 2021-12-03 00:55:57 +01:00
Peter Scheibel
edb971a10e Support packages which need to explicitly refer to dpcpp by name (#27168)
* Hack to support packages which need to explicitly refer to dpcpp by name
* cc script needs to know about dpcpp
2021-12-02 15:49:20 -08:00
Michael Kuhn
693a15ea00 nspr: add 4.32 (#27766) 2021-12-03 00:36:50 +01:00
Bernhard Kaindl
4d1a6cd362 libseccomp package: add version 2.5.3 (#27756)
- Use .tar.gz archive
- Update 2.3.3 to use .tar.gz archive (and update checksum)
- autoreconf dependency is no longer required
- The new version depends on gperf
2021-12-02 14:52:23 -08:00
Bernhard Kaindl
9e1e7fa1d7 lvm2 package: add version 2.03.14 (#27760) 2021-12-02 14:37:59 -08:00
Ben Darwin
07ab4623e6 New package: f3d (add version 1.1.1) (#27755) 2021-12-02 14:35:55 -08:00
Jen Herting
66c5199362 New package: py-fire (#27731)
* dependency for stylegan2

* fixed copyright

* [py-fire]

- fixed some formatting
- url -> pypi

* [py-fire] removed extra https://

* [py-fire]

- removed python version restriction
- enum34 -> py-enum34

* [py-fire] removed .999

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-12-02 14:58:25 -06:00
Jen Herting
ca32b08825 New package: py-vector-quantize-pytorch (#27759)
* [py-vector-quantize-pytorch] New package

* [py-vector-quantize-pytorch] flake8

* [py-vector-quantize-pytorch] removed unnecessary dependency

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-12-02 14:53:49 -06:00
Ye Luo
1cdb764422 QMCPACK: fix use of MPI wrappers (#27744) 2021-12-01 18:39:06 -08:00
Tiziano Müller
d0beab8012 elpa package: add version 2021.05.002_bugfix (#27748)
* elpa: filter _bugfix in version for include dir
* libxc: add version 5.1.7
2021-12-01 18:29:23 -08:00
Bernhard Kaindl
04519d261d New package: goshimmer (#27339) 2021-12-01 17:05:30 -08:00
Axel Huebl
e590ec6f3f Add: openPMD-viewer (#27516)
* Add: openPMD-viewer

Add the `py-openpmd-viewer` package.

* openPMD-viewer: Improve Variant Control

* openPMD-viewer: Update to Source from Pip/PyPA

https://spack.readthedocs.io/en/latest/build_systems/pythonpackage.html#pypi-vs-github

* Fix style: comment

* openPMD-viewer: fix requirements

add reviewer feedback
2021-12-01 16:19:54 -07:00
Tiziano Müller
08b00f8804 cp2k: fix build issues without cuda, and with elpa on openSUSE (#27738) 2021-12-01 17:00:01 -06:00
Vanessasaurus
326acea29d py-tern: new package (#27599)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-12-01 14:41:05 -07:00
Jen Herting
97126cac4b New package: py-yq (#27729)
* pypi build of yq

* [py-yq]

- removed duplicate py-yaml dependency
- fixed py-yaml version number

* [py-yq] fix py-xmltodict version range

* [py-yq] self assign maintainer

* [py-yq] setuptools is runtime dependency

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-12-01 13:25:24 -08:00
Jen Herting
2ddc522864 New package: py-retry (#27730)
* package dependency

* [py-retry]

- added homepage
- removed unnecessary depends_on

* [py-retry] added dependencies back

* [py-retry] removed .999

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-12-01 14:14:55 -07:00
Joseph Wang
d7b21fc40b madgraph5amc package: use Spack compiler wrapper (#27523)
This fix allows madgraph5amc to build with Clang.
2021-12-01 12:18:07 -08:00
kwryankrattiger
041f91b176 ECP-DAV package: Add sensei to SDK (#27537)
- Add sensei to the SDK with appropriate propagations
- Rework variants of the SENSEI package to avoid providing broken builds
- Turn off miniapps by default; these are examples and not critical to using sensei
2021-12-01 12:10:22 -08:00
Thomas Green
ab85e79a91 py-pybind11: add CMake dependency (#27573) 2021-12-01 11:57:27 -08:00
Manuela Kuhn
faa04a03c0 seacas: fix x11 dependency (#27737) 2021-12-01 10:11:14 -08:00
Mark W. Krentel
820d49b84e libmonitor: add version 2021.11.08 (#27740) 2021-12-01 10:09:49 -08:00
Marc Joos
81bdf1a386 wi4mpi: add new package (#27530) 2021-12-01 09:29:08 -07:00
Jen Herting
5a9bc4bfb2 New package: py-gin-config (#27732)
* [py-gin-config] created template

* [py-gin-config]

- depends on setup tools
- added homepage
- added description
- removed fixmes
2021-12-01 09:52:59 -06:00
Severin Strobl
67c8a63a0a adol-c: add variant stdczero (#27721) 2021-12-01 16:24:15 +01:00
Luis Yanes
4859649229 portcullis: add v1.2.3 (#27661)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-12-01 05:34:52 -07:00
Vicente Bolea
6e90a853f4 ParaView: Add 5.10.0-RC2 release (#27696) 2021-12-01 11:53:54 +01:00
Howard Pritchard
aacbc64a23 OPENMPI: add 4.0.7 release (#27697)
add the 4.0.7 release to the OMPI spack package

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2021-12-01 11:53:16 +01:00
ravil-mobile
658f42ef0f dpcpp: added a new package (#25130)
dpcpp stands for Data Parallel C++.
dpcpp is Intel's implementation of SYCL.

Co-authored-by: ravil <ravil.dorozhinskii@tum.de>
2021-12-01 11:43:38 +01:00
Ye Luo
787cc535e9 qmcpack: more restriction on openblas variants (#27723) 2021-12-01 11:30:58 +01:00
Vasileios Karakasis
452a693aab Add more ReFrame versions (#27728) 2021-11-30 15:56:03 -07:00
Ye Luo
23206fa4b5 quantum-espresso: add CMake and libxc variants (#27724) 2021-11-30 23:26:39 +01:00
Glenn Johnson
bcaa87574d vtk package: add version 9.1.0; add new dependencies (#27467)
This PR updates the vtk package to use new variable names and adds some
dependencies.

- add version 9.1.0
- add version 1.4.2 to gl2ps package. This is needed to use gl2ps as a
  dependency.
- new package: utf8cpp, used as a dependency for version 9.
- add dependencies when possible
    - pugixml
    - libogg
    - libtheora
    - utf8cpp
    - gl2ps
    - proj
- turn off configuring MPI if ~mpi
- always use the package-provided FindHDF5.cmake for versions up to 8.
  Version @9: already does this so does not need a patch.
- use new CMake variables for version 9
- remove unused CMake variables depending on version
2021-11-30 15:14:05 -07:00
Piotr Luszczek
d57bcc0b85 cpio: reuse patch for %gcc10: for %clang as well (#27589)
and replace patch download with simple filter_file() to fix the duplicate definition.

Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-11-30 23:09:00 +01:00
Sebastian Schmitt
c96a6f990e Bump Brian2 (#27700)
* Bump Brian2

* Remove doc variant; the documentation is not build anyways
2021-11-30 11:32:04 -07:00
Marco De La Pierre
88cf4ce865 updated recipe for quantum-espresso 6.8 (#27607) 2021-11-30 10:05:53 -08:00
Manuela Kuhn
5e193618cb gtk-doc: fix source url and add 1.33.2 (#27719) 2021-11-30 09:29:06 -07:00
Davide Mancusi
0e83c0e792 gtkplus: fix generation of configure_args (#27718) 2021-11-30 17:26:39 +01:00
Massimiliano Culpo
645a7dc14c spack audit: fix API calls (#27713)
This broke in #24858
2021-11-30 14:59:55 +01:00
Adam J. Stewart
5044df88ab Add citation information to GitHub (#27518) 2021-11-30 01:37:50 -07:00
Manuela Kuhn
ab2bc933b8 py-idna: add 3.3 (#27708) 2021-11-29 21:28:57 -07:00
Valentin Volkl
b13778c2c6 py-iminuit: add new versions 2.8.4 (#27680)
* py-iminuit: add new versions 2.8.4

* py-iminuit: update python dependency

* py-iminuit: fix typo

* flake8
2021-11-29 20:19:52 -07:00
Manuela Kuhn
54df6d0a56 py-setuptools: add 59.4.0 (#27692) 2021-11-29 19:52:50 -07:00
Manuela Kuhn
44746dfbd6 r-tidyverse: add 1.3.1 (#27291) 2021-11-29 18:24:44 -06:00
Harmen Stoppels
a86d574208 dsfmt package: fix shared libs (#27628)
Also:

* Remove 2.2.3 (doesn't build shared libs)
* Add versions 2.2.4 and 2.2.5
2021-11-29 15:57:32 -08:00
Satish Balay
cb71308d22 petsc: archive configure.log make.log from the build (#27677)
Also enable verbose build
2021-11-29 16:10:50 -07:00
Ben Darwin
02b4a65086 py-intensity-normalization: add package (at version 2.1.1) (#27593)
* py-scikit-fuzzy: new package (at version 0.4.2)

* py-intensity-normalization: new package (at version 2.1.1)
2021-11-29 15:33:16 -06:00
Manuela Kuhn
838294b10d py-gevent: add 21.8.0 (#27699) 2021-11-29 15:29:09 -06:00
Manuela Kuhn
22eebff261 py-pyparsing: add 3.0.6 (#27693)
* py-pyparsing: add 3.0.6

* Add suggestions
2021-11-29 14:02:08 -07:00
Manuela Kuhn
a98e8ad4b5 py-packaging: add 21.3 (#27691)
* py-packaging: add 21.3

* Update var/spack/repos/builtin/packages/py-packaging/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-29 13:56:09 -07:00
Bernhard Kaindl
2d34082b0c llvm: Fix building llvm@4:9 using %clang@6: and %gcc@10: (#27233)
Add z3 variant, fix @:9%gcc@9: with glibc2.31, fix no_cyclades range
2021-11-29 12:48:18 -08:00
Manuela Kuhn
1f1da6806c py-tifffile: add 2021.11.2 (#27695) 2021-11-29 12:52:55 -07:00
Manuela Kuhn
647a26bc9e py-httpretty: add 1.1.4 (#27687) 2021-11-29 11:42:58 -06:00
Manuela Kuhn
561b0ca029 py-zope-interface: add 5.4.0 (#27596) 2021-11-29 11:42:00 -06:00
Manuela Kuhn
3c5295ee8a py-gast: add 0.5.3 (#27686) 2021-11-29 11:40:59 -06:00
Manuela Kuhn
914cf5ff87 py-importlib-metadata: add 4.8.2 (#27690) 2021-11-29 11:40:17 -06:00
Thomas Madlener
ecb588740a Speed up install of environments with dev packages (#27167)
* only check file modification times once per dev package
2021-11-29 09:34:23 -08:00
Manuela Kuhn
4858a26400 py-humanize: add 3.12.0 (#27688) 2021-11-29 10:58:09 -06:00
Eric Brugger
995a29d2f3 VisIt: remove upper range restriction on llvm. (#27648) 2021-11-29 17:38:31 +01:00
Adam J. Stewart
fcd5f960f8 py-matplotlib: add v3.5.0 (#27486)
* py-matplotlib: add v3.5.0

* Remove deprecated versions, update setup.cfg filename

* Add additional dependencies, save config file

* Add patch to fix matplotlibrc
2021-11-29 17:19:01 +01:00
Howard Pritchard
54d88f1a1c OpenMPI: add version 4.1.2 (#27645)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2021-11-29 16:45:20 +01:00
Adam J. Stewart
fee8ac9721 py-scipy: add v1.7.3 (#27649) 2021-11-29 16:44:17 +01:00
Manuela Kuhn
1326e9d538 py-greenlet: add v1.1.2 (#27594) 2021-11-29 16:43:38 +01:00
Bernhard Kaindl
bd0960cda8 gpgme: Add 0.16.0 and address test suite issues (#27604)
Part of the gpgme test suite runs by default even during the normal
make and make install phases, creating a public keyring in ~/.gnupg.
Prevent this, avoid build failures in containers caused by another
problem of the test suite, and fix a test case of the new 0.16.0 release.
2021-11-29 16:43:12 +01:00
Ethan Stam
cfd2ea0cff ParaView: Build edition strings should be uppercase (#27534) 2021-11-29 16:38:25 +01:00
Adam J. Stewart
e5e9dc2d92 py-pytorch-lightning: add v1.5.3 (#27650) 2021-11-29 16:31:58 +01:00
Derek Ryan Strong
f8d03b1060 pixz: add v1.0.7 (#27651) 2021-11-29 16:29:10 +01:00
Harmen Stoppels
037ece674b distro: don't use deprecated linux_distribution (#27658) 2021-11-29 16:26:19 +01:00
Nicolas Cornu
87cf38a427 Random123: add v1.14.0 from github (#27660) 2021-11-29 16:13:50 +01:00
Massimiliano Culpo
0d10408a25 bootstrap: restrict patchelf to v0.13.x (#27685) 2021-11-29 15:41:25 +01:00
Bernhard Kaindl
48d98b4c9e podman: new package (with dependencies) (#27603)
When the uidmap tools are installed on a system, this allows running
containers as an unprivileged user (rootless and daemonless), similar
to singularity, but using a familiar CLI: "alias docker=podman"

This is helpful for running e.g. spack builds in containers to reproduce
build failures from CI without requiring an installation of docker.
The required dependencies of podman are added as well.
2021-11-29 15:39:19 +01:00
Valentin Volkl
64c4fbd220 edm4hep: update hepmc dependency (#27678) 2021-11-29 14:55:07 +01:00
Bernhard Kaindl
af0ba4e3ef gtkplus: fix build of GTK2 versions with gcc-11, skip X tests (#26970)
Fix to not attempt to patch a nonexistent file for old versions
when building with gcc-11. Skip the build-time tests, as they all access
the X DISPLAY and open many windows on the screen.
2021-11-29 14:46:51 +01:00
Harmen Stoppels
5dce4d79bd patchelf: add v0.13.1, v0.14, v0.14.1 (#27681) 2021-11-29 13:46:15 +01:00
Paul Ferrell
c0edb17b93 Handle byte sequences which are not encoded as UTF8 while logging. (#21447)
Fix builds which produce lines with non-UTF-8 output while logging.
The alternative is to read in binary mode, and then decode while
ignoring errors.
2021-11-29 13:27:02 +01:00
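A generic sketch of the decode-while-ignoring-errors approach mentioned above (plain Python, not Spack's actual logging code):
```
# Bytes read in binary mode may contain sequences that are not valid UTF-8;
# decoding with errors='replace' keeps logging from raising on them.
raw = b"checking for stray bytes \xff\xfe in build output\n"
line = raw.decode('utf-8', errors='replace')  # invalid bytes become U+FFFD
print(line, end='')
```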
Vanessasaurus
bdde70c9d3 "branch" spelling mistake in libbeato (#27682)
I was browsing package metadata, as one does on a Sunday, and stumbled across a new kind of version attribute - brancch! I suspect this is supposed to be "branch."
2021-11-29 13:17:40 +01:00
iarspider
8ff81d4362 Fix py-astroid recipe (#27670) 2021-11-28 11:07:53 -06:00
Mark Grondona
1b71ceb384 add flux-core v0.31.0 and flux-sched v0.20.0 (#27336)
Update flux-core and flux-sched package.py to include latest releases.

For flux-sched:

 - Add patch to disable false-positive-happy valgrind test
 - pin yaml-cpp to 0.6.3 due to issue described at:
   https://github.com/flux-framework/flux-sched/issues/886
2021-11-28 02:18:00 +01:00
Joseph Wang
0ab5828d0d gdk-pixbuf: Fix 2.42.2 and add 2.42.6 (#27254)
Starting with meson 0.60, unknown args produce errors and
the -Dx11 arg is only present in @:2.40
https://gitlab.gnome.org/GNOME/gtk-osx/-/issues/44

Add tiff variant: disabled by default since it partially fails the tests.
Only libtiff@:3.9 passes, but these old versions have severe security issues.

Deprecate @:2.41 as they are affected by the high-severity CVE-2021-20240:
https://nvd.nist.gov/vuln/detail/CVE-2021-20240
2021-11-28 01:30:14 +01:00
iarspider
d5b243de14 py-scipy: update pybind11 versions (#27679) 2021-11-27 16:03:43 -06:00
iarspider
1d8d975721 Apply requested fix from #27643 (#27672) 2021-11-26 16:45:38 -06:00
iarspider
96ce0e393d New version: py-send2trash 1.8.0 (#27583) 2021-11-26 14:35:03 -07:00
Jannis Schönleber
cf3762369c dyninst: use spack libiberty path (#25779) 2021-11-26 15:58:53 -05:00
iarspider
8138ebaa32 New version: py-tomlkit 0.7.2 (#27633) 2021-11-26 13:11:20 -07:00
iarspider
eb42514124 New version: py-threadpoolctl 3.0.0 (#27632) 2021-11-26 13:10:58 -07:00
Harmen Stoppels
4d6891162d Use bash in setup_git.sh (#27676) 2021-11-26 18:03:05 +00:00
iarspider
978f091822 Fix py-aiohttp recipe (#27671) 2021-11-26 08:17:01 -07:00
Massimiliano Culpo
a96f2f603b Bootstrap patchelf like GnuPG (#27532)
Remove a custom bootstrapping procedure to
use spack.bootstrap instead

Modifications:
* Reference count the bootstrap context manager
* Avoid SpackCommand to make the bootstrapping
  procedure more transparent
* Put back requirement on patchelf being in PATH for unit tests
* Add an e2e test to check bootstrapping patchelf
2021-11-26 15:32:13 +01:00
iarspider
2cdc758860 Fix version constraint in py-ipykernel (#27665)
* Fix version constrains in py-ipykernel and py-ipython

Before the fix:
```
$ spack spec  py-ipykernel@6.4.1  ^py-jupyter-client@7.0.6
==> Error: py-ipykernel@6.4.1 ^py-jupyter-client@7.0.6 is unsatisfiable, conflicts are:
  no version satisfies the given constraints
```

After the fix:
```
```
(thank god the old concretizer is still there - it provides sane error messages!)

* Fix py-ipython recipe

* Revert "Fix py-ipython recipe"

This reverts commit d65071665f.
2021-11-25 17:58:43 -07:00
Seth R. Johnson
9904159356 New package: py-sphinx-argparse (#27663) 2021-11-25 13:54:34 -06:00
Maxim Belkin
6e095a9741 module_file_support: update format for configuration (#27598) 2021-11-25 08:41:32 +01:00
iarspider
90b4f55001 New version: py-theano 1.0.5 (#27631) 2021-11-24 18:16:48 -07:00
iarspider
60e683f5d3 New version: py-requests 2.26.0 (redo of #27579) (#27647) 2021-11-24 17:38:01 -07:00
iarspider
a9a6e00d14 New version: py-rsa 4.7.2 (#27581) 2021-11-24 16:52:44 -07:00
iarspider
6fa0feb7de New version: py-rich 10.14.0 (redo of #27580) (#27646) 2021-11-24 16:04:53 -07:00
iarspider
0db93a5dea py-virtualenv package: add version 20.10.0; new dependency py-distlib (#27637)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-24 15:38:03 -07:00
Massimiliano Culpo
270ba10962 spack flake8: remove deprecated command (#27290)
The "spack flake8" command wwas deprecated in favor
of "spack style". The deprecation wwarning is in the
0.17.X series, so removing it for v0.18.x
2021-11-24 14:20:11 -08:00
iarspider
f8025d4f88 New version: py-requests-oauthlib 1.3.0 (#27578) 2021-11-24 16:20:02 -06:00
liuyangzhuan
83f97e8eba Adding packages for gptune and its dependencies (#26936)
* added package gptune with all its dependencies: adding py-autotune, pygmo, py-pyaml, py-autotune, py-gpy, py-lhsmdu, py-hpbandster, pagmo2, py-opentuner; modifying superlu-dist, py-scikit-optimize

* adding gptune package

* minor fix for macos spack test

* update patch for py-scikit-optimize; update test files for gptune

* fixing gptune package style error

* fixing unit tests

* a few changes reviewed in the PR

* improved gptune package.py with a few newly added/improved dependencies

* fixed a few style errors

* minor fix on package name py-pyro4

* fixing more style errors

* Update var/spack/repos/builtin/packages/py-scikit-optimize/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* resolved a few issues in the PR

* fixing file permissions

* a few minor changes

* style correction

* minor correction to jq package file

* Update var/spack/repos/builtin/packages/py-pyro4/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fixing a few issues in the PR

* adding py-selectors34 required by py-pyro4

* improved the superlu-dist package

* improved the superlu-dist package

* moree changes to gptune and py-selectors34 based on the PR

* Update var/spack/repos/builtin/packages/py-selectors34/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-24 16:06:43 -06:00
Brian Spilner
d6b0c838dd cdo package: add version 2.0.1 (#27608) 2021-11-24 13:20:28 -08:00
Paul
94bcd21ccd Go package: add versions 1.17.3 and 1.16.10 (#27614) 2021-11-24 13:10:23 -08:00
Satish Balay
d8ac578ce1 petsc: add variant kokkos (with +cuda) (#27624) 2021-11-24 13:09:14 -08:00
Tim Haines
f9bae91dea Dyninst package: Add versions 12.0.0 and 12.0.1 (#27623) 2021-11-24 12:23:30 -08:00
iarspider
6147b47719 py-terminado package: add version 0.12.1 (#27629) 2021-11-24 12:03:04 -08:00
iarspider
191de76ccb py-testpath package: add version 0.5.0 (#27630) 2021-11-24 12:02:18 -08:00
iarspider
b500d40a25 py-uproot package: add version 4.1.8 (#27635) 2021-11-24 12:01:25 -08:00
iarspider
43c663c5cf py-virtualenv-clone package: add version 0.5.7 (#27636) 2021-11-24 11:58:41 -08:00
iarspider
3fdab96728 py-virtualenvwrapper package: add version 4.8.4 (#27638) 2021-11-24 11:45:36 -08:00
iarspider
fcd51d81d9 py-wcwidth package: add version 0.2.5 (#27639) 2021-11-24 11:44:51 -08:00
iarspider
83a2014ede py-websocket-client package: add version 1.2.1 (#27640) 2021-11-24 11:43:42 -08:00
iarspider
1c38810e11 py-wheel package: add version 0.37.0 (#27641) 2021-11-24 11:42:02 -08:00
Maciej Wójcik
10ca5f1cd2 GROMACS-SWAXS package: add versions including 2021.4-0.1 (#27642) 2021-11-24 11:41:16 -08:00
iarspider
e7eaebebd4 py-yarl package: add version 1.7.2 (#27643) 2021-11-24 11:40:06 -08:00
Robert Cohn
76ad803f25 intel-oneapi-mkl: add cluster libs and option for static linking (#26256) 2021-11-24 11:04:05 -08:00
Harmen Stoppels
dbf67d912c Make patchelf test use the realpath (#27618)
I think this test should be removed, but when it stays, it should at
least follow the symlink, because it fails for me if I let spack build
patchelf and have a symlink in a view.
2021-11-24 11:17:23 +01:00
Massimiliano Culpo
5c3dfacdc5 Update distro to v1.6.0 (#27263) 2021-11-24 10:10:11 +00:00
albestro
635984f077 umpire: fix gcc@10.3.0 conflict (#27620)
Make sure that `gcc@10.3.0-[number]` also matches the conflict.
2021-11-24 02:49:51 -07:00
Massimiliano Culpo
70d5d234db Update Jinja2 to v2.11.3 and MarkupSafe to v1.1.1 (#27264) 2021-11-24 10:21:35 +01:00
Massimiliano Culpo
12da0a9a69 Update six to v1.16.0 (#27265) 2021-11-24 10:20:04 +01:00
Harmen Stoppels
97ebcb99b4 libblastrampoline: use tarballs and add maintainer (#27619)
* Use tarballs for libblastrampoline

* Add myself as maintainer
2021-11-24 09:45:57 +01:00
Vanessasaurus
a5bd6acd90 Adding oras go package (#27605)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-11-24 07:06:29 +01:00
Sreenivasa Murthy Kolam
d6d78b8457 rocm-tensile: Add new GPUs to tensile-architecture list (#26913) 2021-11-23 18:35:16 -08:00
Thomas Dickerson
c263b64d2e Racket package: disable parallel build; add variants (#27506)
- Prevent `-j` flags to `make`, which has been known to cause problems
  with Racket builds.
- Add variants for various common build flags, including support
  for both versions of the Racket VM environment.

In addition:

- Prefer the minimal release to improve install times. Bells and
  whistles carry their own runtime dependencies and should be
  installed via `raco`. An enterprising user may even create a
  `RacoPackage` class to make spack aware of `raco` installed packages.
- Match the official version numbering scheme.
2021-11-23 16:45:18 -08:00
Harmen Stoppels
dfc95cdf1c llvm: use ninja by default (#27521)
* llvm: use ninja by default

* Use ninja for omp when it's not a runtime
2021-11-23 16:28:17 -08:00
John Wohlbier
0f1c04ed7e Add encryption library packages: helib, palisade-development, and seal (#27495)
Add packages for three common homomorphic encryption libraries:
HElib, Palisade, and Seal. Also add a package for the number theoretic library NTL.
2021-11-23 15:11:37 -08:00
Harmen Stoppels
cced832cac Fix leaky tests (#27616)
* fix: cc.py should use a function not session scope
* fix: don't let build env vars leak to other tests
* fix: don't leak build env in dev_build test
2021-11-23 14:10:48 -08:00
Manuela Kuhn
2d20e557df py-flask: add 2.0.2 (#27615) 2021-11-23 11:11:12 -07:00
Vanessasaurus
14607b352c adding new package go cosign (#27606)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-11-23 09:56:45 -08:00
Massimiliano Culpo
fa7189b480 Remove support for Python 2.6 (#27256)
Modifications:
- [x] Removed `centos:6` unit test, adjusted vermin checks
- [x] Removed backport of `collections.OrderedDict`
- [x] Removed backport of `functools.total_ordering`
- [x] Removed Python 2.6 specific skip markers in unit tests
- [x] Fixed a few minor Python 2.6 related TODOs in code

Updating the vendored dependencies will be done in separate PRs
2021-11-23 09:06:17 -08:00
Manuela Kuhn
812663de62 py-cython: add 3.0.0a9 (#27609)
* py-cython: add 3.0.0a9

* Update var/spack/repos/builtin/packages/py-cython/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-23 16:33:29 +00:00
Larry Knox
f96e3371a0 Update to latest commit hash for vol-log-based. (#27612)
Update minimum hdf5 version to 1.13.0.
2021-11-23 10:09:19 -05:00
Larry Knox
8476a2e4a1 Add hdf5-vol-external-passthrough package. (#27289)
* Add hdf5-vol-external-passthrough package.

* Add version v1.0 for vol-external-passthrough release.

* Remove incorrect comment.

* Add source url and checksum.  Update HDF5 minimum version.
2021-11-23 10:06:48 -05:00
Nathan Hanford
2104f1273a bugfix: Allow legacy tests to be read after hash break (#26078)
* added a test case

Co-authored-by: Nathan Hanford <hanford1@llnl.gov>
2021-11-22 21:49:41 -07:00
iarspider
feb66cba01 New version: py-stevedore 3.5.0 (#27586) 2021-11-22 18:31:31 -06:00
Manuela Kuhn
b961491ad2 py-zope-event: add 4.5.0 (#27595) 2021-11-22 16:05:13 -07:00
iarspider
411b3ecc37 New version: py-singledispatch 3.7.0 (#27584)
* New version: py-singledispatch 3.7.0

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-22 15:44:07 -07:00
Robert Blake
3e63bc08ee ncurses: fix external package detection on OS X (#27490) 2021-11-22 14:37:58 -08:00
Piotr Luszczek
e90d5ad6cf Intel packages: add support for LLVM OpenMP (#26517) 2021-11-22 13:41:43 -08:00
Harmen Stoppels
abec10fcd5 suite-sparse package: add support for blas symbol suffix (#27510) 2021-11-22 13:18:39 -08:00
Massimiliano Culpo
6f8fd5b7a7 raja: fix recipe for old versions (#27591)
Raja didn't depend on camp prior to v0.12.0
2021-11-22 22:07:54 +01:00
Massimiliano Culpo
5eba5dc271 Make CUDA and ROCm architecture conditional (#27185)
* Make CUDA and ROCm architecture conditional

fixes #14337

The variant to specify which architecture to use
for CUDA and ROCm are now conditional on +cuda and
+rocm respectively.

* cp2k: make all CUDA related variants conditional on +cuda
2021-11-22 07:54:19 -05:00
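A minimal sketch of what a conditional architecture variant looks like in the package DSL after this change (the value list is shortened and assumed for illustration):
```
# The variant only exists when the spec also has +cuda; on ~cuda specs
# asking for spec.variants['cuda_arch'] raises a KeyError.
variant('cuda', default=False, description='Build with CUDA')
variant('cuda_arch',
        description='CUDA architecture',
        values=('none', '70', '80'),   # shortened list, for illustration only
        default='none',
        multi=True,
        when='+cuda')
```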
Wouter Deconinck
5f10562ad1 [geant4] new version geant4@10.7.3 (#27568)
* [geant4] new version geant4@10.7.3

Release notes: https://geant4-data.web.cern.ch/ReleaseNotes/Patch4.10.7-3.txt, no build system changes.

* [geant4] depends_on geant4-data@10.7.3

* [geant4-data] new version @10.7.3
2021-11-22 12:37:11 +00:00
Hadrien G
596f2e7c4d acts: add v15, embrace conditional variants (#27529) 2021-11-22 10:49:54 +01:00
Mahendra Paipuri
1a95b979d8 Disable rsmi for hwloc when rocm is not enabled (#27547)
Co-authored-by: mahendrapaipuri <mahendra.paipuri@inria.fr>
2021-11-22 10:47:56 +01:00
Harmen Stoppels
0024e5cc9b Make _enable_or_disable(...) return an empty array for conditional variants whose condition is not met (#27504) 2021-11-22 10:47:09 +01:00
Adam J. Stewart
cda9d6d981 py-rapidfuzz: add new package (#27559) 2021-11-22 10:38:33 +01:00
Manuela Kuhn
d3b35c4063 py-itsdangerous: add 2.0.1 (#27556) 2021-11-22 10:38:10 +01:00
Harmen Stoppels
3bfd78e6b8 ccache: speed up builds, don't build tests (#27567) 2021-11-22 10:28:05 +01:00
psakievich
c167b7d532 nalu-wind: remove cxxstd from trilinos dependency to enable testing for cpp17 (#27563) 2021-11-22 10:18:30 +01:00
Wouter Deconinck
6d30d87d7e mesa: add v21.2.4 and v21.2.5 (#27571)
Last bugfix versions before the just-released version @21.3.0, which will be part of a separate PR.
2021-11-22 10:14:00 +01:00
Thomas Green
e7b97c9473 harminv: update for blas/lapack (#27572) 2021-11-22 09:51:46 +01:00
Sebastian Ehlert
8d1e66a627 fpm: add v0.5.0 (#27575) 2021-11-22 09:48:10 +01:00
Adam J. Stewart
7c14c86ffd py-laspy: add new package (#27576) 2021-11-22 09:45:30 +01:00
Tracy-Pantleo
49b0952371 portage, tangram, wonton: update dependencies (#26886)
* tangram/wonton spack

* portage spack update

* wonton spack updated

* tangram spack updated

* white space removed

* Update package.py

* Update package.py

* Update package.py

* Update package.py

portage spack upstream

* Update package.py

for loop test

* Update package.py

style errors

* Update package.py

* tangram

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update package.py

edits

* Update package.py

* Update package.py

remove @master

* Update package.py

remove @master

* Update var/spack/repos/builtin/packages/portage/package.py

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* Update var/spack/repos/builtin/packages/wonton/package.py

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* Update package.py

* Update package.py

* Update package.py

* Update package.py

Co-authored-by: Tracy Nicole Pantleo <tpantleo@varan.lanl.gov>
Co-authored-by: Tracy Nicole Pantleo - 361537 <tpantleo@darwin-fe1.lanl.gov>
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2021-11-19 22:09:56 -05:00
iarspider
e47fa05eb8 New version: py-pyzmq 22.3.0 (#27543)
* New version: py-pyzmq 22.3.0

* Update var/spack/repos/builtin/packages/py-pyzmq/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-19 13:53:02 -07:00
Adam J. Stewart
2c5fc14bf9 py-progressbar2: add v3.55.0 (#27558) 2021-11-19 14:35:44 -06:00
Adam J. Stewart
fd0f0e564d py-ruamel-yaml-clib: link to python (#27557) 2021-11-19 14:35:31 -06:00
Manuela Kuhn
aa934d00e3 py-jinja2: add 3.0.3 (#27560) 2021-11-19 14:35:18 -06:00
Raghu Raja
f26d25e24b libfabric: Add v1.14.0 release (#27562)
Signed-off-by: Raghu Raja <raghu@enfabrica.net>
2021-11-19 13:35:00 -07:00
Jen Herting
99f954f1c1 [py-pickle5] New package (#27533)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-11-19 14:31:22 -06:00
Joseph Snyder
d759612523 Add connection specification to mirror creation (#24244)
* Add connection specification to mirror creation

This allows each mirror to contain information about the credentials
used to access it.

Update command and tests based on comments

Switch to only "long form" flags for the s3 connection information.
Use the "any" function instead of checking for an empty list when looking
for s3 connection information.

Split test to use the access token separately from the access id and key.
Use long flag form in test.

Add endpoint_url to available S3 options.

Extend the special parameters for an S3 mirror to accept the
endpoint_url parameter.

Add a test.

* Add connection information per URL not per mirror

Expand the mirror-based connection information to be per-URL.
This will allow a user to specify different S3 connection information
for both the fetch and the push URLs.

Add a parameter for "profile", another way of storing the id/secret pair.

* Switch from "access_profile" to "profile"
2021-11-19 15:28:34 -05:00
Manuela Kuhn
a5c50220db py-werkzeug: add 2.0.2 (#27555) 2021-11-19 12:17:06 -07:00
Manuela Kuhn
cdc5b32240 py-cycler: add 0.11.0 and get sources from pypi (#27550)
* py-cycler: add 0.11.0

* Update var/spack/repos/builtin/packages/py-cycler/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-19 11:20:14 -07:00
Manuela Kuhn
efe635a3f8 py-debugpy: add 1.5.1 (#27551) 2021-11-19 11:05:12 -07:00
iarspider
0288ef8733 New version: py-pyaml 6.0 (#27542) 2021-11-19 11:35:50 -06:00
iarspider
e08ebff139 New version: py-qtconsole 5.2.0 (#27554)
* New version: py-qtconsole 5.2.0

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-19 10:10:59 -07:00
Jen Herting
af1b61569f New package: py-lws (#27536)
* [py-lws] added package

* added cython support

* updates for upstream

* updated py-lws with cython only option

* changes to py-lws for flake8

* [py-lws]

- update copyright
- url -> pypi

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-11-19 11:03:52 -06:00
Manuela Kuhn
7784ff55d6 py-filelock: add 3.4.0 (#27553) 2021-11-19 09:35:07 -07:00
iarspider
7be632b558 New version: openldap 2.6.0; (#27520)
* New version: openldap 2.6.0;
fix recipe for groff (requires pkg-config to find uchardet);
fix recipe for openldap (requires groff to build documentation)

* Restrict openldap versions of py-python-ldap and percona-server

* Update var/spack/repos/builtin/packages/groff/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-19 10:31:40 -06:00
iarspider
047062eedf New version: py-python-daemon 2.3.0 (#27519)
* New version: py-python-daemon 2.3.0

* Update package.py
2021-11-19 10:30:47 -06:00
iarspider
f472716e96 New version: py-pythran 0.10.0 (#27528)
* New version: py-pythran 0.10.0

* Update package.py
2021-11-19 10:21:21 -06:00
Manuela Kuhn
347da5f7b3 py-babel: add 2.9.1 (#27544) 2021-11-19 10:20:58 -06:00
Manuela Kuhn
4a18957093 py-coverage: add 6.1.2 (#27545) 2021-11-19 10:20:07 -06:00
Harmen Stoppels
3fcc6c669f Spack requires Python 2.7: from 0.18.0: (#27505)
* Spack requires Python 2.7: from 0.18.0:

* Better bounds
2021-11-19 10:18:21 -06:00
Manuela Kuhn
33b380937a py-cryptography: add 3.4.8 and 35.0.0 (#27548) 2021-11-19 10:16:21 -06:00
Manuela Kuhn
eb515b87d7 py-click: add 8.0.3 (#27549) 2021-11-19 10:13:24 -06:00
Harmen Stoppels
c5aee4d9b4 define_from_variant: return an empty string for non-existing variants (#27503)
This permits using conditional variants without a lot of boilerplate.
2021-11-19 14:10:00 +01:00
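A short sketch of how this helps in a `cmake_args` implementation (hypothetical variant and CMake option names):
```
def cmake_args(self):
    # If 'cuda_arch' is a conditional variant and the spec is ~cuda, the
    # variant doesn't exist; define_from_variant now returns an empty string
    # instead of raising, so no manual guard is needed.
    return [
        self.define_from_variant('ENABLE_CUDA', 'cuda'),
        self.define_from_variant('CMAKE_CUDA_ARCHITECTURES', 'cuda_arch'),
    ]
```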
iarspider
3db918b1cd New version: py-pytools 2021.2.9 (#27531) 2021-11-19 02:23:03 -07:00
Michael Kuhn
eefff81289 otf: specify zlib prefix explicitly (#27491)
This causes `otfconfig` to also list zlib's library directory.
Otherwise, `-lz` cannot be found. Also remove `--with-zlibsymbols`,
which doesn't seem to be supported anymore.
2021-11-19 09:37:27 +01:00
Michael Davis
3375db12a5 Adding --reuse to dev-build command. (#27487) 2021-11-19 09:25:45 +01:00
Diego Alvarez
4d059ead8c nextflow: add v21.10.1 (#27538) 2021-11-19 09:21:14 +01:00
Eric Brugger
4cf71b2c7f VisIt: update for building with 3.2.1. (#27036)
Include several patches to vtk 8.1 for building on a
system with no system-installed X11 libraries or include files.

Specify specific versions of dependent packages that are known to work with 3.2.1.

Tested on spock.olcf.ornl.gov. The GUI came up and rendered images,
and an image was successfully saved using off-screen rendering with
data from curv2d.silo.
2021-11-19 09:05:13 +01:00
Massimiliano Culpo
57d3b02800 Remove spurious debug print (#27541) 2021-11-19 09:02:22 +01:00
haralmha
4d1f09c752 thepeg: fix version condition bug (#27501) 2021-11-19 08:56:00 +01:00
iarspider
c183111a61 py-rapidjson: add version 1.5 (#27527) 2021-11-18 17:52:06 -08:00
Manuela Kuhn
0b6ce17dc4 py-ply: get sources from pypi (#27526)
* py-ply: use pypi

* Add py-setuptools dependency
2021-11-18 16:52:51 -07:00
Massimiliano Culpo
e3cd91af53 Refactor bootstrap of "spack style" dependencies (#27485)
Remove the "get_executable" function from the
spack.bootstrap module. Now "flake8", "isort",
"mypy" and "black" will use the same
bootstrapping method as GnuPG.
2021-11-18 15:23:09 +01:00
iarspider
d0c4aeb0c0 Fix cyrus-sasl recipe: requires groff to build documentation (#27522) 2021-11-18 15:10:37 +01:00
Massimiliano Culpo
f981682bdc Allow recent pytest versions to be used with Spack (#25371)
Currently Spack vendors `pytest` at a version which is three major
versions behind the latest (3.2.5 vs. 6.2.4). We do that since v3.2.5
is the latest version supporting Python 2.6. Remaining so far
behind the currently supported versions, though, might introduce
incompatibilities and is a technical debt.

This PR modifies Spack to:
- Use the vendored `pytest@3.2.5` only as a fallback solution, 
  if the Python interpreter used for Spack doesn't provide a newer one
- Be able to parse `pytest --collect-only` in all the different output 
  formats from v3.2.5 to v6.2.4 and use it consistently for `spack unit-test --list-*`
- Updating the unit tests in Github Actions to use a more recent `pytest` version
2021-11-18 15:08:59 +01:00
Harmen Stoppels
8f7640dbef ci: run style unit tests only if we target develop (#27472)
Some tests assume the base branch is develop, but this branch may not
have been checked out.
2021-11-18 13:00:39 +01:00
Alec Scott
7ee736a158 Add Clingo v5.5.1 (#27512) 2021-11-18 06:27:01 +01:00
Glenn Johnson
b260efd9c4 new package: py-scikit-learn-extra (#27439)
* new package: py-scikit-learn-extra

* Remove extra blank line

* Add missing py-cython dependency
2021-11-17 23:13:19 -06:00
Manuela Kuhn
b4fcb4d43c py-envisage: add 6.0.1 (#27517) 2021-11-17 23:12:28 -06:00
haralmha
301a506aa2 py-pygraphviz: Add version 1.7 (#26712)
* Add version 1.7

* Fixup! Add version 1.7
2021-11-17 23:08:56 -06:00
Harmen Stoppels
40a6ac62d3 llvm: introduce [build/link]_llvm_dylib (#27450)
Apart from building a single dylib for LLVM, users should also be able
to link tools against it.
2021-11-17 20:10:59 -07:00
Alec Scott
d24d13559b Add Htslib v1.14, Samtools v1.14, and Bcftools v1.14 (#27515) 2021-11-17 14:30:18 -08:00
Axel Huebl
367c3d8a83 openPMD-validator: 1.1.0.2 (#27514)
Add the latest versions.
2021-11-17 14:26:23 -08:00
Alec Scott
684d9cef23 Add LMOD v8.5.27 (#27513) 2021-11-17 14:21:57 -08:00
Ben Bergen
c459adbf6a Updated Repository Information (#27496) 2021-11-17 10:02:41 -08:00
Jose E. Roman
d9a39d6128 New patch release SLEPc 3.16.1 (#27509) 2021-11-17 09:57:53 -08:00
Manuela Kuhn
26c5001ec4 py-ipykernel: add 5.5.6 (#27498) 2021-11-17 11:06:08 -06:00
Harmen Stoppels
c05746eac2 llvm: use LIBCXX_ENABLE_STATIC_ABI_LIBRARY=ON (#27448)
The previous workaround of using CMAKE_INSTALL_RPATH=ON was used to
avoid CMake trying to write an RPATH into the linker script libcxx.so,
which is nonsensical. See commit f86ed1e.

However, CMAKE_INSTALL_RPATH=ON seems to disable the build RPATH, which
breaks LLVM during the build when it has to locate its build-time shared
libraries (e.g. libLLVM.so). That required yet another workaround, where
some shared libraries were installed "by hand", so that they were picked
up from the install libdir. See commit 8a81229.

This was a dirty workaround, and also makes it impossible to use ninja,
since we explicitly invoked make.

This commit removes the two old workarounds, and sets
LIBCXX_ENABLE_STATIC_ABI_LIBRARY=ON, so that libc++abi.a is linked into
libc++.so, which makes it enough to link with -lc++ or invoke clang++
with -stdlib=libc++, so that our install succeeds and linking LLVM's c++
standard lib is still easy.
2021-11-17 09:14:05 -07:00
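A sketch of the kind of CMake option being set here (illustrative; the actual llvm recipe passes many more flags):
```
def cmake_args(self):
    # Link libc++abi.a into libc++.so so that '-stdlib=libc++' (or plain
    # '-lc++') is enough, without the RPATH workarounds described above.
    return [self.define('LIBCXX_ENABLE_STATIC_ABI_LIBRARY', True)]
```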
Harmen Stoppels
d900abe7e1 openblas: add a symbol suffix variant (#27500)
Some packages use a 64_ or _64 symbol suffix for the ilp64 (= 64-bit
integers) interface for BLAS. In particular, if we want to support shim
libraries like libopenblastrampoline supporting both the 32- and 64-bit
integer versions of blas, it must be possible to distinguish between the
two.
2021-11-17 08:52:56 -07:00
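A hedged sketch of how such a variant might be wired into the OpenBLAS make invocation (the variant name comes from the commit title; the helper name is hypothetical, and SYMBOLSUFFIX is OpenBLAS's own make variable):
```
variant('symbol_suffix', default='none',
        description='Append a suffix to all exported BLAS/LAPACK symbols')


def make_defs(self):
    # Hypothetical helper: translate the variant into OpenBLAS make variables.
    defs = []
    suffix = self.spec.variants['symbol_suffix'].value
    if suffix != 'none':
        defs.append('SYMBOLSUFFIX={0}'.format(suffix))
    return defs
```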
Bernhard Kaindl
2765861705 qt+webkit: Build needs Py2, but mesa/Meson needs Py3 (#27466)
mesa inherits MesonPackage (since October 2020) which depends on Py@3.
The conflicts('mesa') enables a regular build of `qt@5.7:5.15+webkit`
without having to specify the exact version by causing the concretizer
to select mesa18 which does not depend on python@3.

Co-authored-by: Bernhard Kaindl <bernhard.kaindl@ait.ac.at>
2021-11-17 07:42:49 -05:00
Harmen Stoppels
ac8993ac38 New version of Spack and new conflicts (#27384) 2021-11-17 11:25:33 +01:00
Harmen Stoppels
cc62689504 Fix overly generic exceptions in log parser (#27413)
This type of error is skipped:

make[1]: *** [Makefile:222: /tmp/user/spack-stage/.../spack-src/usr/lib/julia/libopenblas64_.so.so] Error 1

but it's useful to have it, especially when a package sets a variable
incorrectly in makefiles
2021-11-17 11:24:14 +01:00
Vasileios Karakasis
9260c38408 Add ReFrame 3.9.1 (#27493) 2021-11-17 00:44:59 +00:00
iarspider
b4ac385d70 New version: py-pytest-cov 3.0.0; add 'toml' variant to py-coverage (#27478)
* New version: py-pytest-cov 3.0.0; add 'toml' variant to py-coverage

* Update package.py
2021-11-16 17:16:59 -07:00
iarspider
85a98fc758 New version: py-pylint 2.8.2; new package py-platformdirs (#27473)
* New version: py-pylint 2.8.2; new package py-platformdirs

* Update var/spack/repos/builtin/packages/py-platformdirs/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pylint/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-16 17:10:52 -07:00
Harmen Stoppels
0221639e8a llvm: maintainers (#27482) 2021-11-16 15:41:02 -07:00
Brian Spilner
7ad9fc7fdb cdo: new release cdo-2.0.0 (#27457) 2021-11-16 22:08:43 +01:00
Robert Cohn
67cba372e8 Intel mpi: allow use of external libfabric (#27292)
Intel mpi comes with an installation of libfabric (which it needs as a
dependency). It can use other implementations of libfabric at runtime
though, so if you install a package that depends on `mpi` and
`libfabric`, you can specify `intel-mpi+external-libfabric` and ensure
that the Spack-built instance is used (both by `intel-mpi` and the
root).

Apply analogous change to intel-oneapi-mpi.
2021-11-16 12:55:24 -08:00
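A minimal sketch of the variant/dependency pattern described above (not necessarily the exact recipe text):
```
variant('external-libfabric', default=False,
        description='Use a Spack-built libfabric instead of the bundled one')

# Only pull in Spack's libfabric when the user asks for it.
depends_on('libfabric', when='+external-libfabric')
```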
iarspider
b194b957ce New versions: py-pyasn1 0.4.8, py-pyasn1-modules 0.2.8 (#27453)
* New versions: py-pyasn1 0.4.8, py-pyasn1-modules 0.2.8

* Update package.py
2021-11-16 13:52:56 -07:00
Thomas Kluyver
df5892ac89 Add py-h5py 3.6.0 (#27476) 2021-11-16 13:28:08 -06:00
iarspider
a432d971b1 New version: py-pyrsistent 0.18.0 (#27477)
* New version: py-pyrsistent 0.18.0

* Update package.py

* Update var/spack/repos/builtin/packages/py-pyrsistent/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-16 12:20:04 -07:00
Manuela Kuhn
85cf8af026 py-pyqt-builder: add new package (#27484) 2021-11-16 12:11:30 -07:00
iarspider
a867320021 New version: py-pytest 6.2.5 (#27481)
* New version: py-pytest 6.2.5

* Update var/spack/repos/builtin/packages/py-pytest/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-16 12:11:08 -07:00
Harmen Stoppels
a70d917da6 Bump curl (#27471) 2021-11-16 10:30:28 -08:00
iarspider
b9314fd887 New version: py-pu 1.11.0 (#27452) 2021-11-16 11:49:42 -06:00
iarspider
de916bd27b New version: py-pycuda 2021.1 (#27458) 2021-11-16 11:46:23 -06:00
iarspider
9f6dd08e1c New version: py-pycurl 7.44.1 (#27470) 2021-11-16 11:40:43 -06:00
iarspider
db307f9946 New version: py-pymongo 3.12.1 (#27474) 2021-11-16 11:39:28 -06:00
Harmen Stoppels
7c3b146789 llvm: flang implies mlir (#27449) 2021-11-16 10:08:07 -07:00
Harmen Stoppels
54fc26bf34 Add Python 3.9.9 (#27475) 2021-11-16 10:17:23 -06:00
Harmen Stoppels
a780f96a25 Add a symbol suffix option to LLVM (#27445) 2021-11-16 06:06:07 -08:00
Arjun Guha
bb54e9a4be racket: fix URL extrapolation (#27459) 2021-11-16 08:38:47 +01:00
Bernhard Kaindl
ada598668b mesa: Use the llvm-config of spec['llvm'] for '+llvm' (#27235)
Fix builds on hosts where /usr/bin/llvm-config-* is found and provides an
incompatible version: Ensure that the llvm-config of spec['llvm'] is used.
2021-11-16 08:07:48 +01:00
Robert Cohn
8c7fdadeba Add a maintainer for Intel packages (#27465) 2021-11-16 08:03:12 +01:00
Ben Bergen
5a8271d0c6 lanl-cmake-modules: new package (#27447)
Several projects have not yet added modern CMake support. This package
adds find_package support for some of these projects.
2021-11-16 07:31:24 +01:00
Hans Fangohr
35bf91012f oommf: new package (#26933)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-11-15 15:04:08 +01:00
Erik Schnetter
beddcd6a6d pgplot: Correct build error with shared libraries (#27154)
When building pgplot as a shared library, dependencies need to be listed explicitly.
2021-11-15 13:51:43 +01:00
Thomas Madlener
6ce4a5ce8c lcio: add v2.17 (#27238) 2021-11-15 13:49:43 +01:00
Edgar Leon
4a71704ae4 mpibind: add python bindings (#27127) 2021-11-15 13:48:14 +01:00
Kyle Gerheiser
ce9ae3c70d Add module variables for NCEPLIBS (#27162)
Use setup_run_environment to search for libraries and set env variables for module generation.

Libraries are installed with CMAKE_INSTALL_LIBDIR, which can be lib or lib64 depending on the machine, which makes it impossible to hardcode through modules.yaml.
2021-11-15 13:21:53 +01:00
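A hedged sketch of the `setup_run_environment` pattern described above, using Spack's `find_libraries` helper to cope with lib vs. lib64 (the library and variable names are made up for illustration):
```
from spack import *  # provides find_libraries and the package DSL


def setup_run_environment(self, env):
    # Search the install prefix instead of hardcoding lib vs. lib64, then
    # export the result so generated module files point at the real location.
    libs = find_libraries('libexample_nceplib', root=self.prefix,
                          shared=True, recursive=True)
    if libs:
        env.set('EXAMPLE_LIB', libs.directories[0])   # illustrative variable
        env.set('EXAMPLE_INC', self.prefix.include)
```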
Harmen Stoppels
55e853c160 mbedtls: fix shared libs needed section and add new version (#27285) 2021-11-15 13:10:51 +01:00
Sinan
90a92c4efc libspatialite: add v5.0.1 (#27181)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2021-11-15 11:48:08 +01:00
kwryankrattiger
fbfc27d707 ECP-DAV-SDK: remove catalyst (#27204)
Temporarily remove +catalyst; assume catalyst is on when +paraview
2021-11-15 11:46:14 +01:00
Michael Kuhn
aac505d4ff libfuse: fix build with new glibc (#27376)
glibc 2.34+ breaks libfuse@2 (see https://bugs.gentoo.org/803923). While
we are at it, backport a few patches from upstream.
2021-11-15 11:29:44 +01:00
laestrada
e5a0986433 gchp: add v13.2.1 (#27318)
Co-authored-by: laestrada <lestrada00@gmail.com>
2021-11-15 11:28:25 +01:00
Glenn Johnson
2ea1037369 llvm: remove cyclades code from llvm-10:11 (#27385) 2021-11-15 11:27:00 +01:00
Glenn Johnson
faeea04712 star-ccm-plus: add version and environment setup (#27388)
- add version 16.06.008_01
- set up run environment
2021-11-15 11:26:05 +01:00
Glenn Johnson
cdcde36f51 perl-forks: handle non-threaded perl (#27392)
If the perl that perl-forks is built against is non-threaded the build
system will drop into interactive mode to ask about simulating ithreads.
This causes the build to hang. Set FORKS_SIMULATE_USEITHREADS to avoid
going into interactive mode.
2021-11-15 11:24:36 +01:00
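A short sketch of the environment fix described above (the exact value is an assumption):
```
def setup_build_environment(self, env):
    # Pre-answer the interactive "simulate ithreads?" question so building
    # perl-forks against a non-threaded perl does not hang waiting for input.
    env.set('FORKS_SIMULATE_USEITHREADS', '1')  # value assumed for illustration
```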
Glenn Johnson
85a33bd2f1 perl-dbd-mysql: depend on mysql-client (#27393) 2021-11-15 11:24:09 +01:00
Glenn Johnson
19493e4932 ncl: add dependencies (#27394)
- zstd
- makedepend
2021-11-15 11:23:50 +01:00
Glenn Johnson
557556bbfd maker: add perl-dbd-mysql dependency (#27395) 2021-11-15 11:23:30 +01:00
Glenn Johnson
d01129e9c8 mariadb-c-client: add krb5 dependency (#27396) 2021-11-15 11:23:08 +01:00
Diego Alvarez
972ae2393a nextflow: add v21.10.0 (#27400) 2021-11-15 11:22:39 +01:00
Harmen Stoppels
21308eb5cc Turn some verbose messages into debug messages (#27408) 2021-11-15 11:21:37 +01:00
Valentin Volkl
4c0f5f85c0 edm4hep: add v0.4 (#27412) 2021-11-15 11:17:17 +01:00
Harmen Stoppels
9573eb5315 dsfmt: new package (#27415) 2021-11-15 11:15:42 +01:00
Glenn Johnson
d1df044b42 asciidoc: add autotools dependencies (#27418) 2021-11-15 11:13:00 +01:00
Glenn Johnson
9171d56146 czmq: use Spack's docbook (#27419) 2021-11-15 11:11:29 +01:00
Glenn Johnson
8dc8282651 dbus: use Spack's docbook (#27420) 2021-11-15 11:11:11 +01:00
Glenn Johnson
6e646e42a5 gpu-burn: set spack compiler (#27421) 2021-11-15 11:10:57 +01:00
Glenn Johnson
91b8a87fc7 interproscan: add missing dependency (#27423) 2021-11-15 11:10:27 +01:00
Glenn Johnson
5aaee953c5 libbeagle: add opencl variant (#27424) 2021-11-15 11:10:02 +01:00
Glenn Johnson
9d9fc522c8 libdrm: use Spack docbook (#27425) 2021-11-15 11:09:04 +01:00
Glenn Johnson
5e8368d0c9 libint: fix build (#27426)
- use project autogen.sh script
- set boost path
- make sure to build the compiler before the library
2021-11-15 11:08:48 +01:00
Glenn Johnson
c91b2b42a8 libzmq: use Spack docbook (#27427) 2021-11-15 11:07:30 +01:00
Desmond Orton
4cc42789db py-mxfold2: new package at v0.1.1 (#27383) 2021-11-15 11:07:02 +01:00
Glenn Johnson
f281b59394 hdf-eos5: use szip virtual (#27422) 2021-11-15 11:05:44 +01:00
Oliver Perks
4bd6a11c25 MEEP: update version and libctl (#27433) 2021-11-15 11:03:49 +01:00
Chris White
7b4c08db21 umpire: disable building docs (#27434) 2021-11-15 11:01:20 +01:00
Brian Van Essen
100b212538 Added support for new version of Cray libsci. (#27435) 2021-11-15 11:00:43 +01:00
Arjun Guha
9877c21c50 racket: add new package at v8.3.0 (#27446) 2021-11-15 10:56:26 +01:00
Adam J. Stewart
54cc7d0f30 py-pytorch-lightning: add v1.5.0 (#27266) 2021-11-15 10:49:35 +01:00
Adam J. Stewart
ee8f46e826 py-build: add new package (#27270) 2021-11-15 10:48:43 +01:00
Manuela Kuhn
d9a687af45 py-sip: add 5.5.0 and 6.4.0 (#27357) 2021-11-14 22:18:08 -06:00
iarspider
3706fef9af New version: py-pexpect 4.8.0 (#27401) 2021-11-13 17:31:56 -07:00
iarspider
76fed9c04d New version: py-plac; fix tests (#27405) 2021-11-13 17:28:27 -07:00
Glenn Johnson
7a00cef055 qt: turn off features if not supported by the kernel (#27438)
If building Qt on a system with a recent glibc but an older kernel, it
is possible that Qt configures features based on glibc that are not
supported in the kernel. This PR tests the kernel version and ensures
certain features are disabled if the kernel does not support them.
2021-11-13 15:39:50 -05:00
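A hedged sketch of how such a kernel-version check could look in a recipe (the feature flag shown is a placeholder, not necessarily one of the actual qt configure options used):
```
import platform


def configure_args_for_kernel(self):
    # Compare the running kernel against the minimum needed for a feature and
    # disable it explicitly when the kernel is too old, even if glibc is new.
    args = []
    kernel = tuple(int(p) for p in platform.release().split('.')[:2])
    if kernel < (3, 17):                       # getrandom() appeared in 3.17
        args.append('-no-feature-getentropy')  # placeholder flag name
    return args
```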
Manuela Kuhn
9fb9399487 py-pycortex: add new package (#27432) 2021-11-13 12:24:51 -06:00
Manuela Kuhn
94eb175e07 py-lxml: add 4.6.4 (#27440) 2021-11-13 12:19:46 -06:00
Manuela Kuhn
8e9ef0f6f6 py-neurokit2: add 0.1.5 (#27441) 2021-11-13 12:17:14 -06:00
Manuela Kuhn
02355d32bd py-deepdiff: add new package (#27442) 2021-11-13 12:14:46 -06:00
Manuela Kuhn
e3389e9017 py-pytest-runner: add 5.3.1 (#27443) 2021-11-13 12:12:59 -06:00
Manuela Kuhn
984715e859 py-imageio: add 2.10.3 (#27431) 2021-11-13 05:22:27 -07:00
Seth R. Johnson
31fec1f730 py-h5sh: replace github with pypi (#27409) 2021-11-12 19:01:25 -06:00
Satish Balay
446e4ff54b xsdk: add version @0.7.0 (#27250)
- add xsdk_depends_on() that propagates cuda, rocm options
- enable +cuda for hypre, mfem, amrex
- enable +rocm for mfem, sundials, magma, amrex, tasmanian, ginkgo, heffte
- deprecate @0.5.0
- remove 'debug' variant as it doesn't really exist
- heffte: remove binding to openmpi+cuda
- dtk: use 3.1-rc2 on macos (3.1-rc3 elsewhere)
- petsc: disable trilinos

Co-authored-by: Cody Balos <balos1@llnl.gov>
2021-11-12 17:20:35 -06:00
Cody Balos
203a5a0f71 sundials: fix for rocm support (#27295) 2021-11-12 16:13:32 -06:00
Manuela Kuhn
98fbfa2739 py-pybids: add 0.14.0 (#27430) 2021-11-12 15:04:51 -07:00
Cody Balos
c4b52098f3 superlu-dist: fix case for cce fortran compiler (#27296)
* superlu-dist: fix build with cce fortran compiler
2021-11-12 15:27:15 -06:00
iarspider
3f4d0943d7 New version: py-pkgconfig 1.5.5 (#27402)
* New version: py-pkgconfig 1.5.5

* Actually add version 1.5.5; fix dependency
2021-11-12 12:59:00 -07:00
iarspider
67f4e2674f New version: py-prettytable 2.4.0; update homepage (#27407)
* New version: py-prettytable 2.4.0; update homepage

* Update var/spack/repos/builtin/packages/py-prettytable/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-12 12:52:48 -07:00
iarspider
441a76b646 New version: py-prometheus-client 0.12.0 ... (#27410)
* New version: py-prometheus-client 0.12.0; new dependency (py-twisted) version 21.7.0 + its dependencies

* Apply suggestions from code review (1/?)

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Changes from review (2/?)

* Changes from review (3/?)

* Changes from review (4/?)

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-12 11:31:47 -07:00
Harmen Stoppels
4b16c6fb0c bump python 3.9.8 (#27381) 2021-11-12 11:17:16 -06:00
Glenn Johnson
8d06abb8ed py-numpy: add support for intel-oneapi-mkl (#27390) 2021-11-12 11:16:09 -06:00
Glenn Johnson
77203c940c py-grpcio: add re2 dependency and set paths (#27391) 2021-11-12 11:14:45 -06:00
iarspider
f9c4cb5f8a New version: py-pathlib2 2.3.6 (#27398) 2021-11-12 11:12:03 -06:00
iarspider
1d166a2151 New version: py-pbr 5.7.0 (#27399) 2021-11-12 11:11:04 -06:00
Jen Herting
79e520957b New package: py-pyscipopt (#27363)
* build of pyscipopt w/ scipoptsuite 7.0.1

* removed type from scipoptsuite

* removed url

* updated to 3.4.0

* [py-pyscipopt] removed redundant python dependency and fixed dependency types

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-11-12 10:07:59 -07:00
iarspider
9c386b181a New version: py-pkginfo 1.7.1 (#27403) 2021-11-12 11:04:36 -06:00
iarspider
c9e96018be New version: py-pluggy 1.0.0 (#27406) 2021-11-12 11:03:25 -06:00
Seth R. Johnson
a04cc4470e Add PyPI docs and warning in auto-generated package (#27404)
* docs: Add cross-references for pypi setup

* create: add warning for missing pypi
2021-11-12 10:58:44 -06:00
Andrew W Elble
407f68a67b new package: py-qiskit-aer and dependencies (#27198)
* new package: py-qiskit-aer and dependencies

+updates for dependencies

* flake8 fix

* forgot homepages

* incorporate feedback

* add py-cmake and py-ninja to py-tweedledum, prevent py-cmake from
completely rebuilding cmake

* py-ninja: avoid rebuilding ninja

* other feedback fixes

* additional fixes
2021-11-12 10:45:07 -06:00
Massimiliano Culpo
c5dd3265ed Update the Public key of the tutorial (#27370) 2021-11-12 11:46:02 +01:00
Harmen Stoppels
9637ed05f5 Fix overloaded argparse keys (#27379)
Commands should not reuse option names defined in main.
2021-11-11 23:34:18 -08:00
Desmond Orton
11f6aac1e3 New Package: py-setuptools-cpp (#27382) 2021-11-11 23:30:18 -08:00
Adam J. Stewart
dbad9fd58a py-torchgeo: add version 0.1.0 (#27274) 2021-11-11 23:25:20 -08:00
Harmen Stoppels
62b1c3411c suite-sparse: general fixes (#27283)
- disable graphblas by default (very slow to compile)
- fix patch upperbound for cuda 11
- remove find_system_libs; not sure why it was added in the first place,
but it makes spack rather unusable as it introduces an rpath to /lib/...
2021-11-11 23:21:53 -08:00
Ben Cowan
f5e107e046 gcc package: Mac OS patch not needed for @12: 2021-11-11 23:15:38 -08:00
Robert Cohn
c91074514a intel-oneapi-mkl: define MKLROOT for dependents (#27302)
Fixes #27260
2021-11-11 23:12:48 -08:00
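For background on the change above: a Spack package can export variables such as MKLROOT to its dependents via `setup_dependent_build_environment()`. The snippet below is a hedged sketch; the class name, version, and URL are placeholders, not the real intel-oneapi-mkl recipe:

```python
from spack import *  # Spack 0.17-era package DSL


class MklLikeSketch(Package):
    """Illustrative stand-in for a package that provides MKL."""

    homepage = "https://example.invalid"
    url = "https://example.invalid/mkl-like-1.0.tar.gz"
    version("1.0", sha256="0" * 64)

    def setup_dependent_build_environment(self, env, dependent_spec):
        # Many build systems locate MKL via MKLROOT, so point dependents
        # at this installation prefix.
        env.set("MKLROOT", self.prefix)
```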
Adam J. Stewart
c8fdce28e6 PyTorch: fix compilation of +distributed~tensorpipe (#27337) 2021-11-11 23:09:10 -08:00
Tiziano Müller
38592b9e4f boost: ensure Spack wrappers are used for %intel (#27348)
Fixes #26117
2021-11-11 23:02:08 -08:00
Manuela Kuhn
f777cc079d py-formulaic: add new package (#27373) 2021-11-11 22:22:51 -07:00
Adam J. Stewart
17025f5354 GDAL: add v3.4.0 (#27288) 2021-11-11 19:31:51 -07:00
Charles Doutriaux
7858a2f05c Adds Sina Python Package to Spack (#27219) 2021-11-11 17:20:23 -08:00
eugeneswalker
a35d3b895b e4s ci: enable tau +mpi +python variants (#27273) 2021-11-11 17:14:31 -08:00
Chris White
1d8c2498ee update exercise build option for 0.14.0 (#27338) 2021-11-11 17:06:08 -08:00
Mark W. Krentel
f548243e9e elfutils: add version 0.186 (#27345) 2021-11-11 17:04:53 -08:00
Timothy Brown
eb9a520adb Adding a new ESMF version and checksum. (#27367) 2021-11-11 17:02:08 -08:00
Adam J. Stewart
89a59b98e7 py-scipy: add v1.7.2 (#27252) 2021-11-12 00:40:29 +01:00
Adam J. Stewart
e3645d666e py-numpy: add v1.21.4 (#27251) 2021-11-12 00:11:26 +01:00
Manuela Kuhn
dcef7b066c py-pillow: add 8.4.0 (#27371) 2021-11-11 16:04:45 -07:00
Seth R. Johnson
c9ed4bce66 py-wurlitzer: new package (#27328)
* py-wurlitzer: new package

* py-wurlitzer: use PyPI repository

* py-wurlitzer: use pypi attribute
2021-11-11 14:34:50 -07:00
Jen Herting
44b5c1e41a New package: py-parmed (#27362)
* [py-parmed] created template

* [py-parmed]

- added homepage
- added description
- removed fixmes
- added dependencies
2021-11-11 13:58:29 -06:00
iarspider
0d06bf6027 New version: py-nbconvert 6.2.0 (#27350)
* New version: py-nbconvert 6.2.0

* Update package.py
2021-11-11 11:58:56 -07:00
iarspider
a2eadbbcdf New version: py-pandocfilters 1.5.0 (#27359) 2021-11-11 11:07:53 -07:00
iarspider
347ad11b68 New version: py-parso 0.8.2 (#27360)
* New version: py-parso 0.8.2

* Update package.py
2021-11-11 10:58:45 -07:00
eugeneswalker
f2920cbf62 tau: add v2.31 (#27342) 2021-11-11 09:53:07 -08:00
Manuela Kuhn
7513a644b9 py-h5py: add v3.5.0 (#27344) 2021-11-11 18:51:52 +01:00
Harmen Stoppels
f097a24253 fuse-overlayfs: add maintainer, add v1.7, v1.7.1 (#27349) 2021-11-11 18:50:32 +01:00
iarspider
53168d4e57 New version: py-notebook 6.4.5 (#27353) 2021-11-11 09:55:44 -07:00
Manuela Kuhn
72f41a6591 py-interface-meta: add new package (#27340) 2021-11-11 10:48:46 -06:00
iarspider
00c84aa54f New version: py-oauthlib 3.1.1 (#27356)
* New version: py-oauthlib 3.1.1

* Update var/spack/repos/builtin/packages/py-oauthlib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-11 09:43:50 -07:00
haralmha
e32dd27eb7 Generalize env var PYTHON to avoid version conflicts (#27334)
* Generalize env var PYTHON to avoid version conflicts

* Use available python executable

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-11 10:37:54 -06:00
iarspider
9ce0abf34f New version: py-nbconvert 0.5.5 (#27347) 2021-11-11 10:37:10 -06:00
iarspider
569a9a9dcc New version: py-nest-asyncio 1.5.1 (#27351) 2021-11-11 10:35:33 -06:00
iarspider
945594fa98 New version: py-networkx 2.6.3 (#27352) 2021-11-11 10:29:45 -06:00
Daryl W. Grunau
770bbba676 Trilinos: fix DEP_INCLUDE_DIRS definition (#27341) 2021-11-11 05:30:44 -05:00
Dan Bonachea
8d021a2915 upcxx: Update the UPC++ package to 2021.9.0 (#26996)
* upcxx: Update the UPC++ package to 2021.9.0

* Add the new release, and a missing older one.

* Remove the spack package cruft for supporting the obsolete build system that
  was present in older versions that are no longer supported.

* General cleanups.

Support for library versions older than 2020.3.0 is officially retired,
for two reasons:

1. Releases prior to 2020.3.0 had a required dependency on Python 2,
   which is [officially EOL](https://www.python.org/doc/sunset-python-2/)
   as of Jan 1 2020, and is no longer considered secure.
2. (Most importantly) The UPC++ development team is unable/unwilling to
   support releases more than two years old.  UPC++ provides robust
   backwards-compatibility to earlier releases of UPC++ v1.0, with very
   rare well-documented/well-motivated exceptions.  Users are strongly
   encouraged to update to a current version of UPC++.

NOTE: Most of the lines changed in this commit are simply re-indentation,
and thus might be best reviewed in a diff that ignores whitespace.

* upcxx: Detect Cray XC more explicitly

This change is necessary to prevent false matches occurring on new Cray Shasta
systems, which do not use the aries network but were incorrectly being treated
as a Cray XC + aries platform.

UPC++ has not yet deployed official native support for Cray Shasta, but this
change is sufficient to allow building the portable backends there.
2021-11-10 13:48:02 -08:00
Manuela Kuhn
8411cc934d py-setupmeta: add new package (#27327)
* py-setupmeta: add new package

* Update var/spack/repos/builtin/packages/py-setupmeta/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-10 13:43:46 -07:00
iarspider
5e0208a710 New version: py-mpld3 0.5.5 (#27329)
* New version: py-mpld3 0.5.5

* Update package.py

* Update package.py
2021-11-10 11:07:51 -07:00
Manuela Kuhn
335bcdc206 py-traits: add 6.3.1 (#27321) 2021-11-10 11:49:13 -06:00
Manuela Kuhn
89d98a780d py-traitlets: add 5.1.1 (#27322) 2021-11-10 11:48:14 -06:00
iarspider
b5e0ac35c8 New versions: py-more-itertools 8.9.0, 8.11.0 (#27325) 2021-11-10 11:45:27 -06:00
Manuela Kuhn
a31f2be91a py-wrapt: add 1.13.3 (#27326) 2021-11-10 11:42:28 -06:00
iarspider
4633f19971 New verison of py-mpmath: 1.2.1 (#27331) 2021-11-10 11:33:20 -06:00
iarspider
0e5cc0d79b New versions of py-multidict: 5.1.0 and 5.2.0 (#27332) 2021-11-10 10:31:46 -07:00
Veselin Dobrev
ceabb96c89 Add a patch for mfem v4.3 to support cusparse >= 11.4 (#27267) 2021-11-09 20:29:06 -06:00
iarspider
723f2f465b New version: py-markdown 3.3.4 (#27310)
* New version: py-markdown 3.3.4

* Changes from review

* Update package.py
2021-11-09 17:19:41 -07:00
Karen C. Tsai
376e5ffd6e fix typo (#27319) 2021-11-09 17:16:41 -07:00
Massimiliano Culpo
b16bfe4f5f spack tutorial: fix output to screen (#27316)
Spack was checking out v0.17, but the output reported v0.16
2021-11-09 15:10:22 -08:00
Maxim Belkin
0ab5d42bd5 Fix typos (ouptut) (#27317) 2021-11-09 16:11:34 -06:00
Harmen Stoppels
0322431c5c libssh2: add new versions and crypto=mbedtls|openssl (#27284) 2021-11-09 22:24:23 +01:00
Karen C. Tsai
81a0f7222c create py-sphinx-multiversion spackage (#27314) 2021-11-09 14:09:13 -07:00
iarspider
d8f4339c18 New version: py-mako 1.1.5 (#27308) 2021-11-09 13:39:34 -06:00
Manuela Kuhn
41d4473df4 py-pooch: add 1.5.2 (#27305) 2021-11-09 13:37:16 -06:00
Olivier Cessenat
31c932eec9 octave: override qtchooser, add bz2 variant (#26802)
* octave: override qtchooser, add bz2 variant

* fix texinfo not found from "spack install --test=root -v"

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-11-09 19:36:42 +01:00
Kyle Gerheiser
86c8c3306b Add variant for the --enable-two-level-namespace option in MPICH (#27230) 2021-11-09 15:48:01 +00:00
Jordan Galby
6e9c0a8155 Fix log-format reporter ignoring install errors (#25961)
When running `spack install --log-format junit|cdash ...`, install
errors were ignored. This made spack continue building dependents of a
failed install, ignore `--fail-fast`, and exit 0 at the end.
2021-11-09 15:47:32 +00:00
Larry Knox
8a836213f7 Add hdf5-vol-async package. (#26874)
* Add hdf5-vol-async package.
Add HDF5 1.13.0-rc6 version for building vol-async.

* Style test required another blank line.

* Change hdf5 dependency to develop-1.13+mpi+threadsafe.

* Update args for hdf5-vol-async.
2021-11-09 10:31:34 -05:00
Harmen Stoppels
6e6847d9c7 openlibm: new package (#27286) 2021-11-09 04:52:12 -08:00
Harmen Stoppels
321604b4e5 utf8proc: add new versions (#27287) 2021-11-09 04:50:34 -08:00
iarspider
a7363af47e New version: py-jsonpickle 2.0.0 (#27145) 2021-11-09 05:46:51 -07:00
eugeneswalker
560abc9c14 flecsi 2.1.0: requires gcc>=9 (#27195) 2021-11-09 12:32:41 +00:00
Martin Pokorny
65b0588e80 Fix usage of CMakePackage.define_from_variant() (#27307)
Added `self` target to method calls
2021-11-09 04:52:46 -07:00
Alec Scott
a193e6a204 Add v1.57.0 to Rclone (#27205) 2021-11-09 04:34:48 -07:00
Andrey Prokopenko
7763c7a1cc Deprecate pre-release ArborX version (#27248) 2021-11-09 12:04:54 +01:00
Mikhail Titov
e272e5039d rct: update packages (RP, RU) with new versions (#27306) 2021-11-09 12:03:35 +01:00
Jean-Paul Pelteret
8f88f34973 SymEngine: Update package to 0.8.1 (#27304) 2021-11-09 12:03:07 +01:00
Jean-Paul Pelteret
b4a2fd1fcd deal.II: Update for 9.3.2 release (#27303) 2021-11-09 12:02:51 +01:00
Massimiliano Culpo
752aa7adde Allow triggering the "Bootstrap" workflow manually (#27298) 2021-11-09 12:02:24 +01:00
Robert Underwood
2153ac5863 openmpi fix external find for 0.17 (#27255) 2021-11-09 11:58:14 +01:00
TZ
808375429d openmpi: does not support "--without-pmix" (#27258)
Open MPI currently fails to build with scheduler=slurm if +pmix is
not given: ``config_args += self.with_or_without('pmix', ...)``
results in --without-pmix, which makes configure abort with a fatal error.
However, Open MPI's configure points out "Note that Open MPI does
not support --without-pmix."

The PR only adds "--with-pmix=PATH" if +pmix is part of the spec.
Otherwise, nothing is added and Open MPI can fall back to its
internal PMIX sources.

(The other alternative would be to depend on +pmix for
scheduler=slurm, as is done for +pmi.)
2021-11-09 11:55:56 +01:00
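A hedged sketch of the configure-argument logic the commit describes, written as it might appear inside the Open MPI package class (the real recipe handles many more options):

```python
# Sketch: method body as it could look inside the openmpi package class.
def configure_args(self):
    spec = self.spec
    config_args = []

    # Only pass --with-pmix=<prefix> when the variant is enabled.  With
    # ~pmix nothing is added, so Open MPI falls back to its internal PMIx
    # sources; configure does not support --without-pmix.
    if "+pmix" in spec:
        config_args.append("--with-pmix={0}".format(spec["pmix"].prefix))

    return config_args
```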
Dylan Simon
2b990b400e make --enable-locks actually enable locks (#24675) 2021-11-09 10:52:08 +00:00
Desmond Orton
34be8d0670 Rebased develop to fix pipeline issues (#27209) 2021-11-09 11:47:59 +01:00
Desmond Orton
eae2027db7 Build Fix (#27210) 2021-11-09 11:46:51 +01:00
Hanwen
65aadf0332 aws-parallelcluster: add v2.11.3 (#27244) 2021-11-09 11:42:09 +01:00
Glenn Johnson
44bf54edee xrootd: add new version and variant (#27245)
- add version 5.3.2
- add krb variant
- add patch to not look for systemd
2021-11-09 11:41:39 +01:00
estewart08
daa8a2fe87 [AMD][rocm-openmp-extras] Update for ROCm 4.3. (#25482)
- Added new checksums for 4.3.
- Now using llvm-amdgpu ~openmp in order to use the rocm-device-libs
  build as external project in llvm-amdgpu package. We still need
  to pull device-libs in using resource for the build as some headers
  are not installed.
- Updated symlink creation to now remove an existing link if present
  to avoid issues on partial reinstalls when debugging.
- Adjusted the flang_warning to be a part of CMake options instead of
  a filter_file for better compatibility.
- The dependency on hsa-rocr-dev created some problems as type was changed
  to the default build/link. This issue was because ROCr uses libelf and
  the build of openmp expects elfutils. When link is specified libelf
  was being found in the path first, causing errors. This was
  introduced with the llvm-amdgpu external project build of device-libs.
- On a bare-bones installation of sles15 it was noted that libquadmath0 was
  needed as a dependency. On Ubuntu 18.04 gcc-multilib was also needed.

* Workaround for libelf headers being used instead of elfutils.
2021-11-09 10:36:13 +00:00
Cameron Rutherford
b8d6351f80 Remove requiring hiop+shared in ExaGO. (#27141) 2021-11-09 11:35:10 +01:00
Cory Bloor
7cb3e9bf43 hipblas: Add spack test support (#27074) 2021-11-09 11:32:36 +01:00
luker
8ce37d9153 HPCG Fix for the Cray complier (CCE) (#25197)
CCE's C++ compiler (Clang-based) does not accept the '-ftree-vectorizer-verbose=0'
flag. That flag is removed and a suitable substitution is made.
2021-11-09 10:26:02 +00:00
Cory Bloor
a9c6881bbc rocsolver: Add spack test support (#26919) 2021-11-09 11:25:03 +01:00
Ben Darwin
7c118ee22b ants: fix build by setting BUILD_TESTING=OFF (#26768)
Due to Kitware API changes, default ANTs builds were failing, presumably for all versions (https://github.com/ANTsX/ANTs/issues/1236).
This commit defaults BUILD_TESTING to OFF, preventing calls against
these APIs and fixing all versions.
Note that the ANTs test suite was not clean anyway (e.g. ANTs/#842).
2021-11-09 11:19:32 +01:00
Valentin Volkl
a3dd0e7861 build_environment: clean *_ROOT variables (#26474)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-11-09 11:16:42 +01:00
Joseph Wang
abdd206f65 add libtirpc for rpc/types.h (#26628)
rpc/types.h has been removed in recent glibc
2021-11-09 11:14:28 +01:00
Sreenivasa Murthy Kolam
1d09474975 fix rvs build issue by using yaml-cpp spack recipe (#26000) 2021-11-09 11:09:15 +01:00
Tsuyoshi Yamaura
3484434fe2 New package: SCALE (#25574) 2021-11-09 00:19:47 -08:00
iarspider
f164bae4a3 Python tests: skip importing weirdly-named modules (#27151)
* Python tests: allow importing weirdly-named modules

e.g. with dashes in name

* SIP tests: allow importing weirdly-named modules

* Skip modules with invalid names

* Changes from review

* Update from review

* Update from review

* Cleanup
2021-11-08 19:22:33 +00:00
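The skipping logic described in the commit above can be illustrated with a small, self-contained sketch: keep only module names whose dotted components are valid Python identifiers. This is an illustration of the idea, not the exact code added to Spack's Python/SIP import tests:

```python
# Sketch: drop module names that cannot appear in an "import" statement
# (for example names containing dashes), so import smoke tests skip them.
def importable(module_name):
    return all(part.isidentifier() for part in module_name.split("."))


names = ["numpy", "ruamel.yaml", "entry-points-selectable"]
print([n for n in names if importable(n)])  # -> ['numpy', 'ruamel.yaml']
```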
Brian Van Essen
db504db9aa Added new hash values for recent versions. (#27294) 2021-11-08 19:02:30 +00:00
iarspider
bff894900e New version: py-lz4 3.1.3; use external lz4 instead of bundled one (#27282)
* New version: py-lz4 3.1.3; use external lz4 instead of bundled one

* Update var/spack/repos/builtin/packages/py-lz4/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Changes from review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-08 18:14:46 +00:00
iarspider
71f8460357 New versions of py-keyring: 23.1.0, 23.2.0, 23.2.1 (#27276) 2021-11-08 10:59:40 -06:00
iarspider
f9d0007546 New version: py-lazy-object-proxy 1.6.0 (#27277) 2021-11-08 10:58:26 -06:00
iarspider
14232b40d9 New version: py-lockfile 0.12.2 (#27279) 2021-11-08 10:56:34 -06:00
Seth R. Johnson
18d506c7fb trilinos: disable optional packages by default (#26815) 2021-11-08 08:29:35 -05:00
Maciej Wójcik
c384ac88a0 gromacs-swaxs: new package (#26902) 2021-11-08 13:51:23 +01:00
Maciej Wójcik
8f59707f04 gromacs-chain-coordinate: inherit from gromacs package (#27246) 2021-11-08 13:50:42 +01:00
Massimiliano Culpo
62f0c70a8c OpenSUSE: try to apply workaround to avoid CI error (#27275) 2021-11-08 11:56:19 +00:00
Ben Bergen
1fc753af66 Updated Versions (#27268)
- Added 7.1 release
- Added master git branch
2021-11-07 19:41:49 -07:00
Brice Videau
f02dec2bbb Update py-ytopt package and add necessary dependencies (#26879)
* Add spack package py-ytopt-team-ytopt and required dependencies.

* Removed old ytop package.

* Added author as maintainer.

* Fix style.

* Update var/spack/repos/builtin/packages/py-config-space/package.py

Update python dependency to 3.7

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-config-space/package.py

Remove run dependency from py-cython.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-config-space/package.py

Added run dependency type for py-pyparsing.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Updated description of py-dh-scikit-optimize.

* Source py-dh-scikit-optimize from PyPI.

* Added latest py-dh-scikit-optimize version.

* Made plots option False by default for py-dh-scikit-optimize.

* Removed 0.9.4 as it needs additional dependencies.

* Added version dependencies.

* Added missing py-joblib dependencies.

* Added run dependency type.

* Added python 2.7+ as supported for py-pyaml.

* Change py-config-space to py-configspace.

* Added dependency on python 3.6+.

* Fix py-configspace package naming.

* Changed py-autotune to py-ytopt-autotune.

* Update var/spack/repos/builtin/packages/py-pyaml/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Added debug variant with py-ray dependency.

* Added missing py-mpi4py missing dependency.

* Removed erroneous variant.

* Added debug variant to py-ray.

* Fix indentation.

* Removed debug variant of py-ray.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-11-06 10:52:30 -05:00
Todd Gamblin
285548588f Update CHANGELOG.md for 0.17.0 2021-11-05 17:22:32 -07:00
Todd Gamblin
0e3d0516e6 bump version number to 0.17.0 2021-11-05 17:11:37 -07:00
Ben Darwin
12429bda35 py-itk: fix dependencies (#27164) 2021-11-05 16:26:18 -05:00
Seth R. Johnson
c5be548046 New package: GNDS (#27176)
Adds a new package for the AMPX/SCALE implementation of the GNDS
interface,
https://www.oecd.org/publications/specifications-for-the-generalised-nuclear-database-structure-gnds-94d5e451-en.htm
.
2021-11-05 14:04:47 -06:00
Bernhard Kaindl
d9d1319442 qt: replace conflicts('%gcc@11:', when='@5.9:5.14') with -include limits (#27241)
Noting that missing numeric_limits was the cause of the compile issues
with gcc-11, I tested adding -include limits fixing @5.9:5.14%gcc@11.
Therefore, we can replace the conflicts('%gcc@11:', when='@5.9:5.14').

Co-authored-by: Bernhard Kaindl <bernhard.kaindl@ait.ac.at>
2021-11-05 14:06:31 -04:00
Theofilos Manitaras
549bd70449 py-mpi4py: Add newer versions (#27239)
* py-mpi4py: Add newer versions

* Address PR comments
2021-11-05 11:07:44 -06:00
Axel Huebl
4f692e4d9f openPMD-api: 0.14.3 (#27211)
Add the latest release.
2021-11-05 10:53:03 -06:00
Massimiliano Culpo
0feb5ec70a Prevent additional properties to be in the answer set when reusing specs (#27240)
* Prevent additional properties to be in the answer set when reusing specs

fixes #27237

The mechanism to reuse concrete specs relies on imposing
the set of constraints stemming from the concrete spec
being reused.

We also need to prevent other constraints from being added
to this set.
2021-11-05 10:52:44 -06:00
Peter Brady
178e15c39d ctags uses custom autogen.sh script (#27229) 2021-11-05 09:46:09 -07:00
Sinan
0b4b731479 package/py-zarr_add_v2.10.2 (#27212)
* package/py-zarr_add_v2.10.2

* improve python dep version constraints

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2021-11-05 09:23:41 -05:00
Bernhard Kaindl
91c7c24260 hdf5: Skip racy test cases(which loop endless on many cores) (#27068) 2021-11-05 09:59:45 -04:00
Seth R. Johnson
42d8e9eeb5 trilinos: explicitly use variants instead of spec for TPLs (#27221)
Fixes https://github.com/spack/spack/pull/27188#issuecomment-961212214
2021-11-05 09:07:33 -04:00
Manuela Kuhn
d2c26fef46 r-readr: add 2.0.2 (#27177) 2021-11-05 06:34:40 -06:00
Mikhail Titov
d7ae9414c2 exaworks: swift-t package (#27234) 2021-11-05 10:46:35 +01:00
Carlos Bederián
b06198f1d4 gmsh: add dependency on gl and glu providers when +fltk (#27169) 2021-11-05 10:45:53 +01:00
Harmen Stoppels
8bb5ed8464 make version docs reflect reality (#27149)
* make version docs reflect reality

* typo and make things

* 2.6 -> 2.7 in example
2021-11-05 09:39:31 +00:00
Massimiliano Culpo
e93a2db8b7 hpx: simplify instrumentation_args function (#27226) 2021-11-05 02:34:40 -06:00
Todd Gamblin
e13e697067 commands: spack load --list alias for spack find --loaded (#27184)
See #25249 and https://github.com/spack/spack/pull/27159#issuecomment-958163679.
This adds `spack load --list` as an alias for `spack find --loaded`.  The new command is
not as powerful as `spack find --loaded`, as you can't combine it with all the queries or
formats that `spack find` provides.  However, it is more intuitively located in the command
structure in that it appears in the output of `spack load --help`.

The idea here is that people can use `spack load --list`  for simple stuff but fall back to
`spack find --loaded` if they need more.

- add help to `spack load --list` that references `spack find`
- factor some parts of `spack find` out to be called from `spack load`
- add shell tests
- update docs

Co-authored-by: Peter Josef Scheibel <scheibel1@llnl.gov>
Co-authored-by: Richarda Butler <39577672+RikkiButler20@users.noreply.github.com>
2021-11-05 00:58:29 -07:00
Simon Frasch
bfbf9deb74 spfft: add version 1.0.5 (#27223) 2021-11-05 01:52:37 -06:00
Simon Frasch
6bae3cd6bb spla: add version 1.5.2 (#27222) 2021-11-05 08:45:34 +01:00
Todd Gamblin
8e76244266 docs for experimental --reuse argument to spack install
Add docs for `--reuse`, along with a warning that it will likely be
removed and refactored.
2021-11-05 00:15:47 -07:00
Gregory Becker
5efa76a033 error message for reusing multiple hashes for package 2021-11-05 00:15:47 -07:00
Todd Gamblin
ac1e05fe1b concretizer: add error messages and simplify asp.py 2021-11-05 00:15:47 -07:00
Massimiliano Culpo
0186f0f955 Fix logic program for multi-valued variants
Reformulate variant rules so that we minimize both

1. The number of non-default values being used
2. The number of default values not-being used

This is crucial for MV variants where we may have
more than one default value
2021-11-05 00:15:47 -07:00
Todd Gamblin
e0c3d074c0 bugfix: handle hashes that only exist in input specs
In our tests, we use concrete specs generated from mock packages,
which *only* occur as inputs to the solver. This fixes two problems:

1. We weren't previously adding facts to encode the necessary
   `depends_on()` relationships, and specs were unsatisfiable on
   reachability.

2. Our hash lookup for reconstructing the DAG does not
   consider that a hash may have come from the inputs.
2021-11-05 00:15:47 -07:00
Todd Gamblin
a4a2ed3c34 concretizer: exempt already-installed specs from compiler and variant rules
Concrete specs that are already installed or that come from a buildcache
may have compilers and variant settings that we do not recognize, but that
shouldn't prevent reuse (at least not until we have a more detailed compiler
model).

- [x] make sure compiler and variant consistency rules only apply to
      built specs
- [x] don't validate concrete specs on input, either -- they're concrete
      and we shouldn't apply today's rules to yesterday's build
2021-11-05 00:15:47 -07:00
Todd Gamblin
49ed41b028 spack diff: more flexible tests, restore transitive diff with spec_clauses
In switching to hash facts for concrete specs, we lost the transitive facts
from dependencies. This was fine for solves, because they were implied by
the imposed constraints from every hash. However, for `spack diff`, we want
to see what the hashes mean, so we need another mode for `spec_clauses()` to
show that.

This adds an `expand_hashes` argument to `spec_clauses()` that allows us to
output *both* the hashes and their implications on dependencies. We use
this mode in `spack diff`.
2021-11-05 00:15:47 -07:00
Massimiliano Culpo
3e3e84ba30 Add a missing definition in the logic program 2021-11-05 00:15:47 -07:00
Massimiliano Culpo
be2cf16b67 Add buildcache to reusable specs 2021-11-05 00:15:47 -07:00
Massimiliano Culpo
31dfad9c16 spack install: add --reuse argument 2021-11-05 00:15:47 -07:00
Massimiliano Culpo
e2744fafa1 spack concretize: add --reuse argument 2021-11-05 00:15:47 -07:00
Massimiliano Culpo
290f57c779 spack spec: add --reuse argument 2021-11-05 00:15:47 -07:00
Todd Gamblin
652fa663b5 concretizer: get rid of last maximize directive in concretize.lp
- [x] Get rid of forgotten maximize directive.
- [x] Simplify variant handling
- [x] Fix bug in treatment of defaults on externals (don't count
      non-default variants on externals against them)
2021-11-05 00:15:47 -07:00
Massimiliano Culpo
da57b8775f Update command completion 2021-11-05 00:15:47 -07:00
Massimiliano Culpo
0d74a4f46e Trim dependencies on externals 2021-11-05 00:15:47 -07:00
Massimiliano Culpo
0b80035eaa Fix a unit test to match the new OS semantics
CNL, debian6 and Suse are not compatible
2021-11-05 00:15:47 -07:00
Massimiliano Culpo
4d25fc0068 ASP-based solve: if an OS is set, respect the value 2021-11-05 00:15:47 -07:00
Massimiliano Culpo
6e297b9ba1 Fix a typo in "variant_not_default" rule 2021-11-05 00:15:47 -07:00
Todd Gamblin
ace4586bf8 concretizer: rework spack solve output to handle reuse better 2021-11-05 00:15:47 -07:00
Todd Gamblin
c537785f6f spec: ensure_valid_variants() should not validate concrete specs
Variants in concrete specs are "always" correct -- or at least we assume
them to be b/c they were concretized before. Their variants need not match
the current version of the package.
2021-11-05 00:15:47 -07:00
Todd Gamblin
b60a95cd5d concretizer: unify handling of single- and multi-valued variants
Multi-valued variants previously maximized default values to handle
cases where the default contained two values, e.g.:

    variant("foo", default="bar,baz")

This is because previously we were minimizing non-default values, and
`foo=bar`, `foo=baz`, and `foo=bar,baz` all had the same score, as
none of them had any "non-default" values.

This commit changes the approach and considers a non-default value
to be either a value set to something not default *or* the absence
of a default value from the set value.  This allows multi- and
single-valued variants to be handled the same way, with the same
minimization criterion.  It also means that the "best" value for every
optimization criterion is now zero, which allows us to make useful
assumptions about the optimization criteria.
2021-11-05 00:15:47 -07:00
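For reference, a hedged sketch of how a multi-valued variant with a two-value default like the `foo=bar,baz` example above is declared in a package recipe; the package and value names are illustrative:

```python
from spack import *


class MvVariantSketch(Package):
    """Illustrative only: a multi-valued variant with two default values."""

    homepage = "https://example.invalid"
    url = "https://example.invalid/mv-1.0.tar.gz"
    version("1.0", sha256="0" * 64)

    variant(
        "foo",
        default="bar,baz",              # both values enabled by default
        values=("bar", "baz", "qux"),
        multi=True,
        description="Example multi-valued variant",
    )
```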
Todd Gamblin
b88da9d73d concretizer: reuse installs, but assign default values for new builds
Minimizing builds is tricky. We want a minimizing criterion because
we want to reuse the available installs, but we also want things that
have to be built to stick to *default preferences* from the package
and from the user. We therefore treat built specs differently and
apply a different set of optimization criteria to them. Spack's *first*
priority is to reuse what it can, but if it builds something, the built
specs will respect defaults and preferences.

This is implemented by bumping the priority of optimization criteria
for built specs -- so that they take precedence over the otherwise
topmost-priority criterion to reuse what is installed.

The scheme relies on all of our optimization criteria being minimizations.
That is, we need the case where all specs are reused to be better than
any built spec could be. Basically, if nothing is built, all the build
criteria are zero (the best possible) and the number of built packages
dominates. If something *has* to be built, it must be strictly worse
than full reuse, because:

  1. it increases the number of built specs
  2. it must have either zero or some positive number for all criteria

Our optimization criteria effectively sum into two buckets at once to
accomplish this. We use a `build_priority()` number to shift the
priority of optimization criteria for built specs higher.
2021-11-05 00:15:47 -07:00
Todd Gamblin
cfb60ab9e1 tests: make spack diff test more lenient
The constraints in the `spack diff` test were very specific and assumed
a lot about the structure of what was being diffed. Relax them a bit to
make them more resilient to changes.
2021-11-05 00:15:47 -07:00
Todd Gamblin
9eb94be6dd concretizer: only minimize builds when --reuse is enabled.
Make the first minimization conditional on whether `--reuse` is enabled in the solve.
If `--reuse` is not enabled, there will be nothing in the set to minimize and the
objective function (for this criterion) will be 0 for every answer set.
2021-11-05 00:15:47 -07:00
Todd Gamblin
40b914503e concretizer: adjust integrity constraints to only apply to builds.
Many of the integrity constraints in the concretizer are there to restrict how solves are done, but
they ignore that past solves may have had different initial conditions. For example, for things
we're building, we want the allowed variants to be restricted to those currently in Spack packages,
but if we are reusing a concrete spec, we need to be flexible about names that may have existed in
old packages.

Similarly, restrictions around compatibility of OS's, compiler versions, compiler OS support, etc.
are really only about what is supported by the *current* set of compilers/build tools known to
Spack, not about what we may get from concrete specs.

- [x] restrict certain integrity constraints to only apply to packages that we need to build, and
      omit concrete specs from consideration.
2021-11-05 00:15:47 -07:00
Todd Gamblin
2c142f9dd4 concretizer: rework operating system semantics for installed packages
The OS logic in the concretizer is still the way it was in the first version.
Defaults are implemented in a fairly inflexible way using straight logic. Most
of the other sections have been reworked to leave these kinds of decisions to
optimization. This commit does that for OS's as well.

As with targets, we optimize for target matches. We also try to optimize for
OS matches between nodes. Additionally, this commit adds the notion of
"OS compatibility" where we allow for builds to depend on binaries for certain
other OS's. e.g, for macos, a bigsur build can depend on an already installed
(concrete) catalina build. One cool thing about this is that we can declare
additional compatible OS's later, e.g. CentOS and RHEL.
2021-11-05 00:15:47 -07:00
Todd Gamblin
9c70d51a4f concretizer: impose() for concrete specs should use body facts.
The concretizer doesn't get a say in whether constraints from
concrete specs are imposed, so use body facts for them.
2021-11-05 00:15:47 -07:00
Todd Gamblin
3866b3e7d3 include installed hashes in solve and optimize for reuse 2021-11-05 00:15:47 -07:00
Todd Gamblin
7abe4ab309 rename checked_spec_clauses() to spec_clauses() 2021-11-05 00:15:47 -07:00
Todd Gamblin
ad5d632eeb add --reuse option to spack solve 2021-11-05 00:15:47 -07:00
Sinan
288176326c new package: librttopo (#27182)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-11-04 21:01:42 -06:00
Alec Scott
82b45d112f [New Package] Add Restic v0.12.1 (#27208)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-11-04 17:12:34 -07:00
Massimiliano Culpo
839057e98d Rename the temporary scope for bootstrap buildcache (#27231)
If we don't rename Spack will fail with:
```
ImportError: cannot bootstrap the "clingo" Python module from spec "clingo-bootstrap@spack+python %gcc target=x86_64" due to the following failures:
    'spack-install' raised ValueError: Invalid config scope: 'bootstrap'.  Must be one of odict_keys(['_builtin', 'defaults', 'defaults/cray', 'bootstrap/cray', 'disable_modules', 'overrides-0'])
    Please run `spack -d spec zlib` for more verbose error messages
```
in case bootstrapping from binaries fails and we are
falling back to bootstrapping from sources.
2021-11-04 16:17:00 -07:00
Ben Darwin
f2a42ac4c6 c3d: add new package (#27155) 2021-11-04 15:38:28 -07:00
Tamara Dahlgren
080f1872b8 Add the spack tutorial environment as a cloud pipeline stack (#27137) 2021-11-04 15:14:46 -07:00
Jerome Soumagne
c25a4ecfc2 libfabric: add 1.13.2 (#27202) 2021-11-04 15:04:40 -07:00
Massimiliano Culpo
79f754a968 Sort arguments lexicographically in command's help (#27196) 2021-11-04 12:41:58 -07:00
downloadico
084ce46107 exciting: add "oxygen" version, multiple fixes (#27217)
Ensure that none of ^intel-mkl, ^intel-mpi, and ^mkl are used unless
the compiler is intel.
Fix bad logic in the src/src_xs/m_makespectrum.f90 file in the oxygen version.
Add the -fallow-argument-mismatch for gcc >= 10.
2021-11-04 12:24:27 -07:00
Cameron Smith
7542d8adc8 omega-h: add support for stand-alone testing (#26931)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-11-04 12:02:02 -07:00
Joe Schoonover
4cc2adcbc3 Add new versions for HOHQMesh and switch to tar-ball releases (#27194)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-11-04 11:54:02 -07:00
Jon Rood
ed3d459a83 Add RelWithDebInfo to protobuf build_type list. (#27220) 2021-11-04 10:16:44 -07:00
Andre Merzky
f6ed8b6150 Feature/exaworks spack (#27216)
* add exaworks spack meta package

* add maintainer

* switch to `BundlePackage`

* flake8

* another flake8 fix

* remove incorrect dependency
2021-11-04 06:07:45 -06:00
Massimiliano Culpo
1dcabdbc8d mfem, hpx: fix recipes after conditional variants (#27215) 2021-11-04 03:52:38 -06:00
Seth R. Johnson
503576c017 kokkos: refactor defines to use helper functions (#27189) 2021-11-04 09:31:29 +01:00
Erik Schnetter
6a7b07c60b ssht: New version 1.5.1 (#27173) 2021-11-03 22:10:56 -06:00
Cameron Stanavige
7102e295b9 scr: 3.0rc2 release, variants and deps updates (#27178)
* scr: 3.0rc2 release, variants and deps updates

This adds the 3.0rc2 release for end users to aid in testing scr for
the upcoming 3.0 release.

Included in this change:
- Require most recent component versions for this release
- Add a variant for PDSH as it is now an optional dependency with
this release
- Add bbapi and datawarp (dw) variants
- bbapi_fallback variant now requires bbapi variant with latest
release
- Add variants to enable/disable examples and tests
- Add shared variant and current conflicts with ~shared
- Update cmake_args to account for added variants where needed

Additional updates:
- Add maintainers
- Use lists and for loops to clean up repetitive code involving all
components
- Use self.define and self.define_from_variant to clean up cmake_args
- Use consistent quoting throughout package

* Un-deprecate v2 and legacy

* Use new conditional variants
2021-11-03 20:06:02 -07:00
Erik Schnetter
b9cdaa5429 simulationio package: add variants asdf, hdf5, rnpl, silo (#27172)
Define new variants asdf, hdf5, rnpl, silo to allow disabling or
enabling dependencies.
2021-11-03 18:07:35 -07:00
Erik Schnetter
0d30799bef hwloc: New version 2.6.0 (#27170) 2021-11-03 15:45:38 -07:00
Erik Schnetter
4045ab45de mpitrampoline: New version 2.2.0 (#27171) 2021-11-03 15:42:14 -07:00
Erik Schnetter
91a6e38404 shtools: disable libtool, add 4.9.1, fix --test=root (#27014)
The Makefile expects the "other" libtool, not the GNU libtool we have in Spack.
Closes https://github.com/spack/spack/issues/26993
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-11-03 23:33:52 +01:00
Thomas Helfer
c0a81399bf tfel and mgis: add new versions and fix tests (#27011)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-11-03 23:24:28 +01:00
Andrew W Elble
7b3e02471b new package: telegraf (#27201) 2021-11-03 15:00:33 -07:00
David Boehme
47babb2518 Add Caliper v2.7.0 (#27197) 2021-11-03 13:22:43 -06:00
Seth R. Johnson
12a0738030 trilinos: mark conflicts in @:13.1+tpetra^cuda@11 (#27188)
* trilinos: fix @13.0.1+tpetra^cuda@11
* Mark CUDA conflict with old versions and always define TPL
* trilinos: patch doesn't build so just mark as conflict
2021-11-03 12:38:03 -06:00
Axel Huebl
e5a9beed28 WarpX: 21.11 (#27158)
Update `warpx` & `py-warpx` to the latest release, `21.11`.
2021-11-03 11:18:23 -07:00
Kyle Gerheiser
25f1aad1c8 nemsio package: add version 2.5.4; add option to build without MPI (#27030)
Version 2.5.4 adds an option which allows Nemsio to be built without MPI
2021-11-03 10:18:17 -07:00
Manuela Kuhn
09eb79b571 r-reprex: add 2.0.1 (#27174) 2021-11-03 12:05:28 -05:00
Satish Balay
346f3652c6 petsc, py-petsc4py: add versions 3.16.1 (#27152) 2021-11-03 10:52:35 -05:00
Manuela Kuhn
78b1512966 py-datalad: add 0.15.3 (#27193) 2021-11-03 09:49:46 -06:00
iarspider
dede8c9d6b New version: py-jupyterlab-pygments 0.1.2 (#27186) 2021-11-03 10:27:55 -05:00
Manuela Kuhn
8e4d5a0922 sip: fix python_include_dir (#26953) 2021-11-03 10:27:04 -05:00
Manuela Kuhn
5c13c5892b py-nibetaseries: add new package (#27187) 2021-11-03 10:26:30 -05:00
Greg Becker
67cd92e6a3 Allow conditional variants (#24858)
A common question from users has been how to model variants 
that are new in new versions of a package, or variants that are 
dependent on other variants. Our stock answer so far has been
an unsatisfying combination of "just have it do nothing in the old
version" and "tell Spack it conflicts".

This PR enables conditional variants, on any spec condition. The 
syntax is straightforward, and matches that of previous features.
2021-11-03 08:11:31 +01:00
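A short hedged sketch of the syntax this feature enables, a `when=` condition on the `variant()` directive; the package and variant names below are illustrative, not taken from a real recipe:

```python
from spack import *


class CondVariantSketch(Package):
    """Illustrative only: variants that exist conditionally."""

    homepage = "https://example.invalid"
    url = "https://example.invalid/cond-1.0.tar.gz"
    version("2.0", sha256="0" * 64)
    version("1.0", sha256="1" * 64)

    # Variant that only exists from version 2.0 onward.
    variant("plugins", default=True, description="Enable plugins",
            when="@2.0:")

    # Variant that is only meaningful when another variant is active.
    variant("mpi", default=False, description="Enable MPI support")
    variant("parallel_io", default=False, description="Parallel I/O",
            when="+mpi")
```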
Massimiliano Culpo
78c08fccd5 Bootstrap GnuPG (#24003)
* GnuPG: allow bootstrapping from buildcache and sources

* Add a test to bootstrap GnuPG from binaries

* Disable bootstrapping in tests

* Add e2e test to bootstrap GnuPG from sources on Ubuntu

* Add e2e test to bootstrap GnuPG on macOS
2021-11-02 23:15:24 -07:00
Richarda Butler
1a3747b2b3 Update docs how to display loaded modules (#27159)
* Update spack load docs
2021-11-02 22:12:08 -07:00
dependabot[bot]
a382a6c0e2 build(deps): bump actions/checkout from 2.3.5 to 2.4.0 (#27179)
Bumps [actions/checkout](https://github.com/actions/checkout) from 2.3.5 to 2.4.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](1e204e9a92...ec3a7ce113)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-03 02:52:28 +00:00
Manuela Kuhn
61ded65888 r-dtplyr: add new package (#27112) 2021-11-03 00:24:21 +00:00
Dan Lipsa
f1fb816d93 Add build editions for catalyst builds. (#22676)
* Add build editions for catalyst builds.

* Fix style.

* Build edition works only for 5.8:
2021-11-02 17:36:53 -04:00
kwryankrattiger
f1afd5ff27 Add and propagate CUDA variants for DAV SDK (#26476) 2021-11-02 17:31:50 -04:00
Satish Balay
c9f8dd93f3 trilinos: new version 13.2.0 (#27106)
* trilinos: add @13.2.0, and switch default to cxxstd=14

* trilinos: fix python dependency when using +ifpack or +ifpack2

* trilinos: add conflict for ~epetra +ml when @13.2.0:

* trilinos: keep 13.0.1 as the preferred version

* Update var/spack/repos/builtin/packages/trilinos/package.py

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* update

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2021-11-02 13:07:42 -06:00
Greg Becker
b3711c0d9d Improved error messages from clingo (#26719)
This PR adds error message sentinels to the clingo solve, attached to each of the rules that could fail a solve. The unsat core is then restricted to these messages, which makes the minimization problem tractable. Errors that can only be generated by a bug in the logic program or generating code are prefaced with "Internal error" to make clear to users that something has gone wrong on the Spack side of things.

* minimize unsat cores manually

* only errors messages are choices/assumptions for performance

* pre-check for unreachable nodes

* update tests for new error message

* make clingo concretization errors show up in cdash reports fully

* clingo: make import of clingo.ast parsing routines robust to clingo version

Older `clingo` has `parse_string`; newer `clingo` has `parse_files`.  Make the
code work with both.

* make AST access functions backward-compatible with clingo 5.4.0

Clingo AST API has changed since 5.4.0; make some functions to help us
handle both versions of the AST.

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-11-02 10:55:50 -07:00
Manuela Kuhn
3187689862 r-googlesheets4: add new package (#27114) 2021-11-02 11:46:48 -06:00
Manuela Kuhn
00b6927c65 r-dbplyr: add 2.1.1 (#27111) 2021-11-02 17:35:56 +00:00
Manuela Kuhn
9dc790bbe2 r-haven: add 2.4.3 (#27115) 2021-11-02 17:07:55 +00:00
Manuela Kuhn
ae76692c2e r-lubridate: add 1.8.0 (#27117) 2021-11-02 16:55:27 +00:00
iarspider
fa63bebf36 New versions of py-jupyter-server; fix tests (#27153)
* New versions of py-jupyter-server; fix tests

* Update package.py
2021-11-02 10:37:53 -06:00
Tom Payerle
eee4522103 trilinos: Additional fix for linking C code when built with PyTrilinos (#19834)
This removes `-lpytrilinos` from Makefile.export.Trilinos so that C code
trying to link against a Trilinos built with PyTrilinos does not fail
due to undefined references to python routines (libpytrilinos is only
used when importing PyTrilinos in python, in which case those references
are already defined by Python).

There was already a bit of code to do something similar for C codes
importing Trilinos via a CMake mechanism; this extends that to a basic
Makefile mechanism as well.  This patch also updates the comments to
remove a stale link discussing this issue, replacing it with links to
some Trilinos issue reports related to the matter.
2021-11-02 12:31:10 -04:00
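A hedged sketch of the kind of cleanup the commit describes, using Spack's filter_file helper to strip the flag from the exported Makefile; the file location and function name here are assumptions, not the exact trilinos implementation:

```python
# Sketch: remove -lpytrilinos from Makefile.export.Trilinos so that plain
# C/C++ consumers do not inherit undefined references to Python symbols.
import os

from llnl.util.filesystem import filter_file  # Spack's in-tree helper


def strip_pytrilinos_flag(prefix):
    exported = os.path.join(prefix, "include", "Makefile.export.Trilinos")
    if os.path.exists(exported):
        filter_file(r"\s*-lpytrilinos\b", "", exported)
```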
Manuela Kuhn
b8a44870a4 r-vroom: add new package (#27118) 2021-11-02 11:28:43 -05:00
Manuela Kuhn
426033bc5b r-brms: add 2.16.1 (#27148) 2021-11-02 11:21:54 -05:00
Manuela Kuhn
c2a7f8c608 r-rvest: add 1.0.2 (#27147) 2021-11-02 11:21:09 -05:00
Manuela Kuhn
429a60c703 r-callr: add 3.7.0 (#27146) 2021-11-02 11:19:09 -05:00
Manuela Kuhn
23f8a7331e r-forcats: add 0.5.1 (#27113) 2021-11-02 11:17:22 -05:00
Manuela Kuhn
6e2de9e41f r-processx-3.5.2 (#27119) 2021-11-02 11:13:13 -05:00
Seth R. Johnson
9cfecec002 relocate: do not change library id to use rpaths on package install (#27139)
After #26608 I got a report about missing rpaths when building a
downstream package independently using a spack-installed toolchain
(@tmdelellis). This occurred because the spack-installed libraries were
being linked into the downstream app, but the rpaths were not being
manually added. Prior to #26608 autotools-installed libs would retain
their hard-coded path and would thus propagate their link information
into the downstream library on mac.

We could solve this problem *if* the mac linker (ld) respected
`LD_RUN_PATH` like it does on GNU systems, i.e. adding `rpath` entries
to each item in the environment variable. However on mac we would have
to manually add rpaths either using spack's compiler wrapper scripts or
manually (e.g. using `CMAKE_BUILD_RPATH` and pointing to the libraries of
all the autotools-installed spack libraries).

The easier and safer thing to do for now is to simply stop changing the
dylib IDs.
2021-11-02 17:04:29 +01:00
Manuela Kuhn
34b2742817 r-hms: add 1.1.1 (#27116) 2021-11-02 11:03:52 -05:00
Manuela Kuhn
9d124bdac6 r-crayon: add 1.4.1 (#27110) 2021-11-02 10:49:56 -05:00
Manuela Kuhn
507f7a94d9 r-broom: add 0.7.9 and 0.7.10 (#27109) 2021-11-02 10:48:32 -05:00
Sinan
5fdf6e5f68 package/qgis_revert_incorrect_constraint (#27140)
* package/qgis_revert_incorrect_constraint

* fix bug

* also update dependency constraints

* also update python version constraints

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2021-11-02 10:43:29 -05:00
iarspider
669954a71d New version: py-isort 5.9.3 (#27144) 2021-11-02 10:42:34 -05:00
iarspider
6f44bf01e0 New version: py-jupyter-console 6.4.0; download sources from pip (#27150) 2021-11-02 10:37:37 -05:00
Joseph Wang
372fc78e98 yoda: add zlib as a dependency (#26454)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-11-02 12:37:23 +01:00
Cameron Smith
ca30940868 pumi: support build/install time and stand-alone testing (#26832) 2021-11-02 12:18:46 +01:00
Bernhard Kaindl
5a4d03060b gtk packages: fix dependencies (#26960)
gconf depends on gettext and libintl (dep: intltool)
glibmm, gtkmm, libcanberra and cups need pkgconfig
glibmm needs libsigc++ < 2.9x(which are 3.x pre-releases)
libsigc++@:2.9 depends on m4 for the build

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-11-02 10:39:25 +00:00
Chien-Chang Feng
bf411c7c55 pgplot: add intel compiler support and add more driver (#26761) 2021-11-02 11:30:28 +01:00
Massimiliano Culpo
3f3048e157 Fix GitHub Action's container build (#27143)
#26538 introduced a typo that causes the Docker image
build to fail.
2021-11-02 04:02:04 -06:00
Jean Luca Bez
85bdf4f74e h5bench: update maintainers and versions (#27083) 2021-11-02 10:52:36 +01:00
Glenn Johnson
cf50905ee9 modifications to docbook-xml (#27131)
- added more versions in case packages request a specific version of
  docbook-xml
- added a "current" alias to handle when packages use that
2021-11-02 10:50:07 +01:00
Glenn Johnson
f972863712 fsl: new version, updated constraints and patches (#27129)
- add version 6.0.5
- add patch to allow fsl to use newer gcc versions
- add patch to allow fsl to use newer cuda versions
- remove constraints on gcc and cuda versions
- add filters to prevent using system headers and libraries
- clean up the installed tree
2021-11-02 10:49:22 +01:00
Michael Kuhn
1e26e25bc8 spack arch: add --generic argument (#27061)
The `--generic` argument allows printing the best generic target for the
current machine. This can be quite handy when wanting to find the
generic architecture to use when building a shared software stack for
multiple machines.
2021-11-02 10:19:23 +01:00
Tamara Dahlgren
9d3d7c68fb Add tag filters to spack test list (#26842) 2021-11-02 10:00:21 +01:00
dependabot[bot]
94e0bf0112 build(deps): bump actions/checkout from 2.3.4 to 2.3.5 (#27135)
Bumps [actions/checkout](https://github.com/actions/checkout) from 2.3.4 to 2.3.5.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](5a4ac9002d...1e204e9a92)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-02 08:55:28 +01:00
iarspider
80807e6dd4 Add missing dependency to py-setuptools-scm (#27132) 2021-11-01 20:04:43 -06:00
Weiqun Zhang
840c9b7b4a amrex: add new release 21.11 (#27126) 2021-11-02 02:49:03 +01:00
Vicente Bolea
6a5ea3db99 vtkm: add v1.7.0-rc1 (#26882) 2021-11-02 02:42:10 +01:00
Manuela Kuhn
78a073218f r-tzdb: add new package (#27105) 2021-11-01 17:10:13 -05:00
Manuela Kuhn
881b2dcf05 r-rcpp: add 1.0.7 (#27088) 2021-11-01 17:09:18 -05:00
eugeneswalker
8bc01ff63c py-pylint needs pip for build (#27123)
* py-pylint: needs py-pip for build

* alphabetize py- dependencies

* add comment pointing to issue

* fix style
2021-11-01 21:50:36 +00:00
Andre Merzky
e0a929ba60 New package: exaworks (#27054) 2021-11-01 14:15:41 -07:00
Tamara Dahlgren
d4cecd9ab2 feature: add "spack tags" command (#26136)
This PR adds a "spack tags" command to output package tags or 
(available) packages with those tags. It also ensures each package
is listed in the tag cache ONLY ONCE per tag.
2021-11-01 20:40:29 +00:00
Chuck Atkins
b56f464c29 GCC 11 fixes (#27122)
* adios2: Fix compile errors for gcc 11

* unifyfs: Suppress bogus warnings for gcc 11

* conduit: Fix compile errors for gcc 11
2021-11-01 14:31:39 -06:00
Harmen Stoppels
6845307384 Pin actions to a commit sha instead of tag (#26538) 2021-11-01 10:44:03 -07:00
Bernhard Kaindl
02aa1d5d48 intel-gpu-tools: add v1.20 (#26588)
Add version 1.20, fix including the glib-2.0 header files
and add missing dependencies: libunwind and kmod.
2021-11-01 17:35:14 +01:00
Bernhard Kaindl
8ca411c749 glib: skip tests which we cannot make pass (#26693)
glib has a few tests which have external dependencies or
try to access the X server. We cannot run those.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-11-01 17:33:35 +01:00
Jon Rood
888de27210 kokkos-nvcc-wrapper: add HPE MPT to MPI environment variables (#27069) 2021-11-01 17:31:33 +01:00
Chuck Atkins
15d407c674 ci: Enable more packages in the DVSDK CI pipeline (#27025)
* ci: Enable more packages in the DVSDK CI pipeline

* doxygen: Add conflicts for gcc bugs

* dray: Add version constraints for api breakage with newer deps
2021-11-01 08:54:50 -07:00
Satish Balay
a1eb5596ec kokkos-kernels: add variant 'shared' (#27097)
* kokkos-kernels: add variant 'shared'

* Update var/spack/repos/builtin/packages/kokkos-kernels/package.py
2021-11-01 11:45:26 -04:00
iarspider
6c4f891b8f New version: py-typing-extensions 3.10.0.2 (#27120) 2021-11-01 10:04:18 -05:00
iarspider
99ee9a826a New version: py-beautifulsoup4 4.10.0 (#27121) 2021-11-01 10:02:57 -05:00
iarspider
8dcbd2ee6f alpgen: fix cms recipe (#26812) 2021-11-01 15:13:25 +01:00
Mahendra Paipuri
be0df5c47a ucx:add rocm variant (#26992)
Co-authored-by: mahendrapaipuri <mahendra.paipuri@inria.fr>
2021-11-01 15:05:56 +01:00
Ben Boeckel
80ba7040f8 python: detect Python from custom Xcode installations (#27023) 2021-11-01 07:55:15 -05:00
Hans Pabst
9094d6bb68 Updated LIBXSMM. (#27108) 2021-11-01 04:34:52 -06:00
Kenny Shen
3253faf01e cpp-argparse: new package (#27107) 2021-11-01 10:54:25 +01:00
Massimiliano Culpo
d73b1b9742 Fix caching of spack.repo.all_package_names() (#26991)
fixes #24522
2021-11-01 02:16:30 -06:00
Manuela Kuhn
b87678c2dd r-tidyr: add 1.1.4 (#27099) 2021-10-31 21:44:33 -05:00
Manuela Kuhn
8b8e7bd8e9 r-cpp11: add 0.4.0 (#27103) 2021-10-31 21:42:52 -05:00
Manuela Kuhn
200a1079d3 r-curl: add 4.3.2 (#27102) 2021-10-31 21:42:25 -05:00
Manuela Kuhn
3cd8381eb2 r-tidyselect: add 1.1.1 (#27101) 2021-10-31 21:41:16 -05:00
Manuela Kuhn
6b76b76d26 r-data-table: add 1.14.2 (#27100) 2021-10-31 21:40:36 -05:00
Manuela Kuhn
3406e8e632 r-dplyr: add 1.0.7 (#27098) 2021-10-31 21:40:07 -05:00
Manuela Kuhn
f363864f7f r-posterior: add new package (#27093) 2021-10-31 21:39:04 -05:00
Manuela Kuhn
543e5480a1 r-googledrive: add new package (#27095) 2021-10-31 21:37:26 -05:00
Manuela Kuhn
f8be56c941 r-withr: add 2.4.2 (#27094) 2021-10-31 21:35:41 -05:00
Manuela Kuhn
b6f6bbe371 r-nlme: add 3.1-153 (#27092) 2021-10-31 21:34:11 -05:00
Manuela Kuhn
ec8dc1830b r-bayesplot-1.8.1 (#27091) 2021-10-31 21:31:29 -05:00
Manuela Kuhn
86d2ab5240 r-mgcv: add 1.8-38 (#27090) 2021-10-31 21:30:34 -05:00
Manuela Kuhn
885e4d50e8 r-matrix: add 1.3-4 (#27089) 2021-10-31 21:29:54 -05:00
Manuela Kuhn
9397ca0a9f r-tibble: add 3.1.5 (#27087) 2021-10-31 21:25:19 -05:00
Manuela Kuhn
2b3207073d r-matrixstats: add 0.61.0 (#27086) 2021-10-31 21:23:10 -05:00
Manuela Kuhn
75a6d50fdc r-ids: add new package (#27104) 2021-10-31 21:21:54 -05:00
Bernhard Kaindl
6344d163b3 qt: @5.8:5.14.2 don't build with gcc@11, fix build of 5.6.3 (#27072)
5.14.2 fails with %gcc@11 with Error: 'numeric_limits' is not a class template
5.8.0 has multiple compile failures as well: Extend the conflict to those too.
- Also fix the configure of @5.6.3 (tested with %gcc@11)
2021-10-31 21:22:15 -04:00
Sinan
6c1f952bda package/qgis: fix runtime issue, improve package file, add new versions (#27084)
* package/qgis: fix runtime issue, improve package file, add new versions

* replace conflict with depends_on

* tidy up

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2021-10-31 18:24:44 -05:00
iarspider
f45ef21e37 New versions of py-gitpython: 3.1.13 - 3.1.24 (#27052)
* New versions of py-gitpython

* Update package.py
2021-10-30 11:19:51 -05:00
iarspider
30573f5e5c New versions: py-gitdb 4.0.7, 4.0.8, 4.0.9 (#27051)
* New versions: py-gitdb 4.0.7, 4.0.8, 4.0.9

* Update var/spack/repos/builtin/packages/py-gitdb/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-30 11:19:22 -05:00
iarspider
4262a66f5e New versions of py-google-auth and py-google-auth-oauthlib (#27056)
* New versions of py-google-auth and py-google-auth-oauthlib

* Update var/spack/repos/builtin/packages/py-google-auth/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-30 11:18:48 -05:00
iarspider
bdfb281a35 New version: py-deprecation 2.1.0 (#27001)
* New version: py-deprecation 2.1.0

* Update package.py
2021-10-30 07:25:31 -06:00
iarspider
0ef0a27ea0 New version: py-html5lib 1.1 (#27059)
* New verion: py-html5lib 1.1

* Update var/spack/repos/builtin/packages/py-html5lib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-html5lib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update var/spack/repos/builtin/packages/py-html5lib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-html5lib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-30 06:10:32 -06:00
Glenn Johnson
15e5508fcf r-vctrs: Fix checksums after setting url to CRAN (#27080)
This package had its url set to a github url and was then changed to a
CRAN url. The checksums need to change as a result.
2021-10-30 09:51:20 +02:00
Vasileios Karakasis
b1c4c1b6ca Add more ReFrame versions (#27082) 2021-10-30 09:17:30 +02:00
Jon Rood
c19a4e12c7 trilinos: Avoid Intel compiler segfaults in Trilinos with STK (#27073) 2021-10-29 22:03:46 -04:00
Kyle Gerheiser
c19514e540 w3emc package: add versions 2.9.2 and 2.7.3 (#27031) 2021-10-29 16:39:45 -07:00
Manuela Kuhn
657b9204ca New package: r-distributional (#27048) 2021-10-29 16:37:59 -07:00
Adam J. Stewart
1aaa7bd089 GDAL package: add version 3.3.3 (#27071) 2021-10-29 16:21:28 -07:00
iarspider
9448864c0e New versions: ipython 7.27.0 and 7.28.0 (#27066)
* New versions: ipython 7.27.0 and 7.28.0

* Changes from MR (1/2)

* Fix dep name (2/2)
2021-10-29 17:13:44 -06:00
Scott Wittenburg
f2a36bdf14 pipelines: llvm kills the xlarge, use huge (#27079) 2021-10-29 22:01:35 +00:00
Peter Scheibel
7eddf3ae9b For Spack commands that fail but don't throw exceptions, we were discarding the return code (#27077) 2021-10-29 14:14:41 -07:00
iarspider
b9e63c9f42 New version: py-importlib-resources 5.3.0 (#27064)
* New version: py-importlib-resources 5.3.0

* Update var/spack/repos/builtin/packages/py-importlib-resources/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-29 15:07:59 -06:00
Jen Herting
c38d3044fa New package: py-pypulse (#27026)
* python package py-pypulse created - per Andy

* URL fixed

* [py-pypulse]

- removed duplicate dependency
- updated copyright

* [py-pypulse] added version 0.1.1

* [py-pypulse] depends on setuptools

* [py-pypulse] url -> pypi

* [py-pypulse] simplified python dependency

Co-authored-by: Alex C Leute <aclrc@sporcsubmit.rc.rit.edu>
2021-10-29 14:58:58 -06:00
iarspider
ec23ce4176 New version: py-immutables 0.16 (#27060)
* New version: py-immutables 0.16

* Changes from review
2021-10-29 14:53:12 -06:00
Bernhard Kaindl
9a1626e54a py-slurm-pipeline: Add 4.0.4, Fix base.py: import six for @:3 (#26968)
* py-slurm-pipeline: Add 4.0.4, Fix base.py: import six for @:3

Before 4.0.0, slurm_pipeline/base.py has: `from six import string_types`

* Added depends_on('py-pytest@6.2.2:', type='build') as requested by Adam

* remove comment requested to be removed
2021-10-29 14:52:57 -06:00
Jon Rood
91408ef6ad trilinos: add MPICH and HPE MPT MPI environment variables (#27070) 2021-10-29 14:05:10 -06:00
genric
749640faf9 py-ipyparallel: add 7.1.0 (#26984) 2021-10-29 14:10:36 -05:00
Simon Pintarelli
494ba67704 sirius: add mpi datatypes patch for recent cray mpich (#27065) 2021-10-29 12:31:56 -06:00
Baptiste Jonglez
b8bc030a3c py-torch: Add a breakpad variant, disable it for ppc64 and ppc64le (#26990) 2021-10-29 13:00:48 -05:00
Massimiliano Culpo
3eb52b48b8 config add: infer type based on JSON schema validation errors (#27035)
- [x] Allow adding enumerated types and types whose default value is forbidden by the schema
- [x] Add a test for using enumerated types in the tests for `spack config add`
- [x] Make `config add` tests use the `mutable_config` fixture so they do not
      affect other tests

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-10-29 18:44:49 +02:00
Manuela Kuhn
20b7485cd4 r-cli: add 3.0.1 (#27047) 2021-10-29 10:57:14 -05:00
Manuela Kuhn
d36b045e2d r-afex: add 1.0-1 (#27032) 2021-10-29 10:55:46 -05:00
Thomas Madlener
874f06e29c curl: fix mbedtls versions and certs config (#26877)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-10-29 08:55:48 -06:00
Jon Rood
962d06441e m4: set times of m4.texi to fix build on Summit GPFS (#26994)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-29 16:05:41 +02:00
Bernhard Kaindl
fd10e54409 libssh2: skip the testsuite when docker is not installed (#26962)
The build-time testsuite, which would be run when building
with tests, needs docker. Check that it exists before
attempting to execute the tests.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-10-29 07:04:40 -06:00
Harmen Stoppels
49034abd76 Fix exit codes posix shell wrapper (#27012)
* Correct exit code in sh wrapper

* Fix tests

* SC2069
2021-10-29 08:10:22 +00:00
Harmen Stoppels
f610b506cd elfio: add minimum version requirements to cmake (#27033) 2021-10-29 09:28:19 +02:00
Axel Huebl
5a7496eb82 WarpX & HiPACE++: Constrain FFTW for No-MPI (#27043)
Constrain FFTW for no-MPI builds to simplify the build and handling logic.
2021-10-29 09:19:23 +02:00
Valentin Churavy
7a6a232730 add Julia 1.7.0-rc2 (#27016) 2021-10-29 08:24:38 +02:00
Manuela Kuhn
4031ea5e03 r-gargle: add new (#27046) 2021-10-29 05:19:50 +00:00
Manuela Kuhn
852e14a655 r-openssl: add 1.4.5 (#27045) 2021-10-29 04:17:35 +00:00
Manuela Kuhn
5b2d9ae0b0 r-car: add 3.0-11 (#27044) 2021-10-29 04:14:06 +00:00
Manuela Kuhn
dc3b818192 r-future: add 1.22.1 (#27041) 2021-10-29 03:58:00 +00:00
Manuela Kuhn
5c1874c387 r-pillar: add 1.6.4 (#27040) 2021-10-29 03:47:42 +00:00
Manuela Kuhn
fcf6a46aeb r-digest: add 0.6.28 (#27039) 2021-10-29 03:00:10 +00:00
Manuela Kuhn
180bb5003c r-farver: add 2.1.0 (#27038) 2021-10-29 02:46:55 +00:00
Manuela Kuhn
2e880c5513 r-ggplot2: add 3.3.5 (#27037) 2021-10-29 02:36:49 +00:00
Manuela Kuhn
71412f547d r-generics: add 0.1.1 (#27034) 2021-10-29 01:44:13 +00:00
Harmen Stoppels
574395af93 Fix exit codes in fish (#27028) 2021-10-29 01:10:31 +00:00
Manuela Kuhn
c04b2fa26a r-rappdirs: add 0.3.3 (#26989) 2021-10-28 17:55:38 -05:00
Manuela Kuhn
19c77f11b6 r-emmeans: add 1.7.0 (#26987) 2021-10-28 17:54:24 -05:00
Manuela Kuhn
1f4ada6b22 r-fansi: add 0.5.0 (#26986) 2021-10-28 17:51:45 -05:00
Manuela Kuhn
4ff87ea04f r-lifecycle: add 1.0.1 (#26975) 2021-10-28 17:42:56 -05:00
Manuela Kuhn
d5520a264b r-parallelly: add 1.28.1 (#26974) 2021-10-28 17:40:58 -05:00
Manuela Kuhn
999044033b r-vctrs: add 0.3.8 (#26966) 2021-10-28 17:39:09 -05:00
Todd Gamblin
233dabbd4f bugfix: config edit should work with a malformed spack.yaml
If you don't format `spack.yaml` correctly, `spack config edit` still fails and
you have to edit your `spack.yaml` manually.

- [x] Add some code to `_main()` to defer `ConfigFormatError` when loading the
  environment, until we know what command is being run.

- [x] Make `spack config edit` use `SPACK_ENV` instead of the config scope
  object to find `spack.yaml`, so it can work even if the environment is bad.

Co-authored-by: scheibelp <scheibel1@llnl.gov>
2021-10-28 15:37:44 -07:00
Todd Gamblin
374e3465c5 bugfix: spack config get <section> in environments
`spack config get <section>` was erroneously returning just the `spack.yaml`
for the environment.

It should return the combined configuration for that section (including
anything from `spack.yaml`), even in an environment.

- [x] reorder conditions in `cmd/config.py` to fix
2021-10-28 15:37:44 -07:00
Todd Gamblin
2bd513d659 config: ensure that options like --debug are set first
`spack --debug config edit` was not working properly -- it would not show a
stack trace for configuration errors.

- [x] Rework `_main()` and add some notes for maintainers on where things need
      to go for configuration to work properly.
- [x] Move config setup to *after* command-line parsing is done.

Co-authored-by: scheibelp <scheibel1@llnl.gov>
2021-10-28 15:37:44 -07:00
Todd Gamblin
56ad721eb5 errors: Rework error handling in main()
`main()` has grown, and in some cases code that can generate errors has gotten
outside the top-level try/catch in there. This means that simple errors like
config issues give you large stack traces, which shouldn't happen without
`--debug`.

- [x] Split `main()` into `main()` for the top-level error handling and
      `_main()` with all logic.
2021-10-28 15:37:44 -07:00
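The entry above describes splitting the top-level entry point so that error handling lives in `main()` and all logic lives in `_main()`. A minimal sketch of that pattern follows, assuming a `SpackError`-like base class and a `--debug` flag; the argument handling and exit codes here are illustrative, not the actual Spack implementation.

```python
import sys
import traceback


class SpackError(Exception):
    """Stand-in for Spack's error base class (assumed name)."""


def _main(argv):
    # All real logic lives here, including code that may raise
    # configuration or command errors.
    if argv and argv[0] == "--fail":
        raise SpackError("simple configuration problem")
    print("command ran fine")
    return 0


def main(argv=None):
    # Thin top-level wrapper: its only job is error handling, so that
    # simple errors print one line instead of a full stack trace.
    argv = sys.argv[1:] if argv is None else argv
    debug = "--debug" in argv
    try:
        return _main([a for a in argv if a != "--debug"])
    except SpackError as e:
        if debug:
            traceback.print_exc()   # full trace only with --debug
        else:
            print("Error: %s" % e, file=sys.stderr)
        return 1


if __name__ == "__main__":
    sys.exit(main())
```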
Manuela Kuhn
732be7dec6 r-ellipsis: add 0.3.2 (#26965) 2021-10-28 17:36:10 -05:00
Manuela Kuhn
895bd75762 r-rlang: add 0.4.12 (#26963) 2021-10-28 17:34:06 -05:00
Manuela Kuhn
dd1eb7ea18 r-mvtnorm: add 1.1-3 (#26957) 2021-10-28 17:31:43 -05:00
Manuela Kuhn
494dc0bd13 r-lme4: add 1.1-27.1 (#26955)
* r-lme4: add 1.1-27.1

* Use cran instead of explicit url
2021-10-28 17:30:21 -05:00
Manuela Kuhn
4b2564a2d6 llvm: fix gcc11 build for @11 (#27013) 2021-10-28 16:07:56 -06:00
iarspider
eb2d44a57b New versions of py-flake8 and py-pyflakes (#27008)
* New versions of py-flake8 and py-pyflakes

* Changes from review
2021-10-28 21:56:28 +00:00
Todd Gamblin
a1216138f6 config: fix SPACK_DISABLE_LOCAL_CONFIG, remove $user_config_path (#27022)
There were some loose ends left in #26735 that caused errors when
using `SPACK_DISABLE_LOCAL_CONFIG`.

- [x] Fix hard-coded `~/.spack` references in `install_test.py` and `monitor.py`

Also, if `SPACK_DISABLE_LOCAL_CONFIG` is used, there is the issue that
`$user_config_path`, when used in configuration files, makes no sense,
because there is no user config scope.

Since we already have `$user_cache_path` in configuration files, and since there
really shouldn't be *any* data stored in a configuration scope (which is what
you'd configure in `config.yaml`/`bootstrap.yaml`/etc.), this just removes
`$user_config_path`.

There will *always* be a `$user_cache_path`, as Spack needs to write files, but
we shouldn't rely on the existence of a particular configuration scope in the
Spack code, as scopes are configurable, both in number and location.

- [x] Remove `$user_config_path` substitution.
- [x] Fix reference to `$user_config_path` in `etc/spack/defaults/bootstrap.yaml`
      to refer to `$user_cache_path`, which is where it was intended to be.
2021-10-28 21:33:44 +00:00
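As an illustration of the behavior described above, here is a minimal, hypothetical sketch of resolving the user scope and cache path from the environment variables named in the commit; the function names and default paths are assumptions for the example, not Spack's actual code.

```python
import os


def user_config_scope():
    # With SPACK_DISABLE_LOCAL_CONFIG set, there is no user scope at all,
    # which is why a $user_config_path substitution makes no sense.
    if os.environ.get("SPACK_DISABLE_LOCAL_CONFIG"):
        return None
    return os.environ.get("SPACK_USER_CONFIG_PATH",
                          os.path.expanduser("~/.spack"))


def user_cache_path():
    # The cache path always exists, because Spack always needs to write
    # files -- it is independent of any configuration scope.
    return os.environ.get("SPACK_USER_CACHE_PATH",
                          os.path.expanduser("~/.spack"))


if __name__ == "__main__":
    print("user scope:", user_config_scope())
    print("user cache:", user_cache_path())
```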
Daryl W. Grunau
d0e177e711 depend on libevent when +pmix (#27020)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2021-10-28 23:19:19 +02:00
Phil Carns
7fd1d2b03f mochi-margo: add version 0.9.6 (#26951) 2021-10-28 23:11:30 +02:00
Valentin Churavy
7f8b0d4820 add MPItrampoline 2.0.0 (#27019) 2021-10-28 22:20:38 +02:00
iarspider
612639534a New version: py-distro 1.6.0 (#27003) 2021-10-28 14:38:36 -05:00
iarspider
f512bb1dc1 New versions: docutils 0.17, 0.17.1, 0.18 (#27005) 2021-10-28 13:53:02 -05:00
Harmen Stoppels
6d030ba137 Deactivate previous env before activating new one (#25409)
* Deactivate previous env before activating new one

Currently on develop you can run `spack env activate` multiple times to switch
between environments, but they leave traces, even though Spack only supports
one active environment at a time.

Currently:

```console
$ spack env create a
$ spack env create b
$ spack env activate -p a
[a] $ spack env activate -p b
[b] [a] $ spack env activate -p b
[b] [b] [a] $ spack env activate -p a
[a] [b] [b] [a] $ echo $MANPATH | tr ":" "\n"
/path/to/environments/a/.spack-env/view/share/man
/path/to/environments/a/.spack-env/view/man
/path/to/environments/b/.spack-env/view/share/man
/path/to/environments/b/.spack-env/view/man
```

This PR fixes that:

```console
$ spack env activate -p a
[a] $ spack env activate -p b
[b] $ spack env activate -p a
[a] $ echo $MANPATH | tr ":" "\n"
/path/to/environments/a/.spack-env/view/share/man
/path/to/environments/a/.spack-env/view/man
```
2021-10-28 11:39:25 -07:00
Tom Scogland
87e456d59c spack setup-env.sh: make zsh loading async compatible, and ~10x faster (in some cases) (#26120)
Currently spack is a bit of a bad actor as a zsh plugin, and it was my
fault.  The autoload and compinit should really be handled by the user,
as was made abundantly clear when I found spack was doing completion
initialization for *all* of my plugins due to a deferred setup that was
getting messed up by it.

Making this conditional took spack load time from 1.5 seconds (with
module loading disabled) to 0.029 seconds. I can actually afford to load
spack by default with this change in.

Hopefully someday we'll do proper zsh completion support, but for now
this helps a lot.

* use zsh hist expansion in place of dirname
* only run (bash)compinit if compdef/complete missing
* add zsh compiled files to .gitignore
* move changes to .in file, because spack
2021-10-28 11:32:59 -07:00
iarspider
5faa457a35 New version: py-fasteners 0.16.3 (#27006) 2021-10-28 13:15:50 -05:00
Harmen Stoppels
8ca3a0fdf8 Remove failing macOS test (#27009) 2021-10-28 09:30:51 -07:00
Brent Huisman
63fcb0331b Add Pybind11 v2.8 (#26867)
* Add Pybind11 v2.8

* Add Python dependency

* Update package.py

* Added Pybind v2.8.1
2021-10-28 16:14:49 +00:00
Robert Blackwell
8fd94e3114 YamlFilesystemView: improve file removal performance via batching (#24355)
* Drastically improve YamlFilesystemView file removal via batching

The `remove_file` routine has to check if the file is owned by multiple packages, so it doesn't
remove necessary files. This is done by the `get_all_specs` routine, which walks the entire
package tree. With large numbers of packages on shared file systems, this can take seconds
per file tree traversal, which adds up extremely quickly. For example, a single deactivate
of a largish python package in our software stack on GPFS took approximately 40 minutes.

This patch simply replaces `remove_file` with a batch `remove_files` routine. This routine
removes a list of files rather than a single file, requiring only one traversal per batch. In
practice this means a package can be removed in seconds time, rather than potentially hours,
essentially a ~100x speedup (ignoring initial deactivation logic, which takes about 3 minutes
in our test setup).
2021-10-28 07:39:16 -07:00
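A hedged sketch of the batching idea described above: traverse the package tree once per batch instead of once per file. The `get_all_specs` helper and the ownership check below are simplified stand-ins, not the real YamlFilesystemView API.

```python
import os


def get_all_specs(view_root):
    """Stand-in for the expensive walk over every package in the view."""
    return [e.name for e in os.scandir(view_root) if e.is_dir()]


def remove_file(view_root, rel_path, owners=None):
    """Old style: each call pays for a full traversal via get_all_specs()."""
    owners = get_all_specs(view_root) if owners is None else owners
    # Hypothetical ownership check: keep files shared by several packages.
    if len([o for o in owners if rel_path.startswith(o)]) > 1:
        return
    path = os.path.join(view_root, rel_path)
    if os.path.lexists(path):
        os.remove(path)


def remove_files(view_root, rel_paths):
    """Batched style: one traversal for the whole list of files."""
    owners = get_all_specs(view_root)
    for rel_path in rel_paths:
        remove_file(view_root, rel_path, owners=owners)
```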
Harmen Stoppels
c13f915735 cmake: add v3.21.4, v3.20.6 (#27004) 2021-10-28 13:59:46 +00:00
Cameron Stanavige
afbc67fdd8 dtcmp & lwgrp: add shared variant (#26999) 2021-10-28 15:00:54 +02:00
Michael Kuhn
e9f3ef785d Fix sbang hook for non-writable files (#27007)
* Fix sbang hook for non-writable files

PR #26793 seems to have broken the sbang hook for files with missing
write permissions. Installing perl now breaks with the following error:
```
==> [2021-10-28-12:09:26.832759] Error: PermissionError: [Errno 13] Permission denied: '$SPACK/opt/spack/linux-fedora34-zen2/gcc-11.2.1/perl-5.34.0-afuweplnhphcojcowsc2mb5ngncmczk4/bin/cpanm'
```

Temporarily add write permissions to the original file so it can be
overwritten with the patched one.

And test that file permissions are preserved in sbang even for non-writable files

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-10-28 14:49:23 +02:00
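A minimal sketch, under assumptions, of the "temporarily add write permission" fix described above; it is not the actual hook code, and `patch_shebang` is a placeholder for whatever rewriting sbang performs.

```python
import os
import stat


def patch_sbang_preserving_permissions(path, patch_shebang):
    """Temporarily make `path` writable, patch it, then restore the mode."""
    saved_mode = os.stat(path).st_mode
    try:
        if not saved_mode & stat.S_IWUSR:
            # A script like perl's cpanm may be installed read-only.
            os.chmod(path, saved_mode | stat.S_IWUSR)
        patch_shebang(path)            # placeholder for the real rewrite
    finally:
        # Always restore the original permissions, writable or not.
        os.chmod(path, saved_mode)
```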
Erik Schnetter
7c886bcac1 nsimd: add v3.0.1, determine SIMD variant automatically by default (#26850) 2021-10-28 14:38:48 +02:00
Paul Ferrell
4ee37c37de buildcaches: fix directory link relocation (#26948)
When relocating a binary distribution, Spack only checks files to see
if they are a link that needs to be relocated. Directories can be
such links as well, however, and need to undergo the same checks
and potential relocation.
2021-10-28 14:34:31 +02:00
Seth R. Johnson
890095e876 llvm: use cmake helper functions (#26988)
* llvm: use cmake helper functions

* llvm: review feedback
2021-10-27 20:26:22 +00:00
iarspider
7416df692a New versions: py-cffi 1.15.0, 1.14.6 (#26979)
* New versions: py-cffi 1.15.0, 1.14.6

* Changes from review
2021-10-27 15:01:15 -05:00
iarspider
dd0770fd64 New versions of py-cachetools (#26976)
* New versions of py-cachetools

* Changes from review
2021-10-27 15:00:48 -05:00
iarspider
704c94429b New version: py-bottle@0.12.19 (#26973)
* New version: py-bottle@0.12.19

* Changes from review
2021-10-27 15:00:19 -05:00
Mark W. Krentel
21d909784c hpcviewer: add support for macosx, add version 2021.10 (#26823) 2021-10-27 19:51:14 +00:00
Kyle Gerheiser
e35eacf87b Add w3emc version 2.9.1 (#26880) 2021-10-27 12:05:06 -06:00
iarspider
dc40405fd6 New version: py-contextlib2 21.6.0 (#26985) 2021-10-27 11:08:25 -06:00
Massimiliano Culpo
3d5444fdd8 Remove documentation tests from GitHub Actions (#26981)
We moved documentation tests to readthedocs a while ago,
so remove the one on GitHub.
2021-10-27 19:02:52 +02:00
iarspider
80d4a83636 New versions: py-bokeh@2.3.3, 2.4.0, 2.4.1 (#26972) 2021-10-27 11:59:42 -05:00
iarspider
39f46b1c3b New version: py-certifi 2021.10.8 (#26978) 2021-10-27 11:54:37 -05:00
H. Joe Lee
1fcc9c6552 hdf5-vol-log: add new package (#26956) 2021-10-27 16:48:44 +00:00
iarspider
1842785eae New version: py-commonmark 0.9.1 (#26983) 2021-10-27 11:47:28 -05:00
Mosè Giordano
acb8ab338d fftw: add v3.3.10 (#26982) 2021-10-27 15:29:53 +00:00
Bernhard Kaindl
d1803af957 fontconfig: add v2.13.94 and fix test with dash (#26961)
Fix install --test=root with /bin/sh -> dash: A test uses
SIGINT SIGTERM SIGABRT EXIT for trap -> use signal numbers
2021-10-27 16:20:57 +02:00
Bernhard Kaindl
f4431851ab perl-extutils-installpaths: depend on perl-extutils-config (#26969) 2021-10-27 16:02:14 +02:00
Pieter Ghysels
1f728ab4ce strumpack: add v6.1.0, remove unused variants (#26971) 2021-10-27 15:58:54 +02:00
Valentin Volkl
9fa20b8a39 recola: fix compilation (#26634)
* recola: fix compilation

* Update var/spack/repos/builtin/packages/recola-sm/package.py

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* flake8

* fixes

* fix typo

* fix typo

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2021-10-27 07:19:35 -06:00
Todd Gamblin
4f124bc9e7 tests: speed up spack list tests (#26958)
`spack list` tests are not using mock packages for some reason, and many
are marked as potentially slow. This isn't really necessary; we don't need
6,000 packages to test the command.

- [x] update tests to use `mock_packages` fixture
- [x] remove `maybeslow` annotations
2021-10-27 05:10:39 -06:00
Harmen Stoppels
e04b172eb0 Allow non-UTF-8 encoding in sbang hook (#26793)
Currently Spack reads full files containing shebangs to memory as
strings, meaning Spack would have to guess their encoding. Currently
Spack has a fixed guess of UTF-8.

This is unnecessary, since e.g. the Linux kernel does not assume an
encoding on paths at all, it's just bytes and some delimiters on the
byte level.

This commit does the following:

1. Shebangs are treated as bytes, so that e.g. latin1-encoded files do
not throw Unicode decoding errors; a test for this is added.
2. No more bytes than necessary are read into memory: we only have to read
until the first newline, and from there on we can copy the file byte by
byte instead of decoding and re-encoding text.
3. We cap the number of bytes read at 4096; if no newline is found
before that, we don't attempt to patch the file.
4. Add support for luajit too.

This should make Spack both more efficient and usable for non-UTF8
files.
2021-10-27 02:59:10 -07:00
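A hedged sketch of the byte-level shebang handling listed in points 1-3 above; the 4096-byte cap comes from the commit message, while the function name and the exact checks are illustrative assumptions.

```python
SHEBANG_LIMIT = 4096  # cap from the commit message


def read_shebang(path):
    """Return the first line as bytes, or None if it should not be patched.

    Working on bytes avoids guessing an encoding: binary or
    latin1-encoded files no longer raise UnicodeDecodeError while
    being inspected.
    """
    with open(path, "rb") as f:
        prefix = f.read(2)
        if prefix != b"#!":
            return None               # not a shebang at all
        rest = f.read(SHEBANG_LIMIT - 2)
    newline = rest.find(b"\n")
    if newline == -1:
        return None                   # no newline within 4096 bytes: skip
    return prefix + rest[:newline]
```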
Harmen Stoppels
2fd87046cd Fix assumption v.concrete => isinstance(v, Version) (#26537)
* Add test
* Only extend with Git version when using Version
* xfail v.concrete test
2021-10-27 02:58:04 -07:00
Harmen Stoppels
ae6e83b1d5 config: overrides for caches and system and user scopes (#26735)
Spack's `system` and `user` scopes provide ways for administrators and
users to set global defaults for all Spack instances, but for use cases
where one wants a clean Spack installation, these scopes can be undesirable.
For example, users may want to opt out of global system configuration, or
they may want to ignore their own home directory settings when running in
a continuous integration environment.

Spack also, by default, keeps various caches and user data in `~/.spack`,
but users may want to override these locations.

Spack provides three environment variables that allow you to override or
opt out of configuration locations:

 * `SPACK_USER_CONFIG_PATH`: Override the path to use for the
   `user` (`~/.spack`) scope.

 * `SPACK_SYSTEM_CONFIG_PATH`: Override the path to use for the
   `system` (`/etc/spack`) scope.

 * `SPACK_DISABLE_LOCAL_CONFIG`: set this environment variable to completely
   disable *both* the system and user configuration directories. Spack will
   only consider its own defaults and `site` configuration locations.

And one that allows you to move the default cache location:

 * `SPACK_USER_CACHE_PATH`: Override the default path to use for user data
   (misc_cache, tests, reports, etc.)

With these settings, if you want to isolate Spack in a CI environment, you can do this:

   export SPACK_DISABLE_LOCAL_CONFIG=true
   export SPACK_USER_CACHE_PATH=/tmp/spack

This is a stop-gap approach until we have figured out how to deal with
the system and user config scopes more generally, as there are plans to
potentially / eventually get rid of them.

**User config**

Spack is a bit of a pain when you have:

- a shared $HOME folder across different systems.
- multiple Spack versions on the same system.

**System config**

- On shared systems with a versioned programming environment / toolkit,
  system administrators want to provide config for each version (e.g.
  21.09, 21.10) of the programming environment, and the user Spack
  instance should be able to pick this up without a steep learning
  curve.
- On shared systems the user should be able to opt out of the
  hard-coded config scope in /etc/spack, since it may be incompatible
  with their particular instance. Currently Spack can only opt out of all
  config scopes through overrides with `"config:":`, `"packages:":`, but that
  also drops the defaults config, which would then have to be repeated; this
  is undesirable, especially for the lengthy packages.yaml.

An example use case is: having config in this folder:

```
/path/to/programming/environment/{version}/{compilers,packages}.yaml
```

and have `module load spack-system-config` set the variable

```
SPACK_SYSTEM_CONFIG_PATH=/path/to/programming/environment/{version}
```

where the user no longer has to worry about what `{version}` they are
on.

**Continuous integration**

Finally, there is the use case of continuous integration, which may
clone an arbitrary Spack version, which optimally should not pick up
system or user config from the previous run (like may happen in
classical bare metal non-containerized filesystem side effect ridden
jenkins pipelines). In fact this is very similar to how spack itself
tries to avoid picking up system dependencies during builds...

**But environments solve this?**

- You could use `include`s in environment files to get behavior similar
  to the SPACK_SYSTEM_CONFIG_PATH example, but environment includes
  1) require paths to individual config files, not directories, and
  2) fail if the listed config file does not exist.
- They allow you to override config scopes, but this is generally too
  rigorous, as it requires you to repeat the default config, in
  particular packages.yaml, and defeats the point of layered config.

Co-authored-by: Tom Scogland <tscogland@llnl.gov>
Co-authored-by: Tim Fuller <tjfulle@sandia.gov>
Co-authored-by: Steve Leak <sleak@lbl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-10-26 18:08:25 -07:00
Ye Luo
7dc94b6683 Build OpenMP in LLVM via LLVM_ENABLE_RUNTIMES. (#26870) 2021-10-26 17:43:28 -06:00
Hao Lyu
02bea6d2d2 fix bug when there is version id in the path of compiler (#26916) 2021-10-26 14:12:58 -07:00
Greg Becker
9a637bbd09 modules: allow user to remove arch dir (#24156)
* allow no arch-dir modules

* add tests for modules with no arch

* document arch-specific module roots
2021-10-26 13:26:09 -07:00
Mark W. Krentel
444e156685 hpctoolkit: add version 2021.10.15 (#26881) 2021-10-26 13:15:09 -07:00
Richarda Butler
bc616a60b7 Py-Libensemble: Add E4S testsuite stand alone test (#26270) 2021-10-26 13:11:53 -07:00
Miroslav Stoyanov
ad03981468 fix the spack test dir (#26816) 2021-10-26 13:07:04 -07:00
Sreenivasa Murthy Kolam
5ee2ab314c ROCm packages: add RelWithDebInfo build_type (#26888)
Also set default build_type to Release for many ROCm packages.
2021-10-26 12:18:38 -07:00
Greg Becker
a8a08f66ad modules: configurable module defaults (#24367)
Any spec satisfying a default will be symlinked to `default`

If multiple specs have modulefiles in the same directory and satisfy
configured module defaults, then whichever was written last will be
default.
2021-10-26 19:34:06 +02:00
Kyle Gerheiser
dee75a4945 upp: Add version 10.0.10 (#26946) 2021-10-26 19:26:34 +02:00
Seth R. Johnson
aacdd5614e htop: add new URL and versions (#26928) 2021-10-26 19:14:06 +02:00
Ben Morgan
a2e5a28892 virtest: prevent out-of-order build/test (#26944)
Use of `-R` flag to CTest command causes "empty-14" test to run,
by matching "empty", before the empty-14 target is built.

Patch CTest command in buildscript to match name exactly.
2021-10-26 18:55:54 +02:00
Asher Mancinelli
94a9733822 hiop: update constraints and add new version (#26905)
* Update hiop package dependencies

* Use single quotes to shrink diff

* add hiop 0.5.1

* apply flake8

* Apply formatting suggestions
2021-10-26 11:43:03 -04:00
iarspider
fb1f3b1a1c Add checksum for py-argon2-cffi 21.1.0 and update python dependency (#26894)
* Add checksum for py-argon2-cffi 21.1.0 and update python dependency

* Update package.py
2021-10-26 09:32:11 -06:00
iarspider
d673e634d0 New versions: py-bleach 4.0.0 and 4.1.0 (#26947) 2021-10-26 09:23:10 -06:00
iarspider
1f1f121e8f New version: py-beniget 0.4.1 (#26945) 2021-10-26 09:02:03 -06:00
Wouter Deconinck
ba2a03e1da geant4: depends_on vecgeom@1.1.8:1.1 range (#26917)
* [geant4] depends_on vecgeom@1.1.8:1.1 range

While previous versions were unclear, the [geant4.10.7 release notes](https://geant4-data.web.cern.ch/ReleaseNotes/ReleaseNotes4.10.7.html) indicate that vecgeom@1.1.8 is a minimum required version, not an exact required version ("Set VecGeom-1.1.8 as minimum required version for optional build with VecGeom."). This will allow some more freedom on the concretizer solutions while allowing geant4 to take advantage of bugfixes and improvements in vecgeom.

* [vecgeom] new version 1.1.17
2021-10-26 10:31:37 -04:00
Seth R. Johnson
dad68e41e0 freetype: explicitly specify dependencies (#26942)
Freetype picked up 'brotli' from homebrew, causing issues downstream.
2021-10-26 04:07:48 -06:00
eugeneswalker
1ae38037ef tau: add version 2.30.2 (#26941) 2021-10-26 05:36:29 -04:00
Chuck Atkins
80d8c93452 unifyfs: Remove the hdf5 variant (examples only) (#26932)
UnifyFS doesn't actually use HDF5 for anything. Enabling the variant only allows
a few examples to be built, so HDF5 is not actually a dependency of the package.
2021-10-26 09:42:52 +02:00
Seth R. Johnson
bdcbc4cefe qt package: versions @:5.13 don't build with gcc@11: (#26922) 2021-10-25 18:20:54 -07:00
Ronak Buch
1d53810d77 charmpp: add version 7.0.0 (#26940) 2021-10-25 18:16:37 -07:00
Adam J. Stewart
4a8c53472d py-kornia: add version 0.6.1 (#26939) 2021-10-25 18:15:57 -07:00
Adam J. Stewart
b05df2cdc7 py-shapely: add version 1.8.0 (#26937) 2021-10-25 18:15:22 -07:00
Adam J. Stewart
92cef8d7ad py-scikit-learn: add version 1.0.1 (#26934) 2021-10-25 18:14:17 -07:00
Adam J. Stewart
912c6ff6e8 PyTorch/torchvision: add version 1.10.0/0.11.1 (#26889)
* For py-torch: Also update dependencies: many version constraints
  with an upper bound of @1.9 are now open (e.g. `@1.8.0:1.9` is
  converted to `@1.8.0:`).
* For py-torchvision: Also add 0.11.0 and update ^pil constraint
  to avoid building with 8.3.0
2021-10-25 15:26:12 -07:00
iarspider
cdcdd71b41 Add new versions of py-autopep8 and py-pycodestyle (#26924)
* Add new versions of py-autopep8 (1.5.7, 1.6.0) and py-pycodestyle (2.7.0, 2.8.0)

* Update package.py

* Restore old versions
2021-10-25 22:08:52 +00:00
Daniel Arndt
e221617386 deal.II package: Update CMake variable for >=9.3.0 (#26909) 2021-10-25 15:00:40 -07:00
Massimiliano Culpo
6063600a7b containerize: pin the Spack version used in a container (#21910)
This PR makes it possible to specify the `url` and `ref` of the Spack instance used in a container recipe, simply by expanding the YAML schema as outlined in #20442:
```yaml
container:
  images:
    os: amazonlinux:2
    spack:
      ref: develop
      resolve_sha: true
```
The `resolve_sha` option, if true, verifies the `ref` by cloning the Spack repository in a temporary directory and transforming any tag or branch name to a commit sha. When this new ability is leveraged, an additional "bootstrap" stage is added, which builds an image with Spack set up and ready to install software. The Spack repository to be used can be customized with the `url` keyword under `spack`.

Modifications:
- [x] Permit pinning the version of Spack, either by branch, tag, or sha
- [x] Added a few new OSes (centos:8, amazonlinux:2, ubuntu:20.04, alpine:3, cuda:11.2.1)
- [x] Permit printing the bootstrap image as a standalone
- [x] Add documentation on the new part of the schema
- [x] Add unit tests for different use cases
2021-10-25 13:09:27 -07:00
iarspider
ff65e6352f py-avro: new version 1.10.2 (#26927)
* Add checksum for py-avro@1.10.2

* Update package.py
2021-10-25 11:45:33 -05:00
Olli Lupton
06c983d38f cuda: add 11.4.1, 11.4.2, 11.5.0. (#26892)
* cuda: add 11.4.1, 11.4.2, 11.5.0.

Note that the curses dependency from cuda-gdb was dropped in 11.4.0.

* Update clang/gcc constraints.

* Address review, assume clang 12 is OK from 11.4.1 onwards.

* superlu-dist@7.1.0 conflicts with cuda@11.5.0.

* Update var/spack/repos/builtin/packages/superlu-dist/package.py

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-10-25 09:05:21 -07:00
iarspider
b0752bf1b3 Add checksum for py-atomicwrites@1.4.0 (#26923) 2021-10-25 10:54:04 -05:00
Kendra Long!
c9847766e4 draco: new version 7_12_0 (#26907)
* Add draco-7_12_0 to package file

* Update hash to zip version
2021-10-25 11:35:38 -04:00
Kelly (KT) Thompson
86a9c703db eospac: new default version 6.5.0 (#26723)
* [pkg][new version] Provide eospac@6.5.0 and mark it as default.

* Merge in changes found in #21629

* Mark all alpha/beta versions as deprecated.

- Addresses @sethrj's recommendation
- Also add a note indicating why these versions are marked this way.
2021-10-25 11:10:46 -04:00
Harmen Stoppels
276e637522 Reduce verbosity of module files warning
1. Currently it prints not just the spec name, but the dependencies +
their variants + their compilers + their architectures + ...
2. It's clear from the context what spec the message applies to, so,
let's not print the spec at all.
2021-10-25 17:07:56 +02:00
Olivier Cessenat
b51e0b363e silo: new release 4.11 (#26876) 2021-10-25 08:13:54 -06:00
Manuela Kuhn
e9740a9978 py-nipype: add 1.7.0 (#26883) 2021-10-25 06:55:50 -06:00
Harmen Stoppels
cc8d8cc9cb Return early in do_fetch when there is no code or a package is external (#26926)
Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
2021-10-25 13:51:23 +02:00
Todd Gamblin
de8e795563 virtuals: simplify virtual handling
These three rules in `concretize.lp` are overly complex:

```prolog
:- not provider(Package, Virtual),
   provides_virtual(Package, Virtual),
   virtual_node(Virtual).
```

```prolog
  :- provides_virtual(Package, V1), provides_virtual(Package, V2), V1 != V2,
     provider(Package, V1), not provider(Package, V2),
     virtual_node(V1), virtual_node(V2).
```

```prolog
provider(Package, Virtual) :- root(Package), provides_virtual(Package, Virtual).
```

and they can be simplified to just:

```prolog
provider(Package, Virtual) :- node(Package), provides_virtual(Package, Virtual).
```

- [x] simplify virtual rules to just one implication
- [x] rename `provides_virtual` to `virtual_condition_holds`
2021-10-25 09:11:04 +02:00
Massimiliano Culpo
6d69d23aa5 Add a unit test to prevent regression 2021-10-25 09:11:04 +02:00
Massimiliano Culpo
dd4d7bae1d ASP-based solver: a package eligible to provide a virtual must provide it
fixes #26866

This semantics fits with the way Spack currently treats providers of
virtual dependencies. It needs to be revisited when #15569 is reworked
with a new syntax.
2021-10-25 09:11:04 +02:00
Miguel Dias Costa
1e90160d68 berkeleygw: force openmp propagation on some providers of blas / fftw-api (#26918) 2021-10-25 07:42:05 +02:00
Michael Kuhn
dfcd5d4c81 hyperfine: new package (#26901) 2021-10-24 20:47:56 -04:00
Satish Balay
e3b7eb418f Add new math solver versions (#26903)
* tasmanian: add @7.7
* butterflypack: add @2.0.0
* pflotran: add @3.0.2
* alquimia: add @1.0.9
* superlu-dist: add @7.1.1
2021-10-24 20:45:44 -04:00
Michael Kuhn
fb5d4d9397 meson: add 0.60.0 (#26921) 2021-10-24 20:35:11 -04:00
Scott Wittenburg
2003fa1b35 Mark flaky test_ci_rebuild as xfail (#26911) 2021-10-24 22:46:26 +02:00
Michael Kuhn
b481b5114a Fix pkg-config dependencies (#26912)
pkg-config and pkgconf are implementations of the pkgconfig provider.
2021-10-24 20:39:50 +02:00
Michael Kuhn
d7148a74a0 glib: add 2.70.0 (#26915) 2021-10-24 20:39:30 +02:00
Seth R. Johnson
c255e91bba gcc: support alternate mechanism for providing GCC rpaths (#26590)
* gcc: support runtime ability to not install spack rpaths

Fixes #26582 .

* gcc: Fix malformed specs file and add docs

The updated docs point out that the spack-modified GCC does *not*
follow the usual behavior of LD_RUN_PATH!

* gcc: fix bad rpath on macOS

This bug has been around since the beginning of the GCC package file:
the rpath command it generates for macOS writes a single (invalid)
rpath entry.

* gcc: only write rpaths for directories with shared libraries

The original lib64+lib was just a hack for "in case either has one" but
it's easy to tell whether either has libraries that can be directly
referenced.
2021-10-24 13:46:52 -04:00
Michael Kuhn
5a38165674 python: add 3.9.7, 3.8.12, 3.7.12 and 3.6.15 (#26914) 2021-10-24 11:43:39 -06:00
Morten Kristensen
9e11e62bca py-vermin: add latest version 1.3.1 (#26920)
* py-vermin: add latest version 1.3.1

* Exclude line from Vermin since version is already being checked for

Vermin 1.3.1 finds that the `encoding` kwarg of the builtin `open()` requires Python 3+.
2021-10-24 16:49:05 +00:00
Mark Grondona
7fee7a7ce1 flux-core, -sched: update to latest versions, fix czmq build error (#26840)
* flux-core: fix compile error with czmq_containers
* Fix build with clang and gcc-11
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-23 17:37:44 +02:00
Jen Herting
7380161ea6 [py-emcee] added versions 3.0.2 and 3.1.1 (#26910)
* [py-emcee] added version 3.0.2

* [py-emcee] added version 3.1.1
2021-10-22 17:52:45 -06:00
eugeneswalker
0baec34dd7 E4S amd64 CI: add parsec (#26906) 2021-10-22 15:02:47 -06:00
Wouter Deconinck
d04b6e0fb5 acts: fix misspelled variant in depends_on (#26904)
Correct variant is plural `unit_tests`.
2021-10-22 12:19:51 -06:00
Harmen Stoppels
336c60c618 Document backport in py (#26897) 2021-10-22 19:14:35 +02:00
iarspider
0beb2e1ab2 Update py-aiohttp to 3.7.4 and py-chardet to 4.0 (#26893)
* Update py-aiohttp to 3.7.4 and py-chardet to 4.0

* Changes from review

* Update package.py

* Update var/spack/repos/builtin/packages/py-chardet/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-22 16:36:56 +00:00
iarspider
aa2bcbdc12 Use pypi for astroid; add versions 2.7.3 and 2.8.3; update dependencies (#26900)
* Use pypi for astroid; add versions 2.7.3 and 2.8.3; update dependencies

* py-six is not required for recent versions
2021-10-22 16:21:22 +00:00
Wouter Deconinck
d5880f3e7e acts: add v14.0.0, v14.1.0 added "python" and "analysis" variants (#26657) 2021-10-22 17:43:35 +02:00
Robert Cimrman
2d41d6fc9e py-sfepy: fix version 2021.3 tar.gz hash (#26890) 2021-10-22 10:23:58 -05:00
iarspider
8349a8ac60 Add py-asn1crypto@1.4.0 (#26895) 2021-10-22 10:22:30 -05:00
Manuela Kuhn
edc7341625 py-jupyterlab: add 3.1.18 and 3.2.1 (#26896) 2021-10-22 10:21:00 -05:00
Harmen Stoppels
609a42d63b Shorten long shebangs only if the execute permission is set (#26899)
The OS only interprets shebangs if a file is executable.

Thus, there is no need to modify files whose execute bit is not set.

This solves issues that are encountered, e.g., while packaging software such as
COVISE (https://github.com/hlrs-vis/covise), which includes example data
in Tecplot format. The sbang post-install hook is applied to every installed
file that starts with the two characters #!, but this fails on the binary Tecplot
files, as they happen to start with #!TDV. Decoding them with UTF-8 fails 
and an exception is thrown during post_install.

Co-authored-by: Martin Aumüller <aumuell@reserv.at>
2021-10-22 16:55:19 +02:00
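A minimal sketch of the executable-bit guard the commit above describes; the `stat` checks are standard Python, but the function and its placement inside the hook are assumptions for illustration.

```python
import os
import stat


def needs_sbang(path):
    """Only consider files whose execute bit is set and that start with #!."""
    mode = os.stat(path).st_mode
    if not mode & (stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH):
        return False          # the OS never interprets a shebang here
    with open(path, "rb") as f:
        return f.read(2) == b"#!"
```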
Harmen Stoppels
d274769761 Backport #186 from py-py to fix macOS failures (#26653)
Backports the relevant bits of 0f77b6e66f
2021-10-22 13:52:46 +02:00
Harmen Stoppels
7cec6476c7 Remove hard-coded repository in centos6 unit test (#26885)
This makes it easier to run tests in a fork.
2021-10-22 10:24:03 +02:00
Doug Jacobsen
d1d0021647 Add GCS Bucket Mirrors (#26382)
This commit contains changes to support Google Cloud Storage 
buckets as mirrors, meant for hosting Spack build-caches. This 
feature is beneficial for folks that are running infrastructure on 
Google Cloud Platform. On public cloud systems, resources are
ephemeral and in many cases, installing compilers, MPI flavors,
and user packages from scratch takes up considerable time.

Giving users the ability to host a Spack mirror that can store build
caches in GCS buckets offers a clean solution for reducing
application rebuilds for Google Cloud infrastructure.

Co-authored-by: Joe Schoonover <joe@fluidnumerics.com>
2021-10-22 06:22:38 +02:00
Miroslav Stoyanov
d024faf044 heffte: new version with rocm (#26878) 2021-10-22 01:12:46 +00:00
Cameron Stanavige
8b8bec2179 scr/veloc: component releases (#26716)
* scr/veloc: component releases

Update the ECP-VeloC component packages in preparation for an
upcoming scr@3.0rc2 release.

All
- Add new release versions
- Add new `shared` variant for all components
- Add zlib link dependency to packages that were missing it
- Add maintainers
- Use self.define and self.define_from_variant to clean up cmake_args

axl
- Add independent vendor async support variants

rankstr
- Update older version sha that fails checksum on install

* Fix scr build error

Lock dependencies for scr@3.0rc1 to the versions released at the same
time.
2021-10-21 18:05:59 -07:00
Frank Willmore
a707bf345d Add vacuumms package (#26838) 2021-10-21 17:12:08 -07:00
Robert Cimrman
9b28f99dd2 py-sfepy: add v2021.3 (#26865) 2021-10-21 23:50:38 +00:00
Manuela Kuhn
dbe1060420 py-neurokit2: add 0.1.4.1 (#26871)
* py-neurokit2: add 0.1.4.1

* Update var/spack/repos/builtin/packages/py-neurokit2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-21 21:03:04 +00:00
eugeneswalker
3720d06e26 update E4S CI environments in preparation for 21.11 release (#26826)
* update E4S CI environments in preparation for 21.11 release

* e4s ci env: use clingo
2021-10-21 07:06:02 -07:00
Alexander Jaust
3176c6d860 [py-pyprecice] add v2.3.0.1 and fix hash of v2.2.1.1 (#26846) 2021-10-21 08:36:24 -05:00
Harmen Stoppels
3fa0654d0b Make 'spack location -e' print the current env, and 'spack cd -e' go to the current env (#26654) 2021-10-21 13:19:48 +02:00
Eric Brugger
7f4bf2b2e1 vtk: make use of system GLEW dependent on osmesa being disabled. (#26764) 2021-10-21 09:53:13 +02:00
Vanessasaurus
3fe1785d33 libabigail: support source install (#26807)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-10-21 09:22:22 +02:00
Wouter Deconinck
b65937e193 kassiopeia: add v3.8.0; new variants log4cxx, boost (#26841) 2021-10-21 07:44:57 +02:00
Alexander Jaust
97d244eb95 Make superlu-dist@7.1.0 request CMake 3.18.1 or newer (#26849) 2021-10-21 07:43:30 +02:00
Edward Hartnett
88591536c8 bacio: add v2.5.0 (#26851) 2021-10-21 07:41:47 +02:00
RichardABunt
f78e5383cc arm-forge: add v21.1 (#26852) 2021-10-21 07:41:06 +02:00
downloadico
34d4091dbc tramonto: add new package (#26858) 2021-10-21 07:18:33 +02:00
Stephen Hudson
afd4a72937 libEnsemble: add v0.8.0 (#26864) 2021-10-21 07:14:31 +02:00
Adam J. Stewart
342f89e277 py-numpy: add v1.21.3 (#26860) 2021-10-21 07:13:43 +02:00
Luc Berger
fc7178da17 Kokkos Kernels: updating maintainers and releases (#26855)
Adding the latest releases of Kokkos Kernels and the official maintainers for the package.
2021-10-20 12:19:56 -07:00
iarspider
594325ae09 Add py-sniffio version 1.2.0 and fix dependencies (#26847)
* Add py-sniffio version 1.2.0 and fix dependencies

* Changes from review
2021-10-20 16:30:26 +00:00
iarspider
fcff01aaaf Update py-anyio to 3.3.4 (#26821)
* Update py-anyio to 3.3.4

* Changes from PR
2021-10-20 10:36:45 -05:00
Manuela Kuhn
b9c633d086 py-bids-validator: add 1.8.4 (#26844) 2021-10-20 10:33:27 -05:00
kwryankrattiger
45bea7cef7 Update ECP dav helper for propagating variants (#26175) 2021-10-20 11:22:07 -04:00
Martin Aumüller
029b47ad72 embree: add checksums for 3.12.2 & 3.13.1 (#26675)
Includes a fix for the dependency ispc: fix build if cc -m32 is not possible
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-20 16:41:18 +02:00
Tamara Dahlgren
cc8b6ca69f Add --preferred and --latest to spack checksum (#25830) 2021-10-20 13:38:55 +00:00
Adam J. Stewart
f184900b1a damask: fix maintainer GitHub handle (#26829) 2021-10-20 14:57:13 +02:00
Wouter Deconinck
abc8cfe8fa cppgsl: branch master is now main (#26834) 2021-10-20 14:56:01 +02:00
Massimiliano Culpo
56209cb114 Reduce verbosity of error messages when concretizing environments (#26843)
With this commit stacktraces of subprocesses are shown only if debug mode is active
2021-10-20 11:30:07 +00:00
Alexander Jaust
26b58701bc Fix typo in repositories.rst (#26845) 2021-10-20 11:11:17 +00:00
Wouter Deconinck
42a6e1fcee root: prepend dependent_spec.prefix.include to ROOT_INCLUDE_PATH (#26379)
Spack is not populating CPATH anymore (e3f97b37e6 (diff-259adc895c0b2e8fca42ffb99d8051eec0712c868d12d8da255d32f1663acdc7)), and downstream packages ([gaudi](1aa7758dbb/var/spack/repos/builtin/packages/gaudi/package.py (L116))) have already started to include this in their package.py files. Instead of propagating this to all downstream packages, it tries to address the issue at the source.
2021-10-20 09:15:14 +02:00
Massimiliano Culpo
1983d6d9d0 Pipelines: pin importlib-metadata and setuptools-scm versions (#26831)
This will hopefully be a hotfix for the issue with pipelines
in #26813 and #26820
2021-10-19 16:09:40 -07:00
Chris White
b7d94ebc4f allow camp to use blt package (#26765) 2021-10-19 16:09:15 -07:00
Harmen Stoppels
1b634f05e0 A single process pool is not something to boast about (#26837) 2021-10-19 23:04:52 +00:00
Greg Becker
7dc0ca4ee6 cray architecture detection for zen3/milan (#26827)
* Update cray architecture detection for milan

Update the cray architecture module table with x86-milan -> zen3
Make cray architecture detection more robust by backing off from the
frontend architecture to a recent ancestor if necessary. This should make
future cray updates less painful for users.

Co-authored-by: Gregory Becker <becker33.llnl.gov>
Co-authored-by: Todd Gamblin <gamblin2@llnl.gov>
2021-10-19 21:39:50 +00:00
Kai Germaschewski
501fa6767b adios2: fix unresolved symbols in 2.6.0 when built with gcc10 (#23871) 2021-10-19 21:09:07 +00:00
Harmen Stoppels
e7c7f44bb6 Reduce verbosity of threaded concretization (#26822)
1. Don't use 16 digits of precision for the seconds, round to 2 digits after the comma
2. Don't print if we don't concretize (i.e. `spack concretize` without `-f` doesn't have to tell me it did nothing in `0.00` seconds)
2021-10-19 18:33:17 +00:00
iarspider
bc99d8a2fd Update py-zipp (#26819)
* Update py-zipp

* Fix typo
2021-10-19 13:29:32 -05:00
Garth N. Wells
2bfff5338f py-fenics-ffcx: dependency updates (#26783)
* Update py-fenics-ffcx dependencies

* Relax some version numbering

* Remove stray colon
2021-10-19 13:28:20 -05:00
iarspider
a678a66683 py-setuptools-scm: make py-tomli dependency an open range (#26820) 2021-10-19 16:08:56 +00:00
Massimiliano Culpo
4c082a5357 Relax os constraints in e4s pipelines (#26547) 2021-10-19 10:51:37 -05:00
Massimiliano Culpo
2d45a9d617 Speed-up environment concretization on linux with a process pool (#26264)
* Speed-up environment concretization with a process pool

We can exploit the fact that the environment is concretized
separately and use a pool of processes to concretize it.

* Add module spack.util.parallel

Module includes `pool` and `parallel_map` abstractions,
along with implementation details for both.

* Add a new hash type to pass specs across processes

* Add tty msg with concretization time
2021-10-19 10:09:34 -05:00
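A hedged sketch of the process-pool pattern summarized above, using only the standard library; `concretize_one` stands in for Spack's per-spec concretization, and the timing message format is an assumption, not the actual tty output.

```python
import multiprocessing
import time


def concretize_one(abstract_spec):
    # Placeholder for the expensive per-spec work; each spec in an
    # environment is concretized separately, so the work parallelizes.
    time.sleep(0.1)
    return abstract_spec + " (concrete)"


def concretize_all(abstract_specs, jobs=4):
    start = time.time()
    with multiprocessing.Pool(processes=jobs) as pool:
        concrete = pool.map(concretize_one, abstract_specs)
    print("Environment concretized in %.2f sec." % (time.time() - start))
    return concrete


if __name__ == "__main__":
    print(concretize_all(["hdf5+mpi", "zlib", "cmake"]))
```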
Ryan Mast
64a323b22d spdlog: add v1.9.0, v1.9.1 and v1.9.2 (#26777) 2021-10-19 16:55:36 +02:00
Ryan Mast
f7666f0074 asio: Add versions up to 1.20.0 (#26778) 2021-10-19 16:54:55 +02:00
Ryan Mast
b72723d5d9 cli11: added v2.1.1, v2.1.0 (#26780) 2021-10-19 16:54:33 +02:00
Ryan Mast
65d344b2f8 cli11: disable building the examples (#26781) 2021-10-19 16:53:45 +02:00
Adam J. Stewart
9afc5ba198 py-pandas: add v1.3.4 (#26788) 2021-10-19 16:50:42 +02:00
Christopher Kotfila
ad35251860 Fix trigger and child links in pipeline docs (#26814) 2021-10-19 14:44:36 +00:00
Paul
9eaccf3bbf Add Go 1.17.2 and 1.16.9 (#26794) 2021-10-19 16:30:59 +02:00
Mark Grondona
bb1833bc8c flux: update maintainer of flux-core, flux-sched (#26800) 2021-10-19 16:30:33 +02:00
Robert Pavel
c66aa858c2 Made Legion Dependency on Gasnet Tarball Explicit (#26805)
Made legion dependency on a gasnet tarball explicit so as to take
advantage of spack mirrors for the purpose of deploying on machines with
firewalls
2021-10-19 16:28:43 +02:00
Erik Schnetter
4fc8ed10cb nsimd: add v3.0 (#26806) 2021-10-19 16:27:29 +02:00
Adam J. Stewart
0de8c65a2d Libtiff: improve compression support (#26809) 2021-10-19 16:17:37 +02:00
genric
42fb34e903 py-xarray: add v0.18.2 (#26811) 2021-10-19 16:15:56 +02:00
iarspider
d6fc914a9b Update py-importlib-metadata and py-setuptools-scm (#26813) 2021-10-19 16:08:53 +02:00
Massimiliano Culpo
79c92062a8 Gitlab pipelines: use images from the Spack organization (#26796) 2021-10-19 14:38:39 +02:00
Scott Wittenburg
95538de731 Speed up pipeline generation (#26622)
- [x] Stage already concretized specs instead of abstract ones
- [x] Reduce number of network calls by reading naughty list up front
2021-10-18 20:58:02 -07:00
Miguel Dias Costa
d41ddb8a9c new package: berkeleygw (#21455) 2021-10-18 18:17:58 -07:00
Harmen Stoppels
2b54132b9a cosma: add new versions and improve package (#24136)
* cosma: add new versions and improve package

* Move method below depends_on's
2021-10-18 18:06:14 -07:00
Todd Gamblin
c5ca0db27f patches: make re-applied patches idempotent (#26784)
We use POSIX `patch` to apply patches to files when building, but
`patch` by default prompts the user when it looks like a patch
has already been applied. This means that:

1. If a patch lands in upstream and we don't disable it
   in a package, the build will start failing.
2. `spack develop` builds (which keep the stage around) will
   fail the second time you try to use them.

To avoid that, we can run `patch` with `-N` (also called
`--forward`, but the long option is not in POSIX). `-N` causes
`patch` to just ignore patches that have already been applied.
This *almost* makes `patch` idempotent, except that it returns 1
when it detects already applied patches with `-N`, so we have to
look at the output of the command to see if it's safe to ignore
the error.

- [x] Remove non-POSIX `-s` option from `patch` call
- [x] Add `-N` option to `patch`
- [x] Ignore error status when `patch` returns 1 due to `-N`
- [x] Add tests for applying a patch twice and applying a bad patch
- [x] Tweak `spack.util.executable` so that it saves the error that
      *would have been* raised with `fail_on_error=True`. This lets
      us easily re-raise it.

Co-authored-by: Greg Becker <becker33@llnl.gov>
2021-10-18 23:11:42 +00:00
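A hedged sketch of the `patch -N` handling described above: exit status 1 is only ignored when the output indicates previously applied hunks. The exact phrases matched against the `patch` output are assumptions and vary between patch implementations.

```python
import subprocess


def apply_patch(patch_file, cwd="."):
    """Apply a patch idempotently using POSIX `patch -N` (--forward)."""
    proc = subprocess.run(
        ["patch", "-N", "-p1", "-i", patch_file],
        cwd=cwd, capture_output=True, text=True,
    )
    if proc.returncode == 0:
        return True
    # With -N, patch exits 1 when hunks were already applied; only that
    # case is safe to ignore -- anything else is a real failure.
    already = ("previously applied" in proc.stdout.lower()
               or "ignored" in proc.stdout.lower())
    if proc.returncode == 1 and already:
        return True
    raise RuntimeError("patch failed:\n%s%s" % (proc.stdout, proc.stderr))
```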
Mickaël Schoentgen
a118e799ae httpie: add v2.6.0 (#26791)
Signed-off-by: Mickaël Schoentgen <contact@tiger-222.fr>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-18 21:21:08 +00:00
Seth R. Johnson
bf42d3d49b file and python dependents: add missing dependencies (#26785)
* py-magic: delete redundant package

This package is actually named py-python-magic (since the project itself
is "python-magic").

* New package: libmagic

* Py-python-magic: add required runtime dependency on libmagic and new version

* Py-filemagic: add required runtime dependency

* py-magic: restore and mark as redundant

This reverts commit 4cab7fb69e.

* file: add implicit dependencies and static variant

Replaces redundant libmagic that I added. Compression headers were previously
being picked up from the system.

* Fix py-python-magic dependency

* Update python version requirements
2021-10-18 21:11:16 +00:00
Miroslav Stoyanov
e0ff44a056 tasmanian: add smoke test (#26763) 2021-10-18 16:09:46 -04:00
Seth R. Johnson
c48b733773 Make macOS installed libraries more relocatable (#26608)
* relocate: call install_name_tool less

* zstd: fix race condition

Multiple times on my mac, trying to install in parallel led to failures
from multiple tasks trying to simultaneously create `$PREFIX/lib`.

* PackageMeta: simplify callback flush

* Relocate: use spack.platforms instead of platform

* Relocate: code improvements

* fix zstd

* Automatically fix rpaths for packages on macOS

* Only change library IDs when the path is already in the rpath

This restores the hardcoded library path for GCC.

* Delete nonexistent rpaths and add more testing

* Relocate: Allow @executable_path and @loader_path
2021-10-18 13:34:16 -04:00
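A minimal sketch, under assumptions, of the rpath clean-up the entry above talks about: it uses real `install_name_tool` flags (`-delete_rpath`, `-add_rpath`), but the way existing rpaths are discovered and filtered here is simplified compared to Spack's relocation code.

```python
import os
import subprocess


def fix_rpaths(binary, rpaths):
    """Delete rpath entries that point nowhere.

    `rpaths` is assumed to have been read from the binary already
    (e.g. by parsing `otool -l` output, which is omitted here).
    """
    for rpath in rpaths:
        if not os.path.isdir(rpath):
            # Nonexistent rpaths only confuse the loader; drop them.
            subprocess.run(
                ["install_name_tool", "-delete_rpath", rpath, binary],
                check=True,
            )


def add_rpath(binary, rpath):
    # @loader_path and @executable_path prefixes are allowed as-is.
    subprocess.run(
        ["install_name_tool", "-add_rpath", rpath, binary], check=True
    )
```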
Shahzeb Siddiqui
3c013b5be6 docutils > 0.17 issue with rendering list items in sphinx (#26355)
* downgrade_docutils_version

* invalid version

* Update requirements.txt

* Improve spelling and shorten the reference link

* Update spack.yaml

* update version requirement

* update version to maximum of 0.16

Co-authored-by: bernhardkaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-18 16:55:46 +00:00
Harmen Stoppels
30e8dd95b5 Remove unused exist_errors in installer.py (#26650) 2021-10-18 15:53:51 +02:00
Harmen Stoppels
1e5f7b3542 Don't print error output in the test whether gpgconf works (#26682) 2021-10-18 15:52:53 +02:00
Sreenivasa Murthy Kolam
1156c7d0a9 allow multiple values for tensile_architecture and expand the gpu list for rocm-4.3.1 (#26745) 2021-10-18 08:48:07 +02:00
Xavier Delaruelle
cab17c4ac3 environment-modules: add version 5.0.1 (#26786) 2021-10-18 08:46:57 +02:00
Harmen Stoppels
33ef7d57c1 Revert 19736 because conflicts are avoided by clingo by default (#26721) 2021-10-18 08:41:35 +02:00
Valentin Volkl
9c7e71df34 py-jupytext: add new package (#26732)
* py-jupytext: add new package

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* update jupytext dependencies

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-jupytext: remove py-jupyterlab dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-17 18:51:42 -05:00
Morten Kristensen
03f84fb440 py-vermin: add latest version 1.3.0 (#26787) 2021-10-17 11:46:53 -05:00
Valentin Volkl
d496568ff9 py-gevent: add version 1.5 (#26731)
* py-gevent: add version 1.5

* py-gevent: update dependencies for v1.5.0

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-gevent/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-17 11:07:11 -05:00
Sam Reeve
ba62b691a6 Add ECP tags for CoPA and related packages (#26739)
* Add ECP tags for CoPA (and related) packages

* Update CoPA maintainers
2021-10-17 10:00:28 -04:00
Ryan Mast
b794b5bd4d nlohmann-json: update to version 3.10.4 (#26779) 2021-10-17 09:56:20 -04:00
Bernhard Kaindl
64143f970e [Fix for the GitLab CI] phist: prefer @1.9.5 (1.9.6 is not compatible w/ mpich%gcc:9) (#26773)
* phist: Prefer 1.9.5 (1.9.6 uses mpi_f08, but not available in CI)

* phist: remove dupe of 1.9.5, missing preferred=True

Also, for 1.9.6, patch the (most, one does not work) tests to use
2021-10-16 11:18:07 -07:00
Scot Halverson
b2f059e547 Update GASNet package.py to include version 2021.9.0 (#26736) 2021-10-15 21:15:32 +02:00
Mosè Giordano
2cf7d43e62 libblastrampoline: add v3.1.0 (#26769)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-10-15 18:17:26 +00:00
Brice Videau
0bc1bffe50 Fix ruby dependent extensions. (#26729)
* Fix ruby dependent extensions.

* Added Kerilk as maintainer.
2021-10-15 16:59:32 +00:00
Axel Huebl
33da53e325 GCC: Conflict for <12 for M1 (#26318)
aarch64/M1 will only become a supported build combination for GCC in
the planned GCC 12+ release.
2021-10-15 12:30:09 -04:00
Mickaël Schoentgen
6686db42ad py-charset-normalizer: add v2.0.7 (#26756)
Signed-off-by: Mickaël Schoentgen <contact@tiger-222.fr>
2021-10-15 15:58:09 +00:00
Manuela Kuhn
ec67bec22a py-rdflib: add 6.0.2 (#26757) 2021-10-15 10:35:59 -05:00
Manuela Kuhn
2a5a576667 py-ipykernel: add 6.4.1 and fix deps (#26758) 2021-10-15 10:34:51 -05:00
Manuela Kuhn
5dc8094f84 py-setuptools: add 58.2.0 (#26759) 2021-10-15 10:31:52 -05:00
Manuela Kuhn
012b4f479f py-jupyter-client: add 6.1.12 (#26760) 2021-10-15 10:30:50 -05:00
Harmen Stoppels
f8e4aa7d70 Revert "Don't run lsb_release on linux (#26707)" (#26754)
This reverts commit fcac95b065.
2021-10-15 09:34:04 +00:00
Harmen Stoppels
e0fbf09239 EnvironmentModifications: allow disabling stack tracing (#26706)
Currently Spack keeps track of the origin in the code of any
modification to the environment variables. This is very slow 
and enabled unconditionally even in code paths where the 
origin of the modification is never queried.

The only place where we inspect the origins of environment 
modifications is before we start a build, to check if there's an override 
of the type `e.set(...)` after incremental changes like 
`e.append_path(..)`, which is a "suspicious" change.

This is very rare though.

If an override like this ever happens, it might mean a package is
broken. If that leads to build errors, we can just ask the user to run
`spack -d install ...` and check the warnings issued by Spack to find
the origins of the problem.
2021-10-15 10:00:44 +02:00
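A hedged illustration of making the (slow) origin tracking optional, as described above; the class name and the `trace` flag are placeholders, not Spack's actual `EnvironmentModifications` API.

```python
import traceback


class EnvChanges:
    """Records environment modifications, optionally with their origin."""

    def __init__(self, trace=False):
        self.trace = trace      # origin tracking is now opt-in
        self.changes = []

    def set(self, name, value):
        origin = traceback.extract_stack(limit=2)[0] if self.trace else None
        self.changes.append(("set", name, value, origin))

    def append_path(self, name, path):
        origin = traceback.extract_stack(limit=2)[0] if self.trace else None
        self.changes.append(("append_path", name, path, origin))


# Fast default: no stack inspection unless someone actually needs to
# audit where a modification came from (e.g. via `spack -d install`).
env = EnvChanges(trace=False)
env.append_path("PATH", "/opt/tool/bin")
env.set("CC", "/usr/bin/gcc")
```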
Vanessasaurus
842e56efb8 libabigail: add v2.0 (#26753)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-10-15 07:53:29 +02:00
Joseph Wang
0749d94ad3 Disable parallel builds in groff and gosam-contrib (#26730)
Workaround for #26726 and #26714
2021-10-15 07:20:18 +02:00
Cameron Rutherford
2bc0ea70ed HiOp: add v0.5.0 + small changes in dependencies (#26744) 2021-10-15 07:09:09 +02:00
kwryankrattiger
bf5ef3b6b9 paraview: add adios2 variant (#26728) 2021-10-15 07:07:04 +02:00
Timothy Brown
7ccdae5233 Removing NCEP Post (ncep-post). (#26749)
UPP and ncep-post are the same package, so this PR 
removes the duplication. 

ncep-post was originally named after the upstream repo
which has since been renamed to UPP.
2021-10-15 07:04:14 +02:00
Eric Brugger
ea453db674 vtk: modify conflict between osmesa and qt (#26752) 2021-10-15 06:58:43 +02:00
Tamara Dahlgren
41d375f6a4 Stand-alone tests: disallow re-using an alias (#25881)
It can be frustrating to successfully run `spack test run --alias <name>` only to find you cannot get the results because `<name>` was already used in a previous stand-alone test execution. This PR prevents that from happening.
2021-10-14 15:08:00 -07:00
Tamara Dahlgren
beb8a36792 Remove extra tag assignments (#26692) 2021-10-14 14:50:09 -07:00
Manuela Kuhn
28ebe2f495 py-datalad: add 0.15.2 (#26750) 2021-10-14 21:44:52 +00:00
Harmen Stoppels
4acda0839b cp2k: use variant propagation trick for virtuals (#26737) 2021-10-14 23:11:22 +02:00
Massimiliano Culpo
eded8f48dc ASP-based solver: add a rule for version uniqueness in virtual packages (#26740)
fixes #26718

A virtual package may or may not have a version, but it
never has more than one. Previously we were missing a rule
for that.
2021-10-14 23:06:41 +02:00
Christoph Junghans
d9d0ceb726 add py-pyh5md and update py-espressopp (#26746)
* add  py-pyh5md and update py-espressopp

* Update package.py
2021-10-14 19:17:02 +00:00
Bernhard Kaindl
6ca4d554cd libfive: Add all variants, +qt needs qt@5.15.2:+opengl (#26629)
Refresh of deps to fix the build and add variants from CMakeLists.txt
2021-10-14 11:42:01 -07:00
Mosè Giordano
74c6b7d8c1 sombrero: add version 2021-08-16 (#26741) 2021-10-14 19:41:50 +02:00
Bernhard Kaindl
95030a97b6 vc: Enable the testsuite, excluding tests failing on Zen2 (#26699)
This fixes running the testsuite and adds the virtest package needed for it.
2021-10-14 09:02:31 -07:00
Miroslav Stoyanov
b64a7fe1ff switch to the smoke testing included in heffte (#26720) 2021-10-14 08:58:29 -07:00
iarspider
160371f879 alpgen: new package (#26713)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-14 16:58:09 +02:00
Thomas Madlener
49354140be edm4hep: new version, fix tests (depends on catch2) (#26679) 2021-10-14 14:04:45 +00:00
Thomas Madlener
25704ae8e6 podio: new version and fix python unittest env (#26649) 2021-10-14 13:54:02 +00:00
Bernhard Kaindl
a61853816f phist: Fix build of 1.9.6, fix build- and install-tests (#26727)
Primary fix:

Due to a typo in a version range, overlapping PR merges resulted
in a build failure of the latest version:
Don't attempt to remove a non-existing file for version 1.9.6.

Secondary fixes:

update_tpetra_gotypes.patch was mentioned twice, and the version
range has to exclude @1.4.2, to which it cannot be applied.

Add depends_on() for py-pytest, py-numpy and pkgconfig with type='test'.

@:1.9.0 fails with a 'Rank mismatch' error with gfortran@10:, so add a conflicts().

raise InstallError('~mpi not possible with kernel_lib=builtin!')
when applicable.

Fixes for spack install --test=root phist:

mpiexec -n12 puts a lot of stress on a pod and gets stuck in a loop
very often: Reduce the mpiexec procs and the number of threads.

Remove @run_after('build') @on_package_attributes(run_tests=True):
from 'def check()': this prevents it from being called twice.

The build script of 'make test_install' for the installcheck expects
the examples to be copied to self.stage.path: Provide them.
2021-10-14 08:14:25 -04:00
AMD Toolchain Support
8c1399ff7c LAMMPS: update recipe for %aocc (#26710)
* updating the recipe for betterment

* addressing the suggestions received from reviewers

* adding package helper macros

Co-authored-by: mohan002 <mohbabul@amd.com>
2021-10-14 12:03:15 +00:00
Eric Brugger
2bc97f62fd Qt: Qt fixes for a Cray AMD system. (#26722)
* Qt fixes for a Cray AMD system.

* Update to latest changes.
2021-10-14 07:41:26 -04:00
Massimiliano Culpo
949094544e Constrain abstract specs rather than concatenating strings in the "when" context manager (#26700)
Using the Spec.constrain method doesn't work since it might
trigger a repository lookup, which could break our directives
and trigger a circular import error.

To fix that we introduce a function to merge abstract anonymous
specs, based only on package names, which does not perform any
lookup in the repository.
2021-10-14 12:33:10 +02:00
Bernhard Kaindl
2c3ea68dd1 openslide: Fix missing dependencies: gdk-pixbuf and perl-alien-libxml2 (#26620)
Add missing pkgconfig to openslide and its dep perl-alien-libxml2.

Fix shared-mime-info to be a runtime dependency of gdk-pixbuf;
otherwise, configure cannot detect and use gdk-pixbuf without errors.
2021-10-13 19:07:27 -05:00
Bernhard Kaindl
99200f91b7 r-imager: add depends_on('r+X') and bump version (#26697) 2021-10-13 19:02:54 -05:00
Satish Balay
36bd69021d dealii: @9.3.1 has build errors with boost@1.77.0 - so add dependency on boost@1.76.0 [or lower] (#26709) 2021-10-14 01:30:53 +02:00
Harmen Stoppels
fcac95b065 Don't run lsb_release on linux (#26707)
Running `lsb_release` on Linux takes about 50ms because it is written in
Python. We do not use the output, so this change stops calling it.
2021-10-14 01:27:24 +02:00
Manuela Kuhn
0227fc3ddc py-mne: add full variant (#26702) 2021-10-13 16:59:46 -05:00
Keita Iwabuchi
7c8c29a231 metall: add version 0.17 (#26694)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-13 22:10:16 +02:00
Bernhard Kaindl
5782cb6083 magics: Add v4.9.3 to fix build with gcc@11, skip broken testcase (#26695)
To build with gcc-11, v4.9.3 is needed; a conflict is added for older revisions.
2021-10-13 12:14:37 -07:00
Bernhard Kaindl
f4a132256a qgis: fix build of LTS release with proj>7 (#26696)
Co-authored-by: Sinan <sbulutw@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-13 20:42:18 +02:00
David Beckingsale
2864212ae3 Add camp 0.3.0 and 0.2.3 (#26717) 2021-10-13 18:41:35 +00:00
Bernhard Kaindl
87663c6796 vapor: Fix the build and update: Use correct deps and find numpy incdir (#26630)
vapor needs proj@:7 and gives a list of tested dependency versions.
Make it find the numpy include path and add version 3.5.0 as well
2021-10-13 13:25:18 -04:00
Bernhard Kaindl
14a929a651 sfcgal: build fails with cgal@:4.6, works with cgal@4.7: (#26642)
Use depends_on('cgal@4.7: +core') to fix the build
2021-10-13 11:47:09 -05:00
Bernhard Kaindl
bcd1272253 wireshark: Fix install race and skip network capture tests (#26698)
The network capture tests can't pass when building as a normal user.
2021-10-13 12:39:19 -04:00
Todd Kordenbrock
6125117b5d SEACAS: add a Faodel variant (#26583)
* SEACAS: add a Faodel variant

* Use safer CMake and variant packages instead of directly adding parameters
Add a "+faodel ~mpi" dependency to balance "+faodel +mpi"
2021-10-13 12:39:11 -04:00
Satish Balay
37278c9fa0 superlu-dist add version 7.1.0 (#26708) 2021-10-13 11:34:43 -05:00
Patrick Gartung
047c95aa8d buildcache: do one less tar file extraction
The buildcache is now extracted in a temporary folder within the current store,
moved to its final place and relocated. 

"spack clean -s" has been extended to also clean the temporary extraction directory.

Add hardlinks with absolute paths for libraries in the corge, garply and quux packages
to detect incorrect handling of hardlinks in tests.
2021-10-13 17:38:29 +02:00
haralmha
e6b76578d2 Add version 4.12.0 (#26532) 2021-10-13 14:07:48 +00:00
iarspider
81c272fcb7 photos-f: new package (Fortran version) (#26703) 2021-10-13 15:56:11 +02:00
Jose E. Roman
c47eb7217e slepc: set up SLEPC_DIR for dependent packages (#26701) 2021-10-13 15:51:08 +02:00
Jen Herting
8ad608a756 py-convokit: new package (#26236)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-10-13 15:50:07 +02:00
Joseph Wang
252784ccf5 cppgsl: disable tests on gcc11 (#26593) 2021-10-13 09:36:45 -04:00
Bernhard Kaindl
b771c4ea01 phist: Fix build: ppc64_sse.patch only applies to 1.9.4 (#26704)
ppc64_sse.patch can only be applied to 1.9.4:
* Older releases don't have the patched file
* All newer releases carry the change of the patch already.
2021-10-13 09:20:50 -04:00
Francis Kloss
53461b7b04 salome-medcoupling: new package (with dependencies) (#25785)
Adds new packages for using MEDCoupling from SALOME platform
2021-10-13 13:50:37 +02:00
Joe Schoonover
4d58661d08 feq-parse: add version 1.1.0 and update maintainer (#26060) 2021-10-13 00:20:21 +00:00
Valentin Volkl
81d157b05e garfieldpp: update dependencies, add variant (#25816) 2021-10-13 02:13:50 +02:00
Jen Herting
0f9080738f [py-spacy] added version 2.3.7 (#25999) 2021-10-12 23:57:04 +00:00
Scott McMillan
b937aa205b Fix Amber patch target specification (#26687)
Co-authored-by: Scott McMillan <smcmillan@nvidia.com>
2021-10-13 00:15:05 +02:00
Jose E. Roman
31b0fe21a2 py-slepc4py: add missing depends_on() (#26688) 2021-10-13 00:09:19 +02:00
Bernhard Kaindl
e7bfaeea52 libical: Add missing deps: pkgconfig, glib and libxml2 (#26618)
Libical needs pkgconfig, glib and libxml2 to build.
2021-10-12 23:22:49 +02:00
Harmen Stoppels
1ed695dca7 Improve error messages for bootstrap download failures (#26599) 2021-10-12 22:11:07 +02:00
Bernhard Kaindl
b6ad9848d2 babelflow, parallelmergetree: fix build with gcc11 (#26681)
gcc-11 does not include <limits> and <algorithm> as a side effect
of including other headers, at least not as often as earlier gcc versions did.
2021-10-12 21:45:00 +02:00
Stephen Herbein
8d04c8d23c flux-core, flux-sched: add 0.29.0, 0.18.0 and cleanup env vars (#26391)
Problem: Flux expects the `FLUX_PMI_LIBRARY_PATH` to point directly at
the `libpmi.so` installed by Flux.  When the env var is unset,
prepending to it results in this behavior.  In the rare case that the
env var is already set, then the spack `libpmi.so` gets prepended with a
`:`, which Flux then attempts to interpret as a single path.

Solution: don't prepend to the path, instead set the path to point to
the `libpmi.so` (which will be undone when Flux is unloaded).

* flux-core: remove deprecated environment variables

The earliest checksummed version in this package is 0.15.0. As of
0.12.0, wreck (and its associated paths) no longer exist in Flux. As of
0.13.0, the `FLUX_RCX_PATH` variables are no longer used. So clean up
these env vars from the `setup_run_environment`.
2021-10-12 21:39:43 +02:00
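A hedged sketch of the package-method change described above: the
`setup_run_environment(self, env)` hook and the `env.set`/`env.prepend_path`
calls are standard Spack package APIs, but the library location below is
illustrative rather than the real recipe.

```python
import os


def setup_run_environment(self, env):
    # Hypothetical location; the real package computes where its libpmi.so is.
    pmi_lib = os.path.join(self.prefix.lib, "flux", "libpmi.so")

    # Before: prepending turned an already-set FLUX_PMI_LIBRARY_PATH into
    # "spack_path:existing_path", which Flux reads as one (invalid) path.
    # env.prepend_path("FLUX_PMI_LIBRARY_PATH", pmi_lib)

    # After: point the variable directly at the library.
    env.set("FLUX_PMI_LIBRARY_PATH", pmi_lib)
```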
Adam J. Stewart
47554e1e2f GMT: add conflict for GCC 11 (#26684) 2021-10-12 17:56:17 +00:00
Bernhard Kaindl
862ce517ce gromacs: @2018:2020: add #include <limits> for newer %gcc builds (#26678)
gromacs@2018:2020.6 is fixed to build with gcc@11.2.0
by adding #include <limits> to a few header files.

Thanks to Maciej Wójcik <w8jcik@gmail.com> for testing versions.
2021-10-12 11:39:09 -06:00
Alexander Jaust
50a2316a15 Add missing spack command in basic usage tutorial (#26646)
The `find` command was missing from the examples that force colorized output. Without this (or another suitable) command, spack produces no colored output at all, so one does not see any difference between forced colorized and non-colorized output.
2021-10-12 19:23:53 +02:00
Mark W. Krentel
2edfccf61d binutils: fix parallel make for version 2.36 (#26611)
There was a bug in 2.36.* with missing Makefile dependencies.  The
previous workaround was to require 2.36 to be built serially.  This is
now fixed upstream in 2.37 and this PR adds the patch to restore
parallel make to 2.36.
2021-10-12 19:01:46 +02:00
Veselin Dobrev
e2a64bb483 mfem: patch @4.3.0 to support hypre up to v2.23.0 (#26640) 2021-10-12 18:29:56 +02:00
Manuela Kuhn
ff66c237c0 py-niworkflows: add new package (#26639)
* py-niworkflows: add new package

* Update var/spack/repos/builtin/packages/py-niworkflows/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove unnecessary comment

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-12 11:22:44 -05:00
Manuela Kuhn
f8910c7859 py-nistats: add new package (#26662)
* py-nistats: add new package

* Update var/spack/repos/builtin/packages/py-nistats/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove `conflicts`

* remove test dependencies

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-12 11:22:00 -05:00
Harmen Stoppels
e168320bb1 spack: Add package (#25979)
* Make python 2 use 'from __future__ import absolute_import' to allow import spack.pkgkit

* Add Spack

* Improve ranges
2021-10-12 11:39:39 -04:00
Sergei Shudler
f58f5f8a0d babelflow, parallelmergetree: add the current versions (#26660) 2021-10-12 17:36:13 +02:00
Vanessasaurus
ce7eebfc1f allowing spack monitor to handle redirect (#26666)
When deployed on Kubernetes, the server sends back permanent redirect responses.
The requests library handles this elegantly, but urllib (which we have to use
here) does not, so we handle it manually by parsing the exception to get the
Location header and then retrying the request there.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
2021-10-12 17:29:22 +02:00
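A rough stand-alone illustration (not the actual spack-monitor client) of
following a permanent redirect by hand with urllib, assuming the server
answers with a redirect status and a `Location` header that urllib refuses to
follow automatically:

```python
import urllib.error
import urllib.request


def request_following_redirect(url, data=None, headers=None):
    """Issue a request; if the server redirects, retry once at Location."""
    req = urllib.request.Request(url, data=data, headers=headers or {})
    try:
        return urllib.request.urlopen(req)
    except urllib.error.HTTPError as err:
        if err.code in (301, 307, 308) and "Location" in err.headers:
            retry = urllib.request.Request(
                err.headers["Location"], data=data, headers=headers or {}
            )
            return urllib.request.urlopen(retry)
        raise
```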
Jonas Thies
f66ae104bf phist: force MPI compiler wrappers (#26312)
* packages/phist, re #26002: force phist to use MPI compiler wrappers (copied from trilinos package)

* packages/phist re #26002, use cmake-provded FindMPI module only

* packages/phist source code formatting

* packages/phist: set MPI_HOME rather than MPI_BASE_DIR, thanks @sethri.

* phist: delete own FindMPI.cmake for older versions (rather than patching it away)

* packages/phist: remove blank line

* phist: adjust sorting of imports

* phist: change order of imports
2021-10-12 11:11:00 -04:00
Joseph Wang
1e382d6d20 madgraph5amc: Add changes fixing bugs shown by gcc10 (#26643) 2021-10-12 15:55:48 +02:00
Martin Aumüller
6f8f37ffb1 ispc: update development branch name and limit to llvm@12 (#26676)
1.16 and 1.16.1 are not compatible with LLVM 13
2021-10-12 14:31:24 +02:00
Massimiliano Culpo
551120ee0b ASP-based solver: decrease the priority of multi-valued variant optimization for root (#26677)
The ASP-based solver maximizes the number of values in multi-valued
variants (if other higher order constraints are met), to avoid cases
where only a subset of the values that have been specified on the
command line or imposed by another constraint are selected.

Here we swap the priority of this optimization target with the
selection of the default providers, to avoid unexpected results
like the one in #26598
2021-10-12 14:15:48 +02:00
Harmen Stoppels
c2bf585d17 Fix potentially broken shutil.rmtree in tests (#26665)
Seems like https://bugs.python.org/issue29699 is relevant. Better to
just ignore errors when removing the tmpdir. The OS will remove it
anyway.

Errors are happening randomly from tests that are using this fixture.
2021-10-12 14:01:52 +02:00
Martin Diehl
66b32b337f damask{,-grid,-mesh}: add @3.0.0-alpha5 (#26570) 2021-10-12 13:22:09 +02:00
Maciej Wójcik
4b2cbd3aea gromacs: Add Gromacs 2020.6 and Plumed 2.7.2 (#26663) 2021-10-12 13:01:10 +02:00
Bernhard Kaindl
cb06f91df7 boost: Fix build of 1.53:1.54 with glibc>=2.17 (#26659)
Fix missing declaration of uintptr_t with glibc>=2.17 in 1.53:1.54
See: https://bugs.gentoo.org/482372
2021-10-12 10:59:36 +02:00
Harmen Stoppels
0c0831861c Avoid quadratic complexity in log parser (#26568)
TL;DR: there are matching groups trying to match 1 or more occurrences of
something. We don't use the matching group, so it's sufficient to test
for 1 occurrence. This reduces quadratic complexity to linear time.

---

When parsing logs of an mpich build, I'm getting a 4 minute (!!) wait
with 16 threads for regexes to run:

```
In [1]: %time p.parse("mpich.log")
Wall time: 4min 14s
```

That's really unacceptably slow... 

After some digging, it seems a few regexes tend to have `O(n^2)` scaling
where `n` is the string / log line length. I don't think they *necessarily*
should scale like that, but it seems that way. The common pattern is this

```
([^:]+): error
```

which matches `: error` literally, and then one or more non-colons before that. So
for a log line like this:

```
abcdefghijklmnopqrstuvwxyz: error etc etc
```

Any of these are potential group matches when using `search` in Python:

```
abcdefghijklmnopqrstuvwxyz
 bcdefghijklmnopqrstuvwxyz
  cdefghijklmnopqrstuvwxyz
                         ⋮
                        yz
                         z
```

but clearly the capture group should return the longest match.

My hypothesis is that Python has a very bad implementation of `search`
that somehow considers all of these, even though it can be implemented
in linear time by scanning for `: error` first, and then greedily expanding
the longest possible `[^:]+` match to the left. If Python indeed considers
all possible matches, then with `n` matches of length `1 .. n` you
see the `O(n^2)` slowness (I verified this by replacing `+` with `{1,k}`
and doubling `k`: the execution time indeed doubles).

This PR fixes this by removing the `+`, effectively changing the
O(n^2) worst case into an O(n) one.

The reason we are fine with dropping `+` is that we don't use the
capture group anywhere, so, we just ensure `:: error` is not a match
but `x: error` is.

After going from O(n^2) to O(n), the 15MB mpich build log is parsed
in `1.288s`, so about 200x faster.

Just to be sure I've also updated `^CMake Error.*:` to `^CMake Error`,
so that it does not match with all the possible `:`'s in the line.
Another option is to use `.*?` there to make it quit scanning as soon as
possible, but a line that starts with `CMake Error` and does not have
a colon is hardly ever a real false positive...
2021-10-12 00:05:11 -07:00
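A small stand-alone demonstration of the effect (timings vary by machine; this
is not the log parser itself): with the backtracking `[^:]+` prefix, the cost
of a long non-matching line grows roughly quadratically, while dropping the
`+` keeps it linear.

```python
import re
import time

quadratic = re.compile(r"([^:]+): error")  # backtracks at every start position
linear = re.compile(r"[^:]: error")        # one character before the literal

line = "x" * 10000  # a long log line that does not contain ": error"

for name, pattern in (("with +", quadratic), ("without +", linear)):
    start = time.perf_counter()
    pattern.search(line)
    print("{0}: {1:.4f}s".format(name, time.perf_counter() - start))
```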
Manuela Kuhn
580a20c243 py-templateflow: add 0.4.2 (#26471)
* py-templateflow: add 0.4.2

* py-templateflow: fix python dependency for 0.4.2

* py-templateflow: remove wheel dependency for older versions
2021-10-11 21:37:50 -05:00
Manuela Kuhn
eb486371b2 py-pysurfer: add new package (#26638) 2021-10-11 21:36:43 -05:00
Manuela Kuhn
0da892f2dd py-pandas: fix installation and tests for versions @:0.25 (#26668) 2021-10-12 00:44:32 +00:00
Daniele Cesarini
a36a5ab624 fleur: new package (#26631) 2021-10-12 02:31:23 +02:00
Todd Kordenbrock
af2ecf87d4 Faodel: Update for the v1.2108.1 release (#26516) 2021-10-11 23:17:42 +00:00
Valentin Volkl
a0b9face0d frontier-client: add missing openssl dependency (#26636) 2021-10-12 01:02:12 +02:00
Kevin Pedretti
0b62974160 openblas: fix build on riscv64 (#26565)
OpenBLAS now has support for the riscv64 architecture. This commit
extends the spack openblas package.py to handle building on riscv64.
2021-10-12 00:55:45 +02:00
Alan Sill
833f1de24a bcl2fastq2: add boost@1.55 as default boost dependency (#26655)
When PR #26659 is merged, boost@1.54.0 will be available to build too.

Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-11 22:21:17 +00:00
Bernhard Kaindl
4eb176b070 motif package: fix linking the simple_app demo program (#26574) 2021-10-11 15:00:42 -07:00
Desmond Orton
95e63e0c29 New Package Qualimap:2.2.1 (#26615)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-11 19:28:34 +00:00
Derek Ryan Strong
2644a15ff9 GNU parallel package: add version 20210922 (#26591) 2021-10-11 12:08:06 -07:00
Marie Houillon
c7785b74be openCARP packages: add version 8.1 (#26602) 2021-10-11 12:07:05 -07:00
Bernhard Kaindl
260f4ca190 mapserver: Add missing deps: giflib and postgresql (#26619) 2021-10-11 12:04:18 -07:00
Michael Kuhn
d1f3279607 installer: Support showing status information in terminal title (#16259)
When installing packages with a lot of dependencies, there is no easy way
to judge the current progress (apart from running `spack spec -I pkg`
in another terminal). This change allows Spack to update the terminal's
title with status information, including its current progress as well as
the current and total number of packages.
2021-10-11 17:54:59 +02:00
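For reference, a minimal stand-alone sketch of updating a terminal title with
progress text via the common xterm escape sequence (illustrative only, not the
installer code):

```python
import sys


def set_terminal_title(text):
    """Set the terminal title using the xterm OSC 0 escape sequence."""
    if sys.stdout.isatty():
        sys.stdout.write("\033]0;{0}\007".format(text))
        sys.stdout.flush()


# Example: report installer-style progress in the title.
set_terminal_title("Installing zlib [3/17]")
```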
Seth R. Johnson
8f62039d45 llvm: add conflict for newer apple-clang versions (#26633) 2021-10-11 17:49:23 +02:00
Bernhard Kaindl
2e9512fe8b mesa: gallium fails with llvm@13: use 'llvm@6:12', add mesa@21.2.3 (#26627)
The software rasterizer of Mesa's Gallium3D (even @21.2.3) fails to
build with llvm@13.0.0, so use: depends_on('llvm@6:12', when='+llvm')
2021-10-11 17:34:33 +02:00
Matthew Archer
e034930775 kahip: update build system to cmake for v3.11, retain scons for older versions (#25645)
* kahip: update to cmake for v3.11, retain scons for older versions

* kahip: update build system to cmake for v3.11, retain SCons for older versions

* address PR comments and add maintainer

* address PR comments - correct version to 2.10, add deprecated and url, and remove scons version
2021-10-11 10:20:43 -05:00
Daniele Cesarini
79aba5942a Update of siesta libs (#26489) 2021-10-11 09:09:39 -05:00
iarspider
78200b6b41 Add new versions of kiwisolver (#26597) 2021-10-11 09:09:01 -05:00
Harmen Stoppels
89220bc0e1 Only install env modifications in <prefix>/.spack (#24081)
- Do not store the full list of environment variables in
  <prefix>/.spack/spack-build-env.txt because it may contain user secrets.

- Only store environment variable modifications upon installation.

- Variables like PATH may still contain user and system paths to make
  spack-build-env.txt sourceable. Variables containing paths are
  modified through prepending/appending, and if we don't apply these
  to the current environment variable, we end up with statements like
  `export PATH=/path/to/spack/bin` with system paths missing, meaning
  no system binaries are in the path, which is a bad user experience.

- Do write the full environment to spack-build-env.txt in the staging dir,
  but ensure it is readonly for the current user, to make it a bit safer
  on shared systems.
2021-10-11 09:07:45 -05:00
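A toy illustration of the principle of storing only the modifications instead
of the full environment (the helper below is hypothetical, not Spack's actual
mechanism):

```python
import os


def env_modifications(before, after):
    """Return only the variables that were added or changed."""
    return {key: value for key, value in after.items()
            if before.get(key) != value}


before = dict(os.environ)
os.environ["EXAMPLE_CFLAGS"] = "-O2"  # stand-in for build-time changes
print(env_modifications(before, dict(os.environ)))
# -> {'EXAMPLE_CFLAGS': '-O2'}, not the whole (possibly sensitive) environment
```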
Harmen Stoppels
c0c9ab113e Add spack env activate --temp (#25388)
Creates an environment in a temporary directory and activates it, which
is useful for a quick ephemeral environment:

```
$ spack env activate -p --temp
[spack-1a203lyg] $ spack add zlib
==> Adding zlib to environment /tmp/spack-1a203lyg
==> Updating view at /tmp/spack-1a203lyg/.spack-env/view
```
2021-10-11 06:56:03 -04:00
Harmen Stoppels
f28b08bf02 Remove unused --dependencies flag (#25731) 2021-10-11 10:16:11 +02:00
iarspider
ad75f74334 New package: py-cppy (required for py-kiwisolver) (#26645) 2021-10-10 17:14:20 -05:00
Manuela Kuhn
4e1fdfeb18 py-nilearn: add 0.4.2 and 0.6.2 (#26635) 2021-10-09 19:14:13 -05:00
Manuela Kuhn
7f46b9a669 py-pyvistaqt: add new package (#26637) 2021-10-09 21:00:00 +00:00
Michael Kuhn
5deb8a05c4 environment-modules: fix build (#26632)
PR #25904 moved the `--with-tcl` option to only older versions. However,
without this option, the build breaks:
```
checking for Tcl configuration... configure: error: Can't find Tcl configuration definitions. Use --with-tcl to specify a directory containing tclConfig.sh
```
2021-10-09 12:43:54 -05:00
Bernhard Kaindl
9194ca1e2f idna: Add versions 2.9 for sierrapy #23768 and 3.2(current) (#26616)
py-sierrapy was merged accidentally and pins its versions to exact
numbers. Add 2.9 as the version needed by sierrapy, plus the current 3.2.
2021-10-09 11:31:11 -05:00
Joseph Wang
e6cc92554d py-awkward: add py-yaml as depends (#26626) 2021-10-09 17:35:22 +02:00
Joseph Wang
03ab5dee31 evtgen: fix pythia typo (#26625) 2021-10-09 16:11:58 +02:00
Bernhard Kaindl
93df47f4d5 librsvg: fix build when does not use -pthread for linking (#26592)
librsvg uses pthread_atfork() but does not use -pthread on Ubuntu 18.04 %gcc@8
2021-10-09 13:10:13 +02:00
Derek Ryan Strong
4b5cc8e3bd Add R 4.1.1 (#26589) 2021-10-08 20:02:03 -05:00
Derek Ryan Strong
e89d4b730a Add Julia 1.6.3 (#26624) 2021-10-09 00:58:19 +00:00
Massimiliano Culpo
2386630e10 Remove DB reindex during a read operation (#26601)
The DB should be what is trusted for certain operations.
If it is not present when read, we should assume the
corresponding store is empty, rather than trying a
write operation during a read.

* Add a unit test
* Document what needs to be there in tests
2021-10-08 22:35:23 +00:00
Bernhard Kaindl
21ce23816e py-twisted,py-storm: dep on zope.interface, bump storm version (#26610)
* py-twisted,py-storm: dep on zope.interface, bump storm version

py-twisted's and py-storm's import tests need zope.interface.
py-storm: Use pypi and add version 0.25. It didn't change the requirements.
zope.interface@4.5.0 imports the removed setuptools Feature: use setuptools@:45.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-storm: all deps updated with type=('build', 'run')

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 21:20:39 +00:00
psakievich
8ebb705871 Trilinos: update for CUDA and Nalu-Wind (#26614) 2021-10-08 21:08:54 +00:00
Bernhard Kaindl
7e8f2e0c11 py-hypothesis: Add variants django, dumpy, pandas and fix import tests (#26604) 2021-10-08 21:00:47 +00:00
Manuela Kuhn
654268b065 py-bcrypt, py-bleach, py-decorator, py-pygdal: fix python dependency (#26596)
* py-bcrypt, py-bleach, py-decorator, py-pygdal: fix python dependency

* Update var/spack/repos/builtin/packages/py-bleach/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 15:56:22 -05:00
Bernhard Kaindl
6d0d8d021a py-pymatgen: fix build of old versions, bump version to 2021.3.9 (#26249)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 19:54:40 +00:00
Bernhard Kaindl
f1f614ee9f py-anytree: Add dep py-six@1.9.0 as required by setup.py (#26603)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 19:36:12 +00:00
Manuela Kuhn
88c33686fd py-matplotlib: fix 3.4.3 (#26586)
* py-matplotlib: fix 3.4.3

* Update var/spack/repos/builtin/packages/py-matplotlib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 13:52:10 -05:00
Bernhard Kaindl
de86726483 py-traceback2: Fix depends_on: add six and py-linecache2 (#26607) 2021-10-08 13:35:36 -05:00
Bernhard Kaindl
a95c1dd615 py-keras-preprocessing: Add missing deps: six@1.9.0: and numpy@1.9.1: (#26605)
* py-keras-preprocessing: Add missing deps: six@1.9.0: and numpy@1.9.1:

Add deps: pip download --no-binary :all: keras-preprocessing==1.1.2
Collecting numpy>=1.9.1
  Installing build dependencies: started
Collecting six>=1.9.0

* Update var/spack/repos/builtin/packages/py-keras-preprocessing/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 13:34:25 -05:00
Harmen Stoppels
7d89a95028 Fix leaky spack.binary_distribution.binary_index in tests (#26609)
* Fix issues with leaky binary index across tests

* More rigorous binary_index reset as now other tests are failing :(
2021-10-08 13:41:47 -04:00
Manuela Kuhn
32b35e1a78 py-neurora: add new package (#26479) 2021-10-08 13:39:45 +00:00
Tamara Dahlgren
7f2611a960 Allow Version('') and map it to the empty tuple (#25953) 2021-10-08 10:36:54 +02:00
Rodrigo Ceccato de Freitas
169d0a5649 cling: add missing CMake dependency (#26577) 2021-10-08 09:28:37 +02:00
Adam J. Stewart
f390289629 More strict ReadTheDocs tests (#26580) 2021-10-08 09:27:17 +02:00
Daniel G Travieso
10de12c7d0 add hash field to spec on find --json and assert in test its there (#26443)
Co-authored-by: Daniel Travieso <daniel@dgtravieso.com>
2021-10-07 23:50:05 -07:00
Bernhard Kaindl
449a5832c8 llvm: llvm@13+libcxx needs a very recent C++ compiler (#26584)
libc++-13 does not support %gcc@:10, see:
https://bugs.llvm.org/show_bug.cgi?id=51359#c1

https://libcxx.llvm.org/#platform-and-compiler-support says:
- GCC    11     - latest stable release per GCC’s release page
- Clang: 11, 12 - latest two stable releases per LLVM’s release page
- AppleClang 12 - latest stable release per Xcode’s release page
2021-10-08 00:25:51 +00:00
Pedro Demarchi Gomes
28529f9eaf re2 pic support (#26513) 2021-10-08 01:01:40 +02:00
Oliver Perks
da31c7e894 Updatepackage/minigmg (#26467)
* MiniGMG, add support for optimised flags + SIMDe implementation of AVX intrinsics

* Add .gitlab-ci.yml

* NVHPC fast

* remove CI

* Syntax fix
2021-10-07 14:49:14 -05:00
Paul Ferrell
0b9914e2f5 Fix for license symlinking issue. (#26576)
When a symlink to a license file exists but is broken, the license symlink post-install hook fails
because os.path.exists() checks the existence of the target, not of the symlink itself.
os.path.lexists() is the proper function to use.
2021-10-07 19:18:35 +00:00
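The difference is easy to reproduce in a quick stand-alone example (the paths
are temporary and illustrative):

```python
import os
import tempfile

workdir = tempfile.mkdtemp()
target = os.path.join(workdir, "LICENSE")       # never created
link = os.path.join(workdir, "LICENSE.link")
os.symlink(target, link)                        # a broken symlink

print(os.path.exists(link))    # False: follows the link, the target is missing
print(os.path.lexists(link))   # True: the symlink itself exists
```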
Seth R. Johnson
9853fd50e2 itk: use CMakePackage helpers (#26572) 2021-10-07 10:49:51 -05:00
Scott Wittenburg
0561af1975 Pipelines: retry service job on system errors (#26508)
Retry rebuild-index, cleanup, and no-op jobs automatically if they fail
due to infrastructure-related problems.
2021-10-07 08:59:51 -06:00
Kevin Huck
b33a0923e1 apex: support profiling/tracing HIP applications (#26569)
libz is added for compressing google trace events output.
2021-10-07 16:22:40 +02:00
Harmen Stoppels
05834e7c9d Memoize the result of spack.platforms.host() (#26573) 2021-10-07 14:04:05 +02:00
Olivier Cessenat
20ee786d09 visit: add an external find function (determine_version) (#25817)
* visit: add an external find function (determine_version)

* visit: correct too long comment line

* visit: forgot to set executables

* visit: external find uses single-dash version

* visit: found as external by asking visit for its version
2021-10-07 10:19:58 +00:00
Manuela Kuhn
653d5e1590 py-mayavi: add 4.7.3 (#26566) 2021-10-06 19:15:34 -05:00
Tyler Funnell
60909f7250 fish: adding version 3.3.1 (#26488)
* fish: adding version 3.3.1

* adding maintainer

* Update var/spack/repos/builtin/packages/fish/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-06 20:42:45 +00:00
Manuela Kuhn
b787065bdb py-scikit-image: add 0.18.3 and fix dependencies (#26406) 2021-10-06 15:30:27 -05:00
Jen Herting
3945e2ad93 New package: py-clean-text (#26511)
* [py-clean-text] created template

* [py-clean-text]

- added description
- added dependencies
- removed fixmes
2021-10-06 19:44:33 +00:00
Tamara Dahlgren
affd2236e6 Provide more info in SbangPathError to aid CI debugging (#26316) 2021-10-06 21:03:33 +02:00
Kevin Pedretti
cb5b28392e Patch from upstream needed to build numactl on riscv64. (#26541)
The most recent release of numactl (2.0.14) fails to build on riscv64
because of a missing "-latomic". This patch from upstream resolves this
issue. It can be dropped once the next version of numactl is released.
2021-10-06 11:50:21 -07:00
Rodrigo Ceccato de Freitas
b1f7b39a06 ucx: fix typo in config description (#26564) 2021-10-06 18:44:22 +00:00
Manuela Kuhn
770c4fd0b2 py-nipype: add 1.4.2 (#26472) 2021-10-06 20:35:42 +02:00
Massimiliano Culpo
98ee00b977 Restore the correct computation of stores in environments (#26560)
Environments push/pop scopes upon activation. If some lazily
evaluated value depending on the current configuration was
computed and cached before the scopes are pushed / popped,
there will be an inconsistency in the current state.

This PR fixes the issue for stores, but it would be better
to move away from global state.
2021-10-06 11:32:26 -07:00
Jose E. Roman
fdf76ecb51 New release slepc4py 3.16.0 (#26468) 2021-10-06 11:18:53 -07:00
Manuela Kuhn
9a75f44846 py-svgutils: add 0.3.1 (#26470) 2021-10-06 11:15:36 -07:00
haralmha
6ce4d26c25 Add 1.4.2 (#26475) 2021-10-06 11:13:55 -07:00
haralmha
8fc04984ba Add 1.9.3 (#26483) 2021-10-06 11:10:58 -07:00
haralmha
cfde432b19 Add 1.1.1 (#26484) 2021-10-06 11:10:05 -07:00
haralmha
ae9428df84 Add 14.16.1 (#26485) 2021-10-06 11:09:06 -07:00
haralmha
533705f4e3 Add 1.0.2 (#26486) 2021-10-06 10:50:28 -07:00
haralmha
2b6d7cc74c Add 4.2.0 (#26498) 2021-10-06 10:35:04 -07:00
g-mathias
9978353057 New package: intel oneapi advisor (#26535)
Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2021-10-06 10:28:55 -07:00
Massimiliano Culpo
319ae9254e Remove the spack.architecture module (#25986)
The `spack.architecture` module contains an `Arch` class that is very similar to `spack.spec.ArchSpec` but points to platform, operating system and target objects rather than "names". There's a TODO in the class since 2016:

abb0f6e27c/lib/spack/spack/architecture.py (L70-L75)

and this PR basically addresses that. Since there are just a few places where the `Arch` class was used, here we query the relevant platform objects where they are needed directly from `spack.platforms`. This permits cleaning the code of vestigial logic.

Modifications:
- [x] Remove the `spack.architecture` module and replace its use by `spack.platforms`
- [x] Remove unneeded tests
2021-10-06 10:28:12 -07:00
g-mathias
232086c6af new package: intel oneapi inspector (#26549)
Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2021-10-06 10:25:18 -07:00
Danny McClanahan
e2e8e5c1e2 use conflicts() instead of referring to SpecError in mesa (#26562)
* mesa: use conflicts() instead of referring to SpecError
2021-10-06 10:07:42 -07:00
Paul R. C. Kent
fefe624141 llvm: new version 13.0.0 (#26563) 2021-10-06 16:53:49 +00:00
Alec Scott
4732c722e7 Add v1.56.2 to Rclone (#26556) 2021-10-06 09:53:34 -07:00
Alec Scott
97407795f6 Add v5.3.0 to Superlu (#26558) 2021-10-06 09:50:12 -07:00
haralmha
fe723b2ad8 py-jupyter-client: add 6.1.12 (#26503) 2021-10-06 09:21:10 -04:00
haralmha
d27bb7984a py-ptyprocess: add 0.7.0 (#26504) 2021-10-06 09:20:30 -04:00
haralmha
a59d6be99b py-setuptools: add 57.1.0 (#26505) 2021-10-06 09:19:49 -04:00
haralmha
ceb962fe17 py-six: add 1.16.0 (#26506) 2021-10-06 09:18:55 -04:00
Manuela Kuhn
11bdd18b1f py-matplotlib: fix qhull dependency (#26553) 2021-10-06 09:18:11 -04:00
Jen Herting
b92fd08028 py-ftfy: added version 6.0.3 (#26509) 2021-10-06 08:51:15 -04:00
Jen Herting
bc152fc6c4 New package: py-emoji (#26510)
* [py-emoji] created template

* [py-emoji]

- removed fixmes
- added homepage
- added description
- added dependencies
2021-10-06 08:50:24 -04:00
Frédéric Simonis
51f77f8af1 precice: add version 2.3.0 (#26551) 2021-10-06 08:44:59 -04:00
haralmha
8366f87cbf gnuplot: add version 5.4.2 (#26529) 2021-10-06 08:41:49 -04:00
haralmha
ab5962e997 hadoop: add version 2.7.5 (#26530) 2021-10-06 08:41:28 -04:00
haralmha
943f1e7a3a Add 7.6.3 (#26502) 2021-10-06 14:04:48 +02:00
Ivan Maidanski
1d258151f3 bdw-gc: add v8.0.6 (#26545) 2021-10-06 07:24:51 -04:00
haralmha
2cdfa33677 Add 6.0.2 (#26501) 2021-10-06 13:19:29 +02:00
haralmha
199bd9f8f8 py-decorator: Add version 5.0.9 and 5.10. (#26500)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-06 02:24:47 +00:00
haralmha
0c02f6c36d py-hepdata-validator: Add version 0.2.3 and 0.3.0 (#26496)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-06 02:15:06 +00:00
haralmha
c97c135f1d py-heapdict: Add version 1.0.0 (#26495)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-06 01:16:18 +00:00
haralmha
d07c366408 Add 1.28.1 (#26494)
py-grpcio: Add version 1.28.1
2021-10-06 02:27:55 +02:00
haralmha
df93becc99 Add 2021.4.1 (#26493)
py-distributed: Add version 2021.4.1
2021-10-06 02:26:05 +02:00
haralmha
5e9497e05a Add 0.7.1 (#26492)
py-defusedxml: add version 0.7.1
2021-10-06 02:25:04 +02:00
haralmha
2e28281f1a py-bleach: Add version 3.3.1 (#26490)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-06 00:18:09 +00:00
haralmha
e8b9074300 py-dask: add version 2021.4.1 (#26491) 2021-10-06 02:17:52 +02:00
haralmha
eafa0ff085 py-pygdal: Add versions 3.3.0.10 and 3.3.2.10 (#26528)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-05 23:57:59 +00:00
haralmha
b20e7317ab py-backcall: Add version 0.2.0 (#26487) 2021-10-06 01:55:27 +02:00
haralmha
2773178897 Add versions 3.1.6, 3.1.7 and 3.2.0 (#26527) 2021-10-06 01:48:18 +02:00
Bernhard Kaindl
e39866f443 h5utils: Fix build and add new version (#26133)
@1.12.1+png depends_on libpng@:1.5.0 because it needs the old API

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-10-05 23:13:02 +00:00
Massimiliano Culpo
d7fc17d5db Set explicitly write permission for packages (#26539) 2021-10-05 23:12:25 +00:00
Olivier Cessenat
cc6508ec09 vtk: gui support for vtk 9 added (#25636) 2021-10-05 22:54:39 +00:00
haralmha
3157a97743 Add version 3.16.1 (#26534) 2021-10-06 00:27:17 +02:00
haralmha
aac7e28ea0 dd4hep: Add version 01-18 (#26525)
* Add 01-18

* Keep master on top and change version name format
2021-10-05 17:37:31 -04:00
Kevin Pedretti
47607dcac5 Use gnuconfig package for config file replacement for RISC-V. (#26364)
* Use gnuconfig package for config file replacement for RISC-V.

This extends the changes in #26035 to handle RISC-V. Before this change,
many packages fail to configure on riscv64 due to config.guess being too
old to know about RISC-V. This is seen out of the box when clingo fails
to build from source due to pkgconfig failing to configure, throwing
error: "configure: error: cannot guess build type; you must specify one".

* Add riscv64 architecture

* Update vendored archspec from upstream project.
These archspec updates include changes needed to support riscv64.

* Update archspec's __init__.py to reflect the commit hash of archspec being used.
2021-10-05 19:22:55 +00:00
Harmen Stoppels
d998ea1bd4 Move shell aware env into spack.environment.shell (#25608)
Cherry-picked from #25564 so this is standalone.

With this PR we can activate an environment in Spack itself, without computing changes to environment variables only necessary for "shell aware" env activation.

1. Activating an environment:
    
    ```python
    spack.environment.activate(Environment(xyz)) -> None
    ```
    this basically just sets `_active_environment` and modifies some config scopes.

2. Activating an environment **and** getting environment variable modifications for the shell:

    ```python
    spack.environment.shell.activate(Environment(xyz)) -> EnvironmentModifications
    ```

This should make it easier/faster to do unit tests and scripting with spack, without the cli interface.
2021-10-05 18:25:43 +00:00
haralmha
713bbdbe7c postgresql: Add versions 14.0 and 12.2 (#26499) 2021-10-05 13:34:46 +02:00
haralmha
789beaffb7 doxygen: Add versions 1.9.2 and 1.8.18 (#26497) 2021-10-05 12:49:48 +02:00
Ricardo Jesus
4ee74c01e3 meme: Fix compilation with arm and nvhpc compilers (#24883)
* Fix compilation with `arm` and `nvhpc` compilers

* Update package.py
2021-10-05 05:55:12 -04:00
Jonas Thies
7f2fd50d20 phist: add a patch for the case +host arch=ppc64le (#26216) 2021-10-05 08:33:41 +00:00
Guilherme Perrotta
7ea9c75494 py-flatbuffers: add new package (#26444)
Python port of the "flatbuffers" package
2021-10-05 10:10:43 +02:00
Ivan Maidanski
1a651f1dca libatomic_ops: add v7.6.12 (#26512) 2021-10-05 10:08:57 +02:00
Manuela Kuhn
4e6f2ede0a py-rsatoolbox: add new package (#26514) 2021-10-05 10:07:03 +02:00
Axel Huebl
e1fb77496c WarpX: 21.10 (#26507) 2021-10-05 10:01:26 +02:00
David Gardner
85de527668 sundials: Add 5.8.0 and sycl variant (#26524) 2021-10-05 10:00:11 +02:00
Heiko Bauke
f6a4f6bda7 mpl: add new package (#26522) 2021-10-05 09:59:14 +02:00
Mihael Hategan
95846ad443 py-parsl: new package (see https://parsl-project.org/) (#26360) 2021-10-05 09:53:02 +02:00
Massimiliano Culpo
337b54fab0 Isolate bootstrap configuration from user configuration (#26071)
* Isolate bootstrap configuration from user configuration

* Search for build dependencies automatically if bootstrapping from sources

The bootstrapping logic will search for build dependencies
automatically if bootstrapping anything form sources. Any
external spec, if found, is written in a scope that is specific
to bootstrapping.

* Don't clean the bootstrap store with "spack clean -a"

* Copy bootstrap.yaml and config.yaml in the bootstrap area
2021-10-05 09:16:09 +02:00
Seth R. Johnson
3cf426df99 zstd: fix install name on macOS (#26518)
Reverting from CMake to Make for the install caused
`-install_path=/usr/local/lib/libzstd.1.dylib` to be hardcoded into the
zstd library. Now we explicitly pass the PREFIX into the build command so that
the correct spack install path is saved.

Fixes #26438 and also the ROOT install issue I had :)
2021-10-05 02:03:48 +00:00
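An illustrative excerpt in Spack's package DSL of the approach described
above, passing `PREFIX` to `make install`. `MakefilePackage` and the `make`
executable are provided by Spack's build environment; this is a sketch, not
the actual zstd recipe.

```python
# Sketch only: the real package defines versions, variants, etc.
from spack import *  # noqa: F401,F403  (Spack package DSL)


class Zstd(MakefilePackage):
    """Pass PREFIX so the dylib's install_name records the Spack prefix."""

    def install(self, spec, prefix):
        make("install", "PREFIX={0}".format(prefix))
```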
Nisarg Patel
b5673d70de molden: fix build with gcc@10: (#25803) 2021-10-05 01:36:20 +00:00
Todd Gamblin
84c878b66a cc: make error messages more clear
- [x] Our wrapper error messages are sometimes hard to differentiate from other build
      output, so prefix all errors from `die()` with '[spack cc] ERROR:'

- [x] The error we raise when running, say, `fc` without a Fortran compiler was not
      clear enough. Clarify the message and the comment.
2021-10-04 18:30:19 -07:00
Todd Gamblin
052b2e1b08 cc: convert compiler wrapper to posix shell
This converts everything in cc to POSIX sh, except for the parts currently
handled with bash arrays. Tests are still passing.

This version tries to be as straightforward as possible. Specifically, most conversions
are kept simple -- convert ifs to ifs, handle indirect expansion the way we do in
`setup-env.sh`, only mess with the logic in `cc`, and don't mess with the python code at
all.

The big refactor is for arrays. We can't rely on bash's nice arrays and be ignorant of
separators anymore. So:

1. To avoid complicated separator logic, there are three types of lists. They are:

    * `$lsep`-separated lists, which end with `_list`. `lsep` is customizable, but we
      picked `^G` (alarm bell) for `$lsep` because it's ASCII and it's unlikely that it
      would actually appear in any arguments. If we need to get fancier (and I will lose
      faith in the world if we do) then we could consider XON or XOFF.
    * `:`-separated directory lists, which end with `_dirs`, `_DIRS`, `PATH`, or `PATHS`
    * Whitespace-separated lists (like flags), which can have any other name.

    Whitespace and colon-separated lists come with the territory with PATHs from env
    vars and lists of flags. `^G` separated lists are what we use for most internal
    variables, b/c it's more likely to work.

2. To avoid subshells, use a bunch of functions that do dirty `eval` stuff instead. This
   adds 3 functions to deal with lists:

    * `append LISTNAME ELEMENT [SEP]` will put `ELEMENT` at the end of the list called
      `LISTNAME`. You can optionally say what separator you expect to use. Note that we
      are taking advantage of everything being global and passing lists by name.

    * `prepend LISTNAME ELEMENT [SEP]` like append, but puts `ELEMENT` at the start of
      `LISTNAME`

    * `extend LISTNAME1 LISTNAME2 [PREFIX]` appends everything in LISTNAME2 to
       LISTNAME1, and optionally prepends `PREFIX` to every element (this is useful for
       things like `-I`, `-isystem `, etc.)

    * `preextend LISTNAME1 LISTNAME2 [PREFIX]` prepends everything in LISTNAME2 to
       LISTNAME1 in order, and optionally prepends `PREFIX` to every element.

The routines determine the separator for each argument by its name, so we don't have to
pass around separators everywhere. Amazingly, as long as you do not expand variables'
values within an `eval` environment, you can do all this and still preserve quoting.
When iterating over lists, the user of this API still has to set and unset `IFS`
properly.

We ended up having to ignore shellcheck SC2034 (unused variable), because using evals
all over the place means that shellcheck doesn't notice that our list variables are
actually used.

So far this is looking pretty good. I took the most complex unit test I could find
(which runs a sample link line) and ran the same command line 200 times in a shell
script.  Times are roughly as follows:

For this invocation:

```console
$ bash -c 'time (for i in `seq 1 200`; do ~/test_cc.sh > /dev/null; done)'
```

I get the following performance numbers (the listed shells are what I put in `cc`'s
shebang):

**Original**
* Old version of `cc` with arrays and `bash v3.2.57` (macOS builtin): `4.462s` (`.022s` / call)
* Old version of `cc` with arrays and `bash v5.1.8` (Homebrew): `3.267s` (`.016s` / call)

**Using many subshells (#26408)**
*  with `bash v3.2.57`: `25.302s` (`.127s` / call)
*  with `bash v5.1.8`: `27.801s` (`.139s` / call)
*  with `dash`: `15.302s` (`.077s` / call)

This version didn't seem to work with zsh.

**This PR (no subshells)**
*  with `bash v3.2.57`: `4.973s` (`.025s` / call)
*  with `bash v5.1.8`: `4.984s` (`.025s` / call)
*  with `zsh`: `2.995s` (`.015s` / call)
*  with `dash`: `1.890s` (`.0095s` / call)

Dash, with the new posix design, is easily the winner.

So there are several interesting things to note here:

1. Running the posix version in `bash` is slower than using `bash` arrays. That is to be
   expected because it's doing a bunch of string processing where it likely did not have
   to before, at least in `bash`.

2. `zsh`, at least on macOS, is significantly faster than the ancient `bash` they ship
   with the system. Using `zsh` with the new version also makes the posix wrappers
   faster than `develop`. So it's worth preferring `zsh` if we have it. I suppose we
   should also try this with newer `bash` on Linux.

3. `bash v5.1.8` seems to be significantly faster than the old system `bash v3.2.57` for
   arrays. For straight POSIX stuff, it's a little slower. It did not seem to matter
   whether `--posix` was used.

4. `dash` is way faster than `bash` or `zsh`, so the real payoff just comes from being
   able to use it. I am not sure if that is mostly startup time, but it's significant.
   `dash` is ~2.4x faster than the original `bash` with arrays.

So, doing a lot of string stuff is slower than arrays, but converting to posix seems
worth it to be able to exploit `dash`.

- [x] Convert all but array-related portions to sh
- [x] Fix basic shellcheck issues.
- [x] Convert arrays to use a few convenience functions: `append` and `extend`
- [x] Get `cc` tests passing.
- [x] Add `cc` tests where needed passing.
- [x] Benchmarking.

Co-authored-by: Tom Scogland <scogland1@llnl.gov>
Co-authored-by: Danny McClanahan <1305167+cosmicexplorer@users.noreply.github.com>
2021-10-04 18:30:19 -07:00
Todd Gamblin
472638f025 .gitignore needs to be below env and ENV for case-insensitive FS 2021-10-04 18:30:19 -07:00
haralmha
8d0025b8af Add 5.2.0 (#26481) 2021-10-04 19:37:34 -05:00
Martin Pokorny
32f8dad0e2 log4cxx: new version and fix for c++11 (#26480)
* Add version 0.12.1

* Add variant to build with C++11 standard

Building with the C++11 standard requires boost threads and needs an explicit
setting of CMAKE_CXX_STANDARD.
2021-10-05 00:34:03 +00:00
KoyamaSohei
c426386f46 intel-tbb: install pkgconfig file (#23977)
* intel-tbb: install pkgconfig file

* intel-tbb: install pkgconfig file when @:2021.2.0

* intel-tbb: add blank line

* intel-tbb: fix library name to refer

* intel-tbb: fix library name to refer again

* intel-tbb: use self.prefix.lib.pkgconfig
2021-10-04 21:38:31 +00:00
Tamara Dahlgren
5a9e5ddb3d Stand-alone tests: distinguish NO-TESTS from PASSED (#25880)
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2021-10-04 19:57:08 +00:00
snehring
675bdada49 esys-particle: add new package esys-particle (#25338)
* esys-particle: add new package esys-particle

* esys-particle: requested changes

* esys-particle: placating isort
2021-10-04 19:41:31 +00:00
Daniel Ahlin
c59223effe gcc: apply nvcc patch to more versions (#24295)
Apply to all known affected versions (10.1.0 to 10.3.0 and 11.1.0)
2021-10-04 21:02:57 +02:00
haralmha
6a6bc3ecbb git: new version 2.29.2 (#26478) 2021-10-04 18:23:38 +00:00
iarspider
b88c80bb1b Add new pythia8 variants and fix url_for_version (#26280)
* Add new pythia8 variants
Add proper description to the existing ones

* Flake-8

* Flake-8

* Style

* Fix Pythia8 recipe

* Fix typo in variant description

* Update package.py
2021-10-04 12:52:51 -05:00
haralmha
b9e8d2d648 cppgsl: new version 3.1.0 (#26477) 2021-10-04 13:25:44 -04:00
Manuela Kuhn
377c781d7e py-scikit-learn: add 0.19.2 and 0.22.2.post1 (#26473) 2021-10-04 18:48:26 +02:00
snehring
9a8712f4cb masurca: Fix build with glibc-2.32+ (#26173)
remove sysctl.h which is not used by the build
2021-10-04 17:59:01 +02:00
Pedro Demarchi Gomes
90fa50d9df Avoid replacing symlinked spack.yaml when concretizing an environment (#26428) 2021-10-04 14:59:03 +00:00
Pedro Demarchi Gomes
4ae71b0297 use umpire version 5.0.1:5 (#26451) 2021-10-04 16:32:47 +02:00
Vicente Bolea
572791006d paraview: add v5.10.0-RC1 release (#26291) 2021-10-04 09:32:03 -04:00
mcuma
fa528c96e6 octave: add support for MKL (#25952) 2021-10-04 14:57:57 +02:00
iarspider
4f1c195bf9 grpc: update recipe (#25261)
1) forward `+shared` to re2
2) add `cxxstd` variant
3) use "more idiomatic" way of setting CMake options
2021-10-04 14:53:17 +02:00
Andreas Baumbach
35dd474473 Improve an error message in stage.py (#23737) 2021-10-04 14:47:14 +02:00
Tiziano Müller
2ca58e91d6 cp2k: bugfix for intel oneapi (#26143) 2021-10-04 14:41:49 +02:00
Oliver Perks
ad98e88173 arm: added download and better detection (#25460) 2021-10-04 14:35:20 +02:00
Harmen Stoppels
e557730c96 gnupg: Add version 1 (#23798)
From the gnupg.org website:

> GnuPG 1.4 is the old, single binary version which still support the
> unsafe PGP-2 keys. However, it lacks many modern features and will
> receive only important updates.

I'm starting to appreciate gpg1 more, because it is relocatable (gpg2
has hard-coded paths to gpg-agent and other tools) and it does not
require gpg-agent at all.
2021-10-04 08:29:41 -04:00
iarspider
50b1b654c9 evtgen: fix mac build and version 2.0.0 with pythia >= 8.304. (#25933) 2021-10-04 14:06:49 +02:00
Manuela Kuhn
e9accaaca1 py-duecredit: add v0.6.5 (#26465) 2021-10-04 14:01:12 +02:00
Manuela Kuhn
68648b3ff5 py-nibabel: add v2.4.1 (#26462) 2021-10-04 14:00:49 +02:00
Valentin Volkl
e4d5bcf916 podio: remove build_type variant (provided by base package class already) (#26463) 2021-10-04 14:00:20 +02:00
Bernhard Kaindl
3fff338844 freebayes: Fix running the testsuite and add pkgconfig (#26440)
freebayes needs the tools from its bundled vcflib to run the tests
2021-10-04 14:00:08 +02:00
Nikolay Simakov
225927c1c3 graph500: added option -fcommon for gcc@10.2 (#25367)
* graph500: added option -fcommon for gcc@10.2:, otherwise failed to build with "multiple definition of `column'"

* graph500: moved setting cflag to flag_handler
2021-10-04 13:42:33 +02:00
iarspider
bf9cf87d9b libunwind: add variants (#26099) 2021-10-04 13:04:39 +02:00
Manuela Kuhn
f1839c6aae py-pybids: add v0.9.5 (#26461) 2021-10-04 10:35:06 +00:00
Harmen Stoppels
d84d7f0599 Disable tests on hip for cmake 3.21 (#26460) 2021-10-04 10:20:07 +00:00
Manuela Kuhn
e2ee3066cf py-mne: add new package (#26446) 2021-10-04 09:56:58 +00:00
Manuela Kuhn
1f451924b1 py-sphinxcontrib-napoleon: add new package (#26457) 2021-10-04 09:52:56 +00:00
Manuela Kuhn
8bcbead42b py-templateflow: add new package (#26458) 2021-10-04 09:47:37 +00:00
Dylan Simon
9926799aeb Fix NonVirtualInHierarchyError message format (#26008) 2021-10-04 10:38:09 +02:00
Seth R. Johnson
205b414162 hdf5: allow implicit functions on apple-clang (#26409)
Work around issues in older hdf5 build and overzealous build flags:
```
 >> 1420    /var/folders/j4/fznvdyhx4875h6fhkqjn2kdr4jvyqd/T/9te/spack-stage/spack-stage-hdf5-1.10.4-feyl6tz6hpx5kl7m33avpuacwje2ubul/spack-src/src/H5Odeprec.c:141:8: error: implicit decl
             aration of function 'H5CX_set_apl' is invalid in C99 [-Werror,-Wimplicit-function-declaration]
```
2021-10-04 10:19:56 +02:00
Seth R. Johnson
5d431087ab python: correctly disable ~tkinter when @3.8 (#26365)
The older patch does not apply so the build ends up failing:
```
     1539    In file included from /private/var/folders/fy/x2xtwh1n7fn0_0q2kk29xkv9vvmbqb/T/s3j/spack-stage/spack-stage-python-3.8.11
             -6jyb6sxztfs6fw26xdbc3ktmbtut3ypr/spack-src/Modules/_tkinter.c:48:
  >> 1540    /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/usr/include/tk.h:86:11: f
             atal error: 'X11/Xlib.h' file not found
     1541    #       include <X11/Xlib.h>
     1542                    ^~~~~~~~~~~~
     1543    1 error generated.
```
2021-10-04 10:18:23 +02:00
iarspider
7104b8599e Update byacc (#25837)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-04 09:55:20 +02:00
G-Ragghianti
7009724f0d papi: new variants rocm and rocm_smi (#26214) 2021-10-04 09:53:17 +02:00
Manuela Kuhn
81abbb41bc git-annex: add 8.20210804 for amd64 (#26449) 2021-10-04 09:50:05 +02:00
Manuela Kuhn
d0da7c8440 py-pockets: add new package (#26452) 2021-10-04 09:43:10 +02:00
Massimiliano Culpo
69abc4d860 Build ppc64le docker images (#26442)
* Update archspec
* Add ppc64le to docker images
2021-10-04 09:34:53 +02:00
Massimiliano Culpo
e91815de7c Update Spec.full_hash docstring (#26456)
The docstring is outdated since #21735
when the build hash has been included
in the full hash.
2021-10-04 07:33:22 +00:00
Garth N. Wells
baa50c6679 FEniCSx: fix CMake root directory and dependency versions (#26445) 2021-10-04 09:20:53 +02:00
Manuela Kuhn
44d7218038 py-seaborn: add v0.11.2 (#26447) 2021-10-04 09:17:38 +02:00
Manuela Kuhn
9e0aabdef3 py-transforms3d: add new package (#26448) 2021-10-04 09:17:04 +02:00
Joseph Wang
91598912f7 ocaml: add v4.13.1 (#26453)
This is needed in order to make it work with glibc-2.34
2021-10-04 08:47:30 +02:00
Daniel Arndt
862bcc5059 ArborX: Explicitly set path to Kokkos (#26347)
* Explicitly set path to Kokkos for ArborX testing

* Improve formatting

* Update var/spack/repos/builtin/packages/arborx/package.py

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* Remove blank line

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2021-10-03 23:04:52 -04:00
Hector
2f6590ddd1 source-highlight: add gcc11 patch (#26412) 2021-10-03 16:07:39 +00:00
Lorién López Villellas
c0b383b3e1 PICSAR: added support for GCC >10.0 and arm compiler (#24927)
* -fallow-argument-mismatch flag added when compiling with GCC to avoid a compilation error when using a GCC version > 10.0.

Co-authored-by: Haz99 <jsalamerosanz@gmail.com>

* Filtered every occurrence of "!$OMP SIMD SAFELEN(LVEC2)" when compiling with nvhpc to avoid a compilation error.

Co-authored-by: Haz99 <jsalamerosanz@gmail.com>

* Line with more than 80 characters split into multiple lines.

Co-authored-by: Haz99 <jsalamerosanz@gmail.com>
2021-10-03 10:11:04 -04:00
Jordan Galby
0d6a2381b2 Fix JSONDecodeError when using compiler modules (#25624)
When using modules for a compiler (and/or an external package), if a
package's `setup_[dependent_]build_environment` sets `PYTHONHOME`, it
can influence the python subprocess executed to gather module
information.

The error seen was:

```
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
```

But the actual hidden error happened in the `python -c 'import
json...'` subprocess, which made it return an empty string as json:

```
ModuleNotFoundError: No module named 'encodings'
```

This fix uses `python -E` to ignore `PYTHONHOME` and
`PYTHONPATH`. This should be safe here because the python subprocess code
only uses packages built into python.

The python subprocess in `environment.py` was also patched to be safe
and consistent.
2021-10-03 16:10:33 +02:00
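A stand-alone sketch of the safer subprocess invocation pattern (the code run
in the child interpreter is illustrative):

```python
import json
import subprocess
import sys

# -E makes the child interpreter ignore PYTHONHOME and PYTHONPATH coming from
# compiler or external-package modules, so it can always import its own stdlib.
code = "import json, platform; print(json.dumps(platform.python_version()))"
out = subprocess.check_output([sys.executable, "-E", "-c", code])
print(json.loads(out))
```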
Harmen Stoppels
b9e72557e8 Remove .99 from version ranges (#26422)
In most cases, .99 can be omitted thanks to #26402.
2021-10-03 09:09:02 -04:00
Tim Moon
2de116d285 DiHydrogen: Specify required NVSHMEM variants (#26384) 2021-10-03 08:30:27 -04:00
Bernhard Kaindl
f6ce95bfcb chombo: several build fixes: csh, space regex and install (#26424)
See the description of the PR for details on the changes.
2021-10-03 14:28:28 +02:00
Harmen Stoppels
e53bd0eba8 bison: new versions 3.8:3.8.2 (#26439) 2021-10-03 07:57:03 -04:00
Takahiro Ueda
d603f4519d form: use version tarballs, fix and enable gmp and zlib variants (#26425)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
Co-authored-by: iarspider <iarspider@gmail.com>
2021-10-03 08:45:33 +00:00
Michael Kuhn
d4b3724557 meson: add 0.59.2 (#26430) 2021-10-03 10:41:08 +02:00
Michael Kuhn
abce326ea3 openblas: add 0.3.18 (#26429) 2021-10-03 10:38:48 +02:00
Manuela Kuhn
4c1285b1c2 py-svgutils: new package (#26437) 2021-10-03 02:38:36 +02:00
Marc Fehling
34a1e4d8b6 p4est: bump/update to 2.8 (#26081) 2021-10-03 01:56:03 +02:00
Manuela Kuhn
ee568cb6a4 py-packaging: bump version to 21.0 (#26436) 2021-10-02 23:31:24 +00:00
Manuela Kuhn
0db072dc4a py-attrs: bump version to 21.2.0 (#26435) 2021-10-02 23:23:02 +00:00
Manuela Kuhn
6a188b2b13 py-nitransforms: new package (#26434) 2021-10-03 01:06:13 +02:00
Wouter Deconinck
846428a661 podio: add supported CMAKE_BUILD_TYPE values (#26432) 2021-10-03 00:38:36 +02:00
Manuela Kuhn
aee1d44edf py-pybids: bump version to 0.13.2 (#26433) 2021-10-03 00:36:02 +02:00
Sreenivasa Murthy Kolam
f1c83b00d4 atmi: replace /usr/bin/rsync with depends_on rsync (#26353) 2021-10-03 00:32:46 +02:00
Manuela Kuhn
8dcd568769 py-ipyvtk-simple: new package (#26431) 2021-10-02 23:18:08 +02:00
Josh Bowden
aaf35fd520 damaris: bump version to 1.5.0, add variant examples (#26220) 2021-10-02 23:06:17 +02:00
Manuela Kuhn
bc6edc7baf py-ipycanvas: new package (#26426) 2021-10-02 22:30:09 +02:00
Olivier Cessenat
2ce3be3a27 fltk: new version 1.3.7, new variant xft (#26423) 2021-10-02 16:27:38 -04:00
Manuela Kuhn
b65360f721 py-ipyevents: add new package (#26427) 2021-10-02 22:23:09 +02:00
Robert Romero
57803887e8 cddlib: use github URLs and bump version (#26315)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-02 19:50:32 +00:00
Weiqun Zhang
43e08af7b2 amrex: bump version to 21.10 (#26416) 2021-10-02 18:25:50 +02:00
Massimiliano Culpo
0e469fc01e clingo-bootstrap: add a variant for static libstdc++ (#26421) 2021-10-02 14:53:24 +02:00
Mihael Hategan
b3d3ce1c37 py-pytest-random-order: new package - plugin for py-pytest. (#26413) 2021-10-02 12:38:09 +02:00
Rob Falgout
d9f25f223f Add hypre release 2.23.0 (#26418) 2021-10-02 09:52:50 +02:00
Howard Pritchard
3575c14a6a ucx: add version 1.11.2 (#26417)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2021-10-02 09:49:48 +02:00
Wouter Deconinck
314f5fdb97 New versions: geant4@10.7.2, geant4-data@10.7.2 (#25449)
* [geant4] new version 10.7.2

* [geant4-data] new version 10.7.2
2021-10-01 17:08:15 -07:00
Mihael Hategan
d8f23c0f6d Added py-typeguard package. (#26411)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-01 23:58:39 +00:00
Harmen Stoppels
d0e49ae4bb Simplify setup_package in build environment (#26070)
* Remove redundant preserve environment code in build environment

* Remove fix for a bug in a module

See https://github.com/spack/spack/issues/3153#issuecomment-280460041,
this shouldn't be part of core spack.

* Don't module unload cray-libsci on all platforms
2021-10-01 19:41:30 -04:00
Cory Bloor
b6169c213d Fix error message when test throws AttributeError (#25895)
Narrow the scope of the try/except block, to avoid a misleading
error message if fn() throws an AttributeError.
2021-10-01 19:40:24 -04:00
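A generic stand-alone illustration of why narrowing the guard matters (all
names are made up): with the narrow scope, an AttributeError raised inside
`fn()` surfaces as the real failure instead of being reported as a missing
attribute.

```python
def run_named(obj, name):
    """Only the attribute lookup is guarded; errors inside fn() propagate."""
    try:
        fn = getattr(obj, name)
    except AttributeError:
        return "no such test method: {0}".format(name)
    return fn()  # an AttributeError raised in here is not swallowed


class Example:
    def test(self):
        raise AttributeError("bug inside the test itself")


print(run_named(Example(), "missing"))  # no such test method: missing
try:
    run_named(Example(), "test")
except AttributeError as err:
    print("real failure surfaced: {0}".format(err))
```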
Mihael Hategan
d19105f761 Added py-sqlalchemy-stubs package. (#26414)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-01 21:50:55 +00:00
bernhardkaindl
9893718f78 axel(download manager): fix build, bump version (#26140) 2021-10-01 23:48:33 +02:00
Seth R. Johnson
2f630226e7 trilinos: add conflict and enable gtest for developers (#26150) 2021-10-01 15:53:16 -04:00
Harmen Stoppels
50feaae81c Allow non-empty ranges 1.1.0:1.1 (#26402) 2021-10-01 12:23:26 -07:00
Chris Richardson
bce269df13 Fenicsx update dependency and build for main branch (#25864)
* Updates to make it work again for main
* Use property

Co-authored-by: Chris <cnr12@cam.ac.uk>
2021-10-01 12:05:19 -07:00
Harmen Stoppels
18760de972 Spack install: handle failed restore of backup (#25647)
Spack has logic to preserve an installation prefix when it is being
overwritten: if the new install fails, the old files are restored.
This PR adds error handling for when this backup restoration fails
(i.e. the new install fails, and then some unexpected error prevents
restoration from the backup).
2021-10-01 11:40:48 -07:00
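A rough sketch of the error-handling shape this describes (placeholder names, not Spack's actual implementation): the restore step is guarded separately so that a failure there is reported together with the original install error instead of masking it.
```py
def overwrite_install(install, restore_backup, prefix):
    """Try a fresh install; if it fails, try to put the old prefix back."""
    try:
        install(prefix)
    except Exception as install_error:
        try:
            restore_backup(prefix)
        except Exception as restore_error:
            # Surface both failures instead of dying on the second one
            # with a more obscure message.
            raise RuntimeError(
                'install of %s failed and the backup could not be restored: '
                '%s (restore error: %s)'
                % (prefix, install_error, restore_error)
            ) from install_error
        raise  # backup restored; re-raise the original install failure
```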
Olivier Cessenat
df590bb6ee hypre: add version 2.22.1; add fortran variant; becomes AutotoolsPackage (#25781) 2021-10-01 11:31:24 -07:00
bernhardkaindl
da171bd561 uftrace(dynamic function graph tracer): bump version (#26224) 2021-10-01 18:28:25 +00:00
Luciano Zago
f878f769a2 gh: add new package (#26405) 2021-10-01 17:58:47 +00:00
bernhardkaindl
ffb6597808 xrootd: Add current version 5.3.1 and restrict to openssl@1 (#26303) 2021-10-01 17:50:35 +00:00
Piotr Luszczek
f75ad02ba2 Add new package: ffte (#26340) 2021-10-01 17:27:38 +00:00
Scott Wittenburg
9f09156923 Retry pipeline generation jobs in certain cases 2021-10-01 10:12:37 -07:00
Scott Wittenburg
d11156f361 Add DAG scheduling to child pipelines 2021-10-01 10:12:37 -07:00
Scott Wittenburg
4637c51c7f Service jobs do not need an active environment 2021-10-01 10:12:37 -07:00
Scott Wittenburg
d233c20725 Activate concrete env dir in all service jobs 2021-10-01 10:12:37 -07:00
Scott Wittenburg
5f527075bf Use same image to build e4s as to generate it 2021-10-01 10:12:37 -07:00
Scott Wittenburg
f6fcfd2acc Use same image to build dvsdk as to generate it 2021-10-01 10:12:37 -07:00
Scott Wittenburg
ae092915ac Use default runner image for radiuss 2021-10-01 10:12:37 -07:00
Victor
98466f9b12 Add pmix variant to openmpi package (#26342)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-10-01 18:43:55 +02:00
Manuela Kuhn
0735419592 py-traitsui: add v7.2.1 (#26392) 2021-10-01 15:57:21 +00:00
Daniel Arndt
7293913e1c DataTransferKit: add v3.1-rc3 (#26269) 2021-10-01 17:33:09 +02:00
Seth R. Johnson
f2ef34638d graphviz: macOS patch not needed on newer version (#26292)
The autoconf file was updated in 2.49.0 but is still needed in older
versions.
2021-10-01 17:30:23 +02:00
bernhardkaindl
6aff4831e3 cmor: bump version and depend on the specific libuuid packages needed by version. (#26149) 2021-10-01 17:29:39 +02:00
Georgiana Mania
7d77f36f83 eigen: add v3.4.0 (#26296) 2021-10-01 17:24:58 +02:00
haralmha
9c8041b8e4 hoppet: add new package (#26297)
Co-authored-by: Harald Minde Hansen <hahansen@office.dyndns.cern.ch>
2021-10-01 17:24:17 +02:00
Manuela Kuhn
eff5bd4005 py-coverage: add v5.5 (#26407) 2021-10-01 15:14:41 +00:00
Axel Huebl
177fe731b1 llvm-openmp: 12.0.1 (#26320)
Add the latest LLVM OpenMP release.
This one compiles for aarch64/M1 on mac.
2021-10-01 17:06:12 +02:00
bernhardkaindl
6b96486d42 py-gsutil: add v5.2 (#26337) 2021-10-01 17:03:54 +02:00
Daniel Arndt
f168d264d1 ArborX: add v1.1 (#26344) 2021-10-01 17:02:05 +02:00
Michael Kuhn
12dc01318e otf: Explicitly disable MPI and ZOIDFS (#26346)
Otherwise, global installations of MPI could be picked up by OTF.
2021-10-01 17:01:31 +02:00
bernhardkaindl
ad8b8b3377 damask-grid, damask-mesh: fix spack install --test=root (#26350) 2021-10-01 16:59:31 +02:00
Edgar Leon
4078c5e537 mpibind: add v0.7.0 and new flux variant (#26359) 2021-10-01 16:56:22 +02:00
Adam Moody
60aa97b25f LWGRP: add v1.0.4 DTCMP: add v1.1.2 and v1.1.3 (#26362) 2021-10-01 16:42:10 +02:00
Satish Balay
6362b3014a petsc,py-petsc4py add versions 3.15.5, 3.16.0 (#26375) 2021-10-01 16:37:28 +02:00
Christoph Conrads
07e91e8afa med: work around "could not find TARGET hdf5" error (#25312)
Fixes #24671, fixes #26051
2021-10-01 10:27:22 -04:00
Ryan Mast
c3c28d510f ns-3-dev: add v3.31, v3.32, v3.33, and v3.34 (#26383) 2021-10-01 16:15:39 +02:00
Pieter Ghysels
d765b4b53d STRUMPACK: add v6.0.0 (#26386)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-01 16:12:41 +02:00
Manuela Kuhn
66866e45f2 py-pyface: add v7.3.0 (#26389) 2021-10-01 16:07:27 +02:00
Manuela Kuhn
ef202cbd37 py-ipywidgets: add 7.6.5 and get sources from pypi (#26393) 2021-10-01 16:06:25 +02:00
Mihael Hategan
e9609b62c6 Python Globus SDK (globus.org): add new package (#26395) 2021-10-01 16:05:49 +02:00
bernhardkaindl
da7666a967 py-pyside: restrict version of dependency on v1.2.2 (#26396)
Restrict @:1.2.2 to older Sphinx versions so the build does not fail
2021-10-01 16:04:36 +02:00
Manuela Kuhn
627c7007ef py-pyvista: add new package (#26398) 2021-10-01 16:00:30 +02:00
Anna Masalskaya
8fc770608d Add oneAPI packages from 2021.4 release (#26401) 2021-10-01 15:58:32 +02:00
Jose E. Roman
91f668a695 SLEPc: add v3.16 (#26403) 2021-10-01 15:55:42 +02:00
Manuela Kuhn
f775e81fca py-fury: add new package (#26404) 2021-10-01 15:53:50 +02:00
Harmen Stoppels
5b9f60f9bb bootstrapping: improve error messages (#26399) 2021-10-01 13:00:14 +02:00
bernhardkaindl
6ca42f0199 boost: @1.77.0: need updated python_jam.patch for +python (#26363)
One hunk changed and the new patch is refreshed using quilt.
2021-10-01 11:09:32 +02:00
Tamara Dahlgren
2d1ebbe0a2 Add info command tests to increase coverage (#26127) 2021-09-30 22:13:45 -04:00
Tamara Dahlgren
3feab42203 py-setuptools-rust: remove checksum url workaround (#25843) 2021-09-30 22:13:26 -04:00
Tamara Dahlgren
7aedeca764 Replace spec-related install_test asserts with exceptions; added unit tests (#25982) 2021-09-30 22:05:06 -04:00
Manuela Kuhn
c58cd83e38 py-shiboken2: add new package (#26390) 2021-10-01 01:19:41 +00:00
Manuela Kuhn
032ce20171 vtk: fix install path to lib (#26387) 2021-10-01 02:52:09 +02:00
Paul Spencer
046ed47b1f New package: slurm-drmaa (#25424)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-01 00:07:03 +00:00
Manuela Kuhn
c076aeb192 py-jupyterlab: add new package (#26357) 2021-10-01 01:33:28 +02:00
Massimiliano Culpo
74d36076c2 ArchSpec: minor cleanup of a few methods (#26376)
* Remove vestigial code to be compatible with Spack v0.9.X
* ArchSpec: reworked __repr__ to better follow common Python idioms
* ArchSpec: simplified __init__.py and copy()
2021-09-30 16:03:39 -07:00
Robert Underwood
a75c86f178 GDB: Fix for GMP and Python (#26366)
closes  #26354 and #26358

Previously we did not pass paths for GDB or GMP and ./configure would
get confused about which one to pull from.  Be more specific.

Built with all variants enabled and fixed the fixable versions and variants:
@:8.1 were fixable by limiting python versions for these to @:3.6.
7.10.1 and 7.11(.1) were fixable to build with glibc-2.25 and newer
using two long patches.
gdb 7.8 and 7.9 weren't fixable as there is no backport of the fix
to build these with glibc-2.25 and newer:
http://lists.busybox.net/pipermail/buildroot/2017-March/188055.html

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-09-30 22:40:35 +00:00
Massimiliano Culpo
8ade8a77dd Build container images on Github Actions and push to multiple registries (#26247)
Modifications:
- Modify the workflow to build container images without pushing when the workflow file itself is modified
- Strip the leading ghcr.io/spack/ from env.container and env.versioned to prepare pushing to multiple registries
- Fixed CentOS 7 and Amazon Linux builds
- Log in and push to Docker Hub as well as the GitHub registry
- Add a badge to README.md with the status of docker images
2021-09-30 23:34:47 +02:00
Jen Herting
7bda430de0 [py-tabulate] added version 0.8.9 (#25974) 2021-09-30 23:02:14 +02:00
Cory Bloor
4238318de2 rocsolver: Tighten rocsolver package dependencies (#25917)
- Specify CMake minimum version more precisely
- Ensure rocBLAS is available at build time
- Limit workaround for missing rocblas include path
  to the only affected version (4.1.0)
- Make hip a build and link dependency
- Remove hip's link dependencies

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-09-30 22:57:28 +02:00
Robert Underwood
7694c58736 sz: fix for external python sitelib (#26067)
fixes #25973
2021-09-30 11:52:20 -07:00
bernhardkaindl
af54f7b15b gromacs-chain-coordinate: Fix calling the test suite (#26367) 2021-09-30 18:18:03 +02:00
Harmen Stoppels
1aa7758dbb match .spack literal, not as a regex (#26374) 2021-09-30 14:54:57 +00:00
Harmen Stoppels
727dcef2f4 Disable __skip_rocmclang again (#26187)
CMake 3.21.3 disables the broken hipcc-as-rocmclang detection again.

From the release notes:

> The AMD ROCm Platform hipcc compiler was identified by CMake 3.21.0
> through 3.21.2 as a distinct compiler with id ROCMClang. This has been
> removed because it caused regressions. Instead:
> * hipcc may no longer be used as a HIP compiler because it interferes
>   with flags CMake needs to pass to Clang. Use Clang directly.
> * hipcc may once again be used as a CXX compiler, and is treated as
>   whatever compiler it selects underneath, as CMake 3.20 and below
>   did.
2021-09-30 16:01:17 +02:00
Harmen Stoppels
4dee7d2b22 Add a conflict for patchelf to catch errors earlier in bootstrapping (#26373) 2021-09-30 15:18:56 +02:00
Harmen Stoppels
525aa11827 Bump version from v0.16.2 to v0.16.3 (#26372) 2021-09-30 11:54:48 +00:00
Olivier Cessenat
9e70fc630e dos2unix: new version 7.4.2 and enforce link with gettext (#26371) 2021-09-30 12:04:32 +02:00
Harmen Stoppels
b96d836a72 Move gnuconfig under the spack org (#26370) 2021-09-30 11:30:29 +02:00
Diego Alvarez
56eddf2e3c Added OpenJDK 17, 11 for Darwin, and set preferred to 11 (#26368) 2021-09-30 11:29:36 +02:00
Maciej Wójcik
7c3b2c1aa8 gromacs-chain-coordinate: new package (#25426) 2021-09-30 04:12:37 +02:00
Shahzeb Siddiqui
ef9967c7f4 Add e4s tags (#23382)
Add 'e4s' tags for all packages according to https://github.com/E4S-Project/e4s/blob/master/E4S_Products.md
2021-09-29 17:57:52 -07:00
acastanedam
02f356159b linux-pam: include more official (recent) releases (#26065) 2021-09-29 20:51:39 -04:00
David Beckingsale
d31830eaf5 Move new CUDA conflicts inside when('~allow-unsupported-compilers') block (#26132)
* Move new CUDA conflicts inside when('~allow-unsupported-compilers') block

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2021-09-30 00:22:26 +00:00
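For context, conditional conflicts of this kind are expressed in Spack's package DSL roughly as below; the class name and the compiler/CUDA version bounds are illustrative placeholders, not the constraints actually added by the PR.
```py
from spack import *  # noqa: F401,F403  (package DSL: variant, conflicts, when)


class ExampleCudaPackage(Package):
    """Sketch of grouping CUDA compiler conflicts under a condition."""
    # versions, url, install(), etc. omitted for brevity

    variant('cuda', default=False, description='Build with CUDA')
    variant('allow-unsupported-compilers', default=False,
            description='Skip host-compiler compatibility checks')

    # Conflicts declared in this block apply only when the user has not
    # asked to bypass the compatibility checks; the bounds are placeholders.
    with when('~allow-unsupported-compilers'):
        conflicts('%gcc@11:', when='+cuda ^cuda@:11.4')
        conflicts('%clang@13:', when='+cuda ^cuda@:11.4')
```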
Valentin Volkl
2ea1bf84ec py-snappy: add patch to fix dependencies (#26352)
* py-snappy: add patch to fix dependencies

* Update var/spack/repos/builtin/packages/py-snappy/req.patch

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-29 15:50:42 -05:00
Manuela Kuhn
0a7804f52e py-jupyterlab: add 3.0.14, 3.0.18 and 3.1.14 (#26339) 2021-09-29 14:34:18 -05:00
Valentin Volkl
805d59d47f geos: fix patch versions (#26351) 2021-09-29 18:32:37 +00:00
Manuela Kuhn
10638ea449 py-jupyter-packaging: add 0.7.12 and 0.10.6 (#26338)
* py-jupyter-packaging: add 0.7.12 and 0.10.6

* Update var/spack/repos/builtin/packages/py-jupyter-packaging/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-jupyter-packaging/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-29 18:31:32 +00:00
Howard Pritchard
6d613bded0 cray-mvapich2: add a package to support (#26273)
The format of the HPE/Cray supplied module for cray-mvapich2 on HPE apollo systems is
very different from the cray-mpich module supplied on Cray EX and XE
systems.

Recent changes to the cray-mpich package -

https://github.com/spack/spack/pull/23470

broke support for cray-mvapich2 and relies now on the structure of the
cray-mpich module to work properly.

Rather than try to support two very different vendor mpich modules
using the same spack package, just add another one specialized for
the cray-mvapich2 module.

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
2021-09-29 20:12:13 +02:00
Martin Diehl
f32feeee45 new package(s): DAMASK (damask, damask-grid, damask-mesh) (#25692) 2021-09-29 19:57:01 +02:00
Manuela Kuhn
c75e84e9f3 py-qtpy: add 1.11.2 (#26343) 2021-09-29 11:19:23 -05:00
Harmen Stoppels
7fdb879308 ca-certificates-mozilla for openssl & curl (#26263)
1. Changes the variant of openssl to `certs=mozilla/system/none` so that
   users can pick whether they want Spack or system certs, or if they
   don't want certs at all.
2. Keeps the default behavior of openssl, which is to use certs=system.
3. Changes the curl configuration to not guess the ca path during
   config, but rather fall back to whatever the tls provider is
   configured with. If we don't do this, curl will still pick up system
   certs if it finds them.

As a minor fix, it also adds the build dep `pkgconfig` to curl, since
that's being used during the configure phase to get openssl compilation
flags.
2021-09-29 09:05:58 -07:00
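A hedged sketch of what a single-choice certs variant like the one in point 1 looks like in Spack's package DSL (not the exact openssl recipe; the wording and the conditional dependency are illustrative):
```py
from spack import *  # noqa: F401,F403


class Openssl(Package):
    """Sketch only; the real recipe carries much more."""

    variant(
        'certs',
        default='system',
        values=('mozilla', 'system', 'none'),
        multi=False,
        description='Use CA certificates from the ca-certificates-mozilla '
                    'package, from the system, or none at all',
    )

    # Pull in the Mozilla CA bundle only when it was actually requested.
    depends_on('ca-certificates-mozilla', when='certs=mozilla')
```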
iarspider
24263c9e92 New package: py-async-lru (#26328) 2021-09-29 18:05:36 +02:00
Manuela Kuhn
7cd3af5d52 py-mffpy: add new package (#26345) 2021-09-29 17:59:50 +02:00
bernhardkaindl
4a31f0fb0f py-mock: fix depends of @:2.0.0 and bump version (#26336)
* py-mock: fix depends of `@:2.0.0` and bump version

fixes the build of `py-gsutil`, which depends on `py-mock@:2.0.0`.

* Update var/spack/repos/builtin/packages/py-mock/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-mock/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Apply the other requested changes

* Add requested change: Add the python@3.6 for newer versions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-29 15:39:57 +00:00
Greg Sjaardema
5132cfcc63 seacas: new release and fixes for metis/parmetis (#26310)
* seacas: new release and fixes for metis/parmetis

* Update to add sha256 checksum for latest seacas release

* Updated the documentation strings with new applications

* Fixed the metis/parmetis variants and logic depending on whether mpi
  is enabled/disabled. (There is still a zoltan issue I need to fix,
  but this will at least allow seacas to be built without
  metis/parmetis or with +mpi+parmetis.  The ~mpi+metis still needs
  work elsewhere.)

* Enable cpup, slice, zellij in +applications
2021-09-29 16:53:49 +02:00
Harmen Stoppels
9d39a1bf42 Un-unset TERM and DISPLAY (#26326)
For interactive `spack build-env` sessions this is undesired behavior
2021-09-29 16:28:32 +02:00
bernhardkaindl
11278dd55a py-monotonic: bump version to 1.6 (#26335)
Fixes `spack solve py-gsutil`: It depends on 'py-monotonic@1.4:'
2021-09-29 09:19:47 -05:00
Pat Riehecky
e45979d39a Added OpenJDK 16.0.2 (#26322) 2021-09-29 14:19:38 +00:00
MichaelLaufer
ed54d6c926 py-netcdf4: adds parallel IO support (#26178) 2021-09-29 08:21:16 -05:00
iarspider
634b0b25c4 Add new package: py-backports-entry-points-selectable (#26329) 2021-09-29 15:09:27 +02:00
Manuela Kuhn
ac29b70d18 py-dipy: add new package (#26331) 2021-09-29 15:05:34 +02:00
Manuela Kuhn
e01947effc py-imageio-ffmpeg: add 0.4.5, fix installation and import (#26330) 2021-09-29 14:56:16 +02:00
Manuela Kuhn
78dbac69e9 py-pyqt5-sip: add new package (#26325) 2021-09-29 14:52:29 +02:00
bernhardkaindl
80c8978691 krb5: bump version and limit depends_on('openssl') to @1 (#26332)
krb5 uses an OpenSSL RSA API which is deprecated in OpenSSL@3.0.0
with a const/non-const mismatch.
2021-09-29 11:24:08 +00:00
Manuela Kuhn
ab6b531e10 py-argcomplete: add 1.12.3 (#26228) 2021-09-29 13:10:03 +02:00
Olli Lupton
cffa1e7f6c Add ccache 4.4.2. (#26327) 2021-09-29 09:41:28 +00:00
Axel Huebl
c5410bd333 FFTW: NEON SIMD is float-only (#26321)
Fix on Apple Mac M1 (aarch64):
```
configure: error: NEON requires single precision
```
2021-09-29 02:40:45 +00:00
Ben Darwin
c578882bb2 itk: add version 5.2.1 (#26307) 2021-09-28 18:15:42 -07:00
Greg Sjaardema
a550416a1f netcdf-c: Update for 4.8.1 and new download site (#26309)
NetCDF-4.8.1 has been released.

    As discussed in https://github.com/Unidata/netcdf-c/issues/2110
    (netcdf-c-4.8.1.tar.gz not on ftp site... #2110), the canonical
    download site for netCDF releases has been changed and the previous
    ftp site is no longer available.

    This PR updates the `url` to point to the new recommended download
    site and updates the sha256 checksums for the new tar files.
2021-09-28 17:21:28 -07:00
Gregory Lee
82d857dced qhull deprecated libqhull.so in 2020.2 (#26304) 2021-09-28 16:54:30 -07:00
Manuela Kuhn
32b5669e8d py-pytz: add 2021.1 (#26289) 2021-09-28 18:07:28 +00:00
iarspider
a2c16356a1 dire: Fix depends for new pythia version scheme (#26294)
Pythia version numbering was changed in #26255 from xyyy to x.yyy
2021-09-28 18:05:52 +02:00
Edward Hartnett
4f4984e313 wrf-io: added NOAA software maintainers, added openmp variant (#26113) 2021-09-28 08:55:09 -07:00
Tamara Dahlgren
5ace5ca7a4 Post 26211 rebase updates (#26246) 2021-09-28 08:53:59 -07:00
Manuela Kuhn
519f75b865 py-scooby: add new package (#26284) 2021-09-28 17:51:37 +02:00
Diego Alvarez
023d9e3b4c Added OpenJDK 11.0.12_7 (#26299) 2021-09-28 17:47:45 +02:00
iarspider
9810a80c6d Update nlohmann-json to 3.10.2 (#26260)
* Update nlohmann-json

* fix testsuite

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-09-28 08:45:34 -07:00
Harmen Stoppels
6c33047514 Squashfs is a makefilepackage (#26300) 2021-09-28 17:45:16 +02:00
bernhardkaindl
3c5a4157b0 py-anuga: add git main version to support build with python@3.5: (#26248)
* py-anuga: add git main version to support build with python@3.5:

py-anuga's main branch has been converted to Python-3 recently.

* py-triangle: use pypi, py-anuga: Fixed depends and test suite works now
2021-09-28 10:37:09 -05:00
bernhardkaindl
d00c9b47f2 py-tomopy: version bump and master, add new deps, make runtest pass (#26252)
Current py-tomopy has many features and dependencies, add them.
2021-09-28 10:33:29 -05:00
Manuela Kuhn
79808f92ae py-numba: add 0.54.0 and restrict old dependencies (#26281)
* py-numba: add 0.54.0 and restrict old dependencies

* Update var/spack/repos/builtin/packages/py-numba/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-28 10:32:44 -05:00
Manuela Kuhn
0f5c63b0c3 py-xlrd: add 2.0.1 (#26282) 2021-09-28 10:31:45 -05:00
Manuela Kuhn
b8dd8b07f9 py-appdirs: add 1.4.4 (#26298) 2021-09-28 10:26:23 -05:00
Diego Alvarez
f65a50d9f9 Update Nextflow (#26042) 2021-09-28 07:15:17 -04:00
Harmen Stoppels
4ce9c839ac Add OpenSSL 3 (#26097)
* OpenSSL 3.0.0

* Remove openssl constraint in e4s to test 3.0.0

* Restrict openssl

* Restrict openssl to @:1 in unifyfs

* Revert "Remove openssl constraint in e4s to test 3.0.0"

This reverts commit 0f0355609771764280ab1b6a523c80843a4f85d6.

* Prefer 1.x
2021-09-28 10:36:55 +02:00
Massimiliano Culpo
aa8727f6f9 Move detection logic in its own package (#26119)
The logic to perform detection of already installed
packages has been extracted from cmd/external.py
and put into the spack.detection package.

In this way it can be reused programmatically for
other purposes, like bootstrapping.

The new implementation accounts for cases where the
executables are placed in a subdirectory within <prefix>/bin
2021-09-28 09:05:49 +02:00
bernhardkaindl
499d39f211 libp11: Fix build and add new version (#26135)
The build needs `pkgconfig` and `openssl`, `m4` is already added by `autoconf`.
Also add the current version of `libp11` to the list of versions.

Co-authored-by: Bernhard Kaindl <bernhard.kaindl@ait.ac.at>
2021-09-28 04:01:08 +02:00
Manuela Kuhn
cd8e85d802 py-dataclasses: add 0.8 (#26283) 2021-09-27 20:18:07 -05:00
Manuela Kuhn
c1cfbc5536 py-meshio: add 5.0.1 and 4.4.6 (#26285) 2021-09-27 20:17:22 -05:00
Manuela Kuhn
87b23cc9cd py-deprecated: add 1.2.13 and get sources from pypi (#26286) 2021-09-27 20:15:35 -05:00
Harmen Stoppels
87450f3688 Use gnuconfig package for config file replacement (#26035)
* Use gnuconfig package for config file replacement

Currently the autotools build system tries to pick up config.sub and
config.guess files from the system (in /usr/share) on arm and power.
This introduces an implicit system dependency which we can avoid by
distributing config.guess and config.sub files in a separate package,
such as the new `gnuconfig` package, which is very lightweight/text-only
(unlike automake, from which we previously pulled these files as a
backup). This PR adds `gnuconfig` as an unconditional build dependency
for arm and power archs.

In case the user needs a system version of config.sub and config.guess,
they are free to mark `gnuconfig` as an external package with the prefix
pointing to the directory containing the config files:

```yaml
    gnuconfig:
      externals:
      - spec: gnuconfig@master
        prefix: /tmp/tmp.ooBlkyAKdw/lol
      buildable: false
```

Apart from that, this PR gives some better instructions for users when
replacing config files goes wrong.

* Mock needs this package too now, because autotools adds a depends_on

* Add documentation

* Make patch_config_files a prop, fix the docs, add integrations tests

* Make macOS happy
2021-09-27 18:38:14 -04:00
Tamara Dahlgren
c0da0d83ff qhull: use secure github URLS instead of www.qhull.org downloads (#26211) 2021-09-27 14:50:43 -07:00
Manuela Kuhn
9b7a8f69fb py-python-picard: add new package (#26268) 2021-09-27 23:39:30 +02:00
Brian Van Essen
acc8673e1a Updated versions for cuDNN and NVSHMEM (#26272) 2021-09-27 13:54:15 -07:00
Adam J. Stewart
5c1e133d23 py-torchvision: add v0.10.1 (#26267) 2021-09-27 13:35:53 -07:00
Manuela Kuhn
1ad8c0d67d py-llvmlite: add 0.37.0 (#26278) 2021-09-27 13:33:28 -07:00
Manuela Kuhn
044e2def4f py-datalad: move datalad wtf test over from py-datalad-metalad (#26274)
* py-datalad: move datalad wtf test over from py-datalad-metalad

* Update var/spack/repos/builtin/packages/py-datalad/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-27 15:24:48 -05:00
Manuela Kuhn
44355a4f18 py-numexpr: add 2.7.3 (#26279) 2021-09-27 20:13:59 +00:00
Manuela Kuhn
6b94d7e6a3 py-tqdm: add 4.62.3 (#26277) 2021-09-27 19:55:22 +00:00
Manuela Kuhn
441c602d2d py-nilearn: add 0.8.1 (#26276) 2021-09-27 14:50:13 -05:00
Thomas Madlener
285df128a1 [openloops][vbfnlo] Fix recipes and updated dependencies (#24461)
* [vbfnlo] Add doc variant to toggle building of docs

* [openloops] Add scons to dependencies

Make sure that the build process does not accidentally pick up an
unsuitable scons version from the underlying system

* [openloops] Set OLPYTHON to make sure the right scons is picked

* [openloops] Fix Flake8 style complaints
2021-09-27 11:40:33 -07:00
Richarda Butler
c68536e9d3 MPICH: Add E4S testsuite stand alone test (#25634) 2021-09-27 11:36:53 -07:00
Ben Darwin
c07af15946 mrtrix3: add version 3.0.3 (#25943) 2021-09-27 11:33:42 -07:00
iarspider
e256d02963 Update highfive (#26262)
* Add new versions of highfive
* Fix CMake option name
2021-09-27 11:24:37 -07:00
Marie Houillon
87c7a7e103 New packages for openCARP (#25909)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-09-27 11:21:43 -07:00
Melven Roehrig-Zoellner
6b2192ef96 source-highlight: allow to build the git master branch (#25947)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-09-27 11:16:55 -07:00
Miroslav Stoyanov
5130f43eb3 added workaround for rocm/cmake bug in tasmanian (#25949) 2021-09-27 11:14:21 -07:00
Edward Hartnett
c592e17d49 added package g2c from NOAAs NCEPLIBS project (#26190)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-09-27 11:03:54 -07:00
Olivier Cessenat
27c0b4a0f7 sundials package: @:5.7.0 requires hypre@:2.22.0 (#25903)
sundials@:5.7.0 fails to build with hypre@2.22.1
2021-09-27 10:58:54 -07:00
Steven Smith
ee63cfe0fd New Package: EcoSLIM (#25975)
Co-authored-by: Steven Smith <Steven Smith>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-09-27 10:38:53 -07:00
psakievich
df9d1dd1cb Correct path comparisons for fs view (#25891)
* Fix path comparisons in copy views

* Get correct permissions

* Set group id through os

Co-authored-by: Philip Sakievich <psakiev@sanida.gov>
2021-09-27 09:51:38 -07:00
jacorvar
33957f2745 add maintainer to bedops (#26141) 2021-09-27 16:23:50 +00:00
bernhardkaindl
0cc80f5118 spindle: It seems it needs mpi.h to compile, adding depends on mpi. (#26151)
Work around this compile error for gcc by adding -Wno-narrowing:
spindle_logd.cc:65:76: error: narrowing conversion of '255' from 'int' to 'char'
spindle_logd.cc:65:76: error: narrowing conversion of '223' from 'int' to 'char'
spindle_logd.cc:65:76: error: narrowing conversion of '191' from 'int' to 'char'

spindle 0.8.1 wants to compile tests with mpi.h and newer versions need mpicc,
thus add depends_on("mpi"). Spindle supports --no-mpi to disable MPI.
2021-09-27 08:58:47 -07:00
Edward Hartnett
ffedc3637c added NOAAs UPP package (#26198) 2021-09-27 08:56:05 -07:00
Adam J. Stewart
4d3becdfd9 py-torch: add v1.9.1 (#26159) 2021-09-27 09:30:47 -05:00
bernhardkaindl
4a130b9185 perl: Fix test case bug perl#15544: Don't fail with PATH >1000 chars (#26245)
Fix the perl test case bug Perl/perl5#15544:
a PATH variable longer than 1000 characters (as is usual with Spack) fails a perl test case.
The fix is to skip the PATH check in the perlbug.t test case.
Fixes `spack install --test=all` for specs that trigger a build and test of perl.
2021-09-27 15:18:36 +02:00
Manuela Kuhn
2acfe55b74 exempi: fix expat dependency (#26221)
* exempi: fix expat dependency and fix test with --test=root

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-09-27 14:52:23 +02:00
iarspider
8710353336 pythia8: update to v8.306, change url, set environment variables for dependents (#26255) 2021-09-27 09:05:51 +00:00
Jen Herting
514d04c8f1 New package: r-klar (#25390) 2021-09-27 02:53:03 +00:00
Jen Herting
87e91178a6 New package: r-questionr (#25395) 2021-09-26 21:35:24 -05:00
Manuela Kuhn
e90534d45e py-python-gitlab: add 2.10.1 (#26229) 2021-09-27 01:24:05 +02:00
Wouter Deconinck
caa33b7b05 [acts] eigen is not only a build dependency (#26257)
The ActsConfig.cmake includes a `find_dependency(Eigen3)` line. Declaring the eigen depends_on with type='build' only does not expose eigen to dependents during their build.

Ref: https://github.com/acts-project/acts/blob/v13.0.0/cmake/ActsConfig.cmake.in#L59
2021-09-27 01:23:44 +02:00
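In package terms the distinction is roughly the following (a sketch, not the literal change in the acts recipe; the exact type tuple used there may differ):
```py
# Before (sketch): eigen is visible only while building acts itself.
depends_on('eigen', type='build')

# After (sketch): eigen is also a link-type dependency, so packages that
# depend on acts can satisfy find_dependency(Eigen3) at their own build time.
depends_on('eigen', type=('build', 'link'))
```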
lorddavidiii
8fa6597d43 cfitsio: add version 4.0.0 (#26218) 2021-09-27 01:18:09 +02:00
Seth R. Johnson
7e2fc45354 trilinos: default gotype=long_long (#26253)
"Long long" is the default type when building trilinos on its own, and
many downstream packages (both in and out of spack) rely on it. E4S
already sets this explicitly to long_long.
2021-09-26 16:51:23 +02:00
Desmond Orton
3e7cee27d7 masurca: add versions up to v4.0.5 (#26171)
Co-authored-by: las_djorton <las_djorton@babybuild.las.iastate.edu>
2021-09-26 15:53:59 +02:00
Wouter Deconinck
798ef2bdb0 vtk: depends_on freetype @:2.10.2 through @:9.0.1 only (#26180)
The issues in https://gitlab.kitware.com/vtk/vtk/-/issues/18033 were fixed in:
- dae1718d50
- 31e8e4ebeb

These fixes are in vtk@9.0.2 https://github.com/Kitware/VTK/releases/tag/v9.0.2 and beyond.
2021-09-26 15:52:51 +02:00
Brian Van Essen
cc658c5f1e lbann: add support for building with the ONNX C++ library (#26130) 2021-09-26 15:50:09 +02:00
Valentin Volkl
1f972f90c8 podio, edm4hep: fix tests (#26116)
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2021-09-26 15:49:10 +02:00
Brian Van Essen
123c105771 onnx: fix python version in environments (#26131)
When using the ONNX package inside of an environment that specifies a
python3 executable, it will attempt to use a system installed
version.  This can lead to a failure where the system python and the
environment python don't agree and the system python ends up with an
invalid environment.  Forces ONNX to use the same version of python as
the rest of the spec.

Co-authored-by: Greg Becker <becker33@llnl.gov>
2021-09-26 15:48:45 +02:00
Brian Van Essen
b909560ed5 Added CMake command to export compiler directories to support emacs LSP mode. (#26118) 2021-09-26 15:47:46 +02:00
Paul Spencer
0e6e9c23a1 libsignal-protocol-c: new package (#26121)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-09-26 15:47:15 +02:00
Hadrien G
a99d8463ac diffutils: Add v3.8, fixes an incompatibility with newer glibc (#26237) 2021-09-26 12:30:16 +02:00
Harmen Stoppels
e76462da38 Compiler conflict in umpire as a range (#26161) 2021-09-26 12:28:30 +02:00
Harmen Stoppels
fbda52d1b0 Curl 7.79.0 (#26075) 2021-09-26 12:28:05 +02:00
Manuela Kuhn
ab85961e73 py-datalad-hirni: add new package (#26227) 2021-09-26 09:20:36 +00:00
Nikolay Simakov
72cabcf454 enzo: build extra tools, add optimization variant (#25401)
* added inits and rings tools to be built
* added opt variant
2021-09-26 10:59:06 +02:00
iarspider
344b37730b jemalloc: add more variants (#26144) 2021-09-26 10:56:27 +02:00
bernhardkaindl
5967a10432 candle-benchmarks: depend_on()s: Fix obsoleted and renamed variants (#26250)
Fix solving/concretizing candle-benchmarks:
py-theano: The requested variant +gpu is now named +cuda
opencv: The requested variants +python and +zlib are now fixed deps
2021-09-26 10:37:01 +02:00
bernhardkaindl
6ccda81368 log_parser.py: Find failed test case messages in error logs (#25694)
- Match failed autotest tests, which show the word "FAILED" near the end
- Match "FAIL: ", "FATAL: ", "failed ", "Failed test" of other suites
- An autotest line ending in "   ok" means the test passed, independent of the text before it.
- autoconf messages showing missing tools are fatal later, so show them.
2021-09-26 10:35:04 +02:00
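A sketch of the kind of patterns involved (illustrative regular expressions and helper names, not the exact ones added to log_parser.py):
```py
import re

# Lines that indicate a failed test case in common test-suite output.
FAIL_PATTERNS = [
    re.compile(r'FAILED'),        # autotest summary lines
    re.compile(r'^FAIL: '),       # automake/TAP style
    re.compile(r'^FATAL: '),
    re.compile(r'\bfailed '),
    re.compile(r'Failed test'),   # perl Test::More style
]

# An autotest line ending in "   ok" means the test passed, regardless of
# the text before it, so it must not be flagged as an error.
PASS_PATTERN = re.compile(r'   ok$')


def is_test_failure(line):
    if PASS_PATTERN.search(line):
        return False
    return any(p.search(line) for p in FAIL_PATTERNS)
```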
bernhardkaindl
34415f6878 swiftsim: Fix build with version bump and depends_on('gsl') (#26166)
swiftsim-0.7.0 didn't build because of a configure script issue and
more. Fix the build by a version bump and adding `depends_on('gsl')`.
2021-09-26 10:32:44 +02:00
bernhardkaindl
a947680e37 dropwatch: make check starts a daemon which does not stop: Skip it (#26164)
dropwatch is a network packet drop checker and its make check starts
a daemon which does not terminate.

- Skip this test to not block builds.
- Add depends_on('pkgconfig', type='build')
  It is needed in case the host does not have pkg-config installed.
- Remove the depends_on('m4', type='build'):
  The depends_on('autoconf', type='build') pulls m4 as it needs it.
2021-09-26 10:32:04 +02:00
bernhardkaindl
2154786b61 dnstop: needs ncurses and it forgot to add -ltinfo (#26163)
dnstop misses a depends('ncurses') and it needs
to link with libtinfo from ncurses as well.
2021-09-26 10:30:42 +02:00
bernhardkaindl
9a64a229a0 xfwp: Fix build using: env.append_flags('CPPFLAGS', '-D_GNU_SOURCE') (#26157)
Without -D_GNU_SOURCE, the build of xfwp has many compilation errors.
Example: io.c:1039:7: error: implicit declaration of function 'swab'
2021-09-26 10:29:43 +02:00
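A typical way to inject such a flag in a Spack package is via `setup_build_environment`; a sketch, assuming the fix lives in the xfwp package class:
```py
# Inside the xfwp package class (sketch):
def setup_build_environment(self, env):
    # Without _GNU_SOURCE, glibc hides declarations such as swab(),
    # causing 'implicit declaration of function' errors.
    env.append_flags('CPPFLAGS', '-D_GNU_SOURCE')
```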
bernhardkaindl
7189a69c70 rivet: Fix of build and tests on Ubuntu 18.04 w/ version bump (#26152)
When using Ubuntu's gcc-8.4.0 on Ubuntu 18.04 to compile rivet-3.1.3,
compilation errors related to UnstableParticles(), "UFS" show up.

Compilation with this compiler is fixed in rivet-3.1.4, adding it.

Adding type='link' to the depends_on for 'hepmc' and 'hepmc3' fixes
the tests to find libHepMC.so.4 in `spack install --test=all`

Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2021-09-26 10:29:08 +02:00
bernhardkaindl
fee0831b63 acct: skip installcheck: fails checking if all progs have --help (#26093)
The install check tests whether all programs support --help and --version, but not all of them do.
2021-09-26 10:26:21 +02:00
bernhardkaindl
919a893c08 mountpointattr: Fix checksum of 1.1.1 and fix build(patch is only for 1.1) (#26087) 2021-09-26 10:25:50 +02:00
bernhardkaindl
91ce8354a2 cairo: pass tests: devs wrote that other people shouldn't attempt them (#26086)
The cairo test suite is huge, has many backends, and the README states
that running and attempting to pass it is not a goal for normal users.
It has so many dependencies on the system, including fonts, that
passing it is not realistically within reach soon:
Skip it; it takes far too long to be practical.
2021-09-26 10:25:26 +02:00
bernhardkaindl
c411779c51 libfuse: Fix build when meson doesn't get host's udev pkg (#26147)
Despite the patch disabling installation of rules, meson's setup
stage looks up the udev package to get `/lib/udev/rules.d`, but as
spack has no `systemd/udev` package, it would fail to build.

Fix such builds by passing `-Dudevrulesdir` and bump version to 3.10.5
2021-09-26 10:24:30 +02:00
Manuela Kuhn
17857083b0 py-datalad-container: add new package (#26222) 2021-09-26 10:18:17 +02:00
Manuela Kuhn
dbbc2879e5 py-datalad-neuroimaging: add new package (#26223) 2021-09-26 10:17:22 +02:00
Manuela Kuhn
d627026910 py-datalad-webapp: add new package (#26226) 2021-09-26 09:50:01 +02:00
Manuela Kuhn
3b323d6c90 py-datalad-metalad: add new package (#26225)
Added py-nose for import and a simple installtest from appveyor.yml

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-09-26 07:44:18 +00:00
bernhardkaindl
7ca657aef6 libidl builds using flex and bison: Add them as build-depends (#26183) 2021-09-26 09:40:18 +02:00
Jen Herting
27e6032f73 New package: py-gpyopt (#26232)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-09-26 07:34:18 +00:00
Adam J. Stewart
76ca57bb2f py-scikit-learn: add v1.0 (#26243) 2021-09-26 08:33:58 +02:00
Jen Herting
56858eb34e first build of py-doe2 a fork of py-doe (#26235)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-09-26 08:12:40 +02:00
Carlos Bederián
b4b3b256bc ucx: use bfd instead of lld with %aocc (#26254) 2021-09-26 04:47:07 +02:00
Jen Herting
3c14d130ca New package: r-styler (#25399) 2021-09-25 20:50:22 +00:00
Sreenivasa Murthy Kolam
b56e1d1f6d limit the amd_target_sram_ecc to rocm-3.9.0-4.0.0 (#26240) 2021-09-25 19:18:44 +00:00
Sam Reeve
b2376db632 cabana: add 0.4 and new dependency variants (#25847)
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Christoph Junghans <junghans@votca.org>
2021-09-25 12:04:24 -06:00
Glenn Johnson
d8dc1f2c80 new package: dicom3tools (#25700) 2021-09-25 08:43:17 -06:00
Jen Herting
348cebfb8f py-sobol-seq: new package (#26234)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-09-25 14:39:36 +02:00
Jen Herting
3f7d6763e0 New package: py-pydoe (#26233)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-09-25 14:34:35 +02:00
bernhardkaindl
ff511e090a autotools doc PR: No depends_on('m4') with depends_on('autoconf') (#26101)
* autotoolspackage.rst: No depends_on('m4') with depends_on('autoconf')
  - Remove `m4` from the example depends_on() lines for the autoreconf phase.
  - Change the branch used as example from develop to master as it is
    far more common in the packages of spack's builtin repo.
- Fix the incorrect claim that libtoolize and aclocal are run explicitly
  in the autoreconf phase by default: autoreconf calls these internally
  as needed, so autotools.py also does not call them directly.
- Add that autoreconf() also adds -I<aclocal-prefix>/share/aclocal.
- Add an example how to set autoreconf_extra_args.
- Add an example of a custom autoreconf phase for running autogen.sh.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-09-25 10:15:03 +02:00
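For reference, the custom-autoreconf pattern mentioned in the last bullet looks roughly like this (a sketch following Spack's documented idiom, not the literal docs example):
```py
# Inside a package whose sources ship an autogen.sh wrapper (sketch):
def autoreconf(self, spec, prefix):
    which('bash')('autogen.sh')
```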
Alec Scott
ef7cf83ea5 Add Go v1.17.1 (#26128) 2021-09-25 01:22:34 -06:00
acastanedam
b67cfc0fd1 gpi-2: improve GPI-2 installation (#25808)
Based on the original script of R. Mijakovic, this further improves the
GPI-2 installation, in particular adding different official versions,
configuration setups and even testing. Importantly, the non-autotools
way of installing older versions is also handled, which is
relevant for some packages using GPI-2.

Co-authored-by: Arcesio Castaneda Medina <arcesio.castaneda.medina@itwm.fraunhofer.de>
2021-09-25 00:58:46 -06:00
bernhardkaindl
e3dc586064 xload: fix build by disabling gettext and add v1.1.3 (#26142)
xload failed with unresolved references to libintl functions:
disabled its use of gettext calls and added the last "new" version.

Co-authored-by: Bernhard Kaindl <bernhard.kaindl@ait.ac.at>
2021-09-25 00:58:32 -06:00
AMD Toolchain Support
ed13addf3b Cloverleaf: adding AOCC support (#26148)
Co-authored-by: Mohan Babu <mohbabul@amd.com>
2021-09-25 00:49:43 -06:00
Adam J. Stewart
b8e21f2b0b GDAL: fix build of Java bindings (#26244) 2021-09-25 00:01:45 -06:00
Manuela Kuhn
7e67652983 py-datalad: add 0.15.1 (#26230) 2021-09-24 23:40:48 -06:00
kwryankrattiger
a9cb31ab49 Enable cinema variant in ecp-data-vis-sdk (#26176) 2021-09-25 07:18:34 +02:00
Filippo Spiga
c725ca83ce nvhpc: add v21.9 (#26217) 2021-09-25 07:15:22 +02:00
bernhardkaindl
2eddc60baa tauola: Fix build: The current build wants lhapdf headers by default (#26154)
The hand-written `configure` script of this package does not handle
--without-<feature> at all. The source wants to use `lhapdf` headers
even if support of lhapdf is not indicated using `--with-lhapdf`.

Enable the variant `lhapdf` by default: It fixes the build of the
current package and provides the TauSpinner feature as well.
2021-09-24 20:16:59 -06:00
bernhardkaindl
13d313d3ad scallop: Fix build by bumping the version to 0.10.5 (#26162)
build of 0.10.3 fails with CoinPragma.hpp: No such file or directory
2021-09-24 18:11:06 -06:00
Olivier Cessenat
8ed7f65bc8 pigz: new version 2.6 (#26066) 2021-09-24 17:17:08 -06:00
bernhardkaindl
742bd1f149 nix: bump version, add new depends and make installcheck pass (#26179)
The current version of `nix` has some more features and has more
dependencies. The installcheck is quite involved but passes now.
2021-09-24 17:08:10 -06:00
bernhardkaindl
8d70f94aae cgdcbxd: Fix build: Use DESTDIR=self.prefix to not write to /etc (#26145)
Fix the build for normal non-root/non-system-user builds, as we cannot
know that we'd have to uninstall these files even if installed as root.
Also add `pkgconfig` and remove the not explicitly needed `depends_on('m4')`.
2021-09-24 16:32:09 -06:00
luker
ab90cd8fc4 Tau rocm fix (#26134)
* update the Tau package to use the correct ROCm dependencies and prefixes

    1st:
    When the rocm variant is selected, tau defaults to looking for rocm in /opt/rocm,
    which is not guaranteed to be the correct location -- this has been fixed
    to provide the prefix for hsa-rocr-dev (which is now a dependency when +rocm is
    selected).

    2nd:
    the rocprofiler dependency package was not specified correctly; it should
    be called rocprofiler-dev, and rocprofiler-dev is also a dependency when
    +rocprofiler is selected.

added roctracer support
2021-09-24 16:29:04 -06:00
bernhardkaindl
cdcecda9d0 sst-dumpi: Correct the wrong checksum of the current version (#26181)
Correct the sha256 of the current version as shown by `spack checksum`
2021-09-24 14:17:16 -06:00
bernhardkaindl
adc7fee12e w3m: Fix build by disabling RAND_egd and japanese messages (#26168)
w3m's build fails with `undefined reference to `RAND_egd'`, which
is a deprecated insecure feature, and from building Japanese messages.

Disabling both makes the build of `w3m` work.
2021-09-24 13:26:14 -06:00
bernhardkaindl
12252f1ca4 iproute2: add v5.10.0 and v5.11.0, add missing deps (#26182)
iproute2 needs libmnl and builds using flex and bison: Add them.
2021-09-24 12:37:52 -06:00
bernhardkaindl
7efb18b502 nginx: Bump version for OpenSSL@3 and restrict @:1.21.2 to openssl@:1 (#26156) 2021-09-24 12:16:51 -06:00
bernhardkaindl
45c84a2b04 xdriinfo: uses glXGetProcAddressARB, add: depends_on('gl') (#26184) 2021-09-24 12:08:07 -06:00
Edward Hartnett
2c9dfcabf8 added NOAA package prod_util (#26197) 2021-09-24 11:44:09 -06:00
Edward Hartnett
1a0e1597c8 nemsio: Add v2.5.3, which includes dependency change, and maintainers (#26188) 2021-09-24 11:41:15 -06:00
Adam J. Stewart
9a17dc0001 py-torchvision: rename master -> main (#26160) 2021-09-24 11:32:23 -06:00
Robert Cohn
3a48f5f931 intel-oneapi-mpi/mkl packages: add ilp64 support (#26045) 2021-09-24 10:32:06 -07:00
Edward Hartnett
73913a5d51 added GFDL FMS package (#26191) 2021-09-24 11:14:16 -06:00
Scott Wittenburg
45b70d9798 Pipelines: Disable ppc builds until we have resources or make it smaller (#26238) 2021-09-24 08:24:36 -07:00
bernhardkaindl
cdbb586a93 autotools.py/autoreconf: Show the depends_on()s to add to the package (#26115)
This commit shows a template for cut-and-paste into the package to fix it:

```py
==> fast-global-file-status: Executing phase: 'autoreconf'
==> Error: RuntimeError: Cannot generate configure: missing dependencies autoconf, automake, libtool.

Please add the following lines to the package:

    depends_on('autoconf', type='build', when='@master')
    depends_on('automake', type='build', when='@master')
    depends_on('libtool', type='build', when='@master')

Update the version (when='@master') as needed.
```
    
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-09-24 09:17:07 -06:00
bernhardkaindl
bcf708098d kmod: Add the missing depends on m4 and pkgconfig (#26137)
The build needs pkgconfig and lzma; m4 is already added by autoconf.

Disable generation of kmod manpages as spack does not have xsltproc yet.

Co-authored-by: Bernhard Kaindl <bernhard.kaindl@ait.ac.at>
2021-09-24 08:02:09 -06:00
Edward Hartnett
71378bcf39 landsfcutil: new package (#26193) 2021-09-24 13:48:58 +02:00
Wouter Deconinck
f68b2ea73a acts: add v13.0.0 (#26196) 2021-09-24 13:47:25 +02:00
Edward Hartnett
2d8f06bfa0 ufs-utils: added NOAA maintainers (#26201) 2021-09-24 13:07:18 +02:00
Harmen Stoppels
87e1ed0ef1 Bump cmake 3.21.3 (#26186) 2021-09-24 04:13:52 -06:00
Troy Redfearn
ef6013da9c Added new versions and deprecated vulnerable versions (#26206) 2021-09-24 03:37:44 -06:00
Tamara Dahlgren
c348daf4dc Add 'radiuss' tags to RADIUSS packages (#26212) 2021-09-24 03:20:14 -06:00
Edward Hartnett
21eaf31291 g2tmpl: added NOAA maintainers (#26192) 2021-09-24 02:35:02 -06:00
Edward Hartnett
6d2ce0535f added NOAA package grib-util (#26200)
* added NOAA package grib-util
2021-09-24 02:17:07 -06:00
Ben Bergen
2e4528c55c Added missing '+' for kokkos (#26202) 2021-09-24 02:14:25 -06:00
Edward Hartnett
6f8c9edefc added NOAA ncio package (#26194) 2021-09-24 02:14:11 -06:00
Edward Hartnett
d532f5b40c added NOAA package nemsiogfs (#26195) 2021-09-24 02:08:06 -06:00
Wouter Deconinck
fb2e764f50 assimp: depends_on zlib (#26199)
Assimp searches for zlib (or builds its own version). When it searches, it can find a system install that is not provided by spack. Ref: d286aadbdf/CMakeLists.txt (L451)
2021-09-24 02:05:06 -06:00
Andrew-Dunning-NNL
7bb5dd7a14 gradle: add new versions through 7.2 (#26203) 2021-09-24 02:02:17 -06:00
iarspider
1299c714c4 gosamcontrib: add libs variants (#26030) 2021-09-24 10:00:02 +02:00
Ryan Mast
2bcdb33666 helics: Add versions 2.8.0, 3.0.0, and 3.0.1 (#26046) 2021-09-24 01:52:52 -06:00
Harmen Stoppels
a6bb3a66ea Remove centos:6 image references (#26095)
This was EOL November 30th, 2020. I believe the "builds" are failing on
develop because of it.
2021-09-24 09:47:49 +02:00
Wouter Deconinck
117ea5a239 opencascade: add v7.5.3; added VTK INCLUDE cmake flag (#26209) 2021-09-24 09:27:07 +02:00
Massimiliano Culpo
03c203effa Use Leap instead of Tumbleweed for e2e bootstrapping test (#26205)
Tumbleweed has been broken for a couple of days. The attempt
to fix it in #26170 didn't really work. Let's try to move to
a more stable release series for OpenSuse.
2021-09-24 01:22:40 -06:00
Mikhail Titov
ba452f7645 radical cybertools: add v1.8.0 (#26215) 2021-09-24 09:22:02 +02:00
Edward Hartnett
54ea7caf3f g2: add maintainers and version 3.4.5 (#26105)
* added new release, added noaa software maintainers to maintainer list

* updated comment

* removed trailing whitespace
2021-09-24 00:17:07 -06:00
kjrstory
bdab4c9f1f fontconfig: add dependency python (#25960) 2021-09-23 23:26:05 -06:00
iarspider
72c22696ac gperftools package: add variants (#26032)
* Make libunwind optional
* Add support for sized_delete and debugalloc

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-09-23 22:40:28 -06:00
Thomas Kluyver
fdc9cb0adb h5py: new version 3.4 (#25935)
No changes to dependencies or supported Python versions.

https://docs.h5py.org/en/stable/whatsnew/3.4.html
2021-09-23 12:12:33 -06:00
bernhardkaindl
ea459993db python: Fix regression in python-2.7.17+-distutils-C++.patch (#25821) 2021-09-23 07:38:58 -05:00
Christoph Conrads
45c6fe2b94 SQLite: make variants discoverable (#25885)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-09-23 09:54:02 +02:00
Harmen Stoppels
56a81ec81c Pin opensuse image in bootstrap tests (#26170)
Currently zypper in opensuse containers throws 'not permitted'

Temporarily fix the digest until they fix their upstream package manager issues
2021-09-23 08:03:43 +02:00
Gregory Becker
1195ac02e5 Merge tag 'v0.16.3' into develop 2021-09-22 16:25:40 -07:00
Brian Van Essen
fa92482675 dihydrogen package: add missing dependency (#25896) 2021-09-22 12:03:15 -07:00
Edward Hartnett
76ad001c56 added NOAA software maintainers to maintainer list (#26112) 2021-09-22 07:24:16 -04:00
bernhardkaindl
4ccd2e8ff0 pango: Fix build: restore autotools-based versions (#26084)
Fix the build of pango and its 20 dependents: only provide the versions which
support the build using autotools (the conversion to MesonPackage didn't progress).
This only restores the list of versions from August 10, before the build broke.
2021-09-22 05:13:49 -06:00
Alec Scott
2b6a8ccf29 Add Rclone v1.56.1 (#26124) 2021-09-22 04:07:48 -06:00
albestro
7613511718 add conflict (#26028) 2021-09-22 12:07:19 +02:00
Alec Scott
ed3aa9633d Add Picard v2.26.2 (#26125) 2021-09-22 04:04:44 -06:00
kjrstory
24ca007448 su2: add version 7.0.4-7.2.0 (#25956) 2021-09-22 01:07:44 -06:00
Alec Scott
86ddcd0e2d Add Slepc v3.15.2 (#26123) 2021-09-21 23:13:51 -06:00
Edward Hartnett
5097a41675 gfsio: add NOAA software maintainers (#26106)
* added NOAA software maintainers to maintainer list

* added comment about NCEPLIBS
2021-09-21 19:20:47 -06:00
Harmen Stoppels
3b55c2e715 Bump version and update changelog 2021-09-21 16:58:41 -07:00
Harmen Stoppels
7caa844d49 Fix style tests 2021-09-21 16:58:41 -07:00
Harmen Stoppels
4cd6381ed5 Remove centos:6 image references
This was EOL November 30th, 2020. I believe the "builds" are failing on
develop because of it.
2021-09-21 16:58:41 -07:00
Massimiliano Culpo
aebc8f6ce5 docker: remove boto3 from CentOS 6 since it requires and updated pip (#24813) 2021-09-21 16:58:41 -07:00
Massimiliano Culpo
073c92d526 docker: Fix CentOS 6 build on Docker Hub (#24804)
This change makes yum usable again on CentOS 6
2021-09-21 16:58:41 -07:00
Greg Becker
0e85d7011e Ensure all roots of an installed environment are marked explicit in db (#24277) 2021-09-21 16:58:41 -07:00
Todd Gamblin
77e633efa1 locks: only open lockfiles once instead of for every lock held (#24794)
This adds lockfile tracking to Spack's lock mechanism, so that we ensure that there
is only one open file descriptor per inode.

The `fcntl` locks that Spack uses are associated with an inode and a process.
This is convenient, because if a process exits, it releases its locks.
Unfortunately, this also means that if you close a file, *all* locks associated
with that file's inode are released, regardless of whether the process has any
other open file descriptors on it.

Because of this, we need to track open lock files so that we only close them when
a process no longer needs them.  We do this by tracking each lockfile by its
inode and process id.  This has several nice properties:

1. Tracking by pid ensures that, if we fork, we don't inadvertently track the parent
   process's lockfiles. `fcntl` locks are not inherited across forks, so we'll
   just track new lockfiles in the child.
2. Tracking by inode ensures that references are counted per inode, and that we don't
   inadvertently close a file whose inode still has open locks.
3. Tracking by both pid and inode ensures that we only open lockfiles the minimum
   number of times necessary for the locks we have.

Note: as mentioned elsewhere, these locks aren't thread safe -- they're designed to
work in Python and assume the GIL.

Tasks:
- [x] Introduce an `OpenFileTracker` class to track open file descriptors by inode.
- [x] Reference-count open file descriptors and only close them if they're no longer
      needed (this avoids inadvertently releasing locks that should not be released).
2021-09-21 16:58:41 -07:00
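A minimal sketch of the reference-counting idea (not the actual `OpenFileTracker` implementation): descriptors are keyed by (pid, inode) and closed only when the last reference is released.
```py
import os


class OpenFileTrackerSketch:
    """Keep one open file descriptor per (pid, inode) and refcount it."""

    def __init__(self):
        self._by_key = {}  # (pid, inode) -> [file object, refcount]

    def get_fh(self, path):
        # Assumes the lock file already exists.
        key = (os.getpid(), os.stat(path).st_ino)
        entry = self._by_key.get(key)
        if entry is None:
            entry = [open(path, 'r+'), 0]
            self._by_key[key] = entry
        entry[1] += 1
        return entry[0]

    def release_fh(self, path):
        key = (os.getpid(), os.stat(path).st_ino)
        entry = self._by_key[key]
        entry[1] -= 1
        if entry[1] == 0:
            # Last user of this inode in this process: safe to close
            # without dropping fcntl locks someone else still needs.
            entry[0].close()
            del self._by_key[key]
```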
Todd Gamblin
2ae92ebdbb Use AWS CloudFront for source mirror (#23978)
Spack's source mirror was previously in a plain old S3 bucket. That will still
work, but we can do better. This switches to AWS's CloudFront CDN for hosting
the mirror.

CloudFront is 16x faster (or more) than the old bucket.

- [x] change mirror to https://mirror.spack.io
2021-09-21 16:58:41 -07:00
Harmen Stoppels
7f29dd238f Cray: fix extracting paths from module files (#23472)
Co-authored-by: Tiziano Müller <tm@dev-zero.ch>
2021-09-21 16:58:41 -07:00
Adam J. Stewart
0f486080b3 Fix use of quotes in Python build system (#22279) 2021-09-21 16:58:41 -07:00
Michael Kuhn
6b0c775448 clang/llvm: fix version detection (#19978)
This PR fixes two problems with clang/llvm's version detection. clang's
version output looks like this:

```
clang version 11.0.0
Target: x86_64-unknown-linux-gnu
```

This caused clang's version to be misdetected as:

```
clang@11.0.0
Target:
```

This resulted in errors when trying to actually use it as a compiler.

When using `spack external find`, we couldn't determine the compiler
version, resulting in errors like this:

```
==> Warning: "llvm@11.0.0+clang+lld+lldb" has been detected on the system but will not be added to packages.yaml [reason=c compiler not found for llvm@11.0.0+clang+lld+lldb]
```

Changing the regex to only match until the end of the line fixes these
problems.

Fixes: #19473
2021-09-21 16:58:41 -07:00
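The essence of the fix is to stop the version match at the end of the first line of output; a sketch (the real pattern lives in Spack's clang compiler class and may differ):
```py
import re

output = """clang version 11.0.0
Target: x86_64-unknown-linux-gnu
"""

# Restricting the match to version characters stops at the end of the
# first line instead of swallowing the 'Target:' line as well.
match = re.search(r'clang version ([0-9.]+)', output)
print(match.group(1))  # -> 11.0.0
```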
Adam J. Stewart
9120856c01 Fix fetching for Python 3.9.6 (#24686)
When using Python 3.9.6, Spack is no longer able to fetch anything. Commands like `spack fetch` and `spack install` all break.

Python 3.9.6 includes a [new change](https://github.com/python/cpython/pull/25853/files#diff-b3712475a413ec972134c0260c8f1eb1deefb66184f740ef00c37b4487ef873eR462) that means that `scheme` must be a string, it cannot be None. The solution is to use an empty string like the method default.

Fixes #24644. Also see https://github.com/Homebrew/homebrew-core/pull/80175 where this issue was discovered by CI. Thanks @branchvincent for reporting such a serious issue before any actual users encountered it!

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-09-21 16:58:41 -07:00
Edward Hartnett
90d953d3f6 w3emc: add NOAA software maintainers (#26110) 2021-09-21 17:54:19 -06:00
Tiziano Müller
d34b2638c7 boost: fix for @1.77.0%intel (#25965)
Add patch for build script from boost repo.
2021-09-21 16:07:25 -07:00
bernhardkaindl
979c355c99 spack/build_environment.py: Clean MAKEFLAGS, DISPLAY and TERM (#26092)
clean_environment(): Unset three more environment variables:

MAKEFLAGS: Affects make and can, e.g., indirectly inhibit enabling parallel builds
DISPLAY: Tests of GUI widget libraries might try to connect to an X server
TERM: Could make testsuites attempt to color their output
2021-09-22 00:23:10 +02:00
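A rough illustration of what unsetting these variables amounts to (a sketch, not Spack's actual clean_environment()):
```py
import os

# Host state that can leak into builds and test suites.
VARIABLES_TO_UNSET = ['MAKEFLAGS', 'DISPLAY', 'TERM']


def clean_environment_sketch(environ=os.environ):
    for name in VARIABLES_TO_UNSET:
        environ.pop(name, None)
```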
Edward Hartnett
92f199d57b added NOAA software maintainers to the maintainers list (#26102) 2021-09-21 16:16:49 -06:00
Edward Hartnett
fd716ce0c5 sfcio: add NOAA software maintainers (#26108)
* added NOAA software maintainers to maintainer list

* added comment about NCEPLIBS project
2021-09-21 15:28:37 -06:00
Edward Hartnett
d865caa7f7 ip2: add NOAA software maintainers and deprecation note (#26107)
* added NOAA software maintainers to maintainer list, added comment about library being deprecated

* deleted trailing whitespace
2021-09-21 14:49:13 -06:00
bernhardkaindl
981551b47f libtool: fix running the unit-tests with spack install --test root (#25707)
Besides adding autoconf and automake as needed for tests of 2.4.6,
skip Fortran test cases when Fortran compilers are not provided.
2021-09-21 14:18:29 -06:00
Tamara Dahlgren
c3cb863b82 Feature: Add deprecated versions section to spack info output (#25972) 2021-09-21 11:49:36 -07:00
AMD Toolchain Support
8de77bb304 Fix for - Installation issue: amdlibflame #25878 (#25987)
* Fix for - Installation issue: amdlibflame #25878

* Updated with python+pythoncmd - Symlink 'python3' executable to 'python'
2021-09-21 11:15:23 -07:00
Edward Hartnett
a8977f828e w3nco: add NOAA software maintainers (#26111) 2021-09-21 12:08:43 -06:00
Jonas Thies
6c487f4c6d packages/phist new version 1.9.5 (#26114) 2021-09-21 11:57:22 -06:00
Edward Hartnett
24500d4115 added noaa software maintainers to maintainer list (#26103) 2021-09-21 09:58:53 -07:00
iarspider
2741037c69 Fix FORM recipe (#26104) 2021-09-21 09:57:40 -07:00
Edward Hartnett
d278837316 sigio: add NOAA software maintainers (#26109) 2021-09-21 10:39:05 -06:00
bernhardkaindl
22cfc19bcb ddd,debuild,flux-sched: add missing dependencies (#26090) 2021-09-21 18:08:11 +02:00
Melven Roehrig-Zoellner
c24413a530 petsc: fix for enabling openmp (#25942)
* petsc: fix for enabling openmp

* petsc: shorten comment (style guidelines)

* petsc: move flag to make code more clear
2021-09-21 07:13:35 -06:00
natshineman
544b3f447d mvapich2-gdr: add v2.3.6 (#26076)
Co-authored-by: Nick Contini <contini.26@buckeyemail.osu.edu>
2021-09-21 06:19:56 -06:00
Gregory Lee
05ca19d9d7 fgfs: fix missing autotools depends_on (#26005)
* added build deps for fgfs

* added build deps when building master branch
2021-09-21 07:40:59 -04:00
Harmen Stoppels
a326713826 py-abcpy: new package (#25713) 2021-09-21 13:10:16 +02:00
bernhardkaindl
1b95c97bb3 ccfits: add v2.6 (#26089)
ccfits@2.5 doesn't compile on Ubuntu 18.04: Update to version 2.6.
2021-09-21 04:43:48 -06:00
bernhardkaindl
c6a33c28de Fix deps of diffmark,faiss,fgsl,fipcheck,nbdkit,ncbi-magicblast,xcb-util-cursor (#26091)
Fix missing type='build' deps for cpio, m4, pkg-config and python in
diffmark, faiss, fgsl, fipcheck, nbdkit, ncbi-magicblast and xcb-util-cursor
2021-09-21 04:16:43 -06:00
Harmen Stoppels
58663692a4 Rename 'variant_name' to 'variant' and document it in autotools build system (#26064) 2021-09-21 11:27:41 +02:00
Christoph Junghans
45ba4d94bd votca-* packages: add version 2021.2 (#26058) 2021-09-20 16:14:19 -07:00
Adam J. Stewart
3e3b7b1394 New package: py-torchgeo (#26059) 2021-09-20 16:12:55 -07:00
Jordan Ogas
d0fd9e6d5f charliecloud package: add version 0.25 (#26073) 2021-09-20 16:08:41 -07:00
adityakavalur
88880c5369 Include newer versions of UCX (#26072) 2021-09-20 23:41:26 +02:00
Danny McClanahan
aa7e354a3a patch serf and scons to use #!/usr/bin/env python3 (#26062) 2021-09-20 22:12:30 +02:00
iarspider
f3d2f62468 Add variants to FORM recipe (#25963) 2021-09-20 13:53:54 -06:00
Edward Hartnett
65f285831e added NOAA maintainers to ip package (#26055) 2021-09-20 11:36:03 -06:00
Olivier Cessenat
0197b364a2 mfem and petsc: alter hypre dependencies (#25902)
Hypre latest version 2.22.1 breaks MFEM and PETSc.
2021-09-20 09:51:29 -07:00
Edward Hartnett
96eefea9a3 added NOAA software team to maintainers list (#26056) 2021-09-20 09:01:53 -06:00
Olivier Cessenat
1ea58e0cb9 p7zip: resolve gcc 10 conflict (#25676)
Fix credit: Eric Brugger
2021-09-20 07:43:07 -04:00
Tamara Dahlgren
c6f9e9baf6 strumpack: Update stand-alone test to use stage directory (#25751) 2021-09-20 07:36:58 -04:00
kjrstory
430caaf5f6 paraview: patch for version 5.9.1 with gcc 11.1.0 (#25887)
* paraview: patch for version 5.9.1 with gcc 11.1.0

* Fix : blank line contains whitespace
2021-09-20 07:31:54 -04:00
Robert Underwood
6476e0cf02 gdb: new version v11.1 (#25920) 2021-09-20 07:30:37 -04:00
Jen Herting
acd4186a88 New packages: py-imgaug (#25894)
* updates from sid

* [py-imgaug] url -> pypi

* [py-imgaug] added version 0.4.0 and fixed up dependencies

* [py-imgaug] updated copyright

Co-authored-by: Andrew Elble <aweits@rit.edu>
2021-09-20 07:29:43 -04:00
iarspider
ca8d16c9d1 Allow setting variant name in AutotoolsPackage._activate_or_not (#26054) 2021-09-20 10:54:24 +02:00
jacorvar
d20fa421ca Fix R version (#26061)
R version should be `3.0.0` instead of `3.00`.
2021-09-20 10:47:40 +02:00
Edward Hartnett
8fceaf4a33 added NOAA software team to maintainers list (#26057) 2021-09-19 19:02:02 -06:00
Vanessasaurus
534bad0ff7 updating OmegaH to 9.34.1 (#26015) 2021-09-19 08:55:54 -06:00
Todd Gamblin
4f8b643404 Create SECURITY.md 2021-09-19 06:43:14 -07:00
Vanessasaurus
c2f42a6f09 updating kokkos to 3.4.01 (#26013) 2021-09-19 06:10:55 -06:00
Vanessasaurus
148c7aeb06 updating singularity to 3.8.3 (#26020)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-09-19 04:20:15 -06:00
jacorvar
4f79a83dd1 Fix R version (#26047)
R version should be `3.0.0`, in contrast to `3.00`.
2021-09-19 12:14:14 +02:00
Alec Scott
2d4d51eb5e Add v1.6.0 to Benchmark (#25996) 2021-09-18 20:59:02 -06:00
Cameron Smith
b14d6d217e omegah: add version 9.34.1 (#25828) 2021-09-18 19:58:44 -06:00
Alec Scott
5b400a69b7 Add v0.935 to Angsd (#25995) 2021-09-18 18:46:55 -06:00
Adam J. Stewart
c0839b14d5 Python: use platform-specific site packages dir (#25998) 2021-09-19 00:37:50 +00:00
Seth R. Johnson
6ff5795342 cmake: allow gcc on macOS for newer versions (#25994) 2021-09-18 17:34:43 -06:00
Michele Martone
60f0d0ac20 librsb: add v1.2.0.10 (#26044) 2021-09-18 17:43:41 -04:00
Axel Huebl
8f3482b2ce FFTW: Fix OpenMP Build on macOS (#26039) 2021-09-18 15:13:47 -06:00
Vanessasaurus
fc79a5da17 updating siesta to 4.0.2 (#26021) 2021-09-18 08:05:02 -06:00
Vanessasaurus
1880db8457 updating universal ctags to 5.2.0.xxxx (#26024) 2021-09-18 07:13:55 -06:00
Vanessasaurus
6538727fb3 updating sparsehash to 2.0.4 (#26023) 2021-09-18 06:52:59 -06:00
Vanessasaurus
f5275fecc5 updating htslib to 1.13 (#26012) 2021-09-18 06:26:00 -06:00
Vanessasaurus
d499309433 updating nco to 5.0.1 (#26014) 2021-09-18 06:23:05 -06:00
Vanessasaurus
c5e2662dea updating graphviz to 2.49.0 (#26011) 2021-09-18 06:19:56 -06:00
Vanessasaurus
7b0a505795 updating veloc to 1.5 and cleaning up spacing (#26025) 2021-09-18 06:10:56 -06:00
Vanessasaurus
e8b86b7ec6 updating cloc to 1.9.0 (#26009) 2021-09-18 05:44:04 -06:00
Vanessasaurus
608a0b1f8f updating poppler (#26016) 2021-09-18 05:10:56 -06:00
Vanessasaurus
abd11cf042 updating raxml to 8.2.12 (#26018) 2021-09-18 04:58:50 -06:00
Vanessasaurus
8ad910d8f1 updating protobuf (Gooooogle!) to 3.18.0 (#26017) 2021-09-18 04:40:49 -06:00
Vanessasaurus
d5c985b888 updating samtools to 1.13 (#26019) 2021-09-18 04:16:51 -06:00
Vanessasaurus
f0443d0327 updating spades to 3.15.3 (#26022) 2021-09-18 03:46:50 -06:00
Vanessasaurus
9af1858bff updating gatk to 4.2.2.0 (#26010) 2021-09-18 10:07:28 +02:00
David Beckingsale
fa3265ea51 Convert RAJA, CHAI and Umpire to CachedCMakePackages (#25788)
* Switch Umpire to CachedCMakePackage
* Fix missing import
* Correct tests option in Umpire
* Switch RAJA to CachedCMakePackage
* Convert CHAI to CachedCMakePackage
* Corrections in RAJA
* Patches in Umpire & RAJA for BLT target export
* Fixup style
* Fixup incorrect use of cmake_cache_string
2021-09-17 21:29:41 -07:00
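As a rough sketch of the pattern above (the hook and helper names are assumptions based on the `cmake_cache_string` helper mentioned in this PR, not code copied from it), a `CachedCMakePackage` emits entries into a generated initial `*.cmake` cache file instead of passing everything on the CMake command line:
```python
from spack import *  # assumed to provide CachedCMakePackage and cache helpers


class Example(CachedCMakePackage):  # hypothetical package, for illustration only
    def initconfig_package_entries(self):
        # Each helper is assumed to format one set() entry for the generated
        # host-config (*.cmake) file: an option (BOOL) or a string value.
        entries = []
        entries.append(cmake_cache_option('ENABLE_TESTS', self.run_tests))
        entries.append(cmake_cache_string('BLT_CXX_STD', 'c++14'))
        return entries
```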
Chris White
79ac572f21 update tutorial version of hdf5 (#25368) 2021-09-17 17:34:56 -07:00
Massimiliano Culpo
b847bb72f0 Bootstrap should search for compilers after switching config scopes (#26029)
fixes #25992

Currently the bootstrapping process may need a compiler.

When bootstrapping from sources the need is obvious, while
when bootstrapping from binaries it's currently needed in
case patchelf is not on the system (since it will then be
bootstrapped from sources).

Before this PR we were searching for compilers as the
first operation, in case they were not declared in
the configuration. This fails in case we start
bootstrapping from within an environment.

The fix is to defer the search until we have swapped
configuration.
2021-09-17 18:28:48 -06:00
Harmen Stoppels
4d36c40cfb Bump reframe (#25970) 2021-09-17 14:23:05 -06:00
Olli Lupton
ae9adba900 Add ccache v4.4.1. (#25957) 2021-09-17 13:37:58 -06:00
Cyrus Harrison
bd415ec841 improve ascent package to use stages and cmake base (#25720)
* improve ascent package to use stages and cmake base

* style

* more style
2021-09-17 10:37:16 -07:00
iarspider
7e7de25aba fmt: add variant for shared library (#25969) 2021-09-17 08:04:55 -06:00
eugeneswalker
730720d50a variant build: openmp_ref should be openmp (#26006) 2021-09-17 06:38:00 -06:00
iarspider
8e486c1e57 gosam: new version 2.1.1 (#25985) 2021-09-17 06:10:41 -06:00
Kurt Sansom
be8e52fbbe GCC: patch for gcc 10.3.0 ICE when using nvcc (#25980)
* fix: patch for gcc 10.3.0 ICE when using nvcc

* fix: use URL reference instead

* fix: add missing sha256sum
2021-09-16 18:43:56 -06:00
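For context, Spack packages can reference an upstream fix by URL together with a required checksum; a minimal sketch follows (the URL, checksum, and version range are placeholders, not the ones used in this PR):
```python
class Example(AutotoolsPackage):  # sketch only; not the real gcc recipe
    # Hypothetical URL-based patch: sha256 pins the exact patch contents,
    # and the when= constraint limits the patch to the affected release.
    patch('https://example.org/fix-nvcc-ice.patch',   # placeholder URL
          sha256='0000000000000000000000000000000000000000000000000000000000000000',
          when='@10.3.0')
```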
Edward Hartnett
f8fae997d3 added package.py for GPTL (#25993) 2021-09-16 17:11:05 -06:00
Massimiliano Culpo
71a3173a32 Filter UserWarning out of test output (#26001) 2021-09-16 14:56:00 -06:00
Miroslav Stoyanov
e027cecff2 workaround a cmake/rocm bug in heffte (#25948) 2021-09-16 13:26:07 -06:00
Alec Scott
bb29c5d674 Add v7.0.2 to Admixtools (#25997) 2021-09-16 12:08:15 -06:00
G-Ragghianti
c4e26ac7c8 Fix for problem with cmake@3.21 (#25989) 2021-09-16 11:52:59 -06:00
AMD Toolchain Support
cf81046bb1 New package: ROMS (#25990)
Co-authored-by: Mohan Babu <mohbabul@amd.com>
2021-09-16 10:34:45 -07:00
Michael Kuhn
2d34acf29e cc: Use parameter expansion instead of basename (#24509)
While debugging #24508, I noticed that we call `basename` in `cc`. The
same can be achieved by using Bash's parameter expansion, saving one
external process per call.

Parameter expansion cannot replace basename for directories in some
cases, but is guaranteed to work for executables.
2021-09-16 16:25:49 +00:00
Michael Kuhn
d73fe19d93 Recommend Git's manyFiles feature (#25977)
Git 2.24 introduced a feature flag for repositories with many files, see:
https://github.blog/2019-11-03-highlights-from-git-2-24/#feature-macros

Since Spack's Git repository contains roughly 8,500 files, it can be
worthwhile to enable this, especially on slow file systems such as NFS:
```
$ hyperfine --warmup 3 'cd spack-default; git status' 'cd spack-manyfiles; git status'
Benchmark #1: cd spack-default; git status
  Time (mean ± σ):      3.388 s ±  0.095 s    [User: 256.2 ms, System: 625.8 ms]
  Range (min … max):    3.168 s …  3.535 s    10 runs

Benchmark #2: cd spack-manyfiles; git status
  Time (mean ± σ):     168.7 ms ±  10.9 ms    [User: 98.6 ms, System: 126.1 ms]
  Range (min … max):   144.8 ms … 188.0 ms    19 runs

Summary
  'cd spack-manyfiles; git status' ran
   20.09 ± 1.42 times faster than 'cd spack-default; git status'
```
2021-09-16 09:41:10 -06:00
Harmen Stoppels
5b211c90f5 Bump sirius 7.2.x (#25939) 2021-09-16 15:00:58 +02:00
Massimiliano Culpo
fa6366a7df Add a deprecation warning when using the old concretizer (#25966) 2021-09-16 13:25:24 +02:00
Mikael Simberg
b09ad2cc8c Update HPX package (#25775)
* Add support for C++20 to HPX package

* Enable unity builds in HPX package when available

* Add support for HIP/ROCm to HPX package

* Rearrange and update required versions for HPX package

* Add C++20 option to asio package
2021-09-16 13:24:17 +02:00
Harmen Stoppels
ccfdac8402 Improve bootstrapping docs a hair (#25962) 2021-09-16 07:02:31 -04:00
Harmen Stoppels
abb0f6e27c Fix NameError in foreground/background test (#25967) 2021-09-16 10:39:07 +02:00
Harmen Stoppels
3fe9b34362 Inform the user about bootstrapping (#25964) 2021-09-16 09:28:24 +02:00
jacorvar
c0122242ee bedops: Fix checksum for 2.4.40 (#25958)
Fixes #25951
2021-09-15 16:17:40 -07:00
Adam J. Stewart
0015f700b7 py-pybind11: use PythonPackage install method (#25650) 2021-09-15 11:57:33 -07:00
dependabot[bot]
5d23638fdc build(deps): bump codecov/codecov-action from 2.0.3 to 2.1.0 (#25925)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 2.0.3 to 2.1.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/codecov/codecov-action/compare/v2.0.3...v2.1.0)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-09-15 16:02:05 +02:00
Tamara Dahlgren
f0b4afe7db Raise exception when 1+ stand-alone tests fail (#25857) 2021-09-15 08:04:59 -04:00
Ben Corbett
75497537a4 Added LvArray 0.2.2 (#25950) 2021-09-15 01:44:07 -06:00
Massimiliano Culpo
c52426ea7a Make clingo the default solver (#25502)
Modifications:
- [x] Change `defaults/config.yaml`
- [x] Add a fix for bootstrapping patchelf from sources if `compilers.yaml` is empty
- [x] Make `SPACK_TEST_SOLVER=clingo` the default for unit-tests
- [x] Fix package failures in the e4s pipeline

Caveats:
1. CentOS 6 still uses the original concretizer as it can't connect to the buildcache due to issues with `ssl` (bootstrapping from sources requires a C++14 capable compiler)
1. I had to update the image tag for GitlabCI in e699f14.  
1. libtool v2.4.2 has been deprecated and other packages received some update
2021-09-14 22:44:16 -07:00
Adam J. Stewart
0d0d438c11 Add a __reduce__ method to Environment (#25678)
* Add a __reduce__ method to Environment
* Add unit test
* Convert Path to str
2021-09-14 22:37:36 -07:00
Vanessasaurus
ef5ad4eb34 Adding ability to compare git references to spack install (#24639)
This will allow a user to (from anywhere a Spec is parsed including both name and version) refer to a git commit in lieu of 
a package version, and be able to make comparisons with releases in the history based on commits (or with other commits). We do this by way of:

 - Adding a property, is_commit, to a version, so we can always check whether a version is a commit and adjust behavior accordingly.
 - Adding an attribute to the Version object which can lookup commits from a git repo and find the last known version before that commit, and the distance
 - Construct new Version comparators, which are tuples. For normal versions, they are unchanged. For commits with a previous version x.y.z, d commits away, the comparator is (x, y, z, '', d). For commits with no previous version, the comparator is ('', d) where d is the distance from the first commit in the repo.
 - Metadata on git commits is cached in the misc_cache, for quick lookup later.
 - Git repos are cached as bare repos in `~/.spack/git_repos`
 - In both caches, git repo urls are turned into file paths within the cache

If a commit cannot be found in the cached git repo, we fetch from the repo. If a commit is found in the cached metadata, we do not recompare to newly downloaded tags (assuming repo structure does not change). The cached metadata may be thrown out by using the `spack clean -m` option if you know the repo structure has changed in a way that invalidates existing entries. Future work will include automatic updates.

# Finding previous versions
Spack will search the repo for any tags that match the string of a version given by the `version` directive. Spack will also search for any tags that match `v + string` for any version string. Beyond that, Spack will search for tags that match a SEMVER regex (i.e., tags of the form x.y.z) and interpret those tags as valid versions as well. Future work will increase the breadth of tags understood by Spack

For each tag, Spack queries git to determine whether the tag is an ancestor of the commit in question or not. Spack then sorts the tags that are ancestors of the commit by commit-distance in the repo, and takes the nearest ancestor. The version represented by that tag is listed as the previous version for the commit.

Not all commits will find a previous version, depending on the package workflow. Future work may enable more tangential relationships between commits and versions to be discovered, but many commits in real world git repos require human knowledge to associate with a most recent previous version. Future work will also allow packages to specify commit/tag/version relationships manually for such situations.

# Version comparisons.
The empty string is a valid component of a Spack version tuple, and is in fact the lowest-valued component. It cannot be generated as part of any valid version. These two characteristics make it perfect for delineating previous versions from distances. For any version x.y.z, (x, y, z, '', _) will be less than any "real" version beginning x.y.z. This ensures that no distance from a release will cause the commit to be interpreted as "greater than" a version which is not an ancestor of it.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: Gregory Becker <becker33@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-09-14 22:12:34 -07:00
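To make the comparator idea concrete, here is an illustrative sketch (plain Python tuples standing in for Spack's actual Version comparators) of why the empty-string component keeps a commit between its previous release and any later version:
```python
# Model the comparators described above as tuples of string components.
release      = ('3', '2', '1')            # plain release 3.2.1
commit       = ('3', '2', '1', '', '7')   # a commit 7 commits past the 3.2.1 tag
next_patch   = ('3', '2', '1', '1')       # release 3.2.1.1
next_release = ('3', '2', '2')            # release 3.2.2

# The empty string sorts below every non-empty component, so the commit is
# greater than its previous release but less than anything released after it.
assert release < commit < next_patch < next_release
```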
David Beckingsale
c3bc3e61aa gcc: apply backported fixes to v4.9.3 (#25945) 2021-09-15 07:10:04 +02:00
Satish Balay
148071ac8a dealii: add version 9.3.1 (#25915) 2021-09-14 15:26:08 -06:00
Scott Wittenburg
bf7c12b4df Pipelines: (Re)enable E4S on Power stack (#25921)
Pipelines: (Re)enable E4S on Power stack
2021-09-14 14:55:50 -06:00
Weston Ortiz
56c375743a Add missing mumps TPL commands (#25940) 2021-09-14 20:26:37 +01:00
Vanessasaurus
c6edfa3f31 Update spack monitor to support new spec (#25928)
This PR coincides with tiny changes to spack to support spack monitor using the new spec;
the corresponding spack monitor PR is at https://github.com/spack/spack-monitor/pull/31.
Since there are no changes to the database we can actually update the current server
fairly easily, so either someone can test locally or we can just update and then
test from that (and update as needed).

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-09-14 08:02:04 -06:00
Harmen Stoppels
fc96c49b0b Add py-pot with patch (#25712) 2021-09-14 15:55:02 +02:00
Harmen Stoppels
ce790a89f2 reframe: set PYTHONPATH at runtime (#25842) 2021-09-14 15:54:14 +02:00
Valentin Volkl
1b633e1ca4 ocaml: add patch for clang@11: (#25886) 2021-09-14 09:34:57 +02:00
Ben Darwin
2ac9dc76c4 dcmtk: add v3.6.4, v3.6.5, v3.6.6 (#25923) 2021-09-14 09:22:48 +02:00
Weston Ortiz
d7c5aa46fe trilinos: variant for libx11 (#25823) 2021-09-14 08:53:53 +02:00
Edward Hartnett
f6eb16982a added new version of parallelio library (#25916) 2021-09-13 18:22:54 -06:00
David Beckingsale
be1c4bc563 Rename camp 'main' version (#25918) 2021-09-13 16:55:55 -06:00
Timothy Brown
998de97091 ESMF, NEMSIO and UFS-UTILS changes. (#25846)
* ESMF and NEMSIO changes.

- Updating ESMF to set the COMM correctly when using Intel oneapi.
- Explicitly setting the CMake MPI Fortran compiler for NEMSIO.

* Update UFS utils CMake to use MPI_<lang>_COMPILER.
2021-09-13 16:13:43 -06:00
Greg Becker
dad69e7d7c Fix environment reading from lockfile to trust written hashes (#25879)
#22845 revealed a long-standing bug that had never been triggered before, because the
hashing algorithm had been stable for multiple years while the bug was in production. The
bug was that when reading a concretized environment, Spack did not properly read in the
build hashes associated with the specs in the environment. Those hashes were recomputed
(and as long as we didn't change the algorithm, were recomputed identically). Spack's
policy, though, is never to recompute a hash. Once something is installed, we respect its
metadata hash forever -- even if internally Spack changes the hashing method. Put
differently, once something is concretized, it has a concrete hash, and that's it -- forever.

When we changed the hashing algorithm for performance in #22845 we exposed the bug.
This PR fixes the bug at its source by properly reading in the cached build hash attributes
associated with the specs. I've also renamed some variables in the Environment class
methods to make a mistake of this sort more difficult to make in the future.

* ensure environment build hashes are never recomputed
* add comment clarifying reattachment of env build hashes
* bump lockfile version and include specfile version in env meta
* Fix unit-test for v1 to v2 conversion

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-09-13 15:25:48 -06:00
Massimiliano Culpo
c699e907fc Update command to setup tutorial (#24488) 2021-06-23 19:38:36 -05:00
2113 changed files with 46824 additions and 14728 deletions

View File

@@ -1,6 +1,8 @@
name: Bootstrapping
on:
# This Workflow can be triggered manually
workflow_dispatch:
pull_request:
branches:
- develop
@@ -19,7 +21,7 @@ on:
jobs:
fedora-sources:
fedora-clingo-sources:
runs-on: ubuntu-latest
container: "fedora:latest"
steps:
@@ -29,7 +31,7 @@ jobs:
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -46,7 +48,7 @@ jobs:
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-sources:
ubuntu-clingo-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
@@ -59,7 +61,7 @@ jobs:
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -76,18 +78,47 @@ jobs:
spack -d solve zlib
tree ~/.spack/bootstrap/store/
opensuse-sources:
ubuntu-clingo-binaries-and-patchelf:
runs-on: ubuntu-latest
container: "opensuse/tumbleweed:latest"
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
useradd -m spack-test
chown -R spack-test .
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack -d solve zlib
tree ~/.spack/bootstrap/store/
opensuse-clingo-sources:
runs-on: ubuntu-latest
container: "opensuse/leap:latest"
steps:
- name: Install dependencies
run: |
zypper update -y
# Harden CI by applying the workaround described here: https://www.suse.com/support/kb/doc/?id=000019505
zypper update -y || zypper update -y
zypper install -y \
bzip2 curl file gcc-c++ gcc gcc-fortran tar git gpg2 gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -101,13 +132,13 @@ jobs:
spack -d solve zlib
tree ~/.spack/bootstrap/store/
macos-sources:
macos-clingo-sources:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install cmake bison@2.7 tree
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -126,8 +157,8 @@ jobs:
- name: Install dependencies
run: |
brew install tree
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Bootstrap clingo
@@ -137,15 +168,14 @@ jobs:
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-clingo-binaries:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ['2.7', '3.5', '3.6', '3.7', '3.8', '3.9']
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Setup repo and non-root user
@@ -159,3 +189,94 @@ jobs:
spack bootstrap untrust spack-install
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-gnupg-binaries:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- uses: actions/checkout@v2
- name: Setup repo and non-root user
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
useradd -m spack-test
chown -R spack-test .
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap untrust spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
ubuntu-gnupg-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree \
gawk
- uses: actions/checkout@v2
- name: Setup repo and non-root user
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
useradd -m spack-test
chown -R spack-test .
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap untrust github-actions
spack -d gpg list
tree ~/.spack/bootstrap/store/
macos-gnupg-binaries:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- uses: actions/checkout@v2
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack bootstrap untrust spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
macos-gnupg-sources:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install gawk tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- uses: actions/checkout@v2
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap untrust github-actions
spack -d gpg list
tree ~/.spack/bootstrap/store/

View File

@@ -1,15 +1,26 @@
name: Build & Deploy Docker Containers
name: Containers
on:
# This Workflow can be triggered manually
workflow_dispatch:
# Build new Spack develop containers nightly.
schedule:
- cron: '34 0 * * *'
# Run on pull requests that modify this file
pull_request:
branches:
- develop
paths:
- '.github/workflows/build-containers.yml'
# Let's also build & tag Spack containers on releases.
release:
types: [published]
jobs:
deploy-images:
runs-on: ubuntu-latest
runs-on: ubuntu-latest
permissions:
packages: write
strategy:
# Even if one container fails to build we still want the others
# to continue their builds.
@@ -17,19 +28,19 @@ jobs:
# A matrix of Dockerfile paths, associated tags, and which architectures
# they support.
matrix:
dockerfile: [[amazon-linux, amazonlinux-2.dockerfile, 'linux/amd64,linux/arm64'],
[centos7, centos-7.dockerfile, 'linux/amd64,linux/arm64'],
[leap15, leap-15.dockerfile, 'linux/amd64,linux/arm64'],
[ubuntu-xenial, ubuntu-1604.dockerfile, 'linux/amd64,linux/arm64'],
[ubuntu-bionic, ubuntu-1804.dockerfile, 'linux/amd64,linux/arm64']]
dockerfile: [[amazon-linux, amazonlinux-2.dockerfile, 'linux/amd64,linux/arm64'],
[centos7, centos-7.dockerfile, 'linux/amd64,linux/arm64,linux/ppc64le'],
[leap15, leap-15.dockerfile, 'linux/amd64,linux/arm64,linux/ppc64le'],
[ubuntu-xenial, ubuntu-1604.dockerfile, 'linux/amd64,linux/arm64,linux/ppc64le'],
[ubuntu-bionic, ubuntu-1804.dockerfile, 'linux/amd64,linux/arm64,linux/ppc64le']]
name: Build ${{ matrix.dockerfile[0] }}
steps:
- name: Checkout
uses: actions/checkout@v2
uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Set Container Tag Normal (Nightly)
run: |
container="ghcr.io/spack/${{ matrix.dockerfile[0]}}:latest"
container="${{ matrix.dockerfile[0] }}:latest"
echo "container=${container}" >> $GITHUB_ENV
echo "versioned=${container}" >> $GITHUB_ENV
@@ -37,7 +48,7 @@ jobs:
- name: Set Container Tag on Release
if: github.event_name == 'release'
run: |
versioned="ghcr.io/spack/${{matrix.dockerfile[0]}}:${GITHUB_REF##*/}"
versioned="${{matrix.dockerfile[0]}}:${GITHUB_REF##*/}"
echo "versioned=${versioned}" >> $GITHUB_ENV
- name: Check ${{ matrix.dockerfile[1] }} Exists
@@ -48,25 +59,33 @@ jobs:
exit 1;
fi
- name: Set up QEMU
uses: docker/setup-qemu-action@27d0a4f181a40b142cce983c5393082c365d1480 # @v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@94ab11c41e45d028884a99163086648e898eed25 # @v1
- name: Log in to GitHub Container Registry
uses: docker/login-action@v1
uses: docker/login-action@f054a8b539a109f9f41c372932f1ae047eff08c9 # @v1
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Log in to DockerHub
uses: docker/login-action@f054a8b539a109f9f41c372932f1ae047eff08c9 # @v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[1] }}
uses: docker/build-push-action@v2
uses: docker/build-push-action@a66e35b9cbcf4ad0ea91ffcaf7bbad63ad9e0229 # @v2
with:
file: share/spack/docker/${{matrix.dockerfile[1]}}
platforms: ${{ matrix.dockerfile[2] }}
push: true
push: ${{ github.event_name != 'pull_request' }}
tags: |
${{ env.container }}
${{ env.versioned }}
spack/${{ env.container }}
spack/${{ env.versioned }}
ghcr.io/spack/${{ env.container }}
ghcr.io/spack/${{ env.versioned }}

View File

@@ -24,8 +24,8 @@ jobs:
name: gcc with clang
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: spack install
@@ -39,8 +39,8 @@ jobs:
runs-on: macos-latest
timeout-minutes: 700
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: spack install
@@ -52,8 +52,8 @@ jobs:
name: scipy, mpl, pd
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: spack install
@@ -62,17 +62,3 @@ jobs:
spack install -v --fail-fast py-scipy %apple-clang
spack install -v --fail-fast py-matplotlib %apple-clang
spack install -v --fail-fast py-pandas %apple-clang
install_mpi4py_clang:
name: mpi4py, petsc4py
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.9
- name: spack install
run: |
. .github/workflows/install_spack.sh
spack install -v --fail-fast py-mpi4py %apple-clang
spack install -v --fail-fast py-petsc4py %apple-clang

View File

@@ -1,9 +1,8 @@
#!/usr/bin/env sh
#!/bin/bash -e
git config --global user.email "spack@example.com"
git config --global user.name "Test User"
# With fetch-depth: 0 we have a remote develop
# but not a local branch. Don't do this on develop
if [ "$(git branch --show-current)" != "develop" ]
then
git branch develop origin/develop
# create a local pr base branch
if [[ -n $GITHUB_BASE_REF ]]; then
git fetch origin "${GITHUB_BASE_REF}:${GITHUB_BASE_REF}"
fi

View File

@@ -15,8 +15,8 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install Python Packages
@@ -24,17 +24,17 @@ jobs:
pip install --upgrade pip
pip install --upgrade vermin
- name: vermin (Spack's Core)
run: vermin --backport argparse --violations --backport typing -t=2.6- -t=3.5- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
run: vermin --backport argparse --violations --backport typing -t=2.7- -t=3.5- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
run: vermin --backport argparse --violations --backport typing -t=2.6- -t=3.5- -vvv var/spack/repos
run: vermin --backport argparse --violations --backport typing -t=2.7- -t=3.5- -vvv var/spack/repos
# Run style checks on the files that have been changed
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install Python packages
@@ -48,26 +48,6 @@ jobs:
- name: Run style tests
run: |
share/spack/qa/run-style-tests
# Build the documentation
documentation:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.9
- name: Install System packages
run: |
sudo apt-get -y update
sudo apt-get install -y coreutils ninja-build graphviz
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools
pip install --upgrade -r lib/spack/docs/requirements.txt
- name: Build documentation
run: |
share/spack/qa/run-doc-tests
# Check which files have been updated by the PR
changes:
runs-on: ubuntu-latest
@@ -77,12 +57,12 @@ jobs:
packages: ${{ steps.filter.outputs.packages }}
with_coverage: ${{ steps.coverage.outputs.with_coverage }}
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@v2
- uses: dorny/paths-filter@b2feaf19c27470162a626bd6fa8438ae5b263721
id: filter
with:
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below
@@ -112,17 +92,17 @@ jobs:
# Run unit tests with different configurations on linux
unittests:
needs: [ validate, style, documentation, changes ]
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [2.7, 3.5, 3.6, 3.7, 3.8, 3.9]
concretizer: ['original', 'clingo']
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -134,7 +114,7 @@ jobs:
patchelf cmake bison libbison-dev kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools codecov coverage[toml]
pip install --upgrade pip six setuptools pytest codecov coverage[toml]
# ensure style checks are not skipped in unit tests for python >= 3.6
# note that true/false (i.e., 1/0) are opposite in conditions in python and bash
if python -c 'import sys; sys.exit(not sys.version_info >= (3, 6))'; then
@@ -171,19 +151,19 @@ jobs:
SPACK_TEST_SOLVER: ${{ matrix.concretizer }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@v2.0.3
- uses: codecov/codecov-action@f32b3a3741e1053eb607407145bc9619351dc93b # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
with:
flags: unittests,linux,${{ matrix.concretizer }}
# Test shell integration
shell:
needs: [ validate, style, documentation, changes ]
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install System packages
@@ -193,7 +173,7 @@ jobs:
sudo apt-get install -y coreutils kcov csh zsh tcsh fish dash bash
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools codecov coverage[toml]
pip install --upgrade pip six setuptools pytest codecov coverage[toml]
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -209,44 +189,15 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@v2.0.3
- uses: codecov/codecov-action@f32b3a3741e1053eb607407145bc9619351dc93b # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
with:
flags: shelltests,linux
# Test for Python2.6 run on Centos 6
centos6:
needs: [ validate, style, documentation, changes ]
runs-on: ubuntu-latest
container: spack/github-actions:centos6
steps:
- name: Run unit tests (full test-suite)
# The CentOS 6 container doesn't run with coverage, but
# under the same conditions it runs the full test suite
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
env:
HOME: /home/spack-test
run: |
whoami && echo $HOME && cd $HOME
git clone https://github.com/spack/spack.git && cd spack
git fetch origin ${{ github.ref }}:test-branch
git checkout test-branch
share/spack/qa/run-unit-tests
- name: Run unit tests (only package tests)
if: ${{ needs.changes.outputs.with_coverage == 'false' }}
env:
HOME: /home/spack-test
ONLY_PACKAGES: true
run: |
whoami && echo $HOME && cd $HOME
git clone https://github.com/spack/spack.git && cd spack
git fetch origin ${{ github.ref }}:test-branch
git checkout test-branch
share/spack/qa/run-unit-tests
# Test RHEL8 UBI with platform Python. This job is run
# only on PRs modifying core Spack
rhel8-platform-python:
needs: [ validate, style, documentation, changes ]
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
container: registry.access.redhat.com/ubi8/ubi
@@ -256,7 +207,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -268,16 +219,17 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack -d solve zlib
spack unit-test -k 'not cvs and not svn and not hg' -x --verbose
# Test for the clingo based solver (using clingo-cffi)
clingo-cffi:
needs: [ validate, style, documentation, changes ]
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install System packages
@@ -289,7 +241,7 @@ jobs:
patchelf kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools codecov coverage[toml] clingo
pip install --upgrade pip six setuptools pytest codecov coverage[toml] clingo
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -311,28 +263,28 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@v2.0.3
- uses: codecov/codecov-action@f32b3a3741e1053eb607407145bc9619351dc93b # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
with:
flags: unittests,linux,clingo
# Run unit tests on MacOS
build:
needs: [ validate, style, documentation, changes ]
needs: [ validate, style, changes ]
runs-on: macos-latest
strategy:
matrix:
python-version: [3.8]
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools
pip install --upgrade codecov coverage[toml]
pip install --upgrade pytest codecov coverage[toml]
- name: Setup Homebrew packages
run: |
brew install dash fish gcc gnupg2 kcov
@@ -357,7 +309,7 @@ jobs:
echo "ONLY PACKAGE RECIPES CHANGED [skipping coverage]"
$(which spack) unit-test -x -m "not maybeslow" -k "package_sanity"
fi
- uses: codecov/codecov-action@v2.0.3
- uses: codecov/codecov-action@f32b3a3741e1053eb607407145bc9619351dc93b # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
with:
files: ./coverage.xml

.gitignore vendored
View File

@@ -132,11 +132,11 @@ celerybeat.pid
.env
.venv
env/
!/lib/spack/env
venv/
ENV/
env.bak/
venv.bak/
!/lib/spack/env
# Spyder project settings
.spyderproject
@@ -210,6 +210,9 @@ tramp
/eshell/history
/eshell/lastdir
# zsh byte-compiled files
*.zwc
# elpa packages
/elpa/

View File

@@ -2,6 +2,7 @@ version: 2
sphinx:
configuration: lib/spack/docs/conf.py
fail_on_warning: true
python:
version: 3.7

View File

@@ -1,3 +1,187 @@
# v0.17.0 (2021-11-05)
`v0.17.0` is a major feature release.
## Major features in this release
1. **New concretizer is now default**
The new concretizer introduced as an experimental feature in `v0.16.0`
is now the default (#25502). The new concretizer is based on the
[clingo](https://github.com/potassco/clingo) logic programming system,
and it enables us to do much higher quality and faster dependency solving.
The old concretizer is still available via the `concretizer: original`
setting, but it is deprecated and will be removed in `v0.18.0`.
2. **Binary Bootstrapping**
To make it easier to use the new concretizer and binary packages,
Spack now bootstraps `clingo` and `GnuPG` from public binaries. If it
is not able to bootstrap them from binaries, it installs them from
source code. With these changes, you should still be able to clone Spack
and start using it almost immediately. (#21446, #22354, #22489, #22606,
#22720, #22720, #23677, #23946, #24003, #25138, #25607, #25964, #26029,
#26399, #26599).
3. **Reuse existing packages (experimental)**
The most wanted feature from our
[2020 user survey](https://spack.io/spack-user-survey-2020/) and
the most wanted Spack feature of all time (#25310). `spack install`,
`spack spec`, and `spack concretize` now have a `--reuse` option, which
causes Spack to minimize the number of rebuilds it does. The `--reuse`
option will try to find existing installations and binary packages locally
and in registered mirrors, and will prefer to use them over building new
versions. This will allow users to build from source *far* less than in
prior versions of Spack. This feature will continue to be improved, with
configuration options and better CLI expected in `v0.17.1`. It will become
the *default* concretization mode in `v0.18.0`.
4. **Better error messages**
We have improved the error messages generated by the new concretizer by
using *unsatisfiable cores*. Spack will now print a summary of the types
of constraints that were violated to make a spec unsatisfiable (#26719).
5. **Conditional variants**
Variants can now have a `when="<spec>"` clause, allowing them to be
conditional based on the version or other attributes of a package (#24858).
6. **Git commit versions**
In an environment and on the command-line, you can now provide a full,
40-character git commit as a version for any package with a top-level
`git` URL. e.g., `spack install hdf5@45bb27f58240a8da7ebb4efc821a1a964d7712a8`.
Spack will compare the commit to tags in the git repository to understand
what versions it is ahead of or behind.
7. **Override local config and cache directories**
You can now set `SPACK_DISABLE_LOCAL_CONFIG` to disable the `~/.spack` and
`/etc/spack` configuration scopes. `SPACK_USER_CACHE_PATH` allows you to
move caches out of `~/.spack`, as well (#27022, #26735). This addresses
common problems where users could not isolate CI environments from local
configuration.
8. **Improvements to Spack Containerize**
For added reproducibility, you can now pin the Spack version used by
`spack containerize` (#21910). The container build will only build
with the Spack version pinned at build recipe creation instead of the
latest Spack version.
9. **New commands for dealing with tags**
The `spack tags` command allows you to list tags on packages (#26136), and you
can list tests and filter tags with `spack test list` (#26842).
## Other new features of note
* Copy and relocate environment views as stand-alone installations (#24832)
* `spack diff` command can diff two installed specs (#22283, #25169)
* `spack -c <config>` can set one-off config parameters on CLI (#22251)
* `spack load --list` is an alias for `spack find --loaded` (#27184)
* `spack gpg` can export private key with `--secret` (#22557)
* `spack style` automatically bootstraps dependencies (#24819)
* `spack style --fix` automatically invokes `isort` (#24071)
* build dependencies can be installed from build caches with `--include-build-deps` (#19955)
* `spack audit` command for checking package constraints (#23053)
* Spack can now fetch from `CVS` repositories (yep, really) (#23212)
* `spack monitor` lets you upload analysis about installations to a
[spack monitor server](https://github.com/spack/spack-monitor) (#23804, #24321,
#23777, #25928)
* `spack python --path` shows which `python` Spack is using (#22006)
* `spack env activate --temp` can create temporary environments (#25388)
* `--preferred` and `--latest` options for `spack checksum` (#25830)
* `cc` is now pure posix and runs on Alpine (#26259)
* `SPACK_PYTHON` environment variable sets which `python` spack uses (#21222)
* `SPACK_SKIP_MODULES` lets you source `setup-env.sh` faster if you don't need modules (#24545)
## Major internal refactors
* `spec.yaml` files are now `spec.json`, yielding a large speed improvement (#22845)
* Splicing allows Spack specs to store mixed build provenance (#20262)
* More extensive hooks API for installations (#21930)
* New internal API for getting the active environment (#25439)
## Performance Improvements
* Parallelize separate concretization in environments; a previously 55-minute E4S solve
now takes 2.5 minutes (#26264)
* Drastically improve YamlFilesystemView file removal performance via batching (#24355)
* Speed up spec comparison (#21618)
* Speed up environment activation (#25633)
## Archspec improvements
* support for new generic `x86_64_v2`, `x86_64_v3`, `x86_64_v4` targets
(see [archspec#31](https://github.com/archspec/archspec-json/pull/31))
* `spack arch --generic` lets you get the best generic architecture for
your node (#27061)
* added support for aocc (#20124), `arm` compiler on `graviton2` (#24904)
and on `a64fx` (#24524).
## Infrastructure, buildcaches, and services
* Add support for GCS Bucket Mirrors (#26382)
* Add `spackbot` to help package maintainers with notifications. See
[spack.github.io/spackbot](https://spack.github.io/spackbot/)
* Reproducible pipeline builds with `spack ci rebuild` (#22887)
* Removed redundant concretizations from GitLab pipeline generation (#26622)
* Spack CI no longer generates jobs for unbuilt specs (#20435)
* Every pull request pipeline has its own buildcache (#25529)
* `--no-add` installs only specified specs and only if already present in… (#22657)
* Add environment-aware `spack buildcache sync` command (#25470)
* Binary cache installation speedups and improvements (#19690, #20768)
## Deprecations and Removals
* `spack setup` was deprecated in v0.16.0, and has now been removed.
Use `spack develop` and `spack dev-build`.
* Remove unused `--dependencies` flag from `spack load` (#25731)
* Remove stubs for `spack module [refresh|find|rm|loads]`, all of which
were deprecated in 2018.
## Notable Bugfixes
* Deactivate previous env before activating new one (#25409)
* Many fixes to error codes from `spack install` (#21319, #27012, #25314)
* config add: infer type based on JSON schema validation errors (#27035)
* `spack config edit` now works even if `spack.yaml` is broken (#24689)
## Packages
* Allow non-empty version ranges like `1.1.0:1.1` (#26402)
* Remove `.99`'s from many version ranges (#26422)
* Python: use platform-specific site packages dir (#25998)
* `CachedCMakePackage` for using *.cmake initial config files (#19316)
* `lua-lang` allows swapping `lua` and `luajit` (#22492)
* Better support for `ld.gold` and `ld.lld` (#25626)
* build times are now stored as metadata in `$prefix/.spack` (#21179)
* post-install tests can be reused in smoke tests (#20298)
* Packages can use `pypi` attribute to infer `homepage`/`url`/`list_url` (#17587)
* Use gnuconfig package for `config.guess` file replacement (#26035)
* patches: make re-applied patches idempotent (#26784)
## Spack community stats
* 5969 total packages, 920 new since `v0.16.0`
* 358 new Python packages, 175 new R packages
* 513 people contributed to this release
* 490 committers to packages
* 105 committers to core
* Lots of GPU updates:
* ~77 CUDA-related commits
* ~66 AMD-related updates
* ~27 OneAPI-related commits
* 30 commits from AMD toolchain support
* `spack test` usage in packages is increasing
* 1669 packages with tests (mostly generic python tests)
* 93 packages with their own tests
# v0.16.3 (2021-09-21)
* clang/llvm: fix version detection (#19978)
* Fix use of quotes in Python build system (#22279)
* Cray: fix extracting paths from module files (#23472)
* Use AWS CloudFront for source mirror (#23978)
* Ensure all roots of an installed environment are marked explicit in db (#24277)
* Fix fetching for Python 3.8 and 3.9 (#24686)
* locks: only open lockfiles once instead of for every lock held (#24794)
* Remove the EOL centos:6 docker image
# v0.16.2 (2021-05-22)
* Major performance improvement for `spack load` and other commands. (#23661)

CITATION.cff Normal file
View File

@@ -0,0 +1,58 @@
# If you are referencing Spack in a publication, please cite the SC'15 paper
# described here.
#
# Here's the raw citation:
#
# Todd Gamblin, Matthew P. LeGendre, Michael R. Collette, Gregory L. Lee,
# Adam Moody, Bronis R. de Supinski, and W. Scott Futral.
# The Spack Package Manager: Bringing Order to HPC Software Chaos.
# In Supercomputing 2015 (SC15), Austin, Texas, November 15-20 2015. LLNL-CONF-669890.
#
# Or, in BibTeX:
#
# @inproceedings{Gamblin_The_Spack_Package_2015,
# address = {Austin, Texas, USA},
# author = {Gamblin, Todd and LeGendre, Matthew and
# Collette, Michael R. and Lee, Gregory L. and
# Moody, Adam and de Supinski, Bronis R. and Futral, Scott},
# doi = {10.1145/2807591.2807623},
# month = {November 15-20},
# note = {LLNL-CONF-669890},
# series = {Supercomputing 2015 (SC15)},
# title = {{The Spack Package Manager: Bringing Order to HPC Software Chaos}},
# url = {https://github.com/spack/spack},
# year = {2015}
# }
#
# And here's the CITATION.cff format:
#
cff-version: 1.2.0
message: "If you are referencing Spack in a publication, please cite the paper below."
preferred-citation:
type: conference-paper
doi: "10.1145/2807591.2807623"
url: "https://github.com/spack/spack"
authors:
- family-names: "Gamblin"
given-names: "Todd"
- family-names: "LeGendre"
given-names: "Matthew"
- family-names: "Collette"
given-names: "Michael R."
- family-names: "Lee"
given-names: "Gregory L."
- family-names: "Moody"
given-names: "Adam"
- family-names: "de Supinski"
given-names: "Bronis R."
- family-names: "Futral"
given-names: "Scott"
title: "The Spack Package Manager: Bringing Order to HPC Software Chaos"
conference:
name: "Supercomputing 2015 (SC15)"
city: "Austin"
region: "Texas"
country: "USA"
month: November 15-20
year: 2015
notes: LLNL-CONF-669890

View File

@@ -4,6 +4,7 @@
[![Bootstrapping](https://github.com/spack/spack/actions/workflows/bootstrap.yml/badge.svg)](https://github.com/spack/spack/actions/workflows/bootstrap.yml)
[![macOS Builds (nightly)](https://github.com/spack/spack/workflows/macOS%20builds%20nightly/badge.svg?branch=develop)](https://github.com/spack/spack/actions?query=workflow%3A%22macOS+builds+nightly%22)
[![codecov](https://codecov.io/gh/spack/spack/branch/develop/graph/badge.svg)](https://codecov.io/gh/spack/spack)
[![Containers](https://github.com/spack/spack/actions/workflows/build-containers.yml/badge.svg)](https://github.com/spack/spack/actions/workflows/build-containers.yml)
[![Read the Docs](https://readthedocs.org/projects/spack/badge/?version=latest)](https://spack.readthedocs.io)
[![Slack](https://slack.spack.io/badge.svg)](https://slack.spack.io)
@@ -26,7 +27,7 @@ for examples and highlights.
To install spack and your first package, make sure you have Python.
Then:
$ git clone https://github.com/spack/spack.git
$ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
$ cd spack/bin
$ ./spack install zlib
@@ -124,6 +125,9 @@ If you are referencing Spack in a publication, please cite the following paper:
[**The Spack Package Manager: Bringing Order to HPC Software Chaos**](https://www.computer.org/csdl/proceedings/sc/2015/3723/00/2807623.pdf).
In *Supercomputing 2015 (SC15)*, Austin, Texas, November 15-20 2015. LLNL-CONF-669890.
On GitHub, you can copy this citation in APA or BibTeX format via the "Cite this repository"
button. Or, see the comments in `CITATION.cff` for the raw BibTeX.
License
----------------

SECURITY.md Normal file
View File

@@ -0,0 +1,24 @@
# Security Policy
## Supported Versions
We provide security updates for the following releases.
For more on Spack's release structure, see
[`README.md`](https://github.com/spack/spack#releases).
| Version | Supported |
| ------- | ------------------ |
| develop | :white_check_mark: |
| 0.16.x | :white_check_mark: |
## Reporting a Vulnerability
To report a vulnerability or other security
issue, email maintainers@spack.io.
You can expect to hear back within two days.
If your security issue is accepted, we will do
our best to release a fix within a week. If
fixing the issue will take longer than this,
we will discuss timeline options with you.

View File

@@ -33,11 +33,11 @@ import sys
min_python3 = (3, 5)
if sys.version_info[:2] < (2, 6) or (
if sys.version_info[:2] < (2, 7) or (
sys.version_info[:2] >= (3, 0) and sys.version_info[:2] < min_python3
):
v_info = sys.version_info[:3]
msg = "Spack requires Python 2.6, 2.7 or %d.%d or higher " % min_python3
msg = "Spack requires Python 2.7 or %d.%d or higher " % min_python3
msg += "You are running spack with Python %d.%d.%d." % v_info
sys.exit(msg)
@@ -54,8 +54,6 @@ spack_external_libs = os.path.join(spack_lib_path, "external")
if sys.version_info[:2] <= (2, 7):
sys.path.insert(0, os.path.join(spack_external_libs, "py2"))
if sys.version_info[:2] == (2, 6):
sys.path.insert(0, os.path.join(spack_external_libs, "py26"))
sys.path.insert(0, spack_external_libs)

View File

@@ -4,7 +4,7 @@ bootstrap:
enable: true
# Root directory for bootstrapping work. The software bootstrapped
# by Spack is installed in a "store" subfolder of this root directory
root: ~/.spack/bootstrap
root: $user_cache_path/bootstrap
# Methods that can be used to bootstrap software. Each method may or
# may not be able to bootstrap all of the software that Spack needs,
# depending on its type.
@@ -29,4 +29,4 @@ bootstrap:
# By default we trust bootstrapping from sources and from binaries
# produced on Github via the workflow
github-actions: true
spack-install: true
spack-install: true

View File

@@ -42,8 +42,8 @@ config:
# (i.e., ``$TMP` or ``$TMPDIR``).
#
# Another option that prevents conflicts and potential permission issues is
# to specify `~/.spack/stage`, which ensures each user builds in their home
# directory.
# to specify `$user_cache_path/stage`, which ensures each user builds in their
# home directory.
#
# A more traditional path uses the value of `$spack/var/spack/stage`, which
# builds directly inside Spack's instance without staging them in a
@@ -60,13 +60,13 @@ config:
# identifies Spack staging to avoid accidentally wiping out non-Spack work.
build_stage:
- $tempdir/$user/spack-stage
- ~/.spack/stage
- $user_cache_path/stage
# - $spack/var/spack/stage
# Directory in which to run tests and store test results.
# Tests will be stored in directories named by date/time and package
# name/hash.
test_stage: ~/.spack/test
test_stage: $user_cache_path/test
# Cache directory for already downloaded source tarballs and archived
# repositories. This can be purged with `spack clean --downloads`.
@@ -75,7 +75,7 @@ config:
# Cache directory for miscellaneous files, like the package index.
# This can be purged with `spack clean --misc-cache`
misc_cache: ~/.spack/cache
misc_cache: $user_cache_path/cache
# Timeout in seconds used for downloading sources etc. This only applies
@@ -134,7 +134,7 @@ config:
# enabling locks.
locks: true
# The default url fetch method to use.
# The default url fetch method to use.
# If set to 'curl', Spack will require curl on the user's system
# If set to 'urllib', Spack will use python built-in libs to fetch
url_fetch_method: urllib
@@ -160,11 +160,10 @@ config:
# sufficiently for many specs.
#
# 'clingo': Uses a logic solver under the hood to solve DAGs with full
# backtracking and optimization for user preferences.
# backtracking and optimization for user preferences. Spack will
# try to bootstrap the logic solver, if not already available.
#
# 'clingo' currently requires the clingo ASP solver to be installed and
# built with python bindings. 'original' is built in.
concretizer: original
concretizer: clingo
# How long to wait to lock the Spack installation database. This lock is used
@@ -191,3 +190,8 @@ config:
# Set to 'false' to allow installation on filesystems that don't allow setgid bit
# manipulation by unprivileged users (e.g. AFS)
allow_sgid: true
# Whether to set the terminal title to display status information during
# building and installing packages. This gives information about Spack's
# current progress as well as the current and total number of packages.
terminal_title: false

View File

@@ -31,13 +31,13 @@ colorized output with a flag
.. code-block:: console
$ spack --color always | less -R
$ spack --color always find | less -R
or an environment variable
.. code-block:: console
$ SPACK_COLOR=always spack | less -R
$ SPACK_COLOR=always spack find | less -R
--------------------------
Listing available packages
@@ -188,6 +188,34 @@ configuration a **spec**. In the commands above, ``mpileaks`` and
``mpileaks@3.0.4`` are both valid *specs*. We'll talk more about how
you can use them to customize an installation in :ref:`sec-specs`.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Reusing installed dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.. warning::
The ``--reuse`` option described here is experimental, and it will
likely be replaced with a different option and configuration settings
in the next Spack release.
By default, when you run ``spack install``, Spack tries to build a new
version of the package you asked for, along with updated versions of
its dependencies. This gets you the latest versions and configurations,
but it can result in unwanted rebuilds if you update Spack frequently.
If you want Spack to try hard to reuse existing installations as dependencies,
you can add the ``--reuse`` option:
.. code-block:: console
$ spack install --reuse mpich
This will not do anything if ``mpich`` is already installed. If ``mpich``
is not installed, but dependencies like ``hwloc`` and ``libfabric`` are,
``mpich`` will be built with the installed versions, if possible.
You can use the :ref:`spack spec -I <cmd-spack-spec>` command to see what
will be reused and what will be built before you install.
.. _cmd-spack-uninstall:
^^^^^^^^^^^^^^^^^^^
@@ -868,8 +896,9 @@ your path:
These commands will add appropriate directories to your ``PATH``,
``MANPATH``, ``CPATH``, and ``LD_LIBRARY_PATH`` according to the
:ref:`prefix inspections <customize-env-modifications>` defined in your
modules configuration. When you no longer want to use a package, you
can type unload or unuse similarly:
modules configuration.
When you no longer want to use a package, you can type unload or
unuse similarly:
.. code-block:: console
@@ -910,6 +939,22 @@ first ``libelf`` above, you would run:
$ spack load /qmm4kso
To see which packages you have loaded into your environment, you would
use ``spack find --loaded``.
.. code-block:: console
$ spack find --loaded
==> 2 installed packages
-- linux-debian7 / gcc@4.4.7 ------------------------------------
libelf@0.8.13
-- linux-debian7 / intel@15.0.0 ---------------------------------
libelf@0.8.13
You can also use ``spack load --list`` to get the same output, but it
does not have the full set of query options that ``spack find`` offers.
We'll learn more about Spack's spec syntax in the next section.
@@ -1649,6 +1694,7 @@ and it will be added to the ``PYTHONPATH`` in your current shell:
Now ``import numpy`` will succeed for as long as you keep your current
session open.
The loaded packages can be checked using ``spack find --loaded``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Loading Extensions via Modules

View File

@@ -112,20 +112,44 @@ phase runs:
.. code-block:: console
$ libtoolize
$ aclocal
$ autoreconf --install --verbose --force
$ autoreconf --install --verbose --force -I <aclocal-prefix>/share/aclocal
All you need to do is add a few Autotools dependencies to the package.
Most stable releases will come with a ``configure`` script, but if you
check out a commit from the ``develop`` branch, you would want to add:
In case you need to add more arguments, override ``autoreconf_extra_args``
in your ``package.py`` on class scope like this:
.. code-block:: python
depends_on('autoconf', type='build', when='@develop')
depends_on('automake', type='build', when='@develop')
depends_on('libtool', type='build', when='@develop')
depends_on('m4', type='build', when='@develop')
autoreconf_extra_args = ["-Im4"]
All you need to do is add a few Autotools dependencies to the package.
Most stable releases will come with a ``configure`` script, but if you
check out a commit from the ``master`` branch, you would want to add:
.. code-block:: python
depends_on('autoconf', type='build', when='@master')
depends_on('automake', type='build', when='@master')
depends_on('libtool', type='build', when='@master')
It is typically redundant to list the ``m4`` macro processor package as a
dependency, since ``autoconf`` already depends on it.
"""""""""""""""""""""""""""""""
Using a custom autoreconf phase
"""""""""""""""""""""""""""""""
In some cases, it might be necessary to replace the default implementation
of the autoreconf phase with one running a script interpreter. In this
example, the ``bash`` shell is used to run the ``autogen.sh`` script.
.. code-block:: python
def autoreconf(self, spec, prefix):
which('bash')('autogen.sh')
"""""""""""""""""""""""""""""""""""""""
patching configure or Makefile.in files
"""""""""""""""""""""""""""""""""""""""
In some cases, developers might need to distribute a patch that modifies
one of the files used to generate ``configure`` or ``Makefile.in``.
@@ -135,6 +159,57 @@ create a new patch that directly modifies ``configure``. That way,
Spack can use the secondary patch and additional build system
dependencies aren't necessary.
""""""""""""""""""""""""""""
Old Autotools helper scripts
""""""""""""""""""""""""""""
Autotools based tarballs come with helper scripts such as ``config.sub`` and
``config.guess``. It is the responsibility of the developers to keep these files
up to date so that they run on every platform, but for very old software
releases this is impossible. In these cases Spack can help to replace these
files with newer ones, without having to add the heavy dependency on
``automake``.
Automatic helper script replacement is currently enabled by default on
``ppc64le`` and ``aarch64``, as these are the known cases where old scripts fail.
On these targets, ``AutotoolsPackage`` adds a build dependency on ``gnuconfig``,
which is a very light-weight package with newer versions of the helper files.
Spack then tries to run all the helper scripts it can find in the release, and
replaces them on failure with the helper scripts from ``gnuconfig``.
To opt out of this feature, use the following setting:
.. code-block:: python
patch_config_files = False
To enable it conditionally on different architectures, define a property and
make the package depend on ``gnuconfig`` as a build dependency:
.. code-block:: python
depends_on('gnuconfig', when='@1.0:')
@property
def patch_config_files(self):
return self.spec.satisfies("@1.0:")
.. note::
On some exotic architectures it is necessary to use system provided
``config.sub`` and ``config.guess`` files. In this case, the most
transparent solution is to mark the ``gnuconfig`` package as external and
non-buildable, with a prefix set to the directory containing the files:
.. code-block:: yaml
gnuconfig:
buildable: false
externals:
- spec: gnuconfig@master
prefix: /usr/share/configure_files/
""""""""""""""""
force_autoreconf
""""""""""""""""
@@ -324,8 +399,47 @@ options:
--with-libfabric=</path/to/libfabric>
"""""""""""""""""""""""
The ``variant`` keyword
"""""""""""""""""""""""
When Spack variants and configure flags do not correspond one-to-one, the
``variant`` keyword can be passed to ``with_or_without`` and
``enable_or_disable``. For example:
.. code-block:: python
variant('debug_tools', default=False)
config_args += self.enable_or_disable('debug-tools', variant='debug_tools')
Or when one variant controls multiple flags:
.. code-block:: python
variant('debug_tools', default=False)
config_args += self.with_or_without('memchecker', variant='debug_tools')
config_args += self.with_or_without('profiler', variant='debug_tools')
""""""""""""""""""""
activation overrides
Conditional variants
""""""""""""""""""""
When a variant is conditional and its condition is not met on the concrete spec, the
``with_or_without`` and ``enable_or_disable`` methods will simply return an empty list.
For example:
.. code-block:: python
variant('profiler', when='@2.0:')
config_args += self.with_or_without('profiler')
will neither add ``--with-profiler`` nor ``--without-profiler`` when the version is
below ``2.0``.
""""""""""""""""""""
Activation overrides
""""""""""""""""""""
Finally, the behavior of either ``with_or_without`` or

View File

@@ -145,6 +145,20 @@ and without the :meth:`~spack.build_systems.cmake.CMakePackage.define` and
return args
Spack supports CMake defines from conditional variants too. Whenever the condition on
the variant is not met, ``define_from_variant()`` will simply return an empty string,
and CMake will ignore the empty command-line argument. For example, the following
.. code-block:: python
variant('example', default=True, when='@2.0:')
def cmake_args(self):
return [self.define_from_variant('EXAMPLE', 'example')]
will generate ``'cmake' '-DEXAMPLE=ON' ...`` when ``@2.0: +example`` is met, but will
result in ``'cmake' '' ...`` when the spec version is below ``2.0``.
^^^^^^^^^^
Generators

View File

@@ -125,12 +125,15 @@ The zip file will not contain a ``setup.py``, but it will contain a
``METADATA`` file which contains all the information you need to
write a ``package.py`` build recipe.
.. _pypi:
^^^^
PyPI
^^^^
The vast majority of Python packages are hosted on PyPI - The Python
Package Index. ``pip`` only supports packages hosted on PyPI, making
The vast majority of Python packages are hosted on PyPI (The Python
Package Index), which is :ref:`preferred over GitHub <pypi-vs-github>`
for downloading packages. ``pip`` only supports packages hosted on PyPI, making
it the only option for developers who want a simple installation.
Search for "PyPI <package-name>" to find the download page. Note that
some pages are versioned, and the first result may not be the newest
@@ -217,6 +220,7 @@ try to extract the wheel:
version('1.11.0', sha256='d8c9d24ea90457214d798b0d922489863dad518adde3638e08ef62de28fb183a', expand=False)
.. _pypi-vs-github:
"""""""""""""""
PyPI vs. GitHub
@@ -263,6 +267,9 @@ location, but PyPI is preferred for the following reasons:
PyPI is nice because it makes it physically impossible to
re-release the same version of a package with a different checksum.
Use the :ref:`pypi attribute <pypi>` to facilitate construction of PyPI package
references.
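For reference, a minimal sketch of how this attribute might be used follows; the
package name and download path are placeholders, not an existing package:

.. code-block:: python

   class PyExample(PythonPackage):
       # "example/example-1.0.0.tar.gz" is a placeholder download path; from the
       # ``pypi`` attribute Spack derives the usual ``homepage`` and ``url``
       # attributes for PyPI-hosted sources.
       pypi = "example/example-1.0.0.tar.gz"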
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -336,7 +343,7 @@ This would be translated to:
.. code-block:: python
extends('python')
depends_on('python@3.5:3.999', type=('build', 'run'))
depends_on('python@3.5:3', type=('build', 'run'))
Many ``setup.py`` or ``setup.cfg`` files also contain information like::
@@ -568,7 +575,7 @@ check the ``METADATA`` file for lines like::
Lines that use ``Requires-Dist`` are similar to ``install_requires``.
Lines that use ``Provides-Extra`` are similar to ``extra_requires``,
and you can add a variant for those dependencies. The ``~=1.11.0``
syntax is equivalent to ``1.11.0:1.11.999``.
syntax is equivalent to ``1.11.0:1.11``.
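As a purely illustrative sketch (the package names and versions below are
placeholders), METADATA lines like these might map onto a recipe as follows:

.. code-block:: python

   # Hypothetical translation of:
   #   Requires-Dist: requests (~=1.11.0)
   #   Provides-Extra: docs
   #   Requires-Dist: sphinx ; extra == 'docs'
   variant('docs', default=False, description='Install documentation dependencies')

   depends_on('py-requests@1.11.0:1.11', type=('build', 'run'))
   depends_on('py-sphinx', type=('build', 'run'), when='+docs')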
""""""""""
setuptools
@@ -709,7 +716,7 @@ The package may have its own unit or regression tests. Spack can
run these tests during the installation by adding phase-appropriate
test methods.
For example, ``py-numpy`` adds the following as a check to run
For example, ``py-numpy`` adds the following as a check to run
after the ``install`` phase:
.. code-block:: python

View File

@@ -30,6 +30,7 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
sys.path.insert(0, os.path.abspath('_spack_root/lib/spack/external'))
sys.path.insert(0, os.path.abspath('_spack_root/lib/spack/external/pytest-fallback'))
if sys.version_info[0] < 3:
sys.path.insert(

View File

@@ -259,3 +259,16 @@ and ld.so will ONLY search for dependencies in the ``RUNPATH`` of
the loading object.
DO NOT MIX the two options within the same install tree.
----------------------
``terminal_title``
----------------------
By setting this option to ``true``, Spack will update the terminal's title to
provide information about its current progress as well as the current and
total package numbers.
To work properly, this requires your terminal to reset its title after
Spack has finished its work; otherwise, Spack's status information will
remain in the terminal's title indefinitely. Most terminals should already
be set up this way and clear Spack's status information.

View File

@@ -402,12 +402,15 @@ Spack-specific variables
Spack understands several special variables. These are:
* ``$env``: name of the currently active :ref:`environment <environments>`
* ``$spack``: path to the prefix of this Spack installation
* ``$tempdir``: default system temporary directory (as specified in
Python's `tempfile.tempdir
<https://docs.python.org/2/library/tempfile.html#tempfile.tempdir>`_
variable).
* ``$user``: name of the current user
* ``$user_cache_path``: user cache directory (``~/.spack`` unless
:ref:`overridden <local-config-overrides>`)
Note that, as with shell variables, you can write these as ``$varname``
or with braces to distinguish the variable from surrounding characters:
@@ -562,3 +565,39 @@ built in and are not overridden by a configuration file. The
command line. ``dirty`` and ``install_tree`` come from the custom
scopes ``./my-scope`` and ``./my-scope-2``, and all other configuration
options come from the default configuration files that ship with Spack.
.. _local-config-overrides:
------------------------------
Overriding Local Configuration
------------------------------
Spack's ``system`` and ``user`` scopes provide ways for administrators and users to set
global defaults for all Spack instances, but for use cases where one wants a clean Spack
installation, these scopes can be undesirable. For example, users may want to opt out of
global system configuration, or they may want to ignore their own home directory
settings when running in a continuous integration environment.
Spack also, by default, keeps various caches and user data in ``~/.spack``, but
users may want to override these locations.
Spack provides three environment variables that allow you to override or opt out of
configuration locations:
* ``SPACK_USER_CONFIG_PATH``: Override the path to use for the
``user`` scope (``~/.spack`` by default).
* ``SPACK_SYSTEM_CONFIG_PATH``: Override the path to use for the
``system`` scope (``/etc/spack`` by default).
* ``SPACK_DISABLE_LOCAL_CONFIG``: set this environment variable to completely disable
**both** the system and user configuration directories. Spack will only consider its
own defaults and ``site`` configuration locations.
And one that allows you to move the default cache location:
* ``SPACK_USER_CACHE_PATH``: Override the default path to use for user data
(misc_cache, tests, reports, etc.)
With these settings, if you want to isolate Spack in a CI environment, you can do this::
export SPACK_DISABLE_LOCAL_CONFIG=true
export SPACK_USER_CACHE_PATH=/tmp/spack

View File

@@ -126,9 +126,6 @@ are currently supported are summarized in the table below:
* - Ubuntu 18.04
- ``ubuntu:18.04``
- ``spack/ubuntu-bionic``
* - CentOS 6
- ``centos:6``
- ``spack/centos6``
* - CentOS 7
- ``centos:7``
- ``spack/centos7``
@@ -200,7 +197,7 @@ Setting Base Images
The ``images`` subsection is used to select both the image where
Spack builds the software and the image where the built software
is installed. This attribute can be set in two different ways and
is installed. This attribute can be set in different ways and
which one to use depends on the use case at hand.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -260,10 +257,54 @@ software is respectively built and installed:
ENTRYPOINT ["/bin/bash", "--rcfile", "/etc/profile", "-l"]
This method of selecting base images is the simplest of the two, and we advise
This is the simplest available method of selecting base images, and we advise
to use it whenever possible. There are cases though where using Spack official
images is not enough to fit production needs. In these situations users can manually
select which base image to start from in the recipe, as we'll see next.
images is not enough to fit production needs. In these situations users can
extend the recipe to start with the bootstrapping of Spack at a certain pinned
version or manually select which base image to start from in the recipe,
as we'll see next.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Use a Bootstrap Stage for Spack
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In some cases users may want to pin the commit sha that is used for Spack, to ensure later
reproducibility, or start from a fork of the official Spack repository to try a bugfix or
a feature in the early stage of development. This is possible by being just a little more
verbose when specifying information about Spack in the ``spack.yaml`` file:
.. code-block:: yaml
images:
os: amazonlinux:2
spack:
# URL of the Spack repository to be used in the container image
url: <to-use-a-fork>
# Either a commit sha, a branch name or a tag
ref: <sha/tag/branch>
# If true turn a branch name or a tag into the corresponding commit
# sha at the time of recipe generation
resolve_sha: <true/false>
``url`` specifies the URL from which to clone Spack and defaults to https://github.com/spack/spack.
The ``ref`` attribute can be either a commit sha, a branch name or a tag. The default value in
this case is to use the ``develop`` branch, but it may change in the future to point to the latest stable
release. Finally, ``resolve_sha`` transforms branch names or tags into the corresponding commit
shas at the time of recipe generation, to allow for greater reproducibility of the results
at a later time.
The list of operating systems that can be used to bootstrap Spack can be
obtained with:
.. command-output:: spack containerize --list-os
.. note::
The ``resolve_sha`` option uses ``git rev-parse`` under the hood and thus requires
checking out the corresponding Spack repository in a temporary folder before generating
the recipe. Recipe generation may take longer when this option is set to true because
of this additional step.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Use Custom Images Provided by Users
@@ -415,6 +456,18 @@ to customize the generation of container recipes:
- Version of Spack used in the ``build`` stage
- Valid tags for ``base:image``
- Yes, if using constrained selection of base images
* - ``images:spack:url``
- Repository from which Spack is cloned
- Any fork of Spack
- No
* - ``images:spack:ref``
- Reference for the checkout of Spack
- Either a commit sha, a branch name or a tag
- No
* - ``images:spack:resolve_sha``
- Resolve branches and tags in ``spack.yaml`` to commits in the generated recipe
- True or False (default: False)
- No
* - ``images:build``
- Image to be used in the ``build`` stage
- Any valid container image

View File

@@ -71,7 +71,7 @@ locally to speed up the review process.
new release that is causing problems. If this is the case, please file an issue.
We currently test against Python 2.6, 2.7, and 3.5-3.7 on both macOS and Linux and
We currently test against Python 2.7 and 3.5-3.9 on both macOS and Linux and
perform 3 types of tests:
.. _cmd-spack-unit-test:
@@ -338,15 +338,6 @@ Once all of the dependencies are installed, you can try building the documentati
If you see any warning or error messages, you will have to correct those before
your PR is accepted.
.. note::
There is also a ``run-doc-tests`` script in ``share/spack/qa``. The only
difference between running this script and running ``make`` by hand is that
the script will exit immediately if it encounters an error or warning. This
is necessary for CI. If you made a lot of documentation changes, it is
much quicker to run ``make`` by hand so that you can see all of the warnings
at once.
If you are editing the documentation, you should obviously be running the
documentation tests. But even if you are simply adding a new package, your
changes could cause the documentation tests to fail:

View File

@@ -210,15 +210,6 @@ Spec-related modules
but compilers aren't fully integrated with the build process
yet.
:mod:`spack.architecture`
:func:`architecture.default_arch <spack.architecture.default_arch>` is used
to determine the host architecture while building.
.. warning::
Not yet implemented. Should eventually have architecture
descriptions for cross-compiling.
^^^^^^^^^^^^^^^^^
Build environment
^^^^^^^^^^^^^^^^^
@@ -1186,6 +1177,10 @@ completed, the steps to make the major release are:
If CI is not passing, submit pull requests to ``develop`` as normal
and keep rebasing the release branch on ``develop`` until CI passes.
#. Make sure the entire documentation is up to date. If the documentation
is outdated, submit pull requests to ``develop`` as normal
and keep rebasing the release branch on ``develop``.
#. Follow the steps in :ref:`publishing-releases`.
#. Follow the steps in :ref:`merging-releases`.

View File

@@ -35,7 +35,7 @@ Getting Spack is easy. You can clone it from the `github repository
.. code-block:: console
$ git clone https://github.com/spack/spack.git
$ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
This will create a directory called ``spack``.
@@ -88,74 +88,71 @@ the environment.
Bootstrapping clingo
^^^^^^^^^^^^^^^^^^^^
Spack supports using ``clingo`` as an external solver to compute which software
needs to be installed. The default configuration allows Spack to install
``clingo`` from a public buildcache, created by a Github Action workflow. In this
case the bootstrapping procedure is transparent to the user, except for a
slightly long waiting time on the first concretization of a spec:
Spack uses ``clingo`` under the hood to resolve optimal versions and variants of
dependencies when installing a package. Since ``clingo`` itself is a binary,
Spack has to install it on initial use, which is called bootstrapping.
Spack provides two ways of bootstrapping ``clingo``: from pre-built binaries
(default), or from sources. The fastest way to get started is to bootstrap from
pre-built binaries.
.. note::
When bootstrapping from pre-built binaries, Spack currently requires
``patchelf`` on Linux and ``otool`` on macOS. If ``patchelf`` is not in the
``PATH``, Spack will build it from sources, and a C++ compiler is required.
The first time you concretize a spec, Spack will bootstrap in the background:
.. code-block:: console
$ spack find -b
==> Showing internal bootstrap store at "/home/spack/.spack/bootstrap/store"
==> 0 installed packages
$ time spack spec zlib
Input spec
--------------------------------
zlib
$ time spack solve zlib
==> Best of 2 considered solutions.
==> Optimization Criteria:
Priority Criterion Value
1 deprecated versions used 0
2 version weight 0
3 number of non-default variants (roots) 0
4 multi-valued variants 0
5 preferred providers for roots 0
6 number of non-default variants (non-roots) 0
7 preferred providers (non-roots) 0
8 compiler mismatches 0
9 version badness 0
10 count of non-root multi-valued variants 0
11 non-preferred compilers 0
12 target mismatches 0
13 non-preferred targets 0
Concretized
--------------------------------
zlib@1.2.11%gcc@7.5.0+optimize+pic+shared arch=linux-ubuntu18.04-zen
zlib@1.2.11%gcc@11.1.0+optimize+pic+shared arch=linux-ubuntu18.04-broadwell
real 0m30,618s
user 0m27,278s
sys 0m1,549s
real 0m20.023s
user 0m18.351s
sys 0m0.784s
After this command you'll see that ``clingo`` has been installed for Spack's own use:
.. code-block:: console
$ spack find -b
==> Showing internal bootstrap store at "/home/spack/.spack/bootstrap/store"
==> 2 installed packages
==> Showing internal bootstrap store at "/root/.spack/bootstrap/store"
==> 3 installed packages
-- linux-rhel5-x86_64 / gcc@9.3.0 -------------------------------
clingo-bootstrap@spack python@3.6
-- linux-ubuntu18.04-zen / gcc@7.5.0 ----------------------------
patchelf@0.13
Subsequent calls to the concretizer will then be much faster:
.. code-block:: console
$ time spack solve zlib
$ time spack spec zlib
[ ... ]
real 0m1,222s
user 0m1,146s
sys 0m0,059s
real 0m0.490s
user 0m0.431s
sys 0m0.041s
If for security or for other reasons you don't want to or can't install precompiled
binaries, Spack can fall-back to bootstrap ``clingo`` from source files. To forbid
Spack from retrieving binaries from the bootstrapping buildcache, the following
command must be given:
If for security concerns you cannot bootstrap ``clingo`` from pre-built
binaries, you have to mark this bootstrapping method as untrusted. This makes
Spack fall back to bootstrapping from sources:
.. code-block:: console
$ spack bootstrap untrust github-actions
==> "github-actions" is now untrusted and will not be used for bootstrapping
since an "untrusted" way of bootstrapping software will not be considered
by Spack. You can verify the new settings are effective with:
You can verify that the new settings are effective with:
.. code-block:: console
@@ -181,33 +178,25 @@ by Spack. You can verify the new settings are effective with:
Description:
Specs built from sources by Spack. May take a long time.
When bootstrapping from sources, Spack requires a compiler with support
for C++14 (GCC on ``linux``, Apple Clang on ``darwin``) and static C++
standard libraries on ``linux``. Spack will build the required software
on the first request to concretize a spec:
.. note::
When bootstrapping from sources, Spack requires a full install of Python
including header files (e.g. ``python3-dev`` on Debian), and a compiler
with support for C++14 (GCC on Linux, Apple Clang on macOS) and static C++
standard libraries on Linux.
Spack will build the required software on the first request to concretize a spec:
.. code-block:: console
$ spack solve zlib
$ spack spec zlib
[+] /usr (external bison-3.0.4-wu5pgjchxzemk5ya2l3ddqug2d7jv6eb)
[+] /usr (external cmake-3.19.4-a4kmcfzxxy45mzku4ipmj5kdiiz5a57b)
[+] /usr (external python-3.6.9-x4fou4iqqlh5ydwddx3pvfcwznfrqztv)
==> Installing re2c-1.2.1-e3x6nxtk3ahgd63ykgy44mpuva6jhtdt
[ ... ]
==> Optimization: [0, 0, 0, 0, 0, 1, 0, 0, 0]
zlib@1.2.11%gcc@10.1.0+optimize+pic+shared arch=linux-ubuntu18.04-broadwell
.. tip::
If you want to speed up bootstrapping ``clingo`` from sources, you may try to
search for ``cmake`` and ``bison`` on your system:
.. code-block:: console
$ spack external find cmake bison
==> The following specs have been detected on this system and added to /home/spack/.spack/packages.yaml
bison@3.0.4 cmake@3.19.4
"""""""""""""""""""
The Bootstrap Store
"""""""""""""""""""

View File

@@ -39,7 +39,7 @@ package:
.. code-block:: console
$ git clone https://github.com/spack/spack.git
$ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
$ cd spack/bin
$ ./spack install libelf

View File

@@ -213,6 +213,18 @@ location). The set ``my_custom_lmod_modules`` will install its lmod
modules to ``/path/to/install/custom/lmod/modules`` (and still install
its tcl modules, if any, to the default location).
By default, an architecture-specific directory is added to the root
directory. A module set may override that behavior by setting the
``arch_folder`` config value to ``False``.
.. code-block:: yaml
modules:
default:
roots:
tcl: /path/to/install/tcl/modules
arch_folder: false
Obviously, having multiple module sets install modules to the default
location could be confusing to users of your modules. In the next
section, we will discuss enabling and disabling module types (module
@@ -261,29 +273,30 @@ of the installed software. For instance, in the snippet below:
.. code-block:: yaml
modules:
tcl:
# The keyword `all` selects every package
all:
environment:
set:
BAR: 'bar'
# This anonymous spec selects any package that
# depends on openmpi. The double colon at the
# end clears the set of rules that matched so far.
^openmpi::
environment:
set:
BAR: 'baz'
# Selects any zlib package
zlib:
environment:
prepend_path:
LD_LIBRARY_PATH: 'foo'
# Selects zlib compiled with gcc@4.8
zlib%gcc@4.8:
environment:
unset:
- FOOBAR
default:
tcl:
# The keyword `all` selects every package
all:
environment:
set:
BAR: 'bar'
# This anonymous spec selects any package that
# depends on openmpi. The double colon at the
# end clears the set of rules that matched so far.
^openmpi::
environment:
set:
BAR: 'baz'
# Selects any zlib package
zlib:
environment:
prepend_path:
LD_LIBRARY_PATH: 'foo'
# Selects zlib compiled with gcc@4.8
zlib%gcc@4.8:
environment:
unset:
- FOOBAR
you are instructing Spack to set the environment variable ``BAR=bar`` for every module,
unless the associated spec satisfies ``^openmpi`` in which case ``BAR=baz``.
@@ -310,9 +323,10 @@ your system. If you write a configuration file like:
.. code-block:: yaml
modules:
tcl:
whitelist: ['gcc', 'llvm'] # Whitelist will have precedence over blacklist
blacklist: ['%gcc@4.4.7'] # Assuming gcc@4.4.7 is the system compiler
default:
tcl:
whitelist: ['gcc', 'llvm'] # Whitelist will have precedence over blacklist
blacklist: ['%gcc@4.4.7'] # Assuming gcc@4.4.7 is the system compiler
you will prevent the generation of module files for any package that
is compiled with ``gcc@4.4.7``, with the only exception of any ``gcc``
@@ -337,8 +351,9 @@ shows how to set hash length in the module file names:
.. code-block:: yaml
modules:
tcl:
hash_length: 7
default:
tcl:
hash_length: 7
To help make module names more readable, and to help alleviate name conflicts
with a short hash, one can use the ``suffixes`` option in the modules
@@ -348,11 +363,12 @@ For instance, the following config options,
.. code-block:: yaml
modules:
tcl:
all:
suffixes:
^python@2.7.12: 'python-2.7.12'
^openblas: 'openblas'
default:
tcl:
all:
suffixes:
^python@2.7.12: 'python-2.7.12'
^openblas: 'openblas'
will add a ``python-2.7.12`` version string to any packages compiled with
python matching the spec, ``python@2.7.12``. This is useful to know which
@@ -367,10 +383,11 @@ covered in :ref:`adding_projections_to_views`.
.. code-block:: yaml
modules:
tcl:
projections:
all: '{name}/{version}-{compiler.name}-{compiler.version}-module'
^mpi: '{name}/{version}-{^mpi.name}-{^mpi.version}-{compiler.name}-{compiler.version}-module'
default:
tcl:
projections:
all: '{name}/{version}-{compiler.name}-{compiler.version}-module'
^mpi: '{name}/{version}-{^mpi.name}-{^mpi.version}-{compiler.name}-{compiler.version}-module'
will create module files that are nested in directories by package
name, contain the version and compiler name and version, and have the
@@ -391,15 +408,16 @@ that are already in the LMod hierarchy.
.. code-block:: yaml
modules:
enable:
- tcl
tcl:
projections:
all: '{name}/{version}-{compiler.name}-{compiler.version}'
all:
conflict:
- '{name}'
- 'intel/14.0.1'
default:
enable:
- tcl
tcl:
projections:
all: '{name}/{version}-{compiler.name}-{compiler.version}'
all:
conflict:
- '{name}'
- 'intel/14.0.1'
will create module files that will conflict with ``intel/14.0.1`` and with the
base directory of the same module, effectively preventing the possibility to
@@ -419,16 +437,17 @@ that are already in the LMod hierarchy.
.. code-block:: yaml
modules:
enable:
- lmod
lmod:
core_compilers:
- 'gcc@4.8'
core_specs:
- 'python'
hierarchy:
- 'mpi'
- 'lapack'
default:
enable:
- lmod
lmod:
core_compilers:
- 'gcc@4.8'
core_specs:
- 'python'
hierarchy:
- 'mpi'
- 'lapack'
that will generate a hierarchy in which the ``lapack`` and ``mpi`` layer can be switched
independently. This allows a site to build the same libraries or applications against different
@@ -449,6 +468,36 @@ that are already in the LMod hierarchy.
For hierarchies that are deeper than three layers ``lmod spider`` may have some issues.
See `this discussion on the LMod project <https://github.com/TACC/Lmod/issues/114>`_.
""""""""""""""""""""""
Select default modules
""""""""""""""""""""""
By default, when multiple modules of the same name share a directory,
the highest version number will be the default module. This behavior
of the ``module`` command can be overridden with a symlink named
``default`` to the desired default module. If you wish to configure
default modules with Spack, add a ``defaults`` key to your modules
configuration:
.. code-block:: yaml
modules:
my-module-set:
tcl:
defaults:
- gcc@10.2.1
- hdf5@1.2.10+mpi+hl%gcc
These defaults may be arbitrarily specific. For any package that
satisfies a default, Spack will generate the module file in the
appropriate path, and will generate a default symlink to the module
file as well.
.. warning::
If Spack is configured to generate multiple default packages in the
same directory, the last modulefile to be generated will be the
default module.
.. _customize-env-modifications:
"""""""""""""""""""""""""""""""""""
@@ -549,11 +598,12 @@ do so by using the environment blacklist:
.. code-block:: yaml
modules:
tcl:
all:
filter:
# Exclude changes to any of these variables
environment_blacklist: ['CPATH', 'LIBRARY_PATH']
default:
tcl:
all:
filter:
# Exclude changes to any of these variables
environment_blacklist: ['CPATH', 'LIBRARY_PATH']
The configuration above will generate module files that will not contain
modifications to either ``CPATH`` or ``LIBRARY_PATH``.
@@ -572,9 +622,10 @@ activated using ``spack activate``:
.. code-block:: yaml
modules:
tcl:
^python:
autoload: 'direct'
default:
tcl:
^python:
autoload: 'direct'
The configuration file above will produce module files that will
load their direct dependencies if the package installed depends on ``python``.
@@ -591,9 +642,10 @@ The allowed values for the ``autoload`` statement are either ``none``,
.. code-block:: yaml
modules:
lmod:
all:
autoload: 'direct'
default:
lmod:
all:
autoload: 'direct'
.. note::
TCL prerequisites

View File

@@ -695,20 +695,23 @@ example, ``py-sphinx-rtd-theme@0.1.10a0``. In this case, numbers are
always considered to be "newer" than letters. This is for consistency
with `RPM <https://bugzilla.redhat.com/show_bug.cgi?id=50977>`_.
Spack versions may also be arbitrary non-numeric strings; any string
here will suffice; for example, ``@develop``, ``@master``, ``@local``.
Versions are compared as follows. First, a version string is split into
multiple fields based on delimiters such as ``.``, ``-`` etc. Then
matching fields are compared using the rules below:
Spack versions may also be arbitrary non-numeric strings, for example
``@develop``, ``@master``, ``@local``.
#. The following develop-like strings are greater (newer) than all
numbers and are ordered as ``develop > main > master > head > trunk``.
The order on versions is defined as follows. A version string is split
into a list of components based on delimiters such as ``.``, ``-`` etc.
Lists are then ordered lexicographically, where components are ordered
as follows:
#. Numbers are all less than the chosen develop-like strings above,
and are sorted numerically.
#. The following special strings are considered larger than any other
numeric or non-numeric version component, and satisfy the following
order between themselves: ``develop > main > master > head > trunk``.
#. All other non-numeric versions are less than numeric versions, and
are sorted alphabetically.
#. Numbers are ordered numerically, are less than special strings, and
larger than other non-numeric components.
#. All other non-numeric components are less than numeric components,
and are ordered alphabetically.
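As an illustration of these rules, assuming a Spack checkout where
``spack.version.Version`` is importable, comparisons along the following lines
should hold (the version strings themselves are arbitrary examples):

.. code-block:: python

   from spack.version import Version

   # Special strings sort above all numeric versions ...
   assert Version('develop') > Version('999.9')
   # ... and among themselves as develop > main > master > head > trunk.
   assert Version('main') > Version('master')
   # Numeric components beat other non-numeric components ...
   assert Version('1.2.1') > Version('1.2-beta')
   # ... which are ordered alphabetically among themselves.
   assert Version('1.2-beta') > Version('1.2-alpha')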
The logic behind this sort order is two-fold:
@@ -729,7 +732,7 @@ Version selection
When concretizing, many versions might match a user-supplied spec.
For example, the spec ``python`` matches all available versions of the
package ``python``. Similarly, ``python@3:`` matches all versions of
Python3. Given a set of versions that match a spec, Spack
Python 3 and above. Given a set of versions that match a spec, Spack
concretization uses the following priorities to decide which one to
use:
@@ -1419,6 +1422,60 @@ other similar operations:
).with_default('auto').with_non_feature_values('auto'),
)
^^^^^^^^^^^^^^^^^^^^
Conditional Variants
^^^^^^^^^^^^^^^^^^^^
The variant directive accepts a ``when`` clause. The variant will only
be present on specs that otherwise satisfy the spec listed as the
``when`` clause. For example, the following class has a variant
``bar`` when it is at version 2.0 or higher.
.. code-block:: python
class Foo(Package):
...
variant('bar', default=False, when='@2.0:', description='help message')
The ``when`` clause follows the same syntax and accepts the same
values as the ``when`` argument of
:py:func:`spack.directives.depends_on`
^^^^^^^^^^^^^^^^^^^
Overriding Variants
^^^^^^^^^^^^^^^^^^^
Packages may override variants for several reasons, most often to
change the default from a variant defined in a parent class or to
change the conditions under which a variant is present on the spec.
When a variant is defined multiple times, whether in the same package
file or in a subclass and a superclass, the last definition is used
for all attributes **except** for the ``when`` clauses. The ``when``
clauses are accumulated through all invocations, and the variant is
present on the spec if any of the accumulated conditions are
satisfied.
For example, consider the following package:
.. code-block:: python
class Foo(Package):
...
variant('bar', default=False, when='@1.0', description='help1')
variant('bar', default=True, when='platform=darwin', description='help2')
...
This package ``foo`` has a variant ``bar`` when the spec satisfies
either ``@1.0`` or ``platform=darwin``, but not for other platforms at
other versions. The default for this variant, when it is present, is
always ``True``, regardless of which condition of the variant is
satisfied. This allows packages to override variants in packages or
build system classes from which they inherit, by modifying the variant
values without modifying the ``when`` clause. It also allows a package
to implement ``or`` semantics for a variant ``when`` clause by
duplicating the variant definition.
------------------------------------
Resources (expanding extra tarballs)
------------------------------------
@@ -2063,7 +2120,7 @@ Version ranges
^^^^^^^^^^^^^^
Although some packages require a specific version for their dependencies,
most can be built with a range of version. For example, if you are
most can be built with a range of versions. For example, if you are
writing a package for a legacy Python module that only works with Python
2.4 through 2.6, this would look like:
@@ -2072,9 +2129,9 @@ writing a package for a legacy Python module that only works with Python
depends_on('python@2.4:2.6')
Version ranges in Spack are *inclusive*, so ``2.4:2.6`` means any version
greater than or equal to ``2.4`` and up to and including ``2.6``. If you
want to specify that a package works with any version of Python 3, this
would look like:
greater than or equal to ``2.4`` and up to and including any ``2.6.x``. If
you want to specify that a package works with any version of Python 3 (or
higher), this would look like:
.. code-block:: python
@@ -2085,29 +2142,30 @@ requires Python 2, you can similarly leave out the lower bound:
.. code-block:: python
depends_on('python@:2.9')
depends_on('python@:2')
Notice that we didn't use ``@:3``. Version ranges are *inclusive*, so
``@:3`` means "up to and including 3".
``@:3`` means "up to and including any 3.x version".
What if a package can only be built with Python 2.6? You might be
What if a package can only be built with Python 2.7? You might be
inclined to use:
.. code-block:: python
depends_on('python@2.6')
depends_on('python@2.7')
However, this would be wrong. Spack assumes that all version constraints
are absolute, so it would try to install Python at exactly ``2.6``. The
correct way to specify this would be:
are exact, so it would try to install Python not at ``2.7.18``, but
exactly at ``2.7``, which is a non-existent version. The correct way to
specify this would be:
.. code-block:: python
depends_on('python@2.6.0:2.6.999')
depends_on('python@2.7.0:2.7')
A spec can contain multiple version ranges separated by commas.
For example, if you need Boost 1.59.0 or newer, but there are known
issues with 1.64.0, 1.65.0, and 1.66.0, you can say:
A spec can contain a version list of ranges and individual versions
separated by commas. For example, if you need Boost 1.59.0 or newer,
but there are known issues with 1.64.0, 1.65.0, and 1.66.0, you can say:
.. code-block:: python
@@ -2824,7 +2882,7 @@ is equivalent to:
depends_on('elpa+openmp', when='+openmp+elpa')
Constraints from nested context managers are also added together, but they are rarely
Constraints from nested context managers are also combined together, but they are rarely
needed or recommended.
.. _install-method:

View File

@@ -48,9 +48,9 @@ or Amazon Elastic Kubernetes Service (`EKS <https://aws.amazon.com/eks>`_), thou
topics are outside the scope of this document.
Spack's pipelines are now making use of the
`trigger <https://docs.gitlab.com/12.9/ee/ci/yaml/README.html#trigger>`_ syntax to run
`trigger <https://docs.gitlab.com/ee/ci/yaml/#trigger>`_ syntax to run
dynamically generated
`child pipelines <https://docs.gitlab.com/12.9/ee/ci/parent_child_pipelines.html>`_.
`child pipelines <https://docs.gitlab.com/ee/ci/pipelines/parent_child_pipelines.html>`_.
Note that the use of dynamic child pipelines requires running Gitlab version
``>= 12.9``.

View File

@@ -335,7 +335,7 @@ merged YAML from all configuration files, use ``spack config get repos``:
- ~/myrepo
- $spack/var/spack/repos/builtin
mNote that, unlike ``spack repo list``, this does not include the
Note that, unlike ``spack repo list``, this does not include the
namespace, which is read from each repo's ``repo.yaml``.
^^^^^^^^^^^^^^^^^^^^^

View File

@@ -5,3 +5,6 @@ sphinx>=3.4,!=4.1.2
sphinxcontrib-programoutput
sphinx-rtd-theme
python-levenshtein
# Restrict to docutils <0.17 to work around a list rendering issue in sphinx.
# https://stackoverflow.com/questions/67542699
docutils <0.17

View File

@@ -17,6 +17,7 @@ spack:
# Sphinx
- "py-sphinx@3.4:4.1.1,4.1.3:"
- py-sphinxcontrib-programoutput
- py-docutils@:0.16
- py-sphinx-rtd-theme
# VCS
- git

View File

@@ -1,5 +1,5 @@
Name, Supported Versions, Notes, Requirement Reason
Python, 2.6/2.7/3.5-3.9, , Interpreter for Spack
Python, 2.7/3.5-3.9, , Interpreter for Spack
C/C++ Compilers, , , Building software
make, , , Build software
patch, , , Build software
@@ -15,3 +15,4 @@ gnupg2, , , Sign/Verify Buildcaches
git, , , Manage Software Repositories
svn, , Optional, Manage Software Repositories
hg, , Optional, Manage Software Repositories
Python header files, , Optional (e.g. ``python3-dev`` on Debian), Bootstrapping from sources

576
lib/spack/env/cc vendored
View File

@@ -1,4 +1,5 @@
#!/bin/bash
#!/bin/sh
# shellcheck disable=SC2034 # evals in this script fool shellcheck
#
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
@@ -20,25 +21,33 @@
# -Wl,-rpath arguments for dependency /lib directories.
#
# Reset IFS to the default: whitespace-separated lists. When we use
# other separators, we set and reset it.
unset IFS
# Separator for lists whose names end with `_list`.
# We pick the alarm bell character, which is highly unlikely to
# conflict with anything. This is a literal bell character (which
# we have to use since POSIX sh does not convert escape sequences
# like '\a' outside of the format argument of `printf`).
# NOTE: Depending on your editor this may look empty, but it is not.
readonly lsep=''
# This is an array of environment variables that need to be set before
# the script runs. They are set by routines in spack.build_environment
# as part of the package installation process.
parameters=(
SPACK_ENV_PATH
SPACK_DEBUG_LOG_DIR
SPACK_DEBUG_LOG_ID
SPACK_COMPILER_SPEC
SPACK_CC_RPATH_ARG
SPACK_CXX_RPATH_ARG
SPACK_F77_RPATH_ARG
SPACK_FC_RPATH_ARG
SPACK_TARGET_ARGS
SPACK_DTAGS_TO_ADD
SPACK_DTAGS_TO_STRIP
SPACK_LINKER_ARG
SPACK_SHORT_SPEC
SPACK_SYSTEM_DIRS
)
readonly params="\
SPACK_ENV_PATH
SPACK_DEBUG_LOG_DIR
SPACK_DEBUG_LOG_ID
SPACK_COMPILER_SPEC
SPACK_CC_RPATH_ARG
SPACK_CXX_RPATH_ARG
SPACK_F77_RPATH_ARG
SPACK_FC_RPATH_ARG
SPACK_LINKER_ARG
SPACK_SHORT_SPEC
SPACK_SYSTEM_DIRS"
# Optional parameters that aren't required to be set
@@ -58,60 +67,157 @@ parameters=(
# Test command is used to unit test the compiler script.
# SPACK_TEST_COMMAND
# die()
# Prints a message and exits with error 1.
function die {
echo "$@"
# die MESSAGE
# Print a message and exit with error code 1.
die() {
echo "[spack cc] ERROR: $*"
exit 1
}
# read input parameters into proper bash arrays.
# SYSTEM_DIRS is delimited by :
IFS=':' read -ra SPACK_SYSTEM_DIRS <<< "${SPACK_SYSTEM_DIRS}"
# empty VARNAME
# Return whether the variable VARNAME is unset or set to the empty string.
empty() {
eval "test -z \"\${$1}\""
}
# SPACK_<LANG>FLAGS and SPACK_LDLIBS are split by ' '
IFS=' ' read -ra SPACK_FFLAGS <<< "$SPACK_FFLAGS"
IFS=' ' read -ra SPACK_CPPFLAGS <<< "$SPACK_CPPFLAGS"
IFS=' ' read -ra SPACK_CFLAGS <<< "$SPACK_CFLAGS"
IFS=' ' read -ra SPACK_CXXFLAGS <<< "$SPACK_CXXFLAGS"
IFS=' ' read -ra SPACK_LDFLAGS <<< "$SPACK_LDFLAGS"
IFS=' ' read -ra SPACK_LDLIBS <<< "$SPACK_LDLIBS"
# setsep LISTNAME
# Set the global variable 'sep' to the separator for a list with name LISTNAME.
# There are three types of lists:
# 1. regular lists end with _list and are separated by $lsep
# 2. directory lists end with _dirs/_DIRS/PATH(S) and are separated by ':'
# 3. any other list is assumed to be separated by spaces: " "
setsep() {
case "$1" in
*_dirs|*_DIRS|*PATH|*PATHS)
sep=':'
;;
*_list)
sep="$lsep"
;;
*)
sep=" "
;;
esac
}
# prepend LISTNAME ELEMENT [SEP]
#
# Prepend ELEMENT to the list stored in the variable LISTNAME,
# assuming the list is separated by SEP.
# Handles empty lists and single-element lists.
prepend() {
varname="$1"
elt="$2"
if empty "$varname"; then
eval "$varname=\"\${elt}\""
else
# Get the appropriate separator for the list we're appending to.
setsep "$varname"
eval "$varname=\"\${elt}${sep}\${$varname}\""
fi
}
# append LISTNAME ELEMENT [SEP]
#
# Append ELEMENT to the list stored in the variable LISTNAME,
# assuming the list is separated by SEP.
# Handles empty lists and single-element lists.
append() {
varname="$1"
elt="$2"
if empty "$varname"; then
eval "$varname=\"\${elt}\""
else
# Get the appropriate separator for the list we're appending to.
setsep "$varname"
eval "$varname=\"\${$varname}${sep}\${elt}\""
fi
}
# extend LISTNAME1 LISTNAME2 [PREFIX]
#
# Append the elements stored in the variable LISTNAME2
# to the list stored in LISTNAME1.
# If PREFIX is provided, prepend it to each element.
extend() {
# Figure out the appropriate IFS for the list we're reading.
setsep "$2"
if [ "$sep" != " " ]; then
IFS="$sep"
fi
eval "for elt in \${$2}; do append $1 \"$3\${elt}\"; done"
unset IFS
}
# preextend LISTNAME1 LISTNAME2 [PREFIX]
#
# Prepend the elements stored in the list at LISTNAME2
# to the list at LISTNAME1, preserving order.
# If PREFIX is provided, prepend it to each element.
preextend() {
# Figure out the appropriate IFS for the list we're reading.
setsep "$2"
if [ "$sep" != " " ]; then
IFS="$sep"
fi
# first, reverse the list to prepend
_reversed_list=""
eval "for elt in \${$2}; do prepend _reversed_list \"$3\${elt}\"; done"
# prepend reversed list to preextend in order
IFS="${lsep}"
for elt in $_reversed_list; do prepend "$1" "$3${elt}"; done
unset IFS
}
# system_dir PATH
# test whether a path is a system directory
function system_dir {
system_dir() {
IFS=':' # SPACK_SYSTEM_DIRS is colon-separated
path="$1"
for sd in "${SPACK_SYSTEM_DIRS[@]}"; do
if [ "${path}" == "${sd}" ] || [ "${path}" == "${sd}/" ]; then
for sd in $SPACK_SYSTEM_DIRS; do
if [ "${path}" = "${sd}" ] || [ "${path}" = "${sd}/" ]; then
# success if path starts with a system prefix
unset IFS
return 0
fi
done
unset IFS
return 1 # fail if the path does not match any system prefix
}
for param in "${parameters[@]}"; do
if [[ -z ${!param+x} ]]; then
# Fail with a clear message if the input contains any bell characters.
if eval "[ \"\${*#*${lsep}}\" != \"\$*\" ]"; then
die "Compiler command line contains our separator ('${lsep}'). Cannot parse."
fi
# ensure required variables are set
for param in $params; do
if eval "test -z \"\${${param}:-}\""; then
die "Spack compiler must be run from Spack! Input '$param' is missing."
fi
done
# Check if optional parameters are defined
# If we aren't asking for debug flags, don't add them
if [[ -z ${SPACK_ADD_DEBUG_FLAGS+x} ]]; then
if [ -z "${SPACK_ADD_DEBUG_FLAGS:-}" ]; then
SPACK_ADD_DEBUG_FLAGS="false"
fi
# SPACK_ADD_DEBUG_FLAGS must be true/false/custom
is_valid="false"
for param in "true" "false" "custom"; do
if [ "$param" == "$SPACK_ADD_DEBUG_FLAGS" ]; then
if [ "$param" = "$SPACK_ADD_DEBUG_FLAGS" ]; then
is_valid="true"
fi
done
# Exit with error if we are given an incorrect value
if [ "$is_valid" == "false" ]; then
die "SPACK_ADD_DEBUG_FLAGS, if defined, must be one of 'true' 'false' or 'custom'"
if [ "$is_valid" = "false" ]; then
die "SPACK_ADD_DEBUG_FLAGS, if defined, must be one of 'true', 'false', or 'custom'."
fi
# Figure out the type of compiler, the language, and the mode so that
@@ -128,7 +234,7 @@ fi
# ld link
# ccld compile & link
command=$(basename "$0")
command="${0##*/}"
comp="CC"
case "$command" in
cpp)
@@ -142,7 +248,7 @@ case "$command" in
lang_flags=C
debug_flags="-g"
;;
c++|CC|g++|clang++|armclang++|icpc|icpx|pgc++|nvc++|xlc++|xlc++_r|FCC)
c++|CC|g++|clang++|armclang++|icpc|icpx|dpcpp|pgc++|nvc++|xlc++|xlc++_r|FCC)
command="$SPACK_CXX"
language="C++"
comp="CXX"
@@ -174,7 +280,7 @@ esac
# If any of the arguments below are present, then the mode is vcheck.
# In vcheck mode, nothing is added in terms of extra search paths or
# libraries.
if [[ -z $mode ]] || [[ $mode == ld ]]; then
if [ -z "$mode" ] || [ "$mode" = ld ]; then
for arg in "$@"; do
case $arg in
-v|-V|--version|-dumpversion)
@@ -186,16 +292,16 @@ if [[ -z $mode ]] || [[ $mode == ld ]]; then
fi
# Finish setting up the mode.
if [[ -z $mode ]]; then
if [ -z "$mode" ]; then
mode=ccld
for arg in "$@"; do
if [[ $arg == -E ]]; then
if [ "$arg" = "-E" ]; then
mode=cpp
break
elif [[ $arg == -S ]]; then
elif [ "$arg" = "-S" ]; then
mode=as
break
elif [[ $arg == -c ]]; then
elif [ "$arg" = "-c" ]; then
mode=cc
break
fi
@@ -222,42 +328,46 @@ dtags_to_strip="${SPACK_DTAGS_TO_STRIP}"
linker_arg="${SPACK_LINKER_ARG}"
# Set up rpath variable according to language.
eval rpath=\$SPACK_${comp}_RPATH_ARG
rpath="ERROR: RPATH ARG WAS NOT SET"
eval "rpath=\${SPACK_${comp}_RPATH_ARG:?${rpath}}"
# Dump the mode and exit if the command is dump-mode.
if [[ $SPACK_TEST_COMMAND == dump-mode ]]; then
if [ "$SPACK_TEST_COMMAND" = "dump-mode" ]; then
echo "$mode"
exit
fi
# Check that at least one of the real commands was actually selected,
# otherwise we don't know what to execute.
if [[ -z $command ]]; then
die "ERROR: Compiler '$SPACK_COMPILER_SPEC' does not support compiling $language programs."
# If, say, SPACK_CC is set but SPACK_FC is not, we want to know. Compilers do not
# *have* to set up Fortran executables, so we need to tell the user when a build is
# about to attempt to use them unsuccessfully.
if [ -z "$command" ]; then
die "Compiler '$SPACK_COMPILER_SPEC' does not have a $language compiler configured."
fi
#
# Filter '.' and Spack environment directories out of PATH so that
# this script doesn't just call itself
#
IFS=':' read -ra env_path <<< "$PATH"
IFS=':' read -ra spack_env_dirs <<< "$SPACK_ENV_PATH"
spack_env_dirs+=("" ".")
export PATH=""
for dir in "${env_path[@]}"; do
new_dirs=""
IFS=':'
for dir in $PATH; do
addpath=true
for env_dir in "${spack_env_dirs[@]}"; do
if [[ "${dir%%/}" == "$env_dir" ]]; then
addpath=false
break
fi
for spack_env_dir in $SPACK_ENV_PATH; do
case "${dir%%/}" in
"$spack_env_dir"|'.'|'')
addpath=false
break
;;
esac
done
if $addpath; then
export PATH="${PATH:+$PATH:}$dir"
if [ $addpath = true ]; then
append new_dirs "$dir"
fi
done
unset IFS
export PATH="$new_dirs"
if [[ $mode == vcheck ]]; then
if [ "$mode" = vcheck ]; then
exec "${command}" "$@"
fi
@@ -265,16 +375,20 @@ fi
# It doesn't work with -rpath.
# This variable controls whether they are added.
add_rpaths=true
if [[ ($mode == ld || $mode == ccld) && "$SPACK_SHORT_SPEC" =~ "darwin" ]];
then
for arg in "$@"; do
if [[ ($arg == -r && $mode == ld) ||
($arg == -r && $mode == ccld) ||
($arg == -Wl,-r && $mode == ccld) ]]; then
add_rpaths=false
break
fi
done
if [ "$mode" = ld ] || [ "$mode" = ccld ]; then
if [ "${SPACK_SHORT_SPEC#*darwin}" != "${SPACK_SHORT_SPEC}" ]; then
for arg in "$@"; do
if [ "$arg" = "-r" ]; then
if [ "$mode" = ld ] || [ "$mode" = ccld ]; then
add_rpaths=false
break
fi
elif [ "$arg" = "-Wl,-r" ] && [ "$mode" = ccld ]; then
add_rpaths=false
break
fi
done
fi
fi
# Save original command for debug logging
@@ -297,17 +411,22 @@ input_command="$*"
# The libs variable is initialized here for completeness, and it is also
# used later to inject flags supplied via `ldlibs` on the command
# line. These come into the wrappers via SPACK_LDLIBS.
#
includes=()
libdirs=()
rpaths=()
system_includes=()
system_libdirs=()
system_rpaths=()
libs=()
other_args=()
isystem_system_includes=()
isystem_includes=()
# The loop below breaks up the command line into these lists of components.
# The lists are all bell-separated to be as flexible as possible, as their
# contents may come from the command line, from ' '-separated lists,
# ':'-separated lists, etc.
include_dirs_list=""
lib_dirs_list=""
rpath_dirs_list=""
system_include_dirs_list=""
system_lib_dirs_list=""
system_rpath_dirs_list=""
isystem_system_include_dirs_list=""
isystem_include_dirs_list=""
libs_list=""
other_args_list=""
while [ $# -ne 0 ]; do
@@ -327,32 +446,32 @@ while [ $# -ne 0 ]; do
isystem_was_used=true
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
isystem_system_includes+=("$arg")
append isystem_system_include_dirs_list "$arg"
else
isystem_includes+=("$arg")
append isystem_include_dirs_list "$arg"
fi
;;
-I*)
arg="${1#-I}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
system_includes+=("$arg")
append system_include_dirs_list "$arg"
else
includes+=("$arg")
append include_dirs_list "$arg"
fi
;;
-L*)
arg="${1#-L}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
system_libdirs+=("$arg")
append system_lib_dirs_list "$arg"
else
libdirs+=("$arg")
append lib_dirs_list "$arg"
fi
;;
-l*)
# -loopopt=0 is generated erroneously in autoconf <= 2.69,
# and passed by ifx to the linker, which confuses it with a
# and passed by ifx to the linker, which confuses it with a
# library. Filter it out.
# TODO: generalize filtering of args with an env var, so that
# TODO: we do not have to special case this here.
@@ -363,66 +482,76 @@ while [ $# -ne 0 ]; do
fi
arg="${1#-l}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
other_args+=("-l$arg")
append other_args_list "-l$arg"
;;
-Wl,*)
arg="${1#-Wl,}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if [[ "$arg" = -rpath=* ]]; then
rp="${arg#-rpath=}"
elif [[ "$arg" = --rpath=* ]]; then
rp="${arg#--rpath=}"
elif [[ "$arg" = -rpath,* ]]; then
rp="${arg#-rpath,}"
elif [[ "$arg" = --rpath,* ]]; then
rp="${arg#--rpath,}"
elif [[ "$arg" =~ ^-?-rpath$ ]]; then
shift; arg="$1"
if [[ "$arg" != -Wl,* ]]; then
die "-Wl,-rpath was not followed by -Wl,*"
fi
rp="${arg#-Wl,}"
elif [[ "$arg" = "$dtags_to_strip" ]] ; then
: # We want to remove explicitly this flag
else
other_args+=("-Wl,$arg")
fi
case "$arg" in
-rpath=*) rp="${arg#-rpath=}" ;;
--rpath=*) rp="${arg#--rpath=}" ;;
-rpath,*) rp="${arg#-rpath,}" ;;
--rpath,*) rp="${arg#--rpath,}" ;;
-rpath|--rpath)
shift; arg="$1"
case "$arg" in
-Wl,*)
rp="${arg#-Wl,}"
;;
*)
die "-Wl,-rpath was not followed by -Wl,*"
;;
esac
;;
"$dtags_to_strip")
: # We want to remove explicitly this flag
;;
*)
append other_args_list "-Wl,$arg"
;;
esac
;;
-Xlinker,*)
arg="${1#-Xlinker,}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if [[ "$arg" = -rpath=* ]]; then
rp="${arg#-rpath=}"
elif [[ "$arg" = --rpath=* ]]; then
rp="${arg#--rpath=}"
elif [[ "$arg" = -rpath ]] || [[ "$arg" = --rpath ]]; then
shift; arg="$1"
if [[ "$arg" != -Xlinker,* ]]; then
die "-Xlinker,-rpath was not followed by -Xlinker,*"
fi
rp="${arg#-Xlinker,}"
else
other_args+=("-Xlinker,$arg")
fi
case "$arg" in
-rpath=*) rp="${arg#-rpath=}" ;;
--rpath=*) rp="${arg#--rpath=}" ;;
-rpath|--rpath)
shift; arg="$1"
case "$arg" in
-Xlinker,*)
rp="${arg#-Xlinker,}"
;;
*)
die "-Xlinker,-rpath was not followed by -Xlinker,*"
;;
esac
;;
*)
append other_args_list "-Xlinker,$arg"
;;
esac
;;
-Xlinker)
if [[ "$2" == "-rpath" ]]; then
if [[ "$3" != "-Xlinker" ]]; then
if [ "$2" = "-rpath" ]; then
if [ "$3" != "-Xlinker" ]; then
die "-Xlinker,-rpath was not followed by -Xlinker,*"
fi
shift 3;
rp="$1"
elif [[ "$2" = "$dtags_to_strip" ]] ; then
elif [ "$2" = "$dtags_to_strip" ]; then
shift # We want to remove explicitly this flag
else
other_args+=("$1")
append other_args_list "$1"
fi
;;
*)
if [[ "$1" = "$dtags_to_strip" ]] ; then
if [ "$1" = "$dtags_to_strip" ]; then
: # We want to remove explicitly this flag
else
other_args+=("$1")
append other_args_list "$1"
fi
;;
esac
@@ -430,9 +559,9 @@ while [ $# -ne 0 ]; do
# test rpaths against system directories in one place.
if [ -n "$rp" ]; then
if system_dir "$rp"; then
system_rpaths+=("$rp")
append system_rpath_dirs_list "$rp"
else
rpaths+=("$rp")
append rpath_dirs_list "$rp"
fi
fi
shift
@@ -445,16 +574,15 @@ done
# See the gmake manual on implicit rules for details:
# https://www.gnu.org/software/make/manual/html_node/Implicit-Variables.html
#
flags=()
flags_list=""
# Add debug flags
if [ "${SPACK_ADD_DEBUG_FLAGS}" == "true" ]; then
flags=("${flags[@]}" "${debug_flags}")
if [ "${SPACK_ADD_DEBUG_FLAGS}" = "true" ]; then
extend flags_list debug_flags
# If a custom flag is requested, derive from environment
elif [ "$SPACK_ADD_DEBUG_FLAGS" == "custom" ]; then
IFS=' ' read -ra SPACK_DEBUG_FLAGS <<< "$SPACK_DEBUG_FLAGS"
flags=("${flags[@]}" "${SPACK_DEBUG_FLAGS[@]}")
elif [ "$SPACK_ADD_DEBUG_FLAGS" = "custom" ]; then
extend flags_list SPACK_DEBUG_FLAGS
fi
# Fortran flags come before CPPFLAGS
@@ -462,7 +590,8 @@ case "$mode" in
cc|ccld)
case $lang_flags in
F)
flags=("${flags[@]}" "${SPACK_FFLAGS[@]}") ;;
extend flags_list SPACK_FFLAGS
;;
esac
;;
esac
@@ -470,7 +599,8 @@ esac
# C preprocessor flags come before any C/CXX flags
case "$mode" in
cpp|as|cc|ccld)
flags=("${flags[@]}" "${SPACK_CPPFLAGS[@]}") ;;
extend flags_list SPACK_CPPFLAGS
;;
esac
@@ -479,67 +609,67 @@ case "$mode" in
cc|ccld)
case $lang_flags in
C)
flags=("${flags[@]}" "${SPACK_CFLAGS[@]}") ;;
extend flags_list SPACK_CFLAGS
;;
CXX)
flags=("${flags[@]}" "${SPACK_CXXFLAGS[@]}") ;;
extend flags_list SPACK_CXXFLAGS
;;
esac
flags=(${SPACK_TARGET_ARGS[@]} "${flags[@]}")
# prepend target args
preextend flags_list SPACK_TARGET_ARGS
;;
esac
# Linker flags
case "$mode" in
ld|ccld)
flags=("${flags[@]}" "${SPACK_LDFLAGS[@]}") ;;
extend flags_list SPACK_LDFLAGS
;;
esac
# On macOS insert headerpad_max_install_names linker flag
if [[ ($mode == ld || $mode == ccld) && "$SPACK_SHORT_SPEC" =~ "darwin" ]];
then
case "$mode" in
ld)
flags=("${flags[@]}" -headerpad_max_install_names) ;;
ccld)
flags=("${flags[@]}" "-Wl,-headerpad_max_install_names") ;;
esac
if [ "$mode" = ld ] || [ "$mode" = ccld ]; then
if [ "${SPACK_SHORT_SPEC#*darwin}" != "${SPACK_SHORT_SPEC}" ]; then
case "$mode" in
ld)
append flags_list "-headerpad_max_install_names" ;;
ccld)
append flags_list "-Wl,-headerpad_max_install_names" ;;
esac
fi
fi
IFS=':' read -ra rpath_dirs <<< "$SPACK_RPATH_DIRS"
if [[ $mode == ccld || $mode == ld ]]; then
if [[ "$add_rpaths" != "false" ]] ; then
if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
if [ "$add_rpaths" != "false" ]; then
# Append RPATH directories. Note that in the case of the
# top-level package these directories may not exist yet. For dependencies
# it is assumed that paths have already been confirmed.
rpaths=("${rpaths[@]}" "${rpath_dirs[@]}")
extend rpath_dirs_list SPACK_RPATH_DIRS
fi
fi
IFS=':' read -ra link_dirs <<< "$SPACK_LINK_DIRS"
if [[ $mode == ccld || $mode == ld ]]; then
libdirs=("${libdirs[@]}" "${link_dirs[@]}")
if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
extend lib_dirs_list SPACK_LINK_DIRS
fi
# add RPATHs if we're in any linking mode
case "$mode" in
ld|ccld)
# Set extra RPATHs
IFS=':' read -ra extra_rpaths <<< "$SPACK_COMPILER_EXTRA_RPATHS"
libdirs+=("${extra_rpaths[@]}")
if [[ "$add_rpaths" != "false" ]] ; then
rpaths+=("${extra_rpaths[@]}")
extend lib_dirs_list SPACK_COMPILER_EXTRA_RPATHS
if [ "$add_rpaths" != "false" ]; then
extend rpath_dirs_list SPACK_COMPILER_EXTRA_RPATHS
fi
# Set implicit RPATHs
IFS=':' read -ra implicit_rpaths <<< "$SPACK_COMPILER_IMPLICIT_RPATHS"
if [[ "$add_rpaths" != "false" ]] ; then
rpaths+=("${implicit_rpaths[@]}")
if [ "$add_rpaths" != "false" ]; then
extend rpath_dirs_list SPACK_COMPILER_IMPLICIT_RPATHS
fi
# Add SPACK_LDLIBS to args
for lib in "${SPACK_LDLIBS[@]}"; do
libs+=("${lib#-l}")
for lib in $SPACK_LDLIBS; do
append libs_list "${lib#-l}"
done
;;
esac
@@ -547,63 +677,62 @@ esac
#
# Finally, reassemble the command line.
#
# Includes and system includes first
args=()
# flags assembled earlier
args+=("${flags[@]}")
args_list="$flags_list"
# Insert include directories just prior to any system include directories
# NOTE: adding ${lsep} to the prefix here turns every added element into two
extend args_list include_dirs_list "-I"
extend args_list isystem_include_dirs_list "-isystem${lsep}"
for dir in "${includes[@]}"; do args+=("-I$dir"); done
for dir in "${isystem_includes[@]}"; do args+=("-isystem" "$dir"); done
case "$mode" in
cpp|cc|as|ccld)
if [ "$isystem_was_used" = "true" ]; then
extend args_list SPACK_INCLUDE_DIRS "-isystem${lsep}"
else
extend args_list SPACK_INCLUDE_DIRS "-I"
fi
;;
esac
IFS=':' read -ra spack_include_dirs <<< "$SPACK_INCLUDE_DIRS"
if [[ $mode == cpp || $mode == cc || $mode == as || $mode == ccld ]]; then
if [[ "$isystem_was_used" == "true" ]] ; then
for dir in "${spack_include_dirs[@]}"; do args+=("-isystem" "$dir"); done
else
for dir in "${spack_include_dirs[@]}"; do args+=("-I$dir"); done
fi
fi
for dir in "${system_includes[@]}"; do args+=("-I$dir"); done
for dir in "${isystem_system_includes[@]}"; do args+=("-isystem" "$dir"); done
extend args_list system_include_dirs_list -I
extend args_list isystem_system_include_dirs_list "-isystem${lsep}"
# Library search paths
for dir in "${libdirs[@]}"; do args+=("-L$dir"); done
for dir in "${system_libdirs[@]}"; do args+=("-L$dir"); done
extend args_list lib_dirs_list "-L"
extend args_list system_lib_dirs_list "-L"
# RPATHs arguments
case "$mode" in
ccld)
if [ -n "$dtags_to_add" ] ; then args+=("$linker_arg$dtags_to_add") ; fi
for dir in "${rpaths[@]}"; do args+=("$rpath$dir"); done
for dir in "${system_rpaths[@]}"; do args+=("$rpath$dir"); done
if [ -n "$dtags_to_add" ] ; then
append args_list "$linker_arg$dtags_to_add"
fi
extend args_list rpath_dirs_list "$rpath"
extend args_list system_rpath_dirs_list "$rpath"
;;
ld)
if [ -n "$dtags_to_add" ] ; then args+=("$dtags_to_add") ; fi
for dir in "${rpaths[@]}"; do args+=("-rpath" "$dir"); done
for dir in "${system_rpaths[@]}"; do args+=("-rpath" "$dir"); done
if [ -n "$dtags_to_add" ] ; then
append args_list "$dtags_to_add"
fi
extend args_list rpath_dirs_list "-rpath${lsep}"
extend args_list system_rpath_dirs_list "-rpath${lsep}"
;;
esac
# Other arguments from the input command
args+=("${other_args[@]}")
extend args_list other_args_list
# Inject SPACK_LDLIBS, if supplied
for lib in "${libs[@]}"; do
args+=("-l$lib");
done
extend args_list libs_list "-l"
full_command=("$command" "${args[@]}")
full_command_list="$command"
extend full_command_list args_list
# prepend the ccache binary if we're using ccache
if [ -n "$SPACK_CCACHE_BINARY" ]; then
case "$lang_flags" in
C|CXX) # ccache only supports C languages
full_command=("${SPACK_CCACHE_BINARY}" "${full_command[@]}")
prepend full_command_list "${SPACK_CCACHE_BINARY}"
# workaround for stage being a temp folder
# see #3761#issuecomment-294352232
export CCACHE_NOHASHDIR=yes
@@ -612,25 +741,36 @@ if [ -n "$SPACK_CCACHE_BINARY" ]; then
fi
# dump the full command if the caller supplies SPACK_TEST_COMMAND=dump-args
if [[ $SPACK_TEST_COMMAND == dump-args ]]; then
IFS="
" && echo "${full_command[*]}"
exit
elif [[ $SPACK_TEST_COMMAND =~ dump-env-* ]]; then
var=${SPACK_TEST_COMMAND#dump-env-}
echo "$0: $var: ${!var}"
elif [[ -n $SPACK_TEST_COMMAND ]]; then
die "ERROR: Unknown test command"
if [ -n "${SPACK_TEST_COMMAND=}" ]; then
case "$SPACK_TEST_COMMAND" in
dump-args)
IFS="$lsep"
for arg in $full_command_list; do
echo "$arg"
done
unset IFS
exit
;;
dump-env-*)
var=${SPACK_TEST_COMMAND#dump-env-}
eval "printf '%s\n' \"\$0: \$var: \$$var\""
;;
*)
die "Unknown test command: '$SPACK_TEST_COMMAND'"
;;
esac
fi
#
# Write the input and output commands to debug logs if it's asked for.
#
if [[ $SPACK_DEBUG == TRUE ]]; then
if [ "$SPACK_DEBUG" = TRUE ]; then
input_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.in.log"
output_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.out.log"
echo "[$mode] $command $input_command" >> "$input_log"
echo "[$mode] ${full_command[*]}" >> "$output_log"
echo "[$mode] ${full_command_list}" >> "$output_log"
fi
exec "${full_command[@]}"
# Execute the full command, preserving spaces with IFS set
# to the alarm bell separator.
IFS="$lsep"; exec $full_command_list
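Note: the hunk above rewrites the compiler wrapper to avoid bash arrays. Flags, include paths, libraries and rpaths are kept in flat strings joined by an unlikely separator (`lsep`) and manipulated with `append`, `extend`, `prepend`/`preextend` helpers so the script can run under plain POSIX `sh`. Those helpers are defined earlier in `cc` and are not part of this excerpt; the following is only a minimal sketch of how such helpers might behave, not the real definitions.

```sh
#!/bin/sh
# Sketch only: assumed shapes for the list helpers used in the hunk above.
lsep="$(printf '\a')"   # a separator unlikely to appear in real arguments

# append LISTNAME ELEMENT : add one element to the named list
append() {
    eval "_cur=\"\$$1\""
    if [ -z "$_cur" ]; then
        eval "$1=\"\$2\""
    else
        eval "$1=\"\${_cur}\${lsep}\$2\""
    fi
}

# extend LISTNAME OTHERLIST [PREFIX] : append every element of the named
# OTHERLIST, optionally prefixing each one (e.g. "-I" or "-isystem${lsep}")
extend() {
    eval "_other=\"\$$2\""
    _old_ifs="$IFS"; IFS="$lsep"
    for _elt in $_other; do
        append "$1" "${3}${_elt}"
    done
    IFS="$_old_ifs"
}

# prepend LISTNAME ELEMENT : add one element at the front of the named list
prepend() {
    eval "_cur=\"\$$1\""
    if [ -z "$_cur" ]; then
        eval "$1=\"\$2\""
    else
        eval "$1=\"\$2\${lsep}\${_cur}\""
    fi
}

# usage: build two small lists, merge them with a prefix, print one per line
flags_list=""
append flags_list "-O2"
append flags_list "-g"
include_dirs_list=""
append include_dirs_list "/usr/local/include"
extend flags_list include_dirs_list "-I"
IFS="$lsep"
for _f in $flags_list; do printf '%s\n' "$_f"; done
unset IFS
```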

lib/spack/env/oneapi/dpcpp vendored Symbolic link

@@ -0,0 +1 @@
../cc


@@ -11,13 +11,13 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.1.2 (commit 4dbf253daf37e4a008e4beb6489f347b4a35aed4)
* Version: 0.1.2 (commit 85757b6666422fca86aa882a769bf78b0f992f54)
argparse
--------
* Homepage: https://pypi.python.org/pypi/argparse
* Usage: We include our own version to be Python 2.6 compatible.
* Usage: We include our own version to be Python 3.X compatible.
* Version: 1.4.0
* Note: This package has been slightly modified to improve
error message formatting. See the following commit if the
@@ -37,23 +37,15 @@
* Homepage: https://pypi.python.org/pypi/distro
* Usage: Provides a more stable linux distribution detection.
* Version: 1.0.4 (last version supporting Python 2.6)
functools
---------
* Homepage: https://github.com/python/cpython/blob/2.7/Lib/functools.py
* Usage: Used for implementation of total_ordering.
* Version: Unversioned
* Note: This is the functools.total_ordering implementation
from Python 2.7 backported so we can run on Python 2.6.
* Version: 1.6.0 (64946a1e2a9ff529047070657728600e006c99ff)
* Note: Last version supporting Python 2.7
jinja2
------
* Homepage: https://pypi.python.org/pypi/Jinja2
* Usage: A modern and designer-friendly templating language for Python.
* Version: 2.10
* Version: 2.11.3 (last version supporting Python 2.7)
jsonschema
----------
@@ -71,15 +63,7 @@
* Homepage: https://pypi.python.org/pypi/MarkupSafe
* Usage: Implements a XML/HTML/XHTML Markup safe string for Python.
* Version: 1.0
ordereddict
----------
* Homepage: https://pypi.org/project/ordereddict/
* Usage: A drop-in substitute for Py2.7's new collections.OrderedDict
that works in Python 2.4-2.6.
* Version: 1.1
* Version: 1.1.1 (last version supporting Python 2.7)
py
--
@@ -88,6 +72,8 @@
* Usage: Needed by pytest. Library with cross-python path,
ini-parsing, io, code, and log facilities.
* Version: 1.4.34 (last version supporting Python 2.6)
* Note: This package has been modified:
* https://github.com/pytest-dev/py/pull/186 was backported
pytest
------
@@ -118,7 +104,7 @@
* Homepage: https://pypi.python.org/pypi/six
* Usage: Python 2 and 3 compatibility utilities.
* Version: 1.11.0
* Version: 1.16.0
macholib
--------


@@ -49,6 +49,19 @@ $ tox
congratulations :)
```
## Citing Archspec
If you are referencing `archspec` in a publication, please cite the following
paper:
* Massimiliano Culpo, Gregory Becker, Carlos Eduardo Arango Gutierrez, Kenneth
Hoste, and Todd Gamblin.
[**`archspec`: A library for detecting, labeling, and reasoning about
microarchitectures**](https://tgamblin.github.io/pubs/archspec-canopie-hpc-2020.pdf).
In *2nd International Workshop on Containers and New Orchestration Paradigms
for Isolated Environments in HPC (CANOPIE-HPC'20)*, Online Event, November
12, 2020.
## License
Archspec is distributed under the terms of both the MIT license and the


@@ -206,11 +206,26 @@ def host():
# Get a list of possible candidates for this micro-architecture
candidates = compatible_microarchitectures(info)
# Sorting criteria for candidates
def sorting_fn(item):
return len(item.ancestors), len(item.features)
# Get the best generic micro-architecture
generic_candidates = [c for c in candidates if c.vendor == "generic"]
best_generic = max(generic_candidates, key=sorting_fn)
# Filter the candidates to be descendant of the best generic candidate.
# This is to avoid that the lack of a niche feature that can be disabled
# from e.g. BIOS prevents detection of a reasonably performant architecture
candidates = [c for c in candidates if c > best_generic]
# If we don't have candidates, return the best generic micro-architecture
if not candidates:
return best_generic
# Reverse sort of the depth for the inheritance tree among only targets we
# can use. This gets the newest target we satisfy.
return sorted(
candidates, key=lambda t: (len(t.ancestors), len(t.features)), reverse=True
)[0]
return max(candidates, key=sorting_fn)
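Note: the updated `host()` first picks the best "generic" candidate and then only keeps candidates that descend from it, so a niche feature that happens to be missing (for instance, disabled in the BIOS) no longer drops detection down to a lowest-common-denominator target. A small usage sketch, assuming the vendored `archspec` is importable as `archspec.cpu`; the output is machine-dependent and only illustrative.

```python
# Sketch: query the detected microarchitecture through archspec's public API.
import archspec.cpu

host = archspec.cpu.host()   # best non-generic candidate, per the logic above
print(host.name)             # e.g. "skylake"
print(host.family)           # e.g. "x86_64"
print(host.vendor)           # e.g. "GenuineIntel"
```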
def compatibility_check(architecture_family):
@@ -245,7 +260,13 @@ def compatibility_check_for_power(info, target):
"""Compatibility check for PPC64 and PPC64LE architectures."""
basename = platform.machine()
generation_match = re.search(r"POWER(\d+)", info.get("cpu", ""))
generation = int(generation_match.group(1))
try:
generation = int(generation_match.group(1))
except AttributeError:
# There might be no match under emulated environments. For instance
# emulating a ppc64le with QEMU and Docker still reports the host
# /proc/cpuinfo and not a Power
generation = 0
# We can use a target if it descends from our machine type and our
# generation (9 for POWER9, etc) is at least its generation.
@@ -285,3 +306,22 @@ def compatibility_check_for_aarch64(info, target):
and (target.vendor == vendor or target.vendor == "generic")
and target.features.issubset(features)
)
@compatibility_check(architecture_family="riscv64")
def compatibility_check_for_riscv64(info, target):
"""Compatibility check for riscv64 architectures."""
basename = "riscv64"
uarch = info.get("uarch")
# sifive unmatched board
if uarch == "sifive,u74-mc":
uarch = "u74mc"
# catch-all for unknown uarchs
else:
uarch = "riscv64"
arch_root = TARGETS[basename]
return (target == arch_root or arch_root in target.ancestors) and (
target == uarch or target.vendor == "generic"
)


@@ -173,6 +173,12 @@ def family(self):
return roots.pop()
@property
def generic(self):
"""Returns the best generic architecture that is compatible with self"""
generics = [x for x in [self] + self.ancestors if x.vendor == "generic"]
return max(generics, key=lambda x: len(x.ancestors))
def to_dict(self, return_list_of_items=False):
"""Returns a dictionary representation of this object.


@@ -2017,6 +2017,44 @@
"features": [],
"compilers": {
}
},
"riscv64": {
"from": [],
"vendor": "generic",
"features": [],
"compilers": {
"gcc": [
{
"versions": "7.1:",
"flags" : "-march=rv64gc"
}
],
"clang": [
{
"versions": "9.0:",
"flags" : "-march=rv64gc"
}
]
}
},
"u74mc": {
"from": ["riscv64"],
"vendor": "SiFive",
"features": [],
"compilers": {
"gcc": [
{
"versions": "10.2:",
"flags" : "-march=rv64gc -mtune=sifive-7-series"
}
],
"clang" : [
{
"versions": "12.0:",
"flags" : "-march=rv64gc -mtune=sifive-7-series"
}
]
}
}
},
"feature_aliases": {

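Note: the new `riscv64` and `u74mc` entries feed archspec's flag selection. A hedged sketch of how they would be consumed through the library's Python API, assuming the vendored `archspec` exposes `TARGETS` and `Microarchitecture.optimization_flags` as upstream does; the flags simply restate the JSON above.

```python
# Sketch: map the new riscv64/u74mc targets to compiler flags via archspec.
import archspec.cpu

u74mc = archspec.cpu.TARGETS["u74mc"]
print(u74mc.family.name)                            # riscv64
print(u74mc.optimization_flags("gcc", "10.2.0"))    # -march=rv64gc -mtune=sifive-7-series
print(u74mc.optimization_flags("clang", "12.0.0"))  # -march=rv64gc -mtune=sifive-7-series
```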

@@ -77,52 +77,18 @@
from six import StringIO
from six import string_types
class prefilter(object):
"""Make regular expressions faster with a simple prefiltering predicate.
Some regular expressions seem to be much more costly than others. In
most cases, we can evaluate a simple precondition, e.g.::
lambda x: "error" in x
to avoid evaluating expensive regexes on all lines in a file. This
can reduce parse time for large files by orders of magnitude when
evaluating lots of expressions.
A ``prefilter`` object is designed to act like a regex, but
``search`` and ``match`` check the precondition before bothering to
evaluate the regular expression.
Note that ``match`` and ``search`` just return ``True`` and ``False``
at the moment. Make them return a ``MatchObject`` or ``None`` if it
becomes necessary.
"""
def __init__(self, precondition, *patterns):
self.patterns = [re.compile(p) for p in patterns]
self.pre = precondition
self.pattern = "\n ".join(
('MERGED:',) + patterns)
def search(self, text):
return self.pre(text) and any(p.search(text) for p in self.patterns)
def match(self, text):
return self.pre(text) and any(p.match(text) for p in self.patterns)
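Note: the `prefilter` wrapper being removed here is explained by its docstring above. For context, a self-contained sketch of the idea (a cheap substring precondition guarding the expensive regexes), restating the class so the example runs on its own; the pattern shown is illustrative, not one of the real lists that follow.

```python
import re

class prefilter(object):
    """Run a cheap precondition before evaluating any wrapped regex."""
    def __init__(self, precondition, *patterns):
        self.pre = precondition
        self.patterns = [re.compile(p) for p in patterns]

    def search(self, text):
        # the regexes are only evaluated when the precondition holds
        return self.pre(text) and any(p.search(text) for p in self.patterns)

errors = prefilter(lambda line: "error" in line,
                   r"([^:]+): error[ \t]*[0-9]+[ \t]*:")
print(bool(errors.search("foo.c: error 1234: something broke")))  # True
print(bool(errors.search("compiling foo.c ... ok")))              # False, regex never runs
```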
_error_matches = [
prefilter(
lambda x: any(s in x for s in (
'Error:', 'error', 'undefined reference', 'multiply defined')),
"([^:]+): error[ \\t]*[0-9]+[ \\t]*:",
"([^:]+): (Error:|error|undefined reference|multiply defined)",
"([^ :]+) ?: (error|fatal error|catastrophic error)",
"([^:]+)\\(([^\\)]+)\\) ?: (error|fatal error|catastrophic error)"),
"^FAILED",
"^FAIL: ",
"^FATAL: ",
"^failed ",
"FAILED",
"Failed test",
"^[Bb]us [Ee]rror",
"^[Ss]egmentation [Vv]iolation",
"^[Ss]egmentation [Ff]ault",
":.*[Pp]ermission [Dd]enied",
"[^ :]:[0-9]+: [^ \\t]",
"[^:]: error[ \\t]*[0-9]+[ \\t]*:",
"^Error ([0-9]+):",
"^Fatal",
"^[Ee]rror: ",
@@ -132,6 +98,9 @@ def match(self, text):
"^cc[^C]*CC: ERROR File = ([^,]+), Line = ([0-9]+)",
"^ld([^:])*:([ \\t])*ERROR([^:])*:",
"^ild:([ \\t])*\\(undefined symbol\\)",
"[^ :] : (error|fatal error|catastrophic error)",
"[^:]: (Error:|error|undefined reference|multiply defined)",
"[^:]\\([^\\)]+\\) ?: (error|fatal error|catastrophic error)",
"^fatal error C[0-9]+:",
": syntax error ",
"^collect2: ld returned 1 exit status",
@@ -140,7 +109,7 @@ def match(self, text):
"^Unresolved:",
"Undefined symbol",
"^Undefined[ \\t]+first referenced",
"^CMake Error.*:",
"^CMake Error",
":[ \\t]cannot find",
":[ \\t]can't find",
": \\*\\*\\* No rule to make target [`'].*\\'. Stop",
@@ -154,6 +123,7 @@ def match(self, text):
"ld: 0706-006 Cannot find or open library file: -l ",
"ild: \\(argument error\\) can't find library argument ::",
"^could not be found and will not be loaded.",
"^WARNING: '.*' is missing on your system",
"s:616 string too big",
"make: Fatal error: ",
"ld: 0711-993 Error occurred while writing to the output file:",
@@ -175,44 +145,38 @@ def match(self, text):
"instantiated from ",
"candidates are:",
": warning",
": WARNING",
": \\(Warning\\)",
": note",
" ok",
"Note:",
"makefile:",
"Makefile:",
":[ \\t]+Where:",
"([^ :]+):([0-9]+): Warning",
"[^ :]:[0-9]+: Warning",
"------ Build started: .* ------",
]
#: Regexes to match file/line numbers in error/warning messages
_warning_matches = [
prefilter(
lambda x: 'warning' in x,
"([^ :]+):([0-9]+): warning:",
"([^:]+): warning ([0-9]+):",
"([^:]+): warning[ \\t]*[0-9]+[ \\t]*:",
"([^ :]+) : warning",
"([^:]+): warning"),
prefilter(
lambda x: 'note:' in x,
"^([^ :]+):([0-9]+): note:"),
prefilter(
lambda x: any(s in x for s in ('Warning', 'Warnung')),
"^(Warning|Warnung) ([0-9]+):",
"^(Warning|Warnung)[ :]",
"^cxx: Warning:",
"([^ :]+):([0-9]+): (Warning|Warnung)",
"^CMake Warning.*:"),
"file: .* has no symbols",
"[^ :]:[0-9]+: warning:",
"[^ :]:[0-9]+: note:",
"^cc[^C]*CC: WARNING File = ([^,]+), Line = ([0-9]+)",
"^ld([^:])*:([ \\t])*WARNING([^:])*:",
"[^:]: warning [0-9]+:",
"^\"[^\"]+\", line [0-9]+: [Ww](arning|arnung)",
"[^:]: warning[ \\t]*[0-9]+[ \\t]*:",
"^(Warning|Warnung) ([0-9]+):",
"^(Warning|Warnung)[ :]",
"WARNING: ",
"[^ :] : warning",
"[^:]: warning",
"\", line [0-9]+\\.[0-9]+: [0-9]+-[0-9]+ \\([WI]\\)",
"^cxx: Warning:",
"file: .* has no symbols",
"[^ :]:[0-9]+: (Warning|Warnung)",
"\\([0-9]*\\): remark #[0-9]*",
"\".*\", line [0-9]+: remark\\([0-9]*\\):",
"cc-[0-9]* CC: REMARK File = .*, Line = [0-9]*",
"^CMake Warning",
"^\\[WARNING\\]",
]
@@ -223,8 +187,6 @@ def match(self, text):
"/usr/.*/X11/XResource\\.h:[0-9]+: war.*: ANSI C\\+\\+ forbids declaration",
"WARNING 84 :",
"WARNING 47 :",
"makefile:",
"Makefile:",
"warning: Clock skew detected. Your build may be incomplete.",
"/usr/openwin/include/GL/[^:]+:",
"bind_at_load",
@@ -343,8 +305,7 @@ def _profile_match(matches, exceptions, line, match_times, exc_times):
def _parse(lines, offset, profile):
def compile(regex_array):
return [regex if isinstance(regex, prefilter) else re.compile(regex)
for regex in regex_array]
return [re.compile(regex) for regex in regex_array]
error_matches = compile(_error_matches)
error_exceptions = compile(_error_exceptions)

File diff suppressed because it is too large.


@@ -1,47 +0,0 @@
#
# Backport of Python 2.7's total_ordering.
#
def total_ordering(cls):
"""Class decorator that fills in missing ordering methods"""
convert = {
'__lt__': [('__gt__', lambda self, other: not (self < other or self == other)),
('__le__', lambda self, other: self < other or self == other),
('__ge__', lambda self, other: not self < other)],
'__le__': [('__ge__', lambda self, other: not self <= other or self == other),
('__lt__', lambda self, other: self <= other and not self == other),
('__gt__', lambda self, other: not self <= other)],
'__gt__': [('__lt__', lambda self, other: not (self > other or self == other)),
('__ge__', lambda self, other: self > other or self == other),
('__le__', lambda self, other: not self > other)],
'__ge__': [('__le__', lambda self, other: (not self >= other) or self == other),
('__gt__', lambda self, other: self >= other and not self == other),
('__lt__', lambda self, other: not self >= other)]
}
roots = set(dir(cls)) & set(convert)
if not roots:
raise ValueError('must define at least one ordering operation: < > <= >=')
root = max(roots) # prefer __lt__ to __le__ to __gt__ to __ge__
for opname, opfunc in convert[root]:
if opname not in roots:
opfunc.__name__ = opname
opfunc.__doc__ = getattr(int, opname).__doc__
setattr(cls, opname, opfunc)
return cls
@total_ordering
class reverse_order(object):
"""Helper for creating key functions.
This is a wrapper that inverts the sense of the natural
comparisons on the object.
"""
def __init__(self, value):
self.value = value
def __eq__(self, other):
return other.value == self.value
def __lt__(self, other):
return other.value < self.value

lib/spack/external/jinja2/LICENSE.rst vendored Normal file

@@ -0,0 +1,28 @@
Copyright 2007 Pallets
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.


@@ -1,83 +1,44 @@
# -*- coding: utf-8 -*-
"""Jinja is a template engine written in pure Python. It provides a
non-XML syntax that supports inline expressions and an optional
sandboxed environment.
"""
jinja2
~~~~~~
from markupsafe import escape
from markupsafe import Markup
Jinja2 is a template engine written in pure Python. It provides a
Django inspired non-XML syntax but supports inline expressions and
an optional sandboxed environment.
from .bccache import BytecodeCache
from .bccache import FileSystemBytecodeCache
from .bccache import MemcachedBytecodeCache
from .environment import Environment
from .environment import Template
from .exceptions import TemplateAssertionError
from .exceptions import TemplateError
from .exceptions import TemplateNotFound
from .exceptions import TemplateRuntimeError
from .exceptions import TemplatesNotFound
from .exceptions import TemplateSyntaxError
from .exceptions import UndefinedError
from .filters import contextfilter
from .filters import environmentfilter
from .filters import evalcontextfilter
from .loaders import BaseLoader
from .loaders import ChoiceLoader
from .loaders import DictLoader
from .loaders import FileSystemLoader
from .loaders import FunctionLoader
from .loaders import ModuleLoader
from .loaders import PackageLoader
from .loaders import PrefixLoader
from .runtime import ChainableUndefined
from .runtime import DebugUndefined
from .runtime import make_logging_undefined
from .runtime import StrictUndefined
from .runtime import Undefined
from .utils import clear_caches
from .utils import contextfunction
from .utils import environmentfunction
from .utils import evalcontextfunction
from .utils import is_undefined
from .utils import select_autoescape
Nutshell
--------
Here a small example of a Jinja2 template::
{% extends 'base.html' %}
{% block title %}Memberlist{% endblock %}
{% block content %}
<ul>
{% for user in users %}
<li><a href="{{ user.url }}">{{ user.username }}</a></li>
{% endfor %}
</ul>
{% endblock %}
:copyright: (c) 2017 by the Jinja Team.
:license: BSD, see LICENSE for more details.
"""
__docformat__ = 'restructuredtext en'
__version__ = '2.10'
# high level interface
from jinja2.environment import Environment, Template
# loaders
from jinja2.loaders import BaseLoader, FileSystemLoader, PackageLoader, \
DictLoader, FunctionLoader, PrefixLoader, ChoiceLoader, \
ModuleLoader
# bytecode caches
from jinja2.bccache import BytecodeCache, FileSystemBytecodeCache, \
MemcachedBytecodeCache
# undefined types
from jinja2.runtime import Undefined, DebugUndefined, StrictUndefined, \
make_logging_undefined
# exceptions
from jinja2.exceptions import TemplateError, UndefinedError, \
TemplateNotFound, TemplatesNotFound, TemplateSyntaxError, \
TemplateAssertionError, TemplateRuntimeError
# decorators and public utilities
from jinja2.filters import environmentfilter, contextfilter, \
evalcontextfilter
from jinja2.utils import Markup, escape, clear_caches, \
environmentfunction, evalcontextfunction, contextfunction, \
is_undefined, select_autoescape
__all__ = [
'Environment', 'Template', 'BaseLoader', 'FileSystemLoader',
'PackageLoader', 'DictLoader', 'FunctionLoader', 'PrefixLoader',
'ChoiceLoader', 'BytecodeCache', 'FileSystemBytecodeCache',
'MemcachedBytecodeCache', 'Undefined', 'DebugUndefined',
'StrictUndefined', 'TemplateError', 'UndefinedError', 'TemplateNotFound',
'TemplatesNotFound', 'TemplateSyntaxError', 'TemplateAssertionError',
'TemplateRuntimeError',
'ModuleLoader', 'environmentfilter', 'contextfilter', 'Markup', 'escape',
'environmentfunction', 'contextfunction', 'clear_caches', 'is_undefined',
'evalcontextfilter', 'evalcontextfunction', 'make_logging_undefined',
'select_autoescape',
]
def _patch_async():
from jinja2.utils import have_async_gen
if have_async_gen:
from jinja2.asyncsupport import patch_all
patch_all()
_patch_async()
del _patch_async
__version__ = "2.11.3"


@@ -1,22 +1,12 @@
# -*- coding: utf-8 -*-
"""
jinja2._compat
~~~~~~~~~~~~~~
Some py2/py3 compatibility support based on a stripped down
version of six so we don't have to depend on a specific version
of it.
:copyright: Copyright 2013 by the Jinja team, see AUTHORS.
:license: BSD, see LICENSE for details.
"""
# flake8: noqa
import marshal
import sys
PY2 = sys.version_info[0] == 2
PYPY = hasattr(sys, 'pypy_translation_info')
PYPY = hasattr(sys, "pypy_translation_info")
_identity = lambda x: x
if not PY2:
unichr = chr
range_type = range
@@ -30,6 +20,7 @@
import pickle
from io import BytesIO, StringIO
NativeStringIO = StringIO
def reraise(tp, value, tb=None):
@@ -46,6 +37,9 @@ def reraise(tp, value, tb=None):
implements_to_string = _identity
encode_filename = _identity
marshal_dump = marshal.dump
marshal_load = marshal.load
else:
unichr = unichr
text_type = unicode
@@ -59,11 +53,13 @@ def reraise(tp, value, tb=None):
import cPickle as pickle
from cStringIO import StringIO as BytesIO, StringIO
NativeStringIO = BytesIO
exec('def reraise(tp, value, tb=None):\n raise tp, value, tb')
exec("def reraise(tp, value, tb=None):\n raise tp, value, tb")
from itertools import imap, izip, ifilter
intern = intern
def implements_iterator(cls):
@@ -73,14 +69,25 @@ def implements_iterator(cls):
def implements_to_string(cls):
cls.__unicode__ = cls.__str__
cls.__str__ = lambda x: x.__unicode__().encode('utf-8')
cls.__str__ = lambda x: x.__unicode__().encode("utf-8")
return cls
def encode_filename(filename):
if isinstance(filename, unicode):
return filename.encode('utf-8')
return filename.encode("utf-8")
return filename
def marshal_dump(code, f):
if isinstance(f, file):
marshal.dump(code, f)
else:
f.write(marshal.dumps(code))
def marshal_load(f):
if isinstance(f, file):
return marshal.load(f)
return marshal.loads(f.read())
def with_metaclass(meta, *bases):
"""Create a base class with a metaclass."""
@@ -90,10 +97,36 @@ def with_metaclass(meta, *bases):
class metaclass(type):
def __new__(cls, name, this_bases, d):
return meta(name, bases, d)
return type.__new__(metaclass, 'temporary_class', (), {})
return type.__new__(metaclass, "temporary_class", (), {})
try:
from urllib.parse import quote_from_bytes as url_quote
except ImportError:
from urllib import quote as url_quote
try:
from collections import abc
except ImportError:
import collections as abc
try:
from os import fspath
except ImportError:
try:
from pathlib import PurePath
except ImportError:
PurePath = None
def fspath(path):
if hasattr(path, "__fspath__"):
return path.__fspath__()
# Python 3.5 doesn't have __fspath__ yet, use str.
if PurePath is not None and isinstance(path, PurePath):
return str(path)
return path


@@ -1,2 +1,6 @@
import re
# generated by scripts/generate_identifier_pattern.py
pattern = '·̀-ͯ·҃-֑҇-ׇֽֿׁׂׅׄؐ-ًؚ-ٰٟۖ-ۜ۟-۪ۤۧۨ-ܑۭܰ-݊ަ-ް߫-߳ࠖ-࠙ࠛ-ࠣࠥ-ࠧࠩ-࡙࠭-࡛ࣔ-ࣣ࣡-ःऺ-़ा-ॏ॑-ॗॢॣঁ-ঃ়া-ৄেৈো-্ৗৢৣਁ-ਃ਼ਾ-ੂੇੈੋ-੍ੑੰੱੵઁ-ઃ઼ા-ૅે-ૉો-્ૢૣଁ-ଃ଼ା-ୄେୈୋ-୍ୖୗୢୣஂா-ூெ-ைொ-்ௗఀ-ఃా-ౄె-ైొ-్ౕౖౢౣಁ-ಃ಼ಾ-ೄೆ-ೈೊ-್ೕೖೢೣഁ-ഃാ-ൄെ-ൈൊ-്ൗൢൣංඃ්ා-ුූෘ-ෟෲෳัิ-ฺ็-๎ັິ-ູົຼ່-ໍ༹༘༙༵༷༾༿ཱ-྄྆྇ྍ-ྗྙ-ྼ࿆ါ-ှၖ-ၙၞ-ၠၢ-ၤၧ-ၭၱ-ၴႂ-ႍႏႚ-ႝ፝-፟ᜒ-᜔ᜲ-᜴ᝒᝓᝲᝳ឴-៓៝᠋-᠍ᢅᢆᢩᤠ-ᤫᤰ-᤻ᨗ-ᨛᩕ-ᩞ᩠-᩿᩼᪰-᪽ᬀ-ᬄ᬴-᭄᭫-᭳ᮀ-ᮂᮡ-ᮭ᯦-᯳ᰤ-᰷᳐-᳔᳒-᳨᳭ᳲ-᳴᳸᳹᷀-᷵᷻-᷿‿⁀⁔⃐-⃥⃜⃡-⃰℘℮⳯-⵿⳱ⷠ-〪ⷿ-゙゚〯꙯ꙴ-꙽ꚞꚟ꛰꛱ꠂ꠆ꠋꠣ-ꠧꢀꢁꢴ-ꣅ꣠-꣱ꤦ-꤭ꥇ-꥓ꦀ-ꦃ꦳-꧀ꧥꨩ-ꨶꩃꩌꩍꩻ-ꩽꪰꪲ-ꪴꪷꪸꪾ꪿꫁ꫫ-ꫯꫵ꫶ꯣ-ꯪ꯬꯭ﬞ︀-️︠-︯︳︴﹍-﹏_𐇽𐋠𐍶-𐍺𐨁-𐨃𐨅𐨆𐨌-𐨏𐨸-𐨿𐨺𐫦𐫥𑀀-𑀂𑀸-𑁆𑁿-𑂂𑂰-𑂺𑄀-𑄂𑄧-𑅳𑄴𑆀-𑆂𑆳-𑇊𑇀-𑇌𑈬-𑈷𑈾𑋟-𑋪𑌀-𑌃𑌼𑌾-𑍄𑍇𑍈𑍋-𑍍𑍗𑍢𑍣𑍦-𑍬𑍰-𑍴𑐵-𑑆𑒰-𑓃𑖯-𑖵𑖸-𑗀𑗜𑗝𑘰-𑙀𑚫-𑚷𑜝-𑜫𑰯-𑰶𑰸-𑰿𑲒-𑲧𑲩-𑲶𖫰-𖫴𖬰-𖬶𖽑-𖽾𖾏-𖾒𛲝𛲞𝅥-𝅩𝅭-𝅲𝅻-𝆂𝆅-𝆋𝆪-𝆭𝉂-𝉄𝨀-𝨶𝨻-𝩬𝩵𝪄𝪛-𝪟𝪡-𝪯𞀀-𞀆𞀈-𞀘𞀛-𞀡𞀣𞀤𞀦-𞣐𞀪-𞣖𞥄-𞥊󠄀-󠇯'
pattern = re.compile(
r"[\w·̀-ͯ·҃-֑҇-ׇֽֿׁׂׅׄؐ-ًؚ-ٰٟۖ-ۜ۟-۪ۤۧۨ-ܑۭܰ-݊ަ-ް߫-߳ࠖ-࠙ࠛ-ࠣࠥ-ࠧࠩ-࡙࠭-࡛ࣔ-ࣣ࣡-ःऺ-़ा-ॏ॑-ॗॢॣঁ-ঃ়া-ৄেৈো-্ৗৢৣਁ-ਃ਼ਾ-ੂੇੈੋ-੍ੑੰੱੵઁ-ઃ઼ા-ૅે-ૉો-્ૢૣଁ-ଃ଼ା-ୄେୈୋ-୍ୖୗୢୣஂா-ூெ-ைொ-்ௗఀ-ఃా-ౄె-ైొ-్ౕౖౢౣಁ-ಃ಼ಾ-ೄೆ-ೈೊ-್ೕೖೢೣഁ-ഃാ-ൄെ-ൈൊ-്ൗൢൣංඃ්ා-ුූෘ-ෟෲෳัิ-ฺ็-๎ັິ-ູົຼ່-ໍ༹༘༙༵༷༾༿ཱ-྄྆྇ྍ-ྗྙ-ྼ࿆ါ-ှၖ-ၙၞ-ၠၢ-ၤၧ-ၭၱ-ၴႂ-ႍႏႚ-ႝ፝-፟ᜒ-᜔ᜲ-᜴ᝒᝓᝲᝳ឴-៓៝᠋-᠍ᢅᢆᢩᤠ-ᤫᤰ-᤻ᨗ-ᨛᩕ-ᩞ᩠-᩿᩼᪰-᪽ᬀ-ᬄ᬴-᭄᭫-᭳ᮀ-ᮂᮡ-ᮭ᯦-᯳ᰤ-᰷᳐-᳔᳒-᳨᳭ᳲ-᳴᳸᳹᷀-᷵᷻-᷿‿⁀⁔⃐-⃥⃜⃡-⃰℘℮⳯-⵿⳱ⷠ-〪ⷿ-゙゚〯꙯ꙴ-꙽ꚞꚟ꛰꛱ꠂ꠆ꠋꠣ-ꠧꢀꢁꢴ-ꣅ꣠-꣱ꤦ-꤭ꥇ-꥓ꦀ-ꦃ꦳-꧀ꧥꨩ-ꨶꩃꩌꩍꩻ-ꩽꪰꪲ-ꪴꪷꪸꪾ꪿꫁ꫫ-ꫯꫵ꫶ꯣ-ꯪ꯬꯭ﬞ︀-️︠-︯︳︴﹍-﹏_𐇽𐋠𐍶-𐍺𐨁-𐨃𐨅𐨆𐨌-𐨏𐨸-𐨿𐨺𐫦𐫥𑀀-𑀂𑀸-𑁆𑁿-𑂂𑂰-𑂺𑄀-𑄂𑄧-𑅳𑄴𑆀-𑆂𑆳-𑇊𑇀-𑇌𑈬-𑈷𑈾𑋟-𑋪𑌀-𑌃𑌼𑌾-𑍄𑍇𑍈𑍋-𑍍𑍗𑍢𑍣𑍦-𑍬𑍰-𑍴𑐵-𑑆𑒰-𑓃𑖯-𑖵𑖸-𑗀𑗜𑗝𑘰-𑙀𑚫-𑚷𑜝-𑜫𑰯-𑰶𑰸-𑰿𑲒-𑲧𑲩-𑲶𖫰-𖫴𖬰-𖬶𖽑-𖽾𖾏-𖾒𛲝𛲞𝅥-𝅩𝅭-𝅲𝅻-𝆂𝆅-𝆋𝆪-𝆭𝉂-𝉄𝨀-𝨶𝨻-𝩬𝩵𝪄𝪛-𝪟𝪡-𝪯𞀀-𞀆𞀈-𞀘𞀛-𞀡𞀣𞀤𞀦-𞣐𞀪-𞣖𞥄-𞥊󠄀-󠇯]+" # noqa: B950
)


@@ -1,12 +1,13 @@
from functools import wraps
from jinja2.asyncsupport import auto_aiter
from jinja2 import filters
from . import filters
from .asyncsupport import auto_aiter
from .asyncsupport import auto_await
async def auto_to_seq(value):
seq = []
if hasattr(value, '__aiter__'):
if hasattr(value, "__aiter__"):
async for item in value:
seq.append(item)
else:
@@ -16,8 +17,7 @@ async def auto_to_seq(value):
async def async_select_or_reject(args, kwargs, modfunc, lookup_attr):
seq, func = filters.prepare_select_or_reject(
args, kwargs, modfunc, lookup_attr)
seq, func = filters.prepare_select_or_reject(args, kwargs, modfunc, lookup_attr)
if seq:
async for item in auto_aiter(seq):
if func(item):
@@ -26,14 +26,19 @@ async def async_select_or_reject(args, kwargs, modfunc, lookup_attr):
def dualfilter(normal_filter, async_filter):
wrap_evalctx = False
if getattr(normal_filter, 'environmentfilter', False):
is_async = lambda args: args[0].is_async
if getattr(normal_filter, "environmentfilter", False) is True:
def is_async(args):
return args[0].is_async
wrap_evalctx = False
else:
if not getattr(normal_filter, 'evalcontextfilter', False) and \
not getattr(normal_filter, 'contextfilter', False):
wrap_evalctx = True
is_async = lambda args: args[0].environment.is_async
has_evalctxfilter = getattr(normal_filter, "evalcontextfilter", False) is True
has_ctxfilter = getattr(normal_filter, "contextfilter", False) is True
wrap_evalctx = not has_evalctxfilter and not has_ctxfilter
def is_async(args):
return args[0].environment.is_async
@wraps(normal_filter)
def wrapper(*args, **kwargs):
@@ -55,6 +60,7 @@ def wrapper(*args, **kwargs):
def asyncfiltervariant(original):
def decorator(f):
return dualfilter(original, f)
return decorator
@@ -63,19 +69,22 @@ async def do_first(environment, seq):
try:
return await auto_aiter(seq).__anext__()
except StopAsyncIteration:
return environment.undefined('No first item, sequence was empty.')
return environment.undefined("No first item, sequence was empty.")
@asyncfiltervariant(filters.do_groupby)
async def do_groupby(environment, value, attribute):
expr = filters.make_attrgetter(environment, attribute)
return [filters._GroupTuple(key, await auto_to_seq(values))
for key, values in filters.groupby(sorted(
await auto_to_seq(value), key=expr), expr)]
return [
filters._GroupTuple(key, await auto_to_seq(values))
for key, values in filters.groupby(
sorted(await auto_to_seq(value), key=expr), expr
)
]
@asyncfiltervariant(filters.do_join)
async def do_join(eval_ctx, value, d=u'', attribute=None):
async def do_join(eval_ctx, value, d=u"", attribute=None):
return filters.do_join(eval_ctx, await auto_to_seq(value), d, attribute)
@@ -109,7 +118,7 @@ async def do_map(*args, **kwargs):
seq, func = filters.prepare_map(args, kwargs)
if seq:
async for item in auto_aiter(seq):
yield func(item)
yield await auto_await(func(item))
@asyncfiltervariant(filters.do_sum)
@@ -118,7 +127,10 @@ async def do_sum(environment, iterable, attribute=None, start=0):
if attribute is not None:
func = filters.make_attrgetter(environment, attribute)
else:
func = lambda x: x
def func(x):
return x
async for item in auto_aiter(iterable):
rv += func(item)
return rv
@@ -130,17 +142,17 @@ async def do_slice(value, slices, fill_with=None):
ASYNC_FILTERS = {
'first': do_first,
'groupby': do_groupby,
'join': do_join,
'list': do_list,
"first": do_first,
"groupby": do_groupby,
"join": do_join,
"list": do_list,
# we intentionally do not support do_last because that would be
# ridiculous
'reject': do_reject,
'rejectattr': do_rejectattr,
'map': do_map,
'select': do_select,
'selectattr': do_selectattr,
'sum': do_sum,
'slice': do_slice,
"reject": do_reject,
"rejectattr": do_rejectattr,
"map": do_map,
"select": do_select,
"selectattr": do_selectattr,
"sum": do_sum,
"slice": do_slice,
}


@@ -1,29 +1,27 @@
# -*- coding: utf-8 -*-
"""The code for async support. Importing this patches Jinja on supported
Python versions.
"""
jinja2.asyncsupport
~~~~~~~~~~~~~~~~~~~
Has all the code for async support which is implemented as a patch
for supported Python versions.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD, see LICENSE for more details.
"""
import sys
import asyncio
import inspect
from functools import update_wrapper
from jinja2.utils import concat, internalcode, Markup
from jinja2.environment import TemplateModule
from jinja2.runtime import LoopContextBase, _last_iteration
from markupsafe import Markup
from .environment import TemplateModule
from .runtime import LoopContext
from .utils import concat
from .utils import internalcode
from .utils import missing
async def concat_async(async_gen):
rv = []
async def collect():
async for event in async_gen:
rv.append(event)
await collect()
return concat(rv)
@@ -34,10 +32,7 @@ async def generate_async(self, *args, **kwargs):
async for event in self.root_render_func(self.new_context(vars)):
yield event
except Exception:
exc_info = sys.exc_info()
else:
return
yield self.environment.handle_exception(exc_info, True)
yield self.environment.handle_exception()
def wrap_generate_func(original_generate):
@@ -48,17 +43,18 @@ def _convert_generator(self, loop, args, kwargs):
yield loop.run_until_complete(async_gen.__anext__())
except StopAsyncIteration:
pass
def generate(self, *args, **kwargs):
if not self.environment.is_async:
return original_generate(self, *args, **kwargs)
return _convert_generator(self, asyncio.get_event_loop(), args, kwargs)
return update_wrapper(generate, original_generate)
async def render_async(self, *args, **kwargs):
if not self.environment.is_async:
raise RuntimeError('The environment was not created with async mode '
'enabled.')
raise RuntimeError("The environment was not created with async mode enabled.")
vars = dict(*args, **kwargs)
ctx = self.new_context(vars)
@@ -66,8 +62,7 @@ async def render_async(self, *args, **kwargs):
try:
return await concat_async(self.root_render_func(ctx))
except Exception:
exc_info = sys.exc_info()
return self.environment.handle_exception(exc_info, True)
return self.environment.handle_exception()
def wrap_render_func(original_render):
@@ -76,6 +71,7 @@ def render(self, *args, **kwargs):
return original_render(self, *args, **kwargs)
loop = asyncio.get_event_loop()
return loop.run_until_complete(self.render_async(*args, **kwargs))
return update_wrapper(render, original_render)
@@ -109,6 +105,7 @@ def _invoke(self, arguments, autoescape):
if not self._environment.is_async:
return original_invoke(self, arguments, autoescape)
return async_invoke(self, arguments, autoescape)
return update_wrapper(_invoke, original_invoke)
@@ -124,9 +121,9 @@ def wrap_default_module(original_default_module):
@internalcode
def _get_default_module(self):
if self.environment.is_async:
raise RuntimeError('Template module attribute is unavailable '
'in async mode')
raise RuntimeError("Template module attribute is unavailable in async mode")
return original_default_module(self)
return _get_default_module
@@ -139,30 +136,30 @@ async def make_module_async(self, vars=None, shared=False, locals=None):
def patch_template():
from jinja2 import Template
from . import Template
Template.generate = wrap_generate_func(Template.generate)
Template.generate_async = update_wrapper(
generate_async, Template.generate_async)
Template.render_async = update_wrapper(
render_async, Template.render_async)
Template.generate_async = update_wrapper(generate_async, Template.generate_async)
Template.render_async = update_wrapper(render_async, Template.render_async)
Template.render = wrap_render_func(Template.render)
Template._get_default_module = wrap_default_module(
Template._get_default_module)
Template._get_default_module = wrap_default_module(Template._get_default_module)
Template._get_default_module_async = get_default_module_async
Template.make_module_async = update_wrapper(
make_module_async, Template.make_module_async)
make_module_async, Template.make_module_async
)
def patch_runtime():
from jinja2.runtime import BlockReference, Macro
BlockReference.__call__ = wrap_block_reference_call(
BlockReference.__call__)
from .runtime import BlockReference, Macro
BlockReference.__call__ = wrap_block_reference_call(BlockReference.__call__)
Macro._invoke = wrap_macro_invoke(Macro._invoke)
def patch_filters():
from jinja2.filters import FILTERS
from jinja2.asyncfilters import ASYNC_FILTERS
from .filters import FILTERS
from .asyncfilters import ASYNC_FILTERS
FILTERS.update(ASYNC_FILTERS)
@@ -179,7 +176,7 @@ async def auto_await(value):
async def auto_aiter(iterable):
if hasattr(iterable, '__aiter__'):
if hasattr(iterable, "__aiter__"):
async for item in iterable:
yield item
return
@@ -187,70 +184,81 @@ async def auto_aiter(iterable):
yield item
class AsyncLoopContext(LoopContextBase):
def __init__(self, async_iterator, undefined, after, length, recurse=None,
depth0=0):
LoopContextBase.__init__(self, undefined, recurse, depth0)
self._async_iterator = async_iterator
self._after = after
self._length = length
class AsyncLoopContext(LoopContext):
_to_iterator = staticmethod(auto_aiter)
@property
def length(self):
if self._length is None:
raise TypeError('Loop length for some iterators cannot be '
'lazily calculated in async mode')
async def length(self):
if self._length is not None:
return self._length
try:
self._length = len(self._iterable)
except TypeError:
iterable = [x async for x in self._iterator]
self._iterator = self._to_iterator(iterable)
self._length = len(iterable) + self.index + (self._after is not missing)
return self._length
def __aiter__(self):
return AsyncLoopContextIterator(self)
@property
async def revindex0(self):
return await self.length - self.index
@property
async def revindex(self):
return await self.length - self.index0
class AsyncLoopContextIterator(object):
__slots__ = ('context',)
async def _peek_next(self):
if self._after is not missing:
return self._after
def __init__(self, context):
self.context = context
try:
self._after = await self._iterator.__anext__()
except StopAsyncIteration:
self._after = missing
return self._after
@property
async def last(self):
return await self._peek_next() is missing
@property
async def nextitem(self):
rv = await self._peek_next()
if rv is missing:
return self._undefined("there is no next item")
return rv
def __aiter__(self):
return self
async def __anext__(self):
ctx = self.context
ctx.index0 += 1
if ctx._after is _last_iteration:
raise StopAsyncIteration()
ctx._before = ctx._current
ctx._current = ctx._after
try:
ctx._after = await ctx._async_iterator.__anext__()
except StopAsyncIteration:
ctx._after = _last_iteration
return ctx._current, ctx
if self._after is not missing:
rv = self._after
self._after = missing
else:
rv = await self._iterator.__anext__()
self.index0 += 1
self._before = self._current
self._current = rv
return rv, self
async def make_async_loop_context(iterable, undefined, recurse=None, depth0=0):
# Length is more complicated and less efficient in async mode. The
# reason for this is that we cannot know if length will be used
# upfront but because length is a property we cannot lazily execute it
# later. This means that we need to buffer it up and measure :(
#
# We however only do this for actual iterators, not for async
# iterators as blocking here does not seem like the best idea in the
# world.
try:
length = len(iterable)
except (TypeError, AttributeError):
if not hasattr(iterable, '__aiter__'):
iterable = tuple(iterable)
length = len(iterable)
else:
length = None
async_iterator = auto_aiter(iterable)
try:
after = await async_iterator.__anext__()
except StopAsyncIteration:
after = _last_iteration
return AsyncLoopContext(async_iterator, undefined, after, length, recurse,
depth0)
import warnings
warnings.warn(
"This template must be recompiled with at least Jinja 2.11, or"
" it will fail in 3.0.",
DeprecationWarning,
stacklevel=2,
)
return AsyncLoopContext(iterable, undefined, recurse, depth0)
patch_all()


@@ -1,60 +1,37 @@
# -*- coding: utf-8 -*-
"""The optional bytecode cache system. This is useful if you have very
complex template situations and the compilation of all those templates
slows down your application too much.
Situations where this is useful are often forking web applications that
are initialized on the first request.
"""
jinja2.bccache
~~~~~~~~~~~~~~
This module implements the bytecode cache system Jinja is optionally
using. This is useful if you have very complex template situations and
the compiliation of all those templates slow down your application too
much.
Situations where this is useful are often forking web applications that
are initialized on the first request.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD.
"""
from os import path, listdir
import os
import sys
import stat
import errno
import marshal
import tempfile
import fnmatch
import os
import stat
import sys
import tempfile
from hashlib import sha1
from jinja2.utils import open_if_exists
from jinja2._compat import BytesIO, pickle, PY2, text_type
from os import listdir
from os import path
from ._compat import BytesIO
from ._compat import marshal_dump
from ._compat import marshal_load
from ._compat import pickle
from ._compat import text_type
from .utils import open_if_exists
# marshal works better on 3.x, one hack less required
if not PY2:
marshal_dump = marshal.dump
marshal_load = marshal.load
else:
def marshal_dump(code, f):
if isinstance(f, file):
marshal.dump(code, f)
else:
f.write(marshal.dumps(code))
def marshal_load(f):
if isinstance(f, file):
return marshal.load(f)
return marshal.loads(f.read())
bc_version = 3
# magic version used to only change with new jinja versions. With 2.6
# we change this to also take Python version changes into account. The
# reason for this is that Python tends to segfault if fed earlier bytecode
# versions because someone thought it would be a good idea to reuse opcodes
# or make Python incompatible with earlier versions.
bc_magic = 'j2'.encode('ascii') + \
pickle.dumps(bc_version, 2) + \
pickle.dumps((sys.version_info[0] << 24) | sys.version_info[1])
bc_version = 4
# Magic bytes to identify Jinja bytecode cache files. Contains the
# Python major and minor version to avoid loading incompatible bytecode
# if a project upgrades its Python version.
bc_magic = (
b"j2"
+ pickle.dumps(bc_version, 2)
+ pickle.dumps((sys.version_info[0] << 24) | sys.version_info[1], 2)
)
class Bucket(object):
@@ -98,7 +75,7 @@ def load_bytecode(self, f):
def write_bytecode(self, f):
"""Dump the bytecode into the file or file like object passed."""
if self.code is None:
raise TypeError('can\'t write empty bucket')
raise TypeError("can't write empty bucket")
f.write(bc_magic)
pickle.dump(self.checksum, f, 2)
marshal_dump(self.code, f)
@@ -140,7 +117,7 @@ def dump_bytecode(self, bucket):
bucket.write_bytecode(f)
A more advanced version of a filesystem based bytecode cache is part of
Jinja2.
Jinja.
"""
def load_bytecode(self, bucket):
@@ -158,24 +135,24 @@ def dump_bytecode(self, bucket):
raise NotImplementedError()
def clear(self):
"""Clears the cache. This method is not used by Jinja2 but should be
"""Clears the cache. This method is not used by Jinja but should be
implemented to allow applications to clear the bytecode cache used
by a particular environment.
"""
def get_cache_key(self, name, filename=None):
"""Returns the unique hash key for this template name."""
hash = sha1(name.encode('utf-8'))
hash = sha1(name.encode("utf-8"))
if filename is not None:
filename = '|' + filename
filename = "|" + filename
if isinstance(filename, text_type):
filename = filename.encode('utf-8')
filename = filename.encode("utf-8")
hash.update(filename)
return hash.hexdigest()
def get_source_checksum(self, source):
"""Returns a checksum for the source."""
return sha1(source.encode('utf-8')).hexdigest()
return sha1(source.encode("utf-8")).hexdigest()
def get_bucket(self, environment, name, filename, source):
"""Return a cache bucket for the given template. All arguments are
@@ -210,7 +187,7 @@ class FileSystemBytecodeCache(BytecodeCache):
This bytecode cache supports clearing of the cache using the clear method.
"""
def __init__(self, directory=None, pattern='__jinja2_%s.cache'):
def __init__(self, directory=None, pattern="__jinja2_%s.cache"):
if directory is None:
directory = self._get_default_cache_dir()
self.directory = directory
@@ -218,19 +195,21 @@ def __init__(self, directory=None, pattern='__jinja2_%s.cache'):
def _get_default_cache_dir(self):
def _unsafe_dir():
raise RuntimeError('Cannot determine safe temp directory. You '
'need to explicitly provide one.')
raise RuntimeError(
"Cannot determine safe temp directory. You "
"need to explicitly provide one."
)
tmpdir = tempfile.gettempdir()
# On windows the temporary directory is used specific unless
# explicitly forced otherwise. We can just use that.
if os.name == 'nt':
if os.name == "nt":
return tmpdir
if not hasattr(os, 'getuid'):
if not hasattr(os, "getuid"):
_unsafe_dir()
dirname = '_jinja2-cache-%d' % os.getuid()
dirname = "_jinja2-cache-%d" % os.getuid()
actual_dir = os.path.join(tmpdir, dirname)
try:
@@ -241,18 +220,22 @@ def _unsafe_dir():
try:
os.chmod(actual_dir, stat.S_IRWXU)
actual_dir_stat = os.lstat(actual_dir)
if actual_dir_stat.st_uid != os.getuid() \
or not stat.S_ISDIR(actual_dir_stat.st_mode) \
or stat.S_IMODE(actual_dir_stat.st_mode) != stat.S_IRWXU:
if (
actual_dir_stat.st_uid != os.getuid()
or not stat.S_ISDIR(actual_dir_stat.st_mode)
or stat.S_IMODE(actual_dir_stat.st_mode) != stat.S_IRWXU
):
_unsafe_dir()
except OSError as e:
if e.errno != errno.EEXIST:
raise
actual_dir_stat = os.lstat(actual_dir)
if actual_dir_stat.st_uid != os.getuid() \
or not stat.S_ISDIR(actual_dir_stat.st_mode) \
or stat.S_IMODE(actual_dir_stat.st_mode) != stat.S_IRWXU:
if (
actual_dir_stat.st_uid != os.getuid()
or not stat.S_ISDIR(actual_dir_stat.st_mode)
or stat.S_IMODE(actual_dir_stat.st_mode) != stat.S_IRWXU
):
_unsafe_dir()
return actual_dir
@@ -261,7 +244,7 @@ def _get_cache_filename(self, bucket):
return path.join(self.directory, self.pattern % bucket.key)
def load_bytecode(self, bucket):
f = open_if_exists(self._get_cache_filename(bucket), 'rb')
f = open_if_exists(self._get_cache_filename(bucket), "rb")
if f is not None:
try:
bucket.load_bytecode(f)
@@ -269,7 +252,7 @@ def load_bytecode(self, bucket):
f.close()
def dump_bytecode(self, bucket):
f = open(self._get_cache_filename(bucket), 'wb')
f = open(self._get_cache_filename(bucket), "wb")
try:
bucket.write_bytecode(f)
finally:
@@ -280,7 +263,8 @@ def clear(self):
# write access on the file system and the function does not exist
# normally.
from os import remove
files = fnmatch.filter(listdir(self.directory), self.pattern % '*')
files = fnmatch.filter(listdir(self.directory), self.pattern % "*")
for filename in files:
try:
remove(path.join(self.directory, filename))
@@ -296,9 +280,8 @@ class MemcachedBytecodeCache(BytecodeCache):
Libraries compatible with this class:
- `werkzeug <http://werkzeug.pocoo.org/>`_.contrib.cache
- `python-memcached <https://www.tummy.com/Community/software/python-memcached/>`_
- `cmemcache <http://gijsbert.org/cmemcache/>`_
- `cachelib <https://github.com/pallets/cachelib>`_
- `python-memcached <https://pypi.org/project/python-memcached/>`_
(Unfortunately the django cache interface is not compatible because it
does not support storing binary data, only unicode. You can however pass
@@ -334,8 +317,13 @@ class MemcachedBytecodeCache(BytecodeCache):
`ignore_memcache_errors` parameter.
"""
def __init__(self, client, prefix='jinja2/bytecode/', timeout=None,
ignore_memcache_errors=True):
def __init__(
self,
client,
prefix="jinja2/bytecode/",
timeout=None,
ignore_memcache_errors=True,
):
self.client = client
self.prefix = prefix
self.timeout = timeout

File diff suppressed because it is too large.


@@ -1,17 +1,6 @@
# -*- coding: utf-8 -*-
"""
jinja.constants
~~~~~~~~~~~~~~~
Various constants.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD, see LICENSE for more details.
"""
#: list of lorem ipsum words used by the lipsum() helper function
LOREM_IPSUM_WORDS = u'''\
LOREM_IPSUM_WORDS = u"""\
a ac accumsan ad adipiscing aenean aliquam aliquet amet ante aptent arcu at
auctor augue bibendum blandit class commodo condimentum congue consectetuer
consequat conubia convallis cras cubilia cum curabitur curae cursus dapibus
@@ -29,4 +18,4 @@
sociis sociosqu sodales sollicitudin suscipit suspendisse taciti tellus tempor
tempus tincidunt torquent tortor tristique turpis ullamcorper ultrices
ultricies urna ut varius vehicula vel velit venenatis vestibulum vitae vivamus
viverra volutpat vulputate'''
viverra volutpat vulputate"""


@@ -1,372 +1,268 @@
# -*- coding: utf-8 -*-
"""
jinja2.debug
~~~~~~~~~~~~
Implements the debug interface for Jinja. This module does some pretty
ugly stuff with the Python traceback system in order to achieve tracebacks
with correct line numbers, locals and contents.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD, see LICENSE for more details.
"""
import sys
import traceback
from types import TracebackType, CodeType
from jinja2.utils import missing, internal_code
from jinja2.exceptions import TemplateSyntaxError
from jinja2._compat import iteritems, reraise, PY2
from types import CodeType
# on pypy we can take advantage of transparent proxies
try:
from __pypy__ import tproxy
except ImportError:
tproxy = None
from . import TemplateSyntaxError
from ._compat import PYPY
from .utils import internal_code
from .utils import missing
# how does the raise helper look like?
try:
exec("raise TypeError, 'foo'")
except SyntaxError:
raise_helper = 'raise __jinja_exception__[1]'
except TypeError:
raise_helper = 'raise __jinja_exception__[0], __jinja_exception__[1]'
def rewrite_traceback_stack(source=None):
"""Rewrite the current exception to replace any tracebacks from
within compiled template code with tracebacks that look like they
came from the template source.
This must be called within an ``except`` block.
class TracebackFrameProxy(object):
"""Proxies a traceback frame."""
def __init__(self, tb):
self.tb = tb
self._tb_next = None
@property
def tb_next(self):
return self._tb_next
def set_next(self, next):
if tb_set_next is not None:
try:
tb_set_next(self.tb, next and next.tb or None)
except Exception:
# this function can fail due to all the hackery it does
# on various python implementations. We just catch errors
# down and ignore them if necessary.
pass
self._tb_next = next
@property
def is_jinja_frame(self):
return '__jinja_template__' in self.tb.tb_frame.f_globals
def __getattr__(self, name):
return getattr(self.tb, name)
def make_frame_proxy(frame):
proxy = TracebackFrameProxy(frame)
if tproxy is None:
return proxy
def operation_handler(operation, *args, **kwargs):
if operation in ('__getattribute__', '__getattr__'):
return getattr(proxy, args[0])
elif operation == '__setattr__':
proxy.__setattr__(*args, **kwargs)
else:
return getattr(proxy, operation)(*args, **kwargs)
return tproxy(TracebackType, operation_handler)
class ProcessedTraceback(object):
"""Holds a Jinja preprocessed traceback for printing or reraising."""
def __init__(self, exc_type, exc_value, frames):
assert frames, 'no frames for this traceback?'
self.exc_type = exc_type
self.exc_value = exc_value
self.frames = frames
# newly concatenate the frames (which are proxies)
prev_tb = None
for tb in self.frames:
if prev_tb is not None:
prev_tb.set_next(tb)
prev_tb = tb
prev_tb.set_next(None)
def render_as_text(self, limit=None):
"""Return a string with the traceback."""
lines = traceback.format_exception(self.exc_type, self.exc_value,
self.frames[0], limit=limit)
return ''.join(lines).rstrip()
def render_as_html(self, full=False):
"""Return a unicode string with the traceback as rendered HTML."""
from jinja2.debugrenderer import render_traceback
return u'%s\n\n<!--\n%s\n-->' % (
render_traceback(self, full=full),
self.render_as_text().decode('utf-8', 'replace')
)
@property
def is_template_syntax_error(self):
"""`True` if this is a template syntax error."""
return isinstance(self.exc_value, TemplateSyntaxError)
@property
def exc_info(self):
"""Exception info tuple with a proxy around the frame objects."""
return self.exc_type, self.exc_value, self.frames[0]
@property
def standard_exc_info(self):
"""Standard python exc_info for re-raising"""
tb = self.frames[0]
# the frame will be an actual traceback (or transparent proxy) if
# we are on pypy or a python implementation with support for tproxy
if type(tb) is not TracebackType:
tb = tb.tb
return self.exc_type, self.exc_value, tb
def make_traceback(exc_info, source_hint=None):
"""Creates a processed traceback object from the exc_info."""
exc_type, exc_value, tb = exc_info
if isinstance(exc_value, TemplateSyntaxError):
exc_info = translate_syntax_error(exc_value, source_hint)
initial_skip = 0
else:
initial_skip = 1
return translate_exception(exc_info, initial_skip)
def translate_syntax_error(error, source=None):
"""Rewrites a syntax error to please traceback systems."""
error.source = source
error.translated = True
exc_info = (error.__class__, error, None)
filename = error.filename
if filename is None:
filename = '<unknown>'
return fake_exc_info(exc_info, filename, error.lineno)
def translate_exception(exc_info, initial_skip=0):
"""If passed an exc_info it will automatically rewrite the exceptions
all the way down to the correct line numbers and frames.
:param exc_info: A :meth:`sys.exc_info` tuple. If not provided,
the current ``exc_info`` is used.
:param source: For ``TemplateSyntaxError``, the original source if
known.
:return: A :meth:`sys.exc_info` tuple that can be re-raised.
"""
tb = exc_info[2]
frames = []
exc_type, exc_value, tb = sys.exc_info()
# skip some internal frames if wanted
for x in range(initial_skip):
if tb is not None:
tb = tb.tb_next
initial_tb = tb
if isinstance(exc_value, TemplateSyntaxError) and not exc_value.translated:
exc_value.translated = True
exc_value.source = source
try:
# Remove the old traceback on Python 3, otherwise the frames
# from the compiler still show up.
exc_value.with_traceback(None)
except AttributeError:
pass
# Outside of runtime, so the frame isn't executing template
# code, but it still needs to point at the template.
tb = fake_traceback(
exc_value, None, exc_value.filename or "<unknown>", exc_value.lineno
)
else:
# Skip the frame for the render function.
tb = tb.tb_next
stack = []
# Build the stack of traceback object, replacing any in template
# code with the source file and line information.
while tb is not None:
# skip frames decorated with @internalcode. These are internal
# calls we can't avoid and that are useless in template debugging
# output.
# Skip frames decorated with @internalcode. These are internal
# calls that aren't useful in template debugging output.
if tb.tb_frame.f_code in internal_code:
tb = tb.tb_next
continue
# save a reference to the next frame if we override the current
# one with a faked one.
next = tb.tb_next
template = tb.tb_frame.f_globals.get("__jinja_template__")
# fake template exceptions
template = tb.tb_frame.f_globals.get('__jinja_template__')
if template is not None:
lineno = template.get_corresponding_lineno(tb.tb_lineno)
tb = fake_exc_info(exc_info[:2] + (tb,), template.filename,
lineno)[2]
fake_tb = fake_traceback(exc_value, tb, template.filename, lineno)
stack.append(fake_tb)
else:
stack.append(tb)
frames.append(make_frame_proxy(tb))
tb = next
tb = tb.tb_next
# if we don't have any exceptions in the frames left, we have to
# reraise it unchanged.
# XXX: can we backup here? when could this happen?
if not frames:
reraise(exc_info[0], exc_info[1], exc_info[2])
tb_next = None
return ProcessedTraceback(exc_info[0], exc_info[1], frames)
# Assign tb_next in reverse to avoid circular references.
for tb in reversed(stack):
tb_next = tb_set_next(tb, tb_next)
return exc_type, exc_value, tb_next
def get_jinja_locals(real_locals):
ctx = real_locals.get('context')
if ctx:
locals = ctx.get_all().copy()
def fake_traceback(exc_value, tb, filename, lineno):
"""Produce a new traceback object that looks like it came from the
template source instead of the compiled code. The filename, line
number, and location name will point to the template, and the local
variables will be the current template context.
:param exc_value: The original exception to be re-raised to create
the new traceback.
:param tb: The original traceback to get the local variables and
code info from.
:param filename: The template filename.
:param lineno: The line number in the template source.
"""
if tb is not None:
# Replace the real locals with the context that would be
# available at that point in the template.
locals = get_template_locals(tb.tb_frame.f_locals)
locals.pop("__jinja_exception__", None)
else:
locals = {}
globals = {
"__name__": filename,
"__file__": filename,
"__jinja_exception__": exc_value,
}
# Raise an exception at the correct line number.
code = compile("\n" * (lineno - 1) + "raise __jinja_exception__", filename, "exec")
# Build a new code object that points to the template file and
# replaces the location with a block name.
try:
location = "template"
if tb is not None:
function = tb.tb_frame.f_code.co_name
if function == "root":
location = "top-level template code"
elif function.startswith("block_"):
location = 'block "%s"' % function[6:]
# Collect arguments for the new code object. CodeType only
# accepts positional arguments, and arguments were inserted in
# new Python versions.
code_args = []
for attr in (
"argcount",
"posonlyargcount", # Python 3.8
"kwonlyargcount", # Python 3
"nlocals",
"stacksize",
"flags",
"code", # codestring
"consts", # constants
"names",
"varnames",
("filename", filename),
("name", location),
"firstlineno",
"lnotab",
"freevars",
"cellvars",
):
if isinstance(attr, tuple):
# Replace with given value.
code_args.append(attr[1])
continue
try:
# Copy original value if it exists.
code_args.append(getattr(code, "co_" + attr))
except AttributeError:
# Some arguments were added later.
continue
code = CodeType(*code_args)
except Exception:
# Some environments such as Google App Engine don't support
# modifying code objects.
pass
# Execute the new code, which is guaranteed to raise, and return
# the new traceback without this frame.
try:
exec(code, globals, locals)
except BaseException:
return sys.exc_info()[2].tb_next
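For illustration, the helper can also be exercised directly (the template name and line number below are hypothetical); the returned traceback frame reports the template location instead of this module:

from jinja2.debug import fake_traceback

try:
    raise ValueError("boom")
except ValueError as exc:
    tb = fake_traceback(exc, None, "index.html", 3)

assert tb.tb_lineno == 3
assert tb.tb_frame.f_code.co_filename == "index.html"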
def get_template_locals(real_locals):
    """Based on the runtime locals, get the context that would be
    available at that point in the template.
    """
    # Start with the current template context.
    ctx = real_locals.get("context")

    if ctx:
        data = ctx.get_all().copy()
    else:
        data = {}

    # Might be in a derived context that only sets local variables
    # rather than pushing a context. Local variables follow the scheme
    # l_depth_name. Find the highest-depth local that has a value for
    # each name.
    local_overrides = {}

    for name, value in real_locals.items():
        if not name.startswith("l_") or value is missing:
            # Not a template variable, or no longer relevant.
            continue

        try:
            _, depth, name = name.split("_", 2)
            depth = int(depth)
        except ValueError:
            continue

        cur_depth = local_overrides.get(name, (-1,))[0]

        if cur_depth < depth:
            local_overrides[name] = (depth, value)

    # Modify the context with any derived context.
    for name, (_, value) in local_overrides.items():
        if value is missing:
            data.pop(name, None)
        else:
            data[name] = value

    return data
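For example (hypothetical locals; `missing` is the sentinel used above), the highest-depth `l_<depth>_<name>` entry wins and `missing` values are dropped:

from jinja2.debug import get_template_locals
from jinja2.utils import missing

locals = get_template_locals(
    {"l_0_foo": 42, "l_1_foo": 23, "l_2_bar": missing, "not_a_var": 1}
)
assert locals == {"foo": 23}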
def fake_exc_info(exc_info, filename, lineno):
    """Helper for `translate_exception`."""
    exc_type, exc_value, tb = exc_info

    # figure the real context out
    if tb is not None:
        locals = get_jinja_locals(tb.tb_frame.f_locals)

        # if there is a local called __jinja_exception__, we get
        # rid of it to not break the debug functionality.
        locals.pop('__jinja_exception__', None)
    else:
        locals = {}

    # assamble fake globals we need
    globals = {
        '__name__': filename,
        '__file__': filename,
        '__jinja_exception__': exc_info[:2],

        # we don't want to keep the reference to the template around
        # to not cause circular dependencies, but we mark it as Jinja
        # frame for the ProcessedTraceback
        '__jinja_template__': None
    }

    # and fake the exception
    code = compile('\n' * (lineno - 1) + raise_helper, filename, 'exec')

    # if it's possible, change the name of the code. This won't work
    # on some python environments such as google appengine
    try:
        if tb is None:
            location = 'template'
        else:
            function = tb.tb_frame.f_code.co_name
            if function == 'root':
                location = 'top-level template code'
            elif function.startswith('block_'):
                location = 'block "%s"' % function[6:]
            else:
                location = 'template'

        if PY2:
            code = CodeType(0, code.co_nlocals, code.co_stacksize,
                            code.co_flags, code.co_code, code.co_consts,
                            code.co_names, code.co_varnames, filename,
                            location, code.co_firstlineno,
                            code.co_lnotab, (), ())
        else:
            code = CodeType(0, code.co_kwonlyargcount,
                            code.co_nlocals, code.co_stacksize,
                            code.co_flags, code.co_code, code.co_consts,
                            code.co_names, code.co_varnames, filename,
                            location, code.co_firstlineno,
                            code.co_lnotab, (), ())
    except Exception as e:
        pass

    # execute the code and catch the new traceback
    try:
        exec(code, globals, locals)
    except:
        exc_info = sys.exc_info()
        new_tb = exc_info[2].tb_next

    # return without this frame
    return exc_info[:2] + (new_tb,)


def _init_ugly_crap():
    """This function implements a few ugly things so that we can patch the
    traceback objects. The function returned allows resetting `tb_next` on
    any python traceback object. Do not attempt to use this on non cpython
    interpreters
    """
    import ctypes
    from types import TracebackType

    if PY2:
        # figure out size of _Py_ssize_t for Python 2:
        if hasattr(ctypes.pythonapi, 'Py_InitModule4_64'):
            _Py_ssize_t = ctypes.c_int64
        else:
            _Py_ssize_t = ctypes.c_int
    else:
        # platform ssize_t on Python 3
        _Py_ssize_t = ctypes.c_ssize_t

    # regular python
    class _PyObject(ctypes.Structure):
        pass
    _PyObject._fields_ = [
        ('ob_refcnt', _Py_ssize_t),
        ('ob_type', ctypes.POINTER(_PyObject))
    ]

    # python with trace
    if hasattr(sys, 'getobjects'):
        class _PyObject(ctypes.Structure):
            pass
        _PyObject._fields_ = [
            ('_ob_next', ctypes.POINTER(_PyObject)),
            ('_ob_prev', ctypes.POINTER(_PyObject)),
            ('ob_refcnt', _Py_ssize_t),
            ('ob_type', ctypes.POINTER(_PyObject))
        ]

    class _Traceback(_PyObject):
        pass
    _Traceback._fields_ = [
        ('tb_next', ctypes.POINTER(_Traceback)),
        ('tb_frame', ctypes.POINTER(_PyObject)),
        ('tb_lasti', ctypes.c_int),
        ('tb_lineno', ctypes.c_int)
    ]

    def tb_set_next(tb, next):
        """Set the tb_next attribute of a traceback object."""
        if not (isinstance(tb, TracebackType) and
                (next is None or isinstance(next, TracebackType))):
            raise TypeError('tb_set_next arguments must be traceback objects')
        obj = _Traceback.from_address(id(tb))
        if tb.tb_next is not None:
            old = _Traceback.from_address(id(tb.tb_next))
            old.ob_refcnt -= 1
        if next is None:
            obj.tb_next = ctypes.POINTER(_Traceback)()
        else:
            next = _Traceback.from_address(id(next))
            next.ob_refcnt += 1
            obj.tb_next = ctypes.pointer(next)

    return tb_set_next


# try to get a tb_set_next implementation if we don't have transparent
# proxies.
tb_set_next = None
if tproxy is None:
    try:
        tb_set_next = _init_ugly_crap()
    except:
        pass
    del _init_ugly_crap


if sys.version_info >= (3, 7):
    # tb_next is directly assignable as of Python 3.7
    def tb_set_next(tb, tb_next):
        tb.tb_next = tb_next
        return tb


elif PYPY:
    # PyPy might have special support, and won't work with ctypes.
    try:
        import tputil
    except ImportError:
        # Without tproxy support, use the original traceback.
        def tb_set_next(tb, tb_next):
            return tb

    else:
        # With tproxy support, create a proxy around the traceback that
        # returns the new tb_next.
        def tb_set_next(tb, tb_next):
            def controller(op):
                if op.opname == "__getattribute__" and op.args[0] == "tb_next":
                    return tb_next

                return op.delegate()

            return tputil.make_proxy(controller, obj=tb)


else:
    # Use ctypes to assign tb_next at the C level since it's read-only
    # from Python.
    import ctypes

    class _CTraceback(ctypes.Structure):
        _fields_ = [
            # Extra PyObject slots when compiled with Py_TRACE_REFS.
            ("PyObject_HEAD", ctypes.c_byte * object().__sizeof__()),
            # Only care about tb_next as an object, not a traceback.
            ("tb_next", ctypes.py_object),
        ]

    def tb_set_next(tb, tb_next):
        c_tb = _CTraceback.from_address(id(tb))

        # Clear out the old tb_next.
        if tb.tb_next is not None:
            c_tb_next = ctypes.py_object(tb.tb_next)
            c_tb.tb_next = ctypes.py_object()
            ctypes.pythonapi.Py_DecRef(c_tb_next)

        # Assign the new tb_next.
        if tb_next is not None:
            c_tb_next = ctypes.py_object(tb_next)
            ctypes.pythonapi.Py_IncRef(c_tb_next)
            c_tb.tb_next = c_tb_next

        return tb


@@ -1,56 +1,44 @@
# -*- coding: utf-8 -*-
"""
jinja2.defaults
~~~~~~~~~~~~~~~
Jinja default filters and tags.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD, see LICENSE for more details.
"""
from jinja2._compat import range_type
from jinja2.utils import generate_lorem_ipsum, Cycler, Joiner, Namespace
from ._compat import range_type
from .filters import FILTERS as DEFAULT_FILTERS # noqa: F401
from .tests import TESTS as DEFAULT_TESTS # noqa: F401
from .utils import Cycler
from .utils import generate_lorem_ipsum
from .utils import Joiner
from .utils import Namespace
# defaults for the parser / lexer
BLOCK_START_STRING = '{%'
BLOCK_END_STRING = '%}'
VARIABLE_START_STRING = '{{'
VARIABLE_END_STRING = '}}'
COMMENT_START_STRING = '{#'
COMMENT_END_STRING = '#}'
BLOCK_START_STRING = "{%"
BLOCK_END_STRING = "%}"
VARIABLE_START_STRING = "{{"
VARIABLE_END_STRING = "}}"
COMMENT_START_STRING = "{#"
COMMENT_END_STRING = "#}"
LINE_STATEMENT_PREFIX = None
LINE_COMMENT_PREFIX = None
TRIM_BLOCKS = False
LSTRIP_BLOCKS = False
NEWLINE_SEQUENCE = '\n'
NEWLINE_SEQUENCE = "\n"
KEEP_TRAILING_NEWLINE = False
# default filters, tests and namespace
from jinja2.filters import FILTERS as DEFAULT_FILTERS
from jinja2.tests import TESTS as DEFAULT_TESTS
DEFAULT_NAMESPACE = {
'range': range_type,
'dict': dict,
'lipsum': generate_lorem_ipsum,
'cycler': Cycler,
'joiner': Joiner,
'namespace': Namespace
}
DEFAULT_NAMESPACE = {
"range": range_type,
"dict": dict,
"lipsum": generate_lorem_ipsum,
"cycler": Cycler,
"joiner": Joiner,
"namespace": Namespace,
}
# default policies
DEFAULT_POLICIES = {
'compiler.ascii_str': True,
'urlize.rel': 'noopener',
'urlize.target': None,
'truncate.leeway': 5,
'json.dumps_function': None,
'json.dumps_kwargs': {'sort_keys': True},
'ext.i18n.trimmed': False,
"compiler.ascii_str": True,
"urlize.rel": "noopener",
"urlize.target": None,
"truncate.leeway": 5,
"json.dumps_function": None,
"json.dumps_kwargs": {"sort_keys": True},
"ext.i18n.trimmed": False,
}
# export all constants
__all__ = tuple(x for x in locals().keys() if x.isupper())
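A small usage sketch (standard behaviour, not specific to this change): each Environment copies these defaults into Environment.policies, so they can be tuned per instance:

from jinja2 import Environment

env = Environment()
env.policies["truncate.leeway"] = 0        # no slack for the truncate filter
env.policies["ext.i18n.trimmed"] = True    # trim {% trans %} blocks by default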


@@ -1,60 +1,83 @@
# -*- coding: utf-8 -*-
"""
jinja2.environment
~~~~~~~~~~~~~~~~~~
Provides a class that holds runtime and parsing time options.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD, see LICENSE for more details.
"""Classes for managing templates and their runtime and compile time
options.
"""
import os
import sys
import weakref
from functools import reduce, partial
from jinja2 import nodes
from jinja2.defaults import BLOCK_START_STRING, \
BLOCK_END_STRING, VARIABLE_START_STRING, VARIABLE_END_STRING, \
COMMENT_START_STRING, COMMENT_END_STRING, LINE_STATEMENT_PREFIX, \
LINE_COMMENT_PREFIX, TRIM_BLOCKS, NEWLINE_SEQUENCE, \
DEFAULT_FILTERS, DEFAULT_TESTS, DEFAULT_NAMESPACE, \
DEFAULT_POLICIES, KEEP_TRAILING_NEWLINE, LSTRIP_BLOCKS
from jinja2.lexer import get_lexer, TokenStream
from jinja2.parser import Parser
from jinja2.nodes import EvalContext
from jinja2.compiler import generate, CodeGenerator
from jinja2.runtime import Undefined, new_context, Context
from jinja2.exceptions import TemplateSyntaxError, TemplateNotFound, \
TemplatesNotFound, TemplateRuntimeError
from jinja2.utils import import_string, LRUCache, Markup, missing, \
concat, consume, internalcode, have_async_gen
from jinja2._compat import imap, ifilter, string_types, iteritems, \
text_type, reraise, implements_iterator, implements_to_string, \
encode_filename, PY2, PYPY
from functools import partial
from functools import reduce
from markupsafe import Markup
from . import nodes
from ._compat import encode_filename
from ._compat import implements_iterator
from ._compat import implements_to_string
from ._compat import iteritems
from ._compat import PY2
from ._compat import PYPY
from ._compat import reraise
from ._compat import string_types
from ._compat import text_type
from .compiler import CodeGenerator
from .compiler import generate
from .defaults import BLOCK_END_STRING
from .defaults import BLOCK_START_STRING
from .defaults import COMMENT_END_STRING
from .defaults import COMMENT_START_STRING
from .defaults import DEFAULT_FILTERS
from .defaults import DEFAULT_NAMESPACE
from .defaults import DEFAULT_POLICIES
from .defaults import DEFAULT_TESTS
from .defaults import KEEP_TRAILING_NEWLINE
from .defaults import LINE_COMMENT_PREFIX
from .defaults import LINE_STATEMENT_PREFIX
from .defaults import LSTRIP_BLOCKS
from .defaults import NEWLINE_SEQUENCE
from .defaults import TRIM_BLOCKS
from .defaults import VARIABLE_END_STRING
from .defaults import VARIABLE_START_STRING
from .exceptions import TemplateNotFound
from .exceptions import TemplateRuntimeError
from .exceptions import TemplatesNotFound
from .exceptions import TemplateSyntaxError
from .exceptions import UndefinedError
from .lexer import get_lexer
from .lexer import TokenStream
from .nodes import EvalContext
from .parser import Parser
from .runtime import Context
from .runtime import new_context
from .runtime import Undefined
from .utils import concat
from .utils import consume
from .utils import have_async_gen
from .utils import import_string
from .utils import internalcode
from .utils import LRUCache
from .utils import missing
# for direct template usage we have up to ten living environments
_spontaneous_environments = LRUCache(10)
# the function to create jinja traceback objects. This is dynamically
# imported on the first exception in the exception handler.
_make_traceback = None
def get_spontaneous_environment(*args):
    """Return a new spontaneous environment. A spontaneous environment is an
    unnamed and unaccessible (in theory) environment that is used for
    templates generated from a string and not from the file system.
    """
    try:
        env = _spontaneous_environments.get(args)
    except TypeError:
        return Environment(*args)
    if env is not None:
        return env
    _spontaneous_environments[args] = env = Environment(*args)
    env.shared = True
    return env


def get_spontaneous_environment(cls, *args):
    """Return a new spontaneous environment. A spontaneous environment
    is used for templates created directly rather than through an
    existing environment.

    :param cls: Environment class to create.
    :param args: Positional arguments passed to environment.
    """
    key = (cls, args)

    try:
        return _spontaneous_environments[key]
    except KeyError:
        _spontaneous_environments[key] = env = cls(*args)
        env.shared = True
        return env
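The practical effect, sketched (assuming the keyed cache above): templates created directly from strings with identical settings share one cached, shared environment:

from jinja2 import Template

t1 = Template("Hello {{ name }}!")
t2 = Template("Bye {{ name }}!")

# Same settings -> same spontaneous environment, flagged as shared.
assert t1.environment is t2.environment
assert t1.environment.shared is True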
def create_cache(size):
@@ -93,20 +116,25 @@ def fail_for_missing_callable(string, name):
try:
name._fail_with_undefined_error()
except Exception as e:
msg = '%s (%s; did you forget to quote the callable name?)' % (msg, e)
msg = "%s (%s; did you forget to quote the callable name?)" % (msg, e)
raise TemplateRuntimeError(msg)
def _environment_sanity_check(environment):
"""Perform a sanity check on the environment."""
assert issubclass(environment.undefined, Undefined), 'undefined must ' \
'be a subclass of undefined because filters depend on it.'
assert environment.block_start_string != \
environment.variable_start_string != \
environment.comment_start_string, 'block, variable and comment ' \
'start strings must be different'
assert environment.newline_sequence in ('\r', '\r\n', '\n'), \
'newline_sequence set to unknown line ending string.'
assert issubclass(
environment.undefined, Undefined
), "undefined must be a subclass of undefined because filters depend on it."
assert (
environment.block_start_string
!= environment.variable_start_string
!= environment.comment_start_string
), "block, variable and comment start strings must be different"
assert environment.newline_sequence in (
"\r",
"\r\n",
"\n",
), "newline_sequence set to unknown line ending string."
return environment
@@ -191,7 +219,7 @@ class Environment(object):
`autoescape`
If set to ``True`` the XML/HTML autoescaping feature is enabled by
default. For more details about autoescaping see
:class:`~jinja2.utils.Markup`. As of Jinja 2.4 this can also
:class:`~markupsafe.Markup`. As of Jinja 2.4 this can also
be a callable that is passed the template name and has to
return ``True`` or ``False`` depending on autoescape should be
enabled by default.
@@ -249,10 +277,6 @@ class Environment(object):
#: must not be modified
shared = False
#: these are currently EXPERIMENTAL undocumented features.
exception_handler = None
exception_formatter = None
#: the class that is used for code generation. See
#: :class:`~jinja2.compiler.CodeGenerator` for more information.
code_generator_class = CodeGenerator
@@ -261,29 +285,31 @@ class Environment(object):
#: :class:`~jinja2.runtime.Context` for more information.
context_class = Context
def __init__(self,
block_start_string=BLOCK_START_STRING,
block_end_string=BLOCK_END_STRING,
variable_start_string=VARIABLE_START_STRING,
variable_end_string=VARIABLE_END_STRING,
comment_start_string=COMMENT_START_STRING,
comment_end_string=COMMENT_END_STRING,
line_statement_prefix=LINE_STATEMENT_PREFIX,
line_comment_prefix=LINE_COMMENT_PREFIX,
trim_blocks=TRIM_BLOCKS,
lstrip_blocks=LSTRIP_BLOCKS,
newline_sequence=NEWLINE_SEQUENCE,
keep_trailing_newline=KEEP_TRAILING_NEWLINE,
extensions=(),
optimized=True,
undefined=Undefined,
finalize=None,
autoescape=False,
loader=None,
cache_size=400,
auto_reload=True,
bytecode_cache=None,
enable_async=False):
def __init__(
self,
block_start_string=BLOCK_START_STRING,
block_end_string=BLOCK_END_STRING,
variable_start_string=VARIABLE_START_STRING,
variable_end_string=VARIABLE_END_STRING,
comment_start_string=COMMENT_START_STRING,
comment_end_string=COMMENT_END_STRING,
line_statement_prefix=LINE_STATEMENT_PREFIX,
line_comment_prefix=LINE_COMMENT_PREFIX,
trim_blocks=TRIM_BLOCKS,
lstrip_blocks=LSTRIP_BLOCKS,
newline_sequence=NEWLINE_SEQUENCE,
keep_trailing_newline=KEEP_TRAILING_NEWLINE,
extensions=(),
optimized=True,
undefined=Undefined,
finalize=None,
autoescape=False,
loader=None,
cache_size=400,
auto_reload=True,
bytecode_cache=None,
enable_async=False,
):
# !!Important notice!!
# The constructor accepts quite a few arguments that should be
# passed by keyword rather than position. However it's important to
@@ -334,6 +360,9 @@ def __init__(self,
self.enable_async = enable_async
self.is_async = self.enable_async and have_async_gen
if self.is_async:
# runs patch_all() to enable async support
from . import asyncsupport # noqa: F401
_environment_sanity_check(self)
@@ -353,15 +382,28 @@ def extend(self, **attributes):
if not hasattr(self, key):
setattr(self, key, value)
def overlay(self, block_start_string=missing, block_end_string=missing,
variable_start_string=missing, variable_end_string=missing,
comment_start_string=missing, comment_end_string=missing,
line_statement_prefix=missing, line_comment_prefix=missing,
trim_blocks=missing, lstrip_blocks=missing,
extensions=missing, optimized=missing,
undefined=missing, finalize=missing, autoescape=missing,
loader=missing, cache_size=missing, auto_reload=missing,
bytecode_cache=missing):
def overlay(
self,
block_start_string=missing,
block_end_string=missing,
variable_start_string=missing,
variable_end_string=missing,
comment_start_string=missing,
comment_end_string=missing,
line_statement_prefix=missing,
line_comment_prefix=missing,
trim_blocks=missing,
lstrip_blocks=missing,
extensions=missing,
optimized=missing,
undefined=missing,
finalize=missing,
autoescape=missing,
loader=missing,
cache_size=missing,
auto_reload=missing,
bytecode_cache=missing,
):
"""Create a new overlay environment that shares all the data with the
current environment except for cache and the overridden attributes.
Extensions cannot be removed for an overlayed environment. An overlayed
@@ -374,7 +416,7 @@ def overlay(self, block_start_string=missing, block_end_string=missing,
through.
"""
args = dict(locals())
del args['self'], args['cache_size'], args['extensions']
del args["self"], args["cache_size"], args["extensions"]
rv = object.__new__(self.__class__)
rv.__dict__.update(self.__dict__)
@@ -402,8 +444,7 @@ def overlay(self, block_start_string=missing, block_end_string=missing,
def iter_extensions(self):
"""Iterates over the extensions by priority."""
return iter(sorted(self.extensions.values(),
key=lambda x: x.priority))
return iter(sorted(self.extensions.values(), key=lambda x: x.priority))
def getitem(self, obj, argument):
"""Get an item or attribute of an object but prefer the item."""
@@ -435,8 +476,9 @@ def getattr(self, obj, attribute):
except (TypeError, LookupError, AttributeError):
return self.undefined(obj=obj, name=attribute)
def call_filter(self, name, value, args=None, kwargs=None,
context=None, eval_ctx=None):
def call_filter(
self, name, value, args=None, kwargs=None, context=None, eval_ctx=None
):
"""Invokes a filter on a value the same way the compiler does it.
Note that on Python 3 this might return a coroutine in case the
@@ -448,21 +490,22 @@ def call_filter(self, name, value, args=None, kwargs=None,
"""
func = self.filters.get(name)
if func is None:
fail_for_missing_callable('no filter named %r', name)
fail_for_missing_callable("no filter named %r", name)
args = [value] + list(args or ())
if getattr(func, 'contextfilter', False):
if getattr(func, "contextfilter", False) is True:
if context is None:
raise TemplateRuntimeError('Attempted to invoke context '
'filter without context')
raise TemplateRuntimeError(
"Attempted to invoke context filter without context"
)
args.insert(0, context)
elif getattr(func, 'evalcontextfilter', False):
elif getattr(func, "evalcontextfilter", False) is True:
if eval_ctx is None:
if context is not None:
eval_ctx = context.eval_ctx
else:
eval_ctx = EvalContext(self)
args.insert(0, eval_ctx)
elif getattr(func, 'environmentfilter', False):
elif getattr(func, "environmentfilter", False) is True:
args.insert(0, self)
return func(*args, **(kwargs or {}))
@@ -473,7 +516,7 @@ def call_test(self, name, value, args=None, kwargs=None):
"""
func = self.tests.get(name)
if func is None:
fail_for_missing_callable('no test named %r', name)
fail_for_missing_callable("no test named %r", name)
return func(value, *(args or ()), **(kwargs or {}))
@internalcode
@@ -483,14 +526,13 @@ def parse(self, source, name=None, filename=None):
executable source- or bytecode. This is useful for debugging or to
extract information from templates.
If you are :ref:`developing Jinja2 extensions <writing-extensions>`
If you are :ref:`developing Jinja extensions <writing-extensions>`
this gives you a good overview of the node tree generated.
"""
try:
return self._parse(source, name, filename)
except TemplateSyntaxError:
exc_info = sys.exc_info()
self.handle_exception(exc_info, source_hint=source)
self.handle_exception(source=source)
def _parse(self, source, name, filename):
"""Internal parsing function used by `parse` and `compile`."""
@@ -510,16 +552,18 @@ def lex(self, source, name=None, filename=None):
try:
return self.lexer.tokeniter(source, name, filename)
except TemplateSyntaxError:
exc_info = sys.exc_info()
self.handle_exception(exc_info, source_hint=source)
self.handle_exception(source=source)
def preprocess(self, source, name=None, filename=None):
"""Preprocesses the source with all extensions. This is automatically
called for all parsing and compiling methods but *not* for :meth:`lex`
because there you usually only want the actual source tokenized.
"""
return reduce(lambda s, e: e.preprocess(s, name, filename),
self.iter_extensions(), text_type(source))
return reduce(
lambda s, e: e.preprocess(s, name, filename),
self.iter_extensions(),
text_type(source),
)
def _tokenize(self, source, name, filename=None, state=None):
"""Called by the parser to do the preprocessing and filtering
@@ -539,8 +583,14 @@ def _generate(self, source, name, filename, defer_init=False):
.. versionadded:: 2.5
"""
return generate(source, self, name, filename, defer_init=defer_init,
optimized=self.optimized)
return generate(
source,
self,
name,
filename,
defer_init=defer_init,
optimized=self.optimized,
)
def _compile(self, source, filename):
"""Internal hook that can be overridden to hook a different compile
@@ -548,11 +598,10 @@ def _compile(self, source, filename):
.. versionadded:: 2.5
"""
return compile(source, filename, 'exec')
return compile(source, filename, "exec")
@internalcode
def compile(self, source, name=None, filename=None, raw=False,
defer_init=False):
def compile(self, source, name=None, filename=None, raw=False, defer_init=False):
"""Compile a node or template source code. The `name` parameter is
the load name of the template after it was joined using
:meth:`join_path` if necessary, not the filename on the file system.
@@ -577,18 +626,16 @@ def compile(self, source, name=None, filename=None, raw=False,
if isinstance(source, string_types):
source_hint = source
source = self._parse(source, name, filename)
source = self._generate(source, name, filename,
defer_init=defer_init)
source = self._generate(source, name, filename, defer_init=defer_init)
if raw:
return source
if filename is None:
filename = '<template>'
filename = "<template>"
else:
filename = encode_filename(filename)
return self._compile(source, filename)
except TemplateSyntaxError:
exc_info = sys.exc_info()
self.handle_exception(exc_info, source_hint=source_hint)
self.handle_exception(source=source_hint)
def compile_expression(self, source, undefined_to_none=True):
"""A handy helper method that returns a callable that accepts keyword
@@ -618,26 +665,32 @@ def compile_expression(self, source, undefined_to_none=True):
.. versionadded:: 2.1
"""
parser = Parser(self, source, state='variable')
exc_info = None
parser = Parser(self, source, state="variable")
try:
expr = parser.parse_expression()
if not parser.stream.eos:
raise TemplateSyntaxError('chunk after expression',
parser.stream.current.lineno,
None, None)
raise TemplateSyntaxError(
"chunk after expression", parser.stream.current.lineno, None, None
)
expr.set_environment(self)
except TemplateSyntaxError:
exc_info = sys.exc_info()
if exc_info is not None:
self.handle_exception(exc_info, source_hint=source)
body = [nodes.Assign(nodes.Name('result', 'store'), expr, lineno=1)]
if sys.exc_info() is not None:
self.handle_exception(source=source)
body = [nodes.Assign(nodes.Name("result", "store"), expr, lineno=1)]
template = self.from_string(nodes.Template(body, lineno=1))
return TemplateExpression(template, undefined_to_none)
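Usage sketch for the public API this hunk touches (documented behaviour since Jinja 2.1):

from jinja2 import Environment

env = Environment()
expr = env.compile_expression("foo == 42")

assert expr(foo=42) is True
assert expr(foo=23) is False
# An expression that evaluates to an undefined value is converted to
# None by default (undefined_to_none=True).
assert env.compile_expression("var")() is None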
def compile_templates(self, target, extensions=None, filter_func=None,
zip='deflated', log_function=None,
ignore_errors=True, py_compile=False):
def compile_templates(
self,
target,
extensions=None,
filter_func=None,
zip="deflated",
log_function=None,
ignore_errors=True,
py_compile=False,
):
"""Finds all the templates the loader can find, compiles them
and stores them in `target`. If `zip` is `None`, instead of in a
zipfile, the templates will be stored in a directory.
@@ -660,42 +713,52 @@ def compile_templates(self, target, extensions=None, filter_func=None,
.. versionadded:: 2.4
"""
from jinja2.loaders import ModuleLoader
from .loaders import ModuleLoader
if log_function is None:
log_function = lambda x: None
def log_function(x):
pass
if py_compile:
if not PY2 or PYPY:
from warnings import warn
warn(Warning('py_compile has no effect on pypy or Python 3'))
import warnings
warnings.warn(
"'py_compile=True' has no effect on PyPy or Python"
" 3 and will be removed in version 3.0",
DeprecationWarning,
stacklevel=2,
)
py_compile = False
else:
import imp
import marshal
py_header = imp.get_magic() + \
u'\xff\xff\xff\xff'.encode('iso-8859-15')
py_header = imp.get_magic() + u"\xff\xff\xff\xff".encode("iso-8859-15")
# Python 3.3 added a source filesize to the header
if sys.version_info >= (3, 3):
py_header += u'\x00\x00\x00\x00'.encode('iso-8859-15')
py_header += u"\x00\x00\x00\x00".encode("iso-8859-15")
def write_file(filename, data, mode):
def write_file(filename, data):
if zip:
info = ZipInfo(filename)
info.external_attr = 0o755 << 16
zip_file.writestr(info, data)
else:
f = open(os.path.join(target, filename), mode)
try:
if isinstance(data, text_type):
data = data.encode("utf8")
with open(os.path.join(target, filename), "wb") as f:
f.write(data)
finally:
f.close()
if zip is not None:
from zipfile import ZipFile, ZipInfo, ZIP_DEFLATED, ZIP_STORED
zip_file = ZipFile(target, 'w', dict(deflated=ZIP_DEFLATED,
stored=ZIP_STORED)[zip])
zip_file = ZipFile(
target, "w", dict(deflated=ZIP_DEFLATED, stored=ZIP_STORED)[zip]
)
log_function('Compiling into Zip archive "%s"' % target)
else:
if not os.path.isdir(target):
@@ -717,18 +780,16 @@ def write_file(filename, data, mode):
if py_compile:
c = self._compile(code, encode_filename(filename))
write_file(filename + 'c', py_header +
marshal.dumps(c), 'wb')
log_function('Byte-compiled "%s" as %s' %
(name, filename + 'c'))
write_file(filename + "c", py_header + marshal.dumps(c))
log_function('Byte-compiled "%s" as %s' % (name, filename + "c"))
else:
write_file(filename, code, 'w')
write_file(filename, code)
log_function('Compiled "%s" as %s' % (name, filename))
finally:
if zip:
zip_file.close()
log_function('Finished compiling templates')
log_function("Finished compiling templates")
def list_templates(self, extensions=None, filter_func=None):
"""Returns a list of templates for this environment. This requires
@@ -746,38 +807,29 @@ def list_templates(self, extensions=None, filter_func=None):
.. versionadded:: 2.4
"""
x = self.loader.list_templates()
names = self.loader.list_templates()
if extensions is not None:
if filter_func is not None:
raise TypeError('either extensions or filter_func '
'can be passed, but not both')
filter_func = lambda x: '.' in x and \
x.rsplit('.', 1)[1] in extensions
if filter_func is not None:
x = list(ifilter(filter_func, x))
return x
raise TypeError(
"either extensions or filter_func can be passed, but not both"
)
def handle_exception(self, exc_info=None, rendered=False, source_hint=None):
def filter_func(x):
return "." in x and x.rsplit(".", 1)[1] in extensions
if filter_func is not None:
names = [name for name in names if filter_func(name)]
return names
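Usage sketch (assumes a FileSystemLoader pointing at a hypothetical "templates" directory; `extensions` and `filter_func` are mutually exclusive):

from jinja2 import Environment, FileSystemLoader

env = Environment(loader=FileSystemLoader("templates"))

html_names = env.list_templates(extensions=["html"])
email_names = env.list_templates(filter_func=lambda n: n.startswith("emails/"))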
def handle_exception(self, source=None):
"""Exception handling helper. This is used internally to either raise
rewritten exceptions or return a rendered traceback for the template.
"""
global _make_traceback
if exc_info is None:
exc_info = sys.exc_info()
from .debug import rewrite_traceback_stack
# the debugging module is imported when it's used for the first time.
# we're doing a lot of stuff there and for applications that do not
# get any exceptions in template rendering there is no need to load
# all of that.
if _make_traceback is None:
from jinja2.debug import make_traceback as _make_traceback
traceback = _make_traceback(exc_info, source_hint)
if rendered and self.exception_formatter is not None:
return self.exception_formatter(traceback)
if self.exception_handler is not None:
self.exception_handler(traceback)
exc_type, exc_value, tb = traceback.standard_exc_info
reraise(exc_type, exc_value, tb)
reraise(*rewrite_traceback_stack(source=source))
def join_path(self, template, parent):
"""Join a template with the parent. By default all the lookups are
@@ -794,12 +846,13 @@ def join_path(self, template, parent):
@internalcode
def _load_template(self, name, globals):
if self.loader is None:
raise TypeError('no loader for this environment specified')
raise TypeError("no loader for this environment specified")
cache_key = (weakref.ref(self.loader), name)
if self.cache is not None:
template = self.cache.get(cache_key)
if template is not None and (not self.auto_reload or
template.is_up_to_date):
if template is not None and (
not self.auto_reload or template.is_up_to_date
):
return template
template = self.loader.load(self, name, globals)
if self.cache is not None:
@@ -835,15 +888,24 @@ def select_template(self, names, parent=None, globals=None):
before it fails. If it cannot find any of the templates, it will
raise a :exc:`TemplatesNotFound` exception.
.. versionadded:: 2.3
.. versionchanged:: 2.11
If names is :class:`Undefined`, an :exc:`UndefinedError` is
raised instead. If no templates were found and names
contains :class:`Undefined`, the message is more helpful.
.. versionchanged:: 2.4
If `names` contains a :class:`Template` object it is returned
from the function unchanged.
.. versionadded:: 2.3
"""
if isinstance(names, Undefined):
names._fail_with_undefined_error()
if not names:
raise TemplatesNotFound(message=u'Tried to select from an empty list '
u'of templates.')
raise TemplatesNotFound(
message=u"Tried to select from an empty list " u"of templates."
)
globals = self.make_globals(globals)
for name in names:
if isinstance(name, Template):
@@ -852,20 +914,19 @@ def select_template(self, names, parent=None, globals=None):
name = self.join_path(name, parent)
try:
return self._load_template(name, globals)
except TemplateNotFound:
except (TemplateNotFound, UndefinedError):
pass
raise TemplatesNotFound(names)
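Usage sketch (template names are hypothetical): the first name that loads is returned, otherwise TemplatesNotFound is raised:

from jinja2 import Environment, FileSystemLoader

env = Environment(loader=FileSystemLoader("templates"))
template = env.select_template(["pages/about.html", "pages/default.html"])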
@internalcode
def get_or_select_template(self, template_name_or_list,
parent=None, globals=None):
def get_or_select_template(self, template_name_or_list, parent=None, globals=None):
"""Does a typecheck and dispatches to :meth:`select_template`
if an iterable of template names is given, otherwise to
:meth:`get_template`.
.. versionadded:: 2.3
"""
if isinstance(template_name_or_list, string_types):
if isinstance(template_name_or_list, (string_types, Undefined)):
return self.get_template(template_name_or_list, parent, globals)
elif isinstance(template_name_or_list, Template):
return template_name_or_list
@@ -916,32 +977,57 @@ class Template(object):
StopIteration
"""
def __new__(cls, source,
block_start_string=BLOCK_START_STRING,
block_end_string=BLOCK_END_STRING,
variable_start_string=VARIABLE_START_STRING,
variable_end_string=VARIABLE_END_STRING,
comment_start_string=COMMENT_START_STRING,
comment_end_string=COMMENT_END_STRING,
line_statement_prefix=LINE_STATEMENT_PREFIX,
line_comment_prefix=LINE_COMMENT_PREFIX,
trim_blocks=TRIM_BLOCKS,
lstrip_blocks=LSTRIP_BLOCKS,
newline_sequence=NEWLINE_SEQUENCE,
keep_trailing_newline=KEEP_TRAILING_NEWLINE,
extensions=(),
optimized=True,
undefined=Undefined,
finalize=None,
autoescape=False,
enable_async=False):
#: Type of environment to create when creating a template directly
#: rather than through an existing environment.
environment_class = Environment
def __new__(
cls,
source,
block_start_string=BLOCK_START_STRING,
block_end_string=BLOCK_END_STRING,
variable_start_string=VARIABLE_START_STRING,
variable_end_string=VARIABLE_END_STRING,
comment_start_string=COMMENT_START_STRING,
comment_end_string=COMMENT_END_STRING,
line_statement_prefix=LINE_STATEMENT_PREFIX,
line_comment_prefix=LINE_COMMENT_PREFIX,
trim_blocks=TRIM_BLOCKS,
lstrip_blocks=LSTRIP_BLOCKS,
newline_sequence=NEWLINE_SEQUENCE,
keep_trailing_newline=KEEP_TRAILING_NEWLINE,
extensions=(),
optimized=True,
undefined=Undefined,
finalize=None,
autoescape=False,
enable_async=False,
):
env = get_spontaneous_environment(
block_start_string, block_end_string, variable_start_string,
variable_end_string, comment_start_string, comment_end_string,
line_statement_prefix, line_comment_prefix, trim_blocks,
lstrip_blocks, newline_sequence, keep_trailing_newline,
frozenset(extensions), optimized, undefined, finalize, autoescape,
None, 0, False, None, enable_async)
cls.environment_class,
block_start_string,
block_end_string,
variable_start_string,
variable_end_string,
comment_start_string,
comment_end_string,
line_statement_prefix,
line_comment_prefix,
trim_blocks,
lstrip_blocks,
newline_sequence,
keep_trailing_newline,
frozenset(extensions),
optimized,
undefined,
finalize,
autoescape,
None,
0,
False,
None,
enable_async,
)
return env.from_string(source, template_class=cls)
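A sketch of the new environment_class hook (the subclass name is illustrative): a Template subclass can now choose which Environment is created for templates constructed directly from strings:

from jinja2 import Template
from jinja2.sandbox import SandboxedEnvironment


class SandboxedTemplate(Template):
    # Picked up by Template.__new__ via get_spontaneous_environment().
    environment_class = SandboxedEnvironment


t = SandboxedTemplate("{{ 40 + 2 }}")
assert isinstance(t.environment, SandboxedEnvironment)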
@classmethod
@@ -949,10 +1035,7 @@ def from_code(cls, environment, code, globals, uptodate=None):
"""Creates a template object from compiled code and the globals. This
is used by the loaders and environment to create a template object.
"""
namespace = {
'environment': environment,
'__file__': code.co_filename
}
namespace = {"environment": environment, "__file__": code.co_filename}
exec(code, namespace)
rv = cls._from_namespace(environment, namespace, globals)
rv._uptodate = uptodate
@@ -972,21 +1055,21 @@ def _from_namespace(cls, environment, namespace, globals):
t = object.__new__(cls)
t.environment = environment
t.globals = globals
t.name = namespace['name']
t.filename = namespace['__file__']
t.blocks = namespace['blocks']
t.name = namespace["name"]
t.filename = namespace["__file__"]
t.blocks = namespace["blocks"]
# render function and module
t.root_render_func = namespace['root']
t.root_render_func = namespace["root"]
t._module = None
# debug and loader helpers
t._debug_info = namespace['debug_info']
t._debug_info = namespace["debug_info"]
t._uptodate = None
# store the reference
namespace['environment'] = environment
namespace['__jinja_template__'] = t
namespace["environment"] = environment
namespace["__jinja_template__"] = t
return t
@@ -1004,8 +1087,7 @@ def render(self, *args, **kwargs):
try:
return concat(self.root_render_func(self.new_context(vars)))
except Exception:
exc_info = sys.exc_info()
return self.environment.handle_exception(exc_info, True)
self.environment.handle_exception()
def render_async(self, *args, **kwargs):
"""This works similar to :meth:`render` but returns a coroutine
@@ -1017,8 +1099,9 @@ def render_async(self, *args, **kwargs):
await template.render_async(knights='that say nih; asynchronously')
"""
# see asyncsupport for the actual implementation
raise NotImplementedError('This feature is not available for this '
'version of Python')
raise NotImplementedError(
"This feature is not available for this version of Python"
)
def stream(self, *args, **kwargs):
"""Works exactly like :meth:`generate` but returns a
@@ -1039,29 +1122,28 @@ def generate(self, *args, **kwargs):
for event in self.root_render_func(self.new_context(vars)):
yield event
except Exception:
exc_info = sys.exc_info()
else:
return
yield self.environment.handle_exception(exc_info, True)
yield self.environment.handle_exception()
def generate_async(self, *args, **kwargs):
"""An async version of :meth:`generate`. Works very similarly but
returns an async iterator instead.
"""
# see asyncsupport for the actual implementation
raise NotImplementedError('This feature is not available for this '
'version of Python')
raise NotImplementedError(
"This feature is not available for this version of Python"
)
def new_context(self, vars=None, shared=False, locals=None):
"""Create a new :class:`Context` for this template. The vars
provided will be passed to the template. Per default the globals
are added to the context. If shared is set to `True` the data
is passed as it to the context without adding the globals.
is passed as is to the context without adding the globals.
`locals` can be a dict of local variables for internal usage.
"""
return new_context(self.environment, self.name, self.blocks,
vars, shared, self.globals, locals)
return new_context(
self.environment, self.name, self.blocks, vars, shared, self.globals, locals
)
def make_module(self, vars=None, shared=False, locals=None):
"""This method works like the :attr:`module` attribute when called
@@ -1074,13 +1156,14 @@ def make_module(self, vars=None, shared=False, locals=None):
def make_module_async(self, vars=None, shared=False, locals=None):
"""As template module creation can invoke template code for
asynchronous exections this method must be used instead of the
asynchronous executions this method must be used instead of the
normal :meth:`make_module` one. Likewise the module attribute
becomes unavailable in async mode.
"""
# see asyncsupport for the actual implementation
raise NotImplementedError('This feature is not available for this '
'version of Python')
raise NotImplementedError(
"This feature is not available for this version of Python"
)
@internalcode
def _get_default_module(self):
@@ -1124,15 +1207,16 @@ def is_up_to_date(self):
@property
def debug_info(self):
"""The debug info mapping."""
return [tuple(imap(int, x.split('='))) for x in
self._debug_info.split('&')]
if self._debug_info:
return [tuple(map(int, x.split("="))) for x in self._debug_info.split("&")]
return []
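For illustration (values hypothetical): a stored debug-info string such as "1=8&3=12" now maps to [(1, 8), (3, 12)], and a template without debug info yields an empty list instead of raising.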
def __repr__(self):
if self.name is None:
name = 'memory:%x' % id(self)
name = "memory:%x" % id(self)
else:
name = repr(self.name)
return '<%s %s>' % (self.__class__.__name__, name)
return "<%s %s>" % (self.__class__.__name__, name)
@implements_to_string
@@ -1145,10 +1229,12 @@ class TemplateModule(object):
def __init__(self, template, context, body_stream=None):
if body_stream is None:
if context.environment.is_async:
raise RuntimeError('Async mode requires a body stream '
'to be passed to a template module. Use '
'the async methods of the API you are '
'using.')
raise RuntimeError(
"Async mode requires a body stream "
"to be passed to a template module. Use "
"the async methods of the API you are "
"using."
)
body_stream = list(template.root_render_func(context))
self._body_stream = body_stream
self.__dict__.update(context.get_exported())
@@ -1162,10 +1248,10 @@ def __str__(self):
def __repr__(self):
if self.__name__ is None:
name = 'memory:%x' % id(self)
name = "memory:%x" % id(self)
else:
name = repr(self.__name__)
return '<%s %s>' % (self.__class__.__name__, name)
return "<%s %s>" % (self.__class__.__name__, name)
class TemplateExpression(object):
@@ -1181,7 +1267,7 @@ def __init__(self, template, undefined_to_none):
def __call__(self, *args, **kwargs):
context = self._template.new_context(dict(*args, **kwargs))
consume(self._template.root_render_func(context))
rv = context.vars['result']
rv = context.vars["result"]
if self._undefined_to_none and isinstance(rv, Undefined):
rv = None
return rv
@@ -1203,7 +1289,7 @@ def __init__(self, gen):
self._gen = gen
self.disable_buffering()
def dump(self, fp, encoding=None, errors='strict'):
def dump(self, fp, encoding=None, errors="strict"):
"""Dump the complete stream into a file or file-like object.
Per default unicode strings are written, if you want to encode
before writing specify an `encoding`.
@@ -1215,15 +1301,15 @@ def dump(self, fp, encoding=None, errors='strict'):
close = False
if isinstance(fp, string_types):
if encoding is None:
encoding = 'utf-8'
fp = open(fp, 'wb')
encoding = "utf-8"
fp = open(fp, "wb")
close = True
try:
if encoding is not None:
iterable = (x.encode(encoding, errors) for x in self)
else:
iterable = self
if hasattr(fp, 'writelines'):
if hasattr(fp, "writelines"):
fp.writelines(iterable)
else:
for item in iterable:
@@ -1259,7 +1345,7 @@ def _buffered_generator(self, size):
def enable_buffering(self, size=5):
"""Enable buffering. Buffer `size` items before yielding them."""
if size <= 1:
raise ValueError('buffer size too small')
raise ValueError("buffer size too small")
self.buffered = True
self._next = partial(next, self._buffered_generator(size))
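Usage sketch of the streaming API around these changes (output path and variables are illustrative):

from jinja2 import Environment

env = Environment()
tmpl = env.from_string("{% for i in range(n) %}{{ i }}\n{% endfor %}")

stream = tmpl.stream(n=1000)
stream.enable_buffering(size=5)           # yield chunks of five items
stream.dump("out.txt", encoding="utf-8")  # encode while writing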


@@ -1,23 +1,18 @@
# -*- coding: utf-8 -*-
"""
jinja2.exceptions
~~~~~~~~~~~~~~~~~
Jinja exceptions.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD, see LICENSE for more details.
"""
from jinja2._compat import imap, text_type, PY2, implements_to_string
from ._compat import imap
from ._compat import implements_to_string
from ._compat import PY2
from ._compat import text_type
class TemplateError(Exception):
"""Baseclass for all template errors."""
if PY2:
def __init__(self, message=None):
if message is not None:
message = text_type(message).encode('utf-8')
message = text_type(message).encode("utf-8")
Exception.__init__(self, message)
@property
@@ -25,11 +20,13 @@ def message(self):
if self.args:
message = self.args[0]
if message is not None:
return message.decode('utf-8', 'replace')
return message.decode("utf-8", "replace")
def __unicode__(self):
return self.message or u''
return self.message or u""
else:
def __init__(self, message=None):
Exception.__init__(self, message)
@@ -43,16 +40,28 @@ def message(self):
@implements_to_string
class TemplateNotFound(IOError, LookupError, TemplateError):
"""Raised if a template does not exist."""
"""Raised if a template does not exist.
.. versionchanged:: 2.11
If the given name is :class:`Undefined` and no message was
provided, an :exc:`UndefinedError` is raised.
"""
# looks weird, but removes the warning descriptor that just
# bogusly warns us about message being deprecated
message = None
def __init__(self, name, message=None):
IOError.__init__(self)
IOError.__init__(self, name)
if message is None:
from .runtime import Undefined
if isinstance(name, Undefined):
name._fail_with_undefined_error()
message = name
self.message = message
self.name = name
self.templates = [name]
@@ -66,13 +75,28 @@ class TemplatesNotFound(TemplateNotFound):
are selected. This is a subclass of :class:`TemplateNotFound`
exception, so just catching the base exception will catch both.
.. versionchanged:: 2.11
If a name in the list of names is :class:`Undefined`, a message
about it being undefined is shown rather than the empty string.
.. versionadded:: 2.2
"""
def __init__(self, names=(), message=None):
if message is None:
message = u'none of the templates given were found: ' + \
u', '.join(imap(text_type, names))
from .runtime import Undefined
parts = []
for name in names:
if isinstance(name, Undefined):
parts.append(name._undefined_message)
else:
parts.append(name)
message = u"none of the templates given were found: " + u", ".join(
imap(text_type, parts)
)
TemplateNotFound.__init__(self, names and names[-1] or None, message)
self.templates = list(names)
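A sketch of the 2.11 behaviour described above (loader and names are illustrative): an undefined entry in the list now produces a meaningful message instead of an empty string:

from jinja2 import DictLoader, Environment, TemplatesNotFound
from jinja2.runtime import Undefined

env = Environment(loader=DictLoader({}))

try:
    env.select_template([Undefined(name="layout"), "fallback.html"])
except TemplatesNotFound as exc:
    # The message now mentions "'layout' is undefined" for the
    # undefined entry.
    print(exc.message)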
@@ -98,11 +122,11 @@ def __str__(self):
return self.message
# otherwise attach some stuff
location = 'line %d' % self.lineno
location = "line %d" % self.lineno
name = self.filename or self.name
if name:
location = 'File "%s", %s' % (name, location)
lines = [self.message, ' ' + location]
lines = [self.message, " " + location]
# if the source is set, add the line to the output
if self.source is not None:
@@ -111,9 +135,16 @@ def __str__(self):
except IndexError:
line = None
if line:
lines.append(' ' + line.strip())
lines.append(" " + line.strip())
return u'\n'.join(lines)
return u"\n".join(lines)
def __reduce__(self):
# https://bugs.python.org/issue1692335 Exceptions that take
# multiple required arguments have problems with pickling.
# Without this, raises TypeError: __init__() missing 1 required
# positional argument: 'lineno'
return self.__class__, (self.message, self.lineno, self.name, self.filename)
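A sketch of what the new __reduce__ enables (error values illustrative):

import pickle

from jinja2 import TemplateSyntaxError

err = TemplateSyntaxError("unexpected end of template", 3, "page.html")
copy = pickle.loads(pickle.dumps(err))
assert (copy.message, copy.lineno, copy.name) == (err.message, err.lineno, err.name)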
class TemplateAssertionError(TemplateSyntaxError):


@@ -1,42 +1,49 @@
# -*- coding: utf-8 -*-
"""
jinja2.ext
~~~~~~~~~~
Jinja extensions allow to add custom tags similar to the way django custom
tags work. By default two example extensions exist: an i18n and a cache
extension.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD.
"""
"""Extension API for adding custom tags and behavior."""
import pprint
import re
from sys import version_info
from jinja2 import nodes
from jinja2.defaults import BLOCK_START_STRING, \
BLOCK_END_STRING, VARIABLE_START_STRING, VARIABLE_END_STRING, \
COMMENT_START_STRING, COMMENT_END_STRING, LINE_STATEMENT_PREFIX, \
LINE_COMMENT_PREFIX, TRIM_BLOCKS, NEWLINE_SEQUENCE, \
KEEP_TRAILING_NEWLINE, LSTRIP_BLOCKS
from jinja2.environment import Environment
from jinja2.runtime import concat
from jinja2.exceptions import TemplateAssertionError, TemplateSyntaxError
from jinja2.utils import contextfunction, import_string, Markup
from jinja2._compat import with_metaclass, string_types, iteritems
from markupsafe import Markup
from . import nodes
from ._compat import iteritems
from ._compat import string_types
from ._compat import with_metaclass
from .defaults import BLOCK_END_STRING
from .defaults import BLOCK_START_STRING
from .defaults import COMMENT_END_STRING
from .defaults import COMMENT_START_STRING
from .defaults import KEEP_TRAILING_NEWLINE
from .defaults import LINE_COMMENT_PREFIX
from .defaults import LINE_STATEMENT_PREFIX
from .defaults import LSTRIP_BLOCKS
from .defaults import NEWLINE_SEQUENCE
from .defaults import TRIM_BLOCKS
from .defaults import VARIABLE_END_STRING
from .defaults import VARIABLE_START_STRING
from .environment import Environment
from .exceptions import TemplateAssertionError
from .exceptions import TemplateSyntaxError
from .nodes import ContextReference
from .runtime import concat
from .utils import contextfunction
from .utils import import_string
# the only real useful gettext functions for a Jinja template. Note
# that ugettext must be assigned to gettext as Jinja doesn't support
# non unicode strings.
GETTEXT_FUNCTIONS = ('_', 'gettext', 'ngettext')
GETTEXT_FUNCTIONS = ("_", "gettext", "ngettext")
_ws_re = re.compile(r"\s*\n\s*")
class ExtensionRegistry(type):
"""Gives the extension an unique identifier."""
def __new__(cls, name, bases, d):
rv = type.__new__(cls, name, bases, d)
rv.identifier = rv.__module__ + '.' + rv.__name__
def __new__(mcs, name, bases, d):
rv = type.__new__(mcs, name, bases, d)
rv.identifier = rv.__module__ + "." + rv.__name__
return rv
@@ -91,10 +98,6 @@ def filter_stream(self, stream):
to filter tokens returned. This method has to return an iterable of
:class:`~jinja2.lexer.Token`\\s, but it doesn't have to return a
:class:`~jinja2.lexer.TokenStream`.
In the `ext` folder of the Jinja2 source distribution there is a file
called `inlinegettext.py` which implements a filter that utilizes this
method.
"""
return stream
@@ -116,8 +119,9 @@ def attr(self, name, lineno=None):
"""
return nodes.ExtensionAttribute(self.identifier, name, lineno=lineno)
def call_method(self, name, args=None, kwargs=None, dyn_args=None,
dyn_kwargs=None, lineno=None):
def call_method(
self, name, args=None, kwargs=None, dyn_args=None, dyn_kwargs=None, lineno=None
):
"""Call a method of the extension. This is a shortcut for
:meth:`attr` + :class:`jinja2.nodes.Call`.
"""
@@ -125,13 +129,19 @@ def call_method(self, name, args=None, kwargs=None, dyn_args=None,
args = []
if kwargs is None:
kwargs = []
return nodes.Call(self.attr(name, lineno=lineno), args, kwargs,
dyn_args, dyn_kwargs, lineno=lineno)
return nodes.Call(
self.attr(name, lineno=lineno),
args,
kwargs,
dyn_args,
dyn_kwargs,
lineno=lineno,
)
@contextfunction
def _gettext_alias(__context, *args, **kwargs):
return __context.call(__context.resolve('gettext'), *args, **kwargs)
return __context.call(__context.resolve("gettext"), *args, **kwargs)
def _make_new_gettext(func):
@@ -140,24 +150,31 @@ def gettext(__context, __string, **variables):
rv = __context.call(func, __string)
if __context.eval_ctx.autoescape:
rv = Markup(rv)
# Always treat as a format string, even if there are no
# variables. This makes translation strings more consistent
# and predictable. This requires escaping
return rv % variables
return gettext
def _make_new_ngettext(func):
@contextfunction
def ngettext(__context, __singular, __plural, __num, **variables):
variables.setdefault('num', __num)
variables.setdefault("num", __num)
rv = __context.call(func, __singular, __plural, __num)
if __context.eval_ctx.autoescape:
rv = Markup(rv)
# Always treat as a format string, see gettext comment above.
return rv % variables
return ngettext
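Usage sketch (assumes the i18n extension is loaded and a gettext translations object is available): with newstyle enabled, the wrappers above always apply %-style interpolation to the translated string:

import gettext

from jinja2 import Environment

env = Environment(extensions=["jinja2.ext.i18n"])
env.install_gettext_translations(gettext.NullTranslations(), newstyle=True)

tmpl = env.from_string("{% trans name %}Hello {{ name }}!{% endtrans %}")
assert tmpl.render(name="World") == "Hello World!"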
class InternationalizationExtension(Extension):
"""This extension adds gettext support to Jinja2."""
tags = set(['trans'])
"""This extension adds gettext support to Jinja."""
tags = {"trans"}
# TODO: the i18n extension is currently reevaluating values in a few
# situations. Take this example:
@@ -168,30 +185,28 @@ class InternationalizationExtension(Extension):
def __init__(self, environment):
Extension.__init__(self, environment)
environment.globals['_'] = _gettext_alias
environment.globals["_"] = _gettext_alias
environment.extend(
install_gettext_translations=self._install,
install_null_translations=self._install_null,
install_gettext_callables=self._install_callables,
uninstall_gettext_translations=self._uninstall,
extract_translations=self._extract,
newstyle_gettext=False
newstyle_gettext=False,
)
def _install(self, translations, newstyle=None):
gettext = getattr(translations, 'ugettext', None)
gettext = getattr(translations, "ugettext", None)
if gettext is None:
gettext = translations.gettext
ngettext = getattr(translations, 'ungettext', None)
ngettext = getattr(translations, "ungettext", None)
if ngettext is None:
ngettext = translations.ngettext
self._install_callables(gettext, ngettext, newstyle)
def _install_null(self, newstyle=None):
self._install_callables(
lambda x: x,
lambda s, p, n: (n != 1 and (p,) or (s,))[0],
newstyle
lambda x: x, lambda s, p, n: (n != 1 and (p,) or (s,))[0], newstyle
)
def _install_callables(self, gettext, ngettext, newstyle=None):
@@ -200,13 +215,10 @@ def _install_callables(self, gettext, ngettext, newstyle=None):
if self.environment.newstyle_gettext:
gettext = _make_new_gettext(gettext)
ngettext = _make_new_ngettext(ngettext)
self.environment.globals.update(
gettext=gettext,
ngettext=ngettext
)
self.environment.globals.update(gettext=gettext, ngettext=ngettext)
def _uninstall(self, translations):
for key in 'gettext', 'ngettext':
for key in "gettext", "ngettext":
self.environment.globals.pop(key, None)
def _extract(self, source, gettext_functions=GETTEXT_FUNCTIONS):
@@ -226,41 +238,44 @@ def parse(self, parser):
plural_expr_assignment = None
variables = {}
trimmed = None
while parser.stream.current.type != 'block_end':
while parser.stream.current.type != "block_end":
if variables:
parser.stream.expect('comma')
parser.stream.expect("comma")
# skip colon for python compatibility
if parser.stream.skip_if('colon'):
if parser.stream.skip_if("colon"):
break
name = parser.stream.expect('name')
name = parser.stream.expect("name")
if name.value in variables:
parser.fail('translatable variable %r defined twice.' %
name.value, name.lineno,
exc=TemplateAssertionError)
parser.fail(
"translatable variable %r defined twice." % name.value,
name.lineno,
exc=TemplateAssertionError,
)
# expressions
if parser.stream.current.type == 'assign':
if parser.stream.current.type == "assign":
next(parser.stream)
variables[name.value] = var = parser.parse_expression()
elif trimmed is None and name.value in ('trimmed', 'notrimmed'):
trimmed = name.value == 'trimmed'
elif trimmed is None and name.value in ("trimmed", "notrimmed"):
trimmed = name.value == "trimmed"
continue
else:
variables[name.value] = var = nodes.Name(name.value, 'load')
variables[name.value] = var = nodes.Name(name.value, "load")
if plural_expr is None:
if isinstance(var, nodes.Call):
plural_expr = nodes.Name('_trans', 'load')
plural_expr = nodes.Name("_trans", "load")
variables[name.value] = plural_expr
plural_expr_assignment = nodes.Assign(
nodes.Name('_trans', 'store'), var)
nodes.Name("_trans", "store"), var
)
else:
plural_expr = var
num_called_num = name.value == 'num'
num_called_num = name.value == "num"
parser.stream.expect('block_end')
parser.stream.expect("block_end")
plural = None
have_plural = False
@@ -271,22 +286,24 @@ def parse(self, parser):
if singular_names:
referenced.update(singular_names)
if plural_expr is None:
plural_expr = nodes.Name(singular_names[0], 'load')
num_called_num = singular_names[0] == 'num'
plural_expr = nodes.Name(singular_names[0], "load")
num_called_num = singular_names[0] == "num"
# if we have a pluralize block, we parse that too
if parser.stream.current.test('name:pluralize'):
if parser.stream.current.test("name:pluralize"):
have_plural = True
next(parser.stream)
if parser.stream.current.type != 'block_end':
name = parser.stream.expect('name')
if parser.stream.current.type != "block_end":
name = parser.stream.expect("name")
if name.value not in variables:
parser.fail('unknown variable %r for pluralization' %
name.value, name.lineno,
exc=TemplateAssertionError)
parser.fail(
"unknown variable %r for pluralization" % name.value,
name.lineno,
exc=TemplateAssertionError,
)
plural_expr = variables[name.value]
num_called_num = name.value == 'num'
parser.stream.expect('block_end')
num_called_num = name.value == "num"
parser.stream.expect("block_end")
plural_names, plural = self._parse_block(parser, False)
next(parser.stream)
referenced.update(plural_names)
@@ -296,88 +313,97 @@ def parse(self, parser):
# register free names as simple name expressions
for var in referenced:
if var not in variables:
variables[var] = nodes.Name(var, 'load')
variables[var] = nodes.Name(var, "load")
if not have_plural:
plural_expr = None
elif plural_expr is None:
parser.fail('pluralize without variables', lineno)
parser.fail("pluralize without variables", lineno)
if trimmed is None:
trimmed = self.environment.policies['ext.i18n.trimmed']
trimmed = self.environment.policies["ext.i18n.trimmed"]
if trimmed:
singular = self._trim_whitespace(singular)
if plural:
plural = self._trim_whitespace(plural)
node = self._make_node(singular, plural, variables, plural_expr,
bool(referenced),
num_called_num and have_plural)
node = self._make_node(
singular,
plural,
variables,
plural_expr,
bool(referenced),
num_called_num and have_plural,
)
node.set_lineno(lineno)
if plural_expr_assignment is not None:
return [plural_expr_assignment, node]
else:
return node
def _trim_whitespace(self, string, _ws_re=re.compile(r'\s*\n\s*')):
return _ws_re.sub(' ', string.strip())
def _trim_whitespace(self, string, _ws_re=_ws_re):
return _ws_re.sub(" ", string.strip())
def _parse_block(self, parser, allow_pluralize):
"""Parse until the next block tag with a given name."""
referenced = []
buf = []
while 1:
if parser.stream.current.type == 'data':
buf.append(parser.stream.current.value.replace('%', '%%'))
if parser.stream.current.type == "data":
buf.append(parser.stream.current.value.replace("%", "%%"))
next(parser.stream)
elif parser.stream.current.type == 'variable_begin':
elif parser.stream.current.type == "variable_begin":
next(parser.stream)
name = parser.stream.expect('name').value
name = parser.stream.expect("name").value
referenced.append(name)
buf.append('%%(%s)s' % name)
parser.stream.expect('variable_end')
elif parser.stream.current.type == 'block_begin':
buf.append("%%(%s)s" % name)
parser.stream.expect("variable_end")
elif parser.stream.current.type == "block_begin":
next(parser.stream)
if parser.stream.current.test('name:endtrans'):
if parser.stream.current.test("name:endtrans"):
break
elif parser.stream.current.test('name:pluralize'):
elif parser.stream.current.test("name:pluralize"):
if allow_pluralize:
break
parser.fail('a translatable section can have only one '
'pluralize section')
parser.fail('control structures in translatable sections are '
'not allowed')
parser.fail(
"a translatable section can have only one pluralize section"
)
parser.fail(
"control structures in translatable sections are not allowed"
)
elif parser.stream.eos:
parser.fail('unclosed translation block')
parser.fail("unclosed translation block")
else:
assert False, 'internal parser error'
raise RuntimeError("internal parser error")
return referenced, concat(buf)
def _make_node(self, singular, plural, variables, plural_expr,
vars_referenced, num_called_num):
def _make_node(
self, singular, plural, variables, plural_expr, vars_referenced, num_called_num
):
"""Generates a useful node from the data provided."""
# no variables referenced? no need to escape for old style
# gettext invocations only if there are vars.
if not vars_referenced and not self.environment.newstyle_gettext:
singular = singular.replace('%%', '%')
singular = singular.replace("%%", "%")
if plural:
plural = plural.replace('%%', '%')
plural = plural.replace("%%", "%")
# singular only:
if plural_expr is None:
gettext = nodes.Name('gettext', 'load')
node = nodes.Call(gettext, [nodes.Const(singular)],
[], None, None)
gettext = nodes.Name("gettext", "load")
node = nodes.Call(gettext, [nodes.Const(singular)], [], None, None)
# singular and plural
else:
ngettext = nodes.Name('ngettext', 'load')
node = nodes.Call(ngettext, [
nodes.Const(singular),
nodes.Const(plural),
plural_expr
], [], None, None)
ngettext = nodes.Name("ngettext", "load")
node = nodes.Call(
ngettext,
[nodes.Const(singular), nodes.Const(plural), plural_expr],
[],
None,
None,
)
# in case newstyle gettext is used, the method is powerful
# enough to handle the variable expansion and autoescape
@@ -386,7 +412,7 @@ def _make_node(self, singular, plural, variables, plural_expr,
for key, value in iteritems(variables):
# the function adds that later anyways in case num was
# called num, so just skip it.
if num_called_num and key == 'num':
if num_called_num and key == "num":
continue
node.kwargs.append(nodes.Keyword(key, value))
@@ -396,18 +422,24 @@ def _make_node(self, singular, plural, variables, plural_expr,
# environment with autoescaping turned on
node = nodes.MarkSafeIfAutoescape(node)
if variables:
node = nodes.Mod(node, nodes.Dict([
nodes.Pair(nodes.Const(key), value)
for key, value in variables.items()
]))
node = nodes.Mod(
node,
nodes.Dict(
[
nodes.Pair(nodes.Const(key), value)
for key, value in variables.items()
]
),
)
return nodes.Output([node])
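For context, a minimal sketch of the "{% trans %}" / "{% pluralize %}" syntax that this extension parses (assumptions: the i18n extension is enabled through the standard Environment API and null translations are installed just for demonstration):

    from jinja2 import Environment

    env = Environment(extensions=["jinja2.ext.i18n"])
    # install_null_translations provides no-op gettext/ngettext callables.
    env.install_null_translations(newstyle=True)
    template = env.from_string(
        "{% trans count=users|length %}"
        "One user."
        "{% pluralize %}"
        "{{ count }} users."
        "{% endtrans %}"
    )
    print(template.render(users=["a", "b"]))  # -> something like "2 users."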
class ExprStmtExtension(Extension):
"""Adds a `do` tag to Jinja2 that works like the print statement just
"""Adds a `do` tag to Jinja that works like the print statement just
that it doesn't print the return value.
"""
tags = set(['do'])
tags = set(["do"])
def parse(self, parser):
node = nodes.ExprStmt(lineno=next(parser.stream).lineno)
@@ -417,11 +449,12 @@ def parse(self, parser):
class LoopControlExtension(Extension):
"""Adds break and continue to the template engine."""
tags = set(['break', 'continue'])
tags = set(["break", "continue"])
def parse(self, parser):
token = next(parser.stream)
if token.value == 'break':
if token.value == "break":
return nodes.Break(lineno=token.lineno)
return nodes.Continue(lineno=token.lineno)
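A small usage sketch for these two extensions (the import strings come from the aliases at the bottom of this module; the template and data are illustrative):

    from jinja2 import Environment

    # Enable the expression-statement ("do") and loop-control extensions.
    env = Environment(extensions=["jinja2.ext.do", "jinja2.ext.loopcontrols"])
    template = env.from_string(
        "{% set seen = [] %}"
        "{% for item in items %}"
        "{% if item in seen %}{% continue %}{% endif %}"
        "{% do seen.append(item) %}"
        "{{ item }} "
        "{% endfor %}"
    )
    print(template.render(items=[1, 1, 2, 3, 2]))  # -> "1 2 3 "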
@@ -434,8 +467,50 @@ class AutoEscapeExtension(Extension):
pass
def extract_from_ast(node, gettext_functions=GETTEXT_FUNCTIONS,
babel_style=True):
class DebugExtension(Extension):
"""A ``{% debug %}`` tag that dumps the available variables,
filters, and tests.
.. code-block:: html+jinja
<pre>{% debug %}</pre>
.. code-block:: text
{'context': {'cycler': <class 'jinja2.utils.Cycler'>,
...,
'namespace': <class 'jinja2.utils.Namespace'>},
'filters': ['abs', 'attr', 'batch', 'capitalize', 'center', 'count', 'd',
..., 'urlencode', 'urlize', 'wordcount', 'wordwrap', 'xmlattr'],
'tests': ['!=', '<', '<=', '==', '>', '>=', 'callable', 'defined',
..., 'odd', 'sameas', 'sequence', 'string', 'undefined', 'upper']}
.. versionadded:: 2.11.0
"""
tags = {"debug"}
def parse(self, parser):
lineno = parser.stream.expect("name:debug").lineno
context = ContextReference()
result = self.call_method("_render", [context], lineno=lineno)
return nodes.Output([result], lineno=lineno)
def _render(self, context):
result = {
"context": context.get_all(),
"filters": sorted(self.environment.filters.keys()),
"tests": sorted(self.environment.tests.keys()),
}
# Set the depth since the intent is to show the top few names.
if version_info[:2] >= (3, 4):
return pprint.pformat(result, depth=3, compact=True)
else:
return pprint.pformat(result, depth=3)
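A usage sketch for the new tag (minimal setup; "jinja2.ext.debug" is the alias added at the end of this module):

    from jinja2 import Environment

    env = Environment(extensions=["jinja2.ext.debug"])
    template = env.from_string("<pre>{% debug %}</pre>")
    # Dumps the render context plus the names of all registered filters and tests.
    print(template.render(answer=42))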
def extract_from_ast(node, gettext_functions=GETTEXT_FUNCTIONS, babel_style=True):
"""Extract localizable strings from the given template node. Per
default this function returns matches in babel style that means non string
parameters as well as keyword arguments are returned as `None`. This
@@ -471,19 +546,20 @@ def extract_from_ast(node, gettext_functions=GETTEXT_FUNCTIONS,
extraction interface or extract comments yourself.
"""
for node in node.find_all(nodes.Call):
if not isinstance(node.node, nodes.Name) or \
node.node.name not in gettext_functions:
if (
not isinstance(node.node, nodes.Name)
or node.node.name not in gettext_functions
):
continue
strings = []
for arg in node.args:
if isinstance(arg, nodes.Const) and \
isinstance(arg.value, string_types):
if isinstance(arg, nodes.Const) and isinstance(arg.value, string_types):
strings.append(arg.value)
else:
strings.append(None)
for arg in node.kwargs:
for _ in node.kwargs:
strings.append(None)
if node.dyn_args is not None:
strings.append(None)
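Roughly how this helper is used (a sketch based on the docstring; the exact tuples depend on the template, and the third element is a string or a tuple of strings):

    from jinja2 import Environment
    from jinja2.ext import extract_from_ast

    env = Environment()
    ast = env.parse('{{ _("Hello") }} {{ ngettext("%(n)d item", "%(n)d items", n) }}')
    # Yields (lineno, function_name, message) for each recognized gettext call.
    for lineno, func, message in extract_from_ast(ast):
        print(lineno, func, message)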
@@ -517,9 +593,10 @@ def __init__(self, tokens, comment_tags):
def find_backwards(self, offset):
try:
for _, token_type, token_value in \
reversed(self.tokens[self.offset:offset]):
if token_type in ('comment', 'linecomment'):
for _, token_type, token_value in reversed(
self.tokens[self.offset : offset]
):
if token_type in ("comment", "linecomment"):
try:
prefix, comment = token_value.split(None, 1)
except ValueError:
@@ -533,7 +610,7 @@ def find_backwards(self, offset):
def find_comments(self, lineno):
if not self.comment_tags or self.last_lineno > lineno:
return []
for idx, (token_lineno, _, _) in enumerate(self.tokens[self.offset:]):
for idx, (token_lineno, _, _) in enumerate(self.tokens[self.offset :]):
if token_lineno > lineno:
return self.find_backwards(self.offset + idx)
return self.find_backwards(len(self.tokens))
@@ -545,7 +622,7 @@ def babel_extract(fileobj, keywords, comment_tags, options):
.. versionchanged:: 2.3
Basic support for translation comments was added. If `comment_tags`
is now set to a list of keywords for extraction, the extractor will
try to find the best preceeding comment that begins with one of the
try to find the best preceding comment that begins with one of the
keywords. For best results, make sure to not have more than one
gettext call in one line of code and the matching comment in the
same line or the line before.
@@ -568,7 +645,7 @@ def babel_extract(fileobj, keywords, comment_tags, options):
(comments will be empty currently)
"""
extensions = set()
for extension in options.get('extensions', '').split(','):
for extension in options.get("extensions", "").split(","):
extension = extension.strip()
if not extension:
continue
@@ -577,38 +654,37 @@ def babel_extract(fileobj, keywords, comment_tags, options):
extensions.add(InternationalizationExtension)
def getbool(options, key, default=False):
return options.get(key, str(default)).lower() in \
('1', 'on', 'yes', 'true')
return options.get(key, str(default)).lower() in ("1", "on", "yes", "true")
silent = getbool(options, 'silent', True)
silent = getbool(options, "silent", True)
environment = Environment(
options.get('block_start_string', BLOCK_START_STRING),
options.get('block_end_string', BLOCK_END_STRING),
options.get('variable_start_string', VARIABLE_START_STRING),
options.get('variable_end_string', VARIABLE_END_STRING),
options.get('comment_start_string', COMMENT_START_STRING),
options.get('comment_end_string', COMMENT_END_STRING),
options.get('line_statement_prefix') or LINE_STATEMENT_PREFIX,
options.get('line_comment_prefix') or LINE_COMMENT_PREFIX,
getbool(options, 'trim_blocks', TRIM_BLOCKS),
getbool(options, 'lstrip_blocks', LSTRIP_BLOCKS),
options.get("block_start_string", BLOCK_START_STRING),
options.get("block_end_string", BLOCK_END_STRING),
options.get("variable_start_string", VARIABLE_START_STRING),
options.get("variable_end_string", VARIABLE_END_STRING),
options.get("comment_start_string", COMMENT_START_STRING),
options.get("comment_end_string", COMMENT_END_STRING),
options.get("line_statement_prefix") or LINE_STATEMENT_PREFIX,
options.get("line_comment_prefix") or LINE_COMMENT_PREFIX,
getbool(options, "trim_blocks", TRIM_BLOCKS),
getbool(options, "lstrip_blocks", LSTRIP_BLOCKS),
NEWLINE_SEQUENCE,
getbool(options, 'keep_trailing_newline', KEEP_TRAILING_NEWLINE),
getbool(options, "keep_trailing_newline", KEEP_TRAILING_NEWLINE),
frozenset(extensions),
cache_size=0,
auto_reload=False
auto_reload=False,
)
if getbool(options, 'trimmed'):
environment.policies['ext.i18n.trimmed'] = True
if getbool(options, 'newstyle_gettext'):
if getbool(options, "trimmed"):
environment.policies["ext.i18n.trimmed"] = True
if getbool(options, "newstyle_gettext"):
environment.newstyle_gettext = True
source = fileobj.read().decode(options.get('encoding', 'utf-8'))
source = fileobj.read().decode(options.get("encoding", "utf-8"))
try:
node = environment.parse(source)
tokens = list(environment.lex(environment.preprocess(source)))
except TemplateSyntaxError as e:
except TemplateSyntaxError:
if not silent:
raise
# skip templates with syntax errors
# skip templates with syntax errors rather than failing the extraction run
@@ -625,3 +701,4 @@ def getbool(options, key, default=False):
loopcontrols = LoopControlExtension
with_ = WithExtension
autoescape = AutoEscapeExtension
debug = DebugExtension
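A hedged sketch of driving babel_extract directly (the template, keywords, and option values below are illustrative, not taken from this changeset; Babel normally calls this through its extractor entry point):

    from io import BytesIO
    from jinja2.ext import babel_extract

    source = BytesIO(b"{% trans %}Hello {{ name }}!{% endtrans %}")
    # Options mirror Environment settings; all values are strings.
    messages = list(
        babel_extract(
            source,
            keywords=("gettext", "ngettext", "_"),
            comment_tags=(),
            options={"trimmed": "true", "newstyle_gettext": "true"},
        )
    )
    print(messages)  # e.g. [(1, 'gettext', 'Hello %(name)s!', [])]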

File diff suppressed because it is too large


@@ -1,11 +1,10 @@
from jinja2.visitor import NodeVisitor
from jinja2._compat import iteritems
from ._compat import iteritems
from .visitor import NodeVisitor
VAR_LOAD_PARAMETER = 'param'
VAR_LOAD_RESOLVE = 'resolve'
VAR_LOAD_ALIAS = 'alias'
VAR_LOAD_UNDEFINED = 'undefined'
VAR_LOAD_PARAMETER = "param"
VAR_LOAD_RESOLVE = "resolve"
VAR_LOAD_ALIAS = "alias"
VAR_LOAD_UNDEFINED = "undefined"
def find_symbols(nodes, parent_symbols=None):
@@ -23,7 +22,6 @@ def symbols_for_node(node, parent_symbols=None):
class Symbols(object):
def __init__(self, parent=None, level=None):
if level is None:
if parent is None:
@@ -41,7 +39,7 @@ def analyze_node(self, node, **kwargs):
visitor.visit(node, **kwargs)
def _define_ref(self, name, load=None):
ident = 'l_%d_%s' % (self.level, name)
ident = "l_%d_%s" % (self.level, name)
self.refs[name] = ident
if load is not None:
self.loads[ident] = load
@@ -62,8 +60,10 @@ def find_ref(self, name):
def ref(self, name):
rv = self.find_ref(name)
if rv is None:
raise AssertionError('Tried to resolve a name to a reference that '
'was unknown to the frame (%r)' % name)
raise AssertionError(
"Tried to resolve a name to a reference that "
"was unknown to the frame (%r)" % name
)
return rv
def copy(self):
@@ -118,7 +118,7 @@ def branch_update(self, branch_symbols):
if branch_count == len(branch_symbols):
continue
target = self.find_ref(name)
assert target is not None, 'should not happen'
assert target is not None, "should not happen"
if self.parent is not None:
outer_target = self.parent.find_ref(name)
@@ -149,7 +149,6 @@ def dump_param_targets(self):
class RootVisitor(NodeVisitor):
def __init__(self, symbols):
self.sym_visitor = FrameSymbolVisitor(symbols)
@@ -157,35 +156,39 @@ def _simple_visit(self, node, **kwargs):
for child in node.iter_child_nodes():
self.sym_visitor.visit(child)
visit_Template = visit_Block = visit_Macro = visit_FilterBlock = \
visit_Scope = visit_If = visit_ScopedEvalContextModifier = \
_simple_visit
visit_Template = (
visit_Block
) = (
visit_Macro
) = (
visit_FilterBlock
) = visit_Scope = visit_If = visit_ScopedEvalContextModifier = _simple_visit
def visit_AssignBlock(self, node, **kwargs):
for child in node.body:
self.sym_visitor.visit(child)
def visit_CallBlock(self, node, **kwargs):
for child in node.iter_child_nodes(exclude=('call',)):
for child in node.iter_child_nodes(exclude=("call",)):
self.sym_visitor.visit(child)
def visit_OverlayScope(self, node, **kwargs):
for child in node.body:
self.sym_visitor.visit(child)
def visit_For(self, node, for_branch='body', **kwargs):
if for_branch == 'body':
def visit_For(self, node, for_branch="body", **kwargs):
if for_branch == "body":
self.sym_visitor.visit(node.target, store_as_param=True)
branch = node.body
elif for_branch == 'else':
elif for_branch == "else":
branch = node.else_
elif for_branch == 'test':
elif for_branch == "test":
self.sym_visitor.visit(node.target, store_as_param=True)
if node.test is not None:
self.sym_visitor.visit(node.test)
return
else:
raise RuntimeError('Unknown for branch')
raise RuntimeError("Unknown for branch")
for item in branch or ():
self.sym_visitor.visit(item)
@@ -196,8 +199,9 @@ def visit_With(self, node, **kwargs):
self.sym_visitor.visit(child)
def generic_visit(self, node, *args, **kwargs):
raise NotImplementedError('Cannot find symbols for %r' %
node.__class__.__name__)
raise NotImplementedError(
"Cannot find symbols for %r" % node.__class__.__name__
)
class FrameSymbolVisitor(NodeVisitor):
@@ -208,11 +212,11 @@ def __init__(self, symbols):
def visit_Name(self, node, store_as_param=False, **kwargs):
"""All assignments to names go through this function."""
if store_as_param or node.ctx == 'param':
if store_as_param or node.ctx == "param":
self.symbols.declare_parameter(node.name)
elif node.ctx == 'store':
elif node.ctx == "store":
self.symbols.store(node.name)
elif node.ctx == 'load':
elif node.ctx == "load":
self.symbols.load(node.name)
def visit_NSRef(self, node, **kwargs):

File diff suppressed because it is too large


@@ -1,22 +1,21 @@
# -*- coding: utf-8 -*-
"""
jinja2.loaders
~~~~~~~~~~~~~~
Jinja loader classes.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD, see LICENSE for more details.
"""API and implementations for loading templates from different data
sources.
"""
import os
import sys
import weakref
from types import ModuleType
from os import path
from hashlib import sha1
from jinja2.exceptions import TemplateNotFound
from jinja2.utils import open_if_exists, internalcode
from jinja2._compat import string_types, iteritems
from os import path
from types import ModuleType
from ._compat import abc
from ._compat import fspath
from ._compat import iteritems
from ._compat import string_types
from .exceptions import TemplateNotFound
from .utils import internalcode
from .utils import open_if_exists
def split_template_path(template):
@@ -24,12 +23,14 @@ def split_template_path(template):
'..' in the path it will raise a `TemplateNotFound` error.
"""
pieces = []
for piece in template.split('/'):
if path.sep in piece \
or (path.altsep and path.altsep in piece) or \
piece == path.pardir:
for piece in template.split("/"):
if (
path.sep in piece
or (path.altsep and path.altsep in piece)
or piece == path.pardir
):
raise TemplateNotFound(template)
elif piece and piece != '.':
elif piece and piece != ".":
pieces.append(piece)
return pieces
@@ -86,15 +87,16 @@ def get_source(self, environment, template):
the template will be reloaded.
"""
if not self.has_source_access:
raise RuntimeError('%s cannot provide access to the source' %
self.__class__.__name__)
raise RuntimeError(
"%s cannot provide access to the source" % self.__class__.__name__
)
raise TemplateNotFound(template)
def list_templates(self):
"""Iterates over all templates. If the loader does not support that
it should raise a :exc:`TypeError` which is the default behavior.
"""
raise TypeError('this loader cannot iterate over all templates')
raise TypeError("this loader cannot iterate over all templates")
@internalcode
def load(self, environment, name, globals=None):
@@ -131,8 +133,9 @@ def load(self, environment, name, globals=None):
bucket.code = code
bcc.set_bucket(bucket)
return environment.template_class.from_code(environment, code,
globals, uptodate)
return environment.template_class.from_code(
environment, code, globals, uptodate
)
class FileSystemLoader(BaseLoader):
@@ -153,14 +156,20 @@ class FileSystemLoader(BaseLoader):
>>> loader = FileSystemLoader('/path/to/templates', followlinks=True)
.. versionchanged:: 2.8+
The *followlinks* parameter was added.
.. versionchanged:: 2.8
The ``followlinks`` parameter was added.
"""
def __init__(self, searchpath, encoding='utf-8', followlinks=False):
if isinstance(searchpath, string_types):
def __init__(self, searchpath, encoding="utf-8", followlinks=False):
if not isinstance(searchpath, abc.Iterable) or isinstance(
searchpath, string_types
):
searchpath = [searchpath]
self.searchpath = list(searchpath)
# In Python 3.5, os.path.join doesn't support Path. This can be
# simplified to list(searchpath) when Python 3.5 is dropped.
self.searchpath = [fspath(p) for p in searchpath]
self.encoding = encoding
self.followlinks = followlinks
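A usage sketch (assuming a templates/index.html exists next to the calling code; passing pathlib.Path objects is what the new fspath() call above enables):

    from pathlib import Path
    from jinja2 import Environment, FileSystemLoader

    # Multiple search paths are allowed; Path objects work thanks to fspath().
    env = Environment(loader=FileSystemLoader([Path("templates"), "other/templates"]))
    template = env.get_template("index.html")
    print(template.render(title="Home"))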
@@ -183,6 +192,7 @@ def uptodate():
return path.getmtime(filename) == mtime
except OSError:
return False
return contents, filename, uptodate
raise TemplateNotFound(template)
@@ -190,12 +200,14 @@ def list_templates(self):
found = set()
for searchpath in self.searchpath:
walk_dir = os.walk(searchpath, followlinks=self.followlinks)
for dirpath, dirnames, filenames in walk_dir:
for dirpath, _, filenames in walk_dir:
for filename in filenames:
template = os.path.join(dirpath, filename) \
[len(searchpath):].strip(os.path.sep) \
.replace(os.path.sep, '/')
if template[:2] == './':
template = (
os.path.join(dirpath, filename)[len(searchpath) :]
.strip(os.path.sep)
.replace(os.path.sep, "/")
)
if template[:2] == "./":
template = template[2:]
if template not in found:
found.add(template)
@@ -217,10 +229,11 @@ class PackageLoader(BaseLoader):
from the file system and not a zip file.
"""
def __init__(self, package_name, package_path='templates',
encoding='utf-8'):
from pkg_resources import DefaultProvider, ResourceManager, \
get_provider
def __init__(self, package_name, package_path="templates", encoding="utf-8"):
from pkg_resources import DefaultProvider
from pkg_resources import get_provider
from pkg_resources import ResourceManager
provider = get_provider(package_name)
self.encoding = encoding
self.manager = ResourceManager()
@@ -230,14 +243,17 @@ def __init__(self, package_name, package_path='templates',
def get_source(self, environment, template):
pieces = split_template_path(template)
p = '/'.join((self.package_path,) + tuple(pieces))
p = "/".join((self.package_path,) + tuple(pieces))
if not self.provider.has_resource(p):
raise TemplateNotFound(template)
filename = uptodate = None
if self.filesystem_bound:
filename = self.provider.get_resource_filename(self.manager, p)
mtime = path.getmtime(filename)
def uptodate():
try:
return path.getmtime(filename) == mtime
@@ -249,19 +265,24 @@ def uptodate():
def list_templates(self):
path = self.package_path
if path[:2] == './':
if path[:2] == "./":
path = path[2:]
elif path == '.':
path = ''
elif path == ".":
path = ""
offset = len(path)
results = []
def _walk(path):
for filename in self.provider.resource_listdir(path):
fullname = path + '/' + filename
fullname = path + "/" + filename
if self.provider.resource_isdir(fullname):
_walk(fullname)
else:
results.append(fullname[offset:].lstrip('/'))
results.append(fullname[offset:].lstrip("/"))
_walk(path)
results.sort()
return results
@@ -334,7 +355,7 @@ class PrefixLoader(BaseLoader):
by loading ``'app2/index.html'`` the file from the second.
"""
def __init__(self, mapping, delimiter='/'):
def __init__(self, mapping, delimiter="/"):
self.mapping = mapping
self.delimiter = delimiter
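A sketch of the mapping described in the docstring above (assuming two installed packages, app1 and app2, each shipping a templates directory):

    from jinja2 import Environment, PackageLoader, PrefixLoader

    loader = PrefixLoader({
        "app1": PackageLoader("app1"),
        "app2": PackageLoader("app2"),
    })
    env = Environment(loader=loader)
    # 'app1/index.html' comes from the first package, 'app2/index.html' from the second.
    template = env.get_template("app1/index.html")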
@@ -434,19 +455,20 @@ class ModuleLoader(BaseLoader):
has_source_access = False
def __init__(self, path):
package_name = '_jinja2_module_templates_%x' % id(self)
package_name = "_jinja2_module_templates_%x" % id(self)
# create a fake module that looks for the templates in the
# path given.
mod = _TemplateModule(package_name)
if isinstance(path, string_types):
path = [path]
else:
path = list(path)
mod.__path__ = path
sys.modules[package_name] = weakref.proxy(mod,
lambda x: sys.modules.pop(package_name, None))
if not isinstance(path, abc.Iterable) or isinstance(path, string_types):
path = [path]
mod.__path__ = [fspath(p) for p in path]
sys.modules[package_name] = weakref.proxy(
mod, lambda x: sys.modules.pop(package_name, None)
)
# the only strong reference, the sys.modules entry is weak
# so that the garbage collector can remove it once the
@@ -456,20 +478,20 @@ def __init__(self, path):
@staticmethod
def get_template_key(name):
return 'tmpl_' + sha1(name.encode('utf-8')).hexdigest()
return "tmpl_" + sha1(name.encode("utf-8")).hexdigest()
@staticmethod
def get_module_filename(name):
return ModuleLoader.get_template_key(name) + '.py'
return ModuleLoader.get_template_key(name) + ".py"
@internalcode
def load(self, environment, name, globals=None):
key = self.get_template_key(name)
module = '%s.%s' % (self.package_name, key)
module = "%s.%s" % (self.package_name, key)
mod = getattr(self.module, module, None)
if mod is None:
try:
mod = __import__(module, None, None, ['root'])
mod = __import__(module, None, None, ["root"])
except ImportError:
raise TemplateNotFound(name)
@@ -478,4 +500,5 @@ def load(self, environment, name, globals=None):
sys.modules.pop(module, None)
return environment.template_class.from_module_dict(
environment, mod.__dict__, globals)
environment, mod.__dict__, globals
)


@@ -1,25 +1,18 @@
# -*- coding: utf-8 -*-
"""Functions that expose information about templates that might be
interesting for introspection.
"""
jinja2.meta
~~~~~~~~~~~
This module implements various functions that exposes information about
templates that might be interesting for various kinds of applications.
:copyright: (c) 2017 by the Jinja Team, see AUTHORS for more details.
:license: BSD, see LICENSE for more details.
"""
from jinja2 import nodes
from jinja2.compiler import CodeGenerator
from jinja2._compat import string_types, iteritems
from . import nodes
from ._compat import iteritems
from ._compat import string_types
from .compiler import CodeGenerator
class TrackingCodeGenerator(CodeGenerator):
"""We abuse the code generator for introspection."""
def __init__(self, environment):
CodeGenerator.__init__(self, environment, '<introspection>',
'<introspection>')
CodeGenerator.__init__(self, environment, "<introspection>", "<introspection>")
self.undeclared_identifiers = set()
def write(self, x):
@@ -29,7 +22,7 @@ def enter_frame(self, frame):
"""Remember all undeclared identifiers."""
CodeGenerator.enter_frame(self, frame)
for _, (action, param) in iteritems(frame.symbols.loads):
if action == 'resolve':
if action == "resolve" and param not in self.environment.globals:
self.undeclared_identifiers.add(param)
@@ -72,8 +65,9 @@ def find_referenced_templates(ast):
This function is useful for dependency tracking. For example if you want
to rebuild parts of the website after a layout template has changed.
"""
for node in ast.find_all((nodes.Extends, nodes.FromImport, nodes.Import,
nodes.Include)):
for node in ast.find_all(
(nodes.Extends, nodes.FromImport, nodes.Import, nodes.Include)
):
if not isinstance(node.template, nodes.Const):
# a tuple with some non consts in there
if isinstance(node.template, (nodes.Tuple, nodes.List)):
@@ -96,8 +90,9 @@ def find_referenced_templates(ast):
# a tuple or list (latter *should* not happen) made of consts,
# yield the consts that are strings. We could warn here for
# non string values
elif isinstance(node, nodes.Include) and \
isinstance(node.template.value, (tuple, list)):
elif isinstance(node, nodes.Include) and isinstance(
node.template.value, (tuple, list)
):
for template_name in node.template.value:
if isinstance(template_name, string_types):
yield template_name
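A short sketch of how this module is typically used for dependency and variable tracking (find_undeclared_variables lives in the same module but is outside the hunks shown here):

    from jinja2 import Environment, meta

    env = Environment()
    ast = env.parse("{% include helper %}Hello {{ username }}!")
    print(list(meta.find_referenced_templates(ast)))  # [None]  (dynamic include)
    print(meta.find_undeclared_variables(ast))        # {'helper', 'username'} (order may vary)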


@@ -1,19 +1,23 @@
import sys
from ast import literal_eval
from itertools import islice, chain
from jinja2 import nodes
from jinja2._compat import text_type
from jinja2.compiler import CodeGenerator, has_safe_repr
from jinja2.environment import Environment, Template
from jinja2.utils import concat, escape
from itertools import chain
from itertools import islice
from . import nodes
from ._compat import text_type
from .compiler import CodeGenerator
from .compiler import has_safe_repr
from .environment import Environment
from .environment import Template
def native_concat(nodes):
"""Return a native Python type from the list of compiled nodes. If the
result is a single node, its value is returned. Otherwise, the nodes are
concatenated as strings. If the result can be parsed with
:func:`ast.literal_eval`, the parsed value is returned. Otherwise, the
string is returned.
"""Return a native Python type from the list of compiled nodes. If
the result is a single node, its value is returned. Otherwise, the
nodes are concatenated as strings. If the result can be parsed with
:func:`ast.literal_eval`, the parsed value is returned. Otherwise,
the string is returned.
:param nodes: Iterable of nodes to concatenate.
"""
head = list(islice(nodes, 2))
@@ -21,200 +25,70 @@ def native_concat(nodes):
return None
if len(head) == 1:
out = head[0]
raw = head[0]
else:
out = u''.join([text_type(v) for v in chain(head, nodes)])
raw = u"".join([text_type(v) for v in chain(head, nodes)])
try:
return literal_eval(out)
return literal_eval(raw)
except (ValueError, SyntaxError, MemoryError):
return out
return raw
class NativeCodeGenerator(CodeGenerator):
"""A code generator which avoids injecting ``to_string()`` calls around the
internal code Jinja uses to render templates.
"""A code generator which renders Python types by not adding
``to_string()`` around output nodes.
"""
def visit_Output(self, node, frame):
"""Same as :meth:`CodeGenerator.visit_Output`, but do not call
``to_string`` on output nodes in generated code.
"""
if self.has_known_extends and frame.require_output_check:
return
@staticmethod
def _default_finalize(value):
return value
finalize = self.environment.finalize
finalize_context = getattr(finalize, 'contextfunction', False)
finalize_eval = getattr(finalize, 'evalcontextfunction', False)
finalize_env = getattr(finalize, 'environmentfunction', False)
def _output_const_repr(self, group):
return repr(u"".join([text_type(v) for v in group]))
if finalize is not None:
if finalize_context or finalize_eval:
const_finalize = None
elif finalize_env:
def const_finalize(x):
return finalize(self.environment, x)
else:
const_finalize = finalize
else:
def const_finalize(x):
return x
def _output_child_to_const(self, node, frame, finalize):
const = node.as_const(frame.eval_ctx)
# If we are inside a frame that requires output checking, we do so.
outdent_later = False
if not has_safe_repr(const):
raise nodes.Impossible()
if frame.require_output_check:
self.writeline('if parent_template is None:')
self.indent()
outdent_later = True
if isinstance(node, nodes.TemplateData):
return const
# Try to evaluate as many chunks as possible into a static string at
# compile time.
body = []
return finalize.const(const)
for child in node.nodes:
try:
if const_finalize is None:
raise nodes.Impossible()
def _output_child_pre(self, node, frame, finalize):
if finalize.src is not None:
self.write(finalize.src)
const = child.as_const(frame.eval_ctx)
if not has_safe_repr(const):
raise nodes.Impossible()
except nodes.Impossible:
body.append(child)
continue
# the frame can't be volatile here, because otherwise the as_const
# function would raise an Impossible exception at that point
try:
if frame.eval_ctx.autoescape:
if hasattr(const, '__html__'):
const = const.__html__()
else:
const = escape(const)
const = const_finalize(const)
except Exception:
# if something goes wrong here we evaluate the node at runtime
# for easier debugging
body.append(child)
continue
if body and isinstance(body[-1], list):
body[-1].append(const)
else:
body.append([const])
# if we have less than 3 nodes or a buffer we yield or extend/append
if len(body) < 3 or frame.buffer is not None:
if frame.buffer is not None:
# for one item we append, for more we extend
if len(body) == 1:
self.writeline('%s.append(' % frame.buffer)
else:
self.writeline('%s.extend((' % frame.buffer)
self.indent()
for item in body:
if isinstance(item, list):
val = repr(native_concat(item))
if frame.buffer is None:
self.writeline('yield ' + val)
else:
self.writeline(val + ',')
else:
if frame.buffer is None:
self.writeline('yield ', item)
else:
self.newline(item)
close = 0
if finalize is not None:
self.write('environment.finalize(')
if finalize_context:
self.write('context, ')
close += 1
self.visit(item, frame)
if close > 0:
self.write(')' * close)
if frame.buffer is not None:
self.write(',')
if frame.buffer is not None:
# close the open parentheses
self.outdent()
self.writeline(len(body) == 1 and ')' or '))')
# otherwise we create a format string as this is faster in that case
else:
format = []
arguments = []
for item in body:
if isinstance(item, list):
format.append(native_concat(item).replace('%', '%%'))
else:
format.append('%s')
arguments.append(item)
self.writeline('yield ')
self.write(repr(concat(format)) + ' % (')
self.indent()
for argument in arguments:
self.newline(argument)
close = 0
if finalize is not None:
self.write('environment.finalize(')
if finalize_context:
self.write('context, ')
elif finalize_eval:
self.write('context.eval_ctx, ')
elif finalize_env:
self.write('environment, ')
close += 1
self.visit(argument, frame)
self.write(')' * close + ', ')
self.outdent()
self.writeline(')')
if outdent_later:
self.outdent()
class NativeTemplate(Template):
def render(self, *args, **kwargs):
"""Render the template to produce a native Python type. If the result
is a single node, its value is returned. Otherwise, the nodes are
concatenated as strings. If the result can be parsed with
:func:`ast.literal_eval`, the parsed value is returned. Otherwise, the
string is returned.
"""
vars = dict(*args, **kwargs)
try:
return native_concat(self.root_render_func(self.new_context(vars)))
except Exception:
exc_info = sys.exc_info()
return self.environment.handle_exception(exc_info, True)
def _output_child_post(self, node, frame, finalize):
if finalize.src is not None:
self.write(")")
class NativeEnvironment(Environment):
"""An environment that renders templates to native Python types."""
code_generator_class = NativeCodeGenerator
template_class = NativeTemplate
class NativeTemplate(Template):
environment_class = NativeEnvironment
def render(self, *args, **kwargs):
"""Render the template to produce a native Python type. If the
result is a single node, its value is returned. Otherwise, the
nodes are concatenated as strings. If the result can be parsed
with :func:`ast.literal_eval`, the parsed value is returned.
Otherwise, the string is returned.
"""
vars = dict(*args, **kwargs)
try:
return native_concat(self.root_render_func(self.new_context(vars)))
except Exception:
return self.environment.handle_exception()
NativeEnvironment.template_class = NativeTemplate
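A usage sketch of the behaviour described in the render docstring (the arithmetic result comes back as an int rather than a string):

    from jinja2.nativetypes import NativeEnvironment

    env = NativeEnvironment()
    template = env.from_string("{{ x + y }}")
    result = template.render(x=4, y=2)
    print(result, type(result))  # 6 <class 'int'>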


@@ -1,54 +1,39 @@
# -*- coding: utf-8 -*-
"""AST nodes generated by the parser for the compiler. Also provides
some node tree helper functions used by the parser and compiler in order
to normalize nodes.
"""
jinja2.nodes
~~~~~~~~~~~~
This module implements additional nodes derived from the ast base node.
It also provides some node tree helper functions like `in_lineno` and
`get_nodes` used by the parser and translator in order to normalize
python and jinja nodes.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD, see LICENSE for more details.
"""
import types
import operator
from collections import deque
from jinja2.utils import Markup
from jinja2._compat import izip, with_metaclass, text_type, PY2
from markupsafe import Markup
#: the types we support for context functions
_context_function_types = (types.FunctionType, types.MethodType)
from ._compat import izip
from ._compat import PY2
from ._compat import text_type
from ._compat import with_metaclass
_binop_to_func = {
'*': operator.mul,
'/': operator.truediv,
'//': operator.floordiv,
'**': operator.pow,
'%': operator.mod,
'+': operator.add,
'-': operator.sub
"*": operator.mul,
"/": operator.truediv,
"//": operator.floordiv,
"**": operator.pow,
"%": operator.mod,
"+": operator.add,
"-": operator.sub,
}
_uaop_to_func = {
'not': operator.not_,
'+': operator.pos,
'-': operator.neg
}
_uaop_to_func = {"not": operator.not_, "+": operator.pos, "-": operator.neg}
_cmpop_to_func = {
'eq': operator.eq,
'ne': operator.ne,
'gt': operator.gt,
'gteq': operator.ge,
'lt': operator.lt,
'lteq': operator.le,
'in': lambda a, b: a in b,
'notin': lambda a, b: a not in b
"eq": operator.eq,
"ne": operator.ne,
"gt": operator.gt,
"gteq": operator.ge,
"lt": operator.lt,
"lteq": operator.le,
"in": lambda a, b: a in b,
"notin": lambda a, b: a not in b,
}
@@ -61,16 +46,16 @@ class NodeType(type):
inheritance. fields and attributes from the parent class are
automatically forwarded to the child."""
def __new__(cls, name, bases, d):
for attr in 'fields', 'attributes':
def __new__(mcs, name, bases, d):
for attr in "fields", "attributes":
storage = []
storage.extend(getattr(bases[0], attr, ()))
storage.extend(d.get(attr, ()))
assert len(bases) == 1, 'multiple inheritance not allowed'
assert len(storage) == len(set(storage)), 'layout conflict'
assert len(bases) == 1, "multiple inheritance not allowed"
assert len(storage) == len(set(storage)), "layout conflict"
d[attr] = tuple(storage)
d.setdefault('abstract', False)
return type.__new__(cls, name, bases, d)
d.setdefault("abstract", False)
return type.__new__(mcs, name, bases, d)
class EvalContext(object):
@@ -97,15 +82,17 @@ def revert(self, old):
def get_eval_context(node, ctx):
if ctx is None:
if node.environment is None:
raise RuntimeError('if no eval context is passed, the '
'node must have an attached '
'environment.')
raise RuntimeError(
"if no eval context is passed, the "
"node must have an attached "
"environment."
)
return EvalContext(node.environment)
return ctx
class Node(with_metaclass(NodeType, object)):
"""Baseclass for all Jinja2 nodes. There are a number of nodes available
"""Baseclass for all Jinja nodes. There are a number of nodes available
of different types. There are four major types:
- :class:`Stmt`: statements
@@ -120,30 +107,32 @@ class Node(with_metaclass(NodeType, object)):
The `environment` attribute is set at the end of the parsing process for
all nodes automatically.
"""
fields = ()
attributes = ('lineno', 'environment')
attributes = ("lineno", "environment")
abstract = True
def __init__(self, *fields, **attributes):
if self.abstract:
raise TypeError('abstract nodes are not instanciable')
raise TypeError("abstract nodes are not instantiable")
if fields:
if len(fields) != len(self.fields):
if not self.fields:
raise TypeError('%r takes 0 arguments' %
self.__class__.__name__)
raise TypeError('%r takes 0 or %d argument%s' % (
self.__class__.__name__,
len(self.fields),
len(self.fields) != 1 and 's' or ''
))
raise TypeError("%r takes 0 arguments" % self.__class__.__name__)
raise TypeError(
"%r takes 0 or %d argument%s"
% (
self.__class__.__name__,
len(self.fields),
len(self.fields) != 1 and "s" or "",
)
)
for name, arg in izip(self.fields, fields):
setattr(self, name, arg)
for attr in self.attributes:
setattr(self, attr, attributes.pop(attr, None))
if attributes:
raise TypeError('unknown attribute %r' %
next(iter(attributes)))
raise TypeError("unknown attribute %r" % next(iter(attributes)))
def iter_fields(self, exclude=None, only=None):
"""This method iterates over all fields that are defined and yields
@@ -153,9 +142,11 @@ def iter_fields(self, exclude=None, only=None):
should be sets or tuples of field names.
"""
for name in self.fields:
if (exclude is only is None) or \
(exclude is not None and name not in exclude) or \
(only is not None and name in only):
if (
(exclude is only is None)
or (exclude is not None and name not in exclude)
or (only is not None and name in only)
):
try:
yield name, getattr(self, name)
except AttributeError:
@@ -166,7 +157,7 @@ def iter_child_nodes(self, exclude=None, only=None):
over all fields and yields the values of they are nodes. If the value
of a field is a list all the nodes in that list are returned.
"""
for field, item in self.iter_fields(exclude, only):
for _, item in self.iter_fields(exclude, only):
if isinstance(item, list):
for n in item:
if isinstance(n, Node):
@@ -200,7 +191,7 @@ def set_ctx(self, ctx):
todo = deque([self])
while todo:
node = todo.popleft()
if 'ctx' in node.fields:
if "ctx" in node.fields:
node.ctx = ctx
todo.extend(node.iter_child_nodes())
return self
@@ -210,7 +201,7 @@ def set_lineno(self, lineno, override=False):
todo = deque([self])
while todo:
node = todo.popleft()
if 'lineno' in node.attributes:
if "lineno" in node.attributes:
if node.lineno is None or override:
node.lineno = lineno
todo.extend(node.iter_child_nodes())
@@ -226,8 +217,9 @@ def set_environment(self, environment):
return self
def __eq__(self, other):
return type(self) is type(other) and \
tuple(self.iter_fields()) == tuple(other.iter_fields())
return type(self) is type(other) and tuple(self.iter_fields()) == tuple(
other.iter_fields()
)
def __ne__(self, other):
return not self.__eq__(other)
@@ -236,10 +228,9 @@ def __ne__(self, other):
__hash__ = object.__hash__
def __repr__(self):
return '%s(%s)' % (
return "%s(%s)" % (
self.__class__.__name__,
', '.join('%s=%r' % (arg, getattr(self, arg, None)) for
arg in self.fields)
", ".join("%s=%r" % (arg, getattr(self, arg, None)) for arg in self.fields),
)
def dump(self):
@@ -248,37 +239,39 @@ def _dump(node):
buf.append(repr(node))
return
buf.append('nodes.%s(' % node.__class__.__name__)
buf.append("nodes.%s(" % node.__class__.__name__)
if not node.fields:
buf.append(')')
buf.append(")")
return
for idx, field in enumerate(node.fields):
if idx:
buf.append(', ')
buf.append(", ")
value = getattr(node, field)
if isinstance(value, list):
buf.append('[')
buf.append("[")
for idx, item in enumerate(value):
if idx:
buf.append(', ')
buf.append(", ")
_dump(item)
buf.append(']')
buf.append("]")
else:
_dump(value)
buf.append(')')
buf.append(")")
buf = []
_dump(self)
return ''.join(buf)
return "".join(buf)
class Stmt(Node):
"""Base node for all statements."""
abstract = True
class Helper(Node):
"""Nodes that exist in a specific context only."""
abstract = True
@@ -286,19 +279,22 @@ class Template(Node):
"""Node that represents a template. This must be the outermost node that
is passed to the compiler.
"""
fields = ('body',)
fields = ("body",)
class Output(Stmt):
"""A node that holds multiple expressions which are then printed out.
This is used both for the `print` statement and the regular template data.
"""
fields = ('nodes',)
fields = ("nodes",)
class Extends(Stmt):
"""Represents an extends statement."""
fields = ('template',)
fields = ("template",)
class For(Stmt):
@@ -309,12 +305,14 @@ class For(Stmt):
For filtered nodes an expression can be stored as `test`, otherwise `None`.
"""
fields = ('target', 'iter', 'body', 'else_', 'test', 'recursive')
fields = ("target", "iter", "body", "else_", "test", "recursive")
class If(Stmt):
"""If `test` is true, `body` is rendered, else `else_`."""
fields = ('test', 'body', 'elif_', 'else_')
fields = ("test", "body", "elif_", "else_")
class Macro(Stmt):
@@ -322,19 +320,22 @@ class Macro(Stmt):
arguments and `defaults` a list of defaults if there are any. `body` is
a list of nodes for the macro body.
"""
fields = ('name', 'args', 'defaults', 'body')
fields = ("name", "args", "defaults", "body")
class CallBlock(Stmt):
"""Like a macro without a name but a call instead. `call` is called with
the unnamed macro as `caller` argument this node holds.
"""
fields = ('call', 'args', 'defaults', 'body')
fields = ("call", "args", "defaults", "body")
class FilterBlock(Stmt):
"""Node for filter sections."""
fields = ('body', 'filter')
fields = ("body", "filter")
class With(Stmt):
@@ -343,22 +344,26 @@ class With(Stmt):
.. versionadded:: 2.9.3
"""
fields = ('targets', 'values', 'body')
fields = ("targets", "values", "body")
class Block(Stmt):
"""A node that represents a block."""
fields = ('name', 'body', 'scoped')
fields = ("name", "body", "scoped")
class Include(Stmt):
"""A node that represents the include tag."""
fields = ('template', 'with_context', 'ignore_missing')
fields = ("template", "with_context", "ignore_missing")
class Import(Stmt):
"""A node that represents the import tag."""
fields = ('template', 'target', 'with_context')
fields = ("template", "target", "with_context")
class FromImport(Stmt):
@@ -372,26 +377,31 @@ class FromImport(Stmt):
The list of names may contain tuples if aliases are wanted.
"""
fields = ('template', 'names', 'with_context')
fields = ("template", "names", "with_context")
class ExprStmt(Stmt):
"""A statement that evaluates an expression and discards the result."""
fields = ('node',)
fields = ("node",)
class Assign(Stmt):
"""Assigns an expression to a target."""
fields = ('target', 'node')
fields = ("target", "node")
class AssignBlock(Stmt):
"""Assigns a block to a target."""
fields = ('target', 'filter', 'body')
fields = ("target", "filter", "body")
class Expr(Node):
"""Baseclass for all expressions."""
abstract = True
def as_const(self, eval_ctx=None):
@@ -414,15 +424,18 @@ def can_assign(self):
class BinExpr(Expr):
"""Baseclass for all binary expressions."""
fields = ('left', 'right')
fields = ("left", "right")
operator = None
abstract = True
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
# intercepted operators cannot be folded at compile time
if self.environment.sandboxed and \
self.operator in self.environment.intercepted_binops:
if (
self.environment.sandboxed
and self.operator in self.environment.intercepted_binops
):
raise Impossible()
f = _binop_to_func[self.operator]
try:
@@ -433,15 +446,18 @@ def as_const(self, eval_ctx=None):
class UnaryExpr(Expr):
"""Baseclass for all unary expressions."""
fields = ('node',)
fields = ("node",)
operator = None
abstract = True
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
# intercepted operators cannot be folded at compile time
if self.environment.sandboxed and \
self.operator in self.environment.intercepted_unops:
if (
self.environment.sandboxed
and self.operator in self.environment.intercepted_unops
):
raise Impossible()
f = _uaop_to_func[self.operator]
try:
@@ -458,16 +474,17 @@ class Name(Expr):
- `load`: load that name
- `param`: like `store` but if the name was defined as function parameter.
"""
fields = ('name', 'ctx')
fields = ("name", "ctx")
def can_assign(self):
return self.name not in ('true', 'false', 'none',
'True', 'False', 'None')
return self.name not in ("true", "false", "none", "True", "False", "None")
class NSRef(Expr):
"""Reference to a namespace value assignment"""
fields = ('name', 'attr')
fields = ("name", "attr")
def can_assign(self):
# We don't need any special checks here; NSRef assignments have a
@@ -479,6 +496,7 @@ def can_assign(self):
class Literal(Expr):
"""Baseclass for literals."""
abstract = True
@@ -488,14 +506,18 @@ class Const(Literal):
complex values such as lists too. Only constants with a safe
representation (objects where ``eval(repr(x)) == x`` is true).
"""
fields = ('value',)
fields = ("value",)
def as_const(self, eval_ctx=None):
rv = self.value
if PY2 and type(rv) is text_type and \
self.environment.policies['compiler.ascii_str']:
if (
PY2
and type(rv) is text_type
and self.environment.policies["compiler.ascii_str"]
):
try:
rv = rv.encode('ascii')
rv = rv.encode("ascii")
except UnicodeError:
pass
return rv
@@ -507,6 +529,7 @@ def from_untrusted(cls, value, lineno=None, environment=None):
an `Impossible` exception.
"""
from .compiler import has_safe_repr
if not has_safe_repr(value):
raise Impossible()
return cls(value, lineno=lineno, environment=environment)
@@ -514,7 +537,8 @@ def from_untrusted(cls, value, lineno=None, environment=None):
class TemplateData(Literal):
"""A constant template string."""
fields = ('data',)
fields = ("data",)
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -530,7 +554,8 @@ class Tuple(Literal):
for subscripts. Like for :class:`Name` `ctx` specifies if the tuple
is used for loading the names or storing.
"""
fields = ('items', 'ctx')
fields = ("items", "ctx")
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -545,7 +570,8 @@ def can_assign(self):
class List(Literal):
"""Any list literal such as ``[1, 2, 3]``"""
fields = ('items',)
fields = ("items",)
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -556,7 +582,8 @@ class Dict(Literal):
"""Any dict literal such as ``{1: 2, 3: 4}``. The items must be a list of
:class:`Pair` nodes.
"""
fields = ('items',)
fields = ("items",)
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -565,7 +592,8 @@ def as_const(self, eval_ctx=None):
class Pair(Helper):
"""A key, value pair for dicts."""
fields = ('key', 'value')
fields = ("key", "value")
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -574,7 +602,8 @@ def as_const(self, eval_ctx=None):
class Keyword(Helper):
"""A key, value pair for keyword arguments where key is a string."""
fields = ('key', 'value')
fields = ("key", "value")
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -585,7 +614,8 @@ class CondExpr(Expr):
"""A conditional expression (inline if expression). (``{{
foo if bar else baz }}``)
"""
fields = ('test', 'expr1', 'expr2')
fields = ("test", "expr1", "expr2")
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -626,7 +656,7 @@ class Filter(Expr):
filtered. Buffers are created by macros and filter blocks.
"""
fields = ('node', 'name', 'args', 'kwargs', 'dyn_args', 'dyn_kwargs')
fields = ("node", "name", "args", "kwargs", "dyn_args", "dyn_kwargs")
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -636,28 +666,27 @@ def as_const(self, eval_ctx=None):
# we have to be careful here because we call filter_ below.
# if this variable would be called filter, 2to3 would wrap the
# call in a list beause it is assuming we are talking about the
# call in a list because it is assuming we are talking about the
# builtin filter function here which no longer returns a list in
# python 3. because of that, do not rename filter_ to filter!
filter_ = self.environment.filters.get(self.name)
if filter_ is None or getattr(filter_, 'contextfilter', False):
if filter_ is None or getattr(filter_, "contextfilter", False) is True:
raise Impossible()
# We cannot constant handle async filters, so we need to make sure
# to not go down this path.
if (
eval_ctx.environment.is_async
and getattr(filter_, 'asyncfiltervariant', False)
if eval_ctx.environment.is_async and getattr(
filter_, "asyncfiltervariant", False
):
raise Impossible()
args, kwargs = args_as_const(self, eval_ctx)
args.insert(0, self.node.as_const(eval_ctx))
if getattr(filter_, 'evalcontextfilter', False):
if getattr(filter_, "evalcontextfilter", False) is True:
args.insert(0, eval_ctx)
elif getattr(filter_, 'environmentfilter', False):
elif getattr(filter_, "environmentfilter", False) is True:
args.insert(0, self.environment)
try:
@@ -671,7 +700,7 @@ class Test(Expr):
rest of the fields are the same as for :class:`Call`.
"""
fields = ('node', 'name', 'args', 'kwargs', 'dyn_args', 'dyn_kwargs')
fields = ("node", "name", "args", "kwargs", "dyn_args", "dyn_kwargs")
def as_const(self, eval_ctx=None):
test = self.environment.tests.get(self.name)
@@ -696,20 +725,23 @@ class Call(Expr):
node for dynamic positional (``*args``) or keyword (``**kwargs``)
arguments.
"""
fields = ('node', 'args', 'kwargs', 'dyn_args', 'dyn_kwargs')
fields = ("node", "args", "kwargs", "dyn_args", "dyn_kwargs")
class Getitem(Expr):
"""Get an attribute or item from an expression and prefer the item."""
fields = ('node', 'arg', 'ctx')
fields = ("node", "arg", "ctx")
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
if self.ctx != 'load':
if self.ctx != "load":
raise Impossible()
try:
return self.environment.getitem(self.node.as_const(eval_ctx),
self.arg.as_const(eval_ctx))
return self.environment.getitem(
self.node.as_const(eval_ctx), self.arg.as_const(eval_ctx)
)
except Exception:
raise Impossible()
@@ -721,15 +753,15 @@ class Getattr(Expr):
"""Get an attribute or item from an expression that is a ascii-only
bytestring and prefer the attribute.
"""
fields = ('node', 'attr', 'ctx')
fields = ("node", "attr", "ctx")
def as_const(self, eval_ctx=None):
if self.ctx != 'load':
if self.ctx != "load":
raise Impossible()
try:
eval_ctx = get_eval_context(self, eval_ctx)
return self.environment.getattr(self.node.as_const(eval_ctx),
self.attr)
return self.environment.getattr(self.node.as_const(eval_ctx), self.attr)
except Exception:
raise Impossible()
@@ -741,14 +773,17 @@ class Slice(Expr):
"""Represents a slice object. This must only be used as argument for
:class:`Subscript`.
"""
fields = ('start', 'stop', 'step')
fields = ("start", "stop", "step")
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
def const(obj):
if obj is None:
return None
return obj.as_const(eval_ctx)
return slice(const(self.start), const(self.stop), const(self.step))
@@ -756,82 +791,103 @@ class Concat(Expr):
"""Concatenates the list of expressions provided after converting them to
unicode.
"""
fields = ('nodes',)
fields = ("nodes",)
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
return ''.join(text_type(x.as_const(eval_ctx)) for x in self.nodes)
return "".join(text_type(x.as_const(eval_ctx)) for x in self.nodes)
class Compare(Expr):
"""Compares an expression with some other expressions. `ops` must be a
list of :class:`Operand`\\s.
"""
fields = ('expr', 'ops')
fields = ("expr", "ops")
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
result = value = self.expr.as_const(eval_ctx)
try:
for op in self.ops:
new_value = op.expr.as_const(eval_ctx)
result = _cmpop_to_func[op.op](value, new_value)
if not result:
return False
value = new_value
except Exception:
raise Impossible()
return result
class Operand(Helper):
"""Holds an operator and an expression."""
fields = ('op', 'expr')
fields = ("op", "expr")
if __debug__:
Operand.__doc__ += '\nThe following operators are available: ' + \
', '.join(sorted('``%s``' % x for x in set(_binop_to_func) |
set(_uaop_to_func) | set(_cmpop_to_func)))
Operand.__doc__ += "\nThe following operators are available: " + ", ".join(
sorted(
"``%s``" % x
for x in set(_binop_to_func) | set(_uaop_to_func) | set(_cmpop_to_func)
)
)
class Mul(BinExpr):
"""Multiplies the left with the right node."""
operator = '*'
operator = "*"
class Div(BinExpr):
"""Divides the left by the right node."""
operator = '/'
operator = "/"
class FloorDiv(BinExpr):
"""Divides the left by the right node and truncates conver the
result into an integer by truncating.
"""
operator = '//'
operator = "//"
class Add(BinExpr):
"""Add the left to the right node."""
operator = '+'
operator = "+"
class Sub(BinExpr):
"""Subtract the right from the left node."""
operator = '-'
operator = "-"
class Mod(BinExpr):
"""Left modulo right."""
operator = '%'
operator = "%"
class Pow(BinExpr):
"""Left to the power of right."""
operator = '**'
operator = "**"
class And(BinExpr):
"""Short circuited AND."""
operator = 'and'
operator = "and"
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -840,7 +896,8 @@ def as_const(self, eval_ctx=None):
class Or(BinExpr):
"""Short circuited OR."""
operator = 'or'
operator = "or"
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -849,17 +906,20 @@ def as_const(self, eval_ctx=None):
class Not(UnaryExpr):
"""Negate the expression."""
operator = 'not'
operator = "not"
class Neg(UnaryExpr):
"""Make the expression negative."""
operator = '-'
operator = "-"
class Pos(UnaryExpr):
"""Make the expression positive (noop for most expressions)"""
operator = '+'
operator = "+"
# Helpers for extensions
@@ -869,7 +929,8 @@ class EnvironmentAttribute(Expr):
"""Loads an attribute from the environment object. This is useful for
extensions that want to call a callback stored on the environment.
"""
fields = ('name',)
fields = ("name",)
class ExtensionAttribute(Expr):
@@ -879,7 +940,8 @@ class ExtensionAttribute(Expr):
This node is usually constructed by calling the
:meth:`~jinja2.ext.Extension.attr` method on an extension.
"""
fields = ('identifier', 'name')
fields = ("identifier", "name")
class ImportedName(Expr):
@@ -888,7 +950,8 @@ class ImportedName(Expr):
function from the cgi module on evaluation. Imports are optimized by the
compiler so there is no need to assign them to local variables.
"""
fields = ('importname',)
fields = ("importname",)
class InternalName(Expr):
@@ -898,16 +961,20 @@ class InternalName(Expr):
a new identifier for you. This identifier is not available from the
template and is not threated specially by the compiler.
"""
fields = ('name',)
fields = ("name",)
def __init__(self):
raise TypeError('Can\'t create internal names. Use the '
'`free_identifier` method on a parser.')
raise TypeError(
"Can't create internal names. Use the "
"`free_identifier` method on a parser."
)
class MarkSafe(Expr):
"""Mark the wrapped expression as safe (wrap it as `Markup`)."""
fields = ('expr',)
fields = ("expr",)
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -920,7 +987,8 @@ class MarkSafeIfAutoescape(Expr):
.. versionadded:: 2.5
"""
fields = ('expr',)
fields = ("expr",)
def as_const(self, eval_ctx=None):
eval_ctx = get_eval_context(self, eval_ctx)
@@ -942,6 +1010,20 @@ class ContextReference(Expr):
Assign(Name('foo', ctx='store'),
Getattr(ContextReference(), 'name'))
This is basically equivalent to using the
:func:`~jinja2.contextfunction` decorator when using the
high-level API, which causes a reference to the context to be passed
as the first argument to a function.
"""
class DerivedContextReference(Expr):
"""Return the current template context including locals. Behaves
exactly like :class:`ContextReference`, but includes local
variables, such as from a ``for`` loop.
.. versionadded:: 2.11
"""
@@ -955,7 +1037,8 @@ class Break(Stmt):
class Scope(Stmt):
"""An artificial scope."""
fields = ('body',)
fields = ("body",)
class OverlayScope(Stmt):
@@ -971,7 +1054,8 @@ class OverlayScope(Stmt):
.. versionadded:: 2.10
"""
fields = ('context', 'body')
fields = ("context", "body")
class EvalContextModifier(Stmt):
@@ -982,7 +1066,8 @@ class EvalContextModifier(Stmt):
EvalContextModifier(options=[Keyword('autoescape', Const(True))])
"""
fields = ('options',)
fields = ("options",)
class ScopedEvalContextModifier(EvalContextModifier):
@@ -990,10 +1075,14 @@ class ScopedEvalContextModifier(EvalContextModifier):
:class:`EvalContextModifier` but will only modify the
:class:`~jinja2.nodes.EvalContext` for nodes in the :attr:`body`.
"""
fields = ('body',)
fields = ("body",)
# make sure nobody creates custom nodes
def _failing_new(*args, **kwargs):
raise TypeError('can\'t create custom node types')
NodeType.__new__ = staticmethod(_failing_new); del _failing_new
raise TypeError("can't create custom node types")
NodeType.__new__ = staticmethod(_failing_new)
del _failing_new
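For context on the `_failing_new` guard above, here is a minimal sketch (not part of the diff, and assuming the vendored Jinja2 behaves like upstream) of what it does at runtime: the `NodeType` metaclass rejects any externally defined node subclass. `MyNode` is a hypothetical class used only for illustration.

```python
from jinja2 import nodes

try:
    # NodeType.__new__ has been replaced with _failing_new, so defining a
    # custom node subclass raises immediately.
    class MyNode(nodes.Expr):
        fields = ("value",)
except TypeError as exc:
    print(exc)  # -> can't create custom node types
```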


@@ -1,23 +1,15 @@
# -*- coding: utf-8 -*-
"""The optimizer tries to constant fold expressions and modify the AST
in place so that it should be faster to evaluate.
Because the AST does not contain all the scoping information and the
compiler has to find that out, we cannot do all the optimizations we
want. For example, loop unrolling doesn't work because unrolled loops
would have a different scope. The solution would be a second syntax tree
that stored the scoping rules.
"""
jinja2.optimizer
~~~~~~~~~~~~~~~~
The jinja optimizer is currently trying to constant fold a few expressions
and modify the AST in place so that it should be easier to evaluate it.
Because the AST does not contain all the scoping information and the
compiler has to find that out, we cannot do all the optimizations we
want. For example loop unrolling doesn't work because unrolled loops would
have a different scoping.
The solution would be a second syntax tree that has the scoping rules stored.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD.
"""
from jinja2 import nodes
from jinja2.visitor import NodeTransformer
from . import nodes
from .visitor import NodeTransformer
def optimize(node, environment):
@@ -28,22 +20,22 @@ def optimize(node, environment):
class Optimizer(NodeTransformer):
def __init__(self, environment):
self.environment = environment
def fold(self, node, eval_ctx=None):
"""Do constant folding."""
node = self.generic_visit(node)
try:
return nodes.Const.from_untrusted(node.as_const(eval_ctx),
lineno=node.lineno,
environment=self.environment)
except nodes.Impossible:
return node
def generic_visit(self, node, *args, **kwargs):
node = super(Optimizer, self).generic_visit(node, *args, **kwargs)
visit_Add = visit_Sub = visit_Mul = visit_Div = visit_FloorDiv = \
visit_Pow = visit_Mod = visit_And = visit_Or = visit_Pos = visit_Neg = \
visit_Not = visit_Compare = visit_Getitem = visit_Getattr = visit_Call = \
visit_Filter = visit_Test = visit_CondExpr = fold
del fold
# Do constant folding. Some other nodes besides Expr have
# as_const, but folding them causes errors later on.
if isinstance(node, nodes.Expr):
try:
return nodes.Const.from_untrusted(
node.as_const(args[0] if args else None),
lineno=node.lineno,
environment=self.environment,
)
except nodes.Impossible:
pass
return node
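As a rough illustration of the rewritten `generic_visit` (a sketch assuming upstream 2.11 behaviour, not part of this diff): constant subexpressions are folded into `Const` nodes when the optimizer walks the AST.

```python
from jinja2 import Environment
from jinja2.optimizer import optimize

env = Environment()
tree = env.parse("{{ 1 + 2 }}")   # Template(body=[Output(nodes=[Add(...)])])
tree = optimize(tree, env)        # the Add node is folded in place
folded = tree.body[0].nodes[0]
print(type(folded).__name__, folded.value)  # -> Const 3
```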

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -1,76 +1,66 @@
# -*- coding: utf-8 -*-
"""A sandbox layer that ensures unsafe operations cannot be performed.
Useful when the template itself comes from an untrusted source.
"""
jinja2.sandbox
~~~~~~~~~~~~~~
Adds a sandbox layer to Jinja as it was the default behavior in the old
Jinja 1 releases. This sandbox is slightly different from Jinja 1 as the
default behavior is easier to use.
The behavior can be changed by subclassing the environment.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD.
"""
import types
import operator
import sys
from jinja2.environment import Environment
from jinja2.exceptions import SecurityError
from jinja2._compat import string_types, PY2
from jinja2.utils import Markup
from markupsafe import EscapeFormatter
import types
import warnings
from collections import deque
from string import Formatter
if sys.version_info >= (3, 3):
from collections.abc import Mapping
else:
from collections import Mapping
from markupsafe import EscapeFormatter
from markupsafe import Markup
from ._compat import abc
from ._compat import PY2
from ._compat import range_type
from ._compat import string_types
from .environment import Environment
from .exceptions import SecurityError
#: maximum number of items a range may produce
MAX_RANGE = 100000
#: attributes of function objects that are considered unsafe.
if PY2:
UNSAFE_FUNCTION_ATTRIBUTES = set(['func_closure', 'func_code', 'func_dict',
'func_defaults', 'func_globals'])
UNSAFE_FUNCTION_ATTRIBUTES = {
"func_closure",
"func_code",
"func_dict",
"func_defaults",
"func_globals",
}
else:
# On versions > python 2 the special attributes on functions are gone,
# but they remain on methods and generators for whatever reason.
UNSAFE_FUNCTION_ATTRIBUTES = set()
#: unsafe method attributes. function attributes are unsafe for methods too
UNSAFE_METHOD_ATTRIBUTES = set(['im_class', 'im_func', 'im_self'])
UNSAFE_METHOD_ATTRIBUTES = {"im_class", "im_func", "im_self"}
#: unsafe generator attirbutes.
UNSAFE_GENERATOR_ATTRIBUTES = set(['gi_frame', 'gi_code'])
#: unsafe generator attributes.
UNSAFE_GENERATOR_ATTRIBUTES = {"gi_frame", "gi_code"}
#: unsafe attributes on coroutines
UNSAFE_COROUTINE_ATTRIBUTES = set(['cr_frame', 'cr_code'])
UNSAFE_COROUTINE_ATTRIBUTES = {"cr_frame", "cr_code"}
#: unsafe attributes on async generators
UNSAFE_ASYNC_GENERATOR_ATTRIBUTES = set(['ag_code', 'ag_frame'])
import warnings
UNSAFE_ASYNC_GENERATOR_ATTRIBUTES = {"ag_code", "ag_frame"}
# make sure we don't warn in python 2.6 about stuff we don't care about
warnings.filterwarnings('ignore', 'the sets module', DeprecationWarning,
module='jinja2.sandbox')
from collections import deque
warnings.filterwarnings(
"ignore", "the sets module", DeprecationWarning, module=__name__
)
_mutable_set_types = (set,)
_mutable_mapping_types = (dict,)
_mutable_sequence_types = (list,)
# on python 2.x we can register the user collection types
try:
from UserDict import UserDict, DictMixin
from UserList import UserList
_mutable_mapping_types += (UserDict, DictMixin)
_mutable_set_types += (UserList,)
except ImportError:
@@ -79,39 +69,60 @@
# if sets is still available, register the mutable set from there as well
try:
from sets import Set
_mutable_set_types += (Set,)
except ImportError:
pass
#: register Python 2.6 abstract base classes
if sys.version_info >= (3, 3):
from collections.abc import MutableSet, MutableMapping, MutableSequence
else:
from collections import MutableSet, MutableMapping, MutableSequence
_mutable_set_types += (MutableSet,)
_mutable_mapping_types += (MutableMapping,)
_mutable_sequence_types += (MutableSequence,)
_mutable_set_types += (abc.MutableSet,)
_mutable_mapping_types += (abc.MutableMapping,)
_mutable_sequence_types += (abc.MutableSequence,)
_mutable_spec = (
(_mutable_set_types, frozenset([
'add', 'clear', 'difference_update', 'discard', 'pop', 'remove',
'symmetric_difference_update', 'update'
])),
(_mutable_mapping_types, frozenset([
'clear', 'pop', 'popitem', 'setdefault', 'update'
])),
(_mutable_sequence_types, frozenset([
'append', 'reverse', 'insert', 'sort', 'extend', 'remove'
])),
(deque, frozenset([
'append', 'appendleft', 'clear', 'extend', 'extendleft', 'pop',
'popleft', 'remove', 'rotate'
]))
(
_mutable_set_types,
frozenset(
[
"add",
"clear",
"difference_update",
"discard",
"pop",
"remove",
"symmetric_difference_update",
"update",
]
),
),
(
_mutable_mapping_types,
frozenset(["clear", "pop", "popitem", "setdefault", "update"]),
),
(
_mutable_sequence_types,
frozenset(["append", "reverse", "insert", "sort", "extend", "remove"]),
),
(
deque,
frozenset(
[
"append",
"appendleft",
"clear",
"extend",
"extendleft",
"pop",
"popleft",
"remove",
"rotate",
]
),
),
)
class _MagicFormatMapping(Mapping):
class _MagicFormatMapping(abc.Mapping):
"""This class implements a dummy wrapper to fix a bug in the Python
standard library for string formatting.
@@ -125,7 +136,7 @@ def __init__(self, args, kwargs):
self._last_index = 0
def __getitem__(self, key):
if key == '':
if key == "":
idx = self._last_index
self._last_index += 1
try:
@@ -143,9 +154,9 @@ def __len__(self):
def inspect_format_method(callable):
if not isinstance(callable, (types.MethodType,
types.BuiltinMethodType)) or \
callable.__name__ != 'format':
if not isinstance(
callable, (types.MethodType, types.BuiltinMethodType)
) or callable.__name__ not in ("format", "format_map"):
return None
obj = callable.__self__
if isinstance(obj, string_types):
@@ -156,10 +167,14 @@ def safe_range(*args):
"""A range that can't generate ranges with a length of more than
MAX_RANGE items.
"""
rng = range(*args)
rng = range_type(*args)
if len(rng) > MAX_RANGE:
raise OverflowError('range too big, maximum size for range is %d' %
MAX_RANGE)
raise OverflowError(
"Range too big. The sandbox blocks ranges larger than"
" MAX_RANGE (%d)." % MAX_RANGE
)
return rng
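A small usage sketch of the `MAX_RANGE` guard above (not part of the diff; assumes the vendored sandbox matches upstream): `SandboxedEnvironment` installs `safe_range` as the template-global `range`, so oversized ranges raise `OverflowError`.

```python
from jinja2.sandbox import SandboxedEnvironment

env = SandboxedEnvironment()
print(env.from_string("{{ range(5) | list }}").render())  # [0, 1, 2, 3, 4]

try:
    env.from_string("{{ range(10**9) | list }}").render()
except OverflowError as exc:
    print(exc)  # ranges longer than MAX_RANGE (100000) are blocked
```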
@@ -192,24 +207,25 @@ def is_internal_attribute(obj, attr):
if attr in UNSAFE_FUNCTION_ATTRIBUTES:
return True
elif isinstance(obj, types.MethodType):
if attr in UNSAFE_FUNCTION_ATTRIBUTES or \
attr in UNSAFE_METHOD_ATTRIBUTES:
if attr in UNSAFE_FUNCTION_ATTRIBUTES or attr in UNSAFE_METHOD_ATTRIBUTES:
return True
elif isinstance(obj, type):
if attr == 'mro':
if attr == "mro":
return True
elif isinstance(obj, (types.CodeType, types.TracebackType, types.FrameType)):
return True
elif isinstance(obj, types.GeneratorType):
if attr in UNSAFE_GENERATOR_ATTRIBUTES:
return True
elif hasattr(types, 'CoroutineType') and isinstance(obj, types.CoroutineType):
elif hasattr(types, "CoroutineType") and isinstance(obj, types.CoroutineType):
if attr in UNSAFE_COROUTINE_ATTRIBUTES:
return True
elif hasattr(types, 'AsyncGeneratorType') and isinstance(obj, types.AsyncGeneratorType):
elif hasattr(types, "AsyncGeneratorType") and isinstance(
obj, types.AsyncGeneratorType
):
if attr in UNSAFE_ASYNC_GENERATOR_ATTRIBUTES:
return True
return attr.startswith('__')
return attr.startswith("__")
def modifies_known_mutable(obj, attr):
@@ -250,28 +266,26 @@ class SandboxedEnvironment(Environment):
raised. However also other exceptions may occur during the rendering so
the caller has to ensure that all exceptions are caught.
"""
sandboxed = True
#: default callback table for the binary operators. A copy of this is
#: available on each instance of a sandboxed environment as
#: :attr:`binop_table`
default_binop_table = {
'+': operator.add,
'-': operator.sub,
'*': operator.mul,
'/': operator.truediv,
'//': operator.floordiv,
'**': operator.pow,
'%': operator.mod
"+": operator.add,
"-": operator.sub,
"*": operator.mul,
"/": operator.truediv,
"//": operator.floordiv,
"**": operator.pow,
"%": operator.mod,
}
#: default callback table for the unary operators. A copy of this is
#: available on each instance of a sandboxed environment as
#: :attr:`unop_table`
default_unop_table = {
'+': operator.pos,
'-': operator.neg
}
default_unop_table = {"+": operator.pos, "-": operator.neg}
#: a set of binary operators that should be intercepted. Each operator
#: that is added to this set (empty by default) is delegated to the
@@ -307,7 +321,7 @@ class SandboxedEnvironment(Environment):
def intercept_unop(self, operator):
"""Called during template compilation with the name of a unary
operator to check if it should be intercepted at runtime. If this
method returns `True`, :meth:`call_unop` is excuted for this unary
method returns `True`, :meth:`call_unop` is executed for this unary
operator. The default implementation of :meth:`call_unop` will use
the :attr:`unop_table` dictionary to perform the operator with the
same logic as the builtin one.
@@ -321,10 +335,9 @@ def intercept_unop(self, operator):
"""
return False
def __init__(self, *args, **kwargs):
Environment.__init__(self, *args, **kwargs)
self.globals['range'] = safe_range
self.globals["range"] = safe_range
self.binop_table = self.default_binop_table.copy()
self.unop_table = self.default_unop_table.copy()
@@ -335,7 +348,7 @@ def is_safe_attribute(self, obj, attr, value):
special attributes of internal python objects as returned by the
:func:`is_internal_attribute` function.
"""
return not (attr.startswith('_') or is_internal_attribute(obj, attr))
return not (attr.startswith("_") or is_internal_attribute(obj, attr))
def is_safe_callable(self, obj):
"""Check if an object is safely callable. Per default a function is
@@ -343,8 +356,9 @@ def is_safe_callable(self, obj):
True. Override this method to alter the behavior, but this won't
affect the `unsafe` decorator from this module.
"""
return not (getattr(obj, 'unsafe_callable', False) or
getattr(obj, 'alters_data', False))
return not (
getattr(obj, "unsafe_callable", False) or getattr(obj, "alters_data", False)
)
def call_binop(self, context, operator, left, right):
"""For intercepted binary operator calls (:meth:`intercepted_binops`)
@@ -404,13 +418,15 @@ def getattr(self, obj, attribute):
def unsafe_undefined(self, obj, attribute):
"""Return an undefined object for unsafe attributes."""
return self.undefined('access to attribute %r of %r '
'object is unsafe.' % (
attribute,
obj.__class__.__name__
), name=attribute, obj=obj, exc=SecurityError)
return self.undefined(
"access to attribute %r of %r "
"object is unsafe." % (attribute, obj.__class__.__name__),
name=attribute,
obj=obj,
exc=SecurityError,
)
def format_string(self, s, args, kwargs):
def format_string(self, s, args, kwargs, format_func=None):
"""If a format call is detected, then this is routed through this
method so that our safety sandbox can be used for it.
"""
@@ -418,20 +434,31 @@ def format_string(self, s, args, kwargs):
formatter = SandboxedEscapeFormatter(self, s.escape)
else:
formatter = SandboxedFormatter(self)
if format_func is not None and format_func.__name__ == "format_map":
if len(args) != 1 or kwargs:
raise TypeError(
"format_map() takes exactly one argument %d given"
% (len(args) + (kwargs is not None))
)
kwargs = args[0]
args = None
kwargs = _MagicFormatMapping(args, kwargs)
rv = formatter.vformat(s, args, kwargs)
return type(s)(rv)
def call(__self, __context, __obj, *args, **kwargs):
def call(__self, __context, __obj, *args, **kwargs): # noqa: B902
"""Call an object from sandboxed code."""
fmt = inspect_format_method(__obj)
if fmt is not None:
return __self.format_string(fmt, args, kwargs)
return __self.format_string(fmt, args, kwargs, __obj)
# the double prefixes are to avoid double keyword argument
# errors when proxying the call.
if not __self.is_safe_callable(__obj):
raise SecurityError('%r is not safely callable' % (__obj,))
raise SecurityError("%r is not safely callable" % (__obj,))
return __context.call(__obj, *args, **kwargs)
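To show how the attribute checks and `call()` interact, a hedged sketch (not part of the diff): in an `ImmutableSandboxedEnvironment`, methods that modify known mutable types come back as unsafe undefineds, and calling them raises `SecurityError`.

```python
from jinja2.exceptions import SecurityError
from jinja2.sandbox import ImmutableSandboxedEnvironment

env = ImmutableSandboxedEnvironment()
tmpl = env.from_string("{{ seq.append(4) }}")

try:
    tmpl.render(seq=[1, 2, 3])
except SecurityError as exc:
    print(exc)  # access to attribute 'append' of 'list' object is unsafe.
```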
@@ -447,16 +474,16 @@ def is_safe_attribute(self, obj, attr, value):
return not modifies_known_mutable(obj, attr)
# This really is not a public API apparenlty.
# This really is not a public API apparently.
try:
from _string import formatter_field_name_split
except ImportError:
def formatter_field_name_split(field_name):
return field_name._formatter_field_name_split()
class SandboxedFormatterMixin(object):
def __init__(self, env):
self._env = env
@@ -470,14 +497,14 @@ def get_field(self, field_name, args, kwargs):
obj = self._env.getitem(obj, i)
return obj, first
class SandboxedFormatter(SandboxedFormatterMixin, Formatter):
class SandboxedFormatter(SandboxedFormatterMixin, Formatter):
def __init__(self, env):
SandboxedFormatterMixin.__init__(self, env)
Formatter.__init__(self)
class SandboxedEscapeFormatter(SandboxedFormatterMixin, EscapeFormatter):
class SandboxedEscapeFormatter(SandboxedFormatterMixin, EscapeFormatter):
def __init__(self, env, escape):
SandboxedFormatterMixin.__init__(self, env)
EscapeFormatter.__init__(self, escape)


@@ -1,29 +1,17 @@
# -*- coding: utf-8 -*-
"""
jinja2.tests
~~~~~~~~~~~~
Jinja test functions. Used with the "is" operator.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD, see LICENSE for more details.
"""
"""Built-in template tests used with the ``is`` operator."""
import decimal
import operator
import re
import sys
from jinja2.runtime import Undefined
from jinja2._compat import text_type, string_types, integer_types
import decimal
if sys.version_info >= (3, 3):
from collections.abc import Mapping
else:
from collections import Mapping
from ._compat import abc
from ._compat import integer_types
from ._compat import string_types
from ._compat import text_type
from .runtime import Undefined
number_re = re.compile(r'^-?\d+(\.\d+)?$')
number_re = re.compile(r"^-?\d+(\.\d+)?$")
regex_type = type(number_re)
test_callable = callable
@@ -69,6 +57,48 @@ def test_none(value):
return value is None
def test_boolean(value):
"""Return true if the object is a boolean value.
.. versionadded:: 2.11
"""
return value is True or value is False
def test_false(value):
"""Return true if the object is False.
.. versionadded:: 2.11
"""
return value is False
def test_true(value):
"""Return true if the object is True.
.. versionadded:: 2.11
"""
return value is True
# NOTE: The existing 'number' test matches booleans and floats
def test_integer(value):
"""Return true if the object is an integer.
.. versionadded:: 2.11
"""
return isinstance(value, integer_types) and value is not True and value is not False
# NOTE: The existing 'number' test matches booleans and integers
def test_float(value):
"""Return true if the object is a float.
.. versionadded:: 2.11
"""
return isinstance(value, float)
def test_lower(value):
"""Return true if the variable is lowercased."""
return text_type(value).islower()
@@ -89,7 +119,7 @@ def test_mapping(value):
.. versionadded:: 2.6
"""
return isinstance(value, Mapping)
return isinstance(value, abc.Mapping)
def test_number(value):
@@ -104,7 +134,7 @@ def test_sequence(value):
try:
len(value)
value.__getitem__
except:
except Exception:
return False
return True
@@ -133,7 +163,7 @@ def test_iterable(value):
def test_escaped(value):
"""Check if the value is escaped."""
return hasattr(value, '__html__')
return hasattr(value, "__html__")
def test_in(value, seq):
@@ -145,36 +175,41 @@ def test_in(value, seq):
TESTS = {
'odd': test_odd,
'even': test_even,
'divisibleby': test_divisibleby,
'defined': test_defined,
'undefined': test_undefined,
'none': test_none,
'lower': test_lower,
'upper': test_upper,
'string': test_string,
'mapping': test_mapping,
'number': test_number,
'sequence': test_sequence,
'iterable': test_iterable,
'callable': test_callable,
'sameas': test_sameas,
'escaped': test_escaped,
'in': test_in,
'==': operator.eq,
'eq': operator.eq,
'equalto': operator.eq,
'!=': operator.ne,
'ne': operator.ne,
'>': operator.gt,
'gt': operator.gt,
'greaterthan': operator.gt,
'ge': operator.ge,
'>=': operator.ge,
'<': operator.lt,
'lt': operator.lt,
'lessthan': operator.lt,
'<=': operator.le,
'le': operator.le,
"odd": test_odd,
"even": test_even,
"divisibleby": test_divisibleby,
"defined": test_defined,
"undefined": test_undefined,
"none": test_none,
"boolean": test_boolean,
"false": test_false,
"true": test_true,
"integer": test_integer,
"float": test_float,
"lower": test_lower,
"upper": test_upper,
"string": test_string,
"mapping": test_mapping,
"number": test_number,
"sequence": test_sequence,
"iterable": test_iterable,
"callable": test_callable,
"sameas": test_sameas,
"escaped": test_escaped,
"in": test_in,
"==": operator.eq,
"eq": operator.eq,
"equalto": operator.eq,
"!=": operator.ne,
"ne": operator.ne,
">": operator.gt,
"gt": operator.gt,
"greaterthan": operator.gt,
"ge": operator.ge,
">=": operator.ge,
"<": operator.lt,
"lt": operator.lt,
"lessthan": operator.lt,
"<=": operator.le,
"le": operator.le,
}
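The newly registered tests (`boolean`, `false`, `true`, `integer`, `float`) can be exercised as below; a sketch assuming the vendored Jinja2 is at least 2.11, where these tests were added.

```python
from jinja2 import Environment

env = Environment()
tmpl = env.from_string(
    "{{ x is boolean }}, {{ x is true }}, {{ 1 is integer }}, {{ 1.5 is float }}"
)
print(tmpl.render(x=True))  # -> True, True, True, True
```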


@@ -1,44 +1,32 @@
# -*- coding: utf-8 -*-
"""
jinja2.utils
~~~~~~~~~~~~
Utility functions.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD, see LICENSE for more details.
"""
import re
import json
import errno
import os
import re
import warnings
from collections import deque
from random import choice
from random import randrange
from string import ascii_letters as _letters
from string import digits as _digits
from threading import Lock
from jinja2._compat import text_type, string_types, implements_iterator, \
url_quote
from markupsafe import escape
from markupsafe import Markup
_word_split_re = re.compile(r'(\s+)')
_punctuation_re = re.compile(
'^(?P<lead>(?:%s)*)(?P<middle>.*?)(?P<trail>(?:%s)*)$' % (
'|'.join(map(re.escape, ('(', '<', '&lt;'))),
'|'.join(map(re.escape, ('.', ',', ')', '>', '\n', '&gt;')))
)
)
_simple_email_re = re.compile(r'^\S+@[a-zA-Z0-9._-]+\.[a-zA-Z0-9._-]+$')
_striptags_re = re.compile(r'(<!--.*?-->|<[^>]*>)')
_entity_re = re.compile(r'&([^;]+);')
_letters = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'
_digits = '0123456789'
from ._compat import abc
from ._compat import string_types
from ._compat import text_type
from ._compat import url_quote
# special singleton representing missing values for the runtime
missing = type('MissingType', (), {'__repr__': lambda x: 'missing'})()
missing = type("MissingType", (), {"__repr__": lambda x: "missing"})()
# internal code
internal_code = set()
concat = u''.join
concat = u"".join
_slash_escape = '\\/' not in json.dumps('/')
_slash_escape = "\\/" not in json.dumps("/")
def contextfunction(f):
@@ -98,24 +86,26 @@ def default(var, default=''):
return default
return var
"""
from jinja2.runtime import Undefined
from .runtime import Undefined
return isinstance(obj, Undefined)
def consume(iterable):
"""Consumes an iterable without doing anything with it."""
for event in iterable:
for _ in iterable:
pass
def clear_caches():
"""Jinja2 keeps internal caches for environments and lexers. These are
used so that Jinja2 doesn't have to recreate environments and lexers all
"""Jinja keeps internal caches for environments and lexers. These are
used so that Jinja doesn't have to recreate environments and lexers all
the time. Normally you don't have to care about that but if you are
measuring memory consumption you may want to clean the caches.
"""
from jinja2.environment import _spontaneous_environments
from jinja2.lexer import _lexer_cache
from .environment import _spontaneous_environments
from .lexer import _lexer_cache
_spontaneous_environments.clear()
_lexer_cache.clear()
@@ -132,12 +122,10 @@ def import_string(import_name, silent=False):
:return: imported object
"""
try:
if ':' in import_name:
module, obj = import_name.split(':', 1)
elif '.' in import_name:
items = import_name.split('.')
module = '.'.join(items[:-1])
obj = items[-1]
if ":" in import_name:
module, obj = import_name.split(":", 1)
elif "." in import_name:
module, _, obj = import_name.rpartition(".")
else:
return __import__(import_name)
return getattr(__import__(module, None, None, [obj]), obj)
@@ -146,15 +134,14 @@ def import_string(import_name, silent=False):
raise
def open_if_exists(filename, mode='rb'):
def open_if_exists(filename, mode="rb"):
"""Returns a file descriptor for the filename if that file exists,
otherwise `None`.
otherwise ``None``.
"""
try:
return open(filename, mode)
except IOError as e:
if e.errno not in (errno.ENOENT, errno.EISDIR, errno.EINVAL):
raise
if not os.path.isfile(filename):
return None
return open(filename, mode)
def object_type_repr(obj):
@@ -163,15 +150,19 @@ def object_type_repr(obj):
example for `None` and `Ellipsis`).
"""
if obj is None:
return 'None'
return "None"
elif obj is Ellipsis:
return 'Ellipsis'
return "Ellipsis"
cls = type(obj)
# __builtin__ in 2.x, builtins in 3.x
if obj.__class__.__module__ in ('__builtin__', 'builtins'):
name = obj.__class__.__name__
if cls.__module__ in ("__builtin__", "builtins"):
name = cls.__name__
else:
name = obj.__class__.__module__ + '.' + obj.__class__.__name__
return '%s object' % name
name = cls.__module__ + "." + cls.__name__
return "%s object" % name
def pformat(obj, verbose=False):
@@ -180,9 +171,11 @@ def pformat(obj, verbose=False):
"""
try:
from pretty import pretty
return pretty(obj, verbose=verbose)
except ImportError:
from pprint import pformat
return pformat(obj)
@@ -200,45 +193,77 @@ def urlize(text, trim_url_limit=None, rel=None, target=None):
If target is not None, a target attribute will be added to the link.
"""
trim_url = lambda x, limit=trim_url_limit: limit is not None \
and (x[:limit] + (len(x) >=limit and '...'
or '')) or x
words = _word_split_re.split(text_type(escape(text)))
rel_attr = rel and ' rel="%s"' % text_type(escape(rel)) or ''
target_attr = target and ' target="%s"' % escape(target) or ''
trim_url = (
lambda x, limit=trim_url_limit: limit is not None
and (x[:limit] + (len(x) >= limit and "..." or ""))
or x
)
words = re.split(r"(\s+)", text_type(escape(text)))
rel_attr = rel and ' rel="%s"' % text_type(escape(rel)) or ""
target_attr = target and ' target="%s"' % escape(target) or ""
for i, word in enumerate(words):
match = _punctuation_re.match(word)
head, middle, tail = "", word, ""
match = re.match(r"^([(<]|&lt;)+", middle)
if match:
lead, middle, trail = match.groups()
if middle.startswith('www.') or (
'@' not in middle and
not middle.startswith('http://') and
not middle.startswith('https://') and
len(middle) > 0 and
middle[0] in _letters + _digits and (
middle.endswith('.org') or
middle.endswith('.net') or
middle.endswith('.com')
)):
middle = '<a href="http://%s"%s%s>%s</a>' % (middle,
rel_attr, target_attr, trim_url(middle))
if middle.startswith('http://') or \
middle.startswith('https://'):
middle = '<a href="%s"%s%s>%s</a>' % (middle,
rel_attr, target_attr, trim_url(middle))
if '@' in middle and not middle.startswith('www.') and \
not ':' in middle and _simple_email_re.match(middle):
middle = '<a href="mailto:%s">%s</a>' % (middle, middle)
if lead + middle + trail != word:
words[i] = lead + middle + trail
return u''.join(words)
head = match.group()
middle = middle[match.end() :]
# Unlike lead, which is anchored to the start of the string,
# need to check that the string ends with any of the characters
# before trying to match all of them, to avoid backtracking.
if middle.endswith((")", ">", ".", ",", "\n", "&gt;")):
match = re.search(r"([)>.,\n]|&gt;)+$", middle)
if match:
tail = match.group()
middle = middle[: match.start()]
if middle.startswith("www.") or (
"@" not in middle
and not middle.startswith("http://")
and not middle.startswith("https://")
and len(middle) > 0
and middle[0] in _letters + _digits
and (
middle.endswith(".org")
or middle.endswith(".net")
or middle.endswith(".com")
)
):
middle = '<a href="http://%s"%s%s>%s</a>' % (
middle,
rel_attr,
target_attr,
trim_url(middle),
)
if middle.startswith("http://") or middle.startswith("https://"):
middle = '<a href="%s"%s%s>%s</a>' % (
middle,
rel_attr,
target_attr,
trim_url(middle),
)
if (
"@" in middle
and not middle.startswith("www.")
and ":" not in middle
and re.match(r"^\S+@\w[\w.-]*\.\w+$", middle)
):
middle = '<a href="mailto:%s">%s</a>' % (middle, middle)
words[i] = head + middle + tail
return u"".join(words)
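Roughly what the rewritten `urlize` produces (a sketch, not part of the diff; the exact markup depends on the arguments):

```python
from jinja2.utils import urlize

print(urlize("docs at https://palletsprojects.com or mail team@example.com"))
# output (shown wrapped):
#   docs at <a href="https://palletsprojects.com">https://palletsprojects.com</a>
#   or mail <a href="mailto:team@example.com">team@example.com</a>
```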
def generate_lorem_ipsum(n=5, html=True, min=20, max=100):
"""Generate some lorem ipsum for the template."""
from jinja2.constants import LOREM_IPSUM_WORDS
from random import choice, randrange
from .constants import LOREM_IPSUM_WORDS
words = LOREM_IPSUM_WORDS.split()
result = []
@@ -263,43 +288,53 @@ def generate_lorem_ipsum(n=5, html=True, min=20, max=100):
if idx - randrange(3, 8) > last_comma:
last_comma = idx
last_fullstop += 2
word += ','
word += ","
# add end of sentences
if idx - randrange(10, 20) > last_fullstop:
last_comma = last_fullstop = idx
word += '.'
word += "."
next_capitalized = True
p.append(word)
# ensure that the paragraph ends with a dot.
p = u' '.join(p)
if p.endswith(','):
p = p[:-1] + '.'
elif not p.endswith('.'):
p += '.'
p = u" ".join(p)
if p.endswith(","):
p = p[:-1] + "."
elif not p.endswith("."):
p += "."
result.append(p)
if not html:
return u'\n\n'.join(result)
return Markup(u'\n'.join(u'<p>%s</p>' % escape(x) for x in result))
return u"\n\n".join(result)
return Markup(u"\n".join(u"<p>%s</p>" % escape(x) for x in result))
def unicode_urlencode(obj, charset='utf-8', for_qs=False):
"""URL escapes a single bytestring or unicode string with the
given charset if applicable to URL safe quoting under all rules
that need to be considered under all supported Python versions.
def unicode_urlencode(obj, charset="utf-8", for_qs=False):
"""Quote a string for use in a URL using the given charset.
If non strings are provided they are converted to their unicode
representation first.
This function is misnamed, it is a wrapper around
:func:`urllib.parse.quote`.
:param obj: String or bytes to quote. Other types are converted to
string then encoded to bytes using the given charset.
:param charset: Encode text to bytes using this charset.
:param for_qs: Quote "/" and use "+" for spaces.
"""
if not isinstance(obj, string_types):
obj = text_type(obj)
if isinstance(obj, text_type):
obj = obj.encode(charset)
safe = not for_qs and b'/' or b''
rv = text_type(url_quote(obj, safe))
safe = b"" if for_qs else b"/"
rv = url_quote(obj, safe)
if not isinstance(rv, text_type):
rv = rv.decode("utf-8")
if for_qs:
rv = rv.replace('%20', '+')
rv = rv.replace("%20", "+")
return rv
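Usage sketch for `unicode_urlencode` as described in the new docstring (assuming the vendored version matches upstream):

```python
from jinja2.utils import unicode_urlencode

print(unicode_urlencode("hello world/index"))         # hello%20world/index
print(unicode_urlencode("hello world", for_qs=True))  # hello+world
```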
@@ -326,9 +361,9 @@ def _postinit(self):
def __getstate__(self):
return {
'capacity': self.capacity,
'_mapping': self._mapping,
'_queue': self._queue
"capacity": self.capacity,
"_mapping": self._mapping,
"_queue": self._queue,
}
def __setstate__(self, d):
@@ -342,7 +377,7 @@ def copy(self):
"""Return a shallow copy of the instance."""
rv = self.__class__(self.capacity)
rv._mapping.update(self._mapping)
rv._queue = deque(self._queue)
rv._queue.extend(self._queue)
return rv
def get(self, key, default=None):
@@ -356,15 +391,11 @@ def setdefault(self, key, default=None):
"""Set `default` if the key is not in the cache otherwise
leave unchanged. Return the value of this key.
"""
self._wlock.acquire()
try:
try:
return self[key]
except KeyError:
self[key] = default
return default
finally:
self._wlock.release()
return self[key]
except KeyError:
self[key] = default
return default
def clear(self):
"""Clear the cache."""
@@ -384,10 +415,7 @@ def __len__(self):
return len(self._mapping)
def __repr__(self):
return '<%s %r>' % (
self.__class__.__name__,
self._mapping
)
return "<%s %r>" % (self.__class__.__name__, self._mapping)
def __getitem__(self, key):
"""Get an item from the cache. Moves the item up so that it has the
@@ -436,7 +464,6 @@ def __delitem__(self, key):
try:
self._remove(key)
except ValueError:
# __getitem__ is not locked, it might happen
pass
finally:
self._wlock.release()
@@ -449,6 +476,12 @@ def items(self):
def iteritems(self):
"""Iterate over all items."""
warnings.warn(
"'iteritems()' will be removed in version 3.0. Use"
" 'iter(cache.items())' instead.",
DeprecationWarning,
stacklevel=2,
)
return iter(self.items())
def values(self):
@@ -457,6 +490,22 @@ def values(self):
def itervalue(self):
"""Iterate over all values."""
warnings.warn(
"'itervalue()' will be removed in version 3.0. Use"
" 'iter(cache.values())' instead.",
DeprecationWarning,
stacklevel=2,
)
return iter(self.values())
def itervalues(self):
"""Iterate over all values."""
warnings.warn(
"'itervalues()' will be removed in version 3.0. Use"
" 'iter(cache.values())' instead.",
DeprecationWarning,
stacklevel=2,
)
return iter(self.values())
def keys(self):
@@ -467,12 +516,19 @@ def iterkeys(self):
"""Iterate over all keys in the cache dict, ordered by
the most recent usage.
"""
warnings.warn(
"'iterkeys()' will be removed in version 3.0. Use"
" 'iter(cache.keys())' instead.",
DeprecationWarning,
stacklevel=2,
)
return iter(self)
def __iter__(self):
return reversed(tuple(self._queue))
__iter__ = iterkeys
def __reversed__(self):
"""Iterate over the values in the cache dict, oldest items
"""Iterate over the keys in the cache dict, oldest items
coming first.
"""
return iter(tuple(self._queue))
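For orientation, a small sketch of the `LRUCache` semantics the hunks above touch (eviction order only; not part of the diff):

```python
from jinja2.utils import LRUCache

cache = LRUCache(2)
cache["a"] = 1
cache["b"] = 2
cache["a"]      # touching "a" makes "b" the least recently used entry
cache["c"] = 3  # capacity exceeded, so "b" is evicted
print("b" in cache, "a" in cache)  # -> False True
```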
@@ -480,22 +536,15 @@ def __reversed__(self):
__copy__ = copy
# register the LRU cache as mutable mapping if possible
try:
from collections.abc import MutableMapping
MutableMapping.register(LRUCache)
except ImportError:
try:
from collections import MutableMapping
MutableMapping.register(LRUCache)
except ImportError:
pass
abc.MutableMapping.register(LRUCache)
def select_autoescape(enabled_extensions=('html', 'htm', 'xml'),
disabled_extensions=(),
default_for_string=True,
default=False):
def select_autoescape(
enabled_extensions=("html", "htm", "xml"),
disabled_extensions=(),
default_for_string=True,
default=False,
):
"""Intelligently sets the initial value of autoescaping based on the
filename of the template. This is the recommended way to configure
autoescaping if you do not want to write a custom function yourself.
@@ -530,10 +579,9 @@ def select_autoescape(enabled_extensions=('html', 'htm', 'xml'),
.. versionadded:: 2.9
"""
enabled_patterns = tuple('.' + x.lstrip('.').lower()
for x in enabled_extensions)
disabled_patterns = tuple('.' + x.lstrip('.').lower()
for x in disabled_extensions)
enabled_patterns = tuple("." + x.lstrip(".").lower() for x in enabled_extensions)
disabled_patterns = tuple("." + x.lstrip(".").lower() for x in disabled_extensions)
def autoescape(template_name):
if template_name is None:
return default_for_string
@@ -543,6 +591,7 @@ def autoescape(template_name):
if template_name.endswith(disabled_patterns):
return False
return default
return autoescape
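A short sketch of `select_autoescape` as documented above, calling the returned predicate directly (not part of the diff):

```python
from jinja2 import select_autoescape

autoescape = select_autoescape(enabled_extensions=("html", "htm", "xml"))
print(autoescape("index.html"))  # True  (enabled extension)
print(autoescape("notes.txt"))   # False (falls through to default=False)
print(autoescape(None))          # True  (default_for_string for string templates)
```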
@@ -566,35 +615,63 @@ def htmlsafe_json_dumps(obj, dumper=None, **kwargs):
"""
if dumper is None:
dumper = json.dumps
rv = dumper(obj, **kwargs) \
.replace(u'<', u'\\u003c') \
.replace(u'>', u'\\u003e') \
.replace(u'&', u'\\u0026') \
.replace(u"'", u'\\u0027')
rv = (
dumper(obj, **kwargs)
.replace(u"<", u"\\u003c")
.replace(u">", u"\\u003e")
.replace(u"&", u"\\u0026")
.replace(u"'", u"\\u0027")
)
return Markup(rv)
@implements_iterator
class Cycler(object):
"""A cycle helper for templates."""
"""Cycle through values by yield them one at a time, then restarting
once the end is reached. Available as ``cycler`` in templates.
Similar to ``loop.cycle``, but can be used outside loops or across
multiple loops. For example, render a list of folders and files in a
list, alternating giving them "odd" and "even" classes.
.. code-block:: html+jinja
{% set row_class = cycler("odd", "even") %}
<ul class="browser">
{% for folder in folders %}
<li class="folder {{ row_class.next() }}">{{ folder }}
{% endfor %}
{% for file in files %}
<li class="file {{ row_class.next() }}">{{ file }}
{% endfor %}
</ul>
:param items: Each positional argument will be yielded in the order
given for each cycle.
.. versionadded:: 2.1
"""
def __init__(self, *items):
if not items:
raise RuntimeError('at least one item has to be provided')
raise RuntimeError("at least one item has to be provided")
self.items = items
self.reset()
self.pos = 0
def reset(self):
"""Resets the cycle."""
"""Resets the current item to the first item."""
self.pos = 0
@property
def current(self):
"""Returns the current item."""
"""Return the current item. Equivalent to the item that will be
returned next time :meth:`next` is called.
"""
return self.items[self.pos]
def next(self):
"""Goes one item ahead and returns it."""
"""Return the current item, then advance :attr:`current` to the
next item.
"""
rv = self.current
self.pos = (self.pos + 1) % len(self.items)
return rv
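The expanded `Cycler` docstring above shows template usage; the same object can also be driven from Python (a sketch, not part of the diff):

```python
from jinja2.utils import Cycler

row = Cycler("odd", "even")
print(row.current, row.next(), row.next(), row.current)  # odd odd even odd
row.reset()
print(row.current)  # odd
```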
@@ -605,27 +682,28 @@ def next(self):
class Joiner(object):
"""A joining helper for templates."""
def __init__(self, sep=u', '):
def __init__(self, sep=u", "):
self.sep = sep
self.used = False
def __call__(self):
if not self.used:
self.used = True
return u''
return u""
return self.sep
class Namespace(object):
"""A namespace object that can hold arbitrary attributes. It may be
initialized from a dictionary or with keyword argments."""
initialized from a dictionary or with keyword arguments."""
def __init__(*args, **kwargs):
def __init__(*args, **kwargs): # noqa: B902
self, args = args[0], args[1:]
self.__attrs = dict(*args, **kwargs)
def __getattribute__(self, name):
if name == '_Namespace__attrs':
# __class__ is needed for the awaitable check in async mode
if name in {"_Namespace__attrs", "__class__"}:
return object.__getattribute__(self, name)
try:
return self.__attrs[name]
@@ -636,16 +714,24 @@ def __setitem__(self, name, value):
self.__attrs[name] = value
def __repr__(self):
return '<Namespace %r>' % self.__attrs
return "<Namespace %r>" % self.__attrs
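`Namespace` backs the `namespace()` template global used for writable loop state; a usage sketch assuming stock Jinja2 defaults (not part of the diff):

```python
from jinja2 import Environment

env = Environment()
tmpl = env.from_string(
    "{% set ns = namespace(total=0) %}"
    "{% for x in seq %}{% set ns.total = ns.total + x %}{% endfor %}"
    "{{ ns.total }}"
)
print(tmpl.render(seq=[1, 2, 3]))  # -> 6
```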
# does this python version support async for in and async generators?
try:
exec('async def _():\n async for _ in ():\n yield _')
exec("async def _():\n async for _ in ():\n yield _")
have_async_gen = True
except SyntaxError:
have_async_gen = False
# Imported here because that's where it was in the past
from markupsafe import Markup, escape, soft_unicode
def soft_unicode(s):
from markupsafe import soft_unicode
warnings.warn(
"'jinja2.utils.soft_unicode' will be removed in version 3.0."
" Use 'markupsafe.soft_unicode' instead.",
DeprecationWarning,
stacklevel=2,
)
return soft_unicode(s)


@@ -1,14 +1,8 @@
# -*- coding: utf-8 -*-
"""API for traversing the AST nodes. Implemented by the compiler and
meta introspection.
"""
jinja2.visitor
~~~~~~~~~~~~~~
This module implements a visitor for the nodes.
:copyright: (c) 2017 by the Jinja Team.
:license: BSD.
"""
from jinja2.nodes import Node
from .nodes import Node
class NodeVisitor(object):
@@ -28,7 +22,7 @@ def get_visitor(self, node):
exists for this node. In that case the generic visit function is
used instead.
"""
method = 'visit_' + node.__class__.__name__
method = "visit_" + node.__class__.__name__
return getattr(self, method, None)
def visit(self, node, *args, **kwargs):


@@ -1,13 +0,0 @@
MarkupSafe is written and maintained by Armin Ronacher and
various contributors:
Development Lead
````````````````
- Armin Ronacher <armin.ronacher@active-4.com>
Patches and Suggestions
```````````````````````
- Georg Brandl
- Mickaël Guérin


@@ -1,33 +0,0 @@
Copyright (c) 2010 by Armin Ronacher and contributors. See AUTHORS
for more details.
Some rights reserved.
Redistribution and use in source and binary forms of the software as well
as documentation, with or without modification, are permitted provided
that the following conditions are met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above
copyright notice, this list of conditions and the following
disclaimer in the documentation and/or other materials provided
with the distribution.
* The names of the contributors may not be used to endorse or
promote products derived from this software without specific
prior written permission.
THIS SOFTWARE AND DOCUMENTATION IS PROVIDED BY THE COPYRIGHT HOLDERS AND
CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT
NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER
OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE AND DOCUMENTATION, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH
DAMAGE.


@@ -0,0 +1,28 @@
Copyright 2010 Pallets
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.


@@ -1,113 +1,69 @@
MarkupSafe
==========
Implements a unicode subclass that supports HTML strings:
MarkupSafe implements a text object that escapes characters so it is
safe to use in HTML and XML. Characters that have special meanings are
replaced so that they display as the actual characters. This mitigates
injection attacks, meaning untrusted user input can safely be displayed
on a page.
.. code-block:: python
Installing
----------
Install and update using `pip`_:
.. code-block:: text
pip install -U MarkupSafe
.. _pip: https://pip.pypa.io/en/stable/quickstart/
Examples
--------
.. code-block:: pycon
>>> from markupsafe import Markup, escape
>>> escape("<script>alert(document.cookie);</script>")
>>> # escape replaces special characters and wraps in Markup
>>> escape('<script>alert(document.cookie);</script>')
Markup(u'&lt;script&gt;alert(document.cookie);&lt;/script&gt;')
>>> tmpl = Markup("<em>%s</em>")
>>> tmpl % "Peter > Lustig"
Markup(u'<em>Peter &gt; Lustig</em>')
>>> # wrap in Markup to mark text "safe" and prevent escaping
>>> Markup('<strong>Hello</strong>')
Markup('<strong>hello</strong>')
>>> escape(Markup('<strong>Hello</strong>'))
Markup('<strong>hello</strong>')
>>> # Markup is a text subclass (str on Python 3, unicode on Python 2)
>>> # methods and operators escape their arguments
>>> template = Markup("Hello <em>%s</em>")
>>> template % '"World"'
Markup('Hello <em>&#34;World&#34;</em>')
If you want to make an object unicode that is not yet unicode
but don't want to lose the taint information, you can use the
``soft_unicode`` function. (On Python 3 you can also use ``soft_str`` which
is a different name for the same function).
.. code-block:: python
Donate
------
>>> from markupsafe import soft_unicode
>>> soft_unicode(42)
u'42'
>>> soft_unicode(Markup('foo'))
Markup(u'foo')
The Pallets organization develops and supports MarkupSafe and other
libraries that use it. In order to grow the community of contributors
and users, and allow the maintainers to devote more time to the
projects, `please donate today`_.
HTML Representations
--------------------
.. _please donate today: https://palletsprojects.com/donate
Objects can customize their HTML markup equivalent by overriding
the ``__html__`` function:
.. code-block:: python
Links
-----
>>> class Foo(object):
... def __html__(self):
... return '<strong>Nice</strong>'
...
>>> escape(Foo())
Markup(u'<strong>Nice</strong>')
>>> Markup(Foo())
Markup(u'<strong>Nice</strong>')
* Website: https://palletsprojects.com/p/markupsafe/
* Documentation: https://markupsafe.palletsprojects.com/
* License: `BSD-3-Clause <https://github.com/pallets/markupsafe/blob/master/LICENSE.rst>`_
* Releases: https://pypi.org/project/MarkupSafe/
* Code: https://github.com/pallets/markupsafe
* Issue tracker: https://github.com/pallets/markupsafe/issues
* Test status:
Silent Escapes
--------------
* Linux, Mac: https://travis-ci.org/pallets/markupsafe
* Windows: https://ci.appveyor.com/project/pallets/markupsafe
Since MarkupSafe 0.10 there is now also a separate escape function
called ``escape_silent`` that returns an empty string for ``None`` for
consistency with other systems that return empty strings for ``None``
when escaping (for instance Pylons' webhelpers).
If you also want to use this for the escape method of the Markup
object, you can create your own subclass that does that:
.. code-block:: python
from markupsafe import Markup, escape_silent as escape
class SilentMarkup(Markup):
__slots__ = ()
@classmethod
def escape(cls, s):
return cls(escape(s))
New-Style String Formatting
---------------------------
Starting with MarkupSafe 0.21 new style string formats from Python 2.6 and
3.x are now fully supported. Previously the escape behavior of those
functions was spotty at best. The new implementations operates under the
following algorithm:
1. if an object has an ``__html_format__`` method it is called as
replacement for ``__format__`` with the format specifier. It either
has to return a string or markup object.
2. if an object has an ``__html__`` method it is called.
3. otherwise the default format system of Python kicks in and the result
is HTML escaped.
Here is how you can implement your own formatting:
.. code-block:: python
class User(object):
def __init__(self, id, username):
self.id = id
self.username = username
def __html_format__(self, format_spec):
if format_spec == 'link':
return Markup('<a href="/user/{0}">{1}</a>').format(
self.id,
self.__html__(),
)
elif format_spec:
raise ValueError('Invalid format spec')
return self.__html__()
def __html__(self):
return Markup('<span class=user>{0}</span>').format(self.username)
And to format that user:
.. code-block:: python
>>> user = User(1, 'foo')
>>> Markup('<p>User: {0:link}').format(user)
Markup(u'<p>User: <a href="/user/1"><span class=user>foo</span></a>')
Markupsafe supports Python 2.6, 2.7 and Python 3.3 and higher.
* Test coverage: https://codecov.io/gh/pallets/markupsafe


@@ -1,80 +1,74 @@
# -*- coding: utf-8 -*-
"""
markupsafe
~~~~~~~~~~
markupsafe
~~~~~~~~~~
Implements a Markup string.
Implements an escape function and a Markup string to replace HTML
special characters with safe representations.
:copyright: (c) 2010 by Armin Ronacher.
:license: BSD, see LICENSE for more details.
:copyright: 2010 Pallets
:license: BSD-3-Clause
"""
import re
import string
import sys
from markupsafe._compat import text_type, string_types, int_types, \
unichr, iteritems, PY2
if sys.version_info >= (3, 3):
from collections.abc import Mapping
else:
from collections import Mapping
from ._compat import int_types
from ._compat import iteritems
from ._compat import Mapping
from ._compat import PY2
from ._compat import string_types
from ._compat import text_type
from ._compat import unichr
__version__ = "1.0"
__version__ = "1.1.1"
__all__ = ['Markup', 'soft_unicode', 'escape', 'escape_silent']
__all__ = ["Markup", "soft_unicode", "escape", "escape_silent"]
_striptags_re = re.compile(r'(<!--.*?-->|<[^>]*>)')
_entity_re = re.compile(r'&([^& ;]+);')
_striptags_re = re.compile(r"(<!--.*?-->|<[^>]*>)")
_entity_re = re.compile(r"&([^& ;]+);")
class Markup(text_type):
r"""Marks a string as being safe for inclusion in HTML/XML output without
needing to be escaped. This implements the `__html__` interface a couple
of frameworks and web applications use. :class:`Markup` is a direct
subclass of `unicode` and provides all the methods of `unicode` just that
it escapes arguments passed and always returns `Markup`.
"""A string that is ready to be safely inserted into an HTML or XML
document, either because it was escaped or because it was marked
safe.
The `escape` function returns markup objects so that double escaping can't
happen.
Passing an object to the constructor converts it to text and wraps
it to mark it safe without escaping. To escape the text, use the
:meth:`escape` class method instead.
The constructor of the :class:`Markup` class can be used for three
different things: When passed an unicode object it's assumed to be safe,
when passed an object with an HTML representation (has an `__html__`
method) that representation is used, otherwise the object passed is
converted into a unicode string and then assumed to be safe:
>>> Markup('Hello, <em>World</em>!')
Markup('Hello, <em>World</em>!')
>>> Markup(42)
Markup('42')
>>> Markup.escape('Hello, <em>World</em>!')
Markup('Hello &lt;em&gt;World&lt;/em&gt;!')
>>> Markup("Hello <em>World</em>!")
Markup(u'Hello <em>World</em>!')
>>> class Foo(object):
... def __html__(self):
... return '<a href="#">foo</a>'
This implements the ``__html__()`` interface that some frameworks
use. Passing an object that implements ``__html__()`` will wrap the
output of that method, marking it safe.
>>> class Foo:
... def __html__(self):
... return '<a href="/foo">foo</a>'
...
>>> Markup(Foo())
Markup(u'<a href="#">foo</a>')
Markup('<a href="/foo">foo</a>')
If you want object passed being always treated as unsafe you can use the
:meth:`escape` classmethod to create a :class:`Markup` object:
This is a subclass of the text type (``str`` in Python 3,
``unicode`` in Python 2). It has the same methods as that type, but
all methods escape their arguments and return a ``Markup`` instance.
>>> Markup.escape("Hello <em>World</em>!")
Markup(u'Hello &lt;em&gt;World&lt;/em&gt;!')
Operations on a markup string are markup aware which means that all
arguments are passed through the :func:`escape` function:
>>> em = Markup("<em>%s</em>")
>>> em % "foo & bar"
Markup(u'<em>foo &amp; bar</em>')
>>> strong = Markup("<strong>%(text)s</strong>")
>>> strong % {'text': '<blink>hacker here</blink>'}
Markup(u'<strong>&lt;blink&gt;hacker here&lt;/blink&gt;</strong>')
>>> Markup("<em>Hello</em> ") + "<foo>"
Markup(u'<em>Hello</em> &lt;foo&gt;')
>>> Markup('<em>%s</em>') % 'foo & bar'
Markup('<em>foo &amp; bar</em>')
>>> Markup('<em>Hello</em> ') + '<foo>'
Markup('<em>Hello</em> &lt;foo&gt;')
"""
__slots__ = ()
def __new__(cls, base=u'', encoding=None, errors='strict'):
if hasattr(base, '__html__'):
def __new__(cls, base=u"", encoding=None, errors="strict"):
if hasattr(base, "__html__"):
base = base.__html__()
if encoding is None:
return text_type.__new__(cls, base)
@@ -84,12 +78,12 @@ def __html__(self):
return self
def __add__(self, other):
if isinstance(other, string_types) or hasattr(other, '__html__'):
if isinstance(other, string_types) or hasattr(other, "__html__"):
return self.__class__(super(Markup, self).__add__(self.escape(other)))
return NotImplemented
def __radd__(self, other):
if hasattr(other, '__html__') or isinstance(other, string_types):
if hasattr(other, "__html__") or isinstance(other, string_types):
return self.escape(other).__add__(self)
return NotImplemented
@@ -97,6 +91,7 @@ def __mul__(self, num):
if isinstance(num, int_types):
return self.__class__(text_type.__mul__(self, num))
return NotImplemented
__rmul__ = __mul__
def __mod__(self, arg):
@@ -107,115 +102,124 @@ def __mod__(self, arg):
return self.__class__(text_type.__mod__(self, arg))
def __repr__(self):
return '%s(%s)' % (
self.__class__.__name__,
text_type.__repr__(self)
)
return "%s(%s)" % (self.__class__.__name__, text_type.__repr__(self))
def join(self, seq):
return self.__class__(text_type.join(self, map(self.escape, seq)))
join.__doc__ = text_type.join.__doc__
def split(self, *args, **kwargs):
return list(map(self.__class__, text_type.split(self, *args, **kwargs)))
split.__doc__ = text_type.split.__doc__
def rsplit(self, *args, **kwargs):
return list(map(self.__class__, text_type.rsplit(self, *args, **kwargs)))
rsplit.__doc__ = text_type.rsplit.__doc__
def splitlines(self, *args, **kwargs):
return list(map(self.__class__, text_type.splitlines(
self, *args, **kwargs)))
return list(map(self.__class__, text_type.splitlines(self, *args, **kwargs)))
splitlines.__doc__ = text_type.splitlines.__doc__
def unescape(self):
r"""Unescape markup again into an text_type string. This also resolves
known HTML4 and XHTML entities:
"""Convert escaped markup back into a text string. This replaces
HTML entities with the characters they represent.
>>> Markup("Main &raquo; <em>About</em>").unescape()
u'Main \xbb <em>About</em>'
>>> Markup('Main &raquo; <em>About</em>').unescape()
'Main » <em>About</em>'
"""
from markupsafe._constants import HTML_ENTITIES
from ._constants import HTML_ENTITIES
def handle_match(m):
name = m.group(1)
if name in HTML_ENTITIES:
return unichr(HTML_ENTITIES[name])
try:
if name[:2] in ('#x', '#X'):
if name[:2] in ("#x", "#X"):
return unichr(int(name[2:], 16))
elif name.startswith('#'):
elif name.startswith("#"):
return unichr(int(name[1:]))
except ValueError:
pass
# Don't modify unexpected input.
return m.group()
return _entity_re.sub(handle_match, text_type(self))
def striptags(self):
r"""Unescape markup into an text_type string and strip all tags. This
also resolves known HTML4 and XHTML entities. Whitespace is
normalized to one:
""":meth:`unescape` the markup, remove tags, and normalize
whitespace to single spaces.
>>> Markup("Main &raquo; <em>About</em>").striptags()
u'Main \xbb About'
>>> Markup('Main &raquo;\t<em>About</em>').striptags()
'Main » About'
"""
stripped = u' '.join(_striptags_re.sub('', self).split())
stripped = u" ".join(_striptags_re.sub("", self).split())
return Markup(stripped).unescape()
@classmethod
def escape(cls, s):
"""Escape the string. Works like :func:`escape` with the difference
that for subclasses of :class:`Markup` this function would return the
correct subclass.
"""Escape a string. Calls :func:`escape` and ensures that for
subclasses the correct type is returned.
"""
rv = escape(s)
if rv.__class__ is not cls:
return cls(rv)
return rv
def make_simple_escaping_wrapper(name):
def make_simple_escaping_wrapper(name): # noqa: B902
orig = getattr(text_type, name)
def func(self, *args, **kwargs):
args = _escape_argspec(list(args), enumerate(args), self.escape)
_escape_argspec(kwargs, iteritems(kwargs), self.escape)
return self.__class__(orig(self, *args, **kwargs))
func.__name__ = orig.__name__
func.__doc__ = orig.__doc__
return func
for method in '__getitem__', 'capitalize', \
'title', 'lower', 'upper', 'replace', 'ljust', \
'rjust', 'lstrip', 'rstrip', 'center', 'strip', \
'translate', 'expandtabs', 'swapcase', 'zfill':
for method in (
"__getitem__",
"capitalize",
"title",
"lower",
"upper",
"replace",
"ljust",
"rjust",
"lstrip",
"rstrip",
"center",
"strip",
"translate",
"expandtabs",
"swapcase",
"zfill",
):
locals()[method] = make_simple_escaping_wrapper(method)
# new in python 2.5
if hasattr(text_type, 'partition'):
def partition(self, sep):
return tuple(map(self.__class__,
text_type.partition(self, self.escape(sep))))
def rpartition(self, sep):
return tuple(map(self.__class__,
text_type.rpartition(self, self.escape(sep))))
def partition(self, sep):
return tuple(map(self.__class__, text_type.partition(self, self.escape(sep))))
# new in python 2.6
if hasattr(text_type, 'format'):
def format(*args, **kwargs):
self, args = args[0], args[1:]
formatter = EscapeFormatter(self.escape)
kwargs = _MagicFormatMapping(args, kwargs)
return self.__class__(formatter.vformat(self, args, kwargs))
def rpartition(self, sep):
return tuple(map(self.__class__, text_type.rpartition(self, self.escape(sep))))
def __html_format__(self, format_spec):
if format_spec:
raise ValueError('Unsupported format specification '
'for Markup.')
return self
def format(self, *args, **kwargs):
formatter = EscapeFormatter(self.escape)
kwargs = _MagicFormatMapping(args, kwargs)
return self.__class__(formatter.vformat(self, args, kwargs))
def __html_format__(self, format_spec):
if format_spec:
raise ValueError("Unsupported format specification " "for Markup.")
return self
# not in python 3
if hasattr(text_type, '__getslice__'):
__getslice__ = make_simple_escaping_wrapper('__getslice__')
if hasattr(text_type, "__getslice__"):
__getslice__ = make_simple_escaping_wrapper("__getslice__")
del method, make_simple_escaping_wrapper
@@ -234,7 +238,7 @@ def __init__(self, args, kwargs):
self._last_index = 0
def __getitem__(self, key):
if key == '':
if key == "":
idx = self._last_index
self._last_index += 1
try:
@@ -251,35 +255,37 @@ def __len__(self):
return len(self._kwargs)
if hasattr(text_type, 'format'):
class EscapeFormatter(string.Formatter):
if hasattr(text_type, "format"):
class EscapeFormatter(string.Formatter):
def __init__(self, escape):
self.escape = escape
def format_field(self, value, format_spec):
if hasattr(value, '__html_format__'):
if hasattr(value, "__html_format__"):
rv = value.__html_format__(format_spec)
elif hasattr(value, '__html__'):
elif hasattr(value, "__html__"):
if format_spec:
raise ValueError('No format specification allowed '
'when formatting an object with '
'its __html__ method.')
raise ValueError(
"Format specifier {0} given, but {1} does not"
" define __html_format__. A class that defines"
" __html__ must define __html_format__ to work"
" with format specifiers.".format(format_spec, type(value))
)
rv = value.__html__()
else:
# We need to make sure the format spec is unicode here as
# otherwise the wrong callback methods are invoked. For
# instance a byte string there would invoke __str__ and
# not __unicode__.
rv = string.Formatter.format_field(
self, value, text_type(format_spec))
rv = string.Formatter.format_field(self, value, text_type(format_spec))
return text_type(self.escape(rv))
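For reference, a minimal sketch of the dispatch above (the User class and values are hypothetical; it assumes this vendored copy is importable as markupsafe): an object that defines __html__ is inlined as already-safe HTML, while plain values are escaped.

from markupsafe import Markup

class User(object):
    def __init__(self, name):
        self.name = name

    def __html__(self):
        # Returns a Markup, so format() will not escape it again.
        return Markup('<a href="/users/1">{0}</a>').format(self.name)

print(Markup("<p>Hello, {0}!</p>").format(User("<Jane>")))
# <p>Hello, <a href="/users/1">&lt;Jane&gt;</a>!</p>
print(Markup("<p>Hello, {0}!</p>").format("<Jane>"))
# <p>Hello, &lt;Jane&gt;!</p>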
def _escape_argspec(obj, iterable, escape):
"""Helper for various string-wrapped functions."""
for key, value in iterable:
if hasattr(value, '__html__') or isinstance(value, string_types):
if hasattr(value, "__html__") or isinstance(value, string_types):
obj[key] = escape(value)
return obj
@@ -291,20 +297,31 @@ def __init__(self, obj, escape):
self.obj = obj
self.escape = escape
__getitem__ = lambda s, x: _MarkupEscapeHelper(s.obj[x], s.escape)
__unicode__ = __str__ = lambda s: text_type(s.escape(s.obj))
__repr__ = lambda s: str(s.escape(repr(s.obj)))
__int__ = lambda s: int(s.obj)
__float__ = lambda s: float(s.obj)
def __getitem__(self, item):
return _MarkupEscapeHelper(self.obj[item], self.escape)
def __str__(self):
return text_type(self.escape(self.obj))
__unicode__ = __str__
def __repr__(self):
return str(self.escape(repr(self.obj)))
def __int__(self):
return int(self.obj)
def __float__(self):
return float(self.obj)
# we have to import it down here as the speedups and native
# modules imports the markup type which is define above.
try:
from markupsafe._speedups import escape, escape_silent, soft_unicode
from ._speedups import escape, escape_silent, soft_unicode
except ImportError:
from markupsafe._native import escape, escape_silent, soft_unicode
from ._native import escape, escape_silent, soft_unicode
if not PY2:
soft_str = soft_unicode
__all__.append('soft_str')
__all__.append("soft_str")


@@ -1,12 +1,10 @@
# -*- coding: utf-8 -*-
"""
markupsafe._compat
~~~~~~~~~~~~~~~~~~
markupsafe._compat
~~~~~~~~~~~~~~~~~~
Compatibility module for different Python versions.
:copyright: (c) 2013 by Armin Ronacher.
:license: BSD, see LICENSE for more details.
:copyright: 2010 Pallets
:license: BSD-3-Clause
"""
import sys
@@ -17,10 +15,19 @@
string_types = (str,)
unichr = chr
int_types = (int,)
iteritems = lambda x: iter(x.items())
def iteritems(x):
return iter(x.items())
from collections.abc import Mapping
else:
text_type = unicode
string_types = (str, unicode)
unichr = unichr
int_types = (int, long)
iteritems = lambda x: x.iteritems()
def iteritems(x):
return x.iteritems()
from collections import Mapping
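A small sketch of how this shim is consumed so the same iteration code runs on Python 2 and 3 (illustrative data; assumes the module imports as markupsafe._compat):

from markupsafe._compat import iteritems, text_type

data = {"a": 1, "b": 2}
for key, value in iteritems(data):
    print(u"{0} -> {1}".format(text_type(key), value))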


@@ -1,267 +1,264 @@
# -*- coding: utf-8 -*-
"""
markupsafe._constants
~~~~~~~~~~~~~~~~~~~~~
markupsafe._constants
~~~~~~~~~~~~~~~~~~~~~
Highlevel implementation of the Markup string.
:copyright: (c) 2010 by Armin Ronacher.
:license: BSD, see LICENSE for more details.
:copyright: 2010 Pallets
:license: BSD-3-Clause
"""
HTML_ENTITIES = {
'AElig': 198,
'Aacute': 193,
'Acirc': 194,
'Agrave': 192,
'Alpha': 913,
'Aring': 197,
'Atilde': 195,
'Auml': 196,
'Beta': 914,
'Ccedil': 199,
'Chi': 935,
'Dagger': 8225,
'Delta': 916,
'ETH': 208,
'Eacute': 201,
'Ecirc': 202,
'Egrave': 200,
'Epsilon': 917,
'Eta': 919,
'Euml': 203,
'Gamma': 915,
'Iacute': 205,
'Icirc': 206,
'Igrave': 204,
'Iota': 921,
'Iuml': 207,
'Kappa': 922,
'Lambda': 923,
'Mu': 924,
'Ntilde': 209,
'Nu': 925,
'OElig': 338,
'Oacute': 211,
'Ocirc': 212,
'Ograve': 210,
'Omega': 937,
'Omicron': 927,
'Oslash': 216,
'Otilde': 213,
'Ouml': 214,
'Phi': 934,
'Pi': 928,
'Prime': 8243,
'Psi': 936,
'Rho': 929,
'Scaron': 352,
'Sigma': 931,
'THORN': 222,
'Tau': 932,
'Theta': 920,
'Uacute': 218,
'Ucirc': 219,
'Ugrave': 217,
'Upsilon': 933,
'Uuml': 220,
'Xi': 926,
'Yacute': 221,
'Yuml': 376,
'Zeta': 918,
'aacute': 225,
'acirc': 226,
'acute': 180,
'aelig': 230,
'agrave': 224,
'alefsym': 8501,
'alpha': 945,
'amp': 38,
'and': 8743,
'ang': 8736,
'apos': 39,
'aring': 229,
'asymp': 8776,
'atilde': 227,
'auml': 228,
'bdquo': 8222,
'beta': 946,
'brvbar': 166,
'bull': 8226,
'cap': 8745,
'ccedil': 231,
'cedil': 184,
'cent': 162,
'chi': 967,
'circ': 710,
'clubs': 9827,
'cong': 8773,
'copy': 169,
'crarr': 8629,
'cup': 8746,
'curren': 164,
'dArr': 8659,
'dagger': 8224,
'darr': 8595,
'deg': 176,
'delta': 948,
'diams': 9830,
'divide': 247,
'eacute': 233,
'ecirc': 234,
'egrave': 232,
'empty': 8709,
'emsp': 8195,
'ensp': 8194,
'epsilon': 949,
'equiv': 8801,
'eta': 951,
'eth': 240,
'euml': 235,
'euro': 8364,
'exist': 8707,
'fnof': 402,
'forall': 8704,
'frac12': 189,
'frac14': 188,
'frac34': 190,
'frasl': 8260,
'gamma': 947,
'ge': 8805,
'gt': 62,
'hArr': 8660,
'harr': 8596,
'hearts': 9829,
'hellip': 8230,
'iacute': 237,
'icirc': 238,
'iexcl': 161,
'igrave': 236,
'image': 8465,
'infin': 8734,
'int': 8747,
'iota': 953,
'iquest': 191,
'isin': 8712,
'iuml': 239,
'kappa': 954,
'lArr': 8656,
'lambda': 955,
'lang': 9001,
'laquo': 171,
'larr': 8592,
'lceil': 8968,
'ldquo': 8220,
'le': 8804,
'lfloor': 8970,
'lowast': 8727,
'loz': 9674,
'lrm': 8206,
'lsaquo': 8249,
'lsquo': 8216,
'lt': 60,
'macr': 175,
'mdash': 8212,
'micro': 181,
'middot': 183,
'minus': 8722,
'mu': 956,
'nabla': 8711,
'nbsp': 160,
'ndash': 8211,
'ne': 8800,
'ni': 8715,
'not': 172,
'notin': 8713,
'nsub': 8836,
'ntilde': 241,
'nu': 957,
'oacute': 243,
'ocirc': 244,
'oelig': 339,
'ograve': 242,
'oline': 8254,
'omega': 969,
'omicron': 959,
'oplus': 8853,
'or': 8744,
'ordf': 170,
'ordm': 186,
'oslash': 248,
'otilde': 245,
'otimes': 8855,
'ouml': 246,
'para': 182,
'part': 8706,
'permil': 8240,
'perp': 8869,
'phi': 966,
'pi': 960,
'piv': 982,
'plusmn': 177,
'pound': 163,
'prime': 8242,
'prod': 8719,
'prop': 8733,
'psi': 968,
'quot': 34,
'rArr': 8658,
'radic': 8730,
'rang': 9002,
'raquo': 187,
'rarr': 8594,
'rceil': 8969,
'rdquo': 8221,
'real': 8476,
'reg': 174,
'rfloor': 8971,
'rho': 961,
'rlm': 8207,
'rsaquo': 8250,
'rsquo': 8217,
'sbquo': 8218,
'scaron': 353,
'sdot': 8901,
'sect': 167,
'shy': 173,
'sigma': 963,
'sigmaf': 962,
'sim': 8764,
'spades': 9824,
'sub': 8834,
'sube': 8838,
'sum': 8721,
'sup': 8835,
'sup1': 185,
'sup2': 178,
'sup3': 179,
'supe': 8839,
'szlig': 223,
'tau': 964,
'there4': 8756,
'theta': 952,
'thetasym': 977,
'thinsp': 8201,
'thorn': 254,
'tilde': 732,
'times': 215,
'trade': 8482,
'uArr': 8657,
'uacute': 250,
'uarr': 8593,
'ucirc': 251,
'ugrave': 249,
'uml': 168,
'upsih': 978,
'upsilon': 965,
'uuml': 252,
'weierp': 8472,
'xi': 958,
'yacute': 253,
'yen': 165,
'yuml': 255,
'zeta': 950,
'zwj': 8205,
'zwnj': 8204
"AElig": 198,
"Aacute": 193,
"Acirc": 194,
"Agrave": 192,
"Alpha": 913,
"Aring": 197,
"Atilde": 195,
"Auml": 196,
"Beta": 914,
"Ccedil": 199,
"Chi": 935,
"Dagger": 8225,
"Delta": 916,
"ETH": 208,
"Eacute": 201,
"Ecirc": 202,
"Egrave": 200,
"Epsilon": 917,
"Eta": 919,
"Euml": 203,
"Gamma": 915,
"Iacute": 205,
"Icirc": 206,
"Igrave": 204,
"Iota": 921,
"Iuml": 207,
"Kappa": 922,
"Lambda": 923,
"Mu": 924,
"Ntilde": 209,
"Nu": 925,
"OElig": 338,
"Oacute": 211,
"Ocirc": 212,
"Ograve": 210,
"Omega": 937,
"Omicron": 927,
"Oslash": 216,
"Otilde": 213,
"Ouml": 214,
"Phi": 934,
"Pi": 928,
"Prime": 8243,
"Psi": 936,
"Rho": 929,
"Scaron": 352,
"Sigma": 931,
"THORN": 222,
"Tau": 932,
"Theta": 920,
"Uacute": 218,
"Ucirc": 219,
"Ugrave": 217,
"Upsilon": 933,
"Uuml": 220,
"Xi": 926,
"Yacute": 221,
"Yuml": 376,
"Zeta": 918,
"aacute": 225,
"acirc": 226,
"acute": 180,
"aelig": 230,
"agrave": 224,
"alefsym": 8501,
"alpha": 945,
"amp": 38,
"and": 8743,
"ang": 8736,
"apos": 39,
"aring": 229,
"asymp": 8776,
"atilde": 227,
"auml": 228,
"bdquo": 8222,
"beta": 946,
"brvbar": 166,
"bull": 8226,
"cap": 8745,
"ccedil": 231,
"cedil": 184,
"cent": 162,
"chi": 967,
"circ": 710,
"clubs": 9827,
"cong": 8773,
"copy": 169,
"crarr": 8629,
"cup": 8746,
"curren": 164,
"dArr": 8659,
"dagger": 8224,
"darr": 8595,
"deg": 176,
"delta": 948,
"diams": 9830,
"divide": 247,
"eacute": 233,
"ecirc": 234,
"egrave": 232,
"empty": 8709,
"emsp": 8195,
"ensp": 8194,
"epsilon": 949,
"equiv": 8801,
"eta": 951,
"eth": 240,
"euml": 235,
"euro": 8364,
"exist": 8707,
"fnof": 402,
"forall": 8704,
"frac12": 189,
"frac14": 188,
"frac34": 190,
"frasl": 8260,
"gamma": 947,
"ge": 8805,
"gt": 62,
"hArr": 8660,
"harr": 8596,
"hearts": 9829,
"hellip": 8230,
"iacute": 237,
"icirc": 238,
"iexcl": 161,
"igrave": 236,
"image": 8465,
"infin": 8734,
"int": 8747,
"iota": 953,
"iquest": 191,
"isin": 8712,
"iuml": 239,
"kappa": 954,
"lArr": 8656,
"lambda": 955,
"lang": 9001,
"laquo": 171,
"larr": 8592,
"lceil": 8968,
"ldquo": 8220,
"le": 8804,
"lfloor": 8970,
"lowast": 8727,
"loz": 9674,
"lrm": 8206,
"lsaquo": 8249,
"lsquo": 8216,
"lt": 60,
"macr": 175,
"mdash": 8212,
"micro": 181,
"middot": 183,
"minus": 8722,
"mu": 956,
"nabla": 8711,
"nbsp": 160,
"ndash": 8211,
"ne": 8800,
"ni": 8715,
"not": 172,
"notin": 8713,
"nsub": 8836,
"ntilde": 241,
"nu": 957,
"oacute": 243,
"ocirc": 244,
"oelig": 339,
"ograve": 242,
"oline": 8254,
"omega": 969,
"omicron": 959,
"oplus": 8853,
"or": 8744,
"ordf": 170,
"ordm": 186,
"oslash": 248,
"otilde": 245,
"otimes": 8855,
"ouml": 246,
"para": 182,
"part": 8706,
"permil": 8240,
"perp": 8869,
"phi": 966,
"pi": 960,
"piv": 982,
"plusmn": 177,
"pound": 163,
"prime": 8242,
"prod": 8719,
"prop": 8733,
"psi": 968,
"quot": 34,
"rArr": 8658,
"radic": 8730,
"rang": 9002,
"raquo": 187,
"rarr": 8594,
"rceil": 8969,
"rdquo": 8221,
"real": 8476,
"reg": 174,
"rfloor": 8971,
"rho": 961,
"rlm": 8207,
"rsaquo": 8250,
"rsquo": 8217,
"sbquo": 8218,
"scaron": 353,
"sdot": 8901,
"sect": 167,
"shy": 173,
"sigma": 963,
"sigmaf": 962,
"sim": 8764,
"spades": 9824,
"sub": 8834,
"sube": 8838,
"sum": 8721,
"sup": 8835,
"sup1": 185,
"sup2": 178,
"sup3": 179,
"supe": 8839,
"szlig": 223,
"tau": 964,
"there4": 8756,
"theta": 952,
"thetasym": 977,
"thinsp": 8201,
"thorn": 254,
"tilde": 732,
"times": 215,
"trade": 8482,
"uArr": 8657,
"uacute": 250,
"uarr": 8593,
"ucirc": 251,
"ugrave": 249,
"uml": 168,
"upsih": 978,
"upsilon": 965,
"uuml": 252,
"weierp": 8472,
"xi": 958,
"yacute": 253,
"yen": 165,
"yuml": 255,
"zeta": 950,
"zwj": 8205,
"zwnj": 8204,
}
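This table maps entity names (without the surrounding & and ;) to Unicode code points; it is the lookup that Markup.unescape consults for named character references. A minimal sketch of what it contains (assuming the vendored module paths):

from markupsafe._compat import unichr
from markupsafe._constants import HTML_ENTITIES

print(u"amp   = {0} -> {1!r}".format(HTML_ENTITIES["amp"], unichr(HTML_ENTITIES["amp"])))      # 38 -> u'&'
print(u"mdash = {0} -> {1!r}".format(HTML_ENTITIES["mdash"], unichr(HTML_ENTITIES["mdash"])))  # 8212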


@@ -1,36 +1,49 @@
# -*- coding: utf-8 -*-
"""
markupsafe._native
~~~~~~~~~~~~~~~~~~
markupsafe._native
~~~~~~~~~~~~~~~~~~
Native Python implementation the C module is not compiled.
Native Python implementation used when the C module is not compiled.
:copyright: (c) 2010 by Armin Ronacher.
:license: BSD, see LICENSE for more details.
:copyright: 2010 Pallets
:license: BSD-3-Clause
"""
from markupsafe import Markup
from markupsafe._compat import text_type
from . import Markup
from ._compat import text_type
def escape(s):
"""Convert the characters &, <, >, ' and " in string s to HTML-safe
sequences. Use this if you need to display text that might contain
such characters in HTML. Marks return value as markup string.
"""Replace the characters ``&``, ``<``, ``>``, ``'``, and ``"`` in
the string with HTML-safe sequences. Use this if you need to display
text that might contain such characters in HTML.
If the object has an ``__html__`` method, it is called and the
return value is assumed to already be safe for HTML.
:param s: An object to be converted to a string and escaped.
:return: A :class:`Markup` string with the escaped text.
"""
if hasattr(s, '__html__'):
return s.__html__()
return Markup(text_type(s)
.replace('&', '&amp;')
.replace('>', '&gt;')
.replace('<', '&lt;')
.replace("'", '&#39;')
.replace('"', '&#34;')
if hasattr(s, "__html__"):
return Markup(s.__html__())
return Markup(
text_type(s)
.replace("&", "&amp;")
.replace(">", "&gt;")
.replace("<", "&lt;")
.replace("'", "&#39;")
.replace('"', "&#34;")
)
def escape_silent(s):
"""Like :func:`escape` but converts `None` into an empty
markup string.
"""Like :func:`escape` but treats ``None`` as the empty string.
Useful with optional values, as otherwise you get the string
``'None'`` when the value is ``None``.
>>> escape(None)
Markup('None')
>>> escape_silent(None)
Markup('')
"""
if s is None:
return Markup()
@@ -38,8 +51,18 @@ def escape_silent(s):
def soft_unicode(s):
"""Make a string unicode if it isn't already. That way a markup
string is not converted back to unicode.
"""Convert an object to a string if it isn't already. This preserves
a :class:`Markup` string rather than converting it back to a basic
string, so it will still be marked as safe and won't be escaped
again.
>>> value = escape('<User 1>')
>>> value
Markup('&lt;User 1&gt;')
>>> escape(str(value))
Markup('&amp;lt;User 1&amp;gt;')
>>> escape(soft_unicode(value))
Markup('&lt;User 1&gt;')
"""
if not isinstance(s, text_type):
s = text_type(s)
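A brief sketch contrasting the three helpers above (these are the pure-Python fallbacks used when the C speedups are unavailable; illustrative values):

from markupsafe._native import escape, escape_silent, soft_unicode

print(escape("<b>"))           # &lt;b&gt;
print(escape_silent(None))     # empty Markup instead of the string 'None'
value = escape("<User 1>")
print(soft_unicode(value))     # &lt;User 1&gt;  -- stays Markup, not escaped again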


@@ -1,22 +0,0 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This file dispatches to the correct implementation of OrderedDict."""
# TODO: this file, along with py26/ordereddict.py, can be removed when
# TODO: support for python 2.6 will be dropped
# Removing this import will make python 2.6
# fail on import of ordereddict
from __future__ import absolute_import
import sys
if sys.version_info[:2] == (2, 6):
import ordereddict
OrderedDict = ordereddict.OrderedDict
else:
import collections
OrderedDict = collections.OrderedDict
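For context, the removed shim was consumed like this (a hypothetical call site); on Python 2.7 and later it simply re-exported collections.OrderedDict:

from ordereddict_backport import OrderedDict

opts = OrderedDict()
opts["first"] = 1
opts["second"] = 2
print(list(opts))   # ['first', 'second'] -- insertion order preserved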


@@ -1,127 +0,0 @@
# Copyright (c) 2009 Raymond Hettinger
#
# Permission is hereby granted, free of charge, to any person
# obtaining a copy of this software and associated documentation files
# (the "Software"), to deal in the Software without restriction,
# including without limitation the rights to use, copy, modify, merge,
# publish, distribute, sublicense, and/or sell copies of the Software,
# and to permit persons to whom the Software is furnished to do so,
# subject to the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
# OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
# HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
# OTHER DEALINGS IN THE SOFTWARE.
from UserDict import DictMixin
class OrderedDict(dict, DictMixin):
def __init__(self, *args, **kwds):
if len(args) > 1:
raise TypeError('expected at most 1 arguments, got %d' % len(args))
try:
self.__end
except AttributeError:
self.clear()
self.update(*args, **kwds)
def clear(self):
self.__end = end = []
end += [None, end, end] # sentinel node for doubly linked list
self.__map = {} # key --> [key, prev, next]
dict.clear(self)
def __setitem__(self, key, value):
if key not in self:
end = self.__end
curr = end[1]
curr[2] = end[1] = self.__map[key] = [key, curr, end]
dict.__setitem__(self, key, value)
def __delitem__(self, key):
dict.__delitem__(self, key)
key, prev, next = self.__map.pop(key)
prev[2] = next
next[1] = prev
def __iter__(self):
end = self.__end
curr = end[2]
while curr is not end:
yield curr[0]
curr = curr[2]
def __reversed__(self):
end = self.__end
curr = end[1]
while curr is not end:
yield curr[0]
curr = curr[1]
def popitem(self, last=True):
if not self:
raise KeyError('dictionary is empty')
if last:
key = reversed(self).next()
else:
key = iter(self).next()
value = self.pop(key)
return key, value
def __reduce__(self):
items = [[k, self[k]] for k in self]
tmp = self.__map, self.__end
del self.__map, self.__end
inst_dict = vars(self).copy()
self.__map, self.__end = tmp
if inst_dict:
return (self.__class__, (items,), inst_dict)
return self.__class__, (items,)
def keys(self):
return list(self)
setdefault = DictMixin.setdefault
update = DictMixin.update
pop = DictMixin.pop
values = DictMixin.values
items = DictMixin.items
iterkeys = DictMixin.iterkeys
itervalues = DictMixin.itervalues
iteritems = DictMixin.iteritems
def __repr__(self):
if not self:
return '%s()' % (self.__class__.__name__,)
return '%s(%r)' % (self.__class__.__name__, self.items())
def copy(self):
return self.__class__(self)
@classmethod
def fromkeys(cls, iterable, value=None):
d = cls()
for key in iterable:
d[key] = value
return d
def __eq__(self, other):
if isinstance(other, OrderedDict):
if len(self) != len(other):
return False
for p, q in zip(self.items(), other.items()):
if p != q:
return False
return True
return dict.__eq__(self, other)
def __ne__(self, other):
return not self == other
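For illustration, a brief sketch of how this backport behaves under Python 2.6 (the only interpreter it targeted): the surface matches collections.OrderedDict, backed here by a plain dict plus a sentinel doubly linked list of keys.

d = OrderedDict()              # the class defined above (requires Python 2)
d["x"] = 1
d["y"] = 2
d["z"] = 3
print(list(d))                 # ['x', 'y', 'z'] -- iteration follows insertion order
print(d.popitem())             # ('z', 3)  -- last=True pops the most recent key
print(d.popitem(last=False))   # ('x', 1)  -- last=False pops the oldest key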

Some files were not shown because too many files have changed in this diff.