Compare commits

1199 Commits

Author SHA1 Message Date
Gregory Becker
afe1fd89b9 WIP -- wait for 18205 to continue 2020-10-21 18:37:21 -07:00
Tamara Dahlgren
1452020f22 Added raja smoke tests; updated build dir for cmake-based package tests 2020-10-14 19:35:45 -07:00
Tamara Dahlgren
9afff8eb60 Resolved eleven unit test failures (#18979) 2020-10-06 10:54:29 -07:00
Tamara Dahlgren
c6cd52f616 Remove unused test_compiler in intel.py (#18950) 2020-09-25 14:04:26 -07:00
Tamara Dahlgren
8bbfbc741d Added __init__.py to address test collection on the tty.py test (#18903) 2020-09-25 14:04:24 -07:00
Tamara Dahlgren
3f31fffe65 Resolved all basic flake8 errors 2020-09-25 14:04:23 -07:00
Tamara Dahlgren
fa023354c6 Restore test subcommand list limited to the first line though (#18723) 2020-09-23 12:44:27 -07:00
Tamara Dahlgren
02bd3d55a0 Bugfix: correct test find stage directory; fix flake8 errors (#18704) 2020-09-23 12:44:22 -07:00
Tamara Dahlgren
e58d4f8cb7 Fix test subcommand help/description (#18721) 2020-09-23 12:44:19 -07:00
Tamara Dahlgren
37a77e0d12 Preliminary binutils install tests (#18645) 2020-09-23 12:44:17 -07:00
Tamara Dahlgren
f10864b96e openmpi: Remove unneeded references to test part status values (#18644) 2020-09-23 12:44:14 -07:00
Greg Becker
19e226259c Features/spack test refactor cmds (#18518)
* no clean -t option, use 'spack test remove'

* refactor commands to make better use of TestSuite objects
2020-09-23 12:44:12 -07:00
Tamara Dahlgren
102b91203a Rename and make escaped text file utility readily available to packages (#18339) 2020-09-23 12:44:09 -07:00
Tamara Dahlgren
04ca718051 Updated hdf smoke test (#18337) 2020-09-23 12:44:06 -07:00
Tamara Dahlgren
61abc75bc6 Add remaining bugfixes and consistency changes from #18210 (#18334) 2020-09-23 12:44:03 -07:00
Greg Becker
86eececc5c Features/spack test refactor (#18277)
* refactor test code into a TestSuite object and install_test module

* update mpi tests

* refactor test suites to use content hash for name and record reproducibility info

* update unit tests and fix bugs

* Fix tests using data dir for new format
Use new `self.test_stage` object to access current data dir

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2020-09-23 12:44:00 -07:00
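Commit 86eececc5c above names test suites by a content hash so that rerunning the same set of specs reproduces the same suite. A minimal, self-contained sketch of that idea (function name and arguments are illustrative, not Spack's actual API):

```python
import hashlib

def suite_name(spec_strings):
    """Derive a stable suite name from the suite's contents.

    Sorting first makes the name order-independent, so the same set of
    specs always yields the same suite name (and thus the same results
    directory), which is what makes a run reproducible by name.
    """
    content = "\n".join(sorted(spec_strings)).encode("utf-8")
    return hashlib.sha1(content).hexdigest()[:7]

# The same specs, in any order, map to the same suite name:
assert suite_name(["zlib@1.2.11", "mpich@3.3"]) == \
    suite_name(["mpich@3.3", "zlib@1.2.11"])
```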
Tamara Dahlgren
7551b66cd9 Smoke tests: handle package syntax errors (#17919) 2020-09-23 12:43:58 -07:00
Tamara Dahlgren
c9a00562c4 Preliminary HDF smoke tests (#18076) 2020-09-23 12:43:55 -07:00
Gregory Becker
dafda1ab1a update tests to use status=0 over status=None 2020-09-23 12:43:53 -07:00
Greg Becker
d6b9871169 Features/compiler tests (#17353)
* fix setup of run environment for tests

* remove unnecessary 'None' option from run_tests status arg

* allow package files for virtuals

* run tests for all virtuals provided by each package

* add tests for mpi

* add compiler tests for virtual packages

* run compiler tests automatically like virtuals

* use working_dir instead of os.chdir

* Move knowledge of virtual-ness from spec to repo

* refactor test/cmd/clean

* update cmd/pkg tests for correctness
2020-09-23 12:43:50 -07:00
Tamara Dahlgren
b6d1704729 Smoke tests: Preliminary berkeley-db tests (#17899) 2020-09-23 12:43:46 -07:00
Tamara Dahlgren
f49fa74bf7 Smoke test: Add test of a sequence of hdf5 commands (#17686) 2020-09-23 12:43:44 -07:00
Tamara Dahlgren
0b8bc43fb0 smoke test: preliminary sqlite tests 2020-09-23 12:43:40 -07:00
Tamara Dahlgren
70eb1960fb smoke test: Ensure expected test results and options are lists 2020-09-23 12:43:37 -07:00
Tamara Dahlgren
a7c109c3aa Smoke tests: cmake version checks (#17359)
* Smoke tests: cmake version checks

* Simplified cmake install checks: dict-to-list

Co-authored-by: Greg Becker <becker33@llnl.gov>
2020-09-23 12:43:34 -07:00
Tamara Dahlgren
da62f89f4a Smoke tests: hdf5 version checks and check_install (#17360) 2020-09-23 12:43:31 -07:00
Tamara Dahlgren
d9f0170024 Smoke tests: switched warn to debug plus bugfix (#17576) 2020-09-23 12:43:28 -07:00
Tamara Dahlgren
171ebd8189 Features/spack test emacs (#17363)
* Smoke tests: emacs version checks
2020-09-23 12:43:24 -07:00
Tamara Dahlgren
6d986b4478 Smoke tests: Preliminary Umpire install tests (#17178)
* Preliminary install tests for the Umpire package
2020-09-23 12:43:20 -07:00
Tamara Dahlgren
03569dee8d Add install tests for libsigsegv (#17064) 2020-09-23 12:43:18 -07:00
Tamara Dahlgren
e2ddd7846c bugfix: fix cache_extra_test_sources' file copy; add unit tests (#17057) 2020-09-23 12:43:16 -07:00
Gregory Becker
32693fa573 fixup bugs after rebase 2020-09-23 12:43:14 -07:00
Gregory Becker
e221ba6ba7 update macos test for new unit-test command 2020-09-23 12:43:12 -07:00
Gregory Becker
4146c3d135 flake 2020-09-23 12:43:10 -07:00
Gregory Becker
f5b165a76f flake 2020-09-23 12:43:08 -07:00
Tamara Dahlgren
3508362dde smoke tests: grab and run build examples (openmpi) (#16365)
* Snapshot smoke tests that grab and run examples

* Resolved openmpi example test issues for 2.0.0-4.0.3

* Use spec.satisfies; copy extra packages after install (vs. prior to install tests)

* Added smoke tests for selected openmpi installed binaries

* Use which() to determine if install exe exists

* Switched onus for installer test source grab from installer to package

* Resolved (local) flake8 issues with package.py

* Use runner.name; use string format for *run_test* messages

* Renamed copy_src_to_install to cache_extra_test_source and added comments

* Metadata path cleanup: added metadata_dir property and its use in package

* Support list of source paths to cache for install testing (with unit test)

* Added test subdir to install_test_root; changed skip_file to lambda
2020-09-23 12:43:05 -07:00
Tamara Dahlgren
fb145df4f4 bugfix: Resolve perl install test bug (#16501) 2020-09-23 12:43:02 -07:00
Tamara Dahlgren
fd46f67d63 smoke tests: Refined openmpi version checks (#16337)
* Refined openmpi version checks to pass for 2.1.0 through 4.0.3

* Allow skipping install tests with exe not in bin dir and revised openmpi version tests
2020-09-23 12:43:00 -07:00
Greg Becker
edf3e91a12 Add --fail-first and --fail-fast options to spack test run (#16277)
`spack test run --fail-first` exits after the first failed package.

`spack test run --fail-fast` stops each package test after the first
failure.
2020-09-23 12:42:58 -07:00
Gregory Becker
bca630a22a make output comparisons regex 2020-09-23 12:42:56 -07:00
Gregory Becker
5ff9ba320d remove debug log parser from ctest 2020-09-23 12:42:54 -07:00
Gregory Becker
50640d4924 update from Error to FAILED 2020-09-23 12:42:51 -07:00
Gregory Becker
f5cfcadfc5 change test headings from {name}-{hash} to {name}-{version}-{hash} 2020-09-23 12:42:49 -07:00
Gregory Becker
a5c534b86d update bash completion 2020-09-23 12:42:46 -07:00
Gregory Becker
99364b9c3f refactor 2020-09-23 12:42:43 -07:00
Gregory Becker
749ab2e79d Make Spack tests record their errors and continue
previously, tests would fail on the first error
now, we wrap them in a TestFailure object that records all failures
2020-09-23 12:42:40 -07:00
Greg Becker
1b3e1897ca Features/spack test subcommands (#16054)
* spack test: subcommands for asynchronous tests

* commands are `run`, `list`, `status`, `results`, `remove`.
2020-09-23 12:42:36 -07:00
Tamara Dahlgren
3976b2a083 tests: Preliminary libsigsegv smoke tests (updated) (#15981)
* tests: Preliminary libsigsegv smoke tests (updated)

* Cleaned up and added doc to libsigsegv smoke test
2020-09-23 12:42:33 -07:00
Tamara Dahlgren
54603cb91f tests: Update openmpi smoke tests to new run_test api (#15982)
* tests: Update openmpi smoke tests to new run_test api

* Removed version check try-except tracking per discussion

* Changed openmpi orted command status values to list
2020-09-23 12:42:29 -07:00
Tamara Dahlgren
aa630b8d71 install tests: added support for multiple test command status values (#15979)
* install tests: added support for multiple test command status values
2020-09-23 12:42:25 -07:00
Gregory Becker
77acf8ddc2 spack unit-test: fix pytest help command 2020-09-23 12:42:22 -07:00
Gregory Becker
dfb02e6d45 test runner: add options to check installation dir and print purpose 2020-09-23 12:42:18 -07:00
Gregory Becker
cf4a0cbc01 python: use self.command to get exe name in test 2020-09-23 12:42:16 -07:00
Gregory Becker
b0eb02a86f cmd/test.py: fix typo in spdx license header 2020-09-23 12:42:14 -07:00
Gregory Becker
73f76bc1b5 update bash completions 2020-09-23 12:42:12 -07:00
Gregory Becker
6f39d8011e spack test: factor out common args 2020-09-23 12:42:10 -07:00
Gregory Becker
97dc74c727 python: fix tests, remove intentional debug failures 2020-09-23 12:42:08 -07:00
Gregory Becker
d53eefa69f fix docs 2020-09-23 12:42:05 -07:00
Gregory Becker
bae57f2ae8 spack test: update existing docs for moved unit-test cmd 2020-09-23 12:41:23 -07:00
Gregory Becker
ba58ae9118 simplify error handling using language features 2020-09-23 12:36:23 -07:00
Gregory Becker
fdb8a59bae fix get_package_context check whether in a package file 2020-09-23 12:36:22 -07:00
Gregory Becker
d92f52ae02 fix handling of asserts for python3 2020-09-23 12:36:22 -07:00
Gregory Becker
3229bf04f5 fix 'belt and suspenders' for config values 2020-09-23 12:36:21 -07:00
Gregory Becker
ccf519daa5 update travis 2020-09-23 12:36:20 -07:00
Gregory Becker
c5ae92bf3f flake 2020-09-23 12:36:19 -07:00
Gregory Becker
f83280cb58 standardize names for configure_test, build_test, install_test 2020-09-23 12:36:18 -07:00
Gregory Becker
6e80de652c unbreak zlib 2020-09-23 12:36:17 -07:00
Gregory Becker
0dc212e67d tests and bugfixes 2020-09-23 12:36:16 -07:00
Gregory Becker
3ce2efe32a update bash completions 2020-09-23 12:36:14 -07:00
Gregory Becker
76ce5d90ec fixup unit-test from develop 2020-09-23 12:36:14 -07:00
Gregory Becker
e5a9a376bf fix cmd/clean tests 2020-09-23 12:36:13 -07:00
Gregory Becker
d6a497540d fixup reporter work 2020-09-23 12:36:12 -07:00
Gregory Becker
b996d65a96 bugfix 2020-09-23 12:36:11 -07:00
Gregory Becker
991a2aae37 test name message 2020-09-23 12:36:10 -07:00
Tamara Dahlgren
8ba45e358b Initial OpenMPI smoke tests: version checks 2020-09-23 12:36:09 -07:00
Gregory Becker
28e76be185 spack clean: option to clean test stage (-t) 2020-09-23 12:36:09 -07:00
Gregory Becker
70e91cc1e0 spack test: add dirty/clean flags to command 2020-09-23 12:36:08 -07:00
Gregory Becker
b52113aca9 move test dir to config option 2020-09-23 12:36:07 -07:00
Gregory Becker
ce06e24a2e refactor run_test to Package level 2020-09-23 12:36:06 -07:00
Gregory Becker
dd0fbe670c continue testing after error 2020-09-23 12:36:05 -07:00
Tamara Dahlgren
6ad70b5f5d Preliminary libxml2 tests (#15092)
* Initial libxml2 tests (using executables)

* Expanded libxml2 tests using installed bins

* Refactored/generalized _run_tests
2020-09-23 12:36:04 -07:00
wspear
dadf4d1ed9 Fixed import string (#15094) 2020-09-23 12:36:03 -07:00
Gregory Becker
64bac977f1 add spack test-env command, refactor to combine with build-env 2020-09-23 12:36:02 -07:00
Gregory Becker
2f1d26fa87 allow tests to require compiler 2020-09-23 12:36:02 -07:00
Gregory Becker
cf713c5320 Modify existing test methods to naming scheme <phase_name>_test
Existing test methods run via callbacks at install time when run with `spack install --run-tests`.
These methods are tied into the package build system and cannot be run arbitrarily.
The new naming scheme ties each test method to the build-system phase after which it should run.
The method name `test` is now reserved for methods run via the `spack test` command.
2020-09-23 12:36:01 -07:00
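The renaming above can be sketched in package-DSL terms (a hypothetical package excerpt, not runnable outside Spack; the package and executable names are made up for illustration):

```python
class Mypkg(AutotoolsPackage):
    # Old-style check, renamed after the phase it follows; still run via
    # callbacks during `spack install --run-tests`:
    @run_after('install')
    @on_package_attributes(run_tests=True)
    def install_test(self):
        make('check')

    # `test` is reserved for stand-alone tests of an existing
    # installation, driven by the `spack test` command:
    def test(self):
        self.run_test('mypkg-cli', ['--version'],
                      expected=[str(self.version)])
```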
Tamara Dahlgren
035e7b3743 tests: Added preliminary smoke test for perl (#14592)
* Added install test for perl, including use statements
2020-09-23 12:35:59 -07:00
Tamara Dahlgren
473457f2ba tests: Preliminary m4 smoke tests (#14553)
* Preliminary m4 smoke tests
2020-09-23 12:35:59 -07:00
Tamara Dahlgren
490bca73d1 Change variable name to 'standard' file to avoid confusion with function (#14589) 2020-09-23 12:35:58 -07:00
Tamara Dahlgren
59e885bd4f tests: Preliminary patchelf smoke tests (#14551)
* Initial patchelf smoke tests
2020-09-23 12:35:57 -07:00
Gregory Becker
966fc427a9 copy test data into './data' in test environment 2020-09-23 12:35:56 -07:00
Gregory Becker
8a34511789 improved error printing 2020-09-23 12:35:55 -07:00
Gregory Becker
8f255f9e6a fix reporter call for install command 2020-09-23 12:35:54 -07:00
Gregory Becker
4d282ad4d9 Changes in cmd/test.py in develop mirrored to cmd/unit-test.py 2020-09-23 12:35:53 -07:00
Gregory Becker
7216451ba7 tests occur in temporary directory, can be kept for debugging 2020-09-23 12:35:52 -07:00
Gregory Becker
e614cdf007 improve error catching/handling/re-raising 2020-09-23 12:35:51 -07:00
Gregory Becker
bc486a961c make test fail 2020-09-23 12:35:50 -07:00
Gregory Becker
a13eab94ce improve logging and add junit basics 2020-09-23 12:35:49 -07:00
Gregory Becker
6574c6779b python3 syntax for re-raising an error with the old traceback 2020-09-23 12:35:48 -07:00
Gregory Becker
d2cfbf177d make cdash test reporter work for testing 2020-09-23 12:35:46 -07:00
Gregory Becker
bfb97e4d57 add reporting format options to spack test 2020-09-23 12:35:14 -07:00
Gregory Becker
4151224ef2 WIP infrastructure for Spack test command to test existing installations 2020-09-23 12:22:26 -07:00
darmac
e294a1e0a6 fasttext: new package at v0.9.2 (#18890) 2020-09-23 18:53:31 +02:00
eugeneswalker
9eb87d1026 OLCF Ascent gitlab ci trigger: pass SPACK_REF (#18875) 2020-09-23 09:35:29 -07:00
vvolkl
394a23d392 py-uproot4: added new package at v0.0.27 (#18891) 2020-09-23 18:17:57 +02:00
vvolkl
3c418d9faa py-awkward1: added new package at v0.3.1 (#18892) 2020-09-23 18:15:32 +02:00
Matthieu Dorier
4277bc6429 nlohmann-json-schema-validator: new package at v2.1.0 (#18837) 2020-09-23 17:28:33 +02:00
Howard Pritchard
aedc056f9a trilinos: patch for cray cce fortran compiler (#18164)
two patchfiles needed since this file changed between 12.12.1 and 12.14.1

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
2020-09-23 09:54:51 -05:00
darmac
562f504000 Add new package: addrwatch (#18728)
* Add new package: addrwatch

* addrwatch: refine url
2020-09-23 14:02:03 +02:00
Andre Sailer
8fac02e437 LCIO: added v2.15.[0123] (#18841) 2020-09-23 13:55:01 +02:00
Mathias Anselmann
b578d55d12 setting old GO default values for older trilinos versions to (hopefully) not break the installation; adjusting the dealii package to just explicitly set GO if trilinos >= 12.18.1 is installed (#15439) 2020-09-22 18:52:05 -05:00
Jen Herting
bbb6b14540 treelite: new package at v0.93 (#18861) 2020-09-22 22:30:04 +02:00
Massimiliano Culpo
92b8177b77 gromacs: remove 'rdtscp' variant, deduce the flag from the target (#18868)
refers #18858
2020-09-22 13:11:35 -06:00
Miroslav Stoyanov
acf4dc2e12 Heffte: add magma variant (#18849) 2020-09-22 15:05:00 -04:00
Martin Pokorny
a23d67f7ea casacore: added v3.3.0 (#18870) 2020-09-22 20:55:58 +02:00
Andrew Gaspar
0d0ba79bfb Rust: added v1.46.0 (#18863) 2020-09-22 20:54:54 +02:00
Jordan Ogas
435c8862a6 charliecloud: added v0.19 (#18866) 2020-09-22 20:45:29 +02:00
Glenn Johnson
84dbf6948e Set conflicts parameter for cuda-11 (#18847)
Magma is not currently compatible with CUDA-11. While this is reflected
in the package, it is done with a comment in a `depends_on` directive,
which has the effect of trying to install a version of CUDA that may be
different from the one in the current environment, without any message
to the end user. A `conflicts` is a better way to handle this.
2020-09-22 12:35:13 -04:00
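The distinction the commit draws — a silent version bound inside `depends_on` versus an explicit `conflicts` — looks roughly like this in a package file (hypothetical excerpt, not the actual magma/package.py):

```python
class Magma(CMakePackage):
    # Silent: a bound like this only steers concretization, with no
    # message when the environment already provides CUDA 11.
    # depends_on('cuda@:10.2')  # incompatible with CUDA 11

    # Explicit: concretization fails with a message the user can act on.
    depends_on('cuda')
    conflicts('^cuda@11.0:', msg='Magma is not yet compatible with CUDA 11')
```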
Adam J. Stewart
9558377f0f Bash: fix build with Xcode 12 (#18843) 2020-09-22 12:25:54 -04:00
Ganesh Kumar
9e906e2d47 ROCm 3.8 Stage1 Components (#18830)
* ROCm 3.8 Stage1 Components

* version review comments

* 3.5 dependency restrictions

Co-authored-by: root <root@mlseqa-hyd-virt-srv-07.amd.com>
2020-09-22 11:09:03 -05:00
Simon Pintarelli
7d0ae0f295 sirius: use -DCUDA_ARCH for develop, version >7.0.0 (#18852)
Also remove master branch
2020-09-22 17:35:11 +02:00
victorusu
a078e2ba13 ReFrame: added v3.1 (#18860) 2020-09-22 17:19:51 +02:00
Chris White
30d24db116 Added RAJA v0.12.1 and Umpire v4.0.1 (#18756)
Also renamed 'master' to 'main'
2020-09-22 15:40:16 +02:00
Hadrien G
a1e19de8e1 acts: added v1.0 (#18859) 2020-09-22 15:05:59 +02:00
Christoph Junghans
bdefac6964 miniqmc: fix install (#18857) 2020-09-21 18:15:27 -06:00
Robert Blake
b272e00d6a util-linux: fix bash completion install errors. (#18696)
* Disable bash completion by default.

* flake8

* Adding explicit dependence on libuuid

* Adding explicit dependence on cryptsetup

This way we don't pick up host crypto packages by mistake.

* Fixing the completion directory.

* Update var/spack/repos/builtin/packages/util-linux/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* flake8

* Removing libuuid linkage according to @michaelkuhn on #18696

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-21 16:11:44 -05:00
Christoph Junghans
0dd7f140b0 nut: fix install (#18848)
* nut: fix install

* flake8
2020-09-21 16:09:53 -05:00
Christoph Junghans
165dbf719d fix spack-build usages (#18846) 2020-09-21 12:19:06 -05:00
Greg Becker
5565b6494d typo (#18845) 2020-09-21 11:54:23 -05:00
Joseph Wang
88187bc63c add groff and ghostscript (#18803)
Without these packages, graphviz will set groff/ghostscript to false, which will cause the build to fail.
2020-09-21 17:39:13 +02:00
Adam J. Stewart
678e9f5ad2 libksba: add new version (#18798) 2020-09-20 08:53:42 -04:00
Adam J. Stewart
7d5ddb1fc6 readline: simplify linking to ncurses (#18801) 2020-09-19 16:32:42 -05:00
Adam J. Stewart
489377d051 SciPy: add patch to fix XCode 12 build (#18800) 2020-09-19 16:32:26 -05:00
Adam J. Stewart
458dc81878 ncurses: fix libs method (#18799) 2020-09-19 16:32:05 -05:00
Adam J. Stewart
f570d7a45c graphviz: fix build with Apple Clang 12.0.0 (#18797) 2020-09-19 16:29:42 -05:00
Adam J. Stewart
36e9c1eba3 fish: relax ncurses dependency constraints (#18796) 2020-09-19 16:29:27 -05:00
Massimiliano Culpo
fcb4dfc307 Ensure variant defaults are parsable from CLI. (#18661)
- Add a unit test to check if there are unparsable defaults
- Fix 'rust' and 'nsimd' variants
2020-09-19 07:54:26 +02:00
Larry Knox
fff2f34de8 Hdf5 1.10.7 (#18712)
* Update hdf5/package.py for HDF5 1.10.7 release and obsolete home url.
Add maintainer to hdf/package.py.

* remove stray space.

* Remove unnecessary /display/support from homepage urls.
2020-09-18 21:38:56 -05:00
Tom Payerle
e36498cb46 bedtools2: Add missing python build dependency (#18744) (#18746)
Makefile invokes python to build some scripts
See #18744
2020-09-18 21:38:21 -05:00
Michael Kuhn
6198963ede popt: Add missing libiconv dependency (#18731)
Without this dependency, the build fails due to undefined references.
2020-09-18 18:00:00 -04:00
Adam J. Stewart
b44cf08cb2 py-notebook: add new version (#18638) 2020-09-18 14:46:12 -05:00
Greg Becker
7585b37865 do out of source builds in hashed directories (#18574) 2020-09-18 12:21:13 -07:00
Shahzeb Siddiqui
58fb6cdaad trigger ascent e4s pipeline on merge to spack develop (#18655)
* trigger ascent e4s pipeline on merge to spack develop

* change pipeline name: ecpcitest/e4s is the pipeline that will be triggered on merge to develop; it's the E4S use-case.
2020-09-18 10:38:29 -07:00
Glenn Johnson
6e82776773 Add mumax-3.10 release version (#18740)
This PR adds the current release version of mumax and tweaks the install
of the previous beta version.

- Set the url parameter to reflect the release version over the beta
  version. Hopefully, this will be consistent going forward.
- Set an explicit url for the previous beta version.
- Accept values for `cuda_arch`. The previous version had its own list
  but the release version does not.
- Replace the built in cuda compute capabilities list with the one
  provided by Spack for the 3.10beta version.
2020-09-18 13:36:00 -04:00
Seth R. Johnson
71c7e28ca7 swig: add version 4.0.2 and 4.0.2-fortran (#18741) 2020-09-18 13:35:33 -04:00
Greg Becker
2e4892c111 env view failures: print underlying error message (#18713) 2020-09-18 10:21:14 -07:00
Jen Herting
44c7826892 [py-pyarrow] added variant cuda (#18716)
* [py-pyarrow] added variant cuda

* [py-pyarrow] simplifying variant dependencies
2020-09-18 10:28:28 -05:00
darmac
8cb1192050 util-linux: fix build error (#18647)
* util-linux: fix build error

* refine install stage
2020-09-18 10:04:28 -05:00
darmac
a4cdf664c6 Add new package: shiro (#18541)
* Add new package: shiro

* refine description and dependencies
2020-09-18 10:03:32 -05:00
ketsubouchi
403ea4384e ocaml: support 4.11 (#18705) 2020-09-18 10:02:50 -05:00
Toyohisa Kameyama
7d0a46c051 iwyu: Require llvm+all_targets on non-x86_64 systems (#18710) 2020-09-18 10:31:03 -04:00
Axel Huebl
a8b6faf430 py-recommonmark: fix URL and docutils version (#18714) 2020-09-18 12:07:57 +02:00
Axel Huebl
275583c02f py-breathe: added v4.21.0 (#18722) 2020-09-18 07:44:54 +02:00
Mark W. Krentel
80fe51046c libpfm4: add version 4.11.0 (#18720)
Add version 4.11.0 for libpfm4.
Add myself as maintainer.
2020-09-17 15:58:23 -05:00
Jen Herting
e329e13f32 [arrow] added cuda variant (#18715) 2020-09-17 15:26:51 -05:00
Glenn Johnson
4b41d56bc3 Add bart-0.6.00 (#18717)
This PR adds version 0.6.00 of bart.
2020-09-17 15:25:08 -05:00
Adam J. Stewart
f9699fd3ff py-nbconvert: add new version (#18636) 2020-09-17 15:23:59 -05:00
Adam J. Stewart
69d8417d8a py-nbclient: add new package (#18628) 2020-09-17 12:54:03 -07:00
Adam J. Stewart
01df552149 py-matplotlib: add v3.2.2 (#18681) 2020-09-17 12:46:20 -07:00
t-nojiri
b66d756da6 bowtie : Fix for aarch64 (#18709) 2020-09-17 11:42:32 -05:00
Glenn Johnson
7c01c64d53 gpu-burn: allow to build with non-gcc compilers (#18707)
This PR modifies the patch to use $(CXX) rather than g++ to allow the
spack compiler to be used.
2020-09-17 17:24:22 +02:00
Glenn Johnson
2576d8d767 Update libbeagle (#18703)
This PR fixes a couple of things with the libbeagle package.

- libbeagle can only be built for one GPU type. Add a test for that.
- version 2 had the arch statement in
  libhmsbeagle/GPU/kernels/Makefile.am but version 3 has it in
  configure.ac. Put the variant specified value in configure.ac for
  consistency.
2020-09-17 10:04:21 -05:00
Ganesh Kumar
0f332c73a6 Rocm 3.7 miopen, miopengemm, rocalution and rocm-opencl (#18690)
* ROCm 3.5 miopen recipe

* fixing flake8 issues

* cmake variant fix

* min support fix

* variant possible values

* rocm3.7 change for miopen, rocalution and rocm-opencl

* review comments
2020-09-17 10:00:45 -05:00
Enrico Usai
91bd99b4b2 aws-parallelcluster: add 2.9.1 and 2.9.0 releases (#18676)
Checked all the third party dependencies for all the versions.
2020-09-17 09:59:11 -05:00
arjun-raj-kuppala
7fb83047fd Hipify clang with AMD rocm 3.7.0 update (#18672)
* Hipify clang with rocm 3.7.0 update

* Update var/spack/repos/builtin/packages/hipify-clang/package.py

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2020-09-16 22:18:48 -05:00
Greg Sjaardema
9a5f043ce6 seacas: Force use of non-mpi-enabled hdf5 (#18702)
Due to recent changes in the `netcdf-c` package, it is now necessary to explicitly request a non-mpi-enabled hdf5 build if building a non-mpi-enabled seacas.
2020-09-16 16:37:13 -05:00
Robert Blake
f95d7959b2 lustre: Adding external support. Closing #18698 (#18700) 2020-09-16 16:29:59 -05:00
Simon Frasch
91649763ab spla: Add version 1.1.1 and fix cmake flag when build with +cuda (#18699) 2020-09-16 16:27:00 -05:00
Sajid Ali
7c23498f1d bump CGAL version (#18693)
* bump CGAL version

* Address reviewer comments

* flake8 fix

* Address reviewer comments

* Address reviewer comments
2020-09-16 16:22:40 -05:00
Andre Sailer
635b8243fe [dd4hep]: add variant lcio (#18691) 2020-09-16 17:33:14 +01:00
Tiziano Müller
0d5c065678 libvdwxc: unbreak concretization, request fftw-api (#18688)
* libvdwxc: unbreak concretization, request fftw-api

mixing both fftw and fftw-api in a dependency tree can trigger the
following:

```
$ spack spec cp2k@master +sirius
==> [2020-09-16-12:36:06.552981] sirius applying constraint gsl
==> [2020-09-16-12:36:06.554270] sirius applying constraint openblas@0.3.10%gcc@7.5.0~consistent_fpcsr~ilp64+pic+shared threads=none arch=linux-opensuse_leap15-sandybridge
Traceback (most recent call last):
  File "./bin/spack", line 64, in <module>
    sys.exit(spack.main.main())
  File "/data/tiziano/debug-spack/spack2/lib/spack/spack/main.py", line 762, in main
    return _invoke_command(command, parser, args, unknown)
  File "/data/tiziano/debug-spack/spack2/lib/spack/spack/main.py", line 490, in _invoke_command
    return_val = command(parser, args)
  File "/data/tiziano/debug-spack/spack2/lib/spack/spack/cmd/spec.py", line 103, in spec
    spec.concretize()
  File "/data/tiziano/debug-spack/spack2/lib/spack/spack/spec.py", line 2228, in concretize
    user_spec_deps=user_spec_deps),
  File "/data/tiziano/debug-spack/spack2/lib/spack/spack/spec.py", line 2716, in normalize
    visited, all_spec_deps, provider_index, tests)
  File "/data/tiziano/debug-spack/spack2/lib/spack/spack/spec.py", line 2654, in _normalize_helper
    dep, visited, spec_deps, provider_index, tests)
  File "/data/tiziano/debug-spack/spack2/lib/spack/spack/spec.py", line 2613, in _merge_dependency
    visited, spec_deps, provider_index, tests)
  File "/data/tiziano/debug-spack/spack2/lib/spack/spack/spec.py", line 2654, in _normalize_helper
    dep, visited, spec_deps, provider_index, tests)
  File "/data/tiziano/debug-spack/spack2/lib/spack/spack/spec.py", line 2554, in _merge_dependency
    provider = self._find_provider(dep, provider_index)
  File "/data/tiziano/debug-spack/spack2/lib/spack/spack/spec.py", line 2489, in _find_provider
    providers = provider_index.providers_for(vdep)
  File "/data/tiziano/debug-spack/spack2/lib/spack/spack/provider_index.py", line 80, in providers_for
    return sorted(s.copy() for s in result)
  File "/data/tiziano/debug-spack/spack2/lib/spack/llnl/util/lang.py", line 249, in <lambda>
    lambda s, o: o is not None and s._cmp_key() < o._cmp_key())
TypeError: '<' not supported between instances of 'str' and 'NoneType'
```

while at the same point disallowing MKL as a fftw provider.
Solving both issues by depending on `fftw-api@3` instead and adding a
conflict on `^fftw~mpi` when using `+mpi` (thanks to alalazo).

* cp2k: use conflicts instead of runtime checks for fftw/openblas variants
2020-09-16 10:41:46 -05:00
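The dependency change described above, sketched as package directives (illustrative, not the exact libvdwxc/package.py):

```python
class Libvdwxc(AutotoolsPackage):
    # Depend on the virtual fftw-api@3 so the whole tree agrees on one
    # FFT provider instead of mixing concrete `fftw` and virtual
    # `fftw-api` dependencies (the source of the TypeError above)...
    depends_on('fftw-api@3')
    # ...while an MPI-enabled build still requires an MPI-enabled FFTW:
    conflicts('^fftw~mpi', when='+mpi')
```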
t-nojiri
cb218058bc fraggenescan: Modify build_targets for aarch64 (#18687) 2020-09-16 10:40:33 -05:00
Xavier Delaruelle
0b3e860608 environment-modules: add version 4.6.0 (#18686) 2020-09-16 10:39:55 -05:00
Robert Pavel
2f0565de64 Added spackage for cosmoflow-benchmark proxy app (#18685)
* Initial Draft of Cosmoflow Spackage

Need to add in logic to streamline cpu/gpu builds

* Added ~cuda logic to cosmoflow spackage

Added logic to support a ~cuda build for cosmoflow

* Requested Changes to Cosmoflow Spackage

Made requested changes to cosmoflow spackage
2020-09-16 10:18:01 -05:00
Severin Strobl
e41c3ad1fc Likwid versions >= 5.0.0 depend on Lua 5.2. (#18675)
* Likwid versions >= 5.0.0 depend on Lua 5.2.

According to https://github.com/RRZE-HPC/likwid/issues/324 recent
versions of Likwid require Lua 5.2.

* Update var/spack/repos/builtin/packages/likwid/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-16 10:16:08 -05:00
Jason Lee
1a9f97fd0d jemalloc: Use AutotoolsPackage and allow for arbitrary public API prefixes (#18680)
Removed je variant
2020-09-15 20:37:05 -05:00
Greg Sjaardema
6a3583f181 MatIO: Update repository and add new versions (#18683)
MatIO development has switched to GitHub from SourceForge.  Updated the `git` and `url` variables and added the four new versions (1.5.14 -- 1.5.17) that have been released since the last update of this package.
2020-09-15 16:43:45 -05:00
arjun-raj-kuppala
bad9f2bc28 AMD ROCmValidationSuite recipe for 3.5.0 and 3.7.0 (#18678)
* AMD ROCmValidationSuite recipe for 3.5.0 and 3.7.0

* Updated the PR comments for rocmvalidationsuite recipe
2020-09-15 13:16:41 -05:00
Nikolay Simakov
1b1bfd883a namd: Added optimization auto-selection for skylake-X+ CPU for 2.14 and 2.15a1 (AVX512 tile) (#18671) 2020-09-15 10:08:26 -05:00
Sergey Kosukhin
5cff304809 CLAW: update the package (#18673) 2020-09-15 10:04:59 -05:00
Tom Payerle
6752a1c377 Qbox minor issues 18664 (#18665)
* qbox: install to correct directory structure

* qbox: Have qb executable put in bin rather than src subdir

* qbox: Fix python script shebangs to use python from path

* qbox: Add dependencies on gnuplot, python2 for utilities

* qbox: fix flake8 issue

* qbox: Add $prefix/util to PATH
2020-09-15 09:54:26 -05:00
ketsubouchi
33e8dcad99 nseg: add return 0; to void functions (#18481) 2020-09-15 10:24:51 +02:00
Adam J. Stewart
5e6875008c fish: add dependencies, patch MacOS (#18526) 2020-09-15 08:43:53 +02:00
Adam J. Stewart
5f0c3427ae py-cmocean: added new package at v2.0 (#18614) 2020-09-15 08:36:26 +02:00
Adam J. Stewart
2af9e44340 Add setuptools run-time dependency to various Python packages (#18616) 2020-09-15 08:30:59 +02:00
Adam J. Stewart
b3ee04c6ef py-jupyterlab-pygments: added new package at v0.1.1 (#18627) 2020-09-15 08:30:08 +02:00
Adam J. Stewart
f7030287d3 py-nest-asyncio: add new package at v1.4.0 (#18629) 2020-09-15 08:28:51 +02:00
Adam J. Stewart
445988011c py-ipykernel: added v5.3.4, moved url to PyPI (#18630) 2020-09-15 08:28:17 +02:00
Adam J. Stewart
ef37852bb4 py-ipython: added v7.18.1 (#18631) 2020-09-15 08:26:27 +02:00
Adam J. Stewart
40a66317e8 py-jupyter-client: added v6.1.7, moved url to PyPI (#18632) 2020-09-15 08:25:39 +02:00
Adam J. Stewart
792c48a558 py-jupyter-core: updated the type of the setuptools dependency (#18633) 2020-09-15 07:42:28 +02:00
Adam J. Stewart
0448ade7b5 py-jupyterlab-server: added v1.2.0 (#18634) 2020-09-15 07:39:30 +02:00
Adam J. Stewart
defde398c4 py-argon2-cffi: added new package at v20.1.0 (#18626) 2020-09-15 07:38:16 +02:00
Adam J. Stewart
77d20906c0 py-jupyterlab: added v2.2.7 (#18635) 2020-09-15 07:20:15 +02:00
Adam J. Stewart
682223f1f4 py-nbformat: add v5.0.7, moved url to PyPI (#18637) 2020-09-15 07:18:12 +02:00
Adam J. Stewart
60a5a176fb py-terminado: added v0.8.3 (#18639) 2020-09-15 07:15:38 +02:00
Adam J. Stewart
c59836222e py-traitlets: added v5.0.4, moved url to PyPI (#18640) 2020-09-15 07:15:00 +02:00
Robert Pavel
4fc2370559 Added Missing Tag to Cradl Spackage (#18669)
Added missing proxy apps tag to cradl spackage
2020-09-14 18:55:53 -06:00
Robert Pavel
a2673aeddd CRADL Machine Learning Proxy Spackage (#18668)
* Initial CRADL Spackage Work

Currently resolving `--single-version-externally-managed` error

* Fixed GPUtil Issues

Thanks to Vinay Ramakrishnaiah for overwriting install

* Finished CRADL Install Function

Finished CRADL install function which is basically copying the scripts
to the install directory. Also resolved flake8 issues for PR purposes
2020-09-14 14:25:49 -06:00
Simon Frasch
49512e21ab SIRIUS: Update dependencies (#18622)
* sirius: Fixed dependency spfft when build with +rocm

* sirius: Added new dependency spla for develop build

* sirius: Added maintainer
2020-09-14 12:23:32 -05:00
Scott Wittenburg
f537d5bb58 Make sure each develop pipeline tests associated commit 2020-09-14 10:37:42 -06:00
Scott Wittenburg
28ef5b1204 Do not assume we sit in the directory where the env file lives. 2020-09-14 10:37:42 -06:00
Scott Wittenburg
031490f6aa Remove :<name> interpolation, add SPACK_VERSION variables
Also fix issues with documentation to reflect changes
2020-09-14 10:37:42 -06:00
Scott Wittenburg
bf90cdd6c7 Document pipeline keys which can be global but overridden
Update pipelines documentation to describe how 'tags', 'variables',
'image', 'before_script', 'script', and 'after_script' can be
supplied at the top level, to be used by any of the runner mappings,
and also overridden by any of the runner mappings.

Also show an example of capturing the custom spack SHA at pipeline
generation time, so all jobs are sure to run with the same version
of spack, as a means to illustrate the $env:VARIABLE_NAME syntax.
2020-09-14 10:37:42 -06:00
Scott Wittenburg
d9e0718c9d Allow overridable global runner attributes 2020-09-14 10:37:42 -06:00
Scott Wittenburg
e686f1500e Update pipeline documentation to describe user-provided scripts 2020-09-14 10:37:42 -06:00
Scott Wittenburg
e18612a321 Add test for variable interpolation and scripts 2020-09-14 10:37:42 -06:00
Scott Wittenburg
2386f7582a Support variable interpolation at pipeline generation time 2020-09-14 10:37:42 -06:00
Scott Wittenburg
ace52bd476 Provide your own script, before_script, and after_script 2020-09-14 10:37:42 -06:00
Massimiliano Culpo
4ca7d46e15 Fix a typo in test/concretize.pyi (#18662) 2020-09-14 09:58:14 -05:00
darmac
5f47170492 Add new package: webbench (#18650)
* Add new package: webbench

* Update var/spack/repos/builtin/packages/webbench/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-14 09:57:40 -05:00
ketsubouchi
41b68741ec cpio: add --rtlib=compiler-rt for %fj (#18619)
* cpio: add --rtlib=compiler-rt for %fj

* cpio: simplify if

* Update var/spack/repos/builtin/packages/cpio/package.py

This seems better.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-14 09:55:47 -05:00
t-nojiri
a250006449 ngmlr: support for aarch64 (#18621)
* ngmlr: support for aarch64

* ngmlr: Fixed patch file
2020-09-13 23:14:59 -05:00
ketsubouchi
3cfce42563 nek5000: Support Fujitsu fortran (#18659) 2020-09-13 23:14:04 -05:00
Paul
c29b74e7ad Updated FZF package for newer versions. (#18608)
* Support for external find.
* Added latest version (0.22.0) and conditions to package to continue
support for older versions.
2020-09-13 11:27:56 -05:00
Joseph Wang
58f101de88 use github for download (#18657) 2020-09-13 10:23:46 -05:00
Andrew W Elble
a734dabf2b new package: py-reproject (#18641)
* new package: py-reproject
add setuptools build/run dep to py-astropy-healpix

* fixes

* fix
2020-09-12 15:37:34 -05:00
darmac
85f7a8bf71 Add new package: delta (#18648) 2020-09-12 15:37:14 -05:00
darmac
10dab474ce cvs: add a patch for segv issue (#18649) 2020-09-12 09:47:39 -05:00
darmac
e355fc16ad Add new package: dbxtool (#18651) 2020-09-12 09:45:05 -05:00
darmac
3211ac5136 Add new package: geoip-api-c (#18653) 2020-09-12 09:44:28 -05:00
darmac
cde4654525 Add new package: libspiro (#18654) 2020-09-12 09:44:02 -05:00
darmac
6d39e6ebea Add new package: hping (#18311)
* Add new package: hping

* Update var/spack/repos/builtin/packages/hping/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/hping/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Fix flake8 errors

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-12 09:36:30 -05:00
darmac
128731ec74 Add new package: jetty-project (#18555)
* Add new package: jetty-project

* refine dependency

* refine maven version
2020-09-12 09:35:38 -05:00
Richarda Butler
8116153f2a bugfix: include configuration ignoring files with the same basename (#18487)
* Use the config path instead of the basename

* Removing unused variables

Co-authored-by: Greg Becker <becker33@llnl.gov>

* Test
Making sure that if there are two include config files with the same basename, both are implemented

* Edit test assert

Co-authored-by: Greg Becker <becker33@llnl.gov>
2020-09-11 16:45:36 -07:00
Robert Blake
afb0883762 ncurses: adding external support. (#18609)
* ncurses: adding external support.

* Update var/spack/repos/builtin/packages/ncurses/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/ncurses/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/ncurses/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fixing includes.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-11 16:00:11 -05:00
Massimiliano Culpo
8ad2cc2acf Environments: Avoid inconsistent state on failed write (#18538)
Fixes #18441 

When writing an environment, there are cases where the lock file for
the environment may be removed. In this case there was a period 
between removing the lock file and writing the new manifest file
where an exception could leave the manifest in its old state (in
which case the lock and manifest would be out of sync).

This adds a context manager which is used to restore the prior lock
file state in cases where the manifest file cannot be written.
2020-09-11 10:57:29 -07:00
Adam J. Stewart
e7040467f2 NumPy: added v1.19.2 (#18615) 2020-09-11 15:38:24 +02:00
srekolam
1aceb38b89 Changes for hipsparse, rocthrust recipes for rocm_3.7.0 (#18497)
* changes for hipsparse,rocthrust recipes for rocm_3.7.0

* changes to rocrand for 3.7.0

* version changes
2020-09-10 17:33:21 -05:00
srekolam
4a474d9d67 recipes changes for hipblas, rocprim, rocfft, rocsolver for rocm3.7.0 (#18495)
* recipes changes for rocprim,rocfft,rocsolver for rocm3.7.0

* changes to hipcub recipe for rocm-3.7.0

* changes to address review comments
2020-09-10 17:32:41 -05:00
Robert Pavel
29645ceba5 Adding Missing Versions for Flux-Sched/Flux-Core and Compiler Flags (#18612)
* Checksummed New Flux Versions

Checksummed new flux versions to let spack detect them

* Added CXXFlags to build Flux-sched

Added missing cxxflags to build flux-sched
2020-09-10 16:53:28 -05:00
Robert Pavel
3fabdb6e9b Adding Cuda Variant to SW4Lite (#18590)
* Adding Cuda Variant to SW4Lite

Added cuda variant of sw4lite  as per guidance in README

* Updated SW4Lite+cuda to Current Header Conventions

Updated sw4lite+cuda to use current conventions for spackage include
dirs

* Fixing FLake8 Issue with Sw4lite+cuda Fix

Fixed overly long line and further underlined sticky note reminding me
to run flake8 BEFORE pushing

* Switching to Spack Compiler Wrapper

Switching to spack compiler wrapper for consistency
2020-09-10 16:53:13 -05:00
Tim Haines
ff6ca57dda Dyninst: add v10.2.1 (#18611) 2020-09-10 16:09:06 -05:00
Rémi Lacroix
114317464b Orca: update the package. (#18593)
* Orca: Add new versions.

* Orca: Support OpenMPI without the legacy wrappers.

By default, Spack builds OpenMPI without the legacy wrappers when using the Slurm scheduler. This breaks Orca since its binaries are hardcoded to call "mpirun". To work around this issue, add an "mpirun" wrapper which calls "srun" when required.
2020-09-10 11:55:13 -05:00
Johannes Blaschke
757dad370f Bugfix for fish support: overly zealous arg matching (#18528)
* bugfix for issue 18369

* fix typo

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-10 10:01:44 -05:00
Tiziano Müller
2bc9821c43 update CP2K pkg for 8+ (#18607)
* cp2k: do not support ~openmp for v8+

* sirius: version bump

* cp2k: fix overlapping deps for elpa

fixes #18029

* cp2k: update SIRIUS dependency for v8+

* spfft: requires CMake 3.11+

* cp2k: fix build with +sirius
2020-09-10 10:00:31 -05:00
Toyohisa Kameyama
34f4049815 dpdk: Avoid option conflicts between spack wrappers and Makefiles on aarch64 gcc. (#18603) 2020-09-10 09:59:18 -05:00
t-nojiri
e13e2b0d54 prism: support for aarch64 (#18562)
* prism: support for aarch64

* prism: Change patch file.
2020-09-10 09:56:19 -05:00
Adam J. Stewart
25291cf01c GDAL: fixed Java bindings, added v3.1.3 (#18494) 2020-09-10 15:08:24 +02:00
Tomoki, Karatsu
778e659a03 openfoam: Set 'FOAM_SIGFPE' when using Fujitsu compiler. (#18601) 2020-09-10 15:01:50 +02:00
Tomoki, Karatsu
d4535f3115 fj: fixed homepage URL. (#18602) 2020-09-10 14:55:52 +02:00
Fabian Brandt
fc919e490e NetworKit: update to v7.1, including dependencies (#18604) 2020-09-10 14:53:16 +02:00
srekolam
651cffae0a Rocprofiler changes for rocm-3.7.0 release (#18599)
* rocprofiler changes for rocm-3.7.0 release

* fix flake8 errors
2020-09-09 21:43:07 -05:00
Brian Van Essen
87dc324f36 Support older cuda arch capabilities. (#18597) 2020-09-09 21:42:34 -05:00
ketsubouchi
b014ffcd3d darshan-util: remove return(-1) from void function (#18504)
* darshan-util: remove return(-1) from void function

* Update var/spack/repos/builtin/packages/darshan-util/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/darshan-util/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-09 21:03:59 -05:00
psakievich
7b2c59e6cf Fix typo in nalu-wind package (#18596) 2020-09-09 19:12:14 -05:00
Paul
9eac1ed6a8 Support external find for gpgme. (#18594) 2020-09-09 17:39:33 -05:00
Christoph Junghans
691e46c4f5 Packages/gamess ri mp2 mini app (#18595)
* gamess-ri-mp2-miniapp: initial import

* flake8

* Update var/spack/repos/builtin/packages/gamess-ri-mp2-miniapp/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-09 16:10:59 -06:00
Glenn Johnson
f1652e89af Add new versions of intel-mkl (#18592) 2020-09-09 15:27:36 -05:00
arjun-raj-kuppala
9c8cfcca0f Adding hipify-clang and aomp recipe for rocm (#18333)
* Adding aomp recipe for rocm 3.5.0 release

* hipify-clang rocm recipe

* incorporated suggested changes on PR#18333 for rocm aomp recipe

* remove binutils dependency and update to devicelibs tar path
2020-09-09 13:47:51 -05:00
Jen Herting
857530c5ae [parallel] added version 20200822 (#18591) 2020-09-09 10:58:07 -07:00
Robert Brunner
1ae50f8022 Added specific "@master" version specifier for component libraries when building (#18579)
the develop version of SCR
2020-09-09 12:52:36 -05:00
Jen Herting
f7d1f845f4 [py-thinc] fixed checksum (#18571) 2020-09-09 11:25:06 -05:00
Rémi Lacroix
fa04ad5d92 tcl module files: fix configuration overriding (#18514)
This is a special case of overriding since each section is being matched with the current spec.

The trailing ':' for sections with override is now removed when parsing the configuration, so the special handling for the modules configuration stopped working, but it went unnoticed.
2020-09-09 18:05:58 +02:00
Garth N. Wells
3dedd2e321 (py-)fenics-dolfinx: fix dependencies (#18586)
* Fix (py-)fenics-dolfinx dependencies

* flake8 updates
2020-09-09 10:55:06 -05:00
Ganesh Kumar
4c5151327f Rocm 3.7 rccl (#18587)
* ROCm 3.5 miopen recipe

* fixing flake8 issues

* cmake variant fix

* min support fix

* variant possible values

* ROCm 3.7 RCCL changes
2020-09-09 10:49:40 -05:00
Ganesh Kumar
a18700a86a Rocm 3.7 rocm smi (#18522)
* ROCm 3.5 miopen recipe

* fixing flake8 issues

* cmake variant fix

* min support fix

* variant possible values

* ROCm 3.7 support for rocm-smi

* review comments change

* miopen merge conflict resolve

* reverting back from copy_tree to install_tree
2020-09-09 08:22:40 -05:00
Simon Frasch
290b77fe43 spla: Add version 1.1.0 and ROCm support (#18561) 2020-09-09 07:22:53 -05:00
Toyohisa Kameyama
4472914847 audacious: added gettext and iconv dependency. (#18584) 2020-09-09 11:23:01 +02:00
Adam J. Stewart
8494d26c0a nn-c: fix pic flags (#18478) 2020-09-09 10:25:57 +02:00
Adam J. Stewart
ba47a057f0 py-torchvision: add variant to set image backend (#18500) 2020-09-09 10:16:50 +02:00
Adam J. Stewart
96364235e3 readline: fix build with ncurses~termlib (#18524) 2020-09-09 10:12:46 +02:00
Adam J. Stewart
4307b73299 pcre2: fix libs property (#18525) 2020-09-09 10:11:29 +02:00
Hadrien G
d24532f574 acts: added v0.32 and adapt to latest master changes (#18563) 2020-09-09 09:44:44 +02:00
Adam J. Stewart
346d12dc6c Pandas: added v1.1.2 (#18580) 2020-09-09 09:32:56 +02:00
t-nojiri
d11705f9d0 r-ff: support for aarch64 (#18585) 2020-09-09 09:01:51 +02:00
darmac
e3cd3fb9eb Add new package: quartz (#18539) 2020-09-08 21:29:26 -05:00
Martin Aumüller
ab51edecb5 openscenegraph: remove dependency on Qt for newer versions (#18531)
Starting with OpenSceneGraph 3.5.5, support for windows managed by Qt
has been moved to the separate project osgQt. Hence, a dependency on Qt
is no longer needed for version 3.5.5 or newer.
In order to still satisfy the dependency on OpenGL, a depends_on('gl')
has been added.
2020-09-08 21:28:27 -05:00
darmac
b1cea5c23e Add new package: kylin (#18537)
Co-authored-by: root <root@localhost.localdomain>
2020-09-08 21:27:01 -05:00
darmac
275ac86925 Add new package: sqlite-jdbc (#18540) 2020-09-08 21:26:11 -05:00
darmac
0f4f4a2e95 Add new package: orientdb (#18542) 2020-09-08 21:21:35 -05:00
darmac
97b7af01d9 Add new package: nacos (#18543) 2020-09-08 21:20:41 -05:00
darmac
0325cb564b Add new package: fastjson (#18545) 2020-09-08 21:17:19 -05:00
darmac
a0ba89d84a Add new package: guacamole-client (#18546) 2020-09-08 21:15:52 -05:00
darmac
a12ced781c Add new package: jansi-native (#18547) 2020-09-08 21:14:49 -05:00
darmac
fe05fc7fd5 Add new package: jline3 (#18548) 2020-09-08 21:14:00 -05:00
darmac
1ac24aca96 refine efivar install flow (#18557) 2020-09-08 21:08:38 -05:00
Jaroslav Hron
e2e90e4d6b Update package.py (#18552)
Without setting the build environment, the installation fails with

```
1 error found in build log:
     35946    fmtutil [INFO]: /usr/local/pkg/Installs/linux-ubuntu18.04-skylake_avx512/gcc7.4.0/texlive/20190410/rgs2nakycorkgzno/t
              exmf-var/web2c/pdftex/pdfcslatex.fmt installed.
     35947    fmtutil [INFO]: Disabled formats: 6
     35948    fmtutil [INFO]: Successfully rebuilt formats: 45
     35949    fmtutil [INFO]: Total formats: 51
     35950    fmtutil [INFO]: exiting with status 0
     35951    ==> [2020-09-07-21:23:21.482745] '/usr/local/pkg/Installs/linux-ubuntu18.04-skylake_avx512/gcc7.4.0/texlive/20190410/
              rgs2nakycorkgzno/bin/x86_64-linux/mtxrun' '--generate'
  >> 35952    /usr/bin/env: 'texlua': No such file or directory
```

Maybe there is a better way...
2020-09-08 21:05:18 -05:00
Garth N. Wells
af189e3ed9 Fix linking problem on macos (#18564) 2020-09-08 21:01:21 -05:00
Andrew W Elble
520308ad2b astra: update checksum, add other executables (#18567) 2020-09-08 21:00:27 -05:00
Tom Payerle
7165795c4a libxmms: add python build dependency (#18566) (#18568)
Building libxmms requires python, but it is not in the spack dependency list.
See #18566
2020-09-08 20:59:22 -05:00
Michael Kuhn
017331684e go: Add 1.15.1 and 1.14.8 (#18575) 2020-09-08 20:56:08 -05:00
Michael Kuhn
e43855bc8f node-js: Add 14.10.0 and 12.18.3 (#18576) 2020-09-08 20:55:40 -05:00
Michael Kuhn
9c4d79f8cd npm: Add 6.14.8 (#18577) 2020-09-08 20:55:12 -05:00
Adam J. Stewart
2f4d493744 Cython: add setuptools run-dependency (#18572)
Cython requires a library that is available in Python 3.8, or before
Python 3.8 with setuptools. This specifies that setuptools is a run
dependency to allow running with Python < 3.8
2020-09-08 17:27:49 -07:00
Tamara Dahlgren
88749de5c9 Clarify manual download required if unable to fetch package (#18242)
Clarify manual download required if unable to fetch (from mirror(s)); support (and tests) for package-specific download instructions
2020-09-08 17:15:48 -07:00
Tamara Dahlgren
6b30cd18d6 Update cray-libsci homepage and install error (#18581) 2020-09-08 16:22:25 -06:00
Richarda Butler
d721bd8070 commands: update help for spack install --yes-to-all (#18367)
`spack install --yes-to-all` doesn't actually make the build non-interactive,
but that is why people typically use it. This documents that you must also
specify `--no-checksum` for a fully non-interactive build.
2020-09-08 13:18:25 -07:00
Peter Josef Scheibel
ccd65895a6 print out debug information about which specs are applying which constraints 2020-09-08 12:19:02 -07:00
Adam J. Stewart
94e694b19f spack docs: http -> https (#18573) 2020-09-08 20:19:20 +02:00
Michal Sudwoj
7205a75427 Added nvptx variant to rust (#18209)
Co-authored-by: Andrew Gaspar <andrew.gaspar@outlook.com>

Co-authored-by: Andrew Gaspar <andrew.gaspar@outlook.com>
2020-09-08 11:28:59 -05:00
Rémi Lacroix
92bf9493cf Modules: Deduplicate suffixes but don't sort them. (#18351)
* Modules: Deduplicate suffixes but don't sort them.

The suffixes' order is defined by the order in which they appear in the configuration file.

* Modules: Modify tests to use spack_yaml.load_config.

spack_yaml.load_config ensures that the configuration is stored in an ordered manner. Without this change, the behavior of the tests did not match Spack's.

* Modules: Tweak the suffixes test to better catch ordering issues.
2020-09-08 08:43:03 -06:00
Gvozden Neskovic
c2b33b4444 gromacs: add zen2 target SIMD optimizations (#18551)
Co-authored-by: Gvozden Nešković <neskovic@dev06.compeng.uni-frankfurt.de>
2020-09-08 06:52:56 -06:00
Piotr Luszczek
73110b415d hpcc: add explicit C99 flag for older GCC versions (#18556) 2020-09-08 08:30:02 +02:00
Mark W. Krentel
850924e423 hpctoolkit: adjust some dependencies (#18558)
Hpctoolkit master and upcoming releases now want the +pic variant for
two dependencies, libunwind and xz.
2020-09-08 08:27:39 +02:00
Axel Huebl
64273da2cc openPMD-api: added v0.12.0 (#18560) 2020-09-08 08:18:21 +02:00
Massimiliano Culpo
28c6ce9714 SpecList: remove mutable types from __init__ arguments (#18515)
fixes #18439
2020-09-07 11:53:59 -07:00
jthies
dcee0a1d5d phist: added v1.9.1 (#18529) 2020-09-07 17:52:07 +02:00
ketsubouchi
03a808ec2d kim-api: add support for Fujitsu compilers (#18533) 2020-09-07 17:47:55 +02:00
Andre Sailer
138e2ad0a1 [LCIO]: changes to install/CPATH for python bindings (#18512) 2020-09-07 10:16:16 -05:00
darmac
b2b7bcd86a Add new package: kbd (#18436)
* Add new package: kbd

* fix description error
2020-09-07 10:12:47 -05:00
Tim Haines
8b7ca5ef50 capstone: added v4.0.2 (#18534)
This also adds the git branches "master" and "next".
2020-09-07 16:04:23 +02:00
darmac
5e86a131d2 jansi: added new package at v1.18 (#18549) 2020-09-07 16:02:18 +02:00
ketsubouchi
80691fa6d5 gconf: add dependencies (#18406)
* gconf: add dependencies

* gconf: add run type to perl-xml-parser
2020-09-06 22:33:42 -05:00
Andrew W Elble
8ad581e7b3 new package: py-textblob (#18516)
* new package: py-textblob

add variant to py-nltk to allow for data download/installation
add dependencies to py-nltk so that bin/nltk works

* add resources and resource generation script
2020-09-05 11:50:03 -05:00
Satish Balay
b494f50489 petsc4py: repo is migrated from bitbucket to gitlab (#18519) 2020-09-05 10:18:48 -05:00
Massimiliano Culpo
ba257914b3 fujitsu: added new package (#18021)
The package is at the moment not installable, just detectable.

Co-authored-by: Toyohisa Kameyama <kameyama@riken.jp>
2020-09-05 10:40:52 +02:00
Robert Blake
ea57171712 Make spack environment configurations writable from spack external and spack compiler find (#18165)
* spack config: default modification scope can be an environment

The previous model was that environments are the highest priority config
scope for config reading operations, but were not considered for config
writing operations. Now, the active environment is the highest priority
config scope for both reading and writing operations.

Now spack config add, spack external find and spack compiler set environment 
configuration in the environment by default if an environment is active. This is a
change in default behavior for these routines, but better matches the mental
model for an environment taking precedence over the user's default config file.

* add scope argument to 'spack external find' to choose non-default scope

* Increase testing for config modifications on environments

Co-authored-by: Gregory Becker <becker33@llnl.gov>
2020-09-05 01:12:26 -07:00
Ganesh Kumar
704fc475e3 ROCm3.5 miopen recipe (#18442)
* ROCm 3.5 miopen recipe

* fixing flake8 issues

* cmake variant fix

* min support fix

* variant possible values
2020-09-04 17:02:33 -05:00
Nick Booher
61ae21ee4d energyplus: add version 9.1 (#18485) 2020-09-04 16:13:53 -05:00
mic84
c5e50b73ef amrex:: new version 20.09 (#18486) 2020-09-04 16:13:20 -05:00
Tom Payerle
2cc7718edd bml: Add build dependency on python (#18489) (#18491)
At some point in the build phase a script
spack-src/scripts/convert-template
has a shebang looking for python in the path.

Currently this picks up system python if in invoker's path, but should
be using python from spack, so add a build dependency on python.
2020-09-04 16:12:53 -05:00
Michael Kuhn
518372ccd1 mariadb-c-client: Add 3.1.9 (#18501) 2020-09-04 16:03:43 -05:00
Michael Kuhn
ca2760381b py-setuptools: Add 50.1.0 and 49.6.0 (#18502) 2020-09-04 16:03:20 -05:00
Michael Kuhn
f836d93da1 rocksdb: Add 6.11.4 (#18503) 2020-09-04 16:01:30 -05:00
Michael Kuhn
f2adf531b3 glib: Add 2.64.5 (#18505) 2020-09-04 16:00:07 -05:00
Michael Kuhn
262edde2c1 libbson, mongo-c-driver: Add 1.17.0 (#18506) 2020-09-04 15:59:43 -05:00
Michael Kuhn
14983401a1 libidn2: Add 2.3.0 (#18507) 2020-09-04 15:59:16 -05:00
Michael Kuhn
972caba882 curl: Add 7.72.0 (#18508) 2020-09-04 15:58:52 -05:00
Michael Kuhn
53bf97298f gettext: Add 0.21 (#18509) 2020-09-04 15:58:30 -05:00
iarspider
0c03248537 Add madgraph 2.8.0; fix recipe (#18510) 2020-09-04 15:58:04 -05:00
Luke Dalessandro
8e41208c65 Build libtinfo.so "--with-versioned-syms" when it is enabled in ncurses. (#18511)
Many system-installed binaries (at least in Debian) are built against a
libtinfo.so that has versioned symbols. If spack builds a version without this
functionality, and it winds up in the user's LD_LIBRARY_PATH via spack load,
system binaries will begin to complain.

```
$ less log.txt
less: /opt/spack/.../libtinfo.so.6: no version information available (required by less)
```

Co-authored-by: Luke D'Alessandro <ldalessa@uw.edu>
2020-09-04 15:54:44 -05:00
Andrew W Elble
f4d3a1a0cb new package: py-markovify (#18517) 2020-09-04 15:46:38 -05:00
t-nojiri
f3c4747318 fermikit: added support for aarch64 (#18480) 2020-09-04 19:52:10 +02:00
Scott Wittenburg
597b43e30a Rely on E4S project variable for SPACK_REPO 2020-09-04 11:18:56 -06:00
Adam J. Stewart
6fcec1dcff Python: default to Python 3.8 (#17798) 2020-09-03 16:15:43 -07:00
Adam J. Stewart
0eca977cc9 py-pillow-simd: fix concretization (#18490) 2020-09-03 15:36:20 -07:00
Adam J. Stewart
7d9f2bf4ed depends_on cannot handle ^ sigil (#18220)
* depends_on cannot handle ^ sigil

* cardioid+mfem+cuda requires hypre+cuda

* Document this limitation

* Move warning message to Known Issues docs

* Better handling of parmetis dep
2020-09-03 17:31:00 -05:00
Adam J. Stewart
7728b0737b Add new MavenPackage build system base class (#18185)
* Add new MavenPackage build system base class

* Fix flake8 and doc tests

* More specific regex

* Java 8 required for these packages
2020-09-03 17:30:39 -05:00
Massimiliano Culpo
fab2622a71 Hashing: force hash consistency for values read from config (#18446)
The 'external_modules' attribute on a Spec, when read from a YAML
configuration file, may contain extra formatting that is lost when
that Spec is written-to/read-from JSON format. This was resulting in
a hashing instability (when the Spec was read back, it would report a
different hash). This commit adds a function which removes the extra
formatting from 'external_modules' as it is passed to the Spec in
__init__ to ensure a consistent hash.
2020-09-03 10:49:36 -07:00
Adam J. Stewart
741bb9bafe install/install_tree: glob support (#18376)
* install/install_tree: glob support

* Add unit tests

* Update existing packages

* Raise error if glob finds no files, document function raises
2020-09-03 10:47:19 -07:00
Adam J. Stewart
098beee295 Pillow-SIMD: use as default PIL provider (#18097)
* Pillow-SIMD: use as default PIL provider

* Fix concretization of pil

* Fix build of older versions of pillow
2020-09-03 10:39:35 -07:00
Michael Kuhn
8c264a9f26 freetype: Add custom headers property (#18440)
freetype's headers are installed in the `freetype2` subdirectory, use a
custom headers property to fix this in dependent packages.
2020-09-03 11:32:41 -05:00
Tamara Dahlgren
84381fbc80 Bugfix: terminate if a spack.yaml include path does not exist (#18074) 2020-09-03 14:37:24 +02:00
t-nojiri
3a562c0cec fermi: added patch to support aarch64 (#18479) 2020-09-03 10:06:15 +02:00
psakievich
7c0b356e79 Add test tolerance variant to nalu-wind pkg (#18455)
* Add test tolerance variant to nalu-wind pkg

* flake8 fixes
2020-09-02 20:31:58 -05:00
Gabriel Rockefeller
ced0a8f068 eospac: add version 6.4.1 (#18476) 2020-09-02 20:28:48 -05:00
Andrew W Elble
879bb063f7 new package: corenlp (#18467)
* new package: corenlp

* import os, not os.path
2020-09-02 20:27:56 -05:00
Robert Blake
638f44d842 CUDA_HOME needs to be set, or CUDA is found based on user's PATH. (#18475) 2020-09-02 19:52:00 -05:00
Harmen Stoppels
6d87cbdb52 Add rocm 3.7.0 libs (#18366)
* Add rocm 3.7.0 libs

* Make 3.7.0-only dependency on numactl explicit

* Add rocm-device-libs dep to rocm-clang-ocl

* Update the cmakelists dir in rocm-debug-agent

* Make rocm-debug-agent work on 3.7.0

* Disable tensile host; following rocm-arch recommendations
2020-09-02 19:50:05 -05:00
Adam J. Stewart
443407cda5 Add new RubyPackage build system base class (#18199)
* Add new RubyPackage build system base class

* Ruby: add spack external find support

* Add build tests for RubyPackage
2020-09-02 16:26:36 -07:00
Chris White
e22a0ca5cf mfem: fix transitive hdf5 static libs (#18457) 2020-09-02 15:06:45 -07:00
Adam J. Stewart
e58db067c3 PythonPackage: update documentation (#18181) 2020-09-02 15:05:10 -07:00
Adam J. Stewart
8eb375bf81 spack test: no gpg signing for git commits (#18454) 2020-09-02 14:48:48 -07:00
Adam J. Stewart
6b20687b26 python: switch to +uuid by default (#18252) 2020-09-02 14:46:58 -07:00
Adam J. Stewart
1992fdf712 Document test dependency type (#18365) 2020-09-02 13:46:52 -07:00
Jose E. Roman
601f97d8a5 New patch release SLEPc 3.13.4 (#18466) 2020-09-02 15:24:07 -05:00
Rémi Lacroix
447ea50bf9 Aria2: Add version 1.35.0. (#18465) 2020-09-02 15:23:40 -05:00
Adam J. Stewart
ba317b01c3 Packages: headers should be lists (#18445) 2020-09-02 13:19:22 -07:00
Nick Booher
0116c44714 ldak: new package at 5.1 (#18431)
* ldak: new package at 5.1

* flake8

* Re-run tests

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-02 11:29:40 -05:00
Auriane R
24843844ae Add HPX 1.5.0 release and update the homepage (#18464) 2020-09-02 10:34:05 -05:00
Sinan
e5fdb2a46c new package: py-python-fmask (#18382)
* new package: py-python-fmask

* flake8

* add missing dependency

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@earth>
2020-09-02 10:29:02 -05:00
Sinan
712e7bd2bf new package: py-gitdb (#18386)
* new package: py-gitdb

* Update var/spack/repos/builtin/packages/py-gitdb/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-02 10:28:33 -05:00
Jordan Moxon
aa8e046073 Add blaze versions 3.6-3.8 (#18424) 2020-09-02 10:15:38 -05:00
darmac
f268f65733 Add new package: rasdaemon (#18434) 2020-09-02 10:08:34 -05:00
Gvozden Neskovic
1e05321c8f gromacs: add support for opencl build (#18461)
Co-authored-by: Gvozden Nešković <neskovic@dev06.compeng.uni-frankfurt.de>
2020-09-02 09:07:43 -06:00
Seth R. Johnson
d904c57d2b Flibcpp: update version (#18448)
Update available versions and add Fortran check
2020-09-02 11:07:10 -04:00
Paul
1b78a010d8 Updated Hugo package. (#18443)
* Set GOPATH in build environment to avoid creating files in the user's
default GOPATH (e.g. ~/go).
* Support for external find.
* Added latest release 0.74.3.
2020-09-02 10:03:56 -05:00
ketsubouchi
1026ace6b6 r-boot: checksum mismatch @1.3-23 (#18458) 2020-09-02 10:00:12 -05:00
Weston Ortiz
bf2ded5269 Add gdb TUI variant (#18459) 2020-09-02 09:59:19 -05:00
Mark Olesen
5c94827201 scotch: update to 6.0.10 (released 31-AUG-2020) (#18462)
- added gitlab location, updated the homepage location

Co-authored-by: Mark Olesen <Mark.Olesen@esi-group.com>
2020-09-02 09:57:48 -05:00
Seth R. Johnson
9e51b8d165 New package: ForTrilinos (#18456)
Remove prior built-in Trilinos subrepository.

Added a Trilinos conflict discovered while documenting ForTrilinos:
```
   ***
   *** ERROR: Setting Trilinos_ENABLE_SEACASExodus=OFF which was 'ON' because SEACASExodus has a required library dependence on disabled TPL Netcdf!
   ***
```
2020-09-02 08:14:20 -04:00
Rui Xue
d9b945f663 Mac OS: support Python >= 3.8 by using fork-based multiprocessing (#18124)
As detailed in https://bugs.python.org/issue33725, starting new
processes with 'fork' on Mac OS is not guaranteed to work in general.
As of Python 3.8 the default process spawning mechanism was changed
to avoid this issue.

Spack depends on the fork-based method to preserve file descriptors
transparently, to preserve global state, and to avoid pickling some
objects. An effort is underway to remove dependence on fork-based
process spawning (see #18205). In the meantime, this allows Spack to
run with Python 3.8 on Mac OS by explicitly choosing to use 'fork'.

Co-authored-by: Peter Josef Scheibel <scheibel1@llnl.gov>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2020-09-02 00:15:39 -07:00
Axel Huebl
0740a4ac7e ADIOS: Fix no-MPI Build (#18453)
Do not apply this patch in no-MPI builds. I think this autotools
check logic is generally borked; this will just set it manually to
on/off now.
2020-09-01 18:17:51 -07:00
Adam J. Stewart
17f7b23783 Deprecate spack setup (#18240) 2020-09-01 18:07:48 -07:00
t-nojiri
ae3f3887a6 ccs-qcd: Change compile option for aarch64 (#17516)
* ccs-qcd: Change compile option for aarch64
2020-09-01 18:03:47 -07:00
Patrick Gartung
ae44a8ff64 test/relocate.py: skip tests involving patchelf on macOS (#18451) 2020-09-01 14:49:05 -05:00
Rao Garimella
ea97b37f60 Jali: Fix bugs in CMake section (#18447)
Fix variant name and cmake variable.

Co-authored-by: Rao Garimella <rao@abyzou.lanl.gov>
2020-09-01 15:18:24 -04:00
Jen Herting
e7f1eeb7af Update dependencies: py-torch-geometric (#18265)
* [py-torch-geometric] depends on py-torch-sparse

* [py-torch-geometric] setting TORCH_CUDA_ARCH_LIST

* [py-torch-geometric] added the rest of the dependencies

* [py-torch-geometric] added cuda variant and added more build env vars

* [py-torch-geometric] added variant info for depenedencies

* [py-torch-geometric] flake8

* [py-torch-geometric] add variant description
2020-09-01 13:44:12 -05:00
darmac
f9a330ae99 Add new package: efivar (#18392) 2020-09-01 10:09:59 -05:00
Nikolay Simakov
12078382b6 Added HPC Challenge Benchmark (#18323)
* HPCC Benchmark: added HPC Challenge (HPCC) benchmark

* HPCC Benchmark: modified error message on lack of fftw2 interface in MKL

* hpcc: fixed styling add one more installation example

* hpcc: styling fix

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* hpcc: changed include and lib location setter

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* hpcc: fixed styling add one more installation example

* hpcc: removed readme.md

* hpcc: develop repo now is in github

* hpcc: march arguments are set explicitly in case of intel compilers, added -restrict flag, which is needed for older intel compilers (at least <=19.0.5.281)

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-01 09:59:06 -05:00
ketsubouchi
3a5746c6c7 orbit2: new package at v2.14.19 (#18405) 2020-09-01 13:11:55 +02:00
darmac
3701633937 fuse-overlayfs: added new package at v1.1.2 (#18435) 2020-09-01 08:28:27 +02:00
Joseph Wang
ebeb8fb8df ocaml: allow v4.08 and v4.09 to build with gcc10 (#18254)
fixes #18228.  

This patch doesn't cover all old versions but it allows packages like whizard to build.
2020-09-01 05:53:13 +02:00
Sinan
1c67a304c8 py-tpot: added new package at v0.11.5 (#18385)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@earth>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-09-01 05:51:23 +02:00
Xavier Delaruelle
efff025787 environment-modules: added v4.5.3 (#18425) 2020-09-01 05:46:13 +02:00
Brian Van Essen
28ef5c0e27 dihydrogen, hydrogen: dependency on CUB is conditional on CUDA version (#18427)
In CUDA 11, CUB is integrated into the CUDA library.
2020-09-01 05:42:21 +02:00
Tim Haines
7926f84022 elfutils: add support for debuginfod (#18227) 2020-09-01 05:35:18 +02:00
Jeffrey Salmond
9b6c2e80fe py-llvmlite: added v0.34 (#18432) 2020-09-01 05:33:27 +02:00
Toyohisa Kameyama
b2d2bb694e openfoam: delete print to screen and updated docstrings/comments (#17985) 2020-09-01 05:31:49 +02:00
darmac
49b47864e9 libpam: added new package at v1.0.9 (#18418) 2020-09-01 05:19:48 +02:00
MichaelLaufer
16e3e28cc8 new package: wrf (#18398)
* wrf: new package

* wrf: fix install dir

* wrf: ndown location

* Add more compiler and nesting options to wrf package

* Fix configure that didn't find pgf90, use tempfile and compile in parallel

* WRF v4.2 with parallel I/O support through pnetcdf

Signed-off-by: michael laufer <michael.laufer@toganetworks.com>

* extend Package, compiler wrapper now used, small fixes

Signed-off-by: michael laufer <michael.laufer@toganetworks.com>

* Update var/spack/repos/builtin/packages/wrf/package.py

fixed typo

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Levi Baber <baberlevi@gmail.com>
Co-authored-by: eXact lab <info@exact-lab.it>
Co-authored-by: michael laufer <michael.laufer@toganetworks.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-31 20:38:36 -05:00
Howard Pritchard
712d80955b OPENMPI: add 4.0.5 (#18332)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2020-08-31 20:38:06 -05:00
Julien Loiseau
2d718b56ca Update package.py (#18426)
Correct boost version to match flecsi
2020-08-31 17:09:39 -06:00
Sinan
e7447f266a new package: py-update-checker (#18396)
* new package: py-update-checker

* add test deps

* Update var/spack/repos/builtin/packages/py-update-checker/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-update-checker/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-update-checker/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove lint stuff.

Co-authored-by: Sinan81 <Sinan81@earth>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-31 15:50:21 -05:00
Jen Herting
bc93632c06 New package: py-torch-sparse (#18270)
* [py-torch-sparse] created template

* [py-torch-sparse] added dependencies

* [py-torch-sparse] extends py-torch-scatter

* [py-torch-sparse] added variant cuda and setting env vars

* [py-torch-sparse] added homepage and description. removed fixmes

* [py-torch-sparse] flake8

* [py-torch-sparse] added variant description

* [py-torch-sparse] extends -> depends on

* [py-torch-sparse] added dependencies of py-setuptools and py-pytest-runner
2020-08-31 15:43:30 -05:00
Harmen Stoppels
7c9fe7bcbd Add more cmake patch versions (#18422) 2020-08-31 10:52:18 -05:00
ketsubouchi
fc251e62d1 new package: dbus-glib (#18400) 2020-08-31 10:52:01 -05:00
darmac
3287050226 Add new package: dosfstools (#18410) 2020-08-31 10:49:50 -05:00
darmac
0d08071e95 Add new package: tesseract (#18411) 2020-08-31 10:49:05 -05:00
darmac
b2e1036316 Add new package: fipscheck (#18412) 2020-08-31 10:48:34 -05:00
darmac
1d54eb3fee Add new package: libfuse (#18413) 2020-08-31 10:48:10 -05:00
darmac
7ba20239fb Add new package: hardlink (#18414) 2020-08-31 10:47:46 -05:00
darmac
b8bf34b348 Add new package: libhbaapi (#18415) 2020-08-31 10:47:14 -05:00
darmac
ecc7f19177 Add new package: jimtcl (#18416) 2020-08-31 10:46:48 -05:00
Xavier Delaruelle
7036f41ea5 environment-modules: fix version 4.5.2 install (#18421)
`configure` script of Modules 4.5.2 is a bit too strict and breaks when
special options like `--disable-dependency-tracking` are set. This issue
will be fixed on Modules project starting version 4.5.3
(cea-hpc/modules#354).

This change adapts `configure` options set when installing version 4.5.2
to avoid options unrecognized on this version.

Fix #18420
2020-08-31 10:43:31 -05:00
Toyohisa Kameyama
97f7378097 neovim: build on aarch64 (#18136)
* libvterm: renumber version and add 1.0.3
neovim: build on aarch64

* Remove unneeded comment.

* libvterm: newer bazaar snapshot version is set to version 0.0.
neovim: change for libvterm version change, and libtermkey version bug is fixed.

* update libvterm versions.
2020-08-31 10:31:41 -05:00
darmac
c4e966f162 Add new package: byte-unixbench (#18257)
* Add new package: byte-unixbench

* refine install flow

* Update var/spack/repos/builtin/packages/byte-unixbench/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-31 10:28:53 -05:00
t-nojiri
b0a1a7e9aa cp2k: Add depend on libxc@4.3.4. (#18346)
* cp2k: Add depend on libxc@4.3.4.

* cp2k: The fix of depend of libxc@4.3.4 was redone.
2020-08-31 10:27:23 -05:00
Jen Herting
122a4719ca New packages: py-torch-cluster (#18262)
* [py-torch-cluster] created template

* [py-torch-cluster] added dependencies

* [py-torch-cluster] setting TORCH_CUDA_ARCH_LIST

* [py-torch-cluster] limited extends

* [py-torch-cluster] depends on scipy

* [py-torch-cluster] added variant cuda and set env vars

* [py-torch-cluster] added homepage and description. removed fixmes

* [py-torch-cluster] flake8

* [py-torch-cluster] switched to depends_on from extends.

* [py-torch-cluster] added variant description

* [py-torch-cluster] added py-setuptools and py-pytest-runner as dependencies
2020-08-31 10:11:53 -05:00
Jen Herting
f659927938 New package: py-torch-scatter (#18263)
* [py-torch-scatter] created template

* [py-torch-scatter] listed specific version of python

* [py-torch-scatter] extends py-torch

* [py-torch-scatter] setting TORCH_CUDA_ARCH_LIST

* [py-torch-scatter] setting more environment variables and added variant cuda

* [py-torch-scatter] added homepage and description. removed fixmes

* [py-torch-scatter] flake8

* [py-torch-scatter] Added variant description

* [py-torch-scatter] extends -> depends_on

* [py-torch-scatter] added dependencies of setup tools and pytest-runner
2020-08-31 10:11:00 -05:00
Jen Herting
47ef8d9deb New package: py-torch-spline-conv (#18273)
* [py-torch-spline-conv] created template

* [py-torch-spline-conv] specified version of python

* [py-torch-spline-conv] extends py-torch-cluster

* [py-torch-spline-conv] setting TORCH_CUDA_ARCH_LIST

* [py-torch-spline-conv] limiting extension to py-torch

* [py-torch-spline-conv] added cuda variant and setting env vars

* [py-torch-spline-conv] added homepage and description. removed fixmes

* [py-torch-spline-conv] added variant cuda

* [py-torch-spline-conv] flake8

* [py-torch-spline-conv] added variant description

* [py-torch-spline-conv] extends -> depends_on

* [py-torch-spline-conv] added dependencies py-setuptools and py-pytest-runner
2020-08-31 10:10:21 -05:00
Adam J. Stewart
f4a37b2dc2 Remove unmatched triple quotes (#18272) 2020-08-31 13:05:07 +02:00
Weston Ortiz
86eac24f36 trilinos: added v13.0.0 and PYTHONPATH for exodus.py (#18266) 2020-08-31 12:56:37 +02:00
Dr. Christian Tacke
18fde34d38 singularity: added v3.6.2 (#18353) 2020-08-31 12:21:08 +02:00
Dr. Christian Tacke
c07204d661 glew: added v2.1.0 (#18394) 2020-08-31 11:48:46 +02:00
srekolam
d73db33005 atmi,rocgdb,rocm-dbgapi changes for rocm3.7 (#18404) 2020-08-31 11:36:38 +02:00
ketsubouchi
06ad858be2 libidl: added new package at v0.8.14 (#18403) 2020-08-31 11:34:11 +02:00
Sinan
3a0d273dae py-py6s: added new package at v1.8.0 (#18407)
Co-authored-by: Sinan81 <Sinan81@earth>
2020-08-31 11:30:22 +02:00
h-denpo
55f490b093 lesstif: added dependency on 'libxext' (#18408) 2020-08-31 11:12:48 +02:00
darmac
1fdaffed3f minizip: support minizip and miniunz building (#17925)
* minizip: support minizip and miniunz building

* minizip: remove comment

* refine build flow
2020-08-30 22:22:48 -05:00
Toyohisa Kameyama
7f9018e893 lua-luajit: remove duplicated lua-jit and merge to lua-luajit. (#18348) 2020-08-30 19:16:53 -05:00
Sinan
665e5ce0fd new package: py-planet (#18377)
* new package: py-planet

* flake8

* specify checksum type

* improve dependency specs

* specify dependency

* add test dependencies

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@earth>
2020-08-30 10:45:17 -05:00
Sinan
06b551f98e new package: py-deap (#18378)
* new package: py-deap

* flake8

* fix sha sum

* add missing dependency

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@earth>
2020-08-30 10:44:43 -05:00
Sinan
855e77036e new package: py-pypeg2 (#18379)
* new package: py-pypeg2

* add missing dependency

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@earth>
2020-08-30 10:44:21 -05:00
Sinan
a43dc51551 new package: py-pysolar (#18380)
* new package: py-pysolar

* latest version depends on python@3, flake8

* add missing dependencies

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@earth>
2020-08-30 10:43:54 -05:00
Sinan
376d6119ce new package: py-rios (#18381)
* new package: py-rios

* flake8

* Update var/spack/repos/builtin/packages/py-rios/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-30 10:43:26 -05:00
Sinan
fc6cf29f0f new package: py-ssmap (#18383)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2020-08-29 15:50:59 -05:00
Sinan
884e707810 new package: py-stopit (#18384)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2020-08-29 15:50:07 -05:00
vvolkl
a475ef20dd [heppdt] fix broken url (#18388) 2020-08-29 15:43:27 -05:00
darmac
89df727e5f Add new package: termcap (#18389) 2020-08-29 15:42:51 -05:00
darmac
e9ea8d1e81 Add new package: wrk (#18390) 2020-08-29 15:42:19 -05:00
darmac
66e006608a Add new package: cgdcbxd (#18391) 2020-08-29 15:41:52 -05:00
Wouter Deconinck
49df20f1ef [libdrm] AutotoolsPackage; %gcc@10.0.0 requires CFLAGS=-fcommon (#18393)
* [libdrm] AutotoolsPackage; %gcc@10.0.0 requires CFLAGS=-fcommon

* [libdrm] placate flake8
2020-08-29 15:39:02 -05:00
darmac
586fbe05b5 Bcache (#18103)
* bcache:add pkg-config to find blkid.h in linux-utils

* bcache: fix libuuid race condition in pkgconfig
2020-08-29 11:54:46 -05:00
darmac
b4042f23d0 Add new package: re2 (#18302)
* Add new package: re2

* re2: refine versions
2020-08-29 11:53:29 -05:00
darmac
cf170b5ff1 Add new package: varnish-cache (#18258)
* Add new package: varnish-cache

* Add dependency: python
2020-08-29 11:52:09 -05:00
darmac
8b212c7845 Add new package: leptonica (#18306)
* Add new package: leptonica

* Update var/spack/repos/builtin/packages/leptonica/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-29 11:51:41 -05:00
Christoph Junghans
2204ba18b5 flexiblas: initial add (#18364)
* flexiblas: initial add

* Update package.py
2020-08-29 09:48:48 -06:00
darmac
e58c7af1a3 Add new package: weighttp (#18255) 2020-08-28 23:12:56 -05:00
darmac
e19f971b7b Add new package: librdkafka (#18256) 2020-08-28 23:12:32 -05:00
srekolam
6739908144 mathlibs- rocrand recipe for amd rocm-3.5.0 release (#18283)
* mathlibs- rocrand recipe for amd rocm-3.5.0 release

* fixing review comments
2020-08-28 23:11:46 -05:00
Andrew W Elble
4b649b2d2b py-spacy: new version 2.3.2 (#18294)
* py-spacy: new version 2.3.2

update en-core-web-sm to @2.3.1
add en-vectors-web-lg@2.3.0

* update deps

* wasabi

Co-authored-by: Andrew Elble <aweits@localhost.localdomain>
2020-08-28 23:11:13 -05:00
Andrew W Elble
d08d0f2732 new package: qcachegrind@20.08.0 (#18295)
Co-authored-by: Andrew Elble <aweits@localhost.localdomain>
2020-08-28 23:10:58 -05:00
darmac
654f52bc08 Add new package: rinetd (#18303) 2020-08-28 23:06:04 -05:00
darmac
a206d2aebd Add new package: rsyslog (#18304) 2020-08-28 23:05:30 -05:00
darmac
50d7467715 Add new package: tengine (#18305) 2020-08-28 23:04:48 -05:00
darmac
e09d906a35 Add new package: fastdb (#18308) 2020-08-28 23:03:30 -05:00
vvolkl
62128f1351 [whizard] update ocaml dependency (#18309)
2020-08-28 23:02:45 -05:00
darmac
2c155d4fe2 Add new package: sysbench (#18310) 2020-08-28 23:02:11 -05:00
darmac
d5da8e7543 Add new package: foundationdb (#18312) 2020-08-28 22:59:28 -05:00
darmac
eaf843c3e8 Add new package: pflask (#18313) 2020-08-28 22:58:54 -05:00
darmac
a1230d1910 Add new package: shc (#18314) 2020-08-28 22:58:29 -05:00
darmac
b799b983bb Add new package: vsftpd (#18317) 2020-08-28 22:55:44 -05:00
darmac
0eeed1f7f7 Add new package: iniparser (#18318) 2020-08-28 22:53:57 -05:00
darmac
c85dc3a5b4 Add new package: fcgi (#18320) 2020-08-28 22:51:03 -05:00
darmac
6113be0919 Add new package: faust (#18321) 2020-08-28 22:50:05 -05:00
Kelly (KT) Thompson
48bfffd32c [new version] Draco package (#18336)
* Update version for package Draco

+ Add support for `draco-7.7.0`.
  + Introduces new `+cuda` variant.  This variant is only allowed in version
    `7.7.0:`.
  + Restrict `random123` to compatible versions.
  + Restrict `libquo` to compatible versions.
  + Moving forward, require `python@3:`
  + Moving forward, the `+superlu_dist` variant is no longer supported.
+ Improve printed output for `--test` mode by adding `ctest` option
  `--output-on-failure`
+ Provide a patch to support for IBM Spectrum-MPI in version `7.7.0:`
+ Provide a patch to allow variant `~cuda` to actually disable GPU portions of
  the code when a GPU is discovered on the local system.

* Remove unnecessary function decoration.
2020-08-28 22:42:15 -05:00
Chris Richardson
25021ec228 Add setuptools as a direct dependency (#18324) 2020-08-28 16:24:43 -05:00
Gabriel Rockefeller
8a408a6571 eospac: add versions 6.4.1alpha.2 and 6.4.1beta (#18329) 2020-08-28 16:23:30 -05:00
Rémi Lacroix
b27cd9d75d NetCDF-Fortran: Add version 4.5.3. (#18350) 2020-08-28 16:22:42 -05:00
Robert Blake
eb8ff0bc81 Adding externals for bison and flex (#18358)
* Adding externals for bison and flex

Added because bison actually pulls in a ton of stuff.

* Need to escape parentheses.

* Need to add re package.

* Adding re package.
2020-08-28 16:22:28 -05:00
Robert Blake
aca370a3a2 External recognition for find. (#18360)
* External recognition for find.

* Adding re package.
2020-08-28 16:22:05 -05:00
Robert Blake
c9fd2983dc texinfo: Adding external support (#18362)
* texinfo: Adding external support for texinfo.

* Adding re package.
2020-08-28 16:21:51 -05:00
Dr. Christian Tacke
65fda72d7c davix: Improve shared library building on macOS (#18352)
Add -DCMAKE_MACOSX_RPATH=ON to cmake.
2020-08-28 16:13:53 -05:00
Keita Iwabuchi
9525c530d5 Add Metall v0.3 (#18340)
* Metall: add version 0.2

* Add Metall v0.3
2020-08-28 14:42:02 -05:00
Jonathan R. Madsen
0bf696a29d Timemory: Fix python dependencies + NCCL (#18342)
* Fix python dependencies + NCCL

* Removed trailing whitespace
2020-08-28 14:40:34 -05:00
Dr. Christian Tacke
6bb2dd40b6 fairlogger: Update cmake options and version (#18354)
* Add version 1.7.0 and 1.8.0
* Better support for boost < 1.70
* No color in output
2020-08-28 14:31:09 -05:00
Nikolay Simakov
9b654fe60c namd: added patching charmrun location, as it stored in prefix.bin (#18355) 2020-08-28 14:29:53 -05:00
Robert Blake
6ceb3d4be0 spectrum-mpi: external support, compiler detection (#18055)
* spectrum-mpi: adding external support.

* Package is tested, works on LLNL lassen

* Spectrum external now detects the correct compiler

* Changing code to not output all compilers

Done per becker33's request on #18055
2020-08-28 11:53:27 -07:00
eugeneswalker
9befc43708 binutils: add build dep: diffutils (provides cmp) (#18361) 2020-08-28 11:31:43 -07:00
Rémi Lacroix
b006123331 Add new package: AutoDock-GPU. (#17808)
The project currently only has a develop branch and no versioned releases so only provide a git-based "develop" version.
2020-08-28 11:02:52 -05:00
ketsubouchi
1d0650b2cb looptools: skip UNDERSCORE check and add -Fwide (#18135) 2020-08-28 10:03:38 -05:00
ketsubouchi
abffcefadd genometools: use signed char for %fj (#18126)
* genometools: use signed char for %fj

* genometools: update patch to use int

* use int
2020-08-28 10:02:42 -05:00
ketsubouchi
7dd58c7ed0 ghostscript: patched sources to allow building with Fujitsu compilers (#18345) 2020-08-28 11:52:11 +02:00
ajw1980
1251919318 Add type=link for libxml2 and libxslt for py-lxml (#18293) and add version 4.5.2 (#18325)
Co-authored-by: Andy Wettstein <andy.wettstein@xrtrading.com>
2020-08-27 15:03:23 -05:00
Greg Becker
d10dff1b89 docs: add main version to docs for develop-like versions (#18328) 2020-08-27 12:35:11 -07:00
Hector
9f5d0c0007 Fix Zoltan installation with gcc 10.2.0 (#18301) 2020-08-27 20:57:53 +02:00
Andrew W Elble
77a28b81ac thrift: add missing py-six dependency (#18285)
* thrift: add missing py-six dependency

* add more missing dependencies (+extras)

* small fix

* gssapi variant / setuptools run dep

Co-authored-by: Andrew Elble <aweits@localhost.localdomain>
2020-08-27 13:24:42 -05:00
psakievich
ccae9cff3a Add nalu-wind test structure (#13640)
- Add custom function to check out submodules if testing is enabled
2020-08-27 12:09:56 -05:00
Chen Wang
680c1b339a New version: Recorder 2.1.5 (#18327) 2020-08-27 11:55:04 -05:00
darmac
a81106757d Add new package: libmnl (#18316) 2020-08-27 16:05:21 +02:00
Itaru Kitayama
1746799087 CUDA Toolkit: added v11.0.2 for Arm (#17845)
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2020-08-27 14:56:18 +02:00
Chen Wang
9fc6bdabef Add a new package: Recorder (#18297)
* Add a new package: Recorder

* Delete all FIXMEs

* Update package.py

* Remove extra spaces.
2020-08-27 07:47:21 -05:00
Simon Pintarelli
22329c4f92 sirius, q-e-sirius (#18286)
* sirius: fix bug in shared spec

make +shared the default

* q-e-sirius: depend on sirius+shared, fix gcc@10

- add missing whitespace in -fallow-argument-mismatch.
- require sirius+shared
2020-08-27 07:45:55 -05:00
Adam J. Stewart
7fd8f74c23 SLOCCount: add new package (#18271) 2020-08-27 14:39:46 +02:00
Seth R. Johnson
ba470137c8 nlohmann-json: added v3.9.1 and v3.8.0 (#18287)
Also use 'define' helper function.
2020-08-27 14:31:00 +02:00
Axel Huebl
a6c0b7ab3a CMake: Update GCC on macOS Conflict message (#18253)
* CMake 3.18.0+: Builds with GCC on macOS

The latest release of CMake updates libuv, which fixes ObjC code
usage on macOS. Passing ObjC code to non apple-clang compilers
crashed the build before.

Refs.:
- https://gitlab.kitware.com/cmake/cmake/-/issues/20620
- https://gitlab.kitware.com/cmake/cmake/-/merge_requests/4687

* CMake: Further issues GCC+macOS

There are further issues to fix before this will work.
https://gitlab.kitware.com/cmake/cmake/-/issues/21135
2020-08-27 08:51:00 +02:00
Richarda Butler
416afa0059 docs: fix bugs in contribution, getting started guides (#18216)
Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-27 08:23:26 +02:00
darmac
5d3ad70e0a Add new package: sysget (#18259) 2020-08-26 21:11:40 -05:00
Weston Ortiz
1a17921fa5 Add new package: sparse (#18264)
* Add new package: sparse

* flake8 fixes
2020-08-26 21:02:16 -05:00
Tom Payerle
273a158f1b dbus: Add libsm dependency (fix for issue #18267) (#18268)
dbus has a dependency on libSM which was not being declared (and would
normally be satisfied by using system libraries).
2020-08-26 20:57:54 -05:00
Tim Haines
1536691bc9 caliper: Remove support for deprecated Dyninst versions (#18275) 2020-08-26 20:51:00 -05:00
darmac
080625c3af Jpegoptim (#18279)
* Add new package: jpegoptim

* refine homepage
2020-08-26 20:49:34 -05:00
arjun-raj-kuppala
24f2850cab update to rocm hip,hsa-rocr-dev,rocm-dbgapi (#18280)
* update to rocm hip,hsa-rocr-dev,rocm-dbgapi

* Update var/spack/repos/builtin/packages/rocm-dbgapi/package.py

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2020-08-26 20:47:55 -05:00
darmac
b93e65065a Add new package: libestr (#18281) 2020-08-26 20:45:51 -05:00
darmac
369e691ec2 Add new package: libfastjson (#18282) 2020-08-26 20:45:27 -05:00
Satish Balay
67e72bf1f5 sowing: update to version 1.1.26-p1 (#18298) 2020-08-26 20:17:05 -05:00
Seth R. Johnson
81bb372d75 Trilinos: Require parallel HDF5 when +hdf5+mpi (#17606)
EpetraExt requires parallel IO. It does after all make sense to require
the parallel version of HDF5 when building a parallel solver library.
2020-08-26 14:54:05 -05:00
Seth R. Johnson
26bab9d4d6 Fix build when +stratimikos +xpetra (#17971)
If Thyra isn't explicitly enabled at the package level, trilinos fails
to build.

```
/var/folders/gy/mrg1ffts2h945qj9k29s1l1dvvmbqb/T/s3j/spack-stage/spack-stage-trilinos-12.18.1-vfmemkls4ncta6qoptm5s7bcmrxnjhnd/spack-src/packages/muelu/adapters/stratimikos/Thyra_XpetraLinearOp_def.hpp:167:15: error:
      no member named 'ThyraUtils' in namespace 'Xpetra'
      Xpetra::ThyraUtils<Scalar,LocalOrdinal,GlobalOrdinal,Node>::toXpetra(rcpFromRef(X_in), comm);
      ~~~~~~~~^
```
2020-08-26 14:51:57 -05:00
Massimiliano Culpo
96ac5add9d release procedure: add step to activate the documentation on readthedocs (#18288) 2020-08-26 12:47:47 -07:00
ketsubouchi
02dc84a2b3 conduit: added patch to build with Fujitsu compilers (#18284) 2020-08-26 18:55:09 +02:00
eugeneswalker
f7d156af05 binutils: add version 2.35 (#18291) 2020-08-26 11:54:33 -05:00
Adam J. Stewart
20d37afafa TensorFlow: added v2.3.0 (#17736)
* Workaround for Spack-installed proto support

Co-authored-by: Andrew W Elble <aweits@rit.edu>
2020-08-26 14:32:14 +02:00
Veselin Dobrev
ad437bff8f [GnuTLS] Add gnutls-specific headers property. (#7259) 2020-08-25 19:00:49 -05:00
Christoph Junghans
afa907f549 kokkos: add v3.2.00 (#18274) 2020-08-25 15:26:43 -05:00
darmac
878d0b793c Add new package: influxdb (#17909)
* Add new package: influxdb

* put usr/* in prefix
2020-08-25 07:45:30 -05:00
Axel Huebl
11a3ac25ac CCache: added v3.7.11 and support for external detection (#18246) 2020-08-25 14:30:16 +02:00
lpoirel
b562154423 petsc: new version add support for +mumps +int64 (#18170)
* petsc: new version add support for +mumps +int64

The latest petsc release allows using mumps while using 64 bit integers :
https://www.mcs.anl.gov/petsc/documentation/changes/313.html
https://gitlab.com/petsc/petsc/-/merge_requests/2591/

* petsc: use conflicts for mumps dependency
2020-08-25 14:07:59 +02:00
Adam J. Stewart
b305990ee8 pkgconfig: add conflict for PGI (#18236) 2020-08-25 11:33:36 +02:00
Tim Haines
9b07669ab3 libmicrohttpd: added new package at v0.9.71 (#18226) 2020-08-25 10:24:04 +02:00
Adam J. Stewart
df51cf20c5 libtiff: added v4.1.0 (#18247) 2020-08-25 10:00:56 +02:00
Adam J. Stewart
cb322300cb libgeotiff: added v1.6.0 (#18248) 2020-08-25 10:00:39 +02:00
Adam J. Stewart
5aef9f8f83 PROJ: added versions up to v7.1.0 (#18249) 2020-08-25 10:00:18 +02:00
Adam J. Stewart
c6736653cd py-shapely: added v1.7.1 (#18250) 2020-08-25 09:59:36 +02:00
Adam J. Stewart
d4ef804b15 py-cartopy: added v0.18.0 (#18251) 2020-08-25 09:59:12 +02:00
Tim Haines
eebcd6b24f Dyninst: added conflicts for unsupported compilers and platforms (#18245) 2020-08-25 09:56:27 +02:00
Joseph Wang
ac6d7dcdf6 freeglut: added v3.2.1 and patch to build with GCC 10 (#18229) 2020-08-25 09:53:27 +02:00
Elizabeth Fischer
c13bc308db py-basemap: Update for proj@6 (#16183)
* py-basemap

* Updated versions + URL attribute

* Update var/spack/repos/builtin/packages/py-basemap/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-basemap/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Removed unnecessary comment

* flake8

Co-authored-by: Elizabeth Fischer <elizabeth.fischer@alaska.edu>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-24 23:24:56 -05:00
Robert Blake
5e1909c00a mvapich2: Adding external find support. (#18177)
* Adding external support for mvapich2.

This picks up all the options that are currently settable by
the spack package. It also detects the compiler and sets it
appropriately.

* Removing debugging printing.

* Adding changes suggested by @nithintsk
2020-08-24 20:53:42 -05:00
Satish Balay
b885dbcd85 pflotran: requires hdf5+hl (#18224) 2020-08-24 19:13:47 -05:00
Paul
9e7029d806 Added Go 1.15 and find external package. (#18235) 2020-08-24 19:05:48 -05:00
Robert Underwood
6f2f02366a sz: version bump, cmake bug-fix, add maintainer (#18238)
+ Added version 2.1.9
+ Previously the SZ package incorrectly depended on CMake without a
  version dependency, but actually version 3.13 or newer is required
+ Added myself as a maintainer for the SZ spack package
2020-08-24 19:05:08 -05:00
Benjamin Tovar
d6ca3be63a Update to CCTools 7.1.7 (#18241)
This is a bug release with some new features and bug fixes. Among them:

[Batch] Set number of MPI processes for SLURM. (Ben Tovar)
[General] Use the right signature when overriding gettimeofday. (Tim
Shaffer)
[Resource Monitor] Add context-switch count to final summary. (Ben
Tovar)
[Resource Monitor] Fix kbps to Mbps typo in final summary. (Ben Tovar)
[WorkQueue] Update example apps to python3. (Douglas Thain)
2020-08-24 19:04:08 -05:00
Axel Huebl
a483b0b9dd CMake: 3.18.2 (#18244)
Adds the latest CMake release, which fixes a Spectrum-MPI + CUDA<10.2
issue, among others. Relevant for Summit.
2020-08-24 19:03:11 -05:00
Adam J. Stewart
5ef7d5e792 CUDA: add v10.1.243 for ppc64le (#18237)
Closes #18174 

@smarocchi @ax3l @svenevs 

The error message in #18174 isn't great, but I'm hoping that #18127 will improve that.
2020-08-24 14:50:10 -07:00
Daryl W. Grunau
f5102cd582 call the universal 'mpifc' instead of hardcode 'mpif90' (#18162)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2020-08-24 15:44:20 -05:00
Harmen Stoppels
17f7d9f44c openmpi: fix concretization (#18233) 2020-08-24 19:05:39 +02:00
Erik Schnetter
53cfca1f59 simulationio: Declare dependency on asdf-cxx (#18018)
* simulationio: Declare dependency on asdf-cxx

* Rename "develop" to "master"; add descriptions to variants

* Add maintainer
2020-08-24 11:25:43 -05:00
David Gardner
3c866294e1 add github url as backup (#18223) 2020-08-23 16:43:57 -05:00
t-nojiri
b721a2e54b soapdenovo-trans: build on aarch64 (#18187)
* samtools: Add version 0.1.8 for OSS soapdenovo-trans.

* Add depend on zlib and samtools to build on aarch64.

* soapdenovo-trans: Change the condition of depend on zlib and samtools.
2020-08-22 22:11:19 -05:00
David Gardner
a6ca236c26 add tcl 6.8.10 (#18221) 2020-08-22 22:02:57 -05:00
Sergey Kosukhin
b5e78bce06 OpenMPI: sanitize dependent build environment. (#15360) 2020-08-22 22:01:41 -05:00
Sergey Kosukhin
9fdb945383 NetCDF: fix constraints. (#16719) 2020-08-22 16:06:17 -05:00
Teodor Nikolov
dc5176fbae [hpx] Two new variants for upcoming v1.5, stable and master (#18022)
* New package: cxxopts

* Use +unicode instead of unicode=True

- Make the unicode option more explicit

* Add two new variants to spack for upcoming 1.5, stable and develop

* Add as maintainer

* Add depends_on on clauses

* Remove unrelated change
2020-08-22 16:00:11 -05:00
Carson Woods
c743d219aa qthreads: add additional configuration options (#15833)
* Add new config options for qthreads

* Fix syntax error and add option for specifying stack size

* Fix flake8 issues

* Add input validation for the stack size variant and add explicit enable for spawn cache

* Fix flake8 issues with input validation code
2020-08-22 15:58:41 -05:00
Jianwen
e26e532eb3 vifi: a new package. [WIP] (#14287)
* vifi: a new package.

* Fix url.

* Add dependencies.
2020-08-22 15:23:04 -05:00
noguchi-k
15facf2171 ibm-java: add conflicts for aarch64 (#13728) 2020-08-22 15:22:50 -05:00
Ruben Di Battista
f902202b5e Add scantailor package (#12833)
* Add scantailor

* qt: Fix build w/ GCC9 for @4:

* qt: Fix patch filename and qt version

* Fix typo in patch name

* scantailor: Fix version

* scantailor: Use a more-modern fork

* scantailor: Remove unused patch file

Co-authored-by: Ruben Di Battista <ruben.di-battista@polytechnique.edu>
2020-08-22 15:22:37 -05:00
Andreas Baumbach
27b9727926 Add new package arbor (#11914)
* Add new package arbor

Change-Id: I70a442d6ad74979d13bd9f599df238c085d4baf9

* Update package.py

* add additional dependencies

* change download to tar-ball

* py26 compatibility

* restrict noqas and expand comment

Co-authored-by: Eric Müller <mueller@kip.uni-heidelberg.de>
2020-08-22 14:42:46 -05:00
Brian Homerding
2e2c06121a llvm-openmp-ompt: Additional dependencies and adding variant (#11266)
* llvm-openmp-ompt: Additional dependencies and adding variant for disabling building libomptarget

* Flake8 fixes
2020-08-22 14:23:04 -05:00
Levi Baber
d65b9ad195 mark: added new package (#10513)
* mark: Create new package.

* mark: change description.

* mark: change description.

* mark: Delete set_up environment.

* mark: replace join_path(prefix.bin, mark) with prefix.bin.mark

* mark: new license and sha256 hash

Co-authored-by: lingnanyuan <1297162327@qq.com>
2020-08-22 14:01:00 -05:00
George Hartzell
b53b9fd17e Fix redundant reset of terminal in prompt example (#17698)
I know that it's just an example, but I was trying to figure out what was going on and it wasn't making sense....

`tput sgr0` resets the terminal state (http://linuxcommand.org/lc3_adv_tput.php) and I can't see any reason to do it twice.  Deleting the second occurrence doesn't seem to break the fancy prompt effect.
2020-08-22 13:27:39 -05:00
Elizabeth Fischer
8ce5718922 qgis: Updates for proj@6 (#16172)
* qgis

* Update package.py

QGIS 3.12.1 can use PROJ >= 4.9.3.  Therefore both version restrictions on PROJ were incorrect.
https://github.com/qgis/QGIS/blob/final-3_12_1/INSTALL

* Update package.py

Add explanation to (hopefully temporary) removal of hdf5 dependency.

* Remove overly restrictive GRASS version number.

* flake8

Co-authored-by: Elizabeth Fischer <elizabeth.fischer@alaska.edu>
2020-08-22 13:22:57 -05:00
Erik Schnetter
488d8ae747 openmpi: Update hwloc version bounds (#18040)
`openmpi @4:` can use `hwloc @2:`.
2020-08-22 13:10:11 -05:00
Greg Becker
3c6544fcbf fix mailmap for becker33 (#18215) 2020-08-22 12:46:48 -05:00
Christoph Junghans
adc98ae409 votca*: add v1.6.2 (#18218) 2020-08-22 11:38:41 -06:00
darmac
74d274f02f Add new package: consul (#18044)
* Add new package: consul

* fix package type

* refine install phase
2020-08-22 10:51:20 -05:00
Richarda Butler
d599e4d9d4 Added the file path for running a specific test (#18214) 2020-08-21 17:42:24 -05:00
srekolam
febc8ccb15 new recipe for rocalution and fix for rocm-smi-lib (#18211) 2020-08-21 13:34:06 -05:00
Jen Herting
512aa40151 [py-numba] added version 0.50.1 (#17880)
* [py-numba] fixing upper limit for conditional dependency

* [py-numba] added version 0.50.1

* [py-numba] Specifying llvmlite version for new version
2020-08-21 12:45:57 -05:00
Jen Herting
c24b838407 [py-llvmlite] new version 0.33.0 (#17878)
* [py-llvmlite] new version 0.33.0

* [py-llvmlite] newer version of python required
2020-08-21 10:54:06 -05:00
g-mathias
d222551185 namd: latest version 2.14 (#18167)
* upgrade to version 2.14; added target architecture optimization

* renamed devel-> (adamjstuart); keep 2.14bx versions

Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2020-08-21 09:55:53 -05:00
Adam J. Stewart
07fc495c94 py-pandas: added v1.1.1 (#18206) 2020-08-21 08:49:32 +02:00
Ryan Mast
3a1334bc6a helics: Add version 2.6.0 (#18208)
* helics: Add version 2.6.0

* Remove bms version from HELICS package
2020-08-20 22:24:46 -05:00
Jordan Ogas
1666db60d0 charliecloud: add 0.18 (#18200)
* add 0.18, remove older versions

* update url

* fix typo
2020-08-20 20:16:34 -05:00
Michael Kuhn
fc3b4657ce sqlite: Add 3.33.0 (#18145) 2020-08-20 17:17:02 -05:00
g-mathias
880ff1d795 new dakota version 6.12; included boost versions limit; changed default url (#18168)
Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2020-08-20 17:14:40 -05:00
Michael Kuhn
11b1ce84cf docs: List tar and some compressors in prerequisites (#18169)
Fixes #18152
2020-08-20 17:13:47 -05:00
vvolkl
6b6f6db431 [gaudi] sadly, the cmake fixes are postponed to next version (#18171) 2020-08-20 17:11:45 -05:00
vvolkl
cd06100001 [podio] new version, cxxstd 17 now mandatory (#18191) 2020-08-20 17:06:52 -05:00
Harmen Stoppels
ebe5a5652c Add rocSPARSE, rocSOLVER, hipSPARSE and hipBLAS (#18108) 2020-08-20 14:55:02 -05:00
Greg Becker
ad9cd25285 allow external packages that violate conflicts (#18183) 2020-08-20 10:16:48 -07:00
Massimiliano Culpo
573ce3fe81 gcc: fixed compilation on OpenSUSE (#18190)
Set location for dependencies specifying explicitly both
the include and lib path. This permits handling cases where
the libraries are installed in lib64 instead of lib.

fixes #17556
fixes #10842
closes #18150
2020-08-20 17:42:18 +02:00
Massimiliano Culpo
1addcff724 Test "is_extension" after a round trip to dict (#18188)
closes #3887
closes #3853
2020-08-20 08:08:49 -07:00
Marc Mengel
d5c3b876e0 detach binutils usage from bootstrap in gcc variants (#18116) 2020-08-20 17:06:22 +02:00
vvolkl
286c3d6cbc [lcio] set up run time paths (#18109)
* [lcio] set up run time paths

* [lcio] update runtime

* [lcio] flake8
2020-08-20 09:43:59 -05:00
Greg Becker
aa4fc2ac44 codecov: set project threshold to 0.2% (#18184) 2020-08-20 09:43:24 -05:00
Isaac Whitfield
b9adbac5c6 libimobiledevice: new package (#10819) 2020-08-20 11:22:04 +02:00
ketsubouchi
8a02ef4d51 bwa: patch Makefile to permit the use of compilers other than GCC (#18189) 2020-08-20 09:55:05 +02:00
Tim Haines
c786eb46bb dyninst: use elfutils for all versions (#18063) 2020-08-20 08:36:02 +02:00
Elizabeth Fischer
306f3a1232 Make finding of NetCDF and HDF5 more explicit. (#18166)
Co-authored-by: Elizabeth Fischer <elizabeth.fischer@alaska.edu>
2020-08-20 08:34:53 +02:00
Greg Becker
ccf94ded67 Compilers: use Compiler._real_version for flag version checks (#18179)
Compilers can have strange versions, as the version is provided by the user.  We know the real version internally, (by querying the compiler) so expose it as a property and use it in places we don't trust the user.  Eventually we'll refactor this with compilers as dependencies, but this is the best fix we've got for now.

- [x] Make `real_version` a property and cache the version returned by the compiler
- [x] Use `real_version` to make C++ language level flags work
2020-08-19 21:56:06 -07:00
darmac
1650824ef5 Add new package: phoenix (#18143) 2020-08-19 20:33:37 -05:00
darmac
0187884591 qemu: add pkgconfig to avoid build error (#17929) 2020-08-19 20:32:47 -05:00
darmac
d9722e2730 nfs-ganesha: fix compile error on debian (#18102)
* nfs-ganesha: fix compile error on debian

* add type for py-stsci-distutils
2020-08-19 20:32:02 -05:00
darmac
bc2ff81a64 Add new package: presto (#18142) 2020-08-19 20:31:10 -05:00
darmac
472a32fb41 brigand: fix build error (#18173) 2020-08-19 20:30:33 -05:00
eugeneswalker
b5db5cf259 prepend ${NINJA_ROOT}/misc to PYTHONPATH in run environment (#18182) 2020-08-19 15:06:08 -07:00
Tamara Dahlgren
1c0a92662b Restore curl progress output (#18127)
Restores the fetching progress bar sans failure outputs; restores non-debug reporting of using fetch cache for installed packages; and adds a unit test.

* Add status bar check to test and fetch output when already installed
2020-08-19 12:10:18 -07:00
Filippo Spiga
8e8071ffb6 [WIP] Adding Quantum ESPRESSO v6.6 (#18091)
* Adding v6.6 (latest stable)

* There is no '-' in Quantum ESPRESSO

Co-authored-by: Filippo Spiga <fspiga@nvidia.com>
2020-08-19 11:10:54 -05:00
Michael Kuhn
b6321cdfa9 microarchitectures: Fix icelake (#18151)
Some of the feature flags are named differently and clwb is missing on
my i7-1065G7. cascadelake and cannonlake might have similar problems but
I do not have access to those architectures to test.
2020-08-19 11:49:43 +02:00
Pieter Ghysels
2945babdad STRUMPACK: added v4.0.0 (#18159)
- add cuda variant, enabled by default, but conflicting with
  strumpack@:3.9.999
- add zfp variant, enabled by default, but conflicting with
  strumpack@:3.9.999
- update minimum CMake version to 3.11
- for version 4.0.0:, do not use mpi wrappers. v4.0.0 uses CMake
  MPI targets
- for version 4.0.0, add dependency on butterflypack@1.2.0:
- remove versions 3.1.0 and older
- make parmetis variant True by default
- add TODO for slate variant (spack package not ready yet)
2020-08-19 10:15:02 +02:00
Kai Germaschewski
3cbd4c8dcf adios: relax libtool restriction (#18056)
While I believe there must have been a reason to restrict libtool to <=
2.4.2, adios compiles just fine with libtool 2.4.6 for me.

In fact, without this change, I'm getting this error:

libtool: Version mismatch error.  This is libtool 2.4.6, but the
libtool: definition of this LT_INIT comes from libtool 2.4.2.
libtool: You should recreate aclocal.m4 with macros from libtool 2.4.6

This doesn't make much sense, since spack did build libtool@2.4.2 as a
dependency, and was supposedly trying to use it. My guess is that on
this system (NERSC's cori) the system libtool in /usr/bin, which is
2.4.6 somehow got picked up partially.
2020-08-19 08:49:54 +02:00
Michael Kuhn
fd0d79ecc5 meson: added v0.55.1 (#18146) 2020-08-19 08:32:13 +02:00
Luke Dalessandro
76f3d84a1b libfabric: added v1.10.0, v1.10.1 and v1.11.0. (#18161) 2020-08-19 08:26:22 +02:00
Robert Pavel
bc34ab4701 lua: specified better the dependency on ncurses (needs +termlib) (#18163)
Semi-recently the lua spackage was updated to explicitly add libtinfow
to the lua build line. Ncurses provides this but only when the +termlib
variant is enabled
2020-08-19 08:23:56 +02:00
Massimiliano Culpo
bef560636d intel: added external detection capabilities (#17991) 2020-08-19 08:20:57 +02:00
ketsubouchi
71748a3b7a moab: delete -march=native from generated files (#18137)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-08-19 07:40:40 +02:00
Christoph Junghans
c7096eb537 ninja: add v1.10.1 (#18160) 2020-08-18 14:51:55 -06:00
Greg Becker
08dd826891 emacs: add 27.1 and conflict 26.3 on macos catalina (#18157)
* emacs: add version 27.1

* emacs: 26.3 does not work on macos catalina
2020-08-18 11:26:09 -07:00
Adam J. Stewart
00e9e1b3c7 Java: add spack external find support (#18006) 2020-08-18 20:17:08 +02:00
Massimiliano Culpo
1ed70d0e2c 'spack env update' can handle overrides (#18149)
fixes #18147

Before this commit the command erroneously reported
"Additional properties not allowed" for keys with a
double colon.
2020-08-18 17:05:25 +02:00
Michael Kuhn
319d160ed1 glib: added v2.64.4 (#18144) 2020-08-18 15:15:19 +02:00
Adam J. Stewart
17d96b615a py-accimage: add new package (#18122) 2020-08-18 12:10:18 +02:00
Toyohisa Kameyama
3210d6b64e unibilium: added v2.0.0 (#18132) 2020-08-18 12:03:06 +02:00
Pieter Ghysels
9b65bdeca8 ButterflyPACK: added v1.2.0 (#18133) 2020-08-18 12:02:16 +02:00
Hadrien G
d466e9224f acts: added v0.31 (#18138) 2020-08-18 11:55:45 +02:00
Harmen Stoppels
25d27a38de SpFFT: added v0.9.13 (#18139) 2020-08-18 11:53:27 +02:00
ketsubouchi
71b7b353d8 ocaml: fix building with Fujitsu compilers (#17918) 2020-08-18 11:49:09 +02:00
darmac
060731a824 ape: fix build error and update version (#17952)
* ape: fix build error and update version

* ape: fix 2.3.0 & 2.3.1

* ape: only refine libxc version constraint

* ape: fix flake8 error
2020-08-17 21:22:53 -05:00
Toyohisa Kameyama
264958fc18 lua-jit: New package. (#18099) 2020-08-17 21:17:08 -05:00
Toyohisa Kameyama
ad8418fbbf libluv: New package. (#18100) 2020-08-17 21:15:11 -05:00
Toyohisa Kameyama
b3535d909e libuc: Add version 1.38.1. (#18104) 2020-08-17 21:07:43 -05:00
Harmen Stoppels
f582ebbaa0 Add master and develop for spfft (#18105) 2020-08-17 21:05:26 -05:00
Hadrien G
97ca824487 Add acts v0.30 (#18106) 2020-08-17 21:04:17 -05:00
Paul
12c4b8a73e Find external package for git-lfs. (#18107) 2020-08-17 21:02:41 -05:00
Levi Baber
6127a6faf6 canu: new version (#18112) 2020-08-17 20:58:50 -05:00
Marc Mengel
fdddbc9ae7 libsm: new version, depends_on libuuid (#18117)
* libsm: new version, depends_on libuuid

* blank lines

Co-authored-by: Marc Mengel <mengel@fnal.gov>
2020-08-17 20:52:26 -05:00
Toyohisa Kameyama
4f624727ef lua-lpeg: Add version 1.0.2-1. (#18128) 2020-08-17 20:51:29 -05:00
Rao Garimella
da1135cd19 Fix typo in CMake options (#18125)
* New interface reconstruction package

* forgot to put in CMake option for Jali

* cleanup whitespace

* fix lines with more than 79 chars

* more long line cleanup

* fix typo WONTON_ENABLE_Kokkos ---> TANGRAM_ENABLE_Kokkos

Co-authored-by: Rao Garimella <rao@abyzou.lanl.gov>
2020-08-17 20:41:04 -05:00
Marc Mengel
3da40a5c61 dependency fix for python3 (#18118)
* dependency fix for python3

* version cleanup

Co-authored-by: Marc Mengel <mengel@fnal.gov>
2020-08-17 16:51:17 -05:00
Nithin Senthil Kumar
9c3b4e4d28 Changes to url port and a bug fix in mvapich2x package (#18096)
Co-authored-by: nithintsk <nithintsk@github.com>
2020-08-17 15:33:26 -05:00
Adam J. Stewart
cc06181ef9 ALSA-lib only works on Linux (#18075) 2020-08-17 15:22:25 -05:00
eugeneswalker
1b965ac507 Binary Distribution: Relocate RPATH on Cray (#18110)
* make_package_relative: relocate rpaths on cray

* relocate_package: relocate rpaths on cray

* platforms: add `binary_formats` property

We need to know which binary formats are supported on a platform so we
know which types of relocations to try. This adds a list of binary
formats to the platform and removes a bunch of special cases from
`binary_distribution.py`.

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2020-08-17 13:21:36 -07:00
vvolkl
525a9f02ca xrootd: added v5.0.0, updated dependency on root (#18101) 2020-08-17 20:02:45 +02:00
Rao Garimella
1498d076c3 New package Tangram (#17944)
* New interface reconstruction package

* forgot to put in CMake option for Jali

* cleanup whitespace

* fix lines with more than 79 chars

* more long line cleanup

Co-authored-by: Rao Garimella <rao@abyzou.lanl.gov>
2020-08-17 13:00:43 -05:00
Adam J. Stewart
9350ecf046 py-torchvision: fix linking to -lavcodec (#18093) 2020-08-17 11:24:19 -05:00
vvolkl
022586b11d [delphes] align env var with lcg release (#18080) 2020-08-17 09:20:41 -05:00
t-nojiri
a19ac05d3e openmolcas: fix build on aarch64. (#17923) 2020-08-17 11:54:48 +02:00
vvolkl
b51e48732c pythia8: added environment variable (#18081) 2020-08-17 11:51:16 +02:00
Adam J. Stewart
8a8d36d828 hdf+external-xdr does not build on macOS (#18085) 2020-08-17 10:54:30 +02:00
Wouter Deconinck
7618b4e25c cppzmq: added versions up to v4.6.0 (#18087)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-17 10:32:38 +02:00
Harmen Stoppels
207eadc1b2 Add rocBLAS (#18069)
* Add rocBLAS

* Add rocBLAS support to SIRIUS
2020-08-16 19:42:43 -05:00
darmac
f56b521bb5 Add new package: hudi (#18041) 2020-08-16 17:36:03 -05:00
darmac
a9116ac95c Add new package: slf4j (#18042) 2020-08-16 17:35:12 -05:00
darmac
3e7e07f2fd Add new package: h2database (#18046) 2020-08-16 17:33:09 -05:00
darmac
4428f59ad5 kaldi:refine dependency openfst (#18047) 2020-08-16 17:32:23 -05:00
darmac
34b7d99bec Add new package: mahout (#18048) 2020-08-16 17:30:03 -05:00
darmac
06a0fe9d94 util-linux: remove libintl link (#18065) 2020-08-16 17:28:38 -05:00
darmac
bef02e9d89 nfs-utils: fix compile error on ubuntu (#18066) 2020-08-16 17:27:47 -05:00
Tomoki, Karatsu
af9a3dc69a fenics: split 'parmetis' dependency definition. (#18067) 2020-08-16 17:25:16 -05:00
Harmen Stoppels
fd82cff737 Make spfft build with rocm (#18068) 2020-08-16 17:23:04 -05:00
Tom Payerle
0ffe64dee0 libssh: add gssapi variant and include krb5 as a dependency accordingly (#18070)
See #18033

libssh seemed to detect and link to system krb5 libraries if found
to provide gssapi support, causing issues/system dependencies/etc.

We add a boolean variant gssapi

If +gssapi, the spack krb5 package is added as a dependency.
If ~gssapi, the CMake flags are adjusted to not use gssapi so that it
does not link to any krb5 package.
2020-08-16 17:16:44 -05:00
vvolkl
b89c794806 [xrootd] new version and updated dependency (#18079) 2020-08-16 17:14:08 -05:00
vvolkl
2916ebe743 [dd4hep] align runtime env with lcg release (#18082) 2020-08-16 17:11:28 -05:00
Filippo Spiga
0788a69fa8 Adding osu-micro-benchmarks version 5.6.3 (#18090)
Co-authored-by: Filippo Spiga <fspiga@nvidia.com>
2020-08-16 17:03:54 -05:00
Mark W. Krentel
6e1f9b65e4 xz: add +pic variant (#18092)
xz-utils already builds a shared library.  The +pic variant adds the
compiler pic flag to the static archive so that it can be linked into
another shared library.
2020-08-16 17:02:43 -05:00
Auriane R
93221f8bf9 Add boost 1.74.0 package (#18088) 2020-08-16 16:13:55 -05:00
iarspider
6fdbdaf5b6 New packages: madgraph, syscalc, collier, gosam-contrib (#17601)
* Add Collier and SysCalc recipes

* Remove extra syscalc version

* Build collier with -j1 for @:1.2.4

* Add recipe for gosam-contrib

* Update gosam-contrib recipe with 'provides'

* Madgraph recipe, first version

* Finalize madgraph recipe + flake8

* Make py2 version of madgraph default; fix hash for syscalc; fix patch

* Handle virtual packages (#3)

* Update package.py

* Update packages.yaml

* Remove virtual packages - pt. 1

* Remove virtual packages - pt. 2

* Changes from review - pt. 1

* Changes from code review - pt. 2

* Update var/spack/repos/builtin/packages/collier/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/madgraph5amc/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Add hash for version 2.7.2 (available in our private mirror)

* Fixes for 2.7.3 family

* Patches for 2.7.3{.py3,}{.atlas,}

* Fix hash of syscalc

* Hack to fix concretization (2.7.3 matches 2.7.3.py3)

* Add conflict statement (reported to devs)

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

* Delete madgraph5amc-2.7.2.atlas.patch

* Delete madgraph5amc-2.7.2.patch

* Update package.py

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: iarspider <iarpsider@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-16 12:07:14 -05:00
Nithin Senthil Kumar
6ececa6976 New packages Mvapich2x and Mvapich2-GDR (#17883)
* Adding new packages Mvapich2x and Mvapich2-GDR which can be installed only via binary mirrors

* Added docstring descriptions to both packages

* Removed variant wrapper for cuda dependencies

* Fixed multiple flake8 errors

* Updated APIs to pass unit tests

* Updated APIs for MVAPICH2-X package and fixed flake8 warnings for MVAPICH2-GDR

* Changed url back to single line

* Removed extra parentheses around URL string

Co-authored-by: nithintsk <nithintsk@github.com>
2020-08-16 08:56:22 -05:00
vvolkl
46fb456b54 [root] add dataframe cmake option (#17962)
* [root] add dataframe cmake option

@chissg @HadrienG2  @drbenmorgan


This has been a separate cmake option starting v6-19 I believe: 31292b9082
It should default to true -- not sure why, but this recipe sets it to off.

I could add a variant too, but since it has become an integral part of root and doesn't introduce extra dependencies, I'd propose to just set it to true like I do here.

* Update package.py
2020-08-15 10:13:11 -05:00
Massimiliano Culpo
395b478b1b spack config update (bugfix): packages.yaml with empty attributes (#18057)
Before this PR, packages.yaml files that contained an
empty "paths" or "modules" attribute were not updated
correctly, since the update function was not reporting
them as changed after the update.

This PR fixes that issue and adds a unit test to
avoid regression.
2020-08-14 19:31:55 -07:00
Chuck Atkins
a6a5708c44 pugixml: add version 1.10, add option for shared libs (#18072) 2020-08-14 17:45:30 -07:00
Adam J. Stewart
9ab8521a1c Python: add spack external find support (#16684) 2020-08-14 16:08:42 +02:00
Adam J. Stewart
512ef506a7 pkgconfig: add spack external find support (#16690) 2020-08-14 16:07:17 +02:00
Adam J. Stewart
c18167d01f Autoconf: add spack external find support (#16692) 2020-08-14 16:06:26 +02:00
Adam J. Stewart
73fe3ce158 M4: add spack external find support (#16693) 2020-08-14 16:05:30 +02:00
darmac
f901947f69 bmake: fix compilation error and added v20200710 (#17956) 2020-08-14 11:32:06 +02:00
Adam J. Stewart
2430ac5b7c GDAL: add spack external find support (#18004) 2020-08-14 08:59:34 +02:00
Adam J. Stewart
c27e82f0ac gmake: add spack external find support (#18009) 2020-08-14 08:58:22 +02:00
Levi Baber
5eea09d43d nextflow: added v20.07.1 (#18058) 2020-08-14 08:53:16 +02:00
Adam J. Stewart
be046d7341 Bazel: add spack external find support (#18008) 2020-08-14 08:48:49 +02:00
Adam J. Stewart
84a16e62d6 OpenGL: add spack external find support (#18003) 2020-08-14 08:48:11 +02:00
Adam J. Stewart
70419c752d GMT: add spack external find support (#18007) 2020-08-14 08:44:32 +02:00
Robert Blake
bcae9e354b Adding external package support for tar. (#18002)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-14 08:43:27 +02:00
Robert Blake
a9b1f22ba1 External package recognition for git. (#18010)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-14 08:42:53 +02:00
Adam J. Stewart
a14648ffd0 external packages: redirect stderr too (#18037) 2020-08-14 08:40:58 +02:00
Greg Sjaardema
8ade93d09d NETCDF-C: added v4.7.4 (#18051) 2020-08-14 08:25:41 +02:00
Greg Sjaardema
4f14d67537 SEACAS: added v2020-05-12 and v2020-08-13 (#18053) 2020-08-14 08:25:06 +02:00
Jed Brown
fe718afaa5 ceed: fix @3.0.0 dependency on hypre@2.18.2 (#17983)
Co-authored-by: Veselin Dobrev <dobrev1@llnl.gov>
2020-08-14 08:18:58 +02:00
Adam J. Stewart
2dad0e5825 Matplotlib: added v3.3.1 (#18061) 2020-08-14 08:01:37 +02:00
Adam J. Stewart
32710f1f28 py-certifi: add v2020.6.20 (#18060) 2020-08-14 08:00:57 +02:00
Adam J. Stewart
35bba4223f Bash: add v5.0.18, external package detection (#18062) 2020-08-14 08:00:25 +02:00
Kai Germaschewski
1d152a44fd fix configure.ac/autotools issue that causes problems on RHEL 7.7 (#17465) 2020-08-13 16:20:05 -07:00
Massimiliano Culpo
31f660bd90 Improve output of the external find command (#18017)
This commit adds output to the "spack external find"
command to inform users of the result of the operation.

It also fixes a bug introduced in #17804 due to the fact
that a function was not updated to conform to the new
packages.yaml format (_get_predefined_externals).
2020-08-13 14:36:58 -07:00
Jim Galarowicz
c0342e4152 Update the change to add gomp compatibility to llvm-openmp. (#17400)
* Update the change to add gomp compatibility to llvm-openmp.

* Update the change to add gomp compatibility to llvm-openmp using append instead of extend.

* Fix flake8 issue.

Co-authored-by: Jim Galarowicz <jgalarowicz@newmexicoconsortium.org>
2020-08-13 15:41:20 -05:00
Alicia Klinvex
8398265638 Add support for pFUnit version 4 (#17683)
* pFUnit: Added support for version 4

pFUnit v4 uses submodules, so we must fetch from the repo rather
than grabbing the tarball (see #11642).

* pFUnit: Added conflicts

pFUnit 4 causes an internal compiler error with gcc 7.2.0, and
several pFUnit versions are incompatible with shared libraries.

* pFUnit: Added conflicts for version 4

Version 4 uses Fortran 2008 features and cannot be built with gcc
compilers prior to 8.4.

* pFUnit: Fixed conflicts/dependencies as suggested

* pFUnit: Version 4 no longer fetches from git

Checksummable files are fetched instead.

* pFUnit: Simplify major version check

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* pFUnit: Removed unnecessary patch for v4

The patch is still applied to v3.

* pFUnit: Modified MPI flag for v4

pFUnit v3 and v4 use different CMake flags to enable/disable MPI
support. Also added a conflict for v3 with MPI enabled using
gfortran 10, since newer gfortran is more finicky about datatypes.

* pFUnit: Rearranged mpi logic

* pFUnit: changed m4 to a build dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* pFUnit: Added URL back

I did not realize it was needed by "spack versions" and
"spack checksum". Thanks @adamjstewart!

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-13 15:35:00 -05:00
Ganesh Kumar
3dfd99c57a AMD ROCm Mathlibs (#17699)
* ROCm Mathlibs

* fixed the review comments

* fixed flake8 issues
2020-08-13 15:34:30 -05:00
Massimiliano Culpo
4a2136670d gcc: improve detection functions (#17988)
* Don't detect Apple's clang as gcc@4.2.1
* Avoid inspecting links except for Cray platforms
* Always return string paths from compiler properties
* Improved name-based filtering (apt-based packages)
2020-08-13 12:17:02 -07:00
Nikolay Simakov
f61b14106a IOR package: added 3.3.0rc1 and develop version (#18036)
* IOR package: added 3.3.0rc1 and develop version

* IOR package: reordered versions, set 3.2.1 as preferred
2020-08-13 13:28:50 -05:00
Harmen Stoppels
f5d3b82bcd Make mpich buildable without fortran (#17964)
When the user explicitly sets ~fortran, mpich builds without fortran
support. This will make building C/C++ libraries using clang easier,
since clang does not offer a fortran compiler by default (yet).

Since the user has to disable Fortran support explicitly, this change
is not breaking.
2020-08-13 13:25:55 -05:00
Harmen Stoppels
d6587ea365 fix buildcache create for environments with uninstalled root specs (#17859)
* Handle uninstalled rootspecs in buildcache

- Do not parse specs / find matching specs when in an environment and no
package string is provided
- Error only when a spec.yaml or spec string are not installed. In an
environment it is fine when the root spec does not exist.
- When iterating through the matched specs, simply skip uninstalled
packages
2020-08-13 09:59:20 -07:00
Massimiliano Culpo
d2f56830f1 "spack config update" can handle comments in YAML files (#18045)
fixes #18031

With this fix "spack config update" can update YAML
files that contain comments, while previously it
couldn't.
2020-08-13 09:54:07 -07:00
Matt Larsen
29818fda00 update vtk-h release (#18052) 2020-08-13 09:30:32 -07:00
Todd Gamblin
b476f8aa63 Merge tag 'v0.15.4' into develop 2020-08-13 07:23:02 -07:00
Massimiliano Culpo
ecf4829de7 llvm: added external detection capabilities (#17989)
* llvm: added external detection capabilities

* Added comment with reference to external package detection docs

* Fix typo in a comment
2020-08-13 09:03:09 -05:00
Todd Gamblin
764cafc1ce update CHANGELOG.md for 0.15.4 2020-08-13 00:41:59 -07:00
Todd Gamblin
091b45c3c7 bump version number for 0.15.4 2020-08-13 00:33:31 -07:00
Massimiliano Culpo
1707448fde Move Python 2.6 unit tests to Github Actions (#17279)
* Run Python2.6 unit tests on Github Actions
* Skip url tests on Python 2.6 to reduce waiting times
* Skip foreground background tests on Python 2.6 to reduce waiting times
* Removed references to Travis in the documentation
* Deleted install_patchelf.sh (can be installed from repo on CentOS 6)
2020-08-13 00:33:31 -07:00
Patrick Gartung
4d25481473 Buildcache: bindist test without invoking spack compiler wrappers. (#15687)
* Buildcache:
   * Try mocking an install of quux, corge and garply using prebuilt binaries
   * Put patchelf install after ccache restore
   * Add script to install patchelf from source so it can be used on Ubuntu:Trusty which does not have a patchelf package. The script will skip building on macOS
   * Remove mirror at end of bindist test
   * Add patchelf to Ubuntu build env
   * Revert mock patchelf package to allow other tests to run.
   * Remove depends_on('patchelf', type='build') relying instead on
   * Test fixture to ensure patchelf is available.

* Call g++ command to build libraries directly during test build

* Flake8

* Install patchelf in before_install stage using apt unless on Trusty where a build is done.

* Add some symbolic links between packages

* Flake8

* Flake8:

* Update mock packages to write their own source files

* Create the stage because spec search does not create it any longer

* updates after change of list command arguments

* cleanup after merge

* flake8
2020-08-13 00:33:31 -07:00
Massimiliano Culpo
ecbfa5e448 Use "fetch-depth: 0" to retrieve all history from remote 2020-08-13 00:33:31 -07:00
Massimiliano Culpo
c00773521e Simplified YAML files for Github Actions workflows
Updated actions where needed
2020-08-13 00:33:31 -07:00
Massimiliano Culpo
a4b0239635 Group tests with similar duration together
Style and documentation tests take just a few minutes
to run. Since in Github actions one can't restart a single
job but needs to restart an entire workflow, here we group
tests with similar duration together.
2020-08-13 00:33:31 -07:00
Todd Gamblin
303882834a docs: document releases and branches in Spack
- [x] Remove references to `master` branch
- [x] Document how release branches are structured
- [x] Document how to make a major release
- [x] Document how to make a point release
- [x] Document how to do work in our release projects
2020-08-13 00:33:31 -07:00
Todd Gamblin
5b63ec8652 Remove references to master from CI
- [x] remove master from github actions
- [x] remove master from .travis.yml
- [x] make `develop` the default branch for `spack ci`
2020-08-13 00:30:51 -07:00
Massimiliano Culpo
fc94dde3fc Moved flake8, shell and documentation tests to Github Action (#17328)
* Move flake8 tests on Github Actions

* Move shell test to Github Actions

* Moved documentation build to Github Action

* Don't run coverage on Python 2.6

Since we get connection errors consistently on Travis
when trying to upload coverage results for Python 2.6,
avoid computing coverage entirely to speed-up tests.
2020-08-13 00:30:51 -07:00
Robert Blake
c064088cf3 Bugfix for #17999: use cudart instead of cuda. (#18000)
This is needed because libcuda is used by the driver,
whereas libcudart is used by the runtime. CMake searches
for cudart instead of cuda.

On LLNL LC systems, libcuda is only found in compat and
stubs directories, meaning that the lookup of libraries
fails.
2020-08-12 23:58:10 -07:00
Todd Gamblin
c05fa25057 bugfix: fix spack -V with releases/latest and shallow clones (#17884)
`spack -V` stopped working when we added the `releases/latest` tag to
track the most recent release. It started just reporting the version,
even on a `develop` checkout. We need to tell it to *only* search for
tags that start with `v`, so that it will ignore `releases/latest`.

`spack -V` would also print out unwanted git error output on a shallow
clone.

- [x] add `--match 'v*'` to `git describe` arguments
- [x] route error output to `os.devnull`
2020-08-12 23:58:10 -07:00
Patrick Gartung
8e2f41fe18 Buildcache create: change NoOverwriteException back to a warning as in v0.14 (#17832)
* Change buildcache create `NoOverwriteException` back to a warning.
2020-08-12 23:58:10 -07:00
Axel Huebl
50f76f6131 Hotfix: move CUDAHOSTCXX (#17826)
* Hotfix: move CUDAHOSTCXX

Set only in dependent packages.

* dependent compiler
2020-08-12 23:58:10 -07:00
Todd Gamblin
5f8ab69396 bugfix: fix spack buildcache list --allarch
`spack buildcache list` was trying to construct an `Arch` object and
compare it to `arch_for_spec(<spec>)` for each spec in the buildcache.
`Arch` objects are only intended to be constructed for the machine they
describe. The `ArchSpec` object (part of the `Spec`) is the descriptor
that lets us talk about architectures anywhere.

- [x] Modify `spack buildcache list` and `spack buildcache install` to
      filter with `Spec` matching instead of using `Arch`.
2020-08-12 23:58:10 -07:00
Todd Gamblin
aff0e8b592 architecture: make it easier to get a Spec for the default arch
- [x] Make it easier to get a `Spec` with a proper `ArchSpec` from an
      `Arch` object via new `Arch.to_spec()` method.

- [x] Pull `spack.architecture.default_arch()` out of
      `spack.architecture.sys_type()` so we can get an `Arch` instead of
      a string.
2020-08-12 23:58:10 -07:00
h-denpo
5512340a51 gotcha: fixed building v0.0.2 on ARM (#18012) 2020-08-13 08:16:23 +02:00
eugeneswalker
7302dd834f allow GNUPGHOME to come from SPACK_GNUPGHOME in env, if set (#17139) 2020-08-12 22:57:57 -07:00
ketsubouchi
ae23f33a31 eztrace: add space, --linkfortran, -Wl (#17801) 2020-08-12 20:16:59 -05:00
Chris White
105caa7297 Axom + Conduit updates (#17863)
* Loosen Axom's variants, add shared variant for axom, fix clang/xlf rpath'ing problem on blueos

* Fix flake8

* Add main branch to list of known git branches
2020-08-12 20:15:59 -05:00
Massimiliano Culpo
fa216e5f15 mpich, mvapich2: fixed setup_*_environment (#18032)
fixes #18028

Since now external packages support multiple modules
the correct thing to do is to check if the name of the
*first* module to be loaded contains the string "cray"
2020-08-12 20:04:03 -05:00
Harmen Stoppels
075e9428a1 Make the build stage of OpenSSL parallel (#18024) 2020-08-12 19:58:36 -05:00
Erik Schnetter
3fbbf539d1 libpciaccess: New version 0.16 (#18035) 2020-08-12 19:57:00 -05:00
Erik Schnetter
c024a55da8 cmake: Add version 3.16.3 (#18034)
`cmake @3.16.3` is the version provided by Ubuntu 20.04. Adding this version here avoids the warning
```
==> Warning: Missing a source id for cmake@3.16.3
```
when using the system `cmake`.
2020-08-12 19:53:55 -05:00
srekolam
ad060c7870 Spack recipes for ROCm software components-Phase1 (#17422)
* Spack recipes for ROCm Stage 1 Build components

* fix flake8 errors

* fixes for flake8 errors

* Add a patch for cmake 3.x support

* Fix rpath issue where hsa-rocr-dev does not allow it to be filled in by spack

* Remove inherited cmake args from comgr

* Make hsakmt-roct compile: no -Werror because of const casts in numa, and actually add the numa dependency

* Remove redundant cmake args which is inherited

* Fix some dependencies

* Fix some python 2.x compatibilities

* Add amd gpu targets to rocfft

* Make comgr a link dep of rocm-dbgapi and remove redundant cmake args

* Remove redundant cmake args

* Remove more redundant cmake args

* Final redundant args

* Use cmake 3.x instead of a fixed version

* Remove random variable

* Use installed rocclr instead of nonexisting directory

* Don't build outside the staging folder

* Deploy some missing cmake target file

* Formatting

* Fix target list

* Properly handle the rocclr dependency

* Formatting

* Fix vermin test

* Make all 3.5.0 packages depend exactly on each other

* Add a few missing link dependencies

* Fix flake8

* Remove some other redundant flags

* Add gcc install prefix for gcc builds of llvm-amdgpu

* review changes for the spack recipes

* Do not hard-code versions

* Fix atmi install

- no more relative rpaths outside of install directory (required patch)
- fix build -> link dependencies
- remove unused build dependency

* Fix flake8 errors

* Remove unused variable and make things python 2.x compatible

* Fix flake8

* Move compiler config from rocfft -> hipcc

* Remove redundant dependency on fftw-api

* Remove redundant import

* Avoid hitting the ROCM_PATH variable altogether with a patch; also just fill in all variables

* Add missing deps z3, zlib and ncurses+termlib to llvm-amdgpu

* Fix perl shebang and add dep

* Fix typo and patch HIP_CLANG_ROOT detection in hip's cmake files

* fixing build failure due z3 and adding zlib for rocgdb

* new changes to add z3,curses dependency for llvm-amdgpu

* fix flake8 error

Co-authored-by: root <root@localhost.localdomain>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2020-08-12 19:36:35 -05:00
eugeneswalker
25a0d89b58 petsc: add explicit build dependency: diffutils (#17979) 2020-08-12 13:42:03 -07:00
Julien Loiseau
015ea82bd5 Update FleCSPH package (#17997)
* Update FleCSPH package

* Flake8 corrections

* Update FleCSI version
2020-08-12 11:49:30 -06:00
Robert Blake
b4ff584bc0 Adding externals support for CUDA. (#18011)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-08-12 19:30:57 +02:00
Jon Rood
193c3535e1 Use configure-release instead of configure for UCX (#17941)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-12 15:14:53 +02:00
Michele Martone
d1fae90f84 librsb: added v1.2.0.9 (#18015) 2020-08-12 14:09:49 +02:00
Andrew W Elble
de053ca2ea weechat: added new package at v2.9 (#17996) 2020-08-12 14:09:09 +02:00
Harmen Stoppels
edcc334631 Respect $PATH ordering when registering compilers (#17967) 2020-08-12 13:38:08 +02:00
Ethan Stam
12aa6d221c ParaView: added v5.8.1 (#17995) 2020-08-12 13:18:32 +02:00
Robert Blake
f73141c6af Bugfix for #17999: use cudart instead of cuda. (#18000)
This is needed because libcuda is used by the driver,
whereas libcudart is used by the runtime. CMake searches
for cudart instead of cuda.

On LLNL LC systems, libcuda is only found in compat and
stubs directories, meaning that the lookup of libraries
fails.
2020-08-12 13:13:57 +02:00
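The cudart-vs-cuda distinction above can be sketched as a small selection helper. This is an illustrative standalone function (`pick_cuda_runtime` is hypothetical, not Spack's actual code), assuming the package collects candidate library paths and must prefer the runtime library over the driver stub:

```python
# Illustrative sketch of the fix described above: link against the CUDA
# runtime (libcudart) rather than the driver library (libcuda), which on
# some systems lives only in compat/stubs directories and may be missing.
def pick_cuda_runtime(libraries):
    # Prefer libcudart; fall back to libcuda only if no runtime lib exists.
    runtime = [lib for lib in libraries if "libcudart" in lib]
    return runtime or [lib for lib in libraries if "libcuda." in lib]


libs = ["/opt/cuda/lib64/libcudart.so", "/opt/cuda/lib64/stubs/libcuda.so"]
print(pick_cuda_runtime(libs))
```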
Thomas Madlener
041e21802e podio: added dependency on jinja2 for newer versions (#17990) 2020-08-12 08:56:11 +02:00
Robert Blake
80a382c218 Adding external recognition to perl (#17756)
* Perl now has external recognition.

* Changing code to use the new external packages API.
2020-08-11 17:06:20 -07:00
Adam J. Stewart
2f0fd44b97 Libtool: add spack external find support (#16691)
* Libtool: add spack external find support

* Less specific regex

* match -> search

* Clarify that min returns first alphabetically, not shortest

* Simplify version determination
2020-08-11 10:16:15 -05:00
Harmen Stoppels
313511bf1d elpa: add cuda support, add libtool dep, fix parallel build failure (#17969) 2020-08-11 17:01:49 +02:00
Adam J. Stewart
09cc89a449 py-pyinstrument: added v3.1.3 (#17976) 2020-08-11 10:23:10 +02:00
Adam J. Stewart
db3ecb2fdf py-sphinx: added v3.2.0 (#17977) 2020-08-11 10:22:18 +02:00
Adam J. Stewart
1fdd19bcc2 py-sphinx-rtd-theme: added v0.5.0 (#17978) 2020-08-11 10:21:55 +02:00
darmac
9e54570b4c libcroco: added build dependency on pkg-config (#17970) 2020-08-11 10:14:22 +02:00
Massimiliano Culpo
6cda20472e Fix loading of compiler modules on CRAY (#17984)
The modifications in 193e8333fa
introduced a bug in the loading of compiler modules, since a
function that was expecting a list of strings was just getting
a string.

This commit fixes the bug and adds an assertion to verify the
prerequisite of the function.
2020-08-11 10:11:01 +02:00
Adam J. Stewart
f0c0cd5c3f Fix typo in spack external debug msg (#17982) 2020-08-11 09:55:44 +02:00
Sajid Ali
30dc0baa34 openssl: added detection capabilities (#16653) 2020-08-11 09:44:51 +02:00
Kurt Sansom
07422f95de sprng: added new package (#17570)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-11 09:15:13 +02:00
Howard Pritchard
18d2682f75 UCX: add version 1.8.1 (#17972)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2020-08-10 23:10:59 -05:00
Jeffrey Salmond
6ccc430e8f Add fenicsx packages (#17743)
* add py-ufl package from fenics

* add py-fiat package from fenics

* add py-ffcx package from fenics

* add py-dijitso package from fenics

* add dolfinx library from fenics

* amend ffcx to use ufl and fiat master branches

* setup variants complex and int64 of dolfinx

* add dolfinx python library as package

* add test dependencies to py-dolfinx

* remove broken doc variant

* remove test dependencies from py-dolfinx

* flake8 fixes to dolfinx and py-dolfinx

* make sure dolfinx cmake picks up the correct python version

* list build phases in py-dolfinx package

* remove unnecessary package url

* make pkgconf a build dependency

* make all python dependencies build+run

* py-ffcx needs py-setuptools to be a build/run dependency to support ffcx executable

* remove unnecessary variants from dolfinx

* add missing dependencies to py-dijitso

* remove stray line from py-dolfinx

* simplify definition of build_directory in py-dolfinx

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* use depends_on("python") rather than extends("python") in py-ffcx

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* use depends_on("python") rather than extends("python") in py-fiat

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* use depends_on("python") rather than extends("python") in py-ufl

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* rename py-fiat to py-fenics-fiat

* rename py-ufl to py-fenics-ufl

* fix error in depends_on(petsc) definition

* add missing dep on numpy to py-fenics-fiat

* specify python@3.8: as requirement for all fenics components

* use tuples rather than list for depends_on type=

* specify eigen@3.3.7: as dependency for dolfinx

* add js947 and chrisrichardson as maintainers for the fenics packages

* remove scipy dependency from py-dolfinx

* rename package py-ffcx -> py-fenics-ffcx

* rename package dolfinx -> fenics-dolfinx

* rename package py-dolfinx -> py-fenics-dolfinx

* remove pointless URL from py-fenics-dolfinx package

* rename package py-dijitso -> py-fenics-dijitso

* formatting

* remove unnecessary cmake args from fenics-dolfinx

* revert py-fenics-fiat python version to 3:

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* revert py-fenics-ufl python version to 3.5:

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* add conflict to fenics-dolfinx for C++17 support

* revert py-fenics-ffcx python version to 3.5:

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-10 21:43:43 -05:00
Nithin Senthil Kumar
38f2a0a92b Updated docstring description for MVAPICH2 (#17921)
* Updated docstring description for MVAPICH2

* Fixed flake8 character count warning and added maintainers

* Removed trailing whitespaces

Co-authored-by: nithintsk <nithintsk@github.com>
2020-08-10 21:35:41 -05:00
darmac
08beb7aa30 aperture-photometry: add v2.8.4 (#17953) 2020-08-10 21:35:20 -05:00
darmac
00bdff81ae Add new package: tiptop (#17907)
* Add new package: tiptop

* fix flake8 error

* tiptop: remove sha256 value
2020-08-10 21:26:21 -05:00
darmac
84a97d8372 pbbam: fix build error (#17955)
* pbbam: fix build error

* Update var/spack/repos/builtin/packages/pbbam/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-10 21:25:45 -05:00
darmac
9fbfd6b58c kibana: support aarch64 (#17910)
* kibana: support aarch64

* kibana: change jdk to java
2020-08-10 21:25:08 -05:00
darmac
c322bea8f8 fix apex sha256 and depends constraint (#17902) 2020-08-10 20:36:58 -05:00
Massimiliano Culpo
1dba0ce81b Removed references to BlueGene/Q in docs and comments 2020-08-10 17:09:09 -07:00
Massimiliano Culpo
f128e7de54 Removed references to BlueGene/Q in Spack builtin packages 2020-08-10 17:09:09 -07:00
Massimiliano Culpo
98701279df Removed references to BlueGene/Q in core Spack 2020-08-10 17:09:09 -07:00
Satish Balay
1920d125e8 petsc: update hypre dependency wrt 2.18.2 -> 2.19.0 API change (#17973) 2020-08-10 13:41:35 -07:00
Massimiliano Culpo
c0d490ffbe Simplify the detection protocol for packages
Packages can implement “determine_version” to support detection
of external instances of a package. This is generally easier
than implementing “determine_spec_details”. The API for
“determine_version” is similar: for example, you can return
“None” to indicate that an executable is not an instance
of a package.

Users may implement a “determine_variants” method for a package.
When doing external detection, executables are grouped by version
and each group results in a single invocation of “determine_variants”
for the associated spec. The method returns a string specifying
the variants for the package. The method may additionally return
a dictionary representing extra attributes for the package.

These will be stored in the spec yaml and can be retrieved
from self.spec.extra_attributes

The Spack GCC package has been updated with an implementation
of “determine_variants” which adds the following extra
attributes to the package: c, cxx, fortran
2020-08-10 11:59:05 -07:00
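The detection protocol described above can be sketched roughly as follows. The class and method names mirror Spack's package API, but this is a standalone, simplified sketch (real `determine_version` receives an executable path and the framework runs it; here it takes the captured `--version` output directly so the example is self-contained):

```python
import re


# Hedged sketch of the external-detection protocol from the commit above.
class Gcc:
    # Regexes the detection machinery would use to find candidate binaries.
    executables = [r"^gcc$", r"^gcc-\d+$"]

    @classmethod
    def determine_version(cls, version_output):
        # Return None to signal "this executable is not an instance of
        # this package"; otherwise return the version string.
        match = re.search(r"\(GCC\) (\d+\.\d+\.\d+)", version_output)
        return match.group(1) if match else None

    @classmethod
    def determine_variants(cls, exes, version_str):
        # Called once per version group.  Returns a variant string plus a
        # dict of extra attributes stored in the spec yaml (retrievable
        # later via self.spec.extra_attributes).
        compilers = {"c": exes[0]}
        return "languages=c", {"compilers": compilers}


out = "gcc (GCC) 10.2.0\nCopyright (C) 2020 Free Software Foundation, Inc."
version = Gcc.determine_version(out)
variants, extra = Gcc.determine_variants(["/usr/bin/gcc"], version)
print(version, variants, extra["compilers"]["c"])
```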
Massimiliano Culpo
193e8333fa Update packages.yaml format and support configuration updates
The YAML config for paths and modules of external packages has
changed: the new format allows a single spec to load multiple
modules. Spack will automatically convert from the old format
when reading the configs (the updates do not add new essential
properties, so this change in Spack is backwards-compatible).

With this update, Spack cannot modify existing configs/environments
without updating them (e.g. “spack config add” will fail if the
configuration is in a format that predates this PR). The user is
prompted to do this explicitly and commands are provided. All
config scopes can be updated at once. Each environment must be
updated one at a time.
2020-08-10 11:59:05 -07:00
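The automatic conversion mentioned above can be sketched as a dict-to-dict rewrite. The exact keys (`modules`, `paths`, `externals`, `spec`, `prefix`) are assumptions made for illustration and `update_external_entry` is a hypothetical helper, not Spack's actual updater; the point is that the old format tied one spec to one module string, while the new list form lets a single spec load multiple modules:

```python
# Illustrative sketch (not Spack's actual code) of updating an external
# package entry from the old one-module-per-spec format to the newer
# list-of-externals format described in the commit above.
def update_external_entry(old_entry):
    externals = []
    for spec, module in old_entry.get("modules", {}).items():
        # New format: a list, so one spec may load several modules.
        externals.append({"spec": spec, "modules": [module]})
    for spec, path in old_entry.get("paths", {}).items():
        externals.append({"spec": spec, "prefix": path})
    return {"externals": externals}


old = {"modules": {"mpich@3.3": "mpich/3.3"}}
print(update_external_entry(old))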
Amjad Kotobi
1398038bee git: new version 2.28 (#17848)
* git: new version 2.28

* git: man page version 2.28
2020-08-10 07:48:03 -04:00
Michael Kuhn
8c1329958c Hotfix for config singleton initialization (#17263)
Fixes #17262
2020-08-10 07:48:33 +02:00
Brian Van Essen
aa79d565b3 DiHydrogen: adding blas and associated variants (#17911) 2020-08-10 07:31:41 +02:00
Todd Gamblin
7c9c486d07 deptypes: move deptype formatting code from Spec.format to dependency.py (#17843)
- This simplifies Spec.format somewhat
- Makes code to generate deptype strings (e.g., '[blrt]') reusable
2020-08-09 15:33:31 -07:00
darmac
6309c5eff5 Add new package: libhugetlbfs (#17904) 2020-08-09 14:05:28 -05:00
Nichols A. Romero
c015b8538d LLVM minor cleanup (#17905)
* Since LLVM already depends on the CUDA build system, these lines are redundant.

* This conflict doesn't do anything.
2020-08-09 14:04:53 -05:00
darmac
6c4c7c4b72 libtirpc: fix krb5 depends and remove --disable-gssapi configure (#17926) 2020-08-09 13:52:17 -05:00
darmac
3b8a5dea44 Add new package: hazelcast (#17928) 2020-08-09 13:51:25 -05:00
Rémi Lacroix
ab6e74f6ac FFTW: "configure" script failed with Intel compilers on some systems. (#17936)
Next version of FFTW won't use `-no-gcc` so add a patch to backport the fix to older versions.

Fixes #17810.
2020-08-09 13:47:26 -05:00
Teodor Nikolov
902fac185a blaspp: added explicit dependency on CUDA (#17965) 2020-08-09 20:44:05 +02:00
eugeneswalker
bad8734316 add variant tests which can be one of (none, tests, benchmarks) (#17949) 2020-08-09 13:40:46 -05:00
darmac
5965522bbe libxscrnsaver: fix depends scrnsaverproto error (#17951) 2020-08-09 13:38:55 -05:00
darmac
e0e73c6ea1 blaze: fix build error (#17954) 2020-08-09 13:30:39 -05:00
Alex Margolin
c0492efc11 KNEM and XPMEM support for UCX (#17215)
* KNEM url updated

Signed-off-by: Alex Margolin <alex.margolin@huawei.com>

* Allow UCX to be built against KNEM and XPMEM

Signed-off-by: Alex Margolin <alex.margolin@huawei.com>
2020-08-09 12:55:49 -05:00
Wouter Deconinck
4493d31170 dire: new package at v2.004 (#17749)
Older versions do not compile correctly. New users should use 2.004,
not any of the older versions.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-09 19:03:26 +02:00
Adam J. Stewart
3b6c16ee9f py-scikit-learn: added v0.23.2 (#17876) 2020-08-09 15:57:07 +02:00
Jordan Ogas
f243291334 charliecloud: added v0.17 (#17855) 2020-08-09 15:55:53 +02:00
Harmen Stoppels
512fa8e460 Fix cpio clang build error (#17963)
undefined reference to '__muloti4', using the proposed fix from
https://bugs.llvm.org/show_bug.cgi?id=16404
2020-08-09 15:45:53 +02:00
ketsubouchi
0642216c31 eospac: added support for fujitsu compiler (#17922) 2020-08-09 15:45:35 +02:00
Cameron Stanavige
7631013975 unifyfs: remove flatcc dependency and add spath (#17913)
FlatCC has been removed from UnifyFS as a dependency on the develop
branch and for future releases.

spath is now an optional dependency for UnifyFS to normalize relative
paths provided by the user.
2020-08-09 15:14:44 +02:00
ketsubouchi
c6ad321838 r-rcppparallel: fix for fujitsu compiler (#17924) 2020-08-09 15:01:19 +02:00
vvolkl
e83f639c13 HEP versions update (#17927)
* [whizard] bug fix for ~openloops

* [lcio] new version

* [gaudi] new version

* [lcio] add delphes dependency for examples
2020-08-09 14:54:26 +02:00
Dr. Christian Tacke
dfd5da85c3 pythia8: added v8302 and maintainer (#17931) 2020-08-09 14:43:47 +02:00
Rémi Lacroix
9979aa28ce hypre: added v2.19.0 (#17809) 2020-08-09 14:42:52 +02:00
Adam J. Stewart
eda170665f oneDNN: added v1.6 and v1.6.1 (#17822) 2020-08-09 14:40:05 +02:00
Adam J. Stewart
33ec8cd86a py-azureml-sdk: add v1.11.0 (and deps) (#17939) 2020-08-09 14:31:28 +02:00
Sinan
f340b81ca8 py-netket: added new package at v2.1.1 (#17515)
Co-authored-by: Sinan81 <sbulut@3vgeomatics.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-09 14:05:26 +02:00
Brian Van Essen
1fc02b801e cub: added v1.9.10, fixed typo in url (#17942) 2020-08-09 13:59:42 +02:00
Glenn Johnson
8eef488e2c Replace [] with () in description (#17940)
The tcl module for r-dorng will fail to load due to the [] characters in
the description. This happens for Tcl formatted modules loaded by Lmod
at least.

```
module load r-dorng-1.7.1-gcc-9.2.0-wtq7bne
Lmod has detected the following error: .../spack/share/spack/modules/linux-centos7-broadwell/r-dorng-1.7.1-gcc-9.2.0-wtq7bne:(r-dorng-1.7.1-gcc-9.2.0-wtq7bne):
invalid command name "L'Ecuyer"
```

Split text for short and long descriptions.
2020-08-09 13:54:28 +02:00
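The root cause above is that Tcl interprets `[...]` as command substitution, so a description containing `L'Ecuyer [2002]` becomes a command invocation inside the modulefile. A minimal escaper sketch (hypothetical helper, not Spack's actual fix, which split the description text instead):

```python
# Hedged sketch: backslash-escape the characters Tcl would otherwise
# interpret, so bracketed text in package descriptions survives in a
# Tcl modulefile loaded by Lmod.
def escape_tcl(text):
    # Escape backslashes first so later escapes are not double-escaped.
    for ch in ("\\", "[", "]", "{", "}"):
        text = text.replace(ch, "\\" + ch)
    return text


desc = "Generic reproducible RNG functions from L'Ecuyer [2002]"
print(escape_tcl(desc))
```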
Adam J. Stewart
a3a571e5c3 PIL: add py-pillow 7.2.0 (#17933) 2020-08-09 13:43:29 +02:00
Sinan
d7ae244a14 llvm: added v10.0.1 (#17950)
Co-authored-by: Sinan81 <sbulut@3vgeomatics.com>
2020-08-09 13:39:11 +02:00
Adam J. Stewart
09223e5c0b wcslib: added v7.3 (#17836) 2020-08-09 13:24:48 +02:00
Satish Balay
61735f3847 petsc: added v3.13.2,v3.13.3,v3.13.4 (#17957) 2020-08-09 13:19:59 +02:00
Adam J. Stewart
aa14c9d1d1 libimagequant: added new package at v2.12.6 (#17959) 2020-08-09 13:09:52 +02:00
Satish Balay
2954eb39c9 saws: added v0.1.1 (#17958) 2020-08-09 13:05:21 +02:00
Mark W. Krentel
cef729e39c mbedtls: fix min cmake version, added v2.16.7 (#17960)
Mbedtls 2.16.x uses target_sources() from cmake >= 3.1.0
2020-08-09 12:52:21 +02:00
Daniel Topa
61cf6d9c8f cassandra: fixed checksum for v3.11.6 (#17961)
Signed-off-by: Daniel Topa <dantopa@gmail.com>
2020-08-09 12:47:03 +02:00
Glenn Johnson
023a057f20 Add variants to petsc (#17218)
* Add variants to petsc

This PR adds the following variants to the petsc package

- gmp
- jpeg
- libpng
- giflib
- mpfr
- netcdf
- pnetcdf (parallel-netcdf)
- moab
- eigen
- random123
- exodusii
- mstk
- cgns
- memkind
- muparser
- p4est
- saws
- libyaml
- zstd

* Fix flake8 errors

* Additional changes to Petsc recipe

This commit addresses the issues with dependencies that were brought up
in the comments. There are also a few other enhancements.

- the language of the new variant descriptions was changed to be more
  consistent with what was already in the recipe
- an explicit '+mpi' was added to the depends_on('hypre...') directives
- an explicit '+mpi' was added to the depends_on('trilinos...')
  directives
- the run time error checking for '~mpi' was replaced with 'conflicts()'
  directives that will cause the install to fail sooner
- additional variants that were 'parallel only' were added to the '~mpi'
  check

* Set the '~mpi`' conflicts msg to a variable
2020-08-08 11:07:51 -05:00
Sinan
e4a7cb4d60 new package: py-cmake (#17588)
* new package: py_cmake

* sync cmake and py-cmake versions

* new package: py_cmake

* sync cmake and py-cmake versions

* flake8, improve dependency specifics

* replace template stuff with actual values

Co-authored-by: Sinan81 <sbulut@3vgeomatics.com>
2020-08-08 11:04:41 -05:00
Sinan
775c14e0c1 package/cmake: add versions 1.18.0 and 1.18.1 (#17945)
* package/cmake: add versions 1.18.0 and 1.18.1

* fix shasums

Co-authored-by: Sinan81 <sbulut@3vgeomatics.com>
2020-08-08 00:42:02 -05:00
Adrien Bernede
3978db91dc Feature/raja chai umpire update (#17665)
* Changing raja, chai, and umpire packages so all will compile with each other.

* Need a CUDA version of CHAI when compiling with raja+cuda+chai

* Updating checks for commit.

* Adding comments explaining why chai+umpire tests were disabled

* Reactivating tests for CHAI and Umpire

* reordering versions

* Unified handling of Cuda Arch

* Adding latest versions

* Unused/Untested: removed

* Aesthetic and test mode in Chai

* Unified handling of Cuda Arch

* Using 'ON' consistently, instead of 'On'

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix, suggestion and patch:

Chai depends on RAJA, not the other way.
Apply suggested master-main version mapping.
Add Umpire version 3.0.0 and patch.

Co-authored-by: Robert Blake <blake14@llnl.gov>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-07 17:42:07 -05:00
eugeneswalker
ceed9c4bc0 conduit: dont use cmake >= 3.18 because of FindHDF5 bug (#17937) 2020-08-07 13:50:14 -07:00
Glenn Johnson
1a11449c86 New package - REDItools (#17703)
* New package - REDItools

This PR adds the REDItools package, along with a new package dependency,
py-fisher. This contains a patch generated from the python 2to3 script
as well as some other fixes. I am not sure if the project is ready to
support python-3 yet but I submitted the other patches upstream.

* Update var/spack/repos/builtin/packages/reditools/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-07 14:40:52 -05:00
Adam J. Stewart
11f2d01051 py-inference-schema: add new package (#17912) 2020-08-07 12:09:34 -05:00
Tomoki, Karatsu
c83236d3a4 Fujitsu compiler: Accept alphabet as version. (#17890)
* Fujitsu compiler: Accept alphabet as version.

* Fujitsu compiler: Updated test pattern.
2020-08-07 09:06:22 -05:00
g-mathias
c1efa09928 gromacs: new version 2020.3; variant nosuffix and mkl support (#17908)
* new version 2020.3; new variants nosuffix and fft; version selections for plumed

* fixed too long lines

* fixed whitespaces

* revised fft interface according to @haampie 's suggestions

Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2020-08-07 07:03:14 -06:00
darmac
6fc6f1ea9d activeharmony: add version 4.6.0 (#17867) 2020-08-06 23:51:47 -07:00
Adam J. Stewart
463aaef092 py-rasterio: add v1.1.5 2020-08-06 23:50:02 -07:00
Robert Pavel
07e426ce00 New package: ProfugusMC (#17914)
Added spackage for profugusMC
2020-08-06 23:45:19 -07:00
Mark W. Krentel
7d348268fb hpctoolkit: fix commit hash for version 2020.08.03 (#17920) 2020-08-06 23:42:52 -07:00
Rao Garimella
cff383802a Wonton (package): Fix sha 256 sum for version (#17916) 2020-08-06 23:41:48 -07:00
Justin S
7a5c0e0326 stacks (package): add version 2.53 (#17915) 2020-08-06 23:40:41 -07:00
Justin S
0f8ad5be1b isescan: new package at 1.7.2.1 (#16589)
* isescan: new package at 1.7.2.1

* isescan: update external program paths

* isescan: use spack compiler
2020-08-06 22:26:52 -05:00
Rao Garimella
9a7834949d New Package Wonton (#17882)
* new package Wonton

* remove the flecsi variant because flecsi-sp does not have a spackage

* fix url, clean up whitespaces

* formatting

* put in explicit else clauses for variants in CMake section because CMake's behavior is system-dependent

Co-authored-by: Rao Garimella <rao@abyzou.lanl.gov>
2020-08-06 13:55:24 -05:00
Adam J. Stewart
051e124533 Bazel: relax Java dependency (#17734)
* Bazel: relax Java dependency

* Flake8 fix
2020-08-06 13:02:02 -04:00
Julius-Plehn
f3ec1d445d New package: HDFView (#17707)
* HDFView

* adds support for version 3.1.1
2020-08-06 09:02:40 -05:00
Todd Gamblin
9dbe1d7776 bugfix: fix spack -V with releases/latest and shallow clones (#17884)
`spack -V` stopped working when we added the `releases/latest` tag to
track the most recent release. It started just reporting the version,
even on a `develop` checkout. We need to tell it to *only* search for
tags that start with `v`, so that it will ignore `releases/latest`.

`spack -V` also would print out unwanted git error output on a shallow
clone.

- [x] add `--match 'v*'` to `git describe` arguments
- [x] route error output to `os.devnull`
2020-08-05 17:01:18 -07:00
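The two checklist items above can be sketched as follows. The helper names (`describe_command`, `git_version`) are illustrative, not Spack's actual functions; the sketch just shows `--match 'v*'` filtering out `releases/latest` and stderr being routed to `os.devnull` so shallow clones stay quiet:

```python
import os
import subprocess


def describe_command(repo_dir="."):
    # --match 'v*' restricts git describe to v-prefixed release tags, so
    # a moving tag like releases/latest no longer masks the real version.
    return ["git", "-C", repo_dir, "describe", "--tags", "--match", "v*"]


def git_version(repo_dir="."):
    with open(os.devnull, "w") as devnull:  # hide shallow-clone errors
        try:
            out = subprocess.check_output(describe_command(repo_dir),
                                          stderr=devnull)
            return out.decode().strip()
        except (subprocess.CalledProcessError, OSError):
            return None  # caller falls back to the hard-coded version


print(describe_command())
```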
Robert Mijakovic
aee95fe9a9 GPI-2 new package (#17875)
* update version: intel packages daal, ipp, mkl-dnn, mkl, mpi, parallel-studio, pin, tbb; make the url parameter consistent and always use single quotes.

* Fixes a typo with one of the sha256 checksum..

* Adds version entries for new versions of Intel packages.

* Adds hashes for new versions of Intel packages.

* Adds missing hash of Intel compiler.

* Adds the newest version of Intel MPI 2019.8.

* Fixes hash for intel-parallel-studio and intel-tbb.

* Fixes version number of Intel MPI.

* Adds GPI-2 package.

* Fixes flake8 noticed issues.

* Second try to fix flake8 comment

* Fixes some issues adamjstewart noticed.

* Fixes package according to flake8 complains.

* Fixes flake8 issue.

* Renames next version to master and removes master.

* Adds maintainer into gpi-2 and returns master branch for the git
repository.

Co-authored-by: Robert Mijakovic <robert.mijakovic@lrz.de>
2020-08-05 16:38:23 -05:00
carlabguillen
cc0a1283c4 Variant with fortran for likwid package (#17889)
* Option to build likwid with fortran interface

* Removing white spaces

* Flake8 conform
2020-08-05 16:31:49 -05:00
h-denpo
a5914cecb4 add depends_on('zlib', type='link') (#17887) 2020-08-05 16:24:49 -05:00
darmac
dc321abb72 bcache:add pkg-config to find blkid.h in linux-utils (#17888) 2020-08-05 16:24:14 -05:00
Greg Becker
acfa2ea018 remove hypre variant from mfem and all references to it (#17885) 2020-08-05 09:19:22 -07:00
darmac
afaae70855 nodejs: add version 8.11.4 (#17824)
* nodejs: add version 8.11.4

* node-js: refine configure() for different version
2020-08-05 10:12:14 -05:00
Ronak Buch
e64e600fbb charmpp: added v6.10.2 (#17886) 2020-08-05 15:03:51 +02:00
Jen Herting
96f5075eb1 [py-smart-open] fixing dependencies (#17640)
* version 1.8.4 requires py-boto at 2.3.2:
* google-cloud-storage only in newer versions
2020-08-04 16:44:30 -07:00
Jen Herting
118948cb0a [bowtie] added version 1.3.0. Patch fixed for new version (#17744) 2020-08-04 14:52:51 -07:00
Tim Haines
f9ee76a817 Dyninst: 10.2 release (#17847)
* Dyninst: 10.2 release

* Use 'elf' instead of 'elfutils'

* Use v10.2.0 tag

* Change minimum elfutils to 0.173

* Move STERILE_BUILD option to correct cmake_args

* make a sacrifice to the flake8 gods

* Add maintainer

* Revert to using elf@1 for elfutils
2020-08-04 13:48:08 -05:00
Brian Van Essen
54dc871524 Renamed the aluminum variant for the intra-node RMA functions. (#17861) 2020-08-04 12:52:59 -05:00
Ethan Stam
b3dd90b95c ParaView: Allow all ParaView versions to depend on Python 2 (#17484)
* Allow all ParaView versions to depend on Python 2

* Keep conflict for 5.9 and up with python 2

* Fix line too long

* Don't use backslash

* Try fixing indent

* Clean logic for python cmake flags

* Try fixing indent
2020-08-04 12:51:49 -05:00
Toyohisa Kameyama
d77f388a0d yorick: avoid hang to fputest on aarch64. (#17865) 2020-08-04 12:27:27 -05:00
Simon Frasch
cb676eab0f Added new package: spla (#17868) 2020-08-04 12:24:07 -05:00
Robert Underwood
4eb3558d20 Prefer dynamic linking for Python in vim when +python (#17870)
Previously the python package for vim used static linking, and depending
on what system libraries were available and linked against could cause
symbol conflicts for python leading to segfaults in loading c modules in
the standard library (i.e. heapq).  This patch address this issue by
dynamically linking them.
2020-08-04 12:22:53 -05:00
Robert Underwood
b35b950ee2 Add openssh runtime dependency to git (#17872)
If you use git to clone a repository over ssh, git transfers control to
the ssh binary available on your path; if that ssh binary was built with
a contradictory version of openssl/kerberos, then your git commands will
fail.
2020-08-04 12:21:55 -05:00
Adam J. Stewart
91671be7cc py-keras-preprocessing: add new version (#17735) 2020-08-04 12:12:15 -05:00
t-nojiri
8977f7377a abyss 2.1.4: fails to build with GCC 8 (#17614)
* abyss 2.1.4: fails to build with GCC 8

* abyss 2.1.4: fails to build with GCC 8

* abyss 2.1.4: Revise the points indicated by the review.
2020-08-04 12:11:52 -05:00
vvolkl
9d2b60ac0c [delphes] pythia8 variant and cleanup (#17664)
* [delphes] new version

* [delphes] pythia8 variant

* [delphes] flake8
2020-08-04 12:11:13 -05:00
Zicklag
b85cc363c1 Add Espanso Package and its xdotool Dependency (#17586) 2020-08-04 12:06:46 -05:00
Simon Pintarelli
c3a38e0b14 sirius (new versions, fixes), q-e-sirius (new package), nlcglib (new package) (#17844)
* sirius, update versions, fixes, add missing options

- sirius/spfft: depend on fftw-api
- cleanup +shared option
- sirius add option for memory pool
- sirius add version 6.5.3 and 6.5.4
- sirius: add spfft dependency for @master, @develop

* add nlcglib package

Robust wave function optimization for SIRIUS.

* add q-e-sirius package

based on q-e package

* Update var/spack/repos/builtin/packages/q-e-sirius/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* nlcglib: pass nvcc_wrapper to cmake

* Add 6.5.6

* Make flake8 happy

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2020-08-04 12:03:04 -05:00
Mark W. Krentel
3a02d1a847 hpctoolkit: add v2020.08.03 (#17860)
Add version 2020.08.03.  Adjust the cuda args.  The --with-cupti arg
was redundant, even for old versions of hpctoolkit.
2020-08-04 13:39:47 +02:00
Enrico Usai
bd0fb35ff0 AWS ParallelCluster: added v2.8.1 (#17866) 2020-08-04 13:25:27 +02:00
Brian Van Essen
c203898663 Added versions for cuDNN 8.0.2. (#17862) 2020-08-04 13:05:29 +02:00
t-nojiri
23f61ae2b0 subread: extend support for aarch64 to v2.0.0 (#17864) 2020-08-04 13:01:39 +02:00
g-mathias
9c8d4be569 new plumed versions 2.5.5 and 2.6.1; bumped default to 2.5.5 (#17850)
Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2020-08-03 23:15:57 -05:00
albestro
7154351860 HPX: fix wrong method name and use define/define_from_variant methods (#17851)
* bug fix: wrong method name

* refactoring using define_from_variant and define

* flake8 style fix

* revert change string format
2020-08-03 23:15:23 -05:00
Harmen Stoppels
9318029b63 Bump cmake (#17852) 2020-08-03 23:12:07 -05:00
Harmen Stoppels
c07102ac9f Fix typo: yaml -> json (#17854) 2020-08-03 23:11:41 -05:00
mic84
1c4b6bad43 amrex:: new version 20.08 (#17856) 2020-08-03 23:08:42 -05:00
darmac
8e1e3ac8c3 canu: fix depends issue & using java instead of jdk (#17599)
* canu: fix depends issue & using java instead of jdk

* Update var/spack/repos/builtin/packages/canu/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-03 23:05:37 -05:00
darmac
a7f1565efb Add new package: ycsb (#17788)
* Add new package: ycsb

* refine mongodb-async-driver jar path
2020-08-03 23:04:48 -05:00
darmac
1884822b56 Add new package: slider (#17799)
* Add new package: slider

* refine version check
2020-08-03 23:04:09 -05:00
darmac
0cc2377de6 Add new package: giraph (#17790)
* Add new package: giraph

* refine version check
2020-08-03 23:03:25 -05:00
Claire Guilbaud
ce7aefbb4f Packages/py colorspacious (#17623)
* typo error correction

* Adding recipe for `colorspacious` (a python package)

* Copyright year changed

* revert last commit on basic_usage.rst

* better with a good description

* fix according to failed test

* Update var/spack/repos/builtin/packages/py-colorspacious/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-colorspacious/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-03 23:01:03 -05:00
Massimiliano Culpo
b1133fab22 MacOS nightly builds: use Python 3.7 in CI
Nightly builds with MacOS started failing again
due to an upgrade of the default virtual environment
that now uses Python 3.8

This makes us hit #14102 and every build fails. This
commit should be reverted along with the fix to #14102.
2020-08-03 17:23:10 -07:00
Harmen Stoppels
827ca72c26 Fix docs about containers on cray (#17431)
* For detecting Cray: CRAYPE_VERSION is not used, but MODULEPATH

* Fix typo and write Cray with a capital
2020-08-03 16:16:18 -07:00
Adam J. Stewart
a67a0e3181 py-astropy: add version 4.0.1.post1, update header finding (#17838)
* Add install tests
* Add pkgconfig dependency to find dependency headers (specifically
  wcslib)
2020-08-03 11:51:29 -07:00
Adam J. Stewart
8b50433cd7 ERFA (package): add version 1.7.0 (#17837) 2020-08-03 11:40:50 -07:00
Paul
4c97a0ea1c Added Go 1.14.6 and 1.13.14 (#17574) 2020-08-03 09:42:05 -05:00
ketsubouchi
a480507a92 python: RPATH on fj (#17783)
* python: RPATH on fj

* python: patch _is_gcc
2020-08-02 22:53:14 -05:00
Francesco Di Natale
ac433134e5 Updates to jsonschema to include newer versions. (#17613)
* Additional versions of py-jsonschema.

* Tweak to force Maestro to use jsonschema@3.2.0:

* Correction of whitespace (flake8 error).

* Merges importlib's Python version conditions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-02 19:42:56 -05:00
Harmen Stoppels
e199f3f245 Add new versions of spfft (#17841)
* Add new versions of spfft

* Extend CudaPackage and use virtual fftw package

Co-authored-by: Simon Pintarelli <simon.pintarelli@cscs.ch>

* Add CUDA 11 compatibility note

* Depend on older cuda <= 10 for spfft <= 0.9.11

Co-authored-by: Simon Pintarelli <simon.pintarelli@cscs.ch>
2020-08-02 19:42:07 -05:00
René Widera
d560d83b76 fix isaac-server dependency (#17569)
isaac-server cannot find jansson if jansson 2.10+ is used.
2020-08-02 19:41:18 -05:00
Jon Rood
041fcbfa59 Gnuplot also depends on libsm with +wx. (#17575) 2020-08-02 16:46:42 -05:00
darmac
53b7f381c7 krb5: fix url parse and update versions (#17581) 2020-08-02 16:46:04 -05:00
darmac
044b9d3e85 mariadb: add depends package krb5 (#17583) 2020-08-02 16:45:05 -05:00
Harsh Bhatia
b25055c7e8 new package: ibm databroker (#17591) 2020-08-02 16:40:44 -05:00
Patrick Gartung
33116d730d Buildcache create: change NoOverwriteException back to a warning as in v0.14 (#17832)
* Change buildcache create `NoOverwriteException` back to a warning.
2020-08-02 13:52:15 -07:00
Patrick Gartung
f29dd48101 Add bindist tests for macOS. 2020-08-02 13:51:14 -07:00
albestro
ef3338a49b Improve HPX package management of coroutines implementation (#17654)
* introduce logic for boost+context dependency and generic_context variant

* fix OTF2 instrumentation minor problem

* default coroutine impl depends on platform

* fix flake8

* add reference to ~generic_coroutines conflict info

* Update var/spack/repos/builtin/packages/hpx/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-02 09:56:50 -05:00
Rémi Lacroix
8e6fe883eb Octopus: Add support for version 10.0. (#17782)
* Octopus: Add support for version 10.0.

Fix compilation when using the MKL as a provider for BLAS/LAPACK. Octopus will now detect that the MKL also provides the FFTW API and will refuse to compile when both the FFTW library and the MKL are given to the configure script.

* Octopus: Add supported version range for libxc.
2020-08-02 09:55:08 -05:00
Todd Gamblin
b94a837760 berkeley-db: add version 18.1.40, update build options in package (#17839)
* berkeley-db: add version 18.1.40, update build options in package

* combine adamjstewart's changes

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-01 19:33:35 -05:00
Wouter Deconinck
7e6a77d907 New package: kassiopeia (#17742)
* [kassiopeia] New package

* [kassiopeia] Remove master branch, update dependencies

* Update var/spack/repos/builtin/packages/kassiopeia/package.py

Unable to test since I do not have a license to intel-parallel-studio, but I see no reason why it would not work.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* [kassiopeia] depends_on mpi

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* [kassiopeia] cmake_args with self.spec.satisfies and elses

* [kassiopeia] args.extend -> args.append

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-01 17:40:21 -05:00
Simon Pintarelli
75f34126fc h5py does not correctly recognize hdf5 version on Cray (#17831)
* h5py: explicitly specify version

hdf5@1.10.5 on Cray is wrongly detected as 1.8.4.

* Update var/spack/repos/builtin/packages/py-h5py/package.py

Thanks. Also had this first, then CI was complaining about line length ...

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-08-01 16:26:41 -05:00
Mark W. Krentel
b0eb771b19 elfutils: add version 0.180 (#17835) 2020-08-01 16:26:13 -05:00
Axel Huebl
7498336f3d Hotfix: move CUDAHOSTCXX (#17826)
* Hotfix: move CUDAHOSTCXX

Set only in dependent packages.

* dependent compiler
2020-08-01 15:29:17 -05:00
Todd Gamblin
f3cb3a2eb8 license: fix up MIT license so it's an exact match
Before:

```console
$ licensee diff --license mit LICENSE-MIT
Comparing to MIT License:
Input Length:      1092
License length:    1020
Similarity:      92.46%
diff --git a/LICENSE b/LICENSE
index 0ce42af..be0ff1c 100644
--- a/LICENSE
+++ b/LICENSE
@@ -1,3 +1,4 @@
{+spack project developers. see the top-level copyright file for details.+}
permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "software"), to deal in
the software without restriction, including without limitation the rights to
```

After:

```console
$ licensee diff --license mit LICENSE-MIT
Comparing to MIT License:
Input Length:       1020
License length:     1020
Similarity:      100.00%
Exact match!
```

This gets us a 100% license match from GitHub's `licensee` tool.
2020-08-01 10:06:28 -07:00
darmac
3ea9b9a014 Add new package: findbugs (#17825) 2020-08-01 11:46:02 -05:00
darmac
52858be668 Openfst: upgrade version and gcc constraint (#17830)
* openfst: upgrade version and gcc constraint

* refine version format
2020-08-01 11:41:12 -05:00
Todd Gamblin
ff27233e30 bugfix: fix spack buildcache list --allarch
`spack buildcache list` was trying to construct an `Arch` object and
compare it to `arch_for_spec(<spec>)` for each spec in the buildcache.
`Arch` objects are only intended to be constructed for the machine they
describe. The `ArchSpec` object (part of the `Spec`) is the descriptor
that lets us talk about architectures anywhere.

- [x] Modify `spack buildcache list` and `spack buildcache install` to
      filter with `Spec` matching instead of using `Arch`.
2020-08-01 08:36:12 -07:00
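The idea behind the fix can be sketched with plain dictionaries standing in for `Spec` objects; `satisfies` below is a toy stand-in for Spack's real constraint matching, shown only to illustrate filtering by spec matching instead of constructing `Arch` objects:

```python
def satisfies(spec, constraint):
    # toy stand-in for Spec.satisfies: every field named in the
    # constraint must match the spec's value for that field
    return all(spec.get(key) == value for key, value in constraint.items())


def filter_buildcache(specs, constraint):
    # the fix in miniature: select buildcache entries by spec matching,
    # not by building Arch objects for machines other than the current one
    return [s for s in specs if satisfies(s, constraint)]
```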
Todd Gamblin
0c48f0a15d architecture: make it easier to get a Spec for the default arch
- [x] Make it easier to get a `Spec` with a proper `ArchSpec` from an
      `Arch` object via new `Arch.to_spec()` method.

- [x] Pull `spack.architecture.default_arch()` out of
      `spack.architecture.sys_type()` so we can get an `Arch` instead of
      a string.
2020-08-01 08:36:12 -07:00
Massimiliano Culpo
c65cde4cf8 Avoid update and upgrades to brew (#17815)
Ci is currently failing on brew update with the error:
```
Error: Cannot install bazelisk because conflicting formulae are installed.
  bazel: because Bazelisk replaces the bazel binary

Please `brew unlink bazel` before continuing.

Unlinking removes a formula's symlinks from /usr/local. You can
link the formula again after the install finishes. You can --force this
install, but the build may fail or cause obscure side effects in the
resulting software.
```
Avoiding:
```
$ brew update
$ brew upgrade
```
solves the issue by preventing the risk of conflicting formulae.
2020-08-01 07:45:39 +02:00
Brian Van Essen
5c2b34e43b cuDNN (package): add version 7.6.5.32-10.2-linux-ppc64le (#17821) 2020-07-31 17:48:03 -07:00
Harmen Stoppels
d4831181ea Add libxc 5.0.0 (#17807)
With experimental CUDA support and some patches to make it compile.

Currently +shared and +cuda conflict; this has to be fixed upstream.
2020-07-31 18:01:39 -05:00
Michael Kuhn
346977f501 meson: Add 0.55.0 (#17816) 2020-07-31 18:01:12 -05:00
Adam J. Stewart
9d942df352 py-wheel: add new version (#17819) 2020-07-31 15:58:00 -07:00
Adam J. Stewart
89759cc1f1 py-setuptools: add new version (#17818) 2020-07-31 15:57:42 -07:00
Adam J. Stewart
4c35cc1b20 py-pip: add new version (#17817) 2020-07-31 15:57:27 -07:00
Massimiliano Culpo
9dbad500bc Move Python 2.6 unit tests to Github Actions (#17279)
* Run Python2.6 unit tests on Github Actions
* Skip url tests on Python 2.6 to reduce waiting times
* Skip foreground background tests on Python 2.6 to reduce waiting times
* Removed references to Travis in the documentation
* Deleted install_patchelf.sh (can be installed from repo on CentOS 6)
2020-07-31 15:01:12 -07:00
Adam J. Stewart
4aaa39d091 py-torchvision: add v0.7.0 2020-07-31 14:34:55 -07:00
Adam J. Stewart
58d383877d PyTorch: add v1.6.0 2020-07-31 14:32:56 -07:00
Adam J. Stewart
198ccfae4e Pandas: add v1.1.0 2020-07-31 14:20:17 -07:00
darmac
c2cd728e03 cp2k: support for aarch64 (#17786) 2020-07-31 16:17:31 -05:00
Brian Van Essen
46e7fbe120 LBANN: add versions, update CUDA support and dependencies (#17813)
* Update LBANN, Hydrogen, Aluminum to inherit CudaPackage
* Update CMake constraints: LBANN, Hydrogen, and Aluminum now require
  cmake@3.16.0: (better support for pthreads with nvcc)
* Aluminum: add variants for host-enabled MPI and RMA features in a
  MPI-GPU RDMA-enabled library
* NCCL: add versions 2.7.5-1, 2.7.6-1, and 2.7.8-1
* Hydrogen: add version 1.4.0
* LBANN: add versions 0.99 and 0.100
* Aluminum: add versions 0.4.0 and 0.5.0
2020-07-31 13:53:51 -07:00
Andrew W Elble
d89bc1d998 new package(s): py-gql (#17769)
* new package(s): py-gql

and related dependencies:
py-aiohttp
py-async-timeout
py-graphql-core
py-idna-ssl
py-multidict
py-websockets
py-yarl

new versions:
py-requests

* fixes

Co-authored-by: Andrew W Elble <aweits@skl-a-00.rc.rit.edu>
2020-07-31 14:54:25 -05:00
Rémi Lacroix
0303ef72ac libxc: Add version 4.3.4. (#17781) 2020-07-31 14:48:48 -05:00
darmac
e938e439b9 Add new package: mongodb-async-driver (#17787) 2020-07-31 14:45:35 -05:00
darmac
8df17d3830 open-iscsi: refine runtime environment (#17789) 2020-07-31 14:43:40 -05:00
Xavier Delaruelle
c7e83a8375 environment-modules: add version 4.5.2 (#17795) 2020-07-31 14:42:08 -05:00
Andrew Gaspar
c436414352 Add Rust versions 1.45.1 and 1.44.1 (#17812) 2020-07-31 14:30:00 -05:00
darmac
b7e0fec5f0 Alluxio (package): update url, add versions (#17805)
* Add versions 2.2.1 and 2.2.0
* Remove version 2.1.0
* Add dependency on Java
2020-07-31 11:25:22 -07:00
Justin S
cab2af9a71 py-mixedhtseq: new package at 0.1.0 (#17702)
* py-mixedhtseq: new package at 0.1.0

* py-mixedhtseq: flake8 fixes
2020-07-31 11:58:53 -05:00
Justin S
da5bbe3cef py-gpy: add 0.8.8 (#17548)
* py-gpy: add 0.8.8

* py-gpy: remove unneeded dep

* py-gpy: make cython build-only
2020-07-31 11:55:49 -05:00
Edoardo Aprà
154870da0b NWChem 7.0.0 (#17779)
* NWChem 7.0.0

* add python2 for 6.8.1. removed 6.8 https://github.com/spack/spack/pull/17779#discussion_r462700413

* nwchem 6.8.1 breaks with gcc 10 and later

* restored extra python bits for version 6.8.1. add env. definition of basis libraries

* changes for flake8

* url fixed

* prevent 6.8.1 being compiled with gcc 10
2020-07-31 10:45:27 -05:00
Andrew W Elble
edba70557d new package(s): py-torch-geometric (#17768)
* new package(s): py-torch-geometric
(with related dependencies: py-rdflib, py-googledrivedownloader)

* fixes
2020-07-31 10:41:24 -05:00
Claire Guilbaud
fd15fc9c70 sphinx copybutton: new package at v0.2.12 (#17632) 2020-07-31 11:41:56 +02:00
Claire Guilbaud
41f743b45d recommonmark: new package at v0.6.0 (#17629) 2020-07-31 11:39:53 +02:00
Rémi Lacroix
ccb316964d Improve Ferret package (#17620)
* Ferret: Add missing dependency with curl.

* Ferret: Don't force using the static version of libgfortran.

* Ferret: Ensure Spack's compiler wrappers are used.

This allows properly setting the rpaths.

* Ferret: Add support for versions 7.3 to 7.6.

* Ferret: Add a variant to install Ferret standard datasets.

* Ferret: Define some useful runtime environment variables.

* Ferret: Fix flake8.

Also add myself as a maintainer as suggested by @alalazo.
2020-07-30 22:23:49 -05:00
Tom Payerle
58911d8248 kahip: Fix issue #17638 (make SConstruct files python3 friendly) (#17642)
As discussed in issue #17638, wherein kahip fails to build when
scons is dependent on python@3.

This converts the print statements in various SConstruct files
into python3 friendly print functions.

I found most of the affected SConstruct files in both @2.00 and
the later versions I found on web, but some files were only in @2.00.
I split the patches into two files for that reason, but have not
tried the later versions.
2020-07-30 22:22:40 -05:00
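The conversion the patches perform can be sketched as a small text transform; this is a rough illustration of the intent (wrapping bare python2 print statements in parentheses), not the actual patch, and it only handles simple single-line statements:

```python
import re


def py3ify_prints(text):
    # wrap "print expr" in parentheses so the SConstruct file also
    # parses under python3; the negative lookahead skips lines that
    # already use the print function
    return re.sub(r'^(\s*)print\s+(?!\()(.+)$', r'\1print(\2)',
                  text, flags=re.MULTILINE)
```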
vvolkl
f09656e3f8 New Versions: dd4hep, podio (#17659)
* [dd4hep] add new patch version

* [podio] add new version and update env vars

* [dd4hep] add hepmc3 variant
2020-07-30 22:21:02 -05:00
Rémi Lacroix
d609a6dde7 Update LAMMPS package (#17715)
* LAMMPS: Use LATTE 1.2.2 starting with version 20200602.

Versions 20200602 and later require Latte 1.2.2. This caused the internal Latte distribution to be used instead of the Latte install provided by Spack.

* LAMMPS: Add new versions 20200630 and 20200721.
2020-07-30 22:19:59 -05:00
ketsubouchi
45a67fa0f3 dcmtk: fixed type error (#17758)
* dcmtk: fixed type error

* Update var/spack/repos/builtin/packages/dcmtk/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-30 22:18:55 -05:00
Francine Lapid
32b070a76b New package: IDL (#17451)
* New package: IDL

* Update var/spack/repos/builtin/packages/idl/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/idl/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* added license header and changed url_for_version to just url

* removed unused imports, addressed comments

* removed trailing whitespace on line 14

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-30 22:17:35 -05:00
darmac
9c7c4a739f mozjs@1.8.5: fix compile issue (#17594)
* mozjs@1.8.5: fix compile issue

* mozjs: refine method
2020-07-30 22:15:53 -05:00
Tomoki, Karatsu
50f96e15de vtk: Support for new option to enable MPI. (#17727) 2020-07-30 22:05:20 -05:00
Daryl W. Grunau
7dd5793b75 backport Mesa MR#6053 to prevent multiply-defined symbols (#17720)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2020-07-30 22:04:24 -05:00
G-Ragghianti
b86e620743 Fixing problems caused by a confused spack concretizer (#17797) 2020-07-30 22:01:48 -05:00
Harmen Stoppels
6fb8946dd5 Add variants to squashfs for different compression algorithms (#17755) 2020-07-30 22:01:13 -05:00
Lucas Frérot
cfbcf719db New package: py-uvw (#17719)
* py-uvw: added package for versions 0.0.7 and 0.3.1

* py-uvw: added py-setuptools as dependency
2020-07-30 21:52:17 -05:00
Dr Owain Kenway
8b515f3ba0 flang: make sure to find libstdc++ if needed (#17480) 2020-07-30 14:55:16 +02:00
Adam J. Stewart
d512537417 Python: added v3.8.4, v3.8.5, v3.7.7, v3.7.8, v3.6.11 (#17775)
Also added older version in the 3.6 and 3.5 series
2020-07-30 14:49:01 +02:00
G-Ragghianti
2f5e4e1664 MAGMA isn't compatible with CUDA 11 (#17753)
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-30 11:24:21 +02:00
G-Ragghianti
7b051df83f slate package: resolve issues with cuda version and fortran compiler name (#17759)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-30 11:21:49 +02:00
darmac
b96448269a couchdb: new package at v3.1.0 (#17595) 2020-07-30 11:17:19 +02:00
Wouter Deconinck
b24a9a383a lhapdf5: new package at v5.9.1 (#17746)
During configure, lhapdf5 searches for python. On one system
I tested (Ubuntu 19.10) it finds a system-installed python3
and fails to create the python extension.

The variant is named to make explicit that this is only a python2 extension.
2020-07-30 11:11:17 +02:00
Rémi Lacroix
cd1647af66 latte: added v1.2.2 and master (#17714)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-30 10:19:59 +02:00
darmac
d5fc3f267c zipkin: new package at v2.21.5 (#17780) 2020-07-30 10:03:52 +02:00
Glenn Johnson
f0ce129c43 New package: r-hh (#17667)
* New package: r-hh

* Add short description

Add the project one liner for a short description that will be used for
`module whatis`.
2020-07-29 22:25:01 -05:00
ilbiondo
d61f362211 Updated iq-tree package (#17690) 2020-07-29 22:20:44 -05:00
Robert Mijakovic
8ea597c79a Intel packages: new versions (#17692)
* Updates versions for the Intel packages daal, ipp, mkl-dnn, mkl, mpi, parallel-studio, pin and tbb; makes the url parameter consistent and always uses single quotes.

* Fixes a typo in one of the sha256 checksums.

* Adds version entries for new versions of Intel packages.

* Adds hashes for new versions of Intel packages.

* Adds missing hash of Intel compiler.

* Adds the newest version of Intel MPI 2019.8.

* Fixes hash for intel-parallel-studio and intel-tbb.

* Fixes version number of Intel MPI.

Co-authored-by: Robert Mijakovic <robert.mijakovic@lrz.de>
2020-07-29 22:18:32 -05:00
Rémi Lacroix
e7fbd6c53e CMake: Fix compilation with Intel compilers on some systems. (#17693)
Systems with older GNU compilers were not affected.

This commit fixes #15901 and fixes #17605.
2020-07-29 22:17:32 -05:00
Tiziano Müller
8e49cac433 pgi: update to 20.4 (#17696) 2020-07-29 22:16:30 -05:00
Shayna Kapadia
867e64cb4f libmodbus: new package (#17778)
* libmodbus:adding new package

* fixing testing failures

* fixing flake 8 errors

* fixing flake 8 errors

Co-authored-by: Kapadia <kapadia2@llnl.gov>
2020-07-29 22:13:58 -05:00
Rao Garimella
3a8815ea7a update version to 1.1.5 (#17701)
Co-authored-by: Rao Garimella <rao@abyzou.lanl.gov>
2020-07-29 22:12:10 -05:00
Toyohisa Kameyama
d14baf4532 use libquadmath only on x86_64 and ppcle. (#17728) 2020-07-29 21:47:18 -05:00
Mark Olesen
badd11e71c openfoam package updates, scotch version update (#17731)
* openfoam: use MPI 'headers' property (fixes #17730)

* openfoam: +spdp variant, usable for OpenFOAM 1906 and later

in contrast to +float32, which uses single-precision throughout, +spdp
uses the following:

- single-precision for most internals
- double-precision for linear solver

* openfoam: add m4 as build dependency

* scotch: update to 6.0.9 released Oct 2019

Co-authored-by: Mark Olesen <Mark.Olesen@esi-group.com>
2020-07-29 21:45:40 -05:00
Valentin Clement (バレンタイン クレメン)
42dec9eb12 CLAW: Add version 2.0.2 (#17733) 2020-07-29 21:43:14 -05:00
Wouter Deconinck
6b3f9c5d60 New package: vgm (#17741)
* [vgm] new package Virtual Geometry Model (VGM)

* [vgm] Updated description
2020-07-29 21:04:37 -05:00
Wouter Deconinck
ba5aa67303 New package: geant4-vmc (#17745)
* [geant4-vmc] New package

* [geant4-vmc] aligned spacing
2020-07-29 20:44:16 -05:00
David Böhme
21ca7abf8d Add Caliper v2.4.0 (#17750)
* Add Caliper v2.4.0

* Use built-in gotcha
2020-07-29 20:41:41 -05:00
Julien Loiseau
efb456cb0a Adding pic support for Kokkos (#17751)
* Adding pic support for Kokkos

* Update pic for kokkos
2020-07-29 20:40:12 -05:00
h-denpo
4e12dc3303 elang add depends_on('ncurses', type='link') (#17761) 2020-07-29 20:30:04 -05:00
Mark W. Krentel
896e83e3e6 libunwind: add +pic variant (#17762)
Libunwind already builds a shared library.  The +pic variant adds the
compiler pic flag to the static archive so that it can be linked into
another shared library.
2020-07-29 20:29:24 -05:00
ketsubouchi
8435016a43 eagle: fix CC=gcc and delete march=native (#17763) 2020-07-29 20:28:31 -05:00
Justin S
416a929f7f spades: add 3.14.1 (#17776) 2020-07-29 20:03:58 -05:00
Robert Pavel
af778aac0a Tweak to EOSPAC for gcc@10 Support (#17777)
EOSPAC's build breaks on gcc@10 due to its dependence on -fcommon
behavior and GNU changing the default to -fno-common. Added a conditional
argument to support bleeding-edge compilers.
2020-07-29 20:03:12 -05:00
tilne
5a5f2c00a8 update URL and sha256 for aws-parallelcluster 2.8.0 (#17685)
Signed-off-by: Tim Lane <tilne@amazon.com>
2020-07-29 19:35:43 -05:00
Jen Herting
5243f97c3b [bowtie2] added version 2.4.1 (#17748) 2020-07-29 15:15:33 -07:00
Rémi Lacroix
5a42883528 Boost: Update conflicts for version 1.73.0. (#17774)
Variant "+mpi+python cxxstd=98" is fixed in 1.73.0.
2020-07-29 12:08:20 -07:00
ketsubouchi
1827db2859 express: add cast for %fj (#17764) 2020-07-29 11:53:39 -07:00
Toyohisa Kameyama
5b12c0f4a0 clamav: Add curl dependency. (#17765) 2020-07-29 11:48:07 -07:00
darmac
a0e6145884 Add new package: solr (#17597)
* Add new package: solr

* refine version order
2020-07-29 11:38:44 -07:00
Massimiliano Culpo
e47e972cf2 zlib: style changes to check if set of changed files is computed correctly 2020-07-29 11:23:34 -07:00
Massimiliano Culpo
3e1661a183 Use "fetch-depth: 0" to retrieve all history from remote 2020-07-29 11:23:34 -07:00
Massimiliano Culpo
c4f29c6384 Simplified YAML files for Github Actions workflows
Updated actions where needed
2020-07-29 11:23:34 -07:00
Massimiliano Culpo
1f7f076189 Group tests with similar duration together
Style and documentation tests take just a few minutes
to run. Since in Github actions one can't restart a single
job but needs to restart an entire workflow, here we group
tests with similar duration together.
2020-07-29 11:23:34 -07:00
Matthias Wolf
90648bb477 qt: fix build with ~ssl. (#17767)
OpenSSL was pulled from the spec too early, leading to failures when
attempting to build with ~ssl.
2020-07-29 10:53:01 -07:00
Andrew W Elble
d1494fe8da perl: add missing berkeley-db dependency (#17771) 2020-07-29 10:46:24 -07:00
Massimiliano Culpo
cad21d6eb1 lmod: change variant defaults to match Lmod's defaults (#17770) 2020-07-29 10:35:46 -07:00
mic84
a212bb0577 Amrvis: update branch name (#17718) 2020-07-28 09:12:00 -07:00
Todd Gamblin
f24dd29cd2 Merge tag 'v0.15.3' into develop 2020-07-28 02:18:30 -07:00
Todd Gamblin
0f25462ea6 update CHANGELOG.md for 0.15.3 2020-07-28 02:11:06 -07:00
Todd Gamblin
ae4bbbd241 bump version number for 0.15.3 2020-07-28 02:05:26 -07:00
Greg Becker
24bd9e3039 bugfix: allow relative view paths (#17721)
Relative paths in views have been broken since #17608 or earlier.

- [x] Fix by passing base path of the environment into the `ViewDescriptor`.
      Relative paths are calculated from this path.
2020-07-27 23:48:59 -07:00
Greg Becker
158ee6ac25 bugfix: allow relative view paths (#17721)
Relative paths in views have been broken since #17608 or earlier.

- [x] Fix by passing base path of the environment into the `ViewDescriptor`.
      Relative paths are calculated from this path.
2020-07-27 23:44:56 -07:00
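The gist of the fix, resolving a view path relative to the environment's base path, can be sketched with a hypothetical helper (not Spack's actual `ViewDescriptor` code):

```python
import os


def resolve_view_path(raw_path, env_base):
    # a relative view path is interpreted relative to the environment's
    # base directory; absolute paths pass through unchanged
    if os.path.isabs(raw_path):
        return raw_path
    return os.path.normpath(os.path.join(env_base, raw_path))
```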
Todd Gamblin
cefb4ba014 tutorial: Add boto3 installation to setup script (#17722) 2020-07-27 16:55:33 -07:00
Todd Gamblin
0efb8ef412 tutorial: Add boto3 installation to setup script 2020-07-27 15:55:44 -07:00
Patrick Gartung
7c61d6d45f Update darshan package with mpi variant (#17717)
* Update darshan package with nompi variant.

* Change variant to mpi and default to True
2020-07-27 16:30:29 -05:00
Glenn Johnson
ff529e6dc1 r-adespatial: added new package (#17700)
This PR adds the r-adespatial package and several other new packages as
dependencies.

- r-adegraphics
- r-adephylo
- r-phylobase
- r-rncl
- r-rnexml
2020-07-27 22:08:48 +02:00
Glenn Johnson
1c0abaa6eb r-dss: added new package at v2.36.0 with dependencies (#17661)
This PR adds the r-dss package and the r-bsseq package, also new, as a
dependency. This includes the latest versions, which required updates to
the following dependencies:

- r-biocgenerics
- r-iranges
- r-s4vectors
- r-summarizedexperiment

Older versions of r-dss and r-bsseq are included as well to ensure
compatibility with older versions of the above dependencies.
2020-07-27 20:51:27 +02:00
Patrick Gartung
69775fcc07 Relocation of sbang needs to be done when the spack prefix changes even if the install tree has not changed. (#17455) 2020-07-27 11:38:48 -07:00
Patrick Gartung
ce772420dd Relocate rpaths for all binaries, then do text bin replacement if the rpaths still exist after running patchelf/otool (#17418) 2020-07-27 11:28:50 -07:00
Seth R. Johnson
6c2749536e qt: fixed build with apple-clang (#17706) 2020-07-27 18:32:29 +02:00
Adam J. Stewart
4e4b8d8249 SciPy: added v1.5.2 (#17708) 2020-07-27 18:26:30 +02:00
ketsubouchi
032a52e006 scons: added support to Fujitsu compilers (#17710) 2020-07-27 18:23:53 +02:00
Hadrien G
e3cc7fc38c acts: added v0.29 (#17712) 2020-07-27 18:13:56 +02:00
Amjad Kotobi
f2e66730d0 openmpi: added lustre variant to openmpi (#17478) 2020-07-27 18:11:57 +02:00
Claire Guilbaud
0ebdfb3c37 sphinxcontrib-mermaid: new package at v0.4.0 (#17630) 2020-07-27 17:30:22 +02:00
Claire Guilbaud
fdb21e3e91 json logger: new package at v0.1.11 (#17628) 2020-07-27 16:32:56 +02:00
Claire Guilbaud
d66d430ab5 pygments pytest: new package at v1.2.0 (#17626) 2020-07-27 15:44:26 +02:00
Claire Guilbaud
bbfc9fd448 commonmark: new package at v0.9.0 (#17624) 2020-07-27 15:02:56 +02:00
Claire Guilbaud
e960e016af hieroglyph: new package at v1.0.0 (#17625) 2020-07-27 14:34:40 +02:00
Claire Guilbaud
074c0d622f python docs theme: new package at v2020.1 (#17627) 2020-07-27 12:53:04 +02:00
Claire Guilbaud
458a9f22da yolk3k: new package at v0.9 (#17635) 2020-07-27 12:48:58 +02:00
Claire Guilbaud
38730c6e68 sphinxcontrib trio: new package at v1.1.2 (#17631) 2020-07-27 12:14:12 +02:00
Claire Guilbaud
21a4edb1f3 sphinx gallery: new package at v0.7.0 (#17633) 2020-07-27 11:21:48 +02:00
Greg Becker
9cc01dc574 add tutorial setup script to share/spack (#17705)
* add tutorial setup script to share/spack

* Add check for Ubuntu 18, fix xvda check, fix apt-get errors
  - now works on t2.micro, t2.small, and m instances
  - apt-get needs retries around it to work

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2020-07-27 01:18:16 -07:00
Greg Becker
1ceec31422 add tutorial setup script to share/spack (#17705)
* add tutorial setup script to share/spack

* Add check for Ubuntu 18, fix xvda check, fix apt-get errors
  - now works on t2.micro, t2.small, and m instances
  - apt-get needs retries around it to work

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2020-07-27 01:17:58 -07:00
Todd Gamblin
8d8cf6201b bugfix: don't redundantly print ChildErrors (#17709)
A bug was introduced in #13100 where ChildErrors would be redundantly
printed when raised during a build. We should eventually revisit error
handling in builds and figure out what the right separation of
responsibilities is for distributed builds, but for now just skip
printing.

- [x] SpackErrors were designed to be printed by the forked process, not
      by the parent, so check if they've already been printed.
- [x] update tests
2020-07-26 22:43:10 -07:00
Todd Gamblin
d351946194 bugfix: don't redundantly print ChildErrors (#17709)
A bug was introduced in #13100 where ChildErrors would be redundantly
printed when raised during a build. We should eventually revisit error
handling in builds and figure out what the right separation of
responsibilities is for distributed builds, but for now just skip
printing.

- [x] SpackErrors were designed to be printed by the forked process, not
      by the parent, so check if they've already been printed.
- [x] update tests
2020-07-26 22:41:55 -07:00
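The "check if they've already been printed" pattern can be sketched as follows; this is a minimal illustration of the idea, not Spack's actual `SpackError` class:

```python
class ChildError(Exception):
    # the forked build process prints the error itself and marks it
    # printed, so the parent can skip reprinting it
    def __init__(self, message):
        super().__init__(message)
        self.printed = False

    def print_once(self, out):
        if not self.printed:
            out.append(str(self))
            self.printed = True
```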
Hadrien G
907f9e8411 [root] Add version 6.22 (#17459)
* Add ROOT v6.22

* Hello xext my old friend...
2020-07-26 10:08:53 -05:00
Patrick Gartung
66d59a90ed Rename sas static-analysis-package (#17695) 2020-07-24 17:07:01 -07:00
Nicholas Sly
5795f1d7da Add Totalview package (#17643)
* Add initial totalview package.

* Add maintainer and helpful comments/information.

Co-authored-by: sly <sly@lanl.gov>
2020-07-24 16:01:32 -07:00
t-nojiri
de6dfe3707 brltty (package): Add dependency on alsa-lib (#17616) 2020-07-24 15:56:34 -07:00
Jen Herting
148acfefcc py-gensim (package): add version 3.8.3; update dependency constraints (#17641) 2020-07-24 15:47:19 -07:00
Matthieu Dorier
b04f9e6774 MPICH (package): add optional support for argobots (#17678) 2020-07-24 15:42:00 -07:00
Christian Tacke
0e090064c4 singularity: Add version 3.6.1 2020-07-24 15:35:40 -07:00
vvolkl
be06803804 WHIZARD (package): add LCIO dependency, Openloops support (#17658)
* WHIZARD: add versions 2.8.4 and 2.8.3
* New package: LCIO
* WHIZARD: add optional dependency on LCIO
* WHIZARD: add optional dependency on Openloops
* WHIZARD: allow building with either hepmc or hepmc3 dependencies
* Openloops: set process_lib_dir in configure
* Openloops: fix reference to variant
2020-07-24 15:25:57 -07:00
Dennis Klein
0c63c94103 Relax architecture compatibility check (#15972)
* Relax architecture compatibility check
* Add test coverage for the spack.abi module
2020-07-24 10:00:55 -07:00
Andrew W Elble
99c46e8186 py-astropy: force re-cythonization of distributed .pyx files (#17567)
astropy 3.2.1 fails to build with python 3.8.3 with
errors similar to this:

astropy/stats/_stats.c:318:11: error: too many arguments to function 'PyCode_New'
PyCode_New(a, 0, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)

These are files that are generated by cython, but are included in the
tarball. Since there's apparently been an API change to PyCode_New, they will
need to be re-cythonized to compile correctly.
2020-07-24 17:46:56 +02:00
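Forcing re-cythonization amounts to removing the shipped, cython-generated `.c` files so the build regenerates them; a hedged sketch of that step (not the package's actual patch step):

```python
import glob
import os


def force_recythonize(src_root):
    # delete the cython-generated .c file next to each .pyx so the
    # build regenerates it against the current CPython API
    removed = []
    for pyx in glob.glob(os.path.join(src_root, "**", "*.pyx"), recursive=True):
        c_file = os.path.splitext(pyx)[0] + ".c"
        if os.path.exists(c_file):
            os.remove(c_file)
            removed.append(c_file)
    return removed
```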
Andrew W Elble
30d0347825 py-astropy-healpix: new package (#17568) 2020-07-24 17:39:45 +02:00
jdomke
ea50e4036e scorep: add libunwind dependency (#17580) 2020-07-24 17:30:50 +02:00
Adam J. Stewart
f73a3d35a8 spack help --spec: add compiler flags (#17584) 2020-07-24 17:27:43 +02:00
Dmitriy
02f14fd857 Initialize new_specs in Environment.remove() (#17592) 2020-07-24 17:18:40 +02:00
Harmen Stoppels
54bce00d4d Ensure that the stubs directory does not end up in the rpath (#17619) 2020-07-24 17:02:58 +02:00
Jon Rood
d3e4b14997 imagemagick: added dependency on libsm (#17577) 2020-07-24 17:00:25 +02:00
rempke
6bad79fca0 netlib-scalapack: fixed compilation with gcc 10 (#17647) 2020-07-24 16:53:32 +02:00
t-nojiri
13b3578d2f camx: change compile option for aarch64 (#17653) 2020-07-24 16:39:40 +02:00
Adam J. Stewart
08b5b56566 Qhull: add v2019.1 and v2020.1 (#17648)
* Qhull: add v2019.1 and v2020.1
* Fix compilation with Apple Clang
2020-07-24 16:16:53 +02:00
Adam J. Stewart
c1a2d66804 py-matplotlib: fix freetype and qhull dependencies (#17649) 2020-07-24 16:15:50 +02:00
Mark Olesen
c95c183bc4 openfoam: install META-INFO directory (#17673)
Co-authored-by: Mark Olesen <Mark.Olesen@esi-group.com>
2020-07-24 15:32:18 +02:00
Michael Kuhn
ed4b770e1a gcc: added v10.2.0 (#17681) 2020-07-24 15:24:12 +02:00
Federico Ficarelli
b8c3e3e16f hipsycl: switch to renamed default branch (#17689) 2020-07-24 15:19:59 +02:00
Rémi Lacroix
d9c1b62d38 CMake: added v3.18.0. (#17688) 2020-07-24 15:02:08 +02:00
ilbiondo
1aa1ddcd78 Updated elmer-fem recipe (#17687) 2020-07-24 09:47:09 +02:00
Adam J. Stewart
5fed42eae8 NumPy: add v1.19.1 2020-07-23 17:56:48 -07:00
ketsubouchi
1bbf8c0635 libgd (package): update configure to find jpeg dependency (#17655) 2020-07-23 17:55:34 -07:00
ketsubouchi
1fe07891e3 cctools (package): remove fstack-protector-all for Fujitsu compiler (#17656)
The Fujitsu C compiler does not support the "fstack-protector-all" option.
2020-07-23 17:49:37 -07:00
Tamara Dahlgren
a88675ffa9 XBraid (package): Switch to Github URL (#17670) 2020-07-23 17:27:25 -07:00
Gregory Becker
63db5499ee Merge tag 'v0.15.2' into develop 2020-07-23 16:55:22 -07:00
Chuck Atkins
547c71ad78 Revert "Add libglvnd packages/Add EGL support (#14572)" (#17682)
This reverts commit 573489db71.
2020-07-23 17:41:48 -04:00
Greg Becker
cdab4bdee0 add tutorial public key to share/spack/keys dir (#17684) 2020-07-23 14:35:25 -07:00
Greg Becker
44bc176d08 cray: detect shasta os properly (#17467)
Fixes #17299

Cray Shasta systems appear to use an unmodified Sles or other Linux operating system on the backend (like Cray "Cluster" systems and unlike Cray "XC40" systems that use CNL).

This updates the CNL version detection to properly note that this is the underlying OS instead of CNL and delegate to LinuxDistro.
2020-07-23 13:20:03 -07:00
robo-wylder
3c145b42bc environment-views: fix bug where missing recipe/repo breaks env commands (#17608)
* environment-views: fix bug where missing recipe/repo breaks env commands

When a recipe or a repo has been removed from Spack and an environment
is active, it causes the view activation to crash Spack before any
commands can be executed. Further, the error message is not at all clear
in explaining the issue.

This forces view regeneration to always start from scratch to avoid the
missing package recipes, and defaults add_view=False in main for views activated
by the `spack -e` option.

* add messages to env status and deactivate

Warn users that a view may be corrupt when deactivating an environment
or checking its status while active. Updated message for activate.

* tests for view checking

Co-authored-by: Gregory Becker <becker33@llnl.gov>
2020-07-23 11:00:58 -07:00
Peter Scheibel
ae82650174 Update fetch order to match iteration order of MirrorReference (#17572) 2020-07-23 10:58:59 -07:00
Greg Becker
e8aa737b09 util.executable.which: handle path separators like /bin/which (#17668)
* util.executable.which: handle path separators like /bin/which

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-07-23 10:54:25 -07:00
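The `/bin/which`-like behavior described here, checking a name containing a path separator directly instead of searching the path, can be sketched as (an illustration of the behavior, not Spack's actual implementation):

```python
import os


def which_sketch(name, search_dirs):
    # like /bin/which: a name with a path separator is checked directly;
    # otherwise each search directory is tried in order
    def is_exe(path):
        return os.path.isfile(path) and os.access(path, os.X_OK)

    if os.sep in name:
        return name if is_exe(name) else None
    for directory in search_dirs:
        candidate = os.path.join(directory, name)
        if is_exe(candidate):
            return candidate
    return None
```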
ilbiondo
f42394daf5 csa-c: added new package at master (#17676)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-07-23 17:07:48 +02:00
ilbiondo
10dacc2588 nn-c: added new package at v1.86.2 (#17675)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-07-23 17:06:35 +02:00
ilbiondo
5067b956f4 Shapeit4: added new package at v4.1.3 (#17674) 2020-07-23 13:22:58 +02:00
Tamara Dahlgren
605c1a76e0 Reduce output verbosity with debug levels (#17546)
* switch from bool to int debug levels

* Added debug options and changed lock logging to use more detailed values

* Limit installer and timestamp PIDs to standard debug output

* Reduced verbosity of fetch/stage/install output, changing most to debug level 1

* Combine lock log methods; change build process install to debug

* Changed binary cache install messages to extraction messages
2020-07-23 00:49:57 -07:00
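The switch from a boolean flag to integer debug levels can be sketched like this (a minimal illustration of the gating, with a hypothetical `make_debug` helper rather than Spack's actual tty module):

```python
def make_debug(debug_level, sink):
    # a message is emitted only when its level is at or below the
    # configured verbosity, so level-2 chatter (e.g. lock logging)
    # stays hidden at the standard debug level 1
    def debug(message, level=1):
        if debug_level >= level:
            sink.append(message)
    return debug
```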
Nick Robison
39cfa9630c [M4] Add missing compiler flag on Cray Compiler (#17604)
* [M4] Add missing compiler flag on Cray Compiler

The new versions of the Cray compiler are based on Clang, which means we
need to add the same LDFLAG as in other Clang environments.
Adam J. Stewart
4b33558707 Cython: add v0.29.21 (#17650) 2020-07-22 18:36:55 -07:00
Jen Herting
48e8072b26 New Package: py-boto (#17639)
Added py-boto package
2020-07-22 17:52:36 -07:00
Frédéric Simonis
37fa6fe343 precice: Add version 2.1.0 (#17644) 2020-07-22 16:49:36 -07:00
eugeneswalker
f1eec05d0e bugfix: use getattr for variation.prefix/suffix (#17669) 2020-07-22 16:15:25 -07:00
Mark W. Krentel
4c7e52adaa hpctoolkit: add version 2020.07.21 (#17645) 2020-07-22 13:53:41 -07:00
Ben Bergen
4ca6d1f0f7 Added variant to enable Legion SPY logging. (#17637) 2020-07-22 11:41:20 -06:00
Hadrien G
b8135bd205 [acts] Add version 0.28.0 (#17622)
Add acts v0.28.0
2020-07-22 09:14:51 -07:00
Justin S
dedadcd2ea hisat2 (package): add version 2.2.0, update homepage (#17600) 2020-07-21 19:29:18 -07:00
Adam J. Stewart
983aeea850 New packages: py-azure-cli and dependencies (#17585) 2020-07-21 19:21:29 -07:00
Todd Gamblin
0c44a9a504 bugfix: make compiler preferences slightly saner (#17590)
* bugfix: make compiler preferences slightly saner

This fixes two issues with the way we currently select compilers.

If multiple compilers have the same "id" (os/arch/compiler/version), we
currently prefer them by picking this one with the most supported
languages.  This can have some surprising effects:

* If you have no `gfortran` but you have `gfortran-8`, you can detect
  `clang` that has no configured C compiler -- just `f77` and `f90`. This
  happens frequently on macOS with homebrew. The bug is due to some
  kludginess about the way we detect mixed `clang`/`gfortran`.

* We can prefer suffixed versions of compilers to non-suffixed versions,
  which means we may select `clang-gpu` over `clang` at LLNL. But,
  `clang-gpu` is not actually clang, and it can break builds. We should
  prefer `clang` if it's available.

- [x] prefer compilers that have C compilers and prefer no name variation
  to variation.

* tests: add test for which()
2020-07-21 18:48:37 -07:00
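The preference rules in the commit above can be sketched as a sort key. This is a hypothetical simplification, not Spack's actual compiler-selection code: among compilers with the same id, prefer ones that actually have a C compiler configured, and prefer unsuffixed names (`clang`) over suffixed ones (`clang-gpu`).

```python
# Hypothetical sketch of the stated preference (not Spack's real code):
# candidates are plain dicts with a "name" and an optional "cc" path.
def preference_key(compiler):
    has_cc = compiler.get("cc") is not None
    suffixed = "-" in compiler["name"]
    # False sorts before True, so (not has_cc, suffixed) ranks a compiler
    # with a C compiler and no name suffix first.
    return (not has_cc, suffixed)

candidates = [
    {"name": "clang-gpu", "cc": "/opt/clang-gpu"},   # suffixed: avoid
    {"name": "clang", "cc": None},                   # no C compiler: avoid
    {"name": "clang", "cc": "/usr/bin/clang"},       # preferred
]
best = sorted(candidates, key=preference_key)[0]
```

With this key, the unsuffixed `clang` that has a working C compiler wins, matching both bullet points in the commit message.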
Hadrien G
b81339cf80 [acts] Add 0.27.x series (#17621)
Add acts v0.27 and v0.27.1
2020-07-21 10:56:18 -07:00
Harmen Stoppels
6c69b8a4d4 ci pipelines: activate environment without view (#17440) 2020-07-21 10:15:43 -07:00
Tiziano Müller
ec1237479e cp2k: make libint optional (#17618) 2020-07-21 09:13:59 -07:00
t-nojiri
40e2a41477 aegean (package): remove -m64 on aarch64 (#17615) 2020-07-20 23:57:09 -07:00
Hadrien G
f168d63586 acts: added v0.26 (#17602) 2020-07-21 08:37:21 +02:00
Nick Robison
78a84efb4b flatbuffers: added v0.12.0 (#17603) 2020-07-21 08:30:44 +02:00
Seth R. Johnson
c6891376f4 qt4: add missing libSM dependency (#17611)
See https://github.com/spack/spack/issues/15082 and
https://github.com/spack/spack/pull/16226
2020-07-21 08:12:36 +02:00
Mark W. Krentel
83b281f36b hpcviewer, ibm-java: new versions (#17612)
Add hpcviewer version 2020.07 and ibm-java 8.0.6.11.
2020-07-21 08:09:53 +02:00
Tamara Dahlgren
86ec698a33 Bugfix: Do not raise InstallError for ascent_ver (#17578) 2020-07-20 18:32:56 -07:00
Todd Gamblin
897e80e596 bugfix: ignore Apple's "gcc" by default (#17589)
Apple's gcc is really clang. We previously ignored it by default but
there was a regression in #17110.

Originally we checked for all clang versions with this, but I know of
none other than `gcc` on macos that actually do this, so limiting to
`apple-clang` should be ok.

- [x] Fix check for `apple-clang` in `gcc.py` to use version detection
  from `spack.compilers.apple_clang`
2020-07-20 18:24:18 -07:00
Massimiliano Culpo
ab32799b52 Fix MacOS build tests (#17542)
* MacOS build tests

- Run on PR that modify the YAML file of the workflow
- Don't clone Spack, since we are in the Spack repo now

* Try to add opengl to configuration to build jupyter

* fixup
2020-07-20 17:25:42 -07:00
Dr. Christian Tacke
bd236918dd Configuration: allow usage of command-line scopes with environments (#14608)
Spack did not support usage of the `--config-scope` option in
combination with an environment: In `lib/spack/spack/main.py`,
`spack.config.command_line_scopes` is set equal to any config scopes
passed by the `--config-scope` option. However, this is done after
activating an environment. In the process of activating an environment,
the `spack.config.config` singleton is instantiated, so later setting of
`spack.config.command_line_scopes` is ignored.

This commit sets command line scopes before activating an environment to
ensure that they are included in the configuration.

Co-authored-by: Tim Fuller <tjfulle@sandia.gov>
2020-07-20 13:58:06 -07:00
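The ordering bug described above is the classic lazily-built-singleton trap. A minimal illustration (hypothetical names, not Spack's real `spack.config` module): once the singleton is instantiated, scopes appended afterwards are silently ignored.

```python
# Illustrative sketch of the bug: a singleton built on first use captures
# whatever scopes exist at that moment.
class Config:
    def __init__(self, scopes):
        self.scopes = list(scopes)

command_line_scopes = []
_config = None

def get_config():
    global _config
    if _config is None:                    # built exactly once
        _config = Config(command_line_scopes)
    return _config

# Wrong order: environment activation touches the config first ...
activated = get_config()
# ... so a scope added afterwards never makes it into the singleton.
command_line_scopes.append("~/extra-scope")
```

The fix in the commit is simply to set the command-line scopes before anything (like environment activation) can trigger the first `get_config()` call.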
Justin S
3949a85f9a py-dp-gp-cluster: new package at 2019-09-22 (#17549)
* py-dp-gp-cluster: new package

* py-dp-gp-cluster: remove master, add 2019-09-22

* py-dp-gp-cluster: require python2, older gpy, sklearn

* py-dp-gp-cluster: remove cython runtime dep
2020-07-20 10:48:17 -05:00
Cyrus Harrison
bab1852340 update ascent package with recent ver dep logic, and dray support (#17502)
* update ascent package with recent ver dep logic, and dray support

* update pmt name, make babelflow logic dep on mpi
2020-07-18 08:53:48 -05:00
Jon Rood
ef814b7a32 Add texlive 2020 version (#17559)
* Add new versions of texlive and poppler.

* Add new versions of harfbuzz which also relocated source location to github.

* Update var/spack/repos/builtin/packages/harfbuzz/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Restore deleted url line in harfbuzz.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-18 08:49:32 -05:00
eugeneswalker
96fa6f0c1b allow GNUPGHOME to come from SPACK_GNUPGHOME in env, if set (#17139) 2020-07-17 13:29:30 -07:00
Amjad Kotobi
e4ba1c1daf r-devtools: add version 2.3.0; update dependencies (#17408) 2020-07-17 12:11:29 -07:00
Fabien Bruneval
bbbf0466dc libint (package): add tuning options for MOLGW (#17329) 2020-07-17 12:10:05 -07:00
Cyrus Harrison
dc18b3e3d4 New package: parallelmergetree (#17501) 2020-07-17 12:03:11 -07:00
Scott Wittenburg
b5f82696e2 Bugfix/install missing compiler from buildcache (#17536)
Ensure compilers installed from buildcache are registered.
2020-07-17 11:13:36 -07:00
Adam J. Stewart
a5aa150a98 py-matplotlib: add v3.3.0 2020-07-17 11:05:41 -07:00
Scott Wittenburg
ae03782032 buildcache: list all mirrors even if one fails 2020-07-17 10:04:05 -06:00
Amjad Kotobi
c729c6b93c subversion: added v1.14.0 and v1.13.0, added new url (#17519) 2020-07-17 16:33:46 +02:00
Cyrus Harrison
324c383d8e add devil ray package (#17495)
* add dray with mid-review changes

* remove env import since it's implicitly included
2020-07-16 19:51:41 -07:00
Francesco Di Natale
35b7a69456 Updates to maestrowf package (#17470)
* Addition of Chainmap to satisfy Maestro dependency.

* Additional versions and dependencies for Maestro.

* Updated URL to point to pypi.

* Updates to chainmap hashes.

* Updates to pull version from PyPi.

* Corrections to flake8 errors.

* Stricter restrictions on Python versioning.

Maestro actually supports Python 3.5 and later.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Only install chainmap for Python2 versions.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Removal of setuptools python cond.

* Removal of version constraints on setuptools.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-16 19:38:45 -05:00
Kelly (KT) Thompson
d69c32d7ef Version 1.13.2 still needs the XL patch. (#17561) 2020-07-16 19:35:33 -05:00
Scott Wittenburg
27aaff3dc2 adept-utils: 1.0.1 does not build w/ boost 1.73.0 or newer (#17560) 2020-07-16 19:34:47 -05:00
Seth R. Johnson
fc8847cf4e Mark old icu4c as conflicting (#17562)
GCC 4.8.5 on rhel6:
```
utext.cpp:572:5: error: 'max_align_t' in namespace 'std' does not name a
type
     std::max_align_t    extension;
     ^
utext.cpp: In function 'UText* utext_setup_67(UText*, int32_t,
UErrorCode*)':
utext.cpp:587:73: error: 'max_align_t' is not a member of 'std'
             spaceRequired = sizeof(ExtendedUText) + extraSpace -
sizeof(std::max_align_t);
                                                                         ^
utext.cpp:587:73: note: suggested alternative:
In file included from
/projects/spack/opt/spack/gcc-4.4.7/gcc/6ln2t7b/include/c++/4.8.5/cstddef:42:0,
                 from utext.cpp:19:
/projects/spack/opt/spack/gcc-4.4.7/gcc/6ln2t7b/lib/gcc/x86_64-unknown-linux-gnu/4.8.5/include/stddef.h:
425:3: note:   'max_align_t'
 } max_align_t;
   ^
utext.cpp:598:57: error: 'struct ExtendedUText' has no member named
'extension'
                 ut->pExtra    = &((ExtendedUText *)ut)->extension;
                                                         ^
   g++   ...  loadednormalizer2impl.cpp
   g++   ...  chariter.cpp
```
2020-07-16 19:34:02 -05:00
Harmen Stoppels
1fcc00df96 Fix security issue in CI (#17545)
The `spack-build-env.txt` file may contain many secrets, but the obvious one is the private signing key in `SPACK_SIGNING_KEY`. This file is nonetheless uploaded as a build artifact to gitlab. For anyone running CI on a public version of Gitlab this is a major security problem. Even for private Gitlab instances it can be very problematic.

Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2020-07-16 17:27:37 -07:00
Jon Rood
697c2183d3 Add new dependencies required in latest rsync. (#17558) 2020-07-16 19:25:22 -05:00
Nichols A. Romero
b320be70cb PySCF new package (#17474)
* Initial version of PySCF.

* Add master branch to xcfun library

* PySCF only compatible with specific commit of xcfun library

* Update var/spack/repos/builtin/packages/py-pyscf/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pyscf/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Revert "PySCF only compatible with specific commit of xcfun library"

This reverts commit 8296005400.

* Revert "Add master branch to xcfun library"

This reverts commit f2b6998931.

* Issue a conflict for the xcfun library version rather than relying on a random commit.

* Add version xcfun 2.0.0a2 which is needed by PySCF.

* Remove xcfun conflict and express dependency more explicitly. Add comment as to why this is necessary.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-16 15:58:52 -05:00
downloadico
3f24188d19 Abinit+wannier90 fix (#17417)
* wannier90: add versions 3.0.0 and 3.1.0 and 'shared variant'

Added versions 3.0.0 and 3.1.0

Added shared variant

Added url_for_version function as versions less than 3 are from the
wannier.org site and versions 3 and up are from github.com

Added the MPI libraries to the list of libs substituted into the make.sys file
in place of @LIBS

Made it possible to build a shared object version of the library for versions
< 3 by filtering the src/Makefile.2 file (based off of the patch from a src rpm
from RHEL for version 2.0.1)

Create a modules directory in the install prefix root directory and copy the
Fortran .mod files there.

Set the MPIFC variable to the Spack Fortran MPI compiler wrapper.

* abinit: added 'wannier90' variant  which enables building abinit with wannier90

Added wannier90 variant

Made abinit depend on the shared object ('shared') variant of
wannier90 if the wannier90 variant is selected

Add configure args for wannier90 libs, includes, and binaries and to
set MPIFC

set the dft-flavor to wannier90 when wannier90 is enabled and only
set the dft flavor to 'atompaw+libxc' if wannier90 is not selected

* Update var/spack/repos/builtin/packages/abinit/package.py

Co-authored-by: Greg Becker <becker33@llnl.gov>

* Update var/spack/repos/builtin/packages/wannier90/package.py

Co-authored-by: Greg Becker <becker33@llnl.gov>

* Update var/spack/repos/builtin/packages/wannier90/package.py

Co-authored-by: Greg Becker <becker33@llnl.gov>

* incorporated bbecker's suggestion for making the strings less ugly!

* incorporated bbecker's suggestion to fix the logic for picking the
"DFT flavor" configure argument.
If the wannier variant is enabled, it passes --with-dft-flavor=wannier90
to configure, otherwise it passes --with-dft-flavor=atompaw+libxc to configure

* Changed to using plain strings

* Fixed version tests

* incorporated @adamjstewart's fix for testing if the major version is > 2

* incorporated @adamjstewart's fix to check if mpi is enabled and
only set the MPIFC variable if it is.

* Update var/spack/repos/builtin/packages/wannier90/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Only set MPIFC if '+mpi' is set

* incorporated fixes from @adamjstewart including:
	- using the string=True argument to filter_file (and removed the unneeded
 	  escapes)
	- changing the url to the github location
	- fixing the version checks
	- building a libwannier.dylib on darwin

* incorporated fixes suggested by @adamjstewart including:
	- using the string=True argument to filter_file and cleaned up the escapes
	- only pass the MPIFC argument to configure when '+mpi' is set
	- changed the url to the github site for Wannier90
	- fixed the version checks
	- build a 'libwannier.dylib' file when building the shared variant on darwin

* Update var/spack/repos/builtin/packages/wannier90/package.py

Co-authored-by: Greg Becker <becker33@llnl.gov>

* moved a configure argument from its own '+mpi' check to under the lower one

* Update var/spack/repos/builtin/packages/wannier90/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Cleaned up syntax as suggested by @adamjstewart
It looks *so much better* now!  Thanks!

* removed unneeded import of 'find' from 'llnl.util.filesystem' package
as suggested by @adamjstewart

* Update var/spack/repos/builtin/packages/wannier90/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* incorporated changes from @adamjstewart
changed check to "if '@:2 +shared' in spec:" instead of a nested check of '@:2' and
'+shared'
removed unneeded joins used in filter_file and spliced the list of objs directly into
the filter_file call
used the dso_suffix instead of testing for darwin to determine the name of the
shared library

* removed whitespace from blank line

* fixed bug with '../../wannier90.x: .*' not being treated as a regexp.  Thanks Adam!

* fixed missing whitespace when modifying Makefile.2

Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-16 15:57:42 -05:00
Harmen Stoppels
3449087284 Make the largest layer of the docker image cacheable (#17553) 2020-07-16 13:15:04 -04:00
Themos Tsikas
8e9f4d0078 Update for Build 7020 of nagfor compiler (#17555) 2020-07-16 10:39:06 -05:00
darmac
d7794540b2 Add new package: hibench (#17552) 2020-07-16 08:47:33 -05:00
iarspider
ae44b1d7b9 New package: openloops (#17520)
* New package: OpenLoops

* install() for openloops

* Working OpenLoops recipe

* Flake-8

* Only copy collection file if required; add clarification to num_jobs

* Add __future__ import just in case

* Fix missing space

* Remove __future__ import

* Changes from review, pt. 1

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Replace print() with write()

* Flake-8

Co-authored-by: iarspider <iarpsider@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-16 08:33:23 -05:00
Tom Payerle
df12b2bd15 kallisto: remove mpich dependency (#17551)
kallisto does not depend on mpich or MPI, except possibly indirectly
through hdf5 (but that should be handled by hdf5).
2020-07-16 08:28:50 +02:00
darmac
148a6a8860 Add new package: prometheus (#17541) 2020-07-15 21:56:14 -05:00
darmac
efba3731e5 storm: update url, version & runtime depends (#17523)
* storm: update url, version & runtime depends

* fix list_url error
2020-07-15 21:39:03 -05:00
fcannini
6eb332a984 vasp: New package. (#15187)
* vasp: New package.

* Remove unneeded `#noqa`

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Removed a completely needless tty.debug()

* Add compiler conflicts() and minute fixes

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-15 21:33:53 -05:00
Cyrus Harrison
d0a83f318b add apcomp package (#17494)
* add apcomp package

* add maintainers

* flake8

* Update var/spack/repos/builtin/packages/apcomp/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* review suggestions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-15 21:31:58 -05:00
Rémi Lacroix
0f67b97065 Update icu4c package (#17461)
* icu4c: Add new versions for older releases.

The old URLs for versions 60.1, 58.2 and 57.1 do not work anymore so add the versions available on Github.

The old versions are kept for reference (cf. #15896).

* icu4c: Add versions 66.1 and 67.1.

* icu4c: Fix compilation of versions 58 and 59 with recent glibc.
2020-07-15 18:57:19 -05:00
kolamsrinivas
d2c2e000a7 changes to py-torch recipe to enable rocm build (#17410)
* changes to recipe to enable rocm build

* fixing flake8 issue

* addressed the review comment
2020-07-15 18:45:22 -05:00
Julius-Plehn
4ac1a532f3 Adds new R package: GSODR (#17529)
* R GSODR package

* use cloud mirror
2020-07-15 12:05:39 -05:00
vvolkl
f42dc4fa4d [root] fix cmake args for r variant (#17487)
* [root] fix cmake args for r variant

* [root] add readline dependency to +r
2020-07-15 12:04:05 -05:00
Paul
d25c7ddd6f spack containerize: added --fail-fast argument to containerize install. (#17533) 2020-07-15 11:13:04 +02:00
Paul
48a9ad3652 Go: added v1.14.5 and v1.13.13. (#17539) 2020-07-15 09:31:49 +02:00
Cyrus Harrison
d55541919d visit package update, add glu as a linux dep (#17537)
* visit: add glu as a dep for linux

* add note to suggested install command about mesa
2020-07-15 00:01:05 -07:00
Dr. Christian Tacke
0d4740d1b1 curl: add dependency on libidn2 (#17526)
If the system has libidn2 installed, then curl will use it.
spack has a libidn2 package, so let's use that!

Related: #16514
2020-07-15 08:43:38 +02:00
yellowhat
d56711f799 namd: added v2.14b2 (#17395)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-15 08:04:39 +02:00
ketsubouchi
99a47e407e cantera: better specify dependency on sundials (#17540) 2020-07-15 07:53:36 +02:00
Julius-Plehn
7efb0e541e New Package: GrADS (#17476)
* grads minimal package

* updated template

* grads minimal

* grads & shapelib package

* hdf4

* cleanup

* hdf5, netcdf variants

* updates environment function

* updating paths and pkgconfig

* cleanup
2020-07-14 22:30:39 -05:00
Hadrien G
7340be98f6 [acts] Add 0.25.x series (#17485)
* Add Acts v0.25 support

* Add Acts v0.25.1

* Add acts v0.25.2
2020-07-14 22:23:07 -05:00
vvolkl
c281eaf69f [acts] remove false dependency (#17511) 2020-07-14 22:07:39 -05:00
Amjad Kotobi
2110b98829 r-glue: new version (#17517) 2020-07-14 21:56:05 -05:00
darmac
88537d02e4 gpdb: fix runtime issue (#17521) 2020-07-14 21:47:29 -05:00
Jon Rood
a2729fcd7f zsh (package): add versions; switch to .xz archives (#17489)
* Add new versions including 5.8
* Download .xz archives for existing versions (this requires updating
  the associated checksums)
2020-07-14 19:47:03 -07:00
darmac
bcd41cec71 Add new package: minio (#17522) 2020-07-14 21:45:47 -05:00
Cyrus Harrison
6f6e896795 New package: babelflow (#17500) 2020-07-14 19:42:38 -07:00
darmac
28a25080ca py-lockfile: depends on py-pbr by setup.py (#17524) 2020-07-14 21:34:59 -05:00
ketsubouchi
14f3f230c1 scons: support Fujitsu Fortran moddir option (#17538) 2020-07-14 21:24:18 -05:00
Jon Rood
d32bbae431 rsync (package): add version 3.2.2 (#17504) 2020-07-14 16:41:11 -07:00
Frank Willmore
710ff8d7ce libyogrt (package): add variant to enable static builds (#17535) 2020-07-14 16:36:18 -07:00
Jon Rood
683881f912 py-protobuf (package): add version 3.12.2 (#17532)
This matches the current latest version of protobuf in Spack.
Generally the version of py-protobuf and protobuf should match,
but this constraint is not currently recorded in py-protobuf.
2020-07-14 16:31:15 -07:00
vvolkl
11d8aed6cd dd4hep (package): add version 1.13.0 (#17528) 2020-07-14 16:24:33 -07:00
Andrey Prokopenko
ab68410c4c Trilinos (package): remove maintainer (#17534) 2020-07-14 16:23:19 -07:00
darmac
fa614404e6 smartdenovo: added patch to fix compile error (debian) (#17435) 2020-07-14 13:02:31 +02:00
Axel Huebl
a6abd530bd CUDA 11.0.2 (#17423)
- [x] wait for general release candidate
- [x] compute capability support
- [x] compiler conflicts
  - [x] ppc64le
- [x] new download links
2020-07-13 18:32:28 -05:00
Harmen Stoppels
2b809a5374 Add -o flag to tar decompressor (#17427)
For normal users, `-o` or `--no-same-owner` (GNU extension) is
the default behavior, but for the root user, `tar` attempts to preserve
the ownership from the tarball.

This makes `tar` use `-o` all the time.  This should improve untarring
files owned by users not available in rootless Docker builds.
2020-07-13 15:19:04 -07:00
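The change described above can be sketched as follows. This is a hedged simplification (hypothetical helper name, not Spack's actual decompression code): always include the GNU `-o` (`--no-same-owner`) extension when building the extraction command, so that running as root does not attempt to chown extracted files to UIDs recorded in the archive.

```python
# Hedged sketch (not Spack's real code) of always passing "-o" to tar.
# For normal users "-o" matches the default behavior; for root it avoids
# chown failures, e.g. for UIDs that do not exist in rootless Docker builds.
def tar_extract_command(tarball, dest):
    # "-xf" autodetects compression with GNU tar; "-C" sets the target dir.
    return ["tar", "-xf", tarball, "-o", "-C", dest]

cmd = tar_extract_command("spack-src.tar.gz", "/tmp/stage")
```

The resulting argument list would then be handed to the usual subprocess machinery.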
Cyrus Harrison
3e13137f6e add share libs variant to raja (#17496) 2020-07-13 14:57:25 -07:00
Cyrus Harrison
6aa6e19d34 add shared libs variant to umpire (#17497) 2020-07-13 14:57:04 -07:00
Cyrus Harrison
c2d8d8acbd update vtk-m with pinned version for ascent and related packages (#17498)
* add ascent_ver to vtk-m pkg

* vtk-m:: add patches used by ascent
2020-07-13 14:56:45 -07:00
Cyrus Harrison
299dcdd3eb update vtk-h package with new version and options (#17499) 2020-07-13 14:55:38 -07:00
Jon Rood
e0f13b298d tmux (package): add version 3.1b (#17486) 2020-07-13 13:56:33 -07:00
Jon Rood
d2ac26f844 gdb (package): add version 9.2 (#17490) 2020-07-13 13:46:27 -07:00
Jon Rood
fae57d1422 cppcheck (package): add version 2.1 (#17491) 2020-07-13 13:45:37 -07:00
Jon Rood
c84a05b809 bison (package): add versions including 3.6.4 and 3.5.3 (#17492) 2020-07-13 13:44:37 -07:00
Jon Rood
05e8918076 curl (package): add version 7.71.0 (#17493) 2020-07-13 13:42:42 -07:00
Jon Rood
929cb9e62e vim (package): add version 8.2.1201 (#17503) 2020-07-13 13:31:37 -07:00
Jon Rood
7d1f2abd56 screen (package): add version 4.8.0 (#17505) 2020-07-13 13:28:58 -07:00
Jon Rood
ab5f28aceb stow (package): add version 2.3.1 (#17506) 2020-07-13 13:28:18 -07:00
Jon Rood
4450377794 gnutls: add version 3.6.14 (#17507) 2020-07-13 13:27:28 -07:00
Jon Rood
45eaa442c3 Global (package): add version 6.6.4 (#17508) 2020-07-13 13:26:39 -07:00
darmac
4fa519134f bwa: support for aarch64 (#17473)
* bwa: support for aarch64

* bwa: fix build error for non-aarch64 machine
2020-07-13 10:52:23 -05:00
Rémi Lacroix
815f62ce0c Update gdk-pixbuf package. (#17458)
* gdk-pixbuf: Add new stable versions.

* gdk-pixbuf: Add a missing dependency with libx11.

Also add a variant disabled by default to make it optional since it is considered deprecated
(cf. 3362e94c25).
2020-07-13 10:49:17 -05:00
Mark Olesen
b3b5ea4064 updated sha256 for openfoam-1806 patch (#17483)
- perhaps related to gitlab migration and/or upgrade (Dec 2019)

Co-authored-by: Mark Olesen <Mark.Olesen@esi-group.com>
2020-07-13 10:47:06 -05:00
Omar Padron
573489db71 Add libglvnd packages/Add EGL support (#14572)
* add new package: "libglvnd-frontend"

* add +glvnd variant to opengl package

* add +glvnd variant to mesa package

* add +egl variant to paraview package

* add libglvnd-frontend entries to default packages config

* fix style

* add default providers for glvnd virtuals

add default providers for glvnd-gl, glvnd-glx, and glvnd-egl

* WIP: rough start to external OpenGL documentation

* rename libglvnd-frontend package and backend virtual dependencies

* update documentation

* fix ligvnd-be-* typos

* fix libglvnd-fe package class name

* fix doc parse error
2020-07-13 11:32:36 -04:00
darmac
9c42f246ed Add new package: atf (#17472) 2020-07-12 21:32:02 -05:00
Jannek Squar
dbdd2cb92f Magics fix and update (#17477)
* Added new versions to magics and began to set not-so-optional netcdf dependency

* Added enforced netcdf dependency

* Fix also works for version 4.1.0
2020-07-12 21:20:12 -05:00
Greg Becker
406596af70 update docs on point releases (#17463) 2020-07-11 14:35:25 -07:00
darmac
73f02b10de lmbench: fix scripts path for aarch64 (#17456) 2020-07-11 12:53:53 -05:00
Dr Owain Kenway
9629f571bc llvm-flang: Only build offload code if cuda enabled (#17466)
* llvm-flang: only build offload code if cuda enabled

The current version always executes `cmake(*args)` as part of the post-install step. If device offload is not part of the build, this results in referencing `args` without it being set, and the error:

```
==> Error: UnboundLocalError: local variable 'args' referenced before assignment

```

Looking at the previous version of `llvm-package.py`, this whole routine appears to be required only for offload, so indent `cmake/make/install` to be under the `if`.

* Update package.py

Add comment
2020-07-11 09:02:53 -05:00
Peter Josef Scheibel
5e50dc5acb Merge branch 'releases/v0.15' into develop 2020-07-10 23:14:36 -07:00
Jen Herting
59bfc22d40 [glew] depends on libsm and libice (#17428)
* [glew] depends on libsm

* [glew] depends on libice
2020-07-10 19:32:26 -05:00
Justin S
1a8a147fe5 energyplus: add 9.3.0 (#17452)
* energyplus: add 9.3.0

* energyplus: fix version order

* energyplus: more concise links

* energyplus: avoid join_path
2020-07-10 19:23:04 -05:00
figroc
0612a9e8e9 tensorflow-serving-client: add new version 2.2.0 (#17462) 2020-07-10 19:10:04 -05:00
Greg Becker
f2889e698a spack install: improve error message with no args (#17454)
The error message was not updated when the behavior of Spack environments
was changed to not automatically activate the local environment in #17258.
The previous error message no longer makes sense.
2020-07-10 10:45:11 -07:00
Rémi Lacroix
ea546425e8 Update the bbcp package (#17436)
* bbcp: Update the URLs to use HTTPS.

The HTTP URLs do not work anymore.

* bbcp: Add missing libnsl dependency.

* bbcp: Rename the git-based version to match the branch name.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-10 08:27:07 -05:00
ketsubouchi
7269a5bf51 py-pysam: add LDFLAGS to curl (#17434)
* py-pysam: add LDFLAGS to curl

* Update var/spack/repos/builtin/packages/py-pysam/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-10 08:26:21 -05:00
figroc
00d7e817c6 grpc: add versions 1.28/1.29/1.30 (#17433) 2020-07-10 08:25:47 -05:00
iarspider
ed7d485b58 New packages: thepeg, herwig++ (2.x) (#17443)
* New packages: Rivet, Herwig++ 2

* Add patches for thepeg

* Flake-8

* Update package.py

* Delete thepeg-2.1.6.patch

* Delete thepeg-2.1.5.patch

* Delete thepeg-2.1.3.patch

* Delete thepeg-2.2.0.patch
2020-07-10 08:25:21 -05:00
iarspider
38d387c9a5 New packages: looptools + vbfnlo (#17446)
* New package: vbfnlo

* Add new package: vbfnlo

* Add recipe for looptools

* Add patch for looptools

* LoopTools: patch not needed (fixed by developers without changing version)

* Remove patch file as well

* Update package.py

* Update package.py

* Fix vbfnlo recipe for old version

Co-authored-by: iarspider <iarpsider@gmail.com>
2020-07-10 08:24:14 -05:00
Julius-Plehn
02dd90ebf9 New Package: ChaNGa (#17442)
* WIP: changa package

* changa cleanup

* flake8 format

* adds master branch to ChaNGa

* positional arguments

* use install instead of copy
2020-07-10 08:22:20 -05:00
Patrick Gartung
e72e2568dd Relocation of sbang needs to be done when the spack prefix changes even if the install tree has not changed. (#17455) 2020-07-09 22:28:51 -05:00
Rémi Lacroix
d9923a05e0 ltrace: Disable "-Werror". (#17444)
Some functions used by ltrace have been deprecated in recent versions of glibc.
2020-07-09 22:27:30 -05:00
Rémi Lacroix
8c6fa66b2a openslide: Add missing dependencies. (#17445) 2020-07-09 22:26:37 -05:00
Harmen Stoppels
84eae97f91 aspirin for buildaches (#17437) 2020-07-09 22:00:38 -05:00
Sajid Ali
12099ed55e clear mpicc and friends before each build (#17450)
* clear mpi env vars
2020-07-09 16:14:49 -05:00
Greg Becker
d0f5b69a19 installation: skip repository metadata for externals (#16954)
When Spack installs a package, it stores repository package.py files
for it and all of its dependencies - any package with a Spack metadata
directory in its installation prefix.

It turns out this was too broad: this ends up including external
packages installed by Spack (e.g. installed by another Spack instance).
Currently Spack doesn't store the namespace properly for such packages,
so even though the package file could be fetched from the external,
Spack is unable to locate it.

This commit avoids the issue by skipping any attempt to locate and copy
from the package repository of externals, regardless of whether they
have a Spack repo directory.
2020-07-09 11:08:51 -07:00
Peter Scheibel
ce9d30f80f add public spack mirror (#17077) 2020-07-08 15:59:24 -07:00
Sinan
e02d955aed new package: ligra (#17425)
* new package: ligra

* setup run environment

* tidy up

* Update var/spack/repos/builtin/packages/ligra/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/ligra/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/ligra/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* flake8

Co-authored-by: Sinan81 <sbulut@3vgeomatics.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-08 16:30:53 -05:00
Frank Willmore
b3fff20d1f enable flatcc to be built with gcc@9.X.X (#17430)
`gcc` 9 and above have more warnings that break the `flatcc` build by default, because `-Werror` is enabled.  This loosens the build up so that we can build with more compilers in Spack.

- [x] Add `-DFLATCC_ALLOW_WERROR=OFF` to `flatcc` CMake arguments

Co-authored-by: Frank Willmore <willmore@anl.gov>
2020-07-08 13:57:09 -07:00
Patrick Gartung
8c41173678 Buildcache: bindist test without invoking spack compiler wrappers. (#15687)
* Buildcache:
   * Try mocking an install of quux, corge and garply using prebuilt binaries
   * Put patchelf install after ccache restore
   * Add script to install patchelf from source so it can be used on Ubuntu:Trusty, which does not have a patchelf package. The script will skip building on macOS
   * Remove mirror at end of bindist test
   * Add patchelf to Ubuntu build env
   * Revert mock patchelf package to allow other tests to run.
   * Remove depends_on('patchelf', type='build'), relying instead on a test fixture to ensure patchelf is available.

* Call g++ command to build libraries directly during test build

* Flake8

* Install patchelf in before_install stage using apt unless on Trusty where a build is done.

* Add some symbolic links between packages

* Flake8

* Flake8:

* Update mock packages to write their own source files

* Create the stage because spec search does not create it any longer

* updates after change of list command arguments

* cleanup after merge

* flake8
2020-07-08 15:05:58 -05:00
iarspider
0bed621d0c Add missing file (#17426)
Co-authored-by: Ivan Razumov <ivan.razumov@cern.ch>
2020-07-08 10:50:47 -05:00
Amjad Kotobi
1d2754c3f6 r-usethis: new version and dependencies (#17411)
* r-usethis: new version and dependencies

* r-usethis: fix in dependency
2020-07-08 08:44:47 -05:00
tcojean
ae2a867a7f Ginkgo: new versions (#17413)
* Add new Ginkgo versions with HIP support.

* Drop HIP support until more ROCm packages are integrated.
2020-07-08 08:42:56 -05:00
Adam J. Stewart
207e496162 spack create: ask how many to download (#17373) 2020-07-08 09:38:42 +02:00
ketsubouchi
f0391db096 typhon: fix build with Fujitsu compilers (#17424) 2020-07-08 09:28:41 +02:00
TZ
084994db9c ncl: fix compilation errors with Intel compilers (#17391)
The Intel compilers are more strict and require special command
line options (like -std=c99) to properly compile NCL.
2020-07-08 08:42:49 +02:00
Simon Byrne
f85da868ac Improve Travis sample in the docs (#17420)
- printf is better than echo for multiline strings
- ** should be &&
- use line continuation
- Use multiline block
2020-07-08 07:25:37 +02:00
iarspider
f1f31e3dfe Fix YODA and Rivet recipes (#17412)
* Fix Rivet recipe; restrict Yoda versions for a give Rivet version

* Fix YODA recipe

* More tweaks to YODA version requirements

* Flake-8
2020-07-07 22:24:48 -05:00
Patrick Gartung
7f8e827db8 Relocate rpaths for all binaries, then do text bin replacement if the rpaths still exist after running patchelf/otool (#17418) 2020-07-07 16:46:39 -05:00
Adam J. Stewart
a63761f875 oneDNN: add v1.5.1 (#17419) 2020-07-07 16:38:47 -05:00
Adam J. Stewart
3ce16c89b7 GDAL: add v3.1.2 (#17416) 2020-07-07 15:45:25 -05:00
Massimiliano Culpo
f4ac3770b4 CudaPackage: maintainers are listed in the docstring (#17409)
fixes #17396

This prevents the class attribute from being inherited and
saves the current maintainers from becoming the default
maintainers of every CUDA package.
2020-07-07 20:45:41 +02:00
Todd Gamblin
b0506a722e releases: document releases/latest tag (#17402)
We got rid of `master` after #17377, but users still want a way to get
the latest stable release without knowing its number.

We've added a `releases/latest` tag to replace what was once `master`.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-07-07 11:44:15 -07:00
Peter Scheibel
650ab563f4 Uninstall: tolerate hook failures when force=true (#16513)
Fixes #16478

This allows an uninstall to proceed even when encountering pre-uninstall
hook failures if the user chooses the --force option for the uninstall.

This also prevents post-uninstall hook failures from raising an exception,
which would terminate a sequence of uninstalls. This isn't likely essential
for #16478, but I think overall it will improve the user experience: if
the post-uninstall hook fails, there isn't much point in terminating a
sequence of spec uninstalls because at the point where the post-uninstall
hook is run, the spec has already been removed from the database (so it
will never have another chance to run).

Notes:

* When doing spack uninstall -a, certain pre/post-uninstall hooks aren't
  important to run, but this isn't easy to track with the current model.
  For example: if you are uninstalling a package and its extension, you
  do not have to do the activation check for the extension.
* This doesn't handle the uninstallation of specs that are not in the DB,
  so it may leave "dangling" specs in the installation prefix
2020-07-07 11:37:36 -07:00
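The tolerance described above can be sketched as follows (hypothetical stand-ins, not Spack's actual uninstall code): a pre-uninstall hook failure aborts unless the user forced the uninstall, while a post-uninstall hook failure is recorded but never re-raised, because the spec has already left the database at that point.

```python
failures = []  # (phase, spec, error) tuples, collected for reporting


def uninstall(spec, pre_hook, remove_fn, post_hook, force=False):
    try:
        pre_hook(spec)
    except Exception as e:
        if not force:
            raise                        # without --force, stop the uninstall
        failures.append(("pre", spec, e))
    remove_fn(spec)                      # the spec leaves the database here
    try:
        post_hook(spec)
    except Exception as e:
        failures.append(("post", spec, e))  # already removed: never re-raise
```

With `force=True`, a failing pre-uninstall hook becomes a recorded warning and the removal still happens, so a sequence of uninstalls is not terminated by one bad hook.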
Christoph Junghans
90285c7d61 votca-tools: fix build with mkl (#17414) 2020-07-07 13:27:10 -05:00
Frank Willmore
6c300ab717 snappy: added v1.1.8 (#17397) 2020-07-07 17:25:26 +02:00
g-mathias
05d8ba170b jube: added v2.4.0 (#17404)
Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2020-07-07 16:11:53 +02:00
figroc
51f65152a5 abseil-cpp: added v20200225.2 (#17383) 2020-07-07 14:02:21 +02:00
Harmen Stoppels
1113357e35 libtree: fixed checksums (#17393)
The hash was wrongly computed from the `tar.gz`
that GitHub provides, not from the custom tarball
that also includes the submodules.
2020-07-07 12:56:57 +02:00
ketsubouchi
d65a076c0d bliss: add spaces to __DATE__ (#17385)
C++11 requires a space between a string literal and an adjacent macro
(otherwise the macro is parsed as a user-defined literal suffix).
2020-07-07 11:28:55 +02:00
Glenn Johnson
845139740f mumax: new package at v3.10beta (#17398)
This PR creates a new spack package for

mumax: GPU accelerated micromagnetic simulator.

This uses the current beta version because
- it is somewhat dated, ~2018
- it is the only one that supports recent GPU kernels
2020-07-07 11:01:59 +02:00
TZ
1f87b07689 nco: added v4.8.[0,1] and v4.9.[0-3] (#17389) 2020-07-07 10:18:27 +02:00
TZ
cbaa1bca1c ncview: added v2.1.8 (#17388) 2020-07-07 10:09:27 +02:00
Sinan
5fb6a06c37 gunrock: improved package recipe (added variants for applications and others) (#17340)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Sinan81 <sbulut@3vgeomatics.com>
2020-07-07 10:02:37 +02:00
Adam Moody
6e38fc56f6 mpifileutils: add v0.10.1 2020-07-06 18:19:29 -07:00
Todd Gamblin
c00a05bfba bugfix: no infinite recursion in setup-env.sh on Cray
On Cray platforms, we rely heavily on the module system to figure out
what targets, compilers, etc. are available. This unfortunately means
that we shell out to the `module` command as part of platform
initialization.

Because we run subcommands in a shell, we can get infinite recursion if
`setup-env.sh` and friends are in some init script like `.bashrc`.

This fixes the infinite loop by adding guards around `setup-env.sh`,
`setup-env.csh`, and `setup-env.fish`, to prevent recursive
initializations of Spack. This is safe because Spack never shells out to
itself, so we do not need it to be initialized in subshells.

- [x] add recursion guard around `setup-env.sh`
- [x] add recursion guard around `setup-env.csh`
- [x] add recursion guard around `setup-env.fish`
2020-07-06 13:55:14 -07:00
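The guard pattern described above can be sketched in Python (a hypothetical analogue of the shell guard; `_SP_INITIALIZING` is an illustrative variable name, not necessarily the one the shell scripts use):

```python
import os


def initialize(body):
    """Run one-time initialization, refusing to recurse.

    If initialization indirectly triggers itself (as sourcing
    setup-env.sh from .bashrc could on Cray), the guard variable
    is already set and we bail out instead of looping forever.
    """
    if os.environ.get("_SP_INITIALIZING"):
        return "skipped"               # already initializing: break the cycle
    os.environ["_SP_INITIALIZING"] = "1"
    try:
        body()                         # may transitively call initialize()
        return "initialized"
    finally:
        del os.environ["_SP_INITIALIZING"]
```

The `sh`, `csh`, and `fish` versions do the same thing with an exported variable checked at the top of each setup script.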
Todd Gamblin
9ec9327f5a docs: document releases and branches in Spack
- [x] Remove references to `master` branch
- [x] Document how release branches are structured
- [x] Document how to make a major release
- [x] Document how to make a point release
- [x] Document how to do work in our release projects
2020-07-06 11:39:19 -07:00
Todd Gamblin
11088df402 Remove references to master from CI
- [x] remove master from github actions
- [x] remove master from .travis.yml
- [x] make `develop` the default branch for `spack ci`
2020-07-06 11:39:19 -07:00
Todd Gamblin
4ea76dc95c change master/child to controller/minion in pty docstrings
PTY support used the concept of 'master' and 'child' processes. 'master'
has been renamed to 'controller' and 'child' to 'minion'.
2020-07-06 11:39:19 -07:00
cedricchevalier19
f0275d7e1b Fix gcc + binutils compilation. (#9024)
* fix binutils deptype for gcc

binutils needs to be a run dependency of gcc

* Fix gcc+binutils build on RHEL7+

static-libstdc++ is not available with the system gcc.
Since this is only for bootstrapping, we do not really mind
depending on a shared libstdc++.

Co-authored-by: Michael Kuhn <michael@ikkoku.de>
2020-07-06 13:02:35 -05:00
Michael Kuhn
516c3e659f autotools bugfix: handle missing config.guess (#17356)
Spack was attempting to calculate abspath on the located config.guess
path even when it was not found (None); this commit skips the abspath
calculation when config.guess is not found.
2020-07-06 10:53:02 -07:00
iarspider
e62ddcb582 Add Rivet and YODA (#17372)
* Add Rivet and YODA

* Add patches

* Flake-8

* Set level for Rivet patches

* Syntax fix

* Fix dependencies of Rivet

* Update var/spack/repos/builtin/packages/rivet/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-06 10:51:36 -05:00
TZ
b3bc538df6 esmf: set ESMF_COMM=intelmpi also for ^intel-mpi (#17387)
Intel MPI is provided not only by intel-parallel-studio+mpi
but also by intel-mpi.
2020-07-06 11:40:11 +02:00
fcannini
29fc94e29e psi4: fix "filter_compilers" signature (#17375) 2020-07-06 09:07:26 +02:00
Adam J. Stewart
466f7fd996 GMT: add v6.1.0 (#17384) 2020-07-05 21:06:05 -05:00
darmac
58cfe4e078 acl: fix depends error (#17341) 2020-07-05 15:28:29 -05:00
darmac
00f7577273 brpc: fix depends issue (#17347) 2020-07-05 15:27:49 -05:00
Wouter Deconinck
4e6d189a94 [opencascade] depends_on freetype, tcl, tk, gl (#17357)
* [opencascade] depends_on freetype, tcl, tk, gl

* [opencascade] new version 7.4.0p1 and url_for_version
2020-07-05 15:25:38 -05:00
iarspider
9abadd4985 New version of LHAPDF: 6.3.0 (#17367) 2020-07-05 15:23:03 -05:00
figroc
66d4bc3f3c protobuf: add versions (#17381) 2020-07-05 15:22:31 -05:00
yellowhat
52cafe6c96 amdblis bump to 2.2. (#17369) 2020-07-05 15:22:11 -05:00
g-mathias
8d5aa46765 package Amber: amber tools 20 (#17374)
* package amber: added amber_tools 20 hash; added minor version for amber_tools

* fix flake8 issues

Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2020-07-05 15:17:38 -05:00
Gilles Gouaillardet
e5ec89ad5b openblas: fix fj compiler support in 0.3.10 (#17376)
The latest openblas version, 0.3.10, changed how Fortran libraries
are detected, which broke Fujitsu compiler support.

This (new) openblas patch addresses that issue.
2020-07-05 15:14:36 -05:00
Timo Heister
7bba9cd2a5 update aspect to 2.2.0 (#17379) 2020-07-05 15:13:22 -05:00
Adam J. Stewart
cce629e791 SciPy: add v1.5.1 (#17380) 2020-07-05 14:58:26 -05:00
TZ
bb15addad5 intel-mpi: fix for wrong structure name introduced in ea8a0be4 (#17382)
It's `mpi_compiler_wrappers`,
not `mpi_compiler._wrappers`.

fixes 2nd part of #17371
2020-07-05 11:10:28 -05:00
Adam J. Stewart
e9e3e88f63 Fix Intel MPI super invocation, again (#17378) 2020-07-05 11:09:24 -05:00
fcannini
c797a0611c dftbplus: New package. (#15191)
* dftbplus: New package.

* dftbplus: Addresses @adamjstewart's comments on PR #15191

* dftbplus: Fixes format() calls that slipped in previous commit.

* dftbplus: Appease flake8.

* dftbplus: Change 'url' and misc. fixes.

* Add a resource to do the job of './utils/get_opt_externals'
2020-07-04 08:27:32 -05:00
Shahzeb Siddiqui
04f3000646 Pipelines doc: fixed two broken links (#17355) 2020-07-03 12:29:45 +02:00
Amjad Kotobi
f3eba3c482 py-python-swiftclient: added v3.10.0 (#17352) 2020-07-03 12:25:54 +02:00
g-mathias
02fa7b680f elpa: added v2019.11.001 and v2020.05.001 (#17368)
Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2020-07-03 11:52:02 +02:00
Paul R. C. Kent
8fcd917e51 libelf: added extra url (#17358) 2020-07-03 11:49:58 +02:00
g-mathias
9c85d87b90 jube: added v2.3.0 (#17366)
Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2020-07-03 10:13:37 +02:00
yellowhat
b45fc97564 Package request: HPCG (#17350)
* use patch from upstream

Co-authored-by: Michael Kuhn <michael.kuhn@informatik.uni-hamburg.de>
2020-07-03 10:05:24 +02:00
ketsubouchi
986f68f7ed blktrace: use Spack compiler wrappers (#17365) 2020-07-02 23:39:04 -07:00
ketsubouchi
2cd9e1eb62 blat: use SPACK_CC (#17364) 2020-07-02 23:38:23 -07:00
darmac
61804f201a glusterfs: add pkgconfig dependency (#17343) 2020-07-02 23:35:38 -07:00
g-mathias
cf104b0f10 jmol: add version 14.31.0 (#17351)
Also:

* Add url_for_version function
* Add Java to PATH for run environment
* Update `install` method to handle old and new version

Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2020-07-02 23:34:00 -07:00
Kelly (KT) Thompson
17106a131d Random123: add versions 1.10, 1.13.2 (#17361) 2020-07-02 23:25:53 -07:00
Christoph Junghans
cc0dda95c4 quicksilver: add v1.0 2020-07-02 23:24:44 -07:00
takanori-ihara
7679e20e83 py-tensorflow: Fix for tensorflow issue #40688 (#17324)
* py-tensorflow: Fix for #40688

* py-tensorflow:  Fix for tensorflow issue #40688
2020-07-02 21:49:29 -05:00
Michio Ogawa
4349c091e7 FrontISTR: add version 5.1 (#17349) 2020-07-02 15:34:25 -07:00
darmac
ff60f51a7a keepalived: openssl is a link and build dependency (#17346) 2020-07-02 15:13:24 -07:00
Amjad Kotobi
06da1f195c openmpi: add singularity variant (#17288) 2020-07-02 15:10:06 -07:00
iarspider
3d98ad3f4c New packages: heputils and mcutils (#17330)
heputils is a (conditional) dependency of mcutils
2020-07-02 15:05:20 -07:00
Harmen Stoppels
f1bb8999ab cpprestsdk: add version 2.10.16 (#17331)
Also

* Patch is only needed for 2.9.1
* Add openssl dependency
* Build with -DWERROR:BOOL=Off
2020-07-02 15:02:12 -07:00
Sinan
1e75dde7b2 mapnik: add version 3.0.23, update boost dependency (#17338) 2020-07-02 14:56:36 -07:00
manifest
f780839b87 examl + (#17265)
* examl +

* examl style fix

* examl flake8 fix

* Update var/spack/repos/builtin/packages/examl/package.py

using `working_dir`

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-02 09:43:47 -05:00
Andrew Gaspar
204f15b4c1 py-fortran-language-server: new package at v1.11.1 (#17318)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-07-02 10:22:01 +02:00
Michael Kuhn
a4fff39d7e py-shroud: added v0.12.1 (#17332)
py-setuptools is also needed at runtime, otherwise we get errors:
```
ModuleNotFoundError: No module named 'pkg_resources'
```
2020-07-02 10:14:57 +02:00
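In a Spack recipe, this kind of runtime need is expressed with a `run`-type dependency. A hypothetical recipe fragment (not the full py-shroud package):

```python
class PyShroud(PythonPackage):
    # A build-only setuptools dependency is not enough: the installed
    # entry-point script imports pkg_resources at runtime, so the
    # dependency must carry type 'run' as well.
    depends_on("py-setuptools", type=("build", "run"))
```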
Michael Kuhn
10016a34e0 autotools: Fix config.guess detection, take two (#17333)
The previous fix from #17149 contained a mistake that produced errors for
packages that override configure_directory.
2020-07-02 00:45:41 -07:00
Sinan
e133b44da6 py-opt-einsum: added v3.2.1, v3.2.0 and v2.3.2 (#17339)
Co-authored-by: Sinan81 <sbulut@3vgeomatics.com>
2020-07-02 09:14:39 +02:00
Paul R. C. Kent
5732d8de50 py-sphinxcontrib-bibtex: added v1.0.0 (#17336) 2020-07-02 08:32:26 +02:00
mic84
509b3c3016 amrex: added v20.07 (#17337)
Also added support for hdf5, petsc and hypre
2020-07-02 08:28:57 +02:00
ketsubouchi
8a9fa9bd18 biobloom: use the correct standard library for Fujitsu compilers (#17327) 2020-07-02 08:11:56 +02:00
Massimiliano Culpo
a5eabfad91 Moved flake8, shell and documentation tests to Github Action (#17328)
* Move flake8 tests on Github Actions

* Move shell test to Github Actions

* Moved documentation build to Github Action

* Don't run coverage on Python 2.6

Since we get connection errors consistently on Travis
when trying to upload coverage results for Python 2.6,
avoid computing coverage entirely to speed-up tests.
2020-07-01 11:58:53 -05:00
Adam J. Stewart
6a77f1ff45 Fix hashlib function capitalization (#17323) 2020-07-01 09:46:20 -05:00
Glenn Johnson
60283775b3 Documentation update for container example (#17321)
This updates the documentation to reflect #17316.
2020-07-01 08:40:36 +02:00
Greg Becker
4433e4de2d Use apple-clang for MacOS nightly tests (#17320) 2020-07-01 08:21:08 +02:00
darmac
aaf6f80d4c hbase: refine url, java and version (#17306)
* hbase: refine url, java and version

* Update var/spack/repos/builtin/packages/hbase/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-06-30 22:22:32 -05:00
Gregory Becker
59fb789290 Merge branch 'releases/v0.15' into develop 2020-06-30 19:19:12 -05:00
1393 changed files with 39406 additions and 5227 deletions

View File

@@ -4,7 +4,8 @@ coverage:
range: 60...90
status:
project:
default: yes
default:
threshold: 0.2%
ignore:
- lib/spack/spack/test/.*

View File

@@ -1,5 +1,20 @@
#!/usr/bin/env sh
git clone https://github.com/spack/spack.git
echo -e "config:\n build_jobs: 2" > spack/etc/spack/config.yaml
. spack/share/spack/setup-env.sh
spack compilers
. share/spack/setup-env.sh
echo -e "config:\n build_jobs: 2" > etc/spack/config.yaml
spack config add "packages:all:target:[x86_64]"
# TODO: remove this explicit setting once apple-clang detection is fixed
cat <<EOF > etc/spack/compilers.yaml
compilers:
- compiler:
spec: apple-clang@11.0.3
paths:
cc: /usr/bin/clang
cxx: /usr/bin/clang++
f77: /usr/local/bin/gfortran-9
fc: /usr/local/bin/gfortran-9
modules: []
operating_system: catalina
target: x86_64
EOF
spack compiler info apple-clang
spack debug report

View File

@@ -3,13 +3,12 @@ name: linux builds
on:
push:
branches:
- master
- develop
- releases/**
pull_request:
branches:
- master
- develop
- releases/**
paths-ignore:
# Don't run if we only modified packages in the built-in repository
- 'var/spack/repos/builtin/**'
@@ -19,36 +18,41 @@ on:
- '!var/spack/repos/builtin/packages/py-setuptools/**'
- '!var/spack/repos/builtin/packages/openjpeg/**'
- '!var/spack/repos/builtin/packages/r-rcpp/**'
- '!var/spack/repos/builtin/packages/ruby-rake/**'
# Don't run if we only modified documentation
- 'lib/spack/docs/**'
jobs:
build:
runs-on: ubuntu-latest
strategy:
max-parallel: 4
matrix:
package: [lz4, mpich, tut, py-setuptools, openjpeg, r-rcpp]
package:
- lz4 # MakefilePackage
- mpich # AutotoolsPackage
- tut # WafPackage
- py-setuptools # PythonPackage
- openjpeg # CMakePackage
- r-rcpp # RPackage
- ruby-rake # RubyPackage
steps:
- uses: actions/checkout@v2
- name: Cache ccache's store
uses: actions/cache@v1
- uses: actions/cache@v2
with:
path: ~/.ccache
key: ccache-build-${{ matrix.package }}
restore-keys: |
ccache-build-${{ matrix.package }}
- name: Setup Python
uses: actions/setup-python@v1
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install System Packages
run: |
sudo apt-get update
sudo apt-get -yqq install ccache gfortran perl perl-base r-base r-base-core r-base-dev findutils openssl libssl-dev libpciaccess-dev
sudo apt-get -yqq install ccache gfortran perl perl-base r-base r-base-core r-base-dev ruby findutils openssl libssl-dev libpciaccess-dev
R --version
perl --version
ruby --version
- name: Copy Configuration
run: |
ccache -M 300M && ccache -z

View File

@@ -3,13 +3,12 @@ name: linux tests
on:
push:
branches:
- master
- develop
- releases/**
pull_request:
branches:
- master
- develop
- releases/**
jobs:
unittests:
runs-on: ubuntu-latest
@@ -19,8 +18,9 @@ jobs:
steps:
- uses: actions/checkout@v2
- name: Setup Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -36,9 +36,7 @@ jobs:
run: |
# Need this for the git tests to succeed.
git --version
git config --global user.email "spack@example.com"
git config --global user.name "Test User"
git fetch -u origin develop:develop
. .github/workflows/setup_git.sh
- name: Install kcov for bash script coverage
env:
KCOV_VERSION: 34
@@ -56,7 +54,61 @@ jobs:
share/spack/qa/run-unit-tests
coverage combine
coverage xml
- name: Upload to codecov.io
uses: codecov/codecov-action@v1
- uses: codecov/codecov-action@v1
with:
flags: unittests,linux
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install System packages
run: |
sudo apt-get -y update
sudo apt-get install -y coreutils gfortran gnupg2 mercurial ninja-build patchelf zsh fish
# Needed for kcov
sudo apt-get -y install cmake binutils-dev libcurl4-openssl-dev zlib1g-dev libdw-dev libiberty-dev
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools codecov coverage
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
- name: Install kcov for bash script coverage
env:
KCOV_VERSION: 38
run: |
KCOV_ROOT=$(mktemp -d)
wget --output-document=${KCOV_ROOT}/${KCOV_VERSION}.tar.gz https://github.com/SimonKagstrom/kcov/archive/v${KCOV_VERSION}.tar.gz
tar -C ${KCOV_ROOT} -xzvf ${KCOV_ROOT}/${KCOV_VERSION}.tar.gz
mkdir -p ${KCOV_ROOT}/build
cd ${KCOV_ROOT}/build && cmake -Wno-dev ${KCOV_ROOT}/kcov-${KCOV_VERSION} && cd -
make -C ${KCOV_ROOT}/build && sudo make -C ${KCOV_ROOT}/build install
- name: Run shell tests
env:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@v1
with:
flags: shelltests,linux
centos6:
# Test for Python2.6 run on Centos 6
runs-on: ubuntu-latest
container: spack/github-actions:centos6
steps:
- name: Run unit tests
env:
HOME: /home/spack-test
run: |
whoami && echo $HOME && cd $HOME
git clone https://github.com/spack/spack.git && cd spack
git fetch origin ${{ github.ref }}:test-branch
git checkout test-branch
share/spack/qa/run-unit-tests

View File

@@ -8,6 +8,13 @@ on:
schedule:
# nightly at 1 AM
- cron: '0 1 * * *'
pull_request:
branches:
- develop
paths:
# Run if we modify this yaml file
- '.github/workflows/macos_python.yml'
# TODO: run if we touch any of the recipes involved in this
# GitHub Action Limits
# https://help.github.com/en/actions/reference/workflow-syntax-for-github-actions
@@ -18,10 +25,14 @@ jobs:
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: spack install
run: |
. .github/workflows/install_spack.sh
spack install -v gcc
# 9.2.0 is the latest version on which we apply homebrew patch
spack install -v --fail-fast gcc@9.2.0 %apple-clang
install_jupyter_clang:
name: jupyter
@@ -29,30 +40,40 @@ jobs:
timeout-minutes: 700
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: spack install
run: |
. .github/workflows/install_spack.sh
spack install -v py-jupyter %clang
spack config add packages:opengl:paths:opengl@4.1:/usr/X11R6
spack install -v --fail-fast py-jupyter %apple-clang
install_scipy_clang:
name: scipy, mpl, pd
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: spack install
run: |
. .github/workflows/install_spack.sh
spack install -v py-scipy %clang
spack install -v py-matplotlib %clang
spack install -v py-pandas %clang
spack install -v --fail-fast py-scipy %apple-clang
spack install -v --fail-fast py-matplotlib %apple-clang
spack install -v --fail-fast py-pandas %apple-clang
install_mpi4py_clang:
name: mpi4py, petsc4py
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: spack install
run: |
. .github/workflows/install_spack.sh
spack install -v py-mpi4py %clang
spack install -v py-petsc4py %clang
spack install -v --fail-fast py-mpi4py %apple-clang
spack install -v --fail-fast py-petsc4py %apple-clang

View File

@@ -3,27 +3,22 @@ name: macos tests
on:
push:
branches:
- master
- develop
- releases/**
pull_request:
branches:
- master
- develop
- releases/**
jobs:
build:
runs-on: macos-latest
strategy:
matrix:
python-version: [3.7]
steps:
- uses: actions/checkout@v2
- name: Setup Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
fetch-depth: 0
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools
@@ -31,19 +26,16 @@ jobs:
pip install --upgrade flake8 pep8-naming
- name: Setup Homebrew packages
run: |
brew update
brew upgrade
brew install gcc gnupg2 dash kcov
- name: Run unit tests
run: |
git --version
git fetch -u origin develop:develop
. .github/workflows/setup_git.sh
. share/spack/setup-env.sh
coverage run $(which spack) test
coverage run $(which spack) unit-test
coverage combine
coverage xml
- name: Upload to codecov.io
uses: codecov/codecov-action@v1
- uses: codecov/codecov-action@v1
with:
file: ./coverage.xml
flags: unittests,macos

View File

@@ -1,31 +0,0 @@
name: python version check
on:
push:
branches:
- master
- develop
- releases/**
pull_request:
branches:
- master
- develop
jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Setup Python
uses: actions/setup-python@v1
with:
python-version: 3.7
- name: Install Python Packages
run: |
pip install --upgrade pip
pip install --upgrade vermin
- name: Minimum Version (Spack's Core)
run: vermin --backport argparse -t=2.6- -t=3.5- -v lib/spack/spack/ lib/spack/llnl/ bin/
- name: Minimum Version (Repositories)
run: vermin --backport argparse -t=2.6- -t=3.5- -v var/spack/repos

.github/workflows/setup_git.sh vendored Executable file
View File

@@ -0,0 +1,9 @@
#!/usr/bin/env sh
git config --global user.email "spack@example.com"
git config --global user.name "Test User"
# With fetch-depth: 0 we have a remote develop
# but not a local branch. Don't do this on develop
if [ "$(git branch --show-current)" != "develop" ]
then
git branch develop origin/develop
fi

.github/workflows/style_and_docs.yaml vendored Normal file
View File

@@ -0,0 +1,65 @@
name: style and docs
on:
push:
branches:
- develop
- releases/**
pull_request:
branches:
- develop
- releases/**
jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install Python Packages
run: |
pip install --upgrade pip
pip install --upgrade vermin
- name: Minimum Version (Spack's Core)
run: vermin --backport argparse -t=2.6- -t=3.5- -v lib/spack/spack/ lib/spack/llnl/ bin/
- name: Minimum Version (Repositories)
run: vermin --backport argparse -t=2.6- -t=3.5- -v var/spack/repos
flake8:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools flake8
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
- name: Run flake8 tests
run: |
share/spack/qa/run-flake8-tests
documentation:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install System packages
run: |
sudo apt-get -y update
sudo apt-get install -y coreutils ninja-build graphviz
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools
pip install --upgrade -r lib/spack/docs/requirements.txt
- name: Build documentation
run: |
share/spack/qa/run-doc-tests

View File

@@ -20,8 +20,8 @@ Geoffrey Oxberry <oxberry1@llnl.gov> Geoffrey Oxberry
Glenn Johnson <glenn-johnson@uiowa.edu> Glenn Johnson <gjohnson@argon-ohpc.hpc.uiowa.edu>
Glenn Johnson <glenn-johnson@uiowa.edu> Glenn Johnson <glennpj@gmail.com>
Gregory Becker <becker33@llnl.gov> Gregory Becker <becker33.llnl.gov>
Gregory Becker <becker33@llnl.gov> becker33 <becker33.llnl.gov>
Gregory Becker <becker33@llnl.gov> becker33 <becker33@llnl.gov>
Gregory Becker <becker33@llnl.gov> Gregory Becker <becker33.llnl.gov>
Gregory Becker <becker33@llnl.gov> Gregory Becker <becker33@llnl.gov>
Gregory L. Lee <lee218@llnl.gov> Greg Lee <lee218@llnl.gov>
Gregory L. Lee <lee218@llnl.gov> Gregory L. Lee <lee218@cab687.llnl.gov>
Gregory L. Lee <lee218@llnl.gov> Gregory L. Lee <lee218@cab690.llnl.gov>

View File

@@ -1,152 +0,0 @@
#=============================================================================
# Project settings
#=============================================================================
# Only build master and develop on push; do not build every branch.
branches:
only:
- master
- develop
- /^releases\/.*$/
#=============================================================================
# Build matrix
#=============================================================================
dist: bionic
jobs:
fast_finish: true
include:
- stage: 'style checks'
python: '3.8'
os: linux
language: python
env: TEST_SUITE=flake8
- stage: 'unit tests + documentation'
python: '2.6'
dist: trusty
os: linux
language: python
addons:
apt:
# Everything but patchelf, that is not available for trusty
packages:
- ccache
- gfortran
- graphviz
- gnupg2
- kcov
- mercurial
- ninja-build
- realpath
- zsh
- fish
env: [ TEST_SUITE=unit, COVERAGE=true ]
- python: '3.8'
os: linux
language: python
env: [ TEST_SUITE=shell, COVERAGE=true, KCOV_VERSION=38 ]
- python: '3.8'
os: linux
language: python
env: TEST_SUITE=doc
stages:
- 'style checks'
- 'unit tests + documentation'
#=============================================================================
# Environment
#=============================================================================
# Docs need graphviz to build
addons:
# for Linux builds, we use APT
apt:
packages:
- ccache
- coreutils
- gfortran
- graphviz
- gnupg2
- mercurial
- ninja-build
- patchelf
- zsh
- fish
update: true
# ~/.ccache needs to be cached directly as Travis is not taking care of it
# (possibly because we use 'language: python' and not 'language: c')
cache:
pip: true
ccache: true
directories:
- ~/.ccache
before_install:
- ccache -M 2G && ccache -z
# Install kcov manually, since it's not packaged for bionic beaver
- if [[ "$KCOV_VERSION" ]]; then
sudo apt-get -y install cmake binutils-dev libcurl4-openssl-dev zlib1g-dev libdw-dev libiberty-dev;
KCOV_ROOT=$(mktemp -d);
wget --output-document=${KCOV_ROOT}/${KCOV_VERSION}.tar.gz https://github.com/SimonKagstrom/kcov/archive/v${KCOV_VERSION}.tar.gz;
tar -C ${KCOV_ROOT} -xzvf ${KCOV_ROOT}/${KCOV_VERSION}.tar.gz;
mkdir -p ${KCOV_ROOT}/build;
cd ${KCOV_ROOT}/build && cmake -Wno-dev ${KCOV_ROOT}/kcov-${KCOV_VERSION} && cd - ;
make -C ${KCOV_ROOT}/build && sudo make -C ${KCOV_ROOT}/build install;
fi
# Install various dependencies
install:
- pip install --upgrade pip
- pip install --upgrade six
- pip install --upgrade setuptools
- pip install --upgrade codecov coverage==4.5.4
- pip install --upgrade flake8
- pip install --upgrade pep8-naming
- if [[ "$TEST_SUITE" == "doc" ]]; then
pip install --upgrade -r lib/spack/docs/requirements.txt;
fi
before_script:
# Need this for the git tests to succeed.
- git config --global user.email "spack@example.com"
- git config --global user.name "Test User"
# Need this to be able to compute the list of changed files
- git fetch origin ${TRAVIS_BRANCH}:${TRAVIS_BRANCH}
#=============================================================================
# Building
#=============================================================================
script:
- share/spack/qa/run-$TEST_SUITE-tests
after_success:
- ccache -s
- case "$TEST_SUITE" in
unit)
if [[ "$COVERAGE" == "true" ]]; then
codecov --env PYTHON_VERSION
--required
--flags "${TEST_SUITE}${TRAVIS_OS_NAME}";
fi
;;
shell)
codecov --env PYTHON_VERSION
--required
--flags "${TEST_SUITE}${TRAVIS_OS_NAME}";
esac
#=============================================================================
# Notifications
#=============================================================================
notifications:
email:
recipients:
- tgamblin@llnl.gov
- massimiliano.culpo@gmail.com
on_success: change
on_failure: always

View File

@@ -1,3 +1,32 @@
# v0.15.4 (2020-08-12)
This release contains one feature addition:
* Users can set `SPACK_GNUPGHOME` to override Spack's GPG path (#17139)
Several bugfixes for CUDA, binary packaging, and `spack -V`:
* CUDA package's `.libs` method searches for `libcudart` instead of `libcuda` (#18000)
* Don't set `CUDAHOSTCXX` in environments that contain CUDA (#17826)
* `buildcache create`: `NoOverwriteException` is a warning, not an error (#17832)
* Fix `spack buildcache list --allarch` (#17884)
* `spack -V` works with `releases/latest` tag and shallow clones (#17884)
And fixes for GitHub Actions and tests to ensure that CI passes on the
release branch (#15687, #17279, #17328, #17377, #17732).
# v0.15.3 (2020-07-28)
This release contains the following bugfixes:
* Fix handling of relative view paths (#17721)
* Fixes for binary relocation (#17418, #17455)
* Fix redundant printing of error messages in build environment (#17709)
It also adds a support script for Spack tutorials:
* Add a tutorial setup script to share/spack (#17705, #17722)
# v0.15.2 (2020-07-23)
This minor release includes two new features:

View File

@@ -1,20 +1,21 @@
Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
MIT License
Permission is hereby granted, free of charge, to any person obtaining a
copy of this software and associated documentation files (the "Software"),
to deal in the Software without restriction, including without limitation
the rights to use, copy, modify, merge, publish, distribute, sublicense,
and/or sell copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following conditions:
Copyright (c) 2013-2020 LLNS, LLC and other Spack Project Developers.
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@@ -4,7 +4,6 @@
[![Linux Tests](https://github.com/spack/spack/workflows/linux%20tests/badge.svg)](https://github.com/spack/spack/actions)
[![Linux Builds](https://github.com/spack/spack/workflows/linux%20builds/badge.svg)](https://github.com/spack/spack/actions)
[![macOS Builds (nightly)](https://github.com/spack/spack/workflows/macOS%20builds%20nightly/badge.svg?branch=develop)](https://github.com/spack/spack/actions?query=workflow%3A%22macOS+builds+nightly%22)
[![Build Status](https://travis-ci.com/spack/spack.svg?branch=develop)](https://travis-ci.com/spack/spack)
[![codecov](https://codecov.io/gh/spack/spack/branch/develop/graph/badge.svg)](https://codecov.io/gh/spack/spack)
[![Read the Docs](https://readthedocs.org/projects/spack/badge/?version=latest)](https://spack.readthedocs.io)
[![Slack](https://spackpm.herokuapp.com/badge.svg)](https://spackpm.herokuapp.com)
@@ -74,15 +73,33 @@ When you send your request, make ``develop`` the destination branch on the
Your PR must pass Spack's unit tests and documentation tests, and must be
[PEP 8](https://www.python.org/dev/peps/pep-0008/) compliant. We enforce
these guidelines with [Travis CI](https://travis-ci.org/spack/spack). To
run these tests locally, and for helpful tips on git, see our
these guidelines with our CI process. To run these tests locally, and for
helpful tips on git, see our
[Contribution Guide](http://spack.readthedocs.io/en/latest/contribution_guide.html).
Spack uses a rough approximation of the
[Git Flow](http://nvie.com/posts/a-successful-git-branching-model/)
branching model. The ``develop`` branch contains the latest
contributions, and ``master`` is always tagged and points to the latest
stable release.
Spack's `develop` branch has the latest contributions. Pull requests
should target `develop`, and users who want the latest package versions,
features, etc. can use `develop`.
Releases
--------
For multi-user site deployments or other use cases that need very stable
software installations, we recommend using Spack's
[stable releases](https://github.com/spack/spack/releases).
Each Spack release series also has a corresponding branch, e.g.
`releases/v0.14` has `0.14.x` versions of Spack, and `releases/v0.13` has
`0.13.x` versions. We backport important bug fixes to these branches but
we do not advance the package versions or make other changes that would
change the way Spack concretizes dependencies within a release branch.
So, you can base your Spack deployment on a release branch and `git pull`
to get fixes, without the package churn that comes with `develop`.
The latest release is always available with the `releases/latest` tag.
See the [docs on releases](https://spack.readthedocs.io/en/latest/developer_guide.html#releases)
for more details.
Code of Conduct
------------------------

View File

@@ -64,6 +64,10 @@ config:
- ~/.spack/stage
# - $spack/var/spack/stage
# Directory in which to run tests and store test results.
# Tests will be stored in directories named by date/time and package
# name/hash.
test_stage: ~/.spack/test
# Cache directory for already downloaded source tarballs and archived
# repositories. This can be purged with `spack clean --downloads`.

View File

@@ -21,11 +21,14 @@ packages:
- gcc
- intel
providers:
elf: [libelf]
unwind: [apple-libunwind]
elf:
- libelf
unwind:
- apple-libunwind
apple-libunwind:
paths:
buildable: false
externals:
# Apple bundles libunwind version 35.3 with macOS 10.9 and later,
# although the version number used here isn't critical
apple-libunwind@35.3: /usr
buildable: False
- spec: apple-libunwind@35.3
prefix: /usr

View File

@@ -38,7 +38,7 @@ packages:
mpi: [openmpi, mpich]
mysql-client: [mysql, mariadb-c-client]
opencl: [pocl]
pil: [py-pillow]
pil: [py-pillow-simd]
pkgconfig: [pkgconf, pkg-config]
rpc: [libtirpc]
scalapack: [netlib-scalapack]

View File

@@ -695,11 +695,11 @@ Here is an example of a much longer spec than we've seen thus far:
.. code-block:: none
mpileaks @1.2:1.4 %gcc@4.7.5 +debug -qt arch=bgq_os ^callpath @1.1 %gcc@4.7.2
mpileaks @1.2:1.4 %gcc@4.7.5 +debug -qt target=x86_64 ^callpath @1.1 %gcc@4.7.2
If provided to ``spack install``, this will install the ``mpileaks``
library at some version between ``1.2`` and ``1.4`` (inclusive),
built using ``gcc`` at version 4.7.5 for the Blue Gene/Q architecture,
built using ``gcc`` at version 4.7.5 for a generic ``x86_64`` architecture,
with debug options enabled, and without Qt support. Additionally, it
says to link it with the ``callpath`` library (which it depends on),
and to build callpath with ``gcc`` 4.7.2. Most specs will not be as

View File

@@ -57,10 +57,13 @@ directory. Here's an example of an external configuration:
packages:
openmpi:
paths:
openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64: /opt/openmpi-1.4.3
openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug: /opt/openmpi-1.4.3-debug
openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64: /opt/openmpi-1.6.5-intel
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
This example lists three installations of OpenMPI, one built with GCC,
one built with GCC and debug information, and another built with Intel.
@@ -76,13 +79,15 @@ of the installation prefixes. The following example says that module
.. code-block:: yaml
cmake:
modules:
cmake@3.7.2: CMake/3.7.2
externals:
- spec: cmake@3.7.2
modules:
- CMake/3.7.2
Each ``packages.yaml`` begins with a ``packages:`` token, followed
by a list of package names. To specify externals, add a ``paths`` or ``modules``
token under the package name, which lists externals in a
``spec: /path`` or ``spec: module-name`` format. Each spec should be as
Each ``packages.yaml`` begins with a ``packages:`` attribute, followed
by a list of package names. To specify externals, add an ``externals:``
attribute under the package name, which lists externals.
Each external should specify a ``spec:`` string that should be as
well-defined as reasonably possible. If a
package lacks a spec component, such as missing a compiler or
package version, then Spack will guess the missing component based
@@ -106,10 +111,13 @@ be:
packages:
openmpi:
paths:
openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64: /opt/openmpi-1.4.3
openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug: /opt/openmpi-1.4.3-debug
openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64: /opt/openmpi-1.6.5-intel
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
buildable: False
The addition of the ``buildable`` flag tells Spack that it should never build
@@ -137,10 +145,13 @@ but more conveniently:
mpi:
buildable: False
openmpi:
paths:
openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64: /opt/openmpi-1.4.3
openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug: /opt/openmpi-1.4.3-debug
openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64: /opt/openmpi-1.6.5-intel
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
Implementations can also be listed immediately under the virtual they provide:
@@ -172,8 +183,9 @@ After running this command your ``packages.yaml`` may include new entries:
packages:
cmake:
paths:
cmake@3.17.2: /usr
externals:
- spec: cmake@3.17.2
prefix: /usr
Generally this is useful for detecting a small set of commonly-used packages;
for now this is generally limited to finding build-only dependencies.

View File

@@ -29,6 +29,7 @@ on these ideas for each distinct build system that Spack supports:
:maxdepth: 1
:caption: Make-incompatible
build_systems/mavenpackage
build_systems/sconspackage
build_systems/wafpackage

View File

@@ -175,7 +175,7 @@ In the ``perl`` package, we can see:
@run_after('build')
@on_package_attributes(run_tests=True)
def test(self):
def build_test(self):
make('test')
As you can guess, this runs ``make test`` *after* building the package,

View File

@@ -418,9 +418,13 @@ Adapt the following example. Be sure to maintain the indentation:
# other content ...
intel-mkl:
modules:
intel-mkl@2018.2.199 arch=linux-centos6-x86_64: intel-mkl/18/18.0.2
intel-mkl@2018.3.222 arch=linux-centos6-x86_64: intel-mkl/18/18.0.3
externals:
- spec: "intel-mkl@2018.2.199 arch=linux-centos6-x86_64"
modules:
- intel-mkl/18/18.0.2
- spec: "intel-mkl@2018.3.222 arch=linux-centos6-x86_64"
modules:
- intel-mkl/18/18.0.3
The version numbers for the ``intel-mkl`` specs defined here correspond to file
and directory names that Intel uses for its products because they were adopted
@@ -451,12 +455,16 @@ mechanism.
packages:
intel-parallel-studio:
modules:
intel-parallel-studio@cluster.2018.2.199 +mkl+mpi+ipp+tbb+daal arch=linux-centos6-x86_64: intel/18/18.0.2
intel-parallel-studio@cluster.2018.3.222 +mkl+mpi+ipp+tbb+daal arch=linux-centos6-x86_64: intel/18/18.0.3
externals:
- spec: "intel-parallel-studio@cluster.2018.2.199 +mkl+mpi+ipp+tbb+daal arch=linux-centos6-x86_64"
modules:
- intel/18/18.0.2
- spec: "intel-parallel-studio@cluster.2018.3.222 +mkl+mpi+ipp+tbb+daal arch=linux-centos6-x86_64"
modules:
- intel/18/18.0.3
buildable: False
One additional example illustrates the use of ``paths:`` instead of
One additional example illustrates the use of ``prefix:`` instead of
``modules:``, useful when external modulefiles are not available or not
suitable:
@@ -464,13 +472,15 @@ suitable:
packages:
intel-parallel-studio:
paths:
intel-parallel-studio@cluster.2018.2.199 +mkl+mpi+ipp+tbb+daal: /opt/intel
intel-parallel-studio@cluster.2018.3.222 +mkl+mpi+ipp+tbb+daal: /opt/intel
externals:
- spec: "intel-parallel-studio@cluster.2018.2.199 +mkl+mpi+ipp+tbb+daal"
prefix: /opt/intel
- spec: "intel-parallel-studio@cluster.2018.3.222 +mkl+mpi+ipp+tbb+daal"
prefix: /opt/intel
buildable: False
Note that for the Intel packages discussed here, the directory values in the
``paths:`` entries must be the high-level and typically version-less
``prefix:`` entries must be the high-level and typically version-less
"installation directory" that has been used by Intel's product installer.
Such a directory will typically accumulate various product versions. Amongst
them, Spack will select the correct version-specific product directory based on

View File

@@ -0,0 +1,84 @@
.. Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _mavenpackage:
------------
MavenPackage
------------
Apache Maven is a general-purpose build system that does not rely
on Makefiles to build software. It is designed for building and
managing Java-based projects.
^^^^^^
Phases
^^^^^^
The ``MavenPackage`` base class comes with the following phases:
#. ``build`` - compile code and package into a JAR file
#. ``install`` - copy to installation prefix
By default, these phases run:
.. code-block:: console
$ mvn package
$ install . <prefix>
^^^^^^^^^^^^^^^
Important files
^^^^^^^^^^^^^^^
Maven packages can be identified by the presence of a ``pom.xml`` file.
This file lists dependencies and other metadata about the project.
There may also be configuration files in the ``.mvn`` directory.
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
Maven requires the ``mvn`` executable to build the project. It also
requires Java at both build- and run-time. Because of this, the base
class automatically adds the following dependencies:
.. code-block:: python
depends_on('java', type=('build', 'run'))
depends_on('maven', type='build')
In the ``pom.xml`` file, you may see sections like:
.. code-block:: xml
<requireJavaVersion>
<version>[1.7,)</version>
</requireJavaVersion>
<requireMavenVersion>
<version>[3.5.4,)</version>
</requireMavenVersion>
This specifies the versions of Java and Maven that are required to
build the package. See
https://docs.oracle.com/middleware/1212/core/MAVEN/maven_version.htm#MAVEN402
for a description of this version range syntax. In this case, you
should add:
.. code-block:: python
depends_on('java@7:', type='build')
depends_on('maven@3.5.4:', type='build')
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on the Maven build system, see:
https://maven.apache.org/index.html

View File

@@ -81,6 +81,24 @@ you'll need to define a function for it like so:
self.setup_py('configure')
^^^^^^
Wheels
^^^^^^
Some Python packages are closed-source and distributed as wheels.
Instead of using the ``PythonPackage`` base class, you should extend
the ``Package`` base class and implement the following custom installation
procedure:
.. code-block::
def install(self, spec, prefix):
pip = which('pip')
pip('install', self.stage.archive_file, '--prefix={0}'.format(prefix))
This will require a dependency on pip, as mentioned below.
^^^^^^^^^^^^^^^
Important files
^^^^^^^^^^^^^^^
@@ -95,6 +113,27 @@ file should be considered to be the truth. As dependencies are added or
removed, the documentation is much more likely to become outdated than
the ``setup.py``.
The Python ecosystem has evolved significantly over the years. Before
setuptools became popular, most packages listed their dependencies in a
``requirements.txt`` file. Once setuptools took over, these dependencies
were listed directly in the ``setup.py``. Newer PEPs introduced additional
files, like ``setup.cfg`` and ``pyproject.toml``. You should look out for
all of these files, as they may all contain important information about
package dependencies.
Some Python packages are closed-source and are distributed as Python
wheels. For example, ``py-azureml-sdk`` downloads a ``.whl`` file. This
file is simply a zip file, and can be extracted using:
.. code-block:: console
$ unzip *.whl
The zip file will not contain a ``setup.py``, but it will contain a
``METADATA`` file which contains all the information you need to
write a ``package.py`` build recipe.
^^^^^^^^^^^^^^^^^^^^^^^
Finding Python packages
^^^^^^^^^^^^^^^^^^^^^^^
@@ -105,8 +144,9 @@ it the only option for developers who want a simple installation.
Search for "PyPI <package-name>" to find the download page. Note that
some pages are versioned, and the first result may not be the newest
version. Click on the "Latest Version" button to the top right to see
if a newer version is available. The download page is usually at:
https://pypi.org/project/<package-name>
if a newer version is available. The download page is usually at::
https://pypi.org/project/<package-name>
^^^^^^^^^^^
Description
@@ -151,39 +191,67 @@ replacing this with the requested version. Obviously, if Spack cannot
guess the version correctly, or if non-version-related things change
in the URL, Spack cannot substitute the version properly.
Once upon a time, PyPI offered nice, simple download URLs like:
https://pypi.python.org/packages/source/n/numpy/numpy-1.13.1.zip
Once upon a time, PyPI offered nice, simple download URLs like::
https://pypi.python.org/packages/source/n/numpy/numpy-1.13.1.zip
As you can see, the version is 1.13.1. It probably isn't hard to guess
what URL to use to download version 1.12.0, and Spack was perfectly
capable of performing this calculation.
However, PyPI switched to a new download URL format:
https://pypi.python.org/packages/c0/3a/40967d9f5675fbb097ffec170f59c2ba19fc96373e73ad47c2cae9a30aed/numpy-1.13.1.zip#md5=2c3c0f4edf720c3a7b525dacc825b9ae
However, PyPI switched to a new download URL format::
https://pypi.python.org/packages/c0/3a/40967d9f5675fbb097ffec170f59c2ba19fc96373e73ad47c2cae9a30aed/numpy-1.13.1.zip#md5=2c3c0f4edf720c3a7b525dacc825b9ae
and more recently::
https://files.pythonhosted.org/packages/b0/2b/497c2bb7c660b2606d4a96e2035e92554429e139c6c71cdff67af66b58d2/numpy-1.14.3.zip
and more recently:
https://files.pythonhosted.org/packages/b0/2b/497c2bb7c660b2606d4a96e2035e92554429e139c6c71cdff67af66b58d2/numpy-1.14.3.zip
As you can imagine, it is impossible for Spack to guess what URL to
use to download version 1.12.0 given this URL. There is a solution,
however. PyPI offers a new hidden interface for downloading
Python packages that does not include a hash in the URL::

   https://pypi.io/packages/source/n/numpy/numpy-1.13.1.zip

This URL redirects to the https://files.pythonhosted.org URL. The general
syntax for this https://pypi.io URL is::

   https://pypi.io/packages/<type>/<first-letter-of-name>/<name>/<name>-<version>.<extension>
Please use the https://pypi.io URL instead of the https://pypi.python.org
URL. If both ``.tar.gz`` and ``.zip`` versions are available, ``.tar.gz``
is preferred. If some releases offer both ``.tar.gz`` and ``.zip`` versions,
but some only offer ``.zip`` versions, use ``.zip``.
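Filling in the https://pypi.io URL template is mechanical enough to script. A small illustrative helper (hypothetical, not part of Spack):

```python
def pypi_download_url(name, version, ext="tar.gz", pkg_type="source"):
    """Build the stable pypi.io URL following the template
    /packages/<type>/<first-letter-of-name>/<name>/<name>-<version>.<ext>.
    """
    return "https://pypi.io/packages/{0}/{1}/{2}/{2}-{3}.{4}".format(
        pkg_type, name[0], name, version, ext)
```

For example, ``pypi_download_url("numpy", "1.13.1", ext="zip")`` reproduces the numpy URL shown above.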
Some Python packages are closed-source and do not ship ``.tar.gz`` or ``.zip``
files on either PyPI or GitHub. If this is the case, you can still download
and install a Python wheel. For example, ``py-azureml-sdk`` is closed source
and can be downloaded from::
https://pypi.io/packages/py3/a/azureml_sdk/azureml_sdk-1.11.0-py3-none-any.whl
Note that instead of ``<type>`` being ``source``, it is now ``py3`` since this
wheel will work for any generic version of Python 3. You may see Python-specific
or OS-specific URLs. Note that when you add a ``.whl`` URL, you should add
``expand=False`` to ensure that Spack doesn't try to extract the wheel:
.. code-block:: python
version('1.11.0', sha256='d8c9d24ea90457214d798b0d922489863dad518adde3638e08ef62de28fb183a', expand=False)
"""""""""""""""
PyPI vs. GitHub
"""""""""""""""
Many packages are hosted on PyPI, but are developed on GitHub and other
Many packages are hosted on PyPI, but are developed on GitHub or another
version control system. The tarball can be downloaded from either
location, but PyPI is preferred for the following reasons:
@@ -226,7 +294,7 @@ location, but PyPI is preferred for the following reasons:
There are some reasons to prefer downloading from GitHub:
#. The GitHub tarball may contain unit tests
#. The GitHub tarball may contain unit tests.
As previously mentioned, the PyPI tarball contains the bare minimum
of files to install the package. Unless explicitly specified by the
@@ -234,12 +302,6 @@ There are some reasons to prefer downloading from GitHub:
If you desire to run the unit tests during installation, you should
use the GitHub tarball instead.
#. Spack does not yet support ``spack versions`` and ``spack checksum``
with PyPI URLs
These commands work just fine with GitHub URLs. This is a minor
annoyance, not a reason to prefer GitHub over PyPI.
If you really want to run these unit tests, no one will stop you from
submitting a PR for a new package that downloads from GitHub.
@@ -280,8 +342,8 @@ If Python 2.7 is the only version that works, you can use:
The documentation may not always specify supported Python versions.
Another place to check is in the ``setup.py`` file. Look for a line
containing ``python_requires``. An example from
Another place to check is in the ``setup.py`` or ``setup.cfg`` file.
Look for a line containing ``python_requires``. An example from
`py-numpy <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/py-numpy/package.py>`_
looks like:
@@ -290,7 +352,7 @@ looks like:
python_requires='>=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*'
More commonly, you will find a version check at the top of the file:
You may also find a version check at the top of the ``setup.py``:
.. code-block:: python
@@ -305,6 +367,39 @@ This can be converted to Spack's spec notation like so:
depends_on('python@2.7:2.8,3.4:', type=('build', 'run'))
If you are writing a recipe for a package that only distributes
wheels, look for a section in the ``METADATA`` file that looks like::
Requires-Python: >=3.5,<4
This would be translated to:
.. code-block:: python
extends('python')
depends_on('python@3.5:3.999', type=('build', 'run'))
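The translation above can be sketched in code for the simple ``>=``/``<`` case. This is a toy helper, not part of Spack: it ignores ``!=`` exclusions and assumes a whole-number upper bound like ``<4``.

```python
def requires_python_to_spack(spec):
    """Translate a simple ``Requires-Python`` specifier into a Spack range.

    Handles only ``>=X`` and ``<N`` clauses, where N is a whole number.
    """
    lo = hi = None
    for clause in spec.replace(" ", "").split(","):
        if clause.startswith(">="):
            lo = clause[2:]
        elif clause.startswith("<"):
            hi = clause[1:]
    rng = (lo or "") + ":"
    if hi:
        # Spack ranges are inclusive, so '<4' becomes an upper bound of 3.999
        rng += str(int(hi.split(".")[0]) - 1) + ".999"
    return rng
```

With this sketch, ``">=3.5,<4"`` maps to ``"3.5:3.999"``, matching the ``depends_on`` line above.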
Many ``setup.py`` or ``setup.cfg`` files also contain information like::
Programming Language :: Python :: 2
Programming Language :: Python :: 2.6
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3
Programming Language :: Python :: 3.3
Programming Language :: Python :: 3.4
Programming Language :: Python :: 3.5
Programming Language :: Python :: 3.6
This is a list of versions of Python that the developer likely tests.
However, you should not use this to restrict the versions of Python
the package uses unless one of the two former methods (``python_requires``
or ``sys.version_info``) is used. There is no logic in setuptools
that prevents the package from building for Python versions not in
this list, and often new releases like Python 3.7 or 3.8 work just fine.
""""""""""
setuptools
""""""""""
@@ -317,7 +412,7 @@ Most notably, there was no way to list a project's dependencies
with distutils. Along came setuptools, a non-builtin build system
designed to overcome the limitations of distutils. Both projects
use a similar API, making the transition easy while adding much
needed functionality. Today, setuptools is used in around 75% of
needed functionality. Today, setuptools is used in around 90% of
the Python packages in Spack.
Since setuptools isn't built-in to Python, you need to add it as a
@@ -360,6 +455,20 @@ run-time. This can be specified as:
depends_on('py-setuptools', type='build')
"""
pip
"""
Packages distributed as Python wheels will require an extra dependency
on pip:
.. code-block:: python
depends_on('py-pip', type='build')
We will use pip to install the actual wheel.
""""""
cython
""""""
@@ -383,6 +492,12 @@ where speed is crucial. There is no reason why someone would not
want an optimized version of a library instead of the pure-Python
version.
Note that some release tarballs come pre-cythonized, and cython is
not needed as a dependency. However, this is becoming less common
as Python continues to evolve and developers discover that cythonized
sources are no longer compatible with newer versions of Python and
need to be re-cythonized.
^^^^^^^^^^^^^^^^^^^
Python dependencies
^^^^^^^^^^^^^^^^^^^
@@ -429,15 +544,26 @@ Obviously, this means that ``py-numpy`` is a dependency.
If the package uses ``setuptools``, check for the following clues:
* ``python_requires``
As mentioned above, this specifies which versions of Python are
required.
* ``setup_requires``
These packages are usually only needed at build-time, so you can
add them with ``type='build'``.
* ``install_requires``
These packages are required for installation.
These packages are required for building and installation. You can
add them with ``type=('build', 'run')``.
* ``extra_requires``
These packages are optional dependencies that enable additional
functionality. You should add a variant that optionally adds these
dependencies.
dependencies. This variant should be False by default.
* ``test_requires``
@@ -461,13 +587,37 @@ sphinx. If you can't find any information about the package's
dependencies, you can take a look in ``requirements.txt``, but be sure
not to add test or documentation dependencies.
Newer PEPs have added alternative ways to specify a package's dependencies.
If you don't see any dependencies listed in the ``setup.py``, look for a
``setup.cfg`` or ``pyproject.toml``. These files can be used to store the
same ``install_requires`` information that ``setup.py`` used to use.
If you are writing a recipe for a package that only distributes wheels,
check the ``METADATA`` file for lines like::
Requires-Dist: azureml-core (~=1.11.0)
Requires-Dist: azureml-dataset-runtime[fuse] (~=1.11.0)
Requires-Dist: azureml-train (~=1.11.0)
Requires-Dist: azureml-train-automl-client (~=1.11.0)
Requires-Dist: azureml-pipeline (~=1.11.0)
Provides-Extra: accel-models
Requires-Dist: azureml-accel-models (~=1.11.0); extra == 'accel-models'
Provides-Extra: automl
Requires-Dist: azureml-train-automl (~=1.11.0); extra == 'automl'
Lines that use ``Requires-Dist`` are similar to ``install_requires``.
Lines that use ``Provides-Extra`` are similar to ``extra_requires``,
and you can add a variant for those dependencies. The ``~=1.11.0``
syntax is equivalent to ``1.11.0:1.11.999``.
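As a sketch, a ``Requires-Dist`` line with a ``~=`` pin can be converted mechanically. This helper is hypothetical and illustrative only; real ``METADATA`` files also carry extras and environment markers that need a human eye.

```python
import re

def requires_dist_to_depends_on(line):
    """Turn a METADATA ``Requires-Dist`` line with a ``~=x.y.z`` pin into
    a Spack ``depends_on`` string; return None for non-matching lines.
    """
    m = re.match(
        r"Requires-Dist: ([A-Za-z0-9_.-]+) \(~=(\d+)\.(\d+)\.(\d+)\)", line)
    if not m:
        return None
    name, major, minor, patch = m.groups()
    # ~=1.11.0 means >=1.11.0, ==1.11.*, i.e. 1.11.0:1.11.999 in Spack
    return ("depends_on('py-{0}@{1}.{2}.{3}:{1}.{2}.999', "
            "type=('build', 'run'))").format(
                name.replace("_", "-"), major, minor, patch)
```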
""""""""""
setuptools
""""""""""
Setuptools is a bit of a special case. If a package requires setuptools
at run-time, how do they express this? They could add it to
``install_requires``, but setuptools is imported long before this and
``install_requires``, but setuptools is imported long before this and is
needed to read this line. And since you can't install the package
without setuptools, the developers assume that setuptools will already
be there, so they never mention when it is required. We don't want to
@@ -580,11 +730,13 @@ By default, Spack runs:
if it detects that the ``setup.py`` file supports a ``test`` phase.
You can add additional build-time or install-time tests by overriding
``test`` and ``installtest``, respectively. For example, ``py-numpy``
adds:
``test`` or adding a custom install-time test function. For example,
``py-numpy`` adds:
.. code-block:: python
install_time_test_callbacks = ['install_test', 'import_module_test']
def install_test(self):
with working_dir('..'):
python('-c', 'import numpy; numpy.test("full", verbose=2)')
@@ -651,6 +803,8 @@ that the package uses the ``PythonPackage`` build system. However, there
are occasionally packages that use ``PythonPackage`` that shouldn't
start with ``py-``. For example:
* awscli
* aws-parallelcluster
* busco
* easybuild
* httpie
@@ -736,8 +890,9 @@ non-Python dependencies. Anaconda contains many Python packages that
are not yet in Spack, and Spack contains many Python packages that are
not yet in Anaconda. The main advantage of Spack over Anaconda is its
ability to choose a specific compiler and BLAS/LAPACK or MPI library.
Spack also has better platform support for supercomputers. On the
other hand, Anaconda offers Windows support.
Spack also has better platform support for supercomputers, and can build
optimized binaries for your specific microarchitecture. On the other hand,
Anaconda offers Windows support.
^^^^^^^^^^^^^^^^^^^^^^
External documentation

View File

@@ -12,5 +12,173 @@ RubyPackage
Like Perl, Python, and R, Ruby has its own build system for
installing Ruby gems.
This build system is a work-in-progress. See
https://github.com/spack/spack/pull/3127 for more information.
^^^^^^
Phases
^^^^^^
The ``RubyPackage`` base class provides the following phases that
can be overridden:
#. ``build`` - build everything needed to install
#. ``install`` - install everything from build directory
For packages that come with a ``*.gemspec`` file, these phases run:
.. code-block:: console
$ gem build *.gemspec
$ gem install *.gem
For packages that come with a ``Rakefile`` file, these phases run:
.. code-block:: console
$ rake package
$ gem install *.gem
For packages that come pre-packaged as a ``*.gem`` file, the build
phase is skipped and the install phase runs:
.. code-block:: console
$ gem install *.gem
These are all standard ``gem`` commands and can be found by running:
.. code-block:: console
$ gem help commands
For packages that only distribute ``*.gem`` files, these files can be
downloaded with the ``expand=False`` option in the ``version`` directive.
The build phase will be automatically skipped.
^^^^^^^^^^^^^^^
Important files
^^^^^^^^^^^^^^^
When building from source, Ruby packages can be identified by the
presence of any of the following files:
* ``*.gemspec``
* ``Rakefile``
* ``setup.rb`` (not yet supported)
However, not all Ruby packages are released as source code. Some are only
released as ``*.gem`` files. These files can be extracted using:
.. code-block:: console
$ gem unpack *.gem
^^^^^^^^^^^
Description
^^^^^^^^^^^
The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
summary = 'An implementation of the AsciiDoc text processor and publishing toolchain'
description = 'A fast, open source text processor and publishing toolchain for converting AsciiDoc content to HTML 5, DocBook 5, and other formats.'
Either of these can be used for the description of the Spack package.
^^^^^^^^
Homepage
^^^^^^^^
The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
homepage = 'https://asciidoctor.org'
This should be used as the official homepage of the Spack package.
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
All Ruby packages require Ruby at build and run-time. For this reason,
the base class contains:
.. code-block:: python
extends('ruby')
depends_on('ruby', type=('build', 'run'))
The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
required_ruby_version = '>= 2.3.0'
This can be added to the Spack package using:
.. code-block:: python
depends_on('ruby@2.3.0:', type=('build', 'run'))
^^^^^^^^^^^^^^^^^
Ruby dependencies
^^^^^^^^^^^^^^^^^
When you install a package with ``gem``, it reads the ``*.gemspec``
file in order to determine the dependencies of the package.
If the dependencies are not yet installed, ``gem`` downloads them
and installs them for you. This may sound convenient, but Spack
cannot rely on this behavior for two reasons:
#. Spack needs to be able to install packages on air-gapped networks.
If there is no internet connection, ``gem`` can't download the
package dependencies. By explicitly listing every dependency in
the ``package.py``, Spack knows what to download ahead of time.
#. Duplicate installations of the same dependency may occur.
Spack supports *activation* of Ruby extensions, which involves
symlinking the package installation prefix to the Ruby installation
prefix. If your package is missing a dependency, that dependency
will be installed to the installation directory of the same package.
If you try to activate the package + dependency, it may cause a
problem if that package has already been activated.
For these reasons, you must always explicitly list all dependencies.
Although the documentation may list the package's dependencies,
often the developers assume people will use ``gem`` and won't have to
worry about it. Always check the ``*.gemspec`` file to find the true
dependencies.
Check for the following clues in the ``*.gemspec`` file:
* ``add_runtime_dependency``
These packages are required for installation.
* ``add_dependency``
This is an alias for ``add_runtime_dependency``.
* ``add_development_dependency``
These packages are optional dependencies used for development.
They should not be added as dependencies of the package.
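A rough way to pull these run-time clues out of a ``*.gemspec`` is a line-by-line scan. Gemspecs are Ruby code, so a regex pass like this (illustrative only, not a Spack utility) is only a first approximation:

```python
import re

def gemspec_runtime_deps(gemspec_text):
    """Collect gem names from add_runtime_dependency/add_dependency lines.

    add_development_dependency lines do not match the pattern, which is
    what we want: development dependencies should not become Spack deps.
    """
    deps = []
    for line in gemspec_text.splitlines():
        m = re.search(
            r"add_(?:runtime_)?dependency[ (]+['\"]([\w-]+)['\"]", line)
        if m:
            deps.append(m.group(1))
    return deps
```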
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on Ruby packaging, see:
https://guides.rubygems.org/

View File

@@ -56,7 +56,7 @@ overridden like so:
.. code-block:: python
def test(self):
def build_test(self):
scons('check')

View File

@@ -99,7 +99,7 @@ username is not already in the path, Spack will append the value of ``$user`` to
the selected ``build_stage`` path.
.. warning:: We highly recommend specifying ``build_stage`` paths that
distinguish between staging and other activities to ensure
``spack clean`` does not inadvertently remove unrelated files.
Spack prepends ``spack-stage-`` to temporary staging directory names to
reduce this risk. Using a combination of ``spack`` and/or ``stage`` in
@@ -223,7 +223,7 @@ To build all software in serial, set ``build_jobs`` to 1.
--------------------
When set to ``true`` Spack will use ccache to cache compiles. This is
useful specifically in two cases: (1) when using ``spack setup``, and (2)
useful specifically in two cases: (1) when using ``spack dev-build``, and (2)
when building the same package with many different variants. The default is
``false``.
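Enabling it is a one-line change in ``config.yaml`` (a minimal sketch; ``ccache`` itself must also be installed and available on your ``PATH``):

.. code-block:: yaml

   # ~/.spack/config.yaml
   config:
     ccache: true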


@@ -45,7 +45,7 @@ Environments:
&& echo " view: /opt/view") > /opt/spack-environment/spack.yaml
# Install the software, remove unnecessary deps
RUN cd /opt/spack-environment && spack install && spack gc -y
RUN cd /opt/spack-environment && spack env activate . && spack install && spack gc -y
# Strip all the binaries
RUN find -L /opt/view/* -type f -exec readlink -f '{}' \; | \
@@ -71,7 +71,7 @@ Environments:
&& yum install -y libgomp \
&& rm -rf /var/cache/yum && yum clean all
RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ \[$(tput sgr0)\]"' >> ~/.bashrc
RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ "' >> ~/.bashrc
LABEL "app"="gromacs"
@@ -165,7 +165,7 @@ of environments:
# Extra instructions
extra_instructions:
final: |
RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ \[$(tput sgr0)\]"' >> ~/.bashrc
RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ "' >> ~/.bashrc
# Labels for the image
labels:
@@ -267,7 +267,7 @@ following ``Dockerfile``:
&& echo " view: /opt/view") > /opt/spack-environment/spack.yaml
# Install the software, remove unnecessary deps
RUN cd /opt/spack-environment && spack install && spack gc -y
RUN cd /opt/spack-environment && spack env activate . && spack install && spack gc -y
# Strip all the binaries
RUN find -L /opt/view/* -type f -exec readlink -f '{}' \; | \
@@ -293,7 +293,7 @@ following ``Dockerfile``:
&& yum install -y libgomp \
&& rm -rf /var/cache/yum && yum clean all
RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ \[$(tput sgr0)\]"' >> ~/.bashrc
RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ "' >> ~/.bashrc
LABEL "app"="gromacs"


@@ -27,17 +27,28 @@ correspond to one feature/bugfix/extension/etc. One can create PRs with
changes relevant to different ideas, however reviewing such PRs becomes tedious
and error prone. If possible, try to follow the **one-PR-one-package/feature** rule.
Spack uses a rough approximation of the `Git Flow <http://nvie.com/posts/a-successful-git-branching-model/>`_
branching model. The develop branch contains the latest contributions, and
master is always tagged and points to the latest stable release. Therefore, when
you send your request, make ``develop`` the destination branch on the
`Spack repository <https://github.com/spack/spack>`_.
--------
Branches
--------
Spack's ``develop`` branch has the latest contributions. Nearly all pull
requests should start from ``develop`` and target ``develop``.
There is a branch for each major release series. Release branches
originate from ``develop`` and have tags for each point release in the
series. For example, ``releases/v0.14`` has tags for ``0.14.0``,
``0.14.1``, ``0.14.2``, etc. versions of Spack. We backport important bug
fixes to these branches, but we do not advance the package versions or
make other changes that would change the way Spack concretizes
dependencies. Currently, the maintainers manage these branches by
cherry-picking from ``develop``. See :ref:`releases` for more
information.
----------------------
Continuous Integration
----------------------
Spack uses `Travis CI <https://travis-ci.org/spack/spack>`_ for Continuous Integration
Spack uses `Github Actions <https://docs.github.com/en/actions>`_ for Continuous Integration
testing. This means that every time you submit a pull request, a series of tests will
be run to make sure you didn't accidentally introduce any bugs into Spack. **Your PR
will not be accepted until it passes all of these tests.** While you can certainly wait
@@ -46,25 +57,24 @@ locally to speed up the review process.
.. note::
Oftentimes, Travis will fail for reasons other than a problem with your PR.
Oftentimes, CI will fail for reasons other than a problem with your PR.
For example, apt-get, pip, or homebrew will fail to download one of the
dependencies for the test suite, or a transient bug will cause the unit tests
to timeout. If Travis fails, click the "Details" link and click on the test(s)
to timeout. If any job fails, click the "Details" link and click on the test(s)
that is failing. If it doesn't look like it is failing for reasons related to
your PR, you have two options. If you have write permissions for the Spack
repository, you should see a "Restart job" button on the right-hand side. If
repository, you should see a "Restart workflow" button on the right-hand side. If
not, you can close and reopen your PR to rerun all of the tests. If the same
test keeps failing, there may be a problem with your PR. If you notice that
every recent PR is failing with the same error message, it may be that Travis
is down or one of Spack's dependencies put out a new release that is causing
problems. If this is the case, please file an issue.
every recent PR is failing with the same error message, it may be that an issue
occurred with the CI infrastructure or one of Spack's dependencies put out a
new release that is causing problems. If this is the case, please file an issue.
If you take a look in ``$SPACK_ROOT/.travis.yml``, you'll notice that we test
against Python 2.6, 2.7, and 3.4-3.7 on both macOS and Linux. We currently
We currently test against Python 2.6, 2.7, and 3.5-3.7 on both macOS and Linux and
perform 3 types of tests:
.. _cmd-spack-test:
.. _cmd-spack-unit-test:
^^^^^^^^^^
Unit Tests
@@ -86,7 +96,7 @@ To run *all* of the unit tests, use:
.. code-block:: console
$ spack test
$ spack unit-test
These tests may take several minutes to complete. If you know you are
only modifying a single Spack feature, you can run subsets of tests at a
@@ -95,51 +105,53 @@ time. For example, this would run all the tests in
.. code-block:: console
$ spack test architecture.py
$ spack unit-test lib/spack/spack/test/architecture.py
And this would run the ``test_platform`` test from that file:
.. code-block:: console
$ spack test architecture.py::test_platform
$ spack unit-test lib/spack/spack/test/architecture.py::test_platform
This allows you to develop iteratively: make a change, test that change,
make another change, test that change, etc. We use `pytest
<http://pytest.org/>`_ as our tests fromework, and these types of
<http://pytest.org/>`_ as our tests framework, and these types of
arguments are just passed to the ``pytest`` command underneath. See `the
pytest docs
<http://doc.pytest.org/en/latest/usage.html#specifying-tests-selecting-tests>`_
for more details on test selection syntax.
``spack test`` has a few special options that can help you understand
what tests are available. To get a list of all available unit test
files, run:
``spack unit-test`` has a few special options that can help you
understand what tests are available. To get a list of all available
unit test files, run:
.. command-output:: spack test --list
.. command-output:: spack unit-test --list
:ellipsis: 5
To see a more detailed list of available unit tests, use ``spack test
--list-long``:
To see a more detailed list of available unit tests, use ``spack
unit-test --list-long``:
.. command-output:: spack test --list-long
.. command-output:: spack unit-test --list-long
:ellipsis: 10
And to see the fully qualified names of all tests, use ``--list-names``:
.. command-output:: spack test --list-names
.. command-output:: spack unit-test --list-names
:ellipsis: 5
You can combine these with ``pytest`` arguments to restrict which tests
you want to know about. For example, to see just the tests in
``architecture.py``:
.. command-output:: spack test --list-long architecture.py
.. command-output:: spack unit-test --list-long lib/spack/spack/test/architecture.py
You can also combine any of these options with a ``pytest`` keyword
search. For example, to see the names of all tests that have "spec"
search. See the `pytest usage docs
<https://docs.pytest.org/en/stable/usage.html#specifying-tests-selecting-tests>`_
for more details on test selection syntax. For example, to see the names of all tests that have "spec"
or "concretize" somewhere in their names:
.. command-output:: spack test --list-names -k "spec and concretize"
.. command-output:: spack unit-test --list-names -k "spec and concretize"
By default, ``pytest`` captures the output of all unit tests, and it will
print any captured output for failed tests. Sometimes it's helpful to see
@@ -149,7 +161,7 @@ argument to ``pytest``:
.. code-block:: console
$ spack test -s architecture.py::test_platform
$ spack unit-test -s --list-long lib/spack/spack/test/architecture.py::test_platform
Unit tests are crucial to making sure bugs aren't introduced into
Spack. If you are modifying core Spack libraries or adding new
@@ -162,9 +174,9 @@ how to write tests!
.. note::
You may notice the ``share/spack/qa/run-unit-tests`` script in the
repository. This script is designed for Travis CI. It runs the unit
repository. This script is designed for CI. It runs the unit
tests and reports coverage statistics back to Codecov. If you want to
run the unit tests yourself, we suggest you use ``spack test``.
run the unit tests yourself, we suggest you use ``spack unit-test``.
^^^^^^^^^^^^
Flake8 Tests
@@ -235,7 +247,7 @@ to update them.
Try fixing flake8 errors in reverse order. This eliminates the need for
multiple runs of ``spack flake8`` just to re-compute line numbers and
makes it much easier to fix errors directly off of the Travis output.
makes it much easier to fix errors directly off of the CI output.
.. warning::
@@ -315,7 +327,7 @@ Once all of the dependencies are installed, you can try building the documentati
.. code-block:: console
$ cd "$SPACK_ROOT/lib/spack/docs"
$ cd path/to/spack/lib/spack/docs/
$ make clean
$ make
@@ -327,7 +339,7 @@ your PR is accepted.
There is also a ``run-doc-tests`` script in ``share/spack/qa``. The only
difference between running this script and running ``make`` by hand is that
the script will exit immediately if it encounters an error or warning. This
is necessary for Travis CI. If you made a lot of documentation changes, it is
is necessary for CI. If you made a lot of documentation changes, it is
much quicker to run ``make`` by hand so that you can see all of the warnings
at once.
@@ -391,7 +403,7 @@ and allow you to see coverage line-by-line when viewing the Spack repository.
If you are new to Spack, a great way to get started is to write unit tests to
increase coverage!
Unlike with Travis, Codecov tests are not required to pass in order for your
Unlike with CI on GitHub Actions, Codecov tests are not required to pass in order for your
PR to be merged. If you modify core Spack libraries, we would greatly
appreciate unit tests that cover these changed lines. Otherwise, we have no
way of knowing whether or not your changes introduce a bug. If you make


@@ -363,11 +363,12 @@ Developer commands
``spack doc``
^^^^^^^^^^^^^
^^^^^^^^^^^^^^
``spack test``
^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^
``spack unit-test``
^^^^^^^^^^^^^^^^^^^
See the :ref:`contributor guide section <cmd-spack-test>` on ``spack test``.
See the :ref:`contributor guide section <cmd-spack-unit-test>` on
``spack unit-test``.
.. _cmd-spack-python:
@@ -495,3 +496,398 @@ The bottom of the output shows the top most time consuming functions,
slowest on top. The profiling support is from Python's built-in tool,
`cProfile
<https://docs.python.org/2/library/profile.html#module-cProfile>`_.
.. _releases:
--------
Releases
--------
This section documents Spack's release process. It is intended for
project maintainers, as the tasks described here require maintainer
privileges on the Spack repository. For others, we hope this section at
least provides some insight into how the Spack project works.
.. _release-branches:
^^^^^^^^^^^^^^^^
Release branches
^^^^^^^^^^^^^^^^
There are currently two types of Spack releases: :ref:`major releases
<major-releases>` (``0.13.0``, ``0.14.0``, etc.) and :ref:`point releases
<point-releases>` (``0.13.1``, ``0.13.2``, ``0.13.3``, etc.). Here is a
diagram of how Spack release branches work::
o branch: develop (latest version)
|
o merge v0.14.1 into develop
|\
| o branch: releases/v0.14, tag: v0.14.1
o | merge v0.14.0 into develop
|\|
| o tag: v0.14.0
|/
o merge v0.13.2 into develop
|\
| o branch: releases/v0.13, tag: v0.13.2
o | merge v0.13.1 into develop
|\|
| o tag: v0.13.1
o | merge v0.13.0 into develop
|\|
| o tag: v0.13.0
o |
| o
|/
o
The ``develop`` branch has the latest contributions, and nearly all pull
requests target ``develop``.
Each Spack release series also has a corresponding branch, e.g.
``releases/v0.14`` has ``0.14.x`` versions of Spack, and
``releases/v0.13`` has ``0.13.x`` versions. A major release is the first
tagged version on a release branch. Minor releases are back-ported from
develop onto release branches. This is typically done by cherry-picking
bugfix commits off of ``develop``.
To avoid version churn for users of a release series, minor releases
should **not** make changes that would change the concretization of
packages. They should generally only contain fixes to the Spack core.
Both major and minor releases are tagged. After each release, we merge
the release branch back into ``develop`` so that the version bump and any
other release-specific changes are visible in the mainline. As a
convenience, we also tag the latest release as ``releases/latest``,
so that users can easily check it out to get the latest
stable version. See :ref:`merging-releases` for more details.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Scheduling work for releases
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
We schedule work for releases by creating `GitHub projects
<https://github.com/spack/spack/projects>`_. At any time, there may be
several open release projects. For example, here are two releases (from
some past version of the page linked above):
.. image:: images/projects.png
Here, there's one release in progress for ``0.15.1`` and another for
``0.16.0``. Each of these releases has a project board containing issues
and pull requests. GitHub shows a status bar with completed work in
green, work in progress in purple, and work not started yet in gray, so
it's fairly easy to see progress.
Spack's project boards are not firm commitments, and we move work between
releases frequently. If we need to make a release and some tasks are not
yet done, we will simply move them to the next minor or major release, rather
than delaying the release to complete them.
For more on using GitHub project boards, see `GitHub's documentation
<https://docs.github.com/en/github/managing-your-work-on-github/about-project-boards>`_.
.. _major-releases:
^^^^^^^^^^^^^^^^^^^^^
Making Major Releases
^^^^^^^^^^^^^^^^^^^^^
Assuming you've already created a project board and completed the work
for a major release, the steps to make the release are as follows:
#. Create two new project boards:
* One for the next major release
* One for the next point release
#. Move any tasks that aren't done yet to one of the new project boards.
Small bugfixes should go to the next point release. Major features,
refactors, and changes that could affect concretization should go in
the next major release.
#. Create a branch for the release, based on ``develop``:
.. code-block:: console
$ git checkout -b releases/v0.15 develop
For a version ``vX.Y.Z``, the branch's name should be
``releases/vX.Y``. That is, you should create a ``releases/vX.Y``
branch if you are preparing the ``X.Y.0`` release.
#. Bump the version in ``lib/spack/spack/__init__.py``. See `this example from 0.13.0
<https://github.com/spack/spack/commit/8eeb64096c98b8a43d1c587f13ece743c864fba9>`_
#. Update the release version lists in these files to include the new version:
* ``lib/spack/spack/schema/container.py``
* ``lib/spack/spack/container/images.json``
.. TODO: We should get rid of this step in some future release.
#. Update ``CHANGELOG.md`` with major highlights in bullet form. Use
proper markdown formatting, like `this example from 0.15.0
<https://github.com/spack/spack/commit/d4bf70d9882fcfe88507e9cb444331d7dd7ba71c>`_.
#. Push the release branch to GitHub.
#. Make sure CI passes on the release branch, including:
* Regular unit tests
* Build tests
* The E4S pipeline at `gitlab.spack.io <https://gitlab.spack.io>`_
If CI is not passing, submit pull requests to ``develop`` as normal
and keep rebasing the release branch on ``develop`` until CI passes.
#. Follow the steps in :ref:`publishing-releases`.
#. Follow the steps in :ref:`merging-releases`.
#. Follow the steps in :ref:`announcing-releases`.
.. _point-releases:
^^^^^^^^^^^^^^^^^^^^^
Making Point Releases
^^^^^^^^^^^^^^^^^^^^^
This assumes you've already created a project board for a point release
and completed the work to be done for the release. To make a point
release:
#. Create one new project board for the next point release.
#. Move any cards that aren't done yet to the next project board.
#. Check out the release branch (it should already exist). For the
``X.Y.Z`` release, the release branch is called ``releases/vX.Y``. For
``v0.15.1``, you would check out ``releases/v0.15``:
.. code-block:: console
$ git checkout releases/v0.15
#. Cherry-pick each pull request in the ``Done`` column of the release
project onto the release branch.
This is **usually** fairly simple since we squash the commits from the
vast majority of pull requests, which means there is only one commit
per pull request to cherry-pick. For example, `this pull request
<https://github.com/spack/spack/pull/15777>`_ has three commits, but
they were squashed into a single commit on merge. You can see the
commit that was created here:
.. image:: images/pr-commit.png
You can easily cherry pick it like this (assuming you already have the
release branch checked out):
.. code-block:: console
$ git cherry-pick 7e46da7
For pull requests that were rebased, you'll need to cherry-pick each
rebased commit individually. There have not been any rebased PRs like
this in recent point releases.
.. warning::
It is important to cherry-pick commits in the order they happened,
otherwise you can get conflicts while cherry-picking. When
cherry-picking onto a point release, look at the merge date,
**not** the number of the pull request or the date it was opened.
Sometimes you may **still** get merge conflicts even if you have
cherry-picked all the commits in order. This generally means there
is some other intervening pull request that the one you're trying
to pick depends on. In these cases, you'll need to make a judgment
call:
1. If the dependency is small, you might just cherry-pick it, too.
If you do this, add it to the release board.
2. If it is large, then you may decide that this fix is not worth
including in a point release, in which case you should remove it
from the release project.
3. You can always decide to manually back-port the fix to the release
branch if neither of the above options makes sense, but this can
require a lot of work. It's seldom the right choice.
#. Bump the version in ``lib/spack/spack/__init__.py``. See `this example from 0.14.1
<https://github.com/spack/spack/commit/ff0abb9838121522321df2a054d18e54b566b44a>`_.
#. Update the release version lists in these files to include the new version:
* ``lib/spack/spack/schema/container.py``
* ``lib/spack/spack/container/images.json``
.. TODO: We should get rid of this step in some future release.
#. Update ``CHANGELOG.md`` with a list of bugfixes. This is typically just a
summary of the commits you cherry-picked onto the release branch. See
`the changelog from 0.14.1
<https://github.com/spack/spack/commit/ff0abb9838121522321df2a054d18e54b566b44a>`_.
#. Push the release branch to GitHub.
#. Make sure CI passes on the release branch, including:
* Regular unit tests
* Build tests
* The E4S pipeline at `gitlab.spack.io <https://gitlab.spack.io>`_
If CI does not pass, you'll need to figure out why, and make changes
to the release branch until it does. You can make more commits, modify
or remove cherry-picked commits, or cherry-pick **more** from
``develop`` to make this happen.
#. Follow the steps in :ref:`publishing-releases`.
#. Follow the steps in :ref:`merging-releases`.
#. Follow the steps in :ref:`announcing-releases`.
.. _publishing-releases:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Publishing a release on GitHub
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#. Go to `github.com/spack/spack/releases
<https://github.com/spack/spack/releases>`_ and click ``Draft a new
release``. Set the following:
* ``Tag version`` should start with ``v`` and contain *all three*
parts of the version, e.g. ``v0.15.1``. This is the name of the tag
that will be created.
* ``Target`` should be the ``releases/vX.Y`` branch (e.g., ``releases/v0.15``).
* ``Release title`` should be ``vX.Y.Z`` (To match the tag, e.g., ``v0.15.1``).
* For the text, paste the latest release markdown from your ``CHANGELOG.md``.
You can save the draft and keep coming back to this as you prepare the release.
#. When you are done, click ``Publish release``.
#. Immediately after publishing, go back to
`github.com/spack/spack/releases
<https://github.com/spack/spack/releases>`_ and download the
auto-generated ``.tar.gz`` file for the release. It's the ``Source
code (tar.gz)`` link.
#. Click ``Edit`` on the release you just did and attach the downloaded
release tarball as a binary. This does two things:
#. Makes sure that the hash of our releases doesn't change over time.
GitHub sometimes annoyingly changes the way they generate
tarballs, and then hashes can change if you rely on the
auto-generated tarball links.
#. Gets us download counts on releases visible through the GitHub
API. GitHub tracks downloads of artifacts, but *not* the source
links. See the `releases
page <https://api.github.com/repos/spack/spack/releases>`_ and search
for ``download_count`` to see this.
#. Go to `readthedocs.org <https://readthedocs.org/projects/spack>`_ and activate
the release tag. This builds the documentation and makes the released version
selectable in the versions menu.
.. _merging-releases:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Updating ``releases/latest`` and ``develop``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If the new release is the **highest** Spack release yet, you should
also tag it as ``releases/latest``. For example, suppose the highest
release is currently ``0.15.3``:
* If you are releasing ``0.15.4`` or ``0.16.0``, then you should tag
it with ``releases/latest``, as these are higher than ``0.15.3``.
* If you are making a new release of an **older** major version of
Spack, e.g. ``0.14.4``, then you should not tag it as
``releases/latest`` (as there are newer major versions).
To tag ``releases/latest``, do this:
.. code-block:: console
$ git checkout releases/vX.Y # vX.Y is the new release's branch
$ git tag --force releases/latest
$ git push --tags
The ``--force`` argument makes ``git`` overwrite the existing
``releases/latest`` tag with the new one.
We also merge each release that we tag as ``releases/latest`` into ``develop``.
Make sure to do this with a merge commit:
.. code-block:: console
$ git checkout develop
$ git merge --no-ff vX.Y.Z # vX.Y.Z is the new release's tag
$ git push
We merge back to ``develop`` because it:
* updates the version and ``CHANGELOG.md`` on ``develop``.
* ensures that your release tag is reachable from the head of
``develop``
We *must* use a real merge commit (via the ``--no-ff`` option) because it
ensures that the release tag is reachable from the tip of ``develop``.
This is necessary for ``spack -V`` to work properly -- it uses ``git
describe --tags`` to find the last reachable tag in the repository and
reports how far we are from it. For example:
.. code-block:: console
$ spack -V
0.14.2-1486-b80d5e74e5
This says that we are at commit ``b80d5e74e5``, which is 1,486 commits
ahead of the ``0.14.2`` release.
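The mapping from ``git describe`` output to that version string can be sketched with a tiny parser. ``parse_describe`` is a hypothetical helper written for illustration, not Spack's actual implementation:

.. code-block:: python

   import re

   # "git describe --tags" prints "<tag>-<commits-since-tag>-g<short-hash>",
   # e.g. "v0.14.2-1486-gb80d5e74e5"; when sitting exactly on a tag it
   # prints just the tag name.
   def parse_describe(text):
       m = re.match(r"^v?(?P<tag>.+)-(?P<n>\d+)-g(?P<sha>[0-9a-f]+)$", text)
       if m is None:
           return text.lstrip("v"), 0, None  # exactly on a tag
       return m.group("tag"), int(m.group("n")), m.group("sha")

   print(parse_describe("v0.14.2-1486-gb80d5e74e5"))
   # ('0.14.2', 1486, 'b80d5e74e5')

Note the leading ``v`` and the ``g`` hash prefix are stripped here, consistent with the ``spack -V`` output shown above.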
We put this step last in the process because it's best to do it only once
the release is complete and tagged. If you do it before you've tagged the
release and later decide you want to tag some later commit, you'll need
to merge again.
.. _announcing-releases:
^^^^^^^^^^^^^^^^^^^^
Announcing a release
^^^^^^^^^^^^^^^^^^^^
We announce releases in all of the major Spack communication channels.
Publishing the release takes care of GitHub. The remaining channels are
Twitter, Slack, and the mailing list. Here are the steps:
#. Make a tweet to announce the release. It should link to the release's
page on GitHub. You can base it on `this example tweet
<https://twitter.com/spackpm/status/1231761858182307840>`_.
#. Ping ``@channel`` in ``#general`` on Slack (`spackpm.slack.com
<https://spackpm.slack.com>`_) with a link to the tweet. The tweet
will be shown inline so that you do not have to retype your release
announcement.
#. Email the Spack mailing list to let them know about the release. As
with the tweet, you likely want to link to the release's page on
GitHub. It's also helpful to include some information directly in the
email. You can base yours on this `example email
<https://groups.google.com/forum/#!topic/spack/WT4CT9i_X4s>`_.
Once you've announced the release, congratulations, you're done! You've
finished making the release!


@@ -87,11 +87,12 @@ will be available from the command line:
--implicit select specs that are not installed or were installed implicitly
--output OUTPUT where to dump the result
The corresponding unit tests can be run giving the appropriate options to ``spack test``:
The corresponding unit tests can be run giving the appropriate options
to ``spack unit-test``:
.. code-block:: console
$ spack test --extension=scripting
$ spack unit-test --extension=scripting
============================================================== test session starts ===============================================================
platform linux2 -- Python 2.7.15rc1, pytest-3.2.5, py-1.4.34, pluggy-0.4.0


@@ -48,8 +48,8 @@ platform, all on the command line.
# Add compiler flags using the conventional names
$ spack install mpileaks@1.1.2 %gcc@4.7.3 cppflags="-O3 -floop-block"
# Cross-compile for a different architecture with arch=
$ spack install mpileaks@1.1.2 arch=bgqos_0
# Cross-compile for a different micro-architecture with target=
$ spack install mpileaks@1.1.2 target=icelake
Users can specify as many or few options as they care about. Spack
will fill in the unspecified values with sensible defaults. The two listed


@@ -19,6 +19,9 @@ before Spack is run:
#. Python 2 (2.6 or 2.7) or 3 (3.5 - 3.8) to run Spack
#. A C/C++ compiler for building
#. The ``make`` executable for building
#. The ``tar``, ``gzip``, ``bzip2``, ``xz`` and optionally ``zstd``
executables for extracting source code
#. The ``patch`` command to apply patches
#. The ``git`` and ``curl`` commands for fetching
#. If using the ``gpg`` subcommand, ``gnupg2`` is required
@@ -50,22 +53,37 @@ in the ``SPACK_ROOT`` environment variable. Add ``$SPACK_ROOT/bin``
to your path and you're ready to go:
.. code-block:: console
# For bash/zsh users
$ export SPACK_ROOT=/path/to/spack
$ export PATH=$SPACK_ROOT/bin:$PATH
# For tcsh/csh users
$ setenv SPACK_ROOT /path/to/spack
$ setenv PATH $SPACK_ROOT/bin:$PATH
# For fish users
$ set -x SPACK_ROOT /path/to/spack
$ set -U fish_user_paths /path/to/spack $fish_user_paths
.. code-block:: console
$ spack install libelf
For a richer experience, use Spack's shell support:
.. code-block:: console
# Note you must set SPACK_ROOT
# For bash/zsh users
$ export SPACK_ROOT=/path/to/spack
$ . $SPACK_ROOT/share/spack/setup-env.sh
# For tcsh or csh users (note you must set SPACK_ROOT)
$ setenv SPACK_ROOT /path/to/spack
# For tcsh/csh users
$ source $SPACK_ROOT/share/spack/setup-env.csh
# For fish users
$ source $SPACK_ROOT/share/spack/setup-env.fish
This automatically adds Spack to your ``PATH`` and allows the ``spack``
command to be used to execute spack :ref:`commands <shell-support>` and
@@ -712,8 +730,9 @@ an OpenMPI installed in /opt/local, one would use:
packages:
openmpi:
paths:
openmpi@1.10.1: /opt/local
externals:
- spec: openmpi@1.10.1
prefix: /opt/local
buildable: False
In general, Spack is easier to use and more reliable if it builds all of
@@ -775,8 +794,9 @@ Then add the following to ``~/.spack/packages.yaml``:
packages:
openssl:
paths:
openssl@1.0.2g: /usr
externals:
- spec: openssl@1.0.2g
prefix: /usr
buildable: False
@@ -791,8 +811,9 @@ to add the following to ``packages.yaml``:
packages:
netlib-lapack:
paths:
netlib-lapack@3.6.1: /usr
externals:
- spec: netlib-lapack@3.6.1
prefix: /usr
buildable: False
all:
providers:
@@ -818,7 +839,7 @@ Git
Some Spack packages use ``git`` to download, which might not work on
some computers. For example, the following error was
encountered on a Macintosh during ``spack install julia-master``:
encountered on a Macintosh during ``spack install julia@master``:
.. code-block:: console
@@ -1181,9 +1202,13 @@ Here's an example of an external configuration for cray modules:
packages:
mpich:
modules:
mpich@7.3.1%gcc@5.2.0 arch=cray_xc-haswell-CNL10: cray-mpich
mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-haswell-CNL10: cray-mpich
externals:
- spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-haswell-CNL10"
modules:
- cray-mpich
- spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-haswell-CNL10"
modules:
- cray-mpich
all:
providers:
mpi: [mpich]
@@ -1195,7 +1220,7 @@ via module load.
.. note::
For Cray-provided packages, it is best to use ``modules:`` instead of ``paths:``
For Cray-provided packages, it is best to use ``modules:`` instead of ``prefix:``
in ``packages.yaml``, because the Cray Programming Environment heavily relies on
modules (e.g., loading the ``cray-mpich`` module adds MPI libraries to the
compiler wrapper link line).
@@ -1211,19 +1236,31 @@ Here is an example of a full packages.yaml used at NERSC
packages:
mpich:
modules:
mpich@7.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge: cray-mpich
mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-SuSE11-ivybridge: cray-mpich
externals:
- spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-mpich
- spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-SuSE11-ivybridge"
modules:
- cray-mpich
buildable: False
netcdf:
modules:
netcdf@4.3.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge: cray-netcdf
netcdf@4.3.3.1%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge: cray-netcdf
externals:
- spec: "netcdf@4.3.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-netcdf
- spec: "netcdf@4.3.3.1%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-netcdf
buildable: False
hdf5:
modules:
hdf5@1.8.14%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge: cray-hdf5
hdf5@1.8.14%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge: cray-hdf5
externals:
- spec: "hdf5@1.8.14%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-hdf5
- spec: "hdf5@1.8.14%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-hdf5
buildable: False
all:
compiler: [gcc@5.2.0, intel@16.0.0.109]
@@ -1247,6 +1284,6 @@ environment variables may be propagated into containers that are not
using the Cray programming environment.
To ensure that Spack does not autodetect the Cray programming
environment, unset the environment variable ``CRAYPE_VERSION``. This
environment, unset the environment variable ``MODULEPATH``. This
will cause Spack to treat a linux container on a Cray system as a base
linux distro.


@@ -14,7 +14,7 @@ problems if you encounter them.
Variants are not properly forwarded to dependencies
---------------------------------------------------
**Status:** Expected to be fixed by Spack's new concretizer
Sometimes, a variant of a package can also affect how its dependencies are
built. For example, in order to build MPI support for a package, it may
@@ -49,15 +49,29 @@ A workaround is to explicitly activate the variants of dependencies as well:
See https://github.com/spack/spack/issues/267 and
https://github.com/spack/spack/issues/2546 for further details.
-----------------------------------------------
depends_on cannot handle recursive dependencies
-----------------------------------------------

**Status:** Not yet a work in progress

Although ``depends_on`` can handle any aspect of Spack's spec syntax,
it currently cannot handle recursive dependencies. If the ``^`` sigil
appears in a ``depends_on`` statement, the concretizer will hang.
For example, something like:
.. code-block:: python

   depends_on('mfem+cuda ^hypre+cuda', when='+cuda')

should be rewritten as:

.. code-block:: python

   depends_on('mfem+cuda', when='+cuda')
   depends_on('hypre+cuda', when='+cuda')
See https://github.com/spack/spack/issues/17660 and
https://github.com/spack/spack/issues/11160 for more details.


@@ -645,7 +645,7 @@ multiple fields based on delimiters such as ``.``, ``-`` etc. Then
matching fields are compared using the rules below:
#. The following develop-like strings are greater (newer) than all
numbers and are ordered as ``develop > main > master > head > trunk``.
#. Numbers are all less than the chosen develop-like strings above,
and are sorted numerically.
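As a toy illustration of these two rules (this is not Spack's actual implementation, and it assumes every field is either an integer string or one of the develop-like names), a sort key could look like:

```python
# Toy sketch of the ordering rules above; not Spack's real code.
# Assumes each version field is an integer string or a develop-like name.
DEVELOP_LIKE = {'trunk': 1, 'head': 2, 'master': 3, 'main': 4, 'develop': 5}

def field_key(field):
    """Sort key: develop-like names are newer than any number and are
    ordered trunk < head < master < main < develop among themselves."""
    if field in DEVELOP_LIKE:
        return (1, DEVELOP_LIKE[field])
    return (0, int(field))
```

With this key, sorting ``['develop', '2', 'main', '10']`` from oldest to newest yields ``['2', '10', 'main', 'develop']``.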
@@ -1967,22 +1967,29 @@ exactly what kind of a dependency you need. For example:
   depends_on('cmake', type='build')
   depends_on('py-numpy', type=('build', 'run'))
   depends_on('libelf', type=('build', 'link'))
   depends_on('py-pytest', type='test')
The following dependency types are available:
* **"build"**: the dependency will be added to the ``PATH`` and
``PYTHONPATH`` at build-time.
* **"link"**: the dependency will be added to Spack's compiler
wrappers, automatically injecting the appropriate linker flags,
including ``-I``, ``-L``, and RPATH/RUNPATH handling.
* **"run"**: the dependency will be added to the ``PATH`` and
``PYTHONPATH`` at run-time. This is true for both ``spack load``
and the module files Spack writes.
* **"test"**: the dependency will be added to the ``PATH`` and
``PYTHONPATH`` at build-time. The only difference between
"build" and "test" is that test dependencies are only built
if the user requests unit tests with ``spack install --test``.
One of the advantages of the ``build`` dependency type is that although the
dependency needs to be installed in order for the package to be built, it
can be uninstalled without concern afterwards. ``link`` and ``run`` disallow
this because uninstalling the dependency would break the package. Another
consequence of this is that ``build``-only dependencies do not affect the
hash of the package. The same is true for ``test`` dependencies.
If the dependency type is not specified, Spack uses a default of
``('build', 'link')``. This is the common case for compiler languages.
@@ -2003,7 +2010,8 @@ package. In that case, you could say something like:
.. code-block:: python
   variant('mpi', default=False, description='Enable MPI support')
   depends_on('mpi', when='+mpi')
``when`` can include constraints on the variant, version, compiler, etc. and
@@ -4054,21 +4062,223 @@ File functions
Making a package discoverable with ``spack external find``
----------------------------------------------------------
The simplest way to make a package discoverable with
:ref:`spack external find <cmd-spack-external-find>` is to:
1. Define the executables associated with the package
2. Implement a method to determine the versions of these executables
^^^^^^^^^^^^^^^^^
Minimal detection
^^^^^^^^^^^^^^^^^

The first step is fairly simple, as it only requires specifying a
package-level ``executables`` attribute:

.. code-block:: python

   class Foo(Package):
       # Each string provided here is treated as a regular expression, and
       # would match for example 'foo', 'foobar', and 'bazfoo'.
       executables = ['foo']
This attribute must be a list of strings. Each string is a regular
expression (e.g. ``gcc`` would match ``gcc``, ``gcc-8.3``, ``my-weird-gcc``,
etc.) used to determine a set of system executables that might be part of this
package. Note that to match only executables named ``gcc`` the regular
expression ``'^gcc$'`` must be used.
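For instance (a standalone sketch using hypothetical executable names; Spack's real search logic is more involved), an unanchored pattern matches anywhere in the name, while ``'^gcc$'`` matches only the exact name:

```python
import re

# Hypothetical executable names found on the system.
candidates = ['gcc', 'gcc-8.3', 'my-weird-gcc', 'g++']

def matching(pattern, names):
    """Return the names in which the pattern matches anywhere."""
    return [n for n in names if re.search(pattern, n)]

gcc_like = matching('gcc', candidates)    # the three gcc-like names
exact = matching('^gcc$', candidates)     # only the exact name 'gcc'
```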
Finally, to determine the version of each executable the ``determine_version``
method must be implemented:

.. code-block:: python

   @classmethod
   def determine_version(cls, exe):
       """Return either the version of the executable passed as argument
       or ``None`` if the version cannot be determined.

       Args:
           exe (str): absolute path to the executable being examined
       """
This method receives as input the path to a single executable and must return
its version as a string. If the version cannot be determined, or the
executable turns out not to be an instance of the package, the method should
return ``None``, and the executable will be discarded as a candidate.
Implementing the two steps above is mandatory, and gives the package the
basic ability to detect if a spec is present on the system at a given version.

.. note::

   Any executable for which the ``determine_version`` method returns ``None``
   will be discarded and won't appear in later stages of the workflow
   described below.
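A typical ``determine_version`` runs the executable and parses its output. The parsing half can be kept pure and tested on its own; the sketch below assumes a made-up ``foo version X.Y.Z`` output format:

```python
import re

def parse_version(output):
    """Extract a dotted version such as '1.2.3' from a tool's
    '--version' output, or return None when nothing matches."""
    match = re.search(r'version\s+(\d+(?:\.\d+)*)', output)
    return match.group(1) if match else None
```

``parse_version('foo version 1.2.3 (release)')`` returns ``'1.2.3'``, while unrecognized output yields ``None``, so the executable is discarded.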
^^^^^^^^^^^^^^^^^^^^^^^^
Additional functionality
^^^^^^^^^^^^^^^^^^^^^^^^
Besides the two mandatory steps described above, there are also optional
methods that can be implemented to either increase the amount of details
being detected or improve the robustness of the detection logic in a package.
""""""""""""""""""""""""""""""
Variants and custom attributes
""""""""""""""""""""""""""""""
The ``determine_variants`` method can be optionally implemented in a package
to detect additional details of the spec:
.. code-block:: python

   @classmethod
   def determine_variants(cls, exes, version_str):
       """Return either a variant string, a tuple of a variant string
       and a dictionary of extra attributes that will be recorded in
       packages.yaml or a list of those items.

       Args:
           exes (list of str): list of executables (absolute paths) that
               live in the same prefix and share the same version
           version_str (str): version associated with the list of
               executables, as detected by ``determine_version``
       """
This method takes as input a list of executables that live in the same prefix and
share the same version string, and returns either:
1. A variant string
2. A tuple of a variant string and a dictionary of extra attributes
3. A list of items matching either 1 or 2 (if multiple specs are detected
from the set of executables)
If extra attributes are returned, they will be recorded in ``packages.yaml``
and be available for later reuse. As an example, the ``gcc`` package will record
by default the different compilers found and an entry in ``packages.yaml``
would look like:
.. code-block:: yaml

   packages:
     gcc:
       externals:
       - spec: 'gcc@9.0.1 languages=c,c++,fortran'
         prefix: /usr
         extra_attributes:
           compilers:
             c: /usr/bin/x86_64-linux-gnu-gcc-9
             c++: /usr/bin/x86_64-linux-gnu-g++-9
             fortran: /usr/bin/x86_64-linux-gnu-gfortran-9
This allows us, for instance, to keep track of executables that would be named
differently if built by Spack (e.g. ``x86_64-linux-gnu-gcc-9``
instead of just ``gcc``).
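A sketch of what such a method might do for a hypothetical compiler-like package (the detection heuristics here are invented; only the return shape, a ``(variant_string, extra_attributes)`` tuple, follows the interface described above):

```python
def determine_variants(exes, version_str):
    """Map discovered executables to languages and return a variant
    string plus extra attributes to record in packages.yaml.
    The mapping heuristics below are purely illustrative."""
    compilers = {}
    for exe in exes:
        if 'g++' in exe:
            compilers['c++'] = exe
        elif 'gcc' in exe:
            compilers['c'] = exe
    languages = ','.join(sorted(compilers))
    return ('languages={0}'.format(languages), {'compilers': compilers})
```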
.. TODO: we need to gather some more experience on overriding 'prefix'
and other special keywords in extra attributes, but as soon as we are
confident that this is the way to go we should document the process.
See https://github.com/spack/spack/pull/16526#issuecomment-653783204
"""""""""""""""""""""""""""
Filter matching executables
"""""""""""""""""""""""""""
Sometimes defining the appropriate regex for the ``executables``
attribute might prove difficult, especially when one has to deal with
corner cases or exclude "red herrings". To help keep the regular
expressions as simple as possible, each package can optionally implement
a ``filter_detected_exes`` method:
.. code-block:: python

   @classmethod
   def filter_detected_exes(cls, prefix, exes_in_prefix):
       """Return a filtered list of the executables in prefix"""
which takes as input a prefix and a list of matching executables and
returns a filtered list of said executables.
Using this method has the advantage of allowing custom logic for
filtering, and does not restrict the user to regular expressions
only. Consider the case of detecting the GNU C++ compiler. If we
try to search for executables that match ``g++``, that would have
the unwanted side effect of selecting also ``clang++`` - which is
a C++ compiler provided by another package - if present on the system.
Trying to select executables that contain ``g++`` but not ``clang``
would be quite complicated to do using regex only. Employing the
``filter_detected_exes`` method it becomes:
.. code-block:: python

   class Gcc(Package):
       executables = ['g++']

       def filter_detected_exes(cls, prefix, exes_in_prefix):
           return [x for x in exes_in_prefix if 'clang' not in x]
This method also makes it possible to apply filtering logic only when
specific conditions are met (e.g. on one OS but not on another).
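Run as a plain function against a hypothetical prefix, the filtering logic above behaves as follows (standalone sketch for illustration):

```python
def filter_detected_exes(prefix, exes_in_prefix):
    """Keep only executables whose path does not mention clang."""
    return [x for x in exes_in_prefix if 'clang' not in x]

# 'clang++' matched the 'g++' pattern but is filtered out here.
kept = filter_detected_exes('/usr', ['/usr/bin/g++', '/usr/bin/clang++'])
```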
^^^^^^^^^^^^^^^^^^
Validate detection
^^^^^^^^^^^^^^^^^^
To increase detection robustness, packagers may also implement a method
to validate the detected Spec objects:
.. code-block:: python

   @classmethod
   def validate_detected_spec(cls, spec, extra_attributes):
       """Validate a detected spec. Raise an exception if validation fails."""
This method receives a detected spec along with its extra attributes and can be
used to check that certain conditions are met by the spec. Packagers can either
use assertions or raise an ``InvalidSpecDetected`` exception when the check fails.
In case the conditions are not honored the spec will be discarded and any message
associated with the assertion or the exception will be logged as the reason for
discarding it.
As an example, a package that wants to check that the ``compilers`` attribute is
in the extra attributes can implement this method like this:
.. code-block:: python

   @classmethod
   def validate_detected_spec(cls, spec, extra_attributes):
       """Check that 'compilers' is in the extra attributes."""
       msg = ('the extra attribute "compilers" must be set for '
              'the detected spec "{0}"'.format(spec))
       assert 'compilers' in extra_attributes, msg

or like this:
or like this:
.. code-block:: python

   @classmethod
   def validate_detected_spec(cls, spec, extra_attributes):
       """Check that 'compilers' is in the extra attributes."""
       if 'compilers' not in extra_attributes:
           msg = ('the extra attribute "compilers" must be set for '
                  'the detected spec "{0}"'.format(spec))
           raise InvalidSpecDetected(msg)
.. _determine_spec_details:
^^^^^^^^^^^^^^^^^^^^^^^^^
Custom detection workflow
^^^^^^^^^^^^^^^^^^^^^^^^^
In the rare case when the mechanisms described so far don't fit the
detection of a package, the implementation of all the methods above
can be disregarded and instead a custom ``determine_spec_details``
method can be implemented directly in the package class (note that
the definition of the ``executables`` attribute is still required):
.. code-block:: python

   @classmethod
   def determine_spec_details(cls, prefix, exes_in_prefix):
       # exes_in_prefix = a set of paths, each path is an executable
       # prefix = a prefix that is common to each path in exes_in_prefix
@@ -4076,14 +4286,13 @@ The method ``determine_spec_details`` has the following signature:
# the package. Return one or more Specs for each instance of the
# package which is thought to be installed in the provided prefix
This method takes as input a set of discovered executables (which match
those specified by the user) as well as a common prefix shared by all
of those executables. The function must return one or more :py:class:`spack.spec.Spec` associated
with the executables (it can also return ``None`` to indicate that no
provided executables are associated with the package).
As an example, consider a made-up package called ``foo-package`` which
builds an executable called ``foo``. ``FooPackage`` would appear as
follows:
@@ -4107,10 +4316,12 @@ follows:
           return
       # This implementation is lazy and only checks the first candidate
       exe_path = candidates[0]
       exe = Executable(exe_path)
       output = exe('--version', output=str, error=str)
       version_str = ...  # parse output for version string
       return Spec.from_detection(
           'foo-package@{0}'.format(version_str)
       )
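A fuller ``determine_spec_details`` would typically group the discovered executables by detected version and emit one spec per distinct version. The pure-Python sketch below shows just that grouping step (the package name and the ``version_of`` callback are hypothetical):

```python
from collections import defaultdict

def specs_by_version(exes_in_prefix, version_of):
    """Return one 'foo-package@<version>' spec string per distinct
    version; version_of maps an executable path to a version string,
    or to None for executables that should be discarded."""
    grouped = defaultdict(list)
    for exe in exes_in_prefix:
        version = version_of(exe)
        if version is not None:
            grouped[version].append(exe)
    return ['foo-package@{0}'.format(v) for v in sorted(grouped)]
```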
.. _package-lifecycle:
@@ -4474,119 +4685,3 @@ might write:
DWARF_PREFIX = $(spack location --install-dir libdwarf)
CXXFLAGS += -I$DWARF_PREFIX/include
CXXFLAGS += -L$DWARF_PREFIX/lib
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Build System Configuration Support
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Imagine a developer creating a CMake or Autotools-based project in a
local directory, which depends on libraries A-Z. Once Spack has
installed those dependencies, one would like to run ``cmake`` with
appropriate command line and environment so CMake can find them. The
``spack setup`` command does this conveniently, producing a CMake
configuration that is essentially the same as how Spack *would have*
configured the project. This can be demonstrated with a usage
example:
.. code-block:: console

   $ cd myproject
   $ spack setup myproject@local
   $ mkdir build; cd build
   $ ../spconfig.py ..
   $ make
   $ make install
Notes:
* Spack must have ``myproject/package.py`` in its repository for
this to work.
* ``spack setup`` produces the executable script ``spconfig.py`` in
the local directory, and also creates the module file for the
package. ``spconfig.py`` is normally run from the user's
out-of-source build directory.
* The version number given to ``spack setup`` is arbitrary, just
like ``spack diy``. ``myproject/package.py`` does not need to
have any valid downloadable versions listed (typical when a
project is new).
* spconfig.py produces a CMake configuration that *does not* use the
Spack wrappers. Any resulting binaries *will not* use RPATH,
unless the user has enabled it. This is recommended for
development purposes, not production.
* ``spconfig.py`` is human readable, and can serve as a developer
reference of what dependencies are being used.
* ``make install`` installs the package into the Spack repository,
where it may be used by other Spack packages.
* CMake-generated makefiles re-run CMake in some circumstances. Use
of ``spconfig.py`` breaks this behavior, requiring the developer
to manually re-run ``spconfig.py`` when a ``CMakeLists.txt`` file
has changed.
^^^^^^^^^^^^
CMakePackage
^^^^^^^^^^^^
In order to enable ``spack setup`` functionality, the author of
``myproject/package.py`` must subclass from ``CMakePackage`` instead
of the standard ``Package`` superclass. Because CMake is
standardized, the packager does not need to tell Spack how to run
``cmake; make; make install``. Instead the packager only needs to
create (optional) methods ``configure_args()`` and ``configure_env()``, which
provide the arguments (as a list) and extra environment variables (as
a dict) to provide to the ``cmake`` command. Usually, these will
translate variant flags into CMake definitions. For example:
.. code-block:: python

   def cmake_args(self):
       spec = self.spec
       return [
           '-DUSE_EVERYTRACE=%s' % ('YES' if '+everytrace' in spec else 'NO'),
           '-DBUILD_PYTHON=%s' % ('YES' if '+python' in spec else 'NO'),
           '-DBUILD_GRIDGEN=%s' % ('YES' if '+gridgen' in spec else 'NO'),
           '-DBUILD_COUPLER=%s' % ('YES' if '+coupler' in spec else 'NO'),
           '-DUSE_PISM=%s' % ('YES' if '+pism' in spec else 'NO')
       ]
If needed, a packager may also override methods defined in
``StagedPackage`` (see below).
^^^^^^^^^^^^^
StagedPackage
^^^^^^^^^^^^^
``CMakePackage`` is implemented by subclassing the ``StagedPackage``
superclass, which breaks down the standard ``Package.install()``
method into several sub-stages: ``setup``, ``configure``, ``build``
and ``install``. Details:
* Instead of implementing the standard ``install()`` method, package
authors implement the methods for the sub-stages
``install_setup()``, ``install_configure()``,
``install_build()``, and ``install_install()``.
* The ``spack install`` command runs the sub-stages ``configure``,
``build`` and ``install`` in order. (The ``setup`` stage is
not run by default; see below).
* The ``spack setup`` command runs the sub-stages ``setup``
and a dummy install (to create the module file).
* The sub-stage install methods take no arguments (other than
``self``). The arguments ``spec`` and ``prefix`` to the standard
``install()`` method may be accessed via ``self.spec`` and
``self.prefix``.
^^^^^^^^^^^^^
GNU Autotools
^^^^^^^^^^^^^
The ``setup`` functionality is currently only available for
CMake-based packages. Extending this functionality to GNU
Autotools-based packages would be easy (and should be done by a
developer who actively uses Autotools). Packages that use
non-standard build systems can gain ``setup`` functionality by
subclassing ``StagedPackage`` directly.
.. Emacs local variables
Local Variables:
fill-column: 79
End:


@@ -45,7 +45,7 @@ for setting up a build pipeline are as follows:
tags:
- <custom-tag>
script:
- spack env activate --without-view .
- spack ci generate
--output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
artifacts:
@@ -82,9 +82,9 @@ or Amazon Elastic Kubernetes Service (`EKS <https://aws.amazon.com/eks>`_), thou
topics are outside the scope of this document.
Spack's pipelines are now making use of the
`trigger <https://docs.gitlab.com/12.9/ee/ci/yaml/README.html#trigger>`_ syntax to run
dynamically generated
`child pipelines <https://docs.gitlab.com/12.9/ee/ci/parent_child_pipelines.html>`_.
Note that the use of dynamic child pipelines requires running Gitlab version
``>= 12.9``.
@@ -122,6 +122,10 @@ pipeline jobs.
Concretizes the specs in the active environment, stages them (as described in
:ref:`staging_algorithm`), and writes the resulting ``.gitlab-ci.yml`` to disk.
This sub-command takes two arguments, but the most useful is ``--output-file``,
which should be an absolute path (including file name) to the generated
pipeline, if the default (``./.gitlab-ci.yml``) is not desired.
.. _cmd-spack-ci-rebuild:
^^^^^^^^^^^^^^^^^^^^
@@ -132,6 +136,10 @@ This sub-command is responsible for ensuring a single spec from the release
environment is up to date on the remote mirror configured in the environment,
and as such, corresponds to a single job in the ``.gitlab-ci.yml`` file.
Rather than taking command-line arguments, this sub-command expects information
to be communicated via environment variables, which will typically come via the
``.gitlab-ci.yml`` job as ``variables``.
------------------------------------
A pipeline-enabled spack environment
------------------------------------
@@ -189,15 +197,33 @@ corresponds to a known gitlab runner, where the ``match`` section is used
in assigning a release spec to one of the runners, and the ``runner-attributes``
section is used to configure the spec/job for that particular runner.
Both the top-level ``gitlab-ci`` section as well as each ``runner-attributes``
section can also contain the following keys: ``image``, ``tags``, ``variables``,
``before_script``, ``script``, and ``after_script``. If any of these keys are
provided at the ``gitlab-ci`` level, they will be used as the defaults for any
``runner-attributes``, unless they are overridden in those sections. Specifying
any of these keys at the ``runner-attributes`` level generally overrides the
keys specified at the higher level, with a couple exceptions. Any ``variables``
specified at both levels result in those dictionaries getting merged in the
resulting generated job, and any duplicate variable names get assigned the value
provided in the specific ``runner-attributes``. If ``tags`` are specified both
at the ``gitlab-ci`` level as well as the ``runner-attributes`` level, then the
lists of tags are combined, and any duplicates are removed.
See the section below on using a custom spack for an example of how these keys
could be used.
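The override and merge rules described above can be sketched as follows (a simplified illustration of the semantics, not Spack's actual generation code):

```python
def merge_job_attrs(gitlab_ci, runner_attrs):
    """Sketch of the rules above: scalar keys from runner-attributes
    override the gitlab-ci level, 'variables' dicts are merged with
    runner-attributes winning on duplicate names, and 'tags' lists are
    combined with duplicates removed."""
    merged = dict(gitlab_ci)
    merged.update(runner_attrs)
    variables = dict(gitlab_ci.get('variables', {}))
    variables.update(runner_attrs.get('variables', {}))
    if variables:
        merged['variables'] = variables
    tags = list(gitlab_ci.get('tags', []))
    tags += [t for t in runner_attrs.get('tags', []) if t not in tags]
    if tags:
        merged['tags'] = tags
    return merged
```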
There are other pipeline options you can configure within the ``gitlab-ci`` section
as well.

The ``bootstrap`` section allows you to specify lists of specs from
your ``definitions`` that should be staged ahead of the environment's ``specs`` (this
section is described in more detail below). The ``enable-artifacts-buildcache`` key
takes a boolean and determines whether the pipeline uses artifacts to store and
pass along the buildcaches from one stage to the next (the default if you don't
provide this option is ``False``).

The
``final-stage-rebuild-index`` section controls whether an extra job is added to the
end of your pipeline (in a stage by itself) which will regenerate the mirror's
buildcache index. Under normal operation, each pipeline job that rebuilds a package
@@ -220,6 +246,11 @@ progresses, this build group may have jobs added or removed. The url, project,
and site are used to specify the CDash instance to which build results should
be reported.
Take a look at the
`schema <https://github.com/spack/spack/blob/develop/lib/spack/spack/schema/gitlab_ci.py>`_
for the gitlab-ci section of the spack environment file, to see precisely what
syntax is allowed there.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Assignment of specs to runners
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -245,7 +276,18 @@ runners known to the gitlab instance. For Docker executor type runners, the
as well as an ``entrypoint`` to override whatever the default for that image is).
For other types of runners the ``variables`` key will be useful to pass any
information on to the runner that it needs to do its work (e.g. scheduler
parameters, etc.).
parameters, etc.). Any ``variables`` provided here will be added, verbatim, to
each job.
The ``runner-attributes`` section also allows users to supply custom ``script``,
``before_script``, and ``after_script`` sections to be applied to every job
scheduled on that runner. This allows users to do any custom preparation or
cleanup tasks that fit their particular workflow, as well as completely
customize the rebuilding of a spec if they so choose. Spack will not generate
a ``before_script`` or ``after_script`` for jobs, but if you do not provide
a custom ``script``, spack will generate one for you that assumes your
``spack.yaml`` is at the root of the repository, activates that environment for
you, and invokes ``spack ci rebuild``.
.. _staging_algorithm:
@@ -256,8 +298,8 @@ Summary of ``.gitlab-ci.yml`` generation algorithm
All specs yielded by the matrix (or all the specs in the environment) have their
dependencies computed, and the entire resulting set of specs are staged together
before being run through the ``gitlab-ci/mappings`` entries, where each staged
spec is assigned a runner. "Staging" is the name given to the process of
figuring out in what order the specs should be built, taking into consideration
Gitlab CI rules about jobs/stages. In the staging process the goal is to maximize
the number of jobs in any stage of the pipeline, while ensuring that the jobs in
any stage only depend on jobs in previous stages (since those jobs are guaranteed
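The staging rule amounts to repeatedly collecting every not-yet-placed spec whose dependencies have all been staged already; a simplified sketch (assuming a plain name-to-dependencies mapping rather than real Spec objects):

```python
def stage_specs(deps):
    """Group specs into stages so that each spec's dependencies live in
    earlier stages, making every stage as large as possible. 'deps'
    maps each spec name to the set of names it depends on."""
    stages, placed = [], set()
    while len(placed) < len(deps):
        stage = sorted(name for name, requires in deps.items()
                       if name not in placed and requires <= placed)
        if not stage:
            raise ValueError('dependency cycle detected')
        stages.append(stage)
        placed.update(stage)
    return stages
```

With the ``readline``/``ncurses``/``pkgconf`` example this yields three stages of one job each; wider DAGs produce wider stages.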
@@ -268,7 +310,7 @@ a runner, the ``.gitlab-ci.yml`` is written to disk.
The short example provided above would result in the ``readline``, ``ncurses``,
and ``pkgconf`` packages getting staged and built on the runner chosen by the
``spack-k8s`` tag. In this example, spack assumes the runner is a Docker executor
type runner, and thus certain jobs will be run in the ``centos7`` container,
and others in the ``ubuntu-18.04`` container. The resulting ``.gitlab-ci.yml``
will contain 6 jobs in three stages. Once the jobs have been generated, the
@@ -327,12 +369,12 @@ Here's an example of what bootstrapping some compilers might look like:
# mappings similar to the example higher up in this description
...
The example above adds a list to the ``definitions`` called ``compiler-pkgs``
(you can add any number of these), which lists compiler packages that should
be staged ahead of the full matrix of release specs (in this example, only
readline). Then within the ``gitlab-ci`` section, note the addition of a
``bootstrap`` section, which can contain a list of items, each referring to
a list in the ``definitions`` section. These items can either
be a dictionary or a string. If you supply a dictionary, it must have a name
key whose value must match one of the lists in definitions and it can have a
``compiler-agnostic`` key whose value is a boolean. If you supply a string,
@@ -368,13 +410,15 @@ Using a custom spack in your pipeline
If your runners will not have a version of spack ready to invoke, or if for some
other reason you want to use a custom version of spack to run your pipelines,
this section provides an example of how you could take advantage of
user-provided pipeline scripts to accomplish this fairly simply. First, you
could use the GitLab user interface to create CI environment variables
containing the url and branch or tag you want to use (calling them, for
example, ``SPACK_REPO`` and ``SPACK_REF``), then refer to those in a custom shell
script invoked both from your pipeline generation job, as well as in your rebuild
jobs. Here's the ``generate-pipeline`` job from the top of this document,
updated to invoke a custom shell script that will clone and source a custom
spack:
.. code-block:: yaml
@@ -382,12 +426,10 @@ a custom spack and make sure the generated rebuild jobs will clone it too:
tags:
- <some-other-tag>
before_script:
- ./cloneSpack.sh
script:
- spack env activate --without-view .
- spack ci generate
--output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
after_script:
- rm -rf ./spack
@@ -395,13 +437,68 @@ a custom spack and make sure the generated rebuild jobs will clone it too:
paths:
- "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
And the ``cloneSpack.sh`` script could contain:
.. code-block:: bash

   #!/bin/bash

   git clone ${SPACK_REPO}
   pushd ./spack
   git checkout ${SPACK_REF}
   popd
   . "./spack/share/spack/setup-env.sh"

   spack --version
Finally, you would also want your generated rebuild jobs to clone that version
of spack, so you would update your ``spack.yaml`` from above as follows:
.. code-block:: yaml

   spack:
     ...
     gitlab-ci:
       mappings:
         - match:
             - os=ubuntu18.04
           runner-attributes:
             tags:
               - spack-kube
             image: spack/ubuntu-bionic
             before_script:
               - ./cloneSpack.sh
             script:
               - spack env activate --without-view .
               - spack -d ci rebuild
             after_script:
               - rm -rf ./spack
Now all of the generated rebuild jobs will use the same shell script to clone
spack before running their actual workload. Note in the above example the
provision of a custom ``script`` section. The reason for this is to run
``spack ci rebuild`` in debug mode to get more information when builds fail.
Now imagine you have long pipelines with many specs to be built, and you
are pointing to a spack repository and branch that has a tendency to change
frequently, such as the main repo and its ``develop`` branch. If each child
job checks out the ``develop`` branch, that could result in some jobs running
with one SHA of spack, while later jobs run with another. To help avoid this
issue, the pipeline generation process saves global variables called
``SPACK_VERSION`` and ``SPACK_CHECKOUT_VERSION`` that capture the version
of spack used to generate the pipeline. While the ``SPACK_VERSION`` variable
simply contains the human-readable value produced by ``spack -V`` at pipeline
generation time, the ``SPACK_CHECKOUT_VERSION`` variable can be used in a
``git checkout`` command to make sure all child jobs checkout the same version
of spack used to generate the pipeline. To take advantage of this, you could
simply replace ``git checkout ${SPACK_REF}`` in the example ``cloneSpack.sh``
script above with ``git checkout ${SPACK_CHECKOUT_VERSION}``.
On the other hand, if you're pointing to a spack repository and branch under your
control, there may be no benefit in using the captured ``SPACK_CHECKOUT_VERSION``,
and you can instead just clone using the project CI variables you set (in the
earlier example these were ``SPACK_REPO`` and ``SPACK_REF``).
.. _ci_environment_variables:


@@ -703,400 +703,6 @@ environments:
Administrators might find things easier to maintain without the
added "heavyweight" state of a view.
------------------------------
Developing Software with Spack
------------------------------
For any project, one needs to assemble an
environment of that application's dependencies. You might consider
loading a series of modules or creating a filesystem view. This
approach, while obvious, has some serious drawbacks:
1. There is no guarantee that an environment created this way will be
consistent. Your application could end up with dependency A
expecting one version of MPI, and dependency B expecting another.
The linker will not be happy...
2. Suppose you need to debug a package deep within your software DAG.
If you build that package with a manual environment, then it
becomes difficult to have Spack auto-build things that depend on
it. That could be a serious problem, depending on how deep the
package in question is in your dependency DAG.
3. At its core, Spack is a sophisticated concretization algorithm that
matches up packages with appropriate dependencies and creates a
*consistent* environment for the package it's building. Writing a
list of ``spack load`` commands for your dependencies is at least
as hard as writing the same list of ``depends_on()`` declarations
in a Spack package. But it makes no use of Spack concretization
and is more error-prone.
4. Spack provides an automated, systematic way not just to find a
package's dependencies --- but also to build other packages on
top. Any Spack package can become a dependency for another Spack
package, offering a powerful vision of software re-use. If you
build your package A outside of Spack, then your ability to use it
as a building block for other packages in an automated way is
diminished: other packages depending on package A will not
be able to use Spack to fulfill that dependency.
5. If you are reading this manual, you probably love Spack. You're
probably going to write a Spack package for your software so
prospective users can install it with the least amount of pain.
Why should you go to additional work to find dependencies in your
development environment? Shouldn't Spack be able to help you build
your software based on the package you've already written?
In this section, we show how Spack can be used in the software
development process to greatest effect, and how development packages
can be seamlessly integrated into the Spack ecosystem. We will show
how this process works by example, assuming the software you are
creating is called ``mylib``.
^^^^^^^^^^^^^^^^^^^^^
Write the CMake Build
^^^^^^^^^^^^^^^^^^^^^
For now, the techniques in this section only work for CMake-based
projects, although they could be easily extended to other build
systems in the future. We will therefore assume you are using CMake
to build your project.
The ``CMakeLists.txt`` file should be written as normal. A few caveats:
1. Your project should produce binaries with RPATHs. This will ensure
that they work the same whether built manually or automatically by
Spack. For example:
.. code-block:: cmake
# enable @rpath in the install name for any shared library being built
# note: it is planned that a future version of CMake will enable this by default
set(CMAKE_MACOSX_RPATH 1)
# Always use full RPATH
# http://www.cmake.org/Wiki/CMake_RPATH_handling
# http://www.kitware.com/blog/home/post/510
# use, i.e. don't skip the full RPATH for the build tree
SET(CMAKE_SKIP_BUILD_RPATH FALSE)
# when building, don't use the install RPATH already
# (but later on when installing)
SET(CMAKE_BUILD_WITH_INSTALL_RPATH FALSE)
# add the automatically determined parts of the RPATH
# which point to directories outside the build tree to the install RPATH
SET(CMAKE_INSTALL_RPATH_USE_LINK_PATH TRUE)
# the RPATH to be used when installing, but only if it's not a system directory
LIST(FIND CMAKE_PLATFORM_IMPLICIT_LINK_DIRECTORIES "${CMAKE_INSTALL_PREFIX}/lib" isSystemDir)
IF("${isSystemDir}" STREQUAL "-1")
SET(CMAKE_INSTALL_RPATH "${CMAKE_INSTALL_PREFIX}/lib")
ENDIF("${isSystemDir}" STREQUAL "-1")
2. Spack provides a CMake variable called
``SPACK_TRANSITIVE_INCLUDE_PATH``, which contains the ``include/``
directory for all of your project's transitive dependencies. It
can be useful if your project ``#include``s files from package B,
which ``#include`` files from package C, but your project only
lists project B as a dependency. This works in traditional
single-tree build environments, in which B and C's include files
live in the same place. In order to make it work with Spack as
well, you must add the following to ``CMakeLists.txt``. It will
have no effect when building without Spack:
.. code-block:: cmake
# Include all the transitive dependencies determined by Spack.
# If we're not running with Spack, this does nothing...
include_directories($ENV{SPACK_TRANSITIVE_INCLUDE_PATH})
.. note::
Note that this feature is controversial and could break with
future versions of GNU ld. The best practice is to make sure
anything you ``#include`` is listed as a dependency in your
CMakeLists.txt (and Spack package).
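To illustrate what such a variable holds, here is a sketch that assembles the ``include/`` directories of a list of dependency prefixes into one string; the semicolon separator (what CMake lists use) and the helper name are our assumptions for illustration, not Spack's implementation:

```python
import os


def transitive_include_path(prefixes, sep=";"):
    """Join <prefix>/include for each dependency prefix into one string."""
    return sep.join(os.path.join(p, "include") for p in prefixes)
```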
.. _write-the-spack-package:
^^^^^^^^^^^^^^^^^^^^^^^
Write the Spack Package
^^^^^^^^^^^^^^^^^^^^^^^
The Spack package also needs to be written, in tandem with setting up
the build (for example, CMake). The most important part of this task
is declaring dependencies. Here is an example of the Spack package
for the ``mylib`` package (ellipses for brevity):
.. code-block:: python
class Mylib(CMakePackage):
"""Misc. reusable utilities used by Myapp."""
homepage = "https://github.com/citibeth/mylib"
url = "https://github.com/citibeth/mylib/tarball/123"
version('0.1.2', '3a6acd70085e25f81b63a7e96c504ef9')
version('develop', git='https://github.com/citibeth/mylib.git',
branch='develop')
variant('everytrace', default=False,
description='Report errors through Everytrace')
...
extends('python')
depends_on('eigen')
depends_on('everytrace', when='+everytrace')
depends_on('proj', when='+proj')
...
depends_on('cmake', type='build')
depends_on('doxygen', type='build')
def cmake_args(self):
spec = self.spec
return [
'-DUSE_EVERYTRACE=%s' % ('YES' if '+everytrace' in spec else 'NO'),
'-DUSE_PROJ4=%s' % ('YES' if '+proj' in spec else 'NO'),
...
'-DUSE_UDUNITS2=%s' % ('YES' if '+udunits2' in spec else 'NO'),
'-DUSE_GTEST=%s' % ('YES' if '+googletest' in spec else 'NO')]
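The repetitive on/off flags in ``cmake_args`` above can be generated by a small helper; a sketch, where the helper name is ours and the spec is modeled as a plain set of ``+variant`` strings rather than a real Spack ``Spec``:

```python
def onoff_flags(variants, names):
    """Map variant names to -DUSE_<NAME>=YES/NO CMake arguments.

    ``variants`` is modeled as a set of '+name' strings for illustration.
    """
    return ['-DUSE_%s=%s' % (n.upper(), 'YES' if '+' + n in variants else 'NO')
            for n in names]
```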
This is a standard Spack package that can be used to install
``mylib`` in a production environment. The list of dependencies in
the Spack package will generally be a repeat of the list of CMake
dependencies. This package also has some features that allow it to be
used for development:
1. It subclasses ``CMakePackage`` instead of ``Package``. This
eliminates the need to write an ``install()`` method, which is
defined in the superclass. Instead, one just needs to write the
``cmake_args()`` method. That method should return the
arguments needed for the ``cmake`` command (beyond the standard
CMake arguments, which Spack will include already). These
arguments are typically used to turn features on/off in the build.
2. It specifies a non-checksummed version ``develop``. Running
``spack install mylib@develop`` will install the latest version off
the develop branch. This method of
download is useful for the developer of a project while it is in
active development; however, it should only be used by developers
who control and trust the repository in question!
3. The ``url``, ``url_for_version()`` and ``homepage`` attributes are
not used in development. Don't worry if you don't have any, or if
they are behind a firewall.
^^^^^^^^^^^^^^^^
Build with Spack
^^^^^^^^^^^^^^^^
Now that you have a Spack package, you can use Spack to find its
dependencies automatically. For example:
.. code-block:: console
$ cd mylib
$ spack setup mylib@local
The result will be a file ``spconfig.py`` in the top-level
``mylib/`` directory. It is a short script that calls CMake with the
dependencies and options determined by Spack --- similar to what
happens in ``spack install``, but now written out in script form.
From a developer's point of view, you can think of ``spconfig.py`` as
a stand-in for the ``cmake`` command.
.. note::
You can invent any "version" you like for the ``spack setup``
command.
.. note::
Although ``spack setup`` does not build your package, it does
create and install a module file, and mark in the database that
your package has been installed. This can lead to errors, of
course, if you don't subsequently install your package.
Also... you will need to ``spack uninstall`` before you run
``spack setup`` again.
You can now build your project as usual with CMake:
.. code-block:: console
$ mkdir build; cd build
$ ../spconfig.py .. # Instead of cmake ..
$ make
$ make install
Once your ``make install`` command is complete, your package will be
installed, just as if you'd run ``spack install``. Except you can now
edit, re-build and re-install as often as needed, without checking
into Git or downloading tarballs.
.. note::
The build you get this way will be *almost* the same as the build
from ``spack install``. The only difference is, you will not be
using Spack's compiler wrappers. This difference has not caused
problems in our experience, as long as your project sets
RPATHs as shown above. You DO use RPATHs, right?
^^^^^^^^^^^^^^^^^^^^
Build Other Software
^^^^^^^^^^^^^^^^^^^^
Now that you've built ``mylib`` with Spack, you might want to build
another package that depends on it --- for example, ``myapp``. This
is accomplished easily enough:
.. code-block:: console
$ spack install myapp ^mylib@local
Note that auto-built software has now been installed *on top of*
manually-built software, without breaking Spack's "web." This
property is useful if you need to debug a package deep in the
dependency hierarchy of your application. It is a *big* advantage of
using ``spack setup`` to build your package's environment.
If you feel your software is stable, you might wish to install it with
``spack install`` and skip the source directory. You can just use,
for example:
.. code-block:: console
$ spack install mylib@develop
.. _release-your-software:
^^^^^^^^^^^^^^^^^^^^^
Release Your Software
^^^^^^^^^^^^^^^^^^^^^
You are now ready to release your software as a tarball with a
numbered version, and a Spack package that can build it. If you're
hosted on GitHub, this process will be a bit easier.
#. Put tag(s) on the version(s) in your GitHub repo you want to be
release versions. For example, a tag ``v0.1.0`` for version 0.1.0.
#. Set the ``url`` in your ``package.py`` to download a tarball for
the appropriate version. GitHub will give you a tarball for any
commit in the repo, if you tickle it the right way. For example:
.. code-block:: python
url = 'https://github.com/citibeth/mylib/tarball/v0.1.2'
#. Use Spack to determine your version's hash, and cut'n'paste it into
your ``package.py``:
.. code-block:: console
$ spack checksum mylib 0.1.2
==> Found 1 versions of mylib
0.1.2 https://github.com/citibeth/mylib/tarball/v0.1.2
How many would you like to checksum? (default is 5, q to abort)
==> Downloading...
==> Trying to fetch from https://github.com/citibeth/mylib/tarball/v0.1.2
######################################################################## 100.0%
==> Checksummed new versions of mylib:
version('0.1.2', '3a6acd70085e25f81b63a7e96c504ef9')
#. You should now be able to install released version 0.1.2 of your package with:
.. code-block:: console
$ spack install mylib@0.1.2
#. There is no need to remove the ``develop`` version from your package.
Spack concretization will always prefer numbered versions to
non-numeric versions, so users will only get ``develop`` if they ask for it.
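The preference for numbered versions can be sketched with a simple sort key; this is an illustration of the idea, not Spack's actual version-comparison code:

```python
def version_key(v):
    """Sort key sketch: numbered releases rank above named versions
    like 'develop', and numbered releases compare component-wise."""
    parts = v.split('.')
    numeric = all(p.isdigit() for p in parts)
    return (numeric, [int(p) for p in parts] if numeric else [])
```

With this key, ``max(['develop', '0.1.2', '0.2.0'], key=version_key)`` picks the highest numbered release rather than the named branch.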
^^^^^^^^^^^^^^^^^^^^^^^^
Distribute Your Software
^^^^^^^^^^^^^^^^^^^^^^^^
Once you've released your software, other people will want to build
it, and you will need to tell them how. In the past, that has meant a
few paragraphs of prose explaining which dependencies to install. Now
that you use Spack, those instructions are captured in executable
Python code, and the best way for others to install your many
dependencies is with Spack itself:
#. First, you will want to fork Spack's ``develop`` branch. Your aim
is to provide a stable version of Spack that you KNOW will install
your software. If you make changes to Spack in the process, you
will want to submit pull requests to Spack core.
#. Add your software's ``package.py`` to that fork. You should submit
a pull request for this as well, unless you don't want the public
to know about your software.
#. Prepare instructions that read approximately as follows:
#. Download Spack from your forked repo.
#. Install Spack; see :ref:`getting_started`.
#. Set up an appropriate ``packages.yaml`` file. You should tell
your users to include in this file whatever versions/variants
are needed to make your software work correctly (assuming those
are not already in your ``packages.yaml``).
#. Run ``spack install mylib``.
#. Run this script to generate the ``module load`` commands or
filesystem view needed to use this software.
#. Be aware that your users might encounter unexpected bootstrapping
issues on their machines, especially if they are running on older
systems. The :ref:`getting_started` section should cover this, but
there could always be issues.
^^^^^^^^^^^^^^^^^^^
Other Build Systems
^^^^^^^^^^^^^^^^^^^
``spack setup`` currently only supports CMake-based builds, in
packages that subclass ``CMakePackage``. The intent is that this
mechanism should support a wider range of build systems; for example,
GNU Autotools. Someone well-versed in Autotools is needed to develop
this patch and test it out.
Python Distutils is another popular build system that should get
``spack setup`` support. For non-compiled languages like Python,
``spack diy`` may be used. Even better is to put the source directory
directly in the user's ``PYTHONPATH``. Then, edits in source files
are immediately available to run without any install process at all!
^^^^^^^^^^
Conclusion
^^^^^^^^^^
The ``spack setup`` development workflow provides better automation,
flexibility and safety than workflows relying on environment modules
or filesystem views. However, it has some drawbacks:
#. It currently works only with projects that use the CMake build
system. Support for other build systems is not hard to build, but
will require a small amount of effort for each build system to be
supported. It might not work well with some IDEs.
#. It only works with packages that sub-class ``StagedPackage``.
Currently, most Spack packages do not. Converting them is not
hard; but must be done on a package-by-package basis.
#. It requires that users are comfortable with Spack, as they
integrate Spack explicitly in their workflow. Not all users are
willing to do this.
-------------------------------------
Using Spack to Replace Homebrew/Conda
-------------------------------------
@@ -1405,11 +1011,12 @@ The main points that are implemented below:
- export CXXFLAGS="-std=c++11"
install:
- |
if ! which spack >/dev/null; then
mkdir -p $SPACK_ROOT &&
git clone --depth 50 https://github.com/spack/spack.git $SPACK_ROOT &&
printf "config:\n build_jobs: 2\n" > $SPACK_ROOT/etc/spack/config.yaml &&
printf "packages:\n all:\n target: ['x86_64']\n" \
> $SPACK_ROOT/etc/spack/packages.yaml;
fi
- travis_wait spack install cmake@3.7.2~openssl~ncurses
@@ -1544,8 +1151,9 @@ Avoid double-installing CUDA by adding, e.g.
packages:
cuda:
externals:
- spec: "cuda@9.0.176%gcc@5.4.0 arch=linux-ubuntu16-x86_64"
prefix: /usr/local/cuda
buildable: False
to your ``packages.yaml``.


@@ -118,6 +118,7 @@ def match(self, text):
"([^:]+): (Error:|error|undefined reference|multiply defined)",
"([^ :]+) ?: (error|fatal error|catastrophic error)",
"([^:]+)\\(([^\\)]+)\\) ?: (error|fatal error|catastrophic error)"),
"^FAILED",
"^[Bb]us [Ee]rror",
"^[Ss]egmentation [Vv]iolation",
"^[Ss]egmentation [Ff]ault",


@@ -652,15 +652,14 @@
"avx512cd",
"avx512vbmi",
"avx512ifma",
"sha",
"sha_ni",
"umip",
"clwb",
"rdpid",
"gfni",
"avx512vbmi2",
"avx512vpopcntdq",
"avx512bitalg",
"avx512vnni",
"avx512_vbmi2",
"avx512_vpopcntdq",
"avx512_bitalg",
"avx512_vnni",
"vpclmulqdq",
"vaes"
],


@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import errno
import hashlib
@@ -42,6 +41,8 @@
'fix_darwin_install_name',
'force_remove',
'force_symlink',
'chgrp',
'chmod_x',
'copy',
'install',
'copy_tree',
@@ -52,6 +53,7 @@
'partition_path',
'prefixes',
'remove_dead_links',
'remove_directory_contents',
'remove_if_dead_link',
'remove_linked_tree',
'set_executable',
@@ -338,56 +340,78 @@ def unset_executable_mode(path):
def copy(src, dest, _permissions=False):
"""Copies the file *src* to the file or directory *dest*.
"""Copy the file(s) *src* to the file or directory *dest*.
If *dest* specifies a directory, the file will be copied into *dest*
using the base filename from *src*.
*src* may contain glob characters.
Parameters:
src (str): the file to copy
src (str): the file(s) to copy
dest (str): the destination file or directory
_permissions (bool): for internal use only
Raises:
IOError: if *src* does not match any files or directories
ValueError: if *src* matches multiple files but *dest* is
not a directory
"""
if _permissions:
tty.debug('Installing {0} to {1}'.format(src, dest))
else:
tty.debug('Copying {0} to {1}'.format(src, dest))
# Expand dest to its eventual full path if it is a directory.
if os.path.isdir(dest):
dest = join_path(dest, os.path.basename(src))
files = glob.glob(src)
if not files:
raise IOError("No such file or directory: '{0}'".format(src))
if len(files) > 1 and not os.path.isdir(dest):
raise ValueError(
"'{0}' matches multiple files but '{1}' is not a directory".format(
src, dest))
shutil.copy(src, dest)
for src in files:
# Expand dest to its eventual full path if it is a directory.
dst = dest
if os.path.isdir(dest):
dst = join_path(dest, os.path.basename(src))
if _permissions:
set_install_permissions(dest)
copy_mode(src, dest)
shutil.copy(src, dst)
if _permissions:
set_install_permissions(dst)
copy_mode(src, dst)
def install(src, dest):
"""Installs the file *src* to the file or directory *dest*.
"""Install the file(s) *src* to the file or directory *dest*.
Same as :py:func:`copy` with the addition of setting proper
permissions on the installed file.
Parameters:
src (str): the file to install
src (str): the file(s) to install
dest (str): the destination file or directory
Raises:
IOError: if *src* does not match any files or directories
ValueError: if *src* matches multiple files but *dest* is
not a directory
"""
copy(src, dest, _permissions=True)
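The glob-aware semantics of ``copy``/``install`` above can be exercised with a minimal standalone sketch (not the Spack implementation itself):

```python
import glob
import os
import shutil


def copy_globbed(src, dest):
    """Copy file(s) matching *src* (which may contain glob characters)
    into the file or directory *dest*."""
    files = glob.glob(src)
    if not files:
        raise IOError("No such file or directory: '{0}'".format(src))
    if len(files) > 1 and not os.path.isdir(dest):
        raise ValueError(
            "'{0}' matches multiple files but '{1}' is not a directory".format(
                src, dest))
    for f in files:
        # Expand dest to its eventual full path if it is a directory.
        dst = os.path.join(dest, os.path.basename(f)) if os.path.isdir(dest) else dest
        shutil.copy(f, dst)
```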
def resolve_link_target_relative_to_the_link(l):
def resolve_link_target_relative_to_the_link(link):
"""
os.path.isdir uses os.path.exists, which for links will check
the existence of the link target. If the link target is relative to
the link, we need to construct a pathname that is valid from
our cwd (which may not be the same as the link's directory)
"""
target = os.readlink(l)
target = os.readlink(link)
if os.path.isabs(target):
return target
link_dir = os.path.dirname(os.path.abspath(l))
link_dir = os.path.dirname(os.path.abspath(link))
return os.path.join(link_dir, target)
@@ -397,6 +421,8 @@ def copy_tree(src, dest, symlinks=True, ignore=None, _permissions=False):
If the destination directory *dest* does not already exist, it will
be created as well as missing parent directories.
*src* may contain glob characters.
If *symlinks* is true, symbolic links in the source tree are represented
as symbolic links in the new tree and the metadata of the original links
will be copied as far as the platform allows; if false, the contents and
@@ -411,56 +437,66 @@ def copy_tree(src, dest, symlinks=True, ignore=None, _permissions=False):
symlinks (bool): whether or not to preserve symlinks
ignore (function): function indicating which files to ignore
_permissions (bool): for internal use only
Raises:
IOError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
if _permissions:
tty.debug('Installing {0} to {1}'.format(src, dest))
else:
tty.debug('Copying {0} to {1}'.format(src, dest))
abs_src = os.path.abspath(src)
if not abs_src.endswith(os.path.sep):
abs_src += os.path.sep
abs_dest = os.path.abspath(dest)
if not abs_dest.endswith(os.path.sep):
abs_dest += os.path.sep
# Stop early to avoid unnecessary recursion if being asked to copy from a
# parent directory.
if abs_dest.startswith(abs_src):
raise ValueError('Cannot copy ancestor directory {0} into {1}'.
format(abs_src, abs_dest))
files = glob.glob(src)
if not files:
raise IOError("No such file or directory: '{0}'".format(src))
mkdirp(dest)
for src in files:
abs_src = os.path.abspath(src)
if not abs_src.endswith(os.path.sep):
abs_src += os.path.sep
for s, d in traverse_tree(abs_src, abs_dest, order='pre',
follow_symlinks=not symlinks,
ignore=ignore,
follow_nonexisting=True):
if os.path.islink(s):
link_target = resolve_link_target_relative_to_the_link(s)
if symlinks:
target = os.readlink(s)
if os.path.isabs(target):
new_target = re.sub(abs_src, abs_dest, target)
if new_target != target:
tty.debug("Redirecting link {0} to {1}"
.format(target, new_target))
target = new_target
# Stop early to avoid unnecessary recursion if being asked to copy
# from a parent directory.
if abs_dest.startswith(abs_src):
raise ValueError('Cannot copy ancestor directory {0} into {1}'.
format(abs_src, abs_dest))
os.symlink(target, d)
elif os.path.isdir(link_target):
mkdirp(d)
mkdirp(abs_dest)
for s, d in traverse_tree(abs_src, abs_dest, order='pre',
follow_symlinks=not symlinks,
ignore=ignore,
follow_nonexisting=True):
if os.path.islink(s):
link_target = resolve_link_target_relative_to_the_link(s)
if symlinks:
target = os.readlink(s)
if os.path.isabs(target):
new_target = re.sub(abs_src, abs_dest, target)
if new_target != target:
tty.debug("Redirecting link {0} to {1}"
.format(target, new_target))
target = new_target
os.symlink(target, d)
elif os.path.isdir(link_target):
mkdirp(d)
else:
shutil.copyfile(s, d)
else:
shutil.copyfile(s, d)
else:
if os.path.isdir(s):
mkdirp(d)
else:
shutil.copy2(s, d)
if os.path.isdir(s):
mkdirp(d)
else:
shutil.copy2(s, d)
if _permissions:
set_install_permissions(d)
copy_mode(s, d)
if _permissions:
set_install_permissions(d)
copy_mode(s, d)
def install_tree(src, dest, symlinks=True, ignore=None):
@@ -474,6 +510,10 @@ def install_tree(src, dest, symlinks=True, ignore=None):
dest (str): the destination directory
symlinks (bool): whether or not to preserve symlinks
ignore (function): function indicating which files to ignore
Raises:
IOError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
copy_tree(src, dest, symlinks=symlinks, ignore=ignore, _permissions=True)
@@ -643,7 +683,7 @@ def replace_directory_transaction(directory_name, tmp_root=None):
try:
yield tmp_dir
except (Exception, KeyboardInterrupt, SystemExit):
except (Exception, KeyboardInterrupt, SystemExit) as e:
# Delete what was there, before copying back the original content
if os.path.exists(directory_name):
shutil.rmtree(directory_name)
@@ -654,6 +694,7 @@ def replace_directory_transaction(directory_name, tmp_root=None):
tty.debug('DIRECTORY RECOVERED [{0}]'.format(directory_name))
msg = 'the transactional move of "{0}" failed.'
msg += '\n ' + str(e)
raise RuntimeError(msg.format(directory_name))
else:
# Otherwise delete the temporary directory
@@ -937,6 +978,53 @@ def remove_linked_tree(path):
shutil.rmtree(path, True)
@contextmanager
def safe_remove(*files_or_dirs):
"""Context manager to remove the files passed as input, but restore
them in case any exception is raised in the context block.
Args:
*files_or_dirs: glob expressions for files or directories
to be removed
Returns:
Dictionary that maps deleted files to their temporary copy
within the context block.
"""
# Find all the files or directories that match
glob_matches = [glob.glob(x) for x in files_or_dirs]
# Sort them so that shorter paths like "/foo/bar" come before
# nested paths like "/foo/bar/baz.yaml". This simplifies the
# handling of temporary copies below
sorted_matches = sorted([
os.path.abspath(x) for x in itertools.chain(*glob_matches)
], key=len)
# Copy files and directories in a temporary location
removed, dst_root = {}, tempfile.mkdtemp()
try:
for id, file_or_dir in enumerate(sorted_matches):
# The glob expression at the top ensures that the file/dir exists
# at the time we enter the loop. Double check here since it might
# happen that a previous iteration of the loop already removed it.
# This is the case, for instance, if we remove the directory
# "/foo/bar" before the file "/foo/bar/baz.yaml".
if not os.path.exists(file_or_dir):
continue
# The monotonic ID is a simple way to make the filename
# or directory name unique in the temporary folder
basename = os.path.basename(file_or_dir) + '-{0}'.format(id)
temporary_path = os.path.join(dst_root, basename)
shutil.move(file_or_dir, temporary_path)
removed[file_or_dir] = temporary_path
yield removed
except BaseException:
# Restore the files that were removed
for original_path, temporary_path in removed.items():
shutil.move(temporary_path, original_path)
raise
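The restore-on-failure pattern used by ``safe_remove`` can be illustrated for a single path; a simplified sketch of the idea, not the function above:

```python
import os
import shutil
import tempfile
from contextlib import contextmanager


@contextmanager
def restoring_remove(path):
    """Move *path* aside; put it back if the block raises, discard otherwise."""
    backup = tempfile.mkdtemp()
    tmp = os.path.join(backup, os.path.basename(path))
    shutil.move(path, tmp)
    try:
        yield tmp
    except BaseException:
        # Restore the original path before propagating the error
        shutil.move(tmp, path)
        raise
```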
def fix_darwin_install_name(path):
"""Fix install name of dynamic libraries on Darwin to have full path.
@@ -1570,6 +1658,19 @@ def can_access_dir(path):
return os.path.isdir(path) and os.access(path, os.R_OK | os.X_OK)
@memoized
def can_write_to_dir(path):
"""Return True if the argument is a directory in which we can write.
Args:
path: path to be tested
Returns:
True if ``path`` is a writable directory, else False
"""
return os.path.isdir(path) and os.access(path, os.R_OK | os.X_OK | os.W_OK)
@memoized
def files_in(*search_paths):
"""Returns all the files in paths passed as arguments.
@@ -1683,3 +1784,28 @@ def prefixes(path):
pass
return paths
def md5sum(file):
"""Compute the MD5 sum of a file.
Args:
file (str): file to be checksummed
Returns:
MD5 sum of the file's content
"""
md5 = hashlib.md5()
with open(file, "rb") as f:
md5.update(f.read())
return md5.digest()
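Reading the whole file into memory is fine for small files; for large ones, a chunked variant keeps memory use bounded. A sketch (the chunked helper is our suggestion, not part of this change):

```python
import hashlib


def md5sum_chunked(path, blocksize=65536):
    """Compute the MD5 digest of a file without reading it all at once."""
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(blocksize), b""):
            md5.update(block)
    return md5.digest()
```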
def remove_directory_contents(dir):
"""Remove all contents of a directory."""
if os.path.exists(dir):
for entry in [os.path.join(dir, entry) for entry in os.listdir(dir)]:
if os.path.isfile(entry) or os.path.islink(entry):
os.unlink(entry)
else:
shutil.rmtree(entry)


@@ -5,6 +5,7 @@
from __future__ import division
import multiprocessing
import os
import re
import functools
@@ -19,49 +20,67 @@
ignore_modules = [r'^\.#', '~$']
# On macOS, Python 3.8 multiprocessing now defaults to the 'spawn' start
# method. Spack cannot currently handle this, so force the process to start
# using the 'fork' start method.
#
# TODO: This solution is not ideal, as the 'fork' start method can lead to
# crashes of the subprocess. Figure out how to make 'spawn' work.
#
# See:
# * https://github.com/spack/spack/pull/18124
# * https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods # noqa: E501
# * https://bugs.python.org/issue33725
if sys.version_info >= (3,): # novm
fork_context = multiprocessing.get_context('fork')
else:
fork_context = multiprocessing
def index_by(objects, *funcs):
"""Create a hierarchy of dictionaries by splitting the supplied
set of objects on unique values of the supplied functions.
Values are used as keys. For example, suppose you have four
objects with attributes that look like this::
set of objects on unique values of the supplied functions.
a = Spec(name="boost", compiler="gcc", arch="bgqos_0")
b = Spec(name="mrnet", compiler="intel", arch="chaos_5_x86_64_ib")
c = Spec(name="libelf", compiler="xlc", arch="bgqos_0")
d = Spec(name="libdwarf", compiler="intel", arch="chaos_5_x86_64_ib")
Values are used as keys. For example, suppose you have four
objects with attributes that look like this::
list_of_specs = [a,b,c,d]
index1 = index_by(list_of_specs, lambda s: s.arch,
lambda s: s.compiler)
index2 = index_by(list_of_specs, lambda s: s.compiler)
a = Spec("boost %gcc target=skylake")
b = Spec("mrnet %intel target=zen2")
c = Spec("libelf %xlc target=skylake")
d = Spec("libdwarf %intel target=zen2")
``index1`` now has two levels of dicts, with lists at the
leaves, like this::
list_of_specs = [a,b,c,d]
index1 = index_by(list_of_specs, lambda s: str(s.target),
lambda s: s.compiler)
index2 = index_by(list_of_specs, lambda s: s.compiler)
{ 'bgqos_0' : { 'gcc' : [a], 'xlc' : [c] },
'chaos_5_x86_64_ib' : { 'intel' : [b, d] }
}
``index1`` now has two levels of dicts, with lists at the
leaves, like this::
And ``index2`` is a single level dictionary of lists that looks
like this::
{ 'zen2' : { 'gcc' : [a], 'xlc' : [c] },
'skylake' : { 'intel' : [b, d] }
}
{ 'gcc' : [a],
'intel' : [b,d],
'xlc' : [c]
}
And ``index2`` is a single level dictionary of lists that looks
like this::
If any elemnts in funcs is a string, it is treated as the name
of an attribute, and acts like getattr(object, name). So
shorthand for the above two indexes would be::
{ 'gcc' : [a],
'intel' : [b,d],
'xlc' : [c]
}
index1 = index_by(list_of_specs, 'arch', 'compiler')
index2 = index_by(list_of_specs, 'compiler')
If any elements in funcs is a string, it is treated as the name
of an attribute, and acts like getattr(object, name). So
shorthand for the above two indexes would be::
You can also index by tuples by passing tuples::
index1 = index_by(list_of_specs, 'arch', 'compiler')
index2 = index_by(list_of_specs, 'compiler')
index1 = index_by(list_of_specs, ('arch', 'compiler'))
You can also index by tuples by passing tuples::
Keys in the resulting dict will look like ('gcc', 'bgqos_0').
index1 = index_by(list_of_specs, ('target', 'compiler'))
Keys in the resulting dict will look like ('gcc', 'skylake').
"""
if not funcs:
return objects


@@ -21,6 +21,7 @@
from six import StringIO
import llnl.util.tty as tty
from llnl.util.lang import fork_context
try:
import termios
@@ -323,7 +324,8 @@ class log_output(object):
work within test frameworks like nose and pytest.
"""
def __init__(self, file_like=None, echo=False, debug=0, buffer=False):
def __init__(self, file_like=None, output=None, error=None,
echo=False, debug=0, buffer=False):
"""Create a new output log context manager.
Args:
@@ -348,13 +350,15 @@ def __init__(self, file_like=None, echo=False, debug=0, buffer=False):
"""
self.file_like = file_like
self.output = output or sys.stdout
self.error = error or sys.stderr
self.echo = echo
self.debug = debug
self.buffer = buffer
self._active = False # used to prevent re-entry
def __call__(self, file_like=None, echo=None, debug=None, buffer=None):
def __call__(self, file_like=None, output=None, error=None,
echo=None, debug=None, buffer=None):
"""This behaves the same as init. It allows a logger to be reused.
Arguments are the same as for ``__init__()``. Args here take
@@ -375,6 +379,10 @@ def __call__(self, file_like=None, echo=None, debug=None, buffer=None):
"""
if file_like is not None:
self.file_like = file_like
if output is not None:
self.output = output
if error is not None:
self.error = error
if echo is not None:
self.echo = echo
if debug is not None:
@@ -430,11 +438,11 @@ def __enter__(self):
except BaseException:
input_stream = None # just don't forward input if this fails
self.process = multiprocessing.Process(
self.process = fork_context.Process(
target=_writer_daemon,
args=(
input_stream, read_fd, write_fd, self.echo, self.log_file,
child_pipe
input_stream, read_fd, write_fd, self.echo, self.output,
self.log_file, child_pipe
)
)
self.process.daemon = True # must set before start()
@@ -447,43 +455,54 @@ def __enter__(self):
# Flush immediately before redirecting so that anything buffered
# goes to the original stream
sys.stdout.flush()
sys.stderr.flush()
self.output.flush()
self.error.flush()
# sys.stdout.flush()
# sys.stderr.flush()
# Now do the actual output redirection.
self.use_fds = _file_descriptors_work(sys.stdout, sys.stderr)
self.use_fds = _file_descriptors_work(self.output, self.error)
if self.use_fds:
# We try first to use OS-level file descriptors, as this
# redirects output for subprocesses and system calls.
# Save old stdout and stderr file descriptors
self._saved_stdout = os.dup(sys.stdout.fileno())
self._saved_stderr = os.dup(sys.stderr.fileno())
self._saved_output = os.dup(self.output.fileno())
self._saved_error = os.dup(self.error.fileno())
# self._saved_stdout = os.dup(sys.stdout.fileno())
# self._saved_stderr = os.dup(sys.stderr.fileno())
# redirect to the pipe we created above
os.dup2(write_fd, sys.stdout.fileno())
os.dup2(write_fd, sys.stderr.fileno())
os.dup2(write_fd, self.output.fileno())
os.dup2(write_fd, self.error.fileno())
# os.dup2(write_fd, sys.stdout.fileno())
# os.dup2(write_fd, sys.stderr.fileno())
os.close(write_fd)
else:
# Handle I/O the Python way. This won't redirect lower-level
# output, but it's the best we can do, and the caller
# shouldn't expect any better, since *they* have apparently
# redirected I/O the Python way.
# Save old stdout and stderr file objects
self._saved_stdout = sys.stdout
self._saved_stderr = sys.stderr
self._saved_output = self.output
self._saved_error = self.error
# self._saved_stdout = sys.stdout
# self._saved_stderr = sys.stderr
# create a file object for the pipe; redirect to it.
pipe_fd_out = os.fdopen(write_fd, 'w')
sys.stdout = pipe_fd_out
sys.stderr = pipe_fd_out
self.output = pipe_fd_out
self.error = pipe_fd_out
# sys.stdout = pipe_fd_out
# sys.stderr = pipe_fd_out
# Unbuffer stdout and stderr at the Python level
if not self.buffer:
sys.stdout = Unbuffered(sys.stdout)
sys.stderr = Unbuffered(sys.stderr)
self.output = Unbuffered(self.output)
self.error = Unbuffered(self.error)
# sys.stdout = Unbuffered(sys.stdout)
# sys.stderr = Unbuffered(sys.stderr)
# Force color and debug settings now that we have redirected.
tty.color.set_color_when(forced_color)
@@ -498,20 +517,29 @@ def __enter__(self):
def __exit__(self, exc_type, exc_val, exc_tb):
# Flush any buffered output to the logger daemon.
sys.stdout.flush()
sys.stderr.flush()
self.output.flush()
self.error.flush()
# sys.stdout.flush()
# sys.stderr.flush()
# restore previous output settings, either the low-level way or
# the python way
if self.use_fds:
os.dup2(self._saved_stdout, sys.stdout.fileno())
os.close(self._saved_stdout)
os.dup2(self._saved_output, self.output.fileno())
os.close(self._saved_output)
os.dup2(self._saved_stderr, sys.stderr.fileno())
os.close(self._saved_stderr)
os.dup2(self._saved_error, self.error.fileno())
os.close(self._saved_error)
# os.dup2(self._saved_stdout, sys.stdout.fileno())
# os.close(self._saved_stdout)
# os.dup2(self._saved_stderr, sys.stderr.fileno())
# os.close(self._saved_stderr)
else:
sys.stdout = self._saved_stdout
sys.stderr = self._saved_stderr
self.output = self._saved_output
self.error = self._saved_error
# sys.stdout = self._saved_stdout
# sys.stderr = self._saved_stderr
# print log contents in parent if needed.
if self.write_log_in_parent:
@@ -545,16 +573,17 @@ def force_echo(self):
# output. We use these control characters rather than, say, a
# separate pipe, because they're in-band and assured to appear
# exactly before and after the text we want to echo.
sys.stdout.write(xon)
sys.stdout.flush()
self.output.write(xon)
self.output.flush()
try:
yield
finally:
sys.stdout.write(xoff)
sys.stdout.flush()
self.output.write(xoff)
self.output.flush()
def _writer_daemon(stdin, read_fd, write_fd, echo, log_file, control_pipe):
def _writer_daemon(stdin, read_fd, write_fd, echo, echo_stream, log_file,
control_pipe):
"""Daemon used by ``log_output`` to write to a log file and to ``stdout``.
The daemon receives output from the parent process and writes it both
@@ -597,6 +626,7 @@ def _writer_daemon(stdin, read_fd, write_fd, echo, log_file, control_pipe):
immediately closed by the writer daemon)
echo (bool): initial echo setting -- controlled by user and
preserved across multiple writer daemons
echo_stream (stream): output to echo to when echoing
log_file (file-like): file to log all output
control_pipe (Pipe): multiprocessing pipe on which to send control
information to the parent
@@ -651,8 +681,8 @@ def _writer_daemon(stdin, read_fd, write_fd, echo, log_file, control_pipe):
# Echo to stdout if requested or forced.
if echo or force_echo:
sys.stdout.write(line)
sys.stdout.flush()
echo_stream.write(line)
echo_stream.flush()
# Stripped output to log file.
log_file.write(_strip(line))
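The OS-level redirection that `__enter__`/`__exit__` perform above boils down to `os.dup`/`os.dup2` on the stream's file descriptor. Here is a minimal self-contained sketch of the same save/redirect/restore dance, using a temporary file in place of the daemon's pipe (names like `saved` are illustrative, not Spack's):

```python
import os
import sys
import tempfile

# Save the real stdout fd, point fd 1 at a temp file, then restore.
log = tempfile.NamedTemporaryFile(mode='w+', delete=False)

sys.stdout.flush()                           # flush before touching the fd
saved = os.dup(sys.stdout.fileno())          # like self._saved_output above
os.dup2(log.fileno(), sys.stdout.fileno())   # redirect fd 1 into the file
try:
    print("captured")                        # goes to the temp file, not the tty
    sys.stdout.flush()
finally:
    os.dup2(saved, sys.stdout.fileno())      # restore the original stream
    os.close(saved)

log.seek(0)
captured = log.read()
log.close()
os.unlink(log.name)
```

Because the redirection happens at the file-descriptor level, output from subprocesses and C extensions is captured too, which is why `log_output` prefers this path whenever `_file_descriptors_work` reports that it can.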


@@ -24,6 +24,7 @@
import traceback
import llnl.util.tty.log as log
from llnl.util.lang import fork_context
from spack.util.executable import which
@@ -31,17 +32,17 @@
class ProcessController(object):
"""Wrapper around some fundamental process control operations.
This allows one process to drive another similar to the way a shell
would, by sending signals and I/O.
This allows one process (the controller) to drive another (the
minion) similar to the way a shell would, by sending signals and I/O.
"""
def __init__(self, pid, master_fd,
def __init__(self, pid, controller_fd,
timeout=1, sleep_time=1e-1, debug=False):
"""Create a controller to manipulate the process with id ``pid``
Args:
pid (int): id of process to control
master_fd (int): master file descriptor attached to pid's stdin
controller_fd (int): controller fd attached to pid's stdin
timeout (int): time in seconds for wait operations to time out
(default 1 second)
sleep_time (int): time to sleep after signals, to control the
@@ -58,7 +59,7 @@ def __init__(self, pid, master_fd,
"""
self.pid = pid
self.pgid = os.getpgid(pid)
self.master_fd = master_fd
self.controller_fd = controller_fd
self.timeout = timeout
self.sleep_time = sleep_time
self.debug = debug
@@ -67,8 +68,8 @@ def __init__(self, pid, master_fd,
self.ps = which("ps", required=True)
def get_canon_echo_attrs(self):
"""Get echo and canon attributes of the terminal of master_fd."""
cfg = termios.tcgetattr(self.master_fd)
"""Get echo and canon attributes of the terminal of controller_fd."""
cfg = termios.tcgetattr(self.controller_fd)
return (
bool(cfg[3] & termios.ICANON),
bool(cfg[3] & termios.ECHO),
@@ -82,7 +83,7 @@ def horizontal_line(self, name):
)
def status(self):
"""Print debug message with status info for the child."""
"""Print debug message with status info for the minion."""
if self.debug:
canon, echo = self.get_canon_echo_attrs()
sys.stderr.write("canon: %s, echo: %s\n" % (
@@ -94,12 +95,12 @@ def status(self):
sys.stderr.write("\n")
def input_on(self):
"""True if keyboard input is enabled on the master_fd pty."""
"""True if keyboard input is enabled on the controller_fd pty."""
return self.get_canon_echo_attrs() == (False, False)
def background(self):
"""True if pgid is in a background pgroup of master_fd's terminal."""
return self.pgid != os.tcgetpgrp(self.master_fd)
"""True if pgid is in a background pgroup of controller_fd's tty."""
return self.pgid != os.tcgetpgrp(self.controller_fd)
def tstp(self):
"""Send SIGTSTP to the controlled process."""
@@ -115,18 +116,18 @@ def cont(self):
def fg(self):
self.horizontal_line("fg")
with log.ignore_signal(signal.SIGTTOU):
os.tcsetpgrp(self.master_fd, os.getpgid(self.pid))
os.tcsetpgrp(self.controller_fd, os.getpgid(self.pid))
time.sleep(self.sleep_time)
def bg(self):
self.horizontal_line("bg")
with log.ignore_signal(signal.SIGTTOU):
os.tcsetpgrp(self.master_fd, os.getpgrp())
os.tcsetpgrp(self.controller_fd, os.getpgrp())
time.sleep(self.sleep_time)
def write(self, byte_string):
self.horizontal_line("write '%s'" % byte_string.decode("utf-8"))
os.write(self.master_fd, byte_string)
os.write(self.controller_fd, byte_string)
def wait(self, condition):
start = time.time()
@@ -156,50 +157,51 @@ def wait_running(self):
class PseudoShell(object):
"""Sets up master and child processes with a PTY.
"""Sets up controller and minion processes with a PTY.
You can create a ``PseudoShell`` if you want to test how some
function responds to terminal input. This is a pseudo-shell from a
job control perspective; ``master_function`` and ``child_function``
are set up with a pseudoterminal (pty) so that the master can drive
the child through process control signals and I/O.
job control perspective; ``controller_function`` and ``minion_function``
are set up with a pseudoterminal (pty) so that the controller can drive
the minion through process control signals and I/O.
The two functions should have signatures like this::
def master_function(proc, ctl, **kwargs)
def child_function(**kwargs)
def controller_function(proc, ctl, **kwargs)
def minion_function(**kwargs)
``master_function`` is spawned in its own process and passed three
``controller_function`` is spawned in its own process and passed three
arguments:
proc
the ``multiprocessing.Process`` object representing the child
the ``multiprocessing.Process`` object representing the minion
ctl
a ``ProcessController`` object tied to the child
a ``ProcessController`` object tied to the minion
kwargs
keyword arguments passed from ``PseudoShell.start()``.
``child_function`` is only passed ``kwargs`` delegated from
``minion_function`` is only passed ``kwargs`` delegated from
``PseudoShell.start()``.
The ``ctl.master_fd`` will have its ``master_fd`` connected to
``sys.stdin`` in the child process. Both processes will share the
The ``ctl.controller_fd`` will have its ``controller_fd`` connected to
``sys.stdin`` in the minion process. Both processes will share the
same ``sys.stdout`` and ``sys.stderr`` as the process instantiating
``PseudoShell``.
Here are the relationships between processes created::
._________________________________________________________.
| Child Process | pid 2
| - runs child_function | pgroup 2
| Minion Process | pid 2
| - runs minion_function | pgroup 2
|_________________________________________________________| session 1
^
| create process with master_fd connected to stdin
| create process with controller_fd connected to stdin
| stdout, stderr are the same as caller
._________________________________________________________.
| Master Process | pid 1
| - runs master_function | pgroup 1
| - uses ProcessController and master_fd to control child | session 1
| Controller Process | pid 1
| - runs controller_function | pgroup 1
| - uses ProcessController and controller_fd to | session 1
| control minion |
|_________________________________________________________|
^
| create process
@@ -207,51 +209,51 @@ def child_function(**kwargs)
._________________________________________________________.
| Caller | pid 0
| - Constructs, starts, joins PseudoShell | pgroup 0
| - provides master_function, child_function | session 0
| - provides controller_function, minion_function | session 0
|_________________________________________________________|
"""
def __init__(self, master_function, child_function):
def __init__(self, controller_function, minion_function):
self.proc = None
self.master_function = master_function
self.child_function = child_function
self.controller_function = controller_function
self.minion_function = minion_function
# these can be optionally set to change defaults
self.controller_timeout = 1
self.sleep_time = 0
def start(self, **kwargs):
"""Start the master and child processes.
"""Start the controller and minion processes.
Arguments:
kwargs (dict): arbitrary keyword arguments that will be
passed to master and child functions
passed to controller and minion functions
The master process will create the child, then call
``master_function``. The child process will call
``child_function``.
The controller process will create the minion, then call
``controller_function``. The minion process will call
``minion_function``.
"""
self.proc = multiprocessing.Process(
target=PseudoShell._set_up_and_run_master_function,
args=(self.master_function, self.child_function,
self.proc = fork_context.Process(
target=PseudoShell._set_up_and_run_controller_function,
args=(self.controller_function, self.minion_function,
self.controller_timeout, self.sleep_time),
kwargs=kwargs,
)
self.proc.start()
def join(self):
"""Wait for the child process to finish, and return its exit code."""
"""Wait for the minion process to finish, and return its exit code."""
self.proc.join()
return self.proc.exitcode
@staticmethod
def _set_up_and_run_child_function(
tty_name, stdout_fd, stderr_fd, ready, child_function, **kwargs):
"""Child process wrapper for PseudoShell.
def _set_up_and_run_minion_function(
tty_name, stdout_fd, stderr_fd, ready, minion_function, **kwargs):
"""Minion process wrapper for PseudoShell.
Handles the mechanics of setting up a PTY, then calls
``child_function``.
``minion_function``.
"""
# new process group, like a command or pipeline launched by a shell
@@ -266,45 +268,45 @@ def _set_up_and_run_child_function(
if kwargs.get("debug"):
sys.stderr.write(
"child: stdin.isatty(): %s\n" % sys.stdin.isatty())
"minion: stdin.isatty(): %s\n" % sys.stdin.isatty())
# tell the parent that we're really running
if kwargs.get("debug"):
sys.stderr.write("child: ready!\n")
sys.stderr.write("minion: ready!\n")
ready.value = True
try:
child_function(**kwargs)
minion_function(**kwargs)
except BaseException:
traceback.print_exc()
@staticmethod
def _set_up_and_run_master_function(
master_function, child_function, controller_timeout, sleep_time,
**kwargs):
"""Set up a pty, spawn a child process, and execute master_function.
def _set_up_and_run_controller_function(
controller_function, minion_function, controller_timeout,
sleep_time, **kwargs):
"""Set up a pty, spawn a minion process, execute controller_function.
Handles the mechanics of setting up a PTY, then calls
``master_function``.
``controller_function``.
"""
os.setsid() # new session; this process is the controller
master_fd, child_fd = os.openpty()
pty_name = os.ttyname(child_fd)
controller_fd, minion_fd = os.openpty()
pty_name = os.ttyname(minion_fd)
# take controlling terminal
pty_fd = os.open(pty_name, os.O_RDWR)
os.close(pty_fd)
ready = multiprocessing.Value('i', False)
child_process = multiprocessing.Process(
target=PseudoShell._set_up_and_run_child_function,
minion_process = multiprocessing.Process(
target=PseudoShell._set_up_and_run_minion_function,
args=(pty_name, sys.stdout.fileno(), sys.stderr.fileno(),
ready, child_function),
ready, minion_function),
kwargs=kwargs,
)
child_process.start()
minion_process.start()
# wait for subprocess to be running and connected.
while not ready.value:
@@ -315,30 +317,31 @@ def _set_up_and_run_master_function(
sys.stderr.write("pid: %d\n" % os.getpid())
sys.stderr.write("pgid: %d\n" % os.getpgrp())
sys.stderr.write("sid: %d\n" % os.getsid(0))
sys.stderr.write("tcgetpgrp: %d\n" % os.tcgetpgrp(master_fd))
sys.stderr.write("tcgetpgrp: %d\n" % os.tcgetpgrp(controller_fd))
sys.stderr.write("\n")
child_pgid = os.getpgid(child_process.pid)
sys.stderr.write("child pid: %d\n" % child_process.pid)
sys.stderr.write("child pgid: %d\n" % child_pgid)
sys.stderr.write("child sid: %d\n" % os.getsid(child_process.pid))
minion_pgid = os.getpgid(minion_process.pid)
sys.stderr.write("minion pid: %d\n" % minion_process.pid)
sys.stderr.write("minion pgid: %d\n" % minion_pgid)
sys.stderr.write(
"minion sid: %d\n" % os.getsid(minion_process.pid))
sys.stderr.write("\n")
sys.stderr.flush()
# set up master to ignore SIGTSTP, like a shell
# set up controller to ignore SIGTSTP, like a shell
signal.signal(signal.SIGTSTP, signal.SIG_IGN)
# call the master function once the child is ready
# call the controller function once the minion is ready
try:
controller = ProcessController(
child_process.pid, master_fd, debug=kwargs.get("debug"))
minion_process.pid, controller_fd, debug=kwargs.get("debug"))
controller.timeout = controller_timeout
controller.sleep_time = sleep_time
error = master_function(child_process, controller, **kwargs)
error = controller_function(minion_process, controller, **kwargs)
except BaseException:
error = 1
traceback.print_exc()
child_process.join()
minion_process.join()
# return whether either the parent or child failed
return error or child_process.exitcode
# return whether either the parent or minion failed
return error or minion_process.exitcode
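The controller/minion split above rests on a pseudoterminal pair from `os.openpty()`. The following sketch (no processes spawned; just the fd pair and the termios flags that `ProcessController.get_canon_echo_attrs` inspects) shows the two mechanics the class relies on; variable names follow the code above but the snippet is illustrative only:

```python
import os
import termios

def canon_echo_attrs(fd):
    # Read ICANON (line-buffered input) and ECHO from a terminal fd,
    # as ProcessController does for its controller_fd.
    cfg = termios.tcgetattr(fd)
    return bool(cfg[3] & termios.ICANON), bool(cfg[3] & termios.ECHO)

controller_fd, minion_fd = os.openpty()   # controller end, minion end
canon, echo = canon_echo_attrs(controller_fd)

# Bytes written to the controller end arrive as terminal *input* on the
# minion end -- this is how the controller drives the minion's stdin.
os.write(controller_fd, b"fg\n")
line = os.read(minion_fd, 16)

os.close(controller_fd)
os.close(minion_fd)
```

A fresh pty starts in canonical mode with echo on, which is why `input_on()` above checks for `(False, False)` only after the minion has reconfigured its terminal.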


@@ -5,7 +5,7 @@
#: major, minor, patch version for Spack, in a tuple
spack_version_info = (0, 15, 2)
spack_version_info = (0, 15, 4)
#: String containing Spack version joined with .'s
spack_version = '.'.join(str(v) for v in spack_version_info)


@@ -18,10 +18,13 @@ class ABI(object):
"""This class provides methods to test ABI compatibility between specs.
The current implementation is rather rough and could be improved."""
def architecture_compatible(self, parent, child):
"""Return true if parent and child have ABI compatible targets."""
return not parent.architecture or not child.architecture or \
parent.architecture == child.architecture
def architecture_compatible(self, target, constraint):
"""Return true if architecture of target spec is ABI compatible
to the architecture of constraint spec. If either the target
or constraint specs have no architecture, target is also defined
as architecture ABI compatible to constraint."""
return not target.architecture or not constraint.architecture or \
target.architecture.satisfies(constraint.architecture)
@memoized
def _gcc_get_libstdcxx_version(self, version):
@@ -107,8 +110,8 @@ def compiler_compatible(self, parent, child, **kwargs):
return True
return False
def compatible(self, parent, child, **kwargs):
"""Returns true iff a parent and child spec are ABI compatible"""
def compatible(self, target, constraint, **kwargs):
"""Returns true if target spec is ABI compatible to constraint spec"""
loosematch = kwargs.get('loose', False)
return self.architecture_compatible(parent, child) and \
self.compiler_compatible(parent, child, loose=loosematch)
return self.architecture_compatible(target, constraint) and \
self.compiler_compatible(target, constraint, loose=loosematch)
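The renamed check above reduces to: a missing architecture on either side is vacuously compatible, otherwise the target's architecture must satisfy the constraint's. A hedged standalone sketch, with a stub class in place of Spack's `Spec` and plain equality approximating `ArchSpec.satisfies` (illustration only):

```python
class Spec:
    """Stub spec carrying only an architecture string (or None)."""
    def __init__(self, architecture=None):
        self.architecture = architecture

def architecture_compatible(target, constraint):
    # Either side lacking an architecture is treated as compatible;
    # the real code calls target.architecture.satisfies(...), which
    # is approximated here by equality.
    return (not target.architecture or not constraint.architecture or
            target.architecture == constraint.architecture)
```

The direction matters: `target` is tested against `constraint`, not the other way around, which is what the docstring rewrite in the diff is making explicit.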


@@ -6,7 +6,7 @@
"""
This module contains all the elements that are required to create an
architecture object. These include, the target processor, the operating system,
and the architecture platform (i.e. cray, darwin, linux, bgq, etc) classes.
and the architecture platform (i.e. cray, darwin, linux, etc) classes.
On a multiple architecture machine, the architecture spec field can be set to
build a package against any target and operating system that is present on the
@@ -217,7 +217,7 @@ def optimization_flags(self, compiler):
if isinstance(compiler, spack.spec.CompilerSpec):
compiler = spack.compilers.compilers_for_spec(compiler).pop()
try:
compiler_version = compiler.get_real_version()
compiler_version = compiler.real_version
except spack.util.executable.ProcessError as e:
# log this and just return compiler.version instead
tty.debug(str(e))
@@ -233,10 +233,14 @@ class Platform(object):
Will return a instance of it once it is returned.
"""
priority = None # Subclass sets number. Controls detection order
priority = None # Subclass sets number. Controls detection order
#: binary formats used on this platform; used by relocation logic
binary_formats = ['elf']
front_end = None
back_end = None
default = None # The default back end target. On cray ivybridge
default = None # The default back end target. On cray ivybridge
front_os = None
back_os = None
@@ -436,6 +440,12 @@ def to_dict(self):
('target', self.target.to_dict_or_value())])
return syaml_dict([('arch', d)])
def to_spec(self):
"""Convert this Arch to an anonymous Spec with architecture defined."""
spec = spack.spec.Spec()
spec.architecture = spack.spec.ArchSpec(str(self))
return spec
@staticmethod
def from_dict(d):
spec = spack.spec.ArchSpec.from_dict(d)
@@ -518,6 +528,14 @@ def platform():
@memoized
def default_arch():
"""Default ``Arch`` object for this machine.
See ``sys_type()``.
"""
return Arch(platform(), 'default_os', 'default_target')
def sys_type():
"""Print out the "default" platform-os-target tuple for this machine.
@@ -530,8 +548,7 @@ def sys_type():
architectures.
"""
arch = Arch(platform(), 'default_os', 'default_target')
return str(arch)
return str(default_arch())
@memoized


@@ -11,7 +11,6 @@
import tempfile
import hashlib
import glob
import platform
from contextlib import closing
import ruamel.yaml as yaml
@@ -36,7 +35,6 @@
from spack.spec import Spec
from spack.stage import Stage
from spack.util.gpg import Gpg
import spack.architecture as architecture
_build_cache_relative_path = 'build_cache'
@@ -498,6 +496,7 @@ def download_tarball(spec):
# stage the tarball into standard place
stage = Stage(url, name="build_cache", keep=True)
stage.create()
try:
stage.fetch()
return stage.save_filename
@@ -520,16 +519,16 @@ def make_package_relative(workdir, spec, allow_root):
for filename in buildinfo['relocate_binaries']:
orig_path_names.append(os.path.join(prefix, filename))
cur_path_names.append(os.path.join(workdir, filename))
if (spec.architecture.platform == 'darwin' or
spec.architecture.platform == 'test' and
platform.system().lower() == 'darwin'):
relocate.make_macho_binaries_relative(cur_path_names, orig_path_names,
old_layout_root)
if (spec.architecture.platform == 'linux' or
spec.architecture.platform == 'test' and
platform.system().lower() == 'linux'):
relocate.make_elf_binaries_relative(cur_path_names, orig_path_names,
old_layout_root)
platform = spack.architecture.get_platform(spec.platform)
if 'macho' in platform.binary_formats:
relocate.make_macho_binaries_relative(
cur_path_names, orig_path_names, old_layout_root)
if 'elf' in platform.binary_formats:
relocate.make_elf_binaries_relative(
cur_path_names, orig_path_names, old_layout_root)
relocate.raise_if_not_relocatable(cur_path_names, allow_root)
orig_path_names = list()
cur_path_names = list()
@@ -602,29 +601,23 @@ def is_backup_file(file):
if not is_backup_file(text_name):
text_names.append(text_name)
# If we are installing back to the same location don't replace anything
# If we are not installing back to the same install tree do the relocation
if old_layout_root != new_layout_root:
paths_to_relocate = [old_spack_prefix, old_layout_root]
paths_to_relocate.extend(prefix_to_hash.keys())
files_to_relocate = list(filter(
lambda pathname: not relocate.file_is_relocatable(
pathname, paths_to_relocate=paths_to_relocate),
map(lambda filename: os.path.join(workdir, filename),
buildinfo['relocate_binaries'])))
files_to_relocate = [os.path.join(workdir, filename)
for filename in buildinfo.get('relocate_binaries')
]
# If the buildcache was not created with relativized rpaths
# do the relocation of path in binaries
if (spec.architecture.platform == 'darwin' or
spec.architecture.platform == 'test' and
platform.system().lower() == 'darwin'):
platform = spack.architecture.get_platform(spec.platform)
if 'macho' in platform.binary_formats:
relocate.relocate_macho_binaries(files_to_relocate,
old_layout_root,
new_layout_root,
prefix_to_prefix, rel,
old_prefix,
new_prefix)
if (spec.architecture.platform == 'linux' or
spec.architecture.platform == 'test' and
platform.system().lower() == 'linux'):
if 'elf' in platform.binary_formats:
relocate.relocate_elf_binaries(files_to_relocate,
old_layout_root,
new_layout_root,
@@ -646,6 +639,13 @@ def is_backup_file(file):
new_spack_prefix,
prefix_to_prefix)
paths_to_relocate = [old_prefix, old_layout_root]
paths_to_relocate.extend(prefix_to_hash.keys())
files_to_relocate = list(filter(
lambda pathname: not relocate.file_is_relocatable(
pathname, paths_to_relocate=paths_to_relocate),
map(lambda filename: os.path.join(workdir, filename),
buildinfo['relocate_binaries'])))
# relocate the install prefixes in binary files including dependencies
relocate.relocate_text_bin(files_to_relocate,
old_prefix, new_prefix,
@@ -653,6 +653,17 @@ def is_backup_file(file):
new_spack_prefix,
prefix_to_prefix)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
else:
if old_spack_prefix != new_spack_prefix:
relocate.relocate_text(text_names,
old_layout_root, new_layout_root,
old_prefix, new_prefix,
old_spack_prefix,
new_spack_prefix,
prefix_to_prefix)
def extract_tarball(spec, filename, allow_root=False, unsigned=False,
force=False):
@@ -841,13 +852,11 @@ def get_spec(spec=None, force=False):
return try_download_specs(urls=urls, force=force)
def get_specs(allarch=False):
def get_specs():
"""
Get spec.yaml's for build caches available on mirror
"""
global _cached_specs
arch = architecture.Arch(architecture.platform(),
'default_os', 'default_target')
if not spack.mirror.MirrorCollection():
tty.debug("No Spack mirrors are currently configured")
@@ -867,8 +876,7 @@ def get_specs(allarch=False):
index_url, 'application/json')
index_object = codecs.getreader('utf-8')(file_stream).read()
except (URLError, web_util.SpackWebError) as url_err:
tty.error('Failed to read index {0}'.format(index_url))
tty.debug(url_err)
tty.debug('Failed to read index {0}'.format(index_url), url_err, 1)
# Continue on to the next mirror
continue
@@ -885,9 +893,7 @@ def get_specs(allarch=False):
spec_list = db.query_local(installed=False)
for indexed_spec in spec_list:
spec_arch = architecture.arch_for_spec(indexed_spec.architecture)
if (allarch is True or spec_arch == arch):
_cached_specs.add(indexed_spec)
_cached_specs.add(indexed_spec)
return _cached_specs


@@ -33,7 +33,6 @@
calls you can make from within the install() function.
"""
import re
import inspect
import multiprocessing
import os
import shutil
@@ -45,16 +44,19 @@
import llnl.util.tty as tty
from llnl.util.tty.color import cescape, colorize
from llnl.util.filesystem import mkdirp, install, install_tree
from llnl.util.lang import dedupe
from llnl.util.lang import dedupe, fork_context
import spack.build_systems.cmake
import spack.build_systems.meson
import spack.config
import spack.main
import spack.paths
import spack.package
import spack.schema.environment
import spack.store
import spack.install_test
import spack.architecture as arch
import spack.util.path
from spack.util.string import plural
from spack.util.environment import (
env_flag, filter_system_paths, get_path, is_system_path,
@@ -62,7 +64,7 @@
from spack.util.environment import system_dirs
from spack.error import NoLibrariesError, NoHeadersError
from spack.util.executable import Executable
from spack.util.module_cmd import load_module, get_path_from_module, module
from spack.util.module_cmd import load_module, path_from_modules, module
from spack.util.log_parse import parse_log_events, make_log_context
@@ -451,7 +453,6 @@ def _set_variables_for_single_module(pkg, module):
jobs = spack.config.get('config:build_jobs', 16) if pkg.parallel else 1
jobs = min(jobs, multiprocessing.cpu_count())
assert jobs is not None, "no default set for config:build_jobs"
m = module
m.make_jobs = jobs
@@ -642,7 +643,7 @@ def get_rpaths(pkg):
# Second module is our compiler mod name. We use that to get rpaths from
# module show output.
if pkg.compiler.modules and len(pkg.compiler.modules) > 1:
rpaths.append(get_path_from_module(pkg.compiler.modules[1]))
rpaths.append(path_from_modules([pkg.compiler.modules[1]]))
return list(dedupe(filter_system_paths(rpaths)))
@@ -706,32 +707,48 @@ def load_external_modules(pkg):
pkg (PackageBase): package to load deps for
"""
for dep in list(pkg.spec.traverse()):
if dep.external_module:
load_module(dep.external_module)
external_modules = dep.external_modules or []
for external_module in external_modules:
load_module(external_module)
def setup_package(pkg, dirty):
def setup_package(pkg, dirty, context='build'):
"""Execute all environment setup routines."""
build_env = EnvironmentModifications()
env = EnvironmentModifications()
# clean environment
if not dirty:
clean_environment()
set_compiler_environment_variables(pkg, build_env)
set_build_environment_variables(pkg, build_env, dirty)
pkg.architecture.platform.setup_platform_environment(pkg, build_env)
# setup compilers and build tools for build contexts
need_compiler = context == 'build' or (context == 'test' and
pkg.test_requires_compiler)
if need_compiler:
set_compiler_environment_variables(pkg, env)
set_build_environment_variables(pkg, env, dirty)
build_env.extend(
modifications_from_dependencies(pkg.spec, context='build')
)
# architecture specific setup
pkg.architecture.platform.setup_platform_environment(pkg, env)
if (not dirty) and (not build_env.is_unset('CPATH')):
tty.debug("A dependency has updated CPATH, this may lead pkg-config"
" to assume that the package is part of the system"
" includes and omit it when invoked with '--cflags'.")
if context == 'build':
# recursive post-order dependency information
env.extend(
modifications_from_dependencies(pkg.spec, context=context)
)
set_module_variables_for_package(pkg)
pkg.setup_build_environment(build_env)
if (not dirty) and (not env.is_unset('CPATH')):
tty.debug("A dependency has updated CPATH, this may lead pkg-"
"config to assume that the package is part of the system"
" includes and omit it when invoked with '--cflags'.")
# setup package itself
set_module_variables_for_package(pkg)
pkg.setup_build_environment(env)
elif context == 'test':
import spack.user_environment as uenv # avoid circular import
env.extend(uenv.environment_modifications_for_spec(pkg.spec))
set_module_variables_for_package(pkg)
env.prepend_path('PATH', '.')
# Loading modules, in particular if they are meant to be used outside
# of Spack, can change environment variables that are relevant to the
@@ -741,15 +758,16 @@ def setup_package(pkg, dirty):
# unnecessary. Modules affecting these variables will be overwritten anyway
with preserve_environment('CC', 'CXX', 'FC', 'F77'):
# All module loads that otherwise would belong in previous
# functions have to occur after the build_env object has its
# functions have to occur after the env object has its
# modifications applied. Otherwise the environment modifications
# could undo module changes, such as unsetting LD_LIBRARY_PATH
# after a module changes it.
for mod in pkg.compiler.modules:
# Fixes issue https://github.com/spack/spack/issues/3153
if os.environ.get("CRAY_CPU_TARGET") == "mic-knl":
load_module("cce")
load_module(mod)
if need_compiler:
for mod in pkg.compiler.modules:
# Fixes issue https://github.com/spack/spack/issues/3153
if os.environ.get("CRAY_CPU_TARGET") == "mic-knl":
load_module("cce")
load_module(mod)
# kludge to handle cray libsci being automatically loaded by PrgEnv
# modules on cray platform. Module unload does no damage when
@@ -763,12 +781,12 @@ def setup_package(pkg, dirty):
implicit_rpaths = pkg.compiler.implicit_rpaths()
if implicit_rpaths:
build_env.set('SPACK_COMPILER_IMPLICIT_RPATHS',
':'.join(implicit_rpaths))
env.set('SPACK_COMPILER_IMPLICIT_RPATHS',
':'.join(implicit_rpaths))
# Make sure nothing's strange about the Spack environment.
validate(build_env, tty.warn)
build_env.apply_modifications()
validate(env, tty.warn)
env.apply_modifications()
def modifications_from_dependencies(spec, context):
@@ -788,7 +806,8 @@ def modifications_from_dependencies(spec, context):
deptype_and_method = {
'build': (('build', 'link', 'test'),
'setup_dependent_build_environment'),
'run': (('link', 'run'), 'setup_dependent_run_environment')
'run': (('link', 'run'), 'setup_dependent_run_environment'),
'test': (('link', 'run', 'test'), 'setup_dependent_run_environment')
}
deptype, method = deptype_and_method[context]
@@ -802,7 +821,7 @@ def modifications_from_dependencies(spec, context):
return env
def fork(pkg, function, dirty, fake):
def fork(pkg, function, dirty, fake, context='build', **kwargs):
"""Fork a child process to do part of a spack build.
Args:
@@ -814,6 +833,8 @@ def fork(pkg, function, dirty, fake):
dirty (bool): If True, do NOT clean the environment before
building.
fake (bool): If True, skip package setup because it's not a real build
context (str): If 'build', set up the build environment. If 'test',
set up the test environment.
Usage::
@@ -842,7 +863,7 @@ def child_process(child_pipe, input_stream):
try:
if not fake:
setup_package(pkg, dirty=dirty)
setup_package(pkg, dirty=dirty, context=context)
return_value = function()
child_pipe.send(return_value)
@@ -860,19 +881,29 @@ def child_process(child_pipe, input_stream):
# build up some context from the offending package so we can
# show that, too.
package_context = get_package_context(tb)
if exc_type is not spack.install_test.TestFailure:
package_context = get_package_context(traceback.extract_tb(tb))
else:
package_context = []
build_log = None
if hasattr(pkg, 'log_path'):
if context == 'build' and hasattr(pkg, 'log_path'):
build_log = pkg.log_path
test_log = None
if context == 'test':
test_log = os.path.join(
pkg.test_suite.stage,
spack.install_test.TestSuite.test_log_name(pkg.spec))
# make a pickleable exception to send to parent.
msg = "%s: %s" % (exc_type.__name__, str(exc))
ce = ChildError(msg,
exc_type.__module__,
exc_type.__name__,
tb_string, build_log, package_context)
tb_string, package_context,
build_log, test_log)
child_pipe.send(ce)
finally:
@@ -885,7 +916,7 @@ def child_process(child_pipe, input_stream):
if sys.stdin.isatty() and hasattr(sys.stdin, 'fileno'):
input_stream = os.fdopen(os.dup(sys.stdin.fileno()))
p = multiprocessing.Process(
p = fork_context.Process(
target=child_process, args=(child_pipe, input_stream))
p.start()
@@ -925,8 +956,8 @@ def get_package_context(traceback, context=3):
"""Return some context for an error message when the build fails.
Args:
traceback (traceback): A traceback from some exception raised during
install
traceback (list of tuples): output from traceback.extract_tb() or
traceback.extract_stack()
context (int): Lines of context to show before and after the line
where the error happened
@@ -935,51 +966,44 @@ def get_package_context(traceback, context=3):
from there.
"""
def make_stack(tb, stack=None):
"""Tracebacks come out of the system in caller -> callee order. Return
an array in callee -> caller order so we can traverse it."""
if stack is None:
stack = []
if tb is not None:
make_stack(tb.tb_next, stack)
stack.append(tb)
return stack
stack = make_stack(traceback)
for tb in stack:
frame = tb.tb_frame
if 'self' in frame.f_locals:
# Find the first proper subclass of PackageBase.
obj = frame.f_locals['self']
if isinstance(obj, spack.package.PackageBase):
for filename, lineno, function, text in reversed(traceback):
if 'package.py' in filename or 'spack/build_systems' in filename:
if function not in ('run_test', '_run_test_helper'):
# We are in a package and not one of the listed methods
# We exclude these methods because we expect errors in them to
# be the result of user tests failing, and we show the tests
# instead.
break
# Package files have a line added at import time, so we adjust the lineno
# when we are getting context from a package file instead of a base class
adjust = 1 if spack.paths.is_package_file(filename) else 0
lineno = lineno - adjust
# We found obj, the Package implementation we care about.
# Point out the location in the install method where we failed.
lines = [
'{0}:{1:d}, in {2}:'.format(
inspect.getfile(frame.f_code),
frame.f_lineno - 1, # subtract 1 because f_lineno is 0-indexed
frame.f_code.co_name
filename,
lineno,
function
)
]
# Build a message showing context in the install method.
sourcelines, start = inspect.getsourcelines(frame)
# Calculate lineno of the error relative to the start of the function.
# Subtract 1 because f_lineno is 0-indexed.
fun_lineno = frame.f_lineno - start - 1
start_ctx = max(0, fun_lineno - context)
sourcelines = sourcelines[start_ctx:fun_lineno + context + 1]
# Adjust for import mangling of package files.
with open(filename, 'r') as f:
sourcelines = f.readlines()
start = max(0, lineno - context - 1)
sourcelines = sourcelines[start:lineno + context + 1]
for i, line in enumerate(sourcelines):
is_error = start_ctx + i == fun_lineno
i = i + adjust # adjusting for import munging again
is_error = start + i == lineno
mark = '>> ' if is_error else ' '
# Add start to get lineno relative to start of file, not function.
marked = ' {0}{1:-6d}{2}'.format(
mark, start + start_ctx + i, line.rstrip())
mark, start + i, line.rstrip())
if is_error:
marked = colorize('@R{%s}' % cescape(marked))
lines.append(marked)
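The rewritten `get_package_context` works from `(filename, lineno, function, text)` tuples as produced by `traceback.extract_tb`, reads the source file directly, and marks the failing line. A simplified stand-alone version of that context-marking loop (1-indexed line numbers, no package-import adjustment; the function name is hypothetical):

```python
def show_context(sourcelines, lineno, context=3):
    """Return the lines around 1-indexed `lineno`, marking the error line.

    `sourcelines` is the full file content as a list of lines, e.g. the
    result of ``f.readlines()``.
    """
    start = max(0, lineno - context - 1)
    marked = []
    for i, line in enumerate(sourcelines[start:lineno + context]):
        # '>>' flags the line the traceback pointed at.
        mark = ">> " if start + i == lineno - 1 else "   "
        marked.append("  {0}{1:>6d}  {2}".format(mark, start + i + 1,
                                                 line.rstrip()))
    return marked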
@@ -1033,14 +1057,15 @@ class ChildError(InstallError):
# context instead of Python context.
build_errors = [('spack.util.executable', 'ProcessError')]
def __init__(self, msg, module, classname, traceback_string, build_log,
context):
def __init__(self, msg, module, classname, traceback_string, context,
build_log, test_log):
super(ChildError, self).__init__(msg)
self.module = module
self.name = classname
self.traceback = traceback_string
self.build_log = build_log
self.context = context
self.build_log = build_log
self.test_log = test_log
@property
def long_message(self):
@@ -1049,21 +1074,12 @@ def long_message(self):
if (self.module, self.name) in ChildError.build_errors:
# The error happened in some external executed process. Show
# the build log with errors or warnings highlighted.
# the log with errors or warnings highlighted.
if self.build_log and os.path.exists(self.build_log):
errors, warnings = parse_log_events(self.build_log)
nerr = len(errors)
nwar = len(warnings)
if nerr > 0:
# If errors are found, only display errors
out.write(
"\n%s found in build log:\n" % plural(nerr, 'error'))
out.write(make_log_context(errors))
elif nwar > 0:
# If no errors are found but warnings are, display warnings
out.write(
"\n%s found in build log:\n" % plural(nwar, 'warning'))
out.write(make_log_context(warnings))
write_log_summary(out, 'build', self.build_log)
if self.test_log and os.path.exists(self.test_log):
write_log_summary(out, 'test', self.test_log)
else:
# The error happened in the Python code, so try to show
@@ -1080,6 +1096,10 @@ def long_message(self):
out.write('See build log for details:\n')
out.write(' %s\n' % self.build_log)
if self.test_log and os.path.exists(self.test_log):
out.write('See test log for details:\n')
out.write(' %s\n' % self.test_log)
return out.getvalue()
def __str__(self):
@@ -1096,13 +1116,16 @@ def __reduce__(self):
self.module,
self.name,
self.traceback,
self.context,
self.build_log,
self.context)
self.test_log)
def _make_child_error(msg, module, name, traceback, build_log, context):
def _make_child_error(msg, module, name, traceback, context,
build_log, test_log):
"""Used by __reduce__ in ChildError to reconstruct pickled errors."""
return ChildError(msg, module, name, traceback, build_log, context)
return ChildError(msg, module, name, traceback, context,
build_log, test_log)
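`_make_child_error` pairs with `ChildError.__reduce__` so the exception survives the pickle trip from the build child back to the parent process. The general pattern: `__reduce__` returns a module-level factory plus the constructor arguments. A minimal sketch with hypothetical names:

```python
import pickle


class BuildFailure(Exception):
    """Exception that pickles cleanly by delegating to a factory."""

    def __init__(self, msg, log_path):
        super(BuildFailure, self).__init__(msg)
        self.log_path = log_path

    def __reduce__(self):
        # (callable, args): pickling stores these; unpickling calls them.
        return _make_build_failure, (str(self), self.log_path)


def _make_build_failure(msg, log_path):
    """Module-level factory used by __reduce__ during unpickling."""
    return BuildFailure(msg, log_path)


err = BuildFailure("compile failed", "/tmp/build.log")
clone = pickle.loads(pickle.dumps(err))
```

Without `__reduce__`, exceptions whose `__init__` takes extra arguments often fail to unpickle; that is why the diff has to keep the factory's argument order in sync with the constructor.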
class StopPhase(spack.error.SpackError):
@@ -1113,3 +1136,30 @@ def __reduce__(self):
def _make_stop_phase(msg, long_msg):
return StopPhase(msg, long_msg)
def write_log_summary(out, log_type, log, last=None):
errors, warnings = parse_log_events(log)
nerr = len(errors)
nwar = len(warnings)
if nerr > 0:
if last and nerr > last:
errors = errors[-last:]
nerr = last
# If errors are found, only display errors
out.write(
"\n%s found in %s log:\n" %
(plural(nerr, 'error'), log_type))
out.write(make_log_context(errors))
elif nwar > 0:
if last and nwar > last:
warnings = warnings[-last:]
nwar = last
# If no errors are found but warnings are, display warnings
out.write(
"\n%s found in %s log:\n" %
(plural(nwar, 'warning'), log_type))
out.write(make_log_context(warnings))
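`write_log_summary` leans on Spack's `parse_log_events`, `make_log_context`, and `plural` helpers. A rough stand-alone approximation of its errors-first behavior, using a plain keyword scan in place of those helpers (the function name and scan heuristic are illustrative, not Spack's):

```python
import io


def summarize_log(out, log_type, lines, last=None):
    """Write an errors-first summary of `lines` to the stream `out`.

    Errors suppress warnings, and `last` caps how many events are shown,
    mirroring the structure of write_log_summary.
    """
    errors = [l for l in lines if "error" in l.lower()]
    warnings = [l for l in lines if "warning" in l.lower()]

    events = errors or warnings
    label = "error" if errors else "warning"
    if not events:
        return
    if last and len(events) > last:
        events = events[-last:]
    out.write("\n%d %s(s) found in %s log:\n" % (len(events), label, log_type))
    for event in events:
        out.write("  %s\n" % event.rstrip())


buf = io.StringIO()
summarize_log(buf, "build", ["checking for cc... yes",
                             "foo.c:3:1: error: expected ';'"])
```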

View File

@@ -308,13 +308,21 @@ def flags_to_build_system_args(self, flags):
self.cmake_flag_args.append(libs_string.format(lang,
libs_flags))
@property
def build_dirname(self):
"""Returns the directory name to use when building the package
:return: name of the subdirectory for building the package
"""
return 'spack-build-%s' % self.spec.dag_hash(7)
@property
def build_directory(self):
"""Returns the directory to use when building the package
:return: directory where to build the package
"""
return os.path.join(self.stage.path, 'spack-build')
return os.path.join(self.stage.path, self.build_dirname)
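The `build_dirname` change above gives each spec its own build subdirectory by suffixing a truncated hash. A self-contained sketch of the same idea, substituting a sha256 of a plain spec string for Spack's real DAG hash (which is derived from the concretized spec):

```python
import hashlib
import os


def build_dirname(spec_string, length=7):
    """Per-spec build directory name, mimicking 'spack-build-%s' % dag_hash(7).

    Stand-in only: Spack's dag_hash comes from the spec DAG, not a sha256
    of the spec string.
    """
    digest = hashlib.sha256(spec_string.encode("utf-8")).hexdigest()
    return "spack-build-%s" % digest[:length]


def build_directory(stage_path, spec_string):
    """Absolute build directory inside the stage."""
    return os.path.join(stage_path, build_dirname(spec_string))
```

The practical effect is that two concretizations of the same package no longer collide in a shared `spack-build` directory.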
def cmake_args(self):
"""Produces a list containing all the arguments that must be passed to

View File

@@ -12,8 +12,9 @@
class CudaPackage(PackageBase):
"""Auxiliary class which contains CUDA variant, dependencies and conflicts
and is meant to unify and facilitate their usage.
Maintainers: ax3l, svenevs
"""
maintainers = ['ax3l', 'svenevs']
# https://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#gpu-feature-list
# https://developer.nvidia.com/cuda-gpus
@@ -25,6 +26,7 @@ class CudaPackage(PackageBase):
'50', '52', '53',
'60', '61', '62',
'70', '72', '75',
'80',
]
# FIXME: keep cuda and cuda_arch separate to make usage easier until
@@ -48,6 +50,7 @@ def cuda_flags(arch_list):
# CUDA version vs Architecture
# https://en.wikipedia.org/wiki/CUDA#GPUs_supported
# https://docs.nvidia.com/cuda/cuda-toolkit-release-notes/index.html#deprecated-features
depends_on('cuda@:6.0', when='cuda_arch=10')
depends_on('cuda@:6.5', when='cuda_arch=11')
depends_on('cuda@2.1:6.5', when='cuda_arch=12')
@@ -58,8 +61,8 @@ def cuda_flags(arch_list):
depends_on('cuda@5.0:10.2', when='cuda_arch=30')
depends_on('cuda@5.0:10.2', when='cuda_arch=32')
depends_on('cuda@5.0:10.2', when='cuda_arch=35')
depends_on('cuda@6.5:10.2', when='cuda_arch=37')
depends_on('cuda@5.0:', when='cuda_arch=35')
depends_on('cuda@6.5:', when='cuda_arch=37')
depends_on('cuda@6.0:', when='cuda_arch=50')
depends_on('cuda@6.5:', when='cuda_arch=52')
@@ -73,6 +76,8 @@ def cuda_flags(arch_list):
depends_on('cuda@9.0:', when='cuda_arch=72')
depends_on('cuda@10.0:', when='cuda_arch=75')
depends_on('cuda@11.0:', when='cuda_arch=80')
# There are at least three cases to be aware of for compiler conflicts
# 1. Linux x86_64
# 2. Linux ppc64le
@@ -88,12 +93,15 @@ def cuda_flags(arch_list):
conflicts('%gcc@7:', when='+cuda ^cuda@:9.1' + arch_platform)
conflicts('%gcc@8:', when='+cuda ^cuda@:10.0.130' + arch_platform)
conflicts('%gcc@9:', when='+cuda ^cuda@:10.2.89' + arch_platform)
conflicts('%gcc@:4,10:', when='+cuda ^cuda@:11.0.2' + arch_platform)
conflicts('%pgi@:14.8', when='+cuda ^cuda@:7.0.27' + arch_platform)
conflicts('%pgi@:15.3,15.5:', when='+cuda ^cuda@7.5' + arch_platform)
conflicts('%pgi@:16.2,16.0:16.3', when='+cuda ^cuda@8' + arch_platform)
conflicts('%pgi@:15,18:', when='+cuda ^cuda@9.0:9.1' + arch_platform)
conflicts('%pgi@:16', when='+cuda ^cuda@9.2.88:10' + arch_platform)
conflicts('%pgi@:17', when='+cuda ^cuda@10.2.89' + arch_platform)
conflicts('%pgi@:16,19:', when='+cuda ^cuda@9.2.88:10' + arch_platform)
conflicts('%pgi@:17,20:',
when='+cuda ^cuda@10.1.105:10.2.89' + arch_platform)
conflicts('%pgi@:17,20.2:', when='+cuda ^cuda@11.0.2' + arch_platform)
conflicts('%clang@:3.4', when='+cuda ^cuda@:7.5' + arch_platform)
conflicts('%clang@:3.7,4:',
when='+cuda ^cuda@8.0:9.0' + arch_platform)
@@ -104,7 +112,8 @@ def cuda_flags(arch_list):
conflicts('%clang@:3.7,7.1:', when='+cuda ^cuda@10.1.105' + arch_platform)
conflicts('%clang@:3.7,8.1:',
when='+cuda ^cuda@10.1.105:10.1.243' + arch_platform)
conflicts('%clang@:3.2,9.0:', when='+cuda ^cuda@10.2.89' + arch_platform)
conflicts('%clang@:3.2,9:', when='+cuda ^cuda@10.2.89' + arch_platform)
conflicts('%clang@:5,10:', when='+cuda ^cuda@11.0.2' + arch_platform)
# x86_64 vs. ppc64le differ according to NVidia docs
# Linux ppc64le compiler conflicts from Table from the docs below:
@@ -119,6 +128,8 @@ def cuda_flags(arch_list):
conflicts('%gcc@6:', when='+cuda ^cuda@:9' + arch_platform)
conflicts('%gcc@8:', when='+cuda ^cuda@:10.0.130' + arch_platform)
conflicts('%gcc@9:', when='+cuda ^cuda@:10.1.243' + arch_platform)
# officially, CUDA 11.0.2 only supports the system GCC 8.3 on ppc64le
conflicts('%gcc@:4,10:', when='+cuda ^cuda@:11.0.2' + arch_platform)
conflicts('%pgi', when='+cuda ^cuda@:8' + arch_platform)
conflicts('%pgi@:16', when='+cuda ^cuda@:9.1.185' + arch_platform)
conflicts('%pgi@:17', when='+cuda ^cuda@:10' + arch_platform)
@@ -128,6 +139,7 @@ def cuda_flags(arch_list):
conflicts('%clang@7:', when='+cuda ^cuda@10.0.130' + arch_platform)
conflicts('%clang@7.1:', when='+cuda ^cuda@:10.1.105' + arch_platform)
conflicts('%clang@8.1:', when='+cuda ^cuda@:10.2.89' + arch_platform)
conflicts('%clang@:5,10.0:', when='+cuda ^cuda@11.0.2' + arch_platform)
# Intel is mostly relevant for x86_64 Linux, even though it also
# exists for Mac OS X. No information prior to CUDA 3.2 or Intel 11.1
@@ -141,11 +153,13 @@ def cuda_flags(arch_list):
conflicts('%intel@17.0:', when='+cuda ^cuda@:8.0.60')
conflicts('%intel@18.0:', when='+cuda ^cuda@:9.9')
conflicts('%intel@19.0:', when='+cuda ^cuda@:10.0')
conflicts('%intel@19.1:', when='+cuda ^cuda@:10.1')
conflicts('%intel@19.2:', when='+cuda ^cuda@:11.0.2')
# XL is mostly relevant for ppc64le Linux
conflicts('%xl@:12,14:', when='+cuda ^cuda@:9.1')
conflicts('%xl@:12,14:15,17:', when='+cuda ^cuda@9.2')
conflicts('%xl@17:', when='+cuda ^cuda@:10.2.89')
conflicts('%xl@:12,17:', when='+cuda ^cuda@:11.0.2')
# Mac OS X
# platform = ' platform=darwin'
@@ -156,7 +170,7 @@ def cuda_flags(arch_list):
# `clang-apple@x.y.z as a possible fix.
# Compiler conflicts will eventually be taken from here:
# https://docs.nvidia.com/cuda/cuda-installation-guide-mac-os-x/index.html#abstract
conflicts('platform=darwin', when='+cuda ^cuda@11.0:')
conflicts('platform=darwin', when='+cuda ^cuda@11.0.2:')
# Make sure cuda_arch can not be used without +cuda
for value in cuda_arch_values:

View File

@@ -1017,6 +1017,15 @@ def setup_run_environment(self, env):
env.extend(EnvironmentModifications.from_sourcing_file(f, *args))
if self.spec.name in ('intel', 'intel-parallel-studio'):
# this package provides compilers
# TODO: fix check above when compilers are dependencies
env.set('CC', self.prefix.bin.icc)
env.set('CXX', self.prefix.bin.icpc)
env.set('FC', self.prefix.bin.ifort)
env.set('F77', self.prefix.bin.ifort)
env.set('F90', self.prefix.bin.ifort)
def setup_dependent_build_environment(self, env, dependent_spec):
# NB: This function is overwritten by 'mpi' provider packages:
#

View File

@@ -0,0 +1,55 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from llnl.util.filesystem import install_tree, working_dir
from spack.directives import depends_on
from spack.package import PackageBase, run_after
from spack.util.executable import which
class MavenPackage(PackageBase):
"""Specialized class for packages that are built using the
Maven build system. See https://maven.apache.org/index.html
for more information.
This class provides the following phases that can be overridden:
* build
* install
"""
# Default phases
phases = ['build', 'install']
# To be used in UI queries that require to know which
# build-system class we are using
build_system_class = 'MavenPackage'
depends_on('java', type=('build', 'run'))
depends_on('maven', type='build')
@property
def build_directory(self):
"""The directory containing the ``pom.xml`` file."""
return self.stage.source_path
def build(self, spec, prefix):
"""Compile code and package into a JAR file."""
with working_dir(self.build_directory):
mvn = which('mvn')
if self.run_tests:
mvn('verify')
else:
mvn('package', '-DskipTests')
def install(self, spec, prefix):
"""Copy to installation prefix."""
with working_dir(self.build_directory):
install_tree('.', prefix)
# Check that self.prefix is there after installation
run_after('install')(PackageBase.sanity_check_prefix)

View File

@@ -91,7 +91,7 @@ def configure(self, spec, prefix):
build_system_class = 'PythonPackage'
#: Callback names for build-time test
build_time_test_callbacks = ['test']
build_time_test_callbacks = ['build_test']
#: Callback names for install-time test
install_time_test_callbacks = ['import_module_test']
@@ -192,6 +192,10 @@ def build_scripts(self, spec, prefix):
self.setup_py('build_scripts', *args)
def build_scripts_args(self, spec, prefix):
"""Arguments to pass to build_scripts."""
return []
def clean(self, spec, prefix):
"""Clean up temporary files from 'build' command."""
args = self.clean_args(spec, prefix)
@@ -357,7 +361,7 @@ def check_args(self, spec, prefix):
# Testing
def test(self):
def build_test(self):
"""Run unit tests after in-place build.
These tests are only run if the package actually has a 'test' command.

View File

@@ -0,0 +1,59 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import glob
import inspect
from spack.directives import depends_on, extends
from spack.package import PackageBase, run_after
class RubyPackage(PackageBase):
"""Specialized class for building Ruby gems.
This class provides two phases that can be overridden if required:
#. :py:meth:`~.RubyPackage.build`
#. :py:meth:`~.RubyPackage.install`
"""
#: Phases of a Ruby package
phases = ['build', 'install']
#: This attribute is used in UI queries that need to know the build
#: system base class
build_system_class = 'RubyPackage'
extends('ruby')
depends_on('ruby', type=('build', 'run'))
def build(self, spec, prefix):
"""Build a Ruby gem."""
# ruby-rake provides both rake.gemspec and Rakefile, but only
# rake.gemspec can be built without an existing rake installation
gemspecs = glob.glob('*.gemspec')
rakefiles = glob.glob('Rakefile')
if gemspecs:
inspect.getmodule(self).gem('build', '--norc', gemspecs[0])
elif rakefiles:
jobs = inspect.getmodule(self).make_jobs
inspect.getmodule(self).rake('package', '-j{0}'.format(jobs))
else:
# Some Ruby packages only ship `*.gem` files, so nothing to build
pass
def install(self, spec, prefix):
"""Install a Ruby gem.
The ruby package sets ``GEM_HOME`` to tell gem where to install to."""
gems = glob.glob('*.gem')
if gems:
inspect.getmodule(self).gem(
'install', '--norc', '--ignore-dependencies', gems[0])
# Check that self.prefix is there after installation
run_after('install')(PackageBase.sanity_check_prefix)
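`RubyPackage.build` dispatches on which files the source tree ships: a gemspec wins over a Rakefile, and neither means there is nothing to build. The decision logic can be factored out as pure data-in/data-out (function and command shapes here are illustrative):

```python
def ruby_build_plan(gemspecs, rakefiles):
    """Pick the build command from glob results, mirroring RubyPackage.build.

    Returns the command as a list, or None when the source only ships
    prebuilt *.gem files.
    """
    if gemspecs:
        # Prefer the gemspec: it builds without an existing rake install.
        return ["gem", "build", "--norc", gemspecs[0]]
    if rakefiles:
        return ["rake", "package"]
    return None
```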

View File

@@ -33,7 +33,7 @@ class SConsPackage(PackageBase):
build_system_class = 'SConsPackage'
#: Callback names for build-time test
build_time_test_callbacks = ['test']
build_time_test_callbacks = ['build_test']
depends_on('scons', type='build')
@@ -59,7 +59,7 @@ def install(self, spec, prefix):
# Testing
def test(self):
def build_test(self):
"""Run unit tests after build.
By default, does nothing. Override this if you want to

View File

@@ -47,10 +47,10 @@ class WafPackage(PackageBase):
build_system_class = 'WafPackage'
# Callback names for build-time test
build_time_test_callbacks = ['test']
build_time_test_callbacks = ['build_test']
# Callback names for install-time test
install_time_test_callbacks = ['installtest']
install_time_test_callbacks = ['install_test']
# Much like AutotoolsPackage does not require automake and autoconf
# to build, WafPackage does not require waf to build. It only requires
@@ -106,7 +106,7 @@ def install_args(self):
# Testing
def test(self):
def build_test(self):
"""Run unit tests after build.
By default, does nothing. Override this if you want to
@@ -116,7 +116,7 @@ def test(self):
run_after('build')(PackageBase._run_default_build_time_test_callbacks)
def installtest(self):
def install_test(self):
"""Run unit tests after install.
By default, does nothing. Override this if you want to

View File

@@ -4,9 +4,11 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import base64
import copy
import datetime
import json
import os
import re
import shutil
import tempfile
import zlib
@@ -26,7 +28,7 @@
import spack.environment as ev
from spack.error import SpackError
import spack.hash_types as ht
from spack.main import SpackCommand
import spack.main
import spack.repo
from spack.spec import Spec
import spack.util.spack_yaml as syaml
@@ -37,8 +39,8 @@
'always',
]
spack_gpg = SpackCommand('gpg')
spack_compiler = SpackCommand('compiler')
spack_gpg = spack.main.SpackCommand('gpg')
spack_compiler = spack.main.SpackCommand('compiler')
class TemporaryDirectory(object):
@@ -421,12 +423,53 @@ def spec_matches(spec, match_string):
return spec.satisfies(match_string)
def find_matching_config(spec, ci_mappings):
def copy_attributes(attrs_list, src_dict, dest_dict):
for runner_attr in attrs_list:
if runner_attr in src_dict:
if runner_attr in dest_dict and runner_attr == 'tags':
# For 'tags', we combine the lists of tags, while
# avoiding duplicates
for tag in src_dict[runner_attr]:
if tag not in dest_dict[runner_attr]:
dest_dict[runner_attr].append(tag)
elif runner_attr in dest_dict and runner_attr == 'variables':
# For 'variables', we merge the dictionaries. Any conflicts
# (i.e. 'runner-attributes' has same variable key as the
# higher level) we resolve by keeping the more specific
# 'runner-attributes' version.
for src_key, src_val in src_dict[runner_attr].items():
dest_dict[runner_attr][src_key] = copy.deepcopy(
src_dict[runner_attr][src_key])
else:
dest_dict[runner_attr] = copy.deepcopy(src_dict[runner_attr])
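The merge rules in `copy_attributes` can be exercised in isolation: tag lists are unioned without duplicates, variable dicts are overlaid with the more specific mapping winning, and everything else is a deep copy. A self-contained replica with a small usage example:

```python
import copy


def copy_attributes(attrs_list, src_dict, dest_dict):
    """Merge runner attributes: union tag lists, overlay variable dicts."""
    for runner_attr in attrs_list:
        if runner_attr not in src_dict:
            continue
        if runner_attr in dest_dict and runner_attr == "tags":
            # Combine tag lists while avoiding duplicates.
            for tag in src_dict[runner_attr]:
                if tag not in dest_dict[runner_attr]:
                    dest_dict[runner_attr].append(tag)
        elif runner_attr in dest_dict and runner_attr == "variables":
            # The more specific (src) value wins on key conflicts.
            for key in src_dict[runner_attr]:
                dest_dict[runner_attr][key] = copy.deepcopy(
                    src_dict[runner_attr][key])
        else:
            dest_dict[runner_attr] = copy.deepcopy(src_dict[runner_attr])


dest = {"tags": ["docker"], "variables": {"CPUS": "4"}}
src = {"tags": ["docker", "large"],
       "variables": {"CPUS": "16"},
       "image": "centos:7"}
copy_attributes(["image", "tags", "variables"], src, dest)
```

This is what lets per-mapping `runner-attributes` refine the top-level `gitlab-ci` defaults instead of replacing them wholesale.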
def find_matching_config(spec, gitlab_ci):
runner_attributes = {}
overridable_attrs = [
'image',
'tags',
'variables',
'before_script',
'script',
'after_script',
]
copy_attributes(overridable_attrs, gitlab_ci, runner_attributes)
ci_mappings = gitlab_ci['mappings']
for ci_mapping in ci_mappings:
for match_string in ci_mapping['match']:
if spec_matches(spec, match_string):
return ci_mapping['runner-attributes']
return None
if 'runner-attributes' in ci_mapping:
copy_attributes(overridable_attrs,
ci_mapping['runner-attributes'],
runner_attributes)
return runner_attributes
else:
return None
return runner_attributes
def pkg_name_from_spec_label(spec_label):
@@ -449,7 +492,6 @@ def format_job_needs(phase_name, strip_compilers, dep_jobs,
def generate_gitlab_ci_yaml(env, print_summary, output_file,
custom_spack_repo=None, custom_spack_ref=None,
run_optimizer=False, use_dependencies=False):
# FIXME: What's the difference between one that opens with 'spack'
# and one that opens with 'env'? This will only handle the former.
@@ -462,7 +504,6 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
tty.die('Environment yaml does not have "gitlab-ci" section')
gitlab_ci = yaml_root['gitlab-ci']
ci_mappings = gitlab_ci['mappings']
final_job_config = None
if 'final-stage-rebuild-index' in gitlab_ci:
@@ -488,22 +529,6 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
os.environ.get('SPACK_IS_PR_PIPELINE', '').lower() == 'true'
)
# Make sure we use a custom spack if necessary
before_script = None
after_script = None
if custom_spack_repo:
if not custom_spack_ref:
custom_spack_ref = 'master'
before_script = [
('git clone "{0}"'.format(custom_spack_repo)),
'pushd ./spack && git checkout "{0}" && popd'.format(
custom_spack_ref),
'. "./spack/share/spack/setup-env.sh"',
]
after_script = [
'rm -rf "./spack"'
]
ci_mirrors = yaml_root['mirrors']
mirror_urls = [url for url in ci_mirrors.values()]
@@ -580,7 +605,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
release_spec = root_spec[pkg_name]
runner_attribs = find_matching_config(
release_spec, ci_mappings)
release_spec, gitlab_ci)
if not runner_attribs:
tty.warn('No match found for {0}, skipping it'.format(
@@ -604,19 +629,27 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
except AttributeError:
image_name = build_image
job_script = [
'spack env activate --without-view .',
'spack ci rebuild',
]
if 'script' in runner_attribs:
job_script = [s for s in runner_attribs['script']]
before_script = None
if 'before_script' in runner_attribs:
before_script = [
s for s in runner_attribs['before_script']
]
after_script = None
if 'after_script' in runner_attribs:
after_script = [s for s in runner_attribs['after_script']]
osname = str(release_spec.architecture)
job_name = get_job_name(phase_name, strip_compilers,
release_spec, osname, build_group)
debug_flag = ''
if 'enable-debug-messages' in gitlab_ci:
debug_flag = '-d '
job_scripts = [
'spack env activate --without-view .',
'spack {0}ci rebuild'.format(debug_flag),
]
compiler_action = 'NONE'
if len(phases) > 1:
compiler_action = 'FIND_ANY'
@@ -717,7 +750,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
job_object = {
'stage': stage_name,
'variables': variables,
'script': job_scripts,
'script': job_script,
'tags': tags,
'artifacts': {
'paths': artifact_paths,
@@ -788,6 +821,26 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
output_object['stages'] = stage_names
# Capture the version of spack used to generate the pipeline, transform it
# into a value that can be passed to "git checkout", and save it in a
# global yaml variable
spack_version = spack.main.get_version()
version_to_clone = None
v_match = re.match(r"^\d+\.\d+\.\d+$", spack_version)
if v_match:
version_to_clone = 'v{0}'.format(v_match.group(0))
else:
v_match = re.match(r"^[^-]+-[^-]+-([a-f\d]+)$", spack_version)
if v_match:
version_to_clone = v_match.group(1)
else:
version_to_clone = spack_version
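The version-to-checkout mapping above uses two regexes: a release like `0.15.4` becomes the tag `v0.15.4`, a dev version of the form `<version>-<commits>-<sha>` collapses to the bare commit sha, and anything else passes through. A stand-alone sketch using the same patterns (the example dev-version string is hypothetical):

```python
import re


def checkout_version(spack_version):
    """Map a Spack version string to a value usable with `git checkout`."""
    if re.match(r"^\d+\.\d+\.\d+$", spack_version):
        # Plain release: check out the release tag.
        return "v{0}".format(spack_version)
    m = re.match(r"^[^-]+-[^-]+-([a-f\d]+)$", spack_version)
    if m:
        # Dev version: check out the trailing commit sha.
        return m.group(1)
    return spack_version
```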
output_object['variables'] = {
'SPACK_VERSION': spack_version,
'SPACK_CHECKOUT_VERSION': version_to_clone,
}
sorted_output = {}
for output_key, output_value in sorted(output_object.items()):
sorted_output[output_key] = output_value

View File

@@ -2,86 +2,15 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import os
import llnl.util.tty as tty
import spack.build_environment as build_environment
import spack.cmd
import spack.cmd.common.arguments as arguments
from spack.util.environment import dump_environment, pickle_environment
import spack.cmd.common.env_utility as env_utility
description = "run a command in a spec's install environment, " \
"or dump its environment to screen or file"
section = "build"
level = "long"
def setup_parser(subparser):
arguments.add_common_arguments(subparser, ['clean', 'dirty'])
subparser.add_argument(
'--dump', metavar="FILE",
help="dump a source-able environment to FILE"
)
subparser.add_argument(
'--pickle', metavar="FILE",
help="dump a pickled source-able environment to FILE"
)
subparser.add_argument(
'spec', nargs=argparse.REMAINDER,
metavar='spec [--] [cmd]...',
help="spec of package environment to emulate")
subparser.epilog\
= 'If a command is not specified, the environment will be printed ' \
'to standard output (cf /usr/bin/env) unless --dump and/or --pickle ' \
'are specified.\n\nIf a command is specified and spec is ' \
'multi-word, then the -- separator is obligatory.'
setup_parser = env_utility.setup_parser
def build_env(parser, args):
if not args.spec:
tty.die("spack build-env requires a spec.")
# Specs may have spaces in them, so if they do, require that the
# caller put a '--' between the spec and the command to be
# executed. If there is no '--', assume that the spec is the
# first argument.
sep = '--'
if sep in args.spec:
s = args.spec.index(sep)
spec = args.spec[:s]
cmd = args.spec[s + 1:]
else:
spec = args.spec[0]
cmd = args.spec[1:]
specs = spack.cmd.parse_specs(spec, concretize=True)
if len(specs) > 1:
tty.die("spack build-env only takes one spec.")
spec = specs[0]
build_environment.setup_package(spec.package, args.dirty)
if args.dump:
# Dump a source-able environment to a text file.
tty.msg("Dumping a source-able environment to {0}".format(args.dump))
dump_environment(args.dump)
if args.pickle:
# Dump a source-able environment to a pickle file.
tty.msg(
"Pickling a source-able environment to {0}".format(args.pickle))
pickle_environment(args.pickle)
if cmd:
# Execute the command with the new environment
os.execvp(cmd[0], cmd)
elif not bool(args.pickle or args.dump):
# If no command or dump/pickle option act like the "env" command
# and print out env vars.
for key, val in os.environ.items():
print("%s=%s" % (key, val))
env_utility.emulate_env_utility('build-env', 'build', args)

View File

@@ -8,6 +8,7 @@
import sys
import llnl.util.tty as tty
import spack.architecture
import spack.binary_distribution as bindist
import spack.cmd
import spack.cmd.common.arguments as arguments
@@ -25,6 +26,7 @@
from spack.error import SpecError
from spack.spec import Spec, save_dependency_spec_yamls
from spack.util.string import plural
from spack.cmd import display_specs
@@ -237,8 +239,9 @@ def find_matching_specs(pkgs, allow_multiple_matches=False, env=None):
concretized specs given from cli
Args:
specs: list of specs to be matched against installed packages
allow_multiple_matches : if True multiple matches are admitted
pkgs (string): spec to be matched against installed packages
allow_multiple_matches (bool): if True multiple matches are admitted
env (Environment): active environment, or ``None`` if there is not one
Return:
list of specs
@@ -288,8 +291,12 @@ def match_downloaded_specs(pkgs, allow_multiple_matches=False, force=False,
# List of specs that match expressions given via command line
specs_from_cli = []
has_errors = False
allarch = other_arch
specs = bindist.get_specs(allarch)
specs = bindist.get_specs()
if not other_arch:
arch = spack.architecture.default_arch().to_spec()
specs = [s for s in specs if s.satisfies(arch)]
for pkg in pkgs:
matches = []
tty.msg("buildcache spec(s) matching %s \n" % pkg)
@@ -326,26 +333,25 @@ def _createtarball(env, spec_yaml=None, packages=None, add_spec=True,
signing_key=None, force=False, make_relative=False,
unsigned=False, allow_root=False, rebuild_index=False):
if spec_yaml:
packages = set()
with open(spec_yaml, 'r') as fd:
yaml_text = fd.read()
tty.debug('createtarball read spec yaml:')
tty.debug(yaml_text)
s = Spec.from_yaml(yaml_text)
packages.add('/{0}'.format(s.dag_hash()))
package = '/{0}'.format(s.dag_hash())
matches = find_matching_specs(package, env=env)
elif packages:
packages = packages
matches = find_matching_specs(packages, env=env)
elif env:
packages = env.concretized_user_specs
matches = [env.specs_by_hash[h] for h in env.concretized_order]
else:
tty.die("build cache file creation requires at least one" +
" installed package spec, an activate environment," +
" installed package spec, an active environment," +
" or else a path to a yaml file containing a spec" +
" to install")
pkgs = set(packages)
specs = set()
mirror = spack.mirror.MirrorCollection().lookup(output_location)
@@ -354,8 +360,6 @@ def _createtarball(env, spec_yaml=None, packages=None, add_spec=True,
msg = 'Buildcache files will be output to %s/build_cache' % outdir
tty.msg(msg)
matches = find_matching_specs(pkgs, env=env)
if matches:
tty.debug('Found at least one matching spec')
@@ -365,11 +369,16 @@ def _createtarball(env, spec_yaml=None, packages=None, add_spec=True,
tty.debug('skipping external or virtual spec %s' %
match.format())
else:
if add_spec:
lookup = spack.store.db.query_one(match)
if not add_spec:
tty.debug('skipping matching root spec %s' % match.format())
elif lookup is None:
tty.debug('skipping uninstalled matching spec %s' %
match.format())
else:
tty.debug('adding matching spec %s' % match.format())
specs.add(match)
else:
tty.debug('skipping matching spec %s' % match.format())
if not add_deps:
continue
@@ -382,9 +391,14 @@ def _createtarball(env, spec_yaml=None, packages=None, add_spec=True,
if d == 0:
continue
lookup = spack.store.db.query_one(node)
if node.external or node.virtual:
tty.debug('skipping external or virtual dependency %s' %
node.format())
elif lookup is None:
tty.debug('skipping uninstalled dependency %s' %
node.format())
else:
tty.debug('adding dependency %s' % node.format())
specs.add(node)
@@ -393,9 +407,12 @@ def _createtarball(env, spec_yaml=None, packages=None, add_spec=True,
for spec in specs:
tty.debug('creating binary cache file for package %s ' % spec.format())
bindist.build_tarball(spec, outdir, force, make_relative,
unsigned, allow_root, signing_key,
rebuild_index)
try:
bindist.build_tarball(spec, outdir, force, make_relative,
unsigned, allow_root, signing_key,
rebuild_index)
except bindist.NoOverwriteException as e:
tty.warn(e)
def createtarball(args):
@@ -488,10 +505,20 @@ def install_tarball(spec, args):
def listspecs(args):
"""list binary packages available from mirrors"""
specs = bindist.get_specs(args.allarch)
specs = bindist.get_specs()
if not args.allarch:
arch = spack.architecture.default_arch().to_spec()
specs = [s for s in specs if s.satisfies(arch)]
if args.specs:
constraints = set(args.specs)
specs = [s for s in specs if any(s.satisfies(c) for c in constraints)]
if sys.stdout.isatty():
builds = len(specs)
tty.msg("%s." % plural(builds, 'cached build'))
if not builds and not args.allarch:
tty.msg("You can query all available architectures with:",
"spack buildcache list --allarch")
display_specs(specs, args, all_headers=True)
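The listspecs change above moves architecture filtering to the client side: fetch all specs from the mirrors, then keep only those satisfying the default architecture unless `--allarch` is given. A minimal sketch of that flow — `FakeSpec` and the arch strings are illustrative stand-ins, not Spack's real `Spec` class:

```python
# Sketch of the new listspecs filtering; FakeSpec stands in for
# spack.spec.Spec, whose satisfies() does real constraint matching.
class FakeSpec:
    def __init__(self, name, arch):
        self.name = name
        self.arch = arch

    def satisfies(self, constraint):
        # The real Spec.satisfies is far richer; exact match suffices here.
        return self.arch == constraint


def list_specs(specs, allarch, default_arch):
    # Without --allarch, drop specs built for other architectures.
    if not allarch:
        specs = [s for s in specs if s.satisfies(default_arch)]
    return specs


all_specs = [FakeSpec('zlib', 'linux-x86_64'),
             FakeSpec('zlib', 'darwin-arm64')]
matches = list_specs(all_specs, allarch=False, default_arch='linux-x86_64')
print([s.arch for s in matches])
```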


@@ -65,7 +65,7 @@ def checksum(parser, args):
version_lines = spack.stage.get_checksums_for_versions(
url_dict, pkg.name, keep_stage=args.keep_stage,
batch=(args.batch or len(args.versions) > 0),
batch=(args.batch or len(args.versions) > 0 or len(url_dict) == 1),
fetch_options=pkg.fetch_options)
print()


@@ -45,15 +45,6 @@ def setup_parser(subparser):
'--copy-to', default=None,
help="Absolute path of additional location where generated jobs " +
"yaml file should be copied. Default is not to copy.")
generate.add_argument(
'--spack-repo', default=None,
help="Provide a url for this argument if a custom spack repo " +
"should be cloned as a step in each generated job.")
generate.add_argument(
'--spack-ref', default=None,
help="Provide a git branch or tag if a custom spack branch " +
"should be checked out as a step in each generated job. " +
"This argument is ignored if no --spack-repo is provided.")
generate.add_argument(
'--optimize', action='store_true', default=False,
help="(Experimental) run the generated document through a series of "
@@ -82,8 +73,6 @@ def ci_generate(args):
output_file = args.output_file
copy_yaml_to = args.copy_to
spack_repo = args.spack_repo
spack_ref = args.spack_ref
run_optimizer = args.optimize
use_dependencies = args.dependencies
@@ -97,8 +86,7 @@ def ci_generate(args):
# Generate the jobs
spack_ci.generate_gitlab_ci_yaml(
env, True, output_file, spack_repo, spack_ref,
run_optimizer=run_optimizer,
env, True, output_file, run_optimizer=run_optimizer,
use_dependencies=use_dependencies)
if copy_yaml_to:
@@ -249,8 +237,11 @@ def ci_rebuild(args):
# Make a copy of the environment file, so we can overwrite the changed
# version in between the two invocations of "spack install"
env_src_path = os.path.join(current_directory, 'spack.yaml')
env_dst_path = os.path.join(current_directory, 'spack.yaml_BACKUP')
env_src_path = env.manifest_path
env_dirname = os.path.dirname(env_src_path)
env_filename = os.path.basename(env_src_path)
env_copyname = '{0}_BACKUP'.format(env_filename)
env_dst_path = os.path.join(env_dirname, env_copyname)
shutil.copyfile(env_src_path, env_dst_path)
tty.debug('job concrete spec path: {0}'.format(job_spec_yaml_path))
@@ -339,8 +330,10 @@ def ci_rebuild(args):
first_pass_args))
spack_cmd(*first_pass_args)
# Overwrite the changed environment file so it doesn't
# Overwrite the changed environment file so it doesn't break
# the next install invocation.
tty.debug('Copying {0} to {1}'.format(
env_dst_path, env_src_path))
shutil.copyfile(env_dst_path, env_src_path)
second_pass_args = install_args + [
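The ci_rebuild hunk above stops hard-coding `spack.yaml` in the current directory and instead derives the backup path from the environment's own manifest. The naming scheme can be sketched as follows — `backup_path` is a hypothetical helper, not a function in the diff:

```python
import os


def backup_path(manifest_path):
    # Place '<manifest>_BACKUP' next to the manifest itself, mirroring
    # the env_dirname/env_filename/env_copyname construction above.
    dirname = os.path.dirname(manifest_path)
    filename = os.path.basename(manifest_path)
    return os.path.join(dirname, '{0}_BACKUP'.format(filename))


print(backup_path('/envs/myenv/spack.yaml'))
```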


@@ -10,10 +10,11 @@
import llnl.util.tty as tty
import spack.caches
import spack.cmd
import spack.cmd.test
import spack.cmd.common.arguments as arguments
import spack.repo
import spack.stage
import spack.config
from spack.paths import lib_path, var_path


@@ -275,3 +275,53 @@ def no_checksum():
return Args(
'-n', '--no-checksum', action='store_true', default=False,
help="do not use checksums to verify downloaded files (unsafe)")
def add_cdash_args(subparser, add_help):
cdash_help = {}
if add_help:
cdash_help['upload-url'] = "CDash URL where reports will be uploaded"
cdash_help['build'] = """The name of the build that will be reported to CDash.
Defaults to spec of the package to operate on."""
cdash_help['site'] = """The site name that will be reported to CDash.
Defaults to current system hostname."""
cdash_help['track'] = """Results will be reported to this group on CDash.
Defaults to Experimental."""
cdash_help['buildstamp'] = """Instead of letting the CDash reporter prepare the
buildstamp which, when combined with build name, site and project,
uniquely identifies the build, provide this argument to identify
the build yourself. Format: %%Y%%m%%d-%%H%%M-[cdash-track]"""
else:
cdash_help['upload-url'] = argparse.SUPPRESS
cdash_help['build'] = argparse.SUPPRESS
cdash_help['site'] = argparse.SUPPRESS
cdash_help['track'] = argparse.SUPPRESS
cdash_help['buildstamp'] = argparse.SUPPRESS
subparser.add_argument(
'--cdash-upload-url',
default=None,
help=cdash_help['upload-url']
)
subparser.add_argument(
'--cdash-build',
default=None,
help=cdash_help['build']
)
subparser.add_argument(
'--cdash-site',
default=None,
help=cdash_help['site']
)
cdash_subgroup = subparser.add_mutually_exclusive_group()
cdash_subgroup.add_argument(
'--cdash-track',
default='Experimental',
help=cdash_help['track']
)
cdash_subgroup.add_argument(
'--cdash-buildstamp',
default=None,
help=cdash_help['buildstamp']
)


@@ -0,0 +1,82 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import os
import llnl.util.tty as tty
import spack.build_environment as build_environment
import spack.paths
import spack.cmd
import spack.cmd.common.arguments as arguments
from spack.util.environment import dump_environment, pickle_environment
def setup_parser(subparser):
arguments.add_common_arguments(subparser, ['clean', 'dirty'])
subparser.add_argument(
'--dump', metavar="FILE",
help="dump a source-able environment to FILE"
)
subparser.add_argument(
'--pickle', metavar="FILE",
help="dump a pickled source-able environment to FILE"
)
subparser.add_argument(
'spec', nargs=argparse.REMAINDER,
metavar='spec [--] [cmd]...',
help="specs of package environment to emulate")
subparser.epilog\
= 'If a command is not specified, the environment will be printed ' \
'to standard output (cf /usr/bin/env) unless --dump and/or --pickle ' \
'are specified.\n\nIf a command is specified and spec is ' \
'multi-word, then the -- separator is obligatory.'
def emulate_env_utility(cmd_name, context, args):
if not args.spec:
tty.die("spack %s requires a spec." % cmd_name)
# Specs may have spaces in them, so if they do, require that the
# caller put a '--' between the spec and the command to be
# executed. If there is no '--', assume that the spec is the
# first argument.
sep = '--'
if sep in args.spec:
s = args.spec.index(sep)
spec = args.spec[:s]
cmd = args.spec[s + 1:]
else:
spec = args.spec[0]
cmd = args.spec[1:]
specs = spack.cmd.parse_specs(spec, concretize=True)
if len(specs) > 1:
tty.die("spack %s only takes one spec." % cmd_name)
spec = specs[0]
build_environment.setup_package(spec.package, args.dirty, context)
if args.dump:
# Dump a source-able environment to a text file.
tty.msg("Dumping a source-able environment to {0}".format(args.dump))
dump_environment(args.dump)
if args.pickle:
# Dump a source-able environment to a pickle file.
tty.msg(
"Pickling a source-able environment to {0}".format(args.pickle))
pickle_environment(args.pickle)
if cmd:
# Execute the command with the new environment
os.execvp(cmd[0], cmd)
elif not bool(args.pickle or args.dump):
# If no command or dump/pickle option act like the "env" command
# and print out env vars.
for key, val in os.environ.items():
print("%s=%s" % (key, val))


@@ -2,16 +2,19 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import collections
import os
import re
import shutil
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.config
import spack.cmd.common.arguments
import spack.schema.env
import spack.environment as ev
import spack.schema.packages
import spack.util.spack_yaml as syaml
from spack.util.editor import editor
@@ -80,6 +83,19 @@ def setup_parser(subparser):
# Make the add parser available later
setup_parser.add_parser = add_parser
update = sp.add_parser(
'update', help='update configuration files to the latest format'
)
spack.cmd.common.arguments.add_common_arguments(update, ['yes_to_all'])
update.add_argument('section', help='section to update')
revert = sp.add_parser(
'revert',
help='revert configuration files to their state before update'
)
spack.cmd.common.arguments.add_common_arguments(revert, ['yes_to_all'])
revert.add_argument('section', help='section to revert')
def _get_scope_and_section(args):
"""Extract config scope and section from arguments."""
@@ -161,14 +177,6 @@ def config_list(args):
print(' '.join(list(spack.config.section_schemas)))
def set_config(args, section, new, scope):
if re.match(r'env.*', scope):
e = ev.get_env(args, 'config add')
e.set_config(section, new)
else:
spack.config.set(section, new, scope=scope)
def config_add(args):
"""Add the given configuration to the specified config scope
@@ -200,7 +208,7 @@ def config_add(args):
existing = spack.config.get(section, scope=scope)
new = spack.config.merge_yaml(existing, value)
set_config(args, section, new, scope)
spack.config.set(section, new, scope)
if args.path:
components = spack.config.process_config_path(args.path)
@@ -244,7 +252,7 @@ def config_add(args):
# merge value into existing
new = spack.config.merge_yaml(existing, value)
set_config(args, path, new, scope)
spack.config.set(path, new, scope)
def config_remove(args):
@@ -272,15 +280,167 @@ def config_remove(args):
# This should be impossible to reach
raise spack.config.ConfigError('Config has nested non-dict values')
set_config(args, path, existing, scope)
spack.config.set(path, existing, scope)
def _can_update_config_file(scope_dir, cfg_file):
dir_ok = fs.can_write_to_dir(scope_dir)
cfg_ok = fs.can_access(cfg_file)
return dir_ok and cfg_ok
def config_update(args):
# Read the configuration files
spack.config.config.get_config(args.section, scope=args.scope)
updates = spack.config.config.format_updates[args.section]
cannot_overwrite, skip_system_scope = [], False
for scope in updates:
cfg_file = spack.config.config.get_config_filename(
scope.name, args.section
)
scope_dir = scope.path
can_be_updated = _can_update_config_file(scope_dir, cfg_file)
if not can_be_updated:
if scope.name == 'system':
skip_system_scope = True
msg = ('Not enough permissions to write to "system" scope. '
'Skipping update at that location [cfg={0}]')
tty.warn(msg.format(cfg_file))
continue
cannot_overwrite.append((scope, cfg_file))
if cannot_overwrite:
msg = 'Detected permission issues with the following scopes:\n\n'
for scope, cfg_file in cannot_overwrite:
msg += '\t[scope={0}, cfg={1}]\n'.format(scope.name, cfg_file)
msg += ('\nEither ensure that you have sufficient permissions to '
'modify these files or do not include these scopes in the '
'update.')
tty.die(msg)
if skip_system_scope:
updates = [x for x in updates if x.name != 'system']
# Report if there are no updates to be done
if not updates:
msg = 'No updates needed for "{0}" section.'
tty.msg(msg.format(args.section))
return
proceed = True
if not args.yes_to_all:
msg = ('The following configuration files are going to be updated to'
' the latest schema format:\n\n')
for scope in updates:
cfg_file = spack.config.config.get_config_filename(
scope.name, args.section
)
msg += '\t[scope={0}, file={1}]\n'.format(scope.name, cfg_file)
msg += ('\nIf the configuration files are updated, versions of Spack '
'that are older than this version may not be able to read '
'them. Spack stores backups of the updated files which can '
'be retrieved with "spack config revert"')
tty.msg(msg)
proceed = tty.get_yes_or_no('Do you want to proceed?', default=False)
if not proceed:
tty.die('Operation aborted.')
# Get a function to update the format
update_fn = spack.config.ensure_latest_format_fn(args.section)
for scope in updates:
cfg_file = spack.config.config.get_config_filename(
scope.name, args.section
)
with open(cfg_file) as f:
data = syaml.load_config(f) or {}
data = data.pop(args.section, {})
update_fn(data)
# Make a backup copy and rewrite the file
bkp_file = cfg_file + '.bkp'
shutil.copy(cfg_file, bkp_file)
spack.config.config.update_config(
args.section, data, scope=scope.name, force=True
)
msg = 'File "{0}" updated [backup={1}]'
tty.msg(msg.format(cfg_file, bkp_file))
def _can_revert_update(scope_dir, cfg_file, bkp_file):
dir_ok = fs.can_write_to_dir(scope_dir)
cfg_ok = not os.path.exists(cfg_file) or fs.can_access(cfg_file)
bkp_ok = fs.can_access(bkp_file)
return dir_ok and cfg_ok and bkp_ok
def config_revert(args):
scopes = [args.scope] if args.scope else [
x.name for x in spack.config.config.file_scopes
]
# Search for backup files in the configuration scopes
Entry = collections.namedtuple('Entry', ['scope', 'cfg', 'bkp'])
to_be_restored, cannot_overwrite = [], []
for scope in scopes:
cfg_file = spack.config.config.get_config_filename(scope, args.section)
bkp_file = cfg_file + '.bkp'
# If the backup file doesn't exist, move to the next scope
if not os.path.exists(bkp_file):
continue
# If it exists and we don't have write access in this scope
# keep track of it and report a comprehensive error later
entry = Entry(scope, cfg_file, bkp_file)
scope_dir = os.path.dirname(bkp_file)
can_be_reverted = _can_revert_update(scope_dir, cfg_file, bkp_file)
if not can_be_reverted:
cannot_overwrite.append(entry)
continue
to_be_restored.append(entry)
# Report errors if we can't revert a configuration
if cannot_overwrite:
msg = 'Detected permission issues with the following scopes:\n\n'
for e in cannot_overwrite:
msg += '\t[scope={0.scope}, cfg={0.cfg}, bkp={0.bkp}]\n'.format(e)
msg += ('\nEither ensure you have the right permissions before retrying'
' or be more specific about the scope to revert.')
tty.die(msg)
proceed = True
if not args.yes_to_all:
msg = ('The following scopes will be restored from the corresponding'
' backup files:\n')
for entry in to_be_restored:
msg += '\t[scope={0.scope}, bkp={0.bkp}]\n'.format(entry)
msg += 'This operation cannot be undone.'
tty.msg(msg)
proceed = tty.get_yes_or_no('Do you want to proceed?', default=False)
if not proceed:
tty.die('Operation aborted.')
for _, cfg_file, bkp_file in to_be_restored:
shutil.copy(bkp_file, cfg_file)
os.unlink(bkp_file)
msg = 'File "{0}" reverted to old state'
tty.msg(msg.format(cfg_file))
def config(parser, args):
action = {'get': config_get,
'blame': config_blame,
'edit': config_edit,
'list': config_list,
'add': config_add,
'rm': config_remove,
'remove': config_remove}
action = {
'get': config_get,
'blame': config_blame,
'edit': config_edit,
'list': config_list,
'add': config_add,
'rm': config_remove,
'remove': config_remove,
'update': config_update,
'revert': config_revert
}
action[args.config_command](args)
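The backup-and-restore cycle added by `config update`/`config revert` boils down to: update writes `<cfg>.bkp` before rewriting the file, and revert copies the backup back over it and deletes the backup. A self-contained sketch with temporary files, no Spack configuration involved:

```python
import os
import shutil
import tempfile

workdir = tempfile.mkdtemp()
cfg_file = os.path.join(workdir, 'config.yaml')
bkp_file = cfg_file + '.bkp'

# 'update': back up the old file, then write the new format.
with open(cfg_file, 'w') as f:
    f.write('config: {format: old}\n')
shutil.copy(cfg_file, bkp_file)
with open(cfg_file, 'w') as f:
    f.write('config: {format: new}\n')

# 'revert': restore the backup and remove it, as config_revert does.
shutil.copy(bkp_file, cfg_file)
os.unlink(bkp_file)

with open(cfg_file) as f:
    print(f.read().strip())
```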


@@ -204,6 +204,17 @@ def qmake_args(self):
return args"""
class MavenPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for Maven-based packages"""
base_class_name = 'MavenPackage'
body_def = """\
def build(self, spec, prefix):
# FIXME: If not needed delete this function
pass"""
class SconsPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for SCons-based packages"""
@@ -352,6 +363,34 @@ def __init__(self, name, *args, **kwargs):
super(OctavePackageTemplate, self).__init__(name, *args, **kwargs)
class RubyPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for Ruby packages"""
base_class_name = 'RubyPackage'
dependencies = """\
# FIXME: Add dependencies if required. Only add the ruby dependency
# if you need specific versions. A generic ruby dependency is
# added implicitly by the RubyPackage class.
# depends_on('ruby@X.Y.Z:', type=('build', 'run'))
# depends_on('ruby-foo', type=('build', 'run'))"""
body_def = """\
def build(self, spec, prefix):
# FIXME: If not needed delete this function
pass"""
def __init__(self, name, *args, **kwargs):
# If the user provided `--name ruby-numpy`, don't rename it
# ruby-ruby-numpy
if not name.startswith('ruby-'):
# Make it more obvious that we are renaming the package
tty.msg("Changing package name from {0} to ruby-{0}".format(name))
name = 'ruby-{0}'.format(name)
super(RubyPackageTemplate, self).__init__(name, *args, **kwargs)
class MakefilePackageTemplate(PackageTemplate):
"""Provides appropriate overrides for Makefile packages"""
@@ -402,6 +441,7 @@ def __init__(self, name, *args, **kwargs):
'cmake': CMakePackageTemplate,
'bundle': BundlePackageTemplate,
'qmake': QMakePackageTemplate,
'maven': MavenPackageTemplate,
'scons': SconsPackageTemplate,
'waf': WafPackageTemplate,
'bazel': BazelPackageTemplate,
@@ -410,6 +450,7 @@ def __init__(self, name, *args, **kwargs):
'perlmake': PerlmakePackageTemplate,
'perlbuild': PerlbuildPackageTemplate,
'octave': OctavePackageTemplate,
'ruby': RubyPackageTemplate,
'makefile': MakefilePackageTemplate,
'intel': IntelPackageTemplate,
'meson': MesonPackageTemplate,
@@ -445,6 +486,9 @@ def setup_parser(subparser):
subparser.add_argument(
'--skip-editor', action='store_true',
help="skip the edit session for the package (e.g., automation)")
subparser.add_argument(
'-b', '--batch', action='store_true',
help="don't ask which versions to checksum")
class BuildSystemGuesser:
@@ -461,12 +505,16 @@ def __call__(self, stage, url):
"""Try to guess the type of build system used by a project based on
the contents of its archive or the URL it was downloaded from."""
# Most octave extensions are hosted on Octave-Forge:
# https://octave.sourceforge.net/index.html
# They all have the same base URL.
if url is not None and 'downloads.sourceforge.net/octave/' in url:
self.build_system = 'octave'
return
if url is not None:
# Most octave extensions are hosted on Octave-Forge:
# https://octave.sourceforge.net/index.html
# They all have the same base URL.
if 'downloads.sourceforge.net/octave/' in url:
self.build_system = 'octave'
return
if url.endswith('.gem'):
self.build_system = 'ruby'
return
# A list of clues that give us an idea of the build system a package
# uses. If the regular expression matches a file contained in the
@@ -479,12 +527,16 @@ def __call__(self, stage, url):
(r'/configure$', 'autotools'),
(r'/configure\.(in|ac)$', 'autoreconf'),
(r'/Makefile\.am$', 'autoreconf'),
(r'/pom\.xml$', 'maven'),
(r'/SConstruct$', 'scons'),
(r'/waf$', 'waf'),
(r'/setup\.py$', 'python'),
(r'/WORKSPACE$', 'bazel'),
(r'/Build\.PL$', 'perlbuild'),
(r'/Makefile\.PL$', 'perlmake'),
(r'/.*\.gemspec$', 'ruby'),
(r'/Rakefile$', 'ruby'),
(r'/setup\.rb$', 'ruby'),
(r'/.*\.pro$', 'qmake'),
(r'/(GNU)?[Mm]akefile$', 'makefile'),
(r'/DESCRIPTION$', 'octave'),
@@ -511,7 +563,7 @@ def __call__(self, stage, url):
# Determine the build system based on the files contained
# in the archive.
for pattern, bs in clues:
if any(re.search(pattern, l) for l in lines):
if any(re.search(pattern, line) for line in lines):
self.build_system = bs
break
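The clue loop above picks the first build system whose file pattern matches any entry in the archive listing. A trimmed sketch with a few representative patterns (the subset and its ordering are illustrative, not the full table from the diff):

```python
import re

# Subset of the clue table; order matters, first match wins.
CLUES = [
    (r'/pom\.xml$', 'maven'),
    (r'/.*\.gemspec$', 'ruby'),
    (r'/(GNU)?[Mm]akefile$', 'makefile'),
]


def guess_build_system(lines):
    # 'lines' is the archive's file listing, one path per entry.
    for pattern, bs in CLUES:
        if any(re.search(pattern, line) for line in lines):
            return bs
    return 'generic'


print(guess_build_system(['proj-1.0/pom.xml', 'proj-1.0/src/Main.java']))
print(guess_build_system(['proj-1.0/proj.gemspec']))
```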
@@ -629,7 +681,8 @@ def get_versions(args, name):
versions = spack.stage.get_checksums_for_versions(
url_dict, name, first_stage_function=guesser,
keep_stage=args.keep_stage, batch=True)
keep_stage=args.keep_stage,
batch=(args.batch or len(url_dict) == 1))
else:
versions = unhashed_versions

View File

@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
import sys
from collections import namedtuple
@@ -14,6 +15,7 @@
import spack.config
import spack.schema.env
import spack.cmd.common.arguments
import spack.cmd.install
import spack.cmd.uninstall
import spack.cmd.modules
@@ -37,6 +39,8 @@
['status', 'st'],
'loads',
'view',
'update',
'revert'
]
@@ -394,6 +398,80 @@ def env_loads(args):
print(' source %s' % loads_file)
def env_update_setup_parser(subparser):
"""update environments to the latest format"""
subparser.add_argument(
metavar='env', dest='env',
help='name or directory of the environment to update'
)
spack.cmd.common.arguments.add_common_arguments(subparser, ['yes_to_all'])
def env_update(args):
manifest_file = ev.manifest_file(args.env)
backup_file = manifest_file + ".bkp"
needs_update = not ev.is_latest_format(manifest_file)
if not needs_update:
tty.msg('No update needed for the environment "{0}"'.format(args.env))
return
proceed = True
if not args.yes_to_all:
msg = ('The environment "{0}" is going to be updated to the latest '
'schema format.\nIf the environment is updated, versions of '
'Spack that are older than this version may not be able to '
'read it. Spack stores backups of the updated environment '
'which can be retrieved with "spack env revert"')
tty.msg(msg.format(args.env))
proceed = tty.get_yes_or_no('Do you want to proceed?', default=False)
if not proceed:
tty.die('Operation aborted.')
ev.update_yaml(manifest_file, backup_file=backup_file)
msg = 'Environment "{0}" has been updated [backup={1}]'
tty.msg(msg.format(args.env, backup_file))
def env_revert_setup_parser(subparser):
"""restore environments to their state before update"""
subparser.add_argument(
metavar='env', dest='env',
help='name or directory of the environment to revert'
)
spack.cmd.common.arguments.add_common_arguments(subparser, ['yes_to_all'])
def env_revert(args):
manifest_file = ev.manifest_file(args.env)
backup_file = manifest_file + ".bkp"
# Check that both the spack.yaml and the backup exist, then inform the user
# on what is going to happen and ask for confirmation
if not os.path.exists(manifest_file):
msg = 'cannot find the manifest file of the environment [file={0}]'
tty.die(msg.format(manifest_file))
if not os.path.exists(backup_file):
msg = 'cannot find the old manifest file to be restored [file={0}]'
tty.die(msg.format(backup_file))
proceed = True
if not args.yes_to_all:
msg = ('Spack is going to overwrite the current manifest file'
' with a backup copy [manifest={0}, backup={1}]')
tty.msg(msg.format(manifest_file, backup_file))
proceed = tty.get_yes_or_no('Do you want to proceed?', default=False)
if not proceed:
tty.die('Operation aborted.')
shutil.copy(backup_file, manifest_file)
os.remove(backup_file)
msg = 'Environment "{0}" reverted to old state'
tty.msg(msg.format(manifest_file))
#: Dictionary mapping subcommand names and aliases to functions
subcommand_functions = {}


@@ -2,22 +2,25 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
from collections import defaultdict, namedtuple
import argparse
import os
import re
import six
import sys
from collections import defaultdict, namedtuple
import spack
import spack.error
import llnl.util.tty as tty
import spack.util.spack_yaml as syaml
import spack.util.environment
import llnl.util.filesystem
import llnl.util.tty as tty
import llnl.util.tty.colify as colify
import six
import spack
import spack.cmd
import spack.error
import spack.util.environment
import spack.util.spack_yaml as syaml
description = "add external packages to Spack configuration"
description = "manage external packages in Spack configuration"
section = "config"
level = "short"
@@ -26,12 +29,25 @@ def setup_parser(subparser):
sp = subparser.add_subparsers(
metavar='SUBCOMMAND', dest='external_command')
find_parser = sp.add_parser('find', help=external_find.__doc__)
scopes = spack.config.scopes()
scopes_metavar = spack.config.scopes_metavar
find_parser = sp.add_parser(
'find', help='add external packages to packages.yaml'
)
find_parser.add_argument(
'--not-buildable', action='store_true', default=False,
help="packages with detected externals won't be built with Spack")
find_parser.add_argument(
'--scope', choices=scopes, metavar=scopes_metavar,
default=spack.config.default_modify_scope('packages'),
help="configuration scope to modify")
find_parser.add_argument('packages', nargs=argparse.REMAINDER)
sp.add_parser(
'list', help='list detectable packages, by repository and name'
)
def is_executable(path):
return os.path.isfile(path) and os.access(path, os.X_OK)
@@ -74,19 +90,37 @@ def _generate_pkg_config(external_pkg_entries):
This does not generate the entire packages.yaml. For example, given some
external entries for the CMake package, this could return::
{ 'paths': {
'cmake@3.17.1': '/opt/cmake-3.17.1/',
'cmake@3.16.5': '/opt/cmake-3.16.5/'
}
{
'externals': [{
'spec': 'cmake@3.17.1',
'prefix': '/opt/cmake-3.17.1/'
}, {
'spec': 'cmake@3.16.5',
'prefix': '/opt/cmake-3.16.5/'
}]
}
"""
paths_dict = syaml.syaml_dict()
pkg_dict = syaml.syaml_dict()
pkg_dict['externals'] = []
for e in external_pkg_entries:
if not _spec_is_valid(e.spec):
continue
paths_dict[str(e.spec)] = e.base_dir
pkg_dict = syaml.syaml_dict()
pkg_dict['paths'] = paths_dict
external_items = [('spec', str(e.spec)), ('prefix', e.base_dir)]
if e.spec.external_modules:
external_items.append(('modules', e.spec.external_modules))
if e.spec.extra_attributes:
external_items.append(
('extra_attributes',
syaml.syaml_dict(e.spec.extra_attributes.items()))
)
# external_items.extend(e.spec.extra_attributes.items())
pkg_dict['externals'].append(
syaml.syaml_dict(external_items)
)
return pkg_dict
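The docstring change documents the move from the old `paths` mapping to the new `externals` list in `packages.yaml`. The reshaping can be sketched like this, using plain dicts in place of `syaml_dict` and omitting the modules/extra_attributes handling:

```python
def generate_pkg_config(entries):
    # entries: iterable of (spec_string, prefix) pairs. The new layout
    # is a list of per-external dicts instead of a spec->prefix map.
    return {
        'externals': [
            {'spec': spec, 'prefix': prefix} for spec, prefix in entries
        ]
    }


cfg = generate_pkg_config([('cmake@3.17.1', '/opt/cmake-3.17.1/'),
                           ('cmake@3.16.5', '/opt/cmake-3.16.5/')])
print(cfg['externals'][0]['spec'])
```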
@@ -120,7 +154,17 @@ def external_find(args):
packages_to_check = spack.repo.path.all_packages()
pkg_to_entries = _get_external_packages(packages_to_check)
_update_pkg_config(pkg_to_entries, args.not_buildable)
new_entries = _update_pkg_config(
args.scope, pkg_to_entries, args.not_buildable
)
if new_entries:
path = spack.config.config.get_config_filename(args.scope, 'packages')
msg = ('The following specs have been detected on this system '
'and added to {0}')
tty.msg(msg.format(path))
spack.cmd.display_specs(new_entries)
else:
tty.msg('No new external packages detected')
def _group_by_prefix(paths):
@@ -162,32 +206,34 @@ def _get_predefined_externals():
pkg_config = spack.config.get('packages')
already_defined_specs = set()
for pkg_name, per_pkg_cfg in pkg_config.items():
paths = per_pkg_cfg.get('paths', {})
already_defined_specs.update(spack.spec.Spec(k) for k in paths)
modules = per_pkg_cfg.get('modules', {})
already_defined_specs.update(spack.spec.Spec(k) for k in modules)
for item in per_pkg_cfg.get('externals', []):
already_defined_specs.add(spack.spec.Spec(item['spec']))
return already_defined_specs
def _update_pkg_config(pkg_to_entries, not_buildable):
def _update_pkg_config(scope, pkg_to_entries, not_buildable):
predefined_external_specs = _get_predefined_externals()
pkg_to_cfg = {}
pkg_to_cfg, all_new_specs = {}, []
for pkg_name, ext_pkg_entries in pkg_to_entries.items():
new_entries = list(
e for e in ext_pkg_entries
if (e.spec not in predefined_external_specs))
pkg_config = _generate_pkg_config(new_entries)
all_new_specs.extend([
spack.spec.Spec(x['spec']) for x in pkg_config.get('externals', [])
])
if not_buildable:
pkg_config['buildable'] = False
pkg_to_cfg[pkg_name] = pkg_config
cfg_scope = spack.config.default_modify_scope()
pkgs_cfg = spack.config.get('packages', scope=cfg_scope)
pkgs_cfg = spack.config.get('packages', scope=scope)
spack.config.merge_yaml(pkgs_cfg, pkg_to_cfg)
spack.config.set('packages', pkgs_cfg, scope=cfg_scope)
spack.config.set('packages', pkgs_cfg, scope=scope)
return all_new_specs
def _get_external_packages(packages_to_check, system_path_to_exe=None):
@@ -234,7 +280,7 @@ def _get_external_packages(packages_to_check, system_path_to_exe=None):
if not specs:
tty.debug(
'The following executables in {0} were decidedly not'
'The following executables in {0} were decidedly not '
'part of the package {1}: {2}'
.format(prefix, pkg.name, ', '.join(exes_in_prefix))
)
@@ -259,13 +305,33 @@ def _get_external_packages(packages_to_check, system_path_to_exe=None):
else:
resolved_specs[spec] = prefix
try:
spec.validate_detection()
except Exception as e:
msg = ('"{0}" has been detected on the system but will '
'not be added to packages.yaml [reason={1}]')
tty.warn(msg.format(spec, str(e)))
continue
if spec.external_path:
pkg_prefix = spec.external_path
pkg_to_entries[pkg.name].append(
ExternalPackageEntry(spec=spec, base_dir=pkg_prefix))
return pkg_to_entries
def external(parser, args):
action = {'find': external_find}
def external_list(args):
# Trigger a read of all packages, might take a long time.
list(spack.repo.path.all_packages())
# Print all the detectable packages
tty.msg("Detectable packages per repository")
for namespace, pkgs in sorted(spack.package.detectable_packages.items()):
print("Repository:", namespace)
colify.colify(pkgs, indent=4, output=sys.stdout)
def external(parser, args):
action = {'find': external_find, 'list': external_list}
action[args.external_command](args)


@@ -35,6 +35,10 @@
@g{%compiler@version} build with specific compiler version
@g{%compiler@min:max} specific version range (see above)
compiler flags:
@g{cflags="flags"} cppflags, cflags, cxxflags,
fflags, ldflags, ldlibs
variants:
@B{+variant} enable <variant>
@r{-variant} or @r{~variant} disable <variant>
@@ -42,7 +46,7 @@
@B{variant=value1,value2,value3} set multi-value <variant> values
architecture variants:
@m{platform=platform} linux, darwin, cray, bgq, etc.
@m{platform=platform} linux, darwin, cray, etc.
@m{os=operating_system} specific <operating_system>
@m{target=target} specific <target> processor
@m{arch=platform-os-target} shortcut for all three above


@@ -160,60 +160,10 @@ def setup_parser(subparser):
action='store_true',
help="Show usage instructions for CDash reporting"
)
add_cdash_args(subparser, False)
arguments.add_cdash_args(subparser, False)
arguments.add_common_arguments(subparser, ['yes_to_all', 'spec'])
def add_cdash_args(subparser, add_help):
cdash_help = {}
if add_help:
cdash_help['upload-url'] = "CDash URL where reports will be uploaded"
cdash_help['build'] = """The name of the build that will be reported to CDash.
Defaults to spec of the package to install."""
cdash_help['site'] = """The site name that will be reported to CDash.
Defaults to current system hostname."""
cdash_help['track'] = """Results will be reported to this group on CDash.
Defaults to Experimental."""
cdash_help['buildstamp'] = """Instead of letting the CDash reporter prepare the
buildstamp which, when combined with build name, site and project,
uniquely identifies the build, provide this argument to identify
the build yourself. Format: %%Y%%m%%d-%%H%%M-[cdash-track]"""
else:
cdash_help['upload-url'] = argparse.SUPPRESS
cdash_help['build'] = argparse.SUPPRESS
cdash_help['site'] = argparse.SUPPRESS
cdash_help['track'] = argparse.SUPPRESS
cdash_help['buildstamp'] = argparse.SUPPRESS
subparser.add_argument(
'--cdash-upload-url',
default=None,
help=cdash_help['upload-url']
)
subparser.add_argument(
'--cdash-build',
default=None,
help=cdash_help['build']
)
subparser.add_argument(
'--cdash-site',
default=None,
help=cdash_help['site']
)
cdash_subgroup = subparser.add_mutually_exclusive_group()
cdash_subgroup.add_argument(
'--cdash-track',
default='Experimental',
help=cdash_help['track']
)
cdash_subgroup.add_argument(
'--cdash-buildstamp',
default=None,
help=cdash_help['buildstamp']
)
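The refactored helper above registers hidden-or-visible CDash options and makes `--cdash-track`/`--cdash-buildstamp` mutually exclusive. A minimal standalone sketch of that argparse pattern, with hypothetical option names:

```python
import argparse

def add_report_args(subparser, add_help):
    # With add_help=False the options still parse, but argparse.SUPPRESS
    # keeps them out of the --help output.
    track_help = "report results to this group" if add_help else argparse.SUPPRESS
    stamp_help = "provide an explicit buildstamp" if add_help else argparse.SUPPRESS

    group = subparser.add_mutually_exclusive_group()
    group.add_argument('--track', default='Experimental', help=track_help)
    group.add_argument('--buildstamp', default=None, help=stamp_help)

parser = argparse.ArgumentParser()
add_report_args(parser, add_help=False)
args = parser.parse_args(['--track', 'Nightly'])
```

Passing both `--track` and `--buildstamp` makes `parse_args` exit with an error, which is the point of the exclusive group.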
def default_log_file(spec):
"""Computes the default filename for the log file and creates
the corresponding directory if not present
@@ -263,7 +213,7 @@ def install(parser, args, **kwargs):
SPACK_CDASH_AUTH_TOKEN
authentication token to present to CDash
'''))
add_cdash_args(parser, True)
arguments.add_cdash_args(parser, True)
parser.print_help()
return
@@ -313,7 +263,8 @@ def install(parser, args, **kwargs):
tty.warn("Deprecated option: --run-tests: use --test=all instead")
# 1. Abstract specs from cli
reporter = spack.report.collect_info(args.log_format, args)
reporter = spack.report.collect_info(
spack.package.PackageInstaller, '_install_task', args.log_format, args)
if args.log_file:
reporter.filename = args.log_file
@@ -353,7 +304,7 @@ def install(parser, args, **kwargs):
if not args.log_file and not reporter.filename:
reporter.filename = default_log_file(specs[0])
reporter.specs = specs
with reporter:
with reporter('build'):
if args.overwrite:
installed = list(filter(lambda x: x,

View File

@@ -54,6 +54,9 @@ def setup_parser(subparser):
subparser.add_argument(
'--update', metavar='FILE', default=None, action='store',
help='write output to the specified file, if any package is newer')
subparser.add_argument(
'-v', '--virtuals', action='store_true', default=False,
help='include virtual packages in list')
arguments.add_common_arguments(subparser, ['tags'])
@@ -267,7 +270,7 @@ def list(parser, args):
formatter = formatters[args.format]
# Retrieve the names of all the packages
pkgs = set(spack.repo.all_package_names())
pkgs = set(spack.repo.all_package_names(args.virtuals))
# Filter the set appropriately
sorted_packages = filter_by_name(pkgs, args)

View File

@@ -70,6 +70,6 @@ def python(parser, args, unknown_args):
# Provides readline support, allowing user to use arrow keys
console.push('import readline')
console.interact("Spack version %s\nPython %s, %s %s"""
console.interact("Spack version %s\nPython %s, %s %s"
% (spack.spack_version, platform.python_version(),
platform.system(), platform.machine()))

View File

@@ -37,6 +37,7 @@ def setup_parser(subparser):
cd_group = subparser.add_mutually_exclusive_group()
arguments.add_common_arguments(cd_group, ['clean', 'dirty'])
subparser.epilog = 'DEPRECATED: use `spack dev-build` instead'
def write_spconfig(package, dirty):
@@ -98,6 +99,8 @@ def cmdlist(str):
def setup(self, args):
tty.warn('DEPRECATED: use `spack dev-build` instead')
if not args.spec:
tty.die("spack setup requires a package spec argument.")

View File

@@ -34,7 +34,7 @@ def setup_parser(subparser):
const='yaml', help='print concrete spec as YAML')
subparser.add_argument(
'-j', '--json', action='store_const', dest='format', default=None,
const='json', help='print concrete spec as YAML')
const='json', help='print concrete spec as JSON')
subparser.add_argument(
'-c', '--cover', action='store',
default='nodes', choices=['nodes', 'edges', 'paths'],

View File

@@ -4,166 +4,319 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
from __future__ import division
import collections
import sys
import re
import os
import argparse
import pytest
from six import StringIO
import textwrap
import fnmatch
import re
import shutil
import llnl.util.tty.color as color
from llnl.util.filesystem import working_dir
from llnl.util.tty.colify import colify
import llnl.util.tty as tty
import spack.paths
import spack.install_test
import spack.environment as ev
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.report
import spack.package
description = "run spack's unit tests (wrapper around pytest)"
section = "developer"
description = "run spack's tests for an install"
section = "administrator"
level = "long"
def first_line(docstring):
"""Return the first line of the docstring."""
return docstring.split('\n')[0]
def setup_parser(subparser):
subparser.add_argument(
'-H', '--pytest-help', action='store_true', default=False,
help="show full pytest help, with advanced options")
sp = subparser.add_subparsers(metavar='SUBCOMMAND', dest='test_command')
# extra spack arguments to list tests
list_group = subparser.add_argument_group("listing tests")
list_mutex = list_group.add_mutually_exclusive_group()
list_mutex.add_argument(
'-l', '--list', action='store_const', default=None,
dest='list', const='list', help="list test filenames")
list_mutex.add_argument(
'-L', '--list-long', action='store_const', default=None,
dest='list', const='long', help="list all test functions")
list_mutex.add_argument(
'-N', '--list-names', action='store_const', default=None,
dest='list', const='names', help="list full names of all tests")
# Run
run_parser = sp.add_parser('run', description=test_run.__doc__,
help=first_line(test_run.__doc__))
# use tests for extension
subparser.add_argument(
'--extension', default=None,
help="run test for a given spack extension")
alias_help_msg = "Provide an alias for this test-suite"
alias_help_msg += " for subsequent access."
run_parser.add_argument('--alias', help=alias_help_msg)
# spell out some common pytest arguments, so they'll show up in help
pytest_group = subparser.add_argument_group(
"common pytest arguments (spack test --pytest-help for more details)")
pytest_group.add_argument(
"-s", action='append_const', dest='parsed_args', const='-s',
help="print output while tests run (disable capture)")
pytest_group.add_argument(
"-k", action='store', metavar="EXPRESSION", dest='expression',
help="filter tests by keyword (can also use w/list options)")
pytest_group.add_argument(
"--showlocals", action='append_const', dest='parsed_args',
const='--showlocals', help="show local variable values in tracebacks")
run_parser.add_argument(
'--fail-fast', action='store_true',
help="Stop tests for each package after the first failure."
)
run_parser.add_argument(
'--fail-first', action='store_true',
help="Stop after the first failed package."
)
run_parser.add_argument(
'--keep-stage',
action='store_true',
help='Keep testing directory for debugging'
)
run_parser.add_argument(
'--log-format',
default=None,
choices=spack.report.valid_formats,
help="format to be used for log files"
)
run_parser.add_argument(
'--log-file',
default=None,
help="filename for the log file. if not passed a default will be used"
)
arguments.add_cdash_args(run_parser, False)
run_parser.add_argument(
'--help-cdash',
action='store_true',
help="Show usage instructions for CDash reporting"
)
# remainder is just passed to pytest
subparser.add_argument(
'pytest_args', nargs=argparse.REMAINDER, help="arguments for pytest")
length_group = run_parser.add_mutually_exclusive_group()
length_group.add_argument(
'--smoke', action='store_true', dest='smoke_test', default=True,
help='run smoke tests (default)')
length_group.add_argument(
'--capability', action='store_false', dest='smoke_test', default=True,
help='run full capability tests using pavilion')
cd_group = run_parser.add_mutually_exclusive_group()
arguments.add_common_arguments(cd_group, ['clean', 'dirty'])
arguments.add_common_arguments(run_parser, ['installed_specs'])
# List
list_parser = sp.add_parser('list', description=test_list.__doc__,
help=first_line(test_list.__doc__))
list_parser.add_argument(
'filter', nargs=argparse.REMAINDER,
help='optional case-insensitive glob patterns to filter results.')
# Find
find_parser = sp.add_parser('find', description=test_find.__doc__,
help=first_line(test_find.__doc__))
find_parser.add_argument(
'filter', nargs=argparse.REMAINDER,
help='optional case-insensitive glob patterns to filter results.')
# Status
status_parser = sp.add_parser('status', description=test_status.__doc__,
help=first_line(test_status.__doc__))
status_parser.add_argument(
'names', nargs=argparse.REMAINDER,
help="Test suites for which to print status")
# Results
results_parser = sp.add_parser('results', description=test_results.__doc__,
help=first_line(test_results.__doc__))
results_parser.add_argument(
'names', nargs=argparse.REMAINDER,
help="Test suites for which to print results")
# Remove
remove_parser = sp.add_parser('remove', description=test_remove.__doc__,
help=first_line(test_remove.__doc__))
arguments.add_common_arguments(remove_parser, ['yes_to_all'])
remove_parser.add_argument(
'names', nargs=argparse.REMAINDER,
help="Test suites to remove from test stage")
def do_list(args, extra_args):
"""Print a lists of tests than what pytest offers."""
# Run test collection and get the tree out.
old_output = sys.stdout
try:
sys.stdout = output = StringIO()
pytest.main(['--collect-only'] + extra_args)
finally:
sys.stdout = old_output
def test_run(args):
"""Run tests for the specified installed packages.
lines = output.getvalue().split('\n')
tests = collections.defaultdict(lambda: set())
prefix = []
# collect tests into sections
for line in lines:
match = re.match(r"(\s*)<([^ ]*) '([^']*)'", line)
if not match:
continue
indent, nodetype, name = match.groups()
# strip parametrized tests
if "[" in name:
name = name[:name.index("[")]
depth = len(indent) // 2
if nodetype.endswith("Function"):
key = tuple(prefix)
tests[key].add(name)
else:
prefix = prefix[:depth]
prefix.append(name)
def colorize(c, prefix):
if isinstance(prefix, tuple):
return "::".join(
color.colorize("@%s{%s}" % (c, p))
for p in prefix if p != "()"
)
return color.colorize("@%s{%s}" % (c, prefix))
if args.list == "list":
files = set(prefix[0] for prefix in tests)
color_files = [colorize("B", file) for file in sorted(files)]
colify(color_files)
elif args.list == "long":
for prefix, functions in sorted(tests.items()):
path = colorize("*B", prefix) + "::"
functions = [colorize("c", f) for f in sorted(functions)]
color.cprint(path)
colify(functions, indent=4)
print()
else: # args.list == "names"
all_functions = [
colorize("*B", prefix) + "::" + colorize("c", f)
for prefix, functions in sorted(tests.items())
for f in sorted(functions)
]
colify(all_functions)
def add_back_pytest_args(args, unknown_args):
"""Add parsed pytest args, unknown args, and remainder together.
We add some basic pytest arguments to the Spack parser to ensure that
they show up in the short help, so we have to reassemble things here.
If no specs are listed, run tests for all packages in the current
environment or all installed packages if there is no active environment.
"""
result = args.parsed_args or []
result += unknown_args or []
result += args.pytest_args or []
if args.expression:
result += ["-k", args.expression]
return result
# cdash help option
if args.help_cdash:
parser = argparse.ArgumentParser(
formatter_class=argparse.RawDescriptionHelpFormatter,
epilog=textwrap.dedent('''\
environment variables:
SPACK_CDASH_AUTH_TOKEN
authentication token to present to CDash
'''))
arguments.add_cdash_args(parser, True)
parser.print_help()
return
# set config option for fail-fast
if args.fail_fast:
spack.config.set('config:fail_fast', True, scope='command_line')
# Get specs to test
env = ev.get_env(args, 'test')
hashes = env.all_hashes() if env else None
specs = spack.cmd.parse_specs(args.specs) if args.specs else [None]
specs_to_test = []
for spec in specs:
matching = spack.store.db.query_local(spec, hashes=hashes)
if spec and not matching:
tty.warn("No installed packages match spec %s" % spec)
specs_to_test.extend(matching)
# test_stage_dir
test_suite = spack.install_test.TestSuite(specs_to_test, args.alias)
test_suite.ensure_stage()
tty.msg("Spack test %s" % test_suite.name)
# Set up reporter
setattr(args, 'package', [s.format() for s in test_suite.specs])
reporter = spack.report.collect_info(
spack.package.PackageBase, 'do_test', args.log_format, args)
if not reporter.filename:
if args.log_file:
if os.path.isabs(args.log_file):
log_file = args.log_file
else:
log_dir = os.getcwd()
log_file = os.path.join(log_dir, args.log_file)
else:
log_file = os.path.join(
os.getcwd(),
'test-%s' % test_suite.name)
reporter.filename = log_file
reporter.specs = specs_to_test
with reporter('test', test_suite.stage):
if args.smoke_test:
test_suite(remove_directory=not args.keep_stage,
dirty=args.dirty,
fail_first=args.fail_first)
else:
raise NotImplementedError
def test(parser, args, unknown_args):
if args.pytest_help:
# make the pytest.main help output more accurate
sys.argv[0] = 'spack test'
return pytest.main(['-h'])
def test_list(args):
"""List all installed packages with available tests."""
raise NotImplementedError
# add back any parsed pytest args we need to pass to pytest
pytest_args = add_back_pytest_args(args, unknown_args)
# The default is to test the core of Spack. If the option `--extension`
# has been used, then test that extension.
pytest_root = spack.paths.spack_root
if args.extension:
target = args.extension
extensions = spack.config.get('config:extensions')
pytest_root = spack.extensions.path_for_extension(target, *extensions)
def test_find(args): # TODO: merge with status (noargs)
"""Find tests that are running or have available results.
# pytest.ini lives in the root of the spack repository.
with working_dir(pytest_root):
if args.list:
do_list(args, pytest_args)
Displays aliases for tests that have them, otherwise test suite content
hashes."""
test_suites = spack.install_test.get_all_test_suites()
# Filter tests by filter argument
if args.filter:
def create_filter(f):
raw = fnmatch.translate(f if '*' in f or '?' in f
else '*' + f + '*')
return re.compile(raw, flags=re.IGNORECASE)
filters = [create_filter(f) for f in args.filter]
def match(t, f):
return f.match(t)
test_suites = [t for t in test_suites
if any(match(t.alias, f) for f in filters) and
os.path.isdir(t.stage)]
names = [t.name for t in test_suites]
if names:
# TODO: Make these specify results vs active
msg = "Spack test results available for the following tests:\n"
msg += " %s\n" % ' '.join(names)
msg += " Run `spack test remove` to remove all tests"
tty.msg(msg)
else:
msg = "No test results match the query\n"
msg += " Tests may have been removed using `spack test remove`"
tty.msg(msg)
def test_status(args):
"""Get the current status for the specified Spack test suite(s)."""
if args.names:
test_suites = []
for name in args.names:
test_suite = spack.install_test.get_test_suite(name)
if test_suite:
test_suites.append(test_suite)
else:
tty.msg("No test suite %s found in test stage" % name)
else:
test_suites = spack.install_test.get_all_test_suites()
if not test_suites:
tty.msg("No test suites with status to report")
for test_suite in test_suites:
# TODO: Make this handle capability tests too
# TODO: Make this handle tests running in another process
tty.msg("Test suite %s completed" % test_suite.name)
def test_results(args):
"""Get the results from Spack test suite(s) (default all)."""
if args.names:
test_suites = []
for name in args.names:
test_suite = spack.install_test.get_test_suite(name)
if test_suite:
test_suites.append(test_suite)
else:
tty.msg("No test suite %s found in test stage" % name)
else:
test_suites = spack.install_test.get_all_test_suites()
if not test_suites:
tty.msg("No test suites with results to report")
# TODO: Make this handle capability tests too
# The results file may turn out to be a placeholder for future work
for test_suite in test_suites:
results_file = test_suite.results_file
if os.path.exists(results_file):
msg = "Results for test suite %s: \n" % test_suite.name
with open(results_file, 'r') as f:
lines = f.readlines()
for line in lines:
msg += " %s" % line
tty.msg(msg)
else:
msg = "Test %s has no results.\n" % test_suite.name
msg += " Check if it is running with "
msg += "`spack test status %s`" % test_suite.name
tty.msg(msg)
def test_remove(args):
"""Remove results from Spack test suite(s) (default all).
If no test suite is listed, remove results for all suites.
Removed tests can no longer be accessed for results or status, and will not
appear in `spack test list` results."""
if args.names:
test_suites = []
for name in args.names:
test_suite = spack.install_test.get_test_suite(name)
if test_suite:
test_suites.append(test_suite)
else:
tty.msg("No test suite %s found in test stage" % name)
else:
test_suites = spack.install_test.get_all_test_suites()
if not test_suites:
tty.msg("No test suites to remove")
return
if not args.yes_to_all:
msg = 'The following test suites will be removed:\n\n'
msg += ' ' + ' '.join(test.name for test in test_suites) + '\n'
tty.msg(msg)
answer = tty.get_yes_or_no('Do you want to proceed?', default=False)
if not answer:
tty.msg('Aborting removal of test suites')
return
return pytest.main(pytest_args)
for test_suite in test_suites:
shutil.rmtree(test_suite.stage)
def test(parser, args):
globals()['test_%s' % args.test_command](args)
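The new entry point dispatches to `test_run`, `test_list`, `test_remove`, etc. purely by name via `globals()`. A self-contained sketch of that dispatch idiom, with hypothetical handler names:

```python
def cmd_list(args):
    return 'listing %s' % args

def cmd_remove(args):
    return 'removing %s' % args

def dispatch(subcommand, args):
    # Name-based lookup, mirroring globals()['test_%s' % args.test_command](args):
    # each subcommand maps to a function following a naming convention.
    handler = globals()['cmd_%s' % subcommand]
    return handler(args)
```

This avoids a manual if/elif chain, at the cost of raising `KeyError` for an unknown subcommand (argparse's `choices` normally guards against that).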

View File

@@ -0,0 +1,16 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.cmd.common.env_utility as env_utility
description = "run a command in a spec's test environment, " \
"or dump its environment to screen or file"
section = "administration"
level = "long"
setup_parser = env_utility.setup_parser
def test_env(parser, args):
env_utility.emulate_env_utility('test-env', 'test', args)

View File

@@ -0,0 +1,169 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
from __future__ import division
import collections
import sys
import re
import argparse
import pytest
from six import StringIO
import llnl.util.tty.color as color
from llnl.util.filesystem import working_dir
from llnl.util.tty.colify import colify
import spack.paths
description = "run spack's unit tests (wrapper around pytest)"
section = "developer"
level = "long"
def setup_parser(subparser):
subparser.add_argument(
'-H', '--pytest-help', action='store_true', default=False,
help="show full pytest help, with advanced options")
# extra spack arguments to list tests
list_group = subparser.add_argument_group("listing tests")
list_mutex = list_group.add_mutually_exclusive_group()
list_mutex.add_argument(
'-l', '--list', action='store_const', default=None,
dest='list', const='list', help="list test filenames")
list_mutex.add_argument(
'-L', '--list-long', action='store_const', default=None,
dest='list', const='long', help="list all test functions")
list_mutex.add_argument(
'-N', '--list-names', action='store_const', default=None,
dest='list', const='names', help="list full names of all tests")
# use tests for extension
subparser.add_argument(
'--extension', default=None,
help="run test for a given spack extension")
# spell out some common pytest arguments, so they'll show up in help
pytest_group = subparser.add_argument_group(
"common pytest arguments (spack unit-test --pytest-help for more)")
pytest_group.add_argument(
"-s", action='append_const', dest='parsed_args', const='-s',
help="print output while tests run (disable capture)")
pytest_group.add_argument(
"-k", action='store', metavar="EXPRESSION", dest='expression',
help="filter tests by keyword (can also use w/list options)")
pytest_group.add_argument(
"--showlocals", action='append_const', dest='parsed_args',
const='--showlocals', help="show local variable values in tracebacks")
# remainder is just passed to pytest
subparser.add_argument(
'pytest_args', nargs=argparse.REMAINDER, help="arguments for pytest")
def do_list(args, extra_args):
"""Print a lists of tests than what pytest offers."""
# Run test collection and get the tree out.
old_output = sys.stdout
try:
sys.stdout = output = StringIO()
pytest.main(['--collect-only'] + extra_args)
finally:
sys.stdout = old_output
lines = output.getvalue().split('\n')
tests = collections.defaultdict(lambda: set())
prefix = []
# collect tests into sections
for line in lines:
match = re.match(r"(\s*)<([^ ]*) '([^']*)'", line)
if not match:
continue
indent, nodetype, name = match.groups()
# strip parametrized tests
if "[" in name:
name = name[:name.index("[")]
depth = len(indent) // 2
if nodetype.endswith("Function"):
key = tuple(prefix)
tests[key].add(name)
else:
prefix = prefix[:depth]
prefix.append(name)
def colorize(c, prefix):
if isinstance(prefix, tuple):
return "::".join(
color.colorize("@%s{%s}" % (c, p))
for p in prefix if p != "()"
)
return color.colorize("@%s{%s}" % (c, prefix))
if args.list == "list":
files = set(prefix[0] for prefix in tests)
color_files = [colorize("B", file) for file in sorted(files)]
colify(color_files)
elif args.list == "long":
for prefix, functions in sorted(tests.items()):
path = colorize("*B", prefix) + "::"
functions = [colorize("c", f) for f in sorted(functions)]
color.cprint(path)
colify(functions, indent=4)
print()
else: # args.list == "names"
all_functions = [
colorize("*B", prefix) + "::" + colorize("c", f)
for prefix, functions in sorted(tests.items())
for f in sorted(functions)
]
colify(all_functions)
def add_back_pytest_args(args, unknown_args):
"""Add parsed pytest args, unknown args, and remainder together.
We add some basic pytest arguments to the Spack parser to ensure that
they show up in the short help, so we have to reassemble things here.
"""
result = args.parsed_args or []
result += unknown_args or []
result += args.pytest_args or []
if args.expression:
result += ["-k", args.expression]
return result
def unit_test(parser, args, unknown_args):
if args.pytest_help:
# make the pytest.main help output more accurate
sys.argv[0] = 'spack test'
return pytest.main(['-h'])
# add back any parsed pytest args we need to pass to pytest
pytest_args = add_back_pytest_args(args, unknown_args)
# The default is to test the core of Spack. If the option `--extension`
# has been used, then test that extension.
pytest_root = spack.paths.spack_root
if args.extension:
target = args.extension
extensions = spack.config.get('config:extensions')
pytest_root = spack.extensions.path_for_extension(target, *extensions)
# pytest.ini lives in the root of the spack repository.
with working_dir(pytest_root):
if args.list:
do_list(args, pytest_args)
return
return pytest.main(pytest_args)
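`do_list` above reconstructs the test tree from pytest's `--collect-only` output by matching each line's indentation depth. A sketch of the same parsing logic run against canned output (the sample module and function names are made up):

```python
import collections
import re

OUTPUT = """\
<Module 'test_spec.py'>
  <Function 'test_parse'>
  <Function 'test_concretize'>
<Module 'test_config.py'>
  <Function 'test_scopes'>
"""

tests = collections.defaultdict(set)
prefix = []
for line in OUTPUT.splitlines():
    match = re.match(r"(\s*)<([^ ]*) '([^']*)'", line)
    if not match:
        continue
    indent, nodetype, name = match.groups()
    depth = len(indent) // 2          # pytest indents two spaces per level
    if nodetype.endswith("Function"):
        tests[tuple(prefix)].add(name)  # record under the enclosing containers
    else:
        prefix = prefix[:depth]         # pop back to this nesting depth
        prefix.append(name)
```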

View File

@@ -18,6 +18,7 @@
import spack.error
import spack.spec
import spack.version
import spack.architecture
import spack.util.executable
import spack.util.module_cmd
@@ -201,7 +202,7 @@ class Compiler(object):
fc_names = []
# Optional prefix regexes for searching for this type of compiler.
# Prefixes are sometimes used for toolchains, e.g. 'powerpc-bgq-linux-'
# Prefixes are sometimes used for toolchains
prefixes = []
# Optional suffix regexes for searching for this type of compiler.
@@ -278,7 +279,8 @@ def __init__(self, cspec, operating_system, target,
self.target = target
self.modules = modules or []
self.alias = alias
self.extra_rpaths = extra_rpaths
self.environment = environment or {}
self.extra_rpaths = extra_rpaths or []
self.enable_implicit_rpaths = enable_implicit_rpaths
self.cc = paths[0]
@@ -292,9 +294,6 @@ def __init__(self, cspec, operating_system, target,
else:
self.fc = paths[3]
self.environment = environment
self.extra_rpaths = extra_rpaths or []
# Unfortunately have to make sure these params are accepted
# in the same order they are returned by sorted(flags)
# in compilers/__init__.py
@@ -304,6 +303,10 @@ def __init__(self, cspec, operating_system, target,
if value is not None:
self.flags[flag] = tokenize_flags(value)
# caching value for compiler reported version
# used for version checks for API, e.g. C++11 flag
self._real_version = None
def verify_executables(self):
"""Raise an error if any of the compiler executables is not valid.
@@ -333,6 +336,20 @@ def accessible_exe(exe):
def version(self):
return self.spec.version
@property
def real_version(self):
"""Executable reported compiler version used for API-determinations
E.g. C++11 flag checks.
"""
if not self._real_version:
try:
self._real_version = spack.version.Version(
self.get_real_version())
except spack.util.executable.ProcessError:
self._real_version = self.version
return self._real_version
def implicit_rpaths(self):
if self.enable_implicit_rpaths is False:
return []
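The new `real_version` property is a lazily-computed, cached value with a fallback to the declared version when probing the executable fails. The pattern in isolation, with a hypothetical class and probe method:

```python
class Tool(object):
    def __init__(self, declared_version):
        self.version = declared_version
        self._real_version = None   # cache; filled on first access

    def probe_version(self):
        # Stand-in for invoking the real executable; may raise.
        return '9.3.0'

    @property
    def real_version(self):
        if not self._real_version:
            try:
                self._real_version = self.probe_version()
            except Exception:
                # Fall back to the declared version if probing fails
                self._real_version = self.version
        return self._real_version
```

Subsequent accesses return the cached `_real_version` without re-running the probe, which matters when the probe means forking a compiler process.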

View File

@@ -576,9 +576,7 @@ def _default(search_paths):
)
command_arguments.append(detect_version_args)
# Reverse it here so that the dict creation (last insert wins)
# does not spoil the intended precedence.
return reversed(command_arguments)
return command_arguments
fn = getattr(
operating_system, 'arguments_to_detect_version_fn', _default
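The removed `reversed()` call relied on "last insert wins" when a dict is later built from the argument list; dropping it flips the effective precedence. A sketch of that behavior:

```python
# Building a dict from (key, value) pairs: the LAST pair for a key wins.
pairs = [('cc', '/usr/bin/gcc'), ('cc', '/opt/bin/gcc')]
assert dict(pairs)['cc'] == '/opt/bin/gcc'

# Reversing the pairs first flips the effective precedence,
# so the ORIGINAL first entry wins instead.
assert dict(reversed(pairs))['cc'] == '/usr/bin/gcc'
```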

View File

@@ -38,7 +38,7 @@ def extract_version_from_output(cls, output):
def cxx11_flag(self):
# Adapted from CMake's AppleClang-CXX rules
# Spack's AppleClang detection only valid from Xcode >= 4.6
if self.version < spack.version.ver('4.0.0'):
if self.real_version < spack.version.ver('4.0.0'):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++11 standard", "cxx11_flag", "Xcode < 4.0.0"
)
@@ -47,11 +47,11 @@ def cxx11_flag(self):
@property
def cxx14_flag(self):
# Adapted from CMake's rules for AppleClang
if self.version < spack.version.ver('5.1.0'):
if self.real_version < spack.version.ver('5.1.0'):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++14 standard", "cxx14_flag", "Xcode < 5.1.0"
)
elif self.version < spack.version.ver('6.1.0'):
elif self.real_version < spack.version.ver('6.1.0'):
return "-std=c++1y"
return "-std=c++14"
@@ -59,7 +59,7 @@ def cxx14_flag(self):
@property
def cxx17_flag(self):
# Adapted from CMake's rules for AppleClang
if self.version < spack.version.ver('6.1.0'):
if self.real_version < spack.version.ver('6.1.0'):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++17 standard", "cxx17_flag", "Xcode < 6.1.0"
)

View File

@@ -34,7 +34,7 @@ class Cce(Compiler):
@property
def is_clang_based(self):
version = self.version
version = self._real_version or self.version
return version >= ver('9.0') and 'classic' not in str(version)
@property
@@ -69,9 +69,9 @@ def cxx11_flag(self):
def c99_flag(self):
if self.is_clang_based:
return '-std=c99'
elif self.version >= ver('8.4'):
elif self.real_version >= ver('8.4'):
return '-h std=c99,noconform,gnu'
elif self.version >= ver('8.1'):
elif self.real_version >= ver('8.1'):
return '-h c99,noconform,gnu'
raise UnsupportedCompilerFlag(self,
'the C99 standard',
@@ -82,7 +82,7 @@ def c99_flag(self):
def c11_flag(self):
if self.is_clang_based:
return '-std=c11'
elif self.version >= ver('8.5'):
elif self.real_version >= ver('8.5'):
return '-h std=c11,noconform,gnu'
raise UnsupportedCompilerFlag(self,
'the C11 standard',

View File

@@ -90,7 +90,7 @@ def verbose_flag(self):
@property
def cxx11_flag(self):
if self.version < ver('3.3'):
if self.real_version < ver('3.3'):
raise UnsupportedCompilerFlag(
self, "the C++11 standard", "cxx11_flag", "< 3.3"
)
@@ -98,22 +98,22 @@ def cxx11_flag(self):
@property
def cxx14_flag(self):
if self.version < ver('3.4'):
if self.real_version < ver('3.4'):
raise UnsupportedCompilerFlag(
self, "the C++14 standard", "cxx14_flag", "< 3.5"
)
elif self.version < ver('3.5'):
elif self.real_version < ver('3.5'):
return "-std=c++1y"
return "-std=c++14"
@property
def cxx17_flag(self):
if self.version < ver('3.5'):
if self.real_version < ver('3.5'):
raise UnsupportedCompilerFlag(
self, "the C++17 standard", "cxx17_flag", "< 3.5"
)
elif self.version < ver('5.0'):
elif self.real_version < ver('5.0'):
return "-std=c++1z"
return "-std=c++17"
@@ -124,7 +124,7 @@ def c99_flag(self):
@property
def c11_flag(self):
if self.version < ver('6.1.0'):
if self.real_version < ver('6.1.0'):
raise UnsupportedCompilerFlag(self,
"the C11 standard",
"c11_flag",

View File

@@ -26,7 +26,7 @@ class Fj(spack.compiler.Compiler):
'fc': 'fj/frt'}
version_argument = '--version'
version_regex = r'\((?:FCC|FRT)\) ([\d.]+)'
version_regex = r'\((?:FCC|FRT)\) ([a-z\d.]+)'
required_libs = ['libfj90i', 'libfj90f', 'libfjsrcinfo']

View File

@@ -56,53 +56,53 @@ def openmp_flag(self):
@property
def cxx98_flag(self):
if self.version < ver('6.0'):
if self.real_version < ver('6.0'):
return ""
else:
return "-std=c++98"
@property
def cxx11_flag(self):
if self.version < ver('4.3'):
if self.real_version < ver('4.3'):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++11 standard", "cxx11_flag", " < 4.3")
elif self.version < ver('4.7'):
elif self.real_version < ver('4.7'):
return "-std=c++0x"
else:
return "-std=c++11"
@property
def cxx14_flag(self):
if self.version < ver('4.8'):
if self.real_version < ver('4.8'):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++14 standard", "cxx14_flag", "< 4.8")
elif self.version < ver('4.9'):
elif self.real_version < ver('4.9'):
return "-std=c++1y"
elif self.version < ver('6.0'):
elif self.real_version < ver('6.0'):
return "-std=c++14"
else:
return ""
@property
def cxx17_flag(self):
if self.version < ver('5.0'):
if self.real_version < ver('5.0'):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++17 standard", "cxx17_flag", "< 5.0")
elif self.version < ver('6.0'):
elif self.real_version < ver('6.0'):
return "-std=c++1z"
else:
return "-std=c++17"
@property
def c99_flag(self):
if self.version < ver('4.5'):
if self.real_version < ver('4.5'):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C99 standard", "c99_flag", "< 4.5")
return "-std=c99"
@property
def c11_flag(self):
if self.version < ver('4.7'):
if self.real_version < ver('4.7'):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C11 standard", "c11_flag", "< 4.7")
return "-std=c11"

View File

@@ -48,20 +48,20 @@ def opt_flags(self):
@property
def openmp_flag(self):
if self.version < ver('16.0'):
if self.real_version < ver('16.0'):
return "-openmp"
else:
return "-qopenmp"
@property
def cxx11_flag(self):
if self.version < ver('11.1'):
if self.real_version < ver('11.1'):
raise UnsupportedCompilerFlag(self,
"the C++11 standard",
"cxx11_flag",
"< 11.1")
elif self.version < ver('13'):
elif self.real_version < ver('13'):
return "-std=c++0x"
else:
return "-std=c++11"
@@ -69,19 +69,19 @@ def cxx11_flag(self):
@property
def cxx14_flag(self):
# Adapted from CMake's Intel-CXX rules.
if self.version < ver('15'):
if self.real_version < ver('15'):
raise UnsupportedCompilerFlag(self,
"the C++14 standard",
"cxx14_flag",
"< 15")
elif self.version < ver('15.0.2'):
elif self.real_version < ver('15.0.2'):
return "-std=c++1y"
else:
return "-std=c++14"
@property
def c99_flag(self):
if self.version < ver('12'):
if self.real_version < ver('12'):
raise UnsupportedCompilerFlag(self,
"the C99 standard",
"c99_flag",
@@ -91,7 +91,7 @@ def c99_flag(self):
@property
def c11_flag(self):
if self.version < ver('16'):
if self.real_version < ver('16'):
raise UnsupportedCompilerFlag(self,
"the C11 standard",
"c11_flag",

View File

@@ -73,7 +73,7 @@ def fc_pic_flag(self):
@property
def c99_flag(self):
if self.version >= ver('12.10'):
if self.real_version >= ver('12.10'):
return '-c99'
raise UnsupportedCompilerFlag(self,
'the C99 standard',
@@ -82,7 +82,7 @@ def c99_flag(self):
@property
def c11_flag(self):
if self.version >= ver('15.3'):
if self.real_version >= ver('15.3'):
return '-c11'
raise UnsupportedCompilerFlag(self,
'the C11 standard',

View File

@@ -47,7 +47,7 @@ def openmp_flag(self):
@property
def cxx11_flag(self):
if self.version < ver('13.1'):
if self.real_version < ver('13.1'):
raise UnsupportedCompilerFlag(self,
"the C++11 standard",
"cxx11_flag",
@@ -57,9 +57,9 @@ def cxx11_flag(self):
@property
def c99_flag(self):
if self.version >= ver('13.1.1'):
if self.real_version >= ver('13.1.1'):
return '-std=gnu99'
if self.version >= ver('10.1'):
if self.real_version >= ver('10.1'):
return '-qlanglvl=extc99'
raise UnsupportedCompilerFlag(self,
'the C99 standard',
@@ -68,9 +68,9 @@ def c99_flag(self):
@property
def c11_flag(self):
if self.version >= ver('13.1.2'):
if self.real_version >= ver('13.1.2'):
return '-std=gnu11'
if self.version >= ver('12.1'):
if self.real_version >= ver('12.1'):
return '-qlanglvl=extc1x'
raise UnsupportedCompilerFlag(self,
'the C11 standard',

View File

@@ -365,6 +365,7 @@ def _proper_compiler_style(cspec, aspec):
compilers = spack.compilers.compilers_for_spec(
cspec, arch_spec=aspec
)
# If the spec passed as argument is concrete we want to check
# the versions match exactly
if (cspec.concrete and compilers and
@@ -454,7 +455,7 @@ def concretize_compiler_flags(self, spec):
# continue. `return True` here to force concretization to keep
# running.
return True
raise Exception
compiler_match = lambda other: (
spec.compiler == other.compiler and
spec.architecture == other.architecture)

View File

@@ -30,6 +30,7 @@
"""
import collections
import copy
import os
import re
@@ -140,6 +141,10 @@ def __init__(self, name, path):
self.path = path # path to directory containing configs.
self.sections = syaml.syaml_dict() # sections read from config files.
@property
def is_platform_dependent(self):
return '/' in self.name
def get_section_filename(self, section):
_validate_section_name(section)
return os.path.join(self.path, "%s.yaml" % section)
@@ -183,18 +188,27 @@ def __init__(self, name, path, schema, yaml_path=None):
Arguments:
schema (dict): jsonschema for the file to read
yaml_path (list): list of dict keys in the schema where
config data can be found;
yaml_path (list): path in the schema where config data can be
found.
If the schema accepts the following yaml data, the yaml_path
would be ['outer', 'inner']
Elements of ``yaml_path`` can be tuples or lists to represent an
"or" of keys (e.g. "env" or "spack" is ``('env', 'spack')``)
.. code-block:: yaml
outer:
inner:
config:
install_tree: $spack/opt/spack
"""
super(SingleFileScope, self).__init__(name, path)
self._raw_data = None
self.schema = schema
self.yaml_path = yaml_path or []
@property
def is_platform_dependent(self):
return False
def get_section_filename(self, section):
return self.path
@@ -229,32 +243,54 @@ def get_section(self, section):
if self._raw_data is None:
return None
section_data = self._raw_data
for key in self.yaml_path:
if self._raw_data is None:
if section_data is None:
return None
section_data = section_data[key]
# support tuples as "or" in the yaml path
if isinstance(key, (list, tuple)):
key = first_existing(self._raw_data, key)
self._raw_data = self._raw_data[key]
for section_key, data in self._raw_data.items():
for section_key, data in section_data.items():
self.sections[section_key] = {section_key: data}
return self.sections.get(section, None)
def write_section(self, section):
validate(self.sections, self.schema)
data_to_write = self._raw_data
# If there is no existing data, this section SingleFileScope has never
# been written to disk. We need to construct the portion of the data
# from the root of self._raw_data to the level at which the config
# sections are defined. That requires creating keys for every entry in
# self.yaml_path
if not data_to_write:
data_to_write = {}
# reverse because we construct it from the inside out
for key in reversed(self.yaml_path):
data_to_write = {key: data_to_write}
# data_update_pointer is a pointer to the part of data_to_write
# that we are currently updating.
# We start by traversing into the data to the point at which the
# config sections are defined. This means popping the keys from
# self.yaml_path
data_update_pointer = data_to_write
for key in self.yaml_path:
data_update_pointer = data_update_pointer[key]
# For each section, update the data at the level of our pointer
# with the data from the section
for key, data in self.sections.items():
data_update_pointer[key] = data[key]
validate(data_to_write, self.schema)
try:
parent = os.path.dirname(self.path)
mkdirp(parent)
tmp = os.path.join(parent, '.%s.tmp' % self.path)
tmp = os.path.join(parent, '.%s.tmp' % os.path.basename(self.path))
with open(tmp, 'w') as f:
syaml.dump_config(self.sections, stream=f,
syaml.dump_config(data_to_write, stream=f,
default_flow_style=False)
os.path.move(tmp, self.path)
os.rename(tmp, self.path)
except (yaml.YAMLError, IOError) as e:
raise ConfigFileError(
"Error writing to config file: '%s'" % str(e))
@@ -352,6 +388,7 @@ def __init__(self, *scopes):
self.scopes = OrderedDict()
for scope in scopes:
self.push_scope(scope)
self.format_updates = collections.defaultdict(list)
def push_scope(self, scope):
"""Add a higher precedence scope to the Configuration."""
@@ -378,7 +415,9 @@ def remove_scope(self, scope_name):
@property
def file_scopes(self):
"""List of writable scopes with an associated file."""
return [s for s in self.scopes.values() if type(s) == ConfigScope]
return [s for s in self.scopes.values()
if (type(s) == ConfigScope
or type(s) == SingleFileScope)]
def highest_precedence_scope(self):
"""Non-internal scope with highest precedence."""
@@ -390,7 +429,7 @@ def highest_precedence_non_platform_scope(self):
Platform-specific scopes are of the form scope/platform"""
generator = reversed(self.file_scopes)
highest = next(generator, None)
while highest and '/' in highest.name:
while highest and highest.is_platform_dependent:
highest = next(generator, None)
return highest
@@ -440,7 +479,7 @@ def clear_caches(self):
for scope in self.scopes.values():
scope.clear()
def update_config(self, section, update_data, scope=None):
def update_config(self, section, update_data, scope=None, force=False):
"""Update the configuration file for a particular scope.
Overwrites contents of a section in a scope with update_data,
@@ -449,7 +488,26 @@ def update_config(self, section, update_data, scope=None):
update_data should have the top-level section name stripped off
(it will be re-added). Data itself can be a list, dict, or any
other yaml-ish structure.
Configuration scopes that are still written in an old schema
format will fail to update unless ``force`` is True.
Args:
section (str): section of the configuration to be updated
update_data (dict): data to be used for the update
scope (str): scope to be updated
force (bool): force the update
"""
if self.format_updates.get(section) and not force:
msg = ('The "{0}" section of the configuration needs to be written'
' to disk, but is currently using a deprecated format. '
'Please update it using:\n\n'
'\tspack config [--scope=<scope>] update {0}\n\n'
'Note that previous versions of Spack will not be able to '
'use the updated configuration.')
msg = msg.format(section)
raise RuntimeError(msg)
_validate_section_name(section) # validate section name
scope = self._validate_scope(scope) # get ConfigScope object
@@ -514,6 +572,15 @@ def get_config(self, section, scope=None):
if section not in data:
continue
# We might be reading configuration files in an old format,
# thus read data and update it in memory if need be.
changed = _update_in_memory(data, section)
if changed:
self.format_updates[section].append(scope)
msg = ('OUTDATED CONFIGURATION FILE '
'[section={0}, scope={1}, dir={2}]')
tty.debug(msg.format(section, scope.name, scope.path))
merged_section = merge_yaml(merged_section, data)
# no config files -- empty config.
@@ -723,7 +790,7 @@ def get(path, default=None, scope=None):
def set(path, value, scope=None):
"""Convenience function for getting single values in config files.
"""Convenience function for setting single values in config files.
Accepts the path syntax described in ``get()``.
"""
@@ -999,6 +1066,41 @@ def default_list_scope():
return None
def _update_in_memory(data, section):
"""Update the format of the configuration data in memory.
This function assumes the section is valid (i.e. validation
is the responsibility of the caller)
Args:
data (dict): configuration data
section (str): section of the configuration to update
Returns:
True if the data was changed, False otherwise
"""
update_fn = ensure_latest_format_fn(section)
changed = update_fn(data[section])
return changed
def ensure_latest_format_fn(section):
"""Return a function that takes as input a dictionary read from
a configuration file and updates it to the latest format.
The function returns True if there was any update, False otherwise.
Args:
section (str): section of the configuration e.g. "packages",
"config", etc.
"""
# The line below is based on the fact that every module we need
# is already imported at the top level
section_module = getattr(spack.schema, section)
update_fn = getattr(section_module, 'update', lambda x: False)
return update_fn
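The getattr-based dispatch in ensure_latest_format_fn can be shown in isolation; the `packages` and `config` namespaces below are hypothetical stand-ins for spack.schema submodules, which may or may not define an `update(data) -> bool` migration hook:

```python
import types

# Hypothetical stand-ins for spack.schema.<section> modules
packages = types.SimpleNamespace(
    update=lambda data: data.pop('legacy', None) is not None)
config = types.SimpleNamespace()  # defines no 'update' hook

def ensure_latest_format_fn(section_module):
    # Fall back to a no-op updater when the schema module defines none
    return getattr(section_module, 'update', lambda data: False)

data = {'legacy': True}
print(ensure_latest_format_fn(packages)(data))  # True: data was migrated
print(ensure_latest_format_fn(config)(data))    # False: nothing to update
```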
class ConfigError(SpackError):
"""Superclass for all Spack config related errors."""

View File

@@ -14,7 +14,9 @@
"0.15": "0.15",
"0.15.0": "0.15.0",
"0.15.1": "0.15.1",
"0.15.2": "0.15.2"
"0.15.2": "0.15.2",
"0.15.3": "0.15.3",
"0.15.4": "0.15.4"
}
},
"ubuntu:16.04": {
@@ -32,7 +34,9 @@
"0.15": "0.15",
"0.15.0": "0.15.0",
"0.15.1": "0.15.1",
"0.15.2": "0.15.2"
"0.15.2": "0.15.2",
"0.15.3": "0.15.3",
"0.15.4": "0.15.4"
}
},
"centos:7": {
@@ -50,7 +54,9 @@
"0.15": "0.15",
"0.15.0": "0.15.0",
"0.15.1": "0.15.1",
"0.15.2": "0.15.2"
"0.15.2": "0.15.2",
"0.15.3": "0.15.3",
"0.15.4": "0.15.4"
}
},
"centos:6": {
@@ -68,7 +74,9 @@
"0.15": "0.15",
"0.15.0": "0.15.0",
"0.15.1": "0.15.1",
"0.15.2": "0.15.2"
"0.15.2": "0.15.2",
"0.15.3": "0.15.3",
"0.15.4": "0.15.4"
}
}
}

View File

@@ -17,6 +17,26 @@
default_deptype = ('build', 'link')
def deptype_chars(*type_tuples):
"""Create a string representing deptypes for many dependencies.
The string will be some subset of 'blrt', like 'bl ', 'b t', or
' lr ' where each letter in 'blrt' stands for 'build', 'link',
'run', and 'test' (the dependency types).
For a single dependency, this just indicates that the dependency has
the indicated deptypes. For a list of dependencies, this shows
whether ANY dependency in the list has the deptypes (so the deptypes
are merged).
"""
types = set()
for t in type_tuples:
if t:
types.update(t)
return ''.join(t[0] if t in types else ' ' for t in all_deptypes)
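Isolated from Spack, the merged-deptype rendering above behaves like this minimal sketch:

```python
all_deptypes = ('build', 'link', 'run', 'test')

def deptype_chars(*type_tuples):
    # Merge every deptype tuple, then render one fixed-width 'blrt' column
    types = set()
    for t in type_tuples:
        if t:
            types.update(t)
    return ''.join(t[0] if t in types else ' ' for t in all_deptypes)

print(repr(deptype_chars(('build',), ('link', 'test'))))  # 'bl t'
print(repr(deptype_chars()))                              # '    '
```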
def canonical_deptype(deptype):
"""Convert deptype to a canonical sorted tuple, or raise ValueError.
@@ -108,3 +128,8 @@ def merge(self, other):
self.patches[cond].extend(other.patches[cond])
else:
self.patches[cond] = other.patches[cond]
def __repr__(self):
types = deptype_chars(self.type)
return '<Dependency: %s -> %s [%s]>' % (
self.pkg.name, self.spec, types)

View File

@@ -463,8 +463,9 @@ def _eval_conditional(string):
class ViewDescriptor(object):
def __init__(self, root, projections={}, select=[], exclude=[],
def __init__(self, base_path, root, projections={}, select=[], exclude=[],
link=default_view_link):
self.base = base_path
self.root = root
self.projections = projections
self.select = select
@@ -494,15 +495,19 @@ def to_dict(self):
return ret
@staticmethod
def from_dict(d):
return ViewDescriptor(d['root'],
def from_dict(base_path, d):
return ViewDescriptor(base_path,
d['root'],
d.get('projections', {}),
d.get('select', []),
d.get('exclude', []),
d.get('link', default_view_link))
def view(self):
return YamlFilesystemView(self.root, spack.store.layout,
root = self.root
if not os.path.isabs(root):
root = os.path.normpath(os.path.join(self.base, self.root))
return YamlFilesystemView(root, spack.store.layout,
ignore_conflicts=True,
projections=self.projections)
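The relative-root resolution added to ViewDescriptor.view() above can be sketched on its own (the function name here is illustrative, not Spack's API):

```python
import os

def resolve_view_root(base, root):
    # Relative view roots are interpreted relative to the environment dir
    if os.path.isabs(root):
        return root
    return os.path.normpath(os.path.join(base, root))

print(resolve_view_root('/env/dir', '.spack-env/view'))
# /env/dir/.spack-env/view
```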
@@ -548,8 +553,11 @@ def regenerate(self, all_specs, roots):
# that cannot be resolved or have repos that have been removed
# we always regenerate the view from scratch. We must first make
# sure the root directory exists for the very first time though.
fs.mkdirp(self.root)
with fs.replace_directory_transaction(self.root):
root = self.root
if not os.path.isabs(root):
root = os.path.normpath(os.path.join(self.base, self.root))
fs.mkdirp(root)
with fs.replace_directory_transaction(root):
view = self.view()
view.clean()
@@ -609,9 +617,11 @@ def __init__(self, path, init_file=None, with_view=None):
self.views = {}
elif with_view is True:
self.views = {
default_view_name: ViewDescriptor(self.view_path_default)}
default_view_name: ViewDescriptor(self.path,
self.view_path_default)}
elif isinstance(with_view, six.string_types):
self.views = {default_view_name: ViewDescriptor(with_view)}
self.views = {default_view_name: ViewDescriptor(self.path,
with_view)}
# If with_view is None, then defer to the view settings determined by
# the manifest file
@@ -682,11 +692,14 @@ def _read_manifest(self, f, raw_yaml=None):
# enable_view can be boolean, string, or None
if enable_view is True or enable_view is None:
self.views = {
default_view_name: ViewDescriptor(self.view_path_default)}
default_view_name: ViewDescriptor(self.path,
self.view_path_default)}
elif isinstance(enable_view, six.string_types):
self.views = {default_view_name: ViewDescriptor(enable_view)}
self.views = {default_view_name: ViewDescriptor(self.path,
enable_view)}
elif enable_view:
self.views = dict((name, ViewDescriptor.from_dict(values))
path = self.path
self.views = dict((name, ViewDescriptor.from_dict(path, values))
for name, values in enable_view.items())
else:
self.views = {}
@@ -799,6 +812,7 @@ def included_config_scopes(self):
# load config scopes added via 'include:', in reverse so that
# highest-precedence scopes are last.
includes = config_dict(self.yaml).get('include', [])
missing = []
for i, config_path in enumerate(reversed(includes)):
# allow paths to contain spack config/environment variables, etc.
config_path = substitute_path_variables(config_path)
@@ -813,15 +827,22 @@ def included_config_scopes(self):
config_name = 'env:%s:%s' % (
self.name, os.path.basename(config_path))
scope = spack.config.ConfigScope(config_name, config_path)
else:
elif os.path.exists(config_path):
# files are assumed to be SingleFileScopes
base, ext = os.path.splitext(os.path.basename(config_path))
config_name = 'env:%s:%s' % (self.name, base)
config_name = 'env:%s:%s' % (self.name, config_path)
scope = spack.config.SingleFileScope(
config_name, config_path, spack.schema.merged.schema)
else:
missing.append(config_path)
continue
scopes.append(scope)
if missing:
msg = 'Detected {0} missing include path(s):'.format(len(missing))
msg += '\n {0}'.format('\n '.join(missing))
tty.die('{0}\nPlease correct and try again.'.format(msg))
return scopes
def env_file_config_scope_name(self):
@@ -831,24 +852,17 @@ def env_file_config_scope_name(self):
def env_file_config_scope(self):
"""Get the configuration scope for the environment's manifest file."""
config_name = self.env_file_config_scope_name()
return spack.config.SingleFileScope(config_name,
self.manifest_path,
spack.schema.env.schema,
[spack.schema.env.keys])
return spack.config.SingleFileScope(
config_name,
self.manifest_path,
spack.schema.env.schema,
[spack.config.first_existing(self.raw_yaml,
spack.schema.env.keys)])
def config_scopes(self):
"""A list of all configuration scopes for this environment."""
return self.included_config_scopes() + [self.env_file_config_scope()]
def set_config(self, path, value):
"""Set configuration for this environment"""
yaml = config_dict(self.yaml)
keys = spack.config.process_config_path(path)
for key in keys[:-1]:
yaml = yaml[key]
yaml[keys[-1]] = value
self.write()
def destroy(self):
"""Remove this environment from Spack entirely."""
shutil.rmtree(self.path)
@@ -933,6 +947,7 @@ def remove(self, query_spec, list_name=user_speclist_name, force=False):
"Not found: {0}".format(query_spec))
old_specs = set(self.user_specs)
new_specs = set()
for spec in matches:
if spec in list_to_change:
try:
@@ -1120,7 +1135,7 @@ def update_default_view(self, viewpath):
if name in self.views:
self.default_view.root = viewpath
else:
self.views[name] = ViewDescriptor(viewpath)
self.views[name] = ViewDescriptor(self.path, viewpath)
else:
self.views.pop(name, None)
@@ -1459,6 +1474,18 @@ def write(self, regenerate_views=True):
writing if True.
"""
# Intercept environment not using the latest schema format and prevent
# them from being modified
manifest_exists = os.path.exists(self.manifest_path)
if manifest_exists and not is_latest_format(self.manifest_path):
msg = ('The environment "{0}" needs to be written to disk, but '
'is currently using a deprecated format. Please update it '
'using:\n\n'
'\tspack env update {0}\n\n'
'Note that previous versions of Spack will not be able to '
'use the updated configuration.')
raise RuntimeError(msg.format(self.name))
# ensure path in var/spack/environments
fs.mkdirp(self.path)
@@ -1486,13 +1513,26 @@ def write(self, regenerate_views=True):
# write the lock file last
with fs.write_tmp_and_move(self.lock_path) as f:
sjson.dump(self._to_lockfile_dict(), stream=f)
self._update_and_write_manifest(raw_yaml_dict, yaml_dict)
else:
if os.path.exists(self.lock_path):
os.unlink(self.lock_path)
with fs.safe_remove(self.lock_path):
self._update_and_write_manifest(raw_yaml_dict, yaml_dict)
# TODO: rethink where this needs to happen along with
# writing. For some of the commands (like install, which write
# concrete specs AND regen) this might as well be a separate
# call. But, having it here makes the views consistent with the
# concretized environment for most operations. Which is the
# special case?
if regenerate_views:
self.regenerate_views()
def _update_and_write_manifest(self, raw_yaml_dict, yaml_dict):
"""Update YAML manifest for this environment based on changes to
spec lists and views and write it.
"""
# invalidate _repo cache
self._repo = None
# put any changes in the definitions in the YAML
for name, speclist in self.spec_lists.items():
if name == user_speclist_name:
@@ -1519,21 +1559,20 @@ def write(self, regenerate_views=True):
for ayl in active_yaml_lists)]
list_for_new_specs = active_yaml_lists[0].setdefault(name, [])
list_for_new_specs[:] = list_for_new_specs + new_specs
# put the new user specs in the YAML.
# This can be done directly because there can't be multiple definitions
# nor when clauses for `specs` list.
yaml_spec_list = yaml_dict.setdefault(user_speclist_name,
[])
yaml_spec_list[:] = self.user_specs.yaml_list
# Construct YAML representation of view
default_name = default_view_name
if self.views and len(self.views) == 1 and default_name in self.views:
path = self.default_view.root
if self.default_view == ViewDescriptor(self.view_path_default):
if self.default_view == ViewDescriptor(self.path,
self.view_path_default):
view = True
elif self.default_view == ViewDescriptor(path):
elif self.default_view == ViewDescriptor(self.path, path):
view = path
else:
view = dict((name, view.to_dict())
@@ -1543,9 +1582,7 @@ def write(self, regenerate_views=True):
for name, view in self.views.items())
else:
view = False
yaml_dict['view'] = view
# Remove yaml sections that are shadowing defaults
# construct garbage path to ensure we don't find a manifest by accident
with fs.temp_cwd() as env_dir:
@@ -1555,7 +1592,6 @@ def write(self, regenerate_views=True):
if yaml_dict[key] == config_dict(bare_env.yaml).get(key, None):
if key not in raw_yaml_dict:
del yaml_dict[key]
# if all that worked, write out the manifest file at the top level
# (we used to check whether the yaml had changed and not write it out
# if it hadn't. We can't do that anymore because it could be the only
@@ -1569,15 +1605,6 @@ def write(self, regenerate_views=True):
with fs.write_tmp_and_move(self.manifest_path) as f:
_write_yaml(self.yaml, f)
# TODO: rethink where this needs to happen along with
# writing. For some of the commands (like install, which write
# concrete specs AND regen) this might as well be a separate
# call. But, having it here makes the views consistent with the
# concretized environment for most operations. Which is the
# special case?
if regenerate_views:
self.regenerate_views()
def __enter__(self):
self._previous_active = _active_environment
activate(self)
@@ -1708,5 +1735,92 @@ def deactivate_config_scope(env):
spack.config.config.remove_scope(scope.name)
def manifest_file(env_name_or_dir):
"""Return the absolute path to a manifest file given the environment
name or directory.
Args:
env_name_or_dir (str): either the name of a valid environment
or a directory where a manifest file resides
Raises:
AssertionError: if the environment is not found
"""
env_dir = None
if is_env_dir(env_name_or_dir):
env_dir = os.path.abspath(env_name_or_dir)
elif exists(env_name_or_dir):
env_dir = os.path.abspath(root(env_name_or_dir))
assert env_dir, "environment not found [env={0}]".format(env_name_or_dir)
return os.path.join(env_dir, manifest_name)
def update_yaml(manifest, backup_file):
"""Update a manifest file from an old format to the current one.
Args:
manifest (str): path to a manifest file
backup_file (str): file where to copy the original manifest
Returns:
True if the manifest was updated, False otherwise.
Raises:
AssertionError: in case anything goes wrong during the update
"""
# Check if the environment needs update
with open(manifest) as f:
data = syaml.load(f)
top_level_key = _top_level_key(data)
needs_update = spack.schema.env.update(data[top_level_key])
if not needs_update:
msg = "No update needed [manifest={0}]".format(manifest)
tty.debug(msg)
return False
# Copy environment to a backup file and update it
msg = ('backup file "{0}" already exists on disk. Check its content '
'and remove it before trying to update again.')
assert not os.path.exists(backup_file), msg.format(backup_file)
shutil.copy(manifest, backup_file)
with open(manifest, 'w') as f:
syaml.dump_config(data, f)
return True
def _top_level_key(data):
"""Return the top level key used in this environment
Args:
data (dict): raw yaml data of the environment
Returns:
Either 'spack' or 'env'
"""
msg = ('cannot find top level attribute "spack" or "env"'
' in the environment')
assert any(x in data for x in ('spack', 'env')), msg
if 'spack' in data:
return 'spack'
return 'env'
def is_latest_format(manifest):
"""Return True if the manifest file is at the latest schema format,
False otherwise.
Args:
manifest (str): manifest file to be analyzed
"""
with open(manifest) as f:
data = syaml.load(f)
top_level_key = _top_level_key(data)
changed = spack.schema.env.update(data[top_level_key])
return not changed
class SpackEnvironmentError(spack.error.SpackError):
"""Superclass for all errors to do with Spack environments."""

View File

@@ -295,6 +295,9 @@ def fetch(self):
url = None
errors = []
for url in self.candidate_urls:
if not self._existing_url(url):
continue
try:
partial_file, save_file = self._fetch_from_url(url)
if save_file:
@@ -309,13 +312,22 @@ def fetch(self):
if not self.archive_file:
raise FailedDownloadError(url)
def _existing_url(self, url):
tty.debug('Checking existence of {0}'.format(url))
curl = self.curl
# Telling curl to fetch the first byte (-r 0-0) is supposed to be
# portable.
curl_args = ['--stderr', '-', '-s', '-f', '-r', '0-0', url]
_ = curl(*curl_args, fail_on_error=False, output=os.devnull)
return curl.returncode == 0
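A rough stdlib equivalent of the curl probe above, assuming a server that honors (or at least tolerates) a one-byte Range request; this is a sketch, not Spack's implementation:

```python
import urllib.request

def url_exists(url, timeout=10):
    # Mirror the curl check: request only the first byte (Range 0-0);
    # any successful response means the URL resolves.
    req = urllib.request.Request(url, headers={'Range': 'bytes=0-0'})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False
```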
def _fetch_from_url(self, url):
save_file = None
partial_file = None
if self.stage.save_filename:
save_file = self.stage.save_filename
partial_file = self.stage.save_filename + '.part'
tty.debug('Fetching {0}'.format(url))
tty.msg('Fetching {0}'.format(url))
if partial_file:
save_args = ['-C',
'-', # continue partial downloads
@@ -330,8 +342,6 @@ def _fetch_from_url(self, url):
'-', # print out HTML headers
'-L', # resolve 3xx redirects
url,
'--stderr', # redirect stderr output
'-', # redirect to stdout
]
if not spack.config.get('config:verify_ssl'):
@@ -340,7 +350,7 @@ def _fetch_from_url(self, url):
if sys.stdout.isatty() and tty.msg_enabled():
curl_args.append('-#') # status bar when using a tty
else:
curl_args.append('-sS') # just errors when not.
curl_args.append('-sS') # show errors if fail
connect_timeout = spack.config.get('config:connect_timeout', 10)
@@ -569,7 +579,7 @@ def fetch(self):
raise
# Notify the user how we fetched.
tty.debug('Using cached archive: {0}'.format(path))
tty.msg('Using cached archive: {0}'.format(path))
class VCSFetchStrategy(FetchStrategy):

View File

@@ -0,0 +1,264 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import base64
import hashlib
import os
import re
import shutil
import sys
import llnl.util.tty as tty
import llnl.util.filesystem as fs
from spack.spec import Spec
import spack.config
import spack.error
import spack.repo
import spack.util.path
import spack.util.prefix
import spack.util.spack_json as sjson
test_suite_filename = 'test_suite.lock'
results_filename = 'results.txt'
def get_escaped_text_output(filename):
"""Retrieve and escape the expected text output from the file
Args:
filename (str): path to the file
Returns:
(list of str): escaped text lines read from the file
"""
with open(filename, 'r') as f:
# Ensure special characters are escaped as needed
expected = f.read()
# Split the lines to make it easier to debug failures when there is
# a lot of output
return [re.escape(ln) for ln in expected.split('\n')]
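The escaping behavior of get_escaped_text_output can be demonstrated without Spack; each escaped line matches its original literally, even with regex metacharacters present:

```python
import re

def get_escaped_lines(text):
    # Escape regex metacharacters so expected output lines can later be
    # matched literally against actual test output
    return [re.escape(line) for line in text.split('\n')]

lines = get_escaped_lines("usage: tool [-h]\ncost: $5 (approx.)")
print(bool(re.fullmatch(lines[1], "cost: $5 (approx.)")))  # True
```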
def get_test_stage_dir():
return spack.util.path.canonicalize_path(
spack.config.get('config:test_stage', '~/.spack/test'))
def get_all_test_suites():
stage_root = get_test_stage_dir()
def valid_stage(d):
dirpath = os.path.join(stage_root, d)
return (os.path.isdir(dirpath) and
test_suite_filename in os.listdir(dirpath))
candidates = [
os.path.join(stage_root, d, test_suite_filename)
for d in os.listdir(stage_root)
if valid_stage(d)
]
test_suites = [TestSuite.from_file(c) for c in candidates]
return test_suites
def get_test_suite(name):
assert name, "Cannot search for empty test name or 'None'"
test_suites = get_all_test_suites()
names = [ts for ts in test_suites
if ts.name == name]
assert len(names) < 2, "alias shadows test suite hash"
if not names:
return None
return names[0]
class TestSuite(object):
def __init__(self, specs, alias=None):
# copy so that different test suites have different package objects
# even if they contain the same spec
self.specs = [spec.copy() for spec in specs]
self.current_test_spec = None # spec currently tested, can be virtual
self.current_base_spec = None # spec currently running do_test
self.alias = alias
self._hash = None
self.failures = {}
@property
def name(self):
return self.alias if self.alias else self.content_hash
@property
def content_hash(self):
if not self._hash:
json_text = sjson.dump(self.to_dict())
sha = hashlib.sha1(json_text.encode('utf-8'))
b32_hash = base64.b32encode(sha.digest()).lower()
if sys.version_info[0] >= 3:
b32_hash = b32_hash.decode('utf-8')
self._hash = b32_hash
return self._hash
def __call__(self, *args, **kwargs):
self.write_reproducibility_data()
remove_directory = kwargs.get('remove_directory', True)
dirty = kwargs.get('dirty', False)
fail_first = kwargs.get('fail_first', False)
for spec in self.specs:
try:
msg = "A package object cannot run in two test suites at once"
assert not spec.package.test_suite, msg
# Set up the test suite to know which test is running
spec.package.test_suite = self
self.current_base_spec = spec
self.current_test_spec = spec
# setup per-test directory in the stage dir
test_dir = self.test_dir_for_spec(spec)
if os.path.exists(test_dir):
shutil.rmtree(test_dir)
fs.mkdirp(test_dir)
# run the package tests
spec.package.do_test(
dirty=dirty
)
# Clean up on success and log passed test
if remove_directory:
shutil.rmtree(test_dir)
self.write_test_result(spec, 'PASSED')
except BaseException as exc:
if isinstance(exc, SyntaxError):
# Create the test log file and report the error.
self.ensure_stage()
msg = 'Testing package {0}\n{1}'\
.format(self.test_pkg_id(spec), str(exc))
_add_msg_to_file(self.log_file_for_spec(spec), msg)
self.write_test_result(spec, 'FAILED')
if fail_first:
break
finally:
spec.package.test_suite = None
self.current_test_spec = None
self.current_base_spec = None
def ensure_stage(self):
if not os.path.exists(self.stage):
fs.mkdirp(self.stage)
@property
def stage(self):
return spack.util.prefix.Prefix(
os.path.join(get_test_stage_dir(), self.content_hash))
@property
def results_file(self):
return self.stage.join(results_filename)
@classmethod
def test_pkg_id(cls, spec):
"""Build the standard install test package identifier
Args:
spec (Spec): instance of the spec under test
Returns:
(str): the install test package identifier
"""
return spec.format('{name}-{version}-{hash:7}')
@classmethod
def test_log_name(cls, spec):
return '%s-test-out.txt' % cls.test_pkg_id(spec)
def log_file_for_spec(self, spec):
return self.stage.join(self.test_log_name(spec))
def test_dir_for_spec(self, spec):
return self.stage.join(self.test_pkg_id(spec))
@property
def current_test_data_dir(self):
assert self.current_test_spec and self.current_base_spec
test_spec = self.current_test_spec
base_spec = self.current_base_spec
return self.test_dir_for_spec(base_spec).data.join(test_spec.name)
def add_failure(self, exc, msg):
current_hash = self.current_base_spec.dag_hash()
current_failures = self.failures.get(current_hash, [])
current_failures.append((exc, msg))
self.failures[current_hash] = current_failures
def write_test_result(self, spec, result):
msg = "{0} {1}".format(self.test_pkg_id(spec), result)
_add_msg_to_file(self.results_file, msg)
def write_reproducibility_data(self):
for spec in self.specs:
repo_cache_path = self.stage.repo.join(spec.name)
spack.repo.path.dump_provenance(spec, repo_cache_path)
for vspec in spec.package.virtuals_provided:
repo_cache_path = self.stage.repo.join(vspec.name)
if not os.path.exists(repo_cache_path):
try:
spack.repo.path.dump_provenance(vspec, repo_cache_path)
except spack.repo.UnknownPackageError:
pass # not all virtuals have package files
with open(self.stage.join(test_suite_filename), 'w') as f:
sjson.dump(self.to_dict(), stream=f)
def to_dict(self):
specs = [s.to_dict() for s in self.specs]
d = {'specs': specs}
if self.alias:
d['alias'] = self.alias
return d
@staticmethod
def from_dict(d):
specs = [Spec.from_dict(spec_dict) for spec_dict in d['specs']]
alias = d.get('alias', None)
return TestSuite(specs, alias)
@staticmethod
def from_file(filename):
try:
with open(filename, 'r') as f:
data = sjson.load(f)
return TestSuite.from_dict(data)
except Exception as e:
tty.debug(e)
raise sjson.SpackJSONError("error parsing JSON TestSuite:", str(e))
def _add_msg_to_file(filename, msg):
"""Add the message to the specified file
Args:
filename (str): path to the file
msg (str): message to be appended to the file
"""
with open(filename, 'a+') as f:
f.write('{0}\n'.format(msg))
class TestFailure(spack.error.SpackError):
"""Raised when package tests have failed for an installation."""
def __init__(self, failures):
# Failures are all exceptions
msg = "%d tests failed.\n" % len(failures)
for failure, message in failures:
msg += '\n\n%s\n' % str(failure)
msg += '\n%s\n' % message
super(TestFailure, self).__init__(msg)
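The content-hash scheme used for suite names above (sha1 of the suite's JSON, base32-encoded and lowercased) can be sketched with the stdlib; `json` stands in for spack.util.spack_json here:

```python
import base64
import hashlib
import json

def content_hash(payload):
    # sha1 over canonical JSON, rendered as lowercase base32
    # (same scheme as TestSuite.content_hash above)
    text = json.dumps(payload, sort_keys=True)
    sha = hashlib.sha1(text.encode('utf-8'))
    return base64.b32encode(sha.digest()).lower().decode('utf-8')

h = content_hash({'specs': ['zlib@1.2.11']})
print(len(h))  # 32: a 20-byte sha1 digest in base32, no padding
```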

View File

@@ -272,9 +272,9 @@ def _process_external_package(pkg, explicit):
pre = '{s.name}@{s.version} :'.format(s=pkg.spec)
spec = pkg.spec
if spec.external_module:
if spec.external_modules:
tty.msg('{0} has external module in {1}'
.format(pre, spec.external_module))
.format(pre, spec.external_modules))
tty.debug('{0} is actually installed in {1}'
.format(pre, spec.external_path))
else:
@@ -466,7 +466,6 @@ def log(pkg):
packages_dir = spack.store.layout.build_packages_path(pkg.spec)
# Remove first if we're overwriting another build
# (can happen with spack setup)
try:
# log and env install paths are inside this
shutil.rmtree(packages_dir)
@@ -1109,10 +1108,10 @@ def build_process():
pkg.name, 'src')
tty.debug('{0} Copying source to {1}'
.format(pre, src_target))
fs.install_tree(pkg.stage.source_path, src_target)
fs.install_tree(source_path, src_target)
# Do the real install in the source directory.
with fs.working_dir(pkg.stage.source_path):
with fs.working_dir(source_path):
# Save the build environment in a file before building.
dump_environment(pkg.env_path)
@@ -1134,7 +1133,7 @@ def build_process():
# Spawn a daemon that reads from a pipe and redirects
# everything to log_path
with log_output(pkg.log_path, echo, True) as logger:
with log_output(pkg.log_path, echo=echo, debug=True) as logger:
for phase_name, phase_attr in zip(
pkg.phases, pkg._InstallPhase_phases):
@@ -1568,9 +1567,14 @@ def install(self, **kwargs):
             except (Exception, SystemExit) as exc:
                 # Best effort installs suppress the exception and mark the
                 # package as a failure UNLESS this is the explicit package.
-                err = 'Failed to install {0} due to {1}: {2}'
-                tty.error(err.format(pkg.name, exc.__class__.__name__,
-                          str(exc)))
+                if (not isinstance(exc, spack.error.SpackError) or
+                    not exc.printed):
+                    # SpackErrors can be printed by the build process or at
+                    # lower levels -- skip printing if already printed.
+                    # TODO: sort out this and SpackError.print_context()
+                    err = 'Failed to install {0} due to {1}: {2}'
+                    tty.error(
+                        err.format(pkg.name, exc.__class__.__name__, str(exc)))
                 self._update_failed(task, True, exc)

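The guard added above reports an error only if it was not already printed at a lower level. A self-contained sketch of that suppress-unless-printed pattern (the `SpackError` stand-in, its `printed` attribute default, and the `report` helper are assumptions made for illustration):

```python
class SpackError(Exception):
    # Stand-in for spack.error.SpackError; `printed` marks errors that
    # were already reported at a lower level (an assumption here).
    def __init__(self, message):
        super(SpackError, self).__init__(message)
        self.printed = False


reported = []


def report(exc):
    # Mirror the guard from the diff: skip errors already printed.
    if not isinstance(exc, SpackError) or not exc.printed:
        reported.append('Failed to install pkg due to {0}: {1}'.format(
            type(exc).__name__, exc))


already = SpackError("build failed")
already.printed = True
report(already)            # suppressed: printed earlier
report(ValueError("bad"))  # reported: never printed before
```

The effect is one error message per failure, rather than the same traceback echoed at every level of the install stack.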
@@ -128,8 +128,8 @@ def get_version():
     git = exe.which("git")
     if git:
         with fs.working_dir(spack.paths.prefix):
-            desc = git(
-                "describe", "--tags", output=str, fail_on_error=False)
+            desc = git("describe", "--tags", "--match", "v*",
+                       output=str, error=os.devnull, fail_on_error=False)

         if git.returncode == 0:
             match = re.match(r"v([^-]+)-([^-]+)-g([a-f\d]+)", desc)
@@ -281,7 +281,7 @@ def add_subcommand_group(title, commands):
   spack help --all       list all commands and options
   spack help <command>   help on a specific command
   spack help --spec      help on the package specification syntax
-  spack docs             open http://spack.rtfd.io/ in a browser
+  spack docs             open https://spack.rtfd.io/ in a browser
 """.format(help=section_descriptions['help']))

 # determine help from format above
@@ -702,16 +702,16 @@ def main(argv=None):
         if stored_var_name in os.environ:
             os.environ[var] = os.environ[stored_var_name]

+    # make spack.config aware of any command line configuration scopes
+    if args.config_scopes:
+        spack.config.command_line_scopes = args.config_scopes
+
     # activate an environment if one was specified on the command line
     if not args.no_env:
         env = ev.find_environment(args)
         if env:
             ev.activate(env, args.use_env_repo, add_view=False)

-    # make spack.config aware of any command line configuration scopes
-    if args.config_scopes:
-        spack.config.command_line_scopes = args.config_scopes
-
     if args.print_shell_vars:
         print_setup_info(*args.print_shell_vars.split(','))
         return 0

@@ -36,6 +36,7 @@
 import re

 import llnl.util.filesystem
+from llnl.util.lang import dedupe
 import llnl.util.tty as tty

 import spack.build_environment as build_environment
 import spack.error
@@ -173,12 +174,8 @@ def merge_config_rules(configuration, spec):
     # evaluated in order of appearance in the module file
     spec_configuration = module_specific_configuration.pop('all', {})
     for constraint, action in module_specific_configuration.items():
-        override = False
-        if constraint.endswith(':'):
-            constraint = constraint.strip(':')
-            override = True
         if spec.satisfies(constraint, strict=True):
-            if override:
+            if hasattr(constraint, 'override') and constraint.override:
                 spec_configuration = {}
             update_dictionary_extending_lists(spec_configuration, action)
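The loop above folds each matching rule into the accumulated configuration via `update_dictionary_extending_lists`. A simplified sketch of what such a merge could look like (this is an assumption for illustration, not Spack's exact implementation):

```python
def update_dictionary_extending_lists(target, update):
    # Recursively merge `update` into `target`: nested dicts are merged,
    # lists are extended rather than replaced, scalars are overwritten.
    # A simplified sketch, not Spack's exact helper.
    for key, value in update.items():
        if isinstance(value, dict) and isinstance(target.get(key), dict):
            update_dictionary_extending_lists(target[key], value)
        elif isinstance(value, list) and isinstance(target.get(key), list):
            target[key].extend(value)
        else:
            target[key] = value


config = {'environment': {'set': {'CC': 'gcc'}}, 'filters': ['CPATH']}
update_dictionary_extending_lists(config, {'filters': ['LIBRARY_PATH']})
```

Extending lists instead of replacing them is what lets a later `modules.yaml` rule add to, say, an environment-variable filter list without clobbering entries contributed by earlier rules.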
@@ -442,7 +439,7 @@ def suffixes(self):
         for constraint, suffix in self.conf.get('suffixes', {}).items():
             if constraint in self.spec:
                 suffixes.append(suffix)
-        suffixes = sorted(set(suffixes))
+        suffixes = list(dedupe(suffixes))
         if self.hash:
             suffixes.append(self.hash)
         return suffixes
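The change above swaps `sorted(set(suffixes))` for an order-preserving dedupe, so module-file suffixes keep the order in which they appear in the configuration. A minimal sketch of such a dedupe (a plausible equivalent of `llnl.util.lang.dedupe`, not necessarily its exact code):

```python
def dedupe(sequence):
    # Yield unique items, preserving first-seen order.
    seen = set()
    for item in sequence:
        if item not in seen:
            seen.add(item)
            yield item


suffixes = ['openblas', 'debug', 'openblas']
ordered = list(dedupe(suffixes))      # keeps configuration order
alphabetical = sorted(set(suffixes))  # the old behavior, for contrast
```

With `sorted(set(...))`, a user who configured the suffixes `openblas` then `debug` would get a module name ending in `-debug-openblas`; the dedupe keeps `-openblas-debug`.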

@@ -12,6 +12,7 @@
 import spack.config
 import spack.compilers
 import spack.spec
+import spack.repo
 import spack.error
 import spack.tengine as tengine
@@ -125,7 +126,9 @@ def hierarchy_tokens(self):
         # Check if all the tokens in the hierarchy are virtual specs.
         # If not warn the user and raise an error.
-        not_virtual = [t for t in tokens if not spack.spec.Spec.is_virtual(t)]
+        not_virtual = [t for t in tokens
+                       if t != 'compiler' and
+                       not spack.repo.path.is_virtual(t)]
         if not_virtual:
             msg = "Non-virtual specs in 'hierarchy' list for lmod: {0}\n"
             msg += "Please check the 'modules.yaml' configuration files"
