Compare commits

...

370 Commits

Author SHA1 Message Date
Todd Gamblin
8eab69fc0b Merge branch 'develop' for v0.8.15 2015-02-24 10:42:58 -08:00
Todd Gamblin
ffdb90f39a Last minute Qt bugfix. 2015-02-24 10:42:35 -08:00
Todd Gamblin
5eb7e46654 Spell check docs 2015-02-24 10:26:26 -08:00
Todd Gamblin
447e295947 SPACK-62: fix for site docs. 2015-02-24 02:45:21 -08:00
Todd Gamblin
daa38d2ff4 SPACK-59: Documentation updates, bugfix in fetching. 2015-02-24 02:45:07 -08:00
Todd Gamblin
049808a34f Merge remote-tracking branch 'origin/features/SPACK-46' into develop
Conflicts:
	lib/spack/docs/packaging_guide.rst
2015-02-23 10:46:58 -08:00
Todd Gamblin
daef78f538 Update packaging documentation. 2015-02-23 10:31:22 -08:00
Todd Gamblin
5699cbb597 Fix SPACK-60: 0.8.15 basic docs. 2015-02-23 01:23:30 -08:00
Todd Gamblin
6dab133d9f Same package add icon on mac and linux. 2015-02-23 01:23:30 -08:00
Todd Gamblin
d49c98188a Add an override to colify so we can set terminal dimensions. 2015-02-23 01:23:30 -08:00
Todd Gamblin
065e5ccd1a Update contributors list. 2015-02-18 20:51:50 -08:00
Todd Gamblin
27e9bfb5aa Merge pull request #19 from psaravan/develop
Added netcdf package support.
2015-02-18 20:38:04 -08:00
Saravan Pantham
14e70ad689 Added netcdf package support. 2015-02-18 18:05:57 -08:00
Todd Gamblin
02e316e772 Convert ValueErrors to SpackError subclass. 2015-02-18 16:45:54 -08:00
Todd Gamblin
2374eb4dca Fix for SPACK-62
- deactivate -a errors if arg is not activated
- deactivate -af does not.
2015-02-18 16:45:12 -08:00
Todd Gamblin
2eda01c703 uninstall -f ignores nonexisting packages. 2015-02-18 16:21:15 -08:00
Todd Gamblin
2755171e08 Update documentation to reflect new restage/clean behavior. 2015-02-18 14:46:00 -08:00
Todd Gamblin
e67655c31a docs autodetect version. 2015-02-18 14:33:21 -08:00
Todd Gamblin
c7b8a4e25c Fix for SPACK-46: cleanup spack clean, spack restage. 2015-02-18 14:00:37 -08:00
Todd Gamblin
db11373351 Resurrect combined qt4/5 package from b7dacb 2015-02-18 13:16:22 -08:00
Gregory L. Lee
59198e29f9 Merge branch 'develop' of ssh://cz-stash.llnl.gov:7999/scale/spack into develop 2015-02-18 13:14:26 -08:00
Gregory L. Lee
6e13d0985c fixed deps for python packages 2015-02-18 13:13:19 -08:00
Todd Gamblin
3e5aa4b0f5 llvm/clang version bump 2015-02-18 10:59:03 -08:00
Todd Gamblin
959ce4f985 Downgrade standard version of ImageMagick to a non-changing URL.
- bleeding edge still available but commented by default.
2015-02-18 10:59:03 -08:00
Todd Gamblin
14097e39cc Suppress download status meter when routing I/O to a file. 2015-02-18 10:59:03 -08:00
Gregory L. Lee
44003449d5 fixed install steps for version 4 2015-02-17 16:26:00 -08:00
Todd Gamblin
17ac609d23 Merge branch 'features/memaxes' into develop
Conflicts:
	var/spack/packages/libpng/package.py
2015-02-17 00:49:52 -08:00
Todd Gamblin
724b72bdaf take out dyninst 8.2 for now.
- doesn't build correctly with boost 1.55
2015-02-17 00:47:35 -08:00
Todd Gamblin
4af85441db Merge branch 'features/python-modules' into develop 2015-02-17 00:44:02 -08:00
Todd Gamblin
d800c23cec Better activate/deactivate logic.
spack activate
  - now activates dependency extensions
  - ensures dependencies are activated in the python installation.
  - -f/--force option still allows the old activate behavior.

spack deactivate
  - checks for dependents before deactivating (like uninstall)
  - deactivate -a/--all <extension> will deactivate a package and ALL
    of its dependency extensions.
  - deactivate -a/--all <extendee> deactivates all extensions of <extendee>
    e.g.: spack deactivate -a python
  - deactivate -f/--force option allows removing regardless of dependents.
    - deactivate -f can be run EVEN if a package is not activated.
    - allows for cleanup of activations gone wrong.
2015-02-17 00:24:58 -08:00
Todd Gamblin
57f331e2ac Ignore conflicting nose tests in py-nose and py-matplotlib. 2015-02-17 00:22:18 -08:00
Todd Gamblin
67db8ddca8 Factor ignore logic into a predicate builder. 2015-02-17 00:21:15 -08:00
Todd Gamblin
06d6b0b205 More py-setuptools dependencies added. 2015-02-16 21:53:55 -08:00
Todd Gamblin
13376efafc Add package-specific rpath back to shiboken and pyside. 2015-02-16 21:53:34 -08:00
Todd Gamblin
e6b2c27011 Factor out forking logic to build_environment.py. 2015-02-16 21:41:31 -08:00
Todd Gamblin
614c22fc1b Allow forced deactivation -- best effort unlinking
spack deactivate -f will unlink even if Spack thinks the package isn't enabled.
Made deactivate routines idempotent.
2015-02-16 12:41:22 -08:00
Todd Gamblin
8aa3afcfde Python package cleanup.
- Added a number of dependencies to python packages.
- Python packages may still not build without some OS support.
- Example: Numpy needs ATLAS, and will use a system ATLAS install.
  - Atlas requires turning off CPU throttling to build.
  - can't do this as a regular user -- unclear how to build ATLAS with Spack
  - currently relying on a system ATLAS install.
2015-02-15 23:04:20 -08:00
Todd Gamblin
847ed8ad39 Add libxslt, cleanup libxml2. 2015-02-15 23:04:04 -08:00
Todd Gamblin
b86eb69552 libgcrypt and libgpg-error packages. 2015-02-15 23:03:33 -08:00
Todd Gamblin
65d60f92f5 qhull package. 2015-02-15 23:02:51 -08:00
Todd Gamblin
36579844d9 Add Tcl/Tk packages. 2015-02-15 23:02:36 -08:00
Todd Gamblin
ce011501f9 Add R package. 2015-02-15 23:02:21 -08:00
Todd Gamblin
b11061f99d Rename py-pyqt4 to py-pyqt. 2015-02-15 12:40:02 -08:00
Todd Gamblin
2f67cdaf10 Better time output on build completion. 2015-02-15 12:39:10 -08:00
Todd Gamblin
d1e03329c5 Memoize all_specs() and exists() for better performance.
- Real bottleneck is calling normalize() for every spec when we read it.
- Need to store graph information in spec files to avoid the need for this.
  - Also, normalizing old specs isn't always possible, so we need to do this anyway.
2015-02-15 11:50:13 -08:00
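
A standalone sketch of the memoization pattern described above (Spack's own decorator lives in its utility library; this version is illustrative only):

    def memoized(func):
        cache = {}
        def wrapper(*args):
            if args not in cache:
                cache[args] = func(*args)
            return cache[args]
        return wrapper

    calls = []

    @memoized
    def all_specs():
        calls.append(1)                  # stands in for the expensive scan
        return ['callpath', 'dyninst']   # stands in for reading installed specs

    all_specs(); all_specs()
    assert len(calls) == 1               # the second call hits the cache
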
Todd Gamblin
3c0048dd89 py-sip installs properly into a prefix 2015-02-15 01:59:36 -08:00
Todd Gamblin
c0c0879924 Better extension activation/deactivation 2015-02-15 01:58:35 -08:00
Todd Gamblin
82dc935a50 installed_extensions_for no longer fails when nothing known about pkg 2015-02-15 01:49:50 -08:00
Todd Gamblin
93067d0d63 Add profile option to spack script. 2015-02-15 01:45:05 -08:00
Todd Gamblin
0c94a6e2b0 Merge branch 'features/python-modules' into features/memaxes
Conflicts:
	var/spack/packages/qt/package.py
2015-02-12 10:01:58 -08:00
Gregory L. Lee
5c2608b032 typo: Self -> self 2015-02-09 15:55:18 -08:00
Todd Gamblin
25af341954 Python package improvements. 2015-02-09 02:54:49 -08:00
Todd Gamblin
d1d0b85d80 Add Alfredo to contributors. 2015-02-09 01:13:56 -08:00
Todd Gamblin
1e5bbe60f7 Merge pull request #18 from ch4i/features/memaxes
Features/memaxes
2015-02-09 01:06:06 -08:00
Alfredo Gimenez
27617670f0 qt with hardware accelerated opengl working 2015-02-09 00:01:07 -08:00
Todd Gamblin
aae364b4c9 "spack extensions" shows total extension count. 2015-02-08 23:26:15 -08:00
Todd Gamblin
c077f05705 Move dependency environment setup to build_environment. 2015-02-08 22:01:00 -08:00
Todd Gamblin
f81b136547 import fix in cmd/clean 2015-02-08 19:43:10 -08:00
Todd Gamblin
20ec80295d setup_extension_environment is now setup_dependent_environment.
- other packages, like Qt, can now use this to set up relevant build
  variables and env vars for their dependencies.

- not just extensions anymore.
2015-02-08 19:41:17 -08:00
Todd Gamblin
60a385d4a4 Minor textual error in extensions command. 2015-02-08 19:40:28 -08:00
Todd Gamblin
e51e01f4f0 Cleaned up python to remove redundant line. 2015-02-08 19:39:36 -08:00
Todd Gamblin
befe72b9b9 directory_layout now raises an error when an install fails. 2015-02-08 19:36:30 -08:00
Alfredo Gimenez
9e878075ac mesa 8.0.5 working 2015-02-08 16:09:13 -08:00
Alfredo Gimenez
cc684a3ebe older mesa for 2.6 kernel (not workin yet) 2015-02-08 13:34:45 -08:00
Alfredo Gimenez
1605e04d44 mesa and systemd (systemd not working yet) 2015-02-07 22:08:50 -08:00
Alfredo Gimenez
932f3930f4 util-linux added 2015-02-07 09:18:34 -08:00
Alfredo Gimenez
676cc84c9e more mesa dependencies 2015-02-06 17:24:55 -08:00
Alfredo Gimenez
5fdf5438ea flex and bison 2015-02-06 16:55:48 -08:00
Alfredo Gimenez
d95d48bbe6 py-mako and fix for setup-env.sh 2015-02-06 16:43:21 -08:00
Gregory L. Lee
5cc369c2b8 add dependent packages to PYTHONPATH for build 2015-02-06 16:35:35 -08:00
Alfredo Gimenez
a4ac1977a4 merge with python-modules 2015-02-06 16:27:33 -08:00
Todd Gamblin
457f2d1d51 Fix libpng to use a better URL
Sourceforge URLs like this eventually die when the libpng version is bumped:
    http://sourceforge.net/projects/libpng/files/libpng16/1.6.14/libpng-1.6.14.tar.gz/download

But ones like this give you a "permanently moved", which curl -L will follow:
    http://download.sourceforge.net/libpng/libpng-1.6.16.tar.gz
2015-02-06 08:37:22 -08:00
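
In the package DSL used throughout Spack, the fix is just a matter of which url the package declares; a hedged sketch (class body illustrative, checksum elided):

    from spack import *

    class Libpng(Package):
        """libpng graphics file format library."""
        homepage = "http://www.libpng.org/pub/png/libpng.html"
        # the "permanently moved" redirect that curl -L follows, instead of
        # the versioned sourceforge path that dies when the version is bumped:
        url = "http://download.sourceforge.net/libpng/libpng-1.6.16.tar.gz"

        version('1.6.16', 'md5 checksum elided')

        def install(self, spec, prefix):
            configure("--prefix=" + prefix)
            make()
            make("install")
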
Todd Gamblin
3a3e4d4391 Do not automatically activate extensions on install. 2015-02-04 15:47:03 -08:00
Todd Gamblin
a9e189972a Bugfix in spack extensions 2015-02-04 15:42:41 -08:00
Todd Gamblin
5bde8359e8 More information in extensions command. 2015-02-02 11:20:36 -08:00
Todd Gamblin
2d9190d264 Add extensions command. 2015-02-02 11:20:36 -08:00
Todd Gamblin
6b90017efa Fixed dumb link_tree bug, added test for link tree. 2015-02-02 11:20:35 -08:00
Todd Gamblin
6400ace901 Add "spack extensions" command to list activated extensions. 2015-02-02 11:19:54 -08:00
Todd Gamblin
70c8bf44b8 Fix for install sanity check -- don't count hidden dir layout files. 2015-02-02 11:19:54 -08:00
Gregory L. Lee
48f1ff87f8 added more Python modules 2015-02-02 11:19:54 -08:00
Gregory L. Lee
2bc3f74df2 added more Python modules 2015-02-02 11:19:54 -08:00
Todd Gamblin
de91c95e8e Ability to ignore files in activate/deactivate for extensions. 2015-02-02 11:19:54 -08:00
Todd Gamblin
ff9cb94f4f Add arguements to extends() and activate/deactivate. 2015-02-02 11:19:54 -08:00
Gregory L. Lee
9fa489b7f2 added several modules 2015-02-02 11:19:54 -08:00
Gregory L. Lee
7992f415fe added py-nose 2015-02-02 11:19:54 -08:00
Todd Gamblin
2ae7f53b83 Bugfix: Extension hooks should only run for extensions. 2015-02-02 11:19:54 -08:00
Todd Gamblin
89ccdf92cd Add activate and deactivate commands for extensions. 2015-02-02 11:19:54 -08:00
Todd Gamblin
acc62abbd0 Rework do_activate/activate and do_deactivate/deactivate semantics.
- packages can now extend only one other package.
- do_activate() and do_deactivate() are now called on the extension,
  and they automatically find the extendee
- activate() and deactivate() are still called on the extendee and are
  passed the extension.
2015-02-02 11:19:53 -08:00
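
A hypothetical sketch of the reworked semantics (attribute and method names assumed, not Spack's exact code):

    class Package(object):
        extendees = {}                    # at most one entry after this change

        def do_activate(self):
            # called on the *extension*; it finds its extendee itself
            extendee = self.extendee_spec.package   # assumed helper
            extendee.activate(self)       # extendee is handed the extension

        def do_deactivate(self):
            extendee = self.extendee_spec.package
            extendee.deactivate(self)
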
Todd Gamblin
d13bbeb605 Add PYTHONPATH to modules for python extensions. 2015-02-02 11:19:53 -08:00
Todd Gamblin
bcccf02020 Add setup_extension_environment() method.
- lets packages do some setup before their extensions run install()
2015-02-02 11:19:53 -08:00
Todd Gamblin
82946d2914 Move symlink tree routines to LinkTree class. 2015-02-02 11:19:52 -08:00
Todd Gamblin
9977543478 Added feature: package extensions
- packages can be "extended" by others
- allows extension to be symlinked into extendee's prefix.
- used for python modules.
  - first module: py-setuptools
2015-02-02 11:19:00 -08:00
Todd Gamblin
7215aee224 do_install() passes kwargs to dependencies. 2015-02-02 11:16:24 -08:00
Todd Gamblin
2c1eda66c4 First python extension package: setuptools 2015-02-02 11:16:23 -08:00
Todd Gamblin
adb7d614e6 Add pre-install and pre-uninstall hooks. 2015-02-02 11:16:23 -08:00
Todd Gamblin
ebe0c1d83a New "extends" relation adds another special list to the package class. 2015-02-02 11:16:23 -08:00
Todd Gamblin
88afad3e46 Directory layout can now track installed extensions per package. 2015-02-02 11:16:23 -08:00
Todd Gamblin
ba593ccb26 Fix bug in mirror path construction. 2015-02-02 11:15:24 -08:00
Todd Gamblin
81a4d89e94 Merge pull request #15 from ch4i/features/memaxes
Mitos package
2015-01-25 12:29:15 -08:00
Alfredo Gimenez
6a496ef620 PSAPI v0.6 -> Mitos v0.7 2015-01-23 16:58:15 -08:00
Todd Gamblin
0ac6ffb3ef Add extra gcc dependencies.
- not used until optional/conditional deps are implemented.
2015-01-23 00:05:23 -08:00
Todd Gamblin
3e37903ffd Packages have rpath property. 2015-01-23 00:03:51 -08:00
Todd Gamblin
e6b4530234 Add is_exe function to filesystem. 2015-01-22 13:52:28 -08:00
Alfredo Gimenez
e97db785d6 psapi v0.6 2015-01-21 20:42:44 -08:00
Todd Gamblin
51ed0d3f6f Properly set install RPATHS for cmake builds. 2015-01-19 20:59:23 -08:00
Todd Gamblin
2a0e33876e Add PSAPI 2015-01-19 20:45:27 -08:00
Todd Gamblin
d08c0703a0 Initial build of MemAxes GUI. 2015-01-19 14:07:41 -08:00
Todd Gamblin
b7dacb427d Qt5 & VTK builds. VTK works with Qt 4 and 5. 2015-01-19 14:07:09 -08:00
Todd Gamblin
0211adbdb6 version bump libpng 2015-01-19 14:06:25 -08:00
Todd Gamblin
53c8b4249a Make dbus put a machine id file in the right place. 2015-01-19 14:06:09 -08:00
Todd Gamblin
f35b8b8db4 Better location error output. 2015-01-19 14:05:48 -08:00
Todd Gamblin
a4c19eee14 Qt5 webkit requires gperf 2015-01-19 14:00:54 -08:00
Todd Gamblin
4e3662f318 Dyninst 8.2 works. 2015-01-17 17:09:42 -08:00
Todd Gamblin
c6351b5d00 Fix #11: bug in ProviderIndex
- packages that provided the same spec (e.g. mpe) were overwritten in the index
  - Index now has a set of providers instead of a single provider per provided spec.
- see https://github.com/scalability-llnl/spack/issues/11
2015-01-14 00:18:29 -08:00
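
A minimal sketch of the fix, assuming the index is a plain mapping (Spack's ProviderIndex is more involved):

    from collections import defaultdict

    # before: {provided_spec: provider} -- a second provider overwrote the first
    # after:  {provided_spec: set(providers)}
    index = defaultdict(set)
    index['mpi'].add('mpich')
    index['mpi'].add('mvapich2')      # no longer clobbers mpich
    assert index['mpi'] == set(['mpich', 'mvapich2'])
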
Todd Gamblin
f73abe6849 Merge branch 'features/dep-graph' into develop 2015-01-13 01:00:55 -08:00
Todd Gamblin
fa67d69585 Merge branch 'develop' of github.com:scalability-llnl/spack into develop 2015-01-13 00:53:04 -08:00
Todd Gamblin
917d82be0d Add list_url for ompss 2015-01-13 00:45:12 -08:00
George Todd Gamblin
1324b32665 Merge pull request #26 in SCALE/spack from ~JAULMES1/spack:develop to develop
# By Luc Jaulmes
# Via Luc Jaulmes
* commit '844c0838487529c0f2edc6f09e6ef86f12364716':
  Updated versions in OmpSs and Extrae, which resolves version-dependency problems with MPI
2015-01-12 22:33:26 -08:00
Luc Jaulmes
844c083848 Updated versions in OmpSs and Extrae, which resolves version-dependency problems with MPI 2015-01-12 20:38:32 +01:00
Todd Gamblin
9db967be98 Fix bug when all deps are back edges.
- Happened with the graph for SAMRAI
2015-01-10 19:23:07 -08:00
Todd Gamblin
011f71a442 Fix bug in STAT graph 2015-01-10 19:09:03 -08:00
Todd Gamblin
36198c525b Merge pull request #10 from justintoo/rose
Add Packages for ROSE and JDK
2015-01-08 09:10:52 -08:00
Justin Too
3a07ec6c7d (Package) Add ROSE compiler package 2015-01-07 14:07:35 -08:00
Justin Too
cd9e4b5b7f (Package) Add Oracle JDK package 2015-01-07 14:07:02 -08:00
Todd Gamblin
935eba2357 Allow commands to return error codes. 2015-01-05 02:33:15 -05:00
Todd Gamblin
5d033fbd0a Expansion works properly, simplified graph code. 2015-01-04 18:49:22 -08:00
Todd Gamblin
b4b8339d0d bugfix for dot graphs of virtual packages. 2015-01-03 17:58:37 -08:00
Todd Gamblin
0a0291678e Factor graph code out into its own module, rework spack graph. 2015-01-03 17:45:54 -08:00
Todd Gamblin
478af54cce Color graph edges. 2014-12-31 14:55:35 -08:00
Todd Gamblin
dba5d020cd Pipelining back edges works, saves more space. 2014-12-30 18:05:47 -08:00
Todd Gamblin
bb3dafa3b5 Reduce number of immediate expand/contracts. 2014-12-29 21:11:28 -08:00
Todd Gamblin
daf1e229f7 More compact graphs: do back edges before forward expansion. 2014-12-29 14:29:44 -08:00
Todd Gamblin
226de0a42d Spec graph works without color. 2014-12-29 01:52:03 -08:00
Todd Gamblin
a6e00f6086 Fix ColorStream 2014-12-29 01:05:21 -08:00
Todd Gamblin
6ffcdc1166 Partially working ASCII dependency graph. 2014-12-29 00:03:35 -08:00
Todd Gamblin
860f834aad spack graph allows plotting specific packages. 2014-12-26 13:52:49 -08:00
Todd Gamblin
9dabcc8703 Git package. 2014-12-26 00:07:15 -08:00
Todd Gamblin
d3e52d9f9a Fix lack of sorting in version concretization. 2014-12-25 23:13:44 -08:00
Todd Gamblin
b0ce1b81ba Fix SPINDLE and SCR download URLs. 2014-12-25 18:42:03 -08:00
Todd Gamblin
b80a0e1da5 Merge branch 'features/qt' into develop 2014-12-25 18:01:51 -08:00
Todd Gamblin
37bdbdd990 URLFetchStrategy now handles exploding tarballs. 2014-12-25 17:57:55 -08:00
Todd Gamblin
0bc861db6e Fix up bzip2 install 2014-12-25 17:55:19 -08:00
Todd Gamblin
d98e475361 Qt4 builds successfully with proper RPATHs. 2014-12-25 16:09:42 -08:00
Todd Gamblin
20388ece86 Clearer code in filter_file. 2014-12-25 16:07:39 -08:00
Todd Gamblin
7b71e6fb5a spack env command
spack env allows regular commands to be run with a spack build environment.
It also displays the spack build environment for a package.
2014-12-25 16:06:30 -08:00
Todd Gamblin
b3042db755 Add patch function to Package, so that packages can define custom patch functions. 2014-12-25 16:05:45 -08:00
Todd Gamblin
852c1dc286 Print out fetch, build, and total time for builds. 2014-12-23 16:35:54 -08:00
Todd Gamblin
01ca61c7cc Updates for Qt dependencies 2014-12-23 14:43:05 -08:00
Todd Gamblin
8edf299dd2 gnutls, nettle, wget, dbus 2014-12-23 14:43:05 -08:00
Todd Gamblin
e0b5890ab5 Initial versions of Qt and some dependencies. 2014-12-23 14:43:04 -08:00
Todd Gamblin
887c29ddc4 Merge branch 'features/better-mirror-support' into develop 2014-12-22 23:28:07 -08:00
Todd Gamblin
983f35f32a Tweak extrae indentation. 2014-12-22 23:24:21 -08:00
Todd Gamblin
c8d2097bae Merge branch 'features/gperftools' into develop 2014-12-22 23:15:44 -08:00
Todd Gamblin
ab3bf61903 Fix for SPACK-50
Bad format string in version check.
2014-12-19 11:09:37 -08:00
Todd Gamblin
5a3a838fe5 Merge branch 'bugfix/load-hooks-fix' into develop 2014-12-18 21:40:20 -08:00
Todd Gamblin
5cd4ddaf08 Fix for SPACK-49.
- name conflict in imp.load_source caused this to fail.
- Python modules loaded by imp have unique names now.
2014-12-18 21:38:25 -08:00
Todd Gamblin
08f1701e35 Allow fake installations (just make the directory).
- Use for debugging.
2014-12-18 15:52:45 -08:00
Adam Moody
a9be5e7239 add gperftools (tcmalloc and friends) 2014-12-18 11:31:58 -08:00
Todd Gamblin
f1c5e64c23 Partial fix for SPACK-48.
- Try to better accommodate packages that have grown dependencies.
- This will only get fully fixed when optional dependencies are supported
  and some extra functionality is added to the spec syntax.
2014-12-15 14:46:34 -08:00
Todd Gamblin
722e73f309 Better mirror path calculation.
- Add support in spack.url for extrapolating actual file type for URL
- Move mirror path computation to mirror.py from package.py
2014-12-12 14:53:55 -08:00
Todd Gamblin
2f90068661 Handle cases where tarball is in the URL query string. 2014-12-12 14:48:59 -08:00
Todd Gamblin
e309b41972 Add support for URLs with query strings
- support tarballs from raw github URLs
2014-12-09 01:07:48 -08:00
Todd Gamblin
c3fce7b77f Bugfix in create and checksum 2014-12-08 22:49:49 -08:00
Todd Gamblin
105420f372 Merge branch 'bugfix/ncurses-pkgconfig' into develop
Fixes #7 on github.
2014-12-05 09:19:00 -08:00
Todd Gamblin
588955a987 Disable pkgconfig files until I support this better. 2014-12-05 08:45:51 -08:00
Todd Gamblin
7dc90c7097 Add experimental gasnet package for legion. 2014-12-04 10:53:52 -08:00
Todd Gamblin
ba53ccb6b3 Minor tweak: use self.git everywhere in get fetch strategy. 2014-12-04 10:51:23 -08:00
Todd Gamblin
c774455fc5 Bugfix in create command. 2014-12-04 10:47:01 -08:00
Todd Gamblin
c19347a055 Merge branch 'features/mpibash' into develop 2014-12-02 23:00:11 -08:00
Adam Moody
0f04f75fa3 add autoconf and libcircle dependencies, call autoconf before configure 2014-12-02 22:59:33 -08:00
Todd Gamblin
652b761894 Merge branch 'features/better-find' into develop 2014-12-02 22:55:11 -08:00
Todd Gamblin
fdc6081244 CLI improvements to find and list. 2014-12-02 22:53:11 -08:00
Todd Gamblin
11cffff943 colify handles ansi color input directly; no more decorator. 2014-12-02 22:32:15 -08:00
Todd Gamblin
908400bfc5 Fix dyninst 8.1.1 checksum. 2014-12-02 21:57:37 -08:00
Todd Gamblin
0c12e26026 Bugfix in boost build.
- b2 used to be called bjam
2014-12-02 14:25:52 -08:00
Todd Gamblin
e71cf672f1 Fail fast in stage if all fetch strategies fail for a package. 2014-12-02 09:58:30 -08:00
Todd Gamblin
40b4fa5443 Better spack find view. 2014-12-01 23:14:06 -08:00
Todd Gamblin
e15316e825 index_by supports compound index keys. 2014-12-01 23:13:09 -08:00
Todd Gamblin
72c753b93e Colify now supports fixing the number of columns. 2014-12-01 21:29:01 -08:00
Todd Gamblin
22e4d11010 Cleanup code in colify. 2014-11-23 16:19:26 -08:00
Todd Gamblin
287b04e50a Bugfix in terminal_size() 2014-11-23 17:55:37 -06:00
Todd Gamblin
d2fe038caf Minor bugfix for 404 error on fetch. 2014-11-17 15:03:48 -08:00
Todd Gamblin
321a3a55c7 Prompt the user about checksums only if interactive. 2014-11-16 15:26:00 -08:00
Todd Gamblin
eba13b8653 Checksum warning now prompts for override. 2014-11-08 23:20:01 -08:00
Todd Gamblin
79414947ae Merge branch 'features/gcc' into develop
Conflicts:
	lib/spack/spack/package.py
2014-11-08 22:30:46 -08:00
Todd Gamblin
0d044cdc1b Shorter help strings. 2014-11-08 22:18:20 -08:00
Todd Gamblin
1a424c124c Python 2.6 fix for Mac OS 2014-11-08 22:18:08 -08:00
Todd Gamblin
1da5d12bdd 'spack urls' debugging command, more consistent URL extrapolation.
- spack urls inspects all URLs in packages and prints them in color to show how they are parsed.
- URL extrapolation test added.
- Extrapolation is more consistent now.
- Extrapolation handles more complex URLs.
- More test cases for extrapolation.
2014-11-08 22:08:15 -08:00
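
A self-contained sketch of what URL extrapolation means here (Spack's real logic in spack.url handles many more URL shapes):

    import re

    def extrapolate(url, new_version):
        # assume the version is the dotted number group in the URL
        return re.sub(r'\d+(\.\d+)+', new_version, url)

    print(extrapolate('http://download.sourceforge.net/libpng/libpng-1.6.16.tar.gz',
                      '1.6.17'))
    # -> http://download.sourceforge.net/libpng/libpng-1.6.17.tar.gz
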
Todd Gamblin
57076f6ca4 URL parsing improvements 2014-11-08 11:42:54 -08:00
Todd Gamblin
9033ae6460 Add package for Sandia QThreads. 2014-11-07 00:20:39 -08:00
Todd Gamblin
55bf243f16 Improved website scraping. 2014-11-07 00:17:25 -08:00
Todd Gamblin
d78ece658b Change to faster gcc mirror that allows spidering. 2014-11-07 00:13:52 -08:00
Todd Gamblin
3112096651 Merge branch 'hotfix/vcs-not-required' into develop 2014-11-06 13:41:54 -08:00
Todd Gamblin
fa21acc470 Fix inadvertent requirement of hg, svn, git, etc. 2014-11-06 13:22:15 -08:00
Todd Gamblin
193eddda5e Fix for missing format_doc in package-list command. 2014-11-06 11:46:43 -08:00
Todd Gamblin
b97ee67a4b Working GCC package. 2014-11-05 09:54:43 -08:00
Todd Gamblin
a37828bafb Packages for gcc and its dependencies. 2014-11-04 13:42:47 -08:00
Todd Gamblin
488a6737b7 Merge branch 'features/python' into develop 2014-11-03 14:20:37 -08:00
Todd Gamblin
7905b50dcb Bump ImageMagick version 2014-11-03 14:19:24 -08:00
Todd Gamblin
6c4bac2ed8 Update libmonitor URL to point to google code. 2014-11-03 14:17:10 -08:00
Todd Gamblin
6c8c41da98 Working Python 2.7.8, ncurses, readline 2014-11-03 14:12:16 -08:00
Todd Gamblin
0f3b80cddb Fix for SPACK-11: Spack compiler wrapper is now in bash.
- Startup is much faster
- Added test for compiler wrapper parsing.
- Removed old compilation module that had to be imported by old cc.
- Removed cc from python version checks now that it's bash.
2014-11-03 14:12:16 -08:00
Todd Gamblin
1656f62a12 Add bzip2 package and spack pkg add command. 2014-11-03 14:12:16 -08:00
Todd Gamblin
8c8fc749be Initial versions of python and libffi. 2014-11-03 14:12:06 -08:00
Todd Gamblin
8f9de17869 "spack info -r" is now "spack package-list"
- too much going on in this command, and it made subcommand parsing weird.
- information printed is the same but info and package-list are really different commands.
2014-11-01 16:03:09 -07:00
Todd Gamblin
a5859b0b05 Add ability to get subparser by name from argparse 2014-11-01 15:59:29 -07:00
Todd Gamblin
3db22a4e33 Sane self.url for packages (reflects current version) 2014-11-01 15:01:01 -07:00
Todd Gamblin
85a14b68b7 spack compiler add checks for access before listing directories. 2014-10-31 10:30:58 -07:00
Todd Gamblin
f60fd330cb Better error messages for extension() 2014-10-30 15:00:02 -07:00
Todd Gamblin
132c32076a Add Muster parallel clustering library. 2014-10-28 16:44:35 -07:00
Todd Gamblin
84aa69fb0d Release v0.8.10 2014-10-27 23:04:47 -07:00
Todd Gamblin
ee528bc426 Docs for spack list with glob. 2014-10-27 22:40:04 -07:00
Todd Gamblin
5a5f5cb018 Add Bob Robey to README credits. 2014-10-27 21:33:51 -07:00
Todd Gamblin
75e6e794fb Fix bug with extension() for sourceforge URLs. 2014-10-27 21:32:31 -07:00
Todd Gamblin
ee50a4ccce Merge branch 'features/mpe2' into develop 2014-10-27 21:30:58 -07:00
Todd Gamblin
31eb6a579a Merge pull request #5 from scrobey/features/mpe2
Adding autotools and ImageMagick and patch for mpe2
2014-10-27 22:00:43 -06:00
Todd Gamblin
f918ea1ba7 Merge branch 'features/coreutils' into develop 2014-10-27 20:44:32 -07:00
Todd Gamblin
d542b7b003 Merge branch 'features/new-docs' into develop 2014-10-27 20:44:00 -07:00
Todd Gamblin
1c4948e1b0 Fix long-standing multimethod test error.
- New inclusive version ranges from git-fetching branch enable a fix.
- Can now write :1 to include 1.3, 1.4, etc.
  - couldn't do this before so provides() was weird.
2014-10-27 20:02:24 -07:00
Todd Gamblin
d98beeec31 Remove ambiguous test case in url_extrapolate. 2014-10-27 19:59:19 -07:00
Todd Gamblin
4d8a47800a Add docs on spack module refresh. 2014-10-27 19:53:56 -07:00
Todd Gamblin
4ecc7e1c93 Document file filtering functions. 2014-10-27 19:53:55 -07:00
Todd Gamblin
4bf6930416 Docs for modules & dotkits. 2014-10-27 19:53:55 -07:00
Todd Gamblin
e2af2a27bf Merge branch 'features/git-fetching' into develop
Conflicts:
	lib/spack/docs/packaging_guide.rst
	lib/spack/spack/cmd/info.py
	lib/spack/spack/package.py
	lib/spack/spack/stage.py
2014-10-27 19:53:05 -07:00
Todd Gamblin
d41d6ed863 Updated packaging docs. 2014-10-27 00:55:38 -07:00
Todd Gamblin
525344aa85 Make info command show VCS URLs properly. 2014-10-27 00:55:25 -07:00
Bob Robey
f9149b6cb6 Fixing errors in depends_on and updating version for ImageMagick 2014-10-25 21:59:01 -06:00
Bob Robey
76ed5c212c Adding autotools and ImageMagick and patch for mpe2 2014-10-25 21:25:25 -06:00
Todd Gamblin
340b5590f1 Add coreutils package. 2014-10-25 14:41:06 -07:00
Todd Gamblin
2e2e720a2a Add spack md5 command for simple checksumming. 2014-10-25 14:40:17 -07:00
Todd Gamblin
fa4d58db52 Add a dummy depends_on to the boilerplate. 2014-10-25 14:38:42 -07:00
Todd Gamblin
ce1b30c229 Adding initial version of MPE2 package. 2014-10-23 21:08:13 -07:00
Todd Gamblin
94a52a8710 Start documenting new features. 2014-10-23 20:15:11 -07:00
Todd Gamblin
87b87199f2 Fix for SPACK-43: compiler finding fails gracefully on unknown error. 2014-10-22 01:08:08 -07:00
Todd Gamblin
c08985f7d1 Add bib2xhtml 2014-10-22 01:08:08 -07:00
Todd Gamblin
e4c2891d4b Test for URL extrapolation. 2014-10-22 00:49:16 -07:00
Todd Gamblin
0c4b8d45df Consolidate archive_file() implementation into Stage. 2014-10-16 08:50:57 -07:00
Todd Gamblin
fb3003f664 Bug fixes for URLs and mirror fetching. 2014-10-16 06:56:00 -07:00
Todd Gamblin
6fdfd83e6b Add test cases for mirroring. 2014-10-15 21:07:41 -04:00
Todd Gamblin
8e3c2d8a26 Refactor fetch tests to use common mock repo module. 2014-10-15 07:40:01 -07:00
Todd Gamblin
fbd7e96680 Add a mirror module that handles new fetch strategies.
- Uses new fetchers to get source
- Add archive() method to fetch strategies to support this.
- Updated mirror command to use new mirror module
2014-10-14 23:26:43 -07:00
Todd Gamblin
8fd4d32408 Use external argparse in spack list, for 2.6 compatibility. 2014-10-10 09:45:48 -07:00
Todd Gamblin
1fcfb80bdd SPACK-19 no longer an issue. Removing libtool copy. 2014-10-10 09:23:28 -07:00
David Boehme
41c46c2279 Merge branch 'develop' of ssh://cz-stash.llnl.gov:7999/scale/spack into develop 2014-10-08 17:31:44 -07:00
David Boehme
8857b1f69e Add Scalasca 2.1 2014-10-08 17:31:11 -07:00
Todd Gamblin
dd2cea4107 Add available versions to generated package list documentation. 2014-10-08 14:08:11 -07:00
Todd Gamblin
4c614ac768 Add SUNDIALS solver package. 2014-10-08 03:10:29 -07:00
Todd Gamblin
36a87f5bf9 Update documentation to add an auto-generated list of packages. 2014-10-08 03:08:40 -07:00
Todd Gamblin
319b37af0e Add spack edit -c option to edit commands. 2014-10-08 03:08:40 -07:00
Todd Gamblin
ff546358f3 Update docs to use new version format. 2014-10-08 03:08:40 -07:00
Todd Gamblin
ee23cc2527 Add archive creation capability to fetch strategies.
- fetch strategy needs to know how to create archive of fetched repo
- allows mirrors to be created from git/other VCS fetches.
2014-10-07 23:26:39 -07:00
Todd Gamblin
e8d131ef96 Minor bugfix in exception constructor. 2014-10-07 23:23:18 -07:00
Todd Gamblin
1801a85966 Move tty output commands out of package and into clean command. 2014-10-07 23:22:58 -07:00
Todd Gamblin
4bde771970 Fix for SPACK-39: Concretization was too restrictive.
- concretize_version() now uses satisfies(), not intersection.
- version class updated with better intersection/union commands
- version 1.6 now "contains" 1.6.5
- added test for new version functionality

- remove none_high and none_low classes
  - version module is now self-contained; save for external 2.7
    functools.total_ordering for 2.6 compatibility.
2014-10-07 23:22:45 -07:00
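
A self-contained sketch of the new containment semantics (not Spack's actual Version class):

    class Version(object):
        def __init__(self, string):
            self.parts = tuple(int(p) for p in string.split('.'))
        def __contains__(self, other):
            # a version contains any more-specific version that extends it,
            # e.g. 1.6 contains 1.6.5
            return other.parts[:len(self.parts)] == self.parts

    assert Version('1.6.5') in Version('1.6')
    assert Version('1.7') not in Version('1.6')
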
Todd Gamblin
1c60b3967d Add simple fnmatch filtering to spack list. 2014-10-06 14:11:19 -07:00
Todd Gamblin
4cae48c8df Add libNBC (non-blocking collectives) 2014-10-06 13:48:50 -07:00
Todd Gamblin
616d232257 Add package for Torsten's netgauge tool. 2014-10-06 10:26:54 -07:00
Todd Gamblin
37e96ff6e1 Added test for Mercurial fetching. 2014-10-04 18:38:47 -07:00
Todd Gamblin
0fa1c5b0a5 Add Mercurial fetch strategy and lwm2. 2014-10-03 16:57:32 -07:00
Todd Gamblin
727d313c30 Fix location.py to use source_path 2014-10-03 16:57:32 -07:00
Todd Gamblin
faae720c36 add tests for svn fetching. 2014-10-03 16:55:53 -07:00
Todd Gamblin
da84764e97 Add test case for git fetching. 2014-10-03 16:55:53 -07:00
Todd Gamblin
c74cd63389 Callpath build works when a tag is fetched from git. 2014-10-03 16:55:53 -07:00
Todd Gamblin
0cc79e0564 Implement per-version attributes for flexible fetch policies.
- Tests pass with URL fetching and new scheme.
- Lots of refactoring
- Infrastructure is there for arbitrary fetch policies and more attributes on the version() call.
- Mirrors do not currently work properly, and they get in the way of a proper git fetch
2014-10-03 16:55:13 -07:00
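
A hedged sketch of what a per-version fetch policy looks like in a package file (keyword names as described in the commit; the package itself is hypothetical):

    from spack import *

    class Example(Package):               # hypothetical package
        homepage = "http://www.example.com"

        # plain URL fetch with a checksum, as before:
        version('1.0', 'md5 checksum elided',
                url='http://www.example.com/example-1.0.tar.gz')
        # fetch policy chosen from the keywords on the version() call:
        version('develop', git='http://www.example.com/example.git',
                branch='master')
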
Todd Gamblin
52d140c337 Factor out URL fetching into URLFetchStrategy
- Added FetchStrategy class to Spack
- Isolated pieces that need to be separate from Stage for git/svn/http
- Added URLFetchStrategy for curl-based fetching.
2014-10-03 16:53:13 -07:00
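
A minimal sketch of the factoring, including the archive() hook added later for mirrors (class names from the commit messages; bodies assumed):

    class FetchStrategy(object):
        def fetch(self):                  # get the source into the stage
            raise NotImplementedError
        def archive(self, destination):   # lets mirrors tar up whatever
            raise NotImplementedError     # was fetched (git, svn, url, ...)

    class URLFetchStrategy(FetchStrategy):
        def __init__(self, url):
            self.url = url
        def fetch(self):
            curl('-O', self.url)          # curl wrapper assumed, as in Spack
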
Todd Gamblin
720ced4c2e Add test for URL version substitution. 2014-09-30 00:09:11 -07:00
Todd Gamblin
13eca0357f Bugfix for version substitution. 2014-09-29 23:30:48 -07:00
Todd Gamblin
921a5b5bcc Make fetch fail on 404. 2014-09-29 23:28:16 -07:00
Todd Gamblin
26495ddce9 Reverse sort output versions in spack checksum 2014-09-29 23:27:40 -07:00
Todd Gamblin
e4613a60cf Fix for spack cd -i. 2014-09-29 22:43:40 -07:00
Todd Gamblin
70475d08c0 Bugfix for spack cd -h 2014-09-29 22:39:36 -07:00
Todd Gamblin
a8ed1ec414 Minor argparse improvement. 2014-09-29 20:00:00 -07:00
Todd Gamblin
1b67c8493e Merge branch 'features/automaded' into develop 2014-09-28 11:05:42 -07:00
Todd Gamblin
3bd52678be MPICH sets MPI compilers to use real compilers and not spack wrappers. 2014-09-27 21:36:42 -07:00
Todd Gamblin
5cc508393a gfortran version detection broken on Debian. 2014-09-27 16:19:56 -07:00
Todd Gamblin
63292c5826 Update callpath to 1.0.2 2014-09-27 16:07:20 -07:00
Todd Gamblin
d7984c7540 Update checksum to print new version syntax. 2014-09-27 15:33:27 -07:00
Todd Gamblin
608191bd8c Find custom list_urls depending on the archive URL (e.g. github releases) 2014-09-27 15:32:44 -07:00
Todd Gamblin
74a603dcd3 Merge branch 'features/swig' into develop 2014-09-23 21:49:04 -07:00
Todd Gamblin
bff2192498 Added SWIG package. 2014-09-23 21:48:44 -07:00
Todd Gamblin
2de2d4bea7 Modify MPI installs to work without fortran. 2014-09-23 14:59:30 -07:00
Todd Gamblin
86980c298e Merge branch 'develop' 2014-09-19 09:56:00 -07:00
Todd Gamblin
c9fbba22a2 First version of AutomaDeD package. 2014-09-19 09:55:31 -07:00
Todd Gamblin
7380de8ee3 Change git URL to https URL in all docs. 2014-09-19 09:55:13 -07:00
Todd Gamblin
115d069677 Merge branch 'develop' 2014-09-19 09:43:45 -07:00
Todd Gamblin
39acca4b43 Fix for bug in create introduced by LLVM merge. 2014-09-19 09:42:07 -07:00
Todd Gamblin
de030c9932 Merge branch 'develop' 2014-09-18 23:54:26 -07:00
Todd Gamblin
fa5594e13f Merge branch 'features/llvm' into develop
- merging parts of LLVM that can be built now.
- need to wait for standalone builds for some of the others.
2014-09-18 23:30:32 -07:00
Todd Gamblin
9165a000a3 Better C++11 support, remove non-standalone llvm-compiler-rt.
- LLVM non-standalone add-ons are difficult to build outside LLVM.
- May have to wait for future versions of LLVM to build some of these
2014-09-18 23:22:03 -07:00
Todd Gamblin
e46e1d51c2 Merge branch 'features/fileutils-deps' into develop 2014-09-18 21:39:41 -07:00
Todd Gamblin
4d2ccfa028 Take fileutils out and just merge the deps into develop. 2014-09-18 21:33:09 -07:00
Todd Gamblin
e85830e313 Fileutils successfully finds libarchive; can't find dtcmp despite config arg. 2014-09-18 01:49:30 -07:00
Todd Gamblin
4a19fa793e Support for pkg-config. 2014-09-18 01:42:01 -07:00
Adam Moody
250ffc28a3 update libcircle to download tarball from github/hpc 2014-09-17 23:40:16 -07:00
Todd Gamblin
8c4db76c3a Add command to show packages added in particular git revisions.
spack pkg list    [rev]           list packages for revision.
spack pkg diff    [rev1] [rev2]   diff between packages in rev1 and rev2
spack pkg added   [rev1] [rev2]   pkgs added since rev1
spack pkg removed [rev1] [rev2]   pkgs removed since rev2
2014-09-17 16:17:57 -07:00
Todd Gamblin
68274ee657 Add command to show packages added in particular git revisions.
spack pkg list    [rev]           list packages for revision.
spack pkg diff    [rev1] [rev2]   diff between packages in rev1 and rev2
spack pkg added   [rev1] [rev2]   pkgs added since rev1
spack pkg removed [rev1] [rev2]   pkgs removed since rev2
2014-09-17 15:48:13 -07:00
Todd Gamblin
607b4c8414 Merge branch 'features/scorep-packages' into develop 2014-09-17 13:38:33 -07:00
David Boehme
31bd1e069c Add Score-P 1.3 release. Works for gcc, still some issues with Intel builds. 2014-09-17 12:28:33 -07:00
Adam Moody
6c94fc4fd2 added mpileaks (finally!) 2014-09-17 12:28:00 -07:00
Todd Gamblin
881fdb66ae Merge branch 'develop' into features/fileutils
Conflicts:
	lib/spack/spack/packages.py
2014-09-16 23:53:44 -07:00
Todd Gamblin
e2509717b9 Merge branch 'features/callpath' into develop 2014-09-16 21:59:57 -07:00
Todd Gamblin
a4c8e945c7 Some fixups for Adam's callpath and adept-utils packages.
- Make spack packages RPATH *ALL* dependencies (i.e. the whole tree)
- prevents callpath link from finding wrong libelf -- always uses the one dyninst used.
2014-09-16 21:59:46 -07:00
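
A hedged sketch of the rpath change (property body assumed; traverse() is the spec API from the earlier spec rework):

    class Package(object):
        @property
        def rpath(self):
            # the *whole* dependency tree, so callpath always links against
            # the same libelf that dyninst was built with
            deps = self.spec.traverse(root=False)
            return [self.prefix.lib] + [d.prefix.lib for d in deps]
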
Adam Moody
656cf12cda add adeptutils and callpath packages 2014-09-16 16:50:54 -07:00
Todd Gamblin
ec44791aa3 Remove examples from default STAT build to avoid MPI dependence. 2014-09-05 10:52:43 -07:00
George Todd Gamblin
f98a98718f Merge pull request #23 in SCALE/spack from features/cmake-prefix-path to develop
# By David Beckingsale
# Via Todd Gamblin
* commit '42ca6c8bfc2b7598acd880a013f7898db5245004':
  Add dependency prefixes to CMAKE_PREFIX_PATH
2014-08-22 14:46:33 -07:00
David Beckingsale
42ca6c8bfc Add dependency prefixes to CMAKE_PREFIX_PATH 2014-08-22 14:45:44 -07:00
Todd Gamblin
d87a652582 Add spack cd and spack location commands.
- Better shell support for cd'ing into directories
- Fix some csh weirdness with nested aliases.
2014-08-22 11:00:19 -07:00
Todd Gamblin
eb5efed421 Merge branch 'features/postgresql' into develop
- add spack cd command.
- Fix bug in modules hook

Conflicts:
	lib/spack/spack/cmd/stage.py
	lib/spack/spack/hooks/dotkit.py
	share/spack/setup-env.bash
2014-08-21 22:59:39 -07:00
Todd Gamblin
e301d62332 Remove development TAU version from package. 2014-08-20 11:46:59 -07:00
Todd Gamblin
5a9ef130ea Make EnvModule class use spec instead of package, fix using module of non-present package.
- Using the spec doesn't require the package to be there.
- Restore ability to use non-present packages (which was broken)
2014-08-20 11:43:03 -07:00
Todd Gamblin
8cc2298181 Merge branch 'features/python-2.6-compatibility' into develop
- Changed 'import argparse' to 'from external import argparse' in conflicting files.

Conflicts:
	lib/spack/spack/cmd/dotkit.py
	lib/spack/spack/cmd/unuse.py
	lib/spack/spack/cmd/use.py
2014-08-20 09:30:40 -07:00
George Todd Gamblin
42e27d04c1 Merge pull request #19 in SCALE/spack from features/modules to develop
# By Todd Gamblin (4) and David Beckingsale (2)
# Via Todd Gamblin
* commit 'b601fd08caf21b5fc11e6998a5ebd20a04ac57ad':
  Bugfixes for csh environment modules.
  Bugfixes, more consolidation of modules code.
  Add csh/tcsh support for modules
  Consolidate most module code into spack.modules and spack.cmd.module
  Fixed up module support
  Added initial module support
2014-08-18 23:11:02 -07:00
Todd Gamblin
b601fd08ca Bugfixes for csh environment modules. 2014-08-17 01:41:32 -07:00
Todd Gamblin
22bec329c1 Bugfixes, more consolidation of modules code.
- specific module classes use __metaclass__ to register themselves.
- bugfixes in module writing.
2014-08-16 22:22:53 -07:00
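
A sketch of the metaclass self-registration described (Python 2, matching this codebase; names assumed):

    module_types = {}

    class ModuleMeta(type):
        def __init__(cls, name, bases, attrs):
            type.__init__(cls, name, bases, attrs)
            if attrs.get('name'):            # skip the abstract base class
                module_types[attrs['name']] = cls

    class EnvModule(object):
        __metaclass__ = ModuleMeta
        name = None                          # subclasses set this

    class TclModule(EnvModule):
        name = 'tcl'                         # registers itself on definition

    class Dotkit(EnvModule):
        name = 'dotkit'

    assert module_types == {'tcl': TclModule, 'dotkit': Dotkit}
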
Todd Gamblin
776560f8ce Add csh/tcsh support for modules
- csh scripting is a GIANT pain in the ass
- hopefully the thin script layer doesn't get much more complex.
2014-08-16 14:58:15 -07:00
Todd Gamblin
221cf6acb9 Consolidate most module code into spack.modules and spack.cmd.module
- One file with all the module classes (spack/modules.py)
  - Has an EnvModule superclass that does most of the work and consolidates common code
  - Subclasses have specializations for different module systems (TclModule, Dotkit)

- One command (spack module) for all the types of modules to use
  - the one command is used by the scripts, only need to maintain in one place
  - has some subcommands for different module types, but they're handled mostly generically.

- Consolidate zsh support into a single setup-env.sh script.
2014-08-16 14:53:57 -07:00
Todd Gamblin
fa3b19000d update tau tarball 2014-08-11 22:47:24 -07:00
Todd Gamblin
6127b0baa6 new prototype TAU tarball from Kevin 2014-08-11 22:47:24 -07:00
Todd Gamblin
90cd0c7efa Add Kevin's experimental TAU version 2014-08-11 22:47:24 -07:00
Todd Gamblin
0b68d1292d Add package for openssl, have postgres use it.
- Updated version wildcard to include [a-z]|alpha|beta
  to accommodate all the letter suffixes on openssl.
2014-08-11 22:47:23 -07:00
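
A hedged reconstruction of the widened wildcard (the real pattern lives in Spack's URL parsing; this regex just shows the idea):

    import re

    version_re = re.compile(r'(\d+(?:\.\d+)*(?:[a-z]|alpha|beta)?)')
    assert version_re.search('openssl-1.0.1h.tar.gz').group(1) == '1.0.1h'
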
Todd Gamblin
abc7d401e2 Add "spack cd" shell support to cd directly into the staged archive. 2014-08-11 22:47:23 -07:00
Todd Gamblin
5a3803de39 Add options to stage to make it just print out stage dir. 2014-08-11 22:47:23 -07:00
Todd Gamblin
0740c576a7 Package for postgresql. 2014-08-11 22:47:23 -07:00
Todd Gamblin
0bba101ff9 Allow packages to add a dotkit() method and write custom parts of dotkits. 2014-08-11 22:47:22 -07:00
Todd Gamblin
113afe860e More robust symbol inclusion for 'from spack import *'
- avoid errors where some symbols aren't exported to packages.
- reduce the number of places each symbol needs to be mentioned in
  an __all__ list
2014-08-11 22:47:22 -07:00
Todd Gamblin
48d5281e3a Test cases pass; Spack supports Python 2.6! 2014-08-10 18:07:20 -07:00
Todd Gamblin
7082b0a59a cc supports Python 2.6 2014-08-10 18:07:20 -07:00
Todd Gamblin
696e80c62f Get rid of Python 2.7 dict.viewkeys() call. 2014-08-10 18:05:52 -07:00
Todd Gamblin
d95e7ecfe1 Remove dependency on Python2.7 OrderedDict, revise config parser 2014-08-10 17:56:36 -07:00
Todd Gamblin
d86a638099 Add Python 2.7 functools.total_ordering to external modules.
- removing dependence on 2.7
- added it to pyqver2 as well
2014-08-10 17:54:39 -07:00
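
What total_ordering buys: define __eq__ and __lt__ and the rest are generated. On 2.7 it lives in the stdlib; Spack vendors a copy for 2.6 (this sketch uses the stdlib import):

    from functools import total_ordering

    @total_ordering
    class Version(object):
        def __init__(self, parts):
            self.parts = parts
        def __eq__(self, other):
            return self.parts == other.parts
        def __lt__(self, other):
            return self.parts < other.parts

    assert Version((1, 6)) < Version((1, 7))   # <=, >, >= come for free
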
Todd Gamblin
ca328a6993 Fix minor warning about Exception.message being deprecated. 2014-08-10 17:52:46 -07:00
Todd Gamblin
a41a19d14d Change dict comprehensions to dict() constructors. 2014-08-10 16:04:41 -07:00
Todd Gamblin
5a5da817a1 Fix SPACK-27 & remove dependence on check_output
- subprocess.check_output is python 2.7 only
- Spack checks for existence of requested prefix, creates it if it does not exist.
2014-08-10 11:52:17 -07:00
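
A hedged sketch of a 2.6-safe stand-in for subprocess.check_output (which only exists on 2.7):

    import subprocess

    def check_output(cmd):
        proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
        out, _ = proc.communicate()
        if proc.returncode != 0:
            raise subprocess.CalledProcessError(proc.returncode, cmd)
        return out
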
Todd Gamblin
7714d08e2e Remove dependence on v2.7 argparse by including argparse. 2014-08-10 11:48:07 -07:00
Todd Gamblin
17895bd6fd Add a test case to ensure that Spack is v2.6 compliant. 2014-08-10 11:46:14 -07:00
Todd Gamblin
8ab793a3a6 Add external package with pyqver2 tool 2014-08-10 11:45:32 -07:00
George Todd Gamblin
884a4fecd1 Merge pull request #21 in SCALE/spack from features/directory-layout-test to develop
# By Todd Gamblin
# Via Todd Gamblin
* commit '98797459f343c400f4f6fe988bae47d4bab9116b':
  Minor tweaks after spec update.
  More spec improvements
  Add postorder traversal to specs
  Clean up specs, spec comparison, and spec hashing.
2014-08-09 17:47:08 -07:00
Todd Gamblin
98797459f3 Minor tweaks after spec update.
- spack find -p works properly (get path from spec, not package)

- directory layout and PackageDB normalize things automatically unless
  they're unknown packages (need to do this for spack find -l)

- install test made robust to mock/main package conflicts
2014-08-09 17:41:56 -07:00
Todd Gamblin
5f073ae220 More spec improvements
- Spec.copy() does not create superfluous nodes and preserves DAG
  connections.

- Spec.normalize() doesn't create extra dependency nodes or throw out
  old ones like before.

- Added better test cases for above changes.

Minor things:
- Fixed bug waiting to happen in PackageDB.get()
  - instances was keyed by name, not by spec, so caching wasn't really
    working at all.
- removed unused PackageDB.compute_dependents function.
- Fixed PackageDB.graph_dependencies() so that spack graph works again.
2014-08-09 16:17:40 -07:00
Todd Gamblin
63f8af8078 Add postorder traversal to specs
- Spec.preorder_traversal() is now Spec.traverse().
- Caller can supply order='pre' or order='post'
2014-08-08 13:21:52 -07:00
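
A self-contained sketch of pre/post-order traversal over a small dependency DAG (Spec here is a stand-in, not Spack's class):

    import collections
    Spec = collections.namedtuple('Spec', ['name', 'dependencies'])

    libelf = Spec('libelf', [])
    dyninst = Spec('dyninst', [libelf])
    callpath = Spec('callpath', [dyninst, libelf])

    def traverse(spec, order='pre', visited=None):
        visited = set() if visited is None else visited
        if spec.name in visited:
            return
        visited.add(spec.name)
        if order == 'pre':
            yield spec
        for dep in spec.dependencies:
            for s in traverse(dep, order, visited):
                yield s
        if order == 'post':
            yield spec

    assert [s.name for s in traverse(callpath, order='post')] == \
           ['libelf', 'dyninst', 'callpath']
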
Todd Gamblin
d5c625d87d Clean up specs, spec comparison, and spec hashing.
- Spec comparison is now less strict
  - compares based on sorted list of dependencies but not
    their structure
  - Makes comparison easy when a spec is not normalized.

- This makes the dep_hash consistent for specs read in from a
  directory layout.  - Can now reliably read in a spec for which the
  package has gone away, and still be able to delete its install.
  - easy switching between git branches

- Fixed latent bug in Spec.flat_dependencies() (was including root)

- added a test for the directory layout so that this code will get
  more exercise.
2014-08-08 13:21:48 -07:00
Todd Gamblin
c55041e9d4 Partial commit of more packages. 2014-08-04 07:54:22 -07:00
David Beckingsale
8738a3a88c Added LLVM package 2014-08-04 07:54:22 -07:00
Todd Gamblin
782e45e5b1 Fix up versions to match new version format, minor formatting. 2014-08-04 07:54:05 -07:00
Todd Gamblin
cabfc374eb More descriptive error when package constructor fails.
- helps package_sanity test identify which package failed.
- encountered while upgrading versions in Adam's packages to the new format.
2014-08-04 07:54:05 -07:00
Adam Moody
3779c78c00 adding libarchive 2014-08-04 07:54:05 -07:00
Adam Moody
a27e178ac2 add libcircle package 2014-08-04 07:54:05 -07:00
Adam Moody
712a2c3742 fileutils package 2014-08-04 07:54:04 -07:00
Adam Moody
a32816c644 cannot uninstall dtcmp because depends on dtcmp 2014-08-04 07:54:04 -07:00
Adam Moody
6e7a7d127d adding dtcmp package 2014-08-04 07:54:04 -07:00
Adam Moody
3dd8e561b9 add lwgrp package 2014-08-04 07:54:04 -07:00
Adam Moody
741084faf4 add mvapich2 package to handle different compilers and variants 2014-08-04 07:54:04 -07:00
David Beckingsale
57ddbd282a Fixed up module support 2014-08-04 07:53:40 -07:00
David Beckingsale
94c5c9667c Added initial module support 2014-08-04 07:53:40 -07:00
George Todd Gamblin
d13d32040c Merge pull request #20 in SCALE/spack from openss to develop
# By Matthew LeGendre (2) and Todd Gamblin (1)
# Via Todd Gamblin
* commit 'd7a3c7e555bfd93fbf93ec55608d7fc6aa8052f8':
  Fix up Matt's openss packages.
  Add sqlite to spack
  Add libmonitor to spack.  Still needs svn support for checkout
2014-08-04 07:51:12 -07:00
Todd Gamblin
d0b179962b find and uninstall work when installed package is no longer in spack.
- Make switching between git branches easier.
- Make future removal of packages easier.
2014-08-04 07:40:53 -07:00
Todd Gamblin
d7a3c7e555 Fix up Matt's openss packages. 2014-08-03 12:57:09 -07:00
David Boehme
557ae64d51 Fix cube compiler configuration 2014-08-01 16:40:57 -07:00
Matthew LeGendre
b7fbc77bab Add sqlite to spack 2014-08-01 15:50:43 -07:00
Matthew LeGendre
d1de958daa Add libmonitor to spack. Still needs svn support for checkout 2014-08-01 15:48:59 -07:00
David Boehme
513b5aecf1 Improve compiler configuration in otf2 package 2014-08-01 10:16:08 -07:00
Todd Gamblin
61e2cb56a4 Got version 1.2.1 building, but 1.3 and onwards are different. 2014-08-01 09:09:57 -07:00
David Boehme
e377abc18c Add Score-P packages. 2014-07-31 17:51:23 -07:00
George Todd Gamblin
2f21ca64e0 Merge pull request #18 in SCALE/spack from develop_add_ompss to develop
* commit 'e011b767fafc1c7287db1cfd254266171e4e382f':
  Converting Luc's packages to the new version format.
  Adding missing dependency nanos->extrae necessary for traces
  Added Paraver and dependencies, restricted Extrae to OpenMPI 1.6
  Adding Extrae and OmpSs with some of their dependencies, hwloc and PAPI. Extrae does not compile for latest versions of any MPI implementation.
  first try for ompss build script
  Allow per-version URLs instead of one single URL per package.
2014-07-31 14:20:42 -07:00
Todd Gamblin
e011b767fa Converting Luc's packages to the new version format. 2014-07-31 14:09:38 -07:00
Luc Jaulmes
5f3bcbfded Adding missing dependency nanos->extrae necessary for traces 2014-07-31 13:57:45 -07:00
Luc Jaulmes
853784d382 Added Paraver and dependencies, restricted Extrae to OpenMPI 1.6 2014-07-31 13:57:45 -07:00
Luc Jaulmes
5a4881c086 Adding Extrae and OmpSs with some of their dependencies, hwloc and PAPI.
Extrae does not compile for latest versions of any MPI implementation.
2014-07-31 13:57:44 -07:00
Luc Jaulmes
5dffa26711 first try for ompss build script 2014-07-31 13:51:37 -07:00
Todd Gamblin
1ad474f1a9 Allow per-version URLs instead of one single URL per package. 2014-07-30 23:30:07 -07:00
318 changed files with 20310 additions and 3270 deletions

.gitignore

@@ -6,3 +6,4 @@
 .idea
 /etc/spackconfig
 /share/spack/dotkit
+/share/spack/modules

LICENSE

@@ -1,4 +1,4 @@
-Copyright (c) 2013, Lawrence Livermore National Security, LLC.
+Copyright (c) 2013-2014, Lawrence Livermore National Security, LLC.
 Produced at the Lawrence Livermore National Laboratory.
 This file is part of Spack.
@@ -55,22 +55,22 @@ Modification
 0. This License Agreement applies to any software library or other
 program which contains a notice placed by the copyright holder or
 other authorized party saying it may be distributed under the terms of
-this Lesser General Public License (also called this License). Each
-licensee is addressed as you.
+this Lesser General Public License (also called "this License"). Each
+licensee is addressed as "you".
-A library means a collection of software functions and/or data
+A "library" means a collection of software functions and/or data
 prepared so as to be conveniently linked with application programs
 (which use some of those functions and data) to form executables.
-The Library, below, refers to any such software library or work
-which has been distributed under these terms. A work based on the
-Library means either the Library or any derivative work under
+The "Library", below, refers to any such software library or work
+which has been distributed under these terms. A "work based on the
+Library" means either the Library or any derivative work under
 copyright law: that is to say, a work containing the Library or a
 portion of it, either verbatim or with modifications and/or translated
 straightforwardly into another language. (Hereinafter, translation is
-included without limitation in the term modification.)
+included without limitation in the term "modification".)
-Source code for a work means the preferred form of the work for
+"Source code" for a work means the preferred form of the work for
 making modifications to it. For a library, complete source code means
 all the source code for all modules it contains, plus any associated
 interface definition files, plus the scripts used to control
@@ -83,7 +83,7 @@ covered only if its contents constitute a work based on the Library
 it). Whether that is true depends on what the Library does and what
 the program that uses the Library does.
-1. You may copy and distribute verbatim copies of the Librarys
+1. You may copy and distribute verbatim copies of the Library's
 complete source code as you receive it, in any medium, provided that
 you conspicuously and appropriately publish on each copy an
 appropriate copyright notice and disclaimer of warranty; keep intact
@@ -170,17 +170,17 @@ source along with the object code.
 5. A program that contains no derivative of any portion of the
 Library, but is designed to work with the Library by being compiled or
-linked with it, is called a work that uses the Library. Such a work,
+linked with it, is called a "work that uses the Library". Such a work,
 in isolation, is not a derivative work of the Library, and therefore
 falls outside the scope of this License.
-However, linking a work that uses the Library with the Library
+However, linking a "work that uses the Library" with the Library
 creates an executable that is a derivative of the Library (because it
-contains portions of the Library), rather than a work that uses the
-library. The executable is therefore covered by this License. Section
+contains portions of the Library), rather than a "work that uses the
+library". The executable is therefore covered by this License. Section
 6 states terms for distribution of such executables.
-When a work that uses the Library uses material from a header file
+When a "work that uses the Library" uses material from a header file
 that is part of the Library, the object code for the work may be a
 derivative work of the Library even though the source code is
 not. Whether this is true is especially significant if the work can be
@@ -200,10 +200,10 @@ distribute the object code for the work under the terms of Section
 whether or not they are linked directly with the Library itself.
 6. As an exception to the Sections above, you may also combine or link
-a work that uses the Library with the Library to produce a work
+a "work that uses the Library" with the Library to produce a work
 containing portions of the Library, and distribute that work under
 terms of your choice, provided that the terms permit modification of
-the work for the customers own use and reverse engineering for
+the work for the customer's own use and reverse engineering for
 debugging such modifications.
 You must give prominent notice with each copy of the work that the
@@ -218,7 +218,7 @@ a) Accompany the work with the complete corresponding machine-readable
 source code for the Library including whatever changes were used in
 the work (which must be distributed under Sections 1 and 2 above);
 and, if the work is an executable liked with the Library, with the
-complete machine-readable work that uses the Library, as object code
+complete machine-readable "work that uses the Library", as object code
 and/or source code, so that the user can modify the Library and then
 relink to produce a modified executable containing the modified
 Library. (It is understood that the user who changes the contents of
@@ -227,7 +227,7 @@ recompile the application to use the modified definitions.)
 b) Use a suitable shared library mechanism for linking with the
 Library. A suitable mechanism is one that (1) uses at run time a copy
-of the library already present on the users computer system, rather
+of the library already present on the user's computer system, rather
 than copying library functions into the executable, and (2) will
 operate properly with a modified version of the library, if the user
 installs one, as long as the modified version is interface- compatible
@@ -245,8 +245,8 @@ specified materials from the same place.
 e) Verify that the user has already received a copy of these materials
 or that you have already sent this user a copy.
-For an executable, the required form of the work that uses the
-Library must include any data and utility programs needed for
+For an executable, the required form of the "work that uses the
+Library" must include any data and utility programs needed for
 reproducing the executable from it. However, as a special exception,
 the materials to be distributed need not include anything that is
 normally distributed (in either source or binary form) with the major
@@ -296,7 +296,7 @@ the Library or works based on it.
 Library), the recipient automatically receives a license from the
 original licensor to copy, distribute, link with or modify the Library
 subject to these terms and conditions. You may not impose any further
-restrictions on the recipients exercise of the rights granted
+restrictions on the recipients' exercise of the rights granted
 herein. You are not responsible for enforcing compliance by third
 parties with this License.
@@ -347,7 +347,7 @@ differ in detail to address new problems or concerns.
 Each version is given a distinguishing version number. If the Library
 specifies a version number of this License which applies to it and
-any later version, you have the option of following the terms and
+"any later version", you have the option of following the terms and
 conditions either of that version or of any later version published by
 the Free Software Foundation. If the Library does not specify a
 license version number, you may choose any version ever published by
@@ -367,7 +367,7 @@ NO WARRANTY
 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
 FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT
 WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER
-PARTIES PROVIDE THE LIBRARY AS IS WITHOUT WARRANTY OF ANY KIND,
+PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY KIND,
 EITHER EXPRESSED OR IMPLIED INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
 WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
 PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE

README.md

@@ -22,7 +22,7 @@ for examples and highlights.
 To install spack and install your first package:
-    $ git clone git@github.com:scalability-llnl/spack.git
+    $ git clone https://github.com/scalability-llnl/spack.git
     $ cd spack/bin
     $ ./spack install libelf
@@ -35,4 +35,24 @@ for Spack is also available.
 Authors
 ----------------
 Spack was written by Todd Gamblin, tgamblin@llnl.gov.
-LLNL-CODE-647188
+Significant contributions were also made by:
+  * David Beckingsale
+  * David Boehme
+  * Alfredo Gimenez
+  * Luc Jaulmes
+  * Matt Legendre
+  * Greg Lee
+  * Adam Moody
+  * Saravan Pantham
+  * Joachim Protze
+  * Bob Robey
+  * Justin Too
+Release
+----------------
+Spack is released under an LGPL license. For more details see the
+LICENSE file.
+``LLNL-CODE-647188``

View File

@@ -24,11 +24,11 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys
if not sys.version_info[:2] >= (2,7):
sys.exit("Spack requires Python 2.7. Version was %s." % sys.version_info)
if not sys.version_info[:2] >= (2,6):
v_info = sys.version_info[:3]
sys.exit("Spack requires Python 2.6 or higher. This is Python %d.%d.%d." % v_info)
import os
import argparse
# Find spack's location and its prefix.
SPACK_FILE = os.path.realpath(os.path.expanduser(__file__))
@@ -51,20 +51,23 @@ del SPACK_FILE, SPACK_PREFIX, SPACK_LIB_PATH
import llnl.util.tty as tty
import spack
from spack.error import SpackError
from external import argparse
# Command parsing
parser = argparse.ArgumentParser(
description='Spack: the Supercomputing PACKage Manager.')
parser.add_argument('-V', '--version', action='version',
version="%s" % spack.spack_version)
parser.add_argument('-v', '--verbose', action='store_true', dest='verbose',
parser.add_argument('-v', '--verbose', action='store_true',
help="Print additional output during builds")
parser.add_argument('-d', '--debug', action='store_true', dest='debug',
parser.add_argument('-d', '--debug', action='store_true',
help="Write out debug logs during compile")
parser.add_argument('-k', '--insecure', action='store_true', dest='insecure',
parser.add_argument('-k', '--insecure', action='store_true',
help="Do not check ssl certificates when downloading archives.")
parser.add_argument('-m', '--mock', action='store_true', dest='mock',
parser.add_argument('-m', '--mock', action='store_true',
help="Use mock packages instead of real ones.")
parser.add_argument('-p', '--profile', action='store_true',
help="Profile execution using cProfile.")
# each command module implements a parser() function, to which we pass its
# subparser for setup.
@@ -75,35 +78,58 @@ for cmd in spack.cmd.commands:
module = spack.cmd.get_module(cmd)
subparser = subparsers.add_parser(cmd, help=module.description)
module.setup_parser(subparser)
# Just print help and exit if run with no arguments at all
if len(sys.argv) == 1:
parser.print_help()
sys.exit(1)
# actually parse the args.
args = parser.parse_args()
# Set up environment based on args.
tty.set_verbose(args.verbose)
tty.set_debug(args.debug)
spack.debug = args.debug
def main():
# Set up environment based on args.
tty.set_verbose(args.verbose)
tty.set_debug(args.debug)
spack.debug = args.debug
spack.spack_working_dir = working_dir
if args.mock:
from spack.packages import PackageDB
spack.db = PackageDB(spack.mock_packages_path)
spack.spack_working_dir = working_dir
if args.mock:
from spack.packages import PackageDB
spack.db = PackageDB(spack.mock_packages_path)
# If the user asked for it, don't check ssl certs.
if args.insecure:
tty.warn("You asked for --insecure, which does not check SSL certificates.")
spack.curl.add_default_arg('-k')
# If the user asked for it, don't check ssl certs.
if args.insecure:
tty.warn("You asked for --insecure, which does not check SSL certificates or checksums.")
spack.curl.add_default_arg('-k')
# Try to load the particular command asked for and run it
command = spack.cmd.get_command(args.command)
try:
command(parser, args)
except SpackError, e:
if spack.debug:
# In debug mode, raise with a full stack trace.
raise
elif e.long_message:
tty.die(e.message, e.long_message)
# Try to load the particular command asked for and run it
command = spack.cmd.get_command(args.command)
try:
return_val = command(parser, args)
except SpackError, e:
if spack.debug:
# In debug mode, raise with a full stack trace.
raise
elif e.long_message:
tty.die(e.message, e.long_message)
else:
tty.die(e.message)
except KeyboardInterrupt:
sys.stderr.write('\n')
tty.die("Keyboard interrupt.")
# Allow commands to return values if they want to exit with some other code.
if return_val is None:
sys.exit(0)
elif isinstance(return_val, int):
sys.exit(return_val)
else:
tty.die(e.message)
tty.die("Bad return value from command %s: %s" % (args.command, return_val))
except KeyboardInterrupt:
tty.die("Keyboard interrupt.")
if args.profile:
import cProfile
cProfile.run('main()', sort='tottime')
else:
main()
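
The new ``-p`` flag is plain ``cProfile``; a minimal standalone sketch of the same wrapping pattern (the ``main`` body below is a stand-in, not Spack's dispatch):

    import cProfile
    import sys

    def main():
        # Stand-in for the real command dispatch; returns an exit code.
        total = sum(i * i for i in range(100000))
        return 0 if total else 1

    if __name__ == '__main__':
        if '-p' in sys.argv:
            # Report functions sorted by total time, the same 'tottime'
            # key the entry point above passes to cProfile.run().
            cProfile.run('main()', sort='tottime')
        else:
            sys.exit(main())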

View File

@@ -1,2 +1,4 @@
package_list.rst
command_index.rst
spack*.rst
_build

View File

@@ -21,6 +21,24 @@ I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
all: html
#
# This autogenerates a package list.
#
package_list:
spack package-list > package_list.rst
#
# Generate a command index
#
command_index:
cp command_index.in command_index.rst
echo >> command_index.rst
grep -ho '.. _spack-.*:' *rst \
| perl -pe 's/.. _([^:]*):/ * :ref:`\1`/' \
| sort >> command_index.rst
custom_targets: package_list command_index
#
# This creates a git repository and commits generated html docs.
# It them pushes the new branch into THIS repository as gh-pages.
@@ -36,12 +54,14 @@ gh-pages: _build/html
touch .nojekyll && \
git init && \
git add . && \
git commit -m "Initial commit" && \
git commit -m "Spack Documentation" && \
git push -f $$root master:gh-pages && \
rm -rf .git
upload:
rsync -avz --rsh=ssh --delete _build/html/ cab:/usr/global/web-pages/lc/www/adept/docs/spack
git push -f origin gh-pages
git push -f github gh-pages
apidoc:
sphinx-apidoc -T -o . $(PYTHONPATH)/spack
@@ -69,9 +89,10 @@ help:
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
clean:
-rm -f package_list.rst command_index.rst
-rm -rf $(BUILDDIR)/* $(APIDOC_FILES)
html: apidoc
html: apidoc custom_targets
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

View File

@@ -13,7 +13,7 @@
<hr/>
<p>
&copy; Copyright 2013,
&copy; Copyright 2013-2014,
<a href="https://scalability.llnl.gov/">Lawrence Livermore National Laboratory</a>.
<br/>
Written by Todd Gamblin, <a href="mailto:tgamblin@llnl.gov">tgamblin@llnl.gov</a>, LLNL-CODE-647188

File diff suppressed because it is too large

View File

@@ -0,0 +1,10 @@
.. _command_index:
Command index
=================
This is an alphabetical list of commands with links to the places they
appear in the documentation.
.. hlist::
:columns: 3

View File

@@ -35,7 +35,9 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
import sys, os
import sys
import os
import subprocess
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
@@ -43,14 +45,16 @@
sys.path.insert(0, os.path.abspath('exts'))
# Add the Spack bin directory to the path so that we can use its output in docs.
os.environ['SPACK_ROOT'] = '../../..'
spack_root = '../../..'
os.environ['SPACK_ROOT'] = spack_root
os.environ['PATH'] += os.pathsep + '$SPACK_ROOT/bin'
spack_version = subprocess.Popen(
['spack', '-V'], stderr=subprocess.PIPE).communicate()[1].strip().split('.')
# Set an environment variable so that colify will print output like it would to
# a terminal.
os.environ['COLIFY_TTY'] = 'true'
os.environ['COLUMNS'] = '80'
os.environ['LINES'] = '25'
os.environ['COLIFY_SIZE'] = '25x80'
# Enable todo items
todo_include_todos = True
@@ -90,16 +94,16 @@
# General information about the project.
project = u'Spack'
copyright = u'2013, Lawrence Livermore National Laboratory'
copyright = u'2013-2014, Lawrence Livermore National Laboratory'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = '1.0'
version = '.'.join(spack_version[:2])
# The full version, including alpha/beta/rc tags.
release = '1.0'
release = '.'.join(spack_version[:2])
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
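
The version logic above shells out to ``spack -V`` and reads **stderr**, since that is where the version string lands here; a self-contained sketch of the same idea (assumes a ``spack`` executable on ``PATH``):

    import subprocess

    proc = subprocess.Popen(['spack', '-V'],
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = proc.communicate()
    parts = (err or out).strip().split('.')
    # Sphinx's `version` is X.Y; `release` can carry the full tag.
    print('version = %s' % '.'.join(parts[:2]))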

View File

@@ -4,7 +4,7 @@ Developer Guide
=====================
This guide is intended for people who want to work on Spack itself.
If you just want to develop pacakges, see the :ref:`packaging-guide`.
If you just want to develop packages, see the :ref:`packaging-guide`.
It is assumed that you've read the :ref:`basic-usage` and
:ref:`packaging-guide` sections, and that you're familiar with the
@@ -50,11 +50,11 @@ as a descriptor for one or more instances of that template. Users
express the configuration they want using a spec, and a package turns
the spec into a complete build.
The obvious difficulty with this design is that users underspecify
The obvious difficulty with this design is that users under-specify
what they want. To build a software package, the package object needs
a *complete* specification. In Spack, if a spec describes only one
instance of a package, then we say it is **concrete**. If a spec
could describes many instances, (i.e. it is underspecified in one way
could describe many instances (i.e. it is under-specified in one way
or another), then we say it is **abstract**.
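For example (illustrative specs, not output from this changeset), ``mpileaks ^mpich`` is abstract: it pins no versions and no compiler. Something like ``mpileaks@1.0%gcc@4.7.3 ^mpich@3.0.4`` is much closer to concrete, because every node names a version and the compiler is fixed.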
Spack's job is to take an *abstract* spec from the user, find a
@@ -92,7 +92,7 @@ with a high level view of Spack's directory structure::
Spack is designed so that it could live within a `standard UNIX
directory hierarchy <http://linux.die.net/man/7/hier>`_, so ``lib``,
``var``, and ``opt`` all contain a ``spack`` subdirectory in case
Spack is installed alongside other software. Most of the insteresting
Spack is installed alongside other software. Most of the interesting
parts of Spack live in ``lib/spack``. Files under ``var`` are created
as needed, so there is no ``var`` directory when you initially clone
Spack from the repository.
@@ -123,13 +123,13 @@ Package-related modules
Contains the :class:`Package <spack.package.Package>` class, which
is the superclass for all packages in Spack. Methods on ``Package``
implement all phases of the :ref:`package lifecycle
<pacakge-lifecycle>` and manage the build process.
<package-lifecycle>` and manage the build process.
:mod:`spack.packages`
Contains all of the packages in Spack and methods for managing them.
Functions like :func:`packages.get <spack.packages.get>` and
:func:`class_name_for_package_name
<packages.class_name_for_package_name>` handle mapping packge module
<packages.class_name_for_package_name>` handle mapping package module
names to class names and dynamically instantiating packages by name
from module files.

View File

@@ -1,4 +1,4 @@
Feature Overview
Feature overview
==================
This is a high-level overview of features that make Spack different
@@ -93,7 +93,7 @@ creates a simple python file:
homepage = "http://www.example.com/"
url = "http://www.mr511.de/software/libelf-0.8.13.tar.gz"
versions = { '0.8.13' : '4136d7b4c04df68b686570afa26988ac' }
version('0.8.13', '4136d7b4c04df68b686570afa26988ac')
def install(self, prefix):
configure("--prefix=%s" % prefix)

View File

@@ -9,7 +9,7 @@ Getting spack is easy. You can clone it from the `github repository
.. code-block:: sh
$ git clone git@github.com:scalability-llnl/spack.git
$ git clone https://github.com/scalability-llnl/spack.git
This will create a directory called ``spack``. We'll assume that the
full path to this directory is in some environment called

View File

@@ -29,7 +29,7 @@ package:
.. code-block:: sh
$ git clone git@github.com:scalability-llnl/spack.git
$ git clone https://github.com/scalability-llnl/spack.git
$ cd spack/bin
$ ./spack install libelf
@@ -46,8 +46,11 @@ Table of Contents
getting_started
basic_usage
packaging_guide
mirrors
site_configuration
developer_guide
command_index
package_list
API Docs <spack>
Indices and tables

217
lib/spack/docs/mirrors.rst Normal file
View File

@@ -0,0 +1,217 @@
.. _mirrors:
Mirrors
============================
Some sites may not have access to the internet for fetching packages.
These sites will need a local repository of tarballs from which they
can get their files. Spack has support for this with *mirrors*. A
mirror is a URL that points to a directory, either on the local
filesystem or on some server, containing tarballs for all of Spack's
packages.
Here's an example of a mirror's directory structure::
mirror/
cmake/
cmake-2.8.10.2.tar.gz
dyninst/
dyninst-8.1.1.tgz
dyninst-8.1.2.tgz
libdwarf/
libdwarf-20130126.tar.gz
libdwarf-20130207.tar.gz
libdwarf-20130729.tar.gz
libelf/
libelf-0.8.12.tar.gz
libelf-0.8.13.tar.gz
libunwind/
libunwind-1.1.tar.gz
mpich/
mpich-3.0.4.tar.gz
mvapich2/
mvapich2-1.9.tgz
The structure is very simple. There is a top-level directory. The
second level directories are named after packages, and the third level
contains tarballs for each package, named after each package.
.. note::
Archives are **not** named exactly the way they were in the package's fetch
URL. They have the form ``<name>-<version>.<extension>``, where
``<name>`` is Spack's name for the package, ``<version>`` is the
version of the tarball, and ``<extension>`` is whatever format the
package's fetch URL contains.
In order to make mirror creation reasonably fast, we copy the
tarball in its original format to the mirror directory, but we do
not standardize on a particular compression algorithm, because this
would potentially require expanding and re-compressing each archive.
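A sketch of that naming rule as code (a hypothetical helper, not part of Spack)::

    import os

    def mirror_archive_path(root, name, version, extension):
        # <root>/<package name>/<name>-<version>.<extension>
        return os.path.join(root, name, '%s-%s.%s' % (name, version, extension))

    print(mirror_archive_path('mirror', 'libelf', '0.8.13', 'tar.gz'))
    # mirror/libelf/libelf-0.8.13.tar.gz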
.. _spack-mirror:
``spack mirror``
----------------------------
Mirrors are managed with the ``spack mirror`` command. The help for
``spack mirror`` looks like this::
$ spack mirror -h
usage: spack mirror [-h] SUBCOMMAND ...
positional arguments:
SUBCOMMAND
create Create a directory to be used as a spack mirror, and fill
it with package archives.
add Add a mirror to Spack.
remove Remove a mirror by name.
list Print out available mirrors to the console.
optional arguments:
-h, --help show this help message and exit
The ``create`` command actually builds a mirror by fetching all of its
packages from the internet and checksumming them.
The other three commands are for managing mirror configuration. They
control the URL(s) from which Spack downloads its packages.
.. _spack-mirror-create:
``spack mirror create``
----------------------------
You can create a mirror using the ``spack mirror create`` command, assuming
you're on a machine where you can access the internet.
The command will iterate through all of Spack's packages and download
the safe ones into a directory structure like the one above. Here is
what it looks like:
.. code-block:: bash
$ spack mirror create libelf libdwarf
==> Created new mirror in spack-mirror-2014-06-24
==> Trying to fetch from http://www.mr511.de/software/libelf-0.8.13.tar.gz
########################################################## 81.6%
==> Checksum passed for libelf@0.8.13
==> Added libelf@0.8.13
==> Trying to fetch from http://www.mr511.de/software/libelf-0.8.12.tar.gz
###################################################################### 98.6%
==> Checksum passed for libelf@0.8.12
==> Added libelf@0.8.12
==> Trying to fetch from http://www.prevanders.net/libdwarf-20130207.tar.gz
###################################################################### 97.3%
==> Checksum passed for libdwarf@20130207
==> Added libdwarf@20130207
==> Trying to fetch from http://www.prevanders.net/libdwarf-20130126.tar.gz
######################################################## 78.9%
==> Checksum passed for libdwarf@20130126
==> Added libdwarf@20130126
==> Trying to fetch from http://www.prevanders.net/libdwarf-20130729.tar.gz
############################################################# 84.7%
==> Added libdwarf@20130729
==> Added spack-mirror-2014-06-24/libdwarf/libdwarf-20130729.tar.gz to mirror
==> Added python@2.7.8.
==> Successfully updated mirror in spack-mirror-2015-02-24.
Archive stats:
0 already present
5 added
0 failed to fetch.
Once this is done, you can tar up the ``spack-mirror-2014-06-24`` directory and
copy it over to the machine you want it hosted on.
Custom package sets
~~~~~~~~~~~~~~~~~~~~~~~
Normally, ``spack mirror create`` downloads all the archives it has
checksums for. If you want to only create a mirror for a subset of
packages, you can do that by supplying a list of package specs on the
command line after ``spack mirror create``. For example, this
command::
$ spack mirror create libelf@0.8.12: boost@1.44:
Will create a mirror for libelf versions greater than or equal to
0.8.12 and boost versions greater than or equal to 1.44.
Mirror files
~~~~~~~~~~~~~~~~~~~~~~~
If you have a *very* large number of packages you want to mirror, you
can supply a file with specs in it, one per line::
$ cat specs.txt
libdwarf
libelf@0.8.12:
boost@1.44:
boost@1.39.0
...
$ spack mirror create -f specs.txt
...
This is useful if there is a specific suite of software managed by
your site.
.. _spack-mirror-add:
``spack mirror add``
----------------------------
Once you have a mirror, you need to let spack know about it. This is
relatively simple. First, figure out the URL for the mirror. If it's
a file, you can use a file URL like this one::
file:///Users/gamblin2/spack-mirror-2014-06-24
That points to the directory on the local filesystem. If it were on a
web server, you could use a URL like this one:
https://example.com/some/web-hosted/directory/spack-mirror-2014-06-24
Spack will use the URL as the root for all of the packages it fetches.
You can tell your Spack installation to use that mirror like this:
.. code-block:: bash
$ spack mirror add local_filesystem file:///Users/gamblin2/spack-mirror-2014-06-24
Each mirror has a name so that you can refer to it again later.
.. _spack-mirror-list:
``spack mirror list``
----------------------------
If you want to see all the mirrors Spack knows about you can run ``spack mirror list``::
$ spack mirror list
local_filesystem file:///Users/gamblin2/spack-mirror-2014-06-24
.. _spack-mirror-remove:
``spack mirror remove``
----------------------------
And, if you want to remove a mirror, just remove it by name::
$ spack mirror remove local_filesystem
$ spack mirror list
==> No mirrors configured.
Mirror precedence
----------------------------
Adding a mirror really just adds a section in ``~/.spackconfig``::
[mirror "local_filesystem"]
url = file:///Users/gamblin2/spack-mirror-2014-06-24
[mirror "remote_server"]
url = https://example.com/some/web-hosted/directory/spack-mirror-2014-06-24
If you want to change the order in which mirrors are searched for
packages, you can edit this file and reorder the sections. Spack will
search the topmost mirror first and the bottom-most mirror last.
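Since the file is INI-style, "search order" is literally section order. A rough sketch of reading it with the standard library (this is not Spack's actual config code; ``dict_type`` keeps sections in file order, and on Python 2.6 you would substitute the ``OrderedDict`` backport vendored later in this changeset)::

    import os
    from ConfigParser import RawConfigParser   # Python 2 stdlib
    from collections import OrderedDict        # 2.7; vendored backport on 2.6

    parser = RawConfigParser(dict_type=OrderedDict)
    parser.read(os.path.expanduser('~/.spackconfig'))
    for section in parser.sections():          # file order == search order
        if section.startswith('mirror '):
            print('%s -> %s' % (section, parser.get(section, 'url')))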

File diff suppressed because it is too large

View File

@@ -1,208 +1,15 @@
.. _site-configuration:
Site-specific configuration
Site configuration
===================================
.. _mirrors:
Mirrors
----------------------------
Some sites may not have access to the internet for fetching packages.
These sites will need a local repository of tarballs from which they
can get their files. Spack has support for this with *mirrors*. A
mirror is a URL that points to a directory, either on the local
filesystem or on some server, containing tarballs for all of Spack's
packages.
Here's an example of a mirror's directory structure::
mirror/
cmake/
cmake-2.8.10.2.tar.gz
dyninst/
DyninstAPI-8.1.1.tgz
DyninstAPI-8.1.2.tgz
libdwarf/
libdwarf-20130126.tar.gz
libdwarf-20130207.tar.gz
libdwarf-20130729.tar.gz
libelf/
libelf-0.8.12.tar.gz
libelf-0.8.13.tar.gz
libunwind/
libunwind-1.1.tar.gz
mpich/
mpich-3.0.4.tar.gz
mvapich2/
mvapich2-1.9.tgz
The structure is very simple. There is a top-level directory. The
second level directories are named after packages, and the third level
contains tarballs for each package, named as they were in the
package's fetch URL.
``spack mirror``
~~~~~~~~~~~~~~~~~~~~~~~
Mirrors are managed with the ``spack mirror`` command. The help for
``spack mirror`` looks like this::
$ spack mirror -h
usage: spack mirror [-h] SUBCOMMAND ...
positional arguments:
SUBCOMMAND
create Create a directory to be used as a spack mirror, and fill
it with package archives.
add Add a mirror to Spack.
remove Remove a mirror by name.
list Print out available mirrors to the console.
optional arguments:
-h, --help show this help message and exit
The ``create`` command actually builds a mirror by fetching all of its
packages from the internet and checksumming them.
The other three commands are for managing mirror configuration. They
control the URL(s) from which Spack downloads its packages.
``spack mirror create``
~~~~~~~~~~~~~~~~~~~~~~~
You can create a mirror using the ``spack mirror create`` command, assuming
you're on a machine where you can access the internet.
The command will iterate through all of Spack's packages and download
the safe ones into a directory structure like the one above. Here is
what it looks like:
.. code-block:: bash
$ spack mirror create libelf libdwarf
==> Created new mirror in spack-mirror-2014-06-24
==> Trying to fetch from http://www.mr511.de/software/libelf-0.8.13.tar.gz
########################################################## 81.6%
==> Checksum passed for libelf@0.8.13
==> Added spack-mirror-2014-06-24/libelf/libelf-0.8.13.tar.gz to mirror
==> Trying to fetch from http://www.mr511.de/software/libelf-0.8.12.tar.gz
###################################################################### 98.6%
==> Checksum passed for libelf@0.8.12
==> Added spack-mirror-2014-06-24/libelf/libelf-0.8.12.tar.gz to mirror
==> Trying to fetch from http://www.prevanders.net/libdwarf-20130207.tar.gz
###################################################################### 97.3%
==> Checksum passed for libdwarf@20130207
==> Added spack-mirror-2014-06-24/libdwarf/libdwarf-20130207.tar.gz to mirror
==> Trying to fetch from http://www.prevanders.net/libdwarf-20130126.tar.gz
######################################################## 78.9%
==> Checksum passed for libdwarf@20130126
==> Added spack-mirror-2014-06-24/libdwarf/libdwarf-20130126.tar.gz to mirror
==> Trying to fetch from http://www.prevanders.net/libdwarf-20130729.tar.gz
############################################################# 84.7%
==> Checksum passed for libdwarf@20130729
==> Added spack-mirror-2014-06-24/libdwarf/libdwarf-20130729.tar.gz to mirror
Once this is done, you can tar up the ``spack-mirror-2014-06-24`` directory and
copy it over to the machine you want it hosted on.
Custom package sets
^^^^^^^^^^^^^^^^^^^^^^^^
Normally, ``spack mirror create`` downloads all the archives it has
checksums for. If you want to only create a mirror for a subset of
packages, you can do that by supplying a list of package specs on the
command line after ``spack mirror create``. For example, this
command::
$ spack mirror create libelf@0.8.12: boost@1.44:
Will create a mirror for libelf versions greater than or equal to
0.8.12 and boost versions greater than or equal to 1.44.
Mirror files
^^^^^^^^^^^^^^^^^^^^^^^^
If you have a *very* large number of packages you want to mirror, you
can supply a file with specs in it, one per line::
$ cat specs.txt
libdwarf
libelf@0.8.12:
boost@1.44:
boost@1.39.0
...
$ spack mirror create -f specs.txt
...
This is useful if there is a specific suite of software managed by
your site.
``spack mirror add``
~~~~~~~~~~~~~~~~~~~~~~~~~~~
Once you have a mirror, you need to let spack know about it. This is
relatively simple. First, figure out the URL for the mirror. If it's
a file, you can use a file URL like this one::
file:///Users/gamblin2/spack-mirror-2014-06-24
That points to the directory on the local filesystem. If it were on a
web server, you could use a URL like this one:
https://example.com/some/web-hosted/directory/spack-mirror-2014-06-24
Spack will use the URL as the root for all of the packages it fetches.
You can tell your Spack installation to use that mirror like this:
.. code-block:: bash
$ spack mirror add local_filesystem file:///Users/gamblin2/spack-mirror-2014-06-24
Each mirror has a name so that you can refer to it again later.
``spack mirror list``
~~~~~~~~~~~~~~~~~~~~~~~~~~~
If you want to see all the mirrors Spack knows about you can run ``spack mirror list``::
$ spack mirror list
local_filesystem file:///Users/gamblin2/spack-mirror-2014-06-24
``spack mirror remove``
~~~~~~~~~~~~~~~~~~~~~~~~~~~
And, if you want to remove a mirror, just remove it by name::
$ spack mirror remove local_filesystem
$ spack mirror list
==> No mirrors configured.
Mirror precedence
~~~~~~~~~~~~~~~~~~~~~~~~~
Adding a mirror really just adds a section in ``~/.spackconfig``::
[mirror "local_filesystem"]
url = file:///Users/gamblin2/spack-mirror-2014-06-24
[mirror "remote_server"]
url = https://example.com/some/web-hosted/directory/spack-mirror-2014-06-24
If you want to change the order in which mirrors are searched for
packages, you can edit this file and reorder the sections. Spack will
search the topmost mirror first and the bottom-most mirror last.
.. _temp-space:
Temporary space
----------------------------
.. warning:: Temporary space configuration will be moved to configuration files.
The intructions here are old and refer to ``__init__.py``
The instructions here are old and refer to ``__init__.py``
By default, Spack will try to do all of its building in temporary
space. There are two main reasons for this. First, Spack is designed
@@ -286,7 +93,7 @@ the virtual spec to specs for possible implementations, and
later, so there is no need to fully concretize the spec when returning
it.
The ``DefaultConcretizer`` is intendend to provide sensible defaults
The ``DefaultConcretizer`` is intended to provide sensible defaults
for each policy, but there are certain choices that it can't know
about. For example, one site might prefer ``OpenMPI`` over ``MPICH``,
or another might prefer an old version of some packages. These types
@@ -327,3 +134,53 @@ Set concretizer to *your own* class instead of the default:
concretizer = MyConcretizer()
The next time you run Spack, your changes should take effect.
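A rough sketch of such a subclass, assuming the ``DefaultConcretizer`` interface implied by the surrounding text (the module path and method name here are assumptions, not shown by this changeset):

    from spack.concretize import DefaultConcretizer

    class MyConcretizer(DefaultConcretizer):
        def concretize_version(self, spec):
            # A site policy would special-case packages here (e.g. pin an
            # old version); everything else falls back to the default.
            return DefaultConcretizer.concretize_version(self, spec)

    concretizer = MyConcretizer()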
Profiling
~~~~~~~~~~~~~~~~~~~~~
Spack has some limited built-in support for profiling, and can report
statistics using standard Python timing tools. To use this feature,
supply ``-p`` to Spack on the command line, before any subcommands.
.. _spack-p:
``spack -p``
^^^^^^^^^^^^^^^^^^
``spack -p`` output looks like this:
.. code-block:: sh
$ spack -p graph dyninst
o dyninst
|\
| |\
| o | libdwarf
|/ /
o | libelf
/
o boost
307670 function calls (305943 primitive calls) in 0.127 seconds
Ordered by: internal time
ncalls tottime percall cumtime percall filename:lineno(function)
853 0.021 0.000 0.066 0.000 inspect.py:472(getmodule)
51197 0.011 0.000 0.018 0.000 inspect.py:51(ismodule)
73961 0.010 0.000 0.010 0.000 {isinstance}
1762 0.006 0.000 0.053 0.000 inspect.py:440(getsourcefile)
32075 0.006 0.000 0.006 0.000 {hasattr}
1760 0.004 0.000 0.004 0.000 {posix.stat}
2240 0.004 0.000 0.004 0.000 {posix.lstat}
2602 0.004 0.000 0.011 0.000 inspect.py:398(getfile)
771 0.004 0.000 0.077 0.000 inspect.py:518(findsource)
2656 0.004 0.000 0.004 0.000 {method 'match' of '_sre.SRE_Pattern' objects}
30772 0.003 0.000 0.003 0.000 {method 'get' of 'dict' objects}
...
The bottom of the output shows the top most time consuming functions,
slowest on top. The profiling support is from Python's built-in tool,
`cProfile
<https://docs.python.org/2/library/profile.html#module-cProfile>`_.

426
lib/spack/env/cc vendored
View File

@@ -1,140 +1,332 @@
#!/usr/bin/env python
import sys
if not sys.version_info[:2] >= (2,7):
sys.exit("Spack requires Python 2.7. Version was %s." % sys.version_info)
#!/bin/bash
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
#
# Spack compiler wrapper script.
#
# Compiler commands go through this compiler wrapper in Spack builds.
# The compiler wrapper is a thin layer around the standard compilers.
# It enables several key pieces of functionality:
#
# 1. It allows Spack to swap compilers into and out of builds easily.
# 2. It adds several options to the compile line so that spack
# packages can find their dependencies at build time and run time:
# -I arguments for dependency /include directories.
# -L arguments for dependency /lib directories.
# -Wl,-rpath arguments for dependency /lib directories.
#
import os
import re
import subprocess
import argparse
from contextlib import closing
# This is the list of environment variables that need to be set before
# the script runs. They are set by routines in spack.build_environment
# as part of spack.package.Package.do_install().
parameters="
SPACK_PREFIX
SPACK_ENV_PATH
SPACK_DEBUG_LOG_DIR
SPACK_COMPILER_SPEC
SPACK_SHORT_SPEC"
# Import spack parameters through the build environment.
spack_lib = os.environ.get("SPACK_LIB")
if not spack_lib:
print "Spack compiler must be run from spack!"
sys.exit(1)
# The compiler input variables are checked for sanity later:
# SPACK_CC, SPACK_CXX, SPACK_F77, SPACK_FC
# Debug flag is optional; set to true for debug logging:
# SPACK_DEBUG
# Test command is used to unit test the compiler script.
# SPACK_TEST_COMMAND
# Dependencies can be empty for pkgs with no deps:
# SPACK_DEPENDENCIES
# Grab a minimal set of spack packages
sys.path.append(spack_lib)
from spack.compilation import *
import llnl.util.tty as tty
# die()
# Prints a message and exits with error 1.
function die {
echo "$@"
exit 1
}
spack_prefix = get_env_var("SPACK_PREFIX")
spack_debug = get_env_flag("SPACK_DEBUG")
spack_deps = get_path("SPACK_DEPENDENCIES")
spack_env_path = get_path("SPACK_ENV_PATH")
spack_debug_log_dir = get_env_var("SPACK_DEBUG_LOG_DIR")
spack_spec = get_env_var("SPACK_SPEC")
for param in $parameters; do
if [ -z "${!param}" ]; then
die "Spack compiler must be run from spack! Input $param was missing!"
fi
done
compiler_spec = get_env_var("SPACK_COMPILER_SPEC")
spack_cc = get_env_var("SPACK_CC", required=False)
spack_cxx = get_env_var("SPACK_CXX", required=False)
spack_f77 = get_env_var("SPACK_F77", required=False)
spack_fc = get_env_var("SPACK_FC", required=False)
#
# Figure out the type of compiler, the language, and the mode so that
# the compiler script knows what to do.
#
# Possible languages are C, C++, Fortran 77, and Fortran 90.
# 'command' is set based on the input command to $SPACK_[CC|CXX|F77|F90]
#
# 'mode' is set to one of:
# cc compile
# ld link
# ccld compile & link
# cpp preprocessor
# vcheck version check
#
command=$(basename "$0")
case "$command" in
cc|gcc|c89|c99|clang)
command="$SPACK_CC"
language="C"
;;
c++|CC|g++|clang++)
command="$SPACK_CXX"
language="C++"
;;
f77)
command="$SPACK_F77"
language="Fortran 77"
;;
fc|f90|f95)
command="$SPACK_FC"
language="Fortran 90"
;;
cpp)
mode=cpp
;;
ld)
mode=ld
;;
*)
die "Unkown compiler: $command"
;;
esac
# Figure out what type of operation we're doing
command = os.path.basename(sys.argv[0])
# Finish setting up the mode.
if [ -z "$mode" ]; then
mode=ccld
for arg in "$@"; do
if [ "$arg" = -v -o "$arg" = -V -o "$arg" = --version -o "$arg" = -dumpversion ]; then
mode=vcheck
break
elif [ "$arg" = -E ]; then
mode=cpp
break
elif [ "$arg" = -c ]; then
mode=cc
break
fi
done
fi
cpp, cc, ccld, ld, version_check = range(5)
# Dump the version and exit if we're in testing mode.
if [ "$SPACK_TEST_COMMAND" = "dump-mode" ]; then
echo "$mode"
exit
fi
if command == 'cpp':
mode = cpp
elif command == 'ld':
mode = ld
elif '-E' in sys.argv:
mode = cpp
elif '-c' in sys.argv:
mode = cc
else:
mode = ccld
# Check that at least one of the real commands was actually selected,
# otherwise we don't know what to execute.
if [ -z "$command" ]; then
die "ERROR: Compiler '$SPACK_COMPILER_SPEC' does not support compiling $language programs."
fi
# Save original command for debug logging
input_command="$@"
if command in ('cc', 'gcc', 'c89', 'c99', 'clang'):
command = spack_cc
language = "C"
elif command in ('c++', 'CC', 'g++', 'clang++'):
command = spack_cxx
language = "C++"
elif command in ('f77'):
command = spack_f77
language = "Fortran 77"
elif command in ('fc', 'f90', 'f95'):
command = spack_fc
language = "Fortran 90"
elif command in ('ld', 'cpp'):
pass # leave it the same. TODO: what's the right thing?
else:
raise Exception("Unknown compiler: %s" % command)
#
# Now do real parsing of the command line args, trying hard to keep
# non-rpath linker arguments in the proper order w.r.t. other command
# line arguments. This is important for things like groups.
#
includes=()
libraries=()
libs=()
rpaths=()
other_args=()
if command is None:
print "ERROR: Compiler '%s' does not support compiling %s programs." % (
compiler_spec, language)
sys.exit(1)
while [ -n "$1" ]; do
case "$1" in
-I*)
arg="${1#-I}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
includes+=("$arg")
;;
-L*)
arg="${1#-L}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
libraries+=("$arg")
;;
-l*)
arg="${1#-l}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
libs+=("$arg")
;;
-Wl,*)
arg="${1#-Wl,}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if [[ "$arg" = -rpath=* ]]; then
rpaths+=("${arg#-rpath=}")
elif [[ "$arg" = -rpath ]]; then
shift; arg="$1"
if [[ "$arg" != -Wl,* ]]; then
die "-Wl,-rpath was not followed by -Wl,*"
fi
rpaths+=("${arg#-Wl,}")
else
other_args+=("-Wl,$arg")
fi
;;
-Xlinker,*)
arg="${1#-Xlinker,}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if [[ "$arg" = -rpath=* ]]; then
rpaths+=("${arg#-rpath=}")
elif [[ "$arg" = -rpath ]]; then
shift; arg="$1"
if [[ "$arg" != -Xlinker,* ]]; then
die "-Xlinker,-rpath was not followed by -Xlinker,*"
fi
rpaths+=("${arg#-Xlinker,}")
else
other_args+=("-Xlinker,$arg")
fi
;;
*)
other_args+=("$1")
;;
esac
shift
done
version_args = ['-V', '-v', '--version', '-dumpversion']
if any(arg in sys.argv for arg in version_args):
mode = version_check
# Dump parsed values for unit testing if asked for
if [ -n "$SPACK_TEST_COMMAND" ]; then
IFS=$'\n'
case "$SPACK_TEST_COMMAND" in
dump-includes) echo "${includes[*]}";;
dump-libraries) echo "${libraries[*]}";;
dump-libs) echo "${libs[*]}";;
dump-rpaths) echo "${rpaths[*]}";;
dump-other-args) echo "${other_args[*]}";;
dump-all)
echo "INCLUDES:"
echo "${includes[*]}"
echo
echo "LIBRARIES:"
echo "${libraries[*]}"
echo
echo "LIBS:"
echo "${libs[*]}"
echo
echo "RPATHS:"
echo "${rpaths[*]}"
echo
echo "ARGS:"
echo "${other_args[*]}"
;;
*)
echo "ERROR: Unknown test command"
exit 1 ;;
esac
exit
fi
# Parse out the includes, libs, etc. so we can adjust them if need be.
parser = argparse.ArgumentParser(add_help=False)
parser.add_argument("-I", action='append', default=[], dest='include_path')
parser.add_argument("-L", action='append', default=[], dest='lib_path')
parser.add_argument("-l", action='append', default=[], dest='libs')
# Read spack dependencies from the path environment variable
IFS=':' read -ra deps <<< "$SPACK_DEPENDENCIES"
for dep in "${deps[@]}"; do
if [ -d "$dep/include" ]; then
includes+=("$dep/include")
fi
options, other_args = parser.parse_known_args()
rpaths, other_args = parse_rpaths(other_args)
if [ -d "$dep/lib" ]; then
libraries+=("$dep/lib")
rpaths+=("$dep/lib")
fi
# Add dependencies' include and lib paths to our compiler flags.
def add_if_dir(path_list, directory, index=None):
if os.path.isdir(directory):
if index is None:
path_list.append(directory)
else:
path_list.insert(index, directory)
if [ -d "$dep/lib64" ]; then
libraries+=("$dep/lib64")
rpaths+=("$dep/lib64")
fi
done
for dep_dir in spack_deps:
add_if_dir(options.include_path, os.path.join(dep_dir, "include"))
add_if_dir(options.lib_path, os.path.join(dep_dir, "lib"))
add_if_dir(options.lib_path, os.path.join(dep_dir, "lib64"))
# Include all -L's and prefix/whatever dirs in rpath
for dir in "${libraries[@]}"; do
[ "$dir" != "." ] && rpaths+=("$dir")
done
rpaths+=("$SPACK_PREFIX/lib")
rpaths+=("$SPACK_PREFIX/lib64")
# Add our modified arguments to it.
arguments = ['-I%s' % path for path in options.include_path]
arguments += other_args
arguments += ['-L%s' % path for path in options.lib_path]
arguments += ['-l%s' % path for path in options.libs]
# Put the arguments together
args=()
for dir in "${includes[@]}"; do args+=("-I$dir"); done
args+=("${other_args[@]}")
for dir in "${libraries[@]}"; do args+=("-L$dir"); done
for lib in "${libs[@]}"; do args+=("-l$lib"); done
# Add rpaths to install dir and its dependencies. We add both lib and lib64
# here because we don't know which will be created.
rpaths.extend(options.lib_path)
rpaths.append('%s/lib' % spack_prefix)
rpaths.append('%s/lib64' % spack_prefix)
if mode == ccld:
arguments += ['-Wl,-rpath,%s' % p for p in rpaths]
elif mode == ld:
pairs = [('-rpath', '%s' % p) for p in rpaths]
arguments += [item for sublist in pairs for item in sublist]
if [ "$mode" = ccld ]; then
for dir in "${rpaths[@]}"; do
args+=("-Wl,-rpath")
args+=("-Wl,$dir");
done
elif [ "$mode" = ld ]; then
for dir in "${rpaths[@]}"; do
args+=("-rpath")
args+=("$dir");
done
fi
# Unset some pesky environment variables
for var in ["LD_LIBRARY_PATH", "LD_RUN_PATH", "DYLD_LIBRARY_PATH"]:
if var in os.environ:
os.environ.pop(var)
#
# Unset pesky environment variables that could affect build sanity.
#
unset LD_LIBRARY_PATH
unset LD_RUN_PATH
unset DYLD_LIBRARY_PATH
# Ensure that the delegated command doesn't just call this script again.
remove_paths = ['.'] + spack_env_path
path = [p for p in get_path("PATH") if p not in remove_paths]
os.environ["PATH"] = ":".join(path)
#
# Filter '.' and Spack environment directories out of PATH so that
# this script doesn't just call itself
#
IFS=':' read -ra env_path <<< "$PATH"
IFS=':' read -ra spack_env_dirs <<< "$SPACK_ENV_PATH"
spack_env_dirs+=(".")
PATH=""
for dir in "${env_path[@]}"; do
remove=""
for rm_dir in "${spack_env_dirs[@]}"; do
if [ "$dir" = "$rm_dir" ]; then remove=True; fi
done
if [ -z "$remove" ]; then
if [ -z "$PATH" ]; then
PATH="$dir"
else
PATH="$PATH:$dir"
fi
fi
done
export PATH
full_command = [command] + arguments
full_command=("$command")
full_command+=("${args[@]}")
if spack_debug:
input_log = os.path.join(spack_debug_log_dir, 'spack-cc-%s.in.log' % spack_spec)
output_log = os.path.join(spack_debug_log_dir, 'spack-cc-%s.out.log' % spack_spec)
with closing(open(input_log, 'a')) as log:
args = [os.path.basename(sys.argv[0])] + sys.argv[1:]
log.write("%s\n" % " ".join(arg.replace(' ', r'\ ') for arg in args))
with closing(open(output_log, 'a')) as log:
log.write("%s\n" % " ".join(full_command))
#
# Write the input and output commands to debug logs if it's asked for.
#
if [ "$SPACK_DEBUG" = "TRUE" ]; then
input_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_SHORT_SPEC.in.log"
output_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_SHORT_SPEC.out.log"
echo "$input_command" >> $input_log
echo "$mode ${full_command[@]}" >> $output_log
fi
rcode = subprocess.call(full_command)
sys.exit(rcode)
exec "${full_command[@]}"

View File

@@ -23,48 +23,11 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""
Functions for comparing values that may potentially be None.
These none_low functions consider None as less than all other values.
This module contains external, potentially separately licensed,
packages that are included in spack.
So far:
argparse: We include our own version to be Python 2.6 compatible.
pyqver2: External script to query required python version of python source code.
Used for ensuring 2.6 compatibility.
"""
# Preserve builtin min and max functions
_builtin_min = min
_builtin_max = max
def lt(lhs, rhs):
"""Less-than comparison. None is lower than any value."""
return lhs != rhs and (lhs is None or (rhs is not None and lhs < rhs))
def le(lhs, rhs):
"""Less-than-or-equal comparison. None is less than any value."""
return lhs == rhs or lt(lhs, rhs)
def gt(lhs, rhs):
"""Greater-than comparison. None is less than any value."""
return lhs != rhs and not lt(lhs, rhs)
def ge(lhs, rhs):
"""Greater-than-or-equal comparison. None is less than any value."""
return lhs == rhs or gt(lhs, rhs)
def min(lhs, rhs):
"""Minimum function where None is less than any value."""
if lhs is None or rhs is None:
return None
else:
return _builtin_min(lhs, rhs)
def max(lhs, rhs):
"""Maximum function where None is less than any value."""
if lhs is None:
return rhs
elif rhs is None:
return lhs
else:
return _builtin_max(lhs, rhs)

2399
lib/spack/external/argparse.py vendored Normal file

File diff suppressed because it is too large

30
lib/spack/external/functools.py vendored Normal file
View File

@@ -0,0 +1,30 @@
#
# Backport of Python 2.7's total_ordering.
#
def total_ordering(cls):
"""Class decorator that fills in missing ordering methods"""
convert = {
'__lt__': [('__gt__', lambda self, other: not (self < other or self == other)),
('__le__', lambda self, other: self < other or self == other),
('__ge__', lambda self, other: not self < other)],
'__le__': [('__ge__', lambda self, other: not self <= other or self == other),
('__lt__', lambda self, other: self <= other and not self == other),
('__gt__', lambda self, other: not self <= other)],
'__gt__': [('__lt__', lambda self, other: not (self > other or self == other)),
('__ge__', lambda self, other: self > other or self == other),
('__le__', lambda self, other: not self > other)],
'__ge__': [('__le__', lambda self, other: (not self >= other) or self == other),
('__gt__', lambda self, other: self >= other and not self == other),
('__lt__', lambda self, other: not self >= other)]
}
roots = set(dir(cls)) & set(convert)
if not roots:
raise ValueError('must define at least one ordering operation: < > <= >=')
root = max(roots) # prefer __lt__ to __le__ to __gt__ to __ge__
for opname, opfunc in convert[root]:
if opname not in roots:
opfunc.__name__ = opname
opfunc.__doc__ = getattr(int, opname).__doc__
setattr(cls, opname, opfunc)
return cls
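
A quick demonstration that the backport behaves like 2.7's decorator — define ``__eq__`` plus one ordering and the rest are synthesized (assumes ``total_ordering`` above is in scope):

    @total_ordering
    class Version(object):
        def __init__(self, *parts):
            self.parts = parts
        def __eq__(self, other):
            return self.parts == other.parts
        def __lt__(self, other):
            return self.parts < other.parts

    assert Version(2, 6) <= Version(2, 7)   # __le__ synthesized from __lt__
    assert Version(2, 7) >= Version(2, 6)   # __ge__ synthesized too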

262
lib/spack/external/ordereddict.py vendored Normal file
View File

@@ -0,0 +1,262 @@
#
# Backport of OrderedDict() class that runs on Python 2.4, 2.5, 2.6, 2.7 and pypy.
# Passes Python2.7's test suite and incorporates all the latest updates.
#
# From http://code.activestate.com/recipes/576693-ordered-dictionary-for-py24/
# This file is in the public domain, and has no particular license.
#
try:
from thread import get_ident as _get_ident
except ImportError:
from dummy_thread import get_ident as _get_ident
try:
from _abcoll import KeysView, ValuesView, ItemsView
except ImportError:
pass
class OrderedDict(dict):
'Dictionary that remembers insertion order'
# An inherited dict maps keys to values.
# The inherited dict provides __getitem__, __len__, __contains__, and get.
# The remaining methods are order-aware.
# Big-O running times for all methods are the same as for regular dictionaries.
# The internal self.__map dictionary maps keys to links in a doubly linked list.
# The circular doubly linked list starts and ends with a sentinel element.
# The sentinel element never gets deleted (this simplifies the algorithm).
# Each link is stored as a list of length three: [PREV, NEXT, KEY].
def __init__(self, *args, **kwds):
'''Initialize an ordered dictionary. Signature is the same as for
regular dictionaries, but keyword arguments are not recommended
because their insertion order is arbitrary.
'''
if len(args) > 1:
raise TypeError('expected at most 1 arguments, got %d' % len(args))
try:
self.__root
except AttributeError:
self.__root = root = [] # sentinel node
root[:] = [root, root, None]
self.__map = {}
self.__update(*args, **kwds)
def __setitem__(self, key, value, dict_setitem=dict.__setitem__):
'od.__setitem__(i, y) <==> od[i]=y'
# Setting a new item creates a new link which goes at the end of the linked
# list, and the inherited dictionary is updated with the new key/value pair.
if key not in self:
root = self.__root
last = root[0]
last[1] = root[0] = self.__map[key] = [last, root, key]
dict_setitem(self, key, value)
def __delitem__(self, key, dict_delitem=dict.__delitem__):
'od.__delitem__(y) <==> del od[y]'
# Deleting an existing item uses self.__map to find the link which is
# then removed by updating the links in the predecessor and successor nodes.
dict_delitem(self, key)
link_prev, link_next, key = self.__map.pop(key)
link_prev[1] = link_next
link_next[0] = link_prev
def __iter__(self):
'od.__iter__() <==> iter(od)'
root = self.__root
curr = root[1]
while curr is not root:
yield curr[2]
curr = curr[1]
def __reversed__(self):
'od.__reversed__() <==> reversed(od)'
root = self.__root
curr = root[0]
while curr is not root:
yield curr[2]
curr = curr[0]
def clear(self):
'od.clear() -> None. Remove all items from od.'
try:
for node in self.__map.itervalues():
del node[:]
root = self.__root
root[:] = [root, root, None]
self.__map.clear()
except AttributeError:
pass
dict.clear(self)
def popitem(self, last=True):
'''od.popitem() -> (k, v), return and remove a (key, value) pair.
Pairs are returned in LIFO order if last is true or FIFO order if false.
'''
if not self:
raise KeyError('dictionary is empty')
root = self.__root
if last:
link = root[0]
link_prev = link[0]
link_prev[1] = root
root[0] = link_prev
else:
link = root[1]
link_next = link[1]
root[1] = link_next
link_next[0] = root
key = link[2]
del self.__map[key]
value = dict.pop(self, key)
return key, value
# -- the following methods do not depend on the internal structure --
def keys(self):
'od.keys() -> list of keys in od'
return list(self)
def values(self):
'od.values() -> list of values in od'
return [self[key] for key in self]
def items(self):
'od.items() -> list of (key, value) pairs in od'
return [(key, self[key]) for key in self]
def iterkeys(self):
'od.iterkeys() -> an iterator over the keys in od'
return iter(self)
def itervalues(self):
'od.itervalues -> an iterator over the values in od'
for k in self:
yield self[k]
def iteritems(self):
'od.iteritems -> an iterator over the (key, value) items in od'
for k in self:
yield (k, self[k])
def update(*args, **kwds):
'''od.update(E, **F) -> None. Update od from dict/iterable E and F.
If E is a dict instance, does: for k in E: od[k] = E[k]
If E has a .keys() method, does: for k in E.keys(): od[k] = E[k]
Or if E is an iterable of items, does: for k, v in E: od[k] = v
In either case, this is followed by: for k, v in F.items(): od[k] = v
'''
if len(args) > 2:
raise TypeError('update() takes at most 2 positional '
'arguments (%d given)' % (len(args),))
elif not args:
raise TypeError('update() takes at least 1 argument (0 given)')
self = args[0]
# Make progressively weaker assumptions about "other"
other = ()
if len(args) == 2:
other = args[1]
if isinstance(other, dict):
for key in other:
self[key] = other[key]
elif hasattr(other, 'keys'):
for key in other.keys():
self[key] = other[key]
else:
for key, value in other:
self[key] = value
for key, value in kwds.items():
self[key] = value
__update = update # let subclasses override update without breaking __init__
__marker = object()
def pop(self, key, default=__marker):
'''od.pop(k[,d]) -> v, remove specified key and return the corresponding value.
If key is not found, d is returned if given, otherwise KeyError is raised.
'''
if key in self:
result = self[key]
del self[key]
return result
if default is self.__marker:
raise KeyError(key)
return default
def setdefault(self, key, default=None):
'od.setdefault(k[,d]) -> od.get(k,d), also set od[k]=d if k not in od'
if key in self:
return self[key]
self[key] = default
return default
def __repr__(self, _repr_running={}):
'od.__repr__() <==> repr(od)'
call_key = id(self), _get_ident()
if call_key in _repr_running:
return '...'
_repr_running[call_key] = 1
try:
if not self:
return '%s()' % (self.__class__.__name__,)
return '%s(%r)' % (self.__class__.__name__, self.items())
finally:
del _repr_running[call_key]
def __reduce__(self):
'Return state information for pickling'
items = [[k, self[k]] for k in self]
inst_dict = vars(self).copy()
for k in vars(OrderedDict()):
inst_dict.pop(k, None)
if inst_dict:
return (self.__class__, (items,), inst_dict)
return self.__class__, (items,)
def copy(self):
'od.copy() -> a shallow copy of od'
return self.__class__(self)
@classmethod
def fromkeys(cls, iterable, value=None):
'''OD.fromkeys(S[, v]) -> New ordered dictionary with keys from S
and values equal to v (which defaults to None).
'''
d = cls()
for key in iterable:
d[key] = value
return d
def __eq__(self, other):
'''od.__eq__(y) <==> od==y. Comparison to another OD is order-sensitive
while comparison to a regular mapping is order-insensitive.
'''
if isinstance(other, OrderedDict):
return len(self)==len(other) and self.items() == other.items()
return dict.__eq__(self, other)
def __ne__(self, other):
return not self == other
# -- the following methods are only used in Python 2.7 --
def viewkeys(self):
"od.viewkeys() -> a set-like object providing a view on od's keys"
return KeysView(self)
def viewvalues(self):
"od.viewvalues() -> an object providing a view on od's values"
return ValuesView(self)
def viewitems(self):
"od.viewitems() -> a set-like object providing a view on od's items"
return ItemsView(self)
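
A minimal demonstration of the property this backport exists to provide — iteration follows insertion order, even on Python 2.6:

    d = OrderedDict()
    d['libelf'] = '0.8.13'
    d['libdwarf'] = '20130729'
    d['cmake'] = '2.8.10.2'
    assert list(d.keys()) == ['libelf', 'libdwarf', 'cmake']
    d['libelf'] = '0.8.12'        # updating a key does not move it
    assert list(d)[0] == 'libelf'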

393
lib/spack/external/pyqver2.py vendored Executable file
View File

@@ -0,0 +1,393 @@
#!/usr/bin/env python
#
# pyqver2.py
# by Greg Hewgill
# https://github.com/ghewgill/pyqver
#
# This software is provided 'as-is', without any express or implied
# warranty. In no event will the author be held liable for any damages
# arising from the use of this software.
#
# Permission is granted to anyone to use this software for any purpose,
# including commercial applications, and to alter it and redistribute it
# freely, subject to the following restrictions:
#
# 1. The origin of this software must not be misrepresented; you must not
# claim that you wrote the original software. If you use this software
# in a product, an acknowledgment in the product documentation would be
# appreciated but is not required.
# 2. Altered source versions must be plainly marked as such, and must not be
# misrepresented as being the original software.
# 3. This notice may not be removed or altered from any source distribution.
#
# Copyright (c) 2009-2013 Greg Hewgill http://hewgill.com
#
import compiler
import platform
import sys
StandardModules = {
"__future__": (2, 1),
"abc": (2, 6),
"argparse": (2, 7),
"ast": (2, 6),
"atexit": (2, 0),
"bz2": (2, 3),
"cgitb": (2, 2),
"collections": (2, 4),
"contextlib": (2, 5),
"cookielib": (2, 4),
"cProfile": (2, 5),
"csv": (2, 3),
"ctypes": (2, 5),
"datetime": (2, 3),
"decimal": (2, 4),
"difflib": (2, 1),
"DocXMLRPCServer": (2, 3),
"dummy_thread": (2, 3),
"dummy_threading": (2, 3),
"email": (2, 2),
"fractions": (2, 6),
"functools": (2, 5),
"future_builtins": (2, 6),
"hashlib": (2, 5),
"heapq": (2, 3),
"hmac": (2, 2),
"hotshot": (2, 2),
"HTMLParser": (2, 2),
"importlib": (2, 7),
"inspect": (2, 1),
"io": (2, 6),
"itertools": (2, 3),
"json": (2, 6),
"logging": (2, 3),
"modulefinder": (2, 3),
"msilib": (2, 5),
"multiprocessing": (2, 6),
"netrc": (1, 5, 2),
"numbers": (2, 6),
"optparse": (2, 3),
"ossaudiodev": (2, 3),
"pickletools": (2, 3),
"pkgutil": (2, 3),
"platform": (2, 3),
"pydoc": (2, 1),
"runpy": (2, 5),
"sets": (2, 3),
"shlex": (1, 5, 2),
"SimpleXMLRPCServer": (2, 2),
"spwd": (2, 5),
"sqlite3": (2, 5),
"ssl": (2, 6),
"stringprep": (2, 3),
"subprocess": (2, 4),
"sysconfig": (2, 7),
"tarfile": (2, 3),
"textwrap": (2, 3),
"timeit": (2, 3),
"unittest": (2, 1),
"uuid": (2, 5),
"warnings": (2, 1),
"weakref": (2, 1),
"winsound": (1, 5, 2),
"wsgiref": (2, 5),
"xml.dom": (2, 0),
"xml.dom.minidom": (2, 0),
"xml.dom.pulldom": (2, 0),
"xml.etree.ElementTree": (2, 5),
"xml.parsers.expat":(2, 0),
"xml.sax": (2, 0),
"xml.sax.handler": (2, 0),
"xml.sax.saxutils": (2, 0),
"xml.sax.xmlreader":(2, 0),
"xmlrpclib": (2, 2),
"zipfile": (1, 6),
"zipimport": (2, 3),
"_ast": (2, 5),
"_winreg": (2, 0),
}
Functions = {
"all": (2, 5),
"any": (2, 5),
"collections.Counter": (2, 7),
"collections.defaultdict": (2, 5),
"collections.OrderedDict": (2, 7),
"functools.total_ordering": (2, 7),
"enumerate": (2, 3),
"frozenset": (2, 4),
"itertools.compress": (2, 7),
"math.erf": (2, 7),
"math.erfc": (2, 7),
"math.expm1": (2, 7),
"math.gamma": (2, 7),
"math.lgamma": (2, 7),
"memoryview": (2, 7),
"next": (2, 6),
"os.getresgid": (2, 7),
"os.getresuid": (2, 7),
"os.initgroups": (2, 7),
"os.setresgid": (2, 7),
"os.setresuid": (2, 7),
"reversed": (2, 4),
"set": (2, 4),
"subprocess.check_call": (2, 5),
"subprocess.check_output": (2, 7),
"sum": (2, 3),
"symtable.is_declared_global": (2, 7),
"weakref.WeakSet": (2, 7),
}
Identifiers = {
"False": (2, 2),
"True": (2, 2),
}
def uniq(a):
if len(a) == 0:
return []
else:
return [a[0]] + uniq([x for x in a if x != a[0]])
class NodeChecker(object):
def __init__(self):
self.vers = dict()
self.vers[(2,0)] = []
def add(self, node, ver, msg):
if ver not in self.vers:
self.vers[ver] = []
self.vers[ver].append((node.lineno, msg))
def default(self, node):
for child in node.getChildNodes():
self.visit(child)
def visitCallFunc(self, node):
def rollup(n):
if isinstance(n, compiler.ast.Name):
return n.name
elif isinstance(n, compiler.ast.Getattr):
r = rollup(n.expr)
if r:
return r + "." + n.attrname
name = rollup(node.node)
if name:
v = Functions.get(name)
if v is not None:
self.add(node, v, name)
self.default(node)
def visitClass(self, node):
if node.bases:
self.add(node, (2,2), "new-style class")
if node.decorators:
self.add(node, (2,6), "class decorator")
self.default(node)
def visitDictComp(self, node):
self.add(node, (2,7), "dictionary comprehension")
self.default(node)
def visitFloorDiv(self, node):
self.add(node, (2,2), "// operator")
self.default(node)
def visitFrom(self, node):
v = StandardModules.get(node.modname)
if v is not None:
self.add(node, v, node.modname)
for n in node.names:
name = node.modname + "." + n[0]
v = Functions.get(name)
if v is not None:
self.add(node, v, name)
def visitFunction(self, node):
if node.decorators:
self.add(node, (2,4), "function decorator")
self.default(node)
def visitGenExpr(self, node):
self.add(node, (2,4), "generator expression")
self.default(node)
def visitGetattr(self, node):
if (isinstance(node.expr, compiler.ast.Const)
and isinstance(node.expr.value, str)
and node.attrname == "format"):
self.add(node, (2,6), "string literal .format()")
self.default(node)
def visitIfExp(self, node):
self.add(node, (2,5), "inline if expression")
self.default(node)
def visitImport(self, node):
for n in node.names:
v = StandardModules.get(n[0])
if v is not None:
self.add(node, v, n[0])
self.default(node)
def visitName(self, node):
v = Identifiers.get(node.name)
if v is not None:
self.add(node, v, node.name)
self.default(node)
def visitSet(self, node):
self.add(node, (2,7), "set literal")
self.default(node)
def visitSetComp(self, node):
self.add(node, (2,7), "set comprehension")
self.default(node)
def visitTryFinally(self, node):
# try/finally with a suite generates a Stmt node as the body,
# but try/except/finally generates a TryExcept as the body
if isinstance(node.body, compiler.ast.TryExcept):
self.add(node, (2,5), "try/except/finally")
self.default(node)
def visitWith(self, node):
if isinstance(node.body, compiler.ast.With):
self.add(node, (2,7), "with statement with multiple contexts")
else:
self.add(node, (2,5), "with statement")
self.default(node)
def visitYield(self, node):
self.add(node, (2,2), "yield expression")
self.default(node)
def get_versions(source):
"""Return information about the Python versions required for specific features.
The return value is a dictionary with keys as a version number as a tuple
(for example Python 2.6 is (2,6)) and the value are a list of features that
require the indicated Python version.
"""
tree = compiler.parse(source)
checker = compiler.walk(tree, NodeChecker())
return checker.vers
def v27(source):
if sys.version_info >= (2, 7):
return qver(source)
else:
print >>sys.stderr, "Not all features tested, run --test with Python 2.7"
return (2, 7)
def qver(source):
"""Return the minimum Python version required to run a particular bit of code.
>>> qver('print "hello world"')
(2, 0)
>>> qver('class test(object): pass')
(2, 2)
>>> qver('yield 1')
(2, 2)
>>> qver('a // b')
(2, 2)
>>> qver('True')
(2, 2)
>>> qver('enumerate(a)')
(2, 3)
>>> qver('total = sum')
(2, 0)
>>> qver('sum(a)')
(2, 3)
>>> qver('(x*x for x in range(5))')
(2, 4)
>>> qver('class C:\\n @classmethod\\n def m(): pass')
(2, 4)
>>> qver('y if x else z')
(2, 5)
>>> qver('import hashlib')
(2, 5)
>>> qver('from hashlib import md5')
(2, 5)
>>> qver('import xml.etree.ElementTree')
(2, 5)
>>> qver('try:\\n try: pass;\\n except: pass;\\nfinally: pass')
(2, 0)
>>> qver('try: pass;\\nexcept: pass;\\nfinally: pass')
(2, 5)
>>> qver('from __future__ import with_statement\\nwith x: pass')
(2, 5)
>>> qver('collections.defaultdict(list)')
(2, 5)
>>> qver('from collections import defaultdict')
(2, 5)
>>> qver('"{0}".format(0)')
(2, 6)
>>> qver('memoryview(x)')
(2, 7)
>>> v27('{1, 2, 3}')
(2, 7)
>>> v27('{x for x in s}')
(2, 7)
>>> v27('{x: y for x in s}')
(2, 7)
>>> qver('from __future__ import with_statement\\nwith x:\\n with y: pass')
(2, 5)
>>> v27('from __future__ import with_statement\\nwith x, y: pass')
(2, 7)
>>> qver('@decorator\\ndef f(): pass')
(2, 4)
>>> qver('@decorator\\nclass test:\\n pass')
(2, 6)
#>>> qver('0o0')
#(2, 6)
#>>> qver('@foo\\nclass C: pass')
#(2, 6)
"""
return max(get_versions(source).keys())
if __name__ == '__main__':
Verbose = False
MinVersion = (2, 3)
Lint = False
files = []
i = 1
while i < len(sys.argv):
a = sys.argv[i]
if a == "--test":
import doctest
doctest.testmod()
sys.exit(0)
if a == "-v" or a == "--verbose":
Verbose = True
elif a == "-l" or a == "--lint":
Lint = True
elif a == "-m" or a == "--min-version":
i += 1
MinVersion = tuple(map(int, sys.argv[i].split(".")))
else:
files.append(a)
i += 1
if not files:
print >>sys.stderr, """Usage: %s [options] source ...
Report minimum Python version required to run given source files.
-m x.y or --min-version x.y (default 2.3)
report version triggers at or above version x.y in verbose mode
-v or --verbose
print more detailed report of version triggers for each version
""" % sys.argv[0]
sys.exit(1)
for fn in files:
try:
f = open(fn)
source = f.read()
f.close()
ver = get_versions(source)
if Verbose:
print fn
for v in sorted([k for k in ver.keys() if k >= MinVersion], reverse=True):
reasons = [x for x in uniq(ver[v]) if x]
if reasons:
# each reason is (lineno, message)
print "\t%s\t%s" % (".".join(map(str, v)), ", ".join([x[1] for x in reasons]))
elif Lint:
for v in sorted([k for k in ver.keys() if k >= MinVersion], reverse=True):
reasons = [x for x in uniq(ver[v]) if x]
for r in reasons:
# each reason is (lineno, message)
print "%s:%s: %s %s" % (fn, r[0], ".".join(map(str, v)), r[1])
else:
print "%s\t%s" % (".".join(map(str, max(ver.keys()))), fn)
except SyntaxError, x:
print "%s: syntax error compiling with Python %s: %s" % (fn, platform.python_version(), x)

View File

@@ -22,13 +22,15 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
__all__ = ['install', 'expand_user', 'working_dir', 'touch', 'mkdirp',
'join_path', 'ancestor', 'can_access', 'filter_file', 'change_sed_delimiter']
__all__ = ['set_install_permissions', 'install', 'expand_user', 'working_dir',
'touch', 'touchp', 'mkdirp', 'force_remove', 'join_path', 'ancestor',
'can_access', 'filter_file', 'change_sed_delimiter', 'is_exe']
import os
import sys
import re
import shutil
import stat
import errno
import getpass
from contextlib import contextmanager, closing
@@ -38,7 +40,7 @@
from spack.util.compression import ALLOWED_ARCHIVE_TYPES
def filter_file(regex, repl, *filenames):
def filter_file(regex, repl, *filenames, **kwargs):
"""Like sed, but uses python regular expressions.
Filters every line of file through regex and replaces the file
@@ -49,16 +51,34 @@ def filter_file(regex, repl, *filenames):
return a suitable replacement string. If it is a string, it
can contain ``\1``, ``\2``, etc. to represent back-substitution
as sed would allow.
Keyword Options:
string[=False] If True, treat regex as a plain string.
backup[=True] Make backup files suffixed with ~
ignore_absent[=False] Ignore any files that don't exist.
"""
# Keep callables intact
if not hasattr(repl, '__call__'):
# Allow strings to use \1, \2, etc. for replacement, like sed
string = kwargs.get('string', False)
backup = kwargs.get('backup', True)
ignore_absent = kwargs.get('ignore_absent', False)
# Allow strings to use \1, \2, etc. for replacement, like sed
if not callable(repl):
unescaped = repl.replace(r'\\', '\\')
repl = lambda m: re.sub(
r'\\([0-9])', lambda x: m.group(int(x.group(1))), unescaped)
def replace_groups_with_groupid(m):
def groupid_to_group(x):
return m.group(int(x.group(1)))
return re.sub(r'\\([1-9])', groupid_to_group, unescaped)
repl = replace_groups_with_groupid
if string:
regex = re.escape(regex)
for filename in filenames:
backup_filename = filename + "~"
if ignore_absent and not os.path.exists(filename):
continue
shutil.copy(filename, backup_filename)
try:
with closing(open(backup_filename)) as infile:
@@ -71,6 +91,10 @@ def filter_file(regex, repl, *filenames):
shutil.move(backup_filename, filename)
raise
finally:
if not backup and os.path.exists(backup_filename):
os.remove(backup_filename)
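A hedged usage sketch of the extended keyword API (file names and patterns are illustrative, not taken from this diff):

from llnl.util.filesystem import filter_file

# Regex replacement with sed-style back-references; Makefile~ backup kept.
filter_file(r'^(CC)\s*=.*', r'\1 = cc', 'Makefile')

# Literal-string match, no backup kept, and missing files skipped quietly.
filter_file('@PREFIX@', '/usr/local', 'config.h', 'extra/config.h',
            string=True, backup=False, ignore_absent=True)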
def change_sed_delimiter(old_delim, new_delim, *filenames):
"""Find all sed search/replace commands and change the delimiter.
@@ -108,10 +132,31 @@ def change_sed_delimiter(old_delim, new_delim, *filenames):
filter_file(double_quoted, '"%s"' % repl, f)
def set_install_permissions(path):
"""Set appropriate permissions on the installed file."""
if os.path.isdir(path):
os.chmod(path, 0755)
else:
os.chmod(path, 0644)
def install(src, dest):
"""Manually install a file to a particular location."""
tty.info("Installing %s to %s" % (src, dest))
shutil.copy(src, dest)
set_install_permissions(dest)
src_mode = os.stat(src).st_mode
dest_mode = os.stat(dest).st_mode
if src_mode & stat.S_IXUSR: dest_mode |= stat.S_IXUSR
if src_mode & stat.S_IXGRP: dest_mode |= stat.S_IXGRP
if src_mode & stat.S_IXOTH: dest_mode |= stat.S_IXOTH
os.chmod(dest, dest_mode)
def is_exe(path):
"""True if path is an executable file."""
return os.path.isfile(path) and os.access(path, os.X_OK)
def expand_user(path):
@@ -124,8 +169,29 @@ def expand_user(path):
return path.replace('%u', username)
def mkdirp(*paths):
"""Creates a directory, as well as parent directories if needed."""
for path in paths:
if not os.path.exists(path):
os.makedirs(path)
elif not os.path.isdir(path):
raise OSError(errno.EEXIST, "File already exists", path)
def force_remove(*paths):
"""Remove files without printing errors. Like rm -f, does NOT
remove directories."""
for path in paths:
try:
os.remove(path)
except OSError, e:
pass
@contextmanager
def working_dir(dirname):
def working_dir(dirname, **kwargs):
if kwargs.get('create', False):
mkdirp(dirname)
orig_dir = os.getcwd()
os.chdir(dirname)
yield
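A minimal sketch of the new create option (the directory name is illustrative):

from llnl.util.filesystem import touch, working_dir

with working_dir('build', create=True):
    touch('configure.log')   # runs with build/ as the current directory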
@@ -133,16 +199,15 @@ def working_dir(dirname):
def touch(path):
"""Creates an empty file at the specified path."""
with closing(open(path, 'a')) as file:
os.utime(path, None)
def mkdirp(*paths):
for path in paths:
if not os.path.exists(path):
os.makedirs(path)
elif not os.path.isdir(path):
raise OSError(errno.EEXIST, "File already exists", path)
def touchp(path):
"""Like touch, but creates any parent directories needed for the file."""
mkdirp(os.path.dirname(path))
touch(path)
def join_path(prefix, *args):

View File

@@ -68,6 +68,12 @@ def index_by(objects, *funcs):
index1 = index_by(list_of_specs, 'arch', 'compiler')
index2 = index_by(list_of_specs, 'compiler')
You can also index by tuples by passing tuples:
index1 = index_by(list_of_specs, ('arch', 'compiler'))
Keys in the resulting dict will look like ('gcc', 'bgqos_0').
"""
if not funcs:
return objects
@@ -75,6 +81,8 @@ def index_by(objects, *funcs):
f = funcs[0]
if isinstance(f, basestring):
f = lambda x: getattr(x, funcs[0])
elif isinstance(f, tuple):
f = lambda x: tuple(getattr(x, p) for p in funcs[0])
result = {}
for o in objects:
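A hedged illustration of the tuple-indexing feature added above; namedtuples stand in for real spec objects:

from collections import namedtuple
from llnl.util.lang import index_by

Spec = namedtuple('Spec', ['arch', 'compiler'])
specs = [Spec('bgqos_0', 'gcc'), Spec('bgqos_0', 'xlc')]
index = index_by(specs, ('arch', 'compiler'))
# keys look like ('bgqos_0', 'gcc'), each mapping to the matching specs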
@@ -119,9 +127,8 @@ def caller_locals():
def get_calling_package_name():
"""Make sure that the caller is a class definition, and return
the module's name. This is useful for getting the name of
spack packages from inside a relation function.
"""Make sure that the caller is a class definition, and return the
module's name.
"""
stack = inspect.stack()
try:
@@ -144,8 +151,9 @@ def get_calling_package_name():
def attr_required(obj, attr_name):
"""Ensure that a class has a required attribute."""
if not hasattr(obj, attr_name):
tty.die("No required attribute '%s' in class '%s'"
% (attr_name, obj.__class__.__name__))
raise RequiredAttributeError(
"No required attribute '%s' in class '%s'"
% (attr_name, obj.__class__.__name__))
def attr_setdefault(obj, name, value):
@@ -259,3 +267,61 @@ def in_function(function_name):
return False
finally:
del stack
def check_kwargs(kwargs, fun):
"""Helper for making functions with kwargs. Checks whether the kwargs
are empty after all of them have been popped off. If they're
not, raises an error describing which kwargs are invalid.
Example::
def foo(self, **kwargs):
x = kwargs.pop('x', None)
y = kwargs.pop('y', None)
z = kwargs.pop('z', None)
check_kwargs(kwargs, self.foo)
# This raises a TypeError:
foo(w='bad kwarg')
"""
if kwargs:
raise TypeError(
"'%s' is an invalid keyword argument for function %s()."
% (next(kwargs.iterkeys()), fun.__name__))
def match_predicate(*args):
"""Utility function for making string matching predicates.
Each arg can be a:
- regex
- list or tuple of regexes
- predicate that takes a string.
This returns a predicate that is true if:
- any arg regex matches
- any regex in a list or tuple of regexes matches.
- any predicate in args matches.
"""
def match(string):
for arg in args:
if isinstance(arg, basestring):
if re.search(arg, string):
return True
elif isinstance(arg, list) or isinstance(arg, tuple):
if any(re.search(i, string) for i in arg):
return True
elif callable(arg):
if arg(string):
return True
else:
raise ValueError("args to match_predicate must be regex, "
"list of regexes, or callable.")
return False
return match
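A short hedged sketch combining the three accepted argument forms (all patterns are illustrative):

from llnl.util.lang import match_predicate

predicate = match_predicate(
    r'\.py$',                          # a single regex
    [r'^Makefile', r'^CMakeLists'],    # a list of regexes
    lambda name: name == 'setup.py')   # an arbitrary callable

print predicate('package.py')   # True: the first regex matches
print predicate('README')       # False: nothing matches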
class RequiredAttributeError(ValueError):
def __init__(self, message):
super(RequiredAttributeError, self).__init__(message)

View File

@@ -0,0 +1,197 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""LinkTree class for setting up trees of symbolic links."""
__all__ = ['LinkTree']
import os
import shutil
from llnl.util.filesystem import *
empty_file_name = '.spack-empty'
def traverse_tree(source_root, dest_root, rel_path='', **kwargs):
"""Traverse two filesystem trees simultaneously.
Walks the LinkTree directory in pre or post order. Yields each
file in the source directory with a matching path from the dest
directory, along with whether the file is a directory.
e.g., for this tree::
root/
a/
file1
file2
b/
file3
When called on dest, this yields::
('root', 'dest')
('root/a', 'dest/a')
('root/a/file1', 'dest/a/file1')
('root/a/file2', 'dest/a/file2')
('root/b', 'dest/b')
('root/b/file3', 'dest/b/file3')
Optional args:
order=[pre|post] -- Whether to do pre- or post-order traversal.
ignore=<predicate> -- Predicate indicating which files to ignore.
follow_nonexisting -- Whether to descend into directories in
src that do not exist in dest. Default True.
follow_links -- Whether to descend into symlinks in src.
"""
follow_nonexisting = kwargs.get('follow_nonexisting', True)
follow_links = kwargs.get('follow_links', False)
# Yield in pre or post order?
order = kwargs.get('order', 'pre')
if order not in ('pre', 'post'):
raise ValueError("Order must be 'pre' or 'post'.")
# List of relative paths to ignore under the src root.
ignore = kwargs.get('ignore', lambda filename: False)
# Don't descend into ignored directories
if ignore(rel_path):
return
source_path = os.path.join(source_root, rel_path)
dest_path = os.path.join(dest_root, rel_path)
# preorder yields directories before children
if order == 'pre':
yield (source_path, dest_path)
for f in os.listdir(source_path):
source_child = os.path.join(source_path, f)
dest_child = os.path.join(dest_path, f)
rel_child = os.path.join(rel_path, f)
# Treat as a directory
if os.path.isdir(source_child) and (
follow_links or not os.path.islink(source_child)):
# When follow_nonexisting isn't set, don't descend into dirs
# in source that do not exist in dest
if follow_nonexisting or os.path.exists(dest_child):
tuples = traverse_tree(source_root, dest_root, rel_child, **kwargs)
for t in tuples: yield t
# Treat as a file.
elif not ignore(os.path.join(rel_path, f)):
yield (source_child, dest_child)
if order == 'post':
yield (source_path, dest_path)
class LinkTree(object):
"""Class to create trees of symbolic links from a source directory.
LinkTree objects are constructed with a source root. Their
methods allow you to create and delete trees of symbolic links
back to the source tree in specific destination directories.
Trees comprise symlinks only to files; directories are never
symlinked to, to prevent the source directory from ever being
modified.
"""
def __init__(self, source_root):
if not os.path.exists(source_root):
raise IOError("No such file or directory: '%s'" % source_root)
self._root = source_root
def find_conflict(self, dest_root, **kwargs):
"""Returns the first file in dest that conflicts with src"""
kwargs['follow_nonexisting'] = False
for src, dest in traverse_tree(self._root, dest_root, **kwargs):
if os.path.isdir(src):
if os.path.exists(dest) and not os.path.isdir(dest):
return dest
elif os.path.exists(dest):
return dest
return None
def merge(self, dest_root, **kwargs):
"""Link all files in src into dest, creating directories if necessary."""
kwargs['order'] = 'pre'
for src, dest in traverse_tree(self._root, dest_root, **kwargs):
if os.path.isdir(src):
if not os.path.exists(dest):
mkdirp(dest)
continue
if not os.path.isdir(dest):
raise ValueError("File blocks directory: %s" % dest)
# mark empty directories so they aren't removed on unmerge.
if not os.listdir(dest):
marker = os.path.join(dest, empty_file_name)
touch(marker)
else:
assert(not os.path.exists(dest))
os.symlink(src, dest)
def unmerge(self, dest_root, **kwargs):
"""Unlink all files in dest that exist in src.
Unlinks directories in dest if they are empty.
"""
kwargs['order'] = 'post'
for src, dest in traverse_tree(self._root, dest_root, **kwargs):
if os.path.isdir(src):
# Skip non-existing links.
if not os.path.exists(dest):
continue
if not os.path.isdir(dest):
raise ValueError("File blocks directory: %s" % dest)
# remove directory if it is empty.
if not os.listdir(dest):
shutil.rmtree(dest, ignore_errors=True)
# remove empty dir marker if present.
marker = os.path.join(dest, empty_file_name)
if os.path.exists(marker):
os.remove(marker)
elif os.path.exists(dest):
if not os.path.islink(dest):
raise ValueError("%s is not a link tree!" % dest)
os.remove(dest)
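A hedged sketch of the intended workflow (the prefix paths are illustrative): probe for conflicts, merge, and later unmerge.

from llnl.util.link_tree import LinkTree

tree = LinkTree('/opt/spack/py-nose/prefix')
conflict = tree.find_conflict('/opt/spack/python/prefix')
if conflict:
    raise ValueError("Won't merge; file in the way: %s" % conflict)
tree.merge('/opt/spack/python/prefix')
# ... later, remove only the links this tree created:
tree.unmerge('/opt/spack/python/prefix')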

View File

@@ -25,6 +25,9 @@
import sys
import os
import textwrap
import fcntl
import termios
import struct
from StringIO import StringIO
from llnl.util.tty.color import *
@@ -114,21 +117,46 @@ def get_number(prompt, **kwargs):
return number
def get_yes_or_no(prompt, **kwargs):
default_value = kwargs.get('default', None)
if default_value is None:
prompt += ' [y/n] '
elif default_value is True:
prompt += ' [Y/n] '
elif default_value is False:
prompt += ' [y/N] '
else:
raise ValueError("default for get_yes_no() must be True, False, or None.")
result = None
while result is None:
ans = raw_input(prompt).lower()
if not ans:
result = default_value
if result is None:
print "Please enter yes or no."
else:
if ans == 'y' or ans == 'yes':
result = True
elif ans == 'n' or ans == 'no':
result = False
return result
def hline(label=None, **kwargs):
"""Draw an optionally colored or labeled horizontal line.
"""Draw a labeled horizontal line.
Options:
char Char to draw the line with. Default '-'
color Color of the label. Default is no color.
max_width Maximum width of the line. Default is 64 chars.
See tty.color for possible color formats.
"""
char = kwargs.get('char', '-')
color = kwargs.get('color', '')
max_width = kwargs.get('max_width', 64)
char = kwargs.pop('char', '-')
max_width = kwargs.pop('max_width', 64)
if kwargs:
raise TypeError("'%s' is an invalid keyword argument for this function."
% next(kwargs.iterkeys()))
cols, rows = terminal_size()
rows, cols = terminal_size()
if not cols:
cols = max_width
else:
@@ -136,37 +164,34 @@ def hline(label=None, **kwargs):
cols = min(max_width, cols)
label = str(label)
prefix = char * 2 + " " + label + " "
suffix = (cols - len(prefix)) * char
prefix = char * 2 + " "
suffix = " " + (cols - len(prefix) - clen(label)) * char
out = StringIO()
if color:
prefix = char * 2 + " " + color + cescape(label) + "@. "
cwrite(prefix, stream=out, color=True)
else:
out.write(prefix)
out.write(prefix)
out.write(label)
out.write(suffix)
print out.getvalue()
def terminal_size():
"""Gets the dimensions of the console: cols, rows."""
"""Gets the dimensions of the console: (rows, cols)."""
def ioctl_GWINSZ(fd):
try:
cr = struct.unpack('hh', fcntl.ioctl(fd, termios.TIOCGWINSZ, '1234'))
rc = struct.unpack('hh', fcntl.ioctl(fd, termios.TIOCGWINSZ, '1234'))
except:
return
return cr
cr = ioctl_GWINSZ(0) or ioctl_GWINSZ(1) or ioctl_GWINSZ(2)
if not cr:
return rc
rc = ioctl_GWINSZ(0) or ioctl_GWINSZ(1) or ioctl_GWINSZ(2)
if not rc:
try:
fd = os.open(os.ctermid(), os.O_RDONLY)
cr = ioctl_GWINSZ(fd)
rc = ioctl_GWINSZ(fd)
os.close(fd)
except:
pass
if not cr:
cr = (os.environ.get('LINES', 25), os.environ.get('COLUMNS', 80))
if not rc:
rc = (os.environ.get('LINES', 25), os.environ.get('COLUMNS', 80))
return int(cr[1]), int(cr[0])
return int(rc[0]), int(rc[1])
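A small sketch of the corrected (rows, cols) return convention together with the slimmed-down hline() options:

from llnl.util.tty import hline, terminal_size

rows, cols = terminal_size()
hline("compilers", char='=', max_width=cols)   # prints '== compilers ===...'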

View File

@@ -22,55 +22,71 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
# colify
# By Todd Gamblin, tgamblin@llnl.gov
#
# Takes a list of items as input and finds a good columnization of them,
# similar to how gnu ls does. You can pipe output to this script and
# get a tight display for it. This supports both uniform-width and
# variable-width (tighter) columns.
#
# Run colify -h for more information.
#
"""
Routines for printing columnar output. See colify() for more information.
"""
import os
import sys
import fcntl
import termios
import struct
from StringIO import StringIO
from llnl.util.tty import terminal_size
from llnl.util.tty.color import clen
class ColumnConfig:
def __init__(self, cols):
self.cols = cols
self.line_length = 0
self.valid = True
self.widths = [0] * cols
self.widths = [0] * cols # does not include ansi colors
self.cwidths = [0] * cols # includes ansi colors
def __repr__(self):
attrs = [(a,getattr(self, a)) for a in dir(self) if not a.startswith("__")]
return "<Config: %s>" % ", ".join("%s: %r" % a for a in attrs)
def config_variable_cols(elts, console_cols, padding):
def config_variable_cols(elts, console_width, padding, cols=0):
"""Variable-width column fitting algorithm.
This function determines the most columns that can fit in the
screen width. Unlike uniform fitting, where all columns take
the width of the longest element in the list, each column takes
the width of its own longest element. This packs elements more
efficiently on screen.
If cols is nonzero, force the fit to use exactly that many columns.
"""
if cols < 0:
raise ValueError("cols must be non-negative.")
# Get a bound on the most columns we could possibly have.
lengths = [len(elt) for elt in elts]
max_cols = max(1, console_cols / (min(lengths) + padding))
# 'clen' ignores length of ansi color sequences.
lengths = [clen(e) for e in elts]
clengths = [len(e) for e in elts]
max_cols = max(1, console_width / (min(lengths) + padding))
max_cols = min(len(elts), max_cols)
configs = [ColumnConfig(c) for c in xrange(1, max_cols+1)]
for elt, length in enumerate(lengths):
for i, conf in enumerate(configs):
if conf.valid:
col = elt / ((len(elts) + i) / (i + 1))
padded = length
if col < i:
padded += padding
# Range of column counts to try. If forced, use the supplied value.
col_range = [cols] if cols else xrange(1, max_cols+1)
if conf.widths[col] < padded:
conf.line_length += padded - conf.widths[col]
conf.widths[col] = padded
conf.valid = (conf.line_length < console_cols)
# Determine the most columns possible for the console width.
configs = [ColumnConfig(c) for c in col_range]
for i, length in enumerate(lengths):
for conf in configs:
if conf.valid:
col = i / ((len(elts) + conf.cols - 1) / conf.cols)
p = padding if col < (conf.cols - 1) else 0
if conf.widths[col] < (length + p):
conf.line_length += length + p - conf.widths[col]
conf.widths[col] = length + p
conf.cwidths[col] = clengths[i] + p
conf.valid = (conf.line_length < console_width)
try:
config = next(conf for conf in reversed(configs) if conf.valid)
@@ -83,53 +99,107 @@ def config_variable_cols(elts, console_cols, padding):
return config
def config_uniform_cols(elts, console_cols, padding):
max_len = max(len(elt) for elt in elts) + padding
cols = max(1, console_cols / max_len)
cols = min(len(elts), cols)
def config_uniform_cols(elts, console_width, padding, cols=0):
"""Uniform-width column fitting algorithm.
Determines the longest element in the list, and determines how
many columns of that width will fit on screen. Returns a
corresponding column config.
"""
if cols < 0:
raise ValueError("cols must be non-negative.")
# 'clen' ignores length of ansi color sequences.
max_len = max(clen(e) for e in elts) + padding
max_clen = max(len(e) for e in elts) + padding
if cols == 0:
cols = max(1, console_width / max_len)
cols = min(len(elts), cols)
config = ColumnConfig(cols)
config.widths = [max_len] * cols
config.cwidths = [max_clen] * cols
return config
def isatty(ostream):
force = os.environ.get('COLIFY_TTY', 'false').lower() != 'false'
return force or ostream.isatty()
def colify(elts, **options):
"""Takes a list of elements as input and finds a good columnization
of them, similar to how gnu ls does. This supports both
uniform-width and variable-width (tighter) columns.
If elts is not a list of strings, each element is first converted
using str().
Keyword arguments:
output=<stream> A file object to write to. Default is sys.stdout.
indent=<int> Optionally indent all columns by some number of spaces.
padding=<int> Spaces between columns. Default is 2.
width=<int> Width of the output. Default is 80 if tty is not detected.
cols=<int> Force number of columns. Default is to size to terminal,
or single-column if no tty
tty=<bool> Whether to attempt to write to a tty. Default is to
autodetect a tty. Set to False to force single-column output.
method=<string> Method to use to fit columns. Options are variable or uniform.
Variable-width columns are tighter, uniform columns are all the
same width and fit less data on the screen.
"""
# Get keyword arguments or set defaults
output = options.get("output", sys.stdout)
indent = options.get("indent", 0)
padding = options.get("padding", 2)
cols = options.pop("cols", 0)
output = options.pop("output", sys.stdout)
indent = options.pop("indent", 0)
padding = options.pop("padding", 2)
tty = options.pop('tty', None)
method = options.pop("method", "variable")
console_cols = options.pop("width", None)
if options:
raise TypeError("'%s' is an invalid keyword argument for this function."
% next(options.iterkeys()))
# elts needs to be an array of strings so we can count the elements
elts = [str(elt) for elt in elts]
if not elts:
return
return (0, ())
if not isatty(output):
for elt in elts:
output.write("%s\n" % elt)
return
# environment size is of the form "<rows>x<cols>"
env_size = os.environ.get('COLIFY_SIZE')
if env_size:
try:
r, c = env_size.split('x')
console_rows, console_cols = int(r), int(c)
tty = True
except: pass
console_cols = options.get("cols", None)
# Use only one column if not a tty.
if not tty:
if tty is False or not output.isatty():
cols = 1
# Specify the number of character columns to use.
if not console_cols:
console_cols, console_rows = terminal_size()
console_rows, console_cols = terminal_size()
elif type(console_cols) != int:
raise ValueError("Number of columns must be an int")
console_cols = max(1, console_cols - indent)
method = options.get("method", "variable")
# Choose a method. Variable-width colums vs uniform-width.
if method == "variable":
config = config_variable_cols(elts, console_cols, padding)
config = config_variable_cols(elts, console_cols, padding, cols)
elif method == "uniform":
config = config_uniform_cols(elts, console_cols, padding)
config = config_uniform_cols(elts, console_cols, padding, cols)
else:
raise ValueError("method must be one of: " + allowed_methods)
cols = config.cols
formats = ["%%-%ds" % width for width in config.widths[:-1]]
formats = ["%%-%ds" % width for width in config.cwidths[:-1]]
formats.append("%s") # last column has no trailing space
rows = (len(elts) + cols - 1) / cols
@@ -146,28 +216,32 @@ def colify(elts, **options):
if row == rows_last_col:
cols -= 1
return (config.cols, tuple(config.widths))
if __name__ == "__main__":
import optparse
cols, rows = terminal_size()
parser = optparse.OptionParser()
parser.add_option("-u", "--uniform", action="store_true", default=False,
help="Use uniformly sized columns instead of variable-size.")
parser.add_option("-p", "--padding", metavar="PADDING", action="store",
type=int, default=2, help="Spaces to add between columns. Default is 2.")
parser.add_option("-i", "--indent", metavar="SPACES", action="store",
type=int, default=0, help="Indent the output by SPACES. Default is 0.")
parser.add_option("-w", "--width", metavar="COLS", action="store",
type=int, default=cols, help="Width of the output in columns. Default is the terminal width.")
options, args = parser.parse_args()
def colify_table(table, **options):
if table is None:
raise TypeError("Can't call colify_table on NoneType")
elif not table or not table[0]:
raise ValueError("Table is empty in colify_table!")
method = "variable"
if options.uniform:
method = "uniform"
columns = len(table[0])
def transpose():
for i in xrange(columns):
for row in table:
yield row[i]
if sys.stdin.isatty():
parser.print_help()
sys.exit(1)
else:
colify([line.strip() for line in sys.stdin], method=method, **options.__dict__)
if 'cols' in options:
raise ValueError("Cannot override columsn in colify_table.")
options['cols'] = columns
colify(transpose(), **options)
def colified(elts, **options):
"""Invokes the colify() function but returns the result as a string
instead of writing it to an output stream."""
sio = StringIO()
options['output'] = sio
colify(elts, **options)
return sio.getvalue()
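Hedged examples of the reworked API (the element lists are illustrative):

from llnl.util.tty.colify import colified, colify, colify_table

colify(['boost', 'mpich', 'hdf5', 'zlib'], indent=4)  # sized to the terminal
colify(['a', 'b', 'c', 'd'], cols=2, tty=True)        # force two columns

# With a terminal attached, rows of the input table stay rows on screen.
colify_table([['gcc', '4.8.2'], ['clang', '3.4']])

text = colified(['x', 'y', 'z'], tty=True, width=40)  # capture as a string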

View File

@@ -149,6 +149,11 @@ def colorize(string, **kwargs):
return re.sub(color_re, match_to_ansi(color), string)
def clen(string):
"""Return the length of a string, excluding ansi color sequences."""
return len(re.sub(r'\033[^m]*m', '', string))
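A quick hedged check that clen() counts printable characters only (the escape codes are written out by hand):

plain = 'gcc@4.8.2'
fancy = '\033[1;32m' + plain + '\033[0m'   # manually colorized
assert len(fancy) > len(plain)
assert clen(fancy) == len(plain)           # ANSI sequences are excluded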
def cwrite(string, stream=sys.stdout, color=None):
"""Replace all color expressions in string with ANSI control
codes and write the result to the stream. If color is
@@ -172,17 +177,20 @@ def cescape(string):
class ColorStream(object):
def __init__(self, stream, color=None):
self.__class__ = type(stream.__class__.__name__,
(self.__class__, stream.__class__), {})
self.__dict__ = stream.__dict__
self.color = color
self.stream = stream
self._stream = stream
self._color = color
def write(self, string, **kwargs):
if kwargs.get('raw', False):
super(ColorStream, self).write(string)
else:
cwrite(string, self.stream, self.color)
raw = kwargs.get('raw', False)
raw_write = getattr(self._stream, 'write')
color = self._color
if self._color is None:
if raw:
color = True
else:
color = self._stream.isatty()
raw_write(colorize(string, color=color))
def writelines(self, sequence, **kwargs):
raw = kwargs.get('raw', False)

View File

@@ -22,25 +22,11 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
#
# When packages call 'from spack import *', this is what is brought in.
#
# Spack internal code calls 'import spack' and accesses other
# variables (spack.db, paths, etc.) directly.
#
# TODO: maybe this should be separated out and should go in build_environment.py?
# TODO: it's not clear where all the stuff that needs to be included in packages
# should live. This file is overloaded for spack core vs. for packages.
__all__ = ['Package', 'when', 'provides', 'depends_on',
'patch', 'Version', 'working_dir', 'which', 'Executable',
'filter_file', 'change_sed_delimiter']
import os
import tempfile
from llnl.util.filesystem import *
# This lives in $prefix/lib/spac/spack/__file__
# This lives in $prefix/lib/spack/spack/__file__
prefix = ancestor(__file__, 4)
# The spack script itself
@@ -58,7 +44,6 @@
stage_path = join_path(var_path, "stage")
install_path = join_path(prefix, "opt")
share_path = join_path(prefix, "share", "spack")
dotkit_path = join_path(share_path, "dotkit")
#
# Set up the packages database.
@@ -81,7 +66,7 @@
# stage directories.
#
from spack.directory_layout import SpecHashDirectoryLayout
install_layout = SpecHashDirectoryLayout(install_path, prefix_size=6)
install_layout = SpecHashDirectoryLayout(install_path)
#
# This controls how things are concretized in spack.
@@ -93,7 +78,7 @@
# Version information
from spack.version import Version
spack_version = Version("0.8")
spack_version = Version("0.8.15")
#
# Executables used by Spack
@@ -141,11 +126,30 @@
#
sys_type = None
#
# Extra imports that should be generally usable from package.py files.
# When packages call 'from spack import *', this extra stuff is brought in.
#
from llnl.util.filesystem import working_dir
from spack.package import Package
from spack.relations import depends_on, provides, patch
# Spack internal code should call 'import spack' and accesses other
# variables (spack.db, paths, etc.) directly.
#
# TODO: maybe this should be separated out and should go in build_environment.py?
# TODO: it's not clear where all the stuff that needs to be included in packages
# should live. This file is overloaded for spack core vs. for packages.
#
__all__ = ['Package', 'Version', 'when', 'ver']
from spack.package import Package, ExtensionConflictError
from spack.version import Version, ver
from spack.multimethod import when
from spack.version import Version
import llnl.util.filesystem
from llnl.util.filesystem import *
__all__ += llnl.util.filesystem.__all__
import spack.relations
from spack.relations import *
__all__ += spack.relations.__all__
import spack.util.executable
from spack.util.executable import *
__all__ += spack.util.executable.__all__

View File

@@ -65,7 +65,7 @@ def get_mac_sys_type():
if not mac_ver:
return None
return "macosx_{}_{}".format(
return "macosx_%s_%s" % (
Version(mac_ver).up_to(2), py_platform.machine())

View File

@@ -28,6 +28,7 @@
calls you can make from within the install() function.
"""
import os
import sys
import shutil
import multiprocessing
import platform
@@ -48,12 +49,11 @@
# set_build_environment_variables and used to pass parameters to
# Spack's compiler wrappers.
#
SPACK_LIB = 'SPACK_LIB'
SPACK_ENV_PATH = 'SPACK_ENV_PATH'
SPACK_DEPENDENCIES = 'SPACK_DEPENDENCIES'
SPACK_PREFIX = 'SPACK_PREFIX'
SPACK_DEBUG = 'SPACK_DEBUG'
SPACK_SPEC = 'SPACK_SPEC'
SPACK_SHORT_SPEC = 'SPACK_SHORT_SPEC'
SPACK_DEBUG_LOG_DIR = 'SPACK_DEBUG_LOG_DIR'
@@ -84,7 +84,7 @@ def __call__(self, *args, **kwargs):
def set_compiler_environment_variables(pkg):
assert(pkg.spec.concrete)
compiler = compilers.compiler_for_spec(pkg.spec.compiler)
compiler = pkg.compiler
# Set compiler variables used by CMake and autotools
os.environ['CC'] = 'cc'
@@ -108,9 +108,6 @@ def set_compiler_environment_variables(pkg):
def set_build_environment_variables(pkg):
"""This ensures a clean install environment when we build packages.
"""
# This tells the compiler script where to find the Spack installation.
os.environ[SPACK_LIB] = spack.lib_path
# Add spack build environment path with compiler wrappers first in
# the path. We handle case sensitivity conflicts like "CC" and
# "cc" by putting one in the <build_env_path>/case-insensitive
@@ -122,7 +119,7 @@ def set_build_environment_variables(pkg):
# Prefixes of all of the package's dependencies go in
# SPACK_DEPENDENCIES
dep_prefixes = [d.package.prefix for d in pkg.spec.dependencies.values()]
dep_prefixes = [d.prefix for d in pkg.spec.traverse(root=False)]
path_set(SPACK_DEPENDENCIES, dep_prefixes)
# Install prefix
@@ -140,9 +137,21 @@ def set_build_environment_variables(pkg):
# Working directory for the spack command itself, for debug logs.
if spack.debug:
os.environ[SPACK_DEBUG] = "TRUE"
os.environ[SPACK_SPEC] = str(pkg.spec)
os.environ[SPACK_SHORT_SPEC] = pkg.spec.short_spec
os.environ[SPACK_DEBUG_LOG_DIR] = spack.spack_working_dir
# Add dependencies to CMAKE_PREFIX_PATH
path_set("CMAKE_PREFIX_PATH", dep_prefixes)
# Add any pkgconfig directories to PKG_CONFIG_PATH
pkg_config_dirs = []
for p in dep_prefixes:
for libdir in ('lib', 'lib64'):
pcdir = join_path(p, libdir, 'pkgconfig')
if os.path.isdir(pcdir):
pkg_config_dirs.append(pcdir)
path_set("PKG_CONFIG_PATH", pkg_config_dirs)
def set_module_variables_for_package(pkg):
"""Populate the module scope of install() with some useful functions.
@@ -153,6 +162,9 @@ def set_module_variables_for_package(pkg):
m.make = MakeExecutable('make', pkg.parallel)
m.gmake = MakeExecutable('gmake', pkg.parallel)
# easy shortcut to os.environ
m.env = os.environ
# number of jobs spack prefers to build with.
m.make_jobs = multiprocessing.cpu_count()
@@ -168,10 +180,14 @@ def set_module_variables_for_package(pkg):
# standard CMake arguments
m.std_cmake_args = ['-DCMAKE_INSTALL_PREFIX=%s' % pkg.prefix,
'-DCMAKE_BUILD_TYPE=None']
'-DCMAKE_BUILD_TYPE=RelWithDebInfo']
if platform.mac_ver()[0]:
m.std_cmake_args.append('-DCMAKE_FIND_FRAMEWORK=LAST')
# Set up CMake rpath
m.std_cmake_args.append('-DCMAKE_INSTALL_RPATH_USE_LINK_PATH=FALSE')
m.std_cmake_args.append('-DCMAKE_INSTALL_RPATH=%s' % ":".join(get_rpaths(pkg)))
# Emulate some shell commands for convenience
m.pwd = os.getcwd
m.cd = os.chdir
@@ -179,6 +195,7 @@ def set_module_variables_for_package(pkg):
m.makedirs = os.makedirs
m.remove = os.remove
m.removedirs = os.removedirs
m.symlink = os.symlink
m.mkdirp = mkdirp
m.install = install
@@ -188,3 +205,80 @@ def set_module_variables_for_package(pkg):
# Useful directories within the prefix are encapsulated in
# a Prefix object.
m.prefix = pkg.prefix
def get_rpaths(pkg):
"""Get a list of all the rpaths for a package."""
rpaths = [pkg.prefix.lib, pkg.prefix.lib64]
rpaths.extend(d.prefix.lib for d in pkg.spec.traverse(root=False)
if os.path.isdir(d.prefix.lib))
rpaths.extend(d.prefix.lib64 for d in pkg.spec.traverse(root=False)
if os.path.isdir(d.prefix.lib64))
return rpaths
def setup_package(pkg):
"""Execute all environment setup routines."""
set_compiler_environment_variables(pkg)
set_build_environment_variables(pkg)
set_module_variables_for_package(pkg)
# Allow dependencies to set up environment as well.
for dep_spec in pkg.spec.traverse(root=False):
dep_spec.package.setup_dependent_environment(
pkg.module, dep_spec, pkg.spec)
def fork(pkg, function):
"""Fork a child process to do part of a spack build.
Arguments:
pkg -- pkg whose environment we should set up the
forked process for.
function -- arg-less function to run in the child process.
Usage:
def child_fun():
# do stuff
build_env.fork(pkg, child_fun)
Forked processes are run with the build environment set up by
spack.build_environment. This allows package authors to have
full control over the environment, etc. without affecting
other builds that might be executed in the same spack call.
If something goes wrong, the child process is expected to print
the error and the parent process will exit with error as
well. If things go well, the child exits and the parent
carries on.
"""
try:
pid = os.fork()
except OSError, e:
raise InstallError("Unable to fork build process: %s" % e)
if pid == 0:
# Give the child process the package's build environment.
setup_package(pkg)
try:
# call the forked function.
function()
# Use os._exit here to avoid raising a SystemExit exception,
# which interferes with unit tests.
os._exit(0)
except:
# Child doesn't raise or return to main spack code.
# Just runs default exception handler and exits.
sys.excepthook(*sys.exc_info())
os._exit(1)
else:
# Parent process just waits for the child to complete. If the
# child exited badly, assume it already printed an appropriate
# message. Just make the parent exit with an error code.
pid, returncode = os.waitpid(pid, 0)
if returncode != 0:
sys.exit(1)

View File

@@ -121,3 +121,18 @@ def elide_list(line_list, max_num=10):
return line_list[:max_num-1] + ['...'] + line_list[-1:]
else:
return line_list
def disambiguate_spec(spec):
matching_specs = spack.db.get_installed(spec)
if not matching_specs:
tty.die("Spec '%s' matches no installed packages." % spec)
elif len(matching_specs) > 1:
args = ["%s matches multiple packages." % spec,
"Matching packages:"]
args += [" " + str(s) for s in matching_specs]
args += ["Use a more specific spec."]
tty.die(*args)
return matching_specs[0]
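A hedged sketch of how a command uses this helper (it mirrors the activate and deactivate commands elsewhere in this diff; args is the parsed argparse namespace):

specs = spack.cmd.parse_specs(args.spec, concretize=True)
spec = spack.cmd.disambiguate_spec(specs[0])
pkg = spec.package   # exactly one installed match, or tty.die() already ran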

View File

@@ -22,49 +22,37 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""
Functions for comparing values that may potentially be None.
These none_high functions consider None as greater than all other values.
"""
from external import argparse
import llnl.util.tty as tty
import spack
import spack.cmd
# Preserve builtin min and max functions
_builtin_min = min
_builtin_max = max
description = "Activate a package extension."
def setup_parser(subparser):
subparser.add_argument(
'-f', '--force', action='store_true',
help="Activate without first activating dependencies.")
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help="spec of package extension to activate.")
def lt(lhs, rhs):
"""Less-than comparison. None is greater than any value."""
return lhs != rhs and (rhs is None or (lhs is not None and lhs < rhs))
def activate(parser, args):
# TODO: shouldn't have to concretize here. Fix DAG issues.
specs = spack.cmd.parse_specs(args.spec, concretize=True)
if len(specs) != 1:
tty.die("activate requires one spec. %d given." % len(specs))
# TODO: remove this hack when DAG info is stored in dir layout.
# This ensures the ext spec is always normalized properly.
spack.db.get(specs[0])
def le(lhs, rhs):
"""Less-than-or-equal comparison. None is greater than any value."""
return lhs == rhs or lt(lhs, rhs)
spec = spack.cmd.disambiguate_spec(specs[0])
if not spec.package.is_extension:
tty.die("%s is not an extension." % spec.name)
def gt(lhs, rhs):
"""Greater-than comparison. None is greater than any value."""
return lhs != rhs and not lt(lhs, rhs)
if spec.package.activated:
tty.die("Package %s is already activated." % specs[0].short_spec)
def ge(lhs, rhs):
"""Greater-than-or-equal comparison. None is greater than any value."""
return lhs == rhs or gt(lhs, rhs)
def min(lhs, rhs):
"""Minimum function where None is greater than any value."""
if lhs is None:
return rhs
elif rhs is None:
return lhs
else:
return _builtin_min(lhs, rhs)
def max(lhs, rhs):
"""Maximum function where None is greater than any value."""
if lhs is None or rhs is None:
return None
else:
return _builtin_max(lhs, rhs)
spec.package.do_activate()

View File

@@ -23,12 +23,13 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
from subprocess import check_call, check_output
from subprocess import check_call
import llnl.util.tty as tty
from llnl.util.filesystem import join_path
from llnl.util.filesystem import join_path, mkdirp
import spack
from spack.util.executable import which
description = "Create a new installation of spack in another prefix"
@@ -38,8 +39,10 @@ def setup_parser(subparser):
def get_origin_url():
git_dir = join_path(spack.prefix, '.git')
origin_url = check_output(
['git', '--git-dir=%s' % git_dir, 'config', '--get', 'remote.origin.url'])
git = which('git', required=True)
origin_url = git(
'--git-dir=%s' % git_dir, 'config', '--get', 'remote.origin.url',
return_output=True)
return origin_url.strip()
@@ -49,6 +52,11 @@ def bootstrap(parser, args):
tty.msg("Fetching spack from origin: %s" % origin_url)
if os.path.isfile(prefix):
tty.die("There is already a file at %s" % prefix)
mkdirp(prefix)
if os.path.exists(join_path(prefix, '.git')):
tty.die("There already seems to be a git repository in %s" % prefix)
@@ -62,10 +70,11 @@ def bootstrap(parser, args):
"%s/lib/spack/..." % prefix)
os.chdir(prefix)
check_call(['git', 'init', '--shared', '-q'])
check_call(['git', 'remote', 'add', 'origin', origin_url])
check_call(['git', 'fetch', 'origin', 'master:refs/remotes/origin/master', '-n', '-q'])
check_call(['git', 'reset', '--hard', 'origin/master', '-q'])
git = which('git', required=True)
git('init', '--shared', '-q')
git('remote', 'add', 'origin', origin_url)
git('fetch', 'origin', 'master:refs/remotes/origin/master', '-n', '-q')
git('reset', '--hard', 'origin/master', '-q')
tty.msg("Successfully created a new spack in %s" % prefix,
"Run %s/bin/spack to use this installation." % prefix)

lib/spack/spack/cmd/cd.py Normal file
View File

@@ -0,0 +1,38 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import spack.cmd.location
import spack.modules
description="cd to spack directories in the shell."
def setup_parser(subparser):
"""This is for decoration -- spack cd is used through spack's
shell support. This allows spack cd to print a descriptive
help message when called with -h."""
spack.cmd.location.setup_parser(subparser)
def cd(parser, args):
spack.modules.print_help()

View File

@@ -24,7 +24,7 @@
##############################################################################
import os
import re
import argparse
from external import argparse
import hashlib
from pprint import pprint
from subprocess import CalledProcessError
@@ -38,7 +38,7 @@
from spack.stage import Stage, FailedDownloadError
from spack.version import *
description ="Checksum available versions of a package to update a package file."
description ="Checksum available versions of a package."
def setup_parser(subparser):
subparser.add_argument(
@@ -56,7 +56,6 @@ def get_checksums(versions, urls, **kwargs):
first_stage_function = kwargs.get('first_stage_function', None)
keep_stage = kwargs.get('keep_stage', False)
tty.msg("Downloading...")
hashes = []
for i, (url, version) in enumerate(zip(urls, versions)):
@@ -85,24 +84,24 @@ def checksum(parser, args):
pkg = spack.db.get(args.package)
# If the user asked for specific versions, use those.
versions = [ver(v) for v in args.versions]
if not all(type(v) == Version for v in versions):
tty.die("Cannot generate checksums for version lists or " +
"version ranges. Use unambiguous versions.")
if not versions:
versions = pkg.fetch_available_versions()
if args.versions:
versions = {}
for v in args.versions:
v = ver(v)
if not isinstance(v, Version):
tty.die("Cannot generate checksums for version lists or " +
"version ranges. Use unambiguous versions.")
versions[v] = pkg.url_for_version(v)
else:
versions = pkg.fetch_remote_versions()
if not versions:
tty.die("Could not fetch any available versions for %s." % pkg.name)
tty.die("Could not fetch any versions for %s." % pkg.name)
versions = list(reversed(versions))
urls = [pkg.url_for_version(v) for v in versions]
sorted_versions = sorted(versions, reverse=True)
tty.msg("Found %s versions of %s." % (len(urls), pkg.name),
tty.msg("Found %s versions of %s." % (len(versions), pkg.name),
*spack.cmd.elide_list(
["%-10s%s" % (v,u) for v, u in zip(versions, urls)]))
["%-10s%s" % (v, versions[v]) for v in sorted_versions]))
print
archives_to_fetch = tty.get_number(
"How many would you like to checksum?", default=5, abort='q')
@@ -112,12 +111,12 @@ def checksum(parser, args):
return
version_hashes = get_checksums(
versions[:archives_to_fetch], urls[:archives_to_fetch], keep_stage=args.keep_stage)
sorted_versions[:archives_to_fetch],
[versions[v] for v in sorted_versions[:archives_to_fetch]],
keep_stage=args.keep_stage)
if not version_hashes:
tty.die("Could not fetch any available versions for %s." % pkg.name)
tty.die("Could not fetch any versions for %s." % pkg.name)
dict_string = [" '%s' : '%s'," % (v, h) for v, h in version_hashes]
dict_string = ['{'] + dict_string + ["}"]
tty.msg("Checksummed new versions of %s:" % pkg.name, *dict_string)
version_lines = [" version('%s', '%s')" % (v, h) for v, h in version_hashes]
tty.msg("Checksummed new versions of %s:" % pkg.name, *version_lines)

View File

@@ -1,5 +1,5 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Copyright (c) 2013-2014, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
@@ -22,37 +22,25 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
from external import argparse
import llnl.util.tty as tty
import spack
import spack.cmd
import spack.stage as stage
description = "Remove staged files for packages"
description = "Remove build stage and source tarball for packages."
def setup_parser(subparser):
subparser.add_argument('-c', "--clean", action="store_true", dest='clean',
help="run make clean in the build directory (default)")
subparser.add_argument('-w', "--work", action="store_true", dest='work',
help="delete the build directory and re-expand it from its archive.")
subparser.add_argument('-d', "--dist", action="store_true", dest='dist',
help="delete the downloaded archive.")
subparser.add_argument('packages', nargs=argparse.REMAINDER,
help="specs of packages to clean")
def clean(parser, args):
if not args.packages:
tty.die("spack clean requires at least one package argument")
tty.die("spack clean requires at least one package spec.")
specs = spack.cmd.parse_specs(args.packages, concretize=True)
for spec in specs:
package = spack.db.get(spec)
if args.dist:
package.do_clean_dist()
elif args.work:
package.do_clean_work()
else:
package.do_clean()
package.do_clean()

View File

@@ -22,16 +22,17 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
from external import argparse
import llnl.util.tty as tty
from llnl.util.tty.color import colorize
from llnl.util.tty.colify import colify
from llnl.util.lang import index_by
import spack.compilers
import spack.spec
import spack.config
from spack.compilation import get_path
from spack.util.environment import get_path
from spack.spec import CompilerSpec
description = "Manage compilers"
@@ -96,9 +97,12 @@ def compiler_info(args):
def compiler_list(args):
tty.msg("Available compilers")
index = index_by(spack.compilers.all_compilers(), 'name')
for name, compilers in index.items():
tty.hline(name, char='-', color=spack.spec.compiler_color)
colify(reversed(sorted(compilers)), indent=4)
for i, (name, compilers) in enumerate(index.items()):
if i >= 1: print
cname = "%s{%s}" % (spack.spec.compiler_color, name)
tty.hline(colorize(cname), char='-')
colify(reversed(sorted(compilers)))
def compiler(parser, args):

View File

@@ -23,7 +23,7 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys
import argparse
from external import argparse
import llnl.util.tty as tty

View File

@@ -28,6 +28,7 @@
import re
from contextlib import closing
from external.ordereddict import OrderedDict
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
@@ -36,7 +37,7 @@
import spack.cmd.checksum
import spack.package
import spack.url
from spack.util.naming import mod_to_class
from spack.util.naming import *
import spack.util.crypto as crypto
from spack.util.executable import which
@@ -70,7 +71,10 @@ class ${class_name}(Package):
homepage = "http://www.example.com"
url = "${url}"
versions = ${versions}
${versions}
# FIXME: Add dependencies if this package requires them.
# depends_on("foo")
def install(self, spec, prefix):
# FIXME: Modify the configure line to suit your build system here.
@@ -87,6 +91,9 @@ def setup_parser(subparser):
subparser.add_argument(
'--keep-stage', action='store_true', dest='keep_stage',
help="Don't clean up staging area when command completes.")
subparser.add_argument(
'-n', '--name', dest='alternate_name', default=None,
help="Override the autodetected name for the created package.")
subparser.add_argument(
'-f', '--force', action='store_true', dest='force',
help="Overwrite any existing package file with the same name.")
@@ -114,39 +121,35 @@ def __call__(self, stage):
self.configure = '%s\n # %s' % (autotools, cmake)
def make_version_dict(ver_hash_tuples):
max_len = max(len(str(v)) for v,hfg in ver_hash_tuples)
width = max_len + 2
format = "%-" + str(width) + "s : '%s',"
sep = '\n '
return '{ ' + sep.join(format % ("'%s'" % v, h)
for v, h in ver_hash_tuples) + ' }'
def get_name():
"""Prompt user to input a package name."""
name = ""
while not name:
new_name = raw_input("Name: ")
if spack.db.valid_name(new_name):
name = new_name
else:
print "Package name can only contain A-Z, a-z, 0-9, '_' and '-'"
return name
def make_version_calls(ver_hash_tuples):
"""Adds a version() call to the package for each version found."""
max_len = max(len(str(v)) for v, h in ver_hash_tuples)
format = " version(%%-%ds, '%%s')" % (max_len + 2)
return '\n'.join(format % ("'%s'" % v, h) for v, h in ver_hash_tuples)
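A small illustration of the generated output (version numbers and hashes are fabricated):

print make_version_calls([('8.1.2', 'abc123'), ('8.1.1', 'def456')])
#  version('8.1.2', 'abc123')
#  version('8.1.1', 'def456')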
def create(parser, args):
url = args.url
# Try to deduce name and version of the new package from the URL
name, version = spack.url.parse_name_and_version(url)
if not name:
tty.msg("Couldn't guess a name for this package.")
name = get_name()
version = spack.url.parse_version(url)
if not version:
tty.die("Couldn't guess a version string from %s." % url)
# Try to guess a name. If it doesn't work, allow the user to override.
if args.alternate_name:
name = args.alternate_name
else:
try:
name = spack.url.parse_name(url, version)
except spack.url.UndetectableNameError, e:
# We couldn't guess a name and the user didn't supply one; bail with a hint.
tty.die("Couldn't guess a name for this package. Try running:", "",
"spack create --name <name> <url>")
if not valid_module_name(name):
tty.die("Package name can only contain A-Z, a-z, 0-9, '_' and '-'")
tty.msg("This looks like a URL for %s version %s." % (name, version))
tty.msg("Creating template for package %s" % name)
@@ -157,32 +160,33 @@ def create(parser, args):
else:
mkdirp(os.path.dirname(pkg_path))
versions = list(reversed(spack.package.find_versions_of_archive(url)))
versions = spack.package.find_versions_of_archive(url)
rkeys = sorted(versions.keys(), reverse=True)
versions = OrderedDict(zip(rkeys, (versions[v] for v in rkeys)))
archives_to_fetch = 1
if not versions:
# If the fetch failed for some reason, revert to what the user provided
versions = [version]
urls = [url]
else:
urls = [spack.url.substitute_version(url, v) for v in versions]
if len(urls) > 1:
tty.msg("Found %s versions of %s:" % (len(urls), name),
*spack.cmd.elide_list(
["%-10s%s" % (v,u) for v, u in zip(versions, urls)]))
print
archives_to_fetch = tty.get_number(
"Include how many checksums in the package file?",
default=5, abort='q')
versions = { version : url }
elif len(versions) > 1:
tty.msg("Found %s versions of %s:" % (len(versions), name),
*spack.cmd.elide_list(
["%-10s%s" % (v,u) for v, u in versions.iteritems()]))
print
archives_to_fetch = tty.get_number(
"Include how many checksums in the package file?",
default=5, abort='q')
if not archives_to_fetch:
tty.msg("Aborted.")
return
if not archives_to_fetch:
tty.msg("Aborted.")
return
guesser = ConfigureGuesser()
ver_hash_tuples = spack.cmd.checksum.get_checksums(
versions[:archives_to_fetch], urls[:archives_to_fetch],
first_stage_function=guesser, keep_stage=args.keep_stage)
versions.keys()[:archives_to_fetch],
[versions[v] for v in versions.keys()[:archives_to_fetch]],
first_stage_function=guesser,
keep_stage=args.keep_stage)
if not ver_hash_tuples:
tty.die("Could not fetch any tarballs for %s." % name)
@@ -195,7 +199,7 @@ def create(parser, args):
configure=guesser.configure,
class_name=mod_to_class(name),
url=url,
versions=make_version_dict(ver_hash_tuples)))
versions=make_version_calls(ver_hash_tuples)))
# If everything checks out, go ahead and edit.
spack.editor(pkg_path)

View File

@@ -0,0 +1,104 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from external import argparse
import llnl.util.tty as tty
import spack
import spack.cmd
from spack.graph import topological_sort
description = "Deactivate a package extension."
def setup_parser(subparser):
subparser.add_argument(
'-f', '--force', action='store_true',
help="Run deactivation even if spec is NOT currently activated.")
subparser.add_argument(
'-a', '--all', action='store_true',
help="Deactivate all extensions of an extendable pacakge, or "
"deactivate an extension AND its dependencies.")
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help="spec of package extension to deactivate.")
def deactivate(parser, args):
# TODO: shouldn't have to concretize here. Fix DAG issues.
specs = spack.cmd.parse_specs(args.spec, concretize=True)
if len(specs) != 1:
tty.die("deactivate requires one spec. %d given." % len(specs))
# TODO: remove this hack when DAG info is stored properly.
# This ensures the ext spec is always normalized properly.
spack.db.get(specs[0])
spec = spack.cmd.disambiguate_spec(specs[0])
pkg = spec.package
if args.all:
if pkg.extendable:
tty.msg("Deactivating all extensions of %s" % pkg.spec.short_spec)
ext_pkgs = spack.db.installed_extensions_for(spec)
for ext_pkg in ext_pkgs:
ext_pkg.spec.normalize()
if ext_pkg.activated:
ext_pkg.do_deactivate(force=True)
elif pkg.is_extension:
# TODO: store DAG info properly (see above)
spec.normalize()
if not args.force and not spec.package.activated:
tty.die("%s is not activated." % pkg.spec.short_spec)
tty.msg("Deactivating %s and all dependencies." % pkg.spec.short_spec)
topo_order = topological_sort(spec)
index = spec.index()
for name in topo_order:
espec = index[name]
epkg = espec.package
# TODO: store DAG info properly (see above)
epkg.spec.normalize()
if epkg.extends(pkg.extendee_spec):
if epkg.activated or args.force:
epkg.do_deactivate(force=args.force)
else:
tty.die("spack deactivate --all requires an extendable package or an extension.")
else:
if not pkg.is_extension:
tty.die("spack deactivate requires an extension.",
"Did you mean 'spack deactivate --all'?")
if not args.force and not spec.package.activated:
tty.die("Package %s is not activated." % specs[0].short_spec)
spec.package.do_deactivate(force=args.force)
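
The --all branch walks the spec's DAG in the order returned by topological_sort so an extension is deactivated before anything it depends on. A small self-contained illustration of that ordering contract (the package names and edges are made up; the real sort lives in spack.graph):

# Hypothetical extension DAG: py-nose depends on py-setuptools.
deps = {'py-nose': ['py-setuptools'], 'py-setuptools': []}

def topo_order(names):
    # Tiny stand-in for spack.graph.topological_sort: reversed post-order
    # puts dependents before their dependencies.
    seen, order = set(), []
    def visit(n):
        if n in seen:
            return
        seen.add(n)
        for d in deps[n]:
            visit(d)
        order.append(n)
    for n in names:
        visit(n)
    return list(reversed(order))

print topo_order(['py-nose'])   # ['py-nose', 'py-setuptools']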


@@ -22,14 +22,14 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
from external import argparse
import llnl.util.tty as tty
import spack
import spack.cmd
description = "Show dependent packages."
description = "Show installed packages that depend on another."
def setup_parser(subparser):
subparser.add_argument(
@@ -42,5 +42,5 @@ def dependents(parser, args):
tty.die("spack dependents takes only one spec.")
fmt = '$_$@$%@$+$=$#'
deps = [d.format(fmt) for d in specs[0].package.installed_dependents]
tty.msg("Dependents of %s" % specs[0].format(fmt), *deps)
deps = [d.format(fmt, color=True) for d in specs[0].package.installed_dependents]
tty.msg("Dependents of %s" % specs[0].format(fmt, color=True), *deps)


@@ -1,99 +0,0 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys
import os
import shutil
import argparse
import llnl.util.tty as tty
from llnl.util.lang import partition_list
from llnl.util.filesystem import mkdirp
import spack.cmd
import spack.hooks.dotkit
from spack.spec import Spec
description ="Find dotkits for packages if they exist."
def setup_parser(subparser):
subparser.add_argument(
'--refresh', action='store_true', help='Regenerate all dotkits')
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help='spec to find a dotkit for.')
def dotkit_find(parser, args):
if not args.spec:
parser.parse_args(['dotkit', '-h'])
spec = spack.cmd.parse_specs(args.spec)
if len(spec) > 1:
tty.die("You can only pass one spec.")
spec = spec[0]
if not spack.db.exists(spec.name):
tty.die("No such package: %s" % spec.name)
specs = [s for s in spack.db.installed_package_specs() if s.satisfies(spec)]
if len(specs) == 0:
tty.die("No installed packages match spec %s" % spec)
if len(specs) > 1:
tty.error("Multiple matches for spec %s. Choose one:" % spec)
for s in specs:
sys.stderr.write(s.tree(color=True))
sys.exit(1)
match = specs[0]
if not os.path.isfile(spack.hooks.dotkit.dotkit_file(match.package)):
tty.die("No dotkit is installed for package %s." % spec)
print match.format('$_$@$+$%@$=$#')
def dotkit_refresh(parser, args):
query_specs = spack.cmd.parse_specs(args.spec)
specs = spack.db.installed_package_specs()
if query_specs:
specs = [s for s in specs
if any(s.satisfies(q) for q in query_specs)]
else:
shutil.rmtree(spack.dotkit_path, ignore_errors=False)
mkdirp(spack.dotkit_path)
for spec in specs:
spack.hooks.dotkit.post_install(spec.package)
def dotkit(parser, args):
if args.refresh:
dotkit_refresh(parser, args)
else:
dotkit_find(parser, args)


@@ -27,9 +27,10 @@
from contextlib import closing
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
from llnl.util.filesystem import mkdirp, join_path
import spack
import spack.cmd
from spack.util.naming import mod_to_class
description = "Open package files in $EDITOR"
@@ -44,7 +45,7 @@ class ${class_name}(Package):
homepage = "http://www.example.com"
url = "http://www.example.com/${name}-1.0.tar.gz"
versions = { '1.0' : '0123456789abcdef0123456789abcdef' }
version('1.0', '0123456789abcdef0123456789abcdef')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
@@ -57,6 +58,9 @@ def setup_parser(subparser):
subparser.add_argument(
'-f', '--force', dest='force', action='store_true',
help="Open a new file in $EDITOR even if package doesn't exist.")
subparser.add_argument(
'-c', '--command', dest='edit_command', action='store_true',
help="Edit the command with the supplied name instead of a package.")
subparser.add_argument(
'name', nargs='?', default=None, help="name of package to edit")
@@ -64,25 +68,34 @@ def setup_parser(subparser):
def edit(parser, args):
name = args.name
# By default open the directory where packages live.
if not name:
path = spack.packages_path
else:
path = spack.db.filename_for_package_name(name)
if os.path.exists(path):
if not os.path.isfile(path):
tty.die("Something's wrong. '%s' is not a file!" % path)
if not os.access(path, os.R_OK|os.W_OK):
tty.die("Insufficient permissions on '%s'!" % path)
elif not args.force:
tty.die("No package '%s'. Use spack create, or supply -f/--force "
"to edit a new file." % name)
if args.edit_command:
if not name:
path = spack.cmd.command_path
else:
mkdirp(os.path.dirname(path))
with closing(open(path, "w")) as pkg_file:
pkg_file.write(
package_template.substitute(name=name, class_name=mod_to_class(name)))
path = join_path(spack.cmd.command_path, name + ".py")
if not os.path.exists(path):
tty.die("No command named '%s'." % name)
else:
# By default open the directory where packages or commands live.
if not name:
path = spack.packages_path
else:
path = spack.db.filename_for_package_name(name)
if os.path.exists(path):
if not os.path.isfile(path):
tty.die("Something's wrong. '%s' is not a file!" % path)
if not os.access(path, os.R_OK|os.W_OK):
tty.die("Insufficient permissions on '%s'!" % path)
elif not args.force:
tty.die("No package '%s'. Use spack create, or supply -f/--force "
"to edit a new file." % name)
else:
mkdirp(os.path.dirname(path))
with closing(open(path, "w")) as pkg_file:
pkg_file.write(
package_template.substitute(name=name, class_name=mod_to_class(name)))
# If everything checks out, go ahead and edit.
spack.editor(path)


@@ -0,0 +1,69 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
from external import argparse
import llnl.util.tty as tty
import spack.cmd
import spack.build_environment as build_env
description = "Run a command with the environment for a particular spec's install."
def setup_parser(subparser):
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help="specs of package environment to emulate.")
def env(parser, args):
if not args.spec:
tty.die("spack env requires a spec.")
# Specs may have spaces in them, so if they do, require that the
# caller put a '--' between the spec and the command to be
# executed. If there is no '--', assume that the spec is the
# first argument.
sep = '--'
if sep in args.spec:
s = args.spec.index(sep)
spec = args.spec[:s]
cmd = args.spec[s+1:]
else:
spec = args.spec[0]
cmd = args.spec[1:]
specs = spack.cmd.parse_specs(spec, concretize=True)
if len(specs) > 1:
tty.die("spack env only takes one spec.")
spec = specs[0]
build_env.setup_package(spec.package)
if not cmd:
# If no command act like the "env" command and print out env vars.
for key, val in os.environ.items():
print "%s=%s" % (key, val)
else:
# Otherwise execute the command with the new environment
os.execvp(cmd[0], cmd)
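
A worked example of the '--' convention described in the comment above (the argv contents are hypothetical):

# spack env mpileaks@2.3 -- make -j8
argv = ['mpileaks@2.3', '--', 'make', '-j8']
sep = '--'
s = argv.index(sep)
spec, cmd = argv[:s], argv[s+1:]
print spec   # ['mpileaks@2.3']
print cmd    # ['make', '-j8']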


@@ -0,0 +1,98 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys
from external import argparse
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
import spack
import spack.cmd
import spack.cmd.find
description = "List extensions for package."
def setup_parser(subparser):
format_group = subparser.add_mutually_exclusive_group()
format_group.add_argument(
'-l', '--long', action='store_const', dest='mode', const='long',
help='Show dependency hashes as well as versions.')
format_group.add_argument(
'-p', '--paths', action='store_const', dest='mode', const='paths',
help='Show paths to extension install directories')
format_group.add_argument(
'-d', '--deps', action='store_const', dest='mode', const='deps',
help='Show full dependency DAG of extensions')
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help='Spec of package to list extensions for')
def extensions(parser, args):
if not args.spec:
tty.die("extensions requires a package spec.")
# Checks
spec = spack.cmd.parse_specs(args.spec)
if len(spec) > 1:
tty.die("Can only list extensions for one package.")
if not spec[0].package.extendable:
tty.die("%s is not an extendable package." % spec[0].name)
spec = spack.cmd.disambiguate_spec(spec[0])
if not spec.package.extendable:
tty.die("%s does not have extensions." % spec.short_spec)
if not args.mode:
args.mode = 'short'
# List package names of extensions
extensions = spack.db.extensions_for(spec)
if not extensions:
tty.msg("%s has no extensions." % spec.cshort_spec)
return
tty.msg(spec.cshort_spec)
tty.msg("%d extensions:" % len(extensions))
colify(ext.name for ext in extensions)
# List specs of installed extensions.
installed = [s.spec for s in spack.db.installed_extensions_for(spec)]
print
if not installed:
tty.msg("None installed.")
return
tty.msg("%d installed:" % len(installed))
spack.cmd.find.display_specs(installed, mode=args.mode)
# List specs of activated extensions.
activated = spack.install_layout.extension_map(spec)
print
if not activated:
tty.msg("None activated.")
return
tty.msg("%d currently activated:" % len(activated))
spack.cmd.find.display_specs(activated.values(), mode=args.mode)


@@ -22,7 +22,7 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
from external import argparse
import spack
import spack.cmd


@@ -24,13 +24,14 @@
##############################################################################
import sys
import collections
import argparse
import itertools
from external import argparse
from StringIO import StringIO
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
from llnl.util.tty.colify import *
from llnl.util.tty.color import *
from llnl.util.lang import partition_list, index_by
from llnl.util.lang import *
import spack
import spack.spec
@@ -40,17 +41,64 @@
def setup_parser(subparser):
format_group = subparser.add_mutually_exclusive_group()
format_group.add_argument(
'-p', '--paths', action='store_true', dest='paths',
'-l', '--long', action='store_const', dest='mode', const='long',
help='Show dependency hashes as well as versions.')
format_group.add_argument(
'-p', '--paths', action='store_const', dest='mode', const='paths',
help='Show paths to package install directories')
format_group.add_argument(
'-l', '--long', action='store_true', dest='full_specs',
help='Show full-length specs of installed packages')
'-d', '--deps', action='store_const', dest='mode', const='deps',
help='Show full dependency DAG of installed packages')
subparser.add_argument(
'query_specs', nargs=argparse.REMAINDER,
help='optional specs to filter results')
def display_specs(specs, **kwargs):
mode = kwargs.get('mode', 'short')
# Make a dict with specs keyed by architecture and compiler.
index = index_by(specs, ('architecture', 'compiler'))
# Traverse the index and print out each package
for i, (architecture, compiler) in enumerate(sorted(index)):
if i > 0: print
header = "%s{%s} / %s{%s}" % (
spack.spec.architecture_color, architecture,
spack.spec.compiler_color, compiler)
tty.hline(colorize(header), char='-')
specs = index[(architecture,compiler)]
specs.sort()
abbreviated = [s.format('$_$@$+', color=True) for s in specs]
if mode == 'paths':
# Print one spec per line along with prefix path
width = max(len(s) for s in abbreviated)
width += 2
format = " %-{}s%s".format(width)
for abbrv, spec in zip(abbreviated, specs):
print format % (abbrv, spec.prefix)
elif mode == 'deps':
for spec in specs:
print spec.tree(indent=4, format='$_$@$+$#', color=True),
elif mode in ('short', 'long'):
fmt = '$-_$@$+'
if mode == 'long':
fmt += '$#'
colify(s.format(fmt, color=True) for s in specs)
else:
raise ValueError(
"Invalid mode for display_specs: %s. Must be one of (paths, deps, short)." % mode)
def find(parser, args):
# Filter out specs that don't exist.
query_specs = spack.cmd.parse_specs(args.query_specs)
@@ -65,39 +113,17 @@ def find(parser, args):
if not query_specs:
return
specs = [s for s in spack.db.installed_package_specs()
if not query_specs or any(s.satisfies(q) for q in query_specs)]
# Get all the specs the user asked for
if not query_specs:
specs = set(spack.db.installed_package_specs())
else:
results = [set(spack.db.get_installed(qs)) for qs in query_specs]
specs = set.union(*results)
# Make a dict with specs keyed by architecture and compiler.
index = index_by(specs, 'architecture', 'compiler')
if not args.mode:
args.mode = 'short'
# Traverse the index and print out each package
for architecture in index:
tty.hline(architecture, char='=', color=spack.spec.architecture_color)
for compiler in index[architecture]:
tty.hline(compiler, char='-', color=spack.spec.compiler_color)
if sys.stdout.isatty():
tty.msg("%d installed packages." % len(specs))
display_specs(specs, mode=args.mode)
specs = index[architecture][compiler]
specs.sort()
abbreviated = [s.format('$_$@$+$#', color=True) for s in specs]
if args.paths:
# Print one spec per line along with prefix path
width = max(len(s) for s in abbreviated)
width += 2
format = " %-{}s%s".format(width)
for abbrv, spec in zip(abbreviated, specs):
print format % (abbrv, spec.package.prefix)
elif args.full_specs:
for spec in specs:
print spec.tree(indent=4, format='$_$@$+', color=True),
else:
max_len = max([len(s.name) for s in specs])
max_len += 4
for spec in specs:
format = '$-' + str(max_len) + '_$@$+$#'
print " " + spec.format(format, color=True)


@@ -22,9 +22,45 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import spack
from external import argparse
import spack
import spack.cmd
from spack.graph import *
description = "Generate graphs of package dependency relationships."
def setup_parser(subparser):
setup_parser.parser = subparser
method = subparser.add_mutually_exclusive_group()
method.add_argument(
'--ascii', action='store_true',
help="Draw graph as ascii to stdout (default).")
method.add_argument(
'--dot', action='store_true',
help="Generate graph in dot format and print to stdout.")
subparser.add_argument(
'--concretize', action='store_true', help="Concretize specs before graphing.")
subparser.add_argument(
'specs', nargs=argparse.REMAINDER, help="specs of packages to graph.")
description = "Write out inter-package dependencies in dot graph format"
def graph(parser, args):
spack.db.graph_dependencies()
specs = spack.cmd.parse_specs(
args.specs, normalize=True, concretize=args.concretize)
if not specs:
setup_parser.parser.print_help()
return 1
if args.dot: # Dot graph only if asked for.
graph_dot(*specs)
elif specs: # ascii is default: user doesn't need to provide it explicitly
graph_ascii(specs[0], debug=spack.debug)
for spec in specs[1:]:
print # extra line bt/w independent graphs
graph_ascii(spec, debug=spack.debug)


@@ -22,52 +22,57 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import re
import textwrap
from llnl.util.tty.colify import colify
from llnl.util.tty.colify import *
import spack
import spack.fetch_strategy as fs
description = "Get detailed information on a particular package"
def setup_parser(subparser):
subparser.add_argument('name', metavar="PACKAGE", help="name of packages to get info on")
subparser.add_argument('name', metavar="PACKAGE", help="Name of package to get info for.")
def info(parser, args):
package = spack.db.get(args.name)
print "Package: ", package.name
print "Homepage: ", package.homepage
print "Download: ", package.url
def print_text_info(pkg):
"""Print out a plain text description of a package."""
print "Package: ", pkg.name
print "Homepage: ", pkg.homepage
print
print "Safe versions: "
if package.versions:
colify(reversed(sorted(package.versions)), indent=4)
if not pkg.versions:
print("None.")
else:
print "None. Use spack versions %s to get a list of downloadable versions." % package.name
maxlen = max(len(str(v)) for v in pkg.versions)
fmt = "%%-%ss" % maxlen
for v in reversed(sorted(pkg.versions)):
f = fs.for_package_version(pkg, v)
print " " + (fmt % v) + " " + str(f)
print
print "Dependencies:"
if package.dependencies:
colify(package.dependencies, indent=4)
if pkg.dependencies:
colify(pkg.dependencies, indent=4)
else:
print " None"
print
print "Virtual packages: "
if package.provided:
for spec, when in package.provided.items():
if pkg.provided:
for spec, when in pkg.provided.items():
print " %s provides %s" % (when, spec)
else:
print " None"
print
print "Description:"
if package.__doc__:
doc = re.sub(r'\s+', ' ', package.__doc__)
lines = textwrap.wrap(doc, 72)
for line in lines:
print " " + line
if pkg.__doc__:
print pkg.format_doc(indent=4)
else:
print " None"
def info(parser, args):
pkg = spack.db.get(args.name)
print_text_info(pkg)
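
A quick worked example of the width-padded format string built in print_text_info:

versions = ['1.0', '1.10.2']
maxlen = max(len(str(v)) for v in versions)   # 6
fmt = "%%-%ss" % maxlen                       # expands to '%-6s'
for v in versions:
    print "  " + (fmt % v) + "  <url or fetch strategy>"
#   1.0     <url or fetch strategy>
#   1.10.2  <url or fetch strategy>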


@@ -23,7 +23,7 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys
import argparse
from external import argparse
import spack
import spack.cmd
@@ -43,6 +43,9 @@ def setup_parser(subparser):
subparser.add_argument(
'-n', '--no-checksum', action='store_true', dest='no_checksum',
help="Do not check packages against checksum")
subparser.add_argument(
'--fake', action='store_true', dest='fake',
help="Fake install. Just remove the prefix and touch a fake file in it.")
subparser.add_argument(
'packages', nargs=argparse.REMAINDER, help="specs of packages to install")
@@ -59,4 +62,5 @@ def install(parser, args):
package = spack.db.get(spec)
package.do_install(keep_prefix=args.keep_prefix,
keep_stage=args.keep_stage,
ignore_deps=args.ignore_deps)
ignore_deps=args.ignore_deps,
fake=args.fake)


@@ -22,15 +22,43 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys
import llnl.util.tty as tty
from external import argparse
from llnl.util.tty.colify import colify
import spack
import fnmatch
description ="List available spack packages"
def setup_parser(subparser):
pass
subparser.add_argument(
'filter', nargs=argparse.REMAINDER,
help='Optional glob patterns to filter results.')
subparser.add_argument(
'-i', '--insensitive', action='store_true', default=False,
help='Filtering will be case insensitive.')
def list(parser, args):
# Start with all package names.
pkgs = spack.db.all_package_names()
# filter if a filter arg was provided
if args.filter:
def match(p, f):
if args.insensitive:
p = p.lower()
f = f.lower()
return fnmatch.fnmatchcase(p, f)
pkgs = [p for p in pkgs if any(match(p, f) for f in args.filter)]
# sort before displaying.
sorted_packages = sorted(pkgs, key=lambda s:s.lower())
# Print all the package names in columns
colify(spack.db.all_package_names())
indent=0
if sys.stdout.isatty():
tty.msg("%d packages." % len(sorted_packages))
colify(sorted_packages, indent=indent)
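
The filter above matches package names against shell-style globs, lowercasing both sides when -i/--insensitive is given. A self-contained check of that behavior:

import fnmatch

def match(p, f, insensitive=True):
    if insensitive:
        p, f = p.lower(), f.lower()
    return fnmatch.fnmatchcase(p, f)

pkgs = ['libelf', 'LibDwarf', 'mpich']
print [p for p in pkgs if match(p, 'lib*')]   # ['libelf', 'LibDwarf']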


@@ -0,0 +1,38 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by David Beckingsale, david@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from external import argparse
import spack.modules
description ="Add package to environment using modules."
def setup_parser(subparser):
"""Parser is only constructed so that this prints a nice help
message with -h. """
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help='Spec of package to load with modules.')
def load(parser, args):
spack.modules.print_help()


@@ -0,0 +1,107 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import sys
from external import argparse
import llnl.util.tty as tty
from llnl.util.filesystem import join_path
import spack
import spack.cmd
description="Print out locations of various diectories used by Spack"
def setup_parser(subparser):
global directories
directories = subparser.add_mutually_exclusive_group()
directories.add_argument(
'-m', '--module-dir', action='store_true', help="Spack python module directory.")
directories.add_argument(
'-r', '--spack-root', action='store_true', help="Spack installation root.")
directories.add_argument(
'-i', '--install-dir', action='store_true',
help="Install prefix for spec (spec need not be installed).")
directories.add_argument(
'-p', '--package-dir', action='store_true',
help="Directory enclosing a spec's package.py file.")
directories.add_argument(
'-P', '--packages', action='store_true',
help="Top-level packages directory for Spack.")
directories.add_argument(
'-s', '--stage-dir', action='store_true', help="Stage directory for a spec.")
directories.add_argument(
'-b', '--build-dir', action='store_true',
help="Checked out or expanded source directory for a spec (requires it to be staged first).")
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help="spec of package to fetch directory for.")
def location(parser, args):
if args.module_dir:
print spack.module_path
elif args.spack_root:
print spack.prefix
elif args.packages:
print spack.db.root
else:
specs = spack.cmd.parse_specs(args.spec)
if not specs:
tty.die("You must supply a spec.")
if len(specs) != 1:
tty.die("Too many specs. Supply only one.")
if args.install_dir:
# install_dir command matches against installed specs.
spec = spack.cmd.disambiguate_spec(specs[0])
print spec.prefix
else:
spec = specs[0]
if args.package_dir:
# This one just needs the spec name.
print join_path(spack.db.root, spec.name)
else:
# These versions need concretized specs.
spec.concretize()
pkg = spack.db.get(spec)
if args.stage_dir:
print pkg.stage.path
else: # args.build_dir is the default.
if not pkg.stage.source_path:
tty.die("Build directory does not exist yet. Run this to create it:",
"spack stage " + " ".join(args.spec))
print pkg.stage.source_path


@@ -0,0 +1,53 @@
##############################################################################
# Copyright (c) 2013-2014, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import hashlib
from external import argparse
import llnl.util.tty as tty
from llnl.util.filesystem import *
import spack.util.crypto
description = "Calculate md5 checksums for files."
def setup_parser(subparser):
setup_parser.parser = subparser
subparser.add_argument('files', nargs=argparse.REMAINDER,
help="Files to checksum.")
def md5(parser, args):
if not args.files:
setup_parser.parser.print_help()
return 1
for f in args.files:
if not os.path.isfile(f):
tty.die("Not a file: %s" % f)
if not can_access(f):
tty.die("Cannot read file: %s" % f)
checksum = spack.util.crypto.checksum(hashlib.md5, f)
print "%s %s" % (checksum, f)


@@ -23,25 +23,21 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import shutil
import argparse
import sys
from datetime import datetime
from contextlib import closing
from external import argparse
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
from llnl.util.filesystem import mkdirp, join_path
import spack
import spack.cmd
import spack.config
import spack.mirror
from spack.spec import Spec
from spack.error import SpackError
from spack.stage import Stage
from spack.util.compression import extension
description = "Manage spack mirrors."
description = "Manage mirrors."
def setup_parser(subparser):
subparser.add_argument(
@@ -58,6 +54,9 @@ def setup_parser(subparser):
'specs', nargs=argparse.REMAINDER, help="Specs of packages to put in mirror")
create_parser.add_argument(
'-f', '--file', help="File with specs of packages to put in mirror.")
create_parser.add_argument(
'-o', '--one-version-per-spec', action='store_const', const=1, default=0,
help="Only fetch one 'preferred' version per spec, not all known versions.")
add_parser = sp.add_parser('add', help=mirror_add.__doc__)
add_parser.add_argument('name', help="Mnemonic name for mirror.")
@@ -72,8 +71,12 @@ def setup_parser(subparser):
def mirror_add(args):
"""Add a mirror to Spack."""
url = args.url
if url.startswith('/'):
url = 'file://' + url
config = spack.config.get_config('user')
config.set_value('mirror', args.name, 'url', args.url)
config.set_value('mirror', args.name, 'url', url)
config.write()
@@ -105,112 +108,63 @@ def mirror_list(args):
print fmt % (name, val)
def _read_specs_from_file(filename):
    specs = []
    with closing(open(filename, "r")) as stream:
        for i, string in enumerate(stream):
            try:
                s = Spec(string)
                s.package
                specs.append(s)
            except SpackError, e:
                tty.die("Parse error in %s, line %d:" % (filename, i+1),
                        ">>> " + string, str(e))
    return specs
def mirror_create(args):
"""Create a directory to be used as a spack mirror, and fill it with
package archives."""
# try to parse specs from the command line first.
args.specs = spack.cmd.parse_specs(args.specs)
specs = spack.cmd.parse_specs(args.specs)
# If there is a file, parse each line as a spec and add it to the list.
if args.file:
with closing(open(args.file, "r")) as stream:
for i, string in enumerate(stream):
try:
s = Spec(string)
s.package
args.specs.append(s)
except SpackError, e:
tty.die("Parse error in %s, line %d:" % (args.file, i+1),
">>> " + string, str(e))
if specs:
tty.die("Cannot pass specs on the command line with --file.")
specs = _read_specs_from_file(args.file)
if not args.specs:
args.specs = [Spec(n) for n in spack.db.all_package_names()]
# If nothing is passed, use all packages.
if not specs:
specs = [Spec(n) for n in spack.db.all_package_names()]
specs.sort(key=lambda s: s.format("$_$@").lower())
# Default name for directory is spack-mirror-<DATESTAMP>
if not args.directory:
directory = args.directory
if not directory:
timestamp = datetime.now().strftime("%Y-%m-%d")
args.directory = 'spack-mirror-' + timestamp
directory = 'spack-mirror-' + timestamp
# Make sure nothing is in the way.
if os.path.isfile(args.directory):
tty.error("%s already exists and is a file." % args.directory)
existed = False
if os.path.isfile(directory):
tty.error("%s already exists and is a file." % directory)
elif os.path.isdir(directory):
existed = True
# Create a directory if none exists
if not os.path.isdir(args.directory):
mkdirp(args.directory)
tty.msg("Created new mirror in %s" % args.directory)
else:
tty.msg("Adding to existing mirror in %s" % args.directory)
# Actually do the work to create the mirror
present, mirrored, error = spack.mirror.create(
directory, specs, num_versions=args.one_version_per_spec)
p, m, e = len(present), len(mirrored), len(error)
# Things to keep track of while parsing specs.
working_dir = os.getcwd()
num_mirrored = 0
num_error = 0
# Iterate through packages and download all the safe tarballs for each of them
for spec in args.specs:
pkg = spec.package
# Skip any package that has no checksummed versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s."
% pkg.name)
continue
# create a subdir for the current package.
pkg_path = join_path(args.directory, pkg.name)
mkdirp(pkg_path)
# Download all the tarballs using Stages, then move them into place
for version in pkg.versions:
# Skip versions that don't match the spec
vspec = Spec('%s@%s' % (pkg.name, version))
if not vspec.satisfies(spec):
continue
mirror_path = "%s/%s-%s.%s" % (
pkg.name, pkg.name, version, extension(pkg.url))
os.chdir(working_dir)
mirror_file = join_path(args.directory, mirror_path)
if os.path.exists(mirror_file):
tty.msg("Already fetched %s." % mirror_file)
num_mirrored += 1
continue
# Get the URL for the version and set up a stage to download it.
url = pkg.url_for_version(version)
stage = Stage(url)
try:
# fetch changes directory into the stage
stage.fetch()
if not args.no_checksum and version in pkg.versions:
digest = pkg.versions[version]
stage.check(digest)
tty.msg("Checksum passed for %s@%s" % (pkg.name, version))
# change back and move the new archive into place.
os.chdir(working_dir)
shutil.move(stage.archive_file, mirror_file)
tty.msg("Added %s to mirror" % mirror_file)
num_mirrored += 1
except Exception, e:
tty.warn("Error while fetching %s." % url, e.message)
num_error += 1
finally:
stage.destroy()
# If nothing happened, try to say why.
if not num_mirrored:
if num_error:
tty.error("No packages added to mirror.",
"All packages failed to fetch.")
else:
tty.error("No packages added to mirror. No versions matched specs:")
colify(args.specs, indent=4)
verb = "updated" if existed else "created"
tty.msg(
"Successfully %s mirror in %s." % (verb, directory),
"Archive stats:",
" %-4d already present" % p,
" %-4d added" % m,
" %-4d failed to fetch." % e)
if error:
tty.error("Failed downloads:")
colify(s.format("$_$@") for s in error)
def mirror(parser, args):
@@ -218,4 +172,5 @@ def mirror(parser, args):
'add' : mirror_add,
'remove' : mirror_remove,
'list' : mirror_list }
action[args.mirror_command](args)
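
A quick check of the path normalization added to mirror_add, which turns bare filesystem paths into file:// URLs before they reach the config:

url = '/usr/local/spack-mirror'   # hypothetical local mirror path
if url.startswith('/'):
    url = 'file://' + url
print url   # file:///usr/local/spack-mirror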


@@ -0,0 +1,107 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys
import os
import shutil
from external import argparse
import llnl.util.tty as tty
from llnl.util.lang import partition_list
from llnl.util.filesystem import mkdirp
import spack.cmd
from spack.modules import module_types
from spack.util.string import *
from spack.spec import Spec
description ="Manipulate modules and dotkits."
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar='SUBCOMMAND', dest='module_command')
refresh_parser = sp.add_parser('refresh', help='Regenerate all module files.')
find_parser = sp.add_parser('find', help='Find module files for packages.')
find_parser.add_argument(
'module_type', help="Type of module to find file for. [" + '|'.join(module_types) + "]")
find_parser.add_argument('spec', nargs='+', help='spec to find a module file for.')
def module_find(mtype, spec_array):
"""Look at all installed packages and see if the spec provided
matches any. If it does, check whether there is a module file
of type <mtype> there, and print out the name that the user
should type to use that package's module.
"""
if mtype not in module_types:
tty.die("Invalid module type: '%s'. Options are %s." % (mtype, comma_or(module_types)))
specs = spack.cmd.parse_specs(spec_array)
if len(specs) > 1:
tty.die("You can only pass one spec.")
spec = specs[0]
specs = [s for s in spack.db.installed_package_specs() if s.satisfies(spec)]
if len(specs) == 0:
tty.die("No installed packages match spec %s" % spec)
if len(specs) > 1:
tty.error("Multiple matches for spec %s. Choose one:" % spec)
for s in specs:
sys.stderr.write(s.tree(color=True))
sys.exit(1)
mt = module_types[mtype]
mod = mt(specs[0])
if not os.path.isfile(mod.file_name):
tty.die("No %s module is installed for %s." % (mtype, spec))
print mod.use_name
def module_refresh():
"""Regenerate all module files for installed packages known to
spack (some packages may no longer exist)."""
specs = [s for s in spack.db.installed_known_package_specs()]
for name, cls in module_types.items():
tty.msg("Regenerating %s module files." % name)
if os.path.isdir(cls.path):
shutil.rmtree(cls.path, ignore_errors=False)
mkdirp(cls.path)
for spec in specs:
tty.debug(" Writing file for %s." % spec)
cls(spec).write()
def module(parser, args):
if args.module_command == 'refresh':
module_refresh()
elif args.module_command == 'find':
module_find(args.module_type, args.spec)


@@ -0,0 +1,95 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import re
import cgi
from StringIO import StringIO
import llnl.util.tty as tty
from llnl.util.tty.colify import *
import spack
description = "Print a list of all packages in reStructuredText."
def github_url(pkg):
"""Link to a package file on github."""
return ("https://github.com/scalability-llnl/spack/blob/master/var/spack/packages/%s/package.py" %
pkg.name)
def rst_table(elts):
"""Print out a RST-style table."""
cols = StringIO()
ncol, widths = colify(elts, output=cols, tty=True)
header = " ".join("=" * (w-1) for w in widths)
return "%s\n%s%s" % (header, cols.getvalue(), header)
def print_rst_package_list():
"""Print out information on all packages in restructured text."""
pkgs = sorted(spack.db.all_packages(), key=lambda s:s.name.lower())
print ".. _package-list:"
print
print "Package List"
print "=================="
print "This is a list of things you can install using Spack. It is"
print "automatically generated based on the packages in the latest Spack"
print "release."
print
print "Spack currently has %d mainline packages:" % len(pkgs)
print
print rst_table("`%s`_" % p.name for p in pkgs)
print
print "-----"
# Output some text for each package.
for pkg in pkgs:
print
print ".. _%s:" % pkg.name
print
print pkg.name
print "-" * len(pkg.name)
print "Links:"
print " * `%s <%s>`__" % (cgi.escape(pkg.homepage), pkg.homepage)
print " * `%s/package.py <%s>`__" % (pkg.name, github_url(pkg))
print
if pkg.versions:
print "Versions:"
print " " + ", ".join(str(v) for v in reversed(sorted(pkg.versions)))
if pkg.dependencies:
print "Dependencies"
print " " + ", ".join("`%s`_" % d if d != "mpi" else d
for d in pkg.dependencies)
print
print "Description:"
print pkg.format_doc(indent=2)
print
print "-----"
def package_list(parser, args):
print_rst_package_list()
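
rst_table leans on colify for its column widths; a hedged, colify-free sketch of the same framing for a single row (an RST "simple table" is just a '=' rule above and below the cells):

def one_row_rst_table(row):
    widths = [len(c) for c in row]
    rule = "  ".join("=" * w for w in widths)
    body = "  ".join(c.ljust(w) for c, w in zip(row, widths))
    return "%s\n%s\n%s" % (rule, body, rule)

print one_row_rst_table(['`libelf`_', '`mpich`_'])
# =========  ========
# `libelf`_  `mpich`_
# =========  ========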


@@ -22,7 +22,7 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
from external import argparse
import spack.cmd
import spack

lib/spack/spack/cmd/pkg.py

@@ -0,0 +1,139 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
from external import argparse
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
import spack
from spack.util.executable import *
description = "Query packages associated with particular git revisions."
def setup_parser(subparser):
sp = subparser.add_subparsers(
metavar='SUBCOMMAND', dest='pkg_command')
add_parser = sp.add_parser('add', help=pkg_add.__doc__)
add_parser.add_argument('packages', nargs=argparse.REMAINDER,
help="Names of packages to add to git repo.")
list_parser = sp.add_parser('list', help=pkg_list.__doc__)
list_parser.add_argument('rev', default='HEAD', nargs='?',
help="Revision to list packages for.")
diff_parser = sp.add_parser('diff', help=pkg_diff.__doc__)
diff_parser.add_argument('rev1', nargs='?', default='HEAD^',
help="Revision to compare against.")
diff_parser.add_argument('rev2', nargs='?', default='HEAD',
help="Revision to compare to rev1 (default is HEAD).")
add_parser = sp.add_parser('added', help=pkg_added.__doc__)
add_parser.add_argument('rev1', nargs='?', default='HEAD^',
help="Revision to compare against.")
add_parser.add_argument('rev2', nargs='?', default='HEAD',
help="Revision to compare to rev1 (default is HEAD).")
rm_parser = sp.add_parser('removed', help=pkg_removed.__doc__)
rm_parser.add_argument('rev1', nargs='?', default='HEAD^',
help="Revision to compare against.")
rm_parser.add_argument('rev2', nargs='?', default='HEAD',
help="Revision to compare to rev1 (default is HEAD).")
def get_git():
# cd to spack prefix to do git operations
os.chdir(spack.prefix)
# If this is a non-git version of spack, give up.
if not os.path.isdir('.git'):
tty.die("No git repo in %s. Can't use 'spack pkg'" % spack.prefix)
return which("git", required=True)
def list_packages(rev):
git = get_git()
relpath = spack.packages_path[len(spack.prefix + os.path.sep):] + os.path.sep
output = git('ls-tree', '--full-tree', '--name-only', rev, relpath,
return_output=True)
return sorted(line[len(relpath):] for line in output.split('\n') if line)
def pkg_add(args):
for pkg_name in args.packages:
filename = spack.db.filename_for_package_name(pkg_name)
if not os.path.isfile(filename):
tty.die("No such package: %s. Path does not exist:" % pkg_name, filename)
git = get_git()
git('-C', spack.packages_path, 'add', filename)
def pkg_list(args):
"""List packages associated with a particular spack git revision."""
colify(list_packages(args.rev))
def diff_packages(rev1, rev2):
p1 = set(list_packages(rev1))
p2 = set(list_packages(rev2))
return p1.difference(p2), p2.difference(p1)
def pkg_diff(args):
"""Compare packages available in two different git revisions."""
u1, u2 = diff_packages(args.rev1, args.rev2)
if u1:
print "%s:" % args.rev1
colify(sorted(u1), indent=4)
if u1: print
if u2:
print "%s:" % args.rev2
colify(sorted(u2), indent=4)
def pkg_removed(args):
"""Show packages removed since a commit."""
u1, u2 = diff_packages(args.rev1, args.rev2)
if u1: colify(sorted(u1))
def pkg_added(args):
"""Show packages added since a commit."""
u1, u2 = diff_packages(args.rev1, args.rev2)
if u2: colify(sorted(u2))
def pkg(parser, args):
action = { 'add' : pkg_add,
'diff' : pkg_diff,
'list' : pkg_list,
'removed' : pkg_removed,
'added' : pkg_added }
action[args.pkg_command](args)
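
diff_packages reduces 'spack pkg diff/added/removed' to plain set arithmetic over the package lists at two revisions; a worked example with made-up names:

p1 = set(['libelf', 'mpich'])     # packages at rev1 (e.g. HEAD^)
p2 = set(['libelf', 'hdf5'])      # packages at rev2 (e.g. HEAD)
print sorted(p1.difference(p2))   # ['mpich'] -> removed since rev1
print sorted(p2.difference(p1))   # ['hdf5']  -> added since rev1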


@@ -23,7 +23,7 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import argparse
from external import argparse
from llnl.util.tty.colify import colify


@@ -25,7 +25,7 @@
import os
import sys
import code
import argparse
from external import argparse
import platform
from contextlib import closing


@@ -0,0 +1,46 @@
##############################################################################
# Copyright (c) 2013-2014, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from external import argparse
import llnl.util.tty as tty
import spack
import spack.cmd
description = "Revert checked out package source code."
def setup_parser(subparser):
subparser.add_argument('packages', nargs=argparse.REMAINDER,
help="specs of packages to restage")
def restage(parser, args):
if not args.packages:
tty.die("spack restage requires at least one package spec.")
specs = spack.cmd.parse_specs(args.packages, concretize=True)
for spec in specs:
package = spack.db.get(spec)
package.do_restage()


@@ -22,13 +22,13 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
from external import argparse
import spack.cmd
import llnl.util.tty as tty
import spack.url as url
import spack
import spack.url as url
description = "print out abstract and concrete versions of a spec."


@@ -22,8 +22,10 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
import os
from external import argparse
import llnl.util.tty as tty
import spack
import spack.cmd
@@ -33,18 +35,21 @@ def setup_parser(subparser):
subparser.add_argument(
'-n', '--no-checksum', action='store_true', dest='no_checksum',
help="Do not check downloaded packages against checksum")
dir_parser = subparser.add_mutually_exclusive_group()
subparser.add_argument(
'packages', nargs=argparse.REMAINDER, help="specs of packages to stage")
'specs', nargs=argparse.REMAINDER, help="specs of packages to stage")
def stage(parser, args):
if not args.packages:
if not args.specs:
tty.die("stage requires at least one package argument")
if args.no_checksum:
spack.do_checksum = False
specs = spack.cmd.parse_specs(args.packages, concretize=True)
specs = spack.cmd.parse_specs(args.specs, concretize=True)
for spec in specs:
package = spack.db.get(spec)
package.do_stage()


@@ -22,12 +22,13 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
from external import argparse
import llnl.util.tty as tty
import spack
import spack.cmd
import spack.packages
description="Remove an installed package"
@@ -64,11 +65,18 @@ def uninstall(parser, args):
" b) use a more specific spec."]
tty.die(*args)
if len(matching_specs) == 0:
if args.force: continue
tty.die("%s does not match any installed packages." % spec)
pkgs.extend(spack.db.get(s) for s in matching_specs)
for s in matching_specs:
try:
# should work if package is known to spack
pkgs.append(s.package)
except spack.packages.UnknownPackageError, e:
# The package.py file has gone away -- but still want to uninstall.
spack.Package(s).do_uninstall(force=True)
# Sort packages to be uninstalled by the number of installed dependents
# This ensures we do things in the right order


@@ -0,0 +1,38 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by David Beckingsale, david@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from external import argparse
import spack.modules
description ="Remove package from environment using module."
def setup_parser(subparser):
"""Parser is only constructed so that this prints a nice help
message with -h. """
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help='Spec of package to unload with modules.')
def unload(parser, args):
spack.modules.print_help()


@@ -22,15 +22,17 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
import spack.cmd.use
from external import argparse
import spack.modules
description ="Remove package from environment using dotkit."
def setup_parser(subparser):
"""Parser is only constructed so that this prints a nice help
message with -h. """
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help='Spec of package to remove.')
'spec', nargs=argparse.REMAINDER, help='Spec of package to unuse with dotkit.')
def unuse(parser, args):
spack.cmd.use.print_help()
spack.modules.print_help()


@@ -0,0 +1,58 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys
import spack
import spack.url
description = "Inspect urls used by packages in spack."
def setup_parser(subparser):
subparser.add_argument(
'-c', '--color', action='store_true',
help="Color the parsed version and name in the urls shown. "
"Version will be cyan, name red.")
subparser.add_argument(
'-e', '--extrapolation', action='store_true',
help="Color the versions used for extrapolation as well."
"Additional versions are green, names magenta.")
def urls(parser, args):
urls = set()
for pkg in spack.db.all_packages():
url = getattr(pkg.__class__, 'url', None)
if url:
urls.add(url)
for params in pkg.versions.values():
url = params.get('url', None)
if url:
urls.add(url)
for url in sorted(urls):
if args.color or args.extrapolation:
print spack.url.color_url(url, subs=args.extrapolation, errors=True)
else:
print url

View File

@@ -22,29 +22,17 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
import llnl.util.tty as tty
import spack
from external import argparse
import spack.modules
description ="Add package to environment using dotkit."
def setup_parser(subparser):
"""Parser is only constructed so that this prints a nice help
message with -h. """
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help='Spec of package to add.')
def print_help():
tty.msg("Spack dotkit support is not initialized.",
"",
"To use dotkit with Spack, you must first run the command",
"below, which you can copy and paste:",
"",
"For bash:",
" . %s/setup-env.bash" % spack.share_path,
"",
"ksh/csh/tcsh shells are currently unsupported",
"")
'spec', nargs=argparse.REMAINDER, help='Spec of package to use with dotkit.')
def use(parser, args):
print_help()
spack.modules.print_help()

View File

@@ -24,6 +24,7 @@
##############################################################################
import os
from llnl.util.tty.colify import colify
import llnl.util.tty as tty
import spack
description ="List available versions of a package"
@@ -34,4 +35,21 @@ def setup_parser(subparser):
def versions(parser, args):
pkg = spack.db.get(args.package)
colify(reversed(pkg.fetch_available_versions()))
safe_versions = pkg.versions
fetched_versions = pkg.fetch_remote_versions()
remote_versions = set(fetched_versions).difference(safe_versions)
tty.msg("Safe versions (already checksummed):")
colify(sorted(safe_versions, reverse=True), indent=2)
tty.msg("Remote versions (not yet checksummed):")
if not remote_versions:
if not fetched_versions:
print " Found no versions for %s" % pkg.name
tty.debug("Check the list_url and list_depth attribute on the "
"package to help Spack find versions.")
else:
print " Found no unckecksummed versions for %s" % pkg.name
else:
colify(sorted(remote_versions, reverse=True), indent=2)

View File

@@ -1,117 +0,0 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""\
The ``compilation`` module contains utility functions used by the compiler
wrapper script.
.. todo::
Think about moving this into the script to increase compilation
speed.
"""
import os
import sys
def get_env_var(name, required=True):
value = os.environ.get(name)
if required and value is None:
print "%s must be run from spack." % os.path.abspath(sys.argv[0])
sys.exit(1)
return value
def get_env_flag(name, required=False):
value = get_env_var(name, required)
if value:
return value.lower() == "true"
return False
def get_path(name):
path = os.environ.get(name, "").strip()
if path:
return path.split(":")
else:
return []
def parse_rpaths(arguments):
"""argparse, for all its features, cannot understand most compilers'
rpath arguments. This handles '-Wl,', '-Xlinker', and '-R'"""
def get_next(arg, args):
"""Get an expected next value of an iterator, or die if it's not there"""
try:
return next(args)
except StopIteration:
# quietly ignore -rpath and -Xlinker without args.
return None
other_args = []
def linker_args():
"""This generator function allows us to parse the linker args separately
from the compiler args, so that we can handle them more naturally.
"""
args = iter(arguments)
for arg in args:
if arg.startswith('-Wl,'):
sub_args = [sub for sub in arg.replace('-Wl,', '', 1).split(',')]
for arg in sub_args:
yield arg
elif arg == '-Xlinker':
target = get_next(arg, args)
if target is not None:
yield target
else:
other_args.append(arg)
# Extract all the possible ways rpath can appear in linker args, then
# append non-rpaths to other_args. This happens in-line as the linker
# args are extracted, so we preserve the original order of arguments.
# This is important for args like --whole-archive, --no-whole-archive,
# and others that tell the linker how to handle the next few libraries
# it encounters on the command line.
rpaths = []
largs = linker_args()
for arg in largs:
if arg == '-rpath':
target = get_next(arg, largs)
if target is not None:
rpaths.append(target)
elif arg.startswith('-R'):
target = arg.replace('-R', '', 1)
if not target:
target = get_next(arg, largs)
if target is None: break
if os.path.isdir(target):
rpaths.append(target)
else:
other_args.extend(['-Wl,' + arg, '-Wl,' + target])
else:
other_args.append('-Wl,' + arg)
return rpaths, other_args
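A usage sketch of the removed helper (the argument list here is hypothetical): all three rpath spellings land in the first return value, and everything else passes through.
rpaths, other = parse_rpaths(
    ['-L/usr/lib', '-Wl,-rpath,/opt/a', '-Xlinker', '-rpath',
     '-Xlinker', '/opt/b', '-lm'])
# rpaths -> ['/opt/a', '/opt/b']
# other  -> ['-L/usr/lib', '-lm']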

View File

@@ -35,8 +35,8 @@
import spack.spec
from spack.util.multiproc import parmap
from spack.util.executable import *
from spack.util.environment import get_path
from spack.version import Version
from spack.compilation import get_path
__all__ = ['Compiler', 'get_compiler_version']
@@ -94,6 +94,9 @@ class Compiler(object):
# Names of generic arguments used by this compiler
arg_rpath = '-Wl,-rpath,%s'
# argument used to get C++11 options
cxx11_flag = "-std=c++11"
def __init__(self, cspec, cc, cxx, f77, fc):
def check(exe):
@@ -166,6 +169,10 @@ def _find_matches_in_path(cls, compiler_names, detect_version, *path):
checks = []
for directory in path:
if not (os.path.isdir(directory) and
os.access(directory, os.R_OK | os.X_OK)):
continue
files = os.listdir(directory)
for exe in files:
full_path = join_path(directory, exe)
@@ -187,9 +194,15 @@ def check(key):
except ProcessError, e:
tty.debug("Couldn't get version for compiler %s" % full_path, e)
return None
except Exception, e:
# Catching "Exception" here is fine because it just
# means something went wrong running a candidate executable.
tty.debug("Error while executing candidate compiler %s" % full_path,
"%s: %s" %(e.__class__.__name__, e))
return None
successful = [key for key in parmap(check, checks) if key is not None]
return { (v, p, s) : path for v, p, s, path in successful }
return dict(((v, p, s), path) for v, p, s, path in successful)
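Replacing the dict comprehension with dict() over a generator looks like a Python 2.6 compatibility change (dict comprehensions arrived in 2.7); both forms build the same mapping. A minimal sketch with made-up detection results:
# Each entry is a (version, prefix, suffix, path) tuple.
successful = [('4.8.2', None, None, '/usr/bin/gcc')]
dict(((v, p, s), path) for v, p, s, path in successful)
# -> {('4.8.2', None, None): '/usr/bin/gcc'}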
@classmethod
def find(cls, *path):

View File

@@ -40,7 +40,7 @@
from spack.compiler import Compiler
from spack.util.executable import which
from spack.util.naming import mod_to_class
from spack.compilation import get_path
from spack.util.environment import get_path
_imported_compilers_module = 'spack.compilers'
_required_instance_vars = ['cc', 'cxx', 'f77', 'fc']
@@ -176,7 +176,7 @@ def compilers_for_spec(compiler_spec):
config = _get_config()
def get_compiler(cspec):
items = { k:v for k,v in config.items('compiler "%s"' % cspec) }
items = dict((k,v) for k,v in config.items('compiler "%s"' % cspec))
if not all(n in items for n in _required_instance_vars):
raise InvalidCompilerConfigurationError(cspec)

View File

@@ -22,7 +22,9 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import llnl.util.tty as tty
from spack.compiler import *
from spack.version import ver
class Gcc(Compiler):
# Subclasses use possible names of C compiler
@@ -40,12 +42,21 @@ class Gcc(Compiler):
# MacPorts builds gcc versions with prefixes and -mp-X.Y suffixes.
suffixes = [r'-mp-\d\.\d']
@property
def cxx11_flag(self):
if self.version < ver('4.3'):
tty.die("Only gcc 4.3 and above support c++11.")
elif self.version < ver('4.7'):
return "-std=gnu++0x"
else:
return "-std=gnu++11"
@classmethod
def fc_version(cls, fc):
return get_compiler_version(
fc, '-dumpversion',
# older gfortran versions don't have simple dumpversion output.
r'(?:GNU Fortran \(GCC\))?(\d+\.\d+\.\d+)')
r'(?:GNU Fortran \(GCC\))?(\d+\.\d+(?:\.\d+)?)')
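The loosened capture group also accepts two-component versions, presumably for gfortran builds that report only major.minor; a quick check:
import re
pat = r'(?:GNU Fortran \(GCC\))?(\d+\.\d+(?:\.\d+)?)'
re.search(pat, 'GNU Fortran (GCC) 4.8.2').group(1)  # -> '4.8.2'
re.search(pat, '4.4').group(1)                      # -> '4.4'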
@classmethod

View File

@@ -37,6 +37,15 @@ class Intel(Compiler):
# Subclasses use possible names of Fortran 90 compiler
fc_names = ['ifort']
@property
def cxx11_flag(self):
if self.version < ver('11.1'):
tty.die("Only intel 11.1 and above support c++11.")
elif self.version < ver('13'):
return "-std=c++0x"
else:
return "-std=c++11"
@classmethod
def default_version(cls, comp):

View File

@@ -68,11 +68,14 @@ def concretize_version(self, spec):
# If there are known available versions, return the most recent
# version that satisfies the spec
pkg = spec.package
valid_versions = pkg.available_versions.intersection(spec.versions)
valid_versions = sorted(
[v for v in pkg.versions
if any(v.satisfies(sv) for sv in spec.versions)])
if valid_versions:
spec.versions = ver([valid_versions[-1]])
else:
spec.versions = ver([pkg.default_version])
raise NoValidVersionError(spec)
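A sketch of the selection rule with made-up versions: the newest known version that satisfies the spec's range wins.
from spack.version import Version, ver
known = [Version('1.0'), Version('1.2'), Version('2.0')]
wanted = ver('1.0:1.9')   # hypothetical spec.versions
valid = sorted(v for v in known
               if any(v.satisfies(sv) for sv in wanted))
# valid[-1] -> Version('1.2')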
def concretize_architecture(self, spec):
@@ -118,7 +121,7 @@ def concretize_compiler(self, spec):
return
try:
nearest = next(p for p in spec.preorder_traversal(direction='parents')
nearest = next(p for p in spec.traverse(direction='parents')
if p.compiler is not None).compiler
if not nearest in all_compilers:
@@ -158,3 +161,11 @@ def __init__(self, compiler_spec):
super(UnavailableCompilerVersionError, self).__init__(
"No available compiler version matches '%s'" % compiler_spec,
"Run 'spack compilers' to see available compiler Options.")
class NoValidVersionError(spack.error.SpackError):
"""Raised when there is no available version for a package that
satisfies a spec."""
def __init__(self, spec):
super(NoValidVersionError, self).__init__(
"No available version of %s matches '%s'" % (spec.name, spec.versions))

View File

@@ -84,10 +84,9 @@
import re
import inspect
import ConfigParser as cp
from collections import OrderedDict
from external.ordereddict import OrderedDict
from llnl.util.lang import memoized
import spack.error
__all__ = [
@@ -222,7 +221,6 @@ class SpackConfigParser(cp.RawConfigParser):
"""
# Slightly modify Python option expressions to allow leading whitespace
OPTCRE = re.compile(r'\s*' + cp.RawConfigParser.OPTCRE.pattern)
OPTCRE_NV = re.compile(r'\s*' + cp.RawConfigParser.OPTCRE_NV.pattern)
def __init__(self, file_or_files):
cp.RawConfigParser.__init__(self, dict_type=OrderedDict)
@@ -341,14 +339,13 @@ def write(self, path_or_fp=None):
def _read(self, fp, fpname):
"""This is a copy of Python 2.7's _read() method, with support for
continuation lines removed.
"""
cursect = None # None, or a dictionary
"""This is a copy of Python 2.6's _read() method, with support for
continuation lines removed."""
cursect = None # None, or a dictionary
optname = None
lineno = 0
comment = 0
e = None # None, or an exception
lineno = 0
e = None # None, or an exception
while True:
line = fp.readline()
if not line:
@@ -359,7 +356,6 @@ def _read(self, fp, fpname):
(line.split(None, 1)[0].lower() == 'rem' and line[0] in "rR")):
self._sections["comment-%d" % comment] = line
comment += 1
continue
# a section header or option header?
else:
# is it a section header?
@@ -381,27 +377,21 @@ def _read(self, fp, fpname):
raise cp.MissingSectionHeaderError(fpname, lineno, line)
# an option line?
else:
mo = self._optcre.match(line)
mo = self.OPTCRE.match(line)
if mo:
optname, vi, optval = mo.group('option', 'vi', 'value')
if vi in ('=', ':') and ';' in optval:
# ';' is a comment delimiter only if it follows
# a spacing character
pos = optval.find(';')
if pos != -1 and optval[pos-1].isspace():
optval = optval[:pos]
optval = optval.strip()
# allow empty values
if optval == '""':
optval = ''
optname = self.optionxform(optname.rstrip())
# This check is fine because the OPTCRE cannot
# match if it would set optval to None
if optval is not None:
if vi in ('=', ':') and ';' in optval:
# ';' is a comment delimiter only if it follows
# a spacing character
pos = optval.find(';')
if pos != -1 and optval[pos-1].isspace():
optval = optval[:pos]
optval = optval.strip()
# allow empty values
if optval == '""':
optval = ''
cursect[optname] = [optval]
else:
# valueless option handling
cursect[optname] = optval
cursect[optname] = optval
else:
# a non-fatal parsing error occurred. set up the
# exception but keep going. the exception will be
@@ -414,23 +404,13 @@ def _read(self, fp, fpname):
if e:
raise e
# join the multi-line values collected while reading
all_sections = [self._defaults]
all_sections.extend(self._sections.values())
for options in all_sections:
# skip comments
if isinstance(options, basestring):
continue
for name, val in options.items():
if isinstance(val, list):
options[name] = '\n'.join(val)
def _write(self, fp):
"""Write an .ini-format representation of the configuration state.
This is taken from the default Python 2.7 source. It writes 4
This is taken from the default Python 2.6 source. It writes 4
spaces at the beginning of lines instead of no leading space.
"""
if self._defaults:
@@ -449,11 +429,9 @@ def _write(self, fp):
# Allow leading whitespace
fp.write("[%s]\n" % section)
for (key, value) in self._sections[section].items():
if key == "__name__":
continue
if (value is not None) or (self._optcre == self.OPTCRE):
key = " = ".join((key, str(value).replace('\n', '\n\t')))
fp.write(" %s\n" % (key))
if key != "__name__":
fp.write(" %s = %s\n" %
(key, str(value).replace('\n', '\n\t')))
class SpackConfigurationError(spack.error.SpackError):

View File

@@ -27,9 +27,14 @@
import exceptions
import hashlib
import shutil
import tempfile
from contextlib import closing
import llnl.util.tty as tty
from llnl.util.lang import memoized
from llnl.util.filesystem import join_path, mkdirp
import spack
from spack.spec import Spec
from spack.error import SpackError
@@ -50,6 +55,19 @@ def __init__(self, root):
self.root = root
@property
def hidden_file_paths(self):
"""Return a list of hidden files used by the directory layout.
Paths are relative to the root of an install directory.
If the directory layout uses no hidden files to maintain
state, this should return an empty container, e.g. [] or ().
"""
raise NotImplementedError()
def all_specs(self):
"""To be implemented by subclasses to traverse all specs for which there is
a directory within the root.
@@ -68,6 +86,42 @@ def make_path_for_spec(self, spec):
raise NotImplementedError()
def extension_map(self, spec):
"""Get a dict of currently installed extension packages for a spec.
Dict maps { name : extension_spec }
Modifying dict does not affect internals of this layout.
"""
raise NotImplementedError()
def check_extension_conflict(self, spec, ext_spec):
"""Ensure that ext_spec can be activated in spec.
If not, raise ExtensionAlreadyInstalledError or
ExtensionConflictError.
"""
raise NotImplementedError()
def check_activated(self, spec, ext_spec):
"""Ensure that ext_spec can be removed from spec.
If not, raise NoSuchExtensionError.
"""
raise NotImplementedError()
def add_extension(self, spec, ext_spec):
"""Add to the list of currently installed extensions."""
raise NotImplementedError()
def remove_extension(self, spec, ext_spec):
"""Remove from the list of currently installed extensions."""
raise NotImplementedError()
def path_for_spec(self, spec):
"""Return an absolute path from the root to a directory for the spec."""
_check_concrete(spec)
@@ -78,12 +132,17 @@ def path_for_spec(self, spec):
def remove_path_for_spec(self, spec):
"""Removes a prefix and any empty parent directories from the root."""
"""Removes a prefix and any empty parent directories from the root.
Raises RemoveFailedError if something goes wrong.
"""
path = self.path_for_spec(spec)
assert(path.startswith(self.root))
if os.path.exists(path):
shutil.rmtree(path, True)
try:
shutil.rmtree(path)
except exceptions.OSError, e:
raise RemoveFailedError(spec, path, e)
path = os.path.dirname(path)
while path != self.root:
@@ -131,12 +190,18 @@ def __init__(self, root, **kwargs):
"""Prefix size is number of characters in the SHA-1 prefix to use
to make each hash unique.
"""
prefix_size = kwargs.get('prefix_size', 8)
spec_file = kwargs.get('spec_file', '.spec')
spec_file_name = kwargs.get('spec_file_name', '.spec')
extension_file_name = kwargs.get('extension_file_name', '.extensions')
super(SpecHashDirectoryLayout, self).__init__(root)
self.prefix_size = prefix_size
self.spec_file = spec_file
self.spec_file_name = spec_file_name
self.extension_file_name = extension_file_name
# Cache of already written/read extension maps.
self._extension_maps = {}
@property
def hidden_file_paths(self):
return ('.spec', '.extensions')
def relative_path_for_spec(self, spec):
@@ -154,15 +219,44 @@ def write_spec(self, spec, path):
def read_spec(self, path):
"""Read the contents of a file and parse them as a spec"""
with closing(open(path)) as spec_file:
string = spec_file.read().replace('\n', '')
return Spec(string)
# Specs from files are assumed normal and concrete
spec = Spec(spec_file.read().replace('\n', ''))
if all(spack.db.exists(s.name) for s in spec.traverse()):
copy = spec.copy()
# TODO: It takes a lot of time to normalize every spec on read.
# TODO: Storing graph info with spec files would fix this.
copy.normalize()
if copy.concrete:
return copy # These are specs spack still understands.
# If we get here, either the spec is no longer in spack, or
# something about its dependencies has changed. So we need to
# just assume the read spec is correct. We'll lose graph
# information if we do this, but this is just for best effort
# for commands like uninstall and find. Currently Spack
# doesn't do anything that needs the graph info after install.
# TODO: store specs with full connectivity information, so
# that we don't have to normalize or reconstruct based on
# changing dependencies in the Spack tree.
spec._normal = True
spec._concrete = True
return spec
def spec_file_path(self, spec):
"""Gets full path to spec file"""
_check_concrete(spec)
return join_path(self.path_for_spec(spec), self.spec_file_name)
def make_path_for_spec(self, spec):
_check_concrete(spec)
path = self.path_for_spec(spec)
spec_file_path = join_path(path, self.spec_file)
spec_file_path = self.spec_file_path(spec)
if os.path.isdir(path):
if not os.path.isfile(spec_file_path):
@@ -176,8 +270,7 @@ def make_path_for_spec(self, spec):
spec_hash = self.hash_spec(spec)
installed_hash = self.hash_spec(installed_spec)
if installed_hash == spec_hash:
raise SpecHashCollisionError(
installed_hash, spec_hash, self.prefix_size)
raise SpecHashCollisionError(installed_hash, spec_hash)
else:
raise InconsistentInstallDirectoryError(
'Spec file in %s does not match SHA-1 hash!'
@@ -187,17 +280,116 @@ def make_path_for_spec(self, spec):
self.write_spec(spec, spec_file_path)
@memoized
def all_specs(self):
if not os.path.isdir(self.root):
return
return []
specs = []
for path in traverse_dirs_at_depth(self.root, 3):
arch, compiler, last_dir = path
spec_file_path = join_path(
self.root, arch, compiler, last_dir, self.spec_file)
self.root, arch, compiler, last_dir, self.spec_file_name)
if os.path.exists(spec_file_path):
spec = self.read_spec(spec_file_path)
yield spec
specs.append(spec)
return specs
def extension_file_path(self, spec):
"""Gets full path to an installed package's extension file"""
_check_concrete(spec)
return join_path(self.path_for_spec(spec), self.extension_file_name)
def _extension_map(self, spec):
"""Get a dict<name -> spec> for all extensions currnetly
installed for this package."""
_check_concrete(spec)
if not spec in self._extension_maps:
path = self.extension_file_path(spec)
if not os.path.exists(path):
self._extension_maps[spec] = {}
else:
exts = {}
with closing(open(path)) as ext_file:
for line in ext_file:
try:
ext = Spec(line.strip())
exts[ext.name] = ext
except spack.error.SpackError, e:
# TODO: do something better here -- should be
# resilient to corrupt files.
raise InvalidExtensionSpecError(str(e))
self._extension_maps[spec] = exts
return self._extension_maps[spec]
def extension_map(self, spec):
"""Defensive copying version of _extension_map() for external API."""
return self._extension_map(spec).copy()
def check_extension_conflict(self, spec, ext_spec):
exts = self._extension_map(spec)
if ext_spec.name in exts:
installed_spec = exts[ext_spec.name]
if ext_spec == installed_spec:
raise ExtensionAlreadyInstalledError(spec, ext_spec)
else:
raise ExtensionConflictError(spec, ext_spec, installed_spec)
def check_activated(self, spec, ext_spec):
exts = self._extension_map(spec)
if (not ext_spec.name in exts) or (ext_spec != exts[ext_spec.name]):
raise NoSuchExtensionError(spec, ext_spec)
def _write_extensions(self, spec, extensions):
path = self.extension_file_path(spec)
# Create a temp file in the same directory as the actual file.
dirname, basename = os.path.split(path)
tmp = tempfile.NamedTemporaryFile(
prefix=basename, dir=dirname, delete=False)
# Write temp file.
with closing(tmp):
for extension in sorted(extensions.values()):
tmp.write("%s\n" % extension)
# Atomic update by moving tmpfile on top of old one.
os.rename(tmp.name, path)
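# Note: os.rename is atomic only when the temp file and the target
# live on the same filesystem, which is why the temp file is created
# in the target's directory rather than in /tmp.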
def add_extension(self, spec, ext_spec):
_check_concrete(spec)
_check_concrete(ext_spec)
# Check whether it's already installed or if it's a conflict.
exts = self._extension_map(spec)
self.check_extension_conflict(spec, ext_spec)
# do the actual adding.
exts[ext_spec.name] = ext_spec
self._write_extensions(spec, exts)
def remove_extension(self, spec, ext_spec):
_check_concrete(spec)
_check_concrete(ext_spec)
# Make sure it's installed before removing.
exts = self._extension_map(spec)
self.check_activated(spec, ext_spec)
# do the actual removing.
del exts[ext_spec.name]
self._write_extensions(spec, exts)
class DirectoryLayoutError(SpackError):
@@ -208,10 +400,19 @@ def __init__(self, message):
class SpecHashCollisionError(DirectoryLayoutError):
"""Raised when there is a hash collision in an SpecHashDirectoryLayout."""
def __init__(self, installed_spec, new_spec, prefix_size):
def __init__(self, installed_spec, new_spec):
super(SpecHashCollisionError, self).__init__(
'Specs %s and %s have the same %d character SHA-1 prefix!'
% prefix_size, installed_spec, new_spec)
'Specs %s and %s have the same SHA-1 prefix!'
% (installed_spec, new_spec))
class RemoveFailedError(DirectoryLayoutError):
"""Raised when a DirectoryLayout cannot remove an install prefix."""
def __init__(self, installed_spec, prefix, error):
super(RemoveFailedError, self).__init__(
'Could not remove prefix %s for %s: %s'
% (prefix, installed_spec.short_spec, error))
self.cause = error
class InconsistentInstallDirectoryError(DirectoryLayoutError):
@@ -225,3 +426,34 @@ class InstallDirectoryAlreadyExistsError(DirectoryLayoutError):
def __init__(self, path):
super(InstallDirectoryAlreadyExistsError, self).__init__(
"Install path %s already exists!")
class InvalidExtensionSpecError(DirectoryLayoutError):
"""Raised when an extension file has a bad spec in it."""
def __init__(self, message):
super(InvalidExtensionSpecError, self).__init__(message)
class ExtensionAlreadyInstalledError(DirectoryLayoutError):
"""Raised when an extension is added to a package that already has it."""
def __init__(self, spec, ext_spec):
super(ExtensionAlreadyInstalledError, self).__init__(
"%s is already installed in %s" % (ext_spec.short_spec, spec.short_spec))
class ExtensionConflictError(DirectoryLayoutError):
"""Raised when an extension is added to a package that already has it."""
def __init__(self, spec, ext_spec, conflict):
super(ExtensionConflictError, self).__init__(
"%s cannot be installed in %s because it conflicts with %s."% (
ext_spec.short_spec, spec.short_spec, conflict.short_spec))
class NoSuchExtensionError(DirectoryLayoutError):
"""Raised when an extension isn't there on deactivate."""
def __init__(self, spec, ext_spec):
super(NoSuchExtensionError, self).__init__(
"%s cannot be removed from %s because it's not activated."% (
ext_spec.short_spec, spec.short_spec))

View File

@@ -28,10 +28,17 @@ class SpackError(Exception):
Subclasses can be found in the modules they have to do with.
"""
def __init__(self, message, long_message=None):
super(SpackError, self).__init__(message)
super(SpackError, self).__init__()
self.message = message
self.long_message = long_message
def __str__(self):
msg = self.message
if self.long_message:
msg += "\n %s" % self.long_message
return msg
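With this change the long message is folded into str(); e.g. (hypothetical messages):
err = SpackError("Download failed", "Check the mirror URL.")
print str(err)   # "Download failed" plus an indented second line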
class UnsupportedPlatformError(SpackError):
"""Raised by packages when a platform is not supported"""
def __init__(self, message):

View File

@@ -0,0 +1,683 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""
Fetch strategies are used to download source code into a staging area
in order to build it. They need to define the following methods:
* fetch()
This should attempt to download/check out source from somewhere.
* check()
Apply a checksum to the downloaded source code, e.g. for an archive.
May not do anything if the fetch method was safe to begin with.
* expand()
Expand (e.g., an archive) downloaded file to source.
* reset()
Restore original state of downloaded code. Used by clean commands.
This may just remove the expanded source and re-expand an archive,
or it may run something like git reset --hard.
* archive()
Archive a source directory, e.g. for creating a mirror.
"""
import os
import sys
import re
import shutil
from functools import wraps
import llnl.util.tty as tty
from llnl.util.filesystem import *
import spack
import spack.error
import spack.util.crypto as crypto
from spack.util.executable import *
from spack.util.string import *
from spack.version import Version, ver
from spack.util.compression import decompressor_for, extension
"""List of all fetch strategies, created by FetchStrategy metaclass."""
all_strategies = []
def _needs_stage(fun):
"""Many methods on fetch strategies require a stage to be set
using set_stage(). This decorator adds a check for self.stage."""
@wraps(fun)
def wrapper(self, *args, **kwargs):
if not self.stage:
raise NoStageError(fun)
return fun(self, *args, **kwargs)
return wrapper
class FetchStrategy(object):
"""Superclass of all fetch strategies."""
enabled = False # Non-abstract subclasses should be enabled.
required_attributes = None # Attributes required in version() args.
class __metaclass__(type):
"""This metaclass registers all fetch strategies in a list."""
def __init__(cls, name, bases, dict):
type.__init__(cls, name, bases, dict)
if cls.enabled: all_strategies.append(cls)
def __init__(self):
# The stage is initialized late, so that fetch strategies can be constructed
# at package construction time. This is where things will be fetched.
self.stage = None
def set_stage(self, stage):
"""This is called by Stage before any of the fetching
methods are called on the stage."""
self.stage = stage
# Subclasses need to implement these methods
def fetch(self): pass # Return True on success, False on fail.
def check(self): pass # Do checksum.
def expand(self): pass # Expand archive.
def reset(self): pass # Revert to freshly downloaded state.
def archive(self, destination): pass # Used to create tarball for mirror.
def __str__(self): # Should be human readable URL.
return "FetchStrategy.__str___"
# This method is used to match fetch strategies to version()
# arguments in packages.
@classmethod
def matches(cls, args):
return any(k in args for k in cls.required_attributes)
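The metaclass plus matches() is what lets for_package_version() (later in this file) pick a strategy from version() keyword arguments. A hypothetical subclass (class name and the 'example' keyword are made up):
class ExampleFetchStrategy(FetchStrategy):
    enabled = True                     # metaclass adds it to all_strategies
    required_attributes = ['example']  # matched against version() kwargs
ExampleFetchStrategy.matches({'example': 'foo'})  # -> True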
class URLFetchStrategy(FetchStrategy):
"""FetchStrategy that pulls source code from a URL for an archive,
checks the archive against a checksum, and decompresses the archive.
"""
enabled = True
required_attributes = ['url']
def __init__(self, url=None, digest=None, **kwargs):
super(URLFetchStrategy, self).__init__()
# If URL or digest are provided in the kwargs, then prefer
# those values.
self.url = kwargs.get('url', None)
if not self.url: self.url = url
self.digest = kwargs.get('md5', None)
if not self.digest: self.digest = digest
if not self.url:
raise ValueError("URLFetchStrategy requires a url for fetching.")
@_needs_stage
def fetch(self):
self.stage.chdir()
if self.archive_file:
tty.msg("Already downloaded %s." % self.archive_file)
return
tty.msg("Trying to fetch from %s" % self.url)
curl_args = ['-O', # save file to disk
'-f', # fail on >400 errors
'-D', '-', # print out HTTP headers
'-L', self.url,]
if sys.stdout.isatty():
curl_args.append('-#') # status bar when using a tty
else:
curl_args.append('-sS') # just errors when not.
# Run curl but grab the mime type from the http headers
headers = spack.curl(
*curl_args, return_output=True, fail_on_error=False)
if spack.curl.returncode != 0:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
if spack.curl.returncode == 22:
# This is a 404. Curl will print the error.
raise FailedDownloadError(
self.url, "URL %s was not found!" % self.url)
elif spack.curl.returncode == 60:
# This is a certificate error. Suggest spack -k
raise FailedDownloadError(
self.url,
"Curl was unable to fetch due to invalid certificate. "
"This is either an attack, or your cluster's SSL configuration "
"is bad. If you believe your SSL configuration is bad, you "
"can try running spack -k, which will not check SSL certificates."
"Use this at your own risk.")
else:
# This is some other curl error. Curl will print the
# error, but print a spack message too
raise FailedDownloadError(
self.url, "Curl failed with error %d" % spack.curl.returncode)
# Check if we somehow got an HTML file rather than the archive we
# asked for. We only look at the last content type, to handle
# redirects properly.
content_types = re.findall(r'Content-Type:[^\r\n]+', headers)
if content_types and 'text/html' in content_types[-1]:
tty.warn("The contents of " + self.archive_file + " look like HTML.",
"The checksum will likely be bad. If it is, you can use",
"'spack clean --dist' to remove the bad archive, then fix",
"your internet gateway issue and install again.")
if not self.archive_file:
raise FailedDownloadError(self.url)
@property
def archive_file(self):
"""Path to the source archive within this stage directory."""
return self.stage.archive_file
@_needs_stage
def expand(self):
tty.msg("Staging archive: %s" % self.archive_file)
self.stage.chdir()
if not self.archive_file:
raise NoArchiveFileError("URLFetchStrategy couldn't find archive file",
"Failed on expand() for URL %s" % self.url)
decompress = decompressor_for(self.archive_file)
# Expand all tarballs in their own directory to contain
# exploding tarballs.
tarball_container = os.path.join(self.stage.path, "spack-expanded-archive")
mkdirp(tarball_container)
os.chdir(tarball_container)
decompress(self.archive_file)
# If the tarball *didn't* explode, move
# the expanded directory up & remove the protector directory.
files = os.listdir(tarball_container)
if len(files) == 1:
expanded_dir = os.path.join(tarball_container, files[0])
if os.path.isdir(expanded_dir):
shutil.move(expanded_dir, self.stage.path)
os.rmdir(tarball_container)
# Set the wd back to the stage when done.
self.stage.chdir()
def archive(self, destination):
"""Just moves this archive to the destination."""
if not self.archive_file:
raise NoArchiveFileError("Cannot call archive() before fetching.")
if not extension(destination) == extension(self.archive_file):
raise ValueError("Cannot archive without matching extensions.")
shutil.move(self.archive_file, destination)
@_needs_stage
def check(self):
"""Check the downloaded archive against a checksum digest.
No-op if this stage checks code out of a repository."""
if not self.digest:
raise NoDigestError("Attempt to check URLFetchStrategy with no digest.")
checker = crypto.Checker(self.digest)
if not checker.check(self.archive_file):
raise ChecksumError(
"%s checksum failed for %s." % (checker.hash_name, self.archive_file),
"Expected %s but got %s." % (self.digest, checker.sum))
@_needs_stage
def reset(self):
"""Removes the source path if it exists, then re-expands the archive."""
if not self.archive_file:
raise NoArchiveFileError("Tried to reset URLFetchStrategy before fetching",
"Failed on reset() for URL %s" % self.url)
if self.stage.source_path:
shutil.rmtree(self.stage.source_path, ignore_errors=True)
self.expand()
def __repr__(self):
url = self.url if self.url else "no url"
return "URLFetchStrategy<%s>" % url
def __str__(self):
if self.url:
return self.url
else:
return "[no url]"
class VCSFetchStrategy(FetchStrategy):
def __init__(self, name, *rev_types, **kwargs):
super(VCSFetchStrategy, self).__init__()
self.name = name
# Set a URL based on the type of fetch strategy.
self.url = kwargs.get(name, None)
if not self.url: raise ValueError(
"%s requires %s argument." % (self.__class__, name))
# Ensure that there's only one of the rev_types
if sum(k in kwargs for k in rev_types) > 1:
raise FetchStrategyError(
"Supply only one of %s to fetch with %s." % (
comma_or(rev_types), name))
# Set attributes for each rev type.
for rt in rev_types:
setattr(self, rt, kwargs.get(rt, None))
@_needs_stage
def check(self):
tty.msg("No checksum needed when fetching with %s." % self.name)
@_needs_stage
def expand(self):
tty.debug("Source fetched with %s is already expanded." % self.name)
@_needs_stage
def archive(self, destination, **kwargs):
assert(extension(destination) == 'tar.gz')
assert(self.stage.source_path.startswith(self.stage.path))
tar = which('tar', required=True)
patterns = kwargs.get('exclude', None)
if patterns is not None:
if isinstance(patterns, basestring):
patterns = [patterns]
for p in patterns:
tar.add_default_arg('--exclude=%s' % p)
self.stage.chdir()
tar('-czf', destination, os.path.basename(self.stage.source_path))
def __str__(self):
return "VCS: %s" % self.url
def __repr__(self):
return "%s<%s>" % (self.__class__, self.url)
class GitFetchStrategy(VCSFetchStrategy):
"""Fetch strategy that gets source code from a git repository.
Use like this in a package:
version('name', git='https://github.com/project/repo.git')
Optionally, you can provide a branch, or commit to check out, e.g.:
version('1.1', git='https://github.com/project/repo.git', tag='v1.1')
You can use these three optional attributes in addition to ``git``:
* ``branch``: Particular branch to build from (default is master)
* ``tag``: Particular tag to check out
* ``commit``: Particular commit hash in the repo
"""
enabled = True
required_attributes = ('git',)
def __init__(self, **kwargs):
super(GitFetchStrategy, self).__init__(
'git', 'tag', 'branch', 'commit', **kwargs)
self._git = None
# For git, branches and tags are fetched the same way.
if not self.branch:
self.branch = self.tag
@property
def git_version(self):
vstring = self.git('--version', return_output=True).lstrip('git version ')
return Version(vstring)
@property
def git(self):
if not self._git:
self._git = which('git', required=True)
return self._git
@_needs_stage
def fetch(self):
self.stage.chdir()
if self.stage.source_path:
tty.msg("Already fetched %s." % self.stage.source_path)
return
args = []
if self.commit:
args.append('at commit %s' % self.commit)
elif self.tag:
args.append('at tag %s' % self.tag)
elif self.branch:
args.append('on branch %s' % self.branch)
tty.msg("Trying to clone git repository:", self.url, *args)
if self.commit:
# Need to do a regular clone and check out everything if
# they asked for a particular commit.
self.git('clone', self.url)
self.stage.chdir_to_source()
self.git('checkout', self.commit)
else:
# Can be more efficient if not checking out a specific commit.
args = ['clone']
# If we want a particular branch ask for it.
if self.branch:
args.extend(['--branch', self.branch])
# Try to be efficient if we're using a new enough git.
# This checks out only one branch's history
if self.git_version > ver('1.7.10'):
args.append('--single-branch')
args.append(self.url)
self.git(*args)
self.stage.chdir_to_source()
def archive(self, destination):
super(GitFetchStrategy, self).archive(destination, exclude='.git')
@_needs_stage
def reset(self):
self.stage.chdir_to_source()
self.git('checkout', '.')
self.git('clean', '-f')
def __str__(self):
return "[git] %s" % self.url
class SvnFetchStrategy(VCSFetchStrategy):
"""Fetch strategy that gets source code from a subversion repository.
Use like this in a package:
version('name', svn='http://www.example.com/svn/trunk')
Optionally, you can provide a revision for the URL:
version('name', svn='http://www.example.com/svn/trunk',
revision='1641')
"""
enabled = True
required_attributes = ['svn']
def __init__(self, **kwargs):
super(SvnFetchStrategy, self).__init__(
'svn', 'revision', **kwargs)
self._svn = None
if self.revision is not None:
self.revision = str(self.revision)
@property
def svn(self):
if not self._svn:
self._svn = which('svn', required=True)
return self._svn
@_needs_stage
def fetch(self):
self.stage.chdir()
if self.stage.source_path:
tty.msg("Already fetched %s." % self.stage.source_path)
return
tty.msg("Trying to check out svn repository: %s" % self.url)
args = ['checkout', '--force']
if self.revision:
args += ['-r', self.revision]
args.append(self.url)
self.svn(*args)
self.stage.chdir_to_source()
def _remove_untracked_files(self):
"""Removes untracked files in an svn repository."""
status = self.svn('status', '--no-ignore', return_output=True)
self.svn('status', '--no-ignore')
for line in status.split('\n'):
if not re.match('^[I?]', line):
continue
path = line[8:].strip()
if os.path.isfile(path):
os.unlink(path)
elif os.path.isdir(path):
shutil.rmtree(path, ignore_errors=True)
def archive(self, destination):
super(SvnFetchStrategy, self).archive(destination, exclude='.svn')
@_needs_stage
def reset(self):
self.stage.chdir_to_source()
self._remove_untracked_files()
self.svn('revert', '.', '-R')
def __str__(self):
return "[svn] %s" % self.url
class HgFetchStrategy(VCSFetchStrategy):
"""Fetch strategy that gets source code from a Mercurial repository.
Use like this in a package:
version('name', hg='https://jay.grs.rwth-aachen.de/hg/lwm2')
Optionally, you can provide a branch, or revision to check out, e.g.:
version('torus', hg='https://jay.grs.rwth-aachen.de/hg/lwm2', branch='torus')
You can use the optional 'revision' attribute to check out a
branch, tag, or particular revision in hg. To prevent
non-reproducible builds, using a moving target like a branch is
discouraged.
* ``revision``: Particular revision, branch, or tag.
"""
enabled = True
required_attributes = ['hg']
def __init__(self, **kwargs):
super(HgFetchStrategy, self).__init__(
'hg', 'revision', **kwargs)
self._hg = None
@property
def hg(self):
if not self._hg:
self._hg = which('hg', required=True)
return self._hg
@_needs_stage
def fetch(self):
self.stage.chdir()
if self.stage.source_path:
tty.msg("Already fetched %s." % self.stage.source_path)
return
args = []
if self.revision:
args.append('at revision %s' % self.revision)
tty.msg("Trying to clone Mercurial repository:", self.url, *args)
args = ['clone', self.url]
if self.revision:
args += ['-r', self.revision]
self.hg(*args)
def archive(self, destination):
super(HgFetchStrategy, self).archive(destination, exclude='.hg')
@_needs_stage
def reset(self):
self.stage.chdir()
source_path = self.stage.source_path
scrubbed = "scrubbed-source-tmp"
args = ['clone']
if self.revision:
args += ['-r', self.revision]
args += [source_path, scrubbed]
self.hg(*args)
shutil.rmtree(source_path, ignore_errors=True)
shutil.move(scrubbed, source_path)
self.stage.chdir_to_source()
def __str__(self):
return "[hg] %s" % self.url
def from_url(url):
"""Given a URL, find an appropriate fetch strategy for it.
Currently just gives you a URLFetchStrategy that uses curl.
TODO: make this return appropriate fetch strategies for other
types of URLs.
"""
return URLFetchStrategy(url)
def args_are_for(args, fetcher):
return fetcher.matches(args)
def for_package_version(pkg, version):
"""Determine a fetch strategy based on the arguments supplied to
version() in the package description."""
# If it's not a known version, extrapolate one.
if not version in pkg.versions:
url = pkg.url_for_version(version)
if not url:
raise InvalidArgsError(pkg, version)
return URLFetchStrategy(url)
# Grab a dict of args out of the package version dict
args = pkg.versions[version]
# Test all strategies against per-version arguments.
for fetcher in all_strategies:
if fetcher.matches(args):
return fetcher(**args)
# If nothing matched for a *specific* version, test all strategies
# against attributes set on the package itself.
for fetcher in all_strategies:
attrs = dict((attr, getattr(pkg, attr, None))
for attr in fetcher.required_attributes)
if 'url' in attrs:
attrs['url'] = pkg.url_for_version(version)
attrs.update(args)
if fetcher.matches(attrs):
return fetcher(**attrs)
raise InvalidArgsError(pkg, version)
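For instance (hypothetical version() arguments), a version declared with a git URL and tag resolves to GitFetchStrategy:
args = {'git': 'https://github.com/project/repo.git', 'tag': 'v1.1'}
fetcher = next(f for f in all_strategies if f.matches(args))
fetcher(**args)   # -> GitFetchStrategy that checks out tag v1.1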
class FetchError(spack.error.SpackError):
def __init__(self, msg, long_msg=None):
super(FetchError, self).__init__(msg, long_msg)
class FailedDownloadError(FetchError):
"""Raised wen a download fails."""
def __init__(self, url, msg=""):
super(FailedDownloadError, self).__init__(
"Failed to fetch file from URL: %s" % url, msg)
self.url = url
class NoArchiveFileError(FetchError):
def __init__(self, msg, long_msg=None):
super(NoArchiveFileError, self).__init__(msg, long_msg)
class NoDigestError(FetchError):
def __init__(self, msg, long_msg=None):
super(NoDigestError, self).__init__(msg, long_msg)
class InvalidArgsError(FetchError):
def __init__(self, pkg, version):
msg = "Could not construct a fetch strategy for package %s at version %s"
msg %= (pkg.name, version)
super(InvalidArgsError, self).__init__(msg)
class ChecksumError(FetchError):
"""Raised when archive fails to checksum."""
def __init__(self, message, long_msg=None):
super(ChecksumError, self).__init__(message, long_msg)
class NoStageError(FetchError):
"""Raised when fetch operations are called before set_stage()."""
def __init__(self, method):
super(NoStageError, self).__init__(
"Must call FetchStrategy.set_stage() before calling %s" % method.__name__)

lib/spack/spack/graph.py (new file, 553 lines)
View File

@@ -0,0 +1,553 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""Functions for graphing DAGs of dependencies.
This file contains code for graphing DAGs of software packages
(i.e. Spack specs). There are two main functions you probably care
about:
graph_ascii() will output a colored graph of a spec in ascii format,
kind of like the graph git shows with "git log --graph", e.g.::
o mpileaks
|\
| |\
| o | callpath
|/| |
| |\|
| |\ \
| | |\ \
| | | | o adept-utils
| |_|_|/|
|/| | | |
o | | | | mpi
/ / / /
| | o | dyninst
| |/| |
|/|/| |
| | |/
| o | libdwarf
|/ /
o | libelf
/
o boost
graph_dot() will output a graph of a spec (or multiple specs) in dot
format.
Note that ``graph_ascii`` assumes a single spec while ``graph_dot``
can take a number of specs as input.
"""
__all__ = ['topological_sort', 'graph_ascii', 'AsciiGraph', 'graph_dot']
import sys
from heapq import *
from llnl.util.lang import *
from llnl.util.tty.color import *
import spack
from spack.spec import Spec
def topological_sort(spec, **kwargs):
"""Topological sort for specs.
Return a list of dependency specs sorted topologically. The spec
argument is not modified in the process.
"""
reverse = kwargs.get('reverse', False)
if not reverse:
parents = lambda s: s.dependents
children = lambda s: s.dependencies
else:
parents = lambda s: s.dependencies
children = lambda s: s.dependents
# Work on a copy so this is nondestructive.
spec = spec.copy()
nodes = spec.index()
topo_order = []
remaining = [name for name in nodes.keys() if not parents(nodes[name])]
heapify(remaining)
while remaining:
name = heappop(remaining)
topo_order.append(name)
node = nodes[name]
for dep in children(node).values():
del parents(dep)[node.name]
if not parents(dep):
heappush(remaining, dep.name)
if any(parents(s) for s in spec.traverse()):
raise ValueError("Spec has cycles!")
else:
return topo_order
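Usage sketch, assuming a normalized spec and the libdwarf -> libelf dependency from the module docstring above: the default order lists each package before its dependencies.
spec = Spec('libdwarf').normalized()
topological_sort(spec)                 # -> ['libdwarf', 'libelf']
topological_sort(spec, reverse=True)   # -> ['libelf', 'libdwarf']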
def find(seq, predicate):
"""Find index in seq for which predicate is True.
Searches the sequence and returns the index of the element for
which the predicate evaluates to True. Returns -1 if the
predicate does not evaluate to True for any element in seq.
"""
for i, elt in enumerate(seq):
if predicate(elt):
return i
return -1
# Names of different graph line states. We record previous line
# states so that we can easily determine what to do when connecting.
states = ('node', 'collapse', 'merge-right', 'expand-right', 'back-edge')
NODE, COLLAPSE, MERGE_RIGHT, EXPAND_RIGHT, BACK_EDGE = states
class AsciiGraph(object):
def __init__(self):
# These can be set after initialization or after a call to
# graph() to change behavior.
self.node_character = '*'
self.debug = False
self.indent = 0
# These are colors in the order they'll be used for edges.
# See llnl.util.tty.color for details on color characters.
self.colors = 'rgbmcyRGBMCY'
# Internal vars are used in the graph() function and are
# properly initialized there.
self._name_to_color = None # Node name to color
self._out = None # Output stream
self._frontier = None # frontier
self._nodes = None # dict from name -> node
self._prev_state = None # State of previous line
self._prev_index = None # Index of expansion point of prev line
def _indent(self):
self._out.write(self.indent * ' ')
def _write_edge(self, string, index, sub=0):
"""Write a colored edge to the output stream."""
name = self._frontier[index][sub]
edge = "@%s{%s}" % (self._name_to_color[name], string)
self._out.write(edge)
def _connect_deps(self, i, deps, label=None):
"""Connect dependencies to existing edges in the frontier.
``deps`` are to be inserted at position i in the
frontier. This routine determines whether other open edges
should be merged with <deps> (if there are other open edges
pointing to the same place) or whether they should just be
inserted as a completely new open edge.
Open edges that are not fully expanded (i.e. those that point
at multiple places) are left intact.
Parameters:
label -- optional debug label for the connection.
Returns: True if the deps were connected to another edge
(i.e. the frontier did not grow) and False if the deps were
NOT already in the frontier (i.e. they were inserted and the
frontier grew).
"""
if len(deps) == 1 and deps in self._frontier:
j = self._frontier.index(deps)
# convert a right connection into a left connection
if i < j:
self._frontier.pop(j)
self._frontier.insert(i, deps)
return self._connect_deps(j, deps, label)
collapse = True
if self._prev_state == EXPAND_RIGHT:
# Special case where previous line expanded and i is off by 1.
self._back_edge_line([], j, i+1, True, label + "-1.5 " + str((i+1,j)))
collapse = False
else:
# Previous node also expanded here, so i is off by one.
if self._prev_state == NODE and self._prev_index < i:
i += 1
if i-j > 1:
# We need two lines to connect if distance > 1
self._back_edge_line([], j, i, True, label + "-1 " + str((i,j)))
collapse = False
self._back_edge_line([j], -1, -1, collapse, label + "-2 " + str((i,j)))
return True
elif deps:
self._frontier.insert(i, deps)
return False
def _set_state(self, state, index, label=None):
if state not in states:
raise ValueError("Invalid graph state!")
self._prev_state = state
self._prev_index = index
if self.debug:
self._out.write(" " * 20)
self._out.write("%-20s" % (
str(self._prev_state) if self._prev_state else ''))
self._out.write("%-20s" % (str(label) if label else ''))
self._out.write("%s" % self._frontier)
def _back_edge_line(self, prev_ends, end, start, collapse, label=None):
"""Write part of a backwards edge in the graph.
Writes single- or multi-line backward edges in an ascii graph.
For example, a single line edge::
| | | | o |
| | | |/ / <-- single-line edge connects two nodes.
| | | o |
Or a multi-line edge (requires two calls to back_edge)::
| | | | o |
| |_|_|/ / <-- multi-line edge crosses vertical edges.
|/| | | |
o | | | |
Also handles "pipelined" edges, where the same line contains
parts of multiple edges::
o start
| |_|_|_|/|
|/| | |_|/| <-- this line has parts of 2 edges.
| | |/| | |
o o
Arguments:
prev_ends -- indices in frontier of previous edges that need
to be finished on this line.
end -- end of the current edge on this line.
start -- start index of the current edge.
collapse -- whether the graph will be collapsing (i.e. whether
to slant the end of the line or keep it straight)
label -- optional debug label to print after the line.
"""
def advance(to_pos, edges):
"""Write edges up to <to_pos>."""
for i in range(self._pos, to_pos):
for e in edges():
self._write_edge(*e)
self._pos += 1
flen = len(self._frontier)
self._pos = 0
self._indent()
for p in prev_ends:
advance(p, lambda: [("| ", self._pos)] )
advance(p+1, lambda: [("|/", self._pos)] )
if end >= 0:
advance(end + 1, lambda: [("| ", self._pos)] )
advance(start - 1, lambda: [("|", self._pos), ("_", end)] )
else:
advance(start - 1, lambda: [("| ", self._pos)] )
if start >= 0:
advance(start, lambda: [("|", self._pos), ("/", end)] )
if collapse:
advance(flen, lambda: [(" /", self._pos)] )
else:
advance(flen, lambda: [("| ", self._pos)] )
self._set_state(BACK_EDGE, end, label)
self._out.write("\n")
def _node_line(self, index, name):
"""Writes a line with a node at index."""
self._indent()
for c in range(index):
self._write_edge("| ", c)
self._out.write("%s " % self.node_character)
for c in range(index+1, len(self._frontier)):
self._write_edge("| ", c)
self._out.write(" %s" % name)
self._set_state(NODE, index)
self._out.write("\n")
def _collapse_line(self, index):
"""Write a collapsing line after a node was added at index."""
self._indent()
for c in range(index):
self._write_edge("| ", c)
for c in range(index, len(self._frontier)):
self._write_edge(" /", c)
self._set_state(COLLAPSE, index)
self._out.write("\n")
def _merge_right_line(self, index):
"""Edge at index is same as edge to right. Merge directly with '\'"""
self._indent()
for c in range(index):
self._write_edge("| ", c)
self._write_edge("|", index)
self._write_edge("\\", index+1)
for c in range(index+1, len(self._frontier)):
self._write_edge("| ", c )
self._set_state(MERGE_RIGHT, index)
self._out.write("\n")
def _expand_right_line(self, index):
self._indent()
for c in range(index):
self._write_edge("| ", c)
self._write_edge("|", index)
self._write_edge("\\", index+1)
for c in range(index+2, len(self._frontier)):
self._write_edge(" \\", c)
self._set_state(EXPAND_RIGHT, index)
self._out.write("\n")
def write(self, spec, **kwargs):
"""Write out an ascii graph of the provided spec.
Arguments:
spec -- spec to graph. This only handles one spec at a time.
Optional arguments:
out -- file object to write out to (default is sys.stdout)
color -- whether to write in color. Default is to autodetect
based on output file.
"""
out = kwargs.get('out', None)
if not out:
out = sys.stdout
color = kwargs.get('color', None)
if not color:
color = out.isatty()
self._out = ColorStream(out, color=color)
# We'll traverse the spec in topo order as we graph it.
topo_order = topological_sort(spec, reverse=True)
# Work on a copy to be nondestructive
spec = spec.copy()
self._nodes = spec.index()
# Colors associated with each node in the DAG.
# Edges are colored by the node they point to.
self._name_to_color = dict((name, self.colors[i % len(self.colors)])
for i, name in enumerate(topo_order))
# Frontier tracks open edges of the graph as it's written out.
self._frontier = [[spec.name]]
while self._frontier:
# Find an unexpanded part of frontier
i = find(self._frontier, lambda f: len(f) > 1)
if i >= 0:
# Expand frontier until there are enough columns for all children.
# Figure out how many back connections there are and
# sort them so we do them in order
back = []
for d in self._frontier[i]:
b = find(self._frontier[:i], lambda f: f == [d])
if b != -1:
back.append((b, d))
# Do all back connections in sorted order so we can
# pipeline them and save space.
if back:
back.sort()
prev_ends = []
for j, (b, d) in enumerate(back):
self._frontier[i].remove(d)
if i-b > 1:
self._back_edge_line(prev_ends, b, i, False, 'left-1')
del prev_ends[:]
prev_ends.append(b)
# Check whether we did ALL the deps as back edges,
# in which case we're done.
collapse = not self._frontier[i]
if collapse:
self._frontier.pop(i)
self._back_edge_line(prev_ends, -1, -1, collapse, 'left-2')
elif len(self._frontier[i]) > 1:
# Expand forward after doing all back connections
if (i+1 < len(self._frontier) and len(self._frontier[i+1]) == 1
and self._frontier[i+1][0] in self._frontier[i]):
# We need to connect to the element to the right.
# Keep lines straight by connecting directly and
# avoiding unnecessary expand/contract.
name = self._frontier[i+1][0]
self._frontier[i].remove(name)
self._merge_right_line(i)
else:
# Just allow the expansion here.
name = self._frontier[i].pop(0)
deps = [name]
self._frontier.insert(i, deps)
self._expand_right_line(i)
self._frontier.pop(i)
self._connect_deps(i, deps, "post-expand")
# Handle any remaining back edges to the right
j = i+1
while j < len(self._frontier):
deps = self._frontier.pop(j)
if not self._connect_deps(j, deps, "back-from-right"):
j += 1
else:
# Nothing to expand; add dependencies for a node.
name = topo_order.pop()
node = self._nodes[name]
# Find the named node in the frontier and draw it.
i = find(self._frontier, lambda f: name in f)
self._node_line(i, name)
# Replace node with its dependencies
self._frontier.pop(i)
if node.dependencies:
deps = sorted((d for d in node.dependencies), reverse=True)
self._connect_deps(i, deps, "new-deps") # anywhere.
elif self._frontier:
self._collapse_line(i)
def graph_ascii(spec, **kwargs):
node_character = kwargs.pop('node', 'o')
out = kwargs.pop('out', None)
debug = kwargs.pop('debug', False)
indent = kwargs.pop('indent', 0)
color = kwargs.pop('color', None)
check_kwargs(kwargs, graph_ascii)
graph = AsciiGraph()
graph.debug = debug
graph.indent = indent
graph.node_character = node_character
graph.write(spec, color=color, out=out)
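A hedged usage sketch (the package name is just an example); concretizing first gives a complete DAG to draw:
spec = Spec('mpileaks')
spec.concretize()
graph_ascii(spec, node='o', indent=2)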
def graph_dot(*specs, **kwargs):
"""Generate a graph in dot format of all provided specs.
Print out a dot formatted graph of all the dependencies between
package. Output can be passed to graphviz, e.g.:
spack graph --dot qt | dot -Tpdf > spack-graph.pdf
"""
out = kwargs.pop('out', sys.stdout)
check_kwargs(kwargs, graph_dot)
out.write('digraph G {\n')
out.write(' label = "Spack Dependencies"\n')
out.write(' labelloc = "b"\n')
out.write(' rankdir = "LR"\n')
out.write(' ranksep = "5"\n')
out.write('\n')
def quote(string):
return '"%s"' % string
if not specs:
specs = [p.name for p in spack.db.all_packages()]
else:
roots = specs
specs = set()
for spec in roots:
specs.update(Spec(s.name) for s in spec.normalized().traverse())
deps = []
for spec in specs:
out.write(' %-30s [label="%s"]\n' % (quote(spec.name), spec.name))
# Skip virtual specs (we'll find out about them from concrete ones.)
if spec.virtual:
continue
# Add edges for each depends_on in the package.
for dep_name, dep in spec.package.dependencies.iteritems():
deps.append((spec.name, dep_name))
# If the package provides something, add an edge for that.
for provider in set(s.name for s in spec.package.provided):
deps.append((provider, spec.name))
out.write('\n')
for pair in deps:
out.write(' "%s" -> "%s"\n' % pair)
out.write('}\n')

View File

@@ -31,7 +31,9 @@
Currently the following hooks are supported:
* pre_install()
* post_install()
* pre_uninstall()
* post_uninstall()
This can be used to implement support for things like module
@@ -47,8 +49,11 @@
def all_hook_modules():
modules = []
for name in list_modules(spack.hooks_path):
mod_name = __name__ + '.' + name
path = join_path(spack.hooks_path, name) + ".py"
modules.append(imp.load_source('spack.hooks', path))
mod = imp.load_source(mod_name, path)
modules.append(mod)
return modules
@@ -67,5 +72,8 @@ def __call__(self, pkg):
#
# Define some functions that can be called to fire off hooks.
#
post_install = HookRunner('post_install')
pre_install = HookRunner('pre_install')
post_install = HookRunner('post_install')
pre_uninstall = HookRunner('pre_uninstall')
post_uninstall = HookRunner('post_uninstall')
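# Illustrative sketch of a hook module (filename and body assumed): any .py file
# dropped into spack.hooks_path is discovered by all_hook_modules() above, and
# the HookRunners fire whichever of the four supported functions it defines:
#
#   def post_install(pkg):
#       print "installed %s" % pkg.spec.short_spec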

View File

@@ -22,62 +22,14 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import re
import textwrap
import shutil
from contextlib import closing
from llnl.util.filesystem import join_path, mkdirp
import spack
def dotkit_file(pkg):
dk_file_name = pkg.spec.format('$_$@$%@$+$=$#') + ".dk"
return join_path(spack.dotkit_path, dk_file_name)
import spack.modules
def post_install(pkg):
if not os.path.exists(spack.dotkit_path):
mkdirp(spack.dotkit_path)
alterations = []
for var, path in [
('PATH', pkg.prefix.bin),
('MANPATH', pkg.prefix.man),
('MANPATH', pkg.prefix.share_man),
('LD_LIBRARY_PATH', pkg.prefix.lib),
('LD_LIBRARY_PATH', pkg.prefix.lib64)]:
if os.path.isdir(path):
alterations.append("dk_alter %s %s\n" % (var, path))
if not alterations:
return
alterations.append("dk_alter CMAKE_PREFIX_PATH %s\n" % pkg.prefix)
dk_file = dotkit_file(pkg)
with closing(open(dk_file, 'w')) as dk:
# Put everything in the spack category.
dk.write('#c spack\n')
dk.write('#d %s\n' % pkg.spec.format("$_ $@"))
# Recycle the description
if pkg.__doc__:
doc = re.sub(r'\s+', ' ', pkg.__doc__)
for line in textwrap.wrap(doc, 72):
dk.write("#h %s\n" % line)
# Write alterations
for alter in alterations:
dk.write(alter)
dk = spack.modules.Dotkit(pkg.spec)
dk.write()
def post_uninstall(pkg):
dk_file = dotkit_file(pkg)
if os.path.exists(dk_file):
shutil.rmtree(dk_file, ignore_errors=True)
dk = spack.modules.Dotkit(pkg.spec)
dk.remove()

View File

@@ -0,0 +1,36 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import spack
def pre_uninstall(pkg):
# Need to do this b/c uninstall does not automatically do it.
# TODO: store full graph info in stored .spec file.
pkg.spec.normalize()
if pkg.is_extension:
if pkg.activated:
pkg.do_deactivate(force=True)

View File

@@ -0,0 +1,35 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by David Beckingsale, david@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import spack.modules
def post_install(pkg):
dk = spack.modules.TclModule(pkg.spec)
dk.write()
def post_uninstall(pkg):
dk = spack.modules.TclModule(pkg.spec)
dk.remove()

lib/spack/spack/mirror.py Normal file
View File

@@ -0,0 +1,193 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""
This file contains code for creating spack mirror directories. A
mirror is an organized hierarchy containing specially named archive
files. This enables spack to know where to find files in a mirror if
the main server for a particular package is down. Or, if the computer
where spack is run is not connected to the internet, it allows spack
to download packages directly from a mirror (e.g., on an intranet).
"""
import sys
import os
import llnl.util.tty as tty
from llnl.util.filesystem import *
import spack
import spack.error
import spack.url as url
import spack.fetch_strategy as fs
from spack.spec import Spec
from spack.stage import Stage
from spack.version import *
from spack.util.compression import extension
def mirror_archive_filename(spec):
"""Get the name of the spec's archive in the mirror."""
if not spec.version.concrete:
raise ValueError("mirror.path requires spec with concrete version.")
fetcher = spec.package.fetcher
if isinstance(fetcher, fs.URLFetchStrategy):
# If we fetch this version with a URLFetchStrategy, use URL's archive type
ext = url.downloaded_file_extension(fetcher.url)
else:
# Otherwise we'll make a .tar.gz ourselves
ext = 'tar.gz'
return "%s-%s.%s" % (spec.package.name, spec.version, ext)
def mirror_archive_path(spec):
"""Get the relative path to the spec's archive within a mirror."""
return join_path(spec.name, mirror_archive_filename(spec))
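# Illustrative sketch (spec assumed): for a concrete spec such as libelf@0.8.13
# fetched from a .tar.gz URL, the two functions above yield
#   mirror_archive_filename(spec)  ->  "libelf-0.8.13.tar.gz"
#   mirror_archive_path(spec)      ->  "libelf/libelf-0.8.13.tar.gz"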
def get_matching_versions(specs, **kwargs):
"""Get a spec for EACH known version matching any spec in the list."""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s." % pkg.name)
continue
num_versions = kwargs.get('num_versions', 0)
for i, v in enumerate(reversed(sorted(pkg.versions))):
# Generate no more than num_versions versions for each spec.
if num_versions and i >= num_versions:
break
# Generate only versions that satisfy the spec.
if v.satisfies(spec.versions):
s = Spec(pkg.name)
s.versions = VersionList([v])
matching.append(s)
return matching
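# Usage sketch (package and version numbers assumed): with num_versions=2,
#   get_matching_versions([Spec('mpich@3:')], num_versions=2)
# returns at most the two newest checksummed versions satisfying the range,
# e.g. [Spec('mpich@3.1'), Spec('mpich@3.0.4')].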
def create(path, specs, **kwargs):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path Path to create a mirror directory hierarchy in.
specs Any package versions matching these specs will be added
to the mirror.
Keyword args:
no_checksum: If True, do not checksum when fetching (default False)
num_versions: Max number of versions to fetch per spec,
if spec is ambiguous (default is 0 for all of them)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
This routine iterates through all known package versions, and
it creates specs for those versions. If the version satisfies any spec
in the specs list, it is downloaded and added to the mirror.
"""
# Make sure nothing is in the way.
if os.path.isfile(path):
raise MirrorError("%s already exists and is a file." % path)
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, Spec) else Spec(s) for s in specs]
# Get concrete specs for each matching version of these specs.
version_specs = get_matching_versions(
specs, num_versions=kwargs.get('num_versions', 0))
for s in version_specs:
s.concretize()
# Get the absolute path of the root before we start jumping around.
mirror_root = os.path.abspath(path)
if not os.path.isdir(mirror_root):
mkdirp(mirror_root)
# Things to keep track of while parsing specs.
present = []
mirrored = []
error = []
# Iterate through packages and download all the safe tarballs for each of them
for spec in version_specs:
pkg = spec.package
stage = None
try:
# create a subdirectory for the current package@version
archive_path = os.path.abspath(join_path(path, mirror_archive_path(spec)))
subdir = os.path.dirname(archive_path)
mkdirp(subdir)
if os.path.exists(archive_path):
tty.msg("Already added %s" % spec.format("$_$@"))
present.append(spec)
continue
# Set up a stage and a fetcher for the download
unique_fetch_name = spec.format("$_$@")
fetcher = fs.for_package_version(pkg, pkg.version)
stage = Stage(fetcher, name=unique_fetch_name)
fetcher.set_stage(stage)
# Do the fetch and checksum if necessary
fetcher.fetch()
if not kwargs.get('no_checksum', False):
fetcher.check()
tty.msg("Checksum passed for %s@%s" % (pkg.name, pkg.version))
# Fetchers have to know how to archive their files. Use
# that to move/copy/create an archive in the mirror.
fetcher.archive(archive_path)
tty.msg("Added %s." % spec.format("$_$@"))
mirrored.append(spec)
except Exception, e:
if spack.debug:
sys.excepthook(*sys.exc_info())
else:
tty.warn("Error while fetching %s." % spec.format('$_$@'), e.message)
error.append(spec)
finally:
if stage:
stage.destroy()
return (present, mirrored, error)
class MirrorError(spack.error.SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super(MirrorError, self).__init__(msg, long_msg)
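# Usage sketch (path and specs assumed): mirror one version each of two
# packages into a local directory and report what happened.
#
#   present, mirrored, error = create(
#       '/tmp/spack-mirror', ['libelf', 'mpich@3:'], num_versions=1)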

lib/spack/spack/modules.py Normal file
View File

@@ -0,0 +1,255 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""This module contains code for creating environment modules, which
can include dotkits, tcl modules, lmod, and others.
The various types of modules are installed by post-install hooks and
removed after an uninstall by post-uninstall hooks. This class
consolidates the logic for creating an abstract description of the
information that module systems need. Currently that includes a
number of directories to be prepended to paths in the user's environment:
* /bin directories to be prepended to PATH
* /lib* directories for LD_LIBRARY_PATH
* /man* and /share/man* directories for MANPATH
* the package prefix for CMAKE_PREFIX_PATH
This module also includes logic for coming up with unique names for
the module files so that they can be found by the various
shell-support files in $SPACK/share/spack/setup-env.*.
Each hook in hooks/ implements the logic for writing its specific type
of module file.
"""
__all__ = ['EnvModule', 'Dotkit', 'TclModule']
import os
import re
import textwrap
import shutil
from glob import glob
from contextlib import closing
import llnl.util.tty as tty
from llnl.util.filesystem import join_path, mkdirp
import spack
"""Registry of all types of modules. Entries created by EnvModule's
metaclass."""
module_types = {}
def print_help():
"""For use by commands to tell user how to activate shell support."""
tty.msg("This command requires spack's shell integration.",
"",
"To initialize spack's shell commands, you must run one of",
"the commands below. Choose the right command for your shell.",
"",
"For bash and zsh:",
" . %s/setup-env.sh" % spack.share_path,
"",
"For csh and tcsh:",
" setenv SPACK_ROOT %s" % spack.prefix,
" source %s/setup-env.csh" % spack.share_path,
"")
class EnvModule(object):
name = 'env_module'
class __metaclass__(type):
def __init__(cls, name, bases, dict):
type.__init__(cls, name, bases, dict)
if cls.name != 'env_module':
module_types[cls.name] = cls
def __init__(self, spec=None):
# category in the modules system
# TODO: come up with smarter category names.
self.category = "spack"
# Descriptions for the module system's UI
self.short_description = ""
self.long_description = ""
# dict pathname -> list of directories to be prepended to in
# the module file.
self._paths = None
self.spec = spec
@property
def paths(self):
if self._paths is None:
self._paths = {}
def add_path(path_name, directory):
path = self._paths.setdefault(path_name, [])
path.append(directory)
# Add paths if they exist.
for var, directory in [
('PATH', self.spec.prefix.bin),
('MANPATH', self.spec.prefix.man),
('MANPATH', self.spec.prefix.share_man),
('LD_LIBRARY_PATH', self.spec.prefix.lib),
('LD_LIBRARY_PATH', self.spec.prefix.lib64)]:
if os.path.isdir(directory):
add_path(var, directory)
# Add python path unless it's an actual python installation
# TODO: is there a better way to do this?
if self.spec.name != 'python':
site_packages = glob(join_path(self.spec.prefix.lib, "python*/site-packages"))
if site_packages:
add_path('PYTHONPATH', site_packages[0])
# short description is just the package + version
# TODO: maybe packages can optionally provide it.
self.short_description = self.spec.format("$_ $@")
# long description is the docstring with reduced whitespace.
if self.spec.package.__doc__:
self.long_description = re.sub(r'\s+', ' ', self.spec.package.__doc__)
return self._paths
def write(self):
"""Write out a module file for this object."""
module_dir = os.path.dirname(self.file_name)
if not os.path.exists(module_dir):
mkdirp(module_dir)
# If there are no paths, no need for a module file.
if not self.paths:
return
with closing(open(self.file_name, 'w')) as f:
self._write(f)
def _write(self, stream):
"""To be implemented by subclasses."""
raise NotImplementedError()
@property
def file_name(self):
"""Subclasses should implement this to return the name of the file
where this module lives."""
raise NotImplementedError()
@property
def use_name(self):
"""Subclasses should implement this to return the name the
module command uses to refer to the package."""
raise NotImplementedError()
def remove(self):
mod_file = self.file_name
if os.path.exists(mod_file):
os.remove(mod_file)
class Dotkit(EnvModule):
name = 'dotkit'
path = join_path(spack.share_path, "dotkit")
@property
def file_name(self):
return join_path(Dotkit.path, self.spec.architecture,
self.spec.format('$_$@$%@$+$#.dk'))
@property
def use_name(self):
return self.spec.format('$_$@$%@$+$#')
def _write(self, dk_file):
# Category
if self.category:
dk_file.write('#c %s\n' % self.category)
# Short description
if self.short_description:
dk_file.write('#d %s\n' % self.short_description)
# Long description
if self.long_description:
for line in textwrap.wrap(self.long_description, 72):
dk_file.write("#h %s\n" % line)
# Path alterations
for var, dirs in self.paths.items():
for directory in dirs:
dk_file.write("dk_alter %s %s\n" % (var, directory))
# Let CMake find this package.
dk_file.write("dk_alter CMAKE_PREFIX_PATH %s\n" % self.spec.prefix)
class TclModule(EnvModule):
name = 'tcl'
path = join_path(spack.share_path, "modules")
@property
def file_name(self):
return join_path(TclModule.path, self.spec.architecture, self.use_name)
@property
def use_name(self):
return self.spec.format('$_$@$%@$+$#')
def _write(self, m_file):
# TODO: category?
m_file.write('#%Module1.0\n')
# Short description
if self.short_description:
m_file.write('module-whatis \"%s\"\n\n' % self.short_description)
# Long description
if self.long_description:
m_file.write('proc ModulesHelp { } {\n')
# Escape double quotes so the text stays a valid Tcl string.
doc = re.sub(r'"', r'\"', self.long_description)
m_file.write("puts stderr \"%s\"\n" % doc)
m_file.write('}\n\n')
# Path alterations
for var, dirs in self.paths.items():
for directory in dirs:
m_file.write("prepend-path %s \"%s\"\n" % (var, directory))
m_file.write("prepend-path CMAKE_PREFIX_PATH \"%s\"\n" % self.spec.prefix)

File diff suppressed because it is too large

View File

@@ -30,7 +30,7 @@
import llnl.util.tty as tty
from llnl.util.filesystem import join_path
from llnl.util.lang import memoized
from llnl.util.lang import *
import spack.error
import spack.spec
@@ -47,10 +47,10 @@
def _autospec(function):
"""Decorator that automatically converts the argument of a single-arg
function to a Spec."""
def converter(self, spec_like):
def converter(self, spec_like, **kwargs):
if not isinstance(spec_like, spack.spec.Spec):
spec_like = spack.spec.Spec(spec_like)
return function(self, spec_like)
return function(self, spec_like, **kwargs)
return converter
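# Effect sketch: with @_autospec, callers can pass a plain string wherever a
# Spec is expected, so db.get('mpich@3.0.4') behaves like
# db.get(spack.spec.Spec('mpich@3.0.4')).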
@@ -63,15 +63,36 @@ def __init__(self, root):
@_autospec
def get(self, spec):
def get(self, spec, **kwargs):
if spec.virtual:
raise UnknownPackageError(spec.name)
if kwargs.get('new', False):
if spec in self.instances:
del self.instances[spec]
if not spec in self.instances:
package_class = self.get_class_for_package_name(spec.name)
self.instances[spec.name] = package_class(spec)
try:
copy = spec.copy()
self.instances[copy] = package_class(copy)
except Exception, e:
if spack.debug:
sys.excepthook(*sys.exc_info())
raise FailedConstructorError(spec.name, e)
return self.instances[spec.name]
return self.instances[spec]
@_autospec
def delete(self, spec):
"""Force a package to be recreated."""
del self.instances[spec]
def purge(self):
"""Clear entire package instance cache."""
self.instances.clear()
@_autospec
@@ -91,6 +112,24 @@ def providers_for(self, vpkg_spec):
return providers
@_autospec
def extensions_for(self, extendee_spec):
return [p for p in self.all_packages() if p.extends(extendee_spec)]
@_autospec
def installed_extensions_for(self, extendee_spec):
for s in self.installed_package_specs():
try:
if s.package.extends(extendee_spec):
yield s.package
except UnknownPackageError, e:
# Skip packages we know nothing about
continue
# TODO: add some conditional way to do this instead of
# catching exceptions.
def dirname_for_package_name(self, pkg_name):
"""Get the directory name for a particular package. This is the
directory that contains its package.py file."""
@@ -115,7 +154,23 @@ def installed_package_specs(self):
"""Read installed package names straight from the install directory
layout.
"""
return spack.install_layout.all_specs()
# Get specs from the directory layout but ensure that they're
# all normalized properly.
installed = []
for spec in spack.install_layout.all_specs():
spec.normalize()
installed.append(spec)
return installed
def installed_known_package_specs(self):
"""Read installed package names straight from the install
directory layout, but return only specs for which the
package is known to this version of spack.
"""
for spec in spack.install_layout.all_specs():
if self.exists(spec.name):
yield spec
@memoized
@@ -137,6 +192,7 @@ def all_packages(self):
yield self.get(name)
@memoized
def exists(self, pkg_name):
"""Whether a package with the supplied name exists ."""
return os.path.exists(self.filename_for_package_name(pkg_name))
@@ -179,51 +235,17 @@ def get_class_for_package_name(self, pkg_name):
return cls
def compute_dependents(self):
"""Reads in all package files and sets dependence information on
Package objects in memory.
"""
if not hasattr(compute_dependents, 'index'):
compute_dependents.index = {}
for pkg in all_packages():
if pkg._dependents is None:
pkg._dependents = []
for name, dep in pkg.dependencies.iteritems():
dpkg = self.get(name)
if dpkg._dependents is None:
dpkg._dependents = []
dpkg._dependents.append(pkg.name)
def graph_dependencies(self, out=sys.stdout):
"""Print out a graph of all the dependencies between package.
Graph is in dot format."""
out.write('digraph G {\n')
out.write(' label = "Spack Dependencies"\n')
out.write(' labelloc = "b"\n')
out.write(' rankdir = "LR"\n')
out.write(' ranksep = "5"\n')
out.write('\n')
def quote(string):
return '"%s"' % string
deps = []
for pkg in all_packages():
out.write(' %-30s [label="%s"]\n' % (quote(pkg.name), pkg.name))
for dep_name, dep in pkg.dependencies.iteritems():
deps.append((pkg.name, dep_name))
out.write('\n')
for pair in deps:
out.write(' "%s" -> "%s"\n' % pair)
out.write('}\n')
class UnknownPackageError(spack.error.SpackError):
"""Raised when we encounter a package spack doesn't have."""
def __init__(self, name):
super(UnknownPackageError, self).__init__("Package %s not found." % name)
self.name = name
class FailedConstructorError(spack.error.SpackError):
"""Raised when a package's class constructor fails."""
def __init__(self, name, reason):
super(FailedConstructorError, self).__init__(
"Class constructor failed for package '%s'." % name,
str(reason))
self.name = name

View File

@@ -64,7 +64,7 @@ def apply(self, stage):
"""Fetch this patch, if necessary, and apply it to the source
code in the supplied stage.
"""
stage.chdir_to_archive()
stage.chdir_to_source()
patch_stage = None
try:

View File

@@ -68,26 +68,48 @@ class Mpileaks(Package):
spack install mpileaks ^mvapich
spack install mpileaks ^mpich
"""
__all__ = [ 'depends_on', 'extends', 'provides', 'patch', 'version' ]
import re
import inspect
import importlib
from llnl.util.lang import *
import spack
import spack.spec
import spack.error
import spack.url
from spack.version import Version
from spack.patch import Patch
from spack.spec import Spec, parse_anonymous_spec
"""Adds a dependencies local variable in the locals of
the calling class, based on args. """
def depends_on(*specs):
pkg = get_calling_package_name()
dependencies = caller_locals().setdefault('dependencies', {})
def version(ver, checksum=None, **kwargs):
"""Adds a version and metadata describing how to fetch it.
Metadata is just stored as a dict in the package's versions
dictionary. Package must turn it into a valid fetch strategy
later.
"""
pkg = caller_locals()
versions = pkg.setdefault('versions', {})
# special case checksum for backward compatibility
if checksum:
kwargs['md5'] = checksum
# Store the kwargs for the package to use later when constructing
# a fetch strategy.
versions[Version(ver)] = kwargs
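# Usage sketch inside a package.py (checksum is a placeholder value):
#   version('0.8.13', '4136d7b4c04df68b686570afa26988ac')
# Extra kwargs are simply stored for the fetch strategy, so, assuming a
# matching strategy exists, something like this also works:
#   version('develop', git='https://example.com/proj.git', branch='develop')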
def depends_on(*specs):
"""Adds a dependencies local variable in the locals of
the calling class, based on args. """
pkg = get_calling_package_name()
clocals = caller_locals()
dependencies = clocals.setdefault('dependencies', {})
for string in specs:
for spec in spack.spec.parse(string):
if pkg == spec.name:
@@ -95,6 +117,34 @@ def depends_on(*specs):
dependencies[spec.name] = spec
def extends(spec, **kwargs):
"""Same as depends_on, but dependency is symlinked into parent prefix.
This is for Python and other language modules where the module
needs to be installed into the prefix of the Python installation.
Spack handles this by installing modules into their own prefix,
but allowing ONE module version to be symlinked into a parent
Python install at a time.
keyword arguments can be passed to extends() so that extension
packages can pass parameters to the extendee's extension
mechanism.
"""
pkg = get_calling_package_name()
clocals = caller_locals()
dependencies = clocals.setdefault('dependencies', {})
extendees = clocals.setdefault('extendees', {})
if extendees:
raise RelationError("Packages can extend at most one other package.")
spec = Spec(spec)
if pkg == spec.name:
raise CircularReferenceError('extends', pkg)
dependencies[spec.name] = spec
extendees[spec.name] = (spec, kwargs)
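# Usage sketch for a Python extension (names assumed):
#   class PyNose(Package):
#       extends('python')
#       depends_on('py-setuptools')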
def provides(*specs, **kwargs):
"""Allows packages to provide a virtual dependency. If a package provides
'mpi', other packages can declare that they depend on "mpi", and spack

View File

@@ -94,6 +94,7 @@
import itertools
import hashlib
from StringIO import StringIO
from operator import attrgetter
import llnl.util.tty as tty
from llnl.util.lang import *
@@ -309,9 +310,8 @@ def concrete(self):
def __str__(self):
sorted_dep_names = sorted(self.keys())
return ''.join(
["^" + str(self[name]) for name in sorted_dep_names])
["^" + str(self[name]) for name in sorted(self.keys())])
@key_ordering
@@ -345,6 +345,13 @@ def __init__(self, spec_like, *dep_like, **kwargs):
self.compiler = other.compiler
self.dependencies = other.dependencies
# Specs are by default not assumed to be normal, but in some
# cases we've read them from a file and want to assume they're normal.
# This allows us to manipulate specs that Spack doesn't have
# package.py files for.
self._normal = kwargs.get('normal', False)
self._concrete = kwargs.get('concrete', False)
# This allows users to construct a spec DAG with literals.
# Note that given two specs a and b, Spec(a) copies a, but
# Spec(a, b) will copy a but just add b as a dep.
@@ -432,17 +439,30 @@ def concrete(self):
If any of the name, version, architecture, compiler, or dependencies
are ambiguous, then it is not concrete.
"""
return bool(not self.virtual
and self.versions.concrete
and self.architecture
and self.compiler and self.compiler.concrete
and self.dependencies.concrete)
if self._concrete:
return True
self._concrete = bool(not self.virtual
and self.versions.concrete
and self.architecture
and self.compiler and self.compiler.concrete
and self.dependencies.concrete)
return self._concrete
def preorder_traversal(self, visited=None, d=0, **kwargs):
"""Generic preorder traversal of the DAG represented by this spec.
def traverse(self, visited=None, d=0, **kwargs):
"""Generic traversal of the DAG represented by this spec.
This will yield each node in the spec. Options:
order [=pre|post]
Order to traverse spec nodes. Defaults to preorder traversal.
Options are:
'pre': Pre-order traversal; each node is yielded before its
children in the dependency DAG.
'post': Post-order traversal; each node is yielded after its
children in the dependency DAG.
cover [=nodes|edges|paths]
Determines how extensively to cover the DAG. Possible values:
@@ -460,7 +480,7 @@ def preorder_traversal(self, visited=None, d=0, **kwargs):
spec, but also their depth from the root in a (depth, node)
tuple.
keyfun [=id]
key [=id]
Allow a custom key function to track the identity of nodes
in the traversal.
@@ -472,44 +492,57 @@ def preorder_traversal(self, visited=None, d=0, **kwargs):
'parents', traverses upwards in the DAG towards the root.
"""
# get initial values for kwargs
depth = kwargs.get('depth', False)
key_fun = kwargs.get('key', id)
if isinstance(key_fun, basestring):
key_fun = attrgetter(key_fun)
yield_root = kwargs.get('root', True)
cover = kwargs.get('cover', 'nodes')
direction = kwargs.get('direction', 'children')
order = kwargs.get('order', 'pre')
cover_values = ('nodes', 'edges', 'paths')
if cover not in cover_values:
raise ValueError("Invalid value for cover: %s. Choices are %s"
% (cover, ",".join(cover_values)))
direction_values = ('children', 'parents')
if direction not in direction_values:
raise ValueError("Invalid value for direction: %s. Choices are %s"
% (direction, ",".join(direction_values)))
# Make sure kwargs have legal values; raise ValueError if not.
def validate(name, val, allowed_values):
if val not in allowed_values:
raise ValueError("Invalid value for %s: %s. Choices are %s"
% (name, val, ",".join(allowed_values)))
validate('cover', cover, ('nodes', 'edges', 'paths'))
validate('direction', direction, ('children', 'parents'))
validate('order', order, ('pre', 'post'))
if visited is None:
visited = set()
result = (d, self) if depth else self
key = key_fun(self)
if key in visited:
if cover == 'nodes': return
if yield_root or d > 0: yield result
if cover == 'edges': return
else:
if yield_root or d > 0: yield result
# Node traversal does not yield visited nodes.
if key in visited and cover == 'nodes':
return
successors = self.dependencies
if direction == 'parents':
successors = self.dependents
# Determine whether and what to yield for this node.
yield_me = yield_root or d > 0
result = (d, self) if depth else self
visited.add(key)
for name in sorted(successors):
child = successors[name]
for elt in child.preorder_traversal(visited, d+1, **kwargs):
yield elt
# Preorder traversal yields before successors
if yield_me and order == 'pre':
yield result
# Edge traversal yields but skips children of visited nodes
if not (key in visited and cover == 'edges'):
# This code determines direction and yields the children/parents
successors = self.dependencies
if direction == 'parents':
successors = self.dependents
visited.add(key)
for name in sorted(successors):
child = successors[name]
for elt in child.traverse(visited, d+1, **kwargs):
yield elt
# Postorder traversal yields after successors
if yield_me and order == 'post':
yield result
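# Usage sketch (spec assumed): walk a normalized DAG bottom-up, skipping the root.
#   for node in spec.traverse(order='post', root=False):
#       print node.name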
@property
@@ -519,19 +552,27 @@ def short_spec(self):
return self.format('$_$@$%@$+$=$#')
@property
def cshort_spec(self):
"""Returns a version of the spec with the dependencies hashed
instead of completely enumerated."""
return self.format('$_$@$%@$+$=$#', color=True)
@property
def prefix(self):
return Prefix(spack.install_layout.path_for_spec(self))
def dep_hash(self, length=None):
"""Return a hash representing the dependencies of this spec
This will always normalize first so that the hash is consistent.
"""
self.normalize()
"""Return a hash representing all dependencies of this spec
(direct and indirect).
If you want this hash to be consistent, you should
concretize the spec first so that it is not ambiguous.
"""
sha = hashlib.sha1()
sha.update(str(self.dependencies))
sha.update(self.dep_string())
full_hash = sha.hexdigest()
return full_hash[:length]
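# Sketch: on a concretized spec the digest is stable, and length just truncates
# the hex string, e.g. spec.dep_hash(8) -> '1f8a2c3d' (value illustrative).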
@@ -594,7 +635,7 @@ def _expand_virtual_packages(self):
a problem.
"""
while True:
virtuals = [v for v in self.preorder_traversal() if v.virtual]
virtuals = [v for v in self.traverse() if v.virtual]
if not virtuals:
return
@@ -605,8 +646,8 @@ def _expand_virtual_packages(self):
spec._replace_with(concrete)
# If there are duplicate providers or duplicate provider deps, this
# consolidates them and merges constraints.
self.normalize()
# consolidates them and merges constraints.
self.normalize(force=True)
def concretize(self):
@@ -621,9 +662,13 @@ def concretize(self):
with requirements of its packages. See flatten() and normalize() for
more details on this.
"""
if self._concrete:
return
self.normalize()
self._expand_virtual_packages()
self._concretize_helper()
self._concrete = True
def concretized(self):
@@ -635,47 +680,60 @@ def concretized(self):
return clone
def flat_dependencies(self):
"""Return a DependencyMap containing all of this spec's dependencies
with their constraints merged. If there are any conflicts, throw
an exception.
def flat_dependencies(self, **kwargs):
"""Return a DependencyMap containing all of this spec's
dependencies with their constraints merged.
This will work even on specs that are not normalized; i.e. specs
that have two instances of the same dependency in the DAG.
This is used as the first step of normalization.
If copy is True, returns merged copies of its dependencies
without modifying the spec it's called on.
If copy is False, clears this spec's dependencies and
returns them.
"""
# This ensures that the package descriptions themselves are consistent
if not self.virtual:
self.package.validate_dependencies()
copy = kwargs.get('copy', True)
# Once that is guaranteed, we know any constraint violations are due
# to the spec -- so they're the user's fault, not Spack's.
flat_deps = DependencyMap()
try:
for spec in self.preorder_traversal():
for spec in self.traverse(root=False):
if spec.name not in flat_deps:
new_spec = spec.copy(dependencies=False)
flat_deps[spec.name] = new_spec
if copy:
flat_deps[spec.name] = spec.copy(deps=False)
else:
flat_deps[spec.name] = spec
else:
flat_deps[spec.name].constrain(spec)
if not copy:
for dep in flat_deps.values():
dep.dependencies.clear()
dep.dependents.clear()
self.dependencies.clear()
return flat_deps
except UnsatisfiableSpecError, e:
# This REALLY shouldn't happen unless something is wrong in spack.
# It means we got a spec DAG with two instances of the same package
# that had inconsistent constraints. There's no way for a user to
# produce a spec like this (the parser adds all deps to the root),
# so this means OUR code is not sane!
# Here, the DAG contains two instances of the same package
# with inconsistent constraints. Users cannot produce
# inconsistent specs like this on the command line: the
# parser doesn't allow it. Spack must be broken!
raise InconsistentSpecError("Invalid Spec DAG: %s" % e.message)
return flat_deps
def index(self):
"""Return DependencyMap that points to all the dependencies in this
spec."""
dm = DependencyMap()
for spec in self.traverse():
dm[spec.name] = spec
return dm
def flatten(self):
"""Pull all dependencies up to the root (this spec).
Merge constraints for dependencies with the same name, and if they
conflict, throw an exception. """
self.dependencies = self.flat_dependencies()
for dep in self.flat_dependencies(copy=False):
self._add_dependency(dep)
def _normalize_helper(self, visited, spec_deps, provider_index):
@@ -754,7 +812,7 @@ def _normalize_helper(self, visited, spec_deps, provider_index):
dependency._normalize_helper(visited, spec_deps, provider_index)
def normalize(self):
def normalize(self, **kwargs):
"""When specs are parsed, any dependencies specified are hanging off
the root, and ONLY the ones that were explicitly provided are there.
Normalization turns a partial flat spec into a DAG, where:
@@ -772,14 +830,18 @@ def normalize(self):
TODO: normalize should probably implement some form of cycle detection,
to ensure that the spec is actually a DAG.
"""
if self._normal and not kwargs.get('force', False):
return
# Ensure first that all packages & compilers in the DAG exist.
self.validate_names()
# Then ensure that the packages referenced are sane, that the
# provided spec is sane, and that all dependency specs are in the
# root node of the spec. flat_dependencies will do this for us.
spec_deps = self.flat_dependencies()
self.dependencies.clear()
# Ensure that the package & dep descriptions are consistent & sane
if not self.virtual:
self.package.validate_dependencies()
# Get all the dependencies into one DependencyMap
spec_deps = self.flat_dependencies(copy=False)
# Figure out which of the user-provided deps provide virtual deps.
# Remove virtual deps that are already provided by something in the spec
@@ -792,7 +854,7 @@ def normalize(self):
# If there are deps specified but not visited, they're not
# actually deps of this package. Raise an error.
extra = set(spec_deps.viewkeys()).difference(visited)
extra = set(spec_deps.keys()).difference(visited)
# Also subtract out all the packages that provide a needed vpkg
vdeps = [v for v in self.package.virtual_dependencies()]
@@ -805,11 +867,14 @@ def normalize(self):
raise InvalidDependencyException(
self.name + " does not depend on " + comma_or(extra))
# Mark the spec as normal once done.
self._normal = True
def normalized(self):
"""Return a normalized copy of this spec without modifying this spec."""
clone = self.copy()
clone.normalized()
clone.normalize()
return clone
@@ -818,7 +883,7 @@ def validate_names(self):
If they're not, it will raise either UnknownPackageError or
UnsupportedCompilerError.
"""
for spec in self.preorder_traversal():
for spec in self.traverse():
# Don't get a package for a virtual name.
if not spec.virtual:
spack.db.get(spec.name)
@@ -886,17 +951,17 @@ def _constrain_dependencies(self, other):
def common_dependencies(self, other):
"""Return names of dependencies that self an other have in common."""
common = set(
s.name for s in self.preorder_traversal(root=False))
s.name for s in self.traverse(root=False))
common.intersection_update(
s.name for s in other.preorder_traversal(root=False))
s.name for s in other.traverse(root=False))
return common
def dep_difference(self, other):
"""Returns dependencies in self that are not in other."""
mine = set(s.name for s in self.preorder_traversal(root=False))
mine = set(s.name for s in self.traverse(root=False))
mine.difference_update(
s.name for s in other.preorder_traversal(root=False))
s.name for s in other.traverse(root=False))
return mine
@@ -955,8 +1020,8 @@ def satisfies_dependencies(self, other):
return False
# For virtual dependencies, we need to dig a little deeper.
self_index = ProviderIndex(self.preorder_traversal(), restrict=True)
other_index = ProviderIndex(other.preorder_traversal(), restrict=True)
self_index = ProviderIndex(self.traverse(), restrict=True)
other_index = ProviderIndex(other.traverse(), restrict=True)
# This handles cases where there are already providers for both vpkgs
if not self_index.satisfies(other_index):
@@ -978,7 +1043,7 @@ def satisfies_dependencies(self, other):
def virtual_dependencies(self):
"""Return list of any virtual deps in this spec."""
return [spec for spec in self.preorder_traversal() if spec.virtual]
return [spec for spec in self.traverse() if spec.virtual]
def _dup(self, other, **kwargs):
@@ -993,21 +1058,31 @@ def _dup(self, other, **kwargs):
Whether deps should be copied too. Set to false to copy a
spec but not its dependencies.
"""
# TODO: this needs to handle DAGs.
# Local node attributes get copied first.
self.name = other.name
self.versions = other.versions.copy()
self.variants = other.variants.copy()
self.architecture = other.architecture
self.compiler = None
if other.compiler:
self.compiler = other.compiler.copy()
self.compiler = other.compiler.copy() if other.compiler else None
self.dependents = DependencyMap()
copy_deps = kwargs.get('dependencies', True)
if copy_deps:
self.dependencies = other.dependencies.copy()
else:
self.dependencies = DependencyMap()
self.dependencies = DependencyMap()
# If we copy dependencies, preserve DAG structure in the new spec
if kwargs.get('deps', True):
# This copies the deps from other using _dup(deps=False)
new_nodes = other.flat_dependencies()
new_nodes[self.name] = self
# Hook everything up properly here by traversing.
for spec in other.traverse(cover='nodes'):
parent = new_nodes[spec.name]
for child in spec.dependencies:
if child not in parent.dependencies:
parent._add_dependency(new_nodes[child])
# Since we preserved structure, we can copy _normal safely.
self._normal = other._normal
self._concrete = other._concrete
def copy(self, **kwargs):
@@ -1029,7 +1104,7 @@ def version(self):
def __getitem__(self, name):
"""TODO: reconcile __getitem__, _add_dependency, __contains__"""
for spec in self.preorder_traversal():
for spec in self.traverse():
if spec.name == name:
return spec
@@ -1037,18 +1112,86 @@ def __getitem__(self, name):
def __contains__(self, spec):
"""True if this spec has any dependency that satisfies the supplied
spec."""
"""True if this spec satisfis the provided spec, or if any dependency
does. If the spec has no name, then we parse this one first.
"""
spec = self._autospec(spec)
for s in self.preorder_traversal():
for s in self.traverse():
if s.satisfies(spec):
return True
return False
def _cmp_key(self):
def sorted_deps(self):
"""Return a list of all dependencies sorted by name."""
deps = self.flat_dependencies()
return tuple(deps[name] for name in sorted(deps))
def _eq_dag(self, other, vs, vo):
"""Recursive helper for eq_dag and ne_dag. Does the actual DAG
traversal."""
vs.add(id(self))
vo.add(id(other))
if self.ne_node(other):
return False
if len(self.dependencies) != len(other.dependencies):
return False
ssorted = [self.dependencies[name] for name in sorted(self.dependencies)]
osorted = [other.dependencies[name] for name in sorted(other.dependencies)]
for s, o in zip(ssorted, osorted):
visited_s = id(s) in vs
visited_o = id(o) in vo
# Check for duplicate or non-equal dependencies
if visited_s != visited_o: return False
# Skip visited nodes
if visited_s or visited_o: continue
# Recursive check for equality
if not s._eq_dag(o, vs, vo):
return False
return True
def eq_dag(self, other):
"""True if the full dependency DAGs of specs are equal"""
return self._eq_dag(other, set(), set())
def ne_dag(self, other):
"""True if the full dependency DAGs of specs are not equal"""
return not self.eq_dag(other)
def _cmp_node(self):
"""Comparison key for just *this node* and not its deps."""
return (self.name, self.versions, self.variants,
self.architecture, self.compiler, self.dependencies)
self.architecture, self.compiler)
def eq_node(self, other):
"""Equality with another spec, not including dependencies."""
return self._cmp_node() == other._cmp_node()
def ne_node(self, other):
"""Inequality with another spec, not including dependencies."""
return self._cmp_node() != other._cmp_node()
def _cmp_key(self):
"""Comparison key for this node and all dependencies *without*
considering structure. This is the default, as
normalization will restore structure.
"""
return self._cmp_node() + (self.sorted_deps(),)
def colorized(self):
@@ -1151,28 +1294,29 @@ def write(s, c):
return result
def dep_string(self):
return ''.join("^" + dep.format() for dep in self.sorted_deps())
def __str__(self):
by_name = lambda d: d.name
deps = self.preorder_traversal(key=by_name, root=False)
sorted_deps = sorted(deps, key=by_name)
dep_string = ''.join("^" + dep.format() for dep in sorted_deps)
return self.format() + dep_string
return self.format() + self.dep_string()
def tree(self, **kwargs):
"""Prints out this spec and its dependencies, tree-formatted
with indentation."""
color = kwargs.get('color', False)
depth = kwargs.get('depth', False)
showid = kwargs.get('ids', False)
cover = kwargs.get('cover', 'nodes')
indent = kwargs.get('indent', 0)
format = kwargs.get('format', '$_$@$%@$+$=')
color = kwargs.pop('color', False)
depth = kwargs.pop('depth', False)
showid = kwargs.pop('ids', False)
cover = kwargs.pop('cover', 'nodes')
indent = kwargs.pop('indent', 0)
fmt = kwargs.pop('format', '$_$@$%@$+$=')
check_kwargs(kwargs, self.tree)
out = ""
cur_id = 0
ids = {}
for d, node in self.preorder_traversal(cover=cover, depth=True):
for d, node in self.traverse(order='pre', cover=cover, depth=True):
out += " " * indent
if depth:
out += "%-4d" % d
@@ -1184,7 +1328,7 @@ def tree(self, **kwargs):
out += (" " * d)
if d > 0:
out += "^"
out += node.format(format, color=color) + "\n"
out += node.format(fmt, color=color) + "\n"
return out
@@ -1261,6 +1405,9 @@ def spec(self):
spec.dependents = DependencyMap()
spec.dependencies = DependencyMap()
spec._normal = False
spec._concrete = False
# record this so that we know whether version is
# unspecified or not.
added_version = False
@@ -1490,7 +1637,7 @@ def __init__(self, provided, required, constraint_type):
class UnsatisfiableSpecNameError(UnsatisfiableSpecError):
"""Raised when two specs aren't even for the same package."""
def __init__(self, provided, required):
super(UnsatisfiableVersionSpecError, self).__init__(
super(UnsatisfiableSpecNameError, self).__init__(
provided, required, "name")

View File

@@ -32,18 +32,19 @@
import spack
import spack.config
import spack.fetch_strategy as fs
import spack.error
import spack.util.crypto as crypto
from spack.util.compression import decompressor_for
STAGE_PREFIX = 'spack-stage-'
class Stage(object):
"""A Stage object manaages a directory where an archive is downloaded,
expanded, and built before being installed. It also handles downloading
the archive. A stage's lifecycle looks like this:
"""A Stage object manaages a directory where some source code is
downloaded and built before being installed. It handles
fetching the source code, either as an archive to be expanded
or by checking it out of a repository. A stage's lifecycle
looks like this:
Stage()
Constructor creates the stage directory.
@@ -68,21 +69,31 @@ class Stage(object):
similar, and are intended to persist for only one run of spack.
"""
def __init__(self, url, **kwargs):
def __init__(self, url_or_fetch_strategy, **kwargs):
"""Create a stage object.
Parameters:
url URL of the archive to be downloaded into this stage.
url_or_fetch_strategy
URL of the archive to be downloaded into this stage, OR
a valid FetchStrategy.
name If a name is provided, then this stage is a named stage
and will persist between runs (or if you construct another
stage object later). If name is not provided, then this
stage will be given a unique name automatically.
name
If a name is provided, then this stage is a named stage
and will persist between runs (or if you construct another
stage object later). If name is not provided, then this
stage will be given a unique name automatically.
"""
if isinstance(url_or_fetch_strategy, basestring):
self.fetcher = fs.from_url(url_or_fetch_strategy)
elif isinstance(url_or_fetch_strategy, fs.FetchStrategy):
self.fetcher = url_or_fetch_strategy
else:
raise ValueError("Can't construct Stage without url or fetch strategy")
self.fetcher.set_stage(self)
self.name = kwargs.get('name')
self.mirror_path = kwargs.get('mirror_path')
self.tmp_root = find_tmp_root()
self.url = url
self.path = None
self._setup()
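# Usage sketch (URL assumed): either form builds an equivalent stage.
#   stage = Stage('http://example.com/foo-1.0.tar.gz', name='foo-1.0')
#   stage = Stage(fs.URLFetchStrategy('http://example.com/foo-1.0.tar.gz'))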
@@ -120,8 +131,7 @@ def _need_to_create_path(self):
if spack.use_tmp_stage:
# If we're using a tmp dir, it's a link, and it points at the right spot,
# then keep it.
if (os.path.commonprefix((real_path, real_tmp)) == real_tmp
and os.path.exists(real_path)):
if (real_path.startswith(real_tmp) and os.path.exists(real_path)):
return False
else:
# otherwise, just unlink it and start over.
@@ -188,7 +198,10 @@ def _setup(self):
@property
def archive_file(self):
"""Path to the source archive within this stage directory."""
paths = [os.path.join(self.path, os.path.basename(self.url))]
if not isinstance(self.fetcher, fs.URLFetchStrategy):
return None
paths = [os.path.join(self.path, os.path.basename(self.fetcher.url))]
if self.mirror_path:
paths.append(os.path.join(self.path, os.path.basename(self.mirror_path)))
@@ -199,17 +212,17 @@ def archive_file(self):
@property
def expanded_archive_path(self):
"""Returns the path to the expanded archive directory if it's expanded;
None if the archive hasn't been expanded.
"""
if not self.archive_file:
return None
def source_path(self):
"""Returns the path to the expanded/checked out source code
within this fetch strategy's path.
for file in os.listdir(self.path):
archive_path = join_path(self.path, file)
if os.path.isdir(archive_path):
return archive_path
This assumes nothing else is going to be put in the
FetchStrategy's path. It searches for the first
subdirectory of the path it can find, then returns that.
"""
for p in [os.path.join(self.path, f) for f in os.listdir(self.path)]:
if os.path.isdir(p):
return p
return None
@@ -221,71 +234,40 @@ def chdir(self):
tty.die("Setup failed: no such directory: " + self.path)
def fetch_from_url(self, url):
# Run curl but grab the mime type from the http headers
headers = spack.curl('-#', # status bar
'-O', # save file to disk
'-D', '-', # print out HTML headers
'-L', url,
return_output=True, fail_on_error=False)
if spack.curl.returncode != 0:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
if spack.curl.returncode == 60:
# This is a certificate error. Suggest spack -k
raise FailedDownloadError(
url,
"Curl was unable to fetch due to invalid certificate. "
"This is either an attack, or your cluster's SSL configuration "
"is bad. If you believe your SSL configuration is bad, you "
"can try running spack -k, which will not check SSL certificates."
"Use this at your own risk.")
# Check if we somehow got an HTML file rather than the archive we
# asked for. We only look at the last content type, to handle
# redirects properly.
content_types = re.findall(r'Content-Type:[^\r\n]+', headers)
if content_types and 'text/html' in content_types[-1]:
tty.warn("The contents of " + self.archive_file + " look like HTML.",
"The checksum will likely be bad. If it is, you can use",
"'spack clean --dist' to remove the bad archive, then fix",
"your internet gateway issue and install again.")
def fetch(self):
"""Downloads the file at URL to the stage. Returns true if it was downloaded,
false if it already existed."""
"""Downloads an archive or checks out code from a repository."""
self.chdir()
if self.archive_file:
tty.msg("Already downloaded %s." % self.archive_file)
fetchers = [self.fetcher]
# TODO: move mirror logic out of here and clean it up!
if self.mirror_path:
urls = ["%s/%s" % (m, self.mirror_path) for m in _get_mirrors()]
digest = None
if isinstance(self.fetcher, fs.URLFetchStrategy):
digest = self.fetcher.digest
fetchers = [fs.URLFetchStrategy(url, digest)
for url in urls] + fetchers
for f in fetchers:
f.set_stage(self)
for fetcher in fetchers:
try:
fetcher.fetch()
break
except spack.error.SpackError, e:
tty.msg("Fetching from %s failed." % fetcher)
tty.debug(e)
continue
else:
urls = [self.url]
if self.mirror_path:
urls = ["%s/%s" % (m, self.mirror_path) for m in _get_mirrors()] + urls
for url in urls:
tty.msg("Trying to fetch from %s" % url)
self.fetch_from_url(url)
if self.archive_file:
break
if not self.archive_file:
raise FailedDownloadError(url)
return self.archive_file
tty.die("All fetchers failed for %s" % self.name)
def check(self, digest):
"""Check the downloaded archive against a checksum digest"""
checker = crypto.Checker(digest)
if not checker.check(self.archive_file):
raise ChecksumError(
"%s checksum failed for %s." % (checker.hash_name, self.archive_file),
"Expected %s but got %s." % (digest, checker.sum))
def check(self):
"""Check the downloaded archive against a checksum digest.
No-op if this stage checks code out of a repository."""
self.fetcher.check()
def expand_archive(self):
@@ -293,19 +275,14 @@ def expand_archive(self):
archive. Fail if the stage is not set up or if the archive is not yet
downloaded.
"""
self.chdir()
if not self.archive_file:
tty.die("Attempt to expand archive before fetching.")
decompress = decompressor_for(self.archive_file)
decompress(self.archive_file)
self.fetcher.expand()
def chdir_to_archive(self):
def chdir_to_source(self):
"""Changes directory to the expanded archive directory.
Dies with an error if there was no expanded archive.
"""
path = self.expanded_archive_path
path = self.source_path
if not path:
tty.die("Attempt to chdir before expanding archive.")
else:
@@ -318,12 +295,7 @@ def restage(self):
"""Removes the expanded archive path if it exists, then re-expands
the archive.
"""
if not self.archive_file:
tty.die("Attempt to restage when not staged.")
if self.expanded_archive_path:
shutil.rmtree(self.expanded_archive_path, True)
self.expand_archive()
self.fetcher.reset()
def destroy(self):
@@ -394,15 +366,20 @@ def find_tmp_root():
return None
class FailedDownloadError(spack.error.SpackError):
"""Raised wen a download fails."""
def __init__(self, url, msg=""):
super(FailedDownloadError, self).__init__(
"Failed to fetch file from URL: %s" % url, msg)
self.url = url
class StageError(spack.error.SpackError):
def __init__(self, message, long_message=None):
super(StageError, self).__init__(message, long_message)
class ChecksumError(spack.error.SpackError):
"""Raised when archive fails to checksum."""
def __init__(self, message, long_msg):
super(ChecksumError, self).__init__(message, long_msg)
class RestageError(StageError):
def __init__(self, message, long_msg=None):
super(RestageError, self).__init__(message, long_msg)
class ChdirError(StageError):
def __init__(self, message, long_msg=None):
super(ChdirError, self).__init__(message, long_msg)
# Keep this in namespace for convenience
FailedDownloadError = fs.FailedDownloadError

View File

@@ -30,12 +30,10 @@
import spack
import spack.test.install
"""Names of tests to be included in Spack's test suite"""
test_names = ['versions',
'url_parse',
'url_substitution',
'packages',
'stage',
'spec_syntax',
@@ -45,7 +43,16 @@
'multimethod',
'install',
'package_sanity',
'config']
'config',
'directory_layout',
'python_version',
'git_fetch',
'svn_fetch',
'hg_fetch',
'mirror',
'url_extrapolate',
'cc',
'link_tree']
def list_tests():
@@ -70,7 +77,7 @@ def run(names, verbose=False):
runner = unittest.TextTestRunner(verbosity=verbosity)
testsRun = errors = failures = skipped = 0
testsRun = errors = failures = 0
for test in names:
module = 'spack.test.' + test
print module
@@ -81,12 +88,10 @@ def run(names, verbose=False):
testsRun += result.testsRun
errors += len(result.errors)
failures += len(result.failures)
skipped += len(result.skipped)
succeeded = not errors and not failures
tty.msg("Tests Complete.",
"%5d tests run" % testsRun,
"%5d skipped" % skipped,
"%5d failures" % failures,
"%5d errors" % errors)

lib/spack/spack/test/cc.py Normal file
View File

@@ -0,0 +1,130 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""
This test checks that the Spack cc compiler wrapper is parsing
arguments correctly.
"""
import os
import unittest
from llnl.util.filesystem import *
import spack
from spack.util.executable import *
# Complicated compiler test command
test_command = [
    '-I/test/include', '-L/test/lib', '-L/other/lib', '-I/other/include',
    'arg1',
    '-Wl,--start-group',
    'arg2',
    '-Wl,-rpath=/first/rpath', 'arg3', '-Wl,-rpath', '-Wl,/second/rpath',
    '-llib1', '-llib2',
    'arg4',
    '-Wl,--end-group',
    '-Xlinker,-rpath', '-Xlinker,/third/rpath', '-Xlinker,-rpath=/fourth/rpath',
    '-llib3', '-llib4',
    'arg5', 'arg6']


class CompilerTest(unittest.TestCase):

    def setUp(self):
        self.cc  = Executable(join_path(spack.build_env_path, "cc"))
        self.ld  = Executable(join_path(spack.build_env_path, "ld"))
        self.cpp = Executable(join_path(spack.build_env_path, "cpp"))

        os.environ['SPACK_CC'] = "/bin/mycc"
        os.environ['SPACK_PREFIX'] = "/usr"
        os.environ['SPACK_ENV_PATH'] = "test"
        os.environ['SPACK_DEBUG_LOG_DIR'] = "."
        os.environ['SPACK_COMPILER_SPEC'] = "gcc@4.4.7"
        os.environ['SPACK_SHORT_SPEC'] = "foo@1.2"

    def check_cc(self, command, args, expected):
        os.environ['SPACK_TEST_COMMAND'] = command
        self.assertEqual(self.cc(*args, return_output=True).strip(), expected)

    def check_ld(self, command, args, expected):
        os.environ['SPACK_TEST_COMMAND'] = command
        self.assertEqual(self.ld(*args, return_output=True).strip(), expected)

    def check_cpp(self, command, args, expected):
        os.environ['SPACK_TEST_COMMAND'] = command
        self.assertEqual(self.cpp(*args, return_output=True).strip(), expected)

    def test_vcheck_mode(self):
        self.check_cc('dump-mode', ['-I/include', '--version'], "vcheck")
        self.check_cc('dump-mode', ['-I/include', '-V'], "vcheck")
        self.check_cc('dump-mode', ['-I/include', '-v'], "vcheck")
        self.check_cc('dump-mode', ['-I/include', '-dumpversion'], "vcheck")
        self.check_cc('dump-mode', ['-I/include', '--version', '-c'], "vcheck")
        self.check_cc('dump-mode', ['-I/include', '-V', '-o', 'output'], "vcheck")

    def test_cpp_mode(self):
        self.check_cc('dump-mode', ['-E'], "cpp")
        self.check_cpp('dump-mode', [], "cpp")

    def test_ccld_mode(self):
        self.check_cc('dump-mode', [], "ccld")
        self.check_cc('dump-mode', ['foo.c', '-o', 'foo'], "ccld")
        self.check_cc('dump-mode', ['foo.c', '-o', 'foo', '-Wl,-rpath=foo'], "ccld")
        self.check_cc('dump-mode', ['foo.o', 'bar.o', 'baz.o', '-o', 'foo', '-Wl,-rpath=foo'], "ccld")

    def test_ld_mode(self):
        self.check_ld('dump-mode', [], "ld")
        self.check_ld('dump-mode', ['foo.o', 'bar.o', 'baz.o', '-o', 'foo', '-Wl,-rpath=foo'], "ld")

    def test_includes(self):
        self.check_cc('dump-includes', test_command,
                      "\n".join(["/test/include", "/other/include"]))

    def test_libraries(self):
        self.check_cc('dump-libraries', test_command,
                      "\n".join(["/test/lib", "/other/lib"]))

    def test_libs(self):
        self.check_cc('dump-libs', test_command,
                      "\n".join(["lib1", "lib2", "lib3", "lib4"]))

    def test_rpaths(self):
        self.check_cc('dump-rpaths', test_command,
                      "\n".join(["/first/rpath", "/second/rpath",
                                 "/third/rpath", "/fourth/rpath"]))

    def test_other_args(self):
        self.check_cc('dump-other-args', test_command,
                      "\n".join(["arg1", "-Wl,--start-group", "arg2", "arg3",
                                 "arg4", "-Wl,--end-group", "arg5", "arg6"]))

lib/spack/spack/test/concretize.py

@@ -134,29 +134,29 @@ def test_concretize_with_provides_when(self):
     def test_virtual_is_fully_expanded_for_callpath(self):
         # force dependence on fake "zmpi" by asking for MPI 10.0
         spec = Spec('callpath ^mpi@10.0')
-        self.assertIn('mpi', spec.dependencies)
-        self.assertNotIn('fake', spec)
+        self.assertTrue('mpi' in spec.dependencies)
+        self.assertFalse('fake' in spec)

         spec.concretize()

-        self.assertIn('zmpi', spec.dependencies)
-        self.assertNotIn('mpi', spec)
-        self.assertIn('fake', spec.dependencies['zmpi'])
+        self.assertTrue('zmpi' in spec.dependencies)
+        self.assertFalse('mpi' in spec)
+        self.assertTrue('fake' in spec.dependencies['zmpi'])

     def test_virtual_is_fully_expanded_for_mpileaks(self):
         spec = Spec('mpileaks ^mpi@10.0')
-        self.assertIn('mpi', spec.dependencies)
-        self.assertNotIn('fake', spec)
+        self.assertTrue('mpi' in spec.dependencies)
+        self.assertFalse('fake' in spec)

         spec.concretize()

-        self.assertIn('zmpi', spec.dependencies)
-        self.assertIn('callpath', spec.dependencies)
-        self.assertIn('zmpi', spec.dependencies['callpath'].dependencies)
-        self.assertIn('fake', spec.dependencies['callpath'].dependencies['zmpi'].dependencies)
+        self.assertTrue('zmpi' in spec.dependencies)
+        self.assertTrue('callpath' in spec.dependencies)
+        self.assertTrue('zmpi' in spec.dependencies['callpath'].dependencies)
+        self.assertTrue('fake' in spec.dependencies['callpath'].dependencies['zmpi'].dependencies)

-        self.assertNotIn('mpi', spec)
+        self.assertFalse('mpi' in spec)

     def test_my_dep_depends_on_provider_of_my_virtual_dep(self):
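One likely motivation for the rewrite above: unittest gained assertIn and assertNotIn only in Python 2.7, so the assertTrue/assertFalse spelling keeps the suite runnable on Python 2.6. Inside a TestCase the two forms are equivalent:

    # assertIn/assertNotIn exist only on Python 2.7+:
    self.assertIn('mpi', spec.dependencies)
    # ...while this spelling also works on Python 2.6:
    self.assertTrue('mpi' in spec.dependencies)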

lib/spack/spack/test/directory_layout.py Normal file

@@ -0,0 +1,155 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""\
This test verifies that the Spack directory layout works properly.
"""
import unittest
import tempfile
import shutil
import os
from contextlib import closing
from llnl.util.filesystem import *
import spack
from spack.spec import Spec
from spack.packages import PackageDB
from spack.directory_layout import SpecHashDirectoryLayout
class DirectoryLayoutTest(unittest.TestCase):
    """Tests that a directory layout works correctly and produces a
       consistent install path."""

    def setUp(self):
        self.tmpdir = tempfile.mkdtemp()
        self.layout = SpecHashDirectoryLayout(self.tmpdir)

    def tearDown(self):
        shutil.rmtree(self.tmpdir, ignore_errors=True)
        self.layout = None

    def test_read_and_write_spec(self):
        """This goes through each package in spack and creates a directory for
           it.  It then ensures that the spec for the directory's
           installed package can be read back in consistently, and
           finally that the directory can be removed by the directory
           layout.
        """
        for pkg in spack.db.all_packages():
            spec = pkg.spec

            # If a spec fails to concretize, just skip it.  If it is a
            # real error, it will be caught by concretization tests.
            try:
                spec.concretize()
            except:
                continue

            self.layout.make_path_for_spec(spec)

            install_dir = self.layout.path_for_spec(spec)
            spec_path = self.layout.spec_file_path(spec)

            # Ensure directory has been created in right place.
            self.assertTrue(os.path.isdir(install_dir))
            self.assertTrue(install_dir.startswith(self.tmpdir))

            # Ensure spec file exists when directory is created
            self.assertTrue(os.path.isfile(spec_path))
            self.assertTrue(spec_path.startswith(install_dir))

            # Make sure spec file can be read back in to get the original spec
            spec_from_file = self.layout.read_spec(spec_path)
            self.assertEqual(spec, spec_from_file)
            self.assertTrue(spec.eq_dag(spec_from_file))
            self.assertTrue(spec_from_file.concrete)

            # Ensure that specs that come out "normal" are really normal.
            with closing(open(spec_path)) as spec_file:
                read_separately = Spec(spec_file.read())

                read_separately.normalize()
                self.assertEqual(read_separately, spec_from_file)

                read_separately.concretize()
                self.assertEqual(read_separately, spec_from_file)

            # Make sure the dep hash of the read-in spec is the same
            self.assertEqual(spec.dep_hash(), spec_from_file.dep_hash())

            # Ensure directories are properly removed
            self.layout.remove_path_for_spec(spec)
            self.assertFalse(os.path.isdir(install_dir))
            self.assertFalse(os.path.exists(install_dir))

    def test_handle_unknown_package(self):
        """This test ensures that spack can at least do *some*
           operations with packages that are installed but that it
           does not know about.  This is actually not such an uncommon
           scenario with spack; it can happen when you switch from a
           git branch where you're working on a new package.

           This test ensures that the directory layout stores enough
           information about installed packages' specs to uninstall
           or query them again if the package goes away.
        """
        mock_db = PackageDB(spack.mock_packages_path)

        not_in_mock = set(spack.db.all_package_names()).difference(
            set(mock_db.all_package_names()))

        # Create all the packages that are not in mock.
        installed_specs = {}
        for pkg_name in not_in_mock:
            spec = spack.db.get(pkg_name).spec

            # If a spec fails to concretize, just skip it.  If it is a
            # real error, it will be caught by concretization tests.
            try:
                spec.concretize()
            except:
                continue

            self.layout.make_path_for_spec(spec)
            installed_specs[spec] = self.layout.path_for_spec(spec)

        # Temporarily point spack at the mock database, which knows
        # nothing about the packages installed above.
        tmp = spack.db
        spack.db = mock_db

        # Now check that even without the package files, we know
        # enough to read a spec from the spec file.
        for spec, path in installed_specs.items():
            spec_from_file = self.layout.read_spec(join_path(path, '.spec'))

            # To satisfy these conditions, directory layouts need to
            # read in concrete specs from their install dirs somehow.
            self.assertEqual(path, self.layout.path_for_spec(spec_from_file))
            self.assertEqual(spec, spec_from_file)
            self.assertEqual(spec.dep_hash(), spec_from_file.dep_hash())

        spack.db = tmp
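The round trip these tests exercise, condensed into a standalone sketch (the package name is a placeholder; the layout calls are exactly the ones used above):

    import tempfile
    import spack
    from spack.directory_layout import SpecHashDirectoryLayout

    layout = SpecHashDirectoryLayout(tempfile.mkdtemp())
    spec = spack.db.get('libelf').spec        # placeholder package name
    spec.concretize()

    layout.make_path_for_spec(spec)           # creates install dir + .spec file
    spec_back = layout.read_spec(layout.spec_file_path(spec))
    assert spec_back == spec                  # specs survive the round trip
    layout.remove_path_for_spec(spec)         # removes the directory again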

Some files were not shown because too many files have changed in this diff.