Compare commits

...

58 Commits

Author SHA1 Message Date
Greg Becker
719d31682f speed up relocation using memoized offsets
refactor to do scanning in a single pass
parallelize new relocate method with threadpool
relocate_by_offsets can recompute if offsets not memoized
2022-06-03 12:19:01 +02:00
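The idea behind this commit, as a minimal sketch (illustrative only; none of these names are Spack's actual `relocate` API): scan each binary once while the buildcache tarball is created, memoize the byte offsets at which any install prefix occurs, then patch exactly those offsets at install time, recomputing only when nothing was memoized.

```python
import re
from multiprocessing.pool import ThreadPool

def compute_offsets(data, prefixes):
    # Single scanning pass: one combined regex finds every occurrence of
    # every prefix, so each binary is read only once.
    pattern = re.compile(b"|".join(re.escape(p) for p in prefixes))
    offsets = {}
    for m in pattern.finditer(data):
        offsets.setdefault(m.group(0), []).append(m.start())
    return offsets

def relocate_by_offsets(path, prefix_to_prefix, offsets=None):
    with open(path, "rb") as f:
        data = bytearray(f.read())
    if offsets is None:
        # Offsets were not memoized when the buildcache was created: recompute.
        offsets = compute_offsets(bytes(data), list(prefix_to_prefix))
    for old, positions in offsets.items():
        new = prefix_to_prefix[old]
        if len(new) > len(old):
            raise ValueError("new prefix must not be longer than the old one")
        # Pad with '/' so the memoized byte offsets stay valid; repeated
        # slashes are harmless in POSIX paths.
        padded = new.ljust(len(old), b"/")
        for pos in positions:
            data[pos:pos + len(old)] = padded
    with open(path, "wb") as f:
        f.write(data)

def relocate_all(files, prefix_to_prefix, memoized_offsets):
    # Parallelize the per-binary work with a thread pool, as the commit does.
    with ThreadPool() as pool:
        pool.starmap(relocate_by_offsets,
                     [(f, prefix_to_prefix, memoized_offsets.get(f))
                      for f in files])
```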
Sreenivasa Murthy Kolam
1190d03b0f Bump up the version for ROCm-5.1.3 release (#30819)
* Bump up the version for ROCm-5.1.3 release

* remove extra comma from hashes for device-libs of rocm-openmp-extras
2022-06-01 10:57:10 -07:00
kwryankrattiger
5faa927fe6 Ascent: Patch find conduit python (#30949)
Some systems have trouble using the Python on the login node, so this
provides an option to build that doesn't require running Python.
2022-06-01 10:36:33 -07:00
Adam J. Stewart
c60d220f81 py-jupyterlab: add v3.4.2 (#30867) 2022-06-01 10:18:47 -07:00
Hartmut Kaiser
61d3d60414 Update HPX recipe for HPX V1.8.0 (#30896) 2022-06-01 11:06:03 -06:00
rashawnLK
174258c09a Updated intel-gtpin package.py for most recent version, GTPin 3.0. (#30877)
* Updated intel-gtpin package.py for most recent version, GTPin 3.0.

* Fixed style issues in package.py -- removed trailing whitespace on two
lines.
2022-06-01 11:05:41 -06:00
Erik
73c6a8f73d Version updates for SUNDIALS and CUDA (#30874) 2022-06-01 11:01:32 -06:00
Olivier Cessenat
86dc904080 ngspice: adding version 37 (#30925) 2022-06-01 10:50:02 -06:00
Weiqun Zhang
9e1c87409d amrex: add v22.06 (#30951) 2022-06-01 10:49:46 -06:00
Sergey Kosukhin
2b30dc2e30 nag: add new version (#30927)
* nag: add new version

* nag: update maintainers
2022-06-01 10:49:32 -06:00
Ben Darwin
b1ce756d69 minc-toolkit: add version 1.9.18.2 (#30926) 2022-06-01 10:45:34 -06:00
Ida Mjelde
1194ac6985 Adding a libunwind variant to libzmq (#30932)
* Adding a libunwind variant to libzmq

* Remove whitespace line 46
2022-06-01 10:29:56 -06:00
Asher Mancinelli
954f961208 tmux: support building from master and utf8 opts (#30928)
* tmux: support building from master and utf8 opts

* Fix style errors
2022-06-01 10:29:35 -06:00
Zack Galbreath
47ac710796 CPU & memory requests for jobs that generate GitLab CI pipelines (#30940)
gitlab ci: make sure pipeline generation isn't resource starved
2022-06-01 09:43:23 -06:00
Derek Ryan Strong
d7fb5a6db4 rclone: add 1.58 (#30887)
* Add rclone 1.58

* Update rclone git repo path
2022-05-31 19:35:12 -07:00
Maciej Wójcik
e0624b9278 gromacs: Add recent releases (#30892)
* gromacs: Add recent releases

* gromacs: Update branch name

* gromacs: Update links
2022-05-31 19:27:23 -07:00
Olivier Cessenat
e86614f7b8 gmsh: adding version 4.10.3 (#30923) 2022-05-31 18:53:03 -07:00
Garth N. Wells
d166b948ce fenics-dolfinx: dependency updates (#30919)
* Add pugixml dependency

* Dependency updates

* Fix Spack NumPy version

* Test more generous NumPy constraint

* Fix NumPy requirement
2022-05-31 18:49:49 -07:00
lorddavidiii
9cc3a2942d cfitsio: add 4.1.0 (#30920) 2022-05-31 18:46:07 -07:00
Marie Houillon
5d685f9ff6 New version for openCARP packages (#30931)
Co-authored-by: openCARP consortium <info@opencarp.org>
2022-05-31 18:21:55 -07:00
Paul Kuberry
9ddf45964d xyce: add sha for version 7.5.0 (#30941) 2022-05-31 17:59:13 -07:00
iarspider
b88cc77f16 xpmem package: add patches for building on FC 35 with kernel 5.16.18-200 (#29945) 2022-05-31 15:55:51 -07:00
Robert Cohn
f3af38ba9b Fix module support for oneapi compilers (#28901)
Updates to improve Spack-generated modules for Intel oneAPI compilers:

* intel-oneapi-compilers set CC etc.
* Add a new package intel-oneapi-compilers-classic which can be used to
  generate a module which sets CC etc. to older compilers (e.g. icc)
* lmod module logic now updated to treat the intel-oneapi-compilers*
  packages as compilers
2022-05-31 15:02:25 -07:00
Wouter Deconinck
adc9f887ea acts-dd4hep: new package; acts: new version (#30850)
* acts-dd4hep: new package, separated from new acts@19.1.0

* acts-dd4hep: improved versioning

* acts-dd4hep: don't use curl | sha256sum

* acts: new variant `odd` for Open Data Detector

* acts-dd4hep: style changes
2022-05-31 11:30:36 -07:00
Wouter Deconinck
9461f482d9 assimp: new version 5.2.4 (#30929) 2022-05-31 12:25:24 -06:00
Paul Kuberry
e014b889c6 xyce: remove python packages as +pymi dependencies and hdf5 from trilinos dependency (#30938) 2022-05-31 14:04:57 -04:00
snehring
181ac574bb sentieon-genomics: adding version 202112.04 (#30876) 2022-05-31 10:05:27 -06:00
Adam J. Stewart
055c9d125d CUDA: make cuda_arch sticky (#30910) 2022-05-30 12:53:15 -07:00
Evan Bollig
a94438b1f5 Added AWS-AHUG alinux2 pipeline (#24601)
Add spack stacks targeted at Spack + AWS + ARM HPC User Group hackathon. Includes
a list of miniapps and full-apps that are ready to run on both x86_64 and aarch64.

Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2022-05-30 10:26:39 -06:00
Joseph Wang
f583e471b8 pass CC variable to make (#30912)
Set CC to cc
2022-05-30 08:44:31 +02:00
Brian Van Essen
f67f3b1796 Add new versions of protobuf and py-protobuf (#30503)
* Add new versions

* Updated the hashes to match the published pypi.org hashes. Added version constraints for Python.
2022-05-30 01:23:26 -05:00
Jean Luca Bez
77c86c759c HDF5 VOL-ASYNC update versions (#30900) 2022-05-29 17:49:52 -04:00
Adam J. Stewart
8084259bd3 protobuf: fix spack versions (#30879) 2022-05-28 14:53:24 -06:00
Evan Bollig
98860c6a5f Alinux isc buildcache (#30462)
Add two new stacks targeted at x86_64 and arm, representing an initial list of packages 
used by current and planned AWS Workshops, and built in conjunction with the ISC22
announcement of the spack public binary cache.

Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2022-05-28 11:32:53 -06:00
Todd Gamblin
e6929b9ff9 0.18.0.dev0 -> 0.19.0.dev0 (#30907) 2022-05-28 17:23:01 +00:00
Tom Scogland
18c2f1a57a refactor: packages import spack.package explicitly (#30404)
Explicitly import package utilities in all packages, and handle the corresponding fallout.

This includes:

* rename `spack.package` to `spack.package_base`
* rename `spack.pkgkit` to `spack.package`
* update all packages in builtin, builtin_mock and tutorials to include `from spack.package import *`
* update spack style
  * ensure packages include the import
  * automatically add the new import and remove any/all imports of `spack` and `spack.pkgkit`
    from packages when using `--fix`
  * add support for type-checking packages with mypy when SPACK_MYPY_CHECK_PACKAGES
    is set in the environment
* fix all type checking errors in packages in spack upstream
* update spack create to include the new imports
* update spack repo to inject the new import, injection persists to allow for a deprecation period

Original message below:
 
As requested @adamjstewart, update all packages to use pkgkit.  I ended up using isort to do this,
so repro is easy:

```console
$ isort -a 'from spack.pkgkit import *' --rm 'spack' ./var/spack/repos/builtin/packages/*/package.py
$ spack style --fix
```

There were several line spacing fixups caused either by space manipulation in isort or by packages
that haven't been touched since we added requirements, but there are no functional changes in here.

* [x] add config to isort to make sure this is maintained going forward
2022-05-28 12:55:44 -04:00
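After this refactor, a minimal package file reads as below (a sketch; the package is hypothetical, but `from spack.package import *` is the convention this commit installs everywhere, with base classes now living in `spack.package_base`):

```python
# var/spack/repos/builtin/packages/example/package.py
from spack.package import *  # new-style import added by this refactor


class Example(MakefilePackage):
    """Hypothetical package illustrating the new-style import."""

    homepage = 'https://www.example.org'
    url = 'https://www.example.org/example-1.0.tar.gz'

    version('1.0', sha256='0' * 64)  # placeholder checksum, not a real hash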
Todd Gamblin
3054cd0eff update changelog for v0.18.0 (#30905) 2022-05-28 17:33:20 +02:00
JDBetteridge
9016b79270 Additional BLAS/LAPACK library configuration for Numpy (#30817)
* Add amdblis and amdlibflame as BLAS/LAPACK options

* Add Cray-libsci as BLAS/LAPACK option

* Use Netlib config for Cray-libsci
2022-05-28 03:33:31 -06:00
Erik Schnetter
9f5c6fb398 hpx: New version 1.8.0 (#30848) 2022-05-28 09:40:20 +02:00
Greg Becker
19087c9d35 target optimization: re-norm optimization scale so that 0 is best. (#29926)
Preferred targets are currently the only minimization criterion for Spack for which we allow
negative values. That means Spack may be incentivized to add nodes to the DAG if they
match the preferred target.

This PR re-norms the minimization criteria so that preferred targets are weighted from 0,
and default target weights are offset by the number of preferred targets per-package to
calculate node_target_weight.

Also fixes a bug in the test for preferred targets that was making the test easier to pass
than it should be.
2022-05-27 22:49:41 -07:00
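In toy form (hypothetical targets, and plain Python rather than the solver's actual ASP encoding), the re-normed weighting is:

```python
def node_target_weight(target, preferred, all_targets):
    if target in preferred:
        return preferred.index(target)  # preferred targets weighted from 0
    # default target weights are offset by the number of preferred
    # targets for this package, so no weight is ever negative
    return len(preferred) + all_targets.index(target)

preferred = ['zen3', 'zen2']
targets = ['zen3', 'zen2', 'x86_64_v3', 'x86_64']
assert node_target_weight('zen3', preferred, targets) == 0  # best score is 0
assert node_target_weight('x86_64', preferred, targets) == 5
```

With every weight non-negative, adding a node can never lower the total, which removes the incentive described above.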
Greg Becker
4116b04368 update tutorial command for v0.18.0 and new gpg key (#30904) 2022-05-28 02:36:20 +00:00
JDBetteridge
1485931695 Ensure same BLAS/LAPACK config from Numpy used in Scipy (#30818)
* Call Numpy package's set_blas_lapack() and setup_build_environment() in Scipy package

* Remove broken link from comment

* Use .package attribute of spec to avoid import
2022-05-27 10:46:21 -07:00
Derek Ryan Strong
78cac4d840 Add R 4.2.0 (#30859) 2022-05-27 08:24:28 -05:00
Michael Kuhn
2f628c3a97 gcc: add 9.5.0 (#30893) 2022-05-27 12:18:57 +02:00
Adam J. Stewart
a3a8710cbe Python: fix clingo bootstrapping on Apple M1 (#30834)
This PR fixes several issues I noticed while trying to get Spack working on Apple M1.

- [x] `build_environment.py` attempts to add `spec['foo'].libs` and `spec['foo'].headers` to our compiler wrappers for all dependencies using a try-except that ignores `NoLibrariesError` and `NoHeadersError` respectively. However, the `libs` and `headers` attributes of the Python package were erroneously using `RuntimeError` instead.
- [x] `spack external find python` (used during bootstrapping) currently has no way to determine whether or not an installation is `+shared`, so previously we would only search for static Python libs. However, most distributions including XCode/Conda/Intel ship shared Python libs. I updated `libs` to search for both shared and static (order based on variant) as a fallback.
- [x] The `headers` attribute was recursively searching in `prefix.include` for `pyconfig.h`, but this could lead to non-deterministic behavior if multiple versions of Python are installed and `pyconfig.h` files exist in multiple `<prefix>/include/pythonX.Y` locations. It's safer to search in `sysconfig.get_path('include')` instead.
- [x] The Python installation that comes with XCode is broken, and `sysconfig.get_paths` is hard-coded to return specific directories. This meant that our logic for `platlib`/`purelib`/`include` where we replace `platbase`/`base`/`installed_base` with `prefix` wasn't working and the `mkdirp` in `setup_dependent_package` was trying to create a directory in root, giving permissions issues. Even if you commented out those `mkdirp` calls, Spack would add the wrong directories to `PYTHONPATH`. Added a fallback hard-coded to `lib/pythonX.Y/site-packages` if sysconfig is broken (this is what distutils always did).
2022-05-27 03:18:20 -07:00
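The header fix in particular amounts to asking the interpreter where its own headers live, roughly as follows (illustrative, not the package's exact code):

```python
import os
import sysconfig

def find_pyconfig_h():
    # Deterministic even when several pythonX.Y trees share one prefix:
    # the interpreter reports its own include directory.
    include = sysconfig.get_path('include')
    candidate = os.path.join(include, 'pyconfig.h')
    return candidate if os.path.exists(candidate) else None
```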
Paul R. C. Kent
0bf3a9c2af llvm: 14.0.3 and 14.0.4 (#30888) 2022-05-27 09:51:54 +02:00
Severin Strobl
cff955f7bd otf2/scorep: add versions 3.0/7.1 (#28631) 2022-05-27 00:43:58 +02:00
Scott Wittenburg
3d43ebec72 Revert "strip -Werror: all specific or none (#30284)" (#30878)
This reverts commit 330832c22c.
2022-05-26 14:17:01 -07:00
Robert Pavel
6fd07479e3 Updated mfem constraints in laghos spackage (#30851)
Updated mfem constraints in laghos spackage to better match comments and
support legacy builds of `laghos@1.0:2.0`
2022-05-26 10:20:42 -07:00
Simon Pintarelli
03bc36f8b0 q-e-sirius: remove ~apps constraint (#30857) 2022-05-26 10:18:37 -07:00
Brian Van Essen
93e1b283b7 Added hash for new versions (#30860) 2022-05-26 10:15:59 -07:00
Derek Ryan Strong
df2c0fbfbd Add new versions of GNU parallel (#30862) 2022-05-26 10:09:43 -07:00
Derek Ryan Strong
54a69587c3 Add newer nano versions (#30865) 2022-05-26 10:04:50 -07:00
Hans Johansen
294312f02b Adding new package bricks for x86, cuda (#30863)
* Adding new package bricks for x86, cuda

* Fixed complaints from "spack style" that CI found

* add license comment at top

Co-authored-by: drhansj <drhansj@berkeley.edu>
Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
2022-05-26 07:40:11 -07:00
Massimiliano Culpo
0636fdbfef Remove the warning that Spack prints at each spec (#30872)
Instead, add a warning box in the documentation
2022-05-26 14:35:20 +00:00
Scott Wittenburg
85e13260cf ci: Support secure binary signing on protected pipelines (#30753)
This PR supports the creation of securely signed binaries built from spack
develop as well as release branches and tags. Specifically:

- remove internal pr mirror url generation logic in favor of buildcache destination
on command line
    - with a single mirror url specified in the spack.yaml, this makes it clearer where 
    binaries from various pipelines are pushed
- designate some tags as reserved: ['public', 'protected', 'notary']
    - these tags are stripped from all jobs by default and provisioned internally
    based on pipeline type
- update gitlab ci yaml to include pipelines on more protected branches than just
develop (so include releases and tags)
    - binaries from all protected pipelines are pushed into mirrors including the
    branch name so releases, tags, and develop binaries are kept separate
- update rebuild jobs running on protected pipelines to run on special runners
provisioned with an intermediate signing key
    - protected rebuild jobs no longer use "SPACK_SIGNING_KEY" env var to
    obtain signing key (in fact, final signing key is nowhere available to rebuild jobs)
    - these intermediate signatures are verified at the end of each pipeline by a new
    signing job to ensure binaries were produced by a protected pipeline
- optionally schedule a signing/notary job at the end of the pipeline to sign all
packages in the mirror
    - add signing-job-attributes to gitlab-ci section of spack environment to allow
    configuration
    - signing job runs on special runner (separate from protected rebuild runners)
    provisioned with public intermediate key and secret signing key
2022-05-26 08:31:22 -06:00
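The reserved-tag handling condenses to a sketch like this (distilled from the `ci.py` changes shown further down in this diff):

```python
SPACK_RESERVED_TAGS = ['public', 'protected', 'notary']

def provision_tags(tags, spack_pipeline_type):
    # Reserved tags are stripped from all jobs by default...
    tags = [tag for tag in tags if tag not in SPACK_RESERVED_TAGS]
    # ...and re-added internally based on the pipeline type.
    if spack_pipeline_type == 'spack_protected_branch':
        tags.extend(['aws', 'protected'])
    elif spack_pipeline_type == 'spack_pull_request':
        tags.append('public')
    return tags
```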
Adam J. Stewart
b5a519fa51 py-tensorboard: add v2.9.0 (#30832) 2022-05-26 07:43:40 -04:00
Adam J. Stewart
2e2d0b3211 libtiff: remove extra dependencies/patch (#30854) 2022-05-25 23:37:45 -06:00
6798 changed files with 14647 additions and 10488 deletions


@@ -1,3 +1,205 @@
# v0.18.0 (2022-05-28)
`v0.18.0` is a major feature release.
## Major features in this release
1. **Concretizer now reuses by default**
`spack install --reuse` was introduced in `v0.17.0`, and `--reuse`
is now the default concretization mode. Spack will try hard to
resolve dependencies using installed packages or binaries (#30396).
To avoid reuse and to use the latest package configurations (the
old default), you can use `spack install --fresh`, or add
configuration like this to your environment or `concretizer.yaml`:
```yaml
concretizer:
reuse: false
```
2. **Finer-grained hashes**
Spack hashes now include `link`, `run`, *and* `build` dependencies,
as well as a canonical hash of package recipes. Previously, hashes
only included `link` and `run` dependencies (though `build`
dependencies were stored by environments). We coarsened the hash to
reduce churn in user installations, but the new default concretizer
behavior mitigates this concern and gets us reuse *and* provenance.
You will be able to see the build dependencies of new installations
with `spack find`. Old installations will not change and their
hashes will not be affected. (#28156, #28504, #30717, #30861)
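Schematically, the change is about which graph edges feed the hash; a toy sketch (not Spack's hashing code, and the field names are hypothetical):
```python
import hashlib
import json

def dag_hash(node, include_build=True):
    # Old hashes covered only link/run edges; the new hash also covers
    # build edges plus a canonical hash of the package recipe.
    edge_types = ('link', 'run', 'build') if include_build else ('link', 'run')
    payload = {
        'name': node['name'],
        'version': node['version'],
        'package_hash': node['package_hash'],  # canonicalized recipe
        'deps': sorted(h for t, h in node['deps'] if t in edge_types),
    }
    return hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()).hexdigest()
```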
3. **Improved error messages**
Error handling with the new concretizer is now done with
optimization criteria rather than with unsatisfiable cores, and
Spack reports many more details about conflicting constraints.
(#30669)
4. **Unify environments when possible**
Environments have thus far supported `concretization: together` or
`concretization: separately`. These have been replaced by a new
preference in `concretizer.yaml`:
```yaml
concretizer:
unify: [true|false|when_possible]
```
`concretizer:unify:when_possible` will *try* to resolve a fully
unified environment, but if it cannot, it will create multiple
configurations of some packages where it has to. For large
environments that previously had to be concretized separately, this
can result in a huge speedup (40-50x). (#28941)
5. **Automatically find externals on Cray machines**
Spack can now automatically discover installed packages in the Cray
Programming Environment by running `spack external find` (or `spack
external read-cray-manifest` to *only* query the PE). Packages from
the PE (e.g., `cray-mpich`) are added to the database with full
dependency information, and compilers from the PE are added to
`compilers.yaml`. Available with the June 2022 release of the Cray
Programming Environment. (#24894, #30428)
6. **New binary format and hardened signing**
Spack now has an updated binary format, with improvements for
security. The new format has a detached signature file, and Spack
verifies the signature before untarring or decompressing the binary
package. The previous format embedded the signature in a `tar`
file, which required the client to run `tar` *before* verifying
(#30750). Spack can still install from build caches using the old
format, but we encourage users to switch to the new format going
forward.
Production GitLab pipelines have been hardened to securely sign
binaries. There is now a separate signing stage so that signing
keys are never exposed to build system code, and signing keys are
ephemeral and only live as long as the signing pipeline stage.
(#30753)
7. **Bootstrap mirror generation**
The `spack bootstrap mirror` command can automatically create a
mirror for bootstrapping the concretizer and other needed
dependencies in an air-gapped environment. (#28556)
8. **Nascent Windows support**
Spack now has initial support for Windows. Spack core has been
refactored to run in the Windows environment, and a small number of
packages can now build for Windows. More details are
[in the documentation](https://spack.rtfd.io/en/latest/getting_started.html#spack-on-windows)
(#27021, #28385, many more)
9. **Makefile generation**
`spack env depfile` can be used to generate a `Makefile` from an
environment, which can be used to build the packages in the
environment in parallel on a single node, e.g.:
```console
spack -e myenv env depfile > Makefile
make
```
Spack propagates `gmake` jobserver information to builds so that
their jobs can share cores. (#30039, #30254, #30302, #30526)
10. **New variant features**
In addition to being conditional themselves, variants can now have
[conditional *values*](https://spack.readthedocs.io/en/latest/packaging_guide.html#conditional-possible-values)
that are only possible for certain configurations of a package. (#29530)
Variants can be
[declared "sticky"](https://spack.readthedocs.io/en/latest/packaging_guide.html#sticky-variants),
which prevents them from being enabled or disabled by the
concretizer. Sticky variants must be set explicitly by users
on the command line or in `packages.yaml`. (#28630)
* Allow conditional possible values in variants
* Add a "sticky" property to variants
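In recipe form, a sticky conditional variant looks like this sketch (hypothetical package; the real instance in this release is the `cuda_arch` change to `CudaPackage`, visible in the diffs below):
```python
import spack.variant
from spack.package import *


class MyGpuPkg(Package):
    """Hypothetical package with a sticky, conditional variant."""

    variant('cuda', default=False, description='Build with CUDA')
    # sticky=True: the concretizer may never flip this variant; it changes
    # only when set explicitly on the command line or in packages.yaml.
    variant('cuda_arch',
            description='CUDA architecture',
            values=spack.variant.any_combination_of('70', '80'),
            sticky=True,
            when='+cuda')
```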
## Other new features of note
* Environment views can optionally link only `run` dependencies
with `link:run` (#29336)
* `spack external find --all` finds library-only packages in
addition to build dependencies (#28005)
* Customizable `config:license_dir` option (#30135)
* `spack external find --path PATH` takes a custom search path (#30479)
* `spack spec` has a new `--format` argument like `spack find` (#27908)
* `spack concretize --quiet` skips printing concretized specs (#30272)
* `spack info` now has cleaner output and displays test info (#22097)
* Package-level submodule option for git commit versions (#30085, #30037)
* Using `/hash` syntax to refer to concrete specs in an environment
now works even if `/hash` is not installed. (#30276)
## Major internal refactors
* full hash (see above)
* new develop versioning scheme `0.19.0.dev0`
* Allow for multiple dependencies/dependents from the same package (#28673)
* Splice differing virtual packages (#27919)
## Performance Improvements
* Concretization of large environments with `unify: when_possible` is
much faster than concretizing separately (#28941, see above)
* Single-pass view generation algorithm is 2.6x faster (#29443)
## Archspec improvements
* `oneapi` and `dpcpp` flag support (#30783)
* better support for `M1` and `a64fx` (#30683)
## Removals and Deprecations
* Spack no longer supports Python `2.6` (#27256)
* Removed deprecated `--run-tests` option of `spack install`;
use `spack test` (#30461)
* Removed deprecated `spack flake8`; use `spack style` (#27290)
* Deprecate `spack:concretization` config option; use
`concretizer:unify` (#30038)
* Deprecate top-level module configuration; use module sets (#28659)
* `spack activate` and `spack deactivate` are deprecated in favor of
environments; will be removed in `0.19.0` (#29430; see also `link:run`
in #29336 above)
## Notable Bugfixes
* Fix bug that broke locks with many parallel builds (#27846)
* Many bugfixes and consistency improvements for the new concretizer
and `--reuse` (#30357, #30092, #29835, #29933, #28605, #29694, #28848)
## Packages
* `CMakePackage` uses `CMAKE_INSTALL_RPATH_USE_LINK_PATH` (#29703)
* Refactored `lua` support: `lua-lang` virtual supports both
`lua` and `luajit` via new `LuaPackage` build system (#28854)
* PythonPackage: now installs packages with `pip` (#27798)
* Python: improve site_packages_dir handling (#28346)
* Extends: support spec, not just package name (#27754)
* `find_libraries`: search for both .so and .dylib on macOS (#28924)
* Use stable URLs and `?full_index=1` for all github patches (#29239)
## Spack community stats
* 6,416 total packages, 458 new since `v0.17.0`
* 219 new Python packages
* 60 new R packages
* 377 people contributed to this release
* 337 committers to packages
* 85 committers to core
# v0.17.2 (2022-04-13)
### Spack bugfixes
@@ -11,7 +213,7 @@
* Fixed a few bugs affecting the spack ci command (#29518, #29419)
* Fix handling of Intel compiler environment (#29439)
* Fix a few edge cases when reindexing the DB (#28764)
* Remove "Known issues" from documentation (#29664)
* Remove "Known issues" from documentation (#29664)
* Other miscellaneous bugfixes (0b72e070583fc5bcd016f5adc8a84c99f2b7805f, #28403, #29261)
# v0.17.1 (2021-12-23)


@@ -50,6 +50,13 @@ build cache files for the "ninja" spec:
Note that the targeted spec must already be installed. Once you have a build cache,
you can add it as a mirror, discussed next.
.. warning::
Spack improved the format used for binary caches in v0.18. The entire v0.18 series
will be able to verify and install binary caches both in the new and in the old format.
Support for using the old format is expected to end in v0.19, so we advise users to
recreate relevant buildcaches using Spack v0.18 or higher.
---------------------------------------
Finding or installing build cache files
---------------------------------------


@@ -151,7 +151,7 @@ Package-related modules
^^^^^^^^^^^^^^^^^^^^^^^
:mod:`spack.package`
Contains the :class:`~spack.package.Package` class, which
Contains the :class:`~spack.package_base.Package` class, which
is the superclass for all packages in Spack. Methods on ``Package``
implement all phases of the :ref:`package lifecycle
<package-lifecycle>` and manage the build process.


@@ -2393,9 +2393,9 @@ Influence how dependents are built or run
Spack provides a mechanism for dependencies to influence the
environment of their dependents by overriding the
:meth:`setup_dependent_run_environment <spack.package.PackageBase.setup_dependent_run_environment>`
:meth:`setup_dependent_run_environment <spack.package_base.PackageBase.setup_dependent_run_environment>`
or the
:meth:`setup_dependent_build_environment <spack.package.PackageBase.setup_dependent_build_environment>`
:meth:`setup_dependent_build_environment <spack.package_base.PackageBase.setup_dependent_build_environment>`
methods.
The Qt package, for instance, uses this call:
@@ -2417,7 +2417,7 @@ will have the ``PYTHONPATH``, ``PYTHONHOME`` and ``PATH`` environment
variables set appropriately before starting the installation. To make things
even simpler the ``python setup.py`` command is also inserted into the module
scope of dependents by overriding a third method called
:meth:`setup_dependent_package <spack.package.PackageBase.setup_dependent_package>`
:meth:`setup_dependent_package <spack.package_base.PackageBase.setup_dependent_package>`
:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/python/package.py
@@ -3022,7 +3022,7 @@ The classes that are currently provided by Spack are:
+----------------------------------------------------------+----------------------------------+
| **Base Class** | **Purpose** |
+==========================================================+==================================+
| :class:`~spack.package.Package` | General base class not |
| :class:`~spack.package_base.Package` | General base class not |
| | specialized for any build system |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.makefile.MakefilePackage` | Specialized class for packages |
@@ -3153,7 +3153,7 @@ for the install phase is:
For those not used to Python instance methods, this is the
package itself. In this case it's an instance of ``Foo``, which
extends ``Package``. For API docs on Package objects, see
:py:class:`Package <spack.package.Package>`.
:py:class:`Package <spack.package_base.Package>`.
``spec``
This is the concrete spec object created by Spack from an

lib/spack/env/cc

@@ -401,8 +401,7 @@ input_command="$*"
# command line and recombine them with Spack arguments later. We
# parse these out so that we can make sure that system paths come
# last, that package arguments come first, and that Spack arguments
# are injected properly. Based on configuration, we also strip -Werror
# arguments.
# are injected properly.
#
# All other arguments, including -l arguments, are treated as
# 'other_args' and left in their original order. This ensures that
@@ -441,29 +440,6 @@ while [ $# -ne 0 ]; do
continue
fi
if [ -n "${SPACK_COMPILER_FLAGS_KEEP}" ] ; then
# NOTE: the eval is required to allow `|` alternatives inside the variable
eval "\
case '$1' in
$SPACK_COMPILER_FLAGS_KEEP)
append other_args_list "$1"
shift
continue
;;
esac
"
fi
if [ -n "${SPACK_COMPILER_FLAGS_REMOVE}" ] ; then
eval "\
case '$1' in
$SPACK_COMPILER_FLAGS_REMOVE)
shift
continue
;;
esac
"
fi
case "$1" in
-isystem*)
arg="${1#-isystem}"


@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: (major, minor, micro, dev release) tuple
spack_version_info = (0, 18, 0, 'dev0')
spack_version_info = (0, 19, 0, 'dev0')
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
spack_version = '.'.join(str(s) for s in spack_version_info)


@@ -12,7 +12,7 @@
import spack.error
import spack.hooks
import spack.monitor
import spack.package
import spack.package_base
import spack.repo
import spack.util.executable


@@ -210,7 +210,7 @@ def get_all_built_specs(self):
return spec_list
def find_built_spec(self, spec):
def find_built_spec(self, spec, mirrors_to_check=None):
"""Look in our cache for the built spec corresponding to ``spec``.
If the spec can be found among the configured binary mirrors, a
@@ -225,6 +225,8 @@ def find_built_spec(self, spec):
Args:
spec (spack.spec.Spec): Concrete spec to find
mirrors_to_check: Optional mapping containing mirrors to check. If
None, just assumes all configured mirrors.
Returns:
An list of objects containing the found specs and mirror url where
@@ -240,17 +242,23 @@ def find_built_spec(self, spec):
]
"""
self.regenerate_spec_cache()
return self.find_by_hash(spec.dag_hash())
return self.find_by_hash(spec.dag_hash(), mirrors_to_check=mirrors_to_check)
def find_by_hash(self, find_hash):
def find_by_hash(self, find_hash, mirrors_to_check=None):
"""Same as find_built_spec but uses the hash of a spec.
Args:
find_hash (str): hash of the spec to search
mirrors_to_check: Optional mapping containing mirrors to check. If
None, just assumes all configured mirrors.
"""
if find_hash not in self._mirrors_for_spec:
return None
return self._mirrors_for_spec[find_hash]
results = self._mirrors_for_spec[find_hash]
if not mirrors_to_check:
return results
mirror_urls = mirrors_to_check.values()
return [r for r in results if r['mirror_url'] in mirror_urls]
def update_spec(self, spec, found_list):
"""
@@ -616,10 +624,14 @@ def get_buildfile_manifest(spec):
"""
data = {"text_to_relocate": [], "binary_to_relocate": [],
"link_to_relocate": [], "other": [],
"binary_to_relocate_fullpath": []}
"binary_to_relocate_fullpath": [], "offsets": {}}
blacklist = (".spack", "man")
# Get all the paths we will want to relocate in binaries
paths_to_relocate = [s.prefix for s in spec.traverse(root=True)]
paths_to_relocate.append(spack.store.layout.root)
# Do this during tarball creation to save time when the tarball is unpacked.
# Used by make_package_relative to determine binaries to change.
for root, dirs, files in os.walk(spec.prefix, topdown=True):
@@ -654,6 +666,11 @@ def get_buildfile_manifest(spec):
(m_subtype in ('x-mach-binary')
and sys.platform == 'darwin') or
(not filename.endswith('.o'))):
# Last path to relocate is the layout root, which is a substring
# of the others
indices = relocate.compute_indices(path_name, paths_to_relocate)
data['offsets'][rel_path_name] = indices
data['binary_to_relocate'].append(rel_path_name)
data['binary_to_relocate_fullpath'].append(path_name)
added = True
@@ -692,6 +709,7 @@ def write_buildinfo_file(spec, workdir, rel=False):
buildinfo['relocate_binaries'] = manifest['binary_to_relocate']
buildinfo['relocate_links'] = manifest['link_to_relocate']
buildinfo['prefix_to_hash'] = prefix_to_hash
buildinfo['offsets'] = manifest['offsets']
filename = buildinfo_file_name(workdir)
with open(filename, 'w') as outfile:
outfile.write(syaml.dump(buildinfo, default_flow_style=True))
@@ -1465,11 +1483,25 @@ def is_backup_file(file):
# If we are not installing back to the same install tree do the relocation
if old_prefix != new_prefix:
files_to_relocate = [os.path.join(workdir, filename)
for filename in buildinfo.get('relocate_binaries')
]
# Relocate links to the new install prefix
links = [link for link in buildinfo.get('relocate_links', [])]
relocate.relocate_links(
links, old_layout_root, old_prefix, new_prefix
)
# For all buildcaches
# relocate the install prefixes in text files including dependencies
relocate.relocate_text(text_names, prefix_to_prefix_text)
# If the buildcache was not created with relativized rpaths
# do the relocation of path in binaries
# do the relocation of rpaths in binaries
# TODO: Is this necessary? How are null-terminated strings handled
# in the rpath header?
files_to_relocate = [
os.path.join(workdir, filename)
for filename in buildinfo.get('relocate_binaries')
]
platform = spack.platforms.by_name(spec.platform)
if 'macho' in platform.binary_formats:
relocate.relocate_macho_binaries(files_to_relocate,
@@ -1485,25 +1517,11 @@ def is_backup_file(file):
prefix_to_prefix_bin, rel,
old_prefix,
new_prefix)
# Relocate links to the new install prefix
links = [link for link in buildinfo.get('relocate_links', [])]
relocate.relocate_links(
links, old_layout_root, old_prefix, new_prefix
)
# For all buildcaches
# relocate the install prefixes in text files including dependencies
relocate.relocate_text(text_names, prefix_to_prefix_text)
paths_to_relocate = [old_prefix, old_layout_root]
paths_to_relocate.extend(prefix_to_hash.keys())
files_to_relocate = list(filter(
lambda pathname: not relocate.file_is_relocatable(
pathname, paths_to_relocate=paths_to_relocate),
map(lambda filename: os.path.join(workdir, filename),
buildinfo['relocate_binaries'])))
# relocate the install prefixes in binary files including dependencies
relocate.relocate_text_bin(files_to_relocate, prefix_to_prefix_bin)
# If offsets is None, we will recompute offsets when needed
offsets = buildinfo.get('offsets', None)
relocate.relocate_text_bin(
files_to_relocate, prefix_to_prefix_bin, offsets, workdir)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
@@ -1592,9 +1610,6 @@ def extract_tarball(spec, download_result, allow_root=False, unsigned=False,
# Handle the older buildcache layout where the .spack file
# contains a spec json/yaml, maybe an .asc file (signature),
# and another tarball containing the actual install tree.
tty.warn("This binary package uses a deprecated layout, "
"and support for it will eventually be removed "
"from spack.")
tmpdir = tempfile.mkdtemp()
try:
tarfile_path = _extract_inner_tarball(
@@ -1822,7 +1837,7 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
tty.debug("No Spack mirrors are currently configured")
return {}
results = binary_index.find_built_spec(spec)
results = binary_index.find_built_spec(spec, mirrors_to_check=mirrors_to_check)
# Maybe we just didn't have the latest information from the mirror, so
# try to fetch directly, unless we are only considering the indices.


@@ -55,7 +55,7 @@
import spack.config
import spack.install_test
import spack.main
import spack.package
import spack.package_base
import spack.paths
import spack.platforms
import spack.repo
@@ -242,17 +242,6 @@ def clean_environment():
# show useful matches.
env.set('LC_ALL', build_lang)
remove_flags = set()
keep_flags = set()
if spack.config.get('config:flags:keep_werror') == 'all':
keep_flags.add('-Werror*')
else:
if spack.config.get('config:flags:keep_werror') == 'specific':
keep_flags.add('-Werror=*')
remove_flags.add('-Werror*')
env.set('SPACK_COMPILER_FLAGS_KEEP', '|'.join(keep_flags))
env.set('SPACK_COMPILER_FLAGS_REMOVE', '|'.join(remove_flags))
# Remove any macports installs from the PATH. The macports ld can
# cause conflicts with the built-in linker on el capitan. Solves
# assembler issues, e.g.:
@@ -733,7 +722,7 @@ def get_std_cmake_args(pkg):
package were a CMakePackage instance.
Args:
pkg (spack.package.PackageBase): package under consideration
pkg (spack.package_base.PackageBase): package under consideration
Returns:
list: arguments for cmake
@@ -749,7 +738,7 @@ def get_std_meson_args(pkg):
package were a MesonPackage instance.
Args:
pkg (spack.package.PackageBase): package under consideration
pkg (spack.package_base.PackageBase): package under consideration
Returns:
list: arguments for meson
@@ -759,12 +748,12 @@ def get_std_meson_args(pkg):
def parent_class_modules(cls):
"""
Get list of superclass modules that descend from spack.package.PackageBase
Get list of superclass modules that descend from spack.package_base.PackageBase
Includes cls.__module__
"""
if (not issubclass(cls, spack.package.PackageBase) or
issubclass(spack.package.PackageBase, cls)):
if (not issubclass(cls, spack.package_base.PackageBase) or
issubclass(spack.package_base.PackageBase, cls)):
return []
result = []
module = sys.modules.get(cls.__module__)
@@ -782,7 +771,7 @@ def load_external_modules(pkg):
associated with them.
Args:
pkg (spack.package.PackageBase): package to load deps for
pkg (spack.package_base.PackageBase): package to load deps for
"""
for dep in list(pkg.spec.traverse()):
external_modules = dep.external_modules or []
@@ -1120,7 +1109,7 @@ def start_build_process(pkg, function, kwargs):
Args:
pkg (spack.package.PackageBase): package whose environment we should set up the
pkg (spack.package_base.PackageBase): package whose environment we should set up the
child process for.
function (typing.Callable): argless function to run in the child
process.
@@ -1245,7 +1234,7 @@ def make_stack(tb, stack=None):
if 'self' in frame.f_locals:
# Find the first proper subclass of PackageBase.
obj = frame.f_locals['self']
if isinstance(obj, spack.package.PackageBase):
if isinstance(obj, spack.package_base.PackageBase):
break
# We found obj, the Package implementation we care about.


@@ -9,7 +9,7 @@
from spack.build_systems.autotools import AutotoolsPackage
from spack.directives import extends
from spack.package import ExtensionError
from spack.package_base import ExtensionError
from spack.util.executable import which


@@ -16,7 +16,7 @@
from spack.build_environment import InstallError
from spack.directives import conflicts, depends_on
from spack.operating_systems.mac_os import macos_version
from spack.package import PackageBase, run_after, run_before
from spack.package_base import PackageBase, run_after, run_before
from spack.util.executable import Executable
from spack.version import Version


@@ -8,7 +8,7 @@
from llnl.util.filesystem import install, mkdirp
from spack.build_systems.cmake import CMakePackage
from spack.package import run_after
from spack.package_base import run_after
def cmake_cache_path(name, value, comment=""):


@@ -18,7 +18,7 @@
import spack.build_environment
from spack.directives import conflicts, depends_on, variant
from spack.package import InstallError, PackageBase, run_after
from spack.package_base import InstallError, PackageBase, run_after
from spack.util.path import convert_to_posix_path
# Regex to extract the primary generator from the CMake generator


@@ -6,7 +6,7 @@
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.multimethod import when
from spack.package import PackageBase
from spack.package_base import PackageBase
class CudaPackage(PackageBase):
@@ -37,6 +37,7 @@ class CudaPackage(PackageBase):
variant('cuda_arch',
description='CUDA architecture',
values=spack.variant.any_combination_of(*cuda_arch_values),
sticky=True,
when='+cuda')
# https://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#nvcc-examples


@@ -3,14 +3,16 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.package
from typing import Optional
import spack.package_base
import spack.util.url
class GNUMirrorPackage(spack.package.PackageBase):
class GNUMirrorPackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for GNU packages."""
#: Path of the package in a GNU mirror
gnu_mirror_path = None
gnu_mirror_path = None # type: Optional[str]
#: List of GNU mirrors used by Spack
base_mirrors = [


@@ -26,7 +26,7 @@
import spack.error
from spack.build_environment import dso_suffix
from spack.package import InstallError, PackageBase, run_after
from spack.package_base import InstallError, PackageBase, run_after
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
from spack.util.prefix import Prefix
@@ -1115,7 +1115,7 @@ def _setup_dependent_env_callback(
raise InstallError('compilers_of_client arg required for MPI')
def setup_dependent_package(self, module, dep_spec):
# https://spack.readthedocs.io/en/latest/spack.html#spack.package.PackageBase.setup_dependent_package
# https://spack.readthedocs.io/en/latest/spack.html#spack.package_base.PackageBase.setup_dependent_package
# Reminder: "module" refers to Python module.
# Called before the install() method of dependents.


@@ -10,7 +10,7 @@
from spack.directives import depends_on, extends
from spack.multimethod import when
from spack.package import PackageBase
from spack.package_base import PackageBase
from spack.util.executable import Executable


@@ -11,7 +11,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import conflicts
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class MakefilePackage(PackageBase):


@@ -7,7 +7,7 @@
from llnl.util.filesystem import install_tree, working_dir
from spack.directives import depends_on
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
from spack.util.executable import which


@@ -11,7 +11,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on, variant
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class MesonPackage(PackageBase):


@@ -6,7 +6,7 @@
import inspect
from spack.directives import extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class OctavePackage(PackageBase):


@@ -14,7 +14,7 @@
from llnl.util.filesystem import find_headers, find_libraries, join_path
from spack.package import Package
from spack.package_base import Package
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable


@@ -10,7 +10,7 @@
from llnl.util.filesystem import filter_file
from spack.directives import extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
from spack.util.executable import Executable


@@ -6,6 +6,7 @@
import os
import re
import shutil
from typing import Optional
import llnl.util.tty as tty
from llnl.util.filesystem import (
@@ -19,13 +20,13 @@
from llnl.util.lang import match_predicate
from spack.directives import depends_on, extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class PythonPackage(PackageBase):
"""Specialized class for packages that are built using pip."""
#: Package name, version, and extension on PyPI
pypi = None
pypi = None # type: Optional[str]
maintainers = ['adamjstewart']
@@ -46,7 +47,7 @@ class PythonPackage(PackageBase):
# package manually
depends_on('py-wheel', type='build')
py_namespace = None
py_namespace = None # type: Optional[str]
@staticmethod
def _std_args(cls):


@@ -9,7 +9,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class QMakePackage(PackageBase):


@@ -5,9 +5,10 @@
import inspect
from typing import Optional
from spack.directives import extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class RPackage(PackageBase):
@@ -28,10 +29,10 @@ class RPackage(PackageBase):
# package attributes that can be expanded to set the homepage, url,
# list_url, and git values
# For CRAN packages
cran = None
cran = None # type: Optional[str]
# For Bioconductor packages
bioc = None
bioc = None # type: Optional[str]
maintainers = ['glennpj']


@@ -3,13 +3,14 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from typing import Optional
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
from spack.build_environment import SPACK_NO_PARALLEL_MAKE, determine_number_of_jobs
from spack.directives import extends
from spack.package import PackageBase
from spack.package_base import PackageBase
from spack.util.environment import env_flag
from spack.util.executable import Executable, ProcessError
@@ -36,8 +37,8 @@ class RacketPackage(PackageBase):
extends('racket')
pkgs = False
subdirectory = None
name = None
subdirectory = None # type: Optional[str]
name = None # type: Optional[str]
parallel = True
@property


@@ -77,7 +77,7 @@
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.package import PackageBase
from spack.package_base import PackageBase
class ROCmPackage(PackageBase):


@@ -7,7 +7,7 @@
import inspect
from spack.directives import extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class RubyPackage(PackageBase):


@@ -7,7 +7,7 @@
import inspect
from spack.directives import depends_on
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class SConsPackage(PackageBase):


@@ -11,7 +11,7 @@
from llnl.util.filesystem import find, join_path, working_dir
from spack.directives import depends_on, extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class SIPPackage(PackageBase):


@@ -3,15 +3,17 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.package
from typing import Optional
import spack.package_base
import spack.util.url
class SourceforgePackage(spack.package.PackageBase):
class SourceforgePackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for Sourceforge
packages."""
#: Path of the package in a Sourceforge mirror
sourceforge_mirror_path = None
sourceforge_mirror_path = None # type: Optional[str]
#: List of Sourceforge mirrors used by Spack
base_mirrors = [


@@ -2,16 +2,17 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Optional
import spack.package
import spack.package_base
import spack.util.url
class SourcewarePackage(spack.package.PackageBase):
class SourcewarePackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for Sourceware.org
packages."""
#: Path of the package in a Sourceware mirror
sourceware_mirror_path = None
sourceware_mirror_path = None # type: Optional[str]
#: List of Sourceware mirrors used by Spack
base_mirrors = [


@@ -9,7 +9,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class WafPackage(PackageBase):


@@ -3,15 +3,17 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.package
from typing import Optional
import spack.package_base
import spack.util.url
class XorgPackage(spack.package.PackageBase):
class XorgPackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for x.org
packages."""
#: Path of the package in a x.org mirror
xorg_mirror_path = None
xorg_mirror_path = None # type: Optional[str]
#: List of x.org mirrors used by Spack
# Note: x.org mirrors are a bit tricky, since many are out-of-sync or off.


@@ -33,7 +33,6 @@
import spack.util.executable as exe
import spack.util.gpg as gpg_util
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
from spack.error import SpackError
from spack.spec import Spec
@@ -42,10 +41,8 @@
'always',
]
SPACK_PR_MIRRORS_ROOT_URL = 's3://spack-binaries-prs'
SPACK_SHARED_PR_MIRROR_URL = url_util.join(SPACK_PR_MIRRORS_ROOT_URL,
'shared_pr_mirror')
TEMP_STORAGE_MIRROR_NAME = 'ci_temporary_mirror'
SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
spack_gpg = spack.main.SpackCommand('gpg')
spack_compiler = spack.main.SpackCommand('compiler')
@@ -199,6 +196,11 @@ def _get_cdash_build_name(spec, build_group):
spec.name, spec.version, spec.compiler, spec.architecture, build_group)
def _remove_reserved_tags(tags):
"""Convenience function to strip reserved tags from jobs"""
return [tag for tag in tags if tag not in SPACK_RESERVED_TAGS]
def _get_spec_string(spec):
format_elements = [
'{name}{@version}',
@@ -231,8 +233,10 @@ def _add_dependency(spec_label, dep_label, deps):
deps[spec_label].add(dep_label)
def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False):
spec_deps_obj = _compute_spec_deps(specs, check_index_only=check_index_only)
def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False,
mirrors_to_check=None):
spec_deps_obj = _compute_spec_deps(specs, check_index_only=check_index_only,
mirrors_to_check=mirrors_to_check)
if spec_deps_obj:
dependencies = spec_deps_obj['dependencies']
@@ -249,7 +253,7 @@ def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False):
_add_dependency(entry['spec'], entry['depends'], deps)
def stage_spec_jobs(specs, check_index_only=False):
def stage_spec_jobs(specs, check_index_only=False, mirrors_to_check=None):
"""Take a set of release specs and generate a list of "stages", where the
jobs in any stage are dependent only on jobs in previous stages. This
allows us to maximize build parallelism within the gitlab-ci framework.
@@ -261,6 +265,8 @@ def stage_spec_jobs(specs, check_index_only=False):
are up to date on those mirrors. This flag limits that search to
the binary cache indices on those mirrors to speed the process up,
even though there is no guarantee the index is up to date.
mirrors_to_check: Optional mapping giving mirrors to check instead of
any configured mirrors.
Returns: A tuple of information objects describing the specs, dependencies
and stages:
@@ -297,8 +303,8 @@ def _remove_satisfied_deps(deps, satisfied_list):
deps = {}
spec_labels = {}
_get_spec_dependencies(
specs, deps, spec_labels, check_index_only=check_index_only)
_get_spec_dependencies(specs, deps, spec_labels, check_index_only=check_index_only,
mirrors_to_check=mirrors_to_check)
# Save the original deps, as we need to return them at the end of the
# function. In the while loop below, the "dependencies" variable is
@@ -340,7 +346,7 @@ def _print_staging_summary(spec_labels, dependencies, stages):
_get_spec_string(s)))
def _compute_spec_deps(spec_list, check_index_only=False):
def _compute_spec_deps(spec_list, check_index_only=False, mirrors_to_check=None):
"""
Computes all the dependencies for the spec(s) and generates a JSON
object which provides both a list of unique spec names as well as a
@@ -413,7 +419,7 @@ def append_dep(s, d):
continue
up_to_date_mirrors = bindist.get_mirrors_for_spec(
spec=s, index_only=check_index_only)
spec=s, mirrors_to_check=mirrors_to_check, index_only=check_index_only)
skey = _spec_deps_key(s)
spec_labels[skey] = {
@@ -602,8 +608,8 @@ def get_spec_filter_list(env, affected_pkgs, dependencies=True, dependents=True)
def generate_gitlab_ci_yaml(env, print_summary, output_file,
prune_dag=False, check_index_only=False,
run_optimizer=False, use_dependencies=False,
artifacts_root=None):
""" Generate a gitlab yaml file to run a dynamic chile pipeline from
artifacts_root=None, remote_mirror_override=None):
""" Generate a gitlab yaml file to run a dynamic child pipeline from
the spec matrix in the active environment.
Arguments:
@@ -629,6 +635,10 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
artifacts_root (str): Path where artifacts like logs, environment
files (spack.yaml, spack.lock), etc should be written. GitLab
requires this to be within the project directory.
remote_mirror_override (str): Typically only needed when one spack.yaml
is used to populate several mirrors with binaries, based on some
criteria. Spack protected pipelines populate different mirrors based
on branch name, facilitated by this option.
"""
with spack.concretize.disable_compiler_existence_check():
with env.write_transaction():
@@ -678,17 +688,19 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
for s in affected_specs:
tty.debug(' {0}'.format(s.name))
generate_job_name = os.environ.get('CI_JOB_NAME', None)
parent_pipeline_id = os.environ.get('CI_PIPELINE_ID', None)
# Downstream jobs will "need" (depend on, for both scheduling and
# artifacts, which include spack.lock file) this pipeline generation
# job by both name and pipeline id. If those environment variables
# do not exist, then maybe this is just running in a shell, in which
# case, there is no expectation gitlab will ever run the generated
# pipeline and those environment variables do not matter.
generate_job_name = os.environ.get('CI_JOB_NAME', 'job-does-not-exist')
parent_pipeline_id = os.environ.get('CI_PIPELINE_ID', 'pipeline-does-not-exist')
# Values: "spack_pull_request", "spack_protected_branch", or not set
spack_pipeline_type = os.environ.get('SPACK_PIPELINE_TYPE', None)
is_pr_pipeline = spack_pipeline_type == 'spack_pull_request'
spack_pr_branch = os.environ.get('SPACK_PR_BRANCH', None)
pr_mirror_url = None
if spack_pr_branch:
pr_mirror_url = url_util.join(SPACK_PR_MIRRORS_ROOT_URL,
spack_pr_branch)
spack_buildcache_copy = os.environ.get('SPACK_COPY_BUILDCACHE', None)
if 'mirrors' not in yaml_root or len(yaml_root['mirrors'].values()) < 1:
tty.die('spack ci generate requires an env containing a mirror')
@@ -743,14 +755,25 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'strip-compilers': False,
})
# Add per-PR mirror (and shared PR mirror) if enabled, as some specs might
# be up to date in one of those and thus not need to be rebuilt.
if pr_mirror_url:
spack.mirror.add(
'ci_pr_mirror', pr_mirror_url, cfg.default_modify_scope())
spack.mirror.add('ci_shared_pr_mirror',
SPACK_SHARED_PR_MIRROR_URL,
cfg.default_modify_scope())
# If a remote mirror override (alternate buildcache destination) was
# specified, add it here in case it has already built hashes we might
# generate.
mirrors_to_check = None
if remote_mirror_override:
if spack_pipeline_type == 'spack_protected_branch':
# Overriding the main mirror in this case might result
# in skipping jobs on a release pipeline because specs are
# up to date in develop. Eventually we want to notice and take
# advantage of this by scheduling a job to copy the spec from
# develop to the release, but until we have that, this makes
# sure we schedule a rebuild job if the spec isn't already in
# the override mirror.
mirrors_to_check = {
'override': remote_mirror_override
}
else:
spack.mirror.add(
'ci_pr_mirror', remote_mirror_override, cfg.default_modify_scope())
pipeline_artifacts_dir = artifacts_root
if not pipeline_artifacts_dir:
@@ -825,11 +848,13 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
phase_spec.concretize()
staged_phases[phase_name] = stage_spec_jobs(
concrete_phase_specs,
check_index_only=check_index_only)
check_index_only=check_index_only,
mirrors_to_check=mirrors_to_check)
finally:
# Clean up PR mirror if enabled
if pr_mirror_url:
spack.mirror.remove('ci_pr_mirror', cfg.default_modify_scope())
# Clean up remote mirror override if enabled
if remote_mirror_override:
if spack_pipeline_type != 'spack_protected_branch':
spack.mirror.remove('ci_pr_mirror', cfg.default_modify_scope())
all_job_names = []
output_object = {}
@@ -889,6 +914,14 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
tags = [tag for tag in runner_attribs['tags']]
if spack_pipeline_type is not None:
# For spack pipelines "public" and "protected" are reserved tags
tags = _remove_reserved_tags(tags)
if spack_pipeline_type == 'spack_protected_branch':
tags.extend(['aws', 'protected'])
elif spack_pipeline_type == 'spack_pull_request':
tags.extend(['public'])
variables = {}
if 'variables' in runner_attribs:
variables.update(runner_attribs['variables'])
@@ -1174,6 +1207,10 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
service_job_config,
cleanup_job)
if 'tags' in cleanup_job:
service_tags = _remove_reserved_tags(cleanup_job['tags'])
cleanup_job['tags'] = service_tags
cleanup_job['stage'] = 'cleanup-temp-storage'
cleanup_job['script'] = [
'spack -d mirror destroy --mirror-url {0}/$CI_PIPELINE_ID'.format(
@@ -1181,9 +1218,74 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
]
cleanup_job['when'] = 'always'
cleanup_job['retry'] = service_job_retries
cleanup_job['interruptible'] = True
output_object['cleanup'] = cleanup_job
if ('signing-job-attributes' in gitlab_ci and
spack_pipeline_type == 'spack_protected_branch'):
# External signing: generate a job to check and sign binary pkgs
stage_names.append('stage-sign-pkgs')
signing_job_config = gitlab_ci['signing-job-attributes']
signing_job = {}
signing_job_attrs_to_copy = [
'image',
'tags',
'variables',
'before_script',
'script',
'after_script',
]
_copy_attributes(signing_job_attrs_to_copy,
signing_job_config,
signing_job)
signing_job_tags = []
if 'tags' in signing_job:
signing_job_tags = _remove_reserved_tags(signing_job['tags'])
for tag in ['aws', 'protected', 'notary']:
if tag not in signing_job_tags:
signing_job_tags.append(tag)
signing_job['tags'] = signing_job_tags
signing_job['stage'] = 'stage-sign-pkgs'
signing_job['when'] = 'always'
signing_job['retry'] = {
'max': 2,
'when': ['always']
}
signing_job['interruptible'] = True
output_object['sign-pkgs'] = signing_job
if spack_buildcache_copy:
# Generate a job to copy the contents from wherever the builds are getting
# pushed to the url specified in the "SPACK_COPY_BUILDCACHE" environment
# variable.
src_url = remote_mirror_override or remote_mirror_url
dest_url = spack_buildcache_copy
stage_names.append('stage-copy-buildcache')
copy_job = {
'stage': 'stage-copy-buildcache',
'tags': ['spack', 'public', 'medium', 'aws', 'x86_64'],
'image': 'ghcr.io/spack/python-aws-bash:0.0.1',
'when': 'on_success',
'interruptible': True,
'retry': service_job_retries,
'script': [
'. ./share/spack/setup-env.sh',
'spack --version',
'aws s3 sync --exclude *index.json* --exclude *pgp* {0} {1}'.format(
src_url, dest_url)
]
}
output_object['copy-mirror'] = copy_job
if rebuild_index_enabled:
# Add a final job to regenerate the index
stage_names.append('stage-rebuild-index')
@@ -1194,9 +1296,13 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
service_job_config,
final_job)
if 'tags' in final_job:
service_tags = _remove_reserved_tags(final_job['tags'])
final_job['tags'] = service_tags
index_target_mirror = mirror_urls[0]
if is_pr_pipeline:
index_target_mirror = pr_mirror_url
if remote_mirror_override:
index_target_mirror = remote_mirror_override
final_job['stage'] = 'stage-rebuild-index'
final_job['script'] = [
@@ -1205,6 +1311,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
]
final_job['when'] = 'always'
final_job['retry'] = service_job_retries
final_job['interruptible'] = True
output_object['rebuild-index'] = final_job
@@ -1237,8 +1344,9 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'SPACK_PIPELINE_TYPE': str(spack_pipeline_type)
}
if pr_mirror_url:
output_object['variables']['SPACK_PR_MIRROR_URL'] = pr_mirror_url
if remote_mirror_override:
(output_object['variables']
['SPACK_REMOTE_MIRROR_OVERRIDE']) = remote_mirror_override
spack_stack_name = os.environ.get('SPACK_CI_STACK_NAME', None)
if spack_stack_name:
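Taken together, the ci.py hunks above change how pipeline generation decides which mirrors to consult when checking whether a spec already has a binary: on protected pipelines the override mirror is consulted exclusively, otherwise the override (if given) is simply one more configured mirror. A minimal sketch of that selection rule, in which the function name select_mirrors_for_rebuild_check and the configured dict are illustrative, not part of Spack's API:

    def select_mirrors_for_rebuild_check(pipeline_type, override, configured):
        """Decide which mirrors to consult when checking for existing binaries.

        pipeline_type: e.g. 'spack_protected_branch' or 'spack_pull_request'
        override:      value of --buildcache-destination, or None
        configured:    dict of mirror name -> url from the environment
        """
        if override and pipeline_type == 'spack_protected_branch':
            # Only the override mirror counts: a spec present in the main
            # mirror but absent from the override must still be rebuilt.
            return {'override': override}
        # Otherwise the override (if any) is just one more configured mirror.
        mirrors = dict(configured)
        if override:
            mirrors['override'] = override
        return mirrors

    # Example: a protected pipeline ignores the main mirror entirely.
    assert select_mirrors_for_rebuild_check(
        'spack_protected_branch', 's3://override', {'main': 's3://main'}
    ) == {'override': 's3://override'}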

View File

@@ -14,7 +14,7 @@
import spack.repo
import spack.stage
import spack.util.crypto
from spack.package import preferred_version
from spack.package_base import preferred_version
from spack.util.naming import valid_fully_qualified_module_name
from spack.version import Version, ver

View File

@@ -64,6 +64,11 @@ def setup_parser(subparser):
'--dependencies', action='store_true', default=False,
help="(Experimental) disable DAG scheduling; use "
' "plain" dependencies.')
generate.add_argument(
'--buildcache-destination', default=None,
help="Override the mirror configured in the environment (spack.yaml) " +
"in order to push binaries from the generated pipeline to a " +
"different location.")
prune_group = generate.add_mutually_exclusive_group()
prune_group.add_argument(
'--prune-dag', action='store_true', dest='prune_dag',
@@ -127,6 +132,7 @@ def ci_generate(args):
prune_dag = args.prune_dag
index_only = args.index_only
artifacts_root = args.artifacts_root
buildcache_destination = args.buildcache_destination
if not output_file:
output_file = os.path.abspath(".gitlab-ci.yml")
@@ -140,7 +146,8 @@ def ci_generate(args):
spack_ci.generate_gitlab_ci_yaml(
env, True, output_file, prune_dag=prune_dag,
check_index_only=index_only, run_optimizer=run_optimizer,
use_dependencies=use_dependencies, artifacts_root=artifacts_root)
use_dependencies=use_dependencies, artifacts_root=artifacts_root,
remote_mirror_override=buildcache_destination)
if copy_yaml_to:
copy_to_dir = os.path.dirname(copy_yaml_to)
@@ -180,6 +187,9 @@ def ci_rebuild(args):
if not gitlab_ci:
tty.die('spack ci rebuild requires an env containing gitlab-ci cfg')
tty.msg('SPACK_BUILDCACHE_DESTINATION={0}'.format(
os.environ.get('SPACK_BUILDCACHE_DESTINATION', None)))
# Grab the environment variables we need. These either come from the
# pipeline generation step ("spack ci generate"), where they were written
# out as variables, or else provided by GitLab itself.
@@ -196,7 +206,7 @@ def ci_rebuild(args):
compiler_action = get_env_var('SPACK_COMPILER_ACTION')
cdash_build_name = get_env_var('SPACK_CDASH_BUILD_NAME')
spack_pipeline_type = get_env_var('SPACK_PIPELINE_TYPE')
pr_mirror_url = get_env_var('SPACK_PR_MIRROR_URL')
remote_mirror_override = get_env_var('SPACK_REMOTE_MIRROR_OVERRIDE')
remote_mirror_url = get_env_var('SPACK_REMOTE_MIRROR_URL')
# Construct absolute paths relative to current $CI_PROJECT_DIR
@@ -244,6 +254,10 @@ def ci_rebuild(args):
tty.debug('Pipeline type - PR: {0}, develop: {1}'.format(
spack_is_pr_pipeline, spack_is_develop_pipeline))
# If no override url exists, then just push binary package to the
# normal remote mirror url.
buildcache_mirror_url = remote_mirror_override or remote_mirror_url
# Figure out what is our temporary storage mirror: Is it artifacts
# buildcache? Or temporary-storage-url-prefix? In some cases we need to
# force something or pipelines might not have a way to propagate build
@@ -373,7 +387,24 @@ def ci_rebuild(args):
cfg.default_modify_scope())
# Check configured mirrors for a built spec with a matching hash
matches = bindist.get_mirrors_for_spec(job_spec, index_only=False)
mirrors_to_check = None
if remote_mirror_override and spack_pipeline_type == 'spack_protected_branch':
# Passing "mirrors_to_check" below means we *only* look in the override
# mirror to see if we should skip building, which is what we want.
mirrors_to_check = {
'override': remote_mirror_override
}
# Adding this mirror to the list of configured mirrors means dependencies
# could be installed from either the override mirror or any other configured
# mirror (e.g. remote_mirror_url which is defined in the environment or
# pipeline_mirror_url), which is also what we want.
spack.mirror.add('mirror_override',
remote_mirror_override,
cfg.default_modify_scope())
matches = bindist.get_mirrors_for_spec(
job_spec, mirrors_to_check=mirrors_to_check, index_only=False)
if matches:
# Got a hash match on at least one configured mirror. All
@@ -517,13 +548,6 @@ def ci_rebuild(args):
# any logs from the staging directory to artifacts now
spack_ci.copy_stage_logs_to_artifacts(job_spec, job_log_dir)
# Create buildcache on remote mirror, either on pr-specific mirror or
# on the main mirror defined in the gitlab-enabled spack environment
if spack_is_pr_pipeline:
buildcache_mirror_url = pr_mirror_url
else:
buildcache_mirror_url = remote_mirror_url
# If the install succeeded, create a buildcache entry for this job spec
# and push it to one or more mirrors. If the install did not succeed,
# print out some instructions on how to reproduce this build failure
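The deleted branch above chose the push destination based on whether the pipeline was a PR pipeline; with the override mechanism this collapses to a single precedence rule. A sketch, with names following the hunks:

    # The override, when present, wins; otherwise push to the mirror
    # configured in the gitlab-enabled spack environment.
    def push_destination(remote_mirror_override, remote_mirror_url):
        return remote_mirror_override or remote_mirror_url

    assert push_destination(None, 's3://main') == 's3://main'
    assert push_destination('s3://dest', 's3://main') == 's3://dest'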

View File

@@ -57,7 +57,7 @@
# See the Spack documentation for more information on packaging.
# ----------------------------------------------------------------------------
from spack import *
from spack.package import *
class {class_name}({base_class_name}):

View File

@@ -11,7 +11,7 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.package
import spack.package_base
import spack.repo
import spack.store
@@ -57,7 +57,7 @@ def dependencies(parser, args):
else:
spec = specs[0]
dependencies = spack.package.possible_dependencies(
dependencies = spack.package_base.possible_dependencies(
spec,
transitive=args.transitive,
expand_virtuals=args.expand_virtuals,

View File

@@ -200,7 +200,7 @@ def external_list(args):
list(spack.repo.path.all_packages())
# Print all the detectable packages
tty.msg("Detectable packages per repository")
for namespace, pkgs in sorted(spack.package.detectable_packages.items()):
for namespace, pkgs in sorted(spack.package_base.detectable_packages.items()):
print("Repository:", namespace)
colify.colify(pkgs, indent=4, output=sys.stdout)

View File

@@ -18,7 +18,7 @@
import spack.fetch_strategy as fs
import spack.repo
import spack.spec
from spack.package import has_test_method, preferred_version
from spack.package_base import has_test_method, preferred_version
description = 'get detailed information on a particular package'
section = 'basic'
@@ -269,14 +269,14 @@ def print_tests(pkg):
names = []
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
if has_test_method(pkg_cls):
pkg_base = spack.package.PackageBase
pkg_base = spack.package_base.PackageBase
test_pkgs = [str(cls.test) for cls in inspect.getmro(pkg_cls) if
issubclass(cls, pkg_base) and cls.test != pkg_base.test]
test_pkgs = list(set(test_pkgs))
names.extend([(test.split()[1]).lower() for test in test_pkgs])
# TODO Refactor START
# Use code from package.py's test_process IF this functionality is
# Use code from package_base.py's test_process IF this functionality is
# accepted.
v_names = list(set([vspec.name for vspec in pkg.virtuals_provided]))
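The print_tests hunk relies on a general pattern: walk a class's MRO and keep every class whose test method differs from the base class's. A standalone sketch of that pattern (the Base/Middle/Leaf names are illustrative):

    import inspect

    class Base(object):
        def test(self):
            pass

    class Middle(Base):
        def test(self):  # overrides Base.test
            pass

    class Leaf(Middle):
        pass  # inherits Middle's override

    def classes_overriding(cls, base, method):
        # Every class in the MRO whose `method` differs from the base's.
        return [c for c in inspect.getmro(cls)
                if issubclass(c, base)
                and getattr(c, method) != getattr(base, method)]

    # Leaf appears because it *inherits* Middle's override; deduplicate
    # (as the hunk does with set()) to get unique implementations.
    assert classes_overriding(Leaf, Base, 'test') == [Leaf, Middle]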

View File

@@ -302,7 +302,7 @@ def install(parser, args, **kwargs):
)
reporter = spack.report.collect_info(
spack.package.PackageInstaller, '_install_task', args.log_format, args)
spack.package_base.PackageInstaller, '_install_task', args.log_format, args)
if args.log_file:
reporter.filename = args.log_file

View File

@@ -12,7 +12,7 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.error
import spack.package
import spack.package_base
import spack.repo
import spack.store
from spack.database import InstallStatuses

View File

@@ -18,7 +18,7 @@
import spack.config
import spack.environment
import spack.hash_types as ht
import spack.package
import spack.package_base
import spack.solver.asp as asp
description = "concretize a specs using an ASP solver"

View File

@@ -65,7 +65,7 @@ def is_package(f):
packages, since we allow `from spack import *` and poking globals
into packages.
"""
return f.startswith("var/spack/repos/")
return f.startswith("var/spack/repos/") and f.endswith('package.py')
#: decorator for adding tools to the list
@@ -236,7 +236,7 @@ def translate(match):
continue
if not args.root_relative and re_obj:
line = re_obj.sub(translate, line)
print(" " + line)
print(line)
def print_style_header(file_list, args):
@@ -290,20 +290,26 @@ def run_flake8(flake8_cmd, file_list, args):
@tool("mypy")
def run_mypy(mypy_cmd, file_list, args):
# always run with config from running spack prefix
mypy_args = [
common_mypy_args = [
"--config-file", os.path.join(spack.paths.prefix, "pyproject.toml"),
"--package", "spack",
"--package", "llnl",
"--show-error-codes",
]
# not yet, need other updates to enable this
# if any([is_package(f) for f in file_list]):
# mypy_args.extend(["--package", "packages"])
mypy_arg_sets = [common_mypy_args + [
"--package", "spack",
"--package", "llnl",
]]
if 'SPACK_MYPY_CHECK_PACKAGES' in os.environ:
mypy_arg_sets.append(common_mypy_args + [
'--package', 'packages',
'--disable-error-code', 'no-redef',
])
output = mypy_cmd(*mypy_args, fail_on_error=False, output=str)
returncode = mypy_cmd.returncode
returncode = 0
for mypy_args in mypy_arg_sets:
output = mypy_cmd(*mypy_args, fail_on_error=False, output=str)
returncode |= mypy_cmd.returncode
rewrite_and_print_output(output, args)
rewrite_and_print_output(output, args)
print_tool_result("mypy", returncode)
return returncode
@@ -318,16 +324,29 @@ def run_isort(isort_cmd, file_list, args):
pat = re.compile("ERROR: (.*) Imports are incorrectly sorted")
replacement = "ERROR: {0} Imports are incorrectly sorted"
returncode = 0
for chunk in grouper(file_list, 100):
packed_args = isort_args + tuple(chunk)
output = isort_cmd(*packed_args, fail_on_error=False, output=str, error=str)
returncode |= isort_cmd.returncode
returncode = [0]
rewrite_and_print_output(output, args, pat, replacement)
def process_files(file_list, is_args):
for chunk in grouper(file_list, 100):
packed_args = is_args + tuple(chunk)
output = isort_cmd(*packed_args, fail_on_error=False, output=str, error=str)
returncode[0] |= isort_cmd.returncode
print_tool_result("isort", returncode)
return returncode
rewrite_and_print_output(output, args, pat, replacement)
packages_isort_args = ('--rm', 'spack', '--rm', 'spack.pkgkit', '--rm',
'spack.package_defs', '-a', 'from spack.package import *')
packages_isort_args = packages_isort_args + isort_args
# packages
process_files(filter(is_package, file_list),
packages_isort_args)
# non-packages
process_files(filter(lambda f: not is_package(f), file_list),
isort_args)
print_tool_result("isort", returncode[0])
return returncode[0]
@tool("black")
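Both the mypy and isort changes use the same accumulation idiom: run the tool several times (once per argument set, or once per 100-file chunk) and OR the return codes together so any single failure fails the whole check. A distilled sketch:

    def run_in_chunks(run_tool, file_list, chunk_size=100):
        """Run `run_tool` over fixed-size chunks, OR-ing the exit codes."""
        returncode = 0
        for start in range(0, len(file_list), chunk_size):
            chunk = file_list[start:start + chunk_size]
            returncode |= run_tool(chunk)  # nonzero sticks once it appears
        return returncode

    # Example: the second of three chunks fails, so the result is nonzero.
    results = iter([0, 1, 0])
    assert run_in_chunks(lambda chunk: next(results), list(range(250))) == 1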

View File

@@ -20,7 +20,7 @@
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.install_test
import spack.package
import spack.package_base
import spack.repo
import spack.report
@@ -189,7 +189,7 @@ def test_run(args):
# Set up reporter
setattr(args, 'package', [s.format() for s in test_suite.specs])
reporter = spack.report.collect_info(
spack.package.PackageBase, 'do_test', args.log_format, args)
spack.package_base.PackageBase, 'do_test', args.log_format, args)
if not reporter.filename:
if args.log_file:
if os.path.isabs(args.log_file):
@@ -217,7 +217,7 @@ def test_list(args):
else set()
def has_test_and_tags(pkg_class):
return spack.package.has_test_method(pkg_class) and \
return spack.package_base.has_test_method(pkg_class) and \
(not args.tag or pkg_class.name in tagged)
if args.list_all:

View File

@@ -24,7 +24,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.17"
tutorial_branch = "releases/v0.18"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")

View File

@@ -15,7 +15,7 @@
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.error
import spack.package
import spack.package_base
import spack.repo
import spack.store
from spack.database import InstallStatuses
@@ -221,7 +221,7 @@ def do_uninstall(env, specs, force):
except spack.repo.UnknownEntityError:
# The package.py file has gone away -- but still
# want to uninstall.
spack.package.Package.uninstall_by_spec(item, force=True)
spack.package_base.Package.uninstall_by_spec(item, force=True)
# A package is ready to be uninstalled when nothing else references it,
# unless we are requested to force uninstall it.

View File

@@ -422,7 +422,7 @@ def url_list_parsing(args, urls, url, pkg):
urls (set): Set of URLs that have already been added
url (str or None): A URL to potentially add to ``urls`` depending on
``args``
pkg (spack.package.PackageBase): The Spack package
pkg (spack.package_base.PackageBase): The Spack package
Returns:
set: The updated set of ``urls``
@@ -470,7 +470,7 @@ def name_parsed_correctly(pkg, name):
"""Determine if the name of a package was correctly parsed.
Args:
pkg (spack.package.PackageBase): The Spack package
pkg (spack.package_base.PackageBase): The Spack package
name (str): The name that was extracted from the URL
Returns:
@@ -487,7 +487,7 @@ def version_parsed_correctly(pkg, version):
"""Determine if the version of a package was correctly parsed.
Args:
pkg (spack.package.PackageBase): The Spack package
pkg (spack.package_base.PackageBase): The Spack package
version (str): The version that was extracted from the URL
Returns:

View File

@@ -105,9 +105,6 @@
'build_stage': '$tempdir/spack-stage',
'concretizer': 'clingo',
'license_dir': spack.paths.default_license_dir,
'flags': {
'keep_werror': 'none',
},
}
}

View File

@@ -240,7 +240,7 @@ def compute_windows_program_path_for_package(pkg):
program files location, return list of best guesses
Args:
pkg (spack.package.Package): package for which
pkg (spack.package_base.Package): package for which
Program Files location is to be computed
"""
if not is_windows:

View File

@@ -1556,7 +1556,7 @@ def _extrapolate(pkg, version):
try:
return URLFetchStrategy(pkg.url_for_version(version),
fetch_options=pkg.fetch_options)
except spack.package.NoURLError:
except spack.package_base.NoURLError:
msg = ("Can't extrapolate a URL for version %s "
"because package %s defines no URLs")
raise ExtrapolationError(msg % (version, pkg.name))

View File

@@ -95,10 +95,7 @@ def view_copy(src, dst, view, spec=None):
view.get_projection_for_spec(dep)
if spack.relocate.is_binary(dst):
spack.relocate.relocate_text_bin(
binaries=[dst],
prefixes=prefix_to_projection
)
spack.relocate.relocate_text_bin([dst], prefix_to_projection)
else:
prefix_to_projection[spack.store.layout.root] = view._root
prefix_to_projection[orig_sbang] = new_sbang

View File

@@ -50,7 +50,7 @@
import spack.error
import spack.hooks
import spack.monitor
import spack.package
import spack.package_base
import spack.package_prefs as prefs
import spack.repo
import spack.store
@@ -103,7 +103,7 @@ def _check_last_phase(pkg):
package already.
Args:
pkg (spack.package.PackageBase): the package being installed
pkg (spack.package_base.PackageBase): the package being installed
Raises:
``BadInstallPhase`` if stop_before or last phase is invalid
@@ -125,7 +125,7 @@ def _handle_external_and_upstream(pkg, explicit):
database if it is an external package.
Args:
pkg (spack.package.Package): the package whose installation is under
pkg (spack.package_base.Package): the package whose installation is under
consideration
explicit (bool): the package was explicitly requested by the user
Return:
@@ -265,7 +265,7 @@ def _install_from_cache(pkg, cache_only, explicit, unsigned=False):
Extract the package from binary cache
Args:
pkg (spack.package.PackageBase): the package to install from the binary cache
pkg (spack.package_base.PackageBase): package to install from the binary cache
cache_only (bool): only extract from binary cache
explicit (bool): ``True`` if installing the package was explicitly
requested by the user, otherwise, ``False``
@@ -355,7 +355,7 @@ def _process_binary_cache_tarball(pkg, binary_spec, explicit, unsigned,
Process the binary cache tarball.
Args:
pkg (spack.package.PackageBase): the package being installed
pkg (spack.package_base.PackageBase): the package being installed
binary_spec (spack.spec.Spec): the spec whose cache has been confirmed
explicit (bool): the package was explicitly requested by the user
unsigned (bool): ``True`` if binary package signatures to be checked,
@@ -394,7 +394,7 @@ def _try_install_from_binary_cache(pkg, explicit, unsigned=False):
Try to extract the package from binary cache.
Args:
pkg (spack.package.PackageBase): the package to be extracted from binary cache
pkg (spack.package_base.PackageBase): package to be extracted from binary cache
explicit (bool): the package was explicitly requested by the user
unsigned (bool): ``True`` if binary package signatures to be checked,
otherwise, ``False``
@@ -530,7 +530,7 @@ def log(pkg):
Copy provenance into the install directory on success
Args:
pkg (spack.package.Package): the package that was built and installed
pkg (spack.package_base.Package): the package that was built and installed
"""
packages_dir = spack.store.layout.build_packages_path(pkg.spec)
@@ -616,7 +616,7 @@ def package_id(pkg):
and packages for combinatorial environments.
Args:
pkg (spack.package.PackageBase): the package from which the identifier is
pkg (spack.package_base.PackageBase): the package from which the identifier is
derived
"""
if not pkg.spec.concrete:
@@ -769,7 +769,7 @@ def _add_bootstrap_compilers(
Args:
compiler: the compiler to bootstrap
architecture: the architecture for which to bootstrap the compiler
pkgs (spack.package.PackageBase): the package with possible compiler
pkgs (spack.package_base.PackageBase): the package with possible compiler
dependencies
request (BuildRequest): the associated install request
all_deps (defaultdict(set)): dictionary of all dependencies and
@@ -786,7 +786,7 @@ def _add_init_task(self, pkg, request, is_compiler, all_deps):
Creates and queues the initial build task for the package.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
request (BuildRequest or None): the associated install request
where ``None`` can be used to indicate the package was
explicitly requested by the user
@@ -968,7 +968,7 @@ def _cleanup_task(self, pkg):
Cleanup the build task for the spec
Args:
pkg (spack.package.PackageBase): the package being installed
pkg (spack.package_base.PackageBase): the package being installed
"""
self._remove_task(package_id(pkg))
@@ -982,7 +982,7 @@ def _ensure_install_ready(self, pkg):
already locked.
Args:
pkg (spack.package.PackageBase): the package being locally installed
pkg (spack.package_base.PackageBase): the package being locally installed
"""
pkg_id = package_id(pkg)
pre = "{0} cannot be installed locally:".format(pkg_id)
@@ -1014,7 +1014,8 @@ def _ensure_locked(self, lock_type, pkg):
Args:
lock_type (str): 'read' for a read lock, 'write' for a write lock
pkg (spack.package.PackageBase): the package whose spec is being installed
pkg (spack.package_base.PackageBase): the package whose spec is being
installed
Return:
(lock_type, lock) tuple where lock will be None if it could not
@@ -1228,7 +1229,7 @@ def _install_task(self, task):
# Create a child process to do the actual installation.
# Preserve verbosity settings across installs.
spack.package.PackageBase._verbose = (
spack.package_base.PackageBase._verbose = (
spack.build_environment.start_build_process(
pkg, build_process, install_args)
)
@@ -1373,7 +1374,7 @@ def _setup_install_dir(self, pkg):
Write a small metadata file with the current spack environment.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
"""
if not os.path.exists(pkg.spec.prefix):
tty.debug('Creating the installation directory {0}'.format(pkg.spec.prefix))
@@ -1447,7 +1448,7 @@ def _flag_installed(self, pkg, dependent_ids=None):
known dependents.
Args:
pkg (spack.package.Package): Package that has been installed locally,
pkg (spack.package_base.Package): Package that has been installed locally,
externally or upstream
dependent_ids (list or None): list of the package's
dependent ids, or None if the dependent ids are limited to
@@ -1536,7 +1537,7 @@ def install(self):
Install the requested package(s) and/or associated dependencies.
Args:
pkg (spack.package.Package): the package to be built and installed"""
pkg (spack.package_base.Package): the package to be built and installed"""
self._init_queue()
fail_fast_err = 'Terminating after first install failure'
@@ -1788,7 +1789,7 @@ def __init__(self, pkg, install_args):
process in the build.
Arguments:
pkg (spack.package.PackageBase) the package being installed.
pkg (spack.package_base.PackageBase) the package being installed.
install_args (dict) arguments to do_install() from parent process.
"""
@@ -1848,8 +1849,8 @@ def run(self):
# get verbosity from do_install() parameter or saved value
self.echo = self.verbose
if spack.package.PackageBase._verbose is not None:
self.echo = spack.package.PackageBase._verbose
if spack.package_base.PackageBase._verbose is not None:
self.echo = spack.package_base.PackageBase._verbose
self.pkg.stage.keep = self.keep_stage
@@ -2001,7 +2002,7 @@ def build_process(pkg, install_args):
This function's return value is returned to the parent process.
Arguments:
pkg (spack.package.PackageBase): the package being installed.
pkg (spack.package_base.PackageBase): the package being installed.
install_args (dict): arguments to do_install() from parent process.
"""
@@ -2049,7 +2050,7 @@ def __init__(self, pkg, request, compiler, start, attempts, status,
Instantiate a build task for a package.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
request (BuildRequest or None): the associated install request
where ``None`` can be used to indicate the package was
explicitly requested by the user
@@ -2062,7 +2063,7 @@ def __init__(self, pkg, request, compiler, start, attempts, status,
"""
# Ensure dealing with a package that has a concrete spec
if not isinstance(pkg, spack.package.PackageBase):
if not isinstance(pkg, spack.package_base.PackageBase):
raise ValueError("{0} must be a package".format(str(pkg)))
self.pkg = pkg
@@ -2240,11 +2241,11 @@ def __init__(self, pkg, install_args):
Instantiate a build request for a package.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
install_args (dict): the install arguments associated with ``pkg``
"""
# Ensure dealing with a package that has a concrete spec
if not isinstance(pkg, spack.package.PackageBase):
if not isinstance(pkg, spack.package_base.PackageBase):
raise ValueError("{0} must be a package".format(str(pkg)))
self.pkg = pkg
@@ -2314,7 +2315,7 @@ def get_deptypes(self, pkg):
"""Determine the required dependency types for the associated package.
Args:
pkg (spack.package.PackageBase): explicit or implicit package being
pkg (spack.package_base.PackageBase): explicit or implicit package being
installed
Returns:
@@ -2337,7 +2338,7 @@ def run_tests(self, pkg):
"""Determine if the tests should be run for the provided packages
Args:
pkg (spack.package.PackageBase): explicit or implicit package being
pkg (spack.package_base.PackageBase): explicit or implicit package being
installed
Returns:

View File

@@ -196,6 +196,14 @@ def provides(self):
if self.spec.name == 'llvm-amdgpu':
provides['compiler'] = spack.spec.CompilerSpec(str(self.spec))
provides['compiler'].name = 'rocmcc'
# Special case for oneapi
if self.spec.name == 'intel-oneapi-compilers':
provides['compiler'] = spack.spec.CompilerSpec(str(self.spec))
provides['compiler'].name = 'oneapi'
# Special case for oneapi classic
if self.spec.name == 'intel-oneapi-compilers-classic':
provides['compiler'] = spack.spec.CompilerSpec(str(self.spec))
provides['compiler'].name = 'intel'
# All the other tokens in the hierarchy must be virtual dependencies
for x in self.hierarchy_tokens:
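These special cases map a compiler-providing package name onto the compiler name Spack's module hierarchy expects. The cascading ifs could equally be written table-driven; a hypothetical equivalent (the dict and the fall-back-to-package-name behavior below are illustrative assumptions, not Spack API):

    # Hypothetical table-driven form of the special cases above.
    COMPILER_NAME_FOR_PACKAGE = {
        'llvm-amdgpu': 'rocmcc',
        'intel-oneapi-compilers': 'oneapi',
        'intel-oneapi-compilers-classic': 'intel',
    }

    def compiler_name_for(spec_name):
        # Assumed fallback: use the package name itself (e.g. 'gcc').
        return COMPILER_NAME_FOR_PACKAGE.get(spec_name, spec_name)

    assert compiler_name_for('intel-oneapi-compilers') == 'oneapi'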

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -356,7 +356,7 @@ def patch_for_package(self, sha256, pkg):
Arguments:
sha256 (str): sha256 hash to look up
pkg (spack.package.Package): Package object to get patch for.
pkg (spack.package_base.Package): Package object to get patch for.
We build patch objects lazily because building them requires that
we have information about the package's location in its repo.

View File

@@ -1,83 +0,0 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# flake8: noqa: F401
"""pkgkit is a set of useful build tools and directives for packages.
Everything in this module is automatically imported into Spack package files.
"""
import llnl.util.filesystem
from llnl.util.filesystem import *
import spack.directives
import spack.util.executable
from spack.build_systems.aspell_dict import AspellDictPackage
from spack.build_systems.autotools import AutotoolsPackage
from spack.build_systems.cached_cmake import (
CachedCMakePackage,
cmake_cache_option,
cmake_cache_path,
cmake_cache_string,
)
from spack.build_systems.cmake import CMakePackage
from spack.build_systems.cuda import CudaPackage
from spack.build_systems.gnu import GNUMirrorPackage
from spack.build_systems.intel import IntelPackage
from spack.build_systems.lua import LuaPackage
from spack.build_systems.makefile import MakefilePackage
from spack.build_systems.maven import MavenPackage
from spack.build_systems.meson import MesonPackage
from spack.build_systems.octave import OctavePackage
from spack.build_systems.oneapi import (
IntelOneApiLibraryPackage,
IntelOneApiPackage,
IntelOneApiStaticLibraryList,
)
from spack.build_systems.perl import PerlPackage
from spack.build_systems.python import PythonPackage
from spack.build_systems.qmake import QMakePackage
from spack.build_systems.r import RPackage
from spack.build_systems.racket import RacketPackage
from spack.build_systems.rocm import ROCmPackage
from spack.build_systems.ruby import RubyPackage
from spack.build_systems.scons import SConsPackage
from spack.build_systems.sip import SIPPackage
from spack.build_systems.sourceforge import SourceforgePackage
from spack.build_systems.sourceware import SourcewarePackage
from spack.build_systems.waf import WafPackage
from spack.build_systems.xorg import XorgPackage
from spack.dependency import all_deptypes
from spack.directives import *
from spack.install_test import get_escaped_text_output
from spack.installer import (
ExternalPackageError,
InstallError,
InstallLockError,
UpstreamPackageError,
)
from spack.mixins import filter_compiler_wrappers
from spack.multimethod import when
from spack.package import (
BundlePackage,
DependencyConflictError,
Package,
build_system_flags,
env_flags,
flatten_dependencies,
inject_flags,
install_dependency_symlinks,
on_package_attributes,
run_after,
run_before,
)
from spack.spec import InvalidSpecDetected, Spec
from spack.util.executable import *
from spack.variant import (
any_combination_of,
auto_or_any_combination_of,
conditional,
disjoint_sets,
)
from spack.version import Version, ver

View File

@@ -469,47 +469,6 @@ def _replace_prefix_text(filename, compiled_prefixes):
f.truncate()
def _replace_prefix_bin(filename, byte_prefixes):
"""Replace all the occurrences of the old install prefix with a
new install prefix in binary files.
The new install prefix is prefixed with ``os.sep`` until the
lengths of the prefixes are the same.
Args:
filename (str): target binary file
byte_prefixes (OrderedDict): OrderedDictionary where the keys are
precompiled regex of the old prefixes and the values are the new
prefixes (utf-8 encoded)
"""
with open(filename, 'rb+') as f:
data = f.read()
f.seek(0)
for orig_bytes, new_bytes in byte_prefixes.items():
original_data_len = len(data)
# Skip this hassle if not found
if orig_bytes not in data:
continue
# We only care about this problem if we are about to replace
length_compatible = len(new_bytes) <= len(orig_bytes)
if not length_compatible:
tty.debug('Binary failing to relocate is %s' % filename)
raise BinaryTextReplaceError(orig_bytes, new_bytes)
pad_length = len(orig_bytes) - len(new_bytes)
padding = os.sep * pad_length
padding = padding.encode('utf-8')
data = data.replace(orig_bytes, new_bytes + padding)
# Really needs to be the same length
if not len(data) == original_data_len:
print('Length of pad:', pad_length, 'should be', len(padding))
print(new_bytes, 'was to replace', orig_bytes)
raise BinaryStringReplacementError(
filename, original_data_len, len(data))
f.write(data)
f.truncate()
def relocate_macho_binaries(path_names, old_layout_root, new_layout_root,
prefix_to_prefix, rel, old_prefix, new_prefix):
"""
@@ -817,49 +776,6 @@ def relocate_text(files, prefixes, concurrency=32):
tp.join()
def relocate_text_bin(binaries, prefixes, concurrency=32):
"""Replace null terminated path strings hard coded into binaries.
The new install prefix must be shorter than the original one.
Args:
binaries (list): binaries to be relocated
prefixes (OrderedDict): String prefixes which need to be changed.
concurrency (int): Desired degree of parallelism.
Raises:
BinaryTextReplaceError: when the new path is longer than the old path
"""
byte_prefixes = collections.OrderedDict({})
for orig_prefix, new_prefix in prefixes.items():
if orig_prefix != new_prefix:
if isinstance(orig_prefix, bytes):
orig_bytes = orig_prefix
else:
orig_bytes = orig_prefix.encode('utf-8')
if isinstance(new_prefix, bytes):
new_bytes = new_prefix
else:
new_bytes = new_prefix.encode('utf-8')
byte_prefixes[orig_bytes] = new_bytes
# Do relocations on text in binaries that refers to the install tree
# multiprocesing.ThreadPool.map requires single argument
args = []
for binary in binaries:
args.append((binary, byte_prefixes))
tp = multiprocessing.pool.ThreadPool(processes=concurrency)
try:
tp.map(llnl.util.lang.star(_replace_prefix_bin), args)
finally:
tp.terminate()
tp.join()
def is_relocatable(spec):
"""Returns True if an installed spec is relocatable.
@@ -1126,3 +1042,120 @@ def fixup_macos_rpaths(spec):
))
else:
tty.debug('No rpath fixup needed for ' + specname)
def compute_indices(filename, paths_to_relocate):
"""
Compute the indices in filename at which each of paths_to_relocate occurs.
Arguments:
filename (str): file to compute indices for
paths_to_relocate (List[str]): paths to find indices of
Returns:
Dict[int, str]: mapping from byte offset in the file to the path found there
"""
with open(filename, 'rb') as f:
contents = f.read()
substring_prefix = os.path.commonprefix(paths_to_relocate).encode('utf-8')
indices = {}
index = 0
max_length = max(len(path) for path in paths_to_relocate)
while True:
try:
# We search for the smallest substring of all paths we relocate
# In practice, this is the spack install root, and we relocate
# prefixes in the root and the root itself
index = contents.index(substring_prefix, index)
except ValueError:
# The string isn't found in the rest of the binary
break
else:
# only copy the smallest portion of the binary for comparisons
substring_to_check = contents[index:index + max_length]
for path in paths_to_relocate:
# We guarantee any substring in the list comes after any superstring
p = path.encode('utf-8')
if substring_to_check.startswith(p):
indices[index] = str(path)
index += len(path)
break
else:
index += 1
return indices
def _relocate_binary_text(filename, offsets, prefix_to_prefix):
"""
Relocate the text of a single binary file, given the offsets at which the
replacements need to be made
Arguments:
filename (str): file to modify
offsets (Dict[int, str]): locations of the strings to replace
prefix_to_prefix (Dict[str, str]): strings to replace and their replacements
"""
with open(filename, 'rb+') as f:
for index, prefix in offsets.items():
replacement = prefix_to_prefix[prefix].encode('utf-8')
if len(replacement) > len(prefix):
raise BinaryTextReplaceError(prefix, replacement)
# read forward until we find the end of the string including
# the prefix and compute the replacement as we go
f.seek(index + len(prefix))
c = f.read(1)
while c not in (None, b'\x00'):
replacement += c
c = f.read(1)
# seek back to the index position and write the replacement in
# and add null-terminator
f.seek(index)
f.write(replacement)
f.write(b'\x00')
def relocate_text_bin(
files_to_relocate, prefix_to_prefix, offsets=None,
relative_root=None, concurrency=32
):
"""
For each file given, replace all keys in the given translation dict with
the associated values. Optionally executes using precomputed memoized offsets
for the substitutions.
Arguments:
files_to_relocate (List[str]): The files to modify
prefix_to_prefix (Dict[str, str]): keys are strings to replace, values are
replacements
offsets (Dict[str, Dict[int, str]]): (optional) Mapping from relative filenames to
a mapping from indices to strings to replace found at each index
relative_root (str): (optional) prefix for relative paths in offsets
"""
# defaults to the common prefix of all input files
rel_root = relative_root or os.path.commonprefix(files_to_relocate)
if offsets is None:
offsets = {}
for filename in files_to_relocate:
indices = compute_indices(
filename,
list(prefix_to_prefix.keys()),
)
relpath = os.path.relpath(filename, rel_root)
offsets[relpath] = indices
args = [
(filename, offsets[os.path.relpath(filename, rel_root)], prefix_to_prefix)
for filename in files_to_relocate
]
tp = multiprocessing.pool.ThreadPool(processes=concurrency)
try:
tp.map(llnl.util.lang.star(_relocate_binary_text), args)
finally:
tp.terminate()
tp.join()
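The point of this rewrite is that scanning a binary for prefixes is the expensive step, so the scan results (byte offsets) can be memoized and reused, while the per-file patching is parallelized with a thread pool. A self-contained sketch of the two-phase idea on an in-memory "binary" (simplified: it slash-pads in place, like the deleted _replace_prefix_bin, rather than rewriting replacement + suffix + NUL terminator as _relocate_binary_text does):

    OLD = b'/old/spack/root/pkg'
    NEW = b'/new/root/pkg'

    def compute_offsets(data, needle):
        """Phase 1 (expensive, memoizable): find every offset of `needle`."""
        offsets, i = [], data.find(needle)
        while i != -1:
            offsets.append(i)
            i = data.find(needle, i + 1)
        return offsets

    def relocate_at(data, offsets, old, new):
        """Phase 2 (cheap): patch at known offsets, padding with os.sep so
        any path suffix after the prefix stays part of a valid path."""
        assert len(new) <= len(old), "replacement must not grow the binary"
        buf = bytearray(data)
        pad = b'/' * (len(old) - len(new))
        for i in offsets:
            buf[i:i + len(old)] = new + pad
        return bytes(buf)

    data = b'header\x00' + OLD + b'/lib\x00trailer\x00' + OLD + b'\x00'
    offsets = compute_offsets(data, OLD)   # memoize this per file
    patched = relocate_at(data, offsets, OLD, NEW)
    assert NEW in patched and len(patched) == len(data)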

View File

@@ -202,17 +202,11 @@ class RepoLoader(_PrependFileLoader):
"""Loads a Python module associated with a package in specific repository"""
#: Code in ``_package_prepend`` is prepended to imported packages.
#:
#: Spack packages were originally expected to call `from spack import *`
#: themselves, but it became difficult to manage, and imports in the Spack
#: core polluted the top-level namespace with package symbols this way. To
#: solve this, the top-level ``spack`` package contains very few symbols
#: of its own, and importing ``*`` is essentially a no-op. The common
#: routines and directives that packages need are now in ``spack.pkgkit``,
#: and the import system forces packages to automatically include
#: this. This way, old packages that call ``from spack import *`` will
#: continue to work without modification, but it's no longer required.
#: Spack packages are expected to call `from spack.package import *`
#: themselves, but we are allowing a deprecation period before breaking
#: external repos that don't do this yet.
_package_prepend = ('from __future__ import absolute_import;'
'from spack.pkgkit import *')
'from spack.package import *')
def __init__(self, fullname, repo, package_name):
self.repo = repo
@@ -450,10 +444,10 @@ def is_package_file(filename):
# Package files are named `package.py` and are not in lib/spack/spack
# We have to remove the file extension because it can be .py and can be
# .pyc depending on context, and can differ between the files
import spack.package # break cycle
import spack.package_base # break cycle
filename_noext = os.path.splitext(filename)[0]
packagebase_filename_noext = os.path.splitext(
inspect.getfile(spack.package.PackageBase))[0]
inspect.getfile(spack.package_base.PackageBase))[0]
return (filename_noext != packagebase_filename_noext and
os.path.basename(filename_noext) == 'package')

View File

@@ -15,7 +15,7 @@
import spack.build_environment
import spack.fetch_strategy
import spack.package
import spack.package_base
from spack.install_test import TestSuite
from spack.reporter import Reporter
from spack.reporters.cdash import CDash
@@ -131,7 +131,7 @@ def gather_info(do_fn):
"""
@functools.wraps(do_fn)
def wrapper(instance, *args, **kwargs):
if isinstance(instance, spack.package.PackageBase):
if isinstance(instance, spack.package_base.PackageBase):
pkg = instance
elif hasattr(args[0], 'pkg'):
pkg = args[0].pkg

View File

@@ -22,7 +22,7 @@
import spack.build_environment
import spack.fetch_strategy
import spack.package
import spack.package_base
from spack.error import SpackError
from spack.reporter import Reporter
from spack.util.crypto import checksum

View File

@@ -7,7 +7,7 @@
import spack.build_environment
import spack.fetch_strategy
import spack.package
import spack.package_base
from spack.reporter import Reporter
__all__ = ['JUnit']

View File

@@ -93,8 +93,10 @@ def rewire_node(spec, explicit):
False,
spec.build_spec.prefix,
spec.prefix)
relocate.relocate_text_bin(binaries=bins_to_relocate,
prefixes=prefix_to_prefix)
# Relocate text strings of prefixes embedded in binaries
relocate.relocate_text_bin(bins_to_relocate, prefix_to_prefix)
# Copy package into place, except for spec.json (because spec.json
# describes the old spec and not the new spliced spec).
shutil.copytree(os.path.join(tempdir, spec.dag_hash()), spec.prefix,

View File

@@ -91,16 +91,7 @@
'additional_external_search_paths': {
'type': 'array',
'items': {'type': 'string'}
},
'flags': {
'type': 'object',
'properties': {
'keep_werror': {
'type': 'string',
'enum': ['all', 'specific', 'none'],
},
},
},
}
},
'deprecatedProperties': {
'properties': ['module_roots'],

View File

@@ -110,6 +110,7 @@
},
},
'service-job-attributes': runner_selector_schema,
'signing-job-attributes': runner_selector_schema,
'rebuild-index': {'type': 'boolean'},
'broken-specs-url': {'type': 'string'},
},

View File

@@ -41,7 +41,7 @@
import spack.directives
import spack.environment as ev
import spack.error
import spack.package
import spack.package_base
import spack.package_prefs
import spack.platforms
import spack.repo
@@ -716,6 +716,7 @@ def __init__(self, tests=False):
self.variant_values_from_specs = set()
self.version_constraints = set()
self.target_constraints = set()
self.default_targets = {}
self.compiler_version_constraints = set()
self.post_facts = []
@@ -1164,7 +1165,7 @@ def preferred_variants(self, pkg_name):
pkg_name, variant.name, value
))
def preferred_targets(self, pkg_name):
def target_preferences(self, pkg_name):
key_fn = spack.package_prefs.PackagePrefs(pkg_name, 'target')
if not self.target_specs_cache:
@@ -1175,13 +1176,25 @@ def preferred_targets(self, pkg_name):
target_specs = self.target_specs_cache
preferred_targets = [x for x in target_specs if key_fn(x) < 0]
if not preferred_targets:
return
preferred = preferred_targets[0]
self.gen.fact(fn.package_target_weight(
str(preferred.architecture.target), pkg_name, -30
))
for i, preferred in enumerate(preferred_targets):
self.gen.fact(fn.package_target_weight(
str(preferred.architecture.target), pkg_name, i
))
# generate weights for non-preferred targets on a per-package basis
default_targets = {
name: weight for
name, weight in self.default_targets.items()
if not any(preferred.architecture.target.name == name
for preferred in preferred_targets)
}
num_preferred = len(preferred_targets)
for name, weight in default_targets.items():
self.gen.fact(fn.default_target_weight(
name, pkg_name, weight + num_preferred + 30
))
def flag_defaults(self):
self.gen.h2("Compiler flag defaults")
@@ -1572,7 +1585,7 @@ def target_defaults(self, specs):
compiler.name, compiler.version, uarch.family.name
))
i = 0
i = 0 # TODO compute per-target offset?
for target in candidate_targets:
self.gen.fact(fn.target(target.name))
self.gen.fact(fn.target_family(target.name, target.family.name))
@@ -1581,11 +1594,13 @@ def target_defaults(self, specs):
# prefer best possible targets; weight others poorly so
# they're not used unless set explicitly
# these are stored to be generated as facts later offset by the
# number of preferred targets
if target.name in best_targets:
self.gen.fact(fn.default_target_weight(target.name, i))
self.default_targets[target.name] = i
i += 1
else:
self.gen.fact(fn.default_target_weight(target.name, 100))
self.default_targets[target.name] = 100
self.gen.newline()
@@ -1807,7 +1822,7 @@ def setup(self, driver, specs, reuse=None):
self.possible_virtuals = set(
x.name for x in specs if x.virtual
)
possible = spack.package.possible_dependencies(
possible = spack.package_base.possible_dependencies(
*specs,
virtuals=self.possible_virtuals,
deptype=spack.dependency.all_deptypes
@@ -1862,7 +1877,7 @@ def setup(self, driver, specs, reuse=None):
self.pkg_rules(pkg, tests=self.tests)
self.gen.h2('Package preferences: %s' % pkg)
self.preferred_variants(pkg)
self.preferred_targets(pkg)
self.target_preferences(pkg)
# Inject dev_path from environment
env = ev.active_environment()
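The effect of target_preferences is easiest to see numerically: preferred targets get weights 0, 1, ... in preference order, and every remaining default target keeps its generic weight shifted past the preferred block plus a constant 30, so preferences dominate. A sketch of the arithmetic (names follow the hunk; this is not the solver API):

    def per_package_target_weights(preferred, default_targets):
        """preferred: target names in preference order for this package.
        default_targets: dict of target name -> generic default weight."""
        weights = {name: i for i, name in enumerate(preferred)}
        offset = len(preferred) + 30
        for name, weight in default_targets.items():
            if name not in weights:
                weights[name] = weight + offset
        return weights

    # 'zen2' is preferred, so it beats every shifted default target.
    w = per_package_target_weights(['zen2'], {'zen2': 0, 'x86_64': 100})
    assert w == {'zen2': 0, 'x86_64': 131}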

View File

@@ -779,10 +779,13 @@ target_compatible(Descendent, Ancestor)
#defined target_satisfies/2.
#defined target_parent/2.
% The target weight is either the default target weight
% or a more specific per-package weight if set
% If the package does not have any specific weight for this
% target, offset the default weights by the number of specific
% weights and use that. We additionally offset by 30 to ensure
% preferences are propagated even against large numbers of
% otherwise "better" matches.
target_weight(Target, Package, Weight)
:- default_target_weight(Target, Weight),
:- default_target_weight(Target, Package, Weight),
node(Package),
not derive_target_from_parent(_, Package),
not package_target_weight(Target, Package, _).
@@ -1238,7 +1241,7 @@ opt_criterion(1, "non-preferred targets").
%-----------------
#heuristic version(Package, Version) : version_declared(Package, Version, 0), node(Package). [10, true]
#heuristic version_weight(Package, 0) : version_declared(Package, Version, 0), node(Package). [10, true]
#heuristic node_target(Package, Target) : default_target_weight(Target, 0), node(Package). [10, true]
#heuristic node_target(Package, Target) : package_target_weight(Target, Package, 0), node(Package). [10, true]
#heuristic node_target_weight(Package, 0) : node(Package). [10, true]
#heuristic variant_value(Package, Variant, Value) : variant_default_value(Package, Variant, Value), node(Package). [10, true]
#heuristic provider(Package, Virtual) : possible_provider_weight(Package, Virtual, 0, _), virtual_node(Virtual). [10, true]

View File

@@ -128,7 +128,7 @@ def test_cmake_bad_generator(config, mock_packages):
s.concretize()
pkg = spack.repo.get(s)
pkg.generator = 'Yellow Sticky Notes'
with pytest.raises(spack.package.InstallError):
with pytest.raises(spack.package_base.InstallError):
get_std_cmake_args(pkg)

View File

@@ -12,9 +12,6 @@
import pytest
import spack.build_environment
import spack.config
import spack.spec
from spack.paths import build_env_path
from spack.util.environment import set_env, system_dirs
from spack.util.executable import Executable, ProcessError
@@ -132,9 +129,7 @@ def wrapper_environment():
SPACK_TARGET_ARGS="-march=znver2 -mtune=znver2",
SPACK_LINKER_ARG='-Wl,',
SPACK_DTAGS_TO_ADD='--disable-new-dtags',
SPACK_DTAGS_TO_STRIP='--enable-new-dtags',
SPACK_COMPILER_FLAGS_KEEP='',
SPACK_COMPILER_FLAGS_REMOVE='-Werror*',):
SPACK_DTAGS_TO_STRIP='--enable-new-dtags'):
yield
@@ -162,21 +157,6 @@ def check_args(cc, args, expected):
assert expected == cc_modified_args
def check_args_contents(cc, args, must_contain, must_not_contain):
"""Check output arguments that cc produces when called with args.
This assumes that cc will print debug command output with one element
per line, so that we see whether arguments that should (or shouldn't)
contain spaces are parsed correctly.
"""
with set_env(SPACK_TEST_COMMAND='dump-args'):
cc_modified_args = cc(*args, output=str).strip().split('\n')
for a in must_contain:
assert a in cc_modified_args
for a in must_not_contain:
assert a not in cc_modified_args
def check_env_var(executable, var, expected):
"""Check environment variables updated by the passed compiler wrapper
@@ -662,63 +642,6 @@ def test_no_ccache_prepend_for_fc(wrapper_environment):
common_compile_args)
def test_keep_and_remove(wrapper_environment):
werror_specific = ['-Werror=meh']
werror = ['-Werror']
werror_all = werror_specific + werror
with set_env(
SPACK_COMPILER_FLAGS_KEEP='',
SPACK_COMPILER_FLAGS_REMOVE='-Werror*',
):
check_args_contents(cc, test_args + werror_all, ['-Wl,--end-group'], werror_all)
with set_env(
SPACK_COMPILER_FLAGS_KEEP='-Werror=*',
SPACK_COMPILER_FLAGS_REMOVE='-Werror*',
):
check_args_contents(cc, test_args + werror_all, werror_specific, werror)
with set_env(
SPACK_COMPILER_FLAGS_KEEP='-Werror=*',
SPACK_COMPILER_FLAGS_REMOVE='-Werror*|-llib1|-Wl*',
):
check_args_contents(
cc,
test_args + werror_all,
werror_specific,
werror + ["-llib1", "-Wl,--rpath"]
)
@pytest.mark.parametrize('cfg_override,initial,expected,must_be_gone', [
# Set and unset variables
('config:flags:keep_werror:all',
['-Werror', '-Werror=specific', '-bah'],
['-Werror', '-Werror=specific', '-bah'],
[],
),
('config:flags:keep_werror:specific',
['-Werror', '-Werror=specific', '-bah'],
['-Werror=specific', '-bah'],
['-Werror'],
),
('config:flags:keep_werror:none',
['-Werror', '-Werror=specific', '-bah'],
['-bah'],
['-Werror', '-Werror=specific'],
),
])
@pytest.mark.usefixtures('wrapper_environment', 'mutable_config')
def test_flag_modification(cfg_override, initial, expected, must_be_gone):
spack.config.add(cfg_override)
env = spack.build_environment.clean_environment()
env.apply_modifications()
check_args_contents(
cc,
test_args + initial,
expected,
must_be_gone
)
@pytest.mark.regression('9160')
def test_disable_new_dtags(wrapper_environment, wrapper_flags):
with set_env(SPACK_TEST_COMMAND='dump-args'):

View File

@@ -10,7 +10,7 @@
import spack.cmd.install
import spack.config
import spack.package
import spack.package_base
import spack.util.spack_json as sjson
from spack.main import SpackCommand
from spack.spec import Spec

View File

@@ -635,10 +635,6 @@ def test_ci_generate_for_pr_pipeline(tmpdir, mutable_mock_env_path,
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
monkeypatch.setattr(
ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
monkeypatch.setattr(
ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as f:
@@ -683,10 +679,6 @@ def test_ci_generate_with_external_pkg(tmpdir, mutable_mock_env_path,
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
monkeypatch.setattr(
ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
monkeypatch.setattr(
ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as f:
@@ -920,6 +912,77 @@ def fake_dl_method(spec, *args, **kwargs):
env_cmd('deactivate')
def test_ci_generate_mirror_override(tmpdir, mutable_mock_env_path,
install_mockery_mutable_config, mock_packages,
mock_fetch, mock_stage, mock_binary_index,
ci_base_environment):
"""Ensure that protected pipelines using --buildcache-destination do not
skip building specs that are not in the override mirror when they are
found in the main mirror."""
os.environ.update({
'SPACK_PIPELINE_TYPE': 'spack_protected_branch',
})
working_dir = tmpdir.join('working_dir')
mirror_dir = working_dir.join('mirror')
mirror_url = 'file://{0}'.format(mirror_dir.strpath)
spack_yaml_contents = """
spack:
definitions:
- packages: [patchelf]
specs:
- $packages
mirrors:
test-mirror: {0}
gitlab-ci:
mappings:
- match:
- patchelf
runner-attributes:
tags:
- donotcare
image: donotcare
service-job-attributes:
tags:
- nonbuildtag
image: basicimage
""".format(mirror_url)
filename = str(tmpdir.join('spack.yaml'))
with open(filename, 'w') as f:
f.write(spack_yaml_contents)
with tmpdir.as_cwd():
env_cmd('create', 'test', './spack.yaml')
first_ci_yaml = str(tmpdir.join('.gitlab-ci-1.yml'))
second_ci_yaml = str(tmpdir.join('.gitlab-ci-2.yml'))
with ev.read('test'):
install_cmd()
buildcache_cmd('create', '-u', '--mirror-url', mirror_url, 'patchelf')
buildcache_cmd('update-index', '--mirror-url', mirror_url, output=str)
# This generate should not trigger a rebuild of patchelf, since it's in
# the main mirror referenced in the environment.
ci_cmd('generate', '--check-index-only', '--output-file', first_ci_yaml)
# Because we used a mirror override (--buildcache-destination) on a
# spack protected pipeline, we expect to only look in the override
# mirror for the spec, and thus the patchelf job should be generated in
# this pipeline
ci_cmd('generate', '--check-index-only', '--output-file', second_ci_yaml,
'--buildcache-destination', 'file:///mirror/not/exist')
with open(first_ci_yaml) as fd1:
first_yaml = fd1.read()
assert 'no-specs-to-rebuild' in first_yaml
with open(second_ci_yaml) as fd2:
second_yaml = fd2.read()
assert 'no-specs-to-rebuild' not in second_yaml
@pytest.mark.disable_clean_stage_check
def test_push_mirror_contents(tmpdir, mutable_mock_env_path,
install_mockery_mutable_config, mock_packages,
@@ -1151,10 +1214,6 @@ def test_ci_generate_override_runner_attrs(tmpdir, mutable_mock_env_path,
with ev.read('test'):
monkeypatch.setattr(
spack.main, 'get_version', lambda: '0.15.3-416-12ad69eb1')
monkeypatch.setattr(
ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
monkeypatch.setattr(
ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as f:
@@ -1256,10 +1315,6 @@ def test_ci_generate_with_workarounds(tmpdir, mutable_mock_env_path,
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
monkeypatch.setattr(
ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
monkeypatch.setattr(
ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
ci_cmd('generate', '--output-file', outputfile, '--dependencies')
with open(outputfile) as f:
@@ -1417,11 +1472,6 @@ def fake_get_mirrors_for_spec(spec=None, mirrors_to_check=None,
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
monkeypatch.setattr(
ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
monkeypatch.setattr(
ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as of:
@@ -1630,11 +1680,6 @@ def test_ci_generate_temp_storage_url(tmpdir, mutable_mock_env_path,
env_cmd('create', 'test', './spack.yaml')
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
monkeypatch.setattr(
ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
monkeypatch.setattr(
ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
with ev.read('test'):
ci_cmd('generate', '--output-file', outputfile)
@@ -1715,6 +1760,64 @@ def test_ci_generate_read_broken_specs_url(tmpdir, mutable_mock_env_path,
assert(ex not in output)
def test_ci_generate_external_signing_job(tmpdir, mutable_mock_env_path,
install_mockery,
mock_packages, monkeypatch,
ci_base_environment):
"""Verify that in external signing mode: 1) each rebuild jobs includes
the location where the binary hash information is written and 2) we
properly generate a final signing job in the pipeline."""
os.environ.update({
'SPACK_PIPELINE_TYPE': 'spack_protected_branch'
})
filename = str(tmpdir.join('spack.yaml'))
with open(filename, 'w') as f:
f.write("""\
spack:
specs:
- archive-files
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
temporary-storage-url-prefix: file:///work/temp/mirror
mappings:
- match:
- archive-files
runner-attributes:
tags:
- donotcare
image: donotcare
signing-job-attributes:
tags:
- nonbuildtag
- secretrunner
image:
name: customdockerimage
entrypoint: []
variables:
IMPORTANT_INFO: avalue
script:
- echo hello
""")
with tmpdir.as_cwd():
env_cmd('create', 'test', './spack.yaml')
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as of:
pipeline_doc = syaml.load(of.read())
assert 'sign-pkgs' in pipeline_doc
signing_job = pipeline_doc['sign-pkgs']
assert 'tags' in signing_job
signing_job_tags = signing_job['tags']
for expected_tag in ['notary', 'protected', 'aws']:
assert expected_tag in signing_job_tags
def test_ci_reproduce(tmpdir, mutable_mock_env_path,
install_mockery, mock_packages, monkeypatch,
last_two_git_commits, ci_base_environment, mock_binary_index):

View File

@@ -9,7 +9,7 @@
import spack.caches
import spack.main
import spack.package
import spack.package_base
import spack.stage
clean = spack.main.SpackCommand('clean')
@@ -31,7 +31,7 @@ def __init__(self, name):
def __call__(self, *args, **kwargs):
counts[self.name] += 1
monkeypatch.setattr(spack.package.PackageBase, 'do_clean',
monkeypatch.setattr(spack.package_base.PackageBase, 'do_clean',
Counter('package'))
monkeypatch.setattr(spack.stage, 'purge', Counter('stages'))
monkeypatch.setattr(

View File

@@ -20,7 +20,7 @@
import spack.config
import spack.environment as ev
import spack.hash_types as ht
import spack.package
import spack.package_base
import spack.util.executable
from spack.error import SpackError
from spack.main import SpackCommand
@@ -66,7 +66,7 @@ def test_install_package_and_dependency(
def test_install_runtests_notests(monkeypatch, mock_packages, install_mockery):
def check(pkg):
assert not pkg.run_tests
monkeypatch.setattr(spack.package.PackageBase, 'unit_test_check', check)
monkeypatch.setattr(spack.package_base.PackageBase, 'unit_test_check', check)
install('-v', 'dttop')
@@ -75,7 +75,7 @@ def test_install_runtests_root(monkeypatch, mock_packages, install_mockery):
def check(pkg):
assert pkg.run_tests == (pkg.name == 'dttop')
monkeypatch.setattr(spack.package.PackageBase, 'unit_test_check', check)
monkeypatch.setattr(spack.package_base.PackageBase, 'unit_test_check', check)
install('--test=root', 'dttop')
@@ -84,7 +84,7 @@ def test_install_runtests_all(monkeypatch, mock_packages, install_mockery):
def check(pkg):
assert pkg.run_tests
monkeypatch.setattr(spack.package.PackageBase, 'unit_test_check', check)
monkeypatch.setattr(spack.package_base.PackageBase, 'unit_test_check', check)
install('--test=all', 'a')
@@ -257,7 +257,7 @@ def test_install_commit(
"""
repo_path, filename, commits = mock_git_version_info
monkeypatch.setattr(spack.package.PackageBase,
monkeypatch.setattr(spack.package_base.PackageBase,
'git', 'file://%s' % repo_path,
raising=False)

View File

@@ -22,7 +22,7 @@
#: new fake package template
pkg_template = '''\
from spack import *
from spack.package import *
class {name}(Package):
homepage = "http://www.example.com"

View File

@@ -28,7 +28,7 @@ def test_stage_spec(monkeypatch):
def fake_stage(pkg, mirror_only=False):
expected.remove(pkg.name)
monkeypatch.setattr(spack.package.PackageBase, 'do_stage', fake_stage)
monkeypatch.setattr(spack.package_base.PackageBase, 'do_stage', fake_stage)
stage('trivial-install-test-package', 'mpileaks')
@@ -42,7 +42,7 @@ def check_stage_path(monkeypatch, tmpdir):
def fake_stage(pkg, mirror_only=False):
assert pkg.path == expected_path
monkeypatch.setattr(spack.package.PackageBase, 'do_stage', fake_stage)
monkeypatch.setattr(spack.package_base.PackageBase, 'do_stage', fake_stage)
return expected_path
@@ -69,7 +69,7 @@ def fake_stage(pkg, mirror_only=False):
assert pkg.name == 'trivial-install-test-package'
assert pkg.path is None
monkeypatch.setattr(spack.package.PackageBase, 'do_stage', fake_stage)
monkeypatch.setattr(spack.package_base.PackageBase, 'do_stage', fake_stage)
e = ev.create('test')
e.add('mpileaks')
@@ -87,7 +87,7 @@ def fake_stage(pkg, mirror_only=False):
assert pkg.name == 'mpileaks'
assert pkg.version == Version('100.100')
monkeypatch.setattr(spack.package.PackageBase, 'do_stage', fake_stage)
monkeypatch.setattr(spack.package_base.PackageBase, 'do_stage', fake_stage)
e = ev.create('test')
e.add('mpileaks@100.100')
@@ -115,7 +115,7 @@ def test_stage_full_env(mutable_mock_env_path, monkeypatch):
def fake_stage(pkg, mirror_only=False):
expected.remove(pkg.name)
monkeypatch.setattr(spack.package.PackageBase, 'do_stage', fake_stage)
monkeypatch.setattr(spack.package_base.PackageBase, 'do_stage', fake_stage)
with e:
stage()


@@ -76,7 +76,9 @@ def flake8_package_with_errors(scope="function"):
package = FileFilter(filename)
package.filter("state = 'unmodified'", "state = 'modified'", string=True)
package.filter(
"from spack import *", "from spack import *\nimport os", string=True
"from spack.package import *",
"from spack.package import *\nimport os",
string=True
)
yield filename
finally:
@@ -273,17 +275,17 @@ def test_style(flake8_package, tmpdir):
relative = os.path.relpath(flake8_package)
# no args
-output = style()
+output = style(fail_on_error=False)
assert relative in output
assert "spack style checks were clean" in output
# one specific arg
-output = style(flake8_package)
+output = style(flake8_package, fail_on_error=False)
assert relative in output
assert "spack style checks were clean" in output
# specific file that isn't changed
-output = style(__file__)
+output = style(__file__, fail_on_error=False)
assert relative not in output
assert __file__ in output
assert "spack style checks were clean" in output


@@ -13,7 +13,7 @@
import spack.cmd.install
import spack.config
-import spack.package
+import spack.package_base
import spack.paths
import spack.store
from spack.main import SpackCommand
@@ -239,7 +239,7 @@ def test_test_list(
reason="Not supported on Windows (yet)")
def test_has_test_method_fails(capsys):
with pytest.raises(SystemExit):
-spack.package.has_test_method('printing-package')
+spack.package_base.has_test_method('printing-package')
captured = capsys.readouterr()[1]
assert 'is not a class' in captured


@@ -144,7 +144,7 @@ def test_preferred_target(self, mutable_mock_repo):
update_packages('mpileaks', 'target', [default])
spec = concretize('mpileaks')
-assert str(spec['mpich'].target) == default
+assert str(spec['mpileaks'].target) == default
assert str(spec['mpich'].target) == default
def test_preferred_versions(self):


@@ -35,7 +35,7 @@
import spack.database
import spack.directory_layout
import spack.environment as ev
-import spack.package
+import spack.package_base
import spack.package_prefs
import spack.paths
import spack.platforms
@@ -532,7 +532,7 @@ def _pkg_install_fn(pkg, spec, prefix):
@pytest.fixture
def mock_pkg_install(monkeypatch):
-monkeypatch.setattr(spack.package.PackageBase, 'install',
+monkeypatch.setattr(spack.package_base.PackageBase, 'install',
_pkg_install_fn, raising=False)
@@ -934,7 +934,7 @@ def mock_fetch(mock_archive, monkeypatch):
mock_fetcher.append(URLFetchStrategy(mock_archive.url))
monkeypatch.setattr(
-spack.package.PackageBase, 'fetcher', mock_fetcher)
+spack.package_base.PackageBase, 'fetcher', mock_fetcher)
class MockLayout(object):


@@ -29,7 +29,7 @@
from llnl.util.tty.colify import colify
import spack.database
-import spack.package
+import spack.package_base
import spack.repo
import spack.spec
import spack.store
@@ -792,7 +792,7 @@ def test_uninstall_by_spec(mutable_database):
with mutable_database.write_transaction():
for spec in mutable_database.query():
if spec.installed:
-spack.package.PackageBase.uninstall_by_spec(spec, force=True)
+spack.package_base.PackageBase.uninstall_by_spec(spec, force=True)
else:
mutable_database.remove(spec)
assert len(mutable_database.query()) == 0


@@ -10,7 +10,7 @@
import spack.build_environment
import spack.repo
import spack.spec
-from spack.pkgkit import build_system_flags, env_flags, inject_flags
+from spack.package import build_system_flags, env_flags, inject_flags
@pytest.fixture()


@@ -15,7 +15,7 @@
import spack.repo
import spack.store
import spack.util.spack_json as sjson
-from spack.package import (
+from spack.package_base import (
InstallError,
PackageBase,
PackageStillNeededError,
@@ -238,7 +238,7 @@ def test_flatten_deps(
assert dependency_name not in os.listdir(pkg.prefix)
# Flatten the dependencies and ensure the dependency directory is there.
-spack.package.flatten_dependencies(spec, pkg.prefix)
+spack.package_base.flatten_dependencies(spec, pkg.prefix)
dependency_dir = os.path.join(pkg.prefix, dependency_name)
assert os.path.isdir(dependency_dir)
@@ -322,7 +322,7 @@ def test_partial_install_keep_prefix(install_mockery, mock_fetch, monkeypatch):
# If remove_prefix is called at any point in this test, that is an
# error
pkg.succeed = False # make the build fail
-monkeypatch.setattr(spack.package.Package, 'remove_prefix', mock_remove_prefix)
+monkeypatch.setattr(spack.package_base.Package, 'remove_prefix', mock_remove_prefix)
with pytest.raises(spack.build_environment.ChildError):
pkg.do_install(keep_prefix=True)
assert os.path.exists(pkg.prefix)
@@ -340,7 +340,7 @@ def test_partial_install_keep_prefix(install_mockery, mock_fetch, monkeypatch):
def test_second_install_no_overwrite_first(install_mockery, mock_fetch, monkeypatch):
spec = Spec('canfail').concretized()
pkg = spack.repo.get(spec)
-monkeypatch.setattr(spack.package.Package, 'remove_prefix', mock_remove_prefix)
+monkeypatch.setattr(spack.package_base.Package, 'remove_prefix', mock_remove_prefix)
pkg.succeed = True
pkg.do_install()


@@ -69,7 +69,8 @@ def create_build_task(pkg, install_args={}):
Create a build task for the given (concretized) package
Args:
-pkg (spack.package.PackageBase): concretized package associated with the task
+pkg (spack.package_base.PackageBase): concretized package associated with
+    the task
install_args (dict): dictionary of kwargs (or install args)
Return:
@@ -716,7 +717,7 @@ def _add(_compilers):
task.compiler = True
# Preclude any meaningful side-effects
-monkeypatch.setattr(spack.package.PackageBase, 'unit_test_check', _true)
+monkeypatch.setattr(spack.package_base.PackageBase, 'unit_test_check', _true)
monkeypatch.setattr(inst.PackageInstaller, '_setup_install_dir', _noop)
monkeypatch.setattr(spack.build_environment, 'start_build_process', _noop)
monkeypatch.setattr(spack.database.Database, 'add', _noop)
@@ -1049,7 +1050,7 @@ def test_install_fail_fast_on_except(install_mockery, monkeypatch, capsys):
# This will prevent b from installing, which will cause the build of a
# to be skipped.
monkeypatch.setattr(
-spack.package.PackageBase,
+spack.package_base.PackageBase,
'do_patch',
_test_install_fail_fast_on_except_patch
)


@@ -17,7 +17,7 @@
import llnl.util.filesystem as fs
-import spack.package
+import spack.package_base
import spack.repo
@@ -80,7 +80,7 @@ def test_possible_dependencies_virtual(mock_packages, mpi_names):
# only one mock MPI has a dependency
expected['fake'] = set()
-assert expected == spack.package.possible_dependencies(
+assert expected == spack.package_base.possible_dependencies(
"mpi", transitive=False)
@@ -125,7 +125,7 @@ def test_possible_dependencies_with_multiple_classes(
'dt-diamond-bottom': set(),
})
-assert expected == spack.package.possible_dependencies(*pkgs)
+assert expected == spack.package_base.possible_dependencies(*pkgs)
def setup_install_test(source_paths, install_test_root):


@@ -16,7 +16,7 @@
# do sanity checks only on packages modified by a PR
import spack.cmd.style as style
import spack.fetch_strategy
-import spack.package
+import spack.package_base
import spack.paths
import spack.repo
import spack.util.crypto as crypto
@@ -265,7 +265,7 @@ def test_all_dependencies_exist():
"""Make sure no packages have nonexisting dependencies."""
missing = {}
pkgs = [pkg for pkg in spack.repo.path.all_package_names()]
-spack.package.possible_dependencies(
+spack.package_base.possible_dependencies(
*pkgs, transitive=True, missing=missing)
lines = [


@@ -135,10 +135,10 @@ def test_urls_for_versions(mock_packages, config):
def test_url_for_version_with_no_urls(mock_packages, config):
pkg = spack.repo.get('git-test')
-with pytest.raises(spack.package.NoURLError):
+with pytest.raises(spack.package_base.NoURLError):
pkg.url_for_version('1.0')
-with pytest.raises(spack.package.NoURLError):
+with pytest.raises(spack.package_base.NoURLError):
pkg.url_for_version('1.1')
@@ -377,7 +377,7 @@ def test_fetch_options(mock_packages, config):
def test_has_test_method_fails(capsys):
with pytest.raises(SystemExit):
-spack.package.has_test_method('printing-package')
+spack.package_base.has_test_method('printing-package')
captured = capsys.readouterr()[1]
assert 'is not a class' in captured


@@ -21,7 +21,7 @@
import spack.binary_distribution as bindist
import spack.cmd.buildcache as buildcache
-import spack.package
+import spack.package_base
import spack.repo
import spack.store
import spack.util.gpg
@@ -575,10 +575,10 @@ def fetch(self):
def fake_fn(self):
return fetcher
-orig_fn = spack.package.PackageBase.fetcher
-spack.package.PackageBase.fetcher = fake_fn
+orig_fn = spack.package_base.PackageBase.fetcher
+spack.package_base.PackageBase.fetcher = fake_fn
yield
-spack.package.PackageBase.fetcher = orig_fn
+spack.package_base.PackageBase.fetcher = orig_fn
@pytest.mark.parametrize("manual,instr", [(False, False), (False, True),
@@ -598,7 +598,7 @@ def _instr(pkg):
pkg.manual_download = manual
if instr:
-monkeypatch.setattr(spack.package.PackageBase, 'download_instr',
+monkeypatch.setattr(spack.package_base.PackageBase, 'download_instr',
_instr)
expected = pkg.download_instr if manual else 'All fetchers failed'
@@ -616,7 +616,7 @@ def fetch(self):
raise Exception("Sources are fetched but shouldn't have been")
fetcher = FetchStrategyComposite()
fetcher.append(FetchingNotAllowed())
-monkeypatch.setattr(spack.package.PackageBase, 'fetcher', fetcher)
+monkeypatch.setattr(spack.package_base.PackageBase, 'fetcher', fetcher)
def test_fetch_without_code_is_noop(install_mockery, fetching_not_allowed):


@@ -294,14 +294,37 @@ def test_set_elf_rpaths_warning(mock_patchelf):
assert output is None
+def test_relocate_binary_text(tmpdir):
+    filename = str(tmpdir.join('binary'))
+    with open(filename, 'wb') as f:
+        f.write(b'somebinarytext')
+        f.write(b'/usr/relpath')
+        f.write(b'\0')
+        f.write(b'morebinarytext')
+    spack.relocate._relocate_binary_text(filename, {14: '/usr'}, {'/usr': '/foo'})
+    with open(filename, 'rb') as f:
+        contents = f.read()
+    assert b'/foo/relpath\0' in contents

@pytest.mark.requires_executables('patchelf', 'strings', 'file', 'gcc')
@skip_unless_linux
-def test_replace_prefix_bin(hello_world):
+def test_relocate_binary(hello_world):
    # Compile a "Hello world!" executable and set RPATHs
    executable = hello_world(rpaths=['/usr/lib', '/usr/lib64'])
+    with open(str(executable), 'rb') as f:
+        contents = f.read()
+    index_0 = contents.index(b'/usr')
+    index_1 = contents.index(b'/usr', index_0 + 1)
+    offsets = {index_0: '/usr/lib', index_1: '/usr/lib64'}
    # Relocate the RPATHs
-    spack.relocate._replace_prefix_bin(str(executable), {b'/usr': b'/foo'})
+    spack.relocate._relocate_binary_text(
+        str(executable), offsets,
+        {'/usr/lib': '/foo/lib', '/usr/lib64': '/foo/lib64'}
+    )
    # Some compilers add rpaths, so ensure the changes made it into the final result
    assert '/foo/lib:/foo/lib64' in rpaths_for(executable)
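The two tests above pin down the new offset-based API: _relocate_binary_text
takes a map from byte offset to the old prefix found there, plus an old-to-new
prefix map, and rewrites each null-terminated string in place. A rough sketch
that is consistent with these assertions (hypothetical, not the actual
spack.relocate implementation):

def relocate_binary_text_sketch(filename, offsets, prefix_map):
    """Rewrite prefixes at known byte offsets, keeping the file size fixed.

    offsets: byte offset -> old prefix string found at that offset
    prefix_map: old prefix -> new prefix (must not be longer than the old one)
    """
    with open(filename, 'rb') as f:
        data = bytearray(f.read())
    for offset, old_prefix in offsets.items():
        old = old_prefix.encode('utf-8')
        new = prefix_map[old_prefix].encode('utf-8')
        if len(new) > len(old):
            raise ValueError('new prefix must not be longer than the old one')
        end = data.index(b'\0', offset)       # end of the C string
        suffix = data[offset + len(old):end]  # path tail after the prefix
        replacement = new + suffix
        # NUL-pad so the string stays terminated and later offsets are stable
        data[offset:end] = replacement + b'\0' * (end - offset - len(replacement))
    with open(filename, 'wb') as f:
        f.write(data)

Running this over the fake binary from test_relocate_binary_text turns
b'/usr/relpath\0' at offset 14 into b'/foo/relpath\0', exactly what the
assertion checks.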
@@ -390,13 +413,11 @@ def test_relocate_text_bin(hello_world, copy_binary, tmpdir):
    assert not text_in_bin(str(new_binary.dirpath()), new_binary)

    # Check this call succeeds
-    orig_path_bytes = str(orig_binary.dirpath()).encode('utf-8')
-    new_path_bytes = str(new_binary.dirpath()).encode('utf-8')
+    orig_path_bytes = str(orig_binary.dirpath())
+    new_path_bytes = str(new_binary.dirpath())
    spack.relocate.relocate_text_bin(
-        [str(new_binary)],
-        {orig_path_bytes: new_path_bytes}
-    )
+        [str(new_binary)], {orig_path_bytes: new_path_bytes})
# Check original directory is not there anymore and it was
# substituted with the new one
@@ -405,15 +426,14 @@ def test_relocate_text_bin(hello_world, copy_binary, tmpdir):
def test_relocate_text_bin_raise_if_new_prefix_is_longer(tmpdir):
-    short_prefix = b'/short'
-    long_prefix = b'/much/longer'
+    short_prefix = '/short'
+    long_prefix = '/much/longer'
    fpath = str(tmpdir.join('fakebin'))
    with open(fpath, 'w') as f:
        f.write('/short')
    with pytest.raises(spack.relocate.BinaryTextReplaceError):
        spack.relocate.relocate_text_bin(
-            [fpath], {short_prefix: long_prefix}
-        )
+            [fpath], {short_prefix: long_prefix})
@pytest.mark.requires_executables('install_name_tool', 'file', 'cc')


@@ -7,7 +7,7 @@
import pytest
-import spack.package
+import spack.package_base
import spack.paths
import spack.repo
@@ -116,11 +116,12 @@ def test_absolute_import_spack_packages_as_python_modules(mock_packages):
    assert hasattr(spack.pkg.builtin.mock, 'mpileaks')
    assert hasattr(spack.pkg.builtin.mock.mpileaks, 'Mpileaks')
    assert isinstance(spack.pkg.builtin.mock.mpileaks.Mpileaks,
-                      spack.package.PackageMeta)
-    assert issubclass(spack.pkg.builtin.mock.mpileaks.Mpileaks, spack.package.Package)
+                      spack.package_base.PackageMeta)
+    assert issubclass(spack.pkg.builtin.mock.mpileaks.Mpileaks,
+                      spack.package_base.Package)

def test_relative_import_spack_packages_as_python_modules(mock_packages):
    from spack.pkg.builtin.mock.mpileaks import Mpileaks
-    assert isinstance(Mpileaks, spack.package.PackageMeta)
-    assert issubclass(Mpileaks, spack.package.Package)
+    assert isinstance(Mpileaks, spack.package_base.PackageMeta)
+    assert issubclass(Mpileaks, spack.package_base.Package)
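These assertions keep the packages-importable-as-Python-modules contract
intact while PackageMeta and Package move to spack.package_base. A
hypothetical analogue against the real builtin repo (the zlib/Zlib names are
assumed here, not taken from the diff):

import spack.package_base
import spack.pkg.builtin.zlib

# PackageMeta is the metaclass of every package class, and each package
# class ultimately derives from Package.
assert isinstance(spack.pkg.builtin.zlib.Zlib, spack.package_base.PackageMeta)
assert issubclass(spack.pkg.builtin.zlib.Zlib, spack.package_base.Package)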


@@ -8,7 +8,7 @@
import pytest
import spack.error
-import spack.package
+import spack.package_base
import spack.util.hash as hashutil
from spack.dependency import Dependency, all_deptypes, canonical_deptype
from spack.spec import Spec


@@ -14,7 +14,7 @@
from llnl.util.link_tree import MergeConflictError
-import spack.package
+import spack.package_base
import spack.spec
from spack.directory_layout import DirectoryLayout
from spack.filesystem_view import YamlFilesystemView


@@ -14,7 +14,7 @@
from llnl.util.filesystem import working_dir
-import spack.package
+import spack.package_base
import spack.spec
from spack.util.executable import which
from spack.version import Version, VersionList, VersionRange, ver
@@ -590,7 +590,7 @@ def test_invalid_versions(version_str):
reason="Not supported on Windows (yet)")
def test_versions_from_git(mock_git_version_info, monkeypatch, mock_packages):
repo_path, filename, commits = mock_git_version_info
-monkeypatch.setattr(spack.package.PackageBase, 'git', 'file://%s' % repo_path,
+monkeypatch.setattr(spack.package_base.PackageBase, 'git', 'file://%s' % repo_path,
raising=False)
for commit in commits:
@@ -614,7 +614,7 @@ def test_git_hash_comparisons(
"""Check that hashes compare properly to versions
"""
repo_path, filename, commits = mock_git_version_info
-monkeypatch.setattr(spack.package.PackageBase,
+monkeypatch.setattr(spack.package_base.PackageBase,
'git', 'file://%s' % repo_path,
raising=False)


@@ -15,7 +15,7 @@
class MockPackageBase(object):
"""Internal base class for mocking ``spack.package.PackageBase``.
"""Internal base class for mocking ``spack.package_base.PackageBase``.
Use ``MockPackageMultiRepo.add_package()`` to create new instances.


@@ -7,7 +7,7 @@
import spack.directives
import spack.error
-import spack.package
+import spack.package_base
import spack.repo
import spack.spec
import spack.util.hash
@@ -63,7 +63,7 @@ def __init__(self, spec):
# list of URL attributes and metadata attributes
# these will be removed from packages.
self.metadata_attrs = [s.url_attr for s in spack.fetch_strategy.all_strategies]
-self.metadata_attrs += spack.package.Package.metadata_attrs
+self.metadata_attrs += spack.package_base.Package.metadata_attrs
self.spec = spec
self.in_classdef = False # used to avoid nested classdefs


@@ -594,7 +594,7 @@ def find_versions_of_archive(
list_depth (int): max depth to follow links on list_url pages.
Defaults to 0.
concurrency (int): maximum number of concurrent requests
-reference_package (spack.package.Package or None): a spack package
+reference_package (spack.package_base.Package or None): a spack package
used as a reference for url detection. Uses the url_for_version
method on the package to produce reference urls which, if found,
are preferred.


@@ -95,7 +95,7 @@ def validate_or_raise(self, vspec, pkg=None):
Args:
vspec (Variant): instance to be validated
-pkg (spack.package.Package): the package that required the validation,
+pkg (spack.package_base.Package): the package that required the validation,
if available
Raises:


@@ -177,6 +177,7 @@ class Version(object):
]
def __init__(self, string):
# type: (str) -> None
+if not isinstance(string, str):
+    string = str(string)
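The new guard stringifies non-str input before parsing. A standalone
illustration of the effect (simplified; the real class goes on to parse the
string into version components):

class Version(object):
    def __init__(self, string):
        # type: (str) -> None
        if not isinstance(string, str):
            string = str(string)  # e.g. 1.2 (a float) becomes '1.2'
        self.string = string

assert Version(1.2).string == Version('1.2').string == '1.2'
# Caveat: floats drop trailing zeros (str(3.10) == '3.1'), so quoting
# version numbers in YAML and config files still matters.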

Some files were not shown because too many files have changed in this diff.