Compare commits


44 Commits

Author SHA1 Message Date
Massimiliano Culpo
13e6f87ef6 Make CI work for Windows on a push to the release branch 2022-07-20 08:34:13 +02:00
Massimiliano Culpo
45183d2b49 Add changelog and bumped version 2022-07-20 08:10:41 +02:00
Massimiliano Culpo
22a7f98141 Fix typo in function 2022-07-20 08:10:41 +02:00
Jonathon Anderson
1bbf2fa93e Only hack botocore when needed (#31610)
Newer versions of botocore (>=1.23.47) support the full IOBase
interface, so the hacks added to supplement the missing attributes are
no longer needed. Conditionally disable the hacks if they appear to be
unnecessary based on the class hierarchy found at runtime.
2022-07-20 08:10:41 +02:00
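A minimal sketch of the runtime check described above, assuming the patched class is `botocore.response.StreamingBody` (the message does not name it):

```python
import io

from botocore.response import StreamingBody


def streaming_body_needs_hack():
    # botocore >= 1.23.47 derives StreamingBody from io.IOBase, so the
    # compatibility shims are only needed when that inheritance is absent.
    return not issubclass(StreamingBody, io.IOBase)
```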
Harmen Stoppels
67f2d64a3f Use lexists instead of exists during fetch (#31509) 2022-07-20 08:10:41 +02:00
Harmen Stoppels
7405d18e98 environment.py: only acquire write lock when necessary (#31493) 2022-07-20 08:10:41 +02:00
Harmen Stoppels
bff1de69a5 file_cache.py: idempotent remove without races (#31477)
There's a race condition in `remove()` as the lockfile is removed after
releasing the lock, which is a problem when another process acquires a
write lock during deletion.

Also simplify multiprocessing a bit: a file may be removed multiple times,
and currently the second deletion is an error. The proposed fix makes
remove(...) idempotent, so deleting a non-existent cache entry is no
longer an error.

Don't test for the existence of the lockfile, because Windows/Linux behavior differs
2022-07-20 08:10:41 +02:00
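A sketch of the idempotent removal, using a hypothetical helper rather than Spack's actual `file_cache.py` code:

```python
import errno
import os


def remove_cache_entry(path):
    """Delete a cache file; removing a non-existent entry is not an error."""
    try:
        os.remove(path)
    except OSError as err:
        if err.errno != errno.ENOENT:  # only "file not found" is swallowed
            raise
```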
Peter Scheibel
44f360556d Cray manifest: compiler duplicates (#31173)
* remove unhelpful comment
* Filter compiler duplicates while reading manifest
* More-specific version matching edited to use the module-specific version (to avoid an issue where a user might add a compiler with the same version to the initial test configuration)
2022-07-20 08:10:41 +02:00
Peter Scheibel
96b68a210f spack external find: handle manifest with bad permissions (#31201)
Allow `spack external find` (with no extra args) to proceed if the manifest file exists but
without sufficient permissions; in that case, print a warning. Also add a test for that behavior.

TODOs:

- [x] continue past any exception raised during manifest parsing as part of `spack external find`, 
      except for CTRL-C (and other errors that indicate immediate program termination)
- [x] Semi-unrelated but came up when discussing this with the user who reported this issue to
      me: the manifest parser now accepts older schemas 

See: https://github.com/spack/spack/issues/31191
2022-07-20 08:10:41 +02:00
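The continue-past-exceptions behavior can be sketched like this (illustrative only; `json.load` stands in for the real manifest parser):

```python
import json
import warnings


def read_manifest_or_warn(path):
    try:
        with open(path) as f:
            return json.load(f)
    except Exception as err:
        # KeyboardInterrupt and SystemExit derive from BaseException rather
        # than Exception, so CTRL-C still terminates the search immediately.
        warnings.warn("skipping manifest %s: %s" % (path, err))
        return None
```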
Cody Balos
d3ee0b9c07 Fix typo in documentation note about concretizer:unify (#31246) 2022-07-20 08:10:41 +02:00
Massimiliano Culpo
ab1e04f1d0 Try fixing the CI issue in Windows 2022-07-20 08:10:41 +02:00
Marco De La Pierre
fbc2e59221 Modify dockerfile template, so that any command can be executed (#29741) 2022-07-20 08:10:41 +02:00
Massimiliano Culpo
4f40c9aab9 Update containerize templates to account for view indirection (#31321)
fixes #30965
2022-07-20 08:10:41 +02:00
Massimiliano Culpo
618161075d ASP-based solver: rescale target weights so that 0 is always the best score (#31226)
fixes #30997

Instead of giving a penalty of 30 to all nodes when preferences
are not package specific, give a penalty of 100 to all targets
of a node where we have package specific preferences, if the target
is not explicitly preferred.
2022-07-20 08:10:41 +02:00
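A toy restatement of that weighting in Python; the real rule lives in the ASP program `concretize.lp`, and only the 30/100 penalties are taken from the message above:

```python
def target_penalty(target, preferred_targets, package_specific):
    """0 is the best score; preferred targets rank from 0 upward."""
    if target in preferred_targets:
        return preferred_targets.index(target)
    if package_specific:
        return 100  # non-preferred target of a node with its own preferences
    return 30       # generic penalty otherwise
```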
Massimiliano Culpo
21fa564df8 Canonicalize positional argument to spack bootstrap mirror (#31180)
fixes #31139
2022-07-20 08:10:41 +02:00
Brian Van Essen
b3e9abc72a Bugfix external find --all for libraries (#31186)
* Fixed a bug in the 'external find --all' command where the call failed
to find packages by both executable and library. The bug was that the
call `path.all_packages()` incorrectly turned the variable
`packages_to_check` into a generator rather than keeping it a list.
Thus the second call to `detection.by_library` had no work to do.

* Fixed the help message for the find external and compiler commands as
well as others that used the `scopes_metavar` field to define where
the results should be stored in configuration space.  Specifically,
the fact that configuration could be added to the environment was not
mentioned in the help message.
2022-07-20 08:10:41 +02:00
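The generator-exhaustion bug in miniature (names are illustrative, not Spack's):

```python
all_packages = ["cmake", "openssl", "zlib"]

packages_to_check = (p for p in all_packages)   # buggy: a generator
by_executable = [p for p in packages_to_check]  # first pass consumes it
by_library = [p for p in packages_to_check]     # second pass sees nothing
assert by_executable == all_packages and by_library == []

packages_to_check = list(all_packages)          # fix: keep it a list
assert list(packages_to_check) == all_packages
assert list(packages_to_check) == all_packages  # lists are reusable
```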
Christian Goll
488a513109 OpenSUSE Tumbleweed: use GLIBC version as distro version (#19895)
Tumbleweed is a rolling release that would have used a date 
as a version instead.
2022-07-20 08:10:41 +02:00
Massimiliano Culpo
53cb6312a8 Stricter compatibility rules for OS and compiler when reusing specs (#31170)
* Stricter compatibility rules for OS and compiler when reusing specs
* Add unit test
2022-07-20 08:10:41 +02:00
Massimiliano Culpo
7371fef3ca docs: quote string to show valid YAML (#31178)
fixes #31167
2022-07-20 08:10:41 +02:00
Peter Scheibel
60d9e12594 Manifest parsing: avoid irrelevant files (#31144)
* Manifest directory may not contain manifest files: exclude non-manifest files
* Manifest files use different name for rocmcc: add translation for it
2022-07-20 08:10:41 +02:00
Massimiliano Culpo
c4e492202d ASP-based solver: fix rules on version weights selection (#31153)
* ASP: sort and deduplicate version weights from installed specs

* Pick version weights according to provenance

* Add unit test
2022-07-20 08:10:41 +02:00
Massimiliano Culpo
2742e3d332 concretize.lp: impose a lower bound on the number of version facts if a solution exists (#31142)
* concretize.lp: impose a lower bound on the number of version facts if a valid version exists

fixes #30864

* Add a unit test
2022-07-20 08:10:41 +02:00
Massimiliano Culpo
54f1ba9c3b Pin setuptools version in Github Action Workflows (#31118)
fixes #31109
2022-07-20 08:10:41 +02:00
Danny McClanahan
a78944c9eb fix doubly shell quoting args to spack spec (#29282)
* add test to verify fix works
* fix spec cflags/variants parsing test (breaking change)
* fix `spack spec` arg quoting issue
* add error report for deprecated cflags coalescing
* use .group(n) vs subscript regex group extraction for 3.5 compat
* add random test for untested functionality to pass codecov
* fix new test failure since rebase
2022-07-20 08:10:41 +02:00
Todd Gamblin
6a4d1af5be bugfix: preserve dict order for Spec.dag_hash() in Python 2 (#31092)
Fix a bug introduced in #21720. `spack_json.dump()` calls `_strify()` on dictionaries to
convert `unicode` to `str`, but it constructs `dict` objects instead of
`collections.OrderedDict` objects, so in Python 2 (or earlier versions of 3) it can
scramble dictionary order.

This can cause hashes to differ between Python 2 and Python 3, or between Python 3.7
and earlier Python 3's.

- [x] use `OrderedDict` in `_strify`
- [x] add a regression test
2022-07-20 08:10:41 +02:00
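The shape of the fix, sketched (not Spack's exact `_strify`): on Python 2, and on Python 3 before 3.7, a plain `dict` does not guarantee insertion order, so the conversion must rebuild mappings as `OrderedDict`s:

```python
from collections import OrderedDict


def _strify_sketch(data):
    """Recursively rebuild containers while preserving dict order (the
    real _strify also converts unicode to str on Python 2)."""
    if isinstance(data, dict):
        return OrderedDict(
            (_strify_sketch(k), _strify_sketch(v)) for k, v in data.items()
        )
    if isinstance(data, list):
        return [_strify_sketch(item) for item in data]
    return data
```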
Sergey Kosukhin
6baf171700 clingo: fix string formatting in error messages (#31084) 2022-07-20 08:10:41 +02:00
Axel Huebl
891bb7131b openPMD-api: 0.14.5 (#31023)
Add the latest release.
2022-07-20 08:10:41 +02:00
Axel Huebl
3876367a75 WarpX: 22.06 (#31012)
* WarpX: 22.06

Update `warpx` & `py-warpx` to the latest release, `22.06`.

* Patch: Fix 1D CUDA Builds
2022-07-20 08:10:41 +02:00
Axel Huebl
19e4649979 Prepare: openPMD-api 0.15.0-dev (#29484)
Anticipate openPMD-api changes in the next major release that are
already in `dev` (aka Spack `develop`):
- C++17 requirement
- drop: `mpark-variant` public dependency
- add: `toml11` private dependency

Also add @franzpoeschel as co-maintainer for the Spack package.
2022-07-20 08:10:41 +02:00
Axel Huebl
3dd7619f73 WarpX: Patch no-MPI & Lib Install (#30866)
Fixes WarpX issues:
- https://github.com/ECP-WarpX/WarpX/pull/3134
- https://github.com/ECP-WarpX/WarpX/pull/3141

and uses GitHub patch URLs directly instead of storing
patch copies.
2022-07-20 08:10:41 +02:00
Massimiliano Culpo
2a7a040327 bootstrap: account for disabled sources (#31042)
* bootstrap: account for disabled sources

Fix a bug introduced in #30192, which effectively skips
any prescription on disabled bootstrapping sources.

* Add unit test to avoid regression
2022-07-20 08:10:41 +02:00
Massimiliano Culpo
83bf44f2fe archspec: bump to v0.1.4 (#30856)
Fixes compiler flags for oneapi and dpcpp
2022-07-20 08:10:41 +02:00
Adam J. Stewart
c527b43d18 Use stable URLs for patch-diff GitHub patches (#30953) 2022-07-20 08:10:41 +02:00
Adam J. Stewart
4866c587e6 Python: fix clingo bootstrapping on Apple M1 (#30834)
This PR fixes several issues I noticed while trying to get Spack working on Apple M1.

- [x] `build_environment.py` attempts to add `spec['foo'].libs` and `spec['foo'].headers` to our compiler wrappers for all dependencies using a try-except that ignores `NoLibrariesError` and `NoHeadersError` respectively. However, the `libs` and `headers` attributes of the Python package were erroneously using `RuntimeError` instead.
- [x] `spack external find python` (used during bootstrapping) currently has no way to determine whether or not an installation is `+shared`, so previously we would only search for static Python libs. However, most distributions including XCode/Conda/Intel ship shared Python libs. I updated `libs` to search for both shared and static (order based on variant) as a fallback.
- [x] The `headers` attribute was recursively searching in `prefix.include` for `pyconfig.h`, but this could lead to non-deterministic behavior if multiple versions of Python are installed and `pyconfig.h` files exist in multiple `<prefix>/include/pythonX.Y` locations. It's safer to search in `sysconfig.get_path('include')` instead.
- [x] The Python installation that comes with XCode is broken, and `sysconfig.get_paths` is hard-coded to return specific directories. This meant that our logic for `platlib`/`purelib`/`include` where we replace `platbase`/`base`/`installed_base` with `prefix` wasn't working and the `mkdirp` in `setup_dependent_package` was trying to create a directory in root, giving permissions issues. Even if you commented out those `mkdirp` calls, Spack would add the wrong directories to `PYTHONPATH`. Added a fallback hard-coded to `lib/pythonX.Y/site-packages` if sysconfig is broken (this is what distutils always did).
2022-07-20 08:10:41 +02:00
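For the `pyconfig.h` point, the safer lookup is roughly the following (a sketch, not the package's actual `headers` property):

```python
import os
import sysconfig

# Ask the interpreter for its own include directory instead of searching
# <prefix>/include recursively, which may match several pythonX.Y
# directories from different installations.
include_dir = sysconfig.get_path("include")
pyconfig_h = os.path.join(include_dir, "pyconfig.h")
print(pyconfig_h, os.path.exists(pyconfig_h))
```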
Evan Bollig
c09bf37ff6 Added AWS-AHUG alinux2 pipeline (#24601)
Add spack stacks targeted at Spack + AWS + ARM HPC User Group hackathon.  Includes
a list of miniapps and full-apps that are ready to run on both x86_64 and aarch64.

Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2022-05-31 06:29:22 +02:00
Evan Bollig
deeebc4593 Alinux isc buildcache (#30462)
Add two new stacks targeted at x86_64 and arm, representing an initial list of packages 
used by current and planned AWS Workshops, and built in conjunction with the ISC22
announcement of the spack public binary cache.

Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2022-05-31 06:29:22 +02:00
Greg Becker
eef202ea85 fix dev paths for deps 2022-05-28 20:18:13 +02:00
Todd Gamblin
4c6564f10a update changelog for v0.18.0 (#30905) 2022-05-28 17:33:52 +02:00
Massimiliano Culpo
82919cb6a5 Remove the warning that Spack prints at each spec (#30872)
Add instead a warning box in the documentation
2022-05-28 16:37:51 +02:00
Greg Becker
844c799299 target optimization: re-norm optimization scale so that 0 is best. (#29926)
Preferred targets are currently the only minimization criteria for Spack for which we allow
negative values. That means Spack may be incentivized to add nodes to the DAG if they
match the preferred target.

This PR re-norms the minimization criteria so that preferred targets are weighted from 0,
and default target weights are offset by the number of preferred targets per-package to
calculate node_target_weight.

Also fixes a bug in the test for preferred targets that was making the test easier to pass
than it should be.
2022-05-27 22:51:03 -07:00
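In miniature, the re-norm described above (illustrative Python; the real criteria are weights in the ASP program):

```python
def node_target_weight(target, preferred_targets, default_rank):
    """Preferred targets weigh 0..N-1; default targets are offset past
    them, so 0 is always the best achievable score and never negative."""
    if target in preferred_targets:
        return preferred_targets.index(target)
    return len(preferred_targets) + default_rank
```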
Greg Becker
9198ab63ae update tutorial command for v0.18.0 and new gpg key 2022-05-27 18:45:19 -07:00
Todd Gamblin
8f9bc5bba4 Revert "strip -Werror: all specific or none (#30284)"
This reverts commit 330832c22c.

`-Werror` changes were unfortunately causing the `rdma-core` build to fail.
Reverting on `v0.18`; we can fix this in `develop`
2022-05-26 12:13:40 -07:00
Scott Wittenburg
ca0c968639 ci: Support secure binary signing on protected pipelines (#30753)
This PR supports the creation of securely signed binaries built from spack
develop as well as release branches and tags. Specifically:

- remove internal pr mirror url generation logic in favor of buildcache destination
on command line
    - with a single mirror url specified in the spack.yaml, this makes it clearer where 
    binaries from various pipelines are pushed
- designate some tags as reserved: ['public', 'protected', 'notary']
    - these tags are stripped from all jobs by default and provisioned internally
    based on pipeline type
- update gitlab ci yaml to include pipelines on more protected branches than just
develop (so include releases and tags)
    - binaries from all protected pipelines are pushed into mirrors including the
    branch name so releases, tags, and develop binaries are kept separate
- update rebuild jobs running on protected pipelines to run on special runners
provisioned with an intermediate signing key
    - protected rebuild jobs no longer use "SPACK_SIGNING_KEY" env var to
    obtain signing key (in fact, final signing key is nowhere available to rebuild jobs)
    - these intermediate signatures are verified at the end of each pipeline by a new
    signing job to ensure binaries were produced by a protected pipeline
- optionally schedule a signing/notary job at the end of the pipeline to sign all
packages in the mirror
    - add signing-job-attributes to gitlab-ci section of spack environment to allow
    configuration
    - signing job runs on special runner (separate from protected rebuild runners)
    provisioned with public intermediate key and secret signing key
2022-05-26 09:10:18 -07:00
Gregory Becker
d99a1b1047 release number for v0.18.0 2022-05-25 20:29:03 -07:00
7029 changed files with 17119 additions and 25291 deletions

View File

@@ -12,7 +12,6 @@ on:
# built-in repository or documentation
- 'var/spack/repos/builtin/**'
- '!var/spack/repos/builtin/packages/clingo-bootstrap/**'
- '!var/spack/repos/builtin/packages/clingo/**'
- '!var/spack/repos/builtin/packages/python/**'
- '!var/spack/repos/builtin/packages/re2c/**'
- 'lib/spack/docs/**'
@@ -20,10 +19,6 @@ on:
# nightly at 2:16 AM
- cron: '16 2 * * *'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
cancel-in-progress: true
jobs:
fedora-clingo-sources:
@@ -180,11 +175,10 @@ jobs:
tree ~/.spack/bootstrap/store/
macos-clingo-binaries:
runs-on: ${{ matrix.macos-version }}
runs-on: macos-latest
strategy:
matrix:
python-version: ['3.5', '3.6', '3.7', '3.8', '3.9', '3.10']
macos-version: ['macos-10.15', 'macos-11', 'macos-12']
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
@@ -192,7 +186,7 @@ jobs:
brew install tree
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
with:
python-version: ${{ matrix.python-version }}
- name: Bootstrap clingo
@@ -211,7 +205,7 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
with:
python-version: ${{ matrix.python-version }}
- name: Setup repo

View File

@@ -19,10 +19,6 @@ on:
release:
types: [published]
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
cancel-in-progress: true
jobs:
deploy-images:
runs-on: ubuntu-latest

View File

@@ -16,10 +16,6 @@ on:
- '.github/workflows/macos_python.yml'
# TODO: run if we touch any of the recipes involved in this
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
cancel-in-progress: true
# GitHub Action Limits
# https://help.github.com/en/actions/reference/workflow-syntax-for-github-actions
@@ -30,7 +26,7 @@ jobs:
runs-on: macos-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08 # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
with:
python-version: 3.9
- name: spack install
@@ -46,7 +42,7 @@ jobs:
timeout-minutes: 700
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08 # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
with:
python-version: 3.9
- name: spack install
@@ -60,7 +56,7 @@ jobs:
runs-on: macos-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08 # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
with:
python-version: 3.9
- name: spack install

View File

@@ -9,11 +9,6 @@ on:
branches:
- develop
- releases/**
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
cancel-in-progress: true
jobs:
# Validate that the code can be run on all the Python versions
# supported by Spack
@@ -21,7 +16,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08 # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
with:
python-version: '3.10'
- name: Install Python Packages
@@ -39,12 +34,12 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08 # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
with:
python-version: '3.10'
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools types-six
pip install --upgrade pip six setuptools==62.3.4 types-six
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -114,7 +109,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08 # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -179,7 +174,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08 # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
with:
python-version: '3.10'
- name: Install System packages
@@ -245,7 +240,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08 # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
with:
python-version: '3.10'
- name: Install System packages
@@ -294,7 +289,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08 # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
@@ -337,7 +332,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08 # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
with:
python-version: '3.10'
- name: Install Python packages
@@ -350,7 +345,7 @@ jobs:
coverage run $(which spack) audit packages
coverage combine
coverage xml
- name: Package audits (without coverage)
- name: Package audits (wwithout coverage)
if: ${{ needs.changes.outputs.with_coverage == 'false' }}
run: |
. share/spack/setup-env.sh

View File

@@ -9,11 +9,6 @@ on:
branches:
- develop
- releases/**
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
cancel-in-progress: true
defaults:
run:
shell:
@@ -23,7 +18,7 @@ jobs:
runs-on: windows-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
with:
python-version: 3.9
- name: Install Python Packages
@@ -41,12 +36,12 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
with:
fetch-depth: 0
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip six setuptools flake8 isort>=4.3.5 mypy>=0.800 black pywin32 types-python-dateutil
python -m pip install --upgrade pip six setuptools==62.3.4 flake8 isort>=4.3.5 mypy>=0.800 black pywin32 types-python-dateutil
- name: Create local develop
run: |
.\spack\.github\workflows\setup_git.ps1
@@ -63,7 +58,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
with:
fetch-depth: 0
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
with:
python-version: 3.9
- name: Install Python packages
@@ -83,7 +78,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
with:
fetch-depth: 0
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
with:
python-version: 3.9
- name: Install Python packages
@@ -103,7 +98,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
with:
fetch-depth: 0
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
with:
python-version: 3.9
- name: Install Python packages
@@ -128,7 +123,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
with:
fetch-depth: 0
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
with:
python-version: 3.9
- name: Install Python packages
@@ -159,7 +154,7 @@ jobs:
run:
shell: pwsh
steps:
- uses: actions/setup-python@c4e89fac7e8767b327bbad6cb4d859eda999cf08
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,3 +1,21 @@
# v0.18.1 (2022-07-19)
### Spack Bugfixes
* Fix several bugs related to bootstrapping (#30834,#31042,#31180)
* Fix a regression that was causing spec hashes to differ between
Python 2 and Python 3 (#31092)
* Fixed compiler flags for oneAPI and DPC++ (#30856)
* Fixed several issues related to concretization (#31142,#31153,#31170,#31226)
* Improved support for Cray manifest file and `spack external find` (#31144,#31201,#31173,#31186)
* Assign a version to openSUSE Tumbleweed according to the GLIBC version
in the system (#19895)
* Improved Dockerfile generation for `spack containerize` (#29741,#31321)
* Fixed a few bugs related to concurrent execution of commands (#31509,#31493,#31477)
### Package updates
* WarpX: add v22.06, fixed libs property (#30866,#31102)
* openPMD: add v0.14.5, update recipe for @develop (#29484,#31023)
# v0.18.0 (2022-05-28)
`v0.18.0` is a major feature release.

View File

@@ -62,12 +62,11 @@ on these ideas for each distinct build system that Spack supports:
build_systems/bundlepackage
build_systems/cudapackage
build_systems/custompackage
build_systems/inteloneapipackage
build_systems/intelpackage
build_systems/multiplepackage
build_systems/rocmpackage
build_systems/sourceforgepackage
build_systems/custompackage
build_systems/multiplepackage
For reference, the :py:mod:`Build System API docs <spack.build_systems>`
provide a list of build systems and methods/attributes that can be

View File

@@ -84,8 +84,8 @@ build ``hdf5`` with Intel oneAPI MPI do::
spack install hdf5 +mpi ^intel-oneapi-mpi
Using Externally Installed oneAPI Tools
=======================================
Using an Externally Installed oneAPI
====================================
Spack can also use oneAPI tools that are manually installed with
`Intel Installers`_. The procedures for configuring Spack to use
@@ -110,7 +110,7 @@ Another option is to manually add the configuration to
Libraries
---------
If you want Spack to use oneMKL that you have installed without Spack in
If you want Spack to use MKL that you have installed without Spack in
the default location, then add the following to
``~/.spack/packages.yaml``, adjusting the version as appropriate::
@@ -139,7 +139,7 @@ You can also use Spack-installed libraries. For example::
spack load intel-oneapi-mkl
Will update your environment CPATH, LIBRARY_PATH, and other
environment variables for building an application with oneMKL.
environment variables for building an application with MKL.
More information
================

View File

@@ -15,9 +15,6 @@ IntelPackage
Intel packages in Spack
^^^^^^^^^^^^^^^^^^^^^^^^
This is an earlier version of Intel software development tools and has
now been replaced by Intel oneAPI Toolkits.
Spack can install and use several software development products offered by Intel.
Some of these are available under no-cost terms, others require a paid license.
All share the same basic steps for configuration, installation, and, where

View File

@@ -48,9 +48,8 @@ important to understand.
**build backend**
Libraries used to define how to build a wheel. Examples
include `setuptools <https://setuptools.pypa.io/>`__,
`flit <https://flit.readthedocs.io/>`_,
`poetry <https://python-poetry.org/>`_, and
`hatchling <https://hatch.pypa.io/latest/>`_.
`flit <https://flit.readthedocs.io/>`_, and
`poetry <https://python-poetry.org/>`_.
^^^^^^^^^^^
Downloading
@@ -327,33 +326,6 @@ for specifying the version requirements. Note that ``~=`` works
differently in poetry than in setuptools and flit for versions that
start with a zero.
"""""""""
hatchling
"""""""""
If the ``pyproject.toml`` lists ``hatchling.build`` as the
``build-backend``, it uses the hatchling build system. Look for
dependencies under the following keys:
* ``requires-python``
This specifies the version of Python that is required
* ``project.dependencies``
These packages are required for building and installation. You can
add them with ``type=('build', 'run')``.
* ``project.optional-dependencies``
This section includes keys with lists of optional dependencies
needed to enable those features. You should add a variant that
optionally adds these dependencies. This variant should be ``False``
by default.
See https://hatch.pypa.io/latest/config/dependency/ for more
information.
""""""
wheels
""""""
@@ -694,4 +666,3 @@ For more information on build backend tools, see:
* setuptools: https://setuptools.pypa.io/
* flit: https://flit.readthedocs.io/
* poetry: https://python-poetry.org/
* hatchling: https://hatch.pypa.io/latest/

View File

@@ -1,55 +0,0 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _sourceforgepackage:
------------------
SourceforgePackage
------------------
``SourceforgePackage`` is a
`mixin-class <https://en.wikipedia.org/wiki/Mixin>`_. It automatically
sets the URL based on a list of Sourceforge mirrors listed in
`sourceforge_mirror_path`, which defaults to a half dozen known mirrors.
Refer to the package source
(`<https://github.com/spack/spack/blob/develop/lib/spack/spack/build_systems/sourceforge.py>`__) for the current list of mirrors used by Spack.
^^^^^^^
Methods
^^^^^^^
This package provides a method for populating mirror URLs.
**urls**
This method returns a list of possible URLs for package source.
It is decorated with `property` so its results are treated as
a package attribute.
Refer to
`<https://spack.readthedocs.io/en/latest/packaging_guide.html#mirrors-of-the-main-url>`__
for information on how Spack uses the `urls` attribute during
fetching.
^^^^^
Usage
^^^^^
This helper package can be added to your package by adding it as a base
class of your package and defining the relative location of an archive
file for one version of your software.
.. code-block:: python
:emphasize-lines: 1,3
class MyPackage(AutotoolsPackage, SourceforgePackage):
...
sourceforge_mirror_path = "my-package/mypackage.1.0.0.tar.gz"
...
Over 40 packages use the ``SourceforgePackage`` mix-in as of
July 2022, so there are multiple packages to choose from if you want
to see a real example.

View File

@@ -109,10 +109,9 @@ Spack Images on Docker Hub
--------------------------
Docker images with Spack preinstalled and ready to be used are
built when a release is tagged, or nightly on ``develop``. The images
are then pushed both to `Docker Hub <https://hub.docker.com/u/spack>`_
and to `GitHub Container Registry <https://github.com/orgs/spack/packages?repo_name=spack>`_.
The OS that are currently supported are summarized in the table below:
built on `Docker Hub <https://hub.docker.com/u/spack>`_
at every push to ``develop`` or to a release branch. The OS that
are currently supported are summarized in the table below:
.. _containers-supported-os:
@@ -122,31 +121,22 @@ The OS that are currently supported are summarized in the table below:
* - Operating System
- Base Image
- Spack Image
* - Ubuntu 16.04
- ``ubuntu:16.04``
- ``spack/ubuntu-xenial``
* - Ubuntu 18.04
- ``ubuntu:18.04``
- ``spack/ubuntu-bionic``
* - Ubuntu 20.04
- ``ubuntu:20.04``
- ``spack/ubuntu-focal``
* - Ubuntu 22.04
- ``ubuntu:22.04``
- ``spack/ubuntu-jammy``
* - CentOS 7
- ``centos:7``
- ``spack/centos7``
* - CentOS Stream
- ``quay.io/centos/centos:stream``
- ``spack/centos-stream``
* - openSUSE Leap
- ``opensuse/leap``
- ``spack/leap15``
* - Amazon Linux 2
- ``amazonlinux:2``
- ``spack/amazon-linux``
All the images are tagged with the corresponding release of Spack:
.. image:: images/ghcr_spack.png
.. image:: dockerhub_spack.png
with the exception of the ``latest`` tag that points to the HEAD
of the ``develop`` branch. These images are available for anyone

View File

@@ -107,6 +107,7 @@ with a high level view of Spack's directory structure:
llnl/ <- some general-use libraries
spack/ <- spack module; contains Python code
analyzers/ <- modules to run analysis on installed packages
build_systems/ <- modules for different build systems
cmd/ <- each file in here is a spack subcommand
compilers/ <- compiler description files
@@ -150,7 +151,7 @@ Package-related modules
^^^^^^^^^^^^^^^^^^^^^^^
:mod:`spack.package`
Contains the :class:`~spack.package_base.Package` class, which
Contains the :class:`~spack.package.Package` class, which
is the superclass for all packages in Spack. Methods on ``Package``
implement all phases of the :ref:`package lifecycle
<package-lifecycle>` and manage the build process.
@@ -241,6 +242,22 @@ Unit tests
Implements Spack's test suite. Add a module and put its name in
the test suite in ``__init__.py`` to add more unit tests.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Research and Monitoring Modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
:mod:`spack.monitor`
Contains :class:`~spack.monitor.SpackMonitorClient`. This is accessed from
the ``spack install`` and ``spack analyze`` commands to send build and
package metadata up to a `Spack Monitor
<https://github.com/spack/spack-monitor>`_ server.
:mod:`spack.analyzers`
A module folder with a :class:`~spack.analyzers.analyzer_base.AnalyzerBase`
that provides base functions to run, save, and (optionally) upload analysis
results to a `Spack Monitor <https://github.com/spack/spack-monitor>`_ server.
^^^^^^^^^^^^^
Other Modules
@@ -284,6 +301,240 @@ Most spack commands look something like this:
The information in Package files is used at all stages in this
process.
Conceptually, packages are overloaded. They contain:
-------------
Stage objects
-------------
.. _writing-analyzers:
-----------------
Writing analyzers
-----------------
To write an analyzer, you should add a new python file to the
analyzers module directory at ``lib/spack/spack/analyzers`` .
Your analyzer should be a subclass of the :class:`AnalyzerBase <spack.analyzers.analyzer_base.AnalyzerBase>`. For example, if you want
to add an analyzer class ``Myanalyzer`` you would write to
``spack/analyzers/myanalyzer.py`` and import and
use the base as follows:
.. code-block:: python
from .analyzer_base import AnalyzerBase
class Myanalyzer(AnalyzerBase):
Note that the class name is your module file name, all lowercase
except for the first capital letter. You can look at other analyzers in
that analyzer directory for examples. The guide here will tell you about the basic functions needed.
^^^^^^^^^^^^^^^^^^^^^^^^^
Analyzer Output Directory
^^^^^^^^^^^^^^^^^^^^^^^^^
By default, when you run ``spack analyze run``, an analyzer output directory will
be created in your Spack user directory in ``$HOME``. Output is written here
because the install directory might not always be writable.
.. code-block:: console
~/.spack/
analyzers
Result files will be written here, organized in subfolders in the same structure
as the package, with each analyzer owning its own subfolder. For example:
.. code-block:: console
$ tree ~/.spack/analyzers/
/home/spackuser/.spack/analyzers/
└── linux-ubuntu20.04-skylake
└── gcc-9.3.0
└── zlib-1.2.11-sl7m27mzkbejtkrajigj3a3m37ygv4u2
├── environment_variables
│   └── spack-analyzer-environment-variables.json
├── install_files
│   └── spack-analyzer-install-files.json
└── libabigail
└── lib
└── spack-analyzer-libabigail-libz.so.1.2.11.xml
Notice that for the libabigail analyzer, since results are generated per object,
we honor the object's folder in case there are equivalently named files in
different folders. The result files are typically written as json so they can be easily read and uploaded in a future interaction with a monitor.
^^^^^^^^^^^^^^^^^
Analyzer Metadata
^^^^^^^^^^^^^^^^^
Your analyzer is required to have the class attributes ``name``, ``outfile``,
and ``description``. These are printed for the user when they use the subcommand
``spack analyze list-analyzers``. Here is an example.
As we mentioned above, note that this analyzer would live in a module named
``libabigail.py`` in the analyzers folder so that the class can be discovered.
.. code-block:: python
class Libabigail(AnalyzerBase):
name = "libabigail"
outfile = "spack-analyzer-libabigail.json"
description = "Application Binary Interface (ABI) features for objects"
This means that the name and output file should be unique for your analyzer.
Note that "all" cannot be the name of an analyzer, as this key is used to indicate
that the user wants to run all analyzers.
.. _analyzer_run_function:
^^^^^^^^^^^^^^^^^^^^^^^^
An analyzer run Function
^^^^^^^^^^^^^^^^^^^^^^^^
The core of an analyzer is its ``run()`` function, which should accept no
arguments. You can assume your analyzer has the package spec of interest at ``self.spec``
and it's up to the run function to generate whatever analysis data you need,
and then return the object with a key as the analyzer name. The result data
should be a list of objects, each with a name, ``analyzer_name``, ``install_file``,
and one of ``value`` or ``binary_value``. The install file should be a relative
path, not an absolute path. For example, let's say we extract a metric called
``metric`` for ``bin/wget`` using our analyzer ``thebest-analyzer``.
We might have data that looks like this:
.. code-block:: python
result = {"name": "metric", "analyzer_name": "thebest-analyzer", "value": "1", "install_file": "bin/wget"}
We'd then return it as follows - note that the key is the analyzer name at ``self.name``.
.. code-block:: python
return {self.name: result}
This will save the complete result to the analyzer metadata folder, as described
previously. If you want support for adding a different kind of metadata (e.g.,
not associated with an install file) then the monitor server would need to be updated
to support this first.
^^^^^^^^^^^^^^^^^^^^^^^^^
An analyzer init Function
^^^^^^^^^^^^^^^^^^^^^^^^^
If you don't need any extra dependencies or checks, you can skip defining an analyzer
init function, as the base class will handle it. Typically, it will accept
a spec, and an optional output directory (if the user does not want the default
metadata folder for analyzer results). The analyzer init function should call
its parent init, and then do any extra checks or validation that are required to
work. For example:
.. code-block:: python
def __init__(self, spec, dirname=None):
super(Myanalyzer, self).__init__(spec, dirname)
# install extra dependencies, do extra preparation and checks here
At the end of the init, you will have available to you:
- **self.spec**: the spec object
- **self.dirname**: an optional directory name the user has provided at init to save
- **self.output_dir**: the analyzer metadata directory, where we save by default
- **self.meta_dir**: the path to the package metadata directory (.spack) if you need it
And can proceed to write your analyzer.
^^^^^^^^^^^^^^^^^^^^^^^
Saving Analyzer Results
^^^^^^^^^^^^^^^^^^^^^^^
The analyzer's ``save_result`` method is called with the generated result object
to save it to the filesystem and, if the user has added the ``--monitor`` flag,
to upload it to a monitor server. If your result follows an accepted result
format and you don't need to parse it further, you don't need to add this
function to your class. However, if your result data is large or otherwise
needs additional parsing, you can define it. If you define the function, it
is useful to know about the ``output_dir`` property, which you can join
with your output file relative path of choice:
.. code-block:: python
outfile = os.path.join(self.output_dir, "my-output-file.txt")
The directory will be provided by the ``output_dir`` property but it won't exist,
so you should create it:
.. code-block:: python
# Create the output directory
if not os.path.exists(self._output_dir):
os.makedirs(self._output_dir)
If you are generating results that match to specific files in the package
install directory, you should try to maintain those paths in the case that
there are equivalently named files in different directories that would
overwrite one another. As an example of an analyzer with a custom save,
the Libabigail analyzer saves ``*.xml`` files to the analyzer metadata
folder in ``run()``, as they are either binaries, or as xml (text) would
usually be too big to pass in one request. For this reason, the files
are saved during ``run()`` and the filenames added to the result object,
and then when the result object is passed back into ``save_result()``,
we skip saving to the filesystem, and instead read the file and send
each one (separately) to the monitor:
.. code-block:: python
def save_result(self, result, monitor=None, overwrite=False):
"""ABI results are saved to individual files, so each one needs to be
read and uploaded. Result here should be the lookup generated in run(),
the key is the analyzer name, and each value is the result file.
We currently upload the entire xml as text because libabigail can't
easily read gzipped xml, but this will be updated when it can.
"""
if not monitor:
return
name = self.spec.package.name
for obj, filename in result.get(self.name, {}).items():
# Don't include the prefix
rel_path = obj.replace(self.spec.prefix + os.path.sep, "")
# We've already saved the results to file during run
content = spack.monitor.read_file(filename)
# A result needs an analyzer, value or binary_value, and name
data = {"value": content, "install_file": rel_path, "name": "abidw-xml"}
tty.info("Sending result for %s %s to monitor." % (name, rel_path))
monitor.send_analyze_metadata(self.spec.package, {"libabigail": [data]})
Notice that this function, if you define it, requires a result object (generated by
``run()``), a monitor (if you want to send), and a boolean ``overwrite`` used
to check whether a result exists first, and not write to it if the result exists
and overwrite is False. Also notice that since we already saved these files to the
analyzer metadata folder, we return early if a monitor isn't defined, because this
function serves to send results to the monitor. If you haven't saved anything to the analyzer metadata folder
yet, you might want to do that here. You should also use ``tty.info`` to give
the user a message of "Writing result to $DIRNAME."
.. _writing-commands:
@@ -448,6 +699,23 @@ with a hook, and this is the purpose of this particular hook. Akin to
``on_phase_success`` we require the same variables - the package that failed,
the name of the phase, and the log file where we might find errors.
"""""""""""""""""""""""""""""""""
``on_analyzer_save(pkg, result)``
"""""""""""""""""""""""""""""""""
After an analyzer has saved some result for a package, this hook is called,
and it provides the package that we just ran the analysis for, along with
the loaded result. Typically, a result is structured to have the name
of the analyzer as key, and the result object that is defined in detail in
:ref:`analyzer_run_function`.
.. code-block:: python
def on_analyzer_save(pkg, result):
"""given a package and a result...
"""
print('Do something extra with a package analysis result here')
^^^^^^^^^^^^^^^^^^^^^^
Adding a New Hook Type

Binary file not shown (After: 88 KiB).

View File

@@ -1013,7 +1013,7 @@ The following advanced example shows how generated targets can be used in a
SPACK ?= spack
.PHONY: all clean env
.PHONY: all clean fetch env
all: env
@@ -1022,6 +1022,9 @@ The following advanced example shows how generated targets can be used in a
env.mk: spack.lock
$(SPACK) -e . env depfile -o $@ --make-target-prefix spack
fetch: spack/fetch
$(info Environment fetched!)
env: spack/env
$(info Environment installed!)
@@ -1034,10 +1037,10 @@ The following advanced example shows how generated targets can be used in a
endif
When ``make`` is invoked, it first "remakes" the missing include ``env.mk``
from its rule, which triggers concretization. When done, the generated target
``spack/env`` is available. In the above example, the ``env`` target uses this generated
target as a prerequisite, meaning that it can make use of the installed packages in
its commands.
from its rule, which triggers concretization. When done, the generated targets
``spack/fetch`` and ``spack/env`` are available. In the above
example, the ``env`` target uses the latter as a prerequisite, meaning
that it can make use of the installed packages in its commands.
As it is typically undesirable to remake ``env.mk`` as part of ``make clean``,
the include is conditional.
@@ -1045,6 +1048,7 @@ the include is conditional.
.. note::
When including generated ``Makefile``\s, it is important to use
the ``--make-target-prefix`` flag and use the non-phony target
``<target-prefix>/env`` as prerequisite, instead of the phony target
``<target-prefix>/all``.
the ``--make-target-prefix`` flag and use the non-phony targets
``<target-prefix>/env`` and ``<target-prefix>/fetch`` as
prerequisites, instead of the phony targets ``<target-prefix>/all``
and ``<target-prefix>/fetch-all`` respectively.

Binary file not shown (Before: 70 KiB).

View File

@@ -308,7 +308,7 @@ the variable ``FOOBAR`` will be unset.
spec constraints are instead evaluated top to bottom.
""""""""""""""""""""""""""""""""""""""""""""
Exclude or include specific module files
Blacklist or whitelist specific module files
""""""""""""""""""""""""""""""""""""""""""""
You can use anonymous specs also to prevent module files from being written or
@@ -322,8 +322,8 @@ your system. If you write a configuration file like:
modules:
default:
tcl:
include: ['gcc', 'llvm'] # include will have precedence over exclude
exclude: ['%gcc@4.4.7'] # Assuming gcc@4.4.7 is the system compiler
whitelist: ['gcc', 'llvm'] # Whitelist will have precedence over blacklist
blacklist: ['%gcc@4.4.7'] # Assuming gcc@4.4.7 is the system compiler
you will prevent the generation of module files for any package that
is compiled with ``gcc@4.4.7``, with the only exception of any ``gcc``
@@ -490,7 +490,7 @@ satisfies a default, Spack will generate the module file in the
appropriate path, and will generate a default symlink to the module
file as well.
.. warning::
.. warning::
If Spack is configured to generate multiple default packages in the
same directory, the last modulefile to be generated will be the
default module.
@@ -589,7 +589,7 @@ Filter out environment modifications
Modifications to certain environment variables in module files are there by
default, for instance because they are generated by prefix inspections.
If you want to prevent modifications to some environment variables, you can
do so by using the ``exclude_env_vars``:
do so by using the environment blacklist:
.. code-block:: yaml
@@ -599,7 +599,7 @@ do so by using the ``exclude_env_vars``:
all:
filter:
# Exclude changes to any of these variables
exclude_env_vars: ['CPATH', 'LIBRARY_PATH']
environment_blacklist: ['CPATH', 'LIBRARY_PATH']
The configuration above will generate module files that will not contain
modifications to either ``CPATH`` or ``LIBRARY_PATH``.

View File

@@ -1070,32 +1070,13 @@ Commits
Submodules
You can supply ``submodules=True`` to cause Spack to fetch submodules
recursively along with the repository at fetch time.
recursively along with the repository at fetch time. For more information
about git submodules see the manpage of git: ``man git-submodule``.
.. code-block:: python
version('1.0.1', tag='v1.0.1', submodules=True)
If a package needs more fine-grained control over submodules, define
``submodules`` to be a callable function that takes the package instance as
its only argument. The function should return a list of submodules to be fetched.
.. code-block:: python
def submodules(package):
submodules = []
if "+variant-1" in package.spec:
submodules.append("submodule_for_variant_1")
if "+variant-2" in package.spec:
submodules.append("submodule_for_variant_2")
return submodules
class MyPackage(Package):
version("0.1.0", submodules=submodules)
For more information about git submodules see the manpage of git: ``man
git-submodule``.
.. _github-fetch:
@@ -2412,9 +2393,9 @@ Influence how dependents are built or run
Spack provides a mechanism for dependencies to influence the
environment of their dependents by overriding the
:meth:`setup_dependent_run_environment <spack.package_base.PackageBase.setup_dependent_run_environment>`
:meth:`setup_dependent_run_environment <spack.package.PackageBase.setup_dependent_run_environment>`
or the
:meth:`setup_dependent_build_environment <spack.package_base.PackageBase.setup_dependent_build_environment>`
:meth:`setup_dependent_build_environment <spack.package.PackageBase.setup_dependent_build_environment>`
methods.
The Qt package, for instance, uses this call:
@@ -2436,7 +2417,7 @@ will have the ``PYTHONPATH``, ``PYTHONHOME`` and ``PATH`` environment
variables set appropriately before starting the installation. To make things
even simpler the ``python setup.py`` command is also inserted into the module
scope of dependents by overriding a third method called
:meth:`setup_dependent_package <spack.package_base.PackageBase.setup_dependent_package>`
:meth:`setup_dependent_package <spack.package.PackageBase.setup_dependent_package>`
:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/python/package.py
@@ -2794,256 +2775,6 @@ Suppose a user invokes ``spack install`` like this:
Spack will fail with a constraint violation, because the version of
MPICH requested is too low for the ``mpi`` requirement in ``foo``.
.. _custom-attributes:
------------------
Custom attributes
------------------
Often a package will need to provide attributes for dependents to query
various details about what it provides. While any number of custom defined
attributes can be implemented by a package, the four specific attributes
described below are always available on every package with default
implementations and the ability to customize with alternate implementations
in the case of virtual packages provided:
=========== =========================================== =====================
Attribute   Purpose                                     Default
=========== =========================================== =====================
``home``    The installation path for the package       ``spec.prefix``
``command`` An executable command for the package       | ``spec.name`` found
                                                        | in ``.home.bin``
``headers`` A list of headers provided by the package   | All headers searched
                                                        | recursively in
                                                        | ``.home.include``
``libs``    A list of libraries provided by the package | ``lib{spec.name}``
                                                        | searched recursively
                                                        | in ``.home`` starting
                                                        | with ``lib``,
                                                        | ``lib64``, then the
                                                        | rest of ``.home``
=========== =========================================== =====================
Each of these can be customized by implementing the relevant attribute
as a ``@property`` in the package's class:
.. code-block:: python
:linenos:
class Foo(Package):
...
@property
def libs(self):
# The library provided by Foo is libMyFoo.so
return find_libraries('libMyFoo', root=self.home, recursive=True)
A package may also provide a custom implementation of each attribute
for the virtual packages it provides by implementing the
``virtualpackagename_attributename`` property in the package's class.
The implementation used is the first one found from:
#. Specialized virtual: ``Package.virtualpackagename_attributename``
#. Generic package: ``Package.attributename``
#. Default
The use of customized attributes is demonstrated in the next example.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Example: Customized attributes for virtual packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Consider a package ``foo`` that can optionally provide two virtual
packages ``bar`` and ``baz``. When both are enabled the installation tree
appears as follows:
.. code-block:: console
include/foo.h
include/bar/bar.h
lib64/libFoo.so
lib64/libFooBar.so
baz/include/baz/baz.h
baz/lib/libFooBaz.so
The install tree shows that ``foo`` is providing the header ``include/foo.h``
and library ``lib64/libFoo.so`` in its install prefix. The virtual
package ``bar`` is providing ``include/bar/bar.h`` and library
``lib64/libFooBar.so``, also in ``foo``'s install prefix. The ``baz``
package, however, is provided in the ``baz`` subdirectory of ``foo``'s
prefix with the ``include/baz/baz.h`` header and ``lib/libFooBaz.so``
library. Such a package could implement the optional attributes as
follows:
.. code-block:: python
:linenos:
class Foo(Package):
...
variant('bar', default=False, description='Enable the Foo implementation of bar')
variant('baz', default=False, description='Enable the Foo implementation of baz')
...
provides('bar', when='+bar')
provides('baz', when='+baz')
....
# Just the foo headers
@property
def headers(self):
return find_headers('foo', root=self.home.include, recursive=False)
# Just the foo libraries
@property
def libs(self):
return find_libraries('libFoo', root=self.home, recursive=True)
# The header provided by the bar virtual package
@property
def bar_headers(self):
return find_headers('bar/bar.h', root=self.home.include, recursive=False)
# The library provided by the bar virtual package
@property
def bar_libs(self):
return find_libraries('libFooBar', root=self.home, recursive=True)
# The baz virtual package home
@property
def baz_home(self):
return self.prefix.baz
# The header provided by the baz virtual package
@property
def baz_headers(self):
return find_headers('baz/baz', root=self.baz_home.include, recursive=False)
# The library provided by the baz virtual package
@property
def baz_libs(self):
return find_libraries('libFooBaz', root=self.baz_home, recursive=True)
Now consider another package, ``foo-app``, depending on all three:
.. code-block:: python
:linenos:
class FooApp(CMakePackage):
...
depends_on('foo')
depends_on('bar')
depends_on('baz')
The resulting spec objects for its dependencies show the result of
the above attribute implementations:
.. code-block:: python
# The core headers and libraries of the foo package
>>> spec['foo']
foo@1.0%gcc@11.3.1+bar+baz arch=linux-fedora35-haswell
>>> spec['foo'].prefix
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6'
# home defaults to the package install prefix without an explicit implementation
>>> spec['foo'].home
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6'
# foo headers from the foo prefix
>>> spec['foo'].headers
HeaderList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/include/foo.h',
])
# foo include directories from the foo prefix
>>> spec['foo'].headers.directories
['/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/include']
# foo libraries from the foo prefix
>>> spec['foo'].libs
LibraryList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/lib64/libFoo.so',
])
# foo library directories from the foo prefix
>>> spec['foo'].libs.directories
['/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/lib64']
.. code-block:: python
# The virtual bar package in the same prefix as foo
# bar resolves to the foo package
>>> spec['bar']
foo@1.0%gcc@11.3.1+bar+baz arch=linux-fedora35-haswell
>>> spec['bar'].prefix
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6'
# home defaults to the foo prefix without either a Foo.bar_home
# or Foo.home implementation
>>> spec['bar'].home
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6'
# bar header in the foo prefix
>>> spec['bar'].headers
HeaderList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/include/bar/bar.h'
])
# bar include dirs from the foo prefix
>>> spec['bar'].headers.directories
['/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/include']
# bar library from the foo prefix
>>> spec['bar'].libs
LibraryList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/lib64/libFooBar.so'
])
# bar library directories from the foo prefix
>>> spec['bar'].libs.directories
['/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/lib64']
.. code-block:: python
# The virtual baz package in a subdirectory of foo's prefix
# baz resolves to the foo package
>>> spec['baz']
foo@1.0%gcc@11.3.1+bar+baz arch=linux-fedora35-haswell
>>> spec['baz'].prefix
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6'
# baz_home implementation provides the subdirectory inside the foo prefix
>>> spec['baz'].home
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz'
# baz headers in the baz subdirectory of the foo prefix
>>> spec['baz'].headers
HeaderList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/include/baz/baz.h'
])
# baz include directories in the baz subdirectory of the foo prefix
>>> spec['baz'].headers.directories
[
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/include'
]
# baz libraries in the baz subdirectory of the foo prefix
>>> spec['baz'].libs
LibraryList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib/libFooBaz.so'
])
# baz library directories in the baz subdirectory of the foo prefix
>>> spec['baz'].libs.directories
[
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib'
]
.. _abstract-and-concrete:
-------------------------
@@ -3291,7 +3022,7 @@ The classes that are currently provided by Spack are:
+----------------------------------------------------------+----------------------------------+
| **Base Class** | **Purpose** |
+==========================================================+==================================+
| :class:`~spack.package_base.Package` | General base class not |
| :class:`~spack.package.Package` | General base class not |
| | specialized for any build system |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.makefile.MakefilePackage` | Specialized class for packages |
@@ -3422,7 +3153,7 @@ for the install phase is:
For those not used to Python instance methods, this is the
package itself. In this case it's an instance of ``Foo``, which
extends ``Package``. For API docs on Package objects, see
:py:class:`Package <spack.package_base.Package>`.
:py:class:`Package <spack.package.Package>`.
``spec``
This is the concrete spec object created by Spack from an
@@ -5745,24 +5476,6 @@ Version Lists
Spack packages should list supported versions with the newest first.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using ``home`` vs ``prefix``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
``home`` and ``prefix`` are both attributes that can be queried on a
package's dependencies, often when passing configure arguments pointing to the
location of a dependency. The difference is that while ``prefix`` is the
location on disk where a concrete package resides, ``home`` is the `logical`
location in which a package resides, which may differ from ``prefix`` in
the case of virtual packages or other special circumstances. For most use
cases inside a package, its dependency locations can be accessed via either
``self.spec['foo'].home`` or ``self.spec['foo'].prefix``. Specific packages
that should be consumed by dependents via ``.home`` instead of ``.prefix``
should be noted in their respective documentation.
See :ref:`custom-attributes` for more details and an example implementing
a custom ``home`` attribute.
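For example, a dependent package might point a configure argument at a
virtual dependency's logical location; a minimal, hypothetical sketch:

.. code-block:: python

   def configure_args(self):
       # ``home`` resolves virtuals to their logical location, while
       # ``prefix`` is always the concrete installation directory
       return ['--with-bar={0}'.format(self.spec['bar'].home)]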
---------------------------
Packaging workflow commands
---------------------------

View File

@@ -7,7 +7,7 @@ bash, , , Compiler wrappers
tar, , , Extract/create archives
gzip, , , Compress/Decompress archives
unzip, , , Compress/Decompress archives
bzip2, , , Compress/Decompress archives
bzip, , , Compress/Decompress archives
xz, , , Compress/Decompress archives
zstd, , Optional, Compress/Decompress archives
file, , , Create/Use Buildcaches
@@ -15,4 +15,4 @@ gnupg2, , , Sign/Verify Buildcaches
git, , , Manage Software Repositories
svn, , Optional, Manage Software Repositories
hg, , Optional, Manage Software Repositories
Python header files, , Optional (e.g. ``python3-dev`` on Debian), Bootstrapping from sources

View File

@@ -308,68 +308,6 @@ def change_sed_delimiter(old_delim, new_delim, *filenames):
filter_file(double_quoted, '"%s"' % repl, f)
@contextmanager
def exploding_archive_catch(stage):
# Check for an exploding tarball, i.e. one that doesn't expand to
# a single directory. If the tarball *didn't* explode, move its
# contents to the staging source directory & remove the container
# directory. If the tarball did explode, just rename the tarball
# directory to the staging source directory.
#
# NOTE: The tar program on Mac OS X will encode HFS metadata in
# hidden files, which can end up *alongside* a single top-level
# directory. We initially ignore presence of hidden files to
# accommodate these "semi-exploding" tarballs but ensure the files
# are copied to the source directory.
# Expand each tarball in its own directory so that exploding
# tarballs are contained.
tarball_container = os.path.join(stage.path,
"spack-expanded-archive")
mkdirp(tarball_container)
orig_dir = os.getcwd()
os.chdir(tarball_container)
try:
yield
# catch an exploding archive on successful extraction
os.chdir(orig_dir)
exploding_archive_handler(tarball_container, stage)
except Exception as e:
# return current directory context to previous on failure
os.chdir(orig_dir)
raise e
@system_path_filter
def exploding_archive_handler(tarball_container, stage):
"""
Args:
tarball_container: where the archive was expanded to
stage: Stage object referencing filesystem location
where archive is being expanded
"""
files = os.listdir(tarball_container)
non_hidden = [f for f in files if not f.startswith('.')]
if len(non_hidden) == 1:
src = os.path.join(tarball_container, non_hidden[0])
if os.path.isdir(src):
stage.srcdir = non_hidden[0]
shutil.move(src, stage.source_path)
if len(files) > 1:
files.remove(non_hidden[0])
for f in files:
src = os.path.join(tarball_container, f)
dest = os.path.join(stage.path, f)
shutil.move(src, dest)
os.rmdir(tarball_container)
else:
# This is a non-directory entry (e.g., a patch file) so simply
# rename the tarball container to be the source path.
shutil.move(tarball_container, stage.source_path)
else:
shutil.move(tarball_container, stage.source_path)
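A hypothetical caller (not part of this file) extracts inside the catch so
that stray top-level files end up in the staging source directory:

# assumed names: ``stage`` is a Stage, ``tar`` an Executable, and
# ``archive_file`` the path to the downloaded archive
with exploding_archive_catch(stage):
    tar('-xf', archive_file)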
@system_path_filter(arg_slice=slice(1))
def get_owner_uid(path, err_msg=None):
if not os.path.exists(path):

View File

@@ -1072,15 +1072,3 @@ def __exit__(self, exc_type, exc_value, tb):
# Suppress any exception from being re-raised:
# https://docs.python.org/3/reference/datamodel.html#object.__exit__.
return True
class classproperty(object):
"""Non-data descriptor to evaluate a class-level property. The function that performs
the evaluation is injected at creation time and take an instance (could be None) and
an owner (i.e. the class that originated the instance)
"""
def __init__(self, callback):
self.callback = callback
def __get__(self, instance, owner):
return self.callback(owner)
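A short usage sketch (hypothetical class, not part of this file):

class Example(object):
    @classproperty
    def pretty_name(cls):
        # evaluated against the owner class, no instance required
        return cls.__name__.lower()

assert Example.pretty_name == 'example'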

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: (major, minor, micro, dev release) tuple
spack_version_info = (0, 19, 0, 'dev0')
spack_version_info = (0, 18, 1)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
spack_version = '.'.join(str(s) for s in spack_version_info)

View File

@@ -0,0 +1,42 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This package contains code for creating analyzers to extract Application
Binary Interface (ABI) information, along with simple analyses that just load
existing metadata.
"""
from __future__ import absolute_import
import llnl.util.tty as tty
import spack.paths
import spack.util.classes
mod_path = spack.paths.analyzers_path
analyzers = spack.util.classes.list_classes("spack.analyzers", mod_path)
# The base analyzer has no name, so this registry cannot be built
# with a dict comprehension
analyzer_types = {}
for a in analyzers:
if not hasattr(a, "name"):
continue
analyzer_types[a.name] = a
def list_all():
"""A helper function to list all analyzers and their descriptions
"""
for name, analyzer in analyzer_types.items():
print("%-25s: %-35s" % (name, analyzer.description))
def get_analyzer(name):
"""Courtesy function to retrieve an analyzer, and exit on error if it
does not exist.
"""
if name in analyzer_types:
return analyzer_types[name]
tty.die("Analyzer %s does not exist" % name)
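A sketch of how this registry might be used, assuming ``spec`` is the spec
of an installed package:

# look up the analyzer class by name and run it against the spec
analyzer_cls = get_analyzer('config_args')
result = analyzer_cls(spec).run()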

View File

@@ -0,0 +1,116 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""An analyzer base provides basic functions to run the analysis, save results,
and (optionally) interact with a Spack Monitor
"""
import os
import llnl.util.tty as tty
import spack.config
import spack.hooks
import spack.monitor
import spack.util.path
def get_analyzer_dir(spec, analyzer_dir=None):
"""
Given a spec, return the directory to save analyzer results.
We create the directory if it does not exist. We also check that the
spec has an associated package. An analyzer cannot be run if the spec isn't
associated with a package. If the user provides a custom analyzer_dir,
we use it instead of checking the config and the default at ~/.spack/analyzers.
"""
# An analyzer cannot be run if the spec isn't associated with a package
if not hasattr(spec, "package") or not spec.package:
tty.die("A spec can only be analyzed with an associated package.")
# The top level directory is in the user home, or a custom location
if not analyzer_dir:
analyzer_dir = spack.util.path.canonicalize_path(
spack.config.get('config:analyzers_dir', '~/.spack/analyzers'))
# We follow the same convention as the spec install (this could be better)
package_prefix = os.sep.join(spec.package.prefix.split('/')[-3:])
meta_dir = os.path.join(analyzer_dir, package_prefix)
return meta_dir
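With the default configuration, the returned directory mirrors the last three
components of the install prefix; for a hypothetical prefix:

# spec.package.prefix = /opt/spack/<arch>/<compiler>/foo-1.0-<hash>
# get_analyzer_dir(spec) -> ~/.spack/analyzers/<arch>/<compiler>/foo-1.0-<hash>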
class AnalyzerBase(object):
def __init__(self, spec, dirname=None):
"""
Verify that the analyzer has correct metadata.
An Analyzer is intended to run on one spec install, so the spec
with its associated package is required on init. The child analyzer
class should define an ``__init__`` that calls this one via ``super()`` and
also checks that the analyzer has all the dependencies it needs. If an
analyzer subclass has no extra dependencies, it does not need to define an
``__init__``. An analyzer should not be allowed to proceed
if one or more dependencies are missing. The dirname, if defined,
is an optional directory name to save to (instead of the default meta
spack directory).
"""
self.spec = spec
self.dirname = dirname
self.meta_dir = os.path.dirname(spec.package.install_log_path)
for required in ["name", "outfile", "description"]:
if not hasattr(self, required):
tty.die("Please add a %s attribute on the analyzer." % required)
def run(self):
"""
Given a spec with an installed package, run the analyzer on it.
"""
raise NotImplementedError
@property
def output_dir(self):
"""
The full path to the output directory.
This includes the nested analyzer directory structure. This function
does not create anything.
"""
if not hasattr(self, "_output_dir"):
output_dir = get_analyzer_dir(self.spec, self.dirname)
self._output_dir = os.path.join(output_dir, self.name)
return self._output_dir
def save_result(self, result, overwrite=False):
"""
Save a result to the associated spack monitor, if defined.
This function is on the level of the analyzer because it might be
the case that the result is large (appropriate for a single request)
or that the data is organized differently (e.g., more than one
request per result). If an analyzer subclass needs to over-write
this function with a custom save, that is appropriate to do (see abi).
"""
# We maintain the structure in json with the analyzer as key so
# that in the future, we could upload to a monitor server
if result[self.name]:
outfile = os.path.join(self.output_dir, self.outfile)
# Only try to create the results directory if we have a result
if not os.path.exists(self._output_dir):
os.makedirs(self._output_dir)
# Don't overwrite an existing result if overwrite is False
if os.path.exists(outfile) and not overwrite:
tty.info("%s exists and overwrite is False, skipping." % outfile)
else:
tty.info("Writing result to %s" % outfile)
spack.monitor.write_json(result[self.name], outfile)
# This hook runs after a save result
spack.hooks.on_analyzer_save(self.spec.package, result)

View File

@@ -0,0 +1,33 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""A configargs analyzer is a class of analyzer that typically just uploads
already existing metadata about config args from a package spec install
directory."""
import os
import spack.monitor
from .analyzer_base import AnalyzerBase
class ConfigArgs(AnalyzerBase):
name = "config_args"
outfile = "spack-analyzer-config-args.json"
description = "config args loaded from spack-configure-args.txt"
def run(self):
"""
Load spack-configure-args.txt and save it as json.
The run function finds the spack-configure-args.txt file in the
package install directory and reads it into a json structure that has
the name of the analyzer as the key.
"""
config_file = os.path.join(self.meta_dir, "spack-configure-args.txt")
return {self.name: spack.monitor.read_file(config_file)}

View File

@@ -0,0 +1,54 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""An environment analyzer will read and parse the environment variables
file in the installed package directory, generating a json file that has
an index of key, value pairs for environment variables."""
import os
import llnl.util.tty as tty
from spack.util.environment import EnvironmentModifications
from .analyzer_base import AnalyzerBase
class EnvironmentVariables(AnalyzerBase):
name = "environment_variables"
outfile = "spack-analyzer-environment-variables.json"
description = "environment variables parsed from spack-build-env.txt"
def run(self):
"""
Load, parse, and save spack-build-env.txt to analyzers.
Read in the spack-build-env.txt file from the package install
directory and parse the environment variables into key value pairs.
The result is keyed by the analyzer name.
"""
env_file = os.path.join(self.meta_dir, "spack-build-env.txt")
return {self.name: self._read_environment_file(env_file)}
def _read_environment_file(self, filename):
"""
Read and parse the environment file.
Given an environment file, we want to read it, split by semicolons
and new lines, and then parse down to the subset of SPACK_* variables.
We assume that all spack prefix variables are not secrets, and unlike
the install_manifest.json, we don't (at least to start) parse the values
to remove path prefixes specific to user systems.
"""
if not os.path.exists(filename):
tty.warn("No environment file available")
return
mods = EnvironmentModifications.from_sourcing_file(filename)
env = {}
mods.apply_modifications(env)
return env

View File

@@ -0,0 +1,31 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""The install files json file (install_manifest.json) already exists in
the package install folder, so this analyzer simply moves it to the user
analyzer folder for further processing."""
import os
import spack.monitor
from .analyzer_base import AnalyzerBase
class InstallFiles(AnalyzerBase):
name = "install_files"
outfile = "spack-analyzer-install-files.json"
description = "install file listing read from install_manifest.json"
def run(self):
"""
Load in the install_manifest.json and save to analyzers.
We write it out to the analyzers folder, with key as the analyzer name.
"""
manifest_file = os.path.join(self.meta_dir, "install_manifest.json")
return {self.name: spack.monitor.read_json(manifest_file)}

View File

@@ -0,0 +1,114 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import llnl.util.tty as tty
import spack
import spack.binary_distribution
import spack.bootstrap
import spack.error
import spack.hooks
import spack.monitor
import spack.package
import spack.repo
import spack.util.executable
from .analyzer_base import AnalyzerBase
class Libabigail(AnalyzerBase):
name = "libabigail"
outfile = "spack-analyzer-libabigail.json"
description = "Application Binary Interface (ABI) features for objects"
def __init__(self, spec, dirname=None):
"""
init for an analyzer ensures we have all needed dependencies.
For the libabigail analyzer, this means Libabigail.
Since the output for libabigail is one file per object, we communicate
with the monitor multiple times.
"""
super(Libabigail, self).__init__(spec, dirname)
# Importing this at the module level does not seem to work
tty.debug("Preparing to use Libabigail, will install if missing.")
with spack.bootstrap.ensure_bootstrap_configuration():
# libabigail won't install lib/bin/share without docs
spec = spack.spec.Spec("libabigail+docs")
spack.bootstrap.ensure_executables_in_path_or_raise(
["abidw"], abstract_spec=spec
)
self.abidw = spack.util.executable.which('abidw')
def run(self):
"""
Run libabigail, and save results to filename.
This run function differs in that we write as we generate and then
return a dict with the analyzer name as the key, and the value of a
dict of results, where the key is the object name, and the value is
the output file written to.
"""
manifest = spack.binary_distribution.get_buildfile_manifest(self.spec)
# This result will store a path to each file
result = {}
# Generate an output file for each binary or object
for obj in manifest.get("binary_to_relocate_fullpath", []):
# We want to preserve the path in the install directory in case
# a library has an equivalently named lib or executable, for example
outdir = os.path.dirname(obj.replace(self.spec.package.prefix,
'').strip(os.path.sep))
outfile = "spack-analyzer-libabigail-%s.xml" % os.path.basename(obj)
outfile = os.path.join(self.output_dir, outdir, outfile)
outdir = os.path.dirname(outfile)
# Create the output directory
if not os.path.exists(outdir):
os.makedirs(outdir)
# Sometimes libabigail segfaults and dumps
try:
self.abidw(obj, "--out-file", outfile)
result[obj] = outfile
tty.info("Writing result to %s" % outfile)
except spack.error.SpackError:
tty.warn("Issue running abidw for %s" % obj)
return {self.name: result}
def save_result(self, result, overwrite=False):
"""
Read saved ABI results and upload to monitor server.
ABI results are saved to individual files, so each one needs to be
read and uploaded. Result here should be the lookup generated in run(),
the key is the analyzer name, and each value is the result file.
We currently upload the entire xml as text because libabigail can't
easily read gzipped xml, but this will be updated when it can.
"""
if not spack.monitor.cli:
return
name = self.spec.package.name
for obj, filename in result.get(self.name, {}).items():
# Don't include the prefix
rel_path = obj.replace(self.spec.prefix + os.path.sep, "")
# We've already saved the results to file during run
content = spack.monitor.read_file(filename)
# A result needs an analyzer, value or binary_value, and name
data = {"value": content, "install_file": rel_path, "name": "abidw-xml"}
tty.info("Sending result for %s %s to monitor." % (name, rel_path))
spack.hooks.on_analyzer_save(self.spec.package, {"libabigail": [data]})
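The ``result`` argument here has the shape produced by ``run()`` above; a
hypothetical example:

# {'libabigail': {'<prefix>/lib/libfoo.so':
#                 '<output_dir>/spack-analyzer-libabigail-libfoo.so.xml'}}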

View File

@@ -281,15 +281,15 @@ def _check_build_test_callbacks(pkgs, error_cls):
"""Ensure stand-alone test method is not included in build-time callbacks"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
test_callbacks = pkg_cls.build_time_test_callbacks
pkg = spack.repo.get(pkg_name)
test_callbacks = pkg.build_time_test_callbacks
if test_callbacks and 'test' in test_callbacks:
msg = ('{0} package contains "test" method in '
'build_time_test_callbacks')
instr = ('Remove "test" from: [{0}]'
.format(', '.join(test_callbacks)))
errors.append(error_cls(msg.format(pkg_name), [instr]))
errors.append(error_cls(msg.format(pkg.name), [instr]))
return errors
@@ -304,8 +304,8 @@ def _check_patch_urls(pkgs, error_cls):
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
for condition, patches in pkg_cls.patches.items():
pkg = spack.repo.get(pkg_name)
for condition, patches in pkg.patches.items():
for patch in patches:
if not isinstance(patch, spack.patch.UrlPatch):
continue
@@ -317,7 +317,7 @@ def _check_patch_urls(pkgs, error_cls):
if not patch.url.endswith(full_index_arg):
errors.append(error_cls(
"patch URL in package {0} must end with {1}".format(
pkg_cls.name, full_index_arg,
pkg.name, full_index_arg,
),
[patch.url],
))
@@ -331,21 +331,21 @@ def _linting_package_file(pkgs, error_cls):
"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg = spack.repo.get(pkg_name)
# Does the homepage have http, and if so, does https work?
if pkg_cls.homepage.startswith('http://'):
https = re.sub("http", "https", pkg_cls.homepage, 1)
if pkg.homepage.startswith('http://'):
https = re.sub("http", "https", pkg.homepage, 1)
try:
response = urlopen(https)
except Exception as e:
msg = 'Error with attempting https for "{0}": '
errors.append(error_cls(msg.format(pkg_cls.name), [str(e)]))
errors.append(error_cls(msg.format(pkg.name), [str(e)]))
continue
if response.getcode() == 200:
msg = 'Package "{0}" uses http but has a valid https endpoint.'
errors.append(msg.format(pkg_cls.name))
errors.append(msg.format(pkg.name))
return llnl.util.lang.dedupe(errors)
@@ -355,10 +355,10 @@ def _unknown_variants_in_directives(pkgs, error_cls):
"""Report unknown or wrong variants in directives for this package"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg = spack.repo.get(pkg_name)
# Check "conflicts" directive
for conflict, triggers in pkg_cls.conflicts.items():
for conflict, triggers in pkg.conflicts.items():
for trigger, _ in triggers:
vrn = spack.spec.Spec(conflict)
try:
@@ -371,34 +371,34 @@ def _unknown_variants_in_directives(pkgs, error_cls):
# When os and target constraints can be created independently of
# the platform, TODO change this back to add an error.
errors.extend(_analyze_variants_in_directive(
pkg_cls, spack.spec.Spec(trigger),
pkg, spack.spec.Spec(trigger),
directive='conflicts', error_cls=error_cls
))
errors.extend(_analyze_variants_in_directive(
pkg_cls, vrn, directive='conflicts', error_cls=error_cls
pkg, vrn, directive='conflicts', error_cls=error_cls
))
# Check "depends_on" directive
for _, triggers in pkg_cls.dependencies.items():
for _, triggers in pkg.dependencies.items():
triggers = list(triggers)
for trigger in list(triggers):
vrn = spack.spec.Spec(trigger)
errors.extend(_analyze_variants_in_directive(
pkg_cls, vrn, directive='depends_on', error_cls=error_cls
pkg, vrn, directive='depends_on', error_cls=error_cls
))
# Check "patch" directive
for _, triggers in pkg_cls.provided.items():
for _, triggers in pkg.provided.items():
triggers = [spack.spec.Spec(x) for x in triggers]
for vrn in triggers:
errors.extend(_analyze_variants_in_directive(
pkg_cls, vrn, directive='patch', error_cls=error_cls
pkg, vrn, directive='patch', error_cls=error_cls
))
# Check "resource" directive
for vrn in pkg_cls.resources:
for vrn in pkg.resources:
errors.extend(_analyze_variants_in_directive(
pkg_cls, vrn, directive='resource', error_cls=error_cls
pkg, vrn, directive='resource', error_cls=error_cls
))
return llnl.util.lang.dedupe(errors)
@@ -409,15 +409,15 @@ def _unknown_variants_in_dependencies(pkgs, error_cls):
"""Report unknown dependencies and wrong variants for dependencies"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg = spack.repo.get(pkg_name)
filename = spack.repo.path.filename_for_package_name(pkg_name)
for dependency_name, dependency_data in pkg_cls.dependencies.items():
for dependency_name, dependency_data in pkg.dependencies.items():
# No need to analyze virtual packages
if spack.repo.path.is_virtual(dependency_name):
continue
try:
dependency_pkg_cls = spack.repo.path.get_pkg_class(dependency_name)
dependency_pkg = spack.repo.get(dependency_name)
except spack.repo.UnknownPackageError:
# This dependency is completely missing, so report
# and continue the analysis
@@ -433,8 +433,8 @@ def _unknown_variants_in_dependencies(pkgs, error_cls):
dependency_variants = dependency_edge.spec.variants
for name, value in dependency_variants.items():
try:
v, _ = dependency_pkg_cls.variants[name]
v.validate_or_raise(value, pkg_cls=dependency_pkg_cls)
v, _ = dependency_pkg.variants[name]
v.validate_or_raise(value, pkg=dependency_pkg)
except Exception as e:
summary = (pkg_name + ": wrong variant used for a "
"dependency in a 'depends_on' directive")
@@ -456,10 +456,10 @@ def _version_constraints_are_satisfiable_by_some_version_in_repo(pkgs, error_cls
"""Report if version constraints used in directives are not satisfiable"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg = spack.repo.get(pkg_name)
filename = spack.repo.path.filename_for_package_name(pkg_name)
dependencies_to_check = []
for dependency_name, dependency_data in pkg_cls.dependencies.items():
for dependency_name, dependency_data in pkg.dependencies.items():
# Skip virtual dependencies for the time being, check on
# their versions can be added later
if spack.repo.path.is_virtual(dependency_name):
@@ -470,19 +470,19 @@ def _version_constraints_are_satisfiable_by_some_version_in_repo(pkgs, error_cls
)
for s in dependencies_to_check:
dependency_pkg_cls = None
dependency_pkg = None
try:
dependency_pkg_cls = spack.repo.path.get_pkg_class(s.name)
dependency_pkg = spack.repo.get(s.name)
assert any(
v.satisfies(s.versions) for v in list(dependency_pkg_cls.versions)
v.satisfies(s.versions) for v in list(dependency_pkg.versions)
)
except Exception:
summary = ("{0}: dependency on {1} cannot be satisfied "
"by known versions of {1.name}").format(pkg_name, s)
details = ['happening in ' + filename]
if dependency_pkg_cls is not None:
if dependency_pkg is not None:
details.append('known versions of {0.name} are {1}'.format(
s, ', '.join([str(x) for x in dependency_pkg_cls.versions])
s, ', '.join([str(x) for x in dependency_pkg.versions])
))
errors.append(error_cls(summary=summary, details=details))
@@ -500,7 +500,7 @@ def _analyze_variants_in_directive(pkg, constraint, directive, error_cls):
for name, v in constraint.variants.items():
try:
variant, _ = pkg.variants[name]
variant.validate_or_raise(v, pkg_cls=pkg)
variant.validate_or_raise(v, pkg=pkg)
except variant_exceptions as e:
summary = pkg.name + ': wrong variant in "{0}" directive'
summary = summary.format(directive)

View File

@@ -618,7 +618,7 @@ def get_buildfile_manifest(spec):
Return a data structure with information about a build, including
text_to_relocate, binary_to_relocate, binary_to_relocate_fullpath
link_to_relocate, and other, which means it doesn't fit any of the previous
checks (and should not be relocated). We exclude docs (man) and
checks (and should not be relocated). We blacklist docs (man) and
metadata (.spack). This can be used to find a particular kind of file
in spack, or to generate the build metadata.
"""
@@ -626,12 +626,12 @@ def get_buildfile_manifest(spec):
"link_to_relocate": [], "other": [],
"binary_to_relocate_fullpath": []}
exclude_list = (".spack", "man")
blacklist = (".spack", "man")
# Do this during tarball creation to save time when the tarball is unpacked.
# Used by make_package_relative to determine binaries to change.
for root, dirs, files in os.walk(spec.prefix, topdown=True):
dirs[:] = [d for d in dirs if d not in exclude_list]
dirs[:] = [d for d in dirs if d not in blacklist]
# Directories may need to be relocated too.
for directory in dirs:

View File

@@ -80,41 +80,32 @@ def _try_import_from_store(module, query_spec, query_info=None):
for candidate_spec in installed_specs:
pkg = candidate_spec['python'].package
module_paths = [
module_paths = {
os.path.join(candidate_spec.prefix, pkg.purelib),
os.path.join(candidate_spec.prefix, pkg.platlib),
] # type: list[str]
path_before = list(sys.path)
# NOTE: try module_paths first and last: trying them last allows an existing
# version in the path to be picked up and used (possibly depending on something
# in the store), while trying them first allows the bootstrap version to work
# when an incompatible version is in sys.path
orders = [
module_paths + sys.path,
sys.path + module_paths,
]
for path in orders:
sys.path = path
try:
_fix_ext_suffix(candidate_spec)
if _python_import(module):
msg = ('[BOOTSTRAP MODULE {0}] The installed spec "{1}/{2}" '
'provides the "{0}" Python module').format(
module, query_spec, candidate_spec.dag_hash()
)
tty.debug(msg)
if query_info is not None:
query_info['spec'] = candidate_spec
return True
except Exception as e:
msg = ('unexpected error while trying to import module '
'"{0}" from spec "{1}" [error="{2}"]')
tty.warn(msg.format(module, candidate_spec, str(e)))
else:
msg = "Spec {0} did not provide module {1}"
tty.warn(msg.format(candidate_spec, module))
}
sys.path.extend(module_paths)
sys.path = path_before
try:
_fix_ext_suffix(candidate_spec)
if _python_import(module):
msg = ('[BOOTSTRAP MODULE {0}] The installed spec "{1}/{2}" '
'provides the "{0}" Python module').format(
module, query_spec, candidate_spec.dag_hash()
)
tty.debug(msg)
if query_info is not None:
query_info['spec'] = candidate_spec
return True
except Exception as e:
msg = ('unexpected error while trying to import module '
'"{0}" from spec "{1}" [error="{2}"]')
tty.warn(msg.format(module, candidate_spec, str(e)))
else:
msg = "Spec {0} did not provide module {1}"
tty.warn(msg.format(candidate_spec, module))
sys.path = sys.path[:-3]
return False
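The prepend-and-restore idea used above can be summarized by this standalone
sketch (assumed helper, not Spack API):

import importlib
import sys

def try_import_with(paths, module_name):
    saved = list(sys.path)
    try:
        sys.path = paths  # e.g. module_paths + saved, or saved + module_paths
        importlib.import_module(module_name)
        return True
    except ImportError:
        return False
    finally:
        sys.path = saved  # always restore the original search path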
@@ -652,10 +643,10 @@ def _add_compilers_if_missing():
def _add_externals_if_missing():
search_list = [
# clingo
spack.repo.path.get_pkg_class('cmake'),
spack.repo.path.get_pkg_class('bison'),
spack.repo.path.get('cmake'),
spack.repo.path.get('bison'),
# GnuPG
spack.repo.path.get_pkg_class('gawk')
spack.repo.path.get('gawk')
]
detected_packages = spack.detection.by_executable(search_list)
spack.detection.update_configuration(detected_packages, scope='bootstrap')

View File

@@ -55,7 +55,7 @@
import spack.config
import spack.install_test
import spack.main
import spack.package_base
import spack.package
import spack.paths
import spack.platforms
import spack.repo
@@ -722,7 +722,7 @@ def get_std_cmake_args(pkg):
package were a CMakePackage instance.
Args:
pkg (spack.package_base.PackageBase): package under consideration
pkg (spack.package.PackageBase): package under consideration
Returns:
list: arguments for cmake
@@ -738,7 +738,7 @@ def get_std_meson_args(pkg):
package were a MesonPackage instance.
Args:
pkg (spack.package_base.PackageBase): package under consideration
pkg (spack.package.PackageBase): package under consideration
Returns:
list: arguments for meson
@@ -748,12 +748,12 @@ def get_std_meson_args(pkg):
def parent_class_modules(cls):
"""
Get list of superclass modules that descend from spack.package_base.PackageBase
Get list of superclass modules that descend from spack.package.PackageBase
Includes cls.__module__
"""
if (not issubclass(cls, spack.package_base.PackageBase) or
issubclass(spack.package_base.PackageBase, cls)):
if (not issubclass(cls, spack.package.PackageBase) or
issubclass(spack.package.PackageBase, cls)):
return []
result = []
module = sys.modules.get(cls.__module__)
@@ -771,7 +771,7 @@ def load_external_modules(pkg):
associated with them.
Args:
pkg (spack.package_base.PackageBase): package to load deps for
pkg (spack.package.PackageBase): package to load deps for
"""
for dep in list(pkg.spec.traverse()):
external_modules = dep.external_modules or []
@@ -1109,7 +1109,7 @@ def start_build_process(pkg, function, kwargs):
Args:
pkg (spack.package_base.PackageBase): package whose environment we should set up the
pkg (spack.package.PackageBase): package whose environment we should set up the
child process for.
function (typing.Callable): argless function to run in the child
process.
@@ -1234,7 +1234,7 @@ def make_stack(tb, stack=None):
if 'self' in frame.f_locals:
# Find the first proper subclass of PackageBase.
obj = frame.f_locals['self']
if isinstance(obj, spack.package_base.PackageBase):
if isinstance(obj, spack.package.PackageBase):
break
# We found obj, the Package implementation we care about.

View File

@@ -9,7 +9,7 @@
from spack.build_systems.autotools import AutotoolsPackage
from spack.directives import extends
from spack.package_base import ExtensionError
from spack.package import ExtensionError
from spack.util.executable import which

View File

@@ -16,7 +16,7 @@
from spack.build_environment import InstallError
from spack.directives import conflicts, depends_on
from spack.operating_systems.mac_os import macos_version
from spack.package_base import PackageBase, run_after, run_before
from spack.package import PackageBase, run_after, run_before
from spack.util.executable import Executable
from spack.version import Version

View File

@@ -8,7 +8,7 @@
from llnl.util.filesystem import install, mkdirp
from spack.build_systems.cmake import CMakePackage
from spack.package_base import run_after
from spack.package import run_after
def cmake_cache_path(name, value, comment=""):

View File

@@ -18,7 +18,7 @@
import spack.build_environment
from spack.directives import conflicts, depends_on, variant
from spack.package_base import InstallError, PackageBase, run_after
from spack.package import InstallError, PackageBase, run_after
from spack.util.path import convert_to_posix_path
# Regex to extract the primary generator from the CMake generator

View File

@@ -6,7 +6,7 @@
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.multimethod import when
from spack.package_base import PackageBase
from spack.package import PackageBase
class CudaPackage(PackageBase):
@@ -37,7 +37,6 @@ class CudaPackage(PackageBase):
variant('cuda_arch',
description='CUDA architecture',
values=spack.variant.any_combination_of(*cuda_arch_values),
sticky=True,
when='+cuda')
# https://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#nvcc-examples

View File

@@ -3,16 +3,14 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Optional
import spack.package_base
import spack.package
import spack.util.url
class GNUMirrorPackage(spack.package_base.PackageBase):
class GNUMirrorPackage(spack.package.PackageBase):
"""Mixin that takes care of setting url and mirrors for GNU packages."""
#: Path of the package in a GNU mirror
gnu_mirror_path = None # type: Optional[str]
gnu_mirror_path = None
#: List of GNU mirrors used by Spack
base_mirrors = [

View File

@@ -26,7 +26,7 @@
import spack.error
from spack.build_environment import dso_suffix
from spack.package_base import InstallError, PackageBase, run_after
from spack.package import InstallError, PackageBase, run_after
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
from spack.util.prefix import Prefix
@@ -1115,7 +1115,7 @@ def _setup_dependent_env_callback(
raise InstallError('compilers_of_client arg required for MPI')
def setup_dependent_package(self, module, dep_spec):
# https://spack.readthedocs.io/en/latest/spack.html#spack.package_base.PackageBase.setup_dependent_package
# https://spack.readthedocs.io/en/latest/spack.html#spack.package.PackageBase.setup_dependent_package
# Reminder: "module" refers to Python module.
# Called before the install() method of dependents.
@@ -1259,14 +1259,6 @@ def install(self, spec, prefix):
for f in glob.glob('%s/intel*log' % tmpdir):
install(f, dst)
@run_after('install')
def validate_install(self):
# Sometimes the installer exits with an error but doesn't pass a
# non-zero exit code to spack. Check for the existence of a 'bin'
# directory to catch this error condition.
if not os.path.exists(self.prefix.bin):
raise InstallError('The installer has failed to install anything.')
@run_after('install')
def configure_rpath(self):
if '+rpath' not in self.spec:

View File

@@ -10,7 +10,7 @@
from spack.directives import depends_on, extends
from spack.multimethod import when
from spack.package_base import PackageBase
from spack.package import PackageBase
from spack.util.executable import Executable

View File

@@ -11,7 +11,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import conflicts
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
class MakefilePackage(PackageBase):

View File

@@ -7,7 +7,7 @@
from llnl.util.filesystem import install_tree, working_dir
from spack.directives import depends_on
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
from spack.util.executable import which

View File

@@ -11,7 +11,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on, variant
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
class MesonPackage(PackageBase):

View File

@@ -6,7 +6,7 @@
import inspect
from spack.directives import extends
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
class OctavePackage(PackageBase):

View File

@@ -14,7 +14,7 @@
from llnl.util.filesystem import find_headers, find_libraries, join_path
from spack.package_base import Package
from spack.package import Package
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
@@ -45,16 +45,18 @@ def component_dir(self):
raise NotImplementedError
@property
def component_prefix(self):
def component_path(self):
"""Path to component <prefix>/<component>/<version>."""
return self.prefix.join(join_path(self.component_dir, self.spec.version))
return join_path(self.prefix, self.component_dir, str(self.spec.version))
def install(self, spec, prefix):
self.install_component(basename(self.url_for_version(spec.version)))
def install_component(self, installer_path):
def install(self, spec, prefix, installer_path=None):
"""Shared install method for all oneapi packages."""
# intel-oneapi-compilers overrides the installer_path when
# installing fortran, which comes from a spack resource
if installer_path is None:
installer_path = basename(self.url_for_version(spec.version))
if platform.system() == 'Linux':
# Intel installer assumes and enforces that all components
# are installed into a single prefix. Spack wants to
@@ -75,7 +77,7 @@ def install_component(self, installer_path):
bash = Executable('bash')
# Installer writes files in ~/intel set HOME so it goes to prefix
bash.add_default_env('HOME', self.prefix)
bash.add_default_env('HOME', prefix)
# Installer checks $XDG_RUNTIME_DIR/.bootstrapper_lock_file as well
bash.add_default_env('XDG_RUNTIME_DIR',
join_path(self.stage.path, 'runtime'))
@@ -83,13 +85,13 @@ def install_component(self, installer_path):
bash(installer_path,
'-s', '-a', '-s', '--action', 'install',
'--eula', 'accept',
'--install-dir', self.prefix)
'--install-dir', prefix)
if getpass.getuser() == 'root':
shutil.rmtree('/var/intel/installercache', ignore_errors=True)
# Some installers have a bug and do not return an error code when failing
if not isdir(join_path(self.prefix, self.component_dir)):
if not isdir(join_path(prefix, self.component_dir)):
raise RuntimeError('install failed')
def setup_run_environment(self, env):
@@ -102,7 +104,7 @@ def setup_run_environment(self, env):
$ source {prefix}/{component}/{version}/env/vars.sh
"""
env.extend(EnvironmentModifications.from_sourcing_file(
join_path(self.component_prefix, 'env', 'vars.sh')))
join_path(self.component_path, 'env', 'vars.sh')))
class IntelOneApiLibraryPackage(IntelOneApiPackage):
@@ -116,12 +118,12 @@ class IntelOneApiLibraryPackage(IntelOneApiPackage):
@property
def headers(self):
include_path = join_path(self.component_prefix, 'include')
include_path = join_path(self.component_path, 'include')
return find_headers('*', include_path, recursive=True)
@property
def libs(self):
lib_path = join_path(self.component_prefix, 'lib', 'intel64')
lib_path = join_path(self.component_path, 'lib', 'intel64')
lib_path = lib_path if isdir(lib_path) else dirname(lib_path)
return find_libraries('*', root=lib_path, shared=True, recursive=True)
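As an illustration (hypothetical component and version):

# component_dir = 'mkl', spec.version = '2022.1.0'
# component_path -> <prefix>/mkl/2022.1.0
# headers are searched under <prefix>/mkl/2022.1.0/include
# libs are searched under <prefix>/mkl/2022.1.0/lib/intel64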

View File

@@ -10,7 +10,7 @@
from llnl.util.filesystem import filter_file
from spack.directives import extends
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
from spack.util.executable import Executable

View File

@@ -6,30 +6,26 @@
import os
import re
import shutil
from typing import Optional
import llnl.util.tty as tty
from llnl.util.filesystem import (
filter_file,
find,
find_all_headers,
find_libraries,
is_nonsymlink_exe_with_shebang,
path_contains_subdirectory,
same_path,
working_dir,
)
from llnl.util.lang import classproperty, match_predicate
from llnl.util.lang import match_predicate
from spack.directives import depends_on, extends
from spack.error import NoHeadersError, NoLibrariesError
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
class PythonPackage(PackageBase):
"""Specialized class for packages that are built using pip."""
#: Package name, version, and extension on PyPI
pypi = None # type: Optional[str]
pypi = None
maintainers = ['adamjstewart']
@@ -50,7 +46,7 @@ class PythonPackage(PackageBase):
# package manually
depends_on('py-wheel', type='build')
py_namespace = None # type: Optional[str]
py_namespace = None
@staticmethod
def _std_args(cls):
@@ -77,21 +73,24 @@ def _std_args(cls):
'--no-index',
]
@classproperty
def homepage(cls):
if cls.pypi:
name = cls.pypi.split('/')[0]
@property
def homepage(self):
if self.pypi:
name = self.pypi.split('/')[0]
return 'https://pypi.org/project/' + name + '/'
@classproperty
def url(cls):
if cls.pypi:
return 'https://files.pythonhosted.org/packages/source/' + cls.pypi[0] + '/' + cls.pypi
@property
def url(self):
if self.pypi:
return (
'https://files.pythonhosted.org/packages/source/'
+ self.pypi[0] + '/' + self.pypi
)
@classproperty
def list_url(cls):
if cls.pypi:
name = cls.pypi.split('/')[0]
@property
def list_url(self):
if self.pypi:
name = self.pypi.split('/')[0]
return 'https://pypi.org/simple/' + name + '/'
@property
@@ -178,37 +177,6 @@ def install(self, spec, prefix):
with working_dir(self.build_directory):
pip(*args)
@property
def headers(self):
"""Discover header files in platlib."""
# Headers may be in either location
include = inspect.getmodule(self).include
platlib = inspect.getmodule(self).platlib
headers = find_all_headers(include) + find_all_headers(platlib)
if headers:
return headers
msg = 'Unable to locate {} headers in {} or {}'
raise NoHeadersError(msg.format(self.spec.name, include, platlib))
@property
def libs(self):
"""Discover libraries in platlib."""
# Remove py- prefix in package name
library = 'lib' + self.spec.name[3:].replace('-', '?')
root = inspect.getmodule(self).platlib
for shared in [True, False]:
libs = find_libraries(library, root, shared=shared, recursive=True)
if libs:
return libs
msg = 'Unable to recursively locate {} libraries in {}'
raise NoLibrariesError(msg.format(self.spec.name, root))
# Testing
def test(self):

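The ``pypi`` attribute drives all three derived URLs; for a hypothetical
package:

# class PyExample(PythonPackage):
#     pypi = 'example/example-1.0.tar.gz'
#
# homepage -> https://pypi.org/project/example/
# url      -> https://files.pythonhosted.org/packages/source/e/example/example-1.0.tar.gz
# list_url -> https://pypi.org/simple/example/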
View File

@@ -9,7 +9,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
class QMakePackage(PackageBase):

View File

@@ -2,13 +2,12 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from typing import Optional
import llnl.util.lang as lang
import inspect
from spack.directives import extends
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
class RPackage(PackageBase):
@@ -29,10 +28,10 @@ class RPackage(PackageBase):
# package attributes that can be expanded to set the homepage, url,
# list_url, and git values
# For CRAN packages
cran = None # type: Optional[str]
cran = None
# For Bioconductor packages
bioc = None # type: Optional[str]
bioc = None
maintainers = ['glennpj']
@@ -42,27 +41,27 @@ class RPackage(PackageBase):
extends('r')
@lang.classproperty
def homepage(cls):
if cls.cran:
return 'https://cloud.r-project.org/package=' + cls.cran
elif cls.bioc:
return 'https://bioconductor.org/packages/' + cls.bioc
@property
def homepage(self):
if self.cran:
return 'https://cloud.r-project.org/package=' + self.cran
elif self.bioc:
return 'https://bioconductor.org/packages/' + self.bioc
@lang.classproperty
def url(cls):
if cls.cran:
@property
def url(self):
if self.cran:
return (
'https://cloud.r-project.org/src/contrib/'
+ cls.cran + '_' + str(list(cls.versions)[0]) + '.tar.gz'
+ self.cran + '_' + str(list(self.versions)[0]) + '.tar.gz'
)
@lang.classproperty
def list_url(cls):
if cls.cran:
@property
def list_url(self):
if self.cran:
return (
'https://cloud.r-project.org/src/contrib/Archive/'
+ cls.cran + '/'
+ self.cran + '/'
)
@property

View File

@@ -3,15 +3,13 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from typing import Optional
import llnl.util.lang as lang
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
from spack.build_environment import SPACK_NO_PARALLEL_MAKE, determine_number_of_jobs
from spack.directives import extends
from spack.package_base import PackageBase
from spack.package import PackageBase
from spack.util.environment import env_flag
from spack.util.executable import Executable, ProcessError
@@ -38,14 +36,14 @@ class RacketPackage(PackageBase):
extends('racket')
pkgs = False
subdirectory = None # type: Optional[str]
name = None # type: Optional[str]
subdirectory = None
name = None
parallel = True
@lang.classproperty
def homepage(cls):
if cls.pkgs:
return 'https://pkgs.racket-lang.org/package/{0}'.format(cls.name)
@property
def homepage(self):
if self.pkgs:
return 'https://pkgs.racket-lang.org/package/{0}'.format(self.name)
@property
def build_directory(self):

View File

@@ -77,7 +77,7 @@
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.package_base import PackageBase
from spack.package import PackageBase
class ROCmPackage(PackageBase):
@@ -90,10 +90,9 @@ class ROCmPackage(PackageBase):
# https://llvm.org/docs/AMDGPUUsage.html
# Possible architectures
amdgpu_targets = (
'gfx701', 'gfx801', 'gfx802', 'gfx803', 'gfx900', 'gfx900:xnack-',
'gfx906', 'gfx908', 'gfx90a',
'gfx906:xnack-', 'gfx908:xnack-', 'gfx90a:xnack-', 'gfx90a:xnack+',
'gfx1010', 'gfx1011', 'gfx1012', 'gfx1030', 'gfx1031',
'gfx701', 'gfx801', 'gfx802', 'gfx803',
'gfx900', 'gfx906', 'gfx908', 'gfx90a', 'gfx1010',
'gfx1011', 'gfx1012'
)
variant('rocm', default=False, description='Enable ROCm support')

View File

@@ -7,7 +7,7 @@
import inspect
from spack.directives import extends
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
class RubyPackage(PackageBase):

View File

@@ -7,7 +7,7 @@
import inspect
from spack.directives import depends_on
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
class SConsPackage(PackageBase):

View File

@@ -11,7 +11,7 @@
from llnl.util.filesystem import find, join_path, working_dir
from spack.directives import depends_on, extends
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
class SIPPackage(PackageBase):

View File

@@ -3,17 +3,15 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Optional
import spack.package_base
import spack.package
import spack.util.url
class SourceforgePackage(spack.package_base.PackageBase):
class SourceforgePackage(spack.package.PackageBase):
"""Mixin that takes care of setting url and mirrors for Sourceforge
packages."""
#: Path of the package in a Sourceforge mirror
sourceforge_mirror_path = None # type: Optional[str]
sourceforge_mirror_path = None
#: List of Sourceforge mirrors used by Spack
base_mirrors = [

View File

@@ -2,17 +2,16 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Optional
import spack.package_base
import spack.package
import spack.util.url
class SourcewarePackage(spack.package_base.PackageBase):
class SourcewarePackage(spack.package.PackageBase):
"""Mixin that takes care of setting url and mirrors for Sourceware.org
packages."""
#: Path of the package in a Sourceware mirror
sourceware_mirror_path = None # type: Optional[str]
sourceware_mirror_path = None
#: List of Sourceware mirrors used by Spack
base_mirrors = [

View File

@@ -9,7 +9,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on
from spack.package_base import PackageBase, run_after
from spack.package import PackageBase, run_after
class WafPackage(PackageBase):

View File

@@ -3,17 +3,15 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Optional
import spack.package_base
import spack.package
import spack.util.url
class XorgPackage(spack.package_base.PackageBase):
class XorgPackage(spack.package.PackageBase):
"""Mixin that takes care of setting url and mirrors for x.org
packages."""
#: Path of the package in a x.org mirror
xorg_mirror_path = None # type: Optional[str]
xorg_mirror_path = None
#: List of x.org mirrors used by Spack
# Note: x.org mirrors are a bit tricky, since many are out-of-sync or off.

View File

@@ -771,13 +771,9 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
mirrors_to_check = {
'override': remote_mirror_override
}
# If we have a remote override and we want to generate the pipeline using
# --check-index-only, then the override mirror needs to be added to
# the configured mirrors when bindist.update() is run, or else we
# won't fetch its index and include in our local cache.
spack.mirror.add(
'ci_pr_mirror', remote_mirror_override, cfg.default_modify_scope())
else:
spack.mirror.add(
'ci_pr_mirror', remote_mirror_override, cfg.default_modify_scope())
pipeline_artifacts_dir = artifacts_root
if not pipeline_artifacts_dir:
@@ -823,7 +819,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
user_artifacts_dir, ci_project_dir)
# Speed up staging by first fetching binary indices from all mirrors
# (including the override mirror we may have just added above).
# (including the per-PR mirror we may have just added above).
try:
bindist.binary_index.update()
except bindist.FetchCacheError as e:
@@ -857,7 +853,8 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
finally:
# Clean up remote mirror override if enabled
if remote_mirror_override:
spack.mirror.remove('ci_pr_mirror', cfg.default_modify_scope())
if spack_pipeline_type != 'spack_protected_branch':
spack.mirror.remove('ci_pr_mirror', cfg.default_modify_scope())
all_job_names = []
output_object = {}
@@ -1628,9 +1625,8 @@ def copy_stage_logs_to_artifacts(job_spec, job_log_dir):
job_log_dir (str): Path into which build log should be copied
"""
try:
pkg_cls = spack.repo.path.get_pkg_class(job_spec.name)
job_pkg = pkg_cls(job_spec)
tty.debug('job package: {0.fullname}'.format(job_pkg))
job_pkg = spack.repo.get(job_spec)
tty.debug('job package: {0}'.format(job_pkg))
stage_dir = job_pkg.stage.path
tty.debug('stage dir: {0}'.format(stage_dir))
build_out_src = os.path.join(stage_dir, 'spack-build-out.txt')

View File

@@ -0,0 +1,116 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import sys
import llnl.util.tty as tty
import spack.analyzers
import spack.build_environment
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.fetch_strategy
import spack.monitor
import spack.paths
import spack.report
description = "run analyzers on installed packages"
section = "analysis"
level = "long"
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar='SUBCOMMAND', dest='analyze_command')
sp.add_parser('list-analyzers',
description="list available analyzers",
help="show list of analyzers that are available to run.")
# This adds the monitor group to the subparser
spack.monitor.get_monitor_group(subparser)
# Run Parser
run_parser = sp.add_parser('run', description="run an analyzer",
help="provide the name of the analyzer to run.")
run_parser.add_argument(
'--overwrite', action='store_true',
help="re-analyze even if the output file already exists.")
run_parser.add_argument(
'-p', '--path', default=None,
dest='path',
help="write output to a different directory than ~/.spack/analyzers")
run_parser.add_argument(
'-a', '--analyzers', default=None,
dest="analyzers", action="append",
help="add an analyzer (defaults to all available)")
arguments.add_common_arguments(run_parser, ['spec'])
def analyze_spec(spec, analyzers=None, outdir=None, monitor=None, overwrite=False):
"""
Do an analysis for a spec, optionally adding monitoring.
We also allow the user to specify a custom output directory.
For example: analyze_spec(spec, args.analyzers, args.outdir, monitor)
Args:
spec (spack.spec.Spec): spec object of installed package
analyzers (list): list of analyzer (keys) to run
monitor (spack.monitor.SpackMonitorClient): a monitor client
overwrite (bool): overwrite result if already exists
"""
analyzers = analyzers or list(spack.analyzers.analyzer_types.keys())
# Load the build environment from the spec install directory, and send
# the spec to the monitor if it's not known
if monitor:
monitor.load_build_environment(spec)
monitor.new_configuration([spec])
for name in analyzers:
# Instantiate the analyzer with the spec and outdir
analyzer = spack.analyzers.get_analyzer(name)(spec, outdir)
# Run the analyzer to get a json result - results are returned as
# a dictionary with a key corresponding to the analyzer type, so
# we can just update the data
result = analyzer.run()
# Send the result. We do them separately because:
# 1. each analyzer might have differently organized output
# 2. the size of a result can be large
analyzer.save_result(result, overwrite)
def analyze(parser, args, **kwargs):
# If the user wants to list analyzers, do so and exit
if args.analyze_command == "list-analyzers":
spack.analyzers.list_all()
sys.exit(0)
# handle active environment, if any
env = ev.active_environment()
# Get a disambiguated spec (we should only have one)
specs = spack.cmd.parse_specs(args.spec)
if not specs:
tty.die("You must provide one or more specs to analyze.")
spec = spack.cmd.disambiguate_spec(specs[0], env)
# The user wants to monitor builds using github.com/spack/spack-monitor
# It is instantiated once here, and then available at spack.monitor.cli
monitor = None
if args.use_monitor:
monitor = spack.monitor.get_client(
host=args.monitor_host,
prefix=args.monitor_prefix,
)
# Run the analysis
analyze_spec(spec, args.analyzers, args.path, monitor, args.overwrite)
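Putting it together, a hypothetical invocation maps to these calls:

# $ spack analyze run -a config_args foo
# -> analyze_spec(<concrete foo spec>, ['config_args'], None, None, False)
# -> ConfigArgs(<concrete foo spec>, None).run() and .save_result(result, False)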

View File

@@ -99,8 +99,8 @@ def blame(parser, args):
blame_file = path
if not blame_file:
pkg_cls = spack.repo.path.get_pkg_class(args.package_or_file)
blame_file = pkg_cls.module.__file__.rstrip('c') # .pyc -> .py
pkg = spack.repo.get(args.package_or_file)
blame_file = pkg.module.__file__.rstrip('c') # .pyc -> .py
# get git blame for the package
with working_dir(spack.paths.prefix):

View File

@@ -12,12 +12,11 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.repo
import spack.spec
import spack.stage
import spack.util.crypto
from spack.package_base import preferred_version
from spack.package import preferred_version
from spack.util.naming import valid_fully_qualified_module_name
from spack.version import VersionBase, ver
from spack.version import Version, ver
description = "checksum available versions of a package"
section = "packaging"
@@ -55,8 +54,7 @@ def checksum(parser, args):
tty.die("`spack checksum` accepts package names, not URLs.")
# Get the package we're going to generate checksums for
pkg_cls = spack.repo.path.get_pkg_class(args.package)
pkg = pkg_cls(spack.spec.Spec(args.package))
pkg = spack.repo.get(args.package)
url_dict = {}
versions = args.versions
@@ -67,7 +65,7 @@ def checksum(parser, args):
remote_versions = None
for version in versions:
version = ver(version)
if not isinstance(version, VersionBase):
if not isinstance(version, Version):
tty.die("Cannot generate checksums for version lists or "
"version ranges. Use unambiguous versions.")
url = pkg.find_valid_url_for_version(version)

View File

@@ -58,21 +58,6 @@ def setup_parser(subparser):
arguments.add_common_arguments(subparser, ['specs'])
def remove_python_cache():
for directory in [lib_path, var_path]:
for root, dirs, files in os.walk(directory):
for f in files:
if f.endswith('.pyc') or f.endswith('.pyo'):
fname = os.path.join(root, f)
tty.debug('Removing {0}'.format(fname))
os.remove(fname)
for d in dirs:
if d == '__pycache__':
dname = os.path.join(root, d)
tty.debug('Removing {0}'.format(dname))
shutil.rmtree(dname)
def clean(parser, args):
# If nothing was set, activate the default
if not any([args.specs, args.stage, args.downloads, args.failures,
@@ -85,7 +70,8 @@ def clean(parser, args):
for spec in specs:
msg = 'Cleaning build stage [{0}]'
tty.msg(msg.format(spec.short_spec))
spec.package.do_clean()
package = spack.repo.get(spec)
package.do_clean()
if args.stage:
tty.msg('Removing all temporary build stages')
@@ -109,7 +95,18 @@ def clean(parser, args):
if args.python_cache:
tty.msg('Removing python cache files')
remove_python_cache()
for directory in [lib_path, var_path]:
for root, dirs, files in os.walk(directory):
for f in files:
if f.endswith('.pyc') or f.endswith('.pyo'):
fname = os.path.join(root, f)
tty.debug('Removing {0}'.format(fname))
os.remove(fname)
for d in dirs:
if d == '__pycache__':
dname = os.path.join(root, d)
tty.debug('Removing {0}'.format(dname))
shutil.rmtree(dname)
if args.bootstrap:
bootstrap_prefix = spack.util.path.canonicalize_path(

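The remove_python_cache() helper that moves in and out of clean() here is plain stdlib: walk the tree, delete *.pyc/*.pyo files and __pycache__ directories. A self-contained sketch of the same logic:

    import os
    import shutil

    def remove_python_cache(directories):
        # Delete compiled bytecode files and __pycache__ directories.
        for directory in directories:
            for root, dirs, files in os.walk(directory):
                for f in files:
                    if f.endswith(('.pyc', '.pyo')):
                        os.remove(os.path.join(root, f))
                for d in dirs:
                    if d == '__pycache__':
                        shutil.rmtree(os.path.join(root, d))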
View File

@@ -403,4 +403,4 @@ def add_s3_connection_args(subparser, add_help):
default=None)
subparser.add_argument(
'--s3-endpoint-url',
help="Endpoint URL to use to connect to this S3 mirror")
help="Access Token to use to connect to this S3 mirror")

View File

@@ -9,6 +9,7 @@
import spack.container
import spack.container.images
import spack.monitor
description = ("creates recipes to build images for different"
" container runtimes")
@@ -17,6 +18,7 @@
def setup_parser(subparser):
monitor_group = spack.monitor.get_monitor_group(subparser) # noqa
subparser.add_argument(
'--list-os', action='store_true', default=False,
help='list all the OS that can be used in the bootstrap phase and exit'
@@ -44,5 +46,14 @@ def containerize(parser, args):
raise ValueError(msg.format(config_file))
config = spack.container.validate(config_file)
# If we have a monitor request, add monitor metadata to config
if args.use_monitor:
config['spack']['monitor'] = {
"host": args.monitor_host,
"keep_going": args.monitor_keep_going,
"prefix": args.monitor_prefix,
"tags": args.monitor_tags
}
recipe = spack.container.recipe(config, last_phase=args.last_stage)
print(recipe)

View File

@@ -57,7 +57,7 @@
# See the Spack documentation for more information on packaging.
# ----------------------------------------------------------------------------
from spack.package import *
from spack import *
class {class_name}({base_class_name}):
@@ -826,7 +826,7 @@ def get_versions(args, name):
spack.util.url.require_url_format(args.url)
if args.url.startswith('file://'):
valid_url = False # No point in spidering these
except (ValueError, TypeError):
except ValueError:
valid_url = False
if args.url is not None and args.template != 'bundle' and valid_url:

View File

@@ -11,7 +11,7 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.package_base
import spack.package
import spack.repo
import spack.store
@@ -57,7 +57,7 @@ def dependencies(parser, args):
else:
spec = specs[0]
dependencies = spack.package_base.possible_dependencies(
dependencies = spack.package.possible_dependencies(
spec,
transitive=args.transitive,
expand_virtuals=args.expand_virtuals,

View File

@@ -39,9 +39,9 @@ def inverted_dependencies():
actual dependents.
"""
dag = {}
for pkg_cls in spack.repo.path.all_package_classes():
dag.setdefault(pkg_cls.name, set())
for dep in pkg_cls.dependencies:
for pkg in spack.repo.path.all_packages():
dag.setdefault(pkg.name, set())
for dep in pkg.dependencies:
deps = [dep]
# expand virtuals if necessary
@@ -49,7 +49,7 @@ def inverted_dependencies():
deps += [s.name for s in spack.repo.path.providers_for(dep)]
for d in deps:
dag.setdefault(d, set()).add(pkg_cls.name)
dag.setdefault(d, set()).add(pkg.name)
return dag

View File

@@ -87,7 +87,9 @@ def dev_build(self, args):
# Forces the build to run out of the source directory.
spec.constrain('dev_path=%s' % source_path)
spec.concretize()
package = spack.repo.get(spec)
if spec.installed:
tty.error("Already installed in %s" % spec.prefix)
@@ -107,7 +109,7 @@ def dev_build(self, args):
elif args.test == 'root':
tests = [spec.name for spec in specs]
spec.package.do_install(
package.do_install(
tests=tests,
make_jobs=args.jobs,
keep_prefix=args.keep_prefix,
@@ -120,5 +122,5 @@ def dev_build(self, args):
# drop into the build environment of the package?
if args.shell is not None:
spack.build_environment.setup_package(spec.package, dirty=False)
spack.build_environment.setup_package(package, dirty=False)
os.execvp(args.shell, [args.shell])

View File

@@ -54,9 +54,8 @@ def develop(parser, args):
tty.msg(msg)
continue
spec = spack.spec.Spec(entry['spec'])
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
pkg_cls(spec).stage.steal_source(abspath)
stage = spack.spec.Spec(entry['spec']).package.stage
stage.steal_source(abspath)
if not env.dev_specs:
tty.warn("No develop specs to download")

View File

@@ -104,9 +104,9 @@ def edit(parser, args):
path = os.path.join(path, name)
if not os.path.exists(path):
files = glob.glob(path + '*')
exclude_list = ['.pyc', '~'] # exclude binaries and backups
blacklist = ['.pyc', '~'] # blacklist binaries and backups
files = list(filter(
lambda x: all(s not in x for s in exclude_list), files))
lambda x: all(s not in x for s in blacklist), files))
if len(files) > 1:
m = 'Multiple files exist with the name {0}.'.format(name)
m += ' Please specify a suffix. Files are:\n\n'

View File

@@ -559,11 +559,11 @@ def env_depfile(args):
target_prefix = args.make_target_prefix
def get_target(name):
# The `all` and `clean` targets are phony. It doesn't make sense to
# The `all`, `fetch` and `clean` targets are phony. It doesn't make sense to
# have /abs/path/to/env/metadir/{all,clean} targets. But it *does* make
# sense to have a prefix like `env/all`, `env/clean` when they are
# sense to have a prefix like `env/all`, `env/fetch`, `env/clean` when they are
# supposed to be included
if name in ('all', 'clean') and os.path.isabs(target_prefix):
if name in ('all', 'fetch-all', 'clean') and os.path.isabs(target_prefix):
return name
else:
return os.path.join(target_prefix, name)
@@ -571,6 +571,9 @@ def get_target(name):
def get_install_target(name):
return os.path.join(target_prefix, '.install', name)
def get_fetch_target(name):
return os.path.join(target_prefix, '.fetch', name)
for _, spec in env.concretized_specs():
for s in spec.traverse(root=True):
hash_to_spec[s.dag_hash()] = s
@@ -585,30 +588,46 @@ def get_install_target(name):
# All package install targets, not just roots.
all_install_targets = [get_install_target(h) for h in hash_to_spec.keys()]
# Fetch targets for all packages in the environment, not just roots.
all_fetch_targets = [get_fetch_target(h) for h in hash_to_spec.keys()]
buf = six.StringIO()
buf.write("""SPACK ?= spack
.PHONY: {} {}
.PHONY: {} {} {}
{}: {}
{}: {}
{}: {}
\t@touch $@
{}: {}
\t@touch $@
{}:
\t@mkdir -p {}
\t@mkdir -p {} {}
{}: | {}
\t$(info Fetching $(SPEC))
\t$(SPACK) -e '{}' fetch $(SPACK_FETCH_FLAGS) /$(notdir $@) && touch $@
{}: {}
\t$(info Installing $(SPEC))
\t{}$(SPACK) -e '{}' install $(SPACK_INSTALL_FLAGS) --only-concrete --only=package \
--no-add /$(notdir $@) && touch $@
""".format(get_target('all'), get_target('clean'),
""".format(get_target('all'), get_target('fetch-all'), get_target('clean'),
get_target('all'), get_target('env'),
get_target('fetch-all'), get_target('fetch'),
get_target('env'), ' '.join(root_install_targets),
get_target('dirs'), get_target('.install'),
get_target('.install/%'), get_target('dirs'),
get_target('fetch'), ' '.join(all_fetch_targets),
get_target('dirs'), get_target('.fetch'), get_target('.install'),
get_target('.fetch/%'), get_target('dirs'),
env.path,
get_target('.install/%'), get_target('.fetch/%'),
'+' if args.jobserver else '', env.path))
# Targets are of the form <prefix>/<name>: [<prefix>/<depname>]...,
@@ -638,9 +657,11 @@ def get_install_target(name):
# --make-target-prefix can be any existing directory we do not control,
# including empty string (which means deleting the containing folder
# would delete the folder with the Makefile)
buf.write("{}:\n\trm -f -- {} {}\n".format(
buf.write("{}:\n\trm -f -- {} {} {} {}\n".format(
get_target('clean'),
get_target('env'),
get_target('fetch'),
' '.join(all_fetch_targets),
' '.join(all_install_targets)))
makefile = buf.getvalue()

View File

@@ -52,8 +52,8 @@ def extensions(parser, args):
extendable_pkgs = []
for name in spack.repo.all_package_names():
pkg_cls = spack.repo.path.get_pkg_class(name)
if pkg_cls.extendable:
pkg = spack.repo.get(name)
if pkg.extendable:
extendable_pkgs.append(name)
colify(extendable_pkgs, indent=4)
@@ -64,12 +64,12 @@ def extensions(parser, args):
if len(spec) > 1:
tty.die("Can only list extensions for one package.")
if not spec[0].package.extendable:
tty.die("%s is not an extendable package." % spec[0].name)
env = ev.active_environment()
spec = cmd.disambiguate_spec(spec[0], env)
if not spec.package.extendable:
tty.die("%s is not an extendable package." % spec[0].name)
if not spec.package.extendable:
tty.die("%s does not have extensions." % spec.short_spec)

View File

@@ -119,37 +119,34 @@ def external_find(args):
args.tags = []
# Construct the list of possible packages to be detected
pkg_cls_to_check = []
packages_to_check = []
# Add the packages that have been required explicitly
if args.packages:
pkg_cls_to_check = [
spack.repo.path.get_pkg_class(pkg) for pkg in args.packages
]
packages_to_check = list(spack.repo.get(pkg) for pkg in args.packages)
if args.tags:
allowed = set(spack.repo.path.packages_with_tags(*args.tags))
pkg_cls_to_check = [x for x in pkg_cls_to_check if x.name in allowed]
packages_to_check = [x for x in packages_to_check if x in allowed]
if args.tags and not pkg_cls_to_check:
if args.tags and not packages_to_check:
# If we arrived here we didn't have any explicit package passed
# as an argument, which means we should search all packages.
# Since tags are cached it's much faster to construct what we need
# to search directly, rather than filtering after the fact
pkg_cls_to_check = [
spack.repo.path.get_pkg_class(pkg_name)
for tag in args.tags
for pkg_name in spack.repo.path.packages_with_tags(tag)
packages_to_check = [
spack.repo.get(pkg) for tag in args.tags for pkg in
spack.repo.path.packages_with_tags(tag)
]
pkg_cls_to_check = list(set(pkg_cls_to_check))
packages_to_check = list(set(packages_to_check))
# If the list of packages is empty, search for every possible package
if not args.tags and not pkg_cls_to_check:
pkg_cls_to_check = list(spack.repo.path.all_package_classes())
if not args.tags and not packages_to_check:
packages_to_check = list(spack.repo.path.all_packages())
detected_packages = spack.detection.by_executable(
pkg_cls_to_check, path_hints=args.path)
packages_to_check, path_hints=args.path)
detected_packages.update(spack.detection.by_library(
pkg_cls_to_check, path_hints=args.path))
packages_to_check, path_hints=args.path))
new_entries = spack.detection.update_configuration(
detected_packages, scope=args.scope, buildable=not args.not_buildable
@@ -220,10 +217,10 @@ def _collect_and_consume_cray_manifest_files(
def external_list(args):
# Trigger a read of all packages, might take a long time.
list(spack.repo.path.all_package_classes())
list(spack.repo.path.all_packages())
# Print all the detectable packages
tty.msg("Detectable packages per repository")
for namespace, pkgs in sorted(spack.package_base.detectable_packages.items()):
for namespace, pkgs in sorted(spack.package.detectable_packages.items()):
print("Repository:", namespace)
colify.colify(pkgs, indent=4, output=sys.stdout)

View File

@@ -18,7 +18,7 @@
import spack.fetch_strategy as fs
import spack.repo
import spack.spec
from spack.package_base import has_test_method, preferred_version
from spack.package import has_test_method, preferred_version
description = 'get detailed information on a particular package'
section = 'basic'
@@ -269,14 +269,14 @@ def print_tests(pkg):
names = []
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
if has_test_method(pkg_cls):
pkg_base = spack.package_base.PackageBase
pkg_base = spack.package.PackageBase
test_pkgs = [str(cls.test) for cls in inspect.getmro(pkg_cls) if
issubclass(cls, pkg_base) and cls.test != pkg_base.test]
test_pkgs = list(set(test_pkgs))
names.extend([(test.split()[1]).lower() for test in test_pkgs])
# TODO Refactor START
# Use code from package_base.py's test_process IF this functionality is
# Use code from package.py's test_process IF this functionality is
# accepted.
v_names = list(set([vspec.name for vspec in pkg.virtuals_provided]))
@@ -292,9 +292,10 @@ def print_tests(pkg):
v_specs = [spack.spec.Spec(v_name) for v_name in v_names]
for v_spec in v_specs:
try:
pkg_cls = spack.repo.path.get_pkg_class(v_spec.name)
pkg = v_spec.package
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
if has_test_method(pkg_cls):
names.append('{0}.test'.format(pkg_cls.name.lower()))
names.append('{0}.test'.format(pkg.name.lower()))
except spack.repo.UnknownPackageError:
pass
@@ -385,9 +386,7 @@ def print_virtuals(pkg):
def info(parser, args):
spec = spack.spec.Spec(args.package)
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
pkg = pkg_cls(spec)
pkg = spack.repo.get(args.package)
# Output core package information
header = section_title(

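print_tests finds classes that actually override the base test method by walking the MRO. The same override check needs nothing beyond inspect; a self-contained sketch with stand-in classes:

    import inspect

    class Base:
        def test(self):
            pass

    class WithTest(Base):
        def test(self):
            return "custom"

    class WithoutTest(Base):
        pass

    def overriding_classes(pkg_cls, base):
        # Every class in the MRO that redefines base.test.
        return [cls for cls in inspect.getmro(pkg_cls)
                if issubclass(cls, base) and cls.test is not base.test]

    assert overriding_classes(WithTest, Base) == [WithTest]
    assert overriding_classes(WithoutTest, Base) == []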
View File

@@ -17,6 +17,7 @@
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.fetch_strategy
import spack.monitor
import spack.paths
import spack.report
from spack.error import SpackError
@@ -104,6 +105,8 @@ def setup_parser(subparser):
'--cache-only', action='store_true', dest='cache_only', default=False,
help="only install package from binary mirrors")
monitor_group = spack.monitor.get_monitor_group(subparser) # noqa
subparser.add_argument(
'--include-build-deps', action='store_true', dest='include_build_deps',
default=False, help="""include build deps when installing from cache,
@@ -289,8 +292,17 @@ def install(parser, args, **kwargs):
parser.print_help()
return
# The user wants to monitor builds using github.com/spack/spack-monitor
if args.use_monitor:
monitor = spack.monitor.get_client(
host=args.monitor_host,
prefix=args.monitor_prefix,
tags=args.monitor_tags,
save_local=args.monitor_save_local,
)
reporter = spack.report.collect_info(
spack.package_base.PackageInstaller, '_install_task', args.log_format, args)
spack.package.PackageInstaller, '_install_task', args.log_format, args)
if args.log_file:
reporter.filename = args.log_file
@@ -329,6 +341,10 @@ def get_tests(specs):
reporter.filename = default_log_file(specs[0])
reporter.specs = specs
# Tell the monitor about the specs
if args.use_monitor and specs:
monitor.new_configuration(specs)
tty.msg("Installing environment {0}".format(env.name))
with reporter('build'):
env.install_all(**kwargs)
@@ -374,6 +390,10 @@ def get_tests(specs):
except SpackError as e:
tty.debug(e)
reporter.concretization_report(e.message)
# Tell spack monitor about it
if args.use_monitor and abstract_specs:
monitor.failed_concretization(abstract_specs)
raise
# 2. Concrete specs from yaml files
@@ -434,4 +454,17 @@ def get_tests(specs):
# overwrite all concrete explicit specs from this build
kwargs['overwrite'] = [spec.dag_hash() for spec in specs]
# Update install_args with the monitor args, needed for build task
kwargs.update({
"monitor_keep_going": args.monitor_keep_going,
"monitor_host": args.monitor_host,
"use_monitor": args.use_monitor,
"monitor_prefix": args.monitor_prefix,
})
# If we are using the monitor, we send configs. and create build
# The dag_hash is the main package id
if args.use_monitor and specs:
monitor.new_configuration(specs)
install_specs(args, kwargs, zip(abstract_specs, specs))
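
The monitor wiring above follows one sequence: create a client once, register the concrete specs as a new configuration before installing, and report abstract specs on a failed concretization. The sketch below imitates that call order with a hypothetical stand-in class (the endpoint paths are invented, not spack-monitor's real routes):

    class MonitorClient:
        # Hypothetical stand-in for the spack.monitor client.
        def __init__(self, host, prefix):
            self.host, self.prefix = host, prefix

        def new_configuration(self, specs):
            print("POST %s%s/configs/new (%d specs)" % (self.host, self.prefix, len(specs)))

        def failed_concretization(self, specs):
            print("POST %s%s/concretizations/failed (%d specs)" % (self.host, self.prefix, len(specs)))

    monitor = MonitorClient("http://127.0.0.1", "/ms1")
    monitor.new_configuration(["hdf5", "zlib"])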

View File

@@ -84,9 +84,9 @@ def match(p, f):
if f.match(p):
return True
pkg_cls = spack.repo.path.get_pkg_class(p)
if pkg_cls.__doc__:
return f.match(pkg_cls.__doc__)
pkg = spack.repo.get(p)
if pkg.__doc__:
return f.match(pkg.__doc__)
return False
else:
def match(p, f):
@@ -133,7 +133,7 @@ def get_dependencies(pkg):
@formatter
def version_json(pkg_names, out):
"""Print all packages with their latest versions."""
pkg_classes = [spack.repo.path.get_pkg_class(name) for name in pkg_names]
pkgs = [spack.repo.get(name) for name in pkg_names]
out.write('[\n')
@@ -147,14 +147,14 @@ def version_json(pkg_names, out):
' "maintainers": {5},\n'
' "dependencies": {6}'
'}}'.format(
pkg_cls.name,
VersionList(pkg_cls.versions).preferred(),
json.dumps([str(v) for v in reversed(sorted(pkg_cls.versions))]),
pkg_cls.homepage,
github_url(pkg_cls),
json.dumps(pkg_cls.maintainers),
json.dumps(get_dependencies(pkg_cls))
) for pkg_cls in pkg_classes
pkg.name,
VersionList(pkg.versions).preferred(),
json.dumps([str(v) for v in reversed(sorted(pkg.versions))]),
pkg.homepage,
github_url(pkg),
json.dumps(pkg.maintainers),
json.dumps(get_dependencies(pkg))
) for pkg in pkgs
])
out.write(pkg_latest)
# important: no trailing comma in JSON arrays
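
version_json builds each package entry as a string and joins them precisely because a trailing comma after the last element is invalid JSON. A tiny sketch of the join-don't-append approach:

    import json

    entries = [json.dumps({"name": n}) for n in ("zlib", "hdf5")]

    # Joining with ',\n' guarantees no comma after the final entry.
    doc = '[\n' + ',\n'.join(entries) + '\n]'
    json.loads(doc)  # parses cleanly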
@@ -172,7 +172,7 @@ def html(pkg_names, out):
"""
# Read in all packages
pkg_classes = [spack.repo.path.get_pkg_class(name) for name in pkg_names]
pkgs = [spack.repo.get(name) for name in pkg_names]
# Start at 2 because the title of the page from Sphinx is id1.
span_id = 2
@@ -189,7 +189,7 @@ def head(n, span_id, title, anchor=None):
# Start with the number of packages, skipping the title and intro
# blurb, which we maintain in the RST file.
out.write('<p>\n')
out.write('Spack currently has %d mainline packages:\n' % len(pkg_classes))
out.write('Spack currently has %d mainline packages:\n' % len(pkgs))
out.write('</p>\n')
# Table of links to all packages
@@ -209,9 +209,9 @@ def head(n, span_id, title, anchor=None):
out.write('<hr class="docutils"/>\n')
# Output some text for each package.
for pkg_cls in pkg_classes:
out.write('<div class="section" id="%s">\n' % pkg_cls.name)
head(2, span_id, pkg_cls.name)
for pkg in pkgs:
out.write('<div class="section" id="%s">\n' % pkg.name)
head(2, span_id, pkg.name)
span_id += 1
out.write('<dl class="docutils">\n')
@@ -219,10 +219,10 @@ def head(n, span_id, title, anchor=None):
out.write('<dt>Homepage:</dt>\n')
out.write('<dd><ul class="first last simple">\n')
if pkg_cls.homepage:
if pkg.homepage:
out.write(('<li>'
'<a class="reference external" href="%s">%s</a>'
'</li>\n') % (pkg_cls.homepage, escape(pkg_cls.homepage, True)))
'</li>\n') % (pkg.homepage, escape(pkg.homepage, True)))
else:
out.write('No homepage\n')
out.write('</ul></dd>\n')
@@ -231,19 +231,19 @@ def head(n, span_id, title, anchor=None):
out.write('<dd><ul class="first last simple">\n')
out.write(('<li>'
'<a class="reference external" href="%s">%s/package.py</a>'
'</li>\n') % (github_url(pkg_cls), pkg_cls.name))
'</li>\n') % (github_url(pkg), pkg.name))
out.write('</ul></dd>\n')
if pkg_cls.versions:
if pkg.versions:
out.write('<dt>Versions:</dt>\n')
out.write('<dd>\n')
out.write(', '.join(
str(v) for v in reversed(sorted(pkg_cls.versions))))
str(v) for v in reversed(sorted(pkg.versions))))
out.write('\n')
out.write('</dd>\n')
for deptype in spack.dependency.all_deptypes:
deps = pkg_cls.dependencies_of_type(deptype)
deps = pkg.dependencies_of_type(deptype)
if deps:
out.write('<dt>%s Dependencies:</dt>\n' % deptype.capitalize())
out.write('<dd>\n')
@@ -256,7 +256,7 @@ def head(n, span_id, title, anchor=None):
out.write('<dt>Description:</dt>\n')
out.write('<dd>\n')
out.write(escape(pkg_cls.format_doc(indent=2), True))
out.write(escape(pkg.format_doc(indent=2), True))
out.write('\n')
out.write('</dd>\n')
out.write('</dl>\n')

View File

@@ -12,7 +12,7 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.error
import spack.package_base
import spack.package
import spack.repo
import spack.store
from spack.database import InstallStatuses

View File

@@ -221,7 +221,7 @@ def _read_specs_from_file(filename):
for i, string in enumerate(stream):
try:
s = Spec(string)
spack.repo.path.get_pkg_class(s.name)
s.package
specs.append(s)
except SpackError as e:
tty.debug(e)

View File

@@ -131,7 +131,7 @@ def check_module_set_name(name):
_missing_modules_warning = (
"Modules have been omitted for one or more specs, either"
" because they were excluded or because the spec is"
" because they were blacklisted or because the spec is"
" associated with a package that is installed upstream and"
" that installation has not generated a module file. Rerun"
" this command with debug output enabled for more details.")
@@ -180,7 +180,7 @@ def loads(module_type, specs, args, out=None):
for spec, mod in modules:
if not mod:
module_output_for_spec = (
'## excluded or missing from upstream: {0}'.format(
'## blacklisted or missing from upstream: {0}'.format(
spec.format()))
else:
d['exclude'] = '## ' if spec.name in exclude_set else ''
@@ -293,8 +293,8 @@ def refresh(module_type, specs, args):
cls(spec, args.module_set_name) for spec in specs
if spack.repo.path.exists(spec.name)]
# Filter excluded packages early
writers = [x for x in writers if not x.conf.excluded]
# Filter blacklisted packages early
writers = [x for x in writers if not x.conf.blacklisted]
# Detect name clashes in module files
file2writer = collections.defaultdict(list)
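
After filtering, refresh() groups module writers by the file they would generate; any filename claimed by more than one writer is a name clash. A stdlib sketch of that grouping, with made-up writer/filename pairs:

    import collections

    writers = [("lmod", "/mods/gcc/1.0.lua"), ("lmod", "/mods/gcc/1.0.lua"),
               ("tcl", "/mods/zlib/1.2")]

    file2writer = collections.defaultdict(list)
    for writer, filename in writers:
        file2writer[filename].append(writer)

    clashes = {f: ws for f, ws in file2writer.items() if len(ws) > 1}
    print(clashes)  # {'/mods/gcc/1.0.lua': ['lmod', 'lmod']}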

View File

@@ -0,0 +1,33 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.monitor
description = "interact with a monitor server"
section = "analysis"
level = "long"
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar='SUBCOMMAND', dest='monitor_command')
# This adds the monitor group to the subparser
spack.monitor.get_monitor_group(subparser)
# Spack Monitor Uploads
monitor_parser = sp.add_parser('upload', description="upload to spack monitor")
monitor_parser.add_argument("upload_dir", help="directory root to upload")
def monitor(parser, args, **kwargs):
if args.monitor_command == "upload":
monitor = spack.monitor.get_client(
host=args.monitor_host,
prefix=args.monitor_prefix,
)
# Upload the directory
monitor.upload_local_save(args.upload_dir)

View File

@@ -31,4 +31,5 @@ def patch(parser, args):
specs = spack.cmd.parse_specs(args.specs, concretize=True)
for spec in specs:
spec.package.do_patch()
package = spack.repo.get(spec)
package.do_patch()

View File

@@ -50,7 +50,7 @@ def _show_patch(sha256):
owner = rec['owner']
if 'relative_path' in rec:
pkg_dir = spack.repo.path.get_pkg_class(owner).package_dir
pkg_dir = spack.repo.get(owner).package_dir
path = os.path.join(pkg_dir, rec['relative_path'])
print(" path: %s" % path)
else:

View File

@@ -24,4 +24,5 @@ def restage(parser, args):
specs = spack.cmd.parse_specs(args.specs, concretize=True)
for spec in specs:
spec.package.do_restage()
package = spack.repo.get(spec)
package.do_restage()

View File

@@ -18,7 +18,7 @@
import spack.config
import spack.environment
import spack.hash_types as ht
import spack.package_base
import spack.package
import spack.solver.asp as asp
description = "concretize specs using an ASP solver"

View File

@@ -24,7 +24,6 @@ def setup_parser(subparser):
subparser.add_argument(
'-p', '--path', dest='path',
help="path to stage package, does not add to spack tree")
arguments.add_concretizer_args(subparser)
def stage(parser, args):
@@ -59,7 +58,8 @@ def stage(parser, args):
for spec in specs:
spec = spack.cmd.matching_spec_from_env(spec)
package = spack.repo.get(spec)
if custom_path:
spec.package.path = custom_path
spec.package.do_stage()
tty.msg("Staged {0} in {1}".format(spec.package.name, spec.package.stage.path))
package.path = custom_path
package.do_stage()
tty.msg("Staged {0} in {1}".format(package.name, package.stage.path))

View File

@@ -65,7 +65,7 @@ def is_package(f):
packages, since we allow `from spack import *` and poking globals
into packages.
"""
return f.startswith("var/spack/repos/") and f.endswith('package.py')
return f.startswith("var/spack/repos/")
#: decorator for adding tools to the list
@@ -94,16 +94,16 @@ def changed_files(base="develop", untracked=True, all_files=False, root=None):
git = which("git", required=True)
# ensure base is in the repo
base_sha = git("rev-parse", "--quiet", "--verify", "--revs-only", base,
fail_on_error=False, output=str)
git("show-ref", "--verify", "--quiet", "refs/heads/%s" % base,
fail_on_error=False)
if git.returncode != 0:
tty.die(
"This repository does not have a '%s' revision." % base,
"This repository does not have a '%s' branch." % base,
"spack style needs this branch to determine which files changed.",
"Ensure that '%s' exists, or specify files to check explicitly." % base
)
range = "{0}...".format(base_sha.strip())
range = "{0}...".format(base)
git_args = [
# Add changed files committed since branching off of develop
@@ -236,7 +236,7 @@ def translate(match):
continue
if not args.root_relative and re_obj:
line = re_obj.sub(translate, line)
print(line)
print(" " + line)
def print_style_header(file_list, args):
@@ -290,26 +290,20 @@ def run_flake8(flake8_cmd, file_list, args):
@tool("mypy")
def run_mypy(mypy_cmd, file_list, args):
# always run with config from running spack prefix
common_mypy_args = [
mypy_args = [
"--config-file", os.path.join(spack.paths.prefix, "pyproject.toml"),
"--show-error-codes",
]
mypy_arg_sets = [common_mypy_args + [
"--package", "spack",
"--package", "llnl",
]]
if 'SPACK_MYPY_CHECK_PACKAGES' in os.environ:
mypy_arg_sets.append(common_mypy_args + [
'--package', 'packages',
'--disable-error-code', 'no-redef',
])
"--show-error-codes",
]
# not yet, need other updates to enable this
# if any([is_package(f) for f in file_list]):
# mypy_args.extend(["--package", "packages"])
returncode = 0
for mypy_args in mypy_arg_sets:
output = mypy_cmd(*mypy_args, fail_on_error=False, output=str)
returncode |= mypy_cmd.returncode
output = mypy_cmd(*mypy_args, fail_on_error=False, output=str)
returncode = mypy_cmd.returncode
rewrite_and_print_output(output, args)
rewrite_and_print_output(output, args)
print_tool_result("mypy", returncode)
return returncode
@@ -324,29 +318,16 @@ def run_isort(isort_cmd, file_list, args):
pat = re.compile("ERROR: (.*) Imports are incorrectly sorted")
replacement = "ERROR: {0} Imports are incorrectly sorted"
returncode = [0]
returncode = 0
for chunk in grouper(file_list, 100):
packed_args = isort_args + tuple(chunk)
output = isort_cmd(*packed_args, fail_on_error=False, output=str, error=str)
returncode |= isort_cmd.returncode
def process_files(file_list, is_args):
for chunk in grouper(file_list, 100):
packed_args = is_args + tuple(chunk)
output = isort_cmd(*packed_args, fail_on_error=False, output=str, error=str)
returncode[0] |= isort_cmd.returncode
rewrite_and_print_output(output, args, pat, replacement)
rewrite_and_print_output(output, args, pat, replacement)
packages_isort_args = ('--rm', 'spack', '--rm', 'spack.pkgkit', '--rm',
'spack.package_defs', '-a', 'from spack.package import *')
packages_isort_args = packages_isort_args + isort_args
# packages
process_files(filter(is_package, file_list),
packages_isort_args)
# non-packages
process_files(filter(lambda f: not is_package(f), file_list),
isort_args)
print_tool_result("isort", returncode[0])
return returncode[0]
print_tool_result("isort", returncode)
return returncode
@tool("black")

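run_isort feeds files to isort in chunks of 100 via a grouper helper, keeping command lines at a manageable length. One possible stdlib implementation of such a helper (hypothetical; only the usage grouper(file_list, 100) is visible in this diff):

    from itertools import zip_longest

    def grouper(iterable, n):
        # Yield tuples of up to n items; the last chunk may be shorter.
        args = [iter(iterable)] * n
        for chunk in zip_longest(*args, fillvalue=None):
            yield tuple(item for item in chunk if item is not None)

    assert list(grouper("abcde", 2)) == [('a', 'b'), ('c', 'd'), ('e',)]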
View File

@@ -20,7 +20,7 @@
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.install_test
import spack.package_base
import spack.package
import spack.repo
import spack.report
@@ -189,7 +189,7 @@ def test_run(args):
# Set up reporter
setattr(args, 'package', [s.format() for s in test_suite.specs])
reporter = spack.report.collect_info(
spack.package_base.PackageBase, 'do_test', args.log_format, args)
spack.package.PackageBase, 'do_test', args.log_format, args)
if not reporter.filename:
if args.log_file:
if os.path.isabs(args.log_file):
@@ -217,7 +217,7 @@ def test_list(args):
else set()
def has_test_and_tags(pkg_class):
return spack.package_base.has_test_method(pkg_class) and \
return spack.package.has_test_method(pkg_class) and \
(not args.tag or pkg_class.name in tagged)
if args.list_all:

View File

@@ -15,7 +15,7 @@
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.error
import spack.package_base
import spack.package
import spack.repo
import spack.store
from spack.database import InstallStatuses
@@ -221,7 +221,7 @@ def do_uninstall(env, specs, force):
except spack.repo.UnknownEntityError:
# The package.py file has gone away -- but still
# want to uninstall.
spack.package_base.Package.uninstall_by_spec(item, force=True)
spack.package.Package.uninstall_by_spec(item, force=True)
# A package is ready to be uninstalled when nothing else references it,
# unless we are requested to force uninstall it.

View File

@@ -14,7 +14,6 @@
import spack.fetch_strategy as fs
import spack.repo
import spack.spec
import spack.util.crypto as crypto
from spack.url import (
UndetectableNameError,
@@ -148,13 +147,13 @@ def url_list(args):
urls = set()
# Gather set of URLs from all packages
for pkg_cls in spack.repo.path.all_package_classes():
url = getattr(pkg_cls, 'url', None)
urls = url_list_parsing(args, urls, url, pkg_cls)
for pkg in spack.repo.path.all_packages():
url = getattr(pkg, 'url', None)
urls = url_list_parsing(args, urls, url, pkg)
for params in pkg_cls.versions.values():
for params in pkg.versions.values():
url = params.get('url', None)
urls = url_list_parsing(args, urls, url, pkg_cls)
urls = url_list_parsing(args, urls, url, pkg)
# Print URLs
for url in sorted(urls):
@@ -185,9 +184,8 @@ def url_summary(args):
tty.msg('Generating a summary of URL parsing in Spack...')
# Loop through all packages
for pkg_cls in spack.repo.path.all_package_classes():
for pkg in spack.repo.path.all_packages():
urls = set()
pkg = pkg_cls(spack.spec.Spec(pkg_cls.name))
url = getattr(pkg, 'url', None)
if url:
@@ -320,20 +318,19 @@ def add(self, pkg_name, fetcher):
version_stats = UrlStats()
resource_stats = UrlStats()
for pkg_cls in spack.repo.path.all_package_classes():
for pkg in spack.repo.path.all_packages():
npkgs += 1
for v in pkg_cls.versions:
for v in pkg.versions:
try:
pkg = pkg_cls(spack.spec.Spec(pkg_cls.name))
fetcher = fs.for_package_version(pkg, v)
except (fs.InvalidArgsError, fs.FetcherConflict):
continue
version_stats.add(pkg_cls.name, fetcher)
version_stats.add(pkg.name, fetcher)
for _, resources in pkg_cls.resources.items():
for _, resources in pkg.resources.items():
for resource in resources:
resource_stats.add(pkg_cls.name, resource.fetcher)
resource_stats.add(pkg.name, resource.fetcher)
# print a nice summary table
tty.msg("URL stats for %d packages:" % npkgs)
@@ -393,8 +390,8 @@ def print_stat(indent, name, stat_name=None):
tty.msg("Found %d issues." % total_issues)
for issue_type, pkgs in issues.items():
tty.msg("Package URLs with %s" % issue_type)
for pkg_cls, pkg_issues in pkgs.items():
color.cprint(" @*C{%s}" % pkg_cls)
for pkg, pkg_issues in pkgs.items():
color.cprint(" @*C{%s}" % pkg)
for issue in pkg_issues:
print(" %s" % issue)
@@ -425,7 +422,7 @@ def url_list_parsing(args, urls, url, pkg):
urls (set): List of URLs that have already been added
url (str or None): A URL to potentially add to ``urls`` depending on
``args``
pkg (spack.package_base.PackageBase): The Spack package
pkg (spack.package.PackageBase): The Spack package
Returns:
set: The updated set of ``urls``
@@ -473,7 +470,7 @@ def name_parsed_correctly(pkg, name):
"""Determine if the name of a package was correctly parsed.
Args:
pkg (spack.package_base.PackageBase): The Spack package
pkg (spack.package.PackageBase): The Spack package
name (str): The name that was extracted from the URL
Returns:
@@ -490,7 +487,7 @@ def version_parsed_correctly(pkg, version):
"""Determine if the version of a package was correctly parsed.
Args:
pkg (spack.package_base.PackageBase): The Spack package
pkg (spack.package.PackageBase): The Spack package
version (str): The version that was extracted from the URL
Returns:

View File

@@ -12,7 +12,6 @@
import spack.cmd.common.arguments as arguments
import spack.repo
import spack.spec
from spack.version import infinity_versions, ver
description = "list available versions of a package"
@@ -40,9 +39,7 @@ def setup_parser(subparser):
def versions(parser, args):
spec = spack.spec.Spec(args.package)
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
pkg = pkg_cls(spec)
pkg = spack.repo.get(args.package)
safe_versions = pkg.versions

View File

@@ -81,14 +81,6 @@ def cxx11_flag(self):
def cxx14_flag(self):
return "-std=c++14"
@property
def cxx17_flag(self):
return "-std=c++17"
@property
def cxx20_flag(self):
return "-std=c++20"
@property
def c99_flag(self):
return "-std=c99"

View File

@@ -180,6 +180,26 @@ def paths(self):
view='/opt/view'
)
@tengine.context_property
def monitor(self):
"""Enable using spack monitor during build."""
Monitor = collections.namedtuple('Monitor', [
'enabled', 'host', 'prefix', 'keep_going', 'tags'
])
monitor = self.config.get("monitor")
# If we don't have a monitor group, cut out early.
if not monitor:
return Monitor(False, None, None, None, None)
return Monitor(
enabled=True,
host=monitor.get('host'),
prefix=monitor.get('prefix'),
keep_going=monitor.get("keep_going"),
tags=monitor.get('tags')
)
@tengine.context_property
def manifest(self):
"""The spack.yaml file that should be used in the image"""
@@ -188,6 +208,8 @@ def manifest(self):
# Copy in the part of spack.yaml prescribed in the configuration file
manifest = copy.deepcopy(self.config)
manifest.pop('container')
if "monitor" in manifest:
manifest.pop("monitor")
# Ensure that a few paths are where they need to be
manifest.setdefault('config', syaml.syaml_dict())

View File

@@ -86,13 +86,13 @@ def spec_from_entry(entry):
arch=arch_str
)
pkg_cls = spack.repo.path.get_pkg_class(entry['name'])
package = spack.repo.get(entry['name'])
if 'parameters' in entry:
variant_strs = list()
for name, value in entry['parameters'].items():
# TODO: also ensure that the variant value is valid?
if not (name in pkg_cls.variants):
if not (name in package.variants):
tty.debug("Omitting variant {0} for entry {1}/{2}"
.format(name, entry['name'], entry['hash'][:7]))
continue

View File

@@ -240,7 +240,7 @@ def compute_windows_program_path_for_package(pkg):
program files location, return list of best guesses
Args:
pkg (spack.package_base.Package): package for which
pkg (spack.package.Package): package for which
Program Files location is to be computed
"""
if not is_windows:

View File

@@ -220,7 +220,7 @@ def by_executable(packages_to_check, path_hints=None):
searching by path.
Args:
packages_to_check (list): list of package classes to be detected
packages_to_check (list): list of packages to be detected
path_hints (list): list of paths to be searched. If None the list will be
constructed based on the PATH environment variable.
"""
@@ -228,7 +228,7 @@ def by_executable(packages_to_check, path_hints=None):
exe_pattern_to_pkgs = collections.defaultdict(list)
for pkg in packages_to_check:
if hasattr(pkg, 'executables'):
for exe in pkg.platform_executables():
for exe in pkg.platform_executables:
exe_pattern_to_pkgs[exe].append(pkg)
# Add Windows specific, package related paths to the search paths
path_hints.extend(compute_windows_program_path_for_package(pkg))
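
by_executable inverts the search: rather than asking each package to probe the filesystem, it builds one map from executable pattern to the packages declaring it, then scans the path hints a single time. A simplified sketch (using a plain executables attribute instead of Spack's platform_executables):

    import collections
    import re

    class Cmake:
        name = "cmake"
        executables = [r"^cmake$"]

    class Gcc:
        name = "gcc"
        executables = [r"^gcc(-\d+)?$"]

    exe_pattern_to_pkgs = collections.defaultdict(list)
    for pkg in (Cmake, Gcc):
        for exe in pkg.executables:
            exe_pattern_to_pkgs[exe].append(pkg)

    found = "gcc-12"
    matches = [p.name for pat, pkgs in exe_pattern_to_pkgs.items()
               if re.search(pat, found) for p in pkgs]
    print(matches)  # ['gcc']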

View File

@@ -46,7 +46,7 @@ class OpenMpi(Package):
from spack.dependency import Dependency, canonical_deptype, default_deptype
from spack.fetch_strategy import from_kwargs
from spack.resource import Resource
from spack.version import GitVersion, Version, VersionChecksumError, VersionLookupError
from spack.version import Version, VersionChecksumError
__all__ = ['DirectiveError', 'DirectiveMeta', 'version', 'conflicts', 'depends_on',
'extends', 'provides', 'patch', 'variant', 'resource']
@@ -330,17 +330,7 @@ def _execute_version(pkg):
kwargs['checksum'] = checksum
# Store kwargs for the package to later with a fetch_strategy.
version = Version(ver)
if isinstance(version, GitVersion):
if not hasattr(pkg, 'git') and 'git' not in kwargs:
msg = "Spack version directives cannot include git hashes fetched from"
msg += " URLs. Error in package '%s'\n" % pkg.name
msg += " version('%s', " % version.string
msg += ', '.join("%s='%s'" % (argname, value)
for argname, value in kwargs.items())
msg += ")"
raise VersionLookupError(msg)
pkg.versions[version] = kwargs
pkg.versions[Version(ver)] = kwargs
return _execute_version

View File

@@ -1113,13 +1113,8 @@ def develop(self, spec, path, clone=False):
# "steal" the source code via staging API
abspath = os.path.normpath(os.path.join(self.path, path))
# Stage, at the moment, requires a concrete Spec, since it needs the
# dag_hash for the stage dir name. Below though we ask for a stage
# to be created, to copy it afterwards somewhere else. It would be
# better if we can create the `source_path` directly into its final
# destination.
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
pkg_cls(spec).stage.steal_source(abspath)
stage = spec.package.stage
stage.steal_source(abspath)
# If it wasn't already in the list, append it
self.dev_specs[spec.name] = {'path': path, 'spec': str(spec)}

View File

@@ -35,7 +35,6 @@
import six.moves.urllib.parse as urllib_parse
import llnl.util
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.filesystem import (
get_single_file,
@@ -120,11 +119,6 @@ def __init__(self, **kwargs):
# 'no_cache' option from version directive.
self.cache_enabled = not kwargs.pop('no_cache', False)
self.package = None
def set_package(self, package):
self.package = package
# Subclasses need to implement these methods
def fetch(self):
"""Fetch source code archive or repo.
@@ -248,10 +242,6 @@ def source_id(self):
if all(component_ids):
return component_ids
def set_package(self, package):
for item in self:
item.package = package
@fetcher
class URLFetchStrategy(FetchStrategy):
@@ -498,7 +488,7 @@ def _fetch_curl(self, url):
self._check_headers(headers)
if save_file and (partial_file is not None):
fs.rename(partial_file, save_file)
os.rename(partial_file, save_file)
@property # type: ignore # decorated properties unsupported in mypy
@_needs_stage
@@ -530,7 +520,7 @@ def expand(self):
"Failed on expand() for URL %s" % self.url)
if not self.extension:
self.extension = extension(self.url)
self.extension = extension(self.archive_file)
if self.stage.expanded:
tty.debug('Source already staged to %s' % self.stage.source_path)
@@ -538,11 +528,50 @@ def expand(self):
decompress = decompressor_for(self.archive_file, self.extension)
# Expand all tarballs in their own directory to contain
# exploding tarballs.
tarball_container = os.path.join(self.stage.path,
"spack-expanded-archive")
# Below we assume that the command to decompress expands the
# archive in the current working directory
with fs.exploding_archive_catch(self.stage):
mkdirp(tarball_container)
with working_dir(tarball_container):
decompress(self.archive_file)
# Check for an exploding tarball, i.e. one that doesn't expand to
# a single directory. If the tarball *didn't* explode, move its
# contents to the staging source directory & remove the container
# directory. If the tarball did explode, just rename the tarball
# directory to the staging source directory.
#
# NOTE: The tar program on Mac OS X will encode HFS metadata in
# hidden files, which can end up *alongside* a single top-level
# directory. We initially ignore presence of hidden files to
# accommodate these "semi-exploding" tarballs but ensure the files
# are copied to the source directory.
files = os.listdir(tarball_container)
non_hidden = [f for f in files if not f.startswith('.')]
if len(non_hidden) == 1:
src = os.path.join(tarball_container, non_hidden[0])
if os.path.isdir(src):
self.stage.srcdir = non_hidden[0]
shutil.move(src, self.stage.source_path)
if len(files) > 1:
files.remove(non_hidden[0])
for f in files:
src = os.path.join(tarball_container, f)
dest = os.path.join(self.stage.path, f)
shutil.move(src, dest)
os.rmdir(tarball_container)
else:
# This is a non-directory entry (e.g., a patch file) so simply
# rename the tarball container to be the source path.
shutil.move(tarball_container, self.stage.source_path)
else:
shutil.move(tarball_container, self.stage.source_path)
def archive(self, destination):
"""Just moves this archive to the destination."""
if not self.archive_file:
@@ -985,20 +1014,9 @@ def clone(self, dest=None, commit=None, branch=None, tag=None, bare=False):
git(*args)
# Init submodules if the user asked for them.
git_commands = []
submodules = self.submodules
if callable(submodules):
submodules = list(submodules(self.package))
git_commands.append(["submodule", "init", "--"] + submodules)
git_commands.append(['submodule', 'update', '--recursive'])
elif submodules:
git_commands.append(["submodule", "update", "--init", "--recursive"])
if not git_commands:
return
with working_dir(dest):
for args in git_commands:
if self.submodules:
with working_dir(dest):
args = ['submodule', 'update', '--init', '--recursive']
if not spack.config.get('config:debug'):
args.insert(1, '--quiet')
git(*args)
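
The newer clone() accepts submodules as either a boolean or a callable returning an explicit list of submodule paths; this hunk collapses it back to the boolean-only form. A sketch of the two-branch command construction on the callable side:

    def submodule_commands(submodules, package=None):
        # `submodules` may be a callable (returning paths) or a bool.
        commands = []
        if callable(submodules):
            paths = list(submodules(package))
            commands.append(["submodule", "init", "--"] + paths)
            commands.append(["submodule", "update", "--recursive"])
        elif submodules:
            commands.append(["submodule", "update", "--init", "--recursive"])
        return commands

    print(submodule_commands(lambda pkg: ["third_party/zlib"]))
    print(submodule_commands(True))
    print(submodule_commands(False))  # []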
@@ -1538,7 +1556,7 @@ def _extrapolate(pkg, version):
try:
return URLFetchStrategy(pkg.url_for_version(version),
fetch_options=pkg.fetch_options)
except spack.package_base.NoURLError:
except spack.package.NoURLError:
msg = ("Can't extrapolate a URL for version %s "
"because package %s defines no URLs")
raise ExtrapolationError(msg % (version, pkg.name))
@@ -1575,30 +1593,16 @@ def for_package_version(pkg, version):
check_pkg_attributes(pkg)
if not isinstance(version, spack.version.VersionBase):
if not isinstance(version, spack.version.Version):
version = spack.version.Version(version)
# if it's a commit, we must use a GitFetchStrategy
if isinstance(version, spack.version.GitVersion):
if not hasattr(pkg, "git"):
raise FetchError(
"Cannot fetch git version for %s. Package has no 'git' attribute" %
pkg.name
)
if version.is_commit and hasattr(pkg, "git"):
# Populate the version with comparisons to other commits
version.generate_git_lookup(pkg.name)
# For GitVersion, we have no way to determine whether a ref is a branch or tag
# Fortunately, we handle branches and tags identically, except tags are
# handled slightly more conservatively for older versions of git.
# We call all non-commit refs tags in this context, at the cost of a slight
# performance hit for branches on older versions of git.
# Branches cannot be cached, so we tell the fetcher not to cache tags/branches
ref_type = 'commit' if version.is_commit else 'tag'
version.generate_commit_lookup(pkg.name)
kwargs = {
'git': pkg.git,
ref_type: version.ref,
'no_cache': True,
'commit': str(version)
}
kwargs['submodules'] = getattr(pkg, 'submodules', False)
fetcher = GitFetchStrategy(**kwargs)

View File

@@ -535,10 +535,9 @@ def graph_dot(specs, deptype='all', static=False, out=None):
deptype = spack.dependency.canonical_deptype(deptype)
def static_graph(spec, deptype):
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
possible = pkg_cls.possible_dependencies(
expand_virtuals=True, deptype=deptype
)
pkg = spec.package
possible = pkg.possible_dependencies(
expand_virtuals=True, deptype=deptype)
nodes = set() # elements are (node name, node label)
edges = set() # elements are (src key, dest key)
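
static_graph collects (name, label) nodes and (src, dest) edges into sets; rendering them as DOT afterwards is a plain dump. A minimal sketch of that final step:

    import sys

    def write_dot(nodes, edges, out=sys.stdout):
        # nodes: set of (key, label); edges: set of (src key, dest key)
        out.write('digraph G {\n')
        for key, label in sorted(nodes):
            out.write('  "%s" [label="%s"]\n' % (key, label))
        for src, dest in sorted(edges):
            out.write('  "%s" -> "%s"\n' % (src, dest))
        out.write('}\n')

    write_dot({("zlib", "zlib"), ("hdf5", "hdf5")}, {("hdf5", "zlib")})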

Some files were not shown because too many files have changed in this diff.