Compare commits

1 Commit

Author: Joseph Snyder
SHA1: deb3f0f43a
Date: 2022-01-10 12:48:51 -05:00

Prevent S3 information on non-S3 mirrors

Switch from looking at the presence of the S3 information keys to
looking at the values in those keys when determining whether the
dictionary URL form is used.

Add s3_endpoint_url as an additional key value for the S3 information.
7291 changed files with 21169 additions and 40952 deletions
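To illustrate the change described in the commit message, here is a minimal before/after sketch; the function names and most key names are hypothetical, not the actual Spack code (only ``s3_endpoint_url`` comes from the commit message):

```python
# Hypothetical sketch of the check described above.
# Before: a mirror was treated as S3-backed if the keys merely existed.
def mirror_uses_s3_before(mirror):
    return 'access_pair' in mirror or 'profile' in mirror

# After: the keys must hold actual (non-empty) values,
# with an endpoint URL key considered as well.
def mirror_uses_s3_after(mirror):
    s3_keys = ('access_pair', 'profile', 's3_endpoint_url')
    return any(mirror.get(key) for key in s3_keys)
```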

View File

@@ -16,29 +16,19 @@ body:
attributes:
label: Steps to reproduce the issue
description: |
Fill in the console output from the exact spec you are trying to build.
value: |
Fill in the exact spec you are trying to build and the relevant part of the error message
placeholder: |
```console
$ spack spec -I <spec>
$ spack install <spec>
...
```
- type: textarea
id: error
attributes:
label: Error message
description: |
Please post the error message from spack inside the `<details>` tag below:
value: |
<details><summary>Error message</summary><pre>
...
</pre></details>
validations:
required: true
- type: textarea
id: information
attributes:
label: Information on your system
description: Please include the output of `spack debug report`.
description: Please include the output of `spack debug report`
validations:
required: true
- type: markdown

View File

@@ -31,7 +31,7 @@ jobs:
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -61,7 +61,7 @@ jobs:
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -90,7 +90,7 @@ jobs:
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -118,7 +118,7 @@ jobs:
bzip2 curl file gcc-c++ gcc gcc-fortran tar git gpg2 gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -138,7 +138,7 @@ jobs:
- name: Install dependencies
run: |
brew install cmake bison@2.7 tree
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -157,8 +157,8 @@ jobs:
- name: Install dependencies
run: |
brew install tree
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Bootstrap clingo
@@ -174,8 +174,8 @@ jobs:
matrix:
python-version: ['2.7', '3.5', '3.6', '3.7', '3.8', '3.9']
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Setup repo and non-root user
@@ -202,7 +202,7 @@ jobs:
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846
- uses: actions/checkout@v2
- name: Setup repo and non-root user
run: |
git --version
@@ -231,7 +231,7 @@ jobs:
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree \
gawk
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846
- uses: actions/checkout@v2
- name: Setup repo and non-root user
run: |
git --version
@@ -256,7 +256,7 @@ jobs:
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846
- uses: actions/checkout@v2
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
@@ -272,7 +272,7 @@ jobs:
brew install gawk tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846
- uses: actions/checkout@v2
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh

View File

@@ -37,7 +37,7 @@ jobs:
name: Build ${{ matrix.dockerfile[0] }}
steps:
- name: Checkout
uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Set Container Tag Normal (Nightly)
run: |
@@ -67,7 +67,7 @@ jobs:
uses: docker/setup-buildx-action@94ab11c41e45d028884a99163086648e898eed25 # @v1
- name: Log in to GitHub Container Registry
uses: docker/login-action@dd4fa0671be5250ee6f50aedf4cb05514abda2c7 # @v1
uses: docker/login-action@42d299face0c5c43a0487c477f595ac9cf22f1a7 # @v1
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -75,13 +75,13 @@ jobs:
- name: Log in to DockerHub
if: ${{ github.event_name != 'pull_request' }}
uses: docker/login-action@dd4fa0671be5250ee6f50aedf4cb05514abda2c7 # @v1
uses: docker/login-action@42d299face0c5c43a0487c477f595ac9cf22f1a7 # @v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[1] }}
uses: docker/build-push-action@7f9d37fa544684fb73bfe4835ed7214c255ce02b # @v2
uses: docker/build-push-action@a66e35b9cbcf4ad0ea91ffcaf7bbad63ad9e0229 # @v2
with:
file: share/spack/docker/${{matrix.dockerfile[1]}}
platforms: ${{ matrix.dockerfile[2] }}

View File

@@ -2,7 +2,19 @@
. share/spack/setup-env.sh
echo -e "config:\n build_jobs: 2" > etc/spack/config.yaml
spack config add "packages:all:target:[x86_64]"
spack compiler find
# TODO: remove this explicit setting once apple-clang detection is fixed
cat <<EOF > etc/spack/compilers.yaml
compilers:
- compiler:
spec: apple-clang@11.0.3
paths:
cc: /usr/bin/clang
cxx: /usr/bin/clang++
f77: /usr/local/bin/gfortran-9
fc: /usr/local/bin/gfortran-9
modules: []
operating_system: catalina
target: x86_64
EOF
spack compiler info apple-clang
spack debug report
spack solve zlib

View File

@@ -24,23 +24,23 @@ jobs:
name: gcc with clang
runs-on: macos-latest
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: spack install
run: |
. .github/workflows/install_spack.sh
# 9.2.0 is the latest version on which we apply homebrew patch
spack install -v --fail-fast gcc@11.2.0 %apple-clang
spack install -v --fail-fast gcc@9.2.0 %apple-clang
install_jupyter_clang:
name: jupyter
runs-on: macos-latest
timeout-minutes: 700
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: spack install
@@ -52,8 +52,8 @@ jobs:
name: scipy, mpl, pd
runs-on: macos-latest
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: spack install

View File

@@ -15,8 +15,8 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install Python Packages
@@ -31,10 +31,10 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install Python packages
@@ -57,7 +57,7 @@ jobs:
packages: ${{ steps.filter.outputs.packages }}
with_coverage: ${{ steps.coverage.outputs.with_coverage }}
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
@@ -106,10 +106,10 @@ jobs:
- python-version: 3.9
concretizer: original
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -121,16 +121,12 @@ jobs:
patchelf cmake bison libbison-dev kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov "coverage[toml]<=6.2"
pip install --upgrade pip six setuptools pytest codecov coverage[toml]
# ensure style checks are not skipped in unit tests for python >= 3.6
# note that true/false (i.e., 1/0) are opposite in conditions in python and bash
if python -c 'import sys; sys.exit(not sys.version_info >= (3, 6))'; then
pip install --upgrade flake8 isort>=4.3.5 mypy>=0.900 black
fi
- name: Pin pathlib for Python 2.7
if: ${{ matrix.python-version == 2.7 }}
run: |
pip install -U pathlib2==2.3.6
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -171,10 +167,10 @@ jobs:
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install System packages
@@ -184,7 +180,7 @@ jobs:
sudo apt-get install -y coreutils kcov csh zsh tcsh fish dash bash
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov coverage[toml]==6.2
pip install --upgrade pip six setuptools pytest codecov coverage[toml]
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -218,7 +214,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -237,10 +233,10 @@ jobs:
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install System packages
@@ -252,7 +248,7 @@ jobs:
patchelf kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov coverage[toml]==6.2 clingo
pip install --upgrade pip six setuptools pytest codecov coverage[toml] clingo
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -286,16 +282,16 @@ jobs:
matrix:
python-version: [3.8]
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools
pip install --upgrade pytest codecov coverage[toml]==6.2
pip install --upgrade pytest codecov coverage[toml]
- name: Setup Homebrew packages
run: |
brew install dash fish gcc gnupg2 kcov
@@ -331,13 +327,13 @@ jobs:
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov coverage[toml]==6.2
pip install --upgrade pip six setuptools pytest codecov coverage[toml]
- name: Package audits (with coverage)
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
run: |

View File

@@ -34,18 +34,10 @@ includes the sbang tool directly in bin/sbang. These packages are covered
by various permissive licenses. A summary listing follows. See the
license included with each package for full details.
PackageName: altgraph
PackageHomePage: https://altgraph.readthedocs.io/en/latest/index.html
PackageLicenseDeclared: MIT
PackageName: argparse
PackageHomePage: https://pypi.python.org/pypi/argparse
PackageLicenseDeclared: Python-2.0
PackageName: astunparse
PackageHomePage: https://github.com/simonpercivall/astunparse
PackageLicenseDeclared: Python-2.0
PackageName: attrs
PackageHomePage: https://github.com/python-attrs/attrs
PackageLicenseDeclared: MIT
@@ -70,10 +62,6 @@ PackageName: jsonschema
PackageHomePage: https://pypi.python.org/pypi/jsonschema
PackageLicenseDeclared: MIT
PackageName: macholib
PackageHomePage: https://macholib.readthedocs.io/en/latest/index.html
PackageLicenseDeclared: MIT
PackageName: markupsafe
PackageHomePage: https://pypi.python.org/pypi/MarkupSafe
PackageLicenseDeclared: BSD-3-Clause
@@ -105,3 +93,11 @@ PackageLicenseDeclared: Apache-2.0 OR MIT
PackageName: six
PackageHomePage: https://pypi.python.org/pypi/six
PackageLicenseDeclared: MIT
PackageName: macholib
PackageHomePage: https://macholib.readthedocs.io/en/latest/index.html
PackageLicenseDeclared: MIT
PackageName: altgraph
PackageHomePage: https://altgraph.readthedocs.io/en/latest/index.html
PackageLicenseDeclared: MIT

View File

@@ -1,6 +1,6 @@
MIT License
Copyright (c) 2013-2022 LLNS, LLC and other Spack Project Developers.
Copyright (c) 2013-2020 LLNS, LLC and other Spack Project Developers.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

View File

@@ -10,7 +10,6 @@ For more on Spack's release structure, see
| Version | Supported |
| ------- | ------------------ |
| develop | :white_check_mark: |
| 0.17.x | :white_check_mark: |
| 0.16.x | :white_check_mark: |
## Reporting a Vulnerability

View File

@@ -1,7 +1,7 @@
#!/bin/sh
# -*- python -*-
#
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,6 +1,6 @@
#!/bin/sh
#
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,17 +0,0 @@
# -------------------------------------------------------------------------
# This is the default spack configuration file.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing
# `$SPACK_ROOT/etc/spack/concretizer.yaml`, `~/.spack/concretizer.yaml`,
# or by adding a `concretizer:` section to an environment.
# -------------------------------------------------------------------------
concretizer:
# Whether to consider installed packages or packages from buildcaches when
# concretizing specs. If `true`, we'll try to use as many installs/binaries
# as possible, rather than building. If `false`, we'll always give you a fresh
# concretization.
reuse: false
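As the header comments note, these defaults can be overridden per user or per environment. A minimal sketch of an environment ``spack.yaml`` that opts into reuse (the spec list is illustrative):

```yaml
# hypothetical environment configuration overriding the default
spack:
  specs: [zlib]
  concretizer:
    reuse: true
```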

View File

@@ -155,17 +155,14 @@ config:
# The concretization algorithm to use in Spack. Options are:
#
# 'original': Spack's original greedy, fixed-point concretizer. This
# algorithm can make decisions too early and will not backtrack
# sufficiently for many specs.
#
# 'clingo': Uses a logic solver under the hood to solve DAGs with full
# backtracking and optimization for user preferences. Spack will
# try to bootstrap the logic solver, if not already available.
#
# 'original': Spack's original greedy, fixed-point concretizer. This
# algorithm can make decisions too early and will not backtrack
# sufficiently for many specs. This will soon be deprecated in
# favor of clingo.
#
# See `concretizer.yaml` for more settings you can fine-tune when
# using clingo.
concretizer: clingo

View File

@@ -46,13 +46,7 @@ modules:
enable:
- tcl
tcl:
all:
autoload: none
# Default configurations if lmod is enabled
lmod:
all:
autoload: direct
hierarchy:
- mpi

View File

@@ -34,7 +34,6 @@ packages:
java: [openjdk, jdk, ibm-java]
jpeg: [libjpeg-turbo, libjpeg]
lapack: [openblas, amdlibflame]
libllvm: [llvm, llvm-amdgpu]
lua-lang: [lua, lua-luajit]
mariadb-client: [mariadb-c-client, mariadb]
mkl: [intel-mkl]

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -194,9 +194,9 @@ Reusing installed dependencies
.. warning::
The ``--reuse`` option described here will become the default installation
method in the next Spack version, and you will be able to get the current
behavior by using ``spack install --fresh``.
The ``--reuse`` option described here is experimental, and it will
likely be replaced with a different option and configuration settings
in the next Spack release.
By default, when you run ``spack install``, Spack tries to build a new
version of the package you asked for, along with updated versions of
the ``mpich`` will be built with the installed versions, if possible.
You can use the :ref:`spack spec -I <cmd-spack-spec>` command to see what
will be reused and what will be built before you install.
You can configure Spack to use the ``--reuse`` behavior by default in
``concretizer.yaml``.
.. _cmd-spack-uninstall:
^^^^^^^^^^^^^^^^^^^
@@ -1283,7 +1280,7 @@ Normally users don't have to bother specifying the architecture if they
are installing software for their current host, as in that case the
values will be detected automatically. If you need fine-grained control
over which packages use which targets (or over *all* packages' default
target), see :ref:`package-preferences`.
target), see :ref:`concretization-preferences`.
.. admonition:: Cray machines

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -209,49 +209,11 @@ Specific limitations include:
then Spack will not add a new external entry (``spack config blame packages``
can help locate all external entries).
.. _concretizer-options:
.. _concretization-preferences:
----------------------
Concretizer options
----------------------
``packages.yaml`` gives the concretizer preferences for specific packages,
but you can also use ``concretizer.yaml`` to customize aspects of the
algorithm it uses to select the dependencies you install:
.. code-block:: yaml
concretizer:
# Whether to consider installed packages or packages from buildcaches when
# concretizing specs. If `true`, we'll try to use as many installs/binaries
# as possible, rather than building. If `false`, we'll always give you a fresh
# concretization.
reuse: false
^^^^^^^^^^^^^^^^
``reuse``
^^^^^^^^^^^^^^^^
This controls whether Spack will prefer to use installed packages (``true``), or
whether it will do a "fresh" installation and prefer the latest settings from
``package.py`` files and ``packages.yaml`` (``false``). .
You can use ``spack install --reuse`` to enable reuse for a single installation,
and you can use ``spack install --fresh`` to do a fresh install if ``reuse`` is
enabled by default.
.. note::
``reuse: false`` is the current default, but ``reuse: true`` will be the default
in the next Spack release. You will still be able to use ``spack install --fresh``
to get the old behavior.
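For example, a usage sketch of the flags described above (the ``zlib`` spec is just an illustration):

.. code-block:: console

   $ spack install --reuse zlib   # prefer already-installed packages
   $ spack install --fresh zlib   # force a fresh concretization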
.. _package-preferences:
-------------------
Package Preferences
-------------------
--------------------------
Concretization Preferences
--------------------------
Spack can be configured to prefer certain compilers, package
versions, dependencies, and variants during concretization.
@@ -307,7 +269,6 @@ concretization rules. A provider lists a value that packages may
``depend_on`` (e.g, MPI) and a list of rules for fulfilling that
dependency.
.. _package_permissions:
-------------------

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -649,7 +649,7 @@ follow `the next section <intel-install-libs_>`_ instead.
* If you specified a custom variant (for example ``+vtune``) you may want to add this as your
preferred variant in the packages configuration for the ``intel-parallel-studio`` package
as described in :ref:`package-preferences`. Otherwise you will have to specify
as described in :ref:`concretization-preferences`. Otherwise you will have to specify
the variant every time ``intel-parallel-studio`` is used as the ``mkl``, ``fftw`` or ``mpi``
implementation to avoid pulling in a different variant.
@@ -811,13 +811,13 @@ by one of the following means:
$ spack install libxc@3.0.0%intel
* Alternatively, request Intel compilers implicitly by package preferences.
* Alternatively, request Intel compilers implicitly by concretization preferences.
Configure the order of compilers in the appropriate ``packages.yaml`` file,
under either an ``all:`` or client-package-specific entry, in a
``compiler:`` list. Consult the Spack documentation for
`Configuring Package Preferences <https://spack-tutorial.readthedocs.io/en/latest/tutorial_configuration.html#configuring-package-preferences>`_
and
:ref:`Package Preferences <package-preferences>`.
:ref:`Concretization Preferences <concretization-preferences>`.
Example: ``etc/spack/packages.yaml`` might simply contain:
@@ -867,7 +867,7 @@ virtual package, in order of decreasing preference. To learn more about the
``providers:`` settings, see the Spack tutorial for
`Configuring Package Preferences <https://spack-tutorial.readthedocs.io/en/latest/tutorial_configuration.html#configuring-package-preferences>`_
and the section
:ref:`Package Preferences <package-preferences>`.
:ref:`Concretization Preferences <concretization-preferences>`.
Example: The following fairly minimal example for ``packages.yaml`` shows how
to exclusively use the standalone ``intel-mkl`` package for all the linear

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -9,80 +9,216 @@
PythonPackage
-------------
Python packages and modules have their own special build system. This
documentation covers everything you'll need to know in order to write
a Spack build recipe for a Python library.
Python packages and modules have their own special build system.
^^^^^^^^^^^
Terminology
^^^^^^^^^^^
^^^^^^
Phases
^^^^^^
In the Python ecosystem, there are a number of terms that are
important to understand.
The ``PythonPackage`` base class provides the following phases that
can be overridden:
**PyPI**
The `Python Package Index <https://pypi.org/>`_, where most Python
libraries are hosted.
* ``build``
* ``build_py``
* ``build_ext``
* ``build_clib``
* ``build_scripts``
* ``install``
* ``install_lib``
* ``install_headers``
* ``install_scripts``
* ``install_data``
**sdist**
Source distributions, distributed as tarballs (.tar.gz) and zip
files (.zip). Contain the source code of the package.
**bdist**
Built distributions, distributed as wheels (.whl). Contain the
pre-built library.
**wheel**
A binary distribution format common in the Python ecosystem. This
file is actually just a zip file containing specific metadata and
code. See the
`documentation <https://packaging.python.org/en/latest/specifications/binary-distribution-format/>`_
for more details.
**build frontend**
Command-line tools used to build and install wheels. Examples
include `pip <https://pip.pypa.io/>`_,
`build <https://pypa-build.readthedocs.io/>`_, and
`installer <https://installer.readthedocs.io/>`_.
**build backend**
Libraries used to define how to build a wheel. Examples
include `setuptools <https://setuptools.pypa.io/>`__,
`flit <https://flit.readthedocs.io/>`_, and
`poetry <https://python-poetry.org/>`_.
^^^^^^^^^^^
Downloading
^^^^^^^^^^^
The first step in packaging a Python library is to figure out where
to download it from. The vast majority of Python packages are hosted
on `PyPI <https://pypi.org/>`_, which is
:ref:`preferred over GitHub <pypi-vs-github>` for downloading
packages. Search for the package name on PyPI to find the project
page. The project page is usually located at::
https://pypi.org/project/<package-name>
On the project page, there is a "Download files" tab containing
download URLs. Whenever possible, we prefer to build Spack packages
from source. If PyPI only has wheels, check to see if the project is
hosted on GitHub and see if GitHub has source distributions. The
project page usually has a "Homepage" and/or "Source code" link for
this. If the project is closed-source, it may only have wheels
available. For example, ``py-azureml-sdk`` is closed-source and can
be downloaded from::
https://pypi.io/packages/py3/a/azureml_sdk/azureml_sdk-1.11.0-py3-none-any.whl
Once you've found a URL to download the package from, run:
These are all standard ``setup.py`` commands and can be found by running:
.. code-block:: console
$ spack create <url>
$ python setup.py --help-commands
to create a new package template.
By default, only the ``build`` and ``install`` phases are run:
#. ``build`` - build everything needed to install
#. ``install`` - install everything from build directory
If for whatever reason you need to run more phases, simply modify your
``phases`` list like so:
.. code-block:: python
phases = ['build_ext', 'install']
Each phase provides a function ``<phase>`` that runs:
.. code-block:: console
$ python -s setup.py --no-user-cfg <phase>
Each phase also has a ``<phase_args>`` function that can pass arguments to
this call. All of these functions are empty except for the ``install_args``
function, which passes ``--prefix=/path/to/installation/prefix``. There is
also some additional logic specific to setuptools and eggs.
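As a sketch of this mechanism, a package could extend the arguments for one phase like so (``MyPackage`` and the flag shown are illustrative, not from the Spack sources):

.. code-block:: python

   def install_args(self, spec, prefix):
       # keep the default arguments (including --prefix) and append a flag
       args = super(MyPackage, self).install_args(spec, prefix)
       args.append('--no-compile')  # skip byte-compiling installed .py files
       return args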
If you need to run a phase that is not a standard ``setup.py`` command,
you'll need to define a function for it like so:
.. code-block:: python
phases = ['configure', 'build', 'install']
def configure(self, spec, prefix):
self.setup_py('configure')
^^^^^^
Wheels
^^^^^^
Some Python packages are closed-source and distributed as wheels.
Instead of using the ``PythonPackage`` base class, you should extend
the ``Package`` base class and implement the following custom installation
procedure:
.. code-block:: python
def install(self, spec, prefix):
pip = which('pip')
pip('install', self.stage.archive_file, '--prefix={0}'.format(prefix))
This will require a dependency on pip, as mentioned below.
^^^^^^^^^^^^^^^
Important files
^^^^^^^^^^^^^^^
Python packages can be identified by the presence of a ``setup.py`` file.
This file is used by package managers like ``pip`` to determine a
package's dependencies and the version of dependencies required, so if
the ``setup.py`` file is not accurate, the package will not build properly.
For this reason, the ``setup.py`` file should be fairly reliable. If the
documentation and ``setup.py`` disagree on something, the ``setup.py``
file should be considered to be the truth. As dependencies are added or
removed, the documentation is much more likely to become outdated than
the ``setup.py``.
The Python ecosystem has evolved significantly over the years. Before
setuptools became popular, most packages listed their dependencies in a
``requirements.txt`` file. Once setuptools took over, these dependencies
were listed directly in the ``setup.py``. Newer PEPs introduced additional
files, like ``setup.cfg`` and ``pyproject.toml``. You should look out for
all of these files, as they may all contain important information about
package dependencies.
Some Python packages are closed-source and are distributed as Python
wheels. For example, ``py-azureml-sdk`` downloads a ``.whl`` file. This
file is simply a zip file, and can be extracted using:
.. code-block:: console
$ unzip *.whl
The zip file will not contain a ``setup.py``, but it will contain a
``METADATA`` file which contains all the information you need to
write a ``package.py`` build recipe.
.. _pypi:
^^^^
PyPI
^^^^
The vast majority of Python packages are hosted on PyPI (The Python
Package Index), which is :ref:`preferred over GitHub <pypi-vs-github>`
for downloading packages. ``pip`` only supports packages hosted on PyPI, making
it the only option for developers who want a simple installation.
Search for "PyPI <package-name>" to find the download page. Note that
some pages are versioned, and the first result may not be the newest
version. Click on the "Latest Version" button at the top right to see
if a newer version is available. The download page is usually at::
https://pypi.org/project/<package-name>
Since PyPI is so common, the ``PythonPackage`` base class has a
``pypi`` attribute that can be set. Once set, ``pypi`` will be used
to define the ``homepage``, ``url``, and ``list_url``. For example,
the following:
.. code-block:: python
homepage = 'https://pypi.org/project/setuptools/'
url = 'https://pypi.org/packages/source/s/setuptools/setuptools-49.2.0.zip'
list_url = 'https://pypi.org/simple/setuptools/'
is equivalent to:
.. code-block:: python
pypi = 'setuptools/setuptools-49.2.0.zip'
^^^^^^^^^^^
Description
^^^^^^^^^^^
The top of the PyPI downloads page contains a description of the
package. The first line is usually a short description, while a
longer "Project Description" may follow. Choose whichever
is more useful. You can also get these descriptions on the command-line
using:
.. code-block:: console
$ python setup.py --description
$ python setup.py --long-description
^^^^^^^^
Homepage
^^^^^^^^
Package developers use ``setup.py`` to upload new versions to PyPI.
The ``setup`` method often passes metadata like ``homepage`` to PyPI.
This metadata is displayed on the left side of the download page.
Search for the text "Homepage" under "Project links" to find it. You
should use this page instead of the PyPI page if they differ. You can
also get the homepage on the command-line by running:
.. code-block:: console
$ python setup.py --url
^^^
URL
^^^
If ``pypi`` is set as mentioned above, ``url`` and ``list_url`` will
be automatically set for you. If both ``.tar.gz`` and ``.zip`` versions
are available, ``.tar.gz`` is preferred. If some releases offer both
``.tar.gz`` and ``.zip`` versions, but some only offer ``.zip`` versions,
use ``.zip``.
Some Python packages are closed-source and do not ship ``.tar.gz`` or ``.zip``
files on either PyPI or GitHub. If this is the case, you can still download
and install a Python wheel. For example, ``py-azureml-sdk`` is closed source
and can be downloaded from::
https://pypi.io/packages/py3/a/azureml_sdk/azureml_sdk-1.11.0-py3-none-any.whl
You may see Python-specific or OS-specific URLs. Note that when you add a
``.whl`` URL, you should add ``expand=False`` to ensure that Spack doesn't
try to extract the wheel:
.. code-block:: python
version('1.11.0', sha256='d8c9d24ea90457214d798b0d922489863dad518adde3638e08ef62de28fb183a', expand=False)
.. _pypi-vs-github:
@@ -90,13 +226,11 @@ to create a new package template.
PyPI vs. GitHub
"""""""""""""""
Many packages are hosted on PyPI, but are developed on GitHub or
another version control system hosting service. The source code can
be downloaded from either location, but PyPI is preferred for the
following reasons:
Many packages are hosted on PyPI, but are developed on GitHub or another
version control systems. The tarball can be downloaded from either
location, but PyPI is preferred for the following reasons:
#. PyPI contains the bare minimum number of files needed to install
the package.
#. PyPI contains the bare minimum number of files needed to install the package.
You may notice that the tarball you download from PyPI does not
have the same checksum as the tarball you download from GitHub.
@@ -133,124 +267,252 @@ following reasons:
PyPI is nice because it makes it physically impossible to
re-release the same version of a package with a different checksum.
The only reason to use GitHub instead of PyPI is if PyPI only has
wheels or if the PyPI sdist is missing a file needed to build the
package. If this is the case, please add a comment above the ``url``
explaining this.
Use the :ref:`pypi attribute <pypi>` to facilitate construction of PyPI package
references.
^^^^
PyPI
^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
Since PyPI is so commonly used to host Python libraries, the
``PythonPackage`` base class has a ``pypi`` attribute that can be
set. Once set, ``pypi`` will be used to define the ``homepage``,
``url``, and ``list_url``. For example, the following:
There are a few dependencies common to the ``PythonPackage`` build system.
""""""
Python
""""""
Obviously, every ``PythonPackage`` needs Python at build-time to run
``python setup.py build && python setup.py install``. Python is also
needed at run-time if you want to import the module. Due to backwards
incompatible changes between Python 2 and 3, it is very important to
specify which versions of Python are supported. If the documentation
mentions that Python 3 is required, this can be specified as:
.. code-block:: python
homepage = 'https://pypi.org/project/setuptools/'
url = 'https://pypi.org/packages/source/s/setuptools/setuptools-49.2.0.zip'
list_url = 'https://pypi.org/simple/setuptools/'
depends_on('python@3:', type=('build', 'run'))
is equivalent to:
If Python 2 is required, this would look like:
.. code-block:: python
pypi = 'setuptools/setuptools-49.2.0.zip'
depends_on('python@:2', type=('build', 'run'))
If a package has a different homepage listed on PyPI, you can
override it by setting your own ``homepage``.
^^^^^^^^^^^
Description
^^^^^^^^^^^
The top of the PyPI project page contains a short description of the
package. The "Project description" tab may also contain a longer
description of the package. Either of these can be used to populate
the package docstring.
^^^^^^^^^^^^^
Build backend
^^^^^^^^^^^^^
Once you've determined the basic metadata for a package, the next
step is to determine the build backend. ``PythonPackage`` uses
`pip <https://pip.pypa.io/>`_ to install the package, but pip
requires a backend to actually build the package.
To determine the build backend, look for a ``pyproject.toml`` file.
If there is no ``pyproject.toml`` file and only a ``setup.py`` or
``setup.cfg`` file, you can assume that the project uses
:ref:`setuptools`. If there is a ``pyproject.toml`` file, see if it
contains a ``[build-system]`` section. For example:
.. code-block:: toml
[build-system]
requires = [
"setuptools>=42",
"wheel",
]
build-backend = "setuptools.build_meta"
This section does two things: the ``requires`` key lists build
dependencies of the project, and the ``build-backend`` key defines
the build backend. All of these build dependencies should be added as
dependencies to your package:
If Python 2.7 is the only version that works, you can use:
.. code-block:: python
depends_on('py-setuptools@42:', type='build')
depends_on('python@2.7:2.8', type=('build', 'run'))
Note that ``py-wheel`` is already listed as a build dependency in the
``PythonPackage`` base class, so you don't need to add it unless you
need to specify a specific version requirement or change the
dependency type.
The documentation may not always specify supported Python versions.
Another place to check is in the ``setup.py`` or ``setup.cfg`` file.
Look for a line containing ``python_requires``. An example from
`py-numpy <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/py-numpy/package.py>`_
looks like:
See `PEP 517 <https://www.python.org/dev/peps/pep-0517/>`_ and
`PEP 518 <https://www.python.org/dev/peps/pep-0518/>`_ for more
information on the design of ``pyproject.toml``.
.. code-block:: python
Depending on which build backend a project uses, there are various
places that run-time dependencies can be listed.
python_requires='>=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*'
"""""""""
distutils
"""""""""
Before the introduction of setuptools and other build backends,
Python packages had to rely on the built-in distutils library.
Distutils is missing many of the features that setuptools and other
build backends offer, and users are encouraged to use setuptools
instead. In fact, distutils was deprecated in Python 3.10 and will be
removed in Python 3.12. Because of this, pip actually replaces all
imports of distutils with setuptools. If a package uses distutils,
you should instead add a build dependency on setuptools. Check for a
``requirements.txt`` file that may list dependencies of the project.
You may also find a version check at the top of the ``setup.py``:
.. _setuptools:
.. code-block:: python
if sys.version_info[:2] < (2, 7) or (3, 0) <= sys.version_info[:2] < (3, 4):
raise RuntimeError("Python version 2.7 or >= 3.4 required.")
This can be converted to Spack's spec notation like so:
.. code-block:: python
depends_on('python@2.7:2.8,3.4:', type=('build', 'run'))
If you are writing a recipe for a package that only distributes
wheels, look for a section in the ``METADATA`` file that looks like::
Requires-Python: >=3.5,<4
This would be translated to:
.. code-block:: python
extends('python')
depends_on('python@3.5:3', type=('build', 'run'))
Many ``setup.py`` or ``setup.cfg`` files also contain information like::
Programming Language :: Python :: 2
Programming Language :: Python :: 2.6
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3
Programming Language :: Python :: 3.3
Programming Language :: Python :: 3.4
Programming Language :: Python :: 3.5
Programming Language :: Python :: 3.6
This is a list of versions of Python that the developer likely tests.
However, you should not use this to restrict the versions of Python
the package uses unless one of the two former methods (``python_requires``
or ``sys.version_info``) is used. There is no logic in setuptools
that prevents the package from building for Python versions not in
this list, and often new releases like Python 3.7 or 3.8 work just fine.
""""""""""
setuptools
""""""""""
If the ``pyproject.toml`` lists ``setuptools.build_meta`` as a
``build-backend``, or if the package has a ``setup.py`` that imports
``setuptools``, or if the package has a ``setup.cfg`` file, then it
uses setuptools to build. Setuptools is a replacement for the
distutils library, and has almost the exact same API. Dependencies
can be listed in the ``setup.py`` or ``setup.cfg`` file. Look for the
following arguments:
Originally, the Python language had a single build system called
distutils, which is built into Python. Distutils provided a common
framework for package authors to describe their project and how it
should be built. However, distutils was not without limitations.
Most notably, there was no way to list a project's dependencies
with distutils. Along came setuptools, a non-builtin build system
designed to overcome the limitations of distutils. Both projects
use a similar API, making the transition easy while adding much
needed functionality. Today, setuptools is used in around 90% of
the Python packages in Spack.
Since setuptools isn't built-in to Python, you need to add it as a
dependency. To determine whether or not a package uses setuptools,
search the file for an import statement like:
.. code-block:: python
import setuptools
or:
.. code-block:: python
from setuptools import setup
Some packages are designed to work with both setuptools and distutils,
so you may find something like:
.. code-block:: python
try:
from setuptools import setup
except ImportError:
from distutils.core import setup
This uses setuptools if available, and falls back to distutils if not.
In this case, you would still want to add a setuptools dependency, as
it offers us more control over the installation.
Unless specified otherwise, setuptools is usually a build-only dependency.
That is, it is needed to install the software, but is not needed at
run-time. This can be specified as:
.. code-block:: python
depends_on('py-setuptools', type='build')
"""
pip
"""
Packages distributed as Python wheels will require an extra dependency
on pip:
.. code-block:: python
depends_on('py-pip', type='build')
We will use pip to install the actual wheel.
""""""
cython
""""""
Compared to compiled languages, interpreted languages like Python can
be quite a bit slower. To work around this, some Python developers
rewrite computationally demanding sections of code in C, a process
referred to as "cythonizing". In order to build these packages, you
need to add a build dependency on cython:
.. code-block:: python
depends_on('py-cython', type='build')
Look for references to "cython" in the ``setup.py`` to determine
whether or not this is necessary. Cython may be optional, but
even then you should list it as a required dependency. Spack is
designed to compile software, and is meant for HPC facilities
where speed is crucial. There is no reason why someone would not
want an optimized version of a library instead of the pure-Python
version.
Note that some release tarballs come pre-cythonized, and cython is
not needed as a dependency. However, this is becoming less common
as Python continues to evolve and developers discover that cythonized
sources are no longer compatible with newer versions of Python and
need to be re-cythonized.
^^^^^^^^^^^^^^^^^^^
Python dependencies
^^^^^^^^^^^^^^^^^^^
When you install a package with ``pip``, it reads the ``setup.py``
file in order to determine the dependencies of the package.
If the dependencies are not yet installed, ``pip`` downloads them
and installs them for you. This may sound convenient, but Spack
cannot rely on this behavior for two reasons:
#. Spack needs to be able to install packages on air-gapped networks.
If there is no internet connection, ``pip`` can't download the
package dependencies. By explicitly listing every dependency in
the ``package.py``, Spack knows what to download ahead of time.
#. Duplicate installations of the same dependency may occur.
Spack supports *activation* of Python extensions, which involves
symlinking the package installation prefix to the Python installation
prefix. If your package is missing a dependency, that dependency
will be installed to the installation directory of the same package.
If you try to activate the package + dependency, it may cause a
problem if that package has already been activated.
For these reasons, you must always explicitly list all dependencies.
Although the documentation may list the package's dependencies,
often the developers assume people will use ``pip`` and won't have to
worry about it. Always check the ``setup.py`` to find the true
dependencies.
If the package relies on ``distutils``, it may not explicitly list its
dependencies. Check for statements like:
.. code-block:: python
try:
import numpy
except ImportError:
raise ImportError("numpy must be installed prior to installation")
Obviously, this means that ``py-numpy`` is a dependency.
If the package uses ``setuptools``, check for the following clues:
* ``python_requires``
This specifies the version of Python that is required.
As mentioned above, this specifies which versions of Python are
required.
* ``setup_requires``
@@ -262,88 +524,43 @@ following arguments:
These packages are required for building and installation. You can
add them with ``type=('build', 'run')``.
* ``extras_require``
* ``extra_requires``
These packages are optional dependencies that enable additional
functionality. You should add a variant that optionally adds these
dependencies (see the sketch after this list). This variant should be False by default.
* ``tests_require``
* ``test_requires``
These are packages that are required to run the unit tests for the
package. These dependencies can be specified using the
``type='test'`` dependency type. However, the PyPI tarballs rarely
contain unit tests, so there is usually no reason to add these.
See https://setuptools.pypa.io/en/latest/userguide/dependency_management.html
for more information on how setuptools handles dependency management.
See `PEP 440 <https://www.python.org/dev/peps/pep-0440/#version-specifiers>`_
for documentation on version specifiers in setuptools.
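A minimal sketch of turning one of these optional dependency sets into a variant, as mentioned above (the ``docs`` extra and the ``py-sphinx`` dependency are hypothetical):

.. code-block:: python

   variant('docs', default=False, description='Enable documentation extras')
   depends_on('py-sphinx', type=('build', 'run'), when='+docs')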
In the root directory of the package, you may notice a
``requirements.txt`` file. It may look like this file contains a list
of all of the package's dependencies. Don't be fooled. This file is
used by tools like Travis to install the pre-requisites for the
package... and a whole bunch of other things. It often contains
dependencies only needed for unit tests, like:
""""
flit
""""
* mock
* nose
* pytest
There are actually two possible ``build-backend`` for flit, ``flit``
and ``flit_core``. If you see these in the ``pyproject.toml``, add a
build dependency to your package. With flit, all dependencies are
listed directly in the ``pyproject.toml`` file. Older versions of
flit used to store this info in a ``flit.ini`` file, so check for
this too.
It can also contain dependencies for building the documentation, like
sphinx. If you can't find any information about the package's
dependencies, you can take a look in ``requirements.txt``, but be sure
not to add test or documentation dependencies.
Either of these files may contain keys like:
Newer PEPs have added alternative ways to specify a package's dependencies.
If you don't see any dependencies listed in the ``setup.py``, look for a
``setup.cfg`` or ``pyproject.toml``. These files can be used to store the
same ``install_requires`` information that ``setup.py`` used to use.
* ``requires-python``
If you are writing a recipe for a package that only distributes wheels,
check the ``METADATA`` file for lines like::
This specifies the version of Python that is required
* ``dependencies`` or ``requires``
These packages are required for building and installation. You can
add them with ``type=('build', 'run')``.
* ``project.optional-dependencies`` or ``requires-extra``
This section includes keys with lists of optional dependencies
needed to enable those features. You should add a variant that
optionally adds these dependencies. This variant should be False
by default.
See https://flit.readthedocs.io/en/latest/pyproject_toml.html for
more information.
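A hypothetical ``pyproject.toml`` excerpt showing the keys described above (the listed packages are illustrative):

.. code-block:: toml

   [project]
   requires-python = ">=3.6"
   dependencies = ["requests"]

   [project.optional-dependencies]
   docs = ["sphinx"]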
""""""
poetry
""""""
Like flit, poetry also has two possible ``build-backend``, ``poetry``
and ``poetry_core``. If you see these in the ``pyproject.toml``, add
a build dependency to your package. With poetry, all dependencies are
listed directly in the ``pyproject.toml`` file. Dependencies are
listed in a ``[tool.poetry.dependencies]`` section, and use a
`custom syntax <https://python-poetry.org/docs/dependency-specification/#version-constraints>`_
for specifying the version requirements. Note that ``~=`` works
differently in poetry than in setuptools and flit for versions that
start with a zero.
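A hypothetical excerpt of the corresponding poetry section (the package names and constraints are illustrative):

.. code-block:: toml

   [tool.poetry.dependencies]
   python = ">=3.6,<4.0"
   numpy = "~=1.21.0"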
""""""
wheels
""""""
Some Python packages are closed-source and are distributed as Python
wheels. For example, ``py-azureml-sdk`` downloads a ``.whl`` file. This
file is simply a zip file, and can be extracted using:
.. code-block:: console
$ unzip *.whl
The zip file will not contain a ``setup.py``, but it will contain a
``METADATA`` file which contains all the information you need to
write a ``package.py`` build recipe. Check for lines like::
Requires-Python: >=3.5,<4
Requires-Dist: azureml-core (~=1.11.0)
Requires-Dist: azureml-dataset-runtime[fuse] (~=1.11.0)
Requires-Dist: azureml-train (~=1.11.0)
@@ -355,58 +572,62 @@ write a ``package.py`` build recipe. Check for lines like::
Requires-Dist: azureml-train-automl (~=1.11.0); extra == 'automl'
``Requires-Python`` is equivalent to ``python_requires`` and
``Requires-Dist`` is equivalent to ``install_requires``.
``Provides-Extra`` is used to name optional features (variants) and
a ``Requires-Dist`` with ``extra == 'foo'`` will list any
dependencies needed for that feature.
Lines that use ``Requires-Dist`` are similar to ``install_requires``.
Lines that use ``Provides-Extra`` are similar to ``extra_requires``,
and you can add a variant for those dependencies. The ``~=1.11.0``
syntax is equivalent to ``1.11.0:1.11``.
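Based on these equivalences, a sketch of how the ``METADATA`` lines above might translate into a recipe (the ``py-`` package names and the variant follow Spack conventions but are illustrative):

.. code-block:: python

   depends_on('py-azureml-core@1.11.0:1.11', type=('build', 'run'))
   variant('automl', default=False, description='Enable AutoML support')
   depends_on('py-azureml-train-automl@1.11.0:1.11',
              type=('build', 'run'), when='+automl')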
""""""""""
setuptools
""""""""""
Setuptools is a bit of a special case. If a package requires setuptools
at run-time, how do they express this? They could add it to
``install_requires``, but setuptools is imported long before this and is
needed to read this line. And since you can't install the package
without setuptools, the developers assume that setuptools will already
be there, so they never mention when it is required. We don't want to
add run-time dependencies if they aren't needed, so you need to
determine whether or not setuptools is needed. Grep the installation
directory for any files containing a reference to ``setuptools`` or
``pkg_resources``. Both modules come from ``py-setuptools``.
``pkg_resources`` is particularly common in scripts found in
``prefix/bin``.
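For example, a console sketch of that search (the prefix path is illustrative):

.. code-block:: console

   $ grep -rlE 'setuptools|pkg_resources' /path/to/install/prefix/bin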
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to setup.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The default install phase should be sufficient to install most
packages. However, the installation instructions for a package may
suggest passing certain flags to the ``setup.py`` call. The
``PythonPackage`` class has two techniques for doing this.
The default build and install phases should be sufficient to install
most packages. However, you may want to pass additional flags to
either phase.
""""""""""""""
Global options
""""""""""""""
You can view the available options for a particular phase with:
These flags are added directly after ``setup.py`` when pip runs
``python setup.py install``. For example, the ``py-pyyaml`` package
has an optional dependency on ``libyaml`` that can be enabled like so:
.. code-block:: console
$ python setup.py <phase> --help
Each phase provides a ``<phase_args>`` function that can be used to
pass arguments to that phase. For example,
`py-numpy <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/py-numpy/package.py>`_
adds:
.. code-block:: python
def global_options(self, spec, prefix):
options = []
if '+libyaml' in spec:
options.append('--with-libyaml')
else:
options.append('--without-libyaml')
return options
def build_args(self, spec, prefix):
args = []
# From NumPy 1.10.0 on it's possible to do a parallel build.
if self.version >= Version('1.10.0'):
# But Parallel build in Python 3.5+ is broken. See:
# https://github.com/spack/spack/issues/7927
# https://github.com/scipy/scipy/issues/7112
if spec['python'].version < Version('3.5'):
args = ['-j', str(make_jobs)]
"""""""""""""""
Install options
"""""""""""""""
These flags are added directly after ``install`` when pip runs
``python setup.py install``. For example, the ``py-pyyaml`` package
allows you to specify the directories to search for ``libyaml``:
.. code-block:: python
def install_options(self, spec, prefix):
options = []
if '+libyaml' in spec:
options.extend([
spec['libyaml'].libs.search_flags,
spec['libyaml'].headers.include_flags,
])
return options
return args
^^^^^^^
@@ -448,9 +669,9 @@ a "package" is a directory containing files like:
whereas a "module" is a single Python file.
The ``PythonPackage`` base class automatically detects these package
and module names for you. If, for whatever reason, the module names
detected are wrong, you can provide the names yourself by overriding
The ``PythonPackage`` base class automatically detects these module
names for you. If, for whatever reason, the module names detected
are wrong, you can provide the names yourself by overriding
``import_modules`` like so:
.. code-block:: python
@@ -471,8 +692,10 @@ This can be expressed like so:
@property
def import_modules(self):
modules = ['yaml']
if '+libyaml' in self.spec:
modules.append('yaml.cyaml')
return modules
@@ -490,8 +713,8 @@ Unit tests
""""""""""
The package may have its own unit or regression tests. Spack can
run these tests during the installation by adding test methods after
installation.
run these tests during the installation by adding phase-appropriate
test methods.
For example, ``py-numpy`` adds the following as a check to run
after the ``install`` phase:
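The exact snippet falls outside this hunk; a minimal sketch of such an
install-time check, assuming Spack's ``run_after`` and
``on_package_attributes`` decorators, might look like:

.. code-block:: python

   @run_after('install')
   @on_package_attributes(run_tests=True)
   def install_test(self):
       # run the package's own test suite against the installed copy
       with working_dir('spack-test', create=True):
           python('-c', 'import numpy; numpy.test("full", verbose=2)')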
@@ -517,14 +740,34 @@ when testing is enabled during the installation (i.e., ``spack install
Setup file in a sub-directory
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Many C/C++ libraries provide optional Python bindings in a
subdirectory. To tell pip which directory to build from, you can
override the ``build_directory`` attribute. For example, if a package
provides Python bindings in a ``python`` directory, you can use:
In order to be compatible with package managers like ``pip``, the package
is required to place its ``setup.py`` in the root of the tarball. However,
not every Python package cares about ``pip`` or PyPI. If you are installing
a package that is not hosted on PyPI, you may find that it places its
``setup.py`` in a sub-directory. To handle this, add the directory containing
``setup.py`` to the package like so:
.. code-block:: python
build_directory = 'python'
build_directory = 'source'
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Alternate names for setup.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
As previously mentioned, packages need to name their setup script ``setup.py``
in order to be compatible with package managers like ``pip``. However, some
packages like
`py-meep <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/py-meep/package.py>`_ and
`py-adios <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/py-adios/package.py>`_
come with multiple setup scripts, one for a serial build and another for a
parallel build. You can override the default name to use like so:
.. code-block:: python
def setup_file(self):
return 'setup-mpi.py' if '+mpi' in self.spec else 'setup.py'
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -538,14 +781,10 @@ on Python are not necessarily ``PythonPackage``'s.
Choosing a build system
"""""""""""""""""""""""
First of all, you need to select a build system. ``spack create``
usually does this for you, but if for whatever reason you need to do
this manually, choose ``PythonPackage`` if and only if the package
contains one of the following files:
* ``pyproject.toml``
* ``setup.py``
* ``setup.cfg``
First of all, you need to select a build system. ``spack create`` usually
does this for you, but if for whatever reason you need to do this manually,
choose ``PythonPackage`` if and only if the package contains a ``setup.py``
file.
"""""""""""""""""""""""
Choosing a package name
@@ -618,9 +857,10 @@ having to add that module to ``PYTHONPATH``.
When deciding between ``extends`` and ``depends_on``, the best rule of
thumb is to check the installation prefix. If Python libraries are
installed to ``<prefix>/lib/pythonX.Y/site-packages``, then you
should use ``extends``. If Python libraries are installed elsewhere
or the only files that get installed reside in ``<prefix>/bin``, then
installed to ``prefix/lib/python2.7/site-packages`` (where 2.7 is the
MAJOR.MINOR version of Python you used to install the package), then
you should use ``extends``. If Python libraries are installed elsewhere
or the only files that get installed reside in ``prefix/bin``, then
don't use ``extends``, as symlinking the package wouldn't be useful.
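A hedged sketch of the two cases (the dependency declarations are
illustrative, not from a specific package):

.. code-block:: python

   # installs modules into <prefix>/lib/pythonX.Y/site-packages,
   # so the package extends python
   extends('python')

   # installs only command-line tools into <prefix>/bin, so a plain
   # dependency is enough
   depends_on('python', type=('build', 'run'))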
^^^^^^^^^^^^^^^^^^^^^
@@ -653,17 +893,4 @@ External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on Python packaging, see:
* https://packaging.python.org/
For more information on build and installation frontend tools, see:
* pip: https://pip.pypa.io/
* build: https://pypa-build.readthedocs.io/
* installer: https://installer.readthedocs.io/
For more information on build backend tools, see:
* setuptools: https://setuptools.pypa.io/
* flit: https://flit.readthedocs.io/
* poetry: https://python-poetry.org/
https://packaging.python.org/

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -13,16 +13,12 @@ Spack has many configuration files. Here is a quick list of them, in
case you want to skip directly to specific docs:
* :ref:`compilers.yaml <compiler-config>`
* :ref:`concretizer.yaml <concretizer-options>`
* :ref:`config.yaml <config-yaml>`
* :ref:`mirrors.yaml <mirrors>`
* :ref:`modules.yaml <modules>`
* :ref:`packages.yaml <build-settings>`
* :ref:`repos.yaml <repositories>`
You can also add any of these as inline configuration in ``spack.yaml``
in an :ref:`environment <environment-configuration>`.
-----------
YAML Format
-----------

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -38,7 +38,8 @@ obtained by cloning the corresponding git repository:
.. code-block:: console
$ cd ~/
$ pwd
/home/user
$ mkdir tmp && cd tmp
$ git clone https://github.com/alalazo/spack-scripting.git
Cloning into 'spack-scripting'...
@@ -61,7 +62,7 @@ paths to ``config.yaml``. In the case of our example this means ensuring that:
config:
extensions:
- ~/tmp/spack-scripting
- /home/user/tmp/spack-scripting
is part of your configuration file. Once this is set up, any command that the extension provides
will be available from the command line:

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -271,10 +271,9 @@ Compiler configuration
----------------------
Spack has the ability to build packages with multiple compilers and
compiler versions. Compilers can be made available to Spack by
specifying them manually in ``compilers.yaml``, or automatically by
running ``spack compiler find``, but for convenience Spack will
automatically detect compilers the first time it needs them.
compiler versions. Spack searches for compilers on your machine
automatically the first time it is run. It does this by inspecting
your ``PATH``.
.. _cmd-spack-compilers:
@@ -282,7 +281,7 @@ automatically detect compilers the first time it needs them.
``spack compilers``
^^^^^^^^^^^^^^^^^^^
You can see which compilers are available to Spack by running ``spack
You can see which compilers spack has found by running ``spack
compilers`` or ``spack compiler list``:
.. code-block:: console
@@ -321,10 +320,9 @@ An alias for ``spack compiler find``.
``spack compiler find``
^^^^^^^^^^^^^^^^^^^^^^^
Lists the compilers currently available to Spack. If you do not see
a compiler in this list, but you want to use it with Spack, you can
simply run ``spack compiler find`` with the path to where the
compiler is installed. For example:
If you do not see a compiler in this list, but you want to use it with
Spack, you can simply run ``spack compiler find`` with the path to
where the compiler is installed. For example:
.. code-block:: console

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -615,39 +615,44 @@ modifications to either ``CPATH`` or ``LIBRARY_PATH``.
Autoload dependencies
"""""""""""""""""""""
Often it is required for a module to have its (transitive) dependencies loaded as well.
One example where this is useful is when one package needs to use executables provided
by its dependency; when the dependency is autoloaded, the executable will be in the
PATH. Similarly, for scripting languages such as Python, packages and their dependencies
have to be loaded together.
Autoloading is enabled by default for LMod, as it has great built-in support for it
through the ``depends_on`` function. For Environment Modules it is disabled by default.
Autoloading can also be enabled conditionally:
In some cases it can be useful to have module files that automatically load
their dependencies. This may be the case for Python extensions, if not
activated using ``spack activate``:
.. code-block:: yaml
modules:
default:
tcl:
all:
autoload: none
^python:
autoload: direct
modules:
default:
tcl:
^python:
autoload: 'direct'
The configuration file above will produce module files that load their
direct dependencies if the installed package depends on ``python``.
The allowed values for the ``autoload`` statement are either ``none``,
``direct`` or ``all``.
``direct`` or ``all``. The default is ``none``.
.. tip::
Building external software
Setting ``autoload`` to ``direct`` for all packages can be useful
when building software outside of a Spack installation that depends on
artifacts in that installation. E.g. (adjust ``lmod`` vs ``tcl``
as appropriate):
.. code-block:: yaml
modules:
default:
lmod:
all:
autoload: 'direct'
.. note::
TCL prerequisites
In the ``tcl`` section of the configuration file it is possible to use
the ``prerequisites`` directive that accepts the same values as
``autoload``. It will produce module files that have a ``prereq``
statement, which can be used to autoload dependencies in some versions
of Environment Modules.
statement instead of automatically loading other modules.
------------------------
Maintaining Module Files

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -705,8 +705,7 @@ as follows:
#. The following special strings are considered larger than any other
numeric or non-numeric version component, and satisfy the following
order between themselves:
``develop > main > master > head > trunk > stable``.
order between themselves: ``develop > main > master > head > trunk``.
#. Numbers are ordered numerically, are less than special strings, and
larger than other non-numeric components (see the illustration below).
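A short illustration of these ordering rules, assuming Spack's
``spack.version.Version`` class:

.. code-block:: python

   from spack.version import Version

   # special strings outrank any numeric version
   assert Version('develop') > Version('999.9')

   # numbers outrank other non-numeric components
   assert Version('1.2.3') > Version('1.2.alpha')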
@@ -1442,32 +1441,6 @@ The ``when`` clause follows the same syntax and accepts the same
values as the ``when`` argument of
:py:func:`spack.directives.depends_on`
^^^^^^^^^^^^^^^
Sticky Variants
^^^^^^^^^^^^^^^
The variant directive can be marked as ``sticky`` by setting the
corresponding argument to ``True``:
.. code-block:: python
variant('bar', default=False, sticky=True)
A ``sticky`` variant differs from a regular one in that it is always set
to either:
#. An explicit value appearing in a spec literal or
#. Its default value
The concretizer thus is not free to pick an alternate value to work
around conflicts, but will error out instead.
Setting this property on a variant is useful in cases where the
variant allows some dangerous or controversial options (e.g. using unsupported versions
of a compiler for a library) and the packager wants to ensure that
allowing these options is done on purpose by the user, rather than
automatically by the solver.
^^^^^^^^^^^^^^^^^^^
Overriding Variants
^^^^^^^^^^^^^^^^^^^
@@ -2470,24 +2443,6 @@ Now, the ``py-numpy`` package can be used as an argument to ``spack
activate``. When it is activated, all the files in its prefix will be
symbolically linked into the prefix of the python package.
A package can only extend one other package at a time. To support packages
that may extend one of a list of other packages, Spack supports multiple
``extends`` directives as long as at most one of them is selected as
a dependency during concretization. For example, a lua package could extend
either lua or luajit, but not both:
.. code-block:: python
class LuaLpeg(Package):
...
variant('use_lua', default=True)
extends('lua', when='+use_lua')
extends('lua-luajit', when='~use_lua')
...
Now, a user can install, and activate, the ``lua-lpeg`` package for either
lua or luajit.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Adding additional constraints
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -2877,7 +2832,7 @@ be concretized on their system. For example, one user may prefer packages
built with OpenMPI and the Intel compiler. Another user may prefer
packages be built with MVAPICH and GCC.
See the :ref:`package-preferences` section for more details.
See the :ref:`concretization-preferences` section for more details.
.. _group_when_spec:

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

lib/spack/env/cc
View File

@@ -1,7 +1,7 @@
#!/bin/sh
# shellcheck disable=SC2034 # evals in this script fool shellcheck
#
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -241,28 +241,28 @@ case "$command" in
mode=cpp
debug_flags="-g"
;;
cc|c89|c99|gcc|clang|armclang|icc|icx|pgcc|nvc|xlc|xlc_r|fcc|amdclang)
cc|c89|c99|gcc|clang|armclang|icc|icx|pgcc|nvc|xlc|xlc_r|fcc)
command="$SPACK_CC"
language="C"
comp="CC"
lang_flags=C
debug_flags="-g"
;;
c++|CC|g++|clang++|armclang++|icpc|icpx|dpcpp|pgc++|nvc++|xlc++|xlc++_r|FCC|amdclang++)
c++|CC|g++|clang++|armclang++|icpc|icpx|dpcpp|pgc++|nvc++|xlc++|xlc++_r|FCC)
command="$SPACK_CXX"
language="C++"
comp="CXX"
lang_flags=CXX
debug_flags="-g"
;;
ftn|f90|fc|f95|gfortran|flang|armflang|ifort|ifx|pgfortran|nvfortran|xlf90|xlf90_r|nagfor|frt|amdflang)
ftn|f90|fc|f95|gfortran|flang|armflang|ifort|ifx|pgfortran|nvfortran|xlf90|xlf90_r|nagfor|frt)
command="$SPACK_FC"
language="Fortran 90"
comp="FC"
lang_flags=F
debug_flags="-g"
;;
f77|xlf|xlf_r|pgf77|amdflang)
f77|xlf|xlf_r|pgf77)
command="$SPACK_F77"
language="Fortran 77"
comp="F77"

View File

@@ -1 +0,0 @@
../cc

View File

@@ -1 +0,0 @@
../cpp

View File

@@ -1 +0,0 @@
../fc

View File

@@ -1,4 +1,4 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -6,13 +6,6 @@
"""This module contains the following external, potentially separately
licensed, packages that are included in Spack:
altgraph
--------
* Homepage: https://altgraph.readthedocs.io/en/latest/index.html
* Usage: dependency of macholib
* Version: 0.17.2
archspec
--------
@@ -31,22 +24,6 @@
vendored copy ever needs to be updated again:
https://github.com/spack/spack/pull/6786/commits/dfcef577b77249106ea4e4c69a6cd9e64fa6c418
astunparse
----------------
* Homepage: https://github.com/simonpercivall/astunparse
* Usage: Unparsing Python ASTs for package hashes in Spack
* Version: 1.6.3 (plus modifications)
* Note: This is in ``spack.util.unparse`` because it's very heavily
modified, and we want to track coverage for it.
Specifically, we have modified this library to generate consistent unparsed ASTs
regardless of the Python version. It is based on:
1. The original ``astunparse`` library;
2. Modifications for consistency;
3. Backports from the ``ast.unparse`` function in Python 3.9 and later
The unparsing is now mostly consistent with upstream ``ast.unparse``, so if
we ever require Python 3.9 or higher, we can drop this external package.
attrs
----------------
@@ -91,13 +68,6 @@
* Version: 3.2.0 (last version before 2.7 and 3.6 support was dropped)
* Note: We don't include tests or benchmarks; just what Spack needs.
macholib
--------
* Homepage: https://macholib.readthedocs.io/en/latest/index.html#
* Usage: Manipulation of Mach-o binaries for relocating macOS buildcaches on Linux
* Version: 1.15.2
markupsafe
----------
@@ -154,4 +124,18 @@
* Usage: Python 2 and 3 compatibility utilities.
* Version: 1.16.0
macholib
--------
* Homepage: https://macholib.readthedocs.io/en/latest/index.html#
* Usage: Manipulation of Mach-o binaries for relocating macOS buildcaches on Linux
* Version: 1.12
altgraph
--------
* Homepage: https://altgraph.readthedocs.io/en/latest/index.html
* Usage: dependency of macholib
* Version: 0.16.1
"""

View File

@@ -1,4 +1,4 @@
"""
'''
altgraph.Dot - Interface to the dot language
============================================
@@ -107,7 +107,7 @@
- for more details on how to control the graph drawing process see the
`graphviz reference
<http://www.research.att.com/sw/tools/graphviz/refs.html>`_.
"""
'''
import os
import warnings
@@ -115,34 +115,25 @@
class Dot(object):
"""
'''
A class providing a **graphviz** (dot language) representation
allowing a fine grained control over how the graph is being
displayed.
If the :command:`dot` and :command:`dotty` programs are not in the current
system path, their location needs to be specified in the constructor.
"""
'''
def __init__(
self,
graph=None,
nodes=None,
edgefn=None,
nodevisitor=None,
edgevisitor=None,
name="G",
dot="dot",
dotty="dotty",
neato="neato",
graphtype="digraph",
):
"""
self, graph=None, nodes=None, edgefn=None, nodevisitor=None,
edgevisitor=None, name="G", dot='dot', dotty='dotty',
neato='neato', graphtype="digraph"):
'''
Initialization.
"""
'''
self.name, self.attr = name, {}
assert graphtype in ["graph", "digraph"]
assert graphtype in ['graph', 'digraph']
self.type = graphtype
self.temp_dot = "tmp_dot.dot"
@@ -157,10 +148,8 @@ def __init__(
if graph is not None and nodes is None:
nodes = graph
if graph is not None and edgefn is None:
def edgefn(node, graph=graph):
return graph.out_nbrs(node)
if nodes is None:
nodes = ()
@@ -188,19 +177,20 @@ def edgefn(node, graph=graph):
self.edge_style(head, tail, **edgestyle)
def style(self, **attr):
"""
'''
Changes the overall style
"""
'''
self.attr = attr
def display(self, mode="dot"):
"""
def display(self, mode='dot'):
'''
Displays the current graph via dotty
"""
'''
if mode == "neato":
if mode == 'neato':
self.save_dot(self.temp_neo)
neato_cmd = "%s -o %s %s" % (self.neato, self.temp_dot, self.temp_neo)
neato_cmd = "%s -o %s %s" % (
self.neato, self.temp_dot, self.temp_neo)
os.system(neato_cmd)
else:
self.save_dot(self.temp_dot)
@@ -209,24 +199,24 @@ def display(self, mode="dot"):
os.system(plot_cmd)
def node_style(self, node, **kwargs):
"""
'''
Modifies a node style to the dot representation.
"""
'''
if node not in self.edges:
self.edges[node] = {}
self.nodes[node] = kwargs
def all_node_style(self, **kwargs):
"""
'''
Modifies all node styles
"""
'''
for node in self.nodes:
self.node_style(node, **kwargs)
def edge_style(self, head, tail, **kwargs):
"""
'''
Modifies an edge style to the dot representation.
"""
'''
if tail not in self.nodes:
raise GraphError("invalid node %s" % (tail,))
@@ -239,10 +229,10 @@ def edge_style(self, head, tail, **kwargs):
def iterdot(self):
# write graph title
if self.type == "digraph":
yield "digraph %s {\n" % (self.name,)
elif self.type == "graph":
yield "graph %s {\n" % (self.name,)
if self.type == 'digraph':
yield 'digraph %s {\n' % (self.name,)
elif self.type == 'graph':
yield 'graph %s {\n' % (self.name,)
else:
raise GraphError("unsupported graphtype %s" % (self.type,))
@@ -250,11 +240,11 @@ def iterdot(self):
# write overall graph attributes
for attr_name, attr_value in sorted(self.attr.items()):
yield '%s="%s";' % (attr_name, attr_value)
yield "\n"
yield '\n'
# some reusable patterns
cpatt = '%s="%s",' # to separate attributes
epatt = "];\n" # to end attributes
cpatt = '%s="%s",' # to separate attributes
epatt = '];\n' # to end attributes
# write node attributes
for node_name, node_attr in sorted(self.nodes.items()):
@@ -266,24 +256,25 @@ def iterdot(self):
# write edge attributes
for head in sorted(self.edges):
for tail in sorted(self.edges[head]):
if self.type == "digraph":
if self.type == 'digraph':
yield '\t"%s" -> "%s" [' % (head, tail)
else:
yield '\t"%s" -- "%s" [' % (head, tail)
for attr_name, attr_value in sorted(self.edges[head][tail].items()):
for attr_name, attr_value in \
sorted(self.edges[head][tail].items()):
yield cpatt % (attr_name, attr_value)
yield epatt
# finish file
yield "}\n"
yield '}\n'
def __iter__(self):
return self.iterdot()
def save_dot(self, file_name=None):
"""
'''
Saves the current graph representation into a file
"""
'''
if not file_name:
warnings.warn("always pass a file_name", DeprecationWarning)
@@ -293,18 +284,19 @@ def save_dot(self, file_name=None):
for chunk in self.iterdot():
fp.write(chunk)
def save_img(self, file_name=None, file_type="gif", mode="dot"):
"""
def save_img(self, file_name=None, file_type="gif", mode='dot'):
'''
Saves the dot file as an image file
"""
'''
if not file_name:
warnings.warn("always pass a file_name", DeprecationWarning)
file_name = "out"
if mode == "neato":
if mode == 'neato':
self.save_dot(self.temp_neo)
neato_cmd = "%s -o %s %s" % (self.neato, self.temp_dot, self.temp_neo)
neato_cmd = "%s -o %s %s" % (
self.neato, self.temp_dot, self.temp_neo)
os.system(neato_cmd)
plot_cmd = self.dot
else:
@@ -313,9 +305,5 @@ def save_img(self, file_name=None, file_type="gif", mode="dot"):
file_name = "%s.%s" % (file_name, file_type)
create_cmd = "%s -T%s %s -o %s" % (
plot_cmd,
file_type,
self.temp_dot,
file_name,
)
plot_cmd, file_type, self.temp_dot, file_name)
os.system(create_cmd)

View File

@@ -13,9 +13,8 @@
#--Nathan Denny, May 27, 1999
"""
from collections import deque
from altgraph import GraphError
from collections import deque
class Graph(object):
@@ -59,10 +58,8 @@ def __init__(self, edges=None):
raise GraphError("Cannot create edge from %s" % (item,))
def __repr__(self):
return "<Graph: %d nodes, %d edges>" % (
self.number_of_nodes(),
self.number_of_edges(),
)
return '<Graph: %d nodes, %d edges>' % (
self.number_of_nodes(), self.number_of_edges())
def add_node(self, node, node_data=None):
"""
@@ -114,7 +111,7 @@ def add_edge(self, head_id, tail_id, edge_data=1, create_nodes=True):
self.nodes[tail_id][0].append(edge)
self.nodes[head_id][1].append(edge)
except KeyError:
raise GraphError("Invalid nodes %s -> %s" % (head_id, tail_id))
raise GraphError('Invalid nodes %s -> %s' % (head_id, tail_id))
# store edge information
self.edges[edge] = (head_id, tail_id, edge_data)
@@ -127,12 +124,13 @@ def hide_edge(self, edge):
time.
"""
try:
head_id, tail_id, edge_data = self.hidden_edges[edge] = self.edges[edge]
head_id, tail_id, edge_data = \
self.hidden_edges[edge] = self.edges[edge]
self.nodes[tail_id][0].remove(edge)
self.nodes[head_id][1].remove(edge)
del self.edges[edge]
except KeyError:
raise GraphError("Invalid edge %s" % edge)
raise GraphError('Invalid edge %s' % edge)
def hide_node(self, node):
"""
@@ -146,7 +144,7 @@ def hide_node(self, node):
self.hide_edge(edge)
del self.nodes[node]
except KeyError:
raise GraphError("Invalid node %s" % node)
raise GraphError('Invalid node %s' % node)
def restore_node(self, node):
"""
@@ -159,7 +157,7 @@ def restore_node(self, node):
self.restore_edge(edge)
del self.hidden_nodes[node]
except KeyError:
raise GraphError("Invalid node %s" % node)
raise GraphError('Invalid node %s' % node)
def restore_edge(self, edge):
"""
@@ -172,7 +170,7 @@ def restore_edge(self, edge):
self.edges[edge] = head_id, tail_id, data
del self.hidden_edges[edge]
except KeyError:
raise GraphError("Invalid edge %s" % edge)
raise GraphError('Invalid edge %s' % edge)
def restore_all_edges(self):
"""
@@ -205,7 +203,7 @@ def edge_by_id(self, edge):
head, tail, data = self.edges[edge]
except KeyError:
head, tail = None, None
raise GraphError("Invalid edge %s" % edge)
raise GraphError('Invalid edge %s' % edge)
return (head, tail)
@@ -341,7 +339,7 @@ def out_edges(self, node):
try:
return list(self.nodes[node][1])
except KeyError:
raise GraphError("Invalid node %s" % node)
raise GraphError('Invalid node %s' % node)
def inc_edges(self, node):
"""
@@ -350,7 +348,7 @@ def inc_edges(self, node):
try:
return list(self.nodes[node][0])
except KeyError:
raise GraphError("Invalid node %s" % node)
raise GraphError('Invalid node %s' % node)
def all_edges(self, node):
"""
@@ -490,7 +488,7 @@ def iterdfs(self, start, end=None, forward=True):
The forward parameter specifies whether it is a forward or backward
traversal.
"""
visited, stack = {start}, deque([start])
visited, stack = set([start]), deque([start])
if forward:
get_edges = self.out_edges
@@ -517,7 +515,7 @@ def iterdata(self, start, end=None, forward=True, condition=None):
condition callback is only called when node_data is not None.
"""
visited, stack = {start}, deque([start])
visited, stack = set([start]), deque([start])
if forward:
get_edges = self.out_edges
@@ -549,7 +547,7 @@ def _iterbfs(self, start, end=None, forward=True):
traversal. Returns a list of tuples where the first value is the hop
value the second value is the node id.
"""
queue, visited = deque([(start, 0)]), {start}
queue, visited = deque([(start, 0)]), set([start])
# the direction of the bfs depends on the edges that are sampled
if forward:

View File

@@ -1,7 +1,7 @@
"""
'''
altgraph.GraphAlgo - Graph algorithms
=====================================
"""
'''
from altgraph import GraphError
@@ -28,9 +28,9 @@ def dijkstra(graph, start, end=None):
Adapted to altgraph by Istvan Albert, Pennsylvania State University -
June, 9 2004
"""
D = {} # dictionary of final distances
P = {} # dictionary of predecessors
Q = _priorityDictionary() # estimated distances of non-final vertices
D = {} # dictionary of final distances
P = {} # dictionary of predecessors
Q = _priorityDictionary() # estimated distances of non-final vertices
Q[start] = 0
for v in Q:
@@ -44,8 +44,7 @@ def dijkstra(graph, start, end=None):
if w in D:
if vwLength < D[w]:
raise GraphError(
"Dijkstra: found better path to already-final vertex"
)
"Dijkstra: found better path to already-final vertex")
elif w not in Q or vwLength < Q[w]:
Q[w] = vwLength
P[w] = v
@@ -77,7 +76,7 @@ def shortest_path(graph, start, end):
# Utility classes and functions
#
class _priorityDictionary(dict):
"""
'''
Priority dictionary using binary heaps (internal use only)
David Eppstein, UC Irvine, 8 Mar 2002
@@ -93,22 +92,22 @@ class _priorityDictionary(dict):
order. Each item is not removed until the next item is requested,
so D[x] will still return a useful value until the next iteration
of the for-loop. Each operation takes logarithmic amortized time.
"""
'''
def __init__(self):
"""
'''
Initialize priorityDictionary by creating binary heap of pairs
(value,key). Note that changing or removing a dict entry will not
remove the old pair from the heap until it is found by smallest()
or until the heap is rebuilt.
"""
'''
self.__heap = []
dict.__init__(self)
def smallest(self):
"""
'''
Find smallest item after removing deleted items from front of heap.
"""
'''
if len(self) == 0:
raise IndexError("smallest of empty priorityDictionary")
heap = self.__heap
@@ -116,11 +115,9 @@ def smallest(self):
lastItem = heap.pop()
insertionPoint = 0
while 1:
smallChild = 2 * insertionPoint + 1
if (
smallChild + 1 < len(heap)
and heap[smallChild] > heap[smallChild + 1]
):
smallChild = 2*insertionPoint+1
if smallChild+1 < len(heap) and \
heap[smallChild] > heap[smallChild+1]:
smallChild += 1
if smallChild >= len(heap) or lastItem <= heap[smallChild]:
heap[insertionPoint] = lastItem
@@ -130,24 +127,22 @@ def smallest(self):
return heap[0][1]
def __iter__(self):
"""
'''
Create destructive sorted iterator of priorityDictionary.
"""
'''
def iterfn():
while len(self) > 0:
x = self.smallest()
yield x
del self[x]
return iterfn()
def __setitem__(self, key, val):
"""
'''
Change value stored in dictionary and add corresponding pair to heap.
Rebuilds the heap if the number of deleted items gets large, to avoid
memory leakage.
"""
'''
dict.__setitem__(self, key, val)
heap = self.__heap
if len(heap) > 2 * len(self):
@@ -157,15 +152,15 @@ def __setitem__(self, key, val):
newPair = (val, key)
insertionPoint = len(heap)
heap.append(None)
while insertionPoint > 0 and newPair < heap[(insertionPoint - 1) // 2]:
heap[insertionPoint] = heap[(insertionPoint - 1) // 2]
insertionPoint = (insertionPoint - 1) // 2
while insertionPoint > 0 and newPair < heap[(insertionPoint-1)//2]:
heap[insertionPoint] = heap[(insertionPoint-1)//2]
insertionPoint = (insertionPoint-1)//2
heap[insertionPoint] = newPair
def setdefault(self, key, val):
"""
'''
Reimplement setdefault to pass through our customized __setitem__.
"""
'''
if key not in self:
self[key] = val
return self[key]

View File

@@ -1,11 +1,11 @@
"""
'''
altgraph.GraphStat - Functions providing various graph statistics
=================================================================
"""
'''
def degree_dist(graph, limits=(0, 0), bin_num=10, mode="out"):
"""
def degree_dist(graph, limits=(0, 0), bin_num=10, mode='out'):
'''
Computes the degree distribution for a graph.
Returns a list of tuples where the first element of the tuple is the
@@ -15,10 +15,10 @@ def degree_dist(graph, limits=(0, 0), bin_num=10, mode="out"):
Example::
....
"""
'''
deg = []
if mode == "inc":
if mode == 'inc':
get_deg = graph.inc_degree
else:
get_deg = graph.out_degree
@@ -34,38 +34,38 @@ def degree_dist(graph, limits=(0, 0), bin_num=10, mode="out"):
return results
_EPS = 1.0 / (2.0 ** 32)
_EPS = 1.0/(2.0**32)
def _binning(values, limits=(0, 0), bin_num=10):
"""
'''
Bins data that falls between certain limits, if the limits are (0, 0) the
minimum and maximum values are used.
Returns a list of tuples where the first element of the tuple is the
center of the bin and the second element of the tuple are the counts.
"""
'''
if limits == (0, 0):
min_val, max_val = min(values) - _EPS, max(values) + _EPS
else:
min_val, max_val = limits
# get bin size
bin_size = (max_val - min_val) / float(bin_num)
bin_size = (max_val - min_val)/float(bin_num)
bins = [0] * (bin_num)
# will ignore these outliers for now
for value in values:
try:
if (value - min_val) >= 0:
index = int((value - min_val) / float(bin_size))
index = int((value - min_val)/float(bin_size))
bins[index] += 1
except IndexError:
pass
# make it ready for an x,y plot
result = []
center = (bin_size / 2) + min_val
center = (bin_size/2) + min_val
for i, y in enumerate(bins):
x = center + bin_size * i
result.append((x, y))

View File

@@ -1,29 +1,31 @@
"""
'''
altgraph.GraphUtil - Utility classes and functions
==================================================
"""
'''
import random
from collections import deque
from altgraph import Graph, GraphError
from altgraph import Graph
from altgraph import GraphError
def generate_random_graph(node_num, edge_num, self_loops=False, multi_edges=False):
"""
def generate_random_graph(
node_num, edge_num, self_loops=False, multi_edges=False):
'''
Generates and returns a :py:class:`~altgraph.Graph.Graph` instance with
*node_num* nodes randomly connected by *edge_num* edges.
"""
'''
g = Graph.Graph()
if not multi_edges:
if self_loops:
max_edges = node_num * node_num
else:
max_edges = node_num * (node_num - 1)
max_edges = node_num * (node_num-1)
if edge_num > max_edges:
raise GraphError("inconsistent arguments to 'generate_random_graph'")
raise GraphError(
"inconsistent arguments to 'generate_random_graph'")
nodes = range(node_num)
@@ -50,16 +52,17 @@ def generate_random_graph(node_num, edge_num, self_loops=False, multi_edges=Fals
return g
def generate_scale_free_graph(steps, growth_num, self_loops=False, multi_edges=False):
"""
def generate_scale_free_graph(
steps, growth_num, self_loops=False, multi_edges=False):
'''
Generates and returns a :py:class:`~altgraph.Graph.Graph` instance that
will have *steps* \\* *growth_num* nodes and a scale free (powerlaw)
will have *steps* \* *growth_num* nodes and a scale free (powerlaw)
connectivity. Starting with a fully connected graph with *growth_num*
nodes at every step *growth_num* nodes are added to the graph and are
connected to existing nodes with a probability proportional to the degree
of these existing nodes.
"""
# The code doesn't seem to do what the documentation claims.
'''
# FIXME: The code doesn't seem to do what the documentation claims.
graph = Graph.Graph()
# initialize the graph
@@ -110,7 +113,7 @@ def filter_stack(graph, head, filters):
in *removes*.
"""
visited, removes, orphans = {head}, set(), set()
visited, removes, orphans = set([head]), set(), set()
stack = deque([(head, head)])
get_data = graph.node_data
get_edges = graph.out_edges
@@ -134,6 +137,8 @@ def filter_stack(graph, head, filters):
visited.add(tail)
stack.append((last_good, tail))
orphans = [(lg, tl) for (lg, tl) in orphans if tl not in removes]
orphans = [
(lg, tl)
for (lg, tl) in orphans if tl not in removes]
return visited, removes, orphans

View File

@@ -27,7 +27,7 @@ def __init__(self, graph=None, debug=0):
graph.add_node(self, None)
def __repr__(self):
return "<%s>" % (type(self).__name__,)
return '<%s>' % (type(self).__name__,)
def flatten(self, condition=None, start=None):
"""
@@ -58,7 +58,6 @@ def iter_edges(lst, n):
if ident not in seen:
yield self.findNode(ident)
seen.add(ident)
return iter_edges(outraw, 3), iter_edges(incraw, 2)
def edgeData(self, fromNode, toNode):
@@ -88,12 +87,12 @@ def filterStack(self, filters):
visited, removes, orphans = filter_stack(self.graph, self, filters)
for last_good, tail in orphans:
self.graph.add_edge(last_good, tail, edge_data="orphan")
self.graph.add_edge(last_good, tail, edge_data='orphan')
for node in removes:
self.graph.hide_node(node)
return len(visited) - 1, len(removes), len(orphans)
return len(visited)-1, len(removes), len(orphans)
def removeNode(self, node):
"""
@@ -136,7 +135,7 @@ def getRawIdent(self, node):
"""
if node is self:
return node
ident = getattr(node, "graphident", None)
ident = getattr(node, 'graphident', None)
return ident
def __contains__(self, node):
@@ -193,7 +192,8 @@ def msg(self, level, s, *args):
Print a debug message with the given level
"""
if s and level <= self.debug:
print("%s%s %s" % (" " * self.indent, s, " ".join(map(repr, args))))
print("%s%s %s" % (
" " * self.indent, s, ' '.join(map(repr, args))))
def msgin(self, level, s, *args):
"""

View File

@@ -1,4 +1,4 @@
"""
'''
altgraph - a python graph library
=================================
@@ -138,11 +138,13 @@
@newfield contributor: Contributors:
@contributor: U{Reka Albert <http://www.phys.psu.edu/~ralbert/>}
"""
import pkg_resources
__version__ = pkg_resources.require("altgraph")[0].version
'''
# import pkg_resources
# __version__ = pkg_resources.require('altgraph')[0].version
# pkg_resources is not finding the altgraph import despite the fact that it is in sys.path
# there is no .dist-info or .egg-info for pkg_resources to query the version from
# so it must be set manually
__version__ = '0.16.1'
class GraphError(ValueError):
pass

View File

@@ -3,43 +3,21 @@
"""
from __future__ import print_function
import os
import struct
import sys
import struct
import os
from macholib.util import fileview
from .mach_o import (
FAT_MAGIC,
FAT_MAGIC_64,
LC_DYSYMTAB,
LC_ID_DYLIB,
LC_LOAD_DYLIB,
LC_LOAD_UPWARD_DYLIB,
LC_LOAD_WEAK_DYLIB,
LC_PREBOUND_DYLIB,
LC_REEXPORT_DYLIB,
LC_REGISTRY,
LC_SEGMENT,
LC_SEGMENT_64,
LC_SYMTAB,
MH_CIGAM,
MH_CIGAM_64,
MH_FILETYPE_SHORTNAMES,
MH_MAGIC,
MH_MAGIC_64,
S_ZEROFILL,
fat_arch,
fat_arch64,
fat_header,
load_command,
mach_header,
mach_header_64,
section,
section_64,
)
from .mach_o import MH_FILETYPE_SHORTNAMES, LC_DYSYMTAB, LC_SYMTAB
from .mach_o import load_command, S_ZEROFILL, section_64, section
from .mach_o import LC_REGISTRY, LC_ID_DYLIB, LC_SEGMENT, fat_header
from .mach_o import LC_SEGMENT_64, MH_CIGAM_64, MH_MAGIC_64, FAT_MAGIC
from .mach_o import mach_header, fat_arch64, FAT_MAGIC_64, fat_arch
from .mach_o import LC_REEXPORT_DYLIB, LC_PREBOUND_DYLIB, LC_LOAD_WEAK_DYLIB
from .mach_o import LC_LOAD_UPWARD_DYLIB, LC_LOAD_DYLIB, mach_header_64
from .mach_o import MH_CIGAM, MH_MAGIC
from .ptypes import sizeof
from macholib.util import fileview
try:
from macholib.compat import bytes
except ImportError:
@@ -53,23 +31,23 @@
if sys.version_info[0] == 2:
range = xrange # noqa: F821
__all__ = ["MachO"]
__all__ = ['MachO']
_RELOCATABLE = {
_RELOCATABLE = set((
# relocatable commands that should be used for dependency walking
LC_LOAD_DYLIB,
LC_LOAD_UPWARD_DYLIB,
LC_LOAD_WEAK_DYLIB,
LC_PREBOUND_DYLIB,
LC_REEXPORT_DYLIB,
}
))
_RELOCATABLE_NAMES = {
LC_LOAD_DYLIB: "load_dylib",
LC_LOAD_UPWARD_DYLIB: "load_upward_dylib",
LC_LOAD_WEAK_DYLIB: "load_weak_dylib",
LC_PREBOUND_DYLIB: "prebound_dylib",
LC_REEXPORT_DYLIB: "reexport_dylib",
LC_LOAD_DYLIB: 'load_dylib',
LC_LOAD_UPWARD_DYLIB: 'load_upward_dylib',
LC_LOAD_WEAK_DYLIB: 'load_weak_dylib',
LC_PREBOUND_DYLIB: 'prebound_dylib',
LC_REEXPORT_DYLIB: 'reexport_dylib',
}
@@ -87,14 +65,13 @@ def lc_str_value(offset, cmd_info):
cmd_load, cmd_cmd, cmd_data = cmd_info
offset -= sizeof(cmd_load) + sizeof(cmd_cmd)
return cmd_data[offset:].strip(b"\x00")
return cmd_data[offset:].strip(b'\x00')
class MachO(object):
"""
Provides reading/writing the Mach-O header of a specific existing file
"""
# filename - the original filename of this mach-o
# sizediff - the current deviation from the initial mach-o size
# header - the mach-o header
@@ -114,7 +91,7 @@ def __init__(self, filename):
# initialized by load
self.fat = None
self.headers = []
with open(filename, "rb") as fp:
with open(filename, 'rb') as fp:
self.load(fp)
def __repr__(self):
@@ -122,7 +99,7 @@ def __repr__(self):
def load(self, fh):
assert fh.tell() == 0
header = struct.unpack(">I", fh.read(4))[0]
header = struct.unpack('>I', fh.read(4))[0]
fh.seek(0)
if header in (FAT_MAGIC, FAT_MAGIC_64):
self.load_fat(fh)
@@ -135,9 +112,11 @@ def load(self, fh):
def load_fat(self, fh):
self.fat = fat_header.from_fileobj(fh)
if self.fat.magic == FAT_MAGIC:
archs = [fat_arch.from_fileobj(fh) for i in range(self.fat.nfat_arch)]
archs = [fat_arch.from_fileobj(fh)
for i in range(self.fat.nfat_arch)]
elif self.fat.magic == FAT_MAGIC_64:
archs = [fat_arch64.from_fileobj(fh) for i in range(self.fat.nfat_arch)]
archs = [fat_arch64.from_fileobj(fh)
for i in range(self.fat.nfat_arch)]
else:
raise ValueError("Unknown fat header magic: %r" % (self.fat.magic))
@@ -153,18 +132,19 @@ def rewriteLoadCommands(self, *args, **kw):
def load_header(self, fh, offset, size):
fh.seek(offset)
header = struct.unpack(">I", fh.read(4))[0]
header = struct.unpack('>I', fh.read(4))[0]
fh.seek(offset)
if header == MH_MAGIC:
magic, hdr, endian = MH_MAGIC, mach_header, ">"
magic, hdr, endian = MH_MAGIC, mach_header, '>'
elif header == MH_CIGAM:
magic, hdr, endian = MH_CIGAM, mach_header, "<"
magic, hdr, endian = MH_CIGAM, mach_header, '<'
elif header == MH_MAGIC_64:
magic, hdr, endian = MH_MAGIC_64, mach_header_64, ">"
magic, hdr, endian = MH_MAGIC_64, mach_header_64, '>'
elif header == MH_CIGAM_64:
magic, hdr, endian = MH_CIGAM_64, mach_header_64, "<"
magic, hdr, endian = MH_CIGAM_64, mach_header_64, '<'
else:
raise ValueError("Unknown Mach-O header: 0x%08x in %r" % (header, fh))
raise ValueError("Unknown Mach-O header: 0x%08x in %r" % (
header, fh))
hdr = MachOHeader(self, fh, offset, size, magic, hdr, endian)
self.headers.append(hdr)
@@ -177,7 +157,6 @@ class MachOHeader(object):
"""
Provides reading/writing the Mach-O header of a specific existing file
"""
# filename - the original filename of this mach-o
# sizediff - the current deviation from the initial mach-o size
# header - the mach-o header
@@ -210,19 +189,15 @@ def __init__(self, parent, fh, offset, size, magic, hdr, endian):
def __repr__(self):
return "<%s filename=%r offset=%d size=%d endian=%r>" % (
type(self).__name__,
self.parent.filename,
self.offset,
self.size,
self.endian,
)
type(self).__name__, self.parent.filename, self.offset, self.size,
self.endian)
def load(self, fh):
fh = fileview(fh, self.offset, self.size)
fh.seek(0)
self.sizediff = 0
kw = {"_endian_": self.endian}
kw = {'_endian_': self.endian}
header = self.mach_header.from_fileobj(fh, **kw)
self.header = header
# if header.magic != self.MH_MAGIC:
@@ -261,9 +236,8 @@ def load(self, fh):
section_cls = section_64
expected_size = (
sizeof(klass)
+ sizeof(load_command)
+ (sizeof(section_cls) * cmd_cmd.nsects)
sizeof(klass) + sizeof(load_command) +
(sizeof(section_cls) * cmd_cmd.nsects)
)
if cmd_load.cmdsize != expected_size:
raise ValueError("Segment size mismatch")
@@ -274,12 +248,12 @@ def load(self, fh):
low_offset = min(low_offset, cmd_cmd.fileoff)
else:
# this one has multiple segments
for _j in range(cmd_cmd.nsects):
for j in range(cmd_cmd.nsects):
# read the segment
seg = section_cls.from_fileobj(fh, **kw)
# if the segment has a size and is not zero filled
# then its beginning is the offset of this segment
not_zerofill = (seg.flags & S_ZEROFILL) != S_ZEROFILL
not_zerofill = ((seg.flags & S_ZEROFILL) != S_ZEROFILL)
if seg.offset > 0 and seg.size > 0 and not_zerofill:
low_offset = min(low_offset, seg.offset)
if not_zerofill:
@@ -292,7 +266,7 @@ def load(self, fh):
# data is a list of segments
cmd_data = segs
# These are disabled for now because writing back doesn't work
# XXX: Disabled for now because writing back doesn't work
# elif cmd_load.cmd == LC_CODE_SIGNATURE:
# c = fh.tell()
# fh.seek(cmd_cmd.dataoff)
@@ -306,17 +280,17 @@ def load(self, fh):
else:
# data is a raw str
data_size = cmd_load.cmdsize - sizeof(klass) - sizeof(load_command)
data_size = (
cmd_load.cmdsize - sizeof(klass) - sizeof(load_command)
)
cmd_data = fh.read(data_size)
cmd.append((cmd_load, cmd_cmd, cmd_data))
read_bytes += cmd_load.cmdsize
# make sure the header made sense
if read_bytes != header.sizeofcmds:
raise ValueError(
"Read %d bytes, header reports %d bytes"
% (read_bytes, header.sizeofcmds)
)
raise ValueError("Read %d bytes, header reports %d bytes" % (
read_bytes, header.sizeofcmds))
self.total_size = sizeof(self.mach_header) + read_bytes
self.low_offset = low_offset
@@ -329,9 +303,8 @@ def walkRelocatables(self, shouldRelocateCommand=_shouldRelocateCommand):
if shouldRelocateCommand(lc.cmd):
name = _RELOCATABLE_NAMES[lc.cmd]
ofs = cmd.name - sizeof(lc.__class__) - sizeof(cmd.__class__)
yield idx, name, data[
ofs : data.find(b"\x00", ofs) # noqa: E203
].decode(sys.getfilesystemencoding())
yield idx, name, data[ofs:data.find(b'\x00', ofs)].decode(
sys.getfilesystemencoding())
def rewriteInstallNameCommand(self, loadcmd):
"""Rewrite the load command of this dylib"""
@@ -344,9 +317,8 @@ def changedHeaderSizeBy(self, bytes):
self.sizediff += bytes
if (self.total_size + self.sizediff) > self.low_offset:
print(
"WARNING: Mach-O header in %r may be too large to relocate"
% (self.parent.filename,)
)
"WARNING: Mach-O header in %r may be too large to relocate" % (
self.parent.filename,))
def rewriteLoadCommands(self, changefunc):
"""
@@ -355,22 +327,22 @@ def rewriteLoadCommands(self, changefunc):
data = changefunc(self.parent.filename)
changed = False
if data is not None:
if self.rewriteInstallNameCommand(data.encode(sys.getfilesystemencoding())):
if self.rewriteInstallNameCommand(
data.encode(sys.getfilesystemencoding())):
changed = True
for idx, _name, filename in self.walkRelocatables():
for idx, name, filename in self.walkRelocatables():
data = changefunc(filename)
if data is not None:
if self.rewriteDataForCommand(
idx, data.encode(sys.getfilesystemencoding())
):
if self.rewriteDataForCommand(idx, data.encode(
sys.getfilesystemencoding())):
changed = True
return changed
def rewriteDataForCommand(self, idx, data):
lc, cmd, old_data = self.commands[idx]
hdrsize = sizeof(lc.__class__) + sizeof(cmd.__class__)
align = struct.calcsize("Q")
data = data + (b"\x00" * (align - (len(data) % align)))
align = struct.calcsize('Q')
data = data + (b'\x00' * (align - (len(data) % align)))
newsize = hdrsize + len(data)
self.commands[idx] = (lc, cmd, data)
self.changedHeaderSizeBy(newsize - lc.cmdsize)
@@ -380,17 +352,10 @@ def rewriteDataForCommand(self, idx, data):
def synchronize_size(self):
if (self.total_size + self.sizediff) > self.low_offset:
raise ValueError(
(
"New Mach-O header is too large to relocate in %r "
"(new size=%r, max size=%r, delta=%r)"
)
% (
self.parent.filename,
self.total_size + self.sizediff,
self.low_offset,
self.sizediff,
)
)
("New Mach-O header is too large to relocate in %r "
"(new size=%r, max size=%r, delta=%r)") % (
self.parent.filename, self.total_size + self.sizediff,
self.low_offset, self.sizediff))
self.header.sizeofcmds += self.sizediff
self.total_size = sizeof(self.mach_header) + self.header.sizeofcmds
self.sizediff = 0
@@ -431,16 +396,16 @@ def write(self, fileobj):
# zero out the unused space, doubt this is strictly necessary
# and is generally probably already the case
fileobj.write(b"\x00" * (self.low_offset - fileobj.tell()))
fileobj.write(b'\x00' * (self.low_offset - fileobj.tell()))
def getSymbolTableCommand(self):
for lc, cmd, _data in self.commands:
for lc, cmd, data in self.commands:
if lc.cmd == LC_SYMTAB:
return cmd
return None
def getDynamicSymbolTableCommand(self):
for lc, cmd, _data in self.commands:
for lc, cmd, data in self.commands:
if lc.cmd == LC_DYSYMTAB:
return cmd
return None
@@ -449,23 +414,22 @@ def get_filetype_shortname(self, filetype):
if filetype in MH_FILETYPE_SHORTNAMES:
return MH_FILETYPE_SHORTNAMES[filetype]
else:
return "unknown"
return 'unknown'
def main(fn):
m = MachO(fn)
seen = set()
for header in m.headers:
for _idx, name, other in header.walkRelocatables():
for idx, name, other in header.walkRelocatables():
if other not in seen:
seen.add(other)
print("\t" + name + ": " + other)
print('\t' + name + ": " + other)
if __name__ == "__main__":
if __name__ == '__main__':
import sys
files = sys.argv[1:] or ["/bin/ls"]
files = sys.argv[1:] or ['/bin/ls']
for fn in files:
print(fn)
main(fn)

View File

@@ -8,10 +8,10 @@
from altgraph.ObjectGraph import ObjectGraph
from macholib.dyld import dyld_find
from macholib.itergraphreport import itergraphreport
from macholib.MachO import MachO
from macholib.itergraphreport import itergraphreport
__all__ = ["MachOGraph"]
__all__ = ['MachOGraph']
try:
unicode
@@ -25,14 +25,13 @@ def __init__(self, filename):
self.headers = ()
def __repr__(self):
return "<%s graphident=%r>" % (type(self).__name__, self.graphident)
return '<%s graphident=%r>' % (type(self).__name__, self.graphident)
class MachOGraph(ObjectGraph):
"""
Graph data structure of Mach-O dependencies
"""
def __init__(self, debug=0, graph=None, env=None, executable_path=None):
super(MachOGraph, self).__init__(debug=debug, graph=graph)
self.env = env
@@ -42,18 +41,16 @@ def __init__(self, debug=0, graph=None, env=None, executable_path=None):
def locate(self, filename, loader=None):
if not isinstance(filename, (str, unicode)):
raise TypeError("%r is not a string" % (filename,))
if filename.startswith("@loader_path/") and loader is not None:
if filename.startswith('@loader_path/') and loader is not None:
fn = self.trans_table.get((loader.filename, filename))
if fn is None:
loader_path = loader.loader_path
try:
fn = dyld_find(
filename,
env=self.env,
filename, env=self.env,
executable_path=self.executable_path,
loader_path=loader_path,
)
loader_path=loader_path)
self.trans_table[(loader.filename, filename)] = fn
except ValueError:
return None
@@ -63,8 +60,8 @@ def locate(self, filename, loader=None):
if fn is None:
try:
fn = dyld_find(
filename, env=self.env, executable_path=self.executable_path
)
filename, env=self.env,
executable_path=self.executable_path)
self.trans_table[filename] = fn
except ValueError:
return None
@@ -86,11 +83,11 @@ def run_file(self, pathname, caller=None):
m = self.findNode(pathname, loader=caller)
if m is None:
if not os.path.exists(pathname):
raise ValueError("%r does not exist" % (pathname,))
raise ValueError('%r does not exist' % (pathname,))
m = self.createNode(MachO, pathname)
self.createReference(caller, m, edge_data="run_file")
self.createReference(caller, m, edge_data='run_file')
self.scan_node(m)
self.msgout(2, "")
self.msgout(2, '')
return m
def load_file(self, name, caller=None):
@@ -106,20 +103,20 @@ def load_file(self, name, caller=None):
self.scan_node(m)
else:
m = self.createNode(MissingMachO, name)
self.msgout(2, "")
self.msgout(2, '')
return m
def scan_node(self, node):
self.msgin(2, "scan_node", node)
self.msgin(2, 'scan_node', node)
for header in node.headers:
for _idx, name, filename in header.walkRelocatables():
for idx, name, filename in header.walkRelocatables():
assert isinstance(name, (str, unicode))
assert isinstance(filename, (str, unicode))
m = self.load_file(filename, caller=node)
self.createReference(node, m, edge_data=name)
self.msgout(2, "", node)
self.msgout(2, '', node)
def itergraphreport(self, name="G"):
def itergraphreport(self, name='G'):
nodes = map(self.graph.describe_node, self.graph.iterdfs(self))
describe_edge = self.graph.describe_edge
return itergraphreport(nodes, describe_edge, name=name)
@@ -137,5 +134,5 @@ def main(args):
g.graphreport()
if __name__ == "__main__":
main(sys.argv[1:] or ["/bin/ls"])
if __name__ == '__main__':
main(sys.argv[1:] or ['/bin/ls'])
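`MachOGraph.locate()` above caches each resolved `@loader_path` reference in `trans_table`, keyed on (loader filename, raw name), so repeated load commands are resolved only once. The same idea in isolation (function and cache names here are illustrative):

```python
import os

_cache = {}  # plays the role of MachOGraph.trans_table

def resolve_loader_path(raw_name, loader_file):
    """Resolve an @loader_path/ reference against the loading binary."""
    key = (loader_file, raw_name)
    if key not in _cache:
        prefix = "@loader_path/"
        rest = raw_name[len(prefix):]
        _cache[key] = os.path.normpath(
            os.path.join(os.path.dirname(loader_file), rest))
    return _cache[key]

print(resolve_loader_path("@loader_path/../lib/libz.dylib",
                          "/opt/app/bin/tool"))
# -> /opt/app/lib/libz.dylib
```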


@@ -1,16 +1,10 @@
import os
from collections import deque
from macholib.dyld import framework_info
from macholib.MachOGraph import MachOGraph, MissingMachO
from macholib.util import (
flipwritable,
has_filename_filter,
in_system_path,
iter_platform_files,
mergecopy,
mergetree,
)
from macholib.util import iter_platform_files, in_system_path, mergecopy, \
mergetree, flipwritable, has_filename_filter
from macholib.dyld import framework_info
from collections import deque
class ExcludedMachO(MissingMachO):
@@ -29,20 +23,22 @@ def createNode(self, cls, name):
def locate(self, filename, loader=None):
newname = super(FilteredMachOGraph, self).locate(filename, loader)
print("locate", filename, loader, "->", newname)
if newname is None:
return None
return self.delegate.locate(newname, loader=loader)
class MachOStandalone(object):
def __init__(self, base, dest=None, graph=None, env=None, executable_path=None):
self.base = os.path.join(os.path.abspath(base), "")
def __init__(
self, base, dest=None, graph=None, env=None,
executable_path=None):
self.base = os.path.join(os.path.abspath(base), '')
if dest is None:
dest = os.path.join(self.base, "Contents", "Frameworks")
dest = os.path.join(self.base, 'Contents', 'Frameworks')
self.dest = dest
self.mm = FilteredMachOGraph(
self, graph=graph, env=env, executable_path=executable_path
)
self, graph=graph, env=env, executable_path=executable_path)
self.changemap = {}
self.excludes = []
self.pending = deque()
@@ -84,7 +80,8 @@ def copy_dylib(self, filename):
# when two libraries link to the same dylib but using different
# symlinks.
if os.path.islink(filename):
dest = os.path.join(self.dest, os.path.basename(os.path.realpath(filename)))
dest = os.path.join(
self.dest, os.path.basename(os.path.realpath(filename)))
else:
dest = os.path.join(self.dest, os.path.basename(filename))
@@ -99,9 +96,9 @@ def mergetree(self, src, dest):
return mergetree(src, dest)
def copy_framework(self, info):
dest = os.path.join(self.dest, info["shortname"] + ".framework")
destfn = os.path.join(self.dest, info["name"])
src = os.path.join(info["location"], info["shortname"] + ".framework")
dest = os.path.join(self.dest, info['shortname'] + '.framework')
destfn = os.path.join(self.dest, info['name'])
src = os.path.join(info['location'], info['shortname'] + '.framework')
if not os.path.exists(dest):
self.mergetree(src, dest)
self.pending.append((destfn, iter_platform_files(dest)))
@@ -110,7 +107,7 @@ def copy_framework(self, info):
def run(self, platfiles=None, contents=None):
mm = self.mm
if contents is None:
contents = "@executable_path/.."
contents = '@executable_path/..'
if platfiles is None:
platfiles = iter_platform_files(self.base)
@@ -124,20 +121,18 @@ def run(self, platfiles=None, contents=None):
mm.run_file(fn, caller=ref)
changemap = {}
skipcontents = os.path.join(os.path.dirname(self.dest), "")
skipcontents = os.path.join(os.path.dirname(self.dest), '')
machfiles = []
for node in mm.flatten(has_filename_filter):
machfiles.append(node)
dest = os.path.join(
contents,
os.path.normpath(node.filename[len(skipcontents) :]), # noqa: E203
)
contents, os.path.normpath(node.filename[len(skipcontents):]))
changemap[node.filename] = dest
def changefunc(path):
if path.startswith("@loader_path/"):
# This is a quick hack for py2app: In that
if path.startswith('@loader_path/'):
# XXX: This is a quick hack for py2app: In that
# usecase paths like this are found in the load
# commands of relocatable wheels. Those don't
# need rewriting.
@@ -145,8 +140,9 @@ def changefunc(path):
res = mm.locate(path)
rv = changemap.get(res)
if rv is None and path.startswith("@loader_path/"):
rv = changemap.get(mm.locate(mm.trans_table.get((node.filename, path))))
if rv is None and path.startswith('@loader_path/'):
rv = changemap.get(mm.locate(mm.trans_table.get(
(node.filename, path))))
return rv
for node in machfiles:
@@ -154,14 +150,14 @@ def changefunc(path):
if fn is None:
continue
rewroteAny = False
for _header in node.headers:
for header in node.headers:
if node.rewriteLoadCommands(changefunc):
rewroteAny = True
if rewroteAny:
old_mode = flipwritable(fn)
try:
with open(fn, "rb+") as f:
for _header in node.headers:
with open(fn, 'rb+') as f:
for header in node.headers:
f.seek(0)
node.write(f)
f.seek(0, 2)
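The `run()` hunks above build a `changemap` from each copied library's original path to its new `@executable_path`-relative location, then hand a lookup callback (`changefunc`) to the load-command rewriter. A compact sketch of just the mapping step (paths are made up):

```python
import os

def build_changemap(machfiles, skipcontents, contents="@executable_path/.."):
    """Map old absolute dylib paths to bundle-relative install names."""
    return {
        fn: os.path.join(contents, os.path.normpath(fn[len(skipcontents):]))
        for fn in machfiles
    }

print(build_changemap(["/opt/App.app/Contents/Frameworks/libfoo.dylib"],
                      skipcontents="/opt/App.app/Contents/"))
# {'/opt/App.app/Contents/Frameworks/libfoo.dylib':
#  '@executable_path/../Frameworks/libfoo.dylib'}
```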


@@ -3,20 +3,12 @@
"""
from __future__ import with_statement
from macholib.mach_o import relocation_info, dylib_reference, dylib_module
from macholib.mach_o import dylib_table_of_contents, nlist, nlist_64
from macholib.mach_o import MH_CIGAM_64, MH_MAGIC_64
import sys
from macholib.mach_o import (
MH_CIGAM_64,
MH_MAGIC_64,
dylib_module,
dylib_reference,
dylib_table_of_contents,
nlist,
nlist_64,
relocation_info,
)
__all__ = ["SymbolTable"]
__all__ = ['SymbolTable']
if sys.version_info[0] == 2:
range = xrange # noqa: F821
@@ -29,7 +21,7 @@ def __init__(self, macho, header=None, openfile=None):
if header is None:
header = macho.headers[0]
self.macho_header = header
with openfile(macho.filename, "rb") as fh:
with openfile(macho.filename, 'rb') as fh:
self.symtab = header.getSymbolTableCommand()
self.dysymtab = header.getDynamicSymbolTableCommand()
@@ -51,32 +43,22 @@ def readSymbolTable(self, fh):
else:
cls = nlist
for _i in range(cmd.nsyms):
for i in range(cmd.nsyms):
cmd = cls.from_fileobj(fh, _endian_=self.macho_header.endian)
if cmd.n_un == 0:
nlists.append((cmd, ""))
nlists.append((cmd, ''))
else:
nlists.append(
(
cmd,
strtab[cmd.n_un : strtab.find(b"\x00", cmd.n_un)], # noqa: E203
)
)
(cmd, strtab[cmd.n_un:strtab.find(b'\x00', cmd.n_un)]))
return nlists
def readDynamicSymbolTable(self, fh):
cmd = self.dysymtab
nlists = self.nlists
self.localsyms = nlists[
cmd.ilocalsym : cmd.ilocalsym + cmd.nlocalsym # noqa: E203
]
self.extdefsyms = nlists[
cmd.iextdefsym : cmd.iextdefsym + cmd.nextdefsym # noqa: E203
]
self.undefsyms = nlists[
cmd.iundefsym : cmd.iundefsym + cmd.nundefsym # noqa: E203
]
self.localsyms = nlists[cmd.ilocalsym:cmd.ilocalsym+cmd.nlocalsym]
self.extdefsyms = nlists[cmd.iextdefsym:cmd.iextdefsym+cmd.nextdefsym]
self.undefsyms = nlists[cmd.iundefsym:cmd.iundefsym+cmd.nundefsym]
if cmd.tocoff == 0:
self.toc = None
else:
@@ -93,7 +75,7 @@ def readmodtab(self, fh, off, n):
def readsym(self, fh, off, n):
fh.seek(self.macho_header.offset + off)
refs = []
for _i in range(n):
for i in range(n):
ref = dylib_reference.from_fileobj(fh)
isym, flags = divmod(ref.isym_flags, 256)
refs.append((self.nlists[isym], flags))
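`readDynamicSymbolTable()` above shows that `LC_DYSYMTAB` does not store symbols itself; it stores (index, count) ranges into the flat `LC_SYMTAB` list. A self-contained illustration with fake data:

```python
# Sketch of the LC_DYSYMTAB partitioning shown above. Field names follow
# the Mach-O load command; the symbol data is invented for the example.
from collections import namedtuple

Dysymtab = namedtuple("Dysymtab", "ilocalsym nlocalsym "
                                  "iextdefsym nextdefsym "
                                  "iundefsym nundefsym")

nlists = ["_loc_a", "_loc_b", "_ext_main", "_undef_printf"]
cmd = Dysymtab(0, 2, 2, 1, 3, 1)

localsyms = nlists[cmd.ilocalsym:cmd.ilocalsym + cmd.nlocalsym]
extdefsyms = nlists[cmd.iextdefsym:cmd.iextdefsym + cmd.nextdefsym]
undefsyms = nlists[cmd.iundefsym:cmd.iundefsym + cmd.nundefsym]
print(localsyms, extdefsyms, undefsyms)
# ['_loc_a', '_loc_b'] ['_ext_main'] ['_undef_printf']
```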


@@ -5,4 +5,4 @@
And also Apple's documentation.
"""
__version__ = "1.15.2"
__version__ = '1.10'


@@ -1,24 +1,26 @@
from __future__ import absolute_import, print_function
from __future__ import print_function, absolute_import
import os
import sys
from macholib import macho_dump, macho_standalone
from macholib.util import is_platform_file
from macholib import macho_dump
from macholib import macho_standalone
gCommand = None
def check_file(fp, path, callback):
if not os.path.exists(path):
print("%s: %s: No such file or directory" % (gCommand, path), file=sys.stderr)
print(
'%s: %s: No such file or directory' % (gCommand, path),
file=sys.stderr)
return 1
try:
is_plat = is_platform_file(path)
except IOError as msg:
print("%s: %s: %s" % (gCommand, path, msg), file=sys.stderr)
print('%s: %s: %s' % (gCommand, path, msg), file=sys.stderr)
return 1
else:
@@ -32,9 +34,10 @@ def walk_tree(callback, paths):
for base in paths:
if os.path.isdir(base):
for root, _dirs, files in os.walk(base):
for root, dirs, files in os.walk(base):
for fn in files:
err |= check_file(sys.stdout, os.path.join(root, fn), callback)
err |= check_file(
sys.stdout, os.path.join(root, fn), callback)
else:
err |= check_file(sys.stdout, base, callback)
@@ -57,17 +60,17 @@ def main():
gCommand = sys.argv[1]
if gCommand == "dump":
if gCommand == 'dump':
walk_tree(macho_dump.print_file, sys.argv[2:])
elif gCommand == "find":
elif gCommand == 'find':
walk_tree(lambda fp, path: print(path, file=fp), sys.argv[2:])
elif gCommand == "standalone":
elif gCommand == 'standalone':
for dn in sys.argv[2:]:
macho_standalone.standaloneApp(dn)
elif gCommand in ("help", "--help"):
elif gCommand in ('help', '--help'):
print_usage(sys.stdout)
sys.exit(0)


@@ -1,8 +1,7 @@
"""
Internal helpers for basic commandline tools
"""
from __future__ import absolute_import, print_function
from __future__ import print_function, absolute_import
import os
import sys
@@ -11,16 +10,15 @@
def check_file(fp, path, callback):
if not os.path.exists(path):
print(
"%s: %s: No such file or directory" % (sys.argv[0], path), file=sys.stderr
)
print('%s: %s: No such file or directory' % (
sys.argv[0], path), file=sys.stderr)
return 1
try:
is_plat = is_platform_file(path)
except IOError as msg:
print("%s: %s: %s" % (sys.argv[0], path, msg), file=sys.stderr)
print('%s: %s: %s' % (sys.argv[0], path, msg), file=sys.stderr)
return 1
else:
@@ -40,9 +38,10 @@ def main(callback):
for base in args:
if os.path.isdir(base):
for root, _dirs, files in os.walk(base):
for root, dirs, files in os.walk(base):
for fn in files:
err |= check_file(sys.stdout, os.path.join(root, fn), callback)
err |= check_file(
sys.stdout, os.path.join(root, fn), callback)
else:
err |= check_file(sys.stdout, base, callback)


@@ -2,45 +2,18 @@
dyld emulation
"""
import ctypes
import os
import platform
import sys
from itertools import chain
from macholib.dylib import dylib_info
import os
import sys
from macholib.framework import framework_info
from macholib.dylib import dylib_info
__all__ = ["dyld_find", "framework_find", "framework_info", "dylib_info"]
if sys.platform == "darwin" and [
int(x) for x in platform.mac_ver()[0].split(".")[:2]
] >= [10, 16]:
try:
libc = ctypes.CDLL("libSystem.dylib")
except OSError:
_dyld_shared_cache_contains_path = None
else:
try:
_dyld_shared_cache_contains_path = libc._dyld_shared_cache_contains_path
except AttributeError:
_dyld_shared_cache_contains_path = None
else:
_dyld_shared_cache_contains_path.restype = ctypes.c_bool
_dyld_shared_cache_contains_path.argtypes = [ctypes.c_char_p]
if sys.version_info[0] != 2:
__dyld_shared_cache_contains_path = _dyld_shared_cache_contains_path
def _dyld_shared_cache_contains_path(path):
return __dyld_shared_cache_contains_path(path.encode())
else:
_dyld_shared_cache_contains_path = None
__all__ = [
'dyld_find', 'framework_find',
'framework_info', 'dylib_info',
]
# These are the defaults as per man dyld(1)
#
@@ -58,16 +31,13 @@ def _dyld_shared_cache_contains_path(path):
"/usr/lib",
]
# XXX: Is this function still needed?
if sys.version_info[0] == 2:
def _ensure_utf8(s):
if isinstance(s, unicode): # noqa: F821
return s.encode("utf8")
return s.encode('utf8')
return s
else:
def _ensure_utf8(s):
if s is not None and not isinstance(s, str):
raise ValueError(s)
@@ -78,31 +48,31 @@ def _dyld_env(env, var):
if env is None:
env = os.environ
rval = env.get(var)
if rval is None or rval == "":
if rval is None or rval == '':
return []
return rval.split(":")
return rval.split(':')
def dyld_image_suffix(env=None):
if env is None:
env = os.environ
return env.get("DYLD_IMAGE_SUFFIX")
return env.get('DYLD_IMAGE_SUFFIX')
def dyld_framework_path(env=None):
return _dyld_env(env, "DYLD_FRAMEWORK_PATH")
return _dyld_env(env, 'DYLD_FRAMEWORK_PATH')
def dyld_library_path(env=None):
return _dyld_env(env, "DYLD_LIBRARY_PATH")
return _dyld_env(env, 'DYLD_LIBRARY_PATH')
def dyld_fallback_framework_path(env=None):
return _dyld_env(env, "DYLD_FALLBACK_FRAMEWORK_PATH")
return _dyld_env(env, 'DYLD_FALLBACK_FRAMEWORK_PATH')
def dyld_fallback_library_path(env=None):
return _dyld_env(env, "DYLD_FALLBACK_LIBRARY_PATH")
return _dyld_env(env, 'DYLD_FALLBACK_LIBRARY_PATH')
def dyld_image_suffix_search(iterator, env=None):
@@ -113,8 +83,8 @@ def dyld_image_suffix_search(iterator, env=None):
def _inject(iterator=iterator, suffix=suffix):
for path in iterator:
if path.endswith(".dylib"):
yield path[: -len(".dylib")] + suffix + ".dylib"
if path.endswith('.dylib'):
yield path[:-len('.dylib')] + suffix + '.dylib'
else:
yield path + suffix
yield path
@@ -132,7 +102,7 @@ def dyld_override_search(name, env=None):
if framework is not None:
for path in dyld_framework_path(env):
yield os.path.join(path, framework["name"])
yield os.path.join(path, framework['name'])
# If DYLD_LIBRARY_PATH is set then use the first file that exists
# in the path. If none use the original name.
@@ -144,18 +114,16 @@ def dyld_executable_path_search(name, executable_path=None):
# If we haven't done any searching and found a library and the
# dylib_name starts with "@executable_path/" then construct the
# library name.
if name.startswith("@executable_path/") and executable_path is not None:
yield os.path.join(
executable_path, name[len("@executable_path/") :] # noqa: E203
)
if name.startswith('@executable_path/') and executable_path is not None:
yield os.path.join(executable_path, name[len('@executable_path/'):])
def dyld_loader_search(name, loader_path=None):
# If we haven't done any searching and found a library and the
# dylib_name starts with "@loader_path/" then construct the
# library name.
if name.startswith("@loader_path/") and loader_path is not None:
yield os.path.join(loader_path, name[len("@loader_path/") :]) # noqa: E203
if name.startswith('@loader_path/') and loader_path is not None:
yield os.path.join(loader_path, name[len('@loader_path/'):])
def dyld_default_search(name, env=None):
@@ -168,11 +136,11 @@ def dyld_default_search(name, env=None):
if fallback_framework_path:
for path in fallback_framework_path:
yield os.path.join(path, framework["name"])
yield os.path.join(path, framework['name'])
else:
for path in _DEFAULT_FRAMEWORK_FALLBACK:
yield os.path.join(path, framework["name"])
yield os.path.join(path, framework['name'])
fallback_library_path = dyld_fallback_library_path(env)
if fallback_library_path:
@@ -190,20 +158,12 @@ def dyld_find(name, executable_path=None, env=None, loader_path=None):
"""
name = _ensure_utf8(name)
executable_path = _ensure_utf8(executable_path)
for path in dyld_image_suffix_search(
chain(
dyld_override_search(name, env),
dyld_executable_path_search(name, executable_path),
dyld_loader_search(name, loader_path),
dyld_default_search(name, env),
),
env,
):
if (
_dyld_shared_cache_contains_path is not None
and _dyld_shared_cache_contains_path(path)
):
return path
for path in dyld_image_suffix_search(chain(
dyld_override_search(name, env),
dyld_executable_path_search(name, executable_path),
dyld_loader_search(name, loader_path),
dyld_default_search(name, env),
), env):
if os.path.isfile(path):
return path
raise ValueError("dylib %s could not be found" % (name,))
@@ -222,9 +182,9 @@ def framework_find(fn, executable_path=None, env=None):
return dyld_find(fn, executable_path=executable_path, env=env)
except ValueError:
pass
fmwk_index = fn.rfind(".framework")
fmwk_index = fn.rfind('.framework')
if fmwk_index == -1:
fmwk_index = len(fn)
fn += ".framework"
fn += '.framework'
fn = os.path.join(fn, os.path.basename(fn[:fmwk_index]))
return dyld_find(fn, executable_path=executable_path, env=env)
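`dyld_find()` above chains several generators, one per search stage (`DYLD_*` overrides, `@executable_path`, `@loader_path`, defaults), and returns the first candidate that passes the existence test; the newer variant of the hunk additionally accepts paths that live only in the macOS 11+ dyld shared cache, where system dylibs no longer exist on disk. The chaining pattern in miniature (directories are examples, not dyld's real search order):

```python
import os
from itertools import chain

def override_search(name):
    yield os.path.join("/opt/override/lib", os.path.basename(name))

def default_search(name):
    for prefix in ("/usr/local/lib", "/usr/lib"):
        yield os.path.join(prefix, os.path.basename(name))

def find(name):
    # Stages are lazy: later generators run only if earlier ones miss.
    for candidate in chain(override_search(name), default_search(name)):
        if os.path.isfile(candidate):
            return candidate
    raise ValueError("dylib %s could not be found" % (name,))
```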


@@ -4,10 +4,9 @@
import re
__all__ = ["dylib_info"]
__all__ = ['dylib_info']
_DYLIB_RE = re.compile(
r"""(?x)
_DYLIB_RE = re.compile(r"""(?x)
(?P<location>^.*)(?:^|/)
(?P<name>
(?P<shortname>\w+?)
@@ -15,8 +14,7 @@
(?:_(?P<suffix>[^._]+))?
\.dylib$
)
"""
)
""")
def dylib_info(filename):
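For reference, `_DYLIB_RE` decomposes a dylib path into location, shortname, and optional version/suffix groups; the version group sits in the context lines this hunk elides, so the pattern below is reproduced from macholib's source rather than from the diff. Example match:

```python
import re

_DYLIB_RE = re.compile(r"""(?x)
(?P<location>^.*)(?:^|/)
(?P<name>
    (?P<shortname>\w+?)
    (?:\.(?P<version>[^._]+))?
    (?:_(?P<suffix>[^._]+))?
    \.dylib$
)
""")

m = _DYLIB_RE.match("/usr/lib/libSystem.B.dylib")
print(m.group("location"), m.group("shortname"), m.group("version"))
# /usr/lib libSystem B
```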


@@ -4,10 +4,9 @@
import re
__all__ = ["framework_info"]
__all__ = ['framework_info']
_STRICT_FRAMEWORK_RE = re.compile(
r"""(?x)
_STRICT_FRAMEWORK_RE = re.compile(r"""(?x)
(?P<location>^.*)(?:^|/)
(?P<name>
(?P<shortname>[-_A-Za-z0-9]+).framework/
@@ -15,8 +14,7 @@
(?P=shortname)
(?:_(?P<suffix>[^_]+))?
)$
"""
)
""")
def framework_info(filename):


@@ -1,5 +1,7 @@
"""
Utilities for creating dot output from a MachOGraph
XXX: need to rewrite this based on altgraph.Dot
"""
from collections import deque
@@ -9,28 +11,28 @@
except ImportError:
imap = map
__all__ = ["itergraphreport"]
__all__ = ['itergraphreport']
def itergraphreport(nodes, describe_edge, name="G"):
def itergraphreport(nodes, describe_edge, name='G'):
edges = deque()
nodetoident = {}
def nodevisitor(node, data, outgoing, incoming):
return {"label": str(node)}
return {'label': str(node)}
def edgevisitor(edge, data, head, tail):
return {}
yield "digraph %s {\n" % (name,)
attr = {"rankdir": "LR", "concentrate": "true"}
yield 'digraph %s {\n' % (name,)
attr = dict(rankdir='LR', concentrate='true')
cpatt = '%s="%s"'
for item in attr.items():
yield "\t%s;\n" % (cpatt % item,)
for item in attr.iteritems():
yield '\t%s;\n' % (cpatt % item,)
# find all packages (subgraphs)
for (node, data, _outgoing, _incoming) in nodes:
nodetoident[node] = getattr(data, "identifier", node)
for (node, data, outgoing, incoming) in nodes:
nodetoident[node] = getattr(data, 'identifier', node)
# create sets for subgraph, write out descriptions
for (node, data, outgoing, incoming) in nodes:
@@ -41,19 +43,17 @@ def edgevisitor(edge, data, head, tail):
# describe node
yield '\t"%s" [%s];\n' % (
node,
",".join(
[
(cpatt % item)
for item in nodevisitor(node, data, outgoing, incoming).items()
]
),
','.join([
(cpatt % item) for item in
nodevisitor(node, data, outgoing, incoming).iteritems()
]),
)
graph = []
while edges:
edge, data, head, tail = edges.popleft()
if data in ("run_file", "load_dylib"):
if data in ('run_file', 'load_dylib'):
graph.append((edge, data, head, tail))
def do_graph(edges, tabs):
@@ -64,10 +64,10 @@ def do_graph(edges, tabs):
yield edgestr % (
head,
tail,
",".join([(cpatt % item) for item in attribs.items()]),
','.join([(cpatt % item) for item in attribs.iteritems()]),
)
for s in do_graph(graph, "\t"):
for s in do_graph(graph, '\t'):
yield s
yield "}\n"
yield '}\n'
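Beyond the quoting changes, note that this hunk trades `dict.items()` for `dict.iteritems()`: `iteritems()` exists only on Python 2, so the older text runs nowhere on Python 3. The generator-based dot emission itself reduces to this pattern:

```python
def iter_dot(name, edges):
    """Yield a Graphviz digraph line by line, like itergraphreport above."""
    yield "digraph %s {\n" % (name,)
    for head, tail, label in edges:
        yield '\t"%s" -> "%s" [label="%s"];\n' % (head, tail, label)
    yield "}\n"

print("".join(iter_dot("G", [("app", "libz.dylib", "load_dylib")])))
```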

File diff suppressed because it is too large.


@@ -5,14 +5,15 @@
import sys
from macholib._cmdline import main as _main
from macholib.mach_o import CPU_TYPE_NAMES, MH_CIGAM_64, MH_MAGIC_64, get_cpu_subtype
from macholib.MachO import MachO
from macholib.mach_o import get_cpu_subtype, CPU_TYPE_NAMES
from macholib.mach_o import MH_CIGAM_64, MH_MAGIC_64
ARCH_MAP = {
("<", "64-bit"): "x86_64",
("<", "32-bit"): "i386",
(">", "64-bit"): "ppc64",
(">", "32-bit"): "ppc",
('<', '64-bit'): 'x86_64',
('<', '32-bit'): 'i386',
('>', '64-bit'): 'ppc64',
('>', '32-bit'): 'ppc',
}
@@ -23,34 +24,34 @@ def print_file(fp, path):
seen = set()
if header.MH_MAGIC == MH_MAGIC_64 or header.MH_MAGIC == MH_CIGAM_64:
sz = "64-bit"
sz = '64-bit'
else:
sz = "32-bit"
sz = '32-bit'
arch = CPU_TYPE_NAMES.get(header.header.cputype, header.header.cputype)
arch = CPU_TYPE_NAMES.get(
header.header.cputype, header.header.cputype)
subarch = get_cpu_subtype(header.header.cputype, header.header.cpusubtype)
subarch = get_cpu_subtype(
header.header.cputype, header.header.cpusubtype)
print(
" [%s endian=%r size=%r arch=%r subarch=%r]"
% (header.__class__.__name__, header.endian, sz, arch, subarch),
file=fp,
)
for _idx, _name, other in header.walkRelocatables():
print(' [%s endian=%r size=%r arch=%r subarch=%r]' % (
header.__class__.__name__, header.endian, sz, arch, subarch),
file=fp)
for idx, name, other in header.walkRelocatables():
if other not in seen:
seen.add(other)
print("\t" + other, file=fp)
print("", file=fp)
print('\t' + other, file=fp)
print('', file=fp)
def main():
print(
"WARNING: 'macho_dump' is deprecated, use 'python -mmacholib dump' " "instead"
)
"WARNING: 'macho_dump' is deprecated, use 'python -mmacholib dump' "
"instead")
_main(print_file)
if __name__ == "__main__":
if __name__ == '__main__':
try:
sys.exit(main())
except KeyboardInterrupt:
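`print_file()` above keys its "64-bit"/"32-bit" report off the Mach-O magic number. The four magic constants (values from `mach-o/loader.h`) and the classification logic, in isolation:

```python
import struct

MH_MAGIC, MH_CIGAM = 0xFEEDFACE, 0xCEFAEDFE        # 32-bit, both byte orders
MH_MAGIC_64, MH_CIGAM_64 = 0xFEEDFACF, 0xCFFAEDFE  # 64-bit, both byte orders

def classify(first_four_bytes):
    magic = struct.unpack(">I", first_four_bytes)[0]
    if magic in (MH_MAGIC_64, MH_CIGAM_64):
        return "64-bit"
    if magic in (MH_MAGIC, MH_CIGAM):
        return "32-bit"
    return "not Mach-O"

print(classify(struct.pack(">I", 0xFEEDFACF)))  # 64-bit
```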


@@ -1,6 +1,5 @@
#!/usr/bin/env python
from __future__ import print_function
from macholib._cmdline import main as _main
@@ -10,12 +9,12 @@ def print_file(fp, path):
def main():
print(
"WARNING: 'macho_find' is deprecated, " "use 'python -mmacholib dump' instead"
)
"WARNING: 'macho_find' is deprecated, "
"use 'python -mmacholib dump' instead")
_main(print_file)
if __name__ == "__main__":
if __name__ == '__main__':
try:
main()
except KeyboardInterrupt:


@@ -8,8 +8,10 @@
def standaloneApp(path):
if not (os.path.isdir(path) and os.path.exists(os.path.join(path, "Contents"))):
print("%s: %s does not look like an app bundle" % (sys.argv[0], path))
if not (os.path.isdir(path) and os.path.exists(
os.path.join(path, 'Contents'))):
print(
'%s: %s does not look like an app bundle' % (sys.argv[0], path))
sys.exit(1)
files = MachOStandalone(path).run()
strip_files(files)
@@ -18,13 +20,12 @@ def standaloneApp(path):
def main():
print(
"WARNING: 'macho_standalone' is deprecated, use "
"'python -mmacholib standalone' instead"
)
"'python -mmacholib standalone' instead")
if not sys.argv[1:]:
raise SystemExit("usage: %s [appbundle ...]" % (sys.argv[0],))
raise SystemExit('usage: %s [appbundle ...]' % (sys.argv[0],))
for fn in sys.argv[1:]:
standaloneApp(fn)
if __name__ == "__main__":
if __name__ == '__main__':
main()


@@ -4,12 +4,12 @@
"""
import struct
import sys
from itertools import chain, starmap
try:
from itertools import imap, izip
from itertools import izip, imap
except ImportError:
izip, imap = zip, map
from itertools import chain, starmap
__all__ = """
sizeof
@@ -44,7 +44,7 @@ def sizeof(s):
"""
Return the size of an object when packed
"""
if hasattr(s, "_size_"):
if hasattr(s, '_size_'):
return s._size_
elif isinstance(s, bytes):
@@ -58,15 +58,14 @@ class MetaPackable(type):
Fixed size struct.unpack-able types use from_tuple as their designated
initializer
"""
def from_mmap(cls, mm, ptr, **kw):
return cls.from_str(mm[ptr : ptr + cls._size_], **kw) # noqa: E203
return cls.from_str(mm[ptr:ptr+cls._size_], **kw)
def from_fileobj(cls, f, **kw):
return cls.from_str(f.read(cls._size_), **kw)
def from_str(cls, s, **kw):
endian = kw.get("_endian_", cls._endian_)
endian = kw.get('_endian_', cls._endian_)
return cls.from_tuple(struct.unpack(endian + cls._format_, s), **kw)
def from_tuple(cls, tpl, **kw):
@@ -74,7 +73,7 @@ def from_tuple(cls, tpl, **kw):
class BasePackable(object):
_endian_ = ">"
_endian_ = '>'
def to_str(self):
raise NotImplementedError
@@ -83,7 +82,7 @@ def to_fileobj(self, f):
f.write(self.to_str())
def to_mmap(self, mm, ptr):
mm[ptr : ptr + self._size_] = self.to_str() # noqa: E203
mm[ptr:ptr+self._size_] = self.to_str()
# This defines a class with a custom metaclass, we'd normally
@@ -93,10 +92,9 @@ def to_mmap(self, mm, ptr):
def _make():
def to_str(self):
cls = type(self)
endian = getattr(self, "_endian_", cls._endian_)
endian = getattr(self, '_endian_', cls._endian_)
return struct.pack(endian + cls._format_, self)
return MetaPackable("Packable", (BasePackable,), {"to_str": to_str})
return MetaPackable("Packable", (BasePackable,), {'to_str': to_str})
Packable = _make()
@@ -111,8 +109,8 @@ def pypackable(name, pytype, format):
size, items = _formatinfo(format)
def __new__(cls, *args, **kwds):
if "_endian_" in kwds:
_endian_ = kwds.pop("_endian_")
if '_endian_' in kwds:
_endian_ = kwds.pop('_endian_')
else:
_endian_ = cls._endian_
@@ -120,11 +118,12 @@ def __new__(cls, *args, **kwds):
result._endian_ = _endian_
return result
return type(Packable)(
name,
(pytype, Packable),
{"_format_": format, "_size_": size, "_items_": items, "__new__": __new__},
)
return type(Packable)(name, (pytype, Packable), {
'_format_': format,
'_size_': size,
'_items_': items,
'__new__': __new__,
})
def _formatinfo(format):
@@ -132,7 +131,7 @@ def _formatinfo(format):
Calculate the size and number of items in a struct format.
"""
size = struct.calcsize(format)
return size, len(struct.unpack(format, b"\x00" * size))
return size, len(struct.unpack(format, b'\x00' * size))
class MetaStructure(MetaPackable):
@@ -143,17 +142,17 @@ class MetaStructure(MetaPackable):
we can do a bunch of calculations up front and pack or
unpack the whole thing in one struct call.
"""
def __new__(cls, clsname, bases, dct):
fields = dct["_fields_"]
fields = dct['_fields_']
names = []
types = []
structmarks = []
format = ""
format = ''
items = 0
size = 0
def struct_property(name, typ):
def _get(self):
return self._objects_[name]
@@ -170,16 +169,16 @@ def _set(self, obj):
types.append(typ)
format += typ._format_
size += typ._size_
if typ._items_ > 1:
if (typ._items_ > 1):
structmarks.append((items, typ._items_, typ))
items += typ._items_
dct["_structmarks_"] = structmarks
dct["_names_"] = names
dct["_types_"] = types
dct["_size_"] = size
dct["_items_"] = items
dct["_format_"] = format
dct['_structmarks_'] = structmarks
dct['_names_'] = names
dct['_types_'] = types
dct['_size_'] = size
dct['_items_'] = items
dct['_format_'] = format
return super(MetaStructure, cls).__new__(cls, clsname, bases, dct)
def from_tuple(cls, tpl, **kw):
@@ -197,7 +196,7 @@ def from_tuple(cls, tpl, **kw):
# See metaclass discussion earlier in this file
def _make():
class_dict = {}
class_dict["_fields_"] = ()
class_dict['_fields_'] = ()
def as_method(function):
class_dict[function.__name__] = function
@@ -220,7 +219,7 @@ def __init__(self, *args, **kwargs):
@as_method
def _get_packables(self):
for obj in imap(self._objects_.__getitem__, self._names_):
if hasattr(obj, "_get_packables"):
if hasattr(obj, '_get_packables'):
for obj in obj._get_packables():
yield obj
@@ -229,19 +228,18 @@ def _get_packables(self):
@as_method
def to_str(self):
return struct.pack(self._endian_ + self._format_, *self._get_packables())
return struct.pack(
self._endian_ + self._format_, *self._get_packables())
@as_method
def __cmp__(self, other):
if type(other) is not type(self):
raise TypeError(
"Cannot compare objects of type %r to objects of type %r"
% (type(other), type(self))
)
'Cannot compare objects of type %r to objects of type %r' % (
type(other), type(self)))
if sys.version_info[0] == 2:
_cmp = cmp # noqa: F821
else:
def _cmp(a, b):
if a < b:
return -1
@@ -253,8 +251,7 @@ def _cmp(a, b):
raise TypeError()
for cmpval in starmap(
_cmp, izip(self._get_packables(), other._get_packables())
):
_cmp, izip(self._get_packables(), other._get_packables())):
if cmpval != 0:
return cmpval
return 0
@@ -292,12 +289,12 @@ def __ge__(self, other):
@as_method
def __repr__(self):
result = []
result.append("<")
result.append('<')
result.append(type(self).__name__)
for nm in self._names_:
result.append(" %s=%r" % (nm, getattr(self, nm)))
result.append(">")
return "".join(result)
result.append(' %s=%r' % (nm, getattr(self, nm)))
result.append('>')
return ''.join(result)
return MetaStructure("Structure", (BasePackable,), class_dict)
@@ -311,17 +308,17 @@ def __repr__(self):
long = int
# export common packables with predictable names
p_char = pypackable("p_char", bytes, "c")
p_int8 = pypackable("p_int8", int, "b")
p_uint8 = pypackable("p_uint8", int, "B")
p_int16 = pypackable("p_int16", int, "h")
p_uint16 = pypackable("p_uint16", int, "H")
p_int32 = pypackable("p_int32", int, "i")
p_uint32 = pypackable("p_uint32", long, "I")
p_int64 = pypackable("p_int64", long, "q")
p_uint64 = pypackable("p_uint64", long, "Q")
p_float = pypackable("p_float", float, "f")
p_double = pypackable("p_double", float, "d")
p_char = pypackable('p_char', bytes, 'c')
p_int8 = pypackable('p_int8', int, 'b')
p_uint8 = pypackable('p_uint8', int, 'B')
p_int16 = pypackable('p_int16', int, 'h')
p_uint16 = pypackable('p_uint16', int, 'H')
p_int32 = pypackable('p_int32', int, 'i')
p_uint32 = pypackable('p_uint32', long, 'I')
p_int64 = pypackable('p_int64', long, 'q')
p_uint64 = pypackable('p_uint64', long, 'Q')
p_float = pypackable('p_float', float, 'f')
p_double = pypackable('p_double', float, 'd')
# Deprecated names, need trick to emit deprecation warning.
p_byte = p_int8
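`pypackable()` above manufactures `int`/`float` subclasses bound to a struct format so Mach-O fields can round-trip through bytes. A pared-down sketch of the same trick (this is not ptypes' full API, which also tracks endianness per instance):

```python
import struct

def pypackable(name, pytype, fmt):
    """Build a pytype subclass that packs/unpacks via `fmt`."""
    size = struct.calcsize(fmt)

    def to_str(self, endian=">"):
        return struct.pack(endian + fmt, self)

    def from_str(cls, data, endian=">"):
        return cls(struct.unpack(endian + fmt, data)[0])

    return type(name, (pytype,), {
        "_format_": fmt,
        "_size_": size,
        "to_str": to_str,
        "from_str": classmethod(from_str),
    })

p_uint32 = pypackable("p_uint32", int, "I")
raw = p_uint32(0xFEEDFACE).to_str()
print(repr(raw), p_uint32.from_str(raw))
# b'\xfe\xed\xfa\xce' 4277009102
```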


@@ -1,18 +1,18 @@
import os
import shutil
import sys
import stat
import struct
import sys
import shutil
from macholib import mach_o
MAGIC = [
struct.pack("!L", getattr(mach_o, "MH_" + _))
for _ in ["MAGIC", "CIGAM", "MAGIC_64", "CIGAM_64"]
struct.pack('!L', getattr(mach_o, 'MH_' + _))
for _ in ['MAGIC', 'CIGAM', 'MAGIC_64', 'CIGAM_64']
]
FAT_MAGIC_BYTES = struct.pack("!L", mach_o.FAT_MAGIC)
FAT_MAGIC_BYTES = struct.pack('!L', mach_o.FAT_MAGIC)
MAGIC_LEN = 4
STRIPCMD = ["/usr/bin/strip", "-x", "-S", "-"]
STRIPCMD = ['/usr/bin/strip', '-x', '-S', '-']
try:
unicode
@@ -20,7 +20,7 @@
unicode = str
def fsencoding(s, encoding=sys.getfilesystemencoding()): # noqa: M511,B008
def fsencoding(s, encoding=sys.getfilesystemencoding()):
"""
Ensure the given argument is in filesystem encoding (not unicode)
"""
@@ -66,17 +66,16 @@ def __init__(self, fileobj, start, size):
self._end = start + size
def __repr__(self):
return "<fileview [%d, %d] %r>" % (self._start, self._end, self._fileobj)
return '<fileview [%d, %d] %r>' % (
self._start, self._end, self._fileobj)
def tell(self):
return self._fileobj.tell() - self._start
def _checkwindow(self, seekto, op):
if not (self._start <= seekto <= self._end):
raise IOError(
"%s to offset %d is outside window [%d, %d]"
% (op, seekto, self._start, self._end)
)
raise IOError("%s to offset %d is outside window [%d, %d]" % (
op, seekto, self._start, self._end))
def seek(self, offset, whence=0):
seekto = offset
@@ -88,22 +87,21 @@ def seek(self, offset, whence=0):
seekto += self._end
else:
raise IOError("Invalid whence argument to seek: %r" % (whence,))
self._checkwindow(seekto, "seek")
self._checkwindow(seekto, 'seek')
self._fileobj.seek(seekto)
def write(self, bytes):
here = self._fileobj.tell()
self._checkwindow(here, "write")
self._checkwindow(here + len(bytes), "write")
self._checkwindow(here, 'write')
self._checkwindow(here + len(bytes), 'write')
self._fileobj.write(bytes)
def read(self, size=sys.maxsize):
if size < 0:
raise ValueError(
"Invalid size %s while reading from %s", size, self._fileobj
)
"Invalid size %s while reading from %s", size, self._fileobj)
here = self._fileobj.tell()
self._checkwindow(here, "read")
self._checkwindow(here, 'read')
bytes = min(size, self._end - here)
return self._fileobj.read(bytes)
@@ -112,7 +110,8 @@ def mergecopy(src, dest):
"""
copy2, but only if the destination isn't up to date
"""
if os.path.exists(dest) and os.stat(dest).st_mtime >= os.stat(src).st_mtime:
if os.path.exists(dest) and \
os.stat(dest).st_mtime >= os.stat(src).st_mtime:
return
copy2(src, dest)
@@ -139,16 +138,13 @@ def mergetree(src, dst, condition=None, copyfn=mergecopy, srcbase=None):
continue
try:
if os.path.islink(srcname):
# XXX: This is naive at best, should check srcbase(?)
realsrc = os.readlink(srcname)
os.symlink(realsrc, dstname)
elif os.path.isdir(srcname):
mergetree(
srcname,
dstname,
condition=condition,
copyfn=copyfn,
srcbase=srcbase,
)
srcname, dstname,
condition=condition, copyfn=copyfn, srcbase=srcbase)
else:
copyfn(srcname, dstname)
except (IOError, os.error) as why:
@@ -162,10 +158,10 @@ def sdk_normalize(filename):
Normalize a path to strip out the SDK portion, normally so that it
can be decided whether it is in a system path or not.
"""
if filename.startswith("/Developer/SDKs/"):
pathcomp = filename.split("/")
if filename.startswith('/Developer/SDKs/'):
pathcomp = filename.split('/')
del pathcomp[1:4]
filename = "/".join(pathcomp)
filename = '/'.join(pathcomp)
return filename
@@ -177,9 +173,9 @@ def in_system_path(filename):
Return True if the file is in a system path
"""
fn = sdk_normalize(os.path.realpath(filename))
if fn.startswith("/usr/local/"):
if fn.startswith('/usr/local/'):
return False
elif fn.startswith("/System/") or fn.startswith("/usr/"):
elif fn.startswith('/System/') or fn.startswith('/usr/'):
if fn in NOT_SYSTEM_FILES:
return False
return True
@@ -191,7 +187,7 @@ def has_filename_filter(module):
"""
Return False if the module does not have a filename attribute
"""
return getattr(module, "filename", None) is not None
return getattr(module, 'filename', None) is not None
def get_magic():
@@ -208,16 +204,16 @@ def is_platform_file(path):
if not os.path.exists(path) or os.path.islink(path):
return False
# If the header is fat, we need to read into the first arch
with open(path, "rb") as fileobj:
with open(path, 'rb') as fileobj:
bytes = fileobj.read(MAGIC_LEN)
if bytes == FAT_MAGIC_BYTES:
# Read in the fat header
fileobj.seek(0)
header = mach_o.fat_header.from_fileobj(fileobj, _endian_=">")
header = mach_o.fat_header.from_fileobj(fileobj, _endian_='>')
if header.nfat_arch < 1:
return False
# Read in the first fat arch header
arch = mach_o.fat_arch.from_fileobj(fileobj, _endian_=">")
arch = mach_o.fat_arch.from_fileobj(fileobj, _endian_='>')
fileobj.seek(arch.offset)
# Read magic off the first header
bytes = fileobj.read(MAGIC_LEN)
@@ -231,7 +227,7 @@ def iter_platform_files(dst):
"""
Walk a directory and yield each full path that is a Mach-O file
"""
for root, _dirs, files in os.walk(dst):
for root, dirs, files in os.walk(dst):
for fn in files:
fn = os.path.join(root, fn)
if is_platform_file(fn):
@@ -246,7 +242,7 @@ def strip_files(files, argv_max=(256 * 1024)):
while tostrip:
cmd = list(STRIPCMD)
flips = []
pathlen = sum(len(s) + 1 for s in cmd)
pathlen = sum([len(s) + 1 for s in cmd])
while pathlen < argv_max:
if not tostrip:
break
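The `fileview` class patched above is a file-like wrapper confined to a `[start, start + size)` window, which is how macholib addresses one architecture inside a fat binary. A minimal standalone version of the idea:

```python
import io

class FileWindow(object):
    """File-like view that forbids access outside [start, start + size)."""

    def __init__(self, fileobj, start, size):
        self._f, self._start, self._end = fileobj, start, start + size
        self._f.seek(start)

    def _check(self, pos, op):
        if not (self._start <= pos <= self._end):
            raise IOError("%s to offset %d is outside window [%d, %d]"
                          % (op, pos, self._start, self._end))

    def seek(self, offset):
        self._check(self._start + offset, "seek")
        self._f.seek(self._start + offset)

    def read(self, size=None):
        here = self._f.tell()
        self._check(here, "read")
        limit = self._end - here
        return self._f.read(limit if size is None else min(size, limit))

w = FileWindow(io.BytesIO(b"0123456789"), start=2, size=5)
print(w.read())  # b'23456'
```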


@@ -1,4 +1,4 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -6,98 +6,79 @@
This is a fake set of symbols to allow spack to import typing in python
versions where we do not support type checking (<3)
"""
from collections import defaultdict
# (1) Unparameterized types.
Annotated = object
Any = object
AnyStr = object
ByteString = object
Counter = object
Final = object
Hashable = object
NoReturn = object
Sized = object
SupportsAbs = object
SupportsBytes = object
SupportsComplex = object
SupportsFloat = object
SupportsIndex = object
SupportsInt = object
SupportsRound = object
# (2) Parameterized types.
AbstractSet = defaultdict(lambda: object)
AsyncContextManager = defaultdict(lambda: object)
AsyncGenerator = defaultdict(lambda: object)
AsyncIterable = defaultdict(lambda: object)
AsyncIterator = defaultdict(lambda: object)
Awaitable = defaultdict(lambda: object)
Callable = defaultdict(lambda: object)
ChainMap = defaultdict(lambda: object)
ClassVar = defaultdict(lambda: object)
Collection = defaultdict(lambda: object)
Container = defaultdict(lambda: object)
ContextManager = defaultdict(lambda: object)
Coroutine = defaultdict(lambda: object)
DefaultDict = defaultdict(lambda: object)
Deque = defaultdict(lambda: object)
Dict = defaultdict(lambda: object)
ForwardRef = defaultdict(lambda: object)
FrozenSet = defaultdict(lambda: object)
Generator = defaultdict(lambda: object)
Generic = defaultdict(lambda: object)
ItemsView = defaultdict(lambda: object)
Iterable = defaultdict(lambda: object)
Iterator = defaultdict(lambda: object)
KeysView = defaultdict(lambda: object)
List = defaultdict(lambda: object)
Literal = defaultdict(lambda: object)
Mapping = defaultdict(lambda: object)
MappingView = defaultdict(lambda: object)
MutableMapping = defaultdict(lambda: object)
MutableSequence = defaultdict(lambda: object)
MutableSet = defaultdict(lambda: object)
NamedTuple = defaultdict(lambda: object)
Optional = defaultdict(lambda: object)
OrderedDict = defaultdict(lambda: object)
Reversible = defaultdict(lambda: object)
Sequence = defaultdict(lambda: object)
Set = defaultdict(lambda: object)
Tuple = defaultdict(lambda: object)
Type = defaultdict(lambda: object)
TypedDict = defaultdict(lambda: object)
Union = defaultdict(lambda: object)
ValuesView = defaultdict(lambda: object)
# (3) Type variable declarations.
TypeVar = lambda *args, **kwargs: None
# (4) Functions.
cast = lambda _type, x: x
Annotated = None
Any = None
Callable = None
ForwardRef = None
Generic = None
Literal = None
Optional = None
Tuple = None
TypeVar = None
Union = None
AbstractSet = None
ByteString = None
Container = None
Hashable = None
ItemsView = None
Iterable = None
Iterator = None
KeysView = None
Mapping = None
MappingView = None
MutableMapping = None
MutableSequence = None
MutableSet = None
Sequence = None
Sized = None
ValuesView = None
Awaitable = None
AsyncIterator = None
AsyncIterable = None
Coroutine = None
Collection = None
AsyncGenerator = None
AsyncContextManager = None
Reversible = None
SupportsAbs = None
SupportsBytes = None
SupportsComplex = None
SupportsFloat = None
SupportsInt = None
SupportsRound = None
ChainMap = None
Dict = None
List = None
OrderedDict = None
Set = None
FrozenSet = None
NamedTuple = None
Generator = None
AnyStr = None
cast = None
get_args = None
get_origin = None
get_type_hints = None
no_type_check = None
no_type_check_decorator = None
NoReturn = None
## typing_extensions
# We get a ModuleNotFoundError when attempting to import anything from typing_extensions
# if we separate this into a separate typing_extensions.py file for some reason.
# (1) Unparameterized types.
IntVar = object
Literal = object
NewType = object
Text = object
# (2) Parameterized types.
Protocol = defaultdict(lambda: object)
# (3) Macro for avoiding evaluation except during type checking.
TYPE_CHECKING = False
# (4) Decorators.
final = lambda x: x
overload = lambda x: x
runtime_checkable = lambda x: x
# these are the typing extension symbols
ClassVar = None
Final = None
Protocol = None
Type = None
TypedDict = None
ContextManager = None
Counter = None
Deque = None
DefaultDict = None
SupportsIndex = None
final = None
IntVar = None
Literal = None
NewType = None
overload = None
runtime_checkable = None
Text = None
TYPE_CHECKING = None
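The two halves of this hunk differ in more than formatting: binding every name to `None` breaks any module that subscripts a fake type at import time, while `defaultdict(lambda: object)` makes subscription a plain dictionary lookup that always succeeds. A short demonstration:

```python
from collections import defaultdict

Optional = defaultdict(lambda: object)
Dict = defaultdict(lambda: object)

print(Optional[str])   # <class 'object'> -- no TypeError, unlike None[str]
print(Dict[str, int])  # the tuple key hashes fine, so this succeeds too
```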


@@ -1,4 +1,4 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)


@@ -1,4 +1,4 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)


@@ -1,4 +1,4 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)


@@ -1,39 +0,0 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# isort: off
import sys
if sys.version_info < (3,):
from itertools import ifilter as filter
from itertools import imap as map
from itertools import izip as zip
from itertools import izip_longest as zip_longest # novm
from urllib import urlencode as urlencode
from urllib import urlopen as urlopen
else:
filter = filter
map = map
zip = zip
from itertools import zip_longest as zip_longest # novm # noqa: F401
from urllib.parse import urlencode as urlencode # novm # noqa: F401
from urllib.request import urlopen as urlopen # novm # noqa: F401
if sys.version_info >= (3, 3):
from collections.abc import Hashable as Hashable # novm
from collections.abc import Iterable as Iterable # novm
from collections.abc import Mapping as Mapping # novm
from collections.abc import MutableMapping as MutableMapping # novm
from collections.abc import MutableSequence as MutableSequence # novm
from collections.abc import MutableSet as MutableSet # novm
from collections.abc import Sequence as Sequence # novm
else:
from collections import Hashable as Hashable # noqa: F401
from collections import Iterable as Iterable # noqa: F401
from collections import Mapping as Mapping # noqa: F401
from collections import MutableMapping as MutableMapping # noqa: F401
from collections import MutableSequence as MutableSequence # noqa: F401
from collections import MutableSet as MutableSet # noqa: F401
from collections import Sequence as Sequence # noqa: F401
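The deleted `llnl/util/compat.py` above funnels every Python 2/3 import difference through one module, so the rest of the code base imports from a single place. The pattern in miniature:

```python
import sys

if sys.version_info < (3,):
    from itertools import izip_longest as zip_longest  # noqa: F401
else:
    from itertools import zip_longest  # noqa: F401

print(list(zip_longest([1, 2], ["a"], fillvalue=None)))
# [(1, 'a'), (2, None)]
```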


@@ -1,4 +1,4 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -21,11 +21,15 @@
import six
from llnl.util import tty
from llnl.util.compat import Sequence
from llnl.util.lang import dedupe, memoized
from spack.util.executable import Executable
if sys.version_info >= (3, 3):
from collections.abc import Sequence # novm
else:
from collections import Sequence
__all__ = [
'FileFilter',
'FileList',
@@ -1638,18 +1642,12 @@ def find_libraries(libraries, root, shared=True, recursive=False):
raise TypeError(message)
# Construct the right suffix for the library
if shared:
# Used on both Linux and macOS
suffixes = ['so']
if sys.platform == 'darwin':
# Only used on macOS
suffixes.append('dylib')
if shared is True:
suffix = 'dylib' if sys.platform == 'darwin' else 'so'
else:
suffixes = ['a']
suffix = 'a'
# List of libraries we are searching with suffixes
libraries = ['{0}.{1}'.format(lib, suffix) for lib in libraries
for suffix in suffixes]
libraries = ['{0}.{1}'.format(lib, suffix) for lib in libraries]
if not recursive:
# If not recursive, look for the libraries directly in root
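This `filesystem.py` hunk replaces a single shared-library suffix with a per-platform suffix list, so macOS searches both `.so` and `.dylib` names. The expansion step on its own:

```python
import sys

def library_names(libraries, shared=True):
    """Expand bare library names with platform-appropriate suffixes."""
    if shared:
        suffixes = ['so']
        if sys.platform == 'darwin':
            suffixes.append('dylib')
    else:
        suffixes = ['a']
    return ['{0}.{1}'.format(lib, suffix)
            for lib in libraries for suffix in suffixes]

print(library_names(['libz', 'libssl']))
# On Linux: ['libz.so', 'libssl.so']
```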


@@ -1,11 +1,10 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import division
import contextlib
import functools
import inspect
import os
@@ -13,10 +12,19 @@
import sys
from datetime import datetime, timedelta
import six
from six import string_types
from llnl.util.compat import MutableMapping, zip_longest
if sys.version_info < (3, 0):
from itertools import izip_longest # novm
zip_longest = izip_longest
else:
from itertools import zip_longest # novm
if sys.version_info >= (3, 3):
from collections.abc import Hashable, MutableMapping # novm
else:
from collections import Hashable, MutableMapping
# Ignore emacs backups when listing modules
ignore_modules = [r'^\.#', '~$']
@@ -166,19 +174,6 @@ def union_dicts(*dicts):
return result
# Used as a sentinel that disambiguates tuples passed in *args from coincidentally
# matching tuples formed from kwargs item pairs.
_kwargs_separator = (object(),)
def stable_args(*args, **kwargs):
"""A key factory that performs a stable sort of the parameters."""
key = args
if kwargs:
key += _kwargs_separator + tuple(sorted(kwargs.items()))
return key
def memoized(func):
"""Decorator that caches the results of a function, storing them in
an attribute of that function.
@@ -186,23 +181,15 @@ def memoized(func):
func.cache = {}
@functools.wraps(func)
def _memoized_function(*args, **kwargs):
key = stable_args(*args, **kwargs)
def _memoized_function(*args):
if not isinstance(args, Hashable):
# Not hashable, so just call the function.
return func(*args)
try:
return func.cache[key]
except KeyError:
ret = func(*args, **kwargs)
func.cache[key] = ret
return ret
except TypeError as e:
# TypeError is raised when indexing into a dict if the key is unhashable.
raise six.raise_from(
UnhashableArguments(
"args + kwargs '{}' was not hashable for function '{}'"
.format(key, func.__name__),
),
e)
if args not in func.cache:
func.cache[args] = func(*args)
return func.cache[args]
return _memoized_function
@@ -944,15 +931,3 @@ def elide_list(line_list, max_num=10):
return line_list[:max_num - 1] + ['...'] + line_list[-1:]
else:
return line_list
@contextlib.contextmanager
def nullcontext(*args, **kwargs):
"""Empty context manager.
TODO: replace with contextlib.nullcontext() if we ever require python 3.7.
"""
yield
class UnhashableArguments(TypeError):
"""Raise when an @memoized function receives unhashable arg or kwarg values."""


@@ -1,4 +1,4 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

Some files were not shown because too many files have changed in this diff.