Commit Graph

273 Commits

Author SHA1 Message Date
Todd Gamblin
549ba1ed32 black: fix style check package and flake8 formatting for black
Black will automatically fix a lot of the exceptions we previously allowed for
directives, so we don't need them in our custom `flake8_formatter` anymore.

- [x] remove `E501` (long line) exceptions for directives from `flake8_formatter`,
      as they won't help us now.
- [x] Refine exceptions for long URLs in the `flake8_formatter`.
- [x] Adjust the mock `flake8-package` to exhibit the exceptions we still allow.
- [x] Update style tests for new `flake8-package`.
- [x] Blacken style test.
2022-07-31 13:29:20 -07:00
Todd Gamblin
156af2a60a black: clean up noqa comments from most of the code
Many noqa's in the code are no longer necessary now that the column limit is 99
characters. Others can easily be eliminated, and still more can just be made more
specific if they do not have to do with line length.

The only E501's still in the code are in the tests for `spack.util.path` and the tests
for `spack style`.
2022-07-31 13:29:20 -07:00
Massimiliano Culpo
7f2b5e8e57
spack.repo.get() can only be called on concrete specs (#31411)
The goal of this PR is to make clearer where we need a package object in Spack as opposed to a package class.

We currently instantiate a lot of package objects when we could make do with a class.  We should use the class
when we only need metadata, and we should only instantiate and use an instance of `PackageBase` at build time (see the sketch below).

Modifications:
- [x] Remove the `spack.repo.get` convenience function (which was used in many places, and not really needed)
- [x] Use `spack.repo.path.get_pkg_class` wherever possible
- [x] Try to route most of the need for `spack.repo.path.get` through `Spec.package`
- [x] Introduce a non-data descriptor, that can be used as a decorator, to have "class level properties"
- [x] Refactor unit tests that had to be modified to reduce code duplication
- [x] `Spec.package` and `Repo.get` now require a concrete spec as input
- [x] Remove `RepoPath.all_packages` and `Repo.all_packages`
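A minimal sketch of the class-vs-instance distinction above, assuming a Spack checkout on `PYTHONPATH`; `zlib` is just an example package name:

```python
import spack.repo
import spack.spec

# Metadata only: ask the repository for the package *class*, no instantiation.
pkg_cls = spack.repo.path.get_pkg_class("zlib")
print(pkg_cls.versions)

# Package *object*: obtained through a concrete spec, e.g. at build time.
spec = spack.spec.Spec("zlib").concretized()
pkg = spec.package  # after this PR this requires a concrete spec
```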
2022-07-12 19:45:24 -04:00
Chuck Atkins
85dc20cb55
Spec: Add a new virtual-customizable home attribute (#30917)
* Spec: Add a new virtual-customizable home attribute

* java: Use the new builtin home attribute

* python: Use the new builtin home attribute
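A hedged sketch of how a dependent package might use the attribute; the package name and environment variable are illustrative:

```python
from spack.package import *


class MyJavaApp(Package):
    """Hypothetical dependent package using the new home attribute."""

    depends_on("java")

    def setup_build_environment(self, env):
        # `home` may differ from `prefix` for virtuals such as java or python.
        env.set("JAVA_HOME", self.spec["java"].home)
```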
2022-06-17 10:29:08 -04:00
Massimiliano Culpo
895ceeda38
concretize.lp: impose a lower bound on the number of version facts if a solution exists (#31142)
* concretize.lp: impose a lower bound on the number of version facts if a valid version exists

fixes #30864

* Add a unit test
2022-06-16 09:25:56 -07:00
Jordan Galby
e74d85a524
Fix empty install prefix post-install sanity check (#30983) 2022-06-07 16:17:33 +02:00
Chuck Atkins
9d7cc43673
Package: Don't warn for missing source on bundle packages without code (#30913) 2022-06-06 15:33:03 -07:00
Tom Scogland
18c2f1a57a
refactor: packages import spack.package explicitly (#30404)
Explicitly import package utilities in all packages, and corresponding fallout.

This includes:

* rename `spack.package` to `spack.package_base`
* rename `spack.pkgkit` to `spack.package`
* update all packages in builtin, builtin_mock and tutorials to include `from spack.package import *`
* update spack style
  * ensure packages include the import
  * automatically add the new import and remove any/all imports of `spack` and `spack.pkgkit`
    from packages when using `--fix`
  * add support for type-checking packages with mypy when SPACK_MYPY_CHECK_PACKAGES
    is set in the environment
* fix all type checking errors in packages in spack upstream
* update spack create to include the new imports
* update spack repo to inject the new import, injection persists to allow for a deprecation period

Original message below:
 
As requested by @adamjstewart, update all packages to use pkgkit. I ended up using isort to do this,
so repro is easy:

```console
$ isort -a 'from spack.pkgkit import *' --rm 'spack' ./var/spack/repos/builtin/packages/*/package.py
$ spack style --fix
```

There were several line spacing fixups caused either by space manipulation in isort or by packages
that haven't been touched since we added requirements, but there are no functional changes in here.

* [x] add config to isort to make sure this is maintained going forward
2022-05-28 12:55:44 -04:00
Massimiliano Culpo
f2a81af70e
Best effort co-concretization (iterative algorithm) (#28941)
Currently, environments can either be concretized fully together or fully separately. This works well for users who create environments for interoperable software and can use `concretizer:unify:true`. It does not allow environments with conflicting software to be concretized for maximal interoperability.

The primary use-case for this is facilities providing system software. Facilities provide multiple MPI implementations, but all software built against a given MPI ought to be interoperable.

This PR adds a concretization option `concretizer:unify:when_possible`. When this option is used, Spack will concretize specs in the environment separately, but will optimize for minimal differences in overlapping packages.

* Add a level of indirection to root specs

This commit introduces the "literal" atom, which comes with
a few different "arities". The unary "literal" contains an
integer that is the ID of a spec literal. Other "literals"
contain information on the requests made by literal ID. For
instance zlib@1.2.11 generates the following facts:

literal(0,"root","zlib").
literal(0,"node","zlib").
literal(0,"node_version_satisfies","zlib","1.2.11").

This should help with solving large environments "together
where possible" since later literals can be now solved
together in batches.

* Add a mechanism to relax the number of literals being solved

* Modify spack solve to display the new criteria

Since the new criterion is above all the build criteria,
we need to modify the way we display the output.

Originally done by Greg in #27964 and cherry-picked
to this branch by the co-author of the commit.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Inject reusable specs into the solve

Instead of coupling the PyclingoDriver() object with
spack.config, inject the concrete specs that can be
reused.

A method level function takes care of reading from
the store and the buildcache.

* spack solve: show output of multi-rounds

* add tests for best-effort coconcretization

* Enforce having at least a literal being solved

Co-authored-by: Greg Becker <becker33@llnl.gov>
2022-05-24 12:13:28 -07:00
Todd Gamblin
521c206030 concretizer: enable hash reuse with full hash
With the original DAG hash, we did not store build dependencies in the database, but
with the full DAG hash, we do. Previously, we'd never tell the concretizer about build
dependencies of things used by hash, because we never had them. Now, we have to avoid
telling the concretizer about them, or they'll unnecessarily constrain build
dependencies for new concretizations.

- [x] Make database track all dependencies included in the `dag_hash`
- [x] Modify spec_clauses so that build dependency information is optional
      and off by default.
- [x] `spack diff` asks `spec_clauses` for build dependencies for completeness
- [x] Modify `concretize.lp` so that reuse optimization doesn't affect fresh
      installations.
- [x] Modify concretizer setup so that it does *not* prioritize installed versions
      over package versions. We don't need this with reuse, so they're low priority.
- [x] Fix `test_installed_deps` for full hash and new concretizer (does not work
      for old concretizer with full hash -- leave this for later if we need it)
- [x] Move `test_installed_deps` mock packages to `builtin.mock` for easier debugging
      with `spack -m`.
- [x] Fix `test_reuse_installed_packages_when_package_def_changes` for full hash
2022-05-13 10:45:12 -07:00
Scott Wittenburg
cb0d12b9d5 Fix how environments are read from lockfile 2022-05-13 10:45:12 -07:00
Tamara Dahlgren
011a491b16
package audit: ensure stand-alone test method not included in build-phase testing (#30352) 2022-05-05 18:04:16 +02:00
Peter Scheibel
bb43308c44
Add command for reading JSON-based DB description (now with more tests) (#29652)
This is an amended version of https://github.com/spack/spack/pull/24894 (reverted in https://github.com/spack/spack/pull/29603). https://github.com/spack/spack/pull/24894
broke all instances of `spack external find` (namely when it is invoked without arguments/options)
because it was mandating the presence of a file which most systems would not have.
This allows `spack external find` to proceed if that file is not present and adds tests for this.

- [x] Add a test which confirms that `spack external find` successfully reads a manifest file
      if present in the default manifest path

--- Original commit message ---

Adds `spack external read-cray-manifest`, which reads a json file that describes a
set of package DAGs. The parsed results are stored directly in the database. A user
can see these installed specs with `spack find` (like any installed spec). The easiest
way to use them right now as dependencies is to run
`spack spec ... ^/hash-of-external-package`.

Changes include:

* `spack external read-cray-manifest --file <path/to/file>` will add all specs described
  in the file to Spack's installation DB and will also install described compilers to the
  compilers configuration (the expected format of the file is described in this PR as well, including examples of the file)
* Database records now may include an "origin" (the command added in this PR
  registers the origin as "external-db"). In the future, it is assumed users may want
  to be able to treat installs registered with this command differently (e.g. they may
  want to uninstall all specs added with this command)
* Hash properties are now always preserved when copying specs if the source spec
  is concrete
  * I don't think the hashes of installed-and-concrete specs should change and this
    was the easiest way to handle that
  * also specs that are concrete preserve their `.normal` property when copied
    (external specs may mention compilers that are not registered, and without this
    change they would fail in `normalize` when calling `validate_or_raise`)
  * it might be that this should only be the case if the spec was installed

- [x] Improve testing
- [x] Specifically mark DB records added with this command (so that users can do
      something like "uninstall all packages added with `spack read-external-db`)
  * This is now possible with `spack uninstall --all --origin=external-db` (this will
    remove all specs added from manifest files)
- [x] Strip variants that are listed in json entries but don't actually exist for the package
2022-04-28 10:56:26 -07:00
Tamara Dahlgren
0c31ab87c9
Feature: Allow re-use of run_test() in install_time_test_callbacks (#26594)
Allow re-use of run_test() in install_time_test_callbacks

Co-authored-by: Greg Becker <becker33@llnl.gov>
2022-04-26 17:40:05 -07:00
Peter Scheibel
267da78559
Package-level submodule attribute: support explicit versions (#30085) 2022-04-22 11:02:17 -07:00
Peter Scheibel
89f6db21f1
Ad-hoc Git commit versions: support submodules (#30037)
* Allow packages to add a 'submodules' property that determines when ad-hoc Git-commit-based versions should initialize submodules

* add support for ad-hoc git-commit-based versions to instantiate submodules if the associated package has a 'submodules' property and it indicates this should happen for the associated spec

* allow Package-level submodule request to influence all explicitly-defined version() in the Package

* skip test on windows which fails because of long paths
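A hedged sketch of a package opting in at the class level; the package name and URL are placeholders:

```python
from spack.package import *


class MyTool(Package):
    """Hypothetical git-fetched package that needs its submodules."""

    git = "https://example.com/mytool/mytool.git"

    # Package-level request: both explicit version() entries and ad-hoc
    # commit-based versions (e.g. mytool@<sha>) initialize submodules.
    submodules = True

    version("1.2.0", tag="v1.2.0")
```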
2022-04-13 20:05:14 -07:00
Nathan Hanford
a3520d14bd
Splice differing virtual packages (#27919)
Co-authored-by: Nathan Hanford <hanford1@llnl.gov>
2022-04-12 13:31:39 -07:00
Peter Scheibel
2aec5b65f3
Git commit versions bugfix: Environments and Concretization (#29717)
Spack added support in #24639 for ad-hoc Git-commit-hash-based
versions: A user can install a package x@hash, where x is a package
that stores its source code in a Git repository, and the hash refers
to a commit in that repository which is not recorded as an explicit
version in the package.py file for X.

A couple issues were found relating to this:

* If an environment defines an alternative package repo (i.e. with
  repos.yaml), and spack.yaml contains user Specs with ad-hoc
  Git-commit-hash-based versions for packages in that repo,
  then as part of retrieving the data needed for version comparisons
  it will attempt to retrieve the package before the environment's
  configuration is instantiated.
* The bookkeeping information added to compare ad-hoc git versions was
  being stripped from Specs during concretization (such that user
  Specs which succeeded before concretizing would then fail after)

This addresses the issues:

* The first issue is resolved by deferring access to the associated
  Package until the versions are actually compared to one another.
* The second issue is resolved by ensuring that the Git bookkeeping
  information is explicitly applied to Specs after they are concretized.

This also:

* Resolves an ambiguity in the mock_git_version_info fixture used to
  create a tree of Git commits and provide a list where each index
  maps to a known commit.
* Isolates the cache used for Git repositories in tests using the
  mock_git_version_info fixture
* Adds a TODO which points out that if the remote Git repository
  overwrites tags, that Spack will then fail when using
  ad-hoc Git-commit-hash-based versions
2022-04-12 09:58:26 -07:00
Massimiliano Culpo
f2fc4ee9af
Allow conditional possible values in variants (#29530)
Allow declaring possible values for variants with an associated condition. If the variant takes one of those values, the condition is imposed as a further constraint.

The idea of this PR is to implement part of the mechanisms needed for modeling [packages with multiple build-systems]( https://github.com/spack/seps/pull/3). After this PR the build-system directive can be implemented as:
```python
variant(
    'build-system',
    default='cmake',
    values=(
        'autotools',
        conditional('cmake', when='@X.Y:')
    ), 
    description='...',
)
```

Modifications:
- [x] Allow conditional possible values in variants
- [x] Add a unit-test for the feature
- [x] Add documentation
2022-04-04 17:37:57 -07:00
Nathan Hanford
88d8ca9b65
rewiring of spliced specs (#26873)
* tests for rewiring pure specs to spliced specs

* relocate text, binaries, and links

* using llnl.util.symlink for windows compat.

Note: This does not include CLI hooks for relocation.

Co-authored-by: Nathan Hanford <hanford1@llnl.gov>
2022-04-04 14:45:35 -07:00
Harmen Stoppels
59e522e815
environment views: single pass view generation (#29443)
Reduces the number of stat calls to a bare minimum:
- Single pass over src prefixes
- Handle projection clashes in memory

Symlinked directories in the src prefixes are now conditionally
transformed into directories with symlinks in the dst dir. Notably
`intel-mkl`, `cuda` and `qt` have top-level symlinked directories that
previously resulted in empty directories in the view. We now avoid
cycles and possible exponential blowup by only expanding symlinks that:
- point to dirs deeper in the folder structure;
- are at a fixed depth of 2.
2022-03-24 03:54:33 -06:00
Adam J. Stewart
5df10c04cd
Use stable URLs and ?full_index=1 for all github patches (#29239)
The number of commit characters in patch files fetched from GitHub can change,
so we should use `full_index=1` to enforce full commit hashes (and a stable
patch `sha256`).

Similarly, URLs for branches like `master` don't give us stable patch files,
because branches are moving targets. Use specific tags or commits for those.

- [x] update all github patch URLs to use `full_index=1`
- [x] don't use `master` or other branches for patches
- [x] add an audit check and a test for `?full_index=1`
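A hedged example of the URL form being enforced, inside a hypothetical package; the repository, commit, and checksums are placeholders:

```python
from spack.package import *


class Example(Package):
    """Hypothetical package carrying a GitHub patch."""

    url = "https://example.com/example-1.2.3.tar.gz"

    version("1.2.3", sha256="<tarball sha256>")

    # Pin a specific commit (never a branch) and request full commit hashes
    # in the diff so the patch sha256 below stays stable.
    patch(
        "https://github.com/example/project/commit/<full-commit-sha>.patch?full_index=1",
        sha256="<patch sha256>",
        when="@1.2.3",
    )
```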

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2022-03-23 08:50:00 +01:00
Nils Vu
bfb6873ce3
Revert "Add command for reading a json-based DB description (#24894)" (#29603)
This reverts commit 531b1c5c3d.
2022-03-19 16:30:46 -06:00
Peter Scheibel
531b1c5c3d
Add command for reading a json-based DB description (#24894)
Adds `spack external read-cray-manifest`, which reads a json file that describes a set of package DAGs. The parsed results are stored directly in the database. A user can see these installed specs with `spack find` (like any installed spec). The easiest way to use them right now as dependencies is to run `spack spec ... ^/hash-of-external-package`.

Changes include:

* `spack external read-cray-manifest --file <path/to/file>` will add all specs described in the file to Spack's installation DB and will also install described compilers to the compilers configuration (the expected format of the file is described in this PR as well, including examples of the file)
* Database records now may include an "origin" (the command added in this PR registers the origin as "external-db"). In the future, it is assumed users may want to be able to treat installs registered with this command differently (e.g. they may want to uninstall all specs added with this command)
* Hash properties are now always preserved when copying specs if the source spec is concrete
  * I don't think the hashes of installed-and-concrete specs should change and this was the easiest way to handle that
  * also specs that are concrete preserve their `.normal` property when copied (external specs may mention compilers that are not registered, and without this change they would fail in `normalize` when calling `validate_or_raise`)
  * it might be that this should only be the case if the spec was installed

- [x] Improve testing
- [x] Specifically mark DB records added with this command (so that users can do something like "uninstall all packages added with `spack read-external-db`")
  * This is now possible with `spack uninstall --all --origin=external-db` (this will remove all specs added from manifest files)
- [x] Strip variants that are listed in json entries but don't actually exist for the package

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-03-18 17:07:22 -07:00
John Parent
4aee27816e Windows Support: Testing Suite integration
Broaden support for execution of the test suite
on Windows.
General bug and review fixups
2022-03-17 09:01:01 -07:00
John Parent
3a994032f8 Spack on Windows package ports
CMake - Windows Bootstrap (#25825)

Remove hardcoded cmake compiler (#26410)

Revert breaking cmake changes
Ensure no autotools on Windows

Perl on Windows (#26612)

Python source build windows (#26313)

Reconfigure sysconf for Windows

Python2.6 compatibility

Fixup new sbang tests for windows

Ruby support (#28287)

Add NASM support (#28319)

Add mock Ninja package for testing
2022-03-17 09:01:01 -07:00
Massimiliano Culpo
a6eed4a7c7
Remove "setup_environment" and "setup_dependent_environment" (#29463)
fixes #29446

The new setup_*_environment functions have been falling back
to calling the old functions and warning the user since #11115.

This commit removes the fallback behavior and any use of:
- setup_environment
- setup_dependent_environment
in the codebase
2022-03-13 11:51:55 -04:00
Tamara Dahlgren
0b4f40ab79
Testing: Summarize test results and add verbose output (#28700) 2022-02-23 18:36:21 -08:00
Greg Becker
130354b867
spack audit: fix spurious failures for target/platform conflicts (#28860) 2022-02-10 09:10:23 +01:00
Massimiliano Culpo
cd04109e17
Add a "sticky" property to variants (#28630)
* Add sticky variants

* Add unit tests for sticky variants

* Add documentation for sticky variants

* Revert "Revert 19736 because conflicts are avoided by clingo by default (#26721)"

This reverts commit 33ef7d57c1.

* Add stickiness to "allow-unsupported-compiler"
2022-02-02 10:05:24 -08:00
Todd Gamblin
93377942d1 Update copyright year to 2022 2022-01-14 22:50:21 -08:00
Adam J. Stewart
3540f8200a
PythonPackage: install packages with pip (#27798)
* Use pip to bootstrap pip

* Bootstrap wheel from source

* Update PythonPackage to install using pip

* Update several packages

* Add wheel as base class dep

* Build phase no longer exists

* Add py-poetry package, fix py-flit-core bootstrapping

* Fix isort build

* Clean up many more packages

* Remove unused import

* Fix unit tests

* Don't directly run setup.py

* Typo fix

* Remove unused imports

* Fix issues caught by CI

* Remove custom setup.py file handling

* Use PythonPackage for installing wheels

* Remove custom phases in PythonPackages

* Remove <phase>_args methods

* Remove unused import

* Fix various packages

* Try to test Python packages directly in CI

* Actually run the pipeline

* Fix more packages

* Fix mappings, fix packages

* Fix dep version

* Work around bug in concretizer

* Various concretization fixes

* Fix gitlab yaml, packages

* Fix typo in gitlab yaml

* Skip more packages that fail to concretize

* Fix? jupyter ecosystem concretization issues

* Solve Jupyter concretization issues

* Prevent duplicate entries in PYTHONPATH

* Skip fenics-dolfinx

* Build fewer Python packages

* Fix missing npm dep

* Specify image

* More package fixes

* Add backends for every from-source package

* Fix version arg

* Remove GitLab CI stuff, add py-installer package

* Remove test deps, re-add install_options

* Function declaration syntax fix

* More build fixes

* Update spack create template

* Update PythonPackage documentation

* Fix documentation build

* Fix unit tests

* Remove pip flag added only in newer pip

* flux: add explicit dependency on jsonschema

* Update packages that have been added since this was branched off of develop

* Move Python 2 deprecation to a separate PR

* py-neurolab: add build dep on py-setuptools

* Use wheels for pip/wheel

* Allow use of pre-installed pip for external Python

* pip -> python -m pip

* Use python -m pip for all packages

* Fix py-wrapt

* Add both platlib and purelib to PYTHONPATH

* py-pyyaml: setuptools is needed for all versions

* py-pyyaml: link flags aren't needed

* Appease spack audit packages

* Some build backend is required for all versions, distutils -> setuptools

* Correctly handle different setup.py filename

* Use wheels for py-tomli to avoid circular dep on py-flit-core

* Fix busco installation procedure

* Clarify things in spack create template

* Test other Python build backends

* Undo changes to busco

* Various fixes

* Don't test other backends
2022-01-14 12:37:57 -06:00
Andrew W Elble
96535cc4f9
MANPATH needs a trailing ':' to utilize system defaults (#21682)
otherwise spack breaks using system man pages by default.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-12-16 10:54:35 +00:00
Tamara Dahlgren
9240614928
Bugfix: Simplify preferred-test versions; set checksum defaults (#28026)
* Simplify preferred-test versions; set checksum defaults

* Fix test_preferred failure
2021-12-16 06:55:28 +01:00
Adam J. Stewart
0960c0810c
Extends: support spec, not just package name (#27754) 2021-12-10 13:30:21 +01:00
Harmen Stoppels
0024e5cc9b
Make _enable_or_disable(...) return an empty array for conditional variants whose condition is not met (#27504) 2021-11-22 10:47:09 +01:00
Harmen Stoppels
c5aee4d9b4
define_from_variant: return an empty string for non-existing variants (#27503)
This permits using conditional variants without a lot of boilerplate.
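A hedged sketch of the pattern this enables, assuming a CMake-based package whose variant only exists for some versions:

```python
from spack.package import *


class Example(CMakePackage):
    """Hypothetical package combining define_from_variant with a conditional variant."""

    variant("gpu", default=False, when="@2.0:", description="Enable GPU support")

    def cmake_args(self):
        return [
            # For specs where the variant does not exist this now yields an
            # empty string instead of raising, so no version guard is needed.
            self.define_from_variant("ENABLE_GPU", "gpu"),
        ]
```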
2021-11-19 14:10:00 +01:00
Greg Becker
67cd92e6a3
Allow conditional variants (#24858)
A common question from users has been how to model variants 
that are new in new versions of a package, or variants that are 
dependent on other variants. Our stock answer so far has been
an unsatisfying combination of "just have it do nothing in the old
version" and "tell Spack it conflicts".

This PR enables conditional variants, on any spec condition. The 
syntax is straightforward, and matches that of previous features.
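A hedged example of the resulting syntax, with illustrative names:

```python
from spack.package import *


class Example(AutotoolsPackage):
    """Hypothetical package using conditional variants."""

    variant("mpi", default=False, description="Enable MPI support")

    # Only exists in newer versions of the package.
    variant("shared", default=True, when="@2.0:", description="Build shared libraries")

    # Only meaningful when another variant is active.
    variant("parallel_io", default=False, when="+mpi", description="Enable parallel I/O")
```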
2021-11-03 08:11:31 +01:00
Paul Ferrell
4ee37c37de
buildcaches: fix directory link relocation (#26948)
When relocating a binary distribution, Spack only checks files to see
if they are a link that needs to be relocated. Directories can be
such links as well, however, and need to undergo the same checks
and potential relocation.
2021-10-28 14:34:31 +02:00
Massimiliano Culpo
6d69d23aa5 Add a unit test to prevent regression 2021-10-25 09:11:04 +02:00
Massimiliano Culpo
2d45a9d617
Speed-up environment concretization on linux with a process pool (#26264)
* Speed-up environment concretization with a process pool

We can exploit the fact that the environment is concretized
separately and use a pool of processes to concretize it.

* Add module spack.util.parallel

Module includes `pool` and `parallel_map` abstractions,
along with implementation details for both.

* Add a new hash type to pass specs across processes

* Add tty msg with concretization time
2021-10-19 10:09:34 -05:00
Massimiliano Culpo
eded8f48dc
ASP-based solver: add a rule for version uniqueness in virtual packages (#26740)
fixes #26718

A virtual package may or may not have a version, but it
never has more than one. Previously we were missing a rule
for that.
2021-10-14 23:06:41 +02:00
Massimiliano Culpo
949094544e
Constrain abstract specs rather than concatenating strings in the "when" context manager (#26700)
Using the Spec.constrain method doesn't work since it might
trigger a repository lookup, which could break our directives
and trigger a circular import error.

To fix that we introduce a function to merge abstract anonymous
specs, based only on package names, which does not perform any
lookup in the repository.
2021-10-14 12:33:10 +02:00
Patrick Gartung
047c95aa8d
buildcache: do one less tar file extraction
The buildcache is now extracted in a temporary folder within the current store,
moved to its final place and relocated. 

"spack clean -s" has been extended to also clean the temporary extraction directory.

Add hardlinks with absolute paths for libraries in the corge, garply and quux packages
to detect incorrect handling of hardlinks in tests.
2021-10-13 17:38:29 +02:00
Massimiliano Culpo
551120ee0b
ASP-based solver: decrease the priority of multi-valued variant optimization for root (#26677)
The ASP-based solver maximizes the number of values in multi-valued
variants (if other higher order constraints are met), to avoid cases
where only a subset of the values that have been specified on the
command line or imposed by another constraint are selected.

Here we swap the priority of this optimization target with the
selection of the default providers, to avoid unexpected results
like the one in #26598
2021-10-12 14:15:48 +02:00
Massimiliano Culpo
319ae9254e
Remove the spack.architecture module (#25986)
The `spack.architecture` module contains an `Arch` class that is very similar to `spack.spec.ArchSpec` but points to platform, operating system and target objects rather than "names". There's a TODO in the class since 2016:

abb0f6e27c/lib/spack/spack/architecture.py (L70-L75)

and this PR basically addresses that. Since there are just a few places where the `Arch` class was used, here we query the relevant platform objects where they are needed directly from `spack.platforms`. This permits cleaning vestigial logic from the code.

Modifications:
- [x] Remove the `spack.architecture` module and replace its use by `spack.platforms`
- [x] Remove unneeded tests
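A hedged sketch of querying platform objects directly, as the PR describes:

```python
import spack.platforms

# Query the host platform directly instead of going through the removed Arch class.
host_platform = spack.platforms.host()
host_os = host_platform.operating_system("default_os")
host_target = host_platform.target("default_target")
print(host_platform, host_os, host_target)
```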
2021-10-06 10:28:12 -07:00
Harmen Stoppels
b9e72557e8
Remove .99 from version ranges (#26422)
In most cases, .99 can be omitted thanks to #26402.
2021-10-03 09:09:02 -04:00
Harmen Stoppels
87450f3688
Use gnuconfig package for config file replacement (#26035)
* Use gnuconfig package for config file replacement

Currently the autotools build system tries to pick up config.sub and
config.guess files from the system (in /usr/share) on arm and power.
This introduces an implicit system dependency, which we can avoid by
distributing config.guess and config.sub files in a separate package,
such as the new `gnuconfig` package which is very lightweight/text only
(unlike automake where we previously pulled these files from as a
backup). This PR adds `gnuconfig` as an unconditional build dependency
for arm and power archs.

In case the user needs a system version of config.sub and config.guess,
they are free to mark `gnuconfig` as an external package with the prefix
pointing to the directory containing the config files:

```yaml
    gnuconfig:
      externals:
      - spec: gnuconfig@master
        prefix: /tmp/tmp.ooBlkyAKdw/lol
      buildable: false
```

Apart from that, this PR gives some better instructions for users when
replacing config files goes wrong.

* Mock needs this package too now, because autotools adds a depends_on

* Add documentation

* Make patch_config_files a prop, fix the docs, add integration tests

* Make macOS happy
2021-09-27 18:38:14 -04:00
iarspider
ca8d16c9d1
Allow setting variant name in AutotoolsPackage._activate_or_not (#26054) 2021-09-20 10:54:24 +02:00
Vanessasaurus
ef5ad4eb34
Adding ability to compare git references to spack install (#24639)
This will allow a user to (from anywhere a Spec is parsed including both name and version) refer to a git commit in lieu of 
a package version, and be able to make comparisons with releases in the history based on commits (or with other commits). We do this by way of:

 - Adding a property, is_commit, to a version, meaning I can always check if a version is a commit and then change some action.
 - Adding an attribute to the Version object which can lookup commits from a git repo and find the last known version before that commit, and the distance
 - Construct new Version comparators, which are tuples. For normal versions, they are unchanged. For commits with a previous version x.y.z, d commits away, the comparator is (x, y, z, '', d). For commits with no previous version, the comparator is ('', d) where d is the distance from the first commit in the repo.
 - Metadata on git commits is cached in the misc_cache, for quick lookup later.
 - Git repos are cached as bare repos in `~/.spack/git_repos`
 - In both caches, git repo urls are turned into file paths within the cache

If a commit cannot be found in the cached git repo, we fetch from the repo. If a commit is found in the cached metadata, we do not recompare to newly downloaded tags (assuming repo structure does not change). The cached metadata may be thrown out by using the `spack clean -m` option if you know the repo structure has changed in a way that invalidates existing entries. Future work will include automatic updates.

# Finding previous versions
Spack will search the repo for any tags that match the string of a version given by the `version` directive. Spack will also search for any tags that match `v + string` for any version string. Beyond that, Spack will search for tags that match a SEMVER regex (i.e., tags of the form x.y.z) and interpret those tags as valid versions as well. Future work will increase the breadth of tags understood by Spack

For each tag, Spack queries git to determine whether the tag is an ancestor of the commit in question or not. Spack then sorts the tags that are ancestors of the commit by commit-distance in the repo, and takes the nearest ancestor. The version represented by that tag is listed as the previous version for the commit.

Not all commits will find a previous version, depending on the package workflow. Future work may enable more tangential relationships between commits and versions to be discovered, but many commits in real world git repos require human knowledge to associate with a most recent previous version. Future work will also allow packages to specify commit/tag/version relationships manually for such situations.

# Version comparisons.
The empty string is a valid component of a Spack version tuple, and is in fact the lowest-valued component. It cannot be generated as part of any valid version. These two characteristics make it perfect for delineating previous versions from distances. For any version x.y.z, (x, y, z, '', _) will be less than any "real" version beginning x.y.z. This ensures that no distance from a release will cause the commit to be interpreted as "greater than" a version which is not an ancestor of it.
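A toy illustration of this ordering (not Spack's actual implementation); components are kept as strings so the empty string compares below any real component:

```python
release_1_2_3 = ("1", "2", "3")
patch_1_2_3_1 = ("1", "2", "3", "1")
release_1_2_4 = ("1", "2", "4")
commit_5_past_1_2_3 = ("1", "2", "3", "", "5")  # 5 commits after the 1.2.3 tag

# The commit sorts after the tag it follows, but before any "real" version
# that begins with 1.2.3 and before the next release.
assert release_1_2_3 < commit_5_past_1_2_3 < patch_1_2_3_1 < release_1_2_4
```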

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: Gregory Becker <becker33@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-09-14 22:12:34 -07:00
Nathan Hanford
d83f7110d5
specs: move to new spec.json format with build provenance (#22845)
This is a major rework of Spack's core `spec.yaml` metadata format.  It moves from `spec.yaml` to `spec.json` for speed, and it changes the format in several ways. Specifically:

1. The spec format now has a `_meta` section with a version (now set to version `2`).  This will simplify major changes like this one in the future.
2. The node list in spec dictionaries is no longer keyed by name. Instead, it is a list of records with no required key. The name, hash, etc. are fields in the dictionary records like any other.
3. Dependencies can be keyed by any hash (`hash`, `full_hash`, `build_hash`).
4. `build_spec` provenance from #20262 is included in the spec format. This means that, for spliced specs, we preserve the *full* provenance of how to build, and we can reproduce a spliced spec from the original builds that produced it.

**NOTE**: Because we have switched the spec format, this PR changes Spack's hashing algorithm.  This means that after this commit, Spack will think a lot of things need rebuilds.

There are two major benefits this PR provides:
* The switch to JSON format speeds up Spack significantly, as Python's builtin JSON implementation is orders of magnitude faster than YAML. 
* The new Spec format will soon allow us to represent DAGs with potentially multiple versions of the same dependency -- e.g., for build dependencies or for compilers-as-dependencies.  This PR lays the necessary groundwork for those features.

The old `spec.yaml` format continues to be supported, but is now considered a legacy format, and Spack will opportunistically convert these to the new `spec.json` format.
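A hedged sketch of the new layout as a Python literal, following the description above; field contents are placeholders and details may differ from the actual schema:

```python
spec_json_sketch = {
    "spec": {
        "_meta": {"version": 2},
        # Nodes are a list of records, no longer keyed by package name.
        "nodes": [
            {
                "name": "zlib",
                "version": "1.2.11",
                "hash": "<dag-hash>",
                "full_hash": "<full-hash>",
                "build_hash": "<build-hash>",
                # Dependencies point at other nodes by hash.
                "dependencies": [
                    {"name": "pkgconf", "hash": "<dag-hash-of-pkgconf>", "type": ["build"]}
                ],
            }
        ],
    }
}
```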
2021-09-09 01:48:30 -07:00
Todd Gamblin
c309adb4b3
url stats: add --show-issues option (#25792)
* tests: make `spack url [stats|summary]` work on mock packages

Mock packages have historically had mock hashes, but this means they're also invalid
as far as Spack's hash detection is concerned.

- [x] convert all hashes in mock package to md5 or sha256
- [x] ensure that all mock packages have a URL
- [x] ignore some special cases with multiple VCS fetchers

* url stats: add `--show-issues` option

`spack url stats` tells us how many URLs are using what protocol, type of checksum,
etc., but it previously did not tell us which packages and URLs had the issues. This
adds a `--show-issues` option to show URLs with insecure (`http`) URLs or `md5` hashes
(which are now deprecated by NIST).
2021-09-08 07:59:06 -07:00
Massimiliano Culpo
7dd3592eab
Regression test for version selection with preferences (#25602)
This commit adds a regression test for version selection
with preferences in `packages.yaml`. Before PR 25585 we
used negative weights in a minimization to select the
optimal version. This may lead to situations where a
dependency may make the version score of dependents
"better" if it is preferred in packages.yaml.
2021-08-26 09:28:47 -07:00
Harmen Stoppels
73208f5835
Fix bindist network issues (#25587)
* Fix bindist network issues

* Another one using the network
2021-08-24 12:09:23 -05:00
Massimiliano Culpo
371bc37dd4
Rework rules for provider weights (#25331)
Preferred providers had a non-zero weight because, in an earlier formulation of the logic program, that was needed to prefer external providers over default providers. With the current formulation for externals this is not needed anymore, so we can give a weight of zero to both default choices and providers that are externals. _Using zero ensures that we don't introduce any drift towards having fewer providers, which was happening when minimizing positive weights_.

Modifications:

- [x] Default weight for providers starts at 0 (instead of 10, needed before to prefer externals)
- [x] Rules to compute the `provider_weight` have been refactored. There are multiple possible weights for a given `Virtual`. Only one gets selected by the solver (the one that minimizes the objective function).
- [x] `provider_weight` are now accounting for each different `Virtual`. Before there was a single weight per provider, even if the package was providing multiple virtuals.

* Give preferred providers a weight of zero

Preferred providers had a non-zero weight because in an earlier
formulation of the logic program that was needed to prefer
external providers over default providers.

With the current formulation for externals this is not needed anymore,
so we can give a weight of zero to default choices. Using zero
ensures that we don't introduce any drift towards having
fewer providers, which was happening when minimizing positive weights.

* Simplify how we compute weights for providers

Rewrite rules so that specific events (i.e. being
an external) unlock the possibility to use certain
weights. The weight being considered is then selected
by the minimization process to be the one that gives
the best score.

* Allow providers to have different weights for different virtuals

Before this change we didn't differentiate providers based on
the virtual they provide, which meant that packages providing
more than one virtual had nonetheless a single weight.

With this change there will be a weight per virtual.
2021-08-10 14:15:45 -07:00
Todd Gamblin
24c01d57cf
imports: sort imports everywhere in Spack (#24695)
* fix remaining flake8 errors

* imports: sort imports everywhere in Spack

We enabled import order checking in #23947, but fixing things manually drives
people crazy. This used `spack style --fix --all` from #24071 to automatically
sort everything in Spack so PR submitters won't have to deal with it.

This should go in after #24071, as it assumes we're using `isort`, not
`flake8-import-order` to order things. `isort` seems to be more flexible and
allows `llnl` imports to be in their own group before `spack` ones, so this
seems like a good switch.
2021-07-08 22:12:30 +00:00
Massimiliano Culpo
3d11716e54
Add when context manager to group common constraints in packages (#24650)
This PR adds a context manager that permits grouping the common part of a `when=` argument and adding it to the context:
```python
class Gcc(AutotoolsPackage):
    with when('+nvptx'):
        depends_on('cuda')
        conflicts('@:6', msg='NVPTX only supported in gcc 7 and above')
        conflicts('languages=ada')
        conflicts('languages=brig')
        conflicts('languages=go')
```
The above snippet is equivalent to:
```python
class Gcc(AutotoolsPackage):
    depends_on('cuda', when='+nvptx')
    conflicts('@:6', when='+nvptx', msg='NVPTX only supported in gcc 7 and above')
    conflicts('languages=ada', when='+nvptx')
    conflicts('languages=brig', when='+nvptx')
    conflicts('languages=go', when='+nvptx')
```
which needs a repetition of the `when='+nvptx'` argument. The context manager might help improve readability and permits grouping together directives related to the same semantic aspect (e.g. all the directives needed to model the behavior of `gcc` when `+nvptx` is active).

Modifications:

- [x] Added a `when` context manager to be used with package directives
- [x] Add unit tests and documentation for the new feature
- [x] Modified `cp2k` and `gcc` to show the use of the context manager
2021-07-02 08:43:15 -07:00
Massimiliano Culpo
acc11f676d
ASP-based solver: fix provider logic (#24351)
This commit fixes a subtle bug that may occur when
a package is a "possible_provider" of a virtual but
no "provides_virtual" can be deduced. In that case
the cardinality constraint on "provides_virtual"
may arbitrarily assign a package the role of provider
even if the constraints for it to be one are not fulfilled.

The fix reworks the logic around three concepts:
- "possible_provider": a package may provide a virtual if some constraints are met
- "provides_virtual": a package meet the constraints to provide a virtual
- "provider": a package selected to provide a virtual
2021-06-22 11:37:24 -07:00
Erik Schnetter
e3b220f699
Implement CVS fetcher (#23212)
Spack packages can now fetch versions from CVS repositories. Note
this fetch mechanism is unsafe unless using :extssh:. Most public
CVS repositories use an insecure protocol implemented as part of CVS.
2021-06-22 09:51:31 -07:00
Vanessasaurus
8e249c03de
adding save of build times on install (#24350)
Here we are adding an install_times.json into the spack install metadata folder.
We record a total, global time, along with the times for each phase. The type
of phase or install start / end is included (e.g., build or fail)

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-06-22 03:01:15 -06:00
Massimiliano Culpo
e321578bbe
ASP-based solver: reordered low priority optimization criteria (#24184)
Minimizing compiler mismatches in the DAG and preferring newer 
versions of packages are now higher priority than trying to use as 
many default values as possible in multi-valued variants.
2021-06-08 16:10:49 +02:00
Tamara Dahlgren
929d1de3e5
Stand-alone/Smoke tests: copy cached test sources to test stage (#23713) 2021-05-25 07:24:32 -07:00
Massimiliano Culpo
fc2ac099cd
ASP-based solver: account for deprecated versions (#23491)
fixes #22351

The ASP-based solver now accounts for the presence
in the DAG of deprecated versions and tries to minimize
their number at highest priority.
2021-05-12 07:17:38 -07:00
Peter Scheibel
5230730941
Environments: add run deps to shell modifications (#23485)
When adding an Environment to a user's shell, Spack was only adding
root specs. This now includes run dependencies of root specs.
2021-05-11 14:30:57 -07:00
Massimiliano Culpo
2a509ea0bf
ASP-based solver: variants set from cli are considered as defaults (#23542)
Variants explicitly set in an abstract root spec are considered
as defaults for the package they refer to, and they override
what is in packages.yaml and in package.py. This is relevant
only for multi-valued variants, where a constraint may extend
an already default value.
2021-05-11 12:38:17 -07:00
Peter Josef Scheibel
320eb4872d fix check when retrieving matching spec from environment when there is a match with a root spec as well as with a dependency of a root spec 2021-05-07 10:07:53 -07:00
Zack Galbreath
295377b2b4
Don't report configure errors to CDash for successful packages (#23286)
Convert configure errors detected by our log scraper into warnings when
the package being installed reports that it was successful.
2021-04-27 12:20:32 -06:00
Massimiliano Culpo
4079bbce97 Externals are preferred even when they have non-default variant values
fixes #22596

Variants which are specified in an external spec are not
scored negatively if they encode a non-default value.
2021-03-29 16:06:11 -07:00
Massimiliano Culpo
4ed5c366fa Enforce uniqueness of the version_weight atom per node
fixes #22565

This change enforces the uniqueness of the version_weight
atom per node(Package) in the DAG. It does so by applying
FTSE and adding an extra layer of indirection with the
possible_version_weight/2 atom.

Before this change it may have happened that for the same
node two different version_weight/2 were in the answer set,
each of which referred to a different spec with the same
version, and their weights would sum up.

This led to unexpected results, like preferring to build a
new version of an external if the external version was
older.
2021-03-29 16:06:11 -07:00
Massimiliano Culpo
35c3a25ca6
ASP-based solver: model disjoint sets for multivalued variants (#22534)
* ASP-based solver: avoid adding values to variants when they're set

fixes #22533
fixes #21911

Added a rule that prevents any value from slipping into a variant when the
variant is set explicitly. This is relevant for multi-valued variants,
in particular for those that have disjoint sets of values.

* Ensure disjoint sets have a clear semantics for external packages
2021-03-26 09:22:38 -05:00
Harmen Stoppels
1659beb220
Fix spack graph when deptypes are filtered (#22121) 2021-03-08 16:47:00 -08:00
Michael Kuron
4453058862
Old concretizer: prevent unexpected propagation of external config (#20976)
When using an external package with the old concretizer, all
dependencies of that external package were severed. This was not
performed bidirectionally though, so for an external package W with
a dependency on Z, if some other package Y depended on Z, Z could
still pull properties (e.g. compiler) from W since it was not
severed as a parent dependency.

This performs the severing bidirectionally, and adds tests to
confirm expected behavior when using config from DAG-adjacent
packages during concretization.
2021-02-25 15:42:40 -08:00
Nathan Hanford
8ef67e2b15
New splice method in class Spec. (#20262)
* Spec.splice feature

Construct a new spec with a dependency swapped out. Currently can only swap dependencies of the same name, and can only apply to concrete specs.

This feature is not yet attached to any install functionality, but will eventually allow us to "rewire" a package to depend on a different set of dependencies.

Docstring is reformatted for git below

Splices dependency "other" into this ("target") Spec, and return the result as a concrete Spec.

If transitive, then other and its dependencies will be extrapolated to a list of Specs and spliced in accordingly.

For example, let there exist a dependency graph as follows:

        T
        | \
        Z<-H

In this example, Spec T depends on H and Z, and H also depends on Z.

Suppose, however, that we wish to use a differently-built H, known as H'. This function will splice in the new H' in one of two ways:

1. transitively, where H' depends on the Z' it was built with, and the new T* also directly depends on this new Z', or
2. intransitively, where the new T* and H' both depend on the original Z.

Since the Spec returned by this splicing function is no longer deployed the same way it was built, any such changes are tracked by setting the build_spec to point to the corresponding dependency from the original Spec.
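A hedged usage sketch of the new method, reusing the T/H/Z names from the docstring (these are illustrative, not real packages):

```python
import spack.spec

# T depends on H and Z; H also depends on Z (see the diagram above).
t = spack.spec.Spec("t").concretized()
h_prime = spack.spec.Spec("h@develop").concretized()  # a differently-built H

# transitive=True: the spliced T* also picks up H's own Z'.
# transitive=False: T* keeps the original Z and only H is swapped.
t_star = t.splice(h_prime, transitive=True)

# Provenance of how T was originally built is preserved on the result.
print(t_star.build_spec)
```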

Co-authored-by: Nathan Hanford <hanford1@llnl.gov>
2021-02-23 13:56:00 -08:00
Massimiliano Culpo
7226bd64dc
Improve error message for inconsistencies in package.py (#21811)
* Improve error message for inconsistencies in package.py

Sometimes directives refer to variants that do not exist.
Make it such that:

1. The name of the variant
2. The name of the package which is supposed to have
   such a variant
3. The name of the package making this assumption

are all printed in the error message for easier debugging.

* Add unit tests
2021-02-22 19:09:43 -08:00
Massimiliano Culpo
f2e3edf6db
Testing: use spack.store.use_store everywhere (#21656)
Keep spack.store.store and spack.store.db consistent in unit tests

* Remove calls to monkeypatch for spack.store.store and spack.store.db:
  tests that used these called one or the other, which lead to
  inconsistencies (the tests passed regardless but were fragile as a
  result)
* Fixtures making use of monkeypatch with mock_store now use the
  updated use_store function, which sets store.store and store.db
  consistently
* subprocess_context.TestState now serializes and restores
  spack.store.store (without the monkeypatch changes this
  would have created inconsistencies)
2021-02-18 13:22:49 -08:00
Adam J. Stewart
df5992293a
Python: add maintainer(s) (#21125)
* Python: add maintainer(s)

* Fix unit tests
2021-02-04 10:00:21 -06:00
Nathan Hanford
ebc871abbf
[WIP] relocate.py: parallelize test replacement logic (#19690)
* sbang pushed back to callers;
star moved to util.lang

* updated unit test

* sbang test moved; local tests pass

Co-authored-by: Nathan Hanford <hanford1@llnl.gov>
2021-01-20 09:17:47 -08:00
Massimiliano Culpo
0bd76ac9e6 concretizer: require at least a dependency type to say the dependency holds
fixes #20784

Similarly to the previous bug, here we were deducing
conditions to be imposed on nodes that were not part
of the DAG.
2021-01-12 22:23:39 -08:00
Massimiliano Culpo
ed8fe68cf2 concretizer: dependency conditions cannot hold if package is external
fixes #20736

Before this one line fix we were erroneously deducing
that dependency conditions hold even if a package
was external.

This may result in answer sets that contain imposed
conditions on a node without the node being present
in the DAG, hence #20736.
2021-01-12 22:23:39 -08:00
Todd Gamblin
a8ccb8e116 copyrights: update all files with license headers for 2021
- [x] add `concretize.lp`, `spack.yaml`, etc. to licensed files
- [x] update all licensed files to say 2013-2021 using
      `spack license update-copyright-year`
- [x] appease mypy with some additions to package.py that needed
      for oneapi.py
2021-01-02 12:12:00 -08:00
Todd Gamblin
1571d6240b style: ensure that all packages pass spack style -a
- fix trailing whitespace and other issues uncovered by better flake8
  checking.

- fix extra whitespace printed by `spack style` command
2020-12-23 16:17:54 -08:00
Andrew W Elble
a90026fb89
concretizer: try hard to obtain all needed variant_possible_value()'s (#20102)
Track all the variant values mentioned when emitting constraints, validate them
and emit a fact that allows them as possible values.

This modification ensures that open-ended variants (variants accepting any string 
or any integer) are projected to the finite set of values that are relevant for this 
concretization.
2020-12-08 15:46:52 +01:00
vvolkl
ed258ca9e9
Add "spack versions --new" flag to only show new versions (#20030)
* [cmd versions] add spack versions --new flag to only fetch new versions

format

[cmd versions] rename --latest to --newest and add --remote-only

[cmd versions] add tests for --remote-only and --new

format

[cmd versions] update shell tab completion

[cmd versions] remove test for --remote-only --new which gives empty output

[cmd versions] final rename

format

* add brillig mock package

* add test for spack versions --new

* [brillig] format

* [versions] increase test coverage

* Update lib/spack/spack/cmd/versions.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Update lib/spack/spack/cmd/versions.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-12-07 09:29:10 -06:00
Massimiliano Culpo
3843f43e69
concretizer: each external version is allowed by definition (#20247)
Registering external versions among the lists of allowed ones
generates the correct rules for `version_satisfies`
2020-12-06 10:29:05 +01:00
Massimiliano Culpo
8b74b50cff
concretizer: restrict maximizing variant values to MV variants (#20194) 2020-12-04 16:27:03 +01:00
Andrew W Elble
09aae616c7
concretizer: call inject_patches_variants() on the roots of the specs (#20203)
As was done in the old concretizer. Fixes an issue where conditionally
patched dependencies did not show up in spec (gdal+jasper)
2020-12-03 16:28:34 +01:00
Massimiliano Culpo
e2033566bf
concretizer: remove ad-hoc rule for external packages (#20193)
fixes #20040

Matching compilers among nodes has been prioritized
in #20020. Selection of default variants has been
tuned in #20182. With this setup there is no need
to have an ad-hoc rule for external packages. On
the contrary it should be removed to prefer having
default variant values over more external nodes in
the DAG.
2020-12-01 10:11:40 +01:00
Massimiliano Culpo
7fd777c3d9
concretizer: swap priority of selecting provider and default variant (#20182)
refers #20040

Before this PR optimization rules would have selected default
providers at a higher priority than default variants. Here we
swap this priority and we consider variants that are forced by
any means (root spec or spec in depends_on clause) the same as
if they had a default value.

This prevents the solver from avoiding expected configurations
just because they contain directives like:

depends_on('pkg+foo')

and `+foo` is not the default variant value for pkg.
2020-12-01 07:45:48 +01:00
Massimiliano Culpo
8dd3797d32
concretizer: treat target ranges in directives correctly (#19988)
fixes #19981

This commit adds support for target ranges in directives,
for instance:

conflicts('+foo', when='target=x86_64:,aarch64:')

If any target in a spec body is not a known target the
following clause will be emitted:

node_target_satisfies(Package, TargetConstraint)

when traversing the spec and a definition of
the clause will then be printed at the end similarly
to what is done for package and compiler versions.
2020-11-27 20:53:39 +01:00
Massimiliano Culpo
03ff89fee6
concretizer: prioritize matching compilers over newer versions (#20020)
fixes #20019

Before this modification having a newer version of a node came
at higher priority in the optimization than having matching
compilers. This could result in unexpected configurations for
packages with conflict directives on compilers of the type:

conflicts('%gcc@X.Y:', when='@:A.B')

where changing the compiler for just that node is preferred to
lower the node version to less than 'A.B'. Now the priority has
been switched so the solver will try to lower the version of the
nodes in question before changing their compiler.
2020-11-26 13:10:48 +01:00
Massimiliano Culpo
8991cc4632
concretizer: allow a bool to be passed as argument for tests dependencies (#20082)
refers #20079

Added docstrings to 'concretize' and 'concretized' to
document the format for tests.

Added tests for the activation of test dependencies.
2020-11-26 08:55:17 +01:00
Massimiliano Culpo
983fb11dee
concretizer: treat conditional providers correctly (#20086)
refers #20040

This modification emits rules like:

provides_virtual("netlib-lapack","blas") :- variant_value("netlib-lapack","external-blas","False").

for packages that provide virtual dependencies conditionally instead
of a fact that doesn't account for the condition.
2020-11-25 22:03:42 +01:00
Greg Becker
77b2e578ec
spack test (#15702)
Users can add test() methods to their packages to run smoke tests on
installations with the new `spack test` command (the old `spack test` is
now `spack unit-test`). spack test is environment-aware, so you can
`spack install` an environment and then run `spack test run` to run smoke
tests on all of its packages. Historical test logs can be perused with
`spack test results`. Generic smoke tests are included for MPI implementations,
C, C++, and Fortran compilers, as well as specific smoke tests for 18
packages.

Inside the test method, individual tests can be run separately (and
continue to run best-effort after a test failure) using the `run_test`
method. The `run_test` method encapsulates finding test executables,
running and checking return codes, checking output, and error handling.

This handles the following trickier aspects of testing with direct
support in Spack's package API:

- [x] Caching source or intermediate build files at build time for
      use at test time.
- [x] Test dependencies,
- [x] packages that require a compiler for testing (such as library only
      packages).

See the packaging guide for more details on using Spack testing support.
Included is support for package.py files for virtual packages. This does
not change the Spack interface, but is a major change in internals.
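A hedged sketch of a stand-alone test method using `run_test`; the executable name and expected output are illustrative:

```python
from spack.package import *


class Example(Package):
    """Hypothetical package with a smoke test."""

    def test(self):
        # Run an installed executable, checking its return code and that the
        # output contains the expected string.
        self.run_test(
            "example",
            options=["--version"],
            expected=["Example version"],
            purpose="check that the installed binary reports its version",
        )
```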

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
Co-authored-by: wspear <wjspear@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-11-18 02:39:02 -08:00
Massimiliano Culpo
7ffad278d3 concretizer: modified weights for providers and matching for externals
This commit addresses the case of concretizing a root spec with a
transitive conditional dependency on a virtual package, provided
by an external. Before these modifications default variant values
for the dependency bringing in the virtual package were not
respected, and the external package providing the virtual was added
to the DAG.

The issue stems from two facts:
- Selecting a provider has higher precedence than selecting default variants
- To ensure that an external is preferred, we used a negative weight

To solve it we shift all the providers weight so that:
- External providers have a weight of 0
- Non external provider have a weight of 10 or more

Using a weight of zero for external providers is such that having
an external provider, if present, or not having a provider at all
has the same effect on the higher priority minimization.

Also fixed a few minor bugs in concretize.lp, that were causing
spurious entries in the final answer set.

Cleaned concretize.lp from leftover rules.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
9a03fd2834 concretizer: don't require a provider for virtual deps if spec is external
This commit introduces a new rule:

real_node(Package) :- not external(Package), node(Package).

that permits distinguishing between an external node and a
real node that shouldn't have its dependencies trimmed. It solves the
case of concretizing ninja with an external Python.
2020-11-17 10:04:13 -08:00
Massimiliano Culpo
e0ae60edc4 Added unit tests for regressions on open concretizer bugs 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
930b05fab4 Add unit tests for dependencies being patched by parent 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
e7208b1598 tests: verify handling of dependencies conditional on other dependencies 2020-11-17 10:04:13 -08:00
Massimiliano Culpo
d00e8394f8 concretizer: handle conflicts with compiler ranges correctly
As reported, conflicts with compiler ranges were not treated
correctly. This commit adds tests to verify the expected behavior
for the new concretizer.

The new rules to enforce a correct behavior involve:
- Adding a rule to prefer the compiler selected for
  the root package, if no other preference is set
- Give a strong negative weight to compiler preferences
  expressed in packages.yaml
- Maximize on compiler AND compiler version match
2020-11-17 10:04:13 -08:00
Tamara Dahlgren
6fa6af1070
Support parallel environment builds (#18131)
As of #13100, Spack installs the dependencies of a _single_ spec in parallel.
Environments, when installed, can only get parallelism from each individual
spec, as they're installed in order.  This PR makes entire environments build
in parallel by extending Spack's package installer to accept multiple root
specs.  The install command and Environment class have been updated to use
the new parallel install method.

The specs and kwargs for each *uninstalled* package (when not force-replacing
installations) of an environment are collected, passed to the `PackageInstaller`,
and processed using a single build queue.

This introduces a `BuildRequest` class to track install arguments, and it
significantly cleans up the code used to track package ids during installation.
Package ids in the build queue are now just DAG hashes, as you would expect.

Other tasks:

- [x] Finish updating the unit tests based on `PackageInstaller`'s use of
      `BuildRequest` and the associated changes
- [x] Change `environment.py`'s `install_all` to use the `PackageInstaller` directly
- [x] Change the `install` command to leverage the new installation process for multiple specs
- [x] Change install output messages for external packages, e.g.:
       `[+] /usr` -> `[+] /usr (external bzip2-1.0.8-<dag-hash>)`
- [x] Fix incomplete environment install's view setup/update and not confirming all 
       packages are installed (?)
- [x] Ensure externally installed package dependencies are properly accounted for in 
       remaining build tasks
- [x] Add tests for coverage (if insufficient and can identity the appropriate, uncovered non-comment lines)
- [x] Add documentation
- [x] Resolve multi-compiler environment install issues
- [x] Fix issue with environment installation reporting (restore CDash/JUnit reports)
2020-11-17 02:41:07 -08:00