Compare commits


781 Commits

Author SHA1 Message Date
John W. Parent
bf14b424bb proj: correct CMake arg for shared build with proj older than 7.0.0 (#43089)
* proj: correct CMake arg for shared build with proj older than 7.0.0

* Actually use new CMake arg

* Update var/spack/repos/builtin/packages/proj/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-03-09 00:40:27 -07:00
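As a sketch, the version-conditional CMake argument described above could look like this in the recipe; the `shared` variant name is an assumption, and the `BUILD_LIBPROJ_SHARED` vs. `BUILD_SHARED_LIBS` split follows the option rename implied by the commit title:

```python
# Hedged sketch: proj older than 7.0.0 used its own option name for
# shared builds, so pick the CMake flag based on the version.
def cmake_args(self):
    if self.spec.satisfies("@:6"):
        return [self.define_from_variant("BUILD_LIBPROJ_SHARED", "shared")]
    return [self.define_from_variant("BUILD_SHARED_LIBS", "shared")]
```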
Arne Becker
14209a86a6 perl-bio-eutilities and deps: new packages (#42869)
This adds Spack packages for these Perl distributions:
- Bio::DB::EUtilities and its dependencies:
- Bio::ASN1::EntrezGene
- Bio::Cluster
- Bio::Variation
2024-03-08 10:42:26 -08:00
Arne Becker
b7d9900764 perl-test-warn: update version, dependencies (#42901)
* perl-test-warn: new package
  Adds Test::Warn
* Revive older version
* perl-test-warn: fix URL and deprecate old version
2024-03-08 10:40:20 -08:00
Arne Becker
bc155e7b90 perl-starman and deps: new packages (#42958)
Adds Starman and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Starman
2024-03-08 10:36:20 -08:00
Massimiliano Culpo
65f9ba345f gnupg: add v2.4.5, remove deprecated versions (#43090)
Also updates dependencies:
  libassuan: add v2.5.7
  libgcrypt: deprecate end of life versions
2024-03-08 10:20:55 -08:00
Dave Keeshan
ca49bc5652 verible: Add revision v0.0-3607-g46de0f64 (#43086) 2024-03-08 10:18:39 -08:00
snehring
b84b85a7e0 visit: link against hip when built with vtk-m+rocm (#42452) 2024-03-08 19:09:35 +01:00
John W. Parent
016cdba16f Curl package: update Windows support (#42752)
* Properly specify Curl builder interface for static vs shared curl
  with NMake system to ensure all built curls export expected
  symbols.
* Symlinks curl library build artifact to more idiomatic name for
  FindCurl.cmake implementations and other NMake consumers.
2024-03-08 10:08:12 -08:00
Jonas Eschle
4806e6549f Add package zfit (#42667)
* Add package zfit (WIP)

* Add package zfit

* Add package zfit

* add maintainer

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

* enh: add extras, 0.18.1

* fix: add default

* fix:  typo

* fix:  typo

* chore: cleanup arguments

* chore: cleanup arguments

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* chore: cleanup arguments

* fix:  typo

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-03-08 11:04:35 -07:00
Adam J. Stewart
c14b277150 py-fiona: add v1.9.6 (#43099) 2024-03-08 10:32:15 -07:00
John W. Parent
919025d9f3 VTK package: use CMake config provided by PROJ (#43088)
PROJ can install its own config file: VTK should prefer that over manual detection.
2024-03-08 10:32:00 -07:00
Adam J. Stewart
52f57c90eb Retiring as PythonPackage maintainer (#43091) 2024-03-08 18:29:24 +01:00
fgava90
ee1fa3e50c Deprecating py-pylint@2.3 as it cannot build with python@3.8: (#43096)
* Deprecating py-pylint@2.3 as it cannot build with python@3.8:

* Style fix

* Removed versions because they can't build with python@3.7

---------

Co-authored-by: Gava, Francesco <francesco.gava@mclaren.com>
2024-03-08 07:43:46 -07:00
jmlapre
772928241b py-cheap-repr: new package (#42944)
* py_cheap_repr: add initial package.py

* Update var/spack/repos/builtin/packages/py-cheap-repr/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-cheap-repr: use pypi link instead

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-03-08 06:45:25 -06:00
Martin Lang
7440bb4c36 bigdft-psolver: new versions 1.9.3, 1.9.4 (#42644) 2024-03-08 07:22:15 +01:00
Tom Payerle
c464866deb sfcgal: add v1.5.1, and restrict versions of cgal dependency (#42278)
Added new versions @1.5.1 and @1.4.1 (sfcgal moved from github to gitlab)

Placed restrictions on which versions of cgal are supported by different
sfcgal versions.  These restrictions are based on what was found in the
version history at https://gitlab.com/sfcgal/SFCGAL/-/blob/v1.5.1/NEWS?ref_type=tags
as well as the CMakeLists.txt for different versions.

@1.4 and @1.5 seem to require a specific version of cgal based on CMakeLists.txt.

Earlier versions (@1.3.8:1.3.10) claim to support cgal@4.3: (but the Spack recipe
claims they did not work until @4.7, so sticking with that as the minimum).  CMakeLists.txt
suggests they support cgal@5 as well, but the version history suggests otherwise.
2024-03-08 07:08:34 +01:00
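Constraints of that kind are expressed with version-ranged `depends_on` directives; a sketch, where the cgal pins for @1.4 and @1.5 are placeholders rather than the values from the actual recipe:

```python
# Illustrative only: restricting cgal per sfcgal version range.
depends_on("cgal@4.7:4", when="@1.3.8:1.3.10")  # @4.3: claimed upstream; @4.7: per the recipe notes
depends_on("cgal@5.3", when="@1.4")  # placeholder for the exact required version
depends_on("cgal@5.6", when="@1.5")  # placeholder for the exact required version
```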
Tom Payerle
799a8a5090 sfcgal: add custom libs property (#42276) 2024-03-08 07:06:38 +01:00
m-shunji
c218ee50e9 libint: add link option for fujitsu compiler (#42233)
Co-authored-by: inada-yoshie <inada.yoshie@fujitsu.com>
2024-03-08 07:03:18 +01:00
Teague Sterling
8ff7a20320 duckdb: add v0.10.0, overhaul of package recipe (#38922)
* Expanding the duckdb package to fix the version number (required for external extensions to work) being pulled from git, and to add variants for the built-in extensions at build time. This also changes the build system from CMakePackage to MakefilePackage (as advised by upstream).

* - Reorganized and cleaned up variants
 - Updated the patch to work for 0.10.0+
 - Removed (or made non-default) some unnecessary variants
 - Added ninja as the default generator
 - Set up some shared library dependencies.
2024-03-08 06:32:29 +01:00
Dave Keeshan
e3fe6bc0f7 Add 5.022 and cleanup how autoconf is done (#43085) 2024-03-07 18:29:27 -07:00
Sreenivasa Murthy Kolam
c6fcb1068f Enable tensorflow-2.11 support for ROCm (#38520)
* enable tensorflow-2.11 support for ROCm

* add latest sha for mesa and limit the patches to older versions; similar
changes in #37910 to enable gitlab-ci to pass

* address review comments
2024-03-07 15:23:40 -07:00
Weiqun Zhang
54ac3e72ed amrex: add v24.03 (#43083) 2024-03-07 13:59:29 -08:00
Matthew Thompson
274fbebc4c Update GFE packages (#43082)
* Update GFE packages
* Add pflogger 1.13.1
2024-03-07 13:47:33 -08:00
Tim Haines
d40eb19918 Dyninst: add v13.0.0 (#43068) 2024-03-07 13:43:18 -08:00
dependabot[bot]
31de670bd2 build(deps): bump dorny/paths-filter from 3.0.1 to 3.0.2 (#43000)
Bumps [dorny/paths-filter](https://github.com/dorny/paths-filter) from 3.0.1 to 3.0.2.
- [Release notes](https://github.com/dorny/paths-filter/releases)
- [Changelog](https://github.com/dorny/paths-filter/blob/master/CHANGELOG.md)
- [Commits](ebc4d7e9eb...de90cc6fb3)

---
updated-dependencies:
- dependency-name: dorny/paths-filter
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-07 11:54:24 -08:00
Harmen Stoppels
6c0961549b zlib-ng: 2.1.6 (#43060) 2024-03-07 12:35:36 -07:00
Tim Fuller
c090bc5ebe Drop optional dependencies of Spack (#43081)
Remove dependency on `importlib_metadata` and `pkg_resources`, which can be problematic if the version in PYTHONPATH is incompatible with the interpreter Spack is running under.
2024-03-07 17:52:49 +00:00
renjithravindrankannath
bca4d37d76 hip: update patch PARAMETERS_MIN_ALIGNMENT for 6.0 (#42834) 2024-03-07 18:31:23 +01:00
Sreenivasa Murthy Kolam
9b484d2eea fix build issue with rocm-openmp-extras for 5.4.3 release (#43046)
* fix build issue with rocm-openmp-extras for 5.4.3 release
https://github.com/spack/spack/issues/36930
* fix style errors
2024-03-07 10:23:07 -07:00
Adam J. Stewart
a57b0e1e2d py-lightning: add v2.2.1 (#43042) 2024-03-07 09:22:52 -08:00
Luc Berger
e3cb3b29d9 Kokkos Kernels: adding missing TPLs and pre-conditions (#43043)
* Kokkos Kernels: adding missing TPLs and pre-conditions
  Adding variants and dependencies for rocBLAS and rocSPARSE.
  Also adding a "when=" clause to the TPL variants that prevents
  enabling the TPLs in versions of the library where they were not
  yet available.
* Kokkos Kernels: remove comment for better format
* Kokkos Kernels: fix issue with unpacking wrong number of args
  After changing the tpls dictionary we need to update the unpacking
  logic to catch the right number of outputs out of it!
* Kokkos Kernels: updating doc string for tpls var and using f-string
  Improving comment a bit and switching to f-string for more readability.
  When setup to do more testing will try to use f-string in the CMake
  options generation part of the package.
* Style change
2024-03-07 09:00:10 -08:00
Mark W. Krentel
ac48ecd375 meson: add version 1.3.2 (#43036) 2024-03-07 09:56:20 -07:00
Juan Miguel Carceller
0bb20d34db autotools: fix a typo in comment (#43080)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-03-07 16:53:14 +00:00
Eric Berquist
971fda5c33 sst-core: use ncurses for interactive sst-info (#43063)
* sst-core now effectively depends on ncurses

* use --with-curses

* sst-core: update comment about ncurses

* should have curses for build, link, and run
2024-03-07 10:45:47 -06:00
Mikael Simberg
dcc4423a9d pika: Add 0.23.0 (#43077) 2024-03-07 16:06:54 +01:00
runiq
82c380b563 Fix spack find bootstrapping docs (#43074)
Closes #43052.

Maybe moving the argument to the `find` subcommand is a good idea, but I
just wanted to get the docs fix out.

Co-authored-by: Patrice Peterson <patrice.peterson@itz.uni-halle.de>
2024-03-07 14:13:32 +01:00
Adam J. Stewart
8bcf6a31ae ML CI: variants are now required (#42851) 2024-03-07 11:42:13 +01:00
Vanessasaurus
ddd88e266a Automated deployment to update package flux-core 2024-03-07 (#43067) 2024-03-07 11:13:31 +01:00
Martin Aumüller
08c597d83e botan: add v3.3.0, v2.9.14 (#43047) 2024-03-06 09:29:04 -08:00
Martin Aumüller
bf5340755d embree: add v4.3.1 (#43048) 2024-03-06 09:25:22 -08:00
Satish Balay
f8e70a0c96 llvm: add 18.1.0 (#43049) 2024-03-06 17:36:03 +01:00
Tim Fuller
7e468aefd5 Allow loading extensions through python entry-points (#42370)
This PR adds the ability to load spack extensions through `importlib.metadata` entry 
points, in addition to the regular configuration variable.

It requires Python 3.8 or greater to be properly supported.
2024-03-06 11:18:49 +01:00
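A minimal sketch of that discovery mechanism with `importlib.metadata`; the group name `spack.extensions` is assumed here for illustration, and the branch handles the API change between Python 3.8/3.9 and 3.10+:

```python
import importlib.metadata

def discover_extensions(group: str = "spack.extensions"):
    # entry_points() returns a dict of lists on Python 3.8/3.9 and a
    # selectable EntryPoints object on 3.10+, so support both shapes.
    eps = importlib.metadata.entry_points()
    selected = eps.select(group=group) if hasattr(eps, "select") else eps.get(group, [])
    return [ep.load() for ep in selected]
```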
John W. Parent
e685d04f84 netcdf-c package: Skip problematic post install on Windows (#43039)
#42878 adds a post install filter of the netCDFConfig.cmake file
that replaces a valid CMake target on Windows with an invalid one.
Don't do this replacement on Windows.
2024-03-05 21:44:36 -07:00
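A hedged sketch of the guard, using Spack's run_after/filter_file helpers; the config file name comes from the message, but its directory and the filtered strings are placeholders rather than the exact substitution from #42878:

```python
import sys

@run_after("install")
def fix_cmake_config(self):
    # On Windows the original CMake target is the valid one, so skip.
    if sys.platform == "win32":
        return
    cfg = join_path(self.prefix.lib, "cmake", "netCDF", "netCDFConfig.cmake")  # path illustrative
    filter_file("BROKEN_TARGET", "VALID_TARGET", cfg)  # placeholder strings
```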
Adam J. Stewart
9d962f55b0 spack.patch: fix type hint circular import (#43041) 2024-03-05 17:26:44 -08:00
renjithravindrankannath
00d3066b97 migraphx: add v6.0.0 & v6.0.2 (#42918)
* Updates for migraphx 6.0.0 & 6.0.2
* Style check error and audit check error fix
* Adding patch for half-include-directory
* The parameter GPU_TARGETS is used from 5.7 in migraphx
* Adding rocmlir dependency in migraphx and 6.0 updates in rocmlir
* Applying upcoming changes to make CK JIT optional and enable
  compilation on Windows in order to build without ck dependency
2024-03-05 15:16:31 -08:00
Martin Lang
5ca0dcecb2 mpfr: missing dependency for version 4.0.1 (#43012)
* mpfr: missing dependency for version 4.0.1
  mpfr 4.0.1 (like 4.0.2) needs autoconf-archive, which provides the
  AX_PTHREAD macro it uses
* autoconf-archive is also required for mpfr@4.0.0
2024-03-05 15:11:31 -08:00
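In recipe terms the fix is a single build-time dependency; a sketch (the range in the real package may differ):

```python
# autoconf-archive provides the AX_PTHREAD macro needed by these releases.
depends_on("autoconf-archive", type="build", when="@4.0.0:4.0.2")
```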
Pranav Sivaraman
fa8fb7903b nvptx-tools: add v2023-09-13 (#40897)
New changes have been made to nvptx-tools that address dropping support
for sm_30 in later CUDA versions (12.0+).

Also refactor gcc to make nvptx-tools a dependency instead of a resource.

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-03-05 22:51:40 +01:00
Adam J. Stewart
f35ff441f2 spack.patch: add type hints (#42811)
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-03-05 13:19:43 -08:00
Arne Becker
9ea9ee05c8 perl-catalyst-view-json: new package (#42981)
Adds Catalyst::View::JSON
2024-03-05 12:04:38 -08:00
Arne Becker
24ddc49c1b perl-bsd-resource: new package (#42982)
Adds BSD::Resource
2024-03-05 12:03:52 -08:00
Arne Becker
2f4266161c perl-chart-gnuplot: new package (#42983)
Adds Chart::Gnuplot
2024-03-05 12:03:00 -08:00
Arne Becker
7bcb0fff7d perl-config-inifiles: new package (#42984)
Adds Config::IniFiles
2024-03-05 12:02:06 -08:00
Arne Becker
cd332c6370 perl-plack-middleware-assets and deps: new packages (#42990)
This adds Plack::Middleware::Assets and its missing deps:
- CSS::Minifier::XS
- JavaScript::Minifier::XS
- Test::DiagINC
2024-03-05 11:59:52 -08:00
Arne Becker
6daf9677f3 perl-plack-middleware-crossorigin: new package (#42991)
Adds Plack::Middleware::CrossOrigin
2024-03-05 11:59:25 -08:00
G-Ragghianti
cb6450977d Removing application of the ldconfig patch because it conflicts with (#43001)
gdrcopy version 2.4.1
2024-03-05 11:57:24 -08:00
Vanessasaurus
bf62ac0769 Automated deployment to update package flux-sched 2024-03-05 (#43005)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-03-05 11:53:29 -08:00
Sergey Kosukhin
0223fe746b serialbox: fix PYTHONPATH (#43034) 2024-03-05 11:48:18 -08:00
Arne Becker
12fba13441 perl-chi-driver-memcached: new package (#43011)
Adds CHI::Driver::Memcached.

Modified perl-chi to make the tests work. Testing modules in perl-chi
were not loaded when testing CHI::Driver::Memcached, so added the "run"
type to these.
2024-03-05 11:46:57 -08:00
Arne Becker
0c44f5a140 perl-html-template: new package (#43017)
Adds HTML::Template
2024-03-05 11:45:14 -08:00
Arne Becker
f4853790c5 perl-catalyst-devel and deps: new packages (#43013)
This adds Catalyst::Devel and its missing dependencies:
- Catalyst::Plugin::ConfigLoader
- Catalyst::Plugin::Static::Simple
- File::ChangeNotify
2024-03-05 11:44:53 -08:00
Arne Becker
9ed2e396f4 perl-plack-middleware-deflater: new package (#43014)
Adds Plack::Middleware::Deflater
2024-03-05 11:43:39 -08:00
Arne Becker
3ee6fc937e perl-data-predicate: new package (#43015)
* perl-data-predicate: new package
Adds Data::Predicate
* Added description
2024-03-05 11:42:48 -08:00
Arne Becker
c9b6cc9a58 perl-exporter-auto: new package (#43016)
Adds Exporter::Auto
2024-03-05 11:41:46 -08:00
Arne Becker
58b394bcec perl-list-compare: new package (#43018)
Adds List::Compare
2024-03-05 11:39:05 -08:00
Arne Becker
4d89eeca9b perl-sereal and deps: new packages (#43022)
* perl-sereal and deps: new packages
This adds Sereal and its missing dependencies:
- Sereal::Encoder
- Sereal::Decoder
* Add missing files.
2024-03-05 11:38:24 -08:00
Arne Becker
bfc71e9dae perl-string-approx: new package (#43023)
Adds String::Approx
2024-03-05 11:35:04 -08:00
Arne Becker
f061dcda74 perl-string-numeric: new package (#43024)
Adds String::Numeric
2024-03-05 11:34:03 -08:00
Arne Becker
cc460894fd perl-test-file-contents: new package (#43025)
Adds Test::File::Contents
2024-03-05 11:32:50 -08:00
Arne Becker
5e09660e87 perl-test-mockobject and deps: new packages (#43026)
Adds Test::MockObject and its dependencies.
Installed OK with build-time tests. Added dependencies:
- UNIVERSAL::can
- UNIVERSAL::isa
2024-03-05 11:31:59 -08:00
Arne Becker
5a8efb3b14 perl-test-perl-critic: new package (#43027)
Adds Test::Perl::Critic
2024-03-05 11:30:12 -08:00
Arne Becker
99002027c4 perl-test-pod-coverage and deps: new packages (#43028)
Adds Test::Pod::Coverage and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Pod::Coverage
2024-03-05 11:28:28 -08:00
Arne Becker
a247879be3 perl-tie-ixhash: new package (#43029)
Adds Tie::IxHash
2024-03-05 11:25:59 -08:00
Sergey Kosukhin
7b46993fed eccodes: drop redundant build dependency (#43035) 2024-03-05 11:25:00 -08:00
Arne Becker
dd59f4ba34 perl-text-csv-xs: new package (#43030)
Adds Text::CSV_XS
2024-03-05 11:22:38 -08:00
Arne Becker
18ab14e659 perl-xml-hash-xs: new package (#43031)
Adds XML::Hash::XS
2024-03-05 11:20:00 -08:00
Massimiliano Culpo
28eb5e1bf6 archspec: add v0.2.3 (#43009) 2024-03-05 20:09:46 +01:00
Sergey Kosukhin
c658ddbfa3 icon: add new package (#43037) 2024-03-05 12:03:38 -07:00
Massimiliano Culpo
12963c894f libksba: add v1.6.6, remove deprecated versions (#43006) 2024-03-05 19:30:39 +01:00
Mark W. Krentel
61fa12508f hpcviewer: add version 2024.02 (#42997) 2024-03-05 10:15:32 -08:00
Martin Lang
daf6acef6e py-numpy: patch for AVX512 build flags on Intel Classic Compiler (#43020) 2024-03-05 17:27:05 +01:00
Alex Richert
d30621e787 dakota: make python dependency optional, add v6.19 (#42914)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-03-05 12:57:13 +01:00
Wouter Deconinck
dd4b365608 container: don't map develop to latest (#42952)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-03-05 10:47:04 +01:00
Massimiliano Culpo
157d47fc5a ASP-based solver: improve reusing nodes with gcc-runtime (#42408)
* ASP-based solver: improve reusing nodes with gcc-runtime

This PR skips emitting dependency constraints on "gcc-runtime",
for concrete specs that are considered for reuse.

Instead, an appropriate version of gcc-runtime is recomputed
considering also the concrete nodes from reused specs.

This ensures that root nodes in a DAG always have a runtime version
greater than or equal to that of their dependencies.

* Add unit-test for view with multiple runtimes
* Select latest version of runtimes in views
* Construct result keeping track of latest
* Keep ordering stable, just in case
2024-03-04 22:46:28 -08:00
Massimiliano Culpo
13daa1b692 libgpg-error: add v1.48 (#42872) 2024-03-05 05:56:50 +01:00
Mosè Giordano
f923e650f9 py-pythran: @:0.12.1 is incompatible with python@3.11: (#42994)
Ref: https://github.com/serge-sans-paille/pythran/issues/2101 and https://github.com/scipy/scipy/issues/18390.
2024-03-04 18:28:17 -07:00
Victoria Cherkas
1a1bbb8af2 metkit: add git url (#42867) 2024-03-04 21:15:03 +01:00
Victoria Cherkas
594fcc3c80 metkit: add additional eckit dependency constraint (#42871) 2024-03-04 21:14:09 +01:00
Tom Payerle
76ec19b26e hdf-eos2: support version @3.0, plus assorted fixes (#41782)
1) support for version @3.0
Unfortunately, the download seems to require registration now,
so using the manual_download mechanism for @3:

2) copying the hdf-eos5 patch from @vanderwb to enable
use of Spack compiler wrappers instead of h4cc

3) Patching an issue in the hdf-eos2 configure script.  The
script tests for the jpeg and libz libraries, succeeds,
appends HAVE_LIBJPEG=1, etc. to confdefs.h, and then aborts
because HAVE_LIBJPEG is not set in the running environment.

4) Add some LDFLAGS to the build environment.  Otherwise the
test script seems to fail to build due to the rpc dependence
in HDF4.
2024-03-04 21:08:19 +01:00
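Points 1) and 4) map onto standard package hooks; a sketch, where the `-ltirpc` flag is a guess at the rpc dependence and not the exact flag used:

```python
class HdfEos2(AutotoolsPackage):
    # @3: downloads require registration, so users must fetch them manually.
    manual_download = True

    def setup_build_environment(self, env):
        # Sketch: extra link flags so configure's test programs can resolve
        # HDF4's rpc symbols (the exact flag is an assumption).
        env.append_flags("LDFLAGS", "-ltirpc")
```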
Arne Becker
00baaf868e perl-perl-critic-moose and deps: new packages (#42992)
Adds Perl::Critic::Moose and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Perl::Critic
- Pod::Parser
- Perl::Tidy
- PPI
- PPIx::QuoteLike
- List::SomeUtils
- PPIx::Regexp
- B::Keywords
- PPIx::Utils
- String::Format
- Pod::Spell
- Test::SubCalls
- Test::Object
- Lingua::EN::Inflect
- Hook::LexWrap
2024-03-04 12:54:14 -07:00
Harmen Stoppels
3b06347f65 repo.py: cleanup packages_with_tags (#42980) 2024-03-04 20:50:04 +01:00
Eric Berquist
5b9e207db2 bear: add up to version 3.1.3 (#42993) 2024-03-04 12:17:24 -07:00
psakievich
d6fd9017c4 Document new environment variable expansion in projections (#42963)
Adding docs and test for #42917

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-03-04 12:17:08 -07:00
Sreenivasa Murthy Kolam
913d79238e llvm-amdgpu: remove the openmp variant. (#42807)
Add rocm-openmp-extras package as a dependency for +openmp for rocm
2024-03-04 12:16:53 -07:00
Arne Becker
250038fa9b perl-dbix-class and deps: new packages (#42986)
Adds DBIx::Class and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Class::C3::Componentised
- Data::Dumper::Concise
- Config::Any
- Context::Preserve
- Class::Accessor::Grouped
- Module::Find
- SQL::Abstract::Classic
- Class::C3
- SQL::Abstract
- Algorithm::C3
2024-03-04 12:11:38 -07:00
Arne Becker
26c553fce7 perl-net-server-ss-prefork and deps: new packages (#42989)
Adds Net::Server::SS::PreFork and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Server::Starter
- Net::Server
- HTTP::Server::Simple
2024-03-04 12:11:14 -07:00
Arne Becker
e24c242fb7 perl-gzip-faster: new package (#42988)
Adds Gzip::Faster
2024-03-04 12:10:57 -07:00
Arne Becker
ca14ce2629 perl-catalyst-action-rest: new package (#42960)
Adds Catalyst::Action::REST
2024-03-04 10:59:57 -08:00
Arne Becker
44f443946c perl-graphviz and deps: new packages (#42987)
Adds GraphViz and its dependencies.
Installed OK with build-time tests. Added dependencies:
- XML::XPath
2024-03-04 11:58:12 -07:00
Arne Becker
6e6bc89bda perl-catalyst-action-renderview and deps: new packages (#42961)
Adds Catalyst::Action::RenderView and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Catalyst::Action::RenderView
2024-03-04 10:57:50 -08:00
Arne Becker
8714ea6652 perl-catalyst-component-instancepercontext: new package (#42962)
Adds Catalyst::Component::InstancePerContext
2024-03-04 10:56:08 -08:00
Sinan
df92f0a7d4 imagemagick: add v7.0.9:7.1.1-29 (#42968)
* package/imagemagick add new version, improve
* confirmed that build fails when libsm is missing on linux

---------

Co-authored-by: Sinan81 <Sinan@world>
2024-03-04 10:52:56 -08:00
Alec Scott
d24b91157c restic: add v0.16.4 (#42956) 2024-03-04 19:52:20 +01:00
Alec Scott
1a0f77388c direnv: add v2.34.0 (#42955) 2024-03-04 19:51:47 +01:00
Harmen Stoppels
34571d4ad6 linux-pam: add missing libtirpc dep (#42976) 2024-03-04 19:29:25 +01:00
Rocco Meli
a574f40732 add namd 3.0b6 (#42979) 2024-03-04 19:06:03 +01:00
Arne Becker
d4ffe244af perl-chi and deps: new packages (#42959)
* perl-chi and deps: new packages

Adds CHI and its dependencies.
Installed OK with build-time tests. Added dependencies:
- CHI

* Add license
2024-03-04 08:59:33 -08:00
AMD Toolchain Support
e08e66ad89 AOCL: add v4.2.0 (#42920)
* AOCL: add v4.2.0
   Co-authored-by: Phil Tooley <phil.tooley@amd.com> and
                vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
* Review comments for spack community PR #42920

---------

Co-authored-by: Phil Tooley <phil.tooley@amd.com> and vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2024-03-04 08:43:27 -08:00
Vanessasaurus
0543710258 flux-core: flux-security dependency with build/link (#42985) 2024-03-04 16:48:12 +01:00
Harmen Stoppels
5d994e48d5 modules: allow autoload: run, like in environment views (#42743) 2024-03-04 08:49:45 +01:00
Harmen Stoppels
d1fa23e9c6 versions: fix typing problems (#42903)
Fix the type declaration of VersionList.versions.

Fix further problems exposed by that fix.
2024-03-04 08:38:54 +01:00
Isaac Corley
f1db8b7871 torchgeo v0.5.2 (#42967) 2024-03-03 13:57:39 -07:00
Axel Huebl
c5cca54c27 Fix mgard: OpenMP on AppleClang (#42933)
macOS AppleClang does not provide OpenMP by default with XCode.
Use LLVM's OpenMP to fix compile errors of mgard with OpenMP (default).
2024-03-03 08:57:24 +01:00
Arne Becker
a9c1648db8 perl-log-dispatch-filerotate and dep: new package (#42965)
Adds Log::Dispatch::FileRotate and its dependency:
- Log::Dispatch

Installed OK, build-time tests ran successfully.
2024-03-03 00:40:27 -07:00
Arne Becker
3bd911377e perl-catalyst-plugin-cache: new package (#42964)
Adds Catalyst::Plugin::Cache
2024-03-02 23:34:26 -07:00
otsukay
fcb2f7d3aa Remove unnecessary if statements, which are harmful since +blas+lapack variants have been removed. (#42936)
Co-authored-by: Yuichi Otsuka <otsukay@riken.jp>
2024-03-02 22:13:47 -07:00
Arne Becker
a8a9e0160a perl-test-time-hires: new package (#42957)
Adds Test::Time::HiRes
2024-03-02 18:10:08 -08:00
jfavre
9ca6aaeafd libcatalyst: add fortran & python variants (#42941)
* Update package.py

Adding two variants 'fortran' and 'python' to enable language wrappings

* Update package.py

remove extra space
2024-03-02 17:55:46 -08:00
Adam J. Stewart
aed5c39312 py-numpy: add v1.26.4 (#42515) 2024-03-02 21:56:43 +01:00
Chris Marsh
eb36bb2a8f netcdf-c package: correct library names (#42878)
An incorrect hdf5 library name is added to the pkg-config and CMake config
files when netcdf-c is built with CMake.
2024-03-02 11:05:59 -08:00
dependabot[bot]
8dcf860888 build(deps): bump codecov/codecov-action from 4.0.2 to 4.1.0 (#42860)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.0.2 to 4.1.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](0cfda1dd0a...54bcd8715e)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-02 10:39:09 -08:00
Arne Becker
a5b7cb6e6f perl-test-xml-simple and deps: new packages (#42873)
* perl-test-longstring: New package

Adds Test::LongString

* perl-test-xml-simple: new package

- Adds perl-test-xml-simple
2024-03-02 10:37:35 -08:00
Arne Becker
34c0bfefa6 perl-test-xml and deps: new packages (#42870)
- Adds perl-test-xml and dependency:
- Adds perl-xml-semanticdiff
2024-03-02 10:35:56 -08:00
Elliott Slaughter
ea5db048f3 legion: Add 23.09.0 and 23.12.0, remove control_replication. (#42915)
* legion: Add 23.09.0 and 23.12.0, remove control_replication.

The branch control_replication has been merged to master and should no
longer be used.

* flecsi: Switch to Legion master branch.

Legion control_replication has been merged to master.

* Fix Legion 23.09.0 and 23.12.0 build for ROCm 6.
2024-03-02 10:32:44 -08:00
Adam J. Stewart
e68a17f2c6 Bazel: fix patching of 4.2.4 (#42938) 2024-03-02 10:29:18 -08:00
Brian Vanderwende
4af9ec3d8a Add ncvis package and add option to wxwidgets (#38204)
* Add ncvis and opengl option for wxwidgets

* Style fixes for ncvis

* Replace in with satisfies for opengl constraint

Co-authored-by: Alec Scott <alec@bcs.sh>

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2024-03-02 10:26:36 -08:00
Zachary Newell
eb90e2c894 Added NCCL version 2.20.3-1 (#42951)
Added NCCL version 2.20.3-1 to the package.py. I tested compiling it and running nccl-tests on Ubuntu 22.04.
2024-03-02 06:59:17 -06:00
Mosè Giordano
763f444d63 py-numba: add tbb variant (#42930)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-03-01 02:03:39 -07:00
Chris Marsh
6614c4322d Seacas: fix patch hash (#42934) 2024-03-01 01:48:36 -07:00
Chris Marsh
983422facf libogg does not build a shared library with cmake (#42877)
* when built with cmake, libogg does not build a shared library by default. This resolves that

* spack style fixes

* Clean up imports

* enforce +pic when +shared
2024-02-29 21:12:27 -06:00
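The `+pic`-when-`+shared` enforcement is a one-line constraint; a minimal sketch of the wiring described above (variant names taken from the bullets, CMake flags illustrative):

```python
variant("shared", default=True, description="Build shared libraries")
variant("pic", default=True, description="Build position-independent code")

# Shared libraries need PIC objects, so forbid the inconsistent combination.
conflicts("~pic", when="+shared")

def cmake_args(self):
    return [
        self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
        self.define_from_variant("CMAKE_POSITION_INDEPENDENT_CODE", "pic"),
    ]
```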
Ye Luo
d0bdd66238 Add Quantum ESPRESSO 7.3.1 (#42927) 2024-02-29 20:08:25 -07:00
Tim Fuller
3a50d32299 Show extension commands with spack -h (#41726)
* Execute `args.help` after setting main options so that extension commands will show with `spack -h`

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
2024-02-29 16:51:42 -08:00
psakievich
50ee3624c0 Support environment variable expansion inside module projections (#42917) 2024-02-29 16:49:37 -08:00
afzpatel
2840cb54b5 initial commit to fix ck gpu targets cmake arg (#42924) 2024-02-29 15:48:07 -06:00
eugeneswalker
5c482d0d7e reduce size of e4s to deal with large rebuild artifact (#42884) 2024-02-29 13:44:01 -07:00
joscot-linaro
f3ad990b55 linaro-forge: added 23.1.2 version (#42922) 2024-02-29 12:28:03 -08:00
Victoria Cherkas
977603cd96 Update fdb package.py with libs (#42874)
* Update fdb package.py with libs
* Formatting
2024-02-29 12:23:37 -08:00
Wanlin Wang
1f919255fd Update riscv-gnu-toolchain package.py (#42893)
* Update package.py
  1. add one compiler type named 'musl'
  2. add a variant named 'multilib'
  3. add a variant named 'cmodel'
* Added one compiler type named 'musl'.
  Added a variant named 'multilib'.
  Added a variant named 'cmodel'.
  Added several versions.
* aarch64 is not supported.
2024-02-29 12:11:40 -08:00
Adam J. Stewart
5140a9b6a3 py-keras: add v3.0.5 (#42697) 2024-02-29 17:56:03 +01:00
Chris Marsh
1732ad0af4 vtk: Update proj dependency (#42797)
* Update proj dependency to enable newer proj usage

* Allow for any proj version
2024-02-29 06:07:21 -08:00
dependabot[bot]
e7510ae292 build(deps): bump docker/setup-buildx-action from 3.0.0 to 3.1.0 (#42883)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.0.0 to 3.1.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](f95db51fdd...0d103c3126)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-29 14:48:24 +01:00
Harmen Stoppels
0c224ba4a7 libevent: remove autotools build deps again (#42908)
The deps were added in #40945 to make it work on macOS 11, because the
old configure scripts only detect macOS 10. Apparently people reported the
autoreconf script caused issues, later fixed in #41057. However, also
with that fix, things are incorrect, cause people now report:

```
libtool: You should recreate aclocal.m4 with macros from libtool 2.4.7
libtool: and run autoconf again.
```

HOWEVER, all this is unnecessary, because the underlying issue was
already fixed long ago, it's just that it regressed at some point, but
it's back in place since #41205.
2024-02-29 09:57:21 +01:00
Terry Cojean
86b4a867ef ginkgo: add PAPI SDE support (#39425)
Signed-off-by: Terry Cojean <terry.cojean@kit.edu>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-29 06:04:34 +01:00
Arne Becker
6049e5f6eb perl-readonly-xs: new package (#42897)
This adds Readonly::XS. Since this module cannot be used by itself, the
Spack package comes with a test override. This anticipates that the perl
builder will one day have a generic standalone module usage test.
2024-02-28 14:27:22 -08:00
Arne Becker
0339690a59 perl-test-json, perl-json-any: New packages (#42896)
* perl-test-json: New package
  Adds Test::JSON
* Adds perl-json-any
2024-02-28 14:25:34 -08:00
Arne Becker
2bae1b7e00 perl-test-xpath: New package (#42895)
Adds Test::XPath
2024-02-28 14:23:41 -08:00
Arne Becker
ae5b605f99 perl-uri-find: New package (#42894)
Adds URI::Find
2024-02-28 14:22:19 -08:00
Arne Becker
35898d94d2 perl-net-cidr-lite: new package (#42898)
* perl-net-cidr-lite: new package
   Adds Net::CIDR::Lite
* Add license
2024-02-28 14:20:52 -08:00
Arne Becker
7e00bd5014 perl-mojolicious: new package (#42899)
Adds Mojolicious
2024-02-28 14:16:36 -08:00
Arne Becker
1f3aefb0a3 perl-cache-memcached and deps: new packages (#42911)
Adds Cache::Memcached and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Cache::Memcached
2024-02-28 14:02:38 -08:00
Harmen Stoppels
d4601d0e53 Unit tests: skip tests that intermittently fail on Windows (#42909) 2024-02-28 14:00:09 -08:00
Tom Payerle
935660e3d5 mysql: explicitly cast python command to str in _fix_dtrace_shebang() (#40781)
This should fix issue #40780

We explicitly cast self.spec["python"].command to str in the filter_file
call in _fix_dtrace_shebang to avoid the error
==> Error: TypeError: expected str, bytes or os.PathLike object, not Executable

Not sure why the error is appearing (is it only for specific python versions, etc?),
but the fix should be quite safe.
2024-02-28 13:00:51 -08:00
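The cast itself is tiny; a sketch of the call, where the shebang pattern and file path are illustrative stand-ins for what `_fix_dtrace_shebang` actually filters:

```python
# Executable objects render as the command when printed, but filter_file
# wants a real str for the replacement, hence the explicit cast.
python_cmd = str(self.spec["python"].command)
filter_file("^#!.*python", "#!{0}".format(python_cmd), "scripts/dtrace_wrapper")  # path illustrative
```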
Alec Scott
17bfc41841 bison: remove unnecessary deps, add variant for colored output (#42209) 2024-02-28 12:32:40 -08:00
Arne Becker
49b38e3a78 perl-yaml-syck: New package (#42892)
Adds YAML::Syck
2024-02-28 12:26:09 -08:00
Arne Becker
66f078ff84 perl-catalyst-runtime and deps: new packages (#42886)
* perl-catalyst-runtime and deps: new packages
  This adds Perl's Catalyst::Runtime and its missing dependencies.
  Adds:
  - perl-catalyst-runtime
  - perl-apache-logformat-compiler
  - perl-cgi-simple
  - perl-cgi-struct
  - perl-class-c3-adopt-next
  - perl-cookie-baker
  - perl-data-dump
  - perl-devel-stacktrace-ashtml
  - perl-filesys-notify-simple
  - perl-getopt-long-descriptive
  - perl-hash-multivalue
  - perl-http-body
  - perl-http-entity-parser
  - perl-http-headers-fast
  - perl-http-multipartparser
  - perl-moosex-emulate-class-accessor-fast
  - perl-moosex-getopt
  - perl-moosex-methodattributes
  - perl-moosex-role-parameterized
  - perl-path-class
  - perl-plack
  - perl-plack-middleware-fixmissingbodyinredirect
  - perl-plack-middleware-methodoverride
  - perl-plack-middleware-removeredundantbody
  - perl-plack-middleware-reverseproxy
  - perl-plack-test-externalserver
  - perl-posix-strftime-compiler
  - perl-stream-buffered
  - perl-string-rewriteprefix
  - perl-test-mocktime
  - perl-test-tcp
  - perl-test-time
  - perl-test-trap
  - perl-tree-simple
  - perl-tree-simple-visitorfactory
  - perl-uri-ws
  - perl-www-form-urlencoded
2024-02-28 12:17:45 -08:00
AMD Toolchain Support
304a63507a AOCC: add v4.2.0 (#42891)
Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2024-02-28 20:24:57 +01:00
Jonas Eschle
c7afc0eb5f Upgrade TensorFlow Probability with newer versions (#42673)
* enh: add newer versions

* enh: add newer versions

* format

* fix typo

* Update package.py

* make jax and TF optional dependencies

* style fix

* remove dependency

* remove old TFP version

* fix:  style
2024-02-28 12:29:23 -06:00
kwryankrattiger
57cde78c56 ParaView Release Candidate 5.12.0-RC3 (#42654) 2024-02-28 09:41:10 -08:00
Arne Becker
b01f6308d5 perl-json-xs and deps: new packages (#42904)
Adds JSON::XS and its deps:
- Canary::Stability
- Types::Serialiser
2024-02-28 09:30:46 -08:00
eugeneswalker
13709bb7b7 e4s: new packages: glvis, laghos (#42847)
* e4s: new packages: glvis, laghos

* gl: require: osmesa

* be explicit: glvis ^llvm so that llvm-amdgpu is not chosen

* glvis fails on oneapi stack due to issue 42839
2024-02-28 09:26:53 -08:00
Harmen Stoppels
661ae1f230 versions: simplify list if union not disjoint (#42902)
Spack merges ranges and concrete versions if they have non-empty
intersection. That is not enough for adjacent version ranges.

This commit ensures that disjoint ranges in version lists are simplified
if their union is not disjoint:

```python
"@1.0:2.0,2.1,2.2:3,4:6" # simplifies to "@1.0:6"
```
2024-02-28 16:33:25 +01:00
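The same idea on plain integers, as a simplified sketch (the real implementation works on Spack Version objects and ranges, not ints):

```python
def coalesce(ranges):
    """Merge integer ranges whose union is contiguous.

    coalesce([(1, 2), (3, 3), (4, 6)]) -> [(1, 6)]
    """
    merged = []
    for lo, hi in sorted(ranges):
        # Overlapping or adjacent (lo == previous hi + 1) ranges collapse.
        if merged and lo <= merged[-1][1] + 1:
            merged[-1] = (merged[-1][0], max(merged[-1][1], hi))
        else:
            merged.append((lo, hi))
    return merged
```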
Sinan
287e1039f5 package_py_systemd_python_improve (#42865)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2024-02-28 07:56:17 -06:00
Jonas Eschle
015dc4ee2e Add package zfit interface (#42666)
* Add package zfit interface

* add maintainer

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of jonas-eschle

* Update var/spack/repos/builtin/packages/py-zfit-interface/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-28 07:52:40 -06:00
Axel Huebl
46165982b1 C-Blosc2: Fuzzer Tests (#42881)
The fuzzer tests are a bit flaky and have linker issues on
clang. We should generally only build them in testing.
2024-02-27 21:08:33 -07:00
eugeneswalker
c9a111946e e4s oneapi: remove outdated package preferences (#42875) 2024-02-27 14:35:06 -08:00
renjithravindrankannath
62160021c1 Adding dependency of roctracer-dev and patch in miopen-hip (#42637) 2024-02-27 14:52:47 -07:00
Tom Payerle
3290e2c189 openexr: Add custom libs property (#42274)
Libraries for openexr are named libOpenEXR*.so, etc., so the default libs
handler in spec does not find them.

Add a custom libs property to address this.

Partial fix for #42273

Co-authored-by: payerle <payerle@users.noreply.github.com>
2024-02-27 10:45:29 +01:00
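A custom libs property of this kind typically looks like the following sketch, using Spack's find_libraries helper:

```python
@property
def libs(self):
    # The default handler searches for libopenexr*; the installed names
    # are capitalized (libOpenEXR*), so look those up explicitly.
    return find_libraries("libOpenEXR*", root=self.prefix, recursive=True)
```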
George Young
2a9fc3452a regtools: add new package (#42852)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-27 10:44:34 +01:00
YI Zeping
2ea8a2e6ae btop: add cmake version restriction (#42835)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-27 10:08:22 +01:00
Lydéric Debusschère
fe4a4ddcf9 bazel: allow offline build of major versions 5 and 6 (#41575)
* bazel: allow offline build of major versions 5 and 6; add variant download_data

* bazel: add maintainer LydDeb

* bazel: install offline only; remove variant download_data

* bazel: fix variable name: resource_dico --> resource_dictionary

* bazel: fix style

* bazel: fix the build of version 4

* bazel: add comment about resources

* bazel: access to resource stages with self.stage

* bazel: add except to solve AttributeError: 'Stage' object has no attribute 'resource'

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2024-02-27 03:03:25 -06:00
Juan Miguel Carceller
c45714fd3c delphes: use the same C++ standard as in ROOT (#42816)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-02-27 09:54:26 +01:00
Juan Miguel Carceller
523d12d9a8 garfieldpp: Add version 5.0 (#42817)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-02-27 09:54:03 +01:00
Howard Pritchard
5340e0184d Open MPI: adjust pmix dependency for 5.0.x (#42827)
For various reasons the pmix dependency of 5.0.2 had to be advanced to at
least pmix 4.2.4; 5.0.1 and 5.0.0 can also build with pmix 4.2.4 or newer.

related to #42651

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-02-27 09:49:29 +01:00
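In recipe form the adjustment reads roughly as:

```python
# Sketch of the constraint described above: 5.0.0 through 5.0.2 need
# pmix 4.2.4 or newer.
depends_on("pmix@4.2.4:", when="@5.0.0:5.0.2")
```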
AMD Toolchain Support
bca7698138 openfoam: add mpfr search paths (#42779)
Co-authored-by: Branden Moore <branden.moore@amd.com>
2024-02-27 09:37:48 +01:00
Peter Scheibel
5c26ce5385 skip test which is causing spurious failures on Windows (#42832) 2024-02-27 09:36:10 +01:00
Eisuke Kawashima
02137dda17 eigenexa: add 2.7–2.12 (#38170) 2024-02-27 09:23:08 +01:00
eugeneswalker
4abac88895 e4s ci: use ubuntu 22.04 images (#42843) 2024-02-27 01:12:53 -07:00
stepanvanecek
79c2a55e00 gpuscout: new package (#42761)
Co-authored-by: Stepan Vanecek <stepan@Stepans-MBP.fritz.box>
2024-02-27 09:05:09 +01:00
Carsten Uphoff
71c169293c double-batched: add v0.5.0 (#42850)
Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-02-27 08:59:42 +01:00
Wouter Deconinck
bcc5ded205 dd4hep: new version 1.28 (#42846) 2024-02-27 08:50:01 +01:00
Kensuke WATANABE
379a5d8fa0 root: add dependent package required for build time tests (#42849) 2024-02-27 08:43:08 +01:00
Juan Miguel Carceller
d8c2782949 bdsim: use the same C++ standard as in ROOT, add a patch (#42031)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-02-27 08:34:42 +01:00
dependabot[bot]
6dde6ca887 build(deps): bump pytest from 8.0.1 to 8.0.2 in /lib/spack/docs (#42861)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.0.1 to 8.0.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.0.1...8.0.2)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-27 08:30:03 +01:00
downloadico
8f8c262fb3 picard: add version 3.1.1 (#42862) 2024-02-27 08:29:06 +01:00
afzpatel
93b8e771b6 rocm-gdb: add v6.0.2 (#42855) 2024-02-27 08:22:56 +01:00
Todd Gamblin
48088ee24a refactor: add type annotations and refactor solver conditions (#42081)
Refactoring `SpackSolverSetup` is a bit easier with type annotations, so I started
adding some. This adds annotations for the (many) instance variables on
`SpackSolverSetup` as well as a few other places.

This also refactors `condition()` to reduce redundancy and to allow
`_get_condition_id()` to be called independently of the larger condition
function.


Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-26 22:26:01 +00:00
Mikael Simberg
c7df258ca6 Update camp missing headers patch to be applied with all compilers (#42857) 2024-02-26 12:30:01 -05:00
Erik Heeren
b8e8fa2dcd py-find-libpython: 0.3.1 (#42853)
* py-find-libpython: 0.3.1

* py-find-libpython: sort versions
2024-02-26 10:07:51 -07:00
Adam J. Stewart
8e885f4eb2 ImageMagick: fewer dependencies on macOS (#42739) 2024-02-26 17:45:29 +01:00
Miranda Mundt
116308fa17 py-pyomo: add v6.7.1 (#42795)
* Update Pyomo spack package for 6.7.1 release

* Apply changes from @adamjstewart

* Update sphinx+Pyomo versions

* Whoops - typo
2024-02-26 10:14:36 -06:00
Adam J. Stewart
5eb4d858cb Update PyTorch ecosystem (#42819) 2024-02-25 19:20:25 -08:00
eugeneswalker
8dd5f36b68 e4s external rocm ci: use ubuntu 22 image with rocm 5.7.1 (#42842)
* e4s external rocm ci: use ubuntu 22 image with rocm 5.7.1

* comment out slate+rocm due to build error
2024-02-25 17:50:56 -08:00
eugeneswalker
e3ce3ab266 e4s ci: add py-mpi4py, py-numba (#42845) 2024-02-25 17:23:39 -08:00
Jeremy Fix
0618cb98d1 py-ipyvuetify: new package (#42836)
* py-ipyvuetify: new package

* Limit py-jupyter-packing version to 0.7.x

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix py-jupyterlab version and type

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix py-ipyvue version range to exclude 2

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* rm py-wheel, already considered for PythonPackage

* fix: pynpm only required for build, reorder dependencies as in the pyproject.toml

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-25 07:34:23 -06:00
Maciej Wójcik
3b4a27ce7b snakemake: new version with plugins (#42713)
* snakemake: add Snakemake 8 with dependencies

* snakemake: add missing description

* Whitespace

* Whitespace

* Whitespace

* Whitespace

* py-conda-inject: add constraint for Python

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-azure-batch: add constraint for Python

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-cluster-generic: add constraint for Python

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake: add upper bound for Python

* py-snakemake-executor-plugin-drmaa: specify dependency type

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-googlebatch: correct dependency version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-tes: correct dependency version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-storage-plugin-s3: reorder

* snakemake: remove newly added variants

* snakemake: remove newly added variants

* snakemake: remove newly added variant

* snakemake: update version

* snakemake: update version

* snakemake: whitespace

* py-snakemake-storage-plugin-s3: update version

* snakemake: use newer version

* snakemake: whitespace

* snakemake: update interfaces

* py-snakemake-storage-plugin-gcs: link issue

* snakemake: update versions

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-25 03:47:38 -07:00
Veselin Dobrev
07afbd5619 [laghos] Add a patch for MPI_Session (#42841) 2024-02-24 16:07:59 -07:00
Pranav Sivaraman
5d8cd207ec zoxide: new package (#42840)
* feat: zoxide package

* Apply suggestions from code review

Co-authored-by: Alec Scott <alec@bcs.sh>

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2024-02-24 14:53:08 -07:00
Adam J. Stewart
3990589b08 py-lightly: add v1.5.0 (#42820) 2024-02-24 12:27:53 -08:00
dependabot[bot]
38d821461c build(deps): bump codecov/codecov-action from 4.0.1 to 4.0.2 (#42831)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.0.1 to 4.0.2.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](e0b68c6749...0cfda1dd0a)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-24 12:26:30 -08:00
Adam J. Stewart
80b13a0059 py-pandas: add v2.2.1 (#42838) 2024-02-24 12:20:55 -08:00
Maciej Wójcik
ab101d33be py-azure-...: add new versions (#42742)
* py-azure-core: add new versions

* py-azure-identity: add new versions, flatten dependencies

* py-azure-storage-blob: add new versions

* py-msal: add new versions

* py-azure-...: black is terrible

* py-azure-storage-blob: correct dependency

* Reorder

* Reorder
2024-02-24 12:29:05 -06:00
Sinan
cc742126ef package/libspatialite: add conflict, new version (#42573)
* package/libspatialite: add conflict, new version

* depends on new version of freexl

* fix bug

* remove manual download stuff

* improve style

* first deprecate

* [@spackbot] updating style on behalf of Sinan81

* get rid of conflict, reorder deps

* remove manual download

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2024-02-24 09:29:46 -06:00
Stephen Hudson
a1f21436a4 libEnsemble: add v1.2.1 (#42828) 2024-02-24 09:27:44 -06:00
Alex Richert
95fdc92c1f Allow awscli-v2 to be installed without examples/ dir (#42773)
* Allow awscli-v2 to be installed without examples/ dir

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-24 09:25:59 -06:00
Alex Richert
6680c6b72e libjpeg-turbo: add v2.1.5, update recipe (#37963)
Co-authored-by: Alec Scott <hi@alecbcs.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-23 19:23:52 -07:00
Chris Marsh
74b6bf14b8 netcdf-cxx4: convert to CMake-based build (#42766)
The CMake-based build is anticipated to work in all cases where the
Autotools-based build did, and to address all prior issues with less
maintenance of the package. In detail:

* Fixes #42735 (CMake's find_package helps with linking to proper
  netcdf-c)
* Replaces older Autotools-based build
* All preexisting variants are handled
* Record hdf5 as an explicit dependency (was missing before)
* Add +tests option

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-02-23 15:37:33 -08:00
Vicente Bolea
7c315fc14b proj: apply stdint.h patch in version 8 (#42791)
* proj: apply stdint.h patch in version 8

* Update var/spack/repos/builtin/packages/proj/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 14:48:20 -07:00
John W. Parent
f51c9fc6c3 Windows path handling: change representation for paths with spaces (#42754)
Some builds on Windows break when encountering paths with spaces. This
reencodes some paths in Windows 8.3 filename format (when on Windows):
this serves as an equivalent identifier for the file, but in a form that
does not have spaces.

8.3 filenames are also truncated in length, which could be helpful, but
that is not the primary intended purpose of using this format.

Overall

* nmake/msbuild packages do this generally for the install prefix
* curl/perl require additional modifications (as written now, each package
  may require calls to `windows_sfn` to work when the Spack
  root/install/staging prefixes contain spaces)

Some items for follow-up:

* Spack itself does not create paths with spaces "on top" of whatever
  the user configures or where it is placed (e.g. the Spack root, the
  staging directory, etc.), so it might be possible to edit some of these
  paths once and avoid a proliferation of individual `windows_sfn`
  calls in individual packages.
* This approach may result in the insertion of 8.3-style paths into
  build artifacts (on Windows), handling this may require additional
  bookkeeping (e.g. when relocating).
2024-02-23 13:30:11 -08:00
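For reference, 8.3 short names come from the Win32 API; a minimal standalone sketch (not Spack's actual `windows_sfn` implementation) using ctypes:

```python
import ctypes

def short_path_name(long_path: str) -> str:
    """Return the Windows 8.3 short form of an existing path."""
    kernel32 = ctypes.windll.kernel32
    # A first call with a NULL buffer reports the required buffer size.
    needed = kernel32.GetShortPathNameW(long_path, None, 0)
    buf = ctypes.create_unicode_buffer(needed)
    kernel32.GetShortPathNameW(long_path, buf, needed)
    return buf.value
```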
John W. Parent
3e713bb0fa vtk package: support vtk@9 on Windows (#42751) 2024-02-23 11:58:58 -08:00
Peter Scheibel
55bbb10984 Alert user to failed concretizations (#42655)
With this change an error message is emitted when the result of concretization 
is in an inconsistent state.
2024-02-23 20:15:25 +01:00
Simon Pintarelli
6e37f873f5 tiled-mm: add v2.3 (#42829) 2024-02-23 19:53:51 +01:00
Massimiliano Culpo
d49cbecf8c Cleanup spack.schema (#42815)
* Move spec_list into its own file, instead of __init__.py

* Remove spack.schema.spack

This module was introduced in #33960. It's almost an exact duplicate of
spack.schema.env, and is not used anywhere.

* Fix typo
2024-02-23 10:23:54 -08:00
Dr Marco Claudio De La Pierre
fe07645e3b Update/add packages in the Nextflow ecosystem (#42776)
Signed-off-by: Dr Marco Claudio De La Pierre <marco.delapierre@gmail.com>
2024-02-23 17:53:46 +01:00
Ben Morgan
c5b8d5c92a geant4: new version v11.2.1 (#42822) 2024-02-23 08:07:35 -05:00
Jeremy Fix
2278816cb3 py-jwcrypto: new package (#42783)
* adds the spack recipe for py-jwcrypto

* split long line to fix E501

* Specify versions for py-cryptography and py-typing-extensions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:58:11 -07:00
Jeremy Fix
4bd305b6d3 py-reacton: new package (#42794)
* adds the spack recipe for reacton python package

* Fix versions for ipywidgets and typing-extensions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:48:06 -07:00
Erik Heeren
a26e0ff999 py-find-libpython: new package (#42804)
* py-find-libpython: new package

* Update var/spack/repos/builtin/packages/py-find-libpython/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:42:58 -07:00
Jeremy Fix
4550fb83c0 py-ipyvue: new package (#42789)
* add spack recipe for ipyvue

* Specify version for ipywidgets

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:37:48 -07:00
Todd Gamblin
5b96c2e89f py-sympy: add version 1.12 (#42770) 2024-02-23 05:42:49 -06:00
Sinan
18c8406091 pdal: fix version range for patch (#42769)
* Update package.py

fix bug.

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:28:41 -06:00
Caetano Melone
bccefa14cb py-codespell: add package (#42694)
* py-codespell: add package

* setuptools-scm conflict

confirmed via https://github.com/codespell-project/codespell/issues/3365

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* reorder dependencies and versions

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:07:24 -06:00
Maciej Wójcik
20fc5a174a py-s3transfer, py-boto3, py-botocore: add new versions (#42741)
* py-s3transfer: add new versions

* py-boto3: add new versions

* py-botocore: add new versions

* py-boto3: correct version ranges
2024-02-23 04:18:45 -06:00
Alec Scott
3abcb408d1 py-ansible: add v2.16.3 (#42734)
* py-ansible: add v2.16.3

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Add specific python version requirements from setup.cfg

* Add additional ranges for py-setuptools

* Update var/spack/repos/builtin/packages/py-ansible/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 04:13:33 -06:00
Maciej Wójcik
a0d97d9294 py-argparse-dataclass: add new package (#42494)
* py-argparse-dataclass: add new package

* Remove obvious dependency
2024-02-23 04:11:21 -06:00
Massimiliano Culpo
0979a6a875 Remove dead code from Environment (#42818)
Environment.concretize_and_add is not used anywhere.
2024-02-23 10:48:07 +01:00
Massimiliano Culpo
98de8e3257 Fix wrong call to a function (#42814) 2024-02-23 06:37:22 +01:00
akimler
23b299c086 matio: add v1.5.26 (#42808) 2024-02-23 06:03:06 +01:00
Massimiliano Culpo
adcd3364c9 elpa: remove deprecated versions (#42802) 2024-02-23 06:02:06 +01:00
Tom Payerle
a2908fb9f7 qb3: add custom libs property (#42275) 2024-02-22 15:13:05 -07:00
John W. Parent
f514e72011 netcdf-c package: fix hdf5 linking on Windows (#42749) 2024-02-22 14:18:34 -07:00
Adam J. Stewart
b61d964eb8 PythonPackage: check purelib for libs/headers (#42602)
* PythonPackage: check purelib for libs/headers

* Update error messages too

* Fix functools.reduce argument order
2024-02-22 13:17:21 -08:00
John W. Parent
2066eda3cd Seacas package: add Windows support (#42692) 2024-02-22 13:28:33 -07:00
Alex Richert
d8b186a381 dakota: add boost components (#42659)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-22 13:03:13 -07:00
Adam J. Stewart
d258aec099 GDAL: add v3.8.4 (#42805) 2024-02-22 11:42:14 -08:00
Harmen Stoppels
3d1d5f755f oci: when base image uses Image Manifest Version 2, follow suit (#42777) 2024-02-22 16:33:56 +01:00
Thomas-Ulrich
90778873d1 tandem: update package (#42785) 2024-02-22 15:45:20 +01:00
AMD Toolchain Support
1af57e13fb elpa: fix support for patched version (#42803)
Co-authored-by: Ning Li <ning.li@amd.com>
2024-02-22 15:43:12 +01:00
AMD Toolchain Support
28d25affcc ELPA: Linking fixes for BLAS and OpenMP (#42747)
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
2024-02-22 15:21:00 +01:00
Martin Lang
3ebaf33915 libgd: fix INT_MAX not defined (#42104)
Compiling version 2.2.4 fails (on a Debian system with only a minimum set of packages installed) with an error because `INT_MAX` is undeclared:
```
   263    gd_gd2.c: In function '_gd2GetHeader':
>> 264    gd_gd2.c:212:54: error: 'INT_MAX' undeclared (first use in this function)
   265      212 |                 if (*ncx <= 0 || *ncy <= 0 || *ncx > INT_MAX / *ncy) {
   266          |                                                      ^~~~~~~
   267    gd_gd2.c:87:1: note: 'INT_MAX' is defined in header '<limits.h>'; did you forget to '#include <limits.h>'?
```
2024-02-22 03:23:59 -07:00
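One way a recipe carries such a fix is a small source edit at patch time; a hedged sketch, where the file path and anchor line are inferred from the error output and may differ in the real tree:

```python
def patch(self):
    # Insert the missing header so INT_MAX is declared (see the note above).
    filter_file(
        '#include "gd.h"',
        '#include <limits.h>\n#include "gd.h"',
        "src/gd_gd2.c",  # path illustrative
        string=True,
    )
```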
Alec Scott
e8d981b405 rust: add v1.76.0 (#42798) 2024-02-22 03:09:31 -07:00
Steven Hahn
02f222f6a3 google benchmark: Add variant with libpfm4 (#42620)
Signed-off-by: Steven Hahn <hahnse@ornl.gov>
2024-02-22 07:29:49 +01:00
James Beal
8345c6fb85 delly2: add v1.2.6 (#42745)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-02-22 07:28:40 +01:00
Xavier Delaruelle
3f23634c46 environment-modules: add version 5.4.0 (#42763) 2024-02-22 07:10:55 +01:00
MatthewLieber
d5766431a0 mvapich: add v3.0 (#42756)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2024-02-22 07:09:14 +01:00
Martin Lang
1388bfe47d bigdft-atlab: add v1.9.3, v1.9.4 (#42643) 2024-02-22 07:05:00 +01:00
dependabot[bot]
579dec3b35 build(deps): bump urllib3 from 2.2.0 to 2.2.1 in /lib/spack/docs (#42757)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.2.0 to 2.2.1.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.2.0...2.2.1)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-22 07:01:45 +01:00
Martin Lang
9608ed9dbd cgal: add v5.5.3 (#42650) 2024-02-22 06:58:03 +01:00
Wouter Deconinck
42b739d6d5 podio: depends_on py-graphviz type run (for podio-vis) (#42787)
The podio-vis tool depends at run-time on py-graphviz, https://github.com/AIDASoft/podio/blob/master/tools/podio-vis#L10.
2024-02-22 06:14:38 +01:00
Matthieu Dorier
91a0c71ed1 nlohmann-json-schema-validator: added version 2.2.0 and 2.3.0 (#42792) 2024-02-22 06:11:29 +01:00
Dom Heinzeller
6ee6fbe56b ecflow: apply ctsapi_cassert.patch for all compilers (#42793) 2024-02-22 06:09:54 +01:00
Mikael Simberg
be4eae3fa8 pika: add sanitizers variant (#42778) 2024-02-22 05:54:33 +01:00
Harmen Stoppels
ad70b88d5f spack gc: do not show uninstalled but needed specs (#42696) 2024-02-22 05:21:39 +01:00
eugeneswalker
c1d230f25f e4s ci stacks: add python packages (#42774)
* e4s ci stacks: add python packages

* comment out failing specs
2024-02-21 20:59:05 -07:00
John W. Parent
4bc52fc1a3 env activate: use Win-compatible print on Windows (#42755)
Use "echo" instead of "printf" on Windows.
2024-02-21 11:02:04 -08:00
John W. Parent
7d728822f0 Windows: fix error with can_symlink check (#42753) 2024-02-21 10:18:25 -08:00
John W. Parent
e96640d2b9 cgns package: don't use MPI wrappers on Windows (#42750) 2024-02-21 10:10:48 -08:00
Alex Richert
e196978c7c Add 'docs' variant to rust-bootstrap (#42768)
* Add 'docs' variant to rust-bootstrap

* remove docs for rust-bootstrap
2024-02-21 11:04:13 -07:00
Harmen Stoppels
de3d1e6c66 rocm: removal of deprecated <5.1 versions (#42676)
The package `aomp` is removed entirely, as it was too outdated to have non-deprecated dependencies.
2024-02-21 14:07:40 +01:00
Massimiliano Culpo
2d8e0825fe binutils: add v2.42 (#42760) 2024-02-20 23:03:51 -08:00
pauleonix
d5c06c4e2c asio: add patches 1.28.2 and 1.28.1 (#42762) 2024-02-20 23:02:43 -08:00
Wouter Deconinck
0d92b07dbd pythia6: deal with dead pythiasix.hepforge.org (#42162)
* pythia6: deal with dead pythiasix.hepforge.org

* pythia6: rm main81.f from CMakeLists.txt

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-02-20 15:30:54 -06:00
Auriane R
6cfcbc0167 Update conflict between stdexec and clang (#42765) 2024-02-20 13:32:54 -07:00
Massimiliano Culpo
f9e9fab2da clingo: add v5.7.1 (#42758) 2024-02-20 07:49:49 -08:00
Massimiliano Culpo
b3f790c259 btop: add v1.3.2 (#42759) 2024-02-20 07:46:35 -08:00
George Young
b5b5130bed pblat: add new package (#42517)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-19 22:28:27 +01:00
Vicente Bolea
d670a2a5ce adios2: update kokkos dependency (#42621)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-19 11:54:08 -07:00
Harmen Stoppels
0ee3a3c401 Use relative target in symlinks to modified files in view (#42699) 2024-02-19 16:33:38 +01:00
Dave Keeshan
ad6dc0d103 verible: add v0.0-3539-g9442853c (#42628) 2024-02-19 14:40:23 +01:00
Alex Richert
6e373b46f8 scorep: specify binutils headers and libs (#42656) 2024-02-19 14:35:22 +01:00
Thomas-Ulrich
abe617c4ea hipsycl: update package (#42518) 2024-02-19 14:34:05 +01:00
Satish Balay
a0e80b23b9 DTK: specify MPI compilers (#42592)
Co-authored-by: balay <balay@users.noreply.github.com>
2024-02-19 14:24:28 +01:00
Dom Heinzeller
4d051eb6ff ecflow: fix compilation with Intel classic compilers (#42622) 2024-02-19 14:23:09 +01:00
kinagaki-fj
0d89083cb5 omm-bundle: add new package (#42304) 2024-02-19 14:16:39 +01:00
Alex Richert
23e586fd85 ferret: add support for gcc@10: (#42660) 2024-02-19 14:14:11 +01:00
Richard Berger
c2b116175b kokkos: disable CUDA_MALLOC_ASYNC on cray-mpich (#42661)
Co-authored-by: Daniel Arndt <arndtd@ornl.gov>
2024-02-19 13:52:48 +01:00
Mikael Simberg
a1f90620c3 umpire: depend on camp~rocm when umpire itself has ~rocm (#42701) 2024-02-19 11:58:31 +01:00
Harmen Stoppels
668879141f remove a few redundant calls to setup_run_environment (#42718)
Any package `X` used as `depends_on("x", type="build")` will have
`X.setup_run_environment(env)` called, because it has to be able to
"run" in the build environment.

So there is no point in calling `setup_run_environment` from
`setup_dependent_build_environment`.

Also it's redundant to call `setup_run_environment` in
`setup_dependent_run_environment`, because (a) the latter is called _for
every parent edge_ instead of once per node, and (b) it's only called
after `setup_run_environment` is called anyway. Better to call
`setup_run_environment` once and only once.
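
A minimal sketch of the redundant pattern being removed (the package class is hypothetical; the method names are Spack's):

```
class Mytool(Package):
    def setup_run_environment(self, env):
        env.prepend_path("PATH", self.prefix.bin)

    # Redundant before this change: Spack already calls
    # setup_run_environment() for build-type dependencies, so forwarding
    # to it here only duplicated work.
    # def setup_dependent_build_environment(self, env, dependent_spec):
    #     self.setup_run_environment(env)
```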
2024-02-19 11:43:45 +01:00
Adam J. Stewart
267defd9d3 py-matplotlib: add v3.8.3 (#42698) 2024-02-19 11:40:18 +01:00
Ken Raffenetti
603d3f2283 mpich: Remove invalid pmi option (#42686)
pmi=off is not a valid configuration option. MPICH cannot function
without a PMI library. Fixes #42685.
2024-02-19 11:38:49 +01:00
James Beal
02bfbbe269 Bump for new version of bedtools2 (#42034)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-02-19 11:35:26 +01:00
Alec Scott
44d08d2c24 gnupg: make discoverable as external (#42736) 2024-02-19 11:27:39 +01:00
AMD Toolchain Support
c6faab10aa CP2K: fix multiple use of spec["fftw"] (#42724)
fftw object was originally created with spec["fftw:openmp"], but
referencing spec["fftw"] overwrites the 'last_query' in the spec object,
so later use of fftw.libs was not returning FFTW OpenMP libs.

Also allow the post-install fixup to support amdfftw as well as fftw.
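
A minimal sketch of the pitfall, using Spack's spec query API (the surrounding package context is hypothetical):

```
# Dependency queries record a 'last_query' on the returned spec, and
# .libs consults it, so a later plain query silently changes the result.
fftw = spec["fftw:openmp"]  # last_query now includes the openmp request
_ = spec["fftw"]            # same object: last_query is overwritten
libs = fftw.libs            # resolves WITHOUT the OpenMP libraries

# Safer pattern: take what you need right after the targeted query.
fftw_openmp_libs = spec["fftw:openmp"].libs
```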

Co-authored-by: Branden Moore <branden.moore@amd.com>
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
Co-authored-by: Greg Becker <becker33@llnl.gov>
2024-02-19 11:21:11 +01:00
Geoffrey Lentner
baa203f115 duckdb: add v0.9.2 (#42374) 2024-02-19 03:02:43 -07:00
Alec Scott
575a33846f bfs: add v3.1.1 (#42740) 2024-02-19 10:54:00 +01:00
Adam J. Stewart
79df065730 py-shapely: add v2.0.3 (#42738) 2024-02-18 21:29:18 +01:00
William Moses
953ee2d0ca Bump enzyme to 0.0.100 (#42626) 2024-02-18 08:48:21 -08:00
Nai-Yuan Chiang
6796b581ed hiop new release, v1.0.3 (#42730) 2024-02-18 08:45:44 -08:00
Christoph Junghans
f49a5519b7 byfl: initial commit (#42731) 2024-02-18 08:43:56 -08:00
Tom Drever
e07a6f51dc Add py-click-option-group (#42678)
* Add py-click-option-group

* Specify dependency versions
2024-02-18 08:22:25 -06:00
Dani
cb73d71cf1 new builtin package: py-biobb-model (#42681)
* new builtin package: py-biobb-model

* Update var/spack/repos/builtin/packages/py-biobb-model/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-18 03:18:03 -06:00
Jonas Eschle
5949fc2c88 add package py-jacobi (#42672)
* add package py-jacobi

* fix:  add description

* fix:  add description

* fix:  add description

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

* Update package.py

* Update var/spack/repos/builtin/packages/py-jacobi/package.py

I don't think that numpy is used in "build"? But not important

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-18 03:17:35 -06:00
Jonas Eschle
fd10cfdebf add package py dotmap (#42665)
* add package py dotmap

* add maintainer

* [@spackbot] updating style on behalf of jonas-eschle

* fix:  add description

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
2024-02-18 03:16:32 -06:00
Maciej Wójcik
32506e222d py-sysrsync: add new package (#42492)
* py-sysrsync: add new package

* py-sysrsync: specify dependency type

* py-sysrsync: add constraint for Python
2024-02-18 03:15:42 -06:00
Maciej Wójcik
a7d5cc6d68 py-google-...: add new versions and few new packages (#42671)
* py-google-cloud-storage: add new versions

* py-google-api-core: add new versions

* py-proto-plus: add new package

* py-google-api-core: add grpc variant

* py-google-api-core: add grpc variant

* py-google-api-core: add missing prefix

* py-google-cloud-batch: add new package

* py-google-cloud-logging: add new package

* py-google-cloud-appengine-logging: add new package

* py-google-cloud-audit-log: add new package

* py-grpc-google-iam-v1: add new package

* py-proto-plus: remove obvious dependency

* Whitespace

* Whitespace

* py-google-cloud-audit-log: correct conflict

* py-proto-plus: correct dependency type

* Whitespace

* py-google-auth: add new version

* py-google-resumable-media: add new version

* py-google-cloud-storage: constrain version of dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-grpcio-status: use newer version

* py-google-resumable-media: add upper bound of dependency

* Add types of dependencies.

* py-grpcio: add new version

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-18 03:15:02 -06:00
Vanessasaurus
222241f232 flux-core: add uuid (#42635)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-02-18 08:54:11 +01:00
Sinan
aa1820eb5c pdal: new package (#42714)
* new package pdal

* [@spackbot] updating style on behalf of Sinan81

* fix style

* add license

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* [@spackbot] updating style on behalf of Sinan81

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* improve dependency spec

* add maintainer

* add conflict

* fix bug

* improve

* improve

* [@spackbot] updating style on behalf of Sinan81

* fix style

* specify cmake dependency version

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
Co-authored-by: Alec Scott <alec@bcs.sh>
2024-02-17 13:40:40 -08:00
George Young
16ea5f68ba cryodrgn: new package @2.3.0 (#42443)
* cryodrgn: new package @2.3.0

* correcting dependency ranges

* correcting dependency ranges

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-17 15:07:56 -06:00
Dani
aeec515b4f new builtin package: py-biobb-structure-utils (#42683) 2024-02-17 14:42:10 -06:00
Maciej Wójcik
6e2ec2950b py-kubernetes: add new versions (#42670)
* py-kubernetes: add new versions

* py-oauthlib: add new version
2024-02-17 14:32:20 -06:00
Maciej Wójcik
fe5772898d py-azure-... and py-msrest: add new versions (#42624)
* py-azure-batch: add new versions

* py-azure-core: add new versions

* py-azure-identity: add new versions

* py-azure-mgmt-batch: add new versions

* py-azure-mgmt-core: add new versions

* py-azure-storage-blob: add new versions

* py-msrest: add new versions

* Whitespace

* Whitespace

* py-msrest: add a note

* py-msrest: version-dependent URL

* py-azure-mgmt-batch: correct version of dependency
2024-02-17 14:24:11 -06:00
Alex Leute
384ddf8e93 py-smote-variants: Added package py-smote-variants (#42502)
* py-smote-variants: Added package py-smote-variants

Also added py-minisom and py-metric-learn as dependencies

* py-metric-learn: Added build dependency on setuptools

* py-smote-variants: Added a dependency on py-pytest-runner

As well as a comment about why statistics isn't included

* [@spackbot] updating style on behalf of alex391

---------

Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-02-17 14:20:03 -06:00
Tom Vander Aa
32c2e240f8 py-charm4py: needs Cython<3.0 (#42491)
* py-charm4py: needs Cython<3.0

* Update var/spack/repos/builtin/packages/py-charm4py/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-17 14:16:31 -06:00
dependabot[bot]
b88c4792a7 build(deps): bump clingo from 5.6.2 to 5.7.1 in /.github/workflows/style (#42732)
Bumps [clingo](https://github.com/potassco/clingo) from 5.6.2 to 5.7.1.
- [Release notes](https://github.com/potassco/clingo/releases)
- [Changelog](https://github.com/potassco/clingo/blob/master/CHANGES.md)
- [Commits](https://github.com/potassco/clingo/compare/v5.6.2...v5.7.1)

---
updated-dependencies:
- dependency-name: clingo
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 22:24:10 -08:00
dependabot[bot]
ac92e94b00 build(deps): bump pytest from 8.0.0 to 8.0.1 in /lib/spack/docs (#42733)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.0.0 to 8.0.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.0.0...8.0.1)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 22:23:13 -08:00
Luc Berger
4a01865f7b Kokkos Ecosystem: update for release 4.2.01 (#42711)
* Kokkos Ecosystem: update for release 4.2.01

Will rebase this on top of develop once Kokkos Core PR merges.

* Kokkos Ecosystem: update license statement to reflect current license
2024-02-16 17:28:29 -07:00
Sebastian Pipping
025165e22e expat: Add latest release 2.6.0 with security fixes (#42680)
* expat: Add latest release 2.6.0 with security fixes

* expat: Deprecate vulnerable 2.5.0

* expat: Add past CVEs and where they were fixed
2024-02-16 17:23:46 -07:00
SXS Bot
cda9bc3e1d spectre: add v2024.02.05 (#42496)
* spectre: add v2024.02.05

* [@spackbot] updating style on behalf of sxs-bot

---------

Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-02-16 15:49:06 -07:00
Lydéric Debusschère
caf21dda42 rust: enable vendor (#42365)
* rust: enable vendor

* rust: modify vendor description; move the call of variant

* rust: fix style

* rust: typo

* rust: remove variant 'vendor' to leave vendoring as the default functionality

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2024-02-16 14:13:44 -08:00
HELICS-bot
e1779a2884 helics: Add version 3.5.0 (#42572)
* helics: Add version 3.5.0

* helics: define CMAKE_CXX_STANDARD=20 when GCC>=13 is used to compile

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Ryan Mast <mast9@llnl.gov>
2024-02-16 14:12:50 -08:00
Adam J. Stewart
f55a018fd9 py-torchmetrics: add v1.3.1 (#42638) 2024-02-16 14:00:43 -08:00
dependabot[bot]
f5964e1dde build(deps): bump black in /.github/workflows/style (#42631)
Bumps [black](https://github.com/psf/black) from 24.1.1 to 24.2.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.1.1...24.2.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 13:52:38 -08:00
dependabot[bot]
23e0fe2e21 build(deps): bump black from 24.1.1 to 24.2.0 in /lib/spack/docs (#42629)
Bumps [black](https://github.com/psf/black) from 24.1.1 to 24.2.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.1.1...24.2.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 13:51:50 -08:00
dependabot[bot]
44438e6171 build(deps): bump python-levenshtein in /lib/spack/docs (#42630)
Bumps [python-levenshtein](https://github.com/rapidfuzz/python-Levenshtein) from 0.24.0 to 0.25.0.
- [Release notes](https://github.com/rapidfuzz/python-Levenshtein/releases)
- [Changelog](https://github.com/rapidfuzz/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/rapidfuzz/python-Levenshtein/compare/v0.24.0...v0.25.0)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 13:51:12 -08:00
Martin Lang
af8e000a93 bigdft-futile: new versions 1.9.3, 1.9.4 (#42646) 2024-02-16 13:41:18 -08:00
Adam J. Stewart
375b82a593 py-black: add v24.2.0 (#42639) 2024-02-16 13:39:16 -08:00
Adam J. Stewart
2030e2b089 py-jsonargparse: add v4.27.5 (#42640) 2024-02-16 13:38:17 -08:00
Carlos Bederián
34aba94148 openmpi: add ucc to fabrics (#41889) 2024-02-16 22:37:00 +01:00
Nicolas Morales
43c909e19c Update maintainers for Kokkos and add Kokkos 4.2.1 (#42690)
* update kokkos spack package maintainers

* add Kokkos 4.2.01

* Update var/spack/repos/builtin/packages/kokkos/package.py

Co-authored-by: Luc Berger <lberge@sandia.gov>

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
Co-authored-by: Luc Berger <lberge@sandia.gov>
2024-02-16 11:12:36 -07:00
Martin Lang
7c011d304f nfft: new versions 3.5.1, 3.5.2, 3.5.3 (#42645) 2024-02-16 09:58:03 -08:00
Martin Lang
1546fc7e5f dftbplus: new versions 22.2, 23.1 (#42647) 2024-02-16 09:55:53 -08:00
Rob Falgout
75a134f085 Update package.py for hypre release 2.31.0 (#42689) 2024-02-16 11:46:29 -06:00
Alex Richert
d0c4675a9b Add aocc support to ESMF (#42708)
* Add aocc support to ESMF

* Update package.py
2024-02-16 09:33:47 -08:00
WuK
0507c3c63d add new CUTLASS versions (#42715) 2024-02-16 09:23:50 -08:00
dependabot[bot]
59caa93571 build(deps): bump dorny/paths-filter from 3.0.0 to 3.0.1 (#42710)
Bumps [dorny/paths-filter](https://github.com/dorny/paths-filter) from 3.0.0 to 3.0.1.
- [Release notes](https://github.com/dorny/paths-filter/releases)
- [Changelog](https://github.com/dorny/paths-filter/blob/master/CHANGELOG.md)
- [Commits](0bc4621a31...ebc4d7e9eb)

---
updated-dependencies:
- dependency-name: dorny/paths-filter
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-16 09:16:35 -08:00
Arne Becker
b4b53a9a9f perl-datetime-format-iso8601: New package (#42719)
Adds DateTime::Format::ISO8601
2024-02-16 09:15:54 -08:00
Todd Gamblin
be1cfffa45 clingo: add version 5.7.0 (#42707)
5.7.0 was just released. It includes a number of changes requested and/or
upstreamed by Spack developers, e.g.:

* API for accessing optimization priorities: https://github.com/potassco/clingo/pull/406
* Hash optimization: https://github.com/potassco/clingo/pull/441
* Contributing Guide: https://github.com/potassco/clingo/pull/465
* Hiding more ELF symbols:
  * https://github.com/potassco/clingo/pull/447
  * https://github.com/potassco/clingo/pull/449
2024-02-16 18:14:19 +01:00
Victor Lopez Herrero
75b7109222 dlb: add v3.4 (#42722) 2024-02-16 09:13:23 -08:00
Mikael Simberg
c56cf8c0d2 Add support for clang with OpenMP and other minor changes to oneapi build system (#42717)
* Add support for clang in oneapi packages with OpenMP

* Add fallback search for libomp in OneApi package with OpenMP threading

* Add requires for the compiler when using threads=openmp in intel-oneapi-mkl

* Cosmetic changes to messages in oneapi.py

* Update error message in oneapi.py

Co-authored-by: Robert Cohn <rscohn2@gmail.com>

* Update another error message in oneapi.py

Co-authored-by: Robert Cohn <rscohn2@gmail.com>

* Inline helper error function in oneapi.py

* Update one more error message in oneapi.py

* Wrap long line in oneapi.py

---------

Co-authored-by: Robert Cohn <rscohn2@gmail.com>
2024-02-16 11:39:31 -05:00
Robert Cohn
5c3df6e8ca [intel-oneapi-mkl] provide omp lib for threads=openmp (#42653) 2024-02-16 08:48:42 +01:00
Tim Fuller
79087d08d9 allow packages to request no submodules be updated (#40409)
* allow packages to request no submodules be updated when self.submodules is a
  callable function (see the sketch after this list)

* Extend the test added in Allow more fine-grained control over what submodules are
  updated: part 2 #27293 to include this case

* Update the type signature for the submodules arg of version() in directives.py
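
A hedged sketch of the callable form (the submodule path and version names here are hypothetical):

```
# The callable receives the package and returns the submodules to update;
# returning an empty list requests that none be updated.
def submodules_to_update(package):
    if package.spec.satisfies("@develop"):
        return ["ext/third-party-lib"]  # hypothetical submodule path
    return []  # no submodules for other versions

version("develop", branch="develop", submodules=submodules_to_update)
```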

---------

Co-authored-by: tjfulle <tjfulle@users.noreply.github.com>
2024-02-15 22:36:37 -08:00
Victor Brunini
d31e503e5b develop: Add -b/--build-directory option to set build_directory package attribute (#39606)
* develop: Add -b/--build-directory option to set build_directory package attribute.

* Update docs

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
Co-authored-by: vbrunini <vbrunini@users.noreply.github.com>
2024-02-16 06:30:58 +00:00
Eric Berquist
55b62c2168 SST: only run autoreconf for versions from Git branches (#42712) 2024-02-15 22:27:07 -06:00
Scott Wittenburg
6c3511ee1d Fix spack --profile|--pdb <cmd> (#42662) 2024-02-15 15:15:40 -07:00
Seth R. Johnson
d19fa74909 celeritas: new version 0.4.2 (#42702) 2024-02-15 16:41:10 +00:00
Jemma Stachelek
a2ad2d1c9f docs: fix typo (#42688) 2024-02-15 11:21:51 +01:00
Harmen Stoppels
55863bd680 compilers: fixup order of arguments to satisfies (#42682) 2024-02-15 10:21:06 +00:00
Harmen Stoppels
7d4dcd52d9 elpa: fix checksum (#42674) 2024-02-14 11:56:27 +01:00
Henrik Stooss
5e4e72ddd2 veclibfort: explicitly add platform=darwin as requirement (#42664) 2024-02-14 11:18:04 +01:00
John W. Parent
447c48e2fd VTK: limit patches to v8 (#42505)
* VTK: limit patches to v8

* Finer scrope on patch version applicability
2024-02-14 03:36:57 -06:00
Sreenivasa Murthy Kolam
3be4f4db86 Deprecate ROCm 5.1.0 till 5.2.3 (#41794) 2024-02-14 09:03:00 +01:00
John Pennycook
ca97a0fefb cmake: Enable compilation database generation (#42353)
* cmake: Enable CMAKE_EXPORT_COMPILE_COMMANDS

Enabling this option causes CMake to generate a compile_commands.json file
containing a compilation database that can be used to drive third-party tools.

CMAKE_EXPORT_COMPILE_COMMANDS only exists for CMake >= 3.5

Exporting compilation databases is only supported for Makefile and Ninja
generators, so check these conditions as well.

CMAKE_EXPORT_COMPILE_COMMANDS is only enabled in supported configurations
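
A hedged sketch of the guarded define (function shape assumed, not the literal diff):

```
def compile_commands_args(pkg, generator):
    # Only Makefile and Ninja generators support compilation databases,
    # and CMAKE_EXPORT_COMPILE_COMMANDS only exists for CMake >= 3.5.
    args = []
    supported = generator in ("Unix Makefiles", "Ninja")
    if supported and pkg.spec["cmake"].satisfies("@3.5:"):
        args.append("-DCMAKE_EXPORT_COMPILE_COMMANDS:BOOL=ON")
    return args
```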
2024-02-13 16:47:40 -07:00
Jon Rood
59f56327fe Remove boost as dependency for trilinos+stk (#39556)
* Remove boost as dependency for trilinos+stk.

* Formatting.

* Put bounds on STK requirement of Boost.
2024-02-13 13:28:25 -07:00
Victor Brunini
e4c871489a boost: Add patch to workaround mpl compiler error with gcc 8.3 and c++17. (#39144)
Works around this issue: https://github.com/boostorg/mpl/issues/44

Co-authored-by: psakievich <psakiev@sandia.gov>
2024-02-13 12:23:13 -07:00
Nichols A. Romero
c048101d90 ELPA package: add patched version from Nov 2023 (#42539) 2024-02-13 10:47:28 -08:00
Alex Richert
e0304bf509 pdt: add aocc support (#42634) 2024-02-13 11:03:46 -07:00
Maciej Wójcik
0e2a9fe26a py-py-tes: add new package (#42531)
* py-py-tes: add new package

* py-py-tes: add constraint on Python
2024-02-13 10:31:11 -07:00
George Young
d1e01d5646 khmer: new package @2.1.1 (#42450)
* khmer: new package @2.1.1

* rationalising patch

* adding dep, modifying patch

* Update var/spack/repos/builtin/packages/khmer/package.py

Indeed!

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 10:02:44 -07:00
George Young
7b04910f84 possvm: new package @1.2 (#42516)
* possvm: new package @1.2

* black!

* appeasing flake8

* Updating commit ID

* Adding graphing dep

* Update var/spack/repos/builtin/packages/py-markov-clustering/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 09:49:36 -07:00
Maciej Wójcik
0664a2cdb2 py-pysftp: add new package (#42525)
* py-pysftp: add new package

* py-pysftp: correct version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 09:49:14 -07:00
Jeremy Fix
fc38fe1c69 Adds the spack recipe for building the pynpm python package (#42582)
* Adds the spack recipe for building the pynpm python package

* fix license header

* Update var/spack/repos/builtin/packages/py-pynpm/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 09:42:11 -07:00
George Young
29cb1d0ce0 cellpose: new package @2.2.3 (#42403)
* py-fastremap: new package @1.14.1

* py-pyqtgraph: new package @0.13.3

* py-roifile: new package @2024.1.10

* py-superqt: new package @0.6.1

* cellpose: new package @2.2.3

* Appeasing black ...

* Dropping the cuda variant

* Update var/spack/repos/builtin/packages/py-fastremap/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Adding python version dependency

* Update var/spack/repos/builtin/packages/py-pyqtgraph/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pyqtgraph/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-roifile/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-superqt/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Adding missing dep & conflict

* Appeasing black

* Adding missing py- prefix for dep

* Switching over to py-pyqt6

* Switching over to py-pyqt6

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 10:08:41 -06:00
Dani
ce777e3c89 py-biobb-structure-checking: new package (#42580)
* new builtin package: py-biobb-structure-checking

* Update var/spack/repos/builtin/packages/py-biobb-structure-checking/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* dependencies are also build dependencies

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 10:05:37 -06:00
Dani
bd44cedd0d new builtin package: py-biobb-io (#42451)
* new builtin package: py-biobb-io

* Update var/spack/repos/builtin/packages/py-biobb-io/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 10:03:42 -06:00
Dani
316a9a5d11 py-biobb-gromacs: add new package (#42579)
* new builtin package: py-biobb-gromacs

* Update var/spack/repos/builtin/packages/py-biobb-gromacs/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* added setuptools dependency

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-13 10:03:06 -06:00
Harmen Stoppels
4a04989bbb PythonExtension.add_files_to_view: link non-executable/non-shebang regular files (#42641) 2024-02-13 12:55:37 +01:00
Jordan Ogas
2c4b529896 squashfuse: add versions (#42589) 2024-02-13 09:03:49 +01:00
Henri Menke
e37c099ddb clfft: workaround compiler error (#42519) 2024-02-13 00:10:51 -07:00
George Young
4d7898a669 psipred: new package @4.02 (#42529)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-12 18:20:33 -08:00
Sreenivasa Murthy Kolam
91b0528abf ROCm packages and dependents: add 6.0.2 release (#42544)
* Bump up the version for rocm-6.0.2 release
* extend the patches that were created for apps for rocm-6.0.0 and rocm-6.0.2 releases
  (but apply hipfft patch for only 6.0.0)
2024-02-12 17:54:33 -08:00
Zack Galbreath
8ee3073350 More updates for GitLab CI memory requests (#42425)
* gitlab: remove requests for unreferenced packages

The packages removed in this commit are not built by any of
our current GitLab CI stacks.

* gitlab: update memory requests for "huge" packages

* gitlab: reduce memory requests for overprovisioned packages

* gitlab: more memory for py-torch (again)

* gitlab: update memory but keep CPU the same
2024-02-12 16:41:56 -06:00
Massimiliano Culpo
cb3c014a43 audit: detect self-referential depends_on (#42456) 2024-02-12 21:56:06 +01:00
Victoria Cherkas
2a01e9679a Adds latest releases of eccodes/eckit/metkit (#42618) 2024-02-12 13:41:15 -07:00
Harmen Stoppels
519deac544 Fix multiple issues with Python in views (#42601)
This fixes bugs, performance issues, and removes no longer necessary code.

Short version:

1. Creating views from Python extensions would error if the Spack `opt` dir itself was in some symlinked directory. Use of `realpath` would expand those, and keying into `merge_map` would fail.
2. Creating views from Python extensions (and Python itself, potentially) could fail if the `bin/` dir contains symlinks pointing outside the package prefix -- Spack keyed into `merge_map[target_of_symlink]` incorrectly.
3. In the `python` package the `remove_files_from_view` function was broken after a breaking API change two years ago (#24355). However, the entire function body was redundant anyways, so solved it by removing it.
4. Notions of "global view" (i.e. python extensions being linked into Python's own prefix instead of into a view) are completely outdated, and removed. It used to be supported but was removed years ago.
5. Views for Python extensions would _always_ copy non-symlinks in `./bin/*`, which is a big mistake, since all we care about is rewriting shebangs of scripts; we don't want to copy binaries. Now we first check if the file is executable, and then read two bytes to check if it has a shebang, and only if so, copy the entire file and patch up shebangs (see the sketch below).

The bug fixes for (1) and (2) basically consist of getting rid of `realpath` entirely, and instead simply keep track of file identifiers of files that are copied/modified in the view. Only after patching up regular files do we iterate over symlinks and check if they target one of those. If so, retarget it to the modified file in the view.
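
Point (5) reduces to a small predicate; a simplified sketch (not Spack's exact code):

```
import os
import stat

def is_script_with_shebang(path):
    # Copy (and patch) a bin/ file only if it is a regular, executable
    # file starting with "#!"; anything else can be linked as-is.
    st = os.lstat(path)
    if not stat.S_ISREG(st.st_mode) or not (st.st_mode & stat.S_IXUSR):
        return False
    with open(path, "rb") as f:
        return f.read(2) == b"#!"
```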
2024-02-12 19:52:52 +01:00
John Biddiscombe
c33a8dc223 h5hut: fix to work with latest hdf5 (H5_USE_110_API) (#42607) 2024-02-12 04:33:39 -07:00
Richard Berger
742e2fc7e4 caliper: allow newer papi to be used (#42501) 2024-02-12 11:53:57 +01:00
Jean Luca Bez
e90b616428 pdc: add v0.4 (#42508) 2024-02-12 11:52:54 +01:00
Brian Vanderwende
3d037c5150 nco: add v5.1.9 (#42512) 2024-02-12 11:46:07 +01:00
Ivan Maidanski
fd60e97784 bdw-gc: add v8.2.6 (#42524) 2024-02-12 11:38:12 +01:00
Thomas Madlener
c2cda6bc48 lcio: add v2.21 (#42514) 2024-02-12 11:37:10 +01:00
Thomas Madlener
1cd95a4bb7 podio: add 0.99 pre-release version and deprecate all older versions (#42526) 2024-02-12 11:36:30 +01:00
James Beal
727eaf3c82 samtools: add v1.19.2 (#42550)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-02-12 11:09:05 +01:00
Thomas Madlener
d014671bcb edm4hep: add v0.10.5, deprecate broken v0.10.4 (#42561) 2024-02-12 11:01:44 +01:00
Caetano Melone
4edb073a20 litestream: add new package (#42565) 2024-02-12 10:58:12 +01:00
G-Ragghianti
5d2b9514db Fixed papi release tar hash due to rebuild of tar file (#42567) 2024-02-12 10:57:12 +01:00
Ken Raffenetti
2491855678 mpich: Fix +vci variant for newer releases (#42570)
--with-ch4-max-vcis=default is no longer accepted by MPICH configure
since the 4.1 release. Just omit the option from the +vci variant, since
configure will select the default value in its absence.
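
A hedged sketch of the resulting logic (the per-VCI configure flag shown is an assumption, not quoted from the commit):

```
def configure_args(self):
    args = []
    if self.spec.satisfies("+vci"):
        # Request per-VCI critical sections, but pass no
        # --with-ch4-max-vcis flag at all: since MPICH 4.1 "default" is
        # rejected, and omitting the option lets configure choose.
        args.append("--enable-thread-cs=per-vci")
    return args
```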
2024-02-12 10:55:57 +01:00
Mark Grondona
b23038db53 flux-core: drop czmq,jsonschema requirements for recent versions (#42560)
Problem: Older versions of the flux-core package require czmq and
jsonschema, but these dependencies have been dropped in recent
versions.

Add `when=` arguments to drop these requirements for the appropriate
versions of flux-core.
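
A hedged sketch of the `when=` pattern (the version cutoff is illustrative, not the exact bound from the commit):

```
# Only older flux-core still needs these; the cutoff is hypothetical.
depends_on("czmq@3.0.1:", when="@:0.57")
depends_on("py-jsonschema@2.3:", type=("build", "run"), when="@:0.57")
```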
2024-02-12 10:55:00 +01:00
Julien Cortial
6d68dcf13c libuv: update compiler requirements (#42576) 2024-02-12 10:54:10 +01:00
Adam J. Stewart
fbf6db035b py-lightning: add v2.2.0 (#42577) 2024-02-12 10:49:17 +01:00
Richard Berger
1a43fc1e62 lammps: add v20240207 (#42587)
* lammps: correct license

LAMMPS has always been GPL-2.0 only. The clarification was made here:
9a8ac23663
2024-02-12 10:46:05 +01:00
Mikael Simberg
8210853276 pika: add v0.22.2 (#42617) 2024-02-12 02:38:50 -07:00
Sebastian Ehlert
e8bf6ab352 orca: added new versions (#38822) 2024-02-12 10:20:36 +01:00
Sinan
0aa91b99ed freexl: add v2.0.0 (#42574)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2024-02-12 10:16:31 +01:00
kinagaki-fj
d93b035f14 xmlf90: add version and fix for fujitsu compiler (#42305) 2024-02-12 10:14:11 +01:00
kinagaki-fj
adb0757f72 blitz: fix for fujitsu compiler (#42307)
Co-authored-by: m-shunji <m.shunji@fujitsu.com>
2024-02-12 10:11:26 +01:00
George Malerbo
2369a8f4e5 raylib: add new package (#42594) 2024-02-12 10:07:33 +01:00
Alec Scott
a6844d26e0 coreutils: add v9.4 (#42596) 2024-02-12 10:03:00 +01:00
Alec Scott
95c663224d curl: add v8.6.0 (#42597) 2024-02-12 10:02:26 +01:00
Brian Vanderwende
6dc01f0d94 Fix for tirpc variable in libdap4 (#42511) 2024-02-12 10:01:53 +01:00
Adam J. Stewart
2d3eeecef8 Sleef: fix build on macOS arm64, add test support (#42583) 2024-02-12 09:59:12 +01:00
Alec Scott
167a168a63 bfs: add v3.1 (#42595) 2024-02-12 09:54:45 +01:00
Elsa Gonsiorowski, PhD
10af235895 lwgrp: add autotools deps when building @main (#42564) 2024-02-12 09:40:52 +01:00
Cyrus Harrison
5d6dec5a5c conduit: add v0.9.1 (#42510) 2024-02-12 09:30:04 +01:00
Massimiliano Culpo
6ae0696e2f sirius: fix self-referential dependencies (#42552) 2024-02-12 09:21:14 +01:00
Massimiliano Culpo
9688e91a6b spla: fix self-referential dependencies (#42553) 2024-02-12 09:15:27 +01:00
Alec Scott
c026943c8e glab: add v1.36.0 (#42610) 2024-02-12 09:14:20 +01:00
Alec Scott
662ab2d4c5 xz: add v5.4.6 & v5.4.5 (#42612) 2024-02-12 09:13:49 +01:00
Alec Scott
722b00bbfb go: add v1.22.0 (#42611) 2024-02-12 09:12:31 +01:00
Alberto Invernizzi
6dbb56ba36 nvpl-lapack: fix versioning and add missing dependency (#42599)
* fix wrong versioning
use the documented version and not the one extrapolated from the path (i.e. 0.2.0.1)

* nvpl-lapack requires nvpl-blas
propagate matching variants to nvpl-blas dependency

Co-authored-by: albestro <albestro@users.noreply.github.com>
2024-02-12 09:11:47 +01:00
Dave Keeshan
2e5bdd2679 yosys: add v0.38 (#42616) 2024-02-12 09:08:03 +01:00
Maciej Wójcik
97ec167452 py-immutables: fix building, add new versions (#42530)
* py-immutables: fix building, add new versions

* Explain constraint

* py-immutables: Wrap line
2024-02-11 16:07:54 -07:00
Garth N. Wells
2b8dcc0f57 fenicsx: deprecate older versions (#42613)
* Deprecate FEniCSx packages

* Format files

* Simplification
2024-02-11 15:57:39 -07:00
Maciej Wójcik
6312658888 py-configargparse: add new versions (#42533)
* py-configargparse: add new versions

* py-configargparse: Remove unnecessary constraint
2024-02-11 15:52:37 -07:00
Mikhail Titov
213818ffb5 Update package versions: RADICAL-Cybertools (RE, RG, RP, RS, RU) (#41997)
* rct: update packages (RE, RG, RP, RS, RU) with new versions

* rct: updated style

* rct: 1.46 release

* rct: RP 1.46.1 (hotfix applied)

* rct: deprecated old versions

* Update var/spack/repos/builtin/packages/py-radical-entk/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-radical-pilot/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-radical-saga/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* rct: brought back dependencies for deprecated versions

* rct: RP hotfix release 1.46.2

* rct: new stack release 1.47.0

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-11 15:09:20 -06:00
Alex Leute
aed5d1a88d py-gpy: added +plotting variant (#42588)
Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-02-11 14:58:23 -06:00
Stephen Hudson
8b94128625 libEnsemble: add v1.2.0 (#42591) 2024-02-11 14:57:04 -06:00
Maciej Wójcik
4d91fbdf0f py-versioneer-518: add a workaround (#42534) 2024-02-11 14:48:29 -06:00
George Young
d7aa9d38fa py-deeptools: add 3.5.3 (#39951)
* py-deeptools: add 3.5.3

* switching to pypi

* removing patch for tests

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-11 14:31:46 -06:00
Eric Berquist
b3369ac669 Add pre-commit 3.6.0 (#42404)
* Add pre-commit 3.6.0

* pre-commit 3.6.0 drops support for Python <3.9
2024-02-11 14:29:26 -06:00
George Young
e68fde6f4e py-ucsf-pyem: updating to commit e92cd4d (#42428)
* py-ucsf-pyem: updating to commit e92cd4d

* py-pyfftw: updating to @0.13:1

* py-ucsf-pyem: correcting install of scripts

* Upper limits

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Upper limits

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* updates from github review

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-11 13:01:58 -06:00
Dani
f0e49a54c0 new builtin package: py-simpletraj (#42460)
* new builtin package: py-simpletraj

* long line split in two for the style test

* removed trailing whitespace for the style test
2024-02-11 12:56:31 -06:00
Robert Cohn
2c67571726 [oneapi]: make headers match oneapi vars.sh (#42614)
* [oneapi]: make headers match oneapi vars.sh

* update

* update
2024-02-11 08:23:53 -08:00
Maciej Wójcik
fae6d3780f gromacs: add new version (#42609) 2024-02-10 08:19:12 -07:00
Howard Pritchard
34ba8e9527 openmpi: add 5.0.2 (#42568)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-02-10 14:36:18 +01:00
Massimiliano Culpo
686d1bc1ea mfem: fix self-referential dependencies (#42487) 2024-02-10 13:08:13 +01:00
snehring
9b42f9ab18 viennarna package: add v2.6.4 (#42606) 2024-02-10 02:38:03 -07:00
Alec Scott
1d508e1be3 fzf package: add v0.46.1 (#42603)
* fzf: add v0.46.1
* Standardize vim plugin install
2024-02-09 16:29:20 -08:00
Alec Scott
41f735a4ee gh package: add v2.43.1 (#42604)
* gh: add v2.43.1
* gh: fix go dependency versions
2024-02-09 16:28:06 -08:00
James Taliaferro
e9e6eb613b libsixel: add new package (#40031)
* add new package libsixel

* reorder to group variants and dependencies together

* Switch to using source from fork

The original developer of libsixel, Hayaki Saito (@saitoha), disappeared
in early 2020 and has not updated the repository or been seen since.
However, a fork of the project has been created at libsixel/libsixel,
and that fork has been getting much more regular updates. In this commit
I switch the package to using the better-maintained, now apparently
canonical fork of the project.

That project switched the build system from autotools to Meson, so much
of the build arguments code needed to be rewritten. The newest official
release also contained an error in meson.build which would break on
newer versions of Meson. That error only applied to the libjpeg and
libpng variants, though, so I switched the default to use libgd which
has broader format support anyway.

* blacken

* broke description into multiple lines

* allow libjpeg, etc to be loaded automatically
2024-02-09 12:09:10 -08:00
Chris Marsh
6dd19594ab Resolve unzip build failure with %oneapi (#42593)
* apply patch to all compilers as per spack/issues/42007

* [@spackbot] updating style on behalf of Chrismarsh

* Resolve flake8 style issue from spackbot

* fix flake8 trailing whitespace

---------

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-02-09 12:04:05 -08:00
Harmen Stoppels
4f0a8fce52 hooks: remove 7 unused hooks (#42575)
These 7 hooks were not used.

- Six of them, related to install phases, were unused after `spack monitor`
  was removed, and the code seems to have bit-rotted, as there were
  reports they were not (always?) triggered when they should have been.
- The post-environment one was made redundant after `spack install` for
  environments started following the common code path for generating
  module files in #42147.

It should not be a breaking change to remove, since users cannot define
hooks in extensions; they would have to fork Spack.

If we ever _were_ to make those hooks extendable outside of core Spack,
it would also be better to start with fewer rather than more, because
everything you expose gets relied upon...

Removing those also allows us to rethink what hooks we really need, and
in particular it seems like we need a hook that runs post install also when
the spec is inserted into the database.
2024-02-09 20:52:09 +01:00
Massimiliano Culpo
7ff3b17f14 ASP-based solver: fix issue with conditional requirements and trigger conditions (#42566)
The solver lacked a rule to avoid enforcing requirements on multi-valued variants when the condition activating them was not met, which resulted in multiple optimal solutions. The fix is to prevent imposing a requirement if the when= rule activating it is not met.
2024-02-09 20:50:04 +01:00
Samuel Li
f71669175f sperr: add v0.8.1 (#42506) 2024-02-09 18:27:27 +01:00
simonLeary42
27b72b7691 Allow + in module file names (#41999) 2024-02-09 15:50:05 +01:00
Massimiliano Culpo
c5e0270ef0 dd4hep: remove self-referential dependencies (#42483)
This shouldn't be an issue, but express the self-reference
with a conflict.
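
Many of the surrounding commits remove the same anti-pattern; a hedged sketch, with the variant relationship invented for illustration:

```
# Anti-pattern inside class Dd4hep: the package depends on itself.
#   depends_on("dd4hep+ddg4", when="+ddcad")
# The same constraint expressed as a conflict instead:
conflicts("~ddg4", when="+ddcad", msg="+ddcad requires +ddg4")
```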
2024-02-09 15:13:00 +01:00
Harmen Stoppels
c59d2d5b1c docs: overhaul module_file_support.rst (#42585)
The section was highly outdated as it referred to old defaults, and
failed to mention `hide_implicits: true`.

This commit restructures it, moves some deeply nested sections a level
up, and promotes `hide_implicits: true` + `autoload: direct` before
talking about `exclude`.
2024-02-09 13:32:43 +01:00
Paul R. C. Kent
a832d31ccd llvm: add 17.0.5, 17.0.6 (#42571) 2024-02-09 11:29:12 +01:00
Veselin Dobrev
47a8bde4da Add the latest OpenSSL versions: 3.2.1, 3.1.5, 3.0.13 (#42581) 2024-02-09 10:30:01 +01:00
kwryankrattiger
13050a9fb3 CI: Add ability to enable and disable stacks (#42255)
It is useful to enable/disable stacks in order to handle turning
specific stacks on/off based on runner availability, stack stability,
testing requirements, etc.

The disabled stack list takes precedence over the enable stack list. The
assumption is that stacks that are disabled are so due to some
functionality missing or broken for that stack.

The enable stack list implicitly disables all stacks not listed in the
enable list.
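
A hedged sketch of that precedence rule (function shape assumed):

```
def selected_stacks(all_stacks, enable=None, disable=()):
    # A non-empty enable list implicitly disables everything not in it,
    # and the disable list always wins over the enable list.
    enabled = set(enable) if enable else set(all_stacks)
    disabled = set(disable)
    return [s for s in all_stacks if s in enabled and s not in disabled]
```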
2024-02-08 15:26:32 -06:00
Massimiliano Culpo
753e8b53d3 xyce: fix self-referential dependencies (#42557) 2024-02-08 21:04:58 +01:00
Robert Underwood
af49f81ec5 libice fix for older glibc (#42586) 2024-02-08 19:52:37 +01:00
Massimiliano Culpo
db1a7406ca pastix: fix self-referential dependencies (#42546) 2024-02-08 16:27:01 +01:00
Adam J. Stewart
ecc9145d2c spack help --spec: add @= notation (#42584) 2024-02-08 15:57:44 +01:00
Rocco Meli
7090983c67 cp2k: patch for compilation with RelWithDebInfo (#42563) 2024-02-08 15:42:43 +01:00
Sergey Kosukhin
09fdea959f openmpi: add patch fixing MPI_MIN for unsigned long (#32392) 2024-02-08 15:41:46 +01:00
Massimiliano Culpo
e419e4ca93 q-e-sirius: fix self-referential dependencies (#42549) 2024-02-08 12:44:00 +01:00
Massimiliano Culpo
9e8f6e8d54 mpich: fix self-referential dependencies (#42527) 2024-02-08 09:34:54 +01:00
Massimiliano Culpo
753d69856a suite-sparse: fix self-referential dependencies (#42556) 2024-02-08 09:34:26 +01:00
Massimiliano Culpo
6d28caefc7 grib-util: fix self-referential dependencies (#42558) 2024-02-08 09:33:38 +01:00
John W. Parent
7b9eac02ff Windows registry: improve search efficiency & error reporting (#41720)
* Registry queries can fail due to simultaneous access from other
  processes. Wrap some of these accesses and retry when this may
  be the cause (the generated exceptions don't allow pinpointing
  this as the reason, but we add logic to identify cases which
  are definitely unrecoverable, and retry if it is not one of
  these). See the sketch after this list.
* Make recursion optional for most registry search functions;
  disable recursive search in the case where it was originally
  always recursive.
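
A hedged sketch of the retry wrapper from the first bullet (the unrecoverable-error helper is hypothetical):

```
import time

def retry_registry_query(query, attempts=3, delay=0.1):
    # Registry reads can fail transiently while other processes hold a
    # key; retry unless the failure is identifiably unrecoverable.
    for attempt in range(attempts):
        try:
            return query()
        except OSError as err:
            if is_unrecoverable(err) or attempt == attempts - 1:  # hypothetical helper
                raise
            time.sleep(delay)
```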
2024-02-07 13:26:07 -08:00
Massimiliano Culpo
138f8ba6e2 strumpack: fix self-referential dependencies (#42554) 2024-02-07 19:00:26 +01:00
Jonathon Anderson
046f9e89a1 hpctoolkit: Add dependency on xxhash for @develop (#42513) 2024-02-07 18:05:08 +01:00
Kevin Huck
2cca64d01d apex: add new release, deprecate old options, remove boost (#42538) 2024-02-07 17:17:04 +01:00
Alberto Invernizzi
9aed13adb9 nvpl-blas and nvpl-lapack: new packages for NVidia performance libraries (#41775)
Co-authored-by: Raffaele Solcà <rasolca@cscs.ch>
2024-02-07 17:01:11 +01:00
Massimiliano Culpo
85b6c344bd r: fix self-referential dependencies (#42551) 2024-02-07 16:55:28 +01:00
Massimiliano Culpo
9359c9b9db plink2: fix self-referential dependencies (#42547) 2024-02-07 16:55:09 +01:00
Massimiliano Culpo
d43f62cc5f orc: fix self-referential dependencies (#42542) 2024-02-07 16:54:37 +01:00
Massimiliano Culpo
a30d4612f5 oce: fix self-referential dependencies (#42540) 2024-02-07 16:54:14 +01:00
Massimiliano Culpo
e32009a7e3 openscenegraph: fix self-referential dependencies (#42541) 2024-02-07 16:53:44 +01:00
Massimiliano Culpo
b0f3489d68 py-*: fix self-referential dependencies (#42548) 2024-02-07 16:52:26 +01:00
simonLeary42
9099e4c233 libcatalyst cmake_minimum_required (#42532)
d469357740
2024-02-07 16:10:46 +01:00
Vanessasaurus
a1b895e547 flux-core: add v0.59.0 (#42536)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-02-07 16:10:05 +01:00
Vanessasaurus
df1ff0affa flux-pmix: add v0.5.0 (#42537)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-02-07 16:09:34 +01:00
George Young
95e828f3d8 s4pred: add new package (#42520) 2024-02-07 16:05:14 +01:00
Rocco Meli
a28c6caac0 dbcsr: examples variant (#42543) 2024-02-07 15:54:19 +01:00
Rocco Meli
514260d8cb cp2k+mpi requires dbcsr+mpi (#42545) 2024-02-07 15:35:20 +01:00
Massimiliano Culpo
642ec1918f julia: fix self-referential dependencies (#42486) 2024-02-07 08:46:03 +01:00
Massimiliano Culpo
bdae9f776b netcdf-c: fix self-referential dependencies (#42528) 2024-02-07 08:22:56 +01:00
Massimiliano Culpo
5c86a3cca2 lammps: fix self-referential dependencies (#42521) 2024-02-07 08:22:11 +01:00
Massimiliano Culpo
ea53008604 molgw: fix self-referential dependencies (#42523) 2024-02-06 19:01:44 +01:00
Massimiliano Culpo
01ea8f46e7 ldak: fix self-referential dependencies (#42522) 2024-02-06 17:45:40 +01:00
jalcaraz
ff1e700b56 TAU: Added new test for other variants. (#42503)
Now it tests all GPUs, syscall and python.
2024-02-06 08:37:22 -08:00
renjithravindrankannath
5efa723289 mesa: updating llvm dependency for version 23 (#42481) 2024-02-06 14:53:31 +01:00
Massimiliano Culpo
4985f87a52 dbcsr: fix self-referential dependencies (#42482) 2024-02-06 14:41:27 +01:00
Massimiliano Culpo
ae000f963c hpx: remove self-referential dependencies (#42485)
This shouldn't be an issue, but avoid self references
on "^asio".
2024-02-06 13:37:40 +01:00
Olivier Cessenat
0960f691a1 keepassxc: new version 2.7.6 (#42478)
AUTOTYPE is set by default in the 2.7.6 release; it was not previously.
2024-02-05 18:24:42 +01:00
Mosè Giordano
713b19cac7 py-pythran: apply patch to fix compilation with GCC 13 (#42490)
* py-pythran: apply patch to fix compilation with GCC 13

* Update var/spack/repos/builtin/packages/py-pythran/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-05 09:08:41 -07:00
Thomas Madlener
05f1f07e51 edm4hep: add v0.10.4 (#42488) 2024-02-05 07:43:51 -07:00
Mosè Giordano
955a01dfa4 hh-suite: apply patch to fix compilation with GCC 13 (#42489)
This fixes errors like
```
     294    In file included from /build_stage/spack-stage-hh-suite-3.3.0-4kkv3zqhcadpubeo63l73xq3shr7gjmh/spack-src/src/a3m_compress.cpp:8:
  >> 295    /build_stage/spack-stage-hh-suite-3.3.0-4kkv3zqhcadpubeo63l73xq3shr7gjmh/spack-src/src/a3m_compress.h:37:37: error: 'uint16_t' has not been declared
     296       37 |   void writeU16(std::ostream& file, uint16_t);
     297          |                                     ^~~~~~~~
  >> 298    /build_stage/spack-stage-hh-suite-3.3.0-4kkv3zqhcadpubeo63l73xq3shr7gjmh/spack-src/src/a3m_compress.h:38:28: error: 'uint16_t' has not been declared
     299       38 |   void readU16(char** ptr, uint16_t &result);
     300          |                            ^~~~~~~~
  >> 301    /build_stage/spack-stage-hh-suite-3.3.0-4kkv3zqhcadpubeo63l73xq3shr7gjmh/spack-src/src/a3m_compress.h:40:37: error: 'uint32_t' has not been declared
     302       40 |   void writeU32(std::ostream& file, uint32_t);
     303          |                                     ^~~~~~~~
  >> 304    /build_stage/spack-stage-hh-suite-3.3.0-4kkv3zqhcadpubeo63l73xq3shr7gjmh/spack-src/src/a3m_compress.h:41:27: error: 'uint32_t' has not been declared
     305       41 |   void readU32(char**ptr, uint32_t &result);
     306          |                           ^~~~~~~~
```
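
The missing declarations live in `<cstdint>`. The commit applies a patch; an equivalent recipe-side sketch is shown below (the header path is taken from the log above, everything else is assumed):

```
@run_before("cmake")
def add_cstdint_include(self):
    # Prepend the missing header so uint16_t/uint32_t are declared.
    header = join_path(self.stage.source_path, "src", "a3m_compress.h")
    with open(header, "r+") as f:
        body = f.read()
        f.seek(0)
        f.write("#include <cstdint>\n" + body)
```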
2024-02-05 07:38:49 -07:00
Massimiliano Culpo
928ee7569c ecp-data-vis-sdk: remove self-referential dependencies (#42484)
This shouldn't be an issue, but express this limitation
with a conflict.
2024-02-05 15:17:46 +01:00
George Young
8190903821 motioncor2: add v1.6.4 (#42380)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-05 14:32:38 +01:00
George Young
473347df41 kentutils: update to @459, update download location & deps (#42448)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-05 04:23:33 -07:00
Adam J. Stewart
1ef52d7c8e py-pytest: add v8.0.0 (#42344) 2024-02-05 12:16:58 +01:00
Tom Payerle
561fe13bad cgal: add v5.3.2 (#42378) 2024-02-05 12:08:27 +01:00
George Young
c15ed38cef star: updating to 2.7.11a (#42011)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-05 12:06:01 +01:00
David Boehme
fa4568d9c9 kripke: add adiak dependency with Caliper enabled (#42414) 2024-02-05 11:17:24 +01:00
Nathalie Furmento
ef4f78a6cd starpu: add release 1.4.4 (#42446) 2024-02-05 11:12:50 +01:00
Adam J. Stewart
da36d069db py-lightly: fix conflict definition (#42449) 2024-02-05 11:11:23 +01:00
Cameron Rutherford
6ff3e17a7d hiop: require RAJA when cuda is enabled (#42407) 2024-02-05 11:04:08 +01:00
snehring
bb200be57d sentieon-genomics: add new version 202308.02 (#42465) 2024-02-05 11:01:00 +01:00
Olivier Cessenat
a238563fdb mpfr: add v4.2.1 (#42479) 2024-02-05 02:59:22 -07:00
Adam J. Stewart
bfc6f1d2a9 py-lightning: add v2.1.4 (#42468) 2024-02-05 10:59:00 +01:00
Olivier Cessenat
99b8a08366 stripack: sets the licence SPDX (#42473) 2024-02-05 10:45:23 +01:00
Olivier Cessenat
dba5ae939d visit-ffp and visit-unv: sets the licence SPDX (#42474) 2024-02-05 10:42:07 +01:00
Olivier Cessenat
7a4aa823d1 ngspice: new version 42 (#42475) 2024-02-05 10:41:06 +01:00
Olivier Cessenat
89e387cb67 latex2html: new version 2024 (#42476) 2024-02-05 10:38:24 +01:00
Olivier Cessenat
19c46de69f nlopt: new version 2.7.1 (#42477) 2024-02-05 10:27:01 +01:00
Michael Kuhn
e35fbfab77 gtkplus: add v3.24.41 and fix CUPS problems (#42480)
Fixes #42297
2024-02-05 10:23:57 +01:00
Massimiliano Culpo
478203dc68 asio: remove self-referential dependencies (#42469)
These shouldn't be an issue, but they can be expressed
in terms of variants on the package.
2024-02-05 10:10:58 +01:00
Massimiliano Culpo
5713ffd143 converge: fix self-referential dependencies (#42471) 2024-02-05 10:03:20 +01:00
Elsa Gonsiorowski, PhD
1711b6dee1 scr: fix @develop dependency versions (#42379) 2024-02-05 09:15:19 +01:00
Massimiliano Culpo
7fbd4afaaa cp2k: fix self-referential dependencies (#42472) 2024-02-05 09:12:47 +01:00
Massimiliano Culpo
f396dbcb4f berkeleygw: fix self-referential dependencies (#42470)
Also, remove a couple of duplicate directives
2024-02-05 09:04:32 +01:00
David Guibert
57fe3430fd py-pymummer: init (#42412)
* py-pymummer: init

* Update py-pymummer copyright date

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-04 06:27:46 -07:00
snehring
8428bef040 py-ete3: adding version 3.1.3 (#42462) 2024-02-04 06:12:19 -06:00
George Young
47c91c9163 ldsc: new package @2.0.1 (#42430)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-04 06:06:45 -06:00
David Guibert
6c57360eac py-pyfastaq: init (#42413) 2024-02-04 05:50:44 -06:00
George Young
624292d685 sicer2: new Python package @1.0.3 (#42383)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-04 05:17:27 -06:00
Mike Renfro
55ecc47dce fix ipyrad numpy dependencies (#42098)
* fix ipyrad numpy dependencies

ipyrad versions through 0.9.90 use np.int, which is deprecated in numpy 1.20 (see the sketch after this list).

* fix whitespace

* Correct numpy dependency restrictions
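
A hedged sketch of the tightened constraint (bounds are illustrative; the commit's exact ranges may differ):

```
# np.int was deprecated in numpy 1.20 (and later removed), so cap numpy
# for the ipyrad versions that still use it.
depends_on("py-numpy@:1.19", type=("build", "run"), when="@:0.9.90")
depends_on("py-numpy", type=("build", "run"), when="@0.9.91:")
```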
2024-02-04 05:16:22 -06:00
Massimiliano Culpo
f2125882c5 flecsi: fix constraints on mpi providers (#42447) 2024-02-03 12:11:03 +01:00
Massimiliano Culpo
d23cb39a3f quantum-espresso: fix self-referential dependencies (#42458) 2024-02-03 12:09:42 +01:00
Mosè Giordano
7f7d5b899a py-alphafold: use permanent link for openmm patch (#42461) 2024-02-03 11:07:16 +01:00
Harmen Stoppels
c44e854d05 Environment views: dependents before dependencies, resolve identical file conflicts (#42350)
Fix two separate problems:

1. We want to always visit parents before children while creating views
   (when it comes to ignoring conflicts, the first instance generated in
   the view is chosen, and we want the parent instance to have precedence).
   Our preorder traversal does not guarantee that, but our topological-
   order traversal does (see the sketch below).
2. For copy style views with packages x depending on y, where
   <x-prefix>/foo is a symlink to <y-prefix>/foo, we want to guarantee
   that:
   * A conflict is not registered
   * <y-prefix>/foo is chosen (otherwise, the "foo" symlink would become
     self-referential if relocated relative to the view root)

   Note that
   * This is an exception to [1] (in this case the dependency instance
     overrides the dependent)
   * Prior to this change, if "foo" was ignored as a conflict, it was
     possible to create this self-referential symlink

Add tests for each of these cases
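
A simplified sketch of rule (1), first-writer-wins over a topological order (the file iterator and view API are hypothetical):

```
def merge_files(specs_in_topo_order, view):
    # Visiting dependents before dependencies means a parent's instance of
    # a conflicting file is registered first, and therefore wins.
    chosen = {}
    for spec in specs_in_topo_order:            # parents first
        for rel_path, src in iter_files(spec):  # hypothetical iterator
            chosen.setdefault(rel_path, src)    # first instance wins
    for rel_path, src in chosen.items():
        view.link(src, rel_path)                # hypothetical view API
```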
2024-02-03 11:05:45 +01:00
John W. Parent
8fa8dbc269 Sqlite package: export api symbols on Windows (#42299)
* Sqlite requires the user to provide a command line arg (DYNAMIC_SHELL)
  to export shared symbols to import lib from .def
* Add other options recommended by Sqlite docs: 
  https://github.com/sqlite/sqlite/blob/master/doc/compile-for-windows.md
  * Some of these options mean we can restore variants that were
    disabled for Windows (fts, functions, rtree).
2024-02-02 13:27:53 -08:00
jalcaraz
e59303d4ff tau: update test functions (#42432)
* Updated with generic test functions

* Added level_zero test

As the compiler is not installed by TAU, it should be loaded first and run with --dirty. Will check if there is a way to make it work without --dirty.

* Update var/spack/repos/builtin/packages/tau/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-02 11:12:24 -08:00
Massimiliano Culpo
494d943a24 ascent: fix self-referential dependencies (#42457) 2024-02-02 19:25:56 +01:00
kwryankrattiger
d227da5554 CI: Call timing script in after_script (#42166)
The main script body is overwritten for Power. Putting the timing
aggregation in the after script allows it to be called on all of the
current pipelines.
2024-02-02 12:02:46 -06:00
Olivier Cessenat
714590426f astyle: new version 4.1.11 and build using cmake (#42390)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-02 18:58:27 +01:00
Massimiliano Culpo
9ffe179934 acts: fix self-referential dependencies (#42455) 2024-02-02 18:34:34 +01:00
dependabot[bot]
9c4e44a0ad build(deps): bump codecov/codecov-action from 4.0.0 to 4.0.1 (#42439)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.0.0 to 4.0.1.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](f30e4959ba...e0b68c6749)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-02 12:20:29 +01:00
dependabot[bot]
1ef69a8bfb build(deps): bump python-levenshtein in /lib/spack/docs (#42440)
Bumps [python-levenshtein](https://github.com/rapidfuzz/python-Levenshtein) from 0.23.0 to 0.24.0.
- [Release notes](https://github.com/rapidfuzz/python-Levenshtein/releases)
- [Changelog](https://github.com/rapidfuzz/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/rapidfuzz/python-Levenshtein/compare/v0.23.0...v0.24.0)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-02 12:20:02 +01:00
Massimiliano Culpo
55db090206 Extract low-level clingo wrappers from spack.solver.asp (#42429) 2024-02-02 12:19:38 +01:00
Matthew Whitlock
f8ce84860c Update packages_yaml.rst (#42438)
Fix an incorrect example.
2024-02-02 11:05:26 +01:00
Mosè Giordano
795360fe48 openmm: Apply patch use FindCUDAToolkit (#42437) 2024-02-02 11:04:03 +01:00
Axel Huebl
3d3d075496 WarpX: Disable CCache (#42434)
https://github.com/ECP-WarpX/WarpX/pull/4637
2024-02-02 11:02:00 +01:00
Frédéric Simonis
35630c219d preCICE: add v3.0.0 (#42426)
* preCICE: add version 3.0.0
* preCICE: add version conflict for libxml2 2.12.x
2024-02-01 22:56:18 -07:00
stefanfechter
5ef9bb7752 Bugfix: fix build of xforms (#35391)
This additional patch fixes the build of the now unmaintained library
xforms.
2024-02-01 13:07:57 -08:00
kjrstory
5bd5a219a6 Add algorithmic differentiation packages for SU2 (#39975)
* Add algorithmic differentiation packages for SU2
* Simplify checking boolean variants
* spack prefix and spec satisfies methods fix
* spelling fix
* style fix
2024-02-01 13:01:35 -08:00
fpruvost
c1450d26ff Add new package Qrmumps (#42393) 2024-02-01 13:00:14 -08:00
Weiqun Zhang
10d826597a amrex: add v24.02 (#42431) 2024-02-01 12:23:51 -07:00
Adam J. Stewart
00cbcd5dbc py-lightly: add v1.4.25 (#42421) 2024-02-01 11:54:01 -07:00
George Young
6bd8fda597 foldseek: new package @8 (#42422)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-01 11:33:28 -07:00
Rémi Lacroix
8b88255b43 pv: Add version 1.8.5. (#42419)
Switch to "tar.gz" source packages as the "tar.bz2" archives are not available for newer versions.
2024-02-01 09:54:53 -08:00
Rémi Lacroix
bcbd78cea9 ncdu: Add version 1.19 (#42420) 2024-02-01 09:49:57 -08:00
Adam J. Stewart
22ad28bb17 Python: add v3.12.1 (#42397) 2024-02-01 07:43:14 -07:00
Alberto Invernizzi
74bd8a9cf7 neovim: be more specific with lua dependencies (#42401) 2024-02-01 14:11:49 +01:00
Adam J. Stewart
023a6be67d Update PyTorch ecosystem (#42394) 2024-02-01 06:42:13 -06:00
Harmen Stoppels
922a1983f3 docs: git version section title + highlight issues (#42398)
* basic_usage: section title for git versions

* improve highlighting by using a comment instead of invalid syntax
2024-02-01 09:46:17 +01:00
Massimiliano Culpo
2758fc7e14 Remove caching in generated Dockerfiles (#42405) 2024-02-01 09:22:52 +01:00
dependabot[bot]
3541fcae01 build(deps): bump docker/metadata-action from 5.5.0 to 5.5.1 (#42416)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.5.0 to 5.5.1.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](dbef88086f...8e5442c4ef)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-01 09:14:31 +01:00
dependabot[bot]
eea06de6df build(deps): bump codecov/codecov-action from 3.1.6 to 4.0.0 (#42415)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 3.1.6 to 4.0.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](ab904c41d6...f30e4959ba)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-01 09:13:52 +01:00
Alberto Invernizzi
29658eddcc Lua: better specify providers in LuaPackage base class (#42392) 2024-02-01 08:57:12 +01:00
Tamara Dahlgren
2fc0d05a55 Environments: Add support for including views (#42250)
* Environments: Add support for including views (take 2)

* schema type hint fixes
2024-02-01 10:07:16 +09:00
Edward Hartnett
faf64f1a26 g2c: add v1.8.0 (#40761)
includes variant to build with future API for v2
2024-02-01 09:28:55 +09:00
Harmen Stoppels
d340523d68 jq: 1.7.1 (#42409) 2024-01-31 15:49:44 -08:00
Dave Sweeris
089fa12ff8 Update OpenSubdiv spec (#42381)
* Update package.py
   Add support for OpenSubdiv 3.5.x
* Fixed Dependencies
   I was getting errors about cmake not being able to find `xf86vm`, and adding a dependency on `libxxf86vm` fixes it.
2024-01-31 15:34:47 -08:00
Julius Plehn
d9528819a3 Adds nvhpc 24.1 (#42388) 2024-01-31 13:40:33 -08:00
Ben Wibking
e3d5ca2997 visit: set minimum silo version to 4.11 (#42072) 2024-01-31 22:05:17 +01:00
Sebastian Grimberg
8fc76ab325 Update recipe for Palace v0.12.0 (#42400) 2024-01-31 12:57:03 -08:00
Hector Martinez-Seara
e77678bd82 kim-api: added paths for bash/zsh completion (#31691)
Co-authored-by: Hector Mtz-Seara <hector@gmail.com>
Co-authored-by: Ryan S. Elliott <relliott@umn.edu>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: hseara <hseara@users.noreply.github.com>
2024-01-31 12:42:59 -07:00
Richard Berger
dafd8dbe87 Add sol2 package (#42402) 2024-01-31 12:09:07 -07:00
Chris White
5034919d23 add new release versions (#42362) 2024-01-31 10:58:07 -08:00
Harmen Stoppels
2bcfb72d38 environment/view: small cleanup (#42395) 2024-01-31 17:22:20 +01:00
Harmen Stoppels
f27bff81ba spack reproduce-build: accept URLs from web interface (#42261)
Sometimes the logs are too long and the copy & paste command is not
shown. In that case I'd like to just copy the failing GitLab job URL from
my browser and pass it to `spack reproduce-build <url>`.
2024-01-31 15:58:51 +01:00
Massimiliano Culpo
5c49bb45c7 ASP-based solver: decouple setup phase from clingo.backend (#41952)
Currently, the `SpackSolverSetup` and the `PyclingoDriver` are more coupled than necessary:
1. The driver object needs a setup object to be injected during a solve, 
2. And the setup object will get a reference back to the driver

This design is necessary because we use the low-level `clingo.backend` interface to set up our problem. This interface, though, is meant to bypass the grounder and add symbols directly into the grounded table, which is a feature we don't currently use.

The PR simplifies the encoding by having the setup object return the problem-specific facts / rules as a list of strings, and having the driver ingest them using the [clingo.Control.add](https://potassco.org/clingo/python-api/5.6/clingo/control.html#clingo.control.Control.add) method. This removes any use of the low-level interface.

Using this encoding makes it easy to hash the output of the setup phase, since it is returned as a string.
2024-01-31 15:37:59 +01:00
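
A rough sketch of the resulting encoding, using only the public clingo Python API (the fact strings below are hypothetical, not Spack's actual ASP program):

```python
import clingo

# setup phase: emit problem-specific facts/rules as plain ASP text
facts = "\n".join(["node(zlib).", "depends_on(foo, zlib) :- node(zlib)."])

# driver phase: ingest the text and let clingo ground it -- no clingo.backend
ctl = clingo.Control()
ctl.add("base", [], facts)
ctl.ground([("base", [])])
ctl.solve(on_model=print)
```

Since the setup output is now a plain string, hashing it (e.g. with hashlib) is straightforward.
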
Alex Richert
97fb9565ee py-nbconvert: avoid install-time downloads (#42024)
* py-nbconvert: avoid install-time downloads

* install css files as resources for py-nbconvert

* Update var/spack/repos/builtin/packages/py-nbconvert/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

* py-nbconvert: update dependencies for 7.14.1

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* Update var/spack/repos/builtin/packages/py-nbconvert/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update resources & remove style.min.css file

* Update package.py

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-31 08:28:41 -06:00
Alberto Invernizzi
72eaca23fe environments: develop paths were not getting expanded (#34986) 2024-01-31 15:18:25 +01:00
kwryankrattiger
1f11b3844a CI: Add OIDC capability for deprecated CI (#42371)
This "breaks" the deprecated schema by allowing unknown attributes
to the attributes section of the job types. The breaking change here is
that deprecated stacks will no longer ignore attributes that are unknown
but rather assume the new CI schema behavior of injecting them into the
generated CI configuration. This change is required to secure
authentication in Spack CI.
2024-01-31 15:05:57 +01:00
Rocco Meli
e129a6f47a Add +dlaf variant to cp2k in CI (#42346) 2024-01-31 11:54:45 +01:00
Harmen Stoppels
d983ac35fe ci: bump ghcr.io/spack/linux-ubuntu22.04-x86_64_v2 tag (#42357) 2024-01-31 09:59:03 +01:00
dependabot[bot]
2dcf4f709b build(deps): bump urllib3 from 2.1.0 to 2.2.0 in /lib/spack/docs (#42384)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.1.0 to 2.2.0.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.1.0...2.2.0)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-31 09:55:26 +01:00
dependabot[bot]
a171fe3565 build(deps): bump codecov/codecov-action from 3.1.5 to 3.1.6 (#42385)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 3.1.5 to 3.1.6.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](4fe8c5f003...ab904c41d6)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-31 09:54:11 +01:00
Jonathon Anderson
c9aeab58e6 hpctoolkit: refine dependencies (#42354)
* Force Dyninst <=12 before @2024.01

* Remove some +pic requirements

* Use virtual tbb dep
2024-01-31 08:27:05 +01:00
Harmen Stoppels
517dac6ff8 compression.py: refactor + bug fix (#42367)
Improve naming, so it's clear file "extensions" are not taken in the
`PurePath(path).suffix` sense as the original function name suggests,
but rather that the files are opened and their magic bytes are
classified.

Add type hints.

Fix a bug where `stream.read(num_bytes)` was run on the compressed
stream instead of the uncompressed stream, which can potentially break
detection of tar.bz2 files.

Ensure that when peeking into streams for magic bytes, they are reset to
their original position upon return.

Use new API in `spack logs`.
2024-01-31 07:59:07 +01:00
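
A minimal sketch of the peek-and-reset behavior described above (function and table names are illustrative, not the actual compression.py API):

```python
import io

MAGIC = {b"\x1f\x8b": "gzip", b"BZh": "bzip2", b"\xfd7zXZ\x00": "xz"}

def peek_magic(stream: io.BufferedIOBase, num_bytes: int = 6) -> bytes:
    """Read the leading bytes, restoring the stream position on return."""
    position = stream.tell()
    try:
        return stream.read(num_bytes)
    finally:
        stream.seek(position)

def classify(stream) -> str:
    head = peek_magic(stream)
    for magic, name in MAGIC.items():
        if head.startswith(magic):
            return name
    return "unknown"
```
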
jalcaraz
376653ec3d Updated last commit of TAU package with Dyninst test (#42387)
* Updated last commit of TAU package with Dyninst test

* The path to dyninst was missing when loading TAU
2024-01-30 19:07:33 -08:00
Harmen Stoppels
28eea2994f elf: relocate PT_INTERP (#42318)
Relocation of `PT_INTERP` in ELF files already happens to work from long to short paths, thanks to generic binary relocation (i.e. find and replace). This PR improves it:

1. Adds logic to grow `PT_INTERP` strings through patchelf (which is only useful if the interpreter and rpath paths are the _only_ paths in the binary that need to be relocated)
2. Makes shrinking `PT_INTERP` cleaner. Before this PR, when you would use Spack-built glibc as a link dep and relocate
executables using its dynamic linker, you'd end up with

   ```
   $ file exe
   exe: ELF 64-bit LSB pie executable, ..., interpreter /////////////////////////////////////////////////path/to/glibc/lib/ld-linux.so
   ```

   With this PR you get something sensible:

   ```
   $ file exe
   exe: ELF 64-bit LSB pie executable, ..., interpreter /path/to/glibc/lib/ld-linux.so
   ```

When Spack cannot modify the interpreter or rpath strings in-place, it errors out without modifying the file, and leaves both tasks to patchelf instead.

Also add type hints to `elf.py`.
2024-01-30 22:36:49 +01:00
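
An intentionally simplified sketch of the shrinking case (raw find-and-replace on bytes; the real code parses the program header and falls back to patchelf when in-place editing is impossible):

```python
def shrink_interp(data: bytes, old: bytes, new: bytes) -> bytes:
    """Replace a NUL-terminated interpreter path with a shorter one."""
    assert len(new) <= len(old), "growing the string is left to patchelf"
    # generic find-and-replace kept the length by left-padding with "/",
    # producing the "/////...ld-linux.so" artifact above; NUL-padding the
    # tail instead yields a clean path
    replacement = new + b"\x00" * (len(old) - len(new))
    return data.replace(old + b"\x00", replacement + b"\x00", 1)
```
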
Frédéric Simonis
6d55caabe8 precice: add release v2.5.1 (#42376) 2024-01-30 12:47:59 -08:00
Benjamin Fovet
d8260907df add gmsh v4.12.2 (#42375) 2024-01-30 12:13:07 -08:00
Wileam Y. Phan
9474f4bb33 gtpin: add versions 3.4 and 3.7 (#42373) 2024-01-30 12:05:43 -08:00
Rémi Lacroix
698b71e2fd autoconf: Fix patches' URLs (#42372) 2024-01-30 11:33:48 -08:00
John W. Parent
749301d133 MSVC: Broken ifx needs new $TMP (#42155)
Certain versions of ifx (the majority of those available) have an issue
where they are not compatible with TMP directories containing dot characters,
which precludes their use with CMake. Remap TMP to point to the stage
directory rather than whatever the default TMP is.
2024-01-30 10:18:54 -08:00
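
In package terms the remap might look like this sketch (placement in the recipe is an assumption):

```python
def setup_build_environment(self, env):
    # use the build stage directory for ifx's temporary files instead of
    # the system default TMP
    env.set("TMP", self.stage.path)
```
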
Massimiliano Culpo
a20c0de6d8 ns-3-dev: rewrite the package to use CMake (#34207) 2024-01-30 17:57:27 +01:00
dependabot[bot]
ad2ae63745 build(deps): bump pytest from 7.4.4 to 8.0.0 in /lib/spack/docs (#42361)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 7.4.4 to 8.0.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/7.4.4...8.0.0)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-30 08:23:50 -07:00
dependabot[bot]
d56ca4f107 build(deps): bump black from 24.1.0 to 24.1.1 in /lib/spack/docs (#42360)
Bumps [black](https://github.com/psf/black) from 24.1.0 to 24.1.1.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.1.0...24.1.1)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-30 15:23:42 +00:00
dependabot[bot]
e09c3ddec2 build(deps): bump black in /.github/workflows/style (#42359)
Bumps [black](https://github.com/psf/black) from 24.1.0 to 24.1.1.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.1.0...24.1.1)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-30 15:23:30 +00:00
Massimiliano Culpo
1882920c97 elsi: cleanup recipe (#42355) 2024-01-30 14:08:10 +01:00
Matthias Wolf
b70cb60b65 py-bluepyefe, py-igor2: new version, new package igor2 (#42191) 2024-01-30 03:29:07 -07:00
Adam J. Stewart
6a1e64f531 py-black: add v24.1.1 (#42343) 2024-01-30 03:14:02 -07:00
Nathalie Furmento
5217b20901 starpu: add release 1.4.3 (#42339) 2024-01-30 03:09:21 -07:00
James Beal
ed37969925 singularityce: add v4.x (#42347)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-01-30 03:08:57 -07:00
Henrik Finsberg
32230e6520 Add fenics development version and ufl-legacy (#42182)
* Add fenics development version and ufl-legacy

* Make sure python and setuptools are added to ufl-legacy and add version 2022.3.0 as well

* Run black and add maintainer to fenics package

* Fix typo

* Use Fiat version 2019.1.0

* Run black

* Add back master branch of fiat

* Remove master from the list of dolfin versions and add one extra line of each deps instead

* Run black

* Do not specify python version in ufl-legacy

* Remove python dependency from ufl-legacy

* Remove python dependency from ffc

* Add special case for master in ffc

* Run black

* Remove master from loop in ffc

* Run black again
2024-01-30 02:49:51 -07:00
Brad Geltz
7a712a11b9 geopm-service: New package and deprecate geopm (#41788)
* geopm: Mark all as deprecated

- This recipe will be removed in a future release.

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* Add py-sphinx-emoji

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* Add py-dasbus

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* py-pygobject: Add v3.46.0

- Previous versions error during build phase.

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* py-sphinx-tabs: Add new versions

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

* Add geopm-service

- Previous geopm package is now 2 packages:
  geopm-service and geopm-runtime.
- The GEOPM service is designed as a systemd/dbus
  service providing a userspace interface to
  privileged hardware telemetry and configurations.
- Installing via spack will enable some userspace
  testing, but generally most users will want to
  install the GEOPM service via the system package
  manager as root to get full functionality.
- This recipe will enable the creation of the fully
  userspace geopm-runtime recipe which will replace
  the old geopm recipe.

Signed-off-by: Brad Geltz <brad.geltz@intel.com>

---------

Signed-off-by: Brad Geltz <brad.geltz@intel.com>
2024-01-30 02:49:33 -07:00
Peter Scheibel
e63d8e6163 "spack logs": print log files for packages (either partially built or installed) (#42202) 2024-01-30 10:42:00 +01:00
Alex Leute
461a9093cd py-textual: Added package py-textual (#42291)
* py-textual: New package py-textual

* py-textual: Depend on py-mdit-py-plugins

* py-textual: Added dependency on python@3.8:3

* py-textual: Added a comment about why there is a dependency on
py-mdit-py-plugins

* py-textual: Ran black

---------

Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-01-30 03:19:39 -06:00
Caetano Melone
8edb6ff22a py-pytest-aiohttp: add package (#42313)
* add pytest-aiohttp

* black

* py-setuptools -> py-setuptools-scm

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-30 03:18:32 -06:00
Massimiliano Culpo
1f44be83af Add a PR template with a reference to spackbot commands (#42349) 2024-01-30 09:43:04 +01:00
Benjamin Fovet
1be078d01d gmsh: add v4.12.0 (#41854)
* disable building gmsh with oce for recent versions

Co-authored-by: Benjamin Fovet <benjamin.fovet@cea.fr>
2024-01-30 08:37:37 +01:00
Cristian Le
c07cde3308 Bump spglib version (#42340) 2024-01-29 16:06:29 -07:00
John W. Parent
dda8b1a5b8 VTK package: improve dependency-detection on Windows (#42300)
VTK struggles to consume some Spack-derived packages on Windows:
Patch VTK to allow a smoother integration

Also add install for examples as they are not part of the install
interface.
2024-01-29 14:42:36 -08:00
John W. Parent
ba45277640 gl2ps package: build only one of shared/static on Windows (#36576)
gl2ps tries to build static and shared libs simultaneously with
the same target name on the generator side. This causes a name
clash issue for Ninja on Windows (where the extension is .lib
in both cases).

Add a variant on Windows to force building only one of shared
or static, and patch the CMake build to enable use of this
variant.
2024-01-29 11:49:55 -08:00
Zack Galbreath
f03ae39fd1 Update GitLab memory requests (#42351)
* gitlab: remove commented-out duplicate entries

* gitlab: reclassify some packages from "huge" to "large"

Our observed max memory usage for these packages is as follows:

hipblas: 7.7G
qt: 6.6G
visit: 9.7G

All of these should fit within a "large" request (currently 12G).

* gitlab: remove pango from list of huge packages

This package is not currently built by any of our CI stacks.

* gitlab: update requests for high memory packages

Refine resource requests for memory-intensive packages based on
max memory usage data.
2024-01-29 19:28:58 +00:00
Arne Becker
62145122be perl-sql-translator: New package (#42319)
- Adds perl-sql-translator and its missing deps:
- Adds perl-import-into
- Adds perl-package-variant
- Adds perl-strictures

Built with build-time tests and comes with a simple run-time test.
2024-01-29 10:36:25 -08:00
Gilles Grospellier
44e33c3eb9 dotnet-core-sdk: Update to version 6.0.25 and add binaries for 'aarch64'. (#41739) 2024-01-29 10:10:58 -08:00
fpruvost
ed5ed3e31e Add fabulous, maphyspp, paddle packages. (#42287) 2024-01-29 10:06:35 -08:00
dependabot[bot]
cc866efba1 build(deps): bump dorny/paths-filter from 2.11.1 to 3.0.0 (#42294)
Bumps [dorny/paths-filter](https://github.com/dorny/paths-filter) from 2.11.1 to 3.0.0.
- [Release notes](https://github.com/dorny/paths-filter/releases)
- [Changelog](https://github.com/dorny/paths-filter/blob/master/CHANGELOG.md)
- [Commits](4512585405...0bc4621a31)

---
updated-dependencies:
- dependency-name: dorny/paths-filter
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-29 16:38:23 +01:00
Martin Diehl
f550262efb update damask to 3.0.0-beta (#42259) 2024-01-29 09:33:22 -06:00
Harmen Stoppels
890ec8d71c traverse: w/o deptype (#42345)
Add the empty deptype `spack.deptypes.NONE`.

Test the case `traverse_nodes(deptype=spack.deptypes.NONE)` to not
traverse dependencies, only de-duplicate.

Use the construct in environment views that otherwise would branch on
whether deps are enabled or not.
2024-01-29 16:31:50 +01:00
dependabot[bot]
62ed5ee318 build(deps): bump codecov/codecov-action from 3.1.4 to 3.1.5 (#42295)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 3.1.4 to 3.1.5.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](eaaf4bedf3...4fe8c5f003)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-29 16:27:44 +01:00
Mosè Giordano
0a10ff70bc openblas: use ARMV8SVE when target supports SVE feature (#42107) 2024-01-29 08:20:32 -07:00
Harmen Stoppels
0718e3459a filesystem: cleanup (#42342)
Type hints and removal of unused code
2024-01-29 14:43:17 +00:00
Rocco Meli
7ec93a496d MDAnalysis: add v2.7.0 (#41907)
* ensure umpire~cuda~rocm when ~cuda~rocm

* MDAnalysis: add v2.7.0

* Apply suggestions from code review

* license

* review changes

* add gsd and py-cmake versions

* Update package.py

* py-cmake

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fix and reorder

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-29 08:31:34 -06:00
Brian Vanderwende
2170386ad9 Improve NCL variant support and fix hdf-eos5 issues (#41259) 2024-01-29 15:28:32 +01:00
WuK
6c48effbf5 add py-vl-convert-python (#42073)
* add py-vl-convert-python

* code format

* Update package.py

add homepage

* Update var/spack/repos/builtin/packages/py-vl-convert-python/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove old versions, add todo

* Update var/spack/repos/builtin/packages/py-vl-convert-python/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove blank line

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-29 08:18:42 -06:00
Wouter Deconinck
97e3da0e3e ncurses: fix 5.9 and 6.0 on modern compilers (#41982) 2024-01-29 15:18:25 +01:00
Alex Leute
d6ff81ab4d py-metrics: Adding package py-metrics (#42220)
* New package: py-metrics

* [py-metrics] cleaned up extra comment

* Updated copyright and ran black

* py-pathspec: Added version 0.5.5

---------

Co-authored-by: vehrc <vehrc@rit.edu>
Co-authored-by: Jen Herting <jen@herting.cc>
Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-01-29 08:14:03 -06:00
Sam Grayson
6dfc2e7075 openldap: add findutils build dep (#42159) 2024-01-29 15:06:57 +01:00
Matthias Wolf
80e36d47c2 py-frozendict: patch up for Python 3.11 (#42192)
* py-frozendict: patch up for Python 3.11

See also Marco-Sulla/python-frozendict#68, rely on a pure Python
implementation when 3.11+ is used.

* mention related Github issue
2024-01-29 08:03:46 -06:00
snehring
e1826f89d4 tbl2asn: adding runtime dep libidn (#42171) 2024-01-29 14:59:10 +01:00
kjrstory
23df20fe6d of-precice: add new versions, update run environment (#40479) 2024-01-29 14:57:16 +01:00
Wouter Deconinck
10ef7a0466 containers: switch to quay.io/almalinuxorg images (#42193) 2024-01-29 14:55:56 +01:00
Alec Scott
7eba7d4028 go: simplify dependency on git to only run (#42208) 2024-01-29 14:49:45 +01:00
Dani
b09d741aed new builtin package: py-biobb-common (#41993)
* new builtin package: biobb-common

* removed whitespace for the style test

* replaced ' by " for the style test

* changed import line for the style test

* Update var/spack/repos/builtin/packages/biobb-common/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Update var/spack/repos/builtin/packages/biobb-common/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Update var/spack/repos/builtin/packages/biobb-common/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* removed test

* using pypi instead of git

I hope this is declared the right way.
I tried to test it locally, but I am having problems with a dependency (openblas), so I cannot even verify that it works.

* upgraded version

* Added py prefix

* removed biopython version restriction

* directory with new py-prefixed name

* Delete old non-prefixed package

* Update var/spack/repos/builtin/packages/py-biobb-common/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-29 13:52:57 +01:00
Sergey Kosukhin
594069961a netcdf-fortran: fix static linking in some cases (#35466)
* netcdf-fortran: enable building against static netcdf-c

* netcdf-fortran: strip the output of nc-config
2024-01-29 12:53:13 +01:00
Sean Koyama
9eb445f0a2 singularityce: add v3.11.4, v3.11.5 (#42251)
Co-authored-by: Servesh Muralidharan <smuralidharan@anl.gov>
2024-01-29 12:31:39 +01:00
Sean Koyama
e791f61c52 gnuplot: add v5.4.10, v6.0.0 (#42252)
Co-authored-by: Servesh Muralidharan <smuralidharan@anl.gov>
2024-01-29 12:30:47 +01:00
Thomas Madlener
b074e18402 py-jupytext: Add version 1.16 and update dependencies (#41552)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-29 12:26:58 +01:00
Alex Seaton
5b6a289d30 Added heyoka and mppp packages (#41839) 2024-01-29 11:03:48 +01:00
Jonathon Anderson
8e9bce44cc opencl-c-headers: install with CMake (#42173) 2024-01-29 10:47:52 +01:00
Ye Luo
7242238a25 quantum_espresso: relax constrait to allow +openmp ^elpa~openmp (#42322) 2024-01-29 10:41:52 +01:00
Mikael Simberg
ef26ee3f1f pika: Add 0.22.1 (#42338) 2024-01-29 02:37:56 -07:00
Adam J. Stewart
584ff9933a py-pandas: add v2.2.0 (#42205) 2024-01-29 03:34:07 -06:00
dependabot[bot]
bb07776606 build(deps): bump black from 23.12.1 to 24.1.0 in /lib/spack/docs (#42328)
Bumps [black](https://github.com/psf/black) from 23.12.1 to 24.1.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.12.1...24.1.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-29 10:28:28 +01:00
Pramod Kumbhar
07df50fbdc neuron: add new versions and clean-up the recipe (#41931)
* update tarball urls, add new versions and update maintainers
* remove unnecessary variant like cross-compile
* use self.define and self.define_from_variant
* improve regex / fix bug that matched -DMPICH_SKIP_MPICXX=1

Co-authored-by: Matthias Wolf <matthias.wolf@epfl.ch>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-01-29 09:46:48 +01:00
Luc Grosheintz
fc731f28cb highfive: add v2.9.0 (#42337) 2024-01-29 01:43:04 -07:00
Tom Payerle
6474d7ef00 hdf5: Make +subfiling variant depend on +mpi (#42324)
Based on CMakeLists.txt, the subfiling VFD requires a parallel HDF5 build,
so make the +subfiling variant depend on +mpi.

Should resolve #42323
2024-01-29 09:41:51 +01:00
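
One way to express the constraint in package.py terms, sketched below (the PR may use a different mechanism, e.g. a conflict instead of a conditional variant):

```python
# inside hdf5's package class (sketch): only offer the subfiling VFD
# when HDF5 itself is built parallel
variant(
    "subfiling",
    default=False,
    when="+mpi",
    description="Enable the subfiling VFD (requires parallel HDF5)",
)
```
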
Jonathon Anderson
0b23bbbbb0 hpctoolkit: update develop and add 2024.01.stable (#42309) 2024-01-29 09:05:02 +01:00
wspear
21406b119c Update tau 2.33.1 hash (#42336) 2024-01-28 09:44:55 -08:00
Adam J. Stewart
2b51980904 Apply black 2024 style to Spack (#42317) 2024-01-27 16:15:35 +01:00
Robert Cohn
1865e228c4 [intel-oneapi-mkl] workaround linking issue for threads=openmp (#42327) 2024-01-26 22:22:51 -07:00
wspear
179a1e423e pdt 3.25.2 (#42330)
Add support for -icpx for oneapi
2024-01-26 21:08:19 -07:00
wspear
bd8de5bf2d Add tau 2.33.1 (#42331) 2024-01-26 20:52:44 -07:00
eugeneswalker
7c8c7eedca xyce: break blis circularity in depends_on (#42321) 2024-01-26 20:22:55 -07:00
Massimiliano Culpo
8c1957c03e gnupg: add v2.4.4 (#42320) 2024-01-26 14:33:48 -08:00
eugeneswalker
803ad69eb1 hydrogen@1.5.3: cmake patch with ESCAPE_QUOTES (#42325) 2024-01-26 13:56:16 -08:00
Chris White
29d784e5fa Axom: Update/merge changes from Axom's repo (#42269)
* update axom package from axom's repo

Co-authored-by: white238 <white238@users.noreply.github.com>
Co-authored-by: Greg Becker <becker33@llnl.gov>
2024-01-26 11:08:21 -08:00
Dan LaManna
58b2201710 Stop passing manual AWS credentials to jobs (#42096) 2024-01-26 10:25:37 -07:00
Matthew Thompson
02605d577b hdf5: fix build error on Apple Clang 15 (#42264) 2024-01-26 18:18:57 +01:00
Adam J. Stewart
42de252c12 py-black: add v24.1.0 (#42316) 2024-01-26 07:48:21 -07:00
m-shunji
6c3c06a571 pexsi: fix to build with fujitsu-ssl2 (#42234)
Co-authored-by: inada-yoshie <inada.yoshie@fujitsu.com>
2024-01-26 11:00:22 +01:00
Harmen Stoppels
6a4573ce5a julia: patch for system lld and dsymutil (#42282) 2024-01-26 10:30:27 +01:00
Massimiliano Culpo
e77128dfa2 Run config audits in CI, add a new audit to detect wrongly named external specs (#42289) 2024-01-26 10:21:43 +01:00
Massimiliano Culpo
19df8e45ec Merge virtuals= from duplicate dependencies (#42281)
Previously, for abstract specs like:
```
foo ^[virtuals=a] bar ^[virtuals=b] bar
```
the second requirement was silently discarded on concretization. Now they're merged, and the abstract spec is equivalent to:
```
foo ^[virtuals=a,b] bar
```
2024-01-26 09:48:53 +01:00
eugeneswalker
4c7a1f541c e4s oneapi: use ghcr spack registry for runner image (#42267) 2024-01-26 02:02:58 +00:00
Daniele Cesarini
295e36efa3 COUNTDOWN package (#42123)
* Added countdown repo
* Added fixed version of COUNTDOWN
* Style fixes
* Changed maintainer syntax
* Variants listed in alphabetical order by name
* Added conflicts and some reordering
* Fixed conflicts syntax
* Style fixes
* [@spackbot] updating style on behalf of danielecesarini

---------

Co-authored-by: danielecesarini <danielecesarini@users.noreply.github.com>
2024-01-25 13:56:29 -08:00
eugeneswalker
3f47cc8d00 e4s neoverse-v2: use ghcr.io/spack image registry (#42268) 2024-01-25 13:45:29 -08:00
Matthew Thompson
4006020d78 pflogger: Add v1.12 (#42288)
* pflogger: add version 1.12
* Add version
* return MPI variant to false default
* [@spackbot] updating style on behalf of mathomp4

---------

Co-authored-by: mathomp4 <mathomp4@users.noreply.github.com>
2024-01-25 14:08:48 -07:00
kenche-linaro
6d4fa96aad linaro-forge: added 23.1.1 version (#42283) 2024-01-25 13:45:01 -07:00
Dom Heinzeller
85def2bfc7 Add eccodes@2.32.1 and eccodes@2.33.0 (#42257) 2024-01-25 13:34:10 -07:00
eugeneswalker
266bbad8cd e4s: add gromacs (#42266) 2024-01-25 11:48:25 -08:00
Derek Ryan Strong
1e3b7a6df1 Add fpart 1.6.0 (#42258) 2024-01-25 11:39:14 -08:00
Tom Payerle
00fe864321 cgal: Add version 5.6 (#42277)
Needed for latest version of sfcgal
2024-01-25 11:36:34 -08:00
Adam J. Stewart
3df720e909 py-lightly: add v1.4.26 (#42279) 2024-01-25 11:35:15 -08:00
Alberto Invernizzi
02a6ec7b3c CMake: disable Package Registry (#42149)
CMake may write to `~/.cmake` through `export(...)` and read from it through `find_package(...)`. We don't want this, as it may influence the build in a non-deterministic way, so disable it for all versions of `cmake`.
2024-01-25 19:04:03 +01:00
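
The CMake cache variables involved look roughly like this (a sketch; the exact set Spack passes may differ by CMake version):

```python
# arguments a CMake-based build could pass to stay clear of ~/.cmake
registry_args = [
    "-DCMAKE_EXPORT_NO_PACKAGE_REGISTRY=ON",        # export() won't write it
    "-DCMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY=ON",  # find_package() won't read it
    "-DCMAKE_FIND_USE_PACKAGE_REGISTRY=OFF",        # CMake >= 3.16 spelling
]
```
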
Greg Becker
d3c1f7a872 Fix using sticky variants in externals (#42253) 2024-01-25 17:22:22 +01:00
Robert Cohn
84568b3454 spack find mpiexec for impi (#42284) 2024-01-25 08:21:50 -08:00
Harmen Stoppels
2721b4c10d llvm: disable libomptarget AMDGPU plugin (#42265)
Fixes CI on develop
2024-01-25 08:43:29 +01:00
Jason Verley
31ed39303c Update Xyce and Trilinos recipes (#42194)
* Trilinos: Add AMD to SuiteSparse TPL list

When Trilinos is built with SuiteSparse support, it should enable AMD as
one of the TPLs. It was previously enabling only UMFPACK. The Xyce
package uses AMD (but not UMFPACK).

* Xyce: Add 7.8 release and various improvements

In addition to adding the latest Xyce release (7.8), all the earlier
releases were deprecated, with the exception of 7.7.

The Trilinos specification was updated to remove unneeded packages,
explicitly enable all needed packages, and specify additional
do-not-build packages.

The serial build is now the default, with MPI still being an option.

I also expanded the explanation for one of the patches; and, finally, I
took the opportunity to update the Xyce description to better match the
current Xyce README description.
2024-01-24 10:46:14 -08:00
Richard Berger
6bddecbf28 SLURM/MPICH fixes (#42225)
* slurm: add new versions

* mpich: apply hostlist_t patch for newer Slurm versions

* [@spackbot] updating style on behalf of rbberger

---------

Co-authored-by: rbberger <rbberger@users.noreply.github.com>
2024-01-24 10:45:23 -08:00
Harmen Stoppels
54aebbb587 generate modules of non-roots during spack install of env (#42147)
Fixes a bug where Spack did not generate module files of non-roots during
spack install with an active environment.

The reason being that Environment.new_installs only contained roots.

This PR:

- Drops special-casing of automatic module generation in post-install hooks.
- When `use_view`, computes environment variable modifications like always, and
  applies a view projection to them afterwards, like we do for spack env activate.
  This ensures we don't have to delay module generation until after the view is
  created.
- Fixes another bug in use_view where prefixes of dependencies would not be
  projected -- previously Spack would only temporarily set the current spec's prefix.
- Removes the one and only use of the post_env_write hook (but doesn't drop it, to
  make this backportable w/o changes)
2024-01-24 09:45:58 -08:00
Mikael Simberg
e46f3803ad pika: Add 0.22.0 (#42263) 2024-01-24 10:32:58 -07:00
Harmen Stoppels
28b7f72b96 set_packge_py_globals: only set pure build related globals on the root in build context (#42215)
Previously `std_args` was called on non-roots in a build context, which is redundant, and also leads to issues when `std_args` expects build deps of the `pkg` to be installed.
2024-01-24 10:14:13 +01:00
Jonathon Anderson
61421b3a67 hpctoolkit: Add build_system=meson (#42059)
HPCToolkit `develop` now can optionally be built via Meson. This PR adds the `build_system=(autotools|meson)` variant and splits the build-system-dependent pieces into `AutotoolsBuilder` and `MesonBuilder`. The default is to build with `autotools`.

As of writing, the Meson build is simply a wrapper around the original Autotools build; hence it requires a native file listing the install prefixes of all dependencies (which are internally mapped to `--with-{pkg}={prefix}` arguments for `./configure`). This is an unconventional but temporary state of affairs until the build system is fully ported to Meson and conventional dependency acquisition techniques like `pkg-config` and `cmake` are practically usable.
2024-01-23 13:20:10 -08:00
Sean Koyama
acbf0d99c4 gobject-introspection: fine-grained glib dependencies, add 1.78.1 (#42222) 2024-01-23 22:05:11 +01:00
Tamara Dahlgren
e7be8160dd Environments: Fix environment configuration (#42058)
* Environments: fix environment config
* Don't change the lockfile manifest path
* Update activate's comments to tie the manifest to the configuration
* Add spec_list override method
* Remove type checking from 'activate' since already have built-in check
* Refactor global methods tied to the manifest to be in its class
* Restore Environment's 'env_subdir_path' and pass its config_stage_dir to EnvironmentManifestFile
* Restore global env_subdir_path for use by Environment and EnvironmentManifestFile
2024-01-23 13:01:40 -08:00
jmlapre
2af6597248 pigz: add v2.8 (#42227) 2024-01-23 12:55:32 -08:00
eugeneswalker
a4444e4107 tau ^intel-oneapi-mpi: fix prefix specification (#42248) 2024-01-23 12:54:08 -08:00
Loris Ercole
4c86ecc531 autodock-gpu: build with the specified cuda_arch (#42244)
The CUDA target should be specified at build time, otherwise
by default `autodock-gpu` will be built for compute capabilities
52, 60, 61, 70, which may cause errors on unsupported cards.
2024-01-23 12:52:38 -08:00
m-shunji
890a46c071 cosma: fix to build with fujitsu-ssl2 (#42230) 2024-01-23 12:51:29 -08:00
m-shunji
8999b0c178 elpa: fix to build with fujitsu compiler (#42231) 2024-01-23 12:41:59 -08:00
Massimiliano Culpo
a68fcb2fb8 cp2k: add a separate MakefileBuilder (#42130) 2024-01-23 21:39:40 +01:00
Alex Richert
84868b57c7 Add v5.0.0 to ip (#42228)
* Add v5.0.0 to ip
* don't require test target for ip@3
2024-01-23 12:09:32 -08:00
Juan Miguel Carceller
087bf70979 edm4hep: change master to main (#42246)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-01-23 12:04:48 -08:00
James Beal
039c343d0a Add missing dependency for vcftools (#42047)
* Add missing dependency
* Change dependency to virtual package
   As suggested :)

---------

Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-01-23 11:01:24 -08:00
eugeneswalker
11abd94c04 turbine ^intel-oneapi-mpi: fix prefix specification (#42247) 2024-01-23 11:39:05 -07:00
Loris Ercole
b3d0f19fe7 spglib: add tests variant and cmake requirement (#42242)
Fixes #42241
2024-01-23 11:38:42 -07:00
Rémi Lacroix
9be7f2328f Spglib: update the homepage (#42243)
The previous link gives a 404 error.
2024-01-23 07:54:01 -07:00
Harmen Stoppels
40b8dfceed lapackpp: new version (#42245) 2024-01-23 14:36:37 +01:00
Jack Morrison
41b20aec2b libfabric: Add version 1.20.1 (#42219) 2024-01-23 05:19:00 -07:00
Simon Pintarelli
98109ce3ea Change cosma and costa default to +shared (#42238) 2024-01-23 04:33:32 -07:00
Christoph Junghans
61cd877145 votca: add v2024 (#42224) 2024-01-23 04:33:15 -07:00
Massimiliano Culpo
66813460c0 Add syntactic sugar for "strong preferences" and "conflicts" (#41832)
Currently requirements allow to express "strong preferences" and "conflicts" from
configuration using a convoluted syntax:
```yaml
packages:
  zlib-ng:
    require:
    # conflict on %clang
    - one_of: ["%clang", "@:"]
    # Strong preference for +shared
    - any_of: ["+shared", "@:"]
```
This PR adds syntactic sugar so that the same can be written as:
```yaml
packages:
  zlib-ng:
    conflict:
    - "%clang"
    prefer:
    - "+shared"
```
Preferences written in this way are "stronger" than the ones documented at:
- https://spack.readthedocs.io/en/latest/packages_yaml.html#package-preferences
2024-01-22 13:18:00 -08:00
Harmen Stoppels
ed9d495915 environment.py: drop early exit in install (#42145)
`spack install` early exit behavior was sometimes convenient, except
that it had and has bugs:

1. prior bug: we didn't mark env roots of already installed specs as
   explicit in the db
2. current bug: `spack install --overwrite` is ignored

So this PR simplifies by letting the installer do its thing even if
everything is supposedly installed.
2024-01-22 20:39:12 +01:00
Annop Wongwathanarat
7580ba4861 Revert "acfl: truncate version version number" (#42214) 2024-01-22 10:28:22 -08:00
renjithravindrankannath
c673979fee Bump up the version for ROCm-6.0.0 (#42026)
* Bump up the version for ROCm-6.0.0
* Adding patch files
* Style check failure fix
* Style check fixes
* Style check error fixes
* Patch to remove hipblas client file installation in 6.0
* Patch need to be applied on all 5.7 relases
* 6.0 update for math libs and other packages, new github url etc
* Correct package-audit failures
* Correcting shasum for rocfft patch and limiting patch in rocblas
* Reverting updates in rocprofiler-dev due to ci-gitlab failure
* Fixes for ci-gitlab failure due to disabling hip backward compatibility
* Adding patch file to Change HIP_PLATFORM from HCC to AMD and NVCC to NVIDIA
* Use the gcnArchName inplace of gcnArch as gcnArch is deprecated from rocm-6.0.0
* Patches to fix magma and blaspp build error with rocm 6.0.0
* Patch for mfem and arborx for rocm 6.0
* Style check error fix
* Correcting style check errors
* Updating dependent version
* Update for petsc to build with rocm 6.0
  Need reverting-operator-mixup-fix-for-slate.patch for rocm 6.0
* Reverting the change in url for 2.7.4-rocm-enhanced
* hip-tensor 6.0.0 update
2024-01-22 10:19:28 -08:00
Wouter Deconinck
7acd5bdc7f gaudi: new version 37.2 (#42210)
No major changes, https://gitlab.cern.ch/gaudi/Gaudi/-/compare/v37r1...v37r2?from_project_id=38&straight=false
2024-01-22 09:45:43 -08:00
Adam J. Stewart
3a4800754e py-keras: add v3.0.4, v3.0.3 (#42206)
* py-keras: add v3.0.3
* py-keras: add v3.0.4
2024-01-22 09:31:45 -08:00
Wouter Deconinck
72f8611d3e acts: new version 32.0.0 (#42207)
* acts: new version 32.0.0
  No major build system changes, see [diff](https://github.com/acts-project/acts/compare/v31.2.0...v32.0.0). Summary of changes:
  - updated actsvg version requirement
  - MLpack dependency removed as of 32.0.0 (https://github.com/acts-project/acts/pull/2863)
* actsvg: new version 0.4.39 (new variant web)
* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-01-22 09:29:19 -08:00
Wouter Deconinck
a0cd63c210 dd4hep: new version 1.27.2 (#42211)
New bugfix version, without any major changes, https://github.com/AIDASoft/DD4hep/compare/v01-27-01...v01-27-02.
2024-01-22 09:10:02 -08:00
kwryankrattiger
2d9c6c3222 CMakePackage pass python hints automatically (#42201)
This commit ensures that CMake packages that also have Python as a build/link dep get a couple defines for the Python path so that CMake's builtin `FindPython3`, `FindPython`, `FindPythonInterp` modules can locate Python correctly.

The main problem with those CMake modules is that they first search for Python versions known at the time of release, meaning that an old CMake may find an older system Python 3.8 even though Python 3.11 comes first in `CMAKE_PREFIX_PATH` and `PATH`.

Package maintainers can opt out of this by overriding the `find_python_hints = False` attribute in the package class.
2024-01-22 16:31:16 +01:00
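
A sketch of the kind of hints this implies (the helper shape is illustrative; the three defines target FindPython3, FindPython, and the legacy FindPythonInterp respectively):

```python
def python_hints(python_exe: str) -> list:
    # point all three CMake Python-finding modules at the same interpreter,
    # so an old CMake cannot fall back to an older system Python it knows of
    return [
        f"-DPython3_EXECUTABLE={python_exe}",  # FindPython3
        f"-DPython_EXECUTABLE={python_exe}",   # FindPython
        f"-DPYTHON_EXECUTABLE={python_exe}",   # FindPythonInterp (legacy)
    ]
```
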
Harmen Stoppels
b28692dc72 repo.py: pass package name not fully qualified package name (#42217) 2024-01-22 14:44:13 +00:00
Maciej Wójcik
dee0f138b8 py-nglview: add new package and dependency (py-versioneer-518) (#42079)
* Add nglview package

* Use slightly older version

* py-nglview: Correct py-versioneer version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-nglview: Correct version of py-jupyter-packaging dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Add py-versioneer-518 package

* py-versioneer-518: Correct version

* py-nglview: Numpy is needed during build for the tests

* py-nglview: dependency needed for tests

* py-nglview: Correct dependency types

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-22 05:39:36 -06:00
Matthias Wolf
91ff20cf7a py-bluepyemodel: add newer versions (#42188)
* py-bluepyemodel: add newer versions

* re-add license marker
2024-01-21 11:54:35 -06:00
Matthias Wolf
1084d3c261 py-nexusforge: add new version, fix dependencies (#42189) 2024-01-21 11:52:43 -06:00
Maciej Wójcik
124b41c1a0 snakemake: add new versions (#42074)
* py-toposort: Add newer versions

* snakemake: Add more recent versions with dependencies

* Typo in copyright

* Reorder dependencies, remove comments

* Whitespace
2024-01-21 11:18:19 -06:00
snehring
d282ec8179 py-tesorter: requires setuptools (#42124) 2024-01-21 11:17:52 -06:00
eugeneswalker
58997f7f9a e4s ci: use latest intel/hpckit 2024 based image (#41437)
* e4s ci: use latest intel/hpckit 2024 based image

* use latest container image: ecpe4s/ubuntu22.04-runner-amd64-oneapi-2024.0.0:2023.12.01

* comment out failing specs

* update to use patched container

* remove generalized package preference for intel-oneapi-mkl@2023

* change packages commented out
2024-01-20 20:17:44 -08:00
eugeneswalker
104a2b5e11 e4s ci: switch to neoverse_v2 target (#42115) 2024-01-20 16:28:29 -08:00
Robert Cohn
e198c13161 [kokkos] make dpl dependence explicit (#42128) 2024-01-20 16:26:21 -08:00
Wouter Deconinck
10b4481ba5 motif: patch to ensure main function (fixes #29594) (#42174) 2024-01-21 00:36:20 +01:00
Sam Grayson
4c1fbc9fdb libtool: add build dep on findutils (#42199) 2024-01-20 14:58:06 -07:00
Adam J. Stewart
02dc10c053 py-scikit-learn: add v1.4.0 (#42186) 2024-01-20 06:47:45 -06:00
Pranav Sivaraman
0c880369d8 libvterm: add v0.3.3 (#42203) 2024-01-20 11:07:10 +01:00
Adam J. Stewart
5b101ef105 py-pyamg: fix deptypes (#42185) 2024-01-20 09:28:25 +01:00
Alec Scott
c6acaaf145 fzf: add v0.45.0 (#42175) 2024-01-19 12:34:18 -08:00
Alec Scott
6883d6bf86 go: add v1.21.6 and v1.21.5 (#42177) 2024-01-19 12:33:19 -08:00
Alec Scott
a77bde66de emacs: add v29.2 (#42178) 2024-01-19 12:31:56 -08:00
Alec Scott
a5e40ae36d restic: add v0.16.3 (#42180) 2024-01-19 12:31:03 -08:00
Alec Scott
e556e92ec9 ripgrep: add v14.1.0 (#42181) 2024-01-19 12:30:08 -08:00
Adam J. Stewart
b233c255bd py-rtree: add v1.2.0 (#42200) 2024-01-19 12:17:38 -08:00
Seth R. Johnson
ee16d59221 ROOT: fix macos build for 6.30 (#42198) 2024-01-19 15:09:46 -05:00
Harmen Stoppels
39a7780754 test/cmd/checksum.py: avoid networking (#42190) 2024-01-19 20:00:38 +01:00
Harmen Stoppels
ce81175cf3 Revert "cc: work around -v split between ld and ccld" (#42196) 2024-01-19 17:59:41 +01:00
Seth R. Johnson
c31a998abb VecGeom: new version 1.2.7 and fix URLs (#42144)
* VecGeom: new version 1.2.7 and fix URLs

* vecgeom: remove deprecated ancient RC version with incorrect hash

* geant4: remove support for @10.3+vecgeom
2024-01-19 11:30:31 -05:00
Harmen Stoppels
00c4efb96e environment.py: remove symlinking of logs (#42148)
The piece of code that is removed in this PR predates environment views.

Spack would symlink build logs in `<env>/.spack-env/logs/*`, but this is
redundant because:

1. Views already add `<prefix>/.spack` (and there's logic there to avoid
   clashes)
2. The code was broken anyways: it would only symlink the logs of
   environment roots, not their deps, even if they were just built.

If users disable views, I'm pretty sure they're not waiting for
`.spack-env/logs` either. So, imo we can delete this code, and it was
probably overlooked in the past.
2024-01-19 17:10:03 +01:00
Harmen Stoppels
edc8a5f249 Typing for spack checksum code paths (#42183) 2024-01-19 13:56:20 +00:00
Massimiliano Culpo
75e96b856e btop: add v1.3.0, added GPU variant (#42139) 2024-01-19 10:11:56 +01:00
Harmen Stoppels
549ab690a8 oci: use pickleable errors (#42160) 2024-01-19 09:37:33 +01:00
Peter Scheibel
621e203a8e Bugfix: spack config change handle string requirements (#42176)
For a requirement like

```
packages:
  foo:
    require:
    - "+debug"
```

(not `one_of:`, `any_of:`, or `spec:`)

`spack config change` would ignore the string. This was particularly evident if toggling a variant for a previously unmentioned package:

```
$ spack config change packages:foo:require:+debug
$ spack config change packages:foo:require:~debug
```

This fixes that and adds a test for it.
2024-01-19 08:10:39 +00:00
John W. Parent
d7bcaa29c0 sqlite package: support Windows build (#41924)
Resubmission of #41761 with proper relocation of get_arch
(taken from #41824).

Co-authored-by: vsoch <vsochat@stanford.edu>
2024-01-18 23:02:06 -08:00
Brian Han
e9a0273538 umpire - Use ENABLE_OPENMP to build with OpenMP support (#42164) 2024-01-18 17:38:39 -07:00
Wouter Deconinck
0d7aa6d811 qt: new version 5.15.12 (#42135)
Qt5.15.12 (open source) was released on December 27, [release notes](https://code.qt.io/cgit/qt/qtreleasenotes.git/about/qt/5.15.12/release-note.md). No major build system changes expected.
2024-01-18 17:32:55 -07:00
Wouter Deconinck
40c61b0600 thepeg: v2.3.0 hash change (#42167)
```
$ curl 'https://thepeg.hepforge.org/downloads/?f=ThePEG-2.3.0.tar.bz2' | sha256sum 
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 1904k    0 1904k    0     0  1075k      0 --:--:--  0:00:01 --:--:-- 1074k
ac35979ae89c29608ca92c156a49ff68aace7a5a12a0c92f0a01a833d2d34572  -
```
2024-01-18 17:08:13 -07:00
Christopher Christofi
b54a69e8d1 centrifuge: add new package (#42168)
* centrifuge: add new package
* fix styling
2024-01-18 17:03:52 -07:00
Harmen Stoppels
4d54688782 Reduce the size on disk for logs (#42122)
* Reduce the size on disk for logs

This PR does two things:

1. Store a compressed `spack-build-out.txt.gz`
2. Get rid of phase logs, as they are duplicates of
   `spack-build-out.txt`

The logs are not compressed in the stage dir, so on build failure the
workflow for users is no different.

It's just that on install the logs are rarely used, and if needed, users
can easily `gzip -d` or `zgrep` them.

In the case of GCC installs, the compressed logs are <5% of the original
size, which is typically dozens of MBs.

* get rid of "backwards compat" of file names in stage dirs
2024-01-18 16:02:11 -08:00
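
Reading the compressed log stays a one-liner with the standard library (the path shown is illustrative):

```python
import gzip

# roughly equivalent to `zgrep -i error spack-build-out.txt.gz`
with gzip.open("spack-build-out.txt.gz", "rt", errors="replace") as log:
    for line in log:
        if "error" in line.lower():
            print(line, end="")
```
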
Brian Han
2d37d5474d chai - make CUDA_SEPARABLE_COMPILATION flag optional (#42165) 2024-01-18 16:53:28 -07:00
Robert Cohn
8ac27241b6 [embree] rely on spack to provide location of tbb (#42163) 2024-01-18 16:48:27 -07:00
Christopher Christofi
fd70ac2d99 perl-graph: fix default version ordering to latest releases (#42161)
* perl-graph: fix default version ordering to latest releases
* fix styling
2024-01-18 16:44:09 -07:00
snehring
a38acfb195 libidn: adding new package libidn (#42170) 2024-01-18 16:34:17 -07:00
John W. Parent
6fa7d8b6a6 Skip sbang hook on Windows (#42156)
Sbangs don't exist on Native Windows, and the hook is causing errors
due to the file comparison + behavior of os.rename on Windows. Skip
the hook on Windows.
2024-01-18 15:33:14 -08:00
eugeneswalker
eb5494e9cc ginkgo@1.7.0 %oneapi: patch sycl w changes from ginkgo pr #1524 (#42151)
* ginkgo@1.7.0 %oneapi: patch sycl w changes from ginkgo pr #1524

* constrain patch to %oneapi@2024:
2024-01-18 13:57:24 -08:00
Richard Berger
671dab97d5 ucx: add explicit dependency to hsa-rocr-dev (#42152)
Fixes an issue that occurs when hip is provided as an external.
hsa-rocr-dev would not be part of the dependency tree in that case.
2024-01-18 13:30:48 -08:00
Satish Balay
27a3ba1a59 kokkos-kernels: add v4.1.00, v4.2.00 (#40565)
* kokkos-kernels: add version 4.1.00

* add kokkos-kernels@4.2.00

* [kokkos] make dpl dependence explicit

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: Cohn, Robert S <robert.s.cohn@intel.com>
2024-01-18 14:25:09 -07:00
Andrey Perestoronin
87e836f353 add new advisor and vtune packages (#42150) 2024-01-18 15:00:37 -05:00
John W. Parent
883014e56a Tcl package: support build on Windows (#41939) 2024-01-18 12:47:52 -07:00
Robert Cohn
eb625321ae [intel-oneapi-mkl] patch mkl install to workaround cmake issue (#42146) 2024-01-18 12:33:00 -07:00
Dave Keeshan
ddfec30941 Add verible v0.0-3483-ga4d61b11 (#42142) 2024-01-18 10:51:56 -08:00
Wouter Deconinck
922ad2fbc6 opencascade: new versions 7.5.3p5, 7.7.2, 7.8.0 (#42136)
New patch 7.5.3p5, new bugfix 7.7.2, new minor 7.8.0.

Only possible impact on spack is the potential addition of a variant to select the memory manager in 7.8.0, see [diff](https://github.com/Open-Cascade-SAS/OCCT/compare/V7_7_2...V7_8_0). Not adding a variant at this time.
2024-01-18 10:50:15 -08:00
Dave Keeshan
e8e6b90044 Add yosys 0.37 (#42141) 2024-01-18 10:48:46 -08:00
Dom Heinzeller
560bb9f507 Bug fix for building ESMF shared on macOS: set ESMF_TRACE_LIB_BUILD=OFF (#42134) 2024-01-18 10:36:12 -08:00
Juan Miguel Carceller
78e5e31558 evtgen: add version 02.02.01 (#42055)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-01-18 08:42:03 -08:00
Harmen Stoppels
cb685b049d oci: only push in parallel when forking (#42143) 2024-01-18 17:28:50 +01:00
Alberto Invernizzi
51c02be909 intel-oneapi-mkl: add missing compiler libraries for thread=openmp (#42087) 2024-01-18 08:14:16 -05:00
Massimiliano Culpo
f7db6bf3f9 spack: add v0.21.0 and v0.21.1 (#42140) 2024-01-18 05:28:02 -07:00
Massimiliano Culpo
203d682d87 spack graph: env aware (#42093) 2024-01-18 10:11:41 +01:00
Peter Scheibel
7b27591321 New command: spack config change (#41147)
Like `spack change` for specs in environments, this can e.g. replace `examplespec+debug` with `examplespec~debug` in a `require:` section.

Example behavior for a config like:

```
packages:
  foo:
    require:
    - spec: +debug
```

* `spack config change packages:foo:require:~debug` replaces `+debug` with `~debug`
* `spack config change packages:foo:require:@1.1` adds a requirement to the list
* `spack config change packages:bar:require:~debug` adds a requirement
2024-01-18 00:21:17 -08:00
Harmen Stoppels
9539037096 papi: Fix Gitlab CI by adding a conflict with 7.1:%cce until -ffree-form is resolved (#41847)
Co-authored-by: Alec Scott <alec@bcs.sh>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-01-17 18:13:12 -07:00
Peter Scheibel
453ecdb77e Config path quote handling: keys with quotes (#40976)
As observed in #40944, when using `spack config add <path>`, the `path` might
contain keys that are enclosed in quotes.

This was broken in https://github.com/spack/spack/pull/39831, which assumed that
only the value (if present, the final element of the path) would use quotes.

This preserves the primary intended behavior of #39931 (allowing ":" in values when
using `spack config add`) while also allowing quotes on keys.

This has complicated the function `process_config_path`, but:
* It is not used outside of `config.py`
* The docstring has been updated to account for this
* Created an object to formalize the DSL, added a test for that, and
  refactored parsing to make use of regular expressions as well.
* Updated the parsing and also updated the `config_path_dsl` test with an explicit check.
  At a higher level, split the parsing to check if something is either a key or not:
  * in the first case, it is covered by a regex
  * in the second, it may be a YAML value, but in that case it would have to be the last
    entry of x:y:z, so in that case I attempt to use the YAML handling logic to parse it as such
2024-01-17 17:11:27 -08:00
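
A toy version of the quote-aware splitting (the regex and helper name are illustrative; the real process_config_path additionally distinguishes keys from trailing YAML values):

```python
import re

# a quoted key, or a run of characters up to the next ":" separator
TOKEN = re.compile(r'"[^"]+"|\'[^\']+\'|[^:]+')

def split_config_path(path: str) -> list:
    return [token.strip("\"'") for token in TOKEN.findall(path)]

print(split_config_path('packages:"foo:bar":require:+debug'))
# ['packages', 'foo:bar', 'require', '+debug']
```
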
Arne Becker
796d251061 perl-kyotocabinet: new package and lzo and lzma compression to kyotocabinet (#41772)
- Add Perl module KyotoCabinet
- Add lzo and lzma compression to kyotocabinet
2024-01-18 02:03:47 +01:00
Arne Becker
308761d5f9 unison: update to 2.53.3, deprecate old versions we can't build in spack (#41777)
- Use MakefilePackage and simplified package.py

- Deprecate old versions - they did not build for me with OCaml 4.13.1
  that is currently in Spack. Also, the changes from the previous
  versions seem to be quite significant.
2024-01-18 01:58:31 +01:00
snehring
ded778004e lammps: add latest stable and recommended version 20230802.2 (#42126) 2024-01-18 01:02:50 +01:00
Sam Grayson
ad5d4ed235 krb5: add perl as a build dependency (#42114) 2024-01-17 15:32:25 -08:00
Tal Ben-Nun
349867c879 Limit patching Catch2 to the newer @3: version range (#42019) 2024-01-18 00:12:19 +01:00
afzpatel
277e1ff396 rpp: add a variant to install tests, update mivisionx dependency (#41774) 2024-01-17 15:42:55 -07:00
Richard Berger
eda4d3fa06 FleCSI updates (#42127)
* flecsi: simplify hdf5 variant logic
* flecsi: deprecate 1.4 version
2024-01-17 13:49:33 -08:00
Tamara Dahlgren
68e00e7073 Packages requiring manual downloads: improve error message (#42017)
Spack packages may not have a public download option, and can implement
`download_instr` to inform users how to obtain the artifacts needed to
build. `spack checksum` however did not account for this and would print
out a confusing error message when invoked on such packages ("Could not
find any remote versions").

This PR updates the error message to output the manual download instructions
if `spack checksum` is invoked on a package with `manual_download = True`.
2024-01-17 21:45:57 +00:00
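For context, a hypothetical sketch of such a package (names, URL, and checksum are invented; `manual_download` and `download_instr` are the attributes the message refers to):

```python
from spack.package import *


class ExampleLicensed(Package):
    """Hypothetical package whose sources sit behind a license portal."""

    homepage = "https://example.com"
    manual_download = True
    download_instr = (
        "Download example-licensed-1.0.tar.gz from https://example.com/downloads "
        "and add it to a local Spack mirror."
    )

    version("1.0", sha256="0" * 64)  # placeholder checksum

    def install(self, spec, prefix):
        install_tree(".", prefix)
```

With something like this in place, `spack checksum example-licensed` would print the download instructions instead of "Could not find any remote versions".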
Auriane R
69d762ce6a Broaden conflict between rocblas 5.2 and gcc 12 (#42064) 2024-01-17 11:49:26 -08:00
Matthieu Dorier
e92716ff2d build_environment.py: clean LUA_PATH and LUA_CPATH (#42101)
For better build isolation
2024-01-17 20:00:07 +01:00
Tom Scogland
0eaab09e88 fix pyright for package files (#42112) 2024-01-17 09:23:25 -08:00
Tom Scogland
c508ff1e5f cc: work around -v split between ld and ccld (#42111) 2024-01-17 09:04:04 -08:00
Arne Becker
0f920a85e4 perl-search-elasticsearch: New package (#42028)
Adds Search::Elasticsearch
2024-01-17 17:39:49 +01:00
Arne Becker
a12ecb112a perl-email-stuffer: New package (#42119)
Adds Email::Stuffer
2024-01-17 17:37:35 +01:00
Arne Becker
9640d30ea9 perl-rose-db-object and deps: New packages (#42029)
- Deprecates 1.63 in DateTime
- Adds Rose::DateTime
- Adds Rose::DB
- Adds Rose::DB::Object
2024-01-17 17:35:46 +01:00
David Guibert
85b2becd06 gnuplot: fix undefined ref to symbol libiconv_open (#42116)
This fixes #39720.
2024-01-17 17:22:06 +01:00
Auriane R
0331b0d044 Relax conflict in pika with cxxstd >= 20 and cuda <= 11 (#42118)
* Relax conflict with cxxstd >= 20 and cuda <= 11

* Update comment to be more specific to nvcc
2024-01-17 15:31:37 +01:00
WuK
1ce81fc299 add py-cairosvg py-cssselect2 (#42067)
* add py-cairosvg py-cssselect2

* Update package.py

add homepage

* Update package.py

add homepage
2024-01-17 07:32:31 -06:00
Maciej Wójcik
5b1b97aa49 py-reretry: add new versions (#42108)
* Update py-reretry package

* py-reretry: Remove yanked version
2024-01-17 07:30:42 -06:00
Thomas Bouvier
799ab6974c pyarrow: add versions up to v14.0.2 (#42109)
* pyarrow: add versions up to v14.0.2

* arrow: add v14.0.2
2024-01-17 07:26:49 -06:00
Taillefumier Mathieu
d94b7b9033 Version updates of SIRIUS (#42121)
Signed-off-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
2024-01-17 14:13:55 +01:00
Alex Leute
a01adb7bdc py-multi-imbalance: Added package py-multi-imbalance (#42094)
Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-01-17 07:12:29 -06:00
Loris Ercole
4ad62d8b09 cp2k: fix 'gpu_map' bug (#42009) 2024-01-17 13:55:01 +01:00
Harmen Stoppels
28675478ce Create reproducible tarballs in VCSFetchStrategy.archive (#42042)
Currently when you repeatedly create a bootstrap mirror that includes
`clingo-bootstrap@spack` you get different tarballs every time.

This is a general problem with mirroring checkouts from version control
as tarballs. I think it's best to create tarballs ourselves, since that way we
have more control over their contents.

This PR ensures normalized tarballs like we do for build caches:

- normalize file permissions (in fact that was already inspired by git, so
  should be good)
- normalized file creation/modification time (timestamp 0)
- uid / guid = 0, no usernames
- normalized gzip header
- dir entries are ordered by `(is_dir, name)` where strings are not locale aware ;)

- POSIX says st_mode of symlinks is unspecified, so work around it and
  force mode to `0o755`
2024-01-16 21:11:43 -08:00
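The same normalization can be sketched with the standard library alone (a simplified illustration of the idea, not Spack's implementation):

```python
import gzip
import os
import tarfile


def reproducible_tarball(src_dir, dest):
    """Tar up src_dir so that repeated runs produce byte-identical output."""

    def normalize(info: tarfile.TarInfo) -> tarfile.TarInfo:
        info.uid = info.gid = 0        # uid / gid = 0
        info.uname = info.gname = ""   # no usernames
        info.mtime = 0                 # creation/modification time -> 0
        if info.issym():
            info.mode = 0o755          # POSIX leaves symlink mode unspecified
        return info

    entries = []
    for root, dirs, files in os.walk(src_dir):
        dirs.sort()   # deterministic, bytewise (not locale-aware) ordering
        files.sort()
        entries.extend(os.path.join(root, name) for name in dirs + files)

    # mtime=0 also normalizes the gzip header
    with gzip.GzipFile(dest, "wb", mtime=0) as gz:
        with tarfile.open(fileobj=gz, mode="w") as tar:
            for path in entries:
                tar.add(path, arcname=os.path.relpath(path, src_dir),
                        recursive=False, filter=normalize)
```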
Tom Scogland
c05ed2c31a mark more things as build-tools (#42110) 2024-01-16 18:33:23 -07:00
Christopher Christofi
c4d2f11368 Revert "perl-constant: add new package" (#42099) 2024-01-16 17:19:12 -08:00
James Beal
c1ba631943 Add versions (#42105)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-01-16 17:18:14 -08:00
Adam J. Stewart
0b3bd21bd5 py-torchdata: update checksum (#42113) 2024-01-16 14:24:10 -08:00
Lydéric Debusschère
8317477daf py-sphinx-toolbox: new package (#41313)
* py-sphinx-toolbox: new package
* py-sphinx-toolbox: fix dependence py-typing-inspect

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2024-01-16 11:53:22 -08:00
Raffaele Solcà
07c1f7ced8 Add dla-future 0.4.0 (#42106) 2024-01-16 12:48:17 -07:00
Arne Becker
264c0d6428 perl-email-mime: New package (#42089)
Adds Email::MIME
2024-01-16 11:38:07 -08:00
Arne Becker
2836dcaf4e perl-email-sender: New package (#42090)
Adds Email::Sender
2024-01-16 11:35:23 -08:00
Christopher Christofi
dba556c1f8 Revert "perl-memoize: add new package with version 1.16" (#42097) 2024-01-16 11:11:34 -08:00
WuK
6792e2c3a7 add cutlass@3.3.0 (#42071) 2024-01-16 10:54:09 -08:00
Harmen Stoppels
bc9b39cb73 r: improve relocatability (#42030)
R embeds an absolute path to the `which` executable in the sources for
`Sys.which`. This gets ultimately stored as serialized byte code in some
custom database format, which uses compression for entries.

As a result, Spack cannot relocate `<prefix which>/bin/which` when
installing from a build cache.

The patch works around this by making R create a symlink to `which` in
its own prefix, have the R sources call that, so that relocation works
again.

See https://github.com/r-devel/r-svn/pull/151
2024-01-16 13:37:25 +01:00
Massimiliano Culpo
f0c69ff3bf Fix a bug when a required provider is requested for multiple virtuals (#42088) 2024-01-16 11:50:33 +01:00
Massimiliano Culpo
ddae696cf8 Fix using fully-qualified namespaces from root specs (#41957)
Explicitly requested namespaces are annotated during
the setup phase, and used to retrieve the correct package
class.

An attribute for the namespace has been added for each node.

Currently, a single namespace per package is allowed
during concretization.
2024-01-16 11:47:32 +01:00
Mosè Giordano
f449832d6f openblas: add v0.3.26 (#42086) 2024-01-16 11:09:59 +01:00
Ronald Rahaman
a2189cb9b4 mvapich2: add pmi_version variant for pmix, pmi1 support (#40665) 2024-01-16 10:27:13 +01:00
Michael Kuhn
70ec14d930 qt-base: fix xcb plugin not being built (#42070)
Qt requires quite a few X11/xcb dependencies to be able to compile the
xcb platform plugin. See https://doc.qt.io/qt-6/linux-requirements.html
2024-01-16 09:48:01 +01:00
Dave Keeshan
1225bd7a44 verilator: add v5.020 (#42077) 2024-01-16 09:45:05 +01:00
James Beal
7a8a168b81 The signature of filter_file is (#42092) 2024-01-15 12:08:04 -07:00
Arne Becker
cf3f275716 perl-test-yaml: New package (#42044)
Adds Test::YAML
2024-01-15 09:38:03 -07:00
WuK
d95bcd8d85 add new versions of py-altair (#42068)
* add new versions of py-altair

* fix year

* Update var/spack/repos/builtin/packages/py-altair/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-altair/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* reorder dependencies

* remove rc

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-01-15 05:17:56 -07:00
eugeneswalker
5f58a4c079 add e4s aarch64 (#42066) 2024-01-14 14:26:51 -08:00
Maciej Wójcik
adc56ac792 Fix packages inheriting GROMACS, add new versions (#42076)
* Fix inheritance of GROMACS derived packages, add new versions

* Reformatting
2024-01-14 07:15:05 -07:00
1192 changed files with 40157 additions and 15144 deletions

.github/pull_request_template.md (new file)

@@ -0,0 +1,6 @@
<!--
Remember that `spackbot` can help with your PR in multiple ways:
- `@spackbot help` shows all the commands that are currently available
- `@spackbot fix style` tries to push a commit to fix style issues in this PR
- `@spackbot re-run pipeline` runs the pipelines again, if you have write access to the repository
-->


@@ -43,7 +43,7 @@ jobs:
. share/spack/setup-env.sh
$(which spack) audit packages
$(which spack) audit externals
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d # @v2.1.0
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab # @v2.1.0
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits


@@ -57,7 +57,7 @@ jobs:
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: docker/metadata-action@dbef88086f6cef02e264edb7dbf63250c17cef6c
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta
with:
images: |
@@ -96,7 +96,7 @@ jobs:
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@f95db51fddba0c2d1ec667646a06c2ce06100226
uses: docker/setup-buildx-action@0d103c3126aa41d772a8362f6aa67afac040f80c
- name: Log in to GitHub Container Registry
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d
@@ -118,7 +118,5 @@ jobs:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}
push: ${{ github.event_name != 'pull_request' }}
cache-from: type=gha
cache-to: type=gha,mode=max
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}


@@ -40,7 +40,7 @@ jobs:
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@4512585405083f25c027a35db413c2b3b9006d50
- uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36
id: filter
with:
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below


@@ -1,5 +1,5 @@
black==23.12.1
clingo==5.6.2
black==24.2.0
clingo==5.7.1
flake8==7.0.0
isort==5.13.2
mypy==1.8.0


@@ -91,7 +91,7 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,linux,${{ matrix.concretizer }}
# Test shell integration
@@ -122,7 +122,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: shelltests,linux
@@ -181,7 +181,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d # @v2.1.0
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab # @v2.1.0
with:
flags: unittests,linux,clingo
# Run unit tests on MacOS
@@ -216,6 +216,6 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,macos


@@ -33,7 +33,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,windows
unit-tests-cmd:
@@ -57,7 +57,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,windows
build-abseil:


@@ -1130,6 +1130,10 @@ A version specifier can also be a list of ranges and specific versions,
separated by commas. For example, ``@1.0:1.5,=1.7.1`` matches any version
in the range ``1.0:1.5`` and the specific version ``1.7.1``.
^^^^^^^^^^^^
Git versions
^^^^^^^^^^^^
For packages with a ``git`` attribute, ``git`` references
may be specified instead of a numerical version i.e. branches, tags
and commits. Spack will stage and build based off the ``git``


@@ -87,7 +87,7 @@ You can check what is installed in the bootstrapping store at any time using:
.. code-block:: console
% spack find -b
% spack -b find
==> Showing internal bootstrap store at "/Users/spack/.spack/bootstrap/store"
==> 11 installed packages
-- darwin-catalina-x86_64 / apple-clang@12.0.0 ------------------
@@ -101,7 +101,7 @@ In case it is needed you can remove all the software in the current bootstrappin
% spack clean -b
==> Removing bootstrapped software and configuration in "/Users/spack/.spack/bootstrap"
% spack find -b
% spack -b find
==> Showing internal bootstrap store at "/Users/spack/.spack/bootstrap/store"
==> 0 installed packages
@@ -175,4 +175,4 @@ bootstrapping.
This command needs to be run on a machine with internet access and the resulting folder
has to be moved over to the air-gapped system. Once the local sources are added using the
commands suggested at the prompt, they can be used to bootstrap Spack.


@@ -199,6 +199,7 @@ def setup(sphinx):
("py:class", "contextlib.contextmanager"),
("py:class", "module"),
("py:class", "_io.BufferedReader"),
("py:class", "_io.BytesIO"),
("py:class", "unittest.case.TestCase"),
("py:class", "_frozen_importlib_external.SourceFileLoader"),
("py:class", "clingo.Control"),
@@ -215,6 +216,7 @@ def setup(sphinx):
("py:class", "spack.spec.InstallStatus"),
("py:class", "spack.spec.SpecfileReaderBase"),
("py:class", "spack.install_test.Pb"),
("py:class", "spack.filesystem_view.SimpleFilesystemView"),
]
# The reST default role (used for this markup: `text`) to use for all documents.


@@ -73,9 +73,12 @@ are six configuration scopes. From lowest to highest:
Spack instance per project) or for site-wide settings on a multi-user
machine (e.g., for a common Spack instance).
#. **plugin**: Read from a Python project's entry points. Settings here affect
all instances of Spack running with the same Python installation. This scope takes higher precedence than site, system, and default scopes.
#. **user**: Stored in the home directory: ``~/.spack/``. These settings
affect all instances of Spack and take higher precedence than site,
system, or defaults scopes.
system, plugin, or defaults scopes.
#. **custom**: Stored in a custom directory specified by ``--config-scope``.
If multiple scopes are listed on the command line, they are ordered
@@ -196,6 +199,45 @@ with MPICH. You can create different configuration scopes for use with
mpi: [mpich]
.. _plugin-scopes:
^^^^^^^^^^^^^
Plugin scopes
^^^^^^^^^^^^^
.. note::
Python version >= 3.8 is required to enable plugin configuration.
Spack can be made aware of configuration scopes that are installed as part of a python package. To do so, register a function that returns the scope's path to the ``"spack.config"`` entry point. Consider the Python package ``my_package`` that includes Spack configurations:
.. code-block:: console
my-package/
├── src
│   ├── my_package
│   │   ├── __init__.py
│   │   └── spack/
│   │   │   └── config.yaml
└── pyproject.toml
adding the following to ``my_package``'s ``pyproject.toml`` will make ``my_package``'s ``spack/`` configurations visible to Spack when ``my_package`` is installed:
.. code-block:: toml
[project.entry_points."spack.config"]
my_package = "my_package:get_config_path"
The function ``my_package.get_config_path`` in ``my_package/__init__.py`` might look like
.. code-block:: python
import importlib.resources
def get_config_path():
dirname = importlib.resources.files("my_package").joinpath("spack")
if dirname.exists():
return str(dirname)
.. _platform-scopes:
------------------------


@@ -357,91 +357,23 @@ If there is a hook that you would like and is missing, you can propose to add a
``pre_install(spec)``
"""""""""""""""""""""
A ``pre_install`` hook is run within an install subprocess, directly before
the install starts. It expects a single argument of a spec, and is run in
a multiprocessing subprocess. Note that if you see ``pre_install`` functions associated with packages, these are not hooks
as we have defined them here, but rather callback functions associated with
a package install.
A ``pre_install`` hook is run within the install subprocess, directly before the install starts.
It expects a single argument of a spec.
""""""""""""""""""""""
``post_install(spec)``
""""""""""""""""""""""
"""""""""""""""""""""""""""""""""""""
``post_install(spec, explicit=None)``
"""""""""""""""""""""""""""""""""""""
A ``post_install`` hook is run within an install subprocess, directly after
the install finishes, but before the build stage is removed. If you
write one of these hooks, you should expect it to accept a spec as the only
argument. This is run in a multiprocessing subprocess. This ``post_install`` is
also seen in packages, but in this context not related to the hooks described
here.
A ``post_install`` hook is run within the install subprocess, directly after the install finishes,
but before the build stage is removed and the spec is registered in the database. It expects two
arguments: spec and an optional boolean indicating whether this spec is being installed explicitly.
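For illustration, a hook of that shape might look like the following (a sketch, not a hook shipped with Spack):

.. code-block:: python

   def post_install(spec, explicit=None):
       # Runs in the install subprocess: the prefix exists, the build stage
       # is still around, and the spec is not yet in the database.
       if explicit:
           print(f"explicitly installed {spec.name} in {spec.prefix}")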
""""""""""""""""""""""""""""""""""""""""""""""""""""
``pre_uninstall(spec)`` and ``post_uninstall(spec)``
""""""""""""""""""""""""""""""""""""""""""""""""""""
""""""""""""""""""""""""""
``on_install_start(spec)``
""""""""""""""""""""""""""
This hook is run at the beginning of ``lib/spack/spack/installer.py``,
in the install function of a ``PackageInstaller``,
and importantly is not part of a build process, but before it. This is when
we have just newly grabbed the task, and are preparing to install. If you
write a hook of this type, you should provide the spec to it.
.. code-block:: python
def on_install_start(spec):
"""On start of an install, we want to...
"""
print('on_install_start')
""""""""""""""""""""""""""""
``on_install_success(spec)``
""""""""""""""""""""""""""""
This hook is run on a successful install, and is also run inside the build
process, akin to ``post_install``. The main difference is that this hook
is run outside of the context of the stage directory, meaning after the
build stage has been removed and the user is alerted that the install was
successful. If you need to write a hook that is run on success of a particular
phase, you should use ``on_phase_success``.
""""""""""""""""""""""""""""
``on_install_failure(spec)``
""""""""""""""""""""""""""""
This hook is run given an install failure that happens outside of the build
subprocess, but somewhere in ``installer.py`` when something else goes wrong.
If you need to write a hook that is relevant to a failure within a build
process, you would want to instead use ``on_phase_failure``.
"""""""""""""""""""""""""""
``on_install_cancel(spec)``
"""""""""""""""""""""""""""
The same, but triggered if a spec install is cancelled for any reason.
"""""""""""""""""""""""""""""""""""""""""""""""
``on_phase_success(pkg, phase_name, log_file)``
"""""""""""""""""""""""""""""""""""""""""""""""
This hook is run within the install subprocess, and specifically when a phase
successfully finishes. Since we are interested in the package, the name of
the phase, and any output from it, we require:
- **pkg**: the package variable, which also has the attached spec at ``pkg.spec``
- **phase_name**: the name of the phase that was successful (e.g., configure)
- **log_file**: the path to the file with output, in case you need to inspect or otherwise interact with it.
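A sketch of such a hook (hypothetical, for illustration only):

.. code-block:: python

   import os

   def on_phase_success(pkg, phase_name, log_file):
       # e.g. report how much output a successful phase produced
       size = os.path.getsize(log_file)
       print(f"{pkg.spec.name}: phase {phase_name} succeeded ({size} bytes of log)")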
"""""""""""""""""""""""""""""""""""""""""""""
``on_phase_error(pkg, phase_name, log_file)``
"""""""""""""""""""""""""""""""""""""""""""""
In the case of an error during a phase, we might want to trigger some event
with a hook, and this is the purpose of this particular hook. Akin to
``on_phase_success`` we require the same variables - the package that failed,
the name of the phase, and the log file where we might find errors.
These hooks are currently used for cleaning up module files after uninstall.
^^^^^^^^^^^^^^^^^^^^^^


@@ -416,6 +416,23 @@ that git clone if ``foo`` is in the environment.
Further development on ``foo`` can be tested by reinstalling the environment,
and eventually committed and pushed to the upstream git repo.
If the package being developed supports out-of-source builds then users can use the
``--build_directory`` flag to control the location and name of the build directory.
This is a shortcut to set the ``package_attributes:build_directory`` in the
``packages`` configuration (see :ref:`assigning-package-attributes`).
The supplied location will become the build-directory for that package in all future builds.
.. warning::
Potential pitfalls of setting the build directory
Spack does not check for out-of-source build compatibility with the packages, and
so the onus of making sure the package supports out-of-source builds is on
the user.
For example, most ``autotools`` and ``makefile`` packages do not support out-of-source builds,
while all ``CMake`` packages do.
Understanding these nuances is on the software developers, and we strongly encourage
developers to only redirect the build directory if they understand their package's
build-system.
^^^^^^^
Loading
^^^^^^^
@@ -472,11 +489,11 @@ a ``packages.yaml`` file) could contain:
.. code-block:: yaml
spack:
...
# ...
packages:
all:
compiler: [intel]
...
# ...
This configuration sets the default compiler for all packages to
``intel``.
@@ -822,7 +839,7 @@ directories.
.. code-block:: yaml
spack:
...
# ...
view:
mpis:
root: /path/to/view
@@ -866,7 +883,7 @@ automatically named ``default``, so that
.. code-block:: yaml
spack:
...
# ...
view: True
is equivalent to
@@ -874,7 +891,7 @@ is equivalent to
.. code-block:: yaml
spack:
...
# ...
view:
default:
root: .spack-env/view
@@ -884,7 +901,7 @@ and
.. code-block:: yaml
spack:
...
# ...
view: /path/to/view
is equivalent to
@@ -892,7 +909,7 @@ is equivalent to
.. code-block:: yaml
spack:
...
# ...
view:
default:
root: /path/to/view
@@ -935,6 +952,17 @@ function, as shown in the example below:
^mpi: "{name}-{version}/{^mpi.name}-{^mpi.version}-{compiler.name}-{compiler.version}"
all: "{name}-{version}/{compiler.name}-{compiler.version}"
Projections also permit environment and spack configuration variable
expansions as shown below:
.. code-block:: yaml
projections:
all: "{name}-{version}/{compiler.name}-{compiler.version}/$date/$SYSTEM_ENV_VARIBLE"
where ``$date`` is the spack configuration variable that will expand with the ``YYYY-MM-DD``
format and ``$SYSTEM_ENV_VARIABLE`` is an environment variable defined in the shell.
The entries in the projections configuration file must all be either
specs or the keyword ``all``. For each spec, the projection used will
be the first non-``all`` entry that the spec satisfies, or ``all`` if


@@ -111,3 +111,39 @@ The corresponding unit tests can be run giving the appropriate options to ``spac
(5 durations < 0.005s hidden. Use -vv to show these durations.)
=========================================== 5 passed in 5.06s ============================================
---------------------------------------
Registering Extensions via Entry Points
---------------------------------------
.. note::
Python version >= 3.8 is required to register extensions via entry points.
Spack can be made aware of extensions that are installed as part of a python package. To do so, register a function that returns the extension path, or paths, to the ``"spack.extensions"`` entry point. Consider the Python package ``my_package`` that includes a Spack extension:
.. code-block:: console
my-package/
├── src
│   ├── my_package
│   │   └── __init__.py
│   └── spack-scripting/ # the spack extensions
└── pyproject.toml
adding the following to ``my_package``'s ``pyproject.toml`` will make the ``spack-scripting`` extension visible to Spack when ``my_package`` is installed:
.. code-block:: toml
[project.entry_points."spack.extenions"]
my_package = "my_package:get_extension_path"
The function ``my_package.get_extension_path`` in ``my_package/__init__.py`` might look like
.. code-block:: python
import importlib.resources
def get_extension_path():
dirname = importlib.resources.files("my_package").joinpath("spack-scripting")
if dirname.exists():
return str(dirname)


@@ -623,7 +623,7 @@ Fortran.
compilers:
- compiler:
...
# ...
paths:
cc: /usr/bin/clang
cxx: /usr/bin/clang++


@@ -10,7 +10,7 @@ Modules (modules.yaml)
======================
The use of module systems to manage user environment in a controlled way
is a common practice at HPC centers that is often embraced also by
is a common practice at HPC centers that is sometimes embraced also by
individual programmers on their development machines. To support this
common practice Spack integrates with `Environment Modules
<http://modules.sourceforge.net/>`_ and `Lmod
@@ -21,14 +21,38 @@ Modules are one of several ways you can use Spack packages. For other
options that may fit your use case better, you should also look at
:ref:`spack load <spack-load>` and :ref:`environments <environments>`.
----------------------------
Using module files via Spack
----------------------------
-----------
Quick start
-----------
If you have installed a supported module system you should be able to
run ``module avail`` to see what module
files have been installed. Here is sample output of those programs,
showing lots of installed packages:
In the current version of Spack, module files are not generated by default. To get started, you
can generate module files for all currently installed packages by running either
.. code-block:: console
$ spack module tcl refresh
or
.. code-block:: console
$ spack module lmod refresh
Spack can also generate module files for all future installations automatically through the
following configuration:
.. code-block:: console
$ spack config add modules:default:enable:[tcl]
or
.. code-block:: console
$ spack config add modules:default:enable:[lmod]
Assuming you have a module system installed, you should now be able to use the ``module`` command
to interact with them:
.. code-block:: console
@@ -65,33 +89,17 @@ scheme used at your site.
Module file customization
-------------------------
Module files are generated by post-install hooks after the successful
installation of a package.
.. note::
Spack only generates modulefiles when a package is installed. If
you attempt to install a package and it is already installed, Spack
will not regenerate modulefiles for the package. This may lead to
inconsistent modulefiles if the Spack module configuration has
changed since the package was installed, either by editing a file
or changing scopes or environments.
Later in this section there is a subsection on :ref:`regenerating
modules <cmd-spack-module-refresh>` that will allow you to bring
your modules to a consistent state.
The table below summarizes the essential information associated with
the different file formats that can be generated by Spack:
+-----------------------------+--------------------+-------------------------------+----------------------------------------------+----------------------+
| | **Hook name** | **Default root directory** | **Default template file** | **Compatible tools** |
+=============================+====================+===============================+==============================================+======================+
| **Tcl - Non-Hierarchical** | ``tcl`` | share/spack/modules | share/spack/templates/modules/modulefile.tcl | Env. Modules/Lmod |
+-----------------------------+--------------------+-------------------------------+----------------------------------------------+----------------------+
| **Lua - Hierarchical** | ``lmod`` | share/spack/lmod | share/spack/templates/modules/modulefile.lua | Lmod |
+-----------------------------+--------------------+-------------------------------+----------------------------------------------+----------------------+
+-----------+--------------+------------------------------+----------------------------------------------+----------------------+
| | Hierarchical | **Default root directory** | **Default template file** | **Compatible tools** |
+===========+==============+==============================+==============================================+======================+
| ``tcl`` | No | share/spack/modules | share/spack/templates/modules/modulefile.tcl | Env. Modules/Lmod |
+-----------+--------------+------------------------------+----------------------------------------------+----------------------+
| ``lmod`` | Yes | share/spack/lmod | share/spack/templates/modules/modulefile.lua | Lmod |
+-----------+--------------+------------------------------+----------------------------------------------+----------------------+
Spack ships with sensible defaults for the generation of module files, but
@@ -102,7 +110,7 @@ In general you can override or extend the default behavior by:
2. writing specific rules in the ``modules.yaml`` configuration file
3. writing your own templates to override or extend the defaults
The former method let you express changes in the run-time environment
The former method lets you express changes in the run-time environment
that are needed to use the installed software properly, e.g. injecting variables
from language interpreters into their extensions. The latter two instead permit to
fine tune the filesystem layout, content and creation of module files to meet
@@ -110,79 +118,62 @@ site specific conventions.
.. _overide-api-calls-in-package-py:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Override API calls in ``package.py``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting environment variables dynamically in ``package.py``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
There are two methods that you can override in any ``package.py`` to affect the
content of the module files generated by Spack. The first one:
There are two methods that you can implement in any ``package.py`` to dynamically affect the
content of the module files generated by Spack. The most important one is
``setup_run_environment``, which can be used to set environment variables in the module file that
depend on the spec:
.. code-block:: python
def setup_run_environment(self, env):
pass
if self.spec.satisfies("+foo"):
env.set("FOO", "bar")
can alter the content of the module file associated with the same package where it is overridden.
The second method:
The second, less commonly used, is ``setup_dependent_run_environment(self, env, dependent_spec)``,
which allows a dependency to set variables in the module file of its dependents. This is typically
used in packages like ``python``, ``r``, or ``perl`` to prepend the dependent's prefix to the
search path of the interpreter (``PYTHONPATH``, ``R_LIBS``, ``PERL5LIB`` resp.), so it can locate
the packages at runtime.
For example, a simplified version of the ``python`` package could look like this:
.. code-block:: python
def setup_dependent_run_environment(self, env, dependent_spec):
pass
if dependent_spec.package.extends(self.spec):
env.prepend_path("PYTHONPATH", dependent_spec.prefix.lib.python)
can instead inject run-time environment modifications in the module files of packages
that depend on it. In both cases you need to fill ``env`` with the desired
list of environment modifications.
.. admonition:: The ``r`` package and callback APIs
An example in which it is crucial to override both methods
is given by the ``r`` package. This package installs libraries and headers
in non-standard locations and it is possible to prepend the appropriate directory
to the corresponding environment variables:
================== =================================
LD_LIBRARY_PATH ``self.prefix/rlib/R/lib``
PKG_CONFIG_PATH ``self.prefix/rlib/pkgconfig``
================== =================================
with the following snippet:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/r/package.py
:pyobject: R.setup_run_environment
The ``r`` package also knows which environment variable should be modified
to make language extensions provided by other packages available, and modifies
it appropriately in the override of the second method:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/r/package.py
:pyobject: R.setup_dependent_run_environment
and would make any package that ``extends("python")`` have its library directory added to the
``PYTHONPATH`` environment variable in the module file. It's much more convenient to set this
variable here, than to repeat it in every Python extension's ``setup_run_environment`` method.
.. _modules-yaml:
^^^^^^^^^^^^^^^^^^^^^^^^^^
Write a configuration file
^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The ``modules.yaml`` config file and module sets
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The configuration files that control module generation behavior
are named ``modules.yaml``. The default configuration:
The configuration files that control module generation behavior are named ``modules.yaml``. The
default configuration looks like this:
.. literalinclude:: _spack_root/etc/spack/defaults/modules.yaml
:language: yaml
activates the hooks to generate ``tcl`` module files and inspects
the installation folder of each package for the presence of a set of subdirectories
(``bin``, ``man``, ``share/man``, etc.). If any is found, its full path is prepended
to the environment variables listed below the folder name.
You can define one or more **module sets**, each of which can be configured separately with regard
to install location, naming scheme, inclusion and exclusion, autoloading, et cetera.
Spack modules can be configured for multiple module sets. The default
module set is named ``default``. All Spack commands which operate on
modules default to apply the ``default`` module set, but can be
applied to any module set in the configuration.
The default module set is aptly named ``default``. All
:ref:`Spack commands that operate on modules <maintaining-module-files>` apply to the ``default``
module set, unless another module set is specified explicitly (with the ``--name`` flag).
"""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^
Changing the modules root
"""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^
As shown in the table above, the default module root for ``lmod`` is
``$spack/share/spack/lmod`` and the default root for ``tcl`` is
@@ -198,7 +189,7 @@ set by changing the ``roots`` key of the configuration.
my_custom_lmod_modules:
roots:
lmod: /path/to/install/custom/lmod/modules
...
# ...
This configuration will create two module sets. The default module set
will install its ``tcl`` modules to ``/path/to/install/tcl/modules``
@@ -224,25 +215,32 @@ location could be confusing to users of your modules. In the next
section, we will discuss enabling and disabling module types (module
file generators) for each module set.
""""""""""""""""""""
Activate other hooks
""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Automatically generating module files
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Any other module file generator shipped with Spack can be activated adding it to the
list under the ``enable`` key in the module file. Currently the only generator that
is not active by default is ``lmod``, which produces hierarchical lua module files.
Each module system can then be configured separately. In fact, you should list configuration
options that affect a particular type of module files under a top level key corresponding
to the generator being customized:
Spack can be configured to automatically generate module files as part of package installation.
This is done by adding the desired module systems to the ``enable`` list.
.. code-block:: yaml
modules:
default:
enable:
- tcl
- lmod
- tcl
- lmod
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Configuring ``tcl`` and ``lmod`` modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can configure the behavior of either module system separately, under a key corresponding to
the generator being customized:
.. code-block:: yaml
modules:
default:
tcl:
# contains environment modules specific customizations
lmod:
@@ -253,16 +251,82 @@ either change the layout of the module files on the filesystem, or they will aff
their content. For the latter point it is possible to use anonymous specs
to fine tune the set of packages on which the modifications should be applied.
.. _autoloading-dependencies:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Autoloading and hiding dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
A module file should set the variables that are needed for an application to work. But since an
application often has many dependencies, where should all the environment variables for those be
set? In Spack the rule is that each package sets the runtime variables that are needed by the
package itself, and no more. This way, dependencies can be loaded standalone too, and duplication
of environment variables is avoided.
That means however that if you want to use an application, you need to load the modules for all its
dependencies. Of course this is not something you would want users to do manually.
Since Spack knows the dependency graph of every package, it can easily generate module files that
automatically load the modules for its dependencies recursively. It is enabled by default for both
Lmod and Environment Modules under the ``autoload: direct`` config option. The former system has
builtin support through the ``depends_on`` function, the latter simply uses a ``module load``
statement. Both module systems (at least in newer versions) do reference counting, so that if a
module is loaded by two different modules, it will only be unloaded after the others are.
The ``autoload`` key accepts the values:
* ``none``: no autoloading
* ``run``: autoload direct *run* type dependencies
* ``direct``: autoload direct *link and run* type dependencies
* ``all``: autoload all dependencies
In case of ``run`` and ``direct``, a ``module load`` triggers a recursive load.
The ``direct`` option is most correct: there are cases where pure link dependencies need to set
variables for themselves, or need to have variables of their own dependencies set.
In practice however, ``run`` is often sufficient, and may make ``module load`` snappier.
The ``all`` option is discouraged and seldom used.
A common complaint about autoloading is the large number of modules that are visible to the user.
Spack has a solution for this as well: ``hide_implicits: true``. This ensures that only those
packages you've explicitly installed are exposed by ``module avail``, but still allows for
autoloading of hidden dependencies. Lmod should support hiding implicits in general, while
Environment Modules requires version 4.7 or higher.
.. note::
If supported by your module system, we highly encourage the following configuration that enables
autoloading and hiding of implicits. It ensures all runtime variables are set correctly,
including those for dependencies, without overwhelming the user with a large number of available
modules. Further, it makes it easier to get readable module names without collisions, see the
section below on :ref:`modules-projections`.
.. code-block:: yaml
modules:
default:
tcl:
hide_implicits: true
all:
autoload: direct # or `run`
lmod:
hide_implicits: true
all:
autoload: direct # or `run`
.. _anonymous_specs:
""""""""""""""""""""""""""""
Selection by anonymous specs
""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting environment variables for selected packages in config
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In the configuration file you can use *anonymous specs* (i.e. specs
that **are not required to have a root package** and are thus used just
to express constraints) to apply certain modifications on a selected set
of the installed software. For instance, in the snippet below:
In the configuration file you can filter particular specs, and make further changes to the
environment variables that go into their module files. This is very powerful when you want to avoid
:ref:`modifying the package itself <overide-api-calls-in-package-py>`, or when you want to set
certain variables on multiple selected packages at once.
For instance, in the snippet below:
.. code-block:: yaml
@@ -305,12 +369,28 @@ the variable ``FOOBAR`` will be unset.
.. note::
Order does matter
The modifications associated with the ``all`` keyword are always evaluated
first, no matter where they appear in the configuration file. All the other
spec constraints are instead evaluated top to bottom.
first, no matter where they appear in the configuration file. All the other changes to
environment variables for matching specs are evaluated from top to bottom.
""""""""""""""""""""""""""""""""""""""""""""
.. warning::
As general advice, it's often better to set as few unnecessary variables as possible. For
example, the following seemingly innocent and potentially useful configuration
.. code-block:: yaml
all:
environment:
set:
"{name}_ROOT": "{prefix}"
sets ``BINUTILS_ROOT`` to its prefix in modules for ``binutils``, which happens to break
the ``gcc`` compiler: it uses this variable as its default search path for certain object
files and libraries, and by merely setting it, everything fails to link.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Exclude or include specific module files
""""""""""""""""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can use anonymous specs also to prevent module files from being written or
to force them to be written. Consider the case where you want to hide from users
@@ -330,14 +410,19 @@ you will prevent the generation of module files for any package that
is compiled with ``gcc@4.4.7``, with the only exception of any ``gcc``
or any ``llvm`` installation.
It is safe to combine ``exclude`` and ``autoload``
:ref:`mentioned above <autoloading-dependencies>`. When ``exclude`` prevents a module file from being
generated for a dependency, the ``autoload`` feature will simply not generate a statement to load
it.
.. _modules-projections:
"""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Customize the naming of modules
"""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The names of environment modules generated by spack are not always easy to
The names of environment modules generated by Spack are not always easy to
fully comprehend due to the long hash in the name. There are three module
configuration options to help with that. The first is a global setting to
adjust the hash length. It can be set anywhere from 0 to 32 and has a default
@@ -353,6 +438,13 @@ shows how to set hash length in the module file names:
tcl:
hash_length: 7
.. tip::
Using ``hide_implicits: true`` (see :ref:`autoloading-dependencies`) vastly reduces the number
modules exposed to the user. The hidden modules always contain the hash in their name, and are
not influenced by the ``hash_length`` setting. Hidden implicits thus make it easier to use a
short hash length or no hash at all, without risking name conflicts.
To help make module names more readable, and to help alleviate name conflicts
with a short hash, one can use the ``suffixes`` option in the modules
configuration file. This option will add strings to modules that match a spec.
@@ -365,12 +457,12 @@ For instance, the following config options,
tcl:
all:
suffixes:
^python@2.7.12: 'python-2.7.12'
^python@3.12: 'python-3.12'
^openblas: 'openblas'
will add a ``python-2.7.12`` version string to any packages compiled with
python matching the spec, ``python@2.7.12``. This is useful to know which
version of python a set of python extensions is associated with. Likewise, the
will add a ``python-3.12`` version string to any packages compiled with
Python matching the spec, ``python@3.12``. This is useful to know which
version of Python a set of Python extensions is associated with. Likewise, the
``openblas`` string is attached to any program that has openblas in the spec,
most likely via the ``+blas`` variant specification.
@@ -468,41 +560,11 @@ that are already in the Lmod hierarchy.
For hierarchies that are deeper than three layers ``lmod spider`` may have some issues.
See `this discussion on the Lmod project <https://github.com/TACC/Lmod/issues/114>`_.
""""""""""""""""""""""
Select default modules
""""""""""""""""""""""
By default, when multiple modules of the same name share a directory,
the highest version number will be the default module. This behavior
of the ``module`` command can be overridden with a symlink named
``default`` to the desired default module. If you wish to configure
default modules with Spack, add a ``defaults`` key to your modules
configuration:
.. code-block:: yaml
modules:
my-module-set:
tcl:
defaults:
- gcc@10.2.1
- hdf5@1.2.10+mpi+hl%gcc
These defaults may be arbitrarily specific. For any package that
satisfies a default, Spack will generate the module file in the
appropriate path, and will generate a default symlink to the module
file as well.
.. warning::
If Spack is configured to generate multiple default packages in the
same directory, the last modulefile to be generated will be the
default module.
.. _customize-env-modifications:
"""""""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Customize environment modifications
"""""""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can control which prefixes in a Spack package are added to
environment variables with the ``prefix_inspections`` section; this
@@ -600,9 +662,9 @@ stack to users who are likely to inspect the modules to find full
paths to software, when it is desirable to present the users with a
simpler set of paths than those generated by the Spack install tree.
""""""""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Filter out environment modifications
""""""""""""""""""""""""""""""""""""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Modifications to certain environment variables in module files are there by
default, for instance because they are generated by prefix inspections.
@@ -622,49 +684,37 @@ do so by using the ``exclude_env_vars``:
The configuration above will generate module files that will not contain
modifications to either ``CPATH`` or ``LIBRARY_PATH``.
^^^^^^^^^^^^^^^^^^^^^^
Select default modules
^^^^^^^^^^^^^^^^^^^^^^
.. _autoloading-dependencies:
"""""""""""""""""""""
Autoload dependencies
"""""""""""""""""""""
Often it is required for a module to have its (transitive) dependencies loaded as well.
One example where this is useful is when one package needs to use executables provided
by its dependency; when the dependency is autoloaded, the executable will be in the
PATH. Similarly for scripting languages such as Python, packages and their dependencies
have to be loaded together.
Autoloading is enabled by default for Lmod and Environment Modules. The former
has builtin support through the ``depends_on`` function. The latter uses a
``module load`` statement to load and track dependencies.
Autoloading can also be enabled conditionally:
By default, when multiple modules of the same name share a directory,
the highest version number will be the default module. This behavior
of the ``module`` command can be overridden with a symlink named
``default`` to the desired default module. If you wish to configure
default modules with Spack, add a ``defaults`` key to your modules
configuration:
.. code-block:: yaml
modules:
default:
tcl:
all:
autoload: none
^python:
autoload: direct
modules:
my-module-set:
tcl:
defaults:
- gcc@10.2.1
- hdf5@1.2.10+mpi+hl%gcc
The configuration file above will produce module files that will
load their direct dependencies if the package installed depends on ``python``.
The allowed values for the ``autoload`` statement are either ``none``,
``direct`` or ``all``.
These defaults may be arbitrarily specific. For any package that
satisfies a default, Spack will generate the module file in the
appropriate path, and will generate a default symlink to the module
file as well.
.. note::
Tcl prerequisites
In the ``tcl`` section of the configuration file it is possible to use
the ``prerequisites`` directive that accepts the same values as
``autoload``. It will produce module files that have a ``prereq``
statement, which autoloads dependencies on Environment Modules when its
``auto_handling`` configuration option is enabled. If Environment Modules
is installed with Spack, ``auto_handling`` is enabled by default starting
with version 4.2. Otherwise it is enabled by default since version 5.0.
.. warning::
If Spack is configured to generate multiple default packages in the
same directory, the last modulefile to be generated will be the
default module.
.. _maintaining-module-files:
------------------------
Maintaining Module Files


@@ -487,6 +487,56 @@ present. For instance with a configuration like:
you will use ``mvapich2~cuda %gcc`` as an ``mpi`` provider.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Conflicts and strong preferences
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If the semantics of requirements are too strong, you can also express "strong preferences" and "conflicts"
from configuration files:
.. code-block:: yaml
packages:
all:
prefer:
- '%clang'
conflict:
- '+shared'
The ``prefer`` and ``conflict`` sections can be used whenever a ``require`` section is allowed.
The argument is always a list of constraints, and each constraint can be either a simple string,
or a more complex object:
.. code-block:: yaml
packages:
all:
conflict:
- spec: '%clang'
when: 'target=x86_64_v3'
message: 'reason why clang cannot be used'
The ``spec`` attribute is mandatory, while both ``when`` and ``message`` are optional.
.. note::
Requirements allow for expressing both "strong preferences" and "conflicts".
The syntax for doing so, though, may not be immediately clear. For
instance, if we want to prevent any package from using ``%clang``, we can set:
.. code-block:: yaml
packages:
all:
require:
- one_of: ['%clang', '@:']
Since only one of the requirements must hold, and ``@:`` is always true, the rule above is
equivalent to a conflict. For "strong preferences" we need to substitute the ``one_of`` policy
with ``any_of``.
.. _package-preferences:
-------------------
@@ -597,6 +647,8 @@ manually placed files within the install prefix are owned by the
assigned group. If no group is assigned, Spack will allow the OS
default behavior to go as expected.
.. _assigning-package-attributes:
----------------------------
Assigning Package Attributes
----------------------------
@@ -607,10 +659,11 @@ You can assign class-level attributes in the configuration:
packages:
mpileaks:
# Override existing attributes
url: http://www.somewhereelse.com/mpileaks-1.0.tar.gz
# ... or add new ones
x: 1
package_attributes:
# Override existing attributes
url: http://www.somewhereelse.com/mpileaks-1.0.tar.gz
# ... or add new ones
x: 1
Attributes set this way will be accessible to any method executed
in the package.py file (e.g. the ``install()`` method). Values for these


@@ -810,7 +810,7 @@ generated by ``spack ci generate``. You also want your generated rebuild jobs
.. code-block:: yaml
spack:
...
# ...
ci:
pipeline-gen:
- build-job:


@@ -17,7 +17,7 @@ experimental software separately from the built-in repository. Spack
allows you to configure local repositories using either the
``repos.yaml`` or the ``spack repo`` command.
A package repository a directory structured like this::
A package repository is a directory structured like this::
repo/
repo.yaml


@@ -2,12 +2,12 @@ sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.23.0
python-levenshtein==0.25.0
docutils==0.20.1
pygments==2.17.2
urllib3==2.1.0
pytest==7.4.4
urllib3==2.2.1
pytest==8.0.2
isort==5.13.2
black==23.12.1
black==24.2.0
flake8==7.0.0
mypy==1.8.0


@@ -171,7 +171,7 @@ def polite_path(components: Iterable[str]):
@memoized
def _polite_antipattern():
# A regex of all the characters we don't want in a filename
return re.compile(r"[^A-Za-z0-9_.-]")
return re.compile(r"[^A-Za-z0-9_+.-]")
def polite_filename(filename: str) -> str:
@@ -920,29 +920,35 @@ def get_filetype(path_name):
return output.strip()
@system_path_filter
def is_nonsymlink_exe_with_shebang(path):
"""
Returns whether the path is an executable script with a shebang.
Return False when the path is a *symlink* to an executable script.
"""
def has_shebang(path):
"""Returns whether a path has a shebang line. Returns False if the file cannot be opened."""
try:
st = os.lstat(path)
# Should not be a symlink
if stat.S_ISLNK(st.st_mode):
return False
# Should be executable
if not st.st_mode & (stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH):
return False
# Should start with a shebang
with open(path, "rb") as f:
return f.read(2) == b"#!"
except (IOError, OSError):
except OSError:
return False
@system_path_filter
def is_nonsymlink_exe_with_shebang(path):
"""Returns whether the path is an executable regular file with a shebang. Returns False too
when the path is a symlink to a script, and also when the file cannot be opened."""
try:
st = os.lstat(path)
except OSError:
return False
# Should not be a symlink
if stat.S_ISLNK(st.st_mode):
return False
# Should be executable
if not st.st_mode & (stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH):
return False
return has_shebang(path)
@system_path_filter(arg_slice=slice(1))
def chgrp_if_not_world_writable(path, group):
"""chgrp path to group if path is not world writable"""
@@ -1234,6 +1240,47 @@ def get_single_file(directory):
return fnames[0]
@system_path_filter
def windows_sfn(path: os.PathLike):
"""Returns 8.3 Filename (SFN) representation of
path
8.3 Filenames (SFN or short filename) is a file
naming convention used prior to Win95 that Windows
still (and will continue to) support. This convention
caps filenames at 8 characters and, most importantly,
does not allow spaces, among other restrictions.
The scheme is generally the same as a normal Windows
file scheme, but all spaces are removed and the filename
is capped at 6 characters. The remaining characters are
replaced with ~N, where N is the position of the file in its directory,
e.g. Program Files and Program Files (x86)
would be PROGRA~1 and PROGRA~2 respectively.
Further, all file/directory names are all caps (although modern Windows
is case insensitive in practice).
Conversion is accomplished by fileapi.h GetShortPathNameW
Returns paths in 8.3 Filename form
Note: this method is a no-op on Linux
Args:
path: Path to be transformed into SFN (8.3 filename) format
"""
# This should not be run-able on linux/macos
if sys.platform != "win32":
return path
path = str(path)
import ctypes
k32 = ctypes.WinDLL("kernel32", use_last_error=True)
# stub Windows types TCHAR[LENGTH]
TCHAR_arr = ctypes.c_wchar * len(path)
ret_str = TCHAR_arr()
k32.GetShortPathNameW(path, ret_str, len(path))
return ret_str.value
@contextmanager
def temp_cwd():
tmp_dir = tempfile.mkdtemp()
@@ -1377,120 +1424,89 @@ def traverse_tree(
yield (source_path, dest_path)
def lexists_islink_isdir(path):
"""Computes the tuple (lexists(path), islink(path), isdir(path)) in a minimal
number of stat calls on unix. Use os.path and symlink.islink methods for windows."""
if sys.platform == "win32":
if not os.path.lexists(path):
return False, False, False
return os.path.lexists(path), islink(path), os.path.isdir(path)
# First try to lstat, so we know if it's a link or not.
try:
lst = os.lstat(path)
except (IOError, OSError):
return False, False, False
is_link = stat.S_ISLNK(lst.st_mode)
# Check whether file is a dir.
if not is_link:
is_dir = stat.S_ISDIR(lst.st_mode)
return True, is_link, is_dir
# Check whether symlink points to a dir.
try:
st = os.stat(path)
is_dir = stat.S_ISDIR(st.st_mode)
except (IOError, OSError):
# Dangling symlink (i.e. it lexists but not exists)
is_dir = False
return True, is_link, is_dir
class BaseDirectoryVisitor:
"""Base class and interface for :py:func:`visit_directory_tree`."""
def visit_file(self, root: str, rel_path: str, depth: int) -> None:
"""Handle the non-symlink file at ``os.path.join(root, rel_path)``
Parameters:
root: root directory
rel_path: relative path to current file from ``root``
depth: depth of current file from the ``root`` directory"""
pass
def visit_symlinked_file(self, root: str, rel_path: str, depth: int) -> None:
"""Handle the symlink to a file at ``os.path.join(root, rel_path)``. Note: ``rel_path`` is
the location of the symlink, not to what it is pointing to. The symlink may be dangling.
Parameters:
root: root directory
rel_path: relative path to current symlink from ``root``
depth: depth of current symlink from the ``root`` directory"""
pass
def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
"""Return True from this function to recurse into the directory at
os.path.join(root, rel_path). Return False in order not to recurse further.
Parameters:
root: root directory
rel_path: relative path to current directory from ``root``
depth: depth of current directory from the ``root`` directory
Returns:
bool: ``True`` when the directory should be recursed into. ``False`` when
not"""
return False
def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bool:
"""Return ``True`` to recurse into the symlinked directory and ``False`` in order not to.
Note: ``rel_path`` is the path to the symlink itself. Following symlinked directories
blindly can cause infinite recursion due to cycles.
Parameters:
root: root directory
rel_path: relative path to current symlink from ``root``
depth: depth of current symlink from the ``root`` directory
Returns:
bool: ``True`` when the directory should be recursed into. ``False`` when
not"""
return False
def after_visit_dir(self, root: str, rel_path: str, depth: int) -> None:
"""Called after recursion into ``rel_path`` finished. This function is not called when
``rel_path`` was not recursed into.
Parameters:
root: root directory
rel_path: relative path to current directory from ``root``
depth: depth of current directory from the ``root`` directory"""
pass
def after_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> None:
"""Called after recursion into ``rel_path`` finished. This function is not called when
``rel_path`` was not recursed into.
Parameters:
root: root directory
rel_path: relative path to current symlink from ``root``
depth: depth of current symlink from the ``root`` directory"""
pass
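A minimal visitor sketch (assumes a Spack checkout on sys.path so that llnl.util is importable; the directory layout is illustrative):

import os
import tempfile
from llnl.util.filesystem import BaseDirectoryVisitor, visit_directory_tree

class CountingVisitor(BaseDirectoryVisitor):
    def __init__(self):
        self.files = 0

    def visit_file(self, root, rel_path, depth):
        self.files += 1

    def before_visit_dir(self, root, rel_path, depth):
        return True  # recurse into regular directories

    def before_visit_symlinked_dir(self, root, rel_path, depth):
        return False  # never follow symlinked dirs, avoiding cycles

with tempfile.TemporaryDirectory() as d:
    os.makedirs(os.path.join(d, "sub"))
    open(os.path.join(d, "sub", "f.txt"), "w").close()
    v = CountingVisitor()
    visit_directory_tree(d, v)
    print(v.files)  # 1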
def visit_directory_tree(root, visitor, rel_path="", depth=0):
"""Recurses the directory root depth-first through a visitor pattern using the
interface from :py:class:`BaseDirectoryVisitor`
def visit_directory_tree(
root: str, visitor: BaseDirectoryVisitor, rel_path: str = "", depth: int = 0
):
"""Recurses the directory root depth-first through a visitor pattern using the interface from
:py:class:`BaseDirectoryVisitor`
Parameters:
root (str): path of directory to recurse into
visitor (BaseDirectoryVisitor): what visitor to use
rel_path (str): current relative path from the root
depth (str): current depth from the root
root: path of directory to recurse into
visitor: what visitor to use
rel_path: current relative path from the root
depth: current depth from the root
"""
dir = os.path.join(root, rel_path)
dir_entries = sorted(os.scandir(dir), key=lambda d: d.name)
@@ -1498,26 +1514,19 @@ def visit_directory_tree(root, visitor, rel_path="", depth=0):
for f in dir_entries:
rel_child = os.path.join(rel_path, f.name)
islink = f.is_symlink()
# On Windows, symlinks to directories are distinct from symlinks to files, and it is
# possible to create a broken symlink to a directory (e.g. using os.symlink without
# `target_is_directory=True`), invoking `isdir` on a symlink on Windows that is broken in
# this manner will result in an error. In this case we can work around the issue by reading
# the target and resolving the directory ourselves
try:
isdir = f.is_dir()
except OSError as e:
if sys.platform == "win32" and hasattr(e, "winerror") and e.winerror == 5 and islink:
# if path is a symlink, determine destination and evaluate file vs directory
link_target = resolve_link_target_relative_to_the_link(f)
# link_target might be relative but resolve_link_target_relative_to_the_link
# will ensure that if so, that it is relative to the CWD and therefore makes sense
isdir = os.path.isdir(link_target)
else:
raise e


@@ -843,6 +843,30 @@ def __repr__(self):
return repr(self.instance)
def get_entry_points(*, group: str):
"""Wrapper for ``importlib.metadata.entry_points``
Args:
group: entry points to select
Returns:
EntryPoints for ``group`` or empty list if unsupported
"""
try:
import importlib.metadata # type: ignore # novermin
except ImportError:
return []
try:
return importlib.metadata.entry_points(group=group)
except TypeError:
# Prior to Python 3.10, entry_points accepted no parameters and always
# returned a dictionary of entry points, keyed by group. See
# https://docs.python.org/3/library/importlib.metadata.html#entry-points
return importlib.metadata.entry_points().get(group, [])
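For example, enumerating a well-known built-in group (Spack-specific groups are queried the same way):

for ep in get_entry_points(group="console_scripts"):
    print(f"{ep.name} = {ep.value}")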
def load_module_from_file(module_name, module_path):
"""Loads a python module from the path of the corresponding file.


@@ -8,7 +8,7 @@
import filecmp
import os
import shutil
from typing import Callable, Dict, List, Optional, Tuple
import llnl.util.tty as tty
from llnl.util.filesystem import BaseDirectoryVisitor, mkdirp, touch, traverse_tree
@@ -51,32 +51,32 @@ class SourceMergeVisitor(BaseDirectoryVisitor):
- A list of merge conflicts in dst/
"""
def __init__(self, ignore: Optional[Callable[[str], bool]] = None):
self.ignore = ignore if ignore is not None else lambda f: False
# When mapping <src root> to <dst root>/<projection>, we need to prepend the <projection>
# bit to the relative path in the destination dir.
self.projection: str = ""
# Two files f and g conflict if they are not os.path.samefile(f, g) and they are both
# projected to the same destination file. These conflicts are not necessarily fatal, and
# can be resolved or ignored. For example <prefix>/LICENSE or
# <site-packages>/<namespace>/__init__.py conflicts can be ignored.
self.file_conflicts: List[MergeConflict] = []
# When we have to create a dir where a file is, or a file where a dir is, we have fatal
# errors, listed here.
self.fatal_conflicts: List[MergeConflict] = []
# What directories we have to make; this is an ordered dict, so that we have a fast lookup
# and can run mkdir in order.
self.directories: Dict[str, Tuple[str, str]] = {}
# Files to link. Maps dst_rel to (src_root, src_rel). This is an ordered dict, where files
# are guaranteed to be grouped by src_root in the order they were visited.
self.files: Dict[str, Tuple[str, str]] = {}
def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
"""
Register a directory if dst / rel_path is not blocked by a file or ignored.
"""
@@ -104,7 +104,7 @@ def before_visit_dir(self, root, rel_path, depth):
self.directories[proj_rel_path] = (root, rel_path)
return True
def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bool:
"""
Replace symlinked dirs with actual directories when possible in low depths,
otherwise handle it as a file (i.e. we link to the symlink).
@@ -136,40 +136,56 @@ def before_visit_symlinked_dir(self, root, rel_path, depth):
self.visit_file(root, rel_path, depth)
return False
def visit_file(self, root: str, rel_path: str, depth: int, *, symlink: bool = False) -> None:
proj_rel_path = os.path.join(self.projection, rel_path)
if self.ignore(rel_path):
pass
elif proj_rel_path in self.directories:
# Can't create a file where a dir is; fatal error
self.fatal_conflicts.append(
MergeConflict(
dst=proj_rel_path,
src_a=os.path.join(*self.directories[proj_rel_path]),
src_b=os.path.join(root, rel_path),
)
)
elif proj_rel_path in self.files:
# When two files project to the same path, they conflict iff they are distinct.
# If they are the same (i.e. one links to the other), register regular files rather
# than symlinks. The reason is that in copy-type views, we need a copy of the actual
# file, not the symlink.
src_a = os.path.join(*self.files[proj_rel_path])
src_b = os.path.join(root, rel_path)
try:
samefile = os.path.samefile(src_a, src_b)
except OSError:
samefile = False
if not samefile:
# Distinct files produce a conflict.
self.file_conflicts.append(
MergeConflict(dst=proj_rel_path, src_a=src_a, src_b=src_b)
)
return
if not symlink:
# Remove the link in favor of the actual file. The del is necessary to maintain the
# order of the files dict, which is grouped by root.
del self.files[proj_rel_path]
self.files[proj_rel_path] = (root, rel_path)
else:
# Otherwise register this file to be linked.
self.files[proj_rel_path] = (root, rel_path)
def visit_symlinked_file(self, root: str, rel_path: str, depth: int) -> None:
# Treat symlinked files as ordinary files (without "dereferencing")
self.visit_file(root, rel_path, depth, symlink=True)
def set_projection(self, projection: str) -> None:
self.projection = os.path.normpath(projection)
# Todo, is this how to check in general for empty projection?
@@ -197,24 +213,19 @@ def set_projection(self, projection):
class DestinationMergeVisitor(BaseDirectoryVisitor):
"""DestinatinoMergeVisitor takes a SourceMergeVisitor
and:
"""DestinatinoMergeVisitor takes a SourceMergeVisitor and:
a. registers additional conflicts when merging to the destination prefix
b. removes redundant mkdir operations when directories already exist in the destination prefix.
This also makes sure that symlinked directories in the target prefix will never be merged with
directories in the source directories.
"""
def __init__(self, source_merge_visitor: SourceMergeVisitor):
self.src = source_merge_visitor
def before_visit_dir(self, root: str, rel_path: str, depth: int) -> bool:
# If destination dir is a file in a src dir, add a conflict,
# and don't traverse deeper
if rel_path in self.src.files:
@@ -236,7 +247,7 @@ def before_visit_dir(self, root, rel_path, depth):
# don't descend into it.
return False
def before_visit_symlinked_dir(self, root: str, rel_path: str, depth: int) -> bool:
"""
Symlinked directories in the destination prefix should
be seen as files; we should not accidentally merge
@@ -262,7 +273,7 @@ def before_visit_symlinked_dir(self, root, rel_path, depth):
# Never descend into symlinked target dirs.
return False
def visit_file(self, root: str, rel_path: str, depth: int) -> None:
# Can't merge a file if target already exists
if rel_path in self.src.directories:
src_a_root, src_a_relpath = self.src.directories[rel_path]
@@ -280,7 +291,7 @@ def visit_file(self, root, rel_path, depth):
)
)
def visit_symlinked_file(self, root: str, rel_path: str, depth: int) -> None:
# Treat symlinked files as ordinary files (without "dereferencing")
self.visit_file(root, rel_path, depth)
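A sketch of how the two visitors are meant to be driven (assumes a Spack checkout on sys.path; the install prefixes are hypothetical):

import os
from llnl.util.filesystem import visit_directory_tree
from llnl.util.link_tree import SourceMergeVisitor

visitor = SourceMergeVisitor(ignore=lambda f: f.endswith(".pyc"))
for prefix in ("/opt/pkg-a", "/opt/pkg-b"):  # hypothetical install prefixes
    if os.path.isdir(prefix):
        visit_directory_tree(prefix, visitor)

for c in visitor.fatal_conflicts + visitor.file_conflicts:
    print(f"conflict at {c.dst}: {c.src_a} vs {c.src_b}")
for dst_rel, (src_root, src_rel) in visitor.files.items():
    print(f"link {os.path.join(src_root, src_rel)} -> {dst_rel}")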


@@ -189,6 +189,7 @@ def _windows_can_symlink() -> bool:
import llnl.util.filesystem as fs
fs.touchp(fpath)
fs.mkdirp(dpath)
try:
os.symlink(dpath, dlink)


@@ -244,7 +244,7 @@ def _search_duplicate_specs_in_externals(error_cls):
+ lines
+ ["as they might result in non-deterministic hashes"]
)
except (TypeError, AttributeError):
details = []
errors.append(error_cls(summary=error_msg, details=details))
@@ -292,12 +292,6 @@ def _avoid_mismatched_variants(error_cls):
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
for pkg_name in packages_yaml:
# 'all:' must be more forgiving, since it is setting defaults for everything
if pkg_name == "all" or "variants" not in packages_yaml[pkg_name]:
@@ -317,7 +311,7 @@ def make_error(config_data, summary):
f"Setting a preference for the '{pkg_name}' package to the "
f"non-existing variant '{variant.name}'"
)
errors.append(_make_config_error(preferences, summary, error_cls=error_cls))
continue
# Variant cannot accept this value
@@ -329,11 +323,41 @@ def make_error(config_data, summary):
f"Setting the variant '{variant.name}' of the '{pkg_name}' package "
f"to the invalid value '{str(variant)}'"
)
errors.append(_make_config_error(preferences, summary, error_cls=error_cls))
return errors
@config_packages
def _wrongly_named_spec(error_cls):
"""Warns if the wrong name is used for an external spec"""
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
for pkg_name in packages_yaml:
if pkg_name == "all":
continue
externals = packages_yaml[pkg_name].get("externals", [])
is_virtual = spack.repo.PATH.is_virtual(pkg_name)
for entry in externals:
spec = spack.spec.Spec(entry["spec"])
regular_pkg_is_wrong = not is_virtual and pkg_name != spec.name
virtual_pkg_is_wrong = is_virtual and not any(
p.name == spec.name for p in spack.repo.PATH.providers_for(pkg_name)
)
if regular_pkg_is_wrong or virtual_pkg_is_wrong:
summary = f"Wrong external spec detected for '{pkg_name}': {spec}"
errors.append(_make_config_error(entry, summary, error_cls=error_cls))
return errors
def _make_config_error(config_data, summary, error_cls):
s = io.StringIO()
s.write("Occurring in the following file:\n")
syaml.dump_config(config_data, stream=s, blame=True)
return error_cls(summary=summary, details=[s.getvalue()])
#: Sanity checks on package directives
package_directives = AuditClass(
group="packages",
@@ -772,13 +796,33 @@ def check_virtual_with_variants(spec, msg):
except spack.repo.UnknownPackageError:
# This dependency is completely missing, so report
# and continue the analysis
summary = f"{pkg_name}: unknown package '{dep_name}' in 'depends_on' directive"
details = [f" in {filename}"]
errors.append(error_cls(summary=summary, details=details))
continue
# Check for self-referential specs similar to:
#
# depends_on("foo@X.Y", when="^foo+bar")
#
# That would allow clingo to choose whether to have foo@X.Y+bar in the graph.
problematic_edges = [
x for x in when.edges_to_dependencies(dep_name) if not x.virtuals
]
if problematic_edges and not dep.patches:
summary = (
f"{pkg_name}: dependency on '{dep.spec}' when '{when}' is self-referential"
)
details = [
(
f" please specify better using '^[virtuals=...] {dep_name}', or "
f"substitute with an equivalent condition on '{pkg_name}'"
),
f" in {filename}",
]
errors.append(error_cls(summary=summary, details=details))
continue
# check variants
dependency_variants = dep.spec.variants
for name, value in dependency_variants.items():

View File

@@ -5,7 +5,6 @@
import codecs
import collections
import errno
import hashlib
import io
import itertools
@@ -23,8 +22,7 @@
import urllib.parse
import urllib.request
import warnings
from contextlib import closing, contextmanager
from gzip import GzipFile
from contextlib import closing
from typing import Dict, Iterable, List, NamedTuple, Optional, Set, Tuple
from urllib.error import HTTPError, URLError
@@ -50,6 +48,7 @@
import spack.stage
import spack.store
import spack.traverse as traverse
import spack.util.archive
import spack.util.crypto
import spack.util.file_cache as file_cache
import spack.util.gpg
@@ -1133,205 +1132,46 @@ def generate_key_index(key_prefix, tmpdir=None):
shutil.rmtree(tmpdir)
@contextmanager
def gzip_compressed_tarfile(path):
"""Create a reproducible, compressed tarfile"""
# Create gzip compressed tarball of the install prefix
# 1) Use explicit empty filename and mtime 0 for gzip header reproducibility.
# If the filename="" is dropped, Python will use fileobj.name instead.
# This should effectively mimic `gzip --no-name`.
# 2) On AMD Ryzen 3700X and an SSD disk, we have the following on compression speed:
# compresslevel=6 gzip default: llvm takes 4mins, roughly 2.1GB
# compresslevel=9 python default: llvm takes 12mins, roughly 2.1GB
# So we follow gzip.
with open(path, "wb") as f, ChecksumWriter(f) as inner_checksum, closing(
GzipFile(filename="", mode="wb", compresslevel=6, mtime=0, fileobj=inner_checksum)
) as gzip_file, ChecksumWriter(gzip_file) as outer_checksum, tarfile.TarFile(
name="", mode="w", fileobj=outer_checksum
) as tar:
yield tar, inner_checksum, outer_checksum
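The reproducibility claim is easy to check with the standard library alone; a minimal sketch mirroring the filename=""/mtime=0 trick:

import gzip
import io
import tarfile

def repro_targz(data: bytes) -> bytes:
    buf = io.BytesIO()
    # filename="" and mtime=0 keep the gzip header constant, like `gzip --no-name`
    with gzip.GzipFile(filename="", mode="wb", compresslevel=6, mtime=0, fileobj=buf) as gz:
        with tarfile.TarFile(name="", mode="w", fileobj=gz) as tar:
            info = tarfile.TarInfo("hello.txt")
            info.size = len(data)
            info.mode = 0o644
            tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()

assert repro_targz(b"payload") == repro_targz(b"payload")  # byte-identical archives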
def _tarinfo_name(absolute_path: str, *, _path=pathlib.PurePath) -> str:
"""Compute tarfile entry name as the relative path from the (system) root."""
return _path(*_path(absolute_path).parts[1:]).as_posix()
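For instance, on POSIX PurePath("/opt/spack/pkg").parts is ("/", "opt", "spack", "pkg"), so dropping the anchor yields the root-relative entry name (given the helper above):

assert _tarinfo_name("/opt/spack/pkg") == "opt/spack/pkg"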
def tarfile_of_spec_prefix(tar: tarfile.TarFile, prefix: str) -> None:
"""Create a tarfile of an install prefix of a spec. Skips existing buildinfo file.
Only adds regular files, symlinks and dirs. Skips devices, fifos. Preserves hardlinks.
Normalizes permissions like git. Tar entries are added in depth-first pre-order, with
dir entries partitioned by file | dir, and sorted alphabetically, for reproducibility.
Partitioning ensures only one dir is in memory at a time, and sorting improves compression.
Args:
tar: tarfile object to add files to
prefix: absolute install prefix of spec"""
if not os.path.isabs(prefix) or not os.path.isdir(prefix):
raise ValueError(f"prefix '{prefix}' must be an absolute path to a directory")
hardlink_to_tarinfo_name: Dict[Tuple[int, int], str] = dict()
stat_key = lambda stat: (stat.st_dev, stat.st_ino)
try: # skip buildinfo file if it exists
files_to_skip = [stat_key(os.lstat(buildinfo_file_name(prefix)))]
skip = lambda entry: stat_key(entry.stat(follow_symlinks=False)) in files_to_skip
except OSError:
files_to_skip = []
skip = lambda entry: False
# First add all directories leading up to `prefix` (Spack <= 0.21 did not do this, leading to
# issues when tarballs are used in runtimes like AWS lambda). Skip the file system root.
parent_dirs = reversed(pathlib.Path(prefix).parents)
next(parent_dirs) # skip the root: slices are supported from python 3.10
for parent_dir in parent_dirs:
dir_info = tarfile.TarInfo(_tarinfo_name(str(parent_dir)))
dir_info.type = tarfile.DIRTYPE
dir_info.mode = 0o755
tar.addfile(dir_info)
dir_stack = [prefix]
while dir_stack:
dir = dir_stack.pop()
# Add the dir before its contents
dir_info = tarfile.TarInfo(_tarinfo_name(dir))
dir_info.type = tarfile.DIRTYPE
dir_info.mode = 0o755
tar.addfile(dir_info)
# Sort by name: reproducible & improves compression
with os.scandir(dir) as it:
entries = sorted(it, key=lambda entry: entry.name)
new_dirs = []
for entry in entries:
if entry.is_dir(follow_symlinks=False):
new_dirs.append(entry.path)
continue
file_info = tarfile.TarInfo(_tarinfo_name(entry.path))
s = entry.stat(follow_symlinks=False)
# Skip existing binary distribution files.
id = stat_key(s)
if id in files_to_skip:
continue
# Normalize the mode
file_info.mode = 0o644 if s.st_mode & 0o100 == 0 else 0o755
if entry.is_symlink():
file_info.type = tarfile.SYMTYPE
file_info.linkname = os.readlink(entry.path)
tar.addfile(file_info)
elif entry.is_file(follow_symlinks=False):
# Deduplicate hardlinks
if s.st_nlink > 1:
if id in hardlink_to_tarinfo_name:
file_info.type = tarfile.LNKTYPE
file_info.linkname = hardlink_to_tarinfo_name[id]
tar.addfile(file_info)
continue
hardlink_to_tarinfo_name[id] = file_info.name
# If file not yet seen, copy it.
file_info.type = tarfile.REGTYPE
file_info.size = s.st_size
with open(entry.path, "rb") as f:
tar.addfile(file_info, f)
dir_stack.extend(reversed(new_dirs)) # we pop, so reverse to stay alphabetical
class ChecksumWriter(io.BufferedIOBase):
"""Checksum writer computes a checksum while writing to a file."""
myfileobj = None
def __init__(self, fileobj, algorithm=hashlib.sha256):
self.fileobj = fileobj
self.hasher = algorithm()
self.length = 0
def hexdigest(self):
return self.hasher.hexdigest()
def write(self, data):
if isinstance(data, (bytes, bytearray)):
length = len(data)
else:
data = memoryview(data)
length = data.nbytes
if length > 0:
self.fileobj.write(data)
self.hasher.update(data)
self.length += length
return length
def read(self, size=-1):
raise OSError(errno.EBADF, "read() on write-only object")
def read1(self, size=-1):
raise OSError(errno.EBADF, "read1() on write-only object")
def peek(self, n):
raise OSError(errno.EBADF, "peek() on write-only object")
@property
def closed(self):
return self.fileobj is None
def close(self):
fileobj = self.fileobj
if fileobj is None:
return
self.fileobj.close()
self.fileobj = None
def flush(self):
self.fileobj.flush()
def fileno(self):
return self.fileobj.fileno()
def rewind(self):
raise OSError("Can't rewind while computing checksum")
def readable(self):
return False
def writable(self):
return True
def seekable(self):
return True
def tell(self):
return self.fileobj.tell()
def seek(self, offset, whence=io.SEEK_SET):
# In principle forward seek is possible with b"0" padding,
# but this is not implemented.
if offset == 0 and whence == io.SEEK_CUR:
return
raise OSError("Can't seek while computing checksum")
def readline(self, size=-1):
raise OSError(errno.EBADF, "readline() on write-only object")
spack.util.archive.reproducible_tarfile_from_prefix(
tar,
prefix,
# Spack <= 0.21 did not include parent directories, leading to issues when tarballs are
# used in runtimes like AWS lambda.
include_parent_directories=True,
skip=skip,
)
def _do_create_tarball(tarfile_path: str, binaries_dir: str, buildinfo: dict):
with spack.util.archive.gzip_compressed_tarfile(tarfile_path) as (
tar,
inner_checksum,
outer_checksum,
):
# Tarball the install prefix
tarfile_of_spec_prefix(tar, binaries_dir)
# Serialize buildinfo for the tarball
bstring = syaml.dump(buildinfo, default_flow_style=True).encode("utf-8")
tarinfo = tarfile.TarInfo(
name=spack.util.archive.default_path_to_name(buildinfo_file_name(binaries_dir))
)
tarinfo.type = tarfile.REGTYPE
tarinfo.size = len(bstring)
tarinfo.mode = 0o644
@@ -1701,7 +1541,7 @@ def fetch_url_to_mirror(url):
response = spack.oci.opener.urlopen(
urllib.request.Request(
url=ref.manifest_url(),
headers={"Accept": "application/vnd.oci.image.manifest.v1+json"},
headers={"Accept": ", ".join(spack.oci.oci.manifest_content_type)},
)
)
except Exception:


@@ -542,7 +542,7 @@ def verify_patchelf(patchelf: "spack.util.executable.Executable") -> bool:
return version >= spack.version.Version("0.13.1")
def ensure_patchelf_in_path_or_raise() -> spack.util.executable.Executable:
"""Ensure patchelf is in the PATH or raise."""
# The old concretizer is not smart and we're doing its job: if the latest patchelf
# does not concretize because the compiler doesn't support C++17, we try to


@@ -146,7 +146,7 @@ def mypy_root_spec() -> str:
def black_root_spec() -> str:
"""Return the root spec used to bootstrap black"""
return _root_spec("py-black@:23.1.0")
return _root_spec("py-black@:24.1.0")
def flake8_root_spec() -> str:


@@ -217,6 +217,9 @@ def clean_environment():
env.unset("R_HOME")
env.unset("R_ENVIRON")
env.unset("LUA_PATH")
env.unset("LUA_CPATH")
# Affects GNU make, can e.g. indirectly inhibit enabling parallel build
# env.unset('MAKEFLAGS')
@@ -552,58 +555,55 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
"""
module = ModuleChangePropagator(pkg)
if context == Context.BUILD:
module.std_cmake_args = spack.build_systems.cmake.CMakeBuilder.std_args(pkg)
module.std_meson_args = spack.build_systems.meson.MesonBuilder.std_args(pkg)
module.std_pip_args = spack.build_systems.python.PythonPipBuilder.std_args(pkg)
jobs = determine_number_of_jobs(parallel=pkg.parallel)
module.make_jobs = jobs
# TODO: make these build deps that can be installed if not found.
module.make = MakeExecutable("make", jobs)
module.gmake = MakeExecutable("gmake", jobs)
module.ninja = MakeExecutable("ninja", jobs, supports_jobserver=False)
# TODO: johnwparent: add package or builder support to define these build tools
# for now there is no entrypoint for builders to define these on their
# own
if sys.platform == "win32":
module.nmake = Executable("nmake")
module.msbuild = Executable("msbuild")
# analog to configure for win32
module.cscript = Executable("cscript")
# Find the configure script in the archive path
# Don't use which for this; we want to find it in the current dir.
module.configure = Executable("./configure")
# Put spack compiler paths in module scope. (Some packages use it
# in setup_run_environment etc, so don't put it context == build)
link_dir = spack.paths.build_env_path
m.spack_cc = os.path.join(link_dir, pkg.compiler.link_paths["cc"])
m.spack_cxx = os.path.join(link_dir, pkg.compiler.link_paths["cxx"])
m.spack_f77 = os.path.join(link_dir, pkg.compiler.link_paths["f77"])
m.spack_fc = os.path.join(link_dir, pkg.compiler.link_paths["fc"])
module.spack_cc = os.path.join(link_dir, pkg.compiler.link_paths["cc"])
module.spack_cxx = os.path.join(link_dir, pkg.compiler.link_paths["cxx"])
module.spack_f77 = os.path.join(link_dir, pkg.compiler.link_paths["f77"])
module.spack_fc = os.path.join(link_dir, pkg.compiler.link_paths["fc"])
# Useful directories within the prefix are encapsulated in
# a Prefix object.
module.prefix = pkg.prefix
# Platform-specific library suffix.
module.dso_suffix = dso_suffix
def static_to_shared_library(static_lib, shared_lib=None, **kwargs):
compiler_path = kwargs.get("compiler", m.spack_cc)
compiler_path = kwargs.get("compiler", module.spack_cc)
compiler = Executable(compiler_path)
return _static_to_shared_library(
pkg.spec.architecture, compiler, static_lib, shared_lib, **kwargs
)
module.static_to_shared_library = static_to_shared_library
module.propagate_changes_to_mro()
@@ -972,8 +972,8 @@ def __init__(self, *specs: spack.spec.Spec, context: Context) -> None:
self.should_set_package_py_globals = (
self.should_setup_dependent_build_env | self.should_setup_run_env | UseMode.ROOT
)
# In a build context, the root needs build-specific globals set.
self.needs_build_context = UseMode.ROOT
def set_all_package_py_globals(self):
"""Set the globals in modules of package.py files."""


@@ -541,7 +541,7 @@ def autoreconf(self, pkg, spec, prefix):
if os.path.exists(self.configure_abs_path):
return
# Else try to regenerate it, which requires a few build dependencies
ensure_build_dependencies_or_raise(
spec=spec,
dependencies=["autoconf", "automake", "libtool"],


@@ -199,6 +199,8 @@ def initconfig_mpi_entries(self):
mpiexec = "/usr/bin/srun"
else:
mpiexec = os.path.join(spec["slurm"].prefix.bin, "srun")
elif hasattr(spec["mpi"].package, "mpiexec"):
mpiexec = spec["mpi"].package.mpiexec
else:
mpiexec = os.path.join(spec["mpi"].prefix.bin, "mpirun")
if not os.path.exists(mpiexec):


@@ -15,6 +15,7 @@
import spack.build_environment
import spack.builder
import spack.deptypes as dt
import spack.package_base
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
@@ -31,8 +32,86 @@ def _extract_primary_generator(generator):
primary generator from the generator string which may contain an
optional secondary generator.
"""
return _primary_generator_extractor.match(generator).group(1)
def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]) -> None:
"""Set the PYTHON_EXECUTABLE, Python_EXECUTABLE, and Python3_EXECUTABLE CMake variables
if the package has Python as build or link dep and ``find_python_hints`` is set to True. See
``find_python_hints`` for context."""
if not getattr(pkg, "find_python_hints", False):
return
pythons = pkg.spec.dependencies("python", dt.BUILD | dt.LINK)
if len(pythons) != 1:
return
try:
python_executable = pythons[0].package.command.path
except RuntimeError:
return
args.extend(
[
CMakeBuilder.define("PYTHON_EXECUTABLE", python_executable),
CMakeBuilder.define("Python_EXECUTABLE", python_executable),
CMakeBuilder.define("Python3_EXECUTABLE", python_executable),
]
)
def _supports_compilation_databases(pkg: spack.package_base.PackageBase) -> bool:
"""Check if this package (and CMake) can support compilation databases."""
# CMAKE_EXPORT_COMPILE_COMMANDS only exists for CMake >= 3.5
if not pkg.spec.satisfies("^cmake@3.5:"):
return False
# CMAKE_EXPORT_COMPILE_COMMANDS is only implemented for Makefile and Ninja generators
if not (pkg.spec.satisfies("generator=make") or pkg.spec.satisfies("generator=ninja")):
return False
return True
def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[str]) -> None:
"""Set a few default defines for CMake, depending on its version."""
cmakes = pkg.spec.dependencies("cmake", dt.BUILD)
if len(cmakes) != 1:
return
cmake = cmakes[0]
# CMAKE_INTERPROCEDURAL_OPTIMIZATION only exists for CMake >= 3.9
try:
ipo = pkg.spec.variants["ipo"].value
except KeyError:
ipo = False
if cmake.satisfies("@3.9:"):
args.append(CMakeBuilder.define("CMAKE_INTERPROCEDURAL_OPTIMIZATION", ipo))
# Disable Package Registry: export(PACKAGE) may put files in the user's home directory, and
# find_package may search there. This is not what we want.
# Do not populate CMake User Package Registry
if cmake.satisfies("@3.15:"):
# see https://cmake.org/cmake/help/latest/policy/CMP0090.html
args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0090", "NEW"))
elif cmake.satisfies("@3.1:"):
# see https://cmake.org/cmake/help/latest/variable/CMAKE_EXPORT_NO_PACKAGE_REGISTRY.html
args.append(CMakeBuilder.define("CMAKE_EXPORT_NO_PACKAGE_REGISTRY", True))
# Do not use CMake User/System Package Registry
# https://cmake.org/cmake/help/latest/manual/cmake-packages.7.html#disabling-the-package-registry
if cmake.satisfies("@3.16:"):
args.append(CMakeBuilder.define("CMAKE_FIND_USE_PACKAGE_REGISTRY", False))
elif cmake.satisfies("@3.1:3.15"):
args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY", False))
args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_SYSTEM_PACKAGE_REGISTRY", False))
# Export a compilation database if supported.
if _supports_compilation_databases(pkg):
args.append(CMakeBuilder.define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
def generator(*names: str, default: Optional[str] = None):
@@ -86,6 +165,13 @@ class CMakePackage(spack.package_base.PackageBase):
#: Legacy buildsystem attribute used to deserialize and install old specs
legacy_buildsystem = "cmake"
#: When this package depends on Python and ``find_python_hints`` is set to True, pass the
#: defines {Python3,Python,PYTHON}_EXECUTABLE explicitly, so that CMake locates the right
#: Python in its builtin FindPython3, FindPython, and FindPythonInterp modules. Spack does
#: CMake's job because CMake's modules by default only search for Python versions known at the
#: time of release.
find_python_hints = True
build_system("cmake")
with when("build_system=cmake"):
@@ -216,7 +302,10 @@ class CMakeBuilder(BaseBuilder):
@property
def archive_files(self):
"""Files to archive for packages based on CMake"""
return [os.path.join(self.build_directory, "CMakeCache.txt")]
files = [os.path.join(self.build_directory, "CMakeCache.txt")]
if _supports_compilation_databases(self):
files.append(os.path.join(self.build_directory, "compile_commands.json"))
return files
@property
def root_cmakelists_dir(self):
@@ -241,9 +330,9 @@ def std_cmake_args(self):
"""Standard cmake arguments provided as a property for
convenience of package writers
"""
args = CMakeBuilder.std_args(self.pkg, generator=self.generator)
args += getattr(self.pkg, "cmake_flag_args", [])
return args
@staticmethod
def std_args(pkg, generator=None):
@@ -263,11 +352,6 @@ def std_args(pkg, generator=None):
except KeyError:
build_type = "RelWithDebInfo"
define = CMakeBuilder.define
args = [
"-G",
@@ -276,10 +360,6 @@ def std_args(pkg, generator=None):
define("CMAKE_BUILD_TYPE", build_type),
]
if primary_generator == "Unix Makefiles":
args.append(define("CMAKE_VERBOSE_MAKEFILE", True))
@@ -288,6 +368,9 @@ def std_args(pkg, generator=None):
[define("CMAKE_FIND_FRAMEWORK", "LAST"), define("CMAKE_FIND_APPBUNDLE", "LAST")]
)
_conditional_cmake_defaults(pkg, args)
_maybe_set_python_hints(pkg, args)
# Set up CMake rpath
args.extend(
[


@@ -218,7 +218,7 @@ def pset_components(self):
"+inspector": " intel-inspector",
"+itac": " intel-itac intel-ta intel-tc" " intel-trace-analyzer intel-trace-collector",
# Trace Analyzer and Collector
"+vtune": " intel-vtune"
"+vtune": " intel-vtune",
# VTune, ..-profiler since 2020, ..-amplifier before
}.items():
if variant in self.spec:


@@ -29,15 +29,12 @@ class LuaPackage(spack.package_base.PackageBase):
with when("build_system=lua"):
depends_on("lua-lang")
extends("lua", when="^lua")
with when("^lua-luajit"):
extends("lua-luajit")
depends_on("luajit")
depends_on("lua-luajit+lualinks")
with when("^lua-luajit-openresty"):
extends("lua-luajit-openresty")
depends_on("luajit")
depends_on("lua-luajit-openresty+lualinks")
with when("^[virtuals=lua-lang] lua"):
extends("lua")
with when("^[virtuals=lua-lang] lua-luajit"):
extends("lua-luajit+lualinks")
with when("^[virtuals=lua-lang] lua-luajit-openresty"):
extends("lua-luajit-openresty+lualinks")
@property
def lua(self):


@@ -149,7 +149,7 @@ def std_args(pkg):
else:
default_library = "shared"
return [
"-Dprefix={0}".format(pkg.prefix),
# If we do not specify libdir explicitly, Meson chooses something
# like lib/x86_64-linux-gnu, which causes problems when trying to
@@ -163,8 +163,6 @@ def std_args(pkg):
"-Dwrap_mode=nodownload",
]
@property
def build_dirname(self):
"""Returns the directory name to use when building the package."""


@@ -69,7 +69,7 @@ class MSBuildBuilder(BaseBuilder):
@property
def build_directory(self):
"""Return the directory containing the MSBuild solution or vcxproj."""
return fs.windows_sfn(self.pkg.stage.source_path)
@property
def toolchain_version(self):


@@ -77,7 +77,11 @@ def ignore_quotes(self):
@property
def build_directory(self):
"""Return the directory containing the makefile."""
return (
fs.windows_sfn(self.pkg.stage.source_path)
if not self.makefile_root
else fs.windows_sfn(self.makefile_root)
)
@property
def std_nmake_args(self):


@@ -9,10 +9,13 @@
import shutil
from os.path import basename, isdir
from llnl.util import tty
from llnl.util.filesystem import HeaderList, LibraryList, find_libraries, join_path, mkdirp
from llnl.util.link_tree import LinkTree
from spack.build_environment import dso_suffix
from spack.directives import conflicts, variant
from spack.package_base import InstallError
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
@@ -179,16 +182,72 @@ class IntelOneApiLibraryPackage(IntelOneApiPackage):
"""
def openmp_libs(self):
"""Supply LibraryList for linking OpenMP"""
# NB: Hunting down explicit library files may be the Spack way of
# doing things, but it is better to add the compiler defined option
# e.g. -fopenmp
# If other packages use openmp, then all the packages need to
# support the same ABI. Spack usually uses the same compiler
# for all the packages, but you can force it if necessary:
#
# e.g. spack install blaspp%oneapi@2024 ^intel-oneapi-mkl%oneapi@2024
#
if self.spec.satisfies("%intel") or self.spec.satisfies("%oneapi"):
libname = "libiomp5"
elif self.spec.satisfies("%gcc"):
libname = "libgomp"
elif self.spec.satisfies("%clang"):
libname = "libomp"
else:
raise InstallError(
"OneAPI package with OpenMP threading requires one of %clang, %gcc, %oneapi, "
"or %intel"
)
# query the compiler for the library path
with self.compiler.compiler_environment():
omp_lib_path = Executable(self.compiler.cc)(
"--print-file-name", f"{libname}.{dso_suffix}", output=str
).strip()
# Newer versions of clang do not give the full path to libomp. If that's
# the case, look in a path relative to the compiler where libomp is
# typically found. If it's not found there, error out.
if not os.path.exists(omp_lib_path) and self.spec.satisfies("%clang"):
compiler_root = os.path.dirname(os.path.dirname(os.path.realpath(self.compiler.cc)))
omp_lib_path_compiler = os.path.join(compiler_root, "lib", f"{libname}.{dso_suffix}")
if os.path.exists(omp_lib_path_compiler):
omp_lib_path = omp_lib_path_compiler
# if the compiler cannot find the file, it returns the input path
if not os.path.exists(omp_lib_path):
raise InstallError(f"OneAPI package cannot locate OpenMP library: {omp_lib_path}")
omp_libs = LibraryList(omp_lib_path)
tty.info(f"OneAPI package requires OpenMP library: {omp_libs}")
return omp_libs
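The compiler query above boils down to `--print-file-name`; a hedged standalone sketch (assumes gcc on PATH, and the printed path is machine-specific):

import subprocess

out = subprocess.run(
    ["gcc", "--print-file-name", "libgomp.so"], capture_output=True, text=True, check=True
)
print(out.stdout.strip())  # absolute path if found; otherwise gcc echoes the name back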
# find_headers uses heuristics to determine the include directory
# that does not work for oneapi packages. Use explicit directories
# instead.
def header_directories(self, dirs):
h = HeaderList([])
h.directories = dirs
# trilinos passes the directories to cmake, and cmake requires
# that the directory exists
for dir in dirs:
if not isdir(dir):
raise RuntimeError(f"{dir} does not exist")
return h
@property
def headers(self):
# This should match the directories added to CPATH by
# env/vars.sh for the component
return self.header_directories([self.component_prefix.include])
@property
def libs(self):


@@ -2,11 +2,15 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import functools
import inspect
import operator
import os
import re
import shutil
import stat
from typing import Dict, Iterable, List, Mapping, Optional, Tuple
import archspec
@@ -23,7 +27,7 @@
import spack.package_base
import spack.spec
import spack.store
from spack.directives import build_system, depends_on, extends
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import test_part
from spack.spec import Spec
@@ -52,8 +56,6 @@ def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
class PythonExtension(spack.package_base.PackageBase):
maintainers("adamjstewart")
@property
def import_modules(self) -> Iterable[str]:
"""Names of modules that the Python package provides.
@@ -136,31 +138,52 @@ def view_file_conflicts(self, view, merge_map):
return conflicts
def add_files_to_view(self, view, merge_map, skip_if_exists=True):
# Patch up shebangs to the python linked in the view only if python is built by Spack.
if not self.extendee_spec or self.extendee_spec.external:
return super().add_files_to_view(view, merge_map, skip_if_exists)
# We only patch shebangs in the bin directory.
copied_files: Dict[Tuple[int, int], str] = {} # File identifier -> source
delayed_links: List[Tuple[str, str]] = [] # List of symlinks from merge map
bin_dir = self.spec.prefix.bin
python_prefix = self.extendee_spec.prefix
for src, dst in merge_map.items():
if skip_if_exists and os.path.lexists(dst):
continue
if not fs.path_contains_subdirectory(src, bin_dir):
view.link(src, dst)
continue
s = os.lstat(src)
# Symlink is delayed because we may need to re-target if its target is copied in view
if stat.S_ISLNK(s.st_mode):
delayed_links.append((src, dst))
continue
# If it's executable and has a shebang, copy and patch it.
if (s.st_mode & 0b111) and fs.has_shebang(src):
copied_files[(s.st_dev, s.st_ino)] = dst
shutil.copy2(src, dst)
fs.filter_file(
python_prefix, os.path.abspath(view.get_projection_for_spec(self.spec)), dst
)
else:
view.link(src, dst)
# Finally re-target the symlinks that point to copied files.
for src, dst in delayed_links:
try:
s = os.stat(src)
target = copied_files[(s.st_dev, s.st_ino)]
except (OSError, KeyError):
target = None
if target:
os.symlink(os.path.relpath(target, os.path.dirname(dst)), dst)
else:
view.link(src, dst, spec=self.spec)
def remove_files_from_view(self, view, merge_map):
ignore_namespace = False
@@ -346,16 +369,19 @@ def headers(self) -> HeaderList:
# Remove py- prefix in package name
name = self.spec.name[3:]
# Headers should only be in include or platlib, but no harm in checking purelib too
include = self.prefix.join(self.spec["python"].package.include).join(name)
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
headers_list = map(fs.find_all_headers, [include, platlib, purelib])
headers = functools.reduce(operator.add, headers_list)
if headers:
return headers
msg = "Unable to locate {} headers in {} or {}"
raise NoHeadersError(msg.format(self.spec.name, include, platlib))
msg = "Unable to locate {} headers in {}, {}, or {}"
raise NoHeadersError(msg.format(self.spec.name, include, platlib, purelib))
@property
def libs(self) -> LibraryList:
@@ -364,15 +390,19 @@ def libs(self) -> LibraryList:
# Remove py- prefix in package name
name = self.spec.name[3:]
root = self.prefix.join(self.spec["python"].package.platlib).join(name)
# Libraries should only be in platlib, but no harm in checking purelib too
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
find_all_libraries = functools.partial(fs.find_all_libraries, recursive=True)
libs_list = map(find_all_libraries, [platlib, purelib])
libs = functools.reduce(operator.add, libs_list)
if libs:
return libs
msg = "Unable to recursively locate {} libraries in {}"
raise NoLibrariesError(msg.format(self.spec.name, root))
msg = "Unable to recursively locate {} libraries in {} or {}"
raise NoLibrariesError(msg.format(self.spec.name, platlib, purelib))
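The map/reduce pattern in both properties simply concatenates the per-directory results, since HeaderList and LibraryList support "+"; plain lists stand in for them in this sketch:

import functools
import operator

found = map(list, (["include/foo.h"], [], ["platlib/foo/_ext.h"]))
print(functools.reduce(operator.add, found))  # ['include/foo.h', 'platlib/foo/_ext.h']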
@spack.builder.builder("python_pip")


@@ -162,23 +162,9 @@ def hip_flags(amdgpu_target):
# Add compiler minimum versions based on the first release where the
# processor is included in llvm/lib/Support/TargetParser.cpp
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx900:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx906:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx908:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx90c")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack-")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack+")
depends_on("llvm-amdgpu@5.2.0:", when="amdgpu_target=gfx940")
depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx941")
depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx942")
depends_on("llvm-amdgpu@4.5.0:", when="amdgpu_target=gfx1013")
depends_on("llvm-amdgpu@3.8.0:", when="amdgpu_target=gfx1030")
depends_on("llvm-amdgpu@3.9.0:", when="amdgpu_target=gfx1031")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx1032")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx1033")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx1034")
depends_on("llvm-amdgpu@4.5.0:", when="amdgpu_target=gfx1035")
depends_on("llvm-amdgpu@5.2.0:", when="amdgpu_target=gfx1036")
depends_on("llvm-amdgpu@5.3.0:", when="amdgpu_target=gfx1100")
depends_on("llvm-amdgpu@5.3.0:", when="amdgpu_target=gfx1101")


@@ -35,9 +35,9 @@ def _misc_cache():
#: Spack's cache for small data
MISC_CACHE: Union[spack.util.file_cache.FileCache, llnl.util.lang.Singleton] = (
llnl.util.lang.Singleton(_misc_cache)
)
def fetch_cache_location():
@@ -91,6 +91,6 @@ def symlink(self, mirror_ref):
#: Spack's local cache for downloaded source archives
FETCH_CACHE: Union[spack.fetch_strategy.FsCache, llnl.util.lang.Singleton] = (
llnl.util.lang.Singleton(_fetch_cache)
)


@@ -7,9 +7,7 @@
get_job_name = lambda needs_entry: (
needs_entry.get("job")
if (isinstance(needs_entry, collections.abc.Mapping) and needs_entry.get("artifacts", True))
else needs_entry if isinstance(needs_entry, str) else None
)


@@ -7,13 +7,14 @@
import glob
import hashlib
import json
import multiprocessing
import multiprocessing.pool
import os
import shutil
import sys
import tempfile
import urllib.request
from typing import Dict, List, Optional, Tuple, Union
import llnl.util.tty as tty
from llnl.string import plural
@@ -326,8 +327,30 @@ def _progress(i: int, total: int):
return ""
class NoPool:
def map(self, func, args):
return [func(a) for a in args]
def starmap(self, func, args):
return [func(*a) for a in args]
def __enter__(self):
return self
def __exit__(self, *args):
pass
MaybePool = Union[multiprocessing.pool.Pool, NoPool]
def _make_pool() -> MaybePool:
"""Can't use threading because it's unsafe, and can't use spawned processes because of globals.
That leaves only forking"""
if multiprocessing.get_start_method() == "fork":
return multiprocessing.pool.Pool(determine_number_of_jobs(parallel=True))
else:
return NoPool()
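Usage is transparent either way, since NoPool mimics the Pool context-manager and map APIs; a sketch with an illustrative, fixed job count:

import multiprocessing
import multiprocessing.pool

def make_pool():
    # fork is required so workers inherit global state; otherwise run serially
    if multiprocessing.get_start_method() == "fork":
        return multiprocessing.pool.Pool(4)
    return NoPool()

with make_pool() as pool:
    print(pool.starmap(pow, [(2, 3), (3, 2)]))  # [8, 9] with either pool type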
def push_fn(args):
@@ -571,6 +594,15 @@ def _put_manifest(
base_manifest, base_config = base_images[architecture]
env = _retrieve_env_dict_from_config(base_config)
# If the base image uses `vnd.docker.distribution.manifest.v2+json`, then we use that too.
# This is because Singularity / Apptainer is very strict about not mixing them.
base_manifest_mediaType = base_manifest.get(
"mediaType", "application/vnd.oci.image.manifest.v1+json"
)
use_docker_format = (
base_manifest_mediaType == "application/vnd.docker.distribution.manifest.v2+json"
)
spack.user_environment.environment_modifications_for_specs(*specs).apply_modifications(env)
# Create an oci.image.config file
@@ -602,8 +634,8 @@ def _put_manifest(
# Upload the config file
upload_blob_with_retry(image_ref, file=config_file, digest=config_file_checksum)
manifest = {
"mediaType": base_manifest_mediaType,
"schemaVersion": 2,
"config": {
"mediaType": base_manifest["config"]["mediaType"],
@@ -614,7 +646,11 @@ def _put_manifest(
*(layer for layer in base_manifest["layers"]),
*(
{
"mediaType": "application/vnd.oci.image.layer.v1.tar+gzip",
"mediaType": (
"application/vnd.docker.image.rootfs.diff.tar.gzip"
if use_docker_format
else "application/vnd.oci.image.layer.v1.tar+gzip"
),
"digest": str(checksums[s.dag_hash()].compressed_digest),
"size": checksums[s.dag_hash()].size,
}
@@ -623,11 +659,11 @@ def _put_manifest(
],
}
if not use_docker_format and annotations:
manifest["annotations"] = annotations
# Finally upload the manifest
upload_manifest_with_retry(image_ref, manifest=manifest)
# delete the config file
os.unlink(config_file)
@@ -663,7 +699,7 @@ def _push_oci(
base_image: Optional[ImageReference],
installed_specs_with_deps: List[Spec],
tmpdir: str,
pool: multiprocessing.pool.Pool,
pool: MaybePool,
force: bool = False,
) -> Tuple[List[str], Dict[str, Tuple[dict, dict]], Dict[str, spack.oci.oci.Blob]]:
"""Push specs to an OCI registry
@@ -779,11 +815,10 @@ def _config_from_tag(image_ref: ImageReference, tag: str) -> Optional[dict]:
return config if "spec" in config else None
def _update_index_oci(image_ref: ImageReference, tmpdir: str, pool: MaybePool) -> None:
request = urllib.request.Request(url=image_ref.tags_url())
response = spack.oci.opener.urlopen(request)
spack.oci.opener.ensure_status(request, response, 200)
tags = json.load(response)["tags"]
# Fetch all image config files in parallel


@@ -5,6 +5,7 @@
import re
import sys
from typing import Dict, Optional
import llnl.string
import llnl.util.lang
@@ -17,10 +18,15 @@
import spack.util.crypto
import spack.util.web as web_util
from spack.cmd.common import arguments
from spack.package_base import (
ManualDownloadRequiredError,
PackageBase,
deprecated_version,
preferred_version,
)
from spack.util.editor import editor
from spack.util.format import get_version_lines
from spack.version import StandardVersion, Version
description = "checksum available versions of a package"
section = "packaging"
@@ -84,28 +90,30 @@ def checksum(parser, args):
spec = spack.spec.Spec(args.package)
# Get the package we're going to generate checksums for
pkg: PackageBase = spack.repo.PATH.get_pkg_class(spec.name)(spec)
# Skip manually downloaded packages
if pkg.manual_download:
raise ManualDownloadRequiredError(pkg.download_instr)
versions = [StandardVersion.from_string(v) for v in args.versions]
# Define placeholder for remote versions. This'll help reduce redundant work if we need to
# check for the existence of remote versions more than once.
remote_versions: Optional[Dict[StandardVersion, str]] = None
# Add latest version if requested
if args.latest:
remote_versions = pkg.fetch_remote_versions(concurrency=args.jobs)
if len(remote_versions) > 0:
versions.append(max(remote_versions.keys()))
# Add preferred version if requested
# Add preferred version if requested (todo: exclude git versions)
if args.preferred:
versions.append(preferred_version(pkg))
# Store a dict of the form version -> URL
url_dict = {}
url_dict: Dict[StandardVersion, str] = {}
for version in versions:
if deprecated_version(pkg, version):
@@ -115,16 +123,16 @@ def checksum(parser, args):
if url is not None:
url_dict[version] = url
continue
# if we get here, it's because no valid url was provided by the package
# do expensive fallback to try to recover
# If we get here, it's because no valid url was provided by the package. Do expensive
# fallback to try to recover
if remote_versions is None:
remote_versions = pkg.fetch_remote_versions(args.jobs)
remote_versions = pkg.fetch_remote_versions(concurrency=args.jobs)
if version in remote_versions:
url_dict[version] = remote_versions[version]
if len(versions) <= 0:
if remote_versions is None:
remote_versions = pkg.fetch_remote_versions(args.jobs)
remote_versions = pkg.fetch_remote_versions(concurrency=args.jobs)
url_dict = remote_versions
# A spidered URL can differ from the package.py *computed* URL, pointing to different tarballs.
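[Editor's note] The `remote_versions` placeholder above implements a simple lazy-fetch pattern: the expensive spidering happens at most once, no matter how many branches need the result. A minimal self-contained sketch of the same idea (the fetch function here is a stand-in for the real package method):

from typing import Dict, Optional

def fetch_remote_versions(concurrency: int = 4) -> Dict[str, str]:
    # Stand-in for the expensive web spidering the real method performs.
    return {"1.2.3": "https://example.com/pkg-1.2.3.tar.gz"}

remote_versions: Optional[Dict[str, str]] = None

def remote_versions_once() -> Dict[str, str]:
    global remote_versions
    # The first caller pays the spidering cost; later callers reuse the cache.
    if remote_versions is None:
        remote_versions = fetch_remote_versions(concurrency=4)
    return remote_versions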

View File

@@ -6,6 +6,7 @@
import json
import os
import shutil
from urllib.parse import urlparse, urlunparse
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -157,7 +158,9 @@ def setup_parser(subparser):
description=deindent(ci_reproduce.__doc__),
help=spack.cmd.first_line(ci_reproduce.__doc__),
)
reproduce.add_argument("job_url", help="URL of job artifacts bundle")
reproduce.add_argument(
"job_url", help="URL of GitLab job web page or artifact", type=_gitlab_artifacts_url
)
reproduce.add_argument(
"--runtime",
help="Container runtime to use.",
@@ -792,11 +795,6 @@ def ci_reproduce(args):
artifacts of the provided GitLab pipeline rebuild job URL will be used to derive
instructions for reproducing the build locally
"""
job_url = args.job_url
work_dir = args.working_dir
autostart = args.autostart
runtime = args.runtime
# Allow passing GPG key for reproducing protected CI jobs
if args.gpg_file:
gpg_key_url = url_util.path_to_file_url(args.gpg_file)
@@ -805,7 +803,47 @@ def ci_reproduce(args):
else:
gpg_key_url = None
return spack_ci.reproduce_ci_job(job_url, work_dir, autostart, gpg_key_url, runtime)
return spack_ci.reproduce_ci_job(
args.job_url, args.working_dir, args.autostart, gpg_key_url, args.runtime
)
def _gitlab_artifacts_url(url: str) -> str:
"""Take a URL either to the URL of the job in the GitLab UI, or to the artifacts zip file,
and output the URL to the artifacts zip file."""
parsed = urlparse(url)
if not parsed.scheme or not parsed.netloc:
raise ValueError(url)
parts = parsed.path.split("/")
if len(parts) < 2:
raise ValueError(url)
# Just use API endpoints verbatim, they're probably generated by Spack.
if parts[1] == "api":
return url
# If it's a URL to the job in the GitLab UI, we may need to append the artifacts path.
minus_idx = parts.index("-")
# Remove repeated slashes in the remainder
rest = [p for p in parts[minus_idx + 1 :] if p]
# Now the format is jobs/X or jobs/X/artifacts/download
if len(rest) < 2 or rest[0] != "jobs":
raise ValueError(url)
if len(rest) == 2:
# replace jobs/X with jobs/X/artifacts/download
rest.extend(("artifacts", "download"))
# Replace the parts and unparse.
parts[minus_idx + 1 :] = rest
# Don't allow fragments / queries
return urlunparse(parsed._replace(path="/".join(parts), fragment="", query=""))
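[Editor's note] Illustrative behavior of `_gitlab_artifacts_url` with hypothetical URLs (host and job id are made up for the example):

# job page URL gains the artifacts suffix:
#   https://gitlab.example.com/acme/app/-/jobs/42
#     -> https://gitlab.example.com/acme/app/-/jobs/42/artifacts/download
# an artifacts download URL keeps its path (queries/fragments are dropped):
#   https://gitlab.example.com/acme/app/-/jobs/42/artifacts/download -> unchanged
# API endpoints (path starting with /api) pass through verbatim.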
def ci(parser, args):

View File

@@ -76,6 +76,10 @@ def setup_parser(subparser):
)
add_parser.add_argument("-f", "--file", help="file from which to set all config values")
change_parser = sp.add_parser("change", help="swap variants etc. on specs in config")
change_parser.add_argument("path", help="colon-separated path to config section with specs")
change_parser.add_argument("--match-spec", help="only change constraints that match this")
prefer_upstream_parser = sp.add_parser(
"prefer-upstream", help="set package preferences from upstream"
)
@@ -118,7 +122,7 @@ def _get_scope_and_section(args):
if not section and not scope:
env = ev.active_environment()
if env:
scope = env.env_file_config_scope_name()
scope = env.scope_name
# set scope defaults
elif not scope:
@@ -263,6 +267,98 @@ def _can_update_config_file(scope: spack.config.ConfigScope, cfg_file):
return fs.can_write_to_dir(scope.path) and fs.can_access(cfg_file)
def _config_change_requires_scope(path, spec, scope, match_spec=None):
"""Return whether or not anything changed."""
require = spack.config.get(path, scope=scope)
if not require:
return False
changed = False
def override_cfg_spec(spec_str):
nonlocal changed
init_spec = spack.spec.Spec(spec_str)
# Overridden spec cannot be anonymous
init_spec.name = spec.name
if match_spec and not init_spec.satisfies(match_spec):
# If there is a match_spec, don't change constraints that
# don't match it
return spec_str
elif not init_spec.intersects(spec):
changed = True
return str(spack.spec.Spec.override(init_spec, spec))
else:
# Don't override things if they intersect, otherwise we'd
# be e.g. attaching +debug to every single version spec
return spec_str
if isinstance(require, str):
new_require = override_cfg_spec(require)
else:
new_require = []
for item in require:
if "one_of" in item:
item["one_of"] = [override_cfg_spec(x) for x in item["one_of"]]
elif "any_of" in item:
item["any_of"] = [override_cfg_spec(x) for x in item["any_of"]]
elif "spec" in item:
item["spec"] = override_cfg_spec(item["spec"])
elif isinstance(item, str):
item = override_cfg_spec(item)
else:
raise ValueError(f"Unexpected requirement: ({type(item)}) {str(item)}")
new_require.append(item)
spack.config.set(path, new_require, scope=scope)
return changed
def _config_change(config_path, match_spec_str=None):
all_components = spack.config.process_config_path(config_path)
key_components = all_components[:-1]
key_path = ":".join(key_components)
spec = spack.spec.Spec(syaml.syaml_str(all_components[-1]))
match_spec = None
if match_spec_str:
match_spec = spack.spec.Spec(match_spec_str)
if key_components[-1] == "require":
# Extract the package name from the config path, which allows
# args.spec to be anonymous if desired
pkg_name = key_components[1]
spec.name = pkg_name
changed = False
for scope in spack.config.writable_scope_names():
changed |= _config_change_requires_scope(key_path, spec, scope, match_spec=match_spec)
if not changed:
existing_requirements = spack.config.get(key_path)
if isinstance(existing_requirements, str):
raise spack.config.ConfigError(
"'config change' needs to append a requirement,"
" but existing require: config is not a list"
)
ideal_scope_to_modify = None
for scope in spack.config.writable_scope_names():
if spack.config.get(key_path, scope=scope):
ideal_scope_to_modify = scope
break
update_path = f"{key_path}:[{str(spec)}]"
spack.config.add(update_path, scope=ideal_scope_to_modify)
else:
raise ValueError("'config change' can currently only change 'require' sections")
def config_change(args):
_config_change(args.path, args.match_spec)
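[Editor's note] A hedged usage sketch of the new subcommand (package names and specs are illustrative):

#   $ spack config change packages:hdf5:require:+mpi
# rewrites existing hdf5 requirements in writable scopes, or appends one if
# nothing changed, while
#   $ spack config change packages:hdf5:require:+mpi --match-spec=hdf5@1.14
# only touches requirement constraints that match hdf5@1.14.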
def config_update(args):
# Read the configuration files
spack.config.CONFIG.get_config(args.section, scope=args.scope)
@@ -490,5 +586,6 @@ def config(parser, args):
"update": config_update,
"revert": config_revert,
"prefer-upstream": config_prefer_upstream,
"change": config_change,
}
action[args.config_command](args)

View File

@@ -8,6 +8,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.config
import spack.spec
import spack.util.path
import spack.version
@@ -21,6 +22,7 @@
def setup_parser(subparser):
subparser.add_argument("-p", "--path", help="source location of package")
subparser.add_argument("-b", "--build-directory", help="build directory for the package")
clone_group = subparser.add_mutually_exclusive_group()
clone_group.add_argument(
@@ -151,4 +153,11 @@ def develop(parser, args):
env = spack.cmd.require_active_env(cmd_name="develop")
tty.debug("Updating develop config for {0} transactionally".format(env.name))
with env.write_transaction():
if args.build_directory is not None:
spack.config.add(
"packages:{}:package_attributes:build_directory:{}".format(
spec.name, args.build_directory
),
env.scope_name,
)
_update_config(spec, path)
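[Editor's note] A sketch of the new `-b/--build-directory` flag and the configuration it writes (paths and package name illustrative):

#   $ spack develop --build-directory /tmp/mypkg-build mypkg@=1.0
# which, besides the develop entry, records roughly this in the env scope:
#   packages:
#     mypkg:
#       package_attributes:
#         build_directory: /tmp/mypkg-build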

View File

@@ -270,7 +270,8 @@ def create_temp_env_directory():
def _tty_info(msg):
"""tty.info like function that prints the equivalent printf statement for eval."""
decorated = f'{colorize("@*b{==>}")} {msg}\n'
print(f"printf {shlex.quote(decorated)};")
executor = "echo" if sys.platform == "win32" else "printf"
print(f"{executor} {shlex.quote(decorated)};")
def env_activate(args):

View File

@@ -18,6 +18,7 @@
import spack.cray_manifest as cray_manifest
import spack.detection
import spack.error
import spack.repo
import spack.util.environment
from spack.cmd.common import arguments
@@ -152,9 +153,9 @@ def external_find(args):
def packages_to_search_for(
*, names: Optional[List[str]], tags: List[str], exclude: Optional[List[str]]
):
result = []
for current_tag in tags:
result.extend(spack.repo.PATH.packages_with_tags(current_tag, full=True))
result = list(
{pkg for tag in tags for pkg in spack.repo.PATH.packages_with_tags(tag, full=True)}
)
if names:
# Match both fully qualified and unqualified

View File

@@ -18,7 +18,14 @@
def setup_parser(subparser):
setup_parser.parser = subparser
subparser.epilog = """
Outside of an environment, the command concretizes specs and graphs them, unless the
--installed option is given. In that case specs are matched from the current DB.
If an environment is active, specs are matched from the currently available concrete specs
in the lockfile.
"""
method = subparser.add_mutually_exclusive_group()
method.add_argument(
"-a", "--ascii", action="store_true", help="draw graph as ascii to stdout (default)"
@@ -41,39 +48,40 @@ def setup_parser(subparser):
)
subparser.add_argument(
"-i",
"--installed",
action="store_true",
help="graph installed specs, or specs in the active env (implies --dot)",
"-i", "--installed", action="store_true", help="graph specs from the DB"
)
arguments.add_common_arguments(subparser, ["deptype", "specs"])
def graph(parser, args):
if args.installed and args.specs:
tty.die("cannot specify specs with --installed")
env = ev.active_environment()
if args.installed and env:
tty.die("cannot use --installed with an active environment")
if args.color and not args.dot:
tty.die("the --color option can be used only with --dot")
if args.installed:
args.dot = True
env = ev.active_environment()
if env:
specs = env.concrete_roots()
else:
if not args.specs:
specs = spack.store.STORE.db.query()
else:
result = []
for item in args.specs:
result.extend(spack.store.STORE.db.query(item))
specs = list(set(result))
elif env:
specs = env.concrete_roots()
if args.specs:
specs = env.all_matching_specs(*args.specs)
else:
specs = spack.cmd.parse_specs(args.specs, concretize=not args.static)
if not specs:
setup_parser.parser.print_help()
return 1
tty.die("no spec matching the query")
if args.static:
args.dot = True
static_graph_dot(specs, depflag=args.deptype)
return

View File

@@ -30,6 +30,7 @@
@c{@min:max} version range (inclusive)
@c{@min:} version <min> or higher
@c{@:max} up to version <max> (inclusive)
@c{@=version} exact version
compilers:
@g{%compiler} build with <compiler>

View File

@@ -290,11 +290,11 @@ def require_user_confirmation_for_overwrite(concrete_specs, args):
def _dump_log_on_error(e: spack.build_environment.InstallError):
e.print_context()
assert e.pkg, "Expected InstallError to include the associated package"
if not os.path.exists(e.pkg.build_log_path):
if not os.path.exists(e.pkg.log_path):
tty.error("'spack install' created no log.")
else:
sys.stderr.write("Full build log:\n")
with open(e.pkg.build_log_path, errors="replace") as log:
with open(e.pkg.log_path, errors="replace") as log:
shutil.copyfileobj(log, sys.stderr)

View File

@@ -292,9 +292,11 @@ def head(n, span_id, title, anchor=None):
out.write("<dd>\n")
out.write(
", ".join(
d
if d not in pkg_names
else '<a class="reference internal" href="#%s">%s</a>' % (d, d)
(
d
if d not in pkg_names
else '<a class="reference internal" href="#%s">%s</a>' % (d, d)
)
for d in deps
)
)

View File

@@ -0,0 +1,71 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import errno
import gzip
import io
import os
import shutil
import sys
import spack.cmd
import spack.spec
import spack.util.compression as compression
from spack.cmd.common import arguments
from spack.main import SpackCommandError
description = "print out logs for packages"
section = "basic"
level = "long"
def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["spec"])
def _dump_byte_stream_to_stdout(instream: io.BufferedIOBase) -> None:
# Reopen stdout in binary mode so we don't have to worry about encoding
outstream = os.fdopen(sys.stdout.fileno(), "wb", closefd=False)
shutil.copyfileobj(instream, outstream)
def _logs(cmdline_spec: spack.spec.Spec, concrete_spec: spack.spec.Spec):
if concrete_spec.installed:
log_path = concrete_spec.package.install_log_path
elif os.path.exists(concrete_spec.package.stage.path):
# TODO: `spack logs` cannot currently show the logs while a package is being built, as the
# combined log file is only written after the build is finished.
log_path = concrete_spec.package.log_path
else:
raise SpackCommandError(f"{cmdline_spec} is not installed or staged")
try:
stream = open(log_path, "rb")
except OSError as e:
if e.errno == errno.ENOENT:
raise SpackCommandError(f"No logs are available for {cmdline_spec}") from e
raise SpackCommandError(f"Error reading logs for {cmdline_spec}: {e}") from e
with stream as f:
ext = compression.extension_from_magic_numbers_by_stream(f, decompress=False)
if ext and ext != "gz":
raise SpackCommandError(f"Unsupported storage format for {log_path}: {ext}")
# If the log file is gzip compressed, wrap it with a decompressor
_dump_byte_stream_to_stdout(gzip.GzipFile(fileobj=f) if ext == "gz" else f)
def logs(parser, args):
specs = spack.cmd.parse_specs(args.spec)
if not specs:
raise SpackCommandError("You must supply a spec.")
if len(specs) != 1:
raise SpackCommandError("Too many specs. Supply only one.")
concrete_spec = spack.cmd.matching_spec_from_env(specs[0])
_logs(specs[0], concrete_spec)
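[Editor's note] Typical usage of the new command (package name illustrative):

#   $ spack logs zlib    # stream the (possibly gzip-compressed) install log
# For a spec that is staged but not yet installed, the stage's log file is
# shown instead; anything other than plain or gzip storage is rejected.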

View File

@@ -127,10 +127,7 @@ def _process_result(result, show, required_format, kwargs):
print()
if result.unsolved_specs and "solutions" in show:
tty.msg("Unsolved specs")
for spec in result.unsolved_specs:
print(spec)
print()
tty.msg(asp.Result.format_unsolved(result.unsolved_specs))
def solve(parser, args):

View File

@@ -228,7 +228,7 @@ def create_reporter(args, specs_to_test, test_suite):
def test_list(args):
"""list installed packages with available tests"""
tagged = set(spack.repo.PATH.packages_with_tags(*args.tag)) if args.tag else set()
tagged = spack.repo.PATH.packages_with_tags(*args.tag) if args.tag else set()
def has_test_and_tags(pkg_class):
tests = spack.install_test.test_functions(pkg_class)

View File

@@ -514,9 +514,10 @@ def get_compilers(config, cspec=None, arch_spec=None):
for items in config:
items = items["compiler"]
# NOTE: in principle this should be equality not satisfies, but config can still
# be written in old format gcc@10.1.0 instead of gcc@=10.1.0.
if cspec and not cspec.satisfies(items["spec"]):
# We might use equality here.
if cspec and not spack.spec.parse_with_version_concrete(
items["spec"], compiler=True
).satisfies(cspec):
continue
# If an arch spec is given, confirm that this compiler

View File

@@ -7,6 +7,7 @@
import re
import subprocess
import sys
import tempfile
from typing import Dict, List, Set
import spack.compiler
@@ -15,7 +16,7 @@
import spack.util.executable
from spack.compiler import Compiler
from spack.error import SpackError
from spack.version import Version
from spack.version import Version, VersionRange
avail_fc_version: Set[str] = set()
fc_path: Dict[str, str] = dict()
@@ -292,6 +293,15 @@ def setup_custom_environment(self, pkg, env):
else:
env.set_path(env_var, int_env[env_var].split(os.pathsep))
# Certain versions of ifx (2021.3.0:2023.1.0) do not play well with env:TMP
# when it has a "." character in the path.
# Work around this by pointing TMP to the stage for the duration of the build.
if self.fc and Version(self.fc_version(self.fc)).satisfies(
VersionRange("2021.3.0", "2023.1.0")
):
new_tmp = tempfile.mkdtemp(dir=pkg.stage.path)
env.set("TMP", new_tmp)
env.set("CC", self.cc)
env.set("CXX", self.cxx)
env.set("FC", self.fc)

View File

@@ -826,7 +826,6 @@ def __init__(self, spec):
class InsufficientArchitectureInfoError(spack.error.SpackError):
"""Raised when details on architecture cannot be collected from the
system"""

View File

@@ -63,10 +63,11 @@
from spack.util.cpus import cpus_available
#: Dict from section names -> schema for that section
SECTION_SCHEMAS = {
SECTION_SCHEMAS: Dict[str, Any] = {
"compilers": spack.schema.compilers.schema,
"concretizer": spack.schema.concretizer.schema,
"definitions": spack.schema.definitions.schema,
"view": spack.schema.view.schema,
"develop": spack.schema.develop.schema,
"mirrors": spack.schema.mirrors.schema,
"repos": spack.schema.repos.schema,
@@ -81,7 +82,7 @@
# Same as above, but including keys for environments
# this allows us to unify config reading between configs and environments
_ALL_SCHEMAS = copy.deepcopy(SECTION_SCHEMAS)
_ALL_SCHEMAS: Dict[str, Any] = copy.deepcopy(SECTION_SCHEMAS)
_ALL_SCHEMAS.update({spack.schema.env.TOP_LEVEL_KEY: spack.schema.env.schema})
#: Path to the default configuration
@@ -638,7 +639,6 @@ def get(self, path: str, default: Optional[Any] = None, scope: Optional[str] = N
We use ``:`` as the separator, like YAML objects.
"""
# TODO: Currently only handles maps. Think about lists if needed.
parts = process_config_path(path)
section = parts.pop(0)
@@ -764,6 +764,31 @@ def _add_platform_scope(
cfg.push_scope(scope_type(plat_name, plat_path))
def config_paths_from_entry_points() -> List[Tuple[str, str]]:
"""Load configuration paths from entry points
A Python package can register entry point metadata so that Spack can find
its configuration by adding the following to the project's pyproject.toml:
.. code-block:: toml
[project.entry-points."spack.config"]
baz = "baz:get_spack_config_path"
The function ``get_spack_config_path`` returns the path to the package's
spack configuration scope
"""
config_paths: List[Tuple[str, str]] = []
for entry_point in lang.get_entry_points(group="spack.config"):
hook = entry_point.load()
if callable(hook):
config_path = hook()
if config_path and os.path.exists(config_path):
config_paths.append(("plugin-%s" % entry_point.name, str(config_path)))
return config_paths
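[Editor's note] A minimal sketch of the plugin side, assuming a package named `baz` as in the pyproject.toml snippet in the docstring above (module layout and folder name are illustrative):

# baz/__init__.py
import os

def get_spack_config_path() -> str:
    # Return the directory holding this plugin's Spack config scope,
    # e.g. a folder containing config.yaml / packages.yaml.
    return os.path.join(os.path.dirname(__file__), "spack_config")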
def _add_command_line_scopes(
cfg: Union[Configuration, lang.Singleton], command_line_scopes: List[str]
) -> None:
@@ -816,6 +841,9 @@ def create() -> Configuration:
# No site-level configs should be checked into spack by default.
configuration_paths.append(("site", os.path.join(spack.paths.etc_path)))
# Python packages can register configuration scopes via entry points
configuration_paths.extend(config_paths_from_entry_points())
# User configuration can override both spack defaults and site config
# This is disabled if user asks for no local configuration.
if not disable_local_config:
@@ -883,7 +911,9 @@ def add(fullpath: str, scope: Optional[str] = None) -> None:
has_existing_value = True
path = ""
override = False
value = syaml.load_config(components[-1])
value = components[-1]
if not isinstance(value, syaml.syaml_str):
value = syaml.load_config(value)
for idx, name in enumerate(components[:-1]):
# First handle double colons in constructing path
colon = "::" if override else ":" if path else ""
@@ -905,7 +935,7 @@ def add(fullpath: str, scope: Optional[str] = None) -> None:
# construct value from this point down
for component in reversed(components[idx + 1 : -1]):
value = {component: value}
value: Dict[str, str] = {component: value} # type: ignore[no-redef]
break
if override:
@@ -916,7 +946,7 @@ def add(fullpath: str, scope: Optional[str] = None) -> None:
# append values to lists
if isinstance(existing, list) and not isinstance(value, list):
value = [value]
value: List[str] = [value] # type: ignore[no-redef]
# merge value into existing
new = merge_yaml(existing, value)
@@ -949,7 +979,8 @@ def scopes() -> Dict[str, ConfigScope]:
def writable_scopes() -> List[ConfigScope]:
"""
Return list of writable scopes
Return list of writable scopes. Higher-priority scopes come first in the
list.
"""
return list(
reversed(
@@ -1094,7 +1125,7 @@ def read_config_file(
data = syaml.load_config(f)
if data:
if not schema:
if schema is None:
key = next(iter(data))
schema = _ALL_SCHEMAS[key]
validate(data, schema)
@@ -1336,56 +1367,141 @@ def they_are(t):
return copy.copy(source)
class ConfigPath:
quoted_string = "(?:\"[^\"]+\")|(?:'[^']+')"
unquoted_string = "[^:'\"]+"
element = rf"(?:(?:{quoted_string})|(?:{unquoted_string}))"
next_key_pattern = rf"({element}[+-]?)(?:\:|$)"
@staticmethod
def _split_front(string, extract):
m = re.match(extract, string)
if not m:
return None, None
token = m.group(1)
return token, string[len(token) :]
@staticmethod
def _validate(path):
"""Example valid config paths:
x:y:z
x:"y":z
x:y+:z
x:y::z
x:y+::z
x:y:
x:y::
"""
first_key, path = ConfigPath._split_front(path, ConfigPath.next_key_pattern)
if not first_key:
raise ValueError(f"Config path does not start with a parse-able key: {path}")
path_elements = [first_key]
path_index = 1
while path:
separator, path = ConfigPath._split_front(path, r"(\:+)")
if not separator:
raise ValueError(f"Expected separator for {path}")
path_elements[path_index - 1] += separator
if not path:
break
element, remainder = ConfigPath._split_front(path, ConfigPath.next_key_pattern)
if not element:
# If we can't parse something as a key, then it must be a
# value (if it's valid).
try:
syaml.load_config(path)
except spack.util.spack_yaml.SpackYAMLError as e:
raise ValueError(
"Remainder of path is not a valid key"
f" and does not parse as a value {path}"
) from e
element = path
path = None # The rest of the path was consumed into the value
else:
path = remainder
path_elements.append(element)
path_index += 1
return path_elements
@staticmethod
def process(path):
result = []
quote = "['\"]"
seen_override_in_path = False
path_elements = ConfigPath._validate(path)
last_element_idx = len(path_elements) - 1
for i, element in enumerate(path_elements):
override = False
append = False
prepend = False
quoted = False
if element.endswith("::") or (element.endswith(":") and i == last_element_idx):
if seen_override_in_path:
raise syaml.SpackYAMLError(
"Meaningless second override indicator `::' in path `{0}'".format(path), ""
)
override = True
seen_override_in_path = True
element = element.rstrip(":")
if element.endswith("+"):
prepend = True
elif element.endswith("-"):
append = True
element = element.rstrip("+-")
if re.match(f"^{quote}", element):
quoted = True
element = element.strip("'\"")
if any([append, prepend, override, quoted]):
element = syaml.syaml_str(element)
if append:
element.append = True
if prepend:
element.prepend = True
if override:
element.override = True
result.append(element)
return result
def process_config_path(path: str) -> List[str]:
"""Process a path argument to config.set() that may contain overrides ('::' or
trailing ':')
Note: quoted value path components will be processed as a single value (escaping colons)
quoted path components outside of the value will be considered ill formed and will
raise.
e.g. `this:is:a:path:'value:with:colon'` will yield:
Colons will be treated as static strings if inside of quotes,
e.g. `this:is:a:path:'value:with:colon'` will yield:
[this, is, a, path, value:with:colon]
[this, is, a, path, value:with:colon]
The path may consist only of keys (e.g. for a `get`) or may end in a value.
Keys are always strings: if a user encloses a key in quotes, the quotes
should be removed. Values with quotes should be treated as strings,
but without quotes, may be parsed as a different yaml object (e.g.
'{}' is a dict, but '"{}"' is a string).
This function does not know whether the final element of the path is a
key or value, so:
* It must strip the quotes, in case it is a key (so we look for "key" and
not '"key"'))
* It must indicate somehow that the quotes were stripped, in case it is a
value (so that we don't process '"{}"' as a YAML dict)
Therefore, all elements with quotes are stripped, and then also converted
to ``syaml_str`` (if treating the final element as a value, the caller
should not parse it in this case).
"""
result = []
if path.startswith(":"):
raise syaml.SpackYAMLError(f"Illegal leading `:' in path `{path}'", "")
seen_override_in_path = False
while path:
front, sep, path = path.partition(":")
if (sep and not path) or path.startswith(":"):
if seen_override_in_path:
raise syaml.SpackYAMLError(
f"Meaningless second override indicator `::' in path `{path}'", ""
)
path = path.lstrip(":")
front = syaml.syaml_str(front)
front.override = True # type: ignore[attr-defined]
seen_override_in_path = True
elif front.endswith("+"):
front = front.rstrip("+")
front = syaml.syaml_str(front)
front.prepend = True # type: ignore[attr-defined]
elif front.endswith("-"):
front = front.rstrip("-")
front = syaml.syaml_str(front)
front.append = True # type: ignore[attr-defined]
result.append(front)
quote = "['\"]"
not_quote = "[^'\"]"
if re.match(f"^{quote}", path):
m = re.match(rf"^({quote}{not_quote}+{quote})$", path)
if not m:
raise ValueError("Quotes indicate value, but there are additional path entries")
result.append(m.group(1))
break
return result
return ConfigPath.process(path)
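[Editor's note] Illustrative inputs and outputs of the reworked parser, following the docstring's own example (the extra cases are assumptions checked against the code above):

# process_config_path("this:is:a:path:'value:with:colon'")
#   -> ["this", "is", "a", "path", "value:with:colon"]
# process_config_path("packages:all::compiler")
#   -> ["packages", "all", "compiler"]   # "all" carries override=True
# process_config_path("packages:'gcc':version")
#   -> ["packages", "gcc", "version"]    # quotes stripped, element is syaml_str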
#

View File

@@ -19,9 +19,6 @@
},
"os_package_manager": "dnf",
"build": "spack/fedora38",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/fedora:38"
}
@@ -33,9 +30,6 @@
},
"os_package_manager": "dnf",
"build": "spack/fedora37",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/fedora:37"
}
@@ -47,9 +41,6 @@
},
"os_package_manager": "dnf_epel",
"build": "spack/rockylinux9",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/rockylinux:9"
}
@@ -61,9 +52,6 @@
},
"os_package_manager": "dnf_epel",
"build": "spack/rockylinux8",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/rockylinux:8"
}
@@ -71,29 +59,23 @@
"almalinux:9": {
"bootstrap": {
"template": "container/almalinux_9.dockerfile",
"image": "quay.io/almalinux/almalinux:9"
"image": "quay.io/almalinuxorg/almalinux:9"
},
"os_package_manager": "dnf_epel",
"build": "spack/almalinux9",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "quay.io/almalinux/almalinux:9"
"image": "quay.io/almalinuxorg/almalinux:9"
}
},
"almalinux:8": {
"bootstrap": {
"template": "container/almalinux_8.dockerfile",
"image": "quay.io/almalinux/almalinux:8"
"image": "quay.io/almalinuxorg/almalinux:8"
},
"os_package_manager": "dnf_epel",
"build": "spack/almalinux8",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "quay.io/almalinux/almalinux:8"
"image": "quay.io/almalinuxorg/almalinux:8"
}
},
"centos:stream": {
@@ -105,9 +87,6 @@
"build": "spack/centos-stream",
"final": {
"image": "quay.io/centos/centos:stream"
},
"build_tags": {
"develop": "latest"
}
},
"centos:7": {
@@ -115,10 +94,7 @@
"template": "container/centos_7.dockerfile"
},
"os_package_manager": "yum",
"build": "spack/centos7",
"build_tags": {
"develop": "latest"
}
"build": "spack/centos7"
},
"opensuse/leap:15": {
"bootstrap": {
@@ -126,9 +102,6 @@
},
"os_package_manager": "zypper",
"build": "spack/leap15",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "opensuse/leap:latest"
}
@@ -148,19 +121,13 @@
"template": "container/ubuntu_2204.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-jammy",
"build_tags": {
"develop": "latest"
}
"build": "spack/ubuntu-jammy"
},
"ubuntu:20.04": {
"bootstrap": {
"template": "container/ubuntu_2004.dockerfile"
},
"build": "spack/ubuntu-focal",
"build_tags": {
"develop": "latest"
},
"os_package_manager": "apt"
},
"ubuntu:18.04": {
@@ -168,10 +135,7 @@
"template": "container/ubuntu_1804.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-bionic",
"build_tags": {
"develop": "latest"
}
"build": "spack/ubuntu-bionic"
}
},
"os_package_managers": {

View File

@@ -50,10 +50,7 @@ def build_info(image, spack_version):
if not build_image:
return None, None
# Translate version from git to docker if necessary
build_tag = image_data["build_tags"].get(spack_version, spack_version)
return build_image, build_tag
return build_image, spack_version
def os_package_manager_for(image):

View File

@@ -1687,7 +1687,11 @@ def root(key, record):
with self.read_transaction():
roots = [rec.spec for key, rec in self._data.items() if root(key, rec)]
needed = set(id(spec) for spec in tr.traverse_nodes(roots, deptype=deptype))
return [rec.spec for rec in self._data.values() if id(rec.spec) not in needed]
return [
rec.spec
for rec in self._data.values()
if id(rec.spec) not in needed and rec.installed
]
def update_explicit(self, spec, explicit):
"""

View File

@@ -36,6 +36,9 @@
#: Default dependency type if none is specified
DEFAULT: DepFlag = BUILD | LINK
#: A flag with no dependency types set
NONE: DepFlag = 0
#: An iterator of all flag components
ALL_FLAGS: Tuple[DepFlag, DepFlag, DepFlag, DepFlag] = (BUILD, LINK, RUN, TEST)
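[Editor's note] These are bit flags, so selections compose with bitwise operators; a minimal sketch using the module as it is imported elsewhere in this diff:

import spack.deptypes as dt

build_and_link = dt.BUILD | dt.LINK      # same value as dt.DEFAULT
has_run = bool(build_and_link & dt.RUN)  # False: the RUN bit is not set
nothing = dt.NONE                        # traverse no dependency edges at all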

View File

@@ -34,7 +34,7 @@ class OpenMpi(Package):
import functools
import os.path
import re
from typing import Any, Callable, List, Optional, Set, Tuple, Union
from typing import TYPE_CHECKING, Any, Callable, List, Optional, Set, Tuple, Union
import llnl.util.lang
import llnl.util.tty.color
@@ -57,6 +57,9 @@ class OpenMpi(Package):
VersionLookupError,
)
if TYPE_CHECKING:
import spack.package_base
__all__ = [
"DirectiveError",
"DirectiveMeta",
@@ -349,6 +352,7 @@ def remove_directives(arg):
return _decorator
SubmoduleCallback = Callable[["spack.package_base.PackageBase"], Union[str, List[str], bool]]
directive = DirectiveMeta.directive
@@ -380,7 +384,7 @@ def version(
tag: Optional[str] = None,
branch: Optional[str] = None,
get_full_repo: Optional[bool] = None,
submodules: Optional[bool] = None,
submodules: Union[SubmoduleCallback, Optional[bool]] = None,
submodules_delete: Optional[bool] = None,
# other version control
svn: Optional[str] = None,
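[Editor's note] A sketch of a recipe using the new callable form of `submodules`, matching the `SubmoduleCallback` type added above (package name, URLs, and submodule paths are illustrative):

from spack.package import *

def pick_submodules(pkg):
    # Return which submodules to fetch: a str, a list of str, or a bool.
    return ["thirdparty/core"] if pkg.spec.satisfies("@2:") else False

class Mypkg(Package):
    homepage = "https://example.com/mypkg"
    git = "https://example.com/mypkg.git"

    version("2.1", tag="v2.1", submodules=pick_submodules)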
@@ -699,11 +703,14 @@ def _execute_patch(pkg_or_dep: Union["spack.package_base.PackageBase", Dependenc
patch: spack.patch.Patch
if "://" in url_or_filename:
if sha256 is None:
raise ValueError("patch() with a url requires a sha256")
patch = spack.patch.UrlPatch(
pkg,
url_or_filename,
level,
working_dir,
working_dir=working_dir,
ordering_key=ordering_key,
sha256=sha256,
archive_sha256=archive_sha256,

View File

@@ -21,7 +21,6 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.util.lang import dedupe
from llnl.util.link_tree import ConflictingSpecsError
from llnl.util.symlink import symlink
@@ -83,17 +82,30 @@
lockfile_name = "spack.lock"
#: Name of the directory where environments store repos, logs, views
#: Name of the directory where environments store repos, logs, views, configs
env_subdir_name = ".spack-env"
def env_root_path():
def env_root_path() -> str:
"""Override default root path if the user specified it"""
return spack.util.path.canonicalize_path(
spack.config.get("config:environments_root", default=default_env_path)
)
def environment_name(path: Union[str, pathlib.Path]) -> str:
"""Human-readable representation of the environment.
This is the path for directory environments, and just the name
for managed environments.
"""
path_str = str(path)
if path_str.startswith(env_root_path()):
return os.path.basename(path_str)
else:
return path_str
def check_disallowed_env_config_mods(scopes):
for scope in scopes:
with spack.config.use_configuration(scope):
@@ -179,9 +191,8 @@ def validate_env_name(name):
def activate(env, use_env_repo=False):
"""Activate an environment.
To activate an environment, we add its configuration scope to the
existing Spack configuration, and we set active to the current
environment.
To activate an environment, we add its manifest's configuration scope to the
existing Spack configuration, and we set active to the current environment.
Arguments:
env (Environment): the environment to activate
@@ -198,7 +209,7 @@ def activate(env, use_env_repo=False):
# below.
install_tree_before = spack.config.get("config:install_tree")
upstreams_before = spack.config.get("upstreams")
prepare_config_scope(env)
env.manifest.prepare_config_scope()
install_tree_after = spack.config.get("config:install_tree")
upstreams_after = spack.config.get("upstreams")
if install_tree_before != install_tree_after or upstreams_before != upstreams_after:
@@ -226,7 +237,7 @@ def deactivate():
if hasattr(_active_environment, "store_token"):
spack.store.restore(_active_environment.store_token)
delattr(_active_environment, "store_token")
deactivate_config_scope(_active_environment)
_active_environment.manifest.deactivate_config_scope()
# use _repo so we only remove if a repo was actually constructed
if _active_environment._repo:
@@ -363,12 +374,12 @@ def _rewrite_relative_dev_paths_on_relocation(env, init_file_dir):
to store the environment in a different directory, we have to rewrite
relative paths to absolute ones."""
with env:
dev_specs = spack.config.get("develop", default={}, scope=env.env_file_config_scope_name())
dev_specs = spack.config.get("develop", default={}, scope=env.scope_name)
if not dev_specs:
return
for name, entry in dev_specs.items():
dev_path = entry["path"]
expanded_path = os.path.normpath(os.path.join(init_file_dir, entry["path"]))
dev_path = substitute_path_variables(entry["path"])
expanded_path = spack.util.path.canonicalize_path(dev_path, default_wd=init_file_dir)
# Skip if the expanded path is the same (e.g. when absolute)
if dev_path == expanded_path:
@@ -378,7 +389,7 @@ def _rewrite_relative_dev_paths_on_relocation(env, init_file_dir):
dev_specs[name]["path"] = expanded_path
spack.config.set("develop", dev_specs, scope=env.env_file_config_scope_name())
spack.config.set("develop", dev_specs, scope=env.scope_name)
env._dev_specs = None
# If we changed the environment's spack.yaml scope, that will not be reflected
@@ -599,39 +610,32 @@ def content_hash(self, specs):
return spack.util.hash.b32_hash(contents)
def get_projection_for_spec(self, spec):
"""Get projection for spec relative to view root
"""Get projection for spec. This function does not require the view
to exist on the filesystem."""
return self._view(self.root).get_projection_for_spec(spec)
Getting the projection from the underlying root will get the temporary
projection. This gives the permanent projection relative to the root
symlink.
def view(self, new: Optional[str] = None) -> SimpleFilesystemView:
"""
view = self.view()
view_path = view.get_projection_for_spec(spec)
rel_path = os.path.relpath(view_path, self._current_root)
return os.path.join(self.root, rel_path)
def view(self, new=None):
"""
Generate the FilesystemView object for this ViewDescriptor
By default, this method returns a FilesystemView object rooted at the
current underlying root of this ViewDescriptor (self._current_root)
Returns a view object for the *underlying* view directory. This means that the
self.root symlink is followed, and that the view has to exist on the filesystem
(unless ``new``). This function is useful when writing to the view.
Raise if new is None and there is no current view
Arguments:
new (str or None): If a string, create a FilesystemView
rooted at that path. Default None. This should only be used to
regenerate the view, and cannot be used to access specs.
new: If a string, create a FilesystemView rooted at that path. Default None. This
should only be used to regenerate the view, and cannot be used to access specs.
"""
root = new if new else self._current_root
if not root:
path = new if new else self._current_root
if not path:
# This can only be hit if we write a future bug
msg = (
"Attempting to get nonexistent view from environment. "
"View root is at %s" % self.root
raise SpackEnvironmentViewError(
f"Attempting to get nonexistent view from environment. View root is at {self.root}"
)
raise SpackEnvironmentViewError(msg)
return self._view(path)
def _view(self, root: str) -> SimpleFilesystemView:
"""Returns a view object for a given root dir."""
return SimpleFilesystemView(
root,
spack.store.STORE.layout,
@@ -657,30 +661,28 @@ def __contains__(self, spec):
return True
def specs_for_view(self, concretized_root_specs):
"""
From the list of concretized user specs in the environment, flatten
the dags, and filter selected, installed specs, remove duplicates on dag hash.
"""
# With deps, requires traversal
if self.link == "all" or self.link == "run":
deptype = ("run") if self.link == "run" else ("link", "run")
specs = list(
traverse.traverse_nodes(
concretized_root_specs, deptype=deptype, key=traverse.by_dag_hash
)
)
def specs_for_view(self, concrete_roots: List[Spec]) -> List[Spec]:
"""Flatten the DAGs of the concrete roots, keep only unique, selected, and installed specs
in topological order from root to leaf."""
if self.link == "all":
deptype = dt.LINK | dt.RUN
elif self.link == "run":
deptype = dt.RUN
else:
specs = list(dedupe(concretized_root_specs, key=traverse.by_dag_hash))
deptype = dt.NONE
specs = traverse.traverse_nodes(
concrete_roots, order="topo", deptype=deptype, key=traverse.by_dag_hash
)
# Filter selected, installed specs
with spack.store.STORE.db.read_transaction():
specs = [s for s in specs if s in self and s.installed]
result = [s for s in specs if s in self and s.installed]
return specs
return self._exclude_duplicate_runtimes(result)
def regenerate(self, concretized_root_specs):
specs = self.specs_for_view(concretized_root_specs)
def regenerate(self, concrete_roots: List[Spec]) -> None:
specs = self.specs_for_view(concrete_roots)
# To ensure there are no conflicts with packages being installed
# that cannot be resolved or have repos that have been removed
@@ -697,14 +699,14 @@ def regenerate(self, concretized_root_specs):
old_root = self._current_root
if new_root == old_root:
tty.debug("View at %s does not need regeneration." % self.root)
tty.debug(f"View at {self.root} does not need regeneration.")
return
_error_on_nonempty_view_dir(new_root)
# construct view at new_root
if specs:
tty.msg("Updating view at {0}".format(self.root))
tty.msg(f"Updating view at {self.root}")
view = self.view(new=new_root)
@@ -714,7 +716,7 @@ def regenerate(self, concretized_root_specs):
# Create a new view
try:
fs.mkdirp(new_root)
view.add_specs(*specs, with_dependencies=False)
view.add_specs(*specs)
# create symlink from tmp_symlink_name to new_root
if os.path.exists(tmp_symlink_name):
@@ -728,7 +730,7 @@ def regenerate(self, concretized_root_specs):
try:
shutil.rmtree(new_root, ignore_errors=True)
os.unlink(tmp_symlink_name)
except (IOError, OSError):
except OSError:
pass
# Give an informative error message for the typical error case: two specs, same package
@@ -764,11 +766,32 @@ def regenerate(self, concretized_root_specs):
msg += str(e)
tty.warn(msg)
def _exclude_duplicate_runtimes(self, nodes):
all_runtimes = spack.repo.PATH.packages_with_tags("runtime")
runtimes_by_name = {}
for s in nodes:
if s.name not in all_runtimes:
continue
current_runtime = runtimes_by_name.get(s.name, s)
runtimes_by_name[s.name] = max(current_runtime, s, key=lambda x: x.version)
return [x for x in nodes if x.name not in all_runtimes or runtimes_by_name[x.name] == x]
def _create_environment(path):
return Environment(path)
def env_subdir_path(manifest_dir: Union[str, pathlib.Path]) -> str:
"""Path to where the environment stores repos, logs, views, configs.
Args:
manifest_dir: directory containing the environment manifest file
Returns: directory the environment uses to manage its files
"""
return os.path.join(str(manifest_dir), env_subdir_name)
class Environment:
"""A Spack environment, which bundles together configuration and a list of specs."""
@@ -780,12 +803,13 @@ def __init__(self, manifest_dir: Union[str, pathlib.Path]) -> None:
manifest_dir: directory with the "spack.yaml" associated with the environment
"""
self.path = os.path.abspath(str(manifest_dir))
self.name = environment_name(self.path)
self.env_subdir_path = env_subdir_path(self.path)
self.txlock = lk.Lock(self._transaction_lock_path)
self._unify = None
self.new_specs: List[Spec] = []
self.new_installs: List[Spec] = []
self.views: Dict[str, ViewDescriptor] = {}
#: Specs from "spack.yaml"
@@ -802,9 +826,15 @@ def __init__(self, manifest_dir: Union[str, pathlib.Path]) -> None:
self._previous_active = None
self._dev_specs = None
# Load the manifest file contents into memory
self._load_manifest_file()
def _load_manifest_file(self):
"""Instantiate and load the manifest file contents into memory."""
with lk.ReadTransaction(self.txlock):
self.manifest = EnvironmentManifestFile(manifest_dir)
self._read()
self.manifest = EnvironmentManifestFile(self.path)
with self.manifest.use_config():
self._read()
@property
def unify(self):
@@ -822,19 +852,10 @@ def __reduce__(self):
def _re_read(self):
"""Reinitialize the environment object."""
self.clear(re_read=True)
self.manifest = EnvironmentManifestFile(self.path)
self._read(re_read=True)
self._load_manifest_file()
def _read(self, re_read=False):
# If the manifest has included files, then some of the information
# (e.g., definitions) MAY be in those files. So we need to ensure
# the config is populated with any associated spec lists in order
# to fully construct the manifest state.
includes = self.manifest[TOP_LEVEL_KEY].get("include", [])
if includes and not re_read:
prepare_config_scope(self)
self._construct_state_from_manifest(re_read)
def _read(self):
self._construct_state_from_manifest()
if os.path.exists(self.lock_path):
with open(self.lock_path) as f:
@@ -861,38 +882,67 @@ def _process_definition(self, item):
else:
self.spec_lists[name] = user_specs
def _construct_state_from_manifest(self, re_read=False):
"""Read manifest file and set up user specs."""
def _process_view(self, env_view: Optional[Union[bool, str, Dict]]):
"""Process view option(s), which can be boolean, string, or None.
A boolean environment view option takes precedence over any that may
be included: ``view: True`` results in the default view only, and
``view: False`` means the environment will have no view.
Args:
env_view: view option provided in the manifest or configuration
"""
def add_view(name, values):
"""Add the view with the name and the string or dict values."""
if isinstance(values, str):
self.views[name] = ViewDescriptor(self.path, values)
elif isinstance(values, dict):
self.views[name] = ViewDescriptor.from_dict(self.path, values)
else:
tty.error(f"Cannot add view named {name} for {type(values)} values {values}")
# If the configuration specifies 'view: False' then we are done
# processing views. If this is called with the environment's own view
# option (versus an included one), then there are to be NO views.
if env_view is False:
return
# If the configuration specifies 'view: True' then only the default
# view will be created for the environment and we are done processing
# views.
if env_view is True:
add_view(default_view_name, self.view_path_default)
return
# Otherwise, the configuration has a subdirectory or dictionary.
if isinstance(env_view, str):
add_view(default_view_name, env_view)
elif env_view:
for name, values in env_view.items():
add_view(name, values)
# If we reach this point without an explicit view option then we
# provide the default view.
if self.views == dict():
self.views[default_view_name] = ViewDescriptor(self.path, self.view_path_default)
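[Editor's note] The manifest forms this method accepts, as a hedged YAML sketch (paths are illustrative):

# view: true          # default view under the environment's .spack-env subdir
# view: false         # no views at all
# view: ./my-view     # default view rooted at the given path
# view:
#   default:          # named views, each built from a ViewDescriptor dict
#     root: ./my-view
#     link: run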
def _construct_state_from_manifest(self):
"""Set up user specs and views from the manifest file."""
self.spec_lists = collections.OrderedDict()
self.views = {}
if not re_read:
for item in spack.config.get("definitions", []):
self._process_definition(item)
env_configuration = self.manifest[TOP_LEVEL_KEY]
for item in env_configuration.get("definitions", []):
for item in spack.config.get("definitions", []):
self._process_definition(item)
env_configuration = self.manifest[TOP_LEVEL_KEY]
spec_list = env_configuration.get(user_speclist_name, [])
user_specs = SpecList(
user_speclist_name, [s for s in spec_list if s], self.spec_lists.copy()
)
self.spec_lists[user_speclist_name] = user_specs
enable_view = env_configuration.get("view")
# enable_view can be boolean, string, or None
if enable_view is True or enable_view is None:
self.views = {default_view_name: ViewDescriptor(self.path, self.view_path_default)}
elif isinstance(enable_view, str):
self.views = {default_view_name: ViewDescriptor(self.path, enable_view)}
elif enable_view:
path = self.path
self.views = dict(
(name, ViewDescriptor.from_dict(path, values))
for name, values in enable_view.items()
)
else:
self.views = {}
self._process_view(spack.config.get("view", True))
@property
def user_specs(self):
@@ -921,10 +971,8 @@ def clear(self, re_read=False):
"""Clear the contents of the environment
Arguments:
re_read (bool): If True, do not clear ``new_specs`` nor
``new_installs`` values. These values cannot be read from
yaml, and need to be maintained when re-reading an existing
environment.
re_read: If ``True``, do not clear ``new_specs``. This value cannot be read from yaml,
and needs to be maintained when re-reading an existing environment.
"""
self.spec_lists = collections.OrderedDict()
self.spec_lists[user_speclist_name] = SpecList()
@@ -938,24 +986,6 @@ def clear(self, re_read=False):
if not re_read:
# things that cannot be recreated from file
self.new_specs = [] # write packages for these on write()
self.new_installs = [] # write modules for these on write()
@property
def internal(self):
"""Whether this environment is managed by Spack."""
return self.path.startswith(env_root_path())
@property
def name(self):
"""Human-readable representation of the environment.
This is the path for directory environments, and just the name
for managed environments.
"""
if self.internal:
return os.path.basename(self.path)
else:
return self.path
@property
def active(self):
@@ -984,23 +1014,9 @@ def _lock_backup_v1_path(self):
"""Path to backup of v1 lockfile before conversion to v2"""
return self.lock_path + ".backup.v1"
@property
def env_subdir_path(self):
"""Path to directory where the env stores repos, logs, views."""
return os.path.join(self.path, env_subdir_name)
@property
def repos_path(self):
return os.path.join(self.path, env_subdir_name, "repos")
@property
def log_path(self):
return os.path.join(self.path, env_subdir_name, "logs")
@property
def config_stage_dir(self):
"""Directory for any staged configuration file(s)."""
return os.path.join(self.env_subdir_path, "config")
return os.path.join(self.env_subdir_path, "repos")
@property
def view_path_default(self):
@@ -1013,122 +1029,10 @@ def repo(self):
self._repo = make_repo_path(self.repos_path)
return self._repo
def included_config_scopes(self):
"""List of included configuration scopes from the environment.
Scopes are listed in the YAML file in order from highest to
lowest precedence, so configuration from earlier scopes takes
precedence over later ones.
This routine returns them in the order they should be pushed onto
the internal scope stack (so, in reverse, from lowest to highest).
"""
scopes = []
# load config scopes added via 'include:', in reverse so that
# highest-precedence scopes are last.
includes = self.manifest[TOP_LEVEL_KEY].get("include", [])
missing = []
for i, config_path in enumerate(reversed(includes)):
# allow paths to contain spack config/environment variables, etc.
config_path = substitute_path_variables(config_path)
include_url = urllib.parse.urlparse(config_path)
# Transform file:// URLs to direct includes.
if include_url.scheme == "file":
config_path = urllib.request.url2pathname(include_url.path)
# Any other URL should be fetched.
elif include_url.scheme in ("http", "https", "ftp"):
# Stage any remote configuration file(s)
staged_configs = (
os.listdir(self.config_stage_dir)
if os.path.exists(self.config_stage_dir)
else []
)
remote_path = urllib.request.url2pathname(include_url.path)
basename = os.path.basename(remote_path)
if basename in staged_configs:
# Do NOT re-stage configuration files over existing
# ones with the same name since there is a risk of
# losing changes (e.g., from 'spack config update').
tty.warn(
"Will not re-stage configuration from {0} to avoid "
"losing changes to the already staged file of the "
"same name.".format(remote_path)
)
# Recognize the configuration stage directory
# is flattened to ensure a single copy of each
# configuration file.
config_path = self.config_stage_dir
if basename.endswith(".yaml"):
config_path = os.path.join(config_path, basename)
else:
staged_path = spack.config.fetch_remote_configs(
config_path, self.config_stage_dir, skip_existing=True
)
if not staged_path:
raise SpackEnvironmentError(
"Unable to fetch remote configuration {0}".format(config_path)
)
config_path = staged_path
elif include_url.scheme:
raise ValueError(
f"Unsupported URL scheme ({include_url.scheme}) for "
f"environment include: {config_path}"
)
# treat relative paths as relative to the environment
if not os.path.isabs(config_path):
config_path = os.path.join(self.path, config_path)
config_path = os.path.normpath(os.path.realpath(config_path))
if os.path.isdir(config_path):
# directories are treated as regular ConfigScopes
config_name = "env:%s:%s" % (self.name, os.path.basename(config_path))
tty.debug("Creating ConfigScope {0} for '{1}'".format(config_name, config_path))
scope = spack.config.ConfigScope(config_name, config_path)
elif os.path.exists(config_path):
# files are assumed to be SingleFileScopes
config_name = "env:%s:%s" % (self.name, config_path)
tty.debug(
"Creating SingleFileScope {0} for '{1}'".format(config_name, config_path)
)
scope = spack.config.SingleFileScope(
config_name, config_path, spack.schema.merged.schema
)
else:
missing.append(config_path)
continue
scopes.append(scope)
if missing:
msg = "Detected {0} missing include path(s):".format(len(missing))
msg += "\n {0}".format("\n ".join(missing))
raise spack.config.ConfigFileError(msg)
return scopes
def env_file_config_scope_name(self):
@property
def scope_name(self):
"""Name of the config scope of this environment's manifest file."""
return "env:%s" % self.name
def env_file_config_scope(self):
"""Get the configuration scope for the environment's manifest file."""
config_name = self.env_file_config_scope_name()
return spack.config.SingleFileScope(
config_name, self.manifest_path, spack.schema.env.schema, [TOP_LEVEL_KEY]
)
def config_scopes(self):
"""A list of all configuration scopes for this environment."""
return check_disallowed_env_config_mods(
self.included_config_scopes() + [self.env_file_config_scope()]
)
return self.manifest.scope_name
def destroy(self):
"""Remove this environment from Spack entirely."""
@@ -1228,7 +1132,7 @@ def change_existing_spec(
for idx, spec in matches:
override_spec = Spec.override(spec, change_spec)
self.spec_lists[list_name].specs[idx] = override_spec
self.spec_lists[list_name].replace(idx, str(override_spec))
if list_name == user_speclist_name:
self.manifest.override_user_spec(str(override_spec), idx=idx)
else:
@@ -1236,7 +1140,6 @@ def change_existing_spec(
str(spec), override=str(override_spec), list_name=list_name
)
self.update_stale_references(from_list=list_name)
self._construct_state_from_manifest()
def remove(self, query_spec, list_name=user_speclist_name, force=False):
"""Remove specs from an environment that match a query_spec"""
@@ -1593,44 +1496,6 @@ def _concretize_separately(self, tests=False):
]
return results
def concretize_and_add(self, user_spec, concrete_spec=None, tests=False):
"""Concretize and add a single spec to the environment.
Concretize the provided ``user_spec`` and add it along with the
concretized result to the environment. If the given ``user_spec`` was
already present in the environment, this does not add a duplicate.
The concretized spec will be added unless the ``user_spec`` was
already present and an associated concrete spec was already present.
Args:
concrete_spec: if provided, then it is assumed that it is the
result of concretizing the provided ``user_spec``
"""
if self.unify is True:
msg = (
"cannot install a single spec in an environment that is "
"configured to be concretized together. Run instead:\n\n"
" $ spack add <spec>\n"
" $ spack install\n"
)
raise SpackEnvironmentError(msg)
spec = Spec(user_spec)
if self.add(spec):
concrete = concrete_spec or spec.concretized(tests=tests)
self._add_concrete_spec(spec, concrete)
else:
# spec might be in the user_specs, but not installed.
# TODO: Redo name-based comparison for old style envs
spec = next(s for s in self.user_specs if s.satisfies(user_spec))
concrete = self.specs_by_hash.get(spec.dag_hash())
if not concrete:
concrete = spec.concretized(tests=tests)
self._add_concrete_spec(spec, concrete)
return concrete
@property
def default_view(self):
if not self.has_view(default_view_name):
@@ -1807,8 +1672,8 @@ def _add_concrete_spec(self, spec, concrete, new=True):
self.concretized_order.append(h)
self.specs_by_hash[h] = concrete
def _get_overwrite_specs(self):
# Find all dev specs that were modified.
def _dev_specs_that_need_overwrite(self):
"""Return the hashes of all specs that need to be reinstalled due to source code change."""
changed_dev_specs = [
s
for s in traverse.traverse_nodes(
@@ -1833,21 +1698,6 @@ def _get_overwrite_specs(self):
if depth == 0 or spec.installed
]
def _install_log_links(self, spec):
if not spec.external:
# Make sure log directory exists
log_path = self.log_path
fs.mkdirp(log_path)
with fs.working_dir(self.path):
# Link the resulting log file into logs dir
build_log_link = os.path.join(
log_path, "%s-%s.log" % (spec.name, spec.dag_hash(7))
)
if os.path.lexists(build_log_link):
os.remove(build_log_link)
symlink(spec.package.build_log_path, build_log_link)
def _partition_roots_by_install_status(self):
"""Partition root specs into those that do not have to be passed to the
installer, and those that should be, taking into account development
@@ -1881,58 +1731,18 @@ def install_all(self, **install_args):
"""
self.install_specs(None, **install_args)
def install_specs(self, specs=None, **install_args):
tty.debug("Assessing installation status of environment packages")
# If "spack install" is invoked repeatedly for a large environment
# where all specs are already installed, the operation can take
# a large amount of time due to repeatedly acquiring and releasing
# locks. As a small optimization, drop already installed root specs.
installed_roots, uninstalled_roots = self._partition_roots_by_install_status()
if specs:
specs_to_install = [s for s in specs if s not in installed_roots]
specs_dropped = [s for s in specs if s in installed_roots]
else:
specs_to_install = uninstalled_roots
specs_dropped = installed_roots
def install_specs(self, specs: Optional[List[Spec]] = None, **install_args):
roots = self.concrete_roots()
specs = specs if specs is not None else roots
# We need to repeat the work of the installer thanks to the above optimization:
# Already installed root specs should be marked explicitly installed in the
# database.
if specs_dropped:
with spack.store.STORE.db.write_transaction(): # do all in one transaction
for spec in specs_dropped:
spack.store.STORE.db.update_explicit(spec, True)
# Extend the set of specs to overwrite with modified dev specs and their parents
install_args["overwrite"] = (
install_args.get("overwrite", []) + self._dev_specs_that_need_overwrite()
)
if not specs_to_install:
tty.msg("All of the packages are already installed")
else:
tty.debug("Processing {0} uninstalled specs".format(len(specs_to_install)))
installs = [(spec.package, {**install_args, "explicit": spec in roots}) for spec in specs]
specs_to_overwrite = self._get_overwrite_specs()
tty.debug("{0} specs need to be overwritten".format(len(specs_to_overwrite)))
install_args["overwrite"] = install_args.get("overwrite", []) + specs_to_overwrite
installs = []
for spec in specs_to_install:
pkg_install_args = install_args.copy()
pkg_install_args["explicit"] = spec in self.roots()
installs.append((spec.package, pkg_install_args))
try:
builder = PackageInstaller(installs)
builder.install()
finally:
# Ensure links are set appropriately
for spec in specs_to_install:
if spec.installed:
self.new_installs.append(spec)
try:
self._install_log_links(spec)
except OSError as e:
tty.warn(
"Could not install log links for {0}: {1}".format(spec.name, str(e))
)
PackageInstaller(installs).install()
def all_specs_generator(self) -> Iterable[Spec]:
"""Returns a generator for all concrete specs"""
@@ -2253,13 +2063,8 @@ def write(self, regenerate: bool = True) -> None:
if regenerate:
self.regenerate_views()
spack.hooks.post_env_write(self)
self._reset_new_specs_and_installs()
def _reset_new_specs_and_installs(self) -> None:
self.new_specs = []
self.new_installs = []
self.new_specs.clear()
def update_lockfile(self) -> None:
with fs.write_tmp_and_move(self.lock_path) as f:
@@ -2457,18 +2262,6 @@ def make_repo_path(root):
return path
def prepare_config_scope(env):
"""Add env's scope to the global configuration search path."""
for scope in env.config_scopes():
spack.config.CONFIG.push_scope(scope)
def deactivate_config_scope(env):
"""Remove any scopes from env from the global config path."""
for scope in env.config_scopes():
spack.config.CONFIG.remove_scope(scope.name)
def manifest_file(env_name_or_dir):
"""Return the absolute path to a manifest file given the environment
name or directory.
@@ -2647,8 +2440,9 @@ def from_lockfile(manifest_dir: Union[pathlib.Path, str]) -> "EnvironmentManifes
already existing in the directory.
Args:
manifest_dir: directory where the lockfile is
manifest_dir: directory containing the manifest and lockfile
"""
# TBD: Should this be the abspath?
manifest_dir = pathlib.Path(manifest_dir)
lockfile = manifest_dir / lockfile_name
with lockfile.open("r") as f:
@@ -2666,6 +2460,8 @@ def from_lockfile(manifest_dir: Union[pathlib.Path, str]) -> "EnvironmentManifes
def __init__(self, manifest_dir: Union[pathlib.Path, str]) -> None:
self.manifest_dir = pathlib.Path(manifest_dir)
self.manifest_file = self.manifest_dir / manifest_name
self.scope_name = f"env:{environment_name(self.manifest_dir)}"
self.config_stage_dir = os.path.join(env_subdir_path(manifest_dir), "config")
if not self.manifest_file.exists():
msg = f"cannot find '{manifest_name}' in {self.manifest_dir}"
@@ -2902,6 +2698,145 @@ def __iter__(self):
def __str__(self):
return str(self.manifest_file)
@property
def included_config_scopes(self) -> List[spack.config.ConfigScope]:
"""List of included configuration scopes from the manifest.
Scopes are listed in the YAML file in order from highest to
lowest precedence, so configuration from earlier scopes will take
precedence over later ones.
This routine returns them in the order they should be pushed onto
the internal scope stack (so, in reverse, from lowest to highest).
Returns: Configuration scopes associated with the environment manifest
Raises:
SpackEnvironmentError: if the manifest includes a remote file but
no configuration stage directory has been identified
"""
scopes = []
# load config scopes added via 'include:', in reverse so that
# highest-precedence scopes are last.
includes = self[TOP_LEVEL_KEY].get("include", [])
env_name = environment_name(self.manifest_dir)
missing = []
for i, config_path in enumerate(reversed(includes)):
# allow paths to contain spack config/environment variables, etc.
config_path = substitute_path_variables(config_path)
include_url = urllib.parse.urlparse(config_path)
# Transform file:// URLs to direct includes.
if include_url.scheme == "file":
config_path = urllib.request.url2pathname(include_url.path)
# Any other URL should be fetched.
elif include_url.scheme in ("http", "https", "ftp"):
# Stage any remote configuration file(s)
staged_configs = (
os.listdir(self.config_stage_dir)
if os.path.exists(self.config_stage_dir)
else []
)
remote_path = urllib.request.url2pathname(include_url.path)
basename = os.path.basename(remote_path)
if basename in staged_configs:
# Do NOT re-stage configuration files over existing
# ones with the same name since there is a risk of
# losing changes (e.g., from 'spack config update').
tty.warn(
"Will not re-stage configuration from {0} to avoid "
"losing changes to the already staged file of the "
"same name.".format(remote_path)
)
# Note that the configuration stage directory
# is flattened to ensure a single copy of each
# configuration file.
config_path = self.config_stage_dir
if basename.endswith(".yaml"):
config_path = os.path.join(config_path, basename)
else:
staged_path = spack.config.fetch_remote_configs(
config_path, str(self.config_stage_dir), skip_existing=True
)
if not staged_path:
raise SpackEnvironmentError(
"Unable to fetch remote configuration {0}".format(config_path)
)
config_path = staged_path
elif include_url.scheme:
raise ValueError(
f"Unsupported URL scheme ({include_url.scheme}) for "
f"environment include: {config_path}"
)
# treat relative paths as relative to the environment
if not os.path.isabs(config_path):
config_path = os.path.join(self.manifest_dir, config_path)
config_path = os.path.normpath(os.path.realpath(config_path))
if os.path.isdir(config_path):
# directories are treated as regular ConfigScopes
config_name = "env:%s:%s" % (env_name, os.path.basename(config_path))
tty.debug("Creating ConfigScope {0} for '{1}'".format(config_name, config_path))
scope = spack.config.ConfigScope(config_name, config_path)
elif os.path.exists(config_path):
# files are assumed to be SingleFileScopes
config_name = "env:%s:%s" % (env_name, config_path)
tty.debug(
"Creating SingleFileScope {0} for '{1}'".format(config_name, config_path)
)
scope = spack.config.SingleFileScope(
config_name, config_path, spack.schema.merged.schema
)
else:
missing.append(config_path)
continue
scopes.append(scope)
if missing:
msg = "Detected {0} missing include path(s):".format(len(missing))
msg += "\n {0}".format("\n ".join(missing))
raise spack.config.ConfigFileError(msg)
return scopes
@property
def env_config_scopes(self) -> List[spack.config.ConfigScope]:
"""A list of all configuration scopes for the environment manifest.
Returns: All configuration scopes associated with the environment
"""
config_name = self.scope_name
env_scope = spack.config.SingleFileScope(
config_name, str(self.manifest_file), spack.schema.env.schema, [TOP_LEVEL_KEY]
)
return check_disallowed_env_config_mods(self.included_config_scopes + [env_scope])
def prepare_config_scope(self) -> None:
"""Add the manifest's scopes to the global configuration search path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.push_scope(scope)
def deactivate_config_scope(self) -> None:
"""Remove any of the manifest's scopes from the global config path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.remove_scope(scope.name)
@contextlib.contextmanager
def use_config(self):
"""Ensure only the manifest's configuration scopes are global."""
with no_active_environment():
self.prepare_config_scope()
yield
self.deactivate_config_scope()
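A hedged usage sketch of use_config: read settings exactly as the manifest defines them, with no environment active (manifest stands for an instance of the enclosing class; the queried section is an arbitrary example):

# Hypothetical usage; only the manifest's scopes are on the global
# config stack inside the with-block.
with manifest.use_config():
    packages_cfg = spack.config.get("packages")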
class SpackEnvironmentError(spack.error.SpackError):
"""Superclass for all errors to do with Spack environments."""

View File

@@ -12,6 +12,7 @@
import re
import sys
import types
from pathlib import Path
from typing import List
import llnl.util.lang
@@ -132,10 +133,38 @@ def load_extension(name: str) -> str:
def get_extension_paths():
"""Return the list of canonicalized extension paths from config:extensions."""
extension_paths = spack.config.get("config:extensions") or []
extension_paths.extend(extension_paths_from_entry_points())
paths = [spack.util.path.canonicalize_path(p) for p in extension_paths]
return paths
def extension_paths_from_entry_points() -> List[str]:
"""Load extensions from a Python package's entry points.
A Python package can register entry point metadata so that Spack can find
its extensions by adding the following to the project's pyproject.toml:
.. code-block:: toml
[project.entry-points."spack.extensions"]
baz = "baz:get_spack_extensions"
The function ``get_spack_extensions`` returns paths to the package's
Spack extensions.
"""
extension_paths: List[str] = []
for entry_point in llnl.util.lang.get_entry_points(group="spack.extensions"):
hook = entry_point.load()
if callable(hook):
paths = hook() or []
if isinstance(paths, (Path, str)):
extension_paths.append(str(paths))
else:
extension_paths.extend(paths)
return extension_paths
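For illustration, a hypothetical extension package could implement the hook referenced in the pyproject.toml snippet above; the package name baz and its layout are invented:

# baz/__init__.py (hypothetical third-party package)
from pathlib import Path

def get_spack_extensions():
    # May return a single str/Path or an iterable of them; the loader
    # above normalizes both forms into extension_paths.
    return Path(__file__).parent / "spack-extension"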
def get_command_paths():
"""Return the list of paths where to search for command files."""
command_paths = []

View File

@@ -30,6 +30,7 @@
import shutil
import urllib.error
import urllib.parse
from pathlib import PurePath
from typing import List, Optional
import llnl.url
@@ -37,13 +38,14 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.string import comma_and, quote
from llnl.util.filesystem import get_single_file, mkdirp, temp_cwd, temp_rename, working_dir
from llnl.util.filesystem import get_single_file, mkdirp, temp_cwd, working_dir
from llnl.util.symlink import symlink
import spack.config
import spack.error
import spack.oci.opener
import spack.url
import spack.util.archive
import spack.util.crypto as crypto
import spack.util.git
import spack.util.url as url_util
@@ -600,29 +602,21 @@ def expand(self):
tty.debug("Source fetched with %s is already expanded." % self.url_attr)
@_needs_stage
def archive(self, destination, **kwargs):
def archive(self, destination, *, exclude: Optional[str] = None):
assert llnl.url.extension_from_path(destination) == "tar.gz"
assert self.stage.source_path.startswith(self.stage.path)
# We need to prepend this dir name to every entry of the tarfile
top_level_dir = PurePath(self.stage.srcdir or os.path.basename(self.stage.source_path))
tar = which("tar", required=True)
patterns = kwargs.get("exclude", None)
if patterns is not None:
if isinstance(patterns, str):
patterns = [patterns]
for p in patterns:
tar.add_default_arg("--exclude=%s" % p)
with working_dir(self.stage.path):
if self.stage.srcdir:
# Here we create an archive with the default repository name.
# The 'tar' command has options for changing the name of a
# directory that is included in the archive, but they differ
# based on OS, so we temporarily rename the repo
with temp_rename(self.stage.source_path, self.stage.srcdir):
tar("-czf", destination, self.stage.srcdir)
else:
tar("-czf", destination, os.path.basename(self.stage.source_path))
with working_dir(self.stage.source_path), spack.util.archive.gzip_compressed_tarfile(
destination
) as (tar, _, _):
spack.util.archive.reproducible_tarfile_from_prefix(
tar=tar,
prefix=".",
skip=lambda entry: entry.name == exclude,
path_to_name=lambda path: (top_level_dir / PurePath(path)).as_posix(),
)
def __str__(self):
return "VCS: %s" % self.url
@@ -703,7 +697,6 @@ def __str__(self):
@fetcher
class GitFetchStrategy(VCSFetchStrategy):
"""
Fetch strategy that gets source code from a git repository.
Use like this in a package:
@@ -936,9 +929,12 @@ def clone(self, dest=None, commit=None, branch=None, tag=None, bare=False):
git_commands = []
submodules = self.submodules
if callable(submodules):
submodules = list(submodules(self.package))
git_commands.append(["submodule", "init", "--"] + submodules)
git_commands.append(["submodule", "update", "--recursive"])
submodules = submodules(self.package)
if submodules:
if isinstance(submodules, str):
submodules = [submodules]
git_commands.append(["submodule", "init", "--"] + submodules)
git_commands.append(["submodule", "update", "--recursive"])
elif submodules:
git_commands.append(["submodule", "update", "--init", "--recursive"])
@@ -1095,7 +1091,6 @@ def __str__(self):
@fetcher
class SvnFetchStrategy(VCSFetchStrategy):
"""Fetch strategy that gets source code from a subversion repository.
Use like this in a package:
@@ -1190,7 +1185,6 @@ def __str__(self):
@fetcher
class HgFetchStrategy(VCSFetchStrategy):
"""
Fetch strategy that gets source code from a Mercurial repository.
Use like this in a package:

View File

@@ -32,6 +32,7 @@
from llnl.util.tty.color import colorize
import spack.config
import spack.paths
import spack.projections
import spack.relocate
import spack.schema.projections
@@ -91,16 +92,16 @@ def view_copy(src: str, dst: str, view, spec: Optional[spack.spec.Spec] = None):
prefix_to_projection[spack.store.STORE.layout.root] = view._root
# This is vestigial code for the *old* location of sbang.
prefix_to_projection[
"#!/bin/bash {0}/bin/sbang".format(spack.paths.spack_root)
] = sbang.sbang_shebang_line()
prefix_to_projection[f"#!/bin/bash {spack.paths.spack_root}/bin/sbang"] = (
sbang.sbang_shebang_line()
)
spack.relocate.relocate_text(files=[dst], prefixes=prefix_to_projection)
try:
os.chown(dst, src_stat.st_uid, src_stat.st_gid)
except OSError:
tty.debug("Can't change the permissions for %s" % dst)
tty.debug(f"Can't change the permissions for {dst}")
def view_func_parser(parsed_name):
@@ -112,7 +113,7 @@ def view_func_parser(parsed_name):
elif parsed_name in ("add", "symlink", "soft"):
return view_symlink
else:
raise ValueError("invalid link type for view: '%s'" % parsed_name)
raise ValueError(f"invalid link type for view: '{parsed_name}'")
def inverse_view_func_parser(view_type):
@@ -270,9 +271,10 @@ def __init__(self, root, layout, **kwargs):
# Ensure projections are the same from each source
# Read projections file from view
if self.projections != self.read_projections():
msg = "View at %s has projections file" % self._root
msg += " which does not match projections passed manually."
raise ConflictingProjectionsError(msg)
raise ConflictingProjectionsError(
f"View at {self._root} has projections file"
" which does not match projections passed manually."
)
self._croot = colorize_root(self._root) + " "
@@ -313,11 +315,11 @@ def add_specs(self, *specs, **kwargs):
def add_standalone(self, spec):
if spec.external:
tty.warn(self._croot + "Skipping external package: %s" % colorize_spec(spec))
tty.warn(f"{self._croot}Skipping external package: {colorize_spec(spec)}")
return True
if self.check_added(spec):
tty.warn(self._croot + "Skipping already linked package: %s" % colorize_spec(spec))
tty.warn(f"{self._croot}Skipping already linked package: {colorize_spec(spec)}")
return True
self.merge(spec)
@@ -325,7 +327,7 @@ def add_standalone(self, spec):
self.link_meta_folder(spec)
if self.verbose:
tty.info(self._croot + "Linked package: %s" % colorize_spec(spec))
tty.info(f"{self._croot}Linked package: {colorize_spec(spec)}")
return True
def merge(self, spec, ignore=None):
@@ -393,7 +395,7 @@ def needs_file(spec, file):
for file in files:
if not os.path.lexists(file):
tty.warn("Tried to remove %s which does not exist" % file)
tty.warn(f"Tried to remove {file} which does not exist")
continue
# remove if file is not owned by any other package in the view
@@ -404,7 +406,7 @@ def needs_file(spec, file):
# we are currently removing, as we remove files before unlinking the
# metadata directory.
if len([s for s in specs if needs_file(s, file)]) <= 1:
tty.debug("Removing file " + file)
tty.debug(f"Removing file {file}")
os.remove(file)
def check_added(self, spec):
@@ -477,14 +479,14 @@ def remove_standalone(self, spec):
Remove (unlink) a standalone package from this view.
"""
if not self.check_added(spec):
tty.warn(self._croot + "Skipping package not linked in view: %s" % spec.name)
tty.warn(f"{self._croot}Skipping package not linked in view: {spec.name}")
return
self.unmerge(spec)
self.unlink_meta_folder(spec)
if self.verbose:
tty.info(self._croot + "Removed package: %s" % colorize_spec(spec))
tty.info(f"{self._croot}Removed package: {colorize_spec(spec)}")
def get_projection_for_spec(self, spec):
"""
@@ -558,9 +560,9 @@ def print_conflict(self, spec_active, spec_specified, level="error"):
linked = tty.color.colorize(" (@gLinked@.)", color=color)
specified = tty.color.colorize("(@rSpecified@.)", color=color)
cprint(
self._croot + "Package conflict detected:\n"
"%s %s\n" % (linked, colorize_spec(spec_active))
+ "%s %s" % (specified, colorize_spec(spec_specified))
f"{self._croot}Package conflict detected:\n"
f"{linked} {colorize_spec(spec_active)}\n"
f"{specified} {colorize_spec(spec_specified)}"
)
def print_status(self, *specs, **kwargs):
@@ -572,14 +574,14 @@ def print_status(self, *specs, **kwargs):
for s, v in zip(specs, in_view):
if not v:
tty.error(self._croot + "Package not linked: %s" % s.name)
tty.error(f"{self._croot}Package not linked: {s.name}")
elif s != v:
self.print_conflict(v, s, level="warn")
in_view = list(filter(None, in_view))
if len(specs) > 0:
tty.msg("Packages linked in %s:" % self._croot[:-1])
tty.msg(f"Packages linked in {self._croot[:-1]}:")
# Make a dict with specs keyed by architecture and compiler.
index = index_by(specs, ("architecture", "compiler"))
@@ -589,20 +591,19 @@ def print_status(self, *specs, **kwargs):
if i > 0:
print()
header = "%s{%s} / %s{%s}" % (
spack.spec.ARCHITECTURE_COLOR,
architecture,
spack.spec.COMPILER_COLOR,
compiler,
header = (
f"{spack.spec.ARCHITECTURE_COLOR}{{{architecture}}} "
f"/ {spack.spec.COMPILER_COLOR}{{{compiler}}}"
)
tty.hline(colorize(header), char="-")
specs = index[(architecture, compiler)]
specs.sort()
format_string = "{name}{@version}"
format_string += "{%compiler}{compiler_flags}{variants}"
abbreviated = [s.cformat(format_string) for s in specs]
abbreviated = [
s.cformat("{name}{@version}{%compiler}{compiler_flags}{variants}")
for s in specs
]
# Print one spec per line along with prefix path
width = max(len(s) for s in abbreviated)
@@ -634,22 +635,19 @@ def unlink_meta_folder(self, spec):
class SimpleFilesystemView(FilesystemView):
"""A simple and partial implementation of FilesystemView focused on
performance and immutable views, where specs cannot be removed after they
were added."""
"""A simple and partial implementation of FilesystemView focused on performance and immutable
views, where specs cannot be removed after they were added."""
def __init__(self, root, layout, **kwargs):
super().__init__(root, layout, **kwargs)
def _sanity_check_view_projection(self, specs):
"""A very common issue is that we end up with two specs of the same
package, that project to the same prefix. We want to catch that as
early as possible and give a sensible error to the user. Here we use
the metadata dir (.spack) projection as a quick test to see whether
two specs in the view are going to clash. The metadata dir is used
because it's always added by Spack with identical files, so a
guaranteed clash that's easily verified."""
seen = dict()
"""A very common issue is that we end up with two specs of the same package, that project
to the same prefix. We want to catch that as early as possible and give a sensible error to
the user. Here we use the metadata dir (.spack) projection as a quick test to see whether
two specs in the view are going to clash. The metadata dir is used because it's always
added by Spack with identical files, so a guaranteed clash that's easily verified."""
seen = {}
for current_spec in specs:
metadata_dir = self.relative_metadata_dir_for_spec(current_spec)
conflicting_spec = seen.get(metadata_dir)
@@ -657,7 +655,8 @@ def _sanity_check_view_projection(self, specs):
raise ConflictingSpecsError(current_spec, conflicting_spec)
seen[metadata_dir] = current_spec
def add_specs(self, *specs, **kwargs):
def add_specs(self, *specs: spack.spec.Spec) -> None:
"""Link a root-to-leaf topologically ordered list of specs into the view."""
assert all((s.concrete for s in specs))
if len(specs) == 0:
return
@@ -668,9 +667,6 @@ def add_specs(self, *specs, **kwargs):
tty.warn("Skipping external package: " + s.short_spec)
specs = [s for s in specs if not s.external]
if kwargs.get("exclude", None):
specs = set(filter_exclude(specs, kwargs["exclude"]))
self._sanity_check_view_projection(specs)
# Ignore spack meta data folder.
@@ -695,13 +691,11 @@ def skip_list(file):
# Inform about file-file conflicts.
if visitor.file_conflicts:
if self.ignore_conflicts:
tty.debug("{0} file conflicts".format(len(visitor.file_conflicts)))
tty.debug(f"{len(visitor.file_conflicts)} file conflicts")
else:
raise MergeConflictSummary(visitor.file_conflicts)
tty.debug(
"Creating {0} dirs and {1} links".format(len(visitor.directories), len(visitor.files))
)
tty.debug(f"Creating {len(visitor.directories)} dirs and {len(visitor.files)} links")
# Make the directory structure
for dst in visitor.directories:

View File

@@ -15,13 +15,6 @@
* post_install(spec, explicit)
* pre_uninstall(spec)
* post_uninstall(spec)
* on_install_start(spec)
* on_install_success(spec)
* on_install_failure(spec)
* on_phase_success(pkg, phase_name, log_file)
* on_phase_error(pkg, phase_name, log_file)
* post_env_write(env)
This can be used to implement support for things like module
systems (e.g. modules, lmod, etc.) or to add other custom
@@ -78,17 +71,5 @@ def __call__(self, *args, **kwargs):
pre_install = _HookRunner("pre_install")
post_install = _HookRunner("post_install")
# These hooks are run within an install subprocess
pre_uninstall = _HookRunner("pre_uninstall")
post_uninstall = _HookRunner("post_uninstall")
on_phase_success = _HookRunner("on_phase_success")
on_phase_error = _HookRunner("on_phase_error")
# These are hooks in installer.py, before starting install subprocess
on_install_start = _HookRunner("on_install_start")
on_install_success = _HookRunner("on_install_success")
on_install_failure = _HookRunner("on_install_failure")
on_install_cancel = _HookRunner("on_install_cancel")
# Environment hooks
post_env_write = _HookRunner("post_env_write")

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from typing import IO, Optional, Tuple
from typing import BinaryIO, Optional, Tuple
import llnl.util.tty as tty
from llnl.util.filesystem import BaseDirectoryVisitor, visit_directory_tree
@@ -18,7 +18,7 @@ def should_keep(path: bytes) -> bool:
return path.startswith(b"$") or (os.path.isabs(path) and os.path.lexists(path))
def _drop_redundant_rpaths(f: IO) -> Optional[Tuple[bytes, bytes]]:
def _drop_redundant_rpaths(f: BinaryIO) -> Optional[Tuple[bytes, bytes]]:
"""Drop redundant entries from rpath.
Args:

View File

@@ -34,21 +34,8 @@ def _for_each_enabled(
def post_install(spec, explicit: bool):
import spack.environment as ev # break import cycle
if ev.active_environment():
# If installed through an environment, we skip post_install
# module generation and generate the modules on env_write so Spack
# can manage interactions between env views and modules
return
_for_each_enabled(spec, "write", explicit)
def post_uninstall(spec):
_for_each_enabled(spec, "remove")
def post_env_write(env):
for spec in env.new_installs:
_for_each_enabled(spec, "write")

View File

@@ -229,6 +229,8 @@ def post_install(spec, explicit=None):
$spack_prefix/bin/sbang instead of something longer than the
shebang limit.
"""
if sys.platform == "win32":
return
if spec.external:
tty.debug("SKIP: shebang filtering [external package]")
return

View File

@@ -36,6 +36,7 @@
import sys
import time
from collections import defaultdict
from gzip import GzipFile
from typing import Dict, Iterator, List, Optional, Set, Tuple
import llnl.util.filesystem as fs
@@ -638,13 +639,12 @@ def archive_install_logs(pkg: "spack.package_base.PackageBase", phase_log_dir: s
pkg: the package that was built and installed
phase_log_dir: path to the archive directory
"""
# Archive the whole stdout + stderr for the package
fs.install(pkg.log_path, pkg.install_log_path)
# Archive all phase log paths
for phase_log in pkg.phase_log_files:
log_file = os.path.basename(phase_log)
fs.install(phase_log, os.path.join(phase_log_dir, log_file))
# Copy a compressed version of the install log
with open(pkg.log_path, "rb") as f, open(pkg.install_log_path, "wb") as g:
# Use GzipFile directly so we can omit filename / mtime in header
gzip_file = GzipFile(filename="", mode="wb", compresslevel=6, mtime=0, fileobj=g)
shutil.copyfileobj(f, gzip_file)
gzip_file.close()
# Archive the install-phase test log, if present
pkg.archive_install_test_log()
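The GzipFile arguments are the interesting part: passing filename="" and mtime=0 keeps the gzip header free of per-run metadata, so identical logs compress to byte-identical archives. A minimal standalone sketch of the same trick:

import shutil
from gzip import GzipFile

def gzip_deterministic(src: str, dst: str) -> None:
    with open(src, "rb") as f, open(dst, "wb") as g:
        # filename="" and mtime=0 omit the name/timestamp from the header
        with GzipFile(filename="", mode="wb", compresslevel=6, mtime=0, fileobj=g) as gz:
            shutil.copyfileobj(f, gz)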
@@ -1705,7 +1705,6 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
except spack.build_environment.StopPhase as e:
# A StopPhase exception means that do_install was asked to
# stop early from clients, and is not an error at this point
spack.hooks.on_install_failure(task.request.pkg.spec)
pid = f"{self.pid}: " if tty.show_pid() else ""
tty.debug(f"{pid}{str(e)}")
tty.debug(f"Package stage directory: {pkg.stage.source_path}")
@@ -2011,7 +2010,6 @@ def install(self) -> None:
if task is None:
continue
spack.hooks.on_install_start(task.request.pkg.spec)
install_args = task.request.install_args
keep_prefix = install_args.get("keep_prefix")
@@ -2037,9 +2035,6 @@ def install(self) -> None:
tty.warn(f"{pkg_id} does NOT actually have any uninstalled deps left")
dep_str = "dependencies" if task.priority > 1 else "dependency"
# Hook to indicate task failure, but without an exception
spack.hooks.on_install_failure(task.request.pkg.spec)
raise InstallError(
f"Cannot proceed with {pkg_id}: {task.priority} uninstalled "
f"{dep_str}: {','.join(task.uninstalled_deps)}",
@@ -2062,11 +2057,6 @@ def install(self) -> None:
tty.warn(f"{pkg_id} failed to install")
self._update_failed(task)
# Mark that the package failed
# TODO: this should also be for the task.pkg, but we don't
# model transitive yet.
spack.hooks.on_install_failure(task.request.pkg.spec)
if self.fail_fast:
raise InstallError(fail_fast_err, pkg=pkg)
@@ -2169,7 +2159,6 @@ def install(self) -> None:
tty.error(
f"Failed to install {pkg.name} due to " f"{exc.__class__.__name__}: {str(exc)}"
)
spack.hooks.on_install_cancel(task.request.pkg.spec)
raise
except binary_distribution.NoChecksumException as exc:
@@ -2188,7 +2177,6 @@ def install(self) -> None:
except (Exception, SystemExit) as exc:
self._update_failed(task, True, exc)
spack.hooks.on_install_failure(task.request.pkg.spec)
# Best effort installs suppress the exception and mark the
# package as a failure.
@@ -2372,9 +2360,6 @@ def run(self) -> bool:
_print_timer(pre=self.pre, pkg_id=self.pkg_id, timer=self.timer)
_print_installed_pkg(self.pkg.prefix)
# Send final status that install is successful
spack.hooks.on_install_success(self.pkg.spec)
# preserve verbosity across runs
return self.echo
@@ -2453,15 +2438,10 @@ def _real_install(self) -> None:
# Catch any errors to report to logging
self.timer.start(phase_fn.name)
phase_fn.execute()
spack.hooks.on_phase_success(pkg, phase_fn.name, log_file)
self.timer.stop(phase_fn.name)
except BaseException:
combine_phase_logs(pkg.phase_log_files, pkg.log_path)
spack.hooks.on_phase_error(pkg, phase_fn.name, log_file)
# phase error indicates install error
spack.hooks.on_install_failure(pkg.spec)
raise
# We assume loggers share echo True/False

View File

@@ -950,14 +950,10 @@ def _main(argv=None):
parser.print_help()
return 1
# -h, -H, and -V are special as they do not require a command, but
# all the other options do nothing without a command.
# version is special as it does not require a command or loading any additional infrastructure
if args.version:
print(get_version())
return 0
elif args.help:
sys.stdout.write(parser.format_help(level=args.help))
return 0
# ------------------------------------------------------------------------
# This part of the `main()` sets up Spack's configuration.
@@ -996,6 +992,12 @@ def _main(argv=None):
print_setup_info(*args.print_shell_vars.split(","))
return 0
# -h and -H are special as they do not require a command, but
# all the other options do nothing without a command.
if args.help:
sys.stdout.write(parser.format_help(level=args.help))
return 0
# At this point we've considered all the options to spack itself, so we
# need a command or we're done.
if not args.command:
@@ -1038,9 +1040,9 @@ def finish_parse_and_run(parser, cmd_name, main_args, env_format_error):
set_working_dir()
# now we can actually execute the command.
if args.spack_profile or args.sorted_profile:
if main_args.spack_profile or main_args.sorted_profile:
_profile_wrapper(command, parser, args, unknown)
elif args.pdb:
elif main_args.pdb:
import pdb
pdb.runctx("_invoke_command(command, parser, args, unknown)", globals(), locals())

View File

@@ -43,6 +43,7 @@
import spack.build_environment
import spack.config
import spack.deptypes as dt
import spack.environment
import spack.error
import spack.modules.common
@@ -53,6 +54,7 @@
import spack.spec
import spack.store
import spack.tengine as tengine
import spack.user_environment
import spack.util.environment
import spack.util.file_permissions as fp
import spack.util.path
@@ -119,43 +121,26 @@ def update_dictionary_extending_lists(target, update):
target[key] = update[key]
def dependencies(spec, request="all"):
"""Returns the list of dependent specs for a given spec, according to the
request passed as parameter.
def dependencies(spec: spack.spec.Spec, request: str = "all") -> List[spack.spec.Spec]:
"""Returns the list of dependent specs for a given spec.
Args:
spec: spec to be analyzed
request: either 'none', 'direct' or 'all'
request: one of "none", "run", "direct", "all"
Returns:
list of dependencies
The return list will be empty if request is 'none', will contain
the direct dependencies if request is 'direct', or the entire DAG
if request is 'all'.
list of requested dependencies
"""
if request not in ("none", "direct", "all"):
message = "Wrong value for argument 'request' : "
message += "should be one of ('none', 'direct', 'all')"
raise tty.error(message + " [current value is '%s']" % request)
if request == "none":
return []
elif request == "run":
return spec.dependencies(deptype=dt.RUN)
elif request == "direct":
return spec.dependencies(deptype=dt.RUN | dt.LINK)
elif request == "all":
return list(spec.traverse(order="topo", deptype=dt.LINK | dt.RUN, root=False))
if request == "direct":
return spec.dependencies(deptype=("link", "run"))
# FIXME : during module file creation nodes seem to be visited multiple
# FIXME : times even if cover='nodes' is given. This work around permits
# FIXME : to get a unique list of spec anyhow. Do we miss a merge
# FIXME : step among nodes that refer to the same package?
seen = set()
seen_add = seen.add
deps = sorted(
spec.traverse(order="post", cover="nodes", deptype=("link", "run"), root=False),
reverse=True,
)
return [d for d in deps if not (d in seen or seen_add(d))]
raise ValueError(f'request "{request}" is not one of "none", "direct", "run", "all"')
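A hedged usage sketch of the four request levels (the spec name is an assumption):

spec = spack.spec.Spec("hdf5").concretized()
dependencies(spec, "none")    # always []
dependencies(spec, "run")     # direct run-type dependencies only
dependencies(spec, "direct")  # direct run- and link-type dependencies
dependencies(spec, "all")     # the whole link/run DAG, topologically ordered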
def merge_config_rules(configuration, spec):
@@ -695,28 +680,33 @@ def environment_modifications(self):
)
spack.config.merge_yaml(
prefix_inspections,
spack.config.get("modules:%s:prefix_inspections" % self.conf.name, {}),
spack.config.get(f"modules:{self.conf.name}:prefix_inspections", {}),
)
use_view = spack.config.get("modules:%s:use_view" % self.conf.name, False)
use_view = spack.config.get(f"modules:{self.conf.name}:use_view", False)
assert isinstance(use_view, (bool, str))
spec = self.spec.copy() # defensive copy before setting prefix
if use_view:
if use_view is True:
use_view = spack.environment.default_view_name
env = spack.environment.active_environment()
if not env:
raise spack.environment.SpackEnvironmentViewError(
"Module generation with views requires active environment"
)
view = env.views[use_view]
view_name = spack.environment.default_view_name if use_view is True else use_view
spec.prefix = view.get_projection_for_spec(spec)
if not env.has_view(view_name):
raise spack.environment.SpackEnvironmentViewError(
f"View {view_name} not found in environment {env.name} when generating modules"
)
view = env.views[view_name]
else:
view = None
env = spack.util.environment.inspect_path(
spec.prefix, prefix_inspections, exclude=spack.util.environment.is_system_path
self.spec.prefix, prefix_inspections, exclude=spack.util.environment.is_system_path
)
# Let the extendee/dependency modify their extensions/dependencies
@@ -726,13 +716,19 @@ def environment_modifications(self):
# whole chain of setup_dependent_package has to be followed from leaf to spec.
# So: just run it here, but don't collect env mods.
spack.build_environment.SetupContext(
spec, context=Context.RUN
self.spec, context=Context.RUN
).set_all_package_py_globals()
# Then run setup_dependent_run_environment before setup_run_environment.
for dep in spec.dependencies(deptype=("link", "run")):
dep.package.setup_dependent_run_environment(env, spec)
spec.package.setup_run_environment(env)
for dep in self.spec.dependencies(deptype=("link", "run")):
dep.package.setup_dependent_run_environment(env, self.spec)
self.spec.package.setup_run_environment(env)
# Project the environment variables from prefix to view if needed
if view and self.spec in view:
spack.user_environment.project_env_mods(
*self.spec.traverse(deptype=dt.LINK | dt.RUN), view=view, env=env
)
# Modifications required from modules.yaml
env.extend(self.conf.env)
@@ -754,11 +750,11 @@ def environment_modifications(self):
msg = "some tokens cannot be expanded in an environment variable name"
_check_tokens_are_valid(x.name, message=msg)
# Transform them
x.name = spec.format(x.name, transform=transform)
x.name = self.spec.format(x.name, transform=transform)
if self.modification_needs_formatting(x):
try:
# Not every command has a value
x.value = spec.format(x.value)
x.value = self.spec.format(x.value)
except AttributeError:
pass
x.name = str(x.name).replace("-", "_")

View File

@@ -134,7 +134,7 @@ def upload_blob(
return True
# Otherwise, do another PUT request.
spack.oci.opener.ensure_status(response, 202)
spack.oci.opener.ensure_status(request, response, 202)
assert "Location" in response.headers
# Can be absolute or relative, joining handles both
@@ -143,19 +143,16 @@ def upload_blob(
)
f.seek(0)
response = _urlopen(
Request(
url=upload_url,
method="PUT",
data=f,
headers={
"Content-Type": "application/octet-stream",
"Content-Length": str(file_size),
},
)
request = Request(
url=upload_url,
method="PUT",
data=f,
headers={"Content-Type": "application/octet-stream", "Content-Length": str(file_size)},
)
spack.oci.opener.ensure_status(response, 201)
response = _urlopen(request)
spack.oci.opener.ensure_status(request, response, 201)
# print elapsed time and # MB/s
_log_upload_progress(digest, file_size, time.time() - start)
@@ -164,7 +161,7 @@ def upload_blob(
def upload_manifest(
ref: ImageReference,
oci_manifest: dict,
manifest: dict,
tag: bool = True,
_urlopen: spack.oci.opener.MaybeOpen = None,
):
@@ -172,7 +169,7 @@ def upload_manifest(
Args:
ref: The image reference.
oci_manifest: The OCI manifest or index.
manifest: The manifest or index.
tag: When true, use the tag, otherwise use the digest,
this is relevant for multi-arch images, where the
tag is an index, referencing the manifests by digest.
@@ -182,23 +179,23 @@ def upload_manifest(
"""
_urlopen = _urlopen or spack.oci.opener.urlopen
data = json.dumps(oci_manifest, separators=(",", ":")).encode()
data = json.dumps(manifest, separators=(",", ":")).encode()
digest = Digest.from_sha256(hashlib.sha256(data).hexdigest())
size = len(data)
if not tag:
ref = ref.with_digest(digest)
response = _urlopen(
Request(
url=ref.manifest_url(),
method="PUT",
data=data,
headers={"Content-Type": oci_manifest["mediaType"]},
)
request = Request(
url=ref.manifest_url(),
method="PUT",
data=data,
headers={"Content-Type": manifest["mediaType"]},
)
spack.oci.opener.ensure_status(response, 201)
response = _urlopen(request)
spack.oci.opener.ensure_status(request, response, 201)
return digest, size

View File

@@ -310,19 +310,15 @@ def http_error_401(self, req: Request, fp, code, msg, headers):
# Login failed, avoid infinite recursion where we go back and
# forth between auth server and registry
if hasattr(req, "login_attempted"):
raise urllib.error.HTTPError(
req.full_url, code, f"Failed to login to {req.full_url}: {msg}", headers, fp
raise spack.util.web.DetailedHTTPError(
req, code, f"Failed to login: {msg}", headers, fp
)
# On 401 Unauthorized, parse the WWW-Authenticate header
# to determine what authentication is required
if "WWW-Authenticate" not in headers:
raise urllib.error.HTTPError(
req.full_url,
code,
"Cannot login to registry, missing WWW-Authenticate header",
headers,
fp,
raise spack.util.web.DetailedHTTPError(
req, code, "Cannot login to registry, missing WWW-Authenticate header", headers, fp
)
header_value = headers["WWW-Authenticate"]
@@ -330,8 +326,8 @@ def http_error_401(self, req: Request, fp, code, msg, headers):
try:
challenge = get_bearer_challenge(parse_www_authenticate(header_value))
except ValueError as e:
raise urllib.error.HTTPError(
req.full_url,
raise spack.util.web.DetailedHTTPError(
req,
code,
f"Cannot login to registry, malformed WWW-Authenticate header: {header_value}",
headers,
@@ -340,8 +336,8 @@ def http_error_401(self, req: Request, fp, code, msg, headers):
# If there is no bearer challenge, we can't handle it
if not challenge:
raise urllib.error.HTTPError(
req.full_url,
raise spack.util.web.DetailedHTTPError(
req,
code,
f"Cannot login to registry, unsupported authentication scheme: {header_value}",
headers,
@@ -356,8 +352,8 @@ def http_error_401(self, req: Request, fp, code, msg, headers):
timeout=req.timeout,
)
except ValueError as e:
raise urllib.error.HTTPError(
req.full_url,
raise spack.util.web.DetailedHTTPError(
req,
code,
f"Cannot login to registry, failed to obtain bearer token: {e}",
headers,
@@ -412,13 +408,13 @@ def create_opener():
return opener
def ensure_status(response: HTTPResponse, status: int):
def ensure_status(request: urllib.request.Request, response: HTTPResponse, status: int):
"""Raise an error if the response status is not the expected one."""
if response.status == status:
return
raise urllib.error.HTTPError(
response.geturl(), response.status, response.reason, response.info(), None
raise spack.util.web.DetailedHTTPError(
request, response.status, response.reason, response.info(), None
)
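A hedged sketch of the new calling convention: callers thread the originating Request through so a failure can report the full request context (names are taken from the surrounding code; the URL and method are invented):

request = Request(url=ref.manifest_url(), method="HEAD")
response = _urlopen(request)
# Raises DetailedHTTPError, with the request attached, on anything but 200.
ensure_status(request, response, 200)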

View File

@@ -9,6 +9,8 @@
import platform
import subprocess
from llnl.util import tty
from spack.error import SpackError
from spack.util import windows_registry as winreg
from spack.version import Version
@@ -83,11 +85,50 @@ def compiler_search_paths(self):
os.path.join(str(os.getenv("ONEAPI_ROOT")), "compiler", "*", "windows", "bin")
)
)
# Second strategy: Find MSVC via the registry
msft = winreg.WindowsRegistryView(
"SOFTWARE\\WOW6432Node\\Microsoft", winreg.HKEY.HKEY_LOCAL_MACHINE
)
vs_entries = msft.find_subkeys(r"VisualStudio_.*")
def try_query_registry(retry=False):
winreg_report_error = lambda e: tty.debug(
'Windows registry query on "SOFTWARE\\WOW6432Node\\Microsoft"'
f"under HKEY_LOCAL_MACHINE: {str(e)}"
)
try:
# Registry interactions are subject to race conditions, etc., and can
# generally be flaky; do this in a try/except block to prevent registry
# issues from interfering with compiler detection
msft = winreg.WindowsRegistryView(
"SOFTWARE\\WOW6432Node\\Microsoft", winreg.HKEY.HKEY_LOCAL_MACHINE
)
return msft.find_subkeys(r"VisualStudio_.*", recursive=False)
except OSError as e:
# OSErrors propagated to the caller by Spack's registry module are expected
# and indicate a known issue with the registry query,
# e.g. the user does not have permissions or the key/value
# doesn't exist
winreg_report_error(e)
return []
except winreg.InvalidRegistryOperation as e:
# Other errors raised by Spack's registry module indicate
# an unexpected error type, and are handled specifically
# as the underlying cause is difficult/impossible to determine
# without manually exploring the registry.
# These errors can also be spurious (race conditions)
# and may resolve on re-execution of the query,
# or be permanent (specific types of permission issues),
# but the registry raises the same exception for all
# types of atypical errors
if retry:
winreg_report_error(e)
return []
vs_entries = try_query_registry()
if not vs_entries:
# Occasional spurious race conditions can arise when reading the MS registry;
# typically these race conditions resolve immediately and we can safely
# retry the registry query without waiting.
# Note: Winreg does not support locking
vs_entries = try_query_registry(retry=True)
vs_paths = []
def clean_vs_path(path):
@@ -99,11 +140,8 @@ def clean_vs_path(path):
val = entry.get_subkey("Capabilities").get_value("ApplicationDescription").value
vs_paths.append(clean_vs_path(val))
except FileNotFoundError as e:
if hasattr(e, "winerror"):
if e.winerror == 2:
pass
else:
raise
if hasattr(e, "winerror") and e.winerror == 2:
pass
else:
raise
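The registry query above follows a try-once-then-retry-once pattern; a generic, hedged sketch of the same idea, detached from the registry specifics:

def query_with_one_retry(query, is_expected_error):
    # Ambiguous failures are retried exactly once, since spurious races
    # typically resolve immediately and the resource cannot be locked.
    for final_attempt in (False, True):
        try:
            return query()
        except Exception as e:
            if is_expected_error(e) or final_attempt:
                return []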

View File

@@ -24,6 +24,7 @@
import textwrap
import time
import traceback
import typing
import warnings
from typing import Any, Callable, Dict, Iterable, List, Optional, Set, Tuple, Type, TypeVar, Union
@@ -66,7 +67,7 @@
from spack.stage import DIYStage, ResourceStage, Stage, StageComposite, compute_stage_name
from spack.util.executable import ProcessError, which
from spack.util.package_hash import package_hash
from spack.version import GitVersion, StandardVersion, Version
from spack.version import GitVersion, StandardVersion
FLAG_HANDLER_RETURN_TYPE = Tuple[
Optional[Iterable[str]], Optional[Iterable[str]], Optional[Iterable[str]]
@@ -93,29 +94,26 @@
spack_times_log = "install_times.json"
def deprecated_version(pkg, version):
"""Return True if the version is deprecated, False otherwise.
def deprecated_version(pkg: "PackageBase", version: Union[str, StandardVersion]) -> bool:
"""Return True iff the version is deprecated.
Arguments:
pkg (PackageBase): The package whose version is to be checked.
version (str or spack.version.StandardVersion): The version being checked
pkg: The package whose version is to be checked.
version: The version being checked
"""
if not isinstance(version, StandardVersion):
version = Version(version)
version = StandardVersion.from_string(version)
for k, v in pkg.versions.items():
if version == k and v.get("deprecated", False):
return True
return False
details = pkg.versions.get(version)
return details is not None and details.get("deprecated", False)
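The rewrite relies on pkg.versions being keyed by StandardVersion, so a single dict lookup replaces the old loop; a hedged, self-contained illustration (the versions dict is invented):

from spack.version import StandardVersion

versions = {
    StandardVersion.from_string("1.0"): {"deprecated": True},
    StandardVersion.from_string("2.0"): {},
}
details = versions.get(StandardVersion.from_string("1.0"))
print(details is not None and details.get("deprecated", False))  # True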
def preferred_version(pkg):
def preferred_version(pkg: "PackageBase"):
"""
Returns a sorted list of the preferred versions of the package.
Arguments:
pkg (PackageBase): The package whose versions are to be assessed.
pkg: The package whose versions are to be assessed.
"""
# Here we sort first on the fact that a version is marked
# as preferred in the package, then on the fact that the
@@ -568,6 +566,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
provided: Dict["spack.spec.Spec", Set["spack.spec.Spec"]]
provided_together: Dict["spack.spec.Spec", List[Set[str]]]
patches: Dict["spack.spec.Spec", List["spack.patch.Patch"]]
variants: Dict[str, Tuple["spack.variant.Variant", "spack.spec.Spec"]]
#: By default, packages are not virtual
#: Virtual packages override this attribute
@@ -732,13 +731,13 @@ def dependencies_by_name(cls, when: bool = False):
@classmethod
def possible_dependencies(
cls,
transitive=True,
expand_virtuals=True,
transitive: bool = True,
expand_virtuals: bool = True,
depflag: dt.DepFlag = dt.ALL,
visited=None,
missing=None,
virtuals=None,
):
visited: Optional[dict] = None,
missing: Optional[dict] = None,
virtuals: Optional[set] = None,
) -> Dict[str, Set[str]]:
"""Return dict of possible dependencies of this package.
Args:
@@ -902,22 +901,16 @@ def version(self):
@classmethod
@memoized
def version_urls(cls):
"""OrderedDict of explicitly defined URLs for versions of this package.
def version_urls(cls) -> Dict[StandardVersion, str]:
"""Dict of explicitly defined URLs for versions of this package.
Return:
An OrderedDict (version -> URL) different versions of this
package, sorted by version.
A dict mapping version to url, ordered by version.
A version's URL only appears in the result if it has an an
explicitly defined ``url`` argument. So, this list may be empty
if a package only defines ``url`` at the top level.
A version's URL only appears in the result if it has an explicitly defined ``url``
argument. So, this dict may be empty if a package only defines ``url`` at the top level.
"""
version_urls = collections.OrderedDict()
for v, args in sorted(cls.versions.items()):
if "url" in args:
version_urls[v] = args["url"]
return version_urls
return {v: args["url"] for v, args in sorted(cls.versions.items()) if "url" in args}
def nearest_url(self, version):
"""Finds the URL with the "closest" version to ``version``.
@@ -960,36 +953,39 @@ def update_external_dependencies(self, extendee_spec=None):
"""
pass
def all_urls_for_version(self, version):
def all_urls_for_version(self, version: StandardVersion) -> List[str]:
"""Return all URLs derived from version_urls(), url, urls, and
list_url (if it contains a version) in a package in that order.
Args:
version (spack.version.Version): the version for which a URL is sought
version: the version for which a URL is sought
"""
uf = None
if type(self).url_for_version != PackageBase.url_for_version:
uf = self.url_for_version
return self._implement_all_urls_for_version(version, uf)
def _implement_all_urls_for_version(self, version, custom_url_for_version=None):
if not isinstance(version, StandardVersion):
version = Version(version)
def _implement_all_urls_for_version(
self,
version: Union[str, StandardVersion],
custom_url_for_version: Optional[Callable[[StandardVersion], Optional[str]]] = None,
) -> List[str]:
version = StandardVersion.from_string(version) if isinstance(version, str) else version
urls = []
urls: List[str] = []
# If we have a specific URL for this version, don't extrapolate.
version_urls = self.version_urls()
if version in version_urls:
urls.append(version_urls[version])
url = self.version_urls().get(version)
if url:
urls.append(url)
# if there is a custom url_for_version, use it
if custom_url_for_version is not None:
u = custom_url_for_version(version)
if u not in urls and u is not None:
if u is not None and u not in urls:
urls.append(u)
def sub_and_add(u):
def sub_and_add(u: Optional[str]) -> None:
if u is None:
return
# skip the url if there is no version to replace
@@ -997,9 +993,7 @@ def sub_and_add(u):
spack.url.parse_version(u)
except spack.url.UndetectableVersionError:
return
nu = spack.url.substitute_version(u, self.url_version(version))
urls.append(nu)
urls.append(spack.url.substitute_version(u, self.url_version(version)))
# If no specific URL, use the default, class-level URL
sub_and_add(getattr(self, "url", None))
@@ -1130,13 +1124,7 @@ def stage(self, stage):
@property
def env_path(self):
"""Return the build environment file path associated with staging."""
# Backward compatibility: Return the name of an existing log path;
# otherwise, return the current install env path name.
old_filename = os.path.join(self.stage.path, "spack-build.env")
if os.path.exists(old_filename):
return old_filename
else:
return os.path.join(self.stage.path, _spack_build_envfile)
return os.path.join(self.stage.path, _spack_build_envfile)
@property
def env_mods_path(self):
@@ -1167,13 +1155,6 @@ def install_env_path(self):
@property
def log_path(self):
"""Return the build log file path associated with staging."""
# Backward compatibility: Return the name of an existing log path.
for filename in ["spack-build.out", "spack-build.txt"]:
old_log = os.path.join(self.stage.path, filename)
if os.path.exists(old_log):
return old_log
# Otherwise, return the current log path name.
return os.path.join(self.stage.path, _spack_build_logfile)
@property
@@ -1186,15 +1167,15 @@ def phase_log_files(self):
@property
def install_log_path(self):
"""Return the build log file path on successful installation."""
"""Return the (compressed) build log file path on successful installation"""
# Backward compatibility: Return the name of an existing install log.
for filename in ["build.out", "build.txt"]:
for filename in [_spack_build_logfile, "build.out", "build.txt"]:
old_log = os.path.join(self.metadata_dir, filename)
if os.path.exists(old_log):
return old_log
# Otherwise, return the current install log path name.
return os.path.join(self.metadata_dir, _spack_build_logfile)
return os.path.join(self.metadata_dir, _spack_build_logfile + ".gz")
@property
def configure_args_path(self):
@@ -1412,13 +1393,9 @@ def download_instr(self):
(str): default manual download instructions
"""
required = (
"Manual download is required for {0}. ".format(self.spec.name)
if self.manual_download
else ""
)
return "{0}Refer to {1} for download instructions.".format(
required, self.spec.package.homepage
f"Manual download is required for {self.spec.name}. " if self.manual_download else ""
)
return f"{required}Refer to {self.homepage} for download instructions."
def do_fetch(self, mirror_only=False):
"""
@@ -2091,15 +2068,6 @@ def unit_test_check(self):
"""
return True
@property
def build_log_path(self):
"""
Return the expected (or current) build log file path. The path points
to the staging build file until the software is successfully installed,
when it points to the file in the installation directory.
"""
return self.install_log_path if self.spec.installed else self.log_path
@classmethod
def inject_flags(cls: Type[Pb], name: str, flags: Iterable[str]) -> FLAG_HANDLER_RETURN_TYPE:
"""
@@ -2383,15 +2351,14 @@ def format_doc(cls, **kwargs):
return results.getvalue()
@property
def all_urls(self):
def all_urls(self) -> List[str]:
"""A list of all URLs in a package.
Check both class-level and version-specific URLs.
Returns:
list: a list of URLs
Returns a list of URLs
"""
urls = []
urls: List[str] = []
if hasattr(self, "url") and self.url:
urls.append(self.url)
@@ -2404,7 +2371,9 @@ def all_urls(self):
urls.append(args["url"])
return urls
def fetch_remote_versions(self, concurrency=None):
def fetch_remote_versions(
self, concurrency: Optional[int] = None
) -> Dict[StandardVersion, str]:
"""Find remote versions of this package.
Uses ``list_url`` and any other URLs listed in the package file.
@@ -2493,14 +2462,21 @@ def flatten_dependencies(spec, flat_dir):
dep_files.merge(flat_dir + "/" + name)
def possible_dependencies(*pkg_or_spec, **kwargs):
def possible_dependencies(
*pkg_or_spec: Union[str, spack.spec.Spec, typing.Type[PackageBase]],
transitive: bool = True,
expand_virtuals: bool = True,
depflag: dt.DepFlag = dt.ALL,
missing: Optional[dict] = None,
virtuals: Optional[set] = None,
) -> Dict[str, Set[str]]:
"""Get the possible dependencies of a number of packages.
See ``PackageBase.possible_dependencies`` for details.
"""
packages = []
for pos in pkg_or_spec:
if isinstance(pos, PackageMeta):
if isinstance(pos, PackageMeta) and issubclass(pos, PackageBase):
packages.append(pos)
continue
@@ -2513,9 +2489,16 @@ def possible_dependencies(*pkg_or_spec, **kwargs):
else:
packages.append(pos.package_class)
visited = {}
visited: Dict[str, Set[str]] = {}
for pkg in packages:
pkg.possible_dependencies(visited=visited, **kwargs)
pkg.possible_dependencies(
visited=visited,
transitive=transitive,
expand_virtuals=expand_virtuals,
depflag=depflag,
missing=missing,
virtuals=virtuals,
)
return visited
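A hedged usage sketch of this module-level wrapper; per the annotation above it returns a Dict[str, Set[str]] (the package name is an assumption):

deps = possible_dependencies("mpileaks", transitive=True, expand_virtuals=False)
for name, immediate in sorted(deps.items()):
    print(f"{name}: {sorted(immediate)}")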
@@ -2563,3 +2546,7 @@ class DependencyConflictError(spack.error.SpackError):
def __init__(self, conflict):
super().__init__("%s conflicts with another file in the flattened directory." % (conflict))
class ManualDownloadRequiredError(InvalidPackageOpError):
"""Raised when attempting an invalid operation on a package that requires a manual download."""

View File

@@ -9,9 +9,9 @@
import os.path
import pathlib
import sys
from typing import Any, Dict, Optional, Tuple, Type
import llnl.util.filesystem
import llnl.util.lang
from llnl.url import allowed_archive
import spack
@@ -25,15 +25,16 @@
from spack.util.executable import which, which_string
def apply_patch(stage, patch_path, level=1, working_dir="."):
def apply_patch(
stage: "spack.stage.Stage", patch_path: str, level: int = 1, working_dir: str = "."
) -> None:
"""Apply the patch at patch_path to code in the stage.
Args:
stage (spack.stage.Stage): stage with code that will be patched
patch_path (str): filesystem location for the patch to apply
level (int or None): patch level (default 1)
working_dir (str): relative path *within* the stage to change to
(default '.')
stage: stage with code that will be patched
patch_path: filesystem location for the patch to apply
level: patch level
working_dir: relative path *within* the stage to change to
"""
git_utils_path = os.environ.get("PATH", "")
if sys.platform == "win32":
@@ -58,16 +59,24 @@ def apply_patch(stage, patch_path, level=1, working_dir="."):
class Patch:
"""Base class for patches.
Arguments:
pkg (str): the package that owns the patch
The owning package is not necessarily the package to apply the patch
to -- in the case where a dependent package patches its dependency,
it is the dependent's fullname.
"""
def __init__(self, pkg, path_or_url, level, working_dir):
sha256: str
def __init__(
self, pkg: "spack.package_base.PackageBase", path_or_url: str, level: int, working_dir: str
) -> None:
"""Initialize a new Patch instance.
Args:
pkg: the package that owns the patch
path_or_url: the relative path or URL to a patch file
level: patch level
working_dir: relative path *within* the stage to change to
"""
# validate level (must be an integer >= 0)
if not isinstance(level, int) or not level >= 0:
raise ValueError("Patch level needs to be a non-negative integer.")
@@ -75,27 +84,28 @@ def __init__(self, pkg, path_or_url, level, working_dir):
# Attributes shared by all patch subclasses
self.owner = pkg.fullname
self.path_or_url = path_or_url # needed for debug output
self.path = None # must be set before apply()
self.path: Optional[str] = None # must be set before apply()
self.level = level
self.working_dir = working_dir
def apply(self, stage: "spack.stage.Stage"):
def apply(self, stage: "spack.stage.Stage") -> None:
"""Apply a patch to source in a stage.
Arguments:
stage (spack.stage.Stage): stage where source code lives
Args:
stage: stage where source code lives
"""
if not self.path or not os.path.isfile(self.path):
raise NoSuchPatchError(f"No such patch: {self.path}")
apply_patch(stage, self.path, self.level, self.working_dir)
@property
def stage(self):
return None
# TODO: Use TypedDict once Spack supports Python 3.8+ only
def to_dict(self) -> Dict[str, Any]:
"""Dictionary representation of the patch.
def to_dict(self):
"""Partial dictionary -- subclases should add to this."""
Returns:
A dictionary representation.
"""
return {
"owner": self.owner,
"sha256": self.sha256,
@@ -103,31 +113,55 @@ def to_dict(self):
"working_dir": self.working_dir,
}
def __eq__(self, other):
def __eq__(self, other: object) -> bool:
"""Equality check.
Args:
other: another patch
Returns:
True if both patches have the same checksum, else False
"""
if not isinstance(other, Patch):
return NotImplemented
return self.sha256 == other.sha256
def __hash__(self):
def __hash__(self) -> int:
"""Unique hash.
Returns:
A unique hash based on the sha256.
"""
return hash(self.sha256)
class FilePatch(Patch):
"""Describes a patch that is retrieved from a file in the repository.
"""Describes a patch that is retrieved from a file in the repository."""
Arguments:
pkg (str): the class object for the package that owns the patch
relative_path (str): path to patch, relative to the repository
directory for a package.
level (int): level to pass to patch command
working_dir (str): path within the source directory where patch
should be applied
"""
_sha256: Optional[str] = None
def __init__(self, pkg, relative_path, level, working_dir, ordering_key=None):
def __init__(
self,
pkg: "spack.package_base.PackageBase",
relative_path: str,
level: int,
working_dir: str,
ordering_key: Optional[Tuple[str, int]] = None,
) -> None:
"""Initialize a new FilePatch instance.
Args:
pkg: the class object for the package that owns the patch
relative_path: path to patch, relative to the repository directory for a package.
level: level to pass to patch command
working_dir: path within the source directory where patch should be applied
ordering_key: key used to ensure patches are applied in a consistent order
"""
self.relative_path = relative_path
# patches may be defined by relative paths to parent classes
# search mro to look for the file
abs_path = None
abs_path: Optional[str] = None
# At different times we call FilePatch on instances and classes
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
for cls in inspect.getmro(pkg_cls):
@@ -150,50 +184,90 @@ def __init__(self, pkg, relative_path, level, working_dir, ordering_key=None):
super().__init__(pkg, abs_path, level, working_dir)
self.path = abs_path
self._sha256 = None
self.ordering_key = ordering_key
@property
def sha256(self):
if self._sha256 is None:
def sha256(self) -> str:
"""Get the patch checksum.
Returns:
The sha256 of the patch file.
"""
if self._sha256 is None and self.path is not None:
self._sha256 = checksum(hashlib.sha256, self.path)
assert isinstance(self._sha256, str)
return self._sha256
def to_dict(self):
return llnl.util.lang.union_dicts(super().to_dict(), {"relative_path": self.relative_path})
@sha256.setter
def sha256(self, value: str) -> None:
"""Set the patch checksum.
Args:
value: the sha256
"""
self._sha256 = value
def to_dict(self) -> Dict[str, Any]:
"""Dictionary representation of the patch.
Returns:
A dictionary representation.
"""
data = super().to_dict()
data["relative_path"] = self.relative_path
return data
class UrlPatch(Patch):
"""Describes a patch that is retrieved from a URL.
"""Describes a patch that is retrieved from a URL."""
Arguments:
pkg (str): the package that owns the patch
url (str): URL where the patch can be fetched
level (int): level to pass to patch command
working_dir (str): path within the source directory where patch
should be applied
"""
def __init__(
self,
pkg: "spack.package_base.PackageBase",
url: str,
level: int = 1,
*,
working_dir: str = ".",
sha256: str, # This is required for UrlPatch
ordering_key: Optional[Tuple[str, int]] = None,
archive_sha256: Optional[str] = None,
) -> None:
"""Initialize a new UrlPatch instance.
def __init__(self, pkg, url, level=1, working_dir=".", ordering_key=None, **kwargs):
Arguments:
pkg: the package that owns the patch
url: URL where the patch can be fetched
level: level to pass to patch command
working_dir: path within the source directory where patch should be applied
ordering_key: key used to ensure patches are applied in a consistent order
sha256: sha256 sum of the patch, used to verify the patch
archive_sha256: sha256 sum of the *archive*, if the patch is compressed
(only required for compressed URL patches)
"""
super().__init__(pkg, url, level, working_dir)
self.url = url
self._stage = None
self._stage: Optional["spack.stage.Stage"] = None
self.ordering_key = ordering_key
self.archive_sha256 = kwargs.get("archive_sha256")
if allowed_archive(self.url) and not self.archive_sha256:
if allowed_archive(self.url) and not archive_sha256:
raise PatchDirectiveError(
"Compressed patches require 'archive_sha256' "
"and patch 'sha256' attributes: %s" % self.url
)
self.archive_sha256 = archive_sha256
self.sha256 = kwargs.get("sha256")
if not self.sha256:
if not sha256:
raise PatchDirectiveError("URL patches require a sha256 checksum")
self.sha256 = sha256
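Note the bare * in the new signature: sha256 becomes required and keyword-only, which is what lets the constructor validate it up front. A minimal sketch of that calling convention (function name and checks are invented for illustration):

from typing import Optional

def make_url_patch(url: str, level: int = 1, *, sha256: str, archive_sha256: Optional[str] = None) -> dict:
    # Compressed patches need a second checksum for the archive itself.
    if url.endswith((".gz", ".zip", ".xz")) and not archive_sha256:
        raise ValueError("compressed patches require archive_sha256")
    if not sha256:
        raise ValueError("URL patches require a sha256 checksum")
    return {"url": url, "level": level, "sha256": sha256, "archive_sha256": archive_sha256}

p = make_url_patch("https://example.com/fix.patch", sha256="deadbeef")
# make_url_patch("https://example.com/fix.patch", "deadbeef") would raise TypeError:
# sha256 can only be passed by keyword.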
def apply(self, stage: "spack.stage.Stage"):
def apply(self, stage: "spack.stage.Stage") -> None:
"""Apply a patch to source in a stage.
Args:
stage: stage where source code lives
"""
assert self.stage.expanded, "Stage must be expanded before applying patches"
# Get the patch file.
@@ -204,15 +278,20 @@ def apply(self, stage: "spack.stage.Stage"):
return super().apply(stage)
@property
def stage(self):
def stage(self) -> "spack.stage.Stage":
"""The stage in which to download (and unpack) the URL patch.
Returns:
The stage object.
"""
if self._stage:
return self._stage
fetch_digest = self.archive_sha256 or self.sha256
# Two checksums, one for compressed file, one for its contents
if self.archive_sha256:
fetcher = fs.FetchAndVerifyExpandedFile(
if self.archive_sha256 and self.sha256:
fetcher: fs.FetchStrategy = fs.FetchAndVerifyExpandedFile(
self.url, archive_sha256=self.archive_sha256, expanded_sha256=self.sha256
)
else:
@@ -231,7 +310,12 @@ def stage(self):
)
return self._stage
def to_dict(self):
def to_dict(self) -> Dict[str, Any]:
"""Dictionary representation of the patch.
Returns:
A dictionary representation.
"""
data = super().to_dict()
data["url"] = self.url
if self.archive_sha256:
@@ -239,8 +323,21 @@ def to_dict(self):
return data
def from_dict(dictionary, repository=None):
"""Create a patch from json dictionary."""
def from_dict(
dictionary: Dict[str, Any], repository: Optional["spack.repo.RepoPath"] = None
) -> Patch:
"""Create a patch from json dictionary.
Args:
dictionary: dictionary representation of a patch
repository: repository containing package
Returns:
A patch object.
Raises:
ValueError: If *owner* or *url*/*relative_path* are missing in the dictionary.
"""
repository = repository or spack.repo.PATH
owner = dictionary.get("owner")
if "owner" not in dictionary:
@@ -252,7 +349,7 @@ def from_dict(dictionary, repository=None):
pkg_cls,
dictionary["url"],
dictionary["level"],
dictionary["working_dir"],
working_dir=dictionary["working_dir"],
sha256=dictionary["sha256"],
archive_sha256=dictionary.get("archive_sha256"),
)
@@ -267,7 +364,7 @@ def from_dict(dictionary, repository=None):
# TODO: handle this more gracefully.
sha256 = dictionary["sha256"]
checker = Checker(sha256)
if not checker.check(patch.path):
if patch.path and not checker.check(patch.path):
raise fs.ChecksumError(
"sha256 checksum failed for %s" % patch.path,
"Expected %s but got %s " % (sha256, checker.sum)
@@ -295,10 +392,17 @@ class PatchCache:
namespace2.package2:
<patch json>
... etc. ...
"""
def __init__(self, repository, data=None):
def __init__(
self, repository: "spack.repo.RepoPath", data: Optional[Dict[str, Any]] = None
) -> None:
"""Initialize a new PatchCache instance.
Args:
repository: repository containing package
data: nested dictionary of patches
"""
if data is None:
self.index = {}
else:
@@ -309,21 +413,39 @@ def __init__(self, repository, data=None):
self.repository = repository
@classmethod
def from_json(cls, stream, repository):
def from_json(cls, stream: Any, repository: "spack.repo.RepoPath") -> "PatchCache":
"""Initialize a new PatchCache instance from JSON.
Args:
stream: stream of data
repository: repository containing package
Returns:
A new PatchCache instance.
"""
return PatchCache(repository=repository, data=sjson.load(stream))
def to_json(self, stream):
def to_json(self, stream: Any) -> None:
"""Dump a JSON representation to a stream.
Args:
stream: stream of data
"""
sjson.dump({"patches": self.index}, stream)
def patch_for_package(self, sha256: str, pkg):
def patch_for_package(self, sha256: str, pkg: "spack.package_base.PackageBase") -> Patch:
"""Look up a patch in the index and build a patch object for it.
Arguments:
sha256: sha256 hash to look up
pkg (spack.package_base.PackageBase): Package object to get patch for.
We build patch objects lazily because building them requires that
we have information about the package's location in its repo."""
we have information about the package's location in its repo.
Args:
sha256: sha256 hash to look up
pkg: Package object to get patch for.
Returns:
The patch object.
"""
sha_index = self.index.get(sha256)
if not sha_index:
raise PatchLookupError(
@@ -346,7 +468,12 @@ def patch_for_package(self, sha256: str, pkg):
patch_dict["sha256"] = sha256
return from_dict(patch_dict, repository=self.repository)
def update_package(self, pkg_fullname):
def update_package(self, pkg_fullname: str) -> None:
"""Update the patch cache.
Args:
pkg_fullname: package to update.
"""
# remove this package from any patch entries that reference it.
empty = []
for sha256, package_to_patch in self.index.items():
@@ -372,14 +499,29 @@ def update_package(self, pkg_fullname):
p2p = self.index.setdefault(sha256, {})
p2p.update(package_to_patch)
def update(self, other):
"""Update this cache with the contents of another."""
def update(self, other: "PatchCache") -> None:
"""Update this cache with the contents of another.
Args:
other: another patch cache to merge
"""
for sha256, package_to_patch in other.index.items():
p2p = self.index.setdefault(sha256, {})
p2p.update(package_to_patch)
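update() merges two nested {sha256: {package: patch_dict}} indexes. A tiny runnable sketch of that merge shape:

def merge_index(dst: dict, src: dict) -> None:
    for sha256, pkg_to_patch in src.items():
        # setdefault creates the inner mapping on first sight of a checksum.
        dst.setdefault(sha256, {}).update(pkg_to_patch)

a = {"abc": {"ns.foo": {"level": 1}}}
merge_index(a, {"abc": {"ns.bar": {"level": 0}}, "def": {"ns.baz": {}}})
assert set(a["abc"]) == {"ns.foo", "ns.bar"} and "def" in a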
@staticmethod
def _index_patches(pkg_class, repository):
def _index_patches(
pkg_class: Type["spack.package_base.PackageBase"], repository: "spack.repo.RepoPath"
) -> Dict[Any, Any]:
"""Patch index for a specific patch.
Args:
pkg_class: package object to get patches for
repository: repository containing the package
Returns:
The patch index for that package.
"""
index = {}
# Add patches from the class


@@ -3,6 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.util.path
def get_projection(projections, spec):
"""
@@ -11,7 +13,7 @@ def get_projection(projections, spec):
all_projection = None
for spec_like, projection in projections.items():
if spec.satisfies(spec_like):
return projection
return spack.util.path.substitute_path_variables(projection)
elif spec_like == "all":
all_projection = projection
all_projection = spack.util.path.substitute_path_variables(projection)
return all_projection
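The behavioral change in this hunk is that a matched projection is passed through spack.util.path.substitute_path_variables before being returned. A rough standalone analogue of that expansion step (string.Template stands in for Spack's own substitution; the variables are invented):

import string

def substitute(projection: str, **variables: str) -> str:
    # Expand $-style variables, leaving unknown ones untouched.
    return string.Template(projection).safe_substitute(**variables)

print(substitute("$user/{name}-{version}", user="alice"))  # alice/{name}-{version}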


@@ -7,6 +7,7 @@
import os
import re
from collections import OrderedDict
from typing import List, Optional
import macholib.mach_o
import macholib.MachO
@@ -47,7 +48,7 @@ def __init__(self, file_path, root_path):
@memoized
def _patchelf():
def _patchelf() -> Optional[executable.Executable]:
"""Return the full path to the patchelf binary, if available, else None."""
import spack.bootstrap
@@ -55,9 +56,7 @@ def _patchelf():
return None
with spack.bootstrap.ensure_bootstrap_configuration():
patchelf = spack.bootstrap.ensure_patchelf_in_path_or_raise()
return patchelf.path
return spack.bootstrap.ensure_patchelf_in_path_or_raise()
def _elf_rpaths_for(path):
@@ -340,31 +339,34 @@ def macholib_get_paths(cur_path):
return (rpaths, deps, ident)
def _set_elf_rpaths(target, rpaths):
"""Replace the original RPATH of the target with the paths passed
as arguments.
def _set_elf_rpaths_and_interpreter(
target: str, rpaths: List[str], interpreter: Optional[str] = None
) -> Optional[str]:
"""Replace the original RPATH of the target with the paths passed as arguments.
Args:
target: target executable. Must be an ELF object.
rpaths: paths to be set in the RPATH
interpreter: optionally set the interpreter
Returns:
A string concatenating the stdout and stderr of the call
to ``patchelf`` if it was invoked
A string concatenating the stdout and stderr of the call to ``patchelf`` if it was invoked
"""
# Join the paths using ':' as a separator
rpaths_str = ":".join(rpaths)
patchelf, output = executable.Executable(_patchelf()), None
try:
# TODO: error handling is not great here?
# TODO: revisit the use of --force-rpath as it might be conditional
# TODO: if we want to support setting RUNPATH from binary packages
patchelf_args = ["--force-rpath", "--set-rpath", rpaths_str, target]
output = patchelf(*patchelf_args, output=str, error=str)
args = ["--force-rpath", "--set-rpath", rpaths_str]
if interpreter:
args.extend(["--set-interpreter", interpreter])
args.append(target)
return _patchelf()(*args, output=str, error=str)
except executable.ProcessError as e:
msg = "patchelf --force-rpath --set-rpath {0} failed with error {1}"
tty.warn(msg.format(target, e))
return output
tty.warn(str(e))
return None
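To make the new interpreter handling concrete, this is how the argument list is assembled before invoking patchelf (the paths below are invented for illustration):

rpaths = ["/opt/spack/lib", "$ORIGIN/../lib"]
interpreter = "/lib64/ld-linux-x86-64.so.2"  # may be None when only rpaths change

args = ["--force-rpath", "--set-rpath", ":".join(rpaths)]
if interpreter:
    args.extend(["--set-interpreter", interpreter])
args.append("/path/to/binary")
# args is what _patchelf()(*args, output=str, error=str) receives.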
def needs_binary_relocation(m_type, m_subtype):
@@ -501,10 +503,12 @@ def new_relocate_elf_binaries(binaries, prefix_to_prefix):
for path in binaries:
try:
elf.replace_rpath_in_place_or_raise(path, prefix_to_prefix)
except elf.ElfDynamicSectionUpdateFailed as e:
# Fall back to the old `patchelf --set-rpath` method.
_set_elf_rpaths(path, e.new.decode("utf-8").split(":"))
elf.substitute_rpath_and_pt_interp_in_place_or_raise(path, prefix_to_prefix)
except elf.ElfCStringUpdatesFailed as e:
# Fall back to `patchelf --set-rpath ... --set-interpreter ...`
rpaths = e.rpath.new_value.decode("utf-8").split(":") if e.rpath else []
interpreter = e.pt_interp.new_value.decode("utf-8") if e.pt_interp else None
_set_elf_rpaths_and_interpreter(path, rpaths=rpaths, interpreter=interpreter)
def relocate_elf_binaries(
@@ -546,10 +550,10 @@ def relocate_elf_binaries(
new_rpaths = _make_relative(new_binary, new_root, new_norm_rpaths)
# check to see if relative rpaths are changed before rewriting
if sorted(new_rpaths) != sorted(orig_rpaths):
_set_elf_rpaths(new_binary, new_rpaths)
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
else:
new_rpaths = _transform_rpaths(orig_rpaths, orig_root, new_prefixes)
_set_elf_rpaths(new_binary, new_rpaths)
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
def make_link_relative(new_links, orig_links):
@@ -596,7 +600,7 @@ def make_elf_binaries_relative(new_binaries, orig_binaries, orig_layout_root):
orig_rpaths = _elf_rpaths_for(new_binary)
if orig_rpaths:
new_rpaths = _make_relative(orig_binary, orig_layout_root, orig_rpaths)
_set_elf_rpaths(new_binary, new_rpaths)
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
def warn_if_link_cant_be_relocated(link, target):


@@ -25,7 +25,7 @@
import traceback
import types
import uuid
from typing import Any, Dict, List, Tuple, Union
from typing import Any, Dict, List, Set, Tuple, Union
import llnl.path
import llnl.util.filesystem as fs
@@ -490,7 +490,7 @@ def read(self, stream):
self.index = spack.tag.TagIndex.from_json(stream, self.repository)
def update(self, pkg_fullname):
self.index.update_package(pkg_fullname)
self.index.update_package(pkg_fullname.split(".")[-1])
def write(self, stream):
self.index.to_json(stream)
@@ -746,19 +746,17 @@ def all_package_paths(self):
for name in self.all_package_names():
yield self.package_path(name)
def packages_with_tags(self, *tags, full=False):
"""Returns a list of packages matching any of the tags in input.
def packages_with_tags(self, *tags: str, full: bool = False) -> Set[str]:
"""Returns a set of packages matching any of the tags in input.
Args:
full: if True the package names in the output are fully-qualified
"""
r = set()
for repo in self.repos:
current = repo.packages_with_tags(*tags)
if full:
current = [f"{repo.namespace}.{x}" for x in current]
r |= set(current)
return sorted(r)
return {
f"{repo.namespace}.{pkg}" if full else pkg
for repo in self.repos
for pkg in repo.packages_with_tags(*tags)
}
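A minimal stand-in for the set comprehension adopted above: one pass over all repos, qualifying names only when full=True (repo data is invented):

repos = [("builtin", {"hdf5", "mpich"}), ("local", {"mytool"})]

def packages_with_tags(full: bool = False) -> set:
    return {f"{ns}.{pkg}" if full else pkg for ns, pkgs in repos for pkg in pkgs}

assert "builtin.hdf5" in packages_with_tags(full=True)
# Callers needing a stable order must now sort the returned set themselves.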
def all_package_classes(self):
for name in self.all_package_names():
@@ -1169,15 +1167,10 @@ def all_package_paths(self):
for name in self.all_package_names():
yield self.package_path(name)
def packages_with_tags(self, *tags):
def packages_with_tags(self, *tags: str) -> Set[str]:
v = set(self.all_package_names())
index = self.tag_index
for t in tags:
t = t.lower()
v &= set(index[t])
return sorted(v)
v.intersection_update(*(self.tag_index[tag.lower()] for tag in tags))
return v
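The rewrite relies on set.intersection_update accepting several iterables at once: start from all package names and intersect with each tag's member set in a single call (data below is invented):

all_names = {"hdf5", "mpich", "openmpi"}
tag_index = {"mpi": {"mpich", "openmpi"}, "io": {"hdf5", "mpich"}}

v = set(all_names)
v.intersection_update(*(tag_index[t.lower()] for t in ("MPI", "io")))
assert v == {"mpich"}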
def all_package_classes(self):
"""Iterator over all package *classes* in the repository.


@@ -6,6 +6,7 @@
import collections
import contextlib
import functools
import gzip
import os
import time
import traceback
@@ -190,9 +191,13 @@ def on_success(self, pkg, kwargs, package_record):
def fetch_log(self, pkg):
try:
with open(pkg.build_log_path, "r", encoding="utf-8") as stream:
return "".join(stream.readlines())
except Exception:
if os.path.exists(pkg.install_log_path):
stream = gzip.open(pkg.install_log_path, "rt")
else:
stream = open(pkg.log_path)
with stream as f:
return f.read()
except OSError:
return f"Cannot open log for {pkg.spec.cshort_spec}"
def extract_package_from_signature(self, instance, *args, **kwargs):


@@ -6,7 +6,6 @@
import warnings
import llnl.util.lang
import llnl.util.tty
# jsonschema is imported lazily as it is heavy to import
@@ -62,25 +61,3 @@ def _deprecated_properties(validator, deprecated, instance, schema):
Validator = llnl.util.lang.Singleton(_make_validator)
spec_list_schema = {
"type": "array",
"default": [],
"items": {
"anyOf": [
{
"type": "object",
"additionalProperties": False,
"properties": {
"matrix": {
"type": "array",
"items": {"type": "array", "items": {"type": "string"}},
},
"exclude": {"type": "array", "items": {"type": "string"}},
},
},
{"type": "string"},
{"type": "null"},
]
},
}
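For context, the kind of data this schema accepts mixes plain spec strings, nulls, and matrix objects; a small invented example:

spec_list = [
    "zlib@1.3",
    None,  # the schema explicitly allows nulls
    {"matrix": [["hdf5", "mpich"], ["%gcc", "%clang"]], "exclude": ["hdf5%clang"]},
]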


@@ -3,16 +3,17 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for bootstrap.yaml configuration file."""
from typing import Any, Dict
#: Schema of a single source
_source_schema = {
_source_schema: Dict[str, Any] = {
"type": "object",
"properties": {"name": {"type": "string"}, "metadata": {"type": "string"}},
"additionalProperties": False,
"required": ["name", "metadata"],
}
properties = {
properties: Dict[str, Any] = {
"bootstrap": {
"type": "object",
"properties": {


@@ -6,27 +6,31 @@
"""Schema for a buildcache spec.yaml file
.. literalinclude:: _spack_root/lib/spack/spack/schema/buildcache_spec.py
:lines: 13-
:lines: 15-
"""
from typing import Any, Dict
import spack.schema.spec
properties: Dict[str, Any] = {
# `buildinfo` is no longer needed as of Spack 0.21
"buildinfo": {"type": "object"},
"spec": {
"type": "object",
"additionalProperties": True,
"items": spack.schema.spec.properties,
},
"binary_cache_checksum": {
"type": "object",
"properties": {"hash_algorithm": {"type": "string"}, "hash": {"type": "string"}},
},
"buildcache_layout_version": {"type": "number"},
}
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack buildcache specfile schema",
"type": "object",
"additionalProperties": False,
"properties": {
# `buildinfo` is no longer needed as of Spack 0.21
"buildinfo": {"type": "object"},
"spec": {
"type": "object",
"additionalProperties": True,
"items": spack.schema.spec.properties,
},
"binary_cache_checksum": {
"type": "object",
"properties": {"hash_algorithm": {"type": "string"}, "hash": {"type": "string"}},
},
"buildcache_layout_version": {"type": "number"},
},
"properties": properties,
}
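This hoisting of the inline dict into a module-level properties mapping repeats across the schema files below; the payoff is that other schemas can merge those dicts instead of duplicating them. A minimal sketch of the convention (union_dicts here is a stand-in for llnl.util.lang.union_dicts):

from typing import Any, Dict

config_properties: Dict[str, Any] = {"config": {"type": "object"}}
mirrors_properties: Dict[str, Any] = {"mirrors": {"type": "object"}}

def union_dicts(*dicts: Dict[str, Any]) -> Dict[str, Any]:
    merged: Dict[str, Any] = {}
    for d in dicts:
        merged.update(d)
    return merged

schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": union_dicts(config_properties, mirrors_properties),
}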


@@ -2,16 +2,15 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for cdash.yaml configuration file.
.. literalinclude:: ../spack/schema/cdash.py
:lines: 13-
"""
from typing import Any, Dict
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"cdash": {
"type": "object",
"additionalProperties": False,


@@ -2,12 +2,12 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for gitlab-ci.yaml configuration file.
.. literalinclude:: ../spack/schema/ci.py
:lines: 13-
:lines: 16-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
@@ -164,7 +164,7 @@
}
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"ci": {
"oneOf": [
# TODO: Replace with core-shared-properties in Spack 0.23


@@ -2,16 +2,17 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for compilers.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/compilers.py
:lines: 13-
:lines: 15-
"""
from typing import Any, Dict
import spack.schema.environment
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"compilers": {
"type": "array",
"items": {


@@ -2,14 +2,14 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for concretizer.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/concretizer.py
:lines: 13-
:lines: 12-
"""
from typing import Any, Dict
properties = {
properties: Dict[str, Any] = {
"concretizer": {
"type": "object",
"additionalProperties": False,


@@ -5,15 +5,16 @@
"""Schema for config.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/config.py
:lines: 13-
:lines: 17-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
import spack.schema.projections
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"config": {
"type": "object",
"default": {},


@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for the 'container' subsection of Spack environments."""
from typing import Any, Dict
_stages_from_dockerhub = {
"type": "object",
@@ -85,4 +86,4 @@
},
}
properties = {"container": container_schema}
properties: Dict[str, Any] = {"container": container_schema}


@@ -11,112 +11,115 @@
This does not specify a configuration - it is an input format
that is consumed and transformed into Spack DB records.
"""
from typing import Any, Dict
properties: Dict[str, Any] = {
"_meta": {
"type": "object",
"additionalProperties": False,
"properties": {
"file-type": {"type": "string", "minLength": 1},
"cpe-version": {"type": "string", "minLength": 1},
"system-type": {"type": "string", "minLength": 1},
"schema-version": {"type": "string", "minLength": 1},
# Older schemas did not have "cpe-version", just the
# schema version; in that case it was just called "version"
"version": {"type": "string", "minLength": 1},
},
},
"compilers": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
"prefix": {"type": "string", "minLength": 1},
"executables": {
"type": "object",
"additionalProperties": False,
"properties": {
"cc": {"type": "string", "minLength": 1},
"cxx": {"type": "string", "minLength": 1},
"fc": {"type": "string", "minLength": 1},
},
},
"arch": {
"type": "object",
"required": ["os", "target"],
"additionalProperties": False,
"properties": {
"os": {"type": "string", "minLength": 1},
"target": {"type": "string", "minLength": 1},
},
},
},
},
},
"specs": {
"type": "array",
"items": {
"type": "object",
"required": ["name", "version", "arch", "compiler", "prefix", "hash"],
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
"arch": {
"type": "object",
"required": ["platform", "platform_os", "target"],
"additionalProperties": False,
"properties": {
"platform": {"type": "string", "minLength": 1},
"platform_os": {"type": "string", "minLength": 1},
"target": {
"type": "object",
"additionalProperties": False,
"required": ["name"],
"properties": {"name": {"type": "string", "minLength": 1}},
},
},
},
"compiler": {
"type": "object",
"required": ["name", "version"],
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
},
},
"dependencies": {
"type": "object",
"patternProperties": {
"\\w[\\w-]*": {
"type": "object",
"required": ["hash"],
"additionalProperties": False,
"properties": {
"hash": {"type": "string", "minLength": 1},
"type": {
"type": "array",
"items": {"type": "string", "minLength": 1},
},
},
}
},
},
"prefix": {"type": "string", "minLength": 1},
"rpm": {"type": "string", "minLength": 1},
"hash": {"type": "string", "minLength": 1},
"parameters": {"type": "object"},
},
},
},
}
schema = {
"$schema": "http://json-schema.org/schema#",
"title": "CPE manifest schema",
"type": "object",
"additionalProperties": False,
"properties": {
"_meta": {
"type": "object",
"additionalProperties": False,
"properties": {
"file-type": {"type": "string", "minLength": 1},
"cpe-version": {"type": "string", "minLength": 1},
"system-type": {"type": "string", "minLength": 1},
"schema-version": {"type": "string", "minLength": 1},
# Older schemas did not have "cpe-version", just the
# schema version; in that case it was just called "version"
"version": {"type": "string", "minLength": 1},
},
},
"compilers": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
"prefix": {"type": "string", "minLength": 1},
"executables": {
"type": "object",
"additionalProperties": False,
"properties": {
"cc": {"type": "string", "minLength": 1},
"cxx": {"type": "string", "minLength": 1},
"fc": {"type": "string", "minLength": 1},
},
},
"arch": {
"type": "object",
"required": ["os", "target"],
"additionalProperties": False,
"properties": {
"os": {"type": "string", "minLength": 1},
"target": {"type": "string", "minLength": 1},
},
},
},
},
},
"specs": {
"type": "array",
"items": {
"type": "object",
"required": ["name", "version", "arch", "compiler", "prefix", "hash"],
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
"arch": {
"type": "object",
"required": ["platform", "platform_os", "target"],
"additioanlProperties": False,
"properties": {
"platform": {"type": "string", "minLength": 1},
"platform_os": {"type": "string", "minLength": 1},
"target": {
"type": "object",
"additionalProperties": False,
"required": ["name"],
"properties": {"name": {"type": "string", "minLength": 1}},
},
},
},
"compiler": {
"type": "object",
"required": ["name", "version"],
"additionalProperties": False,
"properties": {
"name": {"type": "string", "minLength": 1},
"version": {"type": "string", "minLength": 1},
},
},
"dependencies": {
"type": "object",
"patternProperties": {
"\\w[\\w-]*": {
"type": "object",
"required": ["hash"],
"additionalProperties": False,
"properties": {
"hash": {"type": "string", "minLength": 1},
"type": {
"type": "array",
"items": {"type": "string", "minLength": 1},
},
},
}
},
},
"prefix": {"type": "string", "minLength": 1},
"rpm": {"type": "string", "minLength": 1},
"hash": {"type": "string", "minLength": 1},
"parameters": {"type": "object"},
},
},
},
},
"properties": properties,
}


@@ -6,12 +6,41 @@
"""Schema for database index.json file
.. literalinclude:: _spack_root/lib/spack/spack/schema/database_index.py
:lines: 36-
:lines: 17-
"""
from typing import Any, Dict
import spack.schema.spec
# spack.schema.spec.properties
properties: Dict[str, Any] = {
"database": {
"type": "object",
"required": ["installs", "version"],
"additionalProperties": False,
"properties": {
"installs": {
"type": "object",
"patternProperties": {
r"^[\w\d]{32}$": {
"type": "object",
"properties": {
"spec": spack.schema.spec.properties,
"path": {"oneOf": [{"type": "string"}, {"type": "null"}]},
"installed": {"type": "boolean"},
"ref_count": {"type": "integer", "minimum": 0},
"explicit": {"type": "boolean"},
"installation_time": {"type": "number"},
},
}
},
},
"version": {"type": "string"},
},
}
}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
@@ -19,30 +48,5 @@
"type": "object",
"required": ["database"],
"additionalProperties": False,
"properties": {
"database": {
"type": "object",
"required": ["installs", "version"],
"additionalProperties": False,
"properties": {
"installs": {
"type": "object",
"patternProperties": {
r"^[\w\d]{32}$": {
"type": "object",
"properties": {
"spec": spack.schema.spec.properties,
"path": {"oneOf": [{"type": "string"}, {"type": "null"}]},
"installed": {"type": "boolean"},
"ref_count": {"type": "integer", "minimum": 0},
"explicit": {"type": "boolean"},
"installation_time": {"type": "number"},
},
}
},
},
"version": {"type": "string"},
},
}
},
"properties": properties,
}


@@ -6,20 +6,21 @@
"""Schema for definitions
.. literalinclude:: _spack_root/lib/spack/spack/schema/definitions.py
:lines: 13-
:lines: 16-
"""
from typing import Any, Dict
import spack.schema
from .spec_list import spec_list_schema
#: Properties for inclusion in other schemas
properties = {
properties: Dict[str, Any] = {
"definitions": {
"type": "array",
"default": [],
"items": {
"type": "object",
"properties": {"when": {"type": "string"}},
"patternProperties": {r"^(?!when$)\w*": spack.schema.spec_list_schema},
"patternProperties": {r"^(?!when$)\w*": spec_list_schema},
},
}
}


@@ -2,9 +2,9 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Any, Dict
properties = {
properties: Dict[str, Any] = {
"develop": {
"type": "object",
"default": {},


@@ -6,74 +6,46 @@
"""Schema for env.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/env.py
:lines: 36-
:lines: 19-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
import spack.schema.gitlab_ci # DEPRECATED
import spack.schema.merged
import spack.schema.projections
from .spec_list import spec_list_schema
#: Top level key in a manifest file
TOP_LEVEL_KEY = "spack"
projections_scheme = spack.schema.projections.properties["projections"]
properties: Dict[str, Any] = {
"spack": {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": union_dicts(
# Include deprecated "gitlab-ci" section
spack.schema.gitlab_ci.properties,
# merged configuration scope schemas
spack.schema.merged.properties,
# extra environment schema properties
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spec_list_schema,
},
),
}
}
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack environment file schema",
"type": "object",
"additionalProperties": False,
"properties": {
"spack": {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": union_dicts(
# Include deprecated "gitlab-ci" section
spack.schema.gitlab_ci.properties,
# merged configuration scope schemas
spack.schema.merged.properties,
# extra environment schema properties
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spack.schema.spec_list_schema,
"view": {
"anyOf": [
{"type": "boolean"},
{"type": "string"},
{
"type": "object",
"patternProperties": {
r"\w+": {
"required": ["root"],
"additionalProperties": False,
"properties": {
"root": {"type": "string"},
"link": {
"type": "string",
"pattern": "(roots|all|run)",
},
"link_type": {"type": "string"},
"select": {
"type": "array",
"items": {"type": "string"},
},
"exclude": {
"type": "array",
"items": {"type": "string"},
},
"projections": projections_scheme,
},
}
},
},
]
},
},
),
}
},
"properties": properties,
}

View File

@@ -6,6 +6,7 @@
schemas.
"""
import collections.abc
from typing import Any, Dict
array_of_strings_or_num = {
"type": "array",
@@ -18,7 +19,7 @@
"patternProperties": {r"\w[\w-]*": {"anyOf": [{"type": "string"}, {"type": "number"}]}},
}
definition = {
definition: Dict[str, Any] = {
"type": "object",
"default": {},
"additionalProperties": False,

Some files were not shown because too many files have changed in this diff.