Compare commits

...

301 Commits

Author SHA1 Message Date
John W. Parent
bf14b424bb proj: correct CMake arg for shared build with proj older than 7.0.0 (#43089)
* proj: correct CMake arg for shared build with proj older than 7.0.0

* Actually use new CMake arg

* Update var/spack/repos/builtin/packages/proj/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-03-09 00:40:27 -07:00
Arne Becker
14209a86a6 perl-bio-eutilities and deps: new packages (#42869)
This adds Spack packages for these Perl distributions:
- Bio::DB::EUtilities and its dependencies:
- Bio::ASN1::EntrezGene
- Bio::Cluster
- Bio::Variation
2024-03-08 10:42:26 -08:00
Arne Becker
b7d9900764 perl-test-warn: update version, dependencies (#42901)
* perl-test-warn: new package
  Adds Test::Warn
* Revive older version
* perl-test-warn: fix URL and deprecate old version
2024-03-08 10:40:20 -08:00
Arne Becker
bc155e7b90 perl-starman and deps: new packages (#42958)
Adds Starman and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Starman
2024-03-08 10:36:20 -08:00
Massimiliano Culpo
65f9ba345f gnupg: add v2.4.5, remove deprecated versions (#43090)
Also updates dependencies:
  libassuan: add v2.5.7
  libgcrypt: deprecate end of life versions
2024-03-08 10:20:55 -08:00
Dave Keeshan
ca49bc5652 verible: Add revision v0.0-3607-g46de0f64 (#43086) 2024-03-08 10:18:39 -08:00
snehring
b84b85a7e0 visit: link against hip when built with vtk-m+rocm (#42452) 2024-03-08 19:09:35 +01:00
John W. Parent
016cdba16f Curl package: update Windows support (#42752)
* Properly specify Curl builder interface for static vs shared curl
  with NMake system to ensure all built curls export expected
  symbols.
* Symlink the curl library build artifact to a more idiomatic name for
  FindCurl.cmake implementations and other NMake consumers.
2024-03-08 10:08:12 -08:00
Jonas Eschle
4806e6549f Add package zfit (#42667)
* Add package zfit (WIP)

* Add package zfit

* Add package zfit

* add maintainer

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

* enh: add extras, 0.18.1

* fix: add default

* fix:  typo

* fix:  typo

* chore: cleanup arguments

* chore: cleanup arguments

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* chore: cleanup arguments

* fix:  typo

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-03-08 11:04:35 -07:00
Adam J. Stewart
c14b277150 py-fiona: add v1.9.6 (#43099) 2024-03-08 10:32:15 -07:00
John W. Parent
919025d9f3 VTK package: use CMake config provided by PROJ (#43088)
PROJ can install its own config file: VTK should prefer that over manual detection.
2024-03-08 10:32:00 -07:00
Adam J. Stewart
52f57c90eb Retiring as PythonPackage maintainer (#43091) 2024-03-08 18:29:24 +01:00
fgava90
ee1fa3e50c Deprecating py-pylint@2.3 as it cannot build with python@3.8: (#43096)
* Deprecating py-pylint@2.3 as it cannot build with python@3.8:

* Style fix

* Removed versions because can't build with python@3.7

---------

Co-authored-by: Gava, Francesco <francesco.gava@mclaren.com>
2024-03-08 07:43:46 -07:00
jmlapre
772928241b py-cheap-repr: new package (#42944)
* py_cheap_repr: add initial package.py

* Update var/spack/repos/builtin/packages/py-cheap-repr/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-cheap-repr: use pypi link instead

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-03-08 06:45:25 -06:00
Martin Lang
7440bb4c36 bigdft-psolver: new versions 1.9.3, 1.9.4 (#42644) 2024-03-08 07:22:15 +01:00
Tom Payerle
c464866deb sfcgal: add v1.5.1, and restrict versions of cgal dependency (#42278)
Added new versions @1.5.1 and @1.4.1 (sfcgal moved from github to gitlab)

Placed restrictions on what versions of cgal are supported for different
sfcgal versions.  These restrictions are based on what was found in the
version history at https://gitlab.com/sfcgal/SFCGAL/-/blob/v1.5.1/NEWS?ref_type=tags
as well as the CMakeLists.txt for different versions.

@1.4 and @1.5 seem to require a specific version of cgal based on CMakeLists.txt.

Earlier versions (@1.3.8:1.3.10) claim to support cgal@4.3: (but the Spack recipe
did not work until @4.7, so sticking with that as the minimum).  CMakeLists.txt
suggests they support cgal@5 as well, but the version history suggests otherwise.
2024-03-08 07:08:34 +01:00
Tom Payerle
799a8a5090 sfcgal: add custom libs property (#42276) 2024-03-08 07:06:38 +01:00
m-shunji
c218ee50e9 libint: add link option for fujitsu compiler (#42233)
Co-authored-by: inada-yoshie <inada.yoshie@fujitsu.com>
2024-03-08 07:03:18 +01:00
Teague Sterling
8ff7a20320 duckdb: add v0.10.0, overhaul of package recipe (#38922)
* Expand the duckdb package so that the version number (required for external extensions to work) is correctly pulled from git, and add variants for the built-in extensions at build time. This also changes the build system from CMakePackage to a Makefile-based package (as advised by upstream).

* - Reorganized and cleaned up variants
 - Updated the patch to work for 0.10.0+
 - Removed (or made non-default) some unnecessary variants
 - Added ninja as the default generator
 - Set up some shared library dependencies
2024-03-08 06:32:29 +01:00
Dave Keeshan
e3fe6bc0f7 Add 5.022 and cleanup how autoconf is done (#43085) 2024-03-07 18:29:27 -07:00
Sreenivasa Murthy Kolam
c6fcb1068f Enable tensorflow-2.11 support for ROCm (#38520)
* enable tensorflow-2.11 support for ROCm

* add latest sha for mesa and limit the patches to older versions; similar
changes in #37910 to make the gitlab-ci pass

* address review comments
2024-03-07 15:23:40 -07:00
Weiqun Zhang
54ac3e72ed amrex: add v24.03 (#43083) 2024-03-07 13:59:29 -08:00
Matthew Thompson
274fbebc4c Update GFE packages (#43082)
* Update GFE packages
* Add pflogger 1.13.1
2024-03-07 13:47:33 -08:00
Tim Haines
d40eb19918 Dyninst: add v13.0.0 (#43068) 2024-03-07 13:43:18 -08:00
dependabot[bot]
31de670bd2 build(deps): bump dorny/paths-filter from 3.0.1 to 3.0.2 (#43000)
Bumps [dorny/paths-filter](https://github.com/dorny/paths-filter) from 3.0.1 to 3.0.2.
- [Release notes](https://github.com/dorny/paths-filter/releases)
- [Changelog](https://github.com/dorny/paths-filter/blob/master/CHANGELOG.md)
- [Commits](ebc4d7e9eb...de90cc6fb3)

---
updated-dependencies:
- dependency-name: dorny/paths-filter
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-07 11:54:24 -08:00
Harmen Stoppels
6c0961549b zlib-ng: 2.1.6 (#43060) 2024-03-07 12:35:36 -07:00
Tim Fuller
c090bc5ebe Drop optional dependencies of Spack (#43081)
Remove dependency on `importlib_metadata` and `pkg_resources`, which can be problematic if the version in PYTHONPATH is incompatible with the interpreter Spack is running under.
2024-03-07 17:52:49 +00:00
renjithravindrankannath
bca4d37d76 hip: update patch PARAMETERS_MIN_ALIGNMENT for 6.0 (#42834) 2024-03-07 18:31:23 +01:00
Sreenivasa Murthy Kolam
9b484d2eea fix build issue with rocm-openmp-extras for 5.4.3 release (#43046)
* fix build issue with rocm-openmp-extras for 5.4.3 release
https://github.com/spack/spack/issues/36930
* fix style errors
2024-03-07 10:23:07 -07:00
Adam J. Stewart
a57b0e1e2d py-lightning: add v2.2.1 (#43042) 2024-03-07 09:22:52 -08:00
Luc Berger
e3cb3b29d9 Kokkos Kernels: adding missing TPLs and pre-conditions (#43043)
* Kokkos Kernels: adding missing TPLs and pre-conditions
  Adding variants and dependencies for rocBLAS and rocSPARSE.
  Also adding a "when=" close to the TPL variants that prevents
  enabling the TPLs in versions of the library when it was not
  yet available.
* Kokkos Kernels: remove comment for better format
* Kokkos Kernels: fix issue with unpacking wrong number of args
  After changing the tpls dictionary we need to update the unpacking
  logic to catch the right number of outputs out of it!
* Kokkos Kernels: updating doc string for tpls var and using f-string
  Improving comment a bit and switching to f-string for more readability.
  When set up to do more testing, will try to use f-strings in the CMake
  options generation part of the package.
* Style change
2024-03-07 09:00:10 -08:00
Mark W. Krentel
ac48ecd375 meson: add version 1.3.2 (#43036) 2024-03-07 09:56:20 -07:00
Juan Miguel Carceller
0bb20d34db autotools: fix a typo in comment (#43080)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-03-07 16:53:14 +00:00
Eric Berquist
971fda5c33 sst-core: use ncurses for interactive sst-info (#43063)
* sst-core now effectively depends on ncurses

* use --with-curses

* sst-core: update comment about ncurses

* should have curses for build, link, and run
2024-03-07 10:45:47 -06:00
Mikael Simberg
dcc4423a9d pika: Add 0.23.0 (#43077) 2024-03-07 16:06:54 +01:00
runiq
82c380b563 Fix spack find bootstrapping docs (#43074)
Closes #43052.

Maybe moving the argument to the `find` subcommand is a good idea, but I
just wanted to get the docs fix out.

Co-authored-by: Patrice Peterson <patrice.peterson@itz.uni-halle.de>
2024-03-07 14:13:32 +01:00
Adam J. Stewart
8bcf6a31ae ML CI: variants are now required (#42851) 2024-03-07 11:42:13 +01:00
Vanessasaurus
ddd88e266a Automated deployment to update package flux-core 2024-03-07 (#43067) 2024-03-07 11:13:31 +01:00
Martin Aumüller
08c597d83e botan: add v3.3.0, v2.9.14 (#43047) 2024-03-06 09:29:04 -08:00
Martin Aumüller
bf5340755d embree: add v4.3.1 (#43048) 2024-03-06 09:25:22 -08:00
Satish Balay
f8e70a0c96 llvm: add 18.1.0 (#43049) 2024-03-06 17:36:03 +01:00
Tim Fuller
7e468aefd5 Allow loading extensions through python entry-points (#42370)
This PR adds the ability to load spack extensions through `importlib.metadata` entry 
points, in addition to the regular configuration variable.

It requires Python 3.8 or greater to be properly supported.
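
For illustration, here is a minimal sketch of how entry-point discovery with `importlib.metadata` generally works; the group name `spack.extensions` and the helper name are assumptions for this sketch, not necessarily what the PR uses:

```python
import sys
from importlib.metadata import entry_points


def discover_extensions(group="spack.extensions"):
    # Group name is illustrative; the real one is defined by the PR.
    if sys.version_info >= (3, 10):
        selected = entry_points(group=group)
    else:
        # On Python 3.8/3.9, entry_points() returns a mapping of group -> entry points.
        selected = entry_points().get(group, [])
    return {ep.name: ep.load() for ep in selected}
```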
2024-03-06 11:18:49 +01:00
John W. Parent
e685d04f84 netcdf-c package: Skip problematic post install on Windows (#43039)
#42878 adds a post install filter of the netCDFConfig.cmake file
that replaces a valid CMake target on Windows with an invalid one.
Don't do this replacement on Windows.
2024-03-05 21:44:36 -07:00
Adam J. Stewart
9d962f55b0 spack.patch: fix type hint circular import (#43041) 2024-03-05 17:26:44 -08:00
renjithravindrankannath
00d3066b97 migraphx: add v6.0.0 & v6.0.2 (#42918)
* Updates for migraphx 6.0.0 & 6.0.2
* Style check error and audit check error fix
* Adding patch for half-include-directory
* The parameter GPU_TARGETS is used from 5.7 in migraphx
* Adding rocmlir dependency in migraphx and 6.0 updates in rocmlir
* Applying upcoming changes to make CK JIT optional and enable
  compilation on Windows in order to build without ck dependency
2024-03-05 15:16:31 -08:00
Martin Lang
5ca0dcecb2 mpfr: missing dependency for version 4.0.1 (#43012)
* mpfr: missing dependency for version 4.0.1
  mpfr 4.0.1 (like 4.0.2) needs autoconf-archive, which provides the
  AX_PTHREAD macro
* autoconf-archive is also required for mpfr@4.0.0
2024-03-05 15:11:31 -08:00
Pranav Sivaraman
fa8fb7903b nvptx-tools: add v2023-09-13 (#40897)
New changes have been made to nvptx-tools that address dropping support
for sm_30 in later CUDA versions (12.0+).

Also refactor gcc to make nvptx-tools a dependency instead of a resource.

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-03-05 22:51:40 +01:00
Adam J. Stewart
f35ff441f2 spack.patch: add type hints (#42811)
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-03-05 13:19:43 -08:00
Arne Becker
9ea9ee05c8 perl-catalyst-view-json: new package (#42981)
Adds Catalyst::View::JSON
2024-03-05 12:04:38 -08:00
Arne Becker
24ddc49c1b perl-bsd-resource: new package (#42982)
Adds BSD::Resource
2024-03-05 12:03:52 -08:00
Arne Becker
2f4266161c perl-chart-gnuplot: new package (#42983)
Adds Chart::Gnuplot
2024-03-05 12:03:00 -08:00
Arne Becker
7bcb0fff7d perl-config-inifiles: new package (#42984)
Adds Config::IniFiles
2024-03-05 12:02:06 -08:00
Arne Becker
cd332c6370 perl-plack-middleware-assets and deps: new packages (#42990)
This adds Plack::Middleware::Assets and its missing deps:
- CSS::Minifier::XS
- JavaScript::Minifier::XS
- Test::DiagINC
2024-03-05 11:59:52 -08:00
Arne Becker
6daf9677f3 perl-plack-middleware-crossorigin: new package (#42991)
Adds Plack::Middleware::CrossOrigin
2024-03-05 11:59:25 -08:00
G-Ragghianti
cb6450977d Removing application of the ldconfig patch because it conflicts with (#43001)
gdrcopy version 2.4.1
2024-03-05 11:57:24 -08:00
Vanessasaurus
bf62ac0769 Automated deployment to update package flux-sched 2024-03-05 (#43005)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-03-05 11:53:29 -08:00
Sergey Kosukhin
0223fe746b serialbox: fix PYTHONPATH (#43034) 2024-03-05 11:48:18 -08:00
Arne Becker
12fba13441 perl-chi-driver-memcached: new package (#43011)
Adds CHI::Driver::Memcached.

Modified perl-chi to make the tests work. Testing modules in perl-chi
were not loaded when testing CHI::Driver::Memcached, so added the "run"
type to these.
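
As a sketch of what adding the "run" dependency type looks like in a Spack recipe (the class skeleton and dependency name below are placeholders, not the actual perl-chi content):

```python
from spack.package import *


class PerlChi(PerlPackage):
    """Illustrative skeleton only; versions and most dependencies elided."""

    # Including "run" in the type makes the testing module available at run
    # time, so dependents such as CHI::Driver::Memcached can load it during
    # their own build-time tests.
    depends_on("perl-test-class", type=("build", "run", "test"))  # hypothetical dep
```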
2024-03-05 11:46:57 -08:00
Arne Becker
0c44f5a140 perl-html-template: new package (#43017)
Adds HTML::Template
2024-03-05 11:45:14 -08:00
Arne Becker
f4853790c5 perl-catalyst-devel and deps: new packages (#43013)
This adds Catalyst::Devel and its missing dependencies:
- Catalyst::Plugin::ConfigLoader
- Catalyst::Plugin::Static::Simple
- File::ChangeNotify
2024-03-05 11:44:53 -08:00
Arne Becker
9ed2e396f4 perl-plack-middleware-deflater: new package (#43014)
Adds Plack::Middleware::Deflater
2024-03-05 11:43:39 -08:00
Arne Becker
3ee6fc937e perl-data-predicate: new package (#43015)
* perl-data-predicate: new package
Adds Data::Predicate
* Added description
2024-03-05 11:42:48 -08:00
Arne Becker
c9b6cc9a58 perl-exporter-auto: new package (#43016)
Adds Exporter::Auto
2024-03-05 11:41:46 -08:00
Arne Becker
58b394bcec perl-list-compare: new package (#43018)
Adds List::Compare
2024-03-05 11:39:05 -08:00
Arne Becker
4d89eeca9b perl-sereal and deps: new packages (#43022)
* perl-sereal and deps: new packages
This adds Sereal and its missing dependencies:
- Sereal::Encoder
- Sereal::Decoder
* Add missing files.
2024-03-05 11:38:24 -08:00
Arne Becker
bfc71e9dae perl-string-approx: new package (#43023)
Adds String::Approx
2024-03-05 11:35:04 -08:00
Arne Becker
f061dcda74 perl-string-numeric: new package (#43024)
Adds String::Numeric
2024-03-05 11:34:03 -08:00
Arne Becker
cc460894fd perl-test-file-contents: new package (#43025)
Adds Test::File::Contents
2024-03-05 11:32:50 -08:00
Arne Becker
5e09660e87 perl-test-mockobject and deps: new packages (#43026)
Adds Test::MockObject and its dependencies.
Installed OK with build-time tests. Added dependencies:
- UNIVERSAL::can
- UNIVERSAL::isa
2024-03-05 11:31:59 -08:00
Arne Becker
5a8efb3b14 perl-test-perl-critic: new package (#43027)
Adds Test::Perl::Critic
2024-03-05 11:30:12 -08:00
Arne Becker
99002027c4 perl-test-pod-coverage and deps: new packages (#43028)
Adds Test::Pod::Coverage and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Pod::Coverage
2024-03-05 11:28:28 -08:00
Arne Becker
a247879be3 perl-tie-ixhash: new package (#43029)
Adds Tie::IxHash
2024-03-05 11:25:59 -08:00
Sergey Kosukhin
7b46993fed eccodes: drop redundant build dependency (#43035) 2024-03-05 11:25:00 -08:00
Arne Becker
dd59f4ba34 perl-text-csv-xs: new package (#43030)
Adds Text::CSV_XS
2024-03-05 11:22:38 -08:00
Arne Becker
18ab14e659 perl-xml-hash-xs: new package (#43031)
Adds XML::Hash::XS
2024-03-05 11:20:00 -08:00
Massimiliano Culpo
28eb5e1bf6 archspec: add v0.2.3 (#43009) 2024-03-05 20:09:46 +01:00
Sergey Kosukhin
c658ddbfa3 icon: add new package (#43037) 2024-03-05 12:03:38 -07:00
Massimiliano Culpo
12963c894f libksba: add v1.6.6, remove deprecated versions (#43006) 2024-03-05 19:30:39 +01:00
Mark W. Krentel
61fa12508f hpcviewer: add version 2024.02 (#42997) 2024-03-05 10:15:32 -08:00
Martin Lang
daf6acef6e py-numpy: patch for AVX512 build flags on Intel Classic Compiler (#43020) 2024-03-05 17:27:05 +01:00
Alex Richert
d30621e787 dakota: make python dependency optional, add v6.19 (#42914)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-03-05 12:57:13 +01:00
Wouter Deconinck
dd4b365608 container: don't map develop to latest (#42952)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-03-05 10:47:04 +01:00
Massimiliano Culpo
157d47fc5a ASP-based solver: improve reusing nodes with gcc-runtime (#42408)
* ASP-based solver: improve reusing nodes with gcc-runtime

This PR skips emitting dependency constraints on "gcc-runtime",
for concrete specs that are considered for reuse.

Instead, an appropriate version of gcc-runtime is recomputed
considering also the concrete nodes from reused specs.

This ensures that root nodes in a DAG always have a runtime at a version
greater than or equal to that of their dependencies.

* Add unit-test for view with multiple runtimes
* Select latest version of runtimes in views
* Construct result keeping track of latest
* Keep ordering stable, just in case
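
A minimal sketch of the "select the latest runtime" idea in plain Python (the spec objects are assumed to expose `.name` and a comparable `.version`, as Spack specs do; this is not the solver's actual code):

```python
def latest_runtimes(specs, runtime_names=("gcc-runtime",)):
    """Keep only the newest instance of each runtime package among `specs`."""
    latest = {}
    for spec in specs:
        if spec.name in runtime_names:
            current = latest.get(spec.name)
            if current is None or spec.version > current.version:
                latest[spec.name] = spec
    return list(latest.values())
```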
2024-03-04 22:46:28 -08:00
Massimiliano Culpo
13daa1b692 libgpg-error: add v1.48 (#42872) 2024-03-05 05:56:50 +01:00
Mosè Giordano
f923e650f9 py-pythran: @:0.12.1 is incompatible with python@3.11: (#42994)
Ref: https://github.com/serge-sans-paille/pythran/issues/2101 and https://github.com/scipy/scipy/issues/18390.
2024-03-04 18:28:17 -07:00
Victoria Cherkas
1a1bbb8af2 metkit: add git url (#42867) 2024-03-04 21:15:03 +01:00
Victoria Cherkas
594fcc3c80 metkit: add additional eckit dependency constraint (#42871) 2024-03-04 21:14:09 +01:00
Tom Payerle
76ec19b26e hdf-eos2: support version @3.0, plus assorted fixes (#41782)
1) support for version @3.0
Unfortunately, the download seems to require registration now,
so using the manual_download mechanism for @3:

2) copying the hdf-eos5 patch from @vanderwb to enable
use of Spack compiler wrappers instead of h4cc

3) Patching an issue in the hdf-eos2 configure script.  The
script tests for the jpeg and libz libraries, succeeds and
appends HAVE_LIBJPEG=1, etc. to confdefs.h, and then aborts
because HAVE_LIBJPEG is not set in the running environment.

4) Add some LDFLAGS to the build environment.  Otherwise the
test script seems to fail to build due to the rpc dependence
in HDF4.
2024-03-04 21:08:19 +01:00
Arne Becker
00baaf868e perl-perl-critic-moose and deps: new packages (#42992)
Adds Perl::Critic::Moose and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Perl::Critic
- Pod::Parser
- Perl::Tidy
- PPI
- PPIx::QuoteLike
- List::SomeUtils
- PPIx::Regexp
- B::Keywords
- PPIx::Utils
- String::Format
- Pod::Spell
- Test::SubCalls
- Test::Object
- Lingua::EN::Inflect
- Hook::LexWrap
2024-03-04 12:54:14 -07:00
Harmen Stoppels
3b06347f65 repo.py: cleanup packages_with_tags (#42980) 2024-03-04 20:50:04 +01:00
Eric Berquist
5b9e207db2 bear: add up to version 3.1.3 (#42993) 2024-03-04 12:17:24 -07:00
psakievich
d6fd9017c4 Document new environment variable expansion in projections (#42963)
Adding docs and test for #42917

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-03-04 12:17:08 -07:00
Sreenivasa Murthy Kolam
913d79238e llvm-amdgpu: remove the openmp variant. (#42807)
Add rocm-openmp-extras package as a dependency for +openmp for rocm
2024-03-04 12:16:53 -07:00
Arne Becker
250038fa9b perl-dbix-class and deps: new packages (#42986)
Adds DBIx::Class and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Class::C3::Componentised
- Data::Dumper::Concise
- Config::Any
- Context::Preserve
- Class::Accessor::Grouped
- Module::Find
- SQL::Abstract::Classic
- Class::C3
- SQL::Abstract
- Algorithm::C3
2024-03-04 12:11:38 -07:00
Arne Becker
26c553fce7 perl-net-server-ss-prefork and deps: new packages (#42989)
Adds Net::Server::SS::PreFork and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Server::Starter
- Net::Server
- HTTP::Server::Simple
2024-03-04 12:11:14 -07:00
Arne Becker
e24c242fb7 perl-gzip-faster: new package (#42988)
Adds Gzip::Faster
2024-03-04 12:10:57 -07:00
Arne Becker
ca14ce2629 perl-catalyst-action-rest: new package (#42960)
Adds Catalyst::Action::REST
2024-03-04 10:59:57 -08:00
Arne Becker
44f443946c perl-graphviz and deps: new packages (#42987)
Adds GraphViz and its dependencies.
Installed OK with build-time tests. Added dependencies:
- XML::XPath
2024-03-04 11:58:12 -07:00
Arne Becker
6e6bc89bda perl-catalyst-action-renderview and deps: new packages (#42961)
Adds Catalyst::Action::RenderView and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Catalyst::Action::RenderView
2024-03-04 10:57:50 -08:00
Arne Becker
8714ea6652 perl-catalyst-component-instancepercontext: new package (#42962)
Adds Catalyst::Component::InstancePerContext
2024-03-04 10:56:08 -08:00
Sinan
df92f0a7d4 imagemagick: add v7.0.9:7.1.1-29 (#42968)
* package/imagemagick: add new version, improve
* confirmed that the build fails when libsm is missing on linux

---------

Co-authored-by: Sinan81 <Sinan@world>
2024-03-04 10:52:56 -08:00
Alec Scott
d24b91157c restic: add v0.16.4 (#42956) 2024-03-04 19:52:20 +01:00
Alec Scott
1a0f77388c direnv: add v2.34.0 (#42955) 2024-03-04 19:51:47 +01:00
Harmen Stoppels
34571d4ad6 linux-pam: add missing libtirpc dep (#42976) 2024-03-04 19:29:25 +01:00
Rocco Meli
a574f40732 add namd 3.0b6 (#42979) 2024-03-04 19:06:03 +01:00
Arne Becker
d4ffe244af perl-chi and deps: new packages (#42959)
* perl-chi and deps: new packages

Adds CHI and its dependencies.
Installed OK with build-time tests. Added dependencies:
- CHI

* Add license
2024-03-04 08:59:33 -08:00
AMD Toolchain Support
e08e66ad89 AOCL: add v4.2.0 (#42920)
* AOCL: add v4.2.0
   Co-authored-by: Phil Tooley <phil.tooley@amd.com> and
                vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
* Review comments for spack community PR #42920

---------

Co-authored-by: Phil Tooley <phil.tooley@amd.com> and vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2024-03-04 08:43:27 -08:00
Vanessasaurus
0543710258 flux-core: flux-security dependency with build/link (#42985) 2024-03-04 16:48:12 +01:00
Harmen Stoppels
5d994e48d5 modules: allow autoload: run, like in environment views (#42743) 2024-03-04 08:49:45 +01:00
Harmen Stoppels
d1fa23e9c6 versions: fix typing problems (#42903)
Fix the type declaration of VersionList.versions.

Fix further problems exposed by that fix.
2024-03-04 08:38:54 +01:00
Isaac Corley
f1db8b7871 torchgeo v0.5.2 (#42967) 2024-03-03 13:57:39 -07:00
Axel Huebl
c5cca54c27 Fix mgard: OpenMP on AppleClang (#42933)
macOS AppleClang does not provide OpenMP by default with XCode.
Use LLVM's OpenMP to fix compile errors of mgard with OpenMP (default).
2024-03-03 08:57:24 +01:00
Arne Becker
a9c1648db8 perl-log-dispatch-filerotate and dep: new package (#42965)
Adds Log::Dispatch::FileRotate and its dependency:
- Log::Dispatch

Installed OK, build-time tests ran successfully.
2024-03-03 00:40:27 -07:00
Arne Becker
3bd911377e perl-catalyst-plugin-cache: new package (#42964)
Adds Catalyst::Plugin::Cache
2024-03-02 23:34:26 -07:00
otsukay
fcb2f7d3aa Remove unnecessary if statements, which are harmful since +blas+lapack variants have been removed. (#42936)
Co-authored-by: Yuichi Otsuka <otsukay@riken.jp>
2024-03-02 22:13:47 -07:00
Arne Becker
a8a9e0160a perl-test-time-hires: new package (#42957)
Adds Test::Time::HiRes
2024-03-02 18:10:08 -08:00
jfavre
9ca6aaeafd libcatalyst: add fortran & python variants (#42941)
* Update package.py

Adding two variants 'fortran' and 'python' to enable language wrappings

* Update package.py

remove extra space
2024-03-02 17:55:46 -08:00
Adam J. Stewart
aed5c39312 py-numpy: add v1.26.4 (#42515) 2024-03-02 21:56:43 +01:00
Chris Marsh
eb36bb2a8f netcdf-c package: correct library names (#42878)
An incorrect hdf5 library name is added to the pkg-config and CMake config
files when netcdf-c is built with CMake.
2024-03-02 11:05:59 -08:00
dependabot[bot]
8dcf860888 build(deps): bump codecov/codecov-action from 4.0.2 to 4.1.0 (#42860)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.0.2 to 4.1.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](0cfda1dd0a...54bcd8715e)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-02 10:39:09 -08:00
Arne Becker
a5b7cb6e6f perl-test-xml-simple and deps: new packages (#42873)
* perl-test-longstring: New package

Adds Test::LongString

* perl-test-xml-simple: new package

- Adds perl-test-xml-simple
2024-03-02 10:37:35 -08:00
Arne Becker
34c0bfefa6 perl-test-xml and deps: new packages (#42870)
- Adds perl-test-xml and dependency:
- Adds perl-xml-semanticdiff
2024-03-02 10:35:56 -08:00
Elliott Slaughter
ea5db048f3 legion: Add 23.09.0 and 23.12.0, remove control_replication. (#42915)
* legion: Add 23.09.0 and 23.12.0, remove control_replication.

The branch control_replication has been merged to master and should no
longer be used.

* flecsi: Switch to Legion master branch.

Legion control_replication has been merged to master.

* Fix Legion 23.09.0 and 23.12.0 build for ROCm 6.
2024-03-02 10:32:44 -08:00
Adam J. Stewart
e68a17f2c6 Bazel: fix patching of 4.2.4 (#42938) 2024-03-02 10:29:18 -08:00
Brian Vanderwende
4af9ec3d8a Add ncvis package and add option to wxwidgets (#38204)
* Add ncvis and opengl option for wxwidgets

* Style fixes for ncvis

* Replace in with satisfies for opengl constraint

Co-authored-by: Alec Scott <alec@bcs.sh>

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2024-03-02 10:26:36 -08:00
Zachary Newell
eb90e2c894 Added NCCL version 2.20.3-1 (#42951)
Added NCCL version 2.20.3-1 to the package.py. I tested compiling it and running nccl-tests on Ubuntu 22.04.
2024-03-02 06:59:17 -06:00
Mosè Giordano
763f444d63 py-numba: add tbb variant (#42930)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-03-01 02:03:39 -07:00
Chris Marsh
6614c4322d Seacas: fix patch hash (#42934) 2024-03-01 01:48:36 -07:00
Chris Marsh
983422facf libogg does not build a shared library with cmake (#42877)
* when built with cmake, libogg does not build a shared library by default. This resolves that

* spack style fixes

* Clean up imports

* enforce +pic when +shared
2024-02-29 21:12:27 -06:00
Ye Luo
d0bdd66238 Add Quantum ESPRESSO 7.3.1 (#42927) 2024-02-29 20:08:25 -07:00
Tim Fuller
3a50d32299 Show extension commands with spack -h (#41726)
* Execute `args.help` after setting main options so that extension commands will show with `spack -h`

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
2024-02-29 16:51:42 -08:00
psakievich
50ee3624c0 Support environment variable expansion inside module projections (#42917) 2024-02-29 16:49:37 -08:00
afzpatel
2840cb54b5 initial commit to fix ck gpu targets cmake arg (#42924) 2024-02-29 15:48:07 -06:00
eugeneswalker
5c482d0d7e reduce size of e4s to deal with large rebuild artifact (#42884) 2024-02-29 13:44:01 -07:00
joscot-linaro
f3ad990b55 linaro-forge: added 23.1.2 version (#42922) 2024-02-29 12:28:03 -08:00
Victoria Cherkas
977603cd96 Update fdb package.py with libs (#42874)
* Update fdb package.py with libs
* Formatting
2024-02-29 12:23:37 -08:00
Wanlin Wang
1f919255fd Update riscv-gnu-toolchain package.py (#42893)
* Update package.py
  1. add one compiler type named 'musl'
  2. add a variant named 'multilib'
  3. add a variant named 'cmodel'
* Added one compiler type named 'musl'.
  Added a variant named 'multilib'.
  Added a variant named 'cmodel'.
  Added several versions.
* aarch64 is not supported.
2024-02-29 12:11:40 -08:00
Adam J. Stewart
5140a9b6a3 py-keras: add v3.0.5 (#42697) 2024-02-29 17:56:03 +01:00
Chris Marsh
1732ad0af4 vtk: Update proj dependency (#42797)
* Update proj dependency to enable newer proj usage

* Allow for any proj version
2024-02-29 06:07:21 -08:00
dependabot[bot]
e7510ae292 build(deps): bump docker/setup-buildx-action from 3.0.0 to 3.1.0 (#42883)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.0.0 to 3.1.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](f95db51fdd...0d103c3126)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-29 14:48:24 +01:00
Harmen Stoppels
0c224ba4a7 libevent: remove autotools build deps again (#42908)
The deps were added in #40945 to make it work on macOS 11, because the
old configure scripts only detect macOS 10. Apparently people reported the
autoreconf script caused issues, later fixed in #41057. However, also
with that fix, things are incorrect, cause people now report:

```
libtool: You should recreate aclocal.m4 with macros from libtool 2.4.7
libtool: and run autoconf again.
```

HOWEVER, all this is unnecessary, because the underlying issue was
already fixed long ago, it's just that it regressed at some point, but
it's back in place since #41205.
2024-02-29 09:57:21 +01:00
Terry Cojean
86b4a867ef ginkgo: add PAPI SDE support (#39425)
Signed-off-by: Terry Cojean <terry.cojean@kit.edu>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-29 06:04:34 +01:00
Arne Becker
6049e5f6eb perl-readonly-xs: new package (#42897)
This adds Readonly::XS. Since this module cannot be used by itself, the
Spack package comes with a test override. This anticipates that the perl
builder will one day have a generic standalone module usage test.
2024-02-28 14:27:22 -08:00
Arne Becker
0339690a59 perl-test-json, perl-json-any: New packages (#42896)
* perl-test-json: New package
  Adds Test::JSON
* Adds perl-json-any
2024-02-28 14:25:34 -08:00
Arne Becker
2bae1b7e00 perl-test-xpath: New package (#42895)
Adds Test::XPath
2024-02-28 14:23:41 -08:00
Arne Becker
ae5b605f99 perl-uri-find: New package (#42894)
Adds URI::Find
2024-02-28 14:22:19 -08:00
Arne Becker
35898d94d2 perl-net-cidr-lite: new package (#42898)
* perl-net-cidr-lite: new package
   Adds Net::CIDR::Lite
* Add license
2024-02-28 14:20:52 -08:00
Arne Becker
7e00bd5014 perl-mojolicious: new package (#42899)
Adds Mojolicious
2024-02-28 14:16:36 -08:00
Arne Becker
1f3aefb0a3 perl-cache-memcached and deps: new packages (#42911)
Adds Cache::Memcached and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Cache::Memcached
2024-02-28 14:02:38 -08:00
Harmen Stoppels
d4601d0e53 Unit tests: skip tests that intermittently fail on Windows (#42909) 2024-02-28 14:00:09 -08:00
Tom Payerle
935660e3d5 mysql: explicitly cast python command to str in _fix_dtrace_shebang() (#40781)
This should fix issue #40780

We explicitly cast self.spec["python"].command to str in the filter_file
call in _fix_dtrace_shebang to avoid the error
==> Error: TypeError: expected str, bytes or os.PathLike object, not Executable

Not sure why the error is appearing (is it only for specific python versions, etc?),
but the fix should be quite safe.
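
Roughly what such a fix looks like in a Spack package method (a sketch only: the regex and file name below are placeholders, and `filter_file` is Spack's helper from `llnl.util.filesystem`):

```python
from llnl.util.filesystem import filter_file


def _fix_dtrace_shebang(self):
    # Sketch: cast the Executable to str so the replacement passed to
    # filter_file is a plain string rather than an Executable object.
    python_cmd = str(self.spec["python"].command)
    filter_file("^#!.*python.*$", "#!{0}".format(python_cmd), "scripts/dtrace_prelink")  # placeholder path
```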
2024-02-28 13:00:51 -08:00
Alec Scott
17bfc41841 bison: remove unnecessary deps, add variant for colored output (#42209) 2024-02-28 12:32:40 -08:00
Arne Becker
49b38e3a78 perl-yaml-syck: New package (#42892)
Adds YAML::Syck
2024-02-28 12:26:09 -08:00
Arne Becker
66f078ff84 perl-catalyst-runtime and deps: new packages (#42886)
* perl-catalyst-runtime and deps: new packages
  This adds Perl Catalyst::Runtime and its missing dependencies.
  Adds:
  - perl-catalyst-runtime
  - perl-apache-logformat-compiler
  - perl-cgi-simple
  - perl-cgi-struct
  - perl-class-c3-adopt-next
  - perl-cookie-baker
  - perl-data-dump
  - perl-devel-stacktrace-ashtml
  - perl-filesys-notify-simple
  - perl-getopt-long-descriptive
  - perl-hash-multivalue
  - perl-http-body
  - perl-http-entity-parser
  - perl-http-headers-fast
  - perl-http-multipartparser
  - perl-moosex-emulate-class-accessor-fast
  - perl-moosex-getopt
  - perl-moosex-methodattributes
  - perl-moosex-role-parameterized
  - perl-path-class
  - perl-plack
  - perl-plack-middleware-fixmissingbodyinredirect
  - perl-plack-middleware-methodoverride
  - perl-plack-middleware-removeredundantbody
  - perl-plack-middleware-reverseproxy
  - perl-plack-test-externalserver
  - perl-posix-strftime-compiler
  - perl-stream-buffered
  - perl-string-rewriteprefix
  - perl-test-mocktime
  - perl-test-tcp
  - perl-test-time
  - perl-test-trap
  - perl-tree-simple
  - perl-tree-simple-visitorfactory
  - perl-uri-ws
  - perl-www-form-urlencoded
2024-02-28 12:17:45 -08:00
AMD Toolchain Support
304a63507a AOCC: add v4.2.0 (#42891)
Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2024-02-28 20:24:57 +01:00
Jonas Eschle
c7afc0eb5f Upgrade TensorFlow Probability with newer versions (#42673)
* enh: add newer versions

* enh: add newer versions

* format

* fix typo

* Update package.py

* make jax and TF optional dependencies

* style fix

* remove dependency

* remove old TFP version

* fix:  style
2024-02-28 12:29:23 -06:00
kwryankrattiger
57cde78c56 ParaView Release Candidate 5.12.0-RC3 (#42654) 2024-02-28 09:41:10 -08:00
Arne Becker
b01f6308d5 perl-json-xs and deps: new packages (#42904)
Adds JSON::XS and its deps:
- Canary::Stability
- Types::Serialiser
2024-02-28 09:30:46 -08:00
eugeneswalker
13709bb7b7 e4s: new packages: glvis, laghos (#42847)
* e4s: new packages: glvis, laghos

* gl: require: osmesa

* be explicit: glvis ^llvm so that llvm-amdgpu not chosen

* glvis fails on oneapi stack due to issue 42839
2024-02-28 09:26:53 -08:00
Harmen Stoppels
661ae1f230 versions: simplify list if union not disjoint (#42902)
Spack merges ranges and concrete versions if they have non-empty
intersection. That is not enough for adjacent version ranges.

This commit ensures that disjoint ranges in version lists are simplified
if their union is not disjoint:

```python
"@1.0:2.0,2.1,2.2:3,4:6" # simplifies to "@1.0:6"
```
2024-02-28 16:33:25 +01:00
Sinan
287e1039f5 package_py_systemd_python_improve (#42865)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2024-02-28 07:56:17 -06:00
Jonas Eschle
015dc4ee2e Add package zfit interface (#42666)
* Add package zfit interface

* add maintainer

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of jonas-eschle

* Update var/spack/repos/builtin/packages/py-zfit-interface/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-28 07:52:40 -06:00
Axel Huebl
46165982b1 C-Blosc2: Fuzzer Tests (#42881)
The fuzzer tests are a bit flaky and have linker issues on
clang. We should generally only build them when testing.
2024-02-27 21:08:33 -07:00
eugeneswalker
c9a111946e e4s oneapi: remove outdated package preferences (#42875) 2024-02-27 14:35:06 -08:00
renjithravindrankannath
62160021c1 Adding dependency of roctracer-dev and patch in miopen-hip (#42637) 2024-02-27 14:52:47 -07:00
Tom Payerle
3290e2c189 openexr: Add custom libs property (#42274)
Libraries for openexr are named libOpenEXR*.so, etc., so the default libs
handler in spec does not find them.

Add a custom libs property to address this.

Partial fix for #42273

Co-authored-by: payerle <payerle@users.noreply.github.com>
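
A sketch of what such a custom `libs` property typically looks like in a Spack recipe (skeleton only; the glob below is illustrative rather than the exact set of libraries openexr installs):

```python
from spack.package import *


class Openexr(CMakePackage):
    """Illustrative skeleton; real recipe content elided."""

    @property
    def libs(self):
        # The installed libraries are named libOpenEXR*.so rather than
        # libopenexr*, so point the finder at the capitalized names.
        return find_libraries("libOpenEXR*", root=self.prefix, recursive=True, shared=True)
```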
2024-02-27 10:45:29 +01:00
George Young
2a9fc3452a regtools: add new package (#42852)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-27 10:44:34 +01:00
YI Zeping
2ea8a2e6ae btop: add cmake version restriction (#42835)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-27 10:08:22 +01:00
Lydéric Debusschère
fe4a4ddcf9 bazel: allow offline build of major versions 5 and 6 (#41575)
* bazel: allow offline build of major versions 5 and 6; add variant download_data

* bazel: add maintainer LydDeb

* bazel: install offline only; remove variant download_data

* bazel: fix variable name: resource_dico --> resource_dictionary

* bazel: fix style

* bazel: fix the build of version 4

* bazel: add comment about resources

* bazel: access to resource stages with self.stage

* bazel: add except to solve AttributeError: 'Stage' object has no attribute 'resource'

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2024-02-27 03:03:25 -06:00
Juan Miguel Carceller
c45714fd3c delphes: use the same C++ standard as in ROOT (#42816)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-02-27 09:54:26 +01:00
Juan Miguel Carceller
523d12d9a8 garfieldpp: Add version 5.0 (#42817)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-02-27 09:54:03 +01:00
Howard Pritchard
5340e0184d Open MPI: adjust pmix dependency for 5.0.x (#42827)
For various reasons the pmix dependency of 5.0.2 had to be advanced to at least
pmix 4.2.4. 5.0.1 and 5.0.0 can also build with pmix 4.2.4 or newer.

related to #42651

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-02-27 09:49:29 +01:00
AMD Toolchain Support
bca7698138 openfoam: add mpfr search paths (#42779)
Co-authored-by: Branden Moore <branden.moore@amd.com>
2024-02-27 09:37:48 +01:00
Peter Scheibel
5c26ce5385 skip test which is causing spurious failures on Windows (#42832) 2024-02-27 09:36:10 +01:00
Eisuke Kawashima
02137dda17 eigenexa: add 2.7–2.12 (#38170) 2024-02-27 09:23:08 +01:00
eugeneswalker
4abac88895 e4s ci: use ubuntu 22.04 images (#42843) 2024-02-27 01:12:53 -07:00
stepanvanecek
79c2a55e00 gpuscout: new package (#42761)
Co-authored-by: Stepan Vanecek <stepan@Stepans-MBP.fritz.box>
2024-02-27 09:05:09 +01:00
Carsten Uphoff
71c169293c double-batched: add v0.5.0 (#42850)
Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-02-27 08:59:42 +01:00
Wouter Deconinck
bcc5ded205 dd4hep: new version 1.28 (#42846) 2024-02-27 08:50:01 +01:00
Kensuke WATANABE
379a5d8fa0 root: add dependent package required for build time tests (#42849) 2024-02-27 08:43:08 +01:00
Juan Miguel Carceller
d8c2782949 bdsim: use the same C++ standard as in ROOT, add a patch (#42031)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-02-27 08:34:42 +01:00
dependabot[bot]
6dde6ca887 build(deps): bump pytest from 8.0.1 to 8.0.2 in /lib/spack/docs (#42861)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.0.1 to 8.0.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.0.1...8.0.2)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-27 08:30:03 +01:00
downloadico
8f8c262fb3 picard: add version 3.1.1 (#42862) 2024-02-27 08:29:06 +01:00
afzpatel
93b8e771b6 rocm-gdb: add v6.0.2 (#42855) 2024-02-27 08:22:56 +01:00
Todd Gamblin
48088ee24a refactor: add type annotations and refactor solver conditions (#42081)
Refactoring `SpackSolverSetup` is a bit easier with type annotations, so I started
adding some. This adds annotations for the (many) instance variables on
`SpackSolverSetup` as well as a few other places.

This also refactors `condition()` to reduce redundancy and to allow
`_get_condition_id()` to be called independently of the larger condition
function.


Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-26 22:26:01 +00:00
Mikael Simberg
c7df258ca6 Update camp missing headers patch to be applied with all compilers (#42857) 2024-02-26 12:30:01 -05:00
Erik Heeren
b8e8fa2dcd py-find-libpython: 0.3.1 (#42853)
* py-find-libpython: 0.3.1

* py-find-libpython: sort versions
2024-02-26 10:07:51 -07:00
Adam J. Stewart
8e885f4eb2 ImageMagick: fewer dependencies on macOS (#42739) 2024-02-26 17:45:29 +01:00
Miranda Mundt
116308fa17 py-pyomo: add v6.7.1 (#42795)
* Update Pyomo spack package for 6.7.1 release

* Apply changes from @adamjstewart

* Update sphinx+Pyomo versions

* Whoops - typo
2024-02-26 10:14:36 -06:00
Adam J. Stewart
5eb4d858cb Update PyTorch ecosystem (#42819) 2024-02-25 19:20:25 -08:00
eugeneswalker
8dd5f36b68 e4s external rocm ci: use ubuntu 22 image with rocm 5.7.1 (#42842)
* e4s external rocm ci: use ubuntu 22 image with rocm 5.7.1

* comment out slate+rocm due to build error
2024-02-25 17:50:56 -08:00
eugeneswalker
e3ce3ab266 e4s ci: add py-mpi4py, py-numba (#42845) 2024-02-25 17:23:39 -08:00
Jeremy Fix
0618cb98d1 py-ipyvuetify: new package (#42836)
* py-ipyvuetify: new package

* Limit py-jupyter-packing version to 0.7.x

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix py-jupyterlab version and type

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix py-ipyvue version range to exclude 2

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* rm py-wheel, already considered for PythonPackage

* fix: pynpm only required for build, reorder dependencies as in the pyproject.toml

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-25 07:34:23 -06:00
Maciej Wójcik
3b4a27ce7b snakemake: new version with plugins (#42713)
* snakemake: add Snakemake 8 with dependencies

* snakemake: add missing description

* Whitespace

* Whitespace

* Whitespace

* Whitespace

* py-conda-inject: add constraint for Python

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-azure-batch: add constraint for Python

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-cluster-generic: add constraint for Python

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake: add upper bound for Python

* py-snakemake-executor-plugin-drmaa: specify dependency type

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-googlebatch: correct dependency version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-tes: correct dependency version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-storage-plugin-s3: reorder

* snakemake: remove newly added variants

* snakemake: remove newly added variants

* snakemake: remove newly added variant

* snakemake: update version

* snakemake: update version

* snakemake: whitespace

* py-snakemake-storage-plugin-s3: update version

* snakemake: use newer version

* snakemake: whitespace

* snakemake: update interfaces

* py-snakemake-storage-plugin-gcs: link issue

* snakemake: update versions

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-25 03:47:38 -07:00
Veselin Dobrev
07afbd5619 [laghos] Add a patch for MPI_Session (#42841) 2024-02-24 16:07:59 -07:00
Pranav Sivaraman
5d8cd207ec zoxide: new package (#42840)
* feat: zoxide package

* Apply suggestions from code review

Co-authored-by: Alec Scott <alec@bcs.sh>

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2024-02-24 14:53:08 -07:00
Adam J. Stewart
3990589b08 py-lightly: add v1.5.0 (#42820) 2024-02-24 12:27:53 -08:00
dependabot[bot]
38d821461c build(deps): bump codecov/codecov-action from 4.0.1 to 4.0.2 (#42831)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.0.1 to 4.0.2.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](e0b68c6749...0cfda1dd0a)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-24 12:26:30 -08:00
Adam J. Stewart
80b13a0059 py-pandas: add v2.2.1 (#42838) 2024-02-24 12:20:55 -08:00
Maciej Wójcik
ab101d33be py-azure-...: add new versions (#42742)
* py-azure-core: add new versions

* py-azure-identity: add new versions, flatten dependencies

* py-azure-storage-blob: add new versions

* py-msal: add new versions

* py-azure-...: black is terrible

* py-azure-storage-blob: correct dependency

* Reorder

* Reorder
2024-02-24 12:29:05 -06:00
Sinan
cc742126ef package/libspatialite: add conflict, new version (#42573)
* package/libspatialite: add conflict, new version

* depends on new version of freexl

* fix bug

* remove manual download stuff

* improve style

* first deprecate

* [@spackbot] updating style on behalf of Sinan81

* get rid of conflict, reorder deps

* remove manual download

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2024-02-24 09:29:46 -06:00
Stephen Hudson
a1f21436a4 libEnsemble: add v1.2.1 (#42828) 2024-02-24 09:27:44 -06:00
Alex Richert
95fdc92c1f Allow awscli-v2 to be installed without examples/ dir (#42773)
* Allow awscli-v2 to be installed without examples/ dir

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-24 09:25:59 -06:00
Alex Richert
6680c6b72e libjpeg-turbo: add v2.1.5, update recipe (#37963)
Co-authored-by: Alec Scott <hi@alecbcs.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-23 19:23:52 -07:00
Chris Marsh
74b6bf14b8 netcdf-cxx4: convert to CMake-based build (#42766)
The CMake-based build is anticipated to work in all cases where the
Autotools-based build did, and to address all prior issues with less
maintenance of the package. In detail:

* Fixes #42735 (CMake's find_package helps with linking to proper
  netcdf-c)
* Replaces older Autotools-based build
* All preexisting variants are handled
* Record hdf5 as an explicit dependency (was missing before)
* Add +tests option

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-02-23 15:37:33 -08:00
Vicente Bolea
7c315fc14b proj: apply stdint.h patch in version 8 (#42791)
* proj: apply stdint.h patch in version 8

* Update var/spack/repos/builtin/packages/proj/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 14:48:20 -07:00
John W. Parent
f51c9fc6c3 Windows path handling: change representation for paths with spaces (#42754)
Some builds on Windows break when encountering paths with spaces. This
reencodes some paths in Windows 8.3 filename format (when on Windows):
this serves as an equivalent identifier for the file, but in a form that
does not have spaces.

8.3 filenames are also truncated in length, which could be helpful, but
that is not the primary intended purpose of using this format.

Overall

* nmake/msbuild packages do this generally for the install prefix
* curl/perl require additional modifications (as written now, each package
  may require calls to `windows_sfn` to work when the Spack
  root/install/staging prefixes contain spaces)

Some items for follow-up:

* Spack itself does not create paths with spaces "on top" of whatever
  the user configures or where it is placed (e.g. the Spack root, the
  staging directory, etc.), so it might be possible to edit some of these
  paths once and avoid a proliferation of individual `windows_sfn`
  calls in individual packages.
* This approach may result in the insertion of 8.3-style paths into
  build artifacts (on Windows), handling this may require additional
  bookkeeping (e.g. when relocating).
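
For context, a minimal sketch of how an 8.3 short name can be obtained on Windows, using the Win32 `GetShortPathNameW` API through `ctypes`. This illustrates the mechanism and is not Spack's `windows_sfn` implementation; on volumes where 8.3 name generation is disabled, the returned path may be unchanged.

```python
import ctypes
from ctypes import wintypes

_kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
_kernel32.GetShortPathNameW.argtypes = [wintypes.LPCWSTR, wintypes.LPWSTR, wintypes.DWORD]
_kernel32.GetShortPathNameW.restype = wintypes.DWORD


def short_path_name(long_path: str) -> str:
    """Return the 8.3 (space-free) short form of an existing Windows path."""
    # First call asks for the required buffer size (in characters, incl. NUL).
    needed = _kernel32.GetShortPathNameW(long_path, None, 0)
    if needed == 0:
        raise ctypes.WinError(ctypes.get_last_error())
    buf = ctypes.create_unicode_buffer(needed)
    if _kernel32.GetShortPathNameW(long_path, buf, needed) == 0:
        raise ctypes.WinError(ctypes.get_last_error())
    return buf.value
```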
2024-02-23 13:30:11 -08:00
John W. Parent
3e713bb0fa vtk package: support vtk@9 on Windows (#42751) 2024-02-23 11:58:58 -08:00
Peter Scheibel
55bbb10984 Alert user to failed concretizations (#42655)
With this change an error message is emitted when the result of concretization 
is in an inconsistent state.
2024-02-23 20:15:25 +01:00
Simon Pintarelli
6e37f873f5 tiled-mm: add v2.3 (#42829) 2024-02-23 19:53:51 +01:00
Massimiliano Culpo
d49cbecf8c Cleanup spack.schema (#42815)
* Move spec_list into its own file, instead of __init__.py

* Remove spack.schema.spack

This module was introduced in #33960. It's almost an exact duplicate of
spack.schema.env, and is not used anywhere.

* Fix typo
2024-02-23 10:23:54 -08:00
Dr Marco Claudio De La Pierre
fe07645e3b Update/add packages in the Nextflow ecosystem (#42776)
Signed-off-by: Dr Marco Claudio De La Pierre <marco.delapierre@gmail.com>
2024-02-23 17:53:46 +01:00
Ben Morgan
c5b8d5c92a geant4: new version v11.2.1 (#42822) 2024-02-23 08:07:35 -05:00
Jeremy Fix
2278816cb3 py-jwcrypto: new package (#42783)
* adds the spack recipe for py-jwcrypto

* split long line to fix E501

* Specify versions for py-cryptography and py-typing-extensions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:58:11 -07:00
Jeremy Fix
4bd305b6d3 py-reacton: new package (#42794)
* adds the spack recipe for reacton python package

* Fix versions for ipywidgets and typing-extensions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:48:06 -07:00
Erik Heeren
a26e0ff999 py-find-libpython: new package (#42804)
* py-find-libpython: new package

* Update var/spack/repos/builtin/packages/py-find-libpython/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:42:58 -07:00
Jeremy Fix
4550fb83c0 py-ipyvue: new package (#42789)
* add spack recipe for ipyvue

* Specify version for ipywidgets

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:37:48 -07:00
Todd Gamblin
5b96c2e89f py-sympy: add version 1.12 (#42770) 2024-02-23 05:42:49 -06:00
Sinan
18c8406091 pdal: fix version range for patch (#42769)
* Update package.py

fix bug.

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:28:41 -06:00
Caetano Melone
bccefa14cb py-codespell: add package (#42694)
* py-codespell: add package

* setuptools-scm conflict

confirmed via https://github.com/codespell-project/codespell/issues/3365

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* reorder dependencies and versions

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:07:24 -06:00
Maciej Wójcik
20fc5a174a py-s3transfer, py-boto3, py-botocore: add new versions (#42741)
* py-s3transfer: add new versions

* py-boto3: add new versions

* py-botocore: add new versions

* py-boto3: correct version ranges
2024-02-23 04:18:45 -06:00
Alec Scott
3abcb408d1 py-ansible: add v2.16.3 (#42734)
* py-ansible: add v2.16.3

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Add specific python version requirements from setup.cfg

* Add additional ranges for py-setuptools

* Update var/spack/repos/builtin/packages/py-ansible/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 04:13:33 -06:00
Maciej Wójcik
a0d97d9294 py-argparse-dataclass: add new package (#42494)
* py-argparse-dataclass: add new package

* Remove obvious dependency
2024-02-23 04:11:21 -06:00
Massimiliano Culpo
0979a6a875 Remove dead code from Environment (#42818)
Environment.concretize_and_add is not used anywhere.
2024-02-23 10:48:07 +01:00
Massimiliano Culpo
98de8e3257 Fix wrong call to a function (#42814) 2024-02-23 06:37:22 +01:00
akimler
23b299c086 matio: add v1.5.26 (#42808) 2024-02-23 06:03:06 +01:00
Massimiliano Culpo
adcd3364c9 elpa: remove deprecated versions (#42802) 2024-02-23 06:02:06 +01:00
Tom Payerle
a2908fb9f7 qb3: add custom libs property (#42275) 2024-02-22 15:13:05 -07:00
John W. Parent
f514e72011 netcdf-c package: fix hdf5 linking on Windows (#42749) 2024-02-22 14:18:34 -07:00
Adam J. Stewart
b61d964eb8 PythonPackage: check purelib for libs/headers (#42602)
* PythonPackage: check purelib for libs/headers

* Update error messages too

* Fix functools.reduce argument order
2024-02-22 13:17:21 -08:00
John W. Parent
2066eda3cd Seacas package: add Windows support (#42692) 2024-02-22 13:28:33 -07:00
Alex Richert
d8b186a381 dakota: add boost components (#42659)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-22 13:03:13 -07:00
Adam J. Stewart
d258aec099 GDAL: add v3.8.4 (#42805) 2024-02-22 11:42:14 -08:00
Harmen Stoppels
3d1d5f755f oci: when base image uses Image Manifest Version 2, follow suit (#42777) 2024-02-22 16:33:56 +01:00
Thomas-Ulrich
90778873d1 tandem: update package (#42785) 2024-02-22 15:45:20 +01:00
AMD Toolchain Support
1af57e13fb elpa: fix support for patched version (#42803)
Co-authored-by: Ning Li <ning.li@amd.com>
2024-02-22 15:43:12 +01:00
AMD Toolchain Support
28d25affcc ELPA: Linking fixes for BLAS and OpenMP (#42747)
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
2024-02-22 15:21:00 +01:00
Martin Lang
3ebaf33915 libgd: fix INT_MAX not defined (#42104)
Compiling version 2.2.4 fails (on a Debian system with only a minimum set of packages installed) with an error because `INT_MAX` is undeclared:
```
   263    gd_gd2.c: In function '_gd2GetHeader':
>> 264    gd_gd2.c:212:54: error: 'INT_MAX' undeclared (first use in this function)
   265      212 |                 if (*ncx <= 0 || *ncy <= 0 || *ncx > INT_MAX / *ncy) {
   266          |                                                      ^~~~~~~
   267    gd_gd2.c:87:1: note: 'INT_MAX' is defined in header '<limits.h>'; did you forget to '#include <limits.h>'?
```
2024-02-22 03:23:59 -07:00
Alec Scott
e8d981b405 rust: add v1.76.0 (#42798) 2024-02-22 03:09:31 -07:00
Steven Hahn
02f222f6a3 google benchmark: Add variant with libpfm4 (#42620)
Signed-off-by: Steven Hahn <hahnse@ornl.gov>
2024-02-22 07:29:49 +01:00
James Beal
8345c6fb85 delly2: add v1.2.6 (#42745)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-02-22 07:28:40 +01:00
Xavier Delaruelle
3f23634c46 environment-modules: add version 5.4.0 (#42763) 2024-02-22 07:10:55 +01:00
MatthewLieber
d5766431a0 mvapich: add v3.0 (#42756)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2024-02-22 07:09:14 +01:00
Martin Lang
1388bfe47d bigdft-atlab: add v1.9.3, v1.9.4 (#42643) 2024-02-22 07:05:00 +01:00
dependabot[bot]
579dec3b35 build(deps): bump urllib3 from 2.2.0 to 2.2.1 in /lib/spack/docs (#42757)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.2.0 to 2.2.1.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.2.0...2.2.1)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-22 07:01:45 +01:00
Martin Lang
9608ed9dbd cgal: add v5.5.3 (#42650) 2024-02-22 06:58:03 +01:00
Wouter Deconinck
42b739d6d5 podio: depends_on py-graphviz type run (for podio-vis) (#42787)
The podio-vis tool depends at run-time on py-graphviz, https://github.com/AIDASoft/podio/blob/master/tools/podio-vis#L10.
2024-02-22 06:14:38 +01:00
Matthieu Dorier
91a0c71ed1 nlohmann-json-schema-validator: added version 2.2.0 and 2.3.0 (#42792) 2024-02-22 06:11:29 +01:00
Dom Heinzeller
6ee6fbe56b ecflow: apply ctsapi_cassert.patch for all compilers (#42793) 2024-02-22 06:09:54 +01:00
Mikael Simberg
be4eae3fa8 pika: add sanitizers variant (#42778) 2024-02-22 05:54:33 +01:00
Harmen Stoppels
ad70b88d5f spack gc: do not show uninstalled but needed specs (#42696) 2024-02-22 05:21:39 +01:00
eugeneswalker
c1d230f25f e4s ci stacks: add python packages (#42774)
* e4s ci stacks: add python packages

* comment out failing specs
2024-02-21 20:59:05 -07:00
John W. Parent
4bc52fc1a3 env activate: use Win-compatible print on Windows (#42755)
Use "echo" instead of "printf" on Windows.
2024-02-21 11:02:04 -08:00
John W. Parent
7d728822f0 Windows: fix error with can_symlink check (#42753) 2024-02-21 10:18:25 -08:00
John W. Parent
e96640d2b9 cgns package: don't use MPI wrappers on Windows (#42750) 2024-02-21 10:10:48 -08:00
Alex Richert
e196978c7c Add 'docs' variant to rust-bootstrap (#42768)
* Add 'docs' variant to rust-bootstrap

* remove docs for rust-bootstrap
2024-02-21 11:04:13 -07:00
Harmen Stoppels
de3d1e6c66 rocm: removal of deprecated <5.1 versions (#42676)
The package `aomp` is removed entirely, as it was too outdated to have non-deprecated dependencies.
2024-02-21 14:07:40 +01:00
Massimiliano Culpo
2d8e0825fe binutils: add v2.42 (#42760) 2024-02-20 23:03:51 -08:00
pauleonix
d5c06c4e2c asio: add patches 1.28.2 and 1.28.1 (#42762) 2024-02-20 23:02:43 -08:00
Wouter Deconinck
0d92b07dbd pythia6: deal with dead pythiasix.hepforge.org (#42162)
* pythia6: deal with dead pythiasix.hepforge.org

* pythia6: rm main81.f from CMakeLists.txt

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-02-20 15:30:54 -06:00
Auriane R
6cfcbc0167 Update conflict between stdexec and clang (#42765) 2024-02-20 13:32:54 -07:00
Massimiliano Culpo
f9e9fab2da clingo: add v5.7.1 (#42758) 2024-02-20 07:49:49 -08:00
Massimiliano Culpo
b3f790c259 btop: add v1.3.2 (#42759) 2024-02-20 07:46:35 -08:00
George Young
b5b5130bed pblat: add new package (#42517)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-19 22:28:27 +01:00
Vicente Bolea
d670a2a5ce adios2: update kokkos dependency (#42621)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-19 11:54:08 -07:00
Harmen Stoppels
0ee3a3c401 Use relative target in symlinks to modified files in view (#42699) 2024-02-19 16:33:38 +01:00
Dave Keeshan
ad6dc0d103 verible: add v0.0-3539-g9442853c (#42628) 2024-02-19 14:40:23 +01:00
Alex Richert
6e373b46f8 scorep: specify binutils headers and libs (#42656) 2024-02-19 14:35:22 +01:00
Thomas-Ulrich
abe617c4ea hipsycl: update package (#42518) 2024-02-19 14:34:05 +01:00
Satish Balay
a0e80b23b9 DTK: specify MPI compilers (#42592)
Co-authored-by: balay <balay@users.noreply.github.com>
2024-02-19 14:24:28 +01:00
Dom Heinzeller
4d051eb6ff ecflow: fix compilation with Intel classic compilers (#42622) 2024-02-19 14:23:09 +01:00
kinagaki-fj
0d89083cb5 omm-bundle: add new package (#42304) 2024-02-19 14:16:39 +01:00
Alex Richert
23e586fd85 ferret: add support for gcc@10: (#42660) 2024-02-19 14:14:11 +01:00
Richard Berger
c2b116175b kokkos: disable CUDA_MALLOC_ASYNC on cray-mpich (#42661)
Co-authored-by: Daniel Arndt <arndtd@ornl.gov>
2024-02-19 13:52:48 +01:00
Mikael Simberg
a1f90620c3 umpire: depend on camp~rocm when umpire itself has ~rocm (#42701) 2024-02-19 11:58:31 +01:00
Harmen Stoppels
668879141f remove a few redundant calls to setup_run_environment (#42718)
Any package `X` used as `depends_on("x", type="build")` will have
`X.setup_run_environment(env)` called, because it has to be able to
"run" in the build environment.

So there is no point in calling `setup_run_environment` from
`setup_dependent_build_environment`.

Also it's redundant to call `setup_run_environment` in
`setup_dependent_run_environment`, because (a) the latter is called _for
every parent edge_ rather than once per node, and (b) it only runs after
`setup_run_environment` has already been called. Better to call
`setup_run_environment` once and only once.
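A hypothetical package excerpt illustrating the point (not a real Spack recipe): defining `setup_run_environment` once is sufficient, and forwarding it from the dependent hooks adds nothing.
```python
from spack.package import *


class Mytool(Package):
    """Hypothetical build-time tool used only to illustrate the hooks."""

    def setup_run_environment(self, env):
        # Called once per node; Spack also applies it when this package is
        # a build-type dependency of another spec.
        env.prepend_path("PATH", self.prefix.bin)

    # Redundant pattern this change removes:
    #
    # def setup_dependent_build_environment(self, env, dependent_spec):
    #     self.setup_run_environment(env)
```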
2024-02-19 11:43:45 +01:00
Adam J. Stewart
267defd9d3 py-matplotlib: add v3.8.3 (#42698) 2024-02-19 11:40:18 +01:00
Ken Raffenetti
603d3f2283 mpich: Remove invalid pmi option (#42686)
pmi=off is not a valid configuration option. MPICH cannot function
without a PMI library. Fixes #42685.
2024-02-19 11:38:49 +01:00
James Beal
02bfbbe269 Bump for new version of bedtools2 (#42034)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-02-19 11:35:26 +01:00
Alec Scott
44d08d2c24 gnupg: make discoverable as external (#42736) 2024-02-19 11:27:39 +01:00
AMD Toolchain Support
c6faab10aa CP2K: fix multiple use of spec["fftw"] (#42724)
fftw object was originally created with spec["fftw:openmp"], but
referencing spec["fftw"] overwrites the 'last_query' in the spec object,
so later use of fftw.libs was not returning FFTW OpenMP libs.

Also allow the post-install fixup to support amdfftw as well as fftw.
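A hedged sketch of the pitfall (bare lines as they might appear inside the recipe's build code; names are illustrative, taken from the description above):
```python
# spec["name:virtuals"] records a "last query" on the spec; a later plain
# spec["name"] lookup on the same node resets it and changes what .libs
# returns afterwards.
fftw = spec["fftw:openmp"]   # query remembers the 'openmp' virtual
_ = spec["fftw"]             # plain query overwrites last_query
libs = fftw.libs             # may no longer include the OpenMP libraries

# Safer: consume the query result immediately.
fftw_openmp_libs = spec["fftw:openmp"].libs
```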

Co-authored-by: Branden Moore <branden.moore@amd.com>
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
Co-authored-by: Greg Becker <becker33@llnl.gov>
2024-02-19 11:21:11 +01:00
Geoffrey Lentner
baa203f115 duckdb: add v0.9.2 (#42374) 2024-02-19 03:02:43 -07:00
Alec Scott
575a33846f bfs: add v3.1.1 (#42740) 2024-02-19 10:54:00 +01:00
Adam J. Stewart
79df065730 py-shapely: add v2.0.3 (#42738) 2024-02-18 21:29:18 +01:00
William Moses
953ee2d0ca Bump enzyme to 0.0.100 (#42626) 2024-02-18 08:48:21 -08:00
Nai-Yuan Chiang
6796b581ed hiop new release, v1.0.3 (#42730) 2024-02-18 08:45:44 -08:00
Christoph Junghans
f49a5519b7 byfl: initial commit (#42731) 2024-02-18 08:43:56 -08:00
Tom Drever
e07a6f51dc Add py-click-option-group (#42678)
* Add py-click-option-group

* Specify dependency versions
2024-02-18 08:22:25 -06:00
Dani
cb73d71cf1 new builtin package: py-biobb-model (#42681)
* new builtin package: py-biobb-model

* Update var/spack/repos/builtin/packages/py-biobb-model/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-18 03:18:03 -06:00
Jonas Eschle
5949fc2c88 add package py-jacobi (#42672)
* add package py-jacobi

* fix:  add description

* fix:  add description

* fix:  add description

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

* Update package.py

* Update var/spack/repos/builtin/packages/py-jacobi/package.py

I don't think that numpy is used in "build"? But not important

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-18 03:17:35 -06:00
Jonas Eschle
fd10cfdebf add package py dotmap (#42665)
* add package py dotmap

* add maintainer

* [@spackbot] updating style on behalf of jonas-eschle

* fix:  add description

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
2024-02-18 03:16:32 -06:00
Maciej Wójcik
32506e222d py-sysrsync: add new package (#42492)
* py-sysrsync: add new package

* py-sysrsync: specify dependency type

* py-sysrsync: add constraint for Python
2024-02-18 03:15:42 -06:00
Maciej Wójcik
a7d5cc6d68 py-google-...: add new versions and few new packages (#42671)
* py-google-cloud-storage: add new versions

* py-google-api-core: add new versions

* py-proto-plus: add new package

* py-google-api-core: add grpc variant

* py-google-api-core: add grpc variant

* py-google-api-core: add missing prefix

* py-google-cloud-batch: add new package

* py-google-cloud-logging: add new package

* py-google-cloud-appengine-logging: add new package

* py-google-cloud-audit-log: add new package

* py-grpc-google-iam-v1: add new package

* py-proto-plus: remove obvious dependency

* Whitespace

* Whitespace

* py-google-cloud-audit-log: correct conflict

* py-proto-plus: correct dependency type

* Whitespace

* py-google-auth: add new version

* py-google-resumable-media: add new version

* py-google-cloud-storage: constrain version of dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-grpcio-status: use newer version

* py-google-resumable-media: add upper bound of dependency

* Add types of dependencies.

* py-grpcio: add new version

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-18 03:15:02 -06:00
Vanessasaurus
222241f232 flux-core: add uuid (#42635)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-02-18 08:54:11 +01:00
Sinan
aa1820eb5c pdal: new package (#42714)
* new package pdal

* [@spackbot] updating style on behalf of Sinan81

* fix style

* add license

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* [@spackbot] updating style on behalf of Sinan81

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* improve dependency spec

* add maintainer

* add conflict

* fix bug

* improve

* improve

* [@spackbot] updating style on behalf of Sinan81

* fix style

* specify cmake dependency version

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
Co-authored-by: Alec Scott <alec@bcs.sh>
2024-02-17 13:40:40 -08:00
George Young
16ea5f68ba cryodrgn: new package @2.3.0 (#42443)
* cryodrgn: new package @2.3.0

* correcting dependency ranges

* correcting dependency ranges

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-17 15:07:56 -06:00
Dani
aeec515b4f new builtin package: py-biobb-structure-utils (#42683) 2024-02-17 14:42:10 -06:00
Maciej Wójcik
6e2ec2950b py-kubernetes: add new versions (#42670)
* py-kubernetes: add new versions

* py-oauthlib: add new version
2024-02-17 14:32:20 -06:00
Maciej Wójcik
fe5772898d py-azure-... and py-msrest: add new versions (#42624)
* py-azure-batch: add new versions

* py-azure-core: add new versions

* py-azure-identity: add new versions

* py-azure-mgmt-batch: add new versions

* py-azure-mgmt-core: add new versions

* py-azure-storage-blob: add new versions

* py-msrest: add new versions

* Whitespace

* Whitespace

* py-msrest: add a note

* py-msrest: version-dependent URL

* py-azure-mgmt-batch: correct version of dependency
2024-02-17 14:24:11 -06:00
Alex Leute
384ddf8e93 py-smote-variants: Added package py-smote-variants (#42502)
* py-smote-variants: Added package py-smote-variants

Also added py-minisom and py-metric-learn as dependencies

* py-metric-learn: Added build dependency on setuptools

* py-smote-variants: Added a dependency on py-pytest-runner

As well as a comment about why statistics isn't included

* [@spackbot] updating style on behalf of alex391

---------

Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-02-17 14:20:03 -06:00
Tom Vander Aa
32c2e240f8 py-charm4py: needs Cython<3.0 (#42491)
* py-charm4py: needs Cython<3.0

* Update var/spack/repos/builtin/packages/py-charm4py/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-17 14:16:31 -06:00
611 changed files with 13231 additions and 9441 deletions

View File

@@ -43,7 +43,7 @@ jobs:
. share/spack/setup-env.sh
$(which spack) audit packages
$(which spack) audit externals
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044 # @v2.1.0
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab # @v2.1.0
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits

View File

@@ -96,7 +96,7 @@ jobs:
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@f95db51fddba0c2d1ec667646a06c2ce06100226
uses: docker/setup-buildx-action@0d103c3126aa41d772a8362f6aa67afac040f80c
- name: Log in to GitHub Container Registry
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d

View File

@@ -40,7 +40,7 @@ jobs:
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@ebc4d7e9ebcb0b1eb21480bb8f43113e996ac77a
- uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36
id: filter
with:
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below

View File

@@ -91,7 +91,7 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,linux,${{ matrix.concretizer }}
# Test shell integration
@@ -122,7 +122,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: shelltests,linux
@@ -181,7 +181,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044 # @v2.1.0
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab # @v2.1.0
with:
flags: unittests,linux,clingo
# Run unit tests on MacOS
@@ -216,6 +216,6 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,macos

View File

@@ -33,7 +33,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,windows
unit-tests-cmd:
@@ -57,7 +57,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,windows
build-abseil:

View File

@@ -87,7 +87,7 @@ You can check what is installed in the bootstrapping store at any time using:
.. code-block:: console
% spack find -b
% spack -b find
==> Showing internal bootstrap store at "/Users/spack/.spack/bootstrap/store"
==> 11 installed packages
-- darwin-catalina-x86_64 / apple-clang@12.0.0 ------------------
@@ -101,7 +101,7 @@ In case it is needed you can remove all the software in the current bootstrappin
% spack clean -b
==> Removing bootstrapped software and configuration in "/Users/spack/.spack/bootstrap"
% spack find -b
% spack -b find
==> Showing internal bootstrap store at "/Users/spack/.spack/bootstrap/store"
==> 0 installed packages
@@ -175,4 +175,4 @@ bootstrapping.
This command needs to be run on a machine with internet access and the resulting folder
has to be moved over to the air-gapped system. Once the local sources are added using the
commands suggested at the prompt, they can be used to bootstrap Spack.
commands suggested at the prompt, they can be used to bootstrap Spack.

View File

@@ -73,9 +73,12 @@ are six configuration scopes. From lowest to highest:
Spack instance per project) or for site-wide settings on a multi-user
machine (e.g., for a common Spack instance).
#. **plugin**: Read from a Python project's entry points. Settings here affect
all instances of Spack running with the same Python installation. This scope takes higher precedence than site, system, and default scopes.
#. **user**: Stored in the home directory: ``~/.spack/``. These settings
affect all instances of Spack and take higher precedence than site,
system, or defaults scopes.
system, plugin, or defaults scopes.
#. **custom**: Stored in a custom directory specified by ``--config-scope``.
If multiple scopes are listed on the command line, they are ordered
@@ -196,6 +199,45 @@ with MPICH. You can create different configuration scopes for use with
mpi: [mpich]
.. _plugin-scopes:
^^^^^^^^^^^^^
Plugin scopes
^^^^^^^^^^^^^
.. note::
Python version >= 3.8 is required to enable plugin configuration.
Spack can be made aware of configuration scopes that are installed as part of a python package. To do so, register a function that returns the scope's path to the ``"spack.config"`` entry point. Consider the Python package ``my_package`` that includes Spack configurations:
.. code-block:: console
my-package/
├── src
│   ├── my_package
│   │   ├── __init__.py
│   │   └── spack/
│   │   │   └── config.yaml
└── pyproject.toml
adding the following to ``my_package``'s ``pyproject.toml`` will make ``my_package``'s ``spack/`` configurations visible to Spack when ``my_package`` is installed:
.. code-block:: toml
[project.entry-points."spack.config"]
my_package = "my_package:get_config_path"
The function ``my_package.get_config_path`` in ``my_package/__init__.py`` might look like
.. code-block:: python
import importlib.resources
def get_config_path():
dirname = importlib.resources.files("my_package").joinpath("spack")
if dirname.exists():
return str(dirname)
.. _platform-scopes:
------------------------

View File

@@ -952,6 +952,17 @@ function, as shown in the example below:
^mpi: "{name}-{version}/{^mpi.name}-{^mpi.version}-{compiler.name}-{compiler.version}"
all: "{name}-{version}/{compiler.name}-{compiler.version}"
Projections also permit environment and spack configuration variable
expansions as shown below:
.. code-block:: yaml
projections:
all: "{name}-{version}/{compiler.name}-{compiler.version}/$date/$SYSTEM_ENV_VARIABLE"
where ``$date`` is the spack configuration variable that expands to the ``YYYY-MM-DD``
format and ``$SYSTEM_ENV_VARIABLE`` is an environment variable defined in the shell.
The entries in the projections configuration file must all be either
specs or the keyword ``all``. For each spec, the projection used will
be the first non-``all`` entry that the spec satisfies, or ``all`` if

View File

@@ -111,3 +111,39 @@ The corresponding unit tests can be run giving the appropriate options to ``spac
(5 durations < 0.005s hidden. Use -vv to show these durations.)
=========================================== 5 passed in 5.06s ============================================
---------------------------------------
Registering Extensions via Entry Points
---------------------------------------
.. note::
Python version >= 3.8 is required to register extensions via entry points.
Spack can be made aware of extensions that are installed as part of a python package. To do so, register a function that returns the extension path, or paths, to the ``"spack.extensions"`` entry point. Consider the Python package ``my_package`` that includes a Spack extension:
.. code-block:: console
my-package/
├── src
│   ├── my_package
│   │   └── __init__.py
│   └── spack-scripting/ # the spack extensions
└── pyproject.toml
adding the following to ``my_package``'s ``pyproject.toml`` will make the ``spack-scripting`` extension visible to Spack when ``my_package`` is installed:
.. code-block:: toml
[project.entry-points."spack.extensions"]
my_package = "my_package:get_extension_path"
The function ``my_package.get_extension_path`` in ``my_package/__init__.py`` might look like
.. code-block:: python
import importlib.resources
def get_extension_path():
dirname = importlib.resources.files("my_package").joinpath("spack-scripting")
if dirname.exists():
return str(dirname)

View File

@@ -273,9 +273,21 @@ builtin support through the ``depends_on`` function, the latter simply uses a ``
statement. Both module systems (at least in newer versions) do reference counting, so that if a
module is loaded by two different modules, it will only be unloaded after the others are.
The ``autoload`` key accepts the values ``none``, ``direct``, and ``all``. To disable it, use
``none``, and to enable, it's best to stick to ``direct``, which only autoloads the direct link and
run type dependencies, relying on recursive autoloading to load the rest.
The ``autoload`` key accepts the values:
* ``none``: no autoloading
* ``run``: autoload direct *run* type dependencies
* ``direct``: autoload direct *link and run* type dependencies
* ``all``: autoload all dependencies
In case of ``run`` and ``direct``, a ``module load`` triggers a recursive load.
The ``direct`` option is most correct: there are cases where pure link dependencies need to set
variables for themselves, or need to have variables of their own dependencies set.
In practice however, ``run`` is often sufficient, and may make ``module load`` snappier.
The ``all`` option is discouraged and seldom used.
A common complaint about autoloading is the large number of modules that are visible to the user.
Spack has a solution for this as well: ``hide_implicits: true``. This ensures that only those
@@ -297,11 +309,11 @@ Environment Modules requires version 4.7 or higher.
tcl:
hide_implicits: true
all:
autoload: direct
autoload: direct # or `run`
lmod:
hide_implicits: true
all:
autoload: direct
autoload: direct # or `run`
.. _anonymous_specs:

View File

@@ -5,8 +5,8 @@ sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.0
docutils==0.20.1
pygments==2.17.2
urllib3==2.2.0
pytest==8.0.1
urllib3==2.2.1
pytest==8.0.2
isort==5.13.2
black==24.2.0
flake8==7.0.0

View File

@@ -1240,6 +1240,47 @@ def get_single_file(directory):
return fnames[0]
@system_path_filter
def windows_sfn(path: os.PathLike):
"""Returns 8.3 Filename (SFN) representation of
path
8.3 Filenames (SFN, or short filenames) are a file
naming convention used prior to Win95 that Windows
still (and will continue to) support. The convention
caps filenames at 8 characters and, most importantly,
does not allow spaces, among other restrictions.
The scheme is otherwise the same as a normal Windows
file scheme, except that spaces are removed, the filename
is capped at 6 characters, and the remainder is replaced
with ~N, where N is the ordinal of the file among
same-prefix entries in its directory, e.g. Program Files and Program Files (x86)
become PROGRA~1 and PROGRA~2 respectively.
Further, all file/directory names are all caps (although modern Windows
is case insensitive in practice).
Conversion is accomplished by fileapi.h GetShortPathNameW
Returns paths in 8.3 Filename form
Note: this method is a no-op on Linux
Args:
path: Path to be transformed into SFN (8.3 filename) format
"""
# This should not be run-able on linux/macos
if sys.platform != "win32":
return path
path = str(path)
import ctypes
k32 = ctypes.WinDLL("kernel32", use_last_error=True)
# stub Windows types TCHAR[LENGTH]
TCHAR_arr = ctypes.c_wchar * len(path)
ret_str = TCHAR_arr()
k32.GetShortPathNameW(path, ret_str, len(path))
return ret_str.value
@contextmanager
def temp_cwd():
tmp_dir = tempfile.mkdtemp()

View File

@@ -843,6 +843,30 @@ def __repr__(self):
return repr(self.instance)
def get_entry_points(*, group: str):
"""Wrapper for ``importlib.metadata.entry_points``
Args:
group: entry points to select
Returns:
EntryPoints for ``group`` or empty list if unsupported
"""
try:
import importlib.metadata # type: ignore # novermin
except ImportError:
return []
try:
return importlib.metadata.entry_points(group=group)
except TypeError:
# Prior to Python 3.10, entry_points accepted no parameters and always
# returned a dictionary of entry points, keyed by group. See
# https://docs.python.org/3/library/importlib.metadata.html#entry-points
return importlib.metadata.entry_points().get(group, [])
def load_module_from_file(module_name, module_path):
"""Loads a python module from the path of the corresponding file.

View File

@@ -189,6 +189,7 @@ def _windows_can_symlink() -> bool:
import llnl.util.filesystem as fs
fs.touchp(fpath)
fs.mkdirp(dpath)
try:
os.symlink(dpath, dlink)

View File

@@ -1541,7 +1541,7 @@ def fetch_url_to_mirror(url):
response = spack.oci.opener.urlopen(
urllib.request.Request(
url=ref.manifest_url(),
headers={"Accept": "application/vnd.oci.image.manifest.v1+json"},
headers={"Accept": ", ".join(spack.oci.oci.manifest_content_type)},
)
)
except Exception:

View File

@@ -541,7 +541,7 @@ def autoreconf(self, pkg, spec, prefix):
if os.path.exists(self.configure_abs_path):
return
# Else try to regenerate it, which reuquires a few build dependencies
# Else try to regenerate it, which requires a few build dependencies
ensure_build_dependencies_or_raise(
spec=spec,
dependencies=["autoconf", "automake", "libtool"],

View File

@@ -69,7 +69,7 @@ class MSBuildBuilder(BaseBuilder):
@property
def build_directory(self):
"""Return the directory containing the MSBuild solution or vcxproj."""
return self.pkg.stage.source_path
return fs.windows_sfn(self.pkg.stage.source_path)
@property
def toolchain_version(self):

View File

@@ -77,7 +77,11 @@ def ignore_quotes(self):
@property
def build_directory(self):
"""Return the directory containing the makefile."""
return self.pkg.stage.source_path if not self.makefile_root else self.makefile_root
return (
fs.windows_sfn(self.pkg.stage.source_path)
if not self.makefile_root
else fs.windows_sfn(self.makefile_root)
)
@property
def std_nmake_args(self):

View File

@@ -2,7 +2,10 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import functools
import inspect
import operator
import os
import re
import shutil
@@ -24,7 +27,7 @@
import spack.package_base
import spack.spec
import spack.store
from spack.directives import build_system, depends_on, extends, maintainers
from spack.directives import build_system, depends_on, extends
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import test_part
from spack.spec import Spec
@@ -53,8 +56,6 @@ def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
class PythonExtension(spack.package_base.PackageBase):
maintainers("adamjstewart")
@property
def import_modules(self) -> Iterable[str]:
"""Names of modules that the Python package provides.
@@ -180,7 +181,7 @@ def add_files_to_view(self, view, merge_map, skip_if_exists=True):
except (OSError, KeyError):
target = None
if target:
os.symlink(target, dst)
os.symlink(os.path.relpath(target, os.path.dirname(dst)), dst)
else:
view.link(src, dst, spec=self.spec)
@@ -368,16 +369,19 @@ def headers(self) -> HeaderList:
# Remove py- prefix in package name
name = self.spec.name[3:]
# Headers may be in either location
# Headers should only be in include or platlib, but no harm in checking purelib too
include = self.prefix.join(self.spec["python"].package.include).join(name)
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
headers = fs.find_all_headers(include) + fs.find_all_headers(platlib)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
headers_list = map(fs.find_all_headers, [include, platlib, purelib])
headers = functools.reduce(operator.add, headers_list)
if headers:
return headers
msg = "Unable to locate {} headers in {} or {}"
raise NoHeadersError(msg.format(self.spec.name, include, platlib))
msg = "Unable to locate {} headers in {}, {}, or {}"
raise NoHeadersError(msg.format(self.spec.name, include, platlib, purelib))
@property
def libs(self) -> LibraryList:
@@ -386,15 +390,19 @@ def libs(self) -> LibraryList:
# Remove py- prefix in package name
name = self.spec.name[3:]
root = self.prefix.join(self.spec["python"].package.platlib).join(name)
# Libraries should only be in platlib, but no harm in checking purelib too
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
libs = fs.find_all_libraries(root, recursive=True)
find_all_libraries = functools.partial(fs.find_all_libraries, recursive=True)
libs_list = map(find_all_libraries, [platlib, purelib])
libs = functools.reduce(operator.add, libs_list)
if libs:
return libs
msg = "Unable to recursively locate {} libraries in {}"
raise NoLibrariesError(msg.format(self.spec.name, root))
msg = "Unable to recursively locate {} libraries in {} or {}"
raise NoLibrariesError(msg.format(self.spec.name, platlib, purelib))
@spack.builder.builder("python_pip")

View File

@@ -162,23 +162,9 @@ def hip_flags(amdgpu_target):
# Add compiler minimum versions based on the first release where the
# processor is included in llvm/lib/Support/TargetParser.cpp
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx900:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx906:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx908:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx90c")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack-")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack+")
depends_on("llvm-amdgpu@5.2.0:", when="amdgpu_target=gfx940")
depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx941")
depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx942")
depends_on("llvm-amdgpu@4.5.0:", when="amdgpu_target=gfx1013")
depends_on("llvm-amdgpu@3.8.0:", when="amdgpu_target=gfx1030")
depends_on("llvm-amdgpu@3.9.0:", when="amdgpu_target=gfx1031")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx1032")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx1033")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx1034")
depends_on("llvm-amdgpu@4.5.0:", when="amdgpu_target=gfx1035")
depends_on("llvm-amdgpu@5.2.0:", when="amdgpu_target=gfx1036")
depends_on("llvm-amdgpu@5.3.0:", when="amdgpu_target=gfx1100")
depends_on("llvm-amdgpu@5.3.0:", when="amdgpu_target=gfx1101")

View File

@@ -594,6 +594,15 @@ def _put_manifest(
base_manifest, base_config = base_images[architecture]
env = _retrieve_env_dict_from_config(base_config)
# If the base image uses `vnd.docker.distribution.manifest.v2+json`, then we use that too.
# This is because Singularity / Apptainer is very strict about not mixing them.
base_manifest_mediaType = base_manifest.get(
"mediaType", "application/vnd.oci.image.manifest.v1+json"
)
use_docker_format = (
base_manifest_mediaType == "application/vnd.docker.distribution.manifest.v2+json"
)
spack.user_environment.environment_modifications_for_specs(*specs).apply_modifications(env)
# Create an oci.image.config file
@@ -625,8 +634,8 @@ def _put_manifest(
# Upload the config file
upload_blob_with_retry(image_ref, file=config_file, digest=config_file_checksum)
oci_manifest = {
"mediaType": "application/vnd.oci.image.manifest.v1+json",
manifest = {
"mediaType": base_manifest_mediaType,
"schemaVersion": 2,
"config": {
"mediaType": base_manifest["config"]["mediaType"],
@@ -637,7 +646,11 @@ def _put_manifest(
*(layer for layer in base_manifest["layers"]),
*(
{
"mediaType": "application/vnd.oci.image.layer.v1.tar+gzip",
"mediaType": (
"application/vnd.docker.image.rootfs.diff.tar.gzip"
if use_docker_format
else "application/vnd.oci.image.layer.v1.tar+gzip"
),
"digest": str(checksums[s.dag_hash()].compressed_digest),
"size": checksums[s.dag_hash()].size,
}
@@ -646,11 +659,11 @@ def _put_manifest(
],
}
if annotations:
oci_manifest["annotations"] = annotations
if not use_docker_format and annotations:
manifest["annotations"] = annotations
# Finally upload the manifest
upload_manifest_with_retry(image_ref, oci_manifest=oci_manifest)
upload_manifest_with_retry(image_ref, manifest=manifest)
# delete the config file
os.unlink(config_file)

View File

@@ -270,7 +270,8 @@ def create_temp_env_directory():
def _tty_info(msg):
"""tty.info like function that prints the equivalent printf statement for eval."""
decorated = f'{colorize("@*b{==>}")} {msg}\n'
print(f"printf {shlex.quote(decorated)};")
executor = "echo" if sys.platform == "win32" else "printf"
print(f"{executor} {shlex.quote(decorated)};")
def env_activate(args):

View File

@@ -18,6 +18,7 @@
import spack.cray_manifest as cray_manifest
import spack.detection
import spack.error
import spack.repo
import spack.util.environment
from spack.cmd.common import arguments
@@ -152,9 +153,9 @@ def external_find(args):
def packages_to_search_for(
*, names: Optional[List[str]], tags: List[str], exclude: Optional[List[str]]
):
result = []
for current_tag in tags:
result.extend(spack.repo.PATH.packages_with_tags(current_tag, full=True))
result = list(
{pkg for tag in tags for pkg in spack.repo.PATH.packages_with_tags(tag, full=True)}
)
if names:
# Match both fully qualified and unqualified

View File

@@ -127,10 +127,7 @@ def _process_result(result, show, required_format, kwargs):
print()
if result.unsolved_specs and "solutions" in show:
tty.msg("Unsolved specs")
for spec in result.unsolved_specs:
print(spec)
print()
tty.msg(asp.Result.format_unsolved(result.unsolved_specs))
def solve(parser, args):

View File

@@ -228,7 +228,7 @@ def create_reporter(args, specs_to_test, test_suite):
def test_list(args):
"""list installed packages with available tests"""
tagged = set(spack.repo.PATH.packages_with_tags(*args.tag)) if args.tag else set()
tagged = spack.repo.PATH.packages_with_tags(*args.tag) if args.tag else set()
def has_test_and_tags(pkg_class):
tests = spack.install_test.test_functions(pkg_class)

View File

@@ -764,6 +764,31 @@ def _add_platform_scope(
cfg.push_scope(scope_type(plat_name, plat_path))
def config_paths_from_entry_points() -> List[Tuple[str, str]]:
"""Load configuration paths from entry points
A python package can register entry point metadata so that Spack can find
its configuration by adding the following to the project's pyproject.toml:
.. code-block:: toml
[project.entry-points."spack.config"]
baz = "baz:get_spack_config_path"
The function ``get_spack_config_path`` returns the path to the package's
spack configuration scope
"""
config_paths: List[Tuple[str, str]] = []
for entry_point in lang.get_entry_points(group="spack.config"):
hook = entry_point.load()
if callable(hook):
config_path = hook()
if config_path and os.path.exists(config_path):
config_paths.append(("plugin-%s" % entry_point.name, str(config_path)))
return config_paths
def _add_command_line_scopes(
cfg: Union[Configuration, lang.Singleton], command_line_scopes: List[str]
) -> None:
@@ -816,6 +841,9 @@ def create() -> Configuration:
# No site-level configs should be checked into spack by default.
configuration_paths.append(("site", os.path.join(spack.paths.etc_path)))
# Python package's can register configuration scopes via entry_points
configuration_paths.extend(config_paths_from_entry_points())
# User configuration can override both spack defaults and site config
# This is disabled if user asks for no local configuration.
if not disable_local_config:

View File

@@ -19,9 +19,6 @@
},
"os_package_manager": "dnf",
"build": "spack/fedora38",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/fedora:38"
}
@@ -33,9 +30,6 @@
},
"os_package_manager": "dnf",
"build": "spack/fedora37",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/fedora:37"
}
@@ -47,9 +41,6 @@
},
"os_package_manager": "dnf_epel",
"build": "spack/rockylinux9",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/rockylinux:9"
}
@@ -61,9 +52,6 @@
},
"os_package_manager": "dnf_epel",
"build": "spack/rockylinux8",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/rockylinux:8"
}
@@ -75,9 +63,6 @@
},
"os_package_manager": "dnf_epel",
"build": "spack/almalinux9",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "quay.io/almalinuxorg/almalinux:9"
}
@@ -89,9 +74,6 @@
},
"os_package_manager": "dnf_epel",
"build": "spack/almalinux8",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "quay.io/almalinuxorg/almalinux:8"
}
@@ -105,9 +87,6 @@
"build": "spack/centos-stream",
"final": {
"image": "quay.io/centos/centos:stream"
},
"build_tags": {
"develop": "latest"
}
},
"centos:7": {
@@ -115,10 +94,7 @@
"template": "container/centos_7.dockerfile"
},
"os_package_manager": "yum",
"build": "spack/centos7",
"build_tags": {
"develop": "latest"
}
"build": "spack/centos7"
},
"opensuse/leap:15": {
"bootstrap": {
@@ -126,9 +102,6 @@
},
"os_package_manager": "zypper",
"build": "spack/leap15",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "opensuse/leap:latest"
}
@@ -148,19 +121,13 @@
"template": "container/ubuntu_2204.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-jammy",
"build_tags": {
"develop": "latest"
}
"build": "spack/ubuntu-jammy"
},
"ubuntu:20.04": {
"bootstrap": {
"template": "container/ubuntu_2004.dockerfile"
},
"build": "spack/ubuntu-focal",
"build_tags": {
"develop": "latest"
},
"os_package_manager": "apt"
},
"ubuntu:18.04": {
@@ -168,10 +135,7 @@
"template": "container/ubuntu_1804.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-bionic",
"build_tags": {
"develop": "latest"
}
"build": "spack/ubuntu-bionic"
}
},
"os_package_managers": {

View File

@@ -50,10 +50,7 @@ def build_info(image, spack_version):
if not build_image:
return None, None
# Translate version from git to docker if necessary
build_tag = image_data["build_tags"].get(spack_version, spack_version)
return build_image, build_tag
return build_image, spack_version
def os_package_manager_for(image):

View File

@@ -1687,7 +1687,11 @@ def root(key, record):
with self.read_transaction():
roots = [rec.spec for key, rec in self._data.items() if root(key, rec)]
needed = set(id(spec) for spec in tr.traverse_nodes(roots, deptype=deptype))
return [rec.spec for rec in self._data.values() if id(rec.spec) not in needed]
return [
rec.spec
for rec in self._data.values()
if id(rec.spec) not in needed and rec.installed
]
def update_explicit(self, spec, explicit):
"""

View File

@@ -703,11 +703,14 @@ def _execute_patch(pkg_or_dep: Union["spack.package_base.PackageBase", Dependenc
patch: spack.patch.Patch
if "://" in url_or_filename:
if sha256 is None:
raise ValueError("patch() with a url requires a sha256")
patch = spack.patch.UrlPatch(
pkg,
url_or_filename,
level,
working_dir,
working_dir=working_dir,
ordering_key=ordering_key,
sha256=sha256,
archive_sha256=archive_sha256,

View File

@@ -626,14 +626,13 @@ def view(self, new: Optional[str] = None) -> SimpleFilesystemView:
new: If a string, create a FilesystemView rooted at that path. Default None. This
should only be used to regenerate the view, and cannot be used to access specs.
"""
root = new if new else self._current_root
if not root:
path = new if new else self._current_root
if not path:
# This can only be hit if we write a future bug
raise SpackEnvironmentViewError(
"Attempting to get nonexistent view from environment. "
f"View root is at {self.root}"
f"Attempting to get nonexistent view from environment. View root is at {self.root}"
)
return self._view(root)
return self._view(path)
def _view(self, root: str) -> SimpleFilesystemView:
"""Returns a view object for a given root dir."""
@@ -678,7 +677,9 @@ def specs_for_view(self, concrete_roots: List[Spec]) -> List[Spec]:
# Filter selected, installed specs
with spack.store.STORE.db.read_transaction():
return [s for s in specs if s in self and s.installed]
result = [s for s in specs if s in self and s.installed]
return self._exclude_duplicate_runtimes(result)
def regenerate(self, concrete_roots: List[Spec]) -> None:
specs = self.specs_for_view(concrete_roots)
@@ -765,6 +766,16 @@ def regenerate(self, concrete_roots: List[Spec]) -> None:
msg += str(e)
tty.warn(msg)
def _exclude_duplicate_runtimes(self, nodes):
all_runtimes = spack.repo.PATH.packages_with_tags("runtime")
runtimes_by_name = {}
for s in nodes:
if s.name not in all_runtimes:
continue
current_runtime = runtimes_by_name.get(s.name, s)
runtimes_by_name[s.name] = max(current_runtime, s, key=lambda x: x.version)
return [x for x in nodes if x.name not in all_runtimes or runtimes_by_name[x.name] == x]
def _create_environment(path):
return Environment(path)
@@ -1485,44 +1496,6 @@ def _concretize_separately(self, tests=False):
]
return results
def concretize_and_add(self, user_spec, concrete_spec=None, tests=False):
"""Concretize and add a single spec to the environment.
Concretize the provided ``user_spec`` and add it along with the
concretized result to the environment. If the given ``user_spec`` was
already present in the environment, this does not add a duplicate.
The concretized spec will be added unless the ``user_spec`` was
already present and an associated concrete spec was already present.
Args:
concrete_spec: if provided, then it is assumed that it is the
result of concretizing the provided ``user_spec``
"""
if self.unify is True:
msg = (
"cannot install a single spec in an environment that is "
"configured to be concretized together. Run instead:\n\n"
" $ spack add <spec>\n"
" $ spack install\n"
)
raise SpackEnvironmentError(msg)
spec = Spec(user_spec)
if self.add(spec):
concrete = concrete_spec or spec.concretized(tests=tests)
self._add_concrete_spec(spec, concrete)
else:
# spec might be in the user_specs, but not installed.
# TODO: Redo name-based comparison for old style envs
spec = next(s for s in self.user_specs if s.satisfies(user_spec))
concrete = self.specs_by_hash.get(spec.dag_hash())
if not concrete:
concrete = spec.concretized(tests=tests)
self._add_concrete_spec(spec, concrete)
return concrete
@property
def default_view(self):
if not self.has_view(default_view_name):

View File

@@ -12,6 +12,7 @@
import re
import sys
import types
from pathlib import Path
from typing import List
import llnl.util.lang
@@ -132,10 +133,38 @@ def load_extension(name: str) -> str:
def get_extension_paths():
"""Return the list of canonicalized extension paths from config:extensions."""
extension_paths = spack.config.get("config:extensions") or []
extension_paths.extend(extension_paths_from_entry_points())
paths = [spack.util.path.canonicalize_path(p) for p in extension_paths]
return paths
def extension_paths_from_entry_points() -> List[str]:
"""Load extensions from a Python package's entry points.
A python package can register entry point metadata so that Spack can find
its extensions by adding the following to the project's pyproject.toml:
.. code-block:: toml
[project.entry-points."spack.extensions"]
baz = "baz:get_spack_extensions"
The function ``get_spack_extensions`` returns paths to the package's
spack extensions
"""
extension_paths: List[str] = []
for entry_point in llnl.util.lang.get_entry_points(group="spack.extensions"):
hook = entry_point.load()
if callable(hook):
paths = hook() or []
if isinstance(paths, (Path, str)):
extension_paths.append(str(paths))
else:
extension_paths.extend(paths)
return extension_paths
def get_command_paths():
"""Return the list of paths where to search for command files."""
command_paths = []

View File

@@ -950,14 +950,10 @@ def _main(argv=None):
parser.print_help()
return 1
# -h, -H, and -V are special as they do not require a command, but
# all the other options do nothing without a command.
# version is special as it does not require a command or loading and additional infrastructure
if args.version:
print(get_version())
return 0
elif args.help:
sys.stdout.write(parser.format_help(level=args.help))
return 0
# ------------------------------------------------------------------------
# This part of the `main()` sets up Spack's configuration.
@@ -996,6 +992,12 @@ def _main(argv=None):
print_setup_info(*args.print_shell_vars.split(","))
return 0
# -h and -H are special as they do not require a command, but
# all the other options do nothing without a command.
if args.help:
sys.stdout.write(parser.format_help(level=args.help))
return 0
# At this point we've considered all the options to spack itself, so we
# need a command or we're done.
if not args.command:

View File

@@ -121,43 +121,26 @@ def update_dictionary_extending_lists(target, update):
target[key] = update[key]
def dependencies(spec, request="all"):
"""Returns the list of dependent specs for a given spec, according to the
request passed as parameter.
def dependencies(spec: spack.spec.Spec, request: str = "all") -> List[spack.spec.Spec]:
"""Returns the list of dependent specs for a given spec.
Args:
spec: spec to be analyzed
request: either 'none', 'direct' or 'all'
request: one of "none", "run", "direct", "all"
Returns:
list of dependencies
The return list will be empty if request is 'none', will contain
the direct dependencies if request is 'direct', or the entire DAG
if request is 'all'.
list of requested dependencies
"""
if request not in ("none", "direct", "all"):
message = "Wrong value for argument 'request' : "
message += "should be one of ('none', 'direct', 'all')"
raise tty.error(message + " [current value is '%s']" % request)
if request == "none":
return []
elif request == "run":
return spec.dependencies(deptype=dt.RUN)
elif request == "direct":
return spec.dependencies(deptype=dt.RUN | dt.LINK)
elif request == "all":
return list(spec.traverse(order="topo", deptype=dt.LINK | dt.RUN, root=False))
if request == "direct":
return spec.dependencies(deptype=("link", "run"))
# FIXME : during module file creation nodes seem to be visited multiple
# FIXME : times even if cover='nodes' is given. This work around permits
# FIXME : to get a unique list of spec anyhow. Do we miss a merge
# FIXME : step among nodes that refer to the same package?
seen = set()
seen_add = seen.add
deps = sorted(
spec.traverse(order="post", cover="nodes", deptype=("link", "run"), root=False),
reverse=True,
)
return [d for d in deps if not (d in seen or seen_add(d))]
raise ValueError(f'request "{request}" is not one of "none", "direct", "run", "all"')
def merge_config_rules(configuration, spec):

View File

@@ -161,7 +161,7 @@ def upload_blob(
def upload_manifest(
ref: ImageReference,
oci_manifest: dict,
manifest: dict,
tag: bool = True,
_urlopen: spack.oci.opener.MaybeOpen = None,
):
@@ -169,7 +169,7 @@ def upload_manifest(
Args:
ref: The image reference.
oci_manifest: The OCI manifest or index.
manifest: The manifest or index.
tag: When true, use the tag, otherwise use the digest,
this is relevant for multi-arch images, where the
tag is an index, referencing the manifests by digest.
@@ -179,7 +179,7 @@ def upload_manifest(
"""
_urlopen = _urlopen or spack.oci.opener.urlopen
data = json.dumps(oci_manifest, separators=(",", ":")).encode()
data = json.dumps(manifest, separators=(",", ":")).encode()
digest = Digest.from_sha256(hashlib.sha256(data).hexdigest())
size = len(data)
@@ -190,7 +190,7 @@ def upload_manifest(
url=ref.manifest_url(),
method="PUT",
data=data,
headers={"Content-Type": oci_manifest["mediaType"]},
headers={"Content-Type": manifest["mediaType"]},
)
response = _urlopen(request)

View File

@@ -566,6 +566,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
provided: Dict["spack.spec.Spec", Set["spack.spec.Spec"]]
provided_together: Dict["spack.spec.Spec", List[Set[str]]]
patches: Dict["spack.spec.Spec", List["spack.patch.Patch"]]
variants: Dict[str, Tuple["spack.variant.Variant", "spack.spec.Spec"]]
#: By default, packages are not virtual
#: Virtual packages override this attribute

View File

@@ -9,9 +9,9 @@
import os.path
import pathlib
import sys
from typing import Any, Dict, Optional, Tuple, Type
import llnl.util.filesystem
import llnl.util.lang
from llnl.url import allowed_archive
import spack
@@ -25,15 +25,16 @@
from spack.util.executable import which, which_string
def apply_patch(stage, patch_path, level=1, working_dir="."):
def apply_patch(
stage: "spack.stage.Stage", patch_path: str, level: int = 1, working_dir: str = "."
) -> None:
"""Apply the patch at patch_path to code in the stage.
Args:
stage (spack.stage.Stage): stage with code that will be patched
patch_path (str): filesystem location for the patch to apply
level (int or None): patch level (default 1)
working_dir (str): relative path *within* the stage to change to
(default '.')
stage: stage with code that will be patched
patch_path: filesystem location for the patch to apply
level: patch level
working_dir: relative path *within* the stage to change to
"""
git_utils_path = os.environ.get("PATH", "")
if sys.platform == "win32":
@@ -58,16 +59,24 @@ def apply_patch(stage, patch_path, level=1, working_dir="."):
class Patch:
"""Base class for patches.
Arguments:
pkg (str): the package that owns the patch
The owning package is not necessarily the package to apply the patch
to -- in the case where a dependent package patches its dependency,
it is the dependent's fullname.
"""
def __init__(self, pkg, path_or_url, level, working_dir):
sha256: str
def __init__(
self, pkg: "spack.package_base.PackageBase", path_or_url: str, level: int, working_dir: str
) -> None:
"""Initialize a new Patch instance.
Args:
pkg: the package that owns the patch
path_or_url: the relative path or URL to a patch file
level: patch level
working_dir: relative path *within* the stage to change to
"""
# validate level (must be an integer >= 0)
if not isinstance(level, int) or not level >= 0:
raise ValueError("Patch level needs to be a non-negative integer.")
@@ -75,27 +84,28 @@ def __init__(self, pkg, path_or_url, level, working_dir):
# Attributes shared by all patch subclasses
self.owner = pkg.fullname
self.path_or_url = path_or_url # needed for debug output
self.path = None # must be set before apply()
self.path: Optional[str] = None # must be set before apply()
self.level = level
self.working_dir = working_dir
def apply(self, stage: "spack.stage.Stage"):
def apply(self, stage: "spack.stage.Stage") -> None:
"""Apply a patch to source in a stage.
Arguments:
stage (spack.stage.Stage): stage where source code lives
Args:
stage: stage where source code lives
"""
if not self.path or not os.path.isfile(self.path):
raise NoSuchPatchError(f"No such patch: {self.path}")
apply_patch(stage, self.path, self.level, self.working_dir)
@property
def stage(self):
return None
# TODO: Use TypedDict once Spack supports Python 3.8+ only
def to_dict(self) -> Dict[str, Any]:
"""Dictionary representation of the patch.
def to_dict(self):
"""Partial dictionary -- subclases should add to this."""
Returns:
A dictionary representation.
"""
return {
"owner": self.owner,
"sha256": self.sha256,
@@ -103,31 +113,55 @@ def to_dict(self):
"working_dir": self.working_dir,
}
def __eq__(self, other):
def __eq__(self, other: object) -> bool:
"""Equality check.
Args:
other: another patch
Returns:
True if both patches have the same checksum, else False
"""
if not isinstance(other, Patch):
return NotImplemented
return self.sha256 == other.sha256
def __hash__(self):
def __hash__(self) -> int:
"""Unique hash.
Returns:
A unique hash based on the sha256.
"""
return hash(self.sha256)
class FilePatch(Patch):
"""Describes a patch that is retrieved from a file in the repository.
"""Describes a patch that is retrieved from a file in the repository."""
Arguments:
pkg (str): the class object for the package that owns the patch
relative_path (str): path to patch, relative to the repository
directory for a package.
level (int): level to pass to patch command
working_dir (str): path within the source directory where patch
should be applied
"""
_sha256: Optional[str] = None
def __init__(self, pkg, relative_path, level, working_dir, ordering_key=None):
def __init__(
self,
pkg: "spack.package_base.PackageBase",
relative_path: str,
level: int,
working_dir: str,
ordering_key: Optional[Tuple[str, int]] = None,
) -> None:
"""Initialize a new FilePatch instance.
Args:
pkg: the class object for the package that owns the patch
relative_path: path to patch, relative to the repository directory for a package.
level: level to pass to patch command
working_dir: path within the source directory where patch should be applied
ordering_key: key used to ensure patches are applied in a consistent order
"""
self.relative_path = relative_path
# patches may be defined by relative paths to parent classes
# search mro to look for the file
abs_path = None
abs_path: Optional[str] = None
# At different times we call FilePatch on instances and classes
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
for cls in inspect.getmro(pkg_cls):
@@ -150,50 +184,90 @@ def __init__(self, pkg, relative_path, level, working_dir, ordering_key=None):
super().__init__(pkg, abs_path, level, working_dir)
self.path = abs_path
self._sha256 = None
self.ordering_key = ordering_key
@property
def sha256(self):
if self._sha256 is None:
def sha256(self) -> str:
"""Get the patch checksum.
Returns:
The sha256 of the patch file.
"""
if self._sha256 is None and self.path is not None:
self._sha256 = checksum(hashlib.sha256, self.path)
assert isinstance(self._sha256, str)
return self._sha256
def to_dict(self):
return llnl.util.lang.union_dicts(super().to_dict(), {"relative_path": self.relative_path})
@sha256.setter
def sha256(self, value: str) -> None:
"""Set the patch checksum.
Args:
value: the sha256
"""
self._sha256 = value
def to_dict(self) -> Dict[str, Any]:
"""Dictionary representation of the patch.
Returns:
A dictionary representation.
"""
data = super().to_dict()
data["relative_path"] = self.relative_path
return data
class UrlPatch(Patch):
"""Describes a patch that is retrieved from a URL.
"""Describes a patch that is retrieved from a URL."""
Arguments:
pkg (str): the package that owns the patch
url (str): URL where the patch can be fetched
level (int): level to pass to patch command
working_dir (str): path within the source directory where patch
should be applied
"""
def __init__(
self,
pkg: "spack.package_base.PackageBase",
url: str,
level: int = 1,
*,
working_dir: str = ".",
sha256: str, # This is required for UrlPatch
ordering_key: Optional[Tuple[str, int]] = None,
archive_sha256: Optional[str] = None,
) -> None:
"""Initialize a new UrlPatch instance.
def __init__(self, pkg, url, level=1, working_dir=".", ordering_key=None, **kwargs):
Arguments:
pkg: the package that owns the patch
url: URL where the patch can be fetched
level: level to pass to patch command
working_dir: path within the source directory where patch should be applied
ordering_key: key used to ensure patches are applied in a consistent order
sha256: sha256 sum of the patch, used to verify the patch
archive_sha256: sha256 sum of the *archive*, if the patch is compressed
(only required for compressed URL patches)
"""
super().__init__(pkg, url, level, working_dir)
self.url = url
self._stage = None
self._stage: Optional["spack.stage.Stage"] = None
self.ordering_key = ordering_key
self.archive_sha256 = kwargs.get("archive_sha256")
if allowed_archive(self.url) and not self.archive_sha256:
if allowed_archive(self.url) and not archive_sha256:
raise PatchDirectiveError(
"Compressed patches require 'archive_sha256' "
"and patch 'sha256' attributes: %s" % self.url
)
self.archive_sha256 = archive_sha256
self.sha256 = kwargs.get("sha256")
if not self.sha256:
if not sha256:
raise PatchDirectiveError("URL patches require a sha256 checksum")
self.sha256 = sha256
def apply(self, stage: "spack.stage.Stage"):
def apply(self, stage: "spack.stage.Stage") -> None:
"""Apply a patch to source in a stage.
Args:
stage: stage where source code lives
"""
assert self.stage.expanded, "Stage must be expanded before applying patches"
# Get the patch file.
@@ -204,15 +278,20 @@ def apply(self, stage: "spack.stage.Stage"):
return super().apply(stage)
@property
def stage(self):
def stage(self) -> "spack.stage.Stage":
"""The stage in which to download (and unpack) the URL patch.
Returns:
The stage object.
"""
if self._stage:
return self._stage
fetch_digest = self.archive_sha256 or self.sha256
# Two checksums, one for compressed file, one for its contents
if self.archive_sha256:
fetcher = fs.FetchAndVerifyExpandedFile(
if self.archive_sha256 and self.sha256:
fetcher: fs.FetchStrategy = fs.FetchAndVerifyExpandedFile(
self.url, archive_sha256=self.archive_sha256, expanded_sha256=self.sha256
)
else:
@@ -231,7 +310,12 @@ def stage(self):
)
return self._stage
def to_dict(self):
def to_dict(self) -> Dict[str, Any]:
"""Dictionary representation of the patch.
Returns:
A dictionary representation.
"""
data = super().to_dict()
data["url"] = self.url
if self.archive_sha256:
@@ -239,8 +323,21 @@ def to_dict(self):
return data
def from_dict(dictionary, repository=None):
"""Create a patch from json dictionary."""
def from_dict(
dictionary: Dict[str, Any], repository: Optional["spack.repo.RepoPath"] = None
) -> Patch:
"""Create a patch from json dictionary.
Args:
dictionary: dictionary representation of a patch
repository: repository containing package
Returns:
A patch object.
Raises:
ValueError: If *owner* or *url*/*relative_path* are missing in the dictionary.
"""
repository = repository or spack.repo.PATH
owner = dictionary.get("owner")
if "owner" not in dictionary:
@@ -252,7 +349,7 @@ def from_dict(dictionary, repository=None):
pkg_cls,
dictionary["url"],
dictionary["level"],
dictionary["working_dir"],
working_dir=dictionary["working_dir"],
sha256=dictionary["sha256"],
archive_sha256=dictionary.get("archive_sha256"),
)
@@ -267,7 +364,7 @@ def from_dict(dictionary, repository=None):
# TODO: handle this more gracefully.
sha256 = dictionary["sha256"]
checker = Checker(sha256)
if not checker.check(patch.path):
if patch.path and not checker.check(patch.path):
raise fs.ChecksumError(
"sha256 checksum failed for %s" % patch.path,
"Expected %s but got %s " % (sha256, checker.sum)
@@ -295,10 +392,17 @@ class PatchCache:
namespace2.package2:
<patch json>
... etc. ...
"""
def __init__(self, repository, data=None):
def __init__(
self, repository: "spack.repo.RepoPath", data: Optional[Dict[str, Any]] = None
) -> None:
"""Initialize a new PatchCache instance.
Args:
repository: repository containing package
data: nested dictionary of patches
"""
if data is None:
self.index = {}
else:
@@ -309,21 +413,39 @@ def __init__(self, repository, data=None):
self.repository = repository
@classmethod
def from_json(cls, stream, repository):
def from_json(cls, stream: Any, repository: "spack.repo.RepoPath") -> "PatchCache":
"""Initialize a new PatchCache instance from JSON.
Args:
stream: stream of data
repository: repository containing package
Returns:
A new PatchCache instance.
"""
return PatchCache(repository=repository, data=sjson.load(stream))
def to_json(self, stream):
def to_json(self, stream: Any) -> None:
"""Dump a JSON representation to a stream.
Args:
stream: stream of data
"""
sjson.dump({"patches": self.index}, stream)
def patch_for_package(self, sha256: str, pkg):
def patch_for_package(self, sha256: str, pkg: "spack.package_base.PackageBase") -> Patch:
"""Look up a patch in the index and build a patch object for it.
Arguments:
sha256: sha256 hash to look up
pkg (spack.package_base.PackageBase): Package object to get patch for.
We build patch objects lazily because building them requires that
we have information about the package's location in its repo."""
we have information about the package's location in its repo.
Args:
sha256: sha256 hash to look up
pkg: Package object to get patch for.
Returns:
The patch object.
"""
sha_index = self.index.get(sha256)
if not sha_index:
raise PatchLookupError(
@@ -346,7 +468,12 @@ def patch_for_package(self, sha256: str, pkg):
patch_dict["sha256"] = sha256
return from_dict(patch_dict, repository=self.repository)
def update_package(self, pkg_fullname):
def update_package(self, pkg_fullname: str) -> None:
"""Update the patch cache.
Args:
pkg_fullname: package to update.
"""
# remove this package from any patch entries that reference it.
empty = []
for sha256, package_to_patch in self.index.items():
@@ -372,14 +499,29 @@ def update_package(self, pkg_fullname):
p2p = self.index.setdefault(sha256, {})
p2p.update(package_to_patch)
def update(self, other):
"""Update this cache with the contents of another."""
def update(self, other: "PatchCache") -> None:
"""Update this cache with the contents of another.
Args:
other: another patch cache to merge
"""
for sha256, package_to_patch in other.index.items():
p2p = self.index.setdefault(sha256, {})
p2p.update(package_to_patch)
@staticmethod
def _index_patches(pkg_class, repository):
def _index_patches(
pkg_class: Type["spack.package_base.PackageBase"], repository: "spack.repo.RepoPath"
) -> Dict[Any, Any]:
"""Patch index for a specific patch.
Args:
pkg_class: package object to get patches for
repository: repository containing the package
Returns:
The patch index for that package.
"""
index = {}
# Add patches from the class
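The signature change above makes sha256 a required keyword-only argument of UrlPatch.__init__, with archive_sha256 needed in addition for compressed patches. A minimal standalone sketch of that calling pattern (a hypothetical class, not Spack's actual code) shows why a missing checksum now fails at construction time:
from typing import Optional

# Sketch only: `*` makes sha256 keyword-only with no default, mirroring the new signature.
class UrlPatchSketch:
    def __init__(self, url: str, level: int = 1, *, sha256: str, archive_sha256: Optional[str] = None):
        if archive_sha256 is None and url.endswith((".gz", ".xz", ".zip")):
            raise ValueError("compressed patches also need 'archive_sha256'")
        self.url, self.level = url, level
        self.sha256, self.archive_sha256 = sha256, archive_sha256

UrlPatchSketch("https://example.com/fix.patch", sha256="ab" * 32)  # ok
# UrlPatchSketch("https://example.com/fix.patch")  # TypeError: missing keyword-only 'sha256'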

@@ -3,6 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.util.path
def get_projection(projections, spec):
"""
@@ -11,7 +13,7 @@ def get_projection(projections, spec):
all_projection = None
for spec_like, projection in projections.items():
if spec.satisfies(spec_like):
return projection
return spack.util.path.substitute_path_variables(projection)
elif spec_like == "all":
all_projection = projection
all_projection = spack.util.path.substitute_path_variables(projection)
return all_projection
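With this change, projections are passed through spack.util.path.substitute_path_variables before being returned. A rough standalone approximation of that expansion (environment variables plus a $date placeholder only; not Spack's implementation):
import os
from datetime import date

def expand_projection(projection: str) -> str:
    # $FOO_ENV_VAR-style variables come from the environment; $date is a Spack config variable.
    expanded = os.path.expandvars(projection)
    return expanded.replace("$date", date.today().strftime("%Y-%m-%d"))

os.environ["FOO_ENV_VAR"] = "test-string"
print(expand_projection("{name}-{version}/$FOO_ENV_VAR/$date"))
# -> "{name}-{version}/test-string/<today's date>"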

@@ -25,7 +25,7 @@
import traceback
import types
import uuid
from typing import Any, Dict, List, Tuple, Union
from typing import Any, Dict, List, Set, Tuple, Union
import llnl.path
import llnl.util.filesystem as fs
@@ -746,19 +746,17 @@ def all_package_paths(self):
for name in self.all_package_names():
yield self.package_path(name)
def packages_with_tags(self, *tags, full=False):
"""Returns a list of packages matching any of the tags in input.
def packages_with_tags(self, *tags: str, full: bool = False) -> Set[str]:
"""Returns a set of packages matching any of the tags in input.
Args:
full: if True the package names in the output are fully-qualified
"""
r = set()
for repo in self.repos:
current = repo.packages_with_tags(*tags)
if full:
current = [f"{repo.namespace}.{x}" for x in current]
r |= set(current)
return sorted(r)
return {
f"{repo.namespace}.{pkg}" if full else pkg
for repo in self.repos
for pkg in repo.packages_with_tags(*tags)
}
def all_package_classes(self):
for name in self.all_package_names():
@@ -1169,15 +1167,10 @@ def all_package_paths(self):
for name in self.all_package_names():
yield self.package_path(name)
def packages_with_tags(self, *tags):
def packages_with_tags(self, *tags: str) -> Set[str]:
v = set(self.all_package_names())
index = self.tag_index
for t in tags:
t = t.lower()
v &= set(index[t])
return sorted(v)
v.intersection_update(*(self.tag_index[tag.lower()] for tag in tags))
return v
def all_package_classes(self):
"""Iterator over all package *classes* in the repository.

@@ -6,7 +6,6 @@
import warnings
import llnl.util.lang
import llnl.util.tty
# jsonschema is imported lazily as it is heavy to import
@@ -62,25 +61,3 @@ def _deprecated_properties(validator, deprecated, instance, schema):
Validator = llnl.util.lang.Singleton(_make_validator)
spec_list_schema = {
"type": "array",
"default": [],
"items": {
"anyOf": [
{
"type": "object",
"additionalProperties": False,
"properties": {
"matrix": {
"type": "array",
"items": {"type": "array", "items": {"type": "string"}},
},
"exclude": {"type": "array", "items": {"type": "string"}},
},
},
{"type": "string"},
{"type": "null"},
]
},
}

@@ -10,7 +10,7 @@
"""
from typing import Any, Dict
import spack.schema
from .spec_list import spec_list_schema
#: Properties for inclusion in other schemas
properties: Dict[str, Any] = {
@@ -20,7 +20,7 @@
"items": {
"type": "object",
"properties": {"when": {"type": "string"}},
"patternProperties": {r"^(?!when$)\w*": spack.schema.spec_list_schema},
"patternProperties": {r"^(?!when$)\w*": spec_list_schema},
},
}
}

@@ -16,11 +16,11 @@
import spack.schema.merged
import spack.schema.projections
from .spec_list import spec_list_schema
#: Top level key in a manifest file
TOP_LEVEL_KEY = "spack"
projections_scheme = spack.schema.projections.properties["projections"]
properties: Dict[str, Any] = {
"spack": {
"type": "object",
@@ -34,7 +34,7 @@
# extra environment schema properties
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spack.schema.spec_list_schema,
"specs": spec_list_schema,
},
),
}

@@ -34,7 +34,7 @@
dictionary_of_strings = {"type": "object", "patternProperties": {r"\w[\w-]*": {"type": "string"}}}
dependency_selection = {"type": "string", "enum": ["none", "direct", "all"]}
dependency_selection = {"type": "string", "enum": ["none", "run", "direct", "all"]}
module_file_configuration = {
"type": "object",

@@ -1,46 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for spack environment
.. literalinclude:: _spack_root/lib/spack/spack/schema/spack.py
:lines: 20-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
import spack.schema
import spack.schema.gitlab_ci as ci_schema # DEPRECATED
import spack.schema.merged as merged_schema
#: Properties for inclusion in other schemas
properties: Dict[str, Any] = {
"spack": {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": union_dicts(
# Include deprecated "gitlab-ci" section
ci_schema.properties,
# merged configuration scope schemas
merged_schema.properties,
# extra environment schema properties
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spack.schema.spec_list_schema,
},
),
}
}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack environment file schema",
"type": "object",
"additionalProperties": False,
"properties": properties,
}

@@ -0,0 +1,24 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
matrix_schema = {"type": "array", "items": {"type": "array", "items": {"type": "string"}}}
spec_list_schema = {
"type": "array",
"default": [],
"items": {
"anyOf": [
{
"type": "object",
"additionalProperties": False,
"properties": {
"matrix": matrix_schema,
"exclude": {"type": "array", "items": {"type": "string"}},
},
},
{"type": "string"},
{"type": "null"},
]
},
}
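As a sanity check, the extracted schema still accepts ordinary environment spec lists, including matrix entries with excludes. A minimal sketch, assuming Spack and jsonschema are importable:
import jsonschema

from spack.schema.spec_list import spec_list_schema  # the module added above

specs = ["zlib", {"matrix": [["hdf5", "netcdf-c"], ["%gcc", "%clang"]], "exclude": ["netcdf-c%clang"]}]
jsonschema.validate(specs, spec_list_schema)  # raises ValidationError if the list is malformed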

@@ -15,7 +15,7 @@
import types
import typing
import warnings
from typing import Callable, Dict, List, NamedTuple, Optional, Sequence, Set, Tuple, Union
from typing import Callable, Dict, Iterator, List, NamedTuple, Optional, Set, Tuple, Type, Union
import archspec.cpu
@@ -258,7 +258,7 @@ def remove_node(spec: spack.spec.Spec, facts: List[AspFunction]) -> List[AspFunc
return list(filter(lambda x: x.args[0] not in ("node", "virtual_node"), facts))
def _create_counter(specs, tests):
def _create_counter(specs: List[spack.spec.Spec], tests: bool):
strategy = spack.config.CONFIG.get("concretizer:duplicates:strategy", "none")
if strategy == "full":
return FullDuplicatesCounter(specs, tests=tests)
@@ -411,7 +411,7 @@ def raise_if_unsat(self):
"""
Raise an appropriate error if the result is unsatisfiable.
The error is an InternalConcretizerError, and includes the minimized cores
The error is a SolverError, and includes the minimized cores
resulting from the solve, formatted to be human readable.
"""
if self.satisfiable:
@@ -422,7 +422,7 @@ def raise_if_unsat(self):
constraints = constraints[0]
conflicts = self.format_minimal_cores()
raise InternalConcretizerError(constraints, conflicts=conflicts)
raise SolverError(constraints, conflicts=conflicts)
@property
def specs(self):
@@ -435,7 +435,10 @@ def specs(self):
@property
def unsolved_specs(self):
"""List of abstract input specs that were not solved."""
"""List of tuples pairing abstract input specs that were not
solved with their associated candidate spec from the solver
(if the solve completed).
"""
if self._unsolved_specs is None:
self._compute_specs_from_answer_set()
return self._unsolved_specs
@@ -449,7 +452,7 @@ def specs_by_input(self):
def _compute_specs_from_answer_set(self):
if not self.satisfiable:
self._concrete_specs = []
self._unsolved_specs = self.abstract_specs
self._unsolved_specs = list((x, None) for x in self.abstract_specs)
self._concrete_specs_by_input = {}
return
@@ -470,7 +473,22 @@ def _compute_specs_from_answer_set(self):
self._concrete_specs.append(answer[node])
self._concrete_specs_by_input[input_spec] = answer[node]
else:
self._unsolved_specs.append(input_spec)
self._unsolved_specs.append((input_spec, candidate))
@staticmethod
def format_unsolved(unsolved_specs):
"""Create a message providing info on unsolved user specs and for
each one show the associated candidate spec from the solver (if
there is one).
"""
msg = "Unsatisfied input specs:"
for input_spec, candidate in unsolved_specs:
msg += f"\n\tInput spec: {str(input_spec)}"
if candidate:
msg += f"\n\tCandidate spec: {str(candidate)}"
else:
msg += "\n\t(No candidate specs from solver)"
return msg
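For illustration, the message this builds looks like the following (plain strings standing in for Spec objects, hypothetical values):
unsolved = [("zlib@2:", "zlib@1.3"), ("hdf5+mpi", None)]  # (input spec, candidate) pairs
msg = "Unsatisfied input specs:"
for input_spec, candidate in unsolved:
    msg += f"\n\tInput spec: {input_spec}"
    if candidate:
        msg += f"\n\tCandidate spec: {candidate}"
    else:
        msg += "\n\t(No candidate specs from solver)"
print(msg)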
def _normalize_packages_yaml(packages_yaml):
@@ -805,6 +823,13 @@ def on_model(model):
print("Statistics:")
pprint.pprint(self.control.statistics)
if result.unsolved_specs and setup.concretize_everything:
unsolved_str = Result.format_unsolved(result.unsolved_specs)
raise InternalConcretizerError(
"Internal Spack error: the solver completed but produced specs"
f" that do not satisfy the request.\n\t{unsolved_str}"
)
return result, timer, self.control.statistics
@@ -872,35 +897,41 @@ def __iter__(self):
return iter(self.data)
# types for condition caching in solver setup
ConditionSpecKey = Tuple[str, Optional[TransformFunction]]
ConditionIdFunctionPair = Tuple[int, List[AspFunction]]
ConditionSpecCache = Dict[str, Dict[ConditionSpecKey, ConditionIdFunctionPair]]
class SpackSolverSetup:
"""Class to set up and run a Spack concretization solve."""
def __init__(self, tests=False):
self.gen = None # set by setup()
def __init__(self, tests: bool = False):
# these are all initialized in setup()
self.gen: "ProblemInstanceBuilder" = ProblemInstanceBuilder()
self.possible_virtuals: Set[str] = set()
self.assumptions = []
self.declared_versions = collections.defaultdict(list)
self.possible_versions = collections.defaultdict(set)
self.deprecated_versions = collections.defaultdict(set)
self.assumptions: List[Tuple["clingo.Symbol", bool]] = [] # type: ignore[name-defined]
self.declared_versions: Dict[str, List[DeclaredVersion]] = collections.defaultdict(list)
self.possible_versions: Dict[str, Set[GitOrStandardVersion]] = collections.defaultdict(set)
self.deprecated_versions: Dict[str, Set[GitOrStandardVersion]] = collections.defaultdict(
set
)
self.possible_virtuals = None
self.possible_compilers = []
self.possible_oses = set()
self.variant_values_from_specs = set()
self.version_constraints = set()
self.target_constraints = set()
self.default_targets = []
self.compiler_version_constraints = set()
self.post_facts = []
self.possible_compilers: List = []
self.possible_oses: Set = set()
self.variant_values_from_specs: Set = set()
self.version_constraints: Set = set()
self.target_constraints: Set = set()
self.default_targets: List = []
self.compiler_version_constraints: Set = set()
self.post_facts: List = []
# (ID, CompilerSpec) -> dictionary of attributes
self.compiler_info = collections.defaultdict(dict)
self.reusable_and_possible: ConcreteSpecsByHash = ConcreteSpecsByHash()
self.reusable_and_possible = ConcreteSpecsByHash()
self._id_counter = itertools.count()
self._trigger_cache = collections.defaultdict(dict)
self._effect_cache = collections.defaultdict(dict)
self._id_counter: Iterator[int] = itertools.count()
self._trigger_cache: ConditionSpecCache = collections.defaultdict(dict)
self._effect_cache: ConditionSpecCache = collections.defaultdict(dict)
# Caches to optimize the setup phase of the solver
self.target_specs_cache = None
@@ -912,8 +943,8 @@ def __init__(self, tests=False):
self.concretize_everything = True
# Set during the call to setup
self.pkgs = None
self.explicitly_required_namespaces = {}
self.pkgs: Set[str] = set()
self.explicitly_required_namespaces: Dict[str, str] = {}
def pkg_version_rules(self, pkg):
"""Output declared versions of a package.
@@ -1197,6 +1228,38 @@ def variant_rules(self, pkg):
self.gen.newline()
def _get_condition_id(
self,
named_cond: spack.spec.Spec,
cache: ConditionSpecCache,
body: bool,
transform: Optional[TransformFunction] = None,
) -> int:
"""Get the id for one half of a condition (either a trigger or an imposed constraint).
Construct a key from the condition spec and any associated transformation, and
cache the ASP functions that they imply. The saved functions will be output
later in ``trigger_rules()`` and ``effect_rules()``.
Returns:
The id of the cached trigger or effect.
"""
pkg_cache = cache[named_cond.name]
named_cond_key = (str(named_cond), transform)
result = pkg_cache.get(named_cond_key)
if result:
return result[0]
cond_id = next(self._id_counter)
requirements = self.spec_clauses(named_cond, body=body)
if transform:
requirements = transform(named_cond, requirements)
pkg_cache[named_cond_key] = (cond_id, requirements)
return cond_id
def condition(
self,
required_spec: spack.spec.Spec,
@@ -1222,7 +1285,8 @@ def condition(
"""
named_cond = required_spec.copy()
named_cond.name = named_cond.name or name
assert named_cond.name, "must provide name for anonymous conditions!"
if not named_cond.name:
raise ValueError(f"Must provide a name for anonymous condition: '{named_cond}'")
# Check if we can emit the requirements before updating the condition ID counter.
# In this way, if a condition can't be emitted but the exception is handled in the caller,
@@ -1232,35 +1296,19 @@ def condition(
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
cache = self._trigger_cache[named_cond.name]
named_cond_key = (str(named_cond), transform_required)
if named_cond_key not in cache:
trigger_id = next(self._id_counter)
requirements = self.spec_clauses(named_cond, body=True, required_from=name)
if transform_required:
requirements = transform_required(named_cond, requirements)
cache[named_cond_key] = (trigger_id, requirements)
trigger_id, requirements = cache[named_cond_key]
trigger_id = self._get_condition_id(
named_cond, cache=self._trigger_cache, body=True, transform=transform_required
)
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_trigger(condition_id, trigger_id)))
if not imposed_spec:
return condition_id
cache = self._effect_cache[named_cond.name]
imposed_spec_key = (str(imposed_spec), transform_imposed)
if imposed_spec_key not in cache:
effect_id = next(self._id_counter)
requirements = self.spec_clauses(imposed_spec, body=False, required_from=name)
if transform_imposed:
requirements = transform_imposed(imposed_spec, requirements)
cache[imposed_spec_key] = (effect_id, requirements)
effect_id, requirements = cache[imposed_spec_key]
effect_id = self._get_condition_id(
imposed_spec, cache=self._effect_cache, body=False, transform=transform_imposed
)
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_effect(condition_id, effect_id)))
return condition_id
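The refactor above routes both the trigger and the effect half of condition() through _get_condition_id. A stripped-down sketch of that caching scheme, with toy clause strings instead of ASP functions:
import itertools
from typing import Callable, Dict, List, Optional, Tuple

_ids = itertools.count()
_cache: Dict[Tuple[str, Optional[Callable]], Tuple[int, List[str]]] = {}

def condition_half_id(spec_str: str, transform: Optional[Callable] = None) -> int:
    # One id (and one clause list) per (spec, transform) pair, computed at most once.
    key = (spec_str, transform)
    cached = _cache.get(key)
    if cached:
        return cached[0]
    cond_id = next(_ids)
    clauses = [f"clause({spec_str})"]  # stand-in for spec_clauses(...)
    if transform:
        clauses = transform(spec_str, clauses)
    _cache[key] = (cond_id, clauses)
    return cond_id

assert condition_half_id("pkg@1.0") == condition_half_id("pkg@1.0")  # second call is a cache hit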
def impose(self, condition_id, imposed_spec, node=True, name=None, body=False):
@@ -1362,23 +1410,13 @@ def virtual_preferences(self, pkg_name, func):
def provider_defaults(self):
self.gen.h2("Default virtual providers")
msg = (
"Internal Error: possible_virtuals is not populated. Please report to the spack"
" maintainers"
)
assert self.possible_virtuals is not None, msg
self.virtual_preferences(
"all", lambda v, p, i: self.gen.fact(fn.default_provider_preference(v, p, i))
)
def provider_requirements(self):
self.gen.h2("Requirements on virtual providers")
msg = (
"Internal Error: possible_virtuals is not populated. Please report to the spack"
" maintainers"
)
parser = RequirementParser(spack.config.CONFIG)
assert self.possible_virtuals is not None, msg
for virtual_str in sorted(self.possible_virtuals):
rules = parser.rules_from_virtual(virtual_str)
if rules:
@@ -1577,35 +1615,57 @@ def flag_defaults(self):
fn.compiler_version_flag(compiler.name, compiler.version, name, flag)
)
def spec_clauses(self, *args, **kwargs):
"""Wrap a call to `_spec_clauses()` into a try/except block that
raises a comprehensible error message in case of failure.
def spec_clauses(
self,
spec: spack.spec.Spec,
*,
body: bool = False,
transitive: bool = True,
expand_hashes: bool = False,
concrete_build_deps=False,
required_from: Optional[str] = None,
) -> List[AspFunction]:
"""Wrap a call to `_spec_clauses()` into a try/except block with better error handling.
Arguments are as for ``_spec_clauses()`` except ``required_from``.
Arguments:
required_from: name of package that caused this call.
"""
requestor = kwargs.pop("required_from", None)
try:
clauses = self._spec_clauses(*args, **kwargs)
clauses = self._spec_clauses(
spec,
body=body,
transitive=transitive,
expand_hashes=expand_hashes,
concrete_build_deps=concrete_build_deps,
)
except RuntimeError as exc:
msg = str(exc)
if requestor:
msg += ' [required from package "{0}"]'.format(requestor)
if required_from:
msg += f" [required from package '{required_from}']"
raise RuntimeError(msg)
return clauses
def _spec_clauses(
self, spec, body=False, transitive=True, expand_hashes=False, concrete_build_deps=False
):
self,
spec: spack.spec.Spec,
*,
body: bool = False,
transitive: bool = True,
expand_hashes: bool = False,
concrete_build_deps: bool = False,
) -> List[AspFunction]:
"""Return a list of clauses for a spec mandates are true.
Arguments:
spec (spack.spec.Spec): the spec to analyze
body (bool): if True, generate clauses to be used in rule bodies
(final values) instead of rule heads (setters).
transitive (bool): if False, don't generate clauses from
dependencies (default True)
expand_hashes (bool): if True, descend into hashes of concrete specs
(default False)
concrete_build_deps (bool): if False, do not include pure build deps
of concrete specs (as they have no effect on runtime constraints)
spec: the spec to analyze
body: if True, generate clauses to be used in rule bodies (final values) instead
of rule heads (setters).
transitive: if False, don't generate clauses from dependencies (default True)
expand_hashes: if True, descend into hashes of concrete specs (default False)
concrete_build_deps: if False, do not include pure build deps of concrete specs
(as they have no effect on runtime constraints)
Normally, if called with ``transitive=True``, ``spec_clauses()`` just generates
hashes for the dependency requirements of concrete specs. If ``expand_hashes``
@@ -1615,7 +1675,7 @@ def _spec_clauses(
"""
clauses = []
f = _Body if body else _Head
f: Union[Type[_Head], Type[_Body]] = _Body if body else _Head
if spec.name:
clauses.append(f.node(spec.name) if not spec.virtual else f.virtual_node(spec.name))
@@ -1704,8 +1764,9 @@ def _spec_clauses(
# dependencies
if spec.concrete:
# older specs do not have package hashes, so we have to do this carefully
if getattr(spec, "_package_hash", None):
clauses.append(fn.attr("package_hash", spec.name, spec._package_hash))
package_hash = getattr(spec, "_package_hash", None)
if package_hash:
clauses.append(fn.attr("package_hash", spec.name, package_hash))
clauses.append(fn.attr("hash", spec.name, spec.dag_hash()))
edges = spec.edges_from_dependents()
@@ -1725,6 +1786,11 @@ def _spec_clauses(
dep = dspec.spec
if spec.concrete:
# GCC runtime is solved again by clingo, even on concrete specs, to give
# the possibility to reuse specs built against a different runtime.
if dep.name == "gcc-runtime":
continue
# We know dependencies are real for concrete specs. For abstract
# specs they just mean the dep is somehow in the DAG.
for dtype in dt.ALL_FLAGS:
@@ -1764,7 +1830,7 @@ def _spec_clauses(
return clauses
def define_package_versions_and_validate_preferences(
self, possible_pkgs, *, require_checksum: bool, allow_deprecated: bool
self, possible_pkgs: Set[str], *, require_checksum: bool, allow_deprecated: bool
):
"""Declare any versions in specs not declared in packages."""
packages_yaml = spack.config.get("packages")
@@ -1797,7 +1863,7 @@ def define_package_versions_and_validate_preferences(
if pkg_name not in packages_yaml or "version" not in packages_yaml[pkg_name]:
continue
version_defs = []
version_defs: List[GitOrStandardVersion] = []
for vstr in packages_yaml[pkg_name]["version"]:
v = vn.ver(vstr)
@@ -2008,13 +2074,6 @@ def target_defaults(self, specs):
def virtual_providers(self):
self.gen.h2("Virtual providers")
msg = (
"Internal Error: possible_virtuals is not populated. Please report to the spack"
" maintainers"
)
assert self.possible_virtuals is not None, msg
# what provides what
for vspec in sorted(self.possible_virtuals):
self.gen.fact(fn.virtual(vspec))
self.gen.newline()
@@ -2211,7 +2270,7 @@ def define_concrete_input_specs(self, specs, possible):
def setup(
self,
specs: Sequence[spack.spec.Spec],
specs: List[spack.spec.Spec],
*,
reuse: Optional[List[spack.spec.Spec]] = None,
allow_deprecated: bool = False,
@@ -2233,8 +2292,7 @@ def setup(
self.possible_virtuals = node_counter.possible_virtuals()
self.pkgs = node_counter.possible_dependencies()
runtimes = spack.repo.PATH.packages_with_tags("runtime")
self.pkgs.update(set(runtimes))
self.pkgs.update(spack.repo.PATH.packages_with_tags("runtime"))
# Fail if we already know an unreachable node is requested
for spec in specs:
@@ -3429,15 +3487,13 @@ def solve_in_rounds(
if not result.satisfiable or not result.specs:
break
input_specs = result.unsolved_specs
input_specs = list(x for (x, y) in result.unsolved_specs)
for spec in result.specs:
reusable_specs.extend(spec.traverse())
class UnsatisfiableSpecError(spack.error.UnsatisfiableSpecError):
"""
Subclass for new constructor signature for new concretizer
"""
"""There was an issue with the spec that was requested (i.e. a user error)."""
def __init__(self, msg):
super(spack.error.UnsatisfiableSpecError, self).__init__(msg)
@@ -3447,8 +3503,21 @@ def __init__(self, msg):
class InternalConcretizerError(spack.error.UnsatisfiableSpecError):
"""
Subclass for new constructor signature for new concretizer
"""Errors that indicate a bug in Spack."""
def __init__(self, msg):
super(spack.error.UnsatisfiableSpecError, self).__init__(msg)
self.provided = None
self.required = None
self.constraint_type = None
class SolverError(InternalConcretizerError):
"""For cases where the solver is unable to produce a solution.
Such cases are unexpected because we allow for solutions with errors,
so for example user specs that are over-constrained should still
get a solution.
"""
def __init__(self, provided, conflicts):
@@ -3461,7 +3530,7 @@ def __init__(self, provided, conflicts):
if conflicts:
msg += ", errors are:" + "".join([f"\n {conflict}" for conflict in conflicts])
super(spack.error.UnsatisfiableSpecError, self).__init__(msg)
super().__init__(msg)
self.provided = provided

@@ -117,7 +117,7 @@ def _compute_cache_values(self):
self._possible_dependencies = set(self._link_run) | set(self._total_build)
def possible_packages_facts(self, gen, fn):
build_tools = set(spack.repo.PATH.packages_with_tags("build-tools"))
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
gen.h2("Packages with at most a single node")
for package_name in sorted(self.possible_dependencies() - build_tools):
gen.fact(fn.max_dupes(package_name, 1))
@@ -142,7 +142,7 @@ def possible_packages_facts(self, gen, fn):
class FullDuplicatesCounter(MinimalDuplicatesCounter):
def possible_packages_facts(self, gen, fn):
build_tools = set(spack.repo.PATH.packages_with_tags("build-tools"))
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
counter = collections.Counter(
list(self._link_run) + list(self._total_build) + list(self._direct_build)
)

@@ -2091,7 +2091,12 @@ def to_node_dict(self, hash=ht.dag_hash):
if hasattr(variant, "_patches_in_order_of_appearance"):
d["patches"] = variant._patches_in_order_of_appearance
if self._concrete and hash.package_hash and self._package_hash:
if (
self._concrete
and hash.package_hash
and hasattr(self, "_package_hash")
and self._package_hash
):
# We use the attribute here instead of `self.package_hash()` because this
# should *always* be assigned at concretization time. We don't want to try
# to compute a package hash for a concrete spec where a) the package might not

@@ -2731,15 +2731,6 @@ def test_concretize_user_specs_together():
assert all("mpich" not in spec for _, spec in e.concretized_specs())
def test_cant_install_single_spec_when_concretizing_together():
e = ev.create("coconcretization")
e.unify = True
with pytest.raises(ev.SpackEnvironmentError, match=r"cannot install"):
e.concretize_and_add("zlib")
e.install_all()
def test_duplicate_packages_raise_when_concretizing_together():
e = ev.create("coconcretization")
e.unify = True

@@ -94,6 +94,9 @@ def test_get_executables(working_env, mock_executable):
external = SpackCommand("external")
# TODO: this test should be made to work, but in the meantime it is
# causing intermittent (spurious) CI failures on all PRs
@pytest.mark.skipif(sys.platform == "win32", reason="Test fails intermittently on Windows")
def test_find_external_cmd_not_buildable(mutable_config, working_env, mock_executable):
"""When the user invokes 'spack external find --not-buildable', the config
for any package where Spack finds an external version should be marked as
@@ -248,6 +251,7 @@ def _determine_variants(cls, exes, version_str):
assert gcc.external_path == os.path.sep + os.path.join("opt", "gcc", "bin")
@pytest.mark.not_on_windows("Fails spuriously on Windows")
def test_new_entries_are_reported_correctly(mock_executable, mutable_config, monkeypatch):
# Prepare an environment to detect a fake gcc
gcc_exe = mock_executable("gcc", output="echo 4.2.1")

@@ -6,9 +6,11 @@
import pytest
import spack.deptypes as dt
import spack.environment as ev
import spack.main
import spack.spec
import spack.traverse
gc = spack.main.SpackCommand("gc")
add = spack.main.SpackCommand("add")
@@ -19,11 +21,8 @@
@pytest.mark.db
def test_gc_without_build_dependency(config, mutable_database):
output = gc("-yb")
assert "There are no unused specs." in output
output = gc("-y")
assert "There are no unused specs." in output
assert "There are no unused specs." in gc("-yb")
assert "There are no unused specs." in gc("-y")
@pytest.mark.db
@@ -32,11 +31,9 @@ def test_gc_with_build_dependency(config, mutable_database):
s.concretize()
s.package.do_install(fake=True, explicit=True)
output = gc("-yb")
assert "There are no unused specs." in output
output = gc("-y")
assert "Successfully uninstalled cmake" in output
assert "There are no unused specs." in gc("-yb")
assert "Successfully uninstalled cmake" in gc("-y")
assert "There are no unused specs." in gc("-y")
@pytest.mark.db
@@ -72,34 +69,39 @@ def test_gc_with_build_dependency_in_environment(config, mutable_database, mutab
with e:
assert mutable_database.query_local("simple-inheritance")
output = gc("-y")
assert "Restricting garbage collection" in output
assert "Successfully uninstalled cmake" in output
fst = gc("-y")
assert "Restricting garbage collection" in fst
assert "Successfully uninstalled cmake" in fst
snd = gc("-y")
assert "Restricting garbage collection" in snd
assert "There are no unused specs" in snd
@pytest.mark.db
def test_gc_except_any_environments(config, mutable_database, mutable_mock_env_path):
s = spack.spec.Spec("simple-inheritance")
s.concretize()
s.package.do_install(fake=True, explicit=True)
"""Tests whether the garbage collector can remove all specs except those still needed in some
environment (needed in the sense of roots + link/run deps)."""
assert mutable_database.query_local("zmpi")
e = ev.create("test_gc")
with e:
add("simple-inheritance")
install()
assert mutable_database.query_local("simple-inheritance")
e.add("simple-inheritance")
e.concretize()
e.install_all(fake=True)
e.write()
assert mutable_database.query_local("simple-inheritance")
assert not e.all_matching_specs(spack.spec.Spec("zmpi"))
output = gc("-yE")
assert "Restricting garbage collection" not in output
assert "Successfully uninstalled zmpi" in output
assert not mutable_database.query_local("zmpi")
with e:
output = gc("-yE")
assert "Restricting garbage collection" not in output
assert "There are no unused specs" not in output
# All runtime specs in this env should still be installed.
assert all(
s.installed
for s in spack.traverse.traverse_nodes(e.concrete_roots(), deptype=dt.LINK | dt.RUN)
)
@pytest.mark.db

@@ -12,13 +12,7 @@
maintainers = spack.main.SpackCommand("maintainers")
MAINTAINED_PACKAGES = [
"maintainers-1",
"maintainers-2",
"maintainers-3",
"py-extension1",
"py-extension2",
]
MAINTAINED_PACKAGES = ["maintainers-1", "maintainers-2", "maintainers-3", "py-extension1"]
def split(output):
@@ -53,11 +47,8 @@ def test_all(mock_packages, capfd):
"user2,",
"user3",
"py-extension1:",
"adamjstewart,",
"user1,",
"user2",
"py-extension2:",
"adamjstewart",
]
with capfd.disabled():
@@ -69,9 +60,6 @@ def test_all_by_user(mock_packages, capfd):
with capfd.disabled():
out = split(maintainers("--all", "--by-user"))
assert out == [
"adamjstewart:",
"py-extension1,",
"py-extension2",
"user0:",
"maintainers-3",
"user1:",

@@ -341,6 +341,7 @@ def test_different_compilers_get_different_flags(self):
assert set(client.compiler_flags["fflags"]) == set(["-O0", "-g"])
assert not set(cmake.compiler_flags["fflags"])
@pytest.mark.xfail(reason="Broken, needs to be fixed")
def test_compiler_flags_from_compiler_and_dependent(self):
client = Spec("cmake-client %clang@12.2.0 platform=test os=fe target=fe cflags==-g")
client.concretize()
@@ -2093,7 +2094,25 @@ def test_result_specs_is_not_empty(self, specs):
result, _, _ = solver.driver.solve(setup, specs, reuse=[])
assert result.specs
assert not result.unsolved_specs
@pytest.mark.regression("38664")
def test_unsolved_specs_raises_error(self, monkeypatch, mock_packages, config):
"""Check that the solver raises an exception when input specs are not
satisfied.
"""
specs = [Spec("zlib")]
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
simulate_unsolved_property = list((x, None) for x in specs)
monkeypatch.setattr(spack.solver.asp.Result, "unsolved_specs", simulate_unsolved_property)
with pytest.raises(
spack.solver.asp.InternalConcretizerError,
match="the solver completed but produced specs",
):
solver.driver.solve(setup, specs, reuse=[])
@pytest.mark.regression("36339")
def test_compiler_match_constraints_when_selected(self):

@@ -11,6 +11,7 @@
import spack.repo
import spack.solver.asp
import spack.spec
from spack.environment.environment import ViewDescriptor
from spack.version import Version
pytestmark = [
@@ -19,6 +20,17 @@
]
def _concretize_with_reuse(*, root_str, reused_str):
reused_spec = spack.spec.Spec(reused_str).concretized()
setup = spack.solver.asp.SpackSolverSetup(tests=False)
driver = spack.solver.asp.PyclingoDriver()
result, _, _ = driver.solve(
setup, [spack.spec.Spec(f"{root_str} ^{reused_str}")], reuse=[reused_spec]
)
root = result.specs[0]
return root, reused_spec
@pytest.fixture
def runtime_repo(config):
repo = os.path.join(spack.paths.repos_path, "compiler_runtime.test")
@@ -60,3 +72,59 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
assert a.dependencies("gcc-runtime")
assert a.dependencies("b")
assert not b.dependencies("gcc-runtime")
@pytest.mark.parametrize(
"root_str,reused_str,expected,nruntime",
[
# The reused runtime is older than we need, thus we'll add a more recent one for a
("a%gcc@10.2.1", "b%gcc@4.5.0", {"a": "gcc-runtime@10.2.1", "b": "gcc-runtime@4.5.0"}, 2),
# The root is compiled with an older compiler, thus we'll reuse the runtime from b
("a%gcc@4.5.0", "b%gcc@10.2.1", {"a": "gcc-runtime@10.2.1", "b": "gcc-runtime@10.2.1"}, 1),
],
)
def test_reusing_specs_with_gcc_runtime(root_str, reused_str, expected, nruntime, runtime_repo):
"""Tests that we can reuse specs with a "gcc-runtime" leaf node. In particular, checks
that the semantics for gcc-runtime versions account for reused packages too.
"""
root, reused_spec = _concretize_with_reuse(root_str=root_str, reused_str=reused_str)
assert f"{expected['b']}" in reused_spec
runtime_a = root.dependencies("gcc-runtime")[0]
assert runtime_a.satisfies(expected["a"])
runtime_b = root["b"].dependencies("gcc-runtime")[0]
assert runtime_b.satisfies(expected["b"])
runtimes = [x for x in root.traverse() if x.name == "gcc-runtime"]
assert len(runtimes) == nruntime
@pytest.mark.parametrize(
"root_str,reused_str,expected,not_expected",
[
# Ensure that, whether we have multiple runtimes in the DAG or not,
# we always link only the latest version
("a%gcc@10.2.1", "b%gcc@4.5.0", ["gcc-runtime@10.2.1"], ["gcc-runtime@4.5.0"]),
("a%gcc@4.5.0", "b%gcc@10.2.1", ["gcc-runtime@10.2.1"], ["gcc-runtime@4.5.0"]),
],
)
def test_views_can_handle_duplicate_runtime_nodes(
root_str, reused_str, expected, not_expected, runtime_repo, tmp_path, monkeypatch
):
"""Tests that an environment is able to select the latest version of a runtime node to be
linked in a view, in case more than one compatible version is in the DAG.
"""
root, reused_spec = _concretize_with_reuse(root_str=root_str, reused_str=reused_str)
# Mock the installation status to allow selecting nodes for the view
monkeypatch.setattr(spack.spec.Spec, "installed", True)
nodes = list(root.traverse())
view = ViewDescriptor(str(tmp_path), str(tmp_path))
candidate_specs = view.specs_for_view(nodes)
for x in expected:
assert any(node.satisfies(x) for node in candidate_specs)
for x in not_expected:
assert all(not node.satisfies(x) for node in candidate_specs)

@@ -25,7 +25,7 @@ def test_build_and_run_images(minimal_configuration):
# Test the output of the build property
build = writer.build
assert build.image == "spack/ubuntu-bionic:latest"
assert build.image == "spack/ubuntu-bionic:develop"
def test_packages(minimal_configuration):

@@ -12,7 +12,7 @@
@pytest.mark.parametrize(
"image,spack_version,expected",
[
("ubuntu:18.04", "develop", ("spack/ubuntu-bionic", "latest")),
("ubuntu:18.04", "develop", ("spack/ubuntu-bionic", "develop")),
("ubuntu:18.04", "0.14.0", ("spack/ubuntu-bionic", "0.14.0")),
],
)

@@ -24,8 +24,6 @@ class PyTorch(PythonPackage, CudaPackage):
homepage = "https://pytorch.org/"
git = "https://github.com/pytorch/pytorch.git"
maintainers("adamjstewart")
# Exact set of modules is version- and variant-specific, just attempt to import the
# core libraries to ensure that the package was successfully installed.
import_modules = ["torch", "torch.autograd", "torch.nn", "torch.utils"]

@@ -79,7 +79,7 @@ def test_error_on_anonymous_dependency(config, mock_packages):
[
("maintainers-1", ["user1", "user2"]),
# Extends PythonPackage
("py-extension1", ["adamjstewart", "user1", "user2"]),
("py-extension1", ["user1", "user2"]),
# Extends maintainers-1
("maintainers-3", ["user0", "user1", "user2", "user3"]),
],

@@ -0,0 +1,114 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import sys
import pytest
import llnl.util.lang
import spack.config
import spack.extensions
class MockConfigEntryPoint:
def __init__(self, tmp_path):
self.dir = tmp_path
self.name = "mypackage_config"
def load(self):
etc_path = self.dir.joinpath("spack/etc")
etc_path.mkdir(exist_ok=True, parents=True)
f = self.dir / "spack/etc/config.yaml"
with open(f, "w") as fh:
fh.write("config:\n install_tree:\n root: /spam/opt\n")
def ep():
return self.dir / "spack/etc"
return ep
class MockExtensionsEntryPoint:
def __init__(self, tmp_path):
self.dir = tmp_path
self.name = "mypackage_extensions"
def load(self):
cmd_path = self.dir.joinpath("spack/spack-myext/myext/cmd")
cmd_path.mkdir(exist_ok=True, parents=True)
f = self.dir / "spack/spack-myext/myext/cmd/spam.py"
with open(f, "w") as fh:
fh.write("description = 'hello world extension command'\n")
fh.write("section = 'test command'\n")
fh.write("level = 'long'\n")
fh.write("def setup_parser(subparser):\n pass\n")
fh.write("def spam(parser, args):\n print('spam for all!')\n")
def ep():
return self.dir / "spack/spack-myext"
return ep
def entry_points_factory(tmp_path):
def entry_points(group=None):
if group == "spack.config":
return (MockConfigEntryPoint(tmp_path),)
elif group == "spack.extensions":
return (MockExtensionsEntryPoint(tmp_path),)
return ()
return entry_points
@pytest.fixture()
def mock_get_entry_points(tmp_path, monkeypatch):
entry_points = entry_points_factory(tmp_path)
monkeypatch.setattr(llnl.util.lang, "get_entry_points", entry_points)
def test_spack_entry_point_config(tmp_path, mock_get_entry_points):
"""Test config scope entry point"""
config_paths = dict(spack.config.config_paths_from_entry_points())
config_path = config_paths.get("plugin-mypackage_config")
my_config_path = tmp_path / "spack/etc"
if config_path is None:
raise ValueError("Did not find entry point config in %s" % str(config_paths))
else:
assert os.path.samefile(config_path, my_config_path)
config = spack.config.create()
assert config.get("config:install_tree:root", scope="plugin-mypackage_config") == "/spam/opt"
def test_spack_entry_point_extension(tmp_path, mock_get_entry_points):
"""Test config scope entry point"""
my_ext = tmp_path / "spack/spack-myext"
extensions = spack.extensions.get_extension_paths()
found = bool([ext for ext in extensions if os.path.samefile(ext, my_ext)])
if not found:
raise ValueError("Did not find extension in %s" % ", ".join(extensions))
extensions = spack.extensions.extension_paths_from_entry_points()
found = bool([ext for ext in extensions if os.path.samefile(ext, my_ext)])
if not found:
raise ValueError("Did not find extension in %s" % ", ".join(extensions))
root = spack.extensions.load_extension("myext")
assert os.path.samefile(root, my_ext)
module = spack.extensions.get_module("spam")
assert module is not None
@pytest.mark.skipif(sys.version_info[:2] < (3, 8), reason="Python>=3.8 required")
def test_llnl_util_lang_get_entry_points(tmp_path, monkeypatch):
import importlib.metadata # type: ignore # novermin
monkeypatch.setattr(importlib.metadata, "entry_points", entry_points_factory(tmp_path))
entry_points = list(llnl.util.lang.get_entry_points(group="spack.config"))
assert isinstance(entry_points[0], MockConfigEntryPoint)
entry_points = list(llnl.util.lang.get_entry_points(group="spack.extensions"))
assert isinstance(entry_points[0], MockExtensionsEntryPoint)
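These tests assume llnl.util.lang.get_entry_points is a thin layer over importlib.metadata. A generic sketch of that consumption pattern (an approximation for Python 3.8+, not the actual helper):
import importlib.metadata
from typing import Iterator

def iter_entry_points(group: str) -> Iterator[importlib.metadata.EntryPoint]:
    eps = importlib.metadata.entry_points()
    if hasattr(eps, "select"):  # Python >= 3.10 returns a selectable collection
        yield from eps.select(group=group)
    else:  # Python 3.8/3.9 return a dict mapping group -> list of entry points
        yield from eps.get(group, [])

for ep in iter_entry_points("spack.config"):
    print(ep.name, ep.load())  # ep.load() returns whatever object the plugin registered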

@@ -141,6 +141,7 @@ def test_partial_install_delete_prefix_and_stage(install_mockery, mock_fetch, wo
assert s.package.spec.installed
@pytest.mark.not_on_windows("Fails spuriously on Windows")
@pytest.mark.disable_clean_stage_check
def test_failing_overwrite_install_should_keep_previous_installation(
mock_fetch, install_mockery, working_env

@@ -9,6 +9,7 @@
import hashlib
import json
import os
import pathlib
from contextlib import contextmanager
import spack.environment as ev
@@ -172,6 +173,12 @@ def test_buildcache_push_with_base_image_command(
dst_image = ImageReference.from_string(f"dst.example.com/image:{tag}")
retrieved_manifest, retrieved_config = get_manifest_and_config(dst_image)
# Check that the media type is OCI
assert retrieved_manifest["mediaType"] == "application/vnd.oci.image.manifest.v1+json"
assert (
retrieved_manifest["config"]["mediaType"] == "application/vnd.oci.image.config.v1+json"
)
# Check that the base image layer is first.
assert retrieved_manifest["layers"][0]["digest"] == str(tar_gz_digest)
assert retrieved_config["rootfs"]["diff_ids"][0] == str(tar_digest)
@@ -189,3 +196,93 @@ def test_buildcache_push_with_base_image_command(
# And verify that all layers including the base layer are present
for layer in retrieved_manifest["layers"]:
assert blob_exists(dst_image, digest=Digest.from_string(layer["digest"]))
assert layer["mediaType"] == "application/vnd.oci.image.layer.v1.tar+gzip"
def test_uploading_with_base_image_in_docker_image_manifest_v2_format(
tmp_path: pathlib.Path, mutable_database, disable_parallel_buildcache_push
):
"""If the base image uses an old manifest schema, Spack should also use that.
That is necessary for container images to work with Apptainer, which is rather strict about
mismatching manifest/layer types."""
registry_src = InMemoryOCIRegistry("src.example.com")
registry_dst = InMemoryOCIRegistry("dst.example.com")
base_image = ImageReference.from_string("src.example.com/my-base-image:latest")
with oci_servers(registry_src, registry_dst):
mirror("add", "oci-test", "oci://dst.example.com/image")
# Create a dummy base image (blob, config, manifest) in registry A in the Docker Image
# Manifest V2 format.
rootfs = tmp_path / "rootfs"
(rootfs / "bin").mkdir(parents=True)
(rootfs / "bin" / "sh").write_text("hello world")
tarball = tmp_path / "base.tar.gz"
with gzip_compressed_tarfile(tarball) as (tar, tar_gz_checksum, tar_checksum):
tar.add(rootfs, arcname=".")
tar_gz_digest = Digest.from_sha256(tar_gz_checksum.hexdigest())
tar_digest = Digest.from_sha256(tar_checksum.hexdigest())
upload_blob(base_image, str(tarball), tar_gz_digest)
config = {
"created": "2015-10-31T22:22:56.015925234Z",
"author": "Foo <example@example.com>",
"architecture": "amd64",
"os": "linux",
"config": {
"User": "foo",
"Memory": 2048,
"MemorySwap": 4096,
"CpuShares": 8,
"ExposedPorts": {"8080/tcp": {}},
"Env": ["PATH=/usr/bin:/bin"],
"Entrypoint": ["/bin/sh"],
"Cmd": ["-c", "'echo hello world'"],
"Volumes": {"/x": {}},
"WorkingDir": "/",
},
"rootfs": {"diff_ids": [str(tar_digest)], "type": "layers"},
"history": [
{
"created": "2015-10-31T22:22:54.690851953Z",
"created_by": "/bin/sh -c #(nop) ADD file:a3bc1e842b69636f9df5256c49c5374fb4eef1e281fe3f282c65fb853ee171c5 in /",
}
],
}
config_file = tmp_path / "config.json"
config_file.write_text(json.dumps(config))
config_digest = Digest.from_sha256(hashlib.sha256(config_file.read_bytes()).hexdigest())
upload_blob(base_image, str(config_file), config_digest)
manifest = {
"schemaVersion": 2,
"mediaType": "application/vnd.docker.distribution.manifest.v2+json",
"config": {
"mediaType": "application/vnd.docker.container.image.v1+json",
"size": config_file.stat().st_size,
"digest": str(config_digest),
},
"layers": [
{
"mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip",
"size": tarball.stat().st_size,
"digest": str(tar_gz_digest),
}
],
}
upload_manifest(base_image, manifest)
# Finally upload some package to registry B with registry A's image as base
buildcache("push", "--base-image", str(base_image), "oci-test", "mpileaks^mpich")
# Should have some manifests uploaded to registry B now.
assert registry_dst.manifests
# Verify that all manifests are in the Docker Image Manifest V2 format, not OCI.
# And also check that we're not using annotations, which is an OCI-only "feature".
for m in registry_dst.manifests.values():
assert m["mediaType"] == "application/vnd.docker.distribution.manifest.v2+json"
assert m["config"]["mediaType"] == "application/vnd.docker.container.image.v1+json"
for layer in m["layers"]:
assert layer["mediaType"] == "application/vnd.docker.image.rootfs.diff.tar.gzip"
assert "annotations" not in m

@@ -17,6 +17,7 @@
from typing import Callable, Dict, List, Optional, Pattern, Tuple
from urllib.request import Request
import spack.oci.oci
from spack.oci.image import Digest
from spack.oci.opener import OCIAuthHandler
@@ -171,7 +172,7 @@ def __init__(self, domain: str, allow_single_post: bool = True) -> None:
self.blobs: Dict[str, bytes] = {}
# Map from (name, tag) to manifest
self.manifests: Dict[Tuple[str, str], Dict] = {}
self.manifests: Dict[Tuple[str, str], dict] = {}
def index(self, req: Request):
return MockHTTPResponse.with_json(200, "OK", body={})
@@ -225,15 +226,12 @@ def put_session(self, req: Request):
def put_manifest(self, req: Request, name: str, ref: str):
# In requests, Python runs header.capitalize().
content_type = req.get_header("Content-type")
assert content_type in (
"application/vnd.oci.image.manifest.v1+json",
"application/vnd.oci.image.index.v1+json",
)
assert content_type in spack.oci.oci.all_content_type
index_or_manifest = json.loads(self._require_data(req))
# Verify that we have all blobs (layers for manifest, manifests for index)
if content_type == "application/vnd.oci.image.manifest.v1+json":
if content_type in spack.oci.oci.manifest_content_type:
for layer in index_or_manifest["layers"]:
assert layer["digest"] in self.blobs, "Missing blob while uploading manifest"

@@ -0,0 +1,19 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from datetime import date
import spack.projections
import spack.spec
def test_projection_expansion(mock_packages, monkeypatch):
"""Test that env variables and spack config variables are expanded in projections"""
monkeypatch.setenv("FOO_ENV_VAR", "test-string")
projections = {"all": "{name}-{version}/$FOO_ENV_VAR/$date"}
spec = spack.spec.Spec("fake@1.0")
projection = spack.projections.get_projection(projections, spec)
assert "{name}-{version}/test-string/%s" % date.today().strftime("%Y-%m-%d") == projection

@@ -906,6 +906,13 @@ def test_version_list_normalization():
assert ver("1.0:2.0,=1.0,ref=1.0") == ver(["1.0:2.0"])
def test_version_list_connected_union_of_disjoint_ranges():
# Make sure that we also simplify lists of ranges if their intersection is empty, but their
# union is connected.
assert ver("1.0:2.0,2.1,2.2:3,4:6") == ver(["1.0:6"])
assert ver("1.0:1.2,1.3:2") == ver("1.0:1.5,1.6:2")
@pytest.mark.parametrize("version", ["=1.2", "git.ref=1.2", "1.2"])
def test_version_comparison_with_list_fails(version):
vlist = VersionList(["=1.3"])

@@ -695,26 +695,35 @@ def satisfies(self, other: Union["ClosedOpenRange", ConcreteVersion, "VersionLis
def overlaps(self, other: Union["ClosedOpenRange", ConcreteVersion, "VersionList"]) -> bool:
return self.intersects(other)
def union(self, other: Union["ClosedOpenRange", ConcreteVersion, "VersionList"]):
def _union_if_not_disjoint(
self, other: Union["ClosedOpenRange", ConcreteVersion]
) -> Optional["ClosedOpenRange"]:
"""Same as union, but returns None when the union is not connected. This function is not
implemented for version lists as right-hand side, as that makes little sense."""
if isinstance(other, StandardVersion):
return self if self.lo <= other < self.hi else VersionList([self, other])
return self if self.lo <= other < self.hi else None
if isinstance(other, GitVersion):
return self if self.lo <= other.ref_version < self.hi else VersionList([self, other])
return self if self.lo <= other.ref_version < self.hi else None
if isinstance(other, ClosedOpenRange):
# Notice <= cause we want union(1:2, 3:4) = 1:4.
if self.lo <= other.hi and other.lo <= self.hi:
return ClosedOpenRange(min(self.lo, other.lo), max(self.hi, other.hi))
return (
ClosedOpenRange(min(self.lo, other.lo), max(self.hi, other.hi))
if self.lo <= other.hi and other.lo <= self.hi
else None
)
return VersionList([self, other])
raise TypeError(f"Unexpected type {type(other)}")
def union(self, other: Union["ClosedOpenRange", ConcreteVersion, "VersionList"]):
if isinstance(other, VersionList):
v = other.copy()
v.add(self)
return v
raise ValueError(f"Unexpected type {type(other)}")
result = self._union_if_not_disjoint(other)
return result if result is not None else VersionList([self, other])
def intersection(self, other: Union["ClosedOpenRange", ConcreteVersion]):
# range - version -> singleton or nothing.
@@ -731,20 +740,21 @@ class VersionList:
"""Sorted, non-redundant list of Version and ClosedOpenRange elements."""
def __init__(self, vlist=None):
self.versions: List[StandardVersion, GitVersion, ClosedOpenRange] = []
if vlist is not None:
if isinstance(vlist, str):
vlist = from_string(vlist)
if isinstance(vlist, VersionList):
self.versions = vlist.versions
else:
self.versions = [vlist]
self.versions: List[Union[StandardVersion, GitVersion, ClosedOpenRange]] = []
if vlist is None:
pass
elif isinstance(vlist, str):
vlist = from_string(vlist)
if isinstance(vlist, VersionList):
self.versions = vlist.versions
else:
for v in vlist:
self.add(ver(v))
self.versions = [vlist]
else:
for v in vlist:
self.add(ver(v))
def add(self, item):
if isinstance(item, ConcreteVersion):
def add(self, item: Union[StandardVersion, GitVersion, ClosedOpenRange, "VersionList"]):
if isinstance(item, (StandardVersion, GitVersion)):
i = bisect_left(self, item)
# Only insert when prev and next are not intersected.
if (i == 0 or not item.intersects(self[i - 1])) and (
@@ -755,16 +765,22 @@ def add(self, item):
elif isinstance(item, ClosedOpenRange):
i = bisect_left(self, item)
# Note: can span multiple concrete versions to the left,
# For instance insert 1.2: into [1.2, hash=1.2, 1.3]
# would bisect to i = 1.
while i > 0 and item.intersects(self[i - 1]):
item = item.union(self[i - 1])
# Note: can span multiple concrete versions to the left (as well as to the right).
# For instance insert 1.2: into [1.2, hash=1.2, 1.3, 1.4:1.5]
# would bisect at i = 1 and merge i = 0 too.
while i > 0:
union = item._union_if_not_disjoint(self[i - 1])
if union is None: # disjoint
break
item = union
del self.versions[i - 1]
i -= 1
while i < len(self) and item.intersects(self[i]):
item = item.union(self[i])
while i < len(self):
union = item._union_if_not_disjoint(self[i])
if union is None:
break
item = union
del self.versions[i]
self.versions.insert(i, item)
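The reworked loop in add() only folds neighbours whose union stays connected. A generic sketch of the same merge over plain integer half-open ranges (not Spack version objects):
from bisect import bisect_left
from typing import List, Optional, Tuple

Range = Tuple[int, int]  # closed-open [lo, hi)

def union_if_connected(a: Range, b: Range) -> Optional[Range]:
    return (min(a[0], b[0]), max(a[1], b[1])) if a[0] <= b[1] and b[0] <= a[1] else None

def add_range(ranges: List[Range], item: Range) -> None:
    i = bisect_left(ranges, item)
    while i > 0 and (merged := union_if_connected(item, ranges[i - 1])) is not None:
        item = merged
        del ranges[i - 1]
        i -= 1
    while i < len(ranges) and (merged := union_if_connected(item, ranges[i])) is not None:
        item = merged
        del ranges[i]
    ranges.insert(i, item)

rs: List[Range] = [(1, 2), (2, 3), (4, 6)]
add_range(rs, (3, 4))
print(rs)  # [(1, 6)] -- connected neighbours collapse into a single range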
@@ -798,16 +814,20 @@ def copy(self):
def lowest(self) -> Optional[StandardVersion]:
"""Get the lowest version in the list."""
return None if not self else self[0]
return next((v for v in self.versions if isinstance(v, StandardVersion)), None)
def highest(self) -> Optional[StandardVersion]:
"""Get the highest version in the list."""
return None if not self else self[-1]
return next((v for v in reversed(self.versions) if isinstance(v, StandardVersion)), None)
def highest_numeric(self) -> Optional[StandardVersion]:
"""Get the highest numeric version in the list."""
numeric_versions = list(filter(lambda v: str(v) not in infinity_versions, self.versions))
return None if not any(numeric_versions) else numeric_versions[-1]
numeric = (
v
for v in reversed(self.versions)
if isinstance(v, StandardVersion) and not v.isdevelop()
)
return next(numeric, None)
def preferred(self) -> Optional[StandardVersion]:
"""Get the preferred (latest) version in the list."""

@@ -154,11 +154,13 @@ ignore_missing_imports = true
'boto3',
'botocore',
'distro',
'importlib.metadata',
'jinja2',
'jsonschema',
'macholib',
'markupsafe',
'numpy',
'pkg_resources',
'pyrsistent',
'pytest',
'ruamel.yaml',

@@ -330,7 +330,7 @@ protected-publish:
e4s-generate:
extends: [ ".e4s", ".generate-x86_64"]
image: ghcr.io/spack/ubuntu20.04-runner-amd64-gcc-11.4:2023.08.01
image: ecpe4s/ubuntu22.04-runner-amd64-gcc-11.4:2024.03.01
e4s-build:
extends: [ ".e4s", ".build" ]
@@ -353,7 +353,7 @@ e4s-build:
e4s-neoverse-v2-generate:
extends: [ ".e4s-neoverse-v2", ".generate-neoverse-v2" ]
image: ghcr.io/spack/ubuntu22.04-runner-arm64-gcc-11.4:2024.01.01
image: ecpe4s/ubuntu22.04-runner-arm64-gcc-11.4:2024.03.01
e4s-neoverse-v2-build:
extends: [ ".e4s-neoverse-v2", ".build" ]
@@ -376,7 +376,7 @@ e4s-neoverse-v2-build:
e4s-neoverse_v1-generate:
extends: [ ".e4s-neoverse_v1", ".generate-neoverse_v1" ]
image: ghcr.io/spack/ubuntu20.04-runner-arm64-gcc-11.4:2023.08.01
image: ecpe4s/ubuntu22.04-runner-arm64-gcc-11.4:2024.03.01
e4s-neoverse_v1-build:
extends: [ ".e4s-neoverse_v1", ".build" ]
@@ -399,7 +399,7 @@ e4s-neoverse_v1-build:
e4s-rocm-external-generate:
extends: [ ".e4s-rocm-external", ".generate-x86_64"]
image: ghcr.io/spack/ubuntu20.04-runner-amd64-gcc-11.4-rocm5.4.3:2023.08.01
image: ecpe4s/ubuntu22.04-runner-amd64-gcc-11.4-rocm5.7.1:2024.03.01
e4s-rocm-external-build:
extends: [ ".e4s-rocm-external", ".build" ]

@@ -151,7 +151,6 @@ spack:
# - alquimia # pflotran: petsc-3.19.4-c6pmpdtpzarytxo434zf76jqdkhdyn37/lib/petsc/conf/rules:169: material_aux.o] Error 1: fortran errors
# - amrex # disabled temporarily pending resolution of unreproducible CI failure
# - archer # subsumed by llvm +omp_tsan
# - axom # axom: CMake Error at axom/sidre/cmake_install.cmake:154 (file): file INSTALL cannot find "/tmp/gitlab-runner-2/spack-stage/spack-stage-axom-0.8.1-jvol6riu34vuyqvrd5ft2gyhrxdqvf63/spack-build-jvol6ri/lib/fortran/axom_spio.mod": No such file or directory.
# - bricks # bricks: clang-15: error: clang frontend command failed with exit code 134 (use -v to see invocation)
# - dealii # llvm@14.0.6: ?; intel-tbb@2020.3: clang-15: error: unknown argument: '-flifetime-dse=1'; assimp@5.2.5: clang-15: error: clang frontend command failed with exit code 134 (use -v to see invocation)

@@ -147,7 +147,6 @@ spack:
# HOLDING THESE BACK UNTIL CRAY SLES CAPACITY IS EXPANDED AT UO
# - alquimia
# - amrex
# - archer
# - axom
# - bricks
# - dealii

@@ -7,7 +7,7 @@ spack:
packages:
all:
require: '%gcc@11.4.0 target=neoverse_v2'
require: '%gcc target=neoverse_v2'
providers:
blas: [openblas]
mpi: [mpich]
@@ -31,18 +31,13 @@ spack:
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mesa:
version: [21.3.8]
mpi:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc'
ncurses:
require: '@6.3 +termlib'
tbb:
require: intel-tbb
boost:
version: [1.79.0]
variants: +atomic +chrono +container +date_time +exception +filesystem +graph
+iostreams +locale +log +math +mpi +multithreaded +program_options +random
+regex +serialization +shared +signals +stacktrace +system +test +thread +timer
@@ -156,6 +151,7 @@ spack:
- umap
- umpire
- upcxx
- veloc
- wannier90
- xyce +mpi +shared +pymi +pymi_static_tpls
# INCLUDED IN ECP DAV CPU
@@ -170,18 +166,39 @@ spack:
- py-cinemasci
- sz
- unifyfs
- veloc
- laghos
# - visit # silo: https://github.com/spack/spack/issues/39538
- vtk-m
- zfp
# --
# - archer # part of llvm +omp_tsan
# - bricks ~cuda # not respecting target=aarch64?
# - dealii # slepc: make[1]: *** internal error: invalid --jobserver-auth string 'fifo:/tmp/GMfifo1313'.
# - geopm # geopm: https://github.com/spack/spack/issues/38795
# - glvis # glvis: https://github.com/spack/spack/issues/42839
# - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp # py-numcodecs@0.7.3: gcc: error: unrecognized command-line option '-mno-sse2'
# - variorum # variorum: https://github.com/spack/spack/issues/38786
# PYTHON PACKAGES
- opencv +python3
- py-horovod
- py-jax
- py-jupyterlab
- py-matplotlib
- py-mpi4py
- py-notebook
- py-numba
- py-numpy
- py-openai
- py-pandas
- py-plotly
- py-pooch
- py-pytest
- py-scikit-learn
- py-scipy
- py-seaborn
- py-tensorflow
- py-torch
# CUDA NOARCH
- flux-core +cuda
- hpctoolkit +cuda
@@ -332,7 +349,7 @@ spack:
ci:
pipeline-gen:
- build-job:
image: "ghcr.io/spack/ubuntu22.04-runner-arm64-gcc-11.4:2024.01.01"
image: ecpe4s/ubuntu22.04-runner-arm64-gcc-11.4:2024.03.01
cdash:
build-group: E4S ARM Neoverse V2

View File

@@ -7,7 +7,7 @@ spack:
packages:
all:
require: '%gcc@11.4.0 target=neoverse_v1'
require: '%gcc target=neoverse_v1'
providers:
blas: [openblas]
mpi: [mpich]
@@ -35,12 +35,9 @@ spack:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc'
ncurses:
require: '@6.3 +termlib'
tbb:
require: intel-tbb
boost:
version: [1.79.0]
variants: +atomic +chrono +container +date_time +exception +filesystem +graph
+iostreams +locale +log +math +mpi +multithreaded +program_options +random
+regex +serialization +shared +signals +stacktrace +system +test +thread +timer
@@ -95,6 +92,7 @@ spack:
- hypre
- kokkos +openmp
- kokkos-kernels +openmp
- laghos
- lammps
- lbann
- legion
@@ -173,13 +171,34 @@ spack:
- vtk-m
- zfp
# --
# - archer # part of llvm +omp_tsan
# - bricks ~cuda # not respecting target=aarch64?
# - dealii # slepc: make[1]: *** internal error: invalid --jobserver-auth string 'fifo:/tmp/GMfifo1313'.
# - geopm # geopm: https://github.com/spack/spack/issues/38795
# - glvis # glvis: https://github.com/spack/spack/issues/42839
# - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp # py-numcodecs@0.7.3: gcc: error: unrecognized command-line option '-mno-sse2'
# - variorum # variorum: https://github.com/spack/spack/issues/38786
# PYTHON PACKAGES
- opencv +python3
- py-horovod
- py-jax
- py-jupyterlab
- py-matplotlib
- py-mpi4py
- py-notebook
- py-numba
- py-numpy
- py-openai
- py-pandas
- py-plotly
- py-pooch
- py-pytest
- py-scikit-learn
- py-scipy
- py-seaborn
- py-tensorflow
- py-torch
# CUDA NOARCH
- flux-core +cuda
- hpctoolkit +cuda
@@ -330,7 +349,7 @@ spack:
ci:
pipeline-gen:
- build-job:
image: "ghcr.io/spack/ubuntu20.04-runner-arm64-gcc-11.4:2023.08.01"
image: ecpe4s/ubuntu22.04-runner-arm64-gcc-11.4:2024.03.01
cdash:
build-group: E4S ARM Neoverse V1

View File

@@ -13,9 +13,10 @@ spack:
- "%oneapi"
providers:
blas: [openblas]
mpi: [mpich]
tbb: [intel-tbb]
variants: +mpi
gl:
require: osmesa
elfutils:
variants: ~nls
gcc-runtime:
@@ -38,14 +39,10 @@ spack:
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mesa:
version: [21.3.8]
mpi:
require: 'mpich@4:'
mpich:
require: '~wrapperrpath ~hwloc'
py-cryptography:
require: '@38.0.1'
unzip:
require: '%gcc'
binutils:
@@ -63,8 +60,6 @@ spack:
require: '%gcc'
openssh:
require: '%gcc'
libffi:
require: "@3.4.4"
dyninst:
require: "%gcc"
bison:
@@ -108,6 +103,7 @@ spack:
- hypre
- kokkos +openmp
- kokkos-kernels +openmp
- laghos
- lammps
- lbann
- legion
@@ -193,10 +189,32 @@ spack:
# - upcxx # upcxx: /opt/intel/oneapi/mpi/2021.10.0//libfabric/bin/fi_info: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory
# --
# - bricks ~cuda # bricks: /opt/intel/oneapi/compiler/2024.0/bin/sycl-post-link: error while loading shared libraries: libonnxruntime.1.12.22.721.so: cannot open shared object file: No such file or directory
# - glvis ^llvm # glvis: https://github.com/spack/spack/issues/42839
# - pdt # pdt: pdbType.cc:193:21: warning: ISO C++11 does not allow conversion from string literal to 'char *' [-Wwritable-strings]
# - quantum-espresso # quantum-espresso@7.2 /i3fqdx5: warning: <unknown>:0:0: loop not unroll-and-jammed: the optimizer was unable to perform the requested transformation; the transformation might be disabled or specified as part of an unsupported transformation ordering
# - tau +mpi +python +syscall # pdt: pdbType.cc:193:21: warning: ISO C++11 does not allow conversion from string literal to 'char *' [-Wwritable-strings]
# PYTHON PACKAGES
- opencv +python3
- py-jupyterlab
- py-notebook
- py-numpy
- py-openai
- py-pandas
- py-plotly
- py-pooch
- py-pytest
- py-scikit-learn
- py-scipy
- py-seaborn
- py-mpi4py
- py-numba
# - py-horovod # error
# - py-jax # error
# - py-matplotlib # error
# - py-tensorflow # error
# - py-torch # error
# GPU
- amrex +sycl
- tau +mpi +opencl +level_zero ~pdt +syscall # requires libdrm.so to be installed

View File

@@ -32,8 +32,6 @@ spack:
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mesa:
version: [21.3.8]
mpi:
require: mpich
mpich:
@@ -74,7 +72,6 @@ spack:
- drishti
- dxt-explorer
- dyninst
# - ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 ~paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp # +visit: libext, libxkbfile, libxrender, libxt, silo (https://github.com/spack/spack/issues/39538), cairo
- exaworks
- flecsi
- flit
@@ -97,6 +94,7 @@ spack:
- hypre
- kokkos +openmp
- kokkos-kernels +openmp
- laghos
- lammps
- lbann
- legion
@@ -158,6 +156,7 @@ spack:
- upcxx
- wannier90
- xyce +mpi +shared +pymi +pymi_static_tpls
# - ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 ~paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp # +visit: libext, libxkbfile, libxrender, libxt, silo (https://github.com/spack/spack/issues/39538), cairo
# INCLUDED IN ECP DAV CPU
- adios2
- ascent
@@ -176,13 +175,34 @@ spack:
- vtk-m
- zfp
# --
# - archer # part of llvm +omp_tsan
# - dealii # fltk: https://github.com/spack/spack/issues/38791
# - geopm # geopm: https://github.com/spack/spack/issues/38798
# - glvis # glvis: https://github.com/spack/spack/issues/42839
# - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp # py-numcodecs: gcc: error: unrecognized command line option '-mno-sse2'; did you mean '-mno-isel'? gcc: error: unrecognized command line option '-mno-avx2'
# - phist +mpi # ghost@develop: gcc-9: error: unrecognized command line option '-march=native'; did you mean '-mcpu=native'?
# - variorum # variorum: https://github.com/spack/spack/issues/38786
# PYTHON PACKAGES
- opencv +python3
- py-jax
- py-jupyterlab
- py-matplotlib
- py-mpi4py
- py-notebook
- py-numba
- py-numpy
- py-openai
- py-pandas
- py-plotly
- py-pooch
- py-pytest
- py-scikit-learn
- py-scipy
- py-seaborn
# - py-horovod # py-torch, py-tensorflow
# - py-tensorflow # error
# - py-torch # error
# CUDA NOARCH
- bricks +cuda
- cabana +cuda ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=70

View File

@@ -10,49 +10,15 @@ spack:
require: '%gcc target=x86_64_v3'
providers:
blas: [openblas]
mpi: [mpich]
variants: +mpi
binutils:
variants: +ld +gold +headers +libiberty ~nls
elfutils:
variants: ~nls
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=sockets,tcp,udp,rxm
libunwind:
variants: +pic +xz
openblas:
variants: threads=openmp
trilinos:
variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext
+ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu
+nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mesa:
version: [21.3.8]
tbb:
require: intel-tbb
mpi:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc'
ncurses:
require: '@6.3 +termlib'
tbb:
require: intel-tbb
boost:
version: [1.79.0]
variants: +atomic +chrono +container +date_time +exception +filesystem +graph
+iostreams +locale +log +math +mpi +multithreaded +program_options +random
+regex +serialization +shared +signals +stacktrace +system +test +thread +timer
cxxstd=17 visibility=global
libffi:
require: "@3.4.4"
vtk-m:
require: "+examples"
cuda:
version: [11.8.0]
openblas:
variants: threads=openmp
paraview:
# Don't build GUI support or GLX rendering for HPC/container deployments
require: "@5.11 ~qt+osmesa"
@@ -61,181 +27,181 @@ spack:
comgr:
buildable: false
externals:
- spec: comgr@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: comgr@5.7.1
prefix: /opt/rocm-5.7.1/
hip-rocclr:
buildable: false
externals:
- spec: hip-rocclr@5.4.3
prefix: /opt/rocm-5.4.3/hip
- spec: hip-rocclr@5.7.1
prefix: /opt/rocm-5.7.1/hip
hipblas:
buildable: false
externals:
- spec: hipblas@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: hipblas@5.7.1
prefix: /opt/rocm-5.7.1/
hipcub:
buildable: false
externals:
- spec: hipcub@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: hipcub@5.7.1
prefix: /opt/rocm-5.7.1/
hipfft:
buildable: false
externals:
- spec: hipfft@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: hipfft@5.7.1
prefix: /opt/rocm-5.7.1/
hipsparse:
buildable: false
externals:
- spec: hipsparse@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: hipsparse@5.7.1
prefix: /opt/rocm-5.7.1/
miopen-hip:
buildable: false
externals:
- spec: miopen-hip@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: miopen-hip@5.7.1
prefix: /opt/rocm-5.7.1/
miopengemm:
buildable: false
externals:
- spec: miopengemm@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: miopengemm@5.7.1
prefix: /opt/rocm-5.7.1/
rccl:
buildable: false
externals:
- spec: rccl@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: rccl@5.7.1
prefix: /opt/rocm-5.7.1/
rocblas:
buildable: false
externals:
- spec: rocblas@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: rocblas@5.7.1
prefix: /opt/rocm-5.7.1/
rocfft:
buildable: false
externals:
- spec: rocfft@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: rocfft@5.7.1
prefix: /opt/rocm-5.7.1/
rocm-clang-ocl:
buildable: false
externals:
- spec: rocm-clang-ocl@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: rocm-clang-ocl@5.7.1
prefix: /opt/rocm-5.7.1/
rocm-cmake:
buildable: false
externals:
- spec: rocm-cmake@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: rocm-cmake@5.7.1
prefix: /opt/rocm-5.7.1/
rocm-dbgapi:
buildable: false
externals:
- spec: rocm-dbgapi@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: rocm-dbgapi@5.7.1
prefix: /opt/rocm-5.7.1/
rocm-debug-agent:
buildable: false
externals:
- spec: rocm-debug-agent@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: rocm-debug-agent@5.7.1
prefix: /opt/rocm-5.7.1/
rocm-device-libs:
buildable: false
externals:
- spec: rocm-device-libs@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: rocm-device-libs@5.7.1
prefix: /opt/rocm-5.7.1/
rocm-gdb:
buildable: false
externals:
- spec: rocm-gdb@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: rocm-gdb@5.7.1
prefix: /opt/rocm-5.7.1/
rocm-opencl:
buildable: false
externals:
- spec: rocm-opencl@5.4.3
prefix: /opt/rocm-5.4.3/opencl
- spec: rocm-opencl@5.7.1
prefix: /opt/rocm-5.7.1/opencl
rocm-smi-lib:
buildable: false
externals:
- spec: rocm-smi-lib@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: rocm-smi-lib@5.7.1
prefix: /opt/rocm-5.7.1/
hip:
buildable: false
externals:
- spec: hip@5.4.3
prefix: /opt/rocm-5.4.3
- spec: hip@5.7.1
prefix: /opt/rocm-5.7.1
extra_attributes:
compilers:
c: /opt/rocm-5.4.3/llvm/bin/clang++
c++: /opt/rocm-5.4.3/llvm/bin/clang++
hip: /opt/rocm-5.4.3/hip/bin/hipcc
c: /opt/rocm-5.7.1/llvm/bin/clang++
c++: /opt/rocm-5.7.1/llvm/bin/clang++
hip: /opt/rocm-5.7.1/hip/bin/hipcc
hipify-clang:
buildable: false
externals:
- spec: hipify-clang@5.4.3
prefix: /opt/rocm-5.4.3
- spec: hipify-clang@5.7.1
prefix: /opt/rocm-5.7.1
llvm-amdgpu:
buildable: false
externals:
- spec: llvm-amdgpu@5.4.3
prefix: /opt/rocm-5.4.3/llvm
- spec: llvm-amdgpu@5.7.1
prefix: /opt/rocm-5.7.1/llvm
extra_attributes:
compilers:
c: /opt/rocm-5.4.3/llvm/bin/clang++
cxx: /opt/rocm-5.4.3/llvm/bin/clang++
c: /opt/rocm-5.7.1/llvm/bin/clang++
cxx: /opt/rocm-5.7.1/llvm/bin/clang++
hsakmt-roct:
buildable: false
externals:
- spec: hsakmt-roct@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: hsakmt-roct@5.7.1
prefix: /opt/rocm-5.7.1/
hsa-rocr-dev:
buildable: false
externals:
- spec: hsa-rocr-dev@5.4.3
prefix: /opt/rocm-5.4.3/
- spec: hsa-rocr-dev@5.7.1
prefix: /opt/rocm-5.7.1/
extra_attributes:
compilers:
c: /opt/rocm-5.4.3/llvm/bin/clang++
cxx: /opt/rocm-5.4.3/llvm/bin/clang++
c: /opt/rocm-5.7.1/llvm/bin/clang++
cxx: /opt/rocm-5.7.1/llvm/bin/clang++
roctracer-dev-api:
buildable: false
externals:
- spec: roctracer-dev-api@5.4.3
prefix: /opt/rocm-5.4.3
- spec: roctracer-dev-api@5.7.1
prefix: /opt/rocm-5.7.1
roctracer-dev:
buildable: false
externals:
- spec: roctracer-dev@4.5.3
prefix: /opt/rocm-5.4.3
prefix: /opt/rocm-5.7.1
rocprim:
buildable: false
externals:
- spec: rocprim@5.4.3
prefix: /opt/rocm-5.4.3
- spec: rocprim@5.7.1
prefix: /opt/rocm-5.7.1
rocrand:
buildable: false
externals:
- spec: rocrand@5.4.3
prefix: /opt/rocm-5.4.3
- spec: rocrand@5.7.1
prefix: /opt/rocm-5.7.1
hipsolver:
buildable: false
externals:
- spec: hipsolver@5.4.3
prefix: /opt/rocm-5.4.3
- spec: hipsolver@5.7.1
prefix: /opt/rocm-5.7.1
rocsolver:
buildable: false
externals:
- spec: rocsolver@5.4.3
prefix: /opt/rocm-5.4.3
- spec: rocsolver@5.7.1
prefix: /opt/rocm-5.7.1
rocsparse:
buildable: false
externals:
- spec: rocsparse@5.4.3
prefix: /opt/rocm-5.4.3
- spec: rocsparse@5.7.1
prefix: /opt/rocm-5.7.1
rocthrust:
buildable: false
externals:
- spec: rocthrust@5.4.3
prefix: /opt/rocm-5.4.3
- spec: rocthrust@5.7.1
prefix: /opt/rocm-5.7.1
rocprofiler-dev:
buildable: false
externals:
- spec: rocprofiler-dev@5.4.3
prefix: /opt/rocm-5.4.3
- spec: rocprofiler-dev@5.7.1
prefix: /opt/rocm-5.7.1
specs:
# ROCM NOARCH
@@ -262,7 +228,7 @@ spack:
- mfem +rocm amdgpu_target=gfx908
- petsc +rocm amdgpu_target=gfx908
- raja ~openmp +rocm amdgpu_target=gfx908
- slate +rocm amdgpu_target=gfx908
# - slate +rocm amdgpu_target=gfx908 # slate: hip/device_gescale_row_col.hip.cc:58:49: error: use of overloaded operator '*' is ambiguous (with operand types 'HIP_vector_type<double, 2>' and 'const HIP_vector_type<double, 2>')
- slepc +rocm amdgpu_target=gfx908 ^petsc +rocm amdgpu_target=gfx908
- strumpack ~slate +rocm amdgpu_target=gfx908
- sundials +rocm amdgpu_target=gfx908
@@ -303,7 +269,7 @@ spack:
- mfem +rocm amdgpu_target=gfx90a
- petsc +rocm amdgpu_target=gfx90a
- raja ~openmp +rocm amdgpu_target=gfx90a
- slate +rocm amdgpu_target=gfx90a
# - slate +rocm amdgpu_target=gfx90a # slate: hip/device_gescale_row_col.hip.cc:58:49: error: use of overloaded operator '*' is ambiguous (with operand types 'HIP_vector_type<double, 2>' and 'const HIP_vector_type<double, 2>')
- slepc +rocm amdgpu_target=gfx90a ^petsc +rocm amdgpu_target=gfx90a
- strumpack ~slate +rocm amdgpu_target=gfx90a
- sundials +rocm amdgpu_target=gfx90a
@@ -327,7 +293,7 @@ spack:
ci:
pipeline-gen:
- build-job:
image: "ghcr.io/spack/ubuntu20.04-runner-amd64-gcc-11.4-rocm5.4.3:2023.08.01"
image: ecpe4s/ubuntu22.04-runner-amd64-gcc-11.4-rocm5.7.1:2024.03.01
cdash:
build-group: E4S ROCm External

View File

@@ -31,18 +31,13 @@ spack:
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mesa:
version: [21.3.8]
mpi:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc'
ncurses:
require: '@6.3 +termlib'
tbb:
require: intel-tbb
boost:
version: [1.79.0]
variants: +atomic +chrono +container +date_time +exception +filesystem +graph
+iostreams +locale +log +math +mpi +multithreaded +program_options +random
+regex +serialization +shared +signals +stacktrace +system +test +thread +timer
@@ -106,6 +101,7 @@ spack:
- julia ^llvm ~clang ~gold ~polly targets=amdgpu,bpf,nvptx,webassembly
- kokkos +openmp
- kokkos-kernels +openmp
- laghos
- lammps
- lbann
- legion
@@ -187,201 +183,222 @@ spack:
- vtk-m
- zfp
# --
# - archer # submerged into llvm +libomp_tsan
# - geopm # geopm: https://github.com/spack/spack/issues/38795
# - glvis # glvis: https://github.com/spack/spack/issues/42839
# CUDA NOARCH
- bricks +cuda
- flux-core +cuda
- hpctoolkit +cuda
- papi +cuda
- tau +mpi +cuda +syscall
# --
# - legion +cuda # legion: needs NVIDIA driver
# PYTHON PACKAGES
- opencv +python3
- py-horovod
- py-jax
- py-jupyterlab
- py-matplotlib
- py-mpi4py
- py-notebook
- py-numba
- py-numpy
- py-openai
- py-pandas
- py-plotly
- py-pooch
- py-pytest
- py-scikit-learn
- py-scipy
- py-seaborn
- py-tensorflow
- py-torch
# CUDA 80
- amrex +cuda cuda_arch=80
- arborx +cuda cuda_arch=80 ^kokkos +wrapper
- cabana +cuda cuda_arch=80 ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=80
- caliper +cuda cuda_arch=80
- chai ~benchmarks ~tests +cuda cuda_arch=80 ^umpire ~shared
- cusz +cuda cuda_arch=80
- dealii +cuda cuda_arch=80
- ecp-data-vis-sdk ~rocm +adios2 ~ascent +hdf5 +vtkm +zfp +paraview +cuda cuda_arch=80 # +ascent fails because fides fetch error
- exago +mpi +python +raja +hiop ~rocm +cuda cuda_arch=80 ~ipopt ^hiop@1.0.0 ~sparse +mpi +raja ~rocm +cuda cuda_arch=80 #^raja@0.14.0
- flecsi +cuda cuda_arch=80
- ginkgo +cuda cuda_arch=80
- gromacs +cuda cuda_arch=80
- heffte +cuda cuda_arch=80
- hpx +cuda cuda_arch=80
- hypre +cuda cuda_arch=80
- kokkos +wrapper +cuda cuda_arch=80
- kokkos-kernels +cuda cuda_arch=80 ^kokkos +wrapper +cuda cuda_arch=80
- libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +cusz +mgard +cuda cuda_arch=80 ^cusz +cuda cuda_arch=80
- magma +cuda cuda_arch=80
- mfem +cuda cuda_arch=80
- mgard +serial +openmp +timing +unstructured +cuda cuda_arch=80
- omega-h +cuda cuda_arch=80
- parsec +cuda cuda_arch=80
- petsc +cuda cuda_arch=80
- py-torch +cuda cuda_arch=80
- raja +cuda cuda_arch=80
- slate +cuda cuda_arch=80
- slepc +cuda cuda_arch=80
- strumpack ~slate +cuda cuda_arch=80
- sundials +cuda cuda_arch=80
- superlu-dist +cuda cuda_arch=80
- tasmanian +cuda cuda_arch=80
- trilinos +cuda cuda_arch=80
- umpire ~shared +cuda cuda_arch=80
# INCLUDED IN ECP DAV CUDA
# - adios2 +cuda cuda_arch=80
# - ascent +cuda cuda_arch=80 # ascent: https://github.com/spack/spack/issues/38045
# - paraview +cuda cuda_arch=80
# - vtk-m +cuda cuda_arch=80
# - zfp +cuda cuda_arch=80
# --
# - lammps +cuda cuda_arch=80 # lammps: needs NVIDIA driver
# - upcxx +cuda cuda_arch=80 # upcxx: needs NVIDIA driver
# - axom +cuda cuda_arch=80 # axom: https://github.com/spack/spack/issues/29520
# - lbann +cuda cuda_arch=80 # lbann: https://github.com/spack/spack/issues/38788
# # CUDA NOARCH
# - bricks +cuda
# - flux-core +cuda
# - hpctoolkit +cuda
# - papi +cuda
# - tau +mpi +cuda +syscall
# # --
# # - legion +cuda # legion: needs NVIDIA driver
# CUDA 90
- amrex +cuda cuda_arch=90
- arborx +cuda cuda_arch=90 ^kokkos +wrapper
- cabana +cuda cuda_arch=90 ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=90
- caliper +cuda cuda_arch=90
- chai ~benchmarks ~tests +cuda cuda_arch=90 ^umpire ~shared
- cusz +cuda cuda_arch=90
- flecsi +cuda cuda_arch=90
- ginkgo +cuda cuda_arch=90
- gromacs +cuda cuda_arch=90
- heffte +cuda cuda_arch=90
- hpx +cuda cuda_arch=90
- kokkos +wrapper +cuda cuda_arch=90
- kokkos-kernels +cuda cuda_arch=90 ^kokkos +wrapper +cuda cuda_arch=90
- libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +cusz +mgard +cuda cuda_arch=90 ^cusz +cuda cuda_arch=90
- magma +cuda cuda_arch=90
- mfem +cuda cuda_arch=90
- mgard +serial +openmp +timing +unstructured +cuda cuda_arch=90
- parsec +cuda cuda_arch=90
- petsc +cuda cuda_arch=90
- py-torch +cuda cuda_arch=90
- raja +cuda cuda_arch=90
- slate +cuda cuda_arch=90
- slepc +cuda cuda_arch=90
- strumpack ~slate +cuda cuda_arch=90
- sundials +cuda cuda_arch=90
- superlu-dist +cuda cuda_arch=90
- trilinos +cuda cuda_arch=90
- umpire ~shared +cuda cuda_arch=90
# INCLUDED IN ECP DAV CUDA
- adios2 +cuda cuda_arch=90
# - ascent +cuda cuda_arch=90 # ascent: https://github.com/spack/spack/issues/38045
# - paraview +cuda cuda_arch=90 # paraview: InstallError: Incompatible cuda_arch=90
- vtk-m +cuda cuda_arch=90
- zfp +cuda cuda_arch=90
# --
# - axom +cuda cuda_arch=90 # axom: https://github.com/spack/spack/issues/29520
# - dealii +cuda cuda_arch=90 # dealii: https://github.com/spack/spack/issues/39532
# - ecp-data-vis-sdk ~rocm +adios2 +ascent +hdf5 +vtkm +zfp +paraview +cuda cuda_arch=90 # paraview: incompatible cuda_arch; vtk-m: CMake Error at CMake/VTKmWrappers.cmake:413 (message): vtkm_cont needs to be built STATIC as CUDA doesn't support virtual methods across dynamic library boundaries. You need to set the CMake option BUILD_SHARED_LIBS to `OFF` or (better) turn VTKm_NO_DEPRECATED_VIRTUAL to `ON`.
# - hypre +cuda cuda_arch=90 # concretizer: hypre +cuda requires cuda@:11, but cuda_arch=90 requires cuda@12:
# - lammps +cuda cuda_arch=90 # lammps: needs NVIDIA driver
# - lbann +cuda cuda_arch=90 # concretizer: Cannot select a single "version" for package "lbann"
# - omega-h +cuda cuda_arch=90 # omega-h: https://github.com/spack/spack/issues/39535
# - tasmanian +cuda cuda_arch=90 # tasmanian: conflicts with cuda@12
# - upcxx +cuda cuda_arch=90 # upcxx: needs NVIDIA driver
# # CUDA 80
# - amrex +cuda cuda_arch=80
# - arborx +cuda cuda_arch=80 ^kokkos +wrapper
# - cabana +cuda cuda_arch=80 ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=80
# - caliper +cuda cuda_arch=80
# - chai ~benchmarks ~tests +cuda cuda_arch=80 ^umpire ~shared
# - cusz +cuda cuda_arch=80
# - dealii +cuda cuda_arch=80
# - ecp-data-vis-sdk ~rocm +adios2 ~ascent +hdf5 +vtkm +zfp +paraview +cuda cuda_arch=80 # +ascent fails because fides fetch error
# - exago +mpi +python +raja +hiop ~rocm +cuda cuda_arch=80 ~ipopt ^hiop@1.0.0 ~sparse +mpi +raja ~rocm +cuda cuda_arch=80 #^raja@0.14.0
# - flecsi +cuda cuda_arch=80
# - ginkgo +cuda cuda_arch=80
# - gromacs +cuda cuda_arch=80
# - heffte +cuda cuda_arch=80
# - hpx +cuda cuda_arch=80
# - hypre +cuda cuda_arch=80
# - kokkos +wrapper +cuda cuda_arch=80
# - kokkos-kernels +cuda cuda_arch=80 ^kokkos +wrapper +cuda cuda_arch=80
# - libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +cusz +mgard +cuda cuda_arch=80 ^cusz +cuda cuda_arch=80
# - magma +cuda cuda_arch=80
# - mfem +cuda cuda_arch=80
# - mgard +serial +openmp +timing +unstructured +cuda cuda_arch=80
# - omega-h +cuda cuda_arch=80
# - parsec +cuda cuda_arch=80
# - petsc +cuda cuda_arch=80
# - py-torch +cuda cuda_arch=80
# - raja +cuda cuda_arch=80
# - slate +cuda cuda_arch=80
# - slepc +cuda cuda_arch=80
# - strumpack ~slate +cuda cuda_arch=80
# - sundials +cuda cuda_arch=80
# - superlu-dist +cuda cuda_arch=80
# - tasmanian +cuda cuda_arch=80
# - trilinos +cuda cuda_arch=80
# - umpire ~shared +cuda cuda_arch=80
# # INCLUDED IN ECP DAV CUDA
# # - adios2 +cuda cuda_arch=80
# # - ascent +cuda cuda_arch=80 # ascent: https://github.com/spack/spack/issues/38045
# # - paraview +cuda cuda_arch=80
# # - vtk-m +cuda cuda_arch=80
# # - zfp +cuda cuda_arch=80
# # --
# # - lammps +cuda cuda_arch=80 # lammps: needs NVIDIA driver
# # - upcxx +cuda cuda_arch=80 # upcxx: needs NVIDIA driver
# # - axom +cuda cuda_arch=80 # axom: https://github.com/spack/spack/issues/29520
# # - lbann +cuda cuda_arch=80 # lbann: https://github.com/spack/spack/issues/38788
# ROCM NOARCH
- hpctoolkit +rocm
- tau +mpi +rocm +syscall # tau: has issue with `spack env depfile` build
# # CUDA 90
# - amrex +cuda cuda_arch=90
# - arborx +cuda cuda_arch=90 ^kokkos +wrapper
# - cabana +cuda cuda_arch=90 ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=90
# - caliper +cuda cuda_arch=90
# - chai ~benchmarks ~tests +cuda cuda_arch=90 ^umpire ~shared
# - cusz +cuda cuda_arch=90
# - flecsi +cuda cuda_arch=90
# - ginkgo +cuda cuda_arch=90
# - gromacs +cuda cuda_arch=90
# - heffte +cuda cuda_arch=90
# - hpx +cuda cuda_arch=90
# - kokkos +wrapper +cuda cuda_arch=90
# - kokkos-kernels +cuda cuda_arch=90 ^kokkos +wrapper +cuda cuda_arch=90
# - libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +cusz +mgard +cuda cuda_arch=90 ^cusz +cuda cuda_arch=90
# - magma +cuda cuda_arch=90
# - mfem +cuda cuda_arch=90
# - mgard +serial +openmp +timing +unstructured +cuda cuda_arch=90
# - parsec +cuda cuda_arch=90
# - petsc +cuda cuda_arch=90
# - py-torch +cuda cuda_arch=90
# - raja +cuda cuda_arch=90
# - slate +cuda cuda_arch=90
# - slepc +cuda cuda_arch=90
# - strumpack ~slate +cuda cuda_arch=90
# - sundials +cuda cuda_arch=90
# - superlu-dist +cuda cuda_arch=90
# - trilinos +cuda cuda_arch=90
# - umpire ~shared +cuda cuda_arch=90
# # INCLUDED IN ECP DAV CUDA
# - adios2 +cuda cuda_arch=90
# # - ascent +cuda cuda_arch=90 # ascent: https://github.com/spack/spack/issues/38045
# # - paraview +cuda cuda_arch=90 # paraview: InstallError: Incompatible cuda_arch=90
# - vtk-m +cuda cuda_arch=90
# - zfp +cuda cuda_arch=90
# # --
# # - axom +cuda cuda_arch=90 # axom: https://github.com/spack/spack/issues/29520
# # - dealii +cuda cuda_arch=90 # dealii: https://github.com/spack/spack/issues/39532
# # - ecp-data-vis-sdk ~rocm +adios2 +ascent +hdf5 +vtkm +zfp +paraview +cuda cuda_arch=90 # paraview: incompatible cuda_arch; vtk-m: CMake Error at CMake/VTKmWrappers.cmake:413 (message): vtkm_cont needs to be built STATIC as CUDA doesn't support virtual methods across dynamic library boundaries. You need to set the CMake option BUILD_SHARED_LIBS to `OFF` or (better) turn VTKm_NO_DEPRECATED_VIRTUAL to `ON`.
# # - hypre +cuda cuda_arch=90 # concretizer: hypre +cuda requires cuda@:11, but cuda_arch=90 requires cuda@12:
# # - lammps +cuda cuda_arch=90 # lammps: needs NVIDIA driver
# # - lbann +cuda cuda_arch=90 # concretizer: Cannot select a single "version" for package "lbann"
# # - omega-h +cuda cuda_arch=90 # omega-h: https://github.com/spack/spack/issues/39535
# # - tasmanian +cuda cuda_arch=90 # tasmanian: conflicts with cuda@12
# # - upcxx +cuda cuda_arch=90 # upcxx: needs NVIDIA driver
# ROCM 908
- adios2 +kokkos +rocm amdgpu_target=gfx908
- amrex +rocm amdgpu_target=gfx908
- arborx +rocm amdgpu_target=gfx908
- cabana +rocm amdgpu_target=gfx908
- caliper +rocm amdgpu_target=gfx908
- chai ~benchmarks +rocm amdgpu_target=gfx908
- ecp-data-vis-sdk +paraview +vtkm +rocm amdgpu_target=gfx908
- gasnet +rocm amdgpu_target=gfx908
- ginkgo +rocm amdgpu_target=gfx908
- heffte +rocm amdgpu_target=gfx908
- hpx +rocm amdgpu_target=gfx908
- hypre +rocm amdgpu_target=gfx908
- kokkos +rocm amdgpu_target=gfx908
- legion +rocm amdgpu_target=gfx908
- magma ~cuda +rocm amdgpu_target=gfx908
- mfem +rocm amdgpu_target=gfx908
- petsc +rocm amdgpu_target=gfx908
- raja ~openmp +rocm amdgpu_target=gfx908
- slate +rocm amdgpu_target=gfx908
- slepc +rocm amdgpu_target=gfx908 ^petsc +rocm amdgpu_target=gfx908
- strumpack ~slate +rocm amdgpu_target=gfx908
- sundials +rocm amdgpu_target=gfx908
- superlu-dist +rocm amdgpu_target=gfx908
- tasmanian ~openmp +rocm amdgpu_target=gfx908
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack ~ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu ~stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long +rocm amdgpu_target=gfx908
- umpire +rocm amdgpu_target=gfx908
- upcxx +rocm amdgpu_target=gfx908
# INCLUDED IN ECP DAV ROCM
# - hdf5
# - hdf5-vol-async
# - hdf5-vol-cache
# - hdf5-vol-log
# - libcatalyst
- paraview +rocm amdgpu_target=gfx908
# - vtk-m ~openmp +rocm amdgpu_target=gfx908 # vtk-m: https://github.com/spack/spack/issues/40268
# --
# - exago +mpi +python +raja +hiop +rocm amdgpu_target=gfx908 ~ipopt cxxflags="-Wno-error=non-pod-varargs" ^hiop@1.0.0 ~sparse +mpi +raja +rocm amdgpu_target=gfx908 # hiop: CMake Error at cmake/FindHiopHipLibraries.cmake:23 (find_package)
# - lbann ~cuda +rocm amdgpu_target=gfx908 # aluminum: https://github.com/spack/spack/issues/38807
# - papi +rocm amdgpu_target=gfx908 # papi: https://github.com/spack/spack/issues/27898
# # ROCM NOARCH
# - hpctoolkit +rocm
# - tau +mpi +rocm +syscall # tau: has issue with `spack env depfile` build
# ROCM 90a
- adios2 +kokkos +rocm amdgpu_target=gfx90a
- amrex +rocm amdgpu_target=gfx90a
- arborx +rocm amdgpu_target=gfx90a
- cabana +rocm amdgpu_target=gfx90a
- caliper +rocm amdgpu_target=gfx90a
- chai ~benchmarks +rocm amdgpu_target=gfx90a
- ecp-data-vis-sdk +paraview +vtkm +rocm amdgpu_target=gfx90a
- gasnet +rocm amdgpu_target=gfx90a
- ginkgo +rocm amdgpu_target=gfx90a
- heffte +rocm amdgpu_target=gfx90a
- hpx +rocm amdgpu_target=gfx90a
- hypre +rocm amdgpu_target=gfx90a
- kokkos +rocm amdgpu_target=gfx90a
- legion +rocm amdgpu_target=gfx90a
- magma ~cuda +rocm amdgpu_target=gfx90a
- mfem +rocm amdgpu_target=gfx90a
- petsc +rocm amdgpu_target=gfx90a
- raja ~openmp +rocm amdgpu_target=gfx90a
- slate +rocm amdgpu_target=gfx90a
- slepc +rocm amdgpu_target=gfx90a ^petsc +rocm amdgpu_target=gfx90a
- strumpack ~slate +rocm amdgpu_target=gfx90a
- sundials +rocm amdgpu_target=gfx90a
- superlu-dist +rocm amdgpu_target=gfx90a
- tasmanian ~openmp +rocm amdgpu_target=gfx90a
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack ~ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu ~stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long +rocm amdgpu_target=gfx90a
- umpire +rocm amdgpu_target=gfx90a
- upcxx +rocm amdgpu_target=gfx90a
# INCLUDED IN ECP DAV ROCM
# - hdf5
# - hdf5-vol-async
# - hdf5-vol-cache
# - hdf5-vol-log
# - libcatalyst
- paraview +rocm amdgpu_target=gfx90a
# - vtk-m ~openmp +rocm amdgpu_target=gfx90a # vtk-m: https://github.com/spack/spack/issues/40268
# --
# - exago +mpi +python +raja +hiop +rocm amdgpu_target=gfx90a ~ipopt cxxflags="-Wno-error=non-pod-varargs" ^hiop@1.0.0 ~sparse +mpi +raja +rocm amdgpu_target=gfx90a # hiop: CMake Error at cmake/FindHiopHipLibraries.cmake:23 (find_package)
# - lbann ~cuda +rocm amdgpu_target=gfx90a # aluminum: https://github.com/spack/spack/issues/38807
# - papi +rocm amdgpu_target=gfx90a # papi: https://github.com/spack/spack/issues/27898
# # ROCM 908
# - adios2 +kokkos +rocm amdgpu_target=gfx908
# - amrex +rocm amdgpu_target=gfx908
# - arborx +rocm amdgpu_target=gfx908
# - cabana +rocm amdgpu_target=gfx908
# - caliper +rocm amdgpu_target=gfx908
# - chai ~benchmarks +rocm amdgpu_target=gfx908
# - ecp-data-vis-sdk +paraview +vtkm +rocm amdgpu_target=gfx908
# - gasnet +rocm amdgpu_target=gfx908
# - ginkgo +rocm amdgpu_target=gfx908
# - heffte +rocm amdgpu_target=gfx908
# - hpx +rocm amdgpu_target=gfx908
# - hypre +rocm amdgpu_target=gfx908
# - kokkos +rocm amdgpu_target=gfx908
# - legion +rocm amdgpu_target=gfx908
# - magma ~cuda +rocm amdgpu_target=gfx908
# - mfem +rocm amdgpu_target=gfx908
# - petsc +rocm amdgpu_target=gfx908
# - raja ~openmp +rocm amdgpu_target=gfx908
# - slate +rocm amdgpu_target=gfx908
# - slepc +rocm amdgpu_target=gfx908 ^petsc +rocm amdgpu_target=gfx908
# - strumpack ~slate +rocm amdgpu_target=gfx908
# - sundials +rocm amdgpu_target=gfx908
# - superlu-dist +rocm amdgpu_target=gfx908
# - tasmanian ~openmp +rocm amdgpu_target=gfx908
# - trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack ~ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu ~stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long +rocm amdgpu_target=gfx908
# - umpire +rocm amdgpu_target=gfx908
# - upcxx +rocm amdgpu_target=gfx908
# # INCLUDED IN ECP DAV ROCM
# # - hdf5
# # - hdf5-vol-async
# # - hdf5-vol-cache
# # - hdf5-vol-log
# # - libcatalyst
# - paraview +rocm amdgpu_target=gfx908
# # - vtk-m ~openmp +rocm amdgpu_target=gfx908 # vtk-m: https://github.com/spack/spack/issues/40268
# # --
# # - exago +mpi +python +raja +hiop +rocm amdgpu_target=gfx908 ~ipopt cxxflags="-Wno-error=non-pod-varargs" ^hiop@1.0.0 ~sparse +mpi +raja +rocm amdgpu_target=gfx908 # hiop: CMake Error at cmake/FindHiopHipLibraries.cmake:23 (find_package)
# # - lbann ~cuda +rocm amdgpu_target=gfx908 # aluminum: https://github.com/spack/spack/issues/38807
# # - papi +rocm amdgpu_target=gfx908 # papi: https://github.com/spack/spack/issues/27898
# # ROCM 90a
# - adios2 +kokkos +rocm amdgpu_target=gfx90a
# - amrex +rocm amdgpu_target=gfx90a
# - arborx +rocm amdgpu_target=gfx90a
# - cabana +rocm amdgpu_target=gfx90a
# - caliper +rocm amdgpu_target=gfx90a
# - chai ~benchmarks +rocm amdgpu_target=gfx90a
# - ecp-data-vis-sdk +paraview +vtkm +rocm amdgpu_target=gfx90a
# - gasnet +rocm amdgpu_target=gfx90a
# - ginkgo +rocm amdgpu_target=gfx90a
# - heffte +rocm amdgpu_target=gfx90a
# - hpx +rocm amdgpu_target=gfx90a
# - hypre +rocm amdgpu_target=gfx90a
# - kokkos +rocm amdgpu_target=gfx90a
# - legion +rocm amdgpu_target=gfx90a
# - magma ~cuda +rocm amdgpu_target=gfx90a
# - mfem +rocm amdgpu_target=gfx90a
# - petsc +rocm amdgpu_target=gfx90a
# - raja ~openmp +rocm amdgpu_target=gfx90a
# - slate +rocm amdgpu_target=gfx90a
# - slepc +rocm amdgpu_target=gfx90a ^petsc +rocm amdgpu_target=gfx90a
# - strumpack ~slate +rocm amdgpu_target=gfx90a
# - sundials +rocm amdgpu_target=gfx90a
# - superlu-dist +rocm amdgpu_target=gfx90a
# - tasmanian ~openmp +rocm amdgpu_target=gfx90a
# - trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack ~ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu ~stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long +rocm amdgpu_target=gfx90a
# - umpire +rocm amdgpu_target=gfx90a
# - upcxx +rocm amdgpu_target=gfx90a
# # INCLUDED IN ECP DAV ROCM
# # - hdf5
# # - hdf5-vol-async
# # - hdf5-vol-cache
# # - hdf5-vol-log
# # - libcatalyst
# - paraview +rocm amdgpu_target=gfx90a
# # - vtk-m ~openmp +rocm amdgpu_target=gfx90a # vtk-m: https://github.com/spack/spack/issues/40268
# # --
# # - exago +mpi +python +raja +hiop +rocm amdgpu_target=gfx90a ~ipopt cxxflags="-Wno-error=non-pod-varargs" ^hiop@1.0.0 ~sparse +mpi +raja +rocm amdgpu_target=gfx90a # hiop: CMake Error at cmake/FindHiopHipLibraries.cmake:23 (find_package)
# # - lbann ~cuda +rocm amdgpu_target=gfx90a # aluminum: https://github.com/spack/spack/issues/38807
# # - papi +rocm amdgpu_target=gfx90a # papi: https://github.com/spack/spack/issues/27898
ci:
pipeline-gen:
- build-job:
image: "ghcr.io/spack/ubuntu20.04-runner-amd64-gcc-11.4:2023.08.01"
image: ecpe4s/ubuntu22.04-runner-amd64-gcc-11.4:2024.03.01
cdash:
build-group: E4S

View File

@@ -3,8 +3,11 @@ spack:
packages:
all:
require: target=aarch64
variants: +mps~cuda~rocm
require:
- target=aarch64
- +mps
- ~cuda
- ~rocm
mpi:
require: openmpi
openblas:

View File

@@ -2,8 +2,10 @@ spack:
view: false
packages:
all:
require: target=x86_64_v3
variants: ~cuda~rocm
require:
- target=x86_64_v3
- ~cuda
- ~rocm
mpi:
require: openmpi

View File

@@ -2,8 +2,11 @@ spack:
view: false
packages:
all:
require: target=x86_64_v3
variants: ~rocm+cuda cuda_arch=80
require:
- target=x86_64_v3
- ~rocm
- +cuda
- cuda_arch=80
llvm:
# https://github.com/spack/spack/issues/27999
require: ~cuda

View File

@@ -2,8 +2,11 @@ spack:
view: false
packages:
all:
require: target=x86_64_v3
variants: ~cuda+rocm amdgpu_target=gfx90a
require:
- target=x86_64_v3
- ~cuda
- +rocm
- amdgpu_target=gfx90a
gl:
require: "osmesa"
py-torch:
@@ -35,31 +38,30 @@ spack:
- mxnet
# PyTorch
# Does not yet support Spack-install ROCm
# - py-botorch
# - py-efficientnet-pytorch
# - py-gpytorch
# - py-kornia
# - py-lightning
# - py-pytorch-gradual-warmup-lr
# - py-pytorch-lightning
# - py-segmentation-models-pytorch
# - py-timm
# - py-torch
# - py-torch-cluster
# - py-torch-geometric
# - py-torch-nvidia-apex
# - py-torch-scatter
# - py-torch-sparse
# - py-torch-spline-conv
# - py-torchaudio
# - py-torchdata
# - py-torchfile
# - py-torchgeo
# - py-torchmetrics
# - py-torchtext
# - py-torchvision
# - py-vector-quantize-pytorch
- py-botorch
- py-efficientnet-pytorch
- py-gpytorch
- py-kornia
- py-lightning
- py-pytorch-gradual-warmup-lr
- py-pytorch-lightning
- py-segmentation-models-pytorch
- py-timm
- py-torch
- py-torch-cluster
- py-torch-geometric
- py-torch-nvidia-apex
- py-torch-scatter
- py-torch-sparse
- py-torch-spline-conv
- py-torchaudio
- py-torchdata
- py-torchfile
- py-torchgeo
- py-torchmetrics
- py-torchtext
- py-torchvision
- py-vector-quantize-pytorch
# scikit-learn
- py-scikit-learn
@@ -72,12 +74,13 @@ spack:
- py-tensorboardx
# TensorFlow
- py-tensorflow
- py-tensorflow-datasets
- py-tensorflow-estimator
- py-tensorflow-hub
- py-tensorflow-metadata
- py-tensorflow-probability
# Does not yet support Spack-installed ROCm
# - py-tensorflow
# - py-tensorflow-datasets
# - py-tensorflow-estimator
# - py-tensorflow-hub
# - py-tensorflow-metadata
# - py-tensorflow-probability
# XGBoost
- py-xgboost

View File

@@ -120,9 +120,11 @@ class Adios2(CMakePackage, CudaPackage, ROCmPackage):
depends_on("cuda", when="+cuda ~kokkos")
# Kokkos support
depends_on("kokkos@3.7: +cuda +wrapper", when="+kokkos +cuda")
depends_on("kokkos@3.7: +rocm", when="+kokkos +rocm")
depends_on("kokkos@3.7: +sycl", when="+kokkos +sycl")
with when("+kokkos"):
depends_on("kokkos@3.7:")
depends_on("kokkos +cuda +wrapper", when="+cuda")
depends_on("kokkos +rocm", when="+rocm")
depends_on("kokkos +sycl", when="+sycl")
# Propagate CUDA target to kokkos for +cuda
for cuda_arch in CudaPackage.cuda_arch_values:
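
The hunk above ends at the header of a loop that forwards each CUDA architecture to kokkos; its body is cut off here. The snippet below is a hedged, pure-Python sketch of what such a propagation loop typically expands to — the arch values and the dependency/when strings are assumptions, not the actual adios2 recipe.

# Pure-Python sketch of what the truncated propagation loop expands to; the
# cuda_arch values and spec strings are assumptions, not copied from adios2.
cuda_arch_values = ("70", "80", "90")  # stand-in for CudaPackage.cuda_arch_values

for cuda_arch in cuda_arch_values:
    dep = f"kokkos +cuda cuda_arch={cuda_arch}"
    when = f"+kokkos +cuda cuda_arch={cuda_arch}"
    print(f'depends_on("{dep}", when="{when}")')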

View File

@@ -24,6 +24,7 @@ class AmdAocl(BundlePackage):
maintainers("amd-toolchain-support")
version("4.2", preferred=True)
version("4.1")
version("4.0")
version("3.2")
@@ -43,11 +44,15 @@ class AmdAocl(BundlePackage):
depends_on("amdfftw ~openmp")
depends_on("amdlibflame threads=none")
for vers in ("2.2", "3.0", "3.1", "3.2", "4.0", "4.1"):
with when(f"@{vers}"):
depends_on(f"amdblis@{vers}")
depends_on(f"amdfftw@{vers}")
depends_on(f"amdlibflame@{vers}")
depends_on(f"amdlibm@{vers}")
depends_on(f"amdscalapack@{vers}")
depends_on(f"aocl-sparse@{vers}")
for vers in ["2.2", "3.0", "3.1", "3.2", "4.0", "4.1", "4.2"]:
with when(f"@={vers}"):
depends_on(f"amdblis@={vers}")
depends_on(f"amdfftw@={vers}")
depends_on(f"amdlibflame@={vers}")
depends_on(f"amdlibm@={vers}")
depends_on(f"amdscalapack@={vers}")
depends_on(f"aocl-sparse@={vers}")
if Version(vers) >= Version("4.2"):
depends_on(f"aocl-compression@={vers}")
depends_on(f"aocl-crypto@={vers}")
depends_on(f"aocl-libmem@={vers}")

View File

@@ -0,0 +1,55 @@
diff -Naur a/config/zen4/make_defs.mk b/config/zen4/make_defs.mk
--- a/config/zen4/make_defs.mk 2022-11-12 13:05:45.000000000 +0000
+++ b/config/zen4/make_defs.mk 2023-05-12 14:40:10.848359434 +0000
@@ -73,6 +73,15 @@
# gcc 11.0 or later:
+ifeq ($(shell test $(GCC_VERSION) -ge 13; echo $$?),0)
+ifneq ($(DEBUG_TYPE),noopt)
+CKOPTFLAGS := -O2 -fgcse-after-reload -fipa-cp-clone -floop-interchange -floop-unroll-and-jam -fpeel-loops -fpredictive-commoning -fsplit-loops -fsplit-paths -ftree-loop-distribution -funswitch-loops -fvect-cost-model=dynamic -fversion-loops-for-strides -fomit-frame-pointer
+endif
+
+CKVECFLAGS += -march=znver4 -mfpmath=sse
+CRVECFLAGS += -march=znver4
+
+else
ifeq ($(shell test $(GCC_VERSION) -ge 11; echo $$?),0)
# Update CKOPTFLAGS for gcc 11+ to use O3 optimization without
# -ftree-partial-pre flag. This flag results in suboptimal code
@@ -100,6 +109,7 @@
endif # GCC 8
endif # GCC 9
endif # GCC 11
+endif # GCC 13
else
ifeq ($(CC_VENDOR),clang)
@@ -132,6 +142,16 @@
#if compiling with clang
VENDOR_STRING := $(strip $(shell ${CC_VENDOR} --version | egrep -o '[0-9]+\.[0-9]+\.?[0-9]*'))
CC_MAJOR := $(shell (echo ${VENDOR_STRING} | cut -d. -f1))
+#clang 16 or later:
+ifeq ($(shell test $(CC_MAJOR) -ge 16; echo $$?),0)
+CKVECFLAGS += -march=znver4
+CRVECFLAGS += -march=znver4
+else
+#clang 12 or later:
+ifeq ($(shell test $(CC_MAJOR) -ge 12; echo $$?),0)
+CKVECFLAGS += -march=znver3 -mavx512f -mavx512dq -mavx512bw -mavx512vl -mavx512vnni -mavx512bf16 -mfpmath=sse -falign-loops=64
+CRVECFLAGS += -march=znver3
+else
#clang 9.0 or later:
ifeq ($(shell test $(CC_MAJOR) -ge 9; echo $$?),0)
CKVECFLAGS += -march=znver2
@@ -139,7 +159,9 @@
else
CKVECFLAGS += -march=znver1
CRVECFLAGS += -march=znver1
-endif # ge 9
+endif # clang 9
+endif # clang 12
+endif # clang 16
endif # aocc 2
endif # aocc 3
endif # aocc 4

View File

@@ -23,7 +23,7 @@ class Amdblis(BlisBase):
LICENSING INFORMATION: By downloading, installing and using this software,
you agree to the terms and conditions of the AMD AOCL-BLIS license
agreement. You may obtain a copy of this license agreement from
https://www.amd.com/en/developer/aocl/dense/eula/blas-4-1-eula.html
https://www.amd.com/en/developer/aocl/dense/eula/blas-4-2-eula.html
https://www.amd.com/en/developer/aocl/dense/eula/blas-eula.html
"""
@@ -38,6 +38,11 @@ class Amdblis(BlisBase):
license("BSD-3-Clause")
version(
"4.2",
sha256="0e1baf850ba0e6f99e79f64bbb0a59fcb838ddb5028e24527f52b407c3c62963",
preferred=True,
)
version("4.1", sha256="a05c6c7d359232580d1d599696053ad0beeedf50f3b88d5d22ee7d34375ab577")
version("4.0", sha256="cddd31176834a932753ac0fc4c76332868feab3e9ac607fa197d8b44c1e74a41")
version("3.2", sha256="5a400ee4fc324e224e12f73cc37b915a00f92b400443b15ce3350278ad46fff6")
@@ -55,15 +60,15 @@ def configure_args(self):
args = super().configure_args()
if not (
spec.satisfies(r"%aocc@3.2:4.1")
spec.satisfies(r"%aocc@3.2:4.2")
or spec.satisfies(r"%gcc@12.2:13.1")
or spec.satisfies(r"%clang@15:16")
or spec.satisfies(r"%clang@15:17")
):
tty.warn(
"AOCL has been tested to work with the following compilers\
versions - gcc@12.2:13.1, aocc@3.2:4.1, and clang@15:16\
see the following aocl userguide for details: \
https://www.amd.com/content/dam/amd/en/documents/developer/version-4-1-documents/aocl/aocl-4-1-user-guide.pdf"
"AOCL has been tested to work with the following compilers "
"versions - gcc@12.2:13.1, aocc@3.2:4.2, and clang@15:17 "
"see the following aocl userguide for details: "
"https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-documents/aocl/aocl-4-2-user-guide.pdf"
)
if spec.satisfies("+ilp64"):

View File

@@ -0,0 +1,11 @@
diff -Nur amd-fftw-4.0/kernel/cpy2d-pair.c amd-fftw-4.0-new/kernel/cpy2d-pair.c
--- amd-fftw-4.0/kernel/cpy2d-pair.c 2022-11-11 16:52:26.000000000 +0530
+++ amd-fftw-4.0-new/kernel/cpy2d-pair.c 2023-05-12 00:09:10.408511128 +0530
@@ -21,6 +21,7 @@
/* out of place copy routines for pairs of isomorphic 2D arrays */
#include "kernel/ifftw.h"
+#include <string.h>
#ifdef AMD_OPT_ALL
#include "immintrin.h"

View File

@@ -28,7 +28,7 @@ class Amdfftw(FftwBase):
LICENSING INFORMATION: By downloading, installing and using this software,
you agree to the terms and conditions of the AMD AOCL-FFTW license
agreement. You may obtain a copy of this license agreement from
https://www.amd.com/en/developer/aocl/fftw/eula/fftw-libraries-4-1-eula.html
https://www.amd.com/en/developer/aocl/fftw/eula/fftw-libraries-4-2-eula.html
https://www.amd.com/en/developer/aocl/fftw/eula/fftw-libraries-eula.html
"""
@@ -41,6 +41,11 @@ class Amdfftw(FftwBase):
license("GPL-2.0-only")
version(
"4.2",
sha256="391ef7d933e696762e3547a35b58ab18d22a6cf3e199c74889bcf25a1d1fc89b",
preferred=True,
)
version("4.1", sha256="f1cfecfcc0729f96a5bd61c6b26f3fa43bb0662d3fff370d4f73490c60cf4e59")
version("4.0", sha256="5f02cb05f224bd86bd88ec6272b294c26dba3b1d22c7fb298745fd7b9d2271c0")
version("3.2", sha256="31cab17a93e03b5b606e88dd6116a1055b8f49542d7d0890dbfcca057087b8d0")
@@ -103,7 +108,7 @@ class Amdfftw(FftwBase):
depends_on("texinfo")
provides("fftw-api@3", when="@2:")
provides("fftw-api@3")
conflicts(
"precision=quad",
@@ -167,15 +172,15 @@ def configure(self, spec, prefix):
options.append("F77={0}".format(os.path.basename(spack_fc)))
if not (
spec.satisfies(r"%aocc@3.2:4.1")
spec.satisfies(r"%aocc@3.2:4.2")
or spec.satisfies(r"%gcc@12.2:13.1")
or spec.satisfies(r"%clang@15:16")
or spec.satisfies(r"%clang@15:17")
):
tty.warn(
"AOCL has been tested to work with the following compilers\
versions - gcc@12.2:13.1, aocc@3.2:4.1, and clang@15:16\
see the following aocl userguide for details: \
https://www.amd.com/content/dam/amd/en/documents/developer/version-4-1-documents/aocl/aocl-4-1-user-guide.pdf"
"AOCL has been tested to work with the following compilers "
"versions - gcc@12.2:13.1, aocc@3.2:4.2, and clang@15:17 "
"see the following aocl userguide for details: "
"https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-documents/aocl/aocl-4-2-user-guide.pdf"
)
if "+debug" in spec:

View File

@@ -3,15 +3,16 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# ----------------------------------------------------------------------------\
import os
from llnl.util import tty
import spack.build_systems.autotools
import spack.build_systems.cmake
from spack.package import *
from spack.pkg.builtin.libflame import LibflameBase
class Amdlibflame(LibflameBase):
class Amdlibflame(CMakePackage, LibflameBase):
"""libFLAME (AMD Optimized version) is a portable library for
dense matrix computations, providing much of the functionality
present in Linear Algebra Package (LAPACK). It includes a
@@ -36,7 +37,7 @@ class Amdlibflame(LibflameBase):
LICENSING INFORMATION: By downloading, installing and using this software,
you agree to the terms and conditions of the AMD AOCL-libFLAME license
agreement. You may obtain a copy of this license agreement from
https://www.amd.com/en/developer/aocl/dense/eula-libflame/libflame-4-1-eula.html
https://www.amd.com/en/developer/aocl/dense/eula-libflame/libflame-4-2-eula.html
https://www.amd.com/en/developer/aocl/dense/eula-libflame/libflame-eula.html
"""
@@ -48,7 +49,11 @@ class Amdlibflame(LibflameBase):
maintainers("amd-toolchain-support")
license("BSD-3-Clause")
version(
"4.2",
sha256="93a433c169528ffba74a99df0ba3ce3d5b1fab9bf06ce8d2fd72ee84768ed84c",
preferred=True,
)
version("4.1", sha256="8aed69c60d11cc17e058cabcb8a931cee4f343064ade3e73d3392b7214624b61")
version("4.0", sha256="bcb05763aa1df1e88f0da5e43ff86d956826cbea1d9c5ff591d78a3e091c66a4")
version("3.2", sha256="6b5337fb668b82d0ed0a4ab4b5af4e2f72e4cedbeeb4a8b6eb9a3ef057fb749a")
@@ -57,9 +62,33 @@ class Amdlibflame(LibflameBase):
version("3.0", sha256="d94e08b688539748571e6d4c1ec1ce42732eac18bd75de989234983c33f01ced")
version("2.2", sha256="12b9c1f92d2c2fa637305aaa15cf706652406f210eaa5cbc17aaea9fcfa576dc")
variant("ilp64", default=False, description="Build with ILP64 support")
variant("ilp64", default=False, when="@3.0.1: ", description="Build with ILP64 support")
variant(
"enable-aocl-blas",
default=False,
when="@4.1.0:",
description="Enables tight coupling with AOCL-BLAS library in order to use AOCL-BLAS\
internal routines",
)
variant(
"vectorization",
default="auto",
when="@4.2:",
values=("auto", "avx2", "avx512", "none"),
multi=False,
description="Use hardware vectorization support",
)
# Build system
build_system(
conditional("cmake", when="@4.2:"), conditional("autotools", when="@:4.1"), default="cmake"
)
# Required dependencies
with when("build_system=cmake"):
generator("make")
depends_on("cmake@3.15.0:", type="build")
conflicts("+ilp64", when="@:3.0.0", msg="ILP64 is supported from 3.0.1 onwards")
conflicts("threads=pthreads", msg="pthread is not supported")
conflicts("threads=openmp", when="@:3", msg="openmp is not supported by amdlibflame < 4.0")
requires("target=x86_64:", msg="AMD libflame available only on x86_64")
@@ -72,7 +101,9 @@ class Amdlibflame(LibflameBase):
depends_on("python+pythoncmd", type="build")
depends_on("gmake@4:", when="@3.0.1,3.1:", type="build")
depends_on("aocl-utils", type=("build"), when="@4.1: ")
for vers in ["4.1", "4.2"]:
with when(f"@{vers}"):
depends_on(f"aocl-utils@{vers}")
@property
def lapack_libs(self):
@@ -94,62 +125,101 @@ def flag_handler(self, name, flags):
flags.append("-Wno-error=incompatible-function-pointer-types")
flags.append("-Wno-implicit-function-declaration")
flags.append("-Wno-sometimes-uninitialized")
if name == "ldflags":
if self.spec.satisfies("^aocl-utils~shared"):
flags.append("-lstdc++")
return (flags, None, None)
class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
def cmake_args(self):
spec = self.spec
args = [self.define("LIBAOCLUTILS_INCLUDE_PATH", spec["aocl-utils"].prefix.include)]
aocl_utils_lib_path = spec["aocl-utils"].libs
args.append("-DLIBAOCLUTILS_LIBRARY_PATH={0}".format(aocl_utils_lib_path))
# From 3.2 version, amd optimized flags are encapsulated under:
# ENABLE_AMD_AOCC_FLAGS for AOCC compiler
# ENABLE_AMD_FLAGS for all other compilers
if spec.satisfies("@3.2:"):
if spec.satisfies("%aocc"):
args.append(self.define("ENABLE_AMD_AOCC_FLAGS", True))
else:
args.append(self.define("ENABLE_AMD_FLAGS", True))
if spec.satisfies("@3.0.1: +ilp64"):
args.append(self.define("ENABLE_ILP64", True))
if spec.satisfies("@4.1.0: +enable-aocl-blas"):
args.append(self.define("ENABLE_AOCL_BLAS", True))
args.append("-DAOCL_ROOT:PATH={0}".format(spec["blas"].prefix))
if spec.variants["vectorization"].value == "auto":
if spec.satisfies("target=avx512"):
args.append("-DLF_ISA_CONFIG=avx512")
elif spec.satisfies("target=avx2"):
args.append("-DLF_ISA_CONFIG=avx2")
else:
args.append("-DLF_ISA_CONFIG=none")
else:
args.append(self.define("LF_ISA_CONFIG", spec.variants["vectorization"].value))
return args
class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
def configure_args(self):
"""configure_args function"""
args = super().configure_args()
args = self.pkg.configure_args()
spec = self.spec
if not (
self.spec.satisfies(r"%aocc@3.2:4.1")
or self.spec.satisfies(r"%gcc@12.2:13.1")
or self.spec.satisfies(r"%clang@15:16")
spec.satisfies(r"%aocc@3.2:4.2")
or spec.satisfies(r"%gcc@12.2:13.1")
or spec.satisfies(r"%clang@15:17")
):
tty.warn(
"AOCL has been tested to work with the following compilers\
versions - gcc@12.2:13.1, aocc@3.2:4.1, and clang@15:16\
see the following aocl userguide for details: \
https://www.amd.com/content/dam/amd/en/documents/developer/version-4-1-documents/aocl/aocl-4-1-user-guide.pdf"
"AOCL has been tested to work with the following compilers "
"versions - gcc@12.2:13.1, aocc@3.2:4.2, and clang@15:17 "
"see the following aocl userguide for details: "
"https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-documents/aocl/aocl-4-2-user-guide.pdf"
)
# From 3.2 version, amd optimized flags are encapsulated under:
# enable-amd-aocc-flags for AOCC compiler
# enable-amd-flags for all other compilers
if "@3.2:" in self.spec:
if "%aocc" in self.spec:
if spec.satisfies("@3.2: "):
if spec.satisfies("%aocc"):
args.append("--enable-amd-aocc-flags")
else:
args.append("--enable-amd-flags")
if "@:3.1" in self.spec:
if spec.satisfies("@:3.1"):
args.append("--enable-external-lapack-interfaces")
if "@3.1" in self.spec:
if spec.satisfies("@3.1"):
args.append("--enable-blas-ext-gemmt")
if "@3.1 %aocc" in self.spec:
if spec.satisfies("@3.1 %aocc"):
args.append("--enable-void-return-complex")
if "@3.0:3.1 %aocc" in self.spec:
if spec.satisfies("@3.0:3.1 %aocc"):
"""To enabled Fortran to C calling convention for
complex types when compiling with aocc flang"""
args.append("--enable-f2c-dotc")
if "@3.0.1: +ilp64" in self.spec:
if spec.satisfies("@3.0.1: +ilp64"):
args.append("--enable-ilp64")
if "@4.1:" in self.spec:
args.append("CFLAGS=-I{0}".format(self.spec["aocl-utils"].prefix.include))
aocl_utils_lib_path = os.path.join(
self.spec["aocl-utils"].prefix.lib, "libaoclutils.a"
)
if spec.satisfies("@4.1:"):
args.append("CFLAGS=-I{0}".format(spec["aocl-utils"].prefix.include))
aocl_utils_lib_path = spec["aocl-utils"].libs
args.append("LIBAOCLUTILS_LIBRARY_PATH={0}".format(aocl_utils_lib_path))
return args
@when("@4.1:")
def build(self, spec, prefix):
aocl_utils_lib_path = os.path.join(self.spec["aocl-utils"].prefix.lib, "libaoclutils.a")
def build(self, pkg, spec, prefix):
aocl_utils_lib_path = spec["aocl-utils"].libs
make("all", "LIBAOCLUTILS_LIBRARY_PATH={0}".format(aocl_utils_lib_path))
@run_after("build")
@@ -162,7 +232,11 @@ def check(self):
else:
make("check", "LIBBLAS = {0}".format(blas_flags), parallel=False)
def install(self, spec, prefix):
def install(self, pkg, spec, prefix):
"""make install function"""
# make install in parallel fails with message 'File already exists'
make("install", parallel=False)
def setup_dependent_run_environment(self, env, dependent_spec):
if self.spec.external:
env.prepend_path("LD_LIBRARY_PATH", self.prefix.lib)

View File

@@ -21,7 +21,7 @@ class Amdlibm(SConsPackage):
LICENSING INFORMATION: By downloading, installing and using this software,
you agree to the terms and conditions of the AMD AOCL-FFTW license
agreement. You may obtain a copy of this license agreement from
https://www.amd.com/en/developer/aocl/libm/eula/libm-4-1-eula.html
https://www.amd.com/en/developer/aocl/libm/eula/libm-4-2-eula.html
https://www.amd.com/en/developer/aocl/libm/libm-eula.html
"""
@@ -33,6 +33,11 @@ class Amdlibm(SConsPackage):
license("BSD-3-Clause")
version(
"4.2",
sha256="58847b942e998b3f52eb41ae26403c7392d244fcafa707cbf23165aac24edd9e",
preferred=True,
)
version("4.1", sha256="5bbbbc6bc721d9a775822eab60fbc11eb245e77d9f105b4fcb26a54d01456122")
version("4.0", sha256="038c1eab544be77598eccda791b26553d3b9e2ee4ab3f5ad85fdd2a77d015a7d")
version("3.2", sha256="c75b287c38a3ce997066af1f5c8d2b19fc460d5e56678ea81f3ac33eb79ec890")
@@ -40,13 +45,15 @@ class Amdlibm(SConsPackage):
version("3.0", sha256="eb26b5e174f43ce083928d0d8748a6d6d74853333bba37d50057aac2bef7c7aa")
version("2.2", commit="4033e022da428125747e118ccd6fdd9cee21c470")
variant("verbose", default=False, description="Building with verbosity")
variant("verbose", default=False, description="Building with verbosity", when="@:4.1")
# Mandatory dependencies
depends_on("python@3.6.1:", type=("build", "run"))
depends_on("scons@3.1.2:", type=("build"))
depends_on("aocl-utils", type=("build"), when="@4.1: ")
depends_on("mpfr", type=("link"))
for vers in ["4.1", "4.2"]:
with when(f"@{vers}"):
depends_on(f"aocl-utils@{vers}")
patch("0001-libm-ose-Scripts-cleanup-pyc-files.patch", when="@2.2")
patch("0002-libm-ose-prevent-log-v3.c-from-building.patch", when="@2.2")
@@ -64,15 +71,15 @@ def build_args(self, spec, prefix):
args.append("--aocl_utils_install_path={0}".format(self.spec["aocl-utils"].prefix))
if not (
self.spec.satisfies(r"%aocc@3.2:4.1")
self.spec.satisfies(r"%aocc@3.2:4.2")
or self.spec.satisfies(r"%gcc@12.2:13.1")
or self.spec.satisfies(r"%clang@15:16")
):
tty.warn(
"AOCL has been tested to work with the following compilers\
versions - gcc@12.2:13.1, aocc@3.2:4.1, and clang@15:16\
versions - gcc@12.2:13.1, aocc@3.2:4.2, and clang@15:16\
see the following aocl userguide for details: \
https://www.amd.com/content/dam/amd/en/documents/developer/version-4-1-documents/aocl/aocl-4-1-user-guide.pdf"
https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-documents/aocl/aocl-4-2-user-guide.pdf"
)
# we are circumventing the use of
@@ -85,10 +92,8 @@ def build_args(self, spec, prefix):
args.append("{0}CC={1}".format(var_prefix, self.compiler.cc))
args.append("{0}CXX={1}".format(var_prefix, self.compiler.cxx))
if "+verbose" in self.spec:
args.append("--verbose=1")
else:
args.append("--verbose=0")
# Always build verbose
args.append("--verbose=1")
return args

File diff suppressed because it is too large

View File

@@ -22,7 +22,7 @@ class Amdscalapack(ScalapackBase):
LICENSING INFORMATION: By downloading, installing and using this software,
you agree to the terms and conditions of the AMD AOCL-ScaLAPACK license
agreement. You may obtain a copy of this license agreement from
https://www.amd.com/en/developer/aocl/scalapack/eula/scalapack-libraries-4-1-eula.html
https://www.amd.com/en/developer/aocl/scalapack/eula/scalapack-libraries-4-2-eula.html
https://www.amd.com/en/developer/aocl/scalapack/eula/scalapack-libraries-eula.html
"""
@@ -33,7 +33,11 @@ class Amdscalapack(ScalapackBase):
maintainers("amd-toolchain-support")
license("BSD-3-Clause-Open-MPI")
version(
"4.2",
sha256="c6e9a846c05cdc05252b0b5f264164329812800bf13f9d97c77114dc138e6ccb",
preferred=True,
)
version("4.1", sha256="b2e51c3604e5869d1faaef2e52c92071fcb3de1345aebb2ea172206622067ad9")
version("4.0", sha256="f02913b5984597b22cdb9a36198ed61039a1bf130308e778dc31b2a7eb88b33b")
version("3.2", sha256="9e00979bb1be39d627bdacb01774bc043029840d542fafc934d16fec3e3b0892")
@@ -46,6 +50,13 @@ class Amdscalapack(ScalapackBase):
conflicts("+ilp64", when="@:3.0", msg="ILP64 is supported from 3.1 onwards")
requires("target=x86_64:", msg="AMD scalapack available only on x86_64")
patch("clang-hollerith.patch", when="%clang@16:")
def patch(self):
# Flang-New gets confused and thinks it finds Hollerith constants
if self.spec.satisfies("%clang@16:"):
filter_file("-cpp", "", "CMakeLists.txt")
def url_for_version(self, version):
vers = "https://github.com/amd/{0}/archive/{1}.tar.gz"
if version >= Version("3.1"):
@@ -59,15 +70,15 @@ def cmake_args(self):
spec = self.spec
if not (
spec.satisfies(r"%aocc@3.2:4.1")
spec.satisfies(r"%aocc@3.2:4.2")
or spec.satisfies(r"%gcc@12.2:13.1")
or spec.satisfies(r"%clang@15:16")
or spec.satisfies(r"%clang@15:17")
):
tty.warn(
"AOCL has been tested to work with the following compilers\
versions - gcc@12.2:13.1, aocc@3.2:4.1, and clang@15:16\
see the following aocl userguide for details: \
https://www.amd.com/content/dam/amd/en/documents/developer/version-4-1-documents/aocl/aocl-4-1-user-guide.pdf"
"AOCL has been tested to work with the following compilers "
"versions - gcc@12.2:13.1, aocc@3.2:4.2, and clang@15:17 "
"see the following aocl userguide for details: "
"https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-documents/aocl/aocl-4-2-user-guide.pdf"
)
if spec.satisfies("%gcc@10:"):
@@ -109,3 +120,7 @@ def cmake_args(self):
)
return args
def setup_dependent_run_environment(self, env, dependent_spec):
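# Only externally registered installs reach the branch below; Spack-built
# amdscalapack is RPATH'd, so dependents need no LD_LIBRARY_PATH entry for it.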
if self.spec.external:
env.prepend_path("LD_LIBRARY_PATH", self.prefix.lib)

View File

@@ -26,6 +26,7 @@ class Amrex(CMakePackage, CudaPackage, ROCmPackage):
license("BSD-3-Clause")
version("develop", branch="development")
version("24.03", sha256="024876fe65838d1021fcbf8530b992bff8d9be1d3f08a1723c4e2e5f7c28b427")
version("24.02", sha256="286cc3ca29daa69c8eafc1cd7a572662dec9eb78631ac3d33a1260868fdc6996")
version("24.01", sha256="83dbd4dad6dc51fa4a80aad0347b15ee5a6d816cf4abcd87f7b0e2987d8131b7")
version("23.12", sha256="90e00410833d7a82bf6d9e71a70ce85d2bfb89770da7e34d0dda940f2bf5384a")

View File

@@ -33,6 +33,11 @@ class Aocc(Package):
maintainers("amd-toolchain-support")
version(
ver="4.2.0",
sha256="ed5a560ec745b24dc0685ccdcbde914843fb2f2dfbfce1ba592de4ffbce1ccab",
url="https://download.amd.com/developer/eula/aocc/aocc-4-2/aocc-compiler-4.2.0.tar",
)
version(
ver="4.1.0",
sha256="5b04bfdb751c68dfb9470b34235d76efa80a6b662a123c3375b255982cb52acd",
@@ -56,7 +61,6 @@ class Aocc(Package):
depends_on("zlib-api")
depends_on("ncurses")
depends_on("libtool")
depends_on("texinfo")
variant(
"license-agreed",

View File

@@ -0,0 +1,109 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# ----------------------------------------------------------------------------
from llnl.util import tty
from spack.package import *
class AoclCompression(CMakePackage):
"""
AOCL-Compression is a software framework of various lossless compression
and decompression methods tuned and optimized for AMD Zen based CPUs.
This framework offers a single set of unified APIs for all the supported
compression and decompression methods which facilitate the applications
to easily integrate and use them.
AOCL-Compression supports lz4, zlib/deflate, lzma, zstd, bzip2, snappy,
and lz4hc based compression and decompression methods along with their
native APIs.
The library offers openMP based multi-threaded implementation of lz4, zlib,
zstd and snappy compression methods. It supports the dynamic dispatcher
feature that executes the most optimal function variant implemented using
Function Multi-versioning thereby offering a single optimized library
portable across different x86 CPU architectures.
AOCL-Compression framework is developed in C for UNIX® and Windows® based
systems. A test suite is provided for the validation and performance
benchmarking of the supported compression and decompression methods.
This suite also supports the benchmarking of IPP compression methods, such
as, lz4, lz4hc, zlib and bzip2. The library build framework offers CTest
based testing of the test cases implemented using GTest and the library
test suite.
LICENSING INFORMATION: By downloading, installing and using this software,
you agree to the terms and conditions of the AMD AOCL-Compression license
agreement. You may obtain a copy of this license agreement from
https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-eulas/compression-elua-4-2.pdf
"""
_name = "aocl-compression"
homepage = "https://www.amd.com/en/developer/aocl/compression.html"
git = "https://github.com/amd/aocl-compression.git"
url = "https://github.com/amd/aocl-compression/archive/refs/tags/4.2.tar.gz"
maintainers("amd-toolchain-support")
version(
"4.2",
sha256="a18b3e7f64a8105c1500dda7b4c343e974b5e26bfe3dd838a1c1acf82a969c6f",
preferred=True,
)
variant("shared", default=True, description="Build shared library")
variant("zlib", default=True, description="Build zlib library")
variant("bzip2", default=True, description="Build bzip2 library")
variant("snappy", default=True, description="Build snappy library")
variant("zstd", default=True, description="Build zstd library")
variant("lzma", default=True, description="Build lzma library")
variant("lz4", default=True, description="Build lz4 library")
variant("lz4hc", default=True, description="Build lz4hc library")
variant(
"openmp",
default=False,
description="openmp based multi-threaded compression and decompression",
)
depends_on("cmake@3.15:", type="build")
def cmake_args(self):
"""Runs ``cmake`` in the build directory"""
spec = self.spec
args = []
if not (
spec.satisfies(r"%aocc@4.1:4.2")
or spec.satisfies(r"%gcc@12.2:13.1")
or spec.satisfies(r"%clang@16:17")
):
tty.warn(
"AOCL has been tested to work with the following compilers "
"versions - gcc@12.2:13.1, aocc@4.1:4.2, and clang@16:17 "
"see the following aocl userguide for details: "
"https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-documents/aocl/aocl-4-2-user-guide.pdf"
)
args = [
self.define_from_variant("AOCL_ENABLE_THREADS", "openmp"),
"-DLZ4_FRAME_FORMAT_SUPPORT=ON",
"-DAOCL_LZ4HC_DISABLE_PATTERN_ANALYSIS=ON",
]
if "~shared" in spec:
args.append("-DBUILD_STATIC_LIBS=ON")
if "~zlib" in spec:
args.append("-DAOCL_EXCLUDE_ZLIB=ON")
if "~bzip2" in spec:
args.append("-DAOCL_EXCLUDE_BZIP2=ON")
if "~snappy" in spec:
args.append("-DAOCL_EXCLUDE_SNAPPY=ON")
if "~zstd" in spec:
args.append("-DAOCL_EXCLUDE_ZSTD=ON")
if "~lzma" in spec:
args.append("-DAOCL_EXCLUDE_LZMA=ON")
if "~lz4" in spec:
args.append("-DAOCL_EXCLUDE_LZ4=ON")
if "~lz4hc" in spec:
args.append("-DAOCL_EXCLUDE_LZ4HC=ON")
return args

View File

@@ -0,0 +1,93 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# ----------------------------------------------------------------------------
from llnl.util import tty
from spack.package import *
class AoclCrypto(CMakePackage):
"""
AOCL-Crypto is a library consisting of basic cryptographic functions
optimized and tuned for AMD Zen based microarchitecture.
This library provides a unified solution for Cryptographic routines such
as AES (Advanced Encryption Standard) encryption/decryption routines
(CFB, CTR, CBC, CCM, GCM, OFB, SIV, XTS), SHA (Secure Hash Algorithms)
routines (SHA2, SHA3, SHAKE), Message Authentication Code (CMAC, HMAC),
ECDH (Elliptic-curve DiffieHellman) and RSA (Rivest, Shamir, and Adleman)
key generation functions, etc. AOCL Crypto supports a dynamic dispatcher
feature that executes the most optimal function variant implemented using
Function Multi-versioning thereby offering a single optimized library
portable across different x86 CPU architectures.
AOCL Crypto framework is developed in C / C++ for Unix and Windows based
systems.
LICENSING INFORMATION: By downloading, installing and using this software,
you agree to the terms and conditions of the AMD AOCL-Cryptography license
agreement. You may obtain a copy of this license agreement from
https://www.amd.com/en/developer/aocl/cryptography/eula/cryptography-4-2-eula.html
"""
_name = "aocl-crypto"
homepage = "https://www.amd.com/en/developer/aocl/cryptography.html"
git = "https://github.com/amd/aocl-crypto"
url = "https://github.com/amd/aocl-crypto/archive/refs/tags/4.2.tar.gz"
maintainers("amd-toolchain-support")
version(
"4.2",
sha256="2bdbedd8ab1b28632cadff237f4abd776e809940ad3633ad90fc52ce225911fe",
preferred=True,
)
variant("examples", default=False, description="Build examples")
depends_on("cmake@3.15:", type="build")
depends_on("openssl@3.0.0:")
depends_on("p7zip", type="build")
for vers in ["4.2"]:
with when(f"@={vers}"):
depends_on(f"aocl-utils@={vers}")
@property
def build_directory(self):
"""Returns the directory to use when building the package
:return: directory where to build the package
"""
build_directory = self.stage.source_path
if self.spec.variants["build_type"].value == "Debug":
build_directory = join_path(build_directory, "build", "debug")
else:
build_directory = join_path(build_directory, "build", "release")
return build_directory
def cmake_args(self):
"""Runs ``cmake`` in the build directory"""
spec = self.spec
if not (
spec.satisfies(r"%aocc@4.1:4.2")
or spec.satisfies(r"%gcc@12.2:13.1")
or spec.satisfies(r"%clang@16:17")
):
tty.warn(
"AOCL has been tested to work with the following compilers "
"versions - gcc@12.2:13.1, aocc@4.1:4.2, and clang@16:17 "
"see the following aocl userguide for details: "
"https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-documents/aocl/aocl-4-2-user-guide.pdf"
)
args = ["-DCMAKE_C_COMPILER=%s" % spack_cc, "-DCMAKE_CXX_COMPILER=%s" % spack_cxx]
args.append(self.define_from_variant("ALCP_ENABLE_EXAMPLES", "examples"))
args.append("-DOPENSSL_INSTALL_DIR=" + spec["openssl"].prefix)
args.append("-DENABLE_AOCL_UTILS=ON")
args.append("-DAOCL_UTILS_INSTALL_DIR=" + spec["aocl-utils"].prefix)
return args

View File

@@ -0,0 +1,93 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# ----------------------------------------------------------------------------
from llnl.util import tty
from spack.package import *
class AoclLibmem(CMakePackage):
"""
AOCL-LibMem is a Linux library of data movement and manipulation
functions (such as memcpy and strcpy) highly optimized for AMD Zen
micro-architecture.
This library has multiple implementations of each function that can be
chosen based on the application requirements as per alignments, instruction
choice, threshold values, and tunable parameters.
By default, this library will choose the best fit implementation based on
the underlying micro-architectural support for CPU features and instructions.
LICENSING INFORMATION: By downloading, installing and using this software,
you agree to the terms and conditions of the AMD AOCL-LibMem license
agreement. You may obtain a copy of this license agreement from
https://www.amd.com/en/developer/aocl/libmem/eula/libmem-4-2-eula.html
"""
_name = "aocl-libmem"
homepage = "https://www.amd.com/en/developer/aocl/libmem.html"
git = "https://github.com/amd/aocl-libmem"
url = "https://github.com/amd/aocl-libmem/archive/refs/tags/4.2.tar.gz"
maintainers("amd-toolchain-support")
version(
"4.2",
sha256="4ff5bd8002e94cc2029ef1aeda72e7cf944b797c7f07383656caa93bcb447569",
preferred=True,
)
variant("logging", default=False, description="Enable/Disable logger")
variant("tunables", default=False, description="Enable/Disable user input")
variant("shared", default=True, description="build shared library")
variant(
"vectorization",
default="auto",
description="Use hardware vectorization support",
values=("avx2", "avx512", "auto"),
multi=False,
)
depends_on("cmake@3.15:", type="build")
@property
def libs(self):
"""find libmem libs function"""
shared = "+shared" in self.spec
return find_libraries("libaocl-libmem", root=self.prefix, recursive=True, shared=shared)
def cmake_args(self):
"""Runs ``cmake`` in the build directory"""
spec = self.spec
if not (
spec.satisfies(r"%aocc@4.1:4.2")
or spec.satisfies(r"%gcc@12.2:13.1")
or spec.satisfies(r"%clang@16:17")
):
tty.warn(
"AOCL has been tested to work with the following compilers "
"versions - gcc@12.2:13.1, aocc@4.1:4.2, and clang@16:17 "
"see the following aocl userguide for details: "
"https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-documents/aocl/aocl-4-2-user-guide.pdf"
)
args = []
args.append(self.define_from_variant("ENABLE_LOGGING", "logging"))
args.append(self.define_from_variant("ENABLE_TUNABLES", "tunables"))
args.append(self.define_from_variant("BUILD_SHARED_LIBS", "shared"))
if spec.satisfies("vectorisation=auto"):
if "avx512" in self.spec.target:
args.append("-ALMEM_ARCH=avx512")
elif "avx2" in self.spec.target:
args.append("-ALMEM_ARCH=avx2")
else:
args.append("-ALMEM_ARCH=none")
else:
args.append(self.define("ALMEM_ARCH", spec.variants["vectorization"].value))
return args

View File

@@ -19,19 +19,24 @@ class AoclSparse(CMakePackage):
LICENSING INFORMATION: By downloading, installing and using this software,
you agree to the terms and conditions of the AMD AOCL-Sparse license agreement.
You may obtain a copy of this license agreement from
https://www.amd.com/en/developer/aocl/sparse/eula/sparse-libraries-4-1-eula.html
https://www.amd.com/en/developer/aocl/sparse/eula/sparse-libraries-4-2-eula.html
https://www.amd.com/en/developer/aocl/sparse/eula/sparse-libraries-eula.html
"""
_name = "aocl-sparse"
homepage = "https://www.amd.com/en/developer/aocl/sparse.html"
url = "https://github.com/amd/aocl-sparse/archive/3.0.tar.gz"
git = "https://github.com/amd/aocl-sparse"
url = "https://github.com/amd/aocl-sparse/archive/3.0.tar.gz"
maintainers("amd-toolchain-support")
license("MIT")
version(
"4.2",
sha256="03cd67adcfea4a574fece98b60b4aba0a6e5a9c8f608ff1ccc1fb324a7185538",
preferred=True,
)
version("4.1", sha256="35ef437210bc25fdd802b462eaca830bfd928f962569b91b592f2866033ef2bb")
version("4.0", sha256="68524e441fdc7bb923333b98151005bed39154d9f4b5e8310b5c37de1d69c2c3")
version("3.2", sha256="db7d681a8697d6ef49acf3e97e8bec35b048ce0ad74549c3b738bbdff496618f")
@@ -51,11 +56,15 @@ class AoclSparse(CMakePackage):
description="Enable experimental AVX512 support",
)
depends_on("amdblis", when="@4.1:")
depends_on("amdlibflame", when="@4.1:")
for vers in ["4.1", "4.2"]:
with when(f"@={vers}"):
depends_on(f"amdblis@={vers}")
depends_on(f"amdlibflame@={vers}")
if Version(vers) >= Version("4.2"):
depends_on(f"aocl-utils@={vers}")
depends_on("boost", when="+benchmarks")
depends_on("boost", when="@2.2")
depends_on("cmake@3.11:", type="build")
depends_on("cmake@3.15:", type="build")
@property
def build_directory(self):
@@ -78,15 +87,15 @@ def cmake_args(self):
spec = self.spec
if not (
spec.satisfies(r"%aocc@3.2:4.1")
spec.satisfies(r"%aocc@3.2:4.2")
or spec.satisfies(r"%gcc@12.2:13.1")
or spec.satisfies(r"%clang@15:16")
or spec.satisfies(r"%clang@15:17")
):
tty.warn(
"AOCL has been tested to work with the following compilers\
versions - gcc@12.2:13.1, aocc@3.2:4.1, and clang@15:16\
see the following aocl userguide for details: \
https://www.amd.com/content/dam/amd/en/documents/developer/version-4-1-documents/aocl/aocl-4-1-user-guide.pdf"
"AOCL has been tested to work with the following compilers "
"versions - gcc@12.2:13.1, aocc@3.2:4.2, and clang@15:17 "
"see the following aocl userguide for details: "
"https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-documents/aocl/aocl-4-2-user-guide.pdf"
)
args = []
@@ -100,15 +109,21 @@ def cmake_args(self):
args.append(self.define_from_variant("BUILD_ILP64", "ilp64"))
if self.spec.satisfies("@4.0:"):
args.append("-DAOCL_BLIS_LIB=" + str(self.spec["amdblis"].libs))
args.append(f"-DAOCL_BLIS_LIB={self.spec['amdblis'].libs}")
args.append(
"-DAOCL_BLIS_INCLUDE_DIR={0}/blis".format(self.spec["amdblis"].prefix.include)
)
args.append("-DAOCL_LIBFLAME=" + str(self.spec["amdlibflame"].libs))
args.append(f"-DAOCL_LIBFLAME={self.spec['amdlibflame'].libs}")
args.append(
"-DAOCL_LIBFLAME_INCLUDE_DIR={0}".format(self.spec["amdlibflame"].prefix.include)
)
if "@4.2:" in self.spec:
args.append(f"-DAOCL_UTILS_LIB={self.spec['aocl-utils'].libs}")
args.append(
"-DAOCL_UTILS_INCLUDE_DIR={0}".format(self.spec["aocl-utils"].prefix.include)
)
return args
@run_after("build")

View File

@@ -25,7 +25,7 @@ class AoclUtils(CMakePackage):
LICENSING INFORMATION: By downloading, installing and using this software,
you agree to the terms and conditions of the AMD AOCL-Utils license
agreement. You may obtain a copy of this license agreement from
https://www.amd.com/en/developer/aocl/utils/utils-eula/utils-eula-4-1.html
https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-eulas/utils-elua-4-2.pdf
"""
_name = "aocl-utils"
@@ -37,30 +37,44 @@ class AoclUtils(CMakePackage):
license("BSD-3-Clause")
version(
"4.2",
sha256="48ce7fae592f5c73a1c3d2c18752f43c939451ed5d3f7a154551f738af440d77",
preferred=True,
)
version("4.1", sha256="a2f271f5eef07da366dae421af3c89286ebb6239047a31a46451758d4a06bc85")
variant("doc", default=False, description="enable documentation")
variant("tests", default=False, description="enable testing")
variant("shared", default=True, when="@4.2:", description="build shared library")
variant("examples", default=False, description="enable examples")
depends_on("doxygen", when="+doc")
@property
def libs(self):
"""find aocl-utils libs function"""
shared = "+shared" in self.spec
return find_libraries("libaoclutils", root=self.prefix, recursive=True, shared=shared)
def cmake_args(self):
spec = self.spec
if not (
self.spec.satisfies(r"%aocc@3.2:4.1")
or self.spec.satisfies(r"%gcc@12.2:13.1")
or self.spec.satisfies(r"%clang@15:16")
spec.satisfies(r"%aocc@3.2:4.2")
or spec.satisfies(r"%gcc@12.2:13.1")
or spec.satisfies(r"%clang@15:17")
):
tty.warn(
"AOCL has been tested to work with the following compilers\
versions - gcc@12.2:13.1, aocc@3.2:4.1, and clang@15:16\
see the following aocl userguide for details: \
https://www.amd.com/content/dam/amd/en/documents/developer/version-4-1-documents/aocl/aocl-4-1-user-guide.pdf"
"AOCL has been tested to work with the following compilers "
"versions - gcc@12.2:13.1, aocc@3.2:4.2, and clang@15:17 "
"see the following aocl userguide for details: "
"https://www.amd.com/content/dam/amd/en/documents/developer/version-4-2-documents/aocl/aocl-4-2-user-guide.pdf"
)
args = []
args.append(self.define_from_variant("ALCI_DOCS", "doc"))
args.append(self.define_from_variant("ALCI_TESTS", "tests"))
args.append(self.define_from_variant("BUILD_SHARED_LIBS", "shared"))
args.append(self.define_from_variant("ALCI_EXAMPLES", "examples"))
return args

View File

@@ -1,41 +0,0 @@
From eb1e1351da41a0da25aa056636932acd8a4f955f Mon Sep 17 00:00:00 2001
From: Ethan Stewart <ethan.stewart@amd.com>
Date: Fri, 25 Sep 2020 09:53:42 -0500
Subject: [PATCH] Add amdgcn to devicelibs bitcode names 3.8
---
clang/lib/Driver/ToolChains/AMDGPUOpenMP.cpp | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/clang/lib/Driver/ToolChains/AMDGPUOpenMP.cpp b/clang/lib/Driver/ToolChains/AMDGPUOpenMP.cpp
index 25d3db59d44..1bb9d993bf7 100644
--- a/clang/lib/Driver/ToolChains/AMDGPUOpenMP.cpp
+++ b/clang/lib/Driver/ToolChains/AMDGPUOpenMP.cpp
@@ -148,21 +148,21 @@ const char *AMDGCN::OpenMPLinker::constructOmpExtraCmds(
llvm::StringRef WaveFrontSizeBC;
std::string GFXVersion = SubArchName.drop_front(3).str();
if (stoi(GFXVersion) < 1000)
- WaveFrontSizeBC = "oclc_wavefrontsize64_on.bc";
+ WaveFrontSizeBC = "oclc_wavefrontsize64_on.amdgcn.bc";
else
- WaveFrontSizeBC = "oclc_wavefrontsize64_off.bc";
+ WaveFrontSizeBC = "oclc_wavefrontsize64_off.amdgcn.bc";
// FIXME: remove double link of hip aompextras, ockl, and WaveFrontSizeBC
if (Args.hasArg(options::OPT_cuda_device_only))
BCLibs.append(
{Args.MakeArgString("libomptarget-amdgcn-" + SubArchName + ".bc"),
- "hip.bc", "ockl.bc",
+ "hip.amdgcn.bc", "ockl.amdgcn.bc",
std::string(WaveFrontSizeBC)});
else {
BCLibs.append(
{Args.MakeArgString("libomptarget-amdgcn-" + SubArchName + ".bc"),
Args.MakeArgString("libaompextras-amdgcn-" + SubArchName + ".bc"),
- "hip.bc", "ockl.bc",
+ "hip.amdgcn.bc", "ockl.amdgcn.bc",
Args.MakeArgString("libbc-hostrpc-amdgcn.a"),
std::string(WaveFrontSizeBC)});
--
2.17.1

View File

@@ -1,41 +0,0 @@
From 2414b9faee9c264ce4b92b4d709375313df03344 Mon Sep 17 00:00:00 2001
From: Ethan Stewart <ethan.stewart@amd.com>
Date: Tue, 22 Sep 2020 13:39:22 -0500
Subject: [PATCH] Add amdgcn to devicelibs bitcode names
---
clang/lib/Driver/ToolChains/AMDGPUOpenMP.cpp | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/clang/lib/Driver/ToolChains/AMDGPUOpenMP.cpp b/clang/lib/Driver/ToolChains/AMDGPUOpenMP.cpp
index cc9b4f1caba..d22609fbe62 100644
--- a/clang/lib/Driver/ToolChains/AMDGPUOpenMP.cpp
+++ b/clang/lib/Driver/ToolChains/AMDGPUOpenMP.cpp
@@ -148,21 +148,21 @@ const char *AMDGCN::OpenMPLinker::constructOmpExtraCmds(
llvm::StringRef WaveFrontSizeBC;
std::string GFXVersion = SubArchName.drop_front(3).str();
if (stoi(GFXVersion) < 1000)
- WaveFrontSizeBC = "oclc_wavefrontsize64_on.bc";
+ WaveFrontSizeBC = "oclc_wavefrontsize64_on.amdgcn.bc";
else
- WaveFrontSizeBC = "oclc_wavefrontsize64_off.bc";
+ WaveFrontSizeBC = "oclc_wavefrontsize64_off.amdgcn.bc";
// FIXME: remove double link of hip aompextras, ockl, and WaveFrontSizeBC
if (Args.hasArg(options::OPT_cuda_device_only))
BCLibs.append(
{Args.MakeArgString("libomptarget-amdgcn-" + SubArchName + ".bc"),
- "hip.bc", "ockl.bc",
+ "hip.amdgcn.bc", "ockl.amdgcn.bc",
std::string(WaveFrontSizeBC)});
else {
BCLibs.append(
{Args.MakeArgString("libomptarget-amdgcn-" + SubArchName + ".bc"),
Args.MakeArgString("libaompextras-amdgcn-" + SubArchName + ".bc"),
- "hip.bc", "ockl.bc",
+ "hip.amdgcn.bc", "ockl.amdgcn.bc",
Args.MakeArgString("libbc-hostrpc-amdgcn.a"),
std::string(WaveFrontSizeBC)});
--
2.17.1

View File

@@ -1,28 +0,0 @@
From 526efe86427a4d49da38773534d84025dd4246c3 Mon Sep 17 00:00:00 2001
From: Ethan Stewart <ethan.stewart@amd.com>
Date: Tue, 10 Nov 2020 15:32:59 -0600
Subject: [PATCH] Add cmake option for copying source for debugging.
---
openmp/CMakeLists.txt | 8 ++++++++
1 file changed, 8 insertions(+)
diff --git a/openmp/CMakeLists.txt b/openmp/CMakeLists.txt
index a86e83c50212..51962b561a3b 100644
--- a/openmp/CMakeLists.txt
+++ b/openmp/CMakeLists.txt
@@ -103,3 +103,11 @@ endif()
# Now that we have seen all testsuites, create the check-openmp target.
construct_check_openmp_target()
+
+option(DEBUG_COPY_SOURCE "Enable source code copy for openmp debug build."
+ ${ENABLE_SOURCE_COPY})
+if (${ENABLE_SOURCE_COPY})
+ install(DIRECTORY runtime/src DESTINATION ${OPENMP_INSTALL_LIBDIR}/src/openmp/runtime)
+ install(DIRECTORY libomptarget/src libomptarget/plugins DESTINATION ${OPENMP_INSTALL_LIBDIR}/src/openmp/libomptarget)
+ install(DIRECTORY libompd/src DESTINATION ${OPENMP_INSTALL_LIBDIR}/src/openmp/libompd)
+endif()
--
2.17.1

View File

@@ -1,528 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import re
from spack.package import *
tools_url = "https://github.com/ROCm"
compute_url = "https://github.com/ROCm"
aomp = [
"e4526489833896bbc47ba865e0d115fab278ce269789a8c99a97f444595f5f6a",
"970374c3acb9dda8b9a17d7a579dbaab48fac731db8fdce566a65abee37e5ed3",
"86f90d6505eccdb2840069cadf57f7111d4685653c4974cf65fb22b172e55478",
"14fc6867af0b17e3bff8cb42cb36f509c95a29b7a933a106bf6778de21f6c123",
"ce29cead5391a4a13f2c567e2e059de9291888d24985460725e43a91b740be7a",
]
devlib = [
"dce3a4ba672c4a2da4c2260ee4dc96ff6dd51877f5e7e1993cb107372a35a378",
"b3a114180bf184b3b829c356067bc6a98021d52c1c6f9db6bc57272ebafc5f1d",
"e82cc9a8eb7d92de02cabb856583e28f17a05c8cf9c97aec5275608ef1a38574",
"c99f45dacf5967aef9a31e3731011b9c142446d4a12bac69774998976f2576d7",
"bca9291385d6bdc91a8b39a46f0fd816157d38abb1725ff5222e6a0daa0834cc",
]
llvm = [
"b4fd7305dc57887eec17cce77bbf42215db46a4a3d14d8e517ab92f4e200b29d",
"89b967de5e79f6df7c62fdc12529671fa30989ae7b634d5a7c7996629ec1140e",
"98deabedb6cb3067ee960a643099631902507f236e4d9dc65b3e0f8d659eb55c",
"f0a0b9fec0626878340a15742e73a56f155090011716461edcb069dcf05e6b30",
"3ff18a8bd31d5b55232327e574dfa3556cf26787e105d0ba99411c5687325a8d",
]
flang = [
"cc27f8bfb49257b7a4f0b03f4ba5e06a28dcb6c337065c4201b6075dd2d5bc48",
"1fe07a0da20eb66a2a2aa8d354bf95c6f216ec38cc4a051e98041e0d13c34b36",
"54cc6a9706dba6d7808258632ed40fa6493838edb309709d3b25e0f9b02507f8",
"43d57bcc87fab092ac242e36da62588a87b6fa91f9e81fdb330159497afdecb3",
"81674bf3c9d8fd9b16fb3e5c66a870537c25ff8302fc1b162ab9e95944167163",
]
extras = [
"5dbf27f58b8114318208b97ba99a90483b78eebbcad4117cac6881441977e855",
"adaf7670b2497ff3ac09636e0dd30f666a5a5b742ecdcb8551d722102dcfbd85",
"4460a4f4b03022947f536221483e85dcd9b07064a54516ec103a1939c3f587b5",
"014fca1fba54997c6db0e84822df274fb6807698b6856da4f737f38f10ab0e5d",
"ee146cff4b9ee7aae90d7bb1d6b4957839232be0e7dab1865e0ae39832f8f795",
]
# Used only for 3.5.0
hip = ["86eb7749ff6f6c5f6851cd6c528504d42f9286967324a50dd0dd54a6a74cacc7"]
vdi = ["b21866c7c23dc536356db139b88b6beb3c97f58658836974a7fc167feb31ad7f"]
opencl = ["8963fcd5a167583b3db8b94363778d4df4593bfce8141e1d3c32a59fb64a0cf6"]
versions = ["3.5.0", "3.7.0", "3.8.0", "3.9.0", "3.10.0"]
versions_dict = dict() # type: Dict[str, Dict[str, str]]
hashes = [aomp, devlib, llvm, flang, extras]
hashes_35 = [aomp, devlib, llvm, flang, extras, hip, vdi, opencl]
components = ["aomp", "devlib", "llvm", "flang", "extras"]
components_35 = ["aomp", "devlib", "llvm", "flang", "extras", "hip", "vdi", "opencl"]
for outer_index, item in enumerate(versions):
if item == "3.5.0":
use_components = components_35
use_hashes = hashes_35
else:
use_components = components
use_hashes = hashes
for inner_index, component in enumerate(use_hashes):
versions_dict.setdefault(item, {})[use_components[inner_index]] = use_hashes[inner_index][
outer_index
]
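# Illustrative shape of the mapping built above (hash strings shortened):
# versions_dict == {
#     "3.5.0": {"aomp": "e452...", "devlib": "dce3...", ..., "hip": "86eb...", "vdi": "b218...", "opencl": "8963..."},
#     "3.7.0": {"aomp": "9703...", "devlib": "b3a1...", "llvm": "89b9...", "flang": "1fe0...", "extras": "adaf..."},
#     ...
# }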
class Aomp(Package):
"""llvm openmp compiler from AMD."""
homepage = tools_url + "/aomp"
url = tools_url + "/aomp/archive/rocm-3.10.0.tar.gz"
maintainers("srekolam", "arjun-raj-kuppala", "estewart08")
tags = ["e4s"]
version("3.10.0", sha256=versions_dict["3.10.0"]["aomp"])
version("3.9.0", sha256=versions_dict["3.9.0"]["aomp"])
# Cmake above 3.18 would fail the build on 3.5.0
depends_on("cmake@3:", type="build")
depends_on("cmake@3:3.17", when="@3.5.0", type="build")
depends_on("python@3:", type="build", when="@3.9.0:")
depends_on("py-setuptools", when="@3.9.0:", type="build")
depends_on("gl@4.5:", type=("build", "link"))
depends_on("py-pip", when="@3.8.0:", type="build")
depends_on("py-wheel", when="@3.8.0:", type=("build", "run"))
depends_on("perl-data-dumper", type="build")
depends_on("awk", type="build")
depends_on("elfutils", type=("build", "link"))
depends_on("libffi", type=("build", "link"))
for ver in ["3.5.0", "3.7.0", "3.8.0", "3.9.0", "3.10.0"]:
depends_on("hsakmt-roct@" + ver, when="@" + ver)
depends_on("comgr@" + ver, type="build", when="@" + ver)
depends_on("hsa-rocr-dev@" + ver, when="@" + ver)
depends_on("rocm-device-libs@" + ver, when="@" + ver)
if ver != "3.5.0":
depends_on("hip@" + ver, when="@" + ver)
depends_on("hip-rocclr@" + ver, when="@" + ver)
if ver == "3.9.0" or ver == "3.10.0":
depends_on("rocm-gdb@" + ver, when="@" + ver)
resource(
name="rocm-device-libs",
url=compute_url + "/ROCm-Device-Libs/archive/rocm-" + ver + ".tar.gz",
sha256=versions_dict[ver]["devlib"],
expand=True,
destination="aomp-dir",
placement="rocm-device-libs",
when="@" + ver,
)
resource(
name="amd-llvm-project",
url=tools_url + "/amd-llvm-project/archive/rocm-" + ver + ".tar.gz",
sha256=versions_dict[ver]["llvm"],
expand=True,
destination="aomp-dir",
placement="amd-llvm-project",
when="@" + ver,
)
resource(
name="flang",
url=tools_url + "/flang/archive/rocm-" + ver + ".tar.gz",
sha256=versions_dict[ver]["flang"],
expand=True,
destination="aomp-dir",
placement="flang",
when="@" + ver,
)
resource(
name="aomp-extras",
url=tools_url + "/aomp-extras/archive/rocm-" + ver + ".tar.gz",
sha256=versions_dict[ver]["extras"],
expand=True,
destination="aomp-dir",
placement="aomp-extras",
when="@" + ver,
)
if ver == "3.5.0":
resource(
name="hip-on-vdi",
url=tools_url + "/hip/archive/aomp-3.5.0.tar.gz",
sha256=versions_dict["3.5.0"]["hip"],
expand=True,
destination="aomp-dir",
placement="hip-on-vdi",
when="@3.5.0",
)
resource(
name="vdi",
url=tools_url + "/rocclr/archive/aomp-3.5.0.tar.gz",
sha256=versions_dict["3.5.0"]["vdi"],
expand=True,
destination="aomp-dir",
placement="vdi",
when="@3.5.0",
)
resource(
name="opencl-on-vdi",
sha256=versions_dict["3.5.0"]["opencl"],
url=compute_url + "/ROCm-OpenCL-Runtime/archive/aomp-3.5.0.tar.gz",
expand=True,
destination="aomp-dir",
placement="opencl-on-vdi",
when="@3.5.0",
)
# Copy source files over for debug build in 3.9.0
patch(
"0001-Add-cmake-option-for-copying-source-for-debugging.patch",
working_dir="aomp-dir/amd-llvm-project",
when="@3.9.0:",
)
# Revert back to .amdgcn.bc naming scheme for 3.8.0
patch(
"0001-Add-amdgcn-to-devicelibs-bitcode-names-3.8.patch",
working_dir="aomp-dir/amd-llvm-project",
when="@3.8.0",
)
# Revert back to .amdgcn.bc naming scheme for 3.7.0
patch(
"0001-Add-amdgcn-to-devicelibs-bitcode-names.patch",
working_dir="aomp-dir/amd-llvm-project",
when="@3.7.0",
)
def patch(self):
# Make sure python2.7 is used for the generation of hip header
if self.spec.version == Version("3.5.0"):
kwargs = {"ignore_absent": False, "backup": False, "string": False}
with working_dir("aomp-dir/hip-on-vdi"):
match = "^#!/usr/bin/python"
python = self.spec["python"].command.path
substitute = "#!{python}".format(python=python)
files = ["hip_prof_gen.py", "vdi/hip_prof_gen.py"]
filter_file(match, substitute, *files, **kwargs)
src = self.stage.source_path
libomptarget = "{0}/aomp-dir/amd-llvm-project/openmp/libomptarget"
aomp_extras = "{0}/aomp-dir/aomp-extras/aomp-device-libs"
flang = "{0}/aomp-dir/flang/"
if self.spec.version >= Version("3.9.0"):
filter_file(
"ADDITIONAL_VERSIONS 2.7",
"ADDITIONAL_VERSIONS 3",
flang.format(src) + "CMakeLists.txt",
)
if self.spec.version >= Version("3.8.0"):
filter_file(
"{CMAKE_INSTALL_PREFIX}",
"{HSA_INCLUDE}",
libomptarget.format(src) + "/hostrpc/services/CMakeLists.txt",
)
filter_file(
"CONFIG",
"CONFIG PATHS ${CMAKE_INSTALL_PREFIX} NO_DEFAULT_PATH",
libomptarget.format(src) + "/../libompd/test/CMakeLists.txt",
)
if self.spec.version != Version("3.5.0"):
filter_file(
"{ROCM_DIR}/aomp/amdgcn/bitcode",
"{DEVICE_LIBS_DIR}",
libomptarget.format(src) + "/hostrpc/CMakeLists.txt",
libomptarget.format(src) + "/deviceRTLs/amdgcn/CMakeLists.txt",
)
if self.spec.version == Version("3.5.0"):
filter_file(
"{ROCM_DIR}/lib/bitcode",
"{DEVICE_LIBS_DIR}",
libomptarget.format(src) + "/deviceRTLs/hostcall/CMakeLists.txt",
)
filter_file(
"{ROCM_DIR}/lib/bitcode",
"{DEVICE_LIBS_DIR}",
aomp_extras.format(src) + "/aompextras/CMakeLists.txt",
aomp_extras.format(src) + "/libm/CMakeLists.txt",
libomptarget.format(src) + "/deviceRTLs/amdgcn/CMakeLists.txt",
string=True,
)
filter_file(
r"${ROCM_DIR}/hsa/include ${ROCM_DIR}/hsa/include/hsa",
"${HSA_INCLUDE}/hsa/include ${HSA_INCLUDE}/hsa/include/hsa",
libomptarget.format(src) + "/plugins/hsa/CMakeLists.txt",
string=True,
)
filter_file(
"{ROCM_DIR}/hsa/lib",
"{HSA_LIB}",
libomptarget.format(src) + "/plugins/hsa/CMakeLists.txt",
)
filter_file(
r"{ROCM_DIR}/lib\)",
"{HSAKMT_LIB})\nset(HSAKMT_LIB64 ${HSAKMT_LIB64})",
libomptarget.format(src) + "/plugins/hsa/CMakeLists.txt",
)
filter_file(
r"-L${LIBOMPTARGET_DEP_LIBHSAKMT_LIBRARIES_DIRS}",
"-L${LIBOMPTARGET_DEP_LIBHSAKMT_LIBRARIES_DIRS} -L${HSAKMT_LIB64}",
libomptarget.format(src) + "/plugins/hsa/CMakeLists.txt",
string=True,
)
filter_file(
r"-rpath,${LIBOMPTARGET_DEP_LIBHSAKMT_LIBRARIES_DIRS}",
"-rpath,${LIBOMPTARGET_DEP_LIBHSAKMT_LIBRARIES_DIRS}" + ",-rpath,${HSAKMT_LIB64}",
libomptarget.format(src) + "/plugins/hsa/CMakeLists.txt",
string=True,
)
filter_file(
"{ROCM_DIR}/include",
"{COMGR_INCLUDE}",
libomptarget.format(src) + "/plugins/hsa/CMakeLists.txt",
)
filter_file(
"{ROCM_DIR}/include",
"{COMGR_INCLUDE}",
libomptarget.format(src) + "/plugins/hsa/CMakeLists.txt",
)
filter_file(
r"-L${LLVM_LIBDIR}${OPENMP_LIBDIR_SUFFIX}",
"-L${LLVM_LIBDIR}${OPENMP_LIBDIR_SUFFIX} -L${COMGR_LIB}",
libomptarget.format(src) + "/plugins/hsa/CMakeLists.txt",
string=True,
)
filter_file(
r"rpath,${LLVM_LIBDIR}${OPENMP_LIBDIR_SUFFIX}",
"rpath,${LLVM_LIBDIR}${OPENMP_LIBDIR_SUFFIX}" + "-Wl,-rpath,${COMGR_LIB}",
libomptarget.format(src) + "/plugins/hsa/CMakeLists.txt",
string=True,
)
def setup_run_environment(self, env):
devlibs_prefix = self.spec["rocm-device-libs"].prefix
aomp_prefix = self.spec["aomp"].prefix
env.set("HIP_DEVICE_LIB_PATH", "{0}/amdgcn/bitcode".format(format(devlibs_prefix)))
env.set("AOMP", "{0}".format(format(aomp_prefix)))
def setup_build_environment(self, env):
aomp_prefix = self.spec["aomp"].prefix
env.set("AOMP", "{0}".format(format(aomp_prefix)))
env.set("FC", "{0}/bin/flang".format(format(aomp_prefix)))
env.set("GFXLIST", "gfx700 gfx701 gfx801 gfx803 gfx900 gfx902 gfx906 gfx908")
def install(self, spec, prefix):
src = self.stage.source_path
gfx_list = "gfx700;gfx701;gfx801;gfx803;gfx900;gfx902;gfx906;gfx908"
aomp_prefix = self.spec["aomp"].prefix
devlibs_prefix = self.spec["rocm-device-libs"].prefix
hsa_prefix = self.spec["hsa-rocr-dev"].prefix
hsakmt_prefix = self.spec["hsakmt-roct"].prefix
comgr_prefix = self.spec["comgr"].prefix
opencl_src = "/aomp-dir/opencl-on-vdi/api/opencl"
omp_src = "/aomp-dir/amd-llvm-project/openmp"
debug_map_format = "-fdebug-prefix-map={0}{1}={2}".format(src, omp_src, aomp_prefix)
if self.spec.version >= Version("3.9.0"):
bitcode_dir = "/amdgcn/bitcode"
else:
bitcode_dir = "/lib"
components = dict()
components["amd-llvm-project"] = [
"../aomp-dir/amd-llvm-project/llvm",
"-DLLVM_ENABLE_PROJECTS=clang;lld;compiler-rt",
"-DCMAKE_BUILD_TYPE=release",
"-DLLVM_ENABLE_ASSERTIONS=ON",
"-DLLVM_TARGETS_TO_BUILD=AMDGPU;X86",
"-DCMAKE_C_COMPILER={0}".format(self.compiler.cc),
"-DCMAKE_CXX_COMPILER={0}".format(self.compiler.cxx),
"-DCMAKE_ASM_COMPILER={0}".format(self.compiler.cc),
"-DBUG_REPORT_URL=https://github.com/ROCm/aomp",
"-DLLVM_ENABLE_BINDINGS=OFF",
"-DLLVM_INCLUDE_BENCHMARKS=OFF",
"-DLLVM_BUILD_TESTS=OFF",
"-DLLVM_INCLUDE_TESTS=OFF",
"-DCLANG_INCLUDE_TESTS=OFF",
"-DCMAKE_VERBOSE_MAKEFILE=1",
"-DCMAKE_INSTALL_RPATH_USE_LINK_PATH=FALSE",
]
if self.spec.version == Version("3.5.0"):
components["vdi"] = [
"../aomp-dir/vdi",
"-DUSE_COMGR_LIBRARY=yes",
"-DOPENCL_DIR={0}{1}".format(src, opencl_src),
]
components["hip-on-vdi"] = [
"../aomp-dir/hip-on-vdi",
"-DVDI_ROOT={0}/aomp-dir/vdi".format(src),
"-DHIP_COMPILER=clang",
"-DHIP_PLATFORM=vdi",
"-DVDI_DIR={0}/aomp-dir/vdi".format(src),
"-DHSA_PATH={0}".format(hsa_prefix),
"-DLIBVDI_STATIC_DIR={0}/spack-build-vdi".format(src),
"-DCMAKE_CXX_FLAGS=-Wno-ignored-attributes",
]
components["aomp-extras"] = [
"../aomp-dir/aomp-extras",
"-DROCM_PATH=$ROCM_DIR ",
"-DDEVICE_LIBS_DIR={0}{1}".format(devlibs_prefix, bitcode_dir),
"-DAOMP_STANDALONE_BUILD=0",
"-DDEVICELIBS_ROOT={0}/aomp-dir/rocm-device-libs".format(src),
"-DCMAKE_VERBOSE_MAKEFILE=1",
]
openmp_common_args = [
"-DROCM_DIR={0}".format(hsa_prefix),
"-DDEVICE_LIBS_DIR={0}{1}".format(devlibs_prefix, bitcode_dir),
"-DAOMP_STANDALONE_BUILD=0",
"-DDEVICELIBS_ROOT={0}/aomp-dir/rocm-device-libs".format(src),
"-DOPENMP_TEST_C_COMPILER={0}/bin/clang".format(aomp_prefix),
"-DOPENMP_TEST_CXX_COMPILER={0}/bin/clang++".format(aomp_prefix),
"-DLIBOMPTARGET_AMDGCN_GFXLIST={0}".format(gfx_list),
"-DLIBOMP_COPY_EXPORTS=OFF",
"-DHSA_INCLUDE={0}".format(hsa_prefix),
"-DHSA_LIB={0}/lib".format(hsa_prefix),
"-DHSAKMT_LIB={0}/lib".format(hsakmt_prefix),
"-DHSAKMT_LIB64={0}/lib64".format(hsakmt_prefix),
"-DCOMGR_INCLUDE={0}/include".format(comgr_prefix),
"-DCOMGR_LIB={0}/lib".format(comgr_prefix),
"-DOPENMP_ENABLE_LIBOMPTARGET=1",
"-DOPENMP_ENABLE_LIBOMPTARGET_HSA=1",
]
components["openmp"] = ["../aomp-dir/amd-llvm-project/openmp"]
components["openmp"] += openmp_common_args
components["openmp-debug"] = [
"../aomp-dir/amd-llvm-project/openmp",
"-DLIBOMPTARGET_NVPTX_DEBUG=ON",
"-DOPENMP_ENABLE_LIBOMPTARGET=1",
"-DOPENMP_ENABLE_LIBOMPTARGET_HSA=1" "-DCMAKE_CXX_FLAGS=-g",
"-DCMAKE_C_FLAGS=-g",
]
if self.spec.version >= Version("3.9.0"):
components["openmp-debug"] += [
"-DENABLE_SOURCE_COPY=ON",
"-DOPENMP_SOURCE_DEBUG_MAP={0}".format(debug_map_format),
]
if self.spec.version >= Version("3.8.0"):
components["openmp-debug"] += [
"-DLIBOMP_ARCH=x86_64",
"-DLIBOMP_OMP_VERSION=50",
"-DLIBOMP_OMPT_SUPPORT=ON",
"-DLIBOMP_USE_DEBUGGER=ON",
"-DLIBOMP_CFLAGS=-O0",
"-DLIBOMP_CPPFLAGS=-O0",
"-DLIBOMP_OMPD_ENABLED=ON",
"-DLIBOMP_OMPD_SUPPORT=ON",
"-DLIBOMP_OMPT_DEBUG=ON",
]
components["openmp-debug"] += openmp_common_args
flang_common_args = [
"-DLLVM_ENABLE_ASSERTIONS=ON",
"-DLLVM_CONFIG={0}/bin/llvm-config".format(aomp_prefix),
"-DCMAKE_CXX_COMPILER={0}/bin/clang++".format(aomp_prefix),
"-DCMAKE_C_COMPILER={0}/bin/clang".format(aomp_prefix),
"-DCMAKE_Fortran_COMPILER={0}/bin/flang".format(aomp_prefix),
"-DLLVM_TARGETS_TO_BUILD=AMDGPU;x86",
]
components["pgmath"] = ["../aomp-dir/flang/runtime/libpgmath"]
components["pgmath"] += flang_common_args
components["flang"] = [
"../aomp-dir/flang",
"-DFLANG_OPENMP_GPU_AMD=ON",
"-DFLANG_OPENMP_GPU_NVIDIA=ON",
]
components["flang"] += flang_common_args
components["flang-runtime"] = [
"../aomp-dir/flang",
"-DLLVM_INSTALL_RUNTIME=ON",
"-DFLANG_BUILD_RUNTIME=ON",
"-DOPENMP_BUILD_DIR={0}/spack-build-openmp/runtime/src".format(src),
]
components["flang-runtime"] += flang_common_args
if self.spec.version != Version("3.5.0"):
build_order = [
"amd-llvm-project",
"aomp-extras",
"openmp",
"openmp-debug",
"pgmath",
"flang",
"flang-runtime",
]
elif self.spec.version == Version("3.5.0"):
build_order = [
"amd-llvm-project",
"vdi",
"hip-on-vdi",
"aomp-extras",
"openmp",
"openmp-debug",
"pgmath",
"flang",
"flang-runtime",
]
# Override standard CMAKE_BUILD_TYPE
for arg in std_cmake_args:
found = re.search("CMAKE_BUILD_TYPE", arg)
if found:
std_cmake_args.remove(arg)
for component in build_order:
with working_dir("spack-build-{0}".format(component), create=True):
cmake_args = components[component]
cmake_args.extend(std_cmake_args)
# OpenMP build needs to be run twice(Release, Debug)
if component == "openmp-debug":
cmake_args.append("-DCMAKE_BUILD_TYPE=Debug")
else:
cmake_args.append("-DCMAKE_BUILD_TYPE=Release")
cmake(*cmake_args)
make()
make("install")

Some files were not shown because too many files have changed in this diff