spack/share
Adam J. Stewart 46f5b192ef
PyTorch: build flash attention by default, except in CI (#48521)
* PyTorch: build flash attention by default, except in CI

* Variant is boolean, only available when +cuda/+rocm

* desc -> _desc
2025-02-11 13:20:10 -08:00
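
For reference, the change this commit describes amounts to a conditional boolean variant in the py-torch package.py. The sketch below is illustrative only: the variant name "flash_attention", the description text, and the class contents are assumptions, not the exact directives in Spack's real py-torch package.

# Illustrative sketch of a conditional boolean variant in a Spack package.py.
# Names and descriptions are assumptions, not the exact code from py-torch.
from spack.package import *


class PyTorch(PythonPackage):
    """Tensors and Dynamic neural networks in Python with GPU acceleration."""

    # Build flash attention by default, but only expose the variant when a
    # GPU backend is enabled. It is declared once per backend because a
    # Spack "when" condition cannot express "+cuda or +rocm" in one spec.
    variant(
        "flash_attention",
        default=True,
        when="+cuda",
        description="Build the flash-attention kernels",
    )
    variant(
        "flash_attention",
        default=True,
        when="+rocm",
        description="Build the flash-attention kernels",
    )

With a variant along these lines, CI can opt out explicitly by requesting a spec such as py-torch+cuda~flash_attention, while ordinary +cuda or +rocm builds pick up flash attention by default.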