* fix remaining flake8 errors
* imports: sort imports everywhere in Spack
We enabled import order checking in #23947, but fixing things manually drives
people crazy. This used `spack style --fix --all` from #24071 to automatically
sort everything in Spack so PR submitters won't have to deal with it.
This should go in after #24071, as it assumes we're using `isort`, not
`flake8-import-order` to order things. `isort` seems to be more flexible and
allows `llnl` imports to be in their own group before `spack` ones, so this
seems like a good switch.
* Fix compiler test
Use `self.spec.satisfies` on compiler to determine if a flag should be
applied or not. This approach avoids issues with the strings `gcc`
or `clang` appearing in the full path to the compiler executables, as
happens with spack-installed compilers (e.g. `nvhpc%gcc`).
* Limit compiler name search to last path component
@skosukhin pointed out that the cflag modification should happen for any
clang or gcc compiler, regardless of what compiler spec provides them.
This commit reverts to searching for a compiler name containing "gcc"
or "clang", but limits the search to the last path component, which
avoids matching spack-installed compilers built with gcc (e.g.
`nvhpc%gcc`), which will have "gcc" in the compiler path.
* Use `os.path` rather than `pathlib`
Co-authored-by: Paul Henning <phenning@lanl.gov>
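A minimal sketch of the last-path-component check described above, assuming a typical `flag_handler`; the exact flag and condition are illustrative, not the package's actual code:
```python
import os

def flag_handler(self, name, flags):
    # Look for "gcc"/"clang" only in the last path component of the compiler,
    # so spack-installed compilers such as nvhpc%gcc (whose install *prefix*
    # contains "gcc") are not matched by accident.
    compiler_name = os.path.basename(self.compiler.cc)
    if name == 'cflags' and ('gcc' in compiler_name or 'clang' in compiler_name):
        flags.append('-Wno-error')  # illustrative flag only
    return (flags, None, None)
```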
`dateutil.parser` was an optional dependency for CVS tests. It was failing on macOS
because the dateutil types were not being installed, and mypy was failing *even when the
CVS tests were skipped*. This seems like it was an oversight on macOS --
`types-dateutil-parser` was not installed there, though it was on Linux unit tests.
It takes 6 lines of YAML and some weird test-skipping logic to get `python-dateutil` and
`types-python-dateutil` installed in all the tests where we need them, but it only takes
4 lines of code to write the date parser we need for CVS, so I just did that instead.
Note that CVS date format can vary from system to system, but it seems like it's always
pretty similar for the parts we care about.
- [x] Replace dateutil.parser with a simpler date regex
- [x] Lose the dependency on `dateutil.parser`
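For illustration, a date parser along the lines described above might look like this (the regex and return format are assumptions, not necessarily what Spack ended up with):
```python
import re

def cvs_date_to_tuple(line):
    # Match an ISO-like timestamp such as "2021-07-01 12:34:56" anywhere in a
    # CVS log line and return it as a tuple of ints for easy comparison.
    m = re.search(r'(\d{4})[-/](\d\d)[-/](\d\d)\s+(\d\d):(\d\d):(\d\d)', line)
    return tuple(int(x) for x in m.groups()) if m else None

print(cvs_date_to_tuple('date: 2021-07-01 12:34:56 +0000;  author: someone;'))
```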
Previous tests of `spack style` didn't really run the tools --
they just ensure that the commands worked enough to get coverage.
This adds several real tests and ensures that we hit the corner
cases in `spack style`. This also tests success as well as failure
cases.
This consolidates code across tools in `spack style` so that each
`run_<tool>` function can be called indirectly through a dictionary
of handlers, and so that checks like finding the executable for the
tool can be shared across commands.
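The consolidation might look roughly like the sketch below (the `tool` decorator name and the handler body are illustrative, not the exact code in `spack style`):
```python
tool_names = []      # tool order is defined in one place
tool_handlers = {}   # name -> run_<tool> handler

def tool(name):
    """Decorator to register a run_<tool> handler for `spack style`."""
    def register(fn):
        tool_names.append(name)
        tool_handlers[name] = fn
        return fn
    return register

@tool("isort")
def run_isort(isort_cmd, file_list, args):
    # The shared driver finds each tool's executable, then dispatches
    # indirectly: tool_handlers[name](executable, files, args).
    return isort_cmd("--check", *file_list)
```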
- [x] rework `spack style` to use decorators to register tools
- [x] define tool order in one place in `spack style`
- [x] fix python 2/3 issues to get `isort` checks working
- [x] make isort error regex more robust across versions
- [x] remove unused output option
- [x] change vestigial `TRAVIS_BRANCH` to `GITHUB_BASE_REF`
- [x] update completion
This PR configures the spack docbook packages:
- docbook-xsl
- docbook-xml
The public entities are now mapped to the locally installed files of the
respective packages. The example catalogs are left in place and
XML_CATALOG_FILES points to the newly created catalogs.
Perl keeps copies of the bzip2 and zlib source code in its own source
tree and by default uses them in preference to outside libraries. Instead,
put these dependencies under control of spack and tell perl to use the
spack-built versions.
* py-keyring: fix installation on linux
* Update var/spack/repos/builtin/packages/py-keyring/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-keyring/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
We should not fail the generate stage simply due to the presence of
a broken-spec somewhere in the DAG. Only fail if the known broken
spec needs to be rebuilt.
This PR adds a context manager that permits grouping the common part of a `when=` argument and adding it to the context:
```python
class Gcc(AutotoolsPackage):
    with when('+nvptx'):
        depends_on('cuda')
        conflicts('@:6', msg='NVPTX only supported in gcc 7 and above')
        conflicts('languages=ada')
        conflicts('languages=brig')
        conflicts('languages=go')
```
The above snippet is equivalent to:
```python
class Gcc(AutotoolsPackage):
    depends_on('cuda', when='+nvptx')
    conflicts('@:6', when='+nvptx', msg='NVPTX only supported in gcc 7 and above')
    conflicts('languages=ada', when='+nvptx')
    conflicts('languages=brig', when='+nvptx')
    conflicts('languages=go', when='+nvptx')
```
which requires repeating the `when='+nvptx'` argument. The context manager should help improve readability and permits grouping directives related to the same semantic aspect (e.g. all the directives needed to model the behavior of `gcc` when `+nvptx` is active).
Modifications:
- [x] Added a `when` context manager to be used with package directives
- [x] Add unit tests and documentation for the new feature
- [x] Modified `cp2k` and `gcc` to show the use of the context manager
I installed curl on my mac and it picked up a homebrew (I think?)
installation of gsasl. A later system update broke git because of the
implicitly added dependency. Explicitly disabling libraries that *might*
exist on the system is the safe approach here.
```
dyld: Library not loaded: /usr/local/opt/gsasl/lib/libgsasl.7.dylib
Referenced from: /rnsdhpc/code/spack/opt/spack/apple-clang/curl/gag5v3c/lib/libcurl.4.dylib
Reason: image not found
error: git-remote-https died of signal 6
```
* Added Perl workaround for CUDA <= 8
* Re-wrapped comment
* Proofreading corrections
* Added a reference
* Do not override Perl include path
* Retrieve shell once
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
* trilinos: add teko conflict
* trilinos: improve gotype variant
Instead of 'none' and 'long' typically being the same (but not for older
trilinos versions), add an explicit 'all' variant that only works for
older trilinos which supports multiple simultaneous tpetra
instantiations.
* trilinos: add self as maintainer
* trilinos: disable vendored gtest by default
This changes several conflicting variants to a single
multi-value variant, and uses conflicts instead of raising InstallError.
(With clingo, requesting +gui automatically selects features=huge!)
I have also rearranged the dependencies for clarity and simplified the
configure args.
ci: only write to broken-specs list on SpackError
Only write to the broken-specs list when `spack install` raises a SpackError,
instead of writing to this list unnecessarily when infrastructure-related problems
prevent a develop job from completing successfully.
If two Specs have the same hash (and prefix) but are not equal, Spack
originally had logic to detect this and raise an error (since both
cannot be installed in the same place). Recently this has eroded and
the check no longer works; moreover, when defining projections (which
may truncate the hash or other distinguishing properties from the
prefix) Spack was also failing to detect collisions (in both of these
cases, Spack would overwrite the old prefix with the new Spec).
This PR maintains a list of all "taken" prefixes: if a hash is not
registered (i.e. recorded as installed in the database) but the prefix
is occupied, that is a collision. This can detect collisions created
by defining projections (specifically when they omit the hash).
The PR does not detect collisions where specs have the same hash
(and prefix) but are not equal.
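Conceptually, the new check amounts to something like the following (the record and database names here are hypothetical placeholders, not Spack's actual API):
```python
def is_prefix_collision(spec, installed_records):
    # A prefix is "taken" if some installed record occupies it. If the spec's
    # hash is not registered as installed but its prefix is taken, that is a
    # collision (e.g. a projection that omits the hash).
    taken_prefixes = {record.prefix for record in installed_records}
    registered = any(record.dag_hash == spec.dag_hash() for record in installed_records)
    return spec.prefix in taken_prefixes and not registered
```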
Fix the syntax of the conflict between numpy 1.21.0 and gcc 11 so that the clingo
concretizer recognizes it.
In addition, the upstream master branch was renamed to main.
* Switch hdf5 package from autotools to cmake.
* Add variant for building with zlib, default to ON.
* Update for format requirements.
* Format change.
* Fix breakage from last merge from develop.
Switch szip to use libaec (unrestricted encryption).
Remove 'static' variant: static libs will only be installed when
~shared.
* Improve args based on suggestions from pull request.
* Update code URL to github.com
Add/modify 4 depends_on lines to fix running "spack graph --deptype=link hdf5".
* Remove trailing whitespace.
* Remove dependencies added solely to make "spack graph --deptype=link" work.
* Add new version HDF5 1.8.22.
* Remove unnecessary java_check.
* Fix whitespace for style checks.
* Reverted zlib version dependency to 1.1.2:.
zlib variant removed.
API version default renamed to "default".
* Remove blank line.
* Whitespace corrections.
* Removed unnecessary 'debug' variant.
* Fix typo in version number in conflict for '+szip'.
* Set default for tools variant to True.
Remove patch functions dependent on 'libtool' file that cmake doesn't
produce.
* Remove line to set ONLY_SHARED_LIBS to true.
Add post_install code to install only one version of tools with shared
linkage and original tool names.
* Remove trailing white space and import of glob package not used.
* Leave BUILD_TESTING set to default which is ON.
* Remove post_install code to install only one version of tools because
some dependent packages running tests in e4s testing are using
h5diff-shared. Keep both tool versions for now.
* No longer need to import os.
Instead of refusing to build +mpi with gcc10, add what I guess is now
the standard workaround, i.e., `-fallow-argument-mismatch`.
Getting this into pfunit's cmake-based but kinda non-standard build is
a bit ugly, but you gotta do what you gotta do...
Version 1.17 of DD4hep was renamed from "01-17-00" to "01-17", in line
with the naming conventions of previous releases. Since release archives
contain a subdirectory with the version string in it, this changes the contents
of the tarball ever so slightly, so the SHA-256 checksum must change as well.
Fix url to find newer versions, add newest version 4.0.2 and add
variants for
- cxxstd: To use a specific C++ standard
- static: Enable or disable build of static libraries
- boost: Boost support
- sqlite: SQLite support
- postgresql: PostgreSQL support
When having a few packages loaded, installing go-bootstrap will fail
because the `PATH` variable is truncated at 4096 bytes. Increase the
limit to 128 KiB to make longer paths fit.
1. "+simplex" conflicts with "dealii@:9.2" [The interface to simplex is supported from version 9.3.0 onwards. Please explicitly disable this variant via ~simplex]
2. "+arborx" conflicts with "dealii@:9.2" [The interface to arborx is supported from version 9.3.0 onwards. Please explicitly disable this variant via ~arborx]
Prior to any Spack build, Spack modifies PATH etc. to help the build
find the dependencies it needs. It also allows any package to define
custom environment modifications (and furthermore a package can
specify environment modifications to apply when it is used as a
dependency). If an external package defines custom environment
modifications that alter PATH, and the external package is in a merged
or system prefix, then that prefix could "override" the Spack-built
packages.
This commit reorders environment modifications so that PrependPath
actions which expose Spack-built packages override PrependPath actions
for custom environment modifications of external packages.
In more detail, the original order of environment modifications is:
* Modules
* Compiler flag variables
* PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH for dependencies
* Custom package.py modifications in the following order:
* dependencies
* root
This commit changes the order:
* Modules
* Compiler flag variables
* For each external dependency
* PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH modifications
* Custom modifications
* For each Spack-built dependency
* PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH modifications
* Custom modifications
Spack pipelines need to take specific actions internally that depend
on whether the pipeline is being run on a PR to spack or a merge to
the develop branch. Pipelines can also run in other repositories,
which represents other possible use cases than just the two mentioned
above. This PR creates a "SPACK_PIPELINE_TYPE" gitlab variable which
is propagated to rebuild jobs, and is also used internally to determine
which pipeline-specific tasks to run.
One goal of the PR is to fix an issue where rebuild jobs which failed on
develop pipelines did not properly report the broken full hash to the
"broken-specs-url".
* Add Externally Findable section to info command
* Use comma delimited detection attributes in addition to boolean value
* Unit test externally detectable part of spack info
yes I know this name isn't popular but that's the way it is right now.
master and the upcoming v5.0.x release branch use git submodules.
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
* [py-xxhash] created template
* [py-xxhash] working on dependencies
* [py-xxhash] set version for xxhash
* [py-xxhash] Final cleanup
- added homepage
- added description
- removed fixmes
* Force the Python interpreter with an env variable
This commit forces the Python interpreter with an
environment variable, to ensure that the Python set
by the "setup-python" action is the one being used.
Due to the policy adopted by Spack to prefer python3
over python we may end up picking a Python 3.X
interpreter where Python 2.7 was meant to be used.
* Revert "Update conftest.py (#24473)"
This reverts commit 477c8ce820.
* Make python-dateutil a soft dependency for unit tests
Before #23212 people could clone spack and run
```
spack unit-tests
```
while now this is not possible, since python-dateutil is
a required but not vendored dependency. This change makes
it not a hard requirement, i.e. it will be used if found
in the current interpreter.
* Workaround mypy complaint
This commit fixes a subtle bug that may occur when
a package is a "possible_provider" of a virtual but
no "provides_virtual" can be deduced. In that case
the cardinality constraint on "provides_virtual"
may arbitrarily assign a package the role of provider
even if the constraints for it to be one are not fulfilled.
The fix reworks the logic around three concepts:
- "possible_provider": a package may provide a virtual if some constraints are met
- "provides_virtual": a package meet the constraints to provide a virtual
- "provider": a package selected to provide a virtual
Spack packages can now fetch versions from CVS repositories. Note
this fetch mechanism is unsafe unless using :extssh:. Most public
CVS repositories use an insecure protocol implemented as part of CVS.
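A version pulled from CVS might be declared roughly as below; the `cvs=`/`%module=` syntax and keyword names are assumptions here, so check the Spack fetching docs for the authoritative form:
```python
from spack import *


class Example(AutotoolsPackage):
    # Assumed syntax: the CVS root plus a %module= suffix, optionally pinned
    # to a branch or date. Prefer :extssh: over the insecure :pserver: method.
    version('2021.07.01',
            cvs=':extssh:user@cvs.example.com:/cvsroot/example%module=example',
            date='2021-07-01')
```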
Here we are adding an install_times.json into the spack install metadata folder.
We record a total, global time, along with the times for each phase. The type
of phase or install start / end is included (e.g., build or fail).
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
The original implementation of `flag_handler` searched the
`self.compiler.cc` string for `clang` or `gcc` in order to add a flag
for those compilers. This approach fails when using a spack-installed
compiler that was itself built with gcc or clang, as those strings will
appear in the fully-qualified compiler executable paths. This commit
switches to searching for `%gcc` or `%clang` in `self.spec`.
Co-authored-by: Paul Henning <phenning@lanl.gov>
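A minimal sketch of the `self.spec`-based check described here (the flag shown is illustrative):
```python
def flag_handler(self, name, flags):
    # Test the spec itself instead of the compiler path, so "gcc" appearing in
    # a spack-installed compiler's install prefix cannot cause a false match.
    if name == 'cflags' and (self.spec.satisfies('%gcc') or
                             self.spec.satisfies('%clang')):
        flags.append('-Wno-error')  # illustrative flag only
    return (flags, None, None)
```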
the 4.1.1 release has fixes for problems that kept 4.1.0 from
being the default open mpi version to build using spack.
related to #24396
Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
* remove blueos check on cuda variant, fix typo
* restore necessary compiler guard
* remove axom+cuda from testing because it only partially works outside ppc systems
This PR does the following:
- adds version corresponding to commit at 08/03/2020
- adds missing get_DE_events.py script
- adds dependencies needed by get_DE_events.py
- removes REDItoolDenovo.py.patch and python2to3.patch in favor of
running 2to3 and reindent pre-build
- add batch_sort.patch to handle differences in string/char handling
between python2 and python3
- adds a variant for the Nature Protocol
- adds dependencies for the nature_protocol variant
- added myself as maintainer
This PR adds a new version of reditools from git.
This PR fixes a couple of issues with the opencv package, mostly in
relation to cuda. This is only focused on cuda, not any of the other
variants.
- Added versions to the contrib_vers list. Added for all that can be
retrieved from github. The one for the latest version was missing.
- Added a cmake patch for v3.2.0.
- Deprecated versions 3.1.0 and 3.2.0 as neither of those could be
built, with or without cuda.
- Adjusted constraints on applying initial cmake patch.
- Added cudnn dependency when +cuda.
- Set constraints for cudnn and cuda for older versions of opencv.
Add a new "spack audit" command. This command can check for issues
with configuration or with packages and is intended to help a
user debug a failed Spack build.
In some cases the reported issues are always errors but are too
costly to check for (e.g. packages that specify missing variants on
dependencies). In other cases the issues may be legitimate but
uncommon usage of Spack and we want to be sure the user intended the
behavior (e.g. duplicate compiler definitions).
Audits are grouped by theme, and for now the two themes are packages
and configuration. For example you can run all available audits
on packages with "spack audit packages". It is intended that in
the future users will be able to define their own audits.
The package audits are good candidates for running in package_sanity
(i.e. they could catch bugs in user-submitted packages before they
are merged) but that is left for a later PR.
Building magma has been failing consistently and is currently
blocking PRs from being merged. Disable that spec while we
investigate the failure and work on a fix.
This should get us most of the way there to support using monitor during a spack container build, for both Singularity and Docker. Some quick notes:
## Docker
Docker works by way of BUILDKIT and being able to specify --secret. What this means is that you can prefix a line with a mount of type secret as follows:
```bash
# Install the software, remove unnecessary deps
RUN --mount=type=secret,id=su --mount=type=secret,id=st cd /opt/spack-environment && spack env activate . && export SPACKMON_USER=$(cat /run/secrets/su) && export SPACKMON_TOKEN=$(cat /run/secrets/st) && spack install --monitor --fail-fast && spack gc -y
```
Where the id for one or more secrets corresponds to the file mounted at `/run/secrets/<name>`. So, for example, to build this container with su (spackmon user) and st (spackmon token) defined I would export them on my host and do:
```bash
$ DOCKER_BUILDKIT=1 docker build --network="host" --secret id=st,env=SPACKMON_TOKEN --secret id=su,env=SPACKMON_USER -t spack/container .
```
And when we add `env` to the secret definition that tells the build to look for the secret with id "st" in the environment variable `SPACKMON_TOKEN` for example.
If the user is building locally with a local spack monitor, we also need to set the `--network` to be the host, otherwise you can't connect to it (a la isolation of course.)
## Singularity
Singularity doesn't have as nice an ability to clearly specify secrets, so (hoping this eventually gets implemented) what I'm doing now is providing the user instructions to write the credentials to a file, add it to the container to source, and remove when done.
## Tags
Note that the tags PR https://github.com/spack/spack/pull/23712 will need to be merged before `--monitor-tags` will actually work because I'm checking for the attribute (that doesn't exist yet):
```bash
"tags": getattr(args, "monitor_tags", None)
```
So when that PR is merged to update the argument group, it will work here, and I can either update the PR here to not check if the attribute is there (it will be) or open another one in the case this PR is already merged.
Finally, I added a bunch of documentation for how to use monitor with containerize. I say "mostly working" because I can't do a full test run with this new version until the container base is built with the updated spack (the request to the monitor server for an env install was missing, so I had to add it here).
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Inline codecov annotations make the code hard to read, and they add annotations
in files that seemingly have nothing to do with the PR. Sadly, they add a whole
lot of noise and not a lot of benefit over looking at the PR on codecov. We
should just have people look at the coverage on codecov itself.
* New package: py-pyusb
Change-Id: I606127858b961b5841c60befc5a8353df0f9f38c
* fixup dependencies
Change-Id: I0c9b0ccee693d2c4e847717950d4ce64cb319794
* fixup 2
Change-Id: Ibaccbdafd865e363564f491054e4e4ceb778727b
* Update var/spack/repos/builtin/packages/py-pyusb/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
A patch no longer applies cleanly as it's fixed in v4.0.6 - fix it here
==> Installing openmpi-4.0.6-in47f6rxspbnyibkdx6x4ekg6piujobd
==> No binary for openmpi-4.0.6-in47f6rxspbnyibkdx6x4ekg6piujobd found: installing from source
==> Fetching https://download.open-mpi.org/release/open-mpi/v4.0/openmpi-4.0.6.tar.bz2
Reversed (or previously applied) patch detected! Assume -R? [n]
Apply anyway? [n]
2 out of 2 hunks ignored -- saving rejects to file opal/include/opal/sys/gcc_builtin/atomic.h.rej
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
When running executables from build dependencies, we want to avoid having
`LD_PRELOAD` and `DYLD_INSERT_LIBRARIES` mix any of their shared libraries
built by spack with system libraries.
The Z3 solver provides a Z3Config.cmake file when built using the CMake build
system. This submission changes the package build system to inherit the
CMakePackage type. In addition to changing the build system, this submission:
- Adds the GMP variant
- Removes v4.4.0 and v4.4.1 as CMake was implemented starting with v4.5.0
This adds a package for `irep`, a tool for reading `lua` input decks from
Fortran, C, and C++.
`irep` can be built with either `lua` or `luajit`. To address this, we also add
a virtual package for lua called `lua-lang`. `luajit` isn't, by default, a drop-in
replacement for `lua`, but we add a `+lualinks` variant to it that adds symlinks
that make it behave like `lua@5.1`. With this variant enabled, it provides the
`lua-lang` virtual. `lua` always provides `lua-lang`.
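A rough sketch of how that wiring could look in the two package recipes (base classes, defaults, and descriptions here are illustrative):
```python
from spack import *


class Lua(Package):
    """Plain lua: always provides the virtual."""
    provides('lua-lang')


class LuaLuajit(Package):
    """luajit: a drop-in lua replacement only when the symlinks are added."""
    variant('lualinks', default=False,
            description='Add symlinks to make luajit a drop-in lua replacement')
    provides('lua-lang', when='+lualinks')
```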
- [x] add `irep` package
- [x] add `+lualinks` variant to `lua-luajit`
- [x] create `lua-lang` virtual, provided by `lua` and `luajit+lualinks`
Co-authored-by: Kayla Richarda Butler <butler59@quartz1148.llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
* libdrm: fix one configure error and require libpciaccess
Failure with `LIBS`: the linker can't find `-lrt` so configure fails on
darwin-bigsur %apple-clang@12.0.5
```
>> 22 configure: error: in `/private/var/folders/gy/mrg1ffts2h945qj9k29s1l1dvvmbqb/T/s3j/spack-s
tage/spack-stage-libdrm-2.4.100-ofhk6m25n2pi427ihnxmvjkfmgyzlrqc/spack-src':
>> 23 configure: error: C compiler cannot create executables
24 See `config.log' for more details
See build log for details:
/var/folders/gy/mrg1ffts2h945qj9k29s1l1dvvmbqb/T/s3j/spack-stage/spack-stage-libdrm-2.4.100-ofhk6m25n2pi427ihnxmvjkfmgyzlrqc/spack-build-out.txt
```
* libpciaccess: Mark conflict with darwin
```
make[2]: *** [common_init.lo] Error 1
make[2]: *** Waiting for unfinished jobs....
common_interface.c:75:10: fatal error: 'sys/endian.h' file not found
^~~~~~~~~~~~~~
```
and
```
common_init.c:73:3: error: "Unsupported OS"
```
and others
* extending example for buildcaches
I was attempting to create a local build cache from a directory, and I found the
docs for both buildcaches and mirrors, but did not connect that the url variable
could be a local filesystem path. I am extending the docs for
buildcaches with an example of creating and interacting with one on the filesystem
because I suspect other users will run into this need and possibly not find what
they are looking for.
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
* adding as follows to spack mirror list
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* update url, add all new versions and fix installation
* add wxparaver package and set the old paraver package as deprecated
* remove update of deprecated package
* remove old version from new wxparaver
* Update url
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
It is currently kind of confusing to the reader to distinguish spack buildcache install
and spack install, and it is not clear how to use a build cache once a mirror is added.
Hopefully this little bit of description can help (and I hope I got it right!)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Use the 'version_yearlike' attribute instead of 'version' to
check if the SPACK_COMPILER_EXTRA_RPATHS should be set to include
the built-in 'libfabrics'.
When using the bare 'version', the comparison is wrong when
building with 'intel-parallel-studio', which has the version
format '<edition>.YYYY.Nupdate', due to the leading '<edition>'.
xfsprogs currently fails to install with the error message:
FATAL ERROR: could not find a valid ini.h header.
Adding the libinih package and including it as
a dependency of xfsprogs seems to fix the issue. It could be
that we only need to add it for newer versions (if it worked before)
and maybe a maintainer can comment on that.
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Pagination on GitHub prevents Spack from easily parsing all available
versions. Also, due to the recent migration to GitHub, tarballs for
versions up to 3.12.13 have been regenerated, changing the hashes.
The current URL will apparently remain supported, so we keep it, and give
the alternative one as a comment.
This should fix #24278
$INSTALLDIR/lib/python3.7/site-packages/IPython/core/events.py contains an
import from backcall even in @7.3.0, so the dependency on py-backcall needs
to start earlier.
Restrict poppler version for texlive to poppler@:0.84
Should fix #19946
See also https://github.com/NixOS/nixpkgs/issues/79170
Looks like poppler@0.84 upgraded their header files to use the C++ cstdio
instead of the C stdio.h. Since TeX is using C, not C++, this causes problems.
* zfp: several package improvements
- add variants for build targets, language bindings, backends
- ensure selected variants are compatible with zfp version
- point to GitHub (not LLNL) tarballs
- add dependencies
- update link to homepage
- add maintainers
* zfp: address suggestions by Spack team
- use conflicts() instead of raising exceptions
- use define() and define_from_variant() where applicable
* Apply suggestions from code review
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* Fix ZFP OpenMP build.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* [py-keyboard] created template
* [py-keyboard]
- updated homepage
- added dependency for OSX
- added description
- removed fixmes
* [py-keyboard] Until py-pyobjc can be created, specifying conflict with platform=darwin
* [py-keyboard] is verb
* Update of Flecsi Spackage
Update of flecsi spackage to reconcile differences between flecsi@1:1.9
and flecsi@2: for future support purposes
* Removing Unnecessary Conditional
Removing unused conditional. Initially the plan was to switch based on
version in `cmake_args` but this was not necessary as build system
variable names remained mostly the same and conflicts prevent the rest.
For the most part, if a variant is there it does not need to check
against what version of the code is being built.
* Updated CI To Reconcile Flecsi Changes
Updated CI to target flecsi@1.4.2 which best matches the previous
release version and reconciled change in variant name
The common.inc script in TBB uses the environ var 'OS' to determine
the platform it's on. On Linux, this is normally empty and TBB falls
back to uname. But some systems set this to 'CentOS Linux 8' which is
descriptive, but not exactly what common.inc is looking for.
Instead, take the value from python and explicitly set OS to what TBB
expects to avoid this problem.
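In package terms, that fix could be sketched as below, setting `OS` from Python's `platform` module in the build-environment hook (the exact value common.inc expects is an assumption here):
```python
import platform

def setup_build_environment(self, env):
    # Override a distro-specific value such as "CentOS Linux 8" with the
    # plain platform name so common.inc's OS detection behaves as intended.
    env.set('OS', platform.system())
```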
Since the two packages share a common history, the installation
procedure has been factored into a common base class.
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
* Tcl: fix TCLLIBPATH
* Fix TCL|TK|TIX_LIBRARY paths
* Fix TCL_LIBRARY, no tcl8.6 subdir
* Don't rely on os.listdir sorting
For tcl and tk, we also install the source directory, so there are
two init.tcl and tk.tcl locations. We want the one in lib/lib64,
which should come before the one in share.
* Add more patches
* Fix dylib on macOS
* Tk: add smoke tests
* Tix: add smoke test
Extracting specs for the result of a solve has been factored
as a method into the asp.Result class. The method accounts for
virtual specs being passed as initial requests.
Minimizing compiler mismatches in the DAG and preferring newer
versions of packages are now higher priority than trying to use as
many default values as possible in multi-valued variants.
According to the docs, r is needed for plotting, but plotting is
untested. In addition, the specific version requirement of java for gatk
could lead to multiple installations of r being triggered in an
environment. That might cause people to have to be deliberate about
java in a deployment. All in all, it seems that r is better as a
variant for gatk.
* Set job_id for SGE in darshan-runtime package
* Use a multi value variant for scheduler
Only one scheduler can be selected, so give the variant a fixed set of
possible values and set multi=False.
* hdf-eos5: Fix issue when linking against hdf5+szip (#23411)
Should fix issue #23411 when linking against hdf5+szip
Also fix bug if hdf5 does not depend on zlib
Reluctantly added payerle as a maintainer
Added version 1.1.13
Fixed versions for dependencies based on README.md for package
In particular:
* versions 1.1.x require python@3, at least 3.4 and for 1.1.13 at least 3.6
* py-osqp had been pinned to version 0.4.1, but README.md either shows
no version restriction, or 0.4.1 and higher
* @1.1.13 requires at least 1.1.6 of py-scs
* I am assuming that since 1.1.x is python@3 only, py-six is no longer required
(it was not explicitly showing up in README.md for these versions)
Since the module roots were removed from the config file,
`--print-shell-vars` cannot find the module roots anymore. Fix it by
using the new `root_path` function. Moreover, the roots for lmod and
modules seem to have been flipped by accident.
* add versions 2.2.0.2 and 2.2.1.1
* Add maintainer
Added Ishaan as additional maintainer as he is also maintainer of the Python bindings
* add new major precice version as dependency
The VALID_VERSION regex didn't check that the version string was
completely valid, only that a prefix of it was. This change ensures
the entire string represents a valid version.
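The difference is the usual one between matching a prefix and anchoring the whole string; a toy illustration (the actual pattern in Spack is more involved than this assumed segment regex):
```python
import re

SEGMENT = r'[A-Za-z0-9]+|[._-]'                    # assumed, simplified segments
PREFIX_ONLY = re.compile(r'(?:%s)+' % SEGMENT)     # old behavior: prefix match
WHOLE_STRING = re.compile(r'^(?:%s)+$' % SEGMENT)  # new behavior: full match

print(bool(PREFIX_ONLY.match('1.2.3 oops')))     # True  -- bad input accepted
print(bool(WHOLE_STRING.match('1.2.3 oops')))    # False -- bad input rejected
print(bool(WHOLE_STRING.match('1.2.3')))         # True
```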
This makes a few related changes.
1. Make the SEGMENT_REGEX identify *which* arm it matches by what groups
are populated, including whether it's a string or int component or a
separator all at once.
2. Use the updated regex to parse the input once with a findall rather
than twice, once with findall and once with split, since the version
components and separators can be distinguished by their group status.
3. Rather than "convert to int, on exception stay string," if the int
group is set then convert to int, if not then construct an instance
of the VersionStrComponent class
4. VersionStrComponent now implements all of the special string
comparison logic as part of its __lt__ and __eq__ methods to deal
with infinity versions and also overloads comparison with integers.
5. Version now uses direct tuple comparison since it has no per-element
special logic outside the VersionStrComponent class.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* New Package:py-haphpipe@1.0.3
* removed llvm restrict. & changed freebayes
* Style fix
* Removed pip, wheel, added url for deps list
* used proper gsutil naming
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* url src for deps, samtools fix
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* petsc: add hip variant
* libceed: add 0.8, disable occa by default, and let autodetect AVX
Disabling OCCA because backend updates did not make this release and
there are some known bugs so most users won't have reason to use OCCA.
https://github.com/CEED/libCEED/pull/688
* WIP: ceed: 4.0 release
* MFEM package updates (#19748)
* MFEM package updates
* mfem: flake8
* [mfem] Various fixes and tweaks.
[arpack-ng] Add a patch to fix building with IBM XL Fortran.
[libceed] Fix building with IBM XL C/C++.
[pumi] Add C++11 flag for version 2.2.3.
* [mfem] Fix the shared CUDA build.
Reported by: @MPhysXDev
* [mfem] Fix a TODO item
* [mfem] Tweak the AmgX dependencies
* [suite-sparse] Fix the version of the mpfr dependency
* MFEM: add initial HIP support using the ROCmPackage.
* MFEM: add 'slepc' variant.
* MFEM: update the patch for v4.2 for SLEPc.
* mfem: apply 'mfem-4.2-slepc.patch' just to v4.2.
* ceed: apply 'spack style'
* [mfem] Add a patch for mfem v4.2 to work with petsc v3.15.0.
[laghos] Add laghos version 3.1 based on the latest commit in
the repository; this version works with mfem v4.2.
[ceed] For ceed v4.0 use laghos v3.1.
* [libceed] Explicitly set 'CC_VENDOR=icc' when using 'intel'
compiler.
* [mfem] Allow pumi >= 2.2.3 with mfem >= 4.2.0.
[ceed] Use pumi v2.2.5 with ceed v4.0.0.
* [ceed] Explicitly use occa v1.1.0 with ceed v4.0.0.
Use mfem@4.2.0+rocm with ceed@4.0.0+mfem+hip.
* [ceed] Add NekRS v21 as a dependency for ceed v4.0.0.
* [ceed] Fix NekRS version: 21 --> 21.0
* [ceed] Propagate +cuda variant to petsc for ceed v4.0.
* [mfem] Propagate '+rocm' variant to some other packages.
* [ceed] Use +rocm variant of nekrs instead of +hip.
* [ceed] Do not enable magma with ceed@4.0.0+hip.
* [libceed] Fix hip build with libceed@0.8.
* [laghos] For v3.1, use the release .tar.gz file instead of commit.
* Remove cuda & hip variants as they are inherited
* [ceed] Remove comments and FIXMEs about 'magma+hip'.
* [ceed] [libceed] Remove TODOs about occa + hip.
* libceed: use ROCmPackage and +rocm
* petsc: use ROCmPackage for HIP
* libceed, petsc: use CudaPackage
* ceed: forward cuda_arch and amdgpu_target
* [mfem] Use Spack's CudaPackage as a base class; as a result,
'cuda_arch' values should not include the 'sm_' prefix.
Also, propagate 'cuda_arch' and 'amdgpu_target' variants
to enabled dependencies.
* petsc: variant is +rocm, package name is hip
Co-authored-by: Jed Brown <jed@jedbrown.org>
Co-authored-by: Thilina Rathnayake <thilinarmtb@gmail.com>