CI Timing Statistics (#38598)

* Write timing information for installs from cache

* CI: aggregate and upload install_times.json to artifacts

* CI: Don't change root directory for artifact generation

* Flat, event-based timer variation

The event-based timer makes it easy to start and stop timers without
wiping sub-timer data, and it requires less branching logic when
tracking time.

In this version the JSON output is non-hierarchical, and hierarchy is
less rigidly enforced between start and stop events. A minimal sketch
of the design appears after this list of changes.

* Add and write timers for top level install

* Update completion

* Remove unused sub-timer API

* Fix unit tests

* Suppress timing summary option

* Save timers summaries to user_data artifacts

* Remove completion from fish

* Move spack python to script section

* Write timer correctly for non-cache installs

* Re-add hash to timer file

* Fish completion updates

* Fix null timer yield value

* Fix type hints

* Remove timer-summary-file option

* Add "." in front of non-package timer name
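
As a reference for the flat, event-based design described above, here is a minimal sketch (illustrative only; `FlatTimer` and its method names are hypothetical, not Spack's actual timer API):

    import json
    import sys
    import time
    from collections import defaultdict


    class FlatTimer:
        """Phases are started and stopped by name and accumulate into a
        flat table rather than a nested hierarchy."""

        def __init__(self):
            self._starts = {}                  # phase name -> start timestamp
            self._totals = defaultdict(float)  # phase name -> accumulated seconds

        def start(self, name):
            # Restarting a phase records a new start point; it never wipes
            # time already accumulated for that phase.
            self._starts[name] = time.monotonic()

        def stop(self, name):
            start = self._starts.pop(name, None)
            if start is not None:
                self._totals[name] += time.monotonic() - start

        def write_json(self, stream):
            # Flat, non-hierarchical output: one entry per phase name.
            phases = [{"name": n, "seconds": s} for n, s in self._totals.items()]
            json.dump({"phases": phases}, stream)


    timer = FlatTimer()
    timer.start("install")
    timer.start(".fetch")  # "." prefix marks a non-package (phase) timer
    timer.stop(".fetch")
    timer.start(".fetch")  # accumulates on top of the earlier interval
    timer.stop(".fetch")
    timer.stop("install")
    timer.write_json(sys.stdout)

Because every phase lives in one flat table keyed by name, stopping or restarting a phase needs no knowledge of a parent timer, which is what removes the branching logic noted above.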

---------

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Authored by kwryankrattiger on 2023-09-07 15:41:31 -05:00; committed by GitHub
Parent: c5fc794d77
Commit: 8ec1657136
7 changed files with 155 additions and 38 deletions


@@ -0,0 +1,38 @@
#!/usr/bin/env spack-python
"""
This script is meant to be run using:
    `spack python aggregate_logs.spack.py`
"""

import os


def find_logs(prefix, filename):
    # Walk the prefix and yield the path of every log file named `filename`.
    for root, _, files in os.walk(prefix):
        if filename in files:
            yield os.path.join(root, filename)


if __name__ == "__main__":
    import json
    from argparse import ArgumentParser

    parser = ArgumentParser("aggregate_logs")
    parser.add_argument("output_file")
    parser.add_argument("--log", default="install_times.json")
    parser.add_argument("--prefix", required=True)

    args = parser.parse_args()

    # --prefix is a colon-separated list; skip any roots that don't exist.
    prefixes = [p for p in args.prefix.split(":") if os.path.exists(p)]

    # Aggregate the install timers into a single json
    data = []
    for prefix in prefixes:
        time_logs = find_logs(prefix, args.log)
        for log in time_logs:
            with open(log) as fd:
                data.append(json.load(fd))

    with open(args.output_file, "w") as fd:
        json.dump(data, fd)
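
The aggregated output file is a JSON array with one timer document per log found under the given prefixes. Note the design of `--prefix`: it takes a colon-separated, PATH-style list, so a single argument can cover several install roots, and roots that do not exist are skipped rather than treated as errors. A hypothetical invocation (the paths are illustrative only): `spack python aggregate_logs.spack.py install_times.json --prefix /path/to/rootA:/path/to/rootB`.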