πŸ› File an issue
Pipeline Status
πŸ”΄ Failing
Success Rate (7d)
71.7%
Builds (7d)
60
Median Duration
13m
P95: 39m
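The median and P95 duration cards above summarize the distribution of build times. A minimal sketch of how such percentiles can be computed (nearest-rank method; the sample durations below are hypothetical, not this dashboard's actual runs):

```python
# Illustrative sketch: median and P95 of build durations in minutes.
# `durations_min` is hypothetical sample data, not real dashboard runs.

def percentile(values, pct):
    """Nearest-rank percentile of a non-empty list of numbers."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(pct / 100 * len(ordered)) - 1))
    return ordered[k]

durations_min = [9, 11, 12, 12, 13, 13, 14, 15, 19, 31, 39, 49]

median = percentile(durations_min, 50)
p95 = percentile(durations_min, 95)
print(f"median: {median}m, P95: {p95}m")  # median: 13m, P95: 39m
```

Note that P95 is far more sensitive to a handful of slow outlier builds than the median, which is why the dashboard reports both.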
DORA Level
High
Deployment Frequency
15.5 / week
elite
How often builds are deployed. Elite: >7Γ—/week, High: 1–7Γ—/week, Medium: monthly, Low: less than monthly.
Lead Time
16m
elite
Time from commit to production. Elite: <1h, High: <1d, Medium: <1wk, Low: >1wk.
Change Failure Rate
10.7%
medium
Percentage of deployments causing a failure. Elite: <5%, High: <10%, Medium: <15%, Low: β‰₯15%.
MTTR
4.3h
high
Mean time to restore service after a failure. Elite: <1h, High: <1d, Medium: <1wk, Low: >1wk.
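Each card above buckets its metric into a DORA level by fixed thresholds. A minimal sketch encoding two of the threshold sets exactly as stated on the cards (function names are illustrative, not from any library):

```python
# Minimal sketch of the per-metric DORA bucketing described above.
# Thresholds follow the card descriptions; function names are illustrative.

HOUR, DAY, WEEK = 1.0, 24.0, 24.0 * 7  # durations in hours

def lead_time_level(hours):
    """Elite: <1h, High: <1d, Medium: <1wk, Low: otherwise."""
    if hours < HOUR: return "elite"
    if hours < DAY: return "high"
    if hours < WEEK: return "medium"
    return "low"

def change_failure_level(rate_pct):
    """Elite: <5%, High: <10%, Medium: <15%, Low: >=15%."""
    if rate_pct < 5: return "elite"
    if rate_pct < 10: return "high"
    if rate_pct < 15: return "medium"
    return "low"

print(lead_time_level(16 / 60))    # 16m lead time -> elite
print(change_failure_level(10.7))  # 10.7% CFR -> medium
```

The same thresholds reproduce the badges shown above: a 16-minute lead time lands in "elite", and a 10.7% change failure rate lands in "medium".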
Build Success Rate
Daily Build Outcomes
Build Duration Trends
Avg Duration by Repo
Failure Category Breakdown
MTTR
4.3h
Mean Time to Recovery
MTBF
24.0h
Mean Time Between Failures
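MTTR and MTBF can both be derived from a chronological run history: MTTR averages the gap from each failure to the next passing run, MTBF averages the gap between consecutive failures. A hedged sketch under that definition (field layout and timestamps are synthetic, not this dashboard's data):

```python
# Hedged sketch: one common way to derive MTTR and MTBF from a
# chronological run history. Data and field layout are synthetic.
from datetime import datetime

runs = [  # (started_at, succeeded): synthetic example history
    (datetime(2026, 4, 6, 3, 0), True),
    (datetime(2026, 4, 7, 3, 0), False),  # failure
    (datetime(2026, 4, 7, 7, 30), True),  # recovered 4.5h later
    (datetime(2026, 4, 8, 3, 0), False),  # next failure, 24h after the last
]

def mttr_hours(runs):
    """Average hours from each failure to the next passing run."""
    gaps = []
    for i, (t_fail, ok) in enumerate(runs):
        if ok:
            continue
        nxt = next((t for t, ok2 in runs[i + 1:] if ok2), None)
        if nxt:  # unresolved failures have no recovery time yet
            gaps.append((nxt - t_fail).total_seconds() / 3600)
    return sum(gaps) / len(gaps) if gaps else None

def mtbf_hours(runs):
    """Average hours between consecutive failures."""
    fails = [t for t, ok in runs if not ok]
    deltas = [(b - a).total_seconds() / 3600 for a, b in zip(fails, fails[1:])]
    return sum(deltas) / len(deltas) if deltas else None

print(mttr_hours(runs), mtbf_hours(runs))  # 4.5 24.0
```

One design choice worth noting: a failure with no subsequent success contributes nothing to MTTR until it is resolved, which keeps the metric from swinging wildly mid-incident.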
Flakiness Leaderboard β€” Top 10 Flaky Jobs
# Job Name Repo Runs Failures Failure Rate Flakiness Index Last Failure Top Failing Step
1 build-testing / Check all successful ucore 47 20 42.6% 0.338 today β€”
2 build-lts / Check all successful ucore 47 21 44.7% 0.325 yesterday β€”
3 build-testing / Build: ucore-hci-nvidia… ucore 47 8 17.0% 0.318 5d ago Package Provenance
4 build-testing / Build: ucore-hci: x86_64 ucore 47 8 17.0% 0.318 5d ago Verify GitHub package downloads
5 build-testing / Build: ucore-hci-nvidia… ucore 47 8 17.0% 0.318 5d ago Verify GitHub package downloads
6 build-stable / Build: ucore-hci-nvidia-… ucore 47 8 17.0% 0.318 5d ago Verify GitHub package downloads
7 build-stable / Build: ucore-hci-nvidia-… ucore 47 8 17.0% 0.318 5d ago Verify GitHub package downloads
8 build-stable / Check all successful ucore 47 21 44.7% 0.311 2d ago β€”
9 build-lts / Build: ucore-nvidia-open: x… ucore 47 9 19.1% 0.311 5d ago Build Image
10 build-testing / Build: ucore-hci-nvidia… ucore 47 9 19.1% 0.311 5d ago Build Image
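The page does not state how the Flakiness Index is computed. One common definition, assumed here purely for illustration (not necessarily this dashboard's formula), is the flip rate: how often consecutive runs of the same job change outcome. A steadily failing job is broken but not flaky; an alternating one is maximally flaky.

```python
# Assumed flip-rate flakiness metric; illustrative only, not
# necessarily the formula behind the leaderboard's index values.

def flip_rate(outcomes):
    """Fraction of consecutive run pairs whose pass/fail outcome differs."""
    if len(outcomes) < 2:
        return 0.0
    flips = sum(1 for a, b in zip(outcomes, outcomes[1:]) if a != b)
    return flips / (len(outcomes) - 1)

print(flip_rate([True] * 5 + [False] * 5))  # one flip in 9 pairs -> ~0.11
print(flip_rate([True, False] * 5))         # flips on every pair -> 1.0
```

Whatever the exact formula, ranking by a flip-style index rather than raw failure rate is what separates genuinely flaky jobs from jobs that are simply broken.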
Builds Per Day
Avg Queue Wait Time
Build Trigger Breakdown
Builder Comparison β€” All Repos
Repo Success Rate 7d Success Rate 30d Avg Duration Total Runs (7d) Last Stream Status
ucore 71.7% 82.6% 16m 60 πŸ”΄ failure

Stream Health

Repo Stream 7d Rate 30d Rate Runs (7d) Avg Duration Last Run Status
ucore build-lts 70.0% 81.8% 20 19m 3h ago 🟒
ucore build-stable 75.0% 81.8% 20 14m 2h ago 🟒
ucore build-testing 70.0% 84.1% 20 15m 2h ago πŸ”΄
Publish Reliability
Repos at Full Coverage
0/1
β‰₯95% success rate (7d)
Overall 7d Success Rate
71.7%
Avg across all repos
Repos with Recent Activity
1/1
Total runs > 0 (7d)

Publish step tracking requires live build data β€” shown after first successful CI run.

Supply Chain Security

ucore
Cosign Signing
100.0%
30d success rate
SBOM Coverage
0.0%
30d success rate

Rates are computed from workflow step names over the last 30 days. Steps not detected in the pipeline are shown as β€”.
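The step-name matching described above can be sketched as follows: classify each workflow step by substring, then compute a per-check success rate. The pattern strings, step names, and data below are hypothetical, chosen only to illustrate the approach.

```python
# Illustrative sketch of step-name matching for supply chain checks.
# Patterns, step names, and data are hypothetical examples.

STEP_PATTERNS = {
    "cosign": ("cosign", "sign image"),
    "sbom": ("sbom", "syft"),
}

def coverage_rates(steps):
    """steps: list of (step_name, succeeded) pairs.
    Returns a success rate per check, or None when a check's step never
    appears in the pipeline (the dashboard renders that as a dash)."""
    rates = {}
    for check, needles in STEP_PATTERNS.items():
        hits = [ok for name, ok in steps
                if any(n in name.lower() for n in needles)]
        rates[check] = sum(hits) / len(hits) if hits else None
    return rates

steps = [
    ("Cosign sign (testing)", True),
    ("Cosign sign (stable)", True),
    ("Generate SBOM", False),
]
print(coverage_rates(steps))  # {'cosign': 1.0, 'sbom': 0.0}
```

This toy example mirrors the cards above: signing steps that always succeed yield 100%, while a failing SBOM step yields 0% even though the step exists.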

OpenSSF Scorecard

Scores from OpenSSF Scorecard. Click a card to view the full report.

Recent Builds
Repo Workflow Branch Trigger Duration Started Jobs
πŸ”΄ ucore testing main sched 31m 2026-04-08T03:52:26Z 23/24 βœ—
🟒 ucore stable main sched 12m 2026-04-08T03:45:25Z 32/32 βœ“
🟒 ucore lts main sched 12m 2026-04-08T03:44:15Z 32/32 βœ“
🟒 ucore testing main sched 13m 2026-04-07T03:52:08Z 32/32 βœ“
🟒 ucore stable main sched 49m 2026-04-07T03:44:41Z 32/32 βœ“
πŸ”΄ ucore lts main sched 39m 2026-04-07T03:42:11Z 23/24 βœ—
🟒 ucore testing main sched 12m 2026-04-06T03:54:28Z 32/32 βœ“
🟒 ucore stable main sched 13m 2026-04-06T03:49:58Z 32/32 βœ“
🟒 ucore lts main sched 11m 2026-04-06T03:48:57Z 32/32 βœ“
🟒 ucore testing renovate/extractions-setup-just-4.x PR 9m 2026-04-05T11:44:19Z 24/24 βœ“
🟒 ucore stable renovate/extractions-setup-just-4.x PR 12m 2026-04-05T11:44:19Z 24/24 βœ“
🟒 ucore lts renovate/extractions-setup-just-4.x PR 8m 2026-04-05T11:44:19Z 24/24 βœ“
🟒 ucore testing main sched 13m 2026-04-05T03:53:06Z 32/32 βœ“
🟒 ucore stable main sched 13m 2026-04-05T03:47:24Z 32/32 βœ“
🟒 ucore lts main sched 13m 2026-04-05T03:46:26Z 32/32 βœ“
🟒 ucore testing tailscale-repo PR 9m 2026-04-05T01:14:07Z 24/24 βœ“
🟒 ucore stable tailscale-repo PR 9m 2026-04-05T01:14:07Z 24/24 βœ“
🟒 ucore lts tailscale-repo PR 12m 2026-04-05T01:14:07Z 24/24 βœ“
🟒 ucore testing main sched 14m 2026-04-04T03:42:39Z 32/32 βœ“
🟒 ucore stable main sched 13m 2026-04-04T03:40:16Z 32/32 βœ“

Further Reading

The metrics on this page are grounded in peer-reviewed research and open standards. These resources explain what each metric means, why it predicts software delivery performance, and how to improve it.

Software Delivery Performance
DORA Metrics β€” Four Keys

The canonical framework for measuring software delivery: Deployment Frequency, Lead Time for Changes, Change Failure Rate, and Time to Restore Service. Published by Google Cloud's DevOps Research and Assessment team and validated across thousands of organizations since 2014.

Supply Chain Security
OpenSSF Scorecard

Automated security health checks for open source projects, scoring 0–10 across checks including signed releases, SBOM presence, branch protection, pinned dependencies, and CI test coverage. Produced by the Open Source Security Foundation (OpenSSF), a Linux Foundation project.

Supply Chain Security
SLSA β€” Supply-chain Levels for Software Artifacts

A graduated framework (L0–L3) for verifiable software build integrity. Each level adds stronger guarantees: L1 means provenance exists, L2 means it is signed by a hosted build platform, L3 means the build environment itself is hardened and isolated. Developed by Google and adopted as an OpenSSF standard.

Supply Chain Security
Sigstore / Cosign

Keyless, identity-based artifact signing backed by a public transparency log (Rekor). Cosign signs and verifies container images and release artifacts using short-lived OIDC certificates β€” no long-lived private keys to manage or rotate. An OpenSSF project used by Kubernetes, Tekton, and the Bluefin image pipeline.

Supply Chain Security
SBOM β€” Software Bill of Materials

A machine-readable inventory of every component and dependency in a software artifact. SBOMs make vulnerability response faster β€” when a new CVE is published, you can immediately know which of your images are affected. The OpenSSF SBOM Everywhere SIG maintains tooling guidance and naming conventions.

Supply Chain Security
CNCF TAG Security β€” Supply Chain Best Practices

The CNCF Technical Advisory Group for Security publishes authoritative whitepapers on cloud-native supply chain security. The Software Supply Chain Best Practices paper (v2, 2025) and the Secure Software Factory reference architecture define the practices that the Scorecard and SLSA checks encode.

Platform Engineering
CNCF Platforms White Paper

The authoritative CNCF definition of what internal developer platforms are, what they should measure (user satisfaction, self-service rate, onboarding time), and how platform teams should operate. Published by the TAG App Delivery Platforms Working Group. Recommends DORA metrics as the delivery measurement standard for platform teams.

Platform Engineering
Platform Engineering Maturity Model

A 4-level model (Provisional β†’ Operational β†’ Scalable β†’ Optimizing) across five aspects: Investment, Adoption, Interfaces, Operations, and Measurement. Helps platform teams understand where they are and what practices characterize the next level. Published by the CNCF TAG App Delivery Platforms Working Group.

Platform Engineering
Cloud Native Maturity Model

A 5-level model (Build β†’ Operate β†’ Scale β†’ Improve β†’ Adapt) across Business Outcomes, People, Process, Policy, and Technology. Maintained by the CNCF Cartografos Working Group. Version 4 (2025) adds AI and FinOps dimensions. Useful for understanding where cloud-native adoption fits in the broader organizational journey.

Research
Accelerate β€” The Science of Lean Software and DevOps

The peer-reviewed research behind DORA metrics. Nicole Forsgren, Jez Humble, and Gene Kim identified 24 technical, process, and cultural capabilities that predict software delivery performance and organizational outcomes. Required reading for understanding why deployment frequency and lead time matter.

Research
SRE Golden Signals

Google's Site Reliability Engineering book defines four signals sufficient to monitor any user-facing service: Latency, Traffic, Errors, and Saturation. These are the production observability complement to DORA: they define what a "failure" actually is (without them, Change Failure Rate cannot be measured accurately), and rising saturation can warn of incidents before they occur.

Research
SPACE Framework β€” Developer Productivity

DORA measures what the pipeline does; SPACE measures how developers experience it. Developed by Nicole Forsgren (GitHub), Margaret-Anne Storey, and colleagues at Microsoft Research. Five dimensions: Satisfaction, Performance, Activity, Communication/Collaboration, and Efficiency. Never measure activity in isolation.