Integrating RocqStat WCET Analysis Into CI/CD for Safety-Critical Embedded Software

2026-02-24

Practical 2026 guide: integrate RocqStat WCET checks into VectorCAST-driven CI/CD for safety-critical embedded systems with gating and automation.

Stop Finding Timing Bugs in Production: Add RocqStat WCET to Your CI/CD Today

If your team still discovers missed deadlines during late-stage integration or in-field runs, you're paying for avoidable recalls, emergency engineering, and long MTTR (mean time to repair). In 2026, with Vector's acquisition of RocqStat and the tight coupling of timing analysis into verification toolchains, there's no excuse: you can (and should) automate worst-case execution time (WCET) checks inside your embedded CI/CD pipeline and gate merges before timing regressions ever reach hardware.

Why this matters now (2026 context)

The industry trend driving this is clear: software-defined vehicles and safety-critical systems now demand timing evidence for compliance (ISO 26262, DO-178C) and fleet safety. Vector’s 2026 acquisition of RocqStat accelerates native timing analysis in VectorCAST — creating an opportunity to integrate rigorous WCET workflows directly into CI/CD. Teams that add automated timing gates reduce late-stage rework and lower certification risk.

What you’ll get from this guide

  • Step-by-step integration pattern: VectorCAST & RocqStat in CI (local, GitHub Actions, GitLab CI examples)
  • Practical gating rules and threshold examples
  • How to collect deterministic timing traces and run RocqStat automatically
  • Audit, traceability and compliance best practices
  • Remediation options and automation patterns

High-level flow: CI WCET pipeline

At a glance, the automated WCET pipeline follows this flow — integrate this as stages in your CI/CD pipeline:

  1. Build the firmware and VectorCAST test artefacts
  2. Execute deterministic test vectors on target or cycle-accurate simulator
  3. Collect execution timing traces (per-run timestamps, hardware cycle counts, or VectorCAST trace files)
  4. Run RocqStat (CLI) to compute the WCET estimate and confidence interval
  5. Compare the WCET to the assigned timing budget and baseline; apply gating rules
  6. Archive reports, sign artifacts, and optionally trigger remediation (automated issue, revert, or rollback)
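Glued together, the six stages read as one short CI script. Everything below is a sketch: the helper script names, image tag, and flags simply echo the examples used later in this guide, and the function prints each stage by default rather than executing it:

```shell
#!/usr/bin/env bash
# wcet_stage.sh -- glue for the six pipeline stages (all command names illustrative).
# Dry run by default: stages are printed, not executed; set RUN= (empty) to execute.
wcet_stage() {
  local run="${RUN-echo}"
  $run ./ci/build_and_vcast.sh                                                     # 1. build firmware + VectorCAST artefacts
  $run vcastcli --workspace project.vcw --run-tests --export-timings timings.json  # 2-3. deterministic runs + traces
  $run docker run --rm -v "$PWD:/data" rocqstat/rocqstat:2026 \
       --input /data/timings.json --output /data/wcet_output.json                  # 4. WCET estimate
  $run ./ci/wcet_gate.sh                                                           # 5. gating rules
  $run aws s3 cp wcet_output.json "s3://evidence-bucket/${COMMIT_SHA:-dev}/"       # 6. archive evidence
}
```

Each stage is expanded in the steps below; the dry-run default makes the glue safe to commit before the underlying tooling is wired up.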

Prerequisites and setup

  • VectorCAST set up with your unit and integration tests (VectorCAST 2026+ is on the roadmap for native RocqStat integration)
  • RocqStat CLI or Docker image available in your CI runners (StatInf technology included within Vector toolchain as of 2026)
  • Deterministic test harness: hardware target, HIL or cycle-accurate simulator
  • Artifact storage (S3, GitLab LFS, or VectorCAST artifact server)
  • CI runner with privileged access or isolated timing environment

Step 1 — Prepare deterministic timing runs

The single biggest source of noise in automated timing analysis is non-deterministic test execution. For reliable WCET estimates:

  • Use a dedicated CI runner for timing tasks (disable other processes, power management and network interference)
  • Prefer hardware counters (cycle counters) or fine-grained timestamping from VectorCAST test harness
  • Maintain a fixed board setup and ensure consistent toolchain versions — embed compiler flags and link map reproducibility in CI
  • Run multiple input vectors covering worst-case paths; seed fuzz or functional coverage to exercise boundary conditions
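On Linux runners, part of that isolation can be scripted. A sketch (dry run by default so it is safe to inspect; the exact sysfs knobs depend on your CPU and kernel, and applying them needs root on the runner):

```shell
#!/usr/bin/env bash
# isolate_runner.sh -- reduce timing jitter on a Linux CI runner (illustrative).
# Dry run by default: commands are printed, not executed; set RUN= (empty) to apply.
isolate_runner() {
  local run="${RUN-echo}"
  # Pin the CPU governor so frequency scaling does not skew cycle counts
  $run cpupower frequency-set -g performance
  # Disable turbo boost (Intel pstate path shown; AMD exposes a different knob)
  $run sh -c "echo 1 > /sys/devices/system/cpu/intel_pstate/no_turbo"
  # Stop background services that compete for cycles (pick your own list)
  $run systemctl stop unattended-upgrades.service
}
```

Run it at the start of every timing job so the isolation state is part of the pipeline, not tribal knowledge.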

Step 2 — Collect traces via VectorCAST

VectorCAST already controls unit and integration tests; augment this with trace collection. Typical options:

  • Enable VectorCAST's tracing to export per-test timing logs, or use hardware cycle counters exposed through the test harness.
  • Store logs as structured JSON or CSV to feed RocqStat, or use VectorCAST’s API to emit compatible measurement bundles.

Example: export timing from VectorCAST CLI

vcastcli --workspace project.vcw --run-tests --export-timings timings.json

The exact flags depend on your VectorCAST version. The goal: produce a reproducible timing archive per pipeline run.
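What that archive contains depends on your exporter; RocqStat needs per-run measurements plus enough metadata to reproduce them. A hypothetical timings.json for one unit (the schema is illustrative, not the actual VectorCAST format):

```json
{
  "unit": "test_ctrl_loop_10ms",
  "clock_source": "dwt_cyccnt",
  "cpu_hz": 160000000,
  "runs": [
    { "vector": "boundary_max_load", "cycles": 284211 },
    { "vector": "nominal_load",      "cycles": 201876 }
  ],
  "toolchain": { "compiler": "gcc-arm 13.2", "flags": "-O2 -g", "build_sha": "abc1234" }
}
```

Whatever shape you choose, keep it stable across pipeline runs so baselines remain comparable.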

Step 3 — Run RocqStat (automated)

RocqStat consumes the timing measurements and produces a statistically sound WCET estimate with confidence bounds. Run it as part of your CI job.

Bundle RocqStat in a container to keep CI reproducible. Example Docker run snippet:

docker run --rm -v $(pwd)/timings:/data rocqstat/rocqstat:2026 \
  --input /data/timings.json --output /data/wcet_report.json --confidence 0.999

Output is a machine-readable JSON containing: estimated WCET, upper bound at requested confidence, percentile metrics, and diagnostics.

CLI example (pseudo)

rocqstat --measurements timings.json --confidence 0.9999 --save wcet_output.json

Capture the output and return code. RocqStat can also produce human-readable reports (HTML) for reviewers and auditors.
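For reference, a hypothetical wcet_output.json whose field names line up with the jq queries in the gating script below (the real schema may differ; treat these keys as assumptions):

```json
{
  "wcet_estimate_ms": 1.62,
  "wcet_upper_ms": 1.78,
  "confidence": 0.9999,
  "percentiles_ms": { "p99": 1.55, "p999": 1.61 },
  "converged": true,
  "valid_runs": 500,
  "diagnostics": []
}
```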

Step 4 — Gating rules: how to decide pass/fail

Define firm gating rules. Examples below reflect best practices in safety-critical projects:

  • Hard budget: fail if WCET_upper_bound > timing_budget
  • Regression detection: fail if WCET_upper_bound > baseline_upper_bound * 1.10 (greater than 10% regression)
  • Margin enforcement: fail if (timing_budget - WCET_upper_bound) < minimum_margin_ms (e.g., 10% of budget)
  • Stability requirement: require at least N valid runs and statistical convergence flags from RocqStat

Sample gating script (bash + jq)

#!/usr/bin/env bash
# wcet_gate.sh -- compare the RocqStat report against budget and baseline
set -euo pipefail

REPORT=wcet_output.json
BUDGET_MS=2.0
BASELINE_FILE=baseline_wcet.json

wcet_upper=$(jq -r .wcet_upper_ms "$REPORT")
baseline_upper=$(jq -r .wcet_upper_ms "$BASELINE_FILE")

# Rule 1: fail on hard budget
if (( $(echo "$wcet_upper > $BUDGET_MS" | bc -l) )); then
  echo "FAIL: WCET ${wcet_upper}ms exceeds budget ${BUDGET_MS}ms"
  exit 1
fi

# Rule 2: fail on regression greater than 10% vs baseline
if (( $(echo "$wcet_upper > $baseline_upper * 1.10" | bc -l) )); then
  echo "FAIL: WCET regression detected: ${wcet_upper}ms vs baseline ${baseline_upper}ms"
  exit 1
fi

echo "PASS: WCET ${wcet_upper}ms within budget and baseline"
exit 0

Integrate this script as a CI job step. Adjust numbers and confidence per your safety plan.
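Rules 3 (margin) and 4 (stability) can be layered on top of that script. A sketch that takes already-extracted values as arguments (pull them out with jq as above; the converged and valid-runs fields are assumptions about the report schema):

```shell
#!/usr/bin/env bash
# wcet_extra_gates.sh -- margin and stability checks on extracted report values.
# usage: check_margin <wcet_upper_ms> <budget_ms>
#        check_stability <converged:true|false> <valid_runs> [min_runs]

check_margin() {
  # Rule 3: keep at least 10% of the budget as headroom
  awk -v w="$1" -v b="$2" 'BEGIN { exit !((b - w) >= 0.10 * b) }' \
    || { echo "FAIL: margin below 10% of budget"; return 1; }
  echo "PASS: margin OK"
}

check_stability() {
  # Rule 4: require the convergence flag and a minimum number of valid runs
  local min_runs="${3:-100}"
  if [ "$1" != "true" ] || [ "$2" -lt "$min_runs" ]; then
    echo "FAIL: estimate not statistically stable"; return 1
  fi
  echo "PASS: stability OK"
}
```

Keeping each rule in its own function makes it easy to report exactly which gate failed in the CI log.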

Step 5 — CI examples: GitHub Actions & GitLab CI

GitHub Actions (workflow snippet)

name: wcet-check
on: [push, pull_request]

jobs:
  wcet:
    runs-on: self-hosted-timing-runner
    steps:
      - uses: actions/checkout@v4
      - name: Build firmware & VectorCAST
        run: ./ci/build_and_vcast.sh
      - name: Run VectorCAST tests & export timings
        run: vcastcli --workspace project.vcw --run-tests --export-timings timings.json
      - name: Run RocqStat
        run: |
          docker run --rm -v ${{ github.workspace }}/timings:/data rocqstat/rocqstat:2026 \
            --input /data/timings.json --output /data/wcet_output.json --confidence 0.9999
      - name: Gate WCET
        run: ./ci/wcet_gate.sh

GitLab CI (gitlab-ci.yml snippet)

stages:
  - build
  - test
  - timing

build_firmware:
  stage: build
  tags: [self-hosted-timing]
  script:
    - ./ci/build_and_vcast.sh

run_timing_tests:
  stage: test
  tags: [self-hosted-timing]
  script:
    - vcastcli --workspace project.vcw --run-tests --export-timings timings.json
    - >
      docker run --rm -v $CI_PROJECT_DIR/timings:/data rocqstat/rocqstat:2026
      --input /data/timings.json --output /data/wcet_output.json --confidence 0.9999
    - ./ci/wcet_gate.sh

Handling non-determinism and flaky timing

Timing flakiness is common. Use these patterns to mitigate:

  • Warm-up runs: discard initial runs to eliminate cache/warm-up artifacts
  • Multiple seeds: run tests with varied inputs and aggregate via RocqStat
  • Statistical convergence: require RocqStat convergence flags; if not converged, mark the run as unstable and trigger re-run on isolated hardware
  • Auto-retry cap: allow a limited automated retry (e.g., 1 retry) to filter transient noise, but keep audit trail
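The warm-up and capped-retry patterns can be sketched in a few lines of bash; the job argument below stands in for whatever command executes your timing vectors, and the counts are illustrative:

```shell
#!/usr/bin/env bash
# timing_retry.sh -- discard warm-up runs, then allow a single audited retry.
WARMUP_RUNS=2      # initial executions discarded to flush cold caches
MAX_RETRIES=1      # cap automated retries so noise cannot hide real regressions

run_with_retry() {
  local job="$1" attempt=0
  # Warm-up: execute but ignore results (keep the logs for the audit trail)
  for _ in $(seq "$WARMUP_RUNS"); do "$job" >/dev/null 2>&1 || true; done
  # Measured runs: retry at most MAX_RETRIES times on failure
  until "$job"; do
    attempt=$((attempt + 1))
    if [ "$attempt" -gt "$MAX_RETRIES" ]; then
      echo "UNSTABLE: timing job failed after $attempt measured attempts" >&2
      return 1
    fi
    echo "retrying measured run (attempt $attempt) -- recorded for audit" >&2
  done
}
```

Marking the run UNSTABLE rather than silently retrying again is what preserves the audit trail.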

Automated remediation patterns

When timing gates fail, a few automated responses reduce manual toil and accelerate recovery:

  • Auto-create an issue with attached WCET report and failing tests
  • Annotate the pull request with exact diffs and link to human-readable RocqStat HTML report
  • Optional auto-revert on protected branches if gate fails repeatedly (policy-controlled)
  • Trigger a targeted testcase to narrow down the function or change causing regression (binary bisection)

Example: GitHub PR annotation (pseudo)

# React to the gate's exit status directly
if ! ./ci/wcet_gate.sh; then
  gh issue create --title "WCET Gate Failure: PR #$PR" \
    --body "WCET upper bound exceeded. See wcet_output.json and timings.zip"
  gh pr comment "$PR" --body "WCET gate failed -- see attached RocqStat report. Failing test: test_x"
fi

Traceability, compliance and audit evidence

For safety certification, store and sign artifacts produced during the WCET stage. Maintain a chain-of-evidence:

  • Raw timing measurements
  • RocqStat input parameters and result JSON/HTML
  • VectorCAST test logs and coverage reports
  • Build reproducibility metadata (compiler version, linker map, toolchain hashes)
  • Signed artifacts: use a CI step to sign reports and upload to an immutable artifact store

Example: S3 archival and signature

aws s3 cp wcet_output.json s3://evidence-bucket/${CI_COMMIT_SHA}/wcet_output.json
gpg --detach-sign wcet_output.json
aws s3 cp wcet_output.json.sig s3://evidence-bucket/${CI_COMMIT_SHA}/wcet_output.json.sig
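Checksums make the bundle independently verifiable. A sketch that writes a sha256 manifest next to the artifacts before signing and upload (layout mirrors the example above; GNU coreutils assumed):

```shell
#!/usr/bin/env bash
# evidence_manifest.sh -- hash every artifact in the evidence directory so an
# auditor can verify integrity offline with `sha256sum -c`.
make_manifest() {
  local dir="$1" out="$2"
  # Hash all files except the manifest itself; sort for a stable, diffable order
  ( cd "$dir" && find . -type f ! -name "$(basename "$out")" -print0 \
      | sort -z | xargs -0 sha256sum ) > "$out"
}
```

Sign the manifest alongside the report so one signature covers the whole bundle.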

Keep retention policies aligned with certification needs (often multiple years) and provide easy retrieval for auditors.

Operational metrics to track in dashboards

Turn timing results into operational telemetry for engineering and safety teams:

  • WCET upper bound per function/module
  • WCET baseline delta (regression %)
  • Number of failed timing gates per sprint
  • Confidence level achieved (e.g., 99.99%)
  • Mean execution time and variance — useful for planning margin
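The baseline-delta metric is simple arithmetic on two report fields; a sketch that emits it as a plain number for whatever format your dashboard ingests:

```shell
# regression_pct <current_upper_ms> <baseline_upper_ms>
# prints the percent change of the current WCET upper bound vs the baseline
regression_pct() {
  awk -v cur="$1" -v base="$2" 'BEGIN { printf "%.1f", (cur - base) / base * 100 }'
}
```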

Case study: a Tier-1 automotive supplier

A Tier-1 supplier added RocqStat into their VectorCAST-driven CI in Q4 2025 after pilot testing. They used a 3-stage timing gate: unit-level WCET, integration-level WCET, and system-level HIL WCET. Within the first quarter they reduced late-stage timing defects by 85% and shortened certification artifact preparation time by 30% because the WCET artifacts were produced and archived automatically with each build.

"Integrating timing analysis into CI turned WCET from a manual milestone to an automated quality gate — catching regressions at the PR level saved months of late debugging." — Lead Embedded SRE, Automotive Tier-1

Advanced strategies and 2026 innovations

Looking ahead, teams should plan for these trends that matured in late 2025 and 2026:

  • Tight VectorCAST–RocqStat integration: expect native workflows and UIs that remove custom adapters
  • AI-assisted timing anomaly detection: ML models that highlight suspicious inputs or code paths most likely to cause WCET growth
  • Distributed timing farms: cloud-native hardware emulation farms that provide scalable HIL on demand for WCET runs
  • Certifiable toolchains: more vendors offering tool qualification evidence for WCET tools to ease DO-178C/ISO 26262 usage

Common pitfalls and how to avoid them

  • Running timing tests on shared CI nodes — use isolated runners
  • Not versioning RocqStat config — store the exact parameters used to compute WCET in the repo
  • Over-reliance on single-run measurements — use statistical estimation and require convergence
  • Underestimating thermal effects — control and log ambient temp for hardware runs

Checklist — deploy WCET gating in 4 sprints

  1. Sprint 1: Containerize RocqStat, add a CI job to run a minimal timing test and produce JSON
  2. Sprint 2: Wire VectorCAST export into that job and validate WCET outputs against a local baseline
  3. Sprint 3: Implement gating script and PR annotations; enforce on feature branches
  4. Sprint 4: HIL/production-level runs, artifact signing, retention policy and auditor playbook

Actionable takeaways

  • Automate early: add WCET checks at PR level to catch regressions early.
  • Isolate timing runs: dedicated hardware/simulators deliver repeatable results.
  • Use statistical gating: rely on RocqStat's confidence intervals, not single-run numbers.
  • Make evidence immutable: sign and archive all WCET artifacts to satisfy auditors.
  • Triage quickly: auto-create issues and attach failing inputs and reports for fast remediation.

Final thoughts

As of 2026, integrating RocqStat WCET analysis into CI/CD is no longer an experimental luxury — it's a practical requirement for teams building safety-critical embedded systems. With Vector's acquisition of RocqStat technology, the path to unified timing verification in VectorCAST is shorter and more supported than ever. Implementing the steps above will convert timing verification from a late-stage bottleneck into an automated quality gate that protects your releases and reduces certification risk.

Get started checklist

  • Provision an isolated CI runner/hardware for timing tests
  • Containerize RocqStat and standardize CLI parameters
  • Automate VectorCAST export of timing traces
  • Implement gating script with clear thresholds and baseline comparison
  • Archive signed evidence for audits

Call to action

Ready to stop discovering timing defects late? Start by running a pilot: containerize RocqStat, add a timing job to an isolated runner and gate a single critical function. If you want a practical checklist and example repository tailored to VectorCAST and your target board, request our CI/CD timing integration template — it includes Docker images, pipeline examples, and a gating policy you can drop into your repo today.
