SuperMark Observability

4 modules active

Audit how you use AI in Claude Code. Everything runs on your machine — only aggregated metrics are sent.

expert-usage v0.5.0 · hook-activity v0.1.0 · model-routing v0.1.0 · token-efficiency v0.3.0
Step 1 — Download

One Python file. No dependencies beyond the standard library.

Step 2 — Run locally

The script reads ~/.claude/projects/*.jsonl. Raw transcript content never leaves your disk.
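As a minimal sketch of what "local-only" aggregation can look like: the snippet below walks the JSONL transcripts and keeps only per-record-type counts. The directory layout and the `type` field are assumptions for illustration; the actual audit script's schema and metrics are not shown here.

```python
import glob
import json
import os
from collections import Counter

def aggregate_transcripts(root="~/.claude/projects"):
    """Count records per type across *.jsonl transcripts.

    Raw lines are parsed and discarded; only aggregate counts survive,
    mirroring the "raw content never leaves your disk" promise.
    The "type" field is an assumed schema detail, not confirmed by the tool.
    """
    counts = Counter()
    pattern = os.path.join(os.path.expanduser(root), "**", "*.jsonl")
    for path in glob.glob(pattern, recursive=True):
        with open(path, encoding="utf-8") as fh:
            for line in fh:
                line = line.strip()
                if not line:
                    continue
                try:
                    record = json.loads(line)
                except json.JSONDecodeError:
                    continue  # skip malformed lines rather than fail the audit
                counts[record.get("type", "unknown")] += 1
    return dict(counts)
```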

Step 3 — Upload

Drop or paste the JSON. We score server-side and render your dashboard.

Download audit script

Then run from where the browser saved it (usually ~/Downloads):

cd ~/Downloads && python3 supermark-observability-audit-0.8.0.py --json > supermark-obs-report.json
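Before uploading, it can help to confirm the report is well-formed, non-empty JSON. This checker is a hypothetical convenience, not part of the audit script; the report's actual schema is not documented here, so it only validates structure:

```python
import json
from pathlib import Path

def check_report(path):
    """Parse a report file and return it if it is valid, non-empty JSON.

    Raises json.JSONDecodeError on malformed JSON and ValueError on an
    empty report, so problems surface before the upload step.
    """
    report = json.loads(Path(path).read_text(encoding="utf-8"))
    if not report:
        raise ValueError("report is empty")
    return report
```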

Step 3 — Upload your results

Drop supermark-obs-report.json here, click to browse, or paste the JSON directly.