Zing Forum

BurnBar: A Local-First Usage Tracking Tool for AI Coding Assistants

BurnBar is a macOS menu bar application with a local-first architecture. It helps developers monitor the usage, cost, and workflow of AI coding assistants (such as Claude Code, Codex, and Kimi) in real time: no API keys are required, and all data stays on the local machine.

Tags: BurnBar, OpenBurnBar, AI coding assistant, Claude Code, Codex, usage tracking, local-first, macOS app, cost monitoring
Published: 2026-05-10 09:44 · Recent activity: 2026-05-10 10:34 · Estimated read: 5 min

Section 01

BurnBar: Local-First AI Coding Assistant Usage Tracking Tool (Main Post)

BurnBar is an open-source macOS menu bar application developed by the Imagine That AI team. It addresses the 'black box' cost problem of AI coding assistants by using a local-first architecture to track usage, cost, and workflow of tools like Claude Code, Codex, and Kimi. Key features include real-time monitoring, privacy protection (data stored locally in SQLite, no API keys required), and optional cloud sync. Project repository: https://github.com/Imagine-That-Ai/BurnBar.


Section 02

Problem Background: The 'Black Box' Cost of AI Coding Assistants

AI coding tools like Claude Code, GitHub Copilot, and Kimi boost productivity, but their costs are opaque. Developers often run several tools at once with no real-time view of spending, which leads to unexpected monthly bills. Provider statistics arrive late or lack detail, making it hard to understand usage patterns and how costs are distributed.


Section 03

Core Design: Local-First & Privacy Protection

BurnBar’s local-first architecture ensures:

1. All data is stored in a local SQLite database: no account or API keys are needed, and core functions have no cloud dependency.
2. It reads the local session logs of the AI tools instead of querying provider APIs: API keys stay with the providers, and no extra network requests are made.

Supported tools include Claude Code, Codex, Kimi, and others.
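To make the log-reading idea concrete, here is a minimal Python sketch (not BurnBar's actual Swift code). It assumes a hypothetical JSONL session-log format with input_tokens/output_tokens fields; each tool's real log schema will differ.

```python
import json
import io

# Hypothetical JSONL session log: one usage event per line, with token counts.
SAMPLE_LOG = io.StringIO(
    '{"model": "claude-sonnet", "input_tokens": 1200, "output_tokens": 300}\n'
    '{"model": "claude-sonnet", "input_tokens": 800, "output_tokens": 150}\n'
)

def sum_tokens(log_file):
    """Aggregate token usage from a local log file; no network calls involved."""
    totals = {"input_tokens": 0, "output_tokens": 0}
    for line in log_file:
        event = json.loads(line)
        totals["input_tokens"] += event.get("input_tokens", 0)
        totals["output_tokens"] += event.get("output_tokens", 0)
    return totals

print(sum_tokens(SAMPLE_LOG))  # {'input_tokens': 2000, 'output_tokens': 450}
```

Because only local files are read, tracking works offline and nothing about your sessions leaves the machine.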


Section 04

Key Features of BurnBar

1. Menu bar integration: non-intrusive, with no Dock icon.
2. Real-time tracking: today/week/month views, dollar and token units, and per-provider distribution.
3. InsightEngine: detects usage patterns (e.g., a 40% cost increase versus yesterday), analyzes cache hits, and flags new models.
4. Daily summary notifications.
5. A built-in chat panel for querying usage data.
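The pattern-detection idea can be sketched with a simple threshold check; the function name and 40% default below are illustrative assumptions, not BurnBar's internal API.

```python
def cost_change_alert(today_cost, yesterday_cost, threshold=0.40):
    """Return an alert string when today's spend exceeds yesterday's
    by at least `threshold` (a fraction, e.g. 0.40 = 40%)."""
    if yesterday_cost == 0:
        return None  # no baseline to compare against
    change = (today_cost - yesterday_cost) / yesterday_cost
    if change >= threshold:
        return f"Cost up {change:.0%} vs yesterday"
    return None

print(cost_change_alert(7.00, 5.00))  # Cost up 40% vs yesterday
print(cost_change_alert(5.10, 5.00))  # None
```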

Section 05

Technical Architecture Deep Dive

1. Daemon-first design: core functions run in OpenBurnBarDaemon, which keeps working in the background even when the UI is closed.
2. Local retrieval system: GRDB/SQLite is the primary store; Firestore is used only for sharing.
3. CLI tools: health checks and project control (example: swift run --package-path OpenBurnBarDaemon OpenBurnBarCLI -- help).
4. Editor extensions (Cursor/VS Code) for seamless integration.
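A SQLite-as-primary-storage design might use a table like the one below. This is a hypothetical schema sketched in Python's sqlite3 for illustration; BurnBar's actual GRDB schema may look quite different.

```python
import sqlite3

# In-memory database; a real daemon would use a file on disk.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE usage_event (
        id INTEGER PRIMARY KEY,
        provider TEXT NOT NULL,        -- e.g. 'claude-code', 'codex'
        recorded_at TEXT NOT NULL,     -- ISO-8601 timestamp
        input_tokens INTEGER NOT NULL,
        output_tokens INTEGER NOT NULL,
        cost_usd REAL NOT NULL
    )
""")
conn.execute(
    "INSERT INTO usage_event (provider, recorded_at, input_tokens, output_tokens, cost_usd) "
    "VALUES (?, ?, ?, ?, ?)",
    ("claude-code", "2026-05-10T09:00:00Z", 1200, 300, 0.12),
)
# Per-provider cost distribution is a single GROUP BY away.
row = conn.execute(
    "SELECT provider, SUM(cost_usd) FROM usage_event GROUP BY provider"
).fetchone()
print(row)  # ('claude-code', 0.12)
```

Keeping the store in plain SQLite means the UI, daemon, and CLI can all query the same file without a server process.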

Section 06

Optional Cloud Integration

Optional features: Google/Apple login (Firebase Auth), selective sync (usage records and chat metadata), explicit control over chat-content sync, and easy opt-out that leaves local functions untouched. BurnBar can also route models such as Z.ai through a local OpenAI-compatible gateway for Cursor, Factory, and OpenCode.
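Selective sync boils down to filtering records by per-kind opt-in flags. The flag names and record shape below are assumptions made for this sketch, mirroring the categories described above:

```python
# Hypothetical opt-in flags: usage records and chat metadata sync by default
# when cloud sync is enabled, while chat content requires explicit consent.
SYNC_PREFS = {"usage_records": True, "chat_metadata": True, "chat_content": False}

def records_to_sync(records, prefs):
    """Keep only record kinds the user has explicitly opted in to sync."""
    return [r for r in records if prefs.get(r["kind"], False)]

records = [
    {"kind": "usage_records", "payload": "..."},
    {"kind": "chat_content", "payload": "..."},
]
print(records_to_sync(records, SYNC_PREFS))
# [{'kind': 'usage_records', 'payload': '...'}]
```

Defaulting unknown kinds to False keeps the filter fail-closed: anything not explicitly opted in stays local.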


Section 07

Deployment & Usage Guide

BurnBar’s macOS beta (0.1.3-beta.1) can be installed from GitHub Releases (download the DMG and drag it to Applications) or built from source (make install). Requirements: macOS and at least one supported AI tool already in use (Claude Code, Codex, etc.).


Section 08

Application Scenarios, Limitations & Conclusion

Scenarios: individual developers (cost awareness, budget planning), team management (usage-pattern analysis, cost optimization), and efficiency analysis (habit improvement). Limitations: macOS-only, beta-stage stability issues, a VS Code extension not yet on the marketplace, and support only for tools that write local logs. Conclusion: BurnBar fills a gap in the AI tooling ecosystem with privacy-focused usage tracking, and it could become a standard utility as AI coding assistants continue to spread.