Throughput — Data Volume
Understanding the Throughput metric in the systemprompt.io Control Center
Throughput measures the total volume of data exchanged between you and Claude Code. It captures how much information is flowing through your sessions — a proxy for the complexity and depth of work being performed.
Definition
Throughput is the total bytes of content exchanged across all actions in a session or day, broken down into input and output.
Formula:
Total throughput = content_input_bytes + content_output_bytes
Throughput rate = total_bytes / total_active_session_seconds
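As a minimal sketch, the two formulas above can be computed directly from the summary fields (the function name and example numbers are illustrative, not part of the product):

```python
def throughput(content_input_bytes: int, content_output_bytes: int,
               active_seconds: float) -> tuple[int, float]:
    """Total throughput and throughput rate, as defined above."""
    total_bytes = content_input_bytes + content_output_bytes
    # Guard against zero-length sessions to avoid division by zero.
    rate = total_bytes / active_seconds if active_seconds > 0 else 0.0
    return total_bytes, rate

# A hypothetical 30-minute session: 120 KB in, 80 KB out.
total, rate = throughput(120_000, 80_000, 1_800)
# total = 200,000 bytes; rate ≈ 111 bytes/second
```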
What's counted
Every piece of data that flows between you and Claude Code contributes to throughput:
Input bytes:
- Prompt text you submit
- File contents read by the Read tool
- Search results returned by Grep and Glob
- Command output from Bash executions
- Tool input parameters
Output bytes:
- AI-generated responses
- Code written by Edit and Write tools
- Tool execution outputs
- Agent delegation results
A session that reads 50 files, makes 20 edits, and runs 10 test commands will have substantially higher throughput than a session that asks a single question and gets a short answer.
Data source
Throughput is deterministic — byte-counted from every hook event payload:
- Each `PostToolUse` event records the size of `tool_input` (bytes sent to the tool) and `tool_response` (bytes returned)
- Each `UserPromptSubmit` event records the prompt length in bytes
- Values are aggregated into `content_input_bytes` and `content_output_bytes` on `plugin_session_summaries`
There is no sampling or estimation. Every byte is counted.
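The aggregation can be sketched as follows. The event dictionaries here are an illustrative assumption about payload shape (keyed by `hook_event_name`, with `tool_input`, `tool_response`, and `prompt` fields), not the exact schema:

```python
import json

def aggregate_bytes(events: list[dict]) -> dict:
    """Byte-count every hook event payload -- no sampling or estimation."""
    summary = {"content_input_bytes": 0, "content_output_bytes": 0}
    for event in events:
        if event["hook_event_name"] == "PostToolUse":
            # Size of the parameters sent to the tool counts as input...
            summary["content_input_bytes"] += len(
                json.dumps(event.get("tool_input", {})).encode("utf-8"))
            # ...and the size of what the tool returned counts as output.
            summary["content_output_bytes"] += len(
                json.dumps(event.get("tool_response", "")).encode("utf-8"))
        elif event["hook_event_name"] == "UserPromptSubmit":
            # The submitted prompt text counts toward input bytes.
            summary["content_input_bytes"] += len(
                event.get("prompt", "").encode("utf-8"))
    return summary
```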
Interpretation
Throughput is best understood in context:
| Throughput | Typical session type |
|---|---|
| < 10 KB | Quick question-and-answer, small lookup |
| 10–100 KB | Targeted edit, single-file change, short debugging |
| 100 KB–1 MB | Multi-file refactor, feature implementation, code review |
| 1–10 MB | Large-scale changes, codebase-wide search and replace, migration |
| 10+ MB | Massive operations — full codebase analysis, large file processing |
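The bands in the table translate into a simple classifier. This helper is a sketch for illustration (the labels and KB/MB boundaries are taken from the table; the function itself is not part of the product):

```python
def classify_session(total_bytes: int) -> str:
    """Map total throughput to the typical session type from the table above."""
    KB, MB = 1_024, 1_024 ** 2
    if total_bytes < 10 * KB:
        return "quick question-and-answer, small lookup"
    if total_bytes < 100 * KB:
        return "targeted edit, single-file change, short debugging"
    if total_bytes < 1 * MB:
        return "multi-file refactor, feature implementation, code review"
    if total_bytes < 10 * MB:
        return "large-scale changes, codebase-wide search and replace, migration"
    return "massive operations"
```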
The throughput rate (bytes per second) adds temporal context. High total throughput with a low rate means a long session processing lots of data slowly. High throughput with a high rate means rapid, intensive data processing.
Why it matters
Higher throughput indicates more data processed, which generally correlates with more complex work being done. A session that processes 5 MB of code is reading more files, making more edits, and running more commands than one processing 50 KB.
Throughput also helps identify sessions where Claude is doing heavy lifting versus sessions where the interaction is lightweight. If you're trying to maximize the value you extract from AI assistance, throughput tells you how much raw work is being performed.
Comparing throughput across sessions of similar duration reveals efficiency differences. Two 30-minute sessions with wildly different throughput suggest different levels of AI engagement — one may have had long idle periods while the other kept Claude continuously busy.
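To make the comparison concrete, here is a sketch with hypothetical numbers: two 30-minute sessions, one processing 5 MB and one processing 50 KB, yield very different rates:

```python
def throughput_rate(total_bytes: int, active_seconds: float) -> float:
    """Bytes per second of active session time."""
    return total_bytes / active_seconds

# Two hypothetical 30-minute (1,800 s) sessions:
busy = throughput_rate(5 * 1_024 ** 2, 1_800)  # ~2,913 B/s: Claude kept busy
idle = throughput_rate(50 * 1_024, 1_800)      # ~28 B/s: long idle periods
```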
The StarCraft analogy
Throughput maps to resource gathering rate — how much mineral and gas flows through your bases per minute. A player mining from four bases processes far more resources than one mining from a single base. Similarly, a session with high throughput is processing far more information, enabling more complex strategies and faster execution.