Ship agents to production
in record time.
50MB of production-ready Rust with auth, memory, MCP hosting, and observability. Everything your agents need to serve real users.
Agents with memory. Infrastructure you own.
Build agents that remember users across sessions, learn from their own performance metrics, and communicate with each other via the A2A protocol. systemprompt.io is a Rust library, not a platform. You write extensions, compile them in, and own the resulting binary. Ship to your cloud, your servers, your domain. No vendor lock-in. No runtime dependencies on us.
EMBEDDED RUST LIBRARY
A 50MB binary with everything: complete AI infrastructure you compile into your project.
- Complete stack in one binary
- Production auth built in
- MCP hosting, secured
- Extensible at compile time
- Observability without instrumentation
UNIFIED CONTROL PLANE
The same commands work for you and your AI agents. Local or remote. Every action audited.
- Superagent-ready interface
- Transparent remote routing
- Eight unified domains
- Complete audit visibility
- Config-driven operations
THE CLOSED LOOP
Your agents can query their own performance and adapt in real-time. The feedback loop is built in.
- MCP tool calls → PostgreSQL → CLI queries
- A2A self-analysis in real-time
- Behavioral pattern recognition
- Dynamic context & skill modification
- Production audit trail with trace IDs
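The loop above is easier to see with a concrete sketch. Everything here is hypothetical: `AuditRecord` and `cost_by_tool` are illustrative names, not the library's actual schema or API. The point is the kind of self-analysis an agent can run over its own audit trail:

```rust
use std::collections::HashMap;

// Hypothetical audit-record shape, for illustration only.
#[derive(Debug)]
struct AuditRecord {
    tool: &'static str,
    cost_usd: f64,
    latency_ms: u64,
}

/// Aggregate total cost per tool: the sort of behavioral pattern an
/// agent could extract from its own production metrics and adapt to.
fn cost_by_tool(records: &[AuditRecord]) -> HashMap<&'static str, f64> {
    let mut totals = HashMap::new();
    for r in records {
        *totals.entry(r.tool).or_insert(0.0) += r.cost_usd;
    }
    totals
}

fn main() {
    let records = vec![
        AuditRecord { tool: "search", cost_usd: 0.02, latency_ms: 120 },
        AuditRecord { tool: "search", cost_usd: 0.03, latency_ms: 95 },
        AuditRecord { tool: "summarize", cost_usd: 0.10, latency_ms: 800 },
    ];
    println!("{:?}", cost_by_tool(&records));
}
```

In the real loop the records live in PostgreSQL and the agent queries them through the CLI; the aggregation logic is the same idea.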
Everything your AI needs. Built in.
AI execution. Agent orchestration. MCP. Auth. Production-ready from day one.
Execute
Runs AI
Anthropic, OpenAI, Gemini—one API. Vision, reasoning, streaming, cost tracking.
Orchestrates agents
A2A protocol. Multi-agent workflows. Task management. Shared state.
Schedules jobs
Cron-based automation. Background tasks. Content publishing. Maintenance.
Integrate
Hosts MCP
Production tool servers with real OAuth. Your capabilities, secured.
Runs workflows
Skills and playbooks. Define once, execute anywhere. YAML automation.
Serves web
Templates, navigation, theming. Your brand, your domain, your interface.
Secure
Handles auth
OAuth2/OIDC + WebAuthn included. Ship AI, not login screens.
Scoped permissions
Per-agent, per-tool authorization. OAuth2 scopes enforced on every request.
Operate
Stores files
Upload, serve, permission. No S3 config. No CDN. Works.
Tracks everything
Costs, usage, audit trails. Every AI request logged automatically.
Manages content
Blog, docs, legal pages. Markdown in, indexed database out. SEO included.
Get started in minutes
Clone
Create your project from the GitHub template
gh repo create my-ai --template systempromptio/systemprompt-template --clone
Build
Build the CLI binary (offline mode for first build)
SQLX_OFFLINE=true cargo build --release
Login
Authenticate with systemprompt.io
systemprompt cloud auth login
Profile
Create a local or cloud deployment profile
systemprompt cloud profile create local
Run
Start all services locally
just start
Machine-Native Guides
Written by agents. For agents.
Point your super agent to a playbook and watch the magic happen. Deterministic. Self-repairing. Executable.
START HERE
The required entry point for every agent. Read this playbook first before any task.
$ systemprompt core playbooks show guide_start
Built for production workloads
MCP Server Hosting
Host your MCP servers with real authentication. Connect Claude Desktop to production APIs securely.
# services/mcp/systemprompt.yaml
mcp_servers:
  systemprompt:
    binary: "systemprompt-mcp"
    port: 5010
    oauth:
      required: true
      scopes: ["admin"]
      audience: "mcp"
Learn more →
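The oauth block above gates the server behind the admin scope. As a rough sketch (not the library's actual API, which enforces this in its OAuth2 middleware), scope enforcement boils down to a subset check:

```rust
/// Return true if the token grants every required scope.
/// Illustrative sketch only; real enforcement happens per-request
/// in the OAuth2 layer before a tool call is dispatched.
fn scopes_satisfied(token_scopes: &[&str], required: &[&str]) -> bool {
    required.iter().all(|req| token_scopes.contains(req))
}

fn main() {
    // A token carrying "admin" can reach the admin-scoped MCP server.
    assert!(scopes_satisfied(&["admin", "user"], &["admin"]));
    // A token carrying only "user" is rejected.
    assert!(!scopes_satisfied(&["user"], &["admin"]));
    println!("scope checks passed");
}
```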
Per-User AI Products
Ship AI products where every user gets isolated agent access with proper permissions.
# services/agents/welcome.yaml
card:
  securitySchemes:
    oauth2:
      type: oauth2
      flows:
        authorizationCode:
          scopes:
            anonymous: "Public access"
            user: "Authenticated user access"
Learn more →
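One way to read the card above: anonymous requests get the anonymous scope, and authenticated users additionally get user. A hypothetical sketch of that mapping (the real library derives scopes from OAuth2 tokens, not a boolean):

```rust
/// Hypothetical mapping from authentication state to granted scopes,
/// mirroring the anonymous/user scopes declared in the agent card.
fn granted_scopes(authenticated: bool) -> Vec<&'static str> {
    if authenticated {
        vec!["anonymous", "user"]
    } else {
        vec!["anonymous"]
    }
}

fn main() {
    // An unauthenticated caller only reaches public capabilities.
    assert_eq!(granted_scopes(false), vec!["anonymous"]);
    // An authenticated user keeps public access and gains user-level access.
    assert!(granted_scopes(true).contains(&"user"));
    println!("scope mapping ok");
}
```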
Internal AI Tooling
Give your team shared agents with proper permissions. No more credential sharing.
# services/agents/welcome.yaml
agents:
  welcome:
    enabled: true
    card:
      skills:
        - id: "general_assistance"
          name: "General Assistance"
        - id: "content_writing"
          name: "Content Writing"
Learn more →
Agent-to-Agent Orchestration
Multiple agents coordinating via A2A protocol with shared state and permissions.
# services/agents/welcome.yaml (A2A card)
card:
  protocolVersion: "0.3.0"
  preferredTransport: "JSONRPC"
  capabilities:
    streaming: true
    pushNotifications: false
Learn more →
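The capabilities block above is part of what agents exchange during discovery. As a simplified sketch (real A2A negotiation also covers transports and skills), two agents can only rely on features both sides advertise:

```rust
// Hypothetical capability flags, mirroring the A2A card's
// streaming / pushNotifications fields.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Capabilities {
    streaming: bool,
    push_notifications: bool,
}

/// Negotiation as the intersection of two capability sets:
/// a feature is usable only if both agents support it.
fn negotiate(a: Capabilities, b: Capabilities) -> Capabilities {
    Capabilities {
        streaming: a.streaming && b.streaming,
        push_notifications: a.push_notifications && b.push_notifications,
    }
}

fn main() {
    let welcome = Capabilities { streaming: true, push_notifications: false };
    let peer = Capabilities { streaming: true, push_notifications: true };
    // Streaming works between these two; push notifications do not.
    println!("{:?}", negotiate(welcome, peer));
}
```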
Agentic Mesh
Automated, scalable workflows run by agents. Deterministic scheduling meets agentic intelligence.
# services/scheduler/config.yaml
scheduler:
  enabled: true
  jobs:
    - name: publish_pipeline
      extension: web
      job: publish_pipeline
      schedule: "0 */15 * * * *"
Learn more →
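The schedule field above is a six-field cron expression with a leading seconds field. A small sketch (not the scheduler's real parser) of how the fields break down:

```rust
/// Split a 6-field cron expression (seconds first) into its fields.
/// Sketch only: a real parser also handles ranges, lists, and steps.
fn cron_fields(expr: &str) -> Option<[&str; 6]> {
    let parts: Vec<&str> = expr.split_whitespace().collect();
    parts.try_into().ok()
}

fn main() {
    // "0 */15 * * * *": at second 0 of every 15th minute,
    // i.e. the job fires every 15 minutes.
    let [sec, min, hour, dom, month, dow] =
        cron_fields("0 */15 * * * *").unwrap();
    println!("sec={sec} min={min} hour={hour} dom={dom} month={month} dow={dow}");
}
```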
Framework vs. Library
What Frameworks Give You
- Abstractions you don't control
- Build auth yourself (weeks)
- Single-user by default
- Debug through framework internals
- Lock-in to their patterns
- Manage your own infrastructure
What a Library Should Provide
Built on open standards
Your binary. Your rules. We're just a library.
No vendor lock-in. Production-grade from day one.
MCP
Connect any AI client to your tools
The open protocol from Anthropic for connecting AI clients to external capabilities. HTTP-native, OAuth2-protected, production-ready.
- Works with Claude Code, Claude Desktop, ChatGPT
- HTTP transport with real authentication
- Per-tool OAuth2 scopes
A2A
Agents that discover and collaborate
Google's open protocol for agent interoperability. Discovery, capabilities negotiation, and secure multi-agent communication.
- Automatic agent discovery
- Capability negotiation
- Secure cross-agent messaging
OAuth2
Battle-tested authorization
Full OAuth2 authorization server with OpenID Connect. PKCE flows, token introspection, scoped permissions for every tool call.
- Authorization code with PKCE
- Client credentials flow
- Standard OIDC discovery
WebAuthn
Passwordless by default
W3C standard for phishing-resistant authentication. Face ID, Touch ID, YubiKey - no passwords to steal.
- Face ID / Touch ID / Windows Hello
- Hardware security keys
- Phishing-resistant by design
Your extensions. Your binary.
systemprompt.io is a library you control, not a platform that controls you.
// Extension, ExtensionMetadata, ExtensionContext, ExtensionRouter,
// SchemaDefinition, Job, and LlmProvider come from the systemprompt.io library.
impl Extension for MyExtension {
    fn metadata(&self) -> ExtensionMetadata {
        ExtensionMetadata {
            id: "my-extension",
            name: "My Extension",
            version: env!("CARGO_PKG_VERSION"),
        }
    }

    fn router(&self, ctx: &dyn ExtensionContext) -> Option<ExtensionRouter> {
        let router = Router::new()
            .route("/items", get(list_items).post(create_item))
            .with_state(ctx.database());
        Some(ExtensionRouter::new(router, "/api/v1"))
    }

    fn schemas(&self) -> Vec<SchemaDefinition> {
        vec![SchemaDefinition::inline(
            "my_items",
            include_str!("../schema/items.sql"),
        )]
    }

    fn jobs(&self) -> Vec<Arc<dyn Job>> {
        vec![Arc::new(CleanupJob), Arc::new(SyncJob)]
    }

    fn llm_providers(&self) -> Vec<Arc<dyn LlmProvider>> {
        vec![Arc::new(MyCustomProvider::new())]
    }
}

register_extension!(MyExtension);
Compile-time discovery
One trait, many capabilities. The Extension trait provides optional methods for HTTP routes, database schemas, background jobs, and LLM providers. Extensions register via the inventory crate for compile-time discovery.
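As a simplified stand-in for what register_extension! and the inventory crate do (the real mechanism collects submissions at link time, so extensions register themselves without a central list in your code), a static registry of trait objects illustrates the idea:

```rust
// Simplified stand-in for the Extension trait's metadata surface;
// the names below are illustrative, not the library's API.
trait Extension: Sync {
    fn id(&self) -> &'static str;
}

struct BlogExtension;
impl Extension for BlogExtension {
    fn id(&self) -> &'static str { "blog" }
}

struct SearchExtension;
impl Extension for SearchExtension {
    fn id(&self) -> &'static str { "search" }
}

// In the real library, register_extension! submits each extension to
// an inventory gathered at compile/link time; here a static slice
// plays that role.
static REGISTRY: &[&dyn Extension] = &[&BlogExtension, &SearchExtension];

fn extension_ids() -> Vec<&'static str> {
    REGISTRY.iter().map(|e| e.id()).collect()
}

fn main() {
    println!("{:?}", extension_ids());
}
```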
Library, not platform
You own the binary
Compile once, deploy anywhere. No runtime dependencies on us.
Library dependency
Add systemprompt.io as a Cargo dependency. Pin your version. Update when ready.
If it compiles, it works
Rust's type system catches errors at compile time. No runtime surprises.
Extensions stay private
Build proprietary logic on open core. We never see your code.
Ship YOUR product
White-label ready. Your domain. Your brand.
Pricing & Licensing
Source-available. BYOK. Sandbox everything. Optional cloud hosting.
Free
BSL license, self-hosted
- Self-hosted (just needs PostgreSQL)
- BYOK (OpenAI, Gemini, Anthropic)
- All core features
- Unlimited agents & MCP servers
- Completely sandboxed
- Community support
Cloud
Managed hosting, variable resources
- One-click deploy
- Managed PostgreSQL
- Automatic backups
- Your tenant URL
- Scale resources as needed
- Email support
Licensed Enterprise
Private infrastructure & business licensing
- Install on your private infrastructure
- Volume licensing
- Dedicated support
- SLA guarantee
- Custom integrations
Frequently asked questions
What's the difference between systemprompt.io and Claude/ChatGPT?
Claude and ChatGPT are AI models. systemprompt.io is a Rust library for deploying AI to production. It provides auth, MCP hosting, agent orchestration, and observability in a single binary. You call the AI; we handle the infrastructure.
Can I self-host everything?
Yes. systemprompt.io compiles to a single binary. Point it at any PostgreSQL database and run it on bare metal, in a VM, or containerized. The self-hosted version includes all core features. Cloud hosting is available for teams who want managed infrastructure.
What MCP clients are supported?
Any MCP client works with systemprompt.io: Claude Code, Claude Desktop, ChatGPT, and any tool that speaks the Model Context Protocol. Our servers use HTTP-native transports supported by modern clients.
How does authentication work?
systemprompt.io uses OAuth2/OIDC for API authentication and WebAuthn for passwordless user login. Every MCP tool call and agent interaction is authenticated and authorized against scoped permissions.
Is this open source?
systemprompt.io is source-available under BSL-1.1 (Business Source License). You can view, modify, and self-host the code. After 4 years, each version converts to Apache 2.0. The template you customize is MIT licensed and fully yours.
SHIP AI TO YOUR USERS
You bring the intelligence. We handle the multiplayer infrastructure.
Get Started Free