# What is Tyr?
Tyr is an open-source control plane for governing AI agents — Claude Code, Cursor, GitHub Copilot, AutoGen, LangGraph, custom MCP agents, autonomous LLMs — across every environment they run in: developer laptops, bare-metal servers, VMs, and (soon) Kubernetes.
It consists of three binaries:
| Binary | Role |
|---|---|
| `tyr-server` | Control plane. gRPC (`:7700`) + REST (`:7701`). PostgreSQL-backed. Stores policies, agents, and events. |
| `tyrd` | Host agent. Runs on each host. Attaches eBPF programs, enforces policy, and streams events to the server. |
| `tyr` | CLI. Manage agents, apply policies, tail audit logs, and create enrollment tokens. |
## Mental model
```mermaid
flowchart LR
    AI["AI Process<br/>(Cursor, Claude Code, …)"]
    subgraph Kernel["Linux kernel + tyrd eBPF"]
        Hooks["• observe<br/>• enforce<br/>• capture TLS"]
    end
    Server["tyr-server<br/>+ UI / CLI"]
    AI -- "syscalls" --> Kernel
    Kernel -- "gRPC + mTLS" --> Server
```

When an AI process tries to `open()`, `execve()`, or `connect()`, the eBPF program in `tyrd` intercepts the call in-kernel. If policy denies it, the syscall returns `EPERM` before it completes. Every event — allowed, denied, or alerted — streams to the server over mTLS-authenticated gRPC.
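The in-kernel decision can be pictured as a small allow/deny function per hook. The sketch below is a user-space simulation only — the policy schema, field names, and helper functions are illustrative assumptions, not Tyr's actual format or its eBPF code:

```python
import errno

# Hypothetical policy: paths an agent may open and hosts it may connect to.
# These keys are assumptions for illustration, not Tyr's real schema.
POLICY = {
    "open": {"allowed_prefixes": ["/workspace"]},
    "connect": {"allowed_hosts": ["api.openai.com"]},
}

def check_open(path: str) -> int:
    """Return 0 to allow, or -EPERM to deny, mirroring an LSM-style hook."""
    if any(path.startswith(p) for p in POLICY["open"]["allowed_prefixes"]):
        return 0
    return -errno.EPERM

def check_connect(host: str) -> int:
    """Same shape of decision for outbound connections."""
    if host in POLICY["connect"]["allowed_hosts"]:
        return 0
    return -errno.EPERM

print(check_open("/workspace/repo/main.py"))  # 0 (allowed)
print(check_open("/etc/shadow"))              # -1 (-EPERM, denied)
print(check_connect("api.openai.com"))        # 0 (allowed)
```

The key property this models is that the deny happens before the syscall completes: the process simply sees `EPERM`, with no separate sandbox involved.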
## What Tyr is not
- Not a sandbox — Tyr governs policy, it doesn’t provide a separate execution environment like Docker or Firecracker.
- Not an SDK — your AI agent doesn’t need to be modified or import a library. Tyr is kernel-level and transparent.
- Not a replacement for Falco/Tetragon — those are general-purpose runtime security. Tyr is AI-agent-aware and provides an enforcement plane on top.
- Not a managed service — it’s a self-hosted open-source project. You run the server.
## When should I use Tyr?
- You have developers running AI coding agents with full user privileges and want visibility into what they actually touch.
- You run autonomous LLM workflows in production (scrapers, agents, workers) and want hard guardrails on what they can do.
- You need an audit log of every LLM API call for compliance.
- You want to enforce "agent X can only read `/workspace` and call `api.openai.com`" across a fleet of machines.
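A rule like the last one could be expressed declaratively. The fragment below is a hypothetical sketch of such a policy — every key name here is an assumption for illustration, not Tyr's actual policy format:

```yaml
# Illustrative only: these field names are assumptions, not Tyr's real schema.
agent: agent-x
rules:
  - action: allow
    syscall: open
    path: /workspace/**
  - action: allow
    syscall: connect
    host: api.openai.com
  - action: deny        # everything else fails with EPERM
    syscall: "*"
```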
→ Next: Why Tyr? · How it compares