Introduction
Welcome to Odock
Odock is an AI governance layer that sits between client applications and model or MCP providers. It provides a centralized gateway for LLM and MCP access, enabling granular control over usage, budgets, and security.
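To make the gateway idea concrete, here is a minimal sketch of what a client request might look like. The URL, key, and model name below are illustrative assumptions, not values from this documentation; the request body follows the OpenAI-style chat schema that such gateways commonly accept.

```python
import json

# Hypothetical values -- the real gateway URL and API key come from your
# own Odock deployment; these are assumptions for illustration only.
GATEWAY_URL = "http://localhost:4000/v1/chat/completions"
ODOCK_API_KEY = "odock-key-123"

# The client speaks one schema; the gateway routes the request to the
# configured upstream provider (OpenAI, Anthropic, Gemini, vLLM, ...).
headers = {
    "Authorization": f"Bearer {ODOCK_API_KEY}",
    "Content-Type": "application/json",
}
payload = {
    "model": "gpt-4o-mini",  # any model this key is permitted to use
    "messages": [{"role": "user", "content": "Hello via the gateway"}],
}

# An actual call would be made with an HTTP client, e.g.:
#   requests.post(GATEWAY_URL, headers=headers, data=json.dumps(payload))
print(json.dumps(payload, indent=2))
```

Because clients only ever see the gateway endpoint, swapping or restricting upstream providers requires no client-side changes.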
Explore the Documentation
What is Odock?
Learn about the core concepts and architecture of the Odock platform.
Docker Quickstart
Get Odock up and running locally in minutes using Docker Compose.
Platform Map
Understand how the different components of Odock fit together.
Gateway API
Reference for the LLM Gateway API endpoints and request formats.
Key Features
- Unified Gateway: Access OpenAI, Anthropic, Gemini, and vLLM through a single endpoint.
- Governance: Manage API keys, model access, and MCP server permissions.
- Budgets & Quotas: Enforce spend and usage limits at various scopes.
- Security: Built-in SafetySec engine for prompt injection and data leakage protection.
- Observability: Comprehensive monitoring with Prometheus, Grafana, Loki, and Tempo.
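The budget enforcement idea above can be sketched in a few lines. This is a toy illustration of per-scope spend limiting, not Odock's actual implementation; the class and field names are hypothetical.

```python
class BudgetGuard:
    """Toy per-scope spend limiter (hypothetical, for illustration only)."""

    def __init__(self, limit_usd: float):
        self.limit_usd = limit_usd
        self.spent_usd = 0.0

    def charge(self, cost_usd: float) -> bool:
        """Record a request's cost; refuse it if it would exceed the budget."""
        if self.spent_usd + cost_usd > self.limit_usd:
            return False
        self.spent_usd += cost_usd
        return True


guard = BudgetGuard(limit_usd=10.0)
assert guard.charge(4.0)          # accepted, $4 of $10 used
assert guard.charge(5.0)          # accepted, $9 of $10 used
assert not guard.charge(2.0)      # refused: would exceed the budget
```

A real gateway would apply checks like this at each configured scope before forwarding a request upstream.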