AI Gateway Overview¶
The AI Gateway provides a unified interface for managing AI services, including Large Language Model (LLM) providers and proxy configurations. It enables organizations to seamlessly integrate, manage, and govern AI capabilities within their API ecosystem.
AI Workspace¶
The AI Workspace is a comprehensive platform for managing AI services, including AI gateways, LLM providers, proxy configurations, and Model Context Protocol (MCP) integrations.
AI Gateways¶
Create and manage AI gateway runtimes directly within the AI Workspace:
- Create gateways: Set up gateway runtimes to process and route AI requests
- Multiple deployment options: Deploy via Quick Start, Virtual Machine, Docker, or Kubernetes
- Monitor gateway status: Track the availability of your gateway deployments
Learn more in the AI Gateways section.
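To make the status tracking above concrete, here is a minimal sketch of how a gateway's availability might be classified from its last heartbeat. All names here (`GatewayRuntime`, `HEARTBEAT_TIMEOUT`) are illustrative assumptions, not the product's actual API:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical threshold; a real deployment exposes its own health checks.
HEARTBEAT_TIMEOUT = timedelta(seconds=30)

@dataclass
class GatewayRuntime:
    name: str
    last_heartbeat: datetime  # last time the runtime reported in

def gateway_status(gw: GatewayRuntime, now: datetime) -> str:
    """Classify a gateway as 'available' or 'unreachable' by heartbeat age."""
    age = now - gw.last_heartbeat
    return "available" if age <= HEARTBEAT_TIMEOUT else "unreachable"

now = datetime.now(timezone.utc)
fresh = GatewayRuntime("edge-gw", now - timedelta(seconds=5))
stale = GatewayRuntime("lab-gw", now - timedelta(minutes=5))
print(gateway_status(fresh, now))  # available
print(gateway_status(stale, now))  # unreachable
```

A dashboard would run this classification periodically per deployment, regardless of whether the runtime was started via Quick Start, a VM, Docker, or Kubernetes.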
LLM Providers¶
Connect to various LLM service providers and manage their configurations:
- Connect OpenAI, Anthropic, and more: Integrate multiple AI service providers
- Manage provider credentials: Securely store and manage API keys and authentication
- Monitor provider status: Track the availability and health of connected providers
Learn more in the LLM Providers section.
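The credential-management pattern described above is commonly implemented by reading keys from the environment and never echoing them back in full. The sketch below is a hypothetical illustration of that pattern (the class and variable names are assumptions, not the product's API):

```python
import os

class ProviderRegistry:
    """Illustrative sketch: store LLM provider credentials from environment
    variables and mask them when displayed."""

    def __init__(self) -> None:
        self._keys: dict[str, str] = {}

    def connect(self, provider: str, env_var: str) -> None:
        # Read the credential from the environment rather than hard-coding it.
        key = os.environ.get(env_var)
        if not key:
            raise KeyError(f"missing credential for {provider}: set {env_var}")
        self._keys[provider] = key

    def masked(self, provider: str) -> str:
        # Show only a prefix and suffix so keys never appear in logs or UIs.
        key = self._keys[provider]
        return key[:4] + "..." + key[-2:] if len(key) > 8 else "..."

os.environ["OPENAI_API_KEY"] = "sk-demo-1234567890"  # demo value only
registry = ProviderRegistry()
registry.connect("openai", "OPENAI_API_KEY")
print(registry.masked("openai"))  # sk-d...90
```

Provider health monitoring would sit on top of such a registry, periodically probing each connected provider with its stored credential.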
LLM Proxies¶
Create and manage proxy endpoints for your LLM services:
- Create proxy endpoints: Set up endpoints for routing AI requests
- Attach policies and guardrails: Apply rate limiting, content filtering, and security policies
- Track proxy traffic: Monitor usage and performance metrics
Learn more in the LLM Proxies section.
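Rate limiting, one of the policies mentioned above, is typically built on a token bucket: each proxy endpoint gets a bucket that refills at a fixed rate, and requests are rejected when it is empty. The following is a self-contained sketch of that mechanism, not the gateway's actual policy engine:

```python
class TokenBucket:
    """Token-bucket rate limiter: 'capacity' bounds bursts, 'refill_per_sec'
    sets the sustained request rate."""

    def __init__(self, capacity: int, refill_per_sec: float) -> None:
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_per_sec = refill_per_sec
        self.last = 0.0  # timestamp of the previous call

    def allow(self, now: float) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        elapsed = now - self.last
        self.tokens = min(self.capacity, self.tokens + elapsed * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(capacity=2, refill_per_sec=1.0)
print(bucket.allow(0.0))  # True  (burst: 2 -> 1 tokens)
print(bucket.allow(0.0))  # True  (burst: 1 -> 0 tokens)
print(bucket.allow(0.0))  # False (bucket empty)
print(bucket.allow(1.5))  # True  (1.5 tokens refilled after 1.5 s)
```

Content filtering and other guardrails follow the same shape: a small policy check evaluated per request before the call is forwarded to the LLM provider.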
MCP Features (Coming Soon)¶
Model Context Protocol (MCP) external servers and registries are not yet available; this page will cover them when they are released.
To get started, refer to the Getting Started guide.