LLM Providers Overview
LLM Providers are integrations with AI service platforms that offer language models. By configuring providers in the AI Workspace, you can:
- Centralize credential management: Store API keys and authentication details securely
- Connect multiple providers: Integrate with leading LLM services
- Monitor provider status: Track availability and health of connected services
- Simplify configuration: Use providers across multiple proxies without duplicating credentials
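The pattern behind these points can be sketched in a few lines. This is an illustrative sketch only, not the AI Workspace's actual API: the `ProviderRegistry` and `Proxy` names are hypothetical, and stand in for the workspace's internal credential store and proxy configuration.

```python
# Hypothetical sketch of centralized credential management:
# a provider is configured once, and any number of proxies
# reference it by name instead of duplicating the API key.
from dataclasses import dataclass, field


@dataclass
class Provider:
    name: str      # e.g. "openai", "anthropic" (illustrative values)
    api_key: str   # stored once, centrally
    base_url: str


@dataclass
class ProviderRegistry:
    _providers: dict = field(default_factory=dict)

    def register(self, provider: Provider) -> None:
        self._providers[provider.name] = provider

    def get(self, name: str) -> Provider:
        return self._providers[name]


@dataclass
class Proxy:
    route: str
    provider_name: str  # reference by name, no embedded credentials


registry = ProviderRegistry()
registry.register(Provider("openai", "sk-example", "https://api.openai.com/v1"))

# Two proxies share the same centrally stored credential.
chat_proxy = Proxy("/chat", "openai")
summarize_proxy = Proxy("/summarize", "openai")
key = registry.get(chat_proxy.provider_name).api_key
```

Rotating the key in one place then takes effect for every proxy that references the provider, which is the benefit of not duplicating credentials per proxy.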
Supported Providers
API Platform AI Workspace supports the following LLM providers:
| Provider | Description | Learn More |
|---|---|---|
| OpenAI | Access GPT-4, GPT-3.5, and other OpenAI models | Documentation |
| Anthropic | Integrate Claude models for advanced AI capabilities | Documentation |
| Azure OpenAI | Use OpenAI models hosted on Microsoft Azure | Documentation |
| Azure AI Foundry | Access models through the Azure AI Foundry platform | Documentation |
| Google Gemini | Integrate Google's Gemini language models | Documentation |
| Mistral AI | Access Mistral's open and commercial models | Documentation |
| AWS Bedrock (Coming Soon) | Connect to Amazon Bedrock's managed AI service | |
Next: Configure LLM Provider - Step-by-step guide to setting up your first provider