LLM Providers Overview

LLM Providers are integrations with AI service platforms that offer language models. By configuring providers in the AI Workspace, you can:

  • Centralize credential management: Store API keys and authentication details securely
  • Connect multiple providers: Integrate with leading LLM services
  • Monitor provider status: Track availability and health of connected services
  • Simplify configuration: Use providers across multiple proxies without duplicating credentials
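
The credential-centralization idea above can be pictured with a small configuration sketch. The YAML below is purely illustrative — the field names, provider IDs, and proxy references are hypothetical and do not reflect the AI Workspace's actual schema; it only shows one provider entry holding a credential that several proxies reuse:

```yaml
# Hypothetical sketch (not the real Workspace schema):
# the provider entry stores the credential once ...
providers:
  - name: openai-prod              # hypothetical provider ID
    type: openai
    api_key: ${OPENAI_API_KEY}     # secret injected from the environment

# ... and any number of proxies reference it by name,
# so no proxy duplicates the API key.
proxies:
  - name: chat-proxy
    provider: openai-prod
  - name: summarize-proxy
    provider: openai-prod
```
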

Supported Providers

API Platform AI Workspace supports the following LLM providers:

Provider                    Description                                        Learn More
OpenAI                      Access GPT-4, GPT-3.5, and other OpenAI models     Documentation
Anthropic                   Integrate Claude models for advanced AI            Documentation
                            capabilities
Azure OpenAI                Use OpenAI models hosted on Microsoft Azure        Documentation
Azure AI Foundry            Access models through the Azure AI Foundry         Documentation
                            platform
Gemini                      Integrate Google's Gemini language models          Documentation
Mistral AI                  Access Mistral's open and commercial models        Documentation
AWS Bedrock (Coming Soon)   Connect to Amazon Bedrock's managed AI service

Next: Configure LLM Provider - a step-by-step guide to setting up your first provider