extended-salmon · 4h ago

Custom LLM Inquiry

Hi @Vapi @Vapi Support Bot, I wanted to clarify a few points regarding custom LLM integration with Vapi: 1. Can we use the Azure OpenAI Realtime API with a custom LLM? 2. Is it necessary to create a separate backend API layer to connect the custom LLM, or can we use the Azure OpenAI endpoint URL and credentials directly in the assistant configuration?
1 Reply
Vapi Support Bot
You do not need a separate backend API layer: Vapi lets you connect directly to your Azure OpenAI endpoint using your credentials. Here is a sample credential configuration for your assistant:
{
"provider": "azure-openai",
"models": ["gpt-4o-2024-11-20"],
"region": "eastus",
"openAIEndpoint": "https://your-resource.openai.azure.com",
"openAIKey": "your-azure-openai-api-key"
}
Just ensure your Azure deployment name exactly matches one of Vapi's supported model names for a smooth setup.
For more details, see the Vapi Azure OpenAI integration guide. Source: Azure OpenAI Integration Guide
