extended-salmon•4h ago
Custom LLM Inquiry
Hi @Vapi @Vapi Support Bot,
I wanted to clarify a few points regarding custom LLM integration with Vapi:
1. Can we use the Azure OpenAI Realtime API with a custom LLM?
2. Is it necessary to create a separate backend API layer to connect the custom LLM, or can we directly use the Azure OpenAI endpoint URL and credentials within the assistant configuration?
1 Reply
You do not need a separate backend API layer—Vapi allows you to connect directly to your Azure OpenAI endpoint using your credentials. Here’s a sample configuration for your assistant:
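A minimal sketch of such an assistant model configuration, assuming Vapi's custom-LLM provider convention — the resource name, deployment name, and field names here are illustrative placeholders, not values from the original thread:

```json
{
  "model": {
    "provider": "custom-llm",
    "url": "https://YOUR-RESOURCE.openai.azure.com/openai/deployments/YOUR-DEPLOYMENT",
    "model": "gpt-4o"
  }
}
```

Note that Azure OpenAI endpoints also expect an `api-version` query parameter and an `api-key` header on requests, so check the integration guide for where Vapi expects those credentials to be supplied.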
Just ensure your deployment name matches Vapi’s supported models exactly for a smooth setup.
For more details, see the Vapi Azure OpenAI integration guide.
Source: Azure OpenAI Integration Guide