magic-amber•8mo ago
Connect VAPI to langgraph
I have a fairly complex workflow built in LangGraph, and I want to attach VAPI voice capabilities to it. Every request VAPI receives from the user has to be forwarded to the LangGraph agent. Could you please help me figure out how to configure this? Thanks
7 Replies
To connect VAPI to your LangGraph agent, configure VAPI to use your LangGraph endpoint as a Custom LLM:
If your LangGraph agent is running locally, use ngrok to create a public URL.
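As a rough sketch of what that Custom LLM endpoint has to produce (assuming a Python server; `run_langgraph_agent` is a hypothetical stand-in for your compiled graph's `invoke()` call), VAPI sends an OpenAI-format chat request and expects an OpenAI-format completion back:

```python
import json

# Hypothetical stand-in for the LangGraph agent; replace with your
# compiled graph's invoke()/stream() call.
def run_langgraph_agent(messages):
    last_user = messages[-1]["content"]
    return f"LangGraph processed: {last_user}"

def chat_completion_body(request_body):
    """Build an OpenAI-style /chat/completions response from VAPI's request.

    VAPI sends the conversation as an OpenAI-format `messages` list; the
    reply must mirror the OpenAI response schema so VAPI can parse it.
    """
    messages = request_body["messages"]
    return {
        "id": "chatcmpl-langgraph",
        "object": "chat.completion",
        "model": request_body.get("model", "langgraph-agent"),
        "choices": [{
            "index": 0,
            "message": {
                "role": "assistant",
                "content": run_langgraph_agent(messages),
            },
            "finish_reason": "stop",
        }],
    }

resp = chat_completion_body(
    {"model": "langgraph-agent",
     "messages": [{"role": "user", "content": "Hi"}]}
)
print(json.dumps(resp["choices"][0]["message"], indent=2))
```

In practice you would serve this from a FastAPI/Flask route at `/chat/completions`; the function above only shows the response shape.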
Sources:
- Custom LLM Tool Calling Integration
- Using Your Server
- Server SDKs
⤵️ Mention @Vapi Support Bot for follow-ups 🐥
magic-amberOP•8mo ago
@Duckie how do I set this up in the dashboard?
According to Using Your Server, you can point VAPI at your LangGraph endpoint from the dashboard as follows:
1. Log in to your VAPI account at dashboard.vapi.ai
2. Navigate to the "Model" section
3. Choose the "Custom LLM" option
4. Enter your LangGraph endpoint URL in the endpoint field (use ngrok URL if your LangGraph agent runs locally)
5. Test the connection by sending a test message
Is your LangGraph endpoint already accessible via a public URL, or do you need guidance on setting up ngrok for a locally running LangGraph agent?
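For step 5, one way to sanity-check the wiring before a live call is to feed your server a sample of the OpenAI-format request VAPI sends (a minimal sketch; the real payload carries more fields, and `latest_user_utterance` is a hypothetical helper you might use to hand the transcript to LangGraph):

```python
# Sample of the OpenAI-format body VAPI POSTs to a Custom LLM endpoint
# (illustrative only; actual requests include additional fields).
sample_request = {
    "model": "langgraph-agent",
    "stream": True,
    "messages": [
        {"role": "system", "content": "You are a voice assistant."},
        {"role": "assistant", "content": "How can I help?"},
        {"role": "user", "content": "Check my order status."},
    ],
}

def latest_user_utterance(messages):
    """Hypothetical helper: pull the newest user turn out of the
    OpenAI-format messages list -- typically what you pass to LangGraph."""
    for msg in reversed(messages):
        if msg["role"] == "user":
            return msg["content"]
    return None

print(latest_user_utterance(sample_request["messages"]))
```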
Source:
- Using Your Server
magic-amberOP•8mo ago
I've added my custom URL ( https://b456-2605-ad80-30-241e-f9a5-117c-9942-7f2c.ngrok-free.app ), but VAPI still appends /chat/completions to the end, which keeps the integration from working
deep-jade•8mo ago
Create a custom Vapi tool that sends user voice input via POST to your LangGraph API endpoint, which processes it and returns a JSON response for Vapi to vocalize.
Create a new assistant in the Vapi dashboard, configure it with your LangGraph API endpoint, set up transcriber and voice settings, define a system prompt, and test the integration using the Test Call feature.
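If you go the custom-tool route instead of a Custom LLM, your LangGraph server replies to VAPI's tool-call webhook with a `results` array keyed by the tool call ID (a minimal sketch; the exact envelope is described in VAPI's tool-calling docs, and `call_123` / the answer text are just sample values):

```python
import json

def vapi_tool_result(tool_call_id, answer):
    """Build the JSON body a VAPI tool-call webhook expects back:
    one entry per toolCallId, with `result` as the text to vocalize."""
    return {"results": [{"toolCallId": tool_call_id, "result": answer}]}

# Hypothetical tool call ID and answer text, for illustration only.
body = vapi_tool_result("call_123", "Your order shipped yesterday.")
print(json.dumps(body))
```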
extended-salmon•8mo ago
Yes, your Custom LLM endpoint must return an OpenAI-structured response, it has to be served at the /chat/completions path, and it needs to be a streaming response.
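The streaming requirement means emitting server-sent events of OpenAI `chat.completion.chunk` objects, ending with a `data: [DONE]` sentinel. A minimal sketch of that framing (chunking an already-computed LangGraph reply; in a real server you would stream tokens as the graph produces them):

```python
import json

def sse_stream(text, model="langgraph-agent", chunk_size=16):
    """Yield OpenAI-style streaming chat chunks as SSE lines.

    The Custom LLM integration expects a text/event-stream of
    `chat.completion.chunk` objects, terminated by `data: [DONE]`.
    """
    for i in range(0, len(text), chunk_size):
        chunk = {
            "id": "chatcmpl-langgraph",
            "object": "chat.completion.chunk",
            "model": model,
            "choices": [{
                "index": 0,
                "delta": {"content": text[i:i + chunk_size]},
                "finish_reason": None,
            }],
        }
        yield f"data: {json.dumps(chunk)}\n\n"
    # Final chunk carries finish_reason, then the [DONE] sentinel.
    done = {
        "id": "chatcmpl-langgraph",
        "object": "chat.completion.chunk",
        "model": model,
        "choices": [{"index": 0, "delta": {}, "finish_reason": "stop"}],
    }
    yield f"data: {json.dumps(done)}\n\n"
    yield "data: [DONE]\n\n"

lines = list(sse_stream("Hello from LangGraph"))
print(lines[0])
```

Write each yielded line to the response body with `Content-Type: text/event-stream`.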
https://github.com/VapiAI/example-custom-llm
Hey Sai, checking if this was resolved for you.