magic-amber · 8mo ago

Connect VAPI to langgraph

I have a very complex workflow developed in LangGraph and I want to attach VAPI voice capabilities. Every request VAPI receives from the user has to be redirected to the LangGraph agent. Could you please help me with how to configure this? Thanks
7 Replies
Vapi Support Bot
To connect VAPI to your LangGraph agent, configure VAPI to use your LangGraph endpoint as a Custom LLM:
curl -X PATCH https://api.vapi.ai/assistant/YOUR_ASSISTANT_ID \
-H "Authorization: Bearer YOUR_PRIVATE_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": {
"provider": "custom-llm",
"model": "gpt-4o",
"url": "https://YOUR_LANGGRAPH_ENDPOINT/chat/completions"
}
}'
If your LangGraph agent is running locally, use ngrok to create a public URL.

Sources:
- Custom LLM Tool Calling Integration
- Using Your Server
- Server SDKs
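To make the Custom LLM configuration above work, your endpoint has to answer in the OpenAI `/chat/completions` response shape. A minimal sketch of that shape, with the LangGraph call stubbed out (the `my_graph.invoke` line is a hypothetical placeholder for your own graph, not Vapi or LangGraph API):

```python
import time
import uuid

def chat_completion_response(text: str, model: str = "gpt-4o") -> dict:
    # Builds an OpenAI-style /chat/completions response body, which is
    # the shape a custom-llm endpoint is expected to return.
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": text},
            "finish_reason": "stop",
        }],
    }

# Hypothetical glue: run your LangGraph agent, then wrap its reply.
# reply = my_graph.invoke({"messages": request_messages})  # your graph
body = chat_completion_response("Hello from LangGraph!")
print(body["choices"][0]["message"]["content"])
```

Serve that body from whatever web framework hosts your graph, at a route ending in `/chat/completions`.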
⤵️ Mention @Vapi Support Bot for follow-ups 🐥
magic-amber (OP) · 8mo ago
@Duckie how do I set this up in the dashboard?
Vapi Support Bot
According to Using Your Server, you can set up LangGraph in the VAPI dashboard by:
1. Log in to your VAPI account at dashboard.vapi.ai
2. Navigate to the "Model" section
3. Choose the "Custom LLM" option
4. Enter your LangGraph endpoint URL in the endpoint field (use an ngrok URL if your LangGraph agent runs locally)
5. Test the connection by sending a test message

Is your LangGraph endpoint already accessible via a public URL, or do you need guidance on setting up ngrok for a locally running LangGraph agent?

Source:
- Using Your Server
magic-amber (OP) · 8mo ago
I've added my custom URL (https://b456-2605-ad80-30-241e-f9a5-117c-9942-7f2c.ngrok-free.app), but Vapi still appends /chat/completions to the end, which breaks the integration.
deep-jade · 8mo ago
Create a custom Vapi tool that sends the user's voice input via POST to your LangGraph API endpoint, which processes it and returns a JSON response for Vapi to vocalize. Then create a new assistant in the Vapi dashboard, configure it with your LangGraph API endpoint, set up the transcriber and voice settings, define a system prompt, and test the integration using the Test Call feature.
extended-salmon · 8mo ago
Yes, you need to return an OpenAI-structured response for a Custom LLM, and the endpoint path must be chat/completions. It also needs to be a streaming response. https://github.com/VapiAI/example-custom-llm
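The streaming format is Server-Sent Events carrying OpenAI `chat.completion.chunk` objects, terminated by `data: [DONE]`. A sketch of the chunk generator (token-per-word splitting here is just for illustration; a real server would stream tokens as the LangGraph agent produces them):

```python
import json

def sse_chunks(text: str, model: str = "gpt-4o"):
    # Yields Server-Sent Events in the OpenAI streaming format that
    # a custom-llm /chat/completions endpoint is expected to emit.
    for token in text.split():
        chunk = {
            "id": "chatcmpl-demo",
            "object": "chat.completion.chunk",
            "model": model,
            "choices": [{"index": 0,
                         "delta": {"content": token + " "},
                         "finish_reason": None}],
        }
        yield f"data: {json.dumps(chunk)}\n\n"
    # The [DONE] sentinel tells the client the stream is finished.
    yield "data: [DONE]\n\n"

lines = list(sse_chunks("hello from LangGraph"))
print(lines[-1].strip())  # prints "data: [DONE]"
```

Send these with a `Content-Type: text/event-stream` response header; the linked VapiAI/example-custom-llm repo shows a complete server.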
Vapi · 7mo ago
Hey Sai, checking if this was resolved for you.
