Custom LLM integration issue

Hi, I implemented a custom LLM backend using LangGraph.

I'm returning content in the response for Vapi to consume.

I can see in my logs that the content is populated with a response.

But Vapi doesn't seem to pick it up, so there's just silence after the first intro message.
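For context, my understanding is that a custom-llm endpoint has to stream back OpenAI-compatible `chat.completion.chunk` objects framed as Server-Sent Events, ending with `data: [DONE]`; returning content in any other shape tends to look like silence on the call. Here's a minimal sketch of that framing (field values and function names are illustrative, not my actual code):

```python
import json
import time


def sse_chunk(content: str, model: str = "custom-llm") -> str:
    # One OpenAI-style streaming chunk, framed as a Server-Sent Event.
    # The text to speak must live under choices[0].delta.content.
    payload = {
        "id": "chatcmpl-custom",  # illustrative id
        "object": "chat.completion.chunk",
        "created": int(time.time()),
        "model": model,
        "choices": [
            {"index": 0, "delta": {"content": content}, "finish_reason": None}
        ],
    }
    return f"data: {json.dumps(payload)}\n\n"


def sse_done() -> str:
    # Terminator that tells the client the stream is finished.
    return "data: [DONE]\n\n"
```

If anyone can confirm whether Vapi strictly requires this chunked delta shape (versus a single non-streaming completion), that alone would help me narrow it down.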

Can someone help me? Literally, this is my last blocker before I have an end-to-end solution in place!

Thanks

Call ID 25de6020-2b3d-4278-96ca-6fa32606e434