Custom LLM response

I am using the Mixtral model to generate the desired outputs, and I am integrating my custom LLM through the Vapi custom LLM interface. When I call the agent, my LLM receives the request from Vapi, but the response I send back never reaches the Vapi interface, i.e. the agent doesn't say anything and the call ends. Please tell me how to resolve this problem.
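For context, my endpoint returns something close to the sketch below. FastAPI, the route path, and the chunk fields are just for illustration here; I am assuming Vapi expects an OpenAI-compatible `/chat/completions` response streamed as server-sent events ending with `data: [DONE]`, but I am not certain this is the exact format it is waiting for:

```python
# Minimal sketch of an OpenAI-style streaming endpoint (framework, route path,
# and chunk fields are my assumptions, not confirmed against Vapi's docs).
import json
import time

from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse

app = FastAPI()

@app.post("/chat/completions")
async def chat_completions(request: Request):
    body = await request.json()  # request body forwarded by Vapi
    # ... call the Mixtral model here and collect its output ...
    answer = "Hello, how can I help you today?"  # placeholder text

    def stream():
        # Each chunk is a `data: <json>` line in the OpenAI chunk format,
        # followed by a blank line; the stream ends with `data: [DONE]`.
        chunk = {
            "id": "chatcmpl-1",
            "object": "chat.completion.chunk",
            "created": int(time.time()),
            "model": body.get("model", "mixtral"),
            "choices": [
                {
                    "index": 0,
                    "delta": {"role": "assistant", "content": answer},
                    "finish_reason": None,
                }
            ],
        }
        yield f"data: {json.dumps(chunk)}\n\n"
        # Final chunk with a finish_reason, then the DONE sentinel.
        done = dict(chunk, choices=[{"index": 0, "delta": {}, "finish_reason": "stop"}])
        yield f"data: {json.dumps(done)}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(stream(), media_type="text/event-stream")
```

Is this the response shape Vapi expects, or does the problem lie somewhere else (e.g. headers, streaming vs. non-streaming)?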