VAPI - Custom LLM - Streaming
Hello, when using a custom-llm-server, is it required to have streaming enabled? I see that VAPI sends stream: true to my custom LLM server, but due to specific requirements I am not able to support streaming at the moment. My server returns an OpenAI-compatible (non-streaming) response, yet VAPI is not able to read it, so the assistant never talks. Can you please advise on this?
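In case it clarifies what I am trying to do: would wrapping my existing non-streaming result into SSE chunks be the expected workaround when VAPI sends stream: true? Below is a rough sketch of that idea (FastAPI assumed on my side; the endpoint path and chunk fields are taken from the OpenAI streaming format and are not confirmed against VAPI's docs):

```python
# Sketch only: wrap a single non-streaming completion into SSE chunks
# shaped like OpenAI "chat.completion.chunk" events, so a client that
# requested stream: true still receives a stream. Field names follow the
# OpenAI streaming spec; VAPI's exact expectations are an assumption here.
import json
import time
import uuid

from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse

app = FastAPI()


def get_full_answer(messages: list) -> str:
    # Placeholder for my existing non-streaming generation logic.
    return "This is the complete answer produced without streaming."


@app.post("/chat/completions")
async def chat_completions(request: Request):
    body = await request.json()
    answer = get_full_answer(body.get("messages", []))

    completion_id = f"chatcmpl-{uuid.uuid4().hex}"
    created = int(time.time())
    model = body.get("model", "custom-llm")

    def sse_chunks():
        # One delta chunk carrying the whole answer, then a stop chunk,
        # then the [DONE] sentinel, mirroring OpenAI's SSE stream.
        first = {
            "id": completion_id,
            "object": "chat.completion.chunk",
            "created": created,
            "model": model,
            "choices": [{
                "index": 0,
                "delta": {"role": "assistant", "content": answer},
                "finish_reason": None,
            }],
        }
        yield f"data: {json.dumps(first)}\n\n"

        last = {
            "id": completion_id,
            "object": "chat.completion.chunk",
            "created": created,
            "model": model,
            "choices": [{"index": 0, "delta": {}, "finish_reason": "stop"}],
        }
        yield f"data: {json.dumps(last)}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(sse_chunks(), media_type="text/event-stream")
```

If VAPI strictly requires the SSE format whenever it sends stream: true, something like this would let me keep my non-streaming backend while still satisfying the protocol. Is that the recommended approach, or is there a setting to disable streaming for a custom LLM?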