wise-white · 17mo ago

Streaming API responses in Vapi

Flowise, the Assistants API, and others all support streaming. Apparently the only way to let the AI do anything besides use the in-house Vapi model is to have it use tools, which are basically just webhooks; nothing is custom code. Is there a way to use streaming with Flowise and other tools in Vapi, or do we have to keep relying on these kinds of tools?
6 Replies
Ask Inkeep · 17mo ago
Hi @Abed Malak | HoopAI, I'm an AI assistant for Vapi. I'll analyze available sources and get back to you shortly. While I generate an answer, here are some sources that might be helpful: Custom LLM, Web, How Vapi Works, Introduction.
Ask Inkeep · 17mo ago
Answer: Vapi is designed to be highly customizable and modular, enabling developers to integrate various external tools and services, including streaming capabilities. While the core documentation doesn't explicitly mention streaming API responses in combination with Flowise or specific tools, Vapi's architecture supports using any OpenAI-compatible endpoint as the LLM, which can include services like Flowise if they are compatible. This means you can set up your server to act as the LLM and potentially handle streaming responses through your custom server setup (1).

For integrating external tools or services, you can leverage Vapi's custom-llm provider. This involves setting up your server to be compatible with the OpenAI client, allowing your external services to interact with Vapi through your server. This setup is crucial for custom implementations, including streaming responses or integrating with tools like Flowise.

Remember, Vapi is built to give developers a lot of room to customize and integrate. So, while tools and webhooks are a common approach for extending its capabilities, integrating streaming or other specific functionality depends on how you set up and use your server in combination with Vapi's flexible APIs (2).
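To make the "OpenAI-compatible endpoint" idea concrete, here is a minimal Python sketch of the streaming side of such a server. It assumes the custom-llm provider consumes OpenAI-style `chat.completion.chunk` objects sent as Server-Sent Events; the model name and helper names are hypothetical, and the wiring to an actual HTTP framework (e.g. a FastAPI `StreamingResponse`) is left out.

```python
# Sketch (assumption): Vapi's custom-llm provider reads an OpenAI-style
# SSE stream, i.e. "data: {chunk-json}\n\n" lines ending in "data: [DONE]".
import json
import time
import uuid


def sse_chunk(delta_text, model="custom-flowise", finish_reason=None):
    """Build one OpenAI-style streaming chunk as an SSE 'data:' line."""
    chunk = {
        "id": f"chatcmpl-{uuid.uuid4().hex[:8]}",
        "object": "chat.completion.chunk",
        "created": int(time.time()),
        "model": model,
        "choices": [{
            "index": 0,
            # An empty delta with finish_reason set signals the last chunk.
            "delta": {"content": delta_text} if delta_text else {},
            "finish_reason": finish_reason,
        }],
    }
    return f"data: {json.dumps(chunk)}\n\n"


def stream_answer(tokens):
    """Yield SSE lines for each token, then the terminators an
    OpenAI-compatible client expects."""
    for tok in tokens:
        yield sse_chunk(tok)
    yield sse_chunk("", finish_reason="stop")
    yield "data: [DONE]\n\n"
```

A framework handler for `POST /chat/completions` would pass `stream_answer(...)` (fed by Flowise or any other backend) to its streaming-response mechanism, so tokens reach Vapi as they are produced rather than after the whole reply is done.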
manual-pink · 17mo ago
What is your use case?
wise-white (OP) · 17mo ago
To give the AI assistant knowledge: if it doesn't know something, it can query a vector DB and stream the response back. Right now it's saying "one sec" and then taking a ton of time.
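The fallback flow described above could be sketched roughly like this. Everything here is hypothetical: the in-memory `DOCS` store and cosine-similarity search stand in for a real vector database, and `stream_tokens` stands in for forwarding pieces of the answer as they are ready instead of after the full lookup.

```python
# Hypothetical sketch of the "query a vector DB when the model doesn't
# know" fallback; an in-memory store replaces a real vector database.
import math

# Toy corpus: document key -> embedding vector.
DOCS = {
    "pricing": [0.9, 0.1],
    "refunds": [0.2, 0.8],
}


def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)


def retrieve(query_vec, top_k=1):
    """Return the top_k document keys most similar to the query vector."""
    ranked = sorted(DOCS, key=lambda k: cosine(query_vec, DOCS[k]), reverse=True)
    return ranked[:top_k]


def stream_tokens(text):
    """Yield the answer word by word so the caller can forward each piece
    immediately rather than making the user wait for the whole reply."""
    for word in text.split():
        yield word + " "
```

Streaming each token as it arrives is what removes the long "one sec" pause: the user hears the start of the answer while the rest is still being generated.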
manual-pink · 17mo ago
We are working on the knowledge base. It will be coming very soon, in 1-2 weeks at most.
wise-white (OP) · 17mo ago
woaaaaaaaaaaa