Vapi helps developers build, test, and deploy voice agents at scale. We enable everything between the raw models and production, including telephony, test suites, and real-time analytics.
Hi @BigRed, I'm an AI assistant for Vapi. While a team member reviews this question, I'll look through relevant sources and see if I can provide suggestions. Please tag @Ask Inkeep with any follow-up questions.
No, currently we do not permit increasing the token count beyond 1000. Moreover, it's preferable to use fewer than 1000 tokens, as our focus is on Voice AI rather than conversational AI.
As I understand it, token limits of 1500-2500 also allow a bigger context window, which is what I'm more interested in. For example, if a call is initiated and the caller says, "Hello, my name is Jeff," and later in the conversation the chatbot asks for the name, it doesn't "remember" it. With a higher token limit, the context window is bigger, so more of the conversation is remembered. That's why I was asking. Thanks for the reply.
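For reference on where that limit lives, here is a minimal sketch of an assistant model configuration with a per-turn token cap, written as a Vapi-style config object. The exact field names (`provider`, `model`, `maxTokens`, `messages`) are assumptions for illustration and should be checked against the current Vapi docs.

```ts
// Minimal sketch of an assistant model config with a token cap.
// Field names are assumptions; verify against the Vapi documentation.
const assistantConfig = {
  model: {
    provider: "openai",
    model: "gpt-4o",
    maxTokens: 1000, // per-turn generation cap; 1000 is the limit mentioned above
    temperature: 0.7,
    messages: [
      {
        role: "system",
        content:
          "You are a dispatch intake agent. Remember details the caller gives you, " +
          "such as their name, and reuse them later in the conversation.",
      },
    ],
  },
};

console.log(JSON.stringify(assistantConfig, null, 2));
```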
I don't. These calls are first-time callers. Information is collected and then passed through an API to dispatching software. I was just looking to make the conversation seem more fluid/human.
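Since the collected details are handed off through an API, here is a minimal sketch of what that handoff could look like: a small webhook endpoint that receives the collected caller fields and forwards them to the dispatching software. The endpoint path, payload shape (`callerName`, `callbackNumber`, `issue`), and dispatch URL are hypothetical placeholders, not confirmed Vapi or dispatcher APIs.

```ts
// Sketch: receive collected caller info via webhook and forward it to a
// dispatching system. Paths, field names, and URLs are placeholders.
import { createServer } from "node:http";

const DISPATCH_URL = "https://dispatch.example.com/api/jobs"; // hypothetical

const server = createServer((req, res) => {
  if (req.method !== "POST" || req.url !== "/voice-webhook") {
    res.writeHead(404).end();
    return;
  }

  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", async () => {
    try {
      // Assumed payload shape from the voice agent: { callerName, callbackNumber, issue }
      const { callerName, callbackNumber, issue } = JSON.parse(body);

      // Forward the collected details to the dispatching software.
      await fetch(DISPATCH_URL, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ name: callerName, phone: callbackNumber, notes: issue }),
      });

      res.writeHead(200, { "Content-Type": "application/json" });
      res.end(JSON.stringify({ ok: true }));
    } catch {
      res.writeHead(400).end("bad payload");
    }
  });
});

server.listen(3000, () => console.log("webhook listening on :3000"));
```

This keeps the call flow itself unchanged; the forwarding step only runs after the agent has collected the fields, so it doesn't affect how fluid the conversation feels.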