
We’ve partnered with Vercel to launch our MCP server, now supporting streamable HTTP for more reliable connections.
An MCP server is a lightweight program that exposes tools and real-world actions to AI models via the Model Context Protocol.
Think of it as a universal toolbox: LLMs and agent frameworks can call your server to take real-world actions on your behalf.
With Vapi’s MCP server on Vercel, your AI agents get that same capability against Vapi’s platform through a hosted streamable-HTTP endpoint; a minimal client sketch follows below.
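
To make that concrete, here is a minimal sketch of an agent-side client connecting to an MCP server over streamable HTTP using the TypeScript MCP SDK. The endpoint URL, auth header, and tool name are placeholders rather than Vapi’s documented values, so substitute the ones from your dashboard and the server’s docs.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Placeholder endpoint and auth header -- swap in the real values for your deployment.
const transport = new StreamableHTTPClientTransport(
  new URL("https://example-mcp-server.vercel.app/mcp"),
  {
    requestInit: {
      headers: { Authorization: `Bearer ${process.env.VAPI_API_KEY}` },
    },
  },
);

const client = new Client({ name: "my-agent", version: "1.0.0" });
await client.connect(transport);

// Discover the tools the server exposes, then invoke one by name.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Tool name and arguments are illustrative only.
const result = await client.callTool({
  name: "example_tool",
  arguments: { query: "hello" },
});
console.log(result.content);
```

Because the SDK keeps transports pluggable, moving between SSE and streamable HTTP is mostly a matter of swapping the transport import; the rest of the client code stays the same.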
Server-Sent Events (SSE) require long-lived connections that often break at scale, especially when handling many concurrent clients.
Streamable HTTP is stateless and more reliable, with each request handling its own lifecycle.
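
To illustrate what “each request handling its own lifecycle” looks like in practice, here is a rough sketch of a stateless streamable-HTTP MCP server built with the TypeScript SDK: a fresh server/transport pair is created per POST and torn down when the response closes, so no long-lived connection has to survive between requests. The server name, route, and `ping` tool are illustrative assumptions, not Vapi’s actual implementation.

```typescript
import { createServer } from "node:http";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/streamableHttp.js";
import { z } from "zod";

const httpServer = createServer(async (req, res) => {
  if (req.method !== "POST" || req.url !== "/mcp") {
    res.writeHead(404).end();
    return;
  }

  // Stateless mode: build a new server + transport for this request only.
  // sessionIdGenerator: undefined disables session tracking entirely.
  const server = new McpServer({ name: "demo-mcp", version: "1.0.0" });
  server.tool("ping", { message: z.string() }, async ({ message }) => ({
    content: [{ type: "text", text: `pong: ${message}` }],
  }));

  const transport = new StreamableHTTPServerTransport({
    sessionIdGenerator: undefined,
  });

  // The request's lifecycle is its own: clean up when the response closes.
  res.on("close", () => {
    transport.close();
    server.close();
  });

  await server.connect(transport);
  await transport.handleRequest(req, res);
});

httpServer.listen(3000);
```

Because nothing is shared between requests, this shape fits serverless platforms like Vercel naturally: each invocation spins up, answers, and disappears without a connection to keep alive.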