like-gold · 8mo ago

Sounds good, but I’m building a custom X for Y...

Not a problem, we can most likely support it already. Vapi is designed to be modular at every level of the voice pipeline: speech-to-text, LLM, and text-to-speech. You can bring your own custom models for any part of the pipeline. If they're hosted with one of our providers, you just need to add your provider keys and then specify the custom model in your API requests. If they're hosted elsewhere, you can use the Custom LLM provider and specify the URL to your model in your API request.
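
A rough sketch of what the two cases can look like when creating an assistant over the REST API (the field names follow the linked docs pages; the fine-tune ID, endpoint URL, and model names below are placeholders, not real values):

```typescript
// Illustrative sketch only: the exact request shapes come from the
// "Fine-tuned OpenAI models" and "Use Another LLM or Your Own Server" docs.
const VAPI_API_KEY = process.env.VAPI_API_KEY; // your private Vapi API key

// Case 1: custom model hosted with a supported provider (e.g. a fine-tuned
// OpenAI model). After adding your OpenAI key under Provider Keys, reference
// the fine-tuned model ID directly in the assistant's model config.
const fineTunedAssistant = {
  name: "support-bot",
  model: {
    provider: "openai",
    model: "ft:gpt-4o-mini:my-org:support:abc123", // placeholder fine-tune ID
  },
};

// Case 2: model hosted elsewhere behind your own OpenAI-compatible endpoint.
// Point the Custom LLM provider at your server's URL.
const customLlmAssistant = {
  name: "support-bot-custom",
  model: {
    provider: "custom-llm",
    url: "https://llm.example.com/v1", // placeholder endpoint
    model: "my-custom-model",
  },
};

async function createAssistant(body: unknown) {
  const res = await fetch("https://api.vapi.ai/assistant", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${VAPI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Vapi request failed: ${res.status}`);
  return res.json();
}

createAssistant(customLlmAssistant).then((a) => console.log("created", a.id));
```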
Related docs: Fine-tuned OpenAI models (Vapi) · Use Another LLM or Your Own Server