rival-black · 16mo ago

Fireworks.ai not working as Custom LLM

Has anyone managed to use Fireworks.ai as the Custom LLM, please? I entered this, but it doesn't work:
Custom LLM URL: https://api.fireworks.ai/inference/v1
Model: accounts/fireworks/models/llama-v3-70b-instruct
I also tried the Fireworks API key in the Custom LLM Provider key field, both as APIKEY and as Bearer APIKEY.
5 Replies
Ask Inkeep · 16mo ago
Hi @Ethan Tan, I'm an AI assistant for Vapi. I'll analyze available sources and get back to you shortly.

While I generate an answer, here are some sources that might be helpful:
* Introduction
* Custom LLM
* Frequently Asked Questions
* Connecting Your Custom LLM to Vapi: A Comprehensive Guide

Answer: Sorry, I wasn't able to find any information about your question.
like-gold · 16mo ago
Currently, to use Fireworks.ai you need to configure your own server: Vapi sends the request (along with your token) to your custom server, and your server responds back with the information in the format we support. For a detailed explanation, please see this documentation: https://docs.vapi.ai/custom-llm-guide
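For illustration, here is a minimal sketch of such a relay server in Python (Flask). It assumes Vapi posts OpenAI-style chat-completion requests to your Custom LLM URL and that Fireworks' OpenAI-compatible endpoint at https://api.fireworks.ai/inference/v1/chat/completions accepts the forwarded body. The model name comes from the question above; the route, port, and environment-variable name are placeholders, not a definitive implementation:

```python
# Minimal sketch: relay Vapi's OpenAI-style request to Fireworks and
# stream the response back unchanged. Assumes Vapi calls your server's
# /chat/completions route; adjust the route/port to match your setup.
import os

import requests
from flask import Flask, Response, request

app = Flask(__name__)

FIREWORKS_URL = "https://api.fireworks.ai/inference/v1/chat/completions"
FIREWORKS_API_KEY = os.environ["FIREWORKS_API_KEY"]  # your Fireworks key


@app.route("/chat/completions", methods=["POST"])
def chat_completions():
    body = request.get_json()
    # Pin the Fireworks model name (taken from the question above), since
    # the model field Vapi sends may not match Fireworks' naming scheme.
    body["model"] = "accounts/fireworks/models/llama-v3-70b-instruct"

    upstream = requests.post(
        FIREWORKS_URL,
        json=body,
        headers={"Authorization": f"Bearer {FIREWORKS_API_KEY}"},
        stream=True,
        timeout=60,
    )
    # Pass the (possibly SSE-streamed) response body back to Vapi as-is.
    return Response(
        upstream.iter_content(chunk_size=None),
        status=upstream.status_code,
        content_type=upstream.headers.get("Content-Type", "application/json"),
    )


if __name__ == "__main__":
    app.run(port=8000)
```

With something like this running, you would point the Custom LLM URL in Vapi at your own server (e.g. https://your-server.example.com) instead of at Fireworks directly.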
rival-black (OP) · 16mo ago
Ah, I see. OK, thank you. Any plans to integrate Fireworks as an option soon?
like-gold · 16mo ago
Ask Inkeep · 16mo ago
Glad I could be helpful. Feel free to create a new thread with any new questions you may have.
