Using a fine-tuned OpenAI model via the Web SDK
Hello, I'm calling the Web SDK to create an assistant. How do I use a fine-tuned OpenAI model?
I saw this in the docs but I'm not sure where/when to do the PATCH request, and how it fits in with the current call:
Using Fine-Tuned OpenAI Models
To set up your OpenAI Fine-Tuned model, you need to follow these steps:
1. Set the custom llm URL to https://api.openai.com/v1.
2. Assign the custom llm key to your OpenAI key.
3. Update the model to your fine-tuned model's ID.
4. Execute a PATCH request to the /assistant endpoint and ensure that model.metadataSendMode is set to off.
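Based on those steps, here is my guess at what the PATCH request would look like with fetch. All of this is my assumption from the quoted docs, not something I've verified: the https://api.vapi.ai/assistant/{id} endpoint, the "custom-llm" provider name, and the fine-tuned model ID are placeholders, and I'm unsure whether the custom llm key goes in this body or in a dashboard credential.

```javascript
// Sketch only — field names follow the quoted docs steps; the provider
// name "custom-llm" and the model ID format are my assumptions.
function buildFineTunedModelPatch(fineTunedModelId) {
  return {
    model: {
      provider: "custom-llm",           // assumption: custom llm provider
      url: "https://api.openai.com/v1", // step 1: custom llm URL
      // step 2 (custom llm key = your OpenAI key) presumably goes in a
      // Vapi credential rather than this body — not sure where exactly
      model: fineTunedModelId,          // step 3: your fine-tuned model ID
      metadataSendMode: "off",          // step 4
    },
  };
}

// Send the PATCH (assumes vapiPrivateKey and assistantId are yours):
async function patchAssistant(assistantId, vapiPrivateKey, fineTunedModelId) {
  const res = await fetch(`https://api.vapi.ai/assistant/${assistantId}`, {
    method: "PATCH",
    headers: {
      Authorization: `Bearer ${vapiPrivateKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildFineTunedModelPatch(fineTunedModelId)),
  });
  if (!res.ok) throw new Error(`PATCH failed: ${res.status}`);
  return res.json();
}
```

Is that roughly right, and does it run once per assistant rather than per call?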
This is my current code:
const vapi = new Vapi("x");

const assistantOptions = {
  name: "Vapi’s Pizza Front Desk",
  firstMessage: "Vappy’s Pizzeria speaking, how can I help you?",
  transcriber: {
    provider: "deepgram",
    model: "nova-2",
    language: "en-US",
  },
  voice: {
    provider: "playht",
    voiceId: "jennifer",
  },
  model: {
    provider: "openai",
    model: "gpt-4o",
    messages: [
      {
        role: "system",
        content: `x`,
      },
    ],
  },
};

const startCallButton = document.getElementById('startCallButton');
const voiceAnimation = document.getElementById('voiceAnimation');
let callActive = false;

// Check that the button was found
console.log('Start/End Call button:', startCallButton);

console.log('About to attach event listener');
startCallButton.addEventListener('click', async () => {
  console.log('Event listener triggered');
  console.log('Button clicked. Call active:', callActive);
  if (!callActive) {
    try {
      startCallButton.disabled = true;
      startCallButton.innerText = 'Connecting...';
      voiceAnimation.classList.add('voice-animation-active');
      await vapi.start(assistantOptions);
      callActive = true;
      startCallButton.innerText = 'End Call';
    } catch (error) {
      console.error('Failed to start call:', error);
      voiceAnimation.classList.remove('voice-animation-active');
      startCallButton.innerText = 'Start Call';
    } finally {
      startCallButton.disabled = false;
    }
  } else {
    vapi.stop();
    callActive = false;
    voiceAnimation.classList.remove('voice-animation-active');
    startCallButton.innerText = 'Start Call';
  }
});
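For the second half of my question, my current guess (unconfirmed) is that the PATCH is applied once to a saved assistant, after which the click handler would start that assistant by its ID instead of passing inline assistantOptions — assuming vapi.start() accepts an assistant ID string, which I haven't verified:

```javascript
// Assumption: vapi.start() also accepts a saved assistant's ID (a string)
// in place of inline options. If so, this would replace
// `await vapi.start(assistantOptions);` in the click handler:
async function startFineTunedCall(vapi, assistantId) {
  return vapi.start(assistantId); // assistantId = the PATCHed assistant
}
```

Is that the intended flow, or does the fine-tuned model config also work with inline options?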