Baseten Not working with Vapi Custom LLM

Hi there, I am trying to use Baseten's OpenAI-compatible API with Vapi as a custom LLM, and requests are not getting through.

An error occurred in this call: pipeline-error-custom-llm-400-bad-request-validation-failed


In the logs:
Model request failed (attempt #1, reason: 400 "function-after[check_json_schema(), function-after[check_n(), function-after[disallow_beam_search(), DynamoTRTLLMChatCompletionRequest]]].metadata.numAssistantTurns (0): Input should be a valid string, function-after[check_json_schema(), function-after[check_n(), function-af...) (provider: custom-llm, model: meta-llama/Llama-4-Maverick-17B-128E-Instruct, region: unknown, credential: true)
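Reading the validation error, it looks like Vapi is sending `metadata.numAssistantTurns` as an integer (`0`) while Baseten's request schema (`DynamoTRTLLMChatCompletionRequest`) expects metadata values to be strings. As a possible workaround, a small shim in a proxy between Vapi and Baseten could coerce metadata values to strings before forwarding. This is only a sketch based on my reading of the error; the function name and payload shape are my assumptions, not from either API's docs:

```python
def stringify_metadata(payload: dict) -> dict:
    """Coerce all metadata values to strings so the request passes a
    schema that validates metadata fields as strings.

    Hypothetical helper -- the payload shape below is assumed from the
    validation error, not taken from Vapi's or Baseten's documentation.
    """
    metadata = payload.get("metadata")
    if isinstance(metadata, dict):
        payload = {
            **payload,
            "metadata": {k: str(v) for k, v in metadata.items()},
        }
    return payload


# Example request body shaped like what the error message suggests
# Vapi sends (integer metadata value):
request_body = {
    "model": "meta-llama/Llama-4-Maverick-17B-128E-Instruct",
    "messages": [{"role": "user", "content": "hello"}],
    "metadata": {"numAssistantTurns": 0},
}

fixed = stringify_metadata(request_body)
print(type(fixed["metadata"]["numAssistantTurns"]).__name__)  # prints: str
```

If the coercion works, the 400 should go away, which would at least confirm the integer-vs-string mismatch is the cause.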