Custom LLM server issue: call fails after the stream completes
I am having issues with a custom LLM server. I am getting an error that the LLM has failed, even though I can see the full output of the model in the Call Log. Something is going wrong after the stream from the custom LLM completes, and I do not know how to investigate it.
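My current guess: my understanding is that an OpenAI-compatible streaming endpoint has to close the stream with a final chunk carrying `finish_reason: "stop"`, followed by a literal `data: [DONE]` line, and a server that drops the connection without those can look "failed" to the caller even though all the content arrived. A minimal sketch of what I believe the tail of the stream should look like (the `id` and model name below are placeholders, not values from my actual server):

```python
import json

def sse_tail(model: str = "my-custom-model") -> str:
    """Build the final two SSE events an OpenAI-compatible stream should emit."""
    final_chunk = {
        "id": "chatcmpl-placeholder",      # placeholder id for illustration
        "object": "chat.completion.chunk",
        "model": model,
        "choices": [{
            "index": 0,
            "delta": {},                   # the last chunk carries no content
            "finish_reason": "stop",       # signals normal completion
        }],
    }
    # Each SSE event is "data: <payload>\n\n"; the stream ends with [DONE].
    return f"data: {json.dumps(final_chunk)}\n\ndata: [DONE]\n\n"

print(sse_tail())
```

Is the platform expecting exactly this shape, and is there a way to see how it parsed the end of my stream?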
Help!!!