Streaming issue using a custom LLM

Using the web interface, I created a bot and connected it to a custom LLM endpoint. Streaming is recommended and sometimes works, but the response often gets truncated, like this:
(screenshot: truncated response — image.png)
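In case it helps narrow this down, here is a minimal sketch of how an OpenAI-style SSE stream can be checked for clean termination. The payload shape and the `[DONE]` sentinel are assumptions about the endpoint's format, not confirmed details of my setup; a stream that ends without the sentinel is one common cause of responses that look truncated.

```python
import json

def assemble_stream(lines):
    """Assemble raw SSE 'data:' lines into (text, complete).

    complete is True only if the stream ended with the '[DONE]'
    sentinel; False suggests the connection was cut off early.
    (Sketch only -- payload shape is an assumption.)
    """
    parts = []
    complete = False
    for line in lines:
        if not line.startswith("data:"):
            continue  # skip SSE comments / keep-alive lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            complete = True
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"].get("content", "")
        parts.append(delta)
    return "".join(parts), complete

# Example: a stream cut off before the [DONE] sentinel
sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
]
text, complete = assemble_stream(sample)
print(text, complete)  # Hello False
```

Running something like this against the raw stream would show whether the endpoint is closing early or the web interface is dropping the tail of a complete stream.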