mirror of https://github.com/open-webui/open-webui.git, synced 2026-05-06 10:58:17 -05:00
Show Status Updating When Running Local Models over Ollama #3667
Originally created by @TheElo on GitHub (Feb 7, 2025).
The Issue:
When starting a conversation with a local model, it can take some time until the model is loaded. The user does not know how long it will take until the conversation starts, or whether the system has stalled.
The Solution:
Pull and show status updates from Ollama, such as "initializing model", "loading model into RAM", or "creating token cache". Alternatively, if that is not possible, measure the time from the first user prompt until the model responds, store it per model, and show a timer the next time that model is loaded. The "thinking" display could be the ideal place to show this state.
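As a rough illustration of the fallback idea (timing the first response rather than reading real status events from Ollama), a small helper could wrap any streamed response and record how long the first chunk takes to arrive. The helper name and the idea of wrapping Ollama's streaming `/api/generate` output this way are assumptions for the sketch, not existing Open WebUI or Ollama APIs:

```python
import time
from typing import Iterable, Iterator, Tuple


def time_to_first_chunk(stream: Iterable) -> Tuple[float, Iterator]:
    """Consume the first item of a (possibly slow) stream and report how
    long it took to arrive, in seconds.

    Returns (elapsed_seconds, iterator), where the iterator re-yields the
    first chunk followed by the rest of the stream, so the caller can still
    read the full response.

    Hypothetical helper: in Open WebUI this could wrap the chunks coming
    back from Ollama's streaming generate endpoint, so the measured delay
    (mostly model load time on a cold start) could be shown in the
    "thinking" display and remembered per model for the next load.
    """
    it = iter(stream)
    start = time.monotonic()
    first = next(it)  # blocks until the model produces its first chunk
    elapsed = time.monotonic() - start

    def chained() -> Iterator:
        yield first
        yield from it

    return elapsed, chained()
```

The stored `elapsed` value for a model could then drive a countdown or progress estimate next to the spinner the next time that same model is loaded.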
The Benefit:
Personally, I wouldn't have to check whether Docker is running correctly or watch my RAM usage to see if anything is happening. The UI would also feel more "alive" while starting or switching local models.
I don't know whether the APIs actually support this; maybe this feature request should be linked to Ollama.