Stopping generation doesn't actually stop inference #1084

Closed
opened 2025-11-11 14:36:54 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @Arche151 on GitHub (May 31, 2024).

I am running the Docker image with bundled Ollama (CPU inference). When I click the stop button during generation, the text generation stops in the UI, but my CPU remains under as much load as before.

And when I start a new chat, text generation takes forever. Only after restarting the Docker container do I get the same speeds as before.

So clicking stop doesn't actually stop the inference.
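The behavior described suggests the stop button only halts the UI's consumption of the stream, while the backend keeps generating. A minimal sketch of the general pattern a fix would need, using a hypothetical shared cancellation flag (the names `generate` and `stop_event` are illustrative, not Open WebUI or Ollama APIs):

```python
import threading
import time

def generate(stop_event, out):
    # Simulated token loop. A real backend worker must check a
    # cancellation signal like this; if only the UI stops reading,
    # the loop keeps running and the CPU stays loaded.
    for i in range(1000):
        if stop_event.is_set():
            return  # cancellation propagated: generation actually stops
        out.append(f"token{i}")
        time.sleep(0.001)

stop = threading.Event()
tokens = []
worker = threading.Thread(target=generate, args=(stop, tokens))
worker.start()
time.sleep(0.05)
stop.set()      # the UI "stop" button must trigger this, not just close its view
worker.join()
print(len(tokens) < 1000)  # generation halted early
```

In the Docker setup described, the equivalent fix would be for the frontend's stop action to close or abort the upstream request to Ollama so the server can cancel generation, rather than merely ending the client-side stream.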


Reference: github-starred/open-webui#1084