Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 19:38:46 -05:00)
Pulling a model in the Models section spawns a never-ending stream of "Pulling Manifest" notifications animating down the screen #50
Originally created by @kevinduffey on GitHub (Nov 22, 2023).
Originally assigned to: @tjbck on GitHub.
Basically as the title says.. I go into Models, enter the model name (llama2:7b) and click the download button. It pops up a short notification saying "Pulling Manifest". Then that same notification keeps showing up and animating down the screen, thousands of them I guess. They just keep appearing, like a waterfall down the middle of the page. I expected to see an animated download indicator right below the input box.
When I reload the page, no model is downloaded and everything is back to normal. I verified the server URL is correct.
So I tried again (in between writing this) and it turns out a) my Docker volume setup was not allowing Ollama to write to it, so I got that working; but b) even with that fixed, it STILL shows this falling notification waterfall as per the image above. Once it stops, it DOES indicate the model is done, and I was able to close the dialog, select the model, and see it work.
@tjbck commented on GitHub (Nov 22, 2023):
Hi, this looks very strange. I'm not sure what the cause of this issue would be, as I cannot reproduce it on any of my local machines (Mac, Linux, Windows). Could you please try running Ollama and the Web UI on other machines, if you have any, and see if you can reproduce the issue? Thanks.
@sanasol commented on GitHub (Nov 22, 2023):
Same here

Tested on a MacBook and a Linux server.
Both times opened using Brave on macOS.
@kevinduffey commented on GitHub (Nov 22, 2023):
Not sure why it was closed so fast.. but I don't have another machine to run it on. Glad to see I am not alone with this. I suspect the team should do some more testing. At least the model actually pulls down once all the animated waterfall notifications stop, so the download does work. It's not a big deal, just an oddity I guess. At least it's somewhat cool looking.. what would be better is if they bounced when they got to the bottom of the screen before disappearing. Adding some physics to it would be slick!
If it helps, I was using Chrome on Windows hitting the IP of my Linux box that was running ollama and ollama-webui. Both are running as Docker containers in Portainer.
@sanasol commented on GitHub (Nov 22, 2023):
https://github.com/ollama-webui/ollama-webui/blob/main/src/lib/components/chat/SettingsModal.svelte#L134
There is literally a `while (true)` loop whose exit depends on `!data.status.includes('downloading')`.
But the status message is probably always `pulling {hash}` now, so it never matches.
@tjbck commented on GitHub (Nov 22, 2023):
Seems like I accidentally closed this issue. I just updated Ollama to the latest version and I'm seeing this issue as well. It will be fixed in a bit. Thanks for bringing this up.
@kevinduffey commented on GitHub (Nov 22, 2023):
Right on. Glad to help.
@tjbck commented on GitHub (Nov 22, 2023):
Hi all, the issue should be fixed now with this merge. Please let me know if the problem persists. Thanks!
@sanasol commented on GitHub (Nov 22, 2023):
perfect
@sanasol commented on GitHub (Nov 22, 2023):
But there is another problem for me; maybe some quick fix is possible. I don't want to open a separate issue for it.
I cannot download the 70b llama (other models too, sometimes). I tried multiple times and it got stuck at the end.
Sometimes the server process gets killed.


No CPU/RAM usage after that, it's just stuck.
@sanasol commented on GitHub (Nov 22, 2023):
Nvm, I see there are a lot of issues: https://github.com/jmorganca/ollama/issues?q=stuck
The fix is to restart and retry until it works :)
Or better, add the Ollama noprune flag to make resuming possible after a stall/restart:
compose.yaml
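The original attachment contents aren't preserved here, but a minimal sketch of such a `compose.yaml` tweak, assuming the standard `OLLAMA_NOPRUNE` environment variable (service name, image tag, and volume name below are placeholders, not taken from the thread):

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    environment:
      # Keep partially downloaded blobs on startup so an interrupted
      # `ollama pull` can resume instead of starting over.
      - OLLAMA_NOPRUNE=1
    volumes:
      - ollama:/root/.ollama

volumes:
  ollama:
```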
And then:

```sh
docker compose restart && docker compose exec -it ollama ollama pull llama2:70b
```

until it wins.