Pulling a model in the Models section spawns a never-ending stream of "Pulling Manifest" notifications that animate down the screen #50

Closed
opened 2025-11-11 14:03:16 -06:00 by GiteaMirror · 10 comments

Originally created by @kevinduffey on GitHub (Nov 22, 2023).

Originally assigned to: @tjbck on GitHub.

Basically as the title says: I go into Models, enter the model name (llama2:7b), and click the download button. A short notification pops up saying "Pulling Manifest". Then that same notification keeps showing up and animating down the screen, thousands of them, like a waterfall down the middle of the page. I expected to see an animated download indicator right below the input box.

When I reload the page, no model is downloaded and everything is back to normal. I verified the server URL is correct.

![Screenshot 2023-11-21 222709](https://github.com/ollama-webui/ollama-webui/assets/13786755/b1254a8d-c1d8-46be-a518-7b193eb25d7b)

So I tried again (in between writing this) and it turns out a) my Docker volume setup was not allowing Ollama to write to it, so I got that working; but b) even with it working, it STILL shows this falling notification as per the image above. It DOES, however, once it stops, indicate the model is done, and I was able to close the dialog, select the model, and see it work.

GiteaMirror added the bug label 2025-11-11 14:03:16 -06:00

@tjbck commented on GitHub (Nov 22, 2023):

Hi, this looks very strange. I'm not sure what would be the cause of this issue, as I cannot reproduce it on any of my local machines (Mac, Linux, Windows). Could you please try running Ollama and the Web UI on other machines if you have any, and see if you can reproduce the issue yourself? Thanks.


@sanasol commented on GitHub (Nov 22, 2023):

Same here.
![Screenshot 2023-11-22 at 23 32 08](https://github.com/ollama-webui/ollama-webui/assets/1709666/0a6a7ceb-6739-4a74-a75d-80948faa6553)

Tested on a MacBook and a Linux server.

Both times opened using Brave on macOS.


@kevinduffey commented on GitHub (Nov 22, 2023):

Not sure why it was closed so fast, but I don't have another machine to run it on. Glad to see I'm not alone with this. I suspect the team should look to do some more testing. At least the model actually pulls down once all those animated waterfall notifications stop, so the download does work. It's not a big deal, just an oddity I guess. At least it's somewhat cool looking; what would be better is if they bounced when they got to the bottom of the screen before disappearing. Adding some physics to it would be slick!
If it helps, I was using Chrome on Windows hitting my Linux IP that was running ollama and ollama-webui. Both of those are running as Docker containers in Portainer.


@sanasol commented on GitHub (Nov 22, 2023):

https://github.com/ollama-webui/ollama-webui/blob/main/src/lib/components/chat/SettingsModal.svelte#L134

There is literally a `while(true)` loop with a `!data.status.includes('downloading')` check.

But the status message is probably always "pulling {hash}" now.
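The check @sanasol points at can be illustrated with a small stand-alone sketch (hypothetical code, not the actual SettingsModal.svelte logic): if older Ollama versions reported progress with a status containing "downloading", the guard suppressed most toasts; a status like "pulling {hash}" never matches, so every streamed chunk spawns a fresh notification.

```javascript
// Hypothetical sketch of the notification loop described above (not the
// real SettingsModal.svelte code). A toast fires for every streamed
// status line that does not contain 'downloading'.
const toasts = [];
const toast = (msg) => toasts.push(msg);

function handleStatus(data) {
  if (!data.status.includes('downloading')) {
    toast(data.status); // fires once per stream chunk
  }
}

// Simulated stream from a newer Ollama: progress is reported as
// 'pulling <hash>', so the guard never matches.
const stream = [
  { status: 'pulling manifest' },
  { status: 'pulling 8934d96d3f08', completed: 10, total: 100 },
  { status: 'pulling 8934d96d3f08', completed: 50, total: 100 },
  { status: 'pulling 8934d96d3f08', completed: 100, total: 100 },
  { status: 'success' },
];
stream.forEach(handleStatus);
console.log(toasts.length); // one toast per chunk: 5
```

A fix along these lines would update a single progress indicator per status change instead of raising a new toast for every chunk.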


@tjbck commented on GitHub (Nov 22, 2023):

Seems like I accidentally closed this issue. Just updated Ollama to the latest version and I'm seeing this issue as well. It will be fixed in a bit. Thanks for bringing this up.


@kevinduffey commented on GitHub (Nov 22, 2023):

Right on. Glad to help.


@tjbck commented on GitHub (Nov 22, 2023):

Hi all, the issue should be fixed now with this merge. Please let me know if the problem persists. Thanks!


@sanasol commented on GitHub (Nov 22, 2023):

![Screenshot 2023-11-23 at 01 05 47](https://github.com/ollama-webui/ollama-webui/assets/1709666/b8fe730f-b49f-4675-b390-6ef86d4a1436)

perfect


@sanasol commented on GitHub (Nov 22, 2023):

But there is another problem for me; maybe some quick fix is possible. I don't want to open a separate issue for it.

Cannot download the 70b llama (other models too, sometimes); tried multiple times, it gets stuck at the end.

Sometimes the server process is killed.
![Screenshot 2023-11-23 at 01 13 08](https://github.com/ollama-webui/ollama-webui/assets/1709666/34ca8e7a-b957-40fa-b58f-385253a3cc0c)
![Screenshot 2023-11-23 at 01 13 30](https://github.com/ollama-webui/ollama-webui/assets/1709666/3e671736-450f-4527-b620-ba8c12372c56)

No CPU/RAM usage after that; it's just stuck.


@sanasol commented on GitHub (Nov 22, 2023):

Nvm, I see there are a lot of issues: https://github.com/jmorganca/ollama/issues?q=stuck

The fix is to restart and retry until it works :)

Or better, [add the ollama noprune flag](https://github.com/jmorganca/ollama/issues/695#issuecomment-1749328329) to make resuming possible after a stall/restart.

[compose.yaml](https://github.com/ollama-webui/ollama-webui/blob/main/compose.yaml)

```diff
 version: '3.6'

 services:
   ollama:
     # Uncomment below for GPU support
     # deploy:
     #   resources:
     #     reservations:
     #       devices:
     #         - driver: nvidia
     #           count: 1
     #           capabilities:
     #             - gpu
     volumes:
       - ollama:/root/.ollama
     # Uncomment below to expose Ollama API outside the container stack
     # ports:
     #   - 11434:11434
     container_name: ollama
     pull_policy: always
     tty: true
     restart: unless-stopped
     image: ollama/ollama:latest
+    environment:
+      OLLAMA_NOPRUNE: true
```

And then

```shell
docker compose restart && docker compose exec -it ollama ollama pull llama2:70b
```

until win.
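The "retry until win" approach above can be sketched as a shell `until` loop. `flaky_pull` here is a hypothetical stand-in for the real `docker compose exec -it ollama ollama pull llama2:70b`, just so the sketch is self-contained:

```shell
# Stand-in for the real pull command (hypothetical): fails twice,
# then succeeds. In practice replace the function body with
#   docker compose exec -it ollama ollama pull llama2:70b
attempts=0
flaky_pull() {
  attempts=$((attempts + 1))
  [ "$attempts" -ge 3 ]  # succeeds on the third try
}

# Retry until the pull exits successfully; with OLLAMA_NOPRUNE set,
# partially downloaded layers survive a container restart, so each
# retry resumes instead of starting over.
until flaky_pull; do
  echo "pull failed (attempt $attempts), retrying..."
done
echo "pulled after $attempts attempts"
```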


Reference: github-starred/open-webui#50