No connection to Windows Ollama #368

Closed
opened 2025-11-11 14:19:21 -06:00 by GiteaMirror · 8 comments

Originally created by @themw123 on GitHub (Feb 27, 2024).

Bug Report

Description

Installed Ollama on native Windows and tested it.
Installed Open WebUI in Docker on a server (same network as the Windows machine) with the following command:

```shell
docker run -d -p 8080:8080 -e OLLAMA_API_BASE_URL=https://192.168.178.23/api -v /my_path_to_data:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

I am facing a blank white screen in the browser after signing in.
In the browser console I am getting the following two errors:

```
Failed to load resource: net::ERR_BLOCKED_BY_CLIENT
Manifest: Line: 1, column: 1, Syntax error.
```

After a while the page loads, but it is not connected to Ollama; I can't see any model that I had installed before.

All but one of the status codes are 200:

`INFO: my_ip:0 - "GET / HTTP/1.1" 304 Not Modified`

From within the Open WebUI container I am able to ping the Windows machine.


@justinh-rahb commented on GitHub (Feb 27, 2024):

Set the `OLLAMA_HOST=0.0.0.0` and `OLLAMA_ORIGINS=*` environment variables for Ollama, [per their FAQ](https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-windows).
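
On native Windows this can be done from a Command Prompt, a minimal sketch (the FAQ also describes doing it via System Properties; Ollama must be restarted afterwards so it picks up the new values):

```shell
:: Persist the variables for the current user (Windows cmd).
:: Quote the * so it reaches setx literally.
setx OLLAMA_HOST 0.0.0.0
setx OLLAMA_ORIGINS "*"
:: Then quit and relaunch the Ollama tray app.
```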


@themw123 commented on GitHub (Feb 28, 2024):

My fault. Now in Postman I can successfully call the models using the IP and port.
But I am still facing the same problems with the WebUI.
Tried a different browser and cleared the cache.
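
The same check can be run from the Open WebUI host with curl (the IP below is the one from this thread; Ollama listens on port 11434 by default):

```shell
# Address from this thread; Ollama's default port is 11434.
OLLAMA_URL="http://192.168.178.23:11434"

# List installed models -- the same call Postman makes:
curl -s "$OLLAMA_URL/api/tags"

# Quick reachability check; the API root replies "Ollama is running":
curl -s "$OLLAMA_URL"
```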


@justinh-rahb commented on GitHub (Feb 28, 2024):

```
Failed to load resource: net::ERR_BLOCKED_BY_CLIENT
Manifest: Line: 1, column: 1, Syntax error.
```

This smells like CORS getting in the way, which setting the `OLLAMA_ORIGINS=*` variable should fix. Did you do that?


@tjbck commented on GitHub (Feb 28, 2024):

Try replacing `https` with `http`.


@justinh-rahb commented on GitHub (Feb 28, 2024):

> Try replacing `https` with `http`.

🤦‍♂️ Missed that, yeah... that'll do it.


@themw123 commented on GitHub (Feb 28, 2024):

Yes, I am using `OLLAMA_ORIGINS=*`.
Also tried with http.
The browser console logs once in a while:
`401 (Unauthorized)`


@themw123 commented on GitHub (Feb 28, 2024):

OMG, now it's working. Changed `http://192.168.178.23/api` to `http://192.168.178.23:11434/api`.
I did it once before, but back then I hadn't set the env variables.
Thanks a lot. I should have gone to bed earlier.


@themw123 commented on GitHub (Feb 28, 2024):

Now it's working with `docker run`, but with Compose I get no answer when asking the model a question (it loads forever), and in the browser console I am getting:
`Uncaught (in promise) SyntaxError: Unexpected token '<', "<!DOCTYPE "... is not valid JSON`

docker run:

```shell
docker run -d --network=host -e OLLAMA_API_BASE_URL=http://192.168.178.23:11434/api -v /my_path_to_data:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

docker compose (note: the original fragment repeated the `network_mode: host` key, which is invalid YAML; it is listed once here):

```yaml
open-webui:
    volumes:
        - './ollama/webui:/app/backend/data'
        - './litellm/config.yaml:/app/backend/data/litellm/config.yaml'
    # with or without network_mode it makes no difference
    network_mode: host
    #ports:
    #    - '8080:8080'
    container_name: open-webui
    image: ghcr.io/open-webui/open-webui:main
    restart: unless-stopped
    environment:
        - OLLAMA_API_BASE_URL=http://192.168.178.23:11434/api
```
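
For reference, a minimal known-good Compose sketch under the assumptions of this thread (Ollama reachable at `192.168.178.23:11434`; volume paths are placeholders). The `SyntaxError: Unexpected token '<'` typically means the frontend received an HTML error page instead of JSON from the backend. Note that the fragment above is missing the top-level `services:` key, and that when `network_mode: host` is used, `ports:` mappings are ignored entirely; pick one or the other:

```yaml
# Hedged sketch, not a definitive config -- adjust IP and paths to your setup.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    restart: unless-stopped
    ports:
      - '8080:8080'   # remove this if you switch to network_mode: host
    volumes:
      - './ollama/webui:/app/backend/data'
    environment:
      - OLLAMA_API_BASE_URL=http://192.168.178.23:11434/api
```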
Reference: github-starred/open-webui#368