Uh-oh! There was an issue connecting to Ollama. #105

Closed
opened 2025-11-11 14:05:48 -06:00 by GiteaMirror · 18 comments

Originally created by @ralyodio on GitHub (Dec 18, 2023).

Uh-oh! There was an issue connecting to Ollama.

@ralyodio commented on GitHub (Dec 18, 2023):

is this on my side? or is ollama.ai servers down?

@tjbck commented on GitHub (Dec 18, 2023):

Hi, we can't help you with the limited information you've included. Please provide the commands you used to install, your browser console logs, and your Docker container logs. Also, please make sure that you have followed and checked the troubleshooting guide provided in the readme.md. Thanks.

@justinh-rahb commented on GitHub (Dec 18, 2023):

> is this on my side? or is ollama.ai servers down?

@ralyodio The Ollama WebUI does not interact with ollama.ai servers, but rather only communicates with a locally running instance of Ollama on the same machine or another machine via API. Therefore, if there is an issue, it would be specific to your local installation and not related to any third-party servers on the internet.

@ralyodio commented on GitHub (Dec 19, 2023):

I can't seem to get it to work without Docker, and even with Docker it worked the first time I ran the build, but after that I always get a server connection failure.

@ralyodio commented on GitHub (Dec 19, 2023): i can't sseem to get it to work w/o using docker and even with docker it worked the first time I ran build, but after that I always get server connection fail
Author
Owner

@ralyodio commented on GitHub (Dec 19, 2023):

    [2023-12-19T00:46:09.652Z] "GET /ollama/api/tags" Error (404): "connect ECONNREFUSED ::1:5432"

This is what I get in the logs.
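The `ECONNREFUSED ::1:5432` part of that log is worth noting: `::1` is the IPv6 loopback, so something resolved `localhost` to IPv6 and found nothing listening on port 5432 there. A minimal sketch of that failure mode (using a hypothetical throwaway port, not this setup): a server bound only to the IPv4 loopback refuses connections addressed to `::1`.

```shell
# Start a throwaway HTTP server bound only to the IPv4 loopback.
python3 -m http.server 18080 --bind 127.0.0.1 >/dev/null 2>&1 &
SRV=$!
sleep 1

# Reaching it via 127.0.0.1 works:
curl -s -o /dev/null -w '%{http_code}\n' http://127.0.0.1:18080/   # prints 200

# Reaching it via the IPv6 loopback is refused -- the same
# "connect ECONNREFUSED ::1:<port>" pattern a Node client reports:
curl -s http://[::1]:18080/ >/dev/null 2>&1 || echo refused        # prints refused

kill $SRV
```

If that is what's happening here, using `127.0.0.1` explicitly instead of `localhost` in the relevant URL usually sidesteps it.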

@ralyodio commented on GitHub (Dec 19, 2023):

Here's what I'm doing:

    OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve

in another shell I run:

    OLLAMA_API_BASE_URL=http://localhost:11434 PORT=5432 npm start

and when I go to https://ai.profullstack.com the UI loads but can't connect to the Ollama server. In the Ollama server log I see a 404 for /api/v1/ but that's it.

Here's my nginx config in case it helps:

    location /api {
#        auth_basic "ai access required";
#        auth_basic_user_file /etc/nginx/.htpasswd;
        proxy_pass http://localhost:11434/api;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_cache_bypass $http_upgrade;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto https;
        proxy_set_header X-Forwarded-For $remote_addr;
    }

    location / {
#        auth_basic "ai access required";
#        auth_basic_user_file /etc/nginx/.htpasswd;
        proxy_pass http://localhost:5432;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_cache_bypass $http_upgrade;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto https;
        proxy_set_header X-Forwarded-For $remote_addr;
    }
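One hedged tweak worth trying with a config like this: spell out the IPv4 loopback instead of `localhost` in `proxy_pass` (and in `OLLAMA_API_BASE_URL`), so an IPv6-first resolution of `localhost` can't send traffic to `::1` where nothing is listening. A sketch of the first location block with only that change (not the project's documented config, just an illustration):

```nginx
location /api {
    # Explicit IPv4 loopback: avoids the case where "localhost"
    # resolves to ::1 while the upstream listens only on IPv4.
    proxy_pass http://127.0.0.1:11434/api;
    proxy_http_version 1.1;
    proxy_set_header Host $host;
}
```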
@tjbck commented on GitHub (Dec 19, 2023):

Please refer to https://github.com/ollama-webui/ollama-webui#how-to-install-without-docker to install locally and fully understand all the ollama-webui components. Thanks.

@ralyodio commented on GitHub (Dec 19, 2023):

Yeah I went through all that.

@tjbck commented on GitHub (Dec 19, 2023):

Hmm, I have no idea how exactly you have Ollama/Ollama WebUI set up, and if you followed all the steps outlined in the readme.md, you should have the WebUI up and running.

Also, I cannot reproduce this at all with the given information. Could you tell us more about your setup, in as much detail as possible, so that we can better help you with your issue? Thanks.

@oliverbob commented on GitHub (Dec 19, 2023):

> Yeah I went through all that.

Pull the latest ollama-webui and try the build method.

First, remove/kill both Ollama and ollama-webui:

- If Ollama is not running in Docker: `sudo systemctl stop ollama`
- If Ollama is running in Docker: `docker rm ollama ollama-webui`

Then rebuild and start:

    docker compose up -d --build

Optional: in the same terminal, try `ollama pull orca-mini:3b`.

Then open ollama-webui at http://localhost:8080

@ralyodio commented on GitHub (Dec 19, 2023):

As I've stated, I'm not using Docker.

@tjbck commented on GitHub (Dec 19, 2023):

We do not support the command `npm start` for deployment; it is strictly for development and testing purposes. Do you also have the backend running as outlined in the readme.md?

@ralyodio commented on GitHub (Dec 19, 2023):

Yes, `npm run dev` is for development.

@ralyodio commented on GitHub (Dec 19, 2023):

I figured it out. I think it's a bug with using `npm start`.

I had to change the settings in the UI to use `/api` instead of `/ollama/api` and it connected!

@tjbck commented on GitHub (Dec 19, 2023):

It seems like you have not read the instructions here: https://github.com/ollama-webui/ollama-webui#how-to-install-without-docker, and you're using an unsupported method of installation, so I'll close this issue.

If you'd like to follow the instructions provided, refer to the screenshot of the readme.md I took below. Thanks.
![image](https://github.com/ollama-webui/ollama-webui/assets/25473318/a8e19067-5202-486f-abae-e01dd44c24c4)

@ralyodio commented on GitHub (Dec 19, 2023):

Your instructions suck ass and you got a bug anyway, but hey, what do I know? I've only been doing this 25 years.

@tjbck commented on GitHub (Dec 19, 2023):

Again, to reiterate: `npm start` is not a supported method of deploying the WebUI, and it won't work unless you configure everything the right way. Thank you.

@oliverbob commented on GitHub (Dec 20, 2023):

Timothy,

> Again, to reiterate `npm start` is not a supported method of deploying the webui, and it won't work unless you configure everything the right way. Thank you.

Please remove only the abusive parts of comments, not the entire comments. I've been reading the convo in my email. Let's be open to developer frustrations; it happens at times.

Anyway, you've done really great work, TJBCK. You're perhaps even better than Sam Altman. Cheers my friend. 👍

Thanks a lot.


Reference: github-starred/open-webui#105