Open WebUI Fails to Connect to Local Ollama Instance After Update Without Ollama Running #1108

Closed
opened 2025-11-11 14:37:43 -06:00 by GiteaMirror · 6 comments
Owner

Originally created by @Foadsf on GitHub (Jun 3, 2024).

As a follow-up to [this question](https://discord.com/channels/1121039057993089076/1131623060315844710/1247072529365991424):

Description

Bug Summary:
After updating and running Open WebUI through Pinokio without running Ollama first, Open WebUI is no longer able to communicate with my local Ollama instance. This has caused the application to fail to start correctly, resulting in a black screen when accessing `http://127.0.0.1:8000/`.

Steps to Reproduce:

  1. Install Open WebUI through [Pinokio](https://program.pinokio.computer/#/?id=install) and Ollama via winget.
  2. Ensure Open WebUI is running smoothly.
  3. Without starting Ollama, update and run Open WebUI through Pinokio once.
  4. Attempt to restart Open WebUI with Ollama running.
  5. Observe the black screen and failure to connect to Ollama.

Expected Behavior:
Open WebUI should connect to Ollama and function correctly even if Ollama was not started before updating Open WebUI. Ideally, updating Open WebUI should not affect its ability to communicate with Ollama.

Actual Behavior:
Open WebUI fails to communicate with the local Ollama instance, resulting in a black screen and failure to operate as expected.

Environment

  • Open WebUI Version: 0.2.2

  • Ollama: 0.1.41

  • Operating System: Microsoft Windows [Version 10.0.19045.4412]

  • Browser: Microsoft Edge

Reproduction Details

Confirmation:

  - [ ] I have read and followed all the instructions provided in the README.md.
  - [ ] I am on the latest version of both Open WebUI and Ollama.
  - [x] I have included the browser console logs.
  - [ ] I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:

  1. Open Microsoft Edge and press F12 to open Developer Tools.
  2. Go to the Console tab.
  3. Clear any existing logs using the clear button.
  4. Refresh the page and reproduce the issue.
  5. Right-click within the Console window, select Save as..., and save the logs to a file.
  6. Attach the saved file [here](https://github.com/user-attachments/files/15530382/127.0.0.1-1717400344351.log).

Docker Container Logs:
N/A

Screenshot:
![Pinokio_XF4NC8QgxF](https://github.com/open-webui/open-webui/assets/12762442/da0117f0-77d7-4a2f-b179-2ddd7ca885ad)

Installation Method

Installed Open WebUI through Pinokio and Ollama via winget.

Additional Information

I believe the issue arose because I forgot to start Ollama before updating Open WebUI through Pinokio. This might have changed some settings, preventing Open WebUI from communicating with Ollama.

The most important aspect for me is to restore my previous chats. I prefer not to reinstall Open WebUI, or if necessary, I need a way to back up my previous chats.
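To preserve chats before any reinstall, one option is to back up Open WebUI's SQLite database. A minimal sketch, assuming the default `webui.db` store (its exact location under a Pinokio install is an assumption and should be verified first):

```python
import sqlite3
from pathlib import Path

def backup_webui_db(src: Path, dest: Path) -> None:
    """Copy an Open WebUI SQLite database using SQLite's online backup API,
    which is safe even if the application still holds the file open."""
    with sqlite3.connect(src) as source, sqlite3.connect(dest) as target:
        source.backup(target)

# Paths are assumptions -- locate webui.db inside your Pinokio install first.
# backup_webui_db(Path("webui.db"), Path("webui-backup.db"))
```

Restoring would then be a matter of copying the backup file back into place while Open WebUI is stopped.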


@phonosys commented on GitHub (Jun 3, 2024):

I'm having the same issue; I don't think it's related to Ollama's status (mine is running and working).

```
raise ClientResponseError(
aiohttp.client_exceptions.ClientResponseError: 404, message='Not Found', url=URL('https://api.openai.com/v1/api/models')
```

Looks like an error-handling issue?
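The doubled segment in that URL (`/v1/api/models`) hints at a misconfigured OpenAI base URL rather than Ollama itself. A hypothetical sketch of how such a path could be produced (the helper and the URL-joining behavior are assumptions for illustration, not Open WebUI's actual code):

```python
# Hypothetical illustration of the failure mode behind the 404 above: if the
# configured OpenAI-compatible base URL already ends in an extra "/api",
# appending "/models" yields the bad "/v1/api/models" path.

def models_url(base: str) -> str:
    """Join a base URL with the /models endpoint (helper name is made up)."""
    return base.rstrip("/") + "/models"

print(models_url("https://api.openai.com/v1"))      # correct base -> .../v1/models
print(models_url("https://api.openai.com/v1/api"))  # misconfigured base -> .../v1/api/models (404)
```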


@Foadsf commented on GitHub (Jun 3, 2024):

@phonosys, thanks for the input. I suspect the issue might be that Open WebUI is unable to communicate with Ollama correctly and defaults to the OpenAI API.

To troubleshoot, I want to:

  1. Verify if Ollama is running and accessible. Check if it's exposed at the configured IP address and port.
  2. Ensure the `config.yaml` or relevant configuration file in Open WebUI correctly points to Ollama.

If Ollama runs on a different port or IP than expected, I need to update the Open WebUI configuration to match. I'll also be sure to check the error handling as you suggested.

Any tips on verifying Ollama's status would be appreciated.
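For step 1, one quick check is to query Ollama's `/api/tags` endpoint, which lists the locally installed models. A minimal sketch (the default host and port `127.0.0.1:11434` are an assumption; adjust if Ollama is configured differently):

```python
import json
import urllib.request
from urllib.error import URLError

def parse_models(payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]

def check_ollama(base_url: str = "http://127.0.0.1:11434") -> list[str]:
    """Return installed model names, or raise URLError if Ollama is unreachable."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return parse_models(json.load(resp))

if __name__ == "__main__":
    try:
        print("Ollama is up; models:", check_ollama())
    except URLError as exc:
        print("Ollama unreachable:", exc)
```

An empty model list with a successful connection would point at the Open WebUI configuration rather than Ollama itself.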


@phonosys commented on GitHub (Jun 3, 2024):

On my end Ollama runs just fine; I can run and switch models using the configured IP and port.
On the Open WebUI side: an empty model list (Open WebUI is unable to communicate with Ollama correctly, like you mentioned):

```
INFO:apps.openai.main:get_all_models()
None
```


@Foadsf commented on GitHub (Jun 3, 2024):

Asked a new question [here](https://discord.com/channels/1170866489302188073/1247114046549921812/1247114046549921812) on Discord.


@davidlight2018 commented on GitHub (Jun 3, 2024):

Got the same problem.

> On my end Ollama runs just fine, can run and switch models using the configured IP and port. on the open-webui side: empty model list (Open WebUI is unable to communicate with Ollama correctly like you mentioned) INFO:apps.openai.main:get_all_models() None


@Foadsf commented on GitHub (Jun 3, 2024):

I ended up uninstalling Open WebUI and reinstalling everything. I lost all my previous information! 😞


Reference: github-starred/open-webui#1108