Mirror of https://github.com/open-webui/open-webui.git, synced 2026-03-22 06:02:06 -05:00
Open WebUI Fails to Connect to Local Ollama Instance After Update Without Ollama Running #1108
Originally created by @Foadsf on GitHub (Jun 3, 2024).
As a follow-up to this question:
Description
Bug Summary:
After updating and running Open WebUI through Pinokio without running Ollama first, Open WebUI is no longer able to communicate with my local Ollama instance. This has caused the application to fail to start correctly, resulting in a black screen when accessing http://127.0.0.1:8000/.
Steps to Reproduce:
Expected Behavior:
Open WebUI should connect to Ollama and function correctly even if Ollama was not started before updating Open WebUI. Ideally, updating Open WebUI should not affect its ability to communicate with Ollama.
Actual Behavior:
Open WebUI fails to communicate with the local Ollama instance, resulting in a black screen and failure to operate as expected.
Environment
Open WebUI Version: 0.2.2
Ollama Version: 0.1.41
Operating System: Microsoft Windows [Version 10.0.19045.4412]
Browser: Microsoft Edge
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
Press F12 to open Developer Tools, select the Console tab, right-click, choose Save as..., and save the logs to a file.
Docker Container Logs:
N/A
Screenshot:

Installation Method
Installed Open WebUI through Pinokio and Ollama via winget.
Additional Information
I believe the issue arose because I forgot to start Ollama before updating Open WebUI through Pinokio. This might have changed some settings, preventing Open WebUI from communicating with Ollama.
The most important aspect for me is to restore my previous chats. I prefer not to reinstall Open WebUI, or if necessary, I need a way to back up my previous chats.
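For what it's worth, Open WebUI keeps its chats in a SQLite database, typically a single file named webui.db inside the app's data directory; the exact location under a Pinokio install is an assumption here, so you would need to search for the file first. A minimal backup sketch, assuming you can locate it:

```python
import pathlib
import shutil

def backup_webui_db(data_dir: str, backup_dir: str) -> pathlib.Path:
    """Copy Open WebUI's SQLite database to a backup location.

    The filename 'webui.db' matches Open WebUI's default; the data
    directory varies by install method (Docker, pip, Pinokio), so
    pass the directory that actually contains the file.
    """
    src = pathlib.Path(data_dir) / "webui.db"
    if not src.exists():
        raise FileNotFoundError(f"no webui.db found in {data_dir}")
    dst_dir = pathlib.Path(backup_dir)
    dst_dir.mkdir(parents=True, exist_ok=True)
    dst = dst_dir / "webui.db.bak"
    shutil.copy2(src, dst)  # copy2 preserves file timestamps
    return dst
```

Restoring is the reverse copy while Open WebUI is stopped; copying while the server is actively writing can produce an inconsistent snapshot.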
@phonosys commented on GitHub (Jun 3, 2024):
I'm having the same issue; I don't think it's related to Ollama's status (mine is running and working).
raise ClientResponseError(
aiohttp.client_exceptions.ClientResponseError: 404, message='Not Found', url=URL('https://api.openai.com/v1/api/models')
looks like error handling issue?
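The failing URL in that traceback is suggestive: the OpenAI model-list endpoint is normally https://api.openai.com/v1/models, but the log shows /v1/api/models, which is exactly what results if a stray /api segment ends up in the configured base URL. A small illustration (the models_url helper is hypothetical, just to show the path-joining effect, not Open WebUI's actual code):

```python
def models_url(base_url: str) -> str:
    """Hypothetical helper: build the model-list URL the way an
    OpenAI-compatible client typically does, by appending /models
    to the configured base URL."""
    return base_url.rstrip("/") + "/models"

# A correctly configured base URL yields the real endpoint:
print(models_url("https://api.openai.com/v1"))
# -> https://api.openai.com/v1/models

# A base URL with an extra /api segment reproduces the 404 URL from the log:
print(models_url("https://api.openai.com/v1/api"))
# -> https://api.openai.com/v1/api/models
```

If that's what happened, checking the OpenAI API base URL setting for a trailing /api would be a quick thing to rule out.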
@Foadsf commented on GitHub (Jun 3, 2024):
@phonosys, thanks for the input. I suspect the issue might be that Open WebUI is unable to communicate with Ollama correctly and defaults to the OpenAI API.
To troubleshoot, I want to:
Verify that config.yaml or the relevant configuration file in Open WebUI correctly points to Ollama. If Ollama runs on a different port or IP than expected, I need to update the Open WebUI configuration to match. I'll also be sure to check the error handling as you suggested.
Any tips on verifying Ollama's status would be appreciated.
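One way to verify Ollama independently of Open WebUI is to hit its HTTP API directly: by default it listens on 127.0.0.1:11434, and GET /api/tags returns the locally installed models. A minimal reachability check, as a sketch:

```python
import json
import urllib.error
import urllib.request

def check_ollama(base_url: str = "http://127.0.0.1:11434") -> bool:
    """Return True if an Ollama server answers /api/tags at base_url."""
    try:
        with urllib.request.urlopen(base_url + "/api/tags", timeout=3) as resp:
            data = json.load(resp)
            # A healthy server returns a JSON object with a "models" list.
            return "models" in data
    except (urllib.error.URLError, OSError, ValueError):
        return False
```

If this returns False while `ollama list` works on the command line, Ollama may be bound to a different host or port than Open WebUI expects, which would point back at the configuration.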
@phonosys commented on GitHub (Jun 3, 2024):
On my end Ollama runs just fine, can run and switch models using the configured IP and port.
On the open-webui side:
an empty model list (Open WebUI is unable to communicate with Ollama correctly, as you mentioned)
INFO:apps.openai.main:get_all_models()
None
@Foadsf commented on GitHub (Jun 3, 2024):
I asked a new question here on Discord.
@davidlight2018 commented on GitHub (Jun 3, 2024):
Got the same problem.
@Foadsf commented on GitHub (Jun 3, 2024):
I ended up uninstalling Open WebUI and reinstalling everything from scratch. I lost all my previous information! 😞