Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 19:38:46 -05:00)
[GH-ISSUE #9128] Cannot Connect to Ollama Chat Model in n8n #15395
Originally created by @inside-mo on GitHub (Jan 30, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/9128
Bug Report
Installation Method
Self-hosted on a VPS with Coolify.
Environment
Open WebUI Version: 0.0.57
Operating System: Ubuntu
Browser (if applicable): Brave 1.74.78
Confirmation:
Expected Behavior:
After adding the base URL of Ollama to the "Ollama Chat Model" node in n8n, the Ollama models are available in the dropdown.
Actual Behavior:
I can add the base URL of Ollama to the "Ollama Chat Model" node, but no models are displayed in the dropdown, even though the connection tests successfully:
Description
Bug Summary:
I can connect to my locally hosted Ollama with the n8n "Ollama Chat Model" node, but I cannot access the models.
Reproduction Details
Steps to Reproduce:
1. Install Coolify.
2. Install the Ollama service.
3. Create a subdomain DNS entry for Ollama, e.g. ollama.website.com.
4. Install n8n.
5. Log in to n8n.
6. Select the Ollama Chat Model node.
7. Click on "create a new login credential".
8. Insert the base URL.
9. Save the connection.
10. Attempt to select an AI model.
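One way to narrow down where the chain breaks is to query Ollama's model-listing endpoint directly, since n8n typically populates the model dropdown from GET /api/tags. This is a minimal sketch; ollama.website.com is the example subdomain from the steps above, not a real endpoint:

```python
import json
import urllib.request


def list_ollama_models(base_url: str) -> list[str]:
    """Fetch model names from Ollama's /api/tags endpoint.

    An empty result means Ollama is reachable but has no models
    pulled yet -- which would leave n8n's dropdown empty even
    though the connection test passes.
    """
    url = f"{base_url.rstrip('/')}/api/tags"
    with urllib.request.urlopen(url, timeout=10) as resp:
        payload = json.load(resp)
    return [model["name"] for model in payload.get("models", [])]
```

Running this against the same base URL configured in the n8n credential shows whether the model list itself is empty or the endpoint is unreachable through the subdomain.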
Logs and Screenshots
Browser Console Logs:
[Include relevant browser console logs, if applicable]
Docker Container Logs:
[Include relevant Docker container logs, if applicable]
Screenshots/Screen Recordings (if applicable):

Additional Information
Here's the docker compose file:
services:
  ollama-api:
    image: 'ollama/ollama:latest'
    volumes:
      - 'ollama:/root/.ollama'
    ports:
      - '11434:11434'
    healthcheck:
      test:
        - CMD
        - ollama
        - list
      interval: 5s
      timeout: 30s
      retries: 10
  open-webui:
    image: 'ghcr.io/open-webui/open-webui:main'
    volumes:
      - 'open-webui:/app/backend/data'
    depends_on:
      - ollama-api
    environment:
      - SERVICE_FQDN_OLLAMA_8080
      - 'OLLAMA_BASE_URL=http://ollama-api:11434'
    healthcheck:
      test:
        - CMD
        - curl
        - '-f'
        - 'http://127.0.0.1:8080'
      interval: 5s
      timeout: 30s
      retries: 10
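One detail worth noting about the compose file: the ollama-api healthcheck runs `ollama list`, which succeeds even when zero models have been pulled into the container, and a connection test can likewise pass against an empty instance. That matches the symptom of "connection OK, dropdown empty". A hedged sketch of a stricter check (the /api/tags endpoint is Ollama's documented model list; the base URL is an assumption to substitute):

```python
import json
import urllib.error
import urllib.request


def diagnose_ollama(base_url: str) -> str:
    """Distinguish 'Ollama unreachable' from 'reachable but no models pulled'."""
    url = f"{base_url.rstrip('/')}/api/tags"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            models = json.load(resp).get("models", [])
    except (urllib.error.URLError, OSError) as exc:
        return f"unreachable: {exc}"
    if not models:
        # Reachable, but the instance has no models; pull one first,
        # e.g. `docker exec <container> ollama pull llama3`.
        return "reachable, but no models pulled"
    return "ok: " + ", ".join(m["name"] for m in models)
```

If this reports "reachable, but no models pulled", the fix is to pull a model into the ollama-api container rather than to change anything in n8n.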
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!