[GH-ISSUE #9128] Cannot Connect to Ollama Chat Model in n8n #15395

Closed
opened 2026-04-19 21:36:58 -05:00 by GiteaMirror · 0 comments

Originally created by @inside-mo on GitHub (Jan 30, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/9128

Bug Report


Installation Method

Self-hosted on VPS with Coolify.

Environment

  • Open WebUI Version: 0.0.57

  • Operating System: Ubuntu

  • Browser (if applicable): Brave 1.74.78

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [ ] I have included the Docker container logs.
  • [ ] I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

Upon adding Ollama's base URL to the "Ollama Chat Model" node in n8n, the Ollama models are available in the dropdown menu.

Actual Behavior:

I can add Ollama's base URL to the "Ollama Chat Model" node, but no models are displayed in the dropdown, even though the connection test reports success (see the screenshots below).

Description

Bug Summary:
I can connect to my locally hosted Ollama from the n8n Ollama Chat Model node, but I cannot access the models.
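
To make the gap concrete: as far as I can tell, the node's connection test only needs the base URL to answer at all, while the dropdown has to be filled from Ollama's documented model-listing endpoint, GET /api/tags. Below is a minimal Python sketch that checks the two things separately; the assumption that the dropdown is fed from /api/tags is mine, and the base URL is the placeholder subdomain from the repro steps, not a verified value:

```python
# Minimal sketch: separate "connection test passes" from "models are listed".
# Assumption: the n8n dropdown is fed from Ollama's documented GET /api/tags.
import json
import urllib.error
import urllib.request

BASE_URL = "https://ollama.website.com"  # placeholder from the repro steps


def reachable(url: str) -> bool:
    """Rough stand-in for a credential 'test': did the URL answer at all?"""
    try:
        with urllib.request.urlopen(url, timeout=5):
            return True
    except urllib.error.HTTPError:
        return True   # an HTTP error status is still an answer
    except OSError:
        return False  # DNS failure, connection refused, timeout


def list_models(url: str) -> list[str]:
    """What the dropdown needs: model names from GET /api/tags."""
    with urllib.request.urlopen(f"{url}/api/tags", timeout=5) as resp:
        return [m["name"] for m in json.load(resp).get("models", [])]


if __name__ == "__main__":
    print("base URL reachable:", reachable(BASE_URL))
    print("models via /api/tags:", list_models(BASE_URL))
```

If the first line prints True while the second raises or prints an empty list, that is exactly the symptom above: a passing connection test with an empty dropdown.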

Reproduction Details

Steps to Reproduce:

  1. Install Coolify.
  2. Install the Ollama service.
  3. Create a subdomain DNS entry for Ollama, e.g. ollama.website.com.
  4. Install n8n.
  5. Log in to n8n.
  6. Select the Ollama Chat Model node.
  7. Click on create a new login credential.
  8. Insert the base URL (see the probe sketch after this list).
  9. Save the connection.
  10. Attempt to select an AI model.
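
Because the stack sits behind Coolify's proxy, there are at least two candidate base URLs for step 8: the public subdomain from step 3 and the host-published port from the compose file further down. Here is a hedged probe to see which candidate actually answers as Ollama; both URL values are placeholders, and the check relies on the fact that Ollama's root endpoint replies with the plain text "Ollama is running":

```python
# Probe which candidate base URL really serves the Ollama API.
# An HTML reply suggests the domain is routed elsewhere (e.g. a web UI).
import urllib.request

CANDIDATES = [
    "https://ollama.website.com",  # subdomain from step 3 (placeholder)
    "http://203.0.113.10:11434",   # VPS IP + host port (placeholder)
]

for base in CANDIDATES:
    try:
        with urllib.request.urlopen(base, timeout=5) as resp:
            body = resp.read(200).decode(errors="replace")
        print(f"{base}: HTTP {resp.status}, Ollama: {'Ollama is running' in body}")
    except OSError as exc:
        print(f"{base}: unreachable ({exc})")
```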

Logs and Screenshots

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]

Screenshots/Screen Recordings (if applicable):
Image: https://github.com/user-attachments/assets/9919fd2a-11ce-4c96-8acb-6138df883de5

Image: https://github.com/user-attachments/assets/d618a228-cfc2-4a3f-b970-2f94ecc051b9

Additional Information

Here's the docker compose file:
```yaml
services:
  ollama-api:
    image: 'ollama/ollama:latest'
    volumes:
      - 'ollama:/root/.ollama'
    ports:
      - '11434:11434'  # Ollama API published on the host
    healthcheck:
      test:
        - CMD
        - ollama
        - list
      interval: 5s
      timeout: 30s
      retries: 10
  open-webui:
    image: 'ghcr.io/open-webui/open-webui:main'
    volumes:
      - 'open-webui:/app/backend/data'
    depends_on:
      - ollama-api
    environment:
      # Coolify "magic" variable; note that it references port 8080, not 11434
      - SERVICE_FQDN_OLLAMA_8080
      - 'OLLAMA_BASE_URL=http://ollama-api:11434'
    healthcheck:
      test:
        - CMD
        - curl
        - '-f'
        - 'http://127.0.0.1:8080'
      interval: 5s
      timeout: 30s
      retries: 10

# Named volume declarations; Coolify may inject these automatically, added
# here so the file also validates standalone with `docker compose up`.
volumes:
  ollama:
  open-webui:
```
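
Two hedged observations on this file. First, SERVICE_FQDN_OLLAMA_8080 sits on the open-webui service and names port 8080; if I read Coolify's magic variables correctly, that would route ollama.website.com to Open WebUI rather than to the Ollama API on 11434, which could let a plain connection test pass while /api/tags returns nothing useful. Second, the file starts Ollama on a fresh volume but never pulls a model, and an empty model store also produces an empty dropdown. The sketch below rules out the second case by pulling one model through Ollama's documented POST /api/pull endpoint and re-listing /api/tags; the model name and the use of the host port are assumptions:

```python
# Pull one model via the Ollama API, then confirm it appears in /api/tags.
# Run on the VPS itself; model name and port mapping are assumptions.
import json
import urllib.request

BASE_URL = "http://127.0.0.1:11434"  # host port published by the compose file
MODEL = "llama3.2"                   # placeholder; any model from the library

req = urllib.request.Request(
    f"{BASE_URL}/api/pull",
    data=json.dumps({"model": MODEL}).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    for line in resp:  # the endpoint streams JSON status lines
        if json.loads(line).get("status") == "success":
            break

with urllib.request.urlopen(f"{BASE_URL}/api/tags") as resp:
    print("installed models:", [m["name"] for m in json.load(resp)["models"]])
```

If the dropdown stays empty even after this prints the pulled model, the routing explanation above becomes the more likely culprit.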

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

Reference: github-starred/open-webui#15395