[GH-ISSUE #21225] issue: Ollama API connection attempts when ENABLE_OLLAMA_API is False (url_idx endpoints) #58083

Closed
opened 2026-05-05 22:18:33 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @tosfos on GitHub (Feb 6, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/21225

Bug Description

When ENABLE_OLLAMA_API is set to False, Open WebUI still attempts to connect to Ollama API endpoints when accessing routes that include a url_idx parameter, causing connection errors even though Ollama APIs are explicitly disabled.

Error Log

2026-01-30 21:09:33.627 | ERROR | open_webui.routers.ollama:send_get_request:98 - Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Name or service not known]

Root Cause

The issue occurs in endpoints that accept an optional url_idx parameter. When url_idx is None, these endpoints correctly call get_all_models() which checks ENABLE_OLLAMA_API before making requests. However, when url_idx is provided, the code directly accesses OLLAMA_BASE_URLS[url_idx] and makes HTTP requests without first checking the ENABLE_OLLAMA_API flag.

Affected Endpoints

  1. GET /ollama/api/tags/{url_idx} (lines 440-490 in backend/open_webui/routers/ollama.py)

    • When url_idx is provided, it directly accesses OLLAMA_BASE_URLS[url_idx] at line 450 without checking ENABLE_OLLAMA_API
    • Only when url_idx is None does it call get_all_models() which properly checks the flag
  2. GET /openai/models/{url_idx} (line 543+ in backend/open_webui/routers/openai.py)

    • Similar issue: when url_idx is provided, it accesses OPENAI_API_BASE_URLS[url_idx] without checking ENABLE_OPENAI_API

Code Evidence

Buggy code in get_ollama_tags (lines 447-450):

if url_idx is None:
    models = await get_all_models(request, user=user)  # ✅ This checks ENABLE_OLLAMA_API
else:
    url = request.app.state.config.OLLAMA_BASE_URLS[url_idx]  # ❌ No check!
    # ... makes request without checking ENABLE_OLLAMA_API

Correct implementation in get_ollama_versions (line 558):

async def get_ollama_versions(request: Request, url_idx: Optional[int] = None):
    if request.app.state.config.ENABLE_OLLAMA_API:  # ✅ Checks flag first
        if url_idx is None:
            # ...
        else:
            # ... makes request
    else:
        # ... returns appropriate error
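The asymmetry between the two branches can be seen in a stripped-down model of the code paths. This is an illustrative sketch, not Open WebUI's actual code: the flag and URL-list names mirror the config values above, but the functions themselves are simplified stand-ins.

```python
from typing import Optional

# Stand-ins for request.app.state.config values
ENABLE_OLLAMA_API = False
OLLAMA_BASE_URLS = ["http://host.docker.internal:11434"]


def get_all_models():
    # Safe path: consults the flag before touching any upstream URL
    if not ENABLE_OLLAMA_API:
        return []
    return ["model-from-upstream"]


def get_tags_buggy(url_idx: Optional[int] = None):
    if url_idx is None:
        return get_all_models()  # flag is honored on this branch
    # Bug: the flag is never consulted on this branch, so the URL is
    # resolved (and, in the real code, an HTTP request is fired)
    url = OLLAMA_BASE_URLS[url_idx]
    return f"would connect to {url}"


print(get_tags_buggy())   # disabled flag respected: returns []
print(get_tags_buggy(0))  # connection attempt despite the disabled flag
```

Running this shows the disabled flag being respected only when `url_idx` is omitted, which is exactly the behavior the error log above captures.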

Steps to Reproduce

  1. Set ENABLE_OLLAMA_API=False in environment variables
  2. Start Open WebUI
  3. Make a request to /ollama/api/tags/{url_idx} with any valid url_idx (e.g., 0)
  4. Observe connection error in logs even though Ollama API is disabled

Expected Behavior

When ENABLE_OLLAMA_API is False, all Ollama API endpoints should return an appropriate error (e.g., 503 Service Unavailable or 403 Forbidden) without attempting to connect to Ollama servers, regardless of whether url_idx is provided.

Actual Behavior

When ENABLE_OLLAMA_API is False and url_idx is provided:

  • The endpoint attempts to connect to OLLAMA_BASE_URLS[url_idx]
  • Connection errors are logged (e.g., "Cannot connect to host host.docker.internal:11434")
  • The request fails with a connection error instead of a clear "API disabled" message

Proposed Fix

Add ENABLE_OLLAMA_API / ENABLE_OPENAI_API checks at the beginning of affected endpoint functions, before accessing url_idx-specific URLs. For example:

@router.get("/api/tags/{url_idx}")
async def get_ollama_tags(
    request: Request, url_idx: Optional[int] = None, user=Depends(get_verified_user)
):
    if not request.app.state.config.ENABLE_OLLAMA_API:
        raise HTTPException(
            status_code=503,
            detail="Ollama API is disabled"
        )
    
    # ... rest of the function
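The guard-first pattern can be exercised in isolation with a minimal, self-contained sketch. The HTTPException and Config classes below are simplified stand-ins for fastapi.HTTPException and request.app.state.config, kept framework-free for illustration; only the flag and URL-list names come from the issue above.

```python
from typing import Optional


class HTTPException(Exception):
    """Minimal stand-in for fastapi.HTTPException (illustration only)."""

    def __init__(self, status_code: int, detail: str):
        self.status_code = status_code
        self.detail = detail
        super().__init__(detail)


class Config:
    """Stand-in for the relevant fields of request.app.state.config."""

    ENABLE_OLLAMA_API = False
    OLLAMA_BASE_URLS = ["http://host.docker.internal:11434"]


def get_ollama_tags(config: Config, url_idx: Optional[int] = None):
    # Guard first: never touch OLLAMA_BASE_URLS when the API is disabled
    if not config.ENABLE_OLLAMA_API:
        raise HTTPException(status_code=503, detail="Ollama API is disabled")
    # Only reached when the flag is on; safe to resolve the URL now
    url = config.OLLAMA_BASE_URLS[url_idx if url_idx is not None else 0]
    return {"url": url}


try:
    get_ollama_tags(Config(), url_idx=0)
except HTTPException as e:
    print(e.status_code, e.detail)  # 503 Ollama API is disabled
```

With the guard in place, a disabled flag short-circuits every branch, including the url_idx one, so no connection attempt (and no log noise) can occur.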

Impact

  • Severity: Medium
  • Frequency: Occurs on every request to affected endpoints when APIs are disabled
  • User Experience: Confusing error messages and unnecessary connection attempts
  • Log Noise: Fills logs with connection errors even when the feature is intentionally disabled

Environment

  • Open WebUI version: Latest (as of 2026-01-30)
  • Configuration: ENABLE_OLLAMA_API=False
  • Docker environment with host.docker.internal:11434 as default Ollama URL
@Classic298 commented on GitHub (Feb 6, 2026):

https://github.com/open-webui/open-webui/pull/21226


Reference: github-starred/open-webui#58083