[GH-ISSUE #21257] GitHub Models: refresh fails because Open WebUI calls /inference/models (404) #58089

Closed
opened 2026-05-05 22:19:03 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @tall27 on GitHub (Feb 8, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/21257

Bug: GitHub Models endpoint uses /catalog/models, but Open WebUI requests /inference/models

Summary

When configuring Open WebUI with GitHub Models, the model refresh fails with a network/error state because Open WebUI requests:

https://models.github.ai/inference/models

That path returns 404 for GitHub Models; the model catalog is instead exposed via:

https://models.github.ai/catalog/models

As a result, external models are not listed in the UI even though the chat inference endpoint itself is valid.
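The mismatch follows from how an OpenAI-compatible client builds its list URL: it appends /models to the configured base. A minimal sketch of the two paths involved (the helper names here are illustrative, not Open WebUI's actual code):

```python
from urllib.parse import urlsplit, urlunsplit

def openai_models_url(base_url: str) -> str:
    """What an OpenAI-compatible client requests: <base>/models."""
    return base_url.rstrip("/") + "/models"

def github_catalog_url(base_url: str) -> str:
    """Where GitHub Models actually serves its catalog: /catalog/models
    on the same host, outside the /inference prefix."""
    parts = urlsplit(base_url)
    return urlunsplit((parts.scheme, parts.netloc, "/catalog/models", "", ""))

base = "https://models.github.ai/inference"
print(openai_models_url(base))   # https://models.github.ai/inference/models  -> 404
print(github_catalog_url(base))  # https://models.github.ai/catalog/models
```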

Environment

  • Open WebUI: ghcr.io/open-webui/open-webui:main (as of 2026-02-07)
  • Deployment: Docker Compose
  • Provider configured as OpenAI-compatible connection
  • Base URL tested: https://models.github.ai/inference
  • Auth: Bearer token (GitHub PAT)

Steps to Reproduce

  1. Open Admin -> Connections in Open WebUI.
  2. Configure OpenAI-compatible provider:
    • Base URL: https://models.github.ai/inference
    • API key: GitHub PAT
    • Token type: Bearer
  3. Save, then click refresh models.

Actual Result

Model refresh fails, and the Open WebUI logs show an attempt to fetch /inference/models followed by 404/content-type parsing errors.

Example log excerpt:

ERROR open_webui.routers.openai:get_models ...
url='https://models.github.ai/inference/models'
aiohttp.client_exceptions.ContentTypeError: 404 ... unexpected mimetype: text/plain
GET /openai/models/0 HTTP/1.1 500

Expected Result

Open WebUI should be able to list GitHub Models and use them without needing a custom proxy.

Suggested Fix

Detect the GitHub Models base URL (models.github.ai) and use:

  • model list endpoint: GET /catalog/models
  • chat endpoint: POST /inference/chat/completions
  • embeddings endpoint: POST /inference/embeddings

Or allow provider-specific mapping for model-list and inference routes in OpenAI-compatible connections.
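That provider-specific mapping could be sketched roughly as follows (resolve_endpoints is a hypothetical helper; Open WebUI's actual router code is not shown in this issue):

```python
from urllib.parse import urlsplit

GITHUB_MODELS_HOST = "models.github.ai"

def resolve_endpoints(base_url: str) -> dict:
    """Map a configured base URL to model-list/chat/embeddings routes.

    Special-cases GitHub Models, whose catalog lives outside the
    /inference prefix; everything else gets the default OpenAI layout.
    """
    host = urlsplit(base_url).netloc
    if host == GITHUB_MODELS_HOST:
        root = f"https://{host}"
        return {
            "models": f"{root}/catalog/models",
            "chat": f"{root}/inference/chat/completions",
            "embeddings": f"{root}/inference/embeddings",
        }
    # Default OpenAI-compatible layout: every route under the base URL.
    base = base_url.rstrip("/")
    return {
        "models": f"{base}/models",
        "chat": f"{base}/chat/completions",
        "embeddings": f"{base}/embeddings",
    }
```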

Notes

Workaround currently used: a local proxy that maps the OpenAI-style /v1/models route to GitHub's /catalog/models.
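A bare-bones version of that proxy might look like the sketch below (the port, upstream constant, and rewritten path set are illustrative assumptions, not the reporter's actual proxy):

```python
# Sketch of the local-proxy workaround: rewrite the OpenAI-style
# model-list path to GitHub's catalog path, pass other GETs through.
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "https://models.github.ai"  # assumed upstream host

def rewrite_path(path: str) -> str:
    """Map OpenAI-style list routes onto GitHub's catalog route."""
    if path in ("/v1/models", "/models", "/inference/models"):
        return "/catalog/models"
    return path

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Forward the request upstream, preserving the Authorization header.
        req = urllib.request.Request(
            UPSTREAM + rewrite_path(self.path),
            headers={"Authorization": self.headers.get("Authorization", "")},
        )
        with urllib.request.urlopen(req) as resp:
            body = resp.read()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run locally:
# HTTPServer(("127.0.0.1", 8080), ProxyHandler).serve_forever()
```

A real proxy may also need to reshape the catalog JSON into the list shape OpenAI clients expect; that translation is omitted here.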

Author
Owner

@pr-validator-bot commented on GitHub (Feb 8, 2026):

⚠️ Missing Issue Title Prefix

@tall27, your issue title is missing a prefix (e.g., bug:, feat:, docs:).

Please update your issue title to include one of the following prefixes:

  • bug: Bug report or error you've encountered
  • feat: Feature request or enhancement suggestion
  • docs: Documentation issue or improvement request
  • question: Question about usage or functionality
  • help: Request for help or support

Example: bug: Login fails when using special characters in password

<!-- gh-comment-id:3867326781 -->
Author
Owner

@Classic298 commented on GitHub (Feb 8, 2026):

Essentially the same core question/issue as this: https://github.com/open-webui/open-webui/issues/21230

And provider-specific behaviour/code changes are not wanted.
If GitHub Models does not follow the OpenAI API schema, then they are not natively supported. Period.

If they do and just lack the /models endpoint - then good: you can still use them, but the fact that they don't have a /models endpoint is not for Open WebUI to fix.

<!-- gh-comment-id:3867331332 -->
Author
Owner

@Classic298 commented on GitHub (Feb 8, 2026):

https://docs.openwebui.com/faq#q-why-doesnt-open-webui-support-specific-providers-latest-api-eg-openai-responses-api

<!-- gh-comment-id:3867333736 -->
Author
Owner

@Classic298 commented on GitHub (Feb 8, 2026):

Also, you said in the PR and here that you'd need a proxy - but not really.
The only thing that fails is the connection verification, and it fails for a reason: GitHub doesn't serve the models endpoint where it would need to be to be OpenAI compatible. Not OpenAI compatible, so connection verification for an OpenAI-compatible connection fails.
Absolutely intended and expected.

Since this is the only thing that GitHub fails to implement correctly, the models themselves still work in your case, so no proxy is needed.

<!-- gh-comment-id:3867353013 -->
Reference: github-starred/open-webui#58089