[GH-ISSUE #13678] feat: ability to mark openai connections as local connections. #55662

Closed
opened 2026-05-05 17:47:46 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @martinoturrina on GitHub (May 8, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/13678

Originally assigned to: @tjbck on GitHub.

Check Existing Issues

  • I have searched the existing issues and discussions.

Problem Description

Currently, ANY OpenAI-compatible connection is marked as external, even though other local LLM providers (like mlx_lm) expose a compatible endpoint while still being local. By default, all models served via an OpenAI-compatible connection show as external.

Desired Solution you'd like

There should be a way to show these models as local instead of only as external.
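A minimal sketch of what such an override could look like: a per-connection flag that the UI badge follows directly, instead of inferring local/external from the connection type. The field and function names here are illustrative assumptions, not Open WebUI's actual schema.

```python
# Hypothetical sketch of a per-connection "local" override.
# `connection_type` and `model_tag` are illustrative names, not
# Open WebUI's real configuration schema.
from dataclasses import dataclass

@dataclass
class OpenAIConnection:
    base_url: str
    api_key: str = ""
    connection_type: str = "external"  # user may override to "local"

def model_tag(conn: OpenAIConnection) -> str:
    # The UI would display whatever the user chose, rather than
    # assuming every OpenAI-compatible endpoint is external.
    return conn.connection_type

# An mlx_lm server exposing an OpenAI-compatible endpoint, tagged local:
mlx = OpenAIConnection(base_url="http://localhost:8080/v1",
                       connection_type="local")
print(model_tag(mlx))  # local
```

With this shape, the default stays "external", so existing connections keep their current behavior unless the user opts in.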

Alternatives Considered

No response

Additional Context

![Image](https://github.com/user-attachments/assets/ecabb5f5-7d63-4f55-bf68-47c728741bb8)
Author
Owner

@Azzeo commented on GitHub (May 8, 2025):

There should be an override option; it should not depend on whether the server is on the same local network.
For example, the vLLM server can be hosted at
https://vllm.intern.mycompanyproduction.com/v1/chat/completions
while OWU can be hosted at:
https://chat.mycompany.com/
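The comment's point can be illustrated with a naive URL heuristic: a check based on localhost and private IP ranges would misclassify an internal vLLM deployment served from a company domain. This is a hedged sketch of why an explicit override is needed, not code from Open WebUI.

```python
# Sketch: why a hostname/network heuristic is unreliable for tagging
# connections as local. Named hosts (even internal ones) cannot be
# classified from the URL alone.
import ipaddress
from urllib.parse import urlparse

def looks_local(url: str) -> bool:
    host = urlparse(url).hostname or ""
    if host in ("localhost", "127.0.0.1", "::1"):
        return True
    try:
        # Only literal IP addresses can be checked for private ranges.
        return ipaddress.ip_address(host).is_private
    except ValueError:
        return False  # a DNS name: no reliable local/external signal

# An internal vLLM server behind a company domain is misclassified:
print(looks_local("https://vllm.intern.mycompanyproduction.com/v1/chat/completions"))  # False
print(looks_local("http://192.168.1.10:8000/v1"))  # True
```

Because the heuristic returns False for the internal vLLM URL, only a user-set override can mark such a connection as local.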

<!-- gh-comment-id:2862525745 -->
Author
Owner

@tjbck commented on GitHub (May 8, 2025):

Agreed, investigating.

<!-- gh-comment-id:2862572418 -->
Author
Owner

@imbible commented on GitHub (May 12, 2025):

Bump. Very useful feature.

<!-- gh-comment-id:2870425691 -->
Author
Owner

@tjbck commented on GitHub (May 16, 2025):

Addressed with 08e4c163ea

<!-- gh-comment-id:2887731746 -->
Reference: github-starred/open-webui#55662