Make ID of the Workspace model available in pipes #3952

Closed
opened 2025-11-11 15:43:11 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @Simon-Stone on GitHub (Feb 18, 2025).

Feature Request

Is your feature request related to a problem? Please describe.
I am working with the pipe feature to connect to providers that are not fully OpenAI-compatible, e.g. Anthropic and Google. I also use these pipes to implement token-tracking functionality. The behavior of the pipe should differ slightly depending on whether the piped model is being used directly or as the base model of a Workspace model. However, the pipe is not aware of whether the request comes in directly or via a Workspace model.

Describe the solution you'd like
The method `Pipe.pipe(self, body: dict, __user__: dict)` should be made aware of this context somehow.

Describe alternatives you've considered
I don't really see an alternative to this.

I am willing to provide a pull request that implements this, but welcome any feedback on this issue.

I can see that the middleware has the model information in [`process_chat_payload()`](https://github.com/open-webui/open-webui/blob/3f3a5bb0ab8ce3425f317f1e57b084523aa2b2a5/backend/open_webui/utils/middleware.py#L628), but it is not forwarding it. I wonder if that would be a good place to start. Alternatively, we could add the model ID to the `__user__` dict, which is passed to the pipe. This would be the least intrusive option, but it does not seem like quite the right place.
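For illustration, here is a minimal sketch of what the `__user__` variant could look like. The `model_id` and `base_model_id` keys are hypothetical, not part of the current API; they stand in for whatever the middleware would inject:

```python
# Hypothetical sketch of the proposal: if the middleware forwarded the
# calling model's identity inside __user__, a pipe could branch on
# whether it was invoked directly or via a Workspace model.
# The key names below are illustrative, not an existing API.

class Pipe:
    def pipe(self, body: dict, __user__: dict):
        model_id = __user__.get("model_id")            # hypothetical key
        base_model_id = __user__.get("base_model_id")  # hypothetical key

        if model_id and base_model_id and model_id != base_model_id:
            # Request arrived via a Workspace model built on this pipe.
            tag = f"workspace:{model_id}"
        else:
            # The pipe's model was selected directly.
            tag = "direct"

        # Token tracking could then attribute usage per tag.
        return f"[{tag}] response"
```

With that, per-model token accounting reduces to grouping usage records by `tag`.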


@tjbck commented on GitHub (Feb 18, 2025):

I believe `__model__` might be what you're looking for, but it's not fully supported yet.

Reference: github-starred/open-webui#3952