[function] Unable to emit message events asynchronously when emitting citation events first #3777

Closed
opened 2025-11-11 15:39:36 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @sticky-note on GitHub (Feb 12, 2025).

Installation Method

Docker Compose default install with ollama-cuda

Environment

  • Open WebUI Version: v0.5.10

  • Ollama (if applicable): Not relevant

  • Operating System: Arch + docker

  • Browser (if applicable): Tested on latest Chrome and Firefox

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.
  • [ ] I have included the Docker container logs.
  • [x] I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

Be able to emit message AND citation events asynchronously, in any order, from a pipe function (manifold).

Actual Behavior:

Hello OpenWebUI team, I'm having trouble getting citations to display alongside the streaming response when I emit them from within an async for loop.
Here is example code:

(...imports...)
EventEmitter = Callable[[dict], Awaitable[None]]

class Pipe:
  (..valves...)

  def __init__(self):
    self.type: str = "manifold"
    (...)

  def pipes(self) -> list[dict[str, str]]:
    return [(...)]
  
  async def stream_response(
      self, event_emitter: Optional[EventEmitter], streaming_response: Any
  ) -> AsyncGenerator:
    # Some events of type "citation" occur here, at the beginning of
    # response parsing, in the form:
    await event_emitter({
        "type": "citation",
        "data": {
            "name": "citation_x",
            "document": ["content_x"],
            "metadata": [{"name": "citation_x", "source": "https://..."}],
            "distances": [0.998545],
        },
    })
    (...)
    # yields occur afterwards, while parsing
    (...)

  async def pipe(
      self,
      body: dict,
      __user__: Optional[dict] = None,
      # __event_call__,
      __task__: Optional[str] = None,
      __tools__: Optional[dict[str, dict]] = None,
      __task_body__: Optional[dict] = None,
      __event_emitter__: Optional[EventEmitter] = None,
      # __valves__=None,
  ) -> AsyncGenerator:
    if __task__ == TASKS.FUNCTION_CALLING:
        return
    if __task__ == TASKS.TITLE_GENERATION:
        yield f'{__task_body__["model"]}'  # type: ignore
        return
    if __task__ in [TASKS.AUTOCOMPLETE_GENERATION, TASKS.TAGS_GENERATION]:
        # TODO: prompt and tags autogeneration
        return
    (...)
    try:
        stream_res: AsyncGenerator = (...)  # object from a 3rd-party lib that pulls from the 3rd-party source

        async for chunk in self.stream_response(
            __event_emitter__, streaming_response=stream_res
        ):
            yield chunk
    except Exception:
        (...)
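For reference, the emit-then-yield ordering described above can be reduced to a minimal, self-contained sketch. The emitter and chunk source below are hypothetical stand-ins for illustration, not Open WebUI APIs; only the `{"type": "citation", ...}` payload shape comes from the report:

```python
import asyncio
from typing import AsyncGenerator, Awaitable, Callable

EventEmitter = Callable[[dict], Awaitable[None]]

events: list[dict] = []

async def record_event(event: dict) -> None:
    # Stand-in for __event_emitter__: simply records each event.
    events.append(event)

async def stream_response(emit: EventEmitter) -> AsyncGenerator[str, None]:
    # Emit a citation first, then yield message chunks: the ordering
    # that reportedly confuses the frontend.
    await emit({
        "type": "citation",
        "data": {
            "name": "citation_x",
            "document": ["content_x"],
            "metadata": [{"name": "citation_x", "source": "https://..."}],
            "distances": [0.998545],
        },
    })
    for chunk in ("Hello", " ", "world"):
        yield chunk

async def main() -> str:
    parts: list[str] = []
    async for chunk in stream_response(record_event):
        parts.append(chunk)
    return "".join(parts)

result = asyncio.run(main())
print(result)             # Hello world
print(events[0]["type"])  # citation
```

Running this standalone shows the interleaving itself is valid Python: the citation event fires before the first chunk is yielded, which is exactly the order the issue is about.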

Description

Bug Summary:

  • Everything works when I comment out the citation events, but then no citations are displayed.
  • Everything works if the citation events occur before the line async for chunk in stream_response(, but at that point the data I want to display as citations has not yet been pulled from the 3rd-party source.
  • Citations display correctly when the citation events occur after the response has finished streaming, but then the WebUI cannot link the citation markers in the response content to the emitted citations.
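Given the three observations above, one possible workaround sketch is to buffer the citation payloads while parsing and flush them only after the message stream has completed. All names below (stream_then_cite, fake_chunks, collect) are hypothetical; only the {"type": "citation", ...} event shape comes from the report, and this is a sketch of the deferral idea, not a confirmed fix:

```python
import asyncio
from typing import AsyncGenerator, AsyncIterable, Awaitable, Callable

EventEmitter = Callable[[dict], Awaitable[None]]

async def stream_then_cite(
    emit: EventEmitter,
    chunks: AsyncIterable[str],
    citations: list[dict],
) -> AsyncGenerator[str, None]:
    # Yield every message chunk first, so the frontend can finish the stream...
    async for chunk in chunks:
        yield chunk
    # ...then flush the buffered citation payloads in one pass.
    for data in citations:
        await emit({"type": "citation", "data": data})

async def fake_chunks() -> AsyncGenerator[str, None]:
    # Hypothetical stand-in for the 3rd-party streaming response.
    for part in ("foo", "bar"):
        yield part

emitted: list[dict] = []

async def collect(event: dict) -> None:
    # Hypothetical stand-in for __event_emitter__.
    emitted.append(event)

async def main() -> str:
    out: list[str] = []
    citations = [{"name": "citation_x", "document": ["content_x"]}]
    async for chunk in stream_then_cite(collect, fake_chunks(), citations):
        out.append(chunk)
    return "".join(out)

text = asyncio.run(main())
print(text)  # foobar
```

Note that, per the third bullet, deferring emission this way may still leave the UI unable to link in-text citation markers to the citations; it only avoids the mid-stream misinterpretation.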

In this case, the network reply in the web console shows what I believe is the correct payload, but I get a series of errors on each chat message event in the console. See the attached pictures.

The problem seems to occur once the first citation is emitted inside the async for loop: all subsequent message events appear to be misinterpreted by the frontend.
Meanwhile, I can see in the backend logs that the completions from the 3rd-party source are being yielded correctly.

If anyone can shed some light on this for me, I'd be grateful.

Reproduction Details

Steps to Reproduce:

  • Upload a function with this example on fresh install

Logs and Screenshots

Browser Console Logs:

Image

Docker Container Logs:
Not really relevant, but I can upload them if asked.

Screenshots/Screen Recordings (if applicable):

Image

Additional Information

I can reproduce this on my side and can upload more info if it isn't reproducible on yours.
I separated stream_response() out for readability, but I also tested the same logic inline, inside the async for in the main pipe() function, with the same results.

Note

Let me know if further information is required to address this bug, or if I'm simply not doing it right.

Author
Owner

@rgaricano commented on GitHub (Feb 12, 2025):

I'm not sure, but it seems to me that the type of the emitter should be "function" and the name "citation".

Reference: github-starred/open-webui#3777