[GH-ISSUE #10608] Version 0.68: When using langchain4j to call the model deployed by ollama and the function tool, the streaming output does not take effect #69038

Open
opened 2026-05-04 16:58:42 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @zhangsic-wlf on GitHub (May 7, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10608

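For reproduction context, a minimal sketch of the kind of request involved, assuming Ollama's documented `/api/chat` endpoint (which langchain4j's Ollama integration calls under the hood). The model name `llama3.1` and the `get_current_weather` tool are illustrative placeholders, not details from this report. The reported symptom is that when a `tools` array is present, the response is not streamed even though `"stream": true` is set:

```json
{
  "model": "llama3.1",
  "stream": true,
  "messages": [
    { "role": "user", "content": "What is the weather in Toronto?" }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": { "type": "string", "description": "The city name" }
          },
          "required": ["city"]
        }
      }
    }
  ]
}
```

Note that older Ollama releases fell back to a single non-streamed response whenever `tools` was supplied, which would produce exactly this behavior regardless of the client library.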
GiteaMirror added the feature request label 2026-05-04 16:58:42 -05:00

Reference: github-starred/ollama#69038