Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-12 10:04:14 -05:00)
bug: windows litellm subprocess issue #747
Originally created by @wxjttxs on GitHub (Apr 28, 2024).
ERROR:asyncio:Task exception was never retrieved
future: <Task finished name='Task-3' coro=<start_litellm_background() done, defined at D:\program\phoenix\phoenix_try3\open-webui\backend\apps\litellm\main.py:112> exception=NotImplementedError()>
Traceback (most recent call last):
  File "D:\program\phoenix\phoenix_try3\open-webui\backend\apps\litellm\main.py", line 127, in start_litellm_background
    await run_background_process(command)
  File "D:\program\phoenix\phoenix_try3\open-webui\backend\apps\litellm\main.py", line 88, in run_background_process
    process = await asyncio.create_subprocess_exec(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\softwares\anaconda\Lib\asyncio\subprocess.py", line 223, in create_subprocess_exec
    transport, protocol = await loop.subprocess_exec(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\softwares\anaconda\Lib\asyncio\base_events.py", line 1694, in subprocess_exec
    transport = await self._make_subprocess_transport(
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\softwares\anaconda\Lib\asyncio\base_events.py", line 502, in _make_subprocess_transport
    raise NotImplementedError
NotImplementedError
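For context on the traceback above: `asyncio.create_subprocess_exec` raises `NotImplementedError` when the running event loop has no subprocess support, which on Windows is the case for the selector-based loop; only the proactor loop can spawn subprocesses. A minimal sketch of the usual workaround (this is a general asyncio illustration, not the project's actual fix, and `run_background_process` here is a simplified stand-in for the function in the traceback):

```python
import asyncio
import sys

async def run_background_process(command):
    # This is the call that fails in the traceback: on Windows,
    # create_subprocess_exec raises NotImplementedError if the
    # current event loop is a SelectorEventLoop.
    process = await asyncio.create_subprocess_exec(
        *command, stdout=asyncio.subprocess.PIPE
    )
    stdout, _ = await process.communicate()
    return stdout

if sys.platform == "win32":
    # Workaround: install the Proactor policy before any loop is
    # created, so subprocess transports are available.
    asyncio.set_event_loop_policy(asyncio.WindowsProactorEventLoopPolicy())

out = asyncio.run(run_background_process([sys.executable, "-c", "print('ok')"]))
print(out.decode().strip())  # prints "ok"
```

Whether this applies here depends on which loop policy the backend (or a dependency) installs at startup; the maintainers' actual fix landed in the PR referenced below.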
@justinh-rahb commented on GitHub (Apr 28, 2024):
Related bug was fixed and merged to main, re-pull and try again.
@tjbck commented on GitHub (Apr 28, 2024):
Might be a Windows compatibility issue here; try ENABLE_LITELLM=false.
@tjbck commented on GitHub (Apr 28, 2024):
Related #1758
@silentoplayz commented on GitHub (May 11, 2024):
@wxjttxs Can you confirm whether this issue has been solved by #1758? I realized it had not yet been linked to the PR and closed.
@tjbck commented on GitHub (May 26, 2024):
This issue has been resolved in 0.2.0.dev1. Bundled LiteLLM support will be deprecated in future releases, so if you wish to keep hosting LiteLLM, I'd recommend migrating your LiteLLM config.yaml to a self-hosted LiteLLM instance. You'd still be able to add it to the webui via OpenAI Connections.