bug: windows litellm subprocess issue #747

Closed
opened 2025-11-11 14:30:23 -06:00 by GiteaMirror · 5 comments
Owner

Originally created by @wxjttxs on GitHub (Apr 28, 2024).

```
ERROR:asyncio:Task exception was never retrieved
future: <Task finished name='Task-3' coro=<start_litellm_background() done, defined at D:\program\phoenix\phoenix_try3\open-webui\backend\apps\litellm\main.py:112> exception=NotImplementedError()>
Traceback (most recent call last):
  File "D:\program\phoenix\phoenix_try3\open-webui\backend\apps\litellm\main.py", line 127, in start_litellm_background
    await run_background_process(command)
  File "D:\program\phoenix\phoenix_try3\open-webui\backend\apps\litellm\main.py", line 88, in run_background_process
    process = await asyncio.create_subprocess_exec(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\softwares\anaconda\Lib\asyncio\subprocess.py", line 223, in create_subprocess_exec
    transport, protocol = await loop.subprocess_exec(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\softwares\anaconda\Lib\asyncio\base_events.py", line 1694, in subprocess_exec
    transport = await self._make_subprocess_transport(
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\softwares\anaconda\Lib\asyncio\base_events.py", line 502, in _make_subprocess_transport
    raise NotImplementedError
NotImplementedError
```
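Background on why this fails only on Windows: asyncio's subprocess transport is implemented there only by the ProactorEventLoop, so a SelectorEventLoop (which some servers and libraries install for compatibility) raises `NotImplementedError` from `create_subprocess_exec`. A minimal workaround sketch, run before starting the event loop, could look like this (the helper name is mine, not from the Open WebUI codebase):

```python
import asyncio
import sys


def ensure_windows_proactor_policy() -> bool:
    """Switch asyncio to the ProactorEventLoop policy on Windows.

    asyncio's subprocess APIs (create_subprocess_exec / create_subprocess_shell)
    are only implemented by the ProactorEventLoop on Windows; under the
    SelectorEventLoop they raise NotImplementedError, as in the traceback
    above. Returns True if the policy was changed, False otherwise.
    """
    if sys.platform == "win32" and not isinstance(
        asyncio.get_event_loop_policy(), asyncio.WindowsProactorEventLoopPolicy
    ):
        asyncio.set_event_loop_policy(asyncio.WindowsProactorEventLoopPolicy())
        return True
    return False
```

On non-Windows platforms the helper is a no-op, so it is safe to call unconditionally at startup.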


@justinh-rahb commented on GitHub (Apr 28, 2024):

A related bug was fixed and merged to main; re-pull and try again.


@tjbck commented on GitHub (Apr 28, 2024):

This might be a Windows compatibility issue; try `ENABLE_LITELLM=false`.
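For anyone following along, disabling the bundled LiteLLM subprocess means setting that environment variable before launching the backend (the variable name comes from the comment above; the launch command itself is omitted):

```shell
# Disable the bundled LiteLLM subprocess before starting the backend.
# Linux/macOS (bash):
export ENABLE_LITELLM=false
# Windows PowerShell equivalent (shown as a comment so this stays bash):
#   $Env:ENABLE_LITELLM = "false"
```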


@tjbck commented on GitHub (Apr 28, 2024):

Related #1758


@silentoplayz commented on GitHub (May 11, 2024):

@wxjttxs Can you confirm whether this issue has been resolved by #1758? I noticed it hasn't been linked to the PR and closed yet.


@tjbck commented on GitHub (May 26, 2024):

This issue has been resolved in 0.2.0.dev1. Bundled LiteLLM support will be deprecated in future releases, so if you wish to keep hosting LiteLLM, I'd recommend migrating your LiteLLM config.yaml to a self-hosted LiteLLM instance. You'd still be able to add its models to the WebUI via OpenAI Connections.
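For those migrating, a minimal self-hosted LiteLLM proxy `config.yaml` might look roughly like the sketch below; the model name and API-key reference are placeholders, not values taken from this thread:

```yaml
# Minimal LiteLLM proxy config sketch; model_name and api_key are placeholders.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
```

The proxy's OpenAI-compatible endpoint can then be added in Open WebUI under OpenAI Connections, as noted above.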


Reference: github-starred/open-webui#747