Time to first response degraded by 2-4 seconds since 0.4, SOLUTION #2737

Closed
opened 2025-11-11 15:13:21 -06:00 by GiteaMirror · 1 comment

Originally created by @kim-gtek on GitHub (Nov 21, 2024).

I noticed that Open WebUI takes 2-4 seconds longer to load chats.

Looking at the logs, the problem is clear: before every chat, Open WebUI calls

`get_all_models()`

I fixed this issue by adding the following environment variables in **backend/open_webui/config.py**, line 533:

```
ALL_MODELS_CACHE_TTL = "1800"
OPENAI_MODELS_CACHE_TTL = "1800"
OLLAMA_MODELS_CACHE_TTL = "1800"
```
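As a rough sketch (not the actual Open WebUI code, which wraps settings in its own config machinery), these variables could be read from the environment like so; note that aiocache's `cached(ttl=...)` expects a numeric TTL, so the string defaults are coerced to `int`:

```python
import os

# Hypothetical sketch: read the cache TTLs from the environment with a
# 1800-second (30 min) default, matching the values in the issue text.
ALL_MODELS_CACHE_TTL = int(os.environ.get("ALL_MODELS_CACHE_TTL", "1800"))
OPENAI_MODELS_CACHE_TTL = int(os.environ.get("OPENAI_MODELS_CACHE_TTL", "1800"))
OLLAMA_MODELS_CACHE_TTL = int(os.environ.get("OLLAMA_MODELS_CACHE_TTL", "1800"))
```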

These are imported into the following files:

**backend/open_webui/apps/main.py**, line 1007:

```
@cached(ttl=ALL_MODELS_CACHE_TTL)  # Cache variable to control get_all_models()
async def get_all_models():
```

**backend/open_webui/apps/ollama/main.py**, line 260:

```
@cached(ttl=OLLAMA_MODELS_CACHE_TTL)
async def get_all_models():
```

**backend/open_webui/apps/openai/main.py**, line 372:

```
@cached(ttl=OPENAI_MODELS_CACHE_TTL)
async def get_all_models() -> dict[str, list]:
```

Each file also needs `from aiocache import cached`, plus the corresponding variable added to its `from open_webui.config import (` block.
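To make the effect concrete, here is a stdlib-only sketch (not Open WebUI or aiocache code) of what a TTL cache around `get_all_models()` does: the first call does the expensive work, and repeat calls within the TTL return the memoized result instead of hitting the upstream APIs. The real `@cached` decorator also keys on arguments and supports pluggable backends.

```python
import asyncio
import time

def async_ttl_cached(ttl: float):
    """Minimal stand-in for aiocache's @cached(ttl=...) on a no-arg coroutine:
    memoize the result and only recompute once `ttl` seconds have elapsed."""
    def decorator(fn):
        state = {"value": None, "expires": 0.0}

        async def wrapper():
            now = time.monotonic()
            if now >= state["expires"]:
                state["value"] = await fn()
                state["expires"] = now + ttl
            return state["value"]

        return wrapper
    return decorator

calls = 0

@async_ttl_cached(ttl=1800)
async def get_all_models():
    global calls
    calls += 1
    # Stand-in for the expensive upstream model listing (names are made up).
    return {"models": ["llama3", "gpt-4o"]}

async def main():
    first = await get_all_models()   # computed
    second = await get_all_models()  # served from cache within the TTL
    return first, second, calls
```

Running `asyncio.run(main())` performs the underlying work once despite two awaits, which is exactly why the per-chat 2-4 second delay disappears.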


@tkg61 commented on GitHub (Nov 21, 2024):

Seeing this as well!


Reference: github-starred/open-webui#2737