Incorrect information returned by {{USER_LOCATION}} - Prompt Variables Support #1701
Originally created by @DustyTurtleDip on GitHub (Aug 6, 2024).
Bug Report
Description
Prompt Variables Support: {{USER_LOCATION}} is not returning the correct location.
Bug Summary:
No matter which LLM model I use, the value provided by the {{USER_LOCATION}} variable is wrong.
Steps to Reproduce:
Add the {{USER_LOCATION}} variable to the LLM system prompt, then ask the model the following question: "What is my current location?"
Expected Behavior:
The LLM should respond with an accurate position or geolocation.
Actual Behavior:
The LLM always gets the following (wrong) geolocation: 45.519, -70.861
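For context on why the symptom is model-independent: prompt variables are substituted into the system prompt before the request ever reaches the LLM, so if the server resolves {{USER_LOCATION}} to a stale or default coordinate pair, every model receives the same wrong value verbatim. A minimal sketch of that substitution step (the helper name and template string are hypothetical, not Open WebUI's actual code):

```python
# Hypothetical sketch of server-side prompt-variable substitution.
# If {{USER_LOCATION}} resolves to a cached/default coordinate instead of
# the client's real geolocation, the wrong value is baked into the system
# prompt for every model -- matching the behavior reported above.

def render_system_prompt(template: str, variables: dict) -> str:
    """Replace each {{NAME}} placeholder with its resolved value."""
    for name, value in variables.items():
        template = template.replace("{{" + name + "}}", str(value))
    return template

# The wrong coordinates from this report, passed through verbatim:
prompt = render_system_prompt(
    "You are a helpful assistant. The user is located at {{USER_LOCATION}}.",
    {"USER_LOCATION": "45.519, -70.861"},
)
print(prompt)
# → You are a helpful assistant. The user is located at 45.519, -70.861.
```

Because the substitution happens upstream of the model call, switching models cannot fix the value; the resolution of {{USER_LOCATION}} itself (e.g., browser geolocation vs. an IP-based or default fallback) is where the bug would have to live.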
Environment
Open WebUI Version: v0.3.11
Ollama Version: 0.3.3
Operating System:
- Installed on Debian 12 running in Docker
- Accessed from Ubuntu Desktop 24.04 (Noble)
- Accessed from Android 15
Browser:
- Ubuntu: Google Chrome 127.0.6533.88 (Official Build) (64-bit)
- Android: Google Chrome 127.0.6533.84
Reproduction Details
Confirmation:
Browser Console Logs:
browser_console.log
Docker Container Logs:
INFO: 192.168.1.1:0 - "POST /api/v1/chats/new HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO [apps.ollama.main] generate_ollama_embeddings model='llama3.1:8b' prompt='What is my current location ? ' options=None keep_alive=None
INFO [apps.ollama.main] url: http://ollama:11434
INFO [apps.ollama.main] generate_ollama_embeddings {'embedding': [-4.9622979164123535, STRIPPED -0.16112491488456726]}
INFO: 192.168.1.1:0 - "POST /api/v1/memories/query HTTP/1.1" 200 OK
INFO [apps.ollama.main] url: http://ollama:11434
INFO: 192.168.1.1:0 - "POST /ollama/api/chat HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "POST /api/chat/completed HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "POST /api/v1/chats/23c2fd21-23ef-4982-92bc-d91a7ba5d982 HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "POST /api/v1/chats/23c2fd21-23ef-4982-92bc-d91a7ba5d982 HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO [apps.ollama.main] url: http://ollama:11434
generate_title
llama3.1:8b
generate_ollama_chat_completion
INFO: 192.168.1.1:0 - "POST /api/task/title/completions HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "POST /api/v1/chats/23c2fd21-23ef-4982-92bc-d91a7ba5d982 HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
Screenshots:
Installation Method
Docker Compose