Incorrect information returned by {{USER_LOCATION}} - Prompt Variables Support #1701

Closed
opened 2025-11-11 14:50:25 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @DustyTurtleDip on GitHub (Aug 6, 2024).

Bug Report

Description

The 📅 Prompt Variables Support feature (https://docs.openwebui.com/features/#:~:text=%F0%9F%93%85%20Prompt%20Variables,%3E%20Interface%20menu.) is not returning the correct location.

Bug Summary:
No matter which LLM model I use, the value provided by the {{USER_LOCATION}} variable is wrong.

Steps to Reproduce:
Add the {{USER_LOCATION}} variable to the LLM system prompt, then ask the model the following question: "What is my current location?"

Expected Behavior:
The LLM should respond with an accurate position or geolocation.

Actual Behavior:
The LLM always returns the following (wrong) geolocation: 45.519, -70.861
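For context on why every model reports the same coordinates: the {{USER_LOCATION}} placeholder is resolved and injected into the system prompt before the model ever sees it, so the model simply echoes whatever value the UI supplied. A minimal sketch of that kind of placeholder substitution (hypothetical illustration, not Open WebUI's actual code; the function name and variable dict are assumptions):

```python
def apply_prompt_variables(template: str, variables: dict[str, str]) -> str:
    """Replace {{NAME}} placeholders in a prompt template with resolved values."""
    for name, value in variables.items():
        template = template.replace("{{" + name + "}}", value)
    return template

# The coordinates below are the wrong values observed in this report; if the
# resolved value is wrong at this point, every model will repeat it verbatim.
system_prompt = "The user's current location is {{USER_LOCATION}}."
print(apply_prompt_variables(system_prompt, {"USER_LOCATION": "45.519, -70.861"}))
```

This suggests the bug is in how the location value is resolved (e.g., an IP-based lookup instead of the browser's actual geolocation), not in any particular model.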

Environment

  • Open WebUI Version: v0.3.11

  • Ollama: 0.3.3

  • Operating System:
    Installed on Debian 12 running in Docker
    Accessed from Ubuntu Desktop 24.04 Noble
    Accessed from Android 15

  • Browser:
    Ubuntu - Google Chrome 127.0.6533.88 (Official Build) (64-bit)
    Android - Google Chrome 127.0.6533.84

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.
  • [x] I have included the Docker container logs.

Browser Console Logs:
browser_console.log (https://github.com/user-attachments/files/16515401/browser_console.log)

Docker Container Logs:
INFO: 192.168.1.1:0 - "POST /api/v1/chats/new HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO [apps.ollama.main] generate_ollama_embeddings model='llama3.1:8b' prompt='What is my current location ? ' options=None keep_alive=None
INFO [apps.ollama.main] url: http://ollama:11434
INFO [apps.ollama.main] generate_ollama_embeddings {'embedding': [-4.9622979164123535, STRIPPED -0.16112491488456726]}
INFO: 192.168.1.1:0 - "POST /api/v1/memories/query HTTP/1.1" 200 OK
INFO [apps.ollama.main] url: http://ollama:11434
INFO: 192.168.1.1:0 - "POST /ollama/api/chat HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "POST /api/chat/completed HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "POST /api/v1/chats/23c2fd21-23ef-4982-92bc-d91a7ba5d982 HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "POST /api/v1/chats/23c2fd21-23ef-4982-92bc-d91a7ba5d982 HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO [apps.ollama.main] url: http://ollama:11434
generate_title
llama3.1:8b
generate_ollama_chat_completion
INFO: 192.168.1.1:0 - "POST /api/task/title/completions HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "POST /api/v1/chats/23c2fd21-23ef-4982-92bc-d91a7ba5d982 HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO: 192.168.1.1:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK

Screenshots:
Screenshot from 2024-08-06 15-47-08 (https://github.com/user-attachments/assets/9e279fb3-bf19-42cc-99ce-4b5825fc8409)

wrong info (https://github.com/user-attachments/assets/228fbcb6-bc50-444b-a6c0-100e91741596)

Screenshot from 2024-08-06 16-02-32 (https://github.com/user-attachments/assets/3f80b382-040c-4960-9516-b7511cd9ac06)

Installation Method

Docker Compose

Reference: github-starred/open-webui#1701