[GH-ISSUE #3256] Llama2 on windows - how is this possible? Possible security issue? #64041

Closed
opened 2026-05-03 15:57:41 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @FnomAgram on GitHub (Mar 19, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3256

What is the issue?

Hello, I'm using Llama2 in Ollama on Windows (preview). There are no major problems, but yesterday I tried to start Llama2 with the command "ollama run llama2:70b --verbose". Before I could start writing any prompt, I got a response (see the attached screenshot) about my processor (7950X), even though I did not write "CPU", with data about utilisation and uptime (I'm not sure, but I think the data was correct).

I did not know that Llama2 in Ollama could interact with the PC that is currently running Ollama. I stopped the response and tried again to get info on the PC that Ollama is running on, but I could not get this kind of response again, only general responses about different types of CPUs or some hallucinations (that I'm running an Intel CPU, a different RAM quantity, a different GPU...).

Is it possible that Llama2 in Ollama can get data about the hardware of the computer it's running on? Is this a known fact and only my ignorance is to blame? Are there security issues if somebody does not know that this is possible? Is this a bug at all, or a feature that I'm not aware of? How can it be reproduced?

Thanks in advance, and sorry if I'm asking a stupid question.

![Screenshot](https://github.com/ollama/ollama/assets/163992294/beb3fb36-8239-4436-b85b-f0617ee639fc)

What did you expect to see?

Nothing.

Steps to reproduce

I tried to reproduce this response again multiple times but I could not get this kind of result.

Are there any recent changes that introduced the issue?

No.

OS

Windows

Architecture

amd64

Platform

No response

Ollama version

0.1.29

GPU

Nvidia

GPU info

3090TI

CPU

AMD

Other software

No response

GiteaMirror added the bug label 2026-05-03 15:57:41 -05:00

@dhiltgen commented on GitHub (Mar 20, 2024):

From the screenshot, this looks like a copy-and-paste from some other tool into the terminal where Ollama was running. The prefix `>>>` indicates this is an input prompt where Ollama is taking input from the user. Responses will not have that prefix. My guess is you selected this text in another application or terminal, and middle-clicked in the terminal with Ollama running. I was unable to reproduce. It sounds like you were unable to reproduce either. If you do find a repro scenario, let us know.
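
To illustrate the input/output distinction described above, here is a rough (hypothetical) sketch of what an Ollama REPL session looks like; lines prefixed with `>>>` are where the user types, while model output carries no prefix:

```text
C:\> ollama run llama2:70b --verbose
>>> Why is the sky blue?          <-- user input (note the >>> prefix)
The sky appears blue because...   <-- model output (no prefix)
>>>                               <-- Ollama waiting for the next input
```

So text appearing after a `>>>` prompt was supplied to the terminal (typed or pasted), not generated by the model.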


@FnomAgram commented on GitHub (Mar 20, 2024):

Thanks.
I'm still confused. I don't know how I could have copied and pasted this data without knowing. But still, thanks. To me, it's important to know that Ollama/Llama2 does not have access to data about the underlying hardware/operating system/file system.


Reference: github-starred/ollama#64041