panic: runtime error: invalid memory address or nil pointer dereference #3146

Closed
opened 2025-11-12 11:26:38 -06:00 by GiteaMirror · 1 comment

Originally created by @wywself on GitHub (Jun 11, 2024).

What is the issue?

I am using a Tesla M60, which is on the supported GPU list. However, when I execute the following command to start the model, it fails with the error below.

```
# ollama run qwen:7b
Error: Post "http://127.0.0.1:11434/api/chat": EOF
```

The log is as follows:
![image](https://github.com/ollama/ollama/assets/8843053/5b6c8d58-9ef7-4e26-8c01-8aec93e12ed2)

`lscpu` output is as follows:
![image](https://github.com/ollama/ollama/assets/8843053/76ab98cd-ee0e-4622-9971-c67e0872ca14)

How can I resolve this? Thank you.

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.1.42

GiteaMirror added the
bug
label 2025-11-12 11:26:38 -06:00

@wywself commented on GitHub (Jun 12, 2024):

`ollama run qwen2:7b` succeeds.


Reference: github-starred/ollama-ollama#3146