[GH-ISSUE #1838] Cuda Error with 2GB VRAM: Error: Post "http://127.0.0.1:11434/api/generate": EOF #63085

Closed
opened 2026-05-03 11:43:29 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @falaimo on GitHub (Jan 7, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/1838

Hello everyone, in Ollama version 0.1.18, I'm encountering the error `Error: Post "http://127.0.0.1:11434/api/generate": EOF` when starting Ollama with any model. I think it depends on CUDA...

[logs_ollama.txt](https://github.com/jmorganca/ollama/files/13852832/logs_ollama.txt)
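For context on the symptom: the `ollama run` CLI is just a client for the local HTTP server, and the reported `EOF` means the server side dropped the connection mid-request (here, likely because the CUDA runner crashed on a 2 GB card) rather than returning an HTTP error. A minimal sketch of the same request using only the Python standard library, which makes the failure mode visible (the model name and host are assumptions, not taken from the logs):

```python
import json
import urllib.error
import urllib.request

def generate(prompt, model="llama2", host="http://127.0.0.1:11434"):
    """POST to Ollama's /api/generate endpoint.

    Returns the generated text, or None when the connection is dropped,
    which is how a crashed GPU runner surfaces on the client side.
    """
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=120) as resp:
            return json.loads(resp.read())["response"]
    except (urllib.error.URLError, ConnectionError, EOFError) as exc:
        # An abrupt server-side exit shows up here as a connection
        # reset / EOF instead of a normal HTTP status code.
        print(f"request failed: {exc}")
        return None
```

If the GPU is the culprit, a common diagnostic step is to force CPU-only inference (e.g. by starting the server with `CUDA_VISIBLE_DEVICES=""`) and seeing whether the same request then succeeds.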
GiteaMirror added the bug label 2026-05-03 11:43:29 -05:00

Reference: github-starred/ollama#63085