[GH-ISSUE #2065] Any ollama command results in CORE DUMPED (ollama not using GPU) #26955

Closed
opened 2026-04-22 03:45:03 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @Rushmore75 on GitHub (Jan 19, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2065

Trying to interact with the command at all just returns `Illegal instruction (core dumped)`. The journalctl logs just show:

```
Started Ollama Service
ollama.service: Main process exited, code=dumped, status=4/ILL
ollama.service: Failed with result 'core-dump'.
```

System:
Kernel: 5.15.0-91-generic
Distro: Ubuntu 22.04.3 LTS
Hardware: (Proxmox 8.1.3)

  • CPU: x86-64-v2-AES
  • GPU: (Passthru) Nvidia 1070
  • BIOS: SeaBIOS
  • Machine: i440fx

I would imagine it is linked to #2000 - perhaps something to do with VMs?
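For context, `status=4/ILL` means the process died on SIGILL: it executed an instruction the (virtual) CPU does not support. The QEMU `x86-64-v2-AES` CPU type notably does not advertise AVX, which ollama builds of this era assumed. A quick check, sketched for a Linux guest:

```shell
# status=4/ILL is SIGILL: the binary used an instruction the vCPU does not
# advertise. List the AVX-family flags the guest actually sees:
grep -m1 '^flags' /proc/cpuinfo | tr ' ' '\n' | grep -E '^avx' \
  || echo "no AVX flags: try the 'host' CPU type in Proxmox"
```

If only the fallback message prints, switching the Proxmox CPU type from `x86-64-v2-AES` to `host` passes the physical CPU's feature flags through to the guest.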

GiteaMirror added the linux and bug labels 2026-04-22 03:45:03 -05:00

@Rushmore75 commented on GitHub (Jan 19, 2024):

Also tried with the `q35` machine, still crashes


@Rushmore75 commented on GitHub (Jan 19, 2024):

As far as I can tell (building & running the program locally with gdb) the error is coming from https://github.com/gabriel-vasile/mimetype/blob/master/internal/magic/ftyp.go line 11...


@Rushmore75 commented on GitHub (Jan 19, 2024):

Updating mimetype in `go.mod` seemed to fix the issue. It required adding a second package as well; I forgot the name, but it was listed in the output from the update.


@Rushmore75 commented on GitHub (Jan 20, 2024):

# Was already installed:

```
cuda-drivers-545
cuda-drivers
cuda-keyring
```

# Fixed my PPAs

[commands source](https://askubuntu.com/questions/1289811/cant-install-nvidia-driver-toolkit-on-ubuntu-20-04-lts-needs-uninstallable-pa)

```bash
sudo apt-get --purge remove "*cublas*" "cuda*" "*nvidia*"
sudo apt-get clean
sudo apt-get autoremove
sudo apt-get update
sudo apt-get upgrade
```

# Then installed

```
nvidia-cuda-toolkit
nvidia-driver-535
```

# Rebuild ollama

(I just removed the whole repo and re-cloned it)

  1. In `go.mod`: update `github.com/gabriel-vasile/mimetype` to `v1.4.3`.
     The old version was causing the **core dump** before, idk why, but updating it fixes it.

  2. `go get github.com/go-playground/validator/v10@v10.14.0`

  3. `go generate ./...`

  4. `go build -buildmode=pie -trimpath -mod=readonly -modcacherw -ldflags=-linkmode=external -ldflags=-buildid=''`
     These flags are from the [ollama-cuda AUR](https://gitlab.archlinux.org/archlinux/packaging/packages/ollama-cuda/-/blob/main/PKGBUILD?ref_type=heads) package, idk really what they do lol
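Taken together, the rebuild steps above amount to the following (a sketch, assuming a fresh clone of the ollama repo; the version numbers are the ones quoted in this thread):

```shell
# inside a fresh clone of github.com/ollama/ollama
go get github.com/gabriel-vasile/mimetype@v1.4.3      # the dependency that was crashing
go get github.com/go-playground/validator/v10@v10.14.0
go generate ./...                                      # compiles the native llama.cpp code
go build -buildmode=pie -trimpath -mod=readonly -modcacherw \
  -ldflags=-linkmode=external -ldflags=-buildid=''
```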

However, still no GPU acceleration... I'm using the llama2 model now.
`nvtop` shows no programs using the GPU and `nvidia-smi` doesn't either.
When I run the program it shows `INFO CUDA Compute Capability detected: 6.1`


@nkeilar commented on GitHub (Jan 21, 2024):

I'm also running into similar issues, Ubuntu 22.04, using the 545 drivers... Lots of stability issues. But was hard to get Ubuntu to be happy with a single consistent set of drivers.


@pwgit-create commented on GitHub (Jan 21, 2024):

I have the same issue on Ubuntu 22.04.3. Just got my Nvidia 4070 TI passed to my VM and Ollama installed with GPU enabled for the first time 😃 Has a fix been integrated into the latest release of Ollama or is the problem on my side? Awesome work with Ollama by the way, I Love it!

EDIT: Running the binary from pre-release v0.1.21 has resulted in it now working :)


@Rushmore75 commented on GitHub (Jan 24, 2024):

Nice. It would seem the original issue this was opened for can be worked around by building from source or using a pre-release.


@pdevine commented on GitHub (Mar 11, 2024):

Going to go ahead and close this since it should now be working.


Reference: github-starred/ollama#26955