[GH-ISSUE #7650] AMD Radeon 780M GPU (Pop!_OS) System76 #66937

Closed
opened 2026-05-04 08:55:21 -05:00 by GiteaMirror · 38 comments

Originally created by @ihgumilar on GitHub (Nov 13, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7650

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

Hi,

I would like to ask for your help. I am running Ollama on the following hardware, but it does not seem to be picking up my GPU. Is there any advice?

AMD Ryzen™ 7 7840U processor (with Radeon 780M iGPU).

When I run `ollama serve`, it gives me the output below. Any advice?

Thanks

2024/11/13 17:40:14 routes.go:1189: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION:11.0.0 HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11435 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/home/ihshan/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://*] OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2024-11-13T17:40:14.880+07:00 level=INFO source=images.go:755 msg="total blobs: 0"
time=2024-11-13T17:40:14.880+07:00 level=INFO source=images.go:762 msg="total unused blobs removed: 0"
time=2024-11-13T17:40:14.881+07:00 level=INFO source=routes.go:1240 msg="Listening on 127.0.0.1:11435 (version 0.4.1)"
time=2024-11-13T17:40:14.881+07:00 level=INFO source=common.go:135 msg="extracting embedded files" dir=/tmp/ollama1477910346/runners
time=2024-11-13T17:40:14.949+07:00 level=INFO source=common.go:49 msg="Dynamic LLM libraries" runners="[rocm cpu cpu_avx cpu_avx2 cuda_v11 cuda_v12]"
time=2024-11-13T17:40:14.949+07:00 level=INFO source=gpu.go:221 msg="looking for compatible GPUs"
time=2024-11-13T17:40:16.902+07:00 level=INFO source=gpu.go:610 msg="no nvidia devices detected by library /usr/lib/x86_64-linux-gnu/libcuda.so.560.35.03"
time=2024-11-13T17:40:22.056+07:00 level=WARN source=amd_linux.go:61 msg="ollama recommends running the https://www.amd.com/en/support/linux-drivers" error="amdgpu version file missing: /sys/module/amdgpu/version stat /sys/module/amdgpu/version: no such file or directory"
time=2024-11-13T17:40:22.057+07:00 level=INFO source=amd_linux.go:296 msg="unsupported Radeon iGPU detected skipping" id=0 total="512.0 MiB"
time=2024-11-13T17:40:22.057+07:00 level=INFO source=amd_linux.go:399 msg="no compatible amdgpu devices detected"
time=2024-11-13T17:40:22.057+07:00 level=INFO source=gpu.go:386 msg="no compatible GPUs were discovered"
time=2024-11-13T17:40:22.057+07:00 level=INFO source=types.go:123 msg="inference compute" id=0 library=cpu variant=avx2 compute="" driver=0.0 name="" total="30.6 GiB" available="23.7 GiB"
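The key lines above are the `amdgpu version file missing` warning and `unsupported Radeon iGPU detected skipping`. As a quick sanity check (not from the thread; these are the standard sysfs paths the log itself references), you can verify whether the amdgpu kernel module is loaded at all before debugging Ollama:

```shell
# Check whether the in-kernel amdgpu driver is loaded.
# /sys/module/amdgpu exists only when the module is built in or loaded.
if [ -e /sys/module/amdgpu ]; then
    echo "amdgpu loaded"
    # The version file Ollama looks for may be absent for in-tree builds.
    cat /sys/module/amdgpu/version 2>/dev/null || echo "(no version file)"
else
    echo "amdgpu not loaded"
fi
```

If this prints "amdgpu not loaded", the problem is at the driver level, not in Ollama.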

OS

Linux

GPU

AMD

CPU

Other

Ollama version

0.4.1

GiteaMirror added the amd, bug, linux, gpu labels 2026-05-04 08:55:23 -05:00
@kth8 commented on GitHub (Nov 13, 2024):

https://github.com/alexhegit/Playing-with-ROCm/blob/main/inference/LLM/Run_Ollama_with_AMD_iGPU780M-QuickStart.md

@ihgumilar commented on GitHub (Nov 13, 2024):

Thanks @kth8
When I run `sudo apt install amdgpu-dkms`, it shows the error below. It seems that the latest kernel is not supported.

![IMG_20241113_224738](https://github.com/user-attachments/assets/3b3d1a66-1b0f-4465-bafc-0a29386cdb76)

As a result, `ollama ps` keeps showing the processor at 100% CPU.

Any advice?


@kth8 commented on GitHub (Nov 13, 2024):

You can consult the official [System76 documentation](https://support.system76.com/articles/rocm/), but it seems like you already had `amdgpu` installed and just needed the `rocm` package.


@fxmbsw7 commented on GitHub (Nov 14, 2024):

Your screenshot shows that `ollama serve` was spawned as a different user.
You need to set the environment variable(s) in the same shell where you run `ollama serve`.


@ihgumilar commented on GitHub (Nov 14, 2024):

I still cannot make it work. Could you elaborate and give me an example, @fxmbsw7?

Thanks for your help, much appreciated @kth8.


@fxmbsw7 commented on GitHub (Nov 14, 2024):

pkill ollama
HSA_OVERRIDE_GFX_VERSION=11.0.0 OLLAMA_FLASH_ATTENTION=1 ollama serve &>>~/serve.ollama &


@fxmbsw7 commented on GitHub (Nov 14, 2024):

Also, looking at your screenshot: your `apt` install of `amdgpu-dkms` failed.


@fxmbsw7 commented on GitHub (Nov 14, 2024):

Did you reboot the machine?
Newer kernels already include the amdgpu driver from amd.com.


@ihgumilar commented on GitHub (Nov 14, 2024):

> pkill ollama
> HSA_OVERRIDE_GFX_VERSION=11.0.0 OLLAMA_FLASH_ATTENTION=1 ollama serve &>>~/serve.ollama &

Should I put those environment variables in my `.bashrc` file?


@ihgumilar commented on GitHub (Nov 14, 2024):

> also looking at ur screenshot ur apt amdgpu install failed

Yes, I can see that. Any advice? @fxmbsw7


@ihgumilar commented on GitHub (Nov 14, 2024):

> did u reboot the machine ? new kernels include by amd.com amdgpu drivers

I did reboot several times. Thanks


@fxmbsw7 commented on GitHub (Nov 14, 2024):

Yes, put them in `.bashrc` (with or without the trailing `ollama serve ... &` part).

Anyway, as @kth8 also said: try removing the amdgpu kernel package and installing the rocm package:

apt-get update
apt-get remove amdgpu-dkms
apt-get upgrade rocm
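A minimal sketch of what the `.bashrc` approach could look like (the variable values are taken from the server log earlier in the thread; putting them in `~/.bashrc` is an assumption about your setup, and only affects shells, and `ollama serve` processes, started after the file is sourced):

```shell
# Hypothetical ~/.bashrc additions: export the overrides so that an
# `ollama serve` started from this shell inherits them.
export HSA_OVERRIDE_GFX_VERSION=11.0.0   # spoof a supported gfx11 ROCm target for the 780M iGPU
export OLLAMA_FLASH_ATTENTION=1
```

After editing, run `source ~/.bashrc` (or open a new terminal), then restart `ollama serve` from that shell.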


@ihgumilar commented on GitHub (Nov 14, 2024):

It seems that my kernel, 6.9.3-76060903-generic, is not yet supported by amdgpu-dkms:
https://rocm.docs.amd.com/en/latest/compatibility/compatibility-matrix.html#operating-systems-and-kernel-versions

![image](https://github.com/user-attachments/assets/21dff1db-470c-44b2-8422-c3c68e138626)

A similar issue with a higher kernel (6.11.08) has been reported: https://github.com/ROCm/ROCm/issues/3870

What do you think @fxmbsw7 ?


@fxmbsw7 commented on GitHub (Nov 14, 2024):

I'm not sure; I run Debian. Well, if ROCm only supports older kernels, try:

apt-cache search linux-image

(or however Pop!_OS names its kernels), look for a 6.8 kernel and install it:

apt-get install {name}

then reboot and select the older kernel.

@fxmbsw7 commented on GitHub (Nov 14, 2024):

https://rocm.docs.amd.com/projects/radeon/en/latest/docs/compatibility.html

@fxmbsw7 commented on GitHub (Nov 14, 2024):

1. Uninstall the DKMS package.
2. Try installing rocm.
0. See if, without DKMS, you have an amdgpu driver.
0. Try the Radeon / PRO drivers from the download page: https://www.amd.com/en/support/download/linux-drivers.html
0. Also try the DKMS install from amd.com.


@ihgumilar commented on GitHub (Nov 14, 2024):

Let me try, @fxmbsw7. Thanks.


@ihgumilar commented on GitHub (Nov 14, 2024):

As mentioned: "The ROCm package interfaces with the AMDGPU driver built into the default Pop!_OS kernel, and does not require installing any DKMS packages."

![image](https://github.com/user-attachments/assets/0ad4cabc-6b9c-4e31-a836-46f6dce6aef7)

https://support.system76.com/articles/rocm/


@ihgumilar commented on GitHub (Nov 14, 2024):

> try radeon / pro drivers download by page

Downgrading the kernel is not possible, since it causes issues with hardware such as WiFi.


@fxmbsw7 commented on GitHub (Nov 14, 2024):

Have you upgraded all packages yet?

apt-get upgrade

But anyway, you have to uninstall the DKMS package first, and check the URL the other poster shared.


@ihgumilar commented on GitHub (Nov 14, 2024):

> 1 uninstall dkms
> 2 try install rocm
> 0 see if without dkms u have a amdgpu driver

I did the above steps. I have the amdgpu driver without DKMS.

![image](https://github.com/user-attachments/assets/5e43cf7f-e28c-428e-b88d-f1a2a8191242)


@ihgumilar commented on GitHub (Nov 14, 2024):

After following the instructions, I ran it as suggested [here](https://github.com/alexhegit/Playing-with-ROCm/blob/main/inference/LLM/Run_Ollama_with_AMD_iGPU780M-QuickStart.md). It still does not seem to pick up amdgpu.

![image](https://github.com/user-attachments/assets/a54c47ee-0dc9-44fd-9291-6451f10c67c7)


@fxmbsw7 commented on GitHub (Nov 14, 2024):

OK, yeah. Another thing to try is updating the firmware, for example from git: copy the new files over to /lib/firmware (or wherever it lives), then try the downgraded kernel again.


@ihgumilar commented on GitHub (Nov 14, 2024):

Downgrading the kernel is not a solution for now, since it creates issues with hardware such as WiFi and keyboards.


@fxmbsw7 commented on GitHub (Nov 14, 2024):

I'll give you the git and cp commands. They overwrite all firmware files with the newest official ones. Then try once, or don't.


@ihgumilar commented on GitHub (Nov 14, 2024):

Sorry, I don't understand what you mean by "i tell u git and cp cmd". Which git repo do you mean?


@fxmbsw7 commented on GitHub (Nov 14, 2024):

cp -ap /lib/firmware/ /fw.bak
git clone https://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git fw
cp -ap fw/* /lib/firmware/

The worst that can happen is that too-new firmware blocks the new kernel, but that is unusual, since firmware updates are mostly bug fixes; that is also why it might revive your old kernel. The backup will be in the /fw.bak directory.

Reboot into the old kernel and see; otherwise boot back.


@fxmbsw7 commented on GitHub (Nov 14, 2024):

https://www.linuxfromscratch.org/blfs/view/svn/postlfs/firmware.html


@ihgumilar commented on GitHub (Nov 14, 2024):

Sorry @fxmbsw7, I am trying to understand: what is the main point of changing the kernel, as you suggested?


@fxmbsw7 commented on GitHub (Nov 14, 2024):

Kernel 6.8 is on the ROCm support list, so your Ollama would recognize your iGPU, as per the topic. :)


@fxmbsw7 commented on GitHub (Nov 14, 2024):

6.8 is the last kernel on their list. Not that I fully believe it, but I don't have better information. I want to try this on a Z1 Extreme 780M under Windows too at some point.


@ihgumilar commented on GitHub (Nov 14, 2024):

I tried to downgrade the kernel to 6.8, but my WiFi is not detected :)

> 6.8 is the last in their list .. not that i believe it but i dun haf better infos i wanna try on win z1 xtreme 780m too later somewhen


@fxmbsw7 commented on GitHub (Nov 14, 2024):

With the updated firmware?

<!-- gh-comment-id:2475949580 -->

@ihgumilar commented on GitHub (Nov 14, 2024):

Yes :(
I may have to wait for the next release :)

<!-- gh-comment-id:2476286396 -->

@fxmbsw7 commented on GitHub (Nov 14, 2024):

aidd yess ..

<!-- gh-comment-id:2476288292 -->

@ihgumilar commented on GitHub (Nov 14, 2024):

Thanks for your help @fxmbsw7

<!-- gh-comment-id:2476312430 -->

@fxmbsw7 commented on GitHub (Nov 14, 2024):

its aidd , thxx too for cooperation

<!-- gh-comment-id:2477250619 -->

@dhiltgen commented on GitHub (Mar 11, 2026):

Release 0.17.8 updates Linux to ROCm v7, which covers support for this GPU. Please give the [RC a try](https://github.com/ollama/ollama/blob/main/docs/linux.mdx#installing-specific-versions) and let us know if you run into any problems.

<!-- gh-comment-id:4041995441 -->
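For readers landing here later: the Linux docs linked in the comment above describe installing a specific release by passing `OLLAMA_VERSION` to the install script. A minimal sketch, assuming the 0.17.8 version string from the comment above and that `ollama` is on your PATH afterwards:

```shell
# Install a specific Ollama release on Linux (per docs/linux.mdx,
# "Installing specific versions"). 0.17.8+ bundles ROCm v7, which
# the comment above says covers the Radeon 780M iGPU.
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.17.8 sh

# After restarting the server, check the startup log for GPU discovery;
# the exact log wording varies by version, so grep broadly for amd/rocm.
ollama serve 2>&1 | grep -iE 'amdgpu|rocm|inference compute'
```

If the log still shows only CPU runners, capturing `OLLAMA_DEBUG=1 ollama serve` output is the usual next step when reporting back on the issue.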
Reference: github-starred/ollama#66937