[GH-ISSUE #2455] Update rocm version for docker build to 6.0.2 #1435

Closed
opened 2026-04-12 11:20:21 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @mkesper on GitHub (Feb 11, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2455

Originally assigned to: @dhiltgen on GitHub.

Docker builds are still based on rocm-5.7.1. ROCm 6.0.2 seems to work better, so please offer at least a variant of the image built with ROCm 6.0.2.

Related PR: https://github.com/ollama/ollama/pull/2454
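
As a rough illustration of the requested change, a Dockerfile fragment along these lines would bump the ROCm toolchain used for the build. This is only a sketch: the `ROCM_VERSION` argument name and the base image are assumptions, not the project's actual Dockerfile contents (see the linked PR for the real diff).

```dockerfile
# Illustrative sketch only -- variable and image names are assumed,
# not taken from ollama's actual Dockerfile.
ARG ROCM_VERSION=6.0.2
# AMD publishes versioned ROCm development base images on Docker Hub.
FROM rocm/dev-ubuntu-22.04:${ROCM_VERSION} AS rocm-build
# ...build steps against the newer ROCm toolchain would follow here...
```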

GiteaMirror added the amd label 2026-04-12 11:20:21 -05:00

@mkesper commented on GitHub (Feb 13, 2024):

Additionally: would it make sense to offer the ROCm image as a completely separate image? It's quite a different product.


@dhiltgen commented on GitHub (Mar 11, 2024):

> Additionally: would it make sense to offer the ROCm image as a completely separate image? It's quite a different product.

Can you elaborate? Are you suggesting a separate ollama release or binary for ROCm vs. CUDA? One of our key objectives is to streamline the user experience so a single app works across a broad set of GPUs and CPUs.


Reference: github-starred/ollama#1435