[GH-ISSUE #4132] model run command not rendered on mobile #2568

Closed
opened 2026-04-12 12:54:13 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @olumolu on GitHub (May 3, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4132

Originally assigned to: @BruceMacD on GitHub.

https://ollama.com/library/phi3:3.8b
On the page, the installation command is not shown the way it is for llama and gemma.

GiteaMirror added the feature request, ollama.com labels 2026-04-12 12:54:13 -05:00
Author
Owner

@thinkverse commented on GitHub (May 3, 2024):

The installation command is written in the input to the right of the tag drop-down. It's `ollama run <model>` for an interactive session and `ollama pull <model>` to just download the model without an interactive session. It's the same CLI commands for all models.

<img width="771" alt="Screenshot 2024-05-03 at 22 44 06" src="https://github.com/ollama/ollama/assets/2221746/f24ed390-6f45-4d58-8632-68a3e6735299">
Author
Owner

@olumolu commented on GitHub (May 3, 2024):

Not visible on a mobile device.

![Screenshot_20240504-044317~2](https://github.com/ollama/ollama/assets/162728301/9422cfd6-ce9d-4b4b-90a0-7abc2cc587cb)
Author
Owner

@olumolu commented on GitHub (May 3, 2024):

This is for llama3:

![Screenshot_20240504-044542~2](https://github.com/ollama/ollama/assets/162728301/b7d8a02c-9794-452b-90a9-7033a4c96572)
Author
Owner

@thinkverse commented on GitHub (May 3, 2024):

> Not visible on a mobile device.

Not sure why you'd use a mobile device to access the website given Ollama is a desktop/server app. But either way, `ollama run <model>` and `ollama pull <model>`. It's the same for all models regardless of what's in their respective READMEs if you're using the `ollama` CLI.

To grab a specific tag of the model, add the tag after the model name, `<model>[:tag]`, e.g.:

```shell
ollama run llama3:70b
```
Author
Owner

@olumolu commented on GitHub (May 4, 2024):

I use mobile because it's easier: I can look things up and apply them on the server directly. Otherwise you need to SSH in and use a laptop to do anything. You can also remove OpenSSH, so you can't be affected by the xz vulnerability.
Author
Owner

@hoyyeva commented on GitHub (May 14, 2025):

Hi @olumolu

Thank you for reporting the issue. We've just deployed a new version of the model page today, and the issue is no longer relevant as we've also added the model run command back to mobile.
Author
Owner

@olumolu commented on GitHub (May 14, 2025):

> Hi @olumolu
>
> Thank you for reporting the issue. We've just deployed a new version of the model page today, and the issue is no longer relevant as we've also added the model run command back to mobile.

Thank you for fixing this, I really appreciate it.
Author
Owner

@olumolu commented on GitHub (May 14, 2025):

> Hi @olumolu
>
> Thank you for reporting the issue. We've just deployed a new version of the model page today, and the issue is no longer relevant as we've also added the model run command back to mobile.

Hi, I found another error. Can you please look into it?

Description:
1. Open Explore models. For example, I took qwen3.
2. Open https://ollama.com/library/qwen3:30b
3. You will see the model page. Tap "View all" and it will show all the quantized versions; tap any non-default version.
4. After it opens, you no longer have the "View all" option, so you have to go back twice to change to a different model, which is bad by design.

Here are some screenshots:

![Image](https://github.com/user-attachments/assets/26cead3a-13bc-46e0-8503-b4c680b1f942)
![Image](https://github.com/user-attachments/assets/4aa02750-339b-4614-b0fe-444081731ff9)
![Image](https://github.com/user-attachments/assets/a92fba5f-5cf8-46fb-b5f9-d55c3620ce1b)

Reference: github-starred/ollama#2568