[GH-ISSUE #2050] general Question #1185

Closed
opened 2026-04-12 10:58:05 -05:00 by GiteaMirror · 6 comments

Originally created by @ghost on GitHub (Jan 18, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2050

Is there any way to run Ollama models on a computer without a GPU?


@easp commented on GitHub (Jan 18, 2024):

Yes. Just install Ollama, pull a model and run it.

Speeds won't be great, so you'll probably want to focus on smaller models (≤7B parameters). Mistral and its fine-tunes are pretty good. There are also smaller models like phi, orca-mini, and tinyllama, but they can be pretty "dumb."
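
For concreteness, a minimal CPU-only session might look something like this (assuming Ollama is already installed; `mistral` is just one example of a small model):

```shell
# Pull a small model, then chat with it; no GPU required
ollama pull mistral
ollama run mistral
```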


@ghost commented on GitHub (Jan 20, 2024):

![image](https://github.com/jmorganca/ollama/assets/87431619/0bb3f747-fee8-44aa-9a03-b99a01900731)

It doesn't let me!


@easp commented on GitHub (Jan 20, 2024):

You have the server running now.

Open another terminal window and run `ollama run mistral`.
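
In other words, something like this (assuming the first terminal is occupied by the server process, e.g. `ollama serve`):

```shell
# Terminal 1 (already running):
#   ollama serve
# Terminal 2: connect a chat session to the running server
ollama run mistral
```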


@ghost commented on GitHub (Jan 24, 2024):

Is there any other really lightweight model you'd recommend that has features like saving conversation history?


@ghost commented on GitHub (Jan 24, 2024):

Is there any other model that's lightweight (under 10 GB) but still fast and not dumb?


@pdevine commented on GitHub (Mar 11, 2024):

You can save the conversation history with `/save <model name>` inside of the REPL, and then when you go to use it again you can use `ollama run <model name>` with the name of the model you saved. In terms of lightweight models, I think `phi` is probably the smallest. Use `ollama run phi` to pull/start using it.
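
A quick sketch of that flow, where `my-chat` is just a placeholder name for the saved session:

```shell
ollama run phi           # pulls phi on first use, then opens the REPL
# ... chat, then inside the REPL:
#   /save my-chat        # saves the session (including history) as a model
#   /bye
ollama run my-chat       # later: resume from the saved history
```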

I'm going to go ahead and close the issue, but feel free to keep commenting.
