[GH-ISSUE #1004] Feature request: Make ollama run also launch serve as child process #487

Closed
opened 2026-04-12 10:09:58 -05:00 by GiteaMirror · 3 comments

Originally created by @Anrock on GitHub (Nov 4, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1004

I'm using ollama mostly for fun and occasional queries, a couple of times per day.

Right now one has to run `ollama serve` in one terminal and then run `ollama run` in another, which is a bit clunky.
Another option is to keep `ollama serve` running in the background constantly; however, running a model consumes a substantial amount of RAM and VRAM.

Would it be possible to make `ollama run` also launch `serve` as a child process and shut it down when the user exits the prompt, if some option is given to `run`?
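Until such an option exists, a small wrapper script can approximate the requested behavior. This is only a sketch, assuming `ollama` is on `PATH` and that `ollama list` fails until the server is reachable:

```shell
#!/bin/sh
# Hypothetical wrapper: start `ollama serve` in the background,
# wait until it accepts requests, run the interactive prompt,
# then stop the server when the prompt exits.
ollama serve &
serve_pid=$!
# ensure the server is stopped however this script exits
trap 'kill "$serve_pid" 2>/dev/null' EXIT
# poll until the server responds (`ollama list` talks to the local API)
until ollama list >/dev/null 2>&1; do
  sleep 0.2
done
ollama run "$@"
```

The `trap … EXIT` is what gives the "shut it down when the user exits" behavior, including on Ctrl-C.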


@wrapss commented on GitHub (Nov 4, 2023):

You can create a service that automatically starts ollama, as described in the Manual install instructions.
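For reference, the systemd unit from the Linux Manual install instructions looks roughly like this (the `ExecStart` path and the `ollama` user/group may differ on your system):

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always

[Install]
WantedBy=default.target
```

Enable it with `systemctl enable --now ollama` so the server starts at boot.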


@Anrock commented on GitHub (Nov 4, 2023):

@wrapss I know. As I mentioned in the second paragraph, it will constantly occupy a substantial amount of resources while running, or am I wrong?


@Anrock commented on GitHub (Nov 4, 2023):

Oh, I see. As soon as I exit the prompt, the server's resource consumption drops to almost zero.

Reference: github-starred/ollama#487