[GH-ISSUE #6204] The Quickstart section in README is missing the 'ollama start' command #29635

Closed
opened 2026-04-22 08:40:29 -05:00 by GiteaMirror · 22 comments

Originally created by @yurivict on GitHub (Aug 6, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6204

What is the issue?

People who run ollama for the first time wouldn't know that `ollama start` needs to be run first.

OS

Linux

GPU

No response

CPU

Intel

Ollama version

0.3.4

GiteaMirror added the bug label 2026-04-22 08:40:29 -05:00

@rick-github commented on GitHub (Aug 6, 2024):

If the user has installed via the methods listed in the README, the ollama service should be started automatically. The manual install instructions recommend installing a service to run the downloaded binary, but you are right that if they skip that recommendation, the ollama server won't be running.


@yurivict commented on GitHub (Aug 6, 2024):

I installed from the FreeBSD port, which builds from source. The user only has the `ollama` command.


@cfjedimaster commented on GitHub (Sep 3, 2024):

I just ran into this as well. I did the Linux install and everything looked fine: `ollama` at the CLI returned a list of commands. Then I went to the quickstart and hit this exact issue.


@cfjedimaster commented on GitHub (Sep 3, 2024):

As an aside, `start` isn't even listed in the available commands when you type `ollama`. Making a PR now with a suggested fix.


@rick-github commented on GitHub (Sep 3, 2024):

If you run `ollama start`, it will start the ollama server in your own environment: models will be stored in `~/.ollama/models` instead of `/usr/share/ollama/.ollama/models`, environment variable changes made in the ollama service file won't affect the server, and so on. If you used the `curl -fsSL https://ollama.com/install.sh | sh` install method and the server didn't start, it would be better to figure out why that failed. What does `journalctl -u ollama --no-pager` show?


@cfjedimaster commented on GitHub (Sep 3, 2024):

It returns `No journal files were found`. I'll add that when I ran the install script, everything certainly seemed to go OK; I don't remember any issues.


@rick-github commented on GitHub (Sep 3, 2024):

What Linux distro are you using?


@cfjedimaster commented on GitHub (Sep 3, 2024):

Ubuntu 22.04.4 LTS, via WSL.


@rick-github commented on GitHub (Sep 3, 2024):

What's the output of `systemctl status ollama.service`?


@cfjedimaster commented on GitHub (Sep 3, 2024):

```
System has not been booted with systemd as init system (PID 1). Can't operate.
Failed to connect to bus: Host is down
```

@rick-github commented on GitHub (Sep 3, 2024):

OK, it seems systemd is not working, which would explain why the ollama server wasn't started. Not sure why; was it a desktop or server image (or something special for WSL)?

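A quick way to check the condition described above — whether systemd is actually running as PID 1 — is a one-line probe (a minimal sketch; should work in any POSIX shell):

```shell
# If PID 1 is not systemd, systemd units (including ollama.service)
# cannot run, and the server has to be started by hand.
if [ "$(ps -p 1 -o comm=)" = "systemd" ]; then
  echo "systemd is PID 1"
else
  echo "systemd is not PID 1 (PID 1 is: $(ps -p 1 -o comm=))"
fi
```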

@cfjedimaster commented on GitHub (Sep 3, 2024):

Eh... this I can't say. I've been a WSL user for 5+ years or so, but am mostly a casual user of Linux. I prefer it over cmd.exe, PowerShell, etc, and know it better, but just enough to be dangerous. ;)


@cfjedimaster commented on GitHub (Sep 3, 2024):

Looks like systemd support in WSL is *very* new (https://learn.microsoft.com/en-us/windows/wsl/systemd), which may be why it's not running on my side.

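For reference, the Microsoft page linked above boils down to a single config fragment inside the distro (requires a reasonably recent WSL version):

```
# /etc/wsl.conf (inside the WSL distro)
[boot]
systemd=true
```

After saving, run `wsl --shutdown` from Windows and reopen the distro; `systemctl` should then work.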

@cfjedimaster commented on GitHub (Sep 3, 2024):

In theory (and this could be a new issue), shouldn't the install script have thrown an error of some sort? I'm looking at the output now and nothing at all seems amiss:

```
ray@Hoth:~/projects/raymondcamden2023$ curl -fsSL https://ollama.com/install.sh | sh
>>> Installing ollama to /usr/local
[sudo] password for ray:
>>> Downloading Linux amd64 bundle
######################################################################## 100.0%##O#-#
######################################################################## 100.0%
>>> Creating ollama user...
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Nvidia GPU detected.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
```

@cfjedimaster commented on GitHub (Sep 3, 2024):

FYI, I followed the directions at the link above and systemd is now running, and the command you shared earlier now shows the Ollama Service.


@rick-github commented on GitHub (Sep 3, 2024):

Yes, it looks like an issue with the installer: it checks for a running systemd, but if there isn't one, it doesn't notify the user that the service won't be started automatically.

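A minimal sketch of the kind of guard the install script could add (hypothetical wording, not the actual install.sh):

```shell
# Hypothetical install.sh guard: warn when systemd is not PID 1, since the
# ollama.service unit will then never be started automatically.
if [ "$(ps -p 1 -o comm= 2>/dev/null)" != "systemd" ]; then
  echo "WARNING: systemd is not running as PID 1." >&2
  echo "The ollama service will not start automatically; run 'ollama serve' manually." >&2
fi
```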

@cfjedimaster commented on GitHub (Sep 3, 2024):

I'll file a new bug. I do think the quickstart could use a quick note (like the one I added in my PR), though. :)


@rick-github commented on GitHub (Sep 3, 2024):

The problem with having other ways to start the server is that it multiplies the support burden. For example, if you file a bug with ollama, one of the frequent responses is to ask for logs, and as demonstrated, those logs are not available if you used `ollama start`.

Is there any reason you are not using the Windows version of ollama?


@cfjedimaster commented on GitHub (Sep 3, 2024):

But don't you agree that if the script tries to use systemd and fails, it should report it? Obviously the issue was on my side, but with no output from the script about the failure, it took longer to debug.

As for your second question: I generally use the Linux version of CLIs, as I usually spend my time in WSL.


@rick-github commented on GitHub (Sep 3, 2024):

Yes, I agree that the script should notify the user that the system won't perform as expected due to the lack of a working systemd.


@dhiltgen commented on GitHub (Sep 5, 2024):

BSD support isn't part of the official release; this is tracked via #1102.

The quickstart guide is intended to cover officially supported binary distributions. It sounds like there is a port out there for BSD which does not include any system service configuration, nor documentation on how to use it. The requested docs seem like they belong there.

When building from source, we do discuss the `serve` command (`start` is an alias) in the main README: https://github.com/ollama/ollama/tree/main?tab=readme-ov-file#building

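On a system with no service manager, the workflow above amounts to running the server yourself and pointing the CLI at it. A sketch of a pre-flight check (assuming `curl` is available; 127.0.0.1:11434 is the default listen address):

```shell
# Is an ollama server already listening on the default address?
if curl -fsS http://127.0.0.1:11434 >/dev/null 2>&1; then
  echo "ollama server is already running"
else
  # `ollama start` is an alias for `ollama serve`.
  echo "no server on 127.0.0.1:11434; run 'ollama serve' in another terminal first"
fi
```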

@yurivict commented on GitHub (Sep 5, 2024):

> It sounds like there is a port out there for BSD which does not include any system service configuration, nor documentation on how to use it.

Wrong. The FreeBSD port comes with this initial documentation packaged as a package message:
https://cgit.freebsd.org/ports/tree/misc/ollama/pkg-message

My suggestion was for people who attempt to install ollama using this repository.

Reference: github-starred/ollama#29635