[GH-ISSUE #2928] Error: could not connect to ollama app, is it running? #1793

Closed
opened 2026-04-12 11:49:21 -05:00 by GiteaMirror · 7 comments

Originally created by @ttkrpink on GitHub (Mar 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2928

I think I removed everything and reinstalled ollama on Ubuntu 22.04. After a “fresh” install, the command line cannot connect to the ollama app.

```
Ubuntu: ~ $ curl -fsSL https://ollama.com/install.sh | sh
>>> Downloading ollama...
######################################################################## 100.0%#=#=-#  #
######################################################################## 100.0%
>>> Installing ollama to /usr/local/bin...
[sudo] password for user:
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
>>> NVIDIA GPU installed.
Ubuntu: ~ $ ollama
Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
Ubuntu: ~ $ ollama list
Error: could not connect to ollama app, is it running?
Ubuntu: ~ $ sudo service ollama status
● ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: enabled)
    Drop-In: /etc/systemd/system/ollama.service.d
             └─override.conf
     Active: active (running) since Tue 2024-03-05 16:04:03 CST; 28s ago
   Main PID: 605340 (ollama)
      Tasks: 27 (limit: 76514)
     Memory: 486.3M
        CPU: 7.895s
     CGroup: /system.slice/ollama.service
             └─605340 /usr/local/bin/ollama serve

Mar 05 16:04:03 Ubuntu ollama[605340]: time=2024-03-05T16:04:03.849+08:00 level=INFO source=images.go:717 msg="total unused blobs removed: 0"
Mar 05 16:04:03 Ubuntu ollama[605340]: time=2024-03-05T16:04:03.849+08:00 level=INFO source=routes.go:1019 msg="Listening on [::]:33020 (version 0.1.27)"
Mar 05 16:04:03 Ubuntu ollama[605340]: time=2024-03-05T16:04:03.850+08:00 level=INFO source=payload_common.go:107 msg="Extracting dynamic libraries..."
Mar 05 16:04:08 Ubuntu ollama[605340]: time=2024-03-05T16:04:08.164+08:00 level=INFO source=payload_common.go:146 msg="Dynamic LLM libraries [cpu_avx cpu >
Mar 05 16:04:08 Ubuntu ollama[605340]: time=2024-03-05T16:04:08.164+08:00 level=INFO source=gpu.go:94 msg="Detecting GPU type"
Mar 05 16:04:08 Ubuntu ollama[605340]: time=2024-03-05T16:04:08.164+08:00 level=INFO source=gpu.go:265 msg="Searching for GPU management library libnvidia>
Mar 05 16:04:08 Ubuntu ollama[605340]: time=2024-03-05T16:04:08.168+08:00 level=INFO source=gpu.go:311 msg="Discovered GPU libraries: [/usr/lib/x86_64-lin>
Mar 05 16:04:08 Ubuntu ollama[605340]: time=2024-03-05T16:04:08.175+08:00 level=INFO source=gpu.go:99 msg="Nvidia GPU detected"
Mar 05 16:04:08 Ubuntu ollama[605340]: time=2024-03-05T16:04:08.175+08:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
Mar 05 16:04:08 Ubuntu ollama[605340]: time=2024-03-05T16:04:08.191+08:00 level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 8.6"
```
The override.conf drop-in contains:

```
[Service]
Environment="OLLAMA_HOST=0.0.0.0:33020"
```

I have to run `ollama serve` first before I can pull model files. If I check the service ports, both 33020 and 11434 are listening.
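
To verify that, a quick sketch for listing the listeners (assuming `ss` from iproute2, which Ubuntu 22.04 ships by default):

```
# Expect two listeners: 0.0.0.0:33020 from the systemd service and
# 127.0.0.1:11434 from the manually launched `ollama serve`.
sudo ss -tlnp | grep ollama
```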

If ollama is running as a service, am I supposed to be able to download model files directly, without launching another `ollama serve` from the command line?

Thanks


@xuqihua commented on GitHub (Mar 6, 2024):

+1


@mxyng commented on GitHub (Mar 6, 2024):

You've overridden OLLAMA_HOST, so the service serves on port 33020. The ollama CLI uses port 11434 by default, so unless you tell it to use 33020, it will try 11434, which isn't open. That's why you needed to run `ollama serve` before you could pull a model.

Try this:

```
OLLAMA_HOST=127.0.0.1:33020 ollama list
```
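
To make this permanent, two hedged options (sketches, not an officially documented procedure): point the CLI at the service's port in your shell profile, or move the service back to the default port.

```
# Option 1: make every CLI call use the service's port
# (append to ~/.bashrc or equivalent).
export OLLAMA_HOST=127.0.0.1:33020

# Option 2: change the drop-in back to the default port and restart.
sudo systemctl edit ollama      # set OLLAMA_HOST=0.0.0.0:11434, or remove the line
sudo systemctl restart ollama
```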

@mcgillg3141 commented on GitHub (Mar 12, 2024):

This did not work for me either. Fresh install, not working.


@Bolofofopt commented on GitHub (Mar 17, 2024):

Same for me: just installed and it's not working.


@mxyng commented on GitHub (Mar 18, 2024):

@mcgillg3141 @Bolofofopt please open a new issue with details on what's not working, because the root cause of the original issue has been identified.


@Bolofofopt commented on GitHub (Mar 18, 2024):

I've found how to solve it via Reddit: https://www.reddit.com/r/ollama/comments/196rbo7/cant_get_ollama_to_run_in_ubuntu/


@surfingtonio commented on GitHub (Feb 16, 2025):

In my case, systemd was not running. I had to open /etc/wsl.conf using:

`sudo nano /etc/wsl.conf`

and add:

```
[boot]
systemd=true
```

After that I restarted WSL and it worked!

Reference:
https://learn.microsoft.com/en-us/windows/wsl/systemd#how-to-enable-systemd
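
For anyone following along, a sketch of the restart step (the `wsl` commands run from Windows, not inside the distro):

```
# From PowerShell or cmd: stop all WSL distros so systemd starts on next launch.
wsl --shutdown
wsl

# Back inside the distro: confirm systemd is PID 1 and the service is up.
ps -p 1 -o comm=        # should print "systemd"
systemctl status ollama
```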
