[GH-ISSUE #1752] Ollama runs in Docker (hosted on the local machine) but not directly on the host #1003

Closed
opened 2026-04-12 10:42:54 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @Huertas97 on GitHub (Dec 31, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1752

It is quite strange.
I have deployed the Ollama container, and I can access its bash shell, load models, and chat with them. But when I install Ollama directly on the local system (the same machine that is running the Docker container) and try to chat with the same models (tested: tinyllama and mistral), it says:

`Error: llama runner exited, you may not have enough available memory to run this model`


@igorschlum commented on GitHub (Dec 31, 2023):

Hi @Huertas97, what is your OS and how much memory do you have? Try restarting your computer and running Ollama directly on the system to test whether it works. The memory allocated to Docker could still be held and unavailable. Are you using version 0.1.17 of Ollama?
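(Not part of the original comment: on a Linux host, the details asked for above can be gathered in one go; the `ollama` call is guarded in case it is not installed.)

```shell
# Collect OS, memory, and Ollama version in one shot (Linux).
grep PRETTY_NAME /etc/os-release                 # which OS
grep -E 'MemTotal|MemAvailable' /proc/meminfo    # how much memory
command -v ollama >/dev/null && ollama --version \
  || echo "ollama not on PATH"                   # which Ollama version
```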


@Huertas97 commented on GitHub (Jan 1, 2024):

Hi @igorschlum, thank you for your quick response.

TL;DR:
Shutting the machine down completely and starting it again after installing Ollama from the [Installation Doc](https://ollama.ai/download) worked for me. A simple restart was not enough.

More details if someone is facing similar problems:

My OS is Linux Lite, based on Ubuntu 22.04.3 LTS. These are the details:

```shell
> cat /etc/os-release
PRETTY_NAME="Linux Lite 6.6"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04.3 LTS (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=jammy
```

This OS is running on a machine with 16 GB of DDR4 RAM.
I have installed Ollama version 0.1.17.
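(An illustration added here, not from the original thread: the error above typically means the model's working set exceeds the memory free at load time. The ~4 GiB figure below is an assumed ballpark for a 4-bit-quantized 7B model such as mistral, not a number from the thread; a rough fit check against `MemAvailable` looks like this.)

```shell
# Rough memory-fit check (Linux). The ~4 GiB footprint is an
# assumed illustrative figure for a 4-bit-quantized 7B model.
need_kb=$((4 * 1024 * 1024))                              # ~4 GiB in kB
avail_kb=$(awk '/MemAvailable/ {print $2}' /proc/meminfo) # free-ish RAM
if [ "$avail_kb" -ge "$need_kb" ]; then
  echo "likely enough free memory"
else
  echo "low memory: free some RAM or power-cycle before loading"
fi
```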

Wrapping up: as proposed by @igorschlum, restarting the machine (shutting it down completely) after installing Ollama solves it.

So I am closing this issue thread.


Reference: github-starred/ollama#1003