[GH-ISSUE #3071] Unable to get ollama serve working #48401

Closed
opened 2026-04-28 08:05:08 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @harsham05 on GitHub (Mar 12, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3071

I have installed Ollama and the Ollama python client on Ubuntu. I am unable to interact with it using the Ollama Python client.

```
$ ollama list
NAME            ID              SIZE    MODIFIED
llama2:70b      e7f6c06ffef4    38 GB   2 hours ago
llama2:latest   78e26419b446    3.8 GB  4 days ago
```

```
$ OLLAMA_HOST=127.0.0.1:7656 ollama serve
```

When I try to interact with ollama in Python, I get a ResponseError. Thank you in advance.

```python
import ollama

response = ollama.chat(model='llama2', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])
```
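One likely cause: `ollama.chat` uses the client's default endpoint, `http://localhost:11434`, while the server above was started on port 7656, so the default client has nothing to talk to. A minimal stdlib-only sketch can confirm which port is actually listening (the `is_up` helper is illustrative, not part of the Ollama API):

```python
import urllib.error
import urllib.request

def is_up(url, timeout=2.0):
    """Return True if an HTTP server answers at `url` (illustrative helper)."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except (urllib.error.URLError, OSError):
        return False

# With `OLLAMA_HOST=127.0.0.1:7656 ollama serve` running, the custom
# port responds while the client's default port (11434) does not.
```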
<img width="1214" alt="ResponseError traceback screenshot" src="https://github.com/ollama/ollama/assets/8755540/b5bbdcc9-845f-477d-9b3d-22f56d977be1">
Author
Owner

@pdevine commented on GitHub (Mar 12, 2024):

@harsham05 if you want to connect to Ollama on a custom port, you'll have to create a custom client to do this (this is covered in the [ollama-python readme](https://github.com/ollama/ollama-python)).

An example:

```python
from ollama import Client

client = Client(host='http://127.0.0.1:7656')
response = client.chat(model='llama2', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
```

I'm going to go ahead and close the issue, but feel free to keep commenting.
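Note that the `OLLAMA_HOST` value passed to `ollama serve` is a bare `host:port`, while `Client(host=...)` in the example above takes a full URL. A small conversion helper makes the relationship explicit (`host_to_url` is a hypothetical name, not part of ollama-python):

```python
def host_to_url(host: str, scheme: str = "http") -> str:
    """Convert an OLLAMA_HOST-style value to a base URL (illustrative)."""
    if host.startswith(("http://", "https://")):
        return host  # already a full URL; pass through unchanged
    return f"{scheme}://{host}"

# host_to_url("127.0.0.1:7656") -> "http://127.0.0.1:7656"
```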


Reference: github-starred/ollama#48401