[GH-ISSUE #1117] Change Default 11434 Port & fw question #562

Closed
opened 2026-04-12 10:15:19 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @jjsarf on GitHub (Nov 14, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1117

Does anyone know how to change Ollama's default port?

Also how do we allow other computers to hit the /generate api?

Thanks,
John


@jmorganca commented on GitHub (Nov 14, 2023):

Hi @jjsarf, you can use the `OLLAMA_HOST` environment variable in combination with `ollama serve`.

E.g. to expose Ollama externally on port 8080 you can use:

```
OLLAMA_HOST=0.0.0.0:8080 ollama serve
```

Feel free to post another issue! Will close this one for now
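[Editor's note] On Linux installs where Ollama runs as a systemd service, setting the variable in an interactive shell only affects a manually started `ollama serve`, not the background service. A hedged sketch of the service-level override (this assumes the unit is named `ollama`, as created by the standard Linux installer; adjust if yours differs):

```
# Open a drop-in override for the service:
sudo systemctl edit ollama

# In the editor, add the following lines and save:
[Service]
Environment="OLLAMA_HOST=0.0.0.0:8080"

# Apply the new bind address:
sudo systemctl daemon-reload
sudo systemctl restart ollama
```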


@jjsarf commented on GitHub (Nov 14, 2023):

That did not work fully.
```
super@mario:/var/www/html$ curl -X POST http://ai:8080/api/generate -d '{
  "model": "llama2",
  "prompt": "How are you?",
  "stream": true,
  "options": {
    "top_k": 20,
    "top_p": 0.9,
    "typical_p": 0.7,
    "temperature": 0.8,
    "repeat_penalty": 1.2,
    "presence_penalty": 1.5,
    "frequency_penalty": 1.0
  }
}'
curl: (7) Failed to connect to ai port 8080 after 230 ms: Connection refused
```

I also tried with the IP:

```
C:\usb\laragon-portable\se>curl -X POST http://10.0.0.252:8080/api/generate -d "{\"model\": \"llama2\", \"prompt\": \"Why is the sky blue?\", \"stream\": true, \"options\": { \"top_k\": 20, \"top_p\": 0.9, \"typical_p\": 0.7, \"temperature\": 0.8, \"repeat_penalty\": 1.2, \"presence_penalty\": 1.5, \"frequency_penalty\": 1.0 }}"
curl: (7) Failed to connect to 10.0.0.252 port 8080: Connection refused
```
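[Editor's note] A "connection refused" from a remote client usually means the daemon is bound only to loopback, or a host firewall is blocking the port. A hedged checklist to run on the server (`ss`, `curl`, and `ufw` are assumptions about the environment; substitute your own tools, e.g. `firewalld` instead of `ufw`):

```
# 1. Confirm the listener exists and is on 0.0.0.0, not 127.0.0.1:
ss -tlnp | grep 8080

# 2. Confirm the API answers locally on the new port:
curl http://127.0.0.1:8080/api/tags

# 3. If both pass but remote clients are still refused, open the port
#    in the host firewall (ufw example):
sudo ufw allow 8080/tcp
```

If step 1 shows `127.0.0.1:8080`, the `OLLAMA_HOST=0.0.0.0:8080` setting never reached the running process.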


@jjsarf commented on GitHub (Nov 14, 2023):

But it did help: running it locally on the machine where Ollama is installed (the "ai" host) works, just not from the external IP.

Reference: github-starred/ollama#562