[GH-ISSUE #6942] Ollama bricks chromium based apps on mac #4394

Closed
opened 2026-04-12 15:20:07 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @skakwy on GitHub (Sep 24, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6942

What is the issue?

I've got a weird issue where Chromium-based apps stop working due to an ERR_ADDRESS_INVALID error. At first I thought it was some kind of problem with Chromium and tried out a few different things, but nothing worked until I stopped Ollama. Ever since I got Ollama, that weird error has started popping up randomly; without Ollama running in the background, I wasn't able to reproduce it.
If I try to use Ollama, I get either `Error: pull model manifest: file does not exist` or `Error: Head "http://127.0.0.1:11434/": dial tcp 127.0.0.1:11434: connect: can't assign requested address`, though I usually get the first one before the second.
Honestly I have no idea what could cause this issue. Reinstalling might fix it, but since I seem to be the only one with this problem, it might be good to look into it further. I attached the log file; feel free to ask if I need to provide anything else.

What I tried to fix this issue (before knowing it's probably Ollama's fault): disabled proxies, disabled IPv6, flushed DNS records, turned off any Apple network safety features. Tried different macOS versions (now on 15.0) and different networks.
[server.log](https://github.com/user-attachments/files/17121474/server.log)

Edit:
It might be because of the ports that Ollama uses, but I don't really think Chromium needs port 11434 for it to work. Or does Ollama use other ports as well?

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.3.11

GiteaMirror added the bug label 2026-04-12 15:20:07 -05:00
Author
Owner

@rick-github commented on GitHub (Sep 25, 2024):

Do you have a process that is querying the Ollama server for the model list? Something is querying it thousands of times per second, which may be exhausting the open ports on your system and causing problems for other processes that use networking:

```console
$ egrep '/api/(show|tags)' server.log | awk '{print $4}' | uniq -c | head
    188 01:45:33
   1456 01:45:34
   2460 01:45:35
   2516 01:45:36
   2300 01:45:37
   2176 01:45:38
   2100 01:45:39
   2190 01:45:40
    941 01:45:41
      1 01:45:42
```

Author
Owner

@skakwy commented on GitHub (Sep 25, 2024):

I do use a VS Code extension called Continue. I wasn't able to reproduce the error without the extension, so I guess it's Continue's fault. Thanks for helping.

Author
Owner

@pmochine commented on GitHub (Sep 30, 2024):

Thank you @skakwy, I was also hit by this bug. I had just installed Continue, and I guess that's the reason, haha.

Reference: github-starred/ollama#4394