[GH-ISSUE #15533] Error: failed to initialize MLX: libmlxc.dylib not found #35685

Closed
opened 2026-04-22 20:22:04 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @andrew-astetik on GitHub (Apr 13, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15533

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

$ curl http://localhost:11434/api/generate -d '{
  "model": "x/z-image-turbo",
  "prompt": "a sunset over mountains",
  "width": 512,
  "height": 512,
  "steps": 4,
  "seed": 42,
  "stream": false
}'
{"error":"mlx runner failed: Error: failed to initialize MLX: libmlxc.dylib not found (exit: exit status 1)"}
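For triage, a quick check for whether the library actually shipped with the install may help. This is a hypothetical sketch: the `/Applications/Ollama.app` path and bundle layout are assumptions, so point it at wherever your install actually lives.

```shell
# check_mlx_lib: report whether libmlxc.dylib exists anywhere under a directory.
# The default /Applications/Ollama.app path is an assumption about where the
# macOS app installs; pass your actual install location as the first argument.
check_mlx_lib() {
  dir="${1:-/Applications/Ollama.app}"
  if [ -d "$dir" ] && find "$dir" -name 'libmlxc.dylib' 2>/dev/null | grep -q .; then
    echo "found"
  else
    echo "missing"
  fi
}

check_mlx_lib
```

If this prints "missing" on an affected install, the library was likely dropped from the packaging rather than failing to load.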

Relevant log output

time=2026-04-13T09:40:30.596+05:00 level=INFO source=sched.go:484 msg="system memory" total="48.0 GiB" free="31.0 GiB" free_swap="0 B"
time=2026-04-13T09:40:30.596+05:00 level=INFO source=sched.go:491 msg="gpu memory" id=0 library=Metal available="35.5 GiB" free="36.0 GiB" minimum="512.0 MiB" overhead="0 B"
time=2026-04-13T09:40:30.599+05:00 level=INFO source=server.go:171 msg="starting mlx runner subprocess" model=x/z-image-turbo:latest port=58548
time=2026-04-13T09:40:30.600+05:00 level=INFO source=sched.go:561 msg="loaded runners" count=1
time=2026-04-13T09:40:30.612+05:00 level=WARN source=server.go:164 msg=mlx-runner msg="time=2026-04-13T09:40:30.612+05:00 level=INFO msg=\"starting mlx runner\" model=x/z-image-turbo:latest port=58548 mode=imagegen"
time=2026-04-13T09:40:30.612+05:00 level=WARN source=server.go:164 msg=mlx-runner msg="time=2026-04-13T09:40:30.612+05:00 level=ERROR msg=\"unable to initialize MLX\" error=\"failed to initialize MLX: libmlxc.dylib not found\""
time=2026-04-13T09:40:30.612+05:00 level=WARN source=server.go:164 msg=mlx-runner msg="Error: failed to initialize MLX: libmlxc.dylib not found"
time=2026-04-13T09:40:30.612+05:00 level=ERROR source=sched.go:567 msg="error loading llama server" error="mlx runner failed: Error: failed to initialize MLX: libmlxc.dylib not found (exit: exit status 1)"
time=2026-04-13T09:40:30.613+05:00 level=INFO source=server.go:363 msg="stopping mlx runner subprocess" pid=40146
[GIN] 2026/04/13 - 09:40:30 | 500 |   97.157125ms |       127.0.0.1 | POST     "/api/generate"

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.20.6

GiteaMirror added the bug label 2026-04-22 20:22:04 -05:00
Author
Owner

@rosariogueli commented on GitHub (Apr 13, 2026):

I updated my Ollama just now, and I get the same error "libmlxc.dylib not found". It was fine just before I upgraded.
MacBook Pro M4 Max, 64 GB RAM

Author
Owner

@dehain commented on GitHub (Apr 13, 2026):

Same here.

> I updated my Ollama just now, and I get the same error "libmlxc.dylib not found". It was fine just before I upgraded.

Author
Owner

@SnowZhangSN commented on GitHub (Apr 14, 2026):

Same here.

macOS 26.2, M3 Max, x/z-image-turbo:latest

Downgrading to 0.20.3 works.

Author
Owner

@navidabdi commented on GitHub (Apr 15, 2026):

Same here; I had to downgrade to 0.20.3 to make it work.

Run the following commands to stop the current version and install the working version, which in this case is 0.20.3:

pkill Ollama 2>/dev/null || true

curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.20.3 sh
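To confirm the downgrade took effect, a small hedged check can parse the reported version. The exact wording of the `ollama --version` output is an assumption here; the sketch just extracts the first x.y.z number from it.

```shell
# pick_version: pull the first x.y.z version number out of whatever
# `ollama --version` prints (the exact phrasing of that output is an assumption).
pick_version() {
  grep -Eo '[0-9]+\.[0-9]+\.[0-9]+' | head -n 1
}

# Only run the real check if the ollama binary is on PATH.
if command -v ollama >/dev/null 2>&1; then
  ollama --version | pick_version   # expect 0.20.3 after the downgrade
fi
```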

Reference: github-starred/ollama#35685