[GH-ISSUE #14200] Error: loading z-image-turbo #55766

Closed
opened 2026-04-29 09:42:34 -05:00 by GiteaMirror · 10 comments

Originally created by @d0r1h on GitHub (Feb 11, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14200

What is the issue?

I'm trying to run **ollama run x/z-image-turbo** ([WebLink](https://ollama.com/x/z-image-turbo)), but I'm getting the following error:

```
Error: failed to load model: 500 Internal Server Error: mlx runner failed: Error: failed to load MLX function symbols (exit: exit status 1)
```

Relevant log output

```shell
time=2026-02-11T14:33:44.463+05:30 level=INFO source=server.go:147 msg="starting mlx runner subprocess" exe=/opt/homebrew/Cellar/ollama/0.15.5/bin/ollama model=x/z-image-turbo:latest port=52607 mode=imagegen
time=2026-02-11T14:33:44.471+05:30 level=WARN source=server.go:140 msg=mlx-runner msg="MLX: Failed to load symbol: mlx_metal_device_info"
time=2026-02-11T14:33:44.473+05:30 level=WARN source=server.go:140 msg=mlx-runner msg="MLX: Failed to load symbol: mlx_metal_device_info"
time=2026-02-11T14:33:44.473+05:30 level=WARN source=server.go:140 msg=mlx-runner msg="time=2026-02-11T14:33:44.473+05:30 level=ERROR msg=\"unable to initialize MLX\" error=\"failed to load MLX function symbols\""
time=2026-02-11T14:33:44.473+05:30 level=WARN source=server.go:140 msg=mlx-runner msg="Error: failed to load MLX function symbols"
time=2026-02-11T14:33:44.474+05:30 level=INFO source=server.go:362 msg="stopping mlx runner subprocess" pid=52067
```

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.15.5

GiteaMirror added the bug label 2026-04-29 09:42:34 -05:00

@tyfiero commented on GitHub (Feb 12, 2026):

same issue here


@SnowZhangSN commented on GitHub (Feb 13, 2026):

Same issue here. Ollama version is 0.16.1.

```
ollama run x/z-image-turbo:latest
Error: failed to load model: 500 Internal Server Error: mlx runner failed: model.norm.weight (exit: exit status 1)
```


@msis commented on GitHub (Feb 13, 2026):

This is a duplicate of #14118.


@senki commented on GitHub (Feb 13, 2026):

I can confirm that for me, upgrading to 0.15.6 solves the error.


@slyapustin commented on GitHub (Feb 13, 2026):

@senki Which macOS version do you have?


@kaiwanyawit-chawankul commented on GitHub (Feb 13, 2026):

@slyapustin Not that guy, but I hit the same error after upgrading my Mac to 26.3 (25D125). My Ollama is 0.16.1.


@kaiwanyawit-chawankul commented on GitHub (Feb 13, 2026):

> I can confirm that for me, upgrading to 0.15.6 solves the error.

Downgrading to 0.15.6 fixes the issue: https://github.com/ollama/ollama/releases/download/v0.15.6/Ollama.dmg

Thanks @senki

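For anyone scripting the workaround above, a minimal sketch of the decision (assuming the last known-good release is 0.15.6, as reported in this thread; the hardcoded `installed` value stands in for the output of `ollama --version`):

```shell
#!/bin/sh
# Compare the installed Ollama version against the last known-good
# release (0.15.6) mentioned in this thread, using version-aware sort.
installed="0.16.1"   # substitute with: ollama --version | awk '{print $NF}'
known_good="0.15.6"

# sort -V orders version strings numerically; if known_good sorts first
# and the two differ, the installed build is newer than known-good.
lower="$(printf '%s\n%s\n' "$installed" "$known_good" | sort -V | head -n1)"
if [ "$lower" = "$known_good" ] && [ "$installed" != "$known_good" ]; then
  echo "Installed $installed is newer than known-good $known_good; consider downgrading."
  # e.g. fetch the DMG linked above:
  # curl -LO https://github.com/ollama/ollama/releases/download/v0.15.6/Ollama.dmg
fi
```

This only flags the mismatch; actually downgrading still means installing the older DMG (or pinning the Homebrew formula) by hand.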

@d0r1h commented on GitHub (Feb 14, 2026):

Thanks @senki

I've upgraded and it solved the issue; I'm able to generate images now.

So I'm closing the issue.

Thanks, everyone.


@slyapustin commented on GitHub (Feb 14, 2026):

@d0r1h I don't think it's actually solved, but **downgrading** to the old version seems to work around the issue.


@senki commented on GitHub (Feb 18, 2026):

My apologies for the late reply.

If anyone needs that information later, it's macOS Tahoe; the version was either 26.2 or 26.3 at the time.

Reference: github-starred/ollama#55766