[GH-ISSUE #15456] MLX initialization fails on Apple Silicon due to x86_64-only libmlxc.dylib #9879

Closed
opened 2026-04-12 22:44:31 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @milindmore22 on GitHub (Apr 9, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15456

What is the issue?

Issue Description

Describe the bug
When attempting to run a model (e.g., x/flux2-klein) on an Apple Silicon Mac (arm64), the model fails to load with an MLX initialization error. The error indicates that /Applications/Ollama.app/Contents/Resources/libmlxc.dylib is compiled for x86_64 and is missing the arm64 architecture slice.

While the main ollama CLI executable is correctly built as a Universal Binary (containing both x86_64 and arm64), the bundled libmlxc.dylib library inside the macOS app bundle appears to be Intel-only, causing dlopen to fail on Apple Silicon.
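The architecture mismatch the loader reports is visible directly in the Mach-O headers: a universal binary starts with the fat magic `0xCAFEBABE` and lists one `fat_arch` entry per slice, while a thin binary carries a single CPU type. The sketch below (illustrative only, not part of Ollama; the `macho_archs` helper and the synthetic header bytes are made up for demonstration) decodes just enough of the header to show why the CLI passes the check but an Intel-only dylib fails it:

```python
import struct

# Common Mach-O CPU type codes (only the ones relevant here).
CPU_TYPES = {0x01000007: "x86_64", 0x0100000C: "arm64",
             0x00000007: "i386", 0x0000000C: "arm"}

FAT_MAGIC = 0xCAFEBABE               # universal (fat) binary, big-endian header
THIN_BE = (0xFEEDFACE, 0xFEEDFACF)   # thin Mach-O, big-endian
THIN_LE = (0xCEFAEDFE, 0xCFFAEDFE)   # thin Mach-O, little-endian (usual on disk)

def macho_archs(data: bytes) -> list[str]:
    """List the architecture slices present in a Mach-O header."""
    (magic,) = struct.unpack(">I", data[:4])
    if magic == FAT_MAGIC:
        (nfat,) = struct.unpack(">I", data[4:8])
        archs = []
        for i in range(nfat):
            # each fat_arch entry is 20 bytes: cputype, cpusubtype, offset, size, align
            (cputype,) = struct.unpack(">I", data[8 + 20 * i: 12 + 20 * i])
            archs.append(CPU_TYPES.get(cputype, hex(cputype)))
        return archs
    if magic in THIN_BE:
        (cputype,) = struct.unpack(">I", data[4:8])
        return [CPU_TYPES.get(cputype, hex(cputype))]
    if magic in THIN_LE:
        (cputype,) = struct.unpack("<I", data[4:8])
        return [CPU_TYPES.get(cputype, hex(cputype))]
    raise ValueError("not a Mach-O file")

# Synthetic headers standing in for the real files (no macOS needed):
universal = (struct.pack(">II", 0xCAFEBABE, 2)
             + struct.pack(">IIIII", 0x01000007, 3, 0, 0, 0)
             + struct.pack(">IIIII", 0x0100000C, 0, 0, 0, 0))
intel_only = struct.pack("<II", 0xFEEDFACF, 0x01000007)

print(macho_archs(universal))   # ['x86_64', 'arm64'] — like the ollama CLI
print(macho_archs(intel_only))  # ['x86_64'] — like the suspect dylib
```

On an actual affected machine, `file` (as shown in the environment output below) or `lipo -archs` against `/Applications/Ollama.app/Contents/Resources/libmlxc.dylib` should confirm whether the bundled library is missing the `arm64` slice.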

Steps to reproduce

  1. Install Ollama on an Apple Silicon (M-series) Mac.
  2. Run the command: ollama run x/flux2-klein (or potentially any model utilizing the MLX runner).
  3. Observe the crash/error.

Expected behavior
The model should load and run natively using the arm64 architecture and Apple's MLX framework without dynamic library architecture mismatches.

Actual behavior / Error logs

ollama run x/flux2-klein     

Error: failed to load model: 500 Internal Server Error: mlx runner failed: Error: failed to initialize MLX: MLX: Failed to load /Applications/Ollama.app/Contents/Resources/libmlxc.dylib: dlopen(/Applications/Ollama.app/Contents/Resources/libmlxc.dylib, 0x0009): tried: '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64e.v1' or 'arm64' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (no such file), '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib (exit: exit status 1)

Environment Context

$ arch
arm64

$ file $(which ollama)
/usr/local/bin/ollama: Mach-O universal binary with 2 architectures: [x86_64:Mach-O 64-bit executable x86_64] [arm64]
/usr/local/bin/ollama (for architecture x86_64):	Mach-O 64-bit executable x86_64
/usr/local/bin/ollama (for architecture arm64):	Mach-O 64-bit executable arm64

Relevant log output

cat ~/.ollama/logs/server.log

time=2026-04-09T21:28:05.037+05:30 level=INFO source=routes.go:1744 msg="server config" env="map[HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:0 OLLAMA_DEBUG:INFO OLLAMA_DEBUG_LOG_REQUESTS:false OLLAMA_EDITOR: OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/Users/milind/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NO_CLOUD:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false http_proxy: https_proxy: no_proxy:]"
time=2026-04-09T21:28:05.038+05:30 level=INFO source=routes.go:1746 msg="Ollama cloud disabled: false"
time=2026-04-09T21:28:05.076+05:30 level=INFO source=images.go:499 msg="total blobs: 1553"
time=2026-04-09T21:28:05.079+05:30 level=INFO source=images.go:506 msg="total unused blobs removed: 0"
time=2026-04-09T21:28:05.079+05:30 level=INFO source=routes.go:1802 msg="Listening on 127.0.0.1:11434 (version 0.20.4)"
time=2026-04-09T21:28:05.080+05:30 level=INFO source=runner.go:67 msg="discovering available GPUs..."
time=2026-04-09T21:28:05.081+05:30 level=INFO source=server.go:444 msg="starting runner" cmd="/Applications/Ollama.app/Contents/Resources/ollama runner --ollama-engine --port 49577"
time=2026-04-09T21:28:05.279+05:30 level=INFO source=types.go:42 msg="inference compute" id=0 filter_id=0 library=Metal compute=0.0 name=Metal description="Apple M3" libdirs="" driver=0.0 pci_id="" type=discrete total="11.8 GiB" available="11.8 GiB"
time=2026-04-09T21:28:05.279+05:30 level=INFO source=routes.go:1852 msg="vram-based default context" total_vram="11.8 GiB" default_num_ctx=4096
[GIN] 2026/04/09 - 21:28:05 | 200 |      97.375µs |       127.0.0.1 | GET      "/api/version"
[GIN] 2026/04/09 - 21:28:05 | 200 |      96.959µs |       127.0.0.1 | GET      "/api/version"
[GIN] 2026/04/09 - 21:28:05 | 200 |      29.709µs |       127.0.0.1 | GET      "/api/version"
[GIN] 2026/04/09 - 21:28:05 | 200 |    5.537667ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:28:05 | 200 |  371.428959ms |       127.0.0.1 | POST     "/api/show"
[GIN] 2026/04/09 - 21:28:05 | 200 |  373.599542ms |       127.0.0.1 | POST     "/api/me"
[GIN] 2026/04/09 - 21:28:05 | 200 |  374.556542ms |       127.0.0.1 | POST     "/api/me"
[GIN] 2026/04/09 - 21:28:16 | 200 |     103.541µs |       127.0.0.1 | HEAD     "/"
[GIN] 2026/04/09 - 21:28:16 | 200 |   47.683916ms |       127.0.0.1 | POST     "/api/show"
time=2026-04-09T21:28:16.094+05:30 level=INFO source=sched.go:484 msg="system memory" total="16.0 GiB" free="4.8 GiB" free_swap="0 B"
time=2026-04-09T21:28:16.094+05:30 level=INFO source=sched.go:491 msg="gpu memory" id=0 library=Metal available="11.3 GiB" free="11.8 GiB" minimum="512.0 MiB" overhead="0 B"
time=2026-04-09T21:28:16.096+05:30 level=INFO source=server.go:171 msg="starting mlx runner subprocess" model=x/flux2-klein:latest port=49588
time=2026-04-09T21:28:16.097+05:30 level=INFO source=sched.go:561 msg="loaded runners" count=1
time=2026-04-09T21:28:16.112+05:30 level=WARN source=server.go:164 msg=mlx-runner msg="time=2026-04-09T21:28:16.111+05:30 level=INFO msg=\"starting mlx runner\" model=x/flux2-klein:latest port=49588 mode=imagegen"
time=2026-04-09T21:28:16.112+05:30 level=WARN source=server.go:164 msg=mlx-runner msg="time=2026-04-09T21:28:16.112+05:30 level=ERROR msg=\"unable to initialize MLX\" error=\"failed to initialize MLX: MLX: Failed to load /Applications/Ollama.app/Contents/Resources/libmlxc.dylib: dlopen(/Applications/Ollama.app/Contents/Resources/libmlxc.dylib, 0x0009): tried: '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64e.v1' or 'arm64' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (no such file), '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib\""
time=2026-04-09T21:28:16.112+05:30 level=WARN source=server.go:164 msg=mlx-runner msg="Error: failed to initialize MLX: MLX: Failed to load /Applications/Ollama.app/Contents/Resources/libmlxc.dylib: dlopen(/Applications/Ollama.app/Contents/Resources/libmlxc.dylib, 0x0009): tried: '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64e.v1' or 'arm64' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (no such file), '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib"
time=2026-04-09T21:28:16.112+05:30 level=ERROR source=sched.go:567 msg="error loading llama server" error="mlx runner failed: Error: failed to initialize MLX: MLX: Failed to load /Applications/Ollama.app/Contents/Resources/libmlxc.dylib: dlopen(/Applications/Ollama.app/Contents/Resources/libmlxc.dylib, 0x0009): tried: '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64e.v1' or 'arm64' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (no such file), '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib (exit: exit status 1)"
time=2026-04-09T21:28:16.112+05:30 level=INFO source=server.go:363 msg="stopping mlx runner subprocess" pid=40144
[GIN] 2026/04/09 - 21:28:16 | 500 |   58.593209ms |       127.0.0.1 | POST     "/api/generate"
[GIN] 2026/04/09 - 21:28:20 | 200 |        34.5µs |       127.0.0.1 | GET      "/api/version"
[GIN] 2026/04/09 - 21:28:20 | 200 |    7.546708ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:28:50 | 200 |    9.383583ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:29:20 | 200 |    6.216833ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:29:50 | 200 |    9.070583ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:30:20 | 200 |    9.028042ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:30:50 | 200 |   10.970625ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:31:20 | 200 |   13.021458ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:31:51 | 200 |    7.698958ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:32:21 | 200 |    9.673583ms |       127.0.0.1 | GET      "/api/tags"
time=2026-04-09T21:32:25.691+05:30 level=INFO source=routes.go:1744 msg="server config" env="map[HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:0 OLLAMA_DEBUG:INFO OLLAMA_DEBUG_LOG_REQUESTS:false OLLAMA_EDITOR: OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/Users/milind/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NO_CLOUD:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false http_proxy: https_proxy: no_proxy:]"
time=2026-04-09T21:32:25.691+05:30 level=INFO source=routes.go:1746 msg="Ollama cloud disabled: false"
time=2026-04-09T21:32:25.720+05:30 level=INFO source=images.go:499 msg="total blobs: 1553"
time=2026-04-09T21:32:25.723+05:30 level=INFO source=images.go:506 msg="total unused blobs removed: 0"
time=2026-04-09T21:32:25.723+05:30 level=INFO source=routes.go:1802 msg="Listening on 127.0.0.1:11434 (version 0.20.4)"
time=2026-04-09T21:32:25.723+05:30 level=INFO source=runner.go:67 msg="discovering available GPUs..."
time=2026-04-09T21:32:25.725+05:30 level=INFO source=server.go:444 msg="starting runner" cmd="/Applications/Ollama.app/Contents/Resources/ollama runner --ollama-engine --port 49604"
time=2026-04-09T21:32:25.847+05:30 level=INFO source=types.go:42 msg="inference compute" id=0 filter_id=0 library=Metal compute=0.0 name=Metal description="Apple M3" libdirs="" driver=0.0 pci_id="" type=discrete total="11.8 GiB" available="11.8 GiB"
time=2026-04-09T21:32:25.847+05:30 level=INFO source=routes.go:1852 msg="vram-based default context" total_vram="11.8 GiB" default_num_ctx=4096
[GIN] 2026/04/09 - 21:32:51 | 200 |     10.3265ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:33:21 | 200 |    9.249708ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:33:34 | 200 |      47.417µs |       127.0.0.1 | HEAD     "/"
[GIN] 2026/04/09 - 21:33:34 | 200 |   44.295667ms |       127.0.0.1 | POST     "/api/show"
time=2026-04-09T21:33:34.508+05:30 level=INFO source=sched.go:484 msg="system memory" total="16.0 GiB" free="4.9 GiB" free_swap="0 B"
time=2026-04-09T21:33:34.508+05:30 level=INFO source=sched.go:491 msg="gpu memory" id=0 library=Metal available="11.3 GiB" free="11.8 GiB" minimum="512.0 MiB" overhead="0 B"
time=2026-04-09T21:33:34.510+05:30 level=INFO source=server.go:171 msg="starting mlx runner subprocess" model=x/flux2-klein:latest port=49613
time=2026-04-09T21:33:34.511+05:30 level=INFO source=sched.go:561 msg="loaded runners" count=1
time=2026-04-09T21:33:34.525+05:30 level=WARN source=server.go:164 msg=mlx-runner msg="time=2026-04-09T21:33:34.525+05:30 level=INFO msg=\"starting mlx runner\" model=x/flux2-klein:latest port=49613 mode=imagegen"
time=2026-04-09T21:33:34.525+05:30 level=WARN source=server.go:164 msg=mlx-runner msg="time=2026-04-09T21:33:34.525+05:30 level=ERROR msg=\"unable to initialize MLX\" error=\"failed to initialize MLX: MLX: Failed to load /Applications/Ollama.app/Contents/Resources/libmlxc.dylib: dlopen(/Applications/Ollama.app/Contents/Resources/libmlxc.dylib, 0x0009): tried: '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64e.v1' or 'arm64' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (no such file), '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib\""
time=2026-04-09T21:33:34.525+05:30 level=WARN source=server.go:164 msg=mlx-runner msg="Error: failed to initialize MLX: MLX: Failed to load /Applications/Ollama.app/Contents/Resources/libmlxc.dylib: dlopen(/Applications/Ollama.app/Contents/Resources/libmlxc.dylib, 0x0009): tried: '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64e.v1' or 'arm64' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (no such file), '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib"
time=2026-04-09T21:33:34.526+05:30 level=ERROR source=sched.go:567 msg="error loading llama server" error="mlx runner failed: Error: failed to initialize MLX: MLX: Failed to load /Applications/Ollama.app/Contents/Resources/libmlxc.dylib: dlopen(/Applications/Ollama.app/Contents/Resources/libmlxc.dylib, 0x0009): tried: '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64e.v1' or 'arm64' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (no such file), '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib (exit: exit status 1)"
time=2026-04-09T21:33:34.526+05:30 level=INFO source=server.go:363 msg="stopping mlx runner subprocess" pid=40210
[GIN] 2026/04/09 - 21:33:34 | 500 |   58.612667ms |       127.0.0.1 | POST     "/api/generate"
[GIN] 2026/04/09 - 21:33:51 | 200 |   10.084541ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:34:21 | 200 |    7.503083ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:34:51 | 200 |    7.784959ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:35:21 | 200 |    8.020625ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:35:51 | 200 |     9.75525ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:36:22 | 200 |    8.458208ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:36:55 | 200 |    7.709125ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:36:55 | 200 |      33.167µs |       127.0.0.1 | GET      "/api/version"
[GIN] 2026/04/09 - 21:36:55 | 200 |  444.523792ms |       127.0.0.1 | POST     "/api/me"
[GIN] 2026/04/09 - 21:37:08 | 200 |        64.5µs |       127.0.0.1 | GET      "/api/version"
[GIN] 2026/04/09 - 21:37:25 | 200 |     7.70025ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:37:55 | 200 |    9.238958ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:38:25 | 200 |    8.438333ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:38:55 | 200 |   12.781709ms |       127.0.0.1 | GET      "/api/tags"
[GIN] 2026/04/09 - 21:39:03 | 200 |      66.291µs |       127.0.0.1 | GET      "/api/version"
[GIN] 2026/04/09 - 21:39:08 | 200 |      57.584µs |       127.0.0.1 | HEAD     "/"
[GIN] 2026/04/09 - 21:39:08 | 200 |   47.836875ms |       127.0.0.1 | POST     "/api/show"
time=2026-04-09T21:39:08.131+05:30 level=INFO source=sched.go:484 msg="system memory" total="16.0 GiB" free="4.9 GiB" free_swap="0 B"
time=2026-04-09T21:39:08.131+05:30 level=INFO source=sched.go:491 msg="gpu memory" id=0 library=Metal available="11.3 GiB" free="11.8 GiB" minimum="512.0 MiB" overhead="0 B"
time=2026-04-09T21:39:08.133+05:30 level=INFO source=server.go:171 msg="starting mlx runner subprocess" model=x/flux2-klein:latest port=49648
time=2026-04-09T21:39:08.134+05:30 level=INFO source=sched.go:561 msg="loaded runners" count=1
time=2026-04-09T21:39:08.147+05:30 level=WARN source=server.go:164 msg=mlx-runner msg="time=2026-04-09T21:39:08.147+05:30 level=INFO msg=\"starting mlx runner\" model=x/flux2-klein:latest port=49648 mode=imagegen"
time=2026-04-09T21:39:08.147+05:30 level=WARN source=server.go:164 msg=mlx-runner msg="time=2026-04-09T21:39:08.147+05:30 level=ERROR msg=\"unable to initialize MLX\" error=\"failed to initialize MLX: MLX: Failed to load /Applications/Ollama.app/Contents/Resources/libmlxc.dylib: dlopen(/Applications/Ollama.app/Contents/Resources/libmlxc.dylib, 0x0009): tried: '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64e.v1' or 'arm64' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (no such file), '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib\""
time=2026-04-09T21:39:08.147+05:30 level=WARN source=server.go:164 msg=mlx-runner msg="Error: failed to initialize MLX: MLX: Failed to load /Applications/Ollama.app/Contents/Resources/libmlxc.dylib: dlopen(/Applications/Ollama.app/Contents/Resources/libmlxc.dylib, 0x0009): tried: '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64e.v1' or 'arm64' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (no such file), '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib"
time=2026-04-09T21:39:08.148+05:30 level=ERROR source=sched.go:567 msg="error loading llama server" error="mlx runner failed: Error: failed to initialize MLX: MLX: Failed to load /Applications/Ollama.app/Contents/Resources/libmlxc.dylib: dlopen(/Applications/Ollama.app/Contents/Resources/libmlxc.dylib, 0x0009): tried: '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64e.v1' or 'arm64' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Applications/Ollama.app/Contents/Resources/libmlxc.dylib' (no such file), '/Applications/Ollama.app/Contents/Resources/libmlxc.dylib (exit: exit status 1)"
time=2026-04-09T21:39:08.148+05:30 level=INFO source=server.go:363 msg="stopping mlx runner subprocess" pid=40284
[GIN] 2026/04/09 - 21:39:08 | 500 |   57.135459ms |       127.0.0.1 | POST     "/api/generate"

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.20.4

time=2026-04-09T21:39:08.148+05:30 level=INFO source=server.go:363 msg="stopping mlx runner subprocess" pid=40284 [GIN] 2026/04/09 - 21:39:08 | 500 | 57.135459ms | 127.0.0.1 | POST "/api/generate" ``` ### OS macOS ### GPU Apple ### CPU Apple ### Ollama version 0.20.4
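On a real install the mismatch is easiest to confirm with `file` or `lipo -info` against the bundled dylib. For background on what dlopen is actually checking, here is a minimal, self-contained Python sketch (not Ollama code) that reads the Mach-O header and reports the architecture slices a binary contains. The magic numbers and CPU-type constants come from Apple's `mach-o/fat.h` and `mach-o/loader.h`; the synthetic headers in the usage comments are illustrative only.

```python
import struct

FAT_MAGIC = 0xCAFEBABE    # big-endian magic of a fat (universal) binary
MH_MAGIC_64 = 0xFEEDFACF  # magic of a thin 64-bit Mach-O
CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64"}

def macho_archs(data: bytes):
    """Return the architecture slices present in a Mach-O blob."""
    (magic,) = struct.unpack_from(">I", data, 0)
    if magic == FAT_MAGIC:
        # fat header: magic, nfat_arch, then one 20-byte fat_arch
        # entry per slice (cputype, cpusubtype, offset, size, align)
        (nfat,) = struct.unpack_from(">I", data, 4)
        return [
            CPU_NAMES.get(struct.unpack_from(">i", data, 8 + i * 20)[0], "unknown")
            for i in range(nfat)
        ]
    # thin binaries store the header little-endian: magic, then cputype
    magic_le, cputype = struct.unpack_from("<Ii", data, 0)
    if magic_le == MH_MAGIC_64:
        return [CPU_NAMES.get(cputype, "unknown")]
    return []

# Synthetic headers for illustration: an x86_64-only thin dylib (the
# reported state of libmlxc.dylib) versus a universal x86_64+arm64 one.
thin_x86 = struct.pack("<Ii", MH_MAGIC_64, 0x01000007)
fat = (struct.pack(">II", FAT_MAGIC, 2)
       + struct.pack(">iiIII", 0x01000007, 3, 4096, 100, 12)
       + struct.pack(">iiIII", 0x0100000C, 0, 8192, 100, 14))

print(macho_archs(thin_x86))  # ['x86_64'] -> dlopen on arm64 fails
print(macho_archs(fat))       # ['x86_64', 'arm64'] -> universal, loads anywhere
```

An arm64 process can only `dlopen` a library whose slice list includes arm64 (or arm64e), which is why a thin x86_64 `libmlxc.dylib` fails even though the main `ollama` binary is universal.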
GiteaMirror added the bug label 2026-04-12 22:44:31 -05:00
Reference: github-starred/ollama#9879