[GH-ISSUE #3032] Ollama errors with msg="Failed to load dynamic library [...]/libext_server.so exception std::bad_alloc #27623

Closed
opened 2026-04-22 05:06:18 -05:00 by GiteaMirror · 7 comments

Originally created by @pythonHuang on GitHub (Mar 10, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3032

Error: Unable to load dynamic library: Unable to load dynamic server library: 找不到指定的模块。 ("The specified module could not be found.")

GiteaMirror added the bug label 2026-04-22 05:06:18 -05:00

@cinnamon17 commented on GitHub (Mar 10, 2024):

Same here, I got this error in the logs:

`msg="Failed to load dynamic library /tmp/ollama136388332/cpu_avx2/libext_server.so exception std::bad_alloc"`
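A `std::bad_alloc` thrown while loading the runner library is typically an out-of-memory allocation failure rather than a missing file. As a rough sanity check (this helper is hypothetical and not part of Ollama), you can compare the model blob size against available RAM before loading:

```python
import os

def fits_in_memory(model_bytes: int, available_bytes: int, overhead: float = 1.2) -> bool:
    """Rough estimate: does the model, plus ~20% runtime overhead
    (KV cache, scratch buffers), fit in the memory available?"""
    return model_bytes * overhead <= available_bytes

def available_ram_bytes() -> int:
    """Available physical memory on Linux, via sysconf."""
    return os.sysconf("SC_AVPHYS_PAGES") * os.sysconf("SC_PAGE_SIZE")
```

For example, a 4 GiB q4_0 blob with 8 GiB of free RAM passes the check, while an 8 GiB blob does not; the 1.2 overhead factor is an assumption, not a measured value.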


@Howe829 commented on GitHub (Mar 10, 2024):

Hi, `gemma` requires Ollama >= 0.1.26; you can check your installed version with `ollama --version`.
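To automate that minimum-version check in a script, a small sketch (the `version_ge` helper and the hard-coded sample version are assumptions for illustration; in practice you would parse the output of `ollama --version`):

```python
def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '0.1.28' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def version_ge(installed: str, required: str) -> bool:
    """True if installed >= required, compared numerically per component."""
    return parse_version(installed) >= parse_version(required)

# Example: gemma needs Ollama >= 0.1.26.
installed = "0.1.28"  # in practice, extracted from `ollama --version`
if version_ge(installed, "0.1.26"):
    print(f"Ollama {installed} is new enough for gemma")
else:
    print("Please upgrade Ollama (need >= 0.1.26)")
```

Comparing tuples of integers avoids the classic string-comparison pitfall where `"0.1.9" > "0.1.26"`.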


@cinnamon17 commented on GitHub (Mar 10, 2024):

@Howe829 Hi, I have version 0.1.28.


@Howe829 commented on GitHub (Mar 11, 2024):

@cinnamon17 It seems there is a problem loading the Ollama library. You can check the [troubleshooting](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md) guide for more information.


@jmorganca commented on GitHub (Mar 11, 2024):

Hi @pythonHuang, does this only occur with `gemma:2b`, or with other models as well? Sorry this happened.


@fraschm1998 commented on GitHub (Apr 16, 2024):

I got the same error while trying to use `wizardlm2:8x22b-q4_0`:

```
llm_load_tensors: ggml ctx size =    0.22 MiB
llama_model_load: error loading model: create_tensor: tensor 'blk.0.ffn_gate.0.weight' not found
llama_load_model_from_file: exception loading model
time=2024-04-16T02:36:23.822Z level=WARN source=llm.go:170 msg="Failed to load dynamic library /tmp/ollama1100549038/runners/cpu_avx2/libext_server.so  exception create_tensor: tensor 'blk.0.ffn_gate.0.weight' not found"
```

@jmorganca commented on GitHub (May 29, 2024):

This should be fixed now.


Reference: github-starred/ollama#27623