[GH-ISSUE #2386] Unable to load dynamic server library on Mac. #47899

Closed
opened 2026-04-28 05:43:17 -05:00 by GiteaMirror · 3 comments

Originally created by @StarstormVC on GitHub (Feb 7, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2386

My environment:
MacBook Pro | macOS Sonoma 14.3

After updating my OS, I get the following error when I run `ollama run llama2`. I had already pulled the model successfully.

Error: Unable to load dynamic library: Unable to load dynamic server library: dlopen(/var/folders/h6/41y3dhqd0p9cd8p8rmfn6t000000gn/T/ollama1989849860/metal/libext_server.dylib, 0x0006): tried: '/var/folders/h6/41y3dhqd0p9cd8p8rmfn6t000000gn/T/ollama1989849860/metal/libext_server.dylib' (no such file), '/System/Volumes/Preboot/Cryptexes/OS/var/folders/h6/41y3dhqd0p9cd8p8rmfn6t000000gn/T/ollama1989849860/metal/libext_server.dylib' (no such file), '/var/folders/h6/41y3dhqd0p9cd8p8rmfn6t000000gn/T/ollama1989849860/metal/libext_server.dylib' (no su

GiteaMirror added the bug label 2026-04-28 05:43:17 -05:00

@easp commented on GitHub (Feb 7, 2024):

I think I had this same error this morning. Restarting the Ollama app ended up fixing it.


@jmorganca commented on GitHub (Feb 8, 2024):

This should be fixed in https://github.com/ollama/ollama/pull/2403 and will be in the upcoming release! Sorry to anyone who hit this!


@elewis787 commented on GitHub (Mar 13, 2024):

@jmorganca - I am running into this issue building off the main branch which appears to have the changes from #2403.

I am building on an M3 Mac.

I am not setting a value for `OLLAMA_LLM_LIBRARY`.

In the current implementation, the dynLibs path is not updated after falling back to the nativeInit function:

```go
	demandLib := os.Getenv("OLLAMA_LLM_LIBRARY")
	if demandLib != "" {
		libPath := availableDynLibs[demandLib]
		if libPath == "" {
			slog.Info(fmt.Sprintf("Invalid OLLAMA_LLM_LIBRARY %s - not found", demandLib))
		} else {
			slog.Info(fmt.Sprintf("Loading OLLAMA_LLM_LIBRARY=%s", demandLib))
			dynLibs = []string{libPath}
		}
	}

	// We stage into a temp directory, and if we've been idle for a while, it may have been reaped
	_, err := os.Stat(dynLibs[0])
	if err != nil {
		slog.Info(fmt.Sprintf("%s has disappeared, reloading libraries", dynLibs[0]))
		err = nativeInit()
		if err != nil {
			return nil, err
		}
	}
```

The main problem is that availableDynLibs is not populated until after nativeInit runs, which means the dynLibs slice stays empty. This may be limited to the metal variants, but I believe there are a few possible solutions.

We could change the order, doing the env look-up and lib-path assignment after nativeInit so that availableDynLibs contains the metal variant,

or we could re-set the dynLibs path after nativeInit():

`dynLibs = []string{availableDynLibs[gpuInfo.Library]}`

Side note: I am using the LLM package outside of the Ollama app, so I could be missing something that the normal Ollama server/app does to prevent this.

If you agree this is an issue - I'm happy to push the changes I have.
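To make the second option concrete, here is a minimal, self-contained sketch of that ordering. It uses stand-in versions of `nativeInit` and `availableDynLibs` (the real `llm` package internals are not reproduced here, and the restaged path is invented for illustration); the point is only that `dynLibs` is re-resolved from `availableDynLibs` *after* the fallback re-stage, so it can never be left empty.

```go
package main

import (
	"fmt"
	"os"
)

// Stand-in for the llm package's registry of staged payload libraries,
// mapping a library name (e.g. "metal") to its extracted .dylib path.
var availableDynLibs = map[string]string{}

// Stand-in for nativeInit: in the real package this re-extracts the
// payload libraries into a fresh temp directory after the old one was
// reaped. Here we just simulate repopulating the map.
func nativeInit() error {
	availableDynLibs["metal"] = "/tmp/ollama-restaged/metal/libext_server.dylib"
	return nil
}

func fileMissing(path string) bool {
	_, err := os.Stat(path)
	return err != nil
}

// pickDynLibs shows the proposed ordering: if the staged path is empty
// or has disappeared, run nativeInit, then refresh dynLibs from
// availableDynLibs so the slice reflects the newly staged library.
func pickDynLibs(library string) ([]string, error) {
	dynLibs := []string{availableDynLibs[library]}
	if dynLibs[0] == "" || fileMissing(dynLibs[0]) {
		if err := nativeInit(); err != nil {
			return nil, err
		}
		// Key change: re-resolve the path after the fallback re-stage.
		dynLibs = []string{availableDynLibs[library]}
	}
	return dynLibs, nil
}

func main() {
	libs, err := pickDynLibs("metal")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(libs[0])
}
```

With the current ordering, the equivalent of `dynLibs[0]` is evaluated against the pre-nativeInit state, which is what produces the empty slice described above.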

Reference: github-starred/ollama#47899