[GH-ISSUE #9677] The installation-free version of Ollama cannot change the model installation path #6314

Closed
opened 2026-04-12 17:47:37 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @lmh87883819 on GitHub (Mar 12, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9677

What is the issue?

I used the installation-free version of Ollama. I tried to use environment variables to set the model installation location, but it did not take effect. Is there a good way to do this?

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the needs more info and bug labels 2026-04-12 17:47:37 -05:00
Author
Owner

@rick-github commented on GitHub (Mar 12, 2025):

What is "installation-free version of Ollama"?

Author
Owner

@3unnycheung commented on GitHub (Mar 12, 2025):

Where are models stored?

- macOS: ~/.ollama/models
- Linux: /usr/share/ollama/.ollama/models
- Windows: C:\Users\%username%\.ollama\models

How do I set them to a different location?

If a different directory needs to be used, set the environment variable OLLAMA_MODELS to the chosen directory.

Note: on Linux using the standard installer, the ollama user needs read and write access to the specified directory. To assign the directory to the ollama user, run sudo chown -R ollama:ollama <directory>.

Refer to the section [above](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server) for how to set environment variables on your platform.
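As an illustration (not from the FAQ itself), a small TypeScript sketch of how a launcher might resolve the models directory the same way: an explicit OLLAMA_MODELS wins, otherwise fall back to the per-user default. The fallback here assumes the per-user ~/.ollama/models path; the Linux system installer instead uses /usr/share/ollama/.ollama/models.

```typescript
import * as os from "node:os";
import * as path from "node:path";

// Resolve where models will be stored: an explicit OLLAMA_MODELS wins;
// otherwise fall back to the per-user default ~/.ollama/models.
function modelsDir(env: Record<string, string | undefined> = process.env): string {
  if (env.OLLAMA_MODELS && env.OLLAMA_MODELS.length > 0) {
    return env.OLLAMA_MODELS;
  }
  return path.join(os.homedir(), ".ollama", "models");
}
```

For example, `modelsDir({ OLLAMA_MODELS: "D:\\Ollama\\models" })` returns the override, while `modelsDir({})` returns the per-user default.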

Author
Owner

@lmh87883819 commented on GitHub (Mar 13, 2025):

I extracted the installation-free version from this zip: https://github.com/ollama/ollama/releases/download/v0.6.0/ollama-windows-amd64.zip. I am on Windows; I set OLLAMA_MODELS, but it has no effect.

Author
Owner

@rick-github commented on GitHub (Mar 13, 2025):

How do you start the server? Where did you set OLLAMA_MODELS?

Author
Owner

@lmh87883819 commented on GitHub (Mar 13, 2025):

Thank you for your patience. This is where I set the environment variables:

![Image](https://github.com/user-attachments/assets/51275ec5-eb74-4e7b-a3d1-503bc10cdc78)

Then I started the Ollama service in Node, like this:

```ts
const runOllamaApplication = async (
  appPath: string,
  appId: string,
  runArgs: string[],
) => {
  // First clean up any existing Ollama processes
  if (!(await cleanupOllamaProcess())) {
    logit("❌ Failed to clean up existing Ollama processes");
    return false;
  }

  const ollamaPath = path.join(appPath, "ollama.exe");
  if (!fs.existsSync(ollamaPath)) {
    logit(`❌ ollama.exe does not exist: ${ollamaPath}`);
    return false;
  }

  logit(`🚀 Starting Ollama: ${appId}`);
  logit(`📂 App path: ${appPath}`);

  return new Promise((resolve) => {
    const process = spawn(`"${ollamaPath}"`, ["serve", ...runArgs], {
      shell: true,
      cwd: appPath,
    });
    let isStarted = false;
    let startupTimeout = setTimeout(() => {
      if (!isStarted) {
        process.kill();
        resolve(false);
      }
    }, 300000);

    // Check whether the Ollama API is reachable
    const checkOllamaAPI = async () => {
      try {
        const response = await fetch("http://127.0.0.1:11434");
        const text = await response.text();
        logit("checkOllamaAPI", text);
        if (text.includes("Ollama is running")) {
          isStarted = true;
          clearTimeout(startupTimeout);
          logit("✅ Ollama API started successfully");
          getMainWindow()?.webContents.send(COMMAND.UPDATE_APP_STATUS, {
            appId,
            status: true,
          });
          resolve(true);
          return true;
        }
      } catch (error) {
        logit("checkOllamaAPI", error);
        return false;
      }
      return false;
    };

    // Periodically check the API
    const checkInterval = setInterval(async () => {
      if (await checkOllamaAPI()) {
        clearInterval(checkInterval);
      }
    }, 1000);

    process.stdout.on("data", (data) => {
      const output = data.toString();
      logit(`📝 Ollama output: ${output}`);
      getMainWindow()?.webContents.send(COMMAND.LOADING_TEXT, output);
    });

    process.stderr.on("data", (data) => {
      getMainWindow()?.webContents.send(COMMAND.LOADING_TEXT, data.toString());
    });

    process.on("error", () => {
      clearTimeout(startupTimeout);
      clearInterval(checkInterval);
      getMainWindow()?.webContents.send(COMMAND.UPDATE_APP_STATUS, {
        appId,
        status: false,
      });
      resolve(false);
    });

    process.on("close", () => {
      if (!isStarted) {
        clearTimeout(startupTimeout);
        clearInterval(checkInterval);
        getMainWindow()?.webContents.send(COMMAND.UPDATE_APP_STATUS, {
          appId,
          status: false,
        });
        resolve(false);
      }
    });

    process.on("exit", () => {
      clearTimeout(startupTimeout);
      clearInterval(checkInterval);
      process.kill();
    });
  });
};
```

![Image](https://github.com/user-attachments/assets/569ae3ea-7c1f-47bb-bbfb-ac32a3420155)

Author
Owner

@rick-github commented on GitHub (Mar 13, 2025):

If you start your app from a terminal window, try setting OLLAMA_MODELS in the terminal window before starting the app.
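More generally for the Node launcher above: a child started with spawn only sees the environment it is given, so instead of relying on the terminal, the launcher can pass OLLAMA_MODELS explicitly via spawn's env option. A minimal sketch (not from the thread; the directory is a placeholder, and ollamaPath/runArgs are assumed to come from the earlier snippet):

```typescript
import { spawn } from "node:child_process";

// Build the child environment: inherit the parent's, then force
// OLLAMA_MODELS so the spawned server stores models in the chosen
// directory regardless of how the app itself was launched.
function buildOllamaEnv(modelsDir: string): Record<string, string | undefined> {
  return { ...process.env, OLLAMA_MODELS: modelsDir };
}

const env = buildOllamaEnv("D:\\Ollama\\models"); // placeholder directory
// spawn(`"${ollamaPath}"`, ["serve", ...runArgs], { shell: true, cwd: appPath, env });
```

Spreading process.env first means the child still inherits PATH and the rest of the parent environment, with only OLLAMA_MODELS overridden.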

Author
Owner

@lmh87883819 commented on GitHub (Mar 15, 2025):

Thanks for replying. Following your instructions, it really works.

Author
Owner

@Arthurcxl commented on GitHub (Mar 24, 2025):

Hello! I set the OLLAMA_MODELS variable path on both Windows and WSL; I want to install the models to drive D:

export PATH="/mnt/d/Ollama/bin:$PATH"
OLLAMA_MODELS="/mnt/d/Ollama/models"

But it still installs to the default path /usr/share/ollama/.ollama/models. What should I do?
I previously used the Windows installer too, and even after setting the path it still installed to C:\Users\%username%\.ollama\models rather than drive D.
Mayday!!


Reference: github-starred/ollama#6314