[GH-ISSUE #9573] Model saving broken when parent model name has / #6243

Closed
opened 2026-04-12 17:40:39 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @sultanqasim on GitHub (Mar 7, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9573

Originally assigned to: @pdevine on GitHub.

What is the issue?

Using the /save command works for models whose parent doesn't have a slash in the name, but seems broken for models whose parent has a slash.

Example:

~~~sh
[sultan@wailer ~]$ ollama run phi4
>>> /set parameter num_ctx 16384
Set parameter 'num_ctx' to '16384'
>>> /set parameter temperature 0.5
Set parameter 'temperature' to '0.5'
>>> /save phi4-sqk
Created new model 'phi4-sqk'
>>> /bye

[sultan@wailer ~]$ ollama run huihui_ai/phi4-abliterated
>>> /set parameter num_ctx 16384
Set parameter 'num_ctx' to '16384'
>>> /set parameter temperature 0.5
Set parameter 'temperature' to '0.5'
>>> /save phi4-sqk
error: The model name 'phi4-sqk' is invalid
~~~

Relevant log output

No response
OS

Linux

GPU

No response

CPU

Intel

Ollama version

0.5.13

GiteaMirror added the bug label 2026-04-12 17:40:39 -05:00
@pdevine commented on GitHub (Mar 15, 2025):

I've root-caused this. The huihui_ai/phi4-abliterated model was created on Windows, and the Ollama client is picking up the file name of the Windows blob used to create the model (C:\Users\admin\.ollama\models\blobs\sha256-a47ab2fd4766db9e8ad65b720812d67cbde9404848a4ed9f8c15c50d7e5bd127) instead of the correct model name.

I'm close to a fix but just want to do more testing first.
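To illustrate why this surfaces as a "model name is invalid" error: when the client substitutes the parent's Windows blob path for the model name, that string contains characters (backslashes, a drive-letter colon) that no model name allows. The sketch below is a simplified, hypothetical stand-in for Ollama's real name validation, just to show the failure mode; `isValidModelName` and its character set are assumptions, not Ollama's actual rules.

```go
package main

import (
	"fmt"
	"strings"
)

// isValidModelName is a hypothetical, simplified check standing in for
// Ollama's real model-name parsing, which is stricter. It rejects
// characters that appear in Windows file paths.
func isValidModelName(name string) bool {
	if name == "" {
		return false
	}
	// Backslashes, colons, and similar characters are not legal in a
	// model name but do appear in a Windows blob path.
	return !strings.ContainsAny(name, `\:*?"<>| `)
}

func main() {
	// A plain model name passes.
	fmt.Println(isValidModelName("phi4-sqk")) // true

	// The Windows blob path that leaked in as the "name" fails,
	// producing the "invalid model name" error the reporter saw.
	blobPath := `C:\Users\admin\.ollama\models\blobs\sha256-a47ab2fd4766db9e8ad65b720812d67cbde9404848a4ed9f8c15c50d7e5bd127`
	fmt.Println(isValidModelName(blobPath)) // false
}
```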


Reference: github-starred/ollama#6243