[GH-ISSUE #6468] bug: Nested model in registry - cannot access model settings on my own model at https://ollama.com/ #66106

Closed
opened 2026-05-03 23:59:42 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @BobMerkus on GitHub (Aug 22, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6468

Originally assigned to: @BruceMacD on GitHub.

What is the issue?

Somehow I have managed to create a nested model when pushing, which leaves me unable to access the repo settings. The model page for deepseek-v2-tool-calling (https://ollama.com/MFDoom/deepseek-v2-tool-calling) shows MFDoom/MFDoom/deepseek-v2-tool-calling instead of MFDoom/deepseek-v2-tool-calling.

The same goes for deepseek-coder-v2-tool-calling (https://ollama.com/MFDoom/deepseek-coder-v2-tool-calling).

I can't access the settings at https://ollama.com/MFDoom/MFDoom/deepseek-v2-tool-calling/settings; it raises a 404. I can still access the settings at https://ollama.com/MFDoom/deepseek-v2-tool-calling/settings, but the delete button does not work. HTTP dump:

Request URL: https://ollama.com/MFDoom/MFDoom/deepseek-v2-tool-calling
Request Method: DELETE
Status Code: 400 Bad Request
Remote Address: 34.120.132.20:443
Referrer Policy: strict-origin-when-cross-origin

:authority: ollama.com
:method: DELETE
:path: /MFDoom/MFDoom/deepseek-v2-tool-calling
:scheme: https
accept: */*
accept-encoding: gzip, deflate, br, zstd
accept-language: en-US,en;q=0.7
content-length: 0
content-type: application/x-www-form-urlencoded
cookie:
hx-current-url: https://ollama.com/MFDoom/deepseek-v2-tool-calling/settings
hx-request: true
origin: https://ollama.com
priority: u=1, i
referer: https://ollama.com/MFDoom/deepseek-v2-tool-calling/settings
sec-ch-ua: "Not)A;Brand";v="99", "Brave";v="127", "Chromium";v="127"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
sec-fetch-dest: empty
sec-fetch-mode: cors
sec-fetch-site: same-origin
sec-gpc: 1
user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/127.0.0.0 Safari/537.36

Can't edit or preview the README either; the Preview tab shows "404 page not found".

To reproduce: I might have accidentally referenced my own model in the FROM statement and pushed it again.
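[Editor's note] As a sketch of the failure mode being described (not Ollama's actual registry or site code), a doubled path like MFDoom/MFDoom/deepseek-v2-tool-calling is what you get when a namespace is naively prepended to a name that already carries one; the `qualify`/`qualifyGuarded` functions below are hypothetical:

```go
package main

import (
	"fmt"
	"strings"
)

// qualify naively prepends the pusher's namespace. If the model name
// already carries that namespace, the result is a doubled path such as
// "MFDoom/MFDoom/deepseek-v2-tool-calling".
func qualify(namespace, name string) string {
	return namespace + "/" + name
}

// qualifyGuarded skips the prefix when the name is already qualified.
func qualifyGuarded(namespace, name string) string {
	if strings.HasPrefix(name, namespace+"/") {
		return name
	}
	return namespace + "/" + name
}

func main() {
	fmt.Println(qualify("MFDoom", "MFDoom/deepseek-v2-tool-calling"))        // doubled
	fmt.Println(qualifyGuarded("MFDoom", "MFDoom/deepseek-v2-tool-calling")) // unchanged
}
```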

OS

No response

GPU

No response

CPU

No response

Ollama version

0.3.6

GiteaMirror added the bug, ollama.com labels 2026-05-03 23:59:42 -05:00
Author
Owner

@bmizerany commented on GitHub (Aug 22, 2024):

@BobMerkus Will you please provide the steps you took to create the "nested" model so we can work to reproduce on our end?

Author
Owner

@BobMerkus commented on GitHub (Aug 22, 2024):

@bmizerany
I just reproduced it on a test model: https://ollama.com/MFDoom/deepseek-v2-tool-calling-test

I'm not really sure how this happened; I followed the general Modelfile docs (https://github.com/ollama/ollama/blob/main/docs/modelfile.md):

ollama create MFDoom/deepseek-v2-tool-calling-test:latest --file Modelfile
ollama push MFDoom/deepseek-v2-tool-calling-test:latest

FROM deepseek-coder-v2:latest
PARAMETER mirostat 0
PARAMETER mirostat_eta 0.1
PARAMETER mirostat_tau 5.0
PARAMETER num_ctx 4096
PARAMETER repeat_last_n 64
PARAMETER repeat_penalty 1.1
PARAMETER temperature 0.5
PARAMETER seed 0
PARAMETER tfs_z 1.0
PARAMETER num_predict 128
PARAMETER top_k 40
PARAMETER top_p 0.9
PARAMETER min_p 0.0

TEMPLATE """
{{ if .Messages }}
{{- if or .System .Tools }}
{{- if .System }}
{{ .System }}
{{- end }}
{{- if .Tools }}
You are a helpful assistant with tool calling capabilities. When you receive a tool call response, use the output to format an answer to the original user question.
{{- end }}
{{- end }}
{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 }}
{{- if eq .Role "user" }}User:
{{- if and $.Tools $last }}
Given the following functions, please respond with a JSON for a function call with its proper arguments that best answers the given prompt.
Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}. Do not use variables.
{{ $.Tools }}
{{- end }}
{{ .Content }}
{{ if $last }}Assistant: 
{{ end }}
{{- else if eq .Role "assistant" }}Assistant: 
{{- if .ToolCalls }}
{{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "parameters": {{ .Function.Arguments }}}{{ end }}
{{- else }}
{{ .Content }}{{ if not $last }}{{ end }}
{{- end }}
{{- else if eq .Role "tool" }}
{{ .Content }}{{ if $last }}Assistant: 
{{ end }}
{{- end }}
{{- end }}
{{- else }}
{{- if .System }}
{{ .System }}{{ end }}{{ if .Prompt }}User:
{{ .Prompt }}{{ end }}Assistant: 
{{ end }}{{ .Response }}{{ if .Response }}{{ end }}
"""
Author
Owner

@BruceMacD commented on GitHub (Aug 22, 2024):

Hi @BobMerkus, this was a UI issue on ollama.com, nothing you did. Thanks for reporting it. It should be fixed now; let me know if you see any more problems.


Reference: github-starred/ollama#66106