[GH-ISSUE #3542] Push of new model #48695

Closed
opened 2026-04-28 09:06:25 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @emsi on GitHub (Apr 8, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3542

What is the issue?

When trying to push a NEW model I get Error: file does not exist.

I have debugged it under mitm, and it seems the ollama server tries to HEAD a blob of the new, nonexistent model:

[19:36:39.502][172.17.0.1:52910] server connect registry.ollama.ai:443 (104.21.75.227:443)
172.17.0.1:52910: HEAD https://registry.ollama.ai/v2/emsi/Qra-13b/blobs/sha256:d381585268275794b5c658640369b3c112d982a0fef343da4bf50404bfe9e03f
               << 404 Not Found 0b
172.17.0.1:52910: POST https://registry.ollama.ai/v2/emsi/Qra-13b/blobs/uploads/
               << 404 Not Found 19b
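For context, the capture above matches the usual registry push sequence: the client first issues a HEAD request to check whether a blob already exists, and on a 404 it starts an upload session with POST. A toy sketch of that decision logic (status codes only; the real client is Go code inside Ollama, and this is only an illustration of the flow, not Ollama's actual implementation). Note that in this report the POST itself also returned 404, which is not part of the normal flow:

```python
def next_push_step(head_status: int) -> str:
    """Decide what to do after HEAD /v2/<name>/blobs/<digest>.

    200 -> blob is already on the registry, nothing to upload
    404 -> begin an upload session with POST /v2/<name>/blobs/uploads/
    """
    if head_status == 200:
        return "skip-upload"
    if head_status == 404:
        return "start-upload"
    raise RuntimeError(f"unexpected registry response: {head_status}")
```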

What did you expect to see?

It should upload.

Steps to reproduce

Create a completely new model, then:

root@105638d575be:/# ollama -v
ollama version is 0.1.30
root@105638d575be:/# ollama run emsi/Qra-13b
>>> Send a message (/? for help)
root@105638d575be:/# ollama push emsi/Qra-13b
retrieving manifest 
Error: file does not exist

Are there any recent changes that introduced the issue?

No response

OS

Linux

Architecture

No response

Platform

Docker

Ollama version

0.1.30

GPU

Nvidia

GPU info

No response

CPU

No response

Other software

No response

GiteaMirror added the bug label 2026-04-28 09:06:25 -05:00

@jmorganca commented on GitHub (Apr 8, 2024):

Hi there, sorry about this. Model names in Ollama can only have lowercase characters. If you rename the model to emsi/qra-13b it should work. emsi/qra:13b is another name I'd recommend 😊 . Will merge this with https://github.com/ollama/ollama/issues/3297
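The lowercase rule above can be sketched as a small client-side validator. This is a hypothetical helper, not Ollama's actual code: the pattern below is an assumption extrapolated from the maintainer's reply (lowercase `namespace/name` with an optional tag), and the real registry grammar may differ. Its point is that a descriptive error beats the opaque "file does not exist":

```python
import re

# Assumed name grammar: lowercase "name", "namespace/name", optionally ":tag".
NAME_RE = re.compile(
    r"^[a-z0-9][a-z0-9._-]*(/[a-z0-9][a-z0-9._-]*)?(:[a-z0-9][a-z0-9._-]*)?$"
)

def validate_model_name(name: str) -> None:
    """Raise ValueError with a descriptive message for invalid model names."""
    if name != name.lower():
        raise ValueError(
            f"invalid model name {name!r}: uppercase characters are not "
            f"allowed; try {name.lower()!r}"
        )
    if not NAME_RE.match(name):
        raise ValueError(f"invalid model name {name!r}")

validate_model_name("emsi/qra-13b")   # OK
# validate_model_name("emsi/Qra-13b")  would raise ValueError with a hint
```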

<!-- gh-comment-id:2043612359 -->

@emsi commented on GitHub (Apr 9, 2024):

Hi!

Thanks @jmorganca for the reply. It's a pity that the web UI offers a validator while ollama itself just spits out a bogus error message that gives no clue about the cause of the problem.

<!-- gh-comment-id:2045323231 -->

Reference: github-starred/ollama#48695