[GH-ISSUE #7344] After some time idle / phone standby, returning to the Termux `ollama run` command restarts the download from 0 #30427

Open
opened 2026-04-22 10:02:42 -05:00 by GiteaMirror · 10 comments
Owner

Originally created by @fxmbsw7 on GitHub (Oct 24, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7344

What is the issue?

So I know Ollama can resume downloads, but the following issue has now happened to me twice, on different model downloads:
I run `ollama run model` and it starts downloading.
I can switch apps and switch back to Termux/Ollama with no problem.
But after the screen has been off for some time, I return to Termux and see the download has started from 0 again...

OS

Linux

GPU

Other

CPU

Other

Ollama version

0.3.14

GiteaMirror added the needs more info, bug labels 2026-04-22 10:02:42 -05:00
Author
Owner

@pdevine commented on GitHub (Oct 24, 2024):

Can you post some steps (maybe with screenshots?) to reproduce? I was looking at this yesterday and it seemed to be working correctly.

Author
Owner

@fxmbsw7 commented on GitHub (Oct 24, 2024):

To reproduce, maybe try this: use Termux to `ollama run` a large new download.
Then either cycle through other apps, or just stay in Termux.
Turn the display off for 30 minutes or more (I don't know exactly how long).
Then return to Termux/Ollama.
For me it happened twice on one model and once now on another.
The strange part is that it clearly can resume; I just don't know what makes it start from 0.

Author
Owner

@fxmbsw7 commented on GitHub (Oct 25, 2024):

I don't have screenshots; it would just be Termux with the ollama command.

Author
Owner

@dhiltgen commented on GitHub (Nov 5, 2024):

@fxmbsw7 how are you running the serve command under termux? Is it via a systemd service, or through some other mechanism? Is it possible when Ollama is backgrounded for a while the OS or scheduler is shutting down the server? If so, that would explain the behavior. We don't persist knowledge of inflight downloads on shutdown, and on next server start, we scan the filesystem for orphan blobs and go through a garbage collection pass to clean up the filesystem.

Author
Owner

@fxmbsw7 commented on GitHub (Nov 5, 2024):

Hi there.
I use an alias script which does something like: `ollama serve &>>~/log &`,
so it would be backgrounded in the current (or an older) shell.
On your point about the server shutting down: it may have, but I don't think so, for two reasons.
First, I haven't noticed it closing; I think subsequent ollama commands would
fail without running the serve command again, which I don't remember doing.
And second, if serve had closed, wouldn't the running download fail at the same time?

Android may have closed things; that does seem like the issue, but I
can't tell for sure.


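The alias above leaves `ollama serve` tied to the interactive shell, so Android can reclaim it along with Termux's session when the device idles. A possible hardening, sketched below, acquires a wake lock (`termux-wake-lock` ships with Termux; the guard makes it a no-op elsewhere) and detaches the server with `nohup`; the log path `~/ollama.log` is illustrative.

```shell
# Hold a wake lock so Android is less likely to freeze or kill Termux
# while the display is off (skipped outside Termux).
command -v termux-wake-lock >/dev/null 2>&1 && termux-wake-lock

# Detach the server from the spawning shell and append all output to a
# log file that survives for later debugging.
nohup ollama serve >>~/ollama.log 2>&1 &
pid=$!
echo "ollama serve running as pid $pid"
```

Keeping the log around also directly addresses the "can you share the server log?" request below in this thread.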
Author
Owner

@dhiltgen commented on GitHub (Nov 6, 2024):

Can you share the server log? Maybe there's something interesting in there when the failure occurs.

Author
Owner

@fxmbsw7 commented on GitHub (Nov 6, 2024):

Ah, that's a great idea, but half a day ago
I had to reset Termux,
so sorry, I don't have the logs anymore.


Author
Owner

@fxmbsw7 commented on GitHub (Dec 6, 2024):

Hello, I have an example:
I open Termux and run a model download.
I go to the home screen, disable mobile data (my internet connection), and enable it again.
Back in Termux, the Ollama download starts/started from 0.

I later tested without switching away from Termux: the download stalled, without any sign of continuing or re-downloading.

https://github.com/user-attachments/assets/fe48eadd-37e4-4cfe-8d3f-a03ba61906ad

Author
Owner

@fxmbsw7 commented on GitHub (Dec 6, 2024):

On a second try without leaving Termux, the download continued fine.

Author
Owner

@fxmbsw7 commented on GitHub (Dec 6, 2024):

On later tries, switching to the home screen and back,
the Termux download restarted again and again, but from the last saved part (189 MB in my case now).


Reference: github-starred/ollama#30427