[GH-ISSUE #5649] My Ollama stopped working to transcribe videos. #29283

Closed
opened 2026-04-22 08:00:50 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @TioJota on GitHub (Jul 12, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5649

Originally assigned to: @jmorganca on GitHub.

What is the issue?

I updated to llama3. I use the SubtitleEdit program to translate subtitles into other languages; the program translates correctly, but when I want to transcribe from English to Spanish it gives me the following message:

**"Ollama (local LLM)" requires a web server running locally!
{"error":"CreateFile C:\Users\jgcar\.ollama\models\blobs\sha256-00e1317cbf74d901080d7100f57580ba8dd8de57203072dc6f668324ba545f29: The system cannot find the file specified."}

Read more?**

I don't know anything at all about programming; I'm just a video editor. If you could help me figure out how to solve it, I would appreciate it.
![Captura de pantalla 2024-07-12 035221](https://github.com/user-attachments/assets/b1fa6d8e-79a6-43d2-87d5-dd2a321ded6d)
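For context, the blob named in the error lives under `C:\Users\jgcar\.ollama\models\blobs`, and "The system cannot find the file specified" indicates that part of the model's data is missing from disk. As a minimal sketch only (assuming Ollama's default local API at `http://localhost:11434`), the following illustrates how one could check whether the local server is reachable and which models it currently reports:

```python
# Minimal sketch, not part of SubtitleEdit or Ollama: query the default
# local Ollama API (GET /api/tags) to see which models are installed.
import json
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=5) as resp:
        models = json.load(resp).get("models", [])
        print("Installed models:", [m.get("name") for m in models])
except OSError as err:
    # If this fails, the "requires a web server running locally" message
    # is expected: the Ollama server is not reachable.
    print("Ollama server not reachable:", err)
```

If `llama3` is missing from the list (or the request fails), re-downloading the model, as suggested in the comments below, is the usual fix.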

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

llama3

GiteaMirror added the bug label 2026-04-22 08:00:50 -05:00
Author
Owner

@jmorganca commented on GitHub (Jul 12, 2024):

Hi @TioJota I'm sorry about that – would it be possible to run `ollama pull llama3` in a terminal window (you can use `cmd.exe` or `powershell.exe`) - let me know if that works. Also feel free to shoot me an email and I'm happy to help walk you through it there or via a voice call on discord or similar 😊
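For anyone who prefers scripting over typing the command directly, a minimal sketch (assuming the `ollama` executable is on PATH) that runs the same `ollama pull llama3` command:

```python
# Minimal sketch: shell out to the same "ollama pull llama3" command the
# comment above suggests, assuming the ollama executable is on PATH.
import subprocess

result = subprocess.run(["ollama", "pull", "llama3"], capture_output=True, text=True)
print(result.stdout)
if result.returncode != 0:
    print("pull failed:", result.stderr)
```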

Author
Owner

@TioJota commented on GitHub (Jul 12, 2024):

Hi @jmorganca, I just sent you an email, check your inbox

Author
Owner

@TioJota commented on GitHub (Jul 16, 2024):

@jmorganca hey!!!

Author
Owner

@jmorganca commented on GitHub (Nov 17, 2024):

Hi @TioJota thanks for the issue! Sorry for the delay. Make sure Ollama's running – you should be able to see it in the system tray in the bottom right:

![image](https://github.com/user-attachments/assets/38e73316-1cac-458c-adea-a908dabae5c9)

Let me know if you're still having issues
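
For a quick check without looking at the tray icon, a minimal sketch (assuming the default local endpoint `http://localhost:11434`) that pings the server, which answers its root path with "Ollama is running" when it is up:

```python
# Minimal sketch: ping Ollama's default local endpoint to see whether the
# server is running. A healthy server answers the root path with
# "Ollama is running".
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:11434/", timeout=5) as resp:
        print(resp.read().decode())
except OSError as err:
    print("Ollama does not appear to be running:", err)
```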
