[GH-ISSUE #3670] how use local model? #64298

Closed
opened 2026-05-03 16:59:13 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @njhouse365 on GitHub (Apr 16, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3670

What is the issue?

1

What did you expect to see?

No response

Steps to reproduce

No response

Are there any recent changes that introduced the issue?

No response

OS

No response

Architecture

No response

Platform

No response

Ollama version

No response

GPU

No response

GPU info

No response

CPU

No response

Other software

No response

GiteaMirror added the bug label 2026-05-03 16:59:13 -05:00
Author
Owner

@thinkverse commented on GitHub (Apr 16, 2024):

Reading the documentation on importing a GGUF, PyTorch, or Safetensors model might be helpful: https://github.com/ollama/ollama/blob/main/docs/import.md
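As a rough sketch of the GGUF import workflow that doc describes (the file path `./my-model.gguf` and the model name `my-local-model` are placeholders, not values from this issue):

```shell
# Write a minimal Modelfile that points at a local GGUF weights file
# (./my-model.gguf is a placeholder path)
cat > Modelfile <<'EOF'
FROM ./my-model.gguf
EOF

# Register the model with Ollama, then run it interactively
ollama create my-local-model -f Modelfile
ollama run my-local-model
```

See the linked import.md for the PyTorch and Safetensors variants, which require a conversion step before the `ollama create` call.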

Reference: github-starred/ollama#64298