Mirror of https://github.com/ollama/ollama.git (synced 2026-05-08 00:51:34 -05:00)
Status: Closed
Opened 2026-05-04 11:44:46 -05:00 by GiteaMirror · 191 comments
Labels: Mirrored from GitHub Pull Request, networking
Reference: github-starred/ollama#67807
Originally created by @auserwn on GitHub (Feb 6, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8873
C:\Windows\System32>ollama -v
ollama version is 0.5.7
C:\Windows\System32>ollama run deepseek-r1:1.5b
pulling manifest
Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
What error is this and how do I solve it?
@Mrkk1 commented on GitHub (Feb 6, 2025):
I was pulling normally just now when pulls were interrupted across all servers.
@rjindael commented on GitHub (Feb 6, 2025):
Seems to be that the server is currently overloaded. I am receiving the same issue
@ariyankabir7 commented on GitHub (Feb 6, 2025):
Let's just wait without flooding the thread with comments and let the developers do their work. 😑
@Zeying-Gong commented on GitHub (Feb 6, 2025):
Same here
@Marvin4444 commented on GitHub (Feb 6, 2025):
same problem
@Leibnizhu commented on GitHub (Feb 6, 2025):
Same here
@Mrkk1 commented on GitHub (Feb 6, 2025):
I suspect developers pulling Ollama models around the world will soon be asking questions under this post
@mickyss commented on GitHub (Feb 6, 2025):
Same here
@TT0824 commented on GitHub (Feb 6, 2025):
Same here
@neverbiasu commented on GitHub (Feb 6, 2025):
root@02444a7210b2:/workspace# ollama run deepseek-r1:14b
pulling manifest
Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
same on Linux
@somera commented on GitHub (Feb 6, 2025):
Same on linux.
@hxf4869 commented on GitHub (Feb 6, 2025):
Same here
@gulihua10010 commented on GitHub (Feb 6, 2025):
same on mac Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
@thekumi commented on GitHub (Feb 6, 2025):
It's probably the server being overloaded or something. The error comes straight from the server, e.g. here: https://registry.ollama.ai/v2/library/llama3/manifests/latest
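A quick way to confirm the failure is server-side is to request the manifest endpoint directly and look at the HTTP status. This is a sketch using the registry URL quoted above; the model name and tag are just examples:

```shell
# Fetch only the HTTP status code of the registry manifest endpoint.
MODEL=llama3
TAG=latest
URL="https://registry.ollama.ai/v2/library/${MODEL}/manifests/${TAG}"
# -s: silent, -o /dev/null: discard body, -w: print just the status code.
# A 500 here means the registry itself is failing, not your local setup.
curl -s -o /dev/null -w "%{http_code}\n" "$URL" || echo "request failed"
```

A 200 with a JSON body means the registry is healthy and the problem is local.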
@linlinlin530 commented on GitHub (Feb 6, 2025):
same on macOS just now, just 10 mins ago I was still pulling
@catna commented on GitHub (Feb 6, 2025):
Same here
Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
@wusanshou2017 commented on GitHub (Feb 6, 2025):
same on both windows and linux now
@Cakeday commented on GitHub (Feb 6, 2025):
same on m1 Mac -- tried reinstalling -- same thing
@mkblbj commented on GitHub (Feb 6, 2025):
same!
@HilariousJi commented on GitHub (Feb 6, 2025):
same here on Windows
@wuyanzu commented on GitHub (Feb 6, 2025):
same here
@ayylmaonade commented on GitHub (Feb 6, 2025):
It's happening across all OS. Linux & Windows both impacted here.
@peakhell commented on GitHub (Feb 6, 2025):
same here
@MR6464466 commented on GitHub (Feb 6, 2025):
same here
@SummerSec commented on GitHub (Feb 6, 2025):
same here
@machaojin1917939763 commented on GitHub (Feb 6, 2025):
same here
@walawalala commented on GitHub (Feb 6, 2025):
same here
@yangxiaoyx commented on GitHub (Feb 6, 2025):
Same here
Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
@ra1mj commented on GitHub (Feb 6, 2025):
same here
@BigPineappleDe commented on GitHub (Feb 6, 2025):
same here
@bjttt commented on GitHub (Feb 6, 2025):
same here
@bmw8080 commented on GitHub (Feb 6, 2025):
same here~
@reTsubasa commented on GitHub (Feb 6, 2025):
same here
@GeekDaiDao commented on GitHub (Feb 6, 2025):
same here~
@LeonardZhang commented on GitHub (Feb 6, 2025):
same here
@Biao-K commented on GitHub (Feb 6, 2025):
same here
@thekumi commented on GitHub (Feb 6, 2025):
More "same here" messages are not going to help anyone, it's abundantly clear at this point that everyone is affected. Please don't spam developers.
@jiongjiong commented on GitHub (Feb 6, 2025):
same here
@zt1115798334 commented on GitHub (Feb 6, 2025):
I'm running into the same problem too.
@Konk32 commented on GitHub (Feb 6, 2025):
Same here
@Soporior commented on GitHub (Feb 6, 2025):
same here
@zhangningfang commented on GitHub (Feb 6, 2025):
same here
@WZS666 commented on GitHub (Feb 6, 2025):
Same here 🎠
@abadfox233 commented on GitHub (Feb 6, 2025):
same here
@DongShengWork commented on GitHub (Feb 6, 2025):
+1
@xiaomaofeng commented on GitHub (Feb 6, 2025):
Same here
@auserwn commented on GitHub (Feb 6, 2025):
Have you deployed this before? This is my first time using ollama; I wanted to set up deepseek and try it out.
@lfhy commented on GitHub (Feb 6, 2025):
+1
@ziuyung commented on GitHub (Feb 6, 2025):
+1
@QFJiao commented on GitHub (Feb 6, 2025):
pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]} +1 When will this be fixed? Downloading was slow enough already.
@xiaohei07 commented on GitHub (Feb 6, 2025):
+1
@ImproveRei0 commented on GitHub (Feb 6, 2025):
same here, a fresh bug
@chenggx commented on GitHub (Feb 6, 2025):
Same here too
@suhli commented on GitHub (Feb 6, 2025):
same here
@prideli0310 commented on GitHub (Feb 6, 2025):
+1
@PengGYing commented on GitHub (Feb 6, 2025):
+1
@reele commented on GitHub (Feb 6, 2025):
+1
@irisOvo commented on GitHub (Feb 6, 2025):
same here
@OneYiLei commented on GitHub (Feb 6, 2025):
+1
@kimiisme commented on GitHub (Feb 6, 2025):
+10086
@SCUlevi commented on GitHub (Feb 6, 2025):
same here
@JX223 commented on GitHub (Feb 6, 2025):
+1
@Samge0 commented on GitHub (Feb 6, 2025):
+1
@i7eo commented on GitHub (Feb 6, 2025):
Please look at this @Leibnizhu
@sadness-sa commented on GitHub (Feb 6, 2025):
+1
@2h-Lin commented on GitHub (Feb 6, 2025):
same here
@qwertyu45rtdj commented on GitHub (Feb 6, 2025):
same here
@JARVGOD commented on GitHub (Feb 6, 2025):
+1
@abulili commented on GitHub (Feb 6, 2025):
Same here
@CorvusCorax1024 commented on GitHub (Feb 6, 2025):
+1
@afang518 commented on GitHub (Feb 6, 2025):
😯
@guoer577 commented on GitHub (Feb 6, 2025):
+1
@Yuding-Zhang commented on GitHub (Feb 6, 2025):
same on windows
@xiaofengcanyue-d commented on GitHub (Feb 6, 2025):
+1,sooo Fresh bugs🤣🤣🤣
@QGtiger commented on GitHub (Feb 6, 2025):
+1
@liberty4me commented on GitHub (Feb 6, 2025):
+1
@wjl1272165766 commented on GitHub (Feb 6, 2025):
Same here
@codingloverzz commented on GitHub (Feb 6, 2025):
same here
@deno-source commented on GitHub (Feb 6, 2025):
Same here
@Mrkk1 commented on GitHub (Feb 6, 2025):
After the Spring Festival, everyone starts to use deepseek, right?
@dahuoshi commented on GitHub (Feb 6, 2025):
Same here
@anson920520 commented on GitHub (Feb 6, 2025):
same here
@xuzheng0017 commented on GitHub (Feb 6, 2025):
same here
@heroinlin commented on GitHub (Feb 6, 2025):
same here
@WBarbar commented on GitHub (Feb 6, 2025):
It keeps happening:
pulling manifest
Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
@mipmip commented on GitHub (Feb 6, 2025):
same here
@WBarbar commented on GitHub (Feb 6, 2025):
C:\Users\84304>ollama -v
ollama version is 0.5.7
C:\Users\84304>ollama run deepseek-r1:1.5b
pulling manifest
Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
Hoping the devs can fix this.
@xinshou-xin commented on GitHub (Feb 6, 2025):
same here. When will it be fixed?
@BigBossWHD commented on GitHub (Feb 6, 2025):
same here on MacOS 10min ago
@Zoexyz77 commented on GitHub (Feb 6, 2025):
same
@MoZiBai commented on GitHub (Feb 6, 2025):
Same here
@wwpu commented on GitHub (Feb 6, 2025):
what happened
@binz123 commented on GitHub (Feb 6, 2025):
same here on Linux (Ubuntu), 10 min ago
@wangybbj commented on GitHub (Feb 6, 2025):
Same here
@Phoenix-PKU commented on GitHub (Feb 6, 2025):
same on MacOS
@FT-Fetters commented on GitHub (Feb 6, 2025):
Same here
@ColorCJY commented on GitHub (Feb 6, 2025):
Is the server down because too many people are pulling?
@1SolPi1 commented on GitHub (Feb 6, 2025):
Same here
@Asabis commented on GitHub (Feb 6, 2025):
Same here
@maskedman227 commented on GitHub (Feb 6, 2025):
Same here
@acshmily commented on GitHub (Feb 6, 2025):
If anyone else wants to use deepseek, you can try ModelScope instead.
Visit modelscope to learn how to use it.
A deepseek GGUF model is here: https://www.modelscope.cn/models/unsloth/DeepSeek-R1-Distill-Qwen-32B-GGUF
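Building on the ModelScope suggestion above, a downloaded GGUF file can be imported into ollama via a Modelfile and ollama create. This is a sketch; the GGUF filename and local model name are illustrative assumptions:

```shell
# Sketch: import a locally downloaded GGUF into ollama.
# The filename below is an assumption; use the file you actually downloaded.
GGUF=DeepSeek-R1-Distill-Qwen-32B-Q4_K_M.gguf

# A minimal Modelfile that points at the local weights.
cat > Modelfile <<EOF
FROM ./$GGUF
EOF

# Then register and run it (requires the ollama daemon to be running):
#   ollama create deepseek-r1-local -f Modelfile
#   ollama run deepseek-r1-local
```

This bypasses the registry entirely, so it works even while the pull endpoint is down.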
@Harrytty commented on GitHub (Feb 6, 2025):
same here
@guoer577 commented on GitHub (Feb 6, 2025):
Neither my Mac nor my PC can pull. It must be a problem with the ollama server! Looking for solutions.
@cristinarccc commented on GitHub (Feb 6, 2025):
same here
@Zevi-zzy commented on GitHub (Feb 6, 2025):
same problem
@thekumi commented on GitHub (Feb 6, 2025):
@guoer577 It is. At this point, the only solution is waiting for the server to be fixed.
@Pendlers commented on GitHub (Feb 6, 2025):
same here
@mYoCaRdiA commented on GitHub (Feb 6, 2025):
+1
@edwarddu commented on GitHub (Feb 6, 2025):
same here... when will it recover?
@LLL672-ae commented on GitHub (Feb 6, 2025):
pulling manifest
Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
why?
@ElonMikeLi commented on GitHub (Feb 6, 2025):
So many people hitting the same error at the same time; maybe it's not a problem on our end. Could it be that the DeepSeek server crashed, so we can't pull models?
@mg-kell0806 commented on GitHub (Feb 6, 2025):
same here
@outman000 commented on GitHub (Feb 6, 2025):
same here
@mYoCaRdiA commented on GitHub (Feb 6, 2025):
+1
@jackywood commented on GitHub (Feb 6, 2025):
Error: pull model manifest: 500: I just ran into it too.
@AnGoYang commented on GitHub (Feb 6, 2025):
same here on MacOS
@DongShengWork commented on GitHub (Feb 6, 2025):
@mYoCaRdiA commented on GitHub (Feb 6, 2025):
+1
@35763670SpadeKing commented on GitHub (Feb 6, 2025):
Same here, no idea how to fix it.
@mYoCaRdiA commented on GitHub (Feb 6, 2025):
what happened goooooooooooood
@xavierchan commented on GitHub (Feb 6, 2025):
Just calm down; this is a server-side bug. Just wait for it to be fixed.
@codingforward commented on GitHub (Feb 6, 2025):
same here:
C:\Users\Administrator>ollama run deepseek-r1:7b
pulling manifest
Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
@dahuoshi commented on GitHub (Feb 6, 2025):
@aitransformer commented on GitHub (Feb 6, 2025):
waiting
@mYoCaRdiA commented on GitHub (Feb 6, 2025):
The servers on their website are down
@mg-kell0806 commented on GitHub (Feb 6, 2025):
Not only does the deepseek model fail to pull; other models fail to pull as well.
@TechEnthusiast41 commented on GitHub (Feb 6, 2025):
the same problem:
pulling manifest
Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
@mYoCaRdiA commented on GitHub (Feb 6, 2025):
the ollama server crashed, not deepseek
@AllenPu2020 commented on GitHub (Feb 6, 2025):
same problem
-> % ollama run deepseek-r1:14b
pulling manifest
Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
@TanShun commented on GitHub (Feb 6, 2025):
Same here, all models can't be pulled, not just deepseek.
@xiaosheng-conk commented on GitHub (Feb 6, 2025):
Same problem here, when downloading deepseek-r1:14b.
@MoZiBai commented on GitHub (Feb 6, 2025):
the ollama server crashed, not deepseek
@wesley1211 commented on GitHub (Feb 6, 2025):
same here. Might be too many people pulling the deepseek model.
@kuiyuboy commented on GitHub (Feb 6, 2025):
Why? I was finally almost done, and suddenly this happens!!
@ComradePenguin-1917 commented on GitHub (Feb 6, 2025):
same. You can try downloading from Hugging Face or ModelScope and installing manually.
@ElonMikeLi commented on GitHub (Feb 6, 2025):
Oh, that's too bad; maybe just wait.
@TechEnthusiast41 commented on GitHub (Feb 6, 2025):
you can only wait /(ㄒoㄒ)/~~
@yang-fei commented on GitHub (Feb 6, 2025):
same here
@Rilfoyle commented on GitHub (Feb 6, 2025):
same here
@ruochen-yyb commented on GitHub (Feb 6, 2025):
same here
@wangshan-aqi commented on GitHub (Feb 6, 2025):
same here
@ycbing commented on GitHub (Feb 6, 2025):
same here
@EvolvedGhost commented on GitHub (Feb 6, 2025):
same here
@jackeyhao commented on GitHub (Feb 6, 2025):
Same here as well.
@Eliot-Shen commented on GitHub (Feb 6, 2025):
The same. Let me guess: we're all downloading some version of the deepseek model, right? hhh
@SacribethGrass commented on GitHub (Feb 6, 2025):
same here. I thought the problem was on my end.
@vatofichor commented on GitHub (Feb 6, 2025):
Very interesting to see this outage; it happens with any model pull.
@qianxingkeji commented on GitHub (Feb 6, 2025):
Same here
@LonerDo commented on GitHub (Feb 6, 2025):
same here
@JudeNiroshan commented on GitHub (Feb 6, 2025):
Please stop commenting "same here".
It makes too much noise and doesn't help anyone. We all know it's broken. Give it some time; they'll provide a solution or a workaround.
@vatofichor commented on GitHub (Feb 6, 2025):
the llama2-uncensored model got nerfed
@NingNingRobot commented on GitHub (Feb 6, 2025):
hugging face servers were interrupted
@ColdWindScholar commented on GitHub (Feb 6, 2025):
same here
@Se1ker commented on GitHub (Feb 6, 2025):
same here
@z-qh commented on GitHub (Feb 6, 2025):
same here
@Tan-Yi commented on GitHub (Feb 6, 2025):
Error: pull model manifest: 500: {"errors":[{"code":"INTERNAL_ERROR","message":"internal error"}]}
@Zha-Miku commented on GitHub (Feb 6, 2025):
same here
@Jessmin commented on GitHub (Feb 6, 2025):
same here
@hxlive commented on GitHub (Feb 6, 2025):
same here
@SacribethGrass commented on GitHub (Feb 6, 2025):
It seems to work now; it looks like it's been fixed.
@OneYiLei commented on GitHub (Feb 6, 2025):
It should work now.
@Jessmin commented on GitHub (Feb 6, 2025):
It seems Okay
@SacribethGrass commented on GitHub (Feb 6, 2025):
@ClickListener commented on GitHub (Feb 6, 2025):
It works now.
@wangshan-aqi commented on GitHub (Feb 6, 2025):
@ColdWindScholar commented on GitHub (Feb 6, 2025):
thanks sir!!! its okay now!
@qianxingkeji commented on GitHub (Feb 6, 2025):
It is Okay
@vatofichor commented on GitHub (Feb 6, 2025):
Yay, everyone's having a pull party; I'm sure the servers will love this right after a crash.
@Zha-Miku commented on GitHub (Feb 6, 2025):
Are you all pulling the deepseek-r1 model? 😀
@MoZiBai commented on GitHub (Feb 6, 2025):
It is Okay now
@qianxingkeji commented on GitHub (Feb 6, 2025):
yes
@WeikangLin93 commented on GitHub (Feb 6, 2025):
It is Okay now
@OneYiLei commented on GitHub (Feb 6, 2025):
deepseek-r1 good!!!!!
@Wetcue commented on GitHub (Feb 6, 2025):
just because of deepseek!
@Eliot-Shen commented on GitHub (Feb 6, 2025):
Engineers and students worldwide doing the same thing at the same time. Nice feeling.
@Fackyhub commented on GitHub (Feb 6, 2025):
I had already downloaded 70%; all that time spent. Damn!
@wjl1272165766 commented on GitHub (Feb 6, 2025):
It seems Okay
@Zha-Miku commented on GitHub (Feb 6, 2025):
I will get it!!!
@xiaosheng-conk commented on GitHub (Feb 6, 2025):
@qwertyu45rtdj commented on GitHub (Feb 6, 2025):
How do I re-download it?
@wangybbj commented on GitHub (Feb 6, 2025):
It's ok now
@ElonMikeLi commented on GitHub (Feb 6, 2025):
@jojochen81 commented on GitHub (Feb 6, 2025):
It's working now.
@AllenPu2020 commented on GitHub (Feb 6, 2025):
@aka-keqing commented on GitHub (Feb 6, 2025):
it's ok now
@dahuoshi commented on GitHub (Feb 6, 2025):
Service was restored
@yingxiaaa7 commented on GitHub (Feb 6, 2025):
I'd already downloaded 90%; now I have to redownload, and it's so slow. But I'm still grateful to the developers.
@wsjkx commented on GitHub (Feb 6, 2025):
It's ok now. I'm starting download now.
@AllenPu2020 commented on GitHub (Feb 6, 2025):
If I want to save docker images, I use "docker save -o my_image.tar my_image:tag", then copy my_image.tar to the other computer and use "docker load -i my_image.tar" to load it. So, is there an ollama equivalent: ollama save models? ollama load models?
@jmorganca commented on GitHub (Feb 6, 2025):
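On the docker-style save/load question above: as far as I know, ollama has no save or load subcommands, but the on-disk model store (manifests plus blobs) can be archived and copied between machines. This is a sketch under the assumption of the common default paths, which may differ per install:

```shell
# Sketch: copy ollama models between machines by archiving the model store.
# Common default locations (assumptions; check your install):
#   macOS / Linux (user install):  ~/.ollama/models
#   Linux (systemd service):       /usr/share/ollama/.ollama/models
SRC="$HOME/.ollama/models"
DEST=ollama-models.tar

# Archive manifests and blobs together; both are required for a model to load.
tar -cf "$DEST" -C "$(dirname "$SRC")" "$(basename "$SRC")" 2>/dev/null || true

# On the target machine (user install):
#   tar -xf ollama-models.tar -C ~/.ollama/
```

After extracting on the target machine, the models should appear in "ollama list" without re-downloading.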
Hi folks, so sorry about this, ollama.com was impacted by last night's Cloudflare R2 outage. It should be back to normal now
@PixelSymbols commented on GitHub (Nov 18, 2025):
same issue: https://registry.ollama.ai/v2/library/llama3/manifests/latest
Internal server error (error code 500)
Visit cloudflare.com for more information.
2025-11-18 14:16:09 UTC