[GH-ISSUE #6890] "/show parameters" command causes crashes when running Qwen 2.5 models, on version 0.3.11 #4359

Closed
opened 2026-04-12 15:17:55 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @ghost on GitHub (Sep 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6890

What is the issue?

This only happens after changing parameters through the /set parameter command.
Here's an example:
PS H:\ztmp> ollama run qwen2.5
>>> /show parameters
No parameters were specified for this model.
>>> /set parameter top_k 1
Set parameter 'top_k' to '1'
>>> /show parameters
error: couldn't get model
Error: something went wrong, please see the ollama server logs for details
PS H:\ztmp>

This is on Windows 11.

Here's the error message from the ollama serve terminal tab:
2024/09/20 08:47:43 [Recovery] 2024/09/20 - 08:47:43 panic recovered:
assignment to entry in nil map
runtime/map_faststr.go:205 (0x7a93ba)
github.com/ollama/ollama/server/routes.go:807 (0x12cc57e)
github.com/ollama/ollama/server/routes.go:732 (0x12cb497)
github.com/gin-gonic/gin@v1.10.0/context.go:185 (0x1287cca)
github.com/ollama/ollama/server/routes.go:1076 (0x12d0c14)
github.com/gin-gonic/gin@v1.10.0/context.go:185 (0x1295d39)
github.com/gin-gonic/gin@v1.10.0/recovery.go:102 (0x1295d27)
github.com/gin-gonic/gin@v1.10.0/context.go:185 (0x1294e64)
github.com/gin-gonic/gin@v1.10.0/logger.go:249 (0x1294e4b)
github.com/gin-gonic/gin@v1.10.0/context.go:185 (0x1294291)
github.com/gin-gonic/gin@v1.10.0/gin.go:633 (0x1293d00)
github.com/gin-gonic/gin@v1.10.0/gin.go:589 (0x1293831)
net/http/server.go:2688 (0xafaecc)
net/http/server.go:3142 (0xafc6cd)
net/http/server.go:2044 (0xaf79c7)
runtime/asm_amd64.s:1695 (0x8026a0)

OS

Windows

GPU

Nvidia

CPU

AMD

Ollama version

0.3.11

GiteaMirror added the bug label 2026-04-12 15:17:55 -05:00

@jonbuffington commented on GitHub (Dec 13, 2024):

I do not think this is model-specific. For example, I can reproduce it using llama3.2:1b. My steps are:

% ollama --version      
ollama version is 0.5.1
% ollama run llama3.2:1b
>>> /set parameter temperature 0.5
Set parameter 'temperature' to '0.5'
>>> /show parameters
error: couldn't get model
Error: something went wrong, please see the ollama server logs for details
%
Reference: github-starred/ollama#4359