[GH-ISSUE #5289] bug: ollama show bakllava:latest panic: interface conversion: interface {} is nil, not string #3314

Closed
opened 2026-04-12 13:53:04 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @silentoplayz on GitHub (Jun 26, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5289

Originally assigned to: @royjhan on GitHub.

What is the issue?

When I run the command:

ollama show bakllava:latest

(Specifically this model, https://ollama.com/library/bakllava:latest, though I haven't tried other quantizations.) In Command Prompt, I get this error back instantly:

C:\Users\G30>ollama show bakllava:latest
panic: interface conversion: interface {} is nil, not string

goroutine 1 [running]:
github.com/ollama/ollama/cmd.ShowHandler(0xc000549b08, {0xc0004810f0, 0x1, 0x141ea86?})
        github.com/ollama/ollama/cmd/cmd.go:675 +0x18a5
github.com/spf13/cobra.(*Command).execute(0xc000549b08, {0xc0004810c0, 0x1, 0x1})
        github.com/spf13/cobra@v1.7.0/command.go:940 +0x882
github.com/spf13/cobra.(*Command).ExecuteC(0xc000549508)
        github.com/spf13/cobra@v1.7.0/command.go:1068 +0x3a5
github.com/spf13/cobra.(*Command).Execute(...)
        github.com/spf13/cobra@v1.7.0/command.go:992
github.com/spf13/cobra.(*Command).ExecuteContext(...)
        github.com/spf13/cobra@v1.7.0/command.go:985
main.main()
        github.com/ollama/ollama/main.go:11 +0x4d
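For context, a panic with the message "interface conversion: interface {} is nil, not string" is what Go produces when code does an unchecked type assertion on a `map[string]interface{}` entry whose key is absent (the value is `nil`). A minimal sketch of the failure mode and the safe comma-ok form; the `general.architecture` key and the map below are hypothetical illustrations, not Ollama's actual code:

```go
package main

import "fmt"

func main() {
	// Metadata map that is missing the key the caller expects.
	info := map[string]interface{}{}

	// Unchecked assertion: because info["general.architecture"] is nil,
	// this line would panic with:
	//   interface conversion: interface {} is nil, not string
	// arch := info["general.architecture"].(string)

	// Comma-ok assertion: ok is false instead of panicking.
	arch, ok := info["general.architecture"].(string)
	if !ok {
		arch = "(unknown)"
	}
	fmt.Println(arch)
}
```

If the bakllava manifest lacks a metadata field that `ShowHandler` asserts unconditionally, that would explain why only this one model triggers the panic while others display normally.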

I expected to be shown the model parameters, e.g.:

C:\Users\G30>ollama show llava-llama3:latest
  Model
        arch                    llama
        parameters              8B
        quantization            Q4_K_M
        context length          8192
        embedding length        4096

  Projector
        arch                            clip
        parameters                      311.89M
        projector type                  mlp
        embedding length                1024
        projection dimensionality       768

  Parameters
        num_ctx         4096
        num_keep        4
        stop            "<|start_header_id|>"
        stop            "<|end_header_id|>"
        stop            "<|eot_id|>"

Additional information:

  • After a thorough search, I've determined that bakllava:latest is the only problematic model installed on my machine: when checking the parameters and context length for each model in my Ollama models folder, only bakllava:latest returned this error instead of displaying model information.
  • I attempted to resolve the problem by running ollama rm bakllava:latest and then repulling the model, but the same error persists on Ollama v0.1.46.
  • Notably, the model itself runs without issues when used with Ollama or through a web interface; the problem only arises when running ollama show for this specific model.

OS

Edition: Windows 11 Pro
Version: 24H2
Installed on: ‎5/‎19/‎2024
OS build: 26120.770
Experience: Windows Feature Experience Pack 1000.26100.6.0

GPU

AMD RX 6800 XT

CPU

Intel i7-12700k

Ollama version

C:\Users\G30>ollama --version
ollama version is 0.1.46
GiteaMirror added the bug label 2026-04-12 13:53:04 -05:00

Reference: github-starred/ollama#3314