[PR #13300] [MERGED] test: add ministral-3 #14155

Closed · opened 2026-04-13 00:46:46 -05:00 by GiteaMirror (Owner) · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/13300
Author: @dhiltgen
Created: 12/2/2025
Status: Merged
Merged: 12/2/2025
Merged by: @dhiltgen

Base: `main` ← Head: `new_tests`


📝 Commits (1)

- [`566ffb1`](https://github.com/ollama/ollama/commit/566ffb15c85c56e95a352d35d980f51a0c3aa78d) test: add ministral-3

📊 Changes

2 files changed (+6 additions, -0 deletions)


📝 integration/llm_image_test.go (+3 -0)
📝 integration/utils_test.go (+3 -0)
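The six added lines are just new entries in the suites' model tables. A minimal sketch of what such a change looks like, assuming hypothetical slice names (`chatModels`, `visionModels`) since this page does not show the diff hunks themselves; the real identifiers live in `integration/utils_test.go` and `integration/llm_image_test.go` and may differ:

```go
package main

import "fmt"

// Hypothetical model tables standing in for the ones the integration
// suite ranges over in its table-driven tests.
var chatModels = []string{
	"llama3.2",
	"qwen3",
	"ministral-3", // the kind of one-line addition this PR makes
}

var visionModels = []string{
	"qwen2.5vl",
	"ministral-3", // added so the vision tests exercise it too
}

// contains reports whether s is present in xs.
func contains(xs []string, s string) bool {
	for _, x := range xs {
		if x == s {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println("chat covers ministral-3:", contains(chatModels, "ministral-3"))
	fmt.Println("vision covers ministral-3:", contains(visionModels, "ministral-3"))
}
```

Because the suites iterate over these slices, a one-line append is all it takes for every matching `-run` filter (chat, vision, tool calling, perf) to pick the model up.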

📄 Description

Examples:

```
go test ./integration -tags=integration,models -v -run TestModelsChat/ministral-3
=== RUN   TestModelsChat
time=2025-12-02T09:36:29.836-08:00 level=INFO msg="Setting timeouts" soft=7m59.999632292s hard=9m39.999632208s
time=2025-12-02T09:36:29.839-08:00 level=INFO msg="server connection" host=127.0.0.1 port=54156
time=2025-12-02T09:36:29.839-08:00 level=INFO msg="setting env" OLLAMA_HOST=127.0.0.1:54156
time=2025-12-02T09:36:29.840-08:00 level=INFO msg="starting server" url=127.0.0.1:54156
time=2025-12-02T09:36:30.342-08:00 level=WARN msg="No VRAM info available, testing all models, so larger ones might timeout..."
=== RUN   TestModelsChat/ministral-3
time=2025-12-02T09:36:30.342-08:00 level=INFO msg="checking status of model" model=ministral-3
time=2025-12-02T09:36:30.346-08:00 level=INFO msg="model missing" model=ministral-3
time=2025-12-02T09:36:51.208-08:00 level=INFO msg=loading model=ministral-3
time=2025-12-02T09:36:55.323-08:00 level=INFO msg="test pass" model=ministral-3 messages="[{Role:user Content:why is the sky blue? Be brief but factual in your reply Thinking: Images:[] ToolCalls:[] ToolName: ToolCallID:}]" contains="[rayleigh scatter atmosphere nitrogen oxygen wavelength interact]" response="The sky appears blue due to **Rayleigh scattering**, where shorter blue wavelengths of sunlight are scattered more by Earth's atmosphere than other colors. This scattered blue light reaches our eyes from all directions, making the sky look blue."
time=2025-12-02T09:36:55.328-08:00 level=INFO msg="shutting down server"
time=2025-12-02T09:36:55.328-08:00 level=INFO msg="waiting for server to exit"
time=2025-12-02T09:36:55.382-08:00 level=INFO msg="server exited"
time=2025-12-02T09:36:55.382-08:00 level=INFO msg="terminate complete"
time=2025-12-02T09:36:55.382-08:00 level=INFO msg="cleanup complete" failed=false
--- PASS: TestModelsChat (25.55s)
    --- PASS: TestModelsChat/ministral-3 (24.99s)
PASS
ok      github.com/ollama/ollama/integration    25.760s
```
```
go test ./integration -tags=integration,models -v -run TestVisionModels/ministral-3
=== RUN   TestVisionModels
=== RUN   TestVisionModels/ministral-3
time=2025-12-02T09:37:10.658-08:00 level=INFO msg="server connection" host=127.0.0.1 port=54222
time=2025-12-02T09:37:10.658-08:00 level=INFO msg="setting env" OLLAMA_HOST=127.0.0.1:54222
time=2025-12-02T09:37:10.658-08:00 level=INFO msg="starting server" url=127.0.0.1:54222
time=2025-12-02T09:37:10.783-08:00 level=INFO msg="checking status of model" model=ministral-3
time=2025-12-02T09:37:10.822-08:00 level=INFO msg="model already present" model=ministral-3
...
--- PASS: TestVisionModels (2.85s)
    --- PASS: TestVisionModels/ministral-3 (2.85s)
PASS
ok      github.com/ollama/ollama/integration    3.058s
```
```
go test ./integration -tags=integration,models -v -run TestAPIToolCalling/ministral-3
=== RUN   TestAPIToolCalling
time=2025-12-02T09:37:39.185-08:00 level=INFO msg="server connection" host=127.0.0.1 port=54239
time=2025-12-02T09:37:39.185-08:00 level=INFO msg="setting env" OLLAMA_HOST=127.0.0.1:54239
time=2025-12-02T09:37:39.185-08:00 level=INFO msg="starting server" url=127.0.0.1:54239
=== RUN   TestAPIToolCalling/ministral-3
time=2025-12-02T09:37:39.291-08:00 level=INFO msg="checking status of model" model=ministral-3
time=2025-12-02T09:37:39.332-08:00 level=INFO msg="model already present" model=ministral-3
time=2025-12-02T09:37:41.757-08:00 level=INFO msg="shutting down server"
time=2025-12-02T09:37:41.757-08:00 level=INFO msg="waiting for server to exit"
time=2025-12-02T09:37:41.805-08:00 level=INFO msg="server exited"
time=2025-12-02T09:37:41.805-08:00 level=INFO msg="terminate complete"
time=2025-12-02T09:37:41.805-08:00 level=INFO msg="cleanup complete" failed=false
--- PASS: TestAPIToolCalling (2.62s)
    --- PASS: TestAPIToolCalling/ministral-3 (2.47s)
PASS
ok      github.com/ollama/ollama/integration    2.829s
```
```
go test ./integration -tags=integration,library -v -run TestLibraryModelsChat/ministral-3
=== RUN   TestLibraryModelsChat
time=2025-12-02T09:38:29.812-08:00 level=INFO msg="Setting timeouts" soft=7m59.99952275s hard=9m39.999522667s
time=2025-12-02T09:38:29.815-08:00 level=INFO msg="server connection" host=127.0.0.1 port=54260
time=2025-12-02T09:38:29.815-08:00 level=INFO msg="setting env" OLLAMA_HOST=127.0.0.1:54260
time=2025-12-02T09:38:29.815-08:00 level=INFO msg="starting server" url=127.0.0.1:54260
=== RUN   TestLibraryModelsChat/ministral-3
time=2025-12-02T09:38:29.926-08:00 level=INFO msg="checking status of model" model=ministral-3
time=2025-12-02T09:38:29.968-08:00 level=INFO msg="model already present" model=ministral-3
time=2025-12-02T09:38:33.023-08:00 level=INFO msg="test pass" model=ministral-3 messages="[{Role:user Content:why is the sky blue? Be brief but factual in your reply Thinking: Images:[] ToolCalls:[] ToolName: ToolCallID:}]" contains="[rayleigh scatter atmosphere nitrogen oxygen wavelength interact]" response="The sky appears blue due to **Rayleigh scattering**, where shorter wavelengths of sunlight (blue and violet) are scattered more by Earth's atmosphere than longer wavelengths (red, orange). Our eyes are less sensitive to violet, so we perceive the sky as blue."
time=2025-12-02T09:38:33.024-08:00 level=INFO msg="shutting down server"
time=2025-12-02T09:38:33.024-08:00 level=INFO msg="waiting for server to exit"
time=2025-12-02T09:38:33.076-08:00 level=INFO msg="server exited"
time=2025-12-02T09:38:33.076-08:00 level=INFO msg="terminate complete"
time=2025-12-02T09:38:33.076-08:00 level=INFO msg="cleanup complete" failed=false
--- PASS: TestLibraryModelsChat (3.26s)
    --- PASS: TestLibraryModelsChat/ministral-3 (3.10s)
PASS
ok      github.com/ollama/ollama/integration    3.476s
```
```
go test ./integration -tags=integration,perf -v -run TestModelsPerf/ministral-3
=== RUN   TestModelsPerf
time=2025-12-02T09:39:05.386-08:00 level=INFO msg="Setting timeouts" soft=7m59.999644417s hard=9m39.999644334s
time=2025-12-02T09:39:05.391-08:00 level=INFO msg="server connection" host=127.0.0.1 port=54275
time=2025-12-02T09:39:05.391-08:00 level=INFO msg="setting env" OLLAMA_HOST=127.0.0.1:54275
time=2025-12-02T09:39:05.391-08:00 level=INFO msg="starting server" url=127.0.0.1:54275
time=2025-12-02T09:39:05.517-08:00 level=WARN msg="No VRAM info available, testing all models, so larger ones might timeout..."
=== RUN   TestModelsPerf/ministral-3:latest
time=2025-12-02T09:39:05.520-08:00 level=INFO msg="checking status of model" model=ministral-3:latest
time=2025-12-02T09:39:05.561-08:00 level=INFO msg="model already present" model=ministral-3:latest
time=2025-12-02T09:39:05.598-08:00 level=INFO msg=scneario model=ministral-3:latest max_context=262144
time=2025-12-02T09:39:08.527-08:00 level=INFO msg="Model fully loaded into GPU"
MODEL_PERF_HEADER:MODEL,CONTEXT,GPU PERCENT,APPROX PROMPT COUNT,LOAD TIME,PROMPT EVAL TPS,EVAL TPS
MODEL_PERF_DATA:ministral-3:latest,4096,100,550,1.24,649.80,57.27
time=2025-12-02T09:39:23.640-08:00 level=INFO msg="Model fully loaded into GPU"
MODEL_PERF_HEADER:MODEL,CONTEXT,GPU PERCENT,APPROX PROMPT COUNT,LOAD TIME,PROMPT EVAL TPS,EVAL TPS
MODEL_PERF_DATA:ministral-3:latest,4096,100,2860,0.08,803.32,45.46
time=2025-12-02T09:39:26.768-08:00 level=INFO msg="Model fully loaded into GPU"
MODEL_PERF_HEADER:MODEL,CONTEXT,GPU PERCENT,APPROX PROMPT COUNT,LOAD TIME,PROMPT EVAL TPS,EVAL TPS
MODEL_PERF_DATA:ministral-3:latest,8192,100,550,1.43,647.20,57.16
time=2025-12-02T09:39:47.892-08:00 level=INFO msg="Model fully loaded into GPU"
MODEL_PERF_HEADER:MODEL,CONTEXT,GPU PERCENT,APPROX PROMPT COUNT,LOAD TIME,PROMPT EVAL TPS,EVAL TPS
MODEL_PERF_DATA:ministral-3:latest,8192,100,5090,0.08,655.46,38.87
time=2025-12-02T09:39:51.513-08:00 level=INFO msg="Model fully loaded into GPU"
MODEL_PERF_HEADER:MODEL,CONTEXT,GPU PERCENT,APPROX PROMPT COUNT,LOAD TIME,PROMPT EVAL TPS,EVAL TPS
MODEL_PERF_DATA:ministral-3:latest,16384,100,550,1.83,582.22,57.45
time=2025-12-02T09:40:28.142-08:00 level=INFO msg="Model fully loaded into GPU"
MODEL_PERF_HEADER:MODEL,CONTEXT,GPU PERCENT,APPROX PROMPT COUNT,LOAD TIME,PROMPT EVAL TPS,EVAL TPS
MODEL_PERF_DATA:ministral-3:latest,16384,100,9260,0.08,498.92,31.02
time=2025-12-02T09:40:45.298-08:00 level=INFO msg="Model fully loaded into GPU"
MODEL_PERF_HEADER:MODEL,CONTEXT,GPU PERCENT,APPROX PROMPT COUNT,LOAD TIME,PROMPT EVAL TPS,EVAL TPS
MODEL_PERF_DATA:ministral-3:latest,262144,100,550,15.35,590.32,57.65
...
```
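The `MODEL_PERF_HEADER` / `MODEL_PERF_DATA` pairs in the perf run are plain CSV, so the output can be post-processed without special tooling. A small sketch (not part of the PR) that parses the data lines shown above into a struct, with the column order taken from the header the test prints:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// perfRow holds one MODEL_PERF_DATA line; field order follows the header:
// MODEL,CONTEXT,GPU PERCENT,APPROX PROMPT COUNT,LOAD TIME,PROMPT EVAL TPS,EVAL TPS
type perfRow struct {
	Model         string
	Context       int
	GPUPercent    int
	PromptCount   int
	LoadTime      float64
	PromptEvalTPS float64
	EvalTPS       float64
}

// parsePerfLine extracts the CSV payload from a MODEL_PERF_DATA log line.
func parsePerfLine(line string) (perfRow, error) {
	const prefix = "MODEL_PERF_DATA:"
	if !strings.HasPrefix(line, prefix) {
		return perfRow{}, fmt.Errorf("not a perf data line: %q", line)
	}
	f := strings.Split(strings.TrimPrefix(line, prefix), ",")
	if len(f) != 7 {
		return perfRow{}, fmt.Errorf("want 7 fields, got %d", len(f))
	}
	r := perfRow{Model: f[0]} // model keeps its tag, e.g. "ministral-3:latest"
	var err error
	if r.Context, err = strconv.Atoi(f[1]); err != nil {
		return r, err
	}
	if r.GPUPercent, err = strconv.Atoi(f[2]); err != nil {
		return r, err
	}
	if r.PromptCount, err = strconv.Atoi(f[3]); err != nil {
		return r, err
	}
	if r.LoadTime, err = strconv.ParseFloat(f[4], 64); err != nil {
		return r, err
	}
	if r.PromptEvalTPS, err = strconv.ParseFloat(f[5], 64); err != nil {
		return r, err
	}
	if r.EvalTPS, err = strconv.ParseFloat(f[6], 64); err != nil {
		return r, err
	}
	return r, nil
}

func main() {
	// Two data lines copied from the run above.
	for _, l := range []string{
		"MODEL_PERF_DATA:ministral-3:latest,4096,100,550,1.24,649.80,57.27",
		"MODEL_PERF_DATA:ministral-3:latest,16384,100,9260,0.08,498.92,31.02",
	} {
		r, err := parsePerfLine(l)
		if err != nil {
			panic(err)
		}
		fmt.Printf("%s ctx=%d prompt_tps=%.2f eval_tps=%.2f\n",
			r.Model, r.Context, r.PromptEvalTPS, r.EvalTPS)
	}
}
```

Splitting after the fixed `MODEL_PERF_DATA:` prefix keeps the `:latest` tag intact in the model field, since only commas separate the remaining columns.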

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

Reference: github-starred/ollama#14155