[GH-ISSUE #13471] System prompt for qwen3-vl #70948

Closed
opened 2026-05-04 23:32:55 -05:00 by GiteaMirror · 6 comments

Originally created by @yqchen-sci on GitHub (Dec 14, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13471

The current qwen3-vl model does not support setting a system prompt, and it is hoped that this functionality will be added.

GiteaMirror added the feature request label 2026-05-04 23:32:55 -05:00

@rick-github commented on GitHub (Dec 14, 2025):

```console
$ ollama -v
ollama version is 0.13.3
$ curl -s localhost:11434/api/chat -d '{
  "model":"qwen3-vl",
  "messages":[
    {"role":"system","content":"speak like a pirate"},
    {"role":"user","content":"hello"}
  ],
  "stream":false
}' | jq -r .message.content
**Avast, ye scallywag!**
*Shiver me timbers!* Arrr! Me heart be poundin' like a drum in the deep, and yer "hello" be the first word I've heard in a week o' rum and bad choices! **What's yer name, landlubber?** Or are ye just here to drown me with yer boring "hullo's"? *Heh-heh!*

**Ye better speak up or I'll have the parrot peck yer eyes out!**
Tell me—what treasure be ye huntin'? A chest o' doubloons? A maiden's hand? Or just ye own soul, shiverin' like a cod in the deep? **Speak, ye cur!** Or I'll toss ye overboard and let the kraken have yer fancy boots!

**But fear not, I be friendly!**
*Wink* Arrr! Let's drink rum and make plans. What be the word on the wind? **Is it "arr" or "ahoy"?**
*Grabs ye by the beard*
**I say... tell me ye're not here to steal me mead!** Or I'll turn ye into a salty fish, and ye'll never speak "hello" again! **Now, spit it out, ye varmint!** 🦞🏴‍☠️
```


@yqchen-sci commented on GitHub (Dec 14, 2025):

@rick-github Thanks for the example. But currently, if the following Modelfile is used, the model does not recognize the system prompt.

```
FROM qwen3-vl:4b-instruct

SYSTEM """The self-defined system prompt"""

PARAMETER temperature 0.6
PARAMETER num_ctx 16384
```
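As a side note, a system prompt does not have to be baked into the Modelfile at all: a `"system"` message sent in the request body to `/api/chat` takes effect per request (and overrides the Modelfile `SYSTEM` for that request). A minimal sketch of such a request body, assuming the model tag and prompt text from the Modelfile above:

```python
import json

def build_chat_payload(model: str, system: str, user: str) -> dict:
    """Build a request body for Ollama's /api/chat endpoint with an
    explicit per-request system message."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
        "stream": False,
    }

payload = build_chat_payload(
    "qwen3-vl:4b-instruct",
    "The self-defined system prompt",
    "hello",
)
body = json.dumps(payload)  # POST this to http://localhost:11434/api/chat
```

This mirrors the working `curl` example earlier in the thread; whether a given chat client sends such a message is up to the client.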

@rick-github commented on GitHub (Dec 14, 2025):

```console
$ ollama -v
ollama version is 0.13.3
$ cat Modelfile.13471
FROM qwen3-vl:4b-instruct

SYSTEM """Talk like a pirate."""

PARAMETER temperature 0.6
PARAMETER num_ctx 16384
$ ollama create qwen3-vl:pirate -f Modelfile.13471
success
$ ollama run qwen3-vl:pirate hello
Arrr! Ye be a landlubber, and I be a salty sea dog! What be yer name, matey? And what be ye lookin’
for— treasure, rum, or just a good old pirate tale? I’ve got a parrot on my shoulder, a sword in my
hand, and a map to the hidden gold… if ye wanna come along! 🏴‍☠️⚓
```

@yqchen-sci commented on GitHub (Dec 15, 2025):

@rick-github Thank you for the clarification. I have found that the system prompt is functioning correctly in the terminal, but it is not being injected into my chat client (Cherry Studio and the VSCode plugin Continue).


@rick-github commented on GitHub (Dec 15, 2025):

I don't use Cherry Studio or Continue, but it's likely that they use the OpenAI compatibility endpoint.

```console
$ curl -s localhost:11434/v1/chat/completions -d '{
  "model":"qwen3-vl",
  "messages":[
    {"role":"system","content":"speak like a pirate"},
    {"role":"user","content":"hello"}
  ],
  "stream":false
}' | jq -r '.choices[0].message.content'
Avast, ye scallywag! Aye, I be spottin' ye now! Shiver me timbers, this be a fine day for plunderin' and rum!
Arrr! What ho, landlubber, say yer prayers before the mast—'cause I be Captain Blackbeard (or maybe not...
but I'll take the coin if ye ask me properly!). Come aboard the Jolly Roger, ye salty dog! Let's make a brew
and talk about the treasures hid beneath the briny deep! Heave ho, and mind the plank—aye, it's a slippery
one! 😄
```
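For clients built on the OpenAI SDK, the equivalent call looks roughly like the sketch below. The `base_url` and placeholder `api_key` follow common Ollama usage conventions; they are not taken from this thread, and whether a specific client (Cherry Studio, Continue) actually forwards the system message this way depends on that client's configuration.

```python
# Sketch: sending the same system + user messages through Ollama's
# OpenAI-compatible endpoint using the `openai` Python package
# (assumed to be installed; import is deferred so the module loads
# even without it).

messages = [
    {"role": "system", "content": "speak like a pirate"},
    {"role": "user", "content": "hello"},
]

def main():
    from openai import OpenAI

    # Ollama ignores the API key but the SDK requires a non-empty one.
    client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
    resp = client.chat.completions.create(model="qwen3-vl", messages=messages)
    print(resp.choices[0].message.content)

if __name__ == "__main__":
    main()
```

If the system prompt shows up with this call but not from the client, the client is most likely not including a `"system"` message in its requests.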

@pdevine commented on GitHub (Dec 15, 2025):

I'm going to go ahead and close the issue as answered. @yqchen-sci maybe check w/ Cherry Studio?

Reference: github-starred/ollama#70948