[GH-ISSUE #2356] Phi modelfile is incorrect #27126

Closed
opened 2026-04-22 04:05:56 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @mak448a on GitHub (Feb 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2356

Originally assigned to: @bmizerany on GitHub.

When I use phi with Ollama and pass a system prompt, it doesn't respond as well as it does in LM Studio.
Is Ollama's internal prompt template correct?
LM Studio uses "Instruct:" and "Output:" as markers for the user's and the assistant's messages.

LM Studio: `{"speech": "Hi!", "program": "null"}`
Ollama: `Welcome to our chatbot program. How can I assist you today?`

Here's the code I used:

```python
import ollama

prompt = """You are Daniel.
Give a response as a JSON object with properties "speech" and "program". Both of these keys must always be filled. Do not reply with anything else other than a JSON object.
Example of JSON object: {"speech": "Hi!", "program": "null"}

Instruct: Hello!
Output: {"speech": "Hi!", "program": "null"}
Instruct: Can you open discord?
Output: {"speech": "Certainly!", "program": "discord"}
Instruct: Can you open firefox?
Output: {"speech": "Certainly! Here it is!", "program": "firefox"}
Instruct: Turn off the computer.
Output: {"speech": "Sure, I'll do that.", "program": "shutdown"}
Instruct: Goodnight.
Output: {"speech": "You too!", "program": "null"}"""

response = ollama.chat(
    model="phi",
    messages=[
        {"role": "system", "content": prompt},
        {"role": "user", "content": "Hello!"},
    ],
    stream=True,
)

for chunk in response:
    print(chunk["message"]["content"], end="", flush=True)
```
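
Since the goal is a strict JSON reply, checking the collected output can catch template drift early. A minimal sketch (pure stdlib; `parse_reply` and its input are illustrative, with the string standing in for the text accumulated from the stream above):

```python
import json

def parse_reply(reply: str) -> dict:
    """Parse the model's reply and verify the two required keys."""
    obj = json.loads(reply)  # raises ValueError if the model added prose
    missing = {"speech", "program"} - obj.keys()
    if missing:
        raise ValueError(f"reply is missing keys: {sorted(missing)}")
    return obj

# A well-formed reply passes; a chatty greeting raises instead.
ok = parse_reply('{"speech": "Hi!", "program": "null"}')
```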

Also, should I post this in ollama-python instead of the main ollama repo?

GiteaMirror added the needs more info label 2026-04-22 04:05:56 -05:00

@mak448a commented on GitHub (Feb 6, 2024):

Well, it looks like the internal modelfile uses a different prompt template. Instead of `Instruct:` and `Output:`, it uses `User:` and `Assistant:`. And for the system prompt, the modelfile uses `System:`, while LM Studio uses no marker at all.
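
For comparison, the two templates can be rendered side by side in plain Python (a sketch; the marker strings come from this thread, not from Ollama's source):

```python
def render_ollama(system: str, user: str) -> str:
    # Markers reported for the phi modelfile shipped with Ollama.
    return f"System: {system}\nUser: {user}\nAssistant:"

def render_lmstudio(system: str, user: str) -> str:
    # LM Studio: bare system text, then Instruct:/Output: markers.
    return f"{system}\nInstruct: {user}\nOutput:"

print(render_ollama("You are Daniel.", "Hello!"))
print(render_lmstudio("You are Daniel.", "Hello!"))
```

The few-shot examples in the script above use the LM Studio markers, so under Ollama's template they no longer match the surrounding scaffolding, which would explain the weaker responses.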


@mak448a commented on GitHub (Feb 6, 2024):

I fixed it slightly by creating a new modelfile. It still doesn't work as well.
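
A custom modelfile along these lines can swap in the `Instruct:`/`Output:` markers (a sketch; the model name `daniel-phi` is arbitrary, and the exact template phi expects may differ):

```
FROM phi
TEMPLATE """{{ .System }}
Instruct: {{ .Prompt }}
Output:"""
```

Built with `ollama create daniel-phi -f Modelfile`, then run with `ollama run daniel-phi`.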


@bmizerany commented on GitHub (Mar 12, 2024):

@mak448a Are you able to reproduce with the ollama CLI? If so, do you mind sharing the commands/steps you took?


@mak448a commented on GitHub (Mar 12, 2024):

I think I did, but it was a while ago. I'll have to try again when I have time.


@bmizerany commented on GitHub (Mar 12, 2024):

@mak448a Using the latest Ollama version (0.1.28) running the [latest phi model](https://ollama.com/library/phi:2.7b) and using your script (unmodified) I get:

```json
{
    "speech": "Hello!"
}
```

Is that more like what you're looking for?


@bmizerany commented on GitHub (Mar 12, 2024):

I'm going to close since we're unable to reproduce here. Please feel free to update / reopen if the issue persists.


@mak448a commented on GitHub (Mar 13, 2024):

Yes, it must have been fixed. Thanks @bmizerany


Reference: github-starred/ollama#27126