[GH-ISSUE #1496] Add Phi-2 model #47320

Closed
opened 2026-04-28 03:35:13 -05:00 by GiteaMirror · 19 comments

Originally created by @hunnble on GitHub (Dec 13, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1496

Originally assigned to: @jmorganca on GitHub.

The Phi-2 model performs well. Should we consider adding it to Ollama?

@easp commented on GitHub (Dec 13, 2023):

It appears to use the same model architecture as Phi-1.5. That [model architecture is not supported by Llama.cpp](https://github.com/ggerganov/llama.cpp/issues/3146).

In addition, the model weights are not licensed for redistribution, at least not currently. You have to log in to Azure with an account that has a credit card on file.

@izard commented on GitHub (Dec 14, 2023):

The model weights can be downloaded by anyone from the official MSFT page: https://huggingface.co/microsoft/phi-2/tree/main. The license is quite permissive.

@easp commented on GitHub (Dec 14, 2023):

They've loosened things up in the last 10 hours. The fact remains, though, that support for the model architecture in Llama.cpp hasn't progressed much (at least not visibly) since September.

@HardKothari commented on GitHub (Dec 15, 2023):

Not sure if this can help, but there is a model card with a GGUF-format file, in case it can be used. When I tried creating a Modelfile locally, the server returned an internal error.

https://huggingface.co/radames/phi-2-quantized
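
For anyone trying the same route, a minimal sketch of importing a local GGUF into Ollama; the file name `phi-2.Q4_K_M.gguf` is a placeholder, not taken from the repo above:

```shell
# Sketch: register a locally downloaded GGUF with Ollama.
# "phi-2.Q4_K_M.gguf" is a hypothetical file name; substitute the actual download.
cat > Modelfile <<'EOF'
FROM ./phi-2.Q4_K_M.gguf
EOF

# Then, on an Ollama build whose bundled llama.cpp supports the phi2 architecture:
#   ollama create phi2-local -f Modelfile
#   ollama run phi2-local
```

An internal server error like the one above is what you would expect if the bundled llama.cpp does not yet recognize the model architecture.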

@jingfelix commented on GitHub (Dec 16, 2023):

Seems like Phi-2 will be supported by llama.cpp in the foreseeable future 👀
https://github.com/ggerganov/llama.cpp/pull/4490

@vaclcer commented on GitHub (Dec 18, 2023):

It is now. Fingers crossed for Ollama :)

@Jbollenbacher commented on GitHub (Dec 19, 2023):

> Seems like Phi-2 will be supported by llama.cpp in the foreseeable future 👀
> https://github.com/ggerganov/llama.cpp/pull/4490

This just merged. llama.cpp now supports Phi-2.

@brianjking commented on GitHub (Dec 19, 2023):

Now that this has been merged, what is required to get this working in Ollama?

@HardKothari commented on GitHub (Dec 19, 2023):

I believe Ollama has already added support; it is available in pre-release v0.1.17.
Check these out:

https://github.com/jmorganca/ollama/releases/tag/v0.1.17

https://ollama.ai/jmorgan/phi/tags

@ibnbd commented on GitHub (Dec 20, 2023):

https://ollama.ai/library/phi

@oliverbob commented on GitHub (Dec 23, 2023):

Yes, it's lightning fast. It's already supported by Ollama.

@snajjar commented on GitHub (Dec 23, 2023):

Not working on my machine :(
(Lenovo ThinkPad P14s, Linux 6.5.7 Arch)

```
$ ollama run phi
⠙   Error: llama runner: failed to load model '/usr/share/ollama/.ollama/models/blobs/sha256:bd608f9545597ea3278b78038943059d1c29c62f3ca02c86523014f3a8c7a7f1': this model may be incompatible with your version of Ollama. If you previously pulled this model, try updating it by running `ollama pull phi:latest`
```

I tried different tags for the phi model, but none of them work. Do I need to install ollama from git?

@bgokden commented on GitHub (Dec 23, 2023):

> Not working on my machine :(
> (Lenovo ThinkPad P14s, Linux 6.5.7 Arch)
>
> ```
> $ ollama run phi
> ⠙   Error: llama runner: failed to load model '/usr/share/ollama/.ollama/models/blobs/sha256:bd608f9545597ea3278b78038943059d1c29c62f3ca02c86523014f3a8c7a7f1': this model may be incompatible with your version of Ollama. If you previously pulled this model, try updating it by running `ollama pull phi:latest`
> ```
>
> I tried different tags for the phi model, but none of them work. Do I need to install ollama from git?

Which version of Ollama are you using? It worked for me with v0.1.17.
I installed Ollama from git by cloning that tag.

@snajjar commented on GitHub (Dec 23, 2023):

Indeed. Even though the package version stated 0.1.17, `ollama --version` gave 0.1.9...

Re-installed from git; it works in 0.1.17.
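
For anyone hitting the same thing, a quick sanity-check sketch; parsing the CLI's version from `ollama --version` with `awk` is an assumption about its output format:

```shell
# Sketch: detect a stale binary by comparing the expected release with what the CLI reports.
expected="0.1.17"
# On a real system this would come from the CLI, e.g.:
#   actual="$(ollama --version | awk '{print $NF}')"
actual="0.1.9"   # hard-coded here to mirror the mismatch above
if [ "$expected" != "$actual" ]; then
  echo "stale ollama binary: expected $expected, got $actual"
fi
```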

@oliverbob commented on GitHub (Dec 23, 2023):

> Not working on my machine :(
> (Lenovo ThinkPad P14s, Linux 6.5.7 Arch)
>
> ```
> $ ollama run phi
> ⠙   Error: llama runner: failed to load model '/usr/share/ollama/.ollama/models/blobs/sha256:bd608f9545597ea3278b78038943059d1c29c62f3ca02c86523014f3a8c7a7f1': this model may be incompatible with your version of Ollama. If you previously pulled this model, try updating it by running `ollama pull phi:latest`
> ```
>
> I tried different tags for the phi model, but none of them work. Do I need to install ollama from git?

You clearly need an update if that's the case. Just run the installer via `curl https://ollama.ai/install.sh | sh` to update it once more.

@oliverbob commented on GitHub (Dec 23, 2023):

> Indeed. Even though the package version stated 0.1.17, `ollama --version` gave 0.1.9...
>
> Re-installed from git; it works in 0.1.17.

You could be running two versions of Ollama: one via Docker and one via the install script.

Run `sudo lsof -i :11434` to find out, or `docker ps`. If it's on Docker, you need to rebuild the Docker image, but only after you have done a fresh `git pull` to update the local repo. Either remove the Docker Ollama or stop it before running `curl https://ollama.ai/install.sh | sh`. Then redownload Phi-2.

@trickster commented on GitHub (Dec 24, 2023):

Is there any plan to add dolphin phi-2 as well? https://huggingface.co/cognitivecomputations/dolphin-2_6-phi-2/tree/main

@jmorganca commented on GitHub (Dec 24, 2023):

@trickster yes! It's here https://ollama.ai/library/dolphin-phi 😊

@jmorganca commented on GitHub (Dec 24, 2023):

Closing this as Phi has been added!
