[GH-ISSUE #5384] dolphin-phi3 and dolphin-qwen2 #29125

Closed
opened 2026-04-22 07:47:21 -05:00 by GiteaMirror · 4 comments

Originally created by @olumolu on GitHub (Jun 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5384

dolphin-phi3 and dolphin-qwen2 from https://huggingface.co/cognitivecomputations
Can we have these models so people can use them like dolphin-llama3?

GiteaMirror added the model label 2026-04-22 07:47:21 -05:00

@rjmalagon commented on GitHub (Jun 29, 2024):

Hi @userforsource, I uploaded dolphin-qwen2 to https://ollama.com/rjmalagon. If you need a particular quantization of either model, I can help with individual gguf files or uploads to my online Ollama library while you wait for official uploads.


@olumolu commented on GitHub (Jun 29, 2024):

Thanks, but I use Alpaca, so this needs to be in the Ollama model list to use with that.


@rick-github commented on GitHub (Jun 29, 2024):

Alpaca runs a local ollama server, so you can communicate with it directly. On my machine it's localhost:11435 because I already have an ollama server at 11434. To download rjmalagon's dolphin-qwen2 model:

```
curl localhost:11435/api/pull -d '{"name": "rjmalagon/dolphin-2.9.2-qwen2-7b-f16"}'
```

Then click on the books icon ("Manage models") next to the model selector and the new model should appear in the list.

Note that [CognitiveComputations](https://huggingface.co/cognitivecomputations) started the dolphin work, and they [upload their models](https://ollama.com/search?q=cognitivecomputations&p=1) to ollama, so you can get their phi3 model from there.
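The `curl` pull above can also be scripted. A minimal Python sketch of the same request against ollama's `/api/pull` endpoint (the host/port and model name are taken from the comment above; adjust for your setup — this just builds the request, and the actual network call is left commented out):

```python
import json
import urllib.request

# Alpaca's bundled ollama server in the comment above listens on 11435;
# a default standalone ollama install uses 11434.
OLLAMA_HOST = "http://localhost:11435"

def build_pull_request(model_name, host=OLLAMA_HOST):
    """Build the POST request ollama's /api/pull endpoint expects:
    a JSON body of the form {"name": "<model>"}."""
    body = json.dumps({"name": model_name}).encode("utf-8")
    return urllib.request.Request(
        f"{host}/api/pull",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = build_pull_request("rjmalagon/dolphin-2.9.2-qwen2-7b-f16")
print(req.full_url)
print(req.data.decode())

# With a live server you would send it, e.g.:
#   with urllib.request.urlopen(req) as resp:
#       for line in resp:  # streams pull-progress JSON lines
#           print(line.decode().strip())
```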


@pdevine commented on GitHub (Jul 3, 2024):

@botollama as @rick-github mentioned, the official versions of those can be found in:

https://ollama.com/CognitiveComputations/dolphin-qwen2

and

https://ollama.com/CognitiveComputations/dolphin-phi-3

I'll go ahead and close the issue.


Reference: github-starred/ollama#29125