[GH-ISSUE #12652] Add InclusionAI's models #54910

Closed
opened 2026-04-29 07:57:14 -05:00 by GiteaMirror · 13 comments

Originally created by @alerikaisattera on GitHub (Oct 16, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12652

Add support for Ling-lite/Ling-lite-1.5/Ring-Lite/Ling-mini-2.0/Ring-mini-2.0

GiteaMirror added the model label 2026-04-29 07:57:14 -05:00

@rick-github commented on GitHub (Oct 16, 2025):

https://github.com/ggml-org/llama.cpp/pull/16063

<!-- gh-comment-id:3410341489 -->

@alerikaisattera commented on GitHub (Oct 16, 2025):

That's for 2.0. Ling 1/1.5 seem to be supported in llama.cpp but not in Ollama.

<!-- gh-comment-id:3410774368 -->

@maternion commented on GitHub (Oct 17, 2025):

> That's for 2.0. Ling 1/1.5 seem to be supported in llama.cpp but not ollama

Most probably Ollama won't add the older models, because the successors already exist. So it would make sense for them to add the bailingmoev2 architecture, but as you said, llama.cpp hasn't merged it yet: https://github.com/ggml-org/llama.cpp/pull/16063.

<!-- gh-comment-id:3414850932 -->

@wangsff commented on GitHub (Oct 21, 2025):

> ggml-org/llama.cpp#16063

Hi, I noticed that this PR has already been merged into llama.cpp's master branch. Could you please update the submodule in Ollama to the latest master of llama.cpp ASAP?

<!-- gh-comment-id:3425153917 -->

@alerikaisattera commented on GitHub (Oct 23, 2025):

> Ollama won't add older models most probably, because we already have the successors.

Ling 1/1.5 appear to be better than 2.0 in some aspects, presumably due to larger expert size. I suggest adding both.

<!-- gh-comment-id:3437523654 -->

@rick-github commented on GitHub (Oct 23, 2025):

```console
$ ollama run hf.co/mradermacher/Ling-lite-1.5-2507-GGUF:Q4_K_M
>>> /set system respond in english
Set system message.
>>> hello
Hello! How can I assist you today? 😊
```

<!-- gh-comment-id:3437921661 -->

@rick-github commented on GitHub (Oct 23, 2025):

With thinking control.

```dockerfile
FROM hf.co/mradermacher/Ling-lite-1.5-2507-GGUF:Q4_K_M

TEMPLATE """<role>SYSTEM</role>
{{- if .System }}
{{- .System }}
{{- end }}
detailed thinking {{ if and .IsThinkSet .Think }}on{{ else }}off{{ end }}
{{ range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1}}
{{- if eq .Role "user" }}<role>HUMAN</role>
{{- .Content }}
{{- else if eq .Role "assistant" }}<role>ASSISTANT</role>
{{- if and $last $.IsThinkSet $.Think .Thinking false -}}
<think>{{ .Thinking }}</think>
{{- end }}
{{- .Content }}
{{- else if eq .Role "tool" }}<role>OBSERVATION</role>
{{- .Content }}
{{- end }}
{{- if and $last (ne .Role "assistant") }}<role>ASSISTANT</role>
{{- if and $.IsThinkSet $.Think }}<think>{{ end }}
{{- end }}
{{- end -}}
"""
PARAMETER stop <role>
PARAMETER stop <role>HUMAN</role>
```
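The thinking toggle lives entirely in the `detailed thinking {{ if and .IsThinkSet .Think }}on{{ else }}off{{ end }}` header line. A minimal Go `text/template` sketch of just that line (the `ctx` struct here is an illustrative stand-in for Ollama's real template context, which has more fields):

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// ctx stands in for the two fields this header line reads.
type ctx struct {
	IsThinkSet bool // was a think option supplied at all?
	Think      bool // is thinking requested?
}

// render executes the header line of the Modelfile template.
func render(c ctx) string {
	t := template.Must(template.New("hdr").Parse(
		`detailed thinking {{ if and .IsThinkSet .Think }}on{{ else }}off{{ end }}`))
	var b bytes.Buffer
	if err := t.Execute(&b, c); err != nil {
		panic(err)
	}
	return b.String()
}

func main() {
	fmt.Println(render(ctx{IsThinkSet: true, Think: true}))  // detailed thinking on
	fmt.Println(render(ctx{IsThinkSet: true, Think: false})) // detailed thinking off
}
```

`/set nothink` in the REPL flips the think flag, which is why the same Modelfile serves both modes below.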
```console
$ ollama pull hf.co/mradermacher/Ling-lite-1.5-2507-GGUF:Q4_K_M
$ ollama create ling-lite-1.5:16b-2507-q4_K_M -f Modelfile
$ ollama run ling-lite-1.5:16b-2507-q4_K_M
>>> hello
Thinking...
Okay, the user said "hello". I need to respond appropriately. Let me think about how to start a conversation. Maybe ask how they're doing or something 
related. They might be expecting a friendly reply. Also, considering possible contexts, like if they want information, but since their message is just 
"hello", probably a casual chat. Should keep it neutral and inviting. Avoid being too formal or too casual. Check for any specific guidelines, but since 
there's none, standard response should work. Maybe ask "Hi there! How can I assist you today?" That covers both greeting and offering help. 
Alternatively, "Hello! What brings you here today?" But maybe that's assuming a purpose. If they just want a simple hello back, then "Hello! How are 
you?" is also good. Need to make sure the response is welcoming and open-ended. Probably best to use a common phrase like "Hello! How can I help you?" 
since it's helpful and sets a positive tone.

...done thinking.

Hello! 😊 How can I assist you today? If you have any questions or need information, feel free to ask!

>>> /set nothink
Set 'nothink' mode.
>>> hello
Hello! It's nice to meet you. Is there something I can help you with, or would you like to chat? 😊
```
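The two `PARAMETER stop` lines exist because the raw model will happily generate past its own turn and start a `<role>HUMAN</role>` turn itself; the runtime cuts the stream at the first stop sequence it sees. A rough Go sketch of that behavior (not Ollama's actual implementation):

```go
package main

import (
	"fmt"
	"strings"
)

// truncateAtStop returns text cut at the earliest occurrence of any
// stop sequence, mimicking what a PARAMETER stop rule does to output.
func truncateAtStop(text string, stops []string) string {
	cut := len(text)
	for _, s := range stops {
		if i := strings.Index(text, s); i >= 0 && i < cut {
			cut = i
		}
	}
	return text[:cut]
}

func main() {
	raw := "Hello! How can I assist you today?\n<role>HUMAN</role>\nwhat is 2+2"
	// Prints only the assistant reply, without the leaked next turn.
	fmt.Println(truncateAtStop(raw, []string{"<role>", "<role>HUMAN</role>"}))
}
```

In practice `<role>` alone already covers the second stop string, since it is a prefix of `<role>HUMAN</role>`.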
<!-- gh-comment-id:3438014875 -->

@rick-github commented on GitHub (Oct 28, 2025):

Vendor sync in #12791.

<!-- gh-comment-id:3455125402 -->

@rick-github commented on GitHub (Nov 7, 2025):

#12791 is merged and ollama will support Ling-mini-2.0/Ring-mini-2.0 from 0.12.11.

<!-- gh-comment-id:3502402201 -->

@alerikaisattera commented on GitHub (Nov 9, 2025):

> 12791 is merged and ollama will support Ling-mini-2.0/Ring-mini-2.0 from 0.12.11.

Will 1/1.5 be added as well?

<!-- gh-comment-id:3508518220 -->

@rick-github commented on GitHub (Nov 9, 2025):

Links to the HF models you want included?

<!-- gh-comment-id:3508525573 -->

@alerikaisattera commented on GitHub (Nov 10, 2025):

Ling 1/1.5-lite series

https://huggingface.co/collections/inclusionAI/ling

Ring-lite series

https://huggingface.co/collections/inclusionAI/ring

<!-- gh-comment-id:3512857913 -->

@rick-github commented on GitHub (Nov 10, 2025):

https://github.com/ollama/ollama/issues/12652#issuecomment-3438014875

```console
$ ollama run ling-lite-1.5:16b-2507-q4_K_M
>>> hello
Thinking...
Okay, the user just said "hello". That's a very simple greeting. I need to respond in a friendly and appropriate way. Let me think about possible responses. Common 
greetings usually involve acknowledging the hello and maybe asking how they're doing or starting a conversation. Since the user hasn't provided any context beyond 
"hello", keeping it neutral makes sense. Maybe something like "Hi there! How can I help you today?" That's polite, helpful, and opens the door for them to ask anything 
they need. Alternatively, a simple "Hello!" back might be too brief. They might expect a bit more engagement. Also, considering that this could be an AI interaction, 
offering assistance is a good approach. Need to make sure the response is warm but professional. Avoid being overly formal or casual. Check if there are any specific 
guidelines for such interactions. Usually, welcoming and open-ended responses work best. So, "Hello! How can I assist you today?" seems balanced.

...done thinking.

Hello! 😊 How can I assist you today? If you have any questions or need help with something, feel free to let me know!

>>> Send a message (/? for help)
```

```console
$ ollama run ring-lite:16b-2507-q4_K_M
>>> hello
Thinking...
We are starting a conversation. The user said "hello". This is a simple greeting, so we should respond politely and appropriately.
 Since the user hasn't asked a specific question, we can respond with a friendly greeting in return and perhaps offer some help or open the door for them to ask 
something.

Steps:
1. Acknowledge the greeting: Respond with "Hello!" or similar.
2. Be warm and inviting: Let the user know we are here to help.
3. Open-ended question: Since they just said hello, we might want to encourage them to ask something. Alternatively, we can wait for their next input.

However, note that in a conversational AI context, we don't want to be too pushy without knowing the user's intent. So a simple and friendly response is best.

Possible response: "Hello! How can I help you today?"

But since the user might just be testing or starting, we can also say: "Hi there! What can I do for you?"

Alternatively, if we want to be very basic: "Hello!" and then wait. But that might not be as engaging.

Considering the context of an AI assistant, it's better to be proactive but polite.

I think a good response is: "Hello! How can I assist you today?"

This invites the user to state their need.

Let me write the response accordingly.
...done thinking.

Hello! 👋 It's great to hear from you. How can I assist you today? Whether you have a question, need help with something, or just want to chat—I'm here for it! 😊

>>> Send a message (/? for help)
```
<!-- gh-comment-id:3513143365 -->

Reference: github-starred/ollama#54910