[GH-ISSUE #15741] Kimi models require pro? #72094

Open
opened 2026-05-05 03:27:12 -05:00 by GiteaMirror · 34 comments

Originally created by @Azecko on GitHub (Apr 22, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15741

What is the issue?

I've always used kimi-k2.5:cloud with Ollama, mainly through the Hermes Agent.

Today, when I tried to message my Hermes, I got the error HTTP 403: Error code: 403 - {'error': 'this model requires a subscription, upgrade for access: https://ollama.com/upgrade'}.

I tried prompting Kimi with plain ollama run: same thing.
Using the new Kimi CLI from v0.21.1 (I also tried updating Ollama): same thing.
With kimi-k2.6:cloud, I get the same error.
But I do not get this error with other cloud models, like gemma.

I'm correctly signed in when I run ollama signin.

My Cloud usage for this week is at 37%, and 0% for the session at the time of writing.

Am I the only one getting this error? Is there something I missed, and do Kimi models now require Ollama Pro?

Relevant log output

Error: 403 Forbidden: this model requires a subscription, upgrade for access: https://ollama.com/upgrade

OS

Linux

GPU

No response

CPU

Intel

Ollama version

0.21.1

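For anyone who wants to reproduce this outside the CLI, here is a minimal sketch using the ollama Python package against a local daemon you are already signed in to via ollama signin (the :cloud tags are proxied to ollama.com). The model tag is the one from the report above; everything else here is an assumption, not an official repro.

```python
# Minimal sketch of the failure reported above. Assumes the `ollama`
# Python package (pip install ollama) and a local daemon that is already
# signed in via `ollama signin`; :cloud tags are proxied to ollama.com.
from ollama import Client, ResponseError

client = Client()  # default host: http://localhost:11434

try:
    resp = client.chat(
        model='kimi-k2.5:cloud',
        messages=[{'role': 'user', 'content': 'ping'}],
    )
    print(resp.message.content)
except ResponseError as err:
    # The failure mode in this thread: status_code == 403 with the
    # "this model requires a subscription" message.
    print(f'HTTP {err.status_code}: {err.error}')
```
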
GiteaMirror added the bug label 2026-05-05 03:27:12 -05:00

@sruizcarmona commented on GitHub (Apr 22, 2026):

Same here, no updates from ollama anywhere...

@AngelSanchezB commented on GitHub (Apr 22, 2026):

Same here since last night...

@ACheshirov commented on GitHub (Apr 22, 2026):

They disabled almost all of the big models for free users.

@LeoLP1 commented on GitHub (Apr 22, 2026):

@ACheshirov Is that something you think, or did you read it somewhere?

@ACheshirov commented on GitHub (Apr 22, 2026):

@LeoLP1 I tried a lot of their large cloud models - all of them are disabled for free users.

People on their Discord server say it's not confirmed that the restrictions are permanent...
However, considering the number of people with paid subscriptions who have complained about interruptions and very slow responses from the models, this free-tier restriction was expected...

More likely, they will think of another way to give free users access that is more difficult to abuse.

@asahu commented on GitHub (Apr 23, 2026):

Same here :(

@Azecko commented on GitHub (Apr 23, 2026):

I'm glad to see that I'm not the only one encountering this issue.
I hope we'll get some communication from the Ollama team about the situation ASAP, because this is really weird.

@25kgozon commented on GitHub (Apr 23, 2026):

Free models got paywalled because some users were apparently abusing them too much. I'm still having the same 403 issue too.

@MCOoost commented on GitHub (Apr 23, 2026):

Same issue here. Both kimi-k2.5:cloud and kimi-k2.6:cloud were working perfectly fine for me yesterday (April 22), and today they both return:

Error: this model requires a subscription, upgrade for access

I'm still well within my free cloud usage limits, and other cloud models (e.g., Gemma) continue to work without any issues on the same account.

I completely understand that business models evolve, but the lack of transparency here is disappointing. A silent, unannounced paywall on models that were freely accessible 24 hours ago creates a frustrating experience for users who have come to rely on them. If a subscription is now required, that is fair, but some advance notice or a clear changelog would have been greatly appreciated.

Could the team please clarify:

  • Was this an intentional change for Kimi models specifically?
  • Is this a temporary issue or a permanent requirement going forward?

Thank you for the otherwise excellent work on Ollama.

@siathalysedI commented on GitHub (Apr 23, 2026):

Same issue here with kimi-k2.6:cloud; I got:
Error code: 403 - {'error': 'this model requires a subscription, upgrade for access: https://ollama.com/upgrade (ref: 9d6c4cd4-b628-4882-b652-f71f638e117c)'}

I can't understand why this happened with no notice and so little information about it.

@dnky-1 commented on GitHub (Apr 24, 2026):

Same here, trying to use the glm-5.1 model.

@FALLEN-01 commented on GitHub (Apr 25, 2026):

I am facing the same issues

@mahiarirani commented on GitHub (Apr 25, 2026):

Same issue trying to use kimi-k2.6:cloud.

@freerider7777 commented on GitHub (Apr 26, 2026):

"There's no such thing as a free lunch."

@gry321 commented on GitHub (May 1, 2026):

I am facing the same issues

@Azecko commented on GitHub (May 2, 2026):

Now same thing with qwen2.5. This is getting annoying.

@hiteshseth commented on GitHub (May 2, 2026):

Does anyone have a list of models that actually work now? The Ollama website lists all of them, which of course is not true.

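No official list has surfaced in this thread, but you can build one for your own account by probing each tag and recording which return 403. A rough sketch under the same assumptions as the earlier snippet (ollama Python package, signed-in local daemon); the tags are simply the ones mentioned in this thread, and each successful call consumes a little quota.

```python
# Probe which cloud tags this account can still reach. The tag list is
# just the models mentioned in this thread; edit it to taste. Successful
# calls consume a small amount of cloud quota.
from ollama import Client, ResponseError

TAGS = [
    'kimi-k2.5:cloud',
    'kimi-k2.6:cloud',
    'glm-5.1:cloud',
    'minimax-m2.7:cloud',
    'deepseek-v4-flash:cloud',
]

client = Client()

for tag in TAGS:
    try:
        client.chat(model=tag, messages=[{'role': 'user', 'content': 'hi'}])
        status = 'ok'
    except ResponseError as err:
        status = f'HTTP {err.status_code}: {err.error}'
    print(f'{tag:26} {status}')
```
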
@AccidentalJedi commented on GitHub (May 2, 2026):

OS: Windows 11
MAX Plan (upgraded yesterday)

Latest Ollama update:

CONSTANT 503 service unavailable errors

I haven't decided how much longer I can afford to wait for something reliable. The service hit a pinnacle and then became completely useless. I do mean that: COMPLETELY useless. How long do we continue to pay for services we aren't getting? Are there going to be compensations, or is it just going to be "our loss"?

@zakoche commented on GitHub (May 2, 2026):

Same problem, but with all models.

@okankayci commented on GitHub (May 3, 2026):

Forbidden: this model requires a subscription, upgrade for access:

I can't use it at all.

@vinnytherobot commented on GitHub (May 3, 2026):

Same error here. I've already tried GLM-5.1, Minimax-2.7, and kimi-2.5, and they all return it.

@hiteshseth commented on GitHub (May 3, 2026):

Even gemma is not working!

@ACheshirov commented on GitHub (May 3, 2026):

Folks keep calling this an "issue", but it's not really an issue. It's intentional. They've deliberately cut off all the major models for free-tier accounts, and there's a pretty good chance they're never coming back. :)
From here on out, Ollama doesn't have much left to offer compared to other similar tools. LM Studio is just miles ahead (more options, better performance, a wider model selection, and a cleaner UI). The only thing Ollama was better at was cloud models.

@romabysen commented on GitHub (May 4, 2026):

I would say the issue is the complete lack of communication. No announcement for anything. The pricing page says nothing about model restrictions on the free plan.

@somera commented on GitHub (May 4, 2026):

[...] The only thing Ollama was better at was cloud models.

And Ollama isn't even running the cloud models on Ollama itself. ;) So I've heard.

@bam93 commented on GitHub (May 4, 2026):

@Azecko Same here for minimax-m2.7:cloud. It worked like a charm for many weeks. I am properly signed in, and signing out and back in doesn't help. Any news? If this is a deliberate policy change, it should be officially announced. Since I'm hitting the same issue with minimax-m2.7:cloud rather than a Kimi model, this really appears to be broader than the title suggests; should the title of the bug be changed to make that clear (cc @Azecko)?

Error:

Error: 403 Forbidden: this model requires a subscription, upgrade for access: https://ollama.com/upgrade

Context:

  • My free cloud credits are not exhausted (weekly and session usage both well within limits)
  • I am correctly signed in (ollama signin)
  • Signing out and back in (ollama signout / ollama signin) does not resolve it
  • This started recently — the same model worked without issue last week

There is no official announcement from Ollama about restricting specific models to paid plans, so I believe this is a bug in how certain models validate credentials on the backend, not an intentional tier change.

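Until there is an official answer, one way to keep agents from hard-failing is to catch the 403 and fall back to a local model. A defensive sketch under the same assumptions as the earlier snippets; the gemma3 fallback tag is only an example, not a recommendation.

```python
# Defensive wrapper: try a cloud tag first, fall back to a local model
# on the 403 described in this thread. Model names are examples only.
from ollama import Client, ResponseError

client = Client()

def chat_with_fallback(messages, cloud='minimax-m2.7:cloud', local='gemma3'):
    try:
        return client.chat(model=cloud, messages=messages)
    except ResponseError as err:
        if err.status_code != 403:
            raise  # a different failure; surface it unchanged
        print(f'{cloud} is gated ({err.error}); falling back to {local}')
        return client.chat(model=local, messages=messages)

reply = chat_with_fallback([{'role': 'user', 'content': 'hello'}])
print(reply.message.content)
```
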
@Azecko commented on GitHub (May 4, 2026):

Updated the issue title, since we now see that the problem does not appear only on Kimi models.

@AccidentalJedi commented on GitHub (May 4, 2026):

I'm about to set a deadline: if these issues aren't solved, I'll be cancelling my Max plan and moving on. Deepseek V4 models are STILL unusable. Why bother listing them if you aren't set up to serve them? That's AFTER today's latest update as well.

@bam93 commented on GitHub (May 4, 2026):

@AccidentalJedi Yes, for me it's the other way round: I was very seriously considering getting a paid plan with Ollama, because I was quite satisfied with the free tier (just not enough of it, obviously, for my needs). But if they make this move, and totally unannounced, out of the blue on top of that, it's a show-stopper for me. And I thought I had found the rare provider that doesn't mess with its users. I hope they reply properly and officially to this issue here; then I can make an informed decision.

But what is weird: this issue is two weeks old, and for the last two weeks minimax 2.7 was working fine for me. Only when usage reset last night, no joy any more. Maybe they are adjusting which models to exclude while tuning their cost model? Either way, there could and should have been an announcement and a response by now. 😞

@ACheshirov commented on GitHub (May 4, 2026):

I already wrote this in Discord, but I’ll post it here as well - even though it looks like there are no admins around, or they just don’t care, since nobody responds anyway.

The main reason I’m holding off on upgrading to Pro is the complete lack of transparency from the team. That’s also why I prefer to just load tokens directly in z.ai for GLM 5.1. It honestly feels like they operate with a “we don’t owe you explanations” mindset.

Literally overnight, they cut off free-tier users from access to almost all cloud models, and didn’t even bother to post an announcement explaining how long this would last, why it was necessary, or anything like that.

My other issue is, again, tied to the same thing - lack of transparency. Most APIs work on a token-based system, where you know exactly how much you’re paying per million tokens. That way, everything is clear and there’s no room for surprises.

Right now, we have no idea how usage is actually being calculated. How am I supposed to trust that tomorrow they won't just decide the current usage limits don't work for them anymore and quietly reduce them? And judging by how things look, they're not exactly the kind of people to explain their decisions, so something like that could easily happen without anyone even knowing.

@quarthex commented on GitHub (May 4, 2026):

Updated the issue title, since we now see that the problem does not appear only on Kimi models.

I believe you can restate the title as "~~Big~~ Cloud models require pro?"

@DanielAraldi commented on GitHub (May 4, 2026):

A complete lack of respect and transparency towards users. Honestly, I'm disgusted.

@sdev138 commented on GitHub (May 4, 2026):

Having the same issue while testing GLM 5.1 and deepseek-v4-flash. I kept getting 403s telling me to upgrade, despite literally having 0% quota usage at the moment.

@nccongg commented on GitHub (May 5, 2026):

Nah, I can't use any cloud model.

Reference: github-starred/ollama#72094