[GH-ISSUE #11667] Denylist certain models in ollama config #69774

Closed
opened 2026-05-04 19:10:02 -05:00 by GiteaMirror · 2 comments

Originally created by @Markus92 on GitHub (Aug 5, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11667

I am affiliated with a HPC computing facility at the University of Texas Southwestern Medical Center. We are working on making Ollama available to our users so they can run LLMs on one of our GPU nodes for research purposes.

Texas State Bill 1893 and a recent proclamation from the governor prohibit any institution affiliated with the State of Texas from installing Deepseek on State-owned devices. This might include model weights, too (it is not clear, and I am not a lawyer). As we are a state-affiliated entity, we are covered by this rule. Therefore we'd like to deny- or blacklist deepseek from our local ollama deployments.

We don't want to block full network access for ollama, since we'd like to give users access so they can pull and deploy their own models. We just want to block, for example, `ollama pull deepseek-r1`, while still allowing all other commands.

Note: I do not want to get into a discussion about how big or small the security risk would be from only downloading and using model weights locally. The main reason we'd like this feature is compliance.

GiteaMirror added the feature request label 2026-05-04 19:10:02 -05:00

@rick-github commented on GitHub (Aug 5, 2025):

ollama doesn't have any black/white listing capability built in. Off the top of my head, there are a couple of ways to do this.

  1. Pre-create a manifest file for the blacklisted model and make the manifest file unmodifiable by the ollama server.
  2. Install a proxy that inspects the requests and rejects any that have `"model":"deepseek-r1"`.

The problem that you face is that a user can upload a blacklisted model to an external repo (e.g. HuggingFace) with a different name, and then pull the model to your server. Perhaps a safer solution would be option #2 above with a whitelist, for which users need to apply to have a model added.
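Option #2 can be sketched as a small filtering reverse proxy placed in front of the ollama server. This is a minimal, non-streaming illustration, not a production proxy: the upstream address, the denylist prefixes, and the port are assumptions, and it only inspects `POST /api/pull` (ollama's pull endpoint accepts the model name in a `model` field, with `name` as a legacy alias).

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request as urlrequest

OLLAMA_UPSTREAM = "http://127.0.0.1:11434"  # assumed default ollama address
DENYLIST = ("deepseek",)                    # assumed site policy: deny by prefix

def is_denied(model: str, denylist=DENYLIST) -> bool:
    """Check the bare model name (namespace and tag stripped) against denied prefixes."""
    base = model.rsplit("/", 1)[-1].split(":", 1)[0].lower()
    return any(base.startswith(prefix) for prefix in denylist)

class FilteringProxy(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        if self.path == "/api/pull":
            payload = json.loads(body or b"{}")
            model = payload.get("model") or payload.get("name") or ""
            if is_denied(model):
                # Reject the pull before it ever reaches ollama.
                self.send_response(403)
                self.end_headers()
                self.wfile.write(b'{"error":"model blocked by site policy"}')
                return
        # Forward everything else to the real ollama server.
        # (A real deployment would also stream responses and handle errors.)
        upstream = urlrequest.Request(OLLAMA_UPSTREAM + self.path, data=body, method="POST")
        with urlrequest.urlopen(upstream) as resp:
            self.send_response(resp.status)
            self.end_headers()
            self.wfile.write(resp.read())

# To run in front of ollama (hypothetical port):
#   HTTPServer(("127.0.0.1", 8080), FilteringProxy).serve_forever()
```

Turning `DENYLIST` into an allowlist (reject anything *not* matching) gives the whitelist variant suggested above.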


@Markus92 commented on GitHub (Aug 6, 2025):

I can see if I can talk to our proxy admins to block these requests, as all of our data is sent through a proxy already. I do like the idea of a pre-created manifest file, too.

If a user decides to upload/download the model under a different name, at that point they are actively trying to circumvent limits, and it's more of an issue for the compliance department than for us sysadmins.
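The pre-created manifest approach mentioned above can be sketched roughly like this. The manifest path layout (`~/.ollama/models/manifests/registry.ollama.ai/library/<model>/<tag>`) is ollama's default store but should be verified against your deployment, and note that a plain `chmod` alone does not stop a process running as the file's owner from re-chmodding or unlinking it; on Linux you would additionally need something like `chattr +i` as root.

```python
import pathlib
import stat

# Assumed default ollama model store; adjust if OLLAMA_MODELS is set.
MANIFEST_DIR = pathlib.Path.home() / ".ollama/models/manifests/registry.ollama.ai/library"

def block_model(name: str, tag: str = "latest", root: pathlib.Path = MANIFEST_DIR) -> pathlib.Path:
    """Pre-create an empty, read-only manifest so `ollama pull` cannot write a real one."""
    path = root / name / tag
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text("")  # deliberately invalid manifest: the model can never load
    path.chmod(stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)  # 0444
    # On Linux, additionally run (as root) to prevent deletion/replacement:
    #   chattr +i <path>
    return path
```

Usage would be one call per blocked name/tag, e.g. `block_model("deepseek-r1", "latest")`, repeated for each tag you want to cover.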
