[GH-ISSUE #13503] Request to Publish Public FP16 Model on Ollama Cloud – PrettyBird BCE Basic 14B Coder #8903

Open
opened 2026-04-12 21:42:21 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @Ahmet-Dev on GitHub (Dec 16, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13503

Subject: Request to Publish Public FP16 Model on Ollama Cloud – PrettyBird BCE Basic 14B Coder

Hello Ollama Team,

We would like to request the publication and cloud hosting of our public FP16 model on Ollama Cloud.

Model Information

  • Model name: prometech_corp/prettybird_bce_basic_15b_coder
  • Base model: Qwen2.5-Coder-14B / Qwen2.5-Coder-14B-Instruct
  • Precision: FP16 (full precision, non-quantized)
  • Context length: 32K
  • License: Special
  • Primary use case: Code generation and coding assistance

References

  • Ollama model page (current): https://ollama.com/prometech_corp/prettybird_bce_basic_15b_coder
  • Model details & benchmarks (GitHub): https://github.com/Ahmet-Dev/bce
  • Hugging Face repository (FP16 + model metadata): https://huggingface.co/pthcorp/prettybird_bce_basic_coder_15b

Request

Our goal is to:

  • Publish this model as publicly accessible on Ollama Cloud
  • Ensure it is available explicitly as an FP16 variant (not quantized)
  • Allow users to pull and run it directly via standard Ollama commands and APIs
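Once hosted, the "standard Ollama commands" mentioned above would presumably look like the following. This is an illustrative sketch only: the model tag is taken from the Model Information section of this request, and its availability on Ollama Cloud is not yet confirmed.

```shell
# Pull the FP16 variant from the Ollama registry (tag as requested above;
# not yet published, so this is hypothetical)
ollama pull prometech_corp/prettybird_bce_basic_15b_coder

# Run it interactively for coding assistance
ollama run prometech_corp/prettybird_bce_basic_15b_coder \
  "Write a function that reverses a string."
```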

Please let us know:

  • If any additional packaging, Modelfile adjustments, or validation steps are required
  • The recommended process for uploading or syncing the FP16 weights
  • Any constraints or best practices for public FP16 models on Ollama Cloud
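As a rough sketch of the packaging step being asked about, a minimal Modelfile plus `ollama create`/`ollama push` might look like this. The weights filename and local path are placeholders, and the `num_ctx` value reflects the 32K context length stated above; none of this is a confirmed process from the Ollama team.

```shell
# Hypothetical packaging sketch -- filenames and tag are placeholders.
cat > Modelfile <<'EOF'
# Point at the local FP16 weights (path is illustrative)
FROM ./prettybird_bce_basic_15b_coder-fp16.gguf

# Advertise the 32K context window described in this request
PARAMETER num_ctx 32768
EOF

# Build the model locally, then push it to the registry namespace
ollama create prometech_corp/prettybird_bce_basic_15b_coder -f Modelfile
ollama push prometech_corp/prettybird_bce_basic_15b_coder
```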

We are happy to provide any further technical details or make adjustments as needed.

Thank you for your support.

Best regards,
Ahmet Kahraman
CEO Prometech Computer Sciences AŞ
[ahmetkahraman@prometech.net.tr]


Reference: github-starred/ollama#8903