[PR #15089] middleware: improve ID entropy for OpenAI-compatible response IDs #46262

Open
opened 2026-04-25 01:45:08 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/15089
Author: @Muszic
Created: 3/27/2026
Status: 🔄 Open

Base: main ← Head: fix-openai-id-collisions


📝 Commits (1)

  • 19bdc89 middleware: improve ID entropy for OpenAI-compatible response IDs

📊 Changes

1 file changed (+4 additions, -4 deletions)


📝 middleware/openai.go (+4 -4)

📄 Description

What is the Bug?

Every OpenAI-compatible chat and completion request is assigned an ID that is supposed to be unique. However, the IDs are currently generated with rand.Intn(999) and rand.Intn(999999), which yields only 999 and 999,999 possible values respectively. On any moderately busy server, concurrent requests will routinely collide on the same ID.

Additionally, OpenAI uses high-entropy globally unique identifiers. Ollama currently produces very short IDs like chatcmpl-42 or resp_7.

The Fix

Replaced rand.Intn(999) and rand.Intn(999999) with rand.Int63() and formatted the output as a hex string (%x) in the ChatMiddleware, CompletionsMiddleware, and ResponsesMiddleware.

This vastly improves the entropy (2^63, or roughly 9.2 quintillion, possible values) and produces more authentic-looking IDs (e.g., chatcmpl-7fffffffffffffff) without requiring any new external dependencies or impacting performance.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-25 01:45:08 -05:00

Reference: github-starred/ollama#46262