[GH-ISSUE #7518] Support for # of completions? (for loom obsidian plugin) #30541

Closed
opened 2026-04-22 10:15:07 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @cognitivetech on GitHub (Nov 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7518

I'm trying to adapt the loom Obsidian plugin to use ollama.

It now seems to work fine, except I only ever get one completion, even though `settings.n` is the number of completions I would like to generate.

https://github.com/cosmicoptima/loom/blob/master/main.ts

```typescript
  async completeOpenAICompat(prompt: string) {
    prompt = this.trimOpenAIPrompt(prompt);

    // @ts-expect-error TODO
    let url = getPreset(this.settings).url;

    // Normalize the preset URL and point it at the OpenAI-compatible endpoint.
    if (!(url.startsWith("http://") || url.startsWith("https://")))
      url = "https://" + url;
    if (!url.endsWith("/")) url += "/";
    url = url.replace(/v1\//, "");
    url += "v1/completions";
    let body: any = {
      prompt,
      model: getPreset(this.settings).model,
      max_tokens: this.settings.maxTokens,
      n: this.settings.n,
      temperature: this.settings.temperature,
      top_p: this.settings.topP,
      best_of:
        this.settings.bestOf > this.settings.n
          ? this.settings.bestOf
          : this.settings.n,
    };
```
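Since ollama's OpenAI-compatible `/v1/completions` endpoint returns a single choice regardless of `n` (which is what this feature request is about), a possible client-side workaround is to issue `n` parallel single-completion requests and merge the responses. This is a hedged sketch, not part of the plugin; the `mergeCompletions` and `completeN` helpers and the minimal response types are hypothetical names introduced here for illustration:

```typescript
// Minimal shapes mirroring the OpenAI completions response.
interface Choice {
  text: string;
  index: number;
}
interface CompletionResponse {
  choices: Choice[];
}

// Merge several single-choice responses into one multi-choice response,
// re-indexing the choices so the result looks like a native `n > 1` reply.
export function mergeCompletions(
  responses: CompletionResponse[]
): CompletionResponse {
  const choices = responses
    .flatMap((r) => r.choices)
    .map((c, i) => ({ ...c, index: i }));
  return { choices };
}

// Sketch: fire `n` requests with the same body (forcing n: 1 on each),
// then merge. Assumes `url` and `body` are built as in the snippet above.
export async function completeN(
  url: string,
  body: any,
  n: number
): Promise<CompletionResponse> {
  const requests = Array.from({ length: n }, () =>
    fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ ...body, n: 1 }),
    }).then((res) => res.json() as Promise<CompletionResponse>)
  );
  return mergeCompletions(await Promise.all(requests));
}
```

Each request pays the full prompt-processing cost separately, so this is slower than a server-side `n`, but it keeps the plugin's response-handling code unchanged.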
GiteaMirror added the feature request label 2026-04-22 10:15:07 -05:00