[PR #6559] [MERGED] Go server command line options support #22692

Closed
opened 2026-04-19 16:29:34 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/6559
Author: @jessegross
Created: 8/29/2024
Status: Merged
Merged: 9/3/2024
Merged by: @jessegross

Base: jmorganca/llama ← Head: jessegross/goserver-options


📝 Commits (3)

  • 6f50a86 runner.go: Support resource usage command line options
  • c5cd67e runner.go: Don't cast a Go handle to a C void *
  • d9f500d runner.go: Support GGUF LoRAs

📊 Changes

7 files changed (+97 additions, -737 deletions)

View changed files

📝 llama/common.cpp (+3 -15)
📝 llama/example/main.go (+8 -3)
📝 llama/llama.cpp (+0 -287)
📝 llama/llama.go (+53 -35)
📝 llama/llama.h (+0 -14)
➖ llama/patches/09-lora.diff (+0 -350)
📝 llama/runner/runner.go (+33 -33)

📄 Description

Adds command line options for controlling resource usage, such as mlock, mmap, and GPU allocation. In addition, switches LoRA support to the more recent GGUF format.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-19 16:29:34 -05:00

Reference: github-starred/ollama#22692