[PR #1239] [MERGED] feat: add cloud LLM provider support for caption generation #1192

Closed
opened 2026-03-22 16:03:40 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/harvard-edge/cs249r_book/pull/1239
Author: @octo-patch
Created: 3/15/2026
Status: Merged
Merged: 3/21/2026
Merged by: @profvjreddi

Base: dev ← Head: feature/cloud-llm-providers


📝 Commits (2)

  • 97237e7 feat: add cloud LLM provider support for caption generation
  • 8117f39 feat: upgrade MiniMax default model to M2.7

📊 Changes

1 file changed (+186 additions, -17 deletions)


📝 book/tools/scripts/content/manage_captions.py (+186 -17)

📄 Description

Summary

Add cloud LLM provider support to the manage_captions.py tool, enabling caption generation using OpenAI, Groq, MiniMax, or any OpenAI-compatible API.

Changes

  • Add --provider and --api-key CLI flags for cloud LLM providers
  • Support OpenAI, Groq, and MiniMax with auto-detected API keys
  • Use MiniMax-M2.7 as the recommended MiniMax model (latest flagship with enhanced reasoning and coding)
  • MiniMax-M2.7-highspeed also available for low-latency scenarios
  • Keep Ollama as default provider for local usage

Testing

  • Syntax validated
  • CLI help text verified with updated model references

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-03-22 16:03:40 -05:00

Reference: github-starred/cs249r_book#1192