[GH-ISSUE #10714] Community Made VS Code Extension #7040

Open
opened 2026-04-12 18:56:55 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @HansUXdev on GitHub (May 15, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10714

Ollama is great, but this community really needs to collaborate on a VS Code extension that makes coding with local models as easy as installing an extension: a sidebar to download models, command + k for editing code diffs, command + i for standard chat. No middleware like ngrok. Just an LM Studio-like UI built into the sidebar that connects directly to offline, local LLMs.

We don't need new code editors like Cursor, Windsurf, etc. that charge us an arm and a leg to collect our data and our code.
We just need a damn good extension that makes local-LLM-powered coding simple, easy to use, and keeps our data and code secure.

I wonder if anyone else has thought about this enough to want to work on something like this.

GiteaMirror added the feature request label 2026-04-12 18:56:55 -05:00
Author
Owner

@Igorgro commented on GitHub (May 15, 2025):

What about continue.dev? https://github.com/continuedev/continue

Author
Owner

@wywerne commented on GitHub (May 28, 2025):

In Copilot: https://code.visualstudio.com/docs/copilot/language-models#_bring-your-own-language-model-key
Author
Owner

@Igorgro commented on GitHub (May 28, 2025):

@wywerne the Copilot extension

  1. Is not open source (at least for now)
  2. Still requires logging in through GitHub even for local usage, see https://github.com/microsoft/vscode/issues/246551
Author
Owner

@Philaaadata commented on GitHub (May 29, 2025):

@Igorgro:
https://code.visualstudio.com/blogs/2025/05/19/openSourceAIEditor : "We will open source the code in the GitHub Copilot Chat extension (https://marketplace.visualstudio.com/items?itemName=GitHub.copilot-chat) under the MIT license"

You're logged into GitHub, right?

Author
Owner

@Igorgro commented on GitHub (May 29, 2025):

Not in VS Code. I'm working with code under NDA and it is forbidden to use cloud AI services with it. That's the default use case for a local LLM, but Copilot requires a GitHub login even when using local models. That's why I use continue.dev, which works completely offline when connecting to Ollama.

Author
Owner

@Philaaadata commented on GitHub (May 29, 2025):

Ok, I understand. I also use Continue and have my own internal Git repository, for security and sovereignty reasons. It's fluent in French with qwen2.5-coder.

In `.continue/config.yaml`, I have:

```yaml
name: Local Assistant
version: 1.0.0
schema: v1
models:
  - name: Mistral (Ollama)
    provider: ollama
    apiBase: http://localhost:11434
    model: mistral:7b
    temperature: 0.2
    contextLength: 4096
    roles:
      - chat
      - edit
      - apply
      - autocomplete
  - name: qwen2.5-coder (Ollama)
    provider: ollama
    apiBase: http://localhost:11434
    model: qwen2.5-coder:1.5b
    roles:
      - chat
      - edit
      - apply
      - autocomplete
defaultModel: qwen2.5-coder (Ollama)
context:
  - provider: code
  - provider: docs
  - provider: diff
  - provider: terminal
  - provider: problems
  - provider: folder
  - provider: codebase
```
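
For anyone wiring up a similar config: Ollama exposes a small HTTP API on the same `apiBase`, and its `/api/tags` endpoint lists the locally pulled models, which is a quick way to confirm that the model names in the YAML above actually exist on the machine. A minimal sketch in Python (the sample payload below is illustrative, not real output):

```python
import json
from urllib.request import urlopen


def parse_tags(payload: str):
    """Pure helper: extract model names from an /api/tags JSON payload."""
    data = json.loads(payload)
    return [m["name"] for m in data.get("models", [])]


def list_local_models(api_base: str = "http://localhost:11434"):
    """Query a running Ollama server's /api/tags endpoint for installed models."""
    with urlopen(f"{api_base}/api/tags") as resp:
        return parse_tags(resp.read().decode())


# Illustrative payload in the shape /api/tags returns:
sample = '{"models": [{"name": "qwen2.5-coder:1.5b"}, {"name": "mistral:7b"}]}'
print(parse_tags(sample))  # ['qwen2.5-coder:1.5b', 'mistral:7b']
```

`list_local_models` assumes the Ollama server from `apiBase` is running; if a model referenced in the config is missing from the list, `ollama pull <model>` fetches it.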
Author
Owner

@hyber97 commented on GitHub (Nov 11, 2025):

Need offline


Reference: github-starred/ollama#7040