[GH-ISSUE #8272] Ollama models give low inference with Continue extension on VS Code Community Edition. #5290

Closed
opened 2026-04-12 16:28:18 -05:00 by GiteaMirror · 1 comment

Originally created by @ENUMERA8OR on GitHub (Dec 31, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/8272

Please tell me how to troubleshoot this issue. I want to increase the model inference speed in VS Code. Any suggestions would be helpful.
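A hedged first step is to measure raw generation speed against the Ollama server directly, outside VS Code, to see whether Ollama itself or the Continue integration is the bottleneck. The sketch below assumes a local server on the default port (11434) and uses `llama3.2` as a placeholder model name; substitute whichever model you actually run. The `eval_count` and `eval_duration` fields are part of the non-streamed `/api/generate` response, with durations reported in nanoseconds.

```python
# Minimal sketch: measure Ollama generation speed via its HTTP API.
# Assumes a local server on the default port and that the model
# named below is already pulled; substitute your own model.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",  # placeholder model name
        "prompt": "Write a haiku about code review.",
        "stream": False,
    },
    timeout=300,
)
resp.raise_for_status()
data = resp.json()

# eval_count is the number of generated tokens; eval_duration is in ns.
tokens = data["eval_count"]
seconds = data["eval_duration"] / 1e9
print(f"{tokens} tokens in {seconds:.1f}s -> {tokens / seconds:.1f} tok/s")
```

If this number is already low, the problem is on the Ollama side (model size, quantization, hardware) rather than in the extension; if it is high but Continue still feels slow, the extension configuration is the place to look.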


@rick-github commented on GitHub (Dec 31, 2024):

If you give a clear description of the problem it will be easier to provide suggestions. Do you want to increase the speed of inference? The quality? The volume? Which model do you want to use for inference? What is not working in the current setup?
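One concrete way to gather the kind of detail asked for here is to check whether the loaded model is actually resident in GPU memory; a model partially offloaded to CPU RAM is a common cause of slow inference. A sketch, assuming the same local server as above; the `/api/ps` endpoint reports each loaded model's total size and how much of it sits in VRAM.

```python
# Sketch: check how much of each loaded model is in GPU memory.
# A size_vram well below size means part of the model spilled to
# CPU RAM, which typically slows generation noticeably.
import requests

resp = requests.get("http://localhost:11434/api/ps", timeout=10)
resp.raise_for_status()
for m in resp.json().get("models", []):
    total = m["size"]
    vram = m.get("size_vram", 0)
    pct = 100 * vram / total if total else 0
    print(f"{m['name']}: {pct:.0f}% in VRAM ({vram} / {total} bytes)")
```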


Reference: github-starred/ollama#5290