[GH-ISSUE #14135] VS Code 1.109.0 + GitHub Copilot Chat 0.37.1 + Ollama 0.15.5 with Local models not working #55733

Closed
opened 2026-04-29 09:39:39 -05:00 by GiteaMirror · 23 comments
Owner

Originally created by @Klaus-Michael on GitHub (Feb 7, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14135

Originally assigned to: @ParthSareen on GitHub.

What is the issue?

VS Code with the GitHub Copilot Chat extension is no longer able to talk to local models provided by Ollama.
Local models can no longer be selected in Agent Mode; they only appear when Ask or Edit mode is selected.
When trying to communicate with them via Chat, the following message is displayed:

Sorry, your request failed. Please try again.

Copilot Request id: e62e7b25-6417-4bc3-afa6-0675b1e594f1

Reason: 404 page not found: Error: 404 page not found at Q3._provideLanguageModelResponse (c:\Users\ScorpionTDL\.vscode\extensions\github.copilot-chat-0.37.1\dist\extension.js:1414:11576) at process.processTicksAndRejections (node:internal/process/task_queues:105:5)

Relevant log output

VS Code Version: 
Version: 1.109.0 (user setup)
Commit: bdd88df003631aaa0bcbe057cb0a940b80a476fa
Date: 2026-02-04T02:01:38.288Z
Electron: 39.3.0
ElectronBuildId: 13168319
Chromium: 142.0.7444.265
Node.js: 22.21.1
V8: 14.2.231.22-electron.0
OS: Windows_NT x64 10.0.26200

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.15.5

GiteaMirror added the bug label 2026-04-29 09:39:39 -05:00

@0APOCALYPSE0 commented on GitHub (Feb 7, 2026):

I am also facing the same issue.


@Mincher commented on GitHub (Feb 7, 2026):

Same issue here:

Version: 1.109.0
Commit: bdd88df003631aaa0bcbe057cb0a940b80a476fa
Date: 2026-02-04T02:01:38.288Z
Electron: 39.3.0
ElectronBuildId: 13168319
Chromium: 142.0.7444.265
Node.js: 22.21.1
V8: 14.2.231.22-electron.0
OS: Linux x64 6.17.0-12-generic snap (Ubuntu 25.10)


@Klaus-Michael commented on GitHub (Feb 7, 2026):

I discovered that there are several open issues about this in the vscode repo, so this may be a VS Code issue rather than an Ollama one.
a few examples:
https://github.com/microsoft/vscode/issues/293271
https://github.com/microsoft/vscode/issues/293370
https://github.com/microsoft/vscode/issues/293071


@kritonpc commented on GitHub (Feb 9, 2026):

Same for me. It's not an Ollama issue: I had an older version of Copilot Chat where it worked, and the error appeared only after I updated Copilot Chat.


@Binaryzero commented on GitHub (Feb 10, 2026):

The 404 error is caused by an incorrect call to the Ollama API: the extension requests /chat/completions instead of the correct /v1/chat/completions specified in the Ollama documentation (https://docs.ollama.com/api/openai-compatibility#simple-v1/chat/completions-example).
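
The path mismatch can be sketched directly; the base URL below is Ollama's default, and the model name in the commented curl call is only an example:

```shell
# Copilot Chat 0.37.x builds the chat URL without the /v1 prefix that
# Ollama's OpenAI-compatible API requires, hence the 404.
BASE="http://localhost:11434"

# Path the broken extension requests -- Ollama answers "404 page not found":
printf 'broken:  %s\n' "$BASE/chat/completions"

# Path Ollama actually serves for OpenAI-compatible chat:
printf 'correct: %s\n' "$BASE/v1/chat/completions"

# Against a running server, the correct path can be confirmed with e.g.:
#   curl "$BASE/v1/chat/completions" \
#     -d '{"model": "llama3.2", "messages": [{"role": "user", "content": "hi"}]}'
```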


@crcarlo commented on GitHub (Feb 10, 2026):

@Binaryzero is right. A temporary solution I found is to change Ollama's configured URL by going to

Manage Models... > Open Language Models (JSON) (the small document icon at the top right)

and changing the URL in chatLanguageModels.json by adding /v1 at the end (e.g. http://localhost:11434/v1).

This will break Ollama's model listing inside the Language Models settings, but it allows the pre-selected model to work.


@NgocDuy3112 commented on GitHub (Feb 11, 2026):

@crcarlo changing the URL in chatLanguageModels.json does not work in my case.


@Binaryzero commented on GitHub (Feb 11, 2026):

That’s because it appends /v1 to all requests, but /api/tags, which lists the models, doesn’t take the /v1 prefix.
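
In other words, the workaround's base URL fixes chat but breaks listing because Ollama splits its routes across two prefixes. A sketch of the resulting paths, using Ollama's default port:

```shell
# With the base URL changed to .../v1, every path Copilot appends gets the
# /v1 prefix -- which only the OpenAI-compatible routes accept.
BASE_WITH_V1="http://localhost:11434/v1"

# Chat now resolves to a route Ollama serves:
printf 'chat:   %s\n' "${BASE_WITH_V1}/chat/completions"

# ...but model listing becomes /v1/api/tags, which Ollama does not serve;
# the native listing endpoint lives at the server root as /api/tags:
printf 'models: %s\n' "${BASE_WITH_V1}/api/tags"
```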


@0APOCALYPSE0 commented on GitHub (Feb 11, 2026):

A temporary solution for this is:

  1. Install the OAI compatibility provider for the Copilot extension in VS Code.
  2. Open settings and search for the "OAI" keyword. You will see a URL option, which is set to a Hugging Face URL by default. Change it to http://localhost:11434/v1.
  3. Go to "Manage Models," then click on the "Add Provider" button on the top right. Select "OAI," provide a name, and a fake API key (e.g., "none"). You will then be able to see all the Ollama models under the OAI provider, and you can use them.

Note: Only models that support tool calling will work.
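
Step 2's URL can be sanity-checked from a terminal before touching VS Code. A sketch assuming a local Ollama on the default port; per Ollama's OpenAI-compatibility docs the API key is not validated, so any placeholder value works:

```shell
# URL the OAI compatibility provider should point at:
OAI_URL="http://localhost:11434/v1"
FAKE_KEY="none"   # placeholder only; Ollama ignores the key's value

printf 'provider URL: %s\n' "$OAI_URL"

# Against a running server, list the models the provider will see:
#   curl -s "$OAI_URL/models" -H "Authorization: Bearer $FAKE_KEY"
```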


@NgocDuy3112 commented on GitHub (Feb 11, 2026):

@0APOCALYPSE0 thanks a lot, the models work as I expected


@Klaus-Michael commented on GitHub (Feb 11, 2026):

@0APOCALYPSE0 Thanks, works for me as well.


@kingeke commented on GitHub (Feb 12, 2026):

Thanks @0APOCALYPSE0, worked for me too.


@guhuajun commented on GitHub (Feb 12, 2026):

Thanks @0APOCALYPSE0, it works for me.


@landracer commented on GitHub (Feb 12, 2026):

@0APOCALYPSE0 That's not an acceptable solution for me.

Same issue after the update.
(I see it's not OS dependent; the OP is on Windows, I'm not.)

Sorry, your request failed. Please try again.

Copilot Request id: 61224c80-6225-4640-95cf-da20dbccb12c

Reason: 404 page not found: Error: 404 page not found at Q3._provideLanguageModelResponse (~/.vscode/extensions/github.copilot-chat-0.37.0/dist/extension.js:1414:11576) at process.processTicksAndRejections (node:internal/process/task_queues:105:5)

The Continue extension (a Copilot clone) still works, so it's not my Ollama.


@JMarcSyd commented on GitHub (Feb 12, 2026):

It needs to be fixed on the Copilot side; some people have raised it already: https://github.com/microsoft/vscode/issues/293770


@ayush0477 commented on GitHub (Feb 22, 2026):

Will it get fixed?


@JMarcSyd commented on GitHub (Feb 22, 2026):

The fix has been merged (https://github.com/microsoft/vscode-copilot-chat/pull/3858); now we need to wait until it works its way into a release.


@BilalKhan-Code commented on GitHub (Feb 27, 2026):

Still not fixed!


@crcarlo commented on GitHub (Feb 27, 2026):

@BilalKhan-Code This will be resolved with the release of this February's milestone: https://github.com/microsoft/vscode-copilot-chat/milestone/34


@mwaseema commented on GitHub (Mar 1, 2026):

Seems like this wasn't part of the recent release. Any way to get the build with this fix?


@AdriiiRusso commented on GitHub (Mar 4, 2026):

@0APOCALYPSE0 Working perfectly, thanks!


@ParthSareen commented on GitHub (Apr 13, 2026):

Going to close this out. The latest VS Code version includes fixes that should take care of this :)


@adil-adysh commented on GitHub (Apr 28, 2026):

Still facing this issue on the latest VS Code:

Sorry, your request failed. Please try again.
Copilot Request id: d732d364-3084-4f19-a047-fd560f66aaf6
Reason: Server error: 500: Error: Server error: 500
at Cj._provideLanguageModelResponse (c:\Users\adilh\.vscode\extensions\github.copilot-chat-0.45.1\dist\extension.js:1688:12937)
at process.processTicksAndRejections (node:internal/process/task_queues:103:5)
Note: GitHub is currently experiencing a service disruption. This may be affecting Copilot. Check GitHub Status (https://www.githubstatus.com) for details.

Version: 1.117.0 (user setup)
Commit: 10c8e557c8b9f9ed0a87f61f1c9a44bde731c409
Date: 2026-04-21T16:12:14-07:00
Electron: 39.8.7
ElectronBuildId: 13841579
Chromium: 142.0.7444.265
Node.js: 22.22.1
V8: 14.2.231.22-electron.0
OS: Windows_NT x64 10.0.26200


Reference: github-starred/ollama#55733