[GH-ISSUE #3502] post no longer works via fetch #2158

Closed
opened 2026-04-12 12:23:17 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @ralyodio on GitHub (Apr 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3502

What is the issue?

		const response = await fetch('https://ai.profullstack.com/api/generate', {
			method: 'POST',
			body: JSON.stringify({
				/* these don't categorize properly */
				// model: 'llama2-uncensored:latest',
				// model: 'codellama',
				// model: 'phi:latest',
				// model: 'dolphin-mixtral:latest',
				/* todo: try next */
				// model: 'mixtral:latest',
				// model: 'mistral',
				model: 'solar',
				prompt,
				stream: false,
				format: 'json'
			})
		});

		if (response.ok) {
			const data = await response.json();
			console.log(data.response);

			return JSON.parse(data.response);
		} else {
			console.error(await response.json());
		}
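One thing worth double-checking in a snippet like the above: `fetch` serializes a string body with a default `Content-Type` of `text/plain;charset=UTF-8` unless headers are set explicitly, and some proxies and servers reject or mishandle that. A minimal sketch of the same request with an explicit JSON content type (the `prompt` value here is a placeholder, not from the report):

```javascript
// Sketch only, not the reporter's code: same request options, but with an
// explicit JSON Content-Type header. fetch() defaults a string body to
// text/plain;charset=UTF-8, which can trip up proxies or strict servers.
const prompt = 'why is the sky blue?'; // placeholder prompt for illustration

const options = {
	method: 'POST',
	headers: { 'Content-Type': 'application/json' },
	body: JSON.stringify({
		model: 'solar',
		prompt,
		stream: false,
		format: 'json'
	})
};

// const response = await fetch('https://ai.profullstack.com/api/generate', options);
```

This doesn't change what the server does with `stream: false`, but it rules out a content-type mismatch as the cause of a rejected request.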

What did you expect to see?

it worked a month or two ago. i just updated
(i.e. the non-streaming JSON response shown above)

Steps to reproduce

	const response = await fetch('https://ai.profullstack.com/api/generate', {
		method: 'POST',
		body: JSON.stringify({
			/* these don't categorize properly */
			// model: 'llama2-uncensored:latest',
			// model: 'codellama',
			// model: 'phi:latest',
			// model: 'dolphin-mixtral:latest',
			/* todo: try next */
			// model: 'mixtral:latest',
			// model: 'mistral',
			model: 'solar',
			prompt,
			stream: false,
			format: 'json'
		})
	});

	if (response.ok) {
		const data = await response.json();
		console.log(data.response);

		return JSON.parse(data.response);
	} else {
		console.error(await response.json());
	}

Are there any recent changes that introduced the issue?

Yes, I upgraded to the latest ollama.

OS

Linux

Architecture

amd64

Platform

Docker

Ollama version

No response

GPU

Other

GPU info

no gpu

CPU

Intel

Other software

No response

GiteaMirror added the bug label 2026-04-12 12:23:17 -05:00

@ralyodio commented on GitHub (Apr 5, 2024):

I get a 405 Method Not Allowed. My nginx proxy looks like this:

    location / {
        # auth_basic "ai access required";
        # auth_basic_user_file /etc/nginx/.htpasswd;
        proxy_pass http://localhost:5432;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_cache_bypass $http_upgrade;
        # proxy_set_header Host $host;
        # proxy_set_header X-Real-IP $remote_addr;
        # proxy_set_header X-Forwarded-Proto https;
        # proxy_set_header X-Forwarded-For $remote_addr;
    }

Something broke after a recent upgrade of ollama.
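For a 405 that originates at the proxy rather than at ollama itself, forwarding the `Host` header (commented out above) is worth trying; also note that `5432` is PostgreSQL's default port, while ollama listens on `11434` unless `OLLAMA_HOST` overrides it, so the upstream port deserves a double-check. A sketch, assuming ollama's default port:

```nginx
# Sketch only -- verify the upstream port: 5432 is PostgreSQL's default,
# while ollama listens on 11434 unless OLLAMA_HOST says otherwise.
location / {
    proxy_pass http://localhost:11434;
    proxy_http_version 1.1;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```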


@eusebiu commented on GitHub (Apr 5, 2024):

I think the stream: false flag is ignored. I get the same behaviour.

On f4b31c2d53 (16 Mar 2024) the issue does not happen (~0.1.29).
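If `stream: false` is being ignored, a client-side workaround is to read the newline-delimited JSON the server streams and join the per-chunk `response` fields. A minimal sketch; the helper name is illustrative, not part of any API:

```javascript
// Workaround sketch: when the server streams NDJSON regardless of the
// stream flag, join each chunk's "response" field into one string.
// Chunks without a "response" field (e.g. the final done chunk) are skipped.
function joinNdjsonResponses(ndjsonText) {
	return ndjsonText
		.split('\n')
		.filter((line) => line.trim().length > 0)
		.map((line) => JSON.parse(line))
		.map((chunk) => chunk.response ?? '')
		.join('');
}
```

For example, `joinNdjsonResponses('{"response":"Hel"}\n{"response":"lo"}\n{"done":true}')` yields `'Hello'`.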


@ralyodio commented on GitHub (Apr 6, 2024):

Weird. I need it; I wonder why it was removed.


@eusebiu commented on GitHub (Apr 6, 2024):

Not removed, but ignored.
I think this c863c6a96d breaks it.

I tested before and after that commit: after it, stream: false is ignored; before it, it works fine.


@ralyodio commented on GitHub (Apr 9, 2024):

Is anyone looking into this? I upgraded to the latest and I still get a 405.


@ralyodio commented on GitHub (Apr 9, 2024):

https://github.com/ollama/ollama/blob/main/docs/api.md#request-2


@eusebiu commented on GitHub (Apr 13, 2024):

I think it should work now. Please get the latest.


@pdevine commented on GitHub (May 15, 2024):

It should be working fine:

% curl -X POST http://localhost:11434/api/generate -d '{"model": "llama3", "stream": false, "prompt": "hi there"}'
{"model":"llama3","created_at":"2024-05-15T00:08:31.042327Z","response":"Hi there! It's nice to meet you. Is there something I can help you with or would you like to chat?","done":true,"done_reason":"stop","context":[128006,882,128007,198,198,6151,1070,128009,128006,78191,128007,198,198,13347,1070,0,1102,6,82,6555,311,3449,499,13,2209,1070,2555,358,649,1520,499,449,477,1053,499,1093,311,6369,30,128009],"total_duration":1571077041,"load_duration":1075794458,"prompt_eval_count":12,"prompt_eval_duration":80494000,"eval_count":26,"eval_duration":414090000}

I'll go ahead and close the issue.
