[GH-ISSUE #1446] letsencrypt certificates installed but get error on https #773

Closed
opened 2026-04-12 10:27:20 -05:00 by GiteaMirror · 2 comments

Originally created by @itscvenk on GitHub (Dec 9, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1446

Hello

I saw https://github.com/jmorganca/ollama/pull/1310#issue-2015690206 which says

Place cert.pem and key.pem into ~/.ollama/ssl/ restart server. It will come up in SSL mode. Remove, rename or delete files to disable ssl mode.

I generated the Let's Encrypt certificates and copied them into /usr/share/ollama/.ollama (I had followed the manual instructions for installing Ollama). I ran chown ollama:ollama on both files I copied into that folder, later did systemctl daemon-reload as well as systemctl restart ollama, and rebooted my Ubuntu 20 VM for good measure.
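For reference, the layout the PR describes can be sanity-checked with a short stdlib script. This is only a sketch: `check_ssl_dir` is a hypothetical helper, and per the quoted PR the server looks in `~/.ollama/ssl/` (for the ollama service user, presumably /usr/share/ollama/.ollama/ssl/), not in .ollama itself.

```python
from pathlib import Path

def check_ssl_dir(ssl_dir: Path) -> list[str]:
    """Report problems with the cert.pem/key.pem pair the PR expects."""
    problems = []
    for name in ("cert.pem", "key.pem"):
        p = ssl_dir / name
        if not p.is_file():
            problems.append(f"missing: {name}")
        elif not p.read_bytes().lstrip().startswith(b"-----BEGIN"):
            problems.append(f"not PEM-encoded: {name}")
    return problems

# Per the PR, certs go in ~/.ollama/ssl/ of whichever user runs the server.
print(check_ssl_dir(Path("~/.ollama/ssl").expanduser()))
```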

And then, for a curl with an HTTP request, I get a response. All is well.

For an HTTPS request, I get:

```
curl https://mysubdomain.mydomain.com:11434/api/generate -d '{
  "model": "openchat",
  "stream": false,
  "prompt": "Hello"
}'
curl: (35) error:1408F10B:SSL routines:ssl3_get_record:wrong version number
```

and `journalctl -u ollama` shows no log entries for the request :-( — presumably because it never reached Ollama.

Usually this error occurs when there is a port conflict, or the port is not open, and so on. Is SSL configured on a different port than 11434?
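For what it's worth, that exact curl error is what a TLS client reports when the server answers in plaintext — i.e. the port is open but not speaking TLS at all. A minimal stdlib reproduction (a hypothetical local plaintext server, not Ollama itself):

```python
import socket, ssl, threading

def serve_plaintext_once(listener: socket.socket) -> None:
    # Accept one connection, read the client's TLS ClientHello bytes,
    # and answer with plain HTTP -- exactly what a non-TLS server does.
    conn, _ = listener.accept()
    conn.recv(4096)
    conn.sendall(b"HTTP/1.1 400 Bad Request\r\n\r\n")
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]
threading.Thread(target=serve_plaintext_once, args=(listener,), daemon=True).start()

# A TLS handshake against the plaintext port fails the same way curl did:
ctx = ssl.create_default_context()
err = None
try:
    with socket.create_connection(("127.0.0.1", port)) as raw:
        with ctx.wrap_socket(raw, server_hostname="localhost"):
            pass
except ssl.SSLError as exc:
    err = exc
print(err)  # e.g. [SSL: WRONG_VERSION_NUMBER] wrong version number
```

The plaintext "HTTP/1.1 ..." bytes are parsed as a TLS record header, whose version field is garbage — hence "wrong version number", with nothing ever reaching the application behind the port.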

How do I get SSL to work please?

Thanks

Edit:

Note:

I also removed the id_ed25519 file there, as well as the one with the .pub extension, and restarted the daemon and the service; then I see the following in the logs:

```
Dec 09 14:02:27 mysubdomain.mydomain.com ollama[2971]: Couldn't find '/usr/share/ollama/.ollama/id_ed25519'. Generating new private key.
Dec 09 14:02:27 mysubdomain.mydomain.com ollama[2971]: Your new public key is:
Dec 09 14:02:27 mysubdomain.mydomain.com ollama[2971]: ssh-ed25519 AA<<truncated>>Y
```

But the error on HTTPS remains.


@easp commented on GitHub (Dec 9, 2023):

Did you compile Ollama from source from that feature branch? That pull request hasn't been accepted yet, and once it is, it will still need to be released in a binary.

The keypair you messed with is for pushing models to your personal library on ollama.ai.


@itscvenk commented on GitHub (Dec 9, 2023):

> Did you compile ollama from source from that feature branch? That pull request hasn't been accepted yet and once it has been it still needs to be released in a binary.
>
> The keypair you messed with is for pushing models to your personal library on ollama.ai.

Ohhhh.. thanks @easp

I shall wait for the pull request to be accepted, or try building from that particular source; I hadn't noticed this.

I had done the manual install

Reference: github-starred/ollama#773