[GH-ISSUE #10666] Support for T5 architecture? #7011

Open
opened 2026-04-12 18:54:06 -05:00 by GiteaMirror · 3 comments

Originally created by @gitced on GitHub (May 11, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10666

Hello,

Newbie here, apologies in advance if my post is out of place.

I converted this model:
https://huggingface.co/Babelscape/t5-base-summarization-claim-extractor
into a GGUF:
-rw-r--r-- 1 ubuntu ubuntu 446938368 May 11 15:56 T5-base-summarization-claim-extractor_f16.gguf
Then I created an Ollama model with this Modelfile:

FROM ./T5-base-summarization-claim-extractor_f16.gguf
TEMPLATE """{{ .Prompt }}"""
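For anyone following along, the conversion step above is typically done with llama.cpp's `convert_hf_to_gguf.py` script. A hedged sketch of the full workflow (the script and flags come from the llama.cpp repository; the model name given to `ollama create` is illustrative):

```shell
# Convert the Hugging Face checkpoint to GGUF at f16 precision.
# convert_hf_to_gguf.py ships with the llama.cpp repository.
python convert_hf_to_gguf.py ./t5-base-summarization-claim-extractor \
    --outtype f16 \
    --outfile T5-base-summarization-claim-extractor_f16.gguf

# Register the resulting GGUF with Ollama using the Modelfile above.
ollama create t5-claim-extractor -f Modelfile
```

Note that a successful conversion and `ollama create` only mean the GGUF container is well-formed; whether Ollama's runner can actually execute the architecture inside it is a separate question.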

Everything seems OK: I can list and launch the model in Ollama.
But I get an error when sending a prompt:

ollama run T5-base-summarization-claim-extractor_f16:latest
>>> "Apples are fruits"
Error: POST predict: Post "http://127.0.0.1:40885/completion": EOF

I read in another post that Ollama doesn't support the T5 architecture yet.
Is that why I'm seeing this error?
If yes, is there any plan to support T5 in the future?

Thanks
Cedric

GiteaMirror added the model label 2026-04-12 18:54:06 -05:00

@gitced commented on GitHub (May 11, 2025):

Extra piece of info:
I downloaded a binary release of llama.cpp and tested my GGUF file with the following command:
./llama-cli -m ~/T5-base-summarization-claim-extractor_f16.gguf -p "Apples are fruits" -n 50
It works well, so the GGUF file itself is fine and working.
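That llama-cli run is the strongest evidence, but the container can also be sanity-checked directly. Below is a minimal sketch that parses the fixed GGUF header (magic `GGUF`, then a little-endian uint32 version, uint64 tensor count, and uint64 metadata KV count, per the GGUF spec in the llama.cpp repository). It only validates the container format, not whether Ollama supports the model architecture stored inside:

```python
import struct

def read_gguf_header(data: bytes):
    """Parse the fixed-size GGUF header: magic, version, tensor count, metadata KV count."""
    if data[:4] != b"GGUF":
        raise ValueError("not a GGUF file")
    version, = struct.unpack_from("<I", data, 4)    # format version (uint32 LE)
    n_tensors, = struct.unpack_from("<Q", data, 8)  # number of tensors (uint64 LE)
    n_kv, = struct.unpack_from("<Q", data, 16)      # number of metadata key/value pairs
    return version, n_tensors, n_kv

# Synthetic header for illustration: version 3, 2 tensors, 5 metadata keys.
hdr = b"GGUF" + struct.pack("<IQQ", 3, 2, 5)
print(read_gguf_header(hdr))  # (3, 2, 5)
```

On a real file you would pass the first 24 bytes, e.g. `read_gguf_header(open(path, "rb").read(24))`. The metadata KV pairs that follow the header include the `general.architecture` key, which is what a runtime checks to decide whether it supports the model.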


@Harry-maximum commented on GitHub (May 14, 2025):

Hi, I am also working on this. Is it possible to convert the safetensors file to a GGUF file so that it works? Or will it still fail because the T5-small architecture is not supported by Ollama?


@gitced commented on GitHub (May 14, 2025):

Not sure I get your question, but yes, you can use a GGUF without Ollama.
Ollama is built on top of llama.cpp, so you can use a GGUF directly with llama.cpp (with the command I showed in my previous post), but you will lose all the comfort of Ollama.

Reference: github-starred/ollama#7011