[GH-ISSUE #970] problem on last release #46985

Closed
opened 2026-04-28 02:28:56 -05:00 by GiteaMirror · 7 comments

Originally created by @francescoagati on GitHub (Nov 2, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/970

Hello,
I have noticed a big change with the last release: on a simple summarization task, many models go haywire and either generate random words or enter an infinite loop.
I have rolled back to an older version of Ollama.


@jmorganca commented on GitHub (Nov 2, 2023):

Hi @francescoagati, I'm sorry you hit this – may I ask:

  • Are you on macOS or Linux (or WSL)?
  • Which model did you run?
  • What are your hardware specs?

Thanks – we'll get this fixed


@francescoagati commented on GitHub (Nov 2, 2023):

macOS. All models based on Mistral (dolphin, orca, zephyr, but also nous-hermes:13b). M2 Max with 32 GB, and the latest version of Ollama. I use LangChain to summarize text.


@jmorganca commented on GitHub (Nov 2, 2023):

@francescoagati ok thank you – which langchain integration are you using? Ollama or ChatOllama?


@francescoagati commented on GitHub (Nov 2, 2023):

This is an example of the code:

```python
from langchain.chains.summarize import load_summarize_chain
from langchain.prompts import PromptTemplate


def summarize(llm, docs):
    map_custom_prompt = '''
    Summarize the following text in a clear and concise way:
    TEXT:`{text}`
    Brief Summary:
    '''

    map_prompt_template = PromptTemplate(
        input_variables=['text'],
        template=map_custom_prompt
    )

    combine_custom_prompt = '''
    Generate a summary of the following text that includes the following elements:

    * A title that accurately reflects the content of the text.
    * An introduction paragraph that provides an overview of the topic.
    * Bullet points that list the key points of the text.
    * A conclusion paragraph that summarizes the main points of the text.

    Text:`{text}`
    '''

    combine_prompt_template = PromptTemplate(
        template=combine_custom_prompt,
        input_variables=['text']
    )

    chain = load_summarize_chain(
        llm,
        chain_type='map_reduce',
        map_prompt=map_prompt_template,
        combine_prompt=combine_prompt_template,
        verbose=False
    )
    return chain.run(docs)
```
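For context, a minimal sketch of what the `map_reduce` chain type does under the hood: each document is summarized with the map prompt, then the partial summaries are joined and passed through the combine prompt. The `fake_llm` and `map_reduce_summarize` helpers here are hypothetical stand-ins, not LangChain or Ollama APIs.

```python
MAP_PROMPT = "Summarize the following text in a clear and concise way:\nTEXT:`{text}`\nBrief Summary:"
COMBINE_PROMPT = "Generate a summary of the following text:\nText:`{text}`"


def fake_llm(prompt: str) -> str:
    # Crude stand-in for the real Ollama-backed LLM:
    # echo back the text between the backticks in the prompt.
    return prompt.split("`")[1]


def map_reduce_summarize(llm, docs):
    partial = [llm(MAP_PROMPT.format(text=d)) for d in docs]  # map step
    joined = "\n".join(partial)                               # collect
    return llm(COMBINE_PROMPT.format(text=joined))            # reduce step


result = map_reduce_summarize(fake_llm, ["first chunk", "second chunk"])
print(result)
```

If a model misbehaves on the individual map calls (as reported above), the garbled partial summaries get fed into the combine step, which is why the final output degrades so badly.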


@francescoagati commented on GitHub (Nov 2, 2023):

Actually, I am using 0.0.19, because this script isn't usable with 0.17.


@jmorganca commented on GitHub (Nov 4, 2023):

Hi @francescoagati, would it be possible to upgrade to the latest version, 0.1.8, and try there? This issue should be fixed. If it's not, feel free to re-open!


@francescoagati commented on GitHub (Nov 4, 2023):

Yes, it works now.
Thanks! I'm closing this issue.


Reference: github-starred/ollama#46985