[GH-ISSUE #1698] Any good examples of running flask with ollama(Mixtral) #26719

Closed
opened 2026-04-22 03:11:10 -05:00 by GiteaMirror · 5 comments

Originally created by @andysingal on GitHub (Dec 24, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1698

Hi,
I was trying to run my Mixtral model but was not sure how to verify:

```
python app.py
 * Serving Flask app '__main__'
 * Debug mode: off
WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
 * Running on http://127.0.0.1:5000/
Press CTRL+C to quit
```

How can I verify it's working? I saw some code online:

```
curl --location 'http://127.0.0.1:5000/process_form' \
  --form 'query="What does the author discuss about NFL"'
```

@jeanjerome commented on GitHub (Dec 26, 2023):

Hi,

Your curl request should be:

```
curl --location 'http://127.0.0.1:5000/process_form' --form 'query="What does the author discuss about NFL"'
```

Have you tried the other implementations listed here: https://github.com/jmorganca/ollama?tab=readme-ov-file#web--desktop ?


@rgaidot commented on GitHub (Dec 28, 2023):

I don't understand, did you create a proxy for ollama with flask? Can you share your app.py?

If you want to try directly via Ollama:

```
curl http://127.0.0.1:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "What does the author discuss about NFL"
}'
```

```
curl http://127.0.0.1:11434/api/chat -d '{
  "model": "mistral",
  "messages": [
    { "role": "user", "content": "What does the author discuss about NFL" }
  ]
}'
```
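The same `/api/generate` call can be made from Python, which is closer to what a Flask app would do internally. This is a sketch, not an official client: the `generate` helper name is invented here, and it assumes Ollama's default port and a pulled `mistral` model, mirroring the curl examples above.

```python
import requests


def generate(prompt: str, model: str = "mistral",
             host: str = "http://127.0.0.1:11434") -> str:
    """Call Ollama's /api/generate endpoint.

    With stream=False the server returns a single JSON object whose
    'response' field holds the full completion.
    """
    resp = requests.post(
        f"{host}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


# Example (requires a running Ollama server with the model pulled):
# print(generate("What does the author discuss about NFL"))
```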

@technovangelist commented on GitHub (Jan 5, 2024):

Hi @andysingal are you still having an issue or did the answers from @jeanjerome and @rgaidot solve it for you?


@andysingal commented on GitHub (Jan 5, 2024):

Sorry for the late reply, the issue is resolved.



@technovangelist commented on GitHub (Jan 5, 2024):

Great. Thanks so much for that info.

Reference: github-starred/ollama#26719