[GH-ISSUE #2765] "ValueError: could not convert string to float: 'or start new chat, or edit the System Prompt, like " | Migration failed: 010_migrate_modelfiles_to_models #28537

Closed
opened 2026-04-25 03:09:03 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @ThatCoffeeGuy on GitHub (Jun 3, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2765

Bug Report

Description

Bug Summary:
After updating to a newer version, open-webui stopped working when a migration failed. Restarting the container does not help at all; it just keeps spamming the error and crashing.

Steps to Reproduce:
Download certain models / profiles it doesn't like and let an update trigger the migration.

Expected Behavior:
The migration should complete and the app should start.

Actual Behavior:
The migration fails and, instead of skipping the offending entry, crashes the app entirely.

Environment

  • Open WebUI Version: 0.2.2

  • Ollama (if applicable): [e.g., 0.1.30, 0.1.32-rc1]

  • Operating System: Ubuntu 22.04

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Docker Container Logs:

Loading WEBUI_SECRET_KEY from file, not provided as an environment variable.
Loading WEBUI_SECRET_KEY from .webui_secret_key
Migration failed: 010_migrate_modelfiles_to_models
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/peewee_migrate/router.py", line 199, in run_one
    migrate(migrator, self.database, fake=fake)
  File "<string>", line 43, in migrate
  File "<string>", line 69, in migrate_modelfile_to_model
  File "/app/backend/utils/misc.py", line 162, in parse_ollama_modelfile
    value = float(value)
            ^^^^^^^^^^^^
ValueError: could not convert string to float: 'or start new chat, or edit the System Prompt, like `You are a dreamer, creating a beautiful advanced dreamy fantasy sentence ...`'
/app
Traceback (most recent call last):
  File "/usr/local/bin/uvicorn", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 410, in main
    run(
  File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 578, in run
    server.run()
  File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 61, in run
    return asyncio.run(self.serve(sockets=sockets))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 68, in serve
    config.load()
  File "/usr/local/lib/python3.11/site-packages/uvicorn/config.py", line 473, in load
    self.loaded_app = import_from_string(self.app)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/importer.py", line 21, in import_from_string
    module = importlib.import_module(module_str)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/app/backend/main.py", line 23, in <module>
    from apps.ollama.main import app as ollama_app, get_all_models as get_ollama_models
  File "/app/backend/apps/ollama/main.py", line 34, in <module>
    from apps.webui.models.models import Models
  File "/app/backend/apps/webui/models/models.py", line 11, in <module>
    from apps.webui.internal.db import DB, JSONField
  File "/app/backend/apps/webui/internal/db.py", line 38, in <module>
    router.run()
  File "/usr/local/lib/python3.11/site-packages/peewee_migrate/router.py", line 229, in run
    done.append(self.run_one(mname, migrator, fake=fake, force=fake))
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/peewee_migrate/router.py", line 199, in run_one
    migrate(migrator, self.database, fake=fake)
  File "<string>", line 43, in migrate
  File "<string>", line 69, in migrate_modelfile_to_model
  File "/app/backend/utils/misc.py", line 162, in parse_ollama_modelfile
    value = float(value)
            ^^^^^^^^^^^^
ValueError: could not convert string to float: 'or start new chat, or edit the System Prompt, like `You are a dreamer, creating a beautiful advanced dreamy fantasy sentence ...`'
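
The traceback pinpoints the crash: `parse_ollama_modelfile` calls `float(value)` on whatever text follows a parameter keyword, so a modelfile containing free-form prose where a numeric value is expected aborts the whole migration. A minimal sketch of the failure mode (the function name `parse_parameter` is illustrative, not the actual open-webui code):

```python
# Illustrative sketch of the crash: converting a numeric PARAMETER value
# with float() raises ValueError when the value is actually free text.
def parse_parameter(name: str, value: str) -> float:
    return float(value)  # raises ValueError on non-numeric text

msg = ""
try:
    parse_parameter("temperature", "or start new chat, or edit the System Prompt")
except ValueError as exc:
    msg = str(exc)

print("Migration would crash here:", msg)
```

Because the migration runs at import time (`db.py` calls `router.run()` on startup), the unhandled exception takes down the entire uvicorn process, which explains the restart loop.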

Installation Method

docker pull ghcr.io/open-webui/open-webui:main
docker stop open-webui
docker rm open-webui
docker run -d -p 3000:8080 --add-host=127.0.0.1:host-gateway --network=host -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main


Author
Owner

@tjbck commented on GitHub (Jun 3, 2024):

I'll modify the code to skip the modelfiles that cause errors so you won't have issues with the migration, but basically you had an invalid modelfile.

<!-- gh-comment-id:2145916137 -->
Author
Owner

@tjbck commented on GitHub (Jun 3, 2024):

Should be fixed, let me know if the issue persists!

<!-- gh-comment-id:2146025559 -->
Author
Owner

@ThatCoffeeGuy commented on GitHub (Jun 7, 2024):

Thank you, I pulled the latest image and it seems to have fixed the issue; I am able to run the app and access my history. Thanks!

<!-- gh-comment-id:2155037179 -->

Reference: github-starred/open-webui#28537