[GH-ISSUE #23505] issue: Import of ChatGPT export file is broken #20003

Open
opened 2026-04-20 02:35:04 -05:00 by GiteaMirror · 2 comments

Originally created by @fsx8 on GitHub (Apr 8, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/23505

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

v0.8.12

Ollama Version (if applicable)

No response

Operating System

Ubuntu 24.04

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

Chats from a ChatGPT export file (conversations.json) are correctly imported.

Actual Behavior

There is a green success toast notification saying "Imported 0 Chats", even though the uploaded file contains dozens of chats and was used as-is, straight from the ChatGPT export, without any further modification.

The logs show a `POST /api/v1/chats/import HTTP/1.1" 200`, which makes me assume that the conversion code is probably getting confused.

Steps to Reproduce

  1. Download a chat export zip file from ChatGPT
  2. Import the chats by going to the Data Controls settings in Open WebUI and selecting the unzipped conversations.json file
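As a quick sanity check outside Open WebUI, the export file can be counted directly. This is a minimal diagnostic sketch, assuming the standard ChatGPT export layout where conversations.json is a top-level JSON array of conversation objects; it only confirms the file itself contains chats when the UI reports "Imported 0 Chats":

```python
import json

def count_chats(path):
    """Count conversations in a ChatGPT conversations.json export.

    Assumes the file is a top-level JSON array of conversation objects,
    which is the layout ChatGPT exports currently use.
    """
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    if not isinstance(data, list):
        raise ValueError(
            f"expected a JSON array of conversations, got {type(data).__name__}"
        )
    return len(data)
```

If this reports the expected number of chats, the problem lies in the import/conversion step rather than in the export file.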

Logs & Screenshots

`POST /api/v1/chats/import HTTP/1.1" 200`

[Screenshot: https://github.com/user-attachments/assets/f4f9ff9a-00cd-411d-a271-9b87e9a2892e]

Additional Information

No response

GiteaMirror added the bug label 2026-04-20 02:35:04 -05:00

@swapnilshekade commented on GitHub (Apr 8, 2026):

Hi! I took a look at this issue and tried to understand how ChatGPT exports are being parsed.

It seems like chats that belong to folders/projects in the ChatGPT export may include additional metadata or slightly different structure, which might not be handled correctly during the import process. My suspicion is that these chats are either getting filtered out or failing during the mapping → message tree conversion and being skipped silently.

I’m thinking of approaching this by:

  • normalizing the ChatGPT export parsing to ignore non-essential metadata differences
  • ensuring all chats (including folder-based ones) go through the same conversion pipeline
  • adding logging to make skipped/failed imports more visible

Does this direction sound reasonable? If so, I’d be happy to work on a PR.
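The leaf-to-root walk described above can be sketched roughly like this. This is a minimal illustration, not Open WebUI's actual importer: the field names (`mapping`, `current_node`, `parent`, `message`, `content.parts`) are assumed from the ChatGPT export format, where each conversation stores its messages as a tree of nodes linked by parent pointers, and the skip logging is the visibility improvement proposed in the last bullet:

```python
def extract_messages(conversation):
    """Walk one ChatGPT export conversation from its leaf node
    (current_node) up the parent pointers, then reverse to get
    messages in chronological order.

    Nodes without message content (e.g. the synthetic root node)
    are skipped, but each skip is logged so that silently dropped
    chats become visible during import.
    """
    mapping = conversation.get("mapping", {})
    node_id = conversation.get("current_node")
    messages = []
    while node_id:
        node = mapping.get(node_id)
        if node is None:
            print(f"skipping unknown node id {node_id!r}")
            break
        msg = node.get("message")
        if msg and msg.get("content", {}).get("parts"):
            messages.append({
                "role": msg["author"]["role"],
                # keep only text parts; multimodal parts may be dicts
                "content": "".join(
                    p for p in msg["content"]["parts"] if isinstance(p, str)
                ),
            })
        else:
            print(f"skipping node {node_id!r} without message content")
        node_id = node.get("parent")
    messages.reverse()  # walked leaf -> root, so flip to chronological order
    return messages
```

A conversion written this way would return an empty list (rather than raise) for chats with unexpected metadata, which matches the "Imported 0 Chats" symptom; the logging makes it clear which chats were dropped and why.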


@nightt5879 commented on GitHub (Apr 13, 2026):

Hi, I’d like to work on this issue. I’ll reproduce it first and investigate the ChatGPT conversations.json import path. If no one is already working on it, I can open a PR.

Reference: github-starred/open-webui#20003