The add_file_context function used a positional zip() to pair API
payload messages with DB-stored messages. After
process_messages_with_output() expands assistant messages containing
tool calls into multiple OpenAI-format messages (assistant + tool
results), the payload list becomes longer than the stored list. This
caused the zip to misalign, so subsequent user messages never received
their attached_files tags -- the model could see uploaded images via
vision but had no file URL to pass to edit_image.
Fix: filter both lists to user-role messages only before zipping.
User messages maintain the same order in both lists regardless of
assistant message expansion, restoring correct file context injection.
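The fix can be sketched as follows. Function and field names here (add_file_context, attached_files, the message dict shape) are illustrative assumptions, not the project's actual API; the point is that zipping user-role messages only keeps the two lists in sync even after assistant expansion:

```python
def add_file_context(payload_messages, stored_messages):
    """Attach stored file context to the corresponding payload user messages.

    Assistant messages with tool calls expand into extra payload entries,
    so a positional zip over the full lists drifts out of alignment.
    User messages keep the same relative order in both lists, so zipping
    only those pairs them correctly.
    """
    payload_users = [m for m in payload_messages if m.get("role") == "user"]
    stored_users = [m for m in stored_messages if m.get("role") == "user"]
    for payload_msg, stored_msg in zip(payload_users, stored_users):
        files = stored_msg.get("attached_files")
        if files:
            # Mutates the payload message in place (dicts are shared).
            payload_msg["content"] = (
                f"<attached_files>{files}</attached_files>\n"
                + payload_msg["content"]
            )
```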
Fixes #21878
Replace bare except clauses with except Exception to follow Python best
practices and avoid catching unexpected system exceptions like
KeyboardInterrupt and SystemExit.
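A minimal before/after illustration (parse_port is a made-up helper). A bare except catches everything that inherits from BaseException, including KeyboardInterrupt and SystemExit, so Ctrl+C or a clean shutdown can be silently swallowed; except Exception lets those propagate:

```python
def parse_port(value):
    """Return the port as an int, or None if the value is malformed."""
    try:
        return int(value)
    except Exception:  # was: bare `except:` -- also caught KeyboardInterrupt
        return None
```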
The non-streaming response handler was saving assistant messages without
their usage/token data. While the streaming handler correctly extracted
and saved usage information, the non-streaming path discarded it entirely.
This caused assistant messages from non-streaming completions to have
NULL usage in the chat_message table, making them invisible to the
analytics token aggregation queries and contributing to the '0 tokens'
display in Admin Panel Analytics.
Extract and normalize the usage data from the API response and include
it in the database upsert, matching the pattern already used by the
streaming handler.
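A sketch of the extraction step, assuming the OpenAI-style usage shape (prompt_tokens / completion_tokens / total_tokens); the helper name and the normalized dict layout are assumptions, mirroring what the streaming path would persist:

```python
def normalize_usage(response: dict):
    """Pull usage out of a non-streaming completion response.

    Returns a dict suitable for the chat_message upsert, or None when the
    provider sent no usage block (leaving the column NULL as before).
    """
    usage = response.get("usage")
    if not usage:
        return None
    return {
        "prompt_tokens": usage.get("prompt_tokens", 0),
        "completion_tokens": usage.get("completion_tokens", 0),
        "total_tokens": usage.get("total_tokens", 0),
    }
```

The normalized dict is then included in the same upsert that saves the assistant message content, so analytics aggregation sees both paths identically.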
Backend emits terminal events for write_file, replace_file_content,
and run_command. The frontend showFileNavDir subscriber uses startsWith
path matching so it refreshes only when the event is relevant:
- write_file/replace_file_content: refresh if path is in current view
- run_command: always refresh (uses root '/' which matches everything)
- Also adds a copy-to-clipboard button and a code preview full-height fix
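The matching rule above can be sketched as a small predicate (shown in Python for readability; the actual subscriber is frontend code, and should_refresh is a hypothetical name):

```python
def should_refresh(event_path: str, current_dir: str) -> bool:
    """Decide whether a terminal event warrants refreshing the file nav.

    write_file / replace_file_content events carry the changed file's
    path; run_command events use root "/", which is a prefix of every
    path and therefore always matches.
    """
    # Normalize with trailing slashes so "/work" does not falsely
    # prefix-match "/workspace".
    current = current_dir.rstrip("/") + "/"
    path = event_path.rstrip("/") + "/"
    # Refresh when the event touches something under the current view,
    # or sits above it (root "/" from run_command hits this branch).
    return path.startswith(current) or current.startswith(path)
```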