Compare commits


60 Commits

Author SHA1 Message Date
github-actions[bot]
b9dae6ba3d [AI] Bundle SQL migrations into the worker chunk; probe views at startup (#7710)
Fixes the FatalError users hit after upgrading to 26.5.0 (issue #7710),
where `Error: no such column: _.custom_upcoming_length` repeated on
every page. Same fragility caused #7759 (FatalError when the migration
list couldn't be fetched at all from a different network).

Root cause: the service worker precaches `data-file-index.txt` and the
staged migration `.sql` files separately from the AQL view definitions
(which live inside the kcab worker JS chunk). On upgrade, a stale
precache could serve an old `data-file-index.txt` alongside the new JS
chunk whose views referenced the not-yet-applied column.

Fix:
- Inline all SQL migrations into the same JS chunk as the AQL schema
  via `import.meta.glob({ query: '?raw', eager: true })`, so the two
  cannot desync across a service-worker cache boundary.
- Synthesize JS migration names from `Object.keys(javascriptMigrations)`
  so we don't need a second glob.
- Remove the build-time copy of `migrations/` into `public/data/` and
  drop the dead `mkdir('/migrations')` at runtime; keep a defensive
  filter that ignores any `migrations/…` entries from stale precaches.
- Add `probeViews()` to `updateViews`: after view recreation, run
  `SELECT * FROM <v> LIMIT 0` against each configured view so a future
  schema/migration desync surfaces as a recoverable `schema-out-of-sync`
  error at startup instead of a cryptic per-query crash.
- Funnel the new error through the existing `out-of-sync-migrations`
  recovery dialog via a new `classifyUpdateVersionError` helper —
  guides the user to re-sync from server.
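The probe described above can be sketched roughly as follows (the names and the query-runner signature are hypothetical; the real helper lives in `updateViews`):

```typescript
// Hypothetical sketch of probeViews: run a zero-row SELECT against each
// configured view so a schema/migration desync surfaces as one
// recoverable classified error at startup instead of a per-query crash.
type RunQuery = (sql: string) => void; // throws if a view/column is missing

function probeViews(views: string[], runQuery: RunQuery): void {
  for (const view of views) {
    try {
      // LIMIT 0 forces SQLite to resolve every column the view
      // references without materializing any rows.
      runQuery(`SELECT * FROM ${view} LIMIT 0`);
    } catch (err) {
      throw new Error(`schema-out-of-sync: view ${view}: ${String(err)}`);
    }
  }
}
```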

Reliability fixes uncovered while writing tests:
- `withMigrationsDir` wraps in try/finally so a throwing callback no
  longer leaks MIGRATIONS_DIR into the rest of the suite.
- The `migrate` mock in `mocks/setup.ts` preserves the real return
  value (and resets the uuid seed in a `finally`).

Test coverage: 40 tests across migrations.test.ts (27), update.test.ts
(7), and classify-error.test.ts (6), covering the migration runner,
view probe, error funnel, and end-to-end migrate against the bundled
list with column-presence assertions.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-15 20:19:58 +01:00
Matiss Janis Aboltins
329a7e81e7 [AI] Keep the mobile page header mounted across navigation to avoid flashing (#7842)
* [AI] Keep the mobile page header mounted across navigation to avoid flashing

On mobile, navigating between pages unmounted and remounted the page header
(including its background), causing a visible flash while the next page
rendered. Mobile pages now publish their header content through a context to a
single persistent `MobilePageHeaderSlot` rendered by the app shell, so only the
header content swaps while its background stays put.

* [AI] Render the persistent mobile page header via a portal

Replaces the context/state + useLayoutEffect plumbing for the persistent
mobile header with a portal into the `MobilePageHeaderSlot` DOM node, so
header content reconciles in place without re-rendering the provider on every
page render.

* Add release notes for PR #7842

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-15 16:13:14 +00:00
Sebastián Maluk
e8d95fdf6b [AI] feat: add an experimental balance forecast report (#7310)
* [AI] feat: add balance forecast backend

* [AI] feat: add balance forecast report UI

* [AI] feat: gate balance forecast behind an experimental flag

* [AI] Include account-less schedules in balance forecast via explicit flag

- Add includeAccountlessSchedules to forecast/generate and normalize
  schedules without an account into FORECAST_UNASSIGNED_ACCOUNT_ID
- When enabled, append synthetic bucket and rule stub; skip transfer legs
  for unassigned schedules
- Balance forecast UI sets the flag when widget meta has no account filter
- Add loot-core tests for include vs exclude behavior

* [AI] Improve balance forecast chart refresh UX

Keep forecast charts stable during refetches and let the Y-axis scale to forecast data so balance changes remain visible.

* [AI] Document balance forecast report

Add experimental user documentation and navigation links for the new balance forecast report.

* [AI] Link balance forecast experimental flag to feedback issue #7669

* docs: add PR release notes

* [AI] chore: rerun CI
2026-05-15 02:07:32 +00:00
Maksim Zhukau
2e0342574f [AI] fix: match no transactions when "has tags" input lacks # (closes #7797) (#7808)
* [AI] fix: match no transactions when "has tags" input lacks `#` (closes #7797)

The SQL-side `hasTags` filter extracts `#tag` patterns from the user
input and `$and`s them together. When the input has no `#` (e.g. user
types `foo` instead of `#foo`), the extraction returns an empty array
and the resulting `$and: []` matches every transaction.

Mirror the empty-`oneOf` behaviour and return the match-nothing
sentinel (`{ id: null }`) in that case.
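A sketch of the fix (helper names hypothetical; condition shapes simplified from the real query builder):

```typescript
// Extract `#tag` tokens from the input; with none found, return the
// match-nothing sentinel instead of `$and: []`, which would match
// every transaction.
type TagCondition = { $and: object[] } | { id: null };

function extractHashTags(input: string): string[] {
  return input.match(/#[^\s#]+/g) ?? [];
}

function hasTagsCondition(input: string): TagCondition {
  const tags = extractHashTags(input);
  if (tags.length === 0) {
    return { id: null }; // mirrors the empty-`oneOf` behaviour
  }
  return { $and: tags.map(tag => ({ notes: { $like: `%${tag}%` } })) };
}
```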

* [AI] Add release notes entry for #7808

---------

Co-authored-by: MaksZhukov <maks_zhukov_97@users.noreply.github.com>
2026-05-14 23:36:14 +00:00
Matt Fiddaman
36e5cb17f5 fix healthcheck script (#7840)
* fix healthcheck script

* note

* test release docker image
2026-05-14 21:39:10 +00:00
Matiss Janis Aboltins
2c9e0af3e4 Fix OpenID auth test flakiness (#7847)
* [AI] Fix flaky openid /config test from cross-worker auth race

Vitest runs sync-server test files in parallel workers that share
account.sqlite. Other files (e.g. app-account.test.js) insert 'openid'
auth rows, and auth.method is a PRIMARY KEY, so a concurrent INSERT in
app-openid.test.ts can hit UNIQUE constraint failed: auth.method.

Use INSERT OR REPLACE in the helper and clear the auth table in
beforeEach for a clean start.

* Add release notes for PR #7847

* Change category from Bugfixes to Maintenance

Fix OpenID authentication test flakiness by ensuring test isolation with INSERT OR REPLACE.

* [AI] Disable file parallelism for sync-server tests

The previous fix only patched insertOpenIdAuth in app-openid.test.ts,
but app-account.test.js's insertAuthRow helper also does plain
INSERT INTO auth ... 'openid' ... (lines 197, 203, 210, 229, 245).
With maxWorkers: 2 and a shared account.sqlite, either file's INSERT
can race the other's and hit UNIQUE constraint failed: auth.method.

Disable cross-file parallelism so test files run sequentially against
the shared DB. Within-file tests still run sequentially by default.
Test suite goes from ~20s to ~36s; trades some speed for stability.
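The setting in question is a one-line Vitest config change (fragment; the rest of the file is elided):

```typescript
// vitest.config.ts (fragment): run test files one at a time so two
// files can't race INSERTs into the shared account.sqlite.
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    fileParallelism: false, // tests within a file already run sequentially
  },
});
```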

* [AI] Revert openid test changes, reword release note

The fileParallelism: false change in vitest.config.ts already prevents
the auth.method UNIQUE-constraint race across files, so the INSERT OR
REPLACE and extra beforeEach cleanup in app-openid.test.ts are no longer
needed. Revert that file back to its original state and reword the
release note to describe the actual fix.

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-14 21:30:33 +00:00
Matt Fiddaman
e04924810d clean up GoCardless bank factory loading process (#7809)
* remove TLA in bank-factory

* note
2026-05-14 21:04:20 +00:00
Matiss Janis Aboltins
872a40f829 Add explicit permissions to GitHub Actions workflows (#7846)
* [AI] Add explicit permissions blocks to GitHub Actions workflows

Resolves zizmor "excessive-permissions" findings by declaring minimal
`permissions:` blocks for workflows that previously relied on the default
GITHUB_TOKEN permissions.

https://claude.ai/code/session_01FsyCaLEqb3C4egMPUoAFRg

* Add release notes for PR #7846

* Update category for release notes

Changed category from Enhancements to Maintenance.

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-14 20:17:50 +00:00
Matiss Janis Aboltins
fd01bd855c [AI] Stabilize size-compare job by pinning downloads to run_id (#7780)
* [AI] Stabilize size-compare job by pinning downloads to run_id

The compare job in .github/workflows/size-compare.yml was flaky because
fountainhead/action-wait-for-check matched a check by name from any run
on the branch, while dawidd6/action-download-artifact with branch:/pr:
filters and workflow_conclusion: '' resolved to the latest run regardless
of completion. When a new master build started in the seconds between
waiting and downloading, the action picked up the in-progress run and
failed with "artifact not found".

Replaces the eight wait-for-check steps with one actions/github-script
step that polls listWorkflowRuns for a successful build.yml run on
master and the PR head SHA in parallel via Promise.all, then pins all
eight downloads to those run_ids.
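The polling half of that step can be sketched as below; `listRuns` is a hypothetical stand-in for a wrapper around the REST call (e.g. `listWorkflowRuns` filtered by `status: 'success'` and branch or head SHA):

```typescript
// Wait for a *successful* workflow run for a given ref, then return its
// run_id so artifact downloads can be pinned to that exact run rather
// than "the latest run on the branch".
type ListRuns = () => Promise<Array<{ id: number }>>;

async function waitForSuccessfulRun(
  listRuns: ListRuns,
  attempts = 30,
  delayMs = 10_000,
): Promise<number> {
  for (let i = 0; i < attempts; i++) {
    const runs = await listRuns();
    if (runs.length > 0) return runs[0].id; // newest matching run
    await new Promise(resolve => setTimeout(resolve, delayMs));
  }
  throw new Error('no successful run found before timeout');
}
```

The real step runs this for master and for the PR head SHA in parallel via `Promise.all`, then feeds the two run_ids to the download steps.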

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Add release notes for PR #7780

* Change category to Maintenance in release notes

Updated category from 'Enhancements' to 'Maintenance'.

* [AI] Clean up comment to remove reference to previous implementation

Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>
2026-05-14 19:41:13 +00:00
Matiss Janis Aboltins
7e0f024c97 Add repository metadata to @actual-app/crdt package.json (#7845)
* [AI] Fix npm provenance for @actual-app/crdt and bump to 3.0.1

Add the missing repository field to packages/crdt/package.json so the npm
provenance bundle can validate the source against
https://github.com/actualbudget/actual. Without it, publishing fails with
"Error verifying sigstore provenance bundle: repository.url is \"\"".

* Add release notes for PR #7845

* [AI] Revert @actual-app/crdt version back to 3.0.0

* Fix metadata formatting in package.json for crdt

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-14 19:34:39 +00:00
Matiss Janis Aboltins
2b7bc80f4e release crdt v3.0.0 (#7818)
* release crdt v3.0.0

* Test

* Bump version

* Revert readme
2026-05-14 19:08:51 +00:00
Matiss Janis Aboltins
6c1699f0b0 [AI] Replace any-typed Modal in undo state with structural type (#7813)
* [AI] Replace any-typed Modal in undo state with structural type

loot-core can't import @actual-app/web's Modal union, so the undo MRU
typed openModal as `any`. The undo system only stores the value and
reads `.name`, so a minimal structural shape `{ name: string; options?: unknown }`
is enough. desktop-client's full Modal still assigns to it, and the one
reader (global-events.ts) re-narrows back to Modal when handing the
value to replaceModal().
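The structural shape, and why the full Modal still assigns to it, in miniature (the modal object here is hypothetical):

```typescript
// Minimal structural type stored by the undo MRU: only `name` is read.
type StoredModal = { name: string; options?: unknown };

// A hypothetical richer modal from the UI layer; structural typing
// means any object with a string `name` is assignable.
const fullModal = { name: 'edit-rule', options: { ruleId: 'abc' }, extra: 1 };
const stored: StoredModal = fullModal;
```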

* Add release notes for PR #7813

* Update 7813.md

* [AI] Add TODO on Modal cast for future type consolidation

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-13 18:08:11 +00:00
Matt Fiddaman
4b1b68a353 automation UI: add spend from functionality (#7823)
* add spend from template

* note
2026-05-13 17:35:41 +00:00
Matt Fiddaman
dbf5d7c079 automation UI: tweak font colours to be more readable (#7819)
* s/pagetext{Subdued,Light}

* note
2026-05-13 17:31:25 +00:00
Matt Fiddaman
6bfc299d28 automation UI: add cleanup functionality (#7815)
* [AI] Share cleanup-group helpers and let storeTemplates write cleanup_def

- Extract resolveCleanupGroup and tombstoneOrphanCleanupGroups out of
  cleanup-template-notes.ts into a new cleanup-groups.ts so the
  upcoming UI-driven create flow can reuse the resurrect-aware lookup.
- Let storeTemplates accept an optional cleanup array per category
  (omitted = leave as-is, [] = clear, non-empty = replace), and run the
  orphan tombstone sweep whenever cleanup_def is touched so groups
  removed from the UI don't linger.
- Register budget/store-note-cleanups so the UI can migrate a single
  category's notes on demand, and budget/create-cleanup-group so it can
  create groups inline.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
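The tri-state cleanup argument described above reduces to this decision (helper name hypothetical):

```typescript
// omitted = leave cleanup_def as-is, [] = clear it, non-empty = replace.
function nextCleanupDef(
  current: string[] | null,
  incoming?: string[],
): string[] | null {
  if (incoming === undefined) return current; // omitted: leave as-is
  if (incoming.length === 0) return null;     // []: clear
  return incoming;                            // non-empty: replace
}
```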

* [AI] Add UI editor for end-of-month cleanup automations

- New cleanup row in the BudgetAutomations sidebar with read-only
  summary; selecting it opens an editor with a Global scope card and
  an optional named-group scope (single group per category for now,
  since multi-group ordering depends on category sort).
- Each scope card has independent "send leftover" / "take a share"
  toggles plus a weight; group scopes additionally support
  "only enough to cover overspending".
- Group picker is a typeahead that creates groups inline via
  budget/create-cleanup-group.
- useCategoryCleanup migrates notes to cleanup_def at modal-open for
  unmigrated categories; useCleanupGroups streams the live list.
- Un-migrate flow renders cleanup_def back to #cleanup note lines and
  drops rows whose group can't be resolved, so users never see UUIDs
  in their notes.
- Sidebar/automation-button "has automations" probes also check
  cleanup_def so cleanup-only categories still get the indicator.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* note

* review pass 1

* bring automation logic in line with cleanup logic

* review pass 3

* coderabbit pass 1

* wording suggestions

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-13 17:01:11 +00:00
Julian Dominguez-Schatz
a7e22b023c Disable postinstall scripts except for an allowlist (#7825)
* Disable postinstall scripts except for an allowlist

* Add release notes

* Temp: trigger docs CI

* Revert "Temp: trigger docs CI"

This reverts commit 1c2ca1125c.

* Remove some unneeded builds
2026-05-13 13:29:15 +00:00
Julian Dominguez-Schatz
d3e7c1ee87 Fix some issues caught by zizmor (#7826)
* Fix some issues caught by zizmor

* Add release notes

* Add more cache ignores

* Add comments on reasoning
2026-05-13 13:19:16 +00:00
Aryan Katiyar
a7e100276e [AI] Fix category group filtering for budgeted custom reports (#7629)
* [AI] Fix category group filtering for budgeted custom reports

* [AI] Add release notes for budgeted report filter fix
2026-05-13 10:33:14 +00:00
Tonchain
ee82a16026 Fix split transaction popover wrapping (#7814)
* Fix split transaction popover wrapping

* Add release note for split popover fix

* Uncap split popover width
2026-05-13 10:07:14 +00:00
Matiss Janis Aboltins
b61732e20e [AI] Add workflow to auto-label AI-generated PRs (#7817)
* [AI] Add workflow to label '[AI]'-prefixed PRs as 'AI generated'

https://claude.ai/code/session_018yp3BsEq1CyPcw8t57nLVu

* [AI] Suppress zizmor dangerous-triggers finding and add release note

https://claude.ai/code/session_018yp3BsEq1CyPcw8t57nLVu

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-05-13 05:17:11 +00:00
Matiss Janis Aboltins
83073b3ee0 [AI] automated publishing workflow for crdt package (#7805)
* [AI] Require @actual-app/crdt version bump and auto-publish

Adds two workflows:
- crdt-version-check: fails PRs that modify files in packages/crdt/
  without bumping the version in packages/crdt/package.json.
- publish-crdt: publishes @actual-app/crdt to npm when the version in
  packages/crdt/package.json changes on master, tagging the release as
  crdt-v<version>.

* [AI] Skip git tagging in @actual-app/crdt publish workflow

Remove the tag-and-push step and the now-unused version output;
downgrade contents permission to read.

* [AI] Simplify crdt version-bump workflows

- Drop the redundant explicit base-branch fetch (fetch-depth: 0 already
  retrieves all remote branches).
- Remove the unreachable "no changes" guard; the pull_request paths
  filter already scopes the workflow to packages/crdt changes.
- Replace the embedded Node semver comparison with `sort -V`.
- Read versions with `jq` instead of inline Node.

* [AI] Add release notes for crdt publish workflows

* [AI] Restrict GITHUB_TOKEN permissions in crdt workflows

Add top-level `permissions: contents: read` to both crdt workflows so
the implicit jobs no longer inherit overly broad permissions (flagged by
zizmor).
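The top-level block in question is small (workflow fragment):

```yaml
# Top of each crdt workflow: minimal default inherited by all jobs.
permissions:
  contents: read
```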

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-05-12 21:23:47 +00:00
diodijon
78234102fa Added scoped ErrorBoundary elements to all the modals in the desktop-client modals folder (#7560)
* add scoped ErrorBoundary to base Modal component

* update release note

* [autofix.ci] apply automated fixes

* Delete packages/loot-core/src/mocks/files/budgets/test-budget/metadata.json

* Delete packages/loot-core/src/mocks/files/budgets/test-budget/db.sqlite

* Add ErrorBoundary to Modal component for error handling

Errors thrown within any modal's content will now display a fallback UI (FeatureErrorFallback) instead of crashing the entire application.

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>
2026-05-12 21:15:07 +00:00
Matiss Janis Aboltins
2c7f3c7a3d [AI] Replace google-protobuf with @bufbuild/protobuf (#7535)
* [AI] crdt: typecheck test files and clean up lint issues

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Replace google-protobuf with @bufbuild/protobuf

Swap the google-protobuf + ts-protoc-gen + protoc-gen-js toolchain for
@bufbuild/protobuf + @bufbuild/protoc-gen-es. The generator now emits a
single pure-TS sync_pb.ts (no .js sidecar, no globalThis.proto hack)
and a thin wrapper in proto/compat.ts preserves the SyncProtoBuf /
SyncRequest / etc. API so call sites stay unchanged. Removes the
loot-core CommonJS require polyfill that only existed to service
google-protobuf.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Align @bufbuild/protobuf version ranges with installed

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] crdt: drop the SyncProtoBuf compat layer

The proto/compat.ts wrapper was introduced alongside the bufbuild
migration to avoid touching call sites. With bufbuild messages already
exposing fields as plain mutable properties, the wrapper was just
boilerplate hiding direct reads and writes — and it had drifted (e.g.
setMessagesList was called in a test but never defined).

Delete compat.ts and migrate the six call sites in loot-core and
sync-server to use @bufbuild/protobuf directly. The crdt package now
re-exports the sync_pb types/schemas and the three bufbuild runtime
helpers (create, fromBinary, toBinary) so consumers keep a single
import source.

Also switch sync-server's @actual-app/crdt dependency from the pinned
"2.1.0" to "workspace:*", matching api/loot-core — the npm pin was
pulling the stale published copy instead of the workspace source.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] CI: drive sync-server build through lage so crdt deps are built

Before: the server job ran `yarn workspace @actual-app/sync-server build`
directly, which invokes tsgo without first emitting the workspace
dependencies' declarations. That worked when sync-server pinned crdt to
the published npm version (declarations bundled in the tarball), but
with `workspace:*` it fails with TS6305 because packages/crdt/dist/*.d.ts
hasn't been built yet.

Switch the CI command to `yarn build --to=@actual-app/sync-server`.
Lage respects the `dependsOn: ['^build']` pipeline and builds
@actual-app/crdt (and the other transitive deps) before sync-server.

Using --to rather than --scope keeps the build set minimal; --scope
would also include dependents like desktop-electron.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] sync-server: build project references via tsgo -b

The build script ran plain `tsgo`, which doesn't compile referenced
projects. With @actual-app/crdt now a `workspace:*` dep (no bundled
declarations from the npm tarball), the sync-server build fails with
TS6305 because packages/crdt/dist/index.d.ts doesn't exist yet.

Switch to `tsgo -b` so the sync-server build is self-contained: it
emits crdt's declarations into packages/crdt/dist on demand. This
mirrors what the sync-server `typecheck` script already does and fixes
all callers (`build:server`, docker-edge, publish workflows, the
direct `yarn workspace @actual-app/sync-server build` invocation in
build.yml) without needing per-workflow lage orchestration.

Revert the build.yml workaround added in the previous commit.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] sync-server: build @actual-app/crdt before tsgo

The previous tsgo -b approach emitted crdt's .d.ts via the project
reference but never produced dist/index.js — tsgo respects crdt's
tsconfig which has emitDeclarationOnly: true, and the actual JS
runtime is emitted by Vite in crdt's build script. So sync-server
compiled cleanly but crashed at runtime when forked by desktop-electron
(require('@actual-app/crdt') resolved to a package whose main pointed
at a nonexistent file, surfaced in e2e as the onboarding screen never
leaving the "Configure your server" state).

Unlike packages/api (which uses Vite with noExternal: true and bundles
crdt's source inline), sync-server uses plain tsgo compilation and
keeps its deps external — so crdt must be built ahead of time and be
resolvable via node_modules at runtime.

Chain `yarn workspace @actual-app/crdt build` before tsgo so every
caller of sync-server's build (build:server, docker-edge, publish
workflows, direct invocations in CI) gets a complete crdt dist. Revert
tsgo -b back to plain tsgo since crdt's build step now emits both the
JS and the declarations.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] crdt: expose dist/ via conditional exports so Node can load it

The package's `exports` field pointed straight at `./src/index.ts`,
which works for TS tooling and bundlers (vite with noExternal, vitest)
but breaks at plain-Node runtime — Node can't execute `.ts` files and
resolves dependent `./crdt` as a directory import, failing with
ERR_UNSUPPORTED_DIR_IMPORT.

That was invisible before because sync-server pinned
`@actual-app/crdt@2.1.0` and ran against the published npm tarball
(whose `publishConfig.exports` had already been promoted to the main
`exports` by yarn pack). Switching sync-server to `workspace:*` made
the raw workspace exports win at runtime: the compiled server imported
crdt when desktop-electron forked it, Node hit the `.ts` entry, the
utility process crashed before emitting `server-started`, and the
onboarding flow stalled on "Configure your server".

Switch to the same conditional-exports pattern packages/api already
uses: types → dist/index.d.ts, development → src/index.ts (for vitest
runs that enable the `development` condition), default → dist/index.js
(Node runtime and any other consumer). `publishConfig.exports` still
collapses this to just types + default for the npm tarball.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] crdt: split exports per consumer (browser source, node dist)

Previous commit's conditional exports routed everything non-development
to ./dist/index.js. That broke the web build: rolldown runs with
conditions ['electron-renderer', 'module', 'browser', 'default'] — no
match for development, falls through to the dist entry, which isn't
built by bin/package-browser, and fails to resolve @actual-app/crdt
when bundling loot-core's server/undo.ts.

Split the entries so each consumer lands on the right artifact:

  types       → ./dist/index.d.ts   (TypeScript, project references)
  development → ./src/index.ts      (vitest — both configs include it)
  browser     → ./src/index.ts      (web rolldown bundles the source)
  node        → ./dist/index.js     (sync-server forked by Node at
                                     runtime — the failure that kicked
                                     off this whole saga)
  default     → ./src/index.ts      (fallback for bundlers like api's
                                     vite build with conditions=['api'])

Verified: node resolves to dist, yarn build:browser succeeds from a
clean crdt/, sync-server build produces both dist/index.js and
build/app.js, loot-core (552) + sync-server (386) tests pass, full
typecheck clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
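The per-consumer split above maps onto a conditional `exports` entry along these lines (fragment of packages/crdt/package.json; surrounding fields elided):

```json
{
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "development": "./src/index.ts",
      "browser": "./src/index.ts",
      "node": "./dist/index.js",
      "default": "./src/index.ts"
    }
  }
}
```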

* [AI] address review feedback on crdt/sync-server

- generate-proto: add `set -euo pipefail` so a protoc failure exits the
  script non-zero instead of silently running oxfmt on whatever is in
  src/proto/ from the previous run.
- sync.proto SyncRequest: field numbers jumped from 3 to 5; declare
  `reserved 4;` so the slot can't be silently reused for a new field
  with an incompatible type. Regenerated sync_pb.ts — the reservation
  shows up in the encoded file descriptor.
- sync-simple.js: SQLite stores is_encrypted as a 0/1 integer and
  better-sqlite3 hands it back as a number, but the bufbuild
  MessageEnvelope schema types isEncrypted as bool. Coerce to boolean
  when constructing the envelope so the JS value matches the field
  type before toBinary runs.

Skipped the suggested `types` → ./src/index.ts swap in crdt's exports:
packages/api uses the same `types` → dist pattern and TypeScript's
bundler resolution already falls through when dist/*.d.ts doesn't yet
exist (verified — loot-core typecheck passes with packages/crdt/dist
removed).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] address review feedback on encoder/app-sync test

- encoder.ts: prefs.getPrefs().encryptKeyId is `string | undefined`
  (MetadataPrefs is a Partial<>). The bufbuild SyncRequestSchema's
  keyId field is a non-optional proto3 string. Current code worked by
  accident — passing undefined into `create(Schema, init)` falls back
  to the schema default '' — but relied on bufbuild's undef-handling
  and would break if someone dropped @ts-strict-ignore. Normalize to
  '' explicitly.
- app-sync.test.ts: add a short WHY comment next to
  `syncRequest.since = ''` in "returns 422 if since is not provided".
  The test's intent (missing since) only matches the handler's
  `requestPb.since || null` falsy-check because proto3 strips '' on
  the wire and decodes it back to ''. Not obvious without the comment.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] crdt: load source directly in dev, only use dist when published

Local exports point at src/index.ts so consumers (sync-server in
particular) never load a stale Vite bundle. publishConfig keeps the
dist/ mapping for npm consumers. Switched the Vite output to ESM and
added "type": "module" so the published bundle stays consistent.

Sync-server's existing extension-resolution loader is extended to
handle directory imports and is now registered at runtime via
--import ./register-loader.mjs, matching how tests already load it.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] desktop-electron: register sync-server loader on the embedded fork

The Electron app starts the sync server via utilityProcess.fork, which
bypasses sync-server's `start` script. With crdt now loaded from
source, the fork needs the same `--import register-loader.mjs` that
the standalone server uses; otherwise it crashes on the extensionless
`from './crdt'` directory import. Adds the loader files to
sync-server's published `files` so they actually ship with the
packaged app.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] sync-server: bootstrap entry that registers the loader for utilityProcess

Electron's utilityProcess.fork accepts execArgv but silently ignores
--import (verified with a minimal repro: the flag shows up in
process.execArgv but the preload module never executes), so the
previous attempt was a no-op and the embedded sync-server still
crashed on crdt's ESM directory imports. Add packages/sync-server/start.mjs
that statically imports register-loader.mjs and then dynamic-imports
build/app.js, so the loader is in place before the app's module graph
resolves. desktop-electron now points utilityProcess.fork at start.mjs
and drops the ineffective --import flag.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-12 17:55:52 +00:00
Yosof Badr
a4cb6dac37 [AI] fix: allow clearing pre-assigned category on new transactions (#7521)
* [AI] fix: allow clearing pre-assigned category on new transactions

Add a "Nothing" button to the category autocomplete modal that allows
users to clear a pre-assigned category when adding or editing
transactions. Previously, when a payee had a pre-assigned category,
there was no way to remove it and leave the transaction uncategorized.

Closes #7390

* [AI] docs: add release notes for PR #7521

* [AI] chore: re-trigger CI for flaky test

The test failure in methods.test.ts (Budgets: successfully update budgets)
is a pre-existing flaky test caused by a race condition in
advanceSchedulesService. The async schedule service fires via
void runMutator() after a sync event, but the database can be closed
before the query completes. This is unrelated to the PR changes which
only touch desktop-client UI code.

* chore: retrigger CI (flaky api test)

* fix type issue, better text

* more type fixes

* actually fixed?

---------

Co-authored-by: youngcw <calebyoung94@gmail.com>
2026-05-12 17:02:18 +00:00
youngcw
d1f9f8aecf release crossover report (#7804)
* release crossover report

* note
2026-05-12 16:45:01 +00:00
youngcw
3b927e754c 🐛 fix crossover report bugs (#7803)
* fix crossover report bugs

* note

* fixes

* refix
2026-05-12 16:02:09 +00:00
Alec Bakholdin
f2f3a5aa6d reserved whitespace for bank sync indicator when non-synced accounts … (#7611)
* moved the bank sync indicator to the right side of the text in mobile accounts view

* release notes

* moved spacing to the left again but made it smaller

* removed react from imports

* compressed space further

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7611

---------

Co-authored-by: Alec Bakholdin <alecbakholdin.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-12 15:56:42 +00:00
Stephen Brown II
8fbad7d64f [AI] No longer adjust date to match on transfers (#7722) 2026-05-12 15:51:08 +00:00
Nikhil Verma
44a3013772 [AI] Add R keyboard shortcut for Make transfer (#7750)
* [AI] Add T keyboard shortcut for Make transfer

* [AI] Add release notes for #7750

* [AI] Switch Make transfer shortcut from T to R and document it

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7750

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7750

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7750

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-12 15:40:16 +00:00
Matt Fiddaman
70716a59da automation UI: add goal template functionality (#7792)
* add goal to automations UI

* note

* tweak description

* Update upcoming-release-notes/7792.md

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2026-05-12 14:30:05 +00:00
Matt Fiddaman
e83567216f automation UI: open to the first automation (#7811)
* open to first automation

* note
2026-05-12 14:29:08 +00:00
Matt Fiddaman
92d4f82b66 automation UI: allow "save by date" automations not to repeat (#7810)
* allow save by automations not to repeat

* note
2026-05-12 14:28:58 +00:00
Matiss Janis Aboltins
50feba1afb [AI] Fix flaky API test timeouts and use sync file write in tests (#7806)
* [AI] Fix flaky upload-user-file test

The "uploads and updates an existing file successfully" test wrote the
old file content using the async callback form of fs.writeFile without
awaiting it. That write could land after the upload endpoint had already
written the new content, leaving the file with stale content and failing
the assertion. Use fs.writeFileSync so the setup completes before the
request is sent.
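The distinction the fix relies on can be sketched as follows (file paths hypothetical; not the actual test code):

```typescript
import * as fs from 'node:fs';
import * as os from 'node:os';
import * as path from 'node:path';

const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'upload-test-'));
const file = path.join(dir, 'user-file.bin');

// Buggy pattern: the callback form of fs.writeFile only *schedules* the
// write, so this "old content" could land after a later write and clobber it:
//   fs.writeFile(file, 'old content', () => {});   // never awaited!

// Fixed pattern: the synchronous write completes before the call returns,
// so anything that runs afterwards is guaranteed to see the setup in place.
fs.writeFileSync(file, 'old content');
fs.writeFileSync(file, 'new content'); // stands in for the upload endpoint's write
const final = fs.readFileSync(file, 'utf8');
```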

* [AI] Increase api test timeouts to fix flaky budget-load test

methods.test.ts loads a budget file and runs all DB migrations in each
test/hook. On busy CI runners this regularly approaches the default 5s
limit, and when it exceeds it the in-flight loadBudget keeps running after
teardown closes the database, producing a cascade of unhandled rejections
("database connection is not open", "no such table: v_schedules",
"Cannot read properties of undefined (reading 'timestamp')") that fail the
suite. Bump testTimeout/hookTimeout to 20s for the api package.

* [AI] Add release note for flaky test fixes

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-05-11 22:07:25 +00:00
Nadir Miralimov
8263e58eb2 [AI] Replace payee and category autocomplete filter/sort with fzf fuzzy search (#7261)
* [AI] feat(web): replace custom autocomplete ranking with fzf

Replace substring-based filter/sort in PayeeAutocomplete and
CategoryAutocomplete with fzf fuzzy search. Remove deprecated
autocompleteRanking utility.

Closes #7261

* Update #7261 release notes

Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7261

* Regenerate yarn.lock file

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7261

* Restore e2e snapshots

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7261

---------

Co-authored-by: Nadir Miralimov <riid@pm.me>
Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-11 21:33:36 +00:00
Matt Fiddaman
d015858e4a persist cleanup templates in the DB (#7794)
* add cleanup DB structure

* persist cleanup templates in the DB

* note

* Update packages/loot-core/src/types/models/cleanup-templates.ts

Co-authored-by: Julian Dominguez-Schatz <julian.dominguezschatz@gmail.com>

* jfdoming feedback

* coderabbit

---------

Co-authored-by: Julian Dominguez-Schatz <julian.dominguezschatz@gmail.com>
2026-05-11 21:30:51 +00:00
dependabot[bot]
f3a9c1a02c Bump mermaid from 11.12.1 to 11.15.0 (#7801)
Bumps [mermaid](https://github.com/mermaid-js/mermaid) from 11.12.1 to 11.15.0.
- [Release notes](https://github.com/mermaid-js/mermaid/releases)
- [Commits](https://github.com/mermaid-js/mermaid/compare/mermaid@11.12.1...mermaid@11.15.0)

---
updated-dependencies:
- dependency-name: mermaid
  dependency-version: 11.15.0
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-11 21:10:29 +00:00
Matiss Janis Aboltins
daa698e7d2 [AI] Fix /update-vrt merge step when only one shard has changes (#7802)
The Merge VRT Patches job collects shard patches with the glob
`/tmp/shard-patches/*/vrt-shard.patch`, which assumes every downloaded
artifact lands in its own `path/<artifact-name>/` subdirectory. But
actions/download-artifact only does that when 2+ artifacts match the
pattern; when exactly one matches it unpacks the artifact directly into
`path`. So whenever a `/update-vrt` run touches snapshots in a single
shard (the common case) the patch ends up at
`/tmp/shard-patches/vrt-shard.patch`, the glob matches nothing, and the
job reports "No shard patches to merge" despite a patch having been
generated (e.g. run 25679233565).

Replace the glob with a recursive `find` so the patches are located
under either layout. `merge-multiple: true` is not an option here
because every shard artifact contains a file literally named
`vrt-shard.patch` and they would overwrite each other.
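A recursive search that tolerates both artifact layouts can be sketched like this (a TypeScript analogue of the `find` approach; names are illustrative, not the workflow's actual code):

```typescript
import * as fs from 'node:fs';
import * as path from 'node:path';

// Walk a directory tree and collect every file with the given name,
// whether it sits at the root (single-artifact layout) or inside a
// per-artifact subdirectory (multi-artifact layout).
function findPatches(root: string, name = 'vrt-shard.patch'): string[] {
  const found: string[] = [];
  for (const entry of fs.readdirSync(root, { withFileTypes: true })) {
    const full = path.join(root, entry.name);
    if (entry.isDirectory()) {
      found.push(...findPatches(full, name)); // recurse into subdirectories
    } else if (entry.name === name) {
      found.push(full);
    }
  }
  return found;
}
```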

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-11 20:38:27 +00:00
Matt Fiddaman
e0536b593d add EB feedback issue (#7800)
* add EB feedback issue

* note
2026-05-11 20:22:21 +00:00
Matt Fiddaman
d500057494 automation UI: add options section to sidebar (#7791)
* add options section to sidebar

* note

* don't allow switching from option to automation

* use short description for sidebar, hide projected balance
2026-05-11 20:09:31 +00:00
Matiss Janis Aboltins
0086f805f8 [AI] Release custom themes feature (#7775)
* [AI] Release custom themes feature

Remove the customThemes experimental feature flag while keeping the
functionality intact (now enabled for all users).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Revise custom themes release note to experimental

Updated the release notes to reflect the experimental status of the custom themes feature.

* [AI] Move custom themes docs out of experimental

Custom themes graduated from experimental in this release; move the
guide to /docs/custom-themes, drop the experimental warnings and the
flag-toggle instructions, and update the historical link target in
releases.md plus a brief pointer from settings/index.md.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Update upcoming-release-notes/7775.md

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2026-05-11 20:08:23 +00:00
James Skinner
911d8371cc Tracking budget api income (#7526)
* Add tests and handling income group in different budget types

* Handle income groups in reflect budget logic

* Add release note

* Lint fix

* Address coderabbit feedback

* Remove ts-strict-ignore

* Change test dates, and assert group existence

* fix naming

* fix typecheck

---------

Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>
2026-05-11 19:36:51 +00:00
Matiss Janis Aboltins
a95c0ad9b0 [AI] sync server changes; crdt & et al (#7702)
* [AI] Load @actual-app/crdt from source in dev, only bundle for publish

@actual-app/crdt's local exports now point at src/index.ts so consumers
(sync-server, loot-core, desktop-client) never see a stale Vite bundle.
publishConfig keeps the dist/ mapping for npm consumers. crdt's
tsconfig switches to bundler module resolution to match the rest of
the workspace (no extensions in source imports).

Sync-server's existing extension-resolution loader is extended to also
handle directory-index imports (./crdt → ./crdt/index.ts), and the
standalone `start` / `start-monitor` scripts now invoke Node with
--import ./register-loader.mjs so the loader is in place before crdt's
source resolves.

Electron's utilityProcess.fork accepts execArgv but doesn't actually
preload --import modules, so a new packages/sync-server/start.mjs
bootstrap entry registers the loader imperatively and then dynamic-
imports build/app.js. desktop-electron's startSyncServer() points the
fork at start.mjs. sync-server's "files" array now ships start.mjs,
register-loader.mjs and loader.mjs so packaged Electron / npm
consumers actually receive them.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Add release notes for PR #7702

* [AI] Restructure sync-server to build with Vite

Replace the hand-rolled tsgo + add-import-extensions + copy-static-assets
+ runtime loader pipeline with a single Vite SSR build. Bundles every
entry (app, bin/actual-server, scripts/*) and inlines @actual-app/crdt
source so Node never has to resolve TS at runtime — the
MODULE_TYPELESS_PACKAGE_JSON warning that surfaced via crdt's source
exports is gone. Migrations and bank handlers move from readdir-based
dynamic imports to import.meta.glob; messages.sql becomes a ?raw import.

Drop loader.mjs, register-loader.mjs, start.mjs, and
bin/add-import-extensions.mjs. Electron's startSyncServer() forks
build/app.js directly. publishConfig.imports goes away (subpath imports
are resolved at build time and don't appear in the bundle).

In dev (start:server-dev) sync-server proxies to Vite, so loosen the CSP
to allow Vite's inline preamble script and HMR websocket — production
CSP is unchanged. desktop-client skips registerSW() in dev (and disables
vite-plugin-pwa's devOptions) so stale cached assets don't override
edits between page loads.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Address review feedback

- sync-server CSP: drop 'unsafe-eval' from the production script-src;
  the bundle has no genuine eval/new Function usage (only a defensive
  branch in setimmediate's polyfill that's never hit). Keep it on the
  dev branch where Vite's HMR runtime relies on it. Add a comment so
  it's obvious which branch needs it and why.
- bank-factory: widen the loader glob to ./banks/*_*.{ts,js} so
  TypeScript handlers are discovered too, mirroring migrations.ts.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Restore 'unsafe-eval' in production CSP for Electron

The Electron app needs `'unsafe-eval'` at runtime, so revert the dev-only
restriction and keep `'unsafe-eval'` in both branches. Comment updated to
record the actual reason instead of marking it as removable.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Revert bank-factory glob change

Widening the glob to ./banks/*_*.{ts,js} broke the desktop e2e tests in
CI even though every current handler is .js and the brace expansion
matches no .ts files locally. Reverting to ./banks/*_*.js — the change
had no behavioural benefit since there are no TS handlers, so the
nitpick isn't worth chasing.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Strip CSP comment to restore identical state to 9513c1e16

The desktop e2e has been failing despite my prior commits being a strict
revert (only difference was a 2-line comment, which can't change runtime).
Removing even the comment so the branch matches 9513c1e16's relevant
files exactly, to isolate whether the failure is from the master merge
or from CI-environment drift.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Make rebuild-electron actually rebuild better-sqlite3

PR #7712 simplified rebuild-electron to just `electron-rebuild -f -o
better-sqlite3,bcrypt` from the repo root. Two problems with that:

  1. Without `-m`, electron-rebuild scans the root workspace's package.json
     for native deps. better-sqlite3 isn't a direct root dep — it lives
     under packages/sync-server/ — so the scan returns no candidates and
     the rebuild silently no-ops.
  2. Without --build-from-source, electron-rebuild defers to
     prebuild-install, which downloads a stale prebuilt binary keyed off
     better-sqlite3's package.json (ABI 127) instead of recompiling
     against Electron 39's bundled Node ABI 140. The download succeeds
     and "Rebuild Complete" prints, but the resulting `better_sqlite3.node`
     can't `dlopen` inside Electron's utility process — sync-server
     crashes immediately on db init, the renderer's startSyncServer IPC
     never resolves, and the e2e test hangs on "Configure your server".

Point -m at packages/desktop-electron (which transitively pulls in
better-sqlite3 and bcrypt via @actual-app/sync-server) and force a real
compile via --build-from-source. Verified locally: better-sqlite3
rebuilds to darwin-arm64-140 and the desktop e2e onboarding test passes
in 6s instead of hanging for 60s.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Restore CSP unsafe-eval comment

Bring back the explanatory comment that was stripped diagnostically in
99682268c. Now that the desktop e2e regression is traced to
rebuild-electron and not to anything in this branch, we can keep the
documentation noting why 'unsafe-eval' is retained in both CSP branches.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Restore bank-factory glob to ./banks/*_*.{ts,js}

Re-apply the glob widening originally added in 145868f9d. It was
reverted in 531b1a191 because the desktop e2e was failing — that
failure is now traced to the rebuild-electron breakage (fixed in
6e8ac0784), not to this glob. Mirroring migrations.ts so future TS
bank handlers are picked up.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Fix applyAppUpdate hanging in dev mode

In dev mode browser-preload's updateSW was () => undefined, so
applyAppUpdate() — which calls updateSW() and then awaits a
deliberately never-resolving promise (waiting for the SW-driven page
reload) — hung the renderer instead of refreshing. In prod the page
is replaced by the new service worker, so the never-resolving await is
fine. The dev path now triggers a plain window.location.reload() so
the page reloads and the never-settling await is irrelevant, matching
prod's effective behaviour.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Revert rebuild-electron to master version

* Revert "[AI] Revert rebuild-electron to master version"

This reverts commit 4b6baab79f.

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>
2026-05-11 17:40:42 +00:00
Matiss Janis Aboltins
d9308c1474 [AI] Fix mobile bank sync indicators not updating during sync (#7784)
* [AI] Update mobile bank sync indicators live during sync

Mobile's account list uses react-aria-components ListBox with the
items render-function pattern, which memoizes rows by item identity.
Without a dependencies prop, changes to syncingAccountIds,
failedAccounts, and updatedAccounts in Redux didn't cause the
per-account dots to re-render until the items array itself changed,
so the green/yellow/red indicators only updated after the full sync
finished.

Pass these Redux selections via the dependencies prop so the rows
re-render as state changes during sync. Also clear SimpleFin
accounts from accountsSyncing right after the batch call returns,
so their indicators reflect completion before the per-account loop
starts on the remaining accounts.

https://claude.ai/code/session_01DNkRSgqW5JEtYpZjxvj7Bi

* [AI] Update release notes filename and author

https://claude.ai/code/session_01DNkRSgqW5JEtYpZjxvj7Bi

* [AI] Drop verbose comment on SimpleFin sync dispatch

https://claude.ai/code/session_01DNkRSgqW5JEtYpZjxvj7Bi

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-05-11 17:06:21 +00:00
Matiss Janis Aboltins
425db2d94d [AI] Recover from BackendInitFailure and show a meaningful error (#7761)
* [AI] Recover from BackendInitFailure and show a meaningful error

When the backend Worker fails to load (e.g., the hashed kcab.worker
asset can't be fetched), the SharedWorker would cache the
app-init-failure and replay it to every subsequent tab forever, while
the FatalError modal showed a misleading "browser version" message.

- Retry importScripts in production (3 attempts) so a transient blip
  doesn't brick the SharedWorker.
- Clear lastAppInitFailure when the client acknowledges the failure,
  when a backend later connects successfully (centralized in
  broadcastConnect), and when a fresh init arrives with no active
  groups (the failed leader is gone).
- Add a BackendInitFailure branch to FatalError's RenderSimple with a
  message that points the user at reload / hard refresh.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Remove support contact message from FatalError

Removed support contact message from FatalError component.

* [AI] Fix error propagation in importScriptsWithRetry

- Change Promise executor to accept both resolve and reject
- Properly propagate errors using .then(resolve).catch(reject)
- Fixes issue where errors from recursive retry calls were swallowed
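Since `importScripts` is synchronous inside a worker, the retry-with-propagation behavior described above can be sketched as a plain loop (helper name assumed; the real implementation wraps a Promise):

```typescript
// Retry a synchronous operation up to `attempts` times; if every attempt
// fails, rethrow the last error instead of swallowing it.
function retrySync<T>(fn: () => T, attempts = 3): T {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return fn(); // success: return immediately
    } catch (e) {
      lastError = e; // remember the failure and try again
    }
  }
  throw lastError; // all attempts failed: surface the real error
}
```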

Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>
2026-05-11 17:06:01 +00:00
Matiss Janis Aboltins
5d270340a5 CLI: Hide hidden categories by default in list commands (#7786)
* [AI] CLI: hide hidden categories by default in list commands

The `categories list` and `category-groups list` commands now exclude
hidden entries by default. Pass `--include-hidden` to include them, mirroring
the existing `--include-closed` flag for `accounts list`.

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] Rename release note to 7785.md and update author

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] CLI: simplify category-groups list and consolidate test setup

- Flatten the include-hidden ternary on category-groups list into a
  single filter chain, mirroring categories list.
- Consolidate duplicated stderr/stdout spy setup into one outer
  describe in categories.test.ts.

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] Rename release note to 7786.md to match PR number

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] Push hidden-category filtering down to the API/query layer

Add an optional `hidden` filter to `api.getCategories` and
`api.getCategoryGroups`. When set, the AQL query filters category groups
by hidden status and nested categories are filtered to match. Internal
callers (no options) keep the existing "return everything" behavior.

The CLI `categories list` and `category-groups list` commands now pass
`{ hidden: false }` instead of filtering client-side after fetching.

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F
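The filtering rule described above can be sketched with hypothetical model shapes (not the real AQL layer):

```typescript
type Category = { id: string; hidden: boolean };
type CategoryGroup = { id: string; hidden: boolean; categories: Category[] };

// When the `hidden` option is set, filter groups by hidden status and
// filter nested categories to match; with no option, return everything.
function getCategoryGroups(
  groups: CategoryGroup[],
  opts?: { hidden?: boolean },
): CategoryGroup[] {
  if (opts?.hidden === undefined) return groups; // internal callers: unfiltered
  return groups
    .filter(g => g.hidden === opts.hidden)
    .map(g => ({
      ...g,
      categories: g.categories.filter(c => c.hidden === opts.hidden),
    }));
}
```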

* [AI] Document new `hidden` option on getCategories and getCategoryGroups

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] getCategories: include hidden categories from visible groups in list

When `hidden: true` was requested, the flat list only contained hidden
categories that lived inside hidden groups, because it was derived from
the same already-filtered groups used for the grouped view. A hidden
category sitting in a visible group was silently dropped.

Fetch the unfiltered groups for the list view and filter by
`category.hidden` so the list reflects every hidden category regardless
of its parent group's hidden status. The grouped view is unchanged.

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] getCategories: query categories table directly when hidden=true

Replace the second `getCategoryGroups()` call (which loaded every group
plus its nested categories just to be flattened and filtered) with a
direct `q('categories').filter({ hidden: true })` AQL query. Same
result, one targeted query instead of fetching all groups.

The non-hidden=true paths are unchanged.

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-05-11 17:05:19 +00:00
Matt Fiddaman
070144f182 automation UI: dry run should show full demand (#7790)
* dry run should only take into account the single category

* fix error/warning consistency

* note

* ignore priorities for dry run
2026-05-11 16:10:32 +00:00
Alec Bakholdin
e479c84898 Added logic to prevent autocomplete from flickering on alt tab (#7637)
* added logic to prevent autocomplete from flickering on alt tab

* lint auto fixes

* removed import and linting autofix

* release notes

* flipped false to true for allowOpening

* updated release notes

* release note update to trigger CI

---------

Co-authored-by: Alec Bakholdin <alecbakholdin.com>
2026-05-11 15:34:17 +00:00
Tyler Quinlivan
1306da27c5 docs: fix wrong example number (#7773) 2026-05-11 15:20:46 +00:00
Aurel Demiri
0fd510a1d4 Integrate Enable Banking as a bank sync provider (#7345)
* Integrate Enable Banking as bank sync provider

Rewrite Enable Banking modal to match GoCardless pattern

Resolve Enable Banking bugs and improve auth flow

* [AI] Address code review feedback for Enable Banking integration

Bug fixes:
- Fix double-negative for DBIT transaction amounts (e.g. '--25.99')
- Fix payeeName counterparty mapping (CRDT→debtor, DBIT→creditor)
- Add missing state validation in EnableBankingCallback and /auth_callback
- Fix stuck loading state in useEnableBankingStatus with try/catch/finally
- Make session-expiry error matching case-insensitive
- Prefer CLAV balance type for startingBalance in /transactions route
- Guard setTimeout in post/del/patch when timeout is null
- Distinguish abort from network failure in post() catch

Credential handling:
- Add validateCredentials() to validate before persisting secrets
- Refactor client to use enablebanking-configure instead of manual secret-set
- Distinguish null (loading) from false (not configured) in setup checks

Poll-auth robustness:
- Add unique waiter IDs to prevent superseded waiter cleanup race
- Always cache results in completedAuths for retry resilience
- Add client disconnect cleanup via res.on('close')
- Cancel poll when Enable Banking modal closes via AbortController
- Prevent concurrent poll controller race with local reference check

Code quality:
- Extract buildSessionResult() to deduplicate auth_callback/complete-auth
- Add enabled parameter to useEnableBankingStatus to skip unused requests
- Add re-entrancy guard on onJump, reset bank on country change
- Refetch bank list after Enable Banking setup completes
- Type enableBankingConfigure config, make state required in completeAuth
- Add AbortError→TIMED_OUT test, fix startAuth test assertion
- Add afterAll vi.unstubAllGlobals() for test cleanup
- Add explanatory comments for bank-per-account model and in-memory maps

* [AI] Fix missing patterns in Enable Banking integration

- Add SyncServerEnableBankingAccount to ExternalAccount union and
  getInstitutionName parameter type in SelectLinkedAccountsModal
- Use BankSyncProviders type in mobile BankSyncAccountsList instead of
  hardcoded union missing enableBanking
- Add getSecretsError handling to EnableBankingInitialiseModal for
  proper auth/permission error messages
- Replace hardcoded #666 color with theme.pageTextSubdued
- Wrap onConnectEnableBanking in try/catch with error notification and
  init modal re-open, matching SimpleFin/PluggyAI pattern
- Translate hardcoded error string in enablebanking.ts
- Add 60s timeout to downloadEnableBankingTransactions matching PluggyAI
- Revert out-of-scope changes to del()/patch() in post.ts
- Revert shared starting balance dedup logic back to master pattern

* Forward PSU headers to Enable Banking API

* Fix Enable Banking re-auth dispatch

* Respect ASPSP maximum_consent_validity when starting Enable Banking auth

* Fix missing types for module jws

* Add upcoming release notes

* Fix format

Expected "sign" (value-import) to come before "Algorithm"

* Fix code review findings on Enable Banking integration

* [AI] Disable Enable Banking button while status is loading

* typo

* [AI] Migrate enable-banking files to subpath imports

Update all enable-banking files to use # subpath imports and
@actual-app/core paths, matching the migration done in master.
Add #enablebanking entry to desktop-client package.json imports map.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

* [AI] Add #app-enablebanking subpath imports to sync-server package.json

Register enablebanking service, utils, and root entries in both
the imports and publishConfig.imports maps.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

* Add jws to dependencies

* [AI] Harden Enable Banking OAuth callback handoff

Enforce exact OAuth state round-trip in the Enable Banking callback so
mismatched/missing state values no longer silently complete the flow.
Replace unsafe `as`/`!` assertions in the auth handoff with typed
locals so the callback path stays sound under strict TypeScript.

* [AI] Tighten Enable Banking type safety

Make the Enable Banking external-msg modal strict-ts compatible,
annotate the id type in linkEnableBankingAccount, derive
AccountSyncSource from a single SYNC_PROVIDERS list, and annotate the
return type of getJWTBody. No behaviour change.

* [AI] Fix Enable Banking poll lifecycle and abort handling

Make the popup-driven auth poll cancellable and isolated:

- Allow the popup retry path to abort the in-flight poll instead of
  leaving it hanging on the previous attempt.
- Clear the Enable Banking stateRef when the retry attempt finishes so
  a new attempt starts from a clean state.
- Start useEnableBankingStatus in loading state until the first fetch
  resolves so the UI doesn't briefly flash "not connected".
- Cancel only the requested poll, not every in-flight Enable Banking
  poll, so unrelated link attempts aren't affected.
- Skip writing the poll response when the client has already
  disconnected, with a regression test covering the disconnect path.

* [AI] Tighten Enable Banking client/test plumbing

Misc code-quality improvements with no behaviour change:

- Parallelize Enable Banking secret reset calls so wiping multiple
  secrets doesn't serialize the request chain.
- Use absolute imports in the enable-banking client module to match the
  rest of desktop-client.
- Document externalSignal usage in the post helper.
- Tighten Enable Banking test fixtures with `satisfies` and dynamic
  dates so they stop drifting when the real "now" moves.

* [AI] Fix Enable Banking initial-balance and post-link bookkeeping

Apply the standard post-sync bookkeeping when linking an Enable Banking
account so the new account picks up the same starting-balance
treatment as other bank-sync providers, and skip pending transactions
when computing the initial balance so the figure isn't inflated by
transactions that haven't cleared yet.

* [AI] Refine Enable Banking error model and bank-sync surface

Carry the human-readable Enable Banking message in
EnableBankingError.error_type and the machine-friendly identifier in
error_code, then map error_code to a bank-sync category in the
/transactions wire format so AccountSyncCheck can match on the same
categories as other providers.

* [AI] Improve Enable Banking bank-sync field mapping

Bring the Enable Banking transaction normalizer in line with how other
bank-sync providers feed the field mapper:

- Strip SEPA structured prefixes from remittance text so notes/payee
  display the human-meaningful portion instead of the SEPA boilerplate.
- Return the notes field and spread the raw transaction so downstream
  field mapping can reach the full payload.
- Expose Enable Banking raw fields in the bank-sync field mapper UI so
  users can map any underlying property, not just the curated subset.

* [AI] Use req.ip for Enable Banking PSU header so trust-proxy whitelist applies

* [AI] Address Enable Banking CodeRabbit pass-3 follow-ups

Three small fixes from the latest CodeRabbit re-review:

- Guard the aspsps fetch in EnableBankingExternalMsgModal against stale
  responses. Switching countries quickly could let an earlier in-flight
  request overwrite the newer selection's bank list. Added a cleanup
  flag in the useEffect so only the latest response updates state.
- Clear `enablebanking_auth_state` from localStorage when the auth flow
  exits, but only if the stored value still matches this attempt's
  state, so a concurrent retry can't wipe a newer session. Wrapping
  the poll in try/finally covers every return path (success, timeout,
  abort, body-level error).
- Use `Boolean(trans.booked)` in the Enable Banking initial-balance
  predicate to match `normalizeBankSyncTransactions`. The Enable
  Banking normalizer always sets `booked` to a boolean today, so this
  is defensive rather than a live bug, but keeping the two predicates
  aligned avoids surprises if the upstream shape ever loosens.

* [AI] Address Enable Banking CodeRabbit pass-3 follow-ups (round 2)

Two more findings from the latest CodeRabbit pass:

- Guard onJump against stale-retry completions. Token each call with a
  monotonic jumpIdRef counter and gate every post-await write
  (setError/setWaiting after onMoveExternal, the second setWaiting,
  and the finally-block ref reset) on `myJumpId === jumpIdRef.current`.
  Without this, a retry click while the previous poll was still
  unwinding could surface the older call's error in the newer
  attempt's UI and clear stateRef/isJumpingRef out from under it,
  leaving the new poll un-cancellable.
- Translate the (beta) suffix on Enable Banking ASPSP names so
  non-English locales don't surface a hardcoded English token in the
  bank list. The existing `actual/no-untranslated-strings` rule misses
  this case (regex requires a leading uppercase, and template-literal
  interpolations aren't visited as standalone strings).

* [AI] Use SEPA prefix allowlist instead of catch-all regex

The previous `^[A-Z]{3,}\+` regex would incorrectly strip merchant
tokens like `BMW+`, `USB+`, or `COVID+` from the start of a remittance
line. Replaced it with an explicit allowlist of known SEPA / ISO 20022
prefixes and added a regression test covering the false-positive case.
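The allowlist approach can be sketched as follows (the prefix set here is an illustrative subset of ISO 20022 / SEPA remittance tags; the codebase's actual list may differ):

```typescript
// Known structured-remittance tags; anything else (e.g. "BMW+") is a
// merchant token and must be left intact.
const SEPA_PREFIXES = ['EREF', 'KREF', 'MREF', 'CRED', 'DEBT', 'SVWZ', 'PURP'];

function stripSepaPrefix(line: string): string {
  for (const prefix of SEPA_PREFIXES) {
    if (line.startsWith(prefix + '+')) {
      return line.slice(prefix.length + 1).trim();
    }
  }
  return line; // not a known SEPA tag: return unchanged
}
```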

* [AI] Use uuidv4 instead of crypto.randomUUID in Enable Banking

Aligns with master's revert in #7734 (crypto.randomUUID back to uuid
library). Two stray spots remained in Enable Banking code: the
link-account flow in loot-core/server/accounts/app.ts and the OAuth
state token in sync-server/app-enablebanking.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-05-11 14:05:30 +00:00
Matiss Janis Aboltins
82673ecd50 [AI] Use bash for /update-vrt merge step (#7783)
The Merge VRT Patches job runs inside the Playwright container where
the default GitHub Actions shell is `sh -e {0}`, not bash. The merge
step uses bash-only constructs (`shopt -s nullglob`, array literals,
`${#patches[@]}`, `"${patches[@]}"`), so every /update-vrt run that
reaches the merge stage now exits 127 with `shopt: not found` (e.g.
run 25609625260).

Pin this step to `shell: bash` to match the explicit `shell: bash` we
already use elsewhere in the workflow. The sibling shard-patch creation
steps stay on the default sh because they only use POSIX features.

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-10 18:56:50 +00:00
Matiss Janis Aboltins
18c704b3ba [AI] Sync server: harden CORS proxy method validation (#7788)
* [AI] Sync server: harden CORS proxy method validation

The CORS proxy validated `method` against a fallback-normalized value but
forwarded the raw client-supplied value to fetch(), letting a non-string
input (e.g. ["POST"]) bypass the GET/HEAD allowlist via undici's String()
coercion. Reject non-string method, pass the validated normalized method
to fetch(), and drop the unreachable body-forwarding branch.
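A minimal sketch of the hardened check (helper name assumed, not the server's actual code): reject anything that is not a string before normalizing, and forward only the normalized value so fetch() never sees the raw client input.

```typescript
const ALLOWED_METHODS = new Set(['GET', 'HEAD']);

function validateProxyMethod(method: unknown): string {
  if (typeof method !== 'string') {
    // e.g. ["POST"] would previously coerce via String() inside fetch()
    throw new Error('method must be a string');
  }
  const normalized = method.toUpperCase();
  if (!ALLOWED_METHODS.has(normalized)) {
    throw new Error(`method not allowed: ${normalized}`);
  }
  return normalized; // pass this, not the raw value, to fetch()
}
```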

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Polish release notes wording

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Rename 7787.md to 7788.md

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-10 16:11:40 +00:00
Michael Clark
b05c207123 :electron: Publish to Microsoft store after release is published (#7757)
* move desktop app microsoft store publish to after the release is published

* release notes
2026-05-09 21:16:13 +00:00
Matiss Janis Aboltins
b9ab3e7bc6 [AI] Fix /update-vrt build step after lage browser-build refactor (#7781)
The build-web job in vrt-update-generate.yml invoked
`yarn workspace @actual-app/core build:browser`, but #7602 removed that
script when it routed the browser pipeline through
`lage build:browser --to=@actual-app/web` (orchestrated by
bin/package-browser). The recent /update-vrt parallelization (#7641)
preserved the now-stale per-workspace invocations, so every comment
trigger fails with "Couldn't find a script named build:browser".

Match the working e2e-test.yml build-web step exactly:
`yarn build:browser --skip-translations`. lage's `^build` edge handles
the upstream graph (crdt, plugins-service, loot-core) automatically, and
`--skip-translations` keeps the captured snapshots aligned with regular
VRT runs (which also strip Weblate locale chunks for determinism).

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-09 18:16:42 +00:00
Matiss Janis Aboltins
4f40defe9e [AI] Mobile: live value tracking (#7774)
* [AI] Update mobile budget value color live as user types

The mobile FocusableAmountInput's color was computed from the saved
`value` prop, so it stayed in the gray "zero" state until blur. Track
the in-progress edited value via the existing `onChangeValue` callback
and feed it to `makeAmountFullStyle` so the color reflects what the
user is currently typing.

* Add release notes for PR #7774

* Change category from Features to Bugfix

* [AI] Reapply sign when computing live amount color

liveValue holds the absolute value (the input field has no sign — the
+/- toggle controls it separately), so passing it directly to
makeAmountFullStyle picked positiveColor for amounts the user intends
as negative. Pass maybeApplyNegative(liveValue, isNegative) so the
color matches the signed value.
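The sign-reapplication logic can be sketched with an in-memory stand-in. The helper names `maybeApplyNegative` and `makeAmountFullStyle` come from the commit; their bodies below are hypothetical (the real `makeAmountFullStyle` returns theme-driven styles, not these placeholder colors).

```typescript
// Sketch of the live-color fix. The input field stores the absolute
// value; the +/- toggle carries the sign separately, so the sign must
// be reapplied before choosing a color.
function maybeApplyNegative(amount: number, isNegative: boolean): number {
  return isNegative ? -Math.abs(amount) : Math.abs(amount);
}

function makeAmountFullStyle(value: number): { color: string } {
  // Hypothetical palette standing in for the real theme colors.
  if (value > 0) return { color: 'green' };
  if (value < 0) return { color: 'red' };
  return { color: 'gray' };
}

// While the user types "250" with the negative toggle on:
const liveValue = 250; // absolute value captured via onChangeValue
const style = makeAmountFullStyle(maybeApplyNegative(liveValue, true));
// style now reflects the signed amount the user intends, not the
// unsigned value (which would have picked positiveColor).
```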

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-09 18:08:58 +00:00
Will Lapinel
3799b587ec [AI] Add getNote and updateNote to public API (#7769)
* [AI] Add getNote and updateNote to public API

Notes on categories and other entities have no public API surface today.
The internal `notes-save` handler exists and works, but callers outside
the app must reach into undocumented internals to use it.

A concrete motivation: AI assistants driving Actual through an MCP server
(e.g. Claude via @actual-app/api) can set budget templates and savings
goals on categories by writing specially-formatted strings to the notes
field (e.g. `#template 250`, `#goal 1000`). Without a public API this
requires using the private `lib.send('notes-save', …)` path, which is
fragile and not guaranteed to stay stable.

This commit adds two public methods:
- `getNote(id)` — returns the NoteEntity for a given entity id, or null
- `updateNote(id, note)` — sets the note string on any entity by id

Implementation:
- Adds `notes-get` handler in `packages/loot-core/src/server/notes/app.ts`
- Adds `api/note-get` and `api/note-update` handlers in `api.ts`
- Adds `ApiHandlers` types for both new handlers
- Exposes `getNote` / `updateNote` in `packages/api/methods.ts`
- Adds a test covering get (null before set) and set/update round-trip

Testing:
- `yarn typecheck` — passed (10/10 packages, 0 errors)
- `yarn lint:fix` — passed (0 errors)
- `yarn workspace @actual-app/api test` — passed (19/19 tests, including
  the new "Notes: successfully get and update note" test)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* [AI] Add release note for PR #7769

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* [AI] Address review feedback: tighten types and add docs

- Use NoteEntity field types (Pick<NoteEntity, 'id'>, NoteEntity['id'],
  NoteEntity['note']) instead of plain strings throughout
- Rename getNotes -> getNote (singular) in notes/app.ts
- Add Notes section to packages/docs/docs/api/reference.md

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-09 16:17:56 +00:00
Jon Bramley
8e1f27f316 Modal Blur remove static will-change: transform (#7760)
* remove static will-change: transform

* Release notes

* Fix text blur in modals by updating CSS

---------

Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>
2026-05-09 16:00:23 +00:00
Nikhil Verma
fb95d4c92d [AI] Document Dev Container option in development-setup docs (#7729)
* [AI] Document Dev Container option in development-setup docs

* [AI] Add release notes for #7729

* [AI] Update spell-check dictionary for Codespaces
2026-05-09 15:45:57 +00:00
LIZ
2782d464ab Fix last month report widgets restoring as static (#7768)
* Fix last month report widgets restoring as static

* Add release note
2026-05-09 15:30:04 +00:00
dependabot[bot]
b63f5dd303 Bump fast-uri from 3.1.0 to 3.1.2 (#7762)
Bumps [fast-uri](https://github.com/fastify/fast-uri) from 3.1.0 to 3.1.2.
- [Release notes](https://github.com/fastify/fast-uri/releases)
- [Commits](https://github.com/fastify/fast-uri/compare/v3.1.0...v3.1.2)

---
updated-dependencies:
- dependency-name: fast-uri
  dependency-version: 3.1.2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-09 15:05:29 +00:00
dependabot[bot]
35a01b0fa6 Bump @babel/plugin-transform-modules-systemjs from 7.28.5 to 7.29.4 (#7776)
Bumps [@babel/plugin-transform-modules-systemjs](https://github.com/babel/babel/tree/HEAD/packages/babel-plugin-transform-modules-systemjs) from 7.28.5 to 7.29.4.
- [Release notes](https://github.com/babel/babel/releases)
- [Changelog](https://github.com/babel/babel/blob/main/CHANGELOG.md)
- [Commits](https://github.com/babel/babel/commits/v7.29.4/packages/babel-plugin-transform-modules-systemjs)

---
updated-dependencies:
- dependency-name: "@babel/plugin-transform-modules-systemjs"
  dependency-version: 7.29.4
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-09 15:03:49 +00:00
350 changed files with 14368 additions and 3915 deletions

View File

@@ -1,7 +1,7 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/docker-existing-docker-compose
{
"name": "Actual development",
"name": "Actual Devcontainer",
"dockerComposeFile": ["../docker-compose.yml", "docker-compose.yml"],
// Alternatively:
// "image": "mcr.microsoft.com/devcontainers/typescript-node:0-16",

View File

@@ -44,6 +44,7 @@ CLP
CMCIFRPAXXX
COBADEFF
CODEOWNERS
Codespaces
COEP
commerzbank
Copiar

View File

@@ -10,6 +10,10 @@ inputs:
description: 'Whether to download translations as part of setup, default true'
required: false
default: 'true'
cache:
description: 'Whether to restore and save dependency and Lage caches, default true'
required: false
default: 'true'
runs:
using: composite
@@ -18,6 +22,7 @@ runs:
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: 22
package-manager-cache: ${{ inputs.cache }}
- name: Install yarn
run: npm install -g yarn
shell: bash
@@ -28,6 +33,7 @@ runs:
shell: bash
- name: Cache
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
if: ${{ inputs.cache == 'true' }}
id: cache
with:
path: ${{ format('{0}/**/node_modules', inputs.working-directory) }}
@@ -37,6 +43,7 @@ runs:
shell: bash
- name: Cache Lage
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
if: ${{ inputs.cache == 'true' }}
with:
path: ${{ format('{0}/.lage', inputs.working-directory) }}
key: lage-${{ runner.os }}-${{ github.sha }}

View File

@@ -0,0 +1,27 @@
name: Add 'AI generated' label to '[AI]' PRs
##########################################################################################
# This workflow uses the 'pull_request_target' event so it has a token that can add a #
# label to PRs from forks. It does NOT check out or execute any code from the PR, so it #
# is not vulnerable to the usual 'pull_request_target' code-injection concerns. Keep it #
# that way - do not add a checkout step or run any PR-provided scripts here. #
##########################################################################################
on:
# This workflow never checks out or runs PR code; it only reads the PR title and adds a label.
pull_request_target: # zizmor: ignore[dangerous-triggers]
types: [opened, reopened, edited]
permissions:
pull-requests: write
jobs:
add-ai-generated-label:
name: Add 'AI generated' label
runs-on: ubuntu-latest
if: startsWith(github.event.pull_request.title, '[AI]')
steps:
- uses: actions-ecosystem/action-add-labels@bd52874380e3909a1ac983768df6976535ece7f8 # v1.1.0
with:
labels: AI generated
github_token: ${{ secrets.GITHUB_TOKEN }}

View File

@@ -14,6 +14,9 @@ on:
pull_request:
merge_group:
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: ${{ github.ref != 'refs/heads/master' }}

View File

@@ -7,6 +7,9 @@ on:
pull_request:
merge_group:
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: ${{ github.ref != 'refs/heads/master' }}

View File

@@ -11,6 +11,11 @@ on:
required: true
type: string
permissions:
contents: read
issues: read
pull-requests: read
jobs:
count-points:
runs-on: ubuntu-latest

View File

@@ -0,0 +1,48 @@
name: CRDT version bump check
on:
pull_request:
paths:
- 'packages/crdt/**'
permissions:
contents: read
jobs:
check-version-bump:
runs-on: ubuntu-latest
name: Ensure @actual-app/crdt version is bumped
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
persist-credentials: false
- name: Verify version bump
env:
BASE_REF: ${{ github.base_ref }}
run: |
set -euo pipefail
if ! git cat-file -e "origin/${BASE_REF}:packages/crdt/package.json" 2>/dev/null; then
echo "packages/crdt/package.json does not exist on the base branch; skipping."
exit 0
fi
BASE_VERSION=$(git show "origin/${BASE_REF}:packages/crdt/package.json" | jq -r .version)
HEAD_VERSION=$(jq -r .version packages/crdt/package.json)
echo "Base version: $BASE_VERSION"
echo "Head version: $HEAD_VERSION"
if [ "$BASE_VERSION" = "$HEAD_VERSION" ]; then
echo "::error file=packages/crdt/package.json::Files in packages/crdt/ were modified but the @actual-app/crdt version was not bumped. Please update the \"version\" field in packages/crdt/package.json."
exit 1
fi
HIGHEST=$(printf '%s\n%s\n' "$BASE_VERSION" "$HEAD_VERSION" | sort -V | tail -n1)
if [ "$HIGHEST" != "$HEAD_VERSION" ]; then
echo "::error file=packages/crdt/package.json::The @actual-app/crdt version ($HEAD_VERSION) must be greater than the base version ($BASE_VERSION)."
exit 1
fi
echo "Version bumped from $BASE_VERSION to $HEAD_VERSION."

View File

@@ -37,6 +37,8 @@ jobs:
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
download-translations: 'false'
- name: Bump package versions

View File

@@ -75,6 +75,9 @@ jobs:
# This is faster and avoids yarn memory issues
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- name: Build Web
run: yarn build:server
@@ -88,10 +91,15 @@ jobs:
tags: actualbudget/actual-server-testing
- name: Test that the docker image boots
timeout-minutes: 1
run: |
docker run --detach --network=host actualbudget/actual-server-testing
sleep 10
curl --fail -sS -LI -w '%{http_code}\n' --retry 20 --retry-delay 1 --retry-connrefused localhost:5006
docker run --detach --network=host --name actual-server actualbudget/actual-server-testing
HEALTHCMD=$(yq -r '.services.actual_server.healthcheck.test[1]' packages/sync-server/docker-compose.yml)
until docker exec actual-server sh -c "$HEALTHCMD"; do sleep 1; done
- name: Dump container logs on failure
if: failure()
run: docker logs actual-server || true
# This will use the cache from the earlier build step and not rebuild the image
# https://docs.docker.com/build/ci/github-actions/test-before-push/

View File

@@ -23,6 +23,10 @@ env:
TAGS: |
type=semver,pattern={{version}}
permissions:
contents: read
packages: write
jobs:
build:
name: Build Docker image
@@ -77,9 +81,29 @@ jobs:
# This is faster and avoids yarn memory issues
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- name: Build Web
run: yarn build:server
- name: Build ubuntu image for testing
uses: docker/build-push-action@bcafcacb16a39f128d818304e6c9c0c18556b85f # v7.1.0
with:
context: .
push: false
load: true
file: packages/sync-server/docker/ubuntu.Dockerfile
tags: actualbudget/actual-server-testing
- name: Test that the ubuntu image boots
timeout-minutes: 1
run: |
docker rm -f actual-server 2>/dev/null || true
docker run --detach --network=host --name actual-server actualbudget/actual-server-testing
HEALTHCMD=$(yq -r '.services.actual_server.healthcheck.test[1]' packages/sync-server/docker-compose.yml)
until docker exec actual-server sh -c "$HEALTHCMD"; do sleep 1; done
- name: Build and push ubuntu image
uses: docker/build-push-action@bcafcacb16a39f128d818304e6c9c0c18556b85f # v7.1.0
with:
@@ -89,6 +113,23 @@ jobs:
platforms: linux/amd64,linux/arm64,linux/arm/v7
tags: ${{ steps.meta.outputs.tags }}
- name: Build alpine image for testing
uses: docker/build-push-action@bcafcacb16a39f128d818304e6c9c0c18556b85f # v7.1.0
with:
context: .
push: false
load: true
file: packages/sync-server/docker/alpine.Dockerfile
tags: actualbudget/actual-server-testing
- name: Test that the alpine image boots
timeout-minutes: 1
run: |
docker rm -f actual-server 2>/dev/null || true
docker run --detach --network=host --name actual-server actualbudget/actual-server-testing
HEALTHCMD=$(yq -r '.services.actual_server.healthcheck.test[1]' packages/sync-server/docker-compose.yml)
until docker exec actual-server sh -c "$HEALTHCMD"; do sleep 1; done
- name: Build and push alpine image
uses: docker/build-push-action@bcafcacb16a39f128d818304e6c9c0c18556b85f # v7.1.0
with:
@@ -97,3 +138,7 @@ jobs:
file: packages/sync-server/docker/alpine.Dockerfile
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6
tags: ${{ steps.alpine-meta.outputs.tags }}
- name: Dump container logs on failure
if: failure()
run: docker logs actual-server || true

View File

@@ -199,11 +199,13 @@ jobs:
if: github.event_name == 'pull_request'
run: |
mkdir -p vrt-metadata
echo "${{ github.event.pull_request.number }}" > vrt-metadata/pr-number.txt
echo "${{ needs.vrt.result }}" > vrt-metadata/vrt-result.txt
echo "${PR_NUMBER}" > vrt-metadata/pr-number.txt
echo "${VRT_RESULT}" > vrt-metadata/vrt-result.txt
echo "${STEPS_PLAYWRIGHT_REPORT_VRT_OUTPUTS_ARTIFACT_URL}" > vrt-metadata/artifact-url.txt
env:
PR_NUMBER: ${{ github.event.pull_request.number }}
STEPS_PLAYWRIGHT_REPORT_VRT_OUTPUTS_ARTIFACT_URL: ${{ steps.playwright-report-vrt.outputs.artifact-url }}
VRT_RESULT: ${{ needs.vrt.result }}
- name: Upload VRT metadata
if: github.event_name == 'pull_request'
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1

View File

@@ -67,6 +67,9 @@ jobs:
STEPS_PROCESS_VERSION_OUTPUTS_VERSION: ${{ steps.process_version.outputs.version }}
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- name: Build Electron for Mac
if: ${{ startsWith(matrix.os, 'macos') }}
run: ./bin/package-electron
@@ -117,49 +120,7 @@ jobs:
!packages/desktop-electron/dist/Actual-windows.exe
packages/desktop-electron/dist/*.AppImage
packages/desktop-electron/dist/*.flatpak
packages/desktop-electron/dist/*.appx
outputs:
version: ${{ steps.process_version.outputs.version }}
publish-microsoft-store:
needs: build
runs-on: windows-latest
environment: release
if: ${{ github.event_name == 'push' && startsWith(github.ref, 'refs/tags/v') }}
steps:
- name: Install StoreBroker
shell: powershell
run: |
Install-Module -Name StoreBroker -AcceptLicense -Force -Scope CurrentUser -Verbose
- name: Download Microsoft Store artifacts
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: actual-electron-windows-latest-appx
- name: Submit to Microsoft Store
shell: powershell
run: |
# Disable telemetry
$global:SBDisableTelemetry = $true
# Authenticate against the store
$pass = ConvertTo-SecureString -String '${{ secrets.MICROSOFT_STORE_CLIENT_SECRET }}' -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList ${{ secrets.MICROSOFT_STORE_CLIENT_ID }},$pass
Set-StoreBrokerAuthentication -TenantId '${{ secrets.MICROSOFT_STORE_TENANT_ID }}' -Credential $cred
# Zip and create metadata files
$artifacts = Get-ChildItem -Path . -Filter *.appx | Select-Object -ExpandProperty FullName
New-StoreBrokerConfigFile -Path "$PWD/config.json" -AppId ${{ secrets.MICROSOFT_STORE_PRODUCT_ID }}
New-SubmissionPackage -ConfigPath "$PWD/config.json" -DisableAutoPackageNameFormatting -AppxPath $artifacts -OutPath "$PWD" -OutName submission
# Submit the app
# See https://github.com/microsoft/StoreBroker/blob/master/Documentation/USAGE.md#the-easy-way
Update-ApplicationSubmission `
-AppId ${{ secrets.MICROSOFT_STORE_PRODUCT_ID }} `
-SubmissionDataPath "submission.json" `
-PackagePath "submission.zip" `
-ReplacePackages `
-NoStatus `
-AutoCommit `
-Force

View File

@@ -19,6 +19,9 @@ on:
- '!packages/docs/**' # Docs changes don't affect Electron
- '!packages/eslint-plugin-actual/**' # Eslint plugin changes don't affect Electron
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: true

View File

@@ -6,6 +6,9 @@ on:
- cron: '0 4 * * *'
workflow_dispatch:
permissions:
contents: read
jobs:
extract-and-upload-i18n-strings:
runs-on: ubuntu-latest

View File

@@ -4,6 +4,9 @@ on:
issues:
types: [labeled]
permissions:
issues: write
jobs:
needs-votes:
if: ${{ github.event.label.name == 'feature' }}

View File

@@ -4,6 +4,9 @@ on:
issues:
types: [labeled]
permissions:
issues: write
jobs:
tech-support:
if: ${{ github.event.label.name == 'tech-support' }}

View File

@@ -4,6 +4,9 @@ on:
issues:
types: [closed]
permissions:
issues: write
jobs:
remove-help-wanted:
if: ${{ !contains(github.event.issue.labels.*.name, 'feature') && contains(github.event.issue.labels.*.name, 'help wanted') }}

View File

@@ -1,37 +0,0 @@
# When the "unfreeze" label is added to a PR, add that PR to Merge Freeze's unblocked list
# so it can be merged during a freeze. Uses pull_request_target so the workflow runs in
# the base repo and has access to MERGEFREEZE_ACCESS_TOKEN for fork PRs; it does not
# checkout or run any PR code. Requires MERGEFREEZE_ACCESS_TOKEN repo secret
# (project-specific token from Merge Freeze Web API panel for actualbudget/actual / master).
# See: https://docs.mergefreeze.com/web-api#post-freeze-status
name: Merge Freeze add PR to unblocked list
on:
pull_request_target:
types: [labeled]
jobs:
unfreeze:
if: ${{ github.event.label.name == 'unfreeze' }}
runs-on: ubuntu-latest
permissions: {}
concurrency:
group: merge-freeze-unfreeze-${{ github.ref }}-labels
cancel-in-progress: false
steps:
- name: POST to Merge Freeze add PR to unblocked list
env:
MERGEFREEZE_ACCESS_TOKEN: ${{ secrets.MERGEFREEZE_ACCESS_TOKEN }}
PR_NUMBER: ${{ github.event.pull_request.number }}
USER_NAME: ${{ github.actor }}
run: |
set -e
if [ -z "$MERGEFREEZE_ACCESS_TOKEN" ]; then
echo "::error::MERGEFREEZE_ACCESS_TOKEN secret is not set"
exit 1
fi
url="https://www.mergefreeze.com/api/branches/actualbudget/actual/master/?access_token=${MERGEFREEZE_ACCESS_TOKEN}"
payload=$(jq -n --arg user_name "$USER_NAME" --argjson pr "$PR_NUMBER" '{frozen: true, user_name: $user_name, unblocked_prs: [$pr]}')
curl -sf -X POST "$url" -H "Content-Type: application/json" -d "$payload"
echo "Merge Freeze updated: PR #$PR_NUMBER added to unblocked list."

View File

@@ -12,6 +12,9 @@ on:
tags:
- v**
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: false
@@ -28,6 +31,9 @@ jobs:
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- name: Install Netlify
run: npm install netlify-cli@17.10.1 -g

.github/workflows/publish-crdt.yml
View File

@@ -0,0 +1,86 @@
name: Publish @actual-app/crdt
# Automatically publishes @actual-app/crdt when its package.json version
# changes on master (typically via a merged PR that bumped the version).
on:
push:
branches:
- master
paths:
- 'packages/crdt/package.json'
workflow_dispatch:
permissions:
contents: read
concurrency:
group: publish-crdt
cancel-in-progress: false
jobs:
check-version:
runs-on: ubuntu-latest
name: Check if publish is needed
outputs:
should-publish: ${{ steps.check.outputs.should-publish }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Compare local version against npm registry
id: check
run: |
set -euo pipefail
LOCAL_VERSION=$(jq -r .version packages/crdt/package.json)
echo "Local version: $LOCAL_VERSION"
PUBLISHED_VERSION=$(npm view @actual-app/crdt version 2>/dev/null || echo "")
echo "Published version: ${PUBLISHED_VERSION:-<none>}"
if [ "$LOCAL_VERSION" = "$PUBLISHED_VERSION" ]; then
echo "Versions match - nothing to publish."
echo "should-publish=false" >> "$GITHUB_OUTPUT"
else
echo "Version changed - publish required."
echo "should-publish=true" >> "$GITHUB_OUTPUT"
fi
publish:
needs: check-version
if: needs.check-version.outputs.should-publish == 'true'
runs-on: ubuntu-latest
name: Publish @actual-app/crdt to npm
permissions:
contents: read
id-token: write # Required for npm OIDC provenance
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
download-translations: 'false'
- name: Build @actual-app/crdt
run: yarn workspace @actual-app/crdt build
- name: Pack @actual-app/crdt
run: yarn workspace @actual-app/crdt pack --filename @actual-app/crdt.tgz
- name: Setup node and npm registry
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: 24
check-latest: true
# Avoid restoring potentially poisoned caches in release jobs.
package-manager-cache: false
registry-url: 'https://registry.npmjs.org'
- name: Publish to npm
run: npm publish packages/crdt/@actual-app/crdt.tgz --access public --provenance

View File

@@ -18,6 +18,9 @@ concurrency:
group: publish-flathub
cancel-in-progress: false
permissions:
contents: read
jobs:
publish-flathub:
runs-on: ubuntu-22.04

View File

@@ -0,0 +1,116 @@
name: Publish Microsoft Store
defaults:
run:
shell: bash
on:
release:
types: [published]
workflow_dispatch:
inputs:
tag:
description: 'Release tag (e.g. v25.3.0)'
required: true
type: string
concurrency:
group: publish-microsoft-store
cancel-in-progress: false
permissions:
contents: read
jobs:
publish-microsoft-store:
runs-on: windows-latest
environment: release
steps:
- name: Resolve version
id: resolve_version
env:
EVENT_NAME: ${{ github.event_name }}
RELEASE_TAG: ${{ github.event.release.tag_name }}
INPUT_TAG: ${{ inputs.tag }}
run: |
if [[ "$EVENT_NAME" == "release" ]]; then
TAG="$RELEASE_TAG"
else
TAG="$INPUT_TAG"
fi
if [[ -z "$TAG" ]]; then
echo "::error::No tag provided"
exit 1
fi
# Validate tag format (v-prefixed semver, e.g. v25.3.0 or v1.2.3-beta.1)
if [[ ! "$TAG" =~ ^v[0-9]+\.[0-9]+\.[0-9]+(-[a-zA-Z0-9.]+)?$ ]]; then
echo "::error::Invalid tag format: $TAG (expected v-prefixed semver, e.g. v25.3.0)"
exit 1
fi
VERSION="${TAG#v}"
echo "tag=$TAG" >> "$GITHUB_OUTPUT"
echo "version=$VERSION" >> "$GITHUB_OUTPUT"
echo "Resolved tag=$TAG version=$VERSION"
- name: Verify release assets exist
env:
GH_TOKEN: ${{ github.token }}
STEPS_RESOLVE_VERSION_OUTPUTS_TAG: ${{ steps.resolve_version.outputs.tag }}
run: |
TAG="${STEPS_RESOLVE_VERSION_OUTPUTS_TAG}"
echo "Checking release assets for tag $TAG..."
ASSETS=$(gh api "repos/${{ github.repository }}/releases/tags/$TAG" --jq '.assets[].name')
echo "Found assets:"
echo "$ASSETS"
if ! echo "$ASSETS" | grep -q "\.appx$"; then
echo "::error::No .appx assets found in release $TAG"
exit 1
fi
echo "Required .appx assets found."
- name: Download Microsoft Store artifacts
env:
GH_TOKEN: ${{ github.token }}
STEPS_RESOLVE_VERSION_OUTPUTS_TAG: ${{ steps.resolve_version.outputs.tag }}
run: |
TAG="${STEPS_RESOLVE_VERSION_OUTPUTS_TAG}"
gh release download "$TAG" --repo "${{ github.repository }}" --pattern "*.appx"
- name: Install StoreBroker
shell: powershell
run: |
Install-Module -Name StoreBroker -AcceptLicense -Force -Scope CurrentUser -Verbose
- name: Submit to Microsoft Store
shell: powershell
run: |
# Disable telemetry
$global:SBDisableTelemetry = $true
# Authenticate against the store
$pass = ConvertTo-SecureString -String '${{ secrets.MICROSOFT_STORE_CLIENT_SECRET }}' -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList ${{ secrets.MICROSOFT_STORE_CLIENT_ID }},$pass
Set-StoreBrokerAuthentication -TenantId '${{ secrets.MICROSOFT_STORE_TENANT_ID }}' -Credential $cred
# Zip and create metadata files
$artifacts = Get-ChildItem -Path . -Filter *.appx | Select-Object -ExpandProperty FullName
New-StoreBrokerConfigFile -Path "$PWD/config.json" -AppId ${{ secrets.MICROSOFT_STORE_PRODUCT_ID }}
New-SubmissionPackage -ConfigPath "$PWD/config.json" -DisableAutoPackageNameFormatting -AppxPath $artifacts -OutPath "$PWD" -OutName submission
# Submit the app
# See https://github.com/microsoft/StoreBroker/blob/master/Documentation/USAGE.md#the-easy-way
Update-ApplicationSubmission `
-AppId ${{ secrets.MICROSOFT_STORE_PRODUCT_ID }} `
-SubmissionDataPath "submission.json" `
-PackagePath "submission.zip" `
-ReplacePackages `
-NoStatus `
-AutoCommit `
-Force

View File

@@ -13,6 +13,9 @@ defaults:
env:
CI: true
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: false
@@ -45,6 +48,9 @@ jobs:
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- if: ${{ startsWith(matrix.os, 'ubuntu') }}
name: Setup Flatpak dependencies

View File

@@ -9,6 +9,9 @@ on:
- cron: '0 0 * * *'
workflow_dispatch:
permissions:
contents: read
jobs:
build-and-pack:
runs-on: ubuntu-latest
@@ -21,6 +24,9 @@ jobs:
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- name: Update package versions
if: github.event_name != 'push'
@@ -105,6 +111,8 @@ jobs:
with:
node-version: 24
check-latest: true
# Avoid restoring potentially poisoned caches in release jobs.
package-manager-cache: false
registry-url: 'https://registry.npmjs.org'
- name: Publish Core

View File

@@ -33,6 +33,7 @@ jobs:
permissions:
pull-requests: write
contents: read
actions: read
steps:
- name: Checkout base branch
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -44,140 +45,120 @@ jobs:
with:
download-translations: 'false'
- name: Wait for ${{github.base_ref}} web build to succeed
uses: fountainhead/action-wait-for-check@5a908a24814494009c4bb27c242ea38c93c593be # v1.2.0
id: master-web-build
# Resolve one successful `build.yml` run for each side (master and PR
# head) up front, then pin every download below to its `run_id`. This
# ensures artifact downloads are consistent and prevents race conditions.
- name: Resolve build runs
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
id: build-runs
env:
BASE_REF: ${{ github.base_ref }}
HEAD_SHA: ${{ github.event.pull_request.head.sha }}
with:
token: ${{ secrets.GITHUB_TOKEN }}
checkName: web
ref: ${{github.base_ref}}
- name: Wait for ${{github.base_ref}} API build to succeed
uses: fountainhead/action-wait-for-check@5a908a24814494009c4bb27c242ea38c93c593be # v1.2.0
id: master-api-build
with:
token: ${{ secrets.GITHUB_TOKEN }}
checkName: api
ref: ${{github.base_ref}}
- name: Wait for ${{github.base_ref}} CLI build to succeed
uses: fountainhead/action-wait-for-check@5a908a24814494009c4bb27c242ea38c93c593be # v1.2.0
id: master-cli-build
with:
token: ${{ secrets.GITHUB_TOKEN }}
checkName: cli
ref: ${{github.base_ref}}
- name: Wait for ${{github.base_ref}} CRDT build to succeed
uses: fountainhead/action-wait-for-check@5a908a24814494009c4bb27c242ea38c93c593be # v1.2.0
id: master-crdt-build
with:
token: ${{ secrets.GITHUB_TOKEN }}
checkName: crdt
ref: ${{github.base_ref}}
script: |
const TIMEOUT_MS = 30 * 60 * 1000;
const SLEEP_MS = 15000;
- name: Wait for PR build to succeed
uses: fountainhead/action-wait-for-check@5a908a24814494009c4bb27c242ea38c93c593be # v1.2.0
id: wait-for-web-build
with:
token: ${{ secrets.GITHUB_TOKEN }}
checkName: web
ref: ${{github.event.pull_request.head.sha}}
- name: Wait for API PR build to succeed
uses: fountainhead/action-wait-for-check@5a908a24814494009c4bb27c242ea38c93c593be # v1.2.0
id: wait-for-api-build
with:
token: ${{ secrets.GITHUB_TOKEN }}
checkName: api
ref: ${{github.event.pull_request.head.sha}}
- name: Wait for CLI PR build to succeed
uses: fountainhead/action-wait-for-check@5a908a24814494009c4bb27c242ea38c93c593be # v1.2.0
id: wait-for-cli-build
with:
token: ${{ secrets.GITHUB_TOKEN }}
checkName: cli
ref: ${{github.event.pull_request.head.sha}}
- name: Wait for CRDT PR build to succeed
uses: fountainhead/action-wait-for-check@5a908a24814494009c4bb27c242ea38c93c593be # v1.2.0
id: wait-for-crdt-build
with:
token: ${{ secrets.GITHUB_TOKEN }}
checkName: crdt
ref: ${{github.event.pull_request.head.sha}}
async function resolveRun({ label, filter, notFoundHint }) {
const deadline = Date.now() + TIMEOUT_MS;
while (true) {
const { data } = await github.rest.actions.listWorkflowRuns({
owner: context.repo.owner,
repo: context.repo.repo,
workflow_id: 'build.yml',
...filter,
status: 'success',
per_page: 1,
});
if (data.workflow_runs.length > 0) {
const run = data.workflow_runs[0];
core.info(`Found ${label} build run ${run.id} (${run.html_url})`);
return run.id;
}
if (Date.now() > deadline) {
throw new Error(
`No successful build.yml run found for ${label} within 30 min — ${notFoundHint}.`,
);
}
core.info(`No successful ${label} build run yet — sleeping 15s.`);
await new Promise(r => setTimeout(r, SLEEP_MS));
}
}
- name: Report build failure
if: steps.wait-for-web-build.outputs.conclusion == 'failure' || steps.wait-for-api-build.outputs.conclusion == 'failure' || steps.wait-for-cli-build.outputs.conclusion == 'failure' || steps.wait-for-crdt-build.outputs.conclusion == 'failure'
run: |
echo "Build failed on PR branch or ${GITHUB_BASE_REF}"
exit 1
const baseRef = process.env.BASE_REF;
const headSha = process.env.HEAD_SHA;
const [masterRunId, headRunId] = await Promise.all([
resolveRun({
label: baseRef,
filter: { branch: baseRef },
notFoundHint: `${baseRef} may be broken`,
}),
resolveRun({
label: `PR head ${headSha}`,
filter: { head_sha: headSha },
notFoundHint:
'build may still be running, have failed, or the branch may have been force-pushed',
}),
]);
core.setOutput('master_run_id', masterRunId);
core.setOutput('head_run_id', headRunId);
- name: Download web build artifact from ${{github.base_ref}}
uses: dawidd6/action-download-artifact@8305c0f1062bb0d184d09ef4493ecb9288447732 # v20
id: pr-web-build
with:
branch: ${{github.base_ref}}
run_id: ${{ steps.build-runs.outputs.master_run_id }}
workflow: build.yml
workflow_conclusion: '' # ignore the conclusion of the workflow, since we already checked it
name: build-stats
path: base
- name: Download API build artifact from ${{github.base_ref}}
uses: dawidd6/action-download-artifact@8305c0f1062bb0d184d09ef4493ecb9288447732 # v20
id: pr-api-build
with:
branch: ${{github.base_ref}}
run_id: ${{ steps.build-runs.outputs.master_run_id }}
workflow: build.yml
workflow_conclusion: '' # ignore the conclusion of the workflow, since we already checked it
name: api-build-stats
path: base
- name: Download build stats from PR
uses: dawidd6/action-download-artifact@8305c0f1062bb0d184d09ef4493ecb9288447732 # v20
with:
pr: ${{github.event.pull_request.number}}
run_id: ${{ steps.build-runs.outputs.head_run_id }}
workflow: build.yml
workflow_conclusion: '' # ignore the conclusion of the workflow, since we already checked it
name: build-stats
path: head
allow_forks: true
- name: Download API stats from PR
uses: dawidd6/action-download-artifact@8305c0f1062bb0d184d09ef4493ecb9288447732 # v20
with:
pr: ${{github.event.pull_request.number}}
run_id: ${{ steps.build-runs.outputs.head_run_id }}
workflow: build.yml
workflow_conclusion: '' # ignore the conclusion of the workflow, since we already checked it
name: api-build-stats
path: head
allow_forks: true
- name: Download CLI build artifact from ${{github.base_ref}}
uses: dawidd6/action-download-artifact@8305c0f1062bb0d184d09ef4493ecb9288447732 # v20
with:
branch: ${{github.base_ref}}
run_id: ${{ steps.build-runs.outputs.master_run_id }}
workflow: build.yml
workflow_conclusion: '' # ignore the conclusion of the workflow, since we already checked it
name: cli-build-stats
path: base
- name: Download CLI stats from PR
uses: dawidd6/action-download-artifact@8305c0f1062bb0d184d09ef4493ecb9288447732 # v20
with:
pr: ${{github.event.pull_request.number}}
run_id: ${{ steps.build-runs.outputs.head_run_id }}
workflow: build.yml
workflow_conclusion: '' # ignore the conclusion of the workflow, since we already checked it
name: cli-build-stats
path: head
allow_forks: true
- name: Download CRDT build artifact from ${{github.base_ref}}
uses: dawidd6/action-download-artifact@8305c0f1062bb0d184d09ef4493ecb9288447732 # v20
with:
branch: ${{github.base_ref}}
run_id: ${{ steps.build-runs.outputs.master_run_id }}
workflow: build.yml
workflow_conclusion: '' # ignore the conclusion of the workflow, since we already checked it
name: crdt-build-stats
path: base
- name: Download CRDT stats from PR
uses: dawidd6/action-download-artifact@8305c0f1062bb0d184d09ef4493ecb9288447732 # v20
with:
pr: ${{github.event.pull_request.number}}
run_id: ${{ steps.build-runs.outputs.head_run_id }}
workflow: build.yml
workflow_conclusion: '' # ignore the conclusion of the workflow, since we already checked it
name: crdt-build-stats
path: head
allow_forks: true
- name: Strip content hashes from stats files
run: |
if [ -f ./head/web-stats.json ]; then

View File

@@ -82,16 +82,17 @@ jobs:
with:
download-translations: 'false'
- name: Build browser bundle
# REACT_APP_NETLIFY=true keeps the "Create test file" button in the
# production bundle — every VRT test's beforeEach relies on it via
# ConfigurationPage.createTestFile().
# REACT_APP_NETLIFY=true flips isNonProductionEnvironment() on in the
# bundle so the "Create test file" button (used by every e2e beforeEach
# via ConfigurationPage.createTestFile()) is still rendered in a
# production build. Without it, e2e tests would time out waiting for
# a button that was tree-shaken out.
# --skip-translations keeps VRT screenshots deterministic by rendering
# source-code English instead of upstream Weblate en.json (which can
# drift between snapshot capture and test runs).
env:
REACT_APP_NETLIFY: 'true'
run: |
yarn workspace plugins-service build
yarn workspace @actual-app/crdt build
yarn workspace @actual-app/core build:browser
yarn workspace @actual-app/web build:browser
run: yarn build:browser --skip-translations
- name: Upload build artifact
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
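The comment above says REACT_APP_NETLIFY flips `isNonProductionEnvironment()` on in the bundle. As a rough illustration only (the helper's real logic is not shown in this diff, and the shape below is an assumption), a build-time environment gate of that kind could look like:

```typescript
// Hypothetical sketch, not the real Actual implementation: a build-time flag
// like REACT_APP_NETLIFY keeps a test-only UI element (the "Create test file"
// button) reachable in an otherwise production bundle.
type BuildEnv = { NODE_ENV?: string; REACT_APP_NETLIFY?: string };

function isNonProductionEnvironment(env: BuildEnv): boolean {
  // True for dev builds, and for production builds explicitly marked as a
  // test/preview deployment via REACT_APP_NETLIFY.
  return env.NODE_ENV !== 'production' || env.REACT_APP_NETLIFY === 'true';
}
```

Because the flag is read at build time, omitting it lets the bundler tree-shake the button out, which is exactly the timeout failure mode the comment warns about.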
@@ -257,13 +258,17 @@ jobs:
- name: Merge shard patches
id: create-patch
shell: bash
run: |
git config --global --add safe.directory "$GITHUB_WORKSPACE"
git config --global user.name "github-actions[bot]"
git config --global user.email "github-actions[bot]@users.noreply.github.com"
shopt -s nullglob
patches=(/tmp/shard-patches/*/vrt-shard.patch)
# actions/download-artifact puts a lone matched artifact directly in
# `path` but gives each of several its own `path/<name>/` subdir, so
# recurse instead of globbing `*/vrt-shard.patch` (which would miss
# the common single-shard case).
mapfile -t patches < <(find /tmp/shard-patches -type f -name 'vrt-shard.patch' | sort)
if [ ${#patches[@]} -eq 0 ]; then
echo "has_changes=false" >> "$GITHUB_OUTPUT"
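The artifact-layout comment above can be illustrated outside the workflow. This is a hypothetical Node sketch, not project code: a recursive walk (the `find` equivalent) locates the patch file in both the single-artifact layout (file directly under `path`) and the multi-artifact layout (`path/<name>/file`), where a one-level `*/vrt-shard.patch` glob would only see the second.

```typescript
// Illustrative only; directory names are made up.
import { mkdtempSync, mkdirSync, writeFileSync, readdirSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

// Recursive search, mirroring `find <root> -type f -name <name>`.
function findFiles(root: string, name: string): string[] {
  const out: string[] = [];
  for (const entry of readdirSync(root, { withFileTypes: true })) {
    const full = join(root, entry.name);
    if (entry.isDirectory()) out.push(...findFiles(full, name));
    else if (entry.name === name) out.push(full);
  }
  return out.sort();
}

const root = mkdtempSync(join(tmpdir(), 'shard-patches-'));
writeFileSync(join(root, 'vrt-shard.patch'), '');            // single-artifact layout
mkdirSync(join(root, 'shard-1'));
writeFileSync(join(root, 'shard-1', 'vrt-shard.patch'), ''); // multi-artifact layout

const patches = findFiles(root, 'vrt-shard.patch'); // finds both
```

A one-level glob over `root/*/vrt-shard.patch` would return only the `shard-1` copy, which is why the workflow switched to a recursive `find`.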

View File

@@ -15,7 +15,8 @@
"vi": "readonly",
"backend": "readonly",
"importScripts": "readonly",
"FS": "readonly"
"FS": "readonly",
"__APP_VERSION__": "readonly"
},
"rules": {
// Import sorting

View File

@@ -7,3 +7,7 @@ enableTransparentWorkspaces: false
nodeLinker: node-modules
yarnPath: .yarn/releases/yarn-4.13.0.cjs
# Secure default: don't run postinstall scripts.
# If a new package requires them, add it to dependenciesMeta in package.json.
enableScripts: false

View File

@@ -52,7 +52,7 @@
"playwright": "yarn workspace @actual-app/web run playwright",
"vrt": "yarn workspace @actual-app/web run vrt",
"vrt:docker": "./bin/run-vrt",
"rebuild-electron": "./node_modules/.bin/electron-rebuild -f -m ./packages/desktop-electron -o better-sqlite3,bcrypt",
"rebuild-electron": "./node_modules/.bin/electron-rebuild -m ./packages/desktop-electron -o better-sqlite3,bcrypt --build-from-source -f",
"rebuild-node": "yarn workspace @actual-app/core rebuild",
"lint": "oxfmt --check . && oxlint --type-aware --quiet",
"lint:fix": "oxfmt . && oxlint --fix --type-aware --quiet",
@@ -87,6 +87,23 @@
"typescript": "^6.0.2",
"vitest": "^4.1.2"
},
"dependenciesMeta": {
"bcrypt": {
"built": true
},
"better-sqlite3": {
"built": true
},
"electron": {
"built": true
},
"esbuild": {
"built": true
},
"sharp": {
"built": true
}
},
"resolutions": {
"adm-zip": "patch:adm-zip@npm%3A0.5.16#~/.yarn/patches/adm-zip-npm-0.5.16-4556fea098.patch",
"minimatch@10.2.1": "10.2.5",

View File

@@ -516,6 +516,29 @@ describe('API CRUD operations', () => {
);
});
// apis: getNote, updateNote
test('Notes: successfully get and update note', async () => {
const categories = await api.getCategories();
const categoryId = categories[0].id;
// No note exists initially
const initial = await api.getNote(categoryId);
expect(initial).toBeNull();
// Set a note
await api.updateNote(categoryId, 'Test note content');
const afterSet = await api.getNote(categoryId);
expect(afterSet).toEqual({ id: categoryId, note: 'Test note content' });
// Update the note
await api.updateNote(categoryId, 'Updated note content');
const afterUpdate = await api.getNote(categoryId);
expect(afterUpdate).toEqual({
id: categoryId,
note: 'Updated note content',
});
});
// apis: getRules, getPayeeRules, createRule, updateRule, deleteRule
test('Rules: successfully update rules', async () => {
await api.createPayee({ name: 'test-payee' });

View File

@@ -13,6 +13,7 @@ import type { ImportTransactionsOpts } from '@actual-app/core/types/api-handlers
import type { Handlers } from '@actual-app/core/types/handlers';
import type {
ImportTransactionEntity,
NoteEntity,
RuleEntity,
TransactionEntity,
} from '@actual-app/core/types/models';
@@ -203,8 +204,8 @@ export function getAccountBalance(id: APIAccountEntity['id'], cutoff?: Date) {
return send('api/account-balance', { id, cutoff });
}
export function getCategoryGroups() {
return send('api/category-groups-get');
export function getCategoryGroups(options: { hidden?: boolean } = {}) {
return send('api/category-groups-get', options);
}
export function createCategoryGroup(group: Omit<APICategoryGroupEntity, 'id'>) {
@@ -225,8 +226,8 @@ export function deleteCategoryGroup(
return send('api/category-group-delete', { id, transferCategoryId });
}
export function getCategories() {
return send('api/categories-get', { grouped: false });
export function getCategories(options: { hidden?: boolean } = {}) {
return send('api/categories-get', { grouped: false, ...options });
}
export function createCategory(category: Omit<APICategoryEntity, 'id'>) {
@@ -247,6 +248,14 @@ export function deleteCategory(
return send('api/category-delete', { id, transferCategoryId });
}
export function getNote(id: NoteEntity['id']) {
return send('api/note-get', { id });
}
export function updateNote(id: NoteEntity['id'], note: NoteEntity['note']) {
return send('api/note-update', { id, note });
}
export function getCommonPayees() {
return send('api/common-payees-get');
}

View File

@@ -85,6 +85,12 @@ export default defineConfig({
},
test: {
globals: true,
// Each test loads a budget file and runs all DB migrations, which can be
// slow on busy CI runners; the default 5s timeout is too tight and causes
// flaky timeouts (and a cascade of unhandled rejections from in-flight work
// continuing after teardown).
testTimeout: 20_000,
hookTimeout: 20_000,
onConsoleLog(log: string, type: 'stdout' | 'stderr'): boolean | void {
// print only console.error
return type === 'stderr';

View File

@@ -0,0 +1,131 @@
import * as api from '@actual-app/api';
import { Command } from 'commander';
import { printOutput } from '#output';
import { registerCategoriesCommand } from './categories';
import { registerCategoryGroupsCommand } from './category-groups';
vi.mock('@actual-app/api', () => ({
getCategories: vi.fn().mockResolvedValue([]),
createCategory: vi.fn().mockResolvedValue('new-id'),
updateCategory: vi.fn().mockResolvedValue(undefined),
deleteCategory: vi.fn().mockResolvedValue(undefined),
getCategoryGroups: vi.fn().mockResolvedValue([]),
createCategoryGroup: vi.fn().mockResolvedValue('new-group-id'),
updateCategoryGroup: vi.fn().mockResolvedValue(undefined),
deleteCategoryGroup: vi.fn().mockResolvedValue(undefined),
}));
vi.mock('#connection', () => ({
withConnection: vi.fn((_opts, fn) => fn()),
}));
vi.mock('#output', () => ({
printOutput: vi.fn(),
}));
function createProgram(): Command {
const program = new Command();
program.option('--format <format>');
program.option('--server-url <url>');
program.option('--password <pw>');
program.option('--session-token <token>');
program.option('--sync-id <id>');
program.option('--data-dir <dir>');
program.option('--verbose');
program.exitOverride();
registerCategoriesCommand(program);
registerCategoryGroupsCommand(program);
return program;
}
async function run(args: string[]) {
const program = createProgram();
await program.parseAsync(['node', 'test', ...args]);
}
describe('categories commands', () => {
let stderrSpy: ReturnType<typeof vi.spyOn>;
let stdoutSpy: ReturnType<typeof vi.spyOn>;
beforeEach(() => {
vi.clearAllMocks();
stderrSpy = vi
.spyOn(process.stderr, 'write')
.mockImplementation(() => true);
stdoutSpy = vi
.spyOn(process.stdout, 'write')
.mockImplementation(() => true);
});
afterEach(() => {
stderrSpy.mockRestore();
stdoutSpy.mockRestore();
});
describe('categories list', () => {
it('asks the API to exclude hidden categories by default', async () => {
await run(['categories', 'list']);
expect(api.getCategories).toHaveBeenCalledWith({ hidden: false });
});
it('asks the API for all categories when --include-hidden is passed', async () => {
await run(['categories', 'list', '--include-hidden']);
expect(api.getCategories).toHaveBeenCalledWith({});
});
it('prints whatever the API returns', async () => {
const visible = {
id: '1',
name: 'Visible',
group_id: 'g1',
hidden: false,
};
vi.mocked(api.getCategories).mockResolvedValue([visible]);
await run(['categories', 'list']);
expect(printOutput).toHaveBeenCalledWith([visible], undefined);
});
it('passes format option to printOutput', async () => {
vi.mocked(api.getCategories).mockResolvedValue([]);
await run(['--format', 'csv', 'categories', 'list']);
expect(printOutput).toHaveBeenCalledWith([], 'csv');
});
});
describe('category-groups list', () => {
it('asks the API to exclude hidden groups by default', async () => {
await run(['category-groups', 'list']);
expect(api.getCategoryGroups).toHaveBeenCalledWith({ hidden: false });
});
it('asks the API for all groups when --include-hidden is passed', async () => {
await run(['category-groups', 'list', '--include-hidden']);
expect(api.getCategoryGroups).toHaveBeenCalledWith({});
});
it('prints whatever the API returns', async () => {
const group = {
id: 'g1',
name: 'Group',
is_income: false,
hidden: false,
categories: [{ id: 'c1', name: 'Cat', group_id: 'g1', hidden: false }],
};
vi.mocked(api.getCategoryGroups).mockResolvedValue([group]);
await run(['category-groups', 'list']);
expect(printOutput).toHaveBeenCalledWith([group], undefined);
});
});
});

View File

@@ -12,13 +12,16 @@ export function registerCategoriesCommand(program: Command) {
categories
.command('list')
.description('List all categories')
.action(async () => {
.description('List categories (excludes hidden by default)')
.option('--include-hidden', 'Include hidden categories', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getCategories();
const result = await api.getCategories(
cmdOpts.includeHidden ? {} : { hidden: false },
);
printOutput(result, opts.format);
},
{ mutates: false },

View File

@@ -12,13 +12,16 @@ export function registerCategoryGroupsCommand(program: Command) {
groups
.command('list')
.description('List all category groups')
.action(async () => {
.description('List category groups (excludes hidden by default)')
.option('--include-hidden', 'Include hidden groups and categories', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getCategoryGroups();
const result = await api.getCategoryGroups(
cmdOpts.includeHidden ? {} : { hidden: false },
);
printOutput(result, opts.format);
},
{ mutates: false },

View File

@@ -1,4 +1,5 @@
#!/bin/bash
set -euo pipefail
cd "$(dirname "$(dirname "$0")")"
@@ -7,20 +8,10 @@ if ! [ -x "$(command -v protoc)" ]; then
exit 1
fi
export PATH="$PWD/bin:$PATH"
protoc --plugin="protoc-gen-ts=../../node_modules/.bin/protoc-gen-ts" \
--ts_opt=esModuleInterop=true \
--ts_out="src/proto" \
--js_out=import_style=commonjs,binary:src/proto \
protoc --plugin="protoc-gen-es=../../node_modules/.bin/protoc-gen-es" \
--es_opt=target=ts \
--es_out="src/proto" \
--proto_path=src/proto \
sync.proto
../../node_modules/.bin/oxfmt src/proto/*.d.ts
for file in src/proto/*.d.ts; do
{ echo "/* oxlint-disable typescript/no-namespace */"; sed 's/export class/export declare class/g' "$file"; } > "${file%.d.ts}.ts"
rm "$file"
done
echo 'One more step! Find the `var global = ...` declaration in src/proto/sync_pb.js and change it to `var global = globalThis;`'
../../node_modules/.bin/oxfmt src/proto/*.ts

View File

@@ -1,8 +1,13 @@
{
"name": "@actual-app/crdt",
"version": "2.1.0",
"version": "3.0.0",
"description": "CRDT layer of Actual",
"license": "MIT",
"repository": {
"type": "git",
"url": "git+https://github.com/actualbudget/actual.git",
"directory": "packages/crdt"
},
"files": [
"dist",
"!dist/**/*.test.d.ts",
@@ -10,14 +15,11 @@
"!dist/**/*.spec.d.ts",
"!dist/**/*.spec.d.ts.map"
],
"main": "dist/index.js",
"types": "dist/index.d.ts",
"type": "module",
"main": "src/index.ts",
"types": "src/index.ts",
"exports": {
".": {
"types": "./dist/index.d.ts",
"development": "./src/index.ts",
"default": "./dist/index.js"
}
".": "./src/index.ts"
},
"publishConfig": {
"exports": {
@@ -25,7 +27,9 @@
"types": "./dist/index.d.ts",
"default": "./dist/index.js"
}
}
},
"main": "dist/index.js",
"types": "dist/index.d.ts"
},
"scripts": {
"build:node": "vite build",
@@ -35,16 +39,14 @@
"typecheck": "tsgo -b"
},
"dependencies": {
"google-protobuf": "^3.21.4",
"@bufbuild/protobuf": "^2.11.0",
"murmurhash": "^2.0.1",
"uuid": "^14.0.0"
},
"devDependencies": {
"@types/google-protobuf": "3.15.12",
"@bufbuild/protoc-gen-es": "^2.11.0",
"@typescript/native-preview": "beta",
"protoc-gen-js": "3.21.4-4",
"rollup-plugin-visualizer": "^7.0.1",
"ts-protoc-gen": "0.15.0",
"vite": "^8.0.5",
"vitest": "^4.1.2"
}

View File

@@ -1,6 +1,3 @@
import './proto/sync_pb.js'; // Import for side effects
import type * as SyncPb from './proto/sync_pb';
export {
merkle,
getClock,
@@ -13,16 +10,17 @@ export {
Timestamp,
} from './crdt';
declare global {
var proto: typeof SyncPb;
}
export {
type EncryptedData,
type Message,
type MessageEnvelope,
type SyncRequest,
type SyncResponse,
EncryptedDataSchema,
MessageSchema,
MessageEnvelopeSchema,
SyncRequestSchema,
SyncResponseSchema,
} from './proto/sync_pb';
const { proto } = globalThis;
export const SyncRequest = proto.SyncRequest;
export const SyncResponse = proto.SyncResponse;
export const Message = proto.Message;
export const MessageEnvelope = proto.MessageEnvelope;
export const EncryptedData = proto.EncryptedData;
export const SyncProtoBuf = proto;
export { create, fromBinary, toBinary } from '@bufbuild/protobuf';

View File

@@ -21,6 +21,7 @@ message MessageEnvelope {
}
message SyncRequest {
reserved 4;
repeated MessageEnvelope messages = 1;
string fileId = 2;
string groupId = 3;

File diff suppressed because it is too large.

View File

@@ -1,217 +1,161 @@
/* oxlint-disable typescript/no-namespace */
// package:
// file: sync.proto
// @generated by protoc-gen-es v2.11.0 with parameter "target=ts"
// @generated from file sync.proto (syntax proto3)
/* eslint-disable */
import * as jspb from 'google-protobuf';
import type { Message as Message$1 } from '@bufbuild/protobuf';
import type { GenFile, GenMessage } from '@bufbuild/protobuf/codegenv2';
import { fileDesc, messageDesc } from '@bufbuild/protobuf/codegenv2';
export declare class EncryptedData extends jspb.Message {
getIv(): Uint8Array | string;
getIv_asU8(): Uint8Array;
getIv_asB64(): string;
setIv(value: Uint8Array | string): void;
/**
* Describes the file sync.proto.
*/
export const file_sync: GenFile /*@__PURE__*/ = fileDesc(
'CgpzeW5jLnByb3RvIjoKDUVuY3J5cHRlZERhdGESCgoCaXYYASABKAwSDwoHYXV0aFRhZxgCIAEoDBIMCgRkYXRhGAMgASgMIkYKB01lc3NhZ2USDwoHZGF0YXNldBgBIAEoCRILCgNyb3cYAiABKAkSDgoGY29sdW1uGAMgASgJEg0KBXZhbHVlGAQgASgJIkoKD01lc3NhZ2VFbnZlbG9wZRIRCgl0aW1lc3RhbXAYASABKAkSEwoLaXNFbmNyeXB0ZWQYAiABKAgSDwoHY29udGVudBgDIAEoDCJ2CgtTeW5jUmVxdWVzdBIiCghtZXNzYWdlcxgBIAMoCzIQLk1lc3NhZ2VFbnZlbG9wZRIOCgZmaWxlSWQYAiABKAkSDwoHZ3JvdXBJZBgDIAEoCRINCgVrZXlJZBgFIAEoCRINCgVzaW5jZRgGIAEoCUoECAQQBSJCCgxTeW5jUmVzcG9uc2USIgoIbWVzc2FnZXMYASADKAsyEC5NZXNzYWdlRW52ZWxvcGUSDgoGbWVya2xlGAIgASgJYgZwcm90bzM',
);
getAuthtag(): Uint8Array | string;
getAuthtag_asU8(): Uint8Array;
getAuthtag_asB64(): string;
setAuthtag(value: Uint8Array | string): void;
/**
* @generated from message EncryptedData
*/
export type EncryptedData = Message$1<'EncryptedData'> & {
/**
* @generated from field: bytes iv = 1;
*/
iv: Uint8Array;
getData(): Uint8Array | string;
getData_asU8(): Uint8Array;
getData_asB64(): string;
setData(value: Uint8Array | string): void;
/**
* @generated from field: bytes authTag = 2;
*/
authTag: Uint8Array;
serializeBinary(): Uint8Array;
toObject(includeInstance?: boolean): EncryptedData.AsObject;
static toObject(
includeInstance: boolean,
msg: EncryptedData,
): EncryptedData.AsObject;
static extensions: { [key: number]: jspb.ExtensionFieldInfo<jspb.Message> };
static extensionsBinary: {
[key: number]: jspb.ExtensionFieldBinaryInfo<jspb.Message>;
};
static serializeBinaryToWriter(
message: EncryptedData,
writer: jspb.BinaryWriter,
): void;
static deserializeBinary(bytes: Uint8Array): EncryptedData;
static deserializeBinaryFromReader(
message: EncryptedData,
reader: jspb.BinaryReader,
): EncryptedData;
}
/**
* @generated from field: bytes data = 3;
*/
data: Uint8Array;
};
export namespace EncryptedData {
export type AsObject = {
iv: Uint8Array | string;
authtag: Uint8Array | string;
data: Uint8Array | string;
};
}
/**
* Describes the message EncryptedData.
* Use `create(EncryptedDataSchema)` to create a new message.
*/
export const EncryptedDataSchema: GenMessage<EncryptedData> /*@__PURE__*/ =
messageDesc(file_sync, 0);
export declare class Message extends jspb.Message {
getDataset(): string;
setDataset(value: string): void;
/**
* @generated from message Message
*/
export type Message = Message$1<'Message'> & {
/**
* @generated from field: string dataset = 1;
*/
dataset: string;
getRow(): string;
setRow(value: string): void;
/**
* @generated from field: string row = 2;
*/
row: string;
getColumn(): string;
setColumn(value: string): void;
/**
* @generated from field: string column = 3;
*/
column: string;
getValue(): string;
setValue(value: string): void;
/**
* @generated from field: string value = 4;
*/
value: string;
};
serializeBinary(): Uint8Array;
toObject(includeInstance?: boolean): Message.AsObject;
static toObject(includeInstance: boolean, msg: Message): Message.AsObject;
static extensions: { [key: number]: jspb.ExtensionFieldInfo<jspb.Message> };
static extensionsBinary: {
[key: number]: jspb.ExtensionFieldBinaryInfo<jspb.Message>;
};
static serializeBinaryToWriter(
message: Message,
writer: jspb.BinaryWriter,
): void;
static deserializeBinary(bytes: Uint8Array): Message;
static deserializeBinaryFromReader(
message: Message,
reader: jspb.BinaryReader,
): Message;
}
/**
* Describes the message Message.
* Use `create(MessageSchema)` to create a new message.
*/
export const MessageSchema: GenMessage<Message> /*@__PURE__*/ = messageDesc(
file_sync,
1,
);
export namespace Message {
export type AsObject = {
dataset: string;
row: string;
column: string;
value: string;
};
}
/**
* @generated from message MessageEnvelope
*/
export type MessageEnvelope = Message$1<'MessageEnvelope'> & {
/**
* @generated from field: string timestamp = 1;
*/
timestamp: string;
export declare class MessageEnvelope extends jspb.Message {
getTimestamp(): string;
setTimestamp(value: string): void;
/**
* @generated from field: bool isEncrypted = 2;
*/
isEncrypted: boolean;
getIsencrypted(): boolean;
setIsencrypted(value: boolean): void;
/**
* @generated from field: bytes content = 3;
*/
content: Uint8Array;
};
getContent(): Uint8Array | string;
getContent_asU8(): Uint8Array;
getContent_asB64(): string;
setContent(value: Uint8Array | string): void;
/**
* Describes the message MessageEnvelope.
* Use `create(MessageEnvelopeSchema)` to create a new message.
*/
export const MessageEnvelopeSchema: GenMessage<MessageEnvelope> /*@__PURE__*/ =
messageDesc(file_sync, 2);
serializeBinary(): Uint8Array;
toObject(includeInstance?: boolean): MessageEnvelope.AsObject;
static toObject(
includeInstance: boolean,
msg: MessageEnvelope,
): MessageEnvelope.AsObject;
static extensions: { [key: number]: jspb.ExtensionFieldInfo<jspb.Message> };
static extensionsBinary: {
[key: number]: jspb.ExtensionFieldBinaryInfo<jspb.Message>;
};
static serializeBinaryToWriter(
message: MessageEnvelope,
writer: jspb.BinaryWriter,
): void;
static deserializeBinary(bytes: Uint8Array): MessageEnvelope;
static deserializeBinaryFromReader(
message: MessageEnvelope,
reader: jspb.BinaryReader,
): MessageEnvelope;
}
/**
* @generated from message SyncRequest
*/
export type SyncRequest = Message$1<'SyncRequest'> & {
/**
* @generated from field: repeated MessageEnvelope messages = 1;
*/
messages: MessageEnvelope[];
export namespace MessageEnvelope {
export type AsObject = {
timestamp: string;
isencrypted: boolean;
content: Uint8Array | string;
};
}
/**
* @generated from field: string fileId = 2;
*/
fileId: string;
export declare class SyncRequest extends jspb.Message {
clearMessagesList(): void;
getMessagesList(): Array<MessageEnvelope>;
setMessagesList(value: Array<MessageEnvelope>): void;
addMessages(value?: MessageEnvelope, index?: number): MessageEnvelope;
/**
* @generated from field: string groupId = 3;
*/
groupId: string;
getFileid(): string;
setFileid(value: string): void;
/**
* @generated from field: string keyId = 5;
*/
keyId: string;
getGroupid(): string;
setGroupid(value: string): void;
/**
* @generated from field: string since = 6;
*/
since: string;
};
getKeyid(): string;
setKeyid(value: string): void;
/**
* Describes the message SyncRequest.
* Use `create(SyncRequestSchema)` to create a new message.
*/
export const SyncRequestSchema: GenMessage<SyncRequest> /*@__PURE__*/ =
messageDesc(file_sync, 3);
getSince(): string;
setSince(value: string): void;
/**
* @generated from message SyncResponse
*/
export type SyncResponse = Message$1<'SyncResponse'> & {
/**
* @generated from field: repeated MessageEnvelope messages = 1;
*/
messages: MessageEnvelope[];
serializeBinary(): Uint8Array;
toObject(includeInstance?: boolean): SyncRequest.AsObject;
static toObject(
includeInstance: boolean,
msg: SyncRequest,
): SyncRequest.AsObject;
static extensions: { [key: number]: jspb.ExtensionFieldInfo<jspb.Message> };
static extensionsBinary: {
[key: number]: jspb.ExtensionFieldBinaryInfo<jspb.Message>;
};
static serializeBinaryToWriter(
message: SyncRequest,
writer: jspb.BinaryWriter,
): void;
static deserializeBinary(bytes: Uint8Array): SyncRequest;
static deserializeBinaryFromReader(
message: SyncRequest,
reader: jspb.BinaryReader,
): SyncRequest;
}
/**
* @generated from field: string merkle = 2;
*/
merkle: string;
};
export namespace SyncRequest {
export type AsObject = {
messagesList: Array<MessageEnvelope.AsObject>;
fileid: string;
groupid: string;
keyid: string;
since: string;
};
}
export declare class SyncResponse extends jspb.Message {
clearMessagesList(): void;
getMessagesList(): Array<MessageEnvelope>;
setMessagesList(value: Array<MessageEnvelope>): void;
addMessages(value?: MessageEnvelope, index?: number): MessageEnvelope;
getMerkle(): string;
setMerkle(value: string): void;
serializeBinary(): Uint8Array;
toObject(includeInstance?: boolean): SyncResponse.AsObject;
static toObject(
includeInstance: boolean,
msg: SyncResponse,
): SyncResponse.AsObject;
static extensions: { [key: number]: jspb.ExtensionFieldInfo<jspb.Message> };
static extensionsBinary: {
[key: number]: jspb.ExtensionFieldBinaryInfo<jspb.Message>;
};
static serializeBinaryToWriter(
message: SyncResponse,
writer: jspb.BinaryWriter,
): void;
static deserializeBinary(bytes: Uint8Array): SyncResponse;
static deserializeBinaryFromReader(
message: SyncResponse,
reader: jspb.BinaryReader,
): SyncResponse;
}
export namespace SyncResponse {
export type AsObject = {
messagesList: Array<MessageEnvelope.AsObject>;
merkle: string;
};
}
/**
* Describes the message SyncResponse.
* Use `create(SyncResponseSchema)` to create a new message.
*/
export const SyncResponseSchema: GenMessage<SyncResponse> /*@__PURE__*/ =
messageDesc(file_sync, 4);

View File

@@ -4,8 +4,8 @@
"rootDir": "./src",
"composite": true,
"target": "ES2021",
"module": "NodeNext",
"moduleResolution": "NodeNext",
"module": "ES2022",
"moduleResolution": "bundler",
"noEmit": false,
"emitDeclarationOnly": true,
"declaration": true,

View File

@@ -6,7 +6,7 @@ import { defineConfig } from 'vite';
export default defineConfig({
ssr: {
noExternal: true,
external: ['google-protobuf', 'murmurhash'],
external: ['@bufbuild/protobuf', 'murmurhash'],
},
build: {
ssr: true,
@@ -16,7 +16,7 @@ export default defineConfig({
sourcemap: true,
lib: {
entry: path.resolve(__dirname, 'src/index.ts'),
formats: ['cjs'],
formats: ['es'],
fileName: () => 'index.js',
},
},

Binary file not shown (before: 32 KiB, after: 32 KiB).

Binary file not shown (before: 32 KiB, after: 32 KiB).

Binary file not shown (before: 33 KiB, after: 33 KiB).

Binary file not shown (before: 134 KiB, after: 134 KiB).

Binary file not shown (before: 132 KiB, after: 132 KiB).

Binary file not shown (before: 133 KiB, after: 133 KiB).

View File

@@ -25,6 +25,75 @@ export class ReportsPage {
return new ReportsPage(this.page);
}
async goToBalanceForecastPage() {
const gridItems = this.pageContent.locator('.react-grid-item');
const count = await gridItems.count();
let targetItem: Locator | null = null;
for (let i = count - 1; i >= 0; i--) {
const item = gridItems.nth(i);
await item.scrollIntoViewIfNeeded();
const heading = item.getByRole('heading', { name: /^Balance Forecast/i });
if (await heading.isVisible()) {
targetItem = item;
break;
}
}
if (!targetItem) {
await this.page.evaluate(() => {
window.scrollTo(0, document.documentElement.scrollHeight);
});
const refreshedCount = await gridItems.count();
for (let i = refreshedCount - 1; i >= 0; i--) {
const item = gridItems.nth(i);
await item.scrollIntoViewIfNeeded();
const heading = item.getByRole('heading', {
name: /^Balance Forecast/i,
});
if (await heading.isVisible()) {
targetItem = item;
break;
}
}
}
if (!targetItem) {
throw new Error('No Balance Forecast dashboard card found in the grid');
}
const cardNavigateButton = targetItem.getByRole('button', {
name: /^Balance Forecast/i,
});
await Promise.all([
this.page.waitForURL(/\/reports\/forecast\//),
cardNavigateButton.click(),
]);
await this.pageContent
.getByRole('button', { name: 'Monthly' })
.waitFor({ state: 'visible' });
return new ReportsPage(this.page);
}
async selectForecastGranularity(granularity: string) {
await this.pageContent.getByRole('button', { name: 'Monthly' }).click();
const option = this.page.getByRole('button', { name: granularity });
await option.waitFor({ state: 'visible' });
await option.click();
await this.pageContent
.getByRole('button', { name: granularity })
.waitFor({ state: 'visible' });
}
async addWidget(widgetName: string) {
await this.pageContent
.getByRole('button', { name: 'Add new widget' })
.click();
await this.page.getByRole('button', { name: widgetName }).click();
}
async goToCustomReportPage() {
await this.pageContent
.getByRole('button', { name: 'Add new widget' })

View File

@@ -42,17 +42,22 @@ export class SettingsPage {
}
async enableExperimentalFeature(featureName: string) {
if (await this.advancedSettingsButton.isVisible()) {
await this.advancedSettingsButton.click();
}
await this.advancedSettingsButton.waitFor({
state: 'visible',
timeout: 2000,
});
await this.advancedSettingsButton.click();
if (await this.experimentalSettingsButton.isVisible()) {
await this.experimentalSettingsButton.click();
}
await this.experimentalSettingsButton.waitFor({
state: 'visible',
timeout: 2000,
});
await this.experimentalSettingsButton.click();
const featureCheckbox = this.page.getByRole('checkbox', {
name: featureName,
});
await featureCheckbox.waitFor({ state: 'visible' });
if (!(await featureCheckbox.isChecked())) {
await featureCheckbox.click();
}

Binary file not shown (before: 24 KiB, after: 24 KiB).

Binary file not shown (before: 25 KiB, after: 25 KiB).

Binary file not shown (before: 25 KiB, after: 25 KiB).

Binary file not shown (before: 11 KiB, after: 11 KiB).

Binary file not shown (before: 11 KiB, after: 11 KiB).

Binary file not shown (before: 11 KiB, after: 11 KiB).

Binary file not shown (before: 24 KiB, after: 24 KiB).

Binary file not shown (before: 25 KiB, after: 25 KiB).

Binary file not shown (before: 25 KiB, after: 25 KiB).

Binary file not shown (before: 96 KiB, after: 96 KiB).

Binary file not shown (before: 96 KiB, after: 96 KiB).

Binary file not shown (before: 96 KiB, after: 96 KiB).

Binary file not shown (before: 60 KiB, after: 60 KiB).

Binary file not shown (before: 60 KiB, after: 60 KiB).

Binary file not shown (before: 60 KiB, after: 60 KiB).

View File

@@ -55,6 +55,28 @@ test.describe.parallel('Reports', () => {
await expect(page).toMatchThemeScreenshots();
});
test.describe('balance forecast', () => {
test.beforeEach(async () => {
const settingsPage = await navigation.goToSettingsPage();
await settingsPage.enableExperimentalFeature('Balance Forecast Report');
reportsPage = await navigation.goToReportsPage();
await reportsPage.waitToLoad();
await reportsPage.addWidget('Balance forecast');
await reportsPage.goToBalanceForecastPage();
});
test('loads balance forecast report with monthly granularity', async () => {
await expect(page).toMatchThemeScreenshots();
});
test('switches to daily granularity', async () => {
await reportsPage.selectForecastGranularity('Daily');
await expect(page).toMatchThemeScreenshots();
});
});
test.describe.parallel('custom reports', () => {
let customReportPage: CustomReportPage;

(16 binary screenshot files changed; image previews not shown. Most sizes are unchanged (80–110 KiB); one grew from 106 KiB to 109 KiB and three shrank from 6.9 KiB to about 3.5 KiB.)

View File

@@ -27,6 +27,7 @@
"#transactions": "./src/transactions/index.ts",
"#undo": "./src/undo/index.ts",
"#global-events": "./src/global-events.ts",
"#enablebanking": "./src/enablebanking.ts",
"#gocardless": "./src/gocardless.ts",
"#i18n": "./src/i18n.ts",
"#mocks": "./src/mocks.tsx",
@@ -41,6 +42,7 @@
"#components/budget": "./src/components/budget/index.tsx",
"#components/budget/goals/actions": "./src/components/budget/goals/actions.ts",
"#components/budget/goals/automationExamples": "./src/components/budget/goals/automationExamples.ts",
"#components/budget/goals/cleanupModel": "./src/components/budget/goals/cleanupModel.ts",
"#components/budget/goals/constants": "./src/components/budget/goals/constants.ts",
"#components/budget/goals/displayTemplateMeta": "./src/components/budget/goals/displayTemplateMeta.ts",
"#components/budget/goals/formatMonthLabel": "./src/components/budget/goals/formatMonthLabel.ts",
@@ -160,6 +162,7 @@
"cross-env": "^10.1.0",
"date-fns": "^4.1.0",
"downshift": "9.3.2",
"fzf": "^0.5.2",
"html-to-image": "^1.11.13",
"hyperformula": "^3.2.0",
"i18next": "^25.10.10",

View File

@@ -5,6 +5,7 @@ import type { SyncResponseWithErrors } from '@actual-app/core/server/accounts/ap
import type {
AccountEntity,
CategoryEntity,
SyncServerEnableBankingAccount,
SyncServerGoCardlessAccount,
SyncServerPluggyAiAccount,
SyncServerSimpleFinAccount,
@@ -499,6 +500,48 @@ export function useLinkAccountPluggyAiMutation() {
});
}
type LinkAccountEnableBankingPayload = LinkAccountBasePayload & {
externalAccount: SyncServerEnableBankingAccount;
};
export function useLinkAccountEnableBankingMutation() {
const queryClient = useQueryClient();
const dispatch = useDispatch();
const { t } = useTranslation();
return useMutation({
mutationFn: async ({
externalAccount,
upgradingId,
offBudget,
startingDate,
startingBalance,
}: LinkAccountEnableBankingPayload) => {
await send('enablebanking-accounts-link', {
externalAccount,
upgradingId,
offBudget,
startingDate,
startingBalance,
});
},
onSuccess: () => {
invalidateQueries(queryClient);
invalidateQueries(queryClient, payeeQueries.lists());
},
onError: error => {
console.error('Error linking account to Enable Banking:', error);
dispatchErrorNotification(
dispatch,
t(
'There was an error linking the account to Enable Banking. Please try again.',
),
error,
);
},
});
}
type SyncAccountsPayload = {
id?: AccountEntity['id'] | undefined;
};
@@ -590,6 +633,8 @@ export function useSyncAccountsMutation() {
accountIdsToSync = accountIdsToSync.filter(
id => !simpleFinAccounts.find(sfa => sfa.id === id),
);
dispatch(setAccountsSyncing({ ids: accountIdsToSync }));
}
// Loop through the accounts and perform sync operation.. one by one

View File

@@ -335,10 +335,17 @@ const isUpdateReadyForDownloadPromise = new Promise(resolve => {
resolve(true);
};
});
- const updateSW = registerSW({
- immediate: true,
- onNeedRefresh: markUpdateReadyForDownload,
- });
+ // Skip SW registration in dev so stale cached assets don't override edits
+ // between page loads. Plugin code that needs a SW can register one itself.
+ // In dev there is no SW to install, so applyAppUpdate() can't rely on the
+ // SW lifecycle to swap the page — fall back to a plain reload so callers
+ // don't hang on the never-resolving promise inside applyAppUpdate.
+ const updateSW = IS_DEV
+ ? () => window.location.reload()
+ : registerSW({
+ immediate: true,
+ onNeedRefresh: markUpdateReadyForDownload,
+ });
global.Actual = {
IS_DEV,

View File

@@ -25,14 +25,15 @@ const importScriptsWithRetry = async (script, { maxRetries = 5 } = {}) => {
}
// Attempt to retry after a small delay
await new Promise(resolve =>
setTimeout(async () => {
await importScriptsWithRetry(script, {
await new Promise((resolve, reject) => {
setTimeout(() => {
importScriptsWithRetry(script, {
maxRetries: maxRetries - 1,
});
resolve();
}, 5000),
);
})
.then(resolve)
.catch(reject);
}, 5000);
});
}
};
@@ -76,9 +77,11 @@ self.addEventListener('message', async event => {
return;
}
+ // A single failed importScripts bricks the SharedWorker until
+ // it's evicted, so retry in production too.
await importScriptsWithRetry(
`${msg.publicUrl}/kcab/kcab.worker.${hash}.js`,
- { maxRetries: isDev ? 5 : 0 },
+ { maxRetries: isDev ? 5 : 3 },
);
backend.initApp(isDev, self).catch(err => {
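The promise rewiring above matters because the old version resolved the outer promise unconditionally, so a rejection from the recursive retry was swallowed and the caller never saw the final failure. The fixed shape — propagate both the eventual value and the final rejection — can be sketched self-contained; `withRetry` and the tiny delay are illustrative, not the worker's actual code:

```typescript
// Retry an async operation with a fixed delay between attempts.
// Both the success value and the final failure reach the caller,
// which is the property the .then(resolve).catch(reject) fix restores.
async function withRetry<T>(
  op: () => Promise<T>,
  { maxRetries = 3, delayMs = 5 } = {},
): Promise<T> {
  try {
    return await op();
  } catch (err) {
    if (maxRetries <= 0) throw err; // final failure is not swallowed
    await new Promise(resolve => setTimeout(resolve, delayMs));
    return withRetry(op, { maxRetries: maxRetries - 1, delayMs });
  }
}

// An operation that fails twice, then succeeds on the third attempt.
let attempts = 0;
const flaky = async () => {
  attempts += 1;
  if (attempts < 3) throw new Error('transient');
  return 'loaded';
};

const result = await withRetry(flaky, { maxRetries: 5, delayMs: 1 });
```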

View File

@@ -0,0 +1,118 @@
import { useEffect, useRef, useState } from 'react';
import { Trans, useTranslation } from 'react-i18next';
import { Paragraph } from '@actual-app/components/paragraph';
import { View } from '@actual-app/components/view';
import { send } from '@actual-app/core/platform/client/connection';
import { Error as ErrorAlert } from '#components/alerts';
import { useUrlParam } from '#hooks/useUrlParam';
export function EnableBankingCallback() {
const { t } = useTranslation();
const [code] = useUrlParam('code');
const [stateParam] = useUrlParam('state');
const [errorParam] = useUrlParam('error');
const storedState = localStorage.getItem('enablebanking_auth_state');
const stateValid =
typeof stateParam === 'string' &&
typeof storedState === 'string' &&
stateParam === storedState;
const [status, setStatus] = useState<'loading' | 'success' | 'error'>(
'loading',
);
const [errorMessage, setErrorMessage] = useState('');
const calledRef = useRef(false);
useEffect(() => {
if (calledRef.current) return;
calledRef.current = true;
async function handleCallback() {
if (errorParam) {
setStatus('error');
setErrorMessage(
t('Authorization was denied or failed: {{error}}', {
error: errorParam,
}),
);
return;
}
if (!code) {
setStatus('error');
setErrorMessage(t('Missing authorization parameters.'));
return;
}
if (!stateValid) {
localStorage.removeItem('enablebanking_auth_state');
setStatus('error');
setErrorMessage(t('Authorization state mismatch. Please try again.'));
return;
}
try {
const result = await send('enablebanking-complete-auth', {
code,
state: stateParam,
});
if (result.error) {
setStatus('error');
setErrorMessage(
result.error.message || t('Failed to complete authorization.'),
);
return;
}
setStatus('success');
localStorage.removeItem('enablebanking_auth_state');
// Auto-close after a short delay
setTimeout(() => {
window.close();
}, 1500);
} catch {
setStatus('error');
setErrorMessage(t('An unexpected error occurred.'));
}
}
void handleCallback();
}, [code, stateParam, stateValid, errorParam, t]);
return (
<View
style={{
padding: 20,
maxWidth: 500,
margin: '40px auto',
textAlign: 'center',
}}
>
{status === 'loading' && (
<Paragraph>
<Trans>Completing authorization...</Trans>
</Paragraph>
)}
{status === 'success' && (
<Paragraph>
<Trans>
Authorization successful! This window will close automatically.
</Trans>
</Paragraph>
)}
{status === 'error' && (
<>
<ErrorAlert>{errorMessage}</ErrorAlert>
<Paragraph style={{ marginTop: 10 }}>
<Trans>You can close this window and try again.</Trans>
</Paragraph>
</>
)}
</View>
);
}
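The component validates the `state` query parameter against a value stashed in `localStorage` before the redirect — the standard OAuth2 CSRF protection. The initiating side is not part of this diff; assuming it follows the usual pattern, the round trip can be sketched storage-agnostically (a `Map` stands in for `localStorage`, and all names here are hypothetical):

```typescript
// Hypothetical sketch of the OAuth2 "state" round trip the callback
// component depends on: generate and store an unguessable state before
// redirecting, then require an exact match on return.
const storage = new Map<string, string>();

function beginAuth(): string {
  // Random hex state, stored before redirecting to the bank.
  const state = Array.from({ length: 16 }, () =>
    Math.floor(Math.random() * 16).toString(16),
  ).join('');
  storage.set('enablebanking_auth_state', state);
  return state; // would be appended to the authorization URL
}

function validateCallback(returnedState: string | null): boolean {
  const stored = storage.get('enablebanking_auth_state');
  return (
    typeof returnedState === 'string' &&
    typeof stored === 'string' &&
    returnedState === stored
  );
}

const sent = beginAuth();
const okRoundTrip = validateCallback(sent); // genuine redirect: matches
const tampered = validateCallback('attacker-chosen'); // forged: rejected
```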

View File

@@ -30,7 +30,7 @@ describe('FatalError', () => {
expect(screen.getByText(/IndexedDB/)).toBeInTheDocument();
});
- it('renders the generic simple message for an app-init-failure without a specific cause', () => {
+ it('renders a backend-worker message for a BackendInitFailure', () => {
const error = {
type: 'app-init-failure',
BackendInitFailure: true,
@@ -38,6 +38,16 @@ describe('FatalError', () => {
render(<FatalError error={error} />, { wrapper: TestProviders });
expect(
screen.getByText(/couldn't load a critical backend worker/i),
).toBeInTheDocument();
});
it('renders the generic simple message for an app-init-failure without a specific cause', () => {
const error = { type: 'app-init-failure' };
render(<FatalError error={error} />, { wrapper: TestProviders });
expect(
screen.getByText(/problem loading the app in this browser version/i),
).toBeInTheDocument();

View File

@@ -69,10 +69,17 @@ function RenderSimple({ error }: RenderSimpleProps) {
</Trans>
</Text>
);
+ } else if ('BackendInitFailure' in error && error.BackendInitFailure) {
+ msg = (
+ <Text>
+ <Trans>
+ Actual couldn't load a critical backend worker. Reload the page to try
+ again; if the problem persists, do a hard refresh to clear any stale
+ cached assets.
+ </Trans>
+ </Text>
+ );
} else {
// This indicates the backend failed to initialize. Show the
// user something at least so they aren't looking at a blank
// screen
msg = (
<Text>
<Trans>
@@ -92,19 +99,6 @@ function RenderSimple({ error }: RenderSimpleProps) {
}}
>
<Text>{msg}</Text>
- <Text>
- <Trans>
- Please get{' '}
- <Link
- variant="external"
- linkColor="muted"
- to="https://actualbudget.org/contact"
- >
- in touch
- </Link>{' '}
- for support
- </Trans>
- </Text>
</SpaceBetween>
);
}

Some files were not shown because too many files have changed in this diff Show More