Compare commits

136 commits. Each entry below lists the author, short commit SHA, message, and date.
Matiss Janis Aboltins
6c1699f0b0 [AI] Replace any-typed Modal in undo state with structural type (#7813)
* [AI] Replace any-typed Modal in undo state with structural type

loot-core can't import @actual-app/web's Modal union, so the undo MRU
typed openModal as `any`. The undo system only stores the value and
reads `.name`, so a minimal structural shape `{ name: string; options?: unknown }`
is enough. desktop-client's full Modal still assigns to it, and the one
reader (global-events.ts) re-narrows back to Modal when handing the
value to replaceModal().

* Add release notes for PR #7813

* Update 7813.md

* [AI] Add TODO on Modal cast for future type consolidation

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-13 18:08:11 +00:00
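A minimal sketch of the structural-typing approach this commit describes. The type name `UndoModal` and the modal values are hypothetical; only the shape `{ name: string; options?: unknown }` is taken from the commit message.

```typescript
// loot-core can't import the web package's Modal union, so the undo
// state stores a structural shape instead of `any`.
type UndoModal = { name: string; options?: unknown };

// Any richer modal object still assigns to the structural type...
const fullModal = { name: 'import-transactions', options: { accountId: 'a1' } };
const stored: UndoModal = fullModal;

// ...and the undo system only ever reads `.name`. The one reader
// re-narrows back to the full Modal union before use.
const modalName = stored.name;
```

Structural typing is what makes this safe: assignment only checks that the required members exist, so the full union from desktop-client remains assignable without loot-core importing it.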
Matt Fiddaman
4b1b68a353 automation UI: add spend from functionality (#7823)
* add spend from template

* note
2026-05-13 17:35:41 +00:00
Matt Fiddaman
dbf5d7c079 automation UI: tweak font colours to be more readable (#7819)
* s/pagetext{Subdued,Light}

* note
2026-05-13 17:31:25 +00:00
Matt Fiddaman
6bfc299d28 automation UI: add cleanup functionality (#7815)
* [AI] Share cleanup-group helpers and let storeTemplates write cleanup_def

- Extract resolveCleanupGroup and tombstoneOrphanCleanupGroups out of
  cleanup-template-notes.ts into a new cleanup-groups.ts so the
  upcoming UI-driven create flow can reuse the resurrect-aware lookup.
- Let storeTemplates accept an optional cleanup array per category
  (omitted = leave as-is, [] = clear, non-empty = replace), and run the
  orphan tombstone sweep whenever cleanup_def is touched so groups
  removed from the UI don't linger.
- Register budget/store-note-cleanups so the UI can migrate a single
  category's notes on demand, and budget/create-cleanup-group so it can
  create groups inline.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Add UI editor for end-of-month cleanup automations

- New cleanup row in the BudgetAutomations sidebar with read-only
  summary; selecting it opens an editor with a Global scope card and
  an optional named-group scope (single group per category for now,
  since multi-group ordering depends on category sort).
- Each scope card has independent "send leftover" / "take a share"
  toggles plus a weight; group scopes additionally support
  "only enough to cover overspending".
- Group picker is a typeahead that creates groups inline via
  budget/create-cleanup-group.
- useCategoryCleanup migrates notes to cleanup_def at modal-open for
  unmigrated categories; useCleanupGroups streams the live list.
- Un-migrate flow renders cleanup_def back to #cleanup note lines and
  drops rows whose group can't be resolved, so users never see UUIDs
  in their notes.
- Sidebar/automation-button "has automations" probes also check
  cleanup_def so cleanup-only categories still get the indicator.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* note

* review pass 1

* bring automation logic in line with cleanup logic

* review pass 3

* coderabbit pass 1

* wording suggestions

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-13 17:01:11 +00:00
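The tri-state cleanup parameter described above (omitted = leave as-is, `[]` = clear, non-empty = replace) can be sketched as follows. `CleanupRule` and `applyCleanup` are hypothetical names, not the actual storeTemplates API.

```typescript
// Illustrative tri-state semantics: `undefined` is distinct from `[]`.
type CleanupRule = { group?: string; weight: number };

function applyCleanup(
  existing: CleanupRule[],
  incoming?: CleanupRule[],
): CleanupRule[] {
  if (incoming === undefined) return existing; // omitted: leave as-is
  return incoming; // [] clears; a non-empty array replaces
}
```

The key design point is that an optional parameter lets callers express "don't touch cleanup_def" without a sentinel value.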
Julian Dominguez-Schatz
a7e22b023c Disable postinstall scripts except for an allowlist (#7825)
* Disable postinstall scripts except for an allowlist

* Add release notes

* Temp: trigger docs CI

* Revert "Temp: trigger docs CI"

This reverts commit 1c2ca1125c.

* Remove some unneeded builds
2026-05-13 13:29:15 +00:00
Julian Dominguez-Schatz
d3e7c1ee87 Fix some issues caught by zizmor (#7826)
* Fix some issues caught by zizmor

* Add release notes

* Add more cache ignores

* Add comments on reasoning
2026-05-13 13:19:16 +00:00
Aryan Katiyar
a7e100276e [AI] Fix category group filtering for budgeted custom reports (#7629)
* [AI] Fix category group filtering for budgeted custom reports

* [AI] Add release notes for budgeted report filter fix
2026-05-13 10:33:14 +00:00
Tonchain
ee82a16026 Fix split transaction popover wrapping (#7814)
* Fix split transaction popover wrapping

* Add release note for split popover fix

* Uncap split popover width
2026-05-13 10:07:14 +00:00
Matiss Janis Aboltins
b61732e20e [AI] Add workflow to auto-label AI-generated PRs (#7817)
* [AI] Add workflow to label '[AI]'-prefixed PRs as 'AI generated'

https://claude.ai/code/session_018yp3BsEq1CyPcw8t57nLVu

* [AI] Suppress zizmor dangerous-triggers finding and add release note

https://claude.ai/code/session_018yp3BsEq1CyPcw8t57nLVu

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-05-13 05:17:11 +00:00
Matiss Janis Aboltins
83073b3ee0 [AI] automated publishing workflow for crdt package (#7805)
* [AI] Require @actual-app/crdt version bump and auto-publish

Adds two workflows:
- crdt-version-check: fails PRs that modify files in packages/crdt/
  without bumping the version in packages/crdt/package.json.
- publish-crdt: publishes @actual-app/crdt to npm when the version in
  packages/crdt/package.json changes on master, tagging the release as
  crdt-v<version>.

* [AI] Skip git tagging in @actual-app/crdt publish workflow

Remove the tag-and-push step and the now-unused version output;
downgrade contents permission to read.

* [AI] Simplify crdt version-bump workflows

- Drop the redundant explicit base-branch fetch (fetch-depth: 0 already
  retrieves all remote branches).
- Remove the unreachable "no changes" guard; the pull_request paths
  filter already scopes the workflow to packages/crdt changes.
- Replace the embedded Node semver comparison with `sort -V`.
- Read versions with `jq` instead of inline Node.

* [AI] Add release notes for crdt publish workflows

* [AI] Restrict GITHUB_TOKEN permissions in crdt workflows

Add top-level `permissions: contents: read` to both crdt workflows so
the implicit jobs no longer inherit overly broad permissions (flagged by
zizmor).

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-05-12 21:23:47 +00:00
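The workflow itself reads versions with `jq` and compares them with `sort -V`; for illustration only, an equivalent dotted-numeric comparison in TypeScript (this sketch skips the pre-release suffixes that `sort -V` also handles):

```typescript
// Returns true only when `next` is strictly greater than `base`,
// comparing dotted components numerically (so 2.10.0 > 2.9.0).
function isVersionBumped(base: string, next: string): boolean {
  const a = base.split('.').map(Number);
  const b = next.split('.').map(Number);
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const x = a[i] ?? 0;
    const y = b[i] ?? 0;
    if (x !== y) return y > x;
  }
  return false; // equal versions: not a bump, so the check fails the PR
}
```

Numeric comparison per component is the part a naive string comparison gets wrong, which is why the workflow uses `sort -V` rather than plain `sort`.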
diodijon
78234102fa Added scoped ErrorBoundary elements to all the modals in the desktop-client modals folder (#7560)
* add scoped ErrorBoundary to base Modal component

* update release note

* [autofix.ci] apply automated fixes

* Delete packages/loot-core/src/mocks/files/budgets/test-budget/metadata.json

* Delete packages/loot-core/src/mocks/files/budgets/test-budget/db.sqlite

* Add ErrorBoundary to Modal component for error handling

Errors thrown within any modal's content will now display a fallback UI (FeatureErrorFallback) instead of crashing the entire application.

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>
2026-05-12 21:15:07 +00:00
Matiss Janis Aboltins
2c7f3c7a3d [AI] Replace google-protobuf with @bufbuild/protobuf (#7535)
* [AI] crdt: typecheck test files and clean up lint issues

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Replace google-protobuf with @bufbuild/protobuf

Swap the google-protobuf + ts-protoc-gen + protoc-gen-js toolchain for
@bufbuild/protobuf + @bufbuild/protoc-gen-es. The generator now emits a
single pure-TS sync_pb.ts (no .js sidecar, no globalThis.proto hack)
and a thin wrapper in proto/compat.ts preserves the SyncProtoBuf /
SyncRequest / etc. API so call sites stay unchanged. Removes the
loot-core CommonJS require polyfill that only existed to service
google-protobuf.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Align @bufbuild/protobuf version ranges with installed

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] crdt: drop the SyncProtoBuf compat layer

The proto/compat.ts wrapper was introduced alongside the bufbuild
migration to avoid touching call sites. With bufbuild messages already
exposing fields as plain mutable properties, the wrapper was just
boilerplate hiding direct reads and writes — and it had drifted (e.g.
setMessagesList was called in a test but never defined).

Delete compat.ts and migrate the six call sites in loot-core and
sync-server to use @bufbuild/protobuf directly. The crdt package now
re-exports the sync_pb types/schemas and the three bufbuild runtime
helpers (create, fromBinary, toBinary) so consumers keep a single
import source.

Also switch sync-server's @actual-app/crdt dependency from the pinned
"2.1.0" to "workspace:*", matching api/loot-core — the npm pin was
pulling the stale published copy instead of the workspace source.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] CI: drive sync-server build through lage so crdt deps are built

Before: the server job ran `yarn workspace @actual-app/sync-server build`
directly, which invokes tsgo without first emitting the workspace
dependencies' declarations. That worked when sync-server pinned crdt to
the published npm version (declarations bundled in the tarball), but
with `workspace:*` it fails with TS6305 because packages/crdt/dist/*.d.ts
hasn't been built yet.

Switch the CI command to `yarn build --to=@actual-app/sync-server`.
Lage respects the `dependsOn: ['^build']` pipeline and builds
@actual-app/crdt (and the other transitive deps) before sync-server.

Using --to rather than --scope keeps the build set minimal; --scope
would also include dependents like desktop-electron.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] sync-server: build project references via tsgo -b

The build script ran plain `tsgo`, which doesn't compile referenced
projects. With @actual-app/crdt now a `workspace:*` dep (no bundled
declarations from the npm tarball), the sync-server build fails with
TS6305 because packages/crdt/dist/index.d.ts doesn't exist yet.

Switch to `tsgo -b` so the sync-server build is self-contained: it
emits crdt's declarations into packages/crdt/dist on demand. This
mirrors what the sync-server `typecheck` script already does and fixes
all callers (`build:server`, docker-edge, publish workflows, the
direct `yarn workspace @actual-app/sync-server build` invocation in
build.yml) without needing per-workflow lage orchestration.

Revert the build.yml workaround added in the previous commit.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] sync-server: build @actual-app/crdt before tsgo

The previous tsgo -b approach emitted crdt's .d.ts via the project
reference but never produced dist/index.js — tsgo respects crdt's
tsconfig which has emitDeclarationOnly: true, and the actual JS
runtime is emitted by Vite in crdt's build script. So sync-server
compiled cleanly but crashed at runtime when forked by desktop-electron
(require('@actual-app/crdt') resolved to a package whose main pointed
at a nonexistent file, surfaced in e2e as the onboarding screen never
leaving the "Configure your server" state).

Unlike packages/api (which uses Vite with noExternal: true and bundles
crdt's source inline), sync-server uses plain tsgo compilation and
keeps its deps external — so crdt must be built ahead of time and be
resolvable via node_modules at runtime.

Chain `yarn workspace @actual-app/crdt build` before tsgo so every
caller of sync-server's build (build:server, docker-edge, publish
workflows, direct invocations in CI) gets a complete crdt dist. Revert
tsgo -b back to plain tsgo since crdt's build step now emits both the
JS and the declarations.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] crdt: expose dist/ via conditional exports so Node can load it

The package's `exports` field pointed straight at `./src/index.ts`,
which works for TS tooling and bundlers (vite with noExternal, vitest)
but breaks at plain-Node runtime — Node can't execute `.ts` files and
resolves dependent `./crdt` as a directory import, failing with
ERR_UNSUPPORTED_DIR_IMPORT.

That was invisible before because sync-server pinned
`@actual-app/crdt@2.1.0` and ran against the published npm tarball
(whose `publishConfig.exports` had already been promoted to the main
`exports` by yarn pack). Switching sync-server to `workspace:*` made
the raw workspace exports win at runtime: the compiled server imported
crdt when desktop-electron forked it, Node hit the `.ts` entry, the
utility process crashed before emitting `server-started`, and the
onboarding flow stalled on "Configure your server".

Switch to the same conditional-exports pattern packages/api already
uses: types → dist/index.d.ts, development → src/index.ts (for vitest
runs that enable the `development` condition), default → dist/index.js
(Node runtime and any other consumer). `publishConfig.exports` still
collapses this to just types + default for the npm tarball.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] crdt: split exports per consumer (browser source, node dist)

Previous commit's conditional exports routed everything non-development
to ./dist/index.js. That broke the web build: rolldown runs with
conditions ['electron-renderer', 'module', 'browser', 'default'] — no
match for development, falls through to the dist entry, which isn't
built by bin/package-browser, and fails to resolve @actual-app/crdt
when bundling loot-core's server/undo.ts.

Split the entries so each consumer lands on the right artifact:

  types       → ./dist/index.d.ts   (TypeScript, project references)
  development → ./src/index.ts      (vitest — both configs include it)
  browser     → ./src/index.ts      (web rolldown bundles the source)
  node        → ./dist/index.js     (sync-server forked by Node at
                                     runtime — the failure that kicked
                                     off this whole saga)
  default     → ./src/index.ts      (fallback for bundlers like api's
                                     vite build with conditions=['api'])

Verified: node resolves to dist, yarn build:browser succeeds from a
clean crdt/, sync-server build produces both dist/index.js and
build/app.js, loot-core (552) + sync-server (386) tests pass, full
typecheck clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
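Assembled from the table above (a sketch, not the verbatim file), the per-consumer export map in packages/crdt/package.json would look roughly like:

```json
{
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "development": "./src/index.ts",
      "browser": "./src/index.ts",
      "node": "./dist/index.js",
      "default": "./src/index.ts"
    }
  }
}
```

Condition order matters: Node picks the first matching key, so the specific conditions must precede `default`.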

* [AI] address review feedback on crdt/sync-server

- generate-proto: add `set -euo pipefail` so a protoc failure exits the
  script non-zero instead of silently running oxfmt on whatever is in
  src/proto/ from the previous run.
- sync.proto SyncRequest: field numbers jumped from 3 to 5; declare
  `reserved 4;` so the slot can't be silently reused for a new field
  with an incompatible type. Regenerated sync_pb.ts — the reservation
  shows up in the encoded file descriptor.
- sync-simple.js: SQLite stores is_encrypted as a 0/1 integer and
  better-sqlite3 hands it back as a number, but the bufbuild
  MessageEnvelope schema types isEncrypted as bool. Coerce to boolean
  when constructing the envelope so the JS value matches the field
  type before toBinary runs.

Skipped the suggested `types` → ./src/index.ts swap in crdt's exports:
packages/api uses the same `types` → dist pattern and TypeScript's
bundler resolution already falls through when dist/*.d.ts doesn't yet
exist (verified — loot-core typecheck passes with packages/crdt/dist
removed).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
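The `reserved` declaration mentioned above looks like this in proto3. This is a hypothetical fragment, not the real sync.proto; only the `reserved 4;` statement is taken from the commit message.

```proto
// Reserving a removed field number prevents it from being silently
// reused later for a new field with an incompatible type.
message SyncRequest {
  reserved 4;
  string since = 5;
}
```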

* [AI] address review feedback on encoder/app-sync test

- encoder.ts: prefs.getPrefs().encryptKeyId is `string | undefined`
  (MetadataPrefs is a Partial<>). The bufbuild SyncRequestSchema's
  keyId field is a non-optional proto3 string. Current code worked by
  accident — passing undefined into `create(Schema, init)` falls back
  to the schema default '' — but relied on bufbuild's undef-handling
  and would break if someone dropped @ts-strict-ignore. Normalize to
  '' explicitly.
- app-sync.test.ts: add a short WHY comment next to
  `syncRequest.since = ''` in "returns 422 if since is not provided".
  The test's intent (missing since) only matches the handler's
  `requestPb.since || null` falsy-check because proto3 strips '' on
  the wire and decodes it back to ''. Not obvious without the comment.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
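The normalization described in the first bullet, sketched with a hypothetical value:

```typescript
// proto3 strings have no "unset" state, so a possibly-undefined pref is
// normalized to '' explicitly rather than relying on the runtime's
// fallback when given undefined.
const encryptKeyId: string | undefined = undefined; // e.g. from getPrefs()
const keyId: string = encryptKeyId ?? '';
```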

* [AI] crdt: load source directly in dev, only use dist when published

Local exports point at src/index.ts so consumers (sync-server in
particular) never load a stale Vite bundle. publishConfig keeps the
dist/ mapping for npm consumers. Switched the Vite output to ESM and
added "type": "module" so the published bundle stays consistent.

Sync-server's existing extension-resolution loader is extended to
handle directory imports and is now registered at runtime via
--import ./register-loader.mjs, matching how tests already load it.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] desktop-electron: register sync-server loader on the embedded fork

The Electron app starts the sync server via utilityProcess.fork, which
bypasses sync-server's `start` script. With crdt now loaded from
source, the fork needs the same `--import register-loader.mjs` that
the standalone server uses; otherwise it crashes on the extensionless
`from './crdt'` directory import. Adds the loader files to
sync-server's published `files` so they actually ship with the
packaged app.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] sync-server: bootstrap entry that registers the loader for utilityProcess

Electron's utilityProcess.fork accepts execArgv but silently ignores
--import (verified with a minimal repro: the flag shows up in
process.execArgv but the preload module never executes), so the
previous attempt was a no-op and the embedded sync-server still
crashed on crdt's ESM directory imports. Add packages/sync-server/start.mjs
that statically imports register-loader.mjs and then dynamic-imports
build/app.js, so the loader is in place before the app's module graph
resolves. desktop-electron now points utilityProcess.fork at start.mjs
and drops the ineffective --import flag.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-12 17:55:52 +00:00
Yosof Badr
a4cb6dac37 [AI] fix: allow clearing pre-assigned category on new transactions (#7521)
* [AI] fix: allow clearing pre-assigned category on new transactions

Add a "Nothing" button to the category autocomplete modal that allows
users to clear a pre-assigned category when adding or editing
transactions. Previously, when a payee had a pre-assigned category,
there was no way to remove it and leave the transaction uncategorized.

Closes #7390

* [AI] docs: add release notes for PR #7521

* [AI] chore: re-trigger CI for flaky test

The test failure in methods.test.ts (Budgets: successfully update budgets)
is a pre-existing flaky test caused by a race condition in
advanceSchedulesService. The async schedule service fires via
void runMutator() after a sync event, but the database can be closed
before the query completes. This is unrelated to the PR changes which
only touch desktop-client UI code.

* chore: retrigger CI (flaky api test)

* fix type issue, better text

* more type fixes

* actually fixed?

---------

Co-authored-by: youngcw <calebyoung94@gmail.com>
2026-05-12 17:02:18 +00:00
youngcw
d1f9f8aecf release crossover report (#7804)
* release crossover report

* note
2026-05-12 16:45:01 +00:00
youngcw
3b927e754c 🐛 fix crossover report bugs (#7803)
* fix crossover report bugs

* note

* fixes

* refix
2026-05-12 16:02:09 +00:00
Alec Bakholdin
f2f3a5aa6d reserved whitespace for bank sync indicator when non-synced accounts … (#7611)
* moved the bank sync indicator to the right side of the text in mobile accounts view

* release notes

* moved spacing to the left again but made it smaller

* removed react from imports

* compressed space further

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7611

---------

Co-authored-by: Alec Bakholdin <alecbakholdin.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-12 15:56:42 +00:00
Stephen Brown II
8fbad7d64f [AI] No longer adjust date to match on transfers (#7722) 2026-05-12 15:51:08 +00:00
Nikhil Verma
44a3013772 [AI] Add R keyboard shortcut for Make transfer (#7750)
* [AI] Add T keyboard shortcut for Make transfer

* [AI] Add release notes for #7750

* [AI] Switch Make transfer shortcut from T to R and document it

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7750

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7750

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7750

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-12 15:40:16 +00:00
Matt Fiddaman
70716a59da automation UI: add goal template functionality (#7792)
* add goal to automations UI

* note

* tweak description

* Update upcoming-release-notes/7792.md

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2026-05-12 14:30:05 +00:00
Matt Fiddaman
e83567216f automation UI: open to the first automation (#7811)
* open to first automation

* note
2026-05-12 14:29:08 +00:00
Matt Fiddaman
92d4f82b66 automation UI: allow "save by date" automations not to repeat (#7810)
* allow save by automations not to repeat

* note
2026-05-12 14:28:58 +00:00
Matiss Janis Aboltins
50feba1afb [AI] Fix flaky API test timeouts and use sync file write in tests (#7806)
* [AI] Fix flaky upload-user-file test

The "uploads and updates an existing file successfully" test wrote the
old file content using the async callback form of fs.writeFile without
awaiting it. That write could land after the upload endpoint had already
written the new content, leaving the file with stale content and failing
the assertion. Use fs.writeFileSync so the setup completes before the
request is sent.

* [AI] Increase api test timeouts to fix flaky budget-load test

methods.test.ts loads a budget file and runs all DB migrations in each
test/hook. On busy CI runners this regularly approaches the default 5s
limit, and when it exceeds it the in-flight loadBudget keeps running after
teardown closes the database, producing a cascade of unhandled rejections
("database connection is not open", "no such table: v_schedules",
"Cannot read properties of undefined (reading 'timestamp')") that fail the
suite. Bump testTimeout/hookTimeout to 20s for the api package.

* [AI] Add release note for flaky test fixes

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-05-11 22:07:25 +00:00
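The race fixed in the first commit can be illustrated as below; paths are hypothetical.

```typescript
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';

// The callback form of fs.writeFile returns before the bytes land, so
// un-awaited test setup can finish *after* the code under test runs.
// writeFileSync completes before the next statement executes.
const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'upload-test-'));
const file = path.join(dir, 'user-file.bin');
// fs.writeFile(file, 'old content', () => {}); // racy setup: may land late
fs.writeFileSync(file, 'old content'); // setup is done when this returns
```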
Nadir Miralimov
8263e58eb2 [AI] Replace payee and category autocomplete filter/sort with fzf fuzzy search (#7261)
* [AI] feat(web): replace custom autocomplete ranking with fzf

Replace substring-based filter/sort in PayeeAutocomplete and
CategoryAutocomplete with fzf fuzzy search. Remove deprecated
autocompleteRanking utility.

Closes #7261

* Update #7261 release notes

Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7261

* Regenerate yarn.lock file

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7261

* Restore e2e snapshots

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7261

---------

Co-authored-by: Nadir Miralimov <riid@pm.me>
Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-11 21:33:36 +00:00
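fzf's actual matcher also scores and ranks results; the core idea that distinguishes it from the old substring filter — in-order subsequence matching — can be sketched as:

```typescript
// Illustrative only, NOT the fzf library's algorithm. Shows why a query
// like 'grc' matches 'Groceries' under fuzzy search but would match
// nothing under a plain substring filter.
function fuzzyMatches(query: string, candidate: string): boolean {
  const haystack = candidate.toLowerCase();
  let from = 0;
  for (const ch of query.toLowerCase()) {
    const at = haystack.indexOf(ch, from);
    if (at === -1) return false; // character missing, or out of order
    from = at + 1;
  }
  return true;
}
```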
Matt Fiddaman
d015858e4a persist cleanup templates in the DB (#7794)
* add cleanup DB structure

* persist cleanup templates in the DB

* note

* Update packages/loot-core/src/types/models/cleanup-templates.ts

Co-authored-by: Julian Dominguez-Schatz <julian.dominguezschatz@gmail.com>

* jfdoming feedback

* coderabbit

---------

Co-authored-by: Julian Dominguez-Schatz <julian.dominguezschatz@gmail.com>
2026-05-11 21:30:51 +00:00
dependabot[bot]
f3a9c1a02c Bump mermaid from 11.12.1 to 11.15.0 (#7801)
Bumps [mermaid](https://github.com/mermaid-js/mermaid) from 11.12.1 to 11.15.0.
- [Release notes](https://github.com/mermaid-js/mermaid/releases)
- [Commits](https://github.com/mermaid-js/mermaid/compare/mermaid@11.12.1...mermaid@11.15.0)

---
updated-dependencies:
- dependency-name: mermaid
  dependency-version: 11.15.0
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-11 21:10:29 +00:00
Matiss Janis Aboltins
daa698e7d2 [AI] Fix /update-vrt merge step when only one shard has changes (#7802)
The Merge VRT Patches job collects shard patches with the glob
`/tmp/shard-patches/*/vrt-shard.patch`, which assumes every downloaded
artifact lands in its own `path/<artifact-name>/` subdirectory. But
actions/download-artifact only does that when 2+ artifacts match the
pattern; when exactly one matches it unpacks the artifact directly into
`path`. So whenever a `/update-vrt` run touches snapshots in a single
shard (the common case) the patch ends up at
`/tmp/shard-patches/vrt-shard.patch`, the glob matches nothing, and the
job reports "No shard patches to merge" despite a patch having been
generated (e.g. run 25679233565).

Replace the glob with a recursive `find` so the patches are located
under either layout. `merge-multiple: true` is not an option here
because every shard artifact contains a file literally named
`vrt-shard.patch` and they would overwrite each other.

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-11 20:38:27 +00:00
Matt Fiddaman
e0536b593d add EB feedback issue (#7800)
* add EB feedback issue

* note
2026-05-11 20:22:21 +00:00
Matt Fiddaman
d500057494 automation UI: add options section to sidebar (#7791)
* add options section to sidebar

* note

* don't allow switching from option to automation

* use short description for sidebar, hide projected balance
2026-05-11 20:09:31 +00:00
Matiss Janis Aboltins
0086f805f8 [AI] Release custom themes feature (#7775)
* [AI] Release custom themes feature

Remove the customThemes experimental feature flag while keeping the
functionality intact (now enabled for all users).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Revise custom themes release note to experimental

Updated the release notes to reflect the experimental status of the custom themes feature.

* [AI] Move custom themes docs out of experimental

Custom themes graduated from experimental in this release; move the
guide to /docs/custom-themes, drop the experimental warnings and the
flag-toggle instructions, and update the historical link target in
releases.md plus a brief pointer from settings/index.md.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Update upcoming-release-notes/7775.md

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2026-05-11 20:08:23 +00:00
James Skinner
911d8371cc Tracking budget api income (#7526)
* Add tests and handling income group in different budget types

* Handle income groups in reflect budget logic

* Add release note

* Lint fix

* Address coderabbit feedback

* Remove ts-strict-ignore

* Change test dates, and assert group existence

* fix naming

* fix typecheck

---------

Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>
2026-05-11 19:36:51 +00:00
Matiss Janis Aboltins
a95c0ad9b0 [AI] sync server changes; crdt et al. (#7702)
* [AI] Load @actual-app/crdt from source in dev, only bundle for publish

@actual-app/crdt's local exports now point at src/index.ts so consumers
(sync-server, loot-core, desktop-client) never see a stale Vite bundle.
publishConfig keeps the dist/ mapping for npm consumers. crdt's
tsconfig switches to bundler module resolution to match the rest of
the workspace (no extensions in source imports).

Sync-server's existing extension-resolution loader is extended to also
handle directory-index imports (./crdt → ./crdt/index.ts), and the
standalone `start` / `start-monitor` scripts now invoke Node with
--import ./register-loader.mjs so the loader is in place before crdt's
source resolves.

Electron's utilityProcess.fork accepts execArgv but doesn't actually
preload --import modules, so a new packages/sync-server/start.mjs
bootstrap entry registers the loader imperatively and then dynamic-
imports build/app.js. desktop-electron's startSyncServer() points the
fork at start.mjs. sync-server's "files" array now ships start.mjs,
register-loader.mjs and loader.mjs so packaged Electron / npm
consumers actually receive them.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Add release notes for PR #7702

* [AI] Restructure sync-server to build with Vite

Replace the hand-rolled tsgo + add-import-extensions + copy-static-assets
+ runtime loader pipeline with a single Vite SSR build. Bundles every
entry (app, bin/actual-server, scripts/*) and inlines @actual-app/crdt
source so Node never has to resolve TS at runtime — the
MODULE_TYPELESS_PACKAGE_JSON warning that surfaced via crdt's source
exports is gone. Migrations and bank handlers move from readdir-based
dynamic imports to import.meta.glob; messages.sql becomes a ?raw import.

Drop loader.mjs, register-loader.mjs, start.mjs, and
bin/add-import-extensions.mjs. Electron's startSyncServer() forks
build/app.js directly. publishConfig.imports goes away (subpath imports
are resolved at build time and don't appear in the bundle).

In dev (start:server-dev) sync-server proxies to Vite, so loosen the CSP
to allow Vite's inline preamble script and HMR websocket — production
CSP is unchanged. desktop-client skips registerSW() in dev (and disables
vite-plugin-pwa's devOptions) so stale cached assets don't override
edits between page loads.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Address review feedback

- sync-server CSP: drop 'unsafe-eval' from the production script-src;
  the bundle has no genuine eval/new Function usage (only a defensive
  branch in setimmediate's polyfill that's never hit). Keep it on the
  dev branch where Vite's HMR runtime relies on it. Add a comment so
  it's obvious which branch needs it and why.
- bank-factory: widen the loader glob to ./banks/*_*.{ts,js} so
  TypeScript handlers are discovered too, mirroring migrations.ts.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Restore 'unsafe-eval' in production CSP for Electron

The Electron app needs `'unsafe-eval'` at runtime, so revert the dev-only
restriction and keep `'unsafe-eval'` in both branches. Comment updated to
record the actual reason instead of marking it as removable.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Revert bank-factory glob change

Widening the glob to ./banks/*_*.{ts,js} broke the desktop e2e tests in
CI even though every current handler is .js and the brace expansion
matches no .ts files locally. Reverting to ./banks/*_*.js — the change
had no behavioural benefit since there are no TS handlers, so the
nitpick isn't worth chasing.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Strip CSP comment to restore identical state to 9513c1e16

The desktop e2e has been failing despite my prior commits being a strict
revert (only difference was a 2-line comment, which can't change runtime).
Removing even the comment so the branch matches 9513c1e16's relevant
files exactly, to isolate whether the failure is from the master merge
or from CI-environment drift.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Make rebuild-electron actually rebuild better-sqlite3

PR #7712 simplified rebuild-electron to just `electron-rebuild -f -o
better-sqlite3,bcrypt` from the repo root. Two problems with that:

  1. Without `-m`, electron-rebuild scans the root workspace's package.json
     for native deps. better-sqlite3 isn't a direct root dep — it lives
     under packages/sync-server/ — so the scan returns no candidates and
     the rebuild silently no-ops.
  2. Without --build-from-source, electron-rebuild defers to
     prebuild-install, which downloads a stale prebuilt binary keyed off
     better-sqlite3's package.json (ABI 127) instead of recompiling
     against Electron 39's bundled Node ABI 140. The download succeeds
     and "Rebuild Complete" prints, but the resulting `better_sqlite3.node`
     can't `dlopen` inside Electron's utility process — sync-server
     crashes immediately on db init, the renderer's startSyncServer IPC
     never resolves, and the e2e test hangs on "Configure your server".

Point -m at packages/desktop-electron (which transitively pulls in
better-sqlite3 and bcrypt via @actual-app/sync-server) and force a real
compile via --build-from-source. Verified locally: better-sqlite3
rebuilds to darwin-arm64-140 and the desktop e2e onboarding test passes
in 6s instead of hanging for 60s.
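
The fixed invocation described above can be sketched as follows (flags
per the message; the paths assume this repo's workspace layout):

```shell
# Sketch of the corrected rebuild command (flags per the message above).
# -m points electron-rebuild at the workspace whose dependency tree
# actually contains better-sqlite3 and bcrypt, and --build-from-source
# forces a real compile against Electron's bundled Node ABI instead of
# letting prebuild-install download a stale prebuilt binary.
yarn electron-rebuild -f \
  -m packages/desktop-electron \
  --build-from-source \
  -o better-sqlite3,bcrypt
```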

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Restore CSP unsafe-eval comment

Bring back the explanatory comment that was stripped diagnostically in
99682268c. Now that the desktop e2e regression is traced to
rebuild-electron and not to anything in this branch, we can keep the
documentation noting why 'unsafe-eval' is retained in both CSP branches.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Restore bank-factory glob to ./banks/*_*.{ts,js}

Re-apply the glob widening originally added in 145868f9d. It was
reverted in 531b1a191 because the desktop e2e was failing — that
failure is now traced to the rebuild-electron breakage (fixed in
6e8ac0784), not to this glob. Mirroring migrations.ts so future TS
bank handlers are picked up.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Fix applyAppUpdate hanging in dev mode

In dev mode browser-preload's updateSW was () => undefined, so
applyAppUpdate() — which calls updateSW() and then awaits a
deliberately never-resolving promise (waiting for the SW-driven page
reload) — hung the renderer instead of refreshing. In prod the page
is replaced by the new service worker, so the never-resolving await is
fine. The dev path now triggers a plain window.location.reload() so
the page reloads and the never-settling await is irrelevant, matching
prod's effective behaviour.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Revert rebuild-electron to master version

* Revert "[AI] Revert rebuild-electron to master version"

This reverts commit 4b6baab79f.

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>
2026-05-11 17:40:42 +00:00
Matiss Janis Aboltins
d9308c1474 [AI] Fix mobile bank sync indicators not updating during sync (#7784)
* [AI] Update mobile bank sync indicators live during sync

Mobile's account list uses react-aria-components ListBox with the
items render-function pattern, which memoizes rows by item identity.
Without a dependencies prop, changes to syncingAccountIds,
failedAccounts, and updatedAccounts in Redux didn't cause the
per-account dots to re-render until the items array itself changed,
so the green/yellow/red indicators only updated after the full sync
finished.

Pass these Redux selections via the dependencies prop so the rows
re-render as state changes during sync. Also clear SimpleFin
accounts from accountsSyncing right after the batch call returns,
so their indicators reflect completion before the per-account loop
starts on the remaining accounts.

https://claude.ai/code/session_01DNkRSgqW5JEtYpZjxvj7Bi

* [AI] Update release notes filename and author

https://claude.ai/code/session_01DNkRSgqW5JEtYpZjxvj7Bi

* [AI] Drop verbose comment on SimpleFin sync dispatch

https://claude.ai/code/session_01DNkRSgqW5JEtYpZjxvj7Bi

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-05-11 17:06:21 +00:00
Matiss Janis Aboltins
425db2d94d [AI] Recover from BackendInitFailure and show a meaningful error (#7761)
* [AI] Recover from BackendInitFailure and show a meaningful error

When the backend Worker fails to load (e.g., the hashed kcab.worker
asset can't be fetched), the SharedWorker would cache the
app-init-failure and replay it to every subsequent tab forever, while
the FatalError modal showed a misleading "browser version" message.

- Retry importScripts in production (3 attempts) so a transient blip
  doesn't brick the SharedWorker.
- Clear lastAppInitFailure when the client acknowledges the failure,
  when a backend later connects successfully (centralized in
  broadcastConnect), and when a fresh init arrives with no active
  groups (the failed leader is gone).
- Add a BackendInitFailure branch to FatalError's RenderSimple with a
  message that points the user at reload / hard refresh.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Remove support contact message from FatalError

Removed support contact message from FatalError component.

* [AI] Fix error propagation in importScriptsWithRetry

- Change Promise executor to accept both resolve and reject
- Properly propagate errors using .then(resolve).catch(reject)
- Fixes issue where errors from recursive retry calls were swallowed
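
A minimal sketch of the fixed pattern, with a generic `load` callback
standing in for the real importScripts call (names assumed, not the
actual implementation):

```typescript
// Retry wrapper whose Promise executor takes both resolve and reject,
// and whose recursive call forwards both outcomes via
// .then(resolve).catch(reject) instead of swallowing failures.
function loadWithRetry(load: () => void, attempts = 3): Promise<void> {
  return new Promise((resolve, reject) => {
    try {
      load(); // synchronous, like importScripts in a worker
      resolve();
    } catch (err) {
      if (attempts > 1) {
        // Without the .catch(reject), an error thrown by the final
        // recursive attempt would leave this promise pending forever.
        loadWithRetry(load, attempts - 1).then(resolve).catch(reject);
      } else {
        reject(err);
      }
    }
  });
}
```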

Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>
2026-05-11 17:06:01 +00:00
Matiss Janis Aboltins
5d270340a5 CLI: Hide hidden categories by default in list commands (#7786)
* [AI] CLI: hide hidden categories by default in list commands

The `categories list` and `category-groups list` commands now exclude
hidden entries by default. Pass `--include-hidden` to include them, mirroring
the existing `--include-closed` flag for `accounts list`.

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] Rename release note to 7785.md and update author

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] CLI: simplify category-groups list and consolidate test setup

- Flatten the include-hidden ternary on category-groups list into a
  single filter chain, mirroring categories list.
- Consolidate duplicated stderr/stdout spy setup into one outer
  describe in categories.test.ts.

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] Rename release note to 7786.md to match PR number

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] Push hidden-category filtering down to the API/query layer

Add an optional `hidden` filter to `api.getCategories` and
`api.getCategoryGroups`. When set, the AQL query filters category groups
by hidden status and nested categories are filtered to match. Internal
callers (no options) keep the existing "return everything" behavior.

The CLI `categories list` and `category-groups list` commands now pass
`{ hidden: false }` instead of filtering client-side after fetching.

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F
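
The filtering semantics described above can be sketched with local
stand-in types (not the real API shapes or query layer):

```typescript
type Category = { name: string; hidden: boolean };
type Group = { name: string; hidden: boolean; categories: Category[] };

// hidden unset: return everything (internal callers' old behavior).
// hidden set: keep only groups with the matching hidden status, and
// filter each group's nested categories to match as well.
function filterGroups(groups: Group[], opts?: { hidden?: boolean }): Group[] {
  const want = opts?.hidden;
  if (want === undefined) return groups;
  return groups
    .filter(g => g.hidden === want)
    .map(g => ({
      ...g,
      categories: g.categories.filter(c => c.hidden === want),
    }));
}
```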

* [AI] Document new `hidden` option on getCategories and getCategoryGroups

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] getCategories: include hidden categories from visible groups in list

When `hidden: true` was requested, the flat list only contained hidden
categories that lived inside hidden groups, because it was derived from
the same already-filtered groups used for the grouped view. A hidden
category sitting in a visible group was silently dropped.

Fetch the unfiltered groups for the list view and filter by
`category.hidden` so the list reflects every hidden category regardless
of its parent group's hidden status. The grouped view is unchanged.

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

* [AI] getCategories: query categories table directly when hidden=true

Replace the second `getCategoryGroups()` call (which loaded every group
plus its nested categories just to be flattened and filtered) with a
direct `q('categories').filter({ hidden: true })` AQL query. Same
result, one targeted query instead of fetching all groups.

The non-hidden=true paths are unchanged.

https://claude.ai/code/session_01DhYiicACsWb5NGHX71Wv4F

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-05-11 17:05:19 +00:00
Matt Fiddaman
070144f182 automation UI: dry run should show full demand (#7790)
* dry run should only take into account the single category

* fix error/warning consistency

* note

* ignore priorities for dry run
2026-05-11 16:10:32 +00:00
Alec Bakholdin
e479c84898 Added logic to prevent autocomplete from flickering on alt tab (#7637)
* added logic to prevent autocomplete from flickering on alt tab

* lint auto fixes

* removed import and linting autofix

* release notes

* flipped false to true for allowOpening

* updated release notes

* release note update to trigger CI

---------

Co-authored-by: Alec Bakholdin <alecbakholdin.com>
2026-05-11 15:34:17 +00:00
Tyler Quinlivan
1306da27c5 docs: fix wrong example number (#7773) 2026-05-11 15:20:46 +00:00
Aurel Demiri
0fd510a1d4 Integrate Enable Banking as a bank sync provider (#7345)
* Integrate Enable Banking as bank sync provider

Rewrite Enable Banking modal to match GoCardless pattern

Resolve Enable Banking bugs and improve auth flow

* [AI] Address code review feedback for Enable Banking integration

Bug fixes:
- Fix double-negative for DBIT transaction amounts (e.g. '--25.99')
- Fix payeeName counterparty mapping (CRDT→debtor, DBIT→creditor)
- Add missing state validation in EnableBankingCallback and /auth_callback
- Fix stuck loading state in useEnableBankingStatus with try/catch/finally
- Make session-expiry error matching case-insensitive
- Prefer CLAV balance type for startingBalance in /transactions route
- Guard setTimeout in post/del/patch when timeout is null
- Distinguish abort from network failure in post() catch

Credential handling:
- Add validateCredentials() to validate before persisting secrets
- Refactor client to use enablebanking-configure instead of manual secret-set
- Distinguish null (loading) from false (not configured) in setup checks

Poll-auth robustness:
- Add unique waiter IDs to prevent superseded waiter cleanup race
- Always cache results in completedAuths for retry resilience
- Add client disconnect cleanup via res.on('close')
- Cancel poll when Enable Banking modal closes via AbortController
- Prevent concurrent poll controller race with local reference check

Code quality:
- Extract buildSessionResult() to deduplicate auth_callback/complete-auth
- Add enabled parameter to useEnableBankingStatus to skip unused requests
- Add re-entrancy guard on onJump, reset bank on country change
- Refetch bank list after Enable Banking setup completes
- Type enableBankingConfigure config, make state required in completeAuth
- Add AbortError→TIMED_OUT test, fix startAuth test assertion
- Add afterAll vi.unstubAllGlobals() for test cleanup
- Add explanatory comments for bank-per-account model and in-memory maps

* [AI] Fix missing patterns in Enable Banking integration

- Add SyncServerEnableBankingAccount to ExternalAccount union and
  getInstitutionName parameter type in SelectLinkedAccountsModal
- Use BankSyncProviders type in mobile BankSyncAccountsList instead of
  hardcoded union missing enableBanking
- Add getSecretsError handling to EnableBankingInitialiseModal for
  proper auth/permission error messages
- Replace hardcoded #666 color with theme.pageTextSubdued
- Wrap onConnectEnableBanking in try/catch with error notification and
  init modal re-open, matching SimpleFin/PluggyAI pattern
- Translate hardcoded error string in enablebanking.ts
- Add 60s timeout to downloadEnableBankingTransactions matching PluggyAI
- Revert out-of-scope changes to del()/patch() in post.ts
- Revert shared starting balance dedup logic back to master pattern

* Forward PSU headers to Enable Banking API

* Fix Enable Banking re-auth dispatch

* Respect ASPSP maximum_consent_validity when starting Enable Banking auth

* Fix missing types for module jws

* Add upcoming release notes

* Fix format

Expected "sign" (value-import) to come before "Algorithm"

* Fix code review findings on Enable Banking integration

* [AI] Disable Enable Banking button while status is loading

* typo

* [AI] Migrate enable-banking files to subpath imports

Update all enable-banking files to use # subpath imports and
@actual-app/core paths, matching the migration done in master.
Add #enablebanking entry to desktop-client package.json imports map.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

* [AI] Add #app-enablebanking subpath imports to sync-server package.json

Register enablebanking service, utils, and root entries in both
the imports and publishConfig.imports maps.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

* Add jws to dependencies

* [AI] Harden Enable Banking OAuth callback handoff

Enforce exact OAuth state round-trip in the Enable Banking callback so
mismatched/missing state values no longer silently complete the flow.
Replace unsafe `as`/`!` assertions in the auth handoff with typed
locals so the callback path stays sound under strict TypeScript.

* [AI] Tighten Enable Banking type safety

Make the Enable Banking external-msg modal strict-ts compatible,
annotate the id type in linkEnableBankingAccount, derive
AccountSyncSource from a single SYNC_PROVIDERS list, and annotate the
return type of getJWTBody. No behaviour change.

* [AI] Fix Enable Banking poll lifecycle and abort handling

Make the popup-driven auth poll cancellable and isolated:

- Allow the popup retry path to abort the in-flight poll instead of
  leaving it hanging on the previous attempt.
- Clear the Enable Banking stateRef when the retry attempt finishes so
  a new attempt starts from a clean state.
- Start useEnableBankingStatus in loading state until the first fetch
  resolves so the UI doesn't briefly flash "not connected".
- Cancel only the requested poll, not every in-flight Enable Banking
  poll, so unrelated link attempts aren't affected.
- Skip writing the poll response when the client has already
  disconnected, with a regression test covering the disconnect path.

* [AI] Tighten Enable Banking client/test plumbing

Misc code-quality improvements with no behaviour change:

- Parallelize Enable Banking secret reset calls so wiping multiple
  secrets doesn't serialize the request chain.
- Use absolute imports in the enable-banking client module to match the
  rest of desktop-client.
- Document externalSignal usage in the post helper.
- Tighten Enable Banking test fixtures with `satisfies` and dynamic
  dates so they stop drifting when the real "now" moves.

* [AI] Fix Enable Banking initial-balance and post-link bookkeeping

Apply the standard post-sync bookkeeping when linking an Enable Banking
account so the new account picks up the same starting-balance
treatment as other bank-sync providers, and skip pending transactions
when computing the initial balance so the figure isn't inflated by
transactions that haven't cleared yet.

* [AI] Refine Enable Banking error model and bank-sync surface

Carry the human-readable Enable Banking message in
EnableBankingError.error_type and the machine-friendly identifier in
error_code, then map error_code to a bank-sync category in the
/transactions wire format so AccountSyncCheck can match on the same
categories as other providers.

* [AI] Improve Enable Banking bank-sync field mapping

Bring the Enable Banking transaction normalizer in line with how other
bank-sync providers feed the field mapper:

- Strip SEPA structured prefixes from remittance text so notes/payee
  display the human-meaningful portion instead of the SEPA boilerplate.
- Return the notes field and spread the raw transaction so downstream
  field mapping can reach the full payload.
- Expose Enable Banking raw fields in the bank-sync field mapper UI so
  users can map any underlying property, not just the curated subset.

* [AI] Use req.ip for Enable Banking PSU header so trust-proxy whitelist applies

* [AI] Address Enable Banking CodeRabbit pass-3 follow-ups

Three small fixes from the latest CodeRabbit re-review:

- Guard the aspsps fetch in EnableBankingExternalMsgModal against stale
  responses. Switching countries quickly could let an earlier in-flight
  request overwrite the newer selection's bank list. Added a cleanup
  flag in the useEffect so only the latest response updates state.
- Clear `enablebanking_auth_state` from localStorage when the auth flow
  exits, but only if the stored value still matches this attempt's
  state, so a concurrent retry can't wipe a newer session. Wrapping
  the poll in try/finally covers every return path (success, timeout,
  abort, body-level error).
- Use `Boolean(trans.booked)` in the Enable Banking initial-balance
  predicate to match `normalizeBankSyncTransactions`. The Enable
  Banking normalizer always sets `booked` to a boolean today, so this
  is defensive rather than a live bug, but keeping the two predicates
  aligned avoids surprises if the upstream shape ever loosens.

* [AI] Address Enable Banking CodeRabbit pass-3 follow-ups (round 2)

Two more findings from the latest CodeRabbit pass:

- Guard onJump against stale-retry completions. Token each call with a
  monotonic jumpIdRef counter and gate every post-await write
  (setError/setWaiting after onMoveExternal, the second setWaiting,
  and the finally-block ref reset) on `myJumpId === jumpIdRef.current`.
  Without this, a retry click while the previous poll was still
  unwinding could surface the older call's error in the newer
  attempt's UI and clear stateRef/isJumpingRef out from under it,
  leaving the new poll un-cancellable.
- Translate the (beta) suffix on Enable Banking ASPSP names so
  non-English locales don't surface a hardcoded English token in the
  bank list. The existing `actual/no-untranslated-strings` rule misses
  this case (regex requires a leading uppercase, and template-literal
  interpolations aren't visited as standalone strings).

* [AI] Use SEPA prefix allowlist instead of catch-all regex

The previous `^[A-Z]{3,}\+` regex would incorrectly strip merchant
tokens like `BMW+`, `USB+`, or `COVID+` from the start of a remittance
line. Replaced it with an explicit allowlist of known SEPA / ISO 20022
prefixes and added a regression test covering the false-positive case.
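
A sketch of the allowlist approach; the prefix set below lists common
SEPA / ISO 20022 remittance tags and is assumed, not necessarily the
exact set used in the PR:

```typescript
// Only strip a leading "TAG+" when TAG is a known SEPA / ISO 20022
// remittance prefix, so merchant tokens like "BMW+" survive intact.
const SEPA_PREFIXES = ['SVWZ', 'EREF', 'KREF', 'MREF', 'CRED', 'DEBT'];
const prefixRe = new RegExp(`^(?:${SEPA_PREFIXES.join('|')})\\+`);

function stripSepaPrefix(line: string): string {
  return line.replace(prefixRe, '');
}
```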

* [AI] Use uuidv4 instead of crypto.randomUUID in Enable Banking

Aligns with master's revert in #7734 (crypto.randomUUID back to uuid
library). Two stray spots remained in Enable Banking code: the
link-account flow in loot-core/server/accounts/app.ts and the OAuth
state token in sync-server/app-enablebanking.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-05-11 14:05:30 +00:00
Matiss Janis Aboltins
82673ecd50 [AI] Use bash for /update-vrt merge step (#7783)
The Merge VRT Patches job runs inside the Playwright container where
the default GitHub Actions shell is `sh -e {0}`, not bash. The merge
step uses bash-only constructs (`shopt -s nullglob`, array literals,
`${#patches[@]}`, `"${patches[@]}"`), so every /update-vrt run that
reaches the merge stage now exits 127 with `shopt: not found` (e.g.
run 25609625260).

Pin this step to `shell: bash` to match the explicit `shell: bash` we
already use elsewhere in the workflow. The sibling shard-patch creation
steps stay on the default sh because they only use POSIX features.

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-10 18:56:50 +00:00
Matiss Janis Aboltins
18c704b3ba [AI] Sync server: harden CORS proxy method validation (#7788)
* [AI] Sync server: harden CORS proxy method validation

The CORS proxy validated `method` against a fallback-normalized value but
forwarded the raw client-supplied value to fetch(), letting a non-string
input (e.g. ["POST"]) bypass the GET/HEAD allowlist via undici's String()
coercion. Reject non-string method, pass the validated normalized method
to fetch(), and drop the unreachable body-forwarding branch.
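
A minimal sketch of the hardened validation (function name and error
handling assumed, not the sync-server's actual code):

```typescript
const ALLOWED_METHODS = new Set(['GET', 'HEAD']);

// Validate the client-supplied method and return the normalized value
// that should be forwarded to fetch(). Rejecting non-strings up front
// closes the hole where an array like ['POST'] bypassed the allowlist
// check but was then String()-coerced to "POST" inside undici.
function validateProxyMethod(method: unknown): string {
  if (typeof method !== 'string') {
    throw new Error('method must be a string');
  }
  const normalized = method.toUpperCase();
  if (!ALLOWED_METHODS.has(normalized)) {
    throw new Error('method not allowed');
  }
  // Forward the validated, normalized value — never the raw input.
  return normalized;
}
```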

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Polish release notes wording

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Rename 7787.md to 7788.md

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-10 16:11:40 +00:00
Michael Clark
b05c207123 :electron: Publish to Microsoft store after release is published (#7757)
* move desktop app microsoft store publish to after the release is published

* release notes
2026-05-09 21:16:13 +00:00
Matiss Janis Aboltins
b9ab3e7bc6 [AI] Fix /update-vrt build step after lage browser-build refactor (#7781)
The build-web job in vrt-update-generate.yml invoked
`yarn workspace @actual-app/core build:browser`, but #7602 removed that
script when it routed the browser pipeline through
`lage build:browser --to=@actual-app/web` (orchestrated by
bin/package-browser). The recent /update-vrt parallelization (#7641)
preserved the now-stale per-workspace invocations, so every comment
trigger fails with "Couldn't find a script named build:browser".

Match the working e2e-test.yml build-web step exactly:
`yarn build:browser --skip-translations`. lage's `^build` edge handles
the upstream graph (crdt, plugins-service, loot-core) automatically, and
`--skip-translations` keeps the captured snapshots aligned with regular
VRT runs (which also strip Weblate locale chunks for determinism).

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-09 18:16:42 +00:00
Matiss Janis Aboltins
4f40defe9e [AI] Mobile: live value tracking (#7774)
* [AI] Update mobile budget value color live as user types

The mobile FocusableAmountInput's color was computed from the saved
`value` prop, so it stayed in the gray "zero" state until blur. Track
the in-progress edited value via the existing `onChangeValue` callback
and feed it to `makeAmountFullStyle` so the color reflects what the
user is currently typing.

* Add release notes for PR #7774

* Change category from Features to Bugfix

* [AI] Reapply sign when computing live amount color

liveValue holds the absolute value (the input field has no sign — the
+/- toggle controls it separately), so passing it directly to
makeAmountFullStyle picked positiveColor for amounts the user intends
as negative. Pass maybeApplyNegative(liveValue, isNegative) so the
color matches the signed value.

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-09 18:08:58 +00:00
Will Lapinel
3799b587ec [AI] Add getNote and updateNote to public API (#7769)
* [AI] Add getNote and updateNote to public API

Notes on categories and other entities have no public API surface today.
The internal `notes-save` handler exists and works, but callers outside
the app must reach into undocumented internals to use it.

A concrete motivation: AI assistants driving Actual through an MCP server
(e.g. Claude via @actual-app/api) can set budget templates and savings
goals on categories by writing specially-formatted strings to the notes
field (e.g. `#template 250`, `#goal 1000`). Without a public API this
requires using the private `lib.send('notes-save', …)` path, which is
fragile and not guaranteed to stay stable.

This commit adds two public methods:
- `getNote(id)` — returns the NoteEntity for a given entity id, or null
- `updateNote(id, note)` — sets the note string on any entity by id

Implementation:
- Adds `notes-get` handler in `packages/loot-core/src/server/notes/app.ts`
- Adds `api/note-get` and `api/note-update` handlers in `api.ts`
- Adds `ApiHandlers` types for both new handlers
- Exposes `getNote` / `updateNote` in `packages/api/methods.ts`
- Adds a test covering get (null before set) and set/update round-trip

Testing:
- `yarn typecheck` — passed (10/10 packages, 0 errors)
- `yarn lint:fix` — passed (0 errors)
- `yarn workspace @actual-app/api test` — passed (19/19 tests, including
  the new "Notes: successfully get and update note" test)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* [AI] Add release note for PR #7769

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* [AI] Address review feedback: tighten types and add docs

- Use NoteEntity field types (Pick<NoteEntity, 'id'>, NoteEntity['id'],
  NoteEntity['note']) instead of plain strings throughout
- Rename getNotes -> getNote (singular) in notes/app.ts
- Add Notes section to packages/docs/docs/api/reference.md

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-09 16:17:56 +00:00
Jon Bramley
8e1f27f316 Modal Blur remove static will-change: transform (#7760)
* remove static will-change: transform

* Release notes

* Fix text blur in modals by updating CSS

---------

Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>
2026-05-09 16:00:23 +00:00
Nikhil Verma
fb95d4c92d [AI] Document Dev Container option in development-setup docs (#7729)
* [AI] Document Dev Container option in development-setup docs

* [AI] Add release notes for #7729

* [AI] Update spell-check dictionary for Codespaces
2026-05-09 15:45:57 +00:00
LIZ
2782d464ab Fix last month report widgets restoring as static (#7768)
* Fix last month report widgets restoring as static

* Add release note
2026-05-09 15:30:04 +00:00
dependabot[bot]
b63f5dd303 Bump fast-uri from 3.1.0 to 3.1.2 (#7762)
Bumps [fast-uri](https://github.com/fastify/fast-uri) from 3.1.0 to 3.1.2.
- [Release notes](https://github.com/fastify/fast-uri/releases)
- [Commits](https://github.com/fastify/fast-uri/compare/v3.1.0...v3.1.2)

---
updated-dependencies:
- dependency-name: fast-uri
  dependency-version: 3.1.2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-09 15:05:29 +00:00
dependabot[bot]
35a01b0fa6 Bump @babel/plugin-transform-modules-systemjs from 7.28.5 to 7.29.4 (#7776)
Bumps [@babel/plugin-transform-modules-systemjs](https://github.com/babel/babel/tree/HEAD/packages/babel-plugin-transform-modules-systemjs) from 7.28.5 to 7.29.4.
- [Release notes](https://github.com/babel/babel/releases)
- [Changelog](https://github.com/babel/babel/blob/main/CHANGELOG.md)
- [Commits](https://github.com/babel/babel/commits/v7.29.4/packages/babel-plugin-transform-modules-systemjs)

---
updated-dependencies:
- dependency-name: "@babel/plugin-transform-modules-systemjs"
  dependency-version: 7.29.4
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-09 15:03:49 +00:00
Matiss Janis Aboltins
3104503a8a Refactor VRT workflow to parallelize browser and desktop tests (#7641)
* [AI] Parallelize and shard /update-vrt workflow

Mirror the pre-built bundle + 3-way sharding pattern that #7503 applied
to e2e-test.yml, plus split desktop VRT into its own job so it runs
concurrently with the browser passes instead of sequentially on the
same runner.

- New `build-web` job compiles the browser bundle once and uploads it
  as an artifact (REACT_APP_NETLIFY=true so the "Create test file"
  button survives tree-shaking).
- `browser-vrt` runs as a 3-shard matrix, each downloading the prebuilt
  artifact and using `E2E_USE_BUILD=1` so `serve-build.mjs` replaces
  per-shard Vite startup.
- `desktop-vrt` runs in parallel with the browser shards.
- Each shard produces its own PNG-only `git format-patch` and
  validates it before upload; `merge-patch` re-validates and applies
  every shard patch to produce the single `vrt-patch-<pr>` artifact
  that `vrt-update-apply.yml` already consumes unchanged.
- Keeps `permissions: contents: read, pull-requests: read`,
  `persist-credentials: false` on every checkout, and env-indirection
  for fork-controlled values (zizmor-friendly).

Expected wall-clock drop from ~50 min to ~15-20 min.

* Add release notes for PR #7641

* [AI] Fix vacuous PNG-only patch validation regex

`git format-patch` emits a `GIT binary patch` block for PNGs and does
not produce `+++ b/foo.png` / `--- a/foo.png` text-diff headers. The
existing validation `grep -E '^(\+\+\+|---) [ab]/' patch | grep -v
'\.png$'` therefore matches zero lines for any legitimate PNG-only
patch, and the guard passes vacuously — meaning a crafted binary patch
naming a non-PNG file would also pass undetected.

Match `diff --git` headers instead. Those are present for both text
and binary patches, and naming both source and destination paths gives
us a clean `^diff --git a/<path>.png b/<path>.png$` shape to enforce.
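
The check can be sketched in shell (regex shape per the message above;
the helper name and sample files are illustrative):

```shell
# A patch is PNG-only iff every `diff --git` header names a .png path
# on both the a/ and b/ sides; any other header fails the check.
png_only() {
  ! grep -E '^diff --git ' "$1" | grep -vqE '^diff --git a/[^ ]+\.png b/[^ ]+\.png$'
}

# Illustrative fixtures: a legitimate PNG-only patch and a mixed one.
printf 'diff --git a/shot.png b/shot.png\nGIT binary patch\n' > /tmp/png.patch
printf 'diff --git a/shot.png b/shot.png\ndiff --git a/evil.sh b/evil.sh\n' > /tmp/mixed.patch

png_only /tmp/png.patch && echo "png.patch: ACCEPTED"
png_only /tmp/mixed.patch || echo "mixed.patch: REJECTED"
```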

Updated all four validation points in vrt-update-generate.yml (per
shard, in the merge-patch re-validation loop, and on the final merged
patch) plus the pre-existing third defense layer in
vrt-update-apply.yml. Also fixed FILES_CHANGED counter in apply
workflow since it relied on the same broken `+++` pattern.

Verified the new regex with binary patches: legit PNG-only ACCEPTED,
single non-PNG REJECTED, mixed PNG+non-PNG REJECTED.

Reported by CodeRabbit on PR #7641.

* [AI] Move workspace-trust step before setup in browser-vrt and desktop-vrt

Master's #7699 moved the safe.directory git config to a separate step
that runs before the setup composite action, because the setup action
performs git operations (yarn --immutable + checkout of the
translations repo) that fail when the workspace isn't trusted in
container environments.

The merge resolution kept the trust step before setup in build-web (as
master had it) but left the trust step after setup in the new
browser-vrt and desktop-vrt jobs. They have the same setup composite
and would hit the same failure mode — apply the same fix.

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>
2026-05-09 14:26:21 +00:00
dependabot[bot]
db38565524 Bump uuid from 13.0.2 to 14.0.0 (#7739)
* Bump uuid from 13.0.2 to 14.0.0

Bumps [uuid](https://github.com/uuidjs/uuid) from 13.0.2 to 14.0.0.
- [Release notes](https://github.com/uuidjs/uuid/releases)
- [Changelog](https://github.com/uuidjs/uuid/blob/main/CHANGELOG.md)
- [Commits](https://github.com/uuidjs/uuid/compare/v13.0.2...v14.0.0)

---
updated-dependencies:
- dependency-name: uuid
  dependency-version: 14.0.0
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

* [AI] Add engines field to sync-server package.json for Node.js >=22 requirement

Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>
2026-05-08 20:40:07 +00:00
Nikhil Verma
9e9cf45641 [AI] Update preview-builds doc with current URL pattern and additional previews (#7728)
* [AI] Update preview-builds doc with current URL pattern and additional previews

* [AI] Simplify preview labels and add release notes
2026-05-08 20:26:47 +00:00
Matiss Janis Aboltins
8ab8277429 [AI] Replace Support contact link with auto-closing tech-support issue template (#7670)
* [AI] Replace Support contact link with auto-closing tech-support issue template

Convert the external Discord "Support" contact link into a proper GitHub issue
form so users who skip the redirect still land somewhere useful. The new form
has a single "Describe your problem" field and a prominent notice that tech
support tickets are auto-closed and Discord is the place to get help. A new
workflow watches for the `tech-support` label, posts a friendly Discord pointer
and closes the issue, mirroring the existing feature-request auto-close flow.

* Add release notes for PR #7670

* [AI] Replace create-or-update-comment action with gh CLI

The peter-evans/create-or-update-comment action is unnecessary since GitHub's gh CLI (pre-installed on all GitHub-hosted runners) provides the same functionality natively via 'gh issue comment' and 'gh issue close' commands. This change addresses the zizmor security scanner warning about using an action when the functionality is already included by the runner.
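
A hypothetical equivalent using only the pre-installed gh CLI (issue number, body text, and close reason here are illustrative, not the workflow's actual values):

```shell
# Comment with a Discord pointer, then close the issue natively.
gh issue comment "$ISSUE_NUMBER" --body "Tech support happens on Discord; this issue will be closed."
gh issue close "$ISSUE_NUMBER" --reason "not planned"
```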

Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>

* Update 7670.md

* [AI] Fix formatting in workflow file

Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>

* Update issues-close-tech-support.yml

Co-authored-by: Stephen Brown II <Stephen.Brown2@gmail.com>

* Update tech-support.yml

Co-authored-by: Stephen Brown II <Stephen.Brown2@gmail.com>

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>
Co-authored-by: Stephen Brown II <Stephen.Brown2@gmail.com>
2026-05-08 20:24:17 +00:00
Matiss Janis Aboltins
d9fb66422b [AI] Document patch release process in releasing.md (#7746)
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-08 20:23:27 +00:00
Matiss Janis Aboltins
345c99be4d [AI] Bump workspace packages to 26.5.2 and document 26.5.2 in release notes (#7756) 2026-05-08 19:30:10 +00:00
Matiss Janis Aboltins
e3b42b51a3 [AI] Fix publish-npm-packages workflow setup-node input (#7755)
* [AI] Fix publish-npm-packages workflow setup-node input

The `cache` input on actions/setup-node expects a package-manager
string ('npm'/'yarn'/'pnpm'), not a boolean. Passing `cache: false`
caused the publish job on the v26.5.1 release to fail with
"Caching for 'false' is not supported". Caching is off by default,
so the input is removed entirely.
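
For reference, a hedged sketch of the valid shape (step contents assumed): the input takes a package-manager name, and omitting the key disables caching.

```yaml
- uses: actions/setup-node@v4
  with:
    node-version: 22
    cache: yarn   # valid values: 'npm' | 'yarn' | 'pnpm'; never a boolean
```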

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Add release notes for PR #7755

* Change category from Bugfixes to Maintenance

Fix npm dependency caching in the publish workflow by removing the cache disabling setting.

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-05-08 18:31:39 +00:00
Matiss Janis Aboltins
bc08ed97e9 [AI] 🔖 Release 26.5.1 (#7745) 2026-05-08 17:11:46 +00:00
youngcw
d2b50adf30 fix checkboxes (#7742) 2026-05-08 13:37:24 +00:00
Michael Clark
1f4f706c4a ⚙️ Updating workflow based on audit (#7740)
* updating workflows with audit feedback

* release notes
2026-05-07 20:26:12 +00:00
Matiss Janis Aboltins
852b95524b [AI] Revert crypto.randomUUID back to uuid library (#7734)
* [AI] Revert crypto.randomUUID back to uuid library

Partial revert of #7529. Restores the `uuid` package dependency in
api, crdt, desktop-client, loot-core, and sync-server, swapping every
`crypto.randomUUID()` call introduced by that PR back to `uuidv4()`
(and the `uuid()` alias in RuleEditor.tsx and ruleUtils.ts where it
was previously used). The lint rule, docs entry, and `vi.mock('uuid')`
test setup are restored as well. The `fs-extra` removals in
desktop-electron from the same PR are left in place.

https://claude.ai/code/session_01KTg1g416Jdjf5feGke8MQw

* Add release notes for PR #7734

* [check-spelling] Update metadata

Update for https://github.com/actualbudget/actual/actions/runs/25480733101/attempts/1
Accepted in https://github.com/actualbudget/actual/pull/7734#issuecomment-4394811498

Signed-off-by: check-spelling-bot <check-spelling-bot@users.noreply.github.com>
on-behalf-of: @check-spelling <check-spelling-bot@check-spelling.dev>

* [AI] Use the uuidv4 alias in RuleEditor and ruleUtils

The previous commit preserved the `v4 as uuid` alias in these two files
to match their pre-#7529 state, but the project convention (and lint
rule message) is `v4 as uuidv4`. CodeRabbit flagged the inconsistency,
so normalize the alias and call sites in both files.

https://claude.ai/code/session_01KTg1g416Jdjf5feGke8MQw

* [AI] Pin uuid to ^11.1.0 to fix Electron e2e

uuid v13 is ESM-only (no CJS entry, only an exports map with `node` and
`default` conditions both pointing to ESM files). The Electron backend is
bundled as CJS by `loot-core/vite.desktop.config.mts` and loaded via a
dynamic `await import(process.env.lootCoreScript!)` in the
desktop-electron utilityProcess; that pipeline appears to fall over on
the ESM-to-CJS transform of uuid v13 in Vite 8 / rolldown, which makes
the Functional Desktop App e2e job fail consistently while every other
check (web e2e, VRT, unit tests, lint, typecheck) passes.

uuid v11 still ships `dist/cjs/index.js`, so pinning each workspace's
uuid range to ^11.1.0 sidesteps the resolution path entirely. The API
is unchanged (`v4` is still the same export), so no source-code changes
are needed.
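
The packaging difference can be sketched as follows (an illustrative package.json fragment, not verbatim from the uuid release): v13 exposes only ESM targets via its exports map, whereas v11 additionally ships a CJS entry at `dist/cjs/index.js`.

```json
{
  "type": "module",
  "exports": {
    ".": {
      "node": "./dist/esm/index.js",
      "default": "./dist/esm/index.js"
    }
  }
}
```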

https://claude.ai/code/session_01KTg1g416Jdjf5feGke8MQw

* [AI] Capture Electron stdout/stderr in desktop e2e fixture (TEMP DEBUG)

Pipe the Electron main+utility process stdout/stderr into both the
playwright runner stderr and an electron.log file inside the test's
output directory. This makes the actual backend error visible in the
Functional Desktop App job output and in the desktop-app-test-results
artifact.

Will be reverted once the failure cause is identified.

https://claude.ai/code/session_01KTg1g416Jdjf5feGke8MQw

* Revert "[AI] Capture Electron stdout/stderr in desktop e2e fixture (TEMP DEBUG)"

This reverts commit 4cb5148859.

* [AI] Fix rebuild-electron and bump uuid back to ^13.0.0

The Functional Desktop App e2e job had been failing with
ERR_DLOPEN_FAILED on better-sqlite3 (NODE_MODULE_VERSION 127 vs 140),
which surfaced because this PR's yarn.lock change invalidated the
GitHub Actions node_modules cache. With a fresh install,
rebuild-electron is responsible for compiling better-sqlite3 and
bcrypt against Electron's ABI — but since #7712 the script has been
running from the repo root, where neither module is a direct
dependency, so electron-rebuild silently exits as a no-op.

Restore the `-m ./packages/desktop-electron` scoping that #7712
removed, so the rebuild actually finds and rebuilds the modules.
This fix is technically out of scope for this PR but it blocks any
PR that touches yarn.lock.
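
A sketch of the restored invocation (the exact flags in the repo's script may differ):

```shell
# Scope electron-rebuild to the package that actually depends on the
# native modules; run from the repo root it finds nothing and no-ops.
npx electron-rebuild -f -m ./packages/desktop-electron
```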

Also revert the `^11.1.0` pin from the previous commit back to
`^13.0.0`. The pin was based on a wrong hypothesis (that uuid v13's
ESM-only packaging was breaking the bundle); now that the real
cause is identified, there is no reason to deviate from the
pre-#7529 version.

https://claude.ai/code/session_01KTg1g416Jdjf5feGke8MQw

---------

Signed-off-by: check-spelling-bot <check-spelling-bot@users.noreply.github.com>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>
2026-05-07 19:18:47 +00:00
Michael Clark
6fb73786d5 :electron: Fix desktop app server certs (#7713)
* fix desktop app server certs integration

* fix self signed certs - release notes

* security note

* release note spelling

* better message

* remove security notes because it's not a valid prop
* remove security notes because it's not a valid prop
2026-05-07 18:07:45 +00:00
Nikhil Verma
e3952d2a24 [AI] Add option to 'Copy [budget] to future months' (#7420)
* [AI] Add 'Copy to future months' budget option

Adds a new per-category budget menu option that copies the current
month's budgeted amount to all future months that already exist in
the budget. Works for both envelope and tracking budget types, and
on both desktop (inline popover) and mobile (modal) views.

* [AI] Rename release note to PR #7420

* [AI] Skip empty budget months in copyToFutureMonths

Only copy the budget value to future months that already have a
non-zero budget set for the category. Months with no budget (zero)
are left untouched.

* [AI] Rename to 'Copy until year end', limit to tracking budget, add year-end cap

* [AI] Address CodeRabbit feedback: i18n undo notification, clarify docs

* [AI] Fix typecheck: branch dispatch by budgetType for proper modal narrowing

* [AI] Fix lint: add t to useCallback deps
2026-05-06 18:19:40 +00:00
Michael Clark
fea36466d2 fix rss feed (#7732) 2026-05-06 18:18:30 +00:00
youngcw
1fadfa4e9b [AI] Fix #2155: duplicated transactions are marked as uncleared (#7723)
* init

* note
2026-05-06 17:43:30 +00:00
Aaron
6ead7ea42c docs: add Enable Actual to community repos (#7697)
Co-authored-by: Aaron <aaron@noreply.squaresine.com>
2026-05-06 00:31:41 +00:00
Matt Fiddaman
d6fc3212b9 re-sort preview transactions after rule application (#7691)
* resort schedules after rule application

* note
2026-05-05 23:48:26 +00:00
Alec Bakholdin
071611fcc5 Cannot read properties of null (reading 'toLowerCase') (#7704)
* added null safety in throwIfNot200

* release notes

* updated release notes

---------

Co-authored-by: Alec Bakholdin <alecbakholdin.com>
2026-05-05 23:03:02 +00:00
Matt Fiddaman
263358b5cf fix vrt update workflow (#7699)
* fix vrt update workflow

* note
2026-05-05 20:26:48 +00:00
Michael Clark
44fc959ed8 :electron: Fix electron dev mode not starting (#7712)
* fix electron dev mode

* release notes
2026-05-05 08:08:49 +00:00
Dan Hopkins
d787d0ce43 fix: only count failed attempts against auth rate limit (#7707)
* fix: only count failed attempts against auth rate limit

Add skipSuccessfulRequests: true to authRateLimiter so that successful
logins do not consume quota. This fixes breakage for API clients
(actual-cli, actual-mcp, custom scripts) that re-authenticate per
operation — they always provide the correct password, so they should
never be rate-limited.

Brute-force attackers generate repeated failures and still hit the wall.

Fixes #7706
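
The counting behaviour can be modelled in a few lines (a toy model of what skipSuccessfulRequests does, not express-rate-limit's implementation; all names are illustrative):

```typescript
type Attempt = { status: number };

// With skipSuccessfulRequests, only responses that indicate failure
// (status >= 400) count toward the window's quota.
function failedAttempts(attempts: Attempt[]): number {
  return attempts.filter(a => a.status >= 400).length;
}

function isRateLimited(attempts: Attempt[], max: number): boolean {
  return failedAttempts(attempts) >= max;
}

// An API client that re-authenticates correctly 100 times is never limited...
const apiClient = Array.from({ length: 100 }, () => ({ status: 200 }));
console.log(isRateLimited(apiClient, 5)); // false

// ...while five failed guesses still hit the wall.
const bruteForce = Array.from({ length: 5 }, () => ({ status: 401 }));
console.log(isRateLimited(bruteForce, 5)); // true
```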

* Update upcoming-release-notes/7706.md

Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>

* fix: rename release note to match PR number

---------

Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>
2026-05-04 22:27:18 +00:00
Aurora-Flipped
2c3e2a34fd Fix/spending report date range save (#7672)
* Fix saved spending report date range

* Add release note
2026-05-04 20:28:19 +00:00
lelemm
78d533c800 Bank sync page refactor (#7449)
* Bank sync refactor extracted from plugins

* code review

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7449

* [AI] Resolve bank sync PR conflicts

Co-authored-by: lelemm <lelemm@users.noreply.github.com>

* Change author name in 7449 release notes

Updated author name in release notes.

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: lelemm <lelemm@users.noreply.github.com>
Co-authored-by: youngcw <calebyoung94@gmail.com>
2026-05-04 14:51:33 +00:00
Juulz
49f6b21f2c [Bugfix]🐛 Fix refresh icon (sync) centering in Titlebar (#7674)
* Fix svg rendering in Titlebar component

* Fix conditional rendering of syncState text

* Fix string interpolation in Titlebar component

* Add release notes for bugfix on refresh icon centering

* Clarify refresh icon centering fix in Titlebar
2026-05-04 14:49:33 +00:00
Emil Tveden Bjerglund
9f05207fe8 Fix Cover Overspending menu closing when view too narrow (#7687)
* Fix Cover Overspending menu closing when view too narrow

* Update upcoming-release-notes/7687.md

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Increase min width

* Set minWidth instead

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2026-05-04 14:48:30 +00:00
Matt Fiddaman
8366c442a2 automation UI: separate percentage sources in error logic (#7694)
* separate percentage sources in error logic

* note

* restructure and add test

* fix income category resolution and add regression test
2026-05-04 13:59:55 +00:00
Matt Fiddaman
4b73fd7e45 link budget automation UI experimental feature to a feedback issue (#7693)
* link to feedback issue

* note
2026-05-03 22:46:10 +00:00
Julian Dominguez-Schatz
c593bda145 Update release docs to reflect latest process (#7690)
* Update release docs to reflect latest process

* Add missed space

* GitHub doesn't like non-American spelling

* PR feedback
2026-05-03 22:02:08 +00:00
Andreas Offenhaeuser
1b86bba2cd [Bugfix] Disable 2-day lookback for automatic transactions (#7299)
* disable 2-day lookback for automatic transactions

* add PR release notes
2026-05-03 21:44:28 +00:00
Matiss Janis Aboltins
6c2c96e826 🔖 (26.5.0) (#7621)
* 🔖 (26.5.0)

* fix release note generation script (#7635)

* fix release note generation script

* note

* fix cherrypicked commits not being respected and lint race in release note generation workflow (#7640)

* fix cherrypicked commits not being respected and lint race

* note

* coderabbit suggestions

* fix lint

* make double restore possibility safe

* fix lint (#7643)

* Generate release notes for v26.5.0

* add release note highlights

* Fix Sankey income bug when payee is not set (#7632)

* Ensure income categories are shown correctly, even if payee is not set

* Add release note

* Generate release notes for v26.5.0

* increase test coverage for budget templates (#7620)

* [AI] cover existing template engine logic with regression tests

Adds tests for goal template behavior that predates this PR so the
suite can be cherry-picked onto master to confirm no regressions. No
production code changes.

Covers:
- init() validation: schedule names, by/schedule priority match, past
  by-target with and without annual/repeat, percentage source not
  found, special source aliases, duplicate limit/spend/goal
  directives, weekly limit missing start date, invalid limit period,
  unrecognized periodic period
- runRemainder cap clamping and hideDecimal fraction removal
- Income-category branch in runTemplatesForPriority
- getLimitExcess against an aggregate weekly cap
- Past by-target rolling forward via the annual period
- runSchedule full=true (no sinking accumulation), percent and fixed
  adjustments, completed-schedule filtering, past-date error for
  non-repeating schedules, monthly/weekly/daily sinking contribution
  branches when interval exceeds the pay-month-of cap, surplus
  absorption when last-month balance exceeds the target, and
  tracking-budget mode forcing all schedules pay-month-of
- applyMultipleCategoryTemplates orchestration: per-category writes,
  cross-category priority clamping when funds run out, error
  notification path
- applyTemplate force=false skipping already-budgeted categories

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* note

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix infinite loop when remainder is impossible to solve (#7623)

* fix infinite loop when remainder is impossible to solve

* note

* Generate release notes for v26.5.0

* Update author

Updated author information in the release notes.

* Fix shared worker resumption after tab suspend (#7656)

* [AI] Fix SharedWorker tab resume recovery

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>

* [AI] Fix SharedWorker reload readiness

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>

* Add release notes

* Update packages/desktop-client/src/shared-browser-server-core.ts

Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>

---------

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>

* Update docs release date

* Empty commit to bump CI

* Generate release notes for v26.5.0

* Revert "Generate release notes for v26.5.0"

This reverts commit b42c48bed5.

---------

Co-authored-by: github-merge-queue <118344674+github-merge-queue@users.noreply.github.com>
Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Emil Tveden Bjerglund <emilbp@gmail.com>
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: Julian Dominguez-Schatz <julian.dominguezschatz@gmail.com>
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
2026-05-03 17:41:17 +00:00
Matt Fiddaman
6298f6a324 improve experimental budget automation UI (#7597)
* [AI] initial

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] review pass 1

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] review pass 2

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] review pass 3

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] review pass 4

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* remove dev localstorage gate

* [AI] block migration for #goal and #cleanup directives

Detect goal templates (`type: 'goal'`) and `#cleanup` lines in the
category notes before rendering the editor. When either is present,
show an unsupported-directive notice instead of the modal body so the
migration helper never runs and the Save flow can't silently overwrite
`template_settings.source` to `'ui'` (which would stop the engine from
reading the notes for those directives).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* note

* [autofix.ci] apply automated fixes

* [AI] don't synthesise refill for monthly+limit simple templates

A `#template 50 up to 200` line parses to a simple template with both
monthly and limit set. The migration helper previously expanded that
into limit + refill + periodic, but the engine's runSimple just budgets
the monthly amount and clamps to the cap — there is no implicit refill
to undo.

Only synthesise the refill for the limit-only form
(`#template up to 200`), where runSimple's fallback returns
limitAmount - fromLastMonth. Update the migration test to match.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] coderabbit fixes

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] coderabbit fixes v2

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] attribute schedule batch contributions per template

Replace the equal-split fallback with each schedule's actual monthly
contribution (computed inside runSchedule) so the per-row UI projection
reflects real cost rather than total / count.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] coderabbit v3

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] coderabbit v4

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] fix unclearable integer inputs in automation editors

GenericInput type=number fired onChange on every keystroke and the
dispatcher coerced empty/0 back to 1, so the field snapped back before
the user could retype. Switch to a local string state that commits and
clamps on blur, with min/step constraints on the native input.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] coderabbit v5

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Update packages/desktop-client/src/components/budget/goals/templateHelpers.tsx

Co-authored-by: Julian Dominguez-Schatz <julian.dominguezschatz@gmail.com>

* Update packages/desktop-client/src/components/budget/goals/templateHelpers.tsx

Co-authored-by: Julian Dominguez-Schatz <julian.dominguezschatz@gmail.com>

* [autofix.ci] apply automated fixes

* remove comment

* s/rule/automation

* update note wording

* tweak hold checkbox wording

* deduplicate translation strings

* rename buildPresetSeeds

* deduplicate example text

* update wording about how easy automations are

* s/P/Priority

* reuse week template

* replace week displayType with fixed

* lodash debounce

* [autofix.ci] apply automated fixes

* more graceful error handling

* change default priority to 1

* align ui and text language for available funds & all income

* fix tests

* coderabbit v{lost_count}

* more coderabbit

* extract isValidYearMonth

* extract automationExamples

* split out into multiple files

* split down templateHelpers

* [AI] cover PR additions to template engine

Tests targeting code introduced in this PR:
- per-template attribution map (perTemplateContribution) on
  CategoryTemplateContext for sibling periodic templates, batched `by`
  templates, limit-clamped totals, and remainder weights
- runBy now returns { toBudget, perTemplateNeed }
- runSchedule now returns perScheduleMonthly keyed by trimmed name,
  including a multi-schedule pay-month-of + sinking-fund mix
- dryRunCategoryTemplate end-to-end through computeTemplates, in both
  narrow-scope and wide-scope (remainder / available funds) branches
- checkPercentage now accepts category ids (UI form), not just names

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix repeat every month error

* coderabbit again.

* Update packages/desktop-client/src/components/budget/goals/automationMessages.tsx

Co-authored-by: Julian Dominguez-Schatz <julian.dominguezschatz@gmail.com>

* Update packages/desktop-client/src/components/budget/goals/automationMessages.tsx

Co-authored-by: Julian Dominguez-Schatz <julian.dominguezschatz@gmail.com>

* [autofix.ci] apply automated fixes

* plural aware translation strings

* move + out of translation string

* fix types

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Julian Dominguez-Schatz <julian.dominguezschatz@gmail.com>
2026-05-01 10:33:02 +00:00
Matiss Janis Aboltins
1afe7c9a1e Enable auto enrichment in coderabbit configuration (#7664)
* Enable auto enrichment in coderabbit configuration

* Add release notes for PR #7664

* Change category to Maintenance and update description

Updated the category from 'Features' to 'Maintenance' and revised the description.

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-04-30 17:02:52 +00:00
Julian Dominguez-Schatz
24279264da Fix shared worker resumption after tab suspend (#7656)
* [AI] Fix SharedWorker tab resume recovery

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>

* [AI] Fix SharedWorker reload readiness

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>

* Add release notes

* Update packages/desktop-client/src/shared-browser-server-core.ts

Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>

---------

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>
2026-04-30 07:37:02 +00:00
Matiss Janis Aboltins
4a5ee9c2dc [AI] Make TypeScript work in test files across packages (#7642)
* [AI] Make TypeScript work in test files across packages

Match the CRDT package's tsconfig pattern in loot-core, desktop-client,
api, and desktop-electron so test files participate in the project graph
(IDE intellisense, project-wide typecheck) while production builds still
emit clean declaration files.

- Remove test-file exclusions from each package's main tsconfig
- Add tsconfig.build.json for loot-core and api with test exclusions,
  used by the build scripts
- Add e2e/tsconfig.json for desktop-client and desktop-electron with
  Playwright types
- Fix latent type errors in test files now caught by typecheck
- Disable typescript/unbound-method for test files (mock matcher pattern)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Release notes

* [AI] Address review feedback on test type fixes

- goal-template.test.ts: extract amounts to typed locals so single-value
  assertions no longer compare against unknown
- category-template-context.test.ts: replace `as unknown as DbCategory`
  double-cast with a fully-typed object using `satisfies DbCategory`
  (the previous mock had `is_income: true` which doesn't match the
  `1 | 0` shape the cast was hiding)
- api/tsconfig.build.json: broaden test exclude pattern to `**/*.test.ts`

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Widen toMatchThemeScreenshots matcher to accept Page

The matcher's runtime impl already handled both Page and Locator
(via `typeof locator.page === 'function'` branch), but the type only
declared Locator. Call sites pass a Page (`expect(page).toMatchThemeScreenshots()`),
which now compiles cleanly.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Type window.Actual in e2e fixtures and refactor matcher

- Pull loot-core/typings/window.ts into the e2e tsconfig include so
  the ambient `window.Actual` augmentation is visible.
- Refactor toMatchThemeScreenshots to derive a Page once via
  `'page' in target`, then call evaluate on the page consistently.
  The previous union-typed access (locator.evaluate, locator.page)
  didn't typecheck on Locator | Page.
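
  The narrowing trick can be sketched with stand-in types (FakePage and FakeLocator are assumptions for illustration, not Playwright's real interfaces):

  ```typescript
  type FakePage = { evaluate: (fn: () => unknown) => unknown };
  type FakeLocator = { page: () => FakePage };

  // A Locator carries a page() method; a Page has no `page` property,
  // so the `in` check narrows the union cleanly.
  function toPage(target: FakePage | FakeLocator): FakePage {
    return 'page' in target ? target.page() : target;
  }

  const page: FakePage = { evaluate: fn => fn() };
  const locator: FakeLocator = { page: () => page };
  console.log(toPage(page) === page);    // true
  console.log(toPage(locator) === page); // true
  ```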

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Drop tsconfig.build.json for loot-core and api

The build-config indirection was incomplete protection: typecheck
(`tsgo -b` against the main tsconfig, which now includes test files)
already emits `*.test.d.ts` into `@types/`, and the build step does
not clean before re-emitting. The same is observable in crdt's
`dist/`, which currently contains test declarations on disk.

What actually keeps test declarations out of the npm tarball is the
`files` field in package.json — and loot-core already uses that
mechanism for source files (`!src/**/*.test.ts`). Extending the same
pattern to `@types/` is more direct than maintaining a duplicate
tsconfig that doesn't reliably do its job.

- Delete loot-core/tsconfig.build.json; revert build to `tsgo -b`;
  add `!@types/**/*.test.d.ts*`, `!@types/**/__tests__/**`,
  `!@types/**/__mocks__/**` to `files`.
- Delete api/tsconfig.build.json; revert build to
  `vite build && tsgo --emitDeclarationOnly`; add
  `!@types/**/*.test.d.ts*` to `files`.

Verified: `yarn pack --dry-run` excludes all test declarations from
both packages while production declarations still pack (428 .d.ts
files for loot-core, methods.d.ts for api).
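
A hedged sketch of the `files`-field mechanism described above (patterns paraphrased from this message; the full package.json contents are assumed):

```json
{
  "files": [
    "@types",
    "!@types/**/*.test.d.ts*",
    "!@types/**/__tests__/**",
    "!@types/**/__mocks__/**"
  ]
}
```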

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Standardise crdt to drop tsconfig.build.json

Apply the same simplification as loot-core and api: a single tsconfig
per package, with `files`-field negations preventing test
declarations from being published.

Note: this also fixes a pre-existing issue where crdt was shipping
`crdt/timestamp.test.d.ts` and `crdt/merkle.test.d.ts` to npm. The
old `tsconfig.build.json` excluded test files from declaration emit,
but the `typecheck` script (`tsgo -b` via the main tsconfig) had
already emitted them into `dist/` and the build did not clean
first, so they were packed via `"files": ["dist"]`.

After this change, `yarn pack --dry-run` packs only production
declarations (10 .d.ts files) and excludes the test ones.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 17:06:58 +00:00
arjunrawal1
a8eb204ce7 Fix release notes link in PR template (#7655)
* Fix release notes link in PR template

* Add release note
2026-04-29 16:39:10 +00:00
rboadu
f68e4fbb2a Updated API Documentation for Splitting Transactions (#7593)
* Updated API Documentation for Splitting Transactions

* Update check-spelling metadata

* Updated category name in example

* Update packages/docs/docs/api/reference.md

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Fixed documentation to CodeRabbits feedback

* Update packages/docs/docs/api/reference.md

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Updated wording of reference document

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: youngcw <calebyoung94@gmail.com>
2026-04-29 16:28:14 +00:00
Matiss Janis Aboltins
dd3b1144d1 [AI] Mobile: iconography, placeholders, and dropdown chevrons for transaction form (#7639)
* [AI] Add iconography, placeholders, and dropdown chevrons to mobile transaction form

Brings the mobile new-transaction screen in line with the design mockup so
each field shows a leading icon, dropdown affordances surface on the right,
and empty pickers display a placeholder hint.

- Extend TapField with optional icon/placeholder props (placeholder uses
  formInputTextPlaceholder when value is empty).
- Extend InputField with an optional icon prop that wraps the input in a
  bordered row when present.
- Wire SvgUser/SvgTag/SvgWallet/SvgCalendar/SvgNotesPaper into Payee,
  Category, Account, Date, and Notes for both the main form and split
  child rows; add a SvgCheveronDown rightContent on Category and Account.

* Add release notes for PR #7639

* [AI] Forward consumer style to inner Input in icon path of InputField

CodeRabbit flagged that consumer-supplied `style` was being applied to the
wrapper View in the icon branch, so input-targeted properties like
`appearance: 'none'` and `minWidth: '150px'` on the Date field never reached
the underlying `<input type="date">`. Move the style spread onto the inner
Input so those styles take effect.

* [AI] Add dropdown chevron to mobile Payee field

Show the chevron on the Payee TapField in both the main form (as the
fallback when neither the Save-location nor Nearby-payee button applies)
and in split child rows, matching Category and Account.

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7639

* [AI] Hide native date picker icon on mobile transaction form

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7639

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-04-29 07:56:05 +00:00
Matiss Janis Aboltins
ff0f5bdb35 [AI] lage - move browser build to using lage (#7602)
* Simplify desktop client browser build

* [AI] Move browser build orchestration into vite config and lage

Moves loot-core worker build, public/ staging (migrations, default-db,
sql-wasm, data-file-index), and build-stats wiring from the deleted
packages/desktop-client/bin/build-browser shell script into a
lootCoreBackend vite plugin in packages/desktop-client/vite.config.mts.

Adds a build:browser target to lage.config.js so bin/package-browser
runs as a single `lage build:browser --to=@actual-app/web` call, with
crdt + loot-core built via lage's ^build dependency before the
desktop-client build.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Refactor e2e-test workflow and update desktop-client configurations

* [AI] Move plugins-service staging into desktop-client vite config

Declares plugins-service as a workspace devDependency of @actual-app/web
so lage's ^build edge picks it up automatically in the build:browser
pipeline, and moves the cross-package file staging (production copy +
dev serving) into vite.config.mts, mirroring the lootCoreBackend
pattern. Drops the plugins-service shell wrapper script and simplifies
its package.json scripts to invoke vite build directly. Updates root
start:browser to run plugins-service watch in parallel with the dev
server instead of pre-building once.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Sync tsconfig project references for plugins-service edge

Follow-up to the plugins-service workspace edge: adds the
../plugins-service project reference in packages/desktop-client/tsconfig.json
via yarn sync:tsconfig-references.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* Release notes

* [AI] Ignore .venv/ so lage's git hasher skips Electron CI's Python venv

Electron CI provisions a Python virtualenv at the repo root for
setuptools. With browser builds now routed through lage, lage's
git hash-object pass walks untracked-not-ignored files and fails on
the venv's broken lib64 symlink ("fatal: Unable to hash .../.venv/lib64").

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Bake Weblate translations back into VRT/e2e bundle

build-web set download-translations: false and relied on bin/package-browser's
ad-hoc git clone + git pull. That path is fragile inside the playwright
container, so vite's import.meta.glob('/locale/*.json') frequently produced an
empty languages map and the bundle shipped with no en.json. VRTs then rendered
source-code English and diffed against snapshots authored from Weblate strings.

Route translation provisioning back through actions/checkout (download-translations: true)
in build-web and vrt-update-generate, and add --skip-translations to bin/package-browser
(mirroring bin/package-electron) so the in-script git pull is bypassed when CI
has already staged the locale dir.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Skip translation cloning in build-web bundle for VRT determinism

bin/package-browser used to unconditionally clone actualbudget/translations
before vite ran, baking Weblate en.json into the build artifact. With the
e2e-test pipeline now serving that artifact via serve-build.mjs, VRT
screenshots ended up rendering Weblate strings — drifting from the snapshots,
which were authored against source-code English (master VRTs ran on vite dev
without a locale dir).

Pass --skip-translations to bin/package-browser from build-web so the bundle
ships with no locale chunks. download-translations stays 'false' across the
e2e-test and vrt-update-generate workflows, matching the prior behavior.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 20:20:13 +00:00
Rudra Sarker
11ce29e7fd fix: disable Inter font contextual alternates to prevent x→× substitution (#7375)
* fix: disable contextual alternates in Inter font to prevent unwanted character substitution

Fixes #6351

The Inter font calt feature replaces x between digits with a
multiplication sign. Disable it while preserving ss01 and ss04.
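
A minimal sketch of the resulting `font-feature-settings` value (the exact CSS shipped may differ; the feature tags are from the commit message):

```typescript
// Build the font-feature-settings string: contextual alternates (calt) off,
// stylistic sets ss01/ss04 preserved. With calt on, Inter renders an "x"
// between digits (e.g. "2x4") as a multiplication sign.
function interFontFeatures(): string {
  const features: Array<[string, number]> = [
    ['calt', 0], // contextual alternates disabled
    ['ss01', 1], // stylistic set 1 kept
    ['ss04', 1], // stylistic set 4 kept
  ];
  return features.map(([tag, value]) => `'${tag}' ${value}`).join(', ');
}

// Usable as e.g. style={{ fontFeatureSettings: interFontFeatures() }}
console.log(interFontFeatures()); // 'calt' 0, 'ss01' 1, 'ss04' 1
```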

* [autofix.ci] apply automated fixes

* Add release notes for PR #7375

* [autofix.ci] apply automated fixes

* Fix release notes category casing

* Add authors field to release notes

* Update upcoming-release-notes/7375.md

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>
2026-04-28 10:56:20 +00:00
Trevin Chow
d58c9a9a07 fix(transactions): show 'Date is a required field' error when adding without date (#7511)
* fix: show consistent 'Date is a required field' error in add transaction

When adding a new transaction with a missing date, the UI surfaced the
generic 'Something internally went wrong' error instead of a specific
message, while missing account showed 'Account is a required field'.
Add an analogous date validation mirroring the existing account check,
so both required-field errors are consistent.

Fixes #7424

* fix: prevent empty date from reaching server when clearing date input

When adding a transaction, clearing the date field and clicking Add
caused DateSelect's blur handler to save an empty string as the date
via onSelect(''). The server rejected '' as an invalid date, emitting
a generic "Something internally went wrong" error.

The previous fix (adding a date check in the shouldAdd guard) didn't
work because the cell save fires on blur BEFORE shouldAdd runs — by
the time shouldAdd checks the date, it's either already sent to the
server or reverted by DateSelect's clearOnBlur logic.

Fix: change DateSelect's empty-input blur path to restore the previous
valid date instead of propagating ''. This prevents the invalid value
from ever reaching the cell save handler or the server.

Verified locally: clear date → click Add → date restores to previous
value and transaction saves normally. No generic error.
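
The blur-path change can be sketched as follows (function and parameter names are illustrative, not the actual DateSelect internals):

```typescript
// Sketch of the empty-input blur fix: restore the previous valid date
// instead of propagating '' to the save handler.
function handleDateBlur(
  inputValue: string,
  lastValidDate: string,
  onSelect: (date: string) => void,
): string {
  if (inputValue.trim() === '') {
    // Previously onSelect('') fired here; the server rejected '' and
    // surfaced the generic error. Now the invalid value never escapes.
    return lastValidDate;
  }
  onSelect(inputValue);
  return inputValue;
}

console.log(handleDateBlur('', '2026-04-28', () => {})); // '2026-04-28'
```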

Fixes #7424
2026-04-28 10:50:54 +00:00
Matiss Janis Aboltins
598e3ec9d8 [AI] Keep mobile transaction amount field at a consistent height when empty (#7638)
* [AI] Keep mobile transaction amount field at a consistent height when empty

When the user backspaced every digit while editing the amount on the
mobile transaction form, the inner <Text> span had no content and
collapsed the pill to zero height. Fall back to a non-breaking space so
the line box keeps its natural font-driven height even when the input
is momentarily empty.

* Add release notes for PR #7638

* [AI] Use amountToCurrency(0) instead of nbsp when amount input is empty

Mirrors the unfocused-zero display rather than rendering a blank pill,
and respects the user's hideFraction pref and locale separators
through amountToCurrency.
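
A sketch of the fallback, with `amountToCurrency` stubbed (the real helper honors the hideFraction pref and locale separators):

```typescript
// Stub standing in for the real amountToCurrency helper.
function amountToCurrency(n: number, hideFraction = false): string {
  return hideFraction ? String(n) : n.toFixed(2);
}

// Empty input falls back to a formatted zero, mirroring the unfocused-zero
// display, so the rendered text is never empty and the pill keeps its
// font-driven height.
function displayAmount(raw: string): string {
  return raw !== '' ? raw : amountToCurrency(0);
}

console.log(displayAmount(''));   // '0.00'
console.log(displayAmount('12')); // '12'
```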

* Simplify description of mobile transaction field height

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-04-28 07:31:57 +00:00
Matt Fiddaman
c2987af64f fix lint (#7643) 2026-04-27 21:24:16 +00:00
Matt Fiddaman
c7d39961cf fix cherrypicked commits not being respected and lint race in release note generation workflow (#7640)
* fix cherrypicked commits not being respected and lint race

* note

* coderabbit suggestions

* fix lint

* make double restore possibility safe
2026-04-27 20:52:07 +00:00
Matt Fiddaman
a42b7c5777 fix release note generation script (#7635)
* fix release note generation script

* note
2026-04-27 19:32:24 +00:00
Emil Tveden Bjerglund
33af9bf906 Fix Sankey income bug when payee is not set (#7632)
* Ensure income categories are shown correctly, even if payee is not set

* Add release note
2026-04-26 19:23:07 +00:00
Matt Fiddaman
46687da7a8 fix infinite loop when remainder is impossible to solve (#7623)
* fix infinite loop when remainder is impossible to solve

* note
2026-04-26 04:20:49 +00:00
Matt Fiddaman
3928d5b2a8 increase test coverage for budget templates (#7620)
* [AI] cover existing template engine logic with regression tests

Adds tests for goal template behavior that predates this PR so the
suite can be cherry-picked onto master to confirm no regressions. No
production code changes.

Covers:
- init() validation: schedule names, by/schedule priority match, past
  by-target with and without annual/repeat, percentage source not
  found, special source aliases, duplicate limit/spend/goal
  directives, weekly limit missing start date, invalid limit period,
  unrecognized periodic period
- runRemainder cap clamping and hideDecimal fraction removal
- Income-category branch in runTemplatesForPriority
- getLimitExcess against an aggregate weekly cap
- Past by-target rolling forward via the annual period
- runSchedule full=true (no sinking accumulation), percent and fixed
  adjustments, completed-schedule filtering, past-date error for
  non-repeating schedules, monthly/weekly/daily sinking contribution
  branches when interval exceeds the pay-month-of cap, surplus
  absorption when last-month balance exceeds the target, and
  tracking-budget mode forcing all schedules pay-month-of
- applyMultipleCategoryTemplates orchestration: per-category writes,
  cross-category priority clamping when funds run out, error
  notification path
- applyTemplate force=false skipping already-budgeted categories

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* note

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-25 23:35:32 +00:00
Matt Fiddaman
8b29ee40a7 show all blog posts on docs site (#7622) 2026-04-25 23:20:40 +00:00
Matiss Janis Aboltins
9acbd6388b [AI] Cache the CLI's local budget between invocations (#7539)
* [AI] Cache the CLI's local budget between invocations

Every `actual <cmd>` call currently delta-syncs the budget with the
sync server via `api.downloadBudget`, which hits the server's
500-req/min rate limit on scripted workflows. Actual is local-first:
once the budget is on disk, most read commands do not need fresh
server data.

Introduce a CLI-only cache layer inside `withConnection` that
decides per invocation whether to skip, sync, or re-download:

- Cache state lives at `{dataDir}/.actual-cli/{syncId}/state.json`,
  keyed by `syncId` to avoid the chicken-and-egg of not knowing the
  on-disk `budgetId` before the first download. The on-disk id is
  resolved via `api.getBudgets()` and persisted after first download.
- Read commands (list, balance, query run, …) skip the `/sync`
  call while `now - lastSyncedAt < cacheTtl`. Write commands
  (create, update, delete, set-*, etc.) sync before and after the
  operation to keep server state consistent.
- Encrypted budgets force a sync per call since `api/load-budget`
  does not re-verify the password.
- New `proper-lockfile`-backed shared/exclusive lock serializes
  writes while allowing parallel reads. Reader markers live in
  `{meta}/readers/`; writers sweep stale markers by PID.

New `actual sync` command with three modes: default (sync now),
`--status` (print cache age, TTL, stale flag), `--clear` (delete
cache, holding the exclusive lock to avoid racing writers).

New config surface, following the existing flag → env → config file
→ default precedence chain:

- `--cache-ttl <s>` / `ACTUAL_CACHE_TTL` / `cacheTtl` (default 60)
- `--refresh` / `--no-cache`
- `--lock-timeout <s>` / `ACTUAL_LOCK_TIMEOUT` / `lockTimeout` (10)
- `--no-lock` / `ACTUAL_NO_LOCK` / `noLock`

Every `withConnection` call site now passes an explicit
`{ mutates: boolean, skipBudget?: boolean }` so read/write intent is
visible at the edge.

The old `budgets sync` subcommand is removed — it silently diverged
from the new top-level `actual sync`.
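
The skip/sync decision described above can be sketched roughly like this (an illustrative model, not the actual `SyncDecision` union or CLI code):

```typescript
type SyncDecision = 'skip' | 'sync';

// Per-invocation decision: writes, encrypted budgets, and --refresh always
// sync; reads skip while the cache is within its TTL; a lastSyncedAt in the
// future (clock skew) is treated as stale.
function decideSyncAction(opts: {
  mutates: boolean;
  encrypted: boolean;
  refresh: boolean;
  lastSyncedAt: number | null; // epoch ms, null before first download
  cacheTtlSeconds: number; // default 60
  now: number;
}): SyncDecision {
  if (opts.mutates || opts.encrypted || opts.refresh) return 'sync';
  if (opts.lastSyncedAt === null) return 'sync';
  const age = opts.now - opts.lastSyncedAt;
  if (age < 0) return 'sync'; // clock skew
  return age < opts.cacheTtlSeconds * 1000 ? 'skip' : 'sync';
}
```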

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Simplify CLI cache/lock internals

Use a discriminated SyncDecision union so connection.ts no longer needs
non-null assertions on the cached state. Thread the resolved CliConfig
through withConnection's callback to drop duplicate resolveConfig calls
in the sync and budgets commands. Extract an errorCode helper and
replace the existsSync+readdirSync TOCTOU pattern in the reader-wait
polling loop with a single readdir that tolerates ENOENT.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Address PR #7539 review comments

- cache.ts: use a unique per-writer tmp filename so concurrent shared-
  lock writers (encrypted budgets, --refresh, stale TTL) don't clobber
  each other's publish and silently drop state updates.

- index.ts/config.ts: fix --no-cache and --no-lock flags. Commander
  stores --no-foo under the positive key (cache/lock) and an explicit
  false default makes the flag a no-op; the previous code also read
  the wrong keys (noCache/noLock) so the flags had no effect at all.
  Derive refresh/noLock from the correct keys.

- budgets.ts: invert encryption password precedence so the subcommand
  flag (--encryption-password) wins over env/config-file values.

- sync.ts: report stale=true in --status when lastSyncedAt is in the
  future, matching decideSyncAction's clock-skew handling.

- connection.ts: drop unnecessary `as` cast on api.getBudgets() now
  that the return type is Promise<APIFileEntity[]>.

- utils.ts: parseBoolEnv throws on unrecognized values instead of
  silently returning undefined so typos like ACTUAL_NO_LOCK=yes fail
  loudly.
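
The strict env parser might look like this (accepted spellings are an assumption; only the throw-on-unrecognized behavior is from the commit message):

```typescript
// Parse a boolean env var strictly: unset/empty means "not configured",
// recognized spellings map to true/false, anything else throws so typos
// like ACTUAL_NO_LOCK=yes fail loudly instead of being silently ignored.
function parseBoolEnv(name: string, raw: string | undefined): boolean | undefined {
  if (raw === undefined || raw === '') return undefined;
  const v = raw.toLowerCase();
  if (v === 'true' || v === '1') return true;
  if (v === 'false' || v === '0') return false;
  throw new Error(`Unrecognized boolean value for ${name}: "${raw}"`);
}
```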

- Shorten 7539 release note to a single user-facing sentence.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-25 21:58:18 +00:00
Alec Bakholdin
77f0a3e58b Update documentation for transaction import behavior (#7614)
* Update documentation for transaction import behavior

Updated docs per the change in #7610

* [autofix.ci] apply automated fixes

* release notes

* removed patch notes

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-04-25 21:44:38 +00:00
Matt Fiddaman
eb922fd191 sankey card should follow report settings (#7619)
* sankey card should follow report rules

* note
2026-04-25 16:17:45 +00:00
Julian Dominguez-Schatz
2b584e1ad0 Make double ctrl-f trigger browser-native search (#7605)
* Make double Ctrl-f trigger browser find

* Add release notes

* Invert condition

* Rename release note
2026-04-25 00:53:25 +00:00
dependabot[bot]
4f1bc3fcdd Bump postcss from 8.5.8 to 8.5.10 (#7613)
* Bump postcss from 8.5.8 to 8.5.10

Bumps [postcss](https://github.com/postcss/postcss) from 8.5.8 to 8.5.10.
- [Release notes](https://github.com/postcss/postcss/releases)
- [Changelog](https://github.com/postcss/postcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/postcss/postcss/compare/8.5.8...8.5.10)

---
updated-dependencies:
- dependency-name: postcss
  dependency-version: 8.5.10
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>

* Add release notes for postcss version bump

Updated postcss version for maintenance.

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Julian Dominguez-Schatz <julian.dominguezschatz@gmail.com>
2026-04-25 00:23:54 +00:00
Julian Dominguez-Schatz
ea50db524b Enable stricter electron build options (#7609)
* Enable stricter electron build options

* Add release notes

* Fix build signatures when no signing credentials are provided

* Attempt fix again
2026-04-24 23:59:28 +00:00
Julian Dominguez-Schatz
da1d0a94b9 Migrate file service to TypeScript (#7606)
* Migrate file service to TypeScript

* Add release notes

* Rabbit

* Stricter types
2026-04-24 22:50:43 +00:00
Alec Bakholdin
daab7f737e Import Transactions - Persist and set reimport deleted transactions config to true (#7610)
* fix: restored default functionality from v26.3.0 for reimport deleted transactions. import-reimport-deleted is now a synced pref that persists between imports

* release notes

* release note update

---------

Co-authored-by: Alec Bakholdin <alecbakholdin.com>
2026-04-24 18:21:45 +00:00
Emil Tveden Bjerglund
686f10247d Enhance Sankey chart datamodel, show income and allow layer filtering (#7582)
* Refactor to use directed, weighted graph as datamodel

* Fix percentage labels

* Reimplement sorting and topN handling

* Fix typing. Show toBudget on graph.

* Implement better DAG model

* Fix Other-grouping with new datamodel

* Add global sorting

* Reorder spreadsheet code for clarity

* Add percentageLabels back

* Fix all sorting modes

* Better color handling

* Handle if overbudgeted

* Fix filtering issue related to hidden nodes for Spent report

* Implement enums for special names

* Linting and typechecking

* Add layer selectors

* Trim SankeyCard

* Fix issue with empty nodes making the graph unreadable

* Add release note

* Update release note

* Reorder code

* Address coderabbit comments

* Ensure that layer-from and layer-to cannot be equal

* Update layer selectors to match selected view mode

* Fix wrong graph object reference

* Cap regex length

* Fixed wrong layer assignment for budget income categories

* Make translation not optional in createSpreadsheet

* Use predefined suffix for 'Other'

* Avoid invalid layer selection for Budgeted

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7582

* Import translation in spreadsheet, instead of passing as argument

* Remove all non-null assertions and handle safely

* Fix most uses of 'as'

* Fix issues hiding Other categories and giving wrong toBudget value

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-04-24 18:20:15 +00:00
Julian Dominguez-Schatz
227c995155 Disallow reconfiguring OpenID after initialization (#7608)
* Disallow reconfiguring OpenID after initialization

* Add release notes
2026-04-24 15:12:01 +00:00
dependabot[bot]
c8224d24be Bump @xmldom/xmldom from 0.8.12 to 0.8.13 (#7596)
Bumps [@xmldom/xmldom](https://github.com/xmldom/xmldom) from 0.8.12 to 0.8.13.
- [Release notes](https://github.com/xmldom/xmldom/releases)
- [Changelog](https://github.com/xmldom/xmldom/blob/master/CHANGELOG.md)
- [Commits](https://github.com/xmldom/xmldom/compare/0.8.12...0.8.13)

---
updated-dependencies:
- dependency-name: "@xmldom/xmldom"
  dependency-version: 0.8.13
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-23 20:06:30 +00:00
Dakyne
29a06b23ea Add Gruvbox Light and Dark themes to custom theme catalog (#7571)
* Add Gruvbox Light and Dark custom themes to catalog

* Add release note for PR #7571

* [autofix.ci] apply automated fixes

* Update packages/desktop-client/src/data/customThemeCatalog.json

Co-authored-by: Joel Jeremy Marquez <joeljeremy.marquez@gmail.com>

---------

Co-authored-by: Dakyne <fawn_salable_73@icloud.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Joel Jeremy Marquez <joeljeremy.marquez@gmail.com>
2026-04-23 14:25:05 +00:00
Matiss Janis Aboltins
aeb28d3b87 Add Discord notification for nightly theme catalog scan failures (#7595)
* [AI] Notify Discord when nightly theme catalog scan fails

Adds an if: failure() step to the validate-theme-catalog job that posts a
minimal alert to the DISCORD_WEBHOOK_URL webhook with a link back to the
failing workflow run. Fires on both theme validation failures (script exits
1) and earlier step failures (checkout/setup), so infrastructure breakage
is also surfaced. nofail: true keeps a Discord outage from cascading into
a red job.

* [AI] Drop setup comment from Discord notify step

* [AI] Move Discord notify to its own job gated by an environment

Splits the notify step into a separate notify-failure job that depends on
validate-theme-catalog and runs only on failure. The new job binds to the
nightly-alerts GitHub Environment so the DISCORD_WEBHOOK_URL secret is
scoped to a dedicated environment rather than inherited at the repo level
(zizmor secrets-without-environment).

* [AI] Add release notes for 7595

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-04-23 06:58:21 +00:00
dependabot[bot]
bd1da27404 Bump dompurify from 3.3.2 to 3.4.1 (#7591)
Bumps [dompurify](https://github.com/cure53/DOMPurify) from 3.3.2 to 3.4.1.
- [Release notes](https://github.com/cure53/DOMPurify/releases)
- [Commits](https://github.com/cure53/DOMPurify/compare/3.3.2...3.4.1)

---
updated-dependencies:
- dependency-name: dompurify
  dependency-version: 3.4.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>
2026-04-22 20:51:12 +00:00
Matiss Janis Aboltins
7501674613 [AI] Fix Docker build for workspace:* dependencies (#7564)
* [AI] Fix Docker build for workspace:* dependencies

Since @actual-app/crdt became a workspace:* dep, `yarn workspaces focus
--production` creates relative symlinks in node_modules that dangle when
only node_modules is copied into the prod image, breaking local Docker
builds with ERR_MODULE_NOT_FOUND: @actual-app/crdt.

Dereference yarn's workspace symlinks in the builder stage with `cp -RL`
so the prod stage can copy a self-contained node_modules without needing
to enumerate which workspace:* deps exist. Adding a new workspace:* dep
now requires zero Dockerfile changes.

Also move the sync-server .dockerignore to the repo root (and drop stray
local node_modules / .git / .yarn caches from the build context), since
docker builds use the repo root as context — the old sync-server-level
file was no longer being applied.

Fixes #7561.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Strip dev-only dirs from dereferenced workspace packages

The generic `cp -RL` step copies full workspace package trees into the
image (src/, e2e/, tests, build-stats, etc.). Remove them after the
dereference — they're not needed at runtime, and skipping them recovers
~67MB from the final image on both alpine and ubuntu variants.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Rephrase 7564 release note to be user-facing

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-22 19:20:51 +00:00
Jaime R Calzada
c3717e7036 [AI] Add per-schedule custom upcoming length override (#7434)
* [AI] Add per-schedule custom upcoming length override

* [AI] Add release notes for PR #7434

* [AI] Add custom length option to per-schedule upcoming length selector

* [AI] Deduplicate preset values and guard against malformed custom upcoming length

* [autofix.ci] apply automated fixes

* [AI] Retrigger CI (flaky accounts E2E test)

* [AI] Improve schedule editor layout for upcoming length field

Restructure the custom upcoming length from a checkbox + conditional
dropdown into a proper FormField with a single Select that includes
"Use global default" as the null option. Place it in a responsive
side-by-side row with the auto-post checkbox, consistent with the
form's existing layout patterns.

* [AI] Address CodeRabbit review feedback

Tighten custom upcoming length validation with a proper regex
instead of a loose hyphen check. Replace fixed height with minHeight
on the auto-post checkbox row to avoid clipping translated labels.

* Update VRT screenshots

Auto-generated by VRT workflow

PR: #7434

* [AI] Fix custom_upcoming_length not persisting on mobile

The mobile schedule save handler was missing the custom_upcoming_length
field from the payload sent to schedule/create and schedule/update.

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: youngcw <calebyoung94@gmail.com>
2026-04-22 16:33:21 +00:00
Matt Fiddaman
9d91da77ec move from tsgo dev preview to beta (#7587)
* move from tsgo dev preview to beta

* note
2026-04-22 15:22:14 +00:00
Copilot
1c97388654 [AI] Consolidate npm release and nightly publishing into one workflow (#7583)
* [AI] Unify npm release and nightly publish workflows

Agent-Logs-Url: https://github.com/actualbudget/actual/sessions/3f8de051-a9a7-4527-88d8-5c44bc06a562

Co-authored-by: jfdoming <9922514+jfdoming@users.noreply.github.com>

* [AI] Harden unified npm publish workflow conditionals

Agent-Logs-Url: https://github.com/actualbudget/actual/sessions/3f8de051-a9a7-4527-88d8-5c44bc06a562

Co-authored-by: jfdoming <9922514+jfdoming@users.noreply.github.com>

* [AI] Clarify nightly install step and add concise release note

Agent-Logs-Url: https://github.com/actualbudget/actual/sessions/af3d68aa-d217-47be-addb-1b40b08f533b

Co-authored-by: jfdoming <9922514+jfdoming@users.noreply.github.com>

* [AI] Revert release note edit and make npm publish workflow ACT-compatible (#7584)

* Initial plan

* [AI] Revert release note edit and validate workflow with act

Agent-Logs-Url: https://github.com/actualbudget/actual/sessions/df98a192-197a-4df4-a804-80b69116f742

Co-authored-by: jfdoming <9922514+jfdoming@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: jfdoming <9922514+jfdoming@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: jfdoming <9922514+jfdoming@users.noreply.github.com>
2026-04-22 15:18:29 +00:00
Julian Dominguez-Schatz
8323a7d27c Reduce permissions in stale workflow (#7555)
* Restrict permissions on stale workflow

* Add release notes for reducing permissions in stale workflow
2026-04-22 15:17:35 +00:00
Matiss Janis Aboltins
7d4e28041c [AI] Export API models as separate entry point (#7581)
* [AI] Expose API entity types via @actual-app/api/models

Adds a new `./models` subpath export on `@actual-app/api` that re-exports
the public API entity types (`APIAccountEntity`, `APICategoryEntity`,
`APICategoryGroupEntity`, `APIFileEntity`, `APIPayeeEntity`,
`APIScheduleEntity`, `APITagEntity`, `AmountOPType`) from
`@actual-app/core/server/api-models`. Consumers can now import these types
from a stable public entry point instead of reaching into core internals:

    import type {
      APICategoryEntity,
      APICategoryGroupEntity,
    } from '@actual-app/api/models';

Uses `export type *` so the compiled `dist/models.js` is empty and no
runtime code is added. The Vite lib config is expanded to a multi-entry
map (`index`, `models`) so both bundles are produced, and tsgo already
emits `@types/models.d.ts` via the existing `declarationDir` setup.

* Add release notes for PR #7581

* Modify release notes for API model exports

Updated category from 'Features' to 'Enhancements' and added API export details.

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-04-22 10:45:54 +00:00
Matiss Janis Aboltins
3c77b3d0d5 [AI] Add enforce-boundaries ESLint rule for architectural boundaries (#7467)
* [AI] Add enforce-boundaries ESLint rule for architectural boundaries

Disallow tsconfig compilerOptions.paths, Vite resolve.alias, and
backtracked imports (../../) to enforce the use of package.json
subpath imports (#path) as the canonical aliasing mechanism.

The rule is enabled globally as an error without autofixes.

https://claude.ai/code/session_01T7VCnq5Kid7co9vBDPHWmR

* [AI] Fix enforce-boundaries lint violations

Replace backtracked imports (../../) with subpath imports (#path):
- migrations.ts: use #migrations/* mapping
- Formula.tsx: use #components/* mapping
- TransactionsTable.test.tsx: use #mocks/* mapping

Suppress unavoidable violations with oxlint-disable comments:
- preview.tsx: cross-package theme imports (pre-existing TODO)
- vite.desktop.config.ts: handlebars resolve.alias (types require root entry)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

* [AI] Clean up enforce-boundaries rule: remove redundant comments, optimize Property visitor, add edge-case tests

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

* Add release notes for PR #7467

* Update category for release notes

Changed category from Enhancements to Maintenance.

* [AI] Fix JSON syntax error after merge

Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>

* [AI] Merge master and fix lint errors in enforce-boundaries rule

Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Matiss Janis Aboltins <MatissJanis@users.noreply.github.com>
2026-04-21 21:18:42 +00:00
Matiss Janis Aboltins
846b6a6b7a [AI] Add nightly CI scan for custom theme catalog (#7566)
* [AI] Add nightly CI scan for custom theme catalog

Adds a scheduled GitHub Actions workflow that fetches `actual.css` from
every repo in `customThemeCatalog.json` and runs it through the same
`embedThemeFonts` + `validateThemeCss` pipeline the app uses at install
time. Failing themes fail the job so maintainers get an alert when a
third-party repo introduces a regression.

The scan treats fetched CSS as opaque text: never executed, never
injected into a DOM, size-capped at 512 KB per file, 15s per fetch,
restricted to raw.githubusercontent.com with redirects disabled, and
run with `contents: read` permissions only. Each catalog `repo` is
schema-checked against `owner/repo` before being interpolated into
the URL.
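
A sketch of that schema check (the regex and URL path here are assumptions; only the owner/repo validation before interpolation is from the commit message):

```typescript
// Validate a catalog `repo` field against an owner/repo shape before it is
// interpolated into a raw.githubusercontent.com URL, so a malformed entry
// cannot smuggle path segments into the fetch target.
const REPO_PATTERN = /^[A-Za-z0-9_.-]+\/[A-Za-z0-9_.-]+$/;

function themeCssUrl(repo: string): string {
  if (!REPO_PATTERN.test(repo)) {
    throw new Error(`Catalog repo "${repo}" is not in owner/repo form`);
  }
  // Branch/path are illustrative; the real workflow pins its own URL shape.
  return `https://raw.githubusercontent.com/${repo}/HEAD/actual.css`;
}
```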

* [AI] Simplify theme catalog scan

- Reuse `CatalogTheme` type from customThemes instead of duplicating as
  `CatalogEntry` in the script.
- Hoist `appendFileSync` to the static `node:fs` import; drop the dynamic
  import inside `writeStepSummary`.
- Drop the narrative header docstring and the trailing `// ...` comments
  that just restated constant names.
- Drop the redundant URL-prefix re-check inside the CSS fetch helper;
  the single call site constructs the URL from a pinned literal.
- Drop the 250 ms inter-request delay (GitHub Raw rate limits are not
  relevant for 21 requests, and the trailing delay was idle wall-clock
  against the 10-min job budget).
- Give each font fetch inside `embedThemeFonts` its own 15 s timeout
  via `AbortSignal.any`, instead of sharing one signal across every
  font in a theme. Drop the now-unnecessary caller-supplied signal
  from the CI call site.
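
The per-fetch timeout change sketches out as follows (helper name is invented; requires Node 20.3+ or a modern browser for `AbortSignal.any`):

```typescript
// Give each fetch its own timeout instead of sharing one signal across
// every font in a theme: combine a fresh 15 s timeout with an optional
// caller-supplied signal, so either can abort the request.
function perFetchSignal(outer?: AbortSignal, timeoutMs = 15_000): AbortSignal {
  const timeout = AbortSignal.timeout(timeoutMs);
  return outer ? AbortSignal.any([outer, timeout]) : timeout;
}

// e.g. await fetch(fontUrl, { signal: perFetchSignal(callerSignal) });
```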

* [AI] Fix lint on theme catalog scan imports
2026-04-21 21:18:21 +00:00
Julian Dominguez-Schatz
07c71154c9 Enable trusted publishing for releases (#7579)
* Enable trusted publishing for releases

* Add release notes for PR #7579

* Update 7579.md

* [autofix.ci] apply automated fixes

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2026-04-21 18:34:38 +00:00
Matt Fiddaman
2f49c5c400 add repository details to package.json files (#7578)
* add package URLs

* note
2026-04-21 17:40:11 +00:00
Matt Fiddaman
4b28a8146e fix error when clearing the payee field of a transaction (#7532)
* treat undefined values as missing in updates

* note
2026-04-21 17:30:45 +00:00
Matiss Janis Aboltins
362d8d60e4 [AI] Optimize CI e2e tests with pre-built bundle serving (#7503)
* [AI] Speed up and stabilize Playwright e2e tests

- Serve the prebuilt browser bundle via bin/serve-build.mjs in CI to
  skip per-shard Vite startup; 3-shard matrix with 4 workers each.
- Disable CSS animations in non-VRT runs via a fixture-level init
  script; bump expect timeout to 10s for AutoSizer-bound assertions.
- Use page.evaluate() for React Aria button clicks and a native value
  setter + single input event for controlled-input fills to eliminate
  React Aria re-render races in createAccount and Payee/Category
  autocompletes.
- Click the matching option directly (instead of Enter on a not-yet-
  highlighted list) in mobile transaction and schedule autocompletes.
- FocusableAmountInput.applyText reads the DOM input value so the
  typed amount survives a blur that fires before React flushes the
  onChange state update under CPU contention.
- MobileTransactionEntryPage.fillAmount waits for the outer display
  button (reads parent props.value) so async rules-run completes
  before the next fillField snapshots the transaction.
- MobileNavigation dispatches nav link clicks through evaluate() to
  bypass Playwright's viewport-stability check against the navbar's
  react-spring transforms.
- MobileBudgetPage summary-button lookups use locator.or().waitFor()
  instead of an isVisible() cascade.
- ConfigurationPage.startFresh/createTestFile wait for the account
  header / budget table to mount before returning.
- Workflow hardening: persist-credentials=false on all actions/checkout
  and top-level permissions: contents: read (zizmor findings).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* [AI] Apply animation-disable init script to browser.newPage pages

The previous implementation extended the test-scoped `page` fixture,
but every test creates its own page via `browser.newPage()` and never
uses the fixture-provided page — so the init script was a no-op in
every test.

Move the wrap to the worker-scoped `browser` fixture: intercept
`browser.newPage` so each page created that way has `addInitScript`
applied before the caller can navigate to it.
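The intercept pattern can be simulated with plain objects (this is an illustrative sketch, not Playwright's real fixture code — in the real fixture `newPage` is async and the push stands in for `page.addInitScript()`):

```typescript
// Fake shapes standing in for Playwright's Browser/Page.
type FakePage = { initScripts: string[] };
type FakeBrowser = { newPage: () => FakePage };

// Wrap the factory so every page it hands out already has the script
// registered before the caller can navigate.
function wrapNewPage(browser: FakeBrowser, script: string): void {
  const original = browser.newPage.bind(browser);
  browser.newPage = () => {
    const page = original();
    page.initScripts.push(script); // stands in for page.addInitScript(script)
    return page;
  };
}

const browser: FakeBrowser = { newPage: () => ({ initScripts: [] }) };
wrapNewPage(browser, 'disable-css-animations');
console.log(browser.newPage().initScripts); // [ 'disable-css-animations' ]
```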

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-21 17:03:56 +00:00
Julian Dominguez-Schatz
664cfdf244 Force node version 24 for trusted publishing (#7577)
* Force node version 24 for trusted publishing

* Add release notes for PR #7577

* Enable check-latest for npm setup action

* Update nightly package publishing workflow to Node.js 24

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-04-21 16:44:03 +00:00
Stephen Brown II
880bb67423 [AI] Fix transaction row drag breaking inline text edits (#7572)
* [AI] fix: disable transaction row drag while editing input cells

Row-level useDrag competed with notes/payee/amount fields (GH #7567).
Disable reorder drag when the row is in edit mode except for select/cleared columns.

* Add release notes
2026-04-21 16:04:17 +00:00
Julian Dominguez-Schatz
3e35d3b6f5 fix: trusted publishing requires npm version >= 11.5.1 (#7574)
* fix: trusted publishing requires npm version >= 11.5.1

* Add release notes for PR #7574

* Update .github/workflows/publish-nightly-npm-packages.yml

Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>

* Update release notes for trusted publishing fix

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Matt Fiddaman <github@m.fiddaman.uk>
2026-04-21 16:03:37 +00:00
Matiss Janis Aboltins
75da8f1851 [AI] fix: ensure crdt builds before loot-core is packed (#7565)
The `Publish nightly npm packages` workflow started failing at the
"Pack the core package" step with:

    Cannot find module '@actual-app/crdt' or its corresponding type declarations.

PR #7541 switched `@actual-app/crdt`'s package.json to conditional
exports (`types` → `./dist/index.d.ts`). `yarn pack` for
`@actual-app/core` triggers a prepack that runs `tsgo -b`, which now
resolves `@actual-app/crdt` via the `types` condition and expects
`packages/crdt/dist/index.d.ts`. Nothing was building crdt first
because loot-core's tsconfig didn't declare it as a project
reference.

Fix: declare the project reference so `tsgo -b` walks the graph and
builds crdt before loot-core. Sibling packages already do this.

Also adopt `@monorepo-utils/workspaces-to-typescript-project-references`
to keep each package's tsconfig `references` in sync with its
`workspace:*` deps, and wire it into a new `yarn check:tsconfig-references`
step in the `check` CI job plus lint-staged. Running the tool added
`../desktop-client` references to sync-server and desktop-electron
(both declare `@actual-app/web` as a workspace dep even though they
only use it at runtime via `require.resolve`); the extra references
are harmless — in CI the corresponding build is already cached by
earlier steps.
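The missing piece can be sketched as a tsconfig fragment (paths and options here are illustrative, not the exact loot-core config): declaring crdt as a project reference is what makes `tsgo -b` build it first.

```json
{
  "compilerOptions": { "composite": true },
  "references": [{ "path": "../crdt" }]
}
```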

https://claude.ai/code/session_01AA2gEMqX24GWeq5BovNmaz
2026-04-20 22:07:27 +00:00
Julian Dominguez-Schatz
29275a573d Run zizmor auto-fix tool (#7533)
* Run `zizmor` auto-fix tool

* Add release notes

* Enable credential persistence for string extraction

Updated workflow to allow pushing extracted strings.

* Enable credential persistence for release notes

Enable credential persistence to allow committing release notes.
2026-04-20 19:40:04 +00:00
Copilot
ead1b8e39d Remove redundant inline type import guideline (#7553)
* Initial plan

* [AI] Remove inline type import guideline (handled by oxfmt/oxlint)

Agent-Logs-Url: https://github.com/actualbudget/actual/sessions/7891fb33-668f-444e-bd69-5806181dcecd

Co-authored-by: MatissJanis <886567+MatissJanis@users.noreply.github.com>

* Add release notes for PR #7553

* Update author and remove redundant TypeScript guidance

Updated author credit in release notes and removed outdated guidance.

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: MatissJanis <886567+MatissJanis@users.noreply.github.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Matiss Janis Aboltins <matiss@mja.lv>
2026-04-20 19:38:57 +00:00
Matiss Janis Aboltins
3c361fdabf [AI] Add AI Usage Policy for contributors (#7548)
* [AI] Add AI Usage Policy for contributors

Add a contributor-facing AI Usage Policy page modeled on ESLint's version,
covering disclosure, human-only interaction with maintainers, and author
responsibility. Wire it into the docs sidebar, link it from the contributing
index and the root CONTRIBUTING.md.

https://claude.ai/code/session_012RspFcLedoUjbEYknJYPiL

* [AI] Unwrap AI policy paragraphs, renumber release note

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-04-20 19:37:44 +00:00
James Skinner
8691766fb8 Fix reconciled value of children of split transactions (#7453)
* Fix reconciled value of children of split transactions

* Update release note

* Set mock transactions to include reconciled field
2026-04-20 17:14:26 +00:00
Julian Dominguez-Schatz
e896ce408a Enable trusted publishing for nightly npm packages (#7556)
* Enable trusted publishing for nightly `npm` packages

Ref: https://docs.npmjs.com/trusted-publishers

* Add release notes for PR #7556

* Change category to Maintenance and update description

* Fix formatting of id-token permission comment

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-04-20 14:38:06 +00:00
Matiss Janis Aboltins
3373154b40 Refactor CI workflows to use shared setup job (#7551)
* [AI] Run setup once per workflow and fan out via needs

Add a prep `setup` job at the top of `check.yml` and `build.yml`, and
make every other job in those workflows declare `needs: setup`.

The composite action in `.github/actions/setup` caches `node_modules`
keyed on `yarn.lock`. When that hash changes (dep-bump PRs, master after
a merge), the cache is cold and every fan-out job races to run
`yarn --immutable` in parallel — one wins the cache save, the rest do
redundant work. Serialising through a single `setup` job warms the
cache once so downstream jobs restore instantly and skip yarn install
via the existing `if: steps.cache.outputs.cache-hit != 'true'` guard.

No changes to the composite action or cache keys. `e2e-test.yml` is
intentionally left alone.

* [AI] Harden setup jobs and add release note

Address zizmor code-scanning findings on the new `setup` jobs added in
the previous commit:

- Scope `permissions: contents: read` so the job no longer inherits
  workflow-default write permissions.
- Pass `persist-credentials: false` to `actions/checkout` so the GitHub
  token isn't left on disk for later steps that don't need it.

Add `upcoming-release-notes/7551.md` to satisfy the release-notes PR
check.

* [AI] Disable credential persistence on build.yml checkouts

Each of `api`, `crdt`, `web`, `cli`, `server` in build.yml does
`actions/checkout` (which writes the GitHub token into `.git/config`)
and then uploads build artifacts in the same job. Zizmor flags this as
"credential persistence through GitHub Actions artifacts" because a
misconfigured upload path could capture `.git/config` and leak the
token.

None of these jobs push or write to git, so drop the credential
persistence via `persist-credentials: false` on the checkout.

* [AI] Disable credential persistence on check.yml checkouts

None of the jobs in check.yml (`constraints`, `lint`, `typecheck`,
`validate-cli`, `test`, `migrations`) push or write to git, so pass
`persist-credentials: false` to their `actions/checkout` calls to
resolve the zizmor "credential persistence" finding. Mirrors the fix
just applied to build.yml.

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-04-19 21:22:49 +00:00
Matiss Janis Aboltins
f85627dcf6 [AI] Disable bundle minification for readable error messages (#7538)
* [AI] Disable bundle minification for readable production error messages

The desktop-client had dead terserOptions (no `minify: 'terser'` was set, so
Vite's default esbuild minifier ran with name mangling). The loot-core and
plugins-service workers used Terser with mangle:false but still compressed.
Set `minify: false` across all three browser build configs so production
stack traces are human-readable.
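The dead-config point can be sketched like this (an assumed `vite.config.ts` shape, not the actual files): `terserOptions` only take effect when `minify` is explicitly `'terser'`; otherwise Vite's default esbuild minifier runs, and `minify: false` turns minification off entirely.

```typescript
// Hypothetical sketch of the relevant Vite build options.
const buildOptions = {
  build: {
    minify: false as const,
    // terserOptions: { mangle: false }  // no effect unless minify === 'terser'
  },
};
console.log(buildOptions.build.minify); // false
```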

https://claude.ai/code/session_01VEywxebiNYAgJia35fygQx

* [AI] Rename release note to match PR number

https://claude.ai/code/session_01VEywxebiNYAgJia35fygQx

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-04-19 20:35:26 +00:00
Matt Fiddaman
695fd0e7e0 fix bank sync account linking modal being disabled when relinking existing accounts (#7487)
* fix link account modal button disabled

* note
2026-04-18 22:25:28 +00:00
Matiss Janis Aboltins
9682f6d8c9 ci: disable fail-fast for Electron build workflows (#7547)
* [AI] Disable fail-fast for Electron build matrices

Prevents cancellation of in-progress platform builds when one fails, so
Windows/macOS/Linux results are all visible on a single run.

* Add release notes for PR #7547

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-04-18 21:32:21 +00:00
668 changed files with 23695 additions and 8593 deletions

View File

@@ -1,6 +1,6 @@
issue_enrichment:
auto_enrich:
enabled: false
enabled: true
reviews:
request_changes_workflow: true
review_status: false

View File

@@ -1,7 +1,7 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/docker-existing-docker-compose
{
"name": "Actual development",
"name": "Actual Devcontainer",
"dockerComposeFile": ["../docker-compose.yml", "docker-compose.yml"],
// Alternatively:
// "image": "mcr.microsoft.com/devcontainers/typescript-node:0-16",

View File

@@ -1,8 +1,6 @@
node_modules
user-files
server-files
# Yarn
**/node_modules
.git
.lage
.pnp.*
.yarn/*
!.yarn/patches

View File

@@ -3,9 +3,6 @@ contact_links:
- name: Bank-sync issues
url: https://discord.gg/pRYNYr4W5A
about: Is bank-sync not working? Returning too much or too little information? Reach out to the community on Discord.
- name: Support
url: https://discord.gg/pRYNYr4W5A
about: Need help with something? Having troubles setting up? Or perhaps issues using the API? Reach out to the community on Discord.
- name: Translations
url: https://hosted.weblate.org/projects/actualbudget/actual/
about: Found a string that needs a better translation? Add your suggestion or upvote an existing one in Weblate.

.github/ISSUE_TEMPLATE/tech-support.yml vendored Normal file
View File

@@ -0,0 +1,17 @@
name: Tech Support
description: Need help with something? Having trouble setting up? Or perhaps issues using the API?
title: '[Support]: '
labels: ['tech-support']
body:
- type: markdown
attributes:
value: |
> ⚠️ **Tech support tickets opened here are automatically closed.** GitHub Issues are reserved for bug reports and feature requests. The fastest way to get help is to ask the community on [Discord](https://discord.gg/pRYNYr4W5A) — that's where most of the community lives and can help you in real time.
- type: textarea
id: problem
attributes:
label: Describe your problem
description: Please describe, in as much detail as you can, what you need help with.
placeholder: I'm trying to [...] but [...]
validations:
required: true

View File

@@ -1,4 +1,4 @@
<!-- Thank you for submitting a pull request! Make sure to follow the instructions to write release notes for your PR — it should only take a minute or two: https://github.com/actualbudget/docs#writing-good-release-notes. Try running yarn generate:release-notes *before* pushing your PR for an interactive experience. -->
<!-- Thank you for submitting a pull request! Make sure to follow the instructions to write release notes for your PR — it should only take a minute or two: https://actualbudget.org/docs/contributing/#writing-good-release-notes. Try running yarn generate:release-notes *before* pushing your PR for an interactive experience. -->
## Description

View File

@@ -1,13 +1,16 @@
# See https://github.com/check-spelling/check-spelling/wiki/Configuration-Examples:-excludes
(?:^|/)(?i).nojekyll
(?:^|/)(?i)COPYRIGHT
(?:^|/)(?i)docusaurus.config.js
(?:^|/)(?i)LICEN[CS]E
(?:^|/)(?i)README.md
(?:^|/)3rdparty/
(?:^|/)go\.sum$
(?:^|/)package(?:-lock|)\.json$
(?:^|/)pyproject.toml
(?:^|/)requirements(?:-dev|-doc|-test|)\.txt$
(?:^|/)vendor/
ignore$
(?:^|/)yarn\.lock$
\.a$
\.ai$
\.avi$
@@ -53,6 +56,7 @@ ignore$
\.svgz?$
\.tar$
\.tiff?$
\.tsx$
\.ttf$
\.wav$
\.webm$
@@ -62,15 +66,12 @@ ignore$
\.zip$
^\.github/actions/spelling/
^\.github/ISSUE_TEMPLATE/
^\Q.github/workflows/spelling.yml\E$
^\.yarn/
^\Q.github/\E$
^\Q.github/workflows/spelling.yml\E$
^\Qnode_modules/\E$
^\Qsrc/\E$
^\Qstatic/\E$
^\Q.github/\E$
(?:^|/)yarn\.lock$
(?:^|/)(?i)docusaurus.config.js
(?:^|/)(?i)README.md
(?:^|/)(?i).nojekyll
^\static/
\.tsx$
^packages/docs/docs/releases\.md$
ignore$

View File

@@ -38,10 +38,13 @@ Cetelem
cimode
Citi
Citibank
claude
Cloudflare
CLP
CMCIFRPAXXX
COBADEFF
CODEOWNERS
Codespaces
COEP
commerzbank
Copiar
@@ -53,6 +56,7 @@ crt
CZK
Danske
datadir
datamodel
DATEDIF
Depositos
deselection
@@ -82,6 +86,7 @@ Globecard
GLS
gocardless
Grafana
Gruvbox
HABAL
Hampel
HELADEF
@@ -89,6 +94,7 @@ HLOOKUP
HUF
IFERROR
IFNA
Ilavenil
INDUSTRIEL
INGBPLPW
Ingo
@@ -127,6 +133,7 @@ murmurhash
NETWORKDAYS
nginx
nodenext
nord
OIDC
Okabe
overbudgeted
@@ -140,14 +147,13 @@ pluggyai
Poste
PPABPLPK
prefs
Primoco
Priotecs
proactively
Qatari
QNTOFRP
QONTO
Raiffeisen
REGEXREPLACE
relinking
revolut
RIED
RSchedule
@@ -172,7 +178,6 @@ SWEDBANK
SWEDNOKK
Synology
systemctl
tada
taskbar
templating
THB
@@ -180,6 +185,7 @@ TIMEFRAME
touchscreen
triaging
tsgo
tsgolint
TWD
UAH
ubuntu
@@ -195,4 +201,6 @@ websecure
WEEKNUM
Widiba
WOR
worktree
youngcw
zizmor

View File

@@ -9,7 +9,7 @@ runs:
node-version: 22
- name: Install dependencies
shell: bash
run: yarn workspaces focus @actual-app/ci-actions
run: yarn workspaces focus actual @actual-app/ci-actions
- name: Generate release notes
shell: bash
env:

View File

@@ -10,6 +10,10 @@ inputs:
description: 'Whether to download translations as part of setup, default true'
required: false
default: 'true'
cache:
description: 'Whether to restore and save dependency and Lage caches, default true'
required: false
default: 'true'
runs:
using: composite
@@ -18,6 +22,7 @@ runs:
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: 22
package-manager-cache: ${{ inputs.cache }}
- name: Install yarn
run: npm install -g yarn
shell: bash
@@ -28,6 +33,7 @@ runs:
shell: bash
- name: Cache
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
if: ${{ inputs.cache == 'true' }}
id: cache
with:
path: ${{ format('{0}/**/node_modules', inputs.working-directory) }}
@@ -37,6 +43,7 @@ runs:
shell: bash
- name: Cache Lage
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
if: ${{ inputs.cache == 'true' }}
with:
path: ${{ format('{0}/.lage', inputs.working-directory) }}
key: lage-${{ runner.os }}-${{ github.sha }}
@@ -52,8 +59,9 @@ runs:
with:
repository: actualbudget/translations
path: ${{ inputs.working-directory }}/packages/desktop-client/locale
if: ${{ inputs.download-translations == 'true' }}
persist-credentials: false
if: ${{ inputs.download-translations == 'true' && !env.ACT }}
- name: Remove untranslated languages
run: packages/desktop-client/bin/remove-untranslated-languages
shell: bash
if: ${{ inputs.download-translations == 'true' }}
if: ${{ inputs.download-translations == 'true' && !env.ACT }}

View File

@@ -0,0 +1,27 @@
name: Add 'AI generated' label to '[AI]' PRs
##########################################################################################
# This workflow uses the 'pull_request_target' event so it has a token that can add a #
# label to PRs from forks. It does NOT check out or execute any code from the PR, so it #
# is not vulnerable to the usual 'pull_request_target' code-injection concerns. Keep it #
# that way - do not add a checkout step or run any PR-provided scripts here. #
##########################################################################################
on:
# This workflow never checks out or runs PR code; it only reads the PR title and adds a label.
pull_request_target: # zizmor: ignore[dangerous-triggers]
types: [opened, reopened, edited]
permissions:
pull-requests: write
jobs:
add-ai-generated-label:
name: Add 'AI generated' label
runs-on: ubuntu-latest
if: startsWith(github.event.pull_request.title, '[AI]')
steps:
- uses: actions-ecosystem/action-add-labels@bd52874380e3909a1ac983768df6976535ece7f8 # v1.1.0
with:
labels: AI generated
github_token: ${{ secrets.GITHUB_TOKEN }}

View File

@@ -18,6 +18,8 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:

View File

@@ -16,6 +16,8 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:

View File

@@ -19,10 +19,26 @@ concurrency:
cancel-in-progress: ${{ github.ref != 'refs/heads/master' }}
jobs:
setup:
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
api:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -45,9 +61,12 @@ jobs:
path: api-stats.json
crdt:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -70,9 +89,12 @@ jobs:
path: crdt-stats.json
web:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
- name: Build Web
@@ -89,9 +111,12 @@ jobs:
path: packages/desktop-client/build-stats
cli:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -114,9 +139,12 @@ jobs:
path: cli-stats.json
server:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:

View File

@@ -12,20 +12,40 @@ concurrency:
cancel-in-progress: ${{ github.ref != 'refs/heads/master' }}
jobs:
setup:
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
constraints:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Check dependency version consistency
run: yarn constraints
- name: Check tsconfig project references are in sync
run: yarn check:tsconfig-references
lint:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -33,9 +53,12 @@ jobs:
- name: Lint
run: yarn lint
typecheck:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -43,9 +66,12 @@ jobs:
- name: Typecheck
run: yarn typecheck
validate-cli:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -55,9 +81,12 @@ jobs:
- name: Check that the built CLI works
run: node packages/sync-server/build/bin/actual-server.js --version
test:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -75,10 +104,13 @@ jobs:
- uses: zizmorcore/zizmor-action@71321a20a9ded102f6e9ce5718a2fcec2c4f70d8 # v0.5.2
migrations:
needs: setup
if: github.event_name == 'pull_request'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:

View File

@@ -23,6 +23,8 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Initialize CodeQL
uses: github/codeql-action/init@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1

View File

@@ -17,6 +17,8 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:

View File

@@ -0,0 +1,48 @@
name: CRDT version bump check
on:
pull_request:
paths:
- 'packages/crdt/**'
permissions:
contents: read
jobs:
check-version-bump:
runs-on: ubuntu-latest
name: Ensure @actual-app/crdt version is bumped
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
persist-credentials: false
- name: Verify version bump
env:
BASE_REF: ${{ github.base_ref }}
run: |
set -euo pipefail
if ! git cat-file -e "origin/${BASE_REF}:packages/crdt/package.json" 2>/dev/null; then
echo "packages/crdt/package.json does not exist on the base branch; skipping."
exit 0
fi
BASE_VERSION=$(git show "origin/${BASE_REF}:packages/crdt/package.json" | jq -r .version)
HEAD_VERSION=$(jq -r .version packages/crdt/package.json)
echo "Base version: $BASE_VERSION"
echo "Head version: $HEAD_VERSION"
if [ "$BASE_VERSION" = "$HEAD_VERSION" ]; then
echo "::error file=packages/crdt/package.json::Files in packages/crdt/ were modified but the @actual-app/crdt version was not bumped. Please update the \"version\" field in packages/crdt/package.json."
exit 1
fi
HIGHEST=$(printf '%s\n%s\n' "$BASE_VERSION" "$HEAD_VERSION" | sort -V | tail -n1)
if [ "$HIGHEST" != "$HEAD_VERSION" ]; then
echo "::error file=packages/crdt/package.json::The @actual-app/crdt version ($HEAD_VERSION) must be greater than the base version ($BASE_VERSION)."
exit 1
fi
echo "Version bumped from $BASE_VERSION to $HEAD_VERSION."

View File

@@ -26,15 +26,19 @@ permissions:
jobs:
cut-release-branch:
runs-on: ubuntu-latest
environment: release
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.event.inputs.ref || 'master' }}
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
download-translations: 'false'
- name: Bump package versions

View File

@@ -32,11 +32,14 @@ jobs:
if: github.event_name == 'workflow_dispatch' || !github.event.repository.fork
name: Build Docker image
runs-on: ubuntu-latest
environment: release
strategy:
matrix:
os: [ubuntu, alpine]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up QEMU
uses: docker/setup-qemu-action@ce360397dd3f832beb865e1373c09c0e9f86d70a # v4.0.0
@@ -72,6 +75,9 @@ jobs:
# This is faster and avoids yarn memory issues
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- name: Build Web
run: yarn build:server

View File

@@ -27,8 +27,11 @@ jobs:
build:
name: Build Docker image
runs-on: ubuntu-latest
environment: release
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up QEMU
uses: docker/setup-qemu-action@ce360397dd3f832beb865e1373c09c0e9f86d70a # v4.0.0
@@ -74,6 +77,9 @@ jobs:
# This is faster and avoids yarn memory issues
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- name: Build Web
run: yarn build:server

View File

@@ -17,32 +17,80 @@ on:
env:
GITHUB_PR_NUMBER: ${{github.event.pull_request.number}}
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
jobs:
functional:
name: Functional (shard ${{ matrix.shard }}/5)
build-web:
name: Build web bundle
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
shard: [1, 2, 3, 4, 5]
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Trust the repository directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
- name: Build browser bundle
# REACT_APP_NETLIFY=true flips isNonProductionEnvironment() on in the
# bundle so the "Create test file" button (used by every e2e beforeEach
# via ConfigurationPage.createTestFile()) is still rendered in a
# production build. Without it, e2e tests would time out waiting for
# a button that was tree-shaken out.
# --skip-translations keeps VRT screenshots deterministic by rendering
# source-code English instead of upstream Weblate en.json (which can
# drift between snapshot capture and test runs).
env:
REACT_APP_NETLIFY: 'true'
run: yarn build:browser --skip-translations
- name: Upload build artifact
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: desktop-client-build
path: packages/desktop-client/build/
retention-days: 1
overwrite: true
functional:
name: Functional (shard ${{ matrix.shard }}/3)
runs-on: ubuntu-latest
needs: build-web
strategy:
fail-fast: false
matrix:
shard: [1, 2, 3]
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
env:
E2E_USE_BUILD: '1'
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Trust the repository directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
- name: Download web build
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: desktop-client-build
path: packages/desktop-client/build/
- name: Run E2E Tests
run: yarn e2e --shard=${{ matrix.shard }}/5
run: yarn e2e --shard=${{ matrix.shard }}/3
- uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
if: always()
if: failure()
with:
name: desktop-client-test-results-shard-${{ matrix.shard }}
path: packages/desktop-client/test-results/
@@ -56,6 +104,8 @@ jobs:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -80,22 +130,34 @@ jobs:
overwrite: true
vrt:
name: Visual regression (shard ${{ matrix.shard }}/5)
name: Visual regression (shard ${{ matrix.shard }}/3)
runs-on: ubuntu-latest
needs: build-web
strategy:
fail-fast: false
matrix:
shard: [1, 2, 3, 4, 5]
shard: [1, 2, 3]
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
env:
E2E_USE_BUILD: '1'
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Trust the repository directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
- name: Download web build
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: desktop-client-build
path: packages/desktop-client/build/
- name: Run VRT Tests
run: yarn vrt --shard=${{ matrix.shard }}/5
run: yarn vrt --shard=${{ matrix.shard }}/3
- uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
if: always()
with:
@@ -113,6 +175,8 @@ jobs:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
- name: Download all blob reports
@@ -135,9 +199,13 @@ jobs:
if: github.event_name == 'pull_request'
run: |
mkdir -p vrt-metadata
echo "${{ github.event.pull_request.number }}" > vrt-metadata/pr-number.txt
echo "${{ needs.vrt.result }}" > vrt-metadata/vrt-result.txt
echo "${{ steps.playwright-report-vrt.outputs.artifact-url }}" > vrt-metadata/artifact-url.txt
echo "${PR_NUMBER}" > vrt-metadata/pr-number.txt
echo "${VRT_RESULT}" > vrt-metadata/vrt-result.txt
echo "${STEPS_PLAYWRIGHT_REPORT_VRT_OUTPUTS_ARTIFACT_URL}" > vrt-metadata/artifact-url.txt
env:
PR_NUMBER: ${{ github.event.pull_request.number }}
STEPS_PLAYWRIGHT_REPORT_VRT_OUTPUTS_ARTIFACT_URL: ${{ steps.playwright-report-vrt.outputs.artifact-url }}
VRT_RESULT: ${{ needs.vrt.result }}
- name: Upload VRT metadata
if: github.event_name == 'pull_request'
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1

View File

@@ -21,7 +21,9 @@ jobs:
# this is so the assets can be added to the release
permissions:
contents: write
environment: release
strategy:
fail-fast: false
matrix:
os:
- ubuntu-22.04
@@ -30,6 +32,8 @@ jobs:
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- if: ${{ startsWith(matrix.os, 'windows') }}
run: pip.exe install setuptools
- if: ${{ ! startsWith(matrix.os, 'windows') }}
@@ -56,11 +60,16 @@ jobs:
METAINFO_FILE="packages/desktop-electron/extra-resources/linux/com.actualbudget.actual.metainfo.xml"
TODAY=$(date +%Y-%m-%d)
VERSION=${{ steps.process_version.outputs.version }}
VERSION=${STEPS_PROCESS_VERSION_OUTPUTS_VERSION}
sed -i "s/%RELEASE_VERSION%/$VERSION/g; s/%RELEASE_DATE%/$TODAY/g" "$METAINFO_FILE"
flatpak run --command=flatpak-builder-lint org.flatpak.Builder appstream "$METAINFO_FILE"
env:
STEPS_PROCESS_VERSION_OUTPUTS_VERSION: ${{ steps.process_version.outputs.version }}
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- name: Build Electron for Mac
if: ${{ startsWith(matrix.os, 'macos') }}
run: ./bin/package-electron
@@ -111,48 +120,7 @@ jobs:
!packages/desktop-electron/dist/Actual-windows.exe
packages/desktop-electron/dist/*.AppImage
packages/desktop-electron/dist/*.flatpak
packages/desktop-electron/dist/*.appx
outputs:
version: ${{ steps.process_version.outputs.version }}
publish-microsoft-store:
needs: build
runs-on: windows-latest
if: ${{ github.event_name == 'push' && startsWith(github.ref, 'refs/tags/v') }}
steps:
- name: Install StoreBroker
shell: powershell
run: |
Install-Module -Name StoreBroker -AcceptLicense -Force -Scope CurrentUser -Verbose
- name: Download Microsoft Store artifacts
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: actual-electron-windows-latest-appx
- name: Submit to Microsoft Store
shell: powershell
run: |
# Disable telemetry
$global:SBDisableTelemetry = $true
# Authenticate against the store
$pass = ConvertTo-SecureString -String '${{ secrets.MICROSOFT_STORE_CLIENT_SECRET }}' -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList ${{ secrets.MICROSOFT_STORE_CLIENT_ID }},$pass
Set-StoreBrokerAuthentication -TenantId '${{ secrets.MICROSOFT_STORE_TENANT_ID }}' -Credential $cred
# Zip and create metadata files
$artifacts = Get-ChildItem -Path . -Filter *.appx | Select-Object -ExpandProperty FullName
New-StoreBrokerConfigFile -Path "$PWD/config.json" -AppId ${{ secrets.MICROSOFT_STORE_PRODUCT_ID }}
New-SubmissionPackage -ConfigPath "$PWD/config.json" -DisableAutoPackageNameFormatting -AppxPath $artifacts -OutPath "$PWD" -OutName submission
# Submit the app
# See https://github.com/microsoft/StoreBroker/blob/master/Documentation/USAGE.md#the-easy-way
Update-ApplicationSubmission `
-AppId ${{ secrets.MICROSOFT_STORE_PRODUCT_ID }} `
-SubmissionDataPath "submission.json" `
-PackagePath "submission.zip" `
-ReplacePackages `
-NoStatus `
-AutoCommit `
-Force


@@ -26,6 +26,7 @@ concurrency:
jobs:
build:
strategy:
fail-fast: false
matrix:
os:
- ubuntu-22.04
@@ -34,6 +35,8 @@ jobs:
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- if: ${{ startsWith(matrix.os, 'windows') }}
run: pip.exe install setuptools
- if: ${{ ! startsWith(matrix.os, 'windows') }}


@@ -15,6 +15,7 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
path: actual
persist-credentials: false
- name: Set up environment
uses: ./actual/.github/actions/setup
with:
@@ -59,6 +60,8 @@ jobs:
ssh-key: ${{ secrets.STRING_IMPORT_DEPLOY_KEY }}
repository: actualbudget/translations
path: translations
# Need to be able to push back extracted strings
persist-credentials: true
- name: Generate i18n strings
working-directory: actual
run: |


@@ -0,0 +1,23 @@
name: Close tech support issues with automated message
on:
issues:
types: [labeled]
jobs:
tech-support:
if: ${{ github.event.label.name == 'tech-support' }}
runs-on: ubuntu-latest
steps:
- name: Create comment and close issue
run: |
gh issue comment "$ISSUE_URL" --body ":wave: Thanks for reaching out!
GitHub Issues are reserved for bug reports and feature requests, so tech support tickets are automatically closed. The fastest way to get help is to ask the community on [Discord](https://discord.gg/pRYNYr4W5A) — that's where most of the community lives and can help you in real time.
<!-- tech-support-auto-close-comment -->"
gh issue close "$ISSUE_URL"
env:
ISSUE_URL: https://github.com/actualbudget/actual/issues/${{ github.event.issue.number }}
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -25,6 +25,8 @@ jobs:
steps:
# This is not a security concern because we have approved & merged the PR
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: 22


@@ -1,37 +0,0 @@
# When the "unfreeze" label is added to a PR, add that PR to Merge Freeze's unblocked list
# so it can be merged during a freeze. Uses pull_request_target so the workflow runs in
# the base repo and has access to MERGEFREEZE_ACCESS_TOKEN for fork PRs; it does not
# checkout or run any PR code. Requires MERGEFREEZE_ACCESS_TOKEN repo secret
# (project-specific token from Merge Freeze Web API panel for actualbudget/actual / master).
# See: https://docs.mergefreeze.com/web-api#post-freeze-status
name: Merge Freeze add PR to unblocked list
on:
pull_request_target:
types: [labeled]
jobs:
unfreeze:
if: ${{ github.event.label.name == 'unfreeze' }}
runs-on: ubuntu-latest
permissions: {}
concurrency:
group: merge-freeze-unfreeze-${{ github.ref }}-labels
cancel-in-progress: false
steps:
- name: POST to Merge Freeze add PR to unblocked list
env:
MERGEFREEZE_ACCESS_TOKEN: ${{ secrets.MERGEFREEZE_ACCESS_TOKEN }}
PR_NUMBER: ${{ github.event.pull_request.number }}
USER_NAME: ${{ github.actor }}
run: |
set -e
if [ -z "$MERGEFREEZE_ACCESS_TOKEN" ]; then
echo "::error::MERGEFREEZE_ACCESS_TOKEN secret is not set"
exit 1
fi
url="https://www.mergefreeze.com/api/branches/actualbudget/actual/master/?access_token=${MERGEFREEZE_ACCESS_TOKEN}"
payload=$(jq -n --arg user_name "$USER_NAME" --argjson pr "$PR_NUMBER" '{frozen: true, user_name: $user_name, unblocked_prs: [$pr]}')
curl -sf -X POST "$url" -H "Content-Type: application/json" -d "$payload"
echo "Merge Freeze updated: PR #$PR_NUMBER added to unblocked list."


@@ -19,12 +19,18 @@ concurrency:
jobs:
build-and-deploy:
runs-on: ubuntu-latest
environment: release
steps:
- name: Repository Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- name: Install Netlify
run: npm install netlify-cli@17.10.1 -g


@@ -0,0 +1,47 @@
name: Nightly theme catalog scan
on:
schedule:
# 05:15 UTC daily — runs after the i18n extract job (04:00) and well
# before the nightly Electron/npm publishes (00:00 UTC the next day).
- cron: '15 5 * * *'
workflow_dispatch:
permissions:
contents: read
jobs:
validate-theme-catalog:
name: Validate custom theme catalog
runs-on: ubuntu-latest
if: github.repository == 'actualbudget/actual'
timeout-minutes: 10
steps:
- name: Check out repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Validate themes
run: yarn workspace @actual-app/web validate:theme-catalog
notify-failure:
name: Notify Discord on failure
needs: validate-theme-catalog
if: failure() && github.repository == 'actualbudget/actual'
runs-on: ubuntu-latest
environment: nightly-alerts
timeout-minutes: 5
steps:
- name: Notify Discord
uses: sarisia/actions-status-discord@eb045afee445dc055c18d3d90bd0f244fd062708 # v1.16.0
with:
webhook: ${{ secrets.DISCORD_WEBHOOK_URL }}
status: Failure
title: Nightly theme catalog scan failed
description: The nightly scan failed. One or more themes may be broken, or the scan itself did not complete.
username: Actual Nightly
nofail: true

.github/workflows/publish-crdt.yml

@@ -0,0 +1,86 @@
name: Publish @actual-app/crdt
# Automatically publishes @actual-app/crdt when its package.json version
# changes on master (typically via a merged PR that bumped the version).
on:
push:
branches:
- master
paths:
- 'packages/crdt/package.json'
workflow_dispatch:
permissions:
contents: read
concurrency:
group: publish-crdt
cancel-in-progress: false
jobs:
check-version:
runs-on: ubuntu-latest
name: Check if publish is needed
outputs:
should-publish: ${{ steps.check.outputs.should-publish }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Compare local version against npm registry
id: check
run: |
set -euo pipefail
LOCAL_VERSION=$(jq -r .version packages/crdt/package.json)
echo "Local version: $LOCAL_VERSION"
PUBLISHED_VERSION=$(npm view @actual-app/crdt version 2>/dev/null || echo "")
echo "Published version: ${PUBLISHED_VERSION:-<none>}"
if [ "$LOCAL_VERSION" = "$PUBLISHED_VERSION" ]; then
echo "Versions match - nothing to publish."
echo "should-publish=false" >> "$GITHUB_OUTPUT"
else
echo "Version changed - publish required."
echo "should-publish=true" >> "$GITHUB_OUTPUT"
fi
publish:
needs: check-version
if: needs.check-version.outputs.should-publish == 'true'
runs-on: ubuntu-latest
name: Publish @actual-app/crdt to npm
permissions:
contents: read
id-token: write # Required for npm OIDC provenance
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
download-translations: 'false'
- name: Build @actual-app/crdt
run: yarn workspace @actual-app/crdt build
- name: Pack @actual-app/crdt
run: yarn workspace @actual-app/crdt pack --filename @actual-app/crdt.tgz
- name: Setup node and npm registry
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: 24
check-latest: true
# Avoid restoring potentially poisoned caches in release jobs.
package-manager-cache: false
registry-url: 'https://registry.npmjs.org'
- name: Publish to npm
run: npm publish packages/crdt/@actual-app/crdt.tgz --access public --provenance
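The check-version gate above reduces to a plain string comparison, where `npm view` returning nothing (package never published) also triggers a publish. Sketched with fixed stand-in strings in place of the jq/npm lookups:

```shell
LOCAL_VERSION="2.1.0"       # stand-in for: jq -r .version packages/crdt/package.json
PUBLISHED_VERSION="2.0.3"   # stand-in for: npm view @actual-app/crdt version
if [ "$LOCAL_VERSION" = "$PUBLISHED_VERSION" ]; then
  SHOULD_PUBLISH=false      # versions match - nothing to publish
else
  SHOULD_PUBLISH=true       # changed (or never published) - publish
fi
echo "should-publish=$SHOULD_PUBLISH"
```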


@@ -21,6 +21,7 @@ concurrency:
jobs:
publish-flathub:
runs-on: ubuntu-22.04
environment: release
steps:
- name: Resolve version
id: resolve_version
@@ -54,8 +55,9 @@ jobs:
- name: Verify release assets exist
env:
GH_TOKEN: ${{ github.token }}
STEPS_RESOLVE_VERSION_OUTPUTS_TAG: ${{ steps.resolve_version.outputs.tag }}
run: |
TAG="${{ steps.resolve_version.outputs.tag }}"
TAG="${STEPS_RESOLVE_VERSION_OUTPUTS_TAG}"
echo "Checking release assets for tag $TAG..."
ASSETS=$(gh api "repos/${{ github.repository }}/releases/tags/$TAG" --jq '.assets[].name')
@@ -77,7 +79,7 @@ jobs:
- name: Calculate AppImage SHA256 (streamed)
run: |
VERSION="${{ steps.resolve_version.outputs.version }}"
VERSION="${STEPS_RESOLVE_VERSION_OUTPUTS_VERSION}"
BASE_URL="https://github.com/${{ github.repository }}/releases/download/v${VERSION}"
echo "Streaming x86_64 AppImage to compute SHA256..."
@@ -90,27 +92,32 @@ jobs:
echo "APPIMAGE_X64_SHA256=$APPIMAGE_X64_SHA256" >> "$GITHUB_ENV"
echo "APPIMAGE_ARM64_SHA256=$APPIMAGE_ARM64_SHA256" >> "$GITHUB_ENV"
env:
STEPS_RESOLVE_VERSION_OUTPUTS_VERSION: ${{ steps.resolve_version.outputs.version }}
- name: Checkout Flathub repo
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
repository: flathub/com.actualbudget.actual
token: ${{ secrets.FLATHUB_GITHUB_TOKEN }}
persist-credentials: false
- name: Update manifest with new version
run: |
VERSION="${{ steps.resolve_version.outputs.version }}"
VERSION="${STEPS_RESOLVE_VERSION_OUTPUTS_VERSION}"
# Replace x86_64 entry
sed -i "/x86_64.AppImage/{n;s|sha256:.*|sha256: ${{ env.APPIMAGE_X64_SHA256 }}|}" com.actualbudget.actual.yml
sed -i "/x86_64.AppImage/{n;s|sha256:.*|sha256: ${APPIMAGE_X64_SHA256}|}" com.actualbudget.actual.yml
sed -i "/x86_64.AppImage/s|url:.*|url: https://github.com/actualbudget/actual/releases/download/v${VERSION}/Actual-linux-x86_64.AppImage|" com.actualbudget.actual.yml
# Replace arm64 entry
sed -i "/arm64.AppImage/{n;s|sha256:.*|sha256: ${{ env.APPIMAGE_ARM64_SHA256 }}|}" com.actualbudget.actual.yml
sed -i "/arm64.AppImage/{n;s|sha256:.*|sha256: ${APPIMAGE_ARM64_SHA256}|}" com.actualbudget.actual.yml
sed -i "/arm64.AppImage/s|url:.*|url: https://github.com/actualbudget/actual/releases/download/v${VERSION}/Actual-linux-arm64.AppImage|" com.actualbudget.actual.yml
echo "Updated manifest:"
cat com.actualbudget.actual.yml
env:
STEPS_RESOLVE_VERSION_OUTPUTS_VERSION: ${{ steps.resolve_version.outputs.version }}
- name: Create PR in Flathub repo
uses: peter-evans/create-pull-request@5f6978faf089d4d20b00c7766989d076bb2fc7f1 # v8.1.1
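The `{n;s|…|}` sed address used in the manifest update is easy to misread: it matches the AppImage URL line, then `n` advances to the next line before substituting, which is how the sha256 on the line after each URL gets rewritten. A self-contained run against a synthetic two-line manifest (GNU sed `-i`, as on the Linux runner, is assumed):

```shell
# Synthetic stand-in for com.actualbudget.actual.yml
cat > manifest.yml <<'EOF'
url: https://example.com/Actual-linux-x86_64.AppImage
sha256: oldhash
EOF
APPIMAGE_X64_SHA256=newhash
# Match the AppImage url line, step to the NEXT line (n), replace there.
sed -i "/x86_64.AppImage/{n;s|sha256:.*|sha256: ${APPIMAGE_X64_SHA256}|}" manifest.yml
cat manifest.yml
```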


@@ -0,0 +1,113 @@
name: Publish Microsoft Store
defaults:
run:
shell: bash
on:
release:
types: [published]
workflow_dispatch:
inputs:
tag:
description: 'Release tag (e.g. v25.3.0)'
required: true
type: string
concurrency:
group: publish-microsoft-store
cancel-in-progress: false
jobs:
publish-microsoft-store:
runs-on: windows-latest
environment: release
steps:
- name: Resolve version
id: resolve_version
env:
EVENT_NAME: ${{ github.event_name }}
RELEASE_TAG: ${{ github.event.release.tag_name }}
INPUT_TAG: ${{ inputs.tag }}
run: |
if [[ "$EVENT_NAME" == "release" ]]; then
TAG="$RELEASE_TAG"
else
TAG="$INPUT_TAG"
fi
if [[ -z "$TAG" ]]; then
echo "::error::No tag provided"
exit 1
fi
# Validate tag format (v-prefixed semver, e.g. v25.3.0 or v1.2.3-beta.1)
if [[ ! "$TAG" =~ ^v[0-9]+\.[0-9]+\.[0-9]+(-[a-zA-Z0-9.]+)?$ ]]; then
echo "::error::Invalid tag format: $TAG (expected v-prefixed semver, e.g. v25.3.0)"
exit 1
fi
VERSION="${TAG#v}"
echo "tag=$TAG" >> "$GITHUB_OUTPUT"
echo "version=$VERSION" >> "$GITHUB_OUTPUT"
echo "Resolved tag=$TAG version=$VERSION"
- name: Verify release assets exist
env:
GH_TOKEN: ${{ github.token }}
STEPS_RESOLVE_VERSION_OUTPUTS_TAG: ${{ steps.resolve_version.outputs.tag }}
run: |
TAG="${STEPS_RESOLVE_VERSION_OUTPUTS_TAG}"
echo "Checking release assets for tag $TAG..."
ASSETS=$(gh api "repos/${{ github.repository }}/releases/tags/$TAG" --jq '.assets[].name')
echo "Found assets:"
echo "$ASSETS"
if ! echo "$ASSETS" | grep -q "\.appx$"; then
echo "::error::No .appx assets found in release $TAG"
exit 1
fi
echo "Required .appx assets found."
- name: Download Microsoft Store artifacts
env:
GH_TOKEN: ${{ github.token }}
STEPS_RESOLVE_VERSION_OUTPUTS_TAG: ${{ steps.resolve_version.outputs.tag }}
run: |
TAG="${STEPS_RESOLVE_VERSION_OUTPUTS_TAG}"
gh release download "$TAG" --repo "${{ github.repository }}" --pattern "*.appx"
- name: Install StoreBroker
shell: powershell
run: |
Install-Module -Name StoreBroker -AcceptLicense -Force -Scope CurrentUser -Verbose
- name: Submit to Microsoft Store
shell: powershell
run: |
# Disable telemetry
$global:SBDisableTelemetry = $true
# Authenticate against the store
$pass = ConvertTo-SecureString -String '${{ secrets.MICROSOFT_STORE_CLIENT_SECRET }}' -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList ${{ secrets.MICROSOFT_STORE_CLIENT_ID }},$pass
Set-StoreBrokerAuthentication -TenantId '${{ secrets.MICROSOFT_STORE_TENANT_ID }}' -Credential $cred
# Zip and create metadata files
$artifacts = Get-ChildItem -Path . -Filter *.appx | Select-Object -ExpandProperty FullName
New-StoreBrokerConfigFile -Path "$PWD/config.json" -AppId ${{ secrets.MICROSOFT_STORE_PRODUCT_ID }}
New-SubmissionPackage -ConfigPath "$PWD/config.json" -DisableAutoPackageNameFormatting -AppxPath $artifacts -OutPath "$PWD" -OutName submission
# Submit the app
# See https://github.com/microsoft/StoreBroker/blob/master/Documentation/USAGE.md#the-easy-way
Update-ApplicationSubmission `
-AppId ${{ secrets.MICROSOFT_STORE_PRODUCT_ID }} `
-SubmissionDataPath "submission.json" `
-PackagePath "submission.zip" `
-ReplacePackages `
-NoStatus `
-AutoCommit `
-Force
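The resolve_version tag check above is a single extended regex; the same pattern can be exercised standalone (`grep -Eq` here in place of bash's `=~`):

```shell
# v-prefixed semver: v25.3.0, v1.2.3-beta.1, etc.
is_valid_tag() {
  printf '%s\n' "$1" | grep -Eq '^v[0-9]+\.[0-9]+\.[0-9]+(-[a-zA-Z0-9.]+)?$'
}
is_valid_tag "v25.3.0"       && echo "v25.3.0: ok"
is_valid_tag "v1.2.3-beta.1" && echo "v1.2.3-beta.1: ok"
is_valid_tag "25.3.0"        || echo "25.3.0: rejected (no v prefix)"
```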


@@ -20,15 +20,19 @@ concurrency:
jobs:
build:
strategy:
fail-fast: false
matrix:
os:
- ubuntu-22.04
- windows-latest
- macos-latest
runs-on: ${{ matrix.os }}
environment: release
if: github.event.repository.fork == false
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- if: ${{ startsWith(matrix.os, 'windows') }}
run: pip.exe install setuptools
@@ -41,6 +45,9 @@ jobs:
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- if: ${{ startsWith(matrix.os, 'ubuntu') }}
name: Setup Flatpak dependencies


@@ -1,124 +0,0 @@
name: Publish nightly npm packages
# Nightly npm packages are built daily at midnight UTC
on:
schedule:
- cron: '0 0 * * *'
workflow_dispatch:
jobs:
build-and-pack:
runs-on: ubuntu-latest
name: Build and pack npm packages
if: github.event.repository.fork == false
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Set up environment
uses: ./.github/actions/setup
- name: Update package versions
run: |
# Get new nightly versions
NEW_CORE_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/loot-core/package.json --type nightly)
NEW_WEB_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/desktop-client/package.json --type nightly)
NEW_SYNC_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/sync-server/package.json --type nightly)
NEW_API_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/api/package.json --type nightly)
NEW_CLI_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/cli/package.json --type nightly)
# Set package versions
npm version $NEW_CORE_VERSION --no-git-tag-version --workspace=@actual-app/core --no-workspaces-update
npm version $NEW_WEB_VERSION --no-git-tag-version --workspace=@actual-app/web --no-workspaces-update
npm version $NEW_SYNC_VERSION --no-git-tag-version --workspace=@actual-app/sync-server --no-workspaces-update
npm version $NEW_API_VERSION --no-git-tag-version --workspace=@actual-app/api --no-workspaces-update
npm version $NEW_CLI_VERSION --no-git-tag-version --workspace=@actual-app/cli --no-workspaces-update
- name: Yarn install
run: |
yarn install
- name: Pack the core package
run: |
yarn workspace @actual-app/core pack --filename @actual-app/core.tgz
- name: Build Server & Web
run: yarn build:server
- name: Pack the web and server packages
run: |
yarn workspace @actual-app/web pack --filename @actual-app/web.tgz
yarn workspace @actual-app/sync-server pack --filename @actual-app/sync-server.tgz
- name: Build API
run: yarn build:api
- name: Pack the api package
run: |
yarn workspace @actual-app/api pack --filename @actual-app/api.tgz
- name: Build CLI
run: yarn workspace @actual-app/cli build
- name: Pack the cli package
run: |
yarn workspace @actual-app/cli pack --filename @actual-app/cli.tgz
- name: Upload package artifacts
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: npm-packages
path: |
packages/loot-core/@actual-app/core.tgz
packages/desktop-client/@actual-app/web.tgz
packages/sync-server/@actual-app/sync-server.tgz
packages/api/@actual-app/api.tgz
packages/cli/@actual-app/cli.tgz
publish:
runs-on: ubuntu-latest
name: Publish Nightly npm packages
needs: build-and-pack
permissions:
contents: read
packages: write
steps:
- name: Download the artifacts
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: npm-packages
- name: Setup node and npm registry
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: 22
registry-url: 'https://registry.npmjs.org'
- name: Publish Core
run: |
npm publish loot-core/@actual-app/core.tgz --access public --tag nightly
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish Web
run: |
npm publish desktop-client/@actual-app/web.tgz --access public --tag nightly
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish Sync-Server
run: |
npm publish sync-server/@actual-app/sync-server.tgz --access public --tag nightly
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish API
run: |
npm publish api/@actual-app/api.tgz --access public --tag nightly
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish CLI
run: |
npm publish cli/@actual-app/cli.tgz --access public --tag nightly
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}


@@ -1,26 +1,61 @@
name: Publish npm packages
-# Npm packages are published for every new tag
+# Npm packages are published for every new tag and nightly schedule
on:
push:
tags:
- 'v*.*.*'
schedule:
- cron: '0 0 * * *'
workflow_dispatch:
permissions:
contents: read
jobs:
build-and-pack:
runs-on: ubuntu-latest
name: Build and pack npm packages
if: github.event_name == 'push' || (github.event.repository.fork == false)
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
# Avoid restoring potentially poisoned caches in release jobs.
cache: 'false'
- name: Update package versions
if: github.event_name != 'push'
run: |
# Get new nightly versions
NEW_CORE_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/loot-core/package.json --type nightly)
NEW_WEB_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/desktop-client/package.json --type nightly)
NEW_SYNC_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/sync-server/package.json --type nightly)
NEW_API_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/api/package.json --type nightly)
NEW_CLI_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/cli/package.json --type nightly)
# Set package versions
npm version $NEW_CORE_VERSION --no-git-tag-version --workspace=@actual-app/core --no-workspaces-update
npm version $NEW_WEB_VERSION --no-git-tag-version --workspace=@actual-app/web --no-workspaces-update
npm version $NEW_SYNC_VERSION --no-git-tag-version --workspace=@actual-app/sync-server --no-workspaces-update
npm version $NEW_API_VERSION --no-git-tag-version --workspace=@actual-app/api --no-workspaces-update
npm version $NEW_CLI_VERSION --no-git-tag-version --workspace=@actual-app/cli --no-workspaces-update
- name: Yarn install
if: github.event_name != 'push'
run: |
# Required after nightly `npm version` updates workspace manifests.
yarn install
- name: Pack the core package
run: |
yarn workspace @actual-app/core pack --filename @actual-app/core.tgz
-- name: Build Web
+- name: Build Server & Web
run: yarn build:server
- name: Pack the web and server packages
@@ -44,6 +79,7 @@ jobs:
- name: Upload package artifacts
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
if: ${{ !env.ACT }}
with:
name: npm-packages
path: |
@@ -57,9 +93,13 @@ jobs:
runs-on: ubuntu-latest
name: Publish npm packages
needs: build-and-pack
environment: release
permissions:
contents: read
packages: write
id-token: write # Required for OIDC
env:
NPM_DIST_TAG: ${{ github.event_name != 'push' && 'nightly' || '' }}
steps:
- name: Download the artifacts
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
@@ -69,35 +109,28 @@ jobs:
- name: Setup node and npm registry
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
-node-version: 22
+node-version: 24
check-latest: true
# Avoid restoring potentially poisoned caches in release jobs.
package-manager-cache: false
registry-url: 'https://registry.npmjs.org'
- name: Publish Core
run: |
-npm publish loot-core/@actual-app/core.tgz --access public
-env:
-NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
+npm publish loot-core/@actual-app/core.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
- name: Publish Web
run: |
-npm publish desktop-client/@actual-app/web.tgz --access public
-env:
-NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
+npm publish desktop-client/@actual-app/web.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
- name: Publish Sync-Server
run: |
-npm publish sync-server/@actual-app/sync-server.tgz --access public
-env:
-NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
+npm publish sync-server/@actual-app/sync-server.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
- name: Publish API
run: |
-npm publish api/@actual-app/api.tgz --access public
-env:
-NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
+npm publish api/@actual-app/api.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
- name: Publish CLI
run: |
-npm publish cli/@actual-app/cli.tgz --access public
-env:
-NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
+npm publish cli/@actual-app/cli.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
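The publish steps lean on the `${VAR:+word}` parameter expansion: it yields nothing when `NPM_DIST_TAG` is empty or unset (tag pushes publish to `latest`) and yields `--tag nightly` on scheduled runs. Sketched with `echo` standing in for `npm publish`:

```shell
# Empty dist-tag: the whole ${...:+...} expansion disappears.
NPM_DIST_TAG=""
echo npm publish core.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
# Non-empty dist-tag: the flag and its value are appended.
NPM_DIST_TAG="nightly"
echo npm publish core.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
```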


@@ -37,13 +37,15 @@ jobs:
with:
fetch-depth: 0
token: ${{ secrets.ACTIONS_UPDATE_TOKEN || github.token }}
# Need to be able to commit release notes after generation
persist-credentials: true
- name: Get changed files
if: steps.bot-check.outputs.skip != 'true'
id: changed-files
run: |
-git fetch origin ${{ github.base_ref }}
-CHANGED_FILES=$(git diff --name-only origin/${{ github.base_ref }}...HEAD)
+git fetch origin ${GITHUB_BASE_REF}
+CHANGED_FILES=$(git diff --name-only origin/${GITHUB_BASE_REF}...HEAD)
NON_DOCS_FILES=$(echo "$CHANGED_FILES" | grep -v -e "^packages/docs/" -e "^\.github/actions/docs-spelling/" || true)
if [ -z "$NON_DOCS_FILES" ] && [ -n "$CHANGED_FILES" ]; then


@@ -38,6 +38,7 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.base_ref }}
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -104,7 +105,7 @@ jobs:
- name: Report build failure
if: steps.wait-for-web-build.outputs.conclusion == 'failure' || steps.wait-for-api-build.outputs.conclusion == 'failure' || steps.wait-for-cli-build.outputs.conclusion == 'failure' || steps.wait-for-crdt-build.outputs.conclusion == 'failure'
run: |
echo "Build failed on PR branch or ${{github.base_ref}}"
echo "Build failed on PR branch or ${GITHUB_BASE_REF}"
exit 1
- name: Download web build artifact from ${{github.base_ref}}


@@ -3,9 +3,12 @@ on:
schedule:
- cron: '30 1 * * *'
workflow_dispatch: # Allow manual triggering
permissions: {}
jobs:
stale:
permissions:
pull-requests: write
runs-on: ubuntu-latest
steps:
- uses: actions/stale@b5d41d4e1d5dceea10e7104786b73624c18a190f # v10.2.0
@@ -16,6 +19,8 @@ jobs:
days-before-close: 5
days-before-issue-stale: -1
stale-wip:
permissions:
pull-requests: write
runs-on: ubuntu-latest
steps:
- uses: actions/stale@b5d41d4e1d5dceea10e7104786b73624c18a190f # v10.2.0
@@ -27,6 +32,8 @@ jobs:
days-before-issue-stale: -1
stale-needs-info:
permissions:
issues: write
runs-on: ubuntu-latest
steps:
- uses: actions/stale@b5d41d4e1d5dceea10e7104786b73624c18a190f # v10.2.0


@@ -75,9 +75,12 @@ jobs:
echo "Found patch file: $PATCH_FILE"
-# Validate patch only contains PNG files
+# Validate patch only contains PNG files. `git format-patch` emits a
+# `GIT binary patch` block for PNGs (no +++/--- lines), so check
+# `diff --git` headers — those are present for both text and binary.
echo "Validating patch contains only PNG files..."
-if grep -E '^(\+\+\+|---) [ab]/' "$PATCH_FILE" | grep -v '\.png$'; then
+if grep -E '^diff --git ' "$PATCH_FILE" \
+| grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'; then
echo "ERROR: Patch contains non-PNG files! Rejecting for security."
echo "applied=false" >> "$GITHUB_OUTPUT"
echo "error=Patch validation failed: contains non-PNG files" >> "$GITHUB_OUTPUT"
@@ -85,7 +88,7 @@ jobs:
fi
# Extract file list for verification
-FILES_CHANGED=$(grep -E '^\+\+\+ b/' "$PATCH_FILE" | sed 's/^+++ b\///' | wc -l)
+FILES_CHANGED=$(grep -cE '^diff --git ' "$PATCH_FILE")
echo "Patch modifies $FILES_CHANGED PNG file(s)"
# Configure git
@@ -107,7 +110,7 @@ jobs:
fi
# Commit
git commit -m "Update VRT screenshots" -m "Auto-generated by VRT workflow" -m "PR: #${{ steps.metadata.outputs.pr_number }}"
git commit -m "Update VRT screenshots" -m "Auto-generated by VRT workflow" -m "PR: #${STEPS_METADATA_OUTPUTS_PR_NUMBER}"
echo "applied=true" >> "$GITHUB_OUTPUT"
else
@@ -116,6 +119,8 @@ jobs:
echo "error=Patch conflicts with current branch state" >> "$GITHUB_OUTPUT"
exit 1
fi
env:
STEPS_METADATA_OUTPUTS_PR_NUMBER: ${{ steps.metadata.outputs.pr_number }}
- name: Push changes
if: steps.apply.outputs.applied == 'true'
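The PNG-only gate above works because `diff --git` headers are present even for binary PNG hunks (which never carry `+++`/`---` lines). Run against a synthetic patch header list:

```shell
# Two headers: one PNG (allowed), one TypeScript file (must be rejected).
cat > sample.patch <<'EOF'
diff --git a/shots/home.png b/shots/home.png
diff --git a/src/index.ts b/src/index.ts
EOF
# Any header surviving the -v filter is a non-PNG file -> reject.
if grep -E '^diff --git ' sample.patch \
  | grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'; then
  echo "rejected: patch touches non-PNG files"
fi
```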


@@ -36,15 +36,16 @@ jobs:
content: 'eyes'
});
-generate-vrt-updates:
-name: Generate VRT Updates
+get-pr:
+name: Resolve PR details
runs-on: ubuntu-latest
# Only run on PR comments containing /update-vrt
if: >
github.event.issue.pull_request &&
startsWith(github.event.comment.body, '/update-vrt')
-container:
-image: mcr.microsoft.com/playwright:v1.59.1-jammy
+outputs:
+head_sha: ${{ steps.pr.outputs.head_sha }}
+head_ref: ${{ steps.pr.outputs.head_ref }}
+head_repo: ${{ steps.pr.outputs.head_repo }}
steps:
- name: Get PR details
id: pr
@@ -60,9 +61,131 @@ jobs:
core.setOutput('head_ref', pr.head.ref);
core.setOutput('head_repo', pr.head.repo.full_name);
build-web:
name: Build web bundle
runs-on: ubuntu-latest
needs: get-pr
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
-ref: ${{ steps.pr.outputs.head_sha }}
+ref: ${{ needs.get-pr.outputs.head_sha }}
persist-credentials: false
- name: Trust workspace directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
shell: bash
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Build browser bundle
# REACT_APP_NETLIFY=true flips isNonProductionEnvironment() on in the
# bundle so the "Create test file" button (used by every e2e beforeEach
# via ConfigurationPage.createTestFile()) is still rendered in a
# production build. Without it, e2e tests would time out waiting for
# a button that was tree-shaken out.
# --skip-translations keeps VRT screenshots deterministic by rendering
# source-code English instead of upstream Weblate en.json (which can
# drift between snapshot capture and test runs).
env:
REACT_APP_NETLIFY: 'true'
run: yarn build:browser --skip-translations
- name: Upload build artifact
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: desktop-client-build
path: packages/desktop-client/build/
retention-days: 1
overwrite: true
browser-vrt:
name: Browser VRT (shard ${{ matrix.shard }}/3)
runs-on: ubuntu-latest
needs: [get-pr, build-web]
strategy:
fail-fast: false
matrix:
shard: [1, 2, 3]
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
env:
E2E_USE_BUILD: '1'
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ needs.get-pr.outputs.head_sha }}
persist-credentials: false
- name: Trust workspace directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
shell: bash
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Download web build
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: desktop-client-build
path: packages/desktop-client/build/
- name: Run VRT Tests
continue-on-error: true
run: yarn vrt --update-snapshots --shard=${{ matrix.shard }}/3
- name: Create shard patch with PNG changes only
id: create-patch
run: |
git config --global user.name "github-actions[bot]"
git config --global user.email "github-actions[bot]@users.noreply.github.com"
git add "**/*.png"
if git diff --staged --quiet; then
echo "has_changes=false" >> "$GITHUB_OUTPUT"
echo "No VRT changes in this shard"
exit 0
fi
echo "has_changes=true" >> "$GITHUB_OUTPUT"
git commit -m "Update VRT screenshots (browser shard ${{ matrix.shard }})"
git format-patch -1 HEAD --stdout > vrt-shard.patch
# Validate patch only contains PNG files. `git format-patch` emits a
# `GIT binary patch` block for PNGs (no +++/--- lines), so check
# `diff --git` headers — those are present for both text and binary.
if grep -E '^diff --git ' vrt-shard.patch \
| grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'; then
echo "ERROR: Shard patch contains non-PNG files!"
exit 1
fi
- name: Upload shard patch
if: steps.create-patch.outputs.has_changes == 'true'
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: vrt-shard-browser-${{ matrix.shard }}
path: vrt-shard.patch
retention-days: 1
overwrite: true
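The PNG-only guard in the step above can be exercised in isolation. A minimal sketch (the `validate_png_only` helper and the sample patch contents are illustrative, not part of the workflow): `git format-patch` renders PNG changes as `GIT binary patch` blocks with no `+++`/`---` lines, so matching the `diff --git` headers covers text and binary changes alike.

```shell
# Hypothetical helper mirroring the workflow's grep pipeline: succeed only
# when every `diff --git` header names a .png on both sides.
validate_png_only() {
  ! grep -E '^diff --git ' "$1" \
    | grep -qvE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'
}

# Binary PNG patches carry a `diff --git` header but no +++/--- lines.
printf 'diff --git a/shot.png b/shot.png\nGIT binary patch\n' > good.patch
printf 'diff --git a/shot.png b/shot.png\ndiff --git a/app.ts b/app.ts\n' > bad.patch

validate_png_only good.patch && good=pass || good=fail
validate_png_only bad.patch && bad=pass || bad=fail
echo "good=$good bad=$bad"
```

The second `grep -v` selects any header that is not a pure `a/*.png b/*.png` pair, so the leading `!` turns "no offending header found" into success.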
desktop-vrt:
name: Desktop VRT
runs-on: ubuntu-latest
needs: get-pr
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ needs.get-pr.outputs.head_sha }}
persist-credentials: false
- name: Trust workspace directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
shell: bash
- name: Set up environment
uses: ./.github/actions/setup
@@ -73,48 +196,124 @@ jobs:
- name: Install build tools
run: apt-get update && apt-get install -y build-essential python3
- name: Run VRT Tests on Desktop app
- name: Run Desktop VRT Tests
continue-on-error: true
run: |
yarn rebuild-electron
xvfb-run --auto-servernum --server-args="-screen 0 1920x1080x24" -- yarn e2e:desktop --update-snapshots
- name: Run VRT Tests
continue-on-error: true
run: yarn vrt --update-snapshots
- name: Create patch with PNG changes only
- name: Create shard patch with PNG changes only
id: create-patch
run: |
# Trust the repository directory (required for container environments)
git config --global --add safe.directory "$GITHUB_WORKSPACE"
git config --global user.name "github-actions[bot]"
git config --global user.email "github-actions[bot]@users.noreply.github.com"
# Stage only PNG files
git add "**/*.png"
# Check if there are any changes
if git diff --staged --quiet; then
echo "has_changes=false" >> "$GITHUB_OUTPUT"
echo "No VRT changes to commit"
echo "No VRT changes in desktop shard"
exit 0
fi
echo "has_changes=true" >> "$GITHUB_OUTPUT"
# Create commit and patch
git commit -m "Update VRT screenshots"
git format-patch -1 HEAD --stdout > vrt-update.patch
git commit -m "Update VRT screenshots (desktop)"
git format-patch -1 HEAD --stdout > vrt-shard.patch
# Validate patch only contains PNG files
if grep -E '^(\+\+\+|---) [ab]/' vrt-update.patch | grep -v '\.png$'; then
echo "ERROR: Patch contains non-PNG files!"
# See validation note in browser-vrt above.
if grep -E '^diff --git ' vrt-shard.patch \
| grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'; then
echo "ERROR: Desktop shard patch contains non-PNG files!"
exit 1
fi
echo "Patch created successfully with PNG changes only"
- name: Upload shard patch
if: steps.create-patch.outputs.has_changes == 'true'
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: vrt-shard-desktop
path: vrt-shard.patch
retention-days: 1
overwrite: true
merge-patch:
name: Merge VRT Patches
runs-on: ubuntu-latest
needs: [get-pr, browser-vrt, desktop-vrt]
if: ${{ !cancelled() && needs.get-pr.result == 'success' }}
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ needs.get-pr.outputs.head_sha }}
persist-credentials: false
- name: Download all shard patches
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
path: /tmp/shard-patches
pattern: vrt-shard-*
- name: Merge shard patches
id: create-patch
shell: bash
run: |
git config --global --add safe.directory "$GITHUB_WORKSPACE"
git config --global user.name "github-actions[bot]"
git config --global user.email "github-actions[bot]@users.noreply.github.com"
# actions/download-artifact puts a lone matched artifact's files directly
# in `path`, but when several artifacts match it gives each its own
# `path/<name>/` subdir, so recurse with find instead of globbing
# `*/vrt-shard.patch` (a glob would miss the common single-shard case).
mapfile -t patches < <(find /tmp/shard-patches -type f -name 'vrt-shard.patch' | sort)
if [ ${#patches[@]} -eq 0 ]; then
echo "has_changes=false" >> "$GITHUB_OUTPUT"
echo "No shard patches to merge"
exit 0
fi
# Defense in depth: re-validate every shard patch before applying.
# See validation note in browser-vrt above for why we match
# `diff --git` headers instead of +++/--- lines.
for patch in "${patches[@]}"; do
echo "Validating $patch"
if grep -E '^diff --git ' "$patch" \
| grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'; then
echo "ERROR: $patch contains non-PNG files!"
exit 1
fi
done
# Apply each shard patch. Shards touch disjoint PNG files so
# order does not matter. --index stages the applied changes.
for patch in "${patches[@]}"; do
echo "Applying $patch"
git apply --index "$patch"
done
if git diff --staged --quiet; then
echo "has_changes=false" >> "$GITHUB_OUTPUT"
echo "No VRT changes after merge"
exit 0
fi
echo "has_changes=true" >> "$GITHUB_OUTPUT"
git commit -m "Update VRT screenshots"
git format-patch -1 HEAD --stdout > vrt-update.patch
# Final guard on the combined patch.
if grep -E '^diff --git ' vrt-update.patch \
| grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'; then
echo "ERROR: Merged patch contains non-PNG files!"
exit 1
fi
echo "Merged patch created successfully with PNG changes only"
- name: Upload patch artifact
if: steps.create-patch.outputs.has_changes == 'true'
@@ -129,8 +328,11 @@ jobs:
run: |
mkdir -p pr-metadata
echo "${{ github.event.issue.number }}" > pr-metadata/pr-number.txt
echo "${{ steps.pr.outputs.head_ref }}" > pr-metadata/head-ref.txt
echo "${{ steps.pr.outputs.head_repo }}" > pr-metadata/head-repo.txt
echo "${NEEDS_GET_PR_OUTPUTS_HEAD_REF}" > pr-metadata/head-ref.txt
echo "${NEEDS_GET_PR_OUTPUTS_HEAD_REPO}" > pr-metadata/head-repo.txt
env:
NEEDS_GET_PR_OUTPUTS_HEAD_REF: ${{ needs.get-pr.outputs.head_ref }}
NEEDS_GET_PR_OUTPUTS_HEAD_REPO: ${{ needs.get-pr.outputs.head_repo }}
- name: Upload PR metadata
if: steps.create-patch.outputs.has_changes == 'true'
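The layout difference described in the merge step's artifact-discovery comment can be reproduced locally. A sketch under fabricated directory names (`multi/` and `single/` stand in for what actions/download-artifact produces): a recursive `find` collects `vrt-shard.patch` whether or not each artifact got its own subdirectory, where a `*/vrt-shard.patch` glob would only cover the multi-artifact layout.

```shell
# Fabricated stand-ins for the two download-artifact layouts: several
# matching artifacts each get a subdirectory; a lone match does not.
root=$(mktemp -d)
mkdir -p "$root/multi/vrt-shard-browser-1" "$root/multi/vrt-shard-desktop" "$root/single"
touch "$root/multi/vrt-shard-browser-1/vrt-shard.patch" \
      "$root/multi/vrt-shard-desktop/vrt-shard.patch" \
      "$root/single/vrt-shard.patch"

# Recursive find handles both layouts; sort keeps ordering deterministic.
mapfile -t multi < <(find "$root/multi" -type f -name 'vrt-shard.patch' | sort)
mapfile -t single < <(find "$root/single" -type f -name 'vrt-shard.patch' | sort)
echo "multi=${#multi[@]} single=${#single[@]}"
```

`mapfile` is bash-specific; the workflow's `shell: bash` makes that a safe assumption there.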

.gitignore

@@ -42,6 +42,9 @@ bundle.desktop.js.map
bundle.mobile.js
bundle.mobile.js.map
# Python virtualenv (Electron CI provisions one at the repo root for setuptools)
.venv/
# Yarn
.pnp.*
.yarn/*
@@ -92,4 +95,3 @@ storybook-static
.actualrc.yaml
.actualrc.yml
actual.config.js
.playwright-cli/


@@ -15,7 +15,8 @@
"vi": "readonly",
"backend": "readonly",
"importScripts": "readonly",
"FS": "readonly"
"FS": "readonly",
"__APP_VERSION__": "readonly"
},
"rules": {
// Import sorting
@@ -37,6 +38,7 @@
"actual/no-anchor-tag": "error",
"actual/no-react-default-import": "error",
"actual/prefer-subpath-imports": "error",
"actual/enforce-boundaries": "error",
"actual/no-extraneous-dependencies": "error",
// JSX A11y rules
@@ -336,6 +338,11 @@
"group": ["**/*.api", "**/*.electron"],
"message": "Don't directly reference imports from other platforms"
},
{
"group": ["uuid"],
"importNames": ["*"],
"message": "Use `import { v4 as uuidv4 } from 'uuid'` instead"
},
{
"group": ["**/style", "**/colors"],
"importNames": ["colors"],
@@ -369,7 +376,14 @@
"files": ["**/*.test.{js,ts,jsx,tsx}", "packages/docs/**/*"],
"rules": {
"actual/no-untranslated-strings": "off",
"actual/prefer-logger-over-console": "off"
"actual/prefer-logger-over-console": "off",
"typescript/unbound-method": "off"
}
},
{
"files": ["packages/eslint-plugin-actual/lib/rules/__tests__/**/*"],
"rules": {
"actual/enforce-boundaries": "off"
}
},
{


@@ -7,3 +7,7 @@ enableTransparentWorkspaces: false
nodeLinker: node-modules
yarnPath: .yarn/releases/yarn-4.13.0.cjs
# Secure default: don't run postinstall scripts.
# If a new package requires them, add it to dependenciesMeta in package.json.
enableScripts: false


@@ -281,7 +281,6 @@ Always run `yarn typecheck` before committing.
- Avoid `any` or `unknown` unless absolutely necessary
- Look for existing type definitions in the codebase
- Avoid type assertions (`as`, `!`) - prefer `satisfies`
- Use inline type imports: `import { type MyType } from '...'`
**Naming:**


@@ -1 +1,3 @@
Please review the contributing documentation on our website: https://actualbudget.org/docs/contributing/
If you plan to use AI tools when contributing, please also read our [AI Usage Policy](https://actualbudget.org/docs/contributing/ai-usage-policy).


@@ -4,21 +4,30 @@ ROOT=`dirname $0`
cd "$ROOT/.."
echo "Updating translations..."
if ! [ -d packages/desktop-client/locale ]; then
git clone https://github.com/actualbudget/translations packages/desktop-client/locale
SKIP_TRANSLATIONS=false
while [[ $# -gt 0 ]]; do
case "$1" in
--skip-translations)
SKIP_TRANSLATIONS=true
shift
;;
*)
echo "Unknown argument: $1" >&2
exit 1
;;
esac
done
if [ "$SKIP_TRANSLATIONS" = false ]; then
echo "Updating translations..."
if ! [ -d packages/desktop-client/locale ]; then
git clone https://github.com/actualbudget/translations packages/desktop-client/locale
fi
pushd packages/desktop-client/locale > /dev/null
git checkout .
git pull
popd > /dev/null
packages/desktop-client/bin/remove-untranslated-languages
fi
pushd packages/desktop-client/locale > /dev/null
git checkout .
git pull
popd > /dev/null
packages/desktop-client/bin/remove-untranslated-languages
export NODE_OPTIONS="--max-old-space-size=4096"
yarn workspace @actual-app/crdt build
yarn workspace plugins-service build
yarn workspace @actual-app/core build:browser
yarn workspace @actual-app/web build:browser
echo "packages/desktop-client/build"
lage build:browser --to=@actual-app/web


@@ -57,8 +57,7 @@ yarn workspace @actual-app/core build:node
yarn workspace @actual-app/web build --mode=desktop # electron specific build
# required for running the sync-server server
yarn workspace @actual-app/core build:browser
yarn workspace @actual-app/web build:browser
yarn build:browser
yarn workspace @actual-app/sync-server build
# Emit @actual-app/core declarations so desktop-electron (which includes typings/window.ts) can build


@@ -25,6 +25,14 @@ module.exports = {
outputGlob: BUILD_OUTPUT_GLOBS,
},
},
// Not cached: the script stages files into public/ and build-stats/ that
// fall outside BUILD_OUTPUT_GLOBS, so a cache hit would skip the side
// effects.
'build:browser': {
type: 'npmScript',
dependsOn: ['^build'],
cache: false,
},
},
cacheOptions: {
cacheStorageConfig: {


@@ -24,18 +24,16 @@
"start:server-dev": "NODE_ENV=development BROWSER_OPEN=localhost:5006 yarn npm-run-all --parallel 'start:server-monitor' 'start'",
"start:desktop": "yarn desktop-dependencies && npm-run-all --parallel 'start:desktop-*'",
"start:docs": "yarn workspace docs start",
"desktop-dependencies": "npm-run-all --parallel rebuild-electron build:browser-backend build:plugins-service",
"desktop-dependencies": "npm-run-all --parallel rebuild-electron build:plugins-service",
"start:desktop-node": "yarn workspace @actual-app/core watch:node",
"start:desktop-client": "yarn workspace @actual-app/web watch",
"start:desktop-server-client": "yarn workspace @actual-app/web build:browser",
"start:desktop-electron": "yarn workspace desktop-electron watch",
"start:browser": "yarn workspace plugins-service build-dev && npm-run-all --parallel 'start:browser-*'",
"start:browser": "npm-run-all --parallel 'start:browser-*' 'start:service-plugins'",
"start:service-plugins": "yarn workspace plugins-service watch",
"start:browser-backend": "yarn workspace @actual-app/core watch:browser",
"start:browser-frontend": "yarn workspace @actual-app/web start:browser",
"start:storybook": "yarn workspace @actual-app/components start:storybook",
"build": "lage build",
"build:browser-backend": "yarn workspace @actual-app/core build:browser",
"build:server": "yarn build:browser && yarn workspace @actual-app/sync-server build",
"build:browser": "./bin/package-browser",
"build:desktop": "./bin/package-electron",
@@ -54,20 +52,23 @@
"playwright": "yarn workspace @actual-app/web run playwright",
"vrt": "yarn workspace @actual-app/web run vrt",
"vrt:docker": "./bin/run-vrt",
"rebuild-electron": "./node_modules/.bin/electron-rebuild -m ./packages/loot-core && ./node_modules/.bin/electron-rebuild -m ./packages/desktop-electron -o better-sqlite3,bcrypt",
"rebuild-electron": "./node_modules/.bin/electron-rebuild -m ./packages/desktop-electron -o better-sqlite3,bcrypt --build-from-source -f",
"rebuild-node": "yarn workspace @actual-app/core rebuild",
"lint": "oxfmt --check . && oxlint --type-aware --quiet",
"lint:fix": "oxfmt . && oxlint --fix --type-aware --quiet",
"install:server": "yarn workspaces focus @actual-app/sync-server --production",
"constraints": "yarn constraints",
"typecheck": "tsgo -p tsconfig.root.json --noEmit && lage typecheck",
"check:tsconfig-references": "workspaces-to-typescript-project-references --check",
"sync:tsconfig-references": "workspaces-to-typescript-project-references",
"prepare": "husky"
},
"devDependencies": {
"@monorepo-utils/workspaces-to-typescript-project-references": "^2.10.3",
"@octokit/rest": "^22.0.1",
"@types/node": "^22.19.17",
"@types/prompts": "^2.4.9",
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"@typescript/native-preview": "beta",
"@yarnpkg/types": "^4.0.1",
"eslint": "^10.2.0",
"eslint-plugin-perfectionist": "^5.8.0",
@@ -86,6 +87,23 @@
"typescript": "^6.0.2",
"vitest": "^4.1.2"
},
"dependenciesMeta": {
"bcrypt": {
"built": true
},
"better-sqlite3": {
"built": true
},
"electron": {
"built": true
},
"esbuild": {
"built": true
},
"sharp": {
"built": true
}
},
"resolutions": {
"adm-zip": "patch:adm-zip@npm%3A0.5.16#~/.yarn/patches/adm-zip-npm-0.5.16-4556fea098.patch",
"minimatch@10.2.1": "10.2.5",
@@ -98,8 +116,9 @@
"socks": ">=2.8.3"
},
"lint-staged": {
"packages/*/package.json": [
"ts-node ./bin/validate-publish-imports.ts --fix"
"packages/*/{package.json,tsconfig.json}": [
"ts-node ./bin/validate-publish-imports.ts --fix",
"yarn sync:tsconfig-references"
],
"*.{js,mjs,jsx,ts,tsx,md,json,yml,yaml}": [
"oxfmt --no-error-on-unmatched-pattern"


@@ -1,102 +0,0 @@
/// <reference lib="webworker" />
// Worker entry for @actual-app/api's browser build.
//
// This owns the real loot-core instance (sql.js + absurd-sql + IndexedDB)
// and speaks loot-core's existing backend protocol over postMessage:
// main → worker: {id, name, args, undoTag?, catchErrors?}
// worker → main: {type:'reply', id, result, mutated, undoTag}
// {type:'error', id, error}
// {type:'connect'} (handshake heartbeat)
//
// Bootstrapping:
// - We register an `api-browser/init` handler that runs loot-core's public
// init(config), so the main-thread facade can kick off the DB + auth via
// a normal RPC call. The reply carries no return value (loot-core's
// `init(config)` resolves to `lib`, which isn't structured-cloneable).
// - connection.init(self, handlers) starts the message loop and the
// `{type:'connect'}` handshake loot-core's client connection expects.
import * as connection from '@actual-app/core/platform/server/connection';
import { handlers, init } from '@actual-app/core/server/main';
import type { InitConfig } from '@actual-app/core/server/main';
// Dev-server friendliness: consumer bundlers (Vite first, others too) run
// import-analysis on every `.js` URL they serve. loot-core's JS migrations
// use `#`-subpath imports that only resolve inside loot-core — analysis
// fails when those files live under node_modules/@actual-app/api/dist/.
// Our build writes those files with an extra `.data` suffix, so bundlers
// leave them alone. Translate the URLs here so loot-core's fetch layer
// still sees `.js` names both in the manifest and on-disk.
//
// The wrap has to install before connection.init() runs, and
// populateDefaultFilesystem is kicked off lazily from the first
// `load-budget` / init call.
{
const origFetch = globalThis.fetch;
const MIGRATION_JS = /\/data\/migrations\/[^/?]+\.js(\?.*)?$/;
globalThis.fetch = (async (
input: RequestInfo | URL,
initArg?: RequestInit,
): Promise<Response> => {
const url =
typeof input === 'string' ? input : (input as URL | Request).toString();
if (MIGRATION_JS.test(url)) {
// Re-target .js → .js.data before hitting the network.
const patched = url.replace(/(\.js)(\?|$)/, '.js.data$2');
return origFetch(patched, initArg);
}
if (
url.endsWith('/data-file-index.txt') ||
url.endsWith('data-file-index.txt')
) {
const res = await origFetch(input as RequestInfo | URL, initArg);
if (!res.ok) return res;
const text = await res.text();
const rewritten = text.replace(/\.js\.data(\r?\n|$)/g, '.js$1');
return new Response(rewritten, {
status: res.status,
statusText: res.statusText,
headers: res.headers,
});
}
return origFetch(input as RequestInfo | URL, initArg);
}) as typeof fetch;
}
// `api-browser/init` is a worker-local handler; it isn't part of the shared
// Handlers type. Assign via the index-signature cast rather than extending
// the type globally.
(handlers as Record<string, (args?: unknown) => Promise<unknown>>)[
'api-browser/init'
] = async function (args?: unknown) {
const payload = (args ?? {}) as InitConfig & { __assetsBaseUrl?: string };
// Main thread hands us a URL pointing at the api's own dist/ dir. Setting
// PUBLIC_URL here is what makes loot-core's populateDefaultFilesystem
// fetch `data-file-index.txt` / `data/<name>` / `sql-wasm.wasm` from our
// package instead of the consumer's page origin — no manual copy step.
const { __assetsBaseUrl, ...config } = payload;
if (__assetsBaseUrl) {
process.env.PUBLIC_URL = __assetsBaseUrl;
}
await init(config);
// Nothing to return — the resolved `lib` has functions and isn't
// structured-cloneable anyway.
};
self.addEventListener('error', e => {
// eslint-disable-next-line no-console
console.error(
'[api worker] uncaught',
(e as ErrorEvent).error ?? (e as ErrorEvent).message,
);
});
self.addEventListener('unhandledrejection', e => {
// eslint-disable-next-line no-console
console.error(
'[api worker] unhandled rejection',
(e as PromiseRejectionEvent).reason,
);
});
connection.init(self as unknown as Window, handlers);
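The URL retargeting the worker's fetch wrap performs can also be sketched outside TypeScript. A minimal shell approximation (the `retarget` helper and the sample URLs are illustrative, not from the codebase): migration `.js` URLs gain a `.data` suffix before any query string, and every other URL passes through untouched.

```shell
# Illustrative approximation of the worker's MIGRATION_JS rewrite:
# /data/migrations/*.js URLs are retargeted to .js.data, others pass through.
retarget() {
  case "$1" in
    */data/migrations/*.js|*/data/migrations/*.js\?*)
      printf '%s\n' "$1" | sed -E 's/\.js(\?|$)/.js.data\1/' ;;
    *)
      printf '%s\n' "$1" ;;
  esac
}

m=$(retarget 'https://app.example/data/migrations/001_init.js?v=2')
other=$(retarget 'https://app.example/assets/index.js')
echo "$m"
echo "$other"
```

The manifest rewrite in the worker is the inverse mapping: `.js.data` entries in `data-file-index.txt` are reported back to loot-core as plain `.js` names.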


@@ -1,39 +0,0 @@
// Browser main-thread stub for `@actual-app/core/server/main`.
//
// The real loot-core runs inside the worker (see browser-worker.ts). The
// main-thread bundle reuses packages/api/methods.ts verbatim, but that file
// reads `lib.send(...)` from loot-core. Resolving that import to this stub
// routes every call over postMessage instead of touching loot-core on the
// main thread.
export type BrowserSendFn = (name: string, args?: unknown) => Promise<unknown>;
let workerSend: BrowserSendFn = () => {
return Promise.reject(
new Error('@actual-app/api: call init() before any other method'),
);
};
// Shape-cast rather than `typeof import(...)` so this stub stays
// module-graph-independent from the real loot-core.
export const lib = {
send(name: string, args?: unknown) {
return workerSend(name, args);
},
} as unknown as {
send: <T = unknown>(name: string, args?: unknown) => Promise<T>;
};
export function _setBrowserSend(fn: BrowserSendFn) {
workerSend = fn;
}
// Inline InitConfig (matches loot-core's shape) so this stub does not force
// TS to pull in the real @actual-app/core/server/main module graph at all.
export type InitConfig = {
dataDir?: string;
serverURL?: string;
password?: string;
sessionToken?: string;
verbose?: boolean;
};


@@ -1,132 +0,0 @@
// Main-thread RPC bridge to the api worker.
//
// Reuses `createBackendWorker` from loot-core so absurd-sql's main-thread
// plumbing (IDB helper worker, __absurd:* filtering) stays in one place.
// Speaks loot-core's existing backend protocol:
// out: {id, name, args, catchErrors?}
// in : {type:'reply', id, result, error?}
// {type:'error', id, error}
// {type:'connect'} (handshake heartbeat)
// {type:'push', name, args}
//
// We handle the handshake by replying {name:'client-connected-to-backend'}
// on the first 'connect'. Messages sent before handshake completes are
// queued.
import { createBackendWorker } from '@actual-app/core/platform/client/backend-worker';
import type { BackendWorker } from '@actual-app/core/platform/client/backend-worker';
type Pending = {
resolve: (v: unknown) => void;
reject: (e: unknown) => void;
};
type Reply =
| {
type: 'reply';
id: string;
result?: unknown;
error?: { type?: string; message?: string; [k: string]: unknown };
}
| {
type: 'error';
id: string;
error: { type?: string; message?: string; [k: string]: unknown };
};
let backend: BackendWorker | null = null;
let connected = false;
let queue: Array<{ id: string; name: string; args?: unknown }> = [];
const pending = new Map<string, Pending>();
function nextId(): string {
if (typeof crypto !== 'undefined' && 'randomUUID' in crypto) {
return crypto.randomUUID();
}
return Date.now().toString(36) + '-' + Math.random().toString(36).slice(2);
}
function toError(info: { type?: string; message?: string } | undefined) {
const msg = info?.message || info?.type || 'api worker error';
const err = new Error(msg);
if (info?.type) err.name = info.type;
return err;
}
export function setWorker(worker: Worker): BackendWorker {
if (backend) {
backend.terminate();
}
connected = false;
queue = [];
pending.clear();
backend = createBackendWorker(worker);
backend.onMessage((data: unknown) => {
if (!data || typeof data !== 'object') return;
const msg = data as { type?: string; name?: string };
if (msg.type === 'connect') {
if (!connected) {
connected = true;
backend!.postMessage({ name: 'client-connected-to-backend' });
// Drain anything queued while waiting for the handshake.
const drained = queue;
queue = [];
for (const m of drained) backend!.postMessage(m);
}
return;
}
if (msg.type === 'reply' || msg.type === 'error') {
const reply = msg as Reply;
const p = pending.get(reply.id);
if (!p) return;
pending.delete(reply.id);
if (reply.type === 'error') {
p.reject(toError(reply.error));
} else if ('error' in reply && reply.error) {
// api/* handlers funnel errors through the reply envelope.
p.reject(toError(reply.error));
} else {
p.resolve(reply.result);
}
return;
}
// push/capture-exception/etc. — ignore for now; the api consumer
// doesn't subscribe to loot-core's server events.
});
return backend;
}
export function rpc(name: string, args?: unknown): Promise<unknown> {
if (!backend) {
return Promise.reject(
new Error('@actual-app/api: init() must be called before any api method'),
);
}
return new Promise((resolve, reject) => {
const id = nextId();
pending.set(id, { resolve, reject });
const msg = { id, name, args };
if (connected) {
backend!.postMessage(msg);
} else {
queue.push(msg);
}
});
}
export function terminate() {
if (backend) {
backend.terminate();
backend = null;
}
connected = false;
queue = [];
pending.clear();
}


@@ -1,66 +0,0 @@
// Main-thread browser entry for @actual-app/api.
//
// Public surface matches the Node entry. The worker is spawned internally
// so consumers write:
//
// import * as api from '@actual-app/api';
// await api.init({ dataDir: '/documents', serverURL, password });
// await api.getAccounts();
//
// worker.js must be a sibling of browser.js at runtime. Our build ships
// them together in dist/; the consumer's bundler resolves the worker URL
// via `new URL(..., import.meta.url)`.
import { _setBrowserSend } from './browser/lib-stub';
import type { InitConfig } from './browser/lib-stub';
import { rpc, setWorker, terminate } from './browser/rpc';
export * from './methods';
export * as utils from './utils';
// Wire methods.ts's `lib.send` through the worker.
_setBrowserSend((name, args) => rpc(name, args));
function createWorker(): Worker {
// Vite's `vite:worker-import-meta-url` plugin rewrites this pattern at
// the CONSUMER's build time (emit worker.js as an asset, substitute the
// hashed URL). Feeding it a non-literal first argument keeps the api's
// OWN lib build from trying to pre-bundle it, which would fail because
// ./worker.js is not a source-tree sibling of this file.
const rel = './worker.js';
return new Worker(new URL(rel, import.meta.url), { type: 'module' });
}
export async function init(config: InitConfig = {}) {
setWorker(createWorker());
// Point loot-core's browser fs at our dist/ directory. We want the
// directory portion of this bundle's own URL so loot-core's fetches land
// on files we ship (data-file-index.txt, migrations/, default-db.sqlite,
// sql-wasm.wasm). Vite's asset plugin tries to pre-bundle
// `new URL('.', import.meta.url)` at consumer build time and picks up
// the `development` export condition (inlining index.ts as a data URL!).
// Derive the base URL via string manipulation instead so static analyzers
// leave it alone.
const assetsBaseUrl = import.meta.url.replace(/[^/]+$/, '');
await rpc('api-browser/init', { ...config, __assetsBaseUrl: assetsBaseUrl });
// Return a {send} handle compatible with the Node entry so existing
// consumer code that does `const internal = await api.init(...); internal.send(...)`
// keeps working on the browser build too.
return {
send: (name: string, args?: unknown) => rpc(name, args),
};
}
export async function shutdown() {
try {
await rpc('sync');
} catch {
// most likely no budget loaded
}
try {
await rpc('close-budget');
} catch {
// ignore
}
terminate();
}


@@ -2,18 +2,43 @@ import * as fs from 'fs/promises';
import * as path from 'path';
import type { RuleEntity } from '@actual-app/core/types/models';
import { vi } from 'vitest';
import * as api from '../index';
import * as api from './index';
declare global {
var IS_TESTING: boolean;
var currentMonth: string | null;
}
// In tests we run from source; loot-core's API fs uses __dirname (for the built dist/).
// Mock the fs so path constants point at loot-core package root where migrations live.
vi.mock(
'../loot-core/src/platform/server/fs/index.api',
async importOriginal => {
const actual = (await importOriginal()) as Record<string, unknown>;
const pathMod = await import('path');
const lootCoreRoot = pathMod.join(__dirname, '..', 'loot-core');
return {
...actual,
migrationsPath: pathMod.join(lootCoreRoot, 'migrations'),
bundledDatabasePath: pathMod.join(lootCoreRoot, 'default-db.sqlite'),
demoBudgetPath: pathMod.join(lootCoreRoot, 'demo-budget'),
};
},
);
const budgetName = 'test-budget';
global.IS_TESTING = true;
beforeEach(async () => {
const budgetPath = path.join(__dirname, '/../mocks/budgets/', budgetName);
const budgetPath = path.join(__dirname, '/mocks/budgets/', budgetName);
await fs.rm(budgetPath, { force: true, recursive: true });
await createTestBudget('default-budget-template', budgetName);
await api.init({
dataDir: path.join(__dirname, '/../mocks/budgets/'),
dataDir: path.join(__dirname, '/mocks/budgets/'),
});
});
@@ -25,10 +50,10 @@ afterEach(async () => {
async function createTestBudget(templateName: string, name: string) {
const templatePath = path.join(
__dirname,
'/../../loot-core/src/mocks/files',
'/../loot-core/src/mocks/files',
templateName,
);
const budgetPath = path.join(__dirname, '/../mocks/budgets/', name);
const budgetPath = path.join(__dirname, '/mocks/budgets/', name);
await fs.mkdir(budgetPath);
await fs.copyFile(
@@ -491,6 +516,29 @@ describe('API CRUD operations', () => {
);
});
// apis: getNote, updateNote
test('Notes: successfully get and update note', async () => {
const categories = await api.getCategories();
const categoryId = categories[0].id;
// No note exists initially
const initial = await api.getNote(categoryId);
expect(initial).toBeNull();
// Set a note
await api.updateNote(categoryId, 'Test note content');
const afterSet = await api.getNote(categoryId);
expect(afterSet).toEqual({ id: categoryId, note: 'Test note content' });
// Update the note
await api.updateNote(categoryId, 'Updated note content');
const afterUpdate = await api.getNote(categoryId);
expect(afterUpdate).toEqual({
id: categoryId,
note: 'Updated note content',
});
});
// apis: getRules, getPayeeRules, createRule, updateRule, deleteRule
test('Rules: successfully update rules', async () => {
await api.createPayee({ name: 'test-payee' });


@@ -13,6 +13,7 @@ import type { ImportTransactionsOpts } from '@actual-app/core/types/api-handlers
import type { Handlers } from '@actual-app/core/types/handlers';
import type {
ImportTransactionEntity,
NoteEntity,
RuleEntity,
TransactionEntity,
} from '@actual-app/core/types/models';
@@ -203,8 +204,8 @@ export function getAccountBalance(id: APIAccountEntity['id'], cutoff?: Date) {
return send('api/account-balance', { id, cutoff });
}
export function getCategoryGroups() {
return send('api/category-groups-get');
export function getCategoryGroups(options: { hidden?: boolean } = {}) {
return send('api/category-groups-get', options);
}
export function createCategoryGroup(group: Omit<APICategoryGroupEntity, 'id'>) {
@@ -225,8 +226,8 @@ export function deleteCategoryGroup(
return send('api/category-group-delete', { id, transferCategoryId });
}
export function getCategories() {
return send('api/categories-get', { grouped: false });
export function getCategories(options: { hidden?: boolean } = {}) {
return send('api/categories-get', { grouped: false, ...options });
}
export function createCategory(category: Omit<APICategoryEntity, 'id'>) {
@@ -247,6 +248,14 @@ export function deleteCategory(
return send('api/category-delete', { id, transferCategoryId });
}
export function getNote(id: NoteEntity['id']) {
return send('api/note-get', { id });
}
export function updateNote(id: NoteEntity['id'], note: NoteEntity['note']) {
return send('api/note-update', { id, note });
}
export function getCommonPayees() {
return send('api/common-payees-get');
}

packages/api/models.ts

@@ -0,0 +1 @@
export type * from '@actual-app/core/server/api-models';


@@ -1,11 +1,18 @@
{
"name": "@actual-app/api",
"version": "26.4.0",
"version": "26.5.2",
"description": "An API for Actual",
"license": "MIT",
"repository": {
"type": "git",
"url": "git+https://github.com/actualbudget/actual.git",
"directory": "packages/api"
},
"files": [
"@types",
"dist"
"dist",
"!@types/**/*.test.d.ts",
"!@types/**/*.test.d.ts.map"
],
"main": "dist/index.js",
"types": "@types/index.d.ts",
@@ -13,45 +20,43 @@
".": {
"types": "./@types/index.d.ts",
"development": "./index.ts",
"browser": "./dist/browser.js",
"default": "./dist/index.js"
},
"./models": {
"types": "./@types/models.d.ts",
"development": "./models.ts",
"default": "./dist/models.js"
}
},
"publishConfig": {
"exports": {
".": {
"types": "./@types/index.d.ts",
"browser": "./dist/browser.js",
"default": "./dist/index.js"
},
"./models": {
"types": "./@types/models.d.ts",
"default": "./dist/models.js"
}
}
},
"scripts": {
"build": "npm-run-all -s build:node build:browser-worker build:browser",
"build:node": "vite build --config vite.config.mts && tsgo --emitDeclarationOnly",
"build:browser": "vite build --config vite.browser.config.mts",
"build:browser-worker": "vite build --config vite.browser-worker.config.mts",
"test": "npm-run-all -cp 'test:*'",
"test:node": "vitest --run --config vite.config.mts",
"test:browser": "vitest --run --config vitest.browser.config.mts",
"build": "vite build && tsgo --emitDeclarationOnly",
"test": "vitest --run",
"typecheck": "tsgo -b && tsc-strict"
},
"dependencies": {
"@actual-app/core": "workspace:*",
"@actual-app/crdt": "workspace:*",
"absurd-sql": "0.0.54",
"better-sqlite3": "^12.8.0",
"compare-versions": "^6.1.1"
"compare-versions": "^6.1.1",
"uuid": "^14.0.0"
},
"devDependencies": {
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"fake-indexeddb": "^6.2.5",
"jsdom": "^27.4.0",
"npm-run-all": "^4.1.5",
"@typescript/native-preview": "beta",
"rollup-plugin-visualizer": "^7.0.1",
"typescript-strict-plugin": "^2.4.4",
"vite": "^8.0.5",
"vite-plugin-node-polyfills": "^0.26.0",
"vite-plugin-peggy-loader": "^2.0.1",
"vitest": "^4.1.2"
},

View File

@@ -1,183 +0,0 @@
import { afterEach, describe, expect, test, vi } from 'vitest';
import * as api from '../index.browser';
// Swap the real Worker constructor for a mock that the tests control. Vitest
// picks this up via vite.config resolve.alias; here we just stand in globally
// because jsdom does not ship Worker at all.
class MockWorker {
public posted: Array<unknown> = [];
public responder: (
req: { id: string; name: string; args?: unknown },
reply: (res: unknown) => void,
) => void = () => undefined;
private listeners: Array<(e: MessageEvent) => void> = [];
onmessage: ((e: MessageEvent) => void) | null = null;
onerror: ((e: ErrorEvent) => void) | null = null;
private connected = false;
addEventListener(type: string, handler: (e: MessageEvent) => void) {
if (type === 'message') this.listeners.push(handler);
}
removeEventListener() {
// no-op for tests
}
postMessage(msg: unknown) {
this.posted.push(msg);
if (
msg &&
typeof msg === 'object' &&
(msg as { name?: string }).name === 'client-connected-to-backend'
) {
// Handshake complete; we won't keep sending 'connect' heartbeats.
return;
}
const req = msg as { id: string; name: string; args?: unknown };
queueMicrotask(() => {
this.responder(req, (data: unknown) => {
const ev = { data } as MessageEvent;
this.onmessage?.(ev);
for (const l of this.listeners) l(ev);
});
});
}
/** Simulate loot-core's connect handshake from the worker side. */
fireConnect() {
if (this.connected) return;
this.connected = true;
const ev = { data: { type: 'connect' } } as MessageEvent;
this.onmessage?.(ev);
for (const l of this.listeners) l(ev);
}
terminate() {
this.listeners = [];
}
}
// Every Worker the api spawns inside init() comes through here.
let lastMockWorker: MockWorker | null = null;
const mockWorkerResponder = vi.fn<
(
req: { id: string; name: string; args?: unknown },
reply: (res: unknown) => void,
) => void
>(() => undefined);
// Global Worker stub — the api's internal `new Worker(...)` will call this.
// @ts-expect-error jsdom has no Worker; we override the global for the test.
globalThis.Worker = class {
constructor(_url: URL | string, _opts?: WorkerOptions) {
const w = new MockWorker();
w.responder = (req, reply) => mockWorkerResponder(req, reply);
lastMockWorker = w;
// Fire the connect handshake on the next tick so init() resolves.
queueMicrotask(() => w.fireConnect());
return w as unknown as Worker;
}
};
// absurd-sql's main-thread bridge expects real Worker event semantics. The
// mock above exposes addEventListener; initSQLBackend just attaches a
// message listener, so it's safe with jsdom.
afterEach(async () => {
// Keep whatever responder the test installed so shutdown's sync/close-budget
// calls resolve rather than hang.
await api.shutdown().catch(() => undefined);
mockWorkerResponder.mockReset();
lastMockWorker = null;
});
describe('@actual-app/api browser facade', () => {
test('spawns a worker on init and forwards config via api-browser/init', async () => {
mockWorkerResponder.mockImplementation((req, reply) => {
reply({ type: 'reply', id: req.id, result: undefined });
});
await api.init({
dataDir: '/documents',
serverURL: 'https://example.test',
password: 'pw',
});
expect(lastMockWorker).toBeTruthy();
// First post after the handshake ack is the api-browser/init request.
const initCall = lastMockWorker!.posted.find(
m =>
m &&
typeof m === 'object' &&
(m as { name?: string }).name === 'api-browser/init',
) as { name: string; args: unknown } | undefined;
expect(initCall).toBeTruthy();
expect(initCall!.args).toMatchObject({
dataDir: '/documents',
serverURL: 'https://example.test',
password: 'pw',
});
// The api also hands over its own asset base URL so loot-core's fs
// can fetch migrations / default-db / WASM from the api's dist/
// instead of the consumer's page origin.
expect(
(initCall!.args as { __assetsBaseUrl?: string }).__assetsBaseUrl,
).toBeTypeOf('string');
});
test('rpc methods forward as {id, name, args} and read {type:reply, result}', async () => {
mockWorkerResponder.mockImplementation((req, reply) => {
if (req.name === 'api-browser/init') {
reply({ type: 'reply', id: req.id, result: undefined });
return;
}
if (req.name === 'api/accounts-get') {
reply({
type: 'reply',
id: req.id,
result: [{ id: 'a1', name: 'Checking' }],
});
return;
}
reply({
type: 'error',
id: req.id,
error: { type: 'APIError', message: 'unexpected' },
});
});
await api.init({ dataDir: '/documents' });
const accounts = await api.getAccounts();
expect(accounts).toEqual([{ id: 'a1', name: 'Checking' }]);
const sendCalls = lastMockWorker!.posted.filter(
m =>
m &&
typeof m === 'object' &&
(m as { name?: string }).name === 'api/accounts-get',
);
expect(sendCalls).toHaveLength(1);
expect((sendCalls[0] as { args?: unknown }).args).toBeUndefined();
});
test('worker errors reject at the call site', async () => {
mockWorkerResponder.mockImplementation((req, reply) => {
if (req.name === 'api-browser/init') {
reply({ type: 'reply', id: req.id, result: undefined });
return;
}
reply({
type: 'reply',
id: req.id,
error: { type: 'APIError', message: 'budget not loaded' },
});
});
await api.init({ dataDir: '/documents' });
await expect(api.getAccounts()).rejects.toThrow(/budget not loaded/);
});
});

View File

@@ -1,43 +0,0 @@
import { afterEach, describe, expect, test } from 'vitest';
import * as api from '../index';
declare const __API_DATA_DIR__: string;
afterEach(async () => {
await api.shutdown();
});
describe('api CRUD roundtrip (Node)', () => {
test('creates a budget, writes, reads it back', async () => {
const internal = await api.init({ dataDir: __API_DATA_DIR__ });
await internal.send('create-budget', {
budgetName: 'Integration Test',
testMode: true,
testBudgetId: 'integration-test',
});
await api.loadBudget('integration-test');
const accountId = await api.createAccount(
{ name: 'Checking', offbudget: false },
0,
);
await api.addTransactions(accountId, [
{ date: '2026-04-01', amount: 1000, payee_name: 'Coffee' },
{ date: '2026-04-02', amount: -500, payee_name: 'Book' },
]);
const accounts = await api.getAccounts();
expect(accounts.map(a => a.name)).toContain('Checking');
const txns = await api.getTransactions(
accountId,
'2026-04-01',
'2026-04-30',
);
expect(txns).toHaveLength(2);
expect(txns.map(t => t.amount).sort((a, b) => a - b)).toEqual([-500, 1000]);
});
});

View File

@@ -1,31 +0,0 @@
import * as fsPromises from 'fs/promises';
import * as os from 'os';
import * as path from 'path';
import { vi } from 'vitest';
// In tests we run from source; loot-core's API fs uses __dirname (for the built dist/).
// Mock the fs so path constants point at loot-core package root where migrations live.
vi.mock(
'../../loot-core/src/platform/server/fs/index.api',
async importOriginal => {
const actual = (await importOriginal()) as Record<string, unknown>;
const lootCoreRoot = path.join(__dirname, '..', '..', 'loot-core');
return {
...actual,
migrationsPath: path.join(lootCoreRoot, 'migrations'),
bundledDatabasePath: path.join(lootCoreRoot, 'default-db.sqlite'),
demoBudgetPath: path.join(lootCoreRoot, 'demo-budget'),
};
},
);
global.IS_TESTING = true;
// Shared integration test lives in a filesystem-backed tmp dir.
const dataDir = path.join(
os.tmpdir(),
`api-it-${Date.now()}-${Math.random().toString(36).slice(2)}`,
);
await fsPromises.mkdir(dataDir, { recursive: true });
globalThis.__API_DATA_DIR__ = dataDir;

View File

@@ -8,28 +8,33 @@
"module": "es2022",
"moduleResolution": "bundler",
"customConditions": ["api"],
// composite + declaration: true require `noEmit: false`, so use
// emitDeclarationOnly to keep typecheck + project refs working without
// clobbering the Vite build artifacts in dist/. build:node also passes
// --emitDeclarationOnly on the CLI (redundant but explicit).
"noEmit": false,
"emitDeclarationOnly": true,
"declaration": true,
"declarationMap": true,
"outDir": "dist",
"rootDir": ".",
"declarationDir": "@types",
"tsBuildInfoFile": "dist/.tsbuildinfo",
"plugins": [{ "name": "typescript-strict-plugin", "paths": ["."] }]
"plugins": [
{
"name": "typescript-strict-plugin",
"paths": ["."]
}
]
},
"references": [{ "path": "../crdt" }, { "path": "../loot-core" }],
"references": [
{
"path": "../loot-core"
},
{
"path": "../crdt"
}
],
"include": ["."],
"exclude": [
"**/node_modules/*",
"dist",
"@types",
"**/*.test.ts",
"test/setup.*.ts",
"*.config.ts",
"*.config.mts"
]

View File

@@ -1,62 +0,0 @@
import path from 'path';
import { defineConfig } from 'vite';
import { nodePolyfills } from 'vite-plugin-node-polyfills';
import peggyLoader from 'vite-plugin-peggy-loader';
const distDir = path.resolve(__dirname, 'dist');
// Worker bundle: contains the full loot-core + sql.js + absurd-sql stack.
// Runs inside a Web Worker where absurd-sql's Atomics.wait has the right
// thread context. Consumer spawns the worker with this file as the entry.
export default defineConfig({
define: {
// NODE_ENV is read at build time by dead-code elimination paths and
// must stay a literal. The others (PUBLIC_URL, DATA_DIR, SERVER_URL,
// DOCUMENT_DIR) are set at runtime via the `api-browser/init` handler
// which receives them from the main thread — so they stay as
// `process.env.<name>` references and the nodePolyfills-provided
// process shim serves as the backing store.
'process.env.NODE_ENV': JSON.stringify('production'),
},
build: {
target: 'esnext',
outDir: distDir,
emptyOutDir: false,
sourcemap: true,
lib: {
entry: path.resolve(__dirname, 'browser-worker.ts'),
formats: ['es'],
fileName: () => 'worker.js',
},
rollupOptions: {
output: {
codeSplitting: false,
},
},
},
plugins: [
peggyLoader(),
nodePolyfills({
include: [
'process',
'buffer',
'stream',
'path',
'crypto',
'timers',
'util',
'zlib',
'fs',
'assert',
],
globals: {
process: true,
Buffer: true,
global: true,
},
}),
],
// Intentionally no resolve.conditions: ['api'] — loot-core falls back to
// its default (browser) platform files.
});

View File

@@ -1,39 +0,0 @@
import path from 'path';
import { defineConfig } from 'vite';
const distDir = path.resolve(__dirname, 'dist');
// Main-thread facade only. Tiny bundle: no loot-core, no sql.js, no absurd-sql.
// The worker is built separately by vite.browser-worker.config.mts. The
// consumer constructs the Worker (handling URL resolution through their own
// bundler) and hands it to init().
export default defineConfig({
build: {
target: 'esnext',
outDir: distDir,
emptyOutDir: false,
sourcemap: true,
lib: {
entry: path.resolve(__dirname, 'index.browser.ts'),
formats: ['es'],
fileName: () => 'browser.js',
},
rollupOptions: {
output: {
codeSplitting: false,
},
},
},
resolve: {
alias: {
// methods.ts reads `lib.send` from loot-core's server/main. Route it
// through the main-thread stub so loot-core is never pulled into
// the main bundle.
'@actual-app/core/server/main': path.resolve(
__dirname,
'browser/lib-stub.ts',
),
},
},
});

View File

@@ -49,61 +49,10 @@ function copyMigrationsAndDefaultDb() {
throw new Error(`default-db.sqlite not found at ${defaultDbPath}`);
}
fs.copyFileSync(defaultDbPath, path.join(distDir, 'default-db.sqlite'));
// Browser consumers need sql.js' WASM to be served at the same origin
// as the bundle. Ship it alongside dist/ so downstream apps just point
// a static handler at dist and don't have to reach into node_modules.
const sqlJsWasm = require.resolve('@jlongster/sql.js/dist/sql-wasm.wasm');
fs.copyFileSync(sqlJsWasm, path.join(distDir, 'sql-wasm.wasm'));
// loot-core's browser fs bootstraps by fetching:
// `${PUBLIC_URL}data-file-index.txt` - flat manifest
// `${PUBLIC_URL}data/<name>` - each file listed in the manifest
// We point PUBLIC_URL at the api's dist dir at runtime (see
// index.browser.ts), so these two shapes need to exist here.
//
// JS migrations get a `.data` suffix on the *wire* path. Consumer
// bundlers (Vite's dev server first, others to varying degrees)
// auto-transform `.js` URLs through their import-analysis pipelines,
// which fails on loot-core's `#`-subpath imports. The api's worker
// (browser-worker.ts) wraps `fetch` to translate back to `.js` so
// loot-core's migration runner finds the file under its original
// name in the virtual FS. `.sql` migrations stay as-is.
const dataDir = path.join(distDir, 'data');
const dataMigrationsDir = path.join(dataDir, 'migrations');
fs.mkdirSync(dataMigrationsDir, { recursive: true });
linkOrCopy(
path.join(distDir, 'default-db.sqlite'),
path.join(dataDir, 'default-db.sqlite'),
);
const wireMigrationNames: string[] = [];
for (const name of fs.readdirSync(migrationsDest)) {
const wireName = name.endsWith('.js') ? `${name}.data` : name;
linkOrCopy(
path.join(migrationsDest, name),
path.join(dataMigrationsDir, wireName),
);
wireMigrationNames.push(`migrations/${wireName}`);
}
wireMigrationNames.sort();
// data-file-index.txt: one path per line, relative to `data/`.
const manifest =
['default-db.sqlite', ...wireMigrationNames].join('\n') + '\n';
fs.writeFileSync(path.join(distDir, 'data-file-index.txt'), manifest);
},
};
}
function linkOrCopy(src: string, dest: string) {
try {
fs.linkSync(src, dest);
} catch {
fs.copyFileSync(src, dest);
}
}
export default defineConfig({
ssr: {
noExternal: true,
@@ -117,9 +66,12 @@ export default defineConfig({
emptyOutDir: true,
sourcemap: true,
lib: {
entry: path.resolve(__dirname, 'index.ts'),
entry: {
index: path.resolve(__dirname, 'index.ts'),
models: path.resolve(__dirname, 'models.ts'),
},
formats: ['cjs'],
fileName: () => 'index.js',
fileName: (_format, entryName) => `${entryName}.js`,
},
},
plugins: [
@@ -133,9 +85,12 @@ export default defineConfig({
},
test: {
globals: true,
environment: 'node',
setupFiles: ['./test/setup.node.ts'],
exclude: ['**/node_modules/**', '**/browser-facade.test.ts'],
// Each test loads a budget file and runs all DB migrations, which can be
// slow on busy CI runners; the default 5s timeout is too tight and causes
// flaky timeouts (and a cascade of unhandled rejections from in-flight work
// continuing after teardown).
testTimeout: 20_000,
hookTimeout: 20_000,
onConsoleLog(log: string, type: 'stdout' | 'stderr'): boolean | void {
// print only console.error
return type === 'stderr';

View File

@@ -1,35 +0,0 @@
import path from 'path';
import { defineConfig } from 'vite';
import peggyLoader from 'vite-plugin-peggy-loader';
// Deliberately independent from vite.browser.config.mts: the build config
// applies node polyfills that would swap out Node fs in the test setup
// file. The test setup uses real Node fs to stream the on-disk fixtures
// (default-db.sqlite, migrations, sql.js WASM) through a fetch polyfill.
export default defineConfig({
plugins: [peggyLoader()],
// The facade test imports `../index.browser` directly and uses a mock
// Worker. loot-core never loads on the main thread, so no platform
// condition juggling is needed here. The sibling vite.browser.config.mts
// aliases loot-core to the stub for the bundled facade; for the test we
// mirror that so `methods.ts` resolves correctly.
resolve: {
alias: {
'@actual-app/core/server/main': path.resolve(
__dirname,
'browser/lib-stub.ts',
),
},
},
test: {
globals: true,
environment: 'jsdom',
include: ['test/browser-facade.test.ts'],
onConsoleLog(log: string, type: 'stdout' | 'stderr'): boolean | void {
return type === 'stderr';
},
maxWorkers: 2,
},
});

View File

@@ -69,6 +69,8 @@ const botEmail = '41898282+github-actions[bot]@users.noreply.github.com';
await exec(`git config user.name '${botName}'`);
await exec(`git config user.email '${botEmail}'`);
const AUTOGEN_MARKER = '<!-- release-notes:auto-generated -->';
await group('Prepare branch', async () => {
if (process.env.GITHUB_HEAD_REF) {
await exec(`git fetch origin ${process.env.GITHUB_HEAD_REF}`, {
@@ -79,17 +81,34 @@ await group('Prepare branch', async () => {
});
}
// the previous generation commit deletes source files from
// upcoming-release-notes, rebase it out so we can regenerate from all of them
const { stdout: commitHash } = await exec(
`git log --grep='${commitMessage}' --format=%H -1`,
// recover deleted release note files from previous generation commits
const baseRef = process.env.GITHUB_BASE_REF || 'master';
await exec(`git fetch origin ${baseRef}`, { stdio: 'inherit' });
const { stdout: mergeBase } = await exec(
`git merge-base HEAD origin/${baseRef}`,
);
const hash = commitHash.trim();
if (hash) {
console.log(`Dropping previous release notes commit ${hash}`);
await exec(`git rebase --onto ${hash}~1 ${hash}`, {
stdio: 'inherit',
});
const base = mergeBase.trim();
const { stdout: genLog } = await exec(
`git log --grep='${commitMessage}' --format=%H ${base}..HEAD`,
);
const genCommits = genLog.split('\n').filter(Boolean);
console.log(
`Reversing upcoming-release-notes deletions from ${genCommits.length} prior generation commit(s)`,
);
const tmpDir = process.env.RUNNER_TEMP || '/tmp';
for (const sha of genCommits) {
const patchPath = join(tmpDir, `revert-${sha}.patch`);
try {
await exec(
`git diff --diff-filter=D ${sha}~1..${sha} -- upcoming-release-notes > ${patchPath}`,
);
const { size } = await fs.stat(patchPath);
if (size > 0) {
await exec(`git apply -R --3way ${patchPath}`, { stdio: 'inherit' });
}
} finally {
await fs.unlink(patchPath).catch(() => undefined);
}
}
});
@@ -107,13 +126,14 @@ if (files.length === 0) {
const highlights = '- TODO: Add release highlights';
await group('Generate blog post', async () => {
const blogPath = join(
'packages/docs/blog',
`${releaseDate}-release-${slug}.md`,
);
const blogPath = join(
'packages/docs/blog',
`${releaseDate}-release-${slug}.md`,
);
const releasesPath = 'packages/docs/docs/releases.md';
const blogContent = `---
await group('Generate blog post', async () => {
const template = `---
title: Release ${version}
description: New release of Actual.
date: ${releaseDate}T10:00
@@ -129,18 +149,60 @@ ${highlights}
**Docker Tag: ${version}**
${AUTOGEN_MARKER}
${categorizedNotes}
`;
let blogContent;
try {
const existing = await fs.readFile(blogPath, 'utf-8');
const idx = existing.indexOf(AUTOGEN_MARKER);
if (idx === -1) {
console.log(
`WARNING: ${blogPath} missing ${AUTOGEN_MARKER}, rewriting from template`,
);
blogContent = template;
} else {
blogContent =
existing.slice(0, idx + AUTOGEN_MARKER.length) +
'\n' +
categorizedNotes +
'\n';
}
} catch (e) {
if (e.code !== 'ENOENT') throw e;
blogContent = template;
}
await fs.writeFile(blogPath, blogContent);
console.log(`Wrote ${blogPath}`);
});
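The marker-splice step above (keep everything up to `AUTOGEN_MARKER`, regenerate everything after it, fall back to the full template when the marker is gone) can be sketched in isolation. The helper name and signature below are illustrative, not part of the script:

```typescript
// Illustrative helper: regenerate everything after a marker while keeping
// the hand-written preamble above it. Falls back to the full template when
// the marker is missing (mirroring the WARNING path in the script).
function spliceAfterMarker(
  existing: string,
  marker: string,
  generated: string,
  template: string,
): string {
  const idx = existing.indexOf(marker);
  if (idx === -1) return template;
  return existing.slice(0, idx + marker.length) + '\n' + generated + '\n';
}

// Example: the hand-written highlights survive, the notes are replaced.
const doc = 'Highlights\n<!-- marker -->\nold notes\n';
const out = spliceAfterMarker(doc, '<!-- marker -->', 'new notes', 'TEMPLATE');
// out === 'Highlights\n<!-- marker -->\nnew notes\n'
```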
await group('Update releases.md', async () => {
const releasesPath = 'packages/docs/docs/releases.md';
const existing = await fs.readFile(releasesPath, 'utf-8');
const newSection = `## ${version}
const sectionRe = new RegExp(
`(^|\\n)## ${escapeRegExp(version)}\\n[\\s\\S]*?(?=\\n## |$)`,
);
const match = existing.match(sectionRe);
let updated;
if (match) {
const section = match[0];
const idx = section.indexOf(AUTOGEN_MARKER);
if (idx === -1) {
console.log(
`WARNING: section for ${version} in ${releasesPath} missing ${AUTOGEN_MARKER}, leaving as-is`,
);
updated = existing;
} else {
const newSection =
section.slice(0, idx + AUTOGEN_MARKER.length) + '\n' + categorizedNotes;
updated = existing.replace(section, newSection);
}
} else {
const newSection = `## ${version}
Release date: ${releaseDate}
@@ -148,12 +210,14 @@ ${highlights}
**Docker Tag: ${version}**
${categorizedNotes}`;
${AUTOGEN_MARKER}
const updated = existing.replace(
'# Release Notes\n',
`# Release Notes\n\n${newSection}\n`,
);
${categorizedNotes}`;
updated = existing.replace(
'# Release Notes\n',
`# Release Notes\n\n${newSection}\n`,
);
}
await fs.writeFile(releasesPath, updated);
console.log(`Updated ${releasesPath}`);
@@ -165,13 +229,28 @@ await group('Remove used release notes', async () => {
);
});
await group('Format generated files', async () => {
await exec(`yarn exec oxfmt ${blogPath} ${releasesPath}`, {
stdio: 'inherit',
});
});
await group('Commit and push', async () => {
await exec(
'git add upcoming-release-notes packages/docs/blog packages/docs/docs/releases.md',
{ stdio: 'inherit' },
);
try {
await exec('git diff --cached --quiet');
console.log('No changes to commit');
return;
} catch {
// there are staged changes
}
await exec(`git commit -m '${commitMessage}'`);
await exec('git push --force-with-lease origin', { stdio: 'inherit' });
await exec('git push origin', { stdio: 'inherit' });
});
async function parseReleaseNotes(dir) {
@@ -205,6 +284,10 @@ async function parseReleaseNotes(dir) {
return { notesByCategory, files };
}
function escapeRegExp(str) {
return str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}
function formatNotes(notes) {
return Object.entries(notes)
.filter(([_, values]) => values.length > 0)

View File

@@ -9,7 +9,7 @@
},
"devDependencies": {
"@octokit/rest": "^22.0.1",
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"@typescript/native-preview": "beta",
"extensionless": "^2.0.6",
"gray-matter": "^4.0.3",
"listify": "^1.0.3",

View File

@@ -43,13 +43,16 @@ Configuration is resolved in this order (highest priority first):
### Environment Variables
| Variable | Description |
| ---------------------- | --------------------------------------------- |
| `ACTUAL_SERVER_URL` | URL of the Actual sync server (required) |
| `ACTUAL_PASSWORD` | Server password (required unless using token) |
| `ACTUAL_SESSION_TOKEN` | Session token (alternative to password) |
| `ACTUAL_SYNC_ID` | Budget Sync ID (required for most commands) |
| `ACTUAL_DATA_DIR` | Local directory for cached budget data |
| Variable | Description |
| ---------------------- | ----------------------------------------------------- |
| `ACTUAL_SERVER_URL` | URL of the Actual sync server (required) |
| `ACTUAL_PASSWORD` | Server password (required unless using token) |
| `ACTUAL_SESSION_TOKEN` | Session token (alternative to password) |
| `ACTUAL_SYNC_ID` | Budget Sync ID (required for most commands) |
| `ACTUAL_DATA_DIR` | Local directory for cached budget data |
| `ACTUAL_CACHE_TTL` | Cache TTL in seconds (default: 60) |
| `ACTUAL_LOCK_TIMEOUT` | Budget-dir lock wait timeout in seconds (default: 10) |
| `ACTUAL_NO_LOCK` | Set to `1` to disable budget-dir locking |
### Config File
@@ -59,7 +62,10 @@ Create an `.actualrc.json` (or `.actualrc`, `.actualrc.yaml`, `actual.config.js`
{
"serverUrl": "http://localhost:5006",
"password": "your-password",
"syncId": "1cfdbb80-6274-49bf-b0c2-737235a4c81f"
"syncId": "1cfdbb80-6274-49bf-b0c2-737235a4c81f",
"cacheTtl": 60,
"lockTimeout": 10,
"noLock": false
}
```
@@ -74,6 +80,11 @@ Create an `.actualrc.json` (or `.actualrc`, `.actualrc.yaml`, `actual.config.js`
| `--session-token <token>` | Session token |
| `--sync-id <id>` | Budget Sync ID |
| `--data-dir <path>` | Data directory |
| `--cache-ttl <seconds>` | Cache TTL; `0` disables caching (default: 60) |
| `--refresh` | Force a sync on this call, ignoring the cache |
| `--no-cache` | Alias for `--refresh` |
| `--lock-timeout <secs>` | Lock wait timeout (default: 10) |
| `--no-lock` | Disable budget-dir locking (use with care) |
| `--format <format>` | Output format: `json` (default), `table`, `csv` |
| `--verbose` | Show informational messages |
@@ -92,6 +103,7 @@ Create an `.actualrc.json` (or `.actualrc`, `.actualrc.yaml`, `actual.config.js`
| `schedules` | Manage scheduled transactions |
| `query` | Run an ActualQL query |
| `server` | Server utilities and lookups |
| `sync` | Refresh or inspect local cache |
Run `actual <command> --help` for subcommands and options.
@@ -135,22 +147,32 @@ All monetary amounts are **integer cents** when passed as input (flags, JSON):
- **Split transactions:** When summing or counting transactions, filter `"is_parent": false` to avoid double-counting. A split parent holds the total amount, and its children hold the individual parts — including both would count the total twice.
- **Avoid rapid sequential requests:** Each CLI invocation opens a new server connection. Running queries in a tight loop (e.g. one per month) may trigger rate limiting or authentication failures. Instead, fetch all data in a single query with a date range filter and process locally:
- **Rapid sequential requests:** The CLI caches the budget locally (see [Caching](#caching)), so read-heavy scripts no longer need a single-query workaround by default. For very chatty scripts, run `actual sync` once and then use a long `--cache-ttl` for reads:
```bash
# Good: single query for the full year
actual query run --table transactions \
--filter '{"$and":[{"date":{"$gte":"2025-01-01"}},{"date":{"$lte":"2025-12-31"}}]}' \
--limit 5000
# Bad: one query per month in a loop (may fail with auth errors)
for month in 01 02 03 ...; do actual query run ...; done
actual sync
actual --cache-ttl 3600 query run ...
actual --cache-ttl 3600 accounts list
```
- **Uncategorized transactions:** `category.name` is `null` for transactions without a category. Account for this when filtering or grouping by category.
- **No date sub-fields in AQL:** `date.month`, `date.year`, etc. are not supported as query fields. To group by month, fetch raw transactions with a date range filter and aggregate locally in a script.
## Caching
The CLI keeps a local copy of your budget so repeated commands don't hit the sync server on every call. Within the TTL (default `60` seconds), read commands (`list`, `balance`, `query run`, …) reuse the cached budget without a network round-trip. Write commands (`add`, `update`, `set-amount`, …) always sync with the server before and after the write.
- `actual sync` — refresh the cache now.
- `actual sync --status` — show how stale the local cache is.
- `actual sync --clear` — delete the local cache; the next command re-downloads.
- `--refresh` (or `--no-cache`) — force a sync on a single call.
- `--cache-ttl <seconds>` — override the TTL for a single call (use `0` to disable caching).
### Concurrency
The CLI takes a shared lock for reads and an exclusive lock for writes on the per-budget cache directory. Many parallel reads are safe; writes serialize. If another CLI process is holding the lock, subsequent invocations wait up to `--lock-timeout` seconds (default `10`) before failing with an error. Pass `--no-lock` to opt out in trusted single-process setups.
## Running Locally (Development)
If you're working on the CLI within the monorepo:

View File

@@ -1,8 +1,13 @@
{
"name": "@actual-app/cli",
"version": "26.4.0",
"version": "26.5.2",
"description": "CLI for Actual Budget",
"license": "MIT",
"repository": {
"type": "git",
"url": "git+https://github.com/actualbudget/actual.git",
"directory": "packages/cli"
},
"bin": {
"actual": "./dist/cli.js",
"actual-cli": "./dist/cli.js"
@@ -12,10 +17,12 @@
],
"type": "module",
"imports": {
"#cache": "./src/cache.ts",
"#commands/*": "./src/commands/*.ts",
"#config": "./src/config.ts",
"#connection": "./src/connection.ts",
"#input": "./src/input.ts",
"#lock": "./src/lock.ts",
"#output": "./src/output.ts",
"#utils": "./src/utils.ts"
},
@@ -28,11 +35,13 @@
"@actual-app/api": "workspace:*",
"cli-table3": "^0.6.5",
"commander": "^14.0.3",
"cosmiconfig": "^9.0.1"
"cosmiconfig": "^9.0.1",
"proper-lockfile": "^4.1.2"
},
"devDependencies": {
"@types/node": "^22.19.17",
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"@types/proper-lockfile": "^4",
"@typescript/native-preview": "beta",
"rollup-plugin-visualizer": "^7.0.1",
"vite": "^8.0.5",
"vitest": "^4.1.2"

View File

@@ -0,0 +1,206 @@
import {
existsSync,
mkdtempSync,
readFileSync,
rmSync,
writeFileSync,
} from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import {
CACHE_FILE_NAME,
decideSyncAction,
readCacheState,
writeCacheState,
} from './cache';
describe('readCacheState', () => {
let dir: string;
beforeEach(() => {
dir = mkdtempSync(join(tmpdir(), 'actual-cli-cache-'));
});
afterEach(() => {
rmSync(dir, { recursive: true, force: true });
});
it('returns null when the file does not exist', () => {
expect(readCacheState(dir)).toBeNull();
});
it('returns null when the file is corrupt', () => {
writeFileSync(join(dir, CACHE_FILE_NAME), 'not json');
expect(readCacheState(dir)).toBeNull();
});
it('returns null when the file has the wrong version', () => {
writeFileSync(
join(dir, CACHE_FILE_NAME),
JSON.stringify({
version: 999,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1,
lastDownloadedAt: 1,
}),
);
expect(readCacheState(dir)).toBeNull();
});
it('returns the parsed state when the file is valid', () => {
writeFileSync(
join(dir, CACHE_FILE_NAME),
JSON.stringify({
version: 1,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1234,
lastDownloadedAt: 5678,
}),
);
expect(readCacheState(dir)).toEqual({
version: 1,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1234,
lastDownloadedAt: 5678,
});
});
});
describe('writeCacheState', () => {
let dir: string;
beforeEach(() => {
dir = mkdtempSync(join(tmpdir(), 'actual-cli-cache-'));
});
afterEach(() => {
rmSync(dir, { recursive: true, force: true });
});
it('writes the state to the cache file', () => {
writeCacheState(dir, {
version: 1,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1,
lastDownloadedAt: 1,
});
const raw = readFileSync(join(dir, CACHE_FILE_NAME), 'utf-8');
expect(JSON.parse(raw).syncId).toBe('a');
});
it('is atomic: removes the tmp file after rename', () => {
writeCacheState(dir, {
version: 1,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1,
lastDownloadedAt: 1,
});
expect(existsSync(join(dir, `${CACHE_FILE_NAME}.tmp`))).toBe(false);
});
it('does not throw when the filesystem refuses the write', () => {
// Force ENOTDIR by pointing writeCacheState at a path whose parent is a
// regular file — no OS-specific pseudo-filesystem semantics needed.
const file = join(dir, 'not-a-dir');
writeFileSync(file, '');
expect(() =>
writeCacheState(join(file, 'nested'), {
version: 1,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1,
lastDownloadedAt: 1,
}),
).not.toThrow();
});
});
describe('decideSyncAction', () => {
const base = {
state: {
version: 1 as const,
syncId: 'sync-1',
budgetId: 'bud-1',
serverUrl: 'http://s',
lastSyncedAt: 1_000_000,
lastDownloadedAt: 1_000_000,
},
config: { syncId: 'sync-1', serverUrl: 'http://s' },
now: 1_000_000,
ttlMs: 60_000,
mutates: false,
refresh: false,
encrypted: false,
};
it('returns "download" when state is null', () => {
expect(decideSyncAction({ ...base, state: null }).action).toBe('download');
});
it('returns "download" when syncId changed', () => {
expect(
decideSyncAction({
...base,
config: { ...base.config, syncId: 'other' },
}).action,
).toBe('download');
});
it('returns "download" when serverUrl changed', () => {
expect(
decideSyncAction({
...base,
config: { ...base.config, serverUrl: 'http://other' },
}).action,
).toBe('download');
});
it('returns "skip" for a read within the TTL', () => {
expect(decideSyncAction({ ...base, now: 1_000_000 + 30_000 }).action).toBe(
'skip',
);
});
it('returns "sync" for a read past the TTL', () => {
expect(decideSyncAction({ ...base, now: 1_000_000 + 61_000 }).action).toBe(
'sync',
);
});
it('returns "sync" for a write even when fresh', () => {
expect(decideSyncAction({ ...base, mutates: true }).action).toBe('sync');
});
it('returns "sync" when refresh is true', () => {
expect(decideSyncAction({ ...base, refresh: true }).action).toBe('sync');
});
it('returns "sync" when ttlMs is 0', () => {
expect(decideSyncAction({ ...base, ttlMs: 0 }).action).toBe('sync');
});
it('returns "sync" for encrypted budgets within the TTL', () => {
expect(decideSyncAction({ ...base, encrypted: true }).action).toBe('sync');
});
it('treats clock skew (negative age) as stale', () => {
expect(decideSyncAction({ ...base, now: 999_999 }).action).toBe('sync');
});
it('carries cached state on non-download actions', () => {
const decision = decideSyncAction({ ...base, mutates: true });
expect(decision).toEqual({ action: 'sync', state: base.state });
});
});

packages/cli/src/cache.ts Normal file
View File

@@ -0,0 +1,107 @@
import { randomBytes } from 'node:crypto';
import { mkdirSync, readFileSync, renameSync, writeFileSync } from 'node:fs';
import { join } from 'node:path';
import { isRecord } from './utils';
export const CACHE_FILE_NAME = 'state.json';
export const CACHE_VERSION = 1;
export const META_ROOT_DIR = '.actual-cli';
export type CacheState = {
version: typeof CACHE_VERSION;
syncId: string;
budgetId: string;
serverUrl: string;
lastSyncedAt: number;
lastDownloadedAt: number;
};
export function getMetaDir(dataDir: string, syncId: string): string {
return join(dataDir, META_ROOT_DIR, syncId);
}
function cachePath(metaDir: string): string {
return join(metaDir, CACHE_FILE_NAME);
}
function isCacheState(value: unknown): value is CacheState {
if (!isRecord(value)) return false;
return (
value.version === CACHE_VERSION &&
typeof value.syncId === 'string' &&
typeof value.budgetId === 'string' &&
typeof value.serverUrl === 'string' &&
typeof value.lastSyncedAt === 'number' &&
typeof value.lastDownloadedAt === 'number'
);
}
export function readCacheState(metaDir: string): CacheState | null {
let raw: string;
try {
raw = readFileSync(cachePath(metaDir), 'utf-8');
} catch {
return null;
}
let parsed: unknown;
try {
parsed = JSON.parse(raw);
} catch {
return null;
}
return isCacheState(parsed) ? parsed : null;
}
export function writeCacheState(metaDir: string, state: CacheState): void {
try {
mkdirSync(metaDir, { recursive: true });
const target = cachePath(metaDir);
// Unique tmp name per writer: concurrent shared-lock commands (encrypted
// budgets, --refresh, stale TTL) can both publish, and a shared tmp path
// would let the second writer's truncate destroy the first writer's bytes
// before either renames into place.
const tmp = `${target}.${process.pid}-${randomBytes(4).toString('hex')}.tmp`;
writeFileSync(tmp, JSON.stringify(state));
renameSync(tmp, target);
} catch {
// Cache persistence is best-effort. A read-only or unreachable dir must
// not crash the CLI; the next invocation simply won't find a cache.
}
}
export type SyncDecision =
| { action: 'download' }
| { action: 'skip'; state: CacheState }
| { action: 'sync'; state: CacheState };
export type DecideSyncArgs = {
state: CacheState | null;
config: { syncId: string; serverUrl: string };
now: number;
ttlMs: number;
mutates: boolean;
refresh: boolean;
encrypted: boolean;
};
export function decideSyncAction({
state,
config,
now,
ttlMs,
mutates,
refresh,
encrypted,
}: DecideSyncArgs): SyncDecision {
if (state === null) return { action: 'download' };
if (state.syncId !== config.syncId) return { action: 'download' };
if (state.serverUrl !== config.serverUrl) return { action: 'download' };
if (mutates || refresh || ttlMs === 0 || encrypted) {
return { action: 'sync', state };
}
const age = now - state.lastSyncedAt;
if (age < 0) return { action: 'sync', state };
if (age < ttlMs) return { action: 'skip', state };
return { action: 'sync', state };
}
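Taken together, the cache helpers above give each command a three-way choice: download, skip, or sync. The sketch below shows how a command might consume that decision. It is illustrative only: the decision logic is restated inline (omitting the `refresh` and `encrypted` flags for brevity) so the snippet stands alone, rather than importing the real `decideSyncAction` from `#cache`.

```typescript
// Minimal restatement of the sync decision from cache.ts, for illustration.
type CacheState = {
  version: 1;
  syncId: string;
  budgetId: string;
  serverUrl: string;
  lastSyncedAt: number;
  lastDownloadedAt: number;
};

type SyncDecision =
  | { action: 'download' }
  | { action: 'skip'; state: CacheState }
  | { action: 'sync'; state: CacheState };

function decide(
  state: CacheState | null,
  config: { syncId: string; serverUrl: string },
  now: number,
  ttlMs: number,
  mutates: boolean,
): SyncDecision {
  // No usable local copy: missing cache, or it belongs to another
  // budget/server, forces a fresh download.
  if (
    state === null ||
    state.syncId !== config.syncId ||
    state.serverUrl !== config.serverUrl
  ) {
    return { action: 'download' };
  }
  // Writes always sync; a zero TTL disables skipping entirely.
  if (mutates || ttlMs === 0) return { action: 'sync', state };
  const age = now - state.lastSyncedAt;
  // Negative age means clock skew; treat it as stale rather than fresh.
  if (age >= 0 && age < ttlMs) return { action: 'skip', state };
  return { action: 'sync', state };
}

const cached: CacheState = {
  version: 1,
  syncId: 'sync-1',
  budgetId: 'bud-1',
  serverUrl: 'http://s',
  lastSyncedAt: 1_000_000,
  lastDownloadedAt: 1_000_000,
};
const config = { syncId: 'sync-1', serverUrl: 'http://s' };

// A fresh read skips the network round-trip; a write always syncs.
const freshRead = decide(cached, config, 1_030_000, 60_000, false);
const write = decide(cached, config, 1_030_000, 60_000, true);
console.log(freshRead.action, write.action); // skip sync
```

The payoff is that read-only commands (`accounts list`, `categories list`, and so on) can run entirely against the cached budget within the TTL, while anything marked `{ mutates: true }` pays the sync cost unconditionally.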

View File

@@ -14,26 +14,30 @@ export function registerAccountsCommand(program: Command) {
.option('--include-closed', 'Include closed accounts', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const allAccounts = await api.getAccounts();
const accounts = allAccounts.filter(
a => cmdOpts.includeClosed || !a.closed,
);
// Stable sort: on-budget first, off-budget second
// (preserves API sort_order within each group)
accounts.sort((a, b) => Number(a.offbudget) - Number(b.offbudget));
const balances = await Promise.all(
accounts.map(a => api.getAccountBalance(a.id)),
);
const output = accounts.map((a, i) => ({
id: a.id,
name: a.name,
offbudget: a.offbudget,
closed: a.closed,
balance: balances[i],
}));
printOutput(output, opts.format);
});
await withConnection(
opts,
async () => {
const allAccounts = await api.getAccounts();
const accounts = allAccounts.filter(
a => cmdOpts.includeClosed || !a.closed,
);
// Stable sort: on-budget first, off-budget second
// (preserves API sort_order within each group)
accounts.sort((a, b) => Number(a.offbudget) - Number(b.offbudget));
const balances = await Promise.all(
accounts.map(a => api.getAccountBalance(a.id)),
);
const output = accounts.map((a, i) => ({
id: a.id,
name: a.name,
offbudget: a.offbudget,
closed: a.closed,
balance: balances[i],
}));
printOutput(output, opts.format);
},
{ mutates: false },
);
});
accounts
@@ -49,13 +53,17 @@ export function registerAccountsCommand(program: Command) {
.action(async cmdOpts => {
const balance = parseIntFlag(cmdOpts.balance, '--balance');
const opts = program.opts();
await withConnection(opts, async () => {
const id = await api.createAccount(
{ name: cmdOpts.name, offbudget: cmdOpts.offbudget },
balance,
);
printOutput({ id }, opts.format);
});
await withConnection(
opts,
async () => {
const id = await api.createAccount(
{ name: cmdOpts.name, offbudget: cmdOpts.offbudget },
balance,
);
printOutput({ id }, opts.format);
},
{ mutates: true },
);
});
accounts
@@ -81,10 +89,14 @@ export function registerAccountsCommand(program: Command) {
'No update fields provided. Use --name or --offbudget.',
);
}
await withConnection(opts, async () => {
await api.updateAccount(id, fields);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.updateAccount(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
accounts
@@ -100,14 +112,18 @@ export function registerAccountsCommand(program: Command) {
)
.action(async (id: string, cmdOpts) => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.closeAccount(
id,
cmdOpts.transferAccount,
cmdOpts.transferCategory,
);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.closeAccount(
id,
cmdOpts.transferAccount,
cmdOpts.transferCategory,
);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
accounts
@@ -115,10 +131,14 @@ export function registerAccountsCommand(program: Command) {
.description('Reopen a closed account')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.reopenAccount(id);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.reopenAccount(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
accounts
@@ -126,10 +146,14 @@ export function registerAccountsCommand(program: Command) {
.description('Delete an account')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.deleteAccount(id);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.deleteAccount(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
accounts
@@ -148,9 +172,13 @@ export function registerAccountsCommand(program: Command) {
cutoff = cutoffDate;
}
const opts = program.opts();
await withConnection(opts, async () => {
const balance = await api.getAccountBalance(id, cutoff);
printOutput({ id, balance }, opts.format);
});
await withConnection(
opts,
async () => {
const balance = await api.getAccountBalance(id, cutoff);
printOutput({ id, balance }, opts.format);
},
{ mutates: false },
);
});
}

View File

@@ -1,7 +1,6 @@
import * as api from '@actual-app/api';
import type { Command } from 'commander';
import { resolveConfig } from '#config';
import { withConnection } from '#connection';
import { printOutput } from '#output';
import { parseBoolFlag, parseIntFlag } from '#utils';
@@ -20,7 +19,7 @@ export function registerBudgetsCommand(program: Command) {
const result = await api.getBudgets();
printOutput(result, opts.format);
},
{ loadBudget: false },
{ mutates: false, skipBudget: true },
);
});
@@ -30,40 +29,33 @@ export function registerBudgetsCommand(program: Command) {
.option('--encryption-password <password>', 'Encryption password')
.action(async (syncId: string, cmdOpts) => {
const opts = program.opts();
const config = await resolveConfig(opts);
const password = config.encryptionPassword ?? cmdOpts.encryptionPassword;
await withConnection(
opts,
async () => {
async config => {
const password =
cmdOpts.encryptionPassword ?? config.encryptionPassword;
await api.downloadBudget(syncId, {
password,
});
printOutput({ success: true, syncId }, opts.format);
},
{ loadBudget: false },
{ mutates: false, skipBudget: true },
);
});
budgets
.command('sync')
.description('Sync the current budget')
.action(async () => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.sync();
printOutput({ success: true }, opts.format);
});
});
budgets
.command('months')
.description('List available budget months')
.action(async () => {
const opts = program.opts();
await withConnection(opts, async () => {
const result = await api.getBudgetMonths();
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const result = await api.getBudgetMonths();
printOutput(result, opts.format);
},
{ mutates: false },
);
});
budgets
@@ -71,10 +63,14 @@ export function registerBudgetsCommand(program: Command) {
.description('Get budget data for a specific month (YYYY-MM)')
.action(async (month: string) => {
const opts = program.opts();
await withConnection(opts, async () => {
const result = await api.getBudgetMonth(month);
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const result = await api.getBudgetMonth(month);
printOutput(result, opts.format);
},
{ mutates: false },
);
});
budgets
@@ -89,10 +85,14 @@ export function registerBudgetsCommand(program: Command) {
.action(async cmdOpts => {
const amount = parseIntFlag(cmdOpts.amount, '--amount');
const opts = program.opts();
await withConnection(opts, async () => {
await api.setBudgetAmount(cmdOpts.month, cmdOpts.category, amount);
printOutput({ success: true }, opts.format);
});
await withConnection(
opts,
async () => {
await api.setBudgetAmount(cmdOpts.month, cmdOpts.category, amount);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
});
budgets
@@ -104,10 +104,14 @@ export function registerBudgetsCommand(program: Command) {
.action(async cmdOpts => {
const flag = parseBoolFlag(cmdOpts.flag, '--flag');
const opts = program.opts();
await withConnection(opts, async () => {
await api.setBudgetCarryover(cmdOpts.month, cmdOpts.category, flag);
printOutput({ success: true }, opts.format);
});
await withConnection(
opts,
async () => {
await api.setBudgetCarryover(cmdOpts.month, cmdOpts.category, flag);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
});
budgets
@@ -121,10 +125,14 @@ export function registerBudgetsCommand(program: Command) {
.action(async cmdOpts => {
const parsedAmount = parseIntFlag(cmdOpts.amount, '--amount');
const opts = program.opts();
await withConnection(opts, async () => {
await api.holdBudgetForNextMonth(cmdOpts.month, parsedAmount);
printOutput({ success: true }, opts.format);
});
await withConnection(
opts,
async () => {
await api.holdBudgetForNextMonth(cmdOpts.month, parsedAmount);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
});
budgets
@@ -133,9 +141,13 @@ export function registerBudgetsCommand(program: Command) {
.requiredOption('--month <month>', 'Budget month (YYYY-MM)')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.resetBudgetHold(cmdOpts.month);
printOutput({ success: true }, opts.format);
});
await withConnection(
opts,
async () => {
await api.resetBudgetHold(cmdOpts.month);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
});
}

View File

@@ -0,0 +1,131 @@
import * as api from '@actual-app/api';
import { Command } from 'commander';
import { printOutput } from '#output';
import { registerCategoriesCommand } from './categories';
import { registerCategoryGroupsCommand } from './category-groups';
vi.mock('@actual-app/api', () => ({
getCategories: vi.fn().mockResolvedValue([]),
createCategory: vi.fn().mockResolvedValue('new-id'),
updateCategory: vi.fn().mockResolvedValue(undefined),
deleteCategory: vi.fn().mockResolvedValue(undefined),
getCategoryGroups: vi.fn().mockResolvedValue([]),
createCategoryGroup: vi.fn().mockResolvedValue('new-group-id'),
updateCategoryGroup: vi.fn().mockResolvedValue(undefined),
deleteCategoryGroup: vi.fn().mockResolvedValue(undefined),
}));
vi.mock('#connection', () => ({
withConnection: vi.fn((_opts, fn) => fn()),
}));
vi.mock('#output', () => ({
printOutput: vi.fn(),
}));
function createProgram(): Command {
const program = new Command();
program.option('--format <format>');
program.option('--server-url <url>');
program.option('--password <pw>');
program.option('--session-token <token>');
program.option('--sync-id <id>');
program.option('--data-dir <dir>');
program.option('--verbose');
program.exitOverride();
registerCategoriesCommand(program);
registerCategoryGroupsCommand(program);
return program;
}
async function run(args: string[]) {
const program = createProgram();
await program.parseAsync(['node', 'test', ...args]);
}
describe('categories commands', () => {
let stderrSpy: ReturnType<typeof vi.spyOn>;
let stdoutSpy: ReturnType<typeof vi.spyOn>;
beforeEach(() => {
vi.clearAllMocks();
stderrSpy = vi
.spyOn(process.stderr, 'write')
.mockImplementation(() => true);
stdoutSpy = vi
.spyOn(process.stdout, 'write')
.mockImplementation(() => true);
});
afterEach(() => {
stderrSpy.mockRestore();
stdoutSpy.mockRestore();
});
describe('categories list', () => {
it('asks the API to exclude hidden categories by default', async () => {
await run(['categories', 'list']);
expect(api.getCategories).toHaveBeenCalledWith({ hidden: false });
});
it('asks the API for all categories when --include-hidden is passed', async () => {
await run(['categories', 'list', '--include-hidden']);
expect(api.getCategories).toHaveBeenCalledWith({});
});
it('prints whatever the API returns', async () => {
const visible = {
id: '1',
name: 'Visible',
group_id: 'g1',
hidden: false,
};
vi.mocked(api.getCategories).mockResolvedValue([visible]);
await run(['categories', 'list']);
expect(printOutput).toHaveBeenCalledWith([visible], undefined);
});
it('passes format option to printOutput', async () => {
vi.mocked(api.getCategories).mockResolvedValue([]);
await run(['--format', 'csv', 'categories', 'list']);
expect(printOutput).toHaveBeenCalledWith([], 'csv');
});
});
describe('category-groups list', () => {
it('asks the API to exclude hidden groups by default', async () => {
await run(['category-groups', 'list']);
expect(api.getCategoryGroups).toHaveBeenCalledWith({ hidden: false });
});
it('asks the API for all groups when --include-hidden is passed', async () => {
await run(['category-groups', 'list', '--include-hidden']);
expect(api.getCategoryGroups).toHaveBeenCalledWith({});
});
it('prints whatever the API returns', async () => {
const group = {
id: 'g1',
name: 'Group',
is_income: false,
hidden: false,
categories: [{ id: 'c1', name: 'Cat', group_id: 'g1', hidden: false }],
};
vi.mocked(api.getCategoryGroups).mockResolvedValue([group]);
await run(['category-groups', 'list']);
expect(printOutput).toHaveBeenCalledWith([group], undefined);
});
});
});

View File

@@ -12,13 +12,20 @@ export function registerCategoriesCommand(program: Command) {
categories
.command('list')
.description('List all categories')
.action(async () => {
.description('List categories (excludes hidden by default)')
.option('--include-hidden', 'Include hidden categories', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const result = await api.getCategories();
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const result = await api.getCategories(
cmdOpts.includeHidden ? {} : { hidden: false },
);
printOutput(result, opts.format);
},
{ mutates: false },
);
});
categories
@@ -29,15 +36,19 @@ export function registerCategoriesCommand(program: Command) {
.option('--is-income', 'Mark as income category', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const id = await api.createCategory({
name: cmdOpts.name,
group_id: cmdOpts.groupId,
is_income: cmdOpts.isIncome,
hidden: false,
});
printOutput({ id }, opts.format);
});
await withConnection(
opts,
async () => {
const id = await api.createCategory({
name: cmdOpts.name,
group_id: cmdOpts.groupId,
is_income: cmdOpts.isIncome,
hidden: false,
});
printOutput({ id }, opts.format);
},
{ mutates: true },
);
});
categories
@@ -55,10 +66,14 @@ export function registerCategoriesCommand(program: Command) {
throw new Error('No update fields provided. Use --name or --hidden.');
}
const opts = program.opts();
await withConnection(opts, async () => {
await api.updateCategory(id, fields);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.updateCategory(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
categories
@@ -67,9 +82,13 @@ export function registerCategoriesCommand(program: Command) {
.option('--transfer-to <id>', 'Transfer transactions to this category')
.action(async (id: string, cmdOpts) => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.deleteCategory(id, cmdOpts.transferTo);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.deleteCategory(id, cmdOpts.transferTo);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
}

View File

@@ -12,13 +12,20 @@ export function registerCategoryGroupsCommand(program: Command) {
groups
.command('list')
.description('List all category groups')
.action(async () => {
.description('List category groups (excludes hidden by default)')
.option('--include-hidden', 'Include hidden groups and categories', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const result = await api.getCategoryGroups();
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const result = await api.getCategoryGroups(
cmdOpts.includeHidden ? {} : { hidden: false },
);
printOutput(result, opts.format);
},
{ mutates: false },
);
});
groups
@@ -28,14 +35,18 @@ export function registerCategoryGroupsCommand(program: Command) {
.option('--is-income', 'Mark as income group', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const id = await api.createCategoryGroup({
name: cmdOpts.name,
is_income: cmdOpts.isIncome,
hidden: false,
});
printOutput({ id }, opts.format);
});
await withConnection(
opts,
async () => {
const id = await api.createCategoryGroup({
name: cmdOpts.name,
is_income: cmdOpts.isIncome,
hidden: false,
});
printOutput({ id }, opts.format);
},
{ mutates: true },
);
});
groups
@@ -53,10 +64,14 @@ export function registerCategoryGroupsCommand(program: Command) {
throw new Error('No update fields provided. Use --name or --hidden.');
}
const opts = program.opts();
await withConnection(opts, async () => {
await api.updateCategoryGroup(id, fields);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.updateCategoryGroup(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
groups
@@ -65,9 +80,13 @@ export function registerCategoryGroupsCommand(program: Command) {
.option('--transfer-to <id>', 'Transfer transactions to this category ID')
.action(async (id: string, cmdOpts) => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.deleteCategoryGroup(id, cmdOpts.transferTo);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.deleteCategoryGroup(id, cmdOpts.transferTo);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
}

View File

@@ -12,10 +12,14 @@ export function registerPayeesCommand(program: Command) {
.description('List all payees')
.action(async () => {
const opts = program.opts();
await withConnection(opts, async () => {
const result = await api.getPayees();
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const result = await api.getPayees();
printOutput(result, opts.format);
},
{ mutates: false },
);
});
payees
@@ -23,10 +27,14 @@ export function registerPayeesCommand(program: Command) {
.description('List frequently used payees')
.action(async () => {
const opts = program.opts();
await withConnection(opts, async () => {
const result = await api.getCommonPayees();
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const result = await api.getCommonPayees();
printOutput(result, opts.format);
},
{ mutates: false },
);
});
payees
@@ -35,10 +43,14 @@ export function registerPayeesCommand(program: Command) {
.requiredOption('--name <name>', 'Payee name')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const id = await api.createPayee({ name: cmdOpts.name });
printOutput({ id }, opts.format);
});
await withConnection(
opts,
async () => {
const id = await api.createPayee({ name: cmdOpts.name });
printOutput({ id }, opts.format);
},
{ mutates: true },
);
});
payees
@@ -54,10 +66,14 @@ export function registerPayeesCommand(program: Command) {
);
}
const opts = program.opts();
await withConnection(opts, async () => {
await api.updatePayee(id, fields);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.updatePayee(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
payees
@@ -65,10 +81,14 @@ export function registerPayeesCommand(program: Command) {
.description('Delete a payee')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.deletePayee(id);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.deletePayee(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
payees
@@ -87,9 +107,13 @@ export function registerPayeesCommand(program: Command) {
);
}
const opts = program.opts();
await withConnection(opts, async () => {
await api.mergePayees(cmdOpts.target, mergeIds);
printOutput({ success: true }, opts.format);
});
await withConnection(
opts,
async () => {
await api.mergePayees(cmdOpts.target, mergeIds);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
});
}

View File

@@ -301,27 +301,31 @@ export function registerQueryCommand(program: Command) {
.addHelpText('after', RUN_EXAMPLES)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const parsed = cmdOpts.file ? readJsonInput(cmdOpts) : undefined;
if (parsed !== undefined && !isRecord(parsed)) {
throw new Error('Query file must contain a JSON object');
}
const queryObj = parsed
? buildQueryFromFile(parsed, cmdOpts.table)
: buildQueryFromFlags(cmdOpts);
await withConnection(
opts,
async () => {
const parsed = cmdOpts.file ? readJsonInput(cmdOpts) : undefined;
if (parsed !== undefined && !isRecord(parsed)) {
throw new Error('Query file must contain a JSON object');
}
const queryObj = parsed
? buildQueryFromFile(parsed, cmdOpts.table)
: buildQueryFromFlags(cmdOpts);
const result = await api.aqlQuery(queryObj);
const result = await api.aqlQuery(queryObj);
if (!isRecord(result) || !('data' in result)) {
throw new Error('Query result missing data');
}
if (!isRecord(result) || !('data' in result)) {
throw new Error('Query result missing data');
}
if (cmdOpts.count) {
printOutput({ count: result.data }, opts.format);
} else {
printOutput(result.data, opts.format);
}
});
if (cmdOpts.count) {
printOutput({ count: result.data }, opts.format);
} else {
printOutput(result.data, opts.format);
}
},
{ mutates: false },
);
});
query

View File

@@ -15,10 +15,14 @@ export function registerRulesCommand(program: Command) {
.description('List all rules')
.action(async () => {
const opts = program.opts();
await withConnection(opts, async () => {
const result = await api.getRules();
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const result = await api.getRules();
printOutput(result, opts.format);
},
{ mutates: false },
);
});
rules
@@ -26,10 +30,14 @@ export function registerRulesCommand(program: Command) {
.description('List rules for a specific payee')
.action(async (payeeId: string) => {
const opts = program.opts();
await withConnection(opts, async () => {
const result = await api.getPayeeRules(payeeId);
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const result = await api.getPayeeRules(payeeId);
printOutput(result, opts.format);
},
{ mutates: false },
);
});
rules
@@ -39,13 +47,17 @@ export function registerRulesCommand(program: Command) {
.option('--file <path>', 'Read rule from JSON file (use - for stdin)')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const rule = readJsonInput(cmdOpts) as Parameters<
typeof api.createRule
>[0];
const id = await api.createRule(rule);
printOutput({ id }, opts.format);
});
await withConnection(
opts,
async () => {
const rule = readJsonInput(cmdOpts) as Parameters<
typeof api.createRule
>[0];
const id = await api.createRule(rule);
printOutput({ id }, opts.format);
},
{ mutates: true },
);
});
rules
@@ -55,13 +67,17 @@ export function registerRulesCommand(program: Command) {
.option('--file <path>', 'Read rule from JSON file (use - for stdin)')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const rule = readJsonInput(cmdOpts) as Parameters<
typeof api.updateRule
>[0];
await api.updateRule(rule);
printOutput({ success: true }, opts.format);
});
await withConnection(
opts,
async () => {
const rule = readJsonInput(cmdOpts) as Parameters<
typeof api.updateRule
>[0];
await api.updateRule(rule);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
});
rules
@@ -69,9 +85,13 @@ export function registerRulesCommand(program: Command) {
.description('Delete a rule')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.deleteRule(id);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.deleteRule(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
}

View File

@@ -15,10 +15,14 @@ export function registerSchedulesCommand(program: Command) {
.description('List all schedules')
.action(async () => {
const opts = program.opts();
await withConnection(opts, async () => {
const result = await api.getSchedules();
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const result = await api.getSchedules();
printOutput(result, opts.format);
},
{ mutates: false },
);
});
schedules
@@ -28,13 +32,17 @@ export function registerSchedulesCommand(program: Command) {
.option('--file <path>', 'Read schedule from JSON file (use - for stdin)')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const schedule = readJsonInput(cmdOpts) as Parameters<
typeof api.createSchedule
>[0];
const id = await api.createSchedule(schedule);
printOutput({ id }, opts.format);
});
await withConnection(
opts,
async () => {
const schedule = readJsonInput(cmdOpts) as Parameters<
typeof api.createSchedule
>[0];
const id = await api.createSchedule(schedule);
printOutput({ id }, opts.format);
},
{ mutates: true },
);
});
schedules
@@ -45,13 +53,17 @@ export function registerSchedulesCommand(program: Command) {
.option('--reset-next-date', 'Reset next occurrence date', false)
.action(async (id: string, cmdOpts) => {
const opts = program.opts();
await withConnection(opts, async () => {
const fields = readJsonInput(cmdOpts) as Parameters<
typeof api.updateSchedule
>[1];
await api.updateSchedule(id, fields, cmdOpts.resetNextDate);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
const fields = readJsonInput(cmdOpts) as Parameters<
typeof api.updateSchedule
>[1];
await api.updateSchedule(id, fields, cmdOpts.resetNextDate);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
schedules
@@ -59,9 +71,13 @@ export function registerSchedulesCommand(program: Command) {
.description('Delete a schedule')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.deleteSchedule(id);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.deleteSchedule(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
}

View File

@@ -19,7 +19,7 @@ export function registerServerCommand(program: Command) {
const version = await api.getServerVersion();
printOutput({ version }, opts.format);
},
{ loadBudget: false },
{ mutates: false, skipBudget: true },
);
});
@@ -34,13 +34,17 @@ export function registerServerCommand(program: Command) {
.requiredOption('--name <name>', 'Entity name')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const id = await api.getIDByName(cmdOpts.type, cmdOpts.name);
printOutput(
{ id, type: cmdOpts.type, name: cmdOpts.name },
opts.format,
);
});
await withConnection(
opts,
async () => {
const id = await api.getIDByName(cmdOpts.type, cmdOpts.name);
printOutput(
{ id, type: cmdOpts.type, name: cmdOpts.name },
opts.format,
);
},
{ mutates: false },
);
});
server
@@ -49,12 +53,16 @@ export function registerServerCommand(program: Command) {
.option('--account <id>', 'Specific account ID to sync')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const args = cmdOpts.account
? { accountId: cmdOpts.account }
: undefined;
await api.runBankSync(args);
printOutput({ success: true }, opts.format);
});
await withConnection(
opts,
async () => {
const args = cmdOpts.account
? { accountId: cmdOpts.account }
: undefined;
await api.runBankSync(args);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
});
}

View File

@@ -0,0 +1,124 @@
import { existsSync, mkdtempSync, rmSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import { Command } from 'commander';
import { CACHE_FILE_NAME, getMetaDir, writeCacheState } from '#cache';
import { resolveConfig } from '#config';
import { registerSyncCommand } from './sync';
vi.mock('@actual-app/api', () => ({
init: vi.fn().mockResolvedValue(undefined),
downloadBudget: vi.fn().mockResolvedValue(undefined),
loadBudget: vi.fn().mockResolvedValue(undefined),
sync: vi.fn().mockResolvedValue(undefined),
shutdown: vi.fn().mockResolvedValue(undefined),
getBudgets: vi
.fn()
.mockResolvedValue([{ id: 'bud-disk-1', groupId: 'sync-1' }]),
}));
vi.mock('#config', () => ({
resolveConfig: vi.fn(),
}));
let dataDir: string;
function metaDirFor(syncId: string) {
return getMetaDir(dataDir, syncId);
}
function program() {
const p = new Command();
p.exitOverride();
p.option('--sync-id <id>');
p.option('--data-dir <path>');
p.option('--format <fmt>');
p.option('--verbose');
registerSyncCommand(p);
return p;
}
describe('actual sync', () => {
let stdoutSpy: ReturnType<typeof vi.spyOn>;
beforeEach(() => {
vi.clearAllMocks();
dataDir = mkdtempSync(join(tmpdir(), 'actual-cli-sync-'));
vi.mocked(resolveConfig).mockResolvedValue({
serverUrl: 'http://test',
password: 'pw',
dataDir,
syncId: 'sync-1',
cacheTtl: 60,
lockTimeout: 10,
refresh: false,
noLock: true,
});
stdoutSpy = vi
.spyOn(process.stdout, 'write')
.mockImplementation(() => true);
});
afterEach(() => {
stdoutSpy.mockRestore();
rmSync(dataDir, { recursive: true, force: true });
});
it('runs a sync and prints the syncId', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: 0,
lastDownloadedAt: 0,
});
await program().parseAsync(['node', 'actual', 'sync']);
const out = stdoutSpy.mock.calls
.map((c: unknown[]) => String(c[0]))
.join('');
expect(out).toMatch(/"syncId":\s*"sync-1"/);
});
it('--status prints cache info without syncing', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now() - 5000,
lastDownloadedAt: Date.now() - 5000,
});
await program().parseAsync(['node', 'actual', 'sync', '--status']);
const out = stdoutSpy.mock.calls
.map((c: unknown[]) => String(c[0]))
.join('');
expect(out).toMatch(/"stale":\s*(true|false)/);
expect(out).toMatch(/"ageSeconds":\s*\d+/);
});
it('--status on no prior sync reports "never synced" and exits 0', async () => {
await program().parseAsync(['node', 'actual', 'sync', '--status']);
const out = stdoutSpy.mock.calls
.map((c: unknown[]) => String(c[0]))
.join('');
expect(out).toMatch(/"neverSynced":\s*true/);
});
it('--clear removes the cache file', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
expect(existsSync(join(metaDirFor('sync-1'), CACHE_FILE_NAME))).toBe(true);
await program().parseAsync(['node', 'actual', 'sync', '--clear']);
expect(existsSync(join(metaDirFor('sync-1'), CACHE_FILE_NAME))).toBe(false);
});
});


@@ -0,0 +1,118 @@
import { rmSync } from 'node:fs';
import { join } from 'node:path';
import type { Command } from 'commander';
import { CACHE_FILE_NAME, getMetaDir, readCacheState } from '#cache';
import type { CliConfig } from '#config';
import { resolveConfig } from '#config';
import { withConnection } from '#connection';
import { acquireExclusive } from '#lock';
import { printOutput } from '#output';
type SyncCmdOpts = {
status?: boolean;
clear?: boolean;
};
async function requireSyncIdAndMeta(
opts: Record<string, unknown>,
flag: string,
): Promise<{ config: CliConfig; meta: string }> {
const config = await resolveConfig(opts);
if (!config.syncId) {
throw new Error(
`Sync ID is required for sync ${flag}. Set --sync-id or ACTUAL_SYNC_ID.`,
);
}
return { config, meta: getMetaDir(config.dataDir, config.syncId) };
}
export function registerSyncCommand(program: Command) {
program
.command('sync')
.description(
'Sync the local cached budget with the server, print cache status, or clear the cache',
)
.option('--status', 'Print cache status without syncing', false)
.option(
'--clear',
'Delete the local cache; next command re-downloads',
false,
)
.action(async (cmdOpts: SyncCmdOpts) => {
const opts = program.opts();
if (cmdOpts.status) {
const { config, meta } = await requireSyncIdAndMeta(opts, '--status');
const state = readCacheState(meta);
if (state === null) {
printOutput(
{
neverSynced: true,
syncId: config.syncId,
ttlSeconds: config.cacheTtl,
},
opts.format,
);
return;
}
const rawAgeSeconds = Math.round(
(Date.now() - state.lastSyncedAt) / 1000,
);
const ageSeconds = Math.max(0, rawAgeSeconds);
printOutput(
{
neverSynced: false,
syncId: state.syncId,
budgetId: state.budgetId,
syncedAt: new Date(state.lastSyncedAt).toISOString(),
lastDownloadedAt: new Date(state.lastDownloadedAt).toISOString(),
ageSeconds,
ttlSeconds: config.cacheTtl,
stale: rawAgeSeconds < 0 || rawAgeSeconds > config.cacheTtl,
},
opts.format,
);
return;
}
if (cmdOpts.clear) {
const { config, meta } = await requireSyncIdAndMeta(opts, '--clear');
// Serialize with concurrent writers so we don't rm a half-written
// state.json that's about to be renamed into place.
const release = config.noLock
? null
: await acquireExclusive(meta, {
timeoutMs: config.lockTimeout * 1000,
});
try {
rmSync(join(meta, CACHE_FILE_NAME), { force: true });
} finally {
await release?.();
}
printOutput({ cleared: true, syncId: config.syncId }, opts.format);
return;
}
await withConnection(
opts,
async config => {
const state = config.syncId
? readCacheState(getMetaDir(config.dataDir, config.syncId))
: null;
printOutput(
{
syncedAt: new Date(
state?.lastSyncedAt ?? Date.now(),
).toISOString(),
syncId: config.syncId,
budgetId: state?.budgetId ?? config.syncId,
},
opts.format,
);
},
{ mutates: true },
);
});
}


@@ -12,10 +12,14 @@ export function registerTagsCommand(program: Command) {
.description('List all tags')
.action(async () => {
const opts = program.opts();
await withConnection(opts, async () => {
const result = await api.getTags();
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const result = await api.getTags();
printOutput(result, opts.format);
},
{ mutates: false },
);
});
tags
@@ -26,14 +30,18 @@ export function registerTagsCommand(program: Command) {
.option('--description <description>', 'Tag description')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const id = await api.createTag({
tag: cmdOpts.tag,
color: cmdOpts.color,
description: cmdOpts.description,
});
printOutput({ id }, opts.format);
});
await withConnection(
opts,
async () => {
const id = await api.createTag({
tag: cmdOpts.tag,
color: cmdOpts.color,
description: cmdOpts.description,
});
printOutput({ id }, opts.format);
},
{ mutates: true },
);
});
tags
@@ -55,10 +63,14 @@ export function registerTagsCommand(program: Command) {
);
}
const opts = program.opts();
await withConnection(opts, async () => {
await api.updateTag(id, fields);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.updateTag(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
tags
@@ -66,9 +78,13 @@ export function registerTagsCommand(program: Command) {
.description('Delete a tag')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.deleteTag(id);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.deleteTag(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
}


@@ -18,14 +18,18 @@ export function registerTransactionsCommand(program: Command) {
.requiredOption('--end <date>', 'End date (YYYY-MM-DD)')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const result = await api.getTransactions(
cmdOpts.account,
cmdOpts.start,
cmdOpts.end,
);
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const result = await api.getTransactions(
cmdOpts.account,
cmdOpts.start,
cmdOpts.end,
);
printOutput(result, opts.format);
},
{ mutates: false },
);
});
transactions
@@ -41,20 +45,24 @@ export function registerTransactionsCommand(program: Command) {
.option('--run-transfers', 'Process transfers', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const transactions = readJsonInput(cmdOpts) as Parameters<
typeof api.addTransactions
>[1];
const result = await api.addTransactions(
cmdOpts.account,
transactions,
{
learnCategories: cmdOpts.learnCategories,
runTransfers: cmdOpts.runTransfers,
},
);
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const transactions = readJsonInput(cmdOpts) as Parameters<
typeof api.addTransactions
>[1];
const result = await api.addTransactions(
cmdOpts.account,
transactions,
{
learnCategories: cmdOpts.learnCategories,
runTransfers: cmdOpts.runTransfers,
},
);
printOutput(result, opts.format);
},
{ mutates: true },
);
});
transactions
@@ -69,20 +77,24 @@ export function registerTransactionsCommand(program: Command) {
.option('--dry-run', 'Preview without importing', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(opts, async () => {
const transactions = readJsonInput(cmdOpts) as Parameters<
typeof api.importTransactions
>[1];
const result = await api.importTransactions(
cmdOpts.account,
transactions,
{
defaultCleared: true,
dryRun: cmdOpts.dryRun,
},
);
printOutput(result, opts.format);
});
await withConnection(
opts,
async () => {
const transactions = readJsonInput(cmdOpts) as Parameters<
typeof api.importTransactions
>[1];
const result = await api.importTransactions(
cmdOpts.account,
transactions,
{
defaultCleared: true,
dryRun: cmdOpts.dryRun,
},
);
printOutput(result, opts.format);
},
{ mutates: true },
);
});
transactions
@@ -92,13 +104,17 @@ export function registerTransactionsCommand(program: Command) {
.option('--file <path>', 'Read fields from JSON file (use - for stdin)')
.action(async (id: string, cmdOpts) => {
const opts = program.opts();
await withConnection(opts, async () => {
const fields = readJsonInput(cmdOpts) as Parameters<
typeof api.updateTransaction
>[1];
await api.updateTransaction(id, fields);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
const fields = readJsonInput(cmdOpts) as Parameters<
typeof api.updateTransaction
>[1];
await api.updateTransaction(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
transactions
@@ -106,9 +122,13 @@ export function registerTransactionsCommand(program: Command) {
.description('Delete a transaction')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.deleteTransaction(id);
printOutput({ success: true, id }, opts.format);
});
await withConnection(
opts,
async () => {
await api.deleteTransaction(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
});
}


@@ -28,6 +28,9 @@ describe('resolveConfig', () => {
'ACTUAL_SYNC_ID',
'ACTUAL_DATA_DIR',
'ACTUAL_ENCRYPTION_PASSWORD',
'ACTUAL_CACHE_TTL',
'ACTUAL_LOCK_TIMEOUT',
'ACTUAL_NO_LOCK',
];
beforeEach(() => {
@@ -159,6 +162,125 @@ describe('resolveConfig', () => {
});
});
describe('cache options', () => {
beforeEach(() => {
process.env.ACTUAL_SERVER_URL = 'http://test';
process.env.ACTUAL_PASSWORD = 'pw';
});
it('defaults cacheTtl to 60 seconds', async () => {
const config = await resolveConfig({});
expect(config.cacheTtl).toBe(60);
});
it('reads cacheTtl from env', async () => {
process.env.ACTUAL_CACHE_TTL = '300';
const config = await resolveConfig({});
expect(config.cacheTtl).toBe(300);
});
it('prefers cacheTtl from CLI flag', async () => {
process.env.ACTUAL_CACHE_TTL = '300';
const config = await resolveConfig({ cacheTtl: 10 });
expect(config.cacheTtl).toBe(10);
});
it('rejects negative cacheTtl', async () => {
await expect(resolveConfig({ cacheTtl: -1 })).rejects.toThrow(/cacheTtl/);
});
it('rejects non-integer cacheTtl from env', async () => {
process.env.ACTUAL_CACHE_TTL = 'banana';
await expect(resolveConfig({})).rejects.toThrow(/ACTUAL_CACHE_TTL/);
});
it('defaults lockTimeout to 10 seconds', async () => {
const config = await resolveConfig({});
expect(config.lockTimeout).toBe(10);
});
it('reads lockTimeout from env', async () => {
process.env.ACTUAL_LOCK_TIMEOUT = '30';
const config = await resolveConfig({});
expect(config.lockTimeout).toBe(30);
});
it('defaults refresh to false', async () => {
const config = await resolveConfig({});
expect(config.refresh).toBe(false);
});
it('sets refresh when provided on CLI opts', async () => {
const config = await resolveConfig({ refresh: true });
expect(config.refresh).toBe(true);
});
it('sets refresh when --no-cache is passed (cliOpts.cache === false)', async () => {
const config = await resolveConfig({ cache: false });
expect(config.refresh).toBe(true);
});
it('does not set refresh when cliOpts.cache is true (flag absent)', async () => {
const config = await resolveConfig({ cache: true });
expect(config.refresh).toBe(false);
});
it('defaults noLock to false', async () => {
const config = await resolveConfig({});
expect(config.noLock).toBe(false);
});
it('sets noLock when --no-lock is passed (cliOpts.lock === false)', async () => {
const config = await resolveConfig({ lock: false });
expect(config.noLock).toBe(true);
});
it('leaves noLock false when cliOpts.lock is true (flag absent)', async () => {
const config = await resolveConfig({ lock: true });
expect(config.noLock).toBe(false);
});
it('parses ACTUAL_NO_LOCK=1 as true', async () => {
process.env.ACTUAL_NO_LOCK = '1';
const config = await resolveConfig({});
expect(config.noLock).toBe(true);
});
it('parses ACTUAL_NO_LOCK=true as true', async () => {
process.env.ACTUAL_NO_LOCK = 'true';
const config = await resolveConfig({});
expect(config.noLock).toBe(true);
});
it('throws on an invalid ACTUAL_NO_LOCK value', async () => {
process.env.ACTUAL_NO_LOCK = 'yes';
await expect(resolveConfig({})).rejects.toThrow(/ACTUAL_NO_LOCK/);
});
it('reads cacheTtl/lockTimeout/noLock from config file', async () => {
mockConfigFile({
serverUrl: 'http://file',
password: 'pw',
cacheTtl: 120,
lockTimeout: 5,
noLock: true,
});
const config = await resolveConfig({});
expect(config.cacheTtl).toBe(120);
expect(config.lockTimeout).toBe(5);
expect(config.noLock).toBe(true);
});
it('rejects non-number cacheTtl in config file', async () => {
mockConfigFile({
serverUrl: 'http://file',
password: 'pw',
cacheTtl: 'soon',
});
await expect(resolveConfig({})).rejects.toThrow(/cacheTtl/);
});
});
describe('cosmiconfig handling', () => {
it('handles null result (no config file found)', async () => {
mockConfigFile(null);


@@ -3,7 +3,7 @@ import { join } from 'path';
import { cosmiconfig } from 'cosmiconfig';
import { isRecord } from './utils';
import { isRecord, parseBoolEnv, parseNonNegativeIntFlag } from './utils';
export type CliConfig = {
serverUrl: string;
@@ -12,6 +12,10 @@ export type CliConfig = {
syncId?: string;
dataDir: string;
encryptionPassword?: string;
cacheTtl: number;
lockTimeout: number;
refresh: boolean;
noLock: boolean;
};
export type CliGlobalOpts = {
@@ -21,10 +25,29 @@ export type CliGlobalOpts = {
syncId?: string;
dataDir?: string;
encryptionPassword?: string;
cacheTtl?: number;
lockTimeout?: number;
refresh?: boolean;
// Commander stores --no-foo flags under the positive key. Default true,
// false when the flag is passed.
cache?: boolean;
lock?: boolean;
format?: 'json' | 'table' | 'csv';
verbose?: boolean;
};
const stringKeys = [
'serverUrl',
'password',
'sessionToken',
'syncId',
'dataDir',
'encryptionPassword',
] as const;
const numberKeys = ['cacheTtl', 'lockTimeout'] as const;
const booleanKeys = ['noLock'] as const;
type ConfigFileContent = {
serverUrl?: string;
password?: string;
@@ -32,15 +55,15 @@ type ConfigFileContent = {
syncId?: string;
dataDir?: string;
encryptionPassword?: string;
cacheTtl?: number;
lockTimeout?: number;
noLock?: boolean;
};
const configFileKeys: readonly string[] = [
'serverUrl',
'password',
'sessionToken',
'syncId',
'dataDir',
'encryptionPassword',
...stringKeys,
...numberKeys,
...booleanKeys,
];
function validateConfigFileContent(value: unknown): ConfigFileContent {
@@ -54,9 +77,30 @@ function validateConfigFileContent(value: unknown): ConfigFileContent {
if (!configFileKeys.includes(key)) {
throw new Error(`Invalid config file: unknown key "${key}"`);
}
if (value[key] !== undefined && typeof value[key] !== 'string') {
const v = value[key];
if (v === undefined) continue;
if (
(stringKeys as readonly string[]).includes(key) &&
typeof v !== 'string'
) {
throw new Error(
`Invalid config file: key "${key}" must be a string, got ${typeof value[key]}`,
`Invalid config file: key "${key}" must be a string, got ${typeof v}`,
);
}
if (
(numberKeys as readonly string[]).includes(key) &&
(typeof v !== 'number' || !Number.isInteger(v) || v < 0)
) {
throw new Error(
`Invalid config file: key "${key}" must be a non-negative integer`,
);
}
if (
(booleanKeys as readonly string[]).includes(key) &&
typeof v !== 'boolean'
) {
throw new Error(
`Invalid config file: key "${key}" must be a boolean, got ${typeof v}`,
);
}
}
@@ -83,6 +127,22 @@ async function loadConfigFile(): Promise<ConfigFileContent> {
return {};
}
function parseNonNegativeIntEnv(
raw: string | undefined,
source: string,
): number | undefined {
return raw === undefined ? undefined : parseNonNegativeIntFlag(raw, source);
}
function validateNonNegativeInt(value: number, name: string): number {
if (!Number.isInteger(value) || value < 0) {
throw new Error(
`Invalid ${name}: expected a non-negative integer, got ${value}`,
);
}
return value;
}
export async function resolveConfig(
cliOpts: CliGlobalOpts,
): Promise<CliConfig> {
@@ -128,6 +188,37 @@ export async function resolveConfig(
);
}
const cacheTtl = validateNonNegativeInt(
cliOpts.cacheTtl ??
parseNonNegativeIntEnv(
process.env.ACTUAL_CACHE_TTL,
'ACTUAL_CACHE_TTL',
) ??
fileConfig.cacheTtl ??
60,
'cacheTtl',
);
const lockTimeout = validateNonNegativeInt(
cliOpts.lockTimeout ??
parseNonNegativeIntEnv(
process.env.ACTUAL_LOCK_TIMEOUT,
'ACTUAL_LOCK_TIMEOUT',
) ??
fileConfig.lockTimeout ??
10,
'lockTimeout',
);
const refresh = (cliOpts.refresh ?? false) || cliOpts.cache === false;
const flagNoLock = cliOpts.lock === false ? true : undefined;
const noLock =
flagNoLock ??
parseBoolEnv(process.env.ACTUAL_NO_LOCK, 'ACTUAL_NO_LOCK') ??
fileConfig.noLock ??
false;
return {
serverUrl,
password,
@@ -135,5 +226,9 @@ export async function resolveConfig(
syncId,
dataDir,
encryptionPassword,
cacheTtl,
lockTimeout,
refresh,
noLock,
};
}


@@ -1,24 +1,44 @@
import { mkdtempSync, rmSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import * as api from '@actual-app/api';
import { getMetaDir, writeCacheState } from './cache';
import { resolveConfig } from './config';
import { withConnection } from './connection';
vi.mock('@actual-app/api', () => ({
init: vi.fn().mockResolvedValue(undefined),
downloadBudget: vi.fn().mockResolvedValue(undefined),
loadBudget: vi.fn().mockResolvedValue(undefined),
sync: vi.fn().mockResolvedValue(undefined),
shutdown: vi.fn().mockResolvedValue(undefined),
getBudgets: vi
.fn()
.mockResolvedValue([{ id: 'bud-disk-1', groupId: 'sync-1' }]),
}));
vi.mock('./config', () => ({
resolveConfig: vi.fn(),
}));
let dataDir: string;
function metaDirFor(syncId: string) {
return getMetaDir(dataDir, syncId);
}
function setConfig(overrides: Record<string, unknown> = {}) {
vi.mocked(resolveConfig).mockResolvedValue({
serverUrl: 'http://test',
password: 'pw',
dataDir: '/tmp/data',
syncId: 'budget-1',
dataDir,
syncId: 'sync-1',
cacheTtl: 60,
lockTimeout: 10,
refresh: false,
noLock: true,
...overrides,
});
}
@@ -31,104 +51,182 @@ describe('withConnection', () => {
stderrSpy = vi
.spyOn(process.stderr, 'write')
.mockImplementation(() => true);
dataDir = mkdtempSync(join(tmpdir(), 'actual-cli-conn-'));
setConfig();
});
afterEach(() => {
stderrSpy.mockRestore();
rmSync(dataDir, { recursive: true, force: true });
});
it('calls api.init with password when no sessionToken', async () => {
setConfig({ password: 'pw', sessionToken: undefined });
await withConnection({}, async () => 'ok');
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.init).toHaveBeenCalledWith({
serverURL: 'http://test',
password: 'pw',
dataDir: '/tmp/data',
dataDir,
verbose: undefined,
});
});
it('calls api.init with sessionToken when present', async () => {
setConfig({ sessionToken: 'tok', password: undefined });
await withConnection({}, async () => 'ok');
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.init).toHaveBeenCalledWith({
serverURL: 'http://test',
sessionToken: 'tok',
dataDir: '/tmp/data',
dataDir,
verbose: undefined,
});
});
it('calls api.downloadBudget when syncId is set', async () => {
setConfig({ syncId: 'budget-1' });
await withConnection({}, async () => 'ok');
expect(api.downloadBudget).toHaveBeenCalledWith('budget-1', {
it('first run: calls downloadBudget and writes cache state', async () => {
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.downloadBudget).toHaveBeenCalledWith('sync-1', {
password: undefined,
});
expect(api.sync).not.toHaveBeenCalled();
});
it('throws when loadBudget is true but syncId is not set', async () => {
setConfig({ syncId: undefined });
await expect(withConnection({}, async () => 'ok')).rejects.toThrow(
'Sync ID is required',
);
});
it('skips budget download when loadBudget is false and syncId is not set', async () => {
setConfig({ syncId: undefined });
await withConnection({}, async () => 'ok', { loadBudget: false });
it('skips sync on a read inside the TTL', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.loadBudget).toHaveBeenCalledWith('bud-disk-1');
expect(api.sync).not.toHaveBeenCalled();
expect(api.downloadBudget).not.toHaveBeenCalled();
});
it('does not call api.downloadBudget when loadBudget is false', async () => {
setConfig({ syncId: 'budget-1' });
await withConnection({}, async () => 'ok', { loadBudget: false });
expect(api.downloadBudget).not.toHaveBeenCalled();
it('syncs on a read past the TTL', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now() - 10 * 60_000,
lastDownloadedAt: Date.now() - 10 * 60_000,
});
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.loadBudget).toHaveBeenCalled();
expect(api.sync).toHaveBeenCalledTimes(1);
});
it('returns callback result', async () => {
const result = await withConnection({}, async () => 42);
it('write command syncs before and after the callback, even when fresh', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
await withConnection({}, async () => 'ok', { mutates: true });
expect(api.loadBudget).toHaveBeenCalled();
expect(api.sync).toHaveBeenCalledTimes(2);
});
it('--refresh forces a sync on a read inside the TTL', async () => {
setConfig({ refresh: true });
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.sync).toHaveBeenCalledTimes(1);
});
it('encrypted budget forces a sync on a read inside the TTL', async () => {
setConfig({ encryptionPassword: 'secret' });
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.sync).toHaveBeenCalledTimes(1);
});
it('invalidates cache when syncId changes', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'OTHER',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.downloadBudget).toHaveBeenCalled();
});
it('skips budget work when skipBudget is true', async () => {
await withConnection({}, async () => 'ok', {
mutates: false,
skipBudget: true,
});
expect(api.downloadBudget).not.toHaveBeenCalled();
expect(api.loadBudget).not.toHaveBeenCalled();
expect(api.sync).not.toHaveBeenCalled();
});
it('throws when syncId is missing and skipBudget is false', async () => {
setConfig({ syncId: undefined });
await expect(
withConnection({}, async () => 'ok', { mutates: false }),
).rejects.toThrow('Sync ID is required');
});
it('returns the callback result', async () => {
const result = await withConnection({}, async () => 42, {
mutates: false,
});
expect(result).toBe(42);
});
it('calls api.shutdown in finally block on success', async () => {
await withConnection({}, async () => 'ok');
it('calls api.shutdown on success', async () => {
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.shutdown).toHaveBeenCalled();
});
it('calls api.shutdown in finally block on error', async () => {
it('calls api.shutdown on error', async () => {
await expect(
withConnection({}, async () => {
throw new Error('boom');
}),
withConnection(
{},
async () => {
throw new Error('boom');
},
{ mutates: false },
),
).rejects.toThrow('boom');
expect(api.shutdown).toHaveBeenCalled();
});
it('does not write to stderr by default', async () => {
await withConnection({}, async () => 'ok');
expect(stderrSpy).not.toHaveBeenCalled();
});
it('writes info to stderr when verbose', async () => {
await withConnection({ verbose: true }, async () => 'ok');
expect(stderrSpy).toHaveBeenCalledWith(
expect.stringContaining('Connecting to'),
);
it('propagates sync errors on a stale read', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now() - 10 * 60_000,
lastDownloadedAt: Date.now() - 10 * 60_000,
});
vi.mocked(api.sync).mockRejectedValueOnce(new Error('network'));
await expect(
withConnection({}, async () => 'ok', { mutates: false }),
).rejects.toThrow('network');
});
});


@@ -1,30 +1,49 @@
import { mkdirSync } from 'fs';
import * as api from '@actual-app/api';
import type { CacheState } from './cache';
import {
CACHE_VERSION,
decideSyncAction,
getMetaDir,
readCacheState,
writeCacheState,
} from './cache';
import type { CliConfig, CliGlobalOpts } from './config';
import { resolveConfig } from './config';
import type { CliGlobalOpts } from './config';
function info(message: string, verbose?: boolean) {
if (verbose) {
process.stderr.write(message + '\n');
}
}
import { acquireExclusive, acquireShared } from './lock';
import type { Release } from './lock';
type ConnectionOptions = {
loadBudget?: boolean;
mutates: boolean;
skipBudget?: boolean;
};
function info(message: string, verbose?: boolean) {
if (verbose) process.stderr.write(message + '\n');
}
async function resolveBudgetIdForSyncId(syncId: string): Promise<string> {
const budgets = await api.getBudgets();
const match = budgets.find(
b =>
typeof b.id === 'string' &&
(b.groupId === syncId || b.cloudFileId === syncId),
);
if (!match?.id) {
throw new Error(
`Could not resolve on-disk budget id for syncId ${syncId} after download.`,
);
}
return match.id;
}
export async function withConnection<T>(
globalOpts: CliGlobalOpts,
fn: () => Promise<T>,
options: ConnectionOptions = {},
fn: (config: CliConfig) => Promise<T>,
{ mutates, skipBudget = false }: ConnectionOptions,
): Promise<T> {
const { loadBudget = true } = options;
const config = await resolveConfig(globalOpts);
mkdirSync(config.dataDir, { recursive: true });
info(`Connecting to ${config.serverUrl}...`, globalOpts.verbose);
if (config.sessionToken) {
@@ -48,17 +67,87 @@ export async function withConnection<T>(
}
try {
if (loadBudget && config.syncId) {
info(`Downloading budget ${config.syncId}...`, globalOpts.verbose);
await api.downloadBudget(config.syncId, {
password: config.encryptionPassword,
});
} else if (loadBudget && !config.syncId) {
if (skipBudget) return await fn(config);
if (!config.syncId) {
throw new Error(
'Sync ID is required for this command. Set --sync-id or ACTUAL_SYNC_ID.',
);
}
return await fn();
const meta = getMetaDir(config.dataDir, config.syncId);
let release: Release | null = null;
if (!config.noLock) {
release = mutates
? await acquireExclusive(meta, {
timeoutMs: config.lockTimeout * 1000,
})
: await acquireShared(meta, {
timeoutMs: config.lockTimeout * 1000,
});
}
try {
const cachedState = readCacheState(meta);
const decision = decideSyncAction({
state: cachedState,
config: { syncId: config.syncId, serverUrl: config.serverUrl },
now: Date.now(),
ttlMs: config.cacheTtl * 1000,
mutates,
refresh: config.refresh,
encrypted: Boolean(config.encryptionPassword),
});
let state: CacheState;
if (decision.action === 'download') {
info(
cachedState === null
? `Downloading budget ${config.syncId} for the first time...`
: `Re-downloading budget ${config.syncId} (cache invalidated)...`,
globalOpts.verbose,
);
await api.downloadBudget(config.syncId, {
password: config.encryptionPassword,
});
const budgetId = await resolveBudgetIdForSyncId(config.syncId);
const now = Date.now();
state = {
version: CACHE_VERSION,
syncId: config.syncId,
budgetId,
serverUrl: config.serverUrl,
lastSyncedAt: now,
lastDownloadedAt: now,
};
writeCacheState(meta, state);
} else if (decision.action === 'skip') {
const age = Math.round(
(Date.now() - decision.state.lastSyncedAt) / 1000,
);
info(`Using cached budget (synced ${age}s ago)...`, globalOpts.verbose);
await api.loadBudget(decision.state.budgetId);
state = decision.state;
} else {
info(`Syncing budget ${config.syncId}...`, globalOpts.verbose);
await api.loadBudget(decision.state.budgetId);
await api.sync();
state = { ...decision.state, lastSyncedAt: Date.now() };
writeCacheState(meta, state);
}
const result = await fn(config);
if (mutates) {
info(`Pushing changes for ${config.syncId}...`, globalOpts.verbose);
await api.sync();
state = { ...state, lastSyncedAt: Date.now() };
writeCacheState(meta, state);
}
return result;
} finally {
if (release) await release();
}
} finally {
await api.shutdown();
}


@@ -9,8 +9,10 @@ import { registerQueryCommand } from './commands/query';
import { registerRulesCommand } from './commands/rules';
import { registerSchedulesCommand } from './commands/schedules';
import { registerServerCommand } from './commands/server';
import { registerSyncCommand } from './commands/sync';
import { registerTagsCommand } from './commands/tags';
import { registerTransactionsCommand } from './commands/transactions';
import { parseNonNegativeIntFlag } from './utils';
declare const __CLI_VERSION__: string;
@@ -32,6 +34,22 @@ program
'--encryption-password <password>',
'E2E encryption password (env: ACTUAL_ENCRYPTION_PASSWORD)',
)
.option(
'--cache-ttl <seconds>',
'Cache TTL in seconds (env: ACTUAL_CACHE_TTL; default: 60)',
value => parseNonNegativeIntFlag(value, '--cache-ttl'),
)
.option('--refresh', 'Force a sync on this call, ignoring the cache', false)
.option('--no-cache', 'Alias for --refresh')
.option(
'--lock-timeout <seconds>',
'How long to wait for another CLI process to release the lock (env: ACTUAL_LOCK_TIMEOUT; default: 10)',
value => parseNonNegativeIntFlag(value, '--lock-timeout'),
)
.option(
'--no-lock',
'Disable the budget directory lock (use with care, env: ACTUAL_NO_LOCK)',
)
.addOption(
new Option('--format <format>', 'Output format: json, table, csv')
.choices(['json', 'table', 'csv'] as const)
@@ -50,6 +68,7 @@ registerRulesCommand(program);
registerSchedulesCommand(program);
registerQueryCommand(program);
registerServerCommand(program);
registerSyncCommand(program);
function normalizeThrownMessage(err: unknown): string {
if (err instanceof Error) return err.message;


@@ -0,0 +1,159 @@
import {
existsSync,
mkdirSync,
mkdtempSync,
readdirSync,
rmSync,
writeFileSync,
} from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import { acquireExclusive, acquireShared } from './lock';
// In-memory stand-in for proper-lockfile. The real library spins up a
// setTimeout loop to refresh lockfile mtimes; on some CI filesystems that
// timer keeps Node's event loop alive even after tests complete, wedging the
// test run. The mock behaves identically from our wrapper's perspective
// (acquire, detect contention with ELOCKED, release) without touching the
// filesystem or scheduling timers.
const mockHeld = new Set<string>();
vi.mock('proper-lockfile', () => ({
default: {
lock: vi.fn(
async (
file: string,
opts?: { lockfilePath?: string },
): Promise<() => Promise<void>> => {
const key = opts?.lockfilePath ?? file;
if (mockHeld.has(key)) {
const err = new Error('Lock is already held') as Error & {
code?: string;
};
err.code = 'ELOCKED';
throw err;
}
mockHeld.add(key);
return async () => {
mockHeld.delete(key);
};
},
),
},
}));
describe('acquireExclusive', () => {
let dir: string;
beforeEach(() => {
mockHeld.clear();
dir = mkdtempSync(join(tmpdir(), 'actual-cli-lock-'));
});
afterEach(() => {
rmSync(dir, { recursive: true, force: true });
});
it('creates the directory if it does not exist', async () => {
const target = join(dir, 'nested', 'budget');
const release = await acquireExclusive(target, { timeoutMs: 1000 });
expect(existsSync(target)).toBe(true);
await release();
});
it('returns a release function that frees the lock', async () => {
const release1 = await acquireExclusive(dir, { timeoutMs: 1000 });
await release1();
const release2 = await acquireExclusive(dir, { timeoutMs: 1000 });
await release2();
});
it('rejects with a user-friendly error when another holder has the lock', async () => {
const release = await acquireExclusive(dir, { timeoutMs: 1000 });
await expect(acquireExclusive(dir, { timeoutMs: 100 })).rejects.toThrow(
/holding the budget/,
);
await release();
});
});
describe('acquireShared', () => {
let dir: string;
beforeEach(() => {
mockHeld.clear();
dir = mkdtempSync(join(tmpdir(), 'actual-cli-lock-'));
});
afterEach(() => {
rmSync(dir, { recursive: true, force: true });
});
it('allows multiple concurrent shared holders', async () => {
const r1 = await acquireShared(dir, { timeoutMs: 1000 });
const r2 = await acquireShared(dir, { timeoutMs: 1000 });
const readers = readdirSync(join(dir, 'readers'));
expect(readers).toHaveLength(2);
await r1();
await r2();
});
it('removes the reader marker on release', async () => {
const release = await acquireShared(dir, { timeoutMs: 1000 });
await release();
const readers = readdirSync(join(dir, 'readers'));
expect(readers).toHaveLength(0);
});
it('rejects when an exclusive lock is held', async () => {
const releaseExclusive = await acquireExclusive(dir, { timeoutMs: 1000 });
await expect(acquireShared(dir, { timeoutMs: 100 })).rejects.toThrow(
/holding the budget/,
);
await releaseExclusive();
});
it('sweeps stale reader markers whose PIDs no longer exist', async () => {
const readersDir = join(dir, 'readers');
mkdirSync(readersDir, { recursive: true });
writeFileSync(join(readersDir, '-1-abc'), '');
const release = await acquireExclusive(dir, { timeoutMs: 1000 });
expect(readdirSync(readersDir)).toHaveLength(0);
await release();
});
});
describe('writer-reader interaction', () => {
let dir: string;
beforeEach(() => {
mockHeld.clear();
dir = mkdtempSync(join(tmpdir(), 'actual-cli-lock-'));
});
afterEach(() => {
rmSync(dir, { recursive: true, force: true });
});
it('exclusive waits for active shared holders to release', async () => {
const readerRelease = await acquireShared(dir, { timeoutMs: 500 });
let writerAcquired = false;
const writerPromise = acquireExclusive(dir, { timeoutMs: 1000 }).then(
release => {
writerAcquired = true;
return release;
},
);
await new Promise(resolve => setTimeout(resolve, 150));
expect(writerAcquired).toBe(false);
await readerRelease();
const writerRelease = await writerPromise;
expect(writerAcquired).toBe(true);
await writerRelease();
});
});
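The `vi.mock('proper-lockfile', …)` above replaces the real lockfile library with an in-memory `Set` so tests never touch timers or the filesystem. The essence of that pattern can be sketched standalone (the names here are illustrative, not part of the real package):

```typescript
// In-memory mutual exclusion in the style of the test mock above:
// a Set of held keys, an ELOCKED error on contention, and a release
// closure that frees the key.
const held = new Set<string>();

async function lock(key: string): Promise<() => Promise<void>> {
  if (held.has(key)) {
    const err = new Error('Lock is already held') as Error & { code?: string };
    err.code = 'ELOCKED'; // same code proper-lockfile uses on contention
    throw err;
  }
  held.add(key);
  return async () => {
    held.delete(key);
  };
}

async function demo() {
  const release = await lock('budget');
  let code: string | undefined;
  try {
    await lock('budget'); // second acquire while held
  } catch (e) {
    code = (e as { code?: string }).code;
  }
  console.log(code); // ELOCKED
  await release();
  const again = await lock('budget'); // succeeds after release
  await again();
  console.log('ok');
}
demo();
```

Because the mock only needs to be indistinguishable through the wrapper's three observable behaviours (acquire, ELOCKED on contention, release), this handful of lines is enough to stand in for the library.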

packages/cli/src/lock.ts Normal file

@@ -0,0 +1,149 @@
import { randomBytes } from 'node:crypto';
import { mkdirSync, readdirSync, rmSync, writeFileSync } from 'node:fs';
import { join } from 'node:path';
import lockfile from 'proper-lockfile';
export type Release = () => Promise<void>;
export type AcquireOptions = {
timeoutMs: number;
};
const LOCKFILE_NAME = 'lock';
const READERS_DIR_NAME = 'readers';
const READER_POLL_INTERVAL_MS = 100;
function lockfilePath(dir: string): string {
return join(dir, LOCKFILE_NAME);
}
function readersDir(dir: string): string {
return join(dir, READERS_DIR_NAME);
}
function ensureDir(dir: string) {
mkdirSync(dir, { recursive: true });
}
function retriesForTimeout(timeoutMs: number) {
return {
retries: Math.max(1, Math.floor(timeoutMs / 200)),
minTimeout: 100,
maxTimeout: 500,
factor: 1.5,
};
}
function errorCode(err: unknown): string | undefined {
if (err instanceof Error && 'code' in err) {
const { code } = err as { code?: unknown };
if (typeof code === 'string') return code;
}
return undefined;
}
function isLockedError(err: unknown): boolean {
return errorCode(err) === 'ELOCKED';
}
function lockedMessage(timeoutMs: number): string {
return `Another CLI process is holding the budget (waited ${Math.round(
timeoutMs / 1000,
)}s). Retry, or use a different --data-dir.`;
}
function pidIsAlive(pid: number): boolean {
if (pid <= 0) return false;
try {
process.kill(pid, 0);
return true;
} catch (err) {
return errorCode(err) === 'EPERM';
}
}
function readReaderNames(readers: string): string[] {
try {
return readdirSync(readers);
} catch (err) {
if (errorCode(err) === 'ENOENT') return [];
throw err;
}
}
function sweepStaleReaders(dir: string) {
const readers = readersDir(dir);
for (const name of readReaderNames(readers)) {
const pid = Number(name.split('-')[0]);
if (!Number.isFinite(pid) || !pidIsAlive(pid)) {
rmSync(join(readers, name), { force: true });
}
}
}
async function waitForReadersEmpty(dir: string, timeoutMs: number) {
const readers = readersDir(dir);
const deadline = Date.now() + timeoutMs;
while (Date.now() < deadline) {
sweepStaleReaders(dir);
if (readReaderNames(readers).length === 0) return;
await new Promise(resolve => setTimeout(resolve, READER_POLL_INTERVAL_MS));
}
throw new Error(lockedMessage(timeoutMs));
}
async function acquireGate(
dir: string,
timeoutMs: number,
): Promise<() => Promise<void>> {
ensureDir(dir);
try {
return await lockfile.lock(dir, {
lockfilePath: lockfilePath(dir),
retries: retriesForTimeout(timeoutMs),
stale: 30_000,
});
} catch (err) {
if (isLockedError(err)) throw new Error(lockedMessage(timeoutMs));
throw err;
}
}
export async function acquireExclusive(
dir: string,
{ timeoutMs }: AcquireOptions,
): Promise<Release> {
const start = Date.now();
const release = await acquireGate(dir, timeoutMs);
try {
const remaining = Math.max(0, timeoutMs - (Date.now() - start));
await waitForReadersEmpty(dir, remaining);
} catch (err) {
await release();
throw err;
}
return () => release();
}
export async function acquireShared(
dir: string,
{ timeoutMs }: AcquireOptions,
): Promise<Release> {
const gate = await acquireGate(dir, timeoutMs);
let markerPath: string;
try {
const readers = readersDir(dir);
ensureDir(readers);
const markerName = `${process.pid}-${randomBytes(6).toString('hex')}`;
markerPath = join(readers, markerName);
writeFileSync(markerPath, '');
} catch (err) {
await gate();
throw err;
}
await gate();
return async () => {
rmSync(markerPath, { force: true });
};
}
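The stale-reader sweep in `lock.ts` hinges on `process.kill(pid, 0)`: signal 0 delivers nothing but reports whether the PID exists (with `EPERM` meaning "exists, but owned by someone else"). A self-contained sketch of that scheme, assuming the `<pid>-<random>` marker naming used above:

```typescript
import { mkdirSync, readdirSync, rmSync, writeFileSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

// Signal 0 is an existence probe: it sends nothing, but throws ESRCH
// if the PID is gone. EPERM means the process exists under another user.
function pidIsAlive(pid: number): boolean {
  if (pid <= 0) return false;
  try {
    process.kill(pid, 0);
    return true;
  } catch (err) {
    return (err as { code?: string }).code === 'EPERM';
  }
}

// Sweep markers whose leading-PID component no longer names a live
// process; return the markers that survive.
function sweepStale(readersDir: string): string[] {
  const kept: string[] = [];
  for (const name of readdirSync(readersDir)) {
    const pid = Number(name.split('-')[0]);
    if (!Number.isFinite(pid) || !pidIsAlive(pid)) {
      rmSync(join(readersDir, name), { force: true });
    } else {
      kept.push(name);
    }
  }
  return kept;
}

// Demo: one marker with an impossible PID (stale), one with our own.
const dir = join(tmpdir(), `readers-demo-${process.pid}`);
mkdirSync(dir, { recursive: true });
writeFileSync(join(dir, '-1-dead'), '');
writeFileSync(join(dir, `${process.pid}-live`), '');
console.log(sweepStale(dir).length); // 1
rmSync(dir, { recursive: true, force: true });
```

This is why a crashed reader cannot wedge a later writer: the writer's `waitForReadersEmpty` poll sweeps dead PIDs on every iteration before counting the remaining markers.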


@@ -18,3 +18,29 @@ export function parseIntFlag(value: string, flagName: string): number {
}
return parsed;
}
export function parseNonNegativeIntFlag(
value: string,
flagName: string,
): number {
const parsed = parseIntFlag(value, flagName);
if (parsed < 0) {
throw new Error(
`Invalid ${flagName}: "${value}". Expected a non-negative integer.`,
);
}
return parsed;
}
export function parseBoolEnv(
raw: string | undefined,
source: string,
): boolean | undefined {
if (raw === undefined) return undefined;
const lower = raw.toLowerCase();
if (raw === '1' || lower === 'true') return true;
if (raw === '0' || lower === 'false') return false;
throw new Error(
`Invalid ${source}: "${raw}". Expected "true", "false", "1", or "0".`,
);
}
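`parseBoolEnv` lets boolean env fallbacks be strict rather than treating any non-empty string as truthy. A standalone copy for illustration (mirroring the `utils.ts` addition above; the `ACTUAL_NO_LOCK` wiring shown is the env var named in the `--no-lock` help text):

```typescript
// Standalone copy of parseBoolEnv: accepts "true"/"false" in any
// case plus the literal "1"/"0"; an unset var means "no opinion";
// anything else is rejected loudly rather than coerced.
function parseBoolEnv(
  raw: string | undefined,
  source: string,
): boolean | undefined {
  if (raw === undefined) return undefined;
  const lower = raw.toLowerCase();
  if (raw === '1' || lower === 'true') return true;
  if (raw === '0' || lower === 'false') return false;
  throw new Error(
    `Invalid ${source}: "${raw}". Expected "true", "false", "1", or "0".`,
  );
}

console.log(parseBoolEnv('TRUE', 'ACTUAL_NO_LOCK')); // true
console.log(parseBoolEnv(undefined, 'ACTUAL_NO_LOCK')); // undefined
```

Returning `undefined` for an unset variable (instead of `false`) is what lets a CLI flag's explicit value win over the env fallback without ambiguity.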


@@ -32,5 +32,8 @@ export default defineConfig({
plugins: [visualizer({ template: 'raw-data', filename: 'dist/stats.json' })],
test: {
globals: true,
include: ['src/**/*.test.ts'],
exclude: ['**/node_modules/**', '**/dist/**'],
testTimeout: 10_000,
},
});


@@ -4,8 +4,11 @@ import type { Preview } from '@storybook/react-vite';
// Not ideal to import from desktop-client, but we need a source of truth for theme variables
// TODO: this needs refactoring
// oxlint-disable-next-line actual/enforce-boundaries
import * as darkTheme from '../../desktop-client/src/style/themes/dark';
// oxlint-disable-next-line actual/enforce-boundaries
import * as lightTheme from '../../desktop-client/src/style/themes/light';
// oxlint-disable-next-line actual/enforce-boundaries
import * as midnightTheme from '../../desktop-client/src/style/themes/midnight';
const THEMES = {


@@ -58,7 +58,7 @@
"@svgr/babel-plugin-add-jsx-attribute": "^8.0.0",
"@svgr/cli": "^8.1.0",
"@types/react": "^19.2.14",
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"@typescript/native-preview": "beta",
"@vitejs/plugin-react": "^6.0.1",
"eslint-plugin-storybook": "^10.3.4",
"react": "19.2.4",


@@ -1,4 +1,5 @@
#!/bin/bash
set -euo pipefail
cd "$(dirname "$(dirname "$0")")"
@@ -7,20 +8,10 @@ if ! [ -x "$(command -v protoc)" ]; then
exit 1
fi
export PATH="$PWD/bin:$PATH"
protoc --plugin="protoc-gen-ts=../../node_modules/.bin/protoc-gen-ts" \
--ts_opt=esModuleInterop=true \
--ts_out="src/proto" \
--js_out=import_style=commonjs,binary:src/proto \
protoc --plugin="protoc-gen-es=../../node_modules/.bin/protoc-gen-es" \
--es_opt=target=ts \
--es_out="src/proto" \
--proto_path=src/proto \
sync.proto
../../node_modules/.bin/oxfmt src/proto/*.d.ts
for file in src/proto/*.d.ts; do
{ echo "/* oxlint-disable typescript/no-namespace */"; sed 's/export class/export declare class/g' "$file"; } > "${file%.d.ts}.ts"
rm "$file"
done
echo 'One more step! Find the `var global = ...` declaration in src/proto/sync_pb.js and change it to `var global = globalThis;`'
../../node_modules/.bin/oxfmt src/proto/*.ts


@@ -4,16 +4,17 @@
"description": "CRDT layer of Actual",
"license": "MIT",
"files": [
"dist"
"dist",
"!dist/**/*.test.d.ts",
"!dist/**/*.test.d.ts.map",
"!dist/**/*.spec.d.ts",
"!dist/**/*.spec.d.ts.map"
],
"main": "dist/index.js",
"types": "dist/index.d.ts",
"type": "module",
"main": "src/index.ts",
"types": "src/index.ts",
"exports": {
".": {
"types": "./dist/index.d.ts",
"development": "./src/index.ts",
"default": "./dist/index.js"
}
".": "./src/index.ts"
},
"publishConfig": {
"exports": {
@@ -21,25 +22,26 @@
"types": "./dist/index.d.ts",
"default": "./dist/index.js"
}
}
},
"main": "dist/index.js",
"types": "dist/index.d.ts"
},
"scripts": {
"build:node": "vite build",
"proto:generate": "./bin/generate-proto",
"build": "yarn run build:node && tsgo -p tsconfig.build.json --emitDeclarationOnly",
"build": "yarn run build:node && tsgo -b",
"test": "vitest --run",
"typecheck": "tsgo -b"
},
"dependencies": {
"google-protobuf": "^3.21.4",
"murmurhash": "^2.0.1"
"@bufbuild/protobuf": "^2.11.0",
"murmurhash": "^2.0.1",
"uuid": "^14.0.0"
},
"devDependencies": {
"@types/google-protobuf": "3.15.12",
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"protoc-gen-js": "3.21.4-4",
"@bufbuild/protoc-gen-es": "^2.11.0",
"@typescript/native-preview": "beta",
"rollup-plugin-visualizer": "^7.0.1",
"ts-protoc-gen": "0.15.0",
"vite": "^8.0.5",
"vitest": "^4.1.2"
}

Some files were not shown because too many files have changed in this diff.