Compare commits


25 Commits

Author SHA1 Message Date
github-actions[bot]
95a99200d0 [AI] desktop-client: drop unused absurd-sql dependency
desktop-client no longer imports absurd-sql directly — that plumbing
moved into loot-core's backend-worker module as part of the browser
worker consolidation. The dep was left over; removing it.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 17:00:06 +01:00
github-actions[bot]
3ec327ccc9 Release notes 2026-04-18 16:58:33 +01:00
github-actions[bot]
9ea4021516 [AI] fix CI: align jsdom version + add absurd-sql type shim
- `yarn constraints` flagged jsdom ^29.0.2 in api vs ^27.4.0 in
  desktop-client. Align to ^27.4.0 — api's browser-facade test only
  uses a minimal jsdom env, and both versions satisfy its needs.
- `yarn typecheck` under tsc-strict needed a declaration for
  absurd-sql/dist/indexeddb-main-thread; added a one-line .d.ts-style
  shim under packages/loot-core/typings/, matching the existing
  pattern used for vite-plugin-peggy-loader.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 16:52:25 +01:00
github-actions[bot]
4f013dc3ed [AI] chore: drop placeholder release-notes file
Will be recreated under the PR number once the PR is opened.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 16:44:53 +01:00
github-actions[bot]
88e148c168 [AI] chore: drop docs/superpowers/ planning artifacts
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 16:42:32 +01:00
github-actions[bot]
afce78503a Merge remote-tracking branch 'origin/master' into worktree-browser-api 2026-04-18 16:38:33 +01:00
github-actions[bot]
89891a8151 [AI] api: ship JS migrations with .data suffix so Vite ignores them
loot-core's JS migrations use `#`-subpath imports that only resolve
inside loot-core's package boundary. Once those files live in
`node_modules/@actual-app/api/dist/data/migrations/`, Vite's dev-server
import-analysis tries to resolve them and errors. The consumer workaround
was bespoke middleware in their vite.config.ts — a leaky abstraction
for a package that should just work on import.

Fix it inside the api package:

  - Build-time rename: copyMigrationsAndDefaultDb now writes each .js
    migration under dist/data/migrations/ with an extra `.data` suffix
    and records the suffix in dist/data-file-index.txt. dist/migrations/
    (flat, used by Node consumers) stays untouched.
  - Runtime fetch wrap: browser-worker.ts installs a small pre-hook at
    module load that rewrites URLs to match — .js → .js.data on the
    request side, strips the suffix from data-file-index.txt responses —
    so loot-core's migration runner still sees files at /migrations/foo.js
    in the virtual FS.

Consumer-side vite.config.ts is now just COOP/COEP + optimizeDeps.exclude;
no dev-server plumbing needed. Verified end-to-end via the playground:
init → download → 22 accounts → 2 transactions → done, with zero config
hacks in the consumer.
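The request-side rewrite and index cleanup described above can be sketched as two pure functions. This is an illustrative sketch, not the actual browser-worker.ts hook — the function names and the `/migrations/` prefix check are assumptions:

```typescript
const DATA_SUFFIX = '.data';

// Request side: .js → .js.data, so the dev server fetches the renamed
// asset and Vite's import-analysis never sees a bare .js migration.
export function rewriteMigrationUrl(url: string): string {
  if (url.includes('/migrations/') && url.endsWith('.js')) {
    return url + DATA_SUFFIX;
  }
  return url;
}

// Response side: strip the suffix from data-file-index.txt entries so
// loot-core's migration runner still sees /migrations/foo.js in the
// virtual FS.
export function stripDataSuffix(indexText: string): string {
  return indexText
    .split('\n')
    .map(line =>
      line.endsWith('.js' + DATA_SUFFIX)
        ? line.slice(0, -DATA_SUFFIX.length)
        : line,
    )
    .join('\n');
}
```

The pre-hook wraps `fetch` at module load, applying `rewriteMigrationUrl` to outgoing requests and `stripDataSuffix` to the index response.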

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 16:34:42 +01:00
github-actions[bot]
0c5fc1b38c [AI] api: typecheck no longer clobbers Vite build output
tsgo -b (used by both project-reference builds and the typecheck
script) emits .js per source file into outDir. Since the api's outDir
is dist/ and tsconfig had noEmit: false + declaration: true, every
`yarn typecheck` overwrote the Vite-built browser.js / worker.js with
per-file TS compilations, breaking downstream consumers until the next
Vite rebuild.

Adding emitDeclarationOnly: true to tsconfig keeps the composite /
declaration wiring intact (required for project references) but
suppresses JS emission. build:node still passes --emitDeclarationOnly
on the CLI so the intent is explicit there too.
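The tsconfig change amounts to one added flag. A minimal illustration (assumed shape — only `emitDeclarationOnly` is new; the other options are as the commit describes them):

```jsonc
{
  "compilerOptions": {
    "composite": true,
    "declaration": true,
    "noEmit": false,
    // New: keep .d.ts output for project references, but stop
    // tsc/tsgo from writing per-file .js over Vite's dist/ output.
    "emitDeclarationOnly": true,
    "outDir": "dist"
  }
}
```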

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 16:19:53 +01:00
github-actions[bot]
35f84b3f7f [AI] api: serve runtime assets from the api package itself
Consumers no longer copy default-db.sqlite, migrations, sql-wasm.wasm,
or data-file-index.txt into their static assets directory. The api's
dist/ now contains everything loot-core's browser fs asks for — the
existing files plus a new data-file-index.txt manifest and a data/
mirror directory (hard-linked to avoid duplicating bytes).

At init time the main-thread facade derives the directory portion of
its own bundle URL (via string manipulation to dodge Vite's asset
plugin) and hands it to the worker as __assetsBaseUrl. The worker
sets process.env.PUBLIC_URL to that URL before calling loot-core's
init(config), so populateDefaultFilesystem and sql.js locateFile all
resolve against @actual-app/api/dist/ wherever the consumer's bundler
placed it.

Playground shrinks accordingly: no more public/ directory,
copy-assets.sh script, or predev hook. `yarn dev` now does just
`vite` — matching the zero-setup `api.init()` story.
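The base-URL derivation the facade performs can be sketched as plain string slicing. The function name is illustrative; the point is that avoiding `new URL('./x', import.meta.url)` keeps Vite's asset plugin from intercepting the expression:

```typescript
// Derive the directory portion of the bundle's own URL, e.g. from
// import.meta.url. Plain string manipulation instead of the URL
// constructor, to dodge bundler static-asset analysis.
export function deriveAssetsBaseUrl(bundleUrl: string): string {
  return bundleUrl.slice(0, bundleUrl.lastIndexOf('/') + 1);
}
```

The result is handed to the worker as `__assetsBaseUrl` and assigned to `process.env.PUBLIC_URL` before loot-core's `init(config)` runs.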

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 16:16:23 +01:00
github-actions[bot]
45a733f2ac [AI] chore: gitignore .playwright-cli/ dev snapshots
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 16:02:26 +01:00
github-actions[bot]
59e7f858a7 [AI] core: move browser-preload multi-tab coordinator into loot-core
Extracts the WorkerBridge class + the Worker/SharedWorker transport
selector out of packages/desktop-client/src/browser-preload.js and into
a new loot-core module at packages/loot-core/src/platform/client/
browser-preload/. desktop-client's preload shrinks to a thin shell that
only wires its PWA service worker, package.json version, SharedWorker
factory, and the global.Actual shim — everything else is a one-line
startBrowserBackend({ ... }) call into loot-core.

The api package still uses the lighter-weight createBackendWorker entry
in the sibling loot-core module; both packages now consume the worker-
bootstrap primitives from loot-core rather than duplicating them.

Verified end-to-end in the browser via playwright — the api playground
still loads, downloads the budget, and renders accounts+transactions
identically.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 16:02:07 +01:00
github-actions[bot]
d5a75a831a [AI] api: spawn worker internally; share bootstrap with desktop-client
Removes the last piece of bespoke browser wiring in @actual-app/api so
consumers call api.init({...}) with no worker construction of their own.
Under the hood, the api package now:

  - Spawns its Web Worker itself via `new Worker(new URL('./worker.js',
    import.meta.url), { type: 'module' })`. Consumer bundlers resolve the
    sibling asset at their own build time.
  - Speaks loot-core's existing {id, name, args} / {type:'reply', id,
    result} backend protocol, including the {type:'connect'} handshake
    — same protocol desktop-client's browser-preload.js already feeds.
  - Delegates sqlite bootstrap to loot-core's public init(config) via a
    worker-registered `api-browser/init` handler; server-side dispatch is
    handled by the existing packages/loot-core/src/platform/server/
    connection layer, so no custom {op, payload} shape remains.

The absurd-sql main-thread plumbing (initSQLBackend + __absurd:* filter)
is now a single function in loot-core:
`packages/loot-core/src/platform/client/backend-worker/createBackendWorker`,
consumed by both desktop-client's browser-preload.js and the api's
browser rpc.ts.

Test split moves accordingly: browser-facade.test.ts swaps in a
Worker mock that speaks the new protocol (id, name, args / reply handshake)
and confirms init forwards config via api-browser/init.
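The `{id, name, args}` / `{type:'reply', id, result}` protocol named above correlates requests and replies by id. A hedged sketch (field names come from the commit message; the correlation bookkeeping is illustrative, not loot-core's actual code):

```typescript
type BackendRequest = { id: number; name: string; args: unknown };
type BackendReply = { type: 'reply'; id: number; result?: unknown; error?: unknown };

let nextId = 1;
const pending = new Map<number, (r: BackendReply) => void>();

// Build a request envelope with a fresh correlation id.
export function makeRequest(name: string, args: unknown): BackendRequest {
  return { id: nextId++, name, args };
}

// Post a request and resolve when the matching reply arrives.
export function send(
  post: (r: BackendRequest) => void,
  name: string,
  args: unknown,
): Promise<BackendReply> {
  const req = makeRequest(name, args);
  return new Promise(resolve => {
    pending.set(req.id, resolve);
    post(req);
  });
}

// Worker message handler side: dispatch a reply to its waiting caller.
export function resolveReply(reply: BackendReply): boolean {
  const cb = pending.get(reply.id);
  if (!cb) return false;
  pending.delete(reply.id);
  cb(reply);
  return true;
}
```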

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 09:42:02 +01:00
github-actions[bot]
35d208a978 [AI] api: split browser build into main-thread facade + worker
absurd-sql uses Atomics.wait for sync sqlite access, which only works
inside a Web Worker. Rather than forcing every consumer to wire up
their own worker + RPC glue, ship two artifacts:

  - dist/browser.js: tiny main-thread facade (~10 KB). Reuses
    packages/api/methods.ts verbatim by aliasing
    @actual-app/core/server/main to browser/lib-stub.ts at build time;
    every lib.send call posts to the worker.
  - dist/worker.js: the full loot-core + sql.js + absurd-sql stack
    (~3.6 MB) running in a Web Worker.

Consumer wiring:

    const worker = new Worker(
      new URL('@actual-app/api/dist/worker.js', import.meta.url),
      { type: 'module' },
    );
    await api.init({ worker, dataDir: '/documents', serverURL, password });
    await api.getAccounts();

Same named imports as Node/Electron — the worker is the only
browser-specific wiring. Keeping the URL construction in consumer
code lets their bundler (Vite, Webpack, ...) handle worker.js as an
asset without forcing us onto a single bundler convention.

Tests split accordingly: Node runs the full CRUD roundtrip against
real loot-core; jsdom runs a facade test that verifies init
validation, postMessage payload shapes, and error propagation via
a mock Worker.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 22:53:50 +01:00
github-actions[bot]
b9b3c7ecf5 [AI] api: bundle required runtime assets into dist/
Adds the Node-side build plugin to copy @jlongster/sql.js' sql-wasm.wasm
into dist/ alongside default-db.sqlite and migrations/. Browser
consumers can now point a static handler at dist/ without reaching
into nested node_modules of a transitive dep.

Also introduces vite-plugin-node-polyfills for the browser build
(process / Buffer / stream / path / crypto / zlib / fs / assert),
with process.env.* values substituted at build time. Splits the
browser vitest config out of vite.browser.config.mts so node polyfills
don't shadow real Node fs in test setup.
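The polyfill wiring described above roughly corresponds to a Vite config like the following. This is an assumed shape for illustration, not the repo's actual vite.browser.config.mts:

```typescript
import { defineConfig } from 'vite';
import { nodePolyfills } from 'vite-plugin-node-polyfills';

export default defineConfig({
  plugins: [
    // Shims process, Buffer, stream, path, crypto, zlib, fs, assert
    // so loot-core's Node-flavored code runs in the browser bundle.
    nodePolyfills(),
  ],
  define: {
    // Substituted at build time; the bundle never reads a real process.env.
    'process.env.NODE_ENV': JSON.stringify('production'),
  },
});
```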

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 22:15:14 +01:00
github-actions[bot]
f95a881d24 [AI] api: add release notes entry for browser build
Filename uses TBD- placeholder until the PR number is assigned.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 22:03:54 +01:00
github-actions[bot]
9a71b66929 [AI] api: lint autofixes for browser test setup
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 22:03:27 +01:00
github-actions[bot]
e161eefc02 [AI] api: run integration smoke test under browser jsdom
Adds setup.browser.ts with fake-indexeddb, a fetch polyfill that points
sql.js WASM and loot-core's data-file-index fetches at on-disk files,
and wires the browser Vite config to use jsdom. The shared integration
spec now gates the full CRUD roundtrip behind __API_FULL_SUITE__ (set
only in Node) because absurd-sql's worker + SharedArrayBuffer
requirement is not met under jsdom; the browser smoke test verifies
that init returns a usable handle. Full-flow browser coverage moves
to the playground app in the next phase.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 22:02:41 +01:00
github-actions[bot]
bb7e0b63bc [AI] api: exclude moved test files from typecheck
The root *.test.ts glob no longer matches once the tests live in test/.
Widens the pattern to **/*.test.ts and adds test/setup.*.ts so the
integration setup keeps the same latitude the existing tests had with
global state.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:57:30 +01:00
github-actions[bot]
579e50f727 [AI] api: add shared integration test for Node environment
Adds a small end-to-end integration test that bootstraps a budget via
the internal create-budget handler, writes an account and a couple of
transactions through the public API, then reads them back. The spec is
environment-agnostic and will be rerun under jsdom + fake-indexeddb in
a follow-up task.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:55:30 +01:00
github-actions[bot]
1165b5ad1e [AI] api: move tests into test/ and extract node setup
Moves methods.test.ts and its snapshots into test/, and extracts the
loot-core fs mock and IS_TESTING flag into a dedicated setup.node.ts
wired up via vite.config.mts. Prepares for a sibling setup.browser.ts.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:54:23 +01:00
github-actions[bot]
55889c560b [AI] api: add browser build scripts and exports conditions
Adds build:browser and test:browser scripts alongside the existing
Node-targeted ones, and a new "browser" condition in the package
exports so bundlers auto-pick the browser build. Also adds
npm-run-all and fake-indexeddb dev dependencies used by the browser
build/test pipeline.
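A "browser" exports condition lets bundlers pick the browser artifact automatically while Node keeps resolving the existing entry. A minimal illustration — the dist paths here are assumptions, not necessarily the package's real filenames:

```jsonc
{
  "exports": {
    ".": {
      // Bundlers targeting the browser resolve this condition first.
      "browser": "./dist/browser.js",
      // Node and Electron fall through to the existing build.
      "default": "./dist/index.js"
    }
  }
}
```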

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:52:36 +01:00
github-actions[bot]
9787894535 [AI] api: add browser Vite build config
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:51:35 +01:00
github-actions[bot]
f3850cae1d [AI] api: add browser entry point
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:50:56 +01:00
github-actions[bot]
05f4b84a85 [AI] Add implementation plan for browser-compatible @actual-app/api
Task-by-task TDD plan covering the browser entry and Vite build, dual
Node+browser unit tests, release notes, and the out-of-repo playground
for manual verification.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:49:58 +01:00
github-actions[bot]
54cb04f1c9 [AI] Add design spec for browser-compatible @actual-app/api build
Captures the approach for adding a browser build to @actual-app/api, the
unit-test setup that runs the same integration spec under both Node and
browser environments, and an out-of-repo playground app that hand-verifies
the browser build against a real Actual sync server.

Continuation of the work started in #7247 (closed stale).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 21:44:33 +01:00
482 changed files with 5546 additions and 15094 deletions

View File

@@ -1,6 +1,6 @@
issue_enrichment:
auto_enrich:
-enabled: true
+enabled: false
reviews:
request_changes_workflow: true
review_status: false

View File

@@ -1,7 +1,7 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/docker-existing-docker-compose
{
-"name": "Actual Devcontainer",
+"name": "Actual development",
"dockerComposeFile": ["../docker-compose.yml", "docker-compose.yml"],
// Alternatively:
// "image": "mcr.microsoft.com/devcontainers/typescript-node:0-16",

View File

@@ -3,6 +3,9 @@ contact_links:
- name: Bank-sync issues
url: https://discord.gg/pRYNYr4W5A
about: Is bank-sync not working? Returning too much or too little information? Reach out to the community on Discord.
- name: Support
url: https://discord.gg/pRYNYr4W5A
about: Need help with something? Having troubles setting up? Or perhaps issues using the API? Reach out to the community on Discord.
- name: Translations
url: https://hosted.weblate.org/projects/actualbudget/actual/
about: Found a string that needs a better translation? Add your suggestion or upvote an existing one in Weblate.

View File

@@ -1,17 +0,0 @@
name: Tech Support
description: Need help with something? Having troubles setting up? Or perhaps issues using the API?
title: '[Support]: '
labels: ['tech-support']
body:
- type: markdown
attributes:
value: |
> ⚠️ **Tech support tickets opened here are automatically closed.** GitHub Issues are reserved for bug reports and feature requests. The fastest way to get help is to ask the community on [Discord](https://discord.gg/pRYNYr4W5A) — that's where most of the community lives and can help you in real time.
- type: textarea
id: problem
attributes:
label: Describe your problem
description: Please describe, in as much detail as you can, what you need help with.
placeholder: I'm trying to [...] but [...]
validations:
required: true

View File

@@ -1,4 +1,4 @@
-<!-- Thank you for submitting a pull request! Make sure to follow the instructions to write release notes for your PR — it should only take a minute or two: https://actualbudget.org/docs/contributing/#writing-good-release-notes. Try running yarn generate:release-notes *before* pushing your PR for an interactive experience. -->
+<!-- Thank you for submitting a pull request! Make sure to follow the instructions to write release notes for your PR — it should only take a minute or two: https://github.com/actualbudget/docs#writing-good-release-notes. Try running yarn generate:release-notes *before* pushing your PR for an interactive experience. -->
## Description

View File

@@ -1,16 +1,13 @@
# See https://github.com/check-spelling/check-spelling/wiki/Configuration-Examples:-excludes
(?:^|/)(?i).nojekyll
(?:^|/)(?i)COPYRIGHT
(?:^|/)(?i)docusaurus.config.js
(?:^|/)(?i)LICEN[CS]E
(?:^|/)(?i)README.md
(?:^|/)3rdparty/
(?:^|/)go\.sum$
(?:^|/)package(?:-lock|)\.json$
(?:^|/)pyproject.toml
(?:^|/)requirements(?:-dev|-doc|-test|)\.txt$
(?:^|/)vendor/
(?:^|/)yarn\.lock$
ignore$
\.a$
\.ai$
\.avi$
@@ -56,7 +53,6 @@
\.svgz?$
\.tar$
\.tiff?$
\.tsx$
\.ttf$
\.wav$
\.webm$
@@ -66,12 +62,15 @@
\.zip$
^\.github/actions/spelling/
^\.github/ISSUE_TEMPLATE/
^\.yarn/
^\Q.github/\E$
^\Q.github/workflows/spelling.yml\E$
^\.yarn/
^\Qnode_modules/\E$
^\Qsrc/\E$
^\Qstatic/\E$
^\Q.github/\E$
(?:^|/)yarn\.lock$
(?:^|/)(?i)docusaurus.config.js
(?:^|/)(?i)README.md
(?:^|/)(?i).nojekyll
^\static/
^packages/docs/docs/releases\.md$
ignore$
\.tsx$

View File

@@ -38,13 +38,10 @@ Cetelem
cimode
Citi
Citibank
claude
Cloudflare
CLP
CMCIFRPAXXX
COBADEFF
CODEOWNERS
Codespaces
COEP
commerzbank
Copiar
@@ -56,7 +53,6 @@ crt
CZK
Danske
datadir
datamodel
DATEDIF
Depositos
deselection
@@ -86,7 +82,6 @@ Globecard
GLS
gocardless
Grafana
Gruvbox
HABAL
Hampel
HELADEF
@@ -94,7 +89,6 @@ HLOOKUP
HUF
IFERROR
IFNA
Ilavenil
INDUSTRIEL
INGBPLPW
Ingo
@@ -133,7 +127,6 @@ murmurhash
NETWORKDAYS
nginx
nodenext
nord
OIDC
Okabe
overbudgeted
@@ -147,13 +140,14 @@ pluggyai
Poste
PPABPLPK
prefs
Primoco
Priotecs
proactively
Qatari
QNTOFRP
QONTO
Raiffeisen
REGEXREPLACE
relinking
revolut
RIED
RSchedule
@@ -178,6 +172,7 @@ SWEDBANK
SWEDNOKK
Synology
systemctl
tada
taskbar
templating
THB
@@ -185,7 +180,6 @@ TIMEFRAME
touchscreen
triaging
tsgo
tsgolint
TWD
UAH
ubuntu
@@ -201,6 +195,4 @@ websecure
WEEKNUM
Widiba
WOR
worktree
youngcw
zizmor

View File

@@ -9,7 +9,7 @@ runs:
node-version: 22
- name: Install dependencies
shell: bash
-run: yarn workspaces focus actual @actual-app/ci-actions
+run: yarn workspaces focus @actual-app/ci-actions
- name: Generate release notes
shell: bash
env:

View File

@@ -52,9 +52,8 @@ runs:
with:
repository: actualbudget/translations
path: ${{ inputs.working-directory }}/packages/desktop-client/locale
persist-credentials: false
-if: ${{ inputs.download-translations == 'true' && !env.ACT }}
+if: ${{ inputs.download-translations == 'true' }}
- name: Remove untranslated languages
run: packages/desktop-client/bin/remove-untranslated-languages
shell: bash
-if: ${{ inputs.download-translations == 'true' && !env.ACT }}
+if: ${{ inputs.download-translations == 'true' }}

View File

@@ -18,8 +18,6 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:

View File

@@ -16,8 +16,6 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:

View File

@@ -19,26 +19,10 @@ concurrency:
cancel-in-progress: ${{ github.ref != 'refs/heads/master' }}
jobs:
setup:
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
api:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -61,12 +45,9 @@ jobs:
path: api-stats.json
crdt:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -89,12 +70,9 @@ jobs:
path: crdt-stats.json
web:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
- name: Build Web
@@ -111,12 +89,9 @@ jobs:
path: packages/desktop-client/build-stats
cli:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -139,12 +114,9 @@ jobs:
path: cli-stats.json
server:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:

View File

@@ -12,40 +12,20 @@ concurrency:
cancel-in-progress: ${{ github.ref != 'refs/heads/master' }}
jobs:
setup:
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
constraints:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Check dependency version consistency
run: yarn constraints
- name: Check tsconfig project references are in sync
run: yarn check:tsconfig-references
lint:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -53,12 +33,9 @@ jobs:
- name: Lint
run: yarn lint
typecheck:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -66,12 +43,9 @@ jobs:
- name: Typecheck
run: yarn typecheck
validate-cli:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -81,12 +55,9 @@ jobs:
- name: Check that the built CLI works
run: node packages/sync-server/build/bin/actual-server.js --version
test:
needs: setup
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -104,13 +75,10 @@ jobs:
- uses: zizmorcore/zizmor-action@71321a20a9ded102f6e9ce5718a2fcec2c4f70d8 # v0.5.2
migrations:
needs: setup
if: github.event_name == 'pull_request'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:

View File

@@ -23,8 +23,6 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Initialize CodeQL
uses: github/codeql-action/init@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1

View File

@@ -17,8 +17,6 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:

View File

@@ -26,13 +26,11 @@ permissions:
jobs:
cut-release-branch:
runs-on: ubuntu-latest
environment: release
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.event.inputs.ref || 'master' }}
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup

View File

@@ -32,14 +32,11 @@ jobs:
if: github.event_name == 'workflow_dispatch' || !github.event.repository.fork
name: Build Docker image
runs-on: ubuntu-latest
environment: release
strategy:
matrix:
os: [ubuntu, alpine]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up QEMU
uses: docker/setup-qemu-action@ce360397dd3f832beb865e1373c09c0e9f86d70a # v4.0.0

View File

@@ -27,11 +27,8 @@ jobs:
build:
name: Build Docker image
runs-on: ubuntu-latest
environment: release
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up QEMU
uses: docker/setup-qemu-action@ce360397dd3f832beb865e1373c09c0e9f86d70a # v4.0.0

View File

@@ -17,80 +17,32 @@ on:
env:
GITHUB_PR_NUMBER: ${{github.event.pull_request.number}}
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
jobs:
build-web:
name: Build web bundle
runs-on: ubuntu-latest
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Trust the repository directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
- name: Build browser bundle
# REACT_APP_NETLIFY=true flips isNonProductionEnvironment() on in the
# bundle so the "Create test file" button (used by every e2e beforeEach
# via ConfigurationPage.createTestFile()) is still rendered in a
# production build. Without it, e2e tests would time out waiting for
# a button that was tree-shaken out.
# --skip-translations keeps VRT screenshots deterministic by rendering
# source-code English instead of upstream Weblate en.json (which can
# drift between snapshot capture and test runs).
env:
REACT_APP_NETLIFY: 'true'
run: yarn build:browser --skip-translations
- name: Upload build artifact
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: desktop-client-build
path: packages/desktop-client/build/
retention-days: 1
overwrite: true
functional:
-name: Functional (shard ${{ matrix.shard }}/3)
+name: Functional (shard ${{ matrix.shard }}/5)
runs-on: ubuntu-latest
needs: build-web
strategy:
fail-fast: false
matrix:
-shard: [1, 2, 3]
+shard: [1, 2, 3, 4, 5]
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
env:
E2E_USE_BUILD: '1'
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Trust the repository directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
- name: Download web build
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: desktop-client-build
path: packages/desktop-client/build/
- name: Run E2E Tests
-run: yarn e2e --shard=${{ matrix.shard }}/3
+run: yarn e2e --shard=${{ matrix.shard }}/5
- uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
-if: failure()
+if: always()
with:
name: desktop-client-test-results-shard-${{ matrix.shard }}
path: packages/desktop-client/test-results/
@@ -104,8 +56,6 @@ jobs:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -130,34 +80,22 @@ jobs:
overwrite: true
vrt:
-name: Visual regression (shard ${{ matrix.shard }}/3)
+name: Visual regression (shard ${{ matrix.shard }}/5)
runs-on: ubuntu-latest
needs: build-web
strategy:
fail-fast: false
matrix:
-shard: [1, 2, 3]
+shard: [1, 2, 3, 4, 5]
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
env:
E2E_USE_BUILD: '1'
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Trust the repository directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
- name: Download web build
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: desktop-client-build
path: packages/desktop-client/build/
- name: Run VRT Tests
-run: yarn vrt --shard=${{ matrix.shard }}/3
+run: yarn vrt --shard=${{ matrix.shard }}/5
- uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
if: always()
with:
@@ -175,8 +113,6 @@ jobs:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
- name: Download all blob reports
@@ -201,9 +137,7 @@ jobs:
mkdir -p vrt-metadata
echo "${{ github.event.pull_request.number }}" > vrt-metadata/pr-number.txt
echo "${{ needs.vrt.result }}" > vrt-metadata/vrt-result.txt
-echo "${STEPS_PLAYWRIGHT_REPORT_VRT_OUTPUTS_ARTIFACT_URL}" > vrt-metadata/artifact-url.txt
-env:
-  STEPS_PLAYWRIGHT_REPORT_VRT_OUTPUTS_ARTIFACT_URL: ${{ steps.playwright-report-vrt.outputs.artifact-url }}
+echo "${{ steps.playwright-report-vrt.outputs.artifact-url }}" > vrt-metadata/artifact-url.txt
- name: Upload VRT metadata
if: github.event_name == 'pull_request'
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1

View File

@@ -21,9 +21,7 @@ jobs:
# this is so the assets can be added to the release
permissions:
contents: write
environment: release
strategy:
fail-fast: false
matrix:
os:
- ubuntu-22.04
@@ -32,8 +30,6 @@ jobs:
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- if: ${{ startsWith(matrix.os, 'windows') }}
run: pip.exe install setuptools
- if: ${{ ! startsWith(matrix.os, 'windows') }}
@@ -60,11 +56,9 @@ jobs:
METAINFO_FILE="packages/desktop-electron/extra-resources/linux/com.actualbudget.actual.metainfo.xml"
TODAY=$(date +%Y-%m-%d)
-VERSION=${STEPS_PROCESS_VERSION_OUTPUTS_VERSION}
+VERSION=${{ steps.process_version.outputs.version }}
sed -i "s/%RELEASE_VERSION%/$VERSION/g; s/%RELEASE_DATE%/$TODAY/g" "$METAINFO_FILE"
flatpak run --command=flatpak-builder-lint org.flatpak.Builder appstream "$METAINFO_FILE"
-env:
-  STEPS_PROCESS_VERSION_OUTPUTS_VERSION: ${{ steps.process_version.outputs.version }}
- name: Set up environment
uses: ./.github/actions/setup
- name: Build Electron for Mac
@@ -124,7 +118,6 @@ jobs:
publish-microsoft-store:
needs: build
runs-on: windows-latest
environment: release
if: ${{ github.event_name == 'push' && startsWith(github.ref, 'refs/tags/v') }}
steps:
- name: Install StoreBroker

View File

@@ -26,7 +26,6 @@ concurrency:
jobs:
build:
strategy:
fail-fast: false
matrix:
os:
- ubuntu-22.04
@@ -35,8 +34,6 @@ jobs:
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- if: ${{ startsWith(matrix.os, 'windows') }}
run: pip.exe install setuptools
- if: ${{ ! startsWith(matrix.os, 'windows') }}

View File

@@ -15,7 +15,6 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
path: actual
persist-credentials: false
- name: Set up environment
uses: ./actual/.github/actions/setup
with:
@@ -60,8 +59,6 @@ jobs:
ssh-key: ${{ secrets.STRING_IMPORT_DEPLOY_KEY }}
repository: actualbudget/translations
path: translations
# Need to be able to push back extracted strings
persist-credentials: true
- name: Generate i18n strings
working-directory: actual
run: |

View File

@@ -1,23 +0,0 @@
name: Close tech support issues with automated message
on:
issues:
types: [labeled]
jobs:
tech-support:
if: ${{ github.event.label.name == 'tech-support' }}
runs-on: ubuntu-latest
steps:
- name: Create comment and close issue
run: |
gh issue comment "$ISSUE_URL" --body ":wave: Thanks for reaching out!
GitHub Issues are reserved for bug reports and feature requests, so tech support tickets are automatically closed. The fastest way to get help is to ask the community on [Discord](https://discord.gg/pRYNYr4W5A) — that's where most of the community lives and can help you in real time.
<!-- tech-support-auto-close-comment -->"
gh issue close "$ISSUE_URL"
env:
ISSUE_URL: https://github.com/actualbudget/actual/issues/${{ github.event.issue.number }}
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}

View File

@@ -25,8 +25,6 @@ jobs:
steps:
# This is not a security concern because we have approved & merged the PR
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: 22

View File

@@ -19,12 +19,9 @@ concurrency:
jobs:
build-and-deploy:
runs-on: ubuntu-latest
environment: release
steps:
- name: Repository Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup

View File

@@ -1,47 +0,0 @@
name: Nightly theme catalog scan
on:
schedule:
# 05:15 UTC daily — runs after the i18n extract job (04:00) and well
# before the nightly Electron/npm publishes (00:00 UTC the next day).
- cron: '15 5 * * *'
workflow_dispatch:
permissions:
contents: read
jobs:
validate-theme-catalog:
name: Validate custom theme catalog
runs-on: ubuntu-latest
if: github.repository == 'actualbudget/actual'
timeout-minutes: 10
steps:
- name: Check out repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Validate themes
run: yarn workspace @actual-app/web validate:theme-catalog
notify-failure:
name: Notify Discord on failure
needs: validate-theme-catalog
if: failure() && github.repository == 'actualbudget/actual'
runs-on: ubuntu-latest
environment: nightly-alerts
timeout-minutes: 5
steps:
- name: Notify Discord
uses: sarisia/actions-status-discord@eb045afee445dc055c18d3d90bd0f244fd062708 # v1.16.0
with:
webhook: ${{ secrets.DISCORD_WEBHOOK_URL }}
status: Failure
title: Nightly theme catalog scan failed
description: The nightly scan failed. One or more themes may be broken, or the scan itself did not complete.
username: Actual Nightly
nofail: true

View File

@@ -21,7 +21,6 @@ concurrency:
jobs:
publish-flathub:
runs-on: ubuntu-22.04
environment: release
steps:
- name: Resolve version
id: resolve_version
@@ -55,9 +54,8 @@ jobs:
- name: Verify release assets exist
env:
GH_TOKEN: ${{ github.token }}
STEPS_RESOLVE_VERSION_OUTPUTS_TAG: ${{ steps.resolve_version.outputs.tag }}
run: |
TAG="${STEPS_RESOLVE_VERSION_OUTPUTS_TAG}"
TAG="${{ steps.resolve_version.outputs.tag }}"
echo "Checking release assets for tag $TAG..."
ASSETS=$(gh api "repos/${{ github.repository }}/releases/tags/$TAG" --jq '.assets[].name')
@@ -79,7 +77,7 @@ jobs:
- name: Calculate AppImage SHA256 (streamed)
run: |
VERSION="${STEPS_RESOLVE_VERSION_OUTPUTS_VERSION}"
VERSION="${{ steps.resolve_version.outputs.version }}"
BASE_URL="https://github.com/${{ github.repository }}/releases/download/v${VERSION}"
echo "Streaming x86_64 AppImage to compute SHA256..."
@@ -92,32 +90,27 @@ jobs:
echo "APPIMAGE_X64_SHA256=$APPIMAGE_X64_SHA256" >> "$GITHUB_ENV"
echo "APPIMAGE_ARM64_SHA256=$APPIMAGE_ARM64_SHA256" >> "$GITHUB_ENV"
env:
STEPS_RESOLVE_VERSION_OUTPUTS_VERSION: ${{ steps.resolve_version.outputs.version }}
- name: Checkout Flathub repo
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
repository: flathub/com.actualbudget.actual
token: ${{ secrets.FLATHUB_GITHUB_TOKEN }}
persist-credentials: false
- name: Update manifest with new version
run: |
VERSION="${STEPS_RESOLVE_VERSION_OUTPUTS_VERSION}"
VERSION="${{ steps.resolve_version.outputs.version }}"
# Replace x86_64 entry
sed -i "/x86_64.AppImage/{n;s|sha256:.*|sha256: ${APPIMAGE_X64_SHA256}|}" com.actualbudget.actual.yml
sed -i "/x86_64.AppImage/{n;s|sha256:.*|sha256: ${{ env.APPIMAGE_X64_SHA256 }}|}" com.actualbudget.actual.yml
sed -i "/x86_64.AppImage/s|url:.*|url: https://github.com/actualbudget/actual/releases/download/v${VERSION}/Actual-linux-x86_64.AppImage|" com.actualbudget.actual.yml
# Replace arm64 entry
sed -i "/arm64.AppImage/{n;s|sha256:.*|sha256: ${APPIMAGE_ARM64_SHA256}|}" com.actualbudget.actual.yml
sed -i "/arm64.AppImage/{n;s|sha256:.*|sha256: ${{ env.APPIMAGE_ARM64_SHA256 }}|}" com.actualbudget.actual.yml
sed -i "/arm64.AppImage/s|url:.*|url: https://github.com/actualbudget/actual/releases/download/v${VERSION}/Actual-linux-arm64.AppImage|" com.actualbudget.actual.yml
echo "Updated manifest:"
cat com.actualbudget.actual.yml
env:
STEPS_RESOLVE_VERSION_OUTPUTS_VERSION: ${{ steps.resolve_version.outputs.version }}
- name: Create PR in Flathub repo
uses: peter-evans/create-pull-request@5f6978faf089d4d20b00c7766989d076bb2fc7f1 # v8.1.1
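The manifest-update step above relies on a compact sed idiom: `/pattern/{n;s|…|…|}` matches a line, and `n` advances to the line after it before substituting. A small self-contained sketch (the manifest file and hash value here are made up for illustration):

```shell
# Fabricate a two-line manifest fragment like the Flathub one: a url line
# followed by its sha256 line.
cat > manifest.yml <<'EOF'
- url: https://example.invalid/Actual-linux-x86_64.AppImage
  sha256: oldvalue
EOF

APPIMAGE_X64_SHA256='deadbeef'
# `/x86_64.AppImage/` selects the url line; `n` moves to the NEXT line
# (the sha256 line), where the substitution rewrites the hash in place.
sed -i "/x86_64.AppImage/{n;s|sha256:.*|sha256: ${APPIMAGE_X64_SHA256}|}" manifest.yml
cat manifest.yml
```

Note the `sed -i` form here assumes GNU sed, as used on the `ubuntu-22.04` runners.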

View File

@@ -20,19 +20,15 @@ concurrency:
jobs:
build:
strategy:
fail-fast: false
matrix:
os:
- ubuntu-22.04
- windows-latest
- macos-latest
runs-on: ${{ matrix.os }}
environment: release
if: github.event.repository.fork == false
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- if: ${{ startsWith(matrix.os, 'windows') }}
run: pip.exe install setuptools

View File

@@ -0,0 +1,124 @@
name: Publish nightly npm packages
# Nightly npm packages are built daily at midnight UTC
on:
schedule:
- cron: '0 0 * * *'
workflow_dispatch:
jobs:
build-and-pack:
runs-on: ubuntu-latest
name: Build and pack npm packages
if: github.event.repository.fork == false
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Set up environment
uses: ./.github/actions/setup
- name: Update package versions
run: |
# Get new nightly versions
NEW_CORE_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/loot-core/package.json --type nightly)
NEW_WEB_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/desktop-client/package.json --type nightly)
NEW_SYNC_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/sync-server/package.json --type nightly)
NEW_API_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/api/package.json --type nightly)
NEW_CLI_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/cli/package.json --type nightly)
# Set package versions
npm version $NEW_CORE_VERSION --no-git-tag-version --workspace=@actual-app/core --no-workspaces-update
npm version $NEW_WEB_VERSION --no-git-tag-version --workspace=@actual-app/web --no-workspaces-update
npm version $NEW_SYNC_VERSION --no-git-tag-version --workspace=@actual-app/sync-server --no-workspaces-update
npm version $NEW_API_VERSION --no-git-tag-version --workspace=@actual-app/api --no-workspaces-update
npm version $NEW_CLI_VERSION --no-git-tag-version --workspace=@actual-app/cli --no-workspaces-update
- name: Yarn install
run: |
yarn install
- name: Pack the core package
run: |
yarn workspace @actual-app/core pack --filename @actual-app/core.tgz
- name: Build Server & Web
run: yarn build:server
- name: Pack the web and server packages
run: |
yarn workspace @actual-app/web pack --filename @actual-app/web.tgz
yarn workspace @actual-app/sync-server pack --filename @actual-app/sync-server.tgz
- name: Build API
run: yarn build:api
- name: Pack the api package
run: |
yarn workspace @actual-app/api pack --filename @actual-app/api.tgz
- name: Build CLI
run: yarn workspace @actual-app/cli build
- name: Pack the cli package
run: |
yarn workspace @actual-app/cli pack --filename @actual-app/cli.tgz
- name: Upload package artifacts
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: npm-packages
path: |
packages/loot-core/@actual-app/core.tgz
packages/desktop-client/@actual-app/web.tgz
packages/sync-server/@actual-app/sync-server.tgz
packages/api/@actual-app/api.tgz
packages/cli/@actual-app/cli.tgz
publish:
runs-on: ubuntu-latest
name: Publish Nightly npm packages
needs: build-and-pack
permissions:
contents: read
packages: write
steps:
- name: Download the artifacts
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: npm-packages
- name: Setup node and npm registry
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: 22
registry-url: 'https://registry.npmjs.org'
- name: Publish Core
run: |
npm publish loot-core/@actual-app/core.tgz --access public --tag nightly
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish Web
run: |
npm publish desktop-client/@actual-app/web.tgz --access public --tag nightly
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish Sync-Server
run: |
npm publish sync-server/@actual-app/sync-server.tgz --access public --tag nightly
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish API
run: |
npm publish api/@actual-app/api.tgz --access public --tag nightly
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish CLI
run: |
npm publish cli/@actual-app/cli.tgz --access public --tag nightly
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}

View File

@@ -1,55 +1,26 @@
name: Publish npm packages
# Npm packages are published for every new tag and nightly schedule
# # Npm packages are published for every new tag
on:
push:
tags:
- 'v*.*.*'
schedule:
- cron: '0 0 * * *'
workflow_dispatch:
jobs:
build-and-pack:
runs-on: ubuntu-latest
name: Build and pack npm packages
if: github.event_name == 'push' || (github.event.repository.fork == false)
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
- name: Update package versions
if: github.event_name != 'push'
run: |
# Get new nightly versions
NEW_CORE_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/loot-core/package.json --type nightly)
NEW_WEB_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/desktop-client/package.json --type nightly)
NEW_SYNC_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/sync-server/package.json --type nightly)
NEW_API_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/api/package.json --type nightly)
NEW_CLI_VERSION=$(yarn workspace @actual-app/ci-actions tsx bin/get-next-package-version.ts --package-json ./packages/cli/package.json --type nightly)
# Set package versions
npm version $NEW_CORE_VERSION --no-git-tag-version --workspace=@actual-app/core --no-workspaces-update
npm version $NEW_WEB_VERSION --no-git-tag-version --workspace=@actual-app/web --no-workspaces-update
npm version $NEW_SYNC_VERSION --no-git-tag-version --workspace=@actual-app/sync-server --no-workspaces-update
npm version $NEW_API_VERSION --no-git-tag-version --workspace=@actual-app/api --no-workspaces-update
npm version $NEW_CLI_VERSION --no-git-tag-version --workspace=@actual-app/cli --no-workspaces-update
- name: Yarn install
if: github.event_name != 'push'
run: |
# Required after nightly `npm version` updates workspace manifests.
yarn install
- name: Pack the core package
run: |
yarn workspace @actual-app/core pack --filename @actual-app/core.tgz
- name: Build Server & Web
- name: Build Web
run: yarn build:server
- name: Pack the web and server packages
@@ -73,7 +44,6 @@ jobs:
- name: Upload package artifacts
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
if: ${{ !env.ACT }}
with:
name: npm-packages
path: |
@@ -87,13 +57,9 @@ jobs:
runs-on: ubuntu-latest
name: Publish npm packages
needs: build-and-pack
environment: release
permissions:
contents: read
packages: write
id-token: write # Required for OIDC
env:
NPM_DIST_TAG: ${{ github.event_name != 'push' && 'nightly' || '' }}
steps:
- name: Download the artifacts
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
@@ -103,26 +69,35 @@ jobs:
- name: Setup node and npm registry
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: 24
check-latest: true
node-version: 22
registry-url: 'https://registry.npmjs.org'
- name: Publish Core
run: |
npm publish loot-core/@actual-app/core.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
npm publish loot-core/@actual-app/core.tgz --access public
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish Web
run: |
npm publish desktop-client/@actual-app/web.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
npm publish desktop-client/@actual-app/web.tgz --access public
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish Sync-Server
run: |
npm publish sync-server/@actual-app/sync-server.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
npm publish sync-server/@actual-app/sync-server.tgz --access public
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish API
run: |
npm publish api/@actual-app/api.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
npm publish api/@actual-app/api.tgz --access public
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Publish CLI
run: |
npm publish cli/@actual-app/cli.tgz --access public ${NPM_DIST_TAG:+--tag "$NPM_DIST_TAG"}
npm publish cli/@actual-app/cli.tgz --access public
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
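The consolidated publish steps above hinge on the `${NPM_DIST_TAG:+…}` parameter expansion: the alternate text appears only when the variable is set and non-empty, letting one `npm publish` line serve both tagged releases (no dist-tag flag) and nightlies. A sketch with a placeholder tarball name:

```shell
# ${VAR:+word} expands to `word` when VAR is non-empty, and to nothing
# when VAR is empty or unset — so the --tag flag simply disappears for
# tagged releases.
NPM_DIST_TAG='nightly'
CMD_NIGHTLY="npm publish pkg.tgz --access public${NPM_DIST_TAG:+ --tag $NPM_DIST_TAG}"

NPM_DIST_TAG=''
CMD_RELEASE="npm publish pkg.tgz --access public${NPM_DIST_TAG:+ --tag $NPM_DIST_TAG}"

echo "$CMD_NIGHTLY"
echo "$CMD_RELEASE"
```

This matches the workflow's `NPM_DIST_TAG: ${{ github.event_name != 'push' && 'nightly' || '' }}` job-level env, which is empty for tag pushes.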

View File

@@ -37,15 +37,13 @@ jobs:
with:
fetch-depth: 0
token: ${{ secrets.ACTIONS_UPDATE_TOKEN || github.token }}
# Need to be able to commit release notes after generation
persist-credentials: true
- name: Get changed files
if: steps.bot-check.outputs.skip != 'true'
id: changed-files
run: |
git fetch origin ${GITHUB_BASE_REF}
CHANGED_FILES=$(git diff --name-only origin/${GITHUB_BASE_REF}...HEAD)
git fetch origin ${{ github.base_ref }}
CHANGED_FILES=$(git diff --name-only origin/${{ github.base_ref }}...HEAD)
NON_DOCS_FILES=$(echo "$CHANGED_FILES" | grep -v -e "^packages/docs/" -e "^\.github/actions/docs-spelling/" || true)
if [ -z "$NON_DOCS_FILES" ] && [ -n "$CHANGED_FILES" ]; then

View File

@@ -38,7 +38,6 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.base_ref }}
persist-credentials: false
- name: Set up environment
uses: ./.github/actions/setup
with:
@@ -105,7 +104,7 @@ jobs:
- name: Report build failure
if: steps.wait-for-web-build.outputs.conclusion == 'failure' || steps.wait-for-api-build.outputs.conclusion == 'failure' || steps.wait-for-cli-build.outputs.conclusion == 'failure' || steps.wait-for-crdt-build.outputs.conclusion == 'failure'
run: |
echo "Build failed on PR branch or ${GITHUB_BASE_REF}"
echo "Build failed on PR branch or ${{github.base_ref}}"
exit 1
- name: Download web build artifact from ${{github.base_ref}}

View File

@@ -3,12 +3,9 @@ on:
schedule:
- cron: '30 1 * * *'
workflow_dispatch: # Allow manual triggering
permissions: {}
jobs:
stale:
permissions:
pull-requests: write
runs-on: ubuntu-latest
steps:
- uses: actions/stale@b5d41d4e1d5dceea10e7104786b73624c18a190f # v10.2.0
@@ -19,8 +16,6 @@ jobs:
days-before-close: 5
days-before-issue-stale: -1
stale-wip:
permissions:
pull-requests: write
runs-on: ubuntu-latest
steps:
- uses: actions/stale@b5d41d4e1d5dceea10e7104786b73624c18a190f # v10.2.0
@@ -32,8 +27,6 @@ jobs:
days-before-issue-stale: -1
stale-needs-info:
permissions:
issues: write
runs-on: ubuntu-latest
steps:
- uses: actions/stale@b5d41d4e1d5dceea10e7104786b73624c18a190f # v10.2.0

View File

@@ -75,12 +75,9 @@ jobs:
echo "Found patch file: $PATCH_FILE"
# Validate patch only contains PNG files. `git format-patch` emits a
# `GIT binary patch` block for PNGs (no +++/--- lines), so check
# `diff --git` headers — those are present for both text and binary.
# Validate patch only contains PNG files
echo "Validating patch contains only PNG files..."
if grep -E '^diff --git ' "$PATCH_FILE" \
| grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'; then
if grep -E '^(\+\+\+|---) [ab]/' "$PATCH_FILE" | grep -v '\.png$'; then
echo "ERROR: Patch contains non-PNG files! Rejecting for security."
echo "applied=false" >> "$GITHUB_OUTPUT"
echo "error=Patch validation failed: contains non-PNG files" >> "$GITHUB_OUTPUT"
@@ -88,7 +85,7 @@ jobs:
fi
# Extract file list for verification
FILES_CHANGED=$(grep -cE '^diff --git ' "$PATCH_FILE")
FILES_CHANGED=$(grep -E '^\+\+\+ b/' "$PATCH_FILE" | sed 's/^+++ b\///' | wc -l)
echo "Patch modifies $FILES_CHANGED PNG file(s)"
# Configure git
@@ -110,7 +107,7 @@ jobs:
fi
# Commit
git commit -m "Update VRT screenshots" -m "Auto-generated by VRT workflow" -m "PR: #${STEPS_METADATA_OUTPUTS_PR_NUMBER}"
git commit -m "Update VRT screenshots" -m "Auto-generated by VRT workflow" -m "PR: #${{ steps.metadata.outputs.pr_number }}"
echo "applied=true" >> "$GITHUB_OUTPUT"
else
@@ -119,8 +116,6 @@ jobs:
echo "error=Patch conflicts with current branch state" >> "$GITHUB_OUTPUT"
exit 1
fi
env:
STEPS_METADATA_OUTPUTS_PR_NUMBER: ${{ steps.metadata.outputs.pr_number }}
- name: Push changes
if: steps.apply.outputs.applied == 'true'
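The validation change above (matching `^diff --git ` headers instead of `+++`/`---` lines) matters because `git format-patch` emits a `GIT binary patch` block for PNGs, which carries no `+++`/`---` markers at all — a `+++`-based filter would never even see binary entries. A self-contained sketch with a fabricated patch file:

```shell
# Fabricated patch: one allowed PNG entry and one disallowed file, both
# as binary hunks (no +++/--- lines, as git emits for binary content).
cat > sample.patch <<'EOF'
diff --git a/img/ok.png b/img/ok.png
GIT binary patch
literal 10
diff --git a/src/evil.sh b/src/evil.sh
GIT binary patch
literal 10
EOF

# The `diff --git` header is present for text AND binary entries, so the
# allow-list check cannot be bypassed by binary-encoded files.
OFFENDERS=$(grep -E '^diff --git ' sample.patch \
  | grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$' || true)
if [ -n "$OFFENDERS" ]; then
  echo "rejected: patch contains non-PNG files"
fi
```

Here the `evil.sh` entry survives the inverse PNG filter and triggers rejection, while a `+++`-based filter would have passed the patch untouched.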

View File

@@ -36,16 +36,15 @@ jobs:
content: 'eyes'
});
get-pr:
name: Resolve PR details
generate-vrt-updates:
name: Generate VRT Updates
runs-on: ubuntu-latest
# Only run on PR comments containing /update-vrt
if: >
github.event.issue.pull_request &&
startsWith(github.event.comment.body, '/update-vrt')
outputs:
head_sha: ${{ steps.pr.outputs.head_sha }}
head_ref: ${{ steps.pr.outputs.head_ref }}
head_repo: ${{ steps.pr.outputs.head_repo }}
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- name: Get PR details
id: pr
@@ -61,130 +60,9 @@ jobs:
core.setOutput('head_ref', pr.head.ref);
core.setOutput('head_repo', pr.head.repo.full_name);
build-web:
name: Build web bundle
runs-on: ubuntu-latest
needs: get-pr
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ needs.get-pr.outputs.head_sha }}
persist-credentials: false
- name: Trust workspace directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
shell: bash
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Build browser bundle
# REACT_APP_NETLIFY=true keeps the "Create test file" button in the
# production bundle — every VRT test's beforeEach relies on it via
# ConfigurationPage.createTestFile().
env:
REACT_APP_NETLIFY: 'true'
run: |
yarn workspace plugins-service build
yarn workspace @actual-app/crdt build
yarn workspace @actual-app/core build:browser
yarn workspace @actual-app/web build:browser
- name: Upload build artifact
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: desktop-client-build
path: packages/desktop-client/build/
retention-days: 1
overwrite: true
browser-vrt:
name: Browser VRT (shard ${{ matrix.shard }}/3)
runs-on: ubuntu-latest
needs: [get-pr, build-web]
strategy:
fail-fast: false
matrix:
shard: [1, 2, 3]
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
env:
E2E_USE_BUILD: '1'
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ needs.get-pr.outputs.head_sha }}
persist-credentials: false
- name: Trust workspace directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
shell: bash
- name: Set up environment
uses: ./.github/actions/setup
with:
download-translations: 'false'
- name: Download web build
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: desktop-client-build
path: packages/desktop-client/build/
- name: Run VRT Tests
continue-on-error: true
run: yarn vrt --update-snapshots --shard=${{ matrix.shard }}/3
- name: Create shard patch with PNG changes only
id: create-patch
run: |
git config --global user.name "github-actions[bot]"
git config --global user.email "github-actions[bot]@users.noreply.github.com"
git add "**/*.png"
if git diff --staged --quiet; then
echo "has_changes=false" >> "$GITHUB_OUTPUT"
echo "No VRT changes in this shard"
exit 0
fi
echo "has_changes=true" >> "$GITHUB_OUTPUT"
git commit -m "Update VRT screenshots (browser shard ${{ matrix.shard }})"
git format-patch -1 HEAD --stdout > vrt-shard.patch
# Validate patch only contains PNG files. `git format-patch` emits a
# `GIT binary patch` block for PNGs (no +++/--- lines), so check
# `diff --git` headers — those are present for both text and binary.
if grep -E '^diff --git ' vrt-shard.patch \
| grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'; then
echo "ERROR: Shard patch contains non-PNG files!"
exit 1
fi
- name: Upload shard patch
if: steps.create-patch.outputs.has_changes == 'true'
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: vrt-shard-browser-${{ matrix.shard }}
path: vrt-shard.patch
retention-days: 1
overwrite: true
desktop-vrt:
name: Desktop VRT
runs-on: ubuntu-latest
needs: get-pr
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ needs.get-pr.outputs.head_sha }}
persist-credentials: false
- name: Trust workspace directory
run: git config --global --add safe.directory "$GITHUB_WORKSPACE"
shell: bash
ref: ${{ steps.pr.outputs.head_sha }}
- name: Set up environment
uses: ./.github/actions/setup
@@ -195,120 +73,48 @@ jobs:
- name: Install build tools
run: apt-get update && apt-get install -y build-essential python3
- name: Run Desktop VRT Tests
- name: Run VRT Tests on Desktop app
continue-on-error: true
run: |
yarn rebuild-electron
xvfb-run --auto-servernum --server-args="-screen 0 1920x1080x24" -- yarn e2e:desktop --update-snapshots
- name: Create shard patch with PNG changes only
- name: Run VRT Tests
continue-on-error: true
run: yarn vrt --update-snapshots
- name: Create patch with PNG changes only
id: create-patch
run: |
# Trust the repository directory (required for container environments)
git config --global --add safe.directory "$GITHUB_WORKSPACE"
git config --global user.name "github-actions[bot]"
git config --global user.email "github-actions[bot]@users.noreply.github.com"
# Stage only PNG files
git add "**/*.png"
# Check if there are any changes
if git diff --staged --quiet; then
echo "has_changes=false" >> "$GITHUB_OUTPUT"
echo "No VRT changes in desktop shard"
exit 0
fi
echo "has_changes=true" >> "$GITHUB_OUTPUT"
git commit -m "Update VRT screenshots (desktop)"
git format-patch -1 HEAD --stdout > vrt-shard.patch
# See validation note in browser-vrt above.
if grep -E '^diff --git ' vrt-shard.patch \
| grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'; then
echo "ERROR: Desktop shard patch contains non-PNG files!"
exit 1
fi
- name: Upload shard patch
if: steps.create-patch.outputs.has_changes == 'true'
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: vrt-shard-desktop
path: vrt-shard.patch
retention-days: 1
overwrite: true
merge-patch:
name: Merge VRT Patches
runs-on: ubuntu-latest
needs: [get-pr, browser-vrt, desktop-vrt]
if: ${{ !cancelled() && needs.get-pr.result == 'success' }}
container:
image: mcr.microsoft.com/playwright:v1.59.1-jammy
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ needs.get-pr.outputs.head_sha }}
persist-credentials: false
- name: Download all shard patches
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
path: /tmp/shard-patches
pattern: vrt-shard-*
- name: Merge shard patches
id: create-patch
run: |
git config --global --add safe.directory "$GITHUB_WORKSPACE"
git config --global user.name "github-actions[bot]"
git config --global user.email "github-actions[bot]@users.noreply.github.com"
shopt -s nullglob
patches=(/tmp/shard-patches/*/vrt-shard.patch)
if [ ${#patches[@]} -eq 0 ]; then
echo "has_changes=false" >> "$GITHUB_OUTPUT"
echo "No shard patches to merge"
exit 0
fi
# Defense in depth: re-validate every shard patch before applying.
# See validation note in browser-vrt above for why we match
# `diff --git` headers instead of +++/--- lines.
for patch in "${patches[@]}"; do
echo "Validating $patch"
if grep -E '^diff --git ' "$patch" \
| grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'; then
echo "ERROR: $patch contains non-PNG files!"
exit 1
fi
done
# Apply each shard patch. Shards touch disjoint PNG files so
# order does not matter. --index stages the applied changes.
for patch in "${patches[@]}"; do
echo "Applying $patch"
git apply --index "$patch"
done
if git diff --staged --quiet; then
echo "has_changes=false" >> "$GITHUB_OUTPUT"
echo "No VRT changes after merge"
echo "No VRT changes to commit"
exit 0
fi
echo "has_changes=true" >> "$GITHUB_OUTPUT"
# Create commit and patch
git commit -m "Update VRT screenshots"
git format-patch -1 HEAD --stdout > vrt-update.patch
# Final guard on the combined patch.
if grep -E '^diff --git ' vrt-update.patch \
| grep -vE '^diff --git a/[^[:space:]]+\.png b/[^[:space:]]+\.png$'; then
echo "ERROR: Merged patch contains non-PNG files!"
# Validate patch only contains PNG files
if grep -E '^(\+\+\+|---) [ab]/' vrt-update.patch | grep -v '\.png$'; then
echo "ERROR: Patch contains non-PNG files!"
exit 1
fi
echo "Merged patch created successfully with PNG changes only"
echo "Patch created successfully with PNG changes only"
- name: Upload patch artifact
if: steps.create-patch.outputs.has_changes == 'true'
@@ -323,11 +129,8 @@ jobs:
run: |
mkdir -p pr-metadata
echo "${{ github.event.issue.number }}" > pr-metadata/pr-number.txt
echo "${NEEDS_GET_PR_OUTPUTS_HEAD_REF}" > pr-metadata/head-ref.txt
echo "${NEEDS_GET_PR_OUTPUTS_HEAD_REPO}" > pr-metadata/head-repo.txt
env:
NEEDS_GET_PR_OUTPUTS_HEAD_REF: ${{ needs.get-pr.outputs.head_ref }}
NEEDS_GET_PR_OUTPUTS_HEAD_REPO: ${{ needs.get-pr.outputs.head_repo }}
echo "${{ steps.pr.outputs.head_ref }}" > pr-metadata/head-ref.txt
echo "${{ steps.pr.outputs.head_repo }}" > pr-metadata/head-repo.txt
- name: Upload PR metadata
if: steps.create-patch.outputs.has_changes == 'true'

.gitignore
View File

@@ -42,9 +42,6 @@ bundle.desktop.js.map
bundle.mobile.js
bundle.mobile.js.map
# Python virtualenv (Electron CI provisions one at the repo root for setuptools)
.venv/
# Yarn
.pnp.*
.yarn/*
@@ -95,3 +92,4 @@ storybook-static
.actualrc.yaml
.actualrc.yml
actual.config.js
.playwright-cli/

View File

@@ -37,7 +37,6 @@
"actual/no-anchor-tag": "error",
"actual/no-react-default-import": "error",
"actual/prefer-subpath-imports": "error",
"actual/enforce-boundaries": "error",
"actual/no-extraneous-dependencies": "error",
// JSX A11y rules
@@ -337,11 +336,6 @@
"group": ["**/*.api", "**/*.electron"],
"message": "Don't directly reference imports from other platforms"
},
{
"group": ["uuid"],
"importNames": ["*"],
"message": "Use `import { v4 as uuidv4 } from 'uuid'` instead"
},
{
"group": ["**/style", "**/colors"],
"importNames": ["colors"],
@@ -375,14 +369,7 @@
"files": ["**/*.test.{js,ts,jsx,tsx}", "packages/docs/**/*"],
"rules": {
"actual/no-untranslated-strings": "off",
"actual/prefer-logger-over-console": "off",
"typescript/unbound-method": "off"
}
},
{
"files": ["packages/eslint-plugin-actual/lib/rules/__tests__/**/*"],
"rules": {
"actual/enforce-boundaries": "off"
"actual/prefer-logger-over-console": "off"
}
},
{

View File

@@ -281,6 +281,7 @@ Always run `yarn typecheck` before committing.
- Avoid `any` or `unknown` unless absolutely necessary
- Look for existing type definitions in the codebase
- Avoid type assertions (`as`, `!`) - prefer `satisfies`
- Use inline type imports: `import { type MyType } from '...'`
**Naming:**

View File

@@ -1,3 +1 @@
Please review the contributing documentation on our website: https://actualbudget.org/docs/contributing/
If you plan to use AI tools when contributing, please also read our [AI Usage Policy](https://actualbudget.org/docs/contributing/ai-usage-policy).

View File

@@ -4,30 +4,21 @@ ROOT=`dirname $0`
cd "$ROOT/.."
SKIP_TRANSLATIONS=false
while [[ $# -gt 0 ]]; do
case "$1" in
--skip-translations)
SKIP_TRANSLATIONS=true
shift
;;
*)
echo "Unknown argument: $1" >&2
exit 1
;;
esac
done
if [ "$SKIP_TRANSLATIONS" = false ]; then
echo "Updating translations..."
if ! [ -d packages/desktop-client/locale ]; then
git clone https://github.com/actualbudget/translations packages/desktop-client/locale
fi
pushd packages/desktop-client/locale > /dev/null
git checkout .
git pull
popd > /dev/null
packages/desktop-client/bin/remove-untranslated-languages
echo "Updating translations..."
if ! [ -d packages/desktop-client/locale ]; then
git clone https://github.com/actualbudget/translations packages/desktop-client/locale
fi
pushd packages/desktop-client/locale > /dev/null
git checkout .
git pull
popd > /dev/null
packages/desktop-client/bin/remove-untranslated-languages
lage build:browser --to=@actual-app/web
export NODE_OPTIONS="--max-old-space-size=4096"
yarn workspace @actual-app/crdt build
yarn workspace plugins-service build
yarn workspace @actual-app/core build:browser
yarn workspace @actual-app/web build:browser
echo "packages/desktop-client/build"

View File

@@ -57,7 +57,8 @@ yarn workspace @actual-app/core build:node
yarn workspace @actual-app/web build --mode=desktop # electron specific build
# required for running the sync-server server
yarn build:browser
yarn workspace @actual-app/core build:browser
yarn workspace @actual-app/web build:browser
yarn workspace @actual-app/sync-server build
# Emit @actual-app/core declarations so desktop-electron (which includes typings/window.ts) can build

View File

@@ -25,14 +25,6 @@ module.exports = {
outputGlob: BUILD_OUTPUT_GLOBS,
},
},
// Not cached: the script stages files into public/ and build-stats/ that
// fall outside BUILD_OUTPUT_GLOBS, so a cache hit would skip the side
// effects.
'build:browser': {
type: 'npmScript',
dependsOn: ['^build'],
cache: false,
},
},
cacheOptions: {
cacheStorageConfig: {

View File

@@ -24,16 +24,18 @@
"start:server-dev": "NODE_ENV=development BROWSER_OPEN=localhost:5006 yarn npm-run-all --parallel 'start:server-monitor' 'start'",
"start:desktop": "yarn desktop-dependencies && npm-run-all --parallel 'start:desktop-*'",
"start:docs": "yarn workspace docs start",
"desktop-dependencies": "npm-run-all --parallel rebuild-electron build:plugins-service",
"desktop-dependencies": "npm-run-all --parallel rebuild-electron build:browser-backend build:plugins-service",
"start:desktop-node": "yarn workspace @actual-app/core watch:node",
"start:desktop-client": "yarn workspace @actual-app/web watch",
"start:desktop-server-client": "yarn workspace @actual-app/web build:browser",
"start:desktop-electron": "yarn workspace desktop-electron watch",
"start:browser": "npm-run-all --parallel 'start:browser-*' 'start:service-plugins'",
"start:browser": "yarn workspace plugins-service build-dev && npm-run-all --parallel 'start:browser-*'",
"start:service-plugins": "yarn workspace plugins-service watch",
"start:browser-backend": "yarn workspace @actual-app/core watch:browser",
"start:browser-frontend": "yarn workspace @actual-app/web start:browser",
"start:storybook": "yarn workspace @actual-app/components start:storybook",
"build": "lage build",
"build:browser-backend": "yarn workspace @actual-app/core build:browser",
"build:server": "yarn build:browser && yarn workspace @actual-app/sync-server build",
"build:browser": "./bin/package-browser",
"build:desktop": "./bin/package-electron",
@@ -52,23 +54,20 @@
"playwright": "yarn workspace @actual-app/web run playwright",
"vrt": "yarn workspace @actual-app/web run vrt",
"vrt:docker": "./bin/run-vrt",
"rebuild-electron": "./node_modules/.bin/electron-rebuild -f -m ./packages/desktop-electron -o better-sqlite3,bcrypt",
"rebuild-electron": "./node_modules/.bin/electron-rebuild -m ./packages/loot-core && ./node_modules/.bin/electron-rebuild -m ./packages/desktop-electron -o better-sqlite3,bcrypt",
"rebuild-node": "yarn workspace @actual-app/core rebuild",
"lint": "oxfmt --check . && oxlint --type-aware --quiet",
"lint:fix": "oxfmt . && oxlint --fix --type-aware --quiet",
"install:server": "yarn workspaces focus @actual-app/sync-server --production",
"constraints": "yarn constraints",
"typecheck": "tsgo -p tsconfig.root.json --noEmit && lage typecheck",
"check:tsconfig-references": "workspaces-to-typescript-project-references --check",
"sync:tsconfig-references": "workspaces-to-typescript-project-references",
"prepare": "husky"
},
"devDependencies": {
"@monorepo-utils/workspaces-to-typescript-project-references": "^2.10.3",
"@octokit/rest": "^22.0.1",
"@types/node": "^22.19.17",
"@types/prompts": "^2.4.9",
"@typescript/native-preview": "beta",
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"@yarnpkg/types": "^4.0.1",
"eslint": "^10.2.0",
"eslint-plugin-perfectionist": "^5.8.0",
@@ -99,9 +98,8 @@
"socks": ">=2.8.3"
},
"lint-staged": {
"packages/*/{package.json,tsconfig.json}": [
"ts-node ./bin/validate-publish-imports.ts --fix",
"yarn sync:tsconfig-references"
"packages/*/package.json": [
"ts-node ./bin/validate-publish-imports.ts --fix"
],
"*.{js,mjs,jsx,ts,tsx,md,json,yml,yaml}": [
"oxfmt --no-error-on-unmatched-pattern"

View File

@@ -0,0 +1,102 @@
/// <reference lib="webworker" />
// Worker entry for @actual-app/api's browser build.
//
// This owns the real loot-core instance (sql.js + absurd-sql + IndexedDB)
// and speaks loot-core's existing backend protocol over postMessage:
// main → worker: {id, name, args, undoTag?, catchErrors?}
// worker → main: {type:'reply', id, result, mutated, undoTag}
// {type:'error', id, error}
// {type:'connect'} (handshake heartbeat)
//
// Bootstrapping:
// - We register an `api-browser/init` handler that runs loot-core's public
// init(config), so the main-thread facade can kick off the DB + auth via
// a normal RPC call. The reply carries no return value (loot-core's
// `init(config)` resolves to `lib`, which isn't structured-cloneable).
// - connection.init(self, handlers) starts the message loop and the
// `{type:'connect'}` handshake loot-core's client connection expects.
import * as connection from '@actual-app/core/platform/server/connection';
import { handlers, init } from '@actual-app/core/server/main';
import type { InitConfig } from '@actual-app/core/server/main';
// Dev-server friendliness: consumer bundlers (Vite first, others too) run
// import-analysis on every `.js` URL they serve. loot-core's JS migrations
// use `#`-subpath imports that only resolve inside loot-core — analysis
// fails when those files live under node_modules/@actual-app/api/dist/.
// Our build writes those files with an extra `.data` suffix, so bundlers
// leave them alone. Translate the URLs here so loot-core's fetch layer
// still sees `.js` names both in the manifest and on-disk.
//
// The wrap has to install before connection.init() runs, and
// populateDefaultFilesystem is kicked off lazily from the first
// `load-budget` / init call.
{
const origFetch = globalThis.fetch;
const MIGRATION_JS = /\/data\/migrations\/[^/?]+\.js(\?.*)?$/;
globalThis.fetch = (async (
input: RequestInfo | URL,
initArg?: RequestInit,
): Promise<Response> => {
const url =
typeof input === 'string'
? input
: input instanceof Request
? // Request.toString() yields "[object Request]"; use .url instead.
input.url
: input.toString();
if (MIGRATION_JS.test(url)) {
// Re-target .js → .js.data before hitting the network.
const patched = url.replace(/(\.js)(\?|$)/, '.js.data$2');
return origFetch(patched, initArg);
}
if (
url.endsWith('/data-file-index.txt') ||
url.endsWith('data-file-index.txt')
) {
const res = await origFetch(input as RequestInfo | URL, initArg);
if (!res.ok) return res;
const text = await res.text();
const rewritten = text.replace(/\.js\.data(\r?\n|$)/g, '.js$1');
return new Response(rewritten, {
status: res.status,
statusText: res.statusText,
headers: res.headers,
});
}
return origFetch(input as RequestInfo | URL, initArg);
}) as typeof fetch;
}
// `api-browser/init` is a worker-local handler; it isn't part of the shared
// Handlers type. Assign via the index-signature cast rather than extending
// the type globally.
(handlers as Record<string, (args?: unknown) => Promise<unknown>>)[
'api-browser/init'
] = async function (args?: unknown) {
const payload = (args ?? {}) as InitConfig & { __assetsBaseUrl?: string };
// Main thread hands us a URL pointing at the api's own dist/ dir. Setting
// PUBLIC_URL here is what makes loot-core's populateDefaultFilesystem
// fetch `data-file-index.txt` / `data/<name>` / `sql-wasm.wasm` from our
// package instead of the consumer's page origin — no manual copy step.
const { __assetsBaseUrl, ...config } = payload;
if (__assetsBaseUrl) {
process.env.PUBLIC_URL = __assetsBaseUrl;
}
await init(config);
// Nothing to return — the resolved `lib` has functions and isn't
// structured-cloneable anyway.
};
self.addEventListener('error', e => {
// eslint-disable-next-line no-console
console.error(
'[api worker] uncaught',
(e as ErrorEvent).error ?? (e as ErrorEvent).message,
);
});
self.addEventListener('unhandledrejection', e => {
// eslint-disable-next-line no-console
console.error(
'[api worker] unhandled rejection',
(e as PromiseRejectionEvent).reason,
);
});
connection.init(self as unknown as Window, handlers);
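The `.js` ↔ `.js.data` translation above can be sketched in isolation. A minimal standalone version of the two rewrites, assuming the same regexes as the worker (the function names here are illustrative, not part of the api):

```typescript
// Outgoing migration URLs gain a `.data` suffix so consumer bundlers
// skip import-analysis; the manifest text is translated back so
// loot-core's fetch layer only ever sees `.js` names.
const MIGRATION_JS = /\/data\/migrations\/[^/?]+\.js(\?.*)?$/;

function toWireUrl(url: string): string {
  // `.js` → `.js.data`, preserving any query string.
  return MIGRATION_JS.test(url)
    ? url.replace(/(\.js)(\?|$)/, '.js.data$2')
    : url;
}

function fromWireManifest(text: string): string {
  // Manifest lines end in `.js.data`; strip the suffix back to `.js`.
  return text.replace(/\.js\.data(\r?\n|$)/g, '.js$1');
}
```

Non-migration URLs pass through untouched, so the wrapper is safe to install globally in the worker.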

View File

@@ -0,0 +1,39 @@
// Browser main-thread stub for `@actual-app/core/server/main`.
//
// The real loot-core runs inside the worker (see browser-worker.ts). The
// main-thread bundle reuses packages/api/methods.ts verbatim, but that file
// reads `lib.send(...)` from loot-core. Resolving that import to this stub
// routes every call over postMessage instead of touching loot-core on the
// main thread.
export type BrowserSendFn = (name: string, args?: unknown) => Promise<unknown>;
let workerSend: BrowserSendFn = () => {
return Promise.reject(
new Error('@actual-app/api: call init() before any other method'),
);
};
// Shape-cast rather than `typeof import(...)` so this stub stays
// module-graph-independent from the real loot-core.
export const lib = {
send(name: string, args?: unknown) {
return workerSend(name, args);
},
} as unknown as {
send: <T = unknown>(name: string, args?: unknown) => Promise<T>;
};
export function _setBrowserSend(fn: BrowserSendFn) {
workerSend = fn;
}
// Inline InitConfig (matches loot-core's shape) so this stub does not force
// TS to pull in the real @actual-app/core/server/main module graph at all.
export type InitConfig = {
dataDir?: string;
serverURL?: string;
password?: string;
sessionToken?: string;
verbose?: boolean;
};
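The stub's late-binding trick reduces to a mutable function slot behind a stable object: code that imported `lib` before `init()` ran still picks up the worker-backed transport once it is installed. An illustrative reduction (names hypothetical, not the stub's actual exports):

```typescript
type SendFn = (name: string, args?: unknown) => Promise<unknown>;

// Rejects until a real transport is swapped in.
let current: SendFn = () =>
  Promise.reject(new Error('call init() first'));

// Consumers hold `lib`; `send` delegates through the slot on every call,
// so swapping `current` later retroactively rewires all importers.
const lib = {
  send: (name: string, args?: unknown) => current(name, args),
};

function setSend(fn: SendFn) {
  current = fn;
}
```

This is why `methods.ts` can read `lib.send` at module-evaluation time without forcing loot-core into the main-thread bundle.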

packages/api/browser/rpc.ts Normal file
View File

@@ -0,0 +1,132 @@
// Main-thread RPC bridge to the api worker.
//
// Reuses `createBackendWorker` from loot-core so absurd-sql's main-thread
// plumbing (IDB helper worker, __absurd:* filtering) stays in one place.
// Speaks loot-core's existing backend protocol:
// out: {id, name, args, catchErrors?}
// in : {type:'reply', id, result, error?}
// {type:'error', id, error}
// {type:'connect'} (handshake heartbeat)
// {type:'push', name, args}
//
// We handle the handshake by replying {name:'client-connected-to-backend'}
// on the first 'connect'. Messages sent before handshake completes are
// queued.
import { createBackendWorker } from '@actual-app/core/platform/client/backend-worker';
import type { BackendWorker } from '@actual-app/core/platform/client/backend-worker';
type Pending = {
resolve: (v: unknown) => void;
reject: (e: unknown) => void;
};
type Reply =
| {
type: 'reply';
id: string;
result?: unknown;
error?: { type?: string; message?: string; [k: string]: unknown };
}
| {
type: 'error';
id: string;
error: { type?: string; message?: string; [k: string]: unknown };
};
let backend: BackendWorker | null = null;
let connected = false;
let queue: Array<{ id: string; name: string; args?: unknown }> = [];
const pending = new Map<string, Pending>();
function nextId(): string {
if (typeof crypto !== 'undefined' && 'randomUUID' in crypto) {
return crypto.randomUUID();
}
return Date.now().toString(36) + '-' + Math.random().toString(36).slice(2);
}
function toError(info: { type?: string; message?: string } | undefined) {
const msg = info?.message || info?.type || 'api worker error';
const err = new Error(msg);
if (info?.type) err.name = info.type;
return err;
}
export function setWorker(worker: Worker): BackendWorker {
if (backend) {
backend.terminate();
}
connected = false;
queue = [];
pending.clear();
backend = createBackendWorker(worker);
backend.onMessage((data: unknown) => {
if (!data || typeof data !== 'object') return;
const msg = data as { type?: string; name?: string };
if (msg.type === 'connect') {
if (!connected) {
connected = true;
backend!.postMessage({ name: 'client-connected-to-backend' });
// Drain anything queued while waiting for the handshake.
const drained = queue;
queue = [];
for (const m of drained) backend!.postMessage(m);
}
return;
}
if (msg.type === 'reply' || msg.type === 'error') {
const reply = msg as Reply;
const p = pending.get(reply.id);
if (!p) return;
pending.delete(reply.id);
if (reply.type === 'error') {
p.reject(toError(reply.error));
} else if ('error' in reply && reply.error) {
// api/* handlers funnel errors through the reply envelope.
p.reject(toError(reply.error));
} else {
p.resolve(reply.result);
}
return;
}
// push/capture-exception/etc. — ignore for now; the api consumer
// doesn't subscribe to loot-core's server events.
});
return backend;
}
export function rpc(name: string, args?: unknown): Promise<unknown> {
if (!backend) {
return Promise.reject(
new Error('@actual-app/api: init() must be called before any api method'),
);
}
return new Promise((resolve, reject) => {
const id = nextId();
pending.set(id, { resolve, reject });
const msg = { id, name, args };
if (connected) {
backend!.postMessage(msg);
} else {
queue.push(msg);
}
});
}
export function terminate() {
if (backend) {
backend.terminate();
backend = null;
}
connected = false;
queue = [];
pending.clear();
}
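The queue-until-handshake behavior in `setWorker`/`rpc` can be shown as a small gate: messages posted before the worker's `connect` arrives are buffered, then drained in order once the handshake completes. A sketch under illustrative names (not the module's real API):

```typescript
type Msg = { id: string; name: string };

function createGate(post: (m: Msg) => void) {
  let connected = false;
  let queue: Msg[] = [];
  return {
    send(m: Msg) {
      // Before the handshake, buffer instead of posting.
      if (connected) post(m);
      else queue.push(m);
    },
    onConnect() {
      if (connected) return;
      connected = true;
      // Drain in FIFO order so request ids stay matched to replies.
      const drained = queue;
      queue = [];
      for (const m of drained) post(m);
    },
  };
}
```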

View File

@@ -0,0 +1,66 @@
// Main-thread browser entry for @actual-app/api.
//
// Public surface matches the Node entry. The worker is spawned internally
// so consumers write:
//
// import * as api from '@actual-app/api';
// await api.init({ dataDir: '/documents', serverURL, password });
// await api.getAccounts();
//
// worker.js must be a sibling of browser.js at runtime. Our build ships
// them together in dist/; the consumer's bundler resolves the worker URL
// via `new URL(..., import.meta.url)`.
import { _setBrowserSend } from './browser/lib-stub';
import type { InitConfig } from './browser/lib-stub';
import { rpc, setWorker, terminate } from './browser/rpc';
export * from './methods';
export * as utils from './utils';
// Wire methods.ts's `lib.send` through the worker.
_setBrowserSend((name, args) => rpc(name, args));
function createWorker(): Worker {
// Vite's `vite:worker-import-meta-url` plugin rewrites this pattern at
// the CONSUMER's build time (emit worker.js as an asset, substitute the
// hashed URL). Feeding it a non-literal first argument keeps the api's
// OWN lib build from trying to pre-bundle it, which would fail because
// ./worker.js is not a source-tree sibling of this file.
const rel = './worker.js';
return new Worker(new URL(rel, import.meta.url), { type: 'module' });
}
export async function init(config: InitConfig = {}) {
setWorker(createWorker());
// Point loot-core's browser fs at our dist/ directory. We want the
// directory portion of this bundle's own URL so loot-core's fetches land
// on files we ship (data-file-index.txt, migrations/, default-db.sqlite,
// sql-wasm.wasm). Vite's asset plugin tries to pre-bundle
// `new URL('.', import.meta.url)` at consumer build time and picks up
// the `development` export condition (inlining index.ts as a data URL!).
// Derive the base URL via string manipulation instead so static analyzers
// leave it alone.
const assetsBaseUrl = import.meta.url.replace(/[^/]+$/, '');
await rpc('api-browser/init', { ...config, __assetsBaseUrl: assetsBaseUrl });
// Return a {send} handle compatible with the Node entry so existing
// consumer code that does `const internal = await api.init(...); internal.send(...)`
// keeps working on the browser build too.
return {
send: (name: string, args?: unknown) => rpc(name, args),
};
}
export async function shutdown() {
try {
await rpc('sync');
} catch {
// most likely no budget loaded
}
try {
await rpc('close-budget');
} catch {
// ignore
}
terminate();
}
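The base-URL derivation in `init()` is just "strip the filename component" done with a regex rather than `new URL('.', import.meta.url)`, so static analyzers leave it alone. The same operation in isolation (helper name illustrative):

```typescript
// Directory portion of a module URL: drop the trailing non-slash run.
// String manipulation on purpose — bundler asset plugins pattern-match
// on `new URL(..., import.meta.url)` and would rewrite that form.
function dirOf(moduleUrl: string): string {
  return moduleUrl.replace(/[^/]+$/, '');
}
```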

View File

@@ -13,7 +13,6 @@ import type { ImportTransactionsOpts } from '@actual-app/core/types/api-handlers
import type { Handlers } from '@actual-app/core/types/handlers';
import type {
ImportTransactionEntity,
NoteEntity,
RuleEntity,
TransactionEntity,
} from '@actual-app/core/types/models';
@@ -248,14 +247,6 @@ export function deleteCategory(
return send('api/category-delete', { id, transferCategoryId });
}
export function getNote(id: NoteEntity['id']) {
return send('api/note-get', { id });
}
export function updateNote(id: NoteEntity['id'], note: NoteEntity['note']) {
return send('api/note-update', { id, note });
}
export function getCommonPayees() {
return send('api/common-payees-get');
}

View File

@@ -1 +0,0 @@
export type * from '@actual-app/core/server/api-models';

View File

@@ -1,18 +1,11 @@
{
"name": "@actual-app/api",
"version": "26.5.2",
"version": "26.4.0",
"description": "An API for Actual",
"license": "MIT",
"repository": {
"type": "git",
"url": "git+https://github.com/actualbudget/actual.git",
"directory": "packages/api"
},
"files": [
"@types",
"dist",
"!@types/**/*.test.d.ts",
"!@types/**/*.test.d.ts.map"
"dist"
],
"main": "dist/index.js",
"types": "@types/index.d.ts",
@@ -20,43 +13,45 @@
".": {
"types": "./@types/index.d.ts",
"development": "./index.ts",
"browser": "./dist/browser.js",
"default": "./dist/index.js"
},
"./models": {
"types": "./@types/models.d.ts",
"development": "./models.ts",
"default": "./dist/models.js"
}
},
"publishConfig": {
"exports": {
".": {
"types": "./@types/index.d.ts",
"browser": "./dist/browser.js",
"default": "./dist/index.js"
},
"./models": {
"types": "./@types/models.d.ts",
"default": "./dist/models.js"
}
}
},
"scripts": {
"build": "vite build && tsgo --emitDeclarationOnly",
"test": "vitest --run",
"build": "npm-run-all -s build:node build:browser-worker build:browser",
"build:node": "vite build --config vite.config.mts && tsgo --emitDeclarationOnly",
"build:browser": "vite build --config vite.browser.config.mts",
"build:browser-worker": "vite build --config vite.browser-worker.config.mts",
"test": "npm-run-all -cp 'test:*'",
"test:node": "vitest --run --config vite.config.mts",
"test:browser": "vitest --run --config vitest.browser.config.mts",
"typecheck": "tsgo -b && tsc-strict"
},
"dependencies": {
"@actual-app/core": "workspace:*",
"@actual-app/crdt": "workspace:*",
"absurd-sql": "0.0.54",
"better-sqlite3": "^12.8.0",
"compare-versions": "^6.1.1",
"uuid": "^14.0.0"
"compare-versions": "^6.1.1"
},
"devDependencies": {
"@typescript/native-preview": "beta",
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"fake-indexeddb": "^6.2.5",
"jsdom": "^27.4.0",
"npm-run-all": "^4.1.5",
"rollup-plugin-visualizer": "^7.0.1",
"typescript-strict-plugin": "^2.4.4",
"vite": "^8.0.5",
"vite-plugin-node-polyfills": "^0.26.0",
"vite-plugin-peggy-loader": "^2.0.1",
"vitest": "^4.1.2"
},

View File

@@ -0,0 +1,183 @@
import { afterEach, describe, expect, test, vi } from 'vitest';
import * as api from '../index.browser';
// Swap the real Worker constructor for a mock that the tests control. Vitest
// picks this up via vite.config resolve.alias; here we just stand in globally
// because jsdom does not ship Worker at all.
class MockWorker {
public posted: Array<unknown> = [];
public responder: (
req: { id: string; name: string; args?: unknown },
reply: (res: unknown) => void,
) => void = () => undefined;
private listeners: Array<(e: MessageEvent) => void> = [];
onmessage: ((e: MessageEvent) => void) | null = null;
onerror: ((e: ErrorEvent) => void) | null = null;
private connected = false;
addEventListener(type: string, handler: (e: MessageEvent) => void) {
if (type === 'message') this.listeners.push(handler);
}
removeEventListener() {
// no-op for tests
}
postMessage(msg: unknown) {
this.posted.push(msg);
if (
msg &&
typeof msg === 'object' &&
(msg as { name?: string }).name === 'client-connected-to-backend'
) {
// Handshake complete; we won't keep sending 'connect' heartbeats.
return;
}
const req = msg as { id: string; name: string; args?: unknown };
queueMicrotask(() => {
this.responder(req, (data: unknown) => {
const ev = { data } as MessageEvent;
this.onmessage?.(ev);
for (const l of this.listeners) l(ev);
});
});
}
/** Simulate loot-core's connect handshake from the worker side. */
fireConnect() {
if (this.connected) return;
this.connected = true;
const ev = { data: { type: 'connect' } } as MessageEvent;
this.onmessage?.(ev);
for (const l of this.listeners) l(ev);
}
terminate() {
this.listeners = [];
}
}
// Every Worker the api spawns inside init() comes through here.
let lastMockWorker: MockWorker | null = null;
const mockWorkerResponder = vi.fn<
(
req: { id: string; name: string; args?: unknown },
reply: (res: unknown) => void,
) => void
>(() => undefined);
// Global Worker stub — the api's internal `new Worker(...)` will call this.
// @ts-expect-error jsdom has no Worker; we override the global for the test.
globalThis.Worker = class {
constructor(_url: URL | string, _opts?: WorkerOptions) {
const w = new MockWorker();
w.responder = (req, reply) => mockWorkerResponder(req, reply);
lastMockWorker = w;
// Fire the connect handshake on the next tick so init() resolves.
queueMicrotask(() => w.fireConnect());
return w as unknown as Worker;
}
};
// absurd-sql's main-thread bridge expects real Worker event semantics. The
// mock above exposes addEventListener; initSQLBackend just attaches a
// message listener, so it's safe with jsdom.
afterEach(async () => {
// Keep whatever responder the test installed so shutdown's sync/close-budget
// calls resolve rather than hang.
await api.shutdown().catch(() => undefined);
mockWorkerResponder.mockReset();
lastMockWorker = null;
});
describe('@actual-app/api browser facade', () => {
test('spawns a worker on init and forwards config via api-browser/init', async () => {
mockWorkerResponder.mockImplementation((req, reply) => {
reply({ type: 'reply', id: req.id, result: undefined });
});
await api.init({
dataDir: '/documents',
serverURL: 'https://example.test',
password: 'pw',
});
expect(lastMockWorker).toBeTruthy();
// First post after the handshake ack is the api-browser/init request.
const initCall = lastMockWorker!.posted.find(
m =>
m &&
typeof m === 'object' &&
(m as { name?: string }).name === 'api-browser/init',
) as { name: string; args: unknown } | undefined;
expect(initCall).toBeTruthy();
expect(initCall!.args).toMatchObject({
dataDir: '/documents',
serverURL: 'https://example.test',
password: 'pw',
});
// The api also hands over its own asset base URL so loot-core's fs
// can fetch migrations / default-db / WASM from the api's dist/
// instead of the consumer's page origin.
expect(
(initCall!.args as { __assetsBaseUrl?: string }).__assetsBaseUrl,
).toBeTypeOf('string');
});
test('rpc methods forward as {id, name, args} and read {type:reply, result}', async () => {
mockWorkerResponder.mockImplementation((req, reply) => {
if (req.name === 'api-browser/init') {
reply({ type: 'reply', id: req.id, result: undefined });
return;
}
if (req.name === 'api/accounts-get') {
reply({
type: 'reply',
id: req.id,
result: [{ id: 'a1', name: 'Checking' }],
});
return;
}
reply({
type: 'error',
id: req.id,
error: { type: 'APIError', message: 'unexpected' },
});
});
await api.init({ dataDir: '/documents' });
const accounts = await api.getAccounts();
expect(accounts).toEqual([{ id: 'a1', name: 'Checking' }]);
const sendCalls = lastMockWorker!.posted.filter(
m =>
m &&
typeof m === 'object' &&
(m as { name?: string }).name === 'api/accounts-get',
);
expect(sendCalls).toHaveLength(1);
expect((sendCalls[0] as { args?: unknown }).args).toBeUndefined();
});
test('worker errors reject at the call site', async () => {
mockWorkerResponder.mockImplementation((req, reply) => {
if (req.name === 'api-browser/init') {
reply({ type: 'reply', id: req.id, result: undefined });
return;
}
reply({
type: 'reply',
id: req.id,
error: { type: 'APIError', message: 'budget not loaded' },
});
});
await api.init({ dataDir: '/documents' });
await expect(api.getAccounts()).rejects.toThrow(/budget not loaded/);
});
});

View File

@@ -0,0 +1,43 @@
import { afterEach, describe, expect, test } from 'vitest';
import * as api from '../index';
declare const __API_DATA_DIR__: string;
afterEach(async () => {
await api.shutdown();
});
describe('api CRUD roundtrip (Node)', () => {
test('creates a budget, writes, reads it back', async () => {
const internal = await api.init({ dataDir: __API_DATA_DIR__ });
await internal.send('create-budget', {
budgetName: 'Integration Test',
testMode: true,
testBudgetId: 'integration-test',
});
await api.loadBudget('integration-test');
const accountId = await api.createAccount(
{ name: 'Checking', offbudget: false },
0,
);
await api.addTransactions(accountId, [
{ date: '2026-04-01', amount: 1000, payee_name: 'Coffee' },
{ date: '2026-04-02', amount: -500, payee_name: 'Book' },
]);
const accounts = await api.getAccounts();
expect(accounts.map(a => a.name)).toContain('Checking');
const txns = await api.getTransactions(
accountId,
'2026-04-01',
'2026-04-30',
);
expect(txns).toHaveLength(2);
expect(txns.map(t => t.amount).sort((a, b) => a - b)).toEqual([-500, 1000]);
});
});

View File

@@ -2,43 +2,18 @@ import * as fs from 'fs/promises';
import * as path from 'path';
import type { RuleEntity } from '@actual-app/core/types/models';
import { vi } from 'vitest';
import * as api from './index';
declare global {
var IS_TESTING: boolean;
var currentMonth: string | null;
}
// In tests we run from source; loot-core's API fs uses __dirname (for the built dist/).
// Mock the fs so path constants point at loot-core package root where migrations live.
vi.mock(
'../loot-core/src/platform/server/fs/index.api',
async importOriginal => {
const actual = (await importOriginal()) as Record<string, unknown>;
const pathMod = await import('path');
const lootCoreRoot = pathMod.join(__dirname, '..', 'loot-core');
return {
...actual,
migrationsPath: pathMod.join(lootCoreRoot, 'migrations'),
bundledDatabasePath: pathMod.join(lootCoreRoot, 'default-db.sqlite'),
demoBudgetPath: pathMod.join(lootCoreRoot, 'demo-budget'),
};
},
);
import * as api from '../index';
const budgetName = 'test-budget';
global.IS_TESTING = true;
beforeEach(async () => {
const budgetPath = path.join(__dirname, '/mocks/budgets/', budgetName);
const budgetPath = path.join(__dirname, '/../mocks/budgets/', budgetName);
await fs.rm(budgetPath, { force: true, recursive: true });
await createTestBudget('default-budget-template', budgetName);
await api.init({
dataDir: path.join(__dirname, '/mocks/budgets/'),
dataDir: path.join(__dirname, '/../mocks/budgets/'),
});
});
@@ -50,10 +25,10 @@ afterEach(async () => {
async function createTestBudget(templateName: string, name: string) {
const templatePath = path.join(
__dirname,
'/../loot-core/src/mocks/files',
'/../../loot-core/src/mocks/files',
templateName,
);
const budgetPath = path.join(__dirname, '/mocks/budgets/', name);
const budgetPath = path.join(__dirname, '/../mocks/budgets/', name);
await fs.mkdir(budgetPath);
await fs.copyFile(
@@ -516,29 +491,6 @@ describe('API CRUD operations', () => {
);
});
// apis: getNote, updateNote
test('Notes: successfully get and update note', async () => {
const categories = await api.getCategories();
const categoryId = categories[0].id;
// No note exists initially
const initial = await api.getNote(categoryId);
expect(initial).toBeNull();
// Set a note
await api.updateNote(categoryId, 'Test note content');
const afterSet = await api.getNote(categoryId);
expect(afterSet).toEqual({ id: categoryId, note: 'Test note content' });
// Update the note
await api.updateNote(categoryId, 'Updated note content');
const afterUpdate = await api.getNote(categoryId);
expect(afterUpdate).toEqual({
id: categoryId,
note: 'Updated note content',
});
});
// apis: getRules, getPayeeRules, createRule, updateRule, deleteRule
test('Rules: successfully update rules', async () => {
await api.createPayee({ name: 'test-payee' });

View File

@@ -0,0 +1,31 @@
import * as fsPromises from 'fs/promises';
import * as os from 'os';
import * as path from 'path';
import { vi } from 'vitest';
// In tests we run from source; loot-core's API fs uses __dirname (for the built dist/).
// Mock the fs so path constants point at loot-core package root where migrations live.
vi.mock(
'../../loot-core/src/platform/server/fs/index.api',
async importOriginal => {
const actual = (await importOriginal()) as Record<string, unknown>;
const lootCoreRoot = path.join(__dirname, '..', '..', 'loot-core');
return {
...actual,
migrationsPath: path.join(lootCoreRoot, 'migrations'),
bundledDatabasePath: path.join(lootCoreRoot, 'default-db.sqlite'),
demoBudgetPath: path.join(lootCoreRoot, 'demo-budget'),
};
},
);
global.IS_TESTING = true;
// Shared integration test lives in a filesystem-backed tmp dir.
const dataDir = path.join(
os.tmpdir(),
`api-it-${Date.now()}-${Math.random().toString(36).slice(2)}`,
);
await fsPromises.mkdir(dataDir, { recursive: true });
globalThis.__API_DATA_DIR__ = dataDir;

View File

@@ -8,33 +8,28 @@
"module": "es2022",
"moduleResolution": "bundler",
"customConditions": ["api"],
// composite + declaration: true require `noEmit: false`, so use
// emitDeclarationOnly to keep typecheck + project refs working without
// clobbering the Vite build artifacts in dist/. build:node also passes
// --emitDeclarationOnly on the CLI (redundant but explicit).
"noEmit": false,
"emitDeclarationOnly": true,
"declaration": true,
"declarationMap": true,
"outDir": "dist",
"rootDir": ".",
"declarationDir": "@types",
"tsBuildInfoFile": "dist/.tsbuildinfo",
"plugins": [
{
"name": "typescript-strict-plugin",
"paths": ["."]
}
]
"plugins": [{ "name": "typescript-strict-plugin", "paths": ["."] }]
},
"references": [
{
"path": "../loot-core"
},
{
"path": "../crdt"
}
],
"references": [{ "path": "../crdt" }, { "path": "../loot-core" }],
"include": ["."],
"exclude": [
"**/node_modules/*",
"dist",
"@types",
"**/*.test.ts",
"test/setup.*.ts",
"*.config.ts",
"*.config.mts"
]

View File

@@ -0,0 +1,62 @@
import path from 'path';
import { defineConfig } from 'vite';
import { nodePolyfills } from 'vite-plugin-node-polyfills';
import peggyLoader from 'vite-plugin-peggy-loader';
const distDir = path.resolve(__dirname, 'dist');
// Worker bundle: contains the full loot-core + sql.js + absurd-sql stack.
// Runs inside a Web Worker where absurd-sql's Atomics.wait has the right
// thread context. Consumer spawns the worker with this file as the entry.
export default defineConfig({
define: {
// NODE_ENV is read at build time by dead-code elimination paths and
// must stay a literal. The others (PUBLIC_URL, DATA_DIR, SERVER_URL,
// DOCUMENT_DIR) are set at runtime via the `api-browser/init` handler
// which receives them from the main thread — so they stay as
// `process.env.<name>` references and the nodePolyfills-provided
// process shim serves as the backing store.
'process.env.NODE_ENV': JSON.stringify('production'),
},
build: {
target: 'esnext',
outDir: distDir,
emptyOutDir: false,
sourcemap: true,
lib: {
entry: path.resolve(__dirname, 'browser-worker.ts'),
formats: ['es'],
fileName: () => 'worker.js',
},
rollupOptions: {
output: {
codeSplitting: false,
},
},
},
plugins: [
peggyLoader(),
nodePolyfills({
include: [
'process',
'buffer',
'stream',
'path',
'crypto',
'timers',
'util',
'zlib',
'fs',
'assert',
],
globals: {
process: true,
Buffer: true,
global: true,
},
}),
],
// Intentionally no resolve.conditions: ['api'] — loot-core falls back to
// its default (browser) platform files.
});

View File

@@ -0,0 +1,39 @@
import path from 'path';
import { defineConfig } from 'vite';
const distDir = path.resolve(__dirname, 'dist');
// Main-thread facade only. Tiny bundle: no loot-core, no sql.js, no absurd-sql.
// The worker is built separately by vite.browser-worker.config.mts. The
// consumer constructs the Worker (handling URL resolution through their own
// bundler) and hands it to init().
export default defineConfig({
build: {
target: 'esnext',
outDir: distDir,
emptyOutDir: false,
sourcemap: true,
lib: {
entry: path.resolve(__dirname, 'index.browser.ts'),
formats: ['es'],
fileName: () => 'browser.js',
},
rollupOptions: {
output: {
codeSplitting: false,
},
},
},
resolve: {
alias: {
// methods.ts reads `lib.send` from loot-core's server/main. Route it
// through the main-thread stub so loot-core is never pulled into
// the main bundle.
'@actual-app/core/server/main': path.resolve(
__dirname,
'browser/lib-stub.ts',
),
},
},
});

View File

@@ -49,10 +49,61 @@ function copyMigrationsAndDefaultDb() {
throw new Error(`default-db.sqlite not found at ${defaultDbPath}`);
}
fs.copyFileSync(defaultDbPath, path.join(distDir, 'default-db.sqlite'));
// Browser consumers need sql.js' WASM to be served at the same origin
// as the bundle. Ship it alongside dist/ so downstream apps just point
// a static handler at dist and don't have to reach into node_modules.
const sqlJsWasm = require.resolve('@jlongster/sql.js/dist/sql-wasm.wasm');
fs.copyFileSync(sqlJsWasm, path.join(distDir, 'sql-wasm.wasm'));
// loot-core's browser fs bootstraps by fetching:
// `${PUBLIC_URL}data-file-index.txt` - flat manifest
// `${PUBLIC_URL}data/<name>` - each file listed in the manifest
// We point PUBLIC_URL at the api's dist dir at runtime (see
// index.browser.ts), so these two shapes need to exist here.
//
// JS migrations get a `.data` suffix on the *wire* path. Consumer
// bundlers (Vite's dev server first, others to varying degrees)
// auto-transform `.js` URLs through their import-analysis pipelines,
// which fails on loot-core's `#`-subpath imports. The api's worker
// (browser-worker.ts) wraps `fetch` to translate back to `.js` so
// loot-core's migration runner finds the file under its original
// name in the virtual FS. `.sql` migrations stay as-is.
const dataDir = path.join(distDir, 'data');
const dataMigrationsDir = path.join(dataDir, 'migrations');
fs.mkdirSync(dataMigrationsDir, { recursive: true });
linkOrCopy(
path.join(distDir, 'default-db.sqlite'),
path.join(dataDir, 'default-db.sqlite'),
);
const wireMigrationNames: string[] = [];
for (const name of fs.readdirSync(migrationsDest)) {
const wireName = name.endsWith('.js') ? `${name}.data` : name;
linkOrCopy(
path.join(migrationsDest, name),
path.join(dataMigrationsDir, wireName),
);
wireMigrationNames.push(`migrations/${wireName}`);
}
wireMigrationNames.sort();
// data-file-index.txt: one path per line, relative to `data/`.
const manifest =
['default-db.sqlite', ...wireMigrationNames].join('\n') + '\n';
fs.writeFileSync(path.join(distDir, 'data-file-index.txt'), manifest);
},
};
}
function linkOrCopy(src: string, dest: string) {
try {
fs.linkSync(src, dest);
} catch {
fs.copyFileSync(src, dest);
}
}
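
The rename-on-publish and rename-on-fetch described in the comments above can be sketched as a pair of pure helpers (hypothetical names; the real logic lives inline in this plugin and in browser-worker.ts's fetch wrapper):

```typescript
// Publish side: JS migrations get a `.data` suffix on the wire so consumer
// bundlers (notably Vite's dev-server import analysis) treat them as opaque
// assets instead of transformable modules. `.sql` migrations pass through.
function toWireName(name: string): string {
  return name.endsWith('.js') ? `${name}.data` : name;
}

// Fetch side: the worker strips the suffix again so loot-core's migration
// runner sees the file under its original `.js` name in the virtual FS.
function fromWireName(wireName: string): string {
  return wireName.endsWith('.js.data')
    ? wireName.slice(0, -'.data'.length)
    : wireName;
}
```

The two functions are exact inverses for `.js` names, which is what lets the worker translate wire paths back without consulting the manifest.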
export default defineConfig({
ssr: {
noExternal: true,
@@ -66,12 +117,9 @@ export default defineConfig({
emptyOutDir: true,
sourcemap: true,
lib: {
entry: {
index: path.resolve(__dirname, 'index.ts'),
models: path.resolve(__dirname, 'models.ts'),
},
entry: path.resolve(__dirname, 'index.ts'),
formats: ['cjs'],
fileName: (_format, entryName) => `${entryName}.js`,
fileName: () => 'index.js',
},
},
plugins: [
@@ -85,6 +133,9 @@ export default defineConfig({
},
test: {
globals: true,
environment: 'node',
setupFiles: ['./test/setup.node.ts'],
exclude: ['**/node_modules/**', '**/browser-facade.test.ts'],
onConsoleLog(log: string, type: 'stdout' | 'stderr'): boolean | void {
// print only console.error
return type === 'stderr';

View File

@@ -0,0 +1,35 @@
import path from 'path';
import { defineConfig } from 'vite';
import peggyLoader from 'vite-plugin-peggy-loader';
// Deliberately independent from vite.browser.config.mts: the build config
// applies node polyfills that would swap out Node fs in the test setup
// file. The test setup uses real Node fs to stream the on-disk fixtures
// (default-db.sqlite, migrations, sql.js WASM) through a fetch polyfill.
export default defineConfig({
plugins: [peggyLoader()],
// The facade test imports `../index.browser` directly and uses a mock
// Worker. loot-core never loads on the main thread, so no platform
// condition juggling is needed here. The sibling vite.browser.config.mts
// aliases loot-core to the stub for the bundled facade; for the test we
// mirror that so `methods.ts` resolves correctly.
resolve: {
alias: {
'@actual-app/core/server/main': path.resolve(
__dirname,
'browser/lib-stub.ts',
),
},
},
test: {
globals: true,
environment: 'jsdom',
include: ['test/browser-facade.test.ts'],
onConsoleLog(log: string, type: 'stdout' | 'stderr'): boolean | void {
return type === 'stderr';
},
maxWorkers: 2,
},
});

View File

@@ -69,8 +69,6 @@ const botEmail = '41898282+github-actions[bot]@users.noreply.github.com';
await exec(`git config user.name '${botName}'`);
await exec(`git config user.email '${botEmail}'`);
const AUTOGEN_MARKER = '<!-- release-notes:auto-generated -->';
await group('Prepare branch', async () => {
if (process.env.GITHUB_HEAD_REF) {
await exec(`git fetch origin ${process.env.GITHUB_HEAD_REF}`, {
@@ -81,34 +79,17 @@ await group('Prepare branch', async () => {
});
}
// recover deleted release note files from previous generation commits
const baseRef = process.env.GITHUB_BASE_REF || 'master';
await exec(`git fetch origin ${baseRef}`, { stdio: 'inherit' });
const { stdout: mergeBase } = await exec(
`git merge-base HEAD origin/${baseRef}`,
  // the previous generation commit deletes source files from
  // upcoming-release-notes; rebase it out so we can regenerate from all of them
const { stdout: commitHash } = await exec(
`git log --grep='${commitMessage}' --format=%H -1`,
);
const base = mergeBase.trim();
const { stdout: genLog } = await exec(
`git log --grep='${commitMessage}' --format=%H ${base}..HEAD`,
);
const genCommits = genLog.split('\n').filter(Boolean);
console.log(
`Reversing upcoming-release-notes deletions from ${genCommits.length} prior generation commit(s)`,
);
const tmpDir = process.env.RUNNER_TEMP || '/tmp';
for (const sha of genCommits) {
const patchPath = join(tmpDir, `revert-${sha}.patch`);
try {
await exec(
`git diff --diff-filter=D ${sha}~1..${sha} -- upcoming-release-notes > ${patchPath}`,
);
const { size } = await fs.stat(patchPath);
if (size > 0) {
await exec(`git apply -R --3way ${patchPath}`, { stdio: 'inherit' });
}
} finally {
await fs.unlink(patchPath).catch(() => undefined);
}
const hash = commitHash.trim();
if (hash) {
console.log(`Dropping previous release notes commit ${hash}`);
await exec(`git rebase --onto ${hash}~1 ${hash}`, {
stdio: 'inherit',
});
}
});
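
The drop-by-rebase step above can be exercised in isolation. A sketch against a throwaway repo, mirroring the script's exec style (repo path, file names, and commit messages are made up for the demo):

```typescript
import { execSync } from 'node:child_process';
import { mkdtempSync, writeFileSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

const repo = mkdtempSync(join(tmpdir(), 'relnotes-demo-'));
const run = (cmd: string) =>
  execSync(cmd, { cwd: repo, encoding: 'utf-8' }).trim();

run('git init -q');
run('git config user.email ci@example.com');
run('git config user.name ci');

const commit = (file: string, msg: string) => {
  writeFileSync(join(repo, file), msg + '\n');
  run('git add .');
  run(`git commit -qm '${msg}'`);
};
commit('base.txt', 'base');
commit('notes.txt', 'Update release notes');
commit('fix.txt', 'followup fix');

// Find the generation commit by message, then replay everything after it
// onto its parent, i.e. drop that one commit from the branch.
const hash = run("git log --grep='Update release notes' --format=%H -1");
run(`git rebase -q --onto ${hash}~1 ${hash}`);
```

After the rebase the branch is `followup fix` on top of `base`, and `notes.txt` is gone from the tree, so the next generation run sees every source note file again.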
@@ -126,14 +107,13 @@ if (files.length === 0) {
const highlights = '- TODO: Add release highlights';
const blogPath = join(
'packages/docs/blog',
`${releaseDate}-release-${slug}.md`,
);
const releasesPath = 'packages/docs/docs/releases.md';
await group('Generate blog post', async () => {
const template = `---
const blogPath = join(
'packages/docs/blog',
`${releaseDate}-release-${slug}.md`,
);
const blogContent = `---
title: Release ${version}
description: New release of Actual.
date: ${releaseDate}T10:00
@@ -149,60 +129,18 @@ ${highlights}
**Docker Tag: ${version}**
${AUTOGEN_MARKER}
${categorizedNotes}
`;
let blogContent;
try {
const existing = await fs.readFile(blogPath, 'utf-8');
const idx = existing.indexOf(AUTOGEN_MARKER);
if (idx === -1) {
console.log(
`WARNING: ${blogPath} missing ${AUTOGEN_MARKER}, rewriting from template`,
);
blogContent = template;
} else {
blogContent =
existing.slice(0, idx + AUTOGEN_MARKER.length) +
'\n' +
categorizedNotes +
'\n';
}
} catch (e) {
if (e.code !== 'ENOENT') throw e;
blogContent = template;
}
await fs.writeFile(blogPath, blogContent);
console.log(`Wrote ${blogPath}`);
});
await group('Update releases.md', async () => {
const releasesPath = 'packages/docs/docs/releases.md';
const existing = await fs.readFile(releasesPath, 'utf-8');
const sectionRe = new RegExp(
`(^|\\n)## ${escapeRegExp(version)}\\n[\\s\\S]*?(?=\\n## |$)`,
);
const match = existing.match(sectionRe);
let updated;
if (match) {
const section = match[0];
const idx = section.indexOf(AUTOGEN_MARKER);
if (idx === -1) {
console.log(
`WARNING: section for ${version} in ${releasesPath} missing ${AUTOGEN_MARKER}, leaving as-is`,
);
updated = existing;
} else {
const newSection =
section.slice(0, idx + AUTOGEN_MARKER.length) + '\n' + categorizedNotes;
updated = existing.replace(section, newSection);
}
} else {
const newSection = `## ${version}
const newSection = `## ${version}
Release date: ${releaseDate}
@@ -210,14 +148,12 @@ ${highlights}
**Docker Tag: ${version}**
${AUTOGEN_MARKER}
${categorizedNotes}`;
updated = existing.replace(
'# Release Notes\n',
`# Release Notes\n\n${newSection}\n`,
);
}
const updated = existing.replace(
'# Release Notes\n',
`# Release Notes\n\n${newSection}\n`,
);
await fs.writeFile(releasesPath, updated);
console.log(`Updated ${releasesPath}`);
@@ -229,28 +165,13 @@ await group('Remove used release notes', async () => {
);
});
await group('Format generated files', async () => {
await exec(`yarn exec oxfmt ${blogPath} ${releasesPath}`, {
stdio: 'inherit',
});
});
await group('Commit and push', async () => {
await exec(
'git add upcoming-release-notes packages/docs/blog packages/docs/docs/releases.md',
{ stdio: 'inherit' },
);
try {
await exec('git diff --cached --quiet');
console.log('No changes to commit');
return;
} catch {
// there are staged changes
}
await exec(`git commit -m '${commitMessage}'`);
await exec('git push origin', { stdio: 'inherit' });
await exec('git push --force-with-lease origin', { stdio: 'inherit' });
});
async function parseReleaseNotes(dir) {
@@ -284,10 +205,6 @@ async function parseReleaseNotes(dir) {
return { notesByCategory, files };
}
function escapeRegExp(str) {
return str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}
function formatNotes(notes) {
return Object.entries(notes)
.filter(([_, values]) => values.length > 0)

View File

@@ -9,7 +9,7 @@
},
"devDependencies": {
"@octokit/rest": "^22.0.1",
"@typescript/native-preview": "beta",
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"extensionless": "^2.0.6",
"gray-matter": "^4.0.3",
"listify": "^1.0.3",

View File

@@ -43,16 +43,13 @@ Configuration is resolved in this order (highest priority first):
### Environment Variables
| Variable | Description |
| ---------------------- | ----------------------------------------------------- |
| `ACTUAL_SERVER_URL` | URL of the Actual sync server (required) |
| `ACTUAL_PASSWORD` | Server password (required unless using token) |
| `ACTUAL_SESSION_TOKEN` | Session token (alternative to password) |
| `ACTUAL_SYNC_ID` | Budget Sync ID (required for most commands) |
| `ACTUAL_DATA_DIR` | Local directory for cached budget data |
| `ACTUAL_CACHE_TTL` | Cache TTL in seconds (default: 60) |
| `ACTUAL_LOCK_TIMEOUT` | Budget-dir lock wait timeout in seconds (default: 10) |
| `ACTUAL_NO_LOCK` | Set to `1` to disable budget-dir locking |
| Variable | Description |
| ---------------------- | --------------------------------------------- |
| `ACTUAL_SERVER_URL` | URL of the Actual sync server (required) |
| `ACTUAL_PASSWORD` | Server password (required unless using token) |
| `ACTUAL_SESSION_TOKEN` | Session token (alternative to password) |
| `ACTUAL_SYNC_ID` | Budget Sync ID (required for most commands) |
| `ACTUAL_DATA_DIR` | Local directory for cached budget data |
### Config File
@@ -62,10 +59,7 @@ Create an `.actualrc.json` (or `.actualrc`, `.actualrc.yaml`, `actual.config.js`
{
"serverUrl": "http://localhost:5006",
"password": "your-password",
"syncId": "1cfdbb80-6274-49bf-b0c2-737235a4c81f",
"cacheTtl": 60,
"lockTimeout": 10,
"noLock": false
"syncId": "1cfdbb80-6274-49bf-b0c2-737235a4c81f"
}
```
@@ -80,11 +74,6 @@ Create an `.actualrc.json` (or `.actualrc`, `.actualrc.yaml`, `actual.config.js`
| `--session-token <token>` | Session token |
| `--sync-id <id>` | Budget Sync ID |
| `--data-dir <path>` | Data directory |
| `--cache-ttl <seconds>` | Cache TTL; `0` disables caching (default: 60) |
| `--refresh` | Force a sync on this call, ignoring the cache |
| `--no-cache` | Alias for `--refresh` |
| `--lock-timeout <secs>` | Lock wait timeout (default: 10) |
| `--no-lock` | Disable budget-dir locking (use with care) |
| `--format <format>` | Output format: `json` (default), `table`, `csv` |
| `--verbose` | Show informational messages |
@@ -103,7 +92,6 @@ Create an `.actualrc.json` (or `.actualrc`, `.actualrc.yaml`, `actual.config.js`
| `schedules` | Manage scheduled transactions |
| `query` | Run an ActualQL query |
| `server` | Server utilities and lookups |
| `sync` | Refresh or inspect local cache |
Run `actual <command> --help` for subcommands and options.
@@ -147,32 +135,22 @@ All monetary amounts are **integer cents** when passed as input (flags, JSON):
- **Split transactions:** When summing or counting transactions, filter `"is_parent": false` to avoid double-counting. A split parent holds the total amount, and its children hold the individual parts — including both would count the total twice.
- **Rapid sequential requests:** The CLI caches the budget locally (see [Caching](#caching)), so read-heavy scripts no longer need a single-query workaround by default. For very chatty scripts, run `actual sync` once and then use a long `--cache-ttl` for reads:
- **Avoid rapid sequential requests:** Each CLI invocation opens a new server connection. Running queries in a tight loop (e.g. one per month) may trigger rate limiting or authentication failures. Instead, fetch all data in a single query with a date range filter and process locally:
```bash
actual sync
actual --cache-ttl 3600 query run ...
actual --cache-ttl 3600 accounts list
# Good: single query for the full year
actual query run --table transactions \
--filter '{"$and":[{"date":{"$gte":"2025-01-01"}},{"date":{"$lte":"2025-12-31"}}]}' \
--limit 5000
# Bad: one query per month in a loop (may fail with auth errors)
for month in 01 02 03 ...; do actual query run ...; done
```
- **Uncategorized transactions:** `category.name` is `null` for transactions without a category. Account for this when filtering or grouping by category.
- **No date sub-fields in AQL:** `date.month`, `date.year`, etc. are not supported as query fields. To group by month, fetch raw transactions with a date range filter and aggregate locally in a script.
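
Since AQL exposes no date sub-fields, monthly grouping has to happen client-side after a single date-range query. A minimal sketch (hypothetical helper; the transaction shape is reduced to what the aggregation needs, amounts in integer cents):

```typescript
type Txn = { date: string; amount: number }; // date is 'YYYY-MM-DD'

// Bucket by the 'YYYY-MM' prefix of the date and sum amounts per bucket.
function totalsByMonth(txns: Txn[]): Record<string, number> {
  const totals: Record<string, number> = {};
  for (const t of txns) {
    const month = t.date.slice(0, 7);
    totals[month] = (totals[month] ?? 0) + t.amount;
  }
  return totals;
}
```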
## Caching
The CLI keeps a local copy of your budget so repeated commands don't hit the sync server on every call. Within the TTL (default `60` seconds), read commands (`list`, `balance`, `query run`, …) reuse the cached budget without a network round-trip. Write commands (`add`, `update`, `set-amount`, …) always sync with the server before and after the write.
- `actual sync` — refresh the cache now.
- `actual sync --status` — show how stale the local cache is.
- `actual sync --clear` — delete the local cache; the next command re-downloads.
- `--refresh` (or `--no-cache`) — force a sync on a single call.
- `--cache-ttl <seconds>` — override the TTL for a single call (use `0` to disable caching).
### Concurrency
The CLI takes a shared lock for reads and an exclusive lock for writes on the per-budget cache directory. Many parallel reads are safe; writes serialize. If another CLI process is holding the lock, subsequent invocations wait up to `--lock-timeout` seconds (default `10`) before failing with an error. Pass `--no-lock` to opt out in trusted single-process setups.
## Running Locally (Development)
If you're working on the CLI within the monorepo:

View File

@@ -1,13 +1,8 @@
{
"name": "@actual-app/cli",
"version": "26.5.2",
"version": "26.4.0",
"description": "CLI for Actual Budget",
"license": "MIT",
"repository": {
"type": "git",
"url": "git+https://github.com/actualbudget/actual.git",
"directory": "packages/cli"
},
"bin": {
"actual": "./dist/cli.js",
"actual-cli": "./dist/cli.js"
@@ -17,12 +12,10 @@
],
"type": "module",
"imports": {
"#cache": "./src/cache.ts",
"#commands/*": "./src/commands/*.ts",
"#config": "./src/config.ts",
"#connection": "./src/connection.ts",
"#input": "./src/input.ts",
"#lock": "./src/lock.ts",
"#output": "./src/output.ts",
"#utils": "./src/utils.ts"
},
@@ -35,13 +28,11 @@
"@actual-app/api": "workspace:*",
"cli-table3": "^0.6.5",
"commander": "^14.0.3",
"cosmiconfig": "^9.0.1",
"proper-lockfile": "^4.1.2"
"cosmiconfig": "^9.0.1"
},
"devDependencies": {
"@types/node": "^22.19.17",
"@types/proper-lockfile": "^4",
"@typescript/native-preview": "beta",
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"rollup-plugin-visualizer": "^7.0.1",
"vite": "^8.0.5",
"vitest": "^4.1.2"

View File

@@ -1,206 +0,0 @@
import {
existsSync,
mkdtempSync,
readFileSync,
rmSync,
writeFileSync,
} from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import {
CACHE_FILE_NAME,
decideSyncAction,
readCacheState,
writeCacheState,
} from './cache';
describe('readCacheState', () => {
let dir: string;
beforeEach(() => {
dir = mkdtempSync(join(tmpdir(), 'actual-cli-cache-'));
});
afterEach(() => {
rmSync(dir, { recursive: true, force: true });
});
it('returns null when the file does not exist', () => {
expect(readCacheState(dir)).toBeNull();
});
it('returns null when the file is corrupt', () => {
writeFileSync(join(dir, CACHE_FILE_NAME), 'not json');
expect(readCacheState(dir)).toBeNull();
});
it('returns null when the file has the wrong version', () => {
writeFileSync(
join(dir, CACHE_FILE_NAME),
JSON.stringify({
version: 999,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1,
lastDownloadedAt: 1,
}),
);
expect(readCacheState(dir)).toBeNull();
});
it('returns the parsed state when the file is valid', () => {
writeFileSync(
join(dir, CACHE_FILE_NAME),
JSON.stringify({
version: 1,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1234,
lastDownloadedAt: 5678,
}),
);
expect(readCacheState(dir)).toEqual({
version: 1,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1234,
lastDownloadedAt: 5678,
});
});
});
describe('writeCacheState', () => {
let dir: string;
beforeEach(() => {
dir = mkdtempSync(join(tmpdir(), 'actual-cli-cache-'));
});
afterEach(() => {
rmSync(dir, { recursive: true, force: true });
});
it('writes the state to the cache file', () => {
writeCacheState(dir, {
version: 1,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1,
lastDownloadedAt: 1,
});
const raw = readFileSync(join(dir, CACHE_FILE_NAME), 'utf-8');
expect(JSON.parse(raw).syncId).toBe('a');
});
it('is atomic: removes the tmp file after rename', () => {
writeCacheState(dir, {
version: 1,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1,
lastDownloadedAt: 1,
});
expect(existsSync(join(dir, `${CACHE_FILE_NAME}.tmp`))).toBe(false);
});
it('does not throw when the filesystem refuses the write', () => {
// Force ENOTDIR by pointing writeCacheState at a path whose parent is a
// regular file — no OS-specific pseudo-filesystem semantics needed.
const file = join(dir, 'not-a-dir');
writeFileSync(file, '');
expect(() =>
writeCacheState(join(file, 'nested'), {
version: 1,
syncId: 'a',
budgetId: 'b',
serverUrl: 'c',
lastSyncedAt: 1,
lastDownloadedAt: 1,
}),
).not.toThrow();
});
});
describe('decideSyncAction', () => {
const base = {
state: {
version: 1 as const,
syncId: 'sync-1',
budgetId: 'bud-1',
serverUrl: 'http://s',
lastSyncedAt: 1_000_000,
lastDownloadedAt: 1_000_000,
},
config: { syncId: 'sync-1', serverUrl: 'http://s' },
now: 1_000_000,
ttlMs: 60_000,
mutates: false,
refresh: false,
encrypted: false,
};
it('returns "download" when state is null', () => {
expect(decideSyncAction({ ...base, state: null }).action).toBe('download');
});
it('returns "download" when syncId changed', () => {
expect(
decideSyncAction({
...base,
config: { ...base.config, syncId: 'other' },
}).action,
).toBe('download');
});
it('returns "download" when serverUrl changed', () => {
expect(
decideSyncAction({
...base,
config: { ...base.config, serverUrl: 'http://other' },
}).action,
).toBe('download');
});
it('returns "skip" for a read within the TTL', () => {
expect(decideSyncAction({ ...base, now: 1_000_000 + 30_000 }).action).toBe(
'skip',
);
});
it('returns "sync" for a read past the TTL', () => {
expect(decideSyncAction({ ...base, now: 1_000_000 + 61_000 }).action).toBe(
'sync',
);
});
it('returns "sync" for a write even when fresh', () => {
expect(decideSyncAction({ ...base, mutates: true }).action).toBe('sync');
});
it('returns "sync" when refresh is true', () => {
expect(decideSyncAction({ ...base, refresh: true }).action).toBe('sync');
});
it('returns "sync" when ttlMs is 0', () => {
expect(decideSyncAction({ ...base, ttlMs: 0 }).action).toBe('sync');
});
it('returns "sync" for encrypted budgets within the TTL', () => {
expect(decideSyncAction({ ...base, encrypted: true }).action).toBe('sync');
});
it('treats clock skew (negative age) as stale', () => {
expect(decideSyncAction({ ...base, now: 999_999 }).action).toBe('sync');
});
it('carries cached state on non-download actions', () => {
const decision = decideSyncAction({ ...base, mutates: true });
expect(decision).toEqual({ action: 'sync', state: base.state });
});
});

View File

@@ -1,107 +0,0 @@
import { randomBytes } from 'node:crypto';
import { mkdirSync, readFileSync, renameSync, writeFileSync } from 'node:fs';
import { join } from 'node:path';
import { isRecord } from './utils';
export const CACHE_FILE_NAME = 'state.json';
export const CACHE_VERSION = 1;
export const META_ROOT_DIR = '.actual-cli';
export type CacheState = {
version: typeof CACHE_VERSION;
syncId: string;
budgetId: string;
serverUrl: string;
lastSyncedAt: number;
lastDownloadedAt: number;
};
export function getMetaDir(dataDir: string, syncId: string): string {
return join(dataDir, META_ROOT_DIR, syncId);
}
function cachePath(metaDir: string): string {
return join(metaDir, CACHE_FILE_NAME);
}
function isCacheState(value: unknown): value is CacheState {
if (!isRecord(value)) return false;
return (
value.version === CACHE_VERSION &&
typeof value.syncId === 'string' &&
typeof value.budgetId === 'string' &&
typeof value.serverUrl === 'string' &&
typeof value.lastSyncedAt === 'number' &&
typeof value.lastDownloadedAt === 'number'
);
}
export function readCacheState(metaDir: string): CacheState | null {
let raw: string;
try {
raw = readFileSync(cachePath(metaDir), 'utf-8');
} catch {
return null;
}
let parsed: unknown;
try {
parsed = JSON.parse(raw);
} catch {
return null;
}
return isCacheState(parsed) ? parsed : null;
}
export function writeCacheState(metaDir: string, state: CacheState): void {
try {
mkdirSync(metaDir, { recursive: true });
const target = cachePath(metaDir);
// Unique tmp name per writer: concurrent shared-lock commands (encrypted
// budgets, --refresh, stale TTL) can both publish, and a shared tmp path
// lets the second writer's truncate destroy the first writer's bytes
// before either renames into place.
const tmp = `${target}.${process.pid}-${randomBytes(4).toString('hex')}.tmp`;
writeFileSync(tmp, JSON.stringify(state));
renameSync(tmp, target);
} catch {
// Cache persistence is best-effort. A read-only or unreachable dir must
// not crash the CLI; the next invocation simply won't find a cache.
}
}
export type SyncDecision =
| { action: 'download' }
| { action: 'skip'; state: CacheState }
| { action: 'sync'; state: CacheState };
export type DecideSyncArgs = {
state: CacheState | null;
config: { syncId: string; serverUrl: string };
now: number;
ttlMs: number;
mutates: boolean;
refresh: boolean;
encrypted: boolean;
};
export function decideSyncAction({
state,
config,
now,
ttlMs,
mutates,
refresh,
encrypted,
}: DecideSyncArgs): SyncDecision {
if (state === null) return { action: 'download' };
if (state.syncId !== config.syncId) return { action: 'download' };
if (state.serverUrl !== config.serverUrl) return { action: 'download' };
if (mutates || refresh || ttlMs === 0 || encrypted) {
return { action: 'sync', state };
}
const age = now - state.lastSyncedAt;
if (age < 0) return { action: 'sync', state };
if (age < ttlMs) return { action: 'skip', state };
return { action: 'sync', state };
}

View File

@@ -14,30 +14,26 @@ export function registerAccountsCommand(program: Command) {
.option('--include-closed', 'Include closed accounts', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const allAccounts = await api.getAccounts();
const accounts = allAccounts.filter(
a => cmdOpts.includeClosed || !a.closed,
);
// Stable sort: on-budget first, off-budget second
// (preserves API sort_order within each group)
accounts.sort((a, b) => Number(a.offbudget) - Number(b.offbudget));
const balances = await Promise.all(
accounts.map(a => api.getAccountBalance(a.id)),
);
const output = accounts.map((a, i) => ({
id: a.id,
name: a.name,
offbudget: a.offbudget,
closed: a.closed,
balance: balances[i],
}));
printOutput(output, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const allAccounts = await api.getAccounts();
const accounts = allAccounts.filter(
a => cmdOpts.includeClosed || !a.closed,
);
// Stable sort: on-budget first, off-budget second
// (preserves API sort_order within each group)
accounts.sort((a, b) => Number(a.offbudget) - Number(b.offbudget));
const balances = await Promise.all(
accounts.map(a => api.getAccountBalance(a.id)),
);
const output = accounts.map((a, i) => ({
id: a.id,
name: a.name,
offbudget: a.offbudget,
closed: a.closed,
balance: balances[i],
}));
printOutput(output, opts.format);
});
});
accounts
@@ -53,17 +49,13 @@ export function registerAccountsCommand(program: Command) {
.action(async cmdOpts => {
const balance = parseIntFlag(cmdOpts.balance, '--balance');
const opts = program.opts();
await withConnection(
opts,
async () => {
const id = await api.createAccount(
{ name: cmdOpts.name, offbudget: cmdOpts.offbudget },
balance,
);
printOutput({ id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const id = await api.createAccount(
{ name: cmdOpts.name, offbudget: cmdOpts.offbudget },
balance,
);
printOutput({ id }, opts.format);
});
});
accounts
@@ -89,14 +81,10 @@ export function registerAccountsCommand(program: Command) {
'No update fields provided. Use --name or --offbudget.',
);
}
await withConnection(
opts,
async () => {
await api.updateAccount(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.updateAccount(id, fields);
printOutput({ success: true, id }, opts.format);
});
});
accounts
@@ -112,18 +100,14 @@ export function registerAccountsCommand(program: Command) {
)
.action(async (id: string, cmdOpts) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.closeAccount(
id,
cmdOpts.transferAccount,
cmdOpts.transferCategory,
);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.closeAccount(
id,
cmdOpts.transferAccount,
cmdOpts.transferCategory,
);
printOutput({ success: true, id }, opts.format);
});
});
accounts
@@ -131,14 +115,10 @@ export function registerAccountsCommand(program: Command) {
.description('Reopen a closed account')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.reopenAccount(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.reopenAccount(id);
printOutput({ success: true, id }, opts.format);
});
});
accounts
@@ -146,14 +126,10 @@ export function registerAccountsCommand(program: Command) {
.description('Delete an account')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.deleteAccount(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.deleteAccount(id);
printOutput({ success: true, id }, opts.format);
});
});
accounts
@@ -172,13 +148,9 @@ export function registerAccountsCommand(program: Command) {
cutoff = cutoffDate;
}
const opts = program.opts();
await withConnection(
opts,
async () => {
const balance = await api.getAccountBalance(id, cutoff);
printOutput({ id, balance }, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const balance = await api.getAccountBalance(id, cutoff);
printOutput({ id, balance }, opts.format);
});
});
}
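
The one-line comparator in `accounts list` above leans on `Array.prototype.sort` being stable (guaranteed since ES2019): sorting by `Number(offbudget)` moves off-budget accounts after on-budget ones while keeping the server-provided order within each group. A sketch with made-up account names:

```typescript
type Acct = { name: string; offbudget: boolean };

const accounts: Acct[] = [
  { name: 'Savings', offbudget: true },
  { name: 'Checking', offbudget: false },
  { name: 'House', offbudget: true },
  { name: 'Cash', offbudget: false },
];

// false (0) sorts before true (1); equal keys keep their original order.
accounts.sort((a, b) => Number(a.offbudget) - Number(b.offbudget));
// on-budget first: Checking, Cash, then off-budget: Savings, House
```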

View File

@@ -1,6 +1,7 @@
import * as api from '@actual-app/api';
import type { Command } from 'commander';
import { resolveConfig } from '#config';
import { withConnection } from '#connection';
import { printOutput } from '#output';
import { parseBoolFlag, parseIntFlag } from '#utils';
@@ -19,7 +20,7 @@ export function registerBudgetsCommand(program: Command) {
const result = await api.getBudgets();
printOutput(result, opts.format);
},
{ mutates: false, skipBudget: true },
{ loadBudget: false },
);
});
@@ -29,33 +30,40 @@ export function registerBudgetsCommand(program: Command) {
.option('--encryption-password <password>', 'Encryption password')
.action(async (syncId: string, cmdOpts) => {
const opts = program.opts();
const config = await resolveConfig(opts);
const password = config.encryptionPassword ?? cmdOpts.encryptionPassword;
await withConnection(
opts,
async config => {
const password =
cmdOpts.encryptionPassword ?? config.encryptionPassword;
async () => {
await api.downloadBudget(syncId, {
password,
});
printOutput({ success: true, syncId }, opts.format);
},
{ mutates: false, skipBudget: true },
{ loadBudget: false },
);
});
budgets
.command('sync')
.description('Sync the current budget')
.action(async () => {
const opts = program.opts();
await withConnection(opts, async () => {
await api.sync();
printOutput({ success: true }, opts.format);
});
});
budgets
.command('months')
.description('List available budget months')
.action(async () => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getBudgetMonths();
printOutput(result, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const result = await api.getBudgetMonths();
printOutput(result, opts.format);
});
});
budgets
@@ -63,14 +71,10 @@ export function registerBudgetsCommand(program: Command) {
.description('Get budget data for a specific month (YYYY-MM)')
.action(async (month: string) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getBudgetMonth(month);
printOutput(result, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const result = await api.getBudgetMonth(month);
printOutput(result, opts.format);
});
});
budgets
@@ -85,14 +89,10 @@ export function registerBudgetsCommand(program: Command) {
.action(async cmdOpts => {
const amount = parseIntFlag(cmdOpts.amount, '--amount');
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.setBudgetAmount(cmdOpts.month, cmdOpts.category, amount);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.setBudgetAmount(cmdOpts.month, cmdOpts.category, amount);
printOutput({ success: true }, opts.format);
});
});
budgets
@@ -104,14 +104,10 @@ export function registerBudgetsCommand(program: Command) {
.action(async cmdOpts => {
const flag = parseBoolFlag(cmdOpts.flag, '--flag');
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.setBudgetCarryover(cmdOpts.month, cmdOpts.category, flag);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.setBudgetCarryover(cmdOpts.month, cmdOpts.category, flag);
printOutput({ success: true }, opts.format);
});
});
budgets
@@ -125,14 +121,10 @@ export function registerBudgetsCommand(program: Command) {
.action(async cmdOpts => {
const parsedAmount = parseIntFlag(cmdOpts.amount, '--amount');
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.holdBudgetForNextMonth(cmdOpts.month, parsedAmount);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.holdBudgetForNextMonth(cmdOpts.month, parsedAmount);
printOutput({ success: true }, opts.format);
});
});
budgets
@@ -141,13 +133,9 @@ export function registerBudgetsCommand(program: Command) {
.requiredOption('--month <month>', 'Budget month (YYYY-MM)')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.resetBudgetHold(cmdOpts.month);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.resetBudgetHold(cmdOpts.month);
printOutput({ success: true }, opts.format);
});
});
}

View File

@@ -15,14 +15,10 @@ export function registerCategoriesCommand(program: Command) {
.description('List all categories')
.action(async () => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getCategories();
printOutput(result, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const result = await api.getCategories();
printOutput(result, opts.format);
});
});
categories
@@ -33,19 +29,15 @@ export function registerCategoriesCommand(program: Command) {
.option('--is-income', 'Mark as income category', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const id = await api.createCategory({
name: cmdOpts.name,
group_id: cmdOpts.groupId,
is_income: cmdOpts.isIncome,
hidden: false,
});
printOutput({ id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const id = await api.createCategory({
name: cmdOpts.name,
group_id: cmdOpts.groupId,
is_income: cmdOpts.isIncome,
hidden: false,
});
printOutput({ id }, opts.format);
});
});
categories
@@ -63,14 +55,10 @@ export function registerCategoriesCommand(program: Command) {
throw new Error('No update fields provided. Use --name or --hidden.');
}
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.updateCategory(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.updateCategory(id, fields);
printOutput({ success: true, id }, opts.format);
});
});
categories
@@ -79,13 +67,9 @@ export function registerCategoriesCommand(program: Command) {
.option('--transfer-to <id>', 'Transfer transactions to this category')
.action(async (id: string, cmdOpts) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.deleteCategory(id, cmdOpts.transferTo);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.deleteCategory(id, cmdOpts.transferTo);
printOutput({ success: true, id }, opts.format);
});
});
}


@@ -15,14 +15,10 @@ export function registerCategoryGroupsCommand(program: Command) {
.description('List all category groups')
.action(async () => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getCategoryGroups();
printOutput(result, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const result = await api.getCategoryGroups();
printOutput(result, opts.format);
});
});
groups
@@ -32,18 +28,14 @@ export function registerCategoryGroupsCommand(program: Command) {
.option('--is-income', 'Mark as income group', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const id = await api.createCategoryGroup({
name: cmdOpts.name,
is_income: cmdOpts.isIncome,
hidden: false,
});
printOutput({ id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const id = await api.createCategoryGroup({
name: cmdOpts.name,
is_income: cmdOpts.isIncome,
hidden: false,
});
printOutput({ id }, opts.format);
});
});
groups
@@ -61,14 +53,10 @@ export function registerCategoryGroupsCommand(program: Command) {
throw new Error('No update fields provided. Use --name or --hidden.');
}
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.updateCategoryGroup(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.updateCategoryGroup(id, fields);
printOutput({ success: true, id }, opts.format);
});
});
groups
@@ -77,13 +65,9 @@ export function registerCategoryGroupsCommand(program: Command) {
.option('--transfer-to <id>', 'Transfer transactions to this category ID')
.action(async (id: string, cmdOpts) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.deleteCategoryGroup(id, cmdOpts.transferTo);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.deleteCategoryGroup(id, cmdOpts.transferTo);
printOutput({ success: true, id }, opts.format);
});
});
}


@@ -12,14 +12,10 @@ export function registerPayeesCommand(program: Command) {
.description('List all payees')
.action(async () => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getPayees();
printOutput(result, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const result = await api.getPayees();
printOutput(result, opts.format);
});
});
payees
@@ -27,14 +23,10 @@ export function registerPayeesCommand(program: Command) {
.description('List frequently used payees')
.action(async () => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getCommonPayees();
printOutput(result, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const result = await api.getCommonPayees();
printOutput(result, opts.format);
});
});
payees
@@ -43,14 +35,10 @@ export function registerPayeesCommand(program: Command) {
.requiredOption('--name <name>', 'Payee name')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const id = await api.createPayee({ name: cmdOpts.name });
printOutput({ id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const id = await api.createPayee({ name: cmdOpts.name });
printOutput({ id }, opts.format);
});
});
payees
@@ -66,14 +54,10 @@ export function registerPayeesCommand(program: Command) {
);
}
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.updatePayee(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.updatePayee(id, fields);
printOutput({ success: true, id }, opts.format);
});
});
payees
@@ -81,14 +65,10 @@ export function registerPayeesCommand(program: Command) {
.description('Delete a payee')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.deletePayee(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.deletePayee(id);
printOutput({ success: true, id }, opts.format);
});
});
payees
@@ -107,13 +87,9 @@ export function registerPayeesCommand(program: Command) {
);
}
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.mergePayees(cmdOpts.target, mergeIds);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.mergePayees(cmdOpts.target, mergeIds);
printOutput({ success: true }, opts.format);
});
});
}


@@ -301,31 +301,27 @@ export function registerQueryCommand(program: Command) {
.addHelpText('after', RUN_EXAMPLES)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const parsed = cmdOpts.file ? readJsonInput(cmdOpts) : undefined;
if (parsed !== undefined && !isRecord(parsed)) {
throw new Error('Query file must contain a JSON object');
}
const queryObj = parsed
? buildQueryFromFile(parsed, cmdOpts.table)
: buildQueryFromFlags(cmdOpts);
await withConnection(opts, async () => {
const parsed = cmdOpts.file ? readJsonInput(cmdOpts) : undefined;
if (parsed !== undefined && !isRecord(parsed)) {
throw new Error('Query file must contain a JSON object');
}
const queryObj = parsed
? buildQueryFromFile(parsed, cmdOpts.table)
: buildQueryFromFlags(cmdOpts);
const result = await api.aqlQuery(queryObj);
const result = await api.aqlQuery(queryObj);
if (!isRecord(result) || !('data' in result)) {
throw new Error('Query result missing data');
}
if (!isRecord(result) || !('data' in result)) {
throw new Error('Query result missing data');
}
if (cmdOpts.count) {
printOutput({ count: result.data }, opts.format);
} else {
printOutput(result.data, opts.format);
}
},
{ mutates: false },
);
if (cmdOpts.count) {
printOutput({ count: result.data }, opts.format);
} else {
printOutput(result.data, opts.format);
}
});
});
query


@@ -15,14 +15,10 @@ export function registerRulesCommand(program: Command) {
.description('List all rules')
.action(async () => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getRules();
printOutput(result, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const result = await api.getRules();
printOutput(result, opts.format);
});
});
rules
@@ -30,14 +26,10 @@ export function registerRulesCommand(program: Command) {
.description('List rules for a specific payee')
.action(async (payeeId: string) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getPayeeRules(payeeId);
printOutput(result, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const result = await api.getPayeeRules(payeeId);
printOutput(result, opts.format);
});
});
rules
@@ -47,17 +39,13 @@ export function registerRulesCommand(program: Command) {
.option('--file <path>', 'Read rule from JSON file (use - for stdin)')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const rule = readJsonInput(cmdOpts) as Parameters<
typeof api.createRule
>[0];
const id = await api.createRule(rule);
printOutput({ id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const rule = readJsonInput(cmdOpts) as Parameters<
typeof api.createRule
>[0];
const id = await api.createRule(rule);
printOutput({ id }, opts.format);
});
});
rules
@@ -67,17 +55,13 @@ export function registerRulesCommand(program: Command) {
.option('--file <path>', 'Read rule from JSON file (use - for stdin)')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const rule = readJsonInput(cmdOpts) as Parameters<
typeof api.updateRule
>[0];
await api.updateRule(rule);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const rule = readJsonInput(cmdOpts) as Parameters<
typeof api.updateRule
>[0];
await api.updateRule(rule);
printOutput({ success: true }, opts.format);
});
});
rules
@@ -85,13 +69,9 @@ export function registerRulesCommand(program: Command) {
.description('Delete a rule')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.deleteRule(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.deleteRule(id);
printOutput({ success: true, id }, opts.format);
});
});
}


@@ -15,14 +15,10 @@ export function registerSchedulesCommand(program: Command) {
.description('List all schedules')
.action(async () => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getSchedules();
printOutput(result, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const result = await api.getSchedules();
printOutput(result, opts.format);
});
});
schedules
@@ -32,17 +28,13 @@ export function registerSchedulesCommand(program: Command) {
.option('--file <path>', 'Read schedule from JSON file (use - for stdin)')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const schedule = readJsonInput(cmdOpts) as Parameters<
typeof api.createSchedule
>[0];
const id = await api.createSchedule(schedule);
printOutput({ id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const schedule = readJsonInput(cmdOpts) as Parameters<
typeof api.createSchedule
>[0];
const id = await api.createSchedule(schedule);
printOutput({ id }, opts.format);
});
});
schedules
@@ -53,17 +45,13 @@ export function registerSchedulesCommand(program: Command) {
.option('--reset-next-date', 'Reset next occurrence date', false)
.action(async (id: string, cmdOpts) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const fields = readJsonInput(cmdOpts) as Parameters<
typeof api.updateSchedule
>[1];
await api.updateSchedule(id, fields, cmdOpts.resetNextDate);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const fields = readJsonInput(cmdOpts) as Parameters<
typeof api.updateSchedule
>[1];
await api.updateSchedule(id, fields, cmdOpts.resetNextDate);
printOutput({ success: true, id }, opts.format);
});
});
schedules
@@ -71,13 +59,9 @@ export function registerSchedulesCommand(program: Command) {
.description('Delete a schedule')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.deleteSchedule(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.deleteSchedule(id);
printOutput({ success: true, id }, opts.format);
});
});
}


@@ -19,7 +19,7 @@ export function registerServerCommand(program: Command) {
const version = await api.getServerVersion();
printOutput({ version }, opts.format);
},
{ mutates: false, skipBudget: true },
{ loadBudget: false },
);
});
@@ -34,17 +34,13 @@ export function registerServerCommand(program: Command) {
.requiredOption('--name <name>', 'Entity name')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const id = await api.getIDByName(cmdOpts.type, cmdOpts.name);
printOutput(
{ id, type: cmdOpts.type, name: cmdOpts.name },
opts.format,
);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const id = await api.getIDByName(cmdOpts.type, cmdOpts.name);
printOutput(
{ id, type: cmdOpts.type, name: cmdOpts.name },
opts.format,
);
});
});
server
@@ -53,16 +49,12 @@ export function registerServerCommand(program: Command) {
.option('--account <id>', 'Specific account ID to sync')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const args = cmdOpts.account
? { accountId: cmdOpts.account }
: undefined;
await api.runBankSync(args);
printOutput({ success: true }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const args = cmdOpts.account
? { accountId: cmdOpts.account }
: undefined;
await api.runBankSync(args);
printOutput({ success: true }, opts.format);
});
});
}


@@ -1,124 +0,0 @@
import { existsSync, mkdtempSync, rmSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import { Command } from 'commander';
import { CACHE_FILE_NAME, getMetaDir, writeCacheState } from '#cache';
import { resolveConfig } from '#config';
import { registerSyncCommand } from './sync';
vi.mock('@actual-app/api', () => ({
init: vi.fn().mockResolvedValue(undefined),
downloadBudget: vi.fn().mockResolvedValue(undefined),
loadBudget: vi.fn().mockResolvedValue(undefined),
sync: vi.fn().mockResolvedValue(undefined),
shutdown: vi.fn().mockResolvedValue(undefined),
getBudgets: vi
.fn()
.mockResolvedValue([{ id: 'bud-disk-1', groupId: 'sync-1' }]),
}));
vi.mock('#config', () => ({
resolveConfig: vi.fn(),
}));
let dataDir: string;
function metaDirFor(syncId: string) {
return getMetaDir(dataDir, syncId);
}
function program() {
const p = new Command();
p.exitOverride();
p.option('--sync-id <id>');
p.option('--data-dir <path>');
p.option('--format <fmt>');
p.option('--verbose');
registerSyncCommand(p);
return p;
}
describe('actual sync', () => {
let stdoutSpy: ReturnType<typeof vi.spyOn>;
beforeEach(() => {
vi.clearAllMocks();
dataDir = mkdtempSync(join(tmpdir(), 'actual-cli-sync-'));
vi.mocked(resolveConfig).mockResolvedValue({
serverUrl: 'http://test',
password: 'pw',
dataDir,
syncId: 'sync-1',
cacheTtl: 60,
lockTimeout: 10,
refresh: false,
noLock: true,
});
stdoutSpy = vi
.spyOn(process.stdout, 'write')
.mockImplementation(() => true);
});
afterEach(() => {
stdoutSpy.mockRestore();
rmSync(dataDir, { recursive: true, force: true });
});
it('runs a sync and prints the syncId', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: 0,
lastDownloadedAt: 0,
});
await program().parseAsync(['node', 'actual', 'sync']);
const out = stdoutSpy.mock.calls
.map((c: unknown[]) => String(c[0]))
.join('');
expect(out).toMatch(/"syncId":\s*"sync-1"/);
});
it('--status prints cache info without syncing', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now() - 5000,
lastDownloadedAt: Date.now() - 5000,
});
await program().parseAsync(['node', 'actual', 'sync', '--status']);
const out = stdoutSpy.mock.calls
.map((c: unknown[]) => String(c[0]))
.join('');
expect(out).toMatch(/"stale":\s*(true|false)/);
expect(out).toMatch(/"ageSeconds":\s*\d+/);
});
it('--status on no prior sync reports "never synced" and exits 0', async () => {
await program().parseAsync(['node', 'actual', 'sync', '--status']);
const out = stdoutSpy.mock.calls
.map((c: unknown[]) => String(c[0]))
.join('');
expect(out).toMatch(/"neverSynced":\s*true/);
});
it('--clear removes the cache file', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
expect(existsSync(join(metaDirFor('sync-1'), CACHE_FILE_NAME))).toBe(true);
await program().parseAsync(['node', 'actual', 'sync', '--clear']);
expect(existsSync(join(metaDirFor('sync-1'), CACHE_FILE_NAME))).toBe(false);
});
});


@@ -1,118 +0,0 @@
import { rmSync } from 'node:fs';
import { join } from 'node:path';
import type { Command } from 'commander';
import { CACHE_FILE_NAME, getMetaDir, readCacheState } from '#cache';
import type { CliConfig } from '#config';
import { resolveConfig } from '#config';
import { withConnection } from '#connection';
import { acquireExclusive } from '#lock';
import { printOutput } from '#output';
type SyncCmdOpts = {
status?: boolean;
clear?: boolean;
};
async function requireSyncIdAndMeta(
opts: Record<string, unknown>,
flag: string,
): Promise<{ config: CliConfig; meta: string }> {
const config = await resolveConfig(opts);
if (!config.syncId) {
throw new Error(
`Sync ID is required for sync ${flag}. Set --sync-id or ACTUAL_SYNC_ID.`,
);
}
return { config, meta: getMetaDir(config.dataDir, config.syncId) };
}
export function registerSyncCommand(program: Command) {
program
.command('sync')
.description(
'Sync the local cached budget with the server, print cache status, or clear the cache',
)
.option('--status', 'Print cache status without syncing', false)
.option(
'--clear',
'Delete the local cache; next command re-downloads',
false,
)
.action(async (cmdOpts: SyncCmdOpts) => {
const opts = program.opts();
if (cmdOpts.status) {
const { config, meta } = await requireSyncIdAndMeta(opts, '--status');
const state = readCacheState(meta);
if (state === null) {
printOutput(
{
neverSynced: true,
syncId: config.syncId,
ttlSeconds: config.cacheTtl,
},
opts.format,
);
return;
}
const rawAgeSeconds = Math.round(
(Date.now() - state.lastSyncedAt) / 1000,
);
const ageSeconds = Math.max(0, rawAgeSeconds);
printOutput(
{
neverSynced: false,
syncId: state.syncId,
budgetId: state.budgetId,
syncedAt: new Date(state.lastSyncedAt).toISOString(),
lastDownloadedAt: new Date(state.lastDownloadedAt).toISOString(),
ageSeconds,
ttlSeconds: config.cacheTtl,
stale: rawAgeSeconds < 0 || rawAgeSeconds > config.cacheTtl,
},
opts.format,
);
return;
}
if (cmdOpts.clear) {
const { config, meta } = await requireSyncIdAndMeta(opts, '--clear');
// Serialize with concurrent writers so we don't rm a half-written
// state.json that's about to be renamed into place.
const release = config.noLock
? null
: await acquireExclusive(meta, {
timeoutMs: config.lockTimeout * 1000,
});
try {
rmSync(join(meta, CACHE_FILE_NAME), { force: true });
} finally {
await release?.();
}
printOutput({ cleared: true, syncId: config.syncId }, opts.format);
return;
}
await withConnection(
opts,
async config => {
const state = config.syncId
? readCacheState(getMetaDir(config.dataDir, config.syncId))
: null;
printOutput(
{
syncedAt: new Date(
state?.lastSyncedAt ?? Date.now(),
).toISOString(),
syncId: config.syncId,
budgetId: state?.budgetId ?? config.syncId,
},
opts.format,
);
},
{ mutates: true },
);
});
}


@@ -12,14 +12,10 @@ export function registerTagsCommand(program: Command) {
.description('List all tags')
.action(async () => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getTags();
printOutput(result, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const result = await api.getTags();
printOutput(result, opts.format);
});
});
tags
@@ -30,18 +26,14 @@ export function registerTagsCommand(program: Command) {
.option('--description <description>', 'Tag description')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const id = await api.createTag({
tag: cmdOpts.tag,
color: cmdOpts.color,
description: cmdOpts.description,
});
printOutput({ id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const id = await api.createTag({
tag: cmdOpts.tag,
color: cmdOpts.color,
description: cmdOpts.description,
});
printOutput({ id }, opts.format);
});
});
tags
@@ -63,14 +55,10 @@ export function registerTagsCommand(program: Command) {
);
}
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.updateTag(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.updateTag(id, fields);
printOutput({ success: true, id }, opts.format);
});
});
tags
@@ -78,13 +66,9 @@ export function registerTagsCommand(program: Command) {
.description('Delete a tag')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.deleteTag(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.deleteTag(id);
printOutput({ success: true, id }, opts.format);
});
});
}


@@ -18,18 +18,14 @@ export function registerTransactionsCommand(program: Command) {
.requiredOption('--end <date>', 'End date (YYYY-MM-DD)')
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const result = await api.getTransactions(
cmdOpts.account,
cmdOpts.start,
cmdOpts.end,
);
printOutput(result, opts.format);
},
{ mutates: false },
);
await withConnection(opts, async () => {
const result = await api.getTransactions(
cmdOpts.account,
cmdOpts.start,
cmdOpts.end,
);
printOutput(result, opts.format);
});
});
transactions
@@ -45,24 +41,20 @@ export function registerTransactionsCommand(program: Command) {
.option('--run-transfers', 'Process transfers', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const transactions = readJsonInput(cmdOpts) as Parameters<
typeof api.addTransactions
>[1];
const result = await api.addTransactions(
cmdOpts.account,
transactions,
{
learnCategories: cmdOpts.learnCategories,
runTransfers: cmdOpts.runTransfers,
},
);
printOutput(result, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const transactions = readJsonInput(cmdOpts) as Parameters<
typeof api.addTransactions
>[1];
const result = await api.addTransactions(
cmdOpts.account,
transactions,
{
learnCategories: cmdOpts.learnCategories,
runTransfers: cmdOpts.runTransfers,
},
);
printOutput(result, opts.format);
});
});
transactions
@@ -77,24 +69,20 @@ export function registerTransactionsCommand(program: Command) {
.option('--dry-run', 'Preview without importing', false)
.action(async cmdOpts => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const transactions = readJsonInput(cmdOpts) as Parameters<
typeof api.importTransactions
>[1];
const result = await api.importTransactions(
cmdOpts.account,
transactions,
{
defaultCleared: true,
dryRun: cmdOpts.dryRun,
},
);
printOutput(result, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const transactions = readJsonInput(cmdOpts) as Parameters<
typeof api.importTransactions
>[1];
const result = await api.importTransactions(
cmdOpts.account,
transactions,
{
defaultCleared: true,
dryRun: cmdOpts.dryRun,
},
);
printOutput(result, opts.format);
});
});
transactions
@@ -104,17 +92,13 @@ export function registerTransactionsCommand(program: Command) {
.option('--file <path>', 'Read fields from JSON file (use - for stdin)')
.action(async (id: string, cmdOpts) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
const fields = readJsonInput(cmdOpts) as Parameters<
typeof api.updateTransaction
>[1];
await api.updateTransaction(id, fields);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
const fields = readJsonInput(cmdOpts) as Parameters<
typeof api.updateTransaction
>[1];
await api.updateTransaction(id, fields);
printOutput({ success: true, id }, opts.format);
});
});
transactions
@@ -122,13 +106,9 @@ export function registerTransactionsCommand(program: Command) {
.description('Delete a transaction')
.action(async (id: string) => {
const opts = program.opts();
await withConnection(
opts,
async () => {
await api.deleteTransaction(id);
printOutput({ success: true, id }, opts.format);
},
{ mutates: true },
);
await withConnection(opts, async () => {
await api.deleteTransaction(id);
printOutput({ success: true, id }, opts.format);
});
});
}


@@ -28,9 +28,6 @@ describe('resolveConfig', () => {
'ACTUAL_SYNC_ID',
'ACTUAL_DATA_DIR',
'ACTUAL_ENCRYPTION_PASSWORD',
'ACTUAL_CACHE_TTL',
'ACTUAL_LOCK_TIMEOUT',
'ACTUAL_NO_LOCK',
];
beforeEach(() => {
@@ -162,125 +159,6 @@ describe('resolveConfig', () => {
});
});
describe('cache options', () => {
beforeEach(() => {
process.env.ACTUAL_SERVER_URL = 'http://test';
process.env.ACTUAL_PASSWORD = 'pw';
});
it('defaults cacheTtl to 60 seconds', async () => {
const config = await resolveConfig({});
expect(config.cacheTtl).toBe(60);
});
it('reads cacheTtl from env', async () => {
process.env.ACTUAL_CACHE_TTL = '300';
const config = await resolveConfig({});
expect(config.cacheTtl).toBe(300);
});
it('prefers cacheTtl from CLI flag', async () => {
process.env.ACTUAL_CACHE_TTL = '300';
const config = await resolveConfig({ cacheTtl: 10 });
expect(config.cacheTtl).toBe(10);
});
it('rejects negative cacheTtl', async () => {
await expect(resolveConfig({ cacheTtl: -1 })).rejects.toThrow(/cacheTtl/);
});
it('rejects non-integer cacheTtl from env', async () => {
process.env.ACTUAL_CACHE_TTL = 'banana';
await expect(resolveConfig({})).rejects.toThrow(/ACTUAL_CACHE_TTL/);
});
it('defaults lockTimeout to 10 seconds', async () => {
const config = await resolveConfig({});
expect(config.lockTimeout).toBe(10);
});
it('reads lockTimeout from env', async () => {
process.env.ACTUAL_LOCK_TIMEOUT = '30';
const config = await resolveConfig({});
expect(config.lockTimeout).toBe(30);
});
it('defaults refresh to false', async () => {
const config = await resolveConfig({});
expect(config.refresh).toBe(false);
});
it('sets refresh when provided on CLI opts', async () => {
const config = await resolveConfig({ refresh: true });
expect(config.refresh).toBe(true);
});
it('sets refresh when --no-cache is passed (cliOpts.cache === false)', async () => {
const config = await resolveConfig({ cache: false });
expect(config.refresh).toBe(true);
});
it('does not set refresh when cliOpts.cache is true (flag absent)', async () => {
const config = await resolveConfig({ cache: true });
expect(config.refresh).toBe(false);
});
it('defaults noLock to false', async () => {
const config = await resolveConfig({});
expect(config.noLock).toBe(false);
});
it('sets noLock when --no-lock is passed (cliOpts.lock === false)', async () => {
const config = await resolveConfig({ lock: false });
expect(config.noLock).toBe(true);
});
it('leaves noLock false when cliOpts.lock is true (flag absent)', async () => {
const config = await resolveConfig({ lock: true });
expect(config.noLock).toBe(false);
});
it('parses ACTUAL_NO_LOCK=1 as true', async () => {
process.env.ACTUAL_NO_LOCK = '1';
const config = await resolveConfig({});
expect(config.noLock).toBe(true);
});
it('parses ACTUAL_NO_LOCK=true as true', async () => {
process.env.ACTUAL_NO_LOCK = 'true';
const config = await resolveConfig({});
expect(config.noLock).toBe(true);
});
it('throws on an invalid ACTUAL_NO_LOCK value', async () => {
process.env.ACTUAL_NO_LOCK = 'yes';
await expect(resolveConfig({})).rejects.toThrow(/ACTUAL_NO_LOCK/);
});
it('reads cacheTtl/lockTimeout/noLock from config file', async () => {
mockConfigFile({
serverUrl: 'http://file',
password: 'pw',
cacheTtl: 120,
lockTimeout: 5,
noLock: true,
});
const config = await resolveConfig({});
expect(config.cacheTtl).toBe(120);
expect(config.lockTimeout).toBe(5);
expect(config.noLock).toBe(true);
});
it('rejects non-number cacheTtl in config file', async () => {
mockConfigFile({
serverUrl: 'http://file',
password: 'pw',
cacheTtl: 'soon',
});
await expect(resolveConfig({})).rejects.toThrow(/cacheTtl/);
});
});
describe('cosmiconfig handling', () => {
it('handles null result (no config file found)', async () => {
mockConfigFile(null);


@@ -3,7 +3,7 @@ import { join } from 'path';
import { cosmiconfig } from 'cosmiconfig';
import { isRecord, parseBoolEnv, parseNonNegativeIntFlag } from './utils';
import { isRecord } from './utils';
export type CliConfig = {
serverUrl: string;
@@ -12,10 +12,6 @@ export type CliConfig = {
syncId?: string;
dataDir: string;
encryptionPassword?: string;
cacheTtl: number;
lockTimeout: number;
refresh: boolean;
noLock: boolean;
};
export type CliGlobalOpts = {
@@ -25,29 +21,10 @@ export type CliGlobalOpts = {
syncId?: string;
dataDir?: string;
encryptionPassword?: string;
cacheTtl?: number;
lockTimeout?: number;
refresh?: boolean;
// Commander stores --no-foo flags under the positive key. Default true,
// false when the flag is passed.
cache?: boolean;
lock?: boolean;
format?: 'json' | 'table' | 'csv';
verbose?: boolean;
};
const stringKeys = [
'serverUrl',
'password',
'sessionToken',
'syncId',
'dataDir',
'encryptionPassword',
] as const;
const numberKeys = ['cacheTtl', 'lockTimeout'] as const;
const booleanKeys = ['noLock'] as const;
type ConfigFileContent = {
serverUrl?: string;
password?: string;
@@ -55,15 +32,15 @@ type ConfigFileContent = {
syncId?: string;
dataDir?: string;
encryptionPassword?: string;
cacheTtl?: number;
lockTimeout?: number;
noLock?: boolean;
};
const configFileKeys: readonly string[] = [
...stringKeys,
...numberKeys,
...booleanKeys,
'serverUrl',
'password',
'sessionToken',
'syncId',
'dataDir',
'encryptionPassword',
];
function validateConfigFileContent(value: unknown): ConfigFileContent {
@@ -77,30 +54,9 @@ function validateConfigFileContent(value: unknown): ConfigFileContent {
if (!configFileKeys.includes(key)) {
throw new Error(`Invalid config file: unknown key "${key}"`);
}
const v = value[key];
if (v === undefined) continue;
if (
(stringKeys as readonly string[]).includes(key) &&
typeof v !== 'string'
) {
if (value[key] !== undefined && typeof value[key] !== 'string') {
throw new Error(
`Invalid config file: key "${key}" must be a string, got ${typeof v}`,
);
}
if (
(numberKeys as readonly string[]).includes(key) &&
(typeof v !== 'number' || !Number.isInteger(v) || v < 0)
) {
throw new Error(
`Invalid config file: key "${key}" must be a non-negative integer`,
);
}
if (
(booleanKeys as readonly string[]).includes(key) &&
typeof v !== 'boolean'
) {
throw new Error(
`Invalid config file: key "${key}" must be a boolean, got ${typeof v}`,
`Invalid config file: key "${key}" must be a string, got ${typeof value[key]}`,
);
}
}
@@ -127,22 +83,6 @@ async function loadConfigFile(): Promise<ConfigFileContent> {
return {};
}
function parseNonNegativeIntEnv(
raw: string | undefined,
source: string,
): number | undefined {
return raw === undefined ? undefined : parseNonNegativeIntFlag(raw, source);
}
function validateNonNegativeInt(value: number, name: string): number {
if (!Number.isInteger(value) || value < 0) {
throw new Error(
`Invalid ${name}: expected a non-negative integer, got ${value}`,
);
}
return value;
}
export async function resolveConfig(
cliOpts: CliGlobalOpts,
): Promise<CliConfig> {
@@ -188,37 +128,6 @@ export async function resolveConfig(
);
}
const cacheTtl = validateNonNegativeInt(
cliOpts.cacheTtl ??
parseNonNegativeIntEnv(
process.env.ACTUAL_CACHE_TTL,
'ACTUAL_CACHE_TTL',
) ??
fileConfig.cacheTtl ??
60,
'cacheTtl',
);
const lockTimeout = validateNonNegativeInt(
cliOpts.lockTimeout ??
parseNonNegativeIntEnv(
process.env.ACTUAL_LOCK_TIMEOUT,
'ACTUAL_LOCK_TIMEOUT',
) ??
fileConfig.lockTimeout ??
10,
'lockTimeout',
);
const refresh = (cliOpts.refresh ?? false) || cliOpts.cache === false;
const flagNoLock = cliOpts.lock === false ? true : undefined;
const noLock =
flagNoLock ??
parseBoolEnv(process.env.ACTUAL_NO_LOCK, 'ACTUAL_NO_LOCK') ??
fileConfig.noLock ??
false;
return {
serverUrl,
password,
@@ -226,9 +135,5 @@ export async function resolveConfig(
syncId,
dataDir,
encryptionPassword,
cacheTtl,
lockTimeout,
refresh,
noLock,
};
}
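For review context: each numeric setting above resolves through a flag ?? env ?? file ?? default precedence chain. A minimal standalone sketch of that chain (the function name and signature are illustrative, not the CLI's actual exports):

```typescript
// Illustrative sketch only — mirrors the cacheTtl/lockTimeout precedence
// above: CLI flag, then env var, then config file, then the default.
function resolveNumberSetting(
  flag: number | undefined,
  env: string | undefined,
  file: number | undefined,
  fallback: number,
): number {
  const fromEnv = env === undefined ? undefined : Number(env);
  if (fromEnv !== undefined && (!Number.isInteger(fromEnv) || fromEnv < 0)) {
    throw new Error(`expected a non-negative integer, got "${env}"`);
  }
  return flag ?? fromEnv ?? file ?? fallback;
}
```

Note the chain uses `??`, not `||`, so an explicit `0` from a higher-precedence source still wins over lower-precedence values.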

View File

@@ -1,44 +1,24 @@
import { mkdtempSync, rmSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import * as api from '@actual-app/api';
import { getMetaDir, writeCacheState } from './cache';
import { resolveConfig } from './config';
import { withConnection } from './connection';
vi.mock('@actual-app/api', () => ({
init: vi.fn().mockResolvedValue(undefined),
downloadBudget: vi.fn().mockResolvedValue(undefined),
loadBudget: vi.fn().mockResolvedValue(undefined),
sync: vi.fn().mockResolvedValue(undefined),
shutdown: vi.fn().mockResolvedValue(undefined),
getBudgets: vi
.fn()
.mockResolvedValue([{ id: 'bud-disk-1', groupId: 'sync-1' }]),
}));
vi.mock('./config', () => ({
resolveConfig: vi.fn(),
}));
let dataDir: string;
function metaDirFor(syncId: string) {
return getMetaDir(dataDir, syncId);
}
function setConfig(overrides: Record<string, unknown> = {}) {
vi.mocked(resolveConfig).mockResolvedValue({
serverUrl: 'http://test',
password: 'pw',
dataDir,
syncId: 'sync-1',
cacheTtl: 60,
lockTimeout: 10,
refresh: false,
noLock: true,
dataDir: '/tmp/data',
syncId: 'budget-1',
...overrides,
});
}
@@ -51,182 +31,104 @@ describe('withConnection', () => {
stderrSpy = vi
.spyOn(process.stderr, 'write')
.mockImplementation(() => true);
dataDir = mkdtempSync(join(tmpdir(), 'actual-cli-conn-'));
setConfig();
});
afterEach(() => {
stderrSpy.mockRestore();
rmSync(dataDir, { recursive: true, force: true });
});
it('calls api.init with password when no sessionToken', async () => {
await withConnection({}, async () => 'ok', { mutates: false });
setConfig({ password: 'pw', sessionToken: undefined });
await withConnection({}, async () => 'ok');
expect(api.init).toHaveBeenCalledWith({
serverURL: 'http://test',
password: 'pw',
dataDir,
dataDir: '/tmp/data',
verbose: undefined,
});
});
it('calls api.init with sessionToken when present', async () => {
setConfig({ sessionToken: 'tok', password: undefined });
await withConnection({}, async () => 'ok', { mutates: false });
await withConnection({}, async () => 'ok');
expect(api.init).toHaveBeenCalledWith({
serverURL: 'http://test',
sessionToken: 'tok',
dataDir,
dataDir: '/tmp/data',
verbose: undefined,
});
});
it('first run: calls downloadBudget and writes cache state', async () => {
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.downloadBudget).toHaveBeenCalledWith('sync-1', {
it('calls api.downloadBudget when syncId is set', async () => {
setConfig({ syncId: 'budget-1' });
await withConnection({}, async () => 'ok');
expect(api.downloadBudget).toHaveBeenCalledWith('budget-1', {
password: undefined,
});
expect(api.sync).not.toHaveBeenCalled();
});
it('skips sync on a read inside the TTL', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.loadBudget).toHaveBeenCalledWith('bud-disk-1');
expect(api.sync).not.toHaveBeenCalled();
expect(api.downloadBudget).not.toHaveBeenCalled();
});
it('syncs on a read past the TTL', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now() - 10 * 60_000,
lastDownloadedAt: Date.now() - 10 * 60_000,
});
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.loadBudget).toHaveBeenCalled();
expect(api.sync).toHaveBeenCalledTimes(1);
});
it('write command syncs before and after the callback, even when fresh', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
await withConnection({}, async () => 'ok', { mutates: true });
expect(api.loadBudget).toHaveBeenCalled();
expect(api.sync).toHaveBeenCalledTimes(2);
});
it('--refresh forces a sync on a read inside the TTL', async () => {
setConfig({ refresh: true });
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.sync).toHaveBeenCalledTimes(1);
});
it('encrypted budget forces a sync on a read inside the TTL', async () => {
setConfig({ encryptionPassword: 'secret' });
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.sync).toHaveBeenCalledTimes(1);
});
it('invalidates cache when syncId changes', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'OTHER',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now(),
lastDownloadedAt: Date.now(),
});
await withConnection({}, async () => 'ok', { mutates: false });
expect(api.downloadBudget).toHaveBeenCalled();
});
it('skips budget work when skipBudget is true', async () => {
await withConnection({}, async () => 'ok', {
mutates: false,
skipBudget: true,
});
expect(api.downloadBudget).not.toHaveBeenCalled();
expect(api.loadBudget).not.toHaveBeenCalled();
expect(api.sync).not.toHaveBeenCalled();
});
it('throws when syncId is missing and skipBudget is false', async () => {
it('throws when loadBudget is true but syncId is not set', async () => {
setConfig({ syncId: undefined });
await expect(
withConnection({}, async () => 'ok', { mutates: false }),
).rejects.toThrow('Sync ID is required');
await expect(withConnection({}, async () => 'ok')).rejects.toThrow(
'Sync ID is required',
);
});
it('returns the callback result', async () => {
const result = await withConnection({}, async () => 42, {
mutates: false,
});
it('skips budget download when loadBudget is false and syncId is not set', async () => {
setConfig({ syncId: undefined });
await withConnection({}, async () => 'ok', { loadBudget: false });
expect(api.downloadBudget).not.toHaveBeenCalled();
});
it('does not call api.downloadBudget when loadBudget is false', async () => {
setConfig({ syncId: 'budget-1' });
await withConnection({}, async () => 'ok', { loadBudget: false });
expect(api.downloadBudget).not.toHaveBeenCalled();
});
it('returns callback result', async () => {
const result = await withConnection({}, async () => 42);
expect(result).toBe(42);
});
it('calls api.shutdown on success', async () => {
await withConnection({}, async () => 'ok', { mutates: false });
it('calls api.shutdown in finally block on success', async () => {
await withConnection({}, async () => 'ok');
expect(api.shutdown).toHaveBeenCalled();
});
it('calls api.shutdown on error', async () => {
it('calls api.shutdown in finally block on error', async () => {
await expect(
withConnection(
{},
async () => {
throw new Error('boom');
},
{ mutates: false },
),
withConnection({}, async () => {
throw new Error('boom');
}),
).rejects.toThrow('boom');
expect(api.shutdown).toHaveBeenCalled();
});
it('propagates sync errors on a stale read', async () => {
writeCacheState(metaDirFor('sync-1'), {
version: 1,
syncId: 'sync-1',
budgetId: 'bud-disk-1',
serverUrl: 'http://test',
lastSyncedAt: Date.now() - 10 * 60_000,
lastDownloadedAt: Date.now() - 10 * 60_000,
});
vi.mocked(api.sync).mockRejectedValueOnce(new Error('network'));
await expect(
withConnection({}, async () => 'ok', { mutates: false }),
).rejects.toThrow('network');
it('does not write to stderr by default', async () => {
await withConnection({}, async () => 'ok');
expect(stderrSpy).not.toHaveBeenCalled();
});
it('writes info to stderr when verbose', async () => {
await withConnection({ verbose: true }, async () => 'ok');
expect(stderrSpy).toHaveBeenCalledWith(
expect.stringContaining('Connecting to'),
);
});
});

View File

@@ -1,49 +1,30 @@
import { mkdirSync } from 'fs';
import * as api from '@actual-app/api';
import type { CacheState } from './cache';
import {
CACHE_VERSION,
decideSyncAction,
getMetaDir,
readCacheState,
writeCacheState,
} from './cache';
import type { CliConfig, CliGlobalOpts } from './config';
import { resolveConfig } from './config';
import { acquireExclusive, acquireShared } from './lock';
import type { Release } from './lock';
type ConnectionOptions = {
mutates: boolean;
skipBudget?: boolean;
};
import type { CliGlobalOpts } from './config';
function info(message: string, verbose?: boolean) {
if (verbose) process.stderr.write(message + '\n');
if (verbose) {
process.stderr.write(message + '\n');
}
}
async function resolveBudgetIdForSyncId(syncId: string): Promise<string> {
const budgets = await api.getBudgets();
const match = budgets.find(
b =>
typeof b.id === 'string' &&
(b.groupId === syncId || b.cloudFileId === syncId),
);
if (!match?.id) {
throw new Error(
`Could not resolve on-disk budget id for syncId ${syncId} after download.`,
);
}
return match.id;
}
type ConnectionOptions = {
loadBudget?: boolean;
};
export async function withConnection<T>(
globalOpts: CliGlobalOpts,
fn: (config: CliConfig) => Promise<T>,
{ mutates, skipBudget = false }: ConnectionOptions,
fn: () => Promise<T>,
options: ConnectionOptions = {},
): Promise<T> {
const { loadBudget = true } = options;
const config = await resolveConfig(globalOpts);
mkdirSync(config.dataDir, { recursive: true });
info(`Connecting to ${config.serverUrl}...`, globalOpts.verbose);
if (config.sessionToken) {
@@ -67,87 +48,17 @@ export async function withConnection<T>(
}
try {
if (skipBudget) return await fn(config);
if (!config.syncId) {
if (loadBudget && config.syncId) {
info(`Downloading budget ${config.syncId}...`, globalOpts.verbose);
await api.downloadBudget(config.syncId, {
password: config.encryptionPassword,
});
} else if (loadBudget && !config.syncId) {
throw new Error(
'Sync ID is required for this command. Set --sync-id or ACTUAL_SYNC_ID.',
);
}
const meta = getMetaDir(config.dataDir, config.syncId);
let release: Release | null = null;
if (!config.noLock) {
release = mutates
? await acquireExclusive(meta, {
timeoutMs: config.lockTimeout * 1000,
})
: await acquireShared(meta, {
timeoutMs: config.lockTimeout * 1000,
});
}
try {
const cachedState = readCacheState(meta);
const decision = decideSyncAction({
state: cachedState,
config: { syncId: config.syncId, serverUrl: config.serverUrl },
now: Date.now(),
ttlMs: config.cacheTtl * 1000,
mutates,
refresh: config.refresh,
encrypted: Boolean(config.encryptionPassword),
});
let state: CacheState;
if (decision.action === 'download') {
info(
cachedState === null
? `Downloading budget ${config.syncId} for the first time...`
: `Re-downloading budget ${config.syncId} (cache invalidated)...`,
globalOpts.verbose,
);
await api.downloadBudget(config.syncId, {
password: config.encryptionPassword,
});
const budgetId = await resolveBudgetIdForSyncId(config.syncId);
const now = Date.now();
state = {
version: CACHE_VERSION,
syncId: config.syncId,
budgetId,
serverUrl: config.serverUrl,
lastSyncedAt: now,
lastDownloadedAt: now,
};
writeCacheState(meta, state);
} else if (decision.action === 'skip') {
const age = Math.round(
(Date.now() - decision.state.lastSyncedAt) / 1000,
);
info(`Using cached budget (synced ${age}s ago)...`, globalOpts.verbose);
await api.loadBudget(decision.state.budgetId);
state = decision.state;
} else {
info(`Syncing budget ${config.syncId}...`, globalOpts.verbose);
await api.loadBudget(decision.state.budgetId);
await api.sync();
state = { ...decision.state, lastSyncedAt: Date.now() };
writeCacheState(meta, state);
}
const result = await fn(config);
if (mutates) {
info(`Pushing changes for ${config.syncId}...`, globalOpts.verbose);
await api.sync();
state = { ...state, lastSyncedAt: Date.now() };
writeCacheState(meta, state);
}
return result;
} finally {
if (release) await release();
}
return await fn();
} finally {
await api.shutdown();
}
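The removed branch above picks between download, skip, and sync. A self-contained sketch of that decision table, reconstructed from the branches and tests visible in this diff (the inline signature is hypothetical, not the real `decideSyncAction` export from `./cache`):

```typescript
type CacheState = { syncId: string; serverUrl: string; lastSyncedAt: number };

// Reconstruction from the diff, not the shipped implementation: no cached
// state or a changed target → download; writes, --refresh, or encryption
// always sync; otherwise sync only once the TTL has lapsed.
function decideSyncAction(args: {
  state: CacheState | null;
  config: { syncId: string; serverUrl: string };
  now: number;
  ttlMs: number;
  mutates: boolean;
  refresh: boolean;
  encrypted: boolean;
}): 'download' | 'skip' | 'sync' {
  const { state, config, now, ttlMs, mutates, refresh, encrypted } = args;
  if (
    state === null ||
    state.syncId !== config.syncId ||
    state.serverUrl !== config.serverUrl
  ) {
    return 'download';
  }
  if (mutates || refresh || encrypted) return 'sync';
  return now - state.lastSyncedAt <= ttlMs ? 'skip' : 'sync';
}
```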

View File

@@ -9,10 +9,8 @@ import { registerQueryCommand } from './commands/query';
import { registerRulesCommand } from './commands/rules';
import { registerSchedulesCommand } from './commands/schedules';
import { registerServerCommand } from './commands/server';
import { registerSyncCommand } from './commands/sync';
import { registerTagsCommand } from './commands/tags';
import { registerTransactionsCommand } from './commands/transactions';
import { parseNonNegativeIntFlag } from './utils';
declare const __CLI_VERSION__: string;
@@ -34,22 +32,6 @@ program
'--encryption-password <password>',
'E2E encryption password (env: ACTUAL_ENCRYPTION_PASSWORD)',
)
.option(
'--cache-ttl <seconds>',
'Cache TTL in seconds (env: ACTUAL_CACHE_TTL; default: 60)',
value => parseNonNegativeIntFlag(value, '--cache-ttl'),
)
.option('--refresh', 'Force a sync on this call, ignoring the cache', false)
.option('--no-cache', 'Alias for --refresh')
.option(
'--lock-timeout <seconds>',
'How long to wait for another CLI process to release the lock (env: ACTUAL_LOCK_TIMEOUT; default: 10)',
value => parseNonNegativeIntFlag(value, '--lock-timeout'),
)
.option(
'--no-lock',
'Disable the budget directory lock (use with care, env: ACTUAL_NO_LOCK)',
)
.addOption(
new Option('--format <format>', 'Output format: json, table, csv')
.choices(['json', 'table', 'csv'] as const)
@@ -68,7 +50,6 @@ registerRulesCommand(program);
registerSchedulesCommand(program);
registerQueryCommand(program);
registerServerCommand(program);
registerSyncCommand(program);
function normalizeThrownMessage(err: unknown): string {
if (err instanceof Error) return err.message;

View File

@@ -1,159 +0,0 @@
import {
existsSync,
mkdirSync,
mkdtempSync,
readdirSync,
rmSync,
writeFileSync,
} from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import { acquireExclusive, acquireShared } from './lock';
// In-memory stand-in for proper-lockfile. The real library spins up a
// setTimeout loop to refresh lockfile mtimes; on some CI filesystems that
// timer keeps Node's event loop alive even after tests complete, wedging the
// test run. The mock behaves identically from our wrapper's perspective
// (acquire, detect contention with ELOCKED, release) without touching the
// filesystem or scheduling timers.
const mockHeld = new Set<string>();
vi.mock('proper-lockfile', () => ({
default: {
lock: vi.fn(
async (
file: string,
opts?: { lockfilePath?: string },
): Promise<() => Promise<void>> => {
const key = opts?.lockfilePath ?? file;
if (mockHeld.has(key)) {
const err = new Error('Lock is already held') as Error & {
code?: string;
};
err.code = 'ELOCKED';
throw err;
}
mockHeld.add(key);
return async () => {
mockHeld.delete(key);
};
},
),
},
}));
describe('acquireExclusive', () => {
let dir: string;
beforeEach(() => {
mockHeld.clear();
dir = mkdtempSync(join(tmpdir(), 'actual-cli-lock-'));
});
afterEach(() => {
rmSync(dir, { recursive: true, force: true });
});
it('creates the directory if it does not exist', async () => {
const target = join(dir, 'nested', 'budget');
const release = await acquireExclusive(target, { timeoutMs: 1000 });
expect(existsSync(target)).toBe(true);
await release();
});
it('returns a release function that frees the lock', async () => {
const release1 = await acquireExclusive(dir, { timeoutMs: 1000 });
await release1();
const release2 = await acquireExclusive(dir, { timeoutMs: 1000 });
await release2();
});
it('rejects with a user-friendly error when another holder has the lock', async () => {
const release = await acquireExclusive(dir, { timeoutMs: 1000 });
await expect(acquireExclusive(dir, { timeoutMs: 100 })).rejects.toThrow(
/holding the budget/,
);
await release();
});
});
describe('acquireShared', () => {
let dir: string;
beforeEach(() => {
mockHeld.clear();
dir = mkdtempSync(join(tmpdir(), 'actual-cli-lock-'));
});
afterEach(() => {
rmSync(dir, { recursive: true, force: true });
});
it('allows multiple concurrent shared holders', async () => {
const r1 = await acquireShared(dir, { timeoutMs: 1000 });
const r2 = await acquireShared(dir, { timeoutMs: 1000 });
const readers = readdirSync(join(dir, 'readers'));
expect(readers).toHaveLength(2);
await r1();
await r2();
});
it('removes the reader marker on release', async () => {
const release = await acquireShared(dir, { timeoutMs: 1000 });
await release();
const readers = readdirSync(join(dir, 'readers'));
expect(readers).toHaveLength(0);
});
it('rejects when an exclusive lock is held', async () => {
const releaseExclusive = await acquireExclusive(dir, { timeoutMs: 1000 });
await expect(acquireShared(dir, { timeoutMs: 100 })).rejects.toThrow(
/holding the budget/,
);
await releaseExclusive();
});
it('sweeps stale reader markers whose PIDs no longer exist', async () => {
const readersDir = join(dir, 'readers');
mkdirSync(readersDir, { recursive: true });
writeFileSync(join(readersDir, '-1-abc'), '');
const release = await acquireExclusive(dir, { timeoutMs: 1000 });
expect(readdirSync(readersDir)).toHaveLength(0);
await release();
});
});
describe('writer-reader interaction', () => {
let dir: string;
beforeEach(() => {
mockHeld.clear();
dir = mkdtempSync(join(tmpdir(), 'actual-cli-lock-'));
});
afterEach(() => {
rmSync(dir, { recursive: true, force: true });
});
it('exclusive waits for active shared holders to release', async () => {
const readerRelease = await acquireShared(dir, { timeoutMs: 500 });
let writerAcquired = false;
const writerPromise = acquireExclusive(dir, { timeoutMs: 1000 }).then(
release => {
writerAcquired = true;
return release;
},
);
await new Promise(resolve => setTimeout(resolve, 150));
expect(writerAcquired).toBe(false);
await readerRelease();
const writerRelease = await writerPromise;
expect(writerAcquired).toBe(true);
await writerRelease();
});
});

View File

@@ -1,149 +0,0 @@
import { randomBytes } from 'node:crypto';
import { mkdirSync, readdirSync, rmSync, writeFileSync } from 'node:fs';
import { join } from 'node:path';
import lockfile from 'proper-lockfile';
export type Release = () => Promise<void>;
export type AcquireOptions = {
timeoutMs: number;
};
const LOCKFILE_NAME = 'lock';
const READERS_DIR_NAME = 'readers';
const READER_POLL_INTERVAL_MS = 100;
function lockfilePath(dir: string): string {
return join(dir, LOCKFILE_NAME);
}
function readersDir(dir: string): string {
return join(dir, READERS_DIR_NAME);
}
function ensureDir(dir: string) {
mkdirSync(dir, { recursive: true });
}
function retriesForTimeout(timeoutMs: number) {
return {
retries: Math.max(1, Math.floor(timeoutMs / 200)),
minTimeout: 100,
maxTimeout: 500,
factor: 1.5,
};
}
function errorCode(err: unknown): string | undefined {
if (err instanceof Error && 'code' in err) {
const { code } = err as { code?: unknown };
if (typeof code === 'string') return code;
}
return undefined;
}
function isLockedError(err: unknown): boolean {
return errorCode(err) === 'ELOCKED';
}
function lockedMessage(timeoutMs: number): string {
return `Another CLI process is holding the budget (waited ${Math.round(
timeoutMs / 1000,
)}s). Retry, or use a different --data-dir.`;
}
function pidIsAlive(pid: number): boolean {
if (pid <= 0) return false;
try {
process.kill(pid, 0);
return true;
} catch (err) {
return errorCode(err) === 'EPERM';
}
}
function readReaderNames(readers: string): string[] {
try {
return readdirSync(readers);
} catch (err) {
if (errorCode(err) === 'ENOENT') return [];
throw err;
}
}
function sweepStaleReaders(dir: string) {
const readers = readersDir(dir);
for (const name of readReaderNames(readers)) {
const pid = Number(name.split('-')[0]);
if (!Number.isFinite(pid) || !pidIsAlive(pid)) {
rmSync(join(readers, name), { force: true });
}
}
}
async function waitForReadersEmpty(dir: string, timeoutMs: number) {
const readers = readersDir(dir);
const deadline = Date.now() + timeoutMs;
while (Date.now() < deadline) {
sweepStaleReaders(dir);
if (readReaderNames(readers).length === 0) return;
await new Promise(resolve => setTimeout(resolve, READER_POLL_INTERVAL_MS));
}
throw new Error(lockedMessage(timeoutMs));
}
async function acquireGate(
dir: string,
timeoutMs: number,
): Promise<() => Promise<void>> {
ensureDir(dir);
try {
return await lockfile.lock(dir, {
lockfilePath: lockfilePath(dir),
retries: retriesForTimeout(timeoutMs),
stale: 30_000,
});
} catch (err) {
if (isLockedError(err)) throw new Error(lockedMessage(timeoutMs));
throw err;
}
}
export async function acquireExclusive(
dir: string,
{ timeoutMs }: AcquireOptions,
): Promise<Release> {
const start = Date.now();
const release = await acquireGate(dir, timeoutMs);
try {
const remaining = Math.max(0, timeoutMs - (Date.now() - start));
await waitForReadersEmpty(dir, remaining);
} catch (err) {
await release();
throw err;
}
return () => release();
}
export async function acquireShared(
dir: string,
{ timeoutMs }: AcquireOptions,
): Promise<Release> {
const gate = await acquireGate(dir, timeoutMs);
let markerPath: string;
try {
const readers = readersDir(dir);
ensureDir(readers);
const markerName = `${process.pid}-${randomBytes(6).toString('hex')}`;
markerPath = join(readers, markerName);
writeFileSync(markerPath, '');
} catch (err) {
await gate();
throw err;
}
await gate();
return async () => {
rmSync(markerPath, { force: true });
};
}
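The deleted lock module combines a proper-lockfile gate with a readers directory: writers hold the gate for their whole critical section, while readers take it only long enough to drop a marker file, so reads overlap. A toy in-memory model of just the admission rules (illustrative only — the real code polls with timeouts and sweeps stale PID markers):

```typescript
// Toy model of the gate + readers-dir protocol above. State names are
// invented for illustration; nothing here touches the filesystem.
type LockState = { gateHeld: boolean; readers: Set<string> };

function tryAcquireShared(s: LockState, id: string): boolean {
  if (s.gateHeld) return false; // a writer is in its critical section
  s.readers.add(id); // stands in for the reader marker file
  return true;
}

function tryAcquireExclusive(s: LockState): boolean {
  // Writers need the gate AND an empty readers set.
  if (s.gateHeld || s.readers.size > 0) return false;
  s.gateHeld = true;
  return true;
}

function release(s: LockState, readerId?: string): void {
  if (readerId !== undefined) s.readers.delete(readerId);
  else s.gateHeld = false;
}
```

This matches the semantics the deleted tests assert: multiple concurrent shared holders, exclusive waiting for readers to drain, and shared rejected while exclusive is held.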

View File

@@ -18,29 +18,3 @@ export function parseIntFlag(value: string, flagName: string): number {
}
return parsed;
}
export function parseNonNegativeIntFlag(
value: string,
flagName: string,
): number {
const parsed = parseIntFlag(value, flagName);
if (parsed < 0) {
throw new Error(
`Invalid ${flagName}: "${value}". Expected a non-negative integer.`,
);
}
return parsed;
}
export function parseBoolEnv(
raw: string | undefined,
source: string,
): boolean | undefined {
if (raw === undefined) return undefined;
const lower = raw.toLowerCase();
if (raw === '1' || lower === 'true') return true;
if (raw === '0' || lower === 'false') return false;
throw new Error(
`Invalid ${source}: "${raw}". Expected "true", "false", "1", or "0".`,
);
}
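The removed `parseBoolEnv` accepts exactly four spellings, case-insensitively, and treats an unset variable as "no opinion" rather than false. Restated standalone for quick verification:

```typescript
// Same convention as the removed helper above: "1"/"true" → true,
// "0"/"false" → false (case-insensitive), unset → undefined, anything
// else is an error. The error message is simplified here.
function parseBoolEnv(raw: string | undefined): boolean | undefined {
  if (raw === undefined) return undefined;
  const lower = raw.toLowerCase();
  if (lower === '1' || lower === 'true') return true;
  if (lower === '0' || lower === 'false') return false;
  throw new Error(`expected "true", "false", "1", or "0", got "${raw}"`);
}
```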

View File

@@ -32,8 +32,5 @@ export default defineConfig({
plugins: [visualizer({ template: 'raw-data', filename: 'dist/stats.json' })],
test: {
globals: true,
include: ['src/**/*.test.ts'],
exclude: ['**/node_modules/**', '**/dist/**'],
testTimeout: 10_000,
},
});

View File

@@ -4,11 +4,8 @@ import type { Preview } from '@storybook/react-vite';
// Not ideal to import from desktop-client, but we need a source of truth for theme variables
// TODO: this needs refactoring
// oxlint-disable-next-line actual/enforce-boundaries
import * as darkTheme from '../../desktop-client/src/style/themes/dark';
// oxlint-disable-next-line actual/enforce-boundaries
import * as lightTheme from '../../desktop-client/src/style/themes/light';
// oxlint-disable-next-line actual/enforce-boundaries
import * as midnightTheme from '../../desktop-client/src/style/themes/midnight';
const THEMES = {

View File

@@ -58,7 +58,7 @@
"@svgr/babel-plugin-add-jsx-attribute": "^8.0.0",
"@svgr/cli": "^8.1.0",
"@types/react": "^19.2.14",
"@typescript/native-preview": "beta",
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"@vitejs/plugin-react": "^6.0.1",
"eslint-plugin-storybook": "^10.3.4",
"react": "19.2.4",

View File

@@ -4,11 +4,7 @@
"description": "CRDT layer of Actual",
"license": "MIT",
"files": [
"dist",
"!dist/**/*.test.d.ts",
"!dist/**/*.test.d.ts.map",
"!dist/**/*.spec.d.ts",
"!dist/**/*.spec.d.ts.map"
"dist"
],
"main": "dist/index.js",
"types": "dist/index.d.ts",
@@ -30,18 +26,17 @@
"scripts": {
"build:node": "vite build",
"proto:generate": "./bin/generate-proto",
"build": "yarn run build:node && tsgo -b",
"build": "yarn run build:node && tsgo -p tsconfig.build.json --emitDeclarationOnly",
"test": "vitest --run",
"typecheck": "tsgo -b"
},
"dependencies": {
"google-protobuf": "^3.21.4",
"murmurhash": "^2.0.1",
"uuid": "^14.0.0"
"murmurhash": "^2.0.1"
},
"devDependencies": {
"@types/google-protobuf": "3.15.12",
"@typescript/native-preview": "beta",
"@typescript/native-preview": "^7.0.0-dev.20260404.1",
"protoc-gen-js": "3.21.4-4",
"rollup-plugin-visualizer": "^7.0.1",
"ts-protoc-gen": "0.15.0",

View File

@@ -1,5 +1,4 @@
import murmurhash from 'murmurhash';
import { v4 as uuidv4 } from 'uuid';
import type { TrieNode } from './merkle';
@@ -77,7 +76,7 @@ export function deserializeClock(clock: string): Clock {
}
export function makeClientId() {
return uuidv4().replace(/-/g, '').slice(-16);
return crypto.randomUUID().replace(/-/g, '').slice(-16);
}
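The change above swaps the `uuid` package for the built-in `crypto.randomUUID`; the output shape is unchanged — the last 16 lowercase hex characters of a dash-stripped UUIDv4. A standalone check (importing from `node:crypto` rather than relying on the global, to be explicit):

```typescript
import { randomUUID } from 'node:crypto';

// Same derivation as the new makeClientId above: strip dashes from a
// v4 UUID and keep the trailing 16 hex characters.
function makeClientId(): string {
  return randomUUID().replace(/-/g, '').slice(-16);
}
```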
const config = {

View File

@@ -0,0 +1,8 @@
{
"extends": "./tsconfig.json",
"compilerOptions": {
"composite": false,
"emitDeclarationOnly": false
},
"exclude": ["**/*.test.ts", "**/*.spec.ts"]
}

View File

@@ -8,7 +8,6 @@ coverage
test-results
playwright-report
blob-report
.playwright-cli
# production
build

View File

@@ -0,0 +1,17 @@
#!/bin/sh -ex
ROOT=$(dirname "$0")
cd "$ROOT/.."
echo "Building the browser..."
rm -fr build
export REACT_APP_BACKEND_WORKER_HASH=$(ls "$ROOT"/../public/kcab/kcab.worker.*.js | sed 's/.*kcab\.worker\.\(.*\)\.js/\1/')
yarn build --mode=browser
rm -fr build-stats
mkdir build-stats
mv build/kcab/stats.json build-stats/loot-core-stats.json
mv ./stats.json build-stats/web-stats.json

View File

@@ -1,97 +0,0 @@
#!/usr/bin/env node
// Minimal static file server for the prebuilt browser bundle at
// packages/desktop-client/build. Serves with the COOP/COEP headers required
// by the app (SharedArrayBuffer/SQLite). Intended for CI e2e runs where
// starting the full Vite dev server is unnecessary overhead.
import fs from 'node:fs';
import http from 'node:http';
import path from 'node:path';
import { fileURLToPath } from 'node:url';
const __dirname = path.dirname(fileURLToPath(import.meta.url));
const ROOT = path.resolve(__dirname, '..', 'build');
const INDEX_PATH = path.join(ROOT, 'index.html');
const PORT = Number(process.env.PORT) || 3001;
const MIME = {
'.html': 'text/html; charset=utf-8',
'.js': 'text/javascript; charset=utf-8',
'.mjs': 'text/javascript; charset=utf-8',
'.css': 'text/css; charset=utf-8',
'.json': 'application/json; charset=utf-8',
'.map': 'application/json; charset=utf-8',
'.svg': 'image/svg+xml',
'.png': 'image/png',
'.jpg': 'image/jpeg',
'.jpeg': 'image/jpeg',
'.gif': 'image/gif',
'.ico': 'image/x-icon',
'.wasm': 'application/wasm',
'.woff': 'font/woff',
'.woff2': 'font/woff2',
'.ttf': 'font/ttf',
'.otf': 'font/otf',
'.webmanifest': 'application/manifest+json',
'.txt': 'text/plain; charset=utf-8',
};
function setSharedHeaders(res) {
res.setHeader('Cross-Origin-Opener-Policy', 'same-origin');
res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp');
res.setHeader('Cross-Origin-Resource-Policy', 'same-origin');
}
function resolveFile(urlPath) {
let cleanPath;
try {
cleanPath = decodeURIComponent(urlPath.split('?')[0].split('#')[0]);
} catch {
return null;
}
if (cleanPath.includes('\0')) return null;
// Strip leading slashes so path.resolve treats it as relative to ROOT,
// regardless of whether the URL was absolute or contained duplicate
// separators.
const relPath = cleanPath.replace(/^\/+/, '');
const candidate = path.resolve(ROOT, relPath);
const relative = path.relative(ROOT, candidate);
if (relative.startsWith('..') || path.isAbsolute(relative)) return null;
try {
return fs.statSync(candidate).isFile() ? candidate : null;
} catch {
return null;
}
}
const server = http.createServer((req, res) => {
setSharedHeaders(res);
const rawUrlPath = (req.url || '/').split('?')[0].split('#')[0];
let filePath = resolveFile(req.url || '/');
// SPA fallback: serve index.html only for routes without a file extension
// (i.e. client-side routes). Asset requests that miss get a real 404 so the
// browser doesn't receive HTML when it asked for JS/CSS/etc.
if (!filePath) {
const hasExtension = path.extname(rawUrlPath) !== '';
if (hasExtension) {
res.writeHead(404);
res.end('Not found');
return;
}
filePath = INDEX_PATH;
}
const ext = path.extname(filePath).toLowerCase();
res.setHeader('Content-Type', MIME[ext] || 'application/octet-stream');
fs.createReadStream(filePath)
.on('error', err => {
res.writeHead(500);
res.end(String(err));
})
.pipe(res);
});
server.listen(PORT, () => {
console.log(`serve-build: serving ${ROOT} on http://localhost:${PORT}`);
});
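The traversal guard in `resolveFile` boils down to a resolve-then-relative containment check. Extracted as a standalone helper (hypothetical name, illustration only; the paths in the checks assume a POSIX filesystem):

```typescript
import path from 'node:path';

// Same containment idea as resolveFile above: strip leading slashes so the
// URL path is treated as relative to root, resolve it, then reject anything
// whose relative path escapes the root.
function contained(root: string, urlPath: string): string | null {
  const rel = urlPath.replace(/^\/+/, '');
  const candidate = path.resolve(root, rel);
  const relative = path.relative(root, candidate);
  if (relative.startsWith('..') || path.isAbsolute(relative)) return null;
  return candidate;
}
```

The `path.relative` step is what makes this robust: any `..` segments are normalized by `path.resolve` first, so an escape can only show up as a relative path that starts with `..`.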

View File

@@ -1,210 +0,0 @@
import { appendFileSync, readFileSync } from 'node:fs';
import { dirname, resolve } from 'node:path';
import { fileURLToPath } from 'node:url';
import type { CatalogTheme } from '../src/style/customThemes.ts';
import {
embedThemeFonts,
validateThemeCss,
} from '../src/style/customThemes.ts';
const MAX_CSS_BYTES = 512 * 1024;
const FETCH_TIMEOUT_MS = 15_000;
const REPO_PATTERN = /^[A-Za-z0-9._-]+\/[A-Za-z0-9._-]+$/;
type ThemeResult = {
name: string;
repo: string;
status: 'ok' | 'error';
error?: string;
};
const here = dirname(fileURLToPath(import.meta.url));
const catalogPath = resolve(
here,
'..',
'src',
'data',
'customThemeCatalog.json',
);
function readCatalog(): CatalogTheme[] {
const raw = readFileSync(catalogPath, 'utf8');
const parsed: unknown = JSON.parse(raw);
if (!Array.isArray(parsed)) {
throw new Error('Catalog JSON must be an array.');
}
return parsed.map((entry, i) => validateCatalogEntry(entry, i));
}
function validateCatalogEntry(value: unknown, index: number): CatalogTheme {
if (!value || typeof value !== 'object') {
throw new Error(`Catalog entry #${index} is not an object.`);
}
const e = value as Record<string, unknown>;
if (typeof e.name !== 'string' || !e.name.trim()) {
throw new Error(`Catalog entry #${index} is missing a valid "name".`);
}
// Schema-check the repo before it gets interpolated into a fetch URL.
if (typeof e.repo !== 'string' || !REPO_PATTERN.test(e.repo)) {
throw new Error(
`Catalog entry "${String(e.name)}" has an invalid "repo" (expected "owner/repo"): ${JSON.stringify(e.repo)}`,
);
}
if (e.mode !== 'light' && e.mode !== 'dark') {
throw new Error(
`Catalog entry "${String(e.name)}" has an invalid "mode" (expected "light" or "dark").`,
);
}
if (
e.colors !== undefined &&
(!Array.isArray(e.colors) ||
!e.colors.every((c: unknown) => typeof c === 'string'))
) {
throw new Error(
`Catalog entry "${String(e.name)}" has an invalid "colors" (expected string[]).`,
);
}
return {
name: e.name,
repo: e.repo,
mode: e.mode,
colors: e.colors as string[] | undefined,
};
}
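The repo check above is a pure schema test, run before the value is interpolated into a fetch URL. A couple of concrete cases (note the character class allows dots, so it constrains shape, not semantics):

```typescript
// Same shape as REPO_PATTERN above: exactly one "/" separating two runs
// of [A-Za-z0-9._-]. Colons and extra slashes fail, which is what keeps a
// full URL from sneaking into the raw.githubusercontent.com template.
const repoPattern = /^[A-Za-z0-9._-]+\/[A-Za-z0-9._-]+$/;
```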
async function fetchCss(url: string): Promise<string> {
const response = await fetch(url, {
signal: AbortSignal.timeout(FETCH_TIMEOUT_MS),
redirect: 'error',
headers: { Accept: 'text/css, text/plain, */*' },
});
if (!response.ok) {
throw new Error(
`Failed to fetch ${url}: ${response.status} ${response.statusText}`,
);
}
const contentLength = response.headers.get('content-length');
if (contentLength !== null) {
const size = Number.parseInt(contentLength, 10);
if (Number.isFinite(size) && size > MAX_CSS_BYTES) {
throw new Error(
`CSS at ${url} is ${size} bytes; max allowed is ${MAX_CSS_BYTES} bytes.`,
);
}
}
const reader = response.body?.getReader();
if (!reader) {
throw new Error(`Response from ${url} has no body.`);
}
const decoder = new TextDecoder('utf-8');
let received = 0;
let text = '';
for (;;) {
const { done, value } = await reader.read();
if (done) break;
received += value.byteLength;
if (received > MAX_CSS_BYTES) {
await reader.cancel();
throw new Error(
`CSS at ${url} exceeds max allowed size of ${MAX_CSS_BYTES} bytes.`,
);
}
text += decoder.decode(value, { stream: true });
}
text += decoder.decode();
return text;
}
async function validateOne(entry: CatalogTheme): Promise<ThemeResult> {
  try {
    const url = `https://raw.githubusercontent.com/${entry.repo}/refs/heads/main/actual.css`;
    const css = await fetchCss(url);
    // Embed fonts before validation: the validator only accepts data: URIs in
    // @font-face, and embedThemeFonts is what turns relative url() refs into
    // data: URIs. Matches ThemeInstaller's install flow.
    const embedded = await embedThemeFonts(css, entry.repo);
    validateThemeCss(embedded);
    return { name: entry.name, repo: entry.repo, status: 'ok' };
  } catch (err) {
    return {
      name: entry.name,
      repo: entry.repo,
      status: 'error',
      error: err instanceof Error ? err.message : String(err),
    };
  }
}
function escapeForMarkdown(s: string): string {
  return s.replace(/[`<>|]/g, c => `\\${c}`).replace(/\r?\n/g, ' ');
}
function writeStepSummary(results: ThemeResult[]): void {
  const summaryPath = process.env.GITHUB_STEP_SUMMARY;
  if (!summaryPath) return;
  const okCount = results.filter(r => r.status === 'ok').length;
  const failCount = results.length - okCount;
  const lines: string[] = [];
  lines.push('# Custom theme catalog scan');
  lines.push('');
  lines.push(`- Total themes: ${results.length}`);
  lines.push(`- Passing: ${okCount}`);
  lines.push(`- Failing: ${failCount}`);
  lines.push('');
  lines.push('| Status | Theme | Repo | Error |');
  lines.push('| --- | --- | --- | --- |');
  for (const r of results) {
    const status = r.status === 'ok' ? 'pass' : 'FAIL';
    const err = r.error ? escapeForMarkdown(r.error) : '';
    lines.push(
      `| ${status} | ${escapeForMarkdown(r.name)} | ${escapeForMarkdown(r.repo)} | ${err} |`,
    );
  }
  lines.push('');
  appendFileSync(summaryPath, lines.join('\n') + '\n');
}
async function main(): Promise<void> {
  const catalog = readCatalog();
  console.log(`Validating ${catalog.length} theme(s) from the catalog…`);
  const results: ThemeResult[] = [];
  for (const entry of catalog) {
    const result = await validateOne(entry);
    if (result.status === 'ok') {
      console.log(` ok ${entry.repo.padEnd(55)} ${entry.name}`);
    } else {
      console.log(
        ` FAIL ${entry.repo.padEnd(55)} ${entry.name}\n → ${result.error}`,
      );
    }
    results.push(result);
  }
  const failed = results.filter(r => r.status === 'error');
  console.log('');
  console.log(
    `Summary: ${results.length - failed.length}/${results.length} passing, ${failed.length} failing.`,
  );
  writeStepSummary(results);
  process.exit(failed.length === 0 ? 0 : 1);
}
main().catch(err => {
  console.error(err);
  process.exit(1);
});
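The streaming byte cap inside `fetchCss` can be exercised on its own with the WHATWG `Response` class that modern Node (18+) ships globally. A minimal sketch of that pattern — `readCapped` and `MAX_BYTES` are illustrative names, not part of the script above:

```typescript
// Standalone sketch of the size-cap loop used in fetchCss: count raw bytes
// as chunks arrive, cancel the reader the moment the cap is crossed, and
// decode incrementally so multi-byte UTF-8 sequences split across chunks
// still come out right.
const MAX_BYTES = 16; // illustrative cap; the real script uses MAX_CSS_BYTES

async function readCapped(
  response: Response,
  maxBytes: number,
): Promise<string> {
  const reader = response.body?.getReader();
  if (!reader) throw new Error('Response has no body.');
  const decoder = new TextDecoder('utf-8');
  let received = 0;
  let text = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    received += value.byteLength;
    if (received > maxBytes) {
      await reader.cancel(); // stop pulling further data from the source
      throw new Error(`Body exceeds ${maxBytes} bytes.`);
    }
    text += decoder.decode(value, { stream: true });
  }
  return text + decoder.decode(); // flush any buffered partial sequence
}
```

Checking the raw byte count before decoding means an oversized body is rejected without ever materialising more than one chunk past the limit, which is why the script can trust `MAX_CSS_BYTES` even when the server omits or lies about `content-length`.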

@@ -86,14 +86,7 @@ test.describe('Accounts', () => {
      credit: '34.56',
    });
    // Wait for both newly created transactions to actually be in the
    // transaction list before selecting them. A bare waitForTimeout(100)
    // here is not enough under parallel CI load: the second
    // createSingleTransaction's row may still be mounting when the
    // selection clicks land, so the selection doesn't stick and the
    // 'Make transfer' button (rendered only when items are selected)
    // never appears.
    await expect(accountPage.getNthTransaction(1).payee).toBeVisible();
    await page.waitForTimeout(100); // Give time for the previous transaction to be rendered
    await accountPage.selectNthTransaction(0);
    await accountPage.selectNthTransaction(1);

Some files were not shown because too many files have changed in this diff.