Compare commits

...

18 Commits

Author SHA1 Message Date
github-actions[bot]
4802b75163 [AI] sync-server: bootstrap entry that registers the loader for utilityProcess
Electron's utilityProcess.fork accepts execArgv but silently ignores
--import (verified with a minimal repro: the flag shows up in
process.execArgv but the preload module never executes), so the
previous attempt was a no-op and the embedded sync-server still
crashed on crdt's ESM directory imports. Add packages/sync-server/start.mjs
that statically imports register-loader.mjs and then dynamic-imports
build/app.js, so the loader is in place before the app's module graph
resolves. desktop-electron now points utilityProcess.fork at start.mjs
and drops the ineffective --import flag.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-04 13:11:29 +01:00
github-actions[bot]
0779c9a84b [AI] desktop-electron: register sync-server loader on the embedded fork
The Electron app starts the sync server via utilityProcess.fork, which
bypasses sync-server's `start` script. With crdt now loaded from
source, the fork needs the same `--import register-loader.mjs` that
the standalone server uses; otherwise it crashes on the extensionless
`from './crdt'` directory import. Adds the loader files to
sync-server's published `files` so they actually ship with the
packaged app.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-04 12:41:10 +01:00
github-actions[bot]
425378d6ba [AI] crdt: load source directly in dev, only use dist when published
Local exports point at src/index.ts so consumers (sync-server in
particular) never load a stale Vite bundle. publishConfig keeps the
dist/ mapping for npm consumers. Switched the Vite output to ESM and
added "type": "module" so the published bundle stays consistent.

Sync-server's existing extension-resolution loader is extended to
handle directory imports and is now registered at runtime via
--import ./register-loader.mjs, matching how tests already load it.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 22:21:37 +01:00
github-actions[bot]
8dda4bb7fc Merge remote-tracking branch 'origin/master' into matiss/crdt-protobuf
# Conflicts:
#	packages/crdt/package.json
#	packages/sync-server/src/app-sync.ts
#	yarn.lock
2026-05-03 21:41:05 +01:00
github-actions[bot]
9270ccc29f [AI] address review feedback on encoder/app-sync test
- encoder.ts: prefs.getPrefs().encryptKeyId is `string | undefined`
  (MetadataPrefs is a Partial<>). The bufbuild SyncRequestSchema's
  keyId field is a non-optional proto3 string. Current code worked by
  accident — passing undefined into `create(Schema, init)` falls back
  to the schema default '' — but relied on bufbuild's undef-handling
  and would break if someone dropped @ts-strict-ignore. Normalize to
  '' explicitly.
- app-sync.test.ts: add a short WHY comment next to
  `syncRequest.since = ''` in "returns 422 if since is not provided".
  The test's intent (missing since) only matches the handler's
  `requestPb.since || null` falsy-check because proto3 strips '' on
  the wire and decodes it back to ''. Not obvious without the comment.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 17:08:13 +01:00
github-actions[bot]
819122686b [AI] address review feedback on crdt/sync-server
- generate-proto: add `set -euo pipefail` so a protoc failure exits the
  script non-zero instead of silently running oxfmt on whatever is in
  src/proto/ from the previous run.
- sync.proto SyncRequest: field numbers jumped from 3 to 5; declare
  `reserved 4;` so the slot can't be silently reused for a new field
  with an incompatible type. Regenerated sync_pb.ts — the reservation
  shows up in the encoded file descriptor.
- sync-simple.js: SQLite stores is_encrypted as a 0/1 integer and
  better-sqlite3 hands it back as a number, but the bufbuild
  MessageEnvelope schema types isEncrypted as bool. Coerce to boolean
  when constructing the envelope so the JS value matches the field
  type before toBinary runs.

Skipped the suggested `types` → ./src/index.ts swap in crdt's exports:
packages/api uses the same `types` → dist pattern and TypeScript's
bundler resolution already falls through when dist/*.d.ts doesn't yet
exist (verified — loot-core typecheck passes with packages/crdt/dist
removed).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 16:46:22 +01:00
github-actions[bot]
54e9054b4d Merge remote-tracking branch 'origin/master' into matiss/crdt-protobuf
# Conflicts:
#	packages/crdt/package.json
#	packages/sync-server/package.json
#	yarn.lock
2026-04-18 16:42:13 +01:00
github-actions[bot]
33cb1c2088 [AI] crdt: split exports per consumer (browser source, node dist)
Previous commit's conditional exports routed everything non-development
to ./dist/index.js. That broke the web build: rolldown runs with
conditions ['electron-renderer', 'module', 'browser', 'default'] — no
match for development, falls through to the dist entry, which isn't
built by bin/package-browser, and fails to resolve @actual-app/crdt
when bundling loot-core's server/undo.ts.

Split the entries so each consumer lands on the right artifact:

  types       → ./dist/index.d.ts   (TypeScript, project references)
  development → ./src/index.ts      (vitest — both configs include it)
  browser     → ./src/index.ts      (web rolldown bundles the source)
  node        → ./dist/index.js     (sync-server forked by Node at
                                     runtime — the failure that kicked
                                     off this whole saga)
  default     → ./src/index.ts      (fallback for bundlers like api's
                                     vite build with conditions=['api'])

Verified: node resolves to dist, yarn build:browser succeeds from a
clean crdt/, sync-server build produces both dist/index.js and
build/app.js, loot-core (552) + sync-server (386) tests pass, full
typecheck clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-18 09:21:09 +01:00
github-actions[bot]
20b7ea3999 [AI] crdt: expose dist/ via conditional exports so Node can load it
The package's `exports` field pointed straight at `./src/index.ts`,
which works for TS tooling and bundlers (vite with noExternal, vitest)
but breaks at plain-Node runtime — Node can't execute `.ts` files and
resolves dependent `./crdt` as a directory import, failing with
ERR_UNSUPPORTED_DIR_IMPORT.

That was invisible before because sync-server pinned
`@actual-app/crdt@2.1.0` and ran against the published npm tarball
(whose `publishConfig.exports` had already been promoted to the main
`exports` by yarn pack). Switching sync-server to `workspace:*` made
the raw workspace exports win at runtime: the compiled server imported
crdt when desktop-electron forked it, Node hit the `.ts` entry, the
utility process crashed before emitting `server-started`, and the
onboarding flow stalled on "Configure your server".

Switch to the same conditional-exports pattern packages/api already
uses: types → dist/index.d.ts, development → src/index.ts (for vitest
runs that enable the `development` condition), default → dist/index.js
(Node runtime and any other consumer). `publishConfig.exports` still
collapses this to just types + default for the npm tarball.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 23:42:45 +01:00
github-actions[bot]
487afb5344 [AI] sync-server: build @actual-app/crdt before tsgo
The previous tsgo -b approach emitted crdt's .d.ts via the project
reference but never produced dist/index.js — tsgo respects crdt's
tsconfig which has emitDeclarationOnly: true, and the actual JS
runtime is emitted by Vite in crdt's build script. So sync-server
compiled cleanly but crashed at runtime when forked by desktop-electron
(require('@actual-app/crdt') resolved to a package whose main pointed
at a nonexistent file, surfaced in e2e as the onboarding screen never
leaving the "Configure your server" state).

Unlike packages/api (which uses Vite with noExternal: true and bundles
crdt's source inline), sync-server uses plain tsgo compilation and
keeps its deps external — so crdt must be built ahead of time and be
resolvable via node_modules at runtime.

Chain `yarn workspace @actual-app/crdt build` before tsgo so every
caller of sync-server's build (build:server, docker-edge, publish
workflows, direct invocations in CI) gets a complete crdt dist. Revert
tsgo -b back to plain tsgo since crdt's build step now emits both the
JS and the declarations.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 22:43:06 +01:00
github-actions[bot]
d3245cf840 [AI] sync-server: build project references via tsgo -b
The build script ran plain `tsgo`, which doesn't compile referenced
projects. With @actual-app/crdt now a `workspace:*` dep (no bundled
declarations from the npm tarball), the sync-server build fails with
TS6305 because packages/crdt/dist/index.d.ts doesn't exist yet.

Switch to `tsgo -b` so the sync-server build is self-contained: it
emits crdt's declarations into packages/crdt/dist on demand. This
mirrors what the sync-server `typecheck` script already does and fixes
all callers (`build:server`, docker-edge, publish workflows, the
direct `yarn workspace @actual-app/sync-server build` invocation in
build.yml) without needing per-workflow lage orchestration.

Revert the build.yml workaround added in the previous commit.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 22:31:27 +01:00
github-actions[bot]
8d4f742805 [AI] CI: drive sync-server build through lage so crdt deps are built
Before: the server job ran `yarn workspace @actual-app/sync-server build`
directly, which invokes tsgo without first emitting the workspace
dependencies' declarations. That worked when sync-server pinned crdt to
the published npm version (declarations bundled in the tarball), but
with `workspace:*` it fails with TS6305 because packages/crdt/dist/*.d.ts
hasn't been built yet.

Switch the CI command to `yarn build --to=@actual-app/sync-server`.
Lage respects the `dependsOn: ['^build']` pipeline and builds
@actual-app/crdt (and the other transitive deps) before sync-server.

Using --to rather than --scope keeps the build set minimal; --scope
would also include dependents like desktop-electron.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 22:25:47 +01:00
github-actions[bot]
8862dcc0f1 [AI] crdt: drop the SyncProtoBuf compat layer
The proto/compat.ts wrapper was introduced alongside the bufbuild
migration to avoid touching call sites. With bufbuild messages already
exposing fields as plain mutable properties, the wrapper was just
boilerplate hiding direct reads and writes — and it had drifted (e.g.
setMessagesList was called in a test but never defined).

Delete compat.ts and migrate the six call sites in loot-core and
sync-server to use @bufbuild/protobuf directly. The crdt package now
re-exports the sync_pb types/schemas and the three bufbuild runtime
helpers (create, fromBinary, toBinary) so consumers keep a single
import source.

Also switch sync-server's @actual-app/crdt dependency from the pinned
"2.1.0" to "workspace:*", matching api/loot-core — the npm pin was
pulling the stale published copy instead of the workspace source.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 22:14:42 +01:00
github-actions[bot]
8c26826d9e Merge remote-tracking branch 'origin/master' into matiss/crdt-protobuf
# Conflicts:
#	packages/crdt/package.json
#	yarn.lock
2026-04-17 21:45:23 +01:00
github-actions[bot]
1f33765f57 Merge remote-tracking branch 'origin/master' into matiss/crdt-protobuf
# Conflicts:
#	packages/crdt/bin/generate-proto
#	packages/crdt/src/index.ts
2026-04-17 19:28:50 +01:00
github-actions[bot]
922870ffe0 [AI] Align @bufbuild/protobuf version ranges with installed
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 19:26:06 +01:00
github-actions[bot]
1b3dc367a6 [AI] Replace google-protobuf with @bufbuild/protobuf
Swap the google-protobuf + ts-protoc-gen + protoc-gen-js toolchain for
@bufbuild/protobuf + @bufbuild/protoc-gen-es. The generator now emits a
single pure-TS sync_pb.ts (no .js sidecar, no globalThis.proto hack)
and a thin wrapper in proto/compat.ts preserves the SyncProtoBuf /
SyncRequest / etc. API so call sites stay unchanged. Removes the
loot-core CommonJS require polyfill that only existed to service
google-protobuf.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 19:24:34 +01:00
github-actions[bot]
dcac103533 [AI] crdt: typecheck test files and clean up lint issues
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-17 17:36:15 +01:00
23 changed files with 426 additions and 1641 deletions


@@ -1,4 +1,5 @@
 #!/bin/bash
+set -euo pipefail
 cd "$(dirname "$(dirname "$0")")"
@@ -7,20 +8,10 @@ if ! [ -x "$(command -v protoc)" ]; then
   exit 1
 fi
 export PATH="$PWD/bin:$PATH"
-protoc --plugin="protoc-gen-ts=../../node_modules/.bin/protoc-gen-ts" \
-  --ts_opt=esModuleInterop=true \
-  --ts_out="src/proto" \
-  --js_out=import_style=commonjs,binary:src/proto \
+protoc --plugin="protoc-gen-es=../../node_modules/.bin/protoc-gen-es" \
+  --es_opt=target=ts \
+  --es_out="src/proto" \
   --proto_path=src/proto \
   sync.proto
-../../node_modules/.bin/oxfmt src/proto/*.d.ts
-for file in src/proto/*.d.ts; do
-  { echo "/* oxlint-disable typescript/no-namespace */"; sed 's/export class/export declare class/g' "$file"; } > "${file%.d.ts}.ts"
-  rm "$file"
-done
-echo 'One more step! Find the `var global = ...` declaration in src/proto/sync_pb.js and change it to `var global = globalThis;`'
+../../node_modules/.bin/oxfmt src/proto/*.ts


@@ -10,14 +10,11 @@
     "!dist/**/*.spec.d.ts",
     "!dist/**/*.spec.d.ts.map"
   ],
-  "main": "dist/index.js",
-  "types": "dist/index.d.ts",
+  "type": "module",
+  "main": "src/index.ts",
+  "types": "src/index.ts",
   "exports": {
-    ".": {
-      "types": "./dist/index.d.ts",
-      "development": "./src/index.ts",
-      "default": "./dist/index.js"
-    }
+    ".": "./src/index.ts"
   },
   "publishConfig": {
     "exports": {
@@ -25,7 +22,9 @@
         "types": "./dist/index.d.ts",
         "default": "./dist/index.js"
       }
-    }
+    },
+    "main": "dist/index.js",
+    "types": "dist/index.d.ts"
   },
   "scripts": {
     "build:node": "vite build",
@@ -35,15 +34,13 @@
     "typecheck": "tsgo -b"
   },
   "dependencies": {
-    "google-protobuf": "^3.21.4",
+    "@bufbuild/protobuf": "^2.11.0",
     "murmurhash": "^2.0.1"
   },
   "devDependencies": {
-    "@types/google-protobuf": "3.15.12",
+    "@bufbuild/protoc-gen-es": "^2.11.0",
     "@typescript/native-preview": "beta",
-    "protoc-gen-js": "3.21.4-4",
     "rollup-plugin-visualizer": "^7.0.1",
-    "ts-protoc-gen": "0.15.0",
     "vite": "^8.0.5",
     "vitest": "^4.1.2"
   }


@@ -1,6 +1,3 @@
-import './proto/sync_pb.js'; // Import for side effects
-import type * as SyncPb from './proto/sync_pb';
-
 export {
   merkle,
   getClock,
@@ -13,16 +10,17 @@ export {
   Timestamp,
 } from './crdt';
-declare global {
-  var proto: typeof SyncPb;
-}
-
-const { proto } = globalThis;
-
-export const SyncRequest = proto.SyncRequest;
-export const SyncResponse = proto.SyncResponse;
-export const Message = proto.Message;
-export const MessageEnvelope = proto.MessageEnvelope;
-export const EncryptedData = proto.EncryptedData;
-export const SyncProtoBuf = proto;
+export {
+  type EncryptedData,
+  type Message,
+  type MessageEnvelope,
+  type SyncRequest,
+  type SyncResponse,
+  EncryptedDataSchema,
+  MessageSchema,
+  MessageEnvelopeSchema,
+  SyncRequestSchema,
+  SyncResponseSchema,
+} from './proto/sync_pb';
+
+export { create, fromBinary, toBinary } from '@bufbuild/protobuf';


@@ -21,6 +21,7 @@ message MessageEnvelope {
 }
 message SyncRequest {
+  reserved 4;
   repeated MessageEnvelope messages = 1;
   string fileId = 2;
   string groupId = 3;

File diff suppressed because it is too large


@@ -1,217 +1,161 @@
/* oxlint-disable typescript/no-namespace */
// package:
// file: sync.proto
// @generated by protoc-gen-es v2.11.0 with parameter "target=ts"
// @generated from file sync.proto (syntax proto3)
/* eslint-disable */
import * as jspb from 'google-protobuf';
import type { Message as Message$1 } from '@bufbuild/protobuf';
import type { GenFile, GenMessage } from '@bufbuild/protobuf/codegenv2';
import { fileDesc, messageDesc } from '@bufbuild/protobuf/codegenv2';
export declare class EncryptedData extends jspb.Message {
getIv(): Uint8Array | string;
getIv_asU8(): Uint8Array;
getIv_asB64(): string;
setIv(value: Uint8Array | string): void;
/**
* Describes the file sync.proto.
*/
export const file_sync: GenFile /*@__PURE__*/ = fileDesc(
'CgpzeW5jLnByb3RvIjoKDUVuY3J5cHRlZERhdGESCgoCaXYYASABKAwSDwoHYXV0aFRhZxgCIAEoDBIMCgRkYXRhGAMgASgMIkYKB01lc3NhZ2USDwoHZGF0YXNldBgBIAEoCRILCgNyb3cYAiABKAkSDgoGY29sdW1uGAMgASgJEg0KBXZhbHVlGAQgASgJIkoKD01lc3NhZ2VFbnZlbG9wZRIRCgl0aW1lc3RhbXAYASABKAkSEwoLaXNFbmNyeXB0ZWQYAiABKAgSDwoHY29udGVudBgDIAEoDCJ2CgtTeW5jUmVxdWVzdBIiCghtZXNzYWdlcxgBIAMoCzIQLk1lc3NhZ2VFbnZlbG9wZRIOCgZmaWxlSWQYAiABKAkSDwoHZ3JvdXBJZBgDIAEoCRINCgVrZXlJZBgFIAEoCRINCgVzaW5jZRgGIAEoCUoECAQQBSJCCgxTeW5jUmVzcG9uc2USIgoIbWVzc2FnZXMYASADKAsyEC5NZXNzYWdlRW52ZWxvcGUSDgoGbWVya2xlGAIgASgJYgZwcm90bzM',
);
getAuthtag(): Uint8Array | string;
getAuthtag_asU8(): Uint8Array;
getAuthtag_asB64(): string;
setAuthtag(value: Uint8Array | string): void;
/**
* @generated from message EncryptedData
*/
export type EncryptedData = Message$1<'EncryptedData'> & {
/**
* @generated from field: bytes iv = 1;
*/
iv: Uint8Array;
getData(): Uint8Array | string;
getData_asU8(): Uint8Array;
getData_asB64(): string;
setData(value: Uint8Array | string): void;
/**
* @generated from field: bytes authTag = 2;
*/
authTag: Uint8Array;
serializeBinary(): Uint8Array;
toObject(includeInstance?: boolean): EncryptedData.AsObject;
static toObject(
includeInstance: boolean,
msg: EncryptedData,
): EncryptedData.AsObject;
static extensions: { [key: number]: jspb.ExtensionFieldInfo<jspb.Message> };
static extensionsBinary: {
[key: number]: jspb.ExtensionFieldBinaryInfo<jspb.Message>;
};
static serializeBinaryToWriter(
message: EncryptedData,
writer: jspb.BinaryWriter,
): void;
static deserializeBinary(bytes: Uint8Array): EncryptedData;
static deserializeBinaryFromReader(
message: EncryptedData,
reader: jspb.BinaryReader,
): EncryptedData;
}
/**
* @generated from field: bytes data = 3;
*/
data: Uint8Array;
};
export namespace EncryptedData {
export type AsObject = {
iv: Uint8Array | string;
authtag: Uint8Array | string;
data: Uint8Array | string;
};
}
/**
* Describes the message EncryptedData.
* Use `create(EncryptedDataSchema)` to create a new message.
*/
export const EncryptedDataSchema: GenMessage<EncryptedData> /*@__PURE__*/ =
messageDesc(file_sync, 0);
export declare class Message extends jspb.Message {
getDataset(): string;
setDataset(value: string): void;
/**
* @generated from message Message
*/
export type Message = Message$1<'Message'> & {
/**
* @generated from field: string dataset = 1;
*/
dataset: string;
getRow(): string;
setRow(value: string): void;
/**
* @generated from field: string row = 2;
*/
row: string;
getColumn(): string;
setColumn(value: string): void;
/**
* @generated from field: string column = 3;
*/
column: string;
getValue(): string;
setValue(value: string): void;
/**
* @generated from field: string value = 4;
*/
value: string;
};
serializeBinary(): Uint8Array;
toObject(includeInstance?: boolean): Message.AsObject;
static toObject(includeInstance: boolean, msg: Message): Message.AsObject;
static extensions: { [key: number]: jspb.ExtensionFieldInfo<jspb.Message> };
static extensionsBinary: {
[key: number]: jspb.ExtensionFieldBinaryInfo<jspb.Message>;
};
static serializeBinaryToWriter(
message: Message,
writer: jspb.BinaryWriter,
): void;
static deserializeBinary(bytes: Uint8Array): Message;
static deserializeBinaryFromReader(
message: Message,
reader: jspb.BinaryReader,
): Message;
}
/**
* Describes the message Message.
* Use `create(MessageSchema)` to create a new message.
*/
export const MessageSchema: GenMessage<Message> /*@__PURE__*/ = messageDesc(
file_sync,
1,
);
export namespace Message {
export type AsObject = {
dataset: string;
row: string;
column: string;
value: string;
};
}
/**
* @generated from message MessageEnvelope
*/
export type MessageEnvelope = Message$1<'MessageEnvelope'> & {
/**
* @generated from field: string timestamp = 1;
*/
timestamp: string;
export declare class MessageEnvelope extends jspb.Message {
getTimestamp(): string;
setTimestamp(value: string): void;
/**
* @generated from field: bool isEncrypted = 2;
*/
isEncrypted: boolean;
getIsencrypted(): boolean;
setIsencrypted(value: boolean): void;
/**
* @generated from field: bytes content = 3;
*/
content: Uint8Array;
};
getContent(): Uint8Array | string;
getContent_asU8(): Uint8Array;
getContent_asB64(): string;
setContent(value: Uint8Array | string): void;
/**
* Describes the message MessageEnvelope.
* Use `create(MessageEnvelopeSchema)` to create a new message.
*/
export const MessageEnvelopeSchema: GenMessage<MessageEnvelope> /*@__PURE__*/ =
messageDesc(file_sync, 2);
serializeBinary(): Uint8Array;
toObject(includeInstance?: boolean): MessageEnvelope.AsObject;
static toObject(
includeInstance: boolean,
msg: MessageEnvelope,
): MessageEnvelope.AsObject;
static extensions: { [key: number]: jspb.ExtensionFieldInfo<jspb.Message> };
static extensionsBinary: {
[key: number]: jspb.ExtensionFieldBinaryInfo<jspb.Message>;
};
static serializeBinaryToWriter(
message: MessageEnvelope,
writer: jspb.BinaryWriter,
): void;
static deserializeBinary(bytes: Uint8Array): MessageEnvelope;
static deserializeBinaryFromReader(
message: MessageEnvelope,
reader: jspb.BinaryReader,
): MessageEnvelope;
}
/**
* @generated from message SyncRequest
*/
export type SyncRequest = Message$1<'SyncRequest'> & {
/**
* @generated from field: repeated MessageEnvelope messages = 1;
*/
messages: MessageEnvelope[];
export namespace MessageEnvelope {
export type AsObject = {
timestamp: string;
isencrypted: boolean;
content: Uint8Array | string;
};
}
/**
* @generated from field: string fileId = 2;
*/
fileId: string;
export declare class SyncRequest extends jspb.Message {
clearMessagesList(): void;
getMessagesList(): Array<MessageEnvelope>;
setMessagesList(value: Array<MessageEnvelope>): void;
addMessages(value?: MessageEnvelope, index?: number): MessageEnvelope;
/**
* @generated from field: string groupId = 3;
*/
groupId: string;
getFileid(): string;
setFileid(value: string): void;
/**
* @generated from field: string keyId = 5;
*/
keyId: string;
getGroupid(): string;
setGroupid(value: string): void;
/**
* @generated from field: string since = 6;
*/
since: string;
};
getKeyid(): string;
setKeyid(value: string): void;
/**
* Describes the message SyncRequest.
* Use `create(SyncRequestSchema)` to create a new message.
*/
export const SyncRequestSchema: GenMessage<SyncRequest> /*@__PURE__*/ =
messageDesc(file_sync, 3);
getSince(): string;
setSince(value: string): void;
/**
* @generated from message SyncResponse
*/
export type SyncResponse = Message$1<'SyncResponse'> & {
/**
* @generated from field: repeated MessageEnvelope messages = 1;
*/
messages: MessageEnvelope[];
serializeBinary(): Uint8Array;
toObject(includeInstance?: boolean): SyncRequest.AsObject;
static toObject(
includeInstance: boolean,
msg: SyncRequest,
): SyncRequest.AsObject;
static extensions: { [key: number]: jspb.ExtensionFieldInfo<jspb.Message> };
static extensionsBinary: {
[key: number]: jspb.ExtensionFieldBinaryInfo<jspb.Message>;
};
static serializeBinaryToWriter(
message: SyncRequest,
writer: jspb.BinaryWriter,
): void;
static deserializeBinary(bytes: Uint8Array): SyncRequest;
static deserializeBinaryFromReader(
message: SyncRequest,
reader: jspb.BinaryReader,
): SyncRequest;
}
/**
* @generated from field: string merkle = 2;
*/
merkle: string;
};
export namespace SyncRequest {
export type AsObject = {
messagesList: Array<MessageEnvelope.AsObject>;
fileid: string;
groupid: string;
keyid: string;
since: string;
};
}
export declare class SyncResponse extends jspb.Message {
clearMessagesList(): void;
getMessagesList(): Array<MessageEnvelope>;
setMessagesList(value: Array<MessageEnvelope>): void;
addMessages(value?: MessageEnvelope, index?: number): MessageEnvelope;
getMerkle(): string;
setMerkle(value: string): void;
serializeBinary(): Uint8Array;
toObject(includeInstance?: boolean): SyncResponse.AsObject;
static toObject(
includeInstance: boolean,
msg: SyncResponse,
): SyncResponse.AsObject;
static extensions: { [key: number]: jspb.ExtensionFieldInfo<jspb.Message> };
static extensionsBinary: {
[key: number]: jspb.ExtensionFieldBinaryInfo<jspb.Message>;
};
static serializeBinaryToWriter(
message: SyncResponse,
writer: jspb.BinaryWriter,
): void;
static deserializeBinary(bytes: Uint8Array): SyncResponse;
static deserializeBinaryFromReader(
message: SyncResponse,
reader: jspb.BinaryReader,
): SyncResponse;
}
export namespace SyncResponse {
export type AsObject = {
messagesList: Array<MessageEnvelope.AsObject>;
merkle: string;
};
}
/**
* Describes the message SyncResponse.
* Use `create(SyncResponseSchema)` to create a new message.
*/
export const SyncResponseSchema: GenMessage<SyncResponse> /*@__PURE__*/ =
messageDesc(file_sync, 4);


@@ -4,8 +4,8 @@
     "rootDir": "./src",
     "composite": true,
     "target": "ES2021",
-    "module": "NodeNext",
-    "moduleResolution": "NodeNext",
+    "module": "ES2022",
+    "moduleResolution": "bundler",
     "noEmit": false,
     "emitDeclarationOnly": true,
     "declaration": true,


@@ -6,7 +6,7 @@ import { defineConfig } from 'vite';
 export default defineConfig({
   ssr: {
     noExternal: true,
-    external: ['google-protobuf', 'murmurhash'],
+    external: ['@bufbuild/protobuf', 'murmurhash'],
   },
   build: {
     ssr: true,
@@ -16,7 +16,7 @@ export default defineConfig({
     sourcemap: true,
     lib: {
       entry: path.resolve(__dirname, 'src/index.ts'),
-      formats: ['cjs'],
+      formats: ['es'],
       fileName: () => 'index.js',
     },
   },


@@ -239,12 +239,15 @@ async function startSyncServer() {
     ),
   };
-  const serverPath = path.join(
-    // require.resolve will recursively search up the workspace for the module
-    path.dirname(require.resolve('@actual-app/sync-server/package.json')),
-    'build',
-    'app.js',
-  );
+  // require.resolve will recursively search up the workspace for the module
+  const syncServerRoot = path.dirname(
+    require.resolve('@actual-app/sync-server/package.json'),
+  );
+  // start.mjs registers sync-server's TS-source loader before importing
+  // build/app.js. We can't use Node's --import flag here because Electron's
+  // utilityProcess.fork accepts execArgv but doesn't actually preload the
+  // module, so we register imperatively from a bootstrap entry instead.
+  const serverPath = path.join(syncServerRoot, 'start.mjs');
   const webRoot = path.join(
     // require.resolve will recursively search up the workspace for the module


@@ -181,7 +181,6 @@
     "csv-parse": "^6.2.1",
     "csv-stringify": "^6.7.0",
     "date-fns": "^4.1.0",
-    "google-protobuf": "^3.21.4",
     "handlebars": "^4.7.9",
     "hyperformula": "^3.2.0",
     "lodash": "^4.18.1",


@@ -1,5 +1,4 @@
 // @ts-strict-ignore
-import './polyfills';
 import * as asyncStorage from '#platform/server/asyncStorage';
 import * as connection from '#platform/server/connection';
 import * as fs from '#platform/server/fs';


@@ -1,26 +0,0 @@
// Polyfills for browser/web worker environment
import * as jspb from 'google-protobuf';
if (typeof globalThis !== 'undefined') {
// Add a basic require polyfill for CommonJS modules
if (typeof globalThis.require === 'undefined') {
// @ts-expect-error - we're creating a minimal require implementation
globalThis.require = (moduleId: string) => {
switch (moduleId) {
case 'google-protobuf':
return jspb;
default:
throw new Error(
`Module not found: ${moduleId}. Add to polyfills if needed.`,
);
}
};
}
}
// Also set on global for compatibility
if (typeof global !== 'undefined') {
if (typeof global.require === 'undefined') {
global.require = globalThis.require;
}
}


@@ -1,5 +1,15 @@
// @ts-strict-ignore
import { SyncProtoBuf, Timestamp } from '@actual-app/crdt';
import {
create,
EncryptedDataSchema,
fromBinary,
MessageEnvelopeSchema,
MessageSchema,
SyncRequestSchema,
SyncResponseSchema,
Timestamp,
toBinary,
} from '@actual-app/crdt';
import { logger } from '#platform/server/log';
import * as encryption from '#server/encryption';
@@ -26,23 +36,27 @@ export async function encode(
messages: Message[],
): Promise<Uint8Array> {
const { encryptKeyId } = prefs.getPrefs();
const requestPb = new SyncProtoBuf.SyncRequest();
const requestPb = create(SyncRequestSchema, {
groupId,
fileId,
keyId: encryptKeyId ?? '',
since: since.toString(),
});
for (let i = 0; i < messages.length; i++) {
const msg = messages[i];
const envelopePb = new SyncProtoBuf.MessageEnvelope();
envelopePb.setTimestamp(msg.timestamp.toString());
const messagePb = new SyncProtoBuf.Message();
messagePb.setDataset(msg.dataset);
messagePb.setRow(msg.row);
messagePb.setColumn(msg.column);
messagePb.setValue(msg.value as string);
const binaryMsg = messagePb.serializeBinary();
for (const msg of messages) {
const binaryMsg = toBinary(
MessageSchema,
create(MessageSchema, {
dataset: msg.dataset,
row: msg.row,
column: msg.column,
value: msg.value as string,
}),
);
let content: Uint8Array;
let isEncrypted: boolean;
if (encryptKeyId) {
const encrypted = new SyncProtoBuf.EncryptedData();
let result;
try {
result = await encryption.encrypt(binaryMsg, encryptKeyId);
@@ -52,25 +66,30 @@ export async function encode(
});
}
encrypted.setData(result.value);
encrypted.setIv(Buffer.from(result.meta.iv, 'base64'));
encrypted.setAuthtag(Buffer.from(result.meta.authTag, 'base64'));
envelopePb.setContent(encrypted.serializeBinary());
envelopePb.setIsencrypted(true);
content = toBinary(
EncryptedDataSchema,
create(EncryptedDataSchema, {
data: result.value,
iv: Buffer.from(result.meta.iv, 'base64'),
authTag: Buffer.from(result.meta.authTag, 'base64'),
}),
);
isEncrypted = true;
} else {
envelopePb.setContent(binaryMsg);
content = binaryMsg;
isEncrypted = false;
}
requestPb.addMessages(envelopePb);
requestPb.messages.push(
create(MessageEnvelopeSchema, {
timestamp: msg.timestamp.toString(),
content,
isEncrypted,
}),
);
}
requestPb.setGroupid(groupId);
requestPb.setFileid(fileId);
requestPb.setKeyid(encryptKeyId);
requestPb.setSince(since.toString());
return requestPb.serializeBinary();
return toBinary(SyncRequestSchema, requestPb);
}
export async function decode(
@@ -78,29 +97,23 @@ export async function decode(
): Promise<{ messages: Message[]; merkle: { hash: number } }> {
const { encryptKeyId } = prefs.getPrefs();
const responsePb = SyncProtoBuf.SyncResponse.deserializeBinary(data);
const merkle = JSON.parse(responsePb.getMerkle());
const list = responsePb.getMessagesList();
const responsePb = fromBinary(SyncResponseSchema, data);
const merkle = JSON.parse(responsePb.merkle);
const messages = [];
for (let i = 0; i < list.length; i++) {
const envelopePb = list[i];
const timestamp = Timestamp.parse(envelopePb.getTimestamp());
const encrypted = envelopePb.getIsencrypted();
for (const envelopePb of responsePb.messages) {
let msg;
if (encrypted) {
const binary = SyncProtoBuf.EncryptedData.deserializeBinary(
envelopePb.getContent() as Uint8Array,
);
if (envelopePb.isEncrypted) {
const binary = fromBinary(EncryptedDataSchema, envelopePb.content);
let decrypted;
try {
decrypted = await encryption.decrypt(coerceBuffer(binary.getData()), {
decrypted = await encryption.decrypt(coerceBuffer(binary.data), {
keyId: encryptKeyId,
algorithm: 'aes-256-gcm',
iv: coerceBuffer(binary.getIv()),
authTag: coerceBuffer(binary.getAuthtag()),
iv: coerceBuffer(binary.iv),
authTag: coerceBuffer(binary.authTag),
});
} catch (e) {
logger.log(e);
@@ -109,19 +122,17 @@ export async function decode(
});
}
msg = SyncProtoBuf.Message.deserializeBinary(decrypted);
msg = fromBinary(MessageSchema, decrypted);
} else {
-msg = SyncProtoBuf.Message.deserializeBinary(
-envelopePb.getContent() as Uint8Array,
-);
+msg = fromBinary(MessageSchema, envelopePb.content);
}
messages.push({
-timestamp,
-dataset: msg.getDataset(),
-row: msg.getRow(),
-column: msg.getColumn(),
-value: msg.getValue(),
+timestamp: Timestamp.parse(envelopePb.timestamp),
+dataset: msg.dataset,
+row: msg.row,
+column: msg.column,
+value: msg.value,
});
}


@@ -1,5 +1,5 @@
// @ts-strict-ignore
-import { SyncProtoBuf } from '@actual-app/crdt';
+import { create, MessageSchema, toBinary } from '@actual-app/crdt';
import * as encryption from '#server/encryption';
@@ -8,12 +8,15 @@ function randomString() {
}
export async function makeTestMessage(keyId) {
-const messagePb = new SyncProtoBuf.Message();
-messagePb.setDataset(randomString());
-messagePb.setRow(randomString());
-messagePb.setColumn(randomString());
-messagePb.setValue(randomString());
-const binaryMsg = messagePb.serializeBinary();
+const binaryMsg = toBinary(
+MessageSchema,
+create(MessageSchema, {
+dataset: randomString(),
+row: randomString(),
+column: randomString(),
+value: randomString(),
+}),
+);
return await encryption.encrypt(binaryMsg, keyId);
}


@@ -1,5 +1,16 @@
// @ts-strict-ignore
-import { makeClock, merkle, SyncProtoBuf, Timestamp } from '@actual-app/crdt';
+import {
+create,
+fromBinary,
+makeClock,
+merkle,
+MessageEnvelopeSchema,
+MessageSchema,
+SyncRequestSchema,
+SyncResponseSchema,
+Timestamp,
+toBinary,
+} from '@actual-app/crdt';
import type { Clock } from '@actual-app/crdt';
import type { Message } from '#server/sync';
@@ -36,41 +47,41 @@ handlers['/'] = () => {
};
handlers['/sync/sync'] = async (data: Uint8Array): Promise<Uint8Array> => {
-const requestPb = SyncProtoBuf.SyncRequest.deserializeBinary(data);
-const since = requestPb.getSince();
-const messages = requestPb.getMessagesList();
+const requestPb = fromBinary(SyncRequestSchema, data);
-const newMessages = currentMessages.filter(msg => msg.timestamp > since);
+const newMessages = currentMessages.filter(
+msg => msg.timestamp > requestPb.since,
+);
-messages.forEach(msg => {
-if (!currentMessages.find(m => m.timestamp === msg.getTimestamp())) {
+requestPb.messages.forEach(msg => {
+if (!currentMessages.find(m => m.timestamp === msg.timestamp)) {
currentMessages.push({
-timestamp: msg.getTimestamp(),
-is_encrypted: msg.getIsencrypted(),
-content: msg.getContent_asU8(),
+timestamp: msg.timestamp,
+is_encrypted: msg.isEncrypted,
+content: msg.content,
});
currentClock.merkle = merkle.insert(
currentClock.merkle,
-Timestamp.parse(msg.getTimestamp()),
+Timestamp.parse(msg.timestamp),
);
}
});
currentClock.merkle = merkle.prune(currentClock.merkle);
-const responsePb = new SyncProtoBuf.SyncResponse();
-responsePb.setMerkle(JSON.stringify(currentClock.merkle));
-newMessages.forEach(msg => {
-const envelopePb = new SyncProtoBuf.MessageEnvelope();
-envelopePb.setTimestamp(msg.timestamp);
-envelopePb.setIsencrypted(msg.is_encrypted);
-envelopePb.setContent(msg.content);
-responsePb.addMessages(envelopePb);
+const responsePb = create(SyncResponseSchema, {
+merkle: JSON.stringify(currentClock.merkle),
+messages: newMessages.map(msg =>
+create(MessageEnvelopeSchema, {
+timestamp: msg.timestamp,
+isEncrypted: msg.is_encrypted,
+content: msg.content,
+}),
+),
});
-return responsePb.serializeBinary();
+return toBinary(SyncResponseSchema, responsePb);
};
handlers['/gocardless/accounts'] = () => {
@@ -96,14 +107,14 @@ export const getClock = (): Clock => {
export const getMessages = (): Message[] => {
return currentMessages.map(msg => {
const { timestamp, content } = msg;
-const fields = SyncProtoBuf.Message.deserializeBinary(content);
+const fields = fromBinary(MessageSchema, content);
return {
timestamp: Timestamp.parse(timestamp),
-dataset: fields.getDataset(),
-row: fields.getRow(),
-column: fields.getColumn(),
-value: deserializeValue(fields.getValue()),
+dataset: fields.dataset,
+row: fields.row,
+column: fields.column,
+value: deserializeValue(fields.value),
};
});
};


@@ -19,6 +19,14 @@ export async function resolve(specifier, context, nextResolve) {
return nextResolve(pathToFileURL(resolvedPath).href, context);
}
}
+// Fall back to <specifier>/index.<ext> for directory imports
+for (const ext of extensions) {
+const resolvedPath = nodeResolve(parentDir, specifier, `index${ext}`);
+if (existsSync(resolvedPath)) {
+return nextResolve(pathToFileURL(resolvedPath).href, context);
+}
+}
}
}


@@ -13,6 +13,9 @@
},
"files": [
"build",
+"loader.mjs",
+"register-loader.mjs",
+"start.mjs",
"LICENSE",
"README.md"
],
@@ -68,8 +71,8 @@
}
},
"scripts": {
-"start": "yarn build && node build/app",
-"start-monitor": "nodemon --exec 'yarn build && node build/app' --ignore './build/**/*' --ext 'ts,js' build/app",
+"start": "yarn build && node --import ./register-loader.mjs build/app",
+"start-monitor": "nodemon --exec 'yarn build && node --import ./register-loader.mjs build/app' --ignore './build/**/*' --ext 'ts,js' build/app",
"build": "tsgo -b && yarn add-import-extensions && yarn copy-static-assets",
"typecheck": "tsgo -b && tsc-strict",
"add-import-extensions": "node bin/add-import-extensions.mjs",
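The `--import ./register-loader.mjs` flag presupposes a small registration shim. The shim itself is not shown in this diff; a plausible sketch (file names and the guard are assumptions, not the repo's actual code) would install the resolve hook via `node:module`'s `register()`, available since Node 20.6:

```javascript
// Hypothetical register-loader.mjs: installs loader.mjs's resolve hook
// on Node's hooks thread so every subsequent import in the process,
// including build/app.js's module graph, goes through it. The existsSync
// guard is only here so this sketch runs standalone without loader.mjs.
import { register } from 'node:module';
import { existsSync } from 'node:fs';
import { fileURLToPath } from 'node:url';

const loader = new URL('./loader.mjs', import.meta.url);
if (existsSync(fileURLToPath(loader))) {
  register(loader, import.meta.url);
}
```

Because `--import` evaluates this module before the entry point, the hook is active before `build/app` starts resolving its imports.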


@@ -2,7 +2,7 @@
import crypto from 'node:crypto';
import fs from 'node:fs';
-import { SyncProtoBuf } from '@actual-app/crdt';
+import { create, SyncRequestSchema, toBinary } from '@actual-app/crdt';
import request from 'supertest';
import { getAccountDb } from './account-db';
@@ -1342,7 +1342,9 @@ describe('/sync', () => {
'group-id',
'key-id',
);
-syncRequest.setSince(undefined);
+// proto3 default-value semantics: '' is omitted on the wire and
+// decoded back to '', which the handler falsy-checks as missing.
+syncRequest.since = '';
const res = await sendSyncRequest(syncRequest);
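The comment's claim is easy to see in plain JS: proto3 scalar strings default to `''`, so an unset `since` decodes to the empty string, and the handler's `|| null` normalization treats it as missing. A minimal illustration (the object literal stands in for a decoded SyncRequest; this is not the real protobuf API):

```javascript
// Decoded proto3 message with `since` never set: the field is omitted
// on the wire and materializes as the default '' after decoding.
const decoded = { since: '', fileId: 'file-id', groupId: 'group-id' };

// The handler's normalization: '' is falsy, so it becomes null and the
// request is routed to the 422 "since required" branch.
const since = decoded.since || null;
console.log(since); // → null
```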
@@ -1487,17 +1489,16 @@ function addMockFile(
}
function createMinimalSyncRequest(fileId, groupId, keyId) {
-const syncRequest = new SyncProtoBuf.SyncRequest();
-syncRequest.setFileid(fileId);
-syncRequest.setGroupid(groupId);
-syncRequest.setKeyid(keyId);
-syncRequest.setSince('2024-01-01T00:00:00.000Z');
-syncRequest.setMessagesList([]);
-return syncRequest;
+return create(SyncRequestSchema, {
+fileId,
+groupId,
+keyId,
+since: '2024-01-01T00:00:00.000Z',
+});
}
async function sendSyncRequest(syncRequest, token = 'valid-token') {
-const serializedRequest = syncRequest.serializeBinary();
+const serializedRequest = toBinary(SyncRequestSchema, syncRequest);
// Convert Uint8Array to Buffer
const bufferRequest = Buffer.from(serializedRequest);


@@ -3,7 +3,13 @@ import { Buffer } from 'node:buffer';
import fs from 'node:fs/promises';
import { resolve } from 'node:path';
-import { SyncProtoBuf } from '@actual-app/crdt';
+import {
+create,
+fromBinary,
+SyncRequestSchema,
+SyncResponseSchema,
+toBinary,
+} from '@actual-app/crdt';
import type { Request, Response } from 'express';
import express from 'express';
@@ -124,7 +130,7 @@ function requireFileAccess(file: File, userId: string) {
app.post('/sync', async (req, res): Promise<void> => {
let requestPb;
try {
-requestPb = SyncProtoBuf.SyncRequest.deserializeBinary(req.body);
+requestPb = fromBinary(SyncRequestSchema, req.body);
} catch (e) {
console.log('Error parsing sync request', e);
res.status(500);
@@ -132,11 +138,11 @@ app.post('/sync', async (req, res): Promise<void> => {
return;
}
-const fileId = requestPb.getFileid() || null;
-const groupId = requestPb.getGroupid() || null;
-const keyId = requestPb.getKeyid() || null;
-const since = requestPb.getSince() || null;
-const messages = requestPb.getMessagesList();
+const fileId = requestPb.fileId || null;
+const groupId = requestPb.groupId || null;
+const keyId = requestPb.keyId || null;
+const since = requestPb.since || null;
+const messages = requestPb.messages;
if (!since) {
res.status(422).send({
@@ -176,14 +182,14 @@ app.post('/sync', async (req, res): Promise<void> => {
const { trie, newMessages } = simpleSync.sync(messages, since, groupId);
// encode it back...
-const responsePb = new SyncProtoBuf.SyncResponse();
-responsePb.setMerkle(JSON.stringify(trie));
-newMessages.forEach(msg => responsePb.addMessages(msg));
+const responsePb = create(SyncResponseSchema, {
+merkle: JSON.stringify(trie),
+messages: newMessages,
+});
res.set('Content-Type', 'application/actual-sync');
res.set('X-ACTUAL-SYNC-METHOD', 'simple');
-res.send(Buffer.from(responsePb.serializeBinary()));
+res.send(Buffer.from(toBinary(SyncResponseSchema, responsePb)));
});
app.post('/user-get-key', (req, res) => {


@@ -1,7 +1,12 @@
import { existsSync, readFileSync } from 'node:fs';
import { join } from 'node:path';
-import { merkle, SyncProtoBuf, Timestamp } from '@actual-app/crdt';
+import {
+create,
+merkle,
+MessageEnvelopeSchema,
+Timestamp,
+} from '@actual-app/crdt';
import { openDatabase } from './db';
import { sqlDir } from './load-config';
@@ -31,15 +36,11 @@ function addMessages(db, messages) {
const info = db.mutate(
`INSERT OR IGNORE INTO messages_binary (timestamp, is_encrypted, content)
VALUES (?, ?, ?)`,
-[
-msg.getTimestamp(),
-msg.getIsencrypted() ? 1 : 0,
-Buffer.from(msg.getContent()),
-],
+[msg.timestamp, msg.isEncrypted ? 1 : 0, Buffer.from(msg.content)],
);
if (info.changes > 0) {
-trie = merkle.insert(trie, Timestamp.parse(msg.getTimestamp()));
+trie = merkle.insert(trie, Timestamp.parse(msg.timestamp));
}
}
}
@@ -84,12 +85,12 @@ export function sync(messages, since, groupId) {
return {
trie,
-newMessages: newMessages.map(msg => {
-const envelopePb = new SyncProtoBuf.MessageEnvelope();
-envelopePb.setTimestamp(msg.timestamp);
-envelopePb.setIsencrypted(msg.is_encrypted);
-envelopePb.setContent(msg.content);
-return envelopePb;
-}),
+newMessages: newMessages.map(msg =>
+create(MessageEnvelopeSchema, {
+timestamp: msg.timestamp,
+isEncrypted: msg.is_encrypted === 1,
+content: msg.content,
+}),
+),
};
}


@@ -0,0 +1,8 @@
+// Bootstrap entry point. Statically imports register-loader.mjs (which
+// installs the TS-source/extension/directory-index resolver), then
+// dynamic-imports the built app so its module graph is resolved through
+// that loader. Used in environments where Node's --import flag isn't
+// honored, e.g. Electron's utilityProcess.fork.
+import './register-loader.mjs';
+await import('./build/app.js');
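start.mjs leans on an ESM evaluation guarantee: a module's static imports are fully evaluated before the module's own body runs, so the loader is installed before the `await import()` starts resolving the app's graph. The guarantee can be demonstrated with `data:` URL modules standing in for the two files (an analogue, not the repo's code):

```javascript
// Two inline modules: "loader" records that it ran; "start" statically
// imports it, then records its own body running — the same shape as
// start.mjs's static import followed by a dynamic import.
globalThis.order = [];
const loaderUrl =
  'data:text/javascript,' +
  encodeURIComponent('globalThis.order.push("loader registered");');
const startUrl =
  'data:text/javascript,' +
  encodeURIComponent(
    `import ${JSON.stringify(loaderUrl)}; globalThis.order.push("app body");`,
  );
await import(startUrl);
console.log(globalThis.order); // → [ 'loader registered', 'app body' ]
```

This is why a bootstrap file works where `execArgv: ['--import', …]` does not: the ordering comes from module semantics, not from process flags the host may drop.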


@@ -0,0 +1,6 @@
+---
+category: Maintenance
+authors: [MatissJanis]
+---
+crdt: move from 'google-protobuf' to '@bufbuild/protobuf'

yarn.lock

@@ -124,7 +124,6 @@ __metadata:
date-fns: "npm:^4.1.0"
fake-indexeddb: "npm:^6.2.5"
fast-check: "npm:^4.6.0"
-google-protobuf: "npm:^3.21.4"
handlebars: "npm:^4.7.9"
hyperformula: "npm:^3.2.0"
i18next: "npm:^25.10.10"
@@ -159,13 +158,11 @@ __metadata:
version: 0.0.0-use.local
resolution: "@actual-app/crdt@workspace:packages/crdt"
dependencies:
-"@types/google-protobuf": "npm:3.15.12"
+"@bufbuild/protobuf": "npm:^2.11.0"
+"@bufbuild/protoc-gen-es": "npm:^2.11.0"
"@typescript/native-preview": "npm:beta"
-google-protobuf: "npm:^3.21.4"
murmurhash: "npm:^2.0.1"
-protoc-gen-js: "npm:3.21.4-4"
rollup-plugin-visualizer: "npm:^7.0.1"
-ts-protoc-gen: "npm:0.15.0"
vite: "npm:^8.0.5"
vitest: "npm:^4.1.2"
languageName: unknown
@@ -1950,6 +1947,41 @@ __metadata:
languageName: node
linkType: hard
+"@bufbuild/protobuf@npm:2.11.0, @bufbuild/protobuf@npm:^2.11.0":
+version: 2.11.0
+resolution: "@bufbuild/protobuf@npm:2.11.0"
+checksum: 10/dddab84c2dc92f15b467449dc9d951b9aef6ea335dba448f8d4028f9b52fdb790d3b856a1dceb4dbcfe7f182072f0d1cd6ce05b2a95ff40132eea6a428e84883
+languageName: node
+linkType: hard
+"@bufbuild/protoc-gen-es@npm:^2.11.0":
+version: 2.11.0
+resolution: "@bufbuild/protoc-gen-es@npm:2.11.0"
+dependencies:
+"@bufbuild/protobuf": "npm:2.11.0"
+"@bufbuild/protoplugin": "npm:2.11.0"
+peerDependencies:
+"@bufbuild/protobuf": 2.11.0
+peerDependenciesMeta:
+"@bufbuild/protobuf":
+optional: true
+bin:
+protoc-gen-es: bin/protoc-gen-es
+checksum: 10/9dbf56ceb7b99b2c36a11a389eac441ef19c3aacb8b412b2a87fe8f6676679ef178329e585b26cbd1319c6b23261c256499578b7b16cfa52866377995037ac74
+languageName: node
+linkType: hard
+"@bufbuild/protoplugin@npm:2.11.0":
+version: 2.11.0
+resolution: "@bufbuild/protoplugin@npm:2.11.0"
+dependencies:
+"@bufbuild/protobuf": "npm:2.11.0"
+"@typescript/vfs": "npm:^1.6.2"
+typescript: "npm:5.4.5"
+checksum: 10/271ab6bb33a6632bf6f4730c1788da00a8679f24f29990a4c04a22558defd89dffa481a3d51312ced7e385ebc0a1dc656506decfd7796c72c8e70a8102d7c4df
+languageName: node
+linkType: hard
"@chevrotain/cst-dts-gen@npm:11.0.3":
version: 11.0.3
resolution: "@chevrotain/cst-dts-gen@npm:11.0.3"
@@ -9733,13 +9765,6 @@ __metadata:
languageName: node
linkType: hard
-"@types/google-protobuf@npm:3.15.12":
-version: 3.15.12
-resolution: "@types/google-protobuf@npm:3.15.12"
-checksum: 10/a5c5f09a3fc4bc6a9339df29f4a32daf77c37f2bce6e8aa7b949fae19829a87c351786b7401eb45ea643dfa98d5155ffd9dd637c3ec61f69a30979bd67f6954e
-languageName: node
-linkType: hard
"@types/gtag.js@npm:^0.0.20":
version: 0.0.20
resolution: "@types/gtag.js@npm:0.0.20"
@@ -10450,6 +10475,17 @@ __metadata:
languageName: node
linkType: hard
+"@typescript/vfs@npm:^1.6.2":
+version: 1.6.4
+resolution: "@typescript/vfs@npm:1.6.4"
+dependencies:
+debug: "npm:^4.4.3"
+peerDependencies:
+typescript: "*"
+checksum: 10/3a301cdc950f7b3bc3b21164d355d9d23ec00b6888ca77c83193afd950b9f5d30ec2de61c7473c99518fc47b51e1c69592eab253072a35781b48e5096b11692a
+languageName: node
+linkType: hard
"@uiw/codemirror-extensions-basic-setup@npm:4.25.9":
version: 4.25.9
resolution: "@uiw/codemirror-extensions-basic-setup@npm:4.25.9"
@@ -17140,13 +17176,6 @@ __metadata:
languageName: node
linkType: hard
-"google-protobuf@npm:^3.15.5, google-protobuf@npm:^3.21.4":
-version: 3.21.4
-resolution: "google-protobuf@npm:3.21.4"
-checksum: 10/0d87fe8ef221d105cbaa808f4024bd577638524d8e461469e3733f2e4933391ad4da86b7fcbd11e8781bee04eacf2e8ba19aaacd5f9deb336a220485841d980f
-languageName: node
-linkType: hard
"gopd@npm:^1.0.1, gopd@npm:^1.2.0":
version: 1.2.0
resolution: "gopd@npm:1.2.0"
@@ -23850,17 +23879,6 @@ __metadata:
languageName: node
linkType: hard
-"protoc-gen-js@npm:3.21.4-4":
-version: 3.21.4-4
-resolution: "protoc-gen-js@npm:3.21.4-4"
-dependencies:
-adm-zip: "npm:0.5.10"
-bin:
-protoc-gen-js: cli.js
-checksum: 10/43d5469195fb28ecbb6e2f387601c22d7a032aaffd5eadee10e02ca4a6b6361bfb54e83fe220b6eef36a27011421f3a0ef86e163116273d71ab9158d31980c3b
-languageName: node
-linkType: hard
"proxy-addr@npm:^2.0.7, proxy-addr@npm:~2.0.7":
version: 2.0.7
resolution: "proxy-addr@npm:2.0.7"
@@ -27625,17 +27643,6 @@ __metadata:
languageName: node
linkType: hard
-"ts-protoc-gen@npm:0.15.0":
-version: 0.15.0
-resolution: "ts-protoc-gen@npm:0.15.0"
-dependencies:
-google-protobuf: "npm:^3.15.5"
-bin:
-protoc-gen-ts: bin/protoc-gen-ts
-checksum: 10/de1d526b47886e1994995836b1e1e5765193f476ee0aded067a2e4154d60df0bd8fb33a1f56654a1ea779576c158c184b7beca36e59f620c9382217040f91d64
-languageName: node
-linkType: hard
"tsconfig-paths@npm:^4.2.0":
version: 4.2.0
resolution: "tsconfig-paths@npm:4.2.0"
@@ -27827,6 +27834,16 @@ __metadata:
languageName: node
linkType: hard
+"typescript@npm:5.4.5":
+version: 5.4.5
+resolution: "typescript@npm:5.4.5"
+bin:
+tsc: bin/tsc
+tsserver: bin/tsserver
+checksum: 10/d04a9e27e6d83861f2126665aa8d84847e8ebabcea9125b9ebc30370b98cb38b5dff2508d74e2326a744938191a83a69aa9fddab41f193ffa43eabfdf3f190a5
+languageName: node
+linkType: hard
"typescript@npm:^5.0.4":
version: 5.9.3
resolution: "typescript@npm:5.9.3"
@@ -27847,6 +27864,16 @@ __metadata:
languageName: node
linkType: hard
+"typescript@patch:typescript@npm%3A5.4.5#optional!builtin<compat/typescript>":
+version: 5.4.5
+resolution: "typescript@patch:typescript@npm%3A5.4.5#optional!builtin<compat/typescript>::version=5.4.5&hash=5adc0c"
+bin:
+tsc: bin/tsc
+tsserver: bin/tsserver
+checksum: 10/760f7d92fb383dbf7dee2443bf902f4365db2117f96f875cf809167f6103d55064de973db9f78fe8f31ec08fff52b2c969aee0d310939c0a3798ec75d0bca2e1
+languageName: node
+linkType: hard
"typescript@patch:typescript@npm%3A^5.0.4#optional!builtin<compat/typescript>":
version: 5.9.3
resolution: "typescript@patch:typescript@npm%3A5.9.3#optional!builtin<compat/typescript>::version=5.9.3&hash=5786d5"