Add E2E testing (#201)

* feat: add E2E testing infrastructure with fake GitHub, Playwright, and CI workflow

- Add fake GitHub API server (tests/e2e/fake-github-server.ts) with
  management API for seeding test data
- Add Playwright E2E test suite covering full mirror workflow:
  service health checks, user registration, config, sync, verify
- Add Docker Compose for E2E Gitea instance
- Add orchestrator script (run-e2e.sh) with cleanup
- Add GitHub Actions workflow (e2e-tests.yml) with Gitea service container
- Make GITHUB_API_URL configurable via env var for testing
- Add npm scripts: test:e2e, test:e2e:ci, test:e2e:keep, test:e2e:cleanup
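The env-var override in the last bullet can be illustrated in isolation. In this PR's code the lookup prefers GH_API_URL over GITHUB_API_URL (GitHub Actions pre-sets the latter), falling back to the public endpoint; the helper function name below is illustrative, not part of the codebase:

```shell
# Resolve the GitHub API base URL the same way the client code does:
# GH_API_URL first, then GITHUB_API_URL, then the public endpoint.
resolve_github_api_url() {
  echo "${GH_API_URL:-${GITHUB_API_URL:-https://api.github.com}}"
}

unset GH_API_URL GITHUB_API_URL
resolve_github_api_url                                   # → https://api.github.com
GH_API_URL=http://localhost:4580 resolve_github_api_url  # → http://localhost:4580
```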

* feat: add real git repos + backup config testing to E2E suite

- Create programmatic test git repos (create-test-repos.ts) with real
  commits, branches (main, develop, feature/*), and tags (v1.0.0, v1.1.0)
- Add git-server container to docker-compose serving bare repos via
  dumb HTTP protocol so Gitea can actually clone them
- Update fake GitHub server to emit reachable clone_url fields pointing
  to the git-server container (configurable via GIT_SERVER_URL env var)
- Add management endpoint POST /___mgmt/set-clone-url for runtime config
- Update E2E spec with real mirroring verification:
  * Verify repos appear in Gitea with actual content
  * Check branches, tags, commits, file content
  * Verify 4/4 repos mirrored successfully
- Add backup configuration test suite:
  * Enable/disable backupBeforeSync config
  * Toggle blockSyncOnBackupFailure
  * Trigger re-sync with backup enabled and verify activities
  * Verify config persistence across changes
- Update CI workflow to use docker compose (not service containers)
  matching the local run-e2e.sh approach
- Update cleanup.sh for git-repos directory and git-server port
- All 22 tests passing with real git content verification
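The git-server half of this setup hinges on git's dumb HTTP protocol: any static file server can host a bare repo as long as `info/refs` is generated. A minimal local sketch of what create-test-repos.ts plus the container amount to (paths and the `demo.git` name are illustrative, not the actual fixtures):

```shell
# Build a bare repo that a dumb-HTTP static file server could expose.
set -e
work=$(mktemp -d)

# 1. A throwaway source repo with one commit.
git init -q "$work/src"
git -C "$work/src" -c user.email=e2e@test -c user.name=e2e \
  commit --allow-empty -q -m "initial commit"

# 2. Clone it bare: this directory is what the git-server would serve.
mkdir -p "$work/repos"
git clone -q --bare "$work/src" "$work/repos/demo.git"

# 3. Dumb HTTP has no server-side logic; clients fetch info/refs and pack
#    files as plain static files, so this index must be generated (and
#    regenerated after every update, e.g. via a post-update hook).
git -C "$work/repos/demo.git" update-server-info

cat "$work/repos/demo.git/info/refs"
```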

* refactor: split E2E tests into focused files + add force-push tests

Split the monolithic e2e.spec.ts (1335 lines) into 5 focused spec files
and a shared helpers module:

  helpers.ts                 — constants, GiteaAPI, auth, saveConfig, utilities
  01-health.spec.ts          — service health checks (4 tests)
  02-mirror-workflow.spec.ts — full first-mirror journey (8 tests)
  03-backup.spec.ts          — backup config toggling (6 tests)
  04-force-push.spec.ts      — force-push simulation & backup verification (9 tests)
  05-sync-verification.spec.ts — dynamic repos, content integrity, reset (5 tests)

The force-push tests are the critical addition:
  F0: Record original state (commit SHAs, file content)
  F1: Rewrite source repo history (simulate force-push)
  F2: Sync to Gitea WITHOUT backup
  F3: Verify data loss — LICENSE file gone, README overwritten
  F4: Restore source, re-mirror to clean state
  F5: Enable backup, force-push again, sync through app
  F6: Verify Gitea reflects the force-push
  F7: Verify backup system was invoked (snapshot activities logged)
  F8: Restore source repo for subsequent tests
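Step F1 reduces to rewriting a branch's history and force-updating the remote. A self-contained sketch of that simulation (repo layout illustrative, not the actual E2E fixtures):

```shell
set -e
work=$(mktemp -d)
git init -q --bare "$work/origin.git"
git clone -q "$work/origin.git" "$work/clone"
cd "$work/clone"

# Original history: one commit pushed to main.
git -c user.email=e2e@test -c user.name=e2e commit --allow-empty -q -m "v1"
git push -q origin HEAD:main
before=$(git rev-parse HEAD)

# Rewrite: a brand-new root commit replaces the old history entirely.
git checkout -q --orphan rewrite
git -c user.email=e2e@test -c user.name=e2e commit --allow-empty -q -m "rewritten"
git push -q --force origin HEAD:main   # would be rejected without --force
after=$(git rev-parse HEAD)

echo "history rewritten: $before -> $after"
```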

Also added to helpers.ts:
  - GiteaAPI.getBranch(), .getCommit(), .triggerMirrorSync()
  - getRepositoryIds(), triggerMirrorJobs(), triggerSyncRepo()

All 32 tests passing.

* Try to fix actions

* Try to fix the other action

* Add debug info to check why e2e action is failing

* More debug info

* Even more debug info

* E2E fix attempt #1

* E2E fix attempt #2

* more debug again

* E2E fix attempt #3

* E2E fix attempt #4

* Remove a bunch of debug info

* Hopefully fix backup bug

* Force backups to succeed
Author: Xyndra
Date: 2026-03-01 03:05:13 +01:00
Committed by: GitHub
Parent: 61841dd7a5
Commit: 2e00a610cb
19 changed files with 5365 additions and 59 deletions

.github/workflows/e2e-tests.yml

@@ -0,0 +1,281 @@
name: E2E Integration Tests

on:
  push:
    branches: ["*"]
    paths-ignore:
      - "README.md"
      - "docs/**"
      - "CHANGELOG.md"
      - "LICENSE"
  pull_request:
    branches: ["*"]
    paths-ignore:
      - "README.md"
      - "docs/**"
      - "CHANGELOG.md"
      - "LICENSE"
  workflow_dispatch:
    inputs:
      debug_enabled:
        description: "Enable debug logging"
        required: false
        default: "false"
        type: boolean

permissions:
  contents: read
  actions: read

concurrency:
  group: e2e-${{ github.ref }}
  cancel-in-progress: true

env:
  GITEA_PORT: 3333
  FAKE_GITHUB_PORT: 4580
  GIT_SERVER_PORT: 4590
  APP_PORT: 4321
  BUN_VERSION: "1.3.6"

jobs:
  e2e-tests:
    name: E2E Integration Tests
    runs-on: ubuntu-latest
    timeout-minutes: 25
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup Bun
        uses: oven-sh/setup-bun@v1
        with:
          bun-version: ${{ env.BUN_VERSION }}

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "22"

      - name: Install dependencies
        run: |
          bun install
          echo "✓ Dependencies installed"

      - name: Install Playwright
        run: |
          npx playwright install chromium
          npx playwright install-deps chromium
          echo "✓ Playwright ready"

      - name: Create test git repositories
        run: |
          echo "Creating bare git repos for E2E testing..."
          bun run tests/e2e/create-test-repos.ts --output-dir tests/e2e/git-repos
          if [ ! -f tests/e2e/git-repos/manifest.json ]; then
            echo "ERROR: Test git repos were not created (manifest.json missing)"
            exit 1
          fi
          echo "✓ Test repos created:"
          cat tests/e2e/git-repos/manifest.json | jq -r '.repos[] | " • \(.owner)/\(.name) — \(.description)"'

      - name: Start Gitea and git-server containers
        run: |
          echo "Starting containers via docker compose..."
          docker compose -f tests/e2e/docker-compose.e2e.yml up -d

          # Wait for git-server
          echo "Waiting for git HTTP server..."
          for i in $(seq 1 30); do
            if curl -sf http://localhost:${{ env.GIT_SERVER_PORT }}/manifest.json > /dev/null 2>&1; then
              echo "✓ Git HTTP server is ready"
              break
            fi
            if [ $i -eq 30 ]; then
              echo "ERROR: Git HTTP server did not start"
              docker compose -f tests/e2e/docker-compose.e2e.yml logs git-server
              exit 1
            fi
            sleep 1
          done

          # Wait for Gitea
          echo "Waiting for Gitea to be ready..."
          for i in $(seq 1 60); do
            if curl -sf http://localhost:${{ env.GITEA_PORT }}/api/v1/version > /dev/null 2>&1; then
              version=$(curl -sf http://localhost:${{ env.GITEA_PORT }}/api/v1/version | jq -r '.version // "unknown"')
              echo "✓ Gitea is ready (version: $version)"
              break
            fi
            if [ $i -eq 60 ]; then
              echo "ERROR: Gitea did not become healthy within 120s"
              docker compose -f tests/e2e/docker-compose.e2e.yml logs gitea-e2e --tail=30
              exit 1
            fi
            sleep 2
          done

      - name: Initialize database
        run: |
          bun run manage-db init
          echo "✓ Database initialized"

      - name: Build application
        env:
          GH_API_URL: http://localhost:4580
          BETTER_AUTH_SECRET: e2e-test-secret
        run: |
          bun run build
          echo "✓ Build complete"

      - name: Start fake GitHub API server
        run: |
          # Start with GIT_SERVER_URL pointing to the git-server container name
          # (Gitea will resolve it via Docker networking)
          PORT=${{ env.FAKE_GITHUB_PORT }} GIT_SERVER_URL="http://git-server" \
            npx tsx tests/e2e/fake-github-server.ts &
          echo $! > /tmp/fake-github.pid

          echo "Waiting for fake GitHub API..."
          for i in $(seq 1 30); do
            if curl -sf http://localhost:${{ env.FAKE_GITHUB_PORT }}/___mgmt/health > /dev/null 2>&1; then
              echo "✓ Fake GitHub API is ready"
              break
            fi
            if [ $i -eq 30 ]; then
              echo "ERROR: Fake GitHub API did not start"
              exit 1
            fi
            sleep 1
          done

          # Ensure clone URLs are set for the git-server container
          curl -sf -X POST http://localhost:${{ env.FAKE_GITHUB_PORT }}/___mgmt/set-clone-url \
            -H "Content-Type: application/json" \
            -d '{"url": "http://git-server"}' || true
          echo "✓ Clone URLs configured for git-server container"

      - name: Start gitea-mirror application
        env:
          GH_API_URL: http://localhost:4580
          BETTER_AUTH_SECRET: e2e-test-secret
          BETTER_AUTH_URL: http://localhost:4321
          DATABASE_URL: file:data/gitea-mirror.db
          HOST: 0.0.0.0
          PORT: ${{ env.APP_PORT }}
          NODE_ENV: production
          PRE_SYNC_BACKUP_ENABLED: "false"
          ENCRYPTION_SECRET: "e2e-encryption-secret-32char!!"
        run: |
          # Re-init DB in case build step cleared it
          bun run manage-db init 2>/dev/null || true

          bun run start &
          echo $! > /tmp/app.pid

          echo "Waiting for gitea-mirror app..."
          for i in $(seq 1 90); do
            if curl -sf http://localhost:${{ env.APP_PORT }}/api/health > /dev/null 2>&1 || \
               curl -sf -o /dev/null -w "%{http_code}" http://localhost:${{ env.APP_PORT }}/ 2>/dev/null | grep -q "^[23]"; then
              echo "✓ gitea-mirror app is ready"
              break
            fi
            if ! kill -0 $(cat /tmp/app.pid) 2>/dev/null; then
              echo "ERROR: App process died"
              exit 1
            fi
            if [ $i -eq 90 ]; then
              echo "ERROR: gitea-mirror app did not start within 180s"
              exit 1
            fi
            sleep 2
          done

      - name: Run E2E tests
        env:
          APP_URL: http://localhost:${{ env.APP_PORT }}
          GITEA_URL: http://localhost:${{ env.GITEA_PORT }}
          FAKE_GITHUB_URL: http://localhost:${{ env.FAKE_GITHUB_PORT }}
          GIT_SERVER_URL: http://localhost:${{ env.GIT_SERVER_PORT }}
          CI: true
        run: |
          mkdir -p tests/e2e/test-results
          npx playwright test \
            --config tests/e2e/playwright.config.ts \
            --reporter=github,html

      - name: Diagnostic info on failure
        if: failure()
        run: |
          echo "═══════════════════════════════════════════════════════════"
          echo " Diagnostic Information"
          echo "═══════════════════════════════════════════════════════════"
          echo ""
          echo "── Git server status ──"
          curl -sf http://localhost:${{ env.GIT_SERVER_PORT }}/manifest.json 2>/dev/null | jq . || echo "(unreachable)"
          echo ""
          echo "── Gitea status ──"
          curl -sf http://localhost:${{ env.GITEA_PORT }}/api/v1/version 2>/dev/null || echo "(unreachable)"
          echo ""
          echo "── Fake GitHub status ──"
          curl -sf http://localhost:${{ env.FAKE_GITHUB_PORT }}/___mgmt/health 2>/dev/null | jq . || echo "(unreachable)"
          echo ""
          echo "── App status ──"
          curl -sf http://localhost:${{ env.APP_PORT }}/api/health 2>/dev/null || echo "(unreachable)"
          echo ""
          echo "── Docker containers ──"
          docker compose -f tests/e2e/docker-compose.e2e.yml ps 2>/dev/null || true
          echo ""
          echo "── Gitea container logs (last 50 lines) ──"
          docker compose -f tests/e2e/docker-compose.e2e.yml logs gitea-e2e --tail=50 2>/dev/null || echo "(no container)"
          echo ""
          echo "── Git server logs (last 20 lines) ──"
          docker compose -f tests/e2e/docker-compose.e2e.yml logs git-server --tail=20 2>/dev/null || echo "(no container)"
          echo ""
          echo "── Running processes ──"
          ps aux | grep -E "(fake-github|astro|bun|node)" | grep -v grep || true

      - name: Upload Playwright report
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: e2e-playwright-report
          path: tests/e2e/playwright-report/
          retention-days: 14

      - name: Upload test results
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: e2e-test-results
          path: tests/e2e/test-results/
          retention-days: 14

      - name: Cleanup
        if: always()
        run: |
          # Stop background processes
          if [ -f /tmp/fake-github.pid ]; then
            kill $(cat /tmp/fake-github.pid) 2>/dev/null || true
            rm -f /tmp/fake-github.pid
          fi
          if [ -f /tmp/app.pid ]; then
            kill $(cat /tmp/app.pid) 2>/dev/null || true
            rm -f /tmp/app.pid
          fi

          # Stop containers
          docker compose -f tests/e2e/docker-compose.e2e.yml down --volumes --remove-orphans 2>/dev/null || true
          echo "✓ Cleanup complete"

.gitignore

@@ -37,3 +37,18 @@ result
result-*
.direnv/
# E2E test artifacts
tests/e2e/test-results/
tests/e2e/playwright-report/
tests/e2e/.auth/
tests/e2e/e2e-storage-state.json
tests/e2e/.fake-github.pid
tests/e2e/.app.pid
tests/e2e/git-repos/
# Playwright
/test-results/
/playwright-report/
/blob-report/
/playwright/.cache/
/playwright/.auth/

@@ -35,7 +35,7 @@
"@types/canvas-confetti": "^1.9.0",
"@types/react": "^19.2.14",
"@types/react-dom": "^19.2.3",
"astro": "^5.17.3",
"astro": "^5.18.0",
"bcryptjs": "^3.0.3",
"better-auth": "1.4.19",
"buffer": "^6.0.3",
@@ -63,11 +63,13 @@
"zod": "^4.3.6",
},
"devDependencies": {
"@playwright/test": "^1.58.2",
"@testing-library/jest-dom": "^6.9.1",
"@testing-library/react": "^16.3.2",
"@types/bcryptjs": "^3.0.0",
"@types/bun": "^1.3.9",
"@types/jsonwebtoken": "^9.0.10",
"@types/node": "^25.3.2",
"@types/uuid": "^11.0.0",
"@vitejs/plugin-react": "^5.1.4",
"drizzle-kit": "^0.31.9",
@@ -362,6 +364,8 @@
"@oslojs/encoding": ["@oslojs/encoding@1.1.0", "", {}, "sha512-70wQhgYmndg4GCPxPPxPGevRKqTIJ2Nh4OkiMWmDAVYsTQ+Ta7Sq+rPevXyXGdzr30/qZBnyOalCszoMxlyldQ=="],
"@playwright/test": ["@playwright/test@1.58.2", "", { "dependencies": { "playwright": "1.58.2" }, "bin": { "playwright": "cli.js" } }, "sha512-akea+6bHYBBfA9uQqSYmlJXn61cTa+jbO87xVLCWbTqbWadRVmhxlXATaOjOgcBaWU4ePo0wB41KMFv3o35IXA=="],
"@radix-ui/number": ["@radix-ui/number@1.1.1", "", {}, "sha512-MkKCwxlXTgz6CFoJx3pCwn07GKp36+aZyu/u2Ln2VrA5DcdyCZkASEDBTd8x5whTQQL5CiYf4prXKLcgQdv29g=="],
"@radix-ui/primitive": ["@radix-ui/primitive@1.1.3", "", {}, "sha512-JTF99U/6XIjCBo0wqkU5sK10glYe27MRRsfwoiq5zzOEZLHU3A3KCMa5X/azekYRCJ0HlwI0crAXS/5dEHTzDg=="],
@@ -592,7 +596,7 @@
"@types/nlcst": ["@types/nlcst@2.0.3", "", { "dependencies": { "@types/unist": "*" } }, "sha512-vSYNSDe6Ix3q+6Z7ri9lyWqgGhJTmzRjZRqyq15N0Z/1/UnVsno9G/N40NBijoYx2seFDIl0+B2mgAb9mezUCA=="],
"@types/node": ["@types/node@22.15.23", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-7Ec1zaFPF4RJ0eXu1YT/xgiebqwqoJz8rYPDi/O2BcZ++Wpt0Kq9cl0eg6NN6bYbPnR67ZLo7St5Q3UK0SnARw=="],
"@types/node": ["@types/node@25.3.2", "", { "dependencies": { "undici-types": "~7.18.0" } }, "sha512-RpV6r/ij22zRRdyBPcxDeKAzH43phWVKEjL2iksqo1Vz3CuBUrgmPpPhALKiRfU7OMCmeeO9vECBMsV0hMTG8Q=="],
"@types/react": ["@types/react@19.2.14", "", { "dependencies": { "csstype": "^3.2.2" } }, "sha512-ilcTH/UniCkMdtexkoCN0bI7pMcJDvmQFPvuPvmEaYA/NSfFTAgdUSLAoVjaRJm7+6PvcM+q1zYOwS4wTYMF9w=="],
@@ -670,7 +674,7 @@
"astring": ["astring@1.9.0", "", { "bin": { "astring": "bin/astring" } }, "sha512-LElXdjswlqjWrPpJFg1Fx4wpkOCxj1TDHlSV4PlaRxHGWko024xICaa97ZkMfs6DRKlCguiAI+rbXv5GWwXIkg=="],
"astro": ["astro@5.17.3", "", { "dependencies": { "@astrojs/compiler": "^2.13.0", "@astrojs/internal-helpers": "0.7.5", "@astrojs/markdown-remark": "6.3.10", "@astrojs/telemetry": "3.3.0", "@capsizecss/unpack": "^4.0.0", "@oslojs/encoding": "^1.1.0", "@rollup/pluginutils": "^5.3.0", "acorn": "^8.15.0", "aria-query": "^5.3.2", "axobject-query": "^4.1.0", "boxen": "8.0.1", "ci-info": "^4.3.1", "clsx": "^2.1.1", "common-ancestor-path": "^1.0.1", "cookie": "^1.1.1", "cssesc": "^3.0.0", "debug": "^4.4.3", "deterministic-object-hash": "^2.0.2", "devalue": "^5.6.2", "diff": "^8.0.3", "dlv": "^1.1.3", "dset": "^3.1.4", "es-module-lexer": "^1.7.0", "esbuild": "^0.27.3", "estree-walker": "^3.0.3", "flattie": "^1.1.1", "fontace": "~0.4.0", "github-slugger": "^2.0.0", "html-escaper": "3.0.3", "http-cache-semantics": "^4.2.0", "import-meta-resolve": "^4.2.0", "js-yaml": "^4.1.1", "magic-string": "^0.30.21", "magicast": "^0.5.1", "mrmime": "^2.0.1", "neotraverse": "^0.6.18", "p-limit": "^6.2.0", "p-queue": "^8.1.1", "package-manager-detector": "^1.6.0", "piccolore": "^0.1.3", "picomatch": "^4.0.3", "prompts": "^2.4.2", "rehype": "^13.0.2", "semver": "^7.7.3", "shiki": "^3.21.0", "smol-toml": "^1.6.0", "svgo": "^4.0.0", "tinyexec": "^1.0.2", "tinyglobby": "^0.2.15", "tsconfck": "^3.1.6", "ultrahtml": "^1.6.0", "unifont": "~0.7.3", "unist-util-visit": "^5.0.0", "unstorage": "^1.17.4", "vfile": "^6.0.3", "vite": "^6.4.1", "vitefu": "^1.1.1", "xxhash-wasm": "^1.1.0", "yargs-parser": "^21.1.1", "yocto-spinner": "^0.2.3", "zod": "^3.25.76", "zod-to-json-schema": "^3.25.1", "zod-to-ts": "^1.2.0" }, "optionalDependencies": { "sharp": "^0.34.0" }, "bin": { "astro": "astro.js" } }, "sha512-69dcfPe8LsHzklwj+hl+vunWUbpMB6pmg35mACjetxbJeUNNys90JaBM8ZiwsPK689SAj/4Zqb1ayaANls9/MA=="],
"astro": ["astro@5.18.0", "", { "dependencies": { "@astrojs/compiler": "^2.13.0", "@astrojs/internal-helpers": "0.7.5", "@astrojs/markdown-remark": "6.3.10", "@astrojs/telemetry": "3.3.0", "@capsizecss/unpack": "^4.0.0", "@oslojs/encoding": "^1.1.0", "@rollup/pluginutils": "^5.3.0", "acorn": "^8.15.0", "aria-query": "^5.3.2", "axobject-query": "^4.1.0", "boxen": "8.0.1", "ci-info": "^4.3.1", "clsx": "^2.1.1", "common-ancestor-path": "^1.0.1", "cookie": "^1.1.1", "cssesc": "^3.0.0", "debug": "^4.4.3", "deterministic-object-hash": "^2.0.2", "devalue": "^5.6.2", "diff": "^8.0.3", "dlv": "^1.1.3", "dset": "^3.1.4", "es-module-lexer": "^1.7.0", "esbuild": "^0.27.3", "estree-walker": "^3.0.3", "flattie": "^1.1.1", "fontace": "~0.4.0", "github-slugger": "^2.0.0", "html-escaper": "3.0.3", "http-cache-semantics": "^4.2.0", "import-meta-resolve": "^4.2.0", "js-yaml": "^4.1.1", "magic-string": "^0.30.21", "magicast": "^0.5.1", "mrmime": "^2.0.1", "neotraverse": "^0.6.18", "p-limit": "^6.2.0", "p-queue": "^8.1.1", "package-manager-detector": "^1.6.0", "piccolore": "^0.1.3", "picomatch": "^4.0.3", "prompts": "^2.4.2", "rehype": "^13.0.2", "semver": "^7.7.3", "shiki": "^3.21.0", "smol-toml": "^1.6.0", "svgo": "^4.0.0", "tinyexec": "^1.0.2", "tinyglobby": "^0.2.15", "tsconfck": "^3.1.6", "ultrahtml": "^1.6.0", "unifont": "~0.7.3", "unist-util-visit": "^5.0.0", "unstorage": "^1.17.4", "vfile": "^6.0.3", "vite": "^6.4.1", "vitefu": "^1.1.1", "xxhash-wasm": "^1.1.0", "yargs-parser": "^21.1.1", "yocto-spinner": "^0.2.3", "zod": "^3.25.76", "zod-to-json-schema": "^3.25.1", "zod-to-ts": "^1.2.0" }, "optionalDependencies": { "sharp": "^0.34.0" }, "bin": { "astro": "astro.js" } }, "sha512-CHiohwJIS4L0G6/IzE1Fx3dgWqXBCXus/od0eGUfxrZJD2um2pE7ehclMmgL/fXqbU7NfE1Ze2pq34h2QaA6iQ=="],
"axobject-query": ["axobject-query@4.1.0", "", {}, "sha512-qIj0G9wZbMGNLjLmg1PT6v2mE9AH2zlnADJD/2tC6E00hgmhUOfEB6greHPAfLRSufHqROIUTkw6E+M3lH0PTQ=="],
@@ -1278,6 +1282,10 @@
"picomatch": ["picomatch@4.0.3", "", {}, "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q=="],
"playwright": ["playwright@1.58.2", "", { "dependencies": { "playwright-core": "1.58.2" }, "optionalDependencies": { "fsevents": "2.3.2" }, "bin": { "playwright": "cli.js" } }, "sha512-vA30H8Nvkq/cPBnNw4Q8TWz1EJyqgpuinBcHET0YVJVFldr8JDNiU9LaWAE1KqSkRYazuaBhTpB5ZzShOezQ6A=="],
"playwright-core": ["playwright-core@1.58.2", "", { "bin": { "playwright-core": "cli.js" } }, "sha512-yZkEtftgwS8CsfYo7nm0KE8jsvm6i/PTgVtB8DL726wNf6H2IMsDuxCpJj59KDaxCtSnrWan2AeDqM7JBaultg=="],
"postcss": ["postcss@8.5.3", "", { "dependencies": { "nanoid": "^3.3.8", "picocolors": "^1.1.1", "source-map-js": "^1.2.1" } }, "sha512-dle9A3yYxlBSrt8Fu+IpjGT8SY8hN0mlaA6GY8t0P5PjIOZemULz/E2Bnm/2dcUOena75OTNkHI76uZBNUUq3A=="],
"prettier": ["prettier@3.7.4", "", { "bin": { "prettier": "bin/prettier.cjs" } }, "sha512-v6UNi1+3hSlVvv8fSaoUbggEM5VErKmmpGA7Pl3HF8V6uKY7rvClBOJlH6yNwQtfTueNkGVpOv/mtWL9L4bgRA=="],
@@ -1502,7 +1510,7 @@
"undici": ["undici@7.22.0", "", {}, "sha512-RqslV2Us5BrllB+JeiZnK4peryVTndy9Dnqq62S3yYRRTj0tFQCwEniUy2167skdGOy3vqRzEvl1Dm4sV2ReDg=="],
"undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="],
"undici-types": ["undici-types@7.18.2", "", {}, "sha512-AsuCzffGHJybSaRrmr5eHr81mwJU3kjw6M+uprWvCXiNeN9SOGwQ3Jn8jb8m3Z6izVgknn1R0FTCEAP2QrLY/w=="],
"unified": ["unified@11.0.5", "", { "dependencies": { "@types/unist": "^3.0.0", "bail": "^2.0.0", "devlop": "^1.0.0", "extend": "^3.0.0", "is-plain-obj": "^4.0.0", "trough": "^2.0.0", "vfile": "^6.0.0" } }, "sha512-xKvGhPWw3k84Qjh8bI3ZeJjqnyadK+GEFtazSfZv/rKeTkTjOJho6mFqh2SM96iIcZokxiOpg78GazTSg8+KHA=="],
@@ -1742,6 +1750,8 @@
"@types/bcryptjs/bcryptjs": ["bcryptjs@3.0.2", "", { "bin": { "bcrypt": "bin/bcrypt" } }, "sha512-k38b3XOZKv60C4E2hVsXTolJWfkGRMbILBIe2IBITXciy5bOsTKot5kDrf3ZfufQtQOUN5mXceUEpU1rTl9Uog=="],
"@types/jsonwebtoken/@types/node": ["@types/node@22.15.23", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-7Ec1zaFPF4RJ0eXu1YT/xgiebqwqoJz8rYPDi/O2BcZ++Wpt0Kq9cl0eg6NN6bYbPnR67ZLo7St5Q3UK0SnARw=="],
"anymatch/picomatch": ["picomatch@2.3.1", "", {}, "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA=="],
"astro/esbuild": ["esbuild@0.27.3", "", { "optionalDependencies": { "@esbuild/aix-ppc64": "0.27.3", "@esbuild/android-arm": "0.27.3", "@esbuild/android-arm64": "0.27.3", "@esbuild/android-x64": "0.27.3", "@esbuild/darwin-arm64": "0.27.3", "@esbuild/darwin-x64": "0.27.3", "@esbuild/freebsd-arm64": "0.27.3", "@esbuild/freebsd-x64": "0.27.3", "@esbuild/linux-arm": "0.27.3", "@esbuild/linux-arm64": "0.27.3", "@esbuild/linux-ia32": "0.27.3", "@esbuild/linux-loong64": "0.27.3", "@esbuild/linux-mips64el": "0.27.3", "@esbuild/linux-ppc64": "0.27.3", "@esbuild/linux-riscv64": "0.27.3", "@esbuild/linux-s390x": "0.27.3", "@esbuild/linux-x64": "0.27.3", "@esbuild/netbsd-arm64": "0.27.3", "@esbuild/netbsd-x64": "0.27.3", "@esbuild/openbsd-arm64": "0.27.3", "@esbuild/openbsd-x64": "0.27.3", "@esbuild/openharmony-arm64": "0.27.3", "@esbuild/sunos-x64": "0.27.3", "@esbuild/win32-arm64": "0.27.3", "@esbuild/win32-ia32": "0.27.3", "@esbuild/win32-x64": "0.27.3" }, "bin": { "esbuild": "bin/esbuild" } }, "sha512-8VwMnyGCONIs6cWue2IdpHxHnAjzxnw2Zr7MkVxB2vjmQ2ivqGFb4LEG3SMnv0Gb2F/G/2yA8zUaiL1gywDCCg=="],
@@ -1754,6 +1764,8 @@
"boxen/string-width": ["string-width@7.2.0", "", { "dependencies": { "emoji-regex": "^10.3.0", "get-east-asian-width": "^1.0.0", "strip-ansi": "^7.1.0" } }, "sha512-tsaTIkKW9b4N+AEj+SVA+WhJzV7/zMhcSu78mLKWSk7cXMOSHsBKFWUs0fWwq8QyK3MgJBQRX6Gbi4kYbdvGkQ=="],
"bun-types/@types/node": ["@types/node@22.15.23", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-7Ec1zaFPF4RJ0eXu1YT/xgiebqwqoJz8rYPDi/O2BcZ++Wpt0Kq9cl0eg6NN6bYbPnR67ZLo7St5Q3UK0SnARw=="],
"cliui/wrap-ansi": ["wrap-ansi@7.0.0", "", { "dependencies": { "ansi-styles": "^4.0.0", "string-width": "^4.1.0", "strip-ansi": "^6.0.0" } }, "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q=="],
"cmdk/@radix-ui/react-dialog": ["@radix-ui/react-dialog@1.1.14", "", { "dependencies": { "@radix-ui/primitive": "1.1.2", "@radix-ui/react-compose-refs": "1.1.2", "@radix-ui/react-context": "1.1.2", "@radix-ui/react-dismissable-layer": "1.1.10", "@radix-ui/react-focus-guards": "1.1.2", "@radix-ui/react-focus-scope": "1.1.7", "@radix-ui/react-id": "1.1.1", "@radix-ui/react-portal": "1.1.9", "@radix-ui/react-presence": "1.1.4", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-slot": "1.2.3", "@radix-ui/react-use-controllable-state": "1.2.2", "aria-hidden": "^1.2.4", "react-remove-scroll": "^2.6.3" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-+CpweKjqpzTmwRwcYECQcNYbI8V9VSQt0SNFKeEBLgfucbsLssU6Ppq7wUdNXEGb573bMjFhVjKVll8rmV6zMw=="],
@@ -1794,6 +1806,8 @@
"parse-entities/@types/unist": ["@types/unist@2.0.11", "", {}, "sha512-CmBKiL6NNo/OqgmMn95Fk9Whlp2mtvIv+KNpQKN2F4SjvrEesubTRWGYSg+BnWZOnlCaSTU1sMpsBOzgbYhnsA=="],
"playwright/fsevents": ["fsevents@2.3.2", "", { "os": "darwin" }, "sha512-xiqMQR4xAeHTuB9uWm+fFRcIOgKBMiOBP+eXiyT7jsgVCq1bkVygt00oASowB7EdtpOHaaPgKt812P9ab+DDKA=="],
"pretty-format/ansi-styles": ["ansi-styles@5.2.0", "", {}, "sha512-Cxwpt2SfTzTtXcfOlzGEee8O+c+MmUgGrNiBcXnuWxuFJHe6a5Hz7qwhwe5OgaSYI0IJvkLqWX1ASG+cJOkEiA=="],
"prompts/kleur": ["kleur@3.0.3", "", {}, "sha512-eTIzlVOSUR+JxdDFepEYcBMtZ9Qqdef+rnzWdRZuMbOywu5tO2w2N7rqjoANZ5k9vywhL6Br1VRjUIgTQx4E8w=="],
@@ -1920,6 +1934,8 @@
"@tailwindcss/node/lightningcss/lightningcss-win32-x64-msvc": ["lightningcss-win32-x64-msvc@1.31.1", "", { "os": "win32", "cpu": "x64" }, "sha512-I9aiFrbd7oYHwlnQDqr1Roz+fTz61oDDJX7n9tYF9FJymH1cIN1DtKw3iYt6b8WZgEjoNwVSncwF4wx/ZedMhw=="],
"@types/jsonwebtoken/@types/node/undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="],
"astro/esbuild/@esbuild/aix-ppc64": ["@esbuild/aix-ppc64@0.27.3", "", { "os": "aix", "cpu": "ppc64" }, "sha512-9fJMTNFTWZMh5qwrBItuziu834eOCUcEqymSH7pY+zoMVEZg3gcPuBNxH1EvfVYe9h0x/Ptw8KBzv7qxb7l8dg=="],
"astro/esbuild/@esbuild/android-arm": ["@esbuild/android-arm@0.27.3", "", { "os": "android", "cpu": "arm" }, "sha512-i5D1hPY7GIQmXlXhs2w8AWHhenb00+GxjxRncS2ZM7YNVGNfaMxgzSGuO8o8SJzRc/oZwU2bcScvVERk03QhzA=="],
@@ -1976,6 +1992,8 @@
"boxen/string-width/strip-ansi": ["strip-ansi@7.1.0", "", { "dependencies": { "ansi-regex": "^6.0.1" } }, "sha512-iq6eVVI64nQQTRYq2KtEg2d2uU7LElhTJwsH4YzIHZshxlgZms/wIc4VoDQTlG/IvVIrBKG06CrZnp0qv7hkcQ=="],
"bun-types/@types/node/undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="],
"cmdk/@radix-ui/react-dialog/@radix-ui/primitive": ["@radix-ui/primitive@1.1.2", "", {}, "sha512-XnbHrrprsNqZKQhStrSwgRUQzoCI1glLzdw79xiZPoofhGICeZRSQ3dIxAKH1gb3OHfNf4d6f+vAv3kil2eggA=="],
"cmdk/@radix-ui/react-dialog/@radix-ui/react-dismissable-layer": ["@radix-ui/react-dismissable-layer@1.1.10", "", { "dependencies": { "@radix-ui/primitive": "1.1.2", "@radix-ui/react-compose-refs": "1.1.2", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-use-callback-ref": "1.1.1", "@radix-ui/react-use-escape-keydown": "1.1.1" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-IM1zzRV4W3HtVgftdQiiOmA0AdJlCtMLe00FXaHwgt3rAnNsIyDqshvkIW3hj/iu5hu8ERP7KIYki6NkqDxAwQ=="],

@@ -3,4 +3,7 @@
timeout = 5000
# Preload the setup file
preload = ["./src/tests/setup.bun.ts"]
preload = ["./src/tests/setup.bun.ts"]
# Only run tests in src/ directory (excludes tests/e2e/ which are Playwright tests)
root = "./src/"

@@ -36,6 +36,10 @@
"test": "bun test",
"test:watch": "bun test --watch",
"test:coverage": "bun test --coverage",
"test:e2e": "bash tests/e2e/run-e2e.sh",
"test:e2e:ci": "bash tests/e2e/run-e2e.sh --ci",
"test:e2e:keep": "bash tests/e2e/run-e2e.sh --keep",
"test:e2e:cleanup": "bash tests/e2e/cleanup.sh",
"astro": "bunx --bun astro"
},
"overrides": {
@@ -73,7 +77,7 @@
"@types/canvas-confetti": "^1.9.0",
"@types/react": "^19.2.14",
"@types/react-dom": "^19.2.3",
"astro": "^5.17.3",
"astro": "^5.18.0",
"bcryptjs": "^3.0.3",
"better-auth": "1.4.19",
"buffer": "^6.0.3",
@@ -101,11 +105,13 @@
"zod": "^4.3.6"
},
"devDependencies": {
"@playwright/test": "^1.58.2",
"@testing-library/jest-dom": "^6.9.1",
"@testing-library/react": "^16.3.2",
"@types/bcryptjs": "^3.0.0",
"@types/bun": "^1.3.9",
"@types/jsonwebtoken": "^9.0.10",
"@types/node": "^25.3.2",
"@types/uuid": "^11.0.0",
"@vitejs/plugin-react": "^5.1.4",
"drizzle-kit": "^0.31.9",

@@ -22,22 +22,30 @@ if (process.env.NODE_ENV !== "test") {
// Fallback to base Octokit if .plugin is not present
const MyOctokit: any = (Octokit as any)?.plugin?.call
? (Octokit as any).plugin(throttling)
: Octokit as any;
: (Octokit as any);
/**
* Creates an authenticated Octokit instance with rate limit tracking and throttling
*/
export function createGitHubClient(token: string, userId?: string, username?: string): Octokit {
export function createGitHubClient(
token: string,
userId?: string,
username?: string,
): Octokit {
// Create a proper User-Agent to identify our application
// This helps GitHub understand our traffic patterns and can provide better rate limits
const userAgent = username
? `gitea-mirror/3.5.4 (user:${username})`
const userAgent = username
? `gitea-mirror/3.5.4 (user:${username})`
: "gitea-mirror/3.5.4";
// Support GH_API_URL (preferred) or GITHUB_API_URL (may conflict with GitHub Actions)
// GitHub Actions sets GITHUB_API_URL to https://api.github.com by default
const baseUrl = process.env.GH_API_URL || process.env.GITHUB_API_URL || "https://api.github.com";
const octokit = new MyOctokit({
auth: token, // Always use token for authentication (5000 req/hr vs 60 for unauthenticated)
userAgent, // Identify our application and user
baseUrl: "https://api.github.com", // Explicitly set the API endpoint
baseUrl, // Configurable for E2E testing
log: {
debug: () => {},
info: console.log,
@@ -52,14 +60,19 @@ export function createGitHubClient(token: string, userId?: string, username?: st
},
},
throttle: {
onRateLimit: async (retryAfter: number, options: any, octokit: any, retryCount: number) => {
onRateLimit: async (
retryAfter: number,
options: any,
octokit: any,
retryCount: number,
) => {
const isSearch = options.url.includes("/search/");
const maxRetries = isSearch ? 5 : 3; // Search endpoints get more retries
console.warn(
`[GitHub] Rate limit hit for ${options.method} ${options.url}. Retry ${retryCount + 1}/${maxRetries}`
`[GitHub] Rate limit hit for ${options.method} ${options.url}. Retry ${retryCount + 1}/${maxRetries}`,
);
// Update rate limit status and notify UI (if available)
if (userId && RateLimitManager) {
await RateLimitManager.updateFromResponse(userId, {
@@ -68,7 +81,7 @@ export function createGitHubClient(token: string, userId?: string, username?: st
"x-ratelimit-reset": (Date.now() / 1000 + retryAfter).toString(),
});
}
if (userId && publishEvent) {
await publishEvent({
userId,
@@ -83,22 +96,29 @@ export function createGitHubClient(token: string, userId?: string, username?: st
},
});
}
// Retry with exponential backoff
if (retryCount < maxRetries) {
console.log(`[GitHub] Waiting ${retryAfter}s before retry...`);
return true;
}
// Max retries reached
console.error(`[GitHub] Max retries (${maxRetries}) reached for ${options.url}`);
console.error(
`[GitHub] Max retries (${maxRetries}) reached for ${options.url}`,
);
return false;
},
onSecondaryRateLimit: async (retryAfter: number, options: any, octokit: any, retryCount: number) => {
onSecondaryRateLimit: async (
retryAfter: number,
options: any,
octokit: any,
retryCount: number,
) => {
console.warn(
`[GitHub] Secondary rate limit hit for ${options.method} ${options.url}`
`[GitHub] Secondary rate limit hit for ${options.method} ${options.url}`,
);
// Update status and notify UI (if available)
if (userId && publishEvent) {
await publishEvent({
@@ -114,13 +134,15 @@ export function createGitHubClient(token: string, userId?: string, username?: st
},
});
}
// Retry up to 2 times for secondary rate limits
if (retryCount < 2) {
console.log(`[GitHub] Waiting ${retryAfter}s for secondary rate limit...`);
console.log(
`[GitHub] Waiting ${retryAfter}s for secondary rate limit...`,
);
return true;
}
return false;
},
// Throttle options to prevent hitting limits
@@ -129,50 +151,57 @@ export function createGitHubClient(token: string, userId?: string, username?: st
retryAfterBaseValue: 1000, // Base retry in ms
},
});
// Add additional rate limit tracking if userId is provided and RateLimitManager is available
// Add rate limit tracking hooks if userId is provided and RateLimitManager is available
if (userId && RateLimitManager) {
octokit.hook.after("request", async (response: any, options: any) => {
// Update rate limit from response headers
octokit.hook.after("request", async (response: any, _options: any) => {
if (response.headers) {
await RateLimitManager.updateFromResponse(userId, response.headers);
}
});
octokit.hook.error("request", async (error: any, options: any) => {
// Handle rate limit errors
if (error.status === 403 || error.status === 429) {
const message = error.message || "";
if (message.includes("rate limit") || message.includes("API rate limit")) {
console.error(`[GitHub] Rate limit error for user ${userId}: ${message}`);
if (
message.includes("rate limit") ||
message.includes("API rate limit")
) {
console.error(
`[GitHub] Rate limit error for user ${userId}: ${message}`,
);
// Update rate limit status from error response (if available)
if (error.response?.headers && RateLimitManager) {
await RateLimitManager.updateFromResponse(userId, error.response.headers);
await RateLimitManager.updateFromResponse(
userId,
error.response.headers,
);
}
// Create error event for UI (if available)
if (publishEvent) {
await publishEvent({
userId,
channel: "rate-limit",
payload: {
type: "error",
provider: "github",
error: message,
endpoint: options.url,
message: `Rate limit exceeded: ${message}`,
},
});
channel: "rate-limit",
payload: {
type: "error",
provider: "github",
error: message,
endpoint: options.url,
message: `Rate limit exceeded: ${message}`,
},
});
}
}
}
throw error;
});
}
return octokit;
}
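For context, the retry callbacks above follow the `@octokit/plugin-throttling` contract: returning `true` tells Octokit to retry after `retryAfter` seconds. A minimal sketch of the assumed options shape (retry counts and log text are illustrative, not the app's exact values):

```typescript
// Sketch of throttling options per the @octokit/plugin-throttling contract:
// each callback receives retryAfter (seconds); returning true retries.
const throttleOptions = {
  onRateLimit: (
    retryAfter: number,
    _options: unknown,
    _octokit: unknown,
    retryCount: number,
  ): boolean => {
    console.log(`[GitHub] Rate limit hit, waiting ${retryAfter}s`);
    return retryCount < 3; // assumed cap: retry up to 3 times
  },
  onSecondaryRateLimit: (retryAfter: number): boolean => {
    console.log(`[GitHub] Waiting ${retryAfter}s for secondary rate limit...`);
    return true; // always retry secondary (abuse) limits
  },
};
```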
@@ -213,7 +242,7 @@ export async function getGithubRepositories({
try {
const repos = await octokit.paginate(
octokit.repos.listForAuthenticatedUser,
{ per_page: 100 }
{ per_page: 100 },
);
const skipForks = config.githubConfig?.skipForks ?? false;
@@ -265,7 +294,7 @@ export async function getGithubRepositories({
throw new Error(
`Error fetching repositories: ${
error instanceof Error ? error.message : String(error)
}`
}`,
);
}
}
@@ -282,7 +311,7 @@ export async function getGithubStarredRepositories({
octokit.activity.listReposStarredByAuthenticatedUser,
{
per_page: 100,
}
},
);
return starredRepos.map((repo) => ({
@@ -326,7 +355,7 @@ export async function getGithubStarredRepositories({
throw new Error(
`Error fetching starred repositories: ${
error instanceof Error ? error.message : String(error)
}`
}`,
);
}
}
@@ -349,13 +378,15 @@ export async function getGithubOrganizations({
// Get excluded organizations from environment variable
const excludedOrgsEnv = process.env.GITHUB_EXCLUDED_ORGS;
const excludedOrgs = excludedOrgsEnv
? excludedOrgsEnv.split(',').map(org => org.trim().toLowerCase())
? excludedOrgsEnv.split(",").map((org) => org.trim().toLowerCase())
: [];
// Filter out excluded organizations
const filteredOrgs = orgs.filter(org => {
const filteredOrgs = orgs.filter((org) => {
if (excludedOrgs.includes(org.login.toLowerCase())) {
console.log(`Skipping organization ${org.login} - excluded via GITHUB_EXCLUDED_ORGS environment variable`);
console.log(
`Skipping organization ${org.login} - excluded via GITHUB_EXCLUDED_ORGS environment variable`,
);
return false;
}
return true;
@@ -381,7 +412,7 @@ export async function getGithubOrganizations({
createdAt: new Date(),
updatedAt: new Date(),
};
})
}),
);
return organizations;
@@ -389,7 +420,7 @@ export async function getGithubOrganizations({
throw new Error(
`Error fetching organizations: ${
error instanceof Error ? error.message : String(error)
}`
}`,
);
}
}
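The exclusion logic above can be isolated as a pure function, which makes the case-insensitive, comma-separated semantics of `GITHUB_EXCLUDED_ORGS` easy to see (the function name is ours, not the app's):

```typescript
// Pure sketch of the GITHUB_EXCLUDED_ORGS filter shown above: the env var
// is a comma-separated, case-insensitive list of org logins to skip.
function filterExcludedOrgs<T extends { login: string }>(
  orgs: T[],
  excludedOrgsEnv: string | undefined,
): T[] {
  const excluded = excludedOrgsEnv
    ? excludedOrgsEnv.split(",").map((o) => o.trim().toLowerCase())
    : [];
  return orgs.filter((org) => !excluded.includes(org.login.toLowerCase()));
}

console.log(filterExcludedOrgs([{ login: "Acme" }, { login: "Umbrella" }], "acme, wayne"));
```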
@@ -451,7 +482,7 @@ export async function getGithubOrganizationRepositories({
throw new Error(
`Error fetching organization repositories: ${
error instanceof Error ? error.message : String(error)
}`
}`,
);
}
}


@@ -126,10 +126,16 @@ export async function createPreSyncBundleBackup({
throw new Error("Decrypted Gitea token is required for pre-sync backup.");
}
const backupRoot =
let backupRoot =
config.giteaConfig?.backupDirectory?.trim() ||
process.env.PRE_SYNC_BACKUP_DIR?.trim() ||
path.join(process.cwd(), "data", "repo-backups");
// Ensure backupRoot is absolute - relative paths break git bundle creation
// because git runs with -C mirrorClonePath and interprets relative paths from there
if (!path.isAbsolute(backupRoot)) {
backupRoot = path.resolve(process.cwd(), backupRoot);
}
const retention = Math.max(
1,
Number.isFinite(config.giteaConfig?.backupRetentionCount)

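The absolute-path guard matters because `git -C <dir>` resolves relative paths against `<dir>`, not the app's working directory. A standalone sketch of the normalization (the helper name is ours; it mirrors the fallback chain above using only `node:path`):

```typescript
import * as path from "node:path";

// Hypothetical helper mirroring the guard above: a relative backup dir is
// resolved against the app's cwd so git -C cannot reinterpret it.
function resolveBackupRoot(configured: string | undefined, cwd: string): string {
  const root = configured?.trim() || path.join(cwd, "data", "repo-backups");
  return path.isAbsolute(root) ? root : path.resolve(cwd, root);
}

console.log(resolveBackupRoot("data/repo-backups", "/app"));
console.log(resolveBackupRoot("/var/backups", "/app"));
```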

@@ -0,0 +1,77 @@
/**
* 01 Service health checks.
*
* Quick smoke tests that confirm every service required by the E2E suite is
* reachable before the heavier workflow tests run.
*/
import { test, expect } from "@playwright/test";
import {
APP_URL,
GITEA_URL,
FAKE_GITHUB_URL,
GIT_SERVER_URL,
waitFor,
} from "./helpers";
test.describe("E2E: Service health checks", () => {
test("Fake GitHub API is running", async ({ request }) => {
const resp = await request.get(`${FAKE_GITHUB_URL}/___mgmt/health`);
expect(resp.ok()).toBeTruthy();
const data = await resp.json();
expect(data.status).toBe("ok");
expect(data.repos).toBeGreaterThan(0);
console.log(
`[Health] Fake GitHub: ${data.repos} repos, ${data.orgs} orgs, clone base: ${data.gitCloneBaseUrl ?? "default"}`,
);
});
test("Git HTTP server is running (serves test repos)", async ({
request,
}) => {
const resp = await request.get(`${GIT_SERVER_URL}/manifest.json`, {
failOnStatusCode: false,
});
expect(resp.ok(), "Git server should serve manifest.json").toBeTruthy();
const manifest = await resp.json();
expect(manifest.repos).toBeDefined();
expect(manifest.repos.length).toBeGreaterThan(0);
console.log(`[Health] Git server: serving ${manifest.repos.length} repos`);
for (const r of manifest.repos) {
console.log(`[Health] • ${r.owner}/${r.name} - ${r.description}`);
}
});
test("Gitea instance is running", async ({ request }) => {
await waitFor(
async () => {
const resp = await request.get(`${GITEA_URL}/api/v1/version`, {
failOnStatusCode: false,
});
return resp.ok();
},
{ timeout: 30_000, interval: 2_000, label: "Gitea healthy" },
);
const resp = await request.get(`${GITEA_URL}/api/v1/version`);
const data = await resp.json();
console.log(`[Health] Gitea version: ${data.version}`);
expect(data.version).toBeTruthy();
});
test("gitea-mirror app is running", async ({ request }) => {
await waitFor(
async () => {
const resp = await request.get(`${APP_URL}/`, {
failOnStatusCode: false,
});
return resp.status() < 500;
},
{ timeout: 60_000, interval: 2_000, label: "App healthy" },
);
const resp = await request.get(`${APP_URL}/`, {
failOnStatusCode: false,
});
console.log(`[Health] App status: ${resp.status()}`);
expect(resp.status()).toBeLessThan(500);
});
});
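The `waitFor` helper imported above lives in the shared helpers module, which this diff does not show. A minimal polling implementation matching the call sites' assumed signature (`timeout`, `interval`, `label`):

```typescript
// Assumed shape of the shared waitFor helper: poll an async predicate
// until it returns true, or throw a labeled error when the timeout elapses.
interface WaitForOptions {
  timeout: number;
  interval: number;
  label: string;
}

async function waitFor(
  predicate: () => Promise<boolean>,
  { timeout, interval, label }: WaitForOptions,
): Promise<void> {
  const deadline = Date.now() + timeout;
  while (Date.now() < deadline) {
    if (await predicate()) return;
    await new Promise((r) => setTimeout(r, interval));
  }
  throw new Error(`waitFor timed out after ${timeout}ms: ${label}`);
}
```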


@@ -0,0 +1,344 @@
/**
* 02 Main mirror workflow.
*
* Walks through the full first-time user journey:
* 1. Create Gitea admin user + API token
* 2. Create the mirror target organization
* 3. Register / sign-in to the gitea-mirror app
* 4. Save GitHub + Gitea configuration
* 5. Trigger a GitHub data sync (pull repo list from fake GitHub)
* 6. Trigger mirror jobs (push repos into Gitea)
* 7. Verify repos actually appeared in Gitea with real content
* 8. Verify mirror job activity and app state
*/
import { test, expect } from "@playwright/test";
import {
APP_URL,
GITEA_URL,
GITEA_MIRROR_ORG,
GiteaAPI,
getAppSessionCookies,
saveConfig,
waitFor,
getRepositoryIds,
triggerMirrorJobs,
} from "./helpers";
test.describe("E2E: Mirror workflow", () => {
let giteaApi: GiteaAPI;
let appCookies = "";
test.beforeAll(async () => {
giteaApi = new GiteaAPI(GITEA_URL);
});
test.afterAll(async () => {
await giteaApi.dispose();
});
test("Step 1: Setup Gitea admin user and token", async () => {
await giteaApi.ensureAdminUser();
const token = await giteaApi.createToken();
expect(token).toBeTruthy();
expect(token.length).toBeGreaterThan(10);
console.log(`[Setup] Gitea token acquired (length: ${token.length})`);
});
test("Step 2: Create mirror organization in Gitea", async () => {
await giteaApi.ensureOrg(GITEA_MIRROR_ORG);
const repos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
expect(Array.isArray(repos)).toBeTruthy();
console.log(
`[Setup] Org ${GITEA_MIRROR_ORG} exists with ${repos.length} repos`,
);
});
test("Step 3: Register and sign in to gitea-mirror app", async ({
request,
}) => {
appCookies = await getAppSessionCookies(request);
expect(appCookies).toBeTruthy();
console.log(
`[Auth] Session cookies acquired (length: ${appCookies.length})`,
);
const whoami = await request.get(`${APP_URL}/api/config`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
expect(
whoami.status(),
`Auth check returned ${whoami.status()}; cookies may be invalid`,
).not.toBe(401);
console.log(`[Auth] Auth check status: ${whoami.status()}`);
});
test("Step 4: Configure mirrors via API (backup disabled)", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const giteaToken = giteaApi.getTokenValue();
expect(giteaToken, "Gitea token should be set from Step 1").toBeTruthy();
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupBeforeSync: false,
blockSyncOnBackupFailure: false,
},
});
console.log("[Config] Configuration saved (backup disabled)");
});
test("Step 5: Trigger GitHub data sync (fetch repos from fake GitHub)", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const syncResp = await request.post(`${APP_URL}/api/sync`, {
headers: {
"Content-Type": "application/json",
Cookie: appCookies,
},
failOnStatusCode: false,
});
const status = syncResp.status();
console.log(`[Sync] GitHub sync response: ${status}`);
if (status >= 400) {
const body = await syncResp.text();
console.log(`[Sync] Error body: ${body}`);
}
expect(status, "Sync should not be unauthorized").not.toBe(401);
expect(status, "Sync should not return server error").toBeLessThan(500);
if (syncResp.ok()) {
const data = await syncResp.json();
console.log(
`[Sync] New repos: ${data.newRepositories ?? "?"}, new orgs: ${data.newOrganizations ?? "?"}`,
);
}
});
test("Step 6: Trigger mirror jobs (push repos to Gitea)", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
// Fetch repository IDs from the dashboard API
const { ids: repositoryIds, repos } = await getRepositoryIds(
request,
appCookies,
);
console.log(
`[Mirror] Found ${repositoryIds.length} repos to mirror: ${repos.map((r: any) => r.name).join(", ")}`,
);
if (repositoryIds.length === 0) {
// Fallback: try the github/repositories endpoint
const repoResp = await request.get(
`${APP_URL}/api/github/repositories`,
{
headers: { Cookie: appCookies },
failOnStatusCode: false,
},
);
if (repoResp.ok()) {
const repoData = await repoResp.json();
const fallbackRepos: any[] = Array.isArray(repoData)
? repoData
: (repoData.repositories ?? []);
repositoryIds.push(...fallbackRepos.map((r: any) => r.id));
console.log(
`[Mirror] Fallback: found ${repositoryIds.length} repos`,
);
}
}
expect(
repositoryIds.length,
"Should have at least one repository to mirror",
).toBeGreaterThan(0);
const status = await triggerMirrorJobs(
request,
appCookies,
repositoryIds,
30_000,
);
console.log(`[Mirror] Mirror job response: ${status}`);
expect(status, "Mirror job should not be unauthorized").not.toBe(401);
expect(status, "Mirror job should not return server error").toBeLessThan(
500,
);
});
test("Step 7: Verify repos were actually mirrored to Gitea", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
// Wait for mirror jobs to finish processing
await waitFor(
async () => {
const orgRepos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
console.log(
`[Verify] Gitea org repos so far: ${orgRepos.length} (${orgRepos.map((r: any) => r.name).join(", ")})`,
);
// We expect at least 3 repos (my-project, dotfiles, notes)
return orgRepos.length >= 3;
},
{
timeout: 90_000,
interval: 5_000,
label: "repos appear in Gitea",
},
);
const orgRepos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
const orgRepoNames = orgRepos.map((r: any) => r.name);
console.log(
`[Verify] Gitea org repos: ${orgRepoNames.join(", ")} (total: ${orgRepos.length})`,
);
// Check that at least the 3 personal repos are mirrored
for (const repoName of ["my-project", "dotfiles", "notes"]) {
expect(
orgRepoNames,
`Expected repo "${repoName}" to be mirrored into org ${GITEA_MIRROR_ORG}`,
).toContain(repoName);
}
// Verify my-project has actual content (branches, commits)
const myProjectBranches = await giteaApi.listBranches(
GITEA_MIRROR_ORG,
"my-project",
);
const branchNames = myProjectBranches.map((b: any) => b.name);
console.log(`[Verify] my-project branches: ${branchNames.join(", ")}`);
expect(branchNames, "main branch should exist").toContain("main");
// Verify we can read actual file content
const readmeContent = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"README.md",
);
expect(readmeContent, "README.md should have content").toBeTruthy();
expect(readmeContent).toContain("My Project");
console.log(
`[Verify] my-project README.md starts with: ${readmeContent?.substring(0, 50)}...`,
);
// Verify tags were mirrored
const tags = await giteaApi.listTags(GITEA_MIRROR_ORG, "my-project");
const tagNames = tags.map((t: any) => t.name);
console.log(`[Verify] my-project tags: ${tagNames.join(", ")}`);
if (tagNames.length > 0) {
expect(tagNames).toContain("v1.0.0");
}
// Verify commits exist
const commits = await giteaApi.listCommits(
GITEA_MIRROR_ORG,
"my-project",
);
console.log(`[Verify] my-project commits: ${commits.length}`);
expect(commits.length, "Should have multiple commits").toBeGreaterThan(0);
// Verify dotfiles repo has content
const bashrc = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"dotfiles",
".bashrc",
);
expect(bashrc, "dotfiles should contain .bashrc").toBeTruthy();
console.log("[Verify] dotfiles .bashrc verified");
});
test("Step 8: Verify mirror jobs and app state", async ({ request }) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
// Check activity log
const activitiesResp = await request.get(`${APP_URL}/api/activities`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (activitiesResp.ok()) {
const activities = await activitiesResp.json();
const jobs: any[] = Array.isArray(activities)
? activities
: (activities.jobs ?? activities.activities ?? []);
console.log(`[State] Activity/job records: ${jobs.length}`);
const mirrorJobs = jobs.filter(
(j: any) =>
j.status === "mirroring" ||
j.status === "failed" ||
j.status === "success" ||
j.status === "mirrored" ||
j.message?.includes("mirror") ||
j.message?.includes("Mirror"),
);
console.log(`[State] Mirror-related jobs: ${mirrorJobs.length}`);
for (const j of mirrorJobs.slice(0, 5)) {
console.log(
`[State] • ${j.repositoryName ?? "?"}: ${j.status} - ${j.message ?? ""}`,
);
}
}
// Check dashboard repos
const dashResp = await request.get(`${APP_URL}/api/dashboard`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (dashResp.ok()) {
const dashData = await dashResp.json();
const repos: any[] = dashData.repositories ?? [];
console.log(`[State] Dashboard repos: ${repos.length}`);
for (const r of repos) {
console.log(
`[State] • ${r.name}: status=${r.status}, mirrored=${r.mirroredLocation ?? "none"}`,
);
}
expect(repos.length, "Repos should exist in DB").toBeGreaterThan(0);
const succeeded = repos.filter(
(r: any) => r.status === "mirrored" || r.status === "success",
);
console.log(
`[State] Successfully mirrored repos: ${succeeded.length}/${repos.length}`,
);
}
// App should still be running
const healthResp = await request.get(`${APP_URL}/`, {
failOnStatusCode: false,
});
expect(
healthResp.status(),
"App should still be running after mirror attempts",
).toBeLessThan(500);
console.log(`[State] App health: ${healthResp.status()}`);
});
});
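The `getRepositoryIds` helper used in Steps 6 and 7 is defined in the shared module, not in this diff. Its assumed pure core, sketched from how the specs consume it (field names `id`, `name`, `status` and the optional status filter are inferred):

```typescript
// Hypothetical pure core of getRepositoryIds: filter the dashboard's
// repository list by status (when given) and collect the matching ids.
interface DashboardRepo {
  id: string;
  name: string;
  status?: string;
}

function extractRepositoryIds(
  repos: DashboardRepo[],
  filter?: { status?: string },
): { ids: string[]; repos: DashboardRepo[] } {
  const matched = filter?.status
    ? repos.filter((r) => r.status === filter.status)
    : repos;
  return { ids: matched.map((r) => r.id), repos: matched };
}
```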

tests/e2e/03-backup.spec.ts (new file, 305 lines)
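The backup path exercised in B3 snapshots a repo before syncing by creating a git bundle. A throwaway sketch of the underlying git operations (repo names and paths are illustrative, run via `execSync` as in the specs):

```typescript
import { execSync } from "node:child_process";
import { mkdtempSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

const run = (cmd: string): string => execSync(cmd, { encoding: "utf-8" }).trim();

const tmp = mkdtempSync(join(tmpdir(), "bundle-demo-"));
// A throwaway source repo standing in for a mirrored Gitea repo.
run(`git init -q ${join(tmp, "src")}`);
run(
  `git -C ${join(tmp, "src")} -c user.name=t -c user.email=t@e.local ` +
    `commit -q --allow-empty -m "initial"`,
);
// Mirror-clone it (as a sync job would) and bundle every ref into one file.
run(`git clone -q --mirror ${join(tmp, "src")} ${join(tmp, "mirror")}`);
run(`git -C ${join(tmp, "mirror")} bundle create ${join(tmp, "backup.bundle")} HEAD --all`);
// The bundle is self-contained: cloning it back recovers the old history.
run(`git clone -q ${join(tmp, "backup.bundle")} ${join(tmp, "restored")}`);
console.log(run(`git -C ${join(tmp, "restored")} log --oneline`));
```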

@@ -0,0 +1,305 @@
/**
* 03 Backup configuration tests.
*
* Exercises the pre-sync backup system by toggling config flags through
* the app API and triggering re-syncs on repos that were already mirrored
* by the 02-mirror-workflow suite.
*
* What is tested:
* B1. Enable backupBeforeSync in config
* B2. Confirm mirrored repos exist in Gitea (precondition)
* B3. Trigger a re-sync with backup enabled — verify the backup code path
* runs (snapshot activity entries appear in the activity log)
* B4. Inspect activity log for snapshot-related entries
* B5. Enable blockSyncOnBackupFailure and verify the flag is persisted
* B6. Disable backup and verify config resets cleanly
*/
import { test, expect } from "@playwright/test";
import {
APP_URL,
GITEA_URL,
GITEA_MIRROR_ORG,
GiteaAPI,
getAppSessionCookies,
saveConfig,
getRepositoryIds,
triggerSyncRepo,
} from "./helpers";
test.describe("E2E: Backup configuration", () => {
let giteaApi: GiteaAPI;
let appCookies = "";
test.beforeAll(async () => {
giteaApi = new GiteaAPI(GITEA_URL);
try {
await giteaApi.createToken();
} catch {
console.log(
"[Backup] Could not create Gitea token; tests may be limited",
);
}
});
test.afterAll(async () => {
await giteaApi.dispose();
});
// ── B1 ─────────────────────────────────────────────────────────────────────
test("Step B1: Enable backup in config", async ({ request }) => {
appCookies = await getAppSessionCookies(request);
const giteaToken = giteaApi.getTokenValue();
expect(giteaToken, "Gitea token required").toBeTruthy();
// Save config with backup enabled
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupBeforeSync: true,
blockSyncOnBackupFailure: false,
backupRetentionCount: 5,
backupDirectory: "data/repo-backups",
},
});
// Verify config was saved
const configResp = await request.get(`${APP_URL}/api/config`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
expect(configResp.status()).toBeLessThan(500);
if (configResp.ok()) {
const configData = await configResp.json();
const giteaCfg = configData.giteaConfig ?? configData.gitea ?? {};
console.log(
`[Backup] Config saved: backupBeforeSync=${giteaCfg.backupBeforeSync}, blockOnFailure=${giteaCfg.blockSyncOnBackupFailure}`,
);
}
});
// ── B2 ─────────────────────────────────────────────────────────────────────
test("Step B2: Verify mirrored repos exist in Gitea before backup test", async () => {
// We need repos to already be mirrored from the 02-mirror-workflow suite
const orgRepos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
console.log(
`[Backup] Repos in ${GITEA_MIRROR_ORG}: ${orgRepos.length} (${orgRepos.map((r: any) => r.name).join(", ")})`,
);
if (orgRepos.length === 0) {
console.log(
"[Backup] WARNING: No repos in Gitea yet. Backup test will verify " +
"job creation but not bundle creation.",
);
}
});
// ── B3 ─────────────────────────────────────────────────────────────────────
test("Step B3: Trigger re-sync with backup enabled", async ({ request }) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
// Fetch mirrored repository IDs (sync-repo requires them)
const { ids: repositoryIds, repos } = await getRepositoryIds(
request,
appCookies,
{ status: "mirrored" },
);
// Also include repos with "success" status
if (repositoryIds.length === 0) {
const { ids: successIds } = await getRepositoryIds(
request,
appCookies,
{ status: "success" },
);
repositoryIds.push(...successIds);
}
// Fall back to all repos if no mirrored/success repos
if (repositoryIds.length === 0) {
const { ids: allIds } = await getRepositoryIds(request, appCookies);
repositoryIds.push(...allIds);
}
console.log(
`[Backup] Found ${repositoryIds.length} repos to re-sync: ` +
repos.map((r: any) => r.name).join(", "),
);
expect(
repositoryIds.length,
"Need at least one repo to test backup",
).toBeGreaterThan(0);
// Trigger sync-repo — this calls syncGiteaRepoEnhanced which checks
// shouldCreatePreSyncBackup and creates bundles before syncing
const status = await triggerSyncRepo(
request,
appCookies,
repositoryIds,
25_000,
);
console.log(`[Backup] Sync-repo response: ${status}`);
expect(status, "Sync-repo should accept request").toBeLessThan(500);
});
// ── B4 ─────────────────────────────────────────────────────────────────────
test("Step B4: Verify backup-related activity in logs", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const activitiesResp = await request.get(`${APP_URL}/api/activities`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (!activitiesResp.ok()) {
console.log(
`[Backup] Could not fetch activities: ${activitiesResp.status()}`,
);
return;
}
const activities = await activitiesResp.json();
const jobs: any[] = Array.isArray(activities)
? activities
: (activities.jobs ?? activities.activities ?? []);
// Look for backup / snapshot related messages
const backupJobs = jobs.filter(
(j: any) =>
j.message?.toLowerCase().includes("snapshot") ||
j.message?.toLowerCase().includes("backup") ||
j.details?.toLowerCase().includes("snapshot") ||
j.details?.toLowerCase().includes("backup") ||
j.details?.toLowerCase().includes("bundle"),
);
console.log(
`[Backup] Backup-related activity entries: ${backupJobs.length}`,
);
for (const j of backupJobs.slice(0, 10)) {
console.log(
`[Backup] • ${j.repositoryName ?? "?"}: ${j.status} - ${j.message ?? ""} | ${(j.details ?? "").substring(0, 120)}`,
);
}
// We expect at least some backup-related entries if repos were mirrored
const orgRepos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
if (orgRepos.length > 0) {
// With repos in Gitea, the backup system should have tried to create
// snapshots. All snapshots should succeed.
expect(
backupJobs.length,
"Expected at least one backup/snapshot activity entry when " +
"backupBeforeSync is enabled and repos exist in Gitea",
).toBeGreaterThan(0);
// Check for any failed backups
const failedBackups = backupJobs.filter(
(j: any) =>
j.status === "failed" &&
(j.message?.toLowerCase().includes("snapshot") ||
j.details?.toLowerCase().includes("snapshot")),
);
expect(
failedBackups.length,
`Expected all backups to succeed, but ${failedBackups.length} backup(s) failed. ` +
`Failed: ${failedBackups.map((j: any) => `${j.repositoryName}: ${j.details?.substring(0, 100)}`).join("; ")}`,
).toBe(0);
console.log(
`[Backup] Confirmed: backup system was invoked for ${backupJobs.length} repos`,
);
}
// Dump all recent jobs for debugging visibility
console.log(`[Backup] All recent jobs (last 20):`);
for (const j of jobs.slice(0, 20)) {
console.log(
`[Backup] - [${j.status}] ${j.repositoryName ?? "?"}: ${j.message ?? ""} ` +
`${j.details ? `(${j.details.substring(0, 80)})` : ""}`,
);
}
});
// ── B5 ─────────────────────────────────────────────────────────────────────
test("Step B5: Enable blockSyncOnBackupFailure and verify behavior", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const giteaToken = giteaApi.getTokenValue();
// Update config to block sync on backup failure
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupBeforeSync: true,
blockSyncOnBackupFailure: true,
backupRetentionCount: 5,
backupDirectory: "data/repo-backups",
},
});
console.log("[Backup] Config updated: blockSyncOnBackupFailure=true");
// Verify the flag persisted
const configResp = await request.get(`${APP_URL}/api/config`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (configResp.ok()) {
const configData = await configResp.json();
const giteaCfg = configData.giteaConfig ?? configData.gitea ?? {};
expect(giteaCfg.blockSyncOnBackupFailure).toBe(true);
console.log(
`[Backup] Verified: blockSyncOnBackupFailure=${giteaCfg.blockSyncOnBackupFailure}`,
);
}
});
// ── B6 ─────────────────────────────────────────────────────────────────────
test("Step B6: Disable backup and verify config resets", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const giteaToken = giteaApi.getTokenValue();
// Disable backup
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupBeforeSync: false,
blockSyncOnBackupFailure: false,
},
});
const configResp = await request.get(`${APP_URL}/api/config`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (configResp.ok()) {
const configData = await configResp.json();
const giteaCfg = configData.giteaConfig ?? configData.gitea ?? {};
console.log(
`[Backup] After disable: backupBeforeSync=${giteaCfg.backupBeforeSync}`,
);
}
console.log("[Backup] Backup configuration test complete");
});
});


@@ -0,0 +1,864 @@
/**
* 04 Force-push simulation and backup verification.
*
* This is the critical test that proves data loss can happen from a
* force-push on the source repo, and verifies that the backup system
* (when enabled) preserves the old state.
*
* Scenario:
* 1. Confirm my-project is already mirrored with known commits / content
* 2. Record the pre-force-push state (branch SHAs, commit messages, file content)
* 3. Rewrite history in the source bare repo (simulate a force-push)
* 4. Trigger Gitea mirror-sync WITHOUT backup
* 5. Verify Gitea now reflects the rewritten history — old commits are GONE
* 6. Restore the source repo, re-mirror, then enable backup
* 7. Force-push again and sync WITH backup enabled
* 8. Verify backup activity was recorded (snapshot attempted before sync)
*
* The source bare repos live on the host filesystem at
* tests/e2e/git-repos/<owner>/<name>.git and are served read-only into the
* git-server container. Because the bind-mount is :ro in docker-compose,
* we modify the repos on the host and Gitea's dumb-HTTP clone picks up
* the changes on the next fetch.
*
* Prerequisites: 02-mirror-workflow.spec.ts must have run first so that
* my-project is already mirrored into Gitea.
*/
import { execSync } from "node:child_process";
import { existsSync, mkdirSync, rmSync, writeFileSync } from "node:fs";
import { join, resolve, dirname } from "node:path";
import { fileURLToPath } from "node:url";
import { test, expect } from "@playwright/test";
import {
APP_URL,
GITEA_URL,
GITEA_MIRROR_ORG,
GiteaAPI,
getAppSessionCookies,
saveConfig,
waitFor,
getRepositoryIds,
triggerSyncRepo,
} from "./helpers";
// ─── Paths ───────────────────────────────────────────────────────────────────
const E2E_DIR = resolve(dirname(fileURLToPath(import.meta.url)));
const GIT_REPOS_DIR = join(E2E_DIR, "git-repos");
const MY_PROJECT_BARE = join(GIT_REPOS_DIR, "e2e-test-user", "my-project.git");
// ─── Git helpers ─────────────────────────────────────────────────────────────
/** Run a git command in a given directory. */
function git(args: string, cwd: string): string {
try {
return execSync(`git ${args}`, {
cwd,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "Force Push Bot",
GIT_AUTHOR_EMAIL: "force-push@test.local",
GIT_COMMITTER_NAME: "Force Push Bot",
GIT_COMMITTER_EMAIL: "force-push@test.local",
},
}).trim();
} catch (err: any) {
const stderr = err.stderr?.toString() ?? "";
const stdout = err.stdout?.toString() ?? "";
throw new Error(
`git ${args} failed in ${cwd}:\n${stderr || stdout || err.message}`,
);
}
}
/**
* Get the SHA of a ref in a bare repository.
* Uses `git rev-parse` so it works for branches and tags.
*/
function getRefSha(bareRepo: string, ref: string): string {
return git(`rev-parse ${ref}`, bareRepo);
}
/**
* Clone the bare repo to a temporary working copy, execute a callback that
* mutates the working copy, then force-push back to the bare repo and
* update server-info for dumb-HTTP serving.
*/
function mutateSourceRepo(
bareRepo: string,
tmpName: string,
mutate: (workDir: string) => void,
): void {
const tmpDir = join(GIT_REPOS_DIR, ".work-force-push", tmpName);
rmSync(tmpDir, { recursive: true, force: true });
mkdirSync(join(GIT_REPOS_DIR, ".work-force-push"), { recursive: true });
try {
// Clone from the bare repo
git(`clone "${bareRepo}" "${tmpDir}"`, GIT_REPOS_DIR);
git("config user.name 'Force Push Bot'", tmpDir);
git("config user.email 'force-push@test.local'", tmpDir);
// Let the caller rewrite history
mutate(tmpDir);
// Force-push all refs back to the bare repo
git(`push --force --all "${bareRepo}"`, tmpDir);
git(`push --force --tags "${bareRepo}"`, tmpDir);
// Update server-info so the dumb-HTTP server picks up the new refs
git("update-server-info", bareRepo);
} finally {
rmSync(tmpDir, { recursive: true, force: true });
}
}
/** Helper to clean up the temporary working directory. */
function cleanupWorkDir(): void {
const workDir = join(GIT_REPOS_DIR, ".work-force-push");
rmSync(workDir, { recursive: true, force: true });
}
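`mutateSourceRepo` ends with `git update-server-info` because the git-server container serves the bare repos over the dumb HTTP protocol, which is just static file serving. A standalone sketch of why that call is needed (paths are throwaway):

```typescript
import { execSync } from "node:child_process";
import { mkdtempSync, readFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

const run = (cmd: string): string => execSync(cmd, { encoding: "utf-8" }).trim();

const tmp = mkdtempSync(join(tmpdir(), "dumb-http-demo-"));
run(`git init -q ${join(tmp, "work")}`);
run(
  `git -C ${join(tmp, "work")} -c user.name=t -c user.email=t@e.local ` +
    `commit -q --allow-empty -m "seed"`,
);
// A bare repo, as the git-server container would serve it.
run(`git clone -q --bare ${join(tmp, "work")} ${join(tmp, "bare.git")}`);
// update-server-info writes info/refs (and objects/info/packs): the static
// files a dumb-HTTP client fetches to discover refs. Without them, a plain
// file server cannot be cloned from.
run(`git -C ${join(tmp, "bare.git")} update-server-info`);
console.log(readFileSync(join(tmp, "bare.git", "info", "refs"), "utf-8"));
```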
// ─── Tests ───────────────────────────────────────────────────────────────────
test.describe("E2E: Force-push simulation", () => {
let giteaApi: GiteaAPI;
let appCookies = "";
/** SHA of the main branch BEFORE we force-push. */
let originalMainSha = "";
/** The commit message of the HEAD commit before force-push. */
let originalHeadMessage = "";
/** Content of README.md before force-push. */
let originalReadmeContent = "";
/** Number of commits on main before force-push. */
let originalCommitCount = 0;
test.beforeAll(async () => {
giteaApi = new GiteaAPI(GITEA_URL);
try {
await giteaApi.createToken();
} catch {
console.log("[ForcePush] Could not create Gitea token");
}
});
test.afterAll(async () => {
cleanupWorkDir();
await giteaApi.dispose();
});
// ── F0: Preconditions ────────────────────────────────────────────────────
test("F0: Confirm my-project is mirrored and record its state", async ({
request,
}) => {
// Verify the source bare repo exists on the host
expect(
existsSync(MY_PROJECT_BARE),
`Bare repo should exist at ${MY_PROJECT_BARE}`,
).toBeTruthy();
// Verify it is mirrored in Gitea
const repo = await giteaApi.getRepo(GITEA_MIRROR_ORG, "my-project");
expect(repo, "my-project should exist in Gitea").toBeTruthy();
console.log(
`[ForcePush] my-project in Gitea: mirror=${repo.mirror}, ` +
`default_branch=${repo.default_branch}`,
);
// Record the current state of main in Gitea
const mainBranch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
expect(mainBranch, "main branch should exist").toBeTruthy();
originalMainSha = mainBranch.commit.id;
originalHeadMessage =
mainBranch.commit.message?.trim() ?? "(unknown message)";
console.log(
`[ForcePush] Original main HEAD: ${originalMainSha.substring(0, 12)} ` +
`"${originalHeadMessage}"`,
);
// Record commit count
const commits = await giteaApi.listCommits(GITEA_MIRROR_ORG, "my-project", {
limit: 50,
});
originalCommitCount = commits.length;
console.log(
`[ForcePush] Original commit count on main: ${originalCommitCount}`,
);
// Record README content
const readme = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"README.md",
);
originalReadmeContent = readme ?? "";
expect(originalReadmeContent).toContain("My Project");
console.log(
`[ForcePush] Original README length: ${originalReadmeContent.length} chars`,
);
// Also verify the source bare repo matches
const sourceSha = getRefSha(MY_PROJECT_BARE, "refs/heads/main");
console.log(
`[ForcePush] Source bare main SHA: ${sourceSha.substring(0, 12)}`,
);
// They may differ slightly if Gitea hasn't synced the very latest, but
// the important thing is that both exist.
});
// ── F1: Rewrite history on the source repo ───────────────────────────────
test("F1: Force-push rewritten history to source repo", async () => {
const shaBeforeRewrite = getRefSha(MY_PROJECT_BARE, "refs/heads/main");
console.log(
`[ForcePush] Source main before rewrite: ${shaBeforeRewrite.substring(0, 12)}`,
);
mutateSourceRepo(MY_PROJECT_BARE, "my-project-rewrite", (workDir) => {
// We're on the main branch.
// Rewrite history: remove the last commit (the LICENSE commit) via
// reset --hard HEAD~1, then add a completely different commit.
git("checkout main", workDir);
// Record what HEAD is for logging
const headBefore = git("log --oneline -1", workDir);
console.log(`[ForcePush] Working copy HEAD before reset: ${headBefore}`);
// Hard reset to remove the last commit (this drops "Add MIT license")
git("reset --hard HEAD~1", workDir);
const headAfterReset = git("log --oneline -1", workDir);
console.log(`[ForcePush] After reset HEAD~1: ${headAfterReset}`);
// Write a replacement commit with different content (simulates someone
// rewriting history with different changes)
writeFileSync(
join(workDir, "README.md"),
"# My Project\n\nThis README was FORCE-PUSHED.\n\nOriginal history has been rewritten.\n",
);
writeFileSync(
join(workDir, "FORCE_PUSH_MARKER.txt"),
`Force-pushed at ${new Date().toISOString()}\n`,
);
git("add -A", workDir);
execSync('git commit -m "FORCE PUSH: Rewritten history"', {
cwd: workDir,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "Force Push Bot",
GIT_AUTHOR_EMAIL: "force-push@test.local",
GIT_AUTHOR_DATE: "2024-06-15T12:00:00+00:00",
GIT_COMMITTER_NAME: "Force Push Bot",
GIT_COMMITTER_EMAIL: "force-push@test.local",
GIT_COMMITTER_DATE: "2024-06-15T12:00:00+00:00",
},
});
const headAfterRewrite = git("log --oneline -3", workDir);
console.log(`[ForcePush] After rewrite (last 3):\n${headAfterRewrite}`);
});
const shaAfterRewrite = getRefSha(MY_PROJECT_BARE, "refs/heads/main");
console.log(
`[ForcePush] Source main after rewrite: ${shaAfterRewrite.substring(0, 12)}`,
);
// The SHA must have changed — this proves the force-push happened
expect(
shaAfterRewrite,
"Source repo main SHA should change after force-push",
).not.toBe(originalMainSha);
// Verify the rewritten commit is now the tip of main
const logOutput = git("log --oneline main", MY_PROJECT_BARE);
expect(
logOutput,
"Rewritten history should contain the force-push commit",
).toContain("FORCE PUSH");
});
// ── F2: Sync to Gitea WITHOUT backup ─────────────────────────────────────
test("F2: Disable backup and sync force-pushed repo to Gitea", async ({
request,
}) => {
appCookies = await getAppSessionCookies(request);
const giteaToken = giteaApi.getTokenValue();
expect(giteaToken).toBeTruthy();
// Ensure backup is disabled for this test
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupBeforeSync: false,
blockSyncOnBackupFailure: false,
},
});
console.log("[ForcePush] Backup disabled for unprotected sync test");
// Trigger Gitea's mirror-sync directly via the Gitea API.
// This is more reliable than going through the app for this test because
// the app's sync-repo endpoint involves extra processing. We want to test
// the raw effect of Gitea pulling the rewritten refs.
const synced = await giteaApi.triggerMirrorSync(
GITEA_MIRROR_ORG,
"my-project",
);
console.log(`[ForcePush] Gitea mirror-sync triggered: ${synced}`);
// Wait for Gitea to pull the new refs from the git-server
console.log("[ForcePush] Waiting for Gitea to pull rewritten refs...");
await new Promise((r) => setTimeout(r, 15_000));
});
// ── F3: Verify Gitea reflects the rewritten history ──────────────────────
test("F3: Verify Gitea has the force-pushed content (old history GONE)", async () => {
// Poll until Gitea picks up the new HEAD
await waitFor(
async () => {
const branch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
if (!branch) return false;
return branch.commit.id !== originalMainSha;
},
{
timeout: 60_000,
interval: 5_000,
label: "Gitea main branch updates to new SHA",
},
);
// Read the new state
const newMainBranch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
expect(newMainBranch).toBeTruthy();
const newSha = newMainBranch.commit.id;
const newMsg = newMainBranch.commit.message?.trim() ?? "";
console.log(
`[ForcePush] New main HEAD: ${newSha.substring(0, 12)} "${newMsg}"`,
);
// The SHA MUST be different from the original
expect(
newSha,
"Gitea main SHA should have changed after force-push sync",
).not.toBe(originalMainSha);
// The new commit message should be the force-pushed one
expect(newMsg).toContain("FORCE PUSH");
// Verify the force-push marker file now exists in Gitea
const markerContent = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"FORCE_PUSH_MARKER.txt",
);
expect(
markerContent,
"FORCE_PUSH_MARKER.txt should appear after sync",
).toBeTruthy();
console.log(
`[ForcePush] Marker file present: ${markerContent?.substring(0, 40)}...`,
);
// Verify the README was overwritten
const newReadme = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"README.md",
);
expect(newReadme).toContain("FORCE-PUSHED");
expect(newReadme).not.toBe(originalReadmeContent);
console.log("[ForcePush] README.md confirms overwritten content");
// Verify the LICENSE file is GONE (it was in the dropped commit)
const licenseContent = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"LICENSE",
);
expect(
licenseContent,
"LICENSE should be GONE after force-push removed that commit",
).toBeNull();
console.log("[ForcePush] ✗ LICENSE file is GONE — data loss confirmed");
// Verify the old commit SHA is no longer accessible
const oldCommit = await giteaApi.getCommit(
GITEA_MIRROR_ORG,
"my-project",
originalMainSha,
);
// Gitea may or may not GC the unreachable commit immediately, so this
// is informational rather than a hard assertion.
if (oldCommit) {
console.log(
`[ForcePush] Old commit ${originalMainSha.substring(0, 12)} is ` +
`still in Gitea's object store (not yet GC'd)`,
);
} else {
console.log(
`[ForcePush] Old commit ${originalMainSha.substring(0, 12)} is ` +
`no longer accessible — data loss complete`,
);
}
// Check commit count changed
const newCommits = await giteaApi.listCommits(
GITEA_MIRROR_ORG,
"my-project",
{ limit: 50 },
);
console.log(
`[ForcePush] Commit count: was ${originalCommitCount}, now ${newCommits.length}`,
);
// The rewrite dropped one commit and added one, so the count should differ
// or at minimum the commit list should not contain the old head message.
const commitMessages = newCommits.map(
(c: any) => c.commit?.message?.trim() ?? "",
);
expect(
commitMessages.some((m: string) => m.includes("FORCE PUSH")),
"New commit list should contain the force-pushed commit",
).toBeTruthy();
console.log(
"\n[ForcePush] ════════════════════════════════════════════════════",
);
console.log(
"[ForcePush] CONFIRMED: Force-push without backup = DATA LOSS",
);
console.log(
"[ForcePush] The LICENSE file and original HEAD commit are gone.",
);
console.log(
"[ForcePush] ════════════════════════════════════════════════════\n",
);
});
// ── F4: Restore source, re-mirror, then test WITH backup ─────────────────
test("F4: Restore source repo to a good state and re-mirror", async ({
request,
}) => {
// To test the backup path we need a clean slate. Re-create the original
// my-project content in the source repo so it has known good history.
mutateSourceRepo(MY_PROJECT_BARE, "my-project-restore", (workDir) => {
git("checkout main", workDir);
// Remove the force-push marker
try {
execSync("rm -f FORCE_PUSH_MARKER.txt", { cwd: workDir });
} catch {
// may not exist
}
// Restore README
writeFileSync(
join(workDir, "README.md"),
"# My Project\n\nA sample project for E2E testing.\n\n" +
"## Features\n- Greeting module\n- Math utilities\n",
);
// Restore LICENSE
writeFileSync(
join(workDir, "LICENSE"),
"MIT License\n\nCopyright (c) 2024 E2E Test\n",
);
git("add -A", workDir);
execSync(
'git commit -m "Restore original content after force-push test"',
{
cwd: workDir,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "E2E Test Bot",
GIT_AUTHOR_EMAIL: "e2e-bot@test.local",
GIT_COMMITTER_NAME: "E2E Test Bot",
GIT_COMMITTER_EMAIL: "e2e-bot@test.local",
},
},
);
const newHead = git("log --oneline -1", workDir);
console.log(`[ForcePush] Restored source HEAD: ${newHead}`);
});
// Sync Gitea to pick up the restored state
const synced = await giteaApi.triggerMirrorSync(
GITEA_MIRROR_ORG,
"my-project",
);
console.log(`[ForcePush] Gitea mirror-sync for restore: ${synced}`);
await new Promise((r) => setTimeout(r, 15_000));
// Verify Gitea has the restored content
await waitFor(
async () => {
const readme = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"README.md",
);
return readme !== null && readme.includes("Features");
},
{
timeout: 60_000,
interval: 5_000,
label: "Gitea picks up restored content",
},
);
const license = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"LICENSE",
);
expect(license, "LICENSE should be restored").toBeTruthy();
console.log("[ForcePush] Gitea restored to good state");
// Record the new "good" SHA for the next force-push test
const restoredBranch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
originalMainSha = restoredBranch.commit.id;
console.log(
`[ForcePush] Restored main SHA: ${originalMainSha.substring(0, 12)}`,
);
});
// ── F5: Force-push AGAIN, this time with backup enabled ──────────────────
test("F5: Enable backup, force-push, and sync", async ({ request }) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const giteaToken = giteaApi.getTokenValue();
// Enable backup
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupBeforeSync: true,
blockSyncOnBackupFailure: false, // don't block — we want to see both backup + sync happen
backupRetentionCount: 5,
backupDirectory: "data/repo-backups",
},
});
console.log("[ForcePush] Backup enabled for protected sync test");
// Force-push again
mutateSourceRepo(MY_PROJECT_BARE, "my-project-rewrite2", (workDir) => {
git("checkout main", workDir);
writeFileSync(
join(workDir, "README.md"),
"# My Project\n\nSECOND FORCE-PUSH — backup should have preserved old state.\n",
);
writeFileSync(
join(workDir, "SECOND_FORCE_PUSH.txt"),
`Second force-push at ${new Date().toISOString()}\n`,
);
// Remove LICENSE again to simulate destructive rewrite
try {
execSync("rm -f LICENSE", { cwd: workDir });
} catch {
// may not exist
}
git("add -A", workDir);
execSync('git commit -m "SECOND FORCE PUSH: backup should catch this"', {
cwd: workDir,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "Force Push Bot",
GIT_AUTHOR_EMAIL: "force-push@test.local",
GIT_COMMITTER_NAME: "Force Push Bot",
GIT_COMMITTER_EMAIL: "force-push@test.local",
},
});
});
console.log("[ForcePush] Second force-push applied to source repo");
// Use the app's sync-repo to trigger the sync (this goes through
// syncGiteaRepoEnhanced which runs the backup code path)
// Find the my-project repo ID
const dashResp = await request.get(`${APP_URL}/api/dashboard`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
let myProjectId = "";
if (dashResp.ok()) {
const data = await dashResp.json();
const repos: any[] = data.repositories ?? [];
const myProj = repos.find((r: any) => r.name === "my-project");
if (myProj) myProjectId = myProj.id;
}
if (myProjectId) {
console.log(
`[ForcePush] Triggering app sync-repo for my-project (${myProjectId})`,
);
const status = await triggerSyncRepo(
request,
appCookies,
[myProjectId],
25_000,
);
console.log(`[ForcePush] App sync-repo response: ${status}`);
} else {
// Fallback: trigger via Gitea API directly
console.log(
"[ForcePush] Could not find my-project ID, using Gitea API directly",
);
await giteaApi.triggerMirrorSync(GITEA_MIRROR_ORG, "my-project");
await new Promise((r) => setTimeout(r, 15_000));
}
});
// ── F6: Verify Gitea picked up the second force-push ─────────────────────
test("F6: Verify Gitea reflects second force-push", async () => {
await waitFor(
async () => {
const branch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
if (!branch) return false;
return branch.commit.id !== originalMainSha;
},
{
timeout: 60_000,
interval: 5_000,
label: "Gitea main branch updates after second force-push",
},
);
const newBranch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
const newSha = newBranch.commit.id;
console.log(
`[ForcePush] After 2nd force-push: main=${newSha.substring(0, 12)}, ` +
`msg="${newBranch.commit.message?.trim()}"`,
);
expect(newSha).not.toBe(originalMainSha);
// Verify the second force-push marker
const marker = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"SECOND_FORCE_PUSH.txt",
);
expect(marker, "Second force-push marker should exist").toBeTruthy();
// LICENSE should be gone again
const license = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"LICENSE",
);
expect(license, "LICENSE gone again after 2nd force-push").toBeNull();
console.log("[ForcePush] Second force-push verified in Gitea");
});
// ── F7: Verify backup activity was logged for the second force-push ──────
test("F7: Verify backup activity was recorded for protected sync", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const activitiesResp = await request.get(`${APP_URL}/api/activities`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (!activitiesResp.ok()) {
console.log(
`[ForcePush] Could not fetch activities: ${activitiesResp.status()}`,
);
return;
}
const activities = await activitiesResp.json();
const jobs: any[] = Array.isArray(activities)
? activities
: (activities.jobs ?? activities.activities ?? []);
// Filter to backup/snapshot entries for my-project
const backupJobs = jobs.filter(
(j: any) =>
j.repositoryName === "my-project" &&
(j.message?.toLowerCase().includes("snapshot") ||
j.message?.toLowerCase().includes("backup") ||
j.details?.toLowerCase().includes("snapshot") ||
j.details?.toLowerCase().includes("backup") ||
j.details?.toLowerCase().includes("bundle")),
);
console.log(
`[ForcePush] Backup activity for my-project: ${backupJobs.length} entries`,
);
for (const j of backupJobs) {
console.log(
`[ForcePush] • [${j.status}] ${j.message ?? ""} | ${(j.details ?? "").substring(0, 100)}`,
);
}
// The backup system should have been invoked and must succeed.
expect(
backupJobs.length,
"At least one backup/snapshot activity should exist for my-project " +
"when backupBeforeSync is enabled",
).toBeGreaterThan(0);
// Check whether any backups actually succeeded
const successfulBackups = backupJobs.filter(
(j: any) =>
j.status === "syncing" ||
j.message?.includes("Snapshot created") ||
j.details?.includes("Pre-sync snapshot created"),
);
const failedBackups = backupJobs.filter(
(j: any) =>
j.status === "failed" &&
(j.message?.includes("Snapshot failed") ||
j.details?.includes("snapshot failed")),
);
if (successfulBackups.length > 0) {
console.log(
`[ForcePush] ✓ ${successfulBackups.length} backup(s) SUCCEEDED — ` +
`old state was preserved in bundle`,
);
}
if (failedBackups.length > 0) {
console.log(
`[ForcePush] ⚠ ${failedBackups.length} backup(s) FAILED`,
);
// Extract and log the first failure reason for visibility
const firstFailure = failedBackups[0];
console.log(
`[ForcePush] Failure reason: ${firstFailure.details?.substring(0, 200)}`,
);
}
console.log(
"[ForcePush] ════════════════════════════════════════════════════",
);
if (successfulBackups.length > 0) {
console.log(
"[ForcePush] RESULT: Backup system PROTECTED against force-push",
);
} else {
console.log("[ForcePush] RESULT: Backup system was INVOKED but FAILED.");
}
console.log(
"[ForcePush] ════════════════════════════════════════════════════\n",
);
// Fail the test if any backups failed
expect(
failedBackups.length,
`Expected all backups to succeed, but ${failedBackups.length} backup(s) failed. ` +
`First failure: ${failedBackups[0]?.details || "unknown error"}`,
).toBe(0);
});
// ── F8: Restore source repo for subsequent test suites ───────────────────
test("F8: Restore source repo to clean state for other tests", async () => {
mutateSourceRepo(MY_PROJECT_BARE, "my-project-final-restore", (workDir) => {
git("checkout main", workDir);
// Remove force-push artifacts
try {
execSync("rm -f FORCE_PUSH_MARKER.txt SECOND_FORCE_PUSH.txt", {
cwd: workDir,
});
} catch {
// ignore
}
// Restore content
writeFileSync(
join(workDir, "README.md"),
"# My Project\n\nA sample project for E2E testing.\n\n" +
"## Features\n- Greeting module\n- Math utilities\n",
);
writeFileSync(
join(workDir, "LICENSE"),
"MIT License\n\nCopyright (c) 2024 E2E Test\n",
);
git("add -A", workDir);
execSync(
'git commit --allow-empty -m "Final restore after force-push tests"',
{
cwd: workDir,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "E2E Test Bot",
GIT_AUTHOR_EMAIL: "e2e-bot@test.local",
GIT_COMMITTER_NAME: "E2E Test Bot",
GIT_COMMITTER_EMAIL: "e2e-bot@test.local",
},
},
);
});
// Sync Gitea
await giteaApi.triggerMirrorSync(GITEA_MIRROR_ORG, "my-project");
await new Promise((r) => setTimeout(r, 10_000));
// Verify restoration
const license = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"LICENSE",
);
if (license) {
console.log("[ForcePush] Source repo restored for subsequent tests");
} else {
console.log(
"[ForcePush] Warning: restoration may not have synced yet (Gitea async)",
);
}
});
});
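The polling pattern used throughout these specs relies on a `waitFor` helper imported from `./helpers`. Its implementation lives outside this diff, but a minimal sketch matching the `{ timeout, interval, label }` options used above could look like this (the name and signature are assumptions, not the actual helper):

```typescript
// Minimal async polling helper: retries a predicate until it returns true
// or the timeout elapses, then fails with the supplied label. This is a
// sketch of the shape used by the specs above, not the real helper.
async function waitFor(
  predicate: () => Promise<boolean>,
  opts: { timeout: number; interval: number; label: string },
): Promise<void> {
  const deadline = Date.now() + opts.timeout;
  while (Date.now() < deadline) {
    if (await predicate()) return;
    // Predicate not satisfied yet; sleep before the next attempt
    await new Promise((r) => setTimeout(r, opts.interval));
  }
  throw new Error(`waitFor timed out after ${opts.timeout}ms: ${opts.label}`);
}
```

Polling like this beats fixed sleeps where an observable condition exists (a branch SHA changing, a file appearing); the fixed 15-second waits above are the fallback when there is nothing to poll.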


@@ -0,0 +1,342 @@
/**
* 05 Sync verification and cleanup.
*
* Exercises the dynamic aspects of the sync pipeline:
* • Adding a repo to the fake GitHub at runtime and verifying the app
* discovers it on the next sync
* • Deep content-integrity checks on repos mirrored during earlier suites
* • Resetting the fake GitHub store to its defaults
*
* Prerequisites: 02-mirror-workflow.spec.ts must have run so that repos
* already exist in Gitea.
*/
import { test, expect } from "@playwright/test";
import {
APP_URL,
GITEA_URL,
FAKE_GITHUB_URL,
GITEA_MIRROR_ORG,
GiteaAPI,
getAppSessionCookies,
} from "./helpers";
test.describe("E2E: Sync verification", () => {
let giteaApi: GiteaAPI;
let appCookies = "";
test.beforeAll(async () => {
giteaApi = new GiteaAPI(GITEA_URL);
try {
await giteaApi.createToken();
} catch {
console.log("[SyncVerify] Could not create Gitea token; tests may skip");
}
});
test.afterAll(async () => {
await giteaApi.dispose();
});
// ── Dynamic repo addition ────────────────────────────────────────────────
test("Verify fake GitHub management API can add repos dynamically", async ({
request,
}) => {
const addResp = await request.post(`${FAKE_GITHUB_URL}/___mgmt/add-repo`, {
data: {
name: "dynamic-repo",
owner_login: "e2e-test-user",
description: "Dynamically added for E2E testing",
language: "Rust",
},
});
expect(addResp.ok()).toBeTruthy();
const repoResp = await request.get(
`${FAKE_GITHUB_URL}/repos/e2e-test-user/dynamic-repo`,
);
expect(repoResp.ok()).toBeTruthy();
const repo = await repoResp.json();
expect(repo.name).toBe("dynamic-repo");
expect(repo.language).toBe("Rust");
console.log("[DynamicRepo] Successfully added and verified dynamic repo");
});
test("Newly added fake GitHub repo gets picked up by sync", async ({
request,
}) => {
appCookies = await getAppSessionCookies(request);
const syncResp = await request.post(`${APP_URL}/api/sync`, {
headers: {
"Content-Type": "application/json",
Cookie: appCookies,
},
failOnStatusCode: false,
});
const status = syncResp.status();
console.log(`[DynamicSync] Sync response: ${status}`);
expect(status).toBeLessThan(500);
if (syncResp.ok()) {
const data = await syncResp.json();
console.log(
`[DynamicSync] New repos discovered: ${data.newRepositories ?? "?"}`,
);
if (data.newRepositories !== undefined) {
expect(data.newRepositories).toBeGreaterThanOrEqual(0);
}
}
});
// ── Content integrity ────────────────────────────────────────────────────
test("Verify repo content integrity after mirror", async () => {
// Check repos in the mirror org
const orgRepos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
const orgRepoNames = orgRepos.map((r: any) => r.name);
console.log(
`[Integrity] Repos in ${GITEA_MIRROR_ORG}: ${orgRepoNames.join(", ")}`,
);
// Check github-stars org for starred repos
const starsRepos = await giteaApi.listOrgRepos("github-stars");
const starsRepoNames = starsRepos.map((r: any) => r.name);
console.log(
`[Integrity] Repos in github-stars: ${starsRepoNames.join(", ")}`,
);
// ── notes repo (minimal single-commit repo) ──────────────────────────
if (orgRepoNames.includes("notes")) {
const notesReadme = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"notes",
"README.md",
);
if (notesReadme) {
expect(notesReadme).toContain("Notes");
console.log("[Integrity] notes/README.md verified");
}
const ideas = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"notes",
"ideas.md",
);
if (ideas) {
expect(ideas).toContain("Ideas");
console.log("[Integrity] notes/ideas.md verified");
}
const todo = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"notes",
"todo.md",
);
if (todo) {
expect(todo).toContain("TODO");
console.log("[Integrity] notes/todo.md verified");
}
}
// ── dotfiles repo ────────────────────────────────────────────────────
if (orgRepoNames.includes("dotfiles")) {
const vimrc = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"dotfiles",
".vimrc",
);
if (vimrc) {
expect(vimrc).toContain("set number");
console.log("[Integrity] dotfiles/.vimrc verified");
}
const gitconfig = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"dotfiles",
".gitconfig",
);
if (gitconfig) {
expect(gitconfig).toContain("[user]");
console.log("[Integrity] dotfiles/.gitconfig verified");
}
// Verify commit count (dotfiles has 2 commits)
const commits = await giteaApi.listCommits(
GITEA_MIRROR_ORG,
"dotfiles",
);
console.log(`[Integrity] dotfiles commit count: ${commits.length}`);
expect(
commits.length,
"dotfiles should have at least 2 commits",
).toBeGreaterThanOrEqual(2);
}
// ── popular-lib (starred repo from other-user) ───────────────────────
// In single-org strategy it goes to the starredReposOrg ("github-stars")
if (starsRepoNames.includes("popular-lib")) {
const readme = await giteaApi.getFileContent(
"github-stars",
"popular-lib",
"README.md",
);
if (readme) {
expect(readme).toContain("Popular Lib");
console.log("[Integrity] popular-lib/README.md verified");
}
const pkg = await giteaApi.getFileContent(
"github-stars",
"popular-lib",
"package.json",
);
if (pkg) {
const parsed = JSON.parse(pkg);
expect(parsed.name).toBe("popular-lib");
expect(parsed.version).toBe("2.5.0");
console.log("[Integrity] popular-lib/package.json verified");
}
const tags = await giteaApi.listTags("github-stars", "popular-lib");
const tagNames = tags.map((t: any) => t.name);
console.log(
`[Integrity] popular-lib tags: ${tagNames.join(", ") || "(none)"}`,
);
if (tagNames.length > 0) {
expect(tagNames).toContain("v2.5.0");
}
} else {
console.log(
"[Integrity] popular-lib not found in github-stars " +
"(may be in mirror org or not yet mirrored)",
);
}
// ── org-tool (organization repo) ─────────────────────────────────────
// org-tool may be in the mirror org or a separate org depending on
// the mirror strategy — check several possible locations.
const orgToolOwners = [GITEA_MIRROR_ORG, "test-org"];
let foundOrgTool = false;
for (const owner of orgToolOwners) {
const repo = await giteaApi.getRepo(owner, "org-tool");
if (repo) {
foundOrgTool = true;
console.log(`[Integrity] org-tool found in ${owner}`);
const readme = await giteaApi.getFileContent(
owner,
"org-tool",
"README.md",
);
if (readme) {
expect(readme).toContain("Org Tool");
console.log("[Integrity] org-tool/README.md verified");
}
const mainGo = await giteaApi.getFileContent(
owner,
"org-tool",
"main.go",
);
if (mainGo) {
expect(mainGo).toContain("package main");
console.log("[Integrity] org-tool/main.go verified");
}
// Check branches
const branches = await giteaApi.listBranches(owner, "org-tool");
const branchNames = branches.map((b: any) => b.name);
console.log(
`[Integrity] org-tool branches: ${branchNames.join(", ")}`,
);
if (branchNames.length > 0) {
expect(branchNames).toContain("main");
}
// Check tags
const tags = await giteaApi.listTags(owner, "org-tool");
const tagNames = tags.map((t: any) => t.name);
console.log(
`[Integrity] org-tool tags: ${tagNames.join(", ") || "(none)"}`,
);
break;
}
}
if (!foundOrgTool) {
console.log(
"[Integrity] org-tool not found in Gitea " +
"(may not have been mirrored in single-org strategy)",
);
}
});
// ── my-project deep check ────────────────────────────────────────────────
test("Verify my-project branch and tag structure", async () => {
const branches = await giteaApi.listBranches(
GITEA_MIRROR_ORG,
"my-project",
);
const branchNames = branches.map((b: any) => b.name);
console.log(
`[Integrity] my-project branches: ${branchNames.join(", ")}`,
);
// The source repo had main, develop, and feature/add-tests
expect(branchNames, "main branch should exist").toContain("main");
// develop and feature/add-tests may or may not survive force-push tests
// depending on test ordering, so just log them
for (const expected of ["develop", "feature/add-tests"]) {
if (branchNames.includes(expected)) {
console.log(`[Integrity] ✓ Branch "${expected}" present`);
} else {
console.log(
`[Integrity] ⊘ Branch "${expected}" not present ` +
"(may have been affected by force-push tests)",
);
}
}
const tags = await giteaApi.listTags(GITEA_MIRROR_ORG, "my-project");
const tagNames = tags.map((t: any) => t.name);
console.log(
`[Integrity] my-project tags: ${tagNames.join(", ") || "(none)"}`,
);
// Verify package.json exists and is valid JSON
const pkg = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"package.json",
);
if (pkg) {
const parsed = JSON.parse(pkg);
expect(parsed.name).toBe("my-project");
console.log("[Integrity] my-project/package.json verified");
}
});
});
// ─── Fake GitHub reset ───────────────────────────────────────────────────────
test.describe("E2E: Fake GitHub reset", () => {
test("Can reset fake GitHub to default state", async ({ request }) => {
const resp = await request.post(`${FAKE_GITHUB_URL}/___mgmt/reset`);
expect(resp.ok()).toBeTruthy();
const data = await resp.json();
expect(data.message).toContain("reset");
console.log("[Reset] Fake GitHub reset to defaults");
const health = await request.get(`${FAKE_GITHUB_URL}/___mgmt/health`);
const healthData = await health.json();
expect(healthData.repos).toBeGreaterThan(0);
console.log(
`[Reset] After reset: ${healthData.repos} repos, ${healthData.orgs} orgs`,
);
});
});
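Many of the integrity checks above hinge on `giteaApi.getFileContent`, which presumably wraps Gitea's `GET /api/v1/repos/{owner}/{repo}/contents/{path}` endpoint and base64-decodes the `content` field of the response. A hedged standalone sketch (the real `GiteaAPI` class in `./helpers` may differ):

```typescript
// Hypothetical standalone version of the getFileContent helper used above.
// Gitea's contents API returns { content: "<base64>", encoding: "base64" };
// a non-2xx status (e.g. 404) means the file is absent, which is what lets
// the specs assert toBeNull() for a LICENSE deleted by a force-push.
async function getFileContent(
  baseUrl: string,
  token: string,
  owner: string,
  repo: string,
  path: string,
): Promise<string | null> {
  const resp = await fetch(
    `${baseUrl}/api/v1/repos/${owner}/${repo}/contents/${encodeURIComponent(path)}`,
    { headers: { Authorization: `token ${token}` } },
  );
  if (!resp.ok) return null; // treat missing files as null, not as an error
  const data = await resp.json();
  return Buffer.from(data.content, "base64").toString("utf-8");
}
```

Returning `null` on any non-2xx status keeps absent-file checks assertion-friendly instead of forcing try/catch blocks around every lookup.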

tests/e2e/cleanup.sh Executable file

@@ -0,0 +1,141 @@
#!/usr/bin/env bash
# ────────────────────────────────────────────────────────────────────────────────
# E2E Cleanup Script
# Removes all temporary data from previous E2E test runs.
#
# Usage:
# ./tests/e2e/cleanup.sh # cleanup everything
# ./tests/e2e/cleanup.sh --soft # keep container images, only remove volumes/data
# ────────────────────────────────────────────────────────────────────────────────
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
COMPOSE_FILE="$SCRIPT_DIR/docker-compose.e2e.yml"
SOFT_CLEAN=false
if [[ "${1:-}" == "--soft" ]]; then
SOFT_CLEAN=true
fi
# Detect container runtime (podman or docker)
if command -v podman-compose &>/dev/null; then
COMPOSE_CMD="podman-compose"
CONTAINER_CMD="podman"
elif command -v docker-compose &>/dev/null; then
COMPOSE_CMD="docker-compose"
CONTAINER_CMD="docker"
elif command -v docker &>/dev/null && docker compose version &>/dev/null; then
COMPOSE_CMD="docker compose"
CONTAINER_CMD="docker"
else
echo "[cleanup] WARNING: No container compose tool found. Skipping container cleanup."
COMPOSE_CMD=""
CONTAINER_CMD=""
fi
echo "╔══════════════════════════════════════════════════════════════╗"
echo "║ E2E Test Cleanup ║"
echo "╚══════════════════════════════════════════════════════════════╝"
echo ""
# ── 1. Stop and remove containers ─────────────────────────────────────────────
if [[ -n "$COMPOSE_CMD" ]] && [[ -f "$COMPOSE_FILE" ]]; then
echo "[cleanup] Stopping E2E containers..."
$COMPOSE_CMD -f "$COMPOSE_FILE" down --volumes --remove-orphans 2>/dev/null || true
echo "[cleanup] ✓ Containers stopped and removed"
else
echo "[cleanup] ⊘ No compose file or runtime found, skipping container teardown"
fi
# ── 2. Remove named volumes created by E2E compose ───────────────────────────
if [[ -n "$CONTAINER_CMD" ]]; then
for vol in e2e-gitea-data; do
full_vol_name="e2e_${vol}"
# Try the bare name and likely compose project-prefixed variants
for candidate in "$vol" "$full_vol_name" "tests_e2e_${vol}"; do
if $CONTAINER_CMD volume inspect "$candidate" &>/dev/null; then
echo "[cleanup] Removing volume: $candidate"
$CONTAINER_CMD volume rm -f "$candidate" 2>/dev/null || true
fi
done
done
echo "[cleanup] ✓ Named volumes cleaned"
fi
# ── 3. Kill leftover background processes from previous runs ──────────────────
echo "[cleanup] Checking for leftover processes..."
# Kill fake GitHub server
if pgrep -f "fake-github-server" &>/dev/null; then
echo "[cleanup] Killing leftover fake-github-server process(es)..."
pkill -f "fake-github-server" 2>/dev/null || true
fi
# Kill any stray node/tsx processes on our E2E ports (including git-server on 4590)
for port in 4580 4590 4321 3333; do
pid=$(lsof -ti :"$port" 2>/dev/null || true)
if [[ -n "$pid" ]]; then
echo "[cleanup] Killing process on port $port (PID: $pid)..."
kill -9 $pid 2>/dev/null || true
fi
done
echo "[cleanup] ✓ Leftover processes cleaned"
# ── 4. Remove E2E database and data files ─────────────────────────────────────
echo "[cleanup] Removing E2E data files..."
# Remove test databases
rm -f "$PROJECT_ROOT/gitea-mirror.db" 2>/dev/null || true
rm -f "$PROJECT_ROOT/data/gitea-mirror.db" 2>/dev/null || true
rm -f "$PROJECT_ROOT/e2e-gitea-mirror.db" 2>/dev/null || true
# Remove test backup data
rm -rf "$PROJECT_ROOT/data/repo-backups"* 2>/dev/null || true
# Remove programmatically created test git repositories
if [[ -d "$SCRIPT_DIR/git-repos" ]]; then
echo "[cleanup] Removing test git repos..."
rm -rf "$SCRIPT_DIR/git-repos" 2>/dev/null || true
echo "[cleanup] ✓ Test git repos removed"
fi
# Remove Playwright state/artifacts from previous runs
rm -rf "$SCRIPT_DIR/test-results" 2>/dev/null || true
rm -rf "$SCRIPT_DIR/playwright-report" 2>/dev/null || true
rm -rf "$SCRIPT_DIR/.auth" 2>/dev/null || true
rm -f "$SCRIPT_DIR/e2e-storage-state.json" 2>/dev/null || true
# Remove any PID files we might have created
rm -f "$SCRIPT_DIR/.fake-github.pid" 2>/dev/null || true
rm -f "$SCRIPT_DIR/.app.pid" 2>/dev/null || true
echo "[cleanup] ✓ Data files cleaned"
# ── 5. Remove temp directories ────────────────────────────────────────────────
echo "[cleanup] Removing temp directories..."
rm -rf /tmp/gitea-mirror-backup-* 2>/dev/null || true
rm -rf /tmp/e2e-gitea-mirror-* 2>/dev/null || true
echo "[cleanup] ✓ Temp directories cleaned"
# ── 6. Optionally remove container images ─────────────────────────────────────
if [[ "$SOFT_CLEAN" == false ]] && [[ -n "$CONTAINER_CMD" ]]; then
echo "[cleanup] Pruning dangling images..."
$CONTAINER_CMD image prune -f 2>/dev/null || true
echo "[cleanup] ✓ Dangling images pruned"
else
echo "[cleanup] ⊘ Skipping image cleanup (soft mode or no container runtime)"
fi
# ── 7. Remove node_modules/.cache artifacts from E2E ──────────────────────────
if [[ -d "$PROJECT_ROOT/node_modules/.cache/playwright" ]]; then
echo "[cleanup] Removing Playwright cache..."
rm -rf "$PROJECT_ROOT/node_modules/.cache/playwright" 2>/dev/null || true
echo "[cleanup] ✓ Playwright cache removed"
fi
echo ""
echo "═══════════════════════════════════════════════════════════════"
echo " ✅ E2E cleanup complete"
echo "═══════════════════════════════════════════════════════════════"


@@ -0,0 +1,522 @@
#!/usr/bin/env bun
/**
* create-test-repos.ts
*
* Programmatically creates bare git repositories with real commits, branches,
* and tags so that Gitea can actually clone them during E2E testing.
*
* Repos are created under <outputDir>/<owner>/<name>.git as bare repositories.
* After creation, `git update-server-info` is run on each so they can be served
* via the "dumb HTTP" protocol by any static file server (nginx, darkhttpd, etc.).
*
* Usage:
* bun run tests/e2e/create-test-repos.ts [--output-dir tests/e2e/git-repos]
*
* The script creates the following repositories matching the fake GitHub server's
* default store:
*
* e2e-test-user/my-project.git repo with commits, branches, tags, README
* e2e-test-user/dotfiles.git simple repo with a few config files
* e2e-test-user/notes.git minimal repo with one commit
* other-user/popular-lib.git starred repo from another user
* test-org/org-tool.git organization repository
*/
import { execSync } from "node:child_process";
import { mkdirSync, rmSync, writeFileSync, existsSync } from "node:fs";
import { join, resolve } from "node:path";
// ─── Configuration ───────────────────────────────────────────────────────────
const DEFAULT_OUTPUT_DIR = join(import.meta.dir, "git-repos");
const outputDir = (() => {
const idx = process.argv.indexOf("--output-dir");
if (idx !== -1 && process.argv[idx + 1]) {
return resolve(process.argv[idx + 1]);
}
return DEFAULT_OUTPUT_DIR;
})();
// ─── Helpers ─────────────────────────────────────────────────────────────────
function git(args: string, cwd: string): string {
try {
return execSync(`git ${args}`, {
cwd,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
// Deterministic committer for reproducible repos
GIT_AUTHOR_NAME: "E2E Test Bot",
GIT_AUTHOR_EMAIL: "e2e-bot@test.local",
GIT_AUTHOR_DATE: "2024-01-15T10:00:00+00:00",
GIT_COMMITTER_NAME: "E2E Test Bot",
GIT_COMMITTER_EMAIL: "e2e-bot@test.local",
GIT_COMMITTER_DATE: "2024-01-15T10:00:00+00:00",
},
}).trim();
} catch (err: any) {
const stderr = err.stderr?.toString() ?? "";
const stdout = err.stdout?.toString() ?? "";
throw new Error(
`git ${args} failed in ${cwd}:\n${stderr || stdout || err.message}`,
);
}
}
/** Increment the fake date for each commit so they have unique timestamps */
let commitCounter = 0;
function gitCommit(msg: string, cwd: string): void {
commitCounter++;
const date = `2024-01-15T${String(10 + Math.floor(commitCounter / 60)).padStart(2, "0")}:${String(commitCounter % 60).padStart(2, "0")}:00+00:00`;
execSync(`git commit -m "${msg}"`, {
cwd,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "E2E Test Bot",
GIT_AUTHOR_EMAIL: "e2e-bot@test.local",
GIT_AUTHOR_DATE: date,
GIT_COMMITTER_NAME: "E2E Test Bot",
GIT_COMMITTER_EMAIL: "e2e-bot@test.local",
GIT_COMMITTER_DATE: date,
},
});
}
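For reference, the timestamp scheme above can be isolated into a pure function (a sketch; `commitDate` is not part of the script itself):

```typescript
// Each commit gets a unique, deterministic minute offset from 2024-01-15T10:00Z,
// so repeated runs of the script produce byte-identical repositories.
function commitDate(counter: number): string {
  const hour = String(10 + Math.floor(counter / 60)).padStart(2, "0");
  const minute = String(counter % 60).padStart(2, "0");
  return `2024-01-15T${hour}:${minute}:00+00:00`;
}

console.log(commitDate(1));  // 2024-01-15T10:01:00+00:00
console.log(commitDate(61)); // 2024-01-15T11:01:00+00:00
```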
function writeFile(repoDir: string, relPath: string, content: string): void {
const fullPath = join(repoDir, relPath);
const dir = fullPath.substring(0, fullPath.lastIndexOf("/"));
if (dir && !existsSync(dir)) {
mkdirSync(dir, { recursive: true });
}
writeFileSync(fullPath, content, "utf-8");
}
interface RepoSpec {
owner: string;
name: string;
description: string;
/** Function that populates the working repo with commits/branches/tags */
populate: (workDir: string) => void;
}
/**
* Creates a bare repo at <outputDir>/<owner>/<name>.git
* by first building a working repo, then cloning it as bare.
*/
function createBareRepo(spec: RepoSpec): string {
const barePath = join(outputDir, spec.owner, `${spec.name}.git`);
const workPath = join(outputDir, ".work", spec.owner, spec.name);
// Clean previous
rmSync(barePath, { recursive: true, force: true });
rmSync(workPath, { recursive: true, force: true });
// Create working repo
mkdirSync(workPath, { recursive: true });
git("init -b main", workPath);
git("config user.name 'E2E Test Bot'", workPath);
git("config user.email 'e2e-bot@test.local'", workPath);
// Populate with content
spec.populate(workPath);
// Clone as bare
mkdirSync(join(outputDir, spec.owner), { recursive: true });
git(`clone --bare "${workPath}" "${barePath}"`, outputDir);
// Enable dumb HTTP protocol support
git("update-server-info", barePath);
// Also enable the post-update hook so update-server-info runs on push
const hookPath = join(barePath, "hooks", "post-update");
mkdirSync(join(barePath, "hooks"), { recursive: true });
writeFileSync(hookPath, "#!/bin/sh\nexec git update-server-info\n", {
mode: 0o755,
});
return barePath;
}
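Condensed to plain commands, the bare-repo dance above looks like this (a sketch using hypothetical `/tmp` paths; requires git ≥ 2.28 for `init -b`):

```shell
set -e
rm -rf /tmp/e2e-demo && mkdir -p /tmp/e2e-demo/work
cd /tmp/e2e-demo/work
git init -q -b main .
git -c user.name="E2E Test Bot" -c user.email=e2e-bot@test.local \
  commit -q --allow-empty -m "Initial commit"
# Clone the working repo as a bare repo, then generate info/refs and
# objects/info/packs so any static file server can serve it over dumb HTTP.
git clone -q --bare /tmp/e2e-demo/work /tmp/e2e-demo/demo.git
git -C /tmp/e2e-demo/demo.git update-server-info
ls /tmp/e2e-demo/demo.git/info/refs
```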
// ─── Repository Definitions ──────────────────────────────────────────────────
const repos: RepoSpec[] = [
// ── my-project: feature-rich repo ────────────────────────────────────────
{
owner: "e2e-test-user",
name: "my-project",
description: "A test project with branches, tags, and multiple commits",
populate(dir) {
// Initial commit
writeFile(
dir,
"README.md",
"# My Project\n\nA sample project for E2E testing.\n",
);
writeFile(
dir,
"package.json",
JSON.stringify(
{
name: "my-project",
version: "1.0.0",
description: "E2E test project",
main: "index.js",
},
null,
2,
) + "\n",
);
writeFile(
dir,
"index.js",
'// Main entry point\nconsole.log("Hello from my-project");\n',
);
writeFile(dir, ".gitignore", "node_modules/\ndist/\n.env\n");
git("add -A", dir);
gitCommit("Initial commit", dir);
// Second commit
writeFile(
dir,
"src/lib.js",
"export function greet(name) {\n return `Hello, ${name}!`;\n}\n",
);
writeFile(
dir,
"src/utils.js",
"export function sum(a, b) {\n return a + b;\n}\n",
);
git("add -A", dir);
gitCommit("Add library modules", dir);
// Tag v1.0.0
git("tag -a v1.0.0 -m 'Initial release'", dir);
// Create develop branch
git("checkout -b develop", dir);
writeFile(
dir,
"src/feature.js",
"export function newFeature() {\n return 'coming soon';\n}\n",
);
git("add -A", dir);
gitCommit("Add new feature placeholder", dir);
// Create feature branch from develop
git("checkout -b feature/add-tests", dir);
writeFile(
dir,
"tests/lib.test.js",
`import { greet } from '../src/lib.js';
import { sum } from '../src/utils.js';
console.assert(greet('World') === 'Hello, World!');
console.assert(sum(2, 3) === 5);
console.log('All tests passed');
`,
);
git("add -A", dir);
gitCommit("Add unit tests", dir);
// Go back to main and add another commit
git("checkout main", dir);
writeFile(
dir,
"README.md",
"# My Project\n\nA sample project for E2E testing.\n\n## Features\n- Greeting module\n- Math utilities\n",
);
git("add -A", dir);
gitCommit("Update README with features list", dir);
// Tag v1.1.0
git("tag -a v1.1.0 -m 'Feature update'", dir);
// Third commit on main for more history
writeFile(dir, "LICENSE", "MIT License\n\nCopyright (c) 2024 E2E Test\n");
git("add -A", dir);
gitCommit("Add MIT license", dir);
},
},
// ── dotfiles: simple config repo ─────────────────────────────────────────
{
owner: "e2e-test-user",
name: "dotfiles",
description: "Personal configuration files",
populate(dir) {
writeFile(
dir,
".bashrc",
"# Bash configuration\nalias ll='ls -la'\nalias gs='git status'\nexport EDITOR=vim\n",
);
writeFile(
dir,
".vimrc",
'" Vim configuration\nset number\nset tabstop=2\nset shiftwidth=2\nset expandtab\nsyntax on\n',
);
writeFile(
dir,
".gitconfig",
"[user]\n name = E2E Test User\n email = e2e@test.local\n[alias]\n co = checkout\n br = branch\n st = status\n",
);
git("add -A", dir);
gitCommit("Add dotfiles", dir);
writeFile(
dir,
".tmux.conf",
"# Tmux configuration\nset -g mouse on\nset -g default-terminal 'screen-256color'\n",
);
writeFile(
dir,
"install.sh",
'#!/bin/bash\n# Symlink dotfiles to home\nfor f in .bashrc .vimrc .gitconfig .tmux.conf; do\n ln -sf "$(pwd)/$f" "$HOME/$f"\ndone\necho \'Dotfiles installed!\'\n',
);
git("add -A", dir);
gitCommit("Add tmux config and install script", dir);
},
},
// ── notes: minimal single-commit repo ────────────────────────────────────
{
owner: "e2e-test-user",
name: "notes",
description: "Personal notes and documentation",
populate(dir) {
writeFile(
dir,
"README.md",
"# Notes\n\nA collection of personal notes.\n",
);
writeFile(
dir,
"ideas.md",
"# Ideas\n\n- Build a mirror tool\n- Automate backups\n- Learn Rust\n",
);
writeFile(
dir,
"todo.md",
"# TODO\n\n- [x] Set up repository\n- [ ] Add more notes\n- [ ] Organize by topic\n",
);
git("add -A", dir);
gitCommit("Initial notes", dir);
},
},
// ── popular-lib: starred repo from another user ──────────────────────────
{
owner: "other-user",
name: "popular-lib",
description: "A popular library that we starred",
populate(dir) {
writeFile(
dir,
"README.md",
"# Popular Lib\n\nA widely-used utility library.\n\n## Installation\n\n```bash\nnpm install popular-lib\n```\n",
);
writeFile(
dir,
"package.json",
JSON.stringify(
{
name: "popular-lib",
version: "2.5.0",
description: "A widely-used utility library",
main: "dist/index.js",
license: "Apache-2.0",
},
null,
2,
) + "\n",
);
writeFile(
dir,
"src/index.ts",
`/**
* Popular Lib - utility functions
*/
export function capitalize(str: string): string {
return str.charAt(0).toUpperCase() + str.slice(1);
}
export function slugify(str: string): string {
return str.toLowerCase().replace(/\\s+/g, '-').replace(/[^a-z0-9-]/g, '');
}
export function truncate(str: string, len: number): string {
if (str.length <= len) return str;
return str.slice(0, len) + '...';
}
`,
);
git("add -A", dir);
gitCommit("Initial release of popular-lib", dir);
git("tag -a v2.5.0 -m 'Stable release 2.5.0'", dir);
// Add a second commit
writeFile(
dir,
"CHANGELOG.md",
"# Changelog\n\n## 2.5.0\n- Added capitalize, slugify, truncate\n\n## 2.4.0\n- Bug fixes\n",
);
git("add -A", dir);
gitCommit("Add changelog", dir);
},
},
// ── org-tool: organization repo ──────────────────────────────────────────
{
owner: "test-org",
name: "org-tool",
description: "Internal organization tooling",
populate(dir) {
writeFile(
dir,
"README.md",
"# Org Tool\n\nInternal tooling for test-org.\n\n## Usage\n\n```bash\norg-tool run <command>\n```\n",
);
writeFile(
dir,
"main.go",
`package main
import "fmt"
func main() {
\tfmt.Println("org-tool v0.1.0")
}
`,
);
writeFile(
dir,
"go.mod",
"module github.com/test-org/org-tool\n\ngo 1.21\n",
);
writeFile(
dir,
"Makefile",
"build:\n\tgo build -o org-tool .\n\ntest:\n\tgo test ./...\n\nclean:\n\trm -f org-tool\n",
);
git("add -A", dir);
gitCommit("Initial org tool", dir);
// Add a release branch
git("checkout -b release/v0.1", dir);
writeFile(dir, "VERSION", "0.1.0\n");
git("add -A", dir);
gitCommit("Pin version for release", dir);
git("tag -a v0.1.0 -m 'Release v0.1.0'", dir);
// Back to main with more work
git("checkout main", dir);
writeFile(
dir,
"cmd/serve.go",
`package cmd
import "fmt"
func Serve() {
\tfmt.Println("Starting server on :8080")
}
`,
);
git("add -A", dir);
gitCommit("Add serve command", dir);
},
},
];
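Read together, the populate functions above imply the following ref layout, which is what the Gitea-side assertions can check after mirroring (this table is derived from the specs, not part of the script):

```typescript
// Branches and tags each repo should expose once mirrored.
const expectedRefs: Record<string, { branches: string[]; tags: string[] }> = {
  "e2e-test-user/my-project": {
    branches: ["main", "develop", "feature/add-tests"],
    tags: ["v1.0.0", "v1.1.0"],
  },
  "e2e-test-user/dotfiles": { branches: ["main"], tags: [] },
  "e2e-test-user/notes": { branches: ["main"], tags: [] },
  "other-user/popular-lib": { branches: ["main"], tags: ["v2.5.0"] },
  "test-org/org-tool": { branches: ["main", "release/v0.1"], tags: ["v0.1.0"] },
};

console.log(Object.keys(expectedRefs).length); // 5
```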
// ─── Main ────────────────────────────────────────────────────────────────────
function main() {
console.log(
"╔══════════════════════════════════════════════════════════════╗",
);
console.log(
"║ Create E2E Test Git Repositories ║",
);
console.log(
"╠══════════════════════════════════════════════════════════════╣",
);
console.log(`║ Output directory: ${outputDir}`);
console.log(`║ Repositories: ${repos.length}`);
console.log(
"╚══════════════════════════════════════════════════════════════╝",
);
console.log("");
// Verify git is available
try {
const version = execSync("git --version", { encoding: "utf-8" }).trim();
console.log(`[setup] Git version: ${version}`);
} catch {
console.error("ERROR: git is not installed or not in PATH");
process.exit(1);
}
// Clean output directory (it is recreated fresh below)
if (existsSync(outputDir)) {
console.log("[setup] Cleaning previous repos...");
rmSync(outputDir, { recursive: true, force: true });
}
mkdirSync(outputDir, { recursive: true });
// Create each repository
const created: string[] = [];
for (const spec of repos) {
const label = `${spec.owner}/${spec.name}`;
console.log(`\n[repo] Creating ${label} ...`);
try {
const barePath = createBareRepo(spec);
console.log(`[repo] ✓ ${label} → ${barePath}`);
created.push(label);
} catch (err) {
console.error(`[repo] ✗ ${label} FAILED:`, err);
process.exit(1);
}
}
// Cleanup working directories
const workDir = join(outputDir, ".work");
if (existsSync(workDir)) {
rmSync(workDir, { recursive: true, force: true });
}
// Write a manifest file so other scripts know what repos exist
const manifest = {
createdAt: new Date().toISOString(),
outputDir,
repos: repos.map((r) => ({
owner: r.owner,
name: r.name,
description: r.description,
barePath: `${r.owner}/${r.name}.git`,
})),
};
writeFileSync(
join(outputDir, "manifest.json"),
JSON.stringify(manifest, null, 2) + "\n",
"utf-8",
);
console.log(
"\n═══════════════════════════════════════════════════════════════",
);
console.log(` ✅ Created ${created.length} bare repositories:`);
for (const name of created) {
console.log(`   - ${name}.git`);
}
console.log(`\n Manifest: ${join(outputDir, "manifest.json")}`);
console.log(
"═══════════════════════════════════════════════════════════════",
);
}
main();

tests/e2e/docker-compose.e2e.yml Normal file

@@ -0,0 +1,105 @@
# E2E testing environment
# Spins up a Gitea instance and a git HTTP server for integration testing.
#
# The git-server container serves bare git repositories created by
# create-test-repos.ts via the "dumb HTTP" protocol so that Gitea can
# actually clone them during mirror operations.
#
# Usage: podman-compose -f tests/e2e/docker-compose.e2e.yml up -d
services:
gitea-e2e:
image: docker.io/gitea/gitea:1.22
container_name: gitea-e2e
environment:
- USER_UID=1000
- USER_GID=1000
- GITEA__database__DB_TYPE=sqlite3
- GITEA__database__PATH=/data/gitea/gitea.db
- GITEA__server__DOMAIN=localhost
- GITEA__server__ROOT_URL=http://localhost:3333/
- GITEA__server__HTTP_PORT=3000
- GITEA__server__SSH_DOMAIN=localhost
- GITEA__server__START_SSH_SERVER=false
- GITEA__security__INSTALL_LOCK=true
- GITEA__service__DISABLE_REGISTRATION=false
- GITEA__service__REQUIRE_SIGNIN_VIEW=false
- GITEA__api__ENABLE_SWAGGER=false
- GITEA__log__MODE=console
- GITEA__log__LEVEL=Warn
- GITEA__mirror__ENABLED=true
- GITEA__mirror__DEFAULT_INTERVAL=1m
- GITEA__mirror__MIN_INTERVAL=1m
# Allow migrations from any domain including the git-server container
- GITEA__migrations__ALLOWED_DOMAINS=*
- GITEA__migrations__ALLOW_LOCAL_NETWORKS=true
- GITEA__migrations__SKIP_TLS_VERIFY=true
ports:
- "3333:3000"
volumes:
- e2e-gitea-data:/data
depends_on:
git-server:
condition: service_started
healthcheck:
test:
[
"CMD",
"wget",
"--no-verbose",
"--tries=1",
"--spider",
"http://localhost:3000/",
]
interval: 5s
timeout: 5s
retries: 30
start_period: 10s
tmpfs:
- /tmp
networks:
- e2e-net
# Lightweight HTTP server that serves bare git repositories.
# Repos are created on the host by create-test-repos.ts and bind-mounted
# into this container. Gitea clones from http://git-server/<owner>/<name>.git
# using the "dumb HTTP" protocol (repos have git update-server-info run).
git-server:
image: docker.io/alpine:3.19
container_name: git-server
command:
- sh
- -c
- |
apk add --no-cache darkhttpd >/dev/null 2>&1
echo "[git-server] Serving repos from /repos on port 80"
ls -la /repos/ 2>/dev/null || echo "[git-server] WARNING: /repos is empty"
exec darkhttpd /repos --port 80 --no-listing --log /dev/stdout
volumes:
- ./git-repos:/repos:ro
ports:
- "4590:80"
healthcheck:
test:
[
"CMD",
"wget",
"--no-verbose",
"--tries=1",
"--spider",
"http://localhost:80/manifest.json",
]
interval: 3s
timeout: 3s
retries: 15
start_period: 5s
networks:
- e2e-net
networks:
e2e-net:
driver: bridge
volumes:
e2e-gitea-data:
driver: local

File diff suppressed because it is too large

tests/e2e/helpers.ts Normal file

@@ -0,0 +1,666 @@
/**
* Shared helpers for E2E tests.
*
* Exports constants, the GiteaAPI wrapper, auth helpers (sign-up / sign-in),
* the saveConfig helper, and a generic waitFor polling utility.
*/
import {
expect,
request as playwrightRequest,
type Page,
type APIRequestContext,
} from "@playwright/test";
// ─── Constants ───────────────────────────────────────────────────────────────
export const APP_URL = process.env.APP_URL || "http://localhost:4321";
export const GITEA_URL = process.env.GITEA_URL || "http://localhost:3333";
export const FAKE_GITHUB_URL =
process.env.FAKE_GITHUB_URL || "http://localhost:4580";
export const GIT_SERVER_URL =
process.env.GIT_SERVER_URL || "http://localhost:4590";
export const GITEA_ADMIN_USER = "e2e_admin";
export const GITEA_ADMIN_PASS = "e2eAdminPass123!";
export const GITEA_ADMIN_EMAIL = "admin@e2e-test.local";
export const APP_USER_EMAIL = "e2e@test.local";
export const APP_USER_PASS = "E2eTestPass123!";
export const APP_USER_NAME = "e2e-tester";
export const GITEA_MIRROR_ORG = "github-mirrors";
// ─── waitFor ─────────────────────────────────────────────────────────────────
/** Retry a function until it returns truthy or timeout is reached. */
export async function waitFor(
fn: () => Promise<boolean>,
{
timeout = 60_000,
interval = 2_000,
label = "condition",
}: { timeout?: number; interval?: number; label?: string } = {},
): Promise<void> {
const deadline = Date.now() + timeout;
let lastErr: Error | undefined;
while (Date.now() < deadline) {
try {
if (await fn()) return;
} catch (e) {
lastErr = e instanceof Error ? e : new Error(String(e));
}
await new Promise((r) => setTimeout(r, interval));
}
throw new Error(
`waitFor("${label}") timed out after ${timeout}ms` +
(lastErr ? `: ${lastErr.message}` : ""),
);
}
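A self-contained run of the polling pattern (the helper is copied inline as `waitForSketch` so the sketch executes standalone; the counter stands in for a real condition like "repo visible in Gitea"):

```typescript
async function waitForSketch(
  fn: () => Promise<boolean>,
  { timeout = 60_000, interval = 2_000, label = "condition" } = {},
): Promise<void> {
  const deadline = Date.now() + timeout;
  while (Date.now() < deadline) {
    try {
      if (await fn()) return; // condition met
    } catch {
      // swallow errors and retry until the deadline
    }
    await new Promise((r) => setTimeout(r, interval));
  }
  throw new Error(`waitFor("${label}") timed out after ${timeout}ms`);
}

// Condition becomes true on the third poll.
let polls = 0;
waitForSketch(async () => ++polls >= 3, { timeout: 1_000, interval: 5 })
  .then(() => console.log(`condition met after ${polls} polls`));
```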
// ─── GiteaAPI ────────────────────────────────────────────────────────────────
/**
* Direct HTTP helper for talking to Gitea's API.
*
* Uses a manually-created APIRequestContext so it can be shared across
* beforeAll / afterAll / individual tests without hitting Playwright's
* "fixture from beforeAll cannot be reused" restriction.
*/
export class GiteaAPI {
private token = "";
private ctx: APIRequestContext | null = null;
constructor(private baseUrl: string) {}
/** Lazily create (and cache) a Playwright APIRequestContext. */
private async getCtx(): Promise<APIRequestContext> {
if (!this.ctx) {
this.ctx = await playwrightRequest.newContext({
baseURL: this.baseUrl,
});
}
return this.ctx;
}
/** Dispose of the underlying context; call this in afterAll. */
async dispose(): Promise<void> {
if (this.ctx) {
await this.ctx.dispose();
this.ctx = null;
}
}
/** Create the admin user via Gitea's sign-up form (first user becomes admin). */
async ensureAdminUser(): Promise<void> {
const ctx = await this.getCtx();
// Check if admin already exists by trying basic-auth
try {
const resp = await ctx.get(`/api/v1/user`, {
headers: {
Authorization: `Basic ${btoa(`${GITEA_ADMIN_USER}:${GITEA_ADMIN_PASS}`)}`,
},
failOnStatusCode: false,
});
if (resp.ok()) {
console.log("[GiteaAPI] Admin user already exists");
return;
}
} catch {
// Expected on first run
}
// Register through the form; the first user automatically becomes admin
console.log("[GiteaAPI] Creating admin via sign-up form...");
const signUpResp = await ctx.post(`/user/sign_up`, {
form: {
user_name: GITEA_ADMIN_USER,
password: GITEA_ADMIN_PASS,
retype: GITEA_ADMIN_PASS,
email: GITEA_ADMIN_EMAIL,
},
failOnStatusCode: false,
maxRedirects: 5,
});
console.log(`[GiteaAPI] Sign-up response status: ${signUpResp.status()}`);
// Verify
const check = await ctx.get(`/api/v1/user`, {
headers: {
Authorization: `Basic ${btoa(`${GITEA_ADMIN_USER}:${GITEA_ADMIN_PASS}`)}`,
},
failOnStatusCode: false,
});
if (!check.ok()) {
throw new Error(
`Failed to verify admin user after creation (status ${check.status()})`,
);
}
console.log("[GiteaAPI] Admin user verified");
}
/** Generate a Gitea API token for the admin user. */
async createToken(): Promise<string> {
if (this.token) return this.token;
const ctx = await this.getCtx();
const tokenName = `e2e-token-${Date.now()}`;
const resp = await ctx.post(`/api/v1/users/${GITEA_ADMIN_USER}/tokens`, {
headers: {
Authorization: `Basic ${btoa(`${GITEA_ADMIN_USER}:${GITEA_ADMIN_PASS}`)}`,
"Content-Type": "application/json",
},
data: {
name: tokenName,
scopes: [
"read:user",
"write:user",
"read:organization",
"write:organization",
"read:repository",
"write:repository",
"read:issue",
"write:issue",
"read:misc",
"write:misc",
"read:admin",
"write:admin",
],
},
});
expect(
resp.ok(),
`Failed to create Gitea token: ${resp.status()}`,
).toBeTruthy();
const data = await resp.json();
this.token = data.sha1 || data.token;
console.log(`[GiteaAPI] Created token: ${tokenName}`);
return this.token;
}
/** Create an organization in Gitea. */
async ensureOrg(orgName: string): Promise<void> {
const ctx = await this.getCtx();
const token = await this.createToken();
// Check if org exists
const check = await ctx.get(`/api/v1/orgs/${orgName}`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (check.ok()) {
console.log(`[GiteaAPI] Org ${orgName} already exists`);
return;
}
const resp = await ctx.post(`/api/v1/orgs`, {
headers: {
Authorization: `token ${token}`,
"Content-Type": "application/json",
},
data: {
username: orgName,
full_name: orgName,
description: "E2E test mirror organization",
visibility: "public",
},
});
expect(resp.ok(), `Failed to create org: ${resp.status()}`).toBeTruthy();
console.log(`[GiteaAPI] Created org: ${orgName}`);
}
/** List repos in a Gitea org. */
async listOrgRepos(orgName: string): Promise<any[]> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(`/api/v1/orgs/${orgName}/repos`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (!resp.ok()) return [];
return resp.json();
}
/** List repos for the admin user. */
async listUserRepos(): Promise<any[]> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(`/api/v1/users/${GITEA_ADMIN_USER}/repos`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (!resp.ok()) return [];
return resp.json();
}
/** Get a specific repo. */
async getRepo(owner: string, name: string): Promise<any | null> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(`/api/v1/repos/${owner}/${name}`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (!resp.ok()) return null;
return resp.json();
}
/** List branches for a repo. */
async listBranches(owner: string, name: string): Promise<any[]> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(`/api/v1/repos/${owner}/${name}/branches`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (!resp.ok()) return [];
return resp.json();
}
/** List tags for a repo. */
async listTags(owner: string, name: string): Promise<any[]> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(`/api/v1/repos/${owner}/${name}/tags`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (!resp.ok()) return [];
return resp.json();
}
/** List commits for a repo (on default branch). */
async listCommits(
owner: string,
name: string,
opts?: { sha?: string; limit?: number },
): Promise<any[]> {
const ctx = await this.getCtx();
const token = await this.createToken();
const params = new URLSearchParams();
if (opts?.sha) params.set("sha", opts.sha);
if (opts?.limit) params.set("limit", String(opts.limit));
const qs = params.toString() ? `?${params.toString()}` : "";
const resp = await ctx.get(
`/api/v1/repos/${owner}/${name}/commits${qs}`,
{
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
},
);
if (!resp.ok()) return [];
return resp.json();
}
/** Get a single branch (includes the commit SHA). */
async getBranch(
owner: string,
name: string,
branch: string,
): Promise<any | null> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(
`/api/v1/repos/${owner}/${name}/branches/${branch}`,
{
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
},
);
if (!resp.ok()) return null;
return resp.json();
}
/** Get file content from a repo. */
async getFileContent(
owner: string,
name: string,
filePath: string,
ref?: string,
): Promise<string | null> {
const ctx = await this.getCtx();
const token = await this.createToken();
const refQuery = ref ? `?ref=${encodeURIComponent(ref)}` : "";
const resp = await ctx.get(
`/api/v1/repos/${owner}/${name}/raw/${filePath}${refQuery}`,
{
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
},
);
if (!resp.ok()) return null;
return resp.text();
}
/** Get a commit by SHA. */
async getCommit(
owner: string,
name: string,
sha: string,
): Promise<any | null> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(
`/api/v1/repos/${owner}/${name}/git/commits/${sha}`,
{
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
},
);
if (!resp.ok()) return null;
return resp.json();
}
/** Trigger mirror sync for a repo via the Gitea API directly. */
async triggerMirrorSync(owner: string, name: string): Promise<boolean> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.post(
`/api/v1/repos/${owner}/${name}/mirror-sync`,
{
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
},
);
return resp.ok();
}
getTokenValue(): string {
return this.token;
}
}
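The Basic credential that `ensureAdminUser` and `createToken` send is just base64 of `user:pass`. A standalone round-trip check (`btoa`/`atob` are globals in Bun and Node ≥ 16; values mirror the constants above):

```typescript
const user = "e2e_admin";
const pass = "e2eAdminPass123!";
// The value placed in the Authorization header for admin bootstrap calls.
const header = `Basic ${btoa(`${user}:${pass}`)}`;
// Decode it back to confirm the encoding.
const decoded = atob(header.slice("Basic ".length));
console.log(decoded === `${user}:${pass}`); // true
```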
// ─── App auth helpers ────────────────────────────────────────────────────────
/**
* Sign up + sign in to the gitea-mirror app using the Better Auth REST API
* and return the session cookie string.
*/
export async function getAppSessionCookies(
request: APIRequestContext,
): Promise<string> {
// 1. Try sign-in first (user may already exist from a previous test / run)
const signInResp = await request.post(`${APP_URL}/api/auth/sign-in/email`, {
data: { email: APP_USER_EMAIL, password: APP_USER_PASS },
failOnStatusCode: false,
});
if (signInResp.ok()) {
const cookies = extractSetCookies(signInResp);
if (cookies) {
console.log("[App] Signed in (existing user)");
return cookies;
}
}
// 2. Register
const signUpResp = await request.post(`${APP_URL}/api/auth/sign-up/email`, {
data: {
name: APP_USER_NAME,
email: APP_USER_EMAIL,
password: APP_USER_PASS,
},
failOnStatusCode: false,
});
const signUpStatus = signUpResp.status();
console.log(`[App] Sign-up response: ${signUpStatus}`);
// After sign-up Better Auth may already set a session cookie
const signUpCookies = extractSetCookies(signUpResp);
if (signUpCookies) {
console.log("[App] Got session from sign-up response");
return signUpCookies;
}
// 3. Sign in after registration
const postRegSignIn = await request.post(
`${APP_URL}/api/auth/sign-in/email`,
{
data: { email: APP_USER_EMAIL, password: APP_USER_PASS },
failOnStatusCode: false,
},
);
if (!postRegSignIn.ok()) {
const body = await postRegSignIn.text();
throw new Error(
`Sign-in after registration failed (${postRegSignIn.status()}): ${body}`,
);
}
const cookies = extractSetCookies(postRegSignIn);
if (!cookies) {
throw new Error("Sign-in succeeded but no session cookie was returned");
}
console.log("[App] Signed in (after registration)");
return cookies;
}
/**
* Extract session cookies from a response's `set-cookie` headers.
*/
export function extractSetCookies(
resp: Awaited<ReturnType<APIRequestContext["post"]>>,
): string {
const raw = resp
.headersArray()
.filter((h) => h.name.toLowerCase() === "set-cookie");
if (raw.length === 0) return "";
const pairs: string[] = [];
for (const header of raw) {
const nv = header.value.split(";")[0].trim();
if (nv) pairs.push(nv);
}
return pairs.join("; ");
}
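The folding logic is easiest to see on concrete data (a standalone copy of the same filter/split/join, with made-up header values):

```typescript
// Keep only the name=value pair from each Set-Cookie header, join with "; ".
function foldSetCookies(headers: { name: string; value: string }[]): string {
  return headers
    .filter((h) => h.name.toLowerCase() === "set-cookie")
    .map((h) => h.value.split(";")[0].trim())
    .filter(Boolean)
    .join("; ");
}

const folded = foldSetCookies([
  { name: "set-cookie", value: "session=abc123; Path=/; HttpOnly" },
  { name: "content-type", value: "application/json" },
  { name: "Set-Cookie", value: "csrf=xyz; SameSite=Lax" },
]);
console.log(folded); // session=abc123; csrf=xyz
```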
/**
* Sign in via the browser UI so the browser context gets session cookies.
*/
export async function signInViaBrowser(page: Page): Promise<string> {
const signInResp = await page.request.post(
`${APP_URL}/api/auth/sign-in/email`,
{
data: { email: APP_USER_EMAIL, password: APP_USER_PASS },
failOnStatusCode: false,
},
);
if (!signInResp.ok()) {
const signUpResp = await page.request.post(
`${APP_URL}/api/auth/sign-up/email`,
{
data: {
name: APP_USER_NAME,
email: APP_USER_EMAIL,
password: APP_USER_PASS,
},
failOnStatusCode: false,
},
);
console.log(`[Browser] Sign-up status: ${signUpResp.status()}`);
const retryResp = await page.request.post(
`${APP_URL}/api/auth/sign-in/email`,
{
data: { email: APP_USER_EMAIL, password: APP_USER_PASS },
failOnStatusCode: false,
},
);
if (!retryResp.ok()) {
console.log(`[Browser] Sign-in retry failed: ${retryResp.status()}`);
}
}
await page.goto(`${APP_URL}/`);
await page.waitForLoadState("networkidle");
const url = page.url();
console.log(`[Browser] After sign-in, URL: ${url}`);
const cookies = await page.context().cookies();
return cookies.map((c) => `${c.name}=${c.value}`).join("; ");
}
// ─── Config helper ───────────────────────────────────────────────────────────
/** Save app config via the API. */
export async function saveConfig(
request: APIRequestContext,
giteaToken: string,
cookies: string,
overrides: Record<string, any> = {},
): Promise<void> {
const giteaConfigDefaults = {
url: GITEA_URL,
username: GITEA_ADMIN_USER,
token: giteaToken,
organization: GITEA_MIRROR_ORG,
visibility: "public",
starredReposOrg: "github-stars",
preserveOrgStructure: false,
mirrorStrategy: "single-org",
backupBeforeSync: false,
blockSyncOnBackupFailure: false,
};
const configPayload = {
githubConfig: {
username: "e2e-test-user",
token: "fake-github-token-for-e2e",
privateRepositories: false,
mirrorStarred: true,
},
giteaConfig: { ...giteaConfigDefaults, ...(overrides.giteaConfig || {}) },
scheduleConfig: {
enabled: false,
interval: 3600,
},
cleanupConfig: {
enabled: false,
retentionDays: 86400,
deleteIfNotInGitHub: false,
orphanedRepoAction: "skip",
dryRun: true,
},
mirrorOptions: {
mirrorReleases: false,
mirrorLFS: false,
mirrorMetadata: false,
metadataComponents: {
issues: false,
pullRequests: false,
labels: false,
milestones: false,
wiki: false,
},
},
advancedOptions: {
skipForks: false,
starredCodeOnly: false,
},
};
const resp = await request.post(`${APP_URL}/api/config`, {
data: configPayload,
headers: {
"Content-Type": "application/json",
Cookie: cookies,
},
failOnStatusCode: false,
});
const status = resp.status();
console.log(`[App] Save config response: ${status}`);
if (status >= 400) {
const body = await resp.text();
console.log(`[App] Config error body: ${body}`);
}
expect(status, "Config save should not return server error").toBeLessThan(
500,
);
}
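Note the override semantics: `giteaConfig` is merged with a shallow spread, so a single overridden key leaves every other default intact. A minimal sketch of that expression:

```typescript
const giteaConfigDefaults = {
  visibility: "public",
  backupBeforeSync: false,
  blockSyncOnBackupFailure: false,
};
const overrides: Record<string, any> = { giteaConfig: { backupBeforeSync: true } };
// Same expression saveConfig() uses: later keys win, the rest keep defaults.
const merged = { ...giteaConfigDefaults, ...(overrides.giteaConfig || {}) };
console.log(merged.backupBeforeSync, merged.blockSyncOnBackupFailure); // true false
```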
// ─── Dashboard / repo helpers ────────────────────────────────────────────────
/**
* Fetch the list of repository IDs from the app's dashboard API.
* Optionally filter to repos with a given status.
*/
export async function getRepositoryIds(
request: APIRequestContext,
cookies: string,
opts?: { status?: string },
): Promise<{ ids: string[]; repos: any[] }> {
const dashResp = await request.get(`${APP_URL}/api/dashboard`, {
headers: { Cookie: cookies },
failOnStatusCode: false,
});
if (!dashResp.ok()) return { ids: [], repos: [] };
const dashData = await dashResp.json();
const repos: any[] = dashData.repositories ?? dashData.repos ?? [];
const filtered = opts?.status
? repos.filter((r: any) => r.status === opts.status)
: repos;
return {
ids: filtered.map((r: any) => r.id),
repos: filtered,
};
}
/**
* Trigger mirror jobs for the given repository IDs via the app API,
* then wait for a specified delay for async processing.
*/
export async function triggerMirrorJobs(
request: APIRequestContext,
cookies: string,
repositoryIds: string[],
waitMs = 30_000,
): Promise<number> {
const mirrorResp = await request.post(`${APP_URL}/api/job/mirror-repo`, {
headers: {
"Content-Type": "application/json",
Cookie: cookies,
},
data: { repositoryIds },
failOnStatusCode: false,
});
const status = mirrorResp.status();
if (waitMs > 0) {
await new Promise((r) => setTimeout(r, waitMs));
}
return status;
}
/**
* Trigger sync-repo (re-sync already-mirrored repos) for the given
* repository IDs, then wait for processing.
*/
export async function triggerSyncRepo(
request: APIRequestContext,
cookies: string,
repositoryIds: string[],
waitMs = 25_000,
): Promise<number> {
const syncResp = await request.post(`${APP_URL}/api/job/sync-repo`, {
headers: {
"Content-Type": "application/json",
Cookie: cookies,
},
data: { repositoryIds },
failOnStatusCode: false,
});
const status = syncResp.status();
if (waitMs > 0) {
await new Promise((r) => setTimeout(r, waitMs));
}
return status;
}


@@ -0,0 +1,98 @@
import { defineConfig, devices } from "@playwright/test";
/**
* Playwright configuration for gitea-mirror E2E tests.
*
* Expected services (started by run-e2e.sh before Playwright launches):
* - Fake GitHub API server on http://localhost:4580
* - Git HTTP server on http://localhost:4590
* - Gitea instance on http://localhost:3333
* - gitea-mirror app on http://localhost:4321
*
* Test files are numbered to enforce execution order (they share state
* via a single Gitea + app instance):
*   01-health.spec.ts            - service smoke tests
*   02-mirror-workflow.spec.ts   - full first-mirror journey
*   03-backup.spec.ts            - backup config toggling
*   04-force-push.spec.ts        - force-push simulation & backup verification
*   05-sync-verification.spec.ts - dynamic repos, content integrity, reset
*/
export default defineConfig({
testDir: ".",
testMatch: /\d+-.*\.spec\.ts$/,
/* Fail the build on CI if test.only is left in source */
forbidOnly: !!process.env.CI,
/* Retry once on CI to absorb flakiness from container startup races */
retries: process.env.CI ? 1 : 0,
/* Limit parallelism: the tests share a single Gitea + app instance */
workers: 1,
fullyParallel: false,
/* Generous timeout: mirrors involve real HTTP round-trips to Gitea */
timeout: 120_000,
expect: { timeout: 15_000 },
/* Reporter */
reporter: process.env.CI
? [
["github"],
["html", { open: "never", outputFolder: "playwright-report" }],
]
: [
["list"],
["html", { open: "on-failure", outputFolder: "playwright-report" }],
],
outputDir: "test-results",
use: {
/* Base URL of the gitea-mirror app */
baseURL: process.env.APP_URL || "http://localhost:4321",
/* Collect traces on first retry so CI failures are debuggable */
trace: "on-first-retry",
screenshot: "only-on-failure",
video: "retain-on-failure",
/* Extra HTTP headers aren't needed but keep accept consistent */
extraHTTPHeaders: {
Accept: "application/json, text/html, */*",
},
},
projects: [
{
name: "chromium",
use: { ...devices["Desktop Chrome"] },
},
],
/* We do NOT use webServer here because run-e2e.sh manages all services.
* On CI the GitHub Action workflow starts them before invoking Playwright.
* Locally, run-e2e.sh does the same.
*
* If you want Playwright to start the app for you during local dev, uncomment:
*
* webServer: [
* {
* command: "npx tsx tests/e2e/fake-github-server.ts",
* port: 4580,
* reuseExistingServer: true,
* timeout: 10_000,
* },
* {
* command: "bun run dev",
* port: 4321,
* reuseExistingServer: true,
* timeout: 30_000,
* env: {
* GITHUB_API_URL: "http://localhost:4580",
* BETTER_AUTH_SECRET: "e2e-test-secret",
* },
* },
* ],
*/
});

tests/e2e/run-e2e.sh Executable file

@@ -0,0 +1,455 @@
#!/usr/bin/env bash
# ────────────────────────────────────────────────────────────────────────────────
# E2E Test Orchestrator
#
# Starts all required services, runs Playwright E2E tests, and tears down.
#
# Services managed:
#   1. Gitea instance (Docker/Podman on port 3333)
#   2. Git HTTP server (container on port 4590)
#   3. Fake GitHub API (Node.js on port 4580)
#   4. gitea-mirror app (built Astro server on port 4321)
#
# Usage:
#   ./tests/e2e/run-e2e.sh             # full run (cleanup → start → test → teardown)
#   ./tests/e2e/run-e2e.sh --no-build  # skip the Astro build step
#   ./tests/e2e/run-e2e.sh --keep      # don't tear down services after tests
#   ./tests/e2e/run-e2e.sh --ci        # CI-friendly mode (stricter, no --keep)
#
# Environment variables:
#   GITEA_PORT        (default: 3333)
#   FAKE_GITHUB_PORT  (default: 4580)
#   APP_PORT          (default: 4321)
#   GIT_SERVER_PORT   (default: 4590)
#   SKIP_CLEANUP      (default: false)  set "true" to skip initial cleanup
#   BUN_CMD           (default: auto-detected bun or "npx --yes bun")
# ────────────────────────────────────────────────────────────────────────────────
set -euo pipefail
# ─── Resolve paths ────────────────────────────────────────────────────────────
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
COMPOSE_FILE="$SCRIPT_DIR/docker-compose.e2e.yml"
# ─── Configuration ────────────────────────────────────────────────────────────
GITEA_PORT="${GITEA_PORT:-3333}"
FAKE_GITHUB_PORT="${FAKE_GITHUB_PORT:-4580}"
APP_PORT="${APP_PORT:-4321}"
GIT_SERVER_PORT="${GIT_SERVER_PORT:-4590}"
GITEA_URL="http://localhost:${GITEA_PORT}"
FAKE_GITHUB_URL="http://localhost:${FAKE_GITHUB_PORT}"
APP_URL="http://localhost:${APP_PORT}"
GIT_SERVER_URL="http://localhost:${GIT_SERVER_PORT}"
# URL that Gitea (inside Docker) uses to reach the git-server container
GIT_SERVER_INTERNAL_URL="http://git-server"
NO_BUILD=false
KEEP_RUNNING=false
CI_MODE=false
for arg in "$@"; do
  case "$arg" in
    --no-build) NO_BUILD=true ;;
    --keep)     KEEP_RUNNING=true ;;
    --ci)       CI_MODE=true ;;
    --help|-h)
      echo "Usage: $0 [--no-build] [--keep] [--ci]"
      exit 0
      ;;
  esac
done
# ─── Detect tools ─────────────────────────────────────────────────────────────
# Container runtime
COMPOSE_CMD=""
CONTAINER_CMD=""
if command -v podman-compose &>/dev/null; then
  COMPOSE_CMD="podman-compose"
  CONTAINER_CMD="podman"
elif command -v docker-compose &>/dev/null; then
  COMPOSE_CMD="docker-compose"
  CONTAINER_CMD="docker"
elif command -v docker &>/dev/null && docker compose version &>/dev/null; then
  COMPOSE_CMD="docker compose"
  CONTAINER_CMD="docker"
else
  echo "ERROR: No container compose tool found. Install docker-compose or podman-compose."
  exit 1
fi

# Bun or fallback
if command -v bun &>/dev/null; then
  BUN_CMD="${BUN_CMD:-bun}"
elif command -v npx &>/dev/null; then
  # Use npx to run bun; this works on CI with the setup-bun action
  BUN_CMD="${BUN_CMD:-npx --yes bun}"
else
  echo "ERROR: Neither bun nor npx found."
  exit 1
fi

# Node/tsx for the fake GitHub server
if command -v tsx &>/dev/null; then
  TSX_CMD="tsx"
elif command -v npx &>/dev/null; then
  TSX_CMD="npx --yes tsx"
else
  echo "ERROR: Neither tsx nor npx found."
  exit 1
fi
echo "╔══════════════════════════════════════════════════════════════╗"
echo "║ E2E Test Orchestrator ║"
echo "╠══════════════════════════════════════════════════════════════╣"
echo "║ Container runtime : $COMPOSE_CMD"
echo "║ Bun command : $BUN_CMD"
echo "║ TSX command : $TSX_CMD"
echo "║ Gitea URL : $GITEA_URL"
echo "║ Fake GitHub URL : $FAKE_GITHUB_URL"
echo "║ App URL : $APP_URL"
echo "║ Git Server URL : $GIT_SERVER_URL"
echo "║ Git Server (int) : $GIT_SERVER_INTERNAL_URL"
echo "║ CI mode : $CI_MODE"
echo "╚══════════════════════════════════════════════════════════════╝"
echo ""
# ─── PID tracking for cleanup ─────────────────────────────────────────────────
FAKE_GITHUB_PID=""
APP_PID=""
EXIT_CODE=0
cleanup_on_exit() {
  local code=$?
  echo ""
  echo "────────────────────────────────────────────────────────────────"
  echo "[teardown] Cleaning up..."

  # Kill fake GitHub server
  if [[ -n "$FAKE_GITHUB_PID" ]] && kill -0 "$FAKE_GITHUB_PID" 2>/dev/null; then
    echo "[teardown] Stopping fake GitHub server (PID $FAKE_GITHUB_PID)..."
    kill "$FAKE_GITHUB_PID" 2>/dev/null || true
    wait "$FAKE_GITHUB_PID" 2>/dev/null || true
  fi
  rm -f "$SCRIPT_DIR/.fake-github.pid"

  # Kill app server
  if [[ -n "$APP_PID" ]] && kill -0 "$APP_PID" 2>/dev/null; then
    echo "[teardown] Stopping gitea-mirror app (PID $APP_PID)..."
    kill "$APP_PID" 2>/dev/null || true
    wait "$APP_PID" 2>/dev/null || true
  fi
  rm -f "$SCRIPT_DIR/.app.pid"

  # Stop containers (unless --keep)
  if [[ "$KEEP_RUNNING" == false ]]; then
    if [[ -n "$COMPOSE_CMD" ]] && [[ -f "$COMPOSE_FILE" ]]; then
      echo "[teardown] Stopping Gitea container..."
      $COMPOSE_CMD -f "$COMPOSE_FILE" down --volumes --remove-orphans 2>/dev/null || true
    fi
  else
    echo "[teardown] --keep flag set, leaving services running"
  fi

  echo "[teardown] Done."

  # Use the test exit code, not the cleanup exit code
  if [[ $EXIT_CODE -ne 0 ]]; then
    exit $EXIT_CODE
  fi
  exit $code
}
trap cleanup_on_exit EXIT INT TERM
# ─── Step 0: Cleanup previous run ────────────────────────────────────────────
if [[ "${SKIP_CLEANUP:-false}" != "true" ]]; then
  echo "┌──────────────────────────────────────────────────────────────┐"
  echo "│ Step 0: Cleanup previous E2E run │"
  echo "└──────────────────────────────────────────────────────────────┘"
  bash "$SCRIPT_DIR/cleanup.sh" --soft 2>/dev/null || true
  echo ""
fi
# ─── Step 1: Install dependencies ────────────────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 1: Install dependencies │"
echo "└──────────────────────────────────────────────────────────────┘"
cd "$PROJECT_ROOT"
$BUN_CMD install 2>&1 | tail -5
echo "[deps] ✓ Dependencies installed"
# Install Playwright browsers if needed
if ! npx playwright install --dry-run chromium &>/dev/null; then
  echo "[deps] Installing Playwright browsers..."
  npx playwright install chromium 2>&1 | tail -3
fi

# Ensure system deps are available on CI (fresh environments lack them)
if [[ "$CI_MODE" == true ]]; then
  echo "[deps] Installing Playwright system dependencies..."
  npx playwright install-deps chromium 2>&1 | tail -5 || true
fi
echo "[deps] ✓ Playwright ready"
echo ""
# ─── Step 1.5: Create test git repositories ─────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 1.5: Create test git repositories │"
echo "└──────────────────────────────────────────────────────────────┘"
GIT_REPOS_DIR="$SCRIPT_DIR/git-repos"
echo "[git-repos] Creating bare git repos in $GIT_REPOS_DIR ..."
$BUN_CMD run "$SCRIPT_DIR/create-test-repos.ts" --output-dir "$GIT_REPOS_DIR" 2>&1
if [[ ! -f "$GIT_REPOS_DIR/manifest.json" ]]; then
  echo "ERROR: Test git repos were not created (manifest.json missing)"
  EXIT_CODE=1
  exit 1
fi
echo "[git-repos] ✓ Test repositories created"
echo ""
# ─── Step 2: Build the app ──────────────────────────────────────────────────
if [[ "$NO_BUILD" == false ]]; then
  echo "┌──────────────────────────────────────────────────────────────┐"
  echo "│ Step 2: Build gitea-mirror │"
  echo "└──────────────────────────────────────────────────────────────┘"
  cd "$PROJECT_ROOT"

  # Initialize the database
  echo "[build] Initializing database..."
  $BUN_CMD run manage-db init 2>&1 | tail -3 || true

  # Build the Astro project
  echo "[build] Building Astro project..."
  GITHUB_API_URL="$FAKE_GITHUB_URL" \
  BETTER_AUTH_SECRET="e2e-test-secret" \
  $BUN_CMD run build 2>&1 | tail -10
  echo "[build] ✓ Build complete"
  echo ""
else
  echo "[build] Skipped (--no-build flag)"
  echo ""
fi
# ─── Step 3: Start Gitea container ──────────────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 3: Start Gitea container │"
echo "└──────────────────────────────────────────────────────────────┘"
$COMPOSE_CMD -f "$COMPOSE_FILE" up -d 2>&1
# Wait for git-server to be healthy first (Gitea depends on it)
echo "[git-server] Waiting for git HTTP server..."
GIT_SERVER_READY=false
for i in $(seq 1 30); do
  if curl -sf "${GIT_SERVER_URL}/manifest.json" &>/dev/null; then
    GIT_SERVER_READY=true
    break
  fi
  printf "."
  sleep 1
done
echo ""
if [[ "$GIT_SERVER_READY" != true ]]; then
  echo "ERROR: Git HTTP server did not start within 30 seconds"
  echo "[git-server] Container logs:"
  $COMPOSE_CMD -f "$COMPOSE_FILE" logs git-server --tail=20 2>/dev/null || true
  EXIT_CODE=1
  exit 1
fi
echo "[git-server] ✓ Git HTTP server is ready on $GIT_SERVER_URL"
echo "[gitea] Waiting for Gitea to become healthy..."
GITEA_READY=false
for i in $(seq 1 60); do
  if curl -sf "${GITEA_URL}/api/v1/version" &>/dev/null; then
    GITEA_READY=true
    break
  fi
  printf "."
  sleep 2
done
echo ""
if [[ "$GITEA_READY" != true ]]; then
  echo "ERROR: Gitea did not become healthy within 120 seconds"
  echo "[gitea] Container logs:"
  $COMPOSE_CMD -f "$COMPOSE_FILE" logs gitea-e2e --tail=30 2>/dev/null || true
  EXIT_CODE=1
  exit 1
fi
GITEA_VERSION=$(curl -sf "${GITEA_URL}/api/v1/version" | grep -o '"version":"[^"]*"' | cut -d'"' -f4)
echo "[gitea] ✓ Gitea is ready (version: ${GITEA_VERSION:-unknown})"
echo ""
# ─── Step 4: Start fake GitHub API ──────────────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 4: Start fake GitHub API server │"
echo "└──────────────────────────────────────────────────────────────┘"
PORT=$FAKE_GITHUB_PORT GIT_SERVER_URL="$GIT_SERVER_INTERNAL_URL" \
$TSX_CMD "$SCRIPT_DIR/fake-github-server.ts" &
FAKE_GITHUB_PID=$!
echo "$FAKE_GITHUB_PID" > "$SCRIPT_DIR/.fake-github.pid"
echo "[fake-github] Started (PID: $FAKE_GITHUB_PID)"
echo "[fake-github] Waiting for server to be ready..."
FAKE_READY=false
for i in $(seq 1 30); do
  if curl -sf "${FAKE_GITHUB_URL}/___mgmt/health" &>/dev/null; then
    FAKE_READY=true
    break
  fi
  # Check if the process died
  if ! kill -0 "$FAKE_GITHUB_PID" 2>/dev/null; then
    echo "ERROR: Fake GitHub server process died"
    EXIT_CODE=1
    exit 1
  fi
  printf "."
  sleep 1
done
echo ""
if [[ "$FAKE_READY" != true ]]; then
  echo "ERROR: Fake GitHub server did not start within 30 seconds"
  EXIT_CODE=1
  exit 1
fi
echo "[fake-github] ✓ Fake GitHub API is ready on $FAKE_GITHUB_URL"
# Tell the fake GitHub server to use the git-server container URL for clone_url
# (This updates existing repos in the store so Gitea can actually clone them)
echo "[fake-github] Setting clone URL base to $GIT_SERVER_INTERNAL_URL ..."
curl -sf -X POST "${FAKE_GITHUB_URL}/___mgmt/set-clone-url" \
  -H "Content-Type: application/json" \
  -d "{\"url\": \"${GIT_SERVER_INTERNAL_URL}\"}" || true
echo "[fake-github] ✓ Clone URLs configured"
echo ""
# ─── Step 5: Start gitea-mirror app ────────────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 5: Start gitea-mirror application │"
echo "└──────────────────────────────────────────────────────────────┘"
cd "$PROJECT_ROOT"
# Reinitialize the database in case build step reset it
$BUN_CMD run manage-db init 2>&1 | tail -2 || true
# Start the app with E2E environment
GITHUB_API_URL="$FAKE_GITHUB_URL" \
BETTER_AUTH_SECRET="e2e-test-secret" \
BETTER_AUTH_URL="$APP_URL" \
DATABASE_URL="file:data/gitea-mirror.db" \
HOST="0.0.0.0" \
PORT="$APP_PORT" \
NODE_ENV="production" \
PRE_SYNC_BACKUP_ENABLED="false" \
ENCRYPTION_SECRET="e2e-encryption-secret-32char!!" \
$BUN_CMD run start &
APP_PID=$!
echo "$APP_PID" > "$SCRIPT_DIR/.app.pid"
echo "[app] Started (PID: $APP_PID)"
echo "[app] Waiting for app to be ready..."
APP_READY=false
for i in $(seq 1 90); do
  # Try the health endpoint first, then fall back to root
  if curl -sf "${APP_URL}/api/health" &>/dev/null || \
     curl -sf -o /dev/null -w "%{http_code}" "${APP_URL}/" 2>/dev/null | grep -q "^[23]"; then
    APP_READY=true
    break
  fi
  # Check if the process died
  if ! kill -0 "$APP_PID" 2>/dev/null; then
    echo ""
    echo "ERROR: gitea-mirror app process died"
    EXIT_CODE=1
    exit 1
  fi
  printf "."
  sleep 2
done
echo ""
if [[ "$APP_READY" != true ]]; then
  echo "ERROR: gitea-mirror app did not start within 180 seconds"
  EXIT_CODE=1
  exit 1
fi
echo "[app] ✓ gitea-mirror app is ready on $APP_URL"
echo ""
# ─── Step 6: Run Playwright E2E tests ──────────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 6: Run Playwright E2E tests │"
echo "└──────────────────────────────────────────────────────────────┘"
cd "$PROJECT_ROOT"
# Ensure test-results directory exists
mkdir -p "$SCRIPT_DIR/test-results"
# Run Playwright
set +e
APP_URL="$APP_URL" \
GITEA_URL="$GITEA_URL" \
FAKE_GITHUB_URL="$FAKE_GITHUB_URL" \
npx playwright test \
  --config "$SCRIPT_DIR/playwright.config.ts" \
  --reporter=list
PLAYWRIGHT_EXIT=$?
set -e
echo ""
if [[ $PLAYWRIGHT_EXIT -eq 0 ]]; then
  echo "═══════════════════════════════════════════════════════════════"
  echo " ✅ E2E tests PASSED"
  echo "═══════════════════════════════════════════════════════════════"
else
  echo "═══════════════════════════════════════════════════════════════"
  echo " ❌ E2E tests FAILED (exit code: $PLAYWRIGHT_EXIT)"
  echo "═══════════════════════════════════════════════════════════════"

  # On failure, dump some diagnostic info
  echo ""
  echo "[diag] Gitea container status:"
  $COMPOSE_CMD -f "$COMPOSE_FILE" ps 2>/dev/null || true
  echo ""
  echo "[diag] Gitea container logs (last 20 lines):"
  $COMPOSE_CMD -f "$COMPOSE_FILE" logs gitea-e2e --tail=20 2>/dev/null || true
  echo ""
  echo "[diag] Git server logs (last 10 lines):"
  $COMPOSE_CMD -f "$COMPOSE_FILE" logs git-server --tail=10 2>/dev/null || true
  echo ""
  echo "[diag] Git server health:"
  curl -sf "${GIT_SERVER_URL}/manifest.json" 2>/dev/null || echo "(unreachable)"
  echo ""
  echo "[diag] Fake GitHub health:"
  curl -sf "${FAKE_GITHUB_URL}/___mgmt/health" 2>/dev/null || echo "(unreachable)"
  echo ""
  echo "[diag] App health:"
  curl -sf "${APP_URL}/api/health" 2>/dev/null || echo "(unreachable)"
  echo ""

  # Point to the HTML report
  if [[ -d "$SCRIPT_DIR/playwright-report" ]]; then
    echo "[diag] HTML report: $SCRIPT_DIR/playwright-report/index.html"
    echo "       Run: npx playwright show-report $SCRIPT_DIR/playwright-report"
  fi

  EXIT_CODE=$PLAYWRIGHT_EXIT
fi
# EXIT_CODE is used by the trap handler
exit $EXIT_CODE
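The curl readiness loops in Steps 3 through 5 all follow the same poll-until-healthy pattern. A TypeScript analogue (a sketch only; `waitForService` is a hypothetical helper, not part of this PR) shows how the spec files could reuse the same idea when they need to wait on a service themselves:

```typescript
// Hypothetical TypeScript analogue of the curl readiness loops in run-e2e.sh:
// poll `url` until it responds with a 2xx status or `timeoutMs` elapses.
async function waitForService(
  url: string,
  timeoutMs = 30_000,
  intervalMs = 1_000,
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    try {
      const res = await fetch(url);
      if (res.ok) return; // service answered with 2xx: ready
    } catch {
      // Connection refused / DNS not up yet: keep polling.
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Service at ${url} not ready within ${timeoutMs}ms`);
}
```

Like the shell loops, it treats connection errors as "not up yet" rather than fatal, and only fails once the overall deadline passes.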