feat: add /update skill for pulling upstream changes (#372)
Interactive skill that guides Claude through fetching upstream NanoClaw, previewing changes, merging with customizations, running migrations, and verifying the result.

Includes:
- SKILL.md with 9-step update flow
- fetch-upstream.sh: detects remote, fetches, extracts tracked paths
- run-migrations.ts: discovers and runs version-ordered migrations
- post-update.ts: clears backup after conflict resolution
- update-core.ts: adds --json and --preview-only flags
- BASE_INCLUDES moved to constants.ts as single source of truth
- 16 new tests covering fetch, migrations, and CLI flags

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
171  .claude/skills/update/SKILL.md  Normal file
@@ -0,0 +1,171 @@
---
name: update
description: "Update NanoClaw from upstream. Fetches latest changes, merges with your customizations and skills, runs migrations. Triggers on \"update\", \"pull upstream\", \"sync with upstream\", \"get latest changes\"."
---

# Update NanoClaw

Pull upstream changes and merge them with the user's installation, preserving skills and customizations. Scripts live in `.claude/skills/update/scripts/`.

**Principle:** Handle everything automatically. Only pause for user confirmation before applying changes, or when merge conflicts need human judgment.

**UX Note:** Use `AskUserQuestion` for all user-facing questions.
## 1. Pre-flight

Check that the skills system is initialized:

```bash
test -d .nanoclaw && echo "INITIALIZED" || echo "NOT_INITIALIZED"
```

**If NOT_INITIALIZED:** Initialize it first:

```bash
npx tsx -e "import { initNanoclawDir } from './skills-engine/init.js'; initNanoclawDir();"
```

Check for uncommitted git changes:

```bash
git status --porcelain
```

**If there are uncommitted changes:** Warn the user: "You have uncommitted changes. It's recommended to commit or stash them before updating. Continue anyway?" Use `AskUserQuestion` with options: "Continue anyway", "Abort (I'll commit first)". If they abort, stop here.
## 2. Fetch upstream

Run the fetch script:

```bash
./.claude/skills/update/scripts/fetch-upstream.sh
```

Parse the structured status block between `<<< STATUS` and `STATUS >>>` markers. Extract:

- `TEMP_DIR` — path to extracted upstream files
- `REMOTE` — which git remote was used
- `CURRENT_VERSION` — version from local `package.json`
- `NEW_VERSION` — version from upstream `package.json`
- `STATUS` — "success" or "error"

**If STATUS=error:** Show the error output and stop.

**If CURRENT_VERSION equals NEW_VERSION:** Tell the user they're already up to date. Ask if they want to force the update anyway (there may be non-version-bumped changes). If no, clean up the temp dir and stop.
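The fields above can be pulled out with standard shell tools if needed. A minimal sketch (the `status_value` helper is hypothetical, not one of the skill's scripts):

```shell
#!/usr/bin/env bash
# Hypothetical helper: extract one KEY=VALUE entry from the status block.
status_value() {
  local output="$1" key="$2"
  printf '%s\n' "$output" |
    sed -n '/^<<< STATUS$/,/^STATUS >>>$/p' |
    sed -n "s/^${key}=//p"
}

output=$'Fetching from upstream...\n<<< STATUS\nTEMP_DIR=/tmp/nanoclaw-update-abcd\nSTATUS=success\nSTATUS >>>'
status_value "$output" TEMP_DIR   # → /tmp/nanoclaw-update-abcd
status_value "$output" STATUS     # → success
```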
## 3. Preview

Run the preview to show what will change:

```bash
npx tsx scripts/update-core.ts --json --preview-only <TEMP_DIR>
```

This outputs JSON with: `currentVersion`, `newVersion`, `filesChanged`, `filesDeleted`, `conflictRisk`, `customPatchesAtRisk`.

Present to the user:

- "Updating from **{currentVersion}** to **{newVersion}**"
- "{N} files will be changed" — list them if <= 20, otherwise summarize
- If `conflictRisk` is non-empty: "These files have skill modifications and may conflict: {list}"
- If `customPatchesAtRisk` is non-empty: "These custom patches may need re-application: {list}"
- If `filesDeleted` is non-empty: "{N} files will be removed"
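For reference, a preview payload might look like this (all values illustrative):

```json
{
  "currentVersion": "1.0.0",
  "newVersion": "2.0.0",
  "filesChanged": ["src/index.ts", "package.json"],
  "filesDeleted": [],
  "conflictRisk": ["src/index.ts"],
  "customPatchesAtRisk": []
}
```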
## 4. Confirm

Use `AskUserQuestion`: "Apply this update?" with options:

- "Yes, apply update"
- "No, cancel"

If cancelled, clean up the temp dir (`rm -rf <TEMP_DIR>`) and stop.
## 5. Apply

Run the update:

```bash
npx tsx scripts/update-core.ts --json <TEMP_DIR>
```

Parse the JSON output. The result has: `success`, `previousVersion`, `newVersion`, `mergeConflicts`, `backupPending`, `customPatchFailures`, `skillReapplyResults`, `error`.

**If success=true with no issues:** Continue to step 7.

**If customPatchFailures exist:** Warn the user which custom patches failed to re-apply. These may need manual attention after the update.

**If skillReapplyResults has false entries:** Warn the user which skill tests failed after re-application.
## 6. Handle conflicts

**If backupPending=true:** There are unresolved merge conflicts.

For each file in `mergeConflicts`:

1. Read the file — it contains conflict markers (`<<<<<<<`, `=======`, `>>>>>>>`)
2. Check if there's an intent file for this path in any applied skill (e.g., `.claude/skills/<skill>/modify/<path>.intent.md`)
3. Use the intent file and your understanding of the codebase to resolve the conflict
4. Write the resolved file

After resolving all conflicts:

```bash
npx tsx scripts/post-update.ts
```

This clears the backup, confirming the resolution.

**If you cannot confidently resolve a conflict:** Show the user the conflicting sections and ask them to choose or provide guidance.
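For reference, a conflicted region looks something like this (labels and contents illustrative; the actual labels depend on how the merge produced them):

```
<<<<<<< local
const TIMEOUT = 60_000; // user customization
=======
const TIMEOUT = 30_000;
>>>>>>> upstream
```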
## 7. Run migrations

Run migrations between the old and new versions:

```bash
npx tsx scripts/run-migrations.ts <CURRENT_VERSION> <NEW_VERSION> <TEMP_DIR>
```

Parse the JSON output. It contains: `migrationsRun` (count), `results` (array of `{version, success, error?}`).

**If any migration fails:** Show the error to the user. The update itself is already applied — the migration failure needs manual attention.

**If no migrations found:** This is normal (most updates won't have migrations). Continue silently.
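The selection rule (strictly greater than the old version, at most the new version) can be sketched as follows; `compareSemver` here is a local stand-in for the one the real script imports from `skills-engine/state.js`:

```typescript
// Sketch: which migration versions run for a given update range.
// Lower bound is exclusive, upper bound inclusive.
function compareSemver(a: string, b: string): number {
  const pa = a.split('.').map(Number);
  const pb = b.split('.').map(Number);
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] - pb[i];
  }
  return 0;
}

function selectMigrations(versions: string[], from: string, to: string): string[] {
  return versions
    .filter((v) => compareSemver(v, from) > 0 && compareSemver(v, to) <= 0)
    .sort(compareSemver);
}

console.log(selectMigrations(['1.0.0', '1.1.0', '2.0.0', '2.1.0'], '1.0.0', '2.0.0'));
// → [ '1.1.0', '2.0.0' ]
```

So a migration tagged with the version being updated *from* never re-runs, while one tagged with the target version does.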
## 8. Verify

Run build and tests:

```bash
npm run build && npm test
```

**If build fails:** Show the error. Common causes:

- Type errors from merged files — read the error, fix the file, retry
- Missing dependencies — run `npm install` first, retry

**If tests fail:** Show which tests failed. Try to diagnose and fix. If you can't fix automatically, report to the user.

**If both pass:** Report success.
## 9. Cleanup

Remove the temp directory:

```bash
rm -rf <TEMP_DIR>
```

Report final status:

- "Updated from **{previousVersion}** to **{newVersion}**"
- Number of files changed
- Any warnings (failed custom patches, failed skill tests, migration issues)
- Build and test status
## Troubleshooting

**No upstream remote:** The fetch script auto-adds `upstream` pointing to `https://github.com/qwibitai/nanoclaw.git`. If the user forked from a different URL, they should set the remote manually: `git remote add upstream <url>`.

**Merge conflicts in many files:** Consider whether the user has heavily customized core files. Suggest using the skills system for modifications instead of direct edits, as skills survive updates better.

**Build fails after update:** Check if `package.json` dependencies changed. Run `npm install` to pick up new dependencies.

**Rollback:** If something goes wrong after applying but before cleanup, the backup is still in `.nanoclaw/backup/`. Run:

```bash
npx tsx -e "import { restoreBackup, clearBackup } from './skills-engine/backup.js'; restoreBackup(); clearBackup();"
```
84  .claude/skills/update/scripts/fetch-upstream.sh  Executable file
@@ -0,0 +1,84 @@
#!/usr/bin/env bash
set -euo pipefail

# Fetch upstream NanoClaw and extract to a temp directory.
# Outputs a structured status block for machine parsing.

SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../../../.." && pwd)"
cd "$PROJECT_ROOT"

# Determine the correct remote
REMOTE=""
if git remote get-url upstream &>/dev/null; then
  REMOTE="upstream"
elif git remote get-url origin &>/dev/null; then
  ORIGIN_URL=$(git remote get-url origin)
  if echo "$ORIGIN_URL" | grep -q "qwibitai/nanoclaw"; then
    REMOTE="origin"
  fi
fi

if [ -z "$REMOTE" ]; then
  echo "No upstream remote found. Adding upstream → https://github.com/qwibitai/nanoclaw.git"
  git remote add upstream https://github.com/qwibitai/nanoclaw.git
  REMOTE="upstream"
fi

echo "Fetching from $REMOTE..."
if ! git fetch "$REMOTE" main 2>&1; then
  echo "<<< STATUS"
  echo "STATUS=error"
  echo "ERROR=Failed to fetch from $REMOTE"
  echo "STATUS >>>"
  exit 1
fi

# Get current version from local package.json
CURRENT_VERSION="unknown"
if [ -f package.json ]; then
  CURRENT_VERSION=$(node -e "console.log(require('./package.json').version || 'unknown')")
fi

# Create temp dir and extract only the paths the skills engine tracks.
# Read BASE_INCLUDES from the single source of truth in skills-engine/constants.ts,
# plus always include migrations/ for the migration runner.
TEMP_DIR=$(mktemp -d /tmp/nanoclaw-update-XXXX)
trap 'rm -rf "$TEMP_DIR"' ERR
echo "Extracting $REMOTE/main to $TEMP_DIR..."

CANDIDATES=$(node -e "
const fs = require('fs');
const src = fs.readFileSync('skills-engine/constants.ts', 'utf-8');
const m = src.match(/BASE_INCLUDES\s*=\s*\[([^\]]+)\]/);
if (!m) { console.error('Cannot parse BASE_INCLUDES'); process.exit(1); }
const paths = m[1].match(/'([^']+)'/g).map(s => s.replace(/'/g, ''));
paths.push('migrations/');
console.log(paths.join(' '));
")

# Filter to paths that actually exist in the upstream tree.
# git archive errors if a path doesn't exist, so we check first.
PATHS=""
for candidate in $CANDIDATES; do
  if [ -n "$(git ls-tree --name-only "$REMOTE/main" "$candidate" 2>/dev/null)" ]; then
    PATHS="$PATHS $candidate"
  fi
done

git archive "$REMOTE/main" -- $PATHS | tar -x -C "$TEMP_DIR"

# Get new version from extracted package.json
NEW_VERSION="unknown"
if [ -f "$TEMP_DIR/package.json" ]; then
  NEW_VERSION=$(node -e "console.log(require('$TEMP_DIR/package.json').version || 'unknown')")
fi

echo ""
echo "<<< STATUS"
echo "TEMP_DIR=$TEMP_DIR"
echo "REMOTE=$REMOTE"
echo "CURRENT_VERSION=$CURRENT_VERSION"
echo "NEW_VERSION=$NEW_VERSION"
echo "STATUS=success"
echo "STATUS >>>"
5  scripts/post-update.ts  Normal file
@@ -0,0 +1,5 @@
#!/usr/bin/env tsx
import { clearBackup } from '../skills-engine/backup.js';

clearBackup();
console.log('Backup cleared.');
104  scripts/run-migrations.ts  Normal file
@@ -0,0 +1,104 @@
#!/usr/bin/env tsx
import { execFileSync, execSync } from 'child_process';
import fs from 'fs';
import path from 'path';

import { compareSemver } from '../skills-engine/state.js';

// Resolve tsx binary once to avoid npx race conditions across migrations
function resolveTsx(): string {
  // Check local node_modules first
  const local = path.resolve('node_modules/.bin/tsx');
  if (fs.existsSync(local)) return local;
  // Fall back to whichever tsx is in PATH
  try {
    return execSync('which tsx', { encoding: 'utf-8' }).trim();
  } catch {
    return 'npx'; // last resort
  }
}

const tsxBin = resolveTsx();

const fromVersion = process.argv[2];
const toVersion = process.argv[3];
const newCorePath = process.argv[4];

if (!fromVersion || !toVersion || !newCorePath) {
  console.error(
    'Usage: tsx scripts/run-migrations.ts <from-version> <to-version> <new-core-path>',
  );
  process.exit(1);
}

interface MigrationResult {
  version: string;
  success: boolean;
  error?: string;
}

const results: MigrationResult[] = [];

// Look for migrations in the new core
const migrationsDir = path.join(newCorePath, 'migrations');

if (!fs.existsSync(migrationsDir)) {
  console.log(
    JSON.stringify({ migrationsRun: 0, results: [] }, null, 2),
  );
  process.exit(0);
}

// Discover migration directories (version-named)
const entries = fs.readdirSync(migrationsDir, { withFileTypes: true });
const migrationVersions = entries
  .filter((e) => e.isDirectory() && /^\d+\.\d+\.\d+$/.test(e.name))
  .map((e) => e.name)
  .filter(
    (v) =>
      compareSemver(v, fromVersion) > 0 && compareSemver(v, toVersion) <= 0,
  )
  .sort(compareSemver);

const projectRoot = process.cwd();

for (const version of migrationVersions) {
  const migrationIndex = path.join(migrationsDir, version, 'index.ts');
  if (!fs.existsSync(migrationIndex)) {
    results.push({
      version,
      success: false,
      error: `Migration ${version}/index.ts not found`,
    });
    continue;
  }

  try {
    const tsxArgs = tsxBin.endsWith('npx')
      ? ['tsx', migrationIndex, projectRoot]
      : [migrationIndex, projectRoot];
    execFileSync(tsxBin, tsxArgs, {
      stdio: 'pipe',
      cwd: projectRoot,
      timeout: 120_000,
    });
    results.push({ version, success: true });
  } catch (err) {
    const message = err instanceof Error ? err.message : String(err);
    results.push({ version, success: false, error: message });
  }
}

console.log(
  JSON.stringify({ migrationsRun: results.length, results }, null, 2),
);

// Exit with error if any migration failed
if (results.some((r) => !r.success)) {
  process.exit(1);
}
scripts/update-core.ts
@@ -1,14 +1,27 @@
 #!/usr/bin/env tsx
 import { applyUpdate, previewUpdate } from '../skills-engine/update.js';
 
-const newCorePath = process.argv[2];
+const args = process.argv.slice(2);
+const jsonMode = args.includes('--json');
+const previewOnly = args.includes('--preview-only');
+const newCorePath = args.find((a) => !a.startsWith('--'));
+
 if (!newCorePath) {
-  console.error('Usage: tsx scripts/update-core.ts <path-to-new-core>');
+  console.error(
+    'Usage: tsx scripts/update-core.ts [--json] [--preview-only] <path-to-new-core>',
+  );
   process.exit(1);
 }
 
 // Preview
 const preview = previewUpdate(newCorePath);
+
+if (jsonMode && previewOnly) {
+  console.log(JSON.stringify(preview, null, 2));
+  process.exit(0);
+}
+
+function printPreview(): void {
 console.log('=== Update Preview ===');
 console.log(`Current version: ${preview.currentVersion}`);
 console.log(`New version: ${preview.newVersion}`);
@@ -22,13 +35,25 @@ if (preview.conflictRisk.length > 0) {
   console.log(`Conflict risk: ${preview.conflictRisk.join(', ')}`);
 }
 if (preview.customPatchesAtRisk.length > 0) {
-  console.log(`Custom patches at risk: ${preview.customPatchesAtRisk.join(', ')}`);
+  console.log(
+    `Custom patches at risk: ${preview.customPatchesAtRisk.join(', ')}`,
+  );
+}
 }
-console.log('');
-
-// Apply
+if (previewOnly) {
+  printPreview();
+  process.exit(0);
+}
+
+if (!jsonMode) {
+  printPreview();
+  console.log('');
 console.log('Applying update...');
+}
 
 const result = await applyUpdate(newCorePath);
 
 console.log(JSON.stringify(result, null, 2));
 
 if (!result.success) {
249  skills-engine/__tests__/fetch-upstream.test.ts  Normal file
@@ -0,0 +1,249 @@
import { execFileSync, execSync } from 'child_process';
import fs from 'fs';
import os from 'os';
import path from 'path';
import { afterEach, beforeEach, describe, expect, it } from 'vitest';

describe('fetch-upstream.sh', () => {
  let projectDir: string;
  let upstreamBareDir: string;
  const scriptPath = path.resolve(
    '.claude/skills/update/scripts/fetch-upstream.sh',
  );

  beforeEach(() => {
    // Create a bare repo to act as "upstream"
    upstreamBareDir = fs.mkdtempSync(
      path.join(os.tmpdir(), 'nanoclaw-upstream-'),
    );
    execSync('git init --bare', { cwd: upstreamBareDir, stdio: 'pipe' });

    // Create a working repo, add files, push to the bare repo
    const seedDir = fs.mkdtempSync(
      path.join(os.tmpdir(), 'nanoclaw-seed-'),
    );
    execSync('git init', { cwd: seedDir, stdio: 'pipe' });
    execSync('git config user.email "test@test.com"', {
      cwd: seedDir,
      stdio: 'pipe',
    });
    execSync('git config user.name "Test"', { cwd: seedDir, stdio: 'pipe' });
    fs.writeFileSync(
      path.join(seedDir, 'package.json'),
      JSON.stringify({ name: 'nanoclaw', version: '2.0.0' }),
    );
    fs.mkdirSync(path.join(seedDir, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(seedDir, 'src/index.ts'),
      'export const v = 2;',
    );
    execSync('git add -A && git commit -m "upstream v2.0.0"', {
      cwd: seedDir,
      stdio: 'pipe',
    });
    execSync(`git remote add origin ${upstreamBareDir}`, {
      cwd: seedDir,
      stdio: 'pipe',
    });
    execSync('git push origin main 2>/dev/null || git push origin master', {
      cwd: seedDir,
      stdio: 'pipe',
      shell: '/bin/bash',
    });

    // Rename the default branch to main in the bare repo if needed
    try {
      execSync('git symbolic-ref HEAD refs/heads/main', {
        cwd: upstreamBareDir,
        stdio: 'pipe',
      });
    } catch {
      // Already on main
    }

    fs.rmSync(seedDir, { recursive: true, force: true });

    // Create the "project" repo that will run the script
    projectDir = fs.mkdtempSync(
      path.join(os.tmpdir(), 'nanoclaw-project-'),
    );
    execSync('git init', { cwd: projectDir, stdio: 'pipe' });
    execSync('git config user.email "test@test.com"', {
      cwd: projectDir,
      stdio: 'pipe',
    });
    execSync('git config user.name "Test"', {
      cwd: projectDir,
      stdio: 'pipe',
    });
    fs.writeFileSync(
      path.join(projectDir, 'package.json'),
      JSON.stringify({ name: 'nanoclaw', version: '1.0.0' }),
    );
    execSync('git add -A && git commit -m "init"', {
      cwd: projectDir,
      stdio: 'pipe',
    });

    // Copy skills-engine/constants.ts so fetch-upstream.sh can read BASE_INCLUDES
    const constantsSrc = path.resolve('skills-engine/constants.ts');
    const constantsDest = path.join(projectDir, 'skills-engine/constants.ts');
    fs.mkdirSync(path.dirname(constantsDest), { recursive: true });
    fs.copyFileSync(constantsSrc, constantsDest);

    // Copy the script into the project so it can find PROJECT_ROOT
    const skillScriptsDir = path.join(
      projectDir,
      '.claude/skills/update/scripts',
    );
    fs.mkdirSync(skillScriptsDir, { recursive: true });
    fs.copyFileSync(scriptPath, path.join(skillScriptsDir, 'fetch-upstream.sh'));
    fs.chmodSync(path.join(skillScriptsDir, 'fetch-upstream.sh'), 0o755);
  });

  afterEach(() => {
    // Clean up temp dirs (also any TEMP_DIR created by the script)
    for (const dir of [projectDir, upstreamBareDir]) {
      if (dir && fs.existsSync(dir)) {
        fs.rmSync(dir, { recursive: true, force: true });
      }
    }
  });

  function runFetchUpstream(): { stdout: string; exitCode: number } {
    try {
      const stdout = execFileSync(
        'bash',
        ['.claude/skills/update/scripts/fetch-upstream.sh'],
        {
          cwd: projectDir,
          encoding: 'utf-8',
          stdio: 'pipe',
          timeout: 30_000,
        },
      );
      return { stdout, exitCode: 0 };
    } catch (err: any) {
      return {
        stdout: (err.stdout ?? '') + (err.stderr ?? ''),
        exitCode: err.status ?? 1,
      };
    }
  }

  function parseStatus(stdout: string): Record<string, string> {
    const match = stdout.match(/<<< STATUS\n([\s\S]*?)\nSTATUS >>>/);
    if (!match) return {};
    const lines = match[1].trim().split('\n');
    const result: Record<string, string> = {};
    for (const line of lines) {
      const eq = line.indexOf('=');
      if (eq > 0) {
        result[line.slice(0, eq)] = line.slice(eq + 1);
      }
    }
    return result;
  }

  it('uses existing upstream remote', () => {
    execSync(`git remote add upstream ${upstreamBareDir}`, {
      cwd: projectDir,
      stdio: 'pipe',
    });

    const { stdout, exitCode } = runFetchUpstream();
    const status = parseStatus(stdout);

    expect(exitCode).toBe(0);
    expect(status.STATUS).toBe('success');
    expect(status.REMOTE).toBe('upstream');
    expect(status.CURRENT_VERSION).toBe('1.0.0');
    expect(status.NEW_VERSION).toBe('2.0.0');
    expect(status.TEMP_DIR).toMatch(/^\/tmp\/nanoclaw-update-/);

    // Verify extracted files exist
    expect(
      fs.existsSync(path.join(status.TEMP_DIR, 'package.json')),
    ).toBe(true);
    expect(
      fs.existsSync(path.join(status.TEMP_DIR, 'src/index.ts')),
    ).toBe(true);

    // Cleanup temp dir
    fs.rmSync(status.TEMP_DIR, { recursive: true, force: true });
  });

  it('uses origin when it points to qwibitai/nanoclaw', () => {
    // Set origin to a URL containing qwibitai/nanoclaw
    execSync(
      `git remote add origin https://github.com/qwibitai/nanoclaw.git`,
      { cwd: projectDir, stdio: 'pipe' },
    );
    // We can't actually fetch from GitHub in tests, but we can verify
    // it picks the right remote. We'll add a second remote it CAN fetch from.
    execSync(`git remote add upstream ${upstreamBareDir}`, {
      cwd: projectDir,
      stdio: 'pipe',
    });

    const { stdout, exitCode } = runFetchUpstream();
    const status = parseStatus(stdout);

    // It should find 'upstream' first (checked before origin)
    expect(exitCode).toBe(0);
    expect(status.REMOTE).toBe('upstream');

    if (status.TEMP_DIR) {
      fs.rmSync(status.TEMP_DIR, { recursive: true, force: true });
    }
  });

  it('adds upstream remote when none exists', { timeout: 15_000 }, () => {
    // Remove origin if any
    try {
      execSync('git remote remove origin', {
        cwd: projectDir,
        stdio: 'pipe',
      });
    } catch {
      // No origin
    }

    const { stdout } = runFetchUpstream();

    // It will try to add upstream pointing to github (which will fail to fetch),
    // but we can verify it attempted to add the remote
    expect(stdout).toContain('Adding upstream');

    // Verify the remote was added
    const remotes = execSync('git remote -v', {
      cwd: projectDir,
      encoding: 'utf-8',
    });
    expect(remotes).toContain('upstream');
    expect(remotes).toContain('qwibitai/nanoclaw');
  });

  it('extracts files to temp dir correctly', () => {
    execSync(`git remote add upstream ${upstreamBareDir}`, {
      cwd: projectDir,
      stdio: 'pipe',
    });

    const { stdout, exitCode } = runFetchUpstream();
    const status = parseStatus(stdout);

    expect(exitCode).toBe(0);

    // Check file content matches what was pushed
    const pkg = JSON.parse(
      fs.readFileSync(path.join(status.TEMP_DIR, 'package.json'), 'utf-8'),
    );
    expect(pkg.version).toBe('2.0.0');

    const indexContent = fs.readFileSync(
      path.join(status.TEMP_DIR, 'src/index.ts'),
      'utf-8',
    );
    expect(indexContent).toBe('export const v = 2;');

    fs.rmSync(status.TEMP_DIR, { recursive: true, force: true });
  });
});
234  skills-engine/__tests__/run-migrations.test.ts  Normal file
@@ -0,0 +1,234 @@
import { execFileSync } from 'child_process';
import fs from 'fs';
import path from 'path';
import { afterEach, beforeEach, describe, expect, it } from 'vitest';

import { cleanup, createTempDir } from './test-helpers.js';

describe('run-migrations', () => {
  let tmpDir: string;
  let newCoreDir: string;
  const scriptPath = path.resolve('scripts/run-migrations.ts');
  const tsxBin = path.resolve('node_modules/.bin/tsx');

  beforeEach(() => {
    tmpDir = createTempDir();
    newCoreDir = path.join(tmpDir, 'new-core');
    fs.mkdirSync(newCoreDir, { recursive: true });
  });

  afterEach(() => {
    cleanup(tmpDir);
  });

  function createMigration(version: string, code: string): void {
    const migDir = path.join(newCoreDir, 'migrations', version);
    fs.mkdirSync(migDir, { recursive: true });
    fs.writeFileSync(path.join(migDir, 'index.ts'), code);
  }

  function runMigrations(
    from: string,
    to: string,
  ): { stdout: string; exitCode: number } {
    try {
      const stdout = execFileSync(
        tsxBin,
        [scriptPath, from, to, newCoreDir],
        { cwd: tmpDir, encoding: 'utf-8', stdio: 'pipe', timeout: 30_000 },
      );
      return { stdout, exitCode: 0 };
    } catch (err: any) {
      return { stdout: err.stdout ?? '', exitCode: err.status ?? 1 };
    }
  }

  it('outputs empty results when no migrations directory exists', () => {
    const { stdout, exitCode } = runMigrations('1.0.0', '2.0.0');
    const result = JSON.parse(stdout);

    expect(exitCode).toBe(0);
    expect(result.migrationsRun).toBe(0);
    expect(result.results).toEqual([]);
  });

  it('outputs empty results when migrations dir exists but is empty', () => {
    fs.mkdirSync(path.join(newCoreDir, 'migrations'), { recursive: true });

    const { stdout, exitCode } = runMigrations('1.0.0', '2.0.0');
    const result = JSON.parse(stdout);

    expect(exitCode).toBe(0);
    expect(result.migrationsRun).toBe(0);
  });

  it('runs migrations in the correct version range', () => {
    // Create a marker file when the migration runs
    createMigration(
      '1.1.0',
      `
import fs from 'fs';
import path from 'path';
const root = process.argv[2];
fs.writeFileSync(path.join(root, 'migrated-1.1.0'), 'done');
`,
    );
    createMigration(
      '1.2.0',
      `
import fs from 'fs';
import path from 'path';
const root = process.argv[2];
fs.writeFileSync(path.join(root, 'migrated-1.2.0'), 'done');
`,
    );
    // This one should NOT run (outside range)
    createMigration(
      '2.1.0',
      `
import fs from 'fs';
import path from 'path';
const root = process.argv[2];
fs.writeFileSync(path.join(root, 'migrated-2.1.0'), 'done');
`,
    );

    const { stdout, exitCode } = runMigrations('1.0.0', '2.0.0');
    const result = JSON.parse(stdout);

    expect(exitCode).toBe(0);
    expect(result.migrationsRun).toBe(2);
    expect(result.results[0].version).toBe('1.1.0');
    expect(result.results[0].success).toBe(true);
    expect(result.results[1].version).toBe('1.2.0');
    expect(result.results[1].success).toBe(true);

    // Verify the migrations actually ran
    expect(fs.existsSync(path.join(tmpDir, 'migrated-1.1.0'))).toBe(true);
    expect(fs.existsSync(path.join(tmpDir, 'migrated-1.2.0'))).toBe(true);
    // 2.1.0 is outside range
    expect(fs.existsSync(path.join(tmpDir, 'migrated-2.1.0'))).toBe(false);
  });

  it('excludes the from-version (only runs > from)', () => {
    createMigration(
      '1.0.0',
      `
import fs from 'fs';
import path from 'path';
const root = process.argv[2];
fs.writeFileSync(path.join(root, 'migrated-1.0.0'), 'done');
`,
    );
    createMigration(
      '1.1.0',
      `
import fs from 'fs';
import path from 'path';
const root = process.argv[2];
fs.writeFileSync(path.join(root, 'migrated-1.1.0'), 'done');
`,
    );

    const { stdout } = runMigrations('1.0.0', '1.1.0');
    const result = JSON.parse(stdout);
|
||||||
|
|
||||||
|
expect(result.migrationsRun).toBe(1);
|
||||||
|
expect(result.results[0].version).toBe('1.1.0');
|
||||||
|
// 1.0.0 should NOT have run
|
||||||
|
expect(fs.existsSync(path.join(tmpDir, 'migrated-1.0.0'))).toBe(false);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('includes the to-version (<= to)', () => {
|
||||||
|
createMigration(
|
||||||
|
'2.0.0',
|
||||||
|
`
|
||||||
|
import fs from 'fs';
|
||||||
|
import path from 'path';
|
||||||
|
const root = process.argv[2];
|
||||||
|
fs.writeFileSync(path.join(root, 'migrated-2.0.0'), 'done');
|
||||||
|
`,
|
||||||
|
);
|
||||||
|
|
||||||
|
const { stdout } = runMigrations('1.0.0', '2.0.0');
|
||||||
|
const result = JSON.parse(stdout);
|
||||||
|
|
||||||
|
expect(result.migrationsRun).toBe(1);
|
||||||
|
expect(result.results[0].version).toBe('2.0.0');
|
||||||
|
expect(result.results[0].success).toBe(true);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('runs migrations in semver ascending order', () => {
|
||||||
|
// Create them in non-sorted order
|
||||||
|
for (const v of ['1.3.0', '1.1.0', '1.2.0']) {
|
||||||
|
createMigration(
|
||||||
|
v,
|
||||||
|
`
|
||||||
|
import fs from 'fs';
|
||||||
|
import path from 'path';
|
||||||
|
const root = process.argv[2];
|
||||||
|
const log = path.join(root, 'migration-order.log');
|
||||||
|
const existing = fs.existsSync(log) ? fs.readFileSync(log, 'utf-8') : '';
|
||||||
|
fs.writeFileSync(log, existing + '${v}\\n');
|
||||||
|
`,
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
const { stdout } = runMigrations('1.0.0', '2.0.0');
|
||||||
|
const result = JSON.parse(stdout);
|
||||||
|
|
||||||
|
expect(result.migrationsRun).toBe(3);
|
||||||
|
expect(result.results.map((r: any) => r.version)).toEqual([
|
||||||
|
'1.1.0',
|
||||||
|
'1.2.0',
|
||||||
|
'1.3.0',
|
||||||
|
]);
|
||||||
|
|
||||||
|
// Verify execution order from the log file
|
||||||
|
const log = fs.readFileSync(
|
||||||
|
path.join(tmpDir, 'migration-order.log'),
|
||||||
|
'utf-8',
|
||||||
|
);
|
||||||
|
expect(log.trim()).toBe('1.1.0\n1.2.0\n1.3.0');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('reports failure and exits non-zero when a migration throws', () => {
|
||||||
|
createMigration(
|
||||||
|
'1.1.0',
|
||||||
|
`throw new Error('migration failed intentionally');`,
|
||||||
|
);
|
||||||
|
|
||||||
|
const { stdout, exitCode } = runMigrations('1.0.0', '2.0.0');
|
||||||
|
const result = JSON.parse(stdout);
|
||||||
|
|
||||||
|
expect(exitCode).toBe(1);
|
||||||
|
expect(result.migrationsRun).toBe(1);
|
||||||
|
expect(result.results[0].success).toBe(false);
|
||||||
|
expect(result.results[0].error).toBeDefined();
|
||||||
|
});
|
||||||
|
|
||||||
|
it('ignores non-semver directories in migrations/', () => {
|
||||||
|
fs.mkdirSync(path.join(newCoreDir, 'migrations', 'README'), {
|
||||||
|
recursive: true,
|
||||||
|
});
|
||||||
|
fs.mkdirSync(path.join(newCoreDir, 'migrations', 'utils'), {
|
||||||
|
recursive: true,
|
||||||
|
});
|
||||||
|
createMigration(
|
||||||
|
'1.1.0',
|
||||||
|
`
|
||||||
|
import fs from 'fs';
|
||||||
|
import path from 'path';
|
||||||
|
const root = process.argv[2];
|
||||||
|
fs.writeFileSync(path.join(root, 'migrated-1.1.0'), 'done');
|
||||||
|
`,
|
||||||
|
);
|
||||||
|
|
||||||
|
const { stdout, exitCode } = runMigrations('1.0.0', '2.0.0');
|
||||||
|
const result = JSON.parse(stdout);
|
||||||
|
|
||||||
|
expect(exitCode).toBe(0);
|
||||||
|
expect(result.migrationsRun).toBe(1);
|
||||||
|
expect(result.results[0].version).toBe('1.1.0');
|
||||||
|
});
|
||||||
|
});
|
||||||
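For reference, the migration scripts these tests generate all follow one shape: a standalone `migrations/<semver>/index.ts` that receives the project root as its first CLI argument. A minimal sketch of that contract (the `migrate` function name and `migrated-marker` filename are illustrative, not part of the skill):

```typescript
// Hypothetical migration entry point, mirroring the test fixtures above:
// the runner invokes migrations/<semver>/index.ts with the project root
// as argv[2], and the script does its work against that root.
import fs from 'fs';
import os from 'os';
import path from 'path';

function migrate(root: string): void {
  // Example migration step: drop a marker file at the project root.
  fs.writeFileSync(path.join(root, 'migrated-marker'), 'done');
}

// Use the CLI argument when provided; fall back to a temp dir for a dry run.
const root = process.argv[2] ?? fs.mkdtempSync(path.join(os.tmpdir(), 'mig-'));
migrate(root);
console.log(fs.existsSync(path.join(root, 'migrated-marker')));
```

A failing migration simply throws; the runner catches the non-zero exit and reports it, as the failure test above exercises.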
131
skills-engine/__tests__/update-core-cli.test.ts
Normal file
@@ -0,0 +1,131 @@
import { execFileSync } from 'child_process';
import fs from 'fs';
import path from 'path';
import { afterEach, beforeEach, describe, expect, it } from 'vitest';
import { stringify } from 'yaml';

import { cleanup, createTempDir, initGitRepo, setupNanoclawDir } from './test-helpers.js';

describe('update-core.ts CLI flags', () => {
  let tmpDir: string;
  const scriptPath = path.resolve('scripts/update-core.ts');
  const tsxBin = path.resolve('node_modules/.bin/tsx');

  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    initGitRepo(tmpDir);

    // Write state file
    const statePath = path.join(tmpDir, '.nanoclaw', 'state.yaml');
    fs.writeFileSync(
      statePath,
      stringify({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      }),
    );
  });

  afterEach(() => {
    cleanup(tmpDir);
  });

  function createNewCore(files: Record<string, string>): string {
    const dir = path.join(tmpDir, 'new-core');
    fs.mkdirSync(dir, { recursive: true });
    for (const [relPath, content] of Object.entries(files)) {
      const fullPath = path.join(dir, relPath);
      fs.mkdirSync(path.dirname(fullPath), { recursive: true });
      fs.writeFileSync(fullPath, content);
    }
    return dir;
  }

  it('--json --preview-only outputs JSON preview without applying', () => {
    const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
    fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
    fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'original');

    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(path.join(tmpDir, 'src/index.ts'), 'original');

    const newCoreDir = createNewCore({
      'src/index.ts': 'updated',
      'package.json': JSON.stringify({ version: '2.0.0' }),
    });

    const stdout = execFileSync(
      tsxBin,
      [scriptPath, '--json', '--preview-only', newCoreDir],
      { cwd: tmpDir, encoding: 'utf-8', stdio: 'pipe', timeout: 30_000 },
    );

    const preview = JSON.parse(stdout);

    expect(preview.currentVersion).toBe('1.0.0');
    expect(preview.newVersion).toBe('2.0.0');
    expect(preview.filesChanged).toContain('src/index.ts');

    // File should NOT have been modified (preview only)
    expect(fs.readFileSync(path.join(tmpDir, 'src/index.ts'), 'utf-8')).toBe(
      'original',
    );
  });

  it('--preview-only without --json outputs human-readable text', () => {
    const newCoreDir = createNewCore({
      'src/new-file.ts': 'export const x = 1;',
      'package.json': JSON.stringify({ version: '2.0.0' }),
    });

    const stdout = execFileSync(
      tsxBin,
      [scriptPath, '--preview-only', newCoreDir],
      { cwd: tmpDir, encoding: 'utf-8', stdio: 'pipe', timeout: 30_000 },
    );

    expect(stdout).toContain('Update Preview');
    expect(stdout).toContain('2.0.0');
    // Should NOT contain JSON (it's human-readable mode)
    expect(stdout).not.toContain('"currentVersion"');
  });

  it('--json applies and outputs JSON result', () => {
    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(path.join(tmpDir, 'src/index.ts'), 'original');

    const newCoreDir = createNewCore({
      'src/index.ts': 'original',
      'package.json': JSON.stringify({ version: '2.0.0' }),
    });

    const stdout = execFileSync(
      tsxBin,
      [scriptPath, '--json', newCoreDir],
      { cwd: tmpDir, encoding: 'utf-8', stdio: 'pipe', timeout: 30_000 },
    );

    const result = JSON.parse(stdout);

    expect(result.success).toBe(true);
    expect(result.previousVersion).toBe('1.0.0');
    expect(result.newVersion).toBe('2.0.0');
  });

  it('exits with error when no path provided', () => {
    try {
      execFileSync(tsxBin, [scriptPath], {
        cwd: tmpDir,
        encoding: 'utf-8',
        stdio: 'pipe',
        timeout: 30_000,
      });
      expect.unreachable('Should have exited with error');
    } catch (err: any) {
      expect(err.status).toBe(1);
      expect(err.stderr).toContain('Usage');
    }
  });
});
@@ -7,3 +7,7 @@ export const CUSTOM_DIR = '.nanoclaw/custom';
 export const RESOLUTIONS_DIR = '.nanoclaw/resolutions';
 export const SHIPPED_RESOLUTIONS_DIR = '.claude/resolutions';
 export const SKILLS_SCHEMA_VERSION = '0.1.0';
+
+// Top-level paths to include in base snapshot and upstream extraction.
+// Add new entries here when new root-level directories/files need tracking.
+export const BASE_INCLUDES = ['src/', 'package.json', '.env.example', 'container/'];
@@ -2,14 +2,11 @@ import { execSync } from 'child_process';
 import fs from 'fs';
 import path from 'path';
 
-import { BACKUP_DIR, BASE_DIR, NANOCLAW_DIR } from './constants.js';
+import { BACKUP_DIR, BASE_DIR, BASE_INCLUDES, NANOCLAW_DIR } from './constants.js';
 import { isGitRepo } from './merge.js';
 import { writeState } from './state.js';
 import { SkillState } from './types.js';
 
-// Top-level paths to include in base snapshot
-const BASE_INCLUDES = ['src/', 'package.json', '.env.example', 'container/'];
-
 // Directories/files to always exclude from base snapshot
 const BASE_EXCLUDES = [
   'node_modules',