diff --git a/.claude/settings.json b/.claude/settings.json new file mode 100644 index 0000000..0967ef4 --- /dev/null +++ b/.claude/settings.json @@ -0,0 +1 @@ +{} diff --git a/.claude/skills/add-compact/SKILL.md b/.claude/skills/add-compact/SKILL.md new file mode 100644 index 0000000..0c46165 --- /dev/null +++ b/.claude/skills/add-compact/SKILL.md @@ -0,0 +1,135 @@ +--- +name: add-compact +description: Add /compact command for manual context compaction. Solves context rot in long sessions by forwarding the SDK's built-in /compact slash command. Main-group or trusted sender only. +--- + +# Add /compact Command + +Adds a `/compact` session command that compacts conversation history to fight context rot in long-running sessions. Uses the Claude Agent SDK's built-in `/compact` slash command — no synthetic system prompts. + +**Session contract:** `/compact` keeps the same logical session alive. The SDK returns a new session ID after compaction (via the `init` system message), which the agent-runner forwards to the orchestrator as `newSessionId`. No destructive reset occurs — the agent retains summarized context. + +## Phase 1: Pre-flight + +Check if `src/session-commands.ts` exists: + +```bash +test -f src/session-commands.ts && echo "Already applied" || echo "Not applied" +``` + +If already applied, skip to Phase 3 (Verify). + +## Phase 2: Apply Code Changes + +Merge the skill branch: + +```bash +git fetch upstream skill/compact +git merge upstream/skill/compact +``` + +> **Note:** `upstream` is the remote pointing to `qwibitai/nanoclaw`. If using a different remote name, substitute accordingly. 
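The command plumbing that `src/session-commands.ts` provides can be pictured with a small sketch. This is a hypothetical illustration only: `extractSessionCommand`, `isSessionCommandAuthorized`, and the message shape are assumptions for exposition, not the actual contents of the merged file.

```typescript
// Hypothetical sketch of session-command extraction and authorization.
// The real src/session-commands.ts may differ; names here are illustrative.
interface InboundMessage {
  content: string;
  is_from_me: boolean; // true when sent by the device owner
}

type SessionCommand = { kind: 'compact' };

// Extract a session command, tolerating a trigger prefix like "@Andy /compact".
function extractSessionCommand(
  content: string,
  trigger: string,
): SessionCommand | null {
  const stripped = content.startsWith(trigger)
    ? content.slice(trigger.length).trim()
    : content.trim();
  return stripped === '/compact' ? { kind: 'compact' } : null;
}

// Main-group senders are trusted; elsewhere only the device owner may compact.
function isSessionCommandAuthorized(
  msg: InboundMessage,
  isMainGroup: boolean,
): boolean {
  return isMainGroup || msg.is_from_me;
}
```

The point of the split is that extraction is pure string handling (testable without a container), while authorization is a one-line policy check applied before any container is spawned.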
+ +This adds: +- `src/session-commands.ts` (extract and authorize session commands) +- `src/session-commands.test.ts` (unit tests for command parsing and auth) +- Session command interception in `src/index.ts` (both `processGroupMessages` and `startMessageLoop`) +- Slash command handling in `container/agent-runner/src/index.ts` + +### Validate + +```bash +npm test +npm run build +``` + +### Rebuild container + +```bash +./container/build.sh +``` + +### Restart service + +```bash +launchctl kickstart -k gui/$(id -u)/com.nanoclaw # macOS +# Linux: systemctl --user restart nanoclaw +``` + +## Phase 3: Verify + +### Integration Test + +1. Start NanoClaw in dev mode: `npm run dev` +2. From the **main group** (self-chat), send exactly: `/compact` +3. Verify: + - The agent acknowledges compaction (e.g., "Conversation compacted.") + - The session continues — send a follow-up message and verify the agent responds coherently + - A conversation archive is written to `groups/{folder}/conversations/` (by the PreCompact hook) + - Container logs show `Compact boundary observed` (confirms SDK actually compacted) + - If `compact_boundary` was NOT observed, the response says "compact_boundary was not observed" +4. From a **non-main group** as a non-admin user, send: `@ /compact` +5. Verify: + - The bot responds with "Session commands require admin access." + - No compaction occurs, no container is spawned for the command +6. From a **non-main group** as the admin (device owner / `is_from_me`), send: `@ /compact` +7. Verify: + - Compaction proceeds normally (same behavior as main group) +8. While an **active container** is running for the main group, send `/compact` +9. Verify: + - The active container is signaled to close (authorized senders only — untrusted senders cannot kill in-flight work) + - Compaction proceeds via a new container once the active one exits + - The command is not dropped (no cursor race) +10. 
Send a normal message, then `/compact`, then another normal message in quick succession (same polling batch): +11. Verify: + - Pre-compact messages are sent to the agent first (check container logs for two `runAgent` calls) + - Compaction proceeds after pre-compact messages are processed + - Messages **after** `/compact` in the batch are preserved (cursor advances to `/compact`'s timestamp only) and processed on the next poll cycle +12. From a **non-main group** as a non-admin user, send `@ /compact`: +13. Verify: + - Denial message is sent ("Session commands require admin access.") + - The `/compact` is consumed (cursor advanced) — it does NOT replay on future polls + - Other messages in the same batch are also consumed (cursor is a high-water mark — this is an accepted tradeoff for the narrow edge case of denied `/compact` + other messages in the same polling interval) + - No container is killed or interrupted +14. From a **non-main group** (with `requiresTrigger` enabled) as a non-admin user, send bare `/compact` (no trigger prefix): +15. Verify: + - No denial message is sent (trigger policy prevents untrusted bot responses) + - The `/compact` is consumed silently + - Note: in groups where `requiresTrigger` is `false`, a denial message IS sent because the sender is considered reachable +16. 
After compaction, verify **no auto-compaction** behavior — only manual `/compact` triggers it.

### Validation on Fresh Clone

```bash
git clone https://github.com/qwibitai/nanoclaw.git /tmp/nanoclaw-test
cd /tmp/nanoclaw-test
claude # then run /add-compact
npm run build
npm test
./container/build.sh
# Manual: send /compact from main group, verify compaction + continuation
# Manual: send @ /compact from non-main as non-admin, verify denial
# Manual: send @ /compact from non-main as admin, verify allowed
# Manual: verify no auto-compaction behavior
```

## Security Constraints

- **Main-group or trusted/admin sender only.** The main group is the user's private self-chat and is trusted (see `docs/SECURITY.md`). Non-main groups are untrusted — a careless or malicious user could wipe the agent's short-term memory. However, the device owner (`is_from_me`) is always trusted and can compact from any group.
- **No auto-compaction.** This skill implements manual compaction only. Automatic threshold-based compaction is a separate concern and should be a separate skill.
- **No config file.** NanoClaw's philosophy is customization through code changes, not configuration sprawl.
- **Transcript archived before compaction.** The existing `PreCompact` hook in the agent-runner archives the full transcript to `conversations/` before the SDK compacts it.
- **Session continues after compaction.** This is not a destructive reset. The conversation continues with summarized context.

## What This Does NOT Do

- No automatic compaction threshold (add separately if desired)
- No `/clear` command (separate skill, separate semantics — `/clear` is a destructive reset)
- No cross-group compaction (each group's session is isolated)
- No changes to the container image, Dockerfile, or build script

## Troubleshooting

- **"Session commands require admin access"**: Only the device owner (`is_from_me`) or main-group senders can use `/compact`. Other users are denied.
+- **No compact_boundary in logs**: The SDK may not emit this event in all versions. Check the agent-runner logs for the warning message. Compaction may still have succeeded. +- **Pre-compact failure**: If messages before `/compact` fail to process, the error message says "Failed to process messages before /compact." The cursor advances past sent output to prevent duplicates; `/compact` remains pending for the next attempt. diff --git a/.claude/skills/add-discord/SKILL.md b/.claude/skills/add-discord/SKILL.md index b73e5ad..e46bd3e 100644 --- a/.claude/skills/add-discord/SKILL.md +++ b/.claude/skills/add-discord/SKILL.md @@ -1,64 +1,66 @@ +--- +name: add-discord +description: Add Discord bot channel integration to NanoClaw. +--- + # Add Discord Channel -This skill adds Discord support to NanoClaw using the skills engine for deterministic code changes, then walks through interactive setup. +This skill adds Discord support to NanoClaw, then walks through interactive setup. ## Phase 1: Pre-flight ### Check if already applied -Read `.nanoclaw/state.yaml`. If `discord` is in `applied_skills`, skip to Phase 3 (Setup). The code changes are already in place. +Check if `src/channels/discord.ts` exists. If it does, skip to Phase 3 (Setup). The code changes are already in place. ### Ask the user Use `AskUserQuestion` to collect configuration: -AskUserQuestion: Should Discord replace WhatsApp or run alongside it? -- **Replace WhatsApp** - Discord will be the only channel (sets DISCORD_ONLY=true) -- **Alongside** - Both Discord and WhatsApp channels active - AskUserQuestion: Do you have a Discord bot token, or do you need to create one? If they have one, collect it now. If not, we'll create one in Phase 3. ## Phase 2: Apply Code Changes -Run the skills engine to apply this skill's code package. The package files are in this directory alongside this SKILL.md. 
- -### Initialize skills system (if needed) - -If `.nanoclaw/` directory doesn't exist yet: +### Ensure channel remote ```bash -npx tsx scripts/apply-skill.ts --init +git remote -v ``` -Or call `initSkillsSystem()` from `skills-engine/migrate.ts`. - -### Apply the skill +If `discord` is missing, add it: ```bash -npx tsx scripts/apply-skill.ts .claude/skills/add-discord +git remote add discord https://github.com/qwibitai/nanoclaw-discord.git ``` -This deterministically: -- Adds `src/channels/discord.ts` (DiscordChannel class implementing Channel interface) -- Adds `src/channels/discord.test.ts` (unit tests with discord.js mock) -- Three-way merges Discord support into `src/index.ts` (multi-channel support, findChannel routing) -- Three-way merges Discord config into `src/config.ts` (DISCORD_BOT_TOKEN, DISCORD_ONLY exports) -- Three-way merges updated routing tests into `src/routing.test.ts` -- Installs the `discord.js` npm dependency -- Updates `.env.example` with `DISCORD_BOT_TOKEN` and `DISCORD_ONLY` -- Records the application in `.nanoclaw/state.yaml` +### Merge the skill branch -If the apply reports merge conflicts, read the intent files: -- `modify/src/index.ts.intent.md` — what changed and invariants for index.ts -- `modify/src/config.ts.intent.md` — what changed for config.ts +```bash +git fetch discord main +git merge discord/main || { + git checkout --theirs package-lock.json + git add package-lock.json + git merge --continue +} +``` + +This merges in: +- `src/channels/discord.ts` (DiscordChannel class with self-registration via `registerChannel`) +- `src/channels/discord.test.ts` (unit tests with discord.js mock) +- `import './discord.js'` appended to the channel barrel file `src/channels/index.ts` +- `discord.js` npm dependency in `package.json` +- `DISCORD_BOT_TOKEN` in `.env.example` + +If the merge reports conflicts, resolve them by reading the conflicted files and understanding the intent of both sides. 
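The self-registration mentioned above (each channel module registering itself when the barrel file imports it) can be sketched roughly as follows. This is an illustrative sketch, not NanoClaw's actual code; `registerChannel`, `findChannel`, and the minimal `Channel` shape are assumptions.

```typescript
// Illustrative sketch of barrel-file channel self-registration.
// Assumed names: registerChannel, findChannel; the real API may differ.
interface Channel {
  name: string;
  ownsJid(jid: string): boolean;
}

const channels: Channel[] = [];

// Each channel module calls this at import time, so adding a channel only
// requires appending one import line to src/channels/index.ts.
function registerChannel(channel: Channel): void {
  channels.push(channel);
}

// Routing picks the channel that claims the JID prefix.
function findChannel(jid: string): Channel | undefined {
  return channels.find((c) => c.ownsJid(jid));
}

// Roughly what `import './discord.js'` would trigger as a side effect:
registerChannel({ name: 'discord', ownsJid: (jid) => jid.startsWith('dc:') });
```

Under this shape, a `dc:`-prefixed JID resolves to the Discord channel while a WhatsApp JID like `12345@g.us` falls through to whatever other channels have registered.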
### Validate code changes ```bash -npm test +npm install npm run build +npx vitest run src/channels/discord.test.ts ``` All tests must pass (including the new Discord tests) and build must be clean before proceeding. @@ -93,16 +95,12 @@ Add to `.env`: DISCORD_BOT_TOKEN= ``` -If they chose to replace WhatsApp: - -```bash -DISCORD_ONLY=true -``` +Channels auto-enable when their credentials are present — no extra configuration needed. Sync to container environment: ```bash -cp .env data/env/env +mkdir -p data/env && cp .env data/env/env ``` The container reads environment from `data/env/env`, not `.env` directly. @@ -132,30 +130,18 @@ Wait for the user to provide the channel ID (format: `dc:1234567890123456`). ### Register the channel -Use the IPC register flow or register directly. The channel ID, name, and folder name are needed. +The channel ID, name, and folder name are needed. Use `npx tsx setup/index.ts --step register` with the appropriate flags. -For a main channel (responds to all messages, uses the `main` folder): +For a main channel (responds to all messages): -```typescript -registerGroup("dc:", { - name: " #", - folder: "main", - trigger: `@${ASSISTANT_NAME}`, - added_at: new Date().toISOString(), - requiresTrigger: false, -}); +```bash +npx tsx setup/index.ts --step register -- --jid "dc:" --name " #" --folder "discord_main" --trigger "@${ASSISTANT_NAME}" --channel discord --no-trigger-required --is-main ``` For additional channels (trigger-only): -```typescript -registerGroup("dc:", { - name: " #", - folder: "", - trigger: `@${ASSISTANT_NAME}`, - added_at: new Date().toISOString(), - requiresTrigger: true, -}); +```bash +npx tsx setup/index.ts --step register -- --jid "dc:" --name " #" --folder "discord_" --trigger "@${ASSISTANT_NAME}" --channel discord ``` ## Phase 5: Verify diff --git a/.claude/skills/add-discord/add/src/channels/discord.test.ts b/.claude/skills/add-discord/add/src/channels/discord.test.ts deleted file mode 100644 index 
eff0b77..0000000 --- a/.claude/skills/add-discord/add/src/channels/discord.test.ts +++ /dev/null @@ -1,762 +0,0 @@ -import { describe, it, expect, beforeEach, vi, afterEach } from 'vitest'; - -// --- Mocks --- - -// Mock config -vi.mock('../config.js', () => ({ - ASSISTANT_NAME: 'Andy', - TRIGGER_PATTERN: /^@Andy\b/i, -})); - -// Mock logger -vi.mock('../logger.js', () => ({ - logger: { - debug: vi.fn(), - info: vi.fn(), - warn: vi.fn(), - error: vi.fn(), - }, -})); - -// --- discord.js mock --- - -type Handler = (...args: any[]) => any; - -const clientRef = vi.hoisted(() => ({ current: null as any })); - -vi.mock('discord.js', () => { - const Events = { - MessageCreate: 'messageCreate', - ClientReady: 'ready', - Error: 'error', - }; - - const GatewayIntentBits = { - Guilds: 1, - GuildMessages: 2, - MessageContent: 4, - DirectMessages: 8, - }; - - class MockClient { - eventHandlers = new Map(); - user: any = { id: '999888777', tag: 'Andy#1234' }; - private _ready = false; - - constructor(_opts: any) { - clientRef.current = this; - } - - on(event: string, handler: Handler) { - const existing = this.eventHandlers.get(event) || []; - existing.push(handler); - this.eventHandlers.set(event, existing); - return this; - } - - once(event: string, handler: Handler) { - return this.on(event, handler); - } - - async login(_token: string) { - this._ready = true; - // Fire the ready event - const readyHandlers = this.eventHandlers.get('ready') || []; - for (const h of readyHandlers) { - h({ user: this.user }); - } - } - - isReady() { - return this._ready; - } - - channels = { - fetch: vi.fn().mockResolvedValue({ - send: vi.fn().mockResolvedValue(undefined), - sendTyping: vi.fn().mockResolvedValue(undefined), - }), - }; - - destroy() { - this._ready = false; - } - } - - // Mock TextChannel type - class TextChannel {} - - return { - Client: MockClient, - Events, - GatewayIntentBits, - TextChannel, - }; -}); - -import { DiscordChannel, DiscordChannelOpts } from './discord.js'; - 
-// --- Test helpers --- - -function createTestOpts( - overrides?: Partial, -): DiscordChannelOpts { - return { - onMessage: vi.fn(), - onChatMetadata: vi.fn(), - registeredGroups: vi.fn(() => ({ - 'dc:1234567890123456': { - name: 'Test Server #general', - folder: 'test-server', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - })), - ...overrides, - }; -} - -function createMessage(overrides: { - channelId?: string; - content?: string; - authorId?: string; - authorUsername?: string; - authorDisplayName?: string; - memberDisplayName?: string; - isBot?: boolean; - guildName?: string; - channelName?: string; - messageId?: string; - createdAt?: Date; - attachments?: Map; - reference?: { messageId?: string }; - mentionsBotId?: boolean; -}) { - const channelId = overrides.channelId ?? '1234567890123456'; - const authorId = overrides.authorId ?? '55512345'; - const botId = '999888777'; // matches mock client user id - - const mentionsMap = new Map(); - if (overrides.mentionsBotId) { - mentionsMap.set(botId, { id: botId }); - } - - return { - channelId, - id: overrides.messageId ?? 'msg_001', - content: overrides.content ?? 'Hello everyone', - createdAt: overrides.createdAt ?? new Date('2024-01-01T00:00:00.000Z'), - author: { - id: authorId, - username: overrides.authorUsername ?? 'alice', - displayName: overrides.authorDisplayName ?? 'Alice', - bot: overrides.isBot ?? false, - }, - member: overrides.memberDisplayName - ? { displayName: overrides.memberDisplayName } - : null, - guild: overrides.guildName - ? { name: overrides.guildName } - : null, - channel: { - name: overrides.channelName ?? 'general', - messages: { - fetch: vi.fn().mockResolvedValue({ - author: { username: 'Bob', displayName: 'Bob' }, - member: { displayName: 'Bob' }, - }), - }, - }, - mentions: { - users: mentionsMap, - }, - attachments: overrides.attachments ?? new Map(), - reference: overrides.reference ?? 
null, - }; -} - -function currentClient() { - return clientRef.current; -} - -async function triggerMessage(message: any) { - const handlers = currentClient().eventHandlers.get('messageCreate') || []; - for (const h of handlers) await h(message); -} - -// --- Tests --- - -describe('DiscordChannel', () => { - beforeEach(() => { - vi.clearAllMocks(); - }); - - afterEach(() => { - vi.restoreAllMocks(); - }); - - // --- Connection lifecycle --- - - describe('connection lifecycle', () => { - it('resolves connect() when client is ready', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - - await channel.connect(); - - expect(channel.isConnected()).toBe(true); - }); - - it('registers message handlers on connect', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - - await channel.connect(); - - expect(currentClient().eventHandlers.has('messageCreate')).toBe(true); - expect(currentClient().eventHandlers.has('error')).toBe(true); - expect(currentClient().eventHandlers.has('ready')).toBe(true); - }); - - it('disconnects cleanly', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - - await channel.connect(); - expect(channel.isConnected()).toBe(true); - - await channel.disconnect(); - expect(channel.isConnected()).toBe(false); - }); - - it('isConnected() returns false before connect', () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - - expect(channel.isConnected()).toBe(false); - }); - }); - - // --- Text message handling --- - - describe('text message handling', () => { - it('delivers message for registered channel', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ - content: 'Hello everyone', - guildName: 'Test Server', - channelName: 'general', - 
}); - await triggerMessage(msg); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.any(String), - 'Test Server #general', - ); - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ - id: 'msg_001', - chat_jid: 'dc:1234567890123456', - sender: '55512345', - sender_name: 'Alice', - content: 'Hello everyone', - is_from_me: false, - }), - ); - }); - - it('only emits metadata for unregistered channels', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ - channelId: '9999999999999999', - content: 'Unknown channel', - guildName: 'Other Server', - }); - await triggerMessage(msg); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'dc:9999999999999999', - expect.any(String), - expect.any(String), - ); - expect(opts.onMessage).not.toHaveBeenCalled(); - }); - - it('ignores bot messages', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ isBot: true, content: 'I am a bot' }); - await triggerMessage(msg); - - expect(opts.onMessage).not.toHaveBeenCalled(); - expect(opts.onChatMetadata).not.toHaveBeenCalled(); - }); - - it('uses member displayName when available (server nickname)', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ - content: 'Hi', - memberDisplayName: 'Alice Nickname', - authorDisplayName: 'Alice Global', - guildName: 'Server', - }); - await triggerMessage(msg); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ sender_name: 'Alice Nickname' }), - ); - }); - - it('falls back to author displayName when no member', async () => { - const opts = createTestOpts(); - const channel = new 
DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ - content: 'Hi', - memberDisplayName: undefined, - authorDisplayName: 'Alice Global', - guildName: 'Server', - }); - await triggerMessage(msg); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ sender_name: 'Alice Global' }), - ); - }); - - it('uses sender name for DM chats (no guild)', async () => { - const opts = createTestOpts({ - registeredGroups: vi.fn(() => ({ - 'dc:1234567890123456': { - name: 'DM', - folder: 'dm', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - })), - }); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ - content: 'Hello', - guildName: undefined, - authorDisplayName: 'Alice', - }); - await triggerMessage(msg); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.any(String), - 'Alice', - ); - }); - - it('uses guild name + channel name for server messages', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ - content: 'Hello', - guildName: 'My Server', - channelName: 'bot-chat', - }); - await triggerMessage(msg); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.any(String), - 'My Server #bot-chat', - ); - }); - }); - - // --- @mention translation --- - - describe('@mention translation', () => { - it('translates <@botId> mention to trigger format', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ - content: '<@999888777> what time is it?', - mentionsBotId: true, - guildName: 'Server', - }); - await triggerMessage(msg); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - 
expect.objectContaining({ - content: '@Andy what time is it?', - }), - ); - }); - - it('does not translate if message already matches trigger', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ - content: '@Andy hello <@999888777>', - mentionsBotId: true, - guildName: 'Server', - }); - await triggerMessage(msg); - - // Should NOT prepend @Andy — already starts with trigger - // But the <@botId> should still be stripped - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ - content: '@Andy hello', - }), - ); - }); - - it('does not translate when bot is not mentioned', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ - content: 'hello everyone', - guildName: 'Server', - }); - await triggerMessage(msg); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ - content: 'hello everyone', - }), - ); - }); - - it('handles <@!botId> (nickname mention format)', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ - content: '<@!999888777> check this', - mentionsBotId: true, - guildName: 'Server', - }); - await triggerMessage(msg); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ - content: '@Andy check this', - }), - ); - }); - }); - - // --- Attachments --- - - describe('attachments', () => { - it('stores image attachment with placeholder', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const attachments = new Map([ - ['att1', { name: 'photo.png', contentType: 'image/png' }], - ]); - const msg = 
createMessage({ - content: '', - attachments, - guildName: 'Server', - }); - await triggerMessage(msg); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ - content: '[Image: photo.png]', - }), - ); - }); - - it('stores video attachment with placeholder', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const attachments = new Map([ - ['att1', { name: 'clip.mp4', contentType: 'video/mp4' }], - ]); - const msg = createMessage({ - content: '', - attachments, - guildName: 'Server', - }); - await triggerMessage(msg); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ - content: '[Video: clip.mp4]', - }), - ); - }); - - it('stores file attachment with placeholder', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const attachments = new Map([ - ['att1', { name: 'report.pdf', contentType: 'application/pdf' }], - ]); - const msg = createMessage({ - content: '', - attachments, - guildName: 'Server', - }); - await triggerMessage(msg); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ - content: '[File: report.pdf]', - }), - ); - }); - - it('includes text content with attachments', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const attachments = new Map([ - ['att1', { name: 'photo.jpg', contentType: 'image/jpeg' }], - ]); - const msg = createMessage({ - content: 'Check this out', - attachments, - guildName: 'Server', - }); - await triggerMessage(msg); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ - content: 'Check this out\n[Image: photo.jpg]', - }), - ); - }); - - it('handles multiple attachments', async () 
=> { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const attachments = new Map([ - ['att1', { name: 'a.png', contentType: 'image/png' }], - ['att2', { name: 'b.txt', contentType: 'text/plain' }], - ]); - const msg = createMessage({ - content: '', - attachments, - guildName: 'Server', - }); - await triggerMessage(msg); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ - content: '[Image: a.png]\n[File: b.txt]', - }), - ); - }); - }); - - // --- Reply context --- - - describe('reply context', () => { - it('includes reply author in content', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const msg = createMessage({ - content: 'I agree with that', - reference: { messageId: 'original_msg_id' }, - guildName: 'Server', - }); - await triggerMessage(msg); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'dc:1234567890123456', - expect.objectContaining({ - content: '[Reply to Bob] I agree with that', - }), - ); - }); - }); - - // --- sendMessage --- - - describe('sendMessage', () => { - it('sends message via channel', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - await channel.sendMessage('dc:1234567890123456', 'Hello'); - - const fetchedChannel = await currentClient().channels.fetch('1234567890123456'); - expect(currentClient().channels.fetch).toHaveBeenCalledWith('1234567890123456'); - }); - - it('strips dc: prefix from JID', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - await channel.sendMessage('dc:9876543210', 'Test'); - - expect(currentClient().channels.fetch).toHaveBeenCalledWith('9876543210'); - }); - - it('handles send failure gracefully', async () => { - const opts = 
createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - currentClient().channels.fetch.mockRejectedValueOnce( - new Error('Channel not found'), - ); - - // Should not throw - await expect( - channel.sendMessage('dc:1234567890123456', 'Will fail'), - ).resolves.toBeUndefined(); - }); - - it('does nothing when client is not initialized', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - - // Don't connect — client is null - await channel.sendMessage('dc:1234567890123456', 'No client'); - - // No error, no API call - }); - - it('splits messages exceeding 2000 characters', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const mockChannel = { - send: vi.fn().mockResolvedValue(undefined), - sendTyping: vi.fn(), - }; - currentClient().channels.fetch.mockResolvedValue(mockChannel); - - const longText = 'x'.repeat(3000); - await channel.sendMessage('dc:1234567890123456', longText); - - expect(mockChannel.send).toHaveBeenCalledTimes(2); - expect(mockChannel.send).toHaveBeenNthCalledWith(1, 'x'.repeat(2000)); - expect(mockChannel.send).toHaveBeenNthCalledWith(2, 'x'.repeat(1000)); - }); - }); - - // --- ownsJid --- - - describe('ownsJid', () => { - it('owns dc: JIDs', () => { - const channel = new DiscordChannel('test-token', createTestOpts()); - expect(channel.ownsJid('dc:1234567890123456')).toBe(true); - }); - - it('does not own WhatsApp group JIDs', () => { - const channel = new DiscordChannel('test-token', createTestOpts()); - expect(channel.ownsJid('12345@g.us')).toBe(false); - }); - - it('does not own Telegram JIDs', () => { - const channel = new DiscordChannel('test-token', createTestOpts()); - expect(channel.ownsJid('tg:123456789')).toBe(false); - }); - - it('does not own unknown JID formats', () => { - const channel = new DiscordChannel('test-token', createTestOpts()); - 
expect(channel.ownsJid('random-string')).toBe(false); - }); - }); - - // --- setTyping --- - - describe('setTyping', () => { - it('sends typing indicator when isTyping is true', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - const mockChannel = { - send: vi.fn(), - sendTyping: vi.fn().mockResolvedValue(undefined), - }; - currentClient().channels.fetch.mockResolvedValue(mockChannel); - - await channel.setTyping('dc:1234567890123456', true); - - expect(mockChannel.sendTyping).toHaveBeenCalled(); - }); - - it('does nothing when isTyping is false', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - await channel.connect(); - - await channel.setTyping('dc:1234567890123456', false); - - // channels.fetch should NOT be called - expect(currentClient().channels.fetch).not.toHaveBeenCalled(); - }); - - it('does nothing when client is not initialized', async () => { - const opts = createTestOpts(); - const channel = new DiscordChannel('test-token', opts); - - // Don't connect - await channel.setTyping('dc:1234567890123456', true); - - // No error - }); - }); - - // --- Channel properties --- - - describe('channel properties', () => { - it('has name "discord"', () => { - const channel = new DiscordChannel('test-token', createTestOpts()); - expect(channel.name).toBe('discord'); - }); - }); -}); diff --git a/.claude/skills/add-discord/add/src/channels/discord.ts b/.claude/skills/add-discord/add/src/channels/discord.ts deleted file mode 100644 index 997d489..0000000 --- a/.claude/skills/add-discord/add/src/channels/discord.ts +++ /dev/null @@ -1,236 +0,0 @@ -import { Client, Events, GatewayIntentBits, Message, TextChannel } from 'discord.js'; - -import { ASSISTANT_NAME, TRIGGER_PATTERN } from '../config.js'; -import { logger } from '../logger.js'; -import { - Channel, - OnChatMetadata, - OnInboundMessage, - RegisteredGroup, -} from 
'../types.js';
-
-export interface DiscordChannelOpts {
-  onMessage: OnInboundMessage;
-  onChatMetadata: OnChatMetadata;
-  registeredGroups: () => Record<string, RegisteredGroup>;
-}
-
-export class DiscordChannel implements Channel {
-  name = 'discord';
-
-  private client: Client | null = null;
-  private opts: DiscordChannelOpts;
-  private botToken: string;
-
-  constructor(botToken: string, opts: DiscordChannelOpts) {
-    this.botToken = botToken;
-    this.opts = opts;
-  }
-
-  async connect(): Promise<void> {
-    this.client = new Client({
-      intents: [
-        GatewayIntentBits.Guilds,
-        GatewayIntentBits.GuildMessages,
-        GatewayIntentBits.MessageContent,
-        GatewayIntentBits.DirectMessages,
-      ],
-    });
-
-    this.client.on(Events.MessageCreate, async (message: Message) => {
-      // Ignore bot messages (including own)
-      if (message.author.bot) return;
-
-      const channelId = message.channelId;
-      const chatJid = `dc:${channelId}`;
-      let content = message.content;
-      const timestamp = message.createdAt.toISOString();
-      const senderName =
-        message.member?.displayName ||
-        message.author.displayName ||
-        message.author.username;
-      const sender = message.author.id;
-      const msgId = message.id;
-
-      // Determine chat name
-      let chatName: string;
-      if (message.guild) {
-        const textChannel = message.channel as TextChannel;
-        chatName = `${message.guild.name} #${textChannel.name}`;
-      } else {
-        chatName = senderName;
-      }
-
-      // Translate Discord @bot mentions into TRIGGER_PATTERN format.
-      // Discord mentions look like <@botUserId> — these won't match
-      // TRIGGER_PATTERN (e.g., ^@Andy\b), so we prepend the trigger
-      // when the bot is @mentioned.
- if (this.client?.user) { - const botId = this.client.user.id; - const isBotMentioned = - message.mentions.users.has(botId) || - content.includes(`<@${botId}>`) || - content.includes(`<@!${botId}>`); - - if (isBotMentioned) { - // Strip the <@botId> mention to avoid visual clutter - content = content - .replace(new RegExp(`<@!?${botId}>`, 'g'), '') - .trim(); - // Prepend trigger if not already present - if (!TRIGGER_PATTERN.test(content)) { - content = `@${ASSISTANT_NAME} ${content}`; - } - } - } - - // Handle attachments — store placeholders so the agent knows something was sent - if (message.attachments.size > 0) { - const attachmentDescriptions = [...message.attachments.values()].map((att) => { - const contentType = att.contentType || ''; - if (contentType.startsWith('image/')) { - return `[Image: ${att.name || 'image'}]`; - } else if (contentType.startsWith('video/')) { - return `[Video: ${att.name || 'video'}]`; - } else if (contentType.startsWith('audio/')) { - return `[Audio: ${att.name || 'audio'}]`; - } else { - return `[File: ${att.name || 'file'}]`; - } - }); - if (content) { - content = `${content}\n${attachmentDescriptions.join('\n')}`; - } else { - content = attachmentDescriptions.join('\n'); - } - } - - // Handle reply context — include who the user is replying to - if (message.reference?.messageId) { - try { - const repliedTo = await message.channel.messages.fetch( - message.reference.messageId, - ); - const replyAuthor = - repliedTo.member?.displayName || - repliedTo.author.displayName || - repliedTo.author.username; - content = `[Reply to ${replyAuthor}] ${content}`; - } catch { - // Referenced message may have been deleted - } - } - - // Store chat metadata for discovery - this.opts.onChatMetadata(chatJid, timestamp, chatName); - - // Only deliver full message for registered groups - const group = this.opts.registeredGroups()[chatJid]; - if (!group) { - logger.debug( - { chatJid, chatName }, - 'Message from unregistered Discord channel', - ); - 
return; - } - - // Deliver message — startMessageLoop() will pick it up - this.opts.onMessage(chatJid, { - id: msgId, - chat_jid: chatJid, - sender, - sender_name: senderName, - content, - timestamp, - is_from_me: false, - }); - - logger.info( - { chatJid, chatName, sender: senderName }, - 'Discord message stored', - ); - }); - - // Handle errors gracefully - this.client.on(Events.Error, (err) => { - logger.error({ err: err.message }, 'Discord client error'); - }); - - return new Promise((resolve) => { - this.client!.once(Events.ClientReady, (readyClient) => { - logger.info( - { username: readyClient.user.tag, id: readyClient.user.id }, - 'Discord bot connected', - ); - console.log(`\n Discord bot: ${readyClient.user.tag}`); - console.log( - ` Use /chatid command or check channel IDs in Discord settings\n`, - ); - resolve(); - }); - - this.client!.login(this.botToken); - }); - } - - async sendMessage(jid: string, text: string): Promise { - if (!this.client) { - logger.warn('Discord client not initialized'); - return; - } - - try { - const channelId = jid.replace(/^dc:/, ''); - const channel = await this.client.channels.fetch(channelId); - - if (!channel || !('send' in channel)) { - logger.warn({ jid }, 'Discord channel not found or not text-based'); - return; - } - - const textChannel = channel as TextChannel; - - // Discord has a 2000 character limit per message — split if needed - const MAX_LENGTH = 2000; - if (text.length <= MAX_LENGTH) { - await textChannel.send(text); - } else { - for (let i = 0; i < text.length; i += MAX_LENGTH) { - await textChannel.send(text.slice(i, i + MAX_LENGTH)); - } - } - logger.info({ jid, length: text.length }, 'Discord message sent'); - } catch (err) { - logger.error({ jid, err }, 'Failed to send Discord message'); - } - } - - isConnected(): boolean { - return this.client !== null && this.client.isReady(); - } - - ownsJid(jid: string): boolean { - return jid.startsWith('dc:'); - } - - async disconnect(): Promise { - if 
(this.client) {
-      this.client.destroy();
-      this.client = null;
-      logger.info('Discord bot stopped');
-    }
-  }
-
-  async setTyping(jid: string, isTyping: boolean): Promise<void> {
-    if (!this.client || !isTyping) return;
-    try {
-      const channelId = jid.replace(/^dc:/, '');
-      const channel = await this.client.channels.fetch(channelId);
-      if (channel && 'sendTyping' in channel) {
-        await (channel as TextChannel).sendTyping();
-      }
-    } catch (err) {
-      logger.debug({ jid, err }, 'Failed to send Discord typing indicator');
-    }
-  }
-}
diff --git a/.claude/skills/add-discord/manifest.yaml b/.claude/skills/add-discord/manifest.yaml
deleted file mode 100644
index f2cf2c8..0000000
--- a/.claude/skills/add-discord/manifest.yaml
+++ /dev/null
@@ -1,20 +0,0 @@
-skill: discord
-version: 1.0.0
-description: "Discord Bot integration via discord.js"
-core_version: 0.1.0
-adds:
-  - src/channels/discord.ts
-  - src/channels/discord.test.ts
-modifies:
-  - src/index.ts
-  - src/config.ts
-  - src/routing.test.ts
-structured:
-  npm_dependencies:
-    discord.js: "^14.18.0"
-  env_additions:
-    - DISCORD_BOT_TOKEN
-    - DISCORD_ONLY
-conflicts: []
-depends: []
-test: "npx vitest run src/channels/discord.test.ts"
diff --git a/.claude/skills/add-discord/modify/src/config.ts b/.claude/skills/add-discord/modify/src/config.ts
deleted file mode 100644
index 5f3fa6a..0000000
--- a/.claude/skills/add-discord/modify/src/config.ts
+++ /dev/null
@@ -1,77 +0,0 @@
-import os from 'os';
-import path from 'path';
-
-import { readEnvFile } from './env.js';
-
-// Read config values from .env (falls back to process.env).
-// Secrets are NOT read here — they stay on disk and are loaded only
-// where needed (container-runner.ts) to avoid leaking to child processes.
-const envConfig = readEnvFile([ - 'ASSISTANT_NAME', - 'ASSISTANT_HAS_OWN_NUMBER', - 'DISCORD_BOT_TOKEN', - 'DISCORD_ONLY', -]); - -export const ASSISTANT_NAME = - process.env.ASSISTANT_NAME || envConfig.ASSISTANT_NAME || 'Andy'; -export const ASSISTANT_HAS_OWN_NUMBER = - (process.env.ASSISTANT_HAS_OWN_NUMBER || envConfig.ASSISTANT_HAS_OWN_NUMBER) === 'true'; -export const POLL_INTERVAL = 2000; -export const SCHEDULER_POLL_INTERVAL = 60000; - -// Absolute paths needed for container mounts -const PROJECT_ROOT = process.cwd(); -const HOME_DIR = process.env.HOME || os.homedir(); - -// Mount security: allowlist stored OUTSIDE project root, never mounted into containers -export const MOUNT_ALLOWLIST_PATH = path.join( - HOME_DIR, - '.config', - 'nanoclaw', - 'mount-allowlist.json', -); -export const STORE_DIR = path.resolve(PROJECT_ROOT, 'store'); -export const GROUPS_DIR = path.resolve(PROJECT_ROOT, 'groups'); -export const DATA_DIR = path.resolve(PROJECT_ROOT, 'data'); -export const MAIN_GROUP_FOLDER = 'main'; - -export const CONTAINER_IMAGE = - process.env.CONTAINER_IMAGE || 'nanoclaw-agent:latest'; -export const CONTAINER_TIMEOUT = parseInt( - process.env.CONTAINER_TIMEOUT || '1800000', - 10, -); -export const CONTAINER_MAX_OUTPUT_SIZE = parseInt( - process.env.CONTAINER_MAX_OUTPUT_SIZE || '10485760', - 10, -); // 10MB default -export const IPC_POLL_INTERVAL = 1000; -export const IDLE_TIMEOUT = parseInt( - process.env.IDLE_TIMEOUT || '1800000', - 10, -); // 30min default — how long to keep container alive after last result -export const MAX_CONCURRENT_CONTAINERS = Math.max( - 1, - parseInt(process.env.MAX_CONCURRENT_CONTAINERS || '5', 10) || 5, -); - -function escapeRegex(str: string): string { - return str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); -} - -export const TRIGGER_PATTERN = new RegExp( - `^@${escapeRegex(ASSISTANT_NAME)}\\b`, - 'i', -); - -// Timezone for scheduled tasks (cron expressions, etc.) 
-// Uses system timezone by default -export const TIMEZONE = - process.env.TZ || Intl.DateTimeFormat().resolvedOptions().timeZone; - -// Discord configuration -export const DISCORD_BOT_TOKEN = - process.env.DISCORD_BOT_TOKEN || envConfig.DISCORD_BOT_TOKEN || ''; -export const DISCORD_ONLY = - (process.env.DISCORD_ONLY || envConfig.DISCORD_ONLY) === 'true'; diff --git a/.claude/skills/add-discord/modify/src/config.ts.intent.md b/.claude/skills/add-discord/modify/src/config.ts.intent.md deleted file mode 100644 index a88fabe..0000000 --- a/.claude/skills/add-discord/modify/src/config.ts.intent.md +++ /dev/null @@ -1,21 +0,0 @@ -# Intent: src/config.ts modifications - -## What changed -Added two new configuration exports for Discord channel support. - -## Key sections -- **readEnvFile call**: Must include `DISCORD_BOT_TOKEN` and `DISCORD_ONLY` in the keys array. NanoClaw does NOT load `.env` into `process.env` — all `.env` values must be explicitly requested via `readEnvFile()`. -- **DISCORD_BOT_TOKEN**: Read from `process.env` first, then `envConfig` fallback, defaults to empty string (channel disabled when empty) -- **DISCORD_ONLY**: Boolean flag from `process.env` or `envConfig`, when `true` disables WhatsApp channel creation - -## Invariants -- All existing config exports remain unchanged -- New Discord keys are added to the `readEnvFile` call alongside existing keys -- New exports are appended at the end of the file -- No existing behavior is modified — Discord config is additive only -- Both `process.env` and `envConfig` are checked (same pattern as `ASSISTANT_NAME`) - -## Must-keep -- All existing exports (`ASSISTANT_NAME`, `POLL_INTERVAL`, `TRIGGER_PATTERN`, etc.) 
-- The `readEnvFile` pattern — ALL config read from `.env` must go through this function
-- The `escapeRegex` helper and `TRIGGER_PATTERN` construction
diff --git a/.claude/skills/add-discord/modify/src/index.ts b/.claude/skills/add-discord/modify/src/index.ts
deleted file mode 100644
index 4b6f30e..0000000
--- a/.claude/skills/add-discord/modify/src/index.ts
+++ /dev/null
@@ -1,509 +0,0 @@
-import fs from 'fs';
-import path from 'path';
-
-import {
-  ASSISTANT_NAME,
-  DISCORD_BOT_TOKEN,
-  DISCORD_ONLY,
-  IDLE_TIMEOUT,
-  MAIN_GROUP_FOLDER,
-  POLL_INTERVAL,
-  TRIGGER_PATTERN,
-} from './config.js';
-import { DiscordChannel } from './channels/discord.js';
-import { WhatsAppChannel } from './channels/whatsapp.js';
-import {
-  ContainerOutput,
-  runContainerAgent,
-  writeGroupsSnapshot,
-  writeTasksSnapshot,
-} from './container-runner.js';
-import { cleanupOrphans, ensureContainerRuntimeRunning } from './container-runtime.js';
-import {
-  getAllChats,
-  getAllRegisteredGroups,
-  getAllSessions,
-  getAllTasks,
-  getMessagesSince,
-  getNewMessages,
-  getRouterState,
-  initDatabase,
-  setRegisteredGroup,
-  setRouterState,
-  setSession,
-  storeChatMetadata,
-  storeMessage,
-} from './db.js';
-import { GroupQueue } from './group-queue.js';
-import { resolveGroupFolderPath } from './group-folder.js';
-import { startIpcWatcher } from './ipc.js';
-import { findChannel, formatMessages, formatOutbound } from './router.js';
-import { startSchedulerLoop } from './task-scheduler.js';
-import { Channel, NewMessage, RegisteredGroup } from './types.js';
-import { logger } from './logger.js';
-
-// Re-export for backwards compatibility during refactor
-export { escapeXml, formatMessages } from './router.js';
-
-let lastTimestamp = '';
-let sessions: Record<string, string> = {};
-let registeredGroups: Record<string, RegisteredGroup> = {};
-let lastAgentTimestamp: Record<string, string> = {};
-let messageLoopRunning = false;
-
-let whatsapp: WhatsAppChannel;
-const channels: Channel[] = [];
-const queue = new GroupQueue();
-
-function
loadState(): void {
-  lastTimestamp = getRouterState('last_timestamp') || '';
-  const agentTs = getRouterState('last_agent_timestamp');
-  try {
-    lastAgentTimestamp = agentTs ? JSON.parse(agentTs) : {};
-  } catch {
-    logger.warn('Corrupted last_agent_timestamp in DB, resetting');
-    lastAgentTimestamp = {};
-  }
-  sessions = getAllSessions();
-  registeredGroups = getAllRegisteredGroups();
-  logger.info(
-    { groupCount: Object.keys(registeredGroups).length },
-    'State loaded',
-  );
-}
-
-function saveState(): void {
-  setRouterState('last_timestamp', lastTimestamp);
-  setRouterState(
-    'last_agent_timestamp',
-    JSON.stringify(lastAgentTimestamp),
-  );
-}
-
-function registerGroup(jid: string, group: RegisteredGroup): void {
-  let groupDir: string;
-  try {
-    groupDir = resolveGroupFolderPath(group.folder);
-  } catch (err) {
-    logger.warn(
-      { jid, folder: group.folder, err },
-      'Rejecting group registration with invalid folder',
-    );
-    return;
-  }
-
-  registeredGroups[jid] = group;
-  setRegisteredGroup(jid, group);
-
-  // Create group folder
-  fs.mkdirSync(path.join(groupDir, 'logs'), { recursive: true });
-
-  logger.info(
-    { jid, name: group.name, folder: group.folder },
-    'Group registered',
-  );
-}
-
-/**
- * Get available groups list for the agent.
- * Returns groups ordered by most recent activity.
- */
-export function getAvailableGroups(): import('./container-runner.js').AvailableGroup[] {
-  const chats = getAllChats();
-  const registeredJids = new Set(Object.keys(registeredGroups));
-
-  return chats
-    .filter((c) => c.jid !== '__group_sync__' && c.is_group)
-    .map((c) => ({
-      jid: c.jid,
-      name: c.name,
-      lastActivity: c.last_message_time,
-      isRegistered: registeredJids.has(c.jid),
-    }));
-}
-
-/** @internal - exported for testing */
-export function _setRegisteredGroups(groups: Record<string, RegisteredGroup>): void {
-  registeredGroups = groups;
-}
-
-/**
- * Process all pending messages for a group.
- * Called by the GroupQueue when it's this group's turn.
- */
-async function processGroupMessages(chatJid: string): Promise<boolean> {
-  const group = registeredGroups[chatJid];
-  if (!group) return true;
-
-  const channel = findChannel(channels, chatJid);
-  if (!channel) {
-    console.log(`Warning: no channel owns JID ${chatJid}, skipping messages`);
-    return true;
-  }
-
-  const isMainGroup = group.folder === MAIN_GROUP_FOLDER;
-
-  const sinceTimestamp = lastAgentTimestamp[chatJid] || '';
-  const missedMessages = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME);
-
-  if (missedMessages.length === 0) return true;
-
-  // For non-main groups, check if trigger is required and present
-  if (!isMainGroup && group.requiresTrigger !== false) {
-    const hasTrigger = missedMessages.some((m) =>
-      TRIGGER_PATTERN.test(m.content.trim()),
-    );
-    if (!hasTrigger) return true;
-  }
-
-  const prompt = formatMessages(missedMessages);
-
-  // Advance cursor so the piping path in startMessageLoop won't re-fetch
-  // these messages. Save the old cursor so we can roll back on error.
-  const previousCursor = lastAgentTimestamp[chatJid] || '';
-  lastAgentTimestamp[chatJid] =
-    missedMessages[missedMessages.length - 1].timestamp;
-  saveState();
-
-  logger.info(
-    { group: group.name, messageCount: missedMessages.length },
-    'Processing messages',
-  );
-
-  // Track idle timer for closing stdin when agent is idle
-  let idleTimer: ReturnType<typeof setTimeout> | null = null;
-
-  const resetIdleTimer = () => {
-    if (idleTimer) clearTimeout(idleTimer);
-    idleTimer = setTimeout(() => {
-      logger.debug({ group: group.name }, 'Idle timeout, closing container stdin');
-      queue.closeStdin(chatJid);
-    }, IDLE_TIMEOUT);
-  };
-
-  await channel.setTyping?.(chatJid, true);
-  let hadError = false;
-  let outputSentToUser = false;
-
-  const output = await runAgent(group, prompt, chatJid, async (result) => {
-    // Streaming output callback — called for each agent result
-    if (result.result) {
-      const raw = typeof result.result === 'string' ?
result.result : JSON.stringify(result.result);
-      // Strip <internal>...</internal> blocks — agent uses these for internal reasoning
-      const text = raw.replace(/<internal>[\s\S]*?<\/internal>/g, '').trim();
-      logger.info({ group: group.name }, `Agent output: ${raw.slice(0, 200)}`);
-      if (text) {
-        await channel.sendMessage(chatJid, text);
-        outputSentToUser = true;
-      }
-      // Only reset idle timer on actual results, not session-update markers (result: null)
-      resetIdleTimer();
-    }
-
-    if (result.status === 'success') {
-      queue.notifyIdle(chatJid);
-    }
-
-    if (result.status === 'error') {
-      hadError = true;
-    }
-  });
-
-  await channel.setTyping?.(chatJid, false);
-  if (idleTimer) clearTimeout(idleTimer);
-
-  if (output === 'error' || hadError) {
-    // If we already sent output to the user, don't roll back the cursor —
-    // the user got their response and re-processing would send duplicates.
-    if (outputSentToUser) {
-      logger.warn({ group: group.name }, 'Agent error after output was sent, skipping cursor rollback to prevent duplicates');
-      return true;
-    }
-    // Roll back cursor so retries can re-process these messages
-    lastAgentTimestamp[chatJid] = previousCursor;
-    saveState();
-    logger.warn({ group: group.name }, 'Agent error, rolled back message cursor for retry');
-    return false;
-  }
-
-  return true;
-}
-
-async function runAgent(
-  group: RegisteredGroup,
-  prompt: string,
-  chatJid: string,
-  onOutput?: (output: ContainerOutput) => Promise<void>,
-): Promise<'success' | 'error'> {
-  const isMain = group.folder === MAIN_GROUP_FOLDER;
-  const sessionId = sessions[group.folder];
-
-  // Update tasks snapshot for container to read (filtered by group)
-  const tasks = getAllTasks();
-  writeTasksSnapshot(
-    group.folder,
-    isMain,
-    tasks.map((t) => ({
-      id: t.id,
-      groupFolder: t.group_folder,
-      prompt: t.prompt,
-      schedule_type: t.schedule_type,
-      schedule_value: t.schedule_value,
-      status: t.status,
-      next_run: t.next_run,
-    })),
-  );
-
-  // Update available groups snapshot (main group only can see all
groups)
-  const availableGroups = getAvailableGroups();
-  writeGroupsSnapshot(
-    group.folder,
-    isMain,
-    availableGroups,
-    new Set(Object.keys(registeredGroups)),
-  );
-
-  // Wrap onOutput to track session ID from streamed results
-  const wrappedOnOutput = onOutput
-    ? async (output: ContainerOutput) => {
-        if (output.newSessionId) {
-          sessions[group.folder] = output.newSessionId;
-          setSession(group.folder, output.newSessionId);
-        }
-        await onOutput(output);
-      }
-    : undefined;
-
-  try {
-    const output = await runContainerAgent(
-      group,
-      {
-        prompt,
-        sessionId,
-        groupFolder: group.folder,
-        chatJid,
-        isMain,
-        assistantName: ASSISTANT_NAME,
-      },
-      (proc, containerName) => queue.registerProcess(chatJid, proc, containerName, group.folder),
-      wrappedOnOutput,
-    );
-
-    if (output.newSessionId) {
-      sessions[group.folder] = output.newSessionId;
-      setSession(group.folder, output.newSessionId);
-    }
-
-    if (output.status === 'error') {
-      logger.error(
-        { group: group.name, error: output.error },
-        'Container agent error',
-      );
-      return 'error';
-    }
-
-    return 'success';
-  } catch (err) {
-    logger.error({ group: group.name, err }, 'Agent error');
-    return 'error';
-  }
-}
-
-async function startMessageLoop(): Promise<void> {
-  if (messageLoopRunning) {
-    logger.debug('Message loop already running, skipping duplicate start');
-    return;
-  }
-  messageLoopRunning = true;
-
-  logger.info(`NanoClaw running (trigger: @${ASSISTANT_NAME})`);
-
-  while (true) {
-    try {
-      const jids = Object.keys(registeredGroups);
-      const { messages, newTimestamp } = getNewMessages(jids, lastTimestamp, ASSISTANT_NAME);
-
-      if (messages.length > 0) {
-        logger.info({ count: messages.length }, 'New messages');
-
-        // Advance the "seen" cursor for all messages immediately
-        lastTimestamp = newTimestamp;
-        saveState();
-
-        // Deduplicate by group
-        const messagesByGroup = new Map<string, NewMessage[]>();
-        for (const msg of messages) {
-          const existing = messagesByGroup.get(msg.chat_jid);
-          if (existing) {
existing.push(msg); - } else { - messagesByGroup.set(msg.chat_jid, [msg]); - } - } - - for (const [chatJid, groupMessages] of messagesByGroup) { - const group = registeredGroups[chatJid]; - if (!group) continue; - - const channel = findChannel(channels, chatJid); - if (!channel) { - console.log(`Warning: no channel owns JID ${chatJid}, skipping messages`); - continue; - } - - const isMainGroup = group.folder === MAIN_GROUP_FOLDER; - const needsTrigger = !isMainGroup && group.requiresTrigger !== false; - - // For non-main groups, only act on trigger messages. - // Non-trigger messages accumulate in DB and get pulled as - // context when a trigger eventually arrives. - if (needsTrigger) { - const hasTrigger = groupMessages.some((m) => - TRIGGER_PATTERN.test(m.content.trim()), - ); - if (!hasTrigger) continue; - } - - // Pull all messages since lastAgentTimestamp so non-trigger - // context that accumulated between triggers is included. - const allPending = getMessagesSince( - chatJid, - lastAgentTimestamp[chatJid] || '', - ASSISTANT_NAME, - ); - const messagesToSend = - allPending.length > 0 ? allPending : groupMessages; - const formatted = formatMessages(messagesToSend); - - if (queue.sendMessage(chatJid, formatted)) { - logger.debug( - { chatJid, count: messagesToSend.length }, - 'Piped messages to active container', - ); - lastAgentTimestamp[chatJid] = - messagesToSend[messagesToSend.length - 1].timestamp; - saveState(); - // Show typing indicator while the container processes the piped message - channel.setTyping?.(chatJid, true)?.catch((err) => - logger.warn({ chatJid, err }, 'Failed to set typing indicator'), - ); - } else { - // No active container — enqueue for a new one - queue.enqueueMessageCheck(chatJid); - } - } - } - } catch (err) { - logger.error({ err }, 'Error in message loop'); - } - await new Promise((resolve) => setTimeout(resolve, POLL_INTERVAL)); - } -} - -/** - * Startup recovery: check for unprocessed messages in registered groups. 
- * Handles crash between advancing lastTimestamp and processing messages.
- */
-function recoverPendingMessages(): void {
-  for (const [chatJid, group] of Object.entries(registeredGroups)) {
-    const sinceTimestamp = lastAgentTimestamp[chatJid] || '';
-    const pending = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME);
-    if (pending.length > 0) {
-      logger.info(
-        { group: group.name, pendingCount: pending.length },
-        'Recovery: found unprocessed messages',
-      );
-      queue.enqueueMessageCheck(chatJid);
-    }
-  }
-}
-
-function ensureContainerSystemRunning(): void {
-  ensureContainerRuntimeRunning();
-  cleanupOrphans();
-}
-
-async function main(): Promise<void> {
-  ensureContainerSystemRunning();
-  initDatabase();
-  logger.info('Database initialized');
-  loadState();
-
-  // Graceful shutdown handlers
-  const shutdown = async (signal: string) => {
-    logger.info({ signal }, 'Shutdown signal received');
-    await queue.shutdown(10000);
-    for (const ch of channels) await ch.disconnect();
-    process.exit(0);
-  };
-  process.on('SIGTERM', () => shutdown('SIGTERM'));
-  process.on('SIGINT', () => shutdown('SIGINT'));
-
-  // Channel callbacks (shared by all channels)
-  const channelOpts = {
-    onMessage: (_chatJid: string, msg: NewMessage) => storeMessage(msg),
-    onChatMetadata: (chatJid: string, timestamp: string, name?: string, channel?: string, isGroup?: boolean) =>
-      storeChatMetadata(chatJid, timestamp, name, channel, isGroup),
-    registeredGroups: () => registeredGroups,
-  };
-
-  // Create and connect channels
-  if (DISCORD_BOT_TOKEN) {
-    const discord = new DiscordChannel(DISCORD_BOT_TOKEN, channelOpts);
-    channels.push(discord);
-    await discord.connect();
-  }
-
-  if (!DISCORD_ONLY) {
-    whatsapp = new WhatsAppChannel(channelOpts);
-    channels.push(whatsapp);
-    await whatsapp.connect();
-  }
-
-  // Start subsystems (independently of connection handler)
-  startSchedulerLoop({
-    registeredGroups: () => registeredGroups,
-    getSessions: () => sessions,
-    queue,
-    onProcess:
(groupJid, proc, containerName, groupFolder) => queue.registerProcess(groupJid, proc, containerName, groupFolder), - sendMessage: async (jid, rawText) => { - const channel = findChannel(channels, jid); - if (!channel) { - console.log(`Warning: no channel owns JID ${jid}, cannot send message`); - return; - } - const text = formatOutbound(rawText); - if (text) await channel.sendMessage(jid, text); - }, - }); - startIpcWatcher({ - sendMessage: (jid, text) => { - const channel = findChannel(channels, jid); - if (!channel) throw new Error(`No channel for JID: ${jid}`); - return channel.sendMessage(jid, text); - }, - registeredGroups: () => registeredGroups, - registerGroup, - syncGroupMetadata: (force) => whatsapp?.syncGroupMetadata(force) ?? Promise.resolve(), - getAvailableGroups, - writeGroupsSnapshot: (gf, im, ag, rj) => writeGroupsSnapshot(gf, im, ag, rj), - }); - queue.setProcessMessagesFn(processGroupMessages); - recoverPendingMessages(); - startMessageLoop().catch((err) => { - logger.fatal({ err }, 'Message loop crashed unexpectedly'); - process.exit(1); - }); -} - -// Guard: only run when executed directly, not when imported by tests -const isDirectRun = - process.argv[1] && - new URL(import.meta.url).pathname === new URL(`file://${process.argv[1]}`).pathname; - -if (isDirectRun) { - main().catch((err) => { - logger.error({ err }, 'Failed to start NanoClaw'); - process.exit(1); - }); -} diff --git a/.claude/skills/add-discord/modify/src/index.ts.intent.md b/.claude/skills/add-discord/modify/src/index.ts.intent.md deleted file mode 100644 index a02ef52..0000000 --- a/.claude/skills/add-discord/modify/src/index.ts.intent.md +++ /dev/null @@ -1,43 +0,0 @@ -# Intent: src/index.ts modifications - -## What changed -Added Discord as a channel option alongside WhatsApp, introducing multi-channel infrastructure. 
- -## Key sections - -### Imports (top of file) -- Added: `DiscordChannel` from `./channels/discord.js` -- Added: `DISCORD_BOT_TOKEN`, `DISCORD_ONLY` from `./config.js` -- Added: `findChannel` from `./router.js` -- Added: `Channel` from `./types.js` - -### Multi-channel infrastructure -- Added: `const channels: Channel[] = []` array to hold all active channels -- Changed: `processGroupMessages` uses `findChannel(channels, chatJid)` instead of `whatsapp` directly -- Changed: `startMessageLoop` uses `findChannel(channels, chatJid)` instead of `whatsapp` directly -- Changed: `channel.setTyping?.()` instead of `whatsapp.setTyping()` -- Changed: `channel.sendMessage()` instead of `whatsapp.sendMessage()` - -### getAvailableGroups() -- Unchanged: uses `c.is_group` filter from base (Discord channels pass `isGroup=true` via `onChatMetadata`) - -### main() -- Added: `channelOpts` shared callback object for all channels -- Changed: WhatsApp conditional to `if (!DISCORD_ONLY)` -- Added: conditional Discord creation (`if (DISCORD_BOT_TOKEN)`) -- Changed: shutdown iterates `channels` array instead of just `whatsapp` -- Changed: subsystems use `findChannel(channels, jid)` for message routing - -## Invariants -- All existing message processing logic (triggers, cursors, idle timers) is preserved -- The `runAgent` function is completely unchanged -- State management (loadState/saveState) is unchanged -- Recovery logic is unchanged -- Container runtime check is unchanged (ensureContainerSystemRunning) - -## Must-keep -- The `escapeXml` and `formatMessages` re-exports -- The `_setRegisteredGroups` test helper -- The `isDirectRun` guard at bottom -- All error handling and cursor rollback logic in processGroupMessages -- The outgoing queue flush and reconnection logic (in WhatsAppChannel, not here) diff --git a/.claude/skills/add-discord/modify/src/routing.test.ts b/.claude/skills/add-discord/modify/src/routing.test.ts deleted file mode 100644 index 6144af0..0000000 --- 
a/.claude/skills/add-discord/modify/src/routing.test.ts +++ /dev/null @@ -1,147 +0,0 @@ -import { describe, it, expect, beforeEach } from 'vitest'; - -import { _initTestDatabase, getAllChats, storeChatMetadata } from './db.js'; -import { getAvailableGroups, _setRegisteredGroups } from './index.js'; - -beforeEach(() => { - _initTestDatabase(); - _setRegisteredGroups({}); -}); - -// --- JID ownership patterns --- - -describe('JID ownership patterns', () => { - // These test the patterns that will become ownsJid() on the Channel interface - - it('WhatsApp group JID: ends with @g.us', () => { - const jid = '12345678@g.us'; - expect(jid.endsWith('@g.us')).toBe(true); - }); - - it('Discord JID: starts with dc:', () => { - const jid = 'dc:1234567890123456'; - expect(jid.startsWith('dc:')).toBe(true); - }); - - it('WhatsApp DM JID: ends with @s.whatsapp.net', () => { - const jid = '12345678@s.whatsapp.net'; - expect(jid.endsWith('@s.whatsapp.net')).toBe(true); - }); -}); - -// --- getAvailableGroups --- - -describe('getAvailableGroups', () => { - it('returns only groups, excludes DMs', () => { - storeChatMetadata('group1@g.us', '2024-01-01T00:00:01.000Z', 'Group 1', 'whatsapp', true); - storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false); - storeChatMetadata('group2@g.us', '2024-01-01T00:00:03.000Z', 'Group 2', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(2); - expect(groups.map((g) => g.jid)).toContain('group1@g.us'); - expect(groups.map((g) => g.jid)).toContain('group2@g.us'); - expect(groups.map((g) => g.jid)).not.toContain('user@s.whatsapp.net'); - }); - - it('includes Discord channel JIDs', () => { - storeChatMetadata('dc:1234567890123456', '2024-01-01T00:00:01.000Z', 'Discord Channel', 'discord', true); - storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); 
- expect(groups[0].jid).toBe('dc:1234567890123456'); - }); - - it('marks registered Discord channels correctly', () => { - storeChatMetadata('dc:1234567890123456', '2024-01-01T00:00:01.000Z', 'DC Registered', 'discord', true); - storeChatMetadata('dc:9999999999999999', '2024-01-01T00:00:02.000Z', 'DC Unregistered', 'discord', true); - - _setRegisteredGroups({ - 'dc:1234567890123456': { - name: 'DC Registered', - folder: 'dc-registered', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - }); - - const groups = getAvailableGroups(); - const dcReg = groups.find((g) => g.jid === 'dc:1234567890123456'); - const dcUnreg = groups.find((g) => g.jid === 'dc:9999999999999999'); - - expect(dcReg?.isRegistered).toBe(true); - expect(dcUnreg?.isRegistered).toBe(false); - }); - - it('excludes __group_sync__ sentinel', () => { - storeChatMetadata('__group_sync__', '2024-01-01T00:00:00.000Z'); - storeChatMetadata('group@g.us', '2024-01-01T00:00:01.000Z', 'Group', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('group@g.us'); - }); - - it('marks registered groups correctly', () => { - storeChatMetadata('reg@g.us', '2024-01-01T00:00:01.000Z', 'Registered', 'whatsapp', true); - storeChatMetadata('unreg@g.us', '2024-01-01T00:00:02.000Z', 'Unregistered', 'whatsapp', true); - - _setRegisteredGroups({ - 'reg@g.us': { - name: 'Registered', - folder: 'registered', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - }); - - const groups = getAvailableGroups(); - const reg = groups.find((g) => g.jid === 'reg@g.us'); - const unreg = groups.find((g) => g.jid === 'unreg@g.us'); - - expect(reg?.isRegistered).toBe(true); - expect(unreg?.isRegistered).toBe(false); - }); - - it('returns groups ordered by most recent activity', () => { - storeChatMetadata('old@g.us', '2024-01-01T00:00:01.000Z', 'Old', 'whatsapp', true); - storeChatMetadata('new@g.us', '2024-01-01T00:00:05.000Z', 'New', 
'whatsapp', true); - storeChatMetadata('mid@g.us', '2024-01-01T00:00:03.000Z', 'Mid', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups[0].jid).toBe('new@g.us'); - expect(groups[1].jid).toBe('mid@g.us'); - expect(groups[2].jid).toBe('old@g.us'); - }); - - it('excludes non-group chats regardless of JID format', () => { - // Unknown JID format stored without is_group should not appear - storeChatMetadata('unknown-format-123', '2024-01-01T00:00:01.000Z', 'Unknown'); - // Explicitly non-group with unusual JID - storeChatMetadata('custom:abc', '2024-01-01T00:00:02.000Z', 'Custom DM', 'custom', false); - // A real group for contrast - storeChatMetadata('group@g.us', '2024-01-01T00:00:03.000Z', 'Group', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('group@g.us'); - }); - - it('returns empty array when no chats exist', () => { - const groups = getAvailableGroups(); - expect(groups).toHaveLength(0); - }); - - it('mixes WhatsApp and Discord chats ordered by activity', () => { - storeChatMetadata('wa@g.us', '2024-01-01T00:00:01.000Z', 'WhatsApp', 'whatsapp', true); - storeChatMetadata('dc:555', '2024-01-01T00:00:03.000Z', 'Discord', 'discord', true); - storeChatMetadata('wa2@g.us', '2024-01-01T00:00:02.000Z', 'WhatsApp 2', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(3); - expect(groups[0].jid).toBe('dc:555'); - expect(groups[1].jid).toBe('wa2@g.us'); - expect(groups[2].jid).toBe('wa@g.us'); - }); -}); diff --git a/.claude/skills/add-discord/tests/discord.test.ts b/.claude/skills/add-discord/tests/discord.test.ts deleted file mode 100644 index a644aa7..0000000 --- a/.claude/skills/add-discord/tests/discord.test.ts +++ /dev/null @@ -1,133 +0,0 @@ -import { describe, expect, it } from 'vitest'; -import fs from 'fs'; -import path from 'path'; - -describe('discord skill package', () => { - const skillDir = path.resolve(__dirname, 
'..'); - - it('has a valid manifest', () => { - const manifestPath = path.join(skillDir, 'manifest.yaml'); - expect(fs.existsSync(manifestPath)).toBe(true); - - const content = fs.readFileSync(manifestPath, 'utf-8'); - expect(content).toContain('skill: discord'); - expect(content).toContain('version: 1.0.0'); - expect(content).toContain('discord.js'); - }); - - it('has all files declared in adds', () => { - const addFile = path.join(skillDir, 'add', 'src', 'channels', 'discord.ts'); - expect(fs.existsSync(addFile)).toBe(true); - - const content = fs.readFileSync(addFile, 'utf-8'); - expect(content).toContain('class DiscordChannel'); - expect(content).toContain('implements Channel'); - - // Test file for the channel - const testFile = path.join(skillDir, 'add', 'src', 'channels', 'discord.test.ts'); - expect(fs.existsSync(testFile)).toBe(true); - - const testContent = fs.readFileSync(testFile, 'utf-8'); - expect(testContent).toContain("describe('DiscordChannel'"); - }); - - it('has all files declared in modifies', () => { - const indexFile = path.join(skillDir, 'modify', 'src', 'index.ts'); - const configFile = path.join(skillDir, 'modify', 'src', 'config.ts'); - const routingTestFile = path.join(skillDir, 'modify', 'src', 'routing.test.ts'); - - expect(fs.existsSync(indexFile)).toBe(true); - expect(fs.existsSync(configFile)).toBe(true); - expect(fs.existsSync(routingTestFile)).toBe(true); - - const indexContent = fs.readFileSync(indexFile, 'utf-8'); - expect(indexContent).toContain('DiscordChannel'); - expect(indexContent).toContain('DISCORD_BOT_TOKEN'); - expect(indexContent).toContain('DISCORD_ONLY'); - expect(indexContent).toContain('findChannel'); - expect(indexContent).toContain('channels: Channel[]'); - - const configContent = fs.readFileSync(configFile, 'utf-8'); - expect(configContent).toContain('DISCORD_BOT_TOKEN'); - expect(configContent).toContain('DISCORD_ONLY'); - }); - - it('has intent files for modified files', () => { - 
expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'index.ts.intent.md'))).toBe(true); - expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'config.ts.intent.md'))).toBe(true); - }); - - it('modified index.ts preserves core structure', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'index.ts'), - 'utf-8', - ); - - // Core functions still present - expect(content).toContain('function loadState()'); - expect(content).toContain('function saveState()'); - expect(content).toContain('function registerGroup('); - expect(content).toContain('function getAvailableGroups()'); - expect(content).toContain('function processGroupMessages('); - expect(content).toContain('function runAgent('); - expect(content).toContain('function startMessageLoop()'); - expect(content).toContain('function recoverPendingMessages()'); - expect(content).toContain('function ensureContainerSystemRunning()'); - expect(content).toContain('async function main()'); - - // Test helper preserved - expect(content).toContain('_setRegisteredGroups'); - - // Direct-run guard preserved - expect(content).toContain('isDirectRun'); - }); - - it('modified index.ts includes Discord channel creation', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'index.ts'), - 'utf-8', - ); - - // Multi-channel architecture - expect(content).toContain('const channels: Channel[] = []'); - expect(content).toContain('channels.push(whatsapp)'); - expect(content).toContain('channels.push(discord)'); - - // Conditional channel creation - expect(content).toContain('if (!DISCORD_ONLY)'); - expect(content).toContain('if (DISCORD_BOT_TOKEN)'); - - // Shutdown disconnects all channels - expect(content).toContain('for (const ch of channels) await ch.disconnect()'); - }); - - it('modified config.ts preserves all existing exports', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'config.ts'), - 'utf-8', - ); - - // All original 
exports preserved - expect(content).toContain('export const ASSISTANT_NAME'); - expect(content).toContain('export const POLL_INTERVAL'); - expect(content).toContain('export const TRIGGER_PATTERN'); - expect(content).toContain('export const CONTAINER_IMAGE'); - expect(content).toContain('export const DATA_DIR'); - expect(content).toContain('export const TIMEZONE'); - - // Discord exports added - expect(content).toContain('export const DISCORD_BOT_TOKEN'); - expect(content).toContain('export const DISCORD_ONLY'); - }); - - it('modified routing.test.ts includes Discord JID tests', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'routing.test.ts'), - 'utf-8', - ); - - expect(content).toContain("Discord JID: starts with dc:"); - expect(content).toContain("dc:1234567890123456"); - expect(content).toContain("dc:"); - }); -}); diff --git a/.claude/skills/add-emacs/SKILL.md b/.claude/skills/add-emacs/SKILL.md new file mode 100644 index 0000000..09bdbdd --- /dev/null +++ b/.claude/skills/add-emacs/SKILL.md @@ -0,0 +1,289 @@ +--- +name: add-emacs +description: Add Emacs as a channel. Opens an interactive chat buffer and org-mode integration so you can talk to NanoClaw from within Emacs (Doom, Spacemacs, or vanilla). Uses a local HTTP bridge — no bot token or external service needed. +--- + +# Add Emacs Channel + +This skill adds Emacs support to NanoClaw, then walks through interactive setup. +Works with Doom Emacs, Spacemacs, and vanilla Emacs 27.1+. 
+ +## What you can do with this + +- **Ask while coding** — open the chat buffer (`C-c n c` / `SPC N c`), ask about a function or error without leaving Emacs +- **Code review** — select a region and send it with `nanoclaw-org-send`; the response appears as a child heading inline in your org file +- **Meeting notes** — send an org agenda entry; get a summary or action item list back as a child node +- **Draft writing** — send org prose; receive revisions or continuations in place +- **Research capture** — ask a question directly in your org notes; the answer lands exactly where you need it +- **Schedule tasks** — ask Andy to set a reminder or create a scheduled NanoClaw task (e.g. "remind me tomorrow to review the PR") + +## Phase 1: Pre-flight + +### Check if already applied + +Check if `src/channels/emacs.ts` exists: + +```bash +test -f src/channels/emacs.ts && echo "already applied" || echo "not applied" +``` + +If it exists, skip to Phase 3 (Setup). The code changes are already in place. + +## Phase 2: Apply Code Changes + +### Ensure the upstream remote + +```bash +git remote -v +``` + +If an `upstream` remote pointing to `https://github.com/qwibitai/nanoclaw.git` is missing, +add it: + +```bash +git remote add upstream https://github.com/qwibitai/nanoclaw.git +``` + +### Merge the skill branch + +```bash +git fetch upstream skill/emacs +git merge upstream/skill/emacs +``` + +If there are merge conflicts on `package-lock.json`, resolve them by accepting the incoming +version and continuing: + +```bash +git checkout --theirs package-lock.json +git add package-lock.json +git merge --continue +``` + +For any other conflict, read the conflicted file and reconcile both sides manually. 
+ +This adds: +- `src/channels/emacs.ts` — `EmacsBridgeChannel` HTTP server (port 8766) +- `src/channels/emacs.test.ts` — unit tests +- `emacs/nanoclaw.el` — Emacs Lisp package (`nanoclaw-chat`, `nanoclaw-org-send`) +- `import './emacs.js'` appended to `src/channels/index.ts` + +If the merge reports conflicts, resolve them by reading the conflicted files and understanding the intent of both sides. + +### Validate code changes + +```bash +npm run build +npx vitest run src/channels/emacs.test.ts +``` + +Build must be clean and tests must pass before proceeding. + +## Phase 3: Setup + +### Configure environment (optional) + +The channel works out of the box with defaults. Add to `.env` only if you need non-defaults: + +```bash +EMACS_CHANNEL_PORT=8766 # default — change if 8766 is already in use +EMACS_AUTH_TOKEN= # optional — locks the endpoint to Emacs only +``` + +If you change or add values, sync to the container environment: + +```bash +mkdir -p data/env && cp .env data/env/env +``` + +### Configure Emacs + +The `nanoclaw.el` package requires only Emacs 27.1+ built-in libraries (`url`, `json`, `org`) — no package manager setup needed. + +AskUserQuestion: Which Emacs distribution are you using? +- **Doom Emacs** - config.el with map! keybindings +- **Spacemacs** - dotspacemacs/user-config in ~/.spacemacs +- **Vanilla Emacs / other** - init.el with global-set-key + +**Doom Emacs** — add to `~/.config/doom/config.el` (or `~/.doom.d/config.el`): + +```elisp +;; NanoClaw — personal AI assistant channel +(load (expand-file-name "~/src/nanoclaw/emacs/nanoclaw.el")) + +(map! :leader + :prefix ("N" . 
"NanoClaw") + :desc "Chat buffer" "c" #'nanoclaw-chat + :desc "Send org" "o" #'nanoclaw-org-send) +``` + +Then reload: `M-x doom/reload` + +**Spacemacs** — add to `dotspacemacs/user-config` in `~/.spacemacs`: + +```elisp +;; NanoClaw — personal AI assistant channel +(load-file "~/src/nanoclaw/emacs/nanoclaw.el") + +(spacemacs/set-leader-keys "aNc" #'nanoclaw-chat) +(spacemacs/set-leader-keys "aNo" #'nanoclaw-org-send) +``` + +Then reload: `M-x dotspacemacs/sync-configuration-layers` or restart Emacs. + +**Vanilla Emacs** — add to `~/.emacs.d/init.el` (or `~/.emacs`): + +```elisp +;; NanoClaw — personal AI assistant channel +(load-file "~/src/nanoclaw/emacs/nanoclaw.el") + +(global-set-key (kbd "C-c n c") #'nanoclaw-chat) +(global-set-key (kbd "C-c n o") #'nanoclaw-org-send) +``` + +Then reload: `M-x eval-buffer` or restart Emacs. + +If `EMACS_AUTH_TOKEN` was set, also add (any distribution): + +```elisp +(setq nanoclaw-auth-token "") +``` + +If `EMACS_CHANNEL_PORT` was changed from the default, also add: + +```elisp +(setq nanoclaw-port ) +``` + +### Restart NanoClaw + +```bash +npm run build +launchctl kickstart -k gui/$(id -u)/com.nanoclaw # macOS +# Linux: systemctl --user restart nanoclaw +``` + +## Phase 4: Verify + +### Test the HTTP endpoint + +```bash +curl -s "http://localhost:8766/api/messages?since=0" +``` + +Expected: `{"messages":[]}` + +If you set `EMACS_AUTH_TOKEN`: + +```bash +curl -s -H "Authorization: Bearer " "http://localhost:8766/api/messages?since=0" +``` + +### Test from Emacs + +Tell the user: + +> 1. Open the chat buffer with your keybinding (`SPC N c`, `SPC a N c`, or `C-c n c`) +> 2. Type a message and press `RET` +> 3. 
A response from Andy should appear within a few seconds +> +> For org-mode: open any `.org` file, position the cursor on a heading, and use `SPC N o` / `SPC a N o` / `C-c n o` + +### Check logs if needed + +```bash +tail -f logs/nanoclaw.log +``` + +Look for `Emacs channel listening` at startup and `Emacs message received` when a message is sent. + +## Troubleshooting + +### Port already in use + +``` +Error: listen EADDRINUSE: address already in use :::8766 +``` + +Either a stale NanoClaw process is running, or 8766 is taken by another app. + +Find and kill the stale process: + +```bash +lsof -ti :8766 | xargs kill -9 +``` + +Or change the port in `.env` (`EMACS_CHANNEL_PORT=8767`) and update `nanoclaw-port` in Emacs config. + +### No response from agent + +Check: +1. NanoClaw is running: `launchctl list | grep nanoclaw` (macOS) or `systemctl --user status nanoclaw` (Linux) +2. Emacs group is registered: `sqlite3 store/messages.db "SELECT * FROM registered_groups WHERE jid = 'emacs:default'"` +3. Logs show activity: `tail -50 logs/nanoclaw.log` + +If the group is not registered, it will be created automatically on the next NanoClaw restart. + +### Auth token mismatch (401 Unauthorized) + +Verify the token in Emacs matches `.env`: + +```elisp +;; M-x describe-variable RET nanoclaw-auth-token RET +``` + +Must exactly match `EMACS_AUTH_TOKEN` in `.env`. + +### nanoclaw.el not loading + +Check the path is correct: + +```bash +ls ~/src/nanoclaw/emacs/nanoclaw.el +``` + +If NanoClaw is cloned elsewhere, update the `load`/`load-file` path in your Emacs config. 
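
### Stray or doubled markup in responses

If replies arrive with doubled emphasis markers, an agent is probably emitting org-mode syntax directly instead of markdown (see Agent Formatting below). The bridge's conversion behaves roughly like this sketch — `mdToOrg` is a hypothetical name for illustration, not the bridge's actual code:

```javascript
// Sketch of the markdown → org-mode rules the bridge applies (illustrative
// only — this is not the real implementation in src/channels/emacs.ts).
function mdToOrg(md) {
  const BOLD = '\u0000'; // placeholder so converted bold is not re-read as italic
  return md
    // Fenced code blocks. A real implementation would also shield their
    // bodies from the inline rules below — this sketch does not.
    .replace(/```(\w*)\n([\s\S]*?)```/g, (_m, lang, body) => `#+begin_src ${lang}\n${body}#+end_src`)
    .replace(/\*\*([^*]+)\*\*/g, `${BOLD}$1${BOLD}`) // **bold**  → *bold* (staged)
    .replace(/\*([^*]+)\*/g, '/$1/')                 // *italic*  → /italic/
    .replace(/~~([^~]+)~~/g, '+$1+')                 // ~~text~~  → +text+
    .replace(/`([^`]+)`/g, '~$1~')                   // `code`    → ~code~
    .split(BOLD).join('*');                          // un-stage bold markers
}
```

Because `**bold**` becomes `*bold*`, text that is already org-formatted gets converted a second time — which is why agents must output plain markdown.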
+ +## After Setup + +If running `npm run dev` while the service is active: + +```bash +# macOS: +launchctl unload ~/Library/LaunchAgents/com.nanoclaw.plist +npm run dev +# When done testing: +launchctl load ~/Library/LaunchAgents/com.nanoclaw.plist + +# Linux: +# systemctl --user stop nanoclaw +# npm run dev +# systemctl --user start nanoclaw +``` + +## Agent Formatting + +The Emacs bridge converts markdown → org-mode automatically. Agents should +output standard markdown — **not** org-mode syntax. The conversion handles: + +| Markdown | Org-mode | +|----------|----------| +| `**bold**` | `*bold*` | +| `*italic*` | `/italic/` | +| `~~text~~` | `+text+` | +| `` `code` `` | `~code~` | +| ` ```lang ` | `#+begin_src lang` | + +If an agent outputs org-mode directly, bold/italic/etc. will be double-converted +and render incorrectly. + +## Removal + +To remove the Emacs channel: + +1. Delete `src/channels/emacs.ts`, `src/channels/emacs.test.ts`, and `emacs/nanoclaw.el` +2. Remove `import './emacs.js'` from `src/channels/index.ts` +3. Remove the NanoClaw block from your Emacs config file +4. Remove Emacs registration from SQLite: `sqlite3 store/messages.db "DELETE FROM registered_groups WHERE jid = 'emacs:default'"` +5. Remove `EMACS_CHANNEL_PORT` and `EMACS_AUTH_TOKEN` from `.env` if set +6. Rebuild: `npm run build && launchctl kickstart -k gui/$(id -u)/com.nanoclaw` (macOS) or `npm run build && systemctl --user restart nanoclaw` (Linux) \ No newline at end of file diff --git a/.claude/skills/add-gmail/SKILL.md b/.claude/skills/add-gmail/SKILL.md index f4267cc..781a0eb 100644 --- a/.claude/skills/add-gmail/SKILL.md +++ b/.claude/skills/add-gmail/SKILL.md @@ -11,7 +11,7 @@ This skill adds Gmail support to NanoClaw — either as a tool (read, send, sear ### Check if already applied -Read `.nanoclaw/state.yaml`. If `gmail` is in `applied_skills`, skip to Phase 3 (Setup). The code changes are already in place. +Check if `src/channels/gmail.ts` exists. 
If it does, skip to Phase 3 (Setup). The code changes are already in place. ### Ask the user @@ -24,66 +24,42 @@ AskUserQuestion: Should incoming emails be able to trigger the agent? ## Phase 2: Apply Code Changes -### Initialize skills system (if needed) - -If `.nanoclaw/` directory doesn't exist yet: +### Ensure channel remote ```bash -npx tsx scripts/apply-skill.ts --init +git remote -v ``` -### Path A: Tool-only (user chose "No") - -Do NOT run the full apply script. Only two source files need changes. This avoids adding dead code (`gmail.ts`, `gmail.test.ts`, index.ts channel logic, routing tests, `googleapis` dependency). - -#### 1. Mount Gmail credentials in container - -Apply the changes described in `modify/src/container-runner.ts.intent.md` to `src/container-runner.ts`: import `os`, add a conditional read-write mount of `~/.gmail-mcp` to `/home/node/.gmail-mcp` in `buildVolumeMounts()` after the session mounts. - -#### 2. Add Gmail MCP server to agent runner - -Apply the changes described in `modify/container/agent-runner/src/index.ts.intent.md` to `container/agent-runner/src/index.ts`: add `gmail` MCP server (`npx -y @gongrzhe/server-gmail-autoauth-mcp`) and `'mcp__gmail__*'` to `allowedTools`. - -#### 3. Record in state - -Add `gmail` to `.nanoclaw/state.yaml` under `applied_skills` with `mode: tool-only`. - -#### 4. Validate +If `gmail` is missing, add it: ```bash -npm run build +git remote add gmail https://github.com/qwibitai/nanoclaw-gmail.git ``` -Build must be clean before proceeding. Skip to Phase 3. 
- -### Path B: Channel mode (user chose "Yes") - -Run the full skills engine to apply all code changes: +### Merge the skill branch ```bash -npx tsx scripts/apply-skill.ts .claude/skills/add-gmail +git fetch gmail main +git merge gmail/main || { + git checkout --theirs package-lock.json + git add package-lock.json + git merge --continue +} ``` -This deterministically: +This merges in: +- `src/channels/gmail.ts` (GmailChannel class with self-registration via `registerChannel`) +- `src/channels/gmail.test.ts` (unit tests) +- `import './gmail.js'` appended to the channel barrel file `src/channels/index.ts` +- Gmail credentials mount (`~/.gmail-mcp`) in `src/container-runner.ts` +- Gmail MCP server (`@gongrzhe/server-gmail-autoauth-mcp`) and `mcp__gmail__*` allowed tool in `container/agent-runner/src/index.ts` +- `googleapis` npm dependency in `package.json` -- Adds `src/channels/gmail.ts` (GmailChannel class implementing Channel interface) -- Adds `src/channels/gmail.test.ts` (unit tests) -- Three-way merges Gmail channel wiring into `src/index.ts` (GmailChannel creation) -- Three-way merges Gmail credentials mount into `src/container-runner.ts` (~/.gmail-mcp -> /home/node/.gmail-mcp) -- Three-way merges Gmail MCP server into `container/agent-runner/src/index.ts` (@gongrzhe/server-gmail-autoauth-mcp) -- Three-way merges Gmail JID tests into `src/routing.test.ts` -- Installs the `googleapis` npm dependency -- Records the application in `.nanoclaw/state.yaml` +If the merge reports conflicts, resolve them by reading the conflicted files and understanding the intent of both sides. 
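
The self-registration wiring described in the merge list above can be sketched as follows. The `Channel` and `registerChannel` shapes here are assumptions for illustration (the project's real interface lives in its own source); the `gmail:` JID-prefix behavior matches the channel's unit tests:

```typescript
// Hypothetical sketch of channel self-registration via a barrel import.
interface Channel {
  name: string;
  ownsJid(jid: string): boolean;
}

const channels: Channel[] = [];

function registerChannel(ch: Channel): void {
  channels.push(ch);
}

// The tail of src/channels/gmail.ts amounts to this: the module registers
// itself as a side effect, so the barrel file src/channels/index.ts only
// needs `import './gmail.js'` for the channel to become active.
registerChannel({
  name: 'gmail',
  ownsJid: (jid) => jid.startsWith('gmail:'),
});

// Routing then finds the owning channel by JID prefix:
const owner = channels.find((ch) => ch.ownsJid('gmail:thread-123'));
```

This is why removal only requires deleting the channel file and its one-line barrel import: no central switch statement has to change.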
-If the apply reports merge conflicts, read the intent files: +### Add email handling instructions (Channel mode only) -- `modify/src/index.ts.intent.md` — what changed and invariants for index.ts -- `modify/src/container-runner.ts.intent.md` — what changed for container-runner.ts -- `modify/container/agent-runner/src/index.ts.intent.md` — what changed for agent-runner - -#### Add email handling instructions - -Append the following to `groups/main/CLAUDE.md` (before the formatting section): +If the user chose channel mode, append the following to `groups/main/CLAUDE.md` (before the formatting section): ```markdown ## Email Notifications @@ -91,14 +67,15 @@ Append the following to `groups/main/CLAUDE.md` (before the formatting section): When you receive an email notification (messages starting with `[Email from ...`), inform the user about it but do NOT reply to the email unless specifically asked. You have Gmail tools available — use them only when the user explicitly asks you to reply, forward, or take action on an email. ``` -#### Validate +### Validate code changes ```bash -npm test +npm install npm run build +npx vitest run src/channels/gmail.test.ts ``` -All tests must pass (including the new gmail tests) and build must be clean before proceeding. +All tests must pass (including the new Gmail tests) and build must be clean before proceeding. ## Phase 3: Setup @@ -227,18 +204,17 @@ npx -y @gongrzhe/server-gmail-autoauth-mcp 1. Remove `~/.gmail-mcp` mount from `src/container-runner.ts` 2. Remove `gmail` MCP server and `mcp__gmail__*` from `container/agent-runner/src/index.ts` -3. Remove `gmail` from `.nanoclaw/state.yaml` +3. Rebuild and restart 4. Clear stale agent-runner copies: `rm -r data/sessions/*/agent-runner-src 2>/dev/null || true` 5. Rebuild: `cd container && ./build.sh && cd .. && npm run build && launchctl kickstart -k gui/$(id -u)/com.nanoclaw` (macOS) or `systemctl --user restart nanoclaw` (Linux) ### Channel mode 1. 
Delete `src/channels/gmail.ts` and `src/channels/gmail.test.ts` -2. Remove `GmailChannel` import and creation from `src/index.ts` +2. Remove `import './gmail.js'` from `src/channels/index.ts` 3. Remove `~/.gmail-mcp` mount from `src/container-runner.ts` 4. Remove `gmail` MCP server and `mcp__gmail__*` from `container/agent-runner/src/index.ts` -5. Remove Gmail JID tests from `src/routing.test.ts` -6. Uninstall: `npm uninstall googleapis` -7. Remove `gmail` from `.nanoclaw/state.yaml` -8. Clear stale agent-runner copies: `rm -r data/sessions/*/agent-runner-src 2>/dev/null || true` -9. Rebuild: `cd container && ./build.sh && cd .. && npm run build && launchctl kickstart -k gui/$(id -u)/com.nanoclaw` (macOS) or `systemctl --user restart nanoclaw` (Linux) +5. Uninstall: `npm uninstall googleapis` +6. Rebuild and restart +7. Clear stale agent-runner copies: `rm -r data/sessions/*/agent-runner-src 2>/dev/null || true` +8. Rebuild: `cd container && ./build.sh && cd .. && npm run build && launchctl kickstart -k gui/$(id -u)/com.nanoclaw` (macOS) or `systemctl --user restart nanoclaw` (Linux) diff --git a/.claude/skills/add-gmail/add/src/channels/gmail.test.ts b/.claude/skills/add-gmail/add/src/channels/gmail.test.ts deleted file mode 100644 index 52602dd..0000000 --- a/.claude/skills/add-gmail/add/src/channels/gmail.test.ts +++ /dev/null @@ -1,71 +0,0 @@ -import { describe, it, expect, vi, beforeEach } from 'vitest'; - -import { GmailChannel, GmailChannelOpts } from './gmail.js'; - -function makeOpts(overrides?: Partial): GmailChannelOpts { - return { - onMessage: vi.fn(), - onChatMetadata: vi.fn(), - registeredGroups: () => ({}), - ...overrides, - }; -} - -describe('GmailChannel', () => { - let channel: GmailChannel; - - beforeEach(() => { - channel = new GmailChannel(makeOpts()); - }); - - describe('ownsJid', () => { - it('returns true for gmail: prefixed JIDs', () => { - expect(channel.ownsJid('gmail:abc123')).toBe(true); - 
expect(channel.ownsJid('gmail:thread-id-456')).toBe(true); - }); - - it('returns false for non-gmail JIDs', () => { - expect(channel.ownsJid('12345@g.us')).toBe(false); - expect(channel.ownsJid('tg:123')).toBe(false); - expect(channel.ownsJid('dc:456')).toBe(false); - expect(channel.ownsJid('user@s.whatsapp.net')).toBe(false); - }); - }); - - describe('name', () => { - it('is gmail', () => { - expect(channel.name).toBe('gmail'); - }); - }); - - describe('isConnected', () => { - it('returns false before connect', () => { - expect(channel.isConnected()).toBe(false); - }); - }); - - describe('disconnect', () => { - it('sets connected to false', async () => { - await channel.disconnect(); - expect(channel.isConnected()).toBe(false); - }); - }); - - describe('constructor options', () => { - it('accepts custom poll interval', () => { - const ch = new GmailChannel(makeOpts(), 30000); - expect(ch.name).toBe('gmail'); - }); - - it('defaults to unread query when no filter configured', () => { - const ch = new GmailChannel(makeOpts()); - const query = (ch as unknown as { buildQuery: () => string }).buildQuery(); - expect(query).toBe('is:unread category:primary'); - }); - - it('defaults with no options provided', () => { - const ch = new GmailChannel(makeOpts()); - expect(ch.name).toBe('gmail'); - }); - }); -}); diff --git a/.claude/skills/add-gmail/add/src/channels/gmail.ts b/.claude/skills/add-gmail/add/src/channels/gmail.ts deleted file mode 100644 index b9ade60..0000000 --- a/.claude/skills/add-gmail/add/src/channels/gmail.ts +++ /dev/null @@ -1,339 +0,0 @@ -import fs from 'fs'; -import os from 'os'; -import path from 'path'; - -import { google, gmail_v1 } from 'googleapis'; -import { OAuth2Client } from 'google-auth-library'; - -import { MAIN_GROUP_FOLDER } from '../config.js'; -import { logger } from '../logger.js'; -import { - Channel, - OnChatMetadata, - OnInboundMessage, - RegisteredGroup, -} from '../types.js'; - -export interface GmailChannelOpts { - onMessage: 
OnInboundMessage; - onChatMetadata: OnChatMetadata; - registeredGroups: () => Record; -} - -interface ThreadMeta { - sender: string; - senderName: string; - subject: string; - messageId: string; // RFC 2822 Message-ID for In-Reply-To -} - -export class GmailChannel implements Channel { - name = 'gmail'; - - private oauth2Client: OAuth2Client | null = null; - private gmail: gmail_v1.Gmail | null = null; - private opts: GmailChannelOpts; - private pollIntervalMs: number; - private pollTimer: ReturnType | null = null; - private processedIds = new Set(); - private threadMeta = new Map(); - private consecutiveErrors = 0; - private userEmail = ''; - - constructor(opts: GmailChannelOpts, pollIntervalMs = 60000) { - this.opts = opts; - this.pollIntervalMs = pollIntervalMs; - } - - async connect(): Promise { - const credDir = path.join(os.homedir(), '.gmail-mcp'); - const keysPath = path.join(credDir, 'gcp-oauth.keys.json'); - const tokensPath = path.join(credDir, 'credentials.json'); - - if (!fs.existsSync(keysPath) || !fs.existsSync(tokensPath)) { - logger.warn( - 'Gmail credentials not found in ~/.gmail-mcp/. Skipping Gmail channel. 
Run /add-gmail to set up.', - ); - return; - } - - const keys = JSON.parse(fs.readFileSync(keysPath, 'utf-8')); - const tokens = JSON.parse(fs.readFileSync(tokensPath, 'utf-8')); - - const clientConfig = keys.installed || keys.web || keys; - const { client_id, client_secret, redirect_uris } = clientConfig; - this.oauth2Client = new google.auth.OAuth2( - client_id, - client_secret, - redirect_uris?.[0], - ); - this.oauth2Client.setCredentials(tokens); - - // Persist refreshed tokens - this.oauth2Client.on('tokens', (newTokens) => { - try { - const current = JSON.parse(fs.readFileSync(tokensPath, 'utf-8')); - Object.assign(current, newTokens); - fs.writeFileSync(tokensPath, JSON.stringify(current, null, 2)); - logger.debug('Gmail OAuth tokens refreshed'); - } catch (err) { - logger.warn({ err }, 'Failed to persist refreshed Gmail tokens'); - } - }); - - this.gmail = google.gmail({ version: 'v1', auth: this.oauth2Client }); - - // Verify connection - const profile = await this.gmail.users.getProfile({ userId: 'me' }); - this.userEmail = profile.data.emailAddress || ''; - logger.info({ email: this.userEmail }, 'Gmail channel connected'); - - // Start polling with error backoff - const schedulePoll = () => { - const backoffMs = this.consecutiveErrors > 0 - ? 
Math.min(this.pollIntervalMs * Math.pow(2, this.consecutiveErrors), 30 * 60 * 1000) - : this.pollIntervalMs; - this.pollTimer = setTimeout(() => { - this.pollForMessages() - .catch((err) => logger.error({ err }, 'Gmail poll error')) - .finally(() => { - if (this.gmail) schedulePoll(); - }); - }, backoffMs); - }; - - // Initial poll - await this.pollForMessages(); - schedulePoll(); - } - - async sendMessage(jid: string, text: string): Promise { - if (!this.gmail) { - logger.warn('Gmail not initialized'); - return; - } - - const threadId = jid.replace(/^gmail:/, ''); - const meta = this.threadMeta.get(threadId); - - if (!meta) { - logger.warn({ jid }, 'No thread metadata for reply, cannot send'); - return; - } - - const subject = meta.subject.startsWith('Re:') - ? meta.subject - : `Re: ${meta.subject}`; - - const headers = [ - `To: ${meta.sender}`, - `From: ${this.userEmail}`, - `Subject: ${subject}`, - `In-Reply-To: ${meta.messageId}`, - `References: ${meta.messageId}`, - 'Content-Type: text/plain; charset=utf-8', - '', - text, - ].join('\r\n'); - - const encodedMessage = Buffer.from(headers) - .toString('base64') - .replace(/\+/g, '-') - .replace(/\//g, '_') - .replace(/=+$/, ''); - - try { - await this.gmail.users.messages.send({ - userId: 'me', - requestBody: { - raw: encodedMessage, - threadId, - }, - }); - logger.info({ to: meta.sender, threadId }, 'Gmail reply sent'); - } catch (err) { - logger.error({ jid, err }, 'Failed to send Gmail reply'); - } - } - - isConnected(): boolean { - return this.gmail !== null; - } - - ownsJid(jid: string): boolean { - return jid.startsWith('gmail:'); - } - - async disconnect(): Promise { - if (this.pollTimer) { - clearTimeout(this.pollTimer); - this.pollTimer = null; - } - this.gmail = null; - this.oauth2Client = null; - logger.info('Gmail channel stopped'); - } - - // --- Private --- - - private buildQuery(): string { - return 'is:unread category:primary'; - } - - private async pollForMessages(): Promise { - if (!this.gmail) 
return; - - try { - const query = this.buildQuery(); - const res = await this.gmail.users.messages.list({ - userId: 'me', - q: query, - maxResults: 10, - }); - - const messages = res.data.messages || []; - - for (const stub of messages) { - if (!stub.id || this.processedIds.has(stub.id)) continue; - this.processedIds.add(stub.id); - - await this.processMessage(stub.id); - } - - // Cap processed ID set to prevent unbounded growth - if (this.processedIds.size > 5000) { - const ids = [...this.processedIds]; - this.processedIds = new Set(ids.slice(ids.length - 2500)); - } - - this.consecutiveErrors = 0; - } catch (err) { - this.consecutiveErrors++; - const backoffMs = Math.min(this.pollIntervalMs * Math.pow(2, this.consecutiveErrors), 30 * 60 * 1000); - logger.error({ err, consecutiveErrors: this.consecutiveErrors, nextPollMs: backoffMs }, 'Gmail poll failed'); - } - } - - private async processMessage(messageId: string): Promise { - if (!this.gmail) return; - - const msg = await this.gmail.users.messages.get({ - userId: 'me', - id: messageId, - format: 'full', - }); - - const headers = msg.data.payload?.headers || []; - const getHeader = (name: string) => - headers.find((h) => h.name?.toLowerCase() === name.toLowerCase()) - ?.value || ''; - - const from = getHeader('From'); - const subject = getHeader('Subject'); - const rfc2822MessageId = getHeader('Message-ID'); - const threadId = msg.data.threadId || messageId; - const timestamp = new Date( - parseInt(msg.data.internalDate || '0', 10), - ).toISOString(); - - // Extract sender name and email - const senderMatch = from.match(/^(.+?)\s*<(.+?)>$/); - const senderName = senderMatch ? senderMatch[1].replace(/"/g, '') : from; - const senderEmail = senderMatch ? 
senderMatch[2] : from; - - // Skip emails from self (our own replies) - if (senderEmail === this.userEmail) return; - - // Extract body text - const body = this.extractTextBody(msg.data.payload); - - if (!body) { - logger.debug({ messageId, subject }, 'Skipping email with no text body'); - return; - } - - const chatJid = `gmail:${threadId}`; - - // Cache thread metadata for replies - this.threadMeta.set(threadId, { - sender: senderEmail, - senderName, - subject, - messageId: rfc2822MessageId, - }); - - // Store chat metadata for group discovery - this.opts.onChatMetadata(chatJid, timestamp, subject, 'gmail', false); - - // Find the main group to deliver the email notification - const groups = this.opts.registeredGroups(); - const mainEntry = Object.entries(groups).find( - ([, g]) => g.folder === MAIN_GROUP_FOLDER, - ); - - if (!mainEntry) { - logger.debug( - { chatJid, subject }, - 'No main group registered, skipping email', - ); - return; - } - - const mainJid = mainEntry[0]; - const content = `[Email from ${senderName} <${senderEmail}>]\nSubject: ${subject}\n\n${body}`; - - this.opts.onMessage(mainJid, { - id: messageId, - chat_jid: mainJid, - sender: senderEmail, - sender_name: senderName, - content, - timestamp, - is_from_me: false, - }); - - // Mark as read - try { - await this.gmail.users.messages.modify({ - userId: 'me', - id: messageId, - requestBody: { removeLabelIds: ['UNREAD'] }, - }); - } catch (err) { - logger.warn({ messageId, err }, 'Failed to mark email as read'); - } - - logger.info( - { mainJid, from: senderName, subject }, - 'Gmail email delivered to main group', - ); - } - - private extractTextBody( - payload: gmail_v1.Schema$MessagePart | undefined, - ): string { - if (!payload) return ''; - - // Direct text/plain body - if (payload.mimeType === 'text/plain' && payload.body?.data) { - return Buffer.from(payload.body.data, 'base64').toString('utf-8'); - } - - // Multipart: search parts recursively - if (payload.parts) { - // Prefer text/plain - 
for (const part of payload.parts) { - if (part.mimeType === 'text/plain' && part.body?.data) { - return Buffer.from(part.body.data, 'base64').toString('utf-8'); - } - } - // Recurse into nested multipart - for (const part of payload.parts) { - const text = this.extractTextBody(part); - if (text) return text; - } - } - - return ''; - } -} diff --git a/.claude/skills/add-gmail/manifest.yaml b/.claude/skills/add-gmail/manifest.yaml deleted file mode 100644 index ea7c66a..0000000 --- a/.claude/skills/add-gmail/manifest.yaml +++ /dev/null @@ -1,18 +0,0 @@ -skill: gmail -version: 1.0.0 -description: "Gmail integration via Google APIs" -core_version: 0.1.0 -adds: - - src/channels/gmail.ts - - src/channels/gmail.test.ts -modifies: - - src/index.ts - - src/container-runner.ts - - container/agent-runner/src/index.ts - - src/routing.test.ts -structured: - npm_dependencies: - googleapis: "^144.0.0" -conflicts: [] -depends: [] -test: "npx vitest run src/channels/gmail.test.ts" diff --git a/.claude/skills/add-gmail/modify/container/agent-runner/src/index.ts b/.claude/skills/add-gmail/modify/container/agent-runner/src/index.ts deleted file mode 100644 index 4d98033..0000000 --- a/.claude/skills/add-gmail/modify/container/agent-runner/src/index.ts +++ /dev/null @@ -1,593 +0,0 @@ -/** - * NanoClaw Agent Runner - * Runs inside a container, receives config via stdin, outputs result to stdout - * - * Input protocol: - * Stdin: Full ContainerInput JSON (read until EOF, like before) - * IPC: Follow-up messages written as JSON files to /workspace/ipc/input/ - * Files: {type:"message", text:"..."}.json — polled and consumed - * Sentinel: /workspace/ipc/input/_close — signals session end - * - * Stdout protocol: - * Each result is wrapped in OUTPUT_START_MARKER / OUTPUT_END_MARKER pairs. - * Multiple results may be emitted (one per agent teams result). - * Final marker after loop ends signals completion. 
- */ - -import fs from 'fs'; -import path from 'path'; -import { query, HookCallback, PreCompactHookInput, PreToolUseHookInput } from '@anthropic-ai/claude-agent-sdk'; -import { fileURLToPath } from 'url'; - -interface ContainerInput { - prompt: string; - sessionId?: string; - groupFolder: string; - chatJid: string; - isMain: boolean; - isScheduledTask?: boolean; - assistantName?: string; - secrets?: Record; -} - -interface ContainerOutput { - status: 'success' | 'error'; - result: string | null; - newSessionId?: string; - error?: string; -} - -interface SessionEntry { - sessionId: string; - fullPath: string; - summary: string; - firstPrompt: string; -} - -interface SessionsIndex { - entries: SessionEntry[]; -} - -interface SDKUserMessage { - type: 'user'; - message: { role: 'user'; content: string }; - parent_tool_use_id: null; - session_id: string; -} - -const IPC_INPUT_DIR = '/workspace/ipc/input'; -const IPC_INPUT_CLOSE_SENTINEL = path.join(IPC_INPUT_DIR, '_close'); -const IPC_POLL_MS = 500; - -/** - * Push-based async iterable for streaming user messages to the SDK. - * Keeps the iterable alive until end() is called, preventing isSingleUserTurn. 
- */ -class MessageStream { - private queue: SDKUserMessage[] = []; - private waiting: (() => void) | null = null; - private done = false; - - push(text: string): void { - this.queue.push({ - type: 'user', - message: { role: 'user', content: text }, - parent_tool_use_id: null, - session_id: '', - }); - this.waiting?.(); - } - - end(): void { - this.done = true; - this.waiting?.(); - } - - async *[Symbol.asyncIterator](): AsyncGenerator { - while (true) { - while (this.queue.length > 0) { - yield this.queue.shift()!; - } - if (this.done) return; - await new Promise(r => { this.waiting = r; }); - this.waiting = null; - } - } -} - -async function readStdin(): Promise { - return new Promise((resolve, reject) => { - let data = ''; - process.stdin.setEncoding('utf8'); - process.stdin.on('data', chunk => { data += chunk; }); - process.stdin.on('end', () => resolve(data)); - process.stdin.on('error', reject); - }); -} - -const OUTPUT_START_MARKER = '---NANOCLAW_OUTPUT_START---'; -const OUTPUT_END_MARKER = '---NANOCLAW_OUTPUT_END---'; - -function writeOutput(output: ContainerOutput): void { - console.log(OUTPUT_START_MARKER); - console.log(JSON.stringify(output)); - console.log(OUTPUT_END_MARKER); -} - -function log(message: string): void { - console.error(`[agent-runner] ${message}`); -} - -function getSessionSummary(sessionId: string, transcriptPath: string): string | null { - const projectDir = path.dirname(transcriptPath); - const indexPath = path.join(projectDir, 'sessions-index.json'); - - if (!fs.existsSync(indexPath)) { - log(`Sessions index not found at ${indexPath}`); - return null; - } - - try { - const index: SessionsIndex = JSON.parse(fs.readFileSync(indexPath, 'utf-8')); - const entry = index.entries.find(e => e.sessionId === sessionId); - if (entry?.summary) { - return entry.summary; - } - } catch (err) { - log(`Failed to read sessions index: ${err instanceof Error ? 
err.message : String(err)}`); - } - - return null; -} - -/** - * Archive the full transcript to conversations/ before compaction. - */ -function createPreCompactHook(assistantName?: string): HookCallback { - return async (input, _toolUseId, _context) => { - const preCompact = input as PreCompactHookInput; - const transcriptPath = preCompact.transcript_path; - const sessionId = preCompact.session_id; - - if (!transcriptPath || !fs.existsSync(transcriptPath)) { - log('No transcript found for archiving'); - return {}; - } - - try { - const content = fs.readFileSync(transcriptPath, 'utf-8'); - const messages = parseTranscript(content); - - if (messages.length === 0) { - log('No messages to archive'); - return {}; - } - - const summary = getSessionSummary(sessionId, transcriptPath); - const name = summary ? sanitizeFilename(summary) : generateFallbackName(); - - const conversationsDir = '/workspace/group/conversations'; - fs.mkdirSync(conversationsDir, { recursive: true }); - - const date = new Date().toISOString().split('T')[0]; - const filename = `${date}-${name}.md`; - const filePath = path.join(conversationsDir, filename); - - const markdown = formatTranscriptMarkdown(messages, summary, assistantName); - fs.writeFileSync(filePath, markdown); - - log(`Archived conversation to ${filePath}`); - } catch (err) { - log(`Failed to archive transcript: ${err instanceof Error ? err.message : String(err)}`); - } - - return {}; - }; -} - -// Secrets to strip from Bash tool subprocess environments. -// These are needed by claude-code for API auth but should never -// be visible to commands Kit runs. 
-const SECRET_ENV_VARS = ['ANTHROPIC_API_KEY', 'CLAUDE_CODE_OAUTH_TOKEN']; - -function createSanitizeBashHook(): HookCallback { - return async (input, _toolUseId, _context) => { - const preInput = input as PreToolUseHookInput; - const command = (preInput.tool_input as { command?: string })?.command; - if (!command) return {}; - - const unsetPrefix = `unset ${SECRET_ENV_VARS.join(' ')} 2>/dev/null; `; - return { - hookSpecificOutput: { - hookEventName: 'PreToolUse', - updatedInput: { - ...(preInput.tool_input as Record), - command: unsetPrefix + command, - }, - }, - }; - }; -} - -function sanitizeFilename(summary: string): string { - return summary - .toLowerCase() - .replace(/[^a-z0-9]+/g, '-') - .replace(/^-+|-+$/g, '') - .slice(0, 50); -} - -function generateFallbackName(): string { - const time = new Date(); - return `conversation-${time.getHours().toString().padStart(2, '0')}${time.getMinutes().toString().padStart(2, '0')}`; -} - -interface ParsedMessage { - role: 'user' | 'assistant'; - content: string; -} - -function parseTranscript(content: string): ParsedMessage[] { - const messages: ParsedMessage[] = []; - - for (const line of content.split('\n')) { - if (!line.trim()) continue; - try { - const entry = JSON.parse(line); - if (entry.type === 'user' && entry.message?.content) { - const text = typeof entry.message.content === 'string' - ? 
entry.message.content - : entry.message.content.map((c: { text?: string }) => c.text || '').join(''); - if (text) messages.push({ role: 'user', content: text }); - } else if (entry.type === 'assistant' && entry.message?.content) { - const textParts = entry.message.content - .filter((c: { type: string }) => c.type === 'text') - .map((c: { text: string }) => c.text); - const text = textParts.join(''); - if (text) messages.push({ role: 'assistant', content: text }); - } - } catch { - } - } - - return messages; -} - -function formatTranscriptMarkdown(messages: ParsedMessage[], title?: string | null, assistantName?: string): string { - const now = new Date(); - const formatDateTime = (d: Date) => d.toLocaleString('en-US', { - month: 'short', - day: 'numeric', - hour: 'numeric', - minute: '2-digit', - hour12: true - }); - - const lines: string[] = []; - lines.push(`# ${title || 'Conversation'}`); - lines.push(''); - lines.push(`Archived: ${formatDateTime(now)}`); - lines.push(''); - lines.push('---'); - lines.push(''); - - for (const msg of messages) { - const sender = msg.role === 'user' ? 'User' : (assistantName || 'Assistant'); - const content = msg.content.length > 2000 - ? msg.content.slice(0, 2000) + '...' - : msg.content; - lines.push(`**${sender}**: ${content}`); - lines.push(''); - } - - return lines.join('\n'); -} - -/** - * Check for _close sentinel. - */ -function shouldClose(): boolean { - if (fs.existsSync(IPC_INPUT_CLOSE_SENTINEL)) { - try { fs.unlinkSync(IPC_INPUT_CLOSE_SENTINEL); } catch { /* ignore */ } - return true; - } - return false; -} - -/** - * Drain all pending IPC input messages. - * Returns messages found, or empty array. 
- */ -function drainIpcInput(): string[] { - try { - fs.mkdirSync(IPC_INPUT_DIR, { recursive: true }); - const files = fs.readdirSync(IPC_INPUT_DIR) - .filter(f => f.endsWith('.json')) - .sort(); - - const messages: string[] = []; - for (const file of files) { - const filePath = path.join(IPC_INPUT_DIR, file); - try { - const data = JSON.parse(fs.readFileSync(filePath, 'utf-8')); - fs.unlinkSync(filePath); - if (data.type === 'message' && data.text) { - messages.push(data.text); - } - } catch (err) { - log(`Failed to process input file ${file}: ${err instanceof Error ? err.message : String(err)}`); - try { fs.unlinkSync(filePath); } catch { /* ignore */ } - } - } - return messages; - } catch (err) { - log(`IPC drain error: ${err instanceof Error ? err.message : String(err)}`); - return []; - } -} - -/** - * Wait for a new IPC message or _close sentinel. - * Returns the messages as a single string, or null if _close. - */ -function waitForIpcMessage(): Promise { - return new Promise((resolve) => { - const poll = () => { - if (shouldClose()) { - resolve(null); - return; - } - const messages = drainIpcInput(); - if (messages.length > 0) { - resolve(messages.join('\n')); - return; - } - setTimeout(poll, IPC_POLL_MS); - }; - poll(); - }); -} - -/** - * Run a single query and stream results via writeOutput. - * Uses MessageStream (AsyncIterable) to keep isSingleUserTurn=false, - * allowing agent teams subagents to run to completion. - * Also pipes IPC messages into the stream during the query. 
- */ -async function runQuery( - prompt: string, - sessionId: string | undefined, - mcpServerPath: string, - containerInput: ContainerInput, - sdkEnv: Record, - resumeAt?: string, -): Promise<{ newSessionId?: string; lastAssistantUuid?: string; closedDuringQuery: boolean }> { - const stream = new MessageStream(); - stream.push(prompt); - - // Poll IPC for follow-up messages and _close sentinel during the query - let ipcPolling = true; - let closedDuringQuery = false; - const pollIpcDuringQuery = () => { - if (!ipcPolling) return; - if (shouldClose()) { - log('Close sentinel detected during query, ending stream'); - closedDuringQuery = true; - stream.end(); - ipcPolling = false; - return; - } - const messages = drainIpcInput(); - for (const text of messages) { - log(`Piping IPC message into active query (${text.length} chars)`); - stream.push(text); - } - setTimeout(pollIpcDuringQuery, IPC_POLL_MS); - }; - setTimeout(pollIpcDuringQuery, IPC_POLL_MS); - - let newSessionId: string | undefined; - let lastAssistantUuid: string | undefined; - let messageCount = 0; - let resultCount = 0; - - // Load global CLAUDE.md as additional system context (shared across all groups) - const globalClaudeMdPath = '/workspace/global/CLAUDE.md'; - let globalClaudeMd: string | undefined; - if (!containerInput.isMain && fs.existsSync(globalClaudeMdPath)) { - globalClaudeMd = fs.readFileSync(globalClaudeMdPath, 'utf-8'); - } - - // Discover additional directories mounted at /workspace/extra/* - // These are passed to the SDK so their CLAUDE.md files are loaded automatically - const extraDirs: string[] = []; - const extraBase = '/workspace/extra'; - if (fs.existsSync(extraBase)) { - for (const entry of fs.readdirSync(extraBase)) { - const fullPath = path.join(extraBase, entry); - if (fs.statSync(fullPath).isDirectory()) { - extraDirs.push(fullPath); - } - } - } - if (extraDirs.length > 0) { - log(`Additional directories: ${extraDirs.join(', ')}`); - } - - for await (const message of query({ 
- prompt: stream, - options: { - cwd: '/workspace/group', - additionalDirectories: extraDirs.length > 0 ? extraDirs : undefined, - resume: sessionId, - resumeSessionAt: resumeAt, - systemPrompt: globalClaudeMd - ? { type: 'preset' as const, preset: 'claude_code' as const, append: globalClaudeMd } - : undefined, - allowedTools: [ - 'Bash', - 'Read', 'Write', 'Edit', 'Glob', 'Grep', - 'WebSearch', 'WebFetch', - 'Task', 'TaskOutput', 'TaskStop', - 'TeamCreate', 'TeamDelete', 'SendMessage', - 'TodoWrite', 'ToolSearch', 'Skill', - 'NotebookEdit', - 'mcp__nanoclaw__*', - 'mcp__gmail__*', - ], - env: sdkEnv, - permissionMode: 'bypassPermissions', - allowDangerouslySkipPermissions: true, - settingSources: ['project', 'user'], - mcpServers: { - nanoclaw: { - command: 'node', - args: [mcpServerPath], - env: { - NANOCLAW_CHAT_JID: containerInput.chatJid, - NANOCLAW_GROUP_FOLDER: containerInput.groupFolder, - NANOCLAW_IS_MAIN: containerInput.isMain ? '1' : '0', - }, - }, - gmail: { - command: 'npx', - args: ['-y', '@gongrzhe/server-gmail-autoauth-mcp'], - }, - }, - hooks: { - PreCompact: [{ hooks: [createPreCompactHook(containerInput.assistantName)] }], - PreToolUse: [{ matcher: 'Bash', hooks: [createSanitizeBashHook()] }], - }, - } - })) { - messageCount++; - const msgType = message.type === 'system' ? 
`system/${(message as { subtype?: string }).subtype}` : message.type; - log(`[msg #${messageCount}] type=${msgType}`); - - if (message.type === 'assistant' && 'uuid' in message) { - lastAssistantUuid = (message as { uuid: string }).uuid; - } - - if (message.type === 'system' && message.subtype === 'init') { - newSessionId = message.session_id; - log(`Session initialized: ${newSessionId}`); - } - - if (message.type === 'system' && (message as { subtype?: string }).subtype === 'task_notification') { - const tn = message as { task_id: string; status: string; summary: string }; - log(`Task notification: task=${tn.task_id} status=${tn.status} summary=${tn.summary}`); - } - - if (message.type === 'result') { - resultCount++; - const textResult = 'result' in message ? (message as { result?: string }).result : null; - log(`Result #${resultCount}: subtype=${message.subtype}${textResult ? ` text=${textResult.slice(0, 200)}` : ''}`); - writeOutput({ - status: 'success', - result: textResult || null, - newSessionId - }); - } - } - - ipcPolling = false; - log(`Query done. Messages: ${messageCount}, results: ${resultCount}, lastAssistantUuid: ${lastAssistantUuid || 'none'}, closedDuringQuery: ${closedDuringQuery}`); - return { newSessionId, lastAssistantUuid, closedDuringQuery }; -} - -async function main(): Promise { - let containerInput: ContainerInput; - - try { - const stdinData = await readStdin(); - containerInput = JSON.parse(stdinData); - // Delete the temp file the entrypoint wrote — it contains secrets - try { fs.unlinkSync('/tmp/input.json'); } catch { /* may not exist */ } - log(`Received input for group: ${containerInput.groupFolder}`); - } catch (err) { - writeOutput({ - status: 'error', - result: null, - error: `Failed to parse input: ${err instanceof Error ? err.message : String(err)}` - }); - process.exit(1); - } - - // Build SDK env: merge secrets into process.env for the SDK only. 
- // Secrets never touch process.env itself, so Bash subprocesses can't see them. - const sdkEnv: Record = { ...process.env }; - for (const [key, value] of Object.entries(containerInput.secrets || {})) { - sdkEnv[key] = value; - } - - const __dirname = path.dirname(fileURLToPath(import.meta.url)); - const mcpServerPath = path.join(__dirname, 'ipc-mcp-stdio.js'); - - let sessionId = containerInput.sessionId; - fs.mkdirSync(IPC_INPUT_DIR, { recursive: true }); - - // Clean up stale _close sentinel from previous container runs - try { fs.unlinkSync(IPC_INPUT_CLOSE_SENTINEL); } catch { /* ignore */ } - - // Build initial prompt (drain any pending IPC messages too) - let prompt = containerInput.prompt; - if (containerInput.isScheduledTask) { - prompt = `[SCHEDULED TASK - The following message was sent automatically and is not coming directly from the user or group.]\n\n${prompt}`; - } - const pending = drainIpcInput(); - if (pending.length > 0) { - log(`Draining ${pending.length} pending IPC messages into initial prompt`); - prompt += '\n' + pending.join('\n'); - } - - // Query loop: run query → wait for IPC message → run new query → repeat - let resumeAt: string | undefined; - try { - while (true) { - log(`Starting query (session: ${sessionId || 'new'}, resumeAt: ${resumeAt || 'latest'})...`); - - const queryResult = await runQuery(prompt, sessionId, mcpServerPath, containerInput, sdkEnv, resumeAt); - if (queryResult.newSessionId) { - sessionId = queryResult.newSessionId; - } - if (queryResult.lastAssistantUuid) { - resumeAt = queryResult.lastAssistantUuid; - } - - // If _close was consumed during the query, exit immediately. - // Don't emit a session-update marker (it would reset the host's - // idle timer and cause a 30-min delay before the next _close). 
- if (queryResult.closedDuringQuery) { - log('Close sentinel consumed during query, exiting'); - break; - } - - // Emit session update so host can track it - writeOutput({ status: 'success', result: null, newSessionId: sessionId }); - - log('Query ended, waiting for next IPC message...'); - - // Wait for the next message or _close sentinel - const nextMessage = await waitForIpcMessage(); - if (nextMessage === null) { - log('Close sentinel received, exiting'); - break; - } - - log(`Got new message (${nextMessage.length} chars), starting new query`); - prompt = nextMessage; - } - } catch (err) { - const errorMessage = err instanceof Error ? err.message : String(err); - log(`Agent error: ${errorMessage}`); - writeOutput({ - status: 'error', - result: null, - newSessionId: sessionId, - error: errorMessage - }); - process.exit(1); - } -} - -main(); diff --git a/.claude/skills/add-gmail/modify/container/agent-runner/src/index.ts.intent.md b/.claude/skills/add-gmail/modify/container/agent-runner/src/index.ts.intent.md deleted file mode 100644 index 3d24be7..0000000 --- a/.claude/skills/add-gmail/modify/container/agent-runner/src/index.ts.intent.md +++ /dev/null @@ -1,32 +0,0 @@ -# Intent: container/agent-runner/src/index.ts modifications - -## What changed -Added Gmail MCP server to the agent's available tools so it can read and send emails. 
- -## Key sections - -### mcpServers (inside runQuery → query() call) -- Added: `gmail` MCP server alongside the existing `nanoclaw` server: - ``` - gmail: { - command: 'npx', - args: ['-y', '@gongrzhe/server-gmail-autoauth-mcp'], - }, - ``` - -### allowedTools (inside runQuery → query() call) -- Added: `'mcp__gmail__*'` to allow all Gmail MCP tools - -## Invariants -- The `nanoclaw` MCP server configuration is unchanged -- All existing allowed tools are preserved -- The query loop, IPC handling, MessageStream, and all other logic is untouched -- Hooks (PreCompact, sanitize Bash) are unchanged -- Output protocol (markers) is unchanged - -## Must-keep -- The `nanoclaw` MCP server with its environment variables -- All existing allowedTools entries -- The hook system (PreCompact, PreToolUse sanitize) -- The IPC input/close sentinel handling -- The MessageStream class and query loop diff --git a/.claude/skills/add-gmail/modify/src/container-runner.ts b/.claude/skills/add-gmail/modify/src/container-runner.ts deleted file mode 100644 index 7221338..0000000 --- a/.claude/skills/add-gmail/modify/src/container-runner.ts +++ /dev/null @@ -1,661 +0,0 @@ -/** - * Container Runner for NanoClaw - * Spawns agent execution in containers and handles IPC - */ -import { ChildProcess, exec, spawn } from 'child_process'; -import fs from 'fs'; -import os from 'os'; -import path from 'path'; - -import { - CONTAINER_IMAGE, - CONTAINER_MAX_OUTPUT_SIZE, - CONTAINER_TIMEOUT, - DATA_DIR, - GROUPS_DIR, - IDLE_TIMEOUT, - TIMEZONE, -} from './config.js'; -import { readEnvFile } from './env.js'; -import { resolveGroupFolderPath, resolveGroupIpcPath } from './group-folder.js'; -import { logger } from './logger.js'; -import { CONTAINER_RUNTIME_BIN, readonlyMountArgs, stopContainer } from './container-runtime.js'; -import { validateAdditionalMounts } from './mount-security.js'; -import { RegisteredGroup } from './types.js'; - -// Sentinel markers for robust output parsing (must match agent-runner) 
-const OUTPUT_START_MARKER = '---NANOCLAW_OUTPUT_START---'; -const OUTPUT_END_MARKER = '---NANOCLAW_OUTPUT_END---'; - -export interface ContainerInput { - prompt: string; - sessionId?: string; - groupFolder: string; - chatJid: string; - isMain: boolean; - isScheduledTask?: boolean; - assistantName?: string; - secrets?: Record; -} - -export interface ContainerOutput { - status: 'success' | 'error'; - result: string | null; - newSessionId?: string; - error?: string; -} - -interface VolumeMount { - hostPath: string; - containerPath: string; - readonly: boolean; -} - -function buildVolumeMounts( - group: RegisteredGroup, - isMain: boolean, -): VolumeMount[] { - const mounts: VolumeMount[] = []; - const projectRoot = process.cwd(); - const homeDir = os.homedir(); - const groupDir = resolveGroupFolderPath(group.folder); - - if (isMain) { - // Main gets the project root read-only. Writable paths the agent needs - // (group folder, IPC, .claude/) are mounted separately below. - // Read-only prevents the agent from modifying host application code - // (src/, dist/, package.json, etc.) which would bypass the sandbox - // entirely on next restart. 
- mounts.push({ - hostPath: projectRoot, - containerPath: '/workspace/project', - readonly: true, - }); - - // Main also gets its group folder as the working directory - mounts.push({ - hostPath: groupDir, - containerPath: '/workspace/group', - readonly: false, - }); - } else { - // Other groups only get their own folder - mounts.push({ - hostPath: groupDir, - containerPath: '/workspace/group', - readonly: false, - }); - - // Global memory directory (read-only for non-main) - // Only directory mounts are supported, not file mounts - const globalDir = path.join(GROUPS_DIR, 'global'); - if (fs.existsSync(globalDir)) { - mounts.push({ - hostPath: globalDir, - containerPath: '/workspace/global', - readonly: true, - }); - } - } - - // Per-group Claude sessions directory (isolated from other groups) - // Each group gets their own .claude/ to prevent cross-group session access - const groupSessionsDir = path.join( - DATA_DIR, - 'sessions', - group.folder, - '.claude', - ); - fs.mkdirSync(groupSessionsDir, { recursive: true }); - const settingsFile = path.join(groupSessionsDir, 'settings.json'); - if (!fs.existsSync(settingsFile)) { - fs.writeFileSync(settingsFile, JSON.stringify({ - env: { - // Enable agent swarms (subagent orchestration) - // https://code.claude.com/docs/en/agent-teams#orchestrate-teams-of-claude-code-sessions - CLAUDE_CODE_EXPERIMENTAL_AGENT_TEAMS: '1', - // Load CLAUDE.md from additional mounted directories - // https://code.claude.com/docs/en/memory#load-memory-from-additional-directories - CLAUDE_CODE_ADDITIONAL_DIRECTORIES_CLAUDE_MD: '1', - // Enable Claude's memory feature (persists user preferences between sessions) - // https://code.claude.com/docs/en/memory#manage-auto-memory - CLAUDE_CODE_DISABLE_AUTO_MEMORY: '0', - }, - }, null, 2) + '\n'); - } - - // Sync skills from container/skills/ into each group's .claude/skills/ - const skillsSrc = path.join(process.cwd(), 'container', 'skills'); - const skillsDst = path.join(groupSessionsDir, 
'skills'); - if (fs.existsSync(skillsSrc)) { - for (const skillDir of fs.readdirSync(skillsSrc)) { - const srcDir = path.join(skillsSrc, skillDir); - if (!fs.statSync(srcDir).isDirectory()) continue; - const dstDir = path.join(skillsDst, skillDir); - fs.cpSync(srcDir, dstDir, { recursive: true }); - } - } - mounts.push({ - hostPath: groupSessionsDir, - containerPath: '/home/node/.claude', - readonly: false, - }); - - // Gmail credentials directory (for Gmail MCP inside the container) - const gmailDir = path.join(homeDir, '.gmail-mcp'); - if (fs.existsSync(gmailDir)) { - mounts.push({ - hostPath: gmailDir, - containerPath: '/home/node/.gmail-mcp', - readonly: false, // MCP may need to refresh OAuth tokens - }); - } - - // Per-group IPC namespace: each group gets its own IPC directory - // This prevents cross-group privilege escalation via IPC - const groupIpcDir = resolveGroupIpcPath(group.folder); - fs.mkdirSync(path.join(groupIpcDir, 'messages'), { recursive: true }); - fs.mkdirSync(path.join(groupIpcDir, 'tasks'), { recursive: true }); - fs.mkdirSync(path.join(groupIpcDir, 'input'), { recursive: true }); - mounts.push({ - hostPath: groupIpcDir, - containerPath: '/workspace/ipc', - readonly: false, - }); - - // Copy agent-runner source into a per-group writable location so agents - // can customize it (add tools, change behavior) without affecting other - // groups. Recompiled on container startup via entrypoint.sh. 
- const agentRunnerSrc = path.join(projectRoot, 'container', 'agent-runner', 'src'); - const groupAgentRunnerDir = path.join(DATA_DIR, 'sessions', group.folder, 'agent-runner-src'); - if (!fs.existsSync(groupAgentRunnerDir) && fs.existsSync(agentRunnerSrc)) { - fs.cpSync(agentRunnerSrc, groupAgentRunnerDir, { recursive: true }); - } - mounts.push({ - hostPath: groupAgentRunnerDir, - containerPath: '/app/src', - readonly: false, - }); - - // Additional mounts validated against external allowlist (tamper-proof from containers) - if (group.containerConfig?.additionalMounts) { - const validatedMounts = validateAdditionalMounts( - group.containerConfig.additionalMounts, - group.name, - isMain, - ); - mounts.push(...validatedMounts); - } - - return mounts; -} - -/** - * Read allowed secrets from .env for passing to the container via stdin. - * Secrets are never written to disk or mounted as files. - */ -function readSecrets(): Record { - return readEnvFile(['CLAUDE_CODE_OAUTH_TOKEN', 'ANTHROPIC_API_KEY']); -} - -function buildContainerArgs(mounts: VolumeMount[], containerName: string): string[] { - const args: string[] = ['run', '-i', '--rm', '--name', containerName]; - - // Pass host timezone so container's local time matches the user's - args.push('-e', `TZ=${TIMEZONE}`); - - // Run as host user so bind-mounted files are accessible. - // Skip when running as root (uid 0), as the container's node user (uid 1000), - // or when getuid is unavailable (native Windows without WSL). 
- const hostUid = process.getuid?.(); - const hostGid = process.getgid?.(); - if (hostUid != null && hostUid !== 0 && hostUid !== 1000) { - args.push('--user', `${hostUid}:${hostGid}`); - args.push('-e', 'HOME=/home/node'); - } - - for (const mount of mounts) { - if (mount.readonly) { - args.push(...readonlyMountArgs(mount.hostPath, mount.containerPath)); - } else { - args.push('-v', `${mount.hostPath}:${mount.containerPath}`); - } - } - - args.push(CONTAINER_IMAGE); - - return args; -} - -export async function runContainerAgent( - group: RegisteredGroup, - input: ContainerInput, - onProcess: (proc: ChildProcess, containerName: string) => void, - onOutput?: (output: ContainerOutput) => Promise, -): Promise { - const startTime = Date.now(); - - const groupDir = resolveGroupFolderPath(group.folder); - fs.mkdirSync(groupDir, { recursive: true }); - - const mounts = buildVolumeMounts(group, input.isMain); - const safeName = group.folder.replace(/[^a-zA-Z0-9-]/g, '-'); - const containerName = `nanoclaw-${safeName}-${Date.now()}`; - const containerArgs = buildContainerArgs(mounts, containerName); - - logger.debug( - { - group: group.name, - containerName, - mounts: mounts.map( - (m) => - `${m.hostPath} -> ${m.containerPath}${m.readonly ? 
' (ro)' : ''}`, - ), - containerArgs: containerArgs.join(' '), - }, - 'Container mount configuration', - ); - - logger.info( - { - group: group.name, - containerName, - mountCount: mounts.length, - isMain: input.isMain, - }, - 'Spawning container agent', - ); - - const logsDir = path.join(groupDir, 'logs'); - fs.mkdirSync(logsDir, { recursive: true }); - - return new Promise((resolve) => { - const container = spawn(CONTAINER_RUNTIME_BIN, containerArgs, { - stdio: ['pipe', 'pipe', 'pipe'], - }); - - onProcess(container, containerName); - - let stdout = ''; - let stderr = ''; - let stdoutTruncated = false; - let stderrTruncated = false; - - // Pass secrets via stdin (never written to disk or mounted as files) - input.secrets = readSecrets(); - container.stdin.write(JSON.stringify(input)); - container.stdin.end(); - // Remove secrets from input so they don't appear in logs - delete input.secrets; - - // Streaming output: parse OUTPUT_START/END marker pairs as they arrive - let parseBuffer = ''; - let newSessionId: string | undefined; - let outputChain = Promise.resolve(); - - container.stdout.on('data', (data) => { - const chunk = data.toString(); - - // Always accumulate for logging - if (!stdoutTruncated) { - const remaining = CONTAINER_MAX_OUTPUT_SIZE - stdout.length; - if (chunk.length > remaining) { - stdout += chunk.slice(0, remaining); - stdoutTruncated = true; - logger.warn( - { group: group.name, size: stdout.length }, - 'Container stdout truncated due to size limit', - ); - } else { - stdout += chunk; - } - } - - // Stream-parse for output markers - if (onOutput) { - parseBuffer += chunk; - let startIdx: number; - while ((startIdx = parseBuffer.indexOf(OUTPUT_START_MARKER)) !== -1) { - const endIdx = parseBuffer.indexOf(OUTPUT_END_MARKER, startIdx); - if (endIdx === -1) break; // Incomplete pair, wait for more data - - const jsonStr = parseBuffer - .slice(startIdx + OUTPUT_START_MARKER.length, endIdx) - .trim(); - parseBuffer = parseBuffer.slice(endIdx + 
OUTPUT_END_MARKER.length); - - try { - const parsed: ContainerOutput = JSON.parse(jsonStr); - if (parsed.newSessionId) { - newSessionId = parsed.newSessionId; - } - hadStreamingOutput = true; - // Activity detected — reset the hard timeout - resetTimeout(); - // Call onOutput for all markers (including null results) - // so idle timers start even for "silent" query completions. - outputChain = outputChain.then(() => onOutput(parsed)); - } catch (err) { - logger.warn( - { group: group.name, error: err }, - 'Failed to parse streamed output chunk', - ); - } - } - } - }); - - container.stderr.on('data', (data) => { - const chunk = data.toString(); - const lines = chunk.trim().split('\n'); - for (const line of lines) { - if (line) logger.debug({ container: group.folder }, line); - } - // Don't reset timeout on stderr — SDK writes debug logs continuously. - // Timeout only resets on actual output (OUTPUT_MARKER in stdout). - if (stderrTruncated) return; - const remaining = CONTAINER_MAX_OUTPUT_SIZE - stderr.length; - if (chunk.length > remaining) { - stderr += chunk.slice(0, remaining); - stderrTruncated = true; - logger.warn( - { group: group.name, size: stderr.length }, - 'Container stderr truncated due to size limit', - ); - } else { - stderr += chunk; - } - }); - - let timedOut = false; - let hadStreamingOutput = false; - const configTimeout = group.containerConfig?.timeout || CONTAINER_TIMEOUT; - // Grace period: hard timeout must be at least IDLE_TIMEOUT + 30s so the - // graceful _close sentinel has time to trigger before the hard kill fires. 
- const timeoutMs = Math.max(configTimeout, IDLE_TIMEOUT + 30_000); - - const killOnTimeout = () => { - timedOut = true; - logger.error({ group: group.name, containerName }, 'Container timeout, stopping gracefully'); - exec(stopContainer(containerName), { timeout: 15000 }, (err) => { - if (err) { - logger.warn({ group: group.name, containerName, err }, 'Graceful stop failed, force killing'); - container.kill('SIGKILL'); - } - }); - }; - - let timeout = setTimeout(killOnTimeout, timeoutMs); - - // Reset the timeout whenever there's activity (streaming output) - const resetTimeout = () => { - clearTimeout(timeout); - timeout = setTimeout(killOnTimeout, timeoutMs); - }; - - container.on('close', (code) => { - clearTimeout(timeout); - const duration = Date.now() - startTime; - - if (timedOut) { - const ts = new Date().toISOString().replace(/[:.]/g, '-'); - const timeoutLog = path.join(logsDir, `container-${ts}.log`); - fs.writeFileSync(timeoutLog, [ - `=== Container Run Log (TIMEOUT) ===`, - `Timestamp: ${new Date().toISOString()}`, - `Group: ${group.name}`, - `Container: ${containerName}`, - `Duration: ${duration}ms`, - `Exit Code: ${code}`, - `Had Streaming Output: ${hadStreamingOutput}`, - ].join('\n')); - - // Timeout after output = idle cleanup, not failure. - // The agent already sent its response; this is just the - // container being reaped after the idle period expired. 
- if (hadStreamingOutput) { - logger.info( - { group: group.name, containerName, duration, code }, - 'Container timed out after output (idle cleanup)', - ); - outputChain.then(() => { - resolve({ - status: 'success', - result: null, - newSessionId, - }); - }); - return; - } - - logger.error( - { group: group.name, containerName, duration, code }, - 'Container timed out with no output', - ); - - resolve({ - status: 'error', - result: null, - error: `Container timed out after ${configTimeout}ms`, - }); - return; - } - - const timestamp = new Date().toISOString().replace(/[:.]/g, '-'); - const logFile = path.join(logsDir, `container-${timestamp}.log`); - const isVerbose = process.env.LOG_LEVEL === 'debug' || process.env.LOG_LEVEL === 'trace'; - - const logLines = [ - `=== Container Run Log ===`, - `Timestamp: ${new Date().toISOString()}`, - `Group: ${group.name}`, - `IsMain: ${input.isMain}`, - `Duration: ${duration}ms`, - `Exit Code: ${code}`, - `Stdout Truncated: ${stdoutTruncated}`, - `Stderr Truncated: ${stderrTruncated}`, - ``, - ]; - - const isError = code !== 0; - - if (isVerbose || isError) { - logLines.push( - `=== Input ===`, - JSON.stringify(input, null, 2), - ``, - `=== Container Args ===`, - containerArgs.join(' '), - ``, - `=== Mounts ===`, - mounts - .map( - (m) => - `${m.hostPath} -> ${m.containerPath}${m.readonly ? ' (ro)' : ''}`, - ) - .join('\n'), - ``, - `=== Stderr${stderrTruncated ? ' (TRUNCATED)' : ''} ===`, - stderr, - ``, - `=== Stdout${stdoutTruncated ? ' (TRUNCATED)' : ''} ===`, - stdout, - ); - } else { - logLines.push( - `=== Input Summary ===`, - `Prompt length: ${input.prompt.length} chars`, - `Session ID: ${input.sessionId || 'new'}`, - ``, - `=== Mounts ===`, - mounts - .map((m) => `${m.containerPath}${m.readonly ? 
' (ro)' : ''}`) - .join('\n'), - ``, - ); - } - - fs.writeFileSync(logFile, logLines.join('\n')); - logger.debug({ logFile, verbose: isVerbose }, 'Container log written'); - - if (code !== 0) { - logger.error( - { - group: group.name, - code, - duration, - stderr, - stdout, - logFile, - }, - 'Container exited with error', - ); - - resolve({ - status: 'error', - result: null, - error: `Container exited with code ${code}: ${stderr.slice(-200)}`, - }); - return; - } - - // Streaming mode: wait for output chain to settle, return completion marker - if (onOutput) { - outputChain.then(() => { - logger.info( - { group: group.name, duration, newSessionId }, - 'Container completed (streaming mode)', - ); - resolve({ - status: 'success', - result: null, - newSessionId, - }); - }); - return; - } - - // Legacy mode: parse the last output marker pair from accumulated stdout - try { - // Extract JSON between sentinel markers for robust parsing - const startIdx = stdout.indexOf(OUTPUT_START_MARKER); - const endIdx = stdout.indexOf(OUTPUT_END_MARKER); - - let jsonLine: string; - if (startIdx !== -1 && endIdx !== -1 && endIdx > startIdx) { - jsonLine = stdout - .slice(startIdx + OUTPUT_START_MARKER.length, endIdx) - .trim(); - } else { - // Fallback: last non-empty line (backwards compatibility) - const lines = stdout.trim().split('\n'); - jsonLine = lines[lines.length - 1]; - } - - const output: ContainerOutput = JSON.parse(jsonLine); - - logger.info( - { - group: group.name, - duration, - status: output.status, - hasResult: !!output.result, - }, - 'Container completed', - ); - - resolve(output); - } catch (err) { - logger.error( - { - group: group.name, - stdout, - stderr, - error: err, - }, - 'Failed to parse container output', - ); - - resolve({ - status: 'error', - result: null, - error: `Failed to parse container output: ${err instanceof Error ? 
err.message : String(err)}`, - }); - } - }); - - container.on('error', (err) => { - clearTimeout(timeout); - logger.error({ group: group.name, containerName, error: err }, 'Container spawn error'); - resolve({ - status: 'error', - result: null, - error: `Container spawn error: ${err.message}`, - }); - }); - }); -} - -export function writeTasksSnapshot( - groupFolder: string, - isMain: boolean, - tasks: Array<{ - id: string; - groupFolder: string; - prompt: string; - schedule_type: string; - schedule_value: string; - status: string; - next_run: string | null; - }>, -): void { - // Write filtered tasks to the group's IPC directory - const groupIpcDir = resolveGroupIpcPath(groupFolder); - fs.mkdirSync(groupIpcDir, { recursive: true }); - - // Main sees all tasks, others only see their own - const filteredTasks = isMain - ? tasks - : tasks.filter((t) => t.groupFolder === groupFolder); - - const tasksFile = path.join(groupIpcDir, 'current_tasks.json'); - fs.writeFileSync(tasksFile, JSON.stringify(filteredTasks, null, 2)); -} - -export interface AvailableGroup { - jid: string; - name: string; - lastActivity: string; - isRegistered: boolean; -} - -/** - * Write available groups snapshot for the container to read. - * Only main group can see all available groups (for activation). - * Non-main groups only see their own registration status. - */ -export function writeGroupsSnapshot( - groupFolder: string, - isMain: boolean, - groups: AvailableGroup[], - registeredJids: Set<string>, -): void { - const groupIpcDir = resolveGroupIpcPath(groupFolder); - fs.mkdirSync(groupIpcDir, { recursive: true }); - - // Main sees all groups; others see nothing (they can't activate groups) - const visibleGroups = isMain ? 
groups : []; - - const groupsFile = path.join(groupIpcDir, 'available_groups.json'); - fs.writeFileSync( - groupsFile, - JSON.stringify( - { - groups: visibleGroups, - lastSync: new Date().toISOString(), - }, - null, - 2, - ), - ); -} diff --git a/.claude/skills/add-gmail/modify/src/container-runner.ts.intent.md b/.claude/skills/add-gmail/modify/src/container-runner.ts.intent.md deleted file mode 100644 index a9847a9..0000000 --- a/.claude/skills/add-gmail/modify/src/container-runner.ts.intent.md +++ /dev/null @@ -1,37 +0,0 @@ -# Intent: src/container-runner.ts modifications - -## What changed -Added a volume mount for Gmail OAuth credentials (`~/.gmail-mcp/`) so the Gmail MCP server inside the container can authenticate with Google. - -## Key sections - -### buildVolumeMounts() -- Added: Gmail credentials mount after the `.claude` sessions mount: - ``` - const gmailDir = path.join(homeDir, '.gmail-mcp'); - if (fs.existsSync(gmailDir)) { - mounts.push({ - hostPath: gmailDir, - containerPath: '/home/node/.gmail-mcp', - readonly: false, // MCP may need to refresh OAuth tokens - }); - } - ``` -- Uses `os.homedir()` to resolve the home directory -- Mount is read-write because the Gmail MCP server needs to refresh OAuth tokens -- Mount is conditional — only added if `~/.gmail-mcp/` exists on the host - -### Imports -- Added: `os` import for `os.homedir()` - -## Invariants -- All existing mounts are unchanged -- Mount ordering is preserved (Gmail added after session mounts, before additional mounts) -- The `buildContainerArgs`, `runContainerAgent`, and all other functions are untouched -- Additional mount validation via `validateAdditionalMounts` is unchanged - -## Must-keep -- All existing volume mounts (project root, group dir, global, sessions, IPC, agent-runner, additional) -- The mount security model (allowlist validation for additional mounts) -- The `readSecrets` function and stdin-based secret passing -- Container lifecycle (spawn, timeout, output parsing) diff 
--git a/.claude/skills/add-gmail/modify/src/index.ts b/.claude/skills/add-gmail/modify/src/index.ts deleted file mode 100644 index be26a17..0000000 --- a/.claude/skills/add-gmail/modify/src/index.ts +++ /dev/null @@ -1,507 +0,0 @@ -import fs from 'fs'; -import path from 'path'; - -import { - ASSISTANT_NAME, - IDLE_TIMEOUT, - MAIN_GROUP_FOLDER, - POLL_INTERVAL, - TRIGGER_PATTERN, -} from './config.js'; -import { GmailChannel } from './channels/gmail.js'; -import { WhatsAppChannel } from './channels/whatsapp.js'; -import { - ContainerOutput, - runContainerAgent, - writeGroupsSnapshot, - writeTasksSnapshot, -} from './container-runner.js'; -import { cleanupOrphans, ensureContainerRuntimeRunning } from './container-runtime.js'; -import { - getAllChats, - getAllRegisteredGroups, - getAllSessions, - getAllTasks, - getMessagesSince, - getNewMessages, - getRouterState, - initDatabase, - setRegisteredGroup, - setRouterState, - setSession, - storeChatMetadata, - storeMessage, -} from './db.js'; -import { GroupQueue } from './group-queue.js'; -import { resolveGroupFolderPath } from './group-folder.js'; -import { startIpcWatcher } from './ipc.js'; -import { findChannel, formatMessages, formatOutbound } from './router.js'; -import { startSchedulerLoop } from './task-scheduler.js'; -import { Channel, NewMessage, RegisteredGroup } from './types.js'; -import { logger } from './logger.js'; - -// Re-export for backwards compatibility during refactor -export { escapeXml, formatMessages } from './router.js'; - -let lastTimestamp = ''; -let sessions: Record<string, string> = {}; -let registeredGroups: Record<string, RegisteredGroup> = {}; -let lastAgentTimestamp: Record<string, string> = {}; -let messageLoopRunning = false; - -let whatsapp: WhatsAppChannel; -const channels: Channel[] = []; -const queue = new GroupQueue(); - -function loadState(): void { - lastTimestamp = getRouterState('last_timestamp') || ''; - const agentTs = getRouterState('last_agent_timestamp'); - try { - lastAgentTimestamp = agentTs ? 
JSON.parse(agentTs) : {}; - } catch { - logger.warn('Corrupted last_agent_timestamp in DB, resetting'); - lastAgentTimestamp = {}; - } - sessions = getAllSessions(); - registeredGroups = getAllRegisteredGroups(); - logger.info( - { groupCount: Object.keys(registeredGroups).length }, - 'State loaded', - ); -} - -function saveState(): void { - setRouterState('last_timestamp', lastTimestamp); - setRouterState( - 'last_agent_timestamp', - JSON.stringify(lastAgentTimestamp), - ); -} - -function registerGroup(jid: string, group: RegisteredGroup): void { - let groupDir: string; - try { - groupDir = resolveGroupFolderPath(group.folder); - } catch (err) { - logger.warn( - { jid, folder: group.folder, err }, - 'Rejecting group registration with invalid folder', - ); - return; - } - - registeredGroups[jid] = group; - setRegisteredGroup(jid, group); - - // Create group folder - fs.mkdirSync(path.join(groupDir, 'logs'), { recursive: true }); - - logger.info( - { jid, name: group.name, folder: group.folder }, - 'Group registered', - ); -} - -/** - * Get available groups list for the agent. - * Returns groups ordered by most recent activity. - */ -export function getAvailableGroups(): import('./container-runner.js').AvailableGroup[] { - const chats = getAllChats(); - const registeredJids = new Set(Object.keys(registeredGroups)); - - return chats - .filter((c) => c.jid !== '__group_sync__' && c.is_group) - .map((c) => ({ - jid: c.jid, - name: c.name, - lastActivity: c.last_message_time, - isRegistered: registeredJids.has(c.jid), - })); -} - -/** @internal - exported for testing */ -export function _setRegisteredGroups(groups: Record<string, RegisteredGroup>): void { - registeredGroups = groups; -} - -/** - * Process all pending messages for a group. - * Called by the GroupQueue when it's this group's turn. 
- */ -async function processGroupMessages(chatJid: string): Promise<boolean> { - const group = registeredGroups[chatJid]; - if (!group) return true; - - const channel = findChannel(channels, chatJid); - if (!channel) { - console.log(`Warning: no channel owns JID ${chatJid}, skipping messages`); - return true; - } - - const isMainGroup = group.folder === MAIN_GROUP_FOLDER; - - const sinceTimestamp = lastAgentTimestamp[chatJid] || ''; - const missedMessages = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME); - - if (missedMessages.length === 0) return true; - - // For non-main groups, check if trigger is required and present - if (!isMainGroup && group.requiresTrigger !== false) { - const hasTrigger = missedMessages.some((m) => - TRIGGER_PATTERN.test(m.content.trim()), - ); - if (!hasTrigger) return true; - } - - const prompt = formatMessages(missedMessages); - - // Advance cursor so the piping path in startMessageLoop won't re-fetch - // these messages. Save the old cursor so we can roll back on error. - const previousCursor = lastAgentTimestamp[chatJid] || ''; - lastAgentTimestamp[chatJid] = - missedMessages[missedMessages.length - 1].timestamp; - saveState(); - - logger.info( - { group: group.name, messageCount: missedMessages.length }, - 'Processing messages', - ); - - // Track idle timer for closing stdin when agent is idle - let idleTimer: ReturnType<typeof setTimeout> | null = null; - - const resetIdleTimer = () => { - if (idleTimer) clearTimeout(idleTimer); - idleTimer = setTimeout(() => { - logger.debug({ group: group.name }, 'Idle timeout, closing container stdin'); - queue.closeStdin(chatJid); - }, IDLE_TIMEOUT); - }; - - await channel.setTyping?.(chatJid, true); - let hadError = false; - let outputSentToUser = false; - - const output = await runAgent(group, prompt, chatJid, async (result) => { - // Streaming output callback — called for each agent result - if (result.result) { - const raw = typeof result.result === 'string' ? 
result.result : JSON.stringify(result.result); - // Strip <internal>...</internal> blocks — agent uses these for internal reasoning - const text = raw.replace(/<internal>[\s\S]*?<\/internal>/g, '').trim(); - logger.info({ group: group.name }, `Agent output: ${raw.slice(0, 200)}`); - if (text) { - await channel.sendMessage(chatJid, text); - outputSentToUser = true; - } - // Only reset idle timer on actual results, not session-update markers (result: null) - resetIdleTimer(); - } - - if (result.status === 'success') { - queue.notifyIdle(chatJid); - } - - if (result.status === 'error') { - hadError = true; - } - }); - - await channel.setTyping?.(chatJid, false); - if (idleTimer) clearTimeout(idleTimer); - - if (output === 'error' || hadError) { - // If we already sent output to the user, don't roll back the cursor — - // the user got their response and re-processing would send duplicates. - if (outputSentToUser) { - logger.warn({ group: group.name }, 'Agent error after output was sent, skipping cursor rollback to prevent duplicates'); - return true; - } - // Roll back cursor so retries can re-process these messages - lastAgentTimestamp[chatJid] = previousCursor; - saveState(); - logger.warn({ group: group.name }, 'Agent error, rolled back message cursor for retry'); - return false; - } - - return true; -} - -async function runAgent( - group: RegisteredGroup, - prompt: string, - chatJid: string, - onOutput?: (output: ContainerOutput) => Promise<void>, -): Promise<'success' | 'error'> { - const isMain = group.folder === MAIN_GROUP_FOLDER; - const sessionId = sessions[group.folder]; - - // Update tasks snapshot for container to read (filtered by group) - const tasks = getAllTasks(); - writeTasksSnapshot( - group.folder, - isMain, - tasks.map((t) => ({ - id: t.id, - groupFolder: t.group_folder, - prompt: t.prompt, - schedule_type: t.schedule_type, - schedule_value: t.schedule_value, - status: t.status, - next_run: t.next_run, - })), - ); - - // Update available groups snapshot (main group only can see all 
groups) - const availableGroups = getAvailableGroups(); - writeGroupsSnapshot( - group.folder, - isMain, - availableGroups, - new Set(Object.keys(registeredGroups)), - ); - - // Wrap onOutput to track session ID from streamed results - const wrappedOnOutput = onOutput - ? async (output: ContainerOutput) => { - if (output.newSessionId) { - sessions[group.folder] = output.newSessionId; - setSession(group.folder, output.newSessionId); - } - await onOutput(output); - } - : undefined; - - try { - const output = await runContainerAgent( - group, - { - prompt, - sessionId, - groupFolder: group.folder, - chatJid, - isMain, - assistantName: ASSISTANT_NAME, - }, - (proc, containerName) => queue.registerProcess(chatJid, proc, containerName, group.folder), - wrappedOnOutput, - ); - - if (output.newSessionId) { - sessions[group.folder] = output.newSessionId; - setSession(group.folder, output.newSessionId); - } - - if (output.status === 'error') { - logger.error( - { group: group.name, error: output.error }, - 'Container agent error', - ); - return 'error'; - } - - return 'success'; - } catch (err) { - logger.error({ group: group.name, err }, 'Agent error'); - return 'error'; - } -} - -async function startMessageLoop(): Promise<void> { - if (messageLoopRunning) { - logger.debug('Message loop already running, skipping duplicate start'); - return; - } - messageLoopRunning = true; - - logger.info(`NanoClaw running (trigger: @${ASSISTANT_NAME})`); - - while (true) { - try { - const jids = Object.keys(registeredGroups); - const { messages, newTimestamp } = getNewMessages(jids, lastTimestamp, ASSISTANT_NAME); - - if (messages.length > 0) { - logger.info({ count: messages.length }, 'New messages'); - - // Advance the "seen" cursor for all messages immediately - lastTimestamp = newTimestamp; - saveState(); - - // Deduplicate by group - const messagesByGroup = new Map<string, NewMessage[]>(); - for (const msg of messages) { - const existing = messagesByGroup.get(msg.chat_jid); - if (existing) { - 
existing.push(msg); - } else { - messagesByGroup.set(msg.chat_jid, [msg]); - } - } - - for (const [chatJid, groupMessages] of messagesByGroup) { - const group = registeredGroups[chatJid]; - if (!group) continue; - - const channel = findChannel(channels, chatJid); - if (!channel) { - console.log(`Warning: no channel owns JID ${chatJid}, skipping messages`); - continue; - } - - const isMainGroup = group.folder === MAIN_GROUP_FOLDER; - const needsTrigger = !isMainGroup && group.requiresTrigger !== false; - - // For non-main groups, only act on trigger messages. - // Non-trigger messages accumulate in DB and get pulled as - // context when a trigger eventually arrives. - if (needsTrigger) { - const hasTrigger = groupMessages.some((m) => - TRIGGER_PATTERN.test(m.content.trim()), - ); - if (!hasTrigger) continue; - } - - // Pull all messages since lastAgentTimestamp so non-trigger - // context that accumulated between triggers is included. - const allPending = getMessagesSince( - chatJid, - lastAgentTimestamp[chatJid] || '', - ASSISTANT_NAME, - ); - const messagesToSend = - allPending.length > 0 ? allPending : groupMessages; - const formatted = formatMessages(messagesToSend); - - if (queue.sendMessage(chatJid, formatted)) { - logger.debug( - { chatJid, count: messagesToSend.length }, - 'Piped messages to active container', - ); - lastAgentTimestamp[chatJid] = - messagesToSend[messagesToSend.length - 1].timestamp; - saveState(); - // Show typing indicator while the container processes the piped message - channel.setTyping?.(chatJid, true)?.catch((err) => - logger.warn({ chatJid, err }, 'Failed to set typing indicator'), - ); - } else { - // No active container — enqueue for a new one - queue.enqueueMessageCheck(chatJid); - } - } - } - } catch (err) { - logger.error({ err }, 'Error in message loop'); - } - await new Promise((resolve) => setTimeout(resolve, POLL_INTERVAL)); - } -} - -/** - * Startup recovery: check for unprocessed messages in registered groups. 
- * Handles crash between advancing lastTimestamp and processing messages. - */ -function recoverPendingMessages(): void { - for (const [chatJid, group] of Object.entries(registeredGroups)) { - const sinceTimestamp = lastAgentTimestamp[chatJid] || ''; - const pending = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME); - if (pending.length > 0) { - logger.info( - { group: group.name, pendingCount: pending.length }, - 'Recovery: found unprocessed messages', - ); - queue.enqueueMessageCheck(chatJid); - } - } -} - -function ensureContainerSystemRunning(): void { - ensureContainerRuntimeRunning(); - cleanupOrphans(); -} - -async function main(): Promise<void> { - ensureContainerSystemRunning(); - initDatabase(); - logger.info('Database initialized'); - loadState(); - - // Graceful shutdown handlers - const shutdown = async (signal: string) => { - logger.info({ signal }, 'Shutdown signal received'); - await queue.shutdown(10000); - for (const ch of channels) await ch.disconnect(); - process.exit(0); - }; - process.on('SIGTERM', () => shutdown('SIGTERM')); - process.on('SIGINT', () => shutdown('SIGINT')); - - // Channel callbacks (shared by all channels) - const channelOpts = { - onMessage: (_chatJid: string, msg: NewMessage) => storeMessage(msg), - onChatMetadata: (chatJid: string, timestamp: string, name?: string, channel?: string, isGroup?: boolean) => - storeChatMetadata(chatJid, timestamp, name, channel, isGroup), - registeredGroups: () => registeredGroups, - }; - - // Create and connect channels - whatsapp = new WhatsAppChannel(channelOpts); - channels.push(whatsapp); - await whatsapp.connect(); - - const gmail = new GmailChannel(channelOpts); - channels.push(gmail); - try { - await gmail.connect(); - } catch (err) { - logger.warn({ err }, 'Gmail channel failed to connect, continuing without it'); - } - - // Start subsystems (independently of connection handler) - startSchedulerLoop({ - registeredGroups: () => registeredGroups, - getSessions: () => sessions, - 
queue, - onProcess: (groupJid, proc, containerName, groupFolder) => queue.registerProcess(groupJid, proc, containerName, groupFolder), - sendMessage: async (jid, rawText) => { - const channel = findChannel(channels, jid); - if (!channel) { - console.log(`Warning: no channel owns JID ${jid}, cannot send message`); - return; - } - const text = formatOutbound(rawText); - if (text) await channel.sendMessage(jid, text); - }, - }); - startIpcWatcher({ - sendMessage: (jid, text) => { - const channel = findChannel(channels, jid); - if (!channel) throw new Error(`No channel for JID: ${jid}`); - return channel.sendMessage(jid, text); - }, - registeredGroups: () => registeredGroups, - registerGroup, - syncGroupMetadata: (force) => whatsapp?.syncGroupMetadata(force) ?? Promise.resolve(), - getAvailableGroups, - writeGroupsSnapshot: (gf, im, ag, rj) => writeGroupsSnapshot(gf, im, ag, rj), - }); - queue.setProcessMessagesFn(processGroupMessages); - recoverPendingMessages(); - startMessageLoop().catch((err) => { - logger.fatal({ err }, 'Message loop crashed unexpectedly'); - process.exit(1); - }); -} - -// Guard: only run when executed directly, not when imported by tests -const isDirectRun = - process.argv[1] && - new URL(import.meta.url).pathname === new URL(`file://${process.argv[1]}`).pathname; - -if (isDirectRun) { - main().catch((err) => { - logger.error({ err }, 'Failed to start NanoClaw'); - process.exit(1); - }); -} diff --git a/.claude/skills/add-gmail/modify/src/index.ts.intent.md b/.claude/skills/add-gmail/modify/src/index.ts.intent.md deleted file mode 100644 index cd700f5..0000000 --- a/.claude/skills/add-gmail/modify/src/index.ts.intent.md +++ /dev/null @@ -1,40 +0,0 @@ -# Intent: src/index.ts modifications - -## What changed - -Added Gmail as a channel. 
- -## Key sections - -### Imports (top of file) - -- Added: `GmailChannel` from `./channels/gmail.js` - -### main() - -- Added Gmail channel creation: - ``` - const gmail = new GmailChannel(channelOpts); - channels.push(gmail); - await gmail.connect(); - ``` -- Gmail uses the same `channelOpts` callbacks as other channels -- Incoming emails are delivered to the main group (agent decides how to respond, user can configure) - -## Invariants - -- All existing message processing logic (triggers, cursors, idle timers) is preserved -- The `runAgent` function is completely unchanged -- State management (loadState/saveState) is unchanged -- Recovery logic is unchanged -- Container runtime check is unchanged -- Any other channel creation is untouched -- Shutdown iterates `channels` array (Gmail is included automatically) - -## Must-keep - -- The `escapeXml` and `formatMessages` re-exports -- The `_setRegisteredGroups` test helper -- The `isDirectRun` guard at bottom -- All error handling and cursor rollback logic in processGroupMessages -- The outgoing queue flush and reconnection logic diff --git a/.claude/skills/add-gmail/modify/src/routing.test.ts b/.claude/skills/add-gmail/modify/src/routing.test.ts deleted file mode 100644 index 837b1da..0000000 --- a/.claude/skills/add-gmail/modify/src/routing.test.ts +++ /dev/null @@ -1,119 +0,0 @@ -import { describe, it, expect, beforeEach } from 'vitest'; - -import { _initTestDatabase, getAllChats, storeChatMetadata } from './db.js'; -import { getAvailableGroups, _setRegisteredGroups } from './index.js'; - -beforeEach(() => { - _initTestDatabase(); - _setRegisteredGroups({}); -}); - -// --- JID ownership patterns --- - -describe('JID ownership patterns', () => { - // These test the patterns that will become ownsJid() on the Channel interface - - it('WhatsApp group JID: ends with @g.us', () => { - const jid = '12345678@g.us'; - expect(jid.endsWith('@g.us')).toBe(true); - }); - - it('WhatsApp DM JID: ends with @s.whatsapp.net', () => 
{ - const jid = '12345678@s.whatsapp.net'; - expect(jid.endsWith('@s.whatsapp.net')).toBe(true); - }); - - it('Gmail JID: starts with gmail:', () => { - const jid = 'gmail:abc123def'; - expect(jid.startsWith('gmail:')).toBe(true); - }); - - it('Gmail thread JID: starts with gmail: followed by thread ID', () => { - const jid = 'gmail:18d3f4a5b6c7d8e9'; - expect(jid.startsWith('gmail:')).toBe(true); - }); -}); - -// --- getAvailableGroups --- - -describe('getAvailableGroups', () => { - it('returns only groups, excludes DMs', () => { - storeChatMetadata('group1@g.us', '2024-01-01T00:00:01.000Z', 'Group 1', 'whatsapp', true); - storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false); - storeChatMetadata('group2@g.us', '2024-01-01T00:00:03.000Z', 'Group 2', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(2); - expect(groups.map((g) => g.jid)).toContain('group1@g.us'); - expect(groups.map((g) => g.jid)).toContain('group2@g.us'); - expect(groups.map((g) => g.jid)).not.toContain('user@s.whatsapp.net'); - }); - - it('excludes __group_sync__ sentinel', () => { - storeChatMetadata('__group_sync__', '2024-01-01T00:00:00.000Z'); - storeChatMetadata('group@g.us', '2024-01-01T00:00:01.000Z', 'Group', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('group@g.us'); - }); - - it('marks registered groups correctly', () => { - storeChatMetadata('reg@g.us', '2024-01-01T00:00:01.000Z', 'Registered', 'whatsapp', true); - storeChatMetadata('unreg@g.us', '2024-01-01T00:00:02.000Z', 'Unregistered', 'whatsapp', true); - - _setRegisteredGroups({ - 'reg@g.us': { - name: 'Registered', - folder: 'registered', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - }); - - const groups = getAvailableGroups(); - const reg = groups.find((g) => g.jid === 'reg@g.us'); - const unreg = groups.find((g) => g.jid === 'unreg@g.us'); - - 
expect(reg?.isRegistered).toBe(true); - expect(unreg?.isRegistered).toBe(false); - }); - - it('returns groups ordered by most recent activity', () => { - storeChatMetadata('old@g.us', '2024-01-01T00:00:01.000Z', 'Old', 'whatsapp', true); - storeChatMetadata('new@g.us', '2024-01-01T00:00:05.000Z', 'New', 'whatsapp', true); - storeChatMetadata('mid@g.us', '2024-01-01T00:00:03.000Z', 'Mid', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups[0].jid).toBe('new@g.us'); - expect(groups[1].jid).toBe('mid@g.us'); - expect(groups[2].jid).toBe('old@g.us'); - }); - - it('excludes non-group chats regardless of JID format', () => { - // Unknown JID format stored without is_group should not appear - storeChatMetadata('unknown-format-123', '2024-01-01T00:00:01.000Z', 'Unknown'); - // Explicitly non-group with unusual JID - storeChatMetadata('custom:abc', '2024-01-01T00:00:02.000Z', 'Custom DM', 'custom', false); - // A real group for contrast - storeChatMetadata('group@g.us', '2024-01-01T00:00:03.000Z', 'Group', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('group@g.us'); - }); - - it('returns empty array when no chats exist', () => { - const groups = getAvailableGroups(); - expect(groups).toHaveLength(0); - }); - - it('excludes Gmail threads from group list (Gmail threads are not groups)', () => { - storeChatMetadata('gmail:abc123', '2024-01-01T00:00:01.000Z', 'Email thread', 'gmail', false); - storeChatMetadata('group@g.us', '2024-01-01T00:00:02.000Z', 'Group', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('group@g.us'); - }); -}); diff --git a/.claude/skills/add-gmail/tests/gmail.test.ts b/.claude/skills/add-gmail/tests/gmail.test.ts deleted file mode 100644 index 02d9721..0000000 --- a/.claude/skills/add-gmail/tests/gmail.test.ts +++ /dev/null @@ -1,40 +0,0 @@ -import { describe, it, expect } from 
'vitest'; -import fs from 'fs'; -import path from 'path'; - -const root = process.cwd(); -const read = (f: string) => fs.readFileSync(path.join(root, f), 'utf-8'); - -function getGmailMode(): 'tool-only' | 'channel' { - const p = path.join(root, '.nanoclaw/state.yaml'); - if (!fs.existsSync(p)) return 'channel'; - return read('.nanoclaw/state.yaml').includes('mode: tool-only') ? 'tool-only' : 'channel'; -} - -const mode = getGmailMode(); -const channelOnly = mode === 'tool-only'; - -describe('add-gmail skill', () => { - it('container-runner mounts ~/.gmail-mcp', () => { - expect(read('src/container-runner.ts')).toContain('.gmail-mcp'); - }); - - it('agent-runner has gmail MCP server', () => { - const content = read('container/agent-runner/src/index.ts'); - expect(content).toContain('mcp__gmail__*'); - expect(content).toContain('@gongrzhe/server-gmail-autoauth-mcp'); - }); - - it.skipIf(channelOnly)('gmail channel file exists', () => { - expect(fs.existsSync(path.join(root, 'src/channels/gmail.ts'))).toBe(true); - }); - - it.skipIf(channelOnly)('index.ts wires up GmailChannel', () => { - expect(read('src/index.ts')).toContain('GmailChannel'); - }); - - it.skipIf(channelOnly)('googleapis dependency installed', () => { - const pkg = JSON.parse(read('package.json')); - expect(pkg.dependencies?.googleapis || pkg.devDependencies?.googleapis).toBeDefined(); - }); -}); diff --git a/.claude/skills/add-image-vision/SKILL.md b/.claude/skills/add-image-vision/SKILL.md new file mode 100644 index 0000000..072bf7b --- /dev/null +++ b/.claude/skills/add-image-vision/SKILL.md @@ -0,0 +1,94 @@ +--- +name: add-image-vision +description: Add image vision to NanoClaw agents. Resizes and processes WhatsApp image attachments, then sends them to Claude as multimodal content blocks. +--- + +# Image Vision Skill + +Adds the ability for NanoClaw agents to see and understand images sent via WhatsApp. 
Images are downloaded, resized with sharp, saved to the group workspace, and passed to the agent as base64-encoded multimodal content blocks. + +## Phase 1: Pre-flight + +1. Check if `src/image.ts` exists — skip to Phase 3 if already applied +2. Confirm `sharp` is installable (native bindings require build tools) + +**Prerequisite:** WhatsApp must be installed first (`skill/whatsapp` merged). This skill modifies WhatsApp channel files. + +## Phase 2: Apply Code Changes + +### Ensure WhatsApp fork remote + +```bash +git remote -v +``` + +If `whatsapp` is missing, add it: + +```bash +git remote add whatsapp https://github.com/qwibitai/nanoclaw-whatsapp.git +``` + +### Merge the skill branch + +```bash +git fetch whatsapp skill/image-vision +git merge whatsapp/skill/image-vision || { + git checkout --theirs package-lock.json + git add package-lock.json + git merge --continue +} +``` + +This merges in: +- `src/image.ts` (image download, resize via sharp, base64 encoding) +- `src/image.test.ts` (8 unit tests) +- Image attachment handling in `src/channels/whatsapp.ts` +- Image passing to agent in `src/index.ts` and `src/container-runner.ts` +- Image content block support in `container/agent-runner/src/index.ts` +- `sharp` npm dependency in `package.json` + +If the merge reports conflicts, resolve them by reading the conflicted files and understanding the intent of both sides. + +### Validate code changes + +```bash +npm install +npm run build +npx vitest run src/image.test.ts +``` + +All tests must pass and build must be clean before proceeding. + +## Phase 3: Configure + +1. Rebuild the container (agent-runner changes need a rebuild): + ```bash + ./container/build.sh + ``` + +2. Sync agent-runner source to group caches: + ```bash + for dir in data/sessions/*/agent-runner-src/; do + cp container/agent-runner/src/*.ts "$dir" + done + ``` + +3. Restart the service: + ```bash + launchctl kickstart -k gui/$(id -u)/com.nanoclaw + ``` + +## Phase 4: Verify + +1. 
Send an image in a registered WhatsApp group +2. Check that the agent responds with understanding of the image content +3. Check logs for "Processed image attachment": + ```bash + tail -50 groups/*/logs/container-*.log + ``` + +## Troubleshooting + +- **"Image - download failed"**: Check WhatsApp connection stability. The download may time out on slow connections. +- **"Image - processing failed"**: Sharp may not be installed correctly. Run `npm ls sharp` to verify. +- **Agent doesn't mention image content**: Check container logs for "Loaded image" messages. If missing, ensure agent-runner source was synced to group caches. diff --git a/.claude/skills/add-macos-statusbar/SKILL.md b/.claude/skills/add-macos-statusbar/SKILL.md new file mode 100644 index 0000000..62855f2 --- /dev/null +++ b/.claude/skills/add-macos-statusbar/SKILL.md @@ -0,0 +1,133 @@ +--- +name: add-macos-statusbar +description: Add a macOS menu bar status indicator for NanoClaw. Shows a bolt icon with a green/red dot indicating whether NanoClaw is running, with Start, Stop, and Restart controls. macOS only. +--- + +# Add macOS Menu Bar Status Indicator + +Adds a persistent menu bar icon that shows NanoClaw's running status and lets the user +start, stop, or restart the service — similar to how Docker Desktop appears in the menu bar. + +**macOS only.** Requires Xcode Command Line Tools (`swiftc`). + +## Phase 1: Pre-flight + +### Check platform + +If not on macOS, stop and tell the user: + +> This skill is macOS only. The menu bar status indicator uses AppKit and requires `swiftc` (Xcode Command Line Tools). + +### Check for swiftc + +```bash +which swiftc +``` + +If not found, tell the user: + +> Xcode Command Line Tools are required. Install them by running: +> +> ```bash +> xcode-select --install +> ``` +> +> Then re-run `/add-macos-statusbar`. 
+ +### Check if already installed + +```bash +launchctl list | grep com.nanoclaw.statusbar +``` + +If it returns a PID (not `-`), tell the user it's already installed and skip to Phase 3 (Verify). + +## Phase 2: Compile and Install + +### Compile the Swift binary + +The source lives in the skill directory. Compile it into `dist/`: + +```bash +mkdir -p dist +swiftc -O -o dist/statusbar "${CLAUDE_SKILL_DIR}/add/src/statusbar.swift" +``` + +This produces a small native binary at `dist/statusbar`. + +On macOS Sequoia or later, clear the quarantine attribute so the binary can run: + +```bash +xattr -cr dist/statusbar +``` + +### Create the launchd plist + +Determine the absolute project root and home directory: + +```bash +pwd +echo $HOME +``` + +Create `~/Library/LaunchAgents/com.nanoclaw.statusbar.plist`, substituting the actual values +for `{PROJECT_ROOT}` and `{HOME}`: + +```xml +<?xml version="1.0" encoding="UTF-8"?> +<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd"> +<plist version="1.0"> +<dict> +  <key>Label</key> +  <string>com.nanoclaw.statusbar</string> +  <key>ProgramArguments</key> +  <array> +    <string>{PROJECT_ROOT}/dist/statusbar</string> +  </array> +  <key>RunAtLoad</key> +  <true/> +  <key>KeepAlive</key> +  <true/> +  <key>EnvironmentVariables</key> +  <dict> +    <key>HOME</key> +    <string>{HOME}</string> +  </dict> +  <key>StandardOutPath</key> +  <string>{PROJECT_ROOT}/logs/statusbar.log</string> +  <key>StandardErrorPath</key> +  <string>{PROJECT_ROOT}/logs/statusbar.error.log</string> +</dict> +</plist> +``` + +### Load the service + +```bash +launchctl load ~/Library/LaunchAgents/com.nanoclaw.statusbar.plist +``` + +## Phase 3: Verify + +```bash +launchctl list | grep com.nanoclaw.statusbar +``` + +The first column should show a PID (not `-`). + +Tell the user: + +> The bolt icon should now appear in your macOS menu bar. Click it to see NanoClaw's status and control the service. +> +> - **Green dot** — NanoClaw is running +> - **Red dot** — NanoClaw is stopped +> +> Use **Restart** after making code changes, and **View Logs** to open the log file directly. 
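As a quick sanity check, the verify steps above can be rolled into one script. This is a sketch, not part of the skill — it assumes the default paths used in this guide (`dist/statusbar` and the `com.nanoclaw.statusbar` label):

```shell
# Sketch: one-shot verification of the statusbar install (paths per this guide).
BIN="dist/statusbar"
PLIST="$HOME/Library/LaunchAgents/com.nanoclaw.statusbar.plist"

# Binary compiled and executable?
if [ -x "$BIN" ]; then echo "binary ok: $BIN"; else echo "missing binary: $BIN"; fi

# Plist installed?
if [ -f "$PLIST" ]; then echo "plist ok"; else echo "missing plist: $PLIST"; fi

# A PID in the first column means the agent is running; "-" means loaded but stopped.
launchctl list | grep com.nanoclaw.statusbar || echo "agent not loaded"
```

Run it from the project root; any "missing" line points at the phase to redo.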
+ +## Removal + +```bash +launchctl unload ~/Library/LaunchAgents/com.nanoclaw.statusbar.plist +rm ~/Library/LaunchAgents/com.nanoclaw.statusbar.plist +rm dist/statusbar +``` diff --git a/.claude/skills/add-macos-statusbar/add/src/statusbar.swift b/.claude/skills/add-macos-statusbar/add/src/statusbar.swift new file mode 100644 index 0000000..2577380 --- /dev/null +++ b/.claude/skills/add-macos-statusbar/add/src/statusbar.swift @@ -0,0 +1,147 @@ +import AppKit + +class StatusBarController: NSObject { + private var statusItem: NSStatusItem! + private var isRunning = false + private var timer: Timer? + + private let plistPath = "\(NSHomeDirectory())/Library/LaunchAgents/com.nanoclaw.plist" + + /// Derive the NanoClaw project root from the binary location. + /// The binary is compiled to {project}/dist/statusbar, so the parent of + /// the parent directory is the project root. + private static let projectRoot: String = { + let binary = URL(fileURLWithPath: CommandLine.arguments[0]).resolvingSymlinksInPath() + return binary.deletingLastPathComponent().deletingLastPathComponent().path + }() + + override init() { + super.init() + setupStatusItem() + isRunning = checkRunning() + updateMenu() + // Poll every 5 seconds to reflect external state changes + timer = Timer.scheduledTimer(withTimeInterval: 5.0, repeats: true) { [weak self] _ in + guard let self else { return } + let current = self.checkRunning() + if current != self.isRunning { + self.isRunning = current + self.updateMenu() + } + } + } + + private func setupStatusItem() { + statusItem = NSStatusBar.system.statusItem(withLength: NSStatusItem.variableLength) + if let button = statusItem.button { + if let image = NSImage(systemSymbolName: "bolt.fill", accessibilityDescription: "NanoClaw") { + image.isTemplate = true + button.image = image + } else { + button.title = "⚡" + } + button.toolTip = "NanoClaw" + } + } + + private func checkRunning() -> Bool { + let task = Process() + task.launchPath = "/bin/launchctl" + 
task.arguments = ["list", "com.nanoclaw"] + let pipe = Pipe() + task.standardOutput = pipe + task.standardError = Pipe() + guard (try? task.run()) != nil else { return false } + task.waitUntilExit() + if task.terminationStatus != 0 { return false } + let output = String(data: pipe.fileHandleForReading.readDataToEndOfFile(), encoding: .utf8) ?? "" + // launchctl list output: "PID\tExitCode\tLabel" — "-" means not running + let pid = output.trimmingCharacters(in: .whitespacesAndNewlines).components(separatedBy: "\t").first ?? "-" + return pid != "-" + } + + private func updateMenu() { + let menu = NSMenu() + + // Status row with colored dot + let statusItem = NSMenuItem() + let dot = "● " + let dotColor: NSColor = isRunning ? .systemGreen : .systemRed + let attr = NSMutableAttributedString(string: dot, attributes: [.foregroundColor: dotColor]) + let label = isRunning ? "NanoClaw is running" : "NanoClaw is stopped" + attr.append(NSAttributedString(string: label, attributes: [.foregroundColor: NSColor.labelColor])) + statusItem.attributedTitle = attr + statusItem.isEnabled = false + menu.addItem(statusItem) + + menu.addItem(NSMenuItem.separator()) + + if isRunning { + let stop = NSMenuItem(title: "Stop", action: #selector(stopService), keyEquivalent: "") + stop.target = self + menu.addItem(stop) + + let restart = NSMenuItem(title: "Restart", action: #selector(restartService), keyEquivalent: "r") + restart.target = self + menu.addItem(restart) + } else { + let start = NSMenuItem(title: "Start", action: #selector(startService), keyEquivalent: "") + start.target = self + menu.addItem(start) + } + + menu.addItem(NSMenuItem.separator()) + + let logs = NSMenuItem(title: "View Logs", action: #selector(viewLogs), keyEquivalent: "") + logs.target = self + menu.addItem(logs) + + self.statusItem.menu = menu + } + + @objc private func startService() { + run("/bin/launchctl", ["load", plistPath]) + refresh(after: 2) + } + + @objc private func stopService() { + run("/bin/launchctl", 
["unload", plistPath]) + refresh(after: 2) + } + + @objc private func restartService() { + let uid = getuid() + run("/bin/launchctl", ["kickstart", "-k", "gui/\(uid)/com.nanoclaw"]) + refresh(after: 3) + } + + @objc private func viewLogs() { + let logPath = "\(StatusBarController.projectRoot)/logs/nanoclaw.log" + NSWorkspace.shared.open(URL(fileURLWithPath: logPath)) + } + + private func refresh(after seconds: Double) { + DispatchQueue.main.asyncAfter(deadline: .now() + seconds) { [weak self] in + guard let self else { return } + self.isRunning = self.checkRunning() + self.updateMenu() + } + } + + @discardableResult + private func run(_ path: String, _ args: [String]) -> Int32 { + let task = Process() + task.launchPath = path + task.arguments = args + task.standardOutput = Pipe() + task.standardError = Pipe() + try? task.run() + task.waitUntilExit() + return task.terminationStatus + } +} + +let app = NSApplication.shared +app.setActivationPolicy(.accessory) +let controller = StatusBarController() +app.run() diff --git a/.claude/skills/add-ollama-tool/SKILL.md b/.claude/skills/add-ollama-tool/SKILL.md new file mode 100644 index 0000000..aa69295 --- /dev/null +++ b/.claude/skills/add-ollama-tool/SKILL.md @@ -0,0 +1,193 @@ +--- +name: add-ollama-tool +description: Add Ollama MCP server so the container agent can call local models and optionally manage the Ollama model library. +--- + +# Add Ollama Integration + +This skill adds a stdio-based MCP server that exposes local Ollama models as tools for the container agent. Claude remains the orchestrator but can offload work to local models, and can optionally manage the model library directly. 
+ +Core tools (always available): +- `ollama_list_models` — list installed Ollama models with name, size, and family +- `ollama_generate` — send a prompt to a specified model and return the response + +Management tools (opt-in via `OLLAMA_ADMIN_TOOLS=true`): +- `ollama_pull_model` — pull (download) a model from the Ollama registry +- `ollama_delete_model` — delete a locally installed model to free disk space +- `ollama_show_model` — show model details: modelfile, parameters, and architecture info +- `ollama_list_running` — list models currently loaded in memory with memory usage and processor type + +## Phase 1: Pre-flight + +### Check if already applied + +Check if `container/agent-runner/src/ollama-mcp-stdio.ts` exists. If it does, skip to Phase 3 (Configure). + +### Check prerequisites + +Verify Ollama is installed and running on the host: + +```bash +ollama list +``` + +If Ollama is not installed, direct the user to https://ollama.com/download. + +If no models are installed, suggest pulling one: + +> You need at least one model. 
I recommend: +> +> ```bash +> ollama pull gemma3:1b # Small, fast (1GB) +> ollama pull llama3.2 # Good general purpose (2GB) +> ollama pull qwen3-coder:30b # Best for code tasks (18GB) +> ``` + +## Phase 2: Apply Code Changes + +### Ensure upstream remote + +```bash +git remote -v +``` + +If `upstream` is missing, add it: + +```bash +git remote add upstream https://github.com/qwibitai/nanoclaw.git +``` + +### Merge the skill branch + +```bash +git fetch upstream skill/ollama-tool +git merge upstream/skill/ollama-tool +``` + +This merges in: +- `container/agent-runner/src/ollama-mcp-stdio.ts` (Ollama MCP server) +- `scripts/ollama-watch.sh` (macOS notification watcher) +- Ollama MCP config in `container/agent-runner/src/index.ts` (allowedTools + mcpServers) +- `[OLLAMA]` log surfacing in `src/container-runner.ts` +- `OLLAMA_HOST` in `.env.example` + +If the merge reports conflicts, resolve them by reading the conflicted files and understanding the intent of both sides. + +### Copy to per-group agent-runner + +Existing groups have a cached copy of the agent-runner source. Copy the new files: + +```bash +for dir in data/sessions/*/agent-runner-src; do + cp container/agent-runner/src/ollama-mcp-stdio.ts "$dir/" + cp container/agent-runner/src/index.ts "$dir/" +done +``` + +### Validate code changes + +```bash +npm run build +./container/build.sh +``` + +Build must be clean before proceeding. + +## Phase 3: Configure + +### Enable model management tools (optional) + +Ask the user: + +> Would you like the agent to be able to **manage Ollama models** (pull, delete, inspect, list running)? 
+> +> - **Yes** — adds tools to pull new models, delete old ones, show model info, and check what's loaded in memory +> - **No** — the agent can only list installed models and generate responses (you manage models yourself on the host) + +If the user wants management tools, add to `.env`: + +```bash +OLLAMA_ADMIN_TOOLS=true +``` + +If they decline (or don't answer), do not add the variable — management tools will be disabled by default. + +### Set Ollama host (optional) + +By default, the MCP server connects to `http://host.docker.internal:11434` (Docker Desktop) with a fallback to `localhost`. To use a custom Ollama host, add to `.env`: + +```bash +OLLAMA_HOST=http://your-ollama-host:11434 +``` + +### Restart the service + +```bash +launchctl kickstart -k gui/$(id -u)/com.nanoclaw # macOS +# Linux: systemctl --user restart nanoclaw +``` + +## Phase 4: Verify + +### Test inference + +Tell the user: + +> Send a message like: "use ollama to tell me the capital of France" +> +> The agent should use `ollama_list_models` to find available models, then `ollama_generate` to get a response. + +### Test model management (if enabled) + +If `OLLAMA_ADMIN_TOOLS=true` was set, tell the user: + +> Send a message like: "pull the gemma3:1b model" or "which ollama models are currently loaded in memory?" +> +> The agent should call `ollama_pull_model` or `ollama_list_running` respectively. 
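If the agent's tool calls misbehave, you can take the MCP layer out of the loop and exercise the same Ollama HTTP endpoints directly from the host. A sketch — the model name is an example (substitute one from `ollama list`), and `OLLAMA_HOST` falls back to the local default:

```shell
# Sketch: hit the Ollama HTTP API directly (the endpoints the MCP tools wrap).
HOST="${OLLAMA_HOST:-http://localhost:11434}"

# What ollama_list_models reads
curl -s "$HOST/api/tags" || echo "Ollama unreachable at $HOST"

# What ollama_generate performs; "stream": false blocks until the full response
curl -s "$HOST/api/generate" \
  -d '{"model": "gemma3:1b", "prompt": "Capital of France?", "stream": false}' \
  || echo "generate request failed"
```

If these work but the agent's tools don't, the problem is in the MCP wiring, not Ollama itself.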
+ +### Monitor activity (optional) + +Run the watcher script for macOS notifications when Ollama is used: + +```bash +./scripts/ollama-watch.sh +``` + +### Check logs if needed + +```bash +tail -f logs/nanoclaw.log | grep -i ollama +``` + +Look for: +- `[OLLAMA] >>> Generating` — generation started +- `[OLLAMA] <<< Done` — generation completed +- `[OLLAMA] Pulling model:` — pull in progress (management tools) +- `[OLLAMA] Deleted:` — model removed (management tools) + +## Troubleshooting + +### Agent says "Ollama is not installed" + +The agent is trying to run `ollama` CLI inside the container instead of using the MCP tools. This means: +1. The MCP server wasn't registered — check `container/agent-runner/src/index.ts` has the `ollama` entry in `mcpServers` +2. The per-group source wasn't updated — re-copy files (see Phase 2) +3. The container wasn't rebuilt — run `./container/build.sh` + +### "Failed to connect to Ollama" + +1. Verify Ollama is running: `ollama list` +2. Check Docker can reach the host: `docker run --rm curlimages/curl curl -s http://host.docker.internal:11434/api/tags` +3. If using a custom host, check `OLLAMA_HOST` in `.env` + +### Agent doesn't use Ollama tools + +The agent may not know about the tools. Try being explicit: "use the ollama_generate tool with gemma3:1b to answer: ..." + +### `ollama_pull_model` times out on large models + +Large models (7B+) can take several minutes. The tool uses `stream: false` so it blocks until complete — this is intentional. For very large pulls, use the host CLI directly: `ollama pull <model>` + +### Management tools not showing up + +Ensure `OLLAMA_ADMIN_TOOLS=true` is set in `.env` and the service was restarted after adding it. diff --git a/.claude/skills/add-pdf-reader/SKILL.md b/.claude/skills/add-pdf-reader/SKILL.md new file mode 100644 index 0000000..a01e530 --- /dev/null +++ b/.claude/skills/add-pdf-reader/SKILL.md @@ -0,0 +1,104 @@ +--- +name: add-pdf-reader +description: Add PDF reading to NanoClaw agents. 
Extracts text from PDFs via pdftotext CLI. Handles WhatsApp attachments, URLs, and local files. +--- + +# Add PDF Reader + +Adds PDF reading capability to all container agents using poppler-utils (pdftotext/pdfinfo). PDFs sent as WhatsApp attachments are auto-downloaded to the group workspace. + +## Phase 1: Pre-flight + +1. Check if `container/skills/pdf-reader/pdf-reader` exists — skip to Phase 3 if already applied +2. Confirm WhatsApp is installed first (`skill/whatsapp` merged). This skill modifies WhatsApp channel files. + +## Phase 2: Apply Code Changes + +### Ensure WhatsApp fork remote + +```bash +git remote -v +``` + +If `whatsapp` is missing, add it: + +```bash +git remote add whatsapp https://github.com/qwibitai/nanoclaw-whatsapp.git +``` + +### Merge the skill branch + +```bash +git fetch whatsapp skill/pdf-reader +git merge whatsapp/skill/pdf-reader || { + git checkout --theirs package-lock.json + git add package-lock.json + git merge --continue +} +``` + +This merges in: +- `container/skills/pdf-reader/SKILL.md` (agent-facing documentation) +- `container/skills/pdf-reader/pdf-reader` (CLI script) +- `poppler-utils` in `container/Dockerfile` +- PDF attachment download in `src/channels/whatsapp.ts` +- PDF tests in `src/channels/whatsapp.test.ts` + +If the merge reports conflicts, resolve them by reading the conflicted files and understanding the intent of both sides. + +### Validate + +```bash +npm run build +npx vitest run src/channels/whatsapp.test.ts +``` + +### Rebuild container + +```bash +./container/build.sh +``` + +### Restart service + +```bash +launchctl kickstart -k gui/$(id -u)/com.nanoclaw # macOS +# Linux: systemctl --user restart nanoclaw +``` + +## Phase 3: Verify + +### Test PDF extraction + +Send a PDF file in any registered WhatsApp chat. The agent should: +1. Download the PDF to `attachments/` +2. Respond acknowledging the PDF +3. Be able to extract text when asked + +### Test URL fetching + +Ask the agent to read a PDF from a URL. 
It should use `pdf-reader fetch <url>`. + +### Check logs if needed + +```bash +tail -f logs/nanoclaw.log | grep -i pdf +``` + +Look for: +- `Downloaded PDF attachment` — successful download +- `Failed to download PDF attachment` — media download issue + +## Troubleshooting + +### Agent says pdf-reader command not found + +Container needs rebuilding. Run `./container/build.sh` and restart the service. + +### PDF text extraction is empty + +The PDF may be scanned (image-based). pdftotext only handles text-based PDFs. Consider using the agent-browser to open the PDF visually instead. + +### WhatsApp PDF not detected + +Verify the message has `documentMessage` with `mimetype: application/pdf`. Some file-sharing apps send PDFs as generic files without the correct mimetype. diff --git a/.claude/skills/add-reactions/SKILL.md b/.claude/skills/add-reactions/SKILL.md new file mode 100644 index 0000000..de86768 --- /dev/null +++ b/.claude/skills/add-reactions/SKILL.md @@ -0,0 +1,117 @@ +--- +name: add-reactions +description: Add WhatsApp emoji reaction support — receive, send, store, and search reactions. +--- + +# Add Reactions + +This skill adds emoji reaction support to NanoClaw's WhatsApp channel: receive and store reactions, send reactions from the container agent via MCP tool, and query reaction history from SQLite. + +## Phase 1: Pre-flight + +### Check if already applied + +Check if `src/status-tracker.ts` exists: + +```bash +test -f src/status-tracker.ts && echo "Already applied" || echo "Not applied" +``` + +If already applied, skip to Phase 3 (Verify). 
+ +## Phase 2: Apply Code Changes + +### Ensure WhatsApp fork remote + +```bash +git remote -v +``` + +If `whatsapp` is missing, add it: + +```bash +git remote add whatsapp https://github.com/qwibitai/nanoclaw-whatsapp.git +``` + +### Merge the skill branch + +```bash +git fetch whatsapp skill/reactions +git merge whatsapp/skill/reactions || { + git checkout --theirs package-lock.json + git add package-lock.json + git merge --continue +} +``` + +This adds: +- `scripts/migrate-reactions.ts` (database migration for `reactions` table with composite PK and indexes) +- `src/status-tracker.ts` (forward-only emoji state machine for message lifecycle signaling, with persistence and retry) +- `src/status-tracker.test.ts` (unit tests for StatusTracker) +- `container/skills/reactions/SKILL.md` (agent-facing documentation for the `react_to_message` MCP tool) +- Reaction support in `src/db.ts`, `src/channels/whatsapp.ts`, `src/types.ts`, `src/ipc.ts`, `src/index.ts`, `src/group-queue.ts`, and `container/agent-runner/src/ipc-mcp-stdio.ts` + +### Run database migration + +```bash +npx tsx scripts/migrate-reactions.ts +``` + +### Validate code changes + +```bash +npm test +npm run build +``` + +All tests must pass and build must be clean before proceeding. + +## Phase 3: Verify + +### Build and restart + +```bash +npm run build +``` + +Linux: +```bash +systemctl --user restart nanoclaw +``` + +macOS: +```bash +launchctl kickstart -k gui/$(id -u)/com.nanoclaw +``` + +### Test receiving reactions + +1. Send a message from your phone +2. React to it with an emoji on WhatsApp +3. Check the database: + +```bash +sqlite3 store/messages.db "SELECT * FROM reactions ORDER BY timestamp DESC LIMIT 5;" +``` + +### Test sending reactions + +Ask the agent to react to a message via the `react_to_message` MCP tool. Check your phone — the reaction should appear on the message. 
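Beyond the spot-check above, reaction history can be queried directly from the host. A sketch — only the `timestamp` column is confirmed by this guide; the other column names are assumptions to verify against `scripts/migrate-reactions.ts`:

```shell
# Sketch: reaction-history queries against the message store.
DB="store/messages.db"

# Inspect the migrated schema first — column names below are assumptions
sqlite3 "$DB" ".schema reactions" || echo "no database or reactions table yet"

# Example: most-used emojis (assumes an `emoji` column)
sqlite3 "$DB" "SELECT emoji, COUNT(*) AS uses FROM reactions GROUP BY emoji ORDER BY uses DESC LIMIT 10;" \
  || echo "query failed — adjust column names to the migrated schema"
```

Adjust the column names once you've read the real schema from the migration.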
+ +## Troubleshooting + +### Reactions not appearing in database + +- Check NanoClaw logs for `Failed to process reaction` errors +- Verify the chat is registered +- Confirm the service is running + +### Migration fails + +- Ensure `store/messages.db` exists and is accessible +- If "table reactions already exists", the migration already ran — skip it + +### Agent can't send reactions + +- Check IPC logs for `Unauthorized IPC reaction attempt blocked` — the agent can only react in its own group's chat +- Verify WhatsApp is connected: check logs for connection status diff --git a/.claude/skills/add-slack/SKILL.md b/.claude/skills/add-slack/SKILL.md index 3914bd9..4c86e19 100644 --- a/.claude/skills/add-slack/SKILL.md +++ b/.claude/skills/add-slack/SKILL.md @@ -5,65 +5,61 @@ description: Add Slack as a channel. Can replace WhatsApp entirely or run alongs # Add Slack Channel -This skill adds Slack support to NanoClaw using the skills engine for deterministic code changes, then walks through interactive setup. +This skill adds Slack support to NanoClaw, then walks through interactive setup. ## Phase 1: Pre-flight ### Check if already applied -Read `.nanoclaw/state.yaml`. If `slack` is in `applied_skills`, skip to Phase 3 (Setup). The code changes are already in place. +Check if `src/channels/slack.ts` exists. If it does, skip to Phase 3 (Setup). The code changes are already in place. ### Ask the user -1. **Mode**: Replace WhatsApp or add alongside it? - - Replace → will set `SLACK_ONLY=true` - - Alongside → both channels active (default) - -2. **Do they already have a Slack app configured?** If yes, collect the Bot Token and App Token now. If no, we'll create one in Phase 3. +**Do they already have a Slack app configured?** If yes, collect the Bot Token and App Token now. If no, we'll create one in Phase 3. ## Phase 2: Apply Code Changes -Run the skills engine to apply this skill's code package. The package files are in this directory alongside this SKILL.md. 
- -### Initialize skills system (if needed) - -If `.nanoclaw/` directory doesn't exist yet: +### Ensure channel remote ```bash -npx tsx scripts/apply-skill.ts --init +git remote -v ``` -Or call `initSkillsSystem()` from `skills-engine/migrate.ts`. - -### Apply the skill +If `slack` is missing, add it: ```bash -npx tsx scripts/apply-skill.ts .claude/skills/add-slack +git remote add slack https://github.com/qwibitai/nanoclaw-slack.git ``` -This deterministically: -- Adds `src/channels/slack.ts` (SlackChannel class implementing Channel interface) -- Adds `src/channels/slack.test.ts` (46 unit tests) -- Three-way merges Slack support into `src/index.ts` (multi-channel support, conditional channel creation) -- Three-way merges Slack config into `src/config.ts` (SLACK_ONLY export) -- Three-way merges updated routing tests into `src/routing.test.ts` -- Installs the `@slack/bolt` npm dependency -- Updates `.env.example` with `SLACK_BOT_TOKEN`, `SLACK_APP_TOKEN`, and `SLACK_ONLY` -- Records the application in `.nanoclaw/state.yaml` +### Merge the skill branch -If the apply reports merge conflicts, read the intent files: -- `modify/src/index.ts.intent.md` — what changed and invariants for index.ts -- `modify/src/config.ts.intent.md` — what changed for config.ts -- `modify/src/routing.test.ts.intent.md` — what changed for routing tests +```bash +git fetch slack main +git merge slack/main || { + git checkout --theirs package-lock.json + git add package-lock.json + git merge --continue +} +``` + +This merges in: +- `src/channels/slack.ts` (SlackChannel class with self-registration via `registerChannel`) +- `src/channels/slack.test.ts` (46 unit tests) +- `import './slack.js'` appended to the channel barrel file `src/channels/index.ts` +- `@slack/bolt` npm dependency in `package.json` +- `SLACK_BOT_TOKEN` and `SLACK_APP_TOKEN` in `.env.example` + +If the merge reports conflicts, resolve them by reading the conflicted files and understanding the intent of both sides. 
### Validate code changes ```bash -npm test +npm install npm run build +npx vitest run src/channels/slack.test.ts ``` -All tests must pass (including the new slack tests) and build must be clean before proceeding. +All tests must pass (including the new Slack tests) and build must be clean before proceeding. ## Phase 3: Setup @@ -89,11 +85,7 @@ SLACK_BOT_TOKEN=xoxb-your-bot-token SLACK_APP_TOKEN=xapp-your-app-token ``` -If they chose to replace WhatsApp: - -```bash -SLACK_ONLY=true -``` +Channels auto-enable when their credentials are present — no extra configuration needed. Sync to container environment: @@ -126,30 +118,18 @@ Wait for the user to provide the channel ID. ### Register the channel -Use the IPC register flow or register directly. The channel ID, name, and folder name are needed. +The channel ID, name, and folder name are needed. Use `npx tsx setup/index.ts --step register` with the appropriate flags. -For a main channel (responds to all messages, uses the `main` folder): +For a main channel (responds to all messages): -```typescript -registerGroup("slack:", { - name: "", - folder: "main", - trigger: `@${ASSISTANT_NAME}`, - added_at: new Date().toISOString(), - requiresTrigger: false, -}); +```bash +npx tsx setup/index.ts --step register -- --jid "slack:" --name "" --folder "slack_main" --trigger "@${ASSISTANT_NAME}" --channel slack --no-trigger-required --is-main ``` For additional channels (trigger-only): -```typescript -registerGroup("slack:", { - name: "", - folder: "", - trigger: `@${ASSISTANT_NAME}`, - added_at: new Date().toISOString(), - requiresTrigger: true, -}); +```bash +npx tsx setup/index.ts --step register -- --jid "slack:" --name "" --folder "slack_" --trigger "@${ASSISTANT_NAME}" --channel slack ``` ## Phase 5: Verify @@ -215,7 +195,7 @@ The Slack channel supports: - **Public channels** — Bot must be added to the channel - **Private channels** — Bot must be invited to the channel - **Direct messages** — Users can DM the bot directly -- 
**Multi-channel** — Can run alongside WhatsApp (default) or replace it (`SLACK_ONLY=true`) +- **Multi-channel** — Can run alongside WhatsApp or other channels (auto-enabled by credentials) ## Known Limitations diff --git a/.claude/skills/add-slack/SLACK_SETUP.md b/.claude/skills/add-slack/SLACK_SETUP.md deleted file mode 100644 index 90e2041..0000000 --- a/.claude/skills/add-slack/SLACK_SETUP.md +++ /dev/null @@ -1,149 +0,0 @@ -# Slack App Setup for NanoClaw - -Step-by-step guide to creating and configuring a Slack app for use with NanoClaw. - -## Prerequisites - -- A Slack workspace where you have admin permissions (or permission to install apps) -- Your NanoClaw instance with the `/add-slack` skill applied - -## Step 1: Create the Slack App - -1. Go to [api.slack.com/apps](https://api.slack.com/apps) -2. Click **Create New App** -3. Choose **From scratch** -4. Enter an app name (e.g., your `ASSISTANT_NAME` value, or any name you like) -5. Select the workspace you want to install it in -6. Click **Create App** - -## Step 2: Enable Socket Mode - -Socket Mode lets the bot connect to Slack without needing a public URL. This is what makes it work from your local machine. - -1. In the sidebar, click **Socket Mode** -2. Toggle **Enable Socket Mode** to **On** -3. When prompted for a token name, enter something like `nanoclaw` -4. Click **Generate** -5. **Copy the App-Level Token** — it starts with `xapp-`. Save this somewhere safe; you'll need it later. - -## Step 3: Subscribe to Events - -This tells Slack which messages to forward to your bot. - -1. In the sidebar, click **Event Subscriptions** -2. Toggle **Enable Events** to **On** -3. 
Under **Subscribe to bot events**, click **Add Bot User Event** and add these three events: - -| Event | What it does | -|-------|-------------| -| `message.channels` | Receive messages in public channels the bot is in | -| `message.groups` | Receive messages in private channels the bot is in | -| `message.im` | Receive direct messages to the bot | - -4. Click **Save Changes** at the bottom of the page - -## Step 4: Set Bot Permissions (OAuth Scopes) - -These scopes control what the bot is allowed to do. - -1. In the sidebar, click **OAuth & Permissions** -2. Scroll down to **Scopes** > **Bot Token Scopes** -3. Click **Add an OAuth Scope** and add each of these: - -| Scope | Why it's needed | -|-------|----------------| -| `chat:write` | Send messages to channels and DMs | -| `channels:history` | Read messages in public channels | -| `groups:history` | Read messages in private channels | -| `im:history` | Read direct messages | -| `channels:read` | List channels (for metadata sync) | -| `groups:read` | List private channels (for metadata sync) | -| `users:read` | Look up user display names | - -## Step 5: Install to Workspace - -1. In the sidebar, click **Install App** -2. Click **Install to Workspace** -3. Review the permissions and click **Allow** -4. **Copy the Bot User OAuth Token** — it starts with `xoxb-`. Save this somewhere safe. - -## Step 6: Configure NanoClaw - -Add both tokens to your `.env` file: - -``` -SLACK_BOT_TOKEN=xoxb-your-bot-token-here -SLACK_APP_TOKEN=xapp-your-app-token-here -``` - -If you want Slack to replace WhatsApp entirely (no WhatsApp channel), also add: - -``` -SLACK_ONLY=true -``` - -Then sync the environment to the container: - -```bash -mkdir -p data/env && cp .env data/env/env -``` - -## Step 7: Add the Bot to Channels - -The bot only receives messages from channels it has been explicitly added to. - -1. Open the Slack channel you want the bot to monitor -2. Click the channel name at the top to open channel details -3. 
Go to **Integrations** > **Add apps** -4. Search for your bot name and add it - -Repeat for each channel you want the bot in. - -## Step 8: Get Channel IDs for Registration - -You need the Slack channel ID to register it with NanoClaw. - -**Option A — From the URL:** -Open the channel in Slack on the web. The URL looks like: -``` -https://app.slack.com/client/TXXXXXXX/C0123456789 -``` -The `C0123456789` part is the channel ID. - -**Option B — Right-click:** -Right-click the channel name in Slack > **Copy link** > the channel ID is the last path segment. - -**Option C — Via API:** -```bash -curl -s -H "Authorization: Bearer $SLACK_BOT_TOKEN" \ - "https://slack.com/api/conversations.list" | jq '.channels[] | {id, name}' -``` - -The NanoClaw JID format is `slack:` followed by the channel ID, e.g., `slack:C0123456789`. - -## Token Reference - -| Token | Prefix | Where to find it | -|-------|--------|-----------------| -| Bot User OAuth Token | `xoxb-` | **OAuth & Permissions** > **Bot User OAuth Token** | -| App-Level Token | `xapp-` | **Basic Information** > **App-Level Tokens** (or during Socket Mode setup) | - -## Troubleshooting - -**Bot not receiving messages:** -- Verify Socket Mode is enabled (Step 2) -- Verify all three events are subscribed (Step 3) -- Verify the bot has been added to the channel (Step 7) - -**"missing_scope" errors:** -- Go back to **OAuth & Permissions** and add the missing scope -- After adding scopes, you must **reinstall the app** to your workspace (Slack will show a banner prompting you to do this) - -**Bot can't send messages:** -- Verify the `chat:write` scope is added -- Verify the bot has been added to the target channel - -**Token not working:** -- Bot tokens start with `xoxb-` — if yours doesn't, you may have copied the wrong token -- App tokens start with `xapp-` — these are generated in the Socket Mode or Basic Information pages -- If you regenerated a token, update `.env` and re-sync: `cp .env data/env/env` diff --git 
a/.claude/skills/add-slack/add/src/channels/slack.test.ts b/.claude/skills/add-slack/add/src/channels/slack.test.ts deleted file mode 100644 index 4c841d1..0000000 --- a/.claude/skills/add-slack/add/src/channels/slack.test.ts +++ /dev/null @@ -1,848 +0,0 @@ -import { describe, it, expect, beforeEach, vi, afterEach } from 'vitest'; - -// --- Mocks --- - -// Mock config -vi.mock('../config.js', () => ({ - ASSISTANT_NAME: 'Jonesy', - TRIGGER_PATTERN: /^@Jonesy\b/i, -})); - -// Mock logger -vi.mock('../logger.js', () => ({ - logger: { - debug: vi.fn(), - info: vi.fn(), - warn: vi.fn(), - error: vi.fn(), - }, -})); - -// Mock db -vi.mock('../db.js', () => ({ - updateChatName: vi.fn(), -})); - -// --- @slack/bolt mock --- - -type Handler = (...args: any[]) => any; - -const appRef = vi.hoisted(() => ({ current: null as any })); - -vi.mock('@slack/bolt', () => ({ - App: class MockApp { - eventHandlers = new Map(); - token: string; - appToken: string; - - client = { - auth: { - test: vi.fn().mockResolvedValue({ user_id: 'U_BOT_123' }), - }, - chat: { - postMessage: vi.fn().mockResolvedValue(undefined), - }, - conversations: { - list: vi.fn().mockResolvedValue({ - channels: [], - response_metadata: {}, - }), - }, - users: { - info: vi.fn().mockResolvedValue({ - user: { real_name: 'Alice Smith', name: 'alice' }, - }), - }, - }; - - constructor(opts: any) { - this.token = opts.token; - this.appToken = opts.appToken; - appRef.current = this; - } - - event(name: string, handler: Handler) { - this.eventHandlers.set(name, handler); - } - - async start() {} - async stop() {} - }, - LogLevel: { ERROR: 'error' }, -})); - -// Mock env -vi.mock('../env.js', () => ({ - readEnvFile: vi.fn().mockReturnValue({ - SLACK_BOT_TOKEN: 'xoxb-test-token', - SLACK_APP_TOKEN: 'xapp-test-token', - }), -})); - -import { SlackChannel, SlackChannelOpts } from './slack.js'; -import { updateChatName } from '../db.js'; -import { readEnvFile } from '../env.js'; - -// --- Test helpers --- - -function 
createTestOpts( - overrides?: Partial<SlackChannelOpts>, -): SlackChannelOpts { - return { - onMessage: vi.fn(), - onChatMetadata: vi.fn(), - registeredGroups: vi.fn(() => ({ - 'slack:C0123456789': { - name: 'Test Channel', - folder: 'test-channel', - trigger: '@Jonesy', - added_at: '2024-01-01T00:00:00.000Z', - }, - })), - ...overrides, - }; -} - -function createMessageEvent(overrides: { - channel?: string; - channelType?: string; - user?: string; - text?: string; - ts?: string; - threadTs?: string; - subtype?: string; - botId?: string; -}) { - return { - channel: overrides.channel ?? 'C0123456789', - channel_type: overrides.channelType ?? 'channel', - user: overrides.user ?? 'U_USER_456', - text: 'text' in overrides ? overrides.text : 'Hello everyone', - ts: overrides.ts ?? '1704067200.000000', - thread_ts: overrides.threadTs, - subtype: overrides.subtype, - bot_id: overrides.botId, - }; -} - -function currentApp() { - return appRef.current; -} - -async function triggerMessageEvent(event: ReturnType<typeof createMessageEvent>) { - const handler = currentApp().eventHandlers.get('message'); - if (handler) await handler({ event }); -} - -// --- Tests --- - -describe('SlackChannel', () => { - beforeEach(() => { - vi.clearAllMocks(); - }); - - afterEach(() => { - vi.restoreAllMocks(); - }); - - // --- Connection lifecycle --- - - describe('connection lifecycle', () => { - it('resolves connect() when app starts', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - - await channel.connect(); - - expect(channel.isConnected()).toBe(true); - }); - - it('registers message event handler on construction', () => { - const opts = createTestOpts(); - new SlackChannel(opts); - - expect(currentApp().eventHandlers.has('message')).toBe(true); - }); - - it('gets bot user ID on connect', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - - await channel.connect(); - - expect(currentApp().client.auth.test).toHaveBeenCalled(); - }); - - it('disconnects 
cleanly', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - - await channel.connect(); - expect(channel.isConnected()).toBe(true); - - await channel.disconnect(); - expect(channel.isConnected()).toBe(false); - }); - - it('isConnected() returns false before connect', () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - - expect(channel.isConnected()).toBe(false); - }); - }); - - // --- Message handling --- - - describe('message handling', () => { - it('delivers message for registered channel', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ text: 'Hello everyone' }); - await triggerMessageEvent(event); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.any(String), - undefined, - 'slack', - true, - ); - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - id: '1704067200.000000', - chat_jid: 'slack:C0123456789', - sender: 'U_USER_456', - content: 'Hello everyone', - is_from_me: false, - }), - ); - }); - - it('only emits metadata for unregistered channels', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ channel: 'C9999999999' }); - await triggerMessageEvent(event); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'slack:C9999999999', - expect.any(String), - undefined, - 'slack', - true, - ); - expect(opts.onMessage).not.toHaveBeenCalled(); - }); - - it('skips non-text subtypes (channel_join, etc.)', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ subtype: 'channel_join' }); - await triggerMessageEvent(event); - - expect(opts.onMessage).not.toHaveBeenCalled(); - 
expect(opts.onChatMetadata).not.toHaveBeenCalled(); - }); - - it('allows bot_message subtype through', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ - subtype: 'bot_message', - botId: 'B_OTHER_BOT', - text: 'Bot message', - }); - await triggerMessageEvent(event); - - expect(opts.onChatMetadata).toHaveBeenCalled(); - }); - - it('skips messages with no text', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ text: undefined as any }); - await triggerMessageEvent(event); - - expect(opts.onMessage).not.toHaveBeenCalled(); - }); - - it('detects bot messages by bot_id', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ - subtype: 'bot_message', - botId: 'B_MY_BOT', - text: 'Bot response', - }); - await triggerMessageEvent(event); - - // Has bot_id so should be marked as bot message - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - is_from_me: true, - is_bot_message: true, - sender_name: 'Jonesy', - }), - ); - }); - - it('detects bot messages by matching bot user ID', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ user: 'U_BOT_123', text: 'Self message' }); - await triggerMessageEvent(event); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - is_from_me: true, - is_bot_message: true, - }), - ); - }); - - it('identifies IM channel type as non-group', async () => { - const opts = createTestOpts({ - registeredGroups: vi.fn(() => ({ - 'slack:D0123456789': { - name: 'DM', - folder: 'dm', - trigger: '@Jonesy', - added_at: '2024-01-01T00:00:00.000Z', - }, - 
})), - }); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ - channel: 'D0123456789', - channelType: 'im', - }); - await triggerMessageEvent(event); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'slack:D0123456789', - expect.any(String), - undefined, - 'slack', - false, // IM is not a group - ); - }); - - it('converts ts to ISO timestamp', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ ts: '1704067200.000000' }); - await triggerMessageEvent(event); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - timestamp: '2024-01-01T00:00:00.000Z', - }), - ); - }); - - it('resolves user name from Slack API', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ user: 'U_USER_456', text: 'Hello' }); - await triggerMessageEvent(event); - - expect(currentApp().client.users.info).toHaveBeenCalledWith({ - user: 'U_USER_456', - }); - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - sender_name: 'Alice Smith', - }), - ); - }); - - it('caches user names to avoid repeated API calls', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - // First message — API call - await triggerMessageEvent(createMessageEvent({ user: 'U_USER_456', text: 'First' })); - // Second message — should use cache - await triggerMessageEvent(createMessageEvent({ - user: 'U_USER_456', - text: 'Second', - ts: '1704067201.000000', - })); - - expect(currentApp().client.users.info).toHaveBeenCalledTimes(1); - }); - - it('falls back to user ID when API fails', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - 
currentApp().client.users.info.mockRejectedValueOnce(new Error('API error')); - - const event = createMessageEvent({ user: 'U_UNKNOWN', text: 'Hi' }); - await triggerMessageEvent(event); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - sender_name: 'U_UNKNOWN', - }), - ); - }); - - it('flattens threaded replies into channel messages', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ - ts: '1704067201.000000', - threadTs: '1704067200.000000', // parent message ts — this is a reply - text: 'Thread reply', - }); - await triggerMessageEvent(event); - - // Threaded replies are delivered as regular channel messages - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - content: 'Thread reply', - }), - ); - }); - - it('delivers thread parent messages normally', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ - ts: '1704067200.000000', - threadTs: '1704067200.000000', // same as ts — this IS the parent - text: 'Thread parent', - }); - await triggerMessageEvent(event); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - content: 'Thread parent', - }), - ); - }); - - it('delivers messages without thread_ts normally', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ text: 'Normal message' }); - await triggerMessageEvent(event); - - expect(opts.onMessage).toHaveBeenCalled(); - }); - }); - - // --- @mention translation --- - - describe('@mention translation', () => { - it('prepends trigger when bot is @mentioned via Slack format', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - 
await channel.connect(); // sets botUserId to 'U_BOT_123' - - const event = createMessageEvent({ - text: 'Hey <@U_BOT_123> what do you think?', - user: 'U_USER_456', - }); - await triggerMessageEvent(event); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - content: '@Jonesy Hey <@U_BOT_123> what do you think?', - }), - ); - }); - - it('does not prepend trigger when trigger pattern already matches', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ - text: '@Jonesy <@U_BOT_123> hello', - user: 'U_USER_456', - }); - await triggerMessageEvent(event); - - // Content should be unchanged since it already matches TRIGGER_PATTERN - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - content: '@Jonesy <@U_BOT_123> hello', - }), - ); - }); - - it('does not translate mentions in bot messages', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ - text: 'Echo: <@U_BOT_123>', - subtype: 'bot_message', - botId: 'B_MY_BOT', - }); - await triggerMessageEvent(event); - - // Bot messages skip mention translation - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - content: 'Echo: <@U_BOT_123>', - }), - ); - }); - - it('does not translate mentions for other users', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const event = createMessageEvent({ - text: 'Hey <@U_OTHER_USER> look at this', - user: 'U_USER_456', - }); - await triggerMessageEvent(event); - - // Mention is for a different user, not the bot - expect(opts.onMessage).toHaveBeenCalledWith( - 'slack:C0123456789', - expect.objectContaining({ - content: 'Hey <@U_OTHER_USER> look at this', - }), - ); - }); - 
}); - - // --- sendMessage --- - - describe('sendMessage', () => { - it('sends message via Slack client', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - await channel.sendMessage('slack:C0123456789', 'Hello'); - - expect(currentApp().client.chat.postMessage).toHaveBeenCalledWith({ - channel: 'C0123456789', - text: 'Hello', - }); - }); - - it('strips slack: prefix from JID', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - await channel.sendMessage('slack:D9876543210', 'DM message'); - - expect(currentApp().client.chat.postMessage).toHaveBeenCalledWith({ - channel: 'D9876543210', - text: 'DM message', - }); - }); - - it('queues message when disconnected', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - - // Don't connect — should queue - await channel.sendMessage('slack:C0123456789', 'Queued message'); - - expect(currentApp().client.chat.postMessage).not.toHaveBeenCalled(); - }); - - it('queues message on send failure', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - currentApp().client.chat.postMessage.mockRejectedValueOnce( - new Error('Network error'), - ); - - // Should not throw - await expect( - channel.sendMessage('slack:C0123456789', 'Will fail'), - ).resolves.toBeUndefined(); - }); - - it('splits long messages at 4000 character boundary', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - // Create a message longer than 4000 chars - const longText = 'A'.repeat(4500); - await channel.sendMessage('slack:C0123456789', longText); - - // Should be split into 2 messages: 4000 + 500 - expect(currentApp().client.chat.postMessage).toHaveBeenCalledTimes(2); - expect(currentApp().client.chat.postMessage).toHaveBeenNthCalledWith(1, { - channel: 
'C0123456789', - text: 'A'.repeat(4000), - }); - expect(currentApp().client.chat.postMessage).toHaveBeenNthCalledWith(2, { - channel: 'C0123456789', - text: 'A'.repeat(500), - }); - }); - - it('sends exactly-4000-char messages as a single message', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const text = 'B'.repeat(4000); - await channel.sendMessage('slack:C0123456789', text); - - expect(currentApp().client.chat.postMessage).toHaveBeenCalledTimes(1); - expect(currentApp().client.chat.postMessage).toHaveBeenCalledWith({ - channel: 'C0123456789', - text, - }); - }); - - it('splits messages into 3 parts when over 8000 chars', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - await channel.connect(); - - const longText = 'C'.repeat(8500); - await channel.sendMessage('slack:C0123456789', longText); - - // 4000 + 4000 + 500 = 3 messages - expect(currentApp().client.chat.postMessage).toHaveBeenCalledTimes(3); - }); - - it('flushes queued messages on connect', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - - // Queue messages while disconnected - await channel.sendMessage('slack:C0123456789', 'First queued'); - await channel.sendMessage('slack:C0123456789', 'Second queued'); - - expect(currentApp().client.chat.postMessage).not.toHaveBeenCalled(); - - // Connect triggers flush - await channel.connect(); - - expect(currentApp().client.chat.postMessage).toHaveBeenCalledWith({ - channel: 'C0123456789', - text: 'First queued', - }); - expect(currentApp().client.chat.postMessage).toHaveBeenCalledWith({ - channel: 'C0123456789', - text: 'Second queued', - }); - }); - }); - - // --- ownsJid --- - - describe('ownsJid', () => { - it('owns slack: JIDs', () => { - const channel = new SlackChannel(createTestOpts()); - expect(channel.ownsJid('slack:C0123456789')).toBe(true); - }); - - it('owns slack: DM JIDs', () => { - const 
channel = new SlackChannel(createTestOpts()); - expect(channel.ownsJid('slack:D0123456789')).toBe(true); - }); - - it('does not own WhatsApp group JIDs', () => { - const channel = new SlackChannel(createTestOpts()); - expect(channel.ownsJid('12345@g.us')).toBe(false); - }); - - it('does not own WhatsApp DM JIDs', () => { - const channel = new SlackChannel(createTestOpts()); - expect(channel.ownsJid('12345@s.whatsapp.net')).toBe(false); - }); - - it('does not own Telegram JIDs', () => { - const channel = new SlackChannel(createTestOpts()); - expect(channel.ownsJid('tg:123456')).toBe(false); - }); - - it('does not own unknown JID formats', () => { - const channel = new SlackChannel(createTestOpts()); - expect(channel.ownsJid('random-string')).toBe(false); - }); - }); - - // --- syncChannelMetadata --- - - describe('syncChannelMetadata', () => { - it('calls conversations.list and updates chat names', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - - currentApp().client.conversations.list.mockResolvedValue({ - channels: [ - { id: 'C001', name: 'general', is_member: true }, - { id: 'C002', name: 'random', is_member: true }, - { id: 'C003', name: 'external', is_member: false }, - ], - response_metadata: {}, - }); - - await channel.connect(); - - // connect() calls syncChannelMetadata internally - expect(updateChatName).toHaveBeenCalledWith('slack:C001', 'general'); - expect(updateChatName).toHaveBeenCalledWith('slack:C002', 'random'); - // Non-member channels are skipped - expect(updateChatName).not.toHaveBeenCalledWith('slack:C003', 'external'); - }); - - it('handles API errors gracefully', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - - currentApp().client.conversations.list.mockRejectedValue( - new Error('API error'), - ); - - // Should not throw - await expect(channel.connect()).resolves.toBeUndefined(); - }); - }); - - // --- setTyping --- - - describe('setTyping', () => { - 
it('resolves without error (no-op)', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - - // Should not throw — Slack has no bot typing indicator API - await expect( - channel.setTyping('slack:C0123456789', true), - ).resolves.toBeUndefined(); - }); - - it('accepts false without error', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - - await expect( - channel.setTyping('slack:C0123456789', false), - ).resolves.toBeUndefined(); - }); - }); - - // --- Constructor error handling --- - - describe('constructor', () => { - it('throws when SLACK_BOT_TOKEN is missing', () => { - vi.mocked(readEnvFile).mockReturnValueOnce({ - SLACK_BOT_TOKEN: '', - SLACK_APP_TOKEN: 'xapp-test-token', - }); - - expect(() => new SlackChannel(createTestOpts())).toThrow( - 'SLACK_BOT_TOKEN and SLACK_APP_TOKEN must be set in .env', - ); - }); - - it('throws when SLACK_APP_TOKEN is missing', () => { - vi.mocked(readEnvFile).mockReturnValueOnce({ - SLACK_BOT_TOKEN: 'xoxb-test-token', - SLACK_APP_TOKEN: '', - }); - - expect(() => new SlackChannel(createTestOpts())).toThrow( - 'SLACK_BOT_TOKEN and SLACK_APP_TOKEN must be set in .env', - ); - }); - }); - - // --- syncChannelMetadata pagination --- - - describe('syncChannelMetadata pagination', () => { - it('paginates through multiple pages of channels', async () => { - const opts = createTestOpts(); - const channel = new SlackChannel(opts); - - // First page returns a cursor; second page returns no cursor - currentApp().client.conversations.list - .mockResolvedValueOnce({ - channels: [ - { id: 'C001', name: 'general', is_member: true }, - ], - response_metadata: { next_cursor: 'cursor_page2' }, - }) - .mockResolvedValueOnce({ - channels: [ - { id: 'C002', name: 'random', is_member: true }, - ], - response_metadata: {}, - }); - - await channel.connect(); - - // Should have called conversations.list twice (once per page) - 
expect(currentApp().client.conversations.list).toHaveBeenCalledTimes(2); - expect(currentApp().client.conversations.list).toHaveBeenNthCalledWith(2, - expect.objectContaining({ cursor: 'cursor_page2' }), - ); - - // Both channels from both pages stored - expect(updateChatName).toHaveBeenCalledWith('slack:C001', 'general'); - expect(updateChatName).toHaveBeenCalledWith('slack:C002', 'random'); - }); - }); - - // --- Channel properties --- - - describe('channel properties', () => { - it('has name "slack"', () => { - const channel = new SlackChannel(createTestOpts()); - expect(channel.name).toBe('slack'); - }); - }); -}); diff --git a/.claude/skills/add-slack/add/src/channels/slack.ts b/.claude/skills/add-slack/add/src/channels/slack.ts deleted file mode 100644 index 81cc1ac..0000000 --- a/.claude/skills/add-slack/add/src/channels/slack.ts +++ /dev/null @@ -1,290 +0,0 @@ -import { App, LogLevel } from '@slack/bolt'; -import type { GenericMessageEvent, BotMessageEvent } from '@slack/types'; - -import { ASSISTANT_NAME, TRIGGER_PATTERN } from '../config.js'; -import { updateChatName } from '../db.js'; -import { readEnvFile } from '../env.js'; -import { logger } from '../logger.js'; -import { - Channel, - OnInboundMessage, - OnChatMetadata, - RegisteredGroup, -} from '../types.js'; - -// Slack's chat.postMessage API limits text to ~4000 characters per call. -// Messages exceeding this are split into sequential chunks. -const MAX_MESSAGE_LENGTH = 4000; - -// The message subtypes we process. Bolt delivers all subtypes via app.event('message'); -// we filter to regular messages (GenericMessageEvent, subtype undefined) and bot messages -// (BotMessageEvent, subtype 'bot_message') so we can track our own output. 
-type HandledMessageEvent = GenericMessageEvent | BotMessageEvent; - -export interface SlackChannelOpts { - onMessage: OnInboundMessage; - onChatMetadata: OnChatMetadata; - registeredGroups: () => Record<string, RegisteredGroup>; -} - -export class SlackChannel implements Channel { - name = 'slack'; - - private app: App; - private botUserId: string | undefined; - private connected = false; - private outgoingQueue: Array<{ jid: string; text: string }> = []; - private flushing = false; - private userNameCache = new Map<string, string>(); - - private opts: SlackChannelOpts; - - constructor(opts: SlackChannelOpts) { - this.opts = opts; - - // Read tokens from .env (not process.env — keeps secrets off the environment - // so they don't leak to child processes, matching NanoClaw's security pattern) - const env = readEnvFile(['SLACK_BOT_TOKEN', 'SLACK_APP_TOKEN']); - const botToken = env.SLACK_BOT_TOKEN; - const appToken = env.SLACK_APP_TOKEN; - - if (!botToken || !appToken) { - throw new Error( - 'SLACK_BOT_TOKEN and SLACK_APP_TOKEN must be set in .env', - ); - } - - this.app = new App({ - token: botToken, - appToken, - socketMode: true, - logLevel: LogLevel.ERROR, - }); - - this.setupEventHandlers(); - } - - private setupEventHandlers(): void { - // Use app.event('message') instead of app.message() to capture all - // message subtypes including bot_message (needed to track our own output) - this.app.event('message', async ({ event }) => { - // Bolt's event type is the full MessageEvent union (17+ subtypes). - // We filter on subtype first, then narrow to the two types we handle. - const subtype = (event as { subtype?: string }).subtype; - if (subtype && subtype !== 'bot_message') return; - - // After filtering, event is either GenericMessageEvent or BotMessageEvent - const msg = event as HandledMessageEvent; - - if (!msg.text) return; - - // Threaded replies are flattened into the channel conversation. 
- // The agent sees them alongside channel-level messages; responses - // always go to the channel, not back into the thread. - - const jid = `slack:${msg.channel}`; - const timestamp = new Date(parseFloat(msg.ts) * 1000).toISOString(); - const isGroup = msg.channel_type !== 'im'; - - // Always report metadata for group discovery - this.opts.onChatMetadata(jid, timestamp, undefined, 'slack', isGroup); - - // Only deliver full messages for registered groups - const groups = this.opts.registeredGroups(); - if (!groups[jid]) return; - - const isBotMessage = - !!msg.bot_id || msg.user === this.botUserId; - - let senderName: string; - if (isBotMessage) { - senderName = ASSISTANT_NAME; - } else { - senderName = - (await this.resolveUserName(msg.user)) || - msg.user || - 'unknown'; - } - - // Translate Slack <@UBOTID> mentions into TRIGGER_PATTERN format. - // Slack encodes @mentions as <@U12345>, which won't match TRIGGER_PATTERN - // (e.g., ^@<name>\b), so we prepend the trigger when the bot is @mentioned. - let content = msg.text; - if (this.botUserId && !isBotMessage) { - const mentionPattern = `<@${this.botUserId}>`; - if (content.includes(mentionPattern) && !TRIGGER_PATTERN.test(content)) { - content = `@${ASSISTANT_NAME} ${content}`; - } - } - - this.opts.onMessage(jid, { - id: msg.ts, - chat_jid: jid, - sender: msg.user || msg.bot_id || '', - sender_name: senderName, - content, - timestamp, - is_from_me: isBotMessage, - is_bot_message: isBotMessage, - }); - }); - } - - async connect(): Promise<void> { - await this.app.start(); - - // Get bot's own user ID for self-message detection. - // Resolve this BEFORE setting connected=true so that messages arriving - // during startup can correctly detect bot-sent messages. 
- try { - const auth = await this.app.client.auth.test(); - this.botUserId = auth.user_id as string; - logger.info({ botUserId: this.botUserId }, 'Connected to Slack'); - } catch (err) { - logger.warn( - { err }, - 'Connected to Slack but failed to get bot user ID', - ); - } - - this.connected = true; - - // Flush any messages queued before connection - await this.flushOutgoingQueue(); - - // Sync channel names on startup - await this.syncChannelMetadata(); - } - - async sendMessage(jid: string, text: string): Promise<void> { - const channelId = jid.replace(/^slack:/, ''); - - if (!this.connected) { - this.outgoingQueue.push({ jid, text }); - logger.info( - { jid, queueSize: this.outgoingQueue.length }, - 'Slack disconnected, message queued', - ); - return; - } - - try { - // Slack limits messages to ~4000 characters; split if needed - if (text.length <= MAX_MESSAGE_LENGTH) { - await this.app.client.chat.postMessage({ channel: channelId, text }); - } else { - for (let i = 0; i < text.length; i += MAX_MESSAGE_LENGTH) { - await this.app.client.chat.postMessage({ - channel: channelId, - text: text.slice(i, i + MAX_MESSAGE_LENGTH), - }); - } - } - logger.info({ jid, length: text.length }, 'Slack message sent'); - } catch (err) { - this.outgoingQueue.push({ jid, text }); - logger.warn( - { jid, err, queueSize: this.outgoingQueue.length }, - 'Failed to send Slack message, queued', - ); - } - } - - isConnected(): boolean { - return this.connected; - } - - ownsJid(jid: string): boolean { - return jid.startsWith('slack:'); - } - - async disconnect(): Promise<void> { - this.connected = false; - await this.app.stop(); - } - - // Slack does not expose a typing indicator API for bots. - // This no-op satisfies the Channel interface so the orchestrator - // doesn't need channel-specific branching. - async setTyping(_jid: string, _isTyping: boolean): Promise<void> { - // no-op: Slack Bot API has no typing indicator endpoint - } - - /** - * Sync channel metadata from Slack. 
- * Fetches channels the bot is a member of and stores their names in the DB. - */ - async syncChannelMetadata(): Promise<void> { - try { - logger.info('Syncing channel metadata from Slack...'); - let cursor: string | undefined; - let count = 0; - - do { - const result = await this.app.client.conversations.list({ - types: 'public_channel,private_channel', - exclude_archived: true, - limit: 200, - cursor, - }); - - for (const ch of result.channels || []) { - if (ch.id && ch.name && ch.is_member) { - updateChatName(`slack:${ch.id}`, ch.name); - count++; - } - } - - cursor = result.response_metadata?.next_cursor || undefined; - } while (cursor); - - logger.info({ count }, 'Slack channel metadata synced'); - } catch (err) { - logger.error({ err }, 'Failed to sync Slack channel metadata'); - } - } - - private async resolveUserName( - userId: string, - ): Promise<string | undefined> { - if (!userId) return undefined; - - const cached = this.userNameCache.get(userId); - if (cached) return cached; - - try { - const result = await this.app.client.users.info({ user: userId }); - const name = result.user?.real_name || result.user?.name; - if (name) this.userNameCache.set(userId, name); - return name; - } catch (err) { - logger.debug({ userId, err }, 'Failed to resolve Slack user name'); - return undefined; - } - } - - private async flushOutgoingQueue(): Promise<void> { - if (this.flushing || this.outgoingQueue.length === 0) return; - this.flushing = true; - try { - logger.info( - { count: this.outgoingQueue.length }, - 'Flushing Slack outgoing queue', - ); - while (this.outgoingQueue.length > 0) { - const item = this.outgoingQueue.shift()!; - const channelId = item.jid.replace(/^slack:/, ''); - await this.app.client.chat.postMessage({ - channel: channelId, - text: item.text, - }); - logger.info( - { jid: item.jid, length: item.text.length }, - 'Queued Slack message sent', - ); - } - } finally { - this.flushing = false; - } - } -} diff --git a/.claude/skills/add-slack/manifest.yaml 
b/.claude/skills/add-slack/manifest.yaml deleted file mode 100644 index 8320bb3..0000000 --- a/.claude/skills/add-slack/manifest.yaml +++ /dev/null @@ -1,21 +0,0 @@ -skill: slack -version: 1.0.0 -description: "Slack Bot integration via @slack/bolt with Socket Mode" -core_version: 0.1.0 -adds: - - src/channels/slack.ts - - src/channels/slack.test.ts -modifies: - - src/index.ts - - src/config.ts - - src/routing.test.ts -structured: - npm_dependencies: - "@slack/bolt": "^4.6.0" - env_additions: - - SLACK_BOT_TOKEN - - SLACK_APP_TOKEN - - SLACK_ONLY -conflicts: [] -depends: [] -test: "npx vitest run src/channels/slack.test.ts" diff --git a/.claude/skills/add-slack/modify/src/config.ts b/.claude/skills/add-slack/modify/src/config.ts deleted file mode 100644 index 1b59cf7..0000000 --- a/.claude/skills/add-slack/modify/src/config.ts +++ /dev/null @@ -1,75 +0,0 @@ -import path from 'path'; - -import { readEnvFile } from './env.js'; - -// Read config values from .env (falls back to process.env). -// Secrets are NOT read here — they stay on disk and are loaded only -// where needed (container-runner.ts) to avoid leaking to child processes. 
-const envConfig = readEnvFile([ - 'ASSISTANT_NAME', - 'ASSISTANT_HAS_OWN_NUMBER', - 'SLACK_ONLY', -]); - -export const ASSISTANT_NAME = - process.env.ASSISTANT_NAME || envConfig.ASSISTANT_NAME || 'Andy'; -export const ASSISTANT_HAS_OWN_NUMBER = - (process.env.ASSISTANT_HAS_OWN_NUMBER || envConfig.ASSISTANT_HAS_OWN_NUMBER) === 'true'; -export const POLL_INTERVAL = 2000; -export const SCHEDULER_POLL_INTERVAL = 60000; - -// Absolute paths needed for container mounts -const PROJECT_ROOT = process.cwd(); -const HOME_DIR = process.env.HOME || '/Users/user'; - -// Mount security: allowlist stored OUTSIDE project root, never mounted into containers -export const MOUNT_ALLOWLIST_PATH = path.join( - HOME_DIR, - '.config', - 'nanoclaw', - 'mount-allowlist.json', -); -export const STORE_DIR = path.resolve(PROJECT_ROOT, 'store'); -export const GROUPS_DIR = path.resolve(PROJECT_ROOT, 'groups'); -export const DATA_DIR = path.resolve(PROJECT_ROOT, 'data'); -export const MAIN_GROUP_FOLDER = 'main'; - -export const CONTAINER_IMAGE = - process.env.CONTAINER_IMAGE || 'nanoclaw-agent:latest'; -export const CONTAINER_TIMEOUT = parseInt( - process.env.CONTAINER_TIMEOUT || '1800000', - 10, -); -export const CONTAINER_MAX_OUTPUT_SIZE = parseInt( - process.env.CONTAINER_MAX_OUTPUT_SIZE || '10485760', - 10, -); // 10MB default -export const IPC_POLL_INTERVAL = 1000; -export const IDLE_TIMEOUT = parseInt( - process.env.IDLE_TIMEOUT || '1800000', - 10, -); // 30min default — how long to keep container alive after last result -export const MAX_CONCURRENT_CONTAINERS = Math.max( - 1, - parseInt(process.env.MAX_CONCURRENT_CONTAINERS || '5', 10) || 5, -); - -function escapeRegex(str: string): string { - return str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); -} - -export const TRIGGER_PATTERN = new RegExp( - `^@${escapeRegex(ASSISTANT_NAME)}\\b`, - 'i', -); - -// Timezone for scheduled tasks (cron expressions, etc.) 
-// Uses system timezone by default -export const TIMEZONE = - process.env.TZ || Intl.DateTimeFormat().resolvedOptions().timeZone; - -// Slack configuration -// SLACK_BOT_TOKEN and SLACK_APP_TOKEN are read directly by SlackChannel -// from .env via readEnvFile() to keep secrets off process.env. -export const SLACK_ONLY = - (process.env.SLACK_ONLY || envConfig.SLACK_ONLY) === 'true'; diff --git a/.claude/skills/add-slack/modify/src/config.ts.intent.md b/.claude/skills/add-slack/modify/src/config.ts.intent.md deleted file mode 100644 index b23def4..0000000 --- a/.claude/skills/add-slack/modify/src/config.ts.intent.md +++ /dev/null @@ -1,21 +0,0 @@ -# Intent: src/config.ts modifications - -## What changed -Added SLACK_ONLY configuration export for Slack channel support. - -## Key sections -- **readEnvFile call**: Must include `SLACK_ONLY` in the keys array. NanoClaw does NOT load `.env` into `process.env` — all `.env` values must be explicitly requested via `readEnvFile()`. -- **SLACK_ONLY**: Boolean flag from `process.env` or `envConfig`, when `true` disables WhatsApp channel creation -- **Note**: SLACK_BOT_TOKEN and SLACK_APP_TOKEN are NOT read here. They are read directly by SlackChannel via `readEnvFile()` in `slack.ts` to keep secrets off the config module entirely (same pattern as ANTHROPIC_API_KEY in container-runner.ts). - -## Invariants -- All existing config exports remain unchanged -- New Slack key is added to the `readEnvFile` call alongside existing keys -- New export is appended at the end of the file -- No existing behavior is modified — Slack config is additive only -- Both `process.env` and `envConfig` are checked (same pattern as `ASSISTANT_NAME`) - -## Must-keep -- All existing exports (`ASSISTANT_NAME`, `POLL_INTERVAL`, `TRIGGER_PATTERN`, etc.) 
-- The `readEnvFile` pattern — ALL config read from `.env` must go through this function
-- The `escapeRegex` helper and `TRIGGER_PATTERN` construction
diff --git a/.claude/skills/add-slack/modify/src/index.ts b/.claude/skills/add-slack/modify/src/index.ts
deleted file mode 100644
index 50212e1..0000000
--- a/.claude/skills/add-slack/modify/src/index.ts
+++ /dev/null
@@ -1,498 +0,0 @@
-import fs from 'fs';
-import path from 'path';
-
-import {
-  ASSISTANT_NAME,
-  DATA_DIR,
-  IDLE_TIMEOUT,
-  MAIN_GROUP_FOLDER,
-  POLL_INTERVAL,
-  SLACK_ONLY,
-  TRIGGER_PATTERN,
-} from './config.js';
-import { WhatsAppChannel } from './channels/whatsapp.js';
-import { SlackChannel } from './channels/slack.js';
-import {
-  ContainerOutput,
-  runContainerAgent,
-  writeGroupsSnapshot,
-  writeTasksSnapshot,
-} from './container-runner.js';
-import { cleanupOrphans, ensureContainerRuntimeRunning } from './container-runtime.js';
-import {
-  getAllChats,
-  getAllRegisteredGroups,
-  getAllSessions,
-  getAllTasks,
-  getMessagesSince,
-  getNewMessages,
-  getRouterState,
-  initDatabase,
-  setRegisteredGroup,
-  setRouterState,
-  setSession,
-  storeChatMetadata,
-  storeMessage,
-} from './db.js';
-import { GroupQueue } from './group-queue.js';
-import { startIpcWatcher } from './ipc.js';
-import { findChannel, formatMessages, formatOutbound } from './router.js';
-import { startSchedulerLoop } from './task-scheduler.js';
-import { Channel, NewMessage, RegisteredGroup } from './types.js';
-import { logger } from './logger.js';
-import { readEnvFile } from './env.js';
-
-// Re-export for backwards compatibility during refactor
-export { escapeXml, formatMessages } from './router.js';
-
-let lastTimestamp = '';
-let sessions: Record<string, string> = {};
-let registeredGroups: Record<string, RegisteredGroup> = {};
-let lastAgentTimestamp: Record<string, string> = {};
-let messageLoopRunning = false;
-
-let whatsapp: WhatsAppChannel;
-let slack: SlackChannel | undefined;
-const channels: Channel[] = [];
-const queue = new GroupQueue();
-
-function
loadState(): void {
-  lastTimestamp = getRouterState('last_timestamp') || '';
-  const agentTs = getRouterState('last_agent_timestamp');
-  try {
-    lastAgentTimestamp = agentTs ? JSON.parse(agentTs) : {};
-  } catch {
-    logger.warn('Corrupted last_agent_timestamp in DB, resetting');
-    lastAgentTimestamp = {};
-  }
-  sessions = getAllSessions();
-  registeredGroups = getAllRegisteredGroups();
-  logger.info(
-    { groupCount: Object.keys(registeredGroups).length },
-    'State loaded',
-  );
-}
-
-function saveState(): void {
-  setRouterState('last_timestamp', lastTimestamp);
-  setRouterState(
-    'last_agent_timestamp',
-    JSON.stringify(lastAgentTimestamp),
-  );
-}
-
-function registerGroup(jid: string, group: RegisteredGroup): void {
-  registeredGroups[jid] = group;
-  setRegisteredGroup(jid, group);
-
-  // Create group folder
-  const groupDir = path.join(DATA_DIR, '..', 'groups', group.folder);
-  fs.mkdirSync(path.join(groupDir, 'logs'), { recursive: true });
-
-  logger.info(
-    { jid, name: group.name, folder: group.folder },
-    'Group registered',
-  );
-}
-
-/**
- * Get available groups list for the agent.
- * Returns groups ordered by most recent activity.
- */
-export function getAvailableGroups(): import('./container-runner.js').AvailableGroup[] {
-  const chats = getAllChats();
-  const registeredJids = new Set(Object.keys(registeredGroups));
-
-  return chats
-    .filter((c) => c.jid !== '__group_sync__' && c.is_group)
-    .map((c) => ({
-      jid: c.jid,
-      name: c.name,
-      lastActivity: c.last_message_time,
-      isRegistered: registeredJids.has(c.jid),
-    }));
-}
-
-/** @internal - exported for testing */
-export function _setRegisteredGroups(groups: Record<string, RegisteredGroup>): void {
-  registeredGroups = groups;
-}
-
-/**
- * Process all pending messages for a group.
- * Called by the GroupQueue when it's this group's turn.
- */
-async function processGroupMessages(chatJid: string): Promise<boolean> {
-  const group = registeredGroups[chatJid];
-  if (!group) return true;
-
-  const channel = findChannel(channels, chatJid);
-  if (!channel) {
-    console.log(`Warning: no channel owns JID ${chatJid}, skipping messages`);
-    return true;
-  }
-
-  const isMainGroup = group.folder === MAIN_GROUP_FOLDER;
-
-  const sinceTimestamp = lastAgentTimestamp[chatJid] || '';
-  const missedMessages = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME);
-
-  if (missedMessages.length === 0) return true;
-
-  // For non-main groups, check if trigger is required and present
-  if (!isMainGroup && group.requiresTrigger !== false) {
-    const hasTrigger = missedMessages.some((m) =>
-      TRIGGER_PATTERN.test(m.content.trim()),
-    );
-    if (!hasTrigger) return true;
-  }
-
-  const prompt = formatMessages(missedMessages);
-
-  // Advance cursor so the piping path in startMessageLoop won't re-fetch
-  // these messages. Save the old cursor so we can roll back on error.
-  const previousCursor = lastAgentTimestamp[chatJid] || '';
-  lastAgentTimestamp[chatJid] =
-    missedMessages[missedMessages.length - 1].timestamp;
-  saveState();
-
-  logger.info(
-    { group: group.name, messageCount: missedMessages.length },
-    'Processing messages',
-  );
-
-  // Track idle timer for closing stdin when agent is idle
-  let idleTimer: ReturnType<typeof setTimeout> | null = null;
-
-  const resetIdleTimer = () => {
-    if (idleTimer) clearTimeout(idleTimer);
-    idleTimer = setTimeout(() => {
-      logger.debug({ group: group.name }, 'Idle timeout, closing container stdin');
-      queue.closeStdin(chatJid);
-    }, IDLE_TIMEOUT);
-  };
-
-  await channel.setTyping?.(chatJid, true);
-  let hadError = false;
-  let outputSentToUser = false;
-
-  const output = await runAgent(group, prompt, chatJid, async (result) => {
-    // Streaming output callback — called for each agent result
-    if (result.result) {
-      const raw = typeof result.result === 'string' ?
result.result : JSON.stringify(result.result);
-      // Strip <internal>...</internal> blocks — agent uses these for internal reasoning
-      const text = raw.replace(/<internal>[\s\S]*?<\/internal>/g, '').trim();
-      logger.info({ group: group.name }, `Agent output: ${raw.slice(0, 200)}`);
-      if (text) {
-        await channel.sendMessage(chatJid, text);
-        outputSentToUser = true;
-      }
-      // Only reset idle timer on actual results, not session-update markers (result: null)
-      resetIdleTimer();
-    }
-
-    if (result.status === 'error') {
-      hadError = true;
-    }
-  });
-
-  await channel.setTyping?.(chatJid, false);
-  if (idleTimer) clearTimeout(idleTimer);
-
-  if (output === 'error' || hadError) {
-    // If we already sent output to the user, don't roll back the cursor —
-    // the user got their response and re-processing would send duplicates.
-    if (outputSentToUser) {
-      logger.warn({ group: group.name }, 'Agent error after output was sent, skipping cursor rollback to prevent duplicates');
-      return true;
-    }
-    // Roll back cursor so retries can re-process these messages
-    lastAgentTimestamp[chatJid] = previousCursor;
-    saveState();
-    logger.warn({ group: group.name }, 'Agent error, rolled back message cursor for retry');
-    return false;
-  }
-
-  return true;
-}
-
-async function runAgent(
-  group: RegisteredGroup,
-  prompt: string,
-  chatJid: string,
-  onOutput?: (output: ContainerOutput) => Promise<void>,
-): Promise<'success' | 'error'> {
-  const isMain = group.folder === MAIN_GROUP_FOLDER;
-  const sessionId = sessions[group.folder];
-
-  // Update tasks snapshot for container to read (filtered by group)
-  const tasks = getAllTasks();
-  writeTasksSnapshot(
-    group.folder,
-    isMain,
-    tasks.map((t) => ({
-      id: t.id,
-      groupFolder: t.group_folder,
-      prompt: t.prompt,
-      schedule_type: t.schedule_type,
-      schedule_value: t.schedule_value,
-      status: t.status,
-      next_run: t.next_run,
-    })),
-  );
-
-  // Update available groups snapshot (main group only can see all groups)
-  const availableGroups = getAvailableGroups();
-
writeGroupsSnapshot(
-    group.folder,
-    isMain,
-    availableGroups,
-    new Set(Object.keys(registeredGroups)),
-  );
-
-  // Wrap onOutput to track session ID from streamed results
-  const wrappedOnOutput = onOutput
-    ? async (output: ContainerOutput) => {
-        if (output.newSessionId) {
-          sessions[group.folder] = output.newSessionId;
-          setSession(group.folder, output.newSessionId);
-        }
-        await onOutput(output);
-      }
-    : undefined;
-
-  try {
-    const output = await runContainerAgent(
-      group,
-      {
-        prompt,
-        sessionId,
-        groupFolder: group.folder,
-        chatJid,
-        isMain,
-      },
-      (proc, containerName) => queue.registerProcess(chatJid, proc, containerName, group.folder),
-      wrappedOnOutput,
-    );
-
-    if (output.newSessionId) {
-      sessions[group.folder] = output.newSessionId;
-      setSession(group.folder, output.newSessionId);
-    }
-
-    if (output.status === 'error') {
-      logger.error(
-        { group: group.name, error: output.error },
-        'Container agent error',
-      );
-      return 'error';
-    }
-
-    return 'success';
-  } catch (err) {
-    logger.error({ group: group.name, err }, 'Agent error');
-    return 'error';
-  }
-}
-
-async function startMessageLoop(): Promise<void> {
-  if (messageLoopRunning) {
-    logger.debug('Message loop already running, skipping duplicate start');
-    return;
-  }
-  messageLoopRunning = true;
-
-  logger.info(`NanoClaw running (trigger: @${ASSISTANT_NAME})`);
-
-  while (true) {
-    try {
-      const jids = Object.keys(registeredGroups);
-      const { messages, newTimestamp } = getNewMessages(jids, lastTimestamp, ASSISTANT_NAME);
-
-      if (messages.length > 0) {
-        logger.info({ count: messages.length }, 'New messages');
-
-        // Advance the "seen" cursor for all messages immediately
-        lastTimestamp = newTimestamp;
-        saveState();
-
-        // Deduplicate by group
-        const messagesByGroup = new Map<string, NewMessage[]>();
-        for (const msg of messages) {
-          const existing = messagesByGroup.get(msg.chat_jid);
-          if (existing) {
-            existing.push(msg);
-          } else {
-            messagesByGroup.set(msg.chat_jid, [msg]);
-          }
-        }
-
-        for (const [chatJid,
groupMessages] of messagesByGroup) { - const group = registeredGroups[chatJid]; - if (!group) continue; - - const channel = findChannel(channels, chatJid); - if (!channel) { - console.log(`Warning: no channel owns JID ${chatJid}, skipping messages`); - continue; - } - - const isMainGroup = group.folder === MAIN_GROUP_FOLDER; - const needsTrigger = !isMainGroup && group.requiresTrigger !== false; - - // For non-main groups, only act on trigger messages. - // Non-trigger messages accumulate in DB and get pulled as - // context when a trigger eventually arrives. - if (needsTrigger) { - const hasTrigger = groupMessages.some((m) => - TRIGGER_PATTERN.test(m.content.trim()), - ); - if (!hasTrigger) continue; - } - - // Pull all messages since lastAgentTimestamp so non-trigger - // context that accumulated between triggers is included. - const allPending = getMessagesSince( - chatJid, - lastAgentTimestamp[chatJid] || '', - ASSISTANT_NAME, - ); - const messagesToSend = - allPending.length > 0 ? allPending : groupMessages; - const formatted = formatMessages(messagesToSend); - - if (queue.sendMessage(chatJid, formatted)) { - logger.debug( - { chatJid, count: messagesToSend.length }, - 'Piped messages to active container', - ); - lastAgentTimestamp[chatJid] = - messagesToSend[messagesToSend.length - 1].timestamp; - saveState(); - // Show typing indicator while the container processes the piped message - channel.setTyping?.(chatJid, true); - } else { - // No active container — enqueue for a new one - queue.enqueueMessageCheck(chatJid); - } - } - } - } catch (err) { - logger.error({ err }, 'Error in message loop'); - } - await new Promise((resolve) => setTimeout(resolve, POLL_INTERVAL)); - } -} - -/** - * Startup recovery: check for unprocessed messages in registered groups. - * Handles crash between advancing lastTimestamp and processing messages. 
- */
-function recoverPendingMessages(): void {
-  for (const [chatJid, group] of Object.entries(registeredGroups)) {
-    const sinceTimestamp = lastAgentTimestamp[chatJid] || '';
-    const pending = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME);
-    if (pending.length > 0) {
-      logger.info(
-        { group: group.name, pendingCount: pending.length },
-        'Recovery: found unprocessed messages',
-      );
-      queue.enqueueMessageCheck(chatJid);
-    }
-  }
-}
-
-function ensureContainerSystemRunning(): void {
-  ensureContainerRuntimeRunning();
-  cleanupOrphans();
-}
-
-async function main(): Promise<void> {
-  ensureContainerSystemRunning();
-  initDatabase();
-  logger.info('Database initialized');
-  loadState();
-
-  // Graceful shutdown handlers
-  const shutdown = async (signal: string) => {
-    logger.info({ signal }, 'Shutdown signal received');
-    await queue.shutdown(10000);
-    for (const ch of channels) await ch.disconnect();
-    process.exit(0);
-  };
-  process.on('SIGTERM', () => shutdown('SIGTERM'));
-  process.on('SIGINT', () => shutdown('SIGINT'));
-
-  // Channel callbacks (shared by all channels)
-  const channelOpts = {
-    onMessage: (_chatJid: string, msg: NewMessage) => storeMessage(msg),
-    onChatMetadata: (chatJid: string, timestamp: string, name?: string, channel?: string, isGroup?: boolean) =>
-      storeChatMetadata(chatJid, timestamp, name, channel, isGroup),
-    registeredGroups: () => registeredGroups,
-  };
-
-  // Create and connect channels
-  // Check if Slack tokens are configured
-  const slackEnv = readEnvFile(['SLACK_BOT_TOKEN', 'SLACK_APP_TOKEN']);
-  const hasSlackTokens = !!(slackEnv.SLACK_BOT_TOKEN && slackEnv.SLACK_APP_TOKEN);
-
-  if (!SLACK_ONLY) {
-    whatsapp = new WhatsAppChannel(channelOpts);
-    channels.push(whatsapp);
-    await whatsapp.connect();
-  }
-
-  if (hasSlackTokens) {
-    slack = new SlackChannel(channelOpts);
-    channels.push(slack);
-    await slack.connect();
-  }
-
-  // Start subsystems (independently of connection handler)
-  startSchedulerLoop({
-    registeredGroups:
() => registeredGroups, - getSessions: () => sessions, - queue, - onProcess: (groupJid, proc, containerName, groupFolder) => queue.registerProcess(groupJid, proc, containerName, groupFolder), - sendMessage: async (jid, rawText) => { - const channel = findChannel(channels, jid); - if (!channel) { - console.log(`Warning: no channel owns JID ${jid}, cannot send message`); - return; - } - const text = formatOutbound(rawText); - if (text) await channel.sendMessage(jid, text); - }, - }); - startIpcWatcher({ - sendMessage: (jid, text) => { - const channel = findChannel(channels, jid); - if (!channel) throw new Error(`No channel for JID: ${jid}`); - return channel.sendMessage(jid, text); - }, - registeredGroups: () => registeredGroups, - registerGroup, - syncGroupMetadata: async (force) => { - // Sync metadata across all active channels - if (whatsapp) await whatsapp.syncGroupMetadata(force); - if (slack) await slack.syncChannelMetadata(); - }, - getAvailableGroups, - writeGroupsSnapshot: (gf, im, ag, rj) => writeGroupsSnapshot(gf, im, ag, rj), - }); - queue.setProcessMessagesFn(processGroupMessages); - recoverPendingMessages(); - startMessageLoop(); -} - -// Guard: only run when executed directly, not when imported by tests -const isDirectRun = - process.argv[1] && - new URL(import.meta.url).pathname === new URL(`file://${process.argv[1]}`).pathname; - -if (isDirectRun) { - main().catch((err) => { - logger.error({ err }, 'Failed to start NanoClaw'); - process.exit(1); - }); -} diff --git a/.claude/skills/add-slack/modify/src/index.ts.intent.md b/.claude/skills/add-slack/modify/src/index.ts.intent.md deleted file mode 100644 index 8412843..0000000 --- a/.claude/skills/add-slack/modify/src/index.ts.intent.md +++ /dev/null @@ -1,60 +0,0 @@ -# Intent: src/index.ts modifications - -## What changed -Refactored from single WhatsApp channel to multi-channel architecture supporting Slack alongside WhatsApp. 
- -## Key sections - -### Imports (top of file) -- Added: `SlackChannel` from `./channels/slack.js` -- Added: `SLACK_ONLY` from `./config.js` -- Added: `readEnvFile` from `./env.js` -- Existing: `findChannel` from `./router.js` and `Channel` type from `./types.js` are already present - -### Module-level state -- Kept: `let whatsapp: WhatsAppChannel` — still needed for `syncGroupMetadata` reference -- Added: `let slack: SlackChannel | undefined` — direct reference for `syncChannelMetadata` -- Kept: `const channels: Channel[] = []` — array of all active channels - -### processGroupMessages() -- Uses `findChannel(channels, chatJid)` lookup (already exists in base) -- Uses `channel.setTyping?.()` and `channel.sendMessage()` (already exists in base) - -### startMessageLoop() -- Uses `findChannel(channels, chatJid)` per group (already exists in base) -- Uses `channel.setTyping?.()` for typing indicators (already exists in base) - -### main() -- Added: Reads Slack tokens via `readEnvFile()` to check if Slack is configured -- Added: conditional WhatsApp creation (`if (!SLACK_ONLY)`) -- Added: conditional Slack creation (`if (hasSlackTokens)`) -- Changed: scheduler `sendMessage` uses `findChannel()` → `channel.sendMessage()` -- Changed: IPC `syncGroupMetadata` syncs both WhatsApp and Slack metadata -- Changed: IPC `sendMessage` uses `findChannel()` → `channel.sendMessage()` - -### Shutdown handler -- Changed from `await whatsapp.disconnect()` to `for (const ch of channels) await ch.disconnect()` -- Disconnects all active channels (WhatsApp, Slack, or any future channels) on SIGTERM/SIGINT - -## Invariants -- All existing message processing logic (triggers, cursors, idle timers) is preserved -- The `runAgent` function is completely unchanged -- State management (loadState/saveState) is unchanged -- Recovery logic is unchanged -- Container runtime check is unchanged (ensureContainerSystemRunning) - -## Design decisions - -### Double readEnvFile for Slack tokens -`main()` in 
index.ts reads `SLACK_BOT_TOKEN`/`SLACK_APP_TOKEN` via `readEnvFile()` to check -whether Slack is configured (controls whether to instantiate SlackChannel). The SlackChannel -constructor reads them again independently. This is intentional — index.ts needs to decide -*whether* to create the channel, while SlackChannel needs the actual token values. Keeping -both reads follows the security pattern of not passing secrets through intermediate variables. - -## Must-keep -- The `escapeXml` and `formatMessages` re-exports -- The `_setRegisteredGroups` test helper -- The `isDirectRun` guard at bottom -- All error handling and cursor rollback logic in processGroupMessages -- The outgoing queue flush and reconnection logic (in each channel, not here) diff --git a/.claude/skills/add-slack/modify/src/routing.test.ts b/.claude/skills/add-slack/modify/src/routing.test.ts deleted file mode 100644 index 3a7f7ff..0000000 --- a/.claude/skills/add-slack/modify/src/routing.test.ts +++ /dev/null @@ -1,161 +0,0 @@ -import { describe, it, expect, beforeEach } from 'vitest'; - -import { _initTestDatabase, getAllChats, storeChatMetadata } from './db.js'; -import { getAvailableGroups, _setRegisteredGroups } from './index.js'; - -beforeEach(() => { - _initTestDatabase(); - _setRegisteredGroups({}); -}); - -// --- JID ownership patterns --- - -describe('JID ownership patterns', () => { - // These test the patterns that will become ownsJid() on the Channel interface - - it('WhatsApp group JID: ends with @g.us', () => { - const jid = '12345678@g.us'; - expect(jid.endsWith('@g.us')).toBe(true); - }); - - it('WhatsApp DM JID: ends with @s.whatsapp.net', () => { - const jid = '12345678@s.whatsapp.net'; - expect(jid.endsWith('@s.whatsapp.net')).toBe(true); - }); - - it('Slack channel JID: starts with slack:', () => { - const jid = 'slack:C0123456789'; - expect(jid.startsWith('slack:')).toBe(true); - }); - - it('Slack DM JID: starts with slack:D', () => { - const jid = 'slack:D0123456789'; - 
expect(jid.startsWith('slack:')).toBe(true); - }); -}); - -// --- getAvailableGroups --- - -describe('getAvailableGroups', () => { - it('returns only groups, excludes DMs', () => { - storeChatMetadata('group1@g.us', '2024-01-01T00:00:01.000Z', 'Group 1', 'whatsapp', true); - storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false); - storeChatMetadata('group2@g.us', '2024-01-01T00:00:03.000Z', 'Group 2', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(2); - expect(groups.map((g) => g.jid)).toContain('group1@g.us'); - expect(groups.map((g) => g.jid)).toContain('group2@g.us'); - expect(groups.map((g) => g.jid)).not.toContain('user@s.whatsapp.net'); - }); - - it('excludes __group_sync__ sentinel', () => { - storeChatMetadata('__group_sync__', '2024-01-01T00:00:00.000Z'); - storeChatMetadata('group@g.us', '2024-01-01T00:00:01.000Z', 'Group', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('group@g.us'); - }); - - it('marks registered groups correctly', () => { - storeChatMetadata('reg@g.us', '2024-01-01T00:00:01.000Z', 'Registered', 'whatsapp', true); - storeChatMetadata('unreg@g.us', '2024-01-01T00:00:02.000Z', 'Unregistered', 'whatsapp', true); - - _setRegisteredGroups({ - 'reg@g.us': { - name: 'Registered', - folder: 'registered', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - }); - - const groups = getAvailableGroups(); - const reg = groups.find((g) => g.jid === 'reg@g.us'); - const unreg = groups.find((g) => g.jid === 'unreg@g.us'); - - expect(reg?.isRegistered).toBe(true); - expect(unreg?.isRegistered).toBe(false); - }); - - it('returns groups ordered by most recent activity', () => { - storeChatMetadata('old@g.us', '2024-01-01T00:00:01.000Z', 'Old', 'whatsapp', true); - storeChatMetadata('new@g.us', '2024-01-01T00:00:05.000Z', 'New', 'whatsapp', true); - storeChatMetadata('mid@g.us', 
'2024-01-01T00:00:03.000Z', 'Mid', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups[0].jid).toBe('new@g.us'); - expect(groups[1].jid).toBe('mid@g.us'); - expect(groups[2].jid).toBe('old@g.us'); - }); - - it('excludes non-group chats regardless of JID format', () => { - // Unknown JID format stored without is_group should not appear - storeChatMetadata('unknown-format-123', '2024-01-01T00:00:01.000Z', 'Unknown'); - // Explicitly non-group with unusual JID - storeChatMetadata('custom:abc', '2024-01-01T00:00:02.000Z', 'Custom DM', 'custom', false); - // A real group for contrast - storeChatMetadata('group@g.us', '2024-01-01T00:00:03.000Z', 'Group', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('group@g.us'); - }); - - it('returns empty array when no chats exist', () => { - const groups = getAvailableGroups(); - expect(groups).toHaveLength(0); - }); - - it('includes Slack channel JIDs', () => { - storeChatMetadata('slack:C0123456789', '2024-01-01T00:00:01.000Z', 'Slack Channel', 'slack', true); - storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('slack:C0123456789'); - }); - - it('returns Slack DM JIDs as groups when is_group is true', () => { - storeChatMetadata('slack:D0123456789', '2024-01-01T00:00:01.000Z', 'Slack DM', 'slack', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('slack:D0123456789'); - expect(groups[0].name).toBe('Slack DM'); - }); - - it('marks registered Slack channels correctly', () => { - storeChatMetadata('slack:C0123456789', '2024-01-01T00:00:01.000Z', 'Slack Registered', 'slack', true); - storeChatMetadata('slack:C9999999999', '2024-01-01T00:00:02.000Z', 'Slack Unregistered', 'slack', true); - - _setRegisteredGroups({ - 
'slack:C0123456789': { - name: 'Slack Registered', - folder: 'slack-registered', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - }); - - const groups = getAvailableGroups(); - const slackReg = groups.find((g) => g.jid === 'slack:C0123456789'); - const slackUnreg = groups.find((g) => g.jid === 'slack:C9999999999'); - - expect(slackReg?.isRegistered).toBe(true); - expect(slackUnreg?.isRegistered).toBe(false); - }); - - it('mixes WhatsApp and Slack chats ordered by activity', () => { - storeChatMetadata('wa@g.us', '2024-01-01T00:00:01.000Z', 'WhatsApp', 'whatsapp', true); - storeChatMetadata('slack:C100', '2024-01-01T00:00:03.000Z', 'Slack', 'slack', true); - storeChatMetadata('wa2@g.us', '2024-01-01T00:00:02.000Z', 'WhatsApp 2', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(3); - expect(groups[0].jid).toBe('slack:C100'); - expect(groups[1].jid).toBe('wa2@g.us'); - expect(groups[2].jid).toBe('wa@g.us'); - }); -}); diff --git a/.claude/skills/add-slack/modify/src/routing.test.ts.intent.md b/.claude/skills/add-slack/modify/src/routing.test.ts.intent.md deleted file mode 100644 index a03ba99..0000000 --- a/.claude/skills/add-slack/modify/src/routing.test.ts.intent.md +++ /dev/null @@ -1,17 +0,0 @@ -# Intent: src/routing.test.ts modifications - -## What changed -Added Slack JID pattern tests and Slack-specific getAvailableGroups tests. 
- -## Key sections -- **JID ownership patterns**: Added Slack channel JID (`slack:C...`) and Slack DM JID (`slack:D...`) pattern tests -- **getAvailableGroups**: Added tests for Slack channel inclusion, Slack DM handling, registered Slack channels, and mixed WhatsApp + Slack ordering - -## Invariants -- All existing WhatsApp JID pattern tests remain unchanged -- All existing getAvailableGroups tests remain unchanged -- New tests follow the same patterns as existing tests - -## Must-keep -- All existing WhatsApp tests (group JID, DM JID patterns) -- All existing getAvailableGroups tests (DM exclusion, sentinel exclusion, registration, ordering, non-group exclusion, empty array) diff --git a/.claude/skills/add-slack/tests/slack.test.ts b/.claude/skills/add-slack/tests/slack.test.ts deleted file mode 100644 index 7e8d946..0000000 --- a/.claude/skills/add-slack/tests/slack.test.ts +++ /dev/null @@ -1,171 +0,0 @@ -import { describe, expect, it } from 'vitest'; -import fs from 'fs'; -import path from 'path'; - -describe('slack skill package', () => { - const skillDir = path.resolve(__dirname, '..'); - - it('has a valid manifest', () => { - const manifestPath = path.join(skillDir, 'manifest.yaml'); - expect(fs.existsSync(manifestPath)).toBe(true); - - const content = fs.readFileSync(manifestPath, 'utf-8'); - expect(content).toContain('skill: slack'); - expect(content).toContain('version: 1.0.0'); - expect(content).toContain('@slack/bolt'); - }); - - it('has all files declared in adds', () => { - const addFile = path.join(skillDir, 'add', 'src', 'channels', 'slack.ts'); - expect(fs.existsSync(addFile)).toBe(true); - - const content = fs.readFileSync(addFile, 'utf-8'); - expect(content).toContain('class SlackChannel'); - expect(content).toContain('implements Channel'); - - // Test file for the channel - const testFile = path.join(skillDir, 'add', 'src', 'channels', 'slack.test.ts'); - expect(fs.existsSync(testFile)).toBe(true); - - const testContent = 
fs.readFileSync(testFile, 'utf-8'); - expect(testContent).toContain("describe('SlackChannel'"); - }); - - it('has all files declared in modifies', () => { - const indexFile = path.join(skillDir, 'modify', 'src', 'index.ts'); - const configFile = path.join(skillDir, 'modify', 'src', 'config.ts'); - const routingTestFile = path.join(skillDir, 'modify', 'src', 'routing.test.ts'); - - expect(fs.existsSync(indexFile)).toBe(true); - expect(fs.existsSync(configFile)).toBe(true); - expect(fs.existsSync(routingTestFile)).toBe(true); - - const indexContent = fs.readFileSync(indexFile, 'utf-8'); - expect(indexContent).toContain('SlackChannel'); - expect(indexContent).toContain('SLACK_ONLY'); - expect(indexContent).toContain('findChannel'); - expect(indexContent).toContain('channels: Channel[]'); - - const configContent = fs.readFileSync(configFile, 'utf-8'); - expect(configContent).toContain('SLACK_ONLY'); - }); - - it('has intent files for modified files', () => { - expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'index.ts.intent.md'))).toBe(true); - expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'config.ts.intent.md'))).toBe(true); - expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'routing.test.ts.intent.md'))).toBe(true); - }); - - it('has setup documentation', () => { - expect(fs.existsSync(path.join(skillDir, 'SKILL.md'))).toBe(true); - expect(fs.existsSync(path.join(skillDir, 'SLACK_SETUP.md'))).toBe(true); - }); - - it('modified index.ts preserves core structure', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'index.ts'), - 'utf-8', - ); - - // Core functions still present - expect(content).toContain('function loadState()'); - expect(content).toContain('function saveState()'); - expect(content).toContain('function registerGroup('); - expect(content).toContain('function getAvailableGroups()'); - expect(content).toContain('function processGroupMessages('); - expect(content).toContain('function 
runAgent('); - expect(content).toContain('function startMessageLoop()'); - expect(content).toContain('function recoverPendingMessages()'); - expect(content).toContain('function ensureContainerSystemRunning()'); - expect(content).toContain('async function main()'); - - // Test helper preserved - expect(content).toContain('_setRegisteredGroups'); - - // Direct-run guard preserved - expect(content).toContain('isDirectRun'); - }); - - it('modified index.ts includes Slack channel creation', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'index.ts'), - 'utf-8', - ); - - // Multi-channel architecture - expect(content).toContain('const channels: Channel[] = []'); - expect(content).toContain('channels.push(whatsapp)'); - expect(content).toContain('channels.push(slack)'); - - // Conditional channel creation - expect(content).toContain('if (!SLACK_ONLY)'); - expect(content).toContain('new SlackChannel(channelOpts)'); - - // Shutdown disconnects all channels - expect(content).toContain('for (const ch of channels) await ch.disconnect()'); - }); - - it('modified config.ts preserves all existing exports', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'config.ts'), - 'utf-8', - ); - - // All original exports preserved - expect(content).toContain('export const ASSISTANT_NAME'); - expect(content).toContain('export const POLL_INTERVAL'); - expect(content).toContain('export const TRIGGER_PATTERN'); - expect(content).toContain('export const CONTAINER_IMAGE'); - expect(content).toContain('export const DATA_DIR'); - expect(content).toContain('export const TIMEZONE'); - - // Slack config added - expect(content).toContain('export const SLACK_ONLY'); - }); - - it('modified routing.test.ts includes Slack JID tests', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'routing.test.ts'), - 'utf-8', - ); - - // Slack JID pattern tests - expect(content).toContain('slack:C'); - 
expect(content).toContain('slack:D'); - - // Mixed ordering test - expect(content).toContain('mixes WhatsApp and Slack'); - - // All original WhatsApp tests preserved - expect(content).toContain('@g.us'); - expect(content).toContain('@s.whatsapp.net'); - expect(content).toContain('__group_sync__'); - }); - - it('slack.ts implements required Channel interface methods', () => { - const content = fs.readFileSync( - path.join(skillDir, 'add', 'src', 'channels', 'slack.ts'), - 'utf-8', - ); - - // Channel interface methods - expect(content).toContain('async connect()'); - expect(content).toContain('async sendMessage('); - expect(content).toContain('isConnected()'); - expect(content).toContain('ownsJid('); - expect(content).toContain('async disconnect()'); - expect(content).toContain('async setTyping('); - - // Security pattern: reads tokens from .env, not process.env - expect(content).toContain('readEnvFile'); - expect(content).not.toContain('process.env.SLACK_BOT_TOKEN'); - expect(content).not.toContain('process.env.SLACK_APP_TOKEN'); - - // Key behaviors - expect(content).toContain('socketMode: true'); - expect(content).toContain('MAX_MESSAGE_LENGTH'); - expect(content).toContain('thread_ts'); - expect(content).toContain('TRIGGER_PATTERN'); - expect(content).toContain('userNameCache'); - }); -}); diff --git a/.claude/skills/add-telegram/SKILL.md b/.claude/skills/add-telegram/SKILL.md index 8c941fa..10f25ab 100644 --- a/.claude/skills/add-telegram/SKILL.md +++ b/.claude/skills/add-telegram/SKILL.md @@ -5,68 +5,65 @@ description: Add Telegram as a channel. Can replace WhatsApp entirely or run alo # Add Telegram Channel -This skill adds Telegram support to NanoClaw using the skills engine for deterministic code changes, then walks through interactive setup. +This skill adds Telegram support to NanoClaw, then walks through interactive setup. ## Phase 1: Pre-flight ### Check if already applied -Read `.nanoclaw/state.yaml`. 
If `telegram` is in `applied_skills`, skip to Phase 3 (Setup). The code changes are already in place. +Check if `src/channels/telegram.ts` exists. If it does, skip to Phase 3 (Setup). The code changes are already in place. ### Ask the user Use `AskUserQuestion` to collect configuration: -AskUserQuestion: Should Telegram replace WhatsApp or run alongside it? -- **Replace WhatsApp** - Telegram will be the only channel (sets TELEGRAM_ONLY=true) -- **Alongside** - Both Telegram and WhatsApp channels active - AskUserQuestion: Do you have a Telegram bot token, or do you need to create one? If they have one, collect it now. If not, we'll create one in Phase 3. ## Phase 2: Apply Code Changes -Run the skills engine to apply this skill's code package. The package files are in this directory alongside this SKILL.md. - -### Initialize skills system (if needed) - -If `.nanoclaw/` directory doesn't exist yet: +### Ensure channel remote ```bash -npx tsx scripts/apply-skill.ts --init +git remote -v ``` -Or call `initSkillsSystem()` from `skills-engine/migrate.ts`. 
- -### Apply the skill +If `telegram` is missing, add it: ```bash -npx tsx scripts/apply-skill.ts .claude/skills/add-telegram +git remote add telegram https://github.com/qwibitai/nanoclaw-telegram.git ``` -This deterministically: -- Adds `src/channels/telegram.ts` (TelegramChannel class implementing Channel interface) -- Adds `src/channels/telegram.test.ts` (46 unit tests) -- Three-way merges Telegram support into `src/index.ts` (multi-channel support, findChannel routing) -- Three-way merges Telegram config into `src/config.ts` (TELEGRAM_BOT_TOKEN, TELEGRAM_ONLY exports) -- Three-way merges updated routing tests into `src/routing.test.ts` -- Installs the `grammy` npm dependency -- Updates `.env.example` with `TELEGRAM_BOT_TOKEN` and `TELEGRAM_ONLY` -- Records the application in `.nanoclaw/state.yaml` +### Merge the skill branch -If the apply reports merge conflicts, read the intent files: -- `modify/src/index.ts.intent.md` — what changed and invariants for index.ts -- `modify/src/config.ts.intent.md` — what changed for config.ts +```bash +git fetch telegram main +git merge telegram/main || { + git checkout --theirs package-lock.json + git add package-lock.json + git merge --continue +} +``` + +This merges in: +- `src/channels/telegram.ts` (TelegramChannel class with self-registration via `registerChannel`) +- `src/channels/telegram.test.ts` (unit tests with grammy mock) +- `import './telegram.js'` appended to the channel barrel file `src/channels/index.ts` +- `grammy` npm dependency in `package.json` +- `TELEGRAM_BOT_TOKEN` in `.env.example` + +If the merge reports conflicts, resolve them by reading the conflicted files and understanding the intent of both sides. ### Validate code changes ```bash -npm test +npm install npm run build +npx vitest run src/channels/telegram.test.ts ``` -All tests must pass (including the new telegram tests) and build must be clean before proceeding. 
+All tests must pass (including the new Telegram tests) and build must be clean before proceeding.

## Phase 3: Setup

@@ -92,11 +89,7 @@ Add to `.env`:
TELEGRAM_BOT_TOKEN=
```

-If they chose to replace WhatsApp:
-
-```bash
-TELEGRAM_ONLY=true
-```
+Channels auto-enable when their credentials are present — no extra configuration needed.

Sync to container environment:

@@ -140,30 +133,18 @@ Wait for the user to provide the chat ID (format: `tg:123456789` or `tg:-1001234

### Register the chat

-Use the IPC register flow or register directly. The chat ID, name, and folder name are needed.
+The chat ID, name, and folder name are needed. Use `npx tsx setup/index.ts --step register` with the appropriate flags.

-For a main chat (responds to all messages, uses the `main` folder):
+For a main chat (responds to all messages):

-```typescript
-registerGroup("tg:<chat_id>", {
-  name: "<chat name>",
-  folder: "main",
-  trigger: `@${ASSISTANT_NAME}`,
-  added_at: new Date().toISOString(),
-  requiresTrigger: false,
-});
+```bash
+npx tsx setup/index.ts --step register -- --jid "tg:<chat_id>" --name "<chat name>" --folder "telegram_main" --trigger "@${ASSISTANT_NAME}" --channel telegram --no-trigger-required --is-main
```

For additional chats (trigger-only):

-```typescript
-registerGroup("tg:<chat_id>", {
-  name: "<chat name>",
-  folder: "<folder>",
-  trigger: `@${ASSISTANT_NAME}`,
-  added_at: new Date().toISOString(),
-  requiresTrigger: true,
-});
+```bash
+npx tsx setup/index.ts --step register -- --jid "tg:<chat_id>" --name "<chat name>" --folder "telegram_<folder>" --trigger "@${ASSISTANT_NAME}" --channel telegram
```

## Phase 5: Verify

@@ -233,11 +214,9 @@ If they say yes, invoke the `/add-telegram-swarm` skill.

To remove Telegram integration:

-1. Delete `src/channels/telegram.ts`
-2. Remove `TelegramChannel` import and creation from `src/index.ts`
-3. Remove `channels` array and revert to using `whatsapp` directly in `processGroupMessages`, scheduler deps, and IPC deps
-4. Revert `getAvailableGroups()` filter to only include `@g.us` chats
-5.
Remove Telegram config (`TELEGRAM_BOT_TOKEN`, `TELEGRAM_ONLY`) from `src/config.ts` -6. Remove Telegram registrations from SQLite: `sqlite3 store/messages.db "DELETE FROM registered_groups WHERE jid LIKE 'tg:%'"` -7. Uninstall: `npm uninstall grammy` -8. Rebuild: `npm run build && launchctl kickstart -k gui/$(id -u)/com.nanoclaw` (macOS) or `npm run build && systemctl --user restart nanoclaw` (Linux) +1. Delete `src/channels/telegram.ts` and `src/channels/telegram.test.ts` +2. Remove `import './telegram.js'` from `src/channels/index.ts` +3. Remove `TELEGRAM_BOT_TOKEN` from `.env` +4. Remove Telegram registrations from SQLite: `sqlite3 store/messages.db "DELETE FROM registered_groups WHERE jid LIKE 'tg:%'"` +5. Uninstall: `npm uninstall grammy` +6. Rebuild: `npm run build && launchctl kickstart -k gui/$(id -u)/com.nanoclaw` (macOS) or `npm run build && systemctl --user restart nanoclaw` (Linux) diff --git a/.claude/skills/add-telegram/add/src/channels/telegram.test.ts b/.claude/skills/add-telegram/add/src/channels/telegram.test.ts deleted file mode 100644 index 950b607..0000000 --- a/.claude/skills/add-telegram/add/src/channels/telegram.test.ts +++ /dev/null @@ -1,926 +0,0 @@ -import { describe, it, expect, beforeEach, vi, afterEach } from 'vitest'; - -// --- Mocks --- - -// Mock config -vi.mock('../config.js', () => ({ - ASSISTANT_NAME: 'Andy', - TRIGGER_PATTERN: /^@Andy\b/i, -})); - -// Mock logger -vi.mock('../logger.js', () => ({ - logger: { - debug: vi.fn(), - info: vi.fn(), - warn: vi.fn(), - error: vi.fn(), - }, -})); - -// --- Grammy mock --- - -type Handler = (...args: any[]) => any; - -const botRef = vi.hoisted(() => ({ current: null as any })); - -vi.mock('grammy', () => ({ - Bot: class MockBot { - token: string; - commandHandlers = new Map(); - filterHandlers = new Map(); - errorHandler: Handler | null = null; - - api = { - sendMessage: vi.fn().mockResolvedValue(undefined), - sendChatAction: vi.fn().mockResolvedValue(undefined), - }; - - 
constructor(token: string) { - this.token = token; - botRef.current = this; - } - - command(name: string, handler: Handler) { - this.commandHandlers.set(name, handler); - } - - on(filter: string, handler: Handler) { - const existing = this.filterHandlers.get(filter) || []; - existing.push(handler); - this.filterHandlers.set(filter, existing); - } - - catch(handler: Handler) { - this.errorHandler = handler; - } - - start(opts: { onStart: (botInfo: any) => void }) { - opts.onStart({ username: 'andy_ai_bot', id: 12345 }); - } - - stop() {} - }, -})); - -import { TelegramChannel, TelegramChannelOpts } from './telegram.js'; - -// --- Test helpers --- - -function createTestOpts( - overrides?: Partial, -): TelegramChannelOpts { - return { - onMessage: vi.fn(), - onChatMetadata: vi.fn(), - registeredGroups: vi.fn(() => ({ - 'tg:100200300': { - name: 'Test Group', - folder: 'test-group', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - })), - ...overrides, - }; -} - -function createTextCtx(overrides: { - chatId?: number; - chatType?: string; - chatTitle?: string; - text: string; - fromId?: number; - firstName?: string; - username?: string; - messageId?: number; - date?: number; - entities?: any[]; -}) { - const chatId = overrides.chatId ?? 100200300; - const chatType = overrides.chatType ?? 'group'; - return { - chat: { - id: chatId, - type: chatType, - title: overrides.chatTitle ?? 'Test Group', - }, - from: { - id: overrides.fromId ?? 99001, - first_name: overrides.firstName ?? 'Alice', - username: overrides.username ?? 'alice_user', - }, - message: { - text: overrides.text, - date: overrides.date ?? Math.floor(Date.now() / 1000), - message_id: overrides.messageId ?? 1, - entities: overrides.entities ?? 
[], - }, - me: { username: 'andy_ai_bot' }, - reply: vi.fn(), - }; -} - -function createMediaCtx(overrides: { - chatId?: number; - chatType?: string; - fromId?: number; - firstName?: string; - date?: number; - messageId?: number; - caption?: string; - extra?: Record; -}) { - const chatId = overrides.chatId ?? 100200300; - return { - chat: { - id: chatId, - type: overrides.chatType ?? 'group', - title: 'Test Group', - }, - from: { - id: overrides.fromId ?? 99001, - first_name: overrides.firstName ?? 'Alice', - username: 'alice_user', - }, - message: { - date: overrides.date ?? Math.floor(Date.now() / 1000), - message_id: overrides.messageId ?? 1, - caption: overrides.caption, - ...(overrides.extra || {}), - }, - me: { username: 'andy_ai_bot' }, - }; -} - -function currentBot() { - return botRef.current; -} - -async function triggerTextMessage(ctx: ReturnType) { - const handlers = currentBot().filterHandlers.get('message:text') || []; - for (const h of handlers) await h(ctx); -} - -async function triggerMediaMessage( - filter: string, - ctx: ReturnType, -) { - const handlers = currentBot().filterHandlers.get(filter) || []; - for (const h of handlers) await h(ctx); -} - -// --- Tests --- - -describe('TelegramChannel', () => { - beforeEach(() => { - vi.clearAllMocks(); - }); - - afterEach(() => { - vi.restoreAllMocks(); - }); - - // --- Connection lifecycle --- - - describe('connection lifecycle', () => { - it('resolves connect() when bot starts', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - - await channel.connect(); - - expect(channel.isConnected()).toBe(true); - }); - - it('registers command and message handlers on connect', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - - await channel.connect(); - - expect(currentBot().commandHandlers.has('chatid')).toBe(true); - expect(currentBot().commandHandlers.has('ping')).toBe(true); - 
expect(currentBot().filterHandlers.has('message:text')).toBe(true); - expect(currentBot().filterHandlers.has('message:photo')).toBe(true); - expect(currentBot().filterHandlers.has('message:video')).toBe(true); - expect(currentBot().filterHandlers.has('message:voice')).toBe(true); - expect(currentBot().filterHandlers.has('message:audio')).toBe(true); - expect(currentBot().filterHandlers.has('message:document')).toBe(true); - expect(currentBot().filterHandlers.has('message:sticker')).toBe(true); - expect(currentBot().filterHandlers.has('message:location')).toBe(true); - expect(currentBot().filterHandlers.has('message:contact')).toBe(true); - }); - - it('registers error handler on connect', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - - await channel.connect(); - - expect(currentBot().errorHandler).not.toBeNull(); - }); - - it('disconnects cleanly', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - - await channel.connect(); - expect(channel.isConnected()).toBe(true); - - await channel.disconnect(); - expect(channel.isConnected()).toBe(false); - }); - - it('isConnected() returns false before connect', () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - - expect(channel.isConnected()).toBe(false); - }); - }); - - // --- Text message handling --- - - describe('text message handling', () => { - it('delivers message for registered group', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ text: 'Hello everyone' }); - await triggerTextMessage(ctx); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'tg:100200300', - expect.any(String), - 'Test Group', - 'telegram', - true, - ); - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ - id: '1', - 
chat_jid: 'tg:100200300', - sender: '99001', - sender_name: 'Alice', - content: 'Hello everyone', - is_from_me: false, - }), - ); - }); - - it('only emits metadata for unregistered chats', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ chatId: 999999, text: 'Unknown chat' }); - await triggerTextMessage(ctx); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'tg:999999', - expect.any(String), - 'Test Group', - 'telegram', - true, - ); - expect(opts.onMessage).not.toHaveBeenCalled(); - }); - - it('skips command messages (starting with /)', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ text: '/start' }); - await triggerTextMessage(ctx); - - expect(opts.onMessage).not.toHaveBeenCalled(); - expect(opts.onChatMetadata).not.toHaveBeenCalled(); - }); - - it('extracts sender name from first_name', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ text: 'Hi', firstName: 'Bob' }); - await triggerTextMessage(ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ sender_name: 'Bob' }), - ); - }); - - it('falls back to username when first_name missing', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ text: 'Hi' }); - ctx.from.first_name = undefined as any; - await triggerTextMessage(ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ sender_name: 'alice_user' }), - ); - }); - - it('falls back to user ID when name and username missing', async () => { - const opts = createTestOpts(); - const channel = new 
TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ text: 'Hi', fromId: 42 }); - ctx.from.first_name = undefined as any; - ctx.from.username = undefined as any; - await triggerTextMessage(ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ sender_name: '42' }), - ); - }); - - it('uses sender name as chat name for private chats', async () => { - const opts = createTestOpts({ - registeredGroups: vi.fn(() => ({ - 'tg:100200300': { - name: 'Private', - folder: 'private', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - })), - }); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ - text: 'Hello', - chatType: 'private', - firstName: 'Alice', - }); - await triggerTextMessage(ctx); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'tg:100200300', - expect.any(String), - 'Alice', // Private chats use sender name - 'telegram', - false, - ); - }); - - it('uses chat title as name for group chats', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ - text: 'Hello', - chatType: 'supergroup', - chatTitle: 'Project Team', - }); - await triggerTextMessage(ctx); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'tg:100200300', - expect.any(String), - 'Project Team', - 'telegram', - true, - ); - }); - - it('converts message.date to ISO timestamp', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const unixTime = 1704067200; // 2024-01-01T00:00:00.000Z - const ctx = createTextCtx({ text: 'Hello', date: unixTime }); - await triggerTextMessage(ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ - timestamp: '2024-01-01T00:00:00.000Z', - }), - ); - }); - 
}); - - // --- @mention translation --- - - describe('@mention translation', () => { - it('translates @bot_username mention to trigger format', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ - text: '@andy_ai_bot what time is it?', - entities: [{ type: 'mention', offset: 0, length: 12 }], - }); - await triggerTextMessage(ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ - content: '@Andy @andy_ai_bot what time is it?', - }), - ); - }); - - it('does not translate if message already matches trigger', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ - text: '@Andy @andy_ai_bot hello', - entities: [{ type: 'mention', offset: 6, length: 12 }], - }); - await triggerTextMessage(ctx); - - // Should NOT double-prepend — already starts with @Andy - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ - content: '@Andy @andy_ai_bot hello', - }), - ); - }); - - it('does not translate mentions of other bots', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ - text: '@some_other_bot hi', - entities: [{ type: 'mention', offset: 0, length: 15 }], - }); - await triggerTextMessage(ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ - content: '@some_other_bot hi', // No translation - }), - ); - }); - - it('handles mention in middle of message', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ - text: 'hey @andy_ai_bot check this', - entities: [{ type: 'mention', offset: 4, length: 12 }], - 
}); - await triggerTextMessage(ctx); - - // Bot is mentioned, message doesn't match trigger → prepend trigger - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ - content: '@Andy hey @andy_ai_bot check this', - }), - ); - }); - - it('handles message with no entities', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ text: 'plain message' }); - await triggerTextMessage(ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ - content: 'plain message', - }), - ); - }); - - it('ignores non-mention entities', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createTextCtx({ - text: 'check https://example.com', - entities: [{ type: 'url', offset: 6, length: 19 }], - }); - await triggerTextMessage(ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ - content: 'check https://example.com', - }), - ); - }); - }); - - // --- Non-text messages --- - - describe('non-text messages', () => { - it('stores photo with placeholder', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createMediaCtx({}); - await triggerMediaMessage('message:photo', ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ content: '[Photo]' }), - ); - }); - - it('stores photo with caption', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createMediaCtx({ caption: 'Look at this' }); - await triggerMediaMessage('message:photo', ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - 
expect.objectContaining({ content: '[Photo] Look at this' }), - ); - }); - - it('stores video with placeholder', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createMediaCtx({}); - await triggerMediaMessage('message:video', ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ content: '[Video]' }), - ); - }); - - it('stores voice message with placeholder', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createMediaCtx({}); - await triggerMediaMessage('message:voice', ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ content: '[Voice message]' }), - ); - }); - - it('stores audio with placeholder', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createMediaCtx({}); - await triggerMediaMessage('message:audio', ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ content: '[Audio]' }), - ); - }); - - it('stores document with filename', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createMediaCtx({ - extra: { document: { file_name: 'report.pdf' } }, - }); - await triggerMediaMessage('message:document', ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ content: '[Document: report.pdf]' }), - ); - }); - - it('stores document with fallback name when filename missing', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createMediaCtx({ extra: { document: {} } }); - await 
triggerMediaMessage('message:document', ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ content: '[Document: file]' }), - ); - }); - - it('stores sticker with emoji', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createMediaCtx({ - extra: { sticker: { emoji: '😂' } }, - }); - await triggerMediaMessage('message:sticker', ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ content: '[Sticker 😂]' }), - ); - }); - - it('stores location with placeholder', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createMediaCtx({}); - await triggerMediaMessage('message:location', ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ content: '[Location]' }), - ); - }); - - it('stores contact with placeholder', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createMediaCtx({}); - await triggerMediaMessage('message:contact', ctx); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'tg:100200300', - expect.objectContaining({ content: '[Contact]' }), - ); - }); - - it('ignores non-text messages from unregistered chats', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const ctx = createMediaCtx({ chatId: 999999 }); - await triggerMediaMessage('message:photo', ctx); - - expect(opts.onMessage).not.toHaveBeenCalled(); - }); - }); - - // --- sendMessage --- - - describe('sendMessage', () => { - it('sends message via bot API', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - 
- await channel.sendMessage('tg:100200300', 'Hello'); - - expect(currentBot().api.sendMessage).toHaveBeenCalledWith( - '100200300', - 'Hello', - ); - }); - - it('strips tg: prefix from JID', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - await channel.sendMessage('tg:-1001234567890', 'Group message'); - - expect(currentBot().api.sendMessage).toHaveBeenCalledWith( - '-1001234567890', - 'Group message', - ); - }); - - it('splits messages exceeding 4096 characters', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const longText = 'x'.repeat(5000); - await channel.sendMessage('tg:100200300', longText); - - expect(currentBot().api.sendMessage).toHaveBeenCalledTimes(2); - expect(currentBot().api.sendMessage).toHaveBeenNthCalledWith( - 1, - '100200300', - 'x'.repeat(4096), - ); - expect(currentBot().api.sendMessage).toHaveBeenNthCalledWith( - 2, - '100200300', - 'x'.repeat(904), - ); - }); - - it('sends exactly one message at 4096 characters', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const exactText = 'y'.repeat(4096); - await channel.sendMessage('tg:100200300', exactText); - - expect(currentBot().api.sendMessage).toHaveBeenCalledTimes(1); - }); - - it('handles send failure gracefully', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - currentBot().api.sendMessage.mockRejectedValueOnce( - new Error('Network error'), - ); - - // Should not throw - await expect( - channel.sendMessage('tg:100200300', 'Will fail'), - ).resolves.toBeUndefined(); - }); - - it('does nothing when bot is not initialized', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - - // Don't 
connect — bot is null - await channel.sendMessage('tg:100200300', 'No bot'); - - // No error, no API call - }); - }); - - // --- ownsJid --- - - describe('ownsJid', () => { - it('owns tg: JIDs', () => { - const channel = new TelegramChannel('test-token', createTestOpts()); - expect(channel.ownsJid('tg:123456')).toBe(true); - }); - - it('owns tg: JIDs with negative IDs (groups)', () => { - const channel = new TelegramChannel('test-token', createTestOpts()); - expect(channel.ownsJid('tg:-1001234567890')).toBe(true); - }); - - it('does not own WhatsApp group JIDs', () => { - const channel = new TelegramChannel('test-token', createTestOpts()); - expect(channel.ownsJid('12345@g.us')).toBe(false); - }); - - it('does not own WhatsApp DM JIDs', () => { - const channel = new TelegramChannel('test-token', createTestOpts()); - expect(channel.ownsJid('12345@s.whatsapp.net')).toBe(false); - }); - - it('does not own unknown JID formats', () => { - const channel = new TelegramChannel('test-token', createTestOpts()); - expect(channel.ownsJid('random-string')).toBe(false); - }); - }); - - // --- setTyping --- - - describe('setTyping', () => { - it('sends typing action when isTyping is true', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - await channel.setTyping('tg:100200300', true); - - expect(currentBot().api.sendChatAction).toHaveBeenCalledWith( - '100200300', - 'typing', - ); - }); - - it('does nothing when isTyping is false', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - await channel.setTyping('tg:100200300', false); - - expect(currentBot().api.sendChatAction).not.toHaveBeenCalled(); - }); - - it('does nothing when bot is not initialized', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - - // Don't connect - await 
channel.setTyping('tg:100200300', true); - - // No error, no API call - }); - - it('handles typing indicator failure gracefully', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - currentBot().api.sendChatAction.mockRejectedValueOnce( - new Error('Rate limited'), - ); - - await expect( - channel.setTyping('tg:100200300', true), - ).resolves.toBeUndefined(); - }); - }); - - // --- Bot commands --- - - describe('bot commands', () => { - it('/chatid replies with chat ID and metadata', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const handler = currentBot().commandHandlers.get('chatid')!; - const ctx = { - chat: { id: 100200300, type: 'group' as const }, - from: { first_name: 'Alice' }, - reply: vi.fn(), - }; - - await handler(ctx); - - expect(ctx.reply).toHaveBeenCalledWith( - expect.stringContaining('tg:100200300'), - expect.objectContaining({ parse_mode: 'Markdown' }), - ); - }); - - it('/chatid shows chat type', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const handler = currentBot().commandHandlers.get('chatid')!; - const ctx = { - chat: { id: 555, type: 'private' as const }, - from: { first_name: 'Bob' }, - reply: vi.fn(), - }; - - await handler(ctx); - - expect(ctx.reply).toHaveBeenCalledWith( - expect.stringContaining('private'), - expect.any(Object), - ); - }); - - it('/ping replies with bot status', async () => { - const opts = createTestOpts(); - const channel = new TelegramChannel('test-token', opts); - await channel.connect(); - - const handler = currentBot().commandHandlers.get('ping')!; - const ctx = { reply: vi.fn() }; - - await handler(ctx); - - expect(ctx.reply).toHaveBeenCalledWith('Andy is online.'); - }); - }); - - // --- Channel properties --- - - describe('channel properties', 
() => { - it('has name "telegram"', () => { - const channel = new TelegramChannel('test-token', createTestOpts()); - expect(channel.name).toBe('telegram'); - }); - }); -}); diff --git a/.claude/skills/add-telegram/add/src/channels/telegram.ts b/.claude/skills/add-telegram/add/src/channels/telegram.ts deleted file mode 100644 index 43a6266..0000000 --- a/.claude/skills/add-telegram/add/src/channels/telegram.ts +++ /dev/null @@ -1,244 +0,0 @@ -import { Bot } from 'grammy'; - -import { ASSISTANT_NAME, TRIGGER_PATTERN } from '../config.js'; -import { logger } from '../logger.js'; -import { - Channel, - OnChatMetadata, - OnInboundMessage, - RegisteredGroup, -} from '../types.js'; - -export interface TelegramChannelOpts { - onMessage: OnInboundMessage; - onChatMetadata: OnChatMetadata; - registeredGroups: () => Record; -} - -export class TelegramChannel implements Channel { - name = 'telegram'; - - private bot: Bot | null = null; - private opts: TelegramChannelOpts; - private botToken: string; - - constructor(botToken: string, opts: TelegramChannelOpts) { - this.botToken = botToken; - this.opts = opts; - } - - async connect(): Promise { - this.bot = new Bot(this.botToken); - - // Command to get chat ID (useful for registration) - this.bot.command('chatid', (ctx) => { - const chatId = ctx.chat.id; - const chatType = ctx.chat.type; - const chatName = - chatType === 'private' - ? 
ctx.from?.first_name || 'Private' - : (ctx.chat as any).title || 'Unknown'; - - ctx.reply( - `Chat ID: \`tg:${chatId}\`\nName: ${chatName}\nType: ${chatType}`, - { parse_mode: 'Markdown' }, - ); - }); - - // Command to check bot status - this.bot.command('ping', (ctx) => { - ctx.reply(`${ASSISTANT_NAME} is online.`); - }); - - this.bot.on('message:text', async (ctx) => { - // Skip commands - if (ctx.message.text.startsWith('/')) return; - - const chatJid = `tg:${ctx.chat.id}`; - let content = ctx.message.text; - const timestamp = new Date(ctx.message.date * 1000).toISOString(); - const senderName = - ctx.from?.first_name || - ctx.from?.username || - ctx.from?.id.toString() || - 'Unknown'; - const sender = ctx.from?.id.toString() || ''; - const msgId = ctx.message.message_id.toString(); - - // Determine chat name - const chatName = - ctx.chat.type === 'private' - ? senderName - : (ctx.chat as any).title || chatJid; - - // Translate Telegram @bot_username mentions into TRIGGER_PATTERN format. - // Telegram @mentions (e.g., @andy_ai_bot) won't match TRIGGER_PATTERN - // (e.g., ^@Andy\b), so we prepend the trigger when the bot is @mentioned. 
- const botUsername = ctx.me?.username?.toLowerCase(); - if (botUsername) { - const entities = ctx.message.entities || []; - const isBotMentioned = entities.some((entity) => { - if (entity.type === 'mention') { - const mentionText = content - .substring(entity.offset, entity.offset + entity.length) - .toLowerCase(); - return mentionText === `@${botUsername}`; - } - return false; - }); - if (isBotMentioned && !TRIGGER_PATTERN.test(content)) { - content = `@${ASSISTANT_NAME} ${content}`; - } - } - - // Store chat metadata for discovery - const isGroup = ctx.chat.type === 'group' || ctx.chat.type === 'supergroup'; - this.opts.onChatMetadata(chatJid, timestamp, chatName, 'telegram', isGroup); - - // Only deliver full message for registered groups - const group = this.opts.registeredGroups()[chatJid]; - if (!group) { - logger.debug( - { chatJid, chatName }, - 'Message from unregistered Telegram chat', - ); - return; - } - - // Deliver message — startMessageLoop() will pick it up - this.opts.onMessage(chatJid, { - id: msgId, - chat_jid: chatJid, - sender, - sender_name: senderName, - content, - timestamp, - is_from_me: false, - }); - - logger.info( - { chatJid, chatName, sender: senderName }, - 'Telegram message stored', - ); - }); - - // Handle non-text messages with placeholders so the agent knows something was sent - const storeNonText = (ctx: any, placeholder: string) => { - const chatJid = `tg:${ctx.chat.id}`; - const group = this.opts.registeredGroups()[chatJid]; - if (!group) return; - - const timestamp = new Date(ctx.message.date * 1000).toISOString(); - const senderName = - ctx.from?.first_name || - ctx.from?.username || - ctx.from?.id?.toString() || - 'Unknown'; - const caption = ctx.message.caption ? 
` ${ctx.message.caption}` : ''; - - const isGroup = ctx.chat.type === 'group' || ctx.chat.type === 'supergroup'; - this.opts.onChatMetadata(chatJid, timestamp, undefined, 'telegram', isGroup); - this.opts.onMessage(chatJid, { - id: ctx.message.message_id.toString(), - chat_jid: chatJid, - sender: ctx.from?.id?.toString() || '', - sender_name: senderName, - content: `${placeholder}${caption}`, - timestamp, - is_from_me: false, - }); - }; - - this.bot.on('message:photo', (ctx) => storeNonText(ctx, '[Photo]')); - this.bot.on('message:video', (ctx) => storeNonText(ctx, '[Video]')); - this.bot.on('message:voice', (ctx) => - storeNonText(ctx, '[Voice message]'), - ); - this.bot.on('message:audio', (ctx) => storeNonText(ctx, '[Audio]')); - this.bot.on('message:document', (ctx) => { - const name = ctx.message.document?.file_name || 'file'; - storeNonText(ctx, `[Document: ${name}]`); - }); - this.bot.on('message:sticker', (ctx) => { - const emoji = ctx.message.sticker?.emoji || ''; - storeNonText(ctx, `[Sticker ${emoji}]`); - }); - this.bot.on('message:location', (ctx) => storeNonText(ctx, '[Location]')); - this.bot.on('message:contact', (ctx) => storeNonText(ctx, '[Contact]')); - - // Handle errors gracefully - this.bot.catch((err) => { - logger.error({ err: err.message }, 'Telegram bot error'); - }); - - // Start polling — returns a Promise that resolves when started - return new Promise((resolve) => { - this.bot!.start({ - onStart: (botInfo) => { - logger.info( - { username: botInfo.username, id: botInfo.id }, - 'Telegram bot connected', - ); - console.log(`\n Telegram bot: @${botInfo.username}`); - console.log( - ` Send /chatid to the bot to get a chat's registration ID\n`, - ); - resolve(); - }, - }); - }); - } - - async sendMessage(jid: string, text: string): Promise { - if (!this.bot) { - logger.warn('Telegram bot not initialized'); - return; - } - - try { - const numericId = jid.replace(/^tg:/, ''); - - // Telegram has a 4096 character limit per message — split if 
needed - const MAX_LENGTH = 4096; - if (text.length <= MAX_LENGTH) { - await this.bot.api.sendMessage(numericId, text); - } else { - for (let i = 0; i < text.length; i += MAX_LENGTH) { - await this.bot.api.sendMessage( - numericId, - text.slice(i, i + MAX_LENGTH), - ); - } - } - logger.info({ jid, length: text.length }, 'Telegram message sent'); - } catch (err) { - logger.error({ jid, err }, 'Failed to send Telegram message'); - } - } - - isConnected(): boolean { - return this.bot !== null; - } - - ownsJid(jid: string): boolean { - return jid.startsWith('tg:'); - } - - async disconnect(): Promise { - if (this.bot) { - this.bot.stop(); - this.bot = null; - logger.info('Telegram bot stopped'); - } - } - - async setTyping(jid: string, isTyping: boolean): Promise { - if (!this.bot || !isTyping) return; - try { - const numericId = jid.replace(/^tg:/, ''); - await this.bot.api.sendChatAction(numericId, 'typing'); - } catch (err) { - logger.debug({ jid, err }, 'Failed to send Telegram typing indicator'); - } - } -} diff --git a/.claude/skills/add-telegram/manifest.yaml b/.claude/skills/add-telegram/manifest.yaml deleted file mode 100644 index fe7a36a..0000000 --- a/.claude/skills/add-telegram/manifest.yaml +++ /dev/null @@ -1,20 +0,0 @@ -skill: telegram -version: 1.0.0 -description: "Telegram Bot API integration via Grammy" -core_version: 0.1.0 -adds: - - src/channels/telegram.ts - - src/channels/telegram.test.ts -modifies: - - src/index.ts - - src/config.ts - - src/routing.test.ts -structured: - npm_dependencies: - grammy: "^1.39.3" - env_additions: - - TELEGRAM_BOT_TOKEN - - TELEGRAM_ONLY -conflicts: [] -depends: [] -test: "npx vitest run src/channels/telegram.test.ts" diff --git a/.claude/skills/add-telegram/modify/src/config.ts b/.claude/skills/add-telegram/modify/src/config.ts deleted file mode 100644 index f0093f2..0000000 --- a/.claude/skills/add-telegram/modify/src/config.ts +++ /dev/null @@ -1,77 +0,0 @@ -import os from 'os'; -import path from 'path'; - -import { 
readEnvFile } from './env.js'; - -// Read config values from .env (falls back to process.env). -// Secrets are NOT read here — they stay on disk and are loaded only -// where needed (container-runner.ts) to avoid leaking to child processes. -const envConfig = readEnvFile([ - 'ASSISTANT_NAME', - 'ASSISTANT_HAS_OWN_NUMBER', - 'TELEGRAM_BOT_TOKEN', - 'TELEGRAM_ONLY', -]); - -export const ASSISTANT_NAME = - process.env.ASSISTANT_NAME || envConfig.ASSISTANT_NAME || 'Andy'; -export const ASSISTANT_HAS_OWN_NUMBER = - (process.env.ASSISTANT_HAS_OWN_NUMBER || envConfig.ASSISTANT_HAS_OWN_NUMBER) === 'true'; -export const POLL_INTERVAL = 2000; -export const SCHEDULER_POLL_INTERVAL = 60000; - -// Absolute paths needed for container mounts -const PROJECT_ROOT = process.cwd(); -const HOME_DIR = process.env.HOME || os.homedir(); - -// Mount security: allowlist stored OUTSIDE project root, never mounted into containers -export const MOUNT_ALLOWLIST_PATH = path.join( - HOME_DIR, - '.config', - 'nanoclaw', - 'mount-allowlist.json', -); -export const STORE_DIR = path.resolve(PROJECT_ROOT, 'store'); -export const GROUPS_DIR = path.resolve(PROJECT_ROOT, 'groups'); -export const DATA_DIR = path.resolve(PROJECT_ROOT, 'data'); -export const MAIN_GROUP_FOLDER = 'main'; - -export const CONTAINER_IMAGE = - process.env.CONTAINER_IMAGE || 'nanoclaw-agent:latest'; -export const CONTAINER_TIMEOUT = parseInt( - process.env.CONTAINER_TIMEOUT || '1800000', - 10, -); -export const CONTAINER_MAX_OUTPUT_SIZE = parseInt( - process.env.CONTAINER_MAX_OUTPUT_SIZE || '10485760', - 10, -); // 10MB default -export const IPC_POLL_INTERVAL = 1000; -export const IDLE_TIMEOUT = parseInt( - process.env.IDLE_TIMEOUT || '1800000', - 10, -); // 30min default — how long to keep container alive after last result -export const MAX_CONCURRENT_CONTAINERS = Math.max( - 1, - parseInt(process.env.MAX_CONCURRENT_CONTAINERS || '5', 10) || 5, -); - -function escapeRegex(str: string): string { - return 
str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); -} - -export const TRIGGER_PATTERN = new RegExp( - `^@${escapeRegex(ASSISTANT_NAME)}\\b`, - 'i', -); - -// Timezone for scheduled tasks (cron expressions, etc.) -// Uses system timezone by default -export const TIMEZONE = - process.env.TZ || Intl.DateTimeFormat().resolvedOptions().timeZone; - -// Telegram configuration -export const TELEGRAM_BOT_TOKEN = - process.env.TELEGRAM_BOT_TOKEN || envConfig.TELEGRAM_BOT_TOKEN || ''; -export const TELEGRAM_ONLY = - (process.env.TELEGRAM_ONLY || envConfig.TELEGRAM_ONLY) === 'true'; diff --git a/.claude/skills/add-telegram/modify/src/config.ts.intent.md b/.claude/skills/add-telegram/modify/src/config.ts.intent.md deleted file mode 100644 index 9db1692..0000000 --- a/.claude/skills/add-telegram/modify/src/config.ts.intent.md +++ /dev/null @@ -1,21 +0,0 @@ -# Intent: src/config.ts modifications - -## What changed -Added two new configuration exports for Telegram channel support. - -## Key sections -- **readEnvFile call**: Must include `TELEGRAM_BOT_TOKEN` and `TELEGRAM_ONLY` in the keys array. NanoClaw does NOT load `.env` into `process.env` — all `.env` values must be explicitly requested via `readEnvFile()`. -- **TELEGRAM_BOT_TOKEN**: Read from `process.env` first, then `envConfig` fallback, defaults to empty string (channel disabled when empty) -- **TELEGRAM_ONLY**: Boolean flag from `process.env` or `envConfig`, when `true` disables WhatsApp channel creation - -## Invariants -- All existing config exports remain unchanged -- New Telegram keys are added to the `readEnvFile` call alongside existing keys -- New exports are appended at the end of the file -- No existing behavior is modified — Telegram config is additive only -- Both `process.env` and `envConfig` are checked (same pattern as `ASSISTANT_NAME`) - -## Must-keep -- All existing exports (`ASSISTANT_NAME`, `POLL_INTERVAL`, `TRIGGER_PATTERN`, etc.) 
-- The `readEnvFile` pattern — ALL config read from `.env` must go through this function -- The `escapeRegex` helper and `TRIGGER_PATTERN` construction diff --git a/.claude/skills/add-telegram/modify/src/index.ts b/.claude/skills/add-telegram/modify/src/index.ts deleted file mode 100644 index b91e244..0000000 --- a/.claude/skills/add-telegram/modify/src/index.ts +++ /dev/null @@ -1,509 +0,0 @@ -import fs from 'fs'; -import path from 'path'; - -import { - ASSISTANT_NAME, - IDLE_TIMEOUT, - MAIN_GROUP_FOLDER, - POLL_INTERVAL, - TELEGRAM_BOT_TOKEN, - TELEGRAM_ONLY, - TRIGGER_PATTERN, -} from './config.js'; -import { TelegramChannel } from './channels/telegram.js'; -import { WhatsAppChannel } from './channels/whatsapp.js'; -import { - ContainerOutput, - runContainerAgent, - writeGroupsSnapshot, - writeTasksSnapshot, -} from './container-runner.js'; -import { cleanupOrphans, ensureContainerRuntimeRunning } from './container-runtime.js'; -import { - getAllChats, - getAllRegisteredGroups, - getAllSessions, - getAllTasks, - getMessagesSince, - getNewMessages, - getRouterState, - initDatabase, - setRegisteredGroup, - setRouterState, - setSession, - storeChatMetadata, - storeMessage, -} from './db.js'; -import { GroupQueue } from './group-queue.js'; -import { resolveGroupFolderPath } from './group-folder.js'; -import { startIpcWatcher } from './ipc.js'; -import { findChannel, formatMessages, formatOutbound } from './router.js'; -import { startSchedulerLoop } from './task-scheduler.js'; -import { Channel, NewMessage, RegisteredGroup } from './types.js'; -import { logger } from './logger.js'; - -// Re-export for backwards compatibility during refactor -export { escapeXml, formatMessages } from './router.js'; - -let lastTimestamp = ''; -let sessions: Record = {}; -let registeredGroups: Record = {}; -let lastAgentTimestamp: Record = {}; -let messageLoopRunning = false; - -let whatsapp: WhatsAppChannel; -const channels: Channel[] = []; -const queue = new GroupQueue(); - -function 
loadState(): void { - lastTimestamp = getRouterState('last_timestamp') || ''; - const agentTs = getRouterState('last_agent_timestamp'); - try { - lastAgentTimestamp = agentTs ? JSON.parse(agentTs) : {}; - } catch { - logger.warn('Corrupted last_agent_timestamp in DB, resetting'); - lastAgentTimestamp = {}; - } - sessions = getAllSessions(); - registeredGroups = getAllRegisteredGroups(); - logger.info( - { groupCount: Object.keys(registeredGroups).length }, - 'State loaded', - ); -} - -function saveState(): void { - setRouterState('last_timestamp', lastTimestamp); - setRouterState( - 'last_agent_timestamp', - JSON.stringify(lastAgentTimestamp), - ); -} - -function registerGroup(jid: string, group: RegisteredGroup): void { - let groupDir: string; - try { - groupDir = resolveGroupFolderPath(group.folder); - } catch (err) { - logger.warn( - { jid, folder: group.folder, err }, - 'Rejecting group registration with invalid folder', - ); - return; - } - - registeredGroups[jid] = group; - setRegisteredGroup(jid, group); - - // Create group folder - fs.mkdirSync(path.join(groupDir, 'logs'), { recursive: true }); - - logger.info( - { jid, name: group.name, folder: group.folder }, - 'Group registered', - ); -} - -/** - * Get available groups list for the agent. - * Returns groups ordered by most recent activity. - */ -export function getAvailableGroups(): import('./container-runner.js').AvailableGroup[] { - const chats = getAllChats(); - const registeredJids = new Set(Object.keys(registeredGroups)); - - return chats - .filter((c) => c.jid !== '__group_sync__' && c.is_group) - .map((c) => ({ - jid: c.jid, - name: c.name, - lastActivity: c.last_message_time, - isRegistered: registeredJids.has(c.jid), - })); -} - -/** @internal - exported for testing */ -export function _setRegisteredGroups(groups: Record): void { - registeredGroups = groups; -} - -/** - * Process all pending messages for a group. - * Called by the GroupQueue when it's this group's turn. 
- */ -async function processGroupMessages(chatJid: string): Promise { - const group = registeredGroups[chatJid]; - if (!group) return true; - - const channel = findChannel(channels, chatJid); - if (!channel) { - console.log(`Warning: no channel owns JID ${chatJid}, skipping messages`); - return true; - } - - const isMainGroup = group.folder === MAIN_GROUP_FOLDER; - - const sinceTimestamp = lastAgentTimestamp[chatJid] || ''; - const missedMessages = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME); - - if (missedMessages.length === 0) return true; - - // For non-main groups, check if trigger is required and present - if (!isMainGroup && group.requiresTrigger !== false) { - const hasTrigger = missedMessages.some((m) => - TRIGGER_PATTERN.test(m.content.trim()), - ); - if (!hasTrigger) return true; - } - - const prompt = formatMessages(missedMessages); - - // Advance cursor so the piping path in startMessageLoop won't re-fetch - // these messages. Save the old cursor so we can roll back on error. - const previousCursor = lastAgentTimestamp[chatJid] || ''; - lastAgentTimestamp[chatJid] = - missedMessages[missedMessages.length - 1].timestamp; - saveState(); - - logger.info( - { group: group.name, messageCount: missedMessages.length }, - 'Processing messages', - ); - - // Track idle timer for closing stdin when agent is idle - let idleTimer: ReturnType | null = null; - - const resetIdleTimer = () => { - if (idleTimer) clearTimeout(idleTimer); - idleTimer = setTimeout(() => { - logger.debug({ group: group.name }, 'Idle timeout, closing container stdin'); - queue.closeStdin(chatJid); - }, IDLE_TIMEOUT); - }; - - await channel.setTyping?.(chatJid, true); - let hadError = false; - let outputSentToUser = false; - - const output = await runAgent(group, prompt, chatJid, async (result) => { - // Streaming output callback — called for each agent result - if (result.result) { - const raw = typeof result.result === 'string' ? 
result.result : JSON.stringify(result.result); - // Strip ... blocks — agent uses these for internal reasoning - const text = raw.replace(/[\s\S]*?<\/internal>/g, '').trim(); - logger.info({ group: group.name }, `Agent output: ${raw.slice(0, 200)}`); - if (text) { - await channel.sendMessage(chatJid, text); - outputSentToUser = true; - } - // Only reset idle timer on actual results, not session-update markers (result: null) - resetIdleTimer(); - } - - if (result.status === 'success') { - queue.notifyIdle(chatJid); - } - - if (result.status === 'error') { - hadError = true; - } - }); - - await channel.setTyping?.(chatJid, false); - if (idleTimer) clearTimeout(idleTimer); - - if (output === 'error' || hadError) { - // If we already sent output to the user, don't roll back the cursor — - // the user got their response and re-processing would send duplicates. - if (outputSentToUser) { - logger.warn({ group: group.name }, 'Agent error after output was sent, skipping cursor rollback to prevent duplicates'); - return true; - } - // Roll back cursor so retries can re-process these messages - lastAgentTimestamp[chatJid] = previousCursor; - saveState(); - logger.warn({ group: group.name }, 'Agent error, rolled back message cursor for retry'); - return false; - } - - return true; -} - -async function runAgent( - group: RegisteredGroup, - prompt: string, - chatJid: string, - onOutput?: (output: ContainerOutput) => Promise, -): Promise<'success' | 'error'> { - const isMain = group.folder === MAIN_GROUP_FOLDER; - const sessionId = sessions[group.folder]; - - // Update tasks snapshot for container to read (filtered by group) - const tasks = getAllTasks(); - writeTasksSnapshot( - group.folder, - isMain, - tasks.map((t) => ({ - id: t.id, - groupFolder: t.group_folder, - prompt: t.prompt, - schedule_type: t.schedule_type, - schedule_value: t.schedule_value, - status: t.status, - next_run: t.next_run, - })), - ); - - // Update available groups snapshot (main group only can see all 
groups) - const availableGroups = getAvailableGroups(); - writeGroupsSnapshot( - group.folder, - isMain, - availableGroups, - new Set(Object.keys(registeredGroups)), - ); - - // Wrap onOutput to track session ID from streamed results - const wrappedOnOutput = onOutput - ? async (output: ContainerOutput) => { - if (output.newSessionId) { - sessions[group.folder] = output.newSessionId; - setSession(group.folder, output.newSessionId); - } - await onOutput(output); - } - : undefined; - - try { - const output = await runContainerAgent( - group, - { - prompt, - sessionId, - groupFolder: group.folder, - chatJid, - isMain, - assistantName: ASSISTANT_NAME, - }, - (proc, containerName) => queue.registerProcess(chatJid, proc, containerName, group.folder), - wrappedOnOutput, - ); - - if (output.newSessionId) { - sessions[group.folder] = output.newSessionId; - setSession(group.folder, output.newSessionId); - } - - if (output.status === 'error') { - logger.error( - { group: group.name, error: output.error }, - 'Container agent error', - ); - return 'error'; - } - - return 'success'; - } catch (err) { - logger.error({ group: group.name, err }, 'Agent error'); - return 'error'; - } -} - -async function startMessageLoop(): Promise { - if (messageLoopRunning) { - logger.debug('Message loop already running, skipping duplicate start'); - return; - } - messageLoopRunning = true; - - logger.info(`NanoClaw running (trigger: @${ASSISTANT_NAME})`); - - while (true) { - try { - const jids = Object.keys(registeredGroups); - const { messages, newTimestamp } = getNewMessages(jids, lastTimestamp, ASSISTANT_NAME); - - if (messages.length > 0) { - logger.info({ count: messages.length }, 'New messages'); - - // Advance the "seen" cursor for all messages immediately - lastTimestamp = newTimestamp; - saveState(); - - // Deduplicate by group - const messagesByGroup = new Map(); - for (const msg of messages) { - const existing = messagesByGroup.get(msg.chat_jid); - if (existing) { - 
existing.push(msg); - } else { - messagesByGroup.set(msg.chat_jid, [msg]); - } - } - - for (const [chatJid, groupMessages] of messagesByGroup) { - const group = registeredGroups[chatJid]; - if (!group) continue; - - const channel = findChannel(channels, chatJid); - if (!channel) { - console.log(`Warning: no channel owns JID ${chatJid}, skipping messages`); - continue; - } - - const isMainGroup = group.folder === MAIN_GROUP_FOLDER; - const needsTrigger = !isMainGroup && group.requiresTrigger !== false; - - // For non-main groups, only act on trigger messages. - // Non-trigger messages accumulate in DB and get pulled as - // context when a trigger eventually arrives. - if (needsTrigger) { - const hasTrigger = groupMessages.some((m) => - TRIGGER_PATTERN.test(m.content.trim()), - ); - if (!hasTrigger) continue; - } - - // Pull all messages since lastAgentTimestamp so non-trigger - // context that accumulated between triggers is included. - const allPending = getMessagesSince( - chatJid, - lastAgentTimestamp[chatJid] || '', - ASSISTANT_NAME, - ); - const messagesToSend = - allPending.length > 0 ? allPending : groupMessages; - const formatted = formatMessages(messagesToSend); - - if (queue.sendMessage(chatJid, formatted)) { - logger.debug( - { chatJid, count: messagesToSend.length }, - 'Piped messages to active container', - ); - lastAgentTimestamp[chatJid] = - messagesToSend[messagesToSend.length - 1].timestamp; - saveState(); - // Show typing indicator while the container processes the piped message - channel.setTyping?.(chatJid, true)?.catch((err) => - logger.warn({ chatJid, err }, 'Failed to set typing indicator'), - ); - } else { - // No active container — enqueue for a new one - queue.enqueueMessageCheck(chatJid); - } - } - } - } catch (err) { - logger.error({ err }, 'Error in message loop'); - } - await new Promise((resolve) => setTimeout(resolve, POLL_INTERVAL)); - } -} - -/** - * Startup recovery: check for unprocessed messages in registered groups. 
- * Handles crash between advancing lastTimestamp and processing messages. - */ -function recoverPendingMessages(): void { - for (const [chatJid, group] of Object.entries(registeredGroups)) { - const sinceTimestamp = lastAgentTimestamp[chatJid] || ''; - const pending = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME); - if (pending.length > 0) { - logger.info( - { group: group.name, pendingCount: pending.length }, - 'Recovery: found unprocessed messages', - ); - queue.enqueueMessageCheck(chatJid); - } - } -} - -function ensureContainerSystemRunning(): void { - ensureContainerRuntimeRunning(); - cleanupOrphans(); -} - -async function main(): Promise { - ensureContainerSystemRunning(); - initDatabase(); - logger.info('Database initialized'); - loadState(); - - // Graceful shutdown handlers - const shutdown = async (signal: string) => { - logger.info({ signal }, 'Shutdown signal received'); - await queue.shutdown(10000); - for (const ch of channels) await ch.disconnect(); - process.exit(0); - }; - process.on('SIGTERM', () => shutdown('SIGTERM')); - process.on('SIGINT', () => shutdown('SIGINT')); - - // Channel callbacks (shared by all channels) - const channelOpts = { - onMessage: (_chatJid: string, msg: NewMessage) => storeMessage(msg), - onChatMetadata: (chatJid: string, timestamp: string, name?: string, channel?: string, isGroup?: boolean) => - storeChatMetadata(chatJid, timestamp, name, channel, isGroup), - registeredGroups: () => registeredGroups, - }; - - // Create and connect channels - if (TELEGRAM_BOT_TOKEN) { - const telegram = new TelegramChannel(TELEGRAM_BOT_TOKEN, channelOpts); - channels.push(telegram); - await telegram.connect(); - } - - if (!TELEGRAM_ONLY) { - whatsapp = new WhatsAppChannel(channelOpts); - channels.push(whatsapp); - await whatsapp.connect(); - } - - // Start subsystems (independently of connection handler) - startSchedulerLoop({ - registeredGroups: () => registeredGroups, - getSessions: () => sessions, - queue, - onProcess: 
(groupJid, proc, containerName, groupFolder) => queue.registerProcess(groupJid, proc, containerName, groupFolder), - sendMessage: async (jid, rawText) => { - const channel = findChannel(channels, jid); - if (!channel) { - console.log(`Warning: no channel owns JID ${jid}, cannot send message`); - return; - } - const text = formatOutbound(rawText); - if (text) await channel.sendMessage(jid, text); - }, - }); - startIpcWatcher({ - sendMessage: (jid, text) => { - const channel = findChannel(channels, jid); - if (!channel) throw new Error(`No channel for JID: ${jid}`); - return channel.sendMessage(jid, text); - }, - registeredGroups: () => registeredGroups, - registerGroup, - syncGroupMetadata: (force) => whatsapp?.syncGroupMetadata(force) ?? Promise.resolve(), - getAvailableGroups, - writeGroupsSnapshot: (gf, im, ag, rj) => writeGroupsSnapshot(gf, im, ag, rj), - }); - queue.setProcessMessagesFn(processGroupMessages); - recoverPendingMessages(); - startMessageLoop().catch((err) => { - logger.fatal({ err }, 'Message loop crashed unexpectedly'); - process.exit(1); - }); -} - -// Guard: only run when executed directly, not when imported by tests -const isDirectRun = - process.argv[1] && - new URL(import.meta.url).pathname === new URL(`file://${process.argv[1]}`).pathname; - -if (isDirectRun) { - main().catch((err) => { - logger.error({ err }, 'Failed to start NanoClaw'); - process.exit(1); - }); -} diff --git a/.claude/skills/add-telegram/modify/src/index.ts.intent.md b/.claude/skills/add-telegram/modify/src/index.ts.intent.md deleted file mode 100644 index 1053490..0000000 --- a/.claude/skills/add-telegram/modify/src/index.ts.intent.md +++ /dev/null @@ -1,50 +0,0 @@ -# Intent: src/index.ts modifications - -## What changed -Refactored from single WhatsApp channel to multi-channel architecture using the `Channel` interface. 
- -## Key sections - -### Imports (top of file) -- Added: `TelegramChannel` from `./channels/telegram.js` -- Added: `TELEGRAM_BOT_TOKEN`, `TELEGRAM_ONLY` from `./config.js` -- Added: `findChannel` from `./router.js` -- Added: `Channel` type from `./types.js` - -### Module-level state -- Added: `const channels: Channel[] = []` — array of all active channels -- Kept: `let whatsapp: WhatsAppChannel` — still needed for `syncGroupMetadata` reference - -### processGroupMessages() -- Added: `findChannel(channels, chatJid)` lookup at the start -- Changed: `whatsapp.setTyping()` → `channel.setTyping?.()` (optional chaining) -- Changed: `whatsapp.sendMessage()` → `channel.sendMessage()` in output callback - -### getAvailableGroups() -- Unchanged: uses `c.is_group` filter from base (Telegram channels pass `isGroup=true` via `onChatMetadata`) - -### startMessageLoop() -- Added: `findChannel(channels, chatJid)` lookup per group in message processing -- Changed: `whatsapp.setTyping()` → `channel.setTyping?.()` for typing indicators - -### main() -- Changed: shutdown disconnects all channels via `for (const ch of channels)` -- Added: shared `channelOpts` object for channel callbacks -- Added: conditional WhatsApp creation (`if (!TELEGRAM_ONLY)`) -- Added: conditional Telegram creation (`if (TELEGRAM_BOT_TOKEN)`) -- Changed: scheduler `sendMessage` uses `findChannel()` → `channel.sendMessage()` -- Changed: IPC `sendMessage` uses `findChannel()` → `channel.sendMessage()` - -## Invariants -- All existing message processing logic (triggers, cursors, idle timers) is preserved -- The `runAgent` function is completely unchanged -- State management (loadState/saveState) is unchanged -- Recovery logic is unchanged -- Container runtime check is unchanged (ensureContainerSystemRunning) - -## Must-keep -- The `escapeXml` and `formatMessages` re-exports -- The `_setRegisteredGroups` test helper -- The `isDirectRun` guard at bottom -- All error handling and cursor rollback logic in 
processGroupMessages -- The outgoing queue flush and reconnection logic (in WhatsAppChannel, not here) diff --git a/.claude/skills/add-telegram/modify/src/routing.test.ts b/.claude/skills/add-telegram/modify/src/routing.test.ts deleted file mode 100644 index 5b44063..0000000 --- a/.claude/skills/add-telegram/modify/src/routing.test.ts +++ /dev/null @@ -1,161 +0,0 @@ -import { describe, it, expect, beforeEach } from 'vitest'; - -import { _initTestDatabase, getAllChats, storeChatMetadata } from './db.js'; -import { getAvailableGroups, _setRegisteredGroups } from './index.js'; - -beforeEach(() => { - _initTestDatabase(); - _setRegisteredGroups({}); -}); - -// --- JID ownership patterns --- - -describe('JID ownership patterns', () => { - // These test the patterns that will become ownsJid() on the Channel interface - - it('WhatsApp group JID: ends with @g.us', () => { - const jid = '12345678@g.us'; - expect(jid.endsWith('@g.us')).toBe(true); - }); - - it('WhatsApp DM JID: ends with @s.whatsapp.net', () => { - const jid = '12345678@s.whatsapp.net'; - expect(jid.endsWith('@s.whatsapp.net')).toBe(true); - }); - - it('Telegram JID: starts with tg:', () => { - const jid = 'tg:123456789'; - expect(jid.startsWith('tg:')).toBe(true); - }); - - it('Telegram group JID: starts with tg: and has negative ID', () => { - const jid = 'tg:-1001234567890'; - expect(jid.startsWith('tg:')).toBe(true); - }); -}); - -// --- getAvailableGroups --- - -describe('getAvailableGroups', () => { - it('returns only groups, excludes DMs', () => { - storeChatMetadata('group1@g.us', '2024-01-01T00:00:01.000Z', 'Group 1', 'whatsapp', true); - storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false); - storeChatMetadata('group2@g.us', '2024-01-01T00:00:03.000Z', 'Group 2', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(2); - expect(groups.map((g) => g.jid)).toContain('group1@g.us'); - expect(groups.map((g) => 
g.jid)).toContain('group2@g.us'); - expect(groups.map((g) => g.jid)).not.toContain('user@s.whatsapp.net'); - }); - - it('excludes __group_sync__ sentinel', () => { - storeChatMetadata('__group_sync__', '2024-01-01T00:00:00.000Z'); - storeChatMetadata('group@g.us', '2024-01-01T00:00:01.000Z', 'Group', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('group@g.us'); - }); - - it('marks registered groups correctly', () => { - storeChatMetadata('reg@g.us', '2024-01-01T00:00:01.000Z', 'Registered', 'whatsapp', true); - storeChatMetadata('unreg@g.us', '2024-01-01T00:00:02.000Z', 'Unregistered', 'whatsapp', true); - - _setRegisteredGroups({ - 'reg@g.us': { - name: 'Registered', - folder: 'registered', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - }); - - const groups = getAvailableGroups(); - const reg = groups.find((g) => g.jid === 'reg@g.us'); - const unreg = groups.find((g) => g.jid === 'unreg@g.us'); - - expect(reg?.isRegistered).toBe(true); - expect(unreg?.isRegistered).toBe(false); - }); - - it('returns groups ordered by most recent activity', () => { - storeChatMetadata('old@g.us', '2024-01-01T00:00:01.000Z', 'Old', 'whatsapp', true); - storeChatMetadata('new@g.us', '2024-01-01T00:00:05.000Z', 'New', 'whatsapp', true); - storeChatMetadata('mid@g.us', '2024-01-01T00:00:03.000Z', 'Mid', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups[0].jid).toBe('new@g.us'); - expect(groups[1].jid).toBe('mid@g.us'); - expect(groups[2].jid).toBe('old@g.us'); - }); - - it('excludes non-group chats regardless of JID format', () => { - // Unknown JID format stored without is_group should not appear - storeChatMetadata('unknown-format-123', '2024-01-01T00:00:01.000Z', 'Unknown'); - // Explicitly non-group with unusual JID - storeChatMetadata('custom:abc', '2024-01-01T00:00:02.000Z', 'Custom DM', 'custom', false); - // A real group for contrast - 
storeChatMetadata('group@g.us', '2024-01-01T00:00:03.000Z', 'Group', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('group@g.us'); - }); - - it('returns empty array when no chats exist', () => { - const groups = getAvailableGroups(); - expect(groups).toHaveLength(0); - }); - - it('includes Telegram chat JIDs', () => { - storeChatMetadata('tg:100200300', '2024-01-01T00:00:01.000Z', 'Telegram Chat', 'telegram', true); - storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('tg:100200300'); - }); - - it('returns Telegram group JIDs with negative IDs', () => { - storeChatMetadata('tg:-1001234567890', '2024-01-01T00:00:01.000Z', 'TG Group', 'telegram', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(1); - expect(groups[0].jid).toBe('tg:-1001234567890'); - expect(groups[0].name).toBe('TG Group'); - }); - - it('marks registered Telegram chats correctly', () => { - storeChatMetadata('tg:100200300', '2024-01-01T00:00:01.000Z', 'TG Registered', 'telegram', true); - storeChatMetadata('tg:999999', '2024-01-01T00:00:02.000Z', 'TG Unregistered', 'telegram', true); - - _setRegisteredGroups({ - 'tg:100200300': { - name: 'TG Registered', - folder: 'tg-registered', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - }); - - const groups = getAvailableGroups(); - const tgReg = groups.find((g) => g.jid === 'tg:100200300'); - const tgUnreg = groups.find((g) => g.jid === 'tg:999999'); - - expect(tgReg?.isRegistered).toBe(true); - expect(tgUnreg?.isRegistered).toBe(false); - }); - - it('mixes WhatsApp and Telegram chats ordered by activity', () => { - storeChatMetadata('wa@g.us', '2024-01-01T00:00:01.000Z', 'WhatsApp', 'whatsapp', true); - storeChatMetadata('tg:100', '2024-01-01T00:00:03.000Z', 'Telegram', 'telegram', 
true); - storeChatMetadata('wa2@g.us', '2024-01-01T00:00:02.000Z', 'WhatsApp 2', 'whatsapp', true); - - const groups = getAvailableGroups(); - expect(groups).toHaveLength(3); - expect(groups[0].jid).toBe('tg:100'); - expect(groups[1].jid).toBe('wa2@g.us'); - expect(groups[2].jid).toBe('wa@g.us'); - }); -}); diff --git a/.claude/skills/add-telegram/tests/telegram.test.ts b/.claude/skills/add-telegram/tests/telegram.test.ts deleted file mode 100644 index 50dd599..0000000 --- a/.claude/skills/add-telegram/tests/telegram.test.ts +++ /dev/null @@ -1,118 +0,0 @@ -import { describe, expect, it } from 'vitest'; -import fs from 'fs'; -import path from 'path'; - -describe('telegram skill package', () => { - const skillDir = path.resolve(__dirname, '..'); - - it('has a valid manifest', () => { - const manifestPath = path.join(skillDir, 'manifest.yaml'); - expect(fs.existsSync(manifestPath)).toBe(true); - - const content = fs.readFileSync(manifestPath, 'utf-8'); - expect(content).toContain('skill: telegram'); - expect(content).toContain('version: 1.0.0'); - expect(content).toContain('grammy'); - }); - - it('has all files declared in adds', () => { - const addFile = path.join(skillDir, 'add', 'src', 'channels', 'telegram.ts'); - expect(fs.existsSync(addFile)).toBe(true); - - const content = fs.readFileSync(addFile, 'utf-8'); - expect(content).toContain('class TelegramChannel'); - expect(content).toContain('implements Channel'); - - // Test file for the channel - const testFile = path.join(skillDir, 'add', 'src', 'channels', 'telegram.test.ts'); - expect(fs.existsSync(testFile)).toBe(true); - - const testContent = fs.readFileSync(testFile, 'utf-8'); - expect(testContent).toContain("describe('TelegramChannel'"); - }); - - it('has all files declared in modifies', () => { - const indexFile = path.join(skillDir, 'modify', 'src', 'index.ts'); - const configFile = path.join(skillDir, 'modify', 'src', 'config.ts'); - const routingTestFile = path.join(skillDir, 'modify', 'src', 
'routing.test.ts'); - - expect(fs.existsSync(indexFile)).toBe(true); - expect(fs.existsSync(configFile)).toBe(true); - expect(fs.existsSync(routingTestFile)).toBe(true); - - const indexContent = fs.readFileSync(indexFile, 'utf-8'); - expect(indexContent).toContain('TelegramChannel'); - expect(indexContent).toContain('TELEGRAM_BOT_TOKEN'); - expect(indexContent).toContain('TELEGRAM_ONLY'); - expect(indexContent).toContain('findChannel'); - expect(indexContent).toContain('channels: Channel[]'); - - const configContent = fs.readFileSync(configFile, 'utf-8'); - expect(configContent).toContain('TELEGRAM_BOT_TOKEN'); - expect(configContent).toContain('TELEGRAM_ONLY'); - }); - - it('has intent files for modified files', () => { - expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'index.ts.intent.md'))).toBe(true); - expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'config.ts.intent.md'))).toBe(true); - }); - - it('modified index.ts preserves core structure', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'index.ts'), - 'utf-8', - ); - - // Core functions still present - expect(content).toContain('function loadState()'); - expect(content).toContain('function saveState()'); - expect(content).toContain('function registerGroup('); - expect(content).toContain('function getAvailableGroups()'); - expect(content).toContain('function processGroupMessages('); - expect(content).toContain('function runAgent('); - expect(content).toContain('function startMessageLoop()'); - expect(content).toContain('function recoverPendingMessages()'); - expect(content).toContain('function ensureContainerSystemRunning()'); - expect(content).toContain('async function main()'); - - // Test helper preserved - expect(content).toContain('_setRegisteredGroups'); - - // Direct-run guard preserved - expect(content).toContain('isDirectRun'); - }); - - it('modified index.ts includes Telegram channel creation', () => { - const content = fs.readFileSync( - 
path.join(skillDir, 'modify', 'src', 'index.ts'), - 'utf-8', - ); - - // Multi-channel architecture - expect(content).toContain('const channels: Channel[] = []'); - expect(content).toContain('channels.push(whatsapp)'); - expect(content).toContain('channels.push(telegram)'); - - // Conditional channel creation - expect(content).toContain('if (!TELEGRAM_ONLY)'); - expect(content).toContain('if (TELEGRAM_BOT_TOKEN)'); - - // Shutdown disconnects all channels - expect(content).toContain('for (const ch of channels) await ch.disconnect()'); - }); - - it('modified config.ts preserves all existing exports', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'config.ts'), - 'utf-8', - ); - - // All original exports preserved - expect(content).toContain('export const ASSISTANT_NAME'); - expect(content).toContain('export const POLL_INTERVAL'); - expect(content).toContain('export const TRIGGER_PATTERN'); - expect(content).toContain('export const CONTAINER_IMAGE'); - expect(content).toContain('export const DATA_DIR'); - expect(content).toContain('export const TIMEZONE'); - }); -}); diff --git a/.claude/skills/add-voice-transcription/SKILL.md b/.claude/skills/add-voice-transcription/SKILL.md index 771c2d8..8ccec32 100644 --- a/.claude/skills/add-voice-transcription/SKILL.md +++ b/.claude/skills/add-voice-transcription/SKILL.md @@ -11,7 +11,7 @@ This skill adds automatic voice message transcription to NanoClaw's WhatsApp cha ### Check if already applied -Read `.nanoclaw/state.yaml`. If `voice-transcription` is in `applied_skills`, skip to Phase 3 (Configure). The code changes are already in place. +Check if `src/transcription.ts` exists. If it does, skip to Phase 3 (Configure). The code changes are already in place. ### Ask the user @@ -23,42 +23,49 @@ If yes, collect it now. If no, direct them to create one at https://platform.ope ## Phase 2: Apply Code Changes -Run the skills engine to apply this skill's code package. 
+**Prerequisite:** WhatsApp must be installed first (`skill/whatsapp` merged). This skill modifies WhatsApp channel files. -### Initialize skills system (if needed) - -If `.nanoclaw/` directory doesn't exist yet: +### Ensure WhatsApp fork remote ```bash -npx tsx scripts/apply-skill.ts --init +git remote -v ``` -### Apply the skill +If `whatsapp` is missing, add it: ```bash -npx tsx scripts/apply-skill.ts .claude/skills/add-voice-transcription +git remote add whatsapp https://github.com/qwibitai/nanoclaw-whatsapp.git ``` -This deterministically: -- Adds `src/transcription.ts` (voice transcription module using OpenAI Whisper) -- Three-way merges voice handling into `src/channels/whatsapp.ts` (isVoiceMessage check, transcribeAudioMessage call) -- Three-way merges transcription tests into `src/channels/whatsapp.test.ts` (mock + 3 test cases) -- Installs the `openai` npm dependency -- Updates `.env.example` with `OPENAI_API_KEY` -- Records the application in `.nanoclaw/state.yaml` +### Merge the skill branch -If the apply reports merge conflicts, read the intent files: -- `modify/src/channels/whatsapp.ts.intent.md` — what changed and invariants for whatsapp.ts -- `modify/src/channels/whatsapp.test.ts.intent.md` — what changed for whatsapp.test.ts +```bash +git fetch whatsapp skill/voice-transcription +git merge whatsapp/skill/voice-transcription || { + git checkout --theirs package-lock.json + git add package-lock.json + git merge --continue +} +``` + +This merges in: +- `src/transcription.ts` (voice transcription module using OpenAI Whisper) +- Voice handling in `src/channels/whatsapp.ts` (isVoiceMessage check, transcribeAudioMessage call) +- Transcription tests in `src/channels/whatsapp.test.ts` +- `openai` npm dependency in `package.json` +- `OPENAI_API_KEY` in `.env.example` + +If the merge reports conflicts, resolve them by reading the conflicted files and understanding the intent of both sides. 
### Validate code changes

```bash
-npm test
+npm install --legacy-peer-deps
 npm run build
+npx vitest run src/channels/whatsapp.test.ts
```

-All tests must pass (including the 3 new voice transcription tests) and build must be clean before proceeding.
+All tests must pass and build must be clean before proceeding.

## Phase 3: Configure
diff --git a/.claude/skills/add-voice-transcription/add/src/transcription.ts b/.claude/skills/add-voice-transcription/add/src/transcription.ts
deleted file mode 100644
index 91c5e7f..0000000
--- a/.claude/skills/add-voice-transcription/add/src/transcription.ts
+++ /dev/null
@@ -1,98 +0,0 @@
-import { downloadMediaMessage } from '@whiskeysockets/baileys';
-import { WAMessage, WASocket } from '@whiskeysockets/baileys';
-
-import { readEnvFile } from './env.js';
-
-interface TranscriptionConfig {
-  model: string;
-  enabled: boolean;
-  fallbackMessage: string;
-}
-
-const DEFAULT_CONFIG: TranscriptionConfig = {
-  model: 'whisper-1',
-  enabled: true,
-  fallbackMessage: '[Voice Message - transcription unavailable]',
-};
-
-async function transcribeWithOpenAI(
-  audioBuffer: Buffer,
-  config: TranscriptionConfig,
-): Promise<string | null> {
-  const env = readEnvFile(['OPENAI_API_KEY']);
-  const apiKey = env.OPENAI_API_KEY;
-
-  if (!apiKey) {
-    console.warn('OPENAI_API_KEY not set in .env');
-    return null;
-  }
-
-  try {
-    const openaiModule = await import('openai');
-    const OpenAI = openaiModule.default;
-    const toFile = openaiModule.toFile;
-
-    const openai = new OpenAI({ apiKey });
-
-    const file = await toFile(audioBuffer, 'voice.ogg', {
-      type: 'audio/ogg',
-    });
-
-    const transcription = await openai.audio.transcriptions.create({
-      file: file,
-      model: config.model,
-      response_format: 'text',
-    });
-
-    // When response_format is 'text', the API returns a plain string
-    return transcription as unknown as string;
-  } catch (err) {
-    console.error('OpenAI transcription failed:', err);
-    return null;
-  }
-}
-
-export async function transcribeAudioMessage(
-  msg: WAMessage,
-  sock: WASocket,
-): Promise<string> {
-  const config = DEFAULT_CONFIG;
-
-  if (!config.enabled) {
-    return config.fallbackMessage;
-  }
-
-  try {
-    const buffer = (await downloadMediaMessage(
-      msg,
-      'buffer',
-      {},
-      {
-        logger: console as any,
-        reuploadRequest: sock.updateMediaMessage,
-      },
-    )) as Buffer;
-
-    if (!buffer || buffer.length === 0) {
-      console.error('Failed to download audio message');
-      return config.fallbackMessage;
-    }
-
-    console.log(`Downloaded audio message: ${buffer.length} bytes`);
-
-    const transcript = await transcribeWithOpenAI(buffer, config);
-
-    if (!transcript) {
-      return config.fallbackMessage;
-    }
-
-    return transcript.trim();
-  } catch (err) {
-    console.error('Transcription error:', err);
-    return config.fallbackMessage;
-  }
-}
-
-export function isVoiceMessage(msg: WAMessage): boolean {
-  return msg.message?.audioMessage?.ptt === true;
-}
diff --git a/.claude/skills/add-voice-transcription/manifest.yaml b/.claude/skills/add-voice-transcription/manifest.yaml
deleted file mode 100644
index cb4d587..0000000
--- a/.claude/skills/add-voice-transcription/manifest.yaml
+++ /dev/null
@@ -1,17 +0,0 @@
-skill: voice-transcription
-version: 1.0.0
-description: "Voice message transcription via OpenAI Whisper"
-core_version: 0.1.0
-adds:
-  - src/transcription.ts
-modifies:
-  - src/channels/whatsapp.ts
-  - src/channels/whatsapp.test.ts
-structured:
-  npm_dependencies:
-    openai: "^4.77.0"
-  env_additions:
-    - OPENAI_API_KEY
-conflicts: []
-depends: []
-test: "npx vitest run src/channels/whatsapp.test.ts"
diff --git a/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.test.ts b/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.test.ts
deleted file mode 100644
index b56c6c4..0000000
--- a/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.test.ts
+++ /dev/null
@@ -1,963 +0,0 @@
-import { describe, it, expect, beforeEach, vi, afterEach } from 'vitest';
-import
{ EventEmitter } from 'events';
-
-// --- Mocks ---
-
-// Mock config
-vi.mock('../config.js', () => ({
-  STORE_DIR: '/tmp/nanoclaw-test-store',
-  ASSISTANT_NAME: 'Andy',
-  ASSISTANT_HAS_OWN_NUMBER: false,
-}));
-
-// Mock logger
-vi.mock('../logger.js', () => ({
-  logger: {
-    debug: vi.fn(),
-    info: vi.fn(),
-    warn: vi.fn(),
-    error: vi.fn(),
-  },
-}));
-
-// Mock db
-vi.mock('../db.js', () => ({
-  getLastGroupSync: vi.fn(() => null),
-  setLastGroupSync: vi.fn(),
-  updateChatName: vi.fn(),
-}));
-
-// Mock transcription
-vi.mock('../transcription.js', () => ({
-  isVoiceMessage: vi.fn((msg: any) => msg.message?.audioMessage?.ptt === true),
-  transcribeAudioMessage: vi.fn().mockResolvedValue('Hello this is a voice message'),
-}));
-
-// Mock fs
-vi.mock('fs', async () => {
-  const actual = await vi.importActual('fs');
-  return {
-    ...actual,
-    default: {
-      ...actual,
-      existsSync: vi.fn(() => true),
-      mkdirSync: vi.fn(),
-    },
-  };
-});
-
-// Mock child_process (used for osascript notification)
-vi.mock('child_process', () => ({
-  exec: vi.fn(),
-}));
-
-// Build a fake WASocket that's an EventEmitter with the methods we need
-function createFakeSocket() {
-  const ev = new EventEmitter();
-  const sock = {
-    ev: {
-      on: (event: string, handler: (...args: unknown[]) => void) => {
-        ev.on(event, handler);
-      },
-    },
-    user: {
-      id: '1234567890:1@s.whatsapp.net',
-      lid: '9876543210:1@lid',
-    },
-    sendMessage: vi.fn().mockResolvedValue(undefined),
-    sendPresenceUpdate: vi.fn().mockResolvedValue(undefined),
-    groupFetchAllParticipating: vi.fn().mockResolvedValue({}),
-    end: vi.fn(),
-    // Expose the event emitter for triggering events in tests
-    _ev: ev,
-  };
-  return sock;
-}
-
-let fakeSocket: ReturnType<typeof createFakeSocket>;
-
-// Mock Baileys
-vi.mock('@whiskeysockets/baileys', () => {
-  return {
-    default: vi.fn(() => fakeSocket),
-    Browsers: { macOS: vi.fn(() => ['macOS', 'Chrome', '']) },
-    DisconnectReason: {
-      loggedOut: 401,
-      badSession: 500,
-      connectionClosed: 428,
-      connectionLost: 408,
-      connectionReplaced: 440,
-      timedOut: 408,
-      restartRequired: 515,
-    },
-    makeCacheableSignalKeyStore: vi.fn((keys: unknown) => keys),
-    useMultiFileAuthState: vi.fn().mockResolvedValue({
-      state: {
-        creds: {},
-        keys: {},
-      },
-      saveCreds: vi.fn(),
-    }),
-  };
-});
-
-import { WhatsAppChannel, WhatsAppChannelOpts } from './whatsapp.js';
-import { getLastGroupSync, updateChatName, setLastGroupSync } from '../db.js';
-import { transcribeAudioMessage } from '../transcription.js';
-
-// --- Test helpers ---
-
-function createTestOpts(overrides?: Partial<WhatsAppChannelOpts>): WhatsAppChannelOpts {
-  return {
-    onMessage: vi.fn(),
-    onChatMetadata: vi.fn(),
-    registeredGroups: vi.fn(() => ({
-      'registered@g.us': {
-        name: 'Test Group',
-        folder: 'test-group',
-        trigger: '@Andy',
-        added_at: '2024-01-01T00:00:00.000Z',
-      },
-    })),
-    ...overrides,
-  };
-}
-
-function triggerConnection(state: string, extra?: Record<string, unknown>) {
-  fakeSocket._ev.emit('connection.update', { connection: state, ...extra });
-}
-
-function triggerDisconnect(statusCode: number) {
-  fakeSocket._ev.emit('connection.update', {
-    connection: 'close',
-    lastDisconnect: {
-      error: { output: { statusCode } },
-    },
-  });
-}
-
-async function triggerMessages(messages: unknown[]) {
-  fakeSocket._ev.emit('messages.upsert', { messages });
-  // Flush microtasks so the async messages.upsert handler completes
-  await new Promise((r) => setTimeout(r, 0));
-}
-
-// --- Tests ---
-
-describe('WhatsAppChannel', () => {
-  beforeEach(() => {
-    fakeSocket = createFakeSocket();
-    vi.mocked(getLastGroupSync).mockReturnValue(null);
-  });
-
-  afterEach(() => {
-    vi.restoreAllMocks();
-  });
-
-  /**
-   * Helper: start connect, flush microtasks so event handlers are registered,
-   * then trigger the connection open event. Returns the resolved promise.
-   */
-  async function connectChannel(channel: WhatsAppChannel): Promise<void> {
-    const p = channel.connect();
-    // Flush microtasks so connectInternal completes its await and registers handlers
-    await new Promise((r) => setTimeout(r, 0));
-    triggerConnection('open');
-    return p;
-  }
-
-  // --- Connection lifecycle ---
-
-  describe('connection lifecycle', () => {
-    it('resolves connect() when connection opens', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      expect(channel.isConnected()).toBe(true);
-    });
-
-    it('sets up LID to phone mapping on open', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      // The channel should have mapped the LID from sock.user
-      // We can verify by sending a message from a LID JID
-      // and checking the translated JID in the callback
-    });
-
-    it('flushes outgoing queue on reconnect', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      // Disconnect
-      (channel as any).connected = false;
-
-      // Queue a message while disconnected
-      await channel.sendMessage('test@g.us', 'Queued message');
-      expect(fakeSocket.sendMessage).not.toHaveBeenCalled();
-
-      // Reconnect
-      (channel as any).connected = true;
-      await (channel as any).flushOutgoingQueue();
-
-      // Group messages get prefixed when flushed
-      expect(fakeSocket.sendMessage).toHaveBeenCalledWith(
-        'test@g.us',
-        { text: 'Andy: Queued message' },
-      );
-    });
-
-    it('disconnects cleanly', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await channel.disconnect();
-      expect(channel.isConnected()).toBe(false);
-      expect(fakeSocket.end).toHaveBeenCalled();
-    });
-  });
-
-  // --- QR code and auth ---
-
-  describe('authentication', () => {
-    it('exits process when QR code is emitted
(no auth state)', async () => { - vi.useFakeTimers(); - const mockExit = vi.spyOn(process, 'exit').mockImplementation(() => undefined as never); - - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - // Start connect but don't await (it won't resolve - process exits) - channel.connect().catch(() => {}); - - // Flush microtasks so connectInternal registers handlers - await vi.advanceTimersByTimeAsync(0); - - // Emit QR code event - fakeSocket._ev.emit('connection.update', { qr: 'some-qr-data' }); - - // Advance timer past the 1000ms setTimeout before exit - await vi.advanceTimersByTimeAsync(1500); - - expect(mockExit).toHaveBeenCalledWith(1); - mockExit.mockRestore(); - vi.useRealTimers(); - }); - }); - - // --- Reconnection behavior --- - - describe('reconnection', () => { - it('reconnects on non-loggedOut disconnect', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - expect(channel.isConnected()).toBe(true); - - // Disconnect with a non-loggedOut reason (e.g., connectionClosed = 428) - triggerDisconnect(428); - - expect(channel.isConnected()).toBe(false); - // The channel should attempt to reconnect (calls connectInternal again) - }); - - it('exits on loggedOut disconnect', async () => { - const mockExit = vi.spyOn(process, 'exit').mockImplementation(() => undefined as never); - - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - // Disconnect with loggedOut reason (401) - triggerDisconnect(401); - - expect(channel.isConnected()).toBe(false); - expect(mockExit).toHaveBeenCalledWith(0); - mockExit.mockRestore(); - }); - - it('retries reconnection after 5s on failure', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - // Disconnect with stream error 515 - triggerDisconnect(515); - - // The channel sets a 5s retry — 
just verify it doesn't crash - await new Promise((r) => setTimeout(r, 100)); - }); - }); - - // --- Message handling --- - - describe('message handling', () => { - it('delivers message for registered group', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-1', - remoteJid: 'registered@g.us', - participant: '5551234@s.whatsapp.net', - fromMe: false, - }, - message: { conversation: 'Hello Andy' }, - pushName: 'Alice', - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'registered@g.us', - expect.any(String), - undefined, - 'whatsapp', - true, - ); - expect(opts.onMessage).toHaveBeenCalledWith( - 'registered@g.us', - expect.objectContaining({ - id: 'msg-1', - content: 'Hello Andy', - sender_name: 'Alice', - is_from_me: false, - }), - ); - }); - - it('only emits metadata for unregistered groups', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-2', - remoteJid: 'unregistered@g.us', - participant: '5551234@s.whatsapp.net', - fromMe: false, - }, - message: { conversation: 'Hello' }, - pushName: 'Bob', - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'unregistered@g.us', - expect.any(String), - undefined, - 'whatsapp', - true, - ); - expect(opts.onMessage).not.toHaveBeenCalled(); - }); - - it('ignores status@broadcast messages', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-3', - remoteJid: 'status@broadcast', - fromMe: false, - }, - message: { conversation: 'Status update' }, - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - 
expect(opts.onChatMetadata).not.toHaveBeenCalled(); - expect(opts.onMessage).not.toHaveBeenCalled(); - }); - - it('ignores messages with no content', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-4', - remoteJid: 'registered@g.us', - fromMe: false, - }, - message: null, - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - expect(opts.onMessage).not.toHaveBeenCalled(); - }); - - it('extracts text from extendedTextMessage', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-5', - remoteJid: 'registered@g.us', - participant: '5551234@s.whatsapp.net', - fromMe: false, - }, - message: { - extendedTextMessage: { text: 'A reply message' }, - }, - pushName: 'Charlie', - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'registered@g.us', - expect.objectContaining({ content: 'A reply message' }), - ); - }); - - it('extracts caption from imageMessage', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-6', - remoteJid: 'registered@g.us', - participant: '5551234@s.whatsapp.net', - fromMe: false, - }, - message: { - imageMessage: { caption: 'Check this photo', mimetype: 'image/jpeg' }, - }, - pushName: 'Diana', - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'registered@g.us', - expect.objectContaining({ content: 'Check this photo' }), - ); - }); - - it('extracts caption from videoMessage', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - 
id: 'msg-7', - remoteJid: 'registered@g.us', - participant: '5551234@s.whatsapp.net', - fromMe: false, - }, - message: { - videoMessage: { caption: 'Watch this', mimetype: 'video/mp4' }, - }, - pushName: 'Eve', - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'registered@g.us', - expect.objectContaining({ content: 'Watch this' }), - ); - }); - - it('transcribes voice messages', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-8', - remoteJid: 'registered@g.us', - participant: '5551234@s.whatsapp.net', - fromMe: false, - }, - message: { - audioMessage: { mimetype: 'audio/ogg; codecs=opus', ptt: true }, - }, - pushName: 'Frank', - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - expect(transcribeAudioMessage).toHaveBeenCalled(); - expect(opts.onMessage).toHaveBeenCalledTimes(1); - expect(opts.onMessage).toHaveBeenCalledWith( - 'registered@g.us', - expect.objectContaining({ content: '[Voice: Hello this is a voice message]' }), - ); - }); - - it('falls back when transcription returns null', async () => { - vi.mocked(transcribeAudioMessage).mockResolvedValueOnce(null); - - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-8b', - remoteJid: 'registered@g.us', - participant: '5551234@s.whatsapp.net', - fromMe: false, - }, - message: { - audioMessage: { mimetype: 'audio/ogg; codecs=opus', ptt: true }, - }, - pushName: 'Frank', - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - expect(opts.onMessage).toHaveBeenCalledTimes(1); - expect(opts.onMessage).toHaveBeenCalledWith( - 'registered@g.us', - expect.objectContaining({ content: '[Voice Message - transcription unavailable]' }), - ); - }); - - it('falls back when transcription throws', async () 
=> { - vi.mocked(transcribeAudioMessage).mockRejectedValueOnce(new Error('API error')); - - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-8c', - remoteJid: 'registered@g.us', - participant: '5551234@s.whatsapp.net', - fromMe: false, - }, - message: { - audioMessage: { mimetype: 'audio/ogg; codecs=opus', ptt: true }, - }, - pushName: 'Frank', - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - expect(opts.onMessage).toHaveBeenCalledTimes(1); - expect(opts.onMessage).toHaveBeenCalledWith( - 'registered@g.us', - expect.objectContaining({ content: '[Voice Message - transcription failed]' }), - ); - }); - - it('uses sender JID when pushName is absent', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-9', - remoteJid: 'registered@g.us', - participant: '5551234@s.whatsapp.net', - fromMe: false, - }, - message: { conversation: 'No push name' }, - // pushName is undefined - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - expect(opts.onMessage).toHaveBeenCalledWith( - 'registered@g.us', - expect.objectContaining({ sender_name: '5551234' }), - ); - }); - }); - - // --- LID ↔ JID translation --- - - describe('LID to JID translation', () => { - it('translates known LID to phone JID', async () => { - const opts = createTestOpts({ - registeredGroups: vi.fn(() => ({ - '1234567890@s.whatsapp.net': { - name: 'Self Chat', - folder: 'self-chat', - trigger: '@Andy', - added_at: '2024-01-01T00:00:00.000Z', - }, - })), - }); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - // The socket has lid '9876543210:1@lid' → phone '1234567890@s.whatsapp.net' - // Send a message from the LID - await triggerMessages([ - { - key: { - id: 'msg-lid', - remoteJid: '9876543210@lid', - fromMe: 
false, - }, - message: { conversation: 'From LID' }, - pushName: 'Self', - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - // Should be translated to phone JID - expect(opts.onChatMetadata).toHaveBeenCalledWith( - '1234567890@s.whatsapp.net', - expect.any(String), - undefined, - 'whatsapp', - false, - ); - }); - - it('passes through non-LID JIDs unchanged', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-normal', - remoteJid: 'registered@g.us', - participant: '5551234@s.whatsapp.net', - fromMe: false, - }, - message: { conversation: 'Normal JID' }, - pushName: 'Grace', - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - expect(opts.onChatMetadata).toHaveBeenCalledWith( - 'registered@g.us', - expect.any(String), - undefined, - 'whatsapp', - true, - ); - }); - - it('passes through unknown LID JIDs unchanged', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await triggerMessages([ - { - key: { - id: 'msg-unknown-lid', - remoteJid: '0000000000@lid', - fromMe: false, - }, - message: { conversation: 'Unknown LID' }, - pushName: 'Unknown', - messageTimestamp: Math.floor(Date.now() / 1000), - }, - ]); - - // Unknown LID passes through unchanged - expect(opts.onChatMetadata).toHaveBeenCalledWith( - '0000000000@lid', - expect.any(String), - undefined, - 'whatsapp', - false, - ); - }); - }); - - // --- Outgoing message queue --- - - describe('outgoing message queue', () => { - it('sends message directly when connected', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await channel.sendMessage('test@g.us', 'Hello'); - // Group messages get prefixed with assistant name - expect(fakeSocket.sendMessage).toHaveBeenCalledWith('test@g.us', { text: 'Andy: Hello' 
}); - }); - - it('prefixes direct chat messages on shared number', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await channel.sendMessage('123@s.whatsapp.net', 'Hello'); - // Shared number: DMs also get prefixed (needed for self-chat distinction) - expect(fakeSocket.sendMessage).toHaveBeenCalledWith('123@s.whatsapp.net', { text: 'Andy: Hello' }); - }); - - it('queues message when disconnected', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - // Don't connect — channel starts disconnected - await channel.sendMessage('test@g.us', 'Queued'); - expect(fakeSocket.sendMessage).not.toHaveBeenCalled(); - }); - - it('queues message on send failure', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - // Make sendMessage fail - fakeSocket.sendMessage.mockRejectedValueOnce(new Error('Network error')); - - await channel.sendMessage('test@g.us', 'Will fail'); - - // Should not throw, message queued for retry - // The queue should have the message - }); - - it('flushes multiple queued messages in order', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - // Queue messages while disconnected - await channel.sendMessage('test@g.us', 'First'); - await channel.sendMessage('test@g.us', 'Second'); - await channel.sendMessage('test@g.us', 'Third'); - - // Connect — flush happens automatically on open - await connectChannel(channel); - - // Give the async flush time to complete - await new Promise((r) => setTimeout(r, 50)); - - expect(fakeSocket.sendMessage).toHaveBeenCalledTimes(3); - // Group messages get prefixed - expect(fakeSocket.sendMessage).toHaveBeenNthCalledWith(1, 'test@g.us', { text: 'Andy: First' }); - expect(fakeSocket.sendMessage).toHaveBeenNthCalledWith(2, 'test@g.us', { text: 'Andy: Second' }); - 
expect(fakeSocket.sendMessage).toHaveBeenNthCalledWith(3, 'test@g.us', { text: 'Andy: Third' }); - }); - }); - - // --- Group metadata sync --- - - describe('group metadata sync', () => { - it('syncs group metadata on first connection', async () => { - fakeSocket.groupFetchAllParticipating.mockResolvedValue({ - 'group1@g.us': { subject: 'Group One' }, - 'group2@g.us': { subject: 'Group Two' }, - }); - - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - // Wait for async sync to complete - await new Promise((r) => setTimeout(r, 50)); - - expect(fakeSocket.groupFetchAllParticipating).toHaveBeenCalled(); - expect(updateChatName).toHaveBeenCalledWith('group1@g.us', 'Group One'); - expect(updateChatName).toHaveBeenCalledWith('group2@g.us', 'Group Two'); - expect(setLastGroupSync).toHaveBeenCalled(); - }); - - it('skips sync when synced recently', async () => { - // Last sync was 1 hour ago (within 24h threshold) - vi.mocked(getLastGroupSync).mockReturnValue( - new Date(Date.now() - 60 * 60 * 1000).toISOString(), - ); - - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await new Promise((r) => setTimeout(r, 50)); - - expect(fakeSocket.groupFetchAllParticipating).not.toHaveBeenCalled(); - }); - - it('forces sync regardless of cache', async () => { - vi.mocked(getLastGroupSync).mockReturnValue( - new Date(Date.now() - 60 * 60 * 1000).toISOString(), - ); - - fakeSocket.groupFetchAllParticipating.mockResolvedValue({ - 'group@g.us': { subject: 'Forced Group' }, - }); - - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await channel.syncGroupMetadata(true); - - expect(fakeSocket.groupFetchAllParticipating).toHaveBeenCalled(); - expect(updateChatName).toHaveBeenCalledWith('group@g.us', 'Forced Group'); - }); - - it('handles group sync failure gracefully', async () => { - 
fakeSocket.groupFetchAllParticipating.mockRejectedValue( - new Error('Network timeout'), - ); - - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - // Should not throw - await expect(channel.syncGroupMetadata(true)).resolves.toBeUndefined(); - }); - - it('skips groups with no subject', async () => { - fakeSocket.groupFetchAllParticipating.mockResolvedValue({ - 'group1@g.us': { subject: 'Has Subject' }, - 'group2@g.us': { subject: '' }, - 'group3@g.us': {}, - }); - - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - // Clear any calls from the automatic sync on connect - vi.mocked(updateChatName).mockClear(); - - await channel.syncGroupMetadata(true); - - expect(updateChatName).toHaveBeenCalledTimes(1); - expect(updateChatName).toHaveBeenCalledWith('group1@g.us', 'Has Subject'); - }); - }); - - // --- JID ownership --- - - describe('ownsJid', () => { - it('owns @g.us JIDs (WhatsApp groups)', () => { - const channel = new WhatsAppChannel(createTestOpts()); - expect(channel.ownsJid('12345@g.us')).toBe(true); - }); - - it('owns @s.whatsapp.net JIDs (WhatsApp DMs)', () => { - const channel = new WhatsAppChannel(createTestOpts()); - expect(channel.ownsJid('12345@s.whatsapp.net')).toBe(true); - }); - - it('does not own Telegram JIDs', () => { - const channel = new WhatsAppChannel(createTestOpts()); - expect(channel.ownsJid('tg:12345')).toBe(false); - }); - - it('does not own unknown JID formats', () => { - const channel = new WhatsAppChannel(createTestOpts()); - expect(channel.ownsJid('random-string')).toBe(false); - }); - }); - - // --- Typing indicator --- - - describe('setTyping', () => { - it('sends composing presence when typing', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await channel.setTyping('test@g.us', true); - 
expect(fakeSocket.sendPresenceUpdate).toHaveBeenCalledWith('composing', 'test@g.us'); - }); - - it('sends paused presence when stopping', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - await channel.setTyping('test@g.us', false); - expect(fakeSocket.sendPresenceUpdate).toHaveBeenCalledWith('paused', 'test@g.us'); - }); - - it('handles typing indicator failure gracefully', async () => { - const opts = createTestOpts(); - const channel = new WhatsAppChannel(opts); - - await connectChannel(channel); - - fakeSocket.sendPresenceUpdate.mockRejectedValueOnce(new Error('Failed')); - - // Should not throw - await expect(channel.setTyping('test@g.us', true)).resolves.toBeUndefined(); - }); - }); - - // --- Channel properties --- - - describe('channel properties', () => { - it('has name "whatsapp"', () => { - const channel = new WhatsAppChannel(createTestOpts()); - expect(channel.name).toBe('whatsapp'); - }); - - it('does not expose prefixAssistantName (prefix handled internally)', () => { - const channel = new WhatsAppChannel(createTestOpts()); - expect('prefixAssistantName' in channel).toBe(false); - }); - }); -}); diff --git a/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.test.ts.intent.md b/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.test.ts.intent.md deleted file mode 100644 index 5856320..0000000 --- a/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.test.ts.intent.md +++ /dev/null @@ -1,26 +0,0 @@ -# Intent: src/channels/whatsapp.test.ts modifications - -## What changed -Added mock for the transcription module and 3 new test cases for voice message handling. 
- -## Key sections - -### Mocks (top of file) -- Added: `vi.mock('../transcription.js', ...)` with `isVoiceMessage` and `transcribeAudioMessage` mocks -- Added: `import { transcribeAudioMessage } from '../transcription.js'` for test assertions - -### Test cases (inside "message handling" describe block) -- Changed: "handles message with no extractable text (e.g. voice note without caption)" → "transcribes voice messages" - - Now expects `[Voice: Hello this is a voice message]` instead of empty content -- Added: "falls back when transcription returns null" — expects `[Voice Message - transcription unavailable]` -- Added: "falls back when transcription throws" — expects `[Voice Message - transcription failed]` - -## Invariants (must-keep) -- All existing test cases for text, extendedTextMessage, imageMessage, videoMessage unchanged -- All connection lifecycle tests unchanged -- All LID translation tests unchanged -- All outgoing queue tests unchanged -- All group metadata sync tests unchanged -- All ownsJid and setTyping tests unchanged -- All existing mocks (config, logger, db, fs, child_process, baileys) unchanged -- Test helpers (createTestOpts, triggerConnection, triggerDisconnect, triggerMessages, connectChannel) unchanged diff --git a/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.ts b/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.ts deleted file mode 100644 index 0781185..0000000 --- a/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.ts +++ /dev/null @@ -1,356 +0,0 @@ -import { exec } from 'child_process'; -import fs from 'fs'; -import path from 'path'; - -import makeWASocket, { - Browsers, - DisconnectReason, - WASocket, - fetchLatestWaWebVersion, - makeCacheableSignalKeyStore, - useMultiFileAuthState, -} from '@whiskeysockets/baileys'; - -import { ASSISTANT_HAS_OWN_NUMBER, ASSISTANT_NAME, STORE_DIR } from '../config.js'; -import { - getLastGroupSync, - setLastGroupSync, - updateChatName, -} from 
'../db.js'; -import { logger } from '../logger.js'; -import { isVoiceMessage, transcribeAudioMessage } from '../transcription.js'; -import { Channel, OnInboundMessage, OnChatMetadata, RegisteredGroup } from '../types.js'; - -const GROUP_SYNC_INTERVAL_MS = 24 * 60 * 60 * 1000; // 24 hours - -export interface WhatsAppChannelOpts { - onMessage: OnInboundMessage; - onChatMetadata: OnChatMetadata; - registeredGroups: () => Record<string, RegisteredGroup>; -} - -export class WhatsAppChannel implements Channel { - name = 'whatsapp'; - - private sock!: WASocket; - private connected = false; - private lidToPhoneMap: Record<string, string> = {}; - private outgoingQueue: Array<{ jid: string; text: string }> = []; - private flushing = false; - private groupSyncTimerStarted = false; - - private opts: WhatsAppChannelOpts; - - constructor(opts: WhatsAppChannelOpts) { - this.opts = opts; - } - - async connect(): Promise<void> { - return new Promise((resolve, reject) => { - this.connectInternal(resolve).catch(reject); - }); - } - - private async connectInternal(onFirstOpen?: () => void): Promise<void> { - const authDir = path.join(STORE_DIR, 'auth'); - fs.mkdirSync(authDir, { recursive: true }); - - const { state, saveCreds } = await useMultiFileAuthState(authDir); - - const { version } = await fetchLatestWaWebVersion({}).catch((err) => { - logger.warn({ err }, 'Failed to fetch latest WA Web version, using default'); - return { version: undefined }; - }); - this.sock = makeWASocket({ - version, - auth: { - creds: state.creds, - keys: makeCacheableSignalKeyStore(state.keys, logger), - }, - printQRInTerminal: false, - logger, - browser: Browsers.macOS('Chrome'), - }); - - this.sock.ev.on('connection.update', (update) => { - const { connection, lastDisconnect, qr } = update; - - if (qr) { - const msg = - 'WhatsApp authentication required. 
Run /setup in Claude Code.'; - logger.error(msg); - exec( - `osascript -e 'display notification "${msg}" with title "NanoClaw" sound name "Basso"'`, - ); - setTimeout(() => process.exit(1), 1000); - } - - if (connection === 'close') { - this.connected = false; - const reason = (lastDisconnect?.error as { output?: { statusCode?: number } })?.output?.statusCode; - const shouldReconnect = reason !== DisconnectReason.loggedOut; - logger.info({ reason, shouldReconnect, queuedMessages: this.outgoingQueue.length }, 'Connection closed'); - - if (shouldReconnect) { - logger.info('Reconnecting...'); - this.connectInternal().catch((err) => { - logger.error({ err }, 'Failed to reconnect, retrying in 5s'); - setTimeout(() => { - this.connectInternal().catch((err2) => { - logger.error({ err: err2 }, 'Reconnection retry failed'); - }); - }, 5000); - }); - } else { - logger.info('Logged out. Run /setup to re-authenticate.'); - process.exit(0); - } - } else if (connection === 'open') { - this.connected = true; - logger.info('Connected to WhatsApp'); - - // Announce availability so WhatsApp relays subsequent presence updates (typing indicators) - this.sock.sendPresenceUpdate('available').catch((err) => { - logger.warn({ err }, 'Failed to send presence update'); - }); - - // Build LID to phone mapping from auth state for self-chat translation - if (this.sock.user) { - const phoneUser = this.sock.user.id.split(':')[0]; - const lidUser = this.sock.user.lid?.split(':')[0]; - if (lidUser && phoneUser) { - this.lidToPhoneMap[lidUser] = `${phoneUser}@s.whatsapp.net`; - logger.debug({ lidUser, phoneUser }, 'LID to phone mapping set'); - } - } - - // Flush any messages queued while disconnected - this.flushOutgoingQueue().catch((err) => - logger.error({ err }, 'Failed to flush outgoing queue'), - ); - - // Sync group metadata on startup (respects 24h cache) - this.syncGroupMetadata().catch((err) => - logger.error({ err }, 'Initial group sync failed'), - ); - // Set up daily sync timer (only 
once) - if (!this.groupSyncTimerStarted) { - this.groupSyncTimerStarted = true; - setInterval(() => { - this.syncGroupMetadata().catch((err) => - logger.error({ err }, 'Periodic group sync failed'), - ); - }, GROUP_SYNC_INTERVAL_MS); - } - - // Signal first connection to caller - if (onFirstOpen) { - onFirstOpen(); - onFirstOpen = undefined; - } - } - }); - - this.sock.ev.on('creds.update', saveCreds); - - this.sock.ev.on('messages.upsert', async ({ messages }) => { - for (const msg of messages) { - if (!msg.message) continue; - const rawJid = msg.key.remoteJid; - if (!rawJid || rawJid === 'status@broadcast') continue; - - // Translate LID JID to phone JID if applicable - const chatJid = await this.translateJid(rawJid); - - const timestamp = new Date( - Number(msg.messageTimestamp) * 1000, - ).toISOString(); - - // Always notify about chat metadata for group discovery - const isGroup = chatJid.endsWith('@g.us'); - this.opts.onChatMetadata(chatJid, timestamp, undefined, 'whatsapp', isGroup); - - // Only deliver full message for registered groups - const groups = this.opts.registeredGroups(); - if (groups[chatJid]) { - const content = - msg.message?.conversation || - msg.message?.extendedTextMessage?.text || - msg.message?.imageMessage?.caption || - msg.message?.videoMessage?.caption || - ''; - - // Skip protocol messages with no text content (encryption keys, read receipts, etc.) - // but allow voice messages through for transcription - if (!content && !isVoiceMessage(msg)) continue; - - const sender = msg.key.participant || msg.key.remoteJid || ''; - const senderName = msg.pushName || sender.split('@')[0]; - - const fromMe = msg.key.fromMe || false; - // Detect bot messages: with own number, fromMe is reliable - // since only the bot sends from that number. - // With shared number, bot messages carry the assistant name prefix - // (even in DMs/self-chat) so we check for that. - const isBotMessage = ASSISTANT_HAS_OWN_NUMBER - ? 
fromMe - : content.startsWith(`${ASSISTANT_NAME}:`); - - // Transcribe voice messages before storing - let finalContent = content; - if (isVoiceMessage(msg)) { - try { - const transcript = await transcribeAudioMessage(msg, this.sock); - if (transcript) { - finalContent = `[Voice: ${transcript}]`; - logger.info({ chatJid, length: transcript.length }, 'Transcribed voice message'); - } else { - finalContent = '[Voice Message - transcription unavailable]'; - } - } catch (err) { - logger.error({ err }, 'Voice transcription error'); - finalContent = '[Voice Message - transcription failed]'; - } - } - - this.opts.onMessage(chatJid, { - id: msg.key.id || '', - chat_jid: chatJid, - sender, - sender_name: senderName, - content: finalContent, - timestamp, - is_from_me: fromMe, - is_bot_message: isBotMessage, - }); - } - } - }); - } - - async sendMessage(jid: string, text: string): Promise<void> { - // Prefix bot messages with assistant name so users know who's speaking. - // On a shared number, prefix is also needed in DMs (including self-chat) - // to distinguish bot output from user messages. - // Skip only when the assistant has its own dedicated phone number. - const prefixed = ASSISTANT_HAS_OWN_NUMBER - ? 
text - : `${ASSISTANT_NAME}: ${text}`; - - if (!this.connected) { - this.outgoingQueue.push({ jid, text: prefixed }); - logger.info({ jid, length: prefixed.length, queueSize: this.outgoingQueue.length }, 'WA disconnected, message queued'); - return; - } - try { - await this.sock.sendMessage(jid, { text: prefixed }); - logger.info({ jid, length: prefixed.length }, 'Message sent'); - } catch (err) { - // If send fails, queue it for retry on reconnect - this.outgoingQueue.push({ jid, text: prefixed }); - logger.warn({ jid, err, queueSize: this.outgoingQueue.length }, 'Failed to send, message queued'); - } - } - - isConnected(): boolean { - return this.connected; - } - - ownsJid(jid: string): boolean { - return jid.endsWith('@g.us') || jid.endsWith('@s.whatsapp.net'); - } - - async disconnect(): Promise<void> { - this.connected = false; - this.sock?.end(undefined); - } - - async setTyping(jid: string, isTyping: boolean): Promise<void> { - try { - const status = isTyping ? 'composing' : 'paused'; - logger.debug({ jid, status }, 'Sending presence update'); - await this.sock.sendPresenceUpdate(status, jid); - } catch (err) { - logger.debug({ jid, err }, 'Failed to update typing status'); - } - } - - /** - * Sync group metadata from WhatsApp. - * Fetches all participating groups and stores their names in the database. - * Called on startup, daily, and on-demand via IPC. 
- */ - async syncGroupMetadata(force = false): Promise<void> { - if (!force) { - const lastSync = getLastGroupSync(); - if (lastSync) { - const lastSyncTime = new Date(lastSync).getTime(); - if (Date.now() - lastSyncTime < GROUP_SYNC_INTERVAL_MS) { - logger.debug({ lastSync }, 'Skipping group sync - synced recently'); - return; - } - } - } - - try { - logger.info('Syncing group metadata from WhatsApp...'); - const groups = await this.sock.groupFetchAllParticipating(); - - let count = 0; - for (const [jid, metadata] of Object.entries(groups)) { - if (metadata.subject) { - updateChatName(jid, metadata.subject); - count++; - } - } - - setLastGroupSync(); - logger.info({ count }, 'Group metadata synced'); - } catch (err) { - logger.error({ err }, 'Failed to sync group metadata'); - } - } - - private async translateJid(jid: string): Promise<string> { - if (!jid.endsWith('@lid')) return jid; - const lidUser = jid.split('@')[0].split(':')[0]; - - // Check local cache first - const cached = this.lidToPhoneMap[lidUser]; - if (cached) { - logger.debug({ lidJid: jid, phoneJid: cached }, 'Translated LID to phone JID (cached)'); - return cached; - } - - // Query Baileys' signal repository for the mapping - try { - const pn = await this.sock.signalRepository?.lidMapping?.getPNForLID(jid); - if (pn) { - const phoneJid = `${pn.split('@')[0].split(':')[0]}@s.whatsapp.net`; - this.lidToPhoneMap[lidUser] = phoneJid; - logger.info({ lidJid: jid, phoneJid }, 'Translated LID to phone JID (signalRepository)'); - return phoneJid; - } - } catch (err) { - logger.debug({ err, jid }, 'Failed to resolve LID via signalRepository'); - } - - return jid; - } - - private async flushOutgoingQueue(): Promise<void> { - if (this.flushing || this.outgoingQueue.length === 0) return; - this.flushing = true; - try { - logger.info({ count: this.outgoingQueue.length }, 'Flushing outgoing message queue'); - while (this.outgoingQueue.length > 0) { - const item = this.outgoingQueue.shift()!; - // Send directly — queued items are 
already prefixed by sendMessage - await this.sock.sendMessage(item.jid, { text: item.text }); - logger.info({ jid: item.jid, length: item.text.length }, 'Queued message sent'); - } - } finally { - this.flushing = false; - } - } -} diff --git a/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.ts.intent.md b/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.ts.intent.md deleted file mode 100644 index 0049fed..0000000 --- a/.claude/skills/add-voice-transcription/modify/src/channels/whatsapp.ts.intent.md +++ /dev/null @@ -1,27 +0,0 @@ -# Intent: src/channels/whatsapp.ts modifications - -## What changed -Added voice message transcription support. When a WhatsApp voice note (PTT audio) arrives, it is downloaded and transcribed via OpenAI Whisper before being stored as message content. - -## Key sections - -### Imports (top of file) -- Added: `isVoiceMessage`, `transcribeAudioMessage` from `../transcription.js` - -### messages.upsert handler (inside connectInternal) -- Added: `let finalContent = content` variable to allow voice transcription to override text content -- Added: `isVoiceMessage(msg)` check after content extraction -- Added: try/catch block calling `transcribeAudioMessage(msg, this.sock)` - - Success: `finalContent = '[Voice: <transcript>]'` - - Null result: `finalContent = '[Voice Message - transcription unavailable]'` - - Error: `finalContent = '[Voice Message - transcription failed]'` -- Changed: `this.opts.onMessage()` call uses `finalContent` instead of `content` - -## Invariants (must-keep) -- All existing message handling (conversation, extendedTextMessage, imageMessage, videoMessage) unchanged - Connection lifecycle (connect, reconnect, disconnect) unchanged -- LID translation logic unchanged -- Outgoing message queue unchanged -- Group metadata sync unchanged -- sendMessage prefix logic unchanged -- setTyping, ownsJid, isConnected — all unchanged diff --git 
a/.claude/skills/add-voice-transcription/tests/voice-transcription.test.ts b/.claude/skills/add-voice-transcription/tests/voice-transcription.test.ts deleted file mode 100644 index 76ebd0d..0000000 --- a/.claude/skills/add-voice-transcription/tests/voice-transcription.test.ts +++ /dev/null @@ -1,123 +0,0 @@ -import { describe, expect, it } from 'vitest'; -import fs from 'fs'; -import path from 'path'; - -describe('voice-transcription skill package', () => { - const skillDir = path.resolve(__dirname, '..'); - - it('has a valid manifest', () => { - const manifestPath = path.join(skillDir, 'manifest.yaml'); - expect(fs.existsSync(manifestPath)).toBe(true); - - const content = fs.readFileSync(manifestPath, 'utf-8'); - expect(content).toContain('skill: voice-transcription'); - expect(content).toContain('version: 1.0.0'); - expect(content).toContain('openai'); - expect(content).toContain('OPENAI_API_KEY'); - }); - - it('has all files declared in adds', () => { - const transcriptionFile = path.join(skillDir, 'add', 'src', 'transcription.ts'); - expect(fs.existsSync(transcriptionFile)).toBe(true); - - const content = fs.readFileSync(transcriptionFile, 'utf-8'); - expect(content).toContain('transcribeAudioMessage'); - expect(content).toContain('isVoiceMessage'); - expect(content).toContain('transcribeWithOpenAI'); - expect(content).toContain('downloadMediaMessage'); - expect(content).toContain('readEnvFile'); - }); - - it('has all files declared in modifies', () => { - const whatsappFile = path.join(skillDir, 'modify', 'src', 'channels', 'whatsapp.ts'); - const whatsappTestFile = path.join(skillDir, 'modify', 'src', 'channels', 'whatsapp.test.ts'); - - expect(fs.existsSync(whatsappFile)).toBe(true); - expect(fs.existsSync(whatsappTestFile)).toBe(true); - }); - - it('has intent files for modified files', () => { - expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'channels', 'whatsapp.ts.intent.md'))).toBe(true); - expect(fs.existsSync(path.join(skillDir, 'modify', 
'src', 'channels', 'whatsapp.test.ts.intent.md'))).toBe(true); - }); - - it('modified whatsapp.ts preserves core structure', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'channels', 'whatsapp.ts'), - 'utf-8', - ); - - // Core class and methods preserved - expect(content).toContain('class WhatsAppChannel'); - expect(content).toContain('implements Channel'); - expect(content).toContain('async connect()'); - expect(content).toContain('async sendMessage('); - expect(content).toContain('isConnected()'); - expect(content).toContain('ownsJid('); - expect(content).toContain('async disconnect()'); - expect(content).toContain('async setTyping('); - expect(content).toContain('async syncGroupMetadata('); - expect(content).toContain('private async translateJid('); - expect(content).toContain('private async flushOutgoingQueue('); - - // Core imports preserved - expect(content).toContain('ASSISTANT_HAS_OWN_NUMBER'); - expect(content).toContain('ASSISTANT_NAME'); - expect(content).toContain('STORE_DIR'); - }); - - it('modified whatsapp.ts includes transcription integration', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'channels', 'whatsapp.ts'), - 'utf-8', - ); - - // Transcription imports - expect(content).toContain("import { isVoiceMessage, transcribeAudioMessage } from '../transcription.js'"); - - // Voice message handling - expect(content).toContain('isVoiceMessage(msg)'); - expect(content).toContain('transcribeAudioMessage(msg, this.sock)'); - expect(content).toContain('finalContent'); - expect(content).toContain('[Voice:'); - expect(content).toContain('[Voice Message - transcription unavailable]'); - expect(content).toContain('[Voice Message - transcription failed]'); - }); - - it('modified whatsapp.test.ts includes transcription mock and tests', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'channels', 'whatsapp.test.ts'), - 'utf-8', - ); - - // Transcription mock 
- expect(content).toContain("vi.mock('../transcription.js'"); - expect(content).toContain('isVoiceMessage'); - expect(content).toContain('transcribeAudioMessage'); - - // Voice transcription test cases - expect(content).toContain('transcribes voice messages'); - expect(content).toContain('falls back when transcription returns null'); - expect(content).toContain('falls back when transcription throws'); - expect(content).toContain('[Voice: Hello this is a voice message]'); - }); - - it('modified whatsapp.test.ts preserves all existing test sections', () => { - const content = fs.readFileSync( - path.join(skillDir, 'modify', 'src', 'channels', 'whatsapp.test.ts'), - 'utf-8', - ); - - // All existing test describe blocks preserved - expect(content).toContain("describe('connection lifecycle'"); - expect(content).toContain("describe('authentication'"); - expect(content).toContain("describe('reconnection'"); - expect(content).toContain("describe('message handling'"); - expect(content).toContain("describe('LID to JID translation'"); - expect(content).toContain("describe('outgoing message queue'"); - expect(content).toContain("describe('group metadata sync'"); - expect(content).toContain("describe('ownsJid'"); - expect(content).toContain("describe('setTyping'"); - expect(content).toContain("describe('channel properties'"); - }); -}); diff --git a/.claude/skills/add-whatsapp/SKILL.md b/.claude/skills/add-whatsapp/SKILL.md new file mode 100644 index 0000000..cbdf00b --- /dev/null +++ b/.claude/skills/add-whatsapp/SKILL.md @@ -0,0 +1,372 @@ +--- +name: add-whatsapp +description: Add WhatsApp as a channel. Can replace other channels entirely or run alongside them. Uses QR code or pairing code for authentication. +--- + +# Add WhatsApp Channel + +This skill adds WhatsApp support to NanoClaw. It installs the WhatsApp channel code, dependencies, and guides through authentication, registration, and configuration. 
+ +## Phase 1: Pre-flight + +### Check current state + +Check if WhatsApp is already configured. If `store/auth/` exists with credential files, skip to Phase 4 (Registration) or Phase 5 (Verify). + +```bash +ls store/auth/creds.json 2>/dev/null && echo "WhatsApp auth exists" || echo "No WhatsApp auth" +``` + +### Detect environment + +Check whether the environment is headless (no display server): + +```bash +[[ -z "$DISPLAY" && -z "$WAYLAND_DISPLAY" && "$OSTYPE" != darwin* ]] && echo "IS_HEADLESS=true" || echo "IS_HEADLESS=false" +``` + +### Ask the user + +Use `AskUserQuestion` to collect configuration. **Adapt auth options based on environment:** + +If IS_HEADLESS=true AND not WSL → AskUserQuestion: How do you want to authenticate WhatsApp? +- **Pairing code** (Recommended) - Enter a numeric code on your phone (no camera needed, requires phone number) +- **QR code in terminal** - Displays QR code in the terminal (can be too small on some displays) + +Otherwise (macOS, desktop Linux, or WSL) → AskUserQuestion: How do you want to authenticate WhatsApp? +- **QR code in browser** (Recommended) - Opens a browser window with a large, scannable QR code +- **Pairing code** - Enter a numeric code on your phone (no camera needed, requires phone number) +- **QR code in terminal** - Displays QR code in the terminal (can be too small on some displays) + +If they chose pairing code: + +AskUserQuestion: What is your phone number? (Digits only — country code followed by your 10-digit number, no + prefix, spaces, or dashes. Example: 14155551234 where 1 is the US country code and 4155551234 is the phone number.) + +## Phase 2: Apply Code Changes + +Check if `src/channels/whatsapp.ts` already exists. If it does, skip to Phase 3 (Authentication). 
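As in the Phase 1 pre-flight, this existence check can be made explicit; a minimal sketch (the echoed messages are illustrative, not part of the skill):

```bash
# Check whether the WhatsApp channel code has already been merged into the tree
test -f src/channels/whatsapp.ts \
  && echo "Channel code present - skip to Phase 3" \
  || echo "Channel code missing - proceed with merge"
```

Run it from the repository root; the second branch means the merge in Phase 2 is still needed.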
+ +### Ensure channel remote + +```bash +git remote -v +``` + +If `whatsapp` is missing, add it: + +```bash +git remote add whatsapp https://github.com/qwibitai/nanoclaw-whatsapp.git +``` + +### Merge the skill branch + +```bash +git fetch whatsapp main +git merge whatsapp/main || { + git checkout --theirs package-lock.json + git add package-lock.json + git merge --continue +} +``` + +This merges in: +- `src/channels/whatsapp.ts` (WhatsAppChannel class with self-registration via `registerChannel`) +- `src/channels/whatsapp.test.ts` (41 unit tests) +- `src/whatsapp-auth.ts` (standalone WhatsApp authentication script) +- `setup/whatsapp-auth.ts` (WhatsApp auth setup step) +- `import './whatsapp.js'` appended to the channel barrel file `src/channels/index.ts` +- `'whatsapp-auth'` step added to `setup/index.ts` +- `@whiskeysockets/baileys`, `qrcode`, `qrcode-terminal` npm dependencies in `package.json` +- `ASSISTANT_HAS_OWN_NUMBER` in `.env.example` + +If the merge reports conflicts, resolve them by reading the conflicted files and understanding the intent of both sides. + +### Validate code changes + +```bash +npm install +npm run build +npx vitest run src/channels/whatsapp.test.ts +``` + +All tests must pass and build must be clean before proceeding. + +## Phase 3: Authentication + +### Clean previous auth state (if re-authenticating) + +```bash +rm -rf store/auth/ +``` + +### Run WhatsApp authentication + +For QR code in browser (recommended): + +```bash +npx tsx setup/index.ts --step whatsapp-auth -- --method qr-browser +``` + +(Bash timeout: 150000ms) + +Tell the user: + +> A browser window will open with a QR code. +> +> 1. Open WhatsApp > **Settings** > **Linked Devices** > **Link a Device** +> 2. Scan the QR code in the browser +> 3. The page will show "Authenticated!" 
when done + +For QR code in terminal: + +```bash +npx tsx setup/index.ts --step whatsapp-auth -- --method qr-terminal +``` + +Tell the user to run `npm run auth` in another terminal, then: + +> 1. Open WhatsApp > **Settings** > **Linked Devices** > **Link a Device** +> 2. Scan the QR code displayed in the terminal + +For pairing code: + +Tell the user to have WhatsApp open on **Settings > Linked Devices > Link a Device**, ready to tap **"Link with phone number instead"** — the code expires in ~60 seconds and must be entered immediately. + +Run the auth process in the background and poll `store/pairing-code.txt` for the code: + +```bash +rm -f store/pairing-code.txt && npx tsx setup/index.ts --step whatsapp-auth -- --method pairing-code --phone <NUMBER> > /tmp/wa-auth.log 2>&1 & +``` + +Then immediately poll for the code (do NOT wait for the background command to finish): + +```bash +for i in $(seq 1 20); do [ -f store/pairing-code.txt ] && cat store/pairing-code.txt && break; sleep 1; done +``` + +Display the code to the user the moment it appears. Tell them: + +> **Enter this code now** — it expires in ~60 seconds. +> +> 1. Open WhatsApp > **Settings** > **Linked Devices** > **Link a Device** +> 2. Tap **Link with phone number instead** +> 3. Enter the code immediately + +After the user enters the code, poll for authentication to complete: + +```bash +for i in $(seq 1 60); do grep -q 'AUTH_STATUS: authenticated' /tmp/wa-auth.log 2>/dev/null && echo "authenticated" && break; grep -q 'AUTH_STATUS: failed' /tmp/wa-auth.log 2>/dev/null && echo "failed" && break; sleep 2; done +``` + +**If failed:** qr_timeout → re-run. logged_out → delete `store/auth/` and re-run. 515 → re-run. timeout → ask user, offer retry. 
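When triaging a failure, the last `AUTH_STATUS` line in the background log identifies which outcome occurred; for the specific failure reason, read the full log. A hedged helper, assuming the `/tmp/wa-auth.log` path used above (the fallback message is illustrative):

```bash
# Print the most recent AUTH_STATUS line from the background auth log, if any
if [ -f /tmp/wa-auth.log ]; then
  grep -o 'AUTH_STATUS: [a-z_0-9]*' /tmp/wa-auth.log | tail -n 1
else
  echo "no auth log found"
fi
```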
+ +### Verify authentication succeeded + +```bash +test -f store/auth/creds.json && echo "Authentication successful" || echo "Authentication failed" +``` + +### Configure environment + +Channels auto-enable when their credentials are present — WhatsApp activates when `store/auth/creds.json` exists. + +Sync to container environment: + +```bash +mkdir -p data/env && cp .env data/env/env +``` + +## Phase 4: Registration + +### Configure trigger and channel type + +Get the bot's WhatsApp number: `node -e "const c=require('./store/auth/creds.json');console.log(c.me.id.split(':')[0].split('@')[0])"` + +AskUserQuestion: Is this a shared phone number (personal WhatsApp) or a dedicated number (separate device)? +- **Shared number** - Your personal WhatsApp number (recommended: use self-chat or a solo group) +- **Dedicated number** - A separate phone/SIM for the assistant + +AskUserQuestion: What trigger word should activate the assistant? +- **@Andy** - Default trigger +- **@Claw** - Short and easy +- **@Claude** - Match the AI name + +AskUserQuestion: What should the assistant call itself? +- **Andy** - Default name +- **Claw** - Short and easy +- **Claude** - Match the AI name + +AskUserQuestion: Where do you want to chat with the assistant? + +**Shared number options:** +- **Self-chat** (Recommended) - Chat in your own "Message Yourself" conversation +- **Solo group** - A group with just you and the linked device +- **Existing group** - An existing WhatsApp group + +**Dedicated number options:** +- **DM with bot** (Recommended) - Direct message the bot's number +- **Solo group** - A group with just you and the bot +- **Existing group** - An existing WhatsApp group + +### Get the JID + +**Self-chat:** JID = your phone number with `@s.whatsapp.net`. 
Extract from auth credentials:

```bash
node -e "const c=JSON.parse(require('fs').readFileSync('store/auth/creds.json','utf-8'));console.log(c.me?.id?.split(':')[0]+'@s.whatsapp.net')"
```

**DM with bot:** Ask for the bot's phone number. JID = `NUMBER@s.whatsapp.net`

**Group (solo, existing):** Run group sync and list available groups:

```bash
npx tsx setup/index.ts --step groups
npx tsx setup/index.ts --step groups --list
```

The output shows `JID|GroupName` pairs. Present candidates as AskUserQuestion (names only, not JIDs).

### Register the chat

```bash
npx tsx setup/index.ts --step register \
  --jid "<JID>" \
  --name "<chat name>" \
  --trigger "@<trigger word>" \
  --folder "whatsapp_main" \
  --channel whatsapp \
  --assistant-name "<assistant name>" \
  --is-main \
  --no-trigger-required # Only for main/self-chat
```

For additional groups (trigger-required):

```bash
npx tsx setup/index.ts --step register \
  --jid "<JID>" \
  --name "<group name>" \
  --trigger "@<trigger word>" \
  --folder "whatsapp_<group>" \
  --channel whatsapp
```

## Phase 5: Verify

### Build and restart

```bash
npm run build
```

Restart the service:

```bash
# macOS (launchd)
launchctl kickstart -k gui/$(id -u)/com.nanoclaw

# Linux (systemd)
systemctl --user restart nanoclaw

# Linux (nohup fallback)
bash start-nanoclaw.sh
```

### Test the connection

Tell the user:

> Send a message to your registered WhatsApp chat:
> - For self-chat / main: Any message works
> - For groups: Use the trigger word (e.g., "@Andy hello")
>
> The assistant should respond within a few seconds.

### Check logs if needed

```bash
tail -f logs/nanoclaw.log
```

## Troubleshooting

### QR code expired

QR codes expire after ~60 seconds. Re-run the auth command:

```bash
rm -rf store/auth/ && npx tsx src/whatsapp-auth.ts
```

### Pairing code not working

Codes expire in ~60 seconds.
To retry:

```bash
rm -rf store/auth/ && npx tsx src/whatsapp-auth.ts --pairing-code --phone <NUMBER>
```

Enter the code **immediately** when it appears. Also ensure:
1. Phone number is digits only — country code + number, no `+` prefix (e.g., `14155551234` where `1` is country code, `4155551234` is the number)
2. Phone has internet access
3. WhatsApp is updated to the latest version

If the pairing code keeps failing, switch to QR-browser auth instead:

```bash
rm -rf store/auth/ && npx tsx setup/index.ts --step whatsapp-auth -- --method qr-browser
```

### "conflict" disconnection

This happens when two instances connect with the same credentials. Ensure only one NanoClaw process is running:

```bash
pkill -f "node dist/index.js"
# Then restart
```

### Bot not responding

Check:
1. Auth credentials exist: `ls store/auth/creds.json`
2. Chat is registered: `sqlite3 store/messages.db "SELECT * FROM registered_groups WHERE jid LIKE '%whatsapp%' OR jid LIKE '%@g.us' OR jid LIKE '%@s.whatsapp.net'"`
3. Service is running: `launchctl list | grep nanoclaw` (macOS) or `systemctl --user status nanoclaw` (Linux)
4. Logs: `tail -50 logs/nanoclaw.log`

### Group names not showing

Run group metadata sync:

```bash
npx tsx setup/index.ts --step groups
```

This fetches all group names from WhatsApp and runs automatically every 24 hours.

## After Setup

If running `npm run dev` while the service is active:

```bash
# macOS:
launchctl unload ~/Library/LaunchAgents/com.nanoclaw.plist
npm run dev
# When done testing:
launchctl load ~/Library/LaunchAgents/com.nanoclaw.plist

# Linux:
# systemctl --user stop nanoclaw
# npm run dev
# systemctl --user start nanoclaw
```

## Removal

To remove WhatsApp integration:

1. Delete auth credentials: `rm -rf store/auth/`
2. Remove WhatsApp registrations: `sqlite3 store/messages.db "DELETE FROM registered_groups WHERE jid LIKE '%@g.us' OR jid LIKE '%@s.whatsapp.net'"`
3.
Sync env: `mkdir -p data/env && cp .env data/env/env`
+4. Rebuild and restart: `npm run build && launchctl kickstart -k gui/$(id -u)/com.nanoclaw` (macOS) or `npm run build && systemctl --user restart nanoclaw` (Linux)
diff --git a/.claude/skills/channel-formatting/SKILL.md b/.claude/skills/channel-formatting/SKILL.md
new file mode 100644
index 0000000..3e2334c
--- /dev/null
+++ b/.claude/skills/channel-formatting/SKILL.md
@@ -0,0 +1,137 @@
+---
+name: channel-formatting
+description: Convert Claude's Markdown output to each channel's native text syntax before delivery. Adds zero-dependency formatting for WhatsApp, Telegram, and Slack (marker substitution). Also ships a Signal rich-text helper (parseSignalStyles) used by the Signal skill.
+---
+
+# Channel Formatting
+
+This skill wires channel-aware Markdown conversion into the outbound pipeline so Claude's
+responses render natively on each platform — no more literal `**asterisks**` in WhatsApp or
+Telegram.
+
+| Channel | Transformation |
+|---------|---------------|
+| WhatsApp | `**bold**` → `*bold*`, `*italic*` → `_italic_`, headings → bold, links → `text (url)` |
+| Telegram | same as WhatsApp, but `[text](url)` links are preserved (Markdown v1 renders them natively) |
+| Slack | same as WhatsApp, but links become `<url\|text>` |
+| Discord | passthrough (Discord already renders Markdown) |
+| Signal | passthrough for `parseTextStyles`; `parseSignalStyles` in `src/text-styles.ts` produces plain text + native `textStyle` ranges for use by the Signal skill |
+
+Code blocks (fenced and inline) are always protected — their content is never transformed.
+
+## Phase 1: Pre-flight
+
+### Check if already applied
+
+```bash
+test -f src/text-styles.ts && echo "already applied" || echo "not yet applied"
+```
+
+If `already applied`, skip to Phase 3 (Verify).
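As a rough illustration of the marker substitution described in the table above — not the actual `parseTextStyles` implementation, which also handles italics, headings, links, and code-block protection — the WhatsApp bold swap can be sketched with `sed`:

```shell
# Illustrative only: convert Markdown double-asterisk bold to
# WhatsApp's single-asterisk bold marker.
to_whatsapp_bold() {
  sed -E 's/\*\*([^*]+)\*\*/\*\1\*/g'
}

echo 'Here is **important** news' | to_whatsapp_bold   # Here is *important* news
```

The real converter cannot be this naive — it must first carve out fenced and inline code spans so their contents pass through untouched.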
+ +## Phase 2: Apply Code Changes + +### Ensure the upstream remote + +```bash +git remote -v +``` + +If an `upstream` remote pointing to `https://github.com/qwibitai/nanoclaw.git` is missing, +add it: + +```bash +git remote add upstream https://github.com/qwibitai/nanoclaw.git +``` + +### Merge the skill branch + +```bash +git fetch upstream skill/channel-formatting +git merge upstream/skill/channel-formatting +``` + +If there are merge conflicts on `package-lock.json`, resolve them by accepting the incoming +version and continuing: + +```bash +git checkout --theirs package-lock.json +git add package-lock.json +git merge --continue +``` + +For any other conflict, read the conflicted file and reconcile both sides manually. + +This merge adds: + +- `src/text-styles.ts` — `parseTextStyles(text, channel)` for marker substitution and + `parseSignalStyles(text)` for Signal native rich text +- `src/router.ts` — `formatOutbound` gains an optional `channel` parameter; when provided + it calls `parseTextStyles` after stripping `` tags +- `src/index.ts` — both outbound `sendMessage` paths pass `channel.name` to `formatOutbound` +- `src/formatting.test.ts` — test coverage for both functions across all channels + +### Validate + +```bash +npm install +npm run build +npx vitest run src/formatting.test.ts +``` + +All 73 tests should pass and the build should be clean before continuing. + +## Phase 3: Verify + +### Rebuild and restart + +```bash +npm run build +launchctl kickstart -k gui/$(id -u)/com.nanoclaw # macOS +# Linux: systemctl --user restart nanoclaw +``` + +### Spot-check formatting + +Send a message through any registered WhatsApp or Telegram chat that will trigger a +response from Claude. Ask something that will produce formatted output, such as: + +> Summarise the three main advantages of TypeScript using bullet points and **bold** headings. + +Confirm that the response arrives with native bold (`*text*`) rather than raw double +asterisks. 
+ +### Check logs if needed + +```bash +tail -f logs/nanoclaw.log +``` + +## Signal Skill Integration + +If you have the Signal skill installed, `src/channels/signal.ts` can import +`parseSignalStyles` from the newly present `src/text-styles.ts`: + +```typescript +import { parseSignalStyles, SignalTextStyle } from '../text-styles.js'; +``` + +`parseSignalStyles` returns `{ text: string, textStyle: SignalTextStyle[] }` where +`textStyle` is an array of `{ style, start, length }` objects suitable for the +`signal-cli` JSON-RPC `textStyles` parameter (format: `"start:length:STYLE"`). + +## Removal + +```bash +# Remove the new file +rm src/text-styles.ts + +# Revert router.ts to remove the channel param +git diff upstream/main src/router.ts # review changes +git checkout upstream/main -- src/router.ts + +# Revert the index.ts sendMessage call sites to plain formatOutbound(rawText) +# (edit manually or: git checkout upstream/main -- src/index.ts) + +npm run build +``` \ No newline at end of file diff --git a/.claude/skills/claw/SKILL.md b/.claude/skills/claw/SKILL.md new file mode 100644 index 0000000..10e0dc3 --- /dev/null +++ b/.claude/skills/claw/SKILL.md @@ -0,0 +1,131 @@ +--- +name: claw +description: Install the claw CLI tool — run NanoClaw agent containers from the command line without opening a chat app. +--- + +# claw — NanoClaw CLI + +`claw` is a Python CLI that sends prompts directly to a NanoClaw agent container from the terminal. It reads registered groups from the NanoClaw database, picks up secrets from `.env`, and pipes a JSON payload into a container run — no chat app required. 
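The piped payload can be sketched as follows. The field names match the `claw` script included later in this diff; the JID and prompt values are made-up examples:

```shell
# Illustrative payload that claw writes to the container's stdin.
# Field names come from the claw script; JID and prompt are examples.
payload='{"prompt":"What is 2+2?","chatJid":"12025550123@s.whatsapp.net","isMain":true,"secrets":{}}'

# The real invocation (not run here) would be roughly:
#   echo "$payload" | container run -i --rm nanoclaw-agent:latest
echo "$payload"
```

`secrets` is populated from `.env` at runtime; an empty object is shown here so no credentials appear in the example.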
+
## What it does

- Send a prompt to any registered group by name, folder, or JID
- Default target is the main group (no `-g` needed for most use)
- Resume a previous session with `-s <session-id>`
- Read prompts from stdin (`--pipe`) for scripting and piping
- List all registered groups with `--list-groups`
- Auto-detects `container` or `docker` runtime (or override with `--runtime`)
- Prints the agent's response to stdout; session ID to stderr
- Verbose mode (`-v`) shows the command, redacted payload, and exit code

## Prerequisites

- Python 3.8 or later
- NanoClaw installed with a built and tagged container image (`nanoclaw-agent:latest`)
- Either `container` (Apple Container, macOS 15+) or `docker` available in `PATH`

## Install

Run this skill from within the NanoClaw directory. The script auto-detects its location, so the symlink always points to the right place.

### 1. Copy the script

```bash
mkdir -p scripts
cp "${CLAUDE_SKILL_DIR}/scripts/claw" scripts/claw
chmod +x scripts/claw
```

### 2. Symlink into PATH

```bash
mkdir -p ~/bin
ln -sf "$(pwd)/scripts/claw" ~/bin/claw
```

Make sure `~/bin` is in `PATH`. Add this to `~/.zshrc` or `~/.bashrc` if needed:

```bash
export PATH="$HOME/bin:$PATH"
```

Then reload the shell:

```bash
source ~/.zshrc # or ~/.bashrc
```

### 3. Verify

```bash
claw --list-groups
```

You should see registered groups. If NanoClaw isn't running or the database doesn't exist yet, the list will be empty — that's fine.

## Usage Examples

```bash
# Send a prompt to the main group
claw "What's on my calendar today?"
+ +# Send to a specific group by name (fuzzy match) +claw -g "family" "Remind everyone about dinner at 7" + +# Send to a group by exact JID +claw -j "120363336345536173@g.us" "Hello" + +# Resume a previous session +claw -s abc123 "Continue where we left off" + +# Read prompt from stdin +echo "Summarize this" | claw --pipe -g dev + +# Pipe a file +cat report.txt | claw --pipe "Summarize this report" + +# List all registered groups +claw --list-groups + +# Force a specific runtime +claw --runtime docker "Hello" + +# Use a custom image tag (e.g. after rebuilding with a new tag) +claw --image nanoclaw-agent:dev "Hello" + +# Verbose mode (debug info, secrets redacted) +claw -v "Hello" + +# Custom timeout for long-running tasks +claw --timeout 600 "Run the full analysis" +``` + +## Troubleshooting + +### "neither 'container' nor 'docker' found" + +Install Docker Desktop or Apple Container (macOS 15+), or pass `--runtime` explicitly. + +### "no secrets found in .env" + +The script auto-detects your NanoClaw directory and reads `.env` from it. Check that the file exists and contains at least one of: `CLAUDE_CODE_OAUTH_TOKEN`, `ANTHROPIC_API_KEY`, `ANTHROPIC_AUTH_TOKEN`. + +### Container times out + +The default timeout is 300 seconds. For longer tasks, pass `--timeout 600` (or higher). If the container consistently hangs, check that your `nanoclaw-agent:latest` image is up to date by running `./container/build.sh`. + +### "group not found" + +Run `claw --list-groups` to see what's registered. Group lookup does a fuzzy partial match on name and folder — if your query matches multiple groups, you'll get an error listing the ambiguous matches. + +### Container crashes mid-stream + +Containers run with `--rm` so they are automatically removed. If the agent crashes before emitting the output sentinel, `claw` falls back to printing raw stdout. Use `-v` to see what the container produced. Rebuild the image with `./container/build.sh` if crashes are consistent. 
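The sentinel framing mentioned above can be illustrated directly. The `---NANOCLAW_OUTPUT_START/END---` marker names are taken from the `claw` script; the sample container output is made up:

```shell
# Extract the JSON between the output sentinels, mirroring what claw
# does internally when parsing container stdout.
extract_output() {
  sed -n '/^---NANOCLAW_OUTPUT_START---$/,/^---NANOCLAW_OUTPUT_END---$/p' | sed '1d;$d'
}

printf '%s\n' 'container noise' \
  '---NANOCLAW_OUTPUT_START---' \
  '{"status":"success","result":"4"}' \
  '---NANOCLAW_OUTPUT_END---' | extract_output
```

Anything before the start sentinel (npm notices, runtime chatter) is discarded, which is why a crash before the sentinel forces the raw-stdout fallback described above.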
+
### Override the NanoClaw directory

If `claw` can't find your database or `.env`, set the `NANOCLAW_DIR` environment variable:

```bash
export NANOCLAW_DIR=/path/to/your/nanoclaw
```
diff --git a/.claude/skills/claw/scripts/claw b/.claude/skills/claw/scripts/claw
new file mode 100644
index 0000000..b64a225
--- /dev/null
+++ b/.claude/skills/claw/scripts/claw
@@ -0,0 +1,374 @@
+#!/usr/bin/env python3
+"""
+claw — NanoClaw CLI
+Run a NanoClaw agent container from the command line.
+
+Usage:
+    claw "What is 2+2?"
+    claw -g <group> "Review this code"
+    claw -g "<group>" "What's the latest issue?"
+    claw -j "<jid>" "Hello"
+    claw -g <group> -s <session-id> "Continue"
+    claw --list-groups
+    echo "prompt text" | claw --pipe -g <group>
+    cat prompt.txt | claw --pipe
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+import os
+import re
+import sqlite3
+import subprocess
+import sys
+import threading
+from pathlib import Path
+
+# ── Globals ─────────────────────────────────────────────────────────────────
+
+VERBOSE = False
+
+def dbg(*args):
+    if VERBOSE:
+        print("»", *args, file=sys.stderr)
+
+# ── Config ──────────────────────────────────────────────────────────────────
+
+def _find_nanoclaw_dir() -> Path:
+    """Locate the NanoClaw installation directory.
+
+    Resolution order:
+    1. NANOCLAW_DIR env var
+    2. The directory containing this script (if it looks like a NanoClaw install)
+    3. ~/src/nanoclaw (legacy default)
+    """
+    if env := os.environ.get("NANOCLAW_DIR"):
+        return Path(env).expanduser()
+    # If this script lives inside the NanoClaw tree (e.g.
scripts/claw), walk up + here = Path(__file__).resolve() + for parent in [here.parent, here.parent.parent]: + if (parent / "store" / "messages.db").exists() or (parent / ".env").exists(): + return parent + return Path.home() / "src" / "nanoclaw" + +NANOCLAW_DIR = _find_nanoclaw_dir() +DB_PATH = NANOCLAW_DIR / "store" / "messages.db" +ENV_FILE = NANOCLAW_DIR / ".env" +IMAGE = "nanoclaw-agent:latest" + +SECRET_KEYS = [ + "CLAUDE_CODE_OAUTH_TOKEN", + "ANTHROPIC_API_KEY", + "ANTHROPIC_BASE_URL", + "ANTHROPIC_AUTH_TOKEN", + "OLLAMA_HOST", +] + +# ── Helpers ────────────────────────────────────────────────────────────────── + +def detect_runtime(preference: str | None) -> str: + if preference: + dbg(f"runtime: forced to {preference}") + return preference + for rt in ("container", "docker"): + result = subprocess.run(["which", rt], capture_output=True) + if result.returncode == 0: + dbg(f"runtime: auto-detected {rt} at {result.stdout.decode().strip()}") + return rt + sys.exit("error: neither 'container' nor 'docker' found. 
Install one or pass --runtime.") + + +def read_secrets(env_file: Path) -> dict: + secrets = {} + if not env_file.exists(): + return secrets + for line in env_file.read_text().splitlines(): + line = line.strip() + if not line or line.startswith("#"): + continue + if "=" in line: + key, _, val = line.partition("=") + key = key.strip() + if key in SECRET_KEYS: + secrets[key] = val.strip() + return secrets + + +def get_groups(db: Path) -> list[dict]: + conn = sqlite3.connect(db) + rows = conn.execute( + "SELECT jid, name, folder, is_main FROM registered_groups ORDER BY name" + ).fetchall() + conn.close() + return [{"jid": r[0], "name": r[1], "folder": r[2], "is_main": bool(r[3])} for r in rows] + + +def find_group(groups: list[dict], query: str) -> dict | None: + q = query.lower() + # Exact name match + for g in groups: + if g["name"].lower() == q or g["folder"].lower() == q: + return g + # Partial match + matches = [g for g in groups if q in g["name"].lower() or q in g["folder"].lower()] + if len(matches) == 1: + return matches[0] + if len(matches) > 1: + names = ", ".join(f'"{g["name"]}"' for g in matches) + sys.exit(f"error: ambiguous group '{query}'. 
Matches: {names}") + return None + + +def build_mounts(folder: str, is_main: bool) -> list[tuple[str, str, bool]]: + """Return list of (host_path, container_path, readonly) tuples.""" + groups_dir = NANOCLAW_DIR / "groups" + data_dir = NANOCLAW_DIR / "data" + sessions_dir = data_dir / "sessions" / folder + ipc_dir = data_dir / "ipc" / folder + + # Ensure required dirs exist + group_dir = groups_dir / folder + group_dir.mkdir(parents=True, exist_ok=True) + (sessions_dir / ".claude").mkdir(parents=True, exist_ok=True) + for sub in ("messages", "tasks", "input"): + (ipc_dir / sub).mkdir(parents=True, exist_ok=True) + + agent_runner_src = sessions_dir / "agent-runner-src" + project_agent_runner = NANOCLAW_DIR / "container" / "agent-runner" / "src" + if not agent_runner_src.exists() and project_agent_runner.exists(): + import shutil + shutil.copytree(project_agent_runner, agent_runner_src) + + mounts: list[tuple[str, str, bool]] = [] + if is_main: + mounts.append((str(NANOCLAW_DIR), "/workspace/project", True)) + mounts.append((str(group_dir), "/workspace/group", False)) + mounts.append((str(sessions_dir / ".claude"), "/home/node/.claude", False)) + mounts.append((str(ipc_dir), "/workspace/ipc", False)) + if agent_runner_src.exists(): + mounts.append((str(agent_runner_src), "/app/src", False)) + return mounts + + +def run_container(runtime: str, image: str, payload: dict, + folder: str | None = None, is_main: bool = False, + timeout: int = 300) -> None: + cmd = [runtime, "run", "-i", "--rm"] + if folder: + for host, container, readonly in build_mounts(folder, is_main): + if readonly: + cmd += ["--mount", f"type=bind,source={host},target={container},readonly"] + else: + cmd += ["-v", f"{host}:{container}"] + cmd.append(image) + dbg(f"cmd: {' '.join(cmd)}") + + # Show payload sans secrets + if VERBOSE: + safe = {k: v for k, v in payload.items() if k != "secrets"} + safe["secrets"] = {k: "***" for k in payload.get("secrets", {})} + dbg(f"payload: {json.dumps(safe, 
indent=2)}") + + proc = subprocess.Popen( + cmd, + stdin=subprocess.PIPE, + stdout=subprocess.PIPE, + stderr=subprocess.PIPE, + ) + dbg(f"container pid: {proc.pid}") + + # Write JSON payload and close stdin + proc.stdin.write(json.dumps(payload).encode()) + proc.stdin.close() + dbg("stdin closed, waiting for response...") + + stdout_lines: list[str] = [] + stderr_lines: list[str] = [] + done = threading.Event() + + def stream_stderr(): + for raw in proc.stderr: + line = raw.decode(errors="replace").rstrip() + if line.startswith("npm notice"): + continue + stderr_lines.append(line) + print(line, file=sys.stderr) + + def stream_stdout(): + for raw in proc.stdout: + line = raw.decode(errors="replace").rstrip() + stdout_lines.append(line) + dbg(f"stdout: {line}") + # Kill the container as soon as we see the closing sentinel — + # the Node.js event loop often keeps the process alive indefinitely. + if line.strip() == "---NANOCLAW_OUTPUT_END---": + dbg("output sentinel found, terminating container") + done.set() + try: + proc.terminate() + try: + proc.wait(timeout=5) + except subprocess.TimeoutExpired: + dbg("graceful stop timed out, force killing container") + proc.kill() + except ProcessLookupError: + pass + return + + t_err = threading.Thread(target=stream_stderr, daemon=True) + t_out = threading.Thread(target=stream_stdout, daemon=True) + t_err.start() + t_out.start() + + # Wait for sentinel or timeout + if not done.wait(timeout=timeout): + # Also check if process exited naturally + t_out.join(timeout=2) + if not done.is_set(): + proc.kill() + sys.exit(f"error: container timed out after {timeout}s (no output sentinel received)") + + t_err.join(timeout=5) + t_out.join(timeout=5) + proc.wait() + dbg(f"container done (rc={proc.returncode}), {len(stdout_lines)} stdout lines") + stdout = "\n".join(stdout_lines) + + # Parse output block + match = re.search( + r"---NANOCLAW_OUTPUT_START---\n(.*?)\n---NANOCLAW_OUTPUT_END---", + stdout, + re.DOTALL, + ) + success = False + + 
if match: + try: + data = json.loads(match.group(1)) + status = data.get("status", "unknown") + if status == "success": + print(data.get("result", "")) + session_id = data.get("newSessionId") or data.get("sessionId") + if session_id: + print(f"\n[session: {session_id}]", file=sys.stderr) + success = True + else: + print(f"[{status}] {data.get('result', '')}", file=sys.stderr) + sys.exit(1) + except json.JSONDecodeError: + print(match.group(1)) + else: + # No structured output — print raw stdout + print(stdout) + + if success: + return + + if proc.returncode not in (0, None): + sys.exit(proc.returncode) + + +# ── Main ───────────────────────────────────────────────────────────────────── + +def main(): + parser = argparse.ArgumentParser( + prog="claw", + description="Run a NanoClaw agent from the command line.", + ) + parser.add_argument("prompt", nargs="?", help="Prompt to send") + parser.add_argument("-g", "--group", help="Group name or folder (fuzzy match)") + parser.add_argument("-j", "--jid", help="Chat JID (exact)") + parser.add_argument("-s", "--session", help="Session ID to resume") + parser.add_argument("-p", "--pipe", action="store_true", + help="Read prompt from stdin (can be combined with a prompt arg as prefix)") + parser.add_argument("--runtime", choices=["docker", "container"], + help="Container runtime (default: auto-detect)") + parser.add_argument("--image", default=IMAGE, help=f"Container image (default: {IMAGE})") + parser.add_argument("--list-groups", action="store_true", help="List registered groups and exit") + parser.add_argument("--raw", action="store_true", help="Print raw JSON output") + parser.add_argument("--timeout", type=int, default=300, metavar="SECS", + help="Max seconds to wait for a response (default: 300)") + parser.add_argument("-v", "--verbose", action="store_true", + help="Show debug info: cmd, payload (secrets redacted), stdout lines, exit code") + args = parser.parse_args() + + global VERBOSE + VERBOSE = args.verbose + + 
groups = get_groups(DB_PATH) if DB_PATH.exists() else [] + + if args.list_groups: + print(f"{'NAME':<35} {'FOLDER':<30} {'JID'}") + print("-" * 100) + for g in groups: + main_tag = " [main]" if g["is_main"] else "" + print(f"{g['name']:<35} {g['folder']:<30} {g['jid']}{main_tag}") + return + + # Resolve prompt: --pipe reads stdin, optionally prepended with positional arg + if args.pipe or (not sys.stdin.isatty() and not args.prompt): + stdin_text = sys.stdin.read().strip() + if args.prompt: + prompt = f"{args.prompt}\n\n{stdin_text}" + else: + prompt = stdin_text + else: + prompt = args.prompt + + if not prompt: + parser.print_help() + sys.exit(1) + + # Resolve group → jid + jid = args.jid + group_name = None + group_folder = None + is_main = False + + if args.group: + g = find_group(groups, args.group) + if g is None: + sys.exit(f"error: group '{args.group}' not found. Run --list-groups to see options.") + jid = g["jid"] + group_name = g["name"] + group_folder = g["folder"] + is_main = g["is_main"] + elif not jid: + # Default: main group + mains = [g for g in groups if g["is_main"]] + if mains: + jid = mains[0]["jid"] + group_name = mains[0]["name"] + group_folder = mains[0]["folder"] + is_main = True + else: + sys.exit("error: no group specified and no main group found. 
Use -g or -j.")

    runtime = detect_runtime(args.runtime)
    secrets = read_secrets(ENV_FILE)

    if not secrets:
        print("warning: no secrets found in .env — agent may not be authenticated", file=sys.stderr)

    payload: dict = {
        "prompt": prompt,
        "chatJid": jid,
        "isMain": is_main,
        "secrets": secrets,
    }
    if group_folder:
        # Use the on-disk folder name (not the display name) in the payload,
        # matching the folder passed to run_container for mounts.
        payload["groupFolder"] = group_folder
    if args.session:
        payload["sessionId"] = args.session
        payload["resumeAt"] = "latest"

    print(f"[{group_name or jid}] running via {runtime}...", file=sys.stderr)
    run_container(runtime, args.image, payload,
                  folder=group_folder, is_main=is_main,
                  timeout=args.timeout)


if __name__ == "__main__":
    main()
diff --git a/.claude/skills/convert-to-apple-container/SKILL.md b/.claude/skills/convert-to-apple-container/SKILL.md
index 8bfaebb..caf9c22 100644
--- a/.claude/skills/convert-to-apple-container/SKILL.md
+++ b/.claude/skills/convert-to-apple-container/SKILL.md
@@ -13,11 +13,13 @@ This skill switches NanoClaw's container runtime from Docker to Apple Container
 - Startup check: `docker info` → `container system status` (with auto-start)
 - Orphan detection: `docker ps --filter` → `container ls --format json`
 - Build script default: `docker` → `container`
+- Dockerfile entrypoint: `.env` shadowing via `mount --bind` inside the container (Apple Container only supports directory mounts, not file mounts like Docker's `/dev/null` overlay)
+- Container runner: main-group containers start as root for `mount --bind`, then drop privileges via `setpriv`
 **What stays the same:**
-- Dockerfile (shared by both runtimes)
-- Container runner code (`src/container-runner.ts`)
 - Mount security/allowlist validation
+- All exported interfaces and IPC protocol
+- Non-main container behavior (still uses `--user` flag)
 - All other functionality
 ## Prerequisites
 Apple Container requires macOS. It does not work on Linux.
 ### Check if already applied
-Read `.nanoclaw/state.yaml`.
If `convert-to-apple-container` is in `applied_skills`, skip to Phase 3 (Verify). The code changes are already in place. - -### Check current runtime - ```bash grep "CONTAINER_RUNTIME_BIN" src/container-runtime.ts ``` @@ -51,33 +49,33 @@ If it already shows `'container'`, the runtime is already Apple Container. Skip ## Phase 2: Apply Code Changes -Run the skills engine to apply this skill's code package. The package files are in this directory alongside this SKILL.md. - -### Initialize skills system (if needed) - -If `.nanoclaw/` directory doesn't exist yet: +### Ensure upstream remote ```bash -npx tsx scripts/apply-skill.ts --init +git remote -v ``` -Or call `initSkillsSystem()` from `skills-engine/migrate.ts`. - -### Apply the skill +If `upstream` is missing, add it: ```bash -npx tsx scripts/apply-skill.ts .claude/skills/convert-to-apple-container +git remote add upstream https://github.com/qwibitai/nanoclaw.git ``` -This deterministically: -- Replaces `src/container-runtime.ts` with the Apple Container implementation -- Replaces `src/container-runtime.test.ts` with Apple Container-specific tests -- Updates `container/build.sh` to default to `container` runtime -- Records the application in `.nanoclaw/state.yaml` +### Merge the skill branch -If the apply reports merge conflicts, read the intent files: -- `modify/src/container-runtime.ts.intent.md` — what changed and invariants -- `modify/container/build.sh.intent.md` — what changed for build script +```bash +git fetch upstream skill/apple-container +git merge upstream/skill/apple-container +``` + +This merges in: +- `src/container-runtime.ts` — Apple Container implementation (replaces Docker) +- `src/container-runtime.test.ts` — Apple Container-specific tests +- `src/container-runner.ts` — .env shadow mount fix and privilege dropping +- `container/Dockerfile` — entrypoint that shadows .env via `mount --bind` +- `container/build.sh` — default runtime set to `container` + +If the merge reports conflicts, resolve 
them by reading the conflicted files and understanding the intent of both sides. ### Validate code changes @@ -172,4 +170,6 @@ Check directory permissions on the host. The container runs as uid 1000. |------|----------------| | `src/container-runtime.ts` | Full replacement — Docker → Apple Container API | | `src/container-runtime.test.ts` | Full replacement — tests for Apple Container behavior | +| `src/container-runner.ts` | .env shadow mount removed, main containers start as root with privilege drop | +| `container/Dockerfile` | Entrypoint: `mount --bind` for .env shadowing, `setpriv` privilege drop | | `container/build.sh` | Default runtime: `docker` → `container` | diff --git a/.claude/skills/convert-to-apple-container/manifest.yaml b/.claude/skills/convert-to-apple-container/manifest.yaml deleted file mode 100644 index d9f65b6..0000000 --- a/.claude/skills/convert-to-apple-container/manifest.yaml +++ /dev/null @@ -1,13 +0,0 @@ -skill: convert-to-apple-container -version: 1.0.0 -description: "Switch container runtime from Docker to Apple Container (macOS)" -core_version: 0.1.0 -adds: [] -modifies: - - src/container-runtime.ts - - src/container-runtime.test.ts - - container/build.sh -structured: {} -conflicts: [] -depends: [] -test: "npx vitest run src/container-runtime.test.ts" diff --git a/.claude/skills/convert-to-apple-container/modify/container/build.sh b/.claude/skills/convert-to-apple-container/modify/container/build.sh deleted file mode 100644 index fbdef31..0000000 --- a/.claude/skills/convert-to-apple-container/modify/container/build.sh +++ /dev/null @@ -1,23 +0,0 @@ -#!/bin/bash -# Build the NanoClaw agent container image - -set -e - -SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" -cd "$SCRIPT_DIR" - -IMAGE_NAME="nanoclaw-agent" -TAG="${1:-latest}" -CONTAINER_RUNTIME="${CONTAINER_RUNTIME:-container}" - -echo "Building NanoClaw agent container image..." 
-echo "Image: ${IMAGE_NAME}:${TAG}" - -${CONTAINER_RUNTIME} build -t "${IMAGE_NAME}:${TAG}" . - -echo "" -echo "Build complete!" -echo "Image: ${IMAGE_NAME}:${TAG}" -echo "" -echo "Test with:" -echo " echo '{\"prompt\":\"What is 2+2?\",\"groupFolder\":\"test\",\"chatJid\":\"test@g.us\",\"isMain\":false}' | ${CONTAINER_RUNTIME} run -i ${IMAGE_NAME}:${TAG}" diff --git a/.claude/skills/convert-to-apple-container/modify/container/build.sh.intent.md b/.claude/skills/convert-to-apple-container/modify/container/build.sh.intent.md deleted file mode 100644 index e7b2b97..0000000 --- a/.claude/skills/convert-to-apple-container/modify/container/build.sh.intent.md +++ /dev/null @@ -1,17 +0,0 @@ -# Intent: container/build.sh modifications - -## What changed -Changed the default container runtime from `docker` to `container` (Apple Container CLI). - -## Key sections -- `CONTAINER_RUNTIME` default: `docker` → `container` -- All build/run commands use `${CONTAINER_RUNTIME}` variable (unchanged) - -## Invariants -- The `CONTAINER_RUNTIME` environment variable override still works -- IMAGE_NAME and TAG logic unchanged -- Build and test echo commands unchanged - -## Must-keep -- The `CONTAINER_RUNTIME` env var override pattern -- The test command echo at the end diff --git a/.claude/skills/convert-to-apple-container/modify/src/container-runtime.test.ts b/.claude/skills/convert-to-apple-container/modify/src/container-runtime.test.ts deleted file mode 100644 index 79b77a3..0000000 --- a/.claude/skills/convert-to-apple-container/modify/src/container-runtime.test.ts +++ /dev/null @@ -1,177 +0,0 @@ -import { describe, it, expect, vi, beforeEach } from 'vitest'; - -// Mock logger -vi.mock('./logger.js', () => ({ - logger: { - debug: vi.fn(), - info: vi.fn(), - warn: vi.fn(), - error: vi.fn(), - }, -})); - -// Mock child_process — store the mock fn so tests can configure it -const mockExecSync = vi.fn(); -vi.mock('child_process', () => ({ - execSync: (...args: unknown[]) => 
mockExecSync(...args), -})); - -import { - CONTAINER_RUNTIME_BIN, - readonlyMountArgs, - stopContainer, - ensureContainerRuntimeRunning, - cleanupOrphans, -} from './container-runtime.js'; -import { logger } from './logger.js'; - -beforeEach(() => { - vi.clearAllMocks(); -}); - -// --- Pure functions --- - -describe('readonlyMountArgs', () => { - it('returns --mount flag with type=bind and readonly', () => { - const args = readonlyMountArgs('/host/path', '/container/path'); - expect(args).toEqual([ - '--mount', - 'type=bind,source=/host/path,target=/container/path,readonly', - ]); - }); -}); - -describe('stopContainer', () => { - it('returns stop command using CONTAINER_RUNTIME_BIN', () => { - expect(stopContainer('nanoclaw-test-123')).toBe( - `${CONTAINER_RUNTIME_BIN} stop nanoclaw-test-123`, - ); - }); -}); - -// --- ensureContainerRuntimeRunning --- - -describe('ensureContainerRuntimeRunning', () => { - it('does nothing when runtime is already running', () => { - mockExecSync.mockReturnValueOnce(''); - - ensureContainerRuntimeRunning(); - - expect(mockExecSync).toHaveBeenCalledTimes(1); - expect(mockExecSync).toHaveBeenCalledWith( - `${CONTAINER_RUNTIME_BIN} system status`, - { stdio: 'pipe' }, - ); - expect(logger.debug).toHaveBeenCalledWith('Container runtime already running'); - }); - - it('auto-starts when system status fails', () => { - // First call (system status) fails - mockExecSync.mockImplementationOnce(() => { - throw new Error('not running'); - }); - // Second call (system start) succeeds - mockExecSync.mockReturnValueOnce(''); - - ensureContainerRuntimeRunning(); - - expect(mockExecSync).toHaveBeenCalledTimes(2); - expect(mockExecSync).toHaveBeenNthCalledWith( - 2, - `${CONTAINER_RUNTIME_BIN} system start`, - { stdio: 'pipe', timeout: 30000 }, - ); - expect(logger.info).toHaveBeenCalledWith('Container runtime started'); - }); - - it('throws when both status and start fail', () => { - mockExecSync.mockImplementation(() => { - throw new 
Error('failed'); - }); - - expect(() => ensureContainerRuntimeRunning()).toThrow( - 'Container runtime is required but failed to start', - ); - expect(logger.error).toHaveBeenCalled(); - }); -}); - -// --- cleanupOrphans --- - -describe('cleanupOrphans', () => { - it('stops orphaned nanoclaw containers from JSON output', () => { - // Apple Container ls returns JSON - const lsOutput = JSON.stringify([ - { status: 'running', configuration: { id: 'nanoclaw-group1-111' } }, - { status: 'stopped', configuration: { id: 'nanoclaw-group2-222' } }, - { status: 'running', configuration: { id: 'nanoclaw-group3-333' } }, - { status: 'running', configuration: { id: 'other-container' } }, - ]); - mockExecSync.mockReturnValueOnce(lsOutput); - // stop calls succeed - mockExecSync.mockReturnValue(''); - - cleanupOrphans(); - - // ls + 2 stop calls (only running nanoclaw- containers) - expect(mockExecSync).toHaveBeenCalledTimes(3); - expect(mockExecSync).toHaveBeenNthCalledWith( - 2, - `${CONTAINER_RUNTIME_BIN} stop nanoclaw-group1-111`, - { stdio: 'pipe' }, - ); - expect(mockExecSync).toHaveBeenNthCalledWith( - 3, - `${CONTAINER_RUNTIME_BIN} stop nanoclaw-group3-333`, - { stdio: 'pipe' }, - ); - expect(logger.info).toHaveBeenCalledWith( - { count: 2, names: ['nanoclaw-group1-111', 'nanoclaw-group3-333'] }, - 'Stopped orphaned containers', - ); - }); - - it('does nothing when no orphans exist', () => { - mockExecSync.mockReturnValueOnce('[]'); - - cleanupOrphans(); - - expect(mockExecSync).toHaveBeenCalledTimes(1); - expect(logger.info).not.toHaveBeenCalled(); - }); - - it('warns and continues when ls fails', () => { - mockExecSync.mockImplementationOnce(() => { - throw new Error('container not available'); - }); - - cleanupOrphans(); // should not throw - - expect(logger.warn).toHaveBeenCalledWith( - expect.objectContaining({ err: expect.any(Error) }), - 'Failed to clean up orphaned containers', - ); - }); - - it('continues stopping remaining containers when one stop fails', () => 
{ - const lsOutput = JSON.stringify([ - { status: 'running', configuration: { id: 'nanoclaw-a-1' } }, - { status: 'running', configuration: { id: 'nanoclaw-b-2' } }, - ]); - mockExecSync.mockReturnValueOnce(lsOutput); - // First stop fails - mockExecSync.mockImplementationOnce(() => { - throw new Error('already stopped'); - }); - // Second stop succeeds - mockExecSync.mockReturnValueOnce(''); - - cleanupOrphans(); // should not throw - - expect(mockExecSync).toHaveBeenCalledTimes(3); - expect(logger.info).toHaveBeenCalledWith( - { count: 2, names: ['nanoclaw-a-1', 'nanoclaw-b-2'] }, - 'Stopped orphaned containers', - ); - }); -}); diff --git a/.claude/skills/convert-to-apple-container/modify/src/container-runtime.ts b/.claude/skills/convert-to-apple-container/modify/src/container-runtime.ts deleted file mode 100644 index ebfb6b9..0000000 --- a/.claude/skills/convert-to-apple-container/modify/src/container-runtime.ts +++ /dev/null @@ -1,85 +0,0 @@ -/** - * Container runtime abstraction for NanoClaw. - * All runtime-specific logic lives here so swapping runtimes means changing one file. - */ -import { execSync } from 'child_process'; - -import { logger } from './logger.js'; - -/** The container runtime binary name. */ -export const CONTAINER_RUNTIME_BIN = 'container'; - -/** Returns CLI args for a readonly bind mount. */ -export function readonlyMountArgs(hostPath: string, containerPath: string): string[] { - return ['--mount', `type=bind,source=${hostPath},target=${containerPath},readonly`]; -} - -/** Returns the shell command to stop a container by name. */ -export function stopContainer(name: string): string { - return `${CONTAINER_RUNTIME_BIN} stop ${name}`; -} - -/** Ensure the container runtime is running, starting it if needed. 
*/ -export function ensureContainerRuntimeRunning(): void { - try { - execSync(`${CONTAINER_RUNTIME_BIN} system status`, { stdio: 'pipe' }); - logger.debug('Container runtime already running'); - } catch { - logger.info('Starting container runtime...'); - try { - execSync(`${CONTAINER_RUNTIME_BIN} system start`, { stdio: 'pipe', timeout: 30000 }); - logger.info('Container runtime started'); - } catch (err) { - logger.error({ err }, 'Failed to start container runtime'); - console.error( - '\n╔════════════════════════════════════════════════════════════════╗', - ); - console.error( - '║ FATAL: Container runtime failed to start ║', - ); - console.error( - '║ ║', - ); - console.error( - '║ Agents cannot run without a container runtime. To fix: ║', - ); - console.error( - '║ 1. Ensure Apple Container is installed ║', - ); - console.error( - '║ 2. Run: container system start ║', - ); - console.error( - '║ 3. Restart NanoClaw ║', - ); - console.error( - '╚════════════════════════════════════════════════════════════════╝\n', - ); - throw new Error('Container runtime is required but failed to start'); - } - } -} - -/** Kill orphaned NanoClaw containers from previous runs. 
*/ -export function cleanupOrphans(): void { - try { - const output = execSync(`${CONTAINER_RUNTIME_BIN} ls --format json`, { - stdio: ['pipe', 'pipe', 'pipe'], - encoding: 'utf-8', - }); - const containers: { status: string; configuration: { id: string } }[] = JSON.parse(output || '[]'); - const orphans = containers - .filter((c) => c.status === 'running' && c.configuration.id.startsWith('nanoclaw-')) - .map((c) => c.configuration.id); - for (const name of orphans) { - try { - execSync(stopContainer(name), { stdio: 'pipe' }); - } catch { /* already stopped */ } - } - if (orphans.length > 0) { - logger.info({ count: orphans.length, names: orphans }, 'Stopped orphaned containers'); - } - } catch (err) { - logger.warn({ err }, 'Failed to clean up orphaned containers'); - } -} diff --git a/.claude/skills/convert-to-apple-container/modify/src/container-runtime.ts.intent.md b/.claude/skills/convert-to-apple-container/modify/src/container-runtime.ts.intent.md deleted file mode 100644 index cb7f78a..0000000 --- a/.claude/skills/convert-to-apple-container/modify/src/container-runtime.ts.intent.md +++ /dev/null @@ -1,32 +0,0 @@ -# Intent: src/container-runtime.ts modifications - -## What changed -Replaced Docker runtime with Apple Container runtime. This is a full file replacement — the exported API is identical, only the implementation differs. 
- -## Key sections - -### CONTAINER_RUNTIME_BIN -- Changed: `'docker'` → `'container'` (the Apple Container CLI binary) - -### readonlyMountArgs -- Changed: Docker `-v host:container:ro` → Apple Container `--mount type=bind,source=...,target=...,readonly` - -### ensureContainerRuntimeRunning -- Changed: `docker info` → `container system status` for checking -- Added: auto-start via `container system start` when not running (Apple Container supports this; Docker requires manual start) -- Changed: error message references Apple Container instead of Docker - -### cleanupOrphans -- Changed: `docker ps --filter name=nanoclaw- --format '{{.Names}}'` → `container ls --format json` with JSON parsing -- Apple Container returns JSON with `{ status, configuration: { id } }` structure - -## Invariants -- All five exports remain identical: `CONTAINER_RUNTIME_BIN`, `readonlyMountArgs`, `stopContainer`, `ensureContainerRuntimeRunning`, `cleanupOrphans` -- `stopContainer` implementation is unchanged (` stop `) -- Logger usage pattern is unchanged -- Error handling pattern is unchanged - -## Must-keep -- The exported function signatures (consumed by container-runner.ts and index.ts) -- The error box-drawing output format -- The orphan cleanup logic (find + stop pattern) diff --git a/.claude/skills/convert-to-apple-container/tests/convert-to-apple-container.test.ts b/.claude/skills/convert-to-apple-container/tests/convert-to-apple-container.test.ts deleted file mode 100644 index 33db430..0000000 --- a/.claude/skills/convert-to-apple-container/tests/convert-to-apple-container.test.ts +++ /dev/null @@ -1,69 +0,0 @@ -import { describe, expect, it } from 'vitest'; -import fs from 'fs'; -import path from 'path'; - -describe('convert-to-apple-container skill package', () => { - const skillDir = path.resolve(__dirname, '..'); - - it('has a valid manifest', () => { - const manifestPath = path.join(skillDir, 'manifest.yaml'); - expect(fs.existsSync(manifestPath)).toBe(true); - - const 
content = fs.readFileSync(manifestPath, 'utf-8'); - expect(content).toContain('skill: convert-to-apple-container'); - expect(content).toContain('version: 1.0.0'); - expect(content).toContain('container-runtime.ts'); - expect(content).toContain('container/build.sh'); - }); - - it('has all modified files', () => { - const runtimeFile = path.join(skillDir, 'modify', 'src', 'container-runtime.ts'); - expect(fs.existsSync(runtimeFile)).toBe(true); - - const content = fs.readFileSync(runtimeFile, 'utf-8'); - expect(content).toContain("CONTAINER_RUNTIME_BIN = 'container'"); - expect(content).toContain('system status'); - expect(content).toContain('system start'); - expect(content).toContain('ls --format json'); - - const testFile = path.join(skillDir, 'modify', 'src', 'container-runtime.test.ts'); - expect(fs.existsSync(testFile)).toBe(true); - - const testContent = fs.readFileSync(testFile, 'utf-8'); - expect(testContent).toContain('system status'); - expect(testContent).toContain('--mount'); - }); - - it('has intent files for modified sources', () => { - const runtimeIntent = path.join(skillDir, 'modify', 'src', 'container-runtime.ts.intent.md'); - expect(fs.existsSync(runtimeIntent)).toBe(true); - - const buildIntent = path.join(skillDir, 'modify', 'container', 'build.sh.intent.md'); - expect(fs.existsSync(buildIntent)).toBe(true); - }); - - it('has build.sh with Apple Container default', () => { - const buildFile = path.join(skillDir, 'modify', 'container', 'build.sh'); - expect(fs.existsSync(buildFile)).toBe(true); - - const content = fs.readFileSync(buildFile, 'utf-8'); - expect(content).toContain('CONTAINER_RUNTIME:-container'); - expect(content).not.toContain('CONTAINER_RUNTIME:-docker'); - }); - - it('uses Apple Container API patterns (not Docker)', () => { - const runtimeFile = path.join(skillDir, 'modify', 'src', 'container-runtime.ts'); - const content = fs.readFileSync(runtimeFile, 'utf-8'); - - // Apple Container patterns - expect(content).toContain('system 
status'); - expect(content).toContain('system start'); - expect(content).toContain('ls --format json'); - expect(content).toContain('type=bind,source='); - - // Should NOT contain Docker patterns - expect(content).not.toContain('docker info'); - expect(content).not.toContain("'-v'"); - expect(content).not.toContain('--filter name='); - }); -}); diff --git a/.claude/skills/customize/SKILL.md b/.claude/skills/customize/SKILL.md index 95a4547..614a979 100644 --- a/.claude/skills/customize/SKILL.md +++ b/.claude/skills/customize/SKILL.md @@ -10,9 +10,9 @@ This skill helps users add capabilities or modify behavior. Use AskUserQuestion ## Workflow 1. **Understand the request** - Ask clarifying questions -2. **Plan the changes** - Identify files to modify -3. **Implement** - Make changes directly to the code -4. **Test guidance** - Tell user how to verify +2. **Plan the changes** - Identify files to modify. If a skill exists for the request (e.g., `/add-telegram` for adding Telegram), invoke it instead of implementing manually. +3. **Implement** - Make changes directly to the code +4. **Test guidance** - Tell user how to verify ## Key Files diff --git a/.claude/skills/init-onecli/SKILL.md b/.claude/skills/init-onecli/SKILL.md new file mode 100644 index 0000000..d7727dd --- /dev/null +++ b/.claude/skills/init-onecli/SKILL.md @@ -0,0 +1,276 @@ +--- +name: init-onecli +description: Install and initialize OneCLI Agent Vault. Migrates existing .env credentials to the vault. Use after /update-nanoclaw brings in OneCLI as a breaking change, or for first-time OneCLI setup. +--- + +# Initialize OneCLI Agent Vault + +This skill installs OneCLI, configures the Agent Vault gateway, and migrates any existing `.env` credentials into it. Run this after `/update-nanoclaw` introduces OneCLI as a breaking change, or any time OneCLI needs to be set up from scratch. + +**Principle:** When something is broken or missing, fix it.
Don't tell the user to go fix it themselves unless it genuinely requires their manual action (e.g. pasting a token). + +## Phase 1: Pre-flight + +### Check if OneCLI is already working + +```bash +onecli version 2>/dev/null +``` + +If the command succeeds, OneCLI is installed. Check if the gateway is reachable: + +```bash +curl -sf http://127.0.0.1:10254/health +``` + +If both succeed, check for an Anthropic secret: + +```bash +onecli secrets list +``` + +If an Anthropic secret exists, tell the user OneCLI is already configured and working. Use AskUserQuestion: + +1. **Keep current setup** — description: "OneCLI is installed and has credentials configured. Nothing to do." +2. **Reconfigure** — description: "Start fresh — reinstall OneCLI and re-register credentials." + +If they choose to keep, skip to Phase 5 (Verify). If they choose to reconfigure, continue. + +### Check for native credential proxy + +```bash +grep "credential-proxy" src/index.ts 2>/dev/null +``` + +If `startCredentialProxy` is imported, the native credential proxy skill is active. Tell the user: "You're currently using the native credential proxy (`.env`-based). This skill will switch you to OneCLI's Agent Vault, which adds per-agent policies and rate limits. Your `.env` credentials will be migrated to the vault." + +Use AskUserQuestion: +1. **Continue** — description: "Switch to OneCLI Agent Vault." +2. **Cancel** — description: "Keep the native credential proxy." + +If they cancel, stop. + +### Check the codebase expects OneCLI + +```bash +grep "@onecli-sh/sdk" package.json +``` + +If `@onecli-sh/sdk` is NOT in package.json, the codebase hasn't been updated to use OneCLI yet. Tell the user to run `/update-nanoclaw` first to get the OneCLI integration, then retry `/init-onecli`. Stop here. 
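The Phase 1 branching above can be summarized in a small decision sketch. The function name and output strings are invented for illustration; only the order of the checks mirrors this skill (SDK check first, then the already-configured case):

```shell
#!/bin/sh
# Illustrative sketch of the Phase 1 decision logic. The four flags are the
# results of the pre-flight checks above (1 = pass, 0 = fail); the function
# name and messages are made up for this sketch, not part of OneCLI.
preflight_next_step() {
  cli_ok=$1; gateway_ok=$2; has_secret=$3; sdk_in_pkg=$4
  if [ "$sdk_in_pkg" -eq 0 ]; then
    # package.json lacks @onecli-sh/sdk: the codebase is not updated yet
    echo "run /update-nanoclaw first"
  elif [ "$cli_ok" -eq 1 ] && [ "$gateway_ok" -eq 1 ] && [ "$has_secret" -eq 1 ]; then
    echo "already configured: ask keep or reconfigure"
  else
    echo "proceed to Phase 2 (install)"
  fi
}

preflight_next_step 1 1 1 1   # everything healthy
preflight_next_step 0 0 0 1   # fresh machine, codebase ready
preflight_next_step 1 1 0 0   # OneCLI present but codebase not updated
```

The native-credential-proxy confirmation is intentionally left out of the sketch, since it is a user choice rather than a mechanical check.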
+ +## Phase 2: Install OneCLI + +### Install the gateway and CLI + +```bash +curl -fsSL onecli.sh/install | sh +curl -fsSL onecli.sh/cli/install | sh +``` + +Verify: `onecli version` + +If the command is not found, the CLI was likely installed to `~/.local/bin/`. Add it to PATH: + +```bash +export PATH="$HOME/.local/bin:$PATH" +grep -q '.local/bin' ~/.bashrc 2>/dev/null || echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc +grep -q '.local/bin' ~/.zshrc 2>/dev/null || echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.zshrc +``` + +Re-verify with `onecli version`. + +### Configure the CLI + +Point the CLI at the local OneCLI instance: + +```bash +onecli config set api-host http://127.0.0.1:10254 +``` + +### Set ONECLI_URL in .env + +```bash +grep -q 'ONECLI_URL' .env 2>/dev/null || echo 'ONECLI_URL=http://127.0.0.1:10254' >> .env +``` + +### Wait for gateway readiness + +The gateway may take a moment to start after installation. Poll for up to 15 seconds: + +```bash +for i in $(seq 1 15); do + curl -sf http://127.0.0.1:10254/health && break + sleep 1 +done +``` + +If it never becomes healthy, check if the gateway process is running: + +```bash +ps aux | grep -i onecli | grep -v grep +``` + +If it's not running, try starting it manually: `onecli start`. If that fails, show the error and stop — the user needs to debug their OneCLI installation. + +## Phase 3: Migrate existing credentials + +### Scan .env for credentials to migrate + +Read the `.env` file and look for these credential variables: + +| .env variable | OneCLI secret type | Host pattern | +|---|---|---| +| `ANTHROPIC_API_KEY` | `anthropic` | `api.anthropic.com` | +| `CLAUDE_CODE_OAUTH_TOKEN` | `anthropic` | `api.anthropic.com` | +| `ANTHROPIC_AUTH_TOKEN` | `anthropic` | `api.anthropic.com` | + +Read `.env`: + +```bash +cat .env +``` + +Parse the file for any of the credential variables listed above. 
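The scan amounts to checking `.env` for each variable in the table. A self-contained sketch (the sample file and output wording are illustrative; in real use, point `ENV_FILE` at the repository's `.env`):

```shell
#!/bin/sh
# Sketch of the .env credential scan. Writes a throwaway sample .env so the
# sketch runs anywhere; channel tokens are deliberately not in the loop.
ENV_FILE=$(mktemp)
cat > "$ENV_FILE" <<'EOF'
ANTHROPIC_API_KEY=sk-ant-example
TELEGRAM_BOT_TOKEN=keep-me-in-env
EOF

FOUND=""
# All three variables map to a single 'anthropic' secret (see table above).
for var in ANTHROPIC_API_KEY CLAUDE_CODE_OAUTH_TOKEN ANTHROPIC_AUTH_TOKEN; do
  if grep -q "^${var}=" "$ENV_FILE"; then
    FOUND="$FOUND $var"
    echo "migrate: $var -> anthropic secret, host api.anthropic.com"
  fi
done
rm -f "$ENV_FILE"
```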
+ +### If credentials found in .env + +For each credential found, migrate it to OneCLI: + +**Anthropic API key** (`ANTHROPIC_API_KEY=sk-ant-...`): +```bash +onecli secrets create --name Anthropic --type anthropic --value <key> --host-pattern api.anthropic.com +``` + +**Claude OAuth token** (`CLAUDE_CODE_OAUTH_TOKEN=...` or `ANTHROPIC_AUTH_TOKEN=...`): +```bash +onecli secrets create --name Anthropic --type anthropic --value <token> --host-pattern api.anthropic.com +``` + +After successful migration, remove the credential lines from `.env`. Use the Edit tool to remove only the credential variable lines (`ANTHROPIC_API_KEY`, `CLAUDE_CODE_OAUTH_TOKEN`, `ANTHROPIC_AUTH_TOKEN`). Keep all other `.env` entries intact (e.g. `ONECLI_URL`, `TELEGRAM_BOT_TOKEN`, channel tokens). + +Verify the secret was registered: +```bash +onecli secrets list +``` + +Tell the user: "Migrated your Anthropic credentials from `.env` to the OneCLI Agent Vault. The raw keys have been removed from `.env` — they're now managed by OneCLI and will be injected at request time without entering containers." + +### Offer to migrate other container-facing credentials + +After handling Anthropic credentials (whether migrated or freshly registered), scan `.env` again for remaining credential variables that containers use for outbound API calls. + +**Important:** Only migrate credentials that containers use via outbound HTTPS. Channel tokens (`TELEGRAM_BOT_TOKEN`, `SLACK_BOT_TOKEN`, `SLACK_APP_TOKEN`, `DISCORD_BOT_TOKEN`) are used by the NanoClaw host process to connect to messaging platforms — they must stay in `.env`. + +Known container-facing credentials: + +| .env variable | Secret name | Host pattern | +|---|---|---| +| `OPENAI_API_KEY` | `OpenAI` | `api.openai.com` | +| `PARALLEL_API_KEY` | `Parallel` | `api.parallel.ai` | + +If any of these are found with non-empty values, present them to the user: + +AskUserQuestion (multiSelect): "These credentials are used by container agents for outbound API calls.
Moving them to the vault means agents never see the raw keys, and you can apply rate limits and policies." + +- One option per credential found (e.g., "OPENAI_API_KEY" — description: "Used by voice transcription and other OpenAI integrations inside containers") +- **Skip — keep them in .env** — description: "Leave these in .env for now. You can move them later." + +For each credential the user selects: + +```bash +onecli secrets create --name <name> --type api_key --value <value> --host-pattern <host-pattern> +``` + +If there are credential variables not in the table above that look container-facing (i.e. not a channel token), ask the user: "Is `<VAR>` used by agents inside containers? If so, what API host does it authenticate against? (e.g., `api.example.com`)" — then migrate accordingly. + +After migration, remove the migrated lines from `.env` using the Edit tool. Keep channel tokens and any credentials the user chose not to migrate. + +Verify all secrets were registered: +```bash +onecli secrets list +``` + +### If no credentials found in .env + +No migration needed. Proceed to register credentials fresh. + +Check if OneCLI already has an Anthropic secret: +```bash +onecli secrets list +``` + +If an Anthropic secret already exists, skip to Phase 4. + +Otherwise, register credentials using the same flow as `/setup`: + +AskUserQuestion: Do you want to use your **Claude subscription** (Pro/Max) or an **Anthropic API key**? + +1. **Claude subscription (Pro/Max)** — description: "Uses your existing Claude Pro or Max subscription. You'll run `claude setup-token` in another terminal to get your token." +2. **Anthropic API key** — description: "Pay-per-use API key from console.anthropic.com." + +#### Subscription path + +Tell the user to run `claude setup-token` in another terminal and copy the token it outputs. Do NOT collect the token in chat. + +Once they have the token, AskUserQuestion with two options: + +1. **Dashboard** — description: "Best if you have a browser on this machine.
Open http://127.0.0.1:10254 and add the secret in the UI. Use type 'anthropic' and paste your token as the value." +2. **CLI** — description: "Best for remote/headless servers. Run: `onecli secrets create --name Anthropic --type anthropic --value YOUR_TOKEN --host-pattern api.anthropic.com`" + +#### API key path + +Tell the user to get an API key from https://console.anthropic.com/settings/keys if they don't have one. + +AskUserQuestion with two options: + +1. **Dashboard** — description: "Best if you have a browser on this machine. Open http://127.0.0.1:10254 and add the secret in the UI." +2. **CLI** — description: "Best for remote/headless servers. Run: `onecli secrets create --name Anthropic --type anthropic --value YOUR_KEY --host-pattern api.anthropic.com`" + +#### After either path + +Ask them to let you know when done. + +**If the user's response happens to contain a token or key** (starts with `sk-ant-` or looks like a token): handle it gracefully — run the `onecli secrets create` command with that value on their behalf. + +**After user confirms:** verify with `onecli secrets list` that an Anthropic secret exists. If not, ask again. + +## Phase 4: Build and restart + +```bash +npm run build +``` + +If build fails, diagnose and fix. Common issue: `@onecli-sh/sdk` not installed — run `npm install` first. + +Restart the service: +- macOS (launchd): `launchctl kickstart -k gui/$(id -u)/com.nanoclaw` +- Linux (systemd): `systemctl --user restart nanoclaw` +- WSL/manual: stop and re-run `bash start-nanoclaw.sh` + +## Phase 5: Verify + +Check logs for successful OneCLI integration: + +```bash +tail -30 logs/nanoclaw.log | grep -i "onecli\|gateway" +``` + +Expected: `OneCLI gateway config applied` messages when containers start. + +If the service is running and a channel is configured, tell the user to send a test message to verify the agent responds. 
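The restart-and-verify steps can be bundled into a dry-run helper that prints the platform-appropriate commands from above rather than running them (the fallback branch is assumed to cover the WSL/manual case):

```shell
#!/bin/sh
# Dry-run sketch: print (do not run) the restart and verify commands for the
# current platform. Commands match the instructions above.
restart_cmd() {
  case "$(uname -s)" in
    Darwin) echo 'launchctl kickstart -k gui/$(id -u)/com.nanoclaw' ;;
    Linux)  echo 'systemctl --user restart nanoclaw' ;;
    *)      echo 'bash start-nanoclaw.sh' ;;
  esac
}

echo "restart: $(restart_cmd)"
printf '%s\n' 'verify:  tail -30 logs/nanoclaw.log | grep -i "onecli\|gateway"'
```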
+ +Tell the user: +- OneCLI Agent Vault is now managing credentials +- Agents never see raw API keys — credentials are injected at the gateway level +- To manage secrets: `onecli secrets list`, or open http://127.0.0.1:10254 +- To add rate limits or policies: `onecli rules create --help` + +## Troubleshooting + +**"OneCLI gateway not reachable" in logs:** The gateway isn't running. Check with `curl -sf http://127.0.0.1:10254/health`. Start it with `onecli start` if needed. + +**Container gets no credentials:** Verify `ONECLI_URL` is set in `.env` and the gateway has an Anthropic secret (`onecli secrets list`). + +**Old .env credentials still present:** This skill should have removed them. Double-check `.env` for `ANTHROPIC_API_KEY`, `CLAUDE_CODE_OAUTH_TOKEN`, or `ANTHROPIC_AUTH_TOKEN` and remove them manually if still present. + +**Port 10254 already in use:** Another OneCLI instance may be running. Check with `lsof -i :10254` and kill the old process, or configure a different port. diff --git a/.claude/skills/setup/SKILL.md b/.claude/skills/setup/SKILL.md index e4e8a13..54c3d2d 100644 --- a/.claude/skills/setup/SKILL.md +++ b/.claude/skills/setup/SKILL.md @@ -1,17 +1,56 @@ --- name: setup -description: Run initial NanoClaw setup. Use when user wants to install dependencies, authenticate WhatsApp, register their main channel, or start the background services. Triggers on "setup", "install", "configure nanoclaw", or first-time setup requests. +description: Run initial NanoClaw setup. Use when user wants to install dependencies, authenticate messaging channels, register their main channel, or start the background services. Triggers on "setup", "install", "configure nanoclaw", or first-time setup requests. --- # NanoClaw Setup -Run setup steps automatically. Only pause when user action is required (WhatsApp authentication, configuration choices). Setup uses `bash setup.sh` for bootstrap, then `npx tsx setup/index.ts --step ` for all other steps. 
Steps emit structured status blocks to stdout. Verbose logs go to `logs/setup.log`. +Run setup steps automatically. Only pause when user action is required (channel authentication, configuration choices). Setup uses `bash setup.sh` for bootstrap, then `npx tsx setup/index.ts --step <step>` for all other steps. Steps emit structured status blocks to stdout. Verbose logs go to `logs/setup.log`. -**Principle:** When something is broken or missing, fix it. Don't tell the user to go fix it themselves unless it genuinely requires their manual action (e.g. scanning a QR code, pasting a secret token). If a dependency is missing, install it. If a service won't start, diagnose and repair. Ask the user for permission when needed, then do the work. +**Principle:** When something is broken or missing, fix it. Don't tell the user to go fix it themselves unless it genuinely requires their manual action (e.g. authenticating a channel, pasting a secret token). If a dependency is missing, install it. If a service won't start, diagnose and repair. Ask the user for permission when needed, then do the work. **UX Note:** Use `AskUserQuestion` for all user-facing questions. -## 1. Bootstrap (Node.js + Dependencies) +## 0. Git & Fork Setup + +Check the git remote configuration to ensure the user has a fork and upstream is configured. + +Run: +- `git remote -v` + +**Case A — `origin` points to `qwibitai/nanoclaw` (user cloned directly):** + +The user cloned instead of forking. AskUserQuestion: "You cloned NanoClaw directly. We recommend forking so you can push your customizations. Would you like to set up a fork?" +- Fork now (recommended) — walk them through it +- Continue without fork — they'll only have local changes + +If fork: instruct the user to fork `qwibitai/nanoclaw` on GitHub (they need to do this in their browser), then ask them for their GitHub username.
Run: +```bash +git remote rename origin upstream +git remote add origin https://github.com/<username>/nanoclaw.git +git push --force origin main +``` +Verify with `git remote -v`. + +If continue without fork: add upstream so they can still pull updates: +```bash +git remote add upstream https://github.com/qwibitai/nanoclaw.git +``` + +**Case B — `origin` points to user's fork, no `upstream` remote:** + +Add upstream: +```bash +git remote add upstream https://github.com/qwibitai/nanoclaw.git +``` + +**Case C — both `origin` (user's fork) and `upstream` (qwibitai) exist:** + +Already configured. Continue. + +**Verify:** `git remote -v` should show `origin` → user's repo, `upstream` → `qwibitai/nanoclaw.git`. + +## 1. Bootstrap (Node.js + Dependencies + OneCLI) Run `bash setup.sh` and parse the status block. @@ -19,18 +58,53 @@ Run `bash setup.sh` and parse the status block. - macOS: `brew install node@22` (if brew available) or install nvm then `nvm install 22` - Linux: `curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash - && sudo apt-get install -y nodejs`, or nvm - After installing Node, re-run `bash setup.sh` -- If DEPS_OK=false → Read `logs/setup.log`. Try: delete `node_modules` and `package-lock.json`, re-run `bash setup.sh`. If native module build fails, install build tools (`xcode-select --install` on macOS, `build-essential` on Linux), then retry. +- If DEPS_OK=false → Read `logs/setup.log`. Try: delete `node_modules`, re-run `bash setup.sh`. If native module build fails, install build tools (`xcode-select --install` on macOS, `build-essential` on Linux), then retry. - If NATIVE_OK=false → better-sqlite3 failed to load. Install build tools and re-run. - Record PLATFORM and IS_WSL for later steps. +After bootstrap succeeds, install OneCLI and its CLI tool: + +```bash +curl -fsSL onecli.sh/install | sh +curl -fsSL onecli.sh/cli/install | sh +``` + +Verify both installed: `onecli version`.
If the command is not found, the CLI was likely installed to `~/.local/bin/`. Add it to PATH for the current session and persist it: + +```bash +export PATH="$HOME/.local/bin:$PATH" +# Persist for future sessions (append to shell profile if not already present) +grep -q '.local/bin' ~/.bashrc 2>/dev/null || echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc +grep -q '.local/bin' ~/.zshrc 2>/dev/null || echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.zshrc +``` + +Then re-verify with `onecli version`. + +Point the CLI at the local OneCLI instance (it defaults to the cloud service otherwise): +```bash +onecli config set api-host http://127.0.0.1:10254 +``` + +Ensure `.env` has the OneCLI URL (create the file if it doesn't exist): +```bash +grep -q 'ONECLI_URL' .env 2>/dev/null || echo 'ONECLI_URL=http://127.0.0.1:10254' >> .env +``` + ## 2. Check Environment Run `npx tsx setup/index.ts --step environment` and parse the status block. -- If HAS_AUTH=true → note that WhatsApp auth exists, offer to skip step 5 +- If HAS_AUTH=true → WhatsApp is already configured, note for step 5 - If HAS_REGISTERED_GROUPS=true → note existing config, offer to skip or reconfigure - Record APPLE_CONTAINER and DOCKER values for step 3 +## 2a. Timezone + +Run `npx tsx setup/index.ts --step timezone` and parse the status block. + +- If NEEDS_USER_INPUT=true → The system timezone could not be autodetected (e.g. POSIX-style TZ like `IST-2`). AskUserQuestion: "What is your timezone?" with common options (America/New_York, Europe/London, Asia/Jerusalem, Asia/Tokyo) and an "Other" escape. Then re-run: `npx tsx setup/index.ts --step timezone -- --tz <timezone>`. +- If STATUS=success → Timezone is configured. Note RESOLVED_TZ for reference. + ## 3. Container Runtime ### 3a. Choose runtime
- PLATFORM=linux → Docker (only option) -- PLATFORM=macos + APPLE_CONTAINER=installed → Use `AskUserQuestion: Docker (default, cross-platform) or Apple Container (native macOS)?` If Apple Container, run `/convert-to-apple-container` now, then skip to 3c. -- PLATFORM=macos + APPLE_CONTAINER=not_found → Docker (default) +- PLATFORM=macos + APPLE_CONTAINER=installed → Use `AskUserQuestion: Docker (cross-platform) or Apple Container (native macOS)?` If Apple Container, run `/convert-to-apple-container` now, then skip to 3c. +- PLATFORM=macos + APPLE_CONTAINER=not_found → Docker ### 3a-docker. Install Docker -- DOCKER=running → continue to 3b +- DOCKER=running → continue to 4b - DOCKER=installed_not_running → start Docker: `open -a Docker` (macOS) or `sudo systemctl start docker` (Linux). Wait 15s, re-check with `docker info`. - DOCKER=not_found → Use `AskUserQuestion: Docker is required for running agents. Would you like me to install it?` If confirmed: - macOS: install via `brew install --cask docker`, then `open -a Docker` and wait for it to start. If brew not available, direct to Docker Desktop download at https://docker.com/products/docker-desktop @@ -61,7 +135,7 @@ grep -q "CONTAINER_RUNTIME_BIN = 'container'" src/container-runtime.ts && echo " **If ALREADY_CONVERTED**, the code already uses Apple Container. Continue to 3c. -**If the chosen runtime is Docker**, no conversion is needed — Docker is the default. Continue to 3c. +**If the chosen runtime is Docker**, no conversion is needed. Continue to 3c. ### 3c. Build and test @@ -73,63 +147,88 @@ Run `npx tsx setup/index.ts --step container -- --runtime ` and parse th **If TEST_OK=false but BUILD_OK=true:** The image built but won't run. Check logs — common cause is runtime not fully started. Wait a moment and retry the test. -## 4. Claude Authentication (No Script) +## 4. Anthropic Credentials via OneCLI -If HAS_ENV=true from step 2, read `.env` and check for `CLAUDE_CODE_OAUTH_TOKEN` or `ANTHROPIC_API_KEY`. 
If present, confirm with user: keep or reconfigure? +NanoClaw uses OneCLI to manage credentials — API keys are never stored in `.env` or exposed to containers. The OneCLI gateway injects them at request time. -AskUserQuestion: Claude subscription (Pro/Max) vs Anthropic API key? +Check if a secret already exists: +```bash +onecli secrets list +``` -**Subscription:** Tell user to run `claude setup-token` in another terminal, copy the token, add `CLAUDE_CODE_OAUTH_TOKEN=` to `.env`. Do NOT collect the token in chat. +If an Anthropic secret is listed, confirm with user: keep or reconfigure? If keeping, skip to step 5. -**API key:** Tell user to add `ANTHROPIC_API_KEY=` to `.env`. +AskUserQuestion: Do you want to use your **Claude subscription** (Pro/Max) or an **Anthropic API key**? -## 5. WhatsApp Authentication +1. **Claude subscription (Pro/Max)** — description: "Uses your existing Claude Pro or Max subscription. You'll run `claude setup-token` in another terminal to get your token." +2. **Anthropic API key** — description: "Pay-per-use API key from console.anthropic.com." -If HAS_AUTH=true, confirm: keep or re-authenticate? +### Subscription path -**Choose auth method based on environment (from step 2):** +Tell the user to run `claude setup-token` in another terminal and copy the token it outputs. Do NOT collect the token in chat. -If IS_HEADLESS=true AND IS_WSL=false → AskUserQuestion: Pairing code (recommended) vs QR code in terminal? -Otherwise (macOS, desktop Linux, or WSL) → AskUserQuestion: QR code in browser (recommended) vs pairing code vs QR code in terminal? +Once they have the token, they register it with OneCLI. AskUserQuestion with two options: -- **QR browser:** `npx tsx setup/index.ts --step whatsapp-auth -- --method qr-browser` (Bash timeout: 150000ms) -- **Pairing code:** Ask for phone number first. `npx tsx setup/index.ts --step whatsapp-auth -- --method pairing-code --phone NUMBER` (Bash timeout: 150000ms). Display PAIRING_CODE. 
-- **QR terminal:** `npx tsx setup/index.ts --step whatsapp-auth -- --method qr-terminal`. Tell user to run `npm run auth` in another terminal. +1. **Dashboard** — description: "Best if you have a browser on this machine. Open http://127.0.0.1:10254 and add the secret in the UI. Use type 'anthropic' and paste your token as the value." +2. **CLI** — description: "Best for remote/headless servers. Run: `onecli secrets create --name Anthropic --type anthropic --value YOUR_TOKEN --host-pattern api.anthropic.com`" -**If failed:** qr_timeout → re-run. logged_out → delete `store/auth/` and re-run. 515 → re-run. timeout → ask user, offer retry. +### API key path -## 6. Configure Trigger and Channel Type +Tell the user to get an API key from https://console.anthropic.com/settings/keys if they don't have one. -Get bot's WhatsApp number: `node -e "const c=require('./store/auth/creds.json');console.log(c.me.id.split(':')[0].split('@')[0])"` +Then AskUserQuestion with two options: -AskUserQuestion: Shared number or dedicated? → AskUserQuestion: Trigger word? → AskUserQuestion: Main channel type? +1. **Dashboard** — description: "Best if you have a browser on this machine. Open http://127.0.0.1:10254 and add the secret in the UI." +2. **CLI** — description: "Best for remote/headless servers. Run: `onecli secrets create --name Anthropic --type anthropic --value YOUR_KEY --host-pattern api.anthropic.com`" -**Shared number:** Self-chat (recommended) or Solo group -**Dedicated number:** DM with bot (recommended) or Solo group with bot +### After either path -## 7. Sync and Select Group (If Group Channel) +Ask them to let you know when done. -**Personal chat:** JID = `NUMBER@s.whatsapp.net` -**DM with bot:** Ask for bot's number, JID = `NUMBER@s.whatsapp.net` +**If the user's response happens to contain a token or key** (starts with `sk-ant-`): handle it gracefully — run the `onecli secrets create` command with that value on their behalf. -**Group:** -1. 
`npx tsx setup/index.ts --step groups` (Bash timeout: 60000ms) -2. BUILD=failed → fix TypeScript, re-run. GROUPS_IN_DB=0 → check logs. -3. `npx tsx setup/index.ts --step groups -- --list` for pipe-separated JID|name lines. -4. Present candidates as AskUserQuestion (names only, not JIDs). +**After user confirms:** verify with `onecli secrets list` that an Anthropic secret exists. If not, ask again. -## 8. Register Channel +## 5. Set Up Channels -Run `npx tsx setup/index.ts --step register -- --jid "JID" --name "main" --trigger "@TriggerWord" --folder "main"` plus `--no-trigger-required` if personal/DM/solo, `--assistant-name "Name"` if not Andy. +AskUserQuestion (multiSelect): Which messaging channels do you want to enable? +- WhatsApp (authenticates via QR code or pairing code) +- Telegram (authenticates via bot token from @BotFather) +- Slack (authenticates via Slack app with Socket Mode) +- Discord (authenticates via Discord bot token) -## 9. Mount Allowlist +**Delegate to each selected channel's own skill.** Each channel skill handles its own code installation, authentication, registration, and JID resolution. This avoids duplicating channel-specific logic and ensures JIDs are always correct. + +For each selected channel, invoke its skill: + +- **WhatsApp:** Invoke `/add-whatsapp` +- **Telegram:** Invoke `/add-telegram` +- **Slack:** Invoke `/add-slack` +- **Discord:** Invoke `/add-discord` + +Each skill will: +1. Install the channel code (via `git merge` of the skill branch) +2. Collect credentials/tokens and write to `.env` +3. Authenticate (WhatsApp QR/pairing, or verify token-based connection) +4. Register the chat with the correct JID format +5. Build and verify + +**After all channel skills complete**, install dependencies and rebuild — channel merges may introduce new packages: + +```bash +npm install && npm run build +``` + +If the build fails, read the error output and fix it (usually a missing dependency). Then continue to step 6. + +## 6. 
Mount Allowlist AskUserQuestion: Agent access to external directories? **No:** `npx tsx setup/index.ts --step mounts -- --empty` **Yes:** Collect paths/permissions. `npx tsx setup/index.ts --step mounts -- --json '{"allowedRoots":[...],"blockedPatterns":[],"nonMainReadOnly":true}'` -## 10. Start Service +## 7. Start Service If service already running: unload first. - macOS: `launchctl unload ~/Library/LaunchAgents/com.nanoclaw.plist` @@ -159,28 +258,34 @@ Replace `USERNAME` with the actual username (from `whoami`). Run the two `sudo` - Linux: check `systemctl --user status nanoclaw`. - Re-run the service step after fixing. -## 11. Verify +## 8. Verify Run `npx tsx setup/index.ts --step verify` and parse the status block. **If STATUS=failed, fix each:** - SERVICE=stopped → `npm run build`, then restart: `launchctl kickstart -k gui/$(id -u)/com.nanoclaw` (macOS) or `systemctl --user restart nanoclaw` (Linux) or `bash start-nanoclaw.sh` (WSL nohup) -- SERVICE=not_found → re-run step 10 -- CREDENTIALS=missing → re-run step 4 -- WHATSAPP_AUTH=not_found → re-run step 5 -- REGISTERED_GROUPS=0 → re-run steps 7-8 +- SERVICE=not_found → re-run step 7 +- CREDENTIALS=missing → re-run step 4 (check `onecli secrets list` for Anthropic secret) +- CHANNEL_AUTH shows `not_found` for any channel → re-invoke that channel's skill (e.g. `/add-telegram`) +- REGISTERED_GROUPS=0 → re-invoke the channel skills from step 5 - MOUNT_ALLOWLIST=missing → `npx tsx setup/index.ts --step mounts -- --empty` Tell user to test: send a message in their registered chat. Show: `tail -f logs/nanoclaw.log` ## Troubleshooting -**Service not starting:** Check `logs/nanoclaw.error.log`. Common: wrong Node path (re-run step 10), missing `.env` (step 4), missing auth (step 5). +**Service not starting:** Check `logs/nanoclaw.error.log`. Common: wrong Node path (re-run step 7), OneCLI not running (check `curl http://127.0.0.1:10254/api/health`), missing channel credentials (re-invoke channel skill). 
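Several steps above say to "parse the status block." A minimal parser sketch, assuming the `<<< STATUS … STATUS >>>` marker convention used by this repo's setup scripts (the sample output below is fabricated for illustration):

```shell
# Minimal sketch: extract KEY=VALUE pairs from a "<<< STATUS ... STATUS >>>" block.
# The marker format is an assumption based on this repo's setup script output.
parse_status() {
  sed -n '/^<<< STATUS$/,/^STATUS >>>$/p' | grep -E '^[A-Z_]+='
}

# Fabricated sample standing in for `npx tsx setup/index.ts --step verify` output:
SAMPLE=$(cat <<'EOF'
Checking service...
<<< STATUS
STATUS=failed
SERVICE=stopped
CREDENTIALS=ok
STATUS >>>
EOF
)

printf '%s\n' "$SAMPLE" | parse_status
```

Each extracted `KEY=VALUE` line can then drive the per-key remediation table in the Verify step.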
**Container agent fails ("Claude Code process exited with code 1"):** Ensure the container runtime is running — `open -a Docker` (macOS Docker), `container system start` (Apple Container), or `sudo systemctl start docker` (Linux). Check container logs in `groups/main/logs/container-*.log`. **No response to messages:** Check trigger pattern. Main channel doesn't need prefix. Check DB: `npx tsx setup/index.ts --step verify`. Check `logs/nanoclaw.log`. -**WhatsApp disconnected:** `npm run auth` then rebuild and restart: `npm run build && launchctl kickstart -k gui/$(id -u)/com.nanoclaw` (macOS) or `systemctl --user restart nanoclaw` (Linux). +**Channel not connecting:** Verify the channel's credentials are set in `.env`. Channels auto-enable when their credentials are present. For WhatsApp: check `store/auth/creds.json` exists. For token-based channels: check token values in `.env`. Restart the service after any `.env` change. **Unload service:** macOS: `launchctl unload ~/Library/LaunchAgents/com.nanoclaw.plist` | Linux: `systemctl --user stop nanoclaw` + + +## 9. Diagnostics + +1. Use the Read tool to read `.claude/skills/setup/diagnostics.md`. +2. Follow every step in that file before completing setup. diff --git a/.claude/skills/setup/diagnostics.md b/.claude/skills/setup/diagnostics.md new file mode 100644 index 0000000..c6a42db --- /dev/null +++ b/.claude/skills/setup/diagnostics.md @@ -0,0 +1,46 @@ +# Diagnostics + +Gather system info: + +```bash +node -p "require('./package.json').version" +uname -s +uname -m +node -p "process.versions.node.split('.')[0]" +``` + +Write `/tmp/nanoclaw-diagnostics.json`. No paths, usernames, hostnames, or IP addresses. 
+ +```json +{ + "api_key": "phc_fx1Hhx9ucz8GuaJC8LVZWO8u03yXZZJJ6ObS4yplnaP", + "event": "setup_complete", + "distinct_id": "", + "properties": { + "success": true, + "nanoclaw_version": "1.2.21", + "os_platform": "darwin", + "arch": "arm64", + "node_major_version": 22, + "channels_selected": ["telegram", "whatsapp"], + "error_count": 0, + "failed_step": null + } +} +``` + +Show the entire JSON to the user and ask via AskUserQuestion: **Yes** / **No** / **Never ask again** + +**Yes**: +```bash +curl -s -X POST https://us.i.posthog.com/capture/ -H 'Content-Type: application/json' -d @/tmp/nanoclaw-diagnostics.json +rm /tmp/nanoclaw-diagnostics.json +``` + +**No**: `rm /tmp/nanoclaw-diagnostics.json` + +**Never ask again**: +1. Replace contents of `.claude/skills/setup/diagnostics.md` with `# Diagnostics — opted out` +2. Replace contents of `.claude/skills/update-nanoclaw/diagnostics.md` with `# Diagnostics — opted out` +3. Remove the `## 9. Diagnostics` section from `.claude/skills/setup/SKILL.md` and the `## Diagnostics` section from `.claude/skills/update-nanoclaw/SKILL.md` +4. `rm /tmp/nanoclaw-diagnostics.json` diff --git a/.claude/skills/update-nanoclaw/SKILL.md b/.claude/skills/update-nanoclaw/SKILL.md new file mode 100644 index 0000000..496d409 --- /dev/null +++ b/.claude/skills/update-nanoclaw/SKILL.md @@ -0,0 +1,241 @@ +--- +name: update-nanoclaw +description: Efficiently bring upstream NanoClaw updates into a customized install, with preview, selective cherry-pick, and low token usage. +--- + +# About + +Your NanoClaw fork drifts from upstream as you customize it. This skill pulls upstream changes into your install without losing your modifications. + +Run `/update-nanoclaw` in Claude Code. + +## How it works + +**Preflight**: checks for clean working tree (`git status --porcelain`). If `upstream` remote is missing, asks you for the URL (defaults to `https://github.com/qwibitai/nanoclaw.git`) and adds it. 
Detects the upstream branch name (`main` or `master`).
+
+**Backup**: creates a timestamped backup branch and tag (`backup/pre-update-<hash>-<timestamp>`, `pre-update-<hash>-<timestamp>`) before touching anything. Safe to run multiple times.
+
+**Preview**: runs `git log` and `git diff` against the merge base to show upstream changes since your last sync. Groups changed files into categories:
+- **Skills** (`.claude/skills/`): unlikely to conflict unless you edited an upstream skill
+- **Source** (`src/`): may conflict if you modified the same files
+- **Build/config** (`package.json`, `tsconfig*.json`, `container/`): review needed
+
+**Update paths** (you pick one):
+- `merge` (default): `git merge upstream/<branch>`. Resolves all conflicts in one pass.
+- `cherry-pick`: `git cherry-pick <commit>...`. Pull in only the commits you want.
+- `rebase`: `git rebase upstream/<branch>`. Linear history, but conflicts resolve per-commit.
+- `abort`: just view the changelog, change nothing.
+
+**Conflict preview**: before merging, runs a dry-run (`git merge --no-commit --no-ff`) to show which files would conflict. You can still abort at this point.
+
+**Conflict resolution**: opens only conflicted files, resolves the conflict markers, keeps your local customizations intact.
+
+**Validation**: runs `npm run build` and `npm test`.
+
+**Breaking changes check**: after validation, reads CHANGELOG.md for any `[BREAKING]` entries introduced by the update. If found, shows each breaking change and offers to run the recommended skill to migrate.
+
+## Rollback
+
+The backup tag is printed at the end of each run:
+```
+git reset --hard pre-update-<hash>-<timestamp>
+```
+
+Backup branch `backup/pre-update-<hash>-<timestamp>` also exists.
+
+## Token usage
+
+Only opens files with actual conflicts. Uses `git log`, `git diff`, and `git status` for everything else. Does not scan or refactor unrelated code.
+
+---
+
+# Goal
+Help a user with a customized NanoClaw install safely incorporate upstream changes without a fresh reinstall and without blowing tokens.
+
+# Operating principles
+- Never proceed with a dirty working tree.
+- Always create a rollback point (backup branch + tag) before touching anything.
+- Prefer git-native operations (fetch, merge, cherry-pick). Do not manually rewrite files except conflict markers.
+- Default to MERGE (one-pass conflict resolution). Offer REBASE as an explicit option.
+- Keep token usage low: rely on `git status`, `git log`, `git diff`, and open only conflicted files.
+
+# Step 0: Preflight (stop early if unsafe)
+Run:
+- `git status --porcelain`
+If output is non-empty:
+- Tell the user to commit or stash first, then stop.
+
+Confirm remotes:
+- `git remote -v`
+If `upstream` is missing:
+- Ask the user for the upstream repo URL (default: `https://github.com/qwibitai/nanoclaw.git`).
+- Add it: `git remote add upstream <url>`
+- Then: `git fetch upstream --prune`
+
+Determine the upstream branch name:
+- `git branch -r | grep upstream/`
+- If `upstream/main` exists, use `main`.
+- If only `upstream/master` exists, use `master`.
+- Otherwise, ask the user which branch to use.
+- Store this as UPSTREAM_BRANCH for all subsequent commands. Every command below that references `upstream/main` should use `upstream/$UPSTREAM_BRANCH` instead.
+
+Fetch:
+- `git fetch upstream --prune`
+
+# Step 1: Create a safety net
+Capture current state:
+- `HASH=$(git rev-parse --short HEAD)`
+- `TIMESTAMP=$(date +%Y%m%d-%H%M%S)`
+
+Create backup branch and tag (using timestamp to avoid collisions on retry):
+- `git branch backup/pre-update-$HASH-$TIMESTAMP`
+- `git tag pre-update-$HASH-$TIMESTAMP`
+
+Save the tag name for later reference in the summary and rollback instructions.
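Steps 0–1 can be sketched end-to-end as follows. This is a sketch under demo assumptions, not the skill itself: it builds a throwaway repo with itself registered as `upstream` so every command runs as-is; a real run operates on the user's checkout and real remote.

```shell
set -eu

# Demo setup (illustrative only): a throwaway repo standing in for a NanoClaw
# checkout, with itself added as the `upstream` remote. Requires git >= 2.28
# for `init -b`.
DEMO=$(mktemp -d)
cd "$DEMO"
git init -q -b main .
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "init"
git remote add upstream "$DEMO"
git fetch -q upstream --prune

# Step 0: detect the upstream branch name (main, falling back to master).
if git show-ref --verify --quiet refs/remotes/upstream/main; then
  UPSTREAM_BRANCH=main
elif git show-ref --verify --quiet refs/remotes/upstream/master; then
  UPSTREAM_BRANCH=master
else
  echo "Neither upstream/main nor upstream/master found; ask the user." >&2
  exit 1
fi

# Step 1: create the rollback point before touching anything.
HASH=$(git rev-parse --short HEAD)
TIMESTAMP=$(date +%Y%m%d-%H%M%S)
git branch "backup/pre-update-$HASH-$TIMESTAMP"
git tag "pre-update-$HASH-$TIMESTAMP"

echo "UPSTREAM_BRANCH=$UPSTREAM_BRANCH"
echo "BACKUP_TAG=pre-update-$HASH-$TIMESTAMP"
```

The backup tag is lightweight, so it needs no committer identity and can be deleted later with `git tag -d` once the update is confirmed good.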
+ +# Step 2: Preview what upstream changed (no edits yet) +Compute common base: +- `BASE=$(git merge-base HEAD upstream/$UPSTREAM_BRANCH)` + +Show upstream commits since BASE: +- `git log --oneline $BASE..upstream/$UPSTREAM_BRANCH` + +Show local commits since BASE (custom drift): +- `git log --oneline $BASE..HEAD` + +Show file-level impact from upstream: +- `git diff --name-only $BASE..upstream/$UPSTREAM_BRANCH` + +Bucket the upstream changed files: +- **Skills** (`.claude/skills/`): unlikely to conflict unless the user edited an upstream skill +- **Source** (`src/`): may conflict if user modified the same files +- **Build/config** (`package.json`, `package-lock.json`, `tsconfig*.json`, `container/`, `launchd/`): review needed +- **Other**: docs, tests, misc + +Present these buckets to the user and ask them to choose one path using AskUserQuestion: +- A) **Full update**: merge all upstream changes +- B) **Selective update**: cherry-pick specific upstream commits +- C) **Abort**: they only wanted the preview +- D) **Rebase mode**: advanced, linear history (warn: resolves conflicts per-commit) + +If Abort: stop here. + +# Step 3: Conflict preview (before committing anything) +If Full update or Rebase: +- Dry-run merge to preview conflicts. Run these as a single chained command so the abort always executes: + ``` + git merge --no-commit --no-ff upstream/$UPSTREAM_BRANCH; git diff --name-only --diff-filter=U; git merge --abort + ``` +- If conflicts were listed: show them and ask user if they want to proceed. +- If no conflicts: tell user it is clean and proceed. + +# Step 4A: Full update (MERGE, default) +Run: +- `git merge upstream/$UPSTREAM_BRANCH --no-edit` + +If conflicts occur: +- Run `git status` and identify conflicted files. +- For each conflicted file: + - Open the file. + - Resolve only conflict markers. + - Preserve intentional local customizations. + - Incorporate upstream fixes/improvements. + - Do not refactor surrounding code. 
+ - `git add <file>`
+- When all resolved:
+ - If merge did not auto-commit: `git commit --no-edit`
+
+# Step 4B: Selective update (CHERRY-PICK)
+If user chose Selective:
+- Recompute BASE if needed: `BASE=$(git merge-base HEAD upstream/$UPSTREAM_BRANCH)`
+- Show commit list again: `git log --oneline $BASE..upstream/$UPSTREAM_BRANCH`
+- Ask user which commit hashes they want.
+- Apply: `git cherry-pick <hash> <hash> ...`
+
+If conflicts during cherry-pick:
+- Resolve only conflict markers, then:
+ - `git add <file>`
+ - `git cherry-pick --continue`
+If user wants to stop:
+ - `git cherry-pick --abort`
+
+# Step 4C: Rebase (only if user explicitly chose option D)
+Run:
+- `git rebase upstream/$UPSTREAM_BRANCH`
+
+If conflicts:
+- Resolve conflict markers only, then:
+ - `git add <file>`
+ - `git rebase --continue`
+If it gets messy (more than 3 rounds of conflicts):
+ - `git rebase --abort`
+ - Recommend merge instead.
+
+# Step 5: Validation
+Run:
+- `npm run build`
+- `npm test` (do not fail the flow if tests are not configured)
+
+If build fails:
+- Show the error.
+- Only fix issues clearly caused by the merge (missing imports, type mismatches from merged code).
+- Do not refactor unrelated code.
+- If unclear, ask the user before making changes.
+
+# Step 6: Breaking changes check
+After validation succeeds, check if the update introduced any breaking changes.
+
+Determine which CHANGELOG entries are new by diffing against the backup tag:
+- `git diff <backup-tag>..HEAD -- CHANGELOG.md`
+
+Parse the diff output for lines starting with `+[BREAKING]`. Each such line is one breaking change entry. The format is:
+```
+[BREAKING] <description>. Run `/<skill-name>` to <action>.
+```
+
+If no `[BREAKING]` lines are found:
+- Skip this step silently. Proceed to Step 7 (skill updates check).
+
+If one or more `[BREAKING]` lines are found:
+- Display a warning header to the user: "This update includes breaking changes that may require action:"
+- For each breaking change, display the full description.
+- Collect all skill names referenced in the breaking change entries (the `/<skill-name>` part).
+- Use AskUserQuestion to ask the user which migration skills they want to run now. Options:
+ - One option per referenced skill (e.g., "Run /add-whatsapp to re-add WhatsApp channel")
+ - "Skip — I'll handle these manually"
+- Set `multiSelect: true` so the user can pick multiple skills if there are several breaking changes.
+- For each skill the user selects, invoke it using the Skill tool.
+- After all selected skills complete (or if user chose Skip), proceed to Step 7 (skill updates check).
+
+# Step 7: Check for skill updates
+Before the final summary, check if skills are distributed as branches in this repo:
+- `git branch -r --list 'upstream/skill/*'`
+
+If any `upstream/skill/*` branches exist:
+- Use AskUserQuestion to ask: "Upstream has skill branches. Would you like to check for skill updates?"
+ - Option 1: "Yes, check for updates" (description: "Runs /update-skills to check for and apply skill branch updates")
+ - Option 2: "No, skip" (description: "You can run /update-skills later any time")
+- If user selects yes, invoke `/update-skills` using the Skill tool.
+- After the skill completes (or if user selected no), proceed to Step 8.
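The `[BREAKING]` extraction in Step 6 can be sketched in plain shell. The diff text below is fabricated sample data (not real `git diff` output), and the skill names in it are hypothetical:

```shell
# Sketch: pull breaking-change entries and their /skill-name references out of
# `git diff <backup-tag>..HEAD -- CHANGELOG.md` output. Sample diff is fabricated.
DIFF=$(cat <<'EOF'
+## 1.3.0
+[BREAKING] WhatsApp moved to a skill branch. Run `/add-whatsapp` to re-add the channel.
+[BREAKING] Secrets moved to OneCLI. Run `/migrate-secrets` to migrate credentials.
+- Unrelated minor fix
EOF
)

# Keep only added lines that start with "[BREAKING]", stripping the leading "+".
BREAKING=$(printf '%s\n' "$DIFF" | sed -n 's/^+\(\[BREAKING\].*\)$/\1/p')
printf '%s\n' "$BREAKING"

# Collect the `/skill-name` referenced by each entry, for AskUserQuestion options.
SKILLS=$(printf '%s\n' "$BREAKING" | grep -o '`/[a-z-]*`' | tr -d '`')
printf '%s\n' "$SKILLS"
```

Each collected `/skill-name` becomes one AskUserQuestion option, alongside the "Skip" escape.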
+
+# Step 8: Summary + rollback instructions
+Show:
+- Backup tag: the tag name created in Step 1
+- New HEAD: `git rev-parse --short HEAD`
+- Upstream HEAD: `git rev-parse --short upstream/$UPSTREAM_BRANCH`
+- Conflicts resolved (list files, if any)
+- Breaking changes applied (list skills run, if any)
+- Remaining local diff vs upstream: `git diff --name-only upstream/$UPSTREAM_BRANCH..HEAD`
+
+Tell the user:
+- To rollback: `git reset --hard <backup-tag>`
+- Backup branch also exists: `backup/pre-update-<hash>-<timestamp>`
+- Restart the service to apply changes:
+ - If using launchd: `launchctl unload ~/Library/LaunchAgents/com.nanoclaw.plist && launchctl load ~/Library/LaunchAgents/com.nanoclaw.plist`
+ - If running manually: restart `npm run dev`
+
+
+## Diagnostics
+
+1. Use the Read tool to read `.claude/skills/update-nanoclaw/diagnostics.md`.
+2. Follow every step in that file before finishing.
diff --git a/.claude/skills/update-nanoclaw/diagnostics.md b/.claude/skills/update-nanoclaw/diagnostics.md
new file mode 100644
index 0000000..8b06aa4
--- /dev/null
+++ b/.claude/skills/update-nanoclaw/diagnostics.md
@@ -0,0 +1,49 @@
+# Diagnostics
+
+Gather system info:
+
+```bash
+node -p "require('./package.json').version"
+uname -s
+uname -m
+node -p "process.versions.node.split('.')[0]"
+git log -1 --format=%ci HEAD@{1} 2>/dev/null || echo "unknown"
+```
+
+Write `/tmp/nanoclaw-diagnostics.json`. No paths, usernames, hostnames, or IP addresses.
+ +```json +{ + "api_key": "phc_fx1Hhx9ucz8GuaJC8LVZWO8u03yXZZJJ6ObS4yplnaP", + "event": "update_complete", + "distinct_id": "", + "properties": { + "success": true, + "nanoclaw_version": "1.2.21", + "os_platform": "darwin", + "arch": "arm64", + "node_major_version": 22, + "version_age_days": 45, + "update_method": "merge", + "conflict_count": 0, + "breaking_changes_found": false, + "error_count": 0 + } +} +``` + +Show the entire JSON to the user and ask via AskUserQuestion: **Yes** / **No** / **Never ask again** + +**Yes**: +```bash +curl -s -X POST https://us.i.posthog.com/capture/ -H 'Content-Type: application/json' -d @/tmp/nanoclaw-diagnostics.json +rm /tmp/nanoclaw-diagnostics.json +``` + +**No**: `rm /tmp/nanoclaw-diagnostics.json` + +**Never ask again**: +1. Replace contents of `.claude/skills/setup/diagnostics.md` with `# Diagnostics — opted out` +2. Replace contents of `.claude/skills/update-nanoclaw/diagnostics.md` with `# Diagnostics — opted out` +3. Remove the `## 9. Diagnostics` section from `.claude/skills/setup/SKILL.md` and the `## Diagnostics` section from `.claude/skills/update-nanoclaw/SKILL.md` +4. `rm /tmp/nanoclaw-diagnostics.json` diff --git a/.claude/skills/update-skills/SKILL.md b/.claude/skills/update-skills/SKILL.md new file mode 100644 index 0000000..cbbff39 --- /dev/null +++ b/.claude/skills/update-skills/SKILL.md @@ -0,0 +1,130 @@ +--- +name: update-skills +description: Check for and apply updates to installed skill branches from upstream. +--- + +# About + +Skills are distributed as git branches (`skill/*`). When you install a skill, you merge its branch into your repo. This skill checks upstream for newer commits on those skill branches and helps you update. + +Run `/update-skills` in Claude Code. + +## How it works + +**Preflight**: checks for clean working tree and upstream remote. 
+
+**Detection**: fetches upstream, lists all `upstream/skill/*` branches, determines which ones you've previously merged (via merge-base), and checks if any have new commits.
+
+**Selection**: presents a list of skills with available updates. You pick which to update.
+
+**Update**: merges each selected skill branch, resolves conflicts if any, then validates with build + test.
+
+---
+
+# Goal
+Help users update their installed skill branches from upstream without losing local customizations.
+
+# Operating principles
+- Never proceed with a dirty working tree.
+- Only offer updates for skills the user has already merged (installed).
+- Use git-native operations. Do not manually rewrite files except conflict markers.
+- Keep token usage low: rely on `git` commands, only open files with actual conflicts.
+
+# Step 0: Preflight
+
+Run:
+- `git status --porcelain`
+
+If output is non-empty:
+- Tell the user to commit or stash first, then stop.
+
+Check remotes:
+- `git remote -v`
+
+If `upstream` is missing:
+- Ask the user for the upstream repo URL (default: `https://github.com/qwibitai/nanoclaw.git`).
+- `git remote add upstream <url>`
+
+Fetch:
+- `git fetch upstream --prune`
+
+# Step 1: Detect installed skills with available updates
+
+List all upstream skill branches:
+- `git branch -r --list 'upstream/skill/*'`
+
+For each `upstream/skill/<name>`:
+1. Check if the user has merged this skill branch before:
+ - `git merge-base --is-ancestor upstream/skill/<name>~1 HEAD` — if this succeeds (exit 0) for any ancestor commit of the skill branch, the user has merged it at some point. A simpler check: `git log --oneline --merges --grep="skill/<name>" HEAD` to see if there's a merge commit referencing this branch.
+ - Alternative: `MERGE_BASE=$(git merge-base HEAD upstream/skill/<name>)` — if the merge base is NOT the initial commit and the merge base includes commits unique to the skill branch, it has been merged.
+ - Simplest reliable check: compare `git merge-base HEAD upstream/skill/<name>` with `git merge-base HEAD upstream/main`. If the skill merge-base is strictly ahead of (or different from) the main merge-base, the user has merged this skill.
+2. Check if there are new commits on the skill branch not yet in HEAD:
+ - `git log --oneline HEAD..upstream/skill/<name>`
+ - If this produces output, there are updates available.
+
+Build three lists:
+- **Updates available**: skills that are merged AND have new commits
+- **Up to date**: skills that are merged and have no new commits
+- **Not installed**: skills that have never been merged
+
+# Step 2: Present results
+
+If no skills have updates available:
+- Tell the user all installed skills are up to date. List them.
+- If there are uninstalled skills, mention them briefly (e.g., "3 other skills available in upstream that you haven't installed").
+- Stop here.
+
+If updates are available:
+- Show the list of skills with updates, including the number of new commits for each:
+ ```
+ skill/<name-a>: 3 new commits
+ skill/<name-b>: 1 new commit
+ ```
+- Also show skills that are up to date (for context).
+- Use AskUserQuestion with `multiSelect: true` to let the user pick which skills to update.
+ - One option per skill with updates, labeled with the skill name and commit count.
+ - Add an option: "Skip — don't update any skills now"
+- If user selects Skip, stop here.
+
+# Step 3: Apply updates
+
+For each selected skill (process one at a time):
+
+1. Tell the user which skill is being updated.
+2. Run: `git merge upstream/skill/<name> --no-edit`
+3. If the merge is clean, move to the next skill.
+4. If conflicts occur:
+ - Run `git status` to identify conflicted files.
+ - For each conflicted file:
+ - Open the file.
+ - Resolve only conflict markers.
+ - Preserve intentional local customizations.
+ - `git add ` + - Complete the merge: `git commit --no-edit` + +If a merge fails badly (e.g., cannot resolve conflicts): +- `git merge --abort` +- Tell the user this skill could not be auto-updated and they should resolve it manually. +- Continue with the remaining skills. + +# Step 4: Validation + +After all selected skills are merged: +- `npm run build` +- `npm test` (do not fail the flow if tests are not configured) + +If build fails: +- Show the error. +- Only fix issues clearly caused by the merge (missing imports, type mismatches). +- Do not refactor unrelated code. +- If unclear, ask the user. + +# Step 5: Summary + +Show: +- Skills updated (list) +- Skills skipped or failed (if any) +- New HEAD: `git rev-parse --short HEAD` +- Any conflicts that were resolved (list files) + +If the service is running, remind the user to restart it to pick up changes. diff --git a/.claude/skills/update/SKILL.md b/.claude/skills/update/SKILL.md deleted file mode 100644 index 7f4fc02..0000000 --- a/.claude/skills/update/SKILL.md +++ /dev/null @@ -1,171 +0,0 @@ ---- -name: update -description: "Update NanoClaw from upstream. Fetches latest changes, merges with your customizations and skills, runs migrations. Triggers on \"update\", \"pull upstream\", \"sync with upstream\", \"get latest changes\"." ---- - -# Update NanoClaw - -Pull upstream changes and merge them with the user's installation, preserving skills and customizations. Scripts live in `.claude/skills/update/scripts/`. - -**Principle:** Handle everything automatically. Only pause for user confirmation before applying changes, or when merge conflicts need human judgment. - -**UX Note:** Use `AskUserQuestion` for all user-facing questions. - -## 1. 
Pre-flight - -Check that the skills system is initialized: - -```bash -test -d .nanoclaw && echo "INITIALIZED" || echo "NOT_INITIALIZED" -``` - -**If NOT_INITIALIZED:** Run `initSkillsSystem()` first: - -```bash -npx tsx -e "import { initNanoclawDir } from './skills-engine/init.js'; initNanoclawDir();" -``` - -Check for uncommitted git changes: - -```bash -git status --porcelain -``` - -**If there are uncommitted changes:** Warn the user: "You have uncommitted changes. It's recommended to commit or stash them before updating. Continue anyway?" Use `AskUserQuestion` with options: "Continue anyway", "Abort (I'll commit first)". If they abort, stop here. - -## 2. Fetch upstream - -Run the fetch script: - -```bash -./.claude/skills/update/scripts/fetch-upstream.sh -``` - -Parse the structured status block between `<<< STATUS` and `STATUS >>>` markers. Extract: -- `TEMP_DIR` — path to extracted upstream files -- `REMOTE` — which git remote was used -- `CURRENT_VERSION` — version from local `package.json` -- `NEW_VERSION` — version from upstream `package.json` -- `STATUS` — "success" or "error" - -**If STATUS=error:** Show the error output and stop. - -**If CURRENT_VERSION equals NEW_VERSION:** Tell the user they're already up to date. Ask if they want to force the update anyway (there may be non-version-bumped changes). If no, clean up the temp dir and stop. - -## 3. Preview - -Run the preview to show what will change: - -```bash -npx tsx scripts/update-core.ts --json --preview-only -``` - -This outputs JSON with: `currentVersion`, `newVersion`, `filesChanged`, `filesDeleted`, `conflictRisk`, `customPatchesAtRisk`. 
- -Present to the user: -- "Updating from **{currentVersion}** to **{newVersion}**" -- "{N} files will be changed" — list them if <= 20, otherwise summarize -- If `conflictRisk` is non-empty: "These files have skill modifications and may conflict: {list}" -- If `customPatchesAtRisk` is non-empty: "These custom patches may need re-application: {list}" -- If `filesDeleted` is non-empty: "{N} files will be removed" - -## 4. Confirm - -Use `AskUserQuestion`: "Apply this update?" with options: -- "Yes, apply update" -- "No, cancel" - -If cancelled, clean up the temp dir (`rm -rf `) and stop. - -## 5. Apply - -Run the update: - -```bash -npx tsx scripts/update-core.ts --json -``` - -Parse the JSON output. The result has: `success`, `previousVersion`, `newVersion`, `mergeConflicts`, `backupPending`, `customPatchFailures`, `skillReapplyResults`, `error`. - -**If success=true with no issues:** Continue to step 7. - -**If customPatchFailures exist:** Warn the user which custom patches failed to re-apply. These may need manual attention after the update. - -**If skillReapplyResults has false entries:** Warn the user which skill tests failed after re-application. - -## 6. Handle conflicts - -**If backupPending=true:** There are unresolved merge conflicts. - -For each file in `mergeConflicts`: -1. Read the file — it contains conflict markers (`<<<<<<<`, `=======`, `>>>>>>>`) -2. Check if there's an intent file for this path in any applied skill (e.g., `.claude/skills//modify/.intent.md`) -3. Use the intent file and your understanding of the codebase to resolve the conflict -4. Write the resolved file - -After resolving all conflicts: - -```bash -npx tsx scripts/post-update.ts -``` - -This clears the backup, confirming the resolution. - -**If you cannot confidently resolve a conflict:** Show the user the conflicting sections and ask them to choose or provide guidance. - -## 7. 
Run migrations - -Run migrations between the old and new versions: - -```bash -npx tsx scripts/run-migrations.ts -``` - -Parse the JSON output. It contains: `migrationsRun` (count), `results` (array of `{version, success, error?}`). - -**If any migration fails:** Show the error to the user. The update itself is already applied — the migration failure needs manual attention. - -**If no migrations found:** This is normal (most updates won't have migrations). Continue silently. - -## 8. Verify - -Run build and tests: - -```bash -npm run build && npm test -``` - -**If build fails:** Show the error. Common causes: -- Type errors from merged files — read the error, fix the file, retry -- Missing dependencies — run `npm install` first, retry - -**If tests fail:** Show which tests failed. Try to diagnose and fix. If you can't fix automatically, report to the user. - -**If both pass:** Report success. - -## 9. Cleanup - -Remove the temp directory: - -```bash -rm -rf -``` - -Report final status: -- "Updated from **{previousVersion}** to **{newVersion}**" -- Number of files changed -- Any warnings (failed custom patches, failed skill tests, migration issues) -- Build and test status - -## Troubleshooting - -**No upstream remote:** The fetch script auto-adds `upstream` pointing to `https://github.com/qwibitai/nanoclaw.git`. If the user forked from a different URL, they should set the remote manually: `git remote add upstream `. - -**Merge conflicts in many files:** Consider whether the user has heavily customized core files. Suggest using the skills system for modifications instead of direct edits, as skills survive updates better. - -**Build fails after update:** Check if `package.json` dependencies changed. Run `npm install` to pick up new dependencies. - -**Rollback:** If something goes wrong after applying but before cleanup, the backup is still in `.nanoclaw/backup/`. 
Run: -```bash -npx tsx -e "import { restoreBackup, clearBackup } from './skills-engine/backup.js'; restoreBackup(); clearBackup();" -``` diff --git a/.claude/skills/update/scripts/fetch-upstream.sh b/.claude/skills/update/scripts/fetch-upstream.sh deleted file mode 100755 index 76bc783..0000000 --- a/.claude/skills/update/scripts/fetch-upstream.sh +++ /dev/null @@ -1,84 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Fetch upstream NanoClaw and extract to a temp directory. -# Outputs a structured status block for machine parsing. - -SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)" -PROJECT_ROOT="$(cd "$SCRIPT_DIR/../../../.." && pwd)" -cd "$PROJECT_ROOT" - -# Determine the correct remote -REMOTE="" -if git remote get-url upstream &>/dev/null; then - REMOTE="upstream" -elif git remote get-url origin &>/dev/null; then - ORIGIN_URL=$(git remote get-url origin) - if echo "$ORIGIN_URL" | grep -q "qwibitai/nanoclaw"; then - REMOTE="origin" - fi -fi - -if [ -z "$REMOTE" ]; then - echo "No upstream remote found. Adding upstream → https://github.com/qwibitai/nanoclaw.git" - git remote add upstream https://github.com/qwibitai/nanoclaw.git - REMOTE="upstream" -fi - -echo "Fetching from $REMOTE..." -if ! git fetch "$REMOTE" main 2>&1; then - echo "<<< STATUS" - echo "STATUS=error" - echo "ERROR=Failed to fetch from $REMOTE" - echo "STATUS >>>" - exit 1 -fi - -# Get current version from local package.json -CURRENT_VERSION="unknown" -if [ -f package.json ]; then - CURRENT_VERSION=$(node -e "console.log(require('./package.json').version || 'unknown')") -fi - -# Create temp dir and extract only the paths the skills engine tracks. -# Read BASE_INCLUDES from the single source of truth in skills-engine/constants.ts, -# plus always include migrations/ for the migration runner. -TEMP_DIR=$(mktemp -d /tmp/nanoclaw-update-XXXX) -trap 'rm -rf "$TEMP_DIR"' ERR -echo "Extracting $REMOTE/main to $TEMP_DIR..." 
- -CANDIDATES=$(node -e " - const fs = require('fs'); - const src = fs.readFileSync('skills-engine/constants.ts', 'utf-8'); - const m = src.match(/BASE_INCLUDES\s*=\s*\[([^\]]+)\]/); - if (!m) { console.error('Cannot parse BASE_INCLUDES'); process.exit(1); } - const paths = m[1].match(/'([^']+)'/g).map(s => s.replace(/'/g, '')); - paths.push('migrations/'); - console.log(paths.join(' ')); -") - -# Filter to paths that actually exist in the upstream tree. -# git archive errors if a path doesn't exist, so we check first. -PATHS="" -for candidate in $CANDIDATES; do - if [ -n "$(git ls-tree --name-only "$REMOTE/main" "$candidate" 2>/dev/null)" ]; then - PATHS="$PATHS $candidate" - fi -done - -git archive "$REMOTE/main" -- $PATHS | tar -x -C "$TEMP_DIR" - -# Get new version from extracted package.json -NEW_VERSION="unknown" -if [ -f "$TEMP_DIR/package.json" ]; then - NEW_VERSION=$(node -e "console.log(require('$TEMP_DIR/package.json').version || 'unknown')") -fi - -echo "" -echo "<<< STATUS" -echo "TEMP_DIR=$TEMP_DIR" -echo "REMOTE=$REMOTE" -echo "CURRENT_VERSION=$CURRENT_VERSION" -echo "NEW_VERSION=$NEW_VERSION" -echo "STATUS=success" -echo "STATUS >>>" diff --git a/.claude/skills/use-local-whisper/SKILL.md b/.claude/skills/use-local-whisper/SKILL.md new file mode 100644 index 0000000..ec18a09 --- /dev/null +++ b/.claude/skills/use-local-whisper/SKILL.md @@ -0,0 +1,152 @@ +--- +name: use-local-whisper +description: Use when the user wants local voice transcription instead of OpenAI Whisper API. Switches to whisper.cpp running on Apple Silicon. WhatsApp only for now. Requires voice-transcription skill to be applied first. +--- + +# Use Local Whisper + +Switches voice transcription from OpenAI's Whisper API to local whisper.cpp. Runs entirely on-device — no API key, no network, no cost. + +**Channel support:** Currently WhatsApp only. The transcription module (`src/transcription.ts`) uses Baileys types for audio download. Other channels (Telegram, Discord, etc.) 
would need their own audio-download logic before this skill can serve them. + +**Note:** The Homebrew package is `whisper-cpp`, but the CLI binary it installs is `whisper-cli`. + +## Prerequisites + +- `voice-transcription` skill must be applied first (WhatsApp channel) +- macOS with Apple Silicon (M1+) recommended +- `whisper-cpp` installed: `brew install whisper-cpp` (provides the `whisper-cli` binary) +- `ffmpeg` installed: `brew install ffmpeg` +- A GGML model file downloaded to `data/models/` + +## Phase 1: Pre-flight + +### Check if already applied + +Check if `src/transcription.ts` already uses `whisper-cli`: + +```bash +grep 'whisper-cli' src/transcription.ts && echo "Already applied" || echo "Not applied" +``` + +If already applied, skip to Phase 3 (Verify). + +### Check dependencies are installed + +```bash +whisper-cli --help >/dev/null 2>&1 && echo "WHISPER_OK" || echo "WHISPER_MISSING" +ffmpeg -version >/dev/null 2>&1 && echo "FFMPEG_OK" || echo "FFMPEG_MISSING" +``` + +If missing, install via Homebrew: +```bash +brew install whisper-cpp ffmpeg +``` + +### Check for model file + +```bash +ls data/models/ggml-*.bin 2>/dev/null || echo "NO_MODEL" +``` + +If no model exists, download the base model (148MB, good balance of speed and accuracy): +```bash +mkdir -p data/models +curl -L -o data/models/ggml-base.bin "https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-base.bin" +``` + +For better accuracy at the cost of speed, use `ggml-small.bin` (466MB) or `ggml-medium.bin` (1.5GB). 
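Before applying the branch, it can help to see the shape of the change. The sketch below shows how a transcription path built on `whisper-cli` is typically wired — decode the voice note to 16 kHz mono WAV with ffmpeg, then run whisper.cpp. The function names and module layout here are illustrative assumptions; the actual code comes from the `skill/local-whisper` branch, not from this sketch.

```typescript
import { spawnSync } from "node:child_process";
import { join } from "node:path";
import { tmpdir } from "node:os";

// Defaults mirror the Configuration table in this skill (WHISPER_BIN, WHISPER_MODEL).
const WHISPER_BIN = process.env.WHISPER_BIN ?? "whisper-cli";
const WHISPER_MODEL = process.env.WHISPER_MODEL ?? "data/models/ggml-base.bin";

// whisper.cpp wants 16 kHz mono WAV input, so the OGG/Opus voice note is
// converted with ffmpeg first. The argv builders are separated out so the
// pipeline can be inspected (and tested) without either binary installed.
export function ffmpegArgs(input: string, wav: string): string[] {
  return ["-i", input, "-ar", "16000", "-ac", "1", "-f", "wav", wav, "-y"];
}

export function whisperArgs(wav: string): string[] {
  return ["-m", WHISPER_MODEL, "-f", wav, "--no-timestamps"];
}

export function transcribeVoiceNote(oggPath: string): string {
  const wav = join(tmpdir(), `voice-${Date.now()}.wav`);
  const ff = spawnSync("ffmpeg", ffmpegArgs(oggPath, wav));
  if (ff.status !== 0) throw new Error("ffmpeg conversion failed");
  const wc = spawnSync(WHISPER_BIN, whisperArgs(wav), { encoding: "utf8" });
  if (wc.status !== 0) throw new Error("whisper.cpp transcription failed");
  return wc.stdout.trim();
}
```

If transcription misbehaves after the merge, comparing the real `src/transcription.ts` against this shape (conversion step, model path, binary name) is a quick way to spot a missing model file or PATH entry.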
+
+## Phase 2: Apply Code Changes
+
+### Ensure WhatsApp fork remote
+
+```bash
+git remote -v
+```
+
+If `whatsapp` is missing, add it:
+
+```bash
+git remote add whatsapp https://github.com/qwibitai/nanoclaw-whatsapp.git
+```
+
+### Merge the skill branch
+
+```bash
+git fetch whatsapp skill/local-whisper
+git merge whatsapp/skill/local-whisper || {
+  git checkout --theirs package-lock.json
+  git add package-lock.json
+  git merge --continue
+}
+```
+
+This modifies `src/transcription.ts` to use the `whisper-cli` binary instead of the OpenAI API.
+
+### Validate
+
+```bash
+npm run build
+```
+
+## Phase 3: Verify
+
+### Ensure launchd PATH includes Homebrew
+
+The NanoClaw launchd service runs with a restricted PATH. `whisper-cli` and `ffmpeg` are in `/opt/homebrew/bin/` (Apple Silicon) or `/usr/local/bin/` (Intel), which may not be in the plist's PATH.
+
+Check the current PATH:
+```bash
+grep -A1 'PATH' ~/Library/LaunchAgents/com.nanoclaw.plist
+```
+
+If `/opt/homebrew/bin` is missing, add it to the `<string>` value inside the `PATH` key in the plist. Then reload:
+```bash
+launchctl unload ~/Library/LaunchAgents/com.nanoclaw.plist
+launchctl load ~/Library/LaunchAgents/com.nanoclaw.plist
+```
+
+### Build and restart
+
+```bash
+npm run build
+launchctl kickstart -k gui/$(id -u)/com.nanoclaw
+```
+
+### Test
+
+Send a voice note in any registered group. The agent should receive it as `[Voice: <transcribed text>]`. 
+ +### Check logs + +```bash +tail -f logs/nanoclaw.log | grep -i -E "voice|transcri|whisper" +``` + +Look for: +- `Transcribed voice message` — successful transcription +- `whisper.cpp transcription failed` — check model path, ffmpeg, or PATH + +## Configuration + +Environment variables (optional, set in `.env`): + +| Variable | Default | Description | +|----------|---------|-------------| +| `WHISPER_BIN` | `whisper-cli` | Path to whisper.cpp binary | +| `WHISPER_MODEL` | `data/models/ggml-base.bin` | Path to GGML model file | + +## Troubleshooting + +**"whisper.cpp transcription failed"**: Ensure both `whisper-cli` and `ffmpeg` are in PATH. The launchd service uses a restricted PATH — see Phase 3 above. Test manually: +```bash +ffmpeg -f lavfi -i anullsrc=r=16000:cl=mono -t 1 -f wav /tmp/test.wav -y +whisper-cli -m data/models/ggml-base.bin -f /tmp/test.wav --no-timestamps -nt +``` + +**Transcription works in dev but not as service**: The launchd plist PATH likely doesn't include `/opt/homebrew/bin`. See "Ensure launchd PATH includes Homebrew" in Phase 3. + +**Slow transcription**: The base model processes ~30s of audio in <1s on M1+. If slower, check CPU usage — another process may be competing. + +**Wrong language**: whisper.cpp auto-detects language. To force a language, you can set `WHISPER_LANG` and modify `src/transcription.ts` to pass `-l $WHISPER_LANG`. diff --git a/.claude/skills/use-native-credential-proxy/SKILL.md b/.claude/skills/use-native-credential-proxy/SKILL.md new file mode 100644 index 0000000..71448b1 --- /dev/null +++ b/.claude/skills/use-native-credential-proxy/SKILL.md @@ -0,0 +1,167 @@ +--- +name: use-native-credential-proxy +description: Replace OneCLI gateway with the built-in credential proxy. For users who want simple .env-based credential management without installing OneCLI. Reads API key or OAuth token from .env and injects into container API requests. 
+--- + +# Use Native Credential Proxy + +This skill replaces the OneCLI gateway with NanoClaw's built-in credential proxy. Containers get credentials injected via a local HTTP proxy that reads from `.env` — no external services needed. + +## Phase 1: Pre-flight + +### Check if already applied + +Check if `src/credential-proxy.ts` is imported in `src/index.ts`: + +```bash +grep "credential-proxy" src/index.ts +``` + +If it shows an import for `startCredentialProxy`, the native proxy is already active. Skip to Phase 3 (Setup). + +### Check if OneCLI is active + +```bash +grep "@onecli-sh/sdk" package.json +``` + +If `@onecli-sh/sdk` appears, OneCLI is the active credential provider. Proceed with Phase 2 to replace it. + +If neither check matches, you may be on an older version. Run `/update-nanoclaw` first, then retry. + +## Phase 2: Apply Code Changes + +### Ensure upstream remote + +```bash +git remote -v +``` + +If `upstream` is missing, add it: + +```bash +git remote add upstream https://github.com/qwibitai/nanoclaw.git +``` + +### Merge the skill branch + +```bash +git fetch upstream skill/native-credential-proxy +git merge upstream/skill/native-credential-proxy || { + git checkout --theirs package-lock.json + git add package-lock.json + git merge --continue +} +``` + +This merges in: +- `src/credential-proxy.ts` and `src/credential-proxy.test.ts` (the proxy implementation) +- Restored credential proxy usage in `src/index.ts`, `src/container-runner.ts`, `src/container-runtime.ts`, `src/config.ts` +- Removed `@onecli-sh/sdk` dependency +- Restored `CREDENTIAL_PROXY_PORT` config (default 3001) +- Restored platform-aware proxy bind address detection +- Reverted setup skill to `.env`-based credential instructions + +If the merge reports conflicts beyond `package-lock.json`, resolve them by reading the conflicted files and understanding the intent of both sides. 
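The proxy reads whichever credential Phase 3 below puts in `.env` and injects the matching auth header on each forwarded request. A minimal sketch of that selection logic follows — the function name is invented for illustration, and the real export (and the precedence between API key and OAuth token) lives in `src/credential-proxy.ts` and may differ:

```typescript
// Sketch of the credential-selection step: pick an auth header based on
// which variable is present. Variable names come from Phase 3 of this
// skill; resolveAuth() is an invented name, not the real export.
type Auth = { header: string; value: string };

export function resolveAuth(env: Record<string, string | undefined>): Auth {
  if (env.ANTHROPIC_API_KEY) {
    return { header: "x-api-key", value: env.ANTHROPIC_API_KEY };
  }
  // ANTHROPIC_AUTH_TOKEN is documented as a fallback for the OAuth token.
  const token = env.CLAUDE_CODE_OAUTH_TOKEN ?? env.ANTHROPIC_AUTH_TOKEN;
  if (token) {
    return { header: "authorization", value: `Bearer ${token}` };
  }
  throw new Error("No ANTHROPIC_API_KEY or CLAUDE_CODE_OAUTH_TOKEN in .env");
}
```

Keeping this selection in one place is what lets the proxy inject credentials at request time without ever handing the raw key or token to a container.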
+
+### Update main group CLAUDE.md
+
+Replace the OneCLI auth reference with the native proxy:
+
+In `groups/main/CLAUDE.md`, replace:
+> OneCLI manages credentials (including Anthropic auth) — run `onecli --help`.
+
+with:
+> The native credential proxy manages credentials (including Anthropic auth) via `.env` — see `src/credential-proxy.ts`.
+
+### Validate code changes
+
+```bash
+npm install
+npm run build
+npx vitest run src/credential-proxy.test.ts src/container-runner.test.ts
+```
+
+All tests must pass and build must be clean before proceeding.
+
+## Phase 3: Setup Credentials
+
+AskUserQuestion: Do you want to use your **Claude subscription** (Pro/Max) or an **Anthropic API key**?
+
+1. **Claude subscription (Pro/Max)** — description: "Uses your existing Claude Pro or Max subscription. You'll run `claude setup-token` in another terminal to get your token."
+2. **Anthropic API key** — description: "Pay-per-use API key from console.anthropic.com."
+
+### Subscription path
+
+Tell the user to run `claude setup-token` in another terminal and copy the token it outputs. Do NOT collect the token in chat.
+
+Once they have the token, add it to `.env`:
+
+```bash
+# Add to .env (create file if needed)
+echo 'CLAUDE_CODE_OAUTH_TOKEN=<token>' >> .env
+```
+
+Note: `ANTHROPIC_AUTH_TOKEN` is also supported as a fallback.
+
+### API key path
+
+Tell the user to get an API key from https://console.anthropic.com/settings/keys if they don't have one.
+
+Add it to `.env`:
+
+```bash
+echo 'ANTHROPIC_API_KEY=<api-key>' >> .env
+```
+
+### After either path
+
+**If the user's response happens to contain a token or key** (starts with `sk-ant-` or looks like a token): write it to `.env` on their behalf using the appropriate variable name.
+
+**Optional:** If the user needs a custom API endpoint, they can add `ANTHROPIC_BASE_URL=<url>` to `.env` (defaults to `https://api.anthropic.com`).
+
+## Phase 4: Verify
+
+1. 
Rebuild and restart:
+
+```bash
+npm run build
+```
+
+Then restart the service:
+- macOS: `launchctl kickstart -k gui/$(id -u)/com.nanoclaw`
+- Linux: `systemctl --user restart nanoclaw`
+- WSL/manual: stop and re-run `bash start-nanoclaw.sh`
+
+2. Check logs for successful proxy startup:
+
+```bash
+tail -20 logs/nanoclaw.log | grep "Credential proxy"
+```
+
+Expected: `Credential proxy started` with port and auth mode.
+
+3. Send a test message in the registered chat to verify the agent responds.
+
+4. Note: after applying this skill, the OneCLI credential steps in `/setup` no longer apply. `.env` is now the credential source.
+
+## Troubleshooting
+
+**"Credential proxy upstream error" in logs:** Check that `.env` has a valid `ANTHROPIC_API_KEY` or `CLAUDE_CODE_OAUTH_TOKEN`. Verify the API is reachable: `curl -s https://api.anthropic.com/v1/messages -H "x-api-key: test" | head`.
+
+**Port 3001 already in use:** Set `CREDENTIAL_PROXY_PORT=<port>` in `.env` or as an environment variable.
+
+**Container can't reach proxy (Linux):** The proxy binds to the `docker0` bridge IP by default. If that interface doesn't exist (e.g. rootless Docker), set `CREDENTIAL_PROXY_HOST=0.0.0.0` as an environment variable.
+
+**OAuth token expired (401 errors):** Re-run `claude setup-token` in a terminal and update the token in `.env`.
+
+## Removal
+
+To revert to OneCLI gateway:
+
+1. Find the merge commit: `git log --oneline --merges -5`
+2. Revert it: `git revert -m 1 <merge-commit>` (undoes the skill branch merge, keeps your other changes)
+3. `npm install` (re-adds `@onecli-sh/sdk`)
+4. `npm run build`
+5. Follow `/setup` step 4 to configure OneCLI credentials
+6. 
Remove `ANTHROPIC_API_KEY` / `CLAUDE_CODE_OAUTH_TOKEN` from `.env` diff --git a/.env.example b/.env.example index 8b13789..b90e6c9 100644 --- a/.env.example +++ b/.env.example @@ -1 +1 @@ - +TELEGRAM_BOT_TOKEN= diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md index 8d33f7b..49fe366 100644 --- a/.github/PULL_REQUEST_TEMPLATE.md +++ b/.github/PULL_REQUEST_TEMPLATE.md @@ -1,14 +1,18 @@ + ## Type of Change -- [ ] **Skill** - adds a new skill in `.claude/skills/` +- [ ] **Feature skill** - adds a channel or integration (source code changes + SKILL.md) +- [ ] **Utility skill** - adds a standalone tool (code files in `.claude/skills//`, no source changes) +- [ ] **Operational/container skill** - adds a workflow or agent skill (SKILL.md only, no source changes) - [ ] **Fix** - bug fix or security fix to source code - [ ] **Simplification** - reduces or simplifies source code +- [ ] **Documentation** - docs, README, or CONTRIBUTING changes only ## Description ## For Skills -- [ ] I have not made any changes to source code -- [ ] My skill contains instructions for Claude to follow (not pre-built code) +- [ ] SKILL.md contains instructions, not inline code (code goes in separate files) +- [ ] SKILL.md is under 500 lines - [ ] I tested this skill on a fresh clone diff --git a/.github/workflows/bump-version.yml b/.github/workflows/bump-version.yml index fb77595..8191085 100644 --- a/.github/workflows/bump-version.yml +++ b/.github/workflows/bump-version.yml @@ -7,6 +7,7 @@ on: jobs: bump-version: + if: github.repository == 'qwibitai/nanoclaw' runs-on: ubuntu-latest steps: - uses: actions/create-github-app-token@v1 diff --git a/.github/workflows/label-pr.yml b/.github/workflows/label-pr.yml new file mode 100644 index 0000000..bec9d3e --- /dev/null +++ b/.github/workflows/label-pr.yml @@ -0,0 +1,35 @@ +name: Label PR + +on: + pull_request: + types: [opened, edited] + +jobs: + label: + runs-on: ubuntu-latest + permissions: + pull-requests: write + 
steps: + - uses: actions/github-script@v7 + with: + script: | + const body = context.payload.pull_request.body || ''; + const labels = []; + + if (body.includes('[x] **Feature skill**')) { labels.push('PR: Skill'); labels.push('PR: Feature'); } + else if (body.includes('[x] **Utility skill**')) labels.push('PR: Skill'); + else if (body.includes('[x] **Operational/container skill**')) labels.push('PR: Skill'); + else if (body.includes('[x] **Fix**')) labels.push('PR: Fix'); + else if (body.includes('[x] **Simplification**')) labels.push('PR: Refactor'); + else if (body.includes('[x] **Documentation**')) labels.push('PR: Docs'); + + if (body.includes('contributing-guide: v1')) labels.push('follows-guidelines'); + + if (labels.length > 0) { + await github.rest.issues.addLabels({ + owner: context.repo.owner, + repo: context.repo.repo, + issue_number: context.payload.pull_request.number, + labels, + }); + } diff --git a/.github/workflows/skill-drift.yml b/.github/workflows/skill-drift.yml deleted file mode 100644 index 9bc7ed8..0000000 --- a/.github/workflows/skill-drift.yml +++ /dev/null @@ -1,102 +0,0 @@ -name: Skill Drift Detection - -# Runs after every push to main that touches source files. -# Validates every skill can still be cleanly applied, type-checked, and tested. -# If a skill drifts, attempts auto-fix via three-way merge of modify/ files, -# then opens a PR with the result (auto-fixed or with conflict markers). 
- -on: - push: - branches: [main] - paths: - - 'src/**' - - 'container/**' - - 'package.json' - workflow_dispatch: - -permissions: - contents: write - pull-requests: write - -jobs: - # ── Step 1: Check all skills against current main ───────────────────── - validate: - runs-on: ubuntu-latest - outputs: - drifted: ${{ steps.check.outputs.drifted }} - drifted_skills: ${{ steps.check.outputs.drifted_skills }} - results: ${{ steps.check.outputs.results }} - steps: - - uses: actions/checkout@v4 - with: - fetch-depth: 0 - - uses: actions/setup-node@v4 - with: - node-version: 20 - cache: npm - - run: npm ci - - - name: Validate all skills against main - id: check - run: npx tsx scripts/validate-all-skills.ts - continue-on-error: true - - # ── Step 2: Auto-fix and create PR ──────────────────────────────────── - fix-drift: - needs: validate - if: needs.validate.outputs.drifted == 'true' - runs-on: ubuntu-latest - steps: - - uses: actions/create-github-app-token@v1 - id: app-token - with: - app-id: ${{ secrets.APP_ID }} - private-key: ${{ secrets.APP_PRIVATE_KEY }} - - - uses: actions/checkout@v4 - with: - token: ${{ steps.app-token.outputs.token }} - fetch-depth: 0 - - - uses: actions/setup-node@v4 - with: - node-version: 20 - cache: npm - - run: npm ci - - - name: Attempt auto-fix via three-way merge - id: fix - run: | - SKILLS=$(echo '${{ needs.validate.outputs.drifted_skills }}' | jq -r '.[]') - npx tsx scripts/fix-skill-drift.ts $SKILLS - - - name: Create pull request - uses: peter-evans/create-pull-request@v7 - with: - token: ${{ steps.app-token.outputs.token }} - branch: ci/fix-skill-drift - delete-branch: true - title: 'fix(skills): auto-update drifted skills' - body: | - ## Skill Drift Detected - - A push to `main` (${{ github.sha }}) changed source files that caused - the following skills to fail validation: - - **Drifted:** ${{ needs.validate.outputs.drifted_skills }} - - ### Auto-fix results - - ${{ steps.fix.outputs.summary }} - - ### What to do - - 1. 
Review the changes to `.claude/skills/*/modify/` files - 2. If there are conflict markers (`<<<<<<<`), resolve them - 3. CI will run typecheck + tests on this PR automatically - 4. Merge when green - - --- - *Auto-generated by [skill-drift CI](${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }})* - labels: skill-drift,automated - commit-message: 'fix(skills): auto-update drifted skill modify/ files' diff --git a/.github/workflows/skill-pr.yml b/.github/workflows/skill-pr.yml deleted file mode 100644 index 7ecd71a..0000000 --- a/.github/workflows/skill-pr.yml +++ /dev/null @@ -1,151 +0,0 @@ -name: Skill PR Validation - -on: - pull_request: - branches: [main] - paths: - - '.claude/skills/**' - - 'skills-engine/**' - -jobs: - # ── Job 1: Policy gate ──────────────────────────────────────────────── - # Block PRs that add NEW skill files while also modifying source code. - # Skill PRs should contain instructions for Claude, not raw source edits. - policy-check: - runs-on: ubuntu-latest - steps: - - uses: actions/checkout@v4 - with: - fetch-depth: 0 - - - name: Check for mixed skill + source changes - run: | - ADDED_SKILLS=$(git diff --name-only --diff-filter=A origin/main...HEAD \ - | grep '^\\.claude/skills/' || true) - CHANGED=$(git diff --name-only origin/main...HEAD) - SOURCE=$(echo "$CHANGED" \ - | grep -E '^src/|^container/|^package\.json|^package-lock\.json' || true) - - if [ -n "$ADDED_SKILLS" ] && [ -n "$SOURCE" ]; then - echo "::error::PRs that add new skills should not modify source files." - echo "" - echo "New skill files:" - echo "$ADDED_SKILLS" - echo "" - echo "Source files:" - echo "$SOURCE" - echo "" - echo "Please split into separate PRs. See CONTRIBUTING.md." 
- exit 1 - fi - - - name: Comment on failure - if: failure() - uses: actions/github-script@v7 - with: - script: | - github.rest.issues.createComment({ - owner: context.repo.owner, - repo: context.repo.repo, - issue_number: context.issue.number, - body: `This PR adds a skill while also modifying source code. A skill PR should not change source files—the skill should contain **instructions** for Claude to follow. - - If you're fixing a bug or simplifying code, please submit that as a separate PR. - - See [CONTRIBUTING.md](https://github.com/${context.repo.owner}/${context.repo.repo}/blob/main/CONTRIBUTING.md) for details.` - }) - - # ── Job 2: Detect which skills changed ──────────────────────────────── - detect-changed: - runs-on: ubuntu-latest - outputs: - skills: ${{ steps.detect.outputs.skills }} - has_skills: ${{ steps.detect.outputs.has_skills }} - steps: - - uses: actions/checkout@v4 - with: - fetch-depth: 0 - - - name: Detect changed skills - id: detect - run: | - CHANGED_SKILLS=$(git diff --name-only origin/main...HEAD \ - | grep '^\\.claude/skills/' \ - | sed 's|^\.claude/skills/||' \ - | cut -d/ -f1 \ - | sort -u \ - | jq -R . | jq -s .) 
- echo "skills=$CHANGED_SKILLS" >> "$GITHUB_OUTPUT" - if [ "$CHANGED_SKILLS" = "[]" ]; then - echo "has_skills=false" >> "$GITHUB_OUTPUT" - else - echo "has_skills=true" >> "$GITHUB_OUTPUT" - fi - echo "Changed skills: $CHANGED_SKILLS" - - # ── Job 3: Validate each changed skill in isolation ─────────────────── - validate-skills: - needs: detect-changed - if: needs.detect-changed.outputs.has_skills == 'true' - runs-on: ubuntu-latest - strategy: - fail-fast: false - matrix: - skill: ${{ fromJson(needs.detect-changed.outputs.skills) }} - steps: - - uses: actions/checkout@v4 - - uses: actions/setup-node@v4 - with: - node-version: 20 - cache: npm - - run: npm ci - - - name: Initialize skills system - run: >- - npx tsx -e - "import { initNanoclawDir } from './skills-engine/index'; initNanoclawDir();" - - - name: Apply skill - run: npx tsx scripts/apply-skill.ts ".claude/skills/${{ matrix.skill }}" - - - name: Typecheck after apply - run: npx tsc --noEmit - - - name: Run skill tests - run: | - TEST_CMD=$(npx tsx -e " - import { parse } from 'yaml'; - import fs from 'fs'; - const m = parse(fs.readFileSync('.claude/skills/${{ matrix.skill }}/manifest.yaml', 'utf-8')); - if (m.test) console.log(m.test); - ") - if [ -n "$TEST_CMD" ]; then - echo "Running: $TEST_CMD" - eval "$TEST_CMD" - else - echo "No test command defined, skipping" - fi - - # ── Summary gate for branch protection ──────────────────────────────── - skill-validation-summary: - needs: - - policy-check - - detect-changed - - validate-skills - if: always() - runs-on: ubuntu-latest - steps: - - name: Check results - run: | - echo "policy-check: ${{ needs.policy-check.result }}" - echo "validate-skills: ${{ needs.validate-skills.result }}" - - if [ "${{ needs.policy-check.result }}" = "failure" ]; then - echo "::error::Policy check failed" - exit 1 - fi - if [ "${{ needs.validate-skills.result }}" = "failure" ]; then - echo "::error::Skill validation failed" - exit 1 - fi - echo "All skill checks passed" diff 
--git a/.github/workflows/update-tokens.yml b/.github/workflows/update-tokens.yml index 753da18..9b25c55 100644 --- a/.github/workflows/update-tokens.yml +++ b/.github/workflows/update-tokens.yml @@ -8,6 +8,7 @@ on: jobs: update-tokens: + if: github.repository == 'qwibitai/nanoclaw' runs-on: ubuntu-latest steps: - uses: actions/create-github-app-token@v1 diff --git a/.gitignore b/.gitignore index deda421..e259fbf 100644 --- a/.gitignore +++ b/.gitignore @@ -22,6 +22,9 @@ groups/global/* *.keys.json .env +# Temp files +.tmp-* + # OS .DS_Store diff --git a/CHANGELOG.md b/CHANGELOG.md index c3833a1..28178e8 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,3 +1,144 @@ # Changelog All notable changes to NanoClaw will be documented in this file. + +For detailed release notes, see the [full changelog on the documentation site](https://docs.nanoclaw.dev/changelog). + +## [1.2.35] - 2026-03-26 + +- [BREAKING] OneCLI Agent Vault replaces the built-in credential proxy. Existing `.env` credentials must be migrated to the vault. Run `/init-onecli` to install OneCLI and migrate credentials. 
+ +## [1.2.21] - 2026-03-22 + +- Added opt-in diagnostics via PostHog with explicit user consent (Yes / No / Never ask again) + +## [1.2.20] - 2026-03-21 + +- Added ESLint configuration with error-handling rules + +## [1.2.19] - 2026-03-19 + +- Reduced `docker stop` timeout for faster container restarts (`-t 1` flag) + +## [1.2.18] - 2026-03-19 + +- User prompt content no longer logged on container errors — only input metadata +- Added Japanese README translation + +## [1.2.17] - 2026-03-18 + +- Added `/capabilities` and `/status` container-agent skills + +## [1.2.16] - 2026-03-18 + +- Tasks snapshot now refreshes immediately after IPC task mutations + +## [1.2.15] - 2026-03-16 + +- Fixed remote-control prompt auto-accept to prevent immediate exit +- Added `KillMode=process` so remote-control survives service restarts + +## [1.2.14] - 2026-03-14 + +- Added `/remote-control` command for host-level Claude Code access from within containers + +## [1.2.13] - 2026-03-14 + +**Breaking:** Skills are now git branches, channels are separate fork repos. 
+ +- Skills live as `skill/*` git branches merged via `git merge` +- Added Docker Sandboxes support +- Fixed setup registration to use correct CLI commands + +## [1.2.12] - 2026-03-08 + +- Added `/compact` skill for manual context compaction +- Enhanced container environment isolation via credential proxy + +## [1.2.11] - 2026-03-08 + +- Added PDF reader, image vision, and WhatsApp reactions skills +- Fixed task container to close promptly when agent uses IPC-only messaging + +## [1.2.10] - 2026-03-06 + +- Added `LIMIT` to unbounded message history queries for better performance + +## [1.2.9] - 2026-03-06 + +- Agent prompts now include timezone context for accurate time references + +## [1.2.8] - 2026-03-06 + +- Fixed misleading `send_message` tool description for scheduled tasks + +## [1.2.7] - 2026-03-06 + +- Added `/add-ollama` skill for local model inference +- Added `update_task` tool and return task ID from `schedule_task` + +## [1.2.6] - 2026-03-04 + +- Updated `claude-agent-sdk` to 0.2.68 + +## [1.2.5] - 2026-03-04 + +- CI formatting fix + +## [1.2.4] - 2026-03-04 + +- Fixed `_chatJid` rename to `chatJid` in `onMessage` callback + +## [1.2.3] - 2026-03-04 + +- Added sender allowlist for per-chat access control + +## [1.2.2] - 2026-03-04 + +- Added `/use-local-whisper` skill for local voice transcription +- Atomic task claims prevent scheduled tasks from executing twice + +## [1.2.1] - 2026-03-02 + +- Version bump (no functional changes) + +## [1.2.0] - 2026-03-02 + +**Breaking:** WhatsApp removed from core, now a skill. Run `/add-whatsapp` to re-add. 
+ +- Channel registry: channels self-register at startup via `registerChannel()` factory pattern +- `isMain` flag replaces folder-name-based main group detection +- `ENABLED_CHANNELS` removed — channels detected by credential presence +- Prevent scheduled tasks from executing twice when container runtime exceeds poll interval + +## [1.1.6] - 2026-03-01 + +- Added CJK font support for Chromium screenshots + +## [1.1.5] - 2026-03-01 + +- Fixed wrapped WhatsApp message normalization + +## [1.1.4] - 2026-03-01 + +- Added third-party model support +- Added `/update-nanoclaw` skill for syncing with upstream + +## [1.1.3] - 2026-02-25 + +- Added `/add-slack` skill +- Restructured Gmail skill for new architecture + +## [1.1.2] - 2026-02-24 + +- Improved error handling for WhatsApp Web version fetch + +## [1.1.1] - 2026-02-24 + +- Added Qodo skills and codebase intelligence +- Fixed WhatsApp 405 connection failures + +## [1.1.0] - 2026-02-23 + +- Added `/update` skill to pull upstream changes from within Claude Code +- Enhanced container environment isolation via credential proxy diff --git a/CLAUDE.md b/CLAUDE.md index 010adb0..c9c49ff 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -4,14 +4,14 @@ Personal Claude assistant. See [README.md](README.md) for philosophy and setup. ## Quick Context -Single Node.js process that connects to WhatsApp, routes messages to Claude Agent SDK running in containers (Linux VMs). Each group has isolated filesystem and memory. +Single Node.js process with skill-based channel system. Channels (WhatsApp, Telegram, Slack, Discord, Gmail) are skills that self-register at startup. Messages route to Claude Agent SDK running in containers (Linux VMs). Each group has isolated filesystem and memory. 
## Key Files | File | Purpose | |------|---------| | `src/index.ts` | Orchestrator: state, message loop, agent invocation | -| `src/channels/whatsapp.ts` | WhatsApp connection, auth, send/receive | +| `src/channels/registry.ts` | Channel registry (self-registration at startup) | | `src/ipc.ts` | IPC watcher and task processing | | `src/router.ts` | Message formatting and outbound routing | | `src/config.ts` | Trigger pattern, paths, intervals | @@ -19,19 +19,35 @@ Single Node.js process that connects to WhatsApp, routes messages to Claude Agen | `src/task-scheduler.ts` | Runs scheduled tasks | | `src/db.ts` | SQLite operations | | `groups/{name}/CLAUDE.md` | Per-group memory (isolated) | -| `container/skills/agent-browser.md` | Browser automation tool (available to all agents via Bash) | +| `container/skills/` | Skills loaded inside agent containers (browser, status, formatting) | + +## Secrets / Credentials / Proxy (OneCLI) + +API keys, secret keys, OAuth tokens, and auth credentials are managed by the OneCLI gateway — which handles secret injection into containers at request time, so no keys or tokens are ever passed to containers directly. Run `onecli --help`. ## Skills +Four types of skills exist in NanoClaw. See [CONTRIBUTING.md](CONTRIBUTING.md) for the full taxonomy and guidelines. + +- **Feature skills** — merge a `skill/*` branch to add capabilities (e.g. `/add-telegram`, `/add-slack`) +- **Utility skills** — ship code files alongside SKILL.md (e.g. `/claw`) +- **Operational skills** — instruction-only workflows, always on `main` (e.g. 
`/setup`, `/debug`) +- **Container skills** — loaded inside agent containers at runtime (`container/skills/`) + | Skill | When to Use | |-------|-------------| | `/setup` | First-time installation, authentication, service configuration | | `/customize` | Adding channels, integrations, changing behavior | | `/debug` | Container issues, logs, troubleshooting | -| `/update` | Pull upstream NanoClaw changes, merge with customizations, run migrations | +| `/update-nanoclaw` | Bring upstream NanoClaw updates into a customized install | +| `/init-onecli` | Install OneCLI Agent Vault and migrate `.env` credentials to it | | `/qodo-pr-resolver` | Fetch and fix Qodo PR review issues interactively or in batch | | `/get-qodo-rules` | Load org- and repo-level coding rules from Qodo before code tasks | +## Contributing + +Before creating a PR, adding a skill, or preparing any contribution, you MUST read [CONTRIBUTING.md](CONTRIBUTING.md). It covers accepted change types, the four skill types and their guidelines, SKILL.md format rules, PR requirements, and the pre-submission checklist (searching for existing PRs/issues, testing, description format). + ## Development Run commands directly—don't tell the user to run them. @@ -55,6 +71,10 @@ systemctl --user stop nanoclaw systemctl --user restart nanoclaw ``` +## Troubleshooting + +**WhatsApp not connecting after upgrade:** WhatsApp is now a separate skill, not bundled in core. Run `/add-whatsapp` (or `npx tsx scripts/apply-skill.ts .claude/skills/add-whatsapp && npm run build`) to install it. Existing auth credentials and groups are preserved. + ## Container Build Cache The container buildkit caches the build context aggressively. `--no-cache` alone does NOT invalidate COPY steps — the builder's volume retains stale files. To force a truly clean rebuild, prune the builder then re-run `./container/build.sh`. 
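The prune-then-rebuild sequence can be sketched as follows (assuming the default Docker BuildKit runtime and a local Docker daemon; adjust for Apple Container):

```bash
# Drop all BuildKit cache, including the builder volume holding stale COPY context
docker builder prune --all --force

# Rebuild the container image from a clean slate
./container/build.sh
```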
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index dd3614d..7a7816a 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,5 +1,18 @@ # Contributing +## Before You Start + +1. **Check for existing work.** Search open PRs and issues before starting: + ```bash + gh pr list --repo qwibitai/nanoclaw --search "" + gh issue list --repo qwibitai/nanoclaw --search "" + ``` + If a related PR or issue exists, build on it rather than duplicating effort. + +2. **Check alignment.** Read the [Philosophy section in README.md](README.md#philosophy). Source code changes should only be things 90%+ of users need. Skills can be more niche, but should still be useful beyond a single person's setup. + +3. **One thing per PR.** Each PR should do one thing — one bug fix, one skill, one simplification. Don't mix unrelated changes in a single PR. + ## Source Code Changes **Accepted:** Bug fixes, security fixes, simplifications, reducing code. @@ -8,16 +21,127 @@ ## Skills -A [skill](https://code.claude.com/docs/en/skills) is a markdown file in `.claude/skills/` that teaches Claude Code how to transform a NanoClaw installation. +NanoClaw uses [Claude Code skills](https://code.claude.com/docs/en/skills) — markdown files with optional supporting files that teach Claude how to do something. There are four types of skills in NanoClaw, each serving a different purpose. -A PR that contributes a skill should not modify any source files. - -Your skill should contain the **instructions** Claude follows to add the feature—not pre-built code. See `/add-telegram` for a good example. - -### Why? +### Why skills? Every user should have clean and minimal code that does exactly what they need. Skills let users selectively add features to their fork without inheriting code for features they don't want. -### Testing +### Skill types -Test your skill by running it on a fresh clone before submitting. +#### 1. Feature skills (branch-based) + +Add capabilities to NanoClaw by merging a git branch. 
The SKILL.md contains setup instructions; the actual code lives on a `skill/*` branch. + +**Location:** `.claude/skills/` on `main` (instructions only), code on `skill/*` branch + +**Examples:** `/add-telegram`, `/add-slack`, `/add-discord`, `/add-gmail` + +**How they work:** +1. User runs `/add-telegram` +2. Claude follows the SKILL.md: fetches and merges the `skill/telegram` branch +3. Claude walks through interactive setup (env vars, bot creation, etc.) + +**Contributing a feature skill:** +1. Fork `qwibitai/nanoclaw` and branch from `main` +2. Make the code changes (new files, modified source, updated `package.json`, etc.) +3. Add a SKILL.md in `.claude/skills/<skill-name>/` with setup instructions — step 1 should be merging the branch +4. Open a PR. We'll create the `skill/<name>` branch from your work + +See `/add-telegram` for a good example. See [docs/skills-as-branches.md](docs/skills-as-branches.md) for the full system design. + +#### 2. Utility skills (with code files) + +Standalone tools that ship code files alongside the SKILL.md. The SKILL.md tells Claude how to install the tool; the code lives in the skill directory itself (e.g. in a `scripts/` subfolder). + +**Location:** `.claude/skills/<skill-name>/` with supporting files + +**Examples:** `/claw` (Python CLI in `scripts/claw`) + +**Key difference from feature skills:** No branch merge needed. The code is self-contained in the skill directory and gets copied into place during installation. + +**Guidelines:** +- Put code in separate files, not inline in the SKILL.md +- Use `${CLAUDE_SKILL_DIR}` to reference files in the skill directory +- SKILL.md contains installation instructions, usage docs, and troubleshooting + +#### 3. Operational skills (instruction-only) + +Workflows and guides with no code changes. The SKILL.md is the entire skill — Claude follows the instructions to perform a task.
+ +**Location:** `.claude/skills/` on `main` + +**Examples:** `/setup`, `/debug`, `/customize`, `/update-nanoclaw`, `/update-skills` + +**Guidelines:** +- Pure instructions — no code files, no branch merges +- Use `AskUserQuestion` for interactive prompts +- These stay on `main` and are always available to every user + +#### 4. Container skills (agent runtime) + +Skills that run inside the agent container, not on the host. These teach the container agent how to use tools, format output, or perform tasks. They are synced into each group's `.claude/skills/` directory when a container starts. + +**Location:** `container/skills/<name>/` + +**Examples:** `agent-browser` (web browsing), `capabilities` (/capabilities command), `status` (/status command), `slack-formatting` (Slack mrkdwn syntax) + +**Key difference:** These are NOT invoked by the user on the host. They're loaded by Claude Code inside the container and influence how the agent behaves. + +**Guidelines:** +- Follow the same SKILL.md + frontmatter format +- Use `allowed-tools` frontmatter to scope tool permissions +- Keep them focused — the agent's context window is shared across all container skills + +### SKILL.md format + +All skills use the [Claude Code skills standard](https://code.claude.com/docs/en/skills): + +```markdown +--- +name: my-skill +description: What this skill does and when to use it. +--- + +Instructions here... +``` + +**Rules:** +- Keep SKILL.md **under 500 lines** — move detail to separate reference files +- `name`: lowercase, alphanumeric + hyphens, max 64 chars +- `description`: required — Claude uses this to decide when to invoke the skill +- Put code in separate files, not inline in the markdown +- See the [skills standard](https://code.claude.com/docs/en/skills) for all available frontmatter fields + +## Testing + +Test your contribution on a fresh clone before submitting. For skills, run the skill end-to-end and verify it works. + +## Pull Requests + +### Before opening + +1.
**Link related issues.** If your PR resolves an open issue, include `Closes #123` in the description so it's auto-closed on merge. +2. **Test thoroughly.** Run the feature yourself. For skills, test on a fresh clone. +3. **Check the right box** in the PR template. Labels are auto-applied based on your selection: + +| Checkbox | Label | +|----------|-------| +| Feature skill | `PR: Skill` + `PR: Feature` | +| Utility skill | `PR: Skill` | +| Operational/container skill | `PR: Skill` | +| Fix | `PR: Fix` | +| Simplification | `PR: Refactor` | +| Documentation | `PR: Docs` | + +### PR description + +Keep it concise. Remove any template sections that don't apply. The description should cover: + +- **What** — what the PR adds or changes +- **Why** — the motivation +- **How it works** — brief explanation of the approach +- **How it was tested** — what you did to verify it works +- **Usage** — how the user invokes it (for skills) + +Don't pad the description. A few clear sentences are better than lengthy paragraphs. diff --git a/CONTRIBUTORS.md b/CONTRIBUTORS.md index ab32921..4038595 100644 --- a/CONTRIBUTORS.md +++ b/CONTRIBUTORS.md @@ -7,3 +7,12 @@ Thanks to everyone who has contributed to NanoClaw! - [pottertech](https://github.com/pottertech) — Skip Potter - [rgarcia](https://github.com/rgarcia) — Rafael - [AmaxGuan](https://github.com/AmaxGuan) — Lingfeng Guan +- [happydog-intj](https://github.com/happydog-intj) — happy dog +- [bindoon](https://github.com/bindoon) — 潕量 +- [taslim](https://github.com/taslim) — Taslim +- [baijunjie](https://github.com/baijunjie) — BaiJunjie +- [Michaelliv](https://github.com/Michaelliv) — Michael +- [kk17](https://github.com/kk17) — Kyle Zhike Chen +- [flobo3](https://github.com/flobo3) — Flo +- [edwinwzhe](https://github.com/edwinwzhe) — Edwin He +- [scottgl9](https://github.com/scottgl9) — Scott Glover diff --git a/README.md b/README.md index 2248556..874a8d7 100644 --- a/README.md +++ b/README.md @@ -8,14 +8,14 @@

nanoclaw.dev  •   + docs  •   中文  •   + 日本語  •   Discord  •   34.9k tokens, 17% of context window

-Using Claude Code, NanoClaw can dynamically rewrite its code to customize its feature set for your needs. - -**New:** First AI assistant to support [Agent Swarms](https://code.claude.com/docs/en/agent-teams). Spin up teams of agents that collaborate in your chat. +--- ## Why I Built NanoClaw @@ -26,13 +26,25 @@ NanoClaw provides that same core functionality, but in a codebase small enough t ## Quick Start ```bash -git clone https://github.com/qwibitai/NanoClaw.git -cd NanoClaw +gh repo fork qwibitai/nanoclaw --clone +cd nanoclaw claude ``` +
+Without GitHub CLI + +1. Fork [qwibitai/nanoclaw](https://github.com/qwibitai/nanoclaw) on GitHub (click the Fork button) +2. `git clone https://github.com/<your-username>/nanoclaw.git` +3. `cd nanoclaw` +4. `claude` + +
+ Then run `/setup`. Claude Code handles everything: dependencies, authentication, container setup and service configuration. +> **Note:** Commands prefixed with `/` (like `/setup`, `/add-whatsapp`) are [Claude Code skills](https://code.claude.com/docs/en/skills). Type them inside the `claude` CLI prompt, not in your regular terminal. If you don't have Claude Code installed, get it at [claude.com/product/claude-code](https://claude.com/product/claude-code). + ## Philosophy **Small enough to understand.** One process, a few source files and no microservices. If you want to understand the full NanoClaw codebase, just ask Claude Code to walk you through it. @@ -54,13 +66,14 @@ Then run `/setup`. Claude Code handles everything: dependencies, authentication, ## What It Supports -- **Messenger I/O** - Message NanoClaw from your phone. Supports WhatsApp, Telegram, Discord, Slack, Signal and headless operation. +- **Multi-channel messaging** - Talk to your assistant from WhatsApp, Telegram, Discord, Slack, or Gmail. Add channels with skills like `/add-whatsapp` or `/add-telegram`. Run one or many at the same time. - **Isolated group context** - Each group has its own `CLAUDE.md` memory, isolated filesystem, and runs in its own container sandbox with only that filesystem mounted to it. - **Main channel** - Your private channel (self-chat) for admin control; every group is completely isolated - **Scheduled tasks** - Recurring jobs that run Claude and can message you back - **Web access** - Search and fetch content from the Web -- **Container isolation** - Agents are sandboxed in Apple Container (macOS) or Docker (macOS/Linux) -- **Agent Swarms** - Spin up teams of specialized agents that collaborate on complex tasks. NanoClaw is the first personal AI assistant to support agent swarms. 
+- **Container isolation** - Agents are sandboxed in Docker (macOS/Linux), [Docker Sandboxes](docs/docker-sandboxes.md) (micro VM isolation), or Apple Container (macOS) +- **Credential security** - Agents never hold raw API keys. Outbound requests route through [OneCLI's Agent Vault](https://github.com/onecli/onecli), which injects credentials at request time and enforces per-agent policies and rate limits. +- **Agent Swarms** - Spin up teams of specialized agents that collaborate on complex tasks - **Optional integrations** - Add Gmail (`/add-gmail`) and more via skills ## Usage @@ -97,7 +110,7 @@ The codebase is small enough that Claude can safely modify it. **Don't add features. Add skills.** -If you want to add Telegram support, don't create a PR that adds Telegram alongside WhatsApp. Instead, contribute a skill file (`.claude/skills/add-telegram/SKILL.md`) that teaches Claude Code how to transform a NanoClaw installation to use Telegram. +If you want to add Telegram support, don't create a PR that adds Telegram to the core codebase. Instead, fork NanoClaw, make the code changes on a branch, and open a PR. We'll create a `skill/telegram` branch from your PR that other users can merge into their fork. Users then run `/add-telegram` on their fork and get clean code that does exactly what they need, not a bloated system trying to support every use case. @@ -106,14 +119,11 @@ Users then run `/add-telegram` on their fork and get clean code that does exactl Skills we'd like to see: **Communication Channels** -- `/add-slack` - Add Slack - -**Session Management** -- `/clear` - Add a `/clear` command that compacts the conversation (summarizes context while preserving critical information in the same session). Requires figuring out how to trigger compaction programmatically via the Claude Agent SDK. 
+- `/add-signal` - Add Signal as a channel ## Requirements -- macOS or Linux +- macOS, Linux, or Windows (via WSL2) - Node.js 20+ - [Claude Code](https://claude.ai/download) - [Apple Container](https://github.com/apple/container) (macOS) or [Docker](https://docker.com/products/docker-desktop) (macOS/Linux) @@ -121,14 +131,16 @@ Skills we'd like to see: ## Architecture ``` -WhatsApp (baileys) --> SQLite --> Polling loop --> Container (Claude Agent SDK) --> Response +Channels --> SQLite --> Polling loop --> Container (Claude Agent SDK) --> Response ``` -Single Node.js process. Agents execute in isolated Linux containers with filesystem isolation. Only mounted directories are accessible. Per-group message queue with concurrency control. IPC via filesystem. +Single Node.js process. Channels are added via skills and self-register at startup — the orchestrator connects whichever ones have credentials present. Agents execute in isolated Linux containers with filesystem isolation. Only mounted directories are accessible. Per-group message queue with concurrency control. IPC via filesystem. + +For the full architecture details, see the [documentation site](https://docs.nanoclaw.dev/concepts/architecture). Key files: - `src/index.ts` - Orchestrator: state, message loop, agent invocation -- `src/channels/whatsapp.ts` - WhatsApp connection, auth, send/receive +- `src/channels/registry.ts` - Channel registry (self-registration at startup) - `src/ipc.ts` - IPC watcher and task processing - `src/router.ts` - Message formatting and outbound routing - `src/group-queue.ts` - Per-group queue with global concurrency limit @@ -141,20 +153,36 @@ Key files: **Why Docker?** -Docker provides cross-platform support (macOS, Linux and even Windows via WSL2) and a mature ecosystem. On macOS, you can optionally switch to Apple Container via `/convert-to-apple-container` for a lighter-weight native runtime. 
+Docker provides cross-platform support (macOS, Linux and even Windows via WSL2) and a mature ecosystem. On macOS, you can optionally switch to Apple Container via `/convert-to-apple-container` for a lighter-weight native runtime. For additional isolation, [Docker Sandboxes](docs/docker-sandboxes.md) run each container inside a micro VM. -**Can I run this on Linux?** +**Can I run this on Linux or Windows?** -Yes. Docker is the default runtime and works on both macOS and Linux. Just run `/setup`. +Yes. Docker is the default runtime and works on macOS, Linux, and Windows (via WSL2). Just run `/setup`. **Is this secure?** -Agents run in containers, not behind application-level permission checks. They can only access explicitly mounted directories. You should still review what you're running, but the codebase is small enough that you actually can. See [docs/SECURITY.md](docs/SECURITY.md) for the full security model. +Agents run in containers, not behind application-level permission checks. They can only access explicitly mounted directories. Credentials never enter the container — outbound API requests route through [OneCLI's Agent Vault](https://github.com/onecli/onecli), which injects authentication at the proxy level and supports rate limits and access policies. You should still review what you're running, but the codebase is small enough that you actually can. See the [security documentation](https://docs.nanoclaw.dev/concepts/security) for the full security model. **Why no configuration files?** We don't want configuration sprawl. Every user should customize NanoClaw so that the code does exactly what they want, rather than configuring a generic system. If you prefer having config files, you can tell Claude to add them. +**Can I use third-party or open-source models?** + +Yes. NanoClaw supports any Claude API-compatible model endpoint. 
Set these environment variables in your `.env` file: + +```bash +ANTHROPIC_BASE_URL=https://your-api-endpoint.com +ANTHROPIC_AUTH_TOKEN=your-token-here +``` + +This allows you to use: +- Local models via [Ollama](https://ollama.ai) with an API proxy +- Open-source models hosted on [Together AI](https://together.ai), [Fireworks](https://fireworks.ai), etc. +- Custom model deployments with Anthropic-compatible APIs + +Note: The model must support the Anthropic API format for best compatibility. + **How do I debug issues?** Ask Claude Code. "Why isn't the scheduler running?" "What's in the recent logs?" "Why did this message not get a response?" That's the AI-native approach that underlies NanoClaw. @@ -175,6 +203,10 @@ This keeps the base system minimal and lets every user customize their installat Questions? Ideas? [Join the Discord](https://discord.gg/VDdww8qS42). +## Changelog + +See [CHANGELOG.md](CHANGELOG.md) for breaking changes, or the [full release history](https://docs.nanoclaw.dev/changelog) on the documentation site. + ## License MIT diff --git a/README_ja.md b/README_ja.md new file mode 100644 index 0000000..5c3f648 --- /dev/null +++ b/README_ja.md @@ -0,0 +1,232 @@ +

+ NanoClaw +

+ +

+ エージェントを専用コンテナで安全に実行するAIアシスタント。軽量で、理解しやすく、あなたのニーズに完全にカスタマイズできるように設計されています。 +

+ +

+ nanoclaw.dev  •   + English  •   + 中文  •   + Discord  •   + 34.9k tokens, 17% of context window +

+ +--- + +

🐳 Dockerサンドボックスで動作

+

各エージェントはマイクロVM内の独立したコンテナで実行されます。
ハイパーバイザーレベルの分離。ミリ秒で起動。複雑なセットアップ不要。

+ +**macOS (Apple Silicon)** +```bash +curl -fsSL https://nanoclaw.dev/install-docker-sandboxes.sh | bash +``` + +**Windows (WSL)** +```bash +curl -fsSL https://nanoclaw.dev/install-docker-sandboxes-windows.sh | bash +``` + +> 現在、macOS(Apple Silicon)とWindows(x86)に対応しています。Linux対応は近日公開予定。 + +

発表記事を読む →  ·  手動セットアップガイド →

+ +--- + +## NanoClawを作った理由 + +[OpenClaw](https://github.com/openclaw/openclaw)は素晴らしいプロジェクトですが、理解しきれない複雑なソフトウェアに自分の生活へのフルアクセスを与えたまま安心して眠れるとは思えませんでした。OpenClawは約50万行のコード、53の設定ファイル、70以上の依存関係を持っています。セキュリティはアプリケーションレベル(許可リスト、ペアリングコード)であり、真のOS レベルの分離ではありません。すべてが共有メモリを持つ1つのNodeプロセスで動作します。 + +NanoClawは同じコア機能を提供しますが、理解できる規模のコードベースで実現しています:1つのプロセスと少数のファイル。Claudeエージェントは単なるパーミッションチェックの背後ではなく、ファイルシステム分離された独自のLinuxコンテナで実行されます。 + +## クイックスタート + +```bash +gh repo fork qwibitai/nanoclaw --clone +cd nanoclaw +claude +``` + +
+GitHub CLIなしの場合 + +1. GitHub上で[qwibitai/nanoclaw](https://github.com/qwibitai/nanoclaw)をフォーク(Forkボタンをクリック) +2. `git clone https://github.com/<あなたのユーザー名>/nanoclaw.git` +3. `cd nanoclaw` +4. `claude` + +
+ +その後、`/setup`を実行します。Claude Codeがすべてを処理します:依存関係、認証、コンテナセットアップ、サービス設定。 + +> **注意:** `/`で始まるコマンド(`/setup`、`/add-whatsapp`など)は[Claude Codeスキル](https://code.claude.com/docs/en/skills)です。通常のターミナルではなく、`claude` CLIプロンプト内で入力してください。Claude Codeをインストールしていない場合は、[claude.com/product/claude-code](https://claude.com/product/claude-code)から入手してください。 + +## 設計思想 + +**理解できる規模。** 1つのプロセス、少数のソースファイル、マイクロサービスなし。NanoClawのコードベース全体を理解したい場合は、Claude Codeに説明を求めるだけです。 + +**分離によるセキュリティ。** エージェントはLinuxコンテナ(macOSではApple Container、またはDocker)で実行され、明示的にマウントされたものだけが見えます。コマンドはホストではなくコンテナ内で実行されるため、Bashアクセスは安全です。 + +**個人ユーザー向け。** NanoClawはモノリシックなフレームワークではなく、各ユーザーのニーズに正確にフィットするソフトウェアです。肥大化するのではなく、オーダーメイドになるよう設計されています。自分のフォークを作成し、Claude Codeにニーズに合わせて変更させます。 + +**カスタマイズ=コード変更。** 設定ファイルの肥大化なし。動作を変えたい?コードを変更するだけ。コードベースは変更しても安全な規模です。 + +**AIネイティブ。** +- インストールウィザードなし — Claude Codeがセットアップを案内。 +- モニタリングダッシュボードなし — Claudeに状況を聞くだけ。 +- デバッグツールなし — 問題を説明すればClaudeが修正。 + +**機能追加ではなくスキル。** コードベースに機能(例:Telegram対応)を追加する代わりに、コントリビューターは`/add-telegram`のような[Claude Codeスキル](https://code.claude.com/docs/en/skills)を提出し、あなたのフォークを変換します。あなたが必要なものだけを正確に実行するクリーンなコードが手に入ります。 + +**最高のハーネス、最高のモデル。** NanoClawはClaude Agent SDK上で動作します。つまり、Claude Codeを直接実行しているということです。Claude Codeは高い能力を持ち、そのコーディングと問題解決能力によってNanoClawを変更・拡張し、各ユーザーに合わせてカスタマイズできます。 + +## サポート機能 + +- **マルチチャネルメッセージング** - WhatsApp、Telegram、Discord、Slack、Gmailからアシスタントと会話。`/add-whatsapp`や`/add-telegram`などのスキルでチャネルを追加。1つでも複数でも同時に実行可能。 +- **グループごとの分離コンテキスト** - 各グループは独自の`CLAUDE.md`メモリ、分離されたファイルシステムを持ち、そのファイルシステムのみがマウントされた専用コンテナサンドボックスで実行。 +- **メインチャネル** - 管理制御用のプライベートチャネル(セルフチャット)。各グループは完全に分離。 +- **スケジュールタスク** - Claudeを実行し、メッセージを返せる定期ジョブ。 +- **Webアクセス** - Webからのコンテンツ検索・取得。 +- **コンテナ分離** - エージェントは[Dockerサンドボックス](https://nanoclaw.dev/blog/nanoclaw-docker-sandboxes)(マイクロVM分離)、Apple Container(macOS)、またはDocker(macOS/Linux)でサンドボックス化。 +- **エージェントスウォーム** - 複雑なタスクで協力する専門エージェントチームを起動。 +- **オプション連携** - Gmail(`/add-gmail`)などをスキルで追加。 + +## 使い方 + +トリガーワード(デフォルト:`@Andy`)でアシスタントに話しかけます: + +``` +@Andy 
毎朝9時に営業パイプラインの概要を送って(Obsidian vaultフォルダにアクセス可能) +@Andy 毎週金曜に過去1週間のgit履歴をレビューして、差異があればREADMEを更新して +@Andy 毎週月曜の朝8時に、Hacker NewsとTechCrunchからAI関連のニュースをまとめてブリーフィングを送って +``` + +メインチャネル(セルフチャット)から、グループやタスクを管理できます: +``` +@Andy 全グループのスケジュールタスクを一覧表示して +@Andy 月曜のブリーフィングタスクを一時停止して +@Andy Family Chatグループに参加して +``` + +## カスタマイズ + +NanoClawは設定ファイルを使いません。変更するには、Claude Codeに伝えるだけです: + +- 「トリガーワードを@Bobに変更して」 +- 「今後はレスポンスをもっと短く直接的にして」 +- 「おはようと言ったらカスタム挨拶を追加して」 +- 「会話の要約を毎週保存して」 + +または`/customize`を実行してガイド付きの変更を行えます。 + +コードベースは十分に小さいため、Claudeが安全に変更できます。 + +## コントリビューション + +**機能を追加するのではなく、スキルを追加してください。** + +Telegram対応を追加したい場合、コアコードベースにTelegramを追加するPRを作成しないでください。代わりに、NanoClawをフォークし、ブランチでコード変更を行い、PRを開いてください。あなたのPRから`skill/telegram`ブランチを作成し、他のユーザーが自分のフォークにマージできるようにします。 + +ユーザーは自分のフォークで`/add-telegram`を実行するだけで、あらゆるユースケースに対応しようとする肥大化したシステムではなく、必要なものだけを正確に実行するクリーンなコードが手に入ります。 + +### RFS(スキル募集) + +私たちが求めているスキル: + +**コミュニケーションチャネル** +- `/add-signal` - Signalをチャネルとして追加 + +**セッション管理** +- `/clear` - 会話をコンパクト化する`/clear`コマンドの追加(同一セッション内で重要な情報を保持しながらコンテキストを要約)。Claude Agent SDKを通じてプログラム的にコンパクト化をトリガーする方法の解明が必要。 + +## 必要条件 + +- macOSまたはLinux +- Node.js 20以上 +- [Claude Code](https://claude.ai/download) +- [Apple Container](https://github.com/apple/container)(macOS)または[Docker](https://docker.com/products/docker-desktop)(macOS/Linux) + +## アーキテクチャ + +``` +チャネル --> SQLite --> ポーリングループ --> コンテナ(Claude Agent SDK) --> レスポンス +``` + +単一のNode.jsプロセス。チャネルはスキルで追加され、起動時に自己登録します — オーケストレーターは認証情報が存在するチャネルを接続します。エージェントはファイルシステム分離された独立したLinuxコンテナで実行されます。マウントされたディレクトリのみアクセス可能。グループごとのメッセージキューと同時実行制御。ファイルシステム経由のIPC。 + +詳細なアーキテクチャについては、[docs/SPEC.md](docs/SPEC.md)を参照してください。 + +主要ファイル: +- `src/index.ts` - オーケストレーター:状態、メッセージループ、エージェント呼び出し +- `src/channels/registry.ts` - チャネルレジストリ(起動時の自己登録) +- `src/ipc.ts` - IPCウォッチャーとタスク処理 +- `src/router.ts` - メッセージフォーマットとアウトバウンドルーティング +- `src/group-queue.ts` - グローバル同時実行制限付きのグループごとのキュー +- `src/container-runner.ts` - ストリーミングエージェントコンテナの起動 +- `src/task-scheduler.ts` - スケジュールタスクの実行 +- `src/db.ts` 
- SQLite操作(メッセージ、グループ、セッション、状態) +- `groups/*/CLAUDE.md` - グループごとのメモリ + +## FAQ + +**なぜDockerなのか?** + +Dockerはクロスプラットフォーム対応(macOS、Linux、さらにWSL2経由のWindows)と成熟したエコシステムを提供します。macOSでは、`/convert-to-apple-container`でオプションとしてApple Containerに切り替え、より軽量なネイティブランタイムを使用できます。 + +**Linuxで実行できますか?** + +はい。DockerがデフォルトのランタイムでmacOSとLinuxの両方で動作します。`/setup`を実行するだけです。 + +**セキュリティは大丈夫ですか?** + +エージェントはアプリケーションレベルのパーミッションチェックの背後ではなく、コンテナで実行されます。明示的にマウントされたディレクトリのみアクセスできます。実行するものをレビューすべきですが、コードベースは十分に小さいため実際にレビュー可能です。完全なセキュリティモデルについては[docs/SECURITY.md](docs/SECURITY.md)を参照してください。 + +**なぜ設定ファイルがないのか?** + +設定の肥大化を避けたいからです。すべてのユーザーがNanoClawをカスタマイズし、汎用的なシステムを設定するのではなく、コードが必要なことを正確に実行するようにすべきです。設定ファイルが欲しい場合は、Claudeに追加するよう伝えることができます。 + +**サードパーティやオープンソースモデルを使えますか?** + +はい。NanoClawはClaude API互換のモデルエンドポイントに対応しています。`.env`ファイルで以下の環境変数を設定してください: + +```bash +ANTHROPIC_BASE_URL=https://your-api-endpoint.com +ANTHROPIC_AUTH_TOKEN=your-token-here +``` + +以下が使用可能です: +- [Ollama](https://ollama.ai)とAPIプロキシ経由のローカルモデル +- [Together AI](https://together.ai)、[Fireworks](https://fireworks.ai)等でホストされたオープンソースモデル +- Anthropic互換APIのカスタムモデルデプロイメント + +注意:最高の互換性のため、モデルはAnthropic APIフォーマットに対応している必要があります。 + +**問題のデバッグ方法は?** + +Claude Codeに聞いてください。「スケジューラーが動いていないのはなぜ?」「最近のログには何がある?」「このメッセージに返信がなかったのはなぜ?」これがNanoClawの基盤となるAIネイティブなアプローチです。 + +**セットアップがうまくいかない場合は?** + +問題がある場合、セットアップ中にClaudeが動的に修正を試みます。それでもうまくいかない場合は、`claude`を実行してから`/debug`を実行してください。Claudeが他のユーザーにも影響する可能性のある問題を見つけた場合は、セットアップのSKILL.mdを修正するPRを開いてください。 + +**どのような変更がコードベースに受け入れられますか?** + +セキュリティ修正、バグ修正、明確な改善のみが基本設定に受け入れられます。それだけです。 + +それ以外のすべて(新機能、OS互換性、ハードウェアサポート、機能拡張)はスキルとしてコントリビューションすべきです。 + +これにより、基本システムを最小限に保ち、すべてのユーザーが不要な機能を継承することなく、自分のインストールをカスタマイズできます。 + +## コミュニティ + +質問やアイデアは?[Discordに参加](https://discord.gg/VDdww8qS42)してください。 + +## 変更履歴 + +破壊的変更と移行ノートについては[CHANGELOG.md](CHANGELOG.md)を参照してください。 + +## ライセンス + +MIT diff --git a/README_zh.md b/README_zh.md index a0e7115..714bd87 100644 --- a/README_zh.md +++ b/README_zh.md @@ -9,17 +9,19 @@

nanoclaw.dev  •   English  •   + 日本語  •   Discord  •   34.9k tokens, 17% of context window

+通过 Claude Code,NanoClaw 可以动态重写自身代码,根据您的需求定制功能。 **新功能:** 首个支持 [Agent Swarms(智能体集群)](https://code.claude.com/docs/en/agent-teams) 的 AI 助手。可轻松组建智能体团队,在您的聊天中高效协作。 ## 我为什么创建这个项目 -[OpenClaw](https://github.com/openclaw/openclaw) 是一个令人印象深刻的项目,愿景宏大。但我无法安心使用一个我不了解却能访问我个人隐私的软件。OpenClaw 有 52+ 个模块、8 个配置管理文件、45+ 个依赖项,以及为 15 个渠道提供商设计的抽象层。其安全性是应用级别的(通过白名单、配对码实现),而非操作系统级别的隔离。所有东西都在一个共享内存的 Node 进程中运行。 +[OpenClaw](https://github.com/openclaw/openclaw) 是一个令人印象深刻的项目,但我无法安心使用一个我不了解却能访问我个人隐私的软件。OpenClaw 有近 50 万行代码、53 个配置文件和 70+ 个依赖项。其安全性是应用级别的(通过白名单、配对码实现),而非操作系统级别的隔离。所有东西都在一个共享内存的 Node 进程中运行。 -NanoClaw 用一个您能在 8 分钟内理解的代码库,为您提供了同样的核心功能。只有一个进程,少数几个文件。智能体(Agent)运行在具有文件系统隔离的真实 Linux 容器中,而不是依赖于权限检查。 +NanoClaw 用一个您能快速理解的代码库,为您提供了同样的核心功能。只有一个进程,少数几个文件。智能体(Agent)运行在具有文件系统隔离的真实 Linux 容器中,而不是依赖于权限检查。 ## 快速开始 @@ -31,25 +33,27 @@ claude 然后运行 `/setup`。Claude Code 会处理一切:依赖安装、身份验证、容器设置、服务配置。 +> **注意:** 以 `/` 开头的命令(如 `/setup`、`/add-whatsapp`)是 [Claude Code 技能](https://code.claude.com/docs/en/skills)。请在 `claude` CLI 提示符中输入,而非在普通终端中。 + ## 设计哲学 **小巧易懂:** 单一进程,少量源文件。无微服务、无消息队列、无复杂抽象层。让 Claude Code 引导您轻松上手。 **通过隔离保障安全:** 智能体运行在 Linux 容器(在 macOS 上是 Apple Container,或 Docker)中。它们只能看到被明确挂载的内容。即便通过 Bash 访问也十分安全,因为所有命令都在容器内执行,不会直接操作您的宿主机。 -**为单一用户打造:** 这不是一个框架,是一个完全符合我个人需求的、可工作的软件。您可以 Fork 本项目,然后让 Claude Code 根据您的精确需求进行修改和适配。 +**为单一用户打造:** 这不是一个框架,是一个完全符合您个人需求的、可工作的软件。您可以 Fork 本项目,然后让 Claude Code 根据您的精确需求进行修改和适配。 **定制即代码修改:** 没有繁杂的配置文件。想要不同的行为?直接修改代码。代码库足够小,这样做是安全的。 -**AI 原生:** 无安装向导(由 Claude Code 指导安装)。无需监控仪表盘,直接询问 Claude 即可了解系统状况。无调试工具(描述问题,Claude 会修复它)。 +**AI 原生:** 无安装向导(由 Claude Code 指导安装)。无需监控仪表盘,直接询问 Claude 即可了解系统状况。无调试工具(描述问题,Claude 会修复它)。 **技能(Skills)优于功能(Features):** 贡献者不应该向代码库添加新功能(例如支持 Telegram)。相反,他们应该贡献像 `/add-telegram` 这样的 [Claude Code 技能](https://code.claude.com/docs/en/skills),这些技能可以改造您的 fork。最终,您得到的是只做您需要事情的整洁代码。 -**最好的工具套件,最好的模型:** 本项目运行在 Claude Agent SDK 之上,这意味着您直接运行的就是 Claude Code。工具套件至关重要。一个低效的工具套件会让再聪明的模型也显得迟钝,而一个优秀的套件则能赋予它们超凡的能力。Claude Code (在我看来) 是市面上最好的工具套件。 +**最好的工具套件,最好的模型:** 本项目运行在 
Claude Agent SDK 之上,这意味着您直接运行的就是 Claude Code。Claude Code 高度强大,其编码和问题解决能力使其能够修改和扩展 NanoClaw,为每个用户量身定制。 ## 功能支持 -- **WhatsApp 输入/输出** - 通过手机给 Claude 发消息 +- **多渠道消息** - 通过 WhatsApp、Telegram、Discord、Slack 或 Gmail 与您的助手对话。使用 `/add-whatsapp` 或 `/add-telegram` 等技能添加渠道,可同时运行一个或多个。 - **隔离的群组上下文** - 每个群组都拥有独立的 `CLAUDE.md` 记忆和隔离的文件系统。它们在各自的容器沙箱中运行,且仅挂载所需的文件系统。 - **主频道** - 您的私有频道(self-chat),用于管理控制;其他所有群组都完全隔离 - **计划任务** - 运行 Claude 的周期性作业,并可以给您回发消息 @@ -101,15 +105,10 @@ claude 我们希望看到的技能: **通信渠道** -- `/add-telegram` - 添加 Telegram 作为渠道。应提供选项让用户选择替换 WhatsApp 或作为额外渠道添加。也应能将其添加为控制渠道(可以触发动作)或仅作为被其他地方触发的动作所使用的渠道。 -- `/add-slack` - 添加 Slack -- `/add-discord` - 添加 Discord - -**平台支持** -- `/setup-windows` - 通过 WSL2 + Docker 支持 Windows +- `/add-signal` - 添加 Signal 作为渠道 **会话管理** -- `/add-clear` - 添加一个 `/clear` 命令,用于压缩会话(在同一会话中总结上下文,同时保留关键信息)。这需要研究如何通过 Claude Agent SDK 以编程方式触发压缩。 +- `/clear` - 添加一个 `/clear` 命令,用于压缩会话(在同一会话中总结上下文,同时保留关键信息)。这需要研究如何通过 Claude Agent SDK 以编程方式触发压缩。 ## 系统要求 @@ -121,17 +120,19 @@ claude ## 架构 ``` -WhatsApp (baileys) --> SQLite --> 轮询循环 --> 容器 (Claude Agent SDK) --> 响应 +渠道 --> SQLite --> 轮询循环 --> 容器 (Claude Agent SDK) --> 响应 ``` -单一 Node.js 进程。智能体在具有挂载目录的隔离 Linux 容器中执行。每个群组的消息队列都带有全局并发控制。通过文件系统进行进程间通信(IPC)。 +单一 Node.js 进程。渠道通过技能添加,启动时自注册 — 编排器连接具有凭据的渠道。智能体在具有文件系统隔离的 Linux 容器中执行。每个群组的消息队列带有并发控制。通过文件系统进行 IPC。 + +完整架构详情请见 [docs/SPEC.md](docs/SPEC.md)。 关键文件: - `src/index.ts` - 编排器:状态管理、消息循环、智能体调用 -- `src/channels/whatsapp.ts` - WhatsApp 连接、认证、收发消息 +- `src/channels/registry.ts` - 渠道注册表(启动时自注册) - `src/ipc.ts` - IPC 监听与任务处理 - `src/router.ts` - 消息格式化与出站路由 -- `src/group-queue.ts` - 各带全局并发限制的群组队列 +- `src/group-queue.ts` - 带全局并发限制的群组队列 - `src/container-runner.ts` - 生成流式智能体容器 - `src/task-scheduler.ts` - 运行计划任务 - `src/db.ts` - SQLite 操作(消息、群组、会话、状态) @@ -139,10 +140,6 @@ WhatsApp (baileys) --> SQLite --> 轮询循环 --> 容器 (Claude Agent SDK) --> ## FAQ -**为什么是 WhatsApp 而不是 Telegram/Signal 等?** - -因为我用 WhatsApp。Fork 这个项目然后运行一个技能来改变它。正是这个项目的核心理念。 - **为什么是 Docker?** Docker 提供跨平台支持(macOS 和 
Linux)和成熟的生态系统。在 macOS 上,您可以选择通过运行 `/convert-to-apple-container` 切换到 Apple Container,以获得更轻量级的原生运行时体验。 @@ -159,13 +156,29 @@ Docker 提供跨平台支持(macOS 和 Linux)和成熟的生态系统。在 我们不希望配置泛滥。每个用户都应该定制它,让代码完全符合他们的需求,而不是去配置一个通用的系统。如果您喜欢用配置文件,告诉 Claude 让它加上。 +**我可以使用第三方或开源模型吗?** + +可以。NanoClaw 支持任何 API 兼容的模型端点。在 `.env` 文件中设置以下环境变量: + +```bash +ANTHROPIC_BASE_URL=https://your-api-endpoint.com +ANTHROPIC_AUTH_TOKEN=your-token-here +``` + +这使您能够使用: +- 通过 [Ollama](https://ollama.ai) 配合 API 代理运行的本地模型 +- 托管在 [Together AI](https://together.ai)、[Fireworks](https://fireworks.ai) 等平台上的开源模型 +- 兼容 Anthropic API 格式的自定义模型部署 + +注意:为获得最佳兼容性,模型需支持 Anthropic API 格式。 + **我该如何调试问题?** 问 Claude Code。"为什么计划任务没有运行?" "最近的日志里有什么?" "为什么这条消息没有得到回应?" 这就是 AI 原生的方法。 **为什么我的安装不成功?** -我不知道。运行 `claude`,然后运行 `/debug`。如果 Claude 发现一个可能影响其他用户的问题,请开一个 PR 来修改 `SKILL.md` 安装文件。 +如果遇到问题,安装过程中 Claude 会尝试动态修复。如果问题仍然存在,运行 `claude`,然后运行 `/debug`。如果 Claude 发现一个可能影响其他用户的问题,请开一个 PR 来修改 setup SKILL.md。 **什么样的代码更改会被接受?** @@ -179,6 +192,10 @@ Docker 提供跨平台支持(macOS 和 Linux)和成熟的生态系统。在 有任何疑问或建议?欢迎[加入 Discord 社区](https://discord.gg/VDdww8qS42)与我们交流。 +## 更新日志 + +破坏性变更和迁移说明请见 [CHANGELOG.md](CHANGELOG.md)。 + ## 许可证 MIT diff --git a/container/Dockerfile b/container/Dockerfile index 58e1acd..e8537c3 100644 --- a/container/Dockerfile +++ b/container/Dockerfile @@ -7,6 +7,7 @@ FROM node:22-slim RUN apt-get update && apt-get install -y \ chromium \ fonts-liberation \ + fonts-noto-cjk \ fonts-noto-color-emoji \ libgbm1 \ libnss3 \ @@ -51,7 +52,8 @@ RUN npm run build RUN mkdir -p /workspace/group /workspace/global /workspace/extra /workspace/ipc/messages /workspace/ipc/tasks /workspace/ipc/input # Create entrypoint script -# Secrets are passed via stdin JSON — temp file is deleted immediately after Node reads it +# Container input (prompt, group info) is passed via stdin JSON. +# Credentials are injected by the host's credential proxy — never passed here. 
# Follow-up messages arrive via IPC files in /workspace/ipc/input/ RUN printf '#!/bin/bash\nset -e\ncd /app && npx tsc --outDir /tmp/dist 2>&1 >&2\nln -s /app/node_modules /tmp/dist/node_modules\nchmod -R a-w /tmp/dist\ncat > /tmp/input.json\nnode /tmp/dist/index.js < /tmp/input.json\n' > /app/entrypoint.sh && chmod +x /app/entrypoint.sh diff --git a/container/agent-runner/package-lock.json b/container/agent-runner/package-lock.json index cbd51ca..9ae119b 100644 --- a/container/agent-runner/package-lock.json +++ b/container/agent-runner/package-lock.json @@ -8,7 +8,7 @@ "name": "nanoclaw-agent-runner", "version": "1.0.0", "dependencies": { - "@anthropic-ai/claude-agent-sdk": "^0.2.34", + "@anthropic-ai/claude-agent-sdk": "^0.2.76", "@modelcontextprotocol/sdk": "^1.12.1", "cron-parser": "^5.0.0", "zod": "^4.0.0" @@ -19,22 +19,23 @@ } }, "node_modules/@anthropic-ai/claude-agent-sdk": { - "version": "0.2.34", - "resolved": "https://registry.npmjs.org/@anthropic-ai/claude-agent-sdk/-/claude-agent-sdk-0.2.34.tgz", - "integrity": "sha512-QLHd3Nt7bGU7/YH71fXFaztM9fNxGGruzTMrTYJkbm5gYJl5ZyU2zGyoE5VpWC0e1QU0yYdNdBVgqSYDcJGufg==", + "version": "0.2.76", + "resolved": "https://registry.npmjs.org/@anthropic-ai/claude-agent-sdk/-/claude-agent-sdk-0.2.76.tgz", + "integrity": "sha512-HZxvnT8ZWkzCnQygaYCA0dl8RSUzuVbxE1YG4ecy6vh4nQbTT36CxUxBy+QVdR12pPQluncC0mCOLhI2918Eaw==", "license": "SEE LICENSE IN README.md", "engines": { "node": ">=18.0.0" }, "optionalDependencies": { - "@img/sharp-darwin-arm64": "^0.33.5", - "@img/sharp-darwin-x64": "^0.33.5", - "@img/sharp-linux-arm": "^0.33.5", - "@img/sharp-linux-arm64": "^0.33.5", - "@img/sharp-linux-x64": "^0.33.5", - "@img/sharp-linuxmusl-arm64": "^0.33.5", - "@img/sharp-linuxmusl-x64": "^0.33.5", - "@img/sharp-win32-x64": "^0.33.5" + "@img/sharp-darwin-arm64": "^0.34.2", + "@img/sharp-darwin-x64": "^0.34.2", + "@img/sharp-linux-arm": "^0.34.2", + "@img/sharp-linux-arm64": "^0.34.2", + "@img/sharp-linux-x64": "^0.34.2", + 
"@img/sharp-linuxmusl-arm64": "^0.34.2", + "@img/sharp-linuxmusl-x64": "^0.34.2", + "@img/sharp-win32-arm64": "^0.34.2", + "@img/sharp-win32-x64": "^0.34.2" }, "peerDependencies": { "zod": "^4.0.0" @@ -53,9 +54,9 @@ } }, "node_modules/@img/sharp-darwin-arm64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-darwin-arm64/-/sharp-darwin-arm64-0.33.5.tgz", - "integrity": "sha512-UT4p+iz/2H4twwAoLCqfA9UH5pI6DggwKEGuaPy7nCVQ8ZsiY5PIcrRvD1DzuY3qYL07NtIQcWnBSY/heikIFQ==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-darwin-arm64/-/sharp-darwin-arm64-0.34.5.tgz", + "integrity": "sha512-imtQ3WMJXbMY4fxb/Ndp6HBTNVtWCUI0WdobyheGf5+ad6xX8VIDO8u2xE4qc/fr08CKG/7dDseFtn6M6g/r3w==", "cpu": [ "arm64" ], @@ -71,13 +72,13 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-darwin-arm64": "1.0.4" + "@img/sharp-libvips-darwin-arm64": "1.2.4" } }, "node_modules/@img/sharp-darwin-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-darwin-x64/-/sharp-darwin-x64-0.33.5.tgz", - "integrity": "sha512-fyHac4jIc1ANYGRDxtiqelIbdWkIuQaI84Mv45KvGRRxSAa7o7d1ZKAOBaYbnepLC1WqxfpimdeWfvqqSGwR2Q==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-darwin-x64/-/sharp-darwin-x64-0.34.5.tgz", + "integrity": "sha512-YNEFAF/4KQ/PeW0N+r+aVVsoIY0/qxxikF2SWdp+NRkmMB7y9LBZAVqQ4yhGCm/H3H270OSykqmQMKLBhBJDEw==", "cpu": [ "x64" ], @@ -93,13 +94,13 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-darwin-x64": "1.0.4" + "@img/sharp-libvips-darwin-x64": "1.2.4" } }, "node_modules/@img/sharp-libvips-darwin-arm64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-arm64/-/sharp-libvips-darwin-arm64-1.0.4.tgz", - "integrity": "sha512-XblONe153h0O2zuFfTAbQYAX2JhYmDHeWikp1LM9Hul9gVPjFY427k6dFEcOL72O01QxQsWi761svJ/ev9xEDg==", + "version": "1.2.4", + "resolved": 
"https://registry.npmjs.org/@img/sharp-libvips-darwin-arm64/-/sharp-libvips-darwin-arm64-1.2.4.tgz", + "integrity": "sha512-zqjjo7RatFfFoP0MkQ51jfuFZBnVE2pRiaydKJ1G/rHZvnsrHAOcQALIi9sA5co5xenQdTugCvtb1cuf78Vf4g==", "cpu": [ "arm64" ], @@ -113,9 +114,9 @@ } }, "node_modules/@img/sharp-libvips-darwin-x64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-x64/-/sharp-libvips-darwin-x64-1.0.4.tgz", - "integrity": "sha512-xnGR8YuZYfJGmWPvmlunFaWJsb9T/AO2ykoP3Fz/0X5XV2aoYBPkX6xqCQvUTKKiLddarLaxpzNe+b1hjeWHAQ==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-x64/-/sharp-libvips-darwin-x64-1.2.4.tgz", + "integrity": "sha512-1IOd5xfVhlGwX+zXv2N93k0yMONvUlANylbJw1eTah8K/Jtpi15KC+WSiaX/nBmbm2HxRM1gZ0nSdjSsrZbGKg==", "cpu": [ "x64" ], @@ -129,9 +130,9 @@ } }, "node_modules/@img/sharp-libvips-linux-arm": { - "version": "1.0.5", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm/-/sharp-libvips-linux-arm-1.0.5.tgz", - "integrity": "sha512-gvcC4ACAOPRNATg/ov8/MnbxFDJqf/pDePbBnuBDcjsI8PssmjoKMAz4LtLaVi+OnSb5FK/yIOamqDwGmXW32g==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm/-/sharp-libvips-linux-arm-1.2.4.tgz", + "integrity": "sha512-bFI7xcKFELdiNCVov8e44Ia4u2byA+l3XtsAj+Q8tfCwO6BQ8iDojYdvoPMqsKDkuoOo+X6HZA0s0q11ANMQ8A==", "cpu": [ "arm" ], @@ -145,9 +146,9 @@ } }, "node_modules/@img/sharp-libvips-linux-arm64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm64/-/sharp-libvips-linux-arm64-1.0.4.tgz", - "integrity": "sha512-9B+taZ8DlyyqzZQnoeIvDVR/2F4EbMepXMc/NdVbkzsJbzkUjhXv/70GQJ7tdLA4YJgNP25zukcxpX2/SueNrA==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm64/-/sharp-libvips-linux-arm64-1.2.4.tgz", + "integrity": "sha512-excjX8DfsIcJ10x1Kzr4RcWe1edC9PquDRRPx3YVCvQv+U5p7Yin2s32ftzikXojb1PIFc/9Mt28/y+iRklkrw==", "cpu": [ "arm64" ], @@ 
-161,9 +162,9 @@ } }, "node_modules/@img/sharp-libvips-linux-x64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-x64/-/sharp-libvips-linux-x64-1.0.4.tgz", - "integrity": "sha512-MmWmQ3iPFZr0Iev+BAgVMb3ZyC4KeFc3jFxnNbEPas60e1cIfevbtuyf9nDGIzOaW9PdnDciJm+wFFaTlj5xYw==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-x64/-/sharp-libvips-linux-x64-1.2.4.tgz", + "integrity": "sha512-tJxiiLsmHc9Ax1bz3oaOYBURTXGIRDODBqhveVHonrHJ9/+k89qbLl0bcJns+e4t4rvaNBxaEZsFtSfAdquPrw==", "cpu": [ "x64" ], @@ -177,9 +178,9 @@ } }, "node_modules/@img/sharp-libvips-linuxmusl-arm64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-arm64/-/sharp-libvips-linuxmusl-arm64-1.0.4.tgz", - "integrity": "sha512-9Ti+BbTYDcsbp4wfYib8Ctm1ilkugkA/uscUn6UXK1ldpC1JjiXbLfFZtRlBhjPZ5o1NCLiDbg8fhUPKStHoTA==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-arm64/-/sharp-libvips-linuxmusl-arm64-1.2.4.tgz", + "integrity": "sha512-FVQHuwx1IIuNow9QAbYUzJ+En8KcVm9Lk5+uGUQJHaZmMECZmOlix9HnH7n1TRkXMS0pGxIJokIVB9SuqZGGXw==", "cpu": [ "arm64" ], @@ -193,9 +194,9 @@ } }, "node_modules/@img/sharp-libvips-linuxmusl-x64": { - "version": "1.0.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-x64/-/sharp-libvips-linuxmusl-x64-1.0.4.tgz", - "integrity": "sha512-viYN1KX9m+/hGkJtvYYp+CCLgnJXwiQB39damAO7WMdKWlIhmYTfHjwSbQeUK/20vY154mwezd9HflVFM1wVSw==", + "version": "1.2.4", + "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-x64/-/sharp-libvips-linuxmusl-x64-1.2.4.tgz", + "integrity": "sha512-+LpyBk7L44ZIXwz/VYfglaX/okxezESc6UxDSoyo2Ks6Jxc4Y7sGjpgU9s4PMgqgjj1gZCylTieNamqA1MF7Dg==", "cpu": [ "x64" ], @@ -209,9 +210,9 @@ } }, "node_modules/@img/sharp-linux-arm": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm/-/sharp-linux-arm-0.33.5.tgz", - "integrity": 
"sha512-JTS1eldqZbJxjvKaAkxhZmBqPRGmxgu+qFKSInv8moZ2AmT5Yib3EQ1c6gp493HvrvV8QgdOXdyaIBrhvFhBMQ==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm/-/sharp-linux-arm-0.34.5.tgz", + "integrity": "sha512-9dLqsvwtg1uuXBGZKsxem9595+ujv0sJ6Vi8wcTANSFpwV/GONat5eCkzQo/1O6zRIkh0m/8+5BjrRr7jDUSZw==", "cpu": [ "arm" ], @@ -227,13 +228,13 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-linux-arm": "1.0.5" + "@img/sharp-libvips-linux-arm": "1.2.4" } }, "node_modules/@img/sharp-linux-arm64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm64/-/sharp-linux-arm64-0.33.5.tgz", - "integrity": "sha512-JMVv+AMRyGOHtO1RFBiJy/MBsgz0x4AWrT6QoEVVTyh1E39TrCUpTRI7mx9VksGX4awWASxqCYLCV4wBZHAYxA==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm64/-/sharp-linux-arm64-0.34.5.tgz", + "integrity": "sha512-bKQzaJRY/bkPOXyKx5EVup7qkaojECG6NLYswgktOZjaXecSAeCWiZwwiFf3/Y+O1HrauiE3FVsGxFg8c24rZg==", "cpu": [ "arm64" ], @@ -249,13 +250,13 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-linux-arm64": "1.0.4" + "@img/sharp-libvips-linux-arm64": "1.2.4" } }, "node_modules/@img/sharp-linux-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-x64/-/sharp-linux-x64-0.33.5.tgz", - "integrity": "sha512-opC+Ok5pRNAzuvq1AG0ar+1owsu842/Ab+4qvU879ippJBHvyY5n2mxF1izXqkPYlGuP/M556uh53jRLJmzTWA==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linux-x64/-/sharp-linux-x64-0.34.5.tgz", + "integrity": "sha512-MEzd8HPKxVxVenwAa+JRPwEC7QFjoPWuS5NZnBt6B3pu7EG2Ge0id1oLHZpPJdn3OQK+BQDiw9zStiHBTJQQQQ==", "cpu": [ "x64" ], @@ -271,13 +272,13 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-linux-x64": "1.0.4" + "@img/sharp-libvips-linux-x64": "1.2.4" } }, "node_modules/@img/sharp-linuxmusl-arm64": { 
- "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-arm64/-/sharp-linuxmusl-arm64-0.33.5.tgz", - "integrity": "sha512-XrHMZwGQGvJg2V/oRSUfSAfjfPxO+4DkiRh6p2AFjLQztWUuY/o8Mq0eMQVIY7HJ1CDQUJlxGGZRw1a5bqmd1g==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-arm64/-/sharp-linuxmusl-arm64-0.34.5.tgz", + "integrity": "sha512-fprJR6GtRsMt6Kyfq44IsChVZeGN97gTD331weR1ex1c1rypDEABN6Tm2xa1wE6lYb5DdEnk03NZPqA7Id21yg==", "cpu": [ "arm64" ], @@ -293,13 +294,13 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-linuxmusl-arm64": "1.0.4" + "@img/sharp-libvips-linuxmusl-arm64": "1.2.4" } }, "node_modules/@img/sharp-linuxmusl-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-x64/-/sharp-linuxmusl-x64-0.33.5.tgz", - "integrity": "sha512-WT+d/cgqKkkKySYmqoZ8y3pxx7lx9vVejxW/W4DOFMYVSkErR+w7mf2u8m/y4+xHe7yY9DAXQMWQhpnMuFfScw==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-x64/-/sharp-linuxmusl-x64-0.34.5.tgz", + "integrity": "sha512-Jg8wNT1MUzIvhBFxViqrEhWDGzqymo3sV7z7ZsaWbZNDLXRJZoRGrjulp60YYtV4wfY8VIKcWidjojlLcWrd8Q==", "cpu": [ "x64" ], @@ -315,13 +316,32 @@ "url": "https://opencollective.com/libvips" }, "optionalDependencies": { - "@img/sharp-libvips-linuxmusl-x64": "1.0.4" + "@img/sharp-libvips-linuxmusl-x64": "1.2.4" + } + }, + "node_modules/@img/sharp-win32-arm64": { + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-win32-arm64/-/sharp-win32-arm64-0.34.5.tgz", + "integrity": "sha512-WQ3AgWCWYSb2yt+IG8mnC6Jdk9Whs7O0gxphblsLvdhSpSTtmu69ZG1Gkb6NuvxsNACwiPV6cNSZNzt0KPsw7g==", + "cpu": [ + "arm64" + ], + "license": "Apache-2.0 AND LGPL-3.0-or-later", + "optional": true, + "os": [ + "win32" + ], + "engines": { + "node": "^18.17.0 || ^20.3.0 || >=21.0.0" + }, + "funding": { + "url": "https://opencollective.com/libvips" } }, 
"node_modules/@img/sharp-win32-x64": { - "version": "0.33.5", - "resolved": "https://registry.npmjs.org/@img/sharp-win32-x64/-/sharp-win32-x64-0.33.5.tgz", - "integrity": "sha512-MpY/o8/8kj+EcnxwvrP4aTJSWw/aZ7JIGR4aBeZkZw5B7/Jn+tY9/VNwtcoGmdT7GfggGIU4kygOMSbYnOrAbg==", + "version": "0.34.5", + "resolved": "https://registry.npmjs.org/@img/sharp-win32-x64/-/sharp-win32-x64-0.34.5.tgz", + "integrity": "sha512-+29YMsqY2/9eFEiW93eqWnuLcWcufowXewwSNIT6UwZdUUCrM3oFjMWH/Z6/TMmb4hlFenmfAVbpWeup2jryCw==", "cpu": [ "x64" ], diff --git a/container/agent-runner/package.json b/container/agent-runner/package.json index bf13328..42a994e 100644 --- a/container/agent-runner/package.json +++ b/container/agent-runner/package.json @@ -9,7 +9,7 @@ "start": "node dist/index.js" }, "dependencies": { - "@anthropic-ai/claude-agent-sdk": "^0.2.34", + "@anthropic-ai/claude-agent-sdk": "^0.2.76", "@modelcontextprotocol/sdk": "^1.12.1", "cron-parser": "^5.0.0", "zod": "^4.0.0" } diff --git a/container/agent-runner/src/index.ts b/container/agent-runner/src/index.ts index 543c5f5..25554f9 100644 --- a/container/agent-runner/src/index.ts +++ b/container/agent-runner/src/index.ts @@ -16,7 +16,8 @@ import fs from 'fs'; import path from 'path'; -import { query, HookCallback, PreCompactHookInput, PreToolUseHookInput } from '@anthropic-ai/claude-agent-sdk'; +import { execFile } from 'child_process'; +import { query, HookCallback, PreCompactHookInput } from '@anthropic-ai/claude-agent-sdk'; import { fileURLToPath } from 'url'; interface ContainerInput { @@ -27,7 +28,7 @@ interface ContainerInput { isMain: boolean; isScheduledTask?: boolean; assistantName?: string; - secrets?: Record<string, string>; + script?: string; } interface ContainerOutput { @@ -185,30 +186,6 @@ function createPreCompactHook(assistantName?: string): HookCallback { }; } -// Secrets to strip from Bash tool subprocess environments. -// These are needed by claude-code for API auth but should never -be visible to commands Kit runs.
-const SECRET_ENV_VARS = ['ANTHROPIC_API_KEY', 'CLAUDE_CODE_OAUTH_TOKEN']; - -function createSanitizeBashHook(): HookCallback { - return async (input, _toolUseId, _context) => { - const preInput = input as PreToolUseHookInput; - const command = (preInput.tool_input as { command?: string })?.command; - if (!command) return {}; - - const unsetPrefix = `unset ${SECRET_ENV_VARS.join(' ')} 2>/dev/null; `; - return { - hookSpecificOutput: { - hookEventName: 'PreToolUse', - updatedInput: { - ...(preInput.tool_input as Record<string, unknown>), - command: unsetPrefix + command, - }, - }, - }; - }; -} - function sanitizeFilename(summary: string): string { return summary .toLowerCase() @@ -451,7 +428,6 @@ async function runQuery( }, hooks: { PreCompact: [{ hooks: [createPreCompactHook(containerInput.assistantName)] }], - PreToolUse: [{ matcher: 'Bash', hooks: [createSanitizeBashHook()] }], }, } })) { @@ -490,13 +466,61 @@ async function runQuery( return { newSessionId, lastAssistantUuid, closedDuringQuery }; } +interface ScriptResult { + wakeAgent: boolean; + data?: unknown; +} + +const SCRIPT_TIMEOUT_MS = 30_000; + +async function runScript(script: string): Promise<ScriptResult | null> { + const scriptPath = '/tmp/task-script.sh'; + fs.writeFileSync(scriptPath, script, { mode: 0o755 }); + + return new Promise<ScriptResult | null>((resolve) => { + execFile('bash', [scriptPath], { + timeout: SCRIPT_TIMEOUT_MS, + maxBuffer: 1024 * 1024, + env: process.env, + }, (error, stdout, stderr) => { + if (stderr) { + log(`Script stderr: ${stderr.slice(0, 500)}`); + } + + if (error) { + log(`Script error: ${error.message}`); + return resolve(null); + } + + // Parse last non-empty line of stdout as JSON + const lines = stdout.trim().split('\n'); + const lastLine = lines[lines.length - 1]; + if (!lastLine) { + log('Script produced no output'); + return resolve(null); + } + + try { + const result = JSON.parse(lastLine); + if (typeof result.wakeAgent !== 'boolean') { + log(`Script output missing wakeAgent boolean: ${lastLine.slice(0, 200)}`); +
return resolve(null); + } + resolve(result as ScriptResult); + } catch { + log(`Script output is not valid JSON: ${lastLine.slice(0, 200)}`); + resolve(null); + } + }); + }); +} + async function main(): Promise<void> { let containerInput: ContainerInput; try { const stdinData = await readStdin(); containerInput = JSON.parse(stdinData); - // Delete the temp file the entrypoint wrote — it contains secrets try { fs.unlinkSync('/tmp/input.json'); } catch { /* may not exist */ } log(`Received input for group: ${containerInput.groupFolder}`); } catch (err) { @@ -508,12 +532,9 @@ async function main(): Promise<void> { process.exit(1); } - // Build SDK env: merge secrets into process.env for the SDK only. - // Secrets never touch process.env itself, so Bash subprocesses can't see them. + // Credentials are injected by the host's credential proxy via ANTHROPIC_BASE_URL. + // No real secrets exist in the container environment. const sdkEnv: Record<string, string | undefined> = { ...process.env }; - for (const [key, value] of Object.entries(containerInput.secrets || {})) { - sdkEnv[key] = value; - }
'wakeAgent=false' : 'script error/no output'; + log(`Script decided not to wake agent: ${reason}`); + writeOutput({ + status: 'success', + result: null, + }); + return; + } + + // Script says wake agent — enrich prompt with script data + log(`Script wakeAgent=true, enriching prompt with data`); + prompt = `[SCHEDULED TASK]\n\nScript output:\n${JSON.stringify(scriptResult.data, null, 2)}\n\nInstructions:\n${containerInput.prompt}`; + } + // Query loop: run query → wait for IPC message → run new query → repeat let resumeAt: string | undefined; try { diff --git a/container/agent-runner/src/ipc-mcp-stdio.ts b/container/agent-runner/src/ipc-mcp-stdio.ts index 006b812..5b03478 100644 --- a/container/agent-runner/src/ipc-mcp-stdio.ts +++ b/container/agent-runner/src/ipc-mcp-stdio.ts @@ -41,7 +41,7 @@ const server = new McpServer({ server.tool( 'send_message', - "Send a message to the user or group immediately while you're still running. Use this for progress updates or to send multiple messages. You can call this multiple times. Note: when running as a scheduled task, your final output is NOT sent to the user — use this tool if you need to communicate with the user or group.", + "Send a message to the user or group immediately while you're still running. Use this for progress updates or to send multiple messages. You can call this multiple times.", { text: z.string().describe('The message text to send'), sender: z.string().optional().describe('Your role/identity name (e.g. "Researcher"). When set, messages appear from a dedicated bot in Telegram.'), @@ -64,7 +64,7 @@ server.tool( server.tool( 'schedule_task', - `Schedule a recurring or one-time task. The task will run as a full agent with access to all tools. + `Schedule a recurring or one-time task. The task will run as a full agent with access to all tools. Returns the task ID for future reference. To modify an existing task, use update_task instead. 
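The pre-wake `script` contract enforced by `runScript` above is: the script's last non-empty stdout line must be JSON of the form `{ "wakeAgent": boolean, "data"?: any }`. A minimal sketch of such a task script, assuming a hypothetical inbox directory (the path and payload shape are illustrative, not part of the repo):

```shell
# Hypothetical pre-wake task script: wake the agent only when there is new work.
# The runner parses the LAST non-empty stdout line as JSON.
inbox="${INBOX_DIR:-/workspace/group/inbox}"          # illustrative path
count=$(ls "$inbox" 2>/dev/null | wc -l | tr -d ' ')  # pending items (0 if dir missing)
if [ "$count" -gt 0 ]; then
  result="{\"wakeAgent\": true, \"data\": {\"newItems\": $count}}"
else
  result='{"wakeAgent": false}'   # agent stays asleep; no query is run
fi
echo "$result"
```

If the script exits non-zero, exceeds the 30-second timeout, or prints something that is not JSON with a `wakeAgent` boolean, `runScript` resolves `null` and the agent is likewise not woken.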
CONTEXT MODE - Choose based on task type: \u2022 "group": Task runs in the group's conversation context, with access to chat history. Use for tasks that need context about ongoing discussions, user preferences, or recent interactions. @@ -91,6 +91,7 @@ SCHEDULE VALUE FORMAT (all times are LOCAL timezone): schedule_value: z.string().describe('cron: "*/5 * * * *" | interval: milliseconds like "300000" | once: local timestamp like "2026-02-01T15:30:00" (no Z suffix!)'), context_mode: z.enum(['group', 'isolated']).default('group').describe('group=runs with chat history and memory, isolated=fresh session (include context in prompt)'), target_group_jid: z.string().optional().describe('(Main group only) JID of the group to schedule the task for. Defaults to the current group.'), + script: z.string().optional().describe('Optional bash script to run before waking the agent. Script must output JSON on the last line of stdout: { "wakeAgent": boolean, "data"?: any }. If wakeAgent is false, the agent is not called. Test your script with bash -c "..." before scheduling.'), }, async (args) => { // Validate schedule_value before writing IPC @@ -130,9 +131,13 @@ SCHEDULE VALUE FORMAT (all times are LOCAL timezone): // Non-main groups can only schedule for themselves const targetJid = isMain && args.target_group_jid ? 
args.target_group_jid : chatJid; + const taskId = `task-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`; + const data = { type: 'schedule_task', + taskId, prompt: args.prompt, + script: args.script || undefined, schedule_type: args.schedule_type, schedule_value: args.schedule_value, context_mode: args.context_mode || 'group', @@ -141,10 +146,10 @@ SCHEDULE VALUE FORMAT (all times are LOCAL timezone): timestamp: new Date().toISOString(), }; - const filename = writeIpcFile(TASKS_DIR, data); + writeIpcFile(TASKS_DIR, data); return { - content: [{ type: 'text' as const, text: `Task scheduled (${filename}): ${args.schedule_type} - ${args.schedule_value}` }], + content: [{ type: 'text' as const, text: `Task ${taskId} scheduled: ${args.schedule_type} - ${args.schedule_value}` }], }; }, ); @@ -245,14 +250,66 @@ server.tool( ); server.tool( - 'register_group', - `Register a new WhatsApp group so the agent can respond to messages there. Main group only. - -Use available_groups.json to find the JID for a group. The folder name should be lowercase with hyphens (e.g., "family-chat").`, + 'update_task', + 'Update an existing scheduled task. Only provided fields are changed; omitted fields stay the same.', { - jid: z.string().describe('The WhatsApp JID (e.g., "120363336345536173@g.us")'), + task_id: z.string().describe('The task ID to update'), + prompt: z.string().optional().describe('New prompt for the task'), + schedule_type: z.enum(['cron', 'interval', 'once']).optional().describe('New schedule type'), + schedule_value: z.string().optional().describe('New schedule value (see schedule_task for format)'), + script: z.string().optional().describe('New script for the task.
Set to empty string to remove the script.'), + }, + async (args) => { + // Validate schedule_value if provided + if (args.schedule_type === 'cron' || (!args.schedule_type && args.schedule_value)) { + if (args.schedule_value) { + try { + CronExpressionParser.parse(args.schedule_value); + } catch { + return { + content: [{ type: 'text' as const, text: `Invalid cron: "${args.schedule_value}".` }], + isError: true, + }; + } + } + } + if (args.schedule_type === 'interval' && args.schedule_value) { + const ms = parseInt(args.schedule_value, 10); + if (isNaN(ms) || ms <= 0) { + return { + content: [{ type: 'text' as const, text: `Invalid interval: "${args.schedule_value}".` }], + isError: true, + }; + } + } + + const data: Record<string, string> = { + type: 'update_task', + taskId: args.task_id, + groupFolder, + isMain: String(isMain), + timestamp: new Date().toISOString(), + }; + if (args.prompt !== undefined) data.prompt = args.prompt; + if (args.script !== undefined) data.script = args.script; + if (args.schedule_type !== undefined) data.schedule_type = args.schedule_type; + if (args.schedule_value !== undefined) data.schedule_value = args.schedule_value; + + writeIpcFile(TASKS_DIR, data); + + return { content: [{ type: 'text' as const, text: `Task ${args.task_id} update requested.` }] }; + }, +); + +server.tool( + 'register_group', + `Register a new chat/group so the agent can respond to messages there. Main group only. + +Use available_groups.json to find the JID for a group. The folder name must be channel-prefixed: "{channel}_{group-name}" (e.g., "whatsapp_family-chat", "telegram_dev-team", "discord_general").
Use lowercase with hyphens for the group name part.`, + { + jid: z.string().describe('The chat JID (e.g., "120363336345536173@g.us", "tg:-1001234567890", "dc:1234567890123456")'), name: z.string().describe('Display name for the group'), - folder: z.string().describe('Folder name for group files (lowercase, hyphens, e.g., "family-chat")'), + folder: z.string().describe('Channel-prefixed folder name (e.g., "whatsapp_family-chat", "telegram_dev-team")'), trigger: z.string().describe('Trigger word (e.g., "@Andy")'), }, async (args) => { diff --git a/container/skills/capabilities/SKILL.md b/container/skills/capabilities/SKILL.md new file mode 100644 index 0000000..8e8be14 --- /dev/null +++ b/container/skills/capabilities/SKILL.md @@ -0,0 +1,100 @@ +--- +name: capabilities +description: Show what this NanoClaw instance can do — installed skills, available tools, and system info. Read-only. Use when the user asks what the bot can do, what's installed, or runs /capabilities. +--- + +# /capabilities — System Capabilities Report + +Generate a structured read-only report of what this NanoClaw instance can do. + +**Main-channel check:** Only the main channel has `/workspace/project` mounted. Run: + +```bash +test -d /workspace/project && echo "MAIN" || echo "NOT_MAIN" +``` + +If `NOT_MAIN`, respond with: +> This command is available in your main chat only. Send `/capabilities` there to see what I can do. + +Then stop — do not generate the report. + +## How to gather the information + +Run these commands and compile the results into the report format below. + +### 1. Installed skills + +List skill directories available to you: + +```bash +ls -1 /home/node/.claude/skills/ 2>/dev/null || echo "No skills found" +``` + +Each directory is an installed skill. The directory name is the skill name (e.g., `agent-browser` → `/agent-browser`). + +### 2. Available tools + +Read the allowed tools from your SDK configuration. 
You always have access to: +- **Core:** Bash, Read, Write, Edit, Glob, Grep +- **Web:** WebSearch, WebFetch +- **Orchestration:** Task, TaskOutput, TaskStop, TeamCreate, TeamDelete, SendMessage +- **Other:** TodoWrite, ToolSearch, Skill, NotebookEdit +- **MCP:** mcp__nanoclaw__* (messaging, tasks, group management) + +### 3. MCP server tools + +The NanoClaw MCP server exposes these tools (via `mcp__nanoclaw__*` prefix): +- `send_message` — send a message to the user/group +- `schedule_task` — schedule a recurring or one-time task +- `list_tasks` — list scheduled tasks +- `pause_task` — pause a scheduled task +- `resume_task` — resume a paused task +- `cancel_task` — cancel and delete a task +- `update_task` — update an existing task +- `register_group` — register a new chat/group (main only) + +### 4. Container skills (Bash tools) + +Check for executable tools in the container: + +```bash +which agent-browser 2>/dev/null && echo "agent-browser: available" || echo "agent-browser: not found" +``` + +### 5. Group info + +```bash +ls /workspace/group/CLAUDE.md 2>/dev/null && echo "Group memory: yes" || echo "Group memory: no" +ls /workspace/extra/ 2>/dev/null && echo "Extra mounts: $(ls /workspace/extra/ 2>/dev/null | wc -l | tr -d ' ')" || echo "Extra mounts: none" +``` + +## Report format + +Present the report as a clean, readable message. 
Example: + +``` +📋 *NanoClaw Capabilities* + +*Installed Skills:* +• /agent-browser — Browse the web, fill forms, extract data +• /capabilities — This report +(list all found skills) + +*Tools:* +• Core: Bash, Read, Write, Edit, Glob, Grep +• Web: WebSearch, WebFetch +• Orchestration: Task, TeamCreate, SendMessage +• MCP: send_message, schedule_task, list_tasks, pause/resume/cancel/update_task, register_group + +*Container Tools:* +• agent-browser: ✓ + +*System:* +• Group memory: yes/no +• Extra mounts: N directories +• Main channel: yes +``` + +Adapt the output based on what you actually find — don't list things that aren't installed. + +**See also:** `/status` for a quick health check of session, workspace, and tasks. diff --git a/container/skills/slack-formatting/SKILL.md b/container/skills/slack-formatting/SKILL.md new file mode 100644 index 0000000..29a1b87 --- /dev/null +++ b/container/skills/slack-formatting/SKILL.md @@ -0,0 +1,94 @@ +--- +name: slack-formatting +description: Format messages for Slack using mrkdwn syntax. Use when responding to Slack channels (folder starts with "slack_" or JID contains slack identifiers). +--- + +# Slack Message Formatting (mrkdwn) + +When responding to Slack channels, use Slack's mrkdwn syntax instead of standard Markdown. 
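A quick way to branch on this at runtime, assuming the `slack_` folder-prefix convention used for registered groups (the `GROUP_DIR` variable here is illustrative):

```shell
# Hypothetical check: choose a formatting style from the group folder name.
folder=$(basename "${GROUP_DIR:-/workspace/group}")  # e.g. "slack_engineering"
if [ "${folder#slack_}" != "$folder" ]; then
  style="mrkdwn"     # Slack channel: use Slack mrkdwn
else
  style="markdown"   # any other channel: standard Markdown
fi
echo "Formatting style: $style"
```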
+ ## How to detect Slack context + Check your group folder name or workspace path: - Folder starts with `slack_` (e.g., `slack_engineering`, `slack_general`) - Or check `/workspace/group/` path for `slack_` prefix + ## Formatting reference + ### Text styles + | Style | Syntax | Example | |-------|--------|---------| | Bold | `*text*` | *bold text* | | Italic | `_text_` | _italic text_ | | Strikethrough | `~text~` | ~strikethrough~ | | Code (inline) | `` `code` `` | `inline code` | | Code block | ` ```code``` ` | Multi-line code | + ### Links and mentions + ``` +<https://example.com|text> # Named link +<https://example.com> # Auto-linked URL +<@U1234567890> # Mention user by ID +<#C1234567890> # Mention channel by ID +<!here> # @here +<!channel> # @channel +``` + ### Lists + Slack supports simple bullet lists but NOT numbered lists: + ``` +• First item +• Second item +• Third item +``` + Use `•` (bullet character) or `- ` or `* ` for bullets. + ### Block quotes + ``` +> This is a block quote +> It can span multiple lines +``` + ### Emoji + Use standard emoji shortcodes: `:white_check_mark:`, `:x:`, `:rocket:`, `:tada:` + ## What NOT to use + - **NO** `##` headings (use `*Bold text*` for headers instead) - **NO** `**double asterisks**` for bold (use `*single asterisks*`) - **NO** `[text](url)` links (use `<url|text>` instead) - **NO** `1.` numbered lists (use bullets with numbers: `• 1. First`) - **NO** tables (use code blocks or plain text alignment) - **NO** `---` horizontal rules + ## Example message + ``` +*Daily Standup Summary* + +_March 21, 2026_ + +• *Completed:* Fixed authentication bug in login flow +• *In Progress:* Building new dashboard widgets +• *Blocked:* Waiting on API access from DevOps + +> Next sync: Monday 10am + +:white_check_mark: All tests passing | <https://example.com|details> +``` + ## Quick rules + 1. Use `*bold*` not `**bold**` 2. Use `<url|text>` not `[text](url)` 3. Use `•` bullets, avoid numbered lists 4. Use `:emoji:` shortcodes 5. Quote blocks with `>` 6.
Skip headings — use bold text instead diff --git a/container/skills/status/SKILL.md b/container/skills/status/SKILL.md new file mode 100644 index 0000000..3a99fcc --- /dev/null +++ b/container/skills/status/SKILL.md @@ -0,0 +1,104 @@ +--- +name: status +description: Quick read-only health check — session context, workspace mounts, tool availability, and task snapshot. Use when the user asks for system status or runs /status. +--- + +# /status — System Status Check + +Generate a quick read-only status report of the current agent environment. + +**Main-channel check:** Only the main channel has `/workspace/project` mounted. Run: + +```bash +test -d /workspace/project && echo "MAIN" || echo "NOT_MAIN" +``` + +If `NOT_MAIN`, respond with: +> This command is available in your main chat only. Send `/status` there to check system status. + +Then stop — do not generate the report. + +## How to gather the information + +Run the checks below and compile results into the report format. + +### 1. Session context + +```bash +echo "Timestamp: $(date)" +echo "Working dir: $(pwd)" +echo "Channel: main" +``` + +### 2. Workspace and mount visibility + +```bash +echo "=== Workspace ===" +ls /workspace/ 2>/dev/null +echo "=== Group folder ===" +ls /workspace/group/ 2>/dev/null | head -20 +echo "=== Extra mounts ===" +ls /workspace/extra/ 2>/dev/null || echo "none" +echo "=== IPC ===" +ls /workspace/ipc/ 2>/dev/null +``` + +### 3. Tool availability + +Confirm which tool families are available to you: + +- **Core:** Bash, Read, Write, Edit, Glob, Grep +- **Web:** WebSearch, WebFetch +- **Orchestration:** Task, TaskOutput, TaskStop, TeamCreate, TeamDelete, SendMessage +- **MCP:** mcp__nanoclaw__* (send_message, schedule_task, list_tasks, pause_task, resume_task, cancel_task, update_task, register_group) + +### 4. 
Container utilities + +```bash +which agent-browser 2>/dev/null && echo "agent-browser: available" || echo "agent-browser: not installed" +node --version 2>/dev/null +claude --version 2>/dev/null +``` + +### 5. Task snapshot + +Use the MCP tool to list tasks: + +``` +Call mcp__nanoclaw__list_tasks to get scheduled tasks. +``` + +If no tasks exist, report "No scheduled tasks." + +## Report format + +Present as a clean, readable message: + +``` +🔍 *NanoClaw Status* + +*Session:* +• Channel: main +• Time: 2026-03-14 09:30 UTC +• Working dir: /workspace/group + +*Workspace:* +• Group folder: ✓ (N files) +• Extra mounts: none / N directories +• IPC: ✓ (messages, tasks, input) + +*Tools:* +• Core: ✓ Web: ✓ Orchestration: ✓ MCP: ✓ + +*Container:* +• agent-browser: ✓ / not installed +• Node: vXX.X.X +• Claude Code: vX.X.X + +*Scheduled Tasks:* +• N active tasks / No scheduled tasks +``` + +Adapt based on what you actually find. Keep it concise — this is a quick health check, not a deep diagnostic. + +**See also:** `/capabilities` for a full list of installed skills and tools. diff --git a/docs/DEBUG_CHECKLIST.md b/docs/DEBUG_CHECKLIST.md index a04d88f..a7b3de1 100644 --- a/docs/DEBUG_CHECKLIST.md +++ b/docs/DEBUG_CHECKLIST.md @@ -47,16 +47,16 @@ launchctl list | grep nanoclaw # Expected: PID 0 com.nanoclaw (PID = running, "-" = not running, non-zero exit = crashed) # 2. Any running containers? -container ls --format '{{.Names}} {{.Status}}' 2>/dev/null | grep nanoclaw +docker ps --format '{{.Names}} {{.Status}}' 2>/dev/null | grep nanoclaw # 3. Any stopped/orphaned containers? -container ls -a --format '{{.Names}} {{.Status}}' 2>/dev/null | grep nanoclaw +docker ps -a --format '{{.Names}} {{.Status}}' 2>/dev/null | grep nanoclaw # 4. Recent errors in service log? grep -E 'ERROR|WARN' logs/nanoclaw.log | tail -20 -# 5. Is WhatsApp connected? 
(look for last connection event) -grep -E 'Connected to WhatsApp|Connection closed|connection.*close' logs/nanoclaw.log | tail -5 +# 5. Are channels connected? (look for last connection event) +grep -E 'Connected|Connection closed|connection.*close|channel.*ready' logs/nanoclaw.log | tail -5 # 6. Are groups loaded? grep 'groupCount' logs/nanoclaw.log | tail -3 @@ -105,7 +105,7 @@ grep -E 'Scheduling retry|retry|Max retries' logs/nanoclaw.log | tail -10 ## Agent Not Responding ```bash -# Check if messages are being received from WhatsApp +# Check if messages are being received from channels grep 'New messages' logs/nanoclaw.log | tail -10 # Check if messages are being processed (container spawned) @@ -135,10 +135,10 @@ sqlite3 store/messages.db "SELECT name, container_config FROM registered_groups; # Test-run a container to check mounts (dry run) # Replace with the group's folder name -container run -i --rm --entrypoint ls nanoclaw-agent:latest /workspace/extra/ +docker run -i --rm --entrypoint ls nanoclaw-agent:latest /workspace/extra/ ``` -## WhatsApp Auth Issues +## Channel Auth Issues ```bash # Check if QR code was requested (means auth expired) diff --git a/docs/README.md b/docs/README.md new file mode 100644 index 0000000..bb062e5 --- /dev/null +++ b/docs/README.md @@ -0,0 +1,15 @@ +# NanoClaw Documentation + +The official documentation is at **[docs.nanoclaw.dev](https://docs.nanoclaw.dev)**. + +The files in this directory are original design documents and developer references. For the most current and accurate information, use the documentation site. 
+ +| This directory | Documentation site | +|---|---| +| [SPEC.md](SPEC.md) | [Architecture](https://docs.nanoclaw.dev/concepts/architecture) | +| [SECURITY.md](SECURITY.md) | [Security model](https://docs.nanoclaw.dev/concepts/security) | +| [REQUIREMENTS.md](REQUIREMENTS.md) | [Introduction](https://docs.nanoclaw.dev/introduction) | +| [skills-as-branches.md](skills-as-branches.md) | [Skills system](https://docs.nanoclaw.dev/integrations/skills-system) | +| [DEBUG_CHECKLIST.md](DEBUG_CHECKLIST.md) | [Troubleshooting](https://docs.nanoclaw.dev/advanced/troubleshooting) | +| [docker-sandboxes.md](docker-sandboxes.md) | [Docker Sandboxes](https://docs.nanoclaw.dev/advanced/docker-sandboxes) | +| [APPLE-CONTAINER-NETWORKING.md](APPLE-CONTAINER-NETWORKING.md) | [Container runtime](https://docs.nanoclaw.dev/advanced/container-runtime) | diff --git a/docs/REQUIREMENTS.md b/docs/REQUIREMENTS.md index 227c9ad..e7c2376 100644 --- a/docs/REQUIREMENTS.md +++ b/docs/REQUIREMENTS.md @@ -22,9 +22,9 @@ The entire codebase should be something you can read and understand. One Node.js Instead of application-level permission systems trying to prevent agents from accessing things, agents run in actual Linux containers. The isolation is at the OS level. Agents can only see what's explicitly mounted. Bash access is safe because commands run inside the container, not on your Mac. -### Built for One User +### Built for the Individual User -This isn't a framework or a platform. It's working software for my specific needs. I use WhatsApp and Email, so it supports WhatsApp and Email. I don't use Telegram, so it doesn't support Telegram. I add the integrations I actually want, not every possible integration. +This isn't a framework or a platform. It's software that fits each user's exact needs. You fork the repo, add the channels you want (WhatsApp, Telegram, Discord, Slack, Gmail), and end up with clean code that does exactly what you need. 
### Customization = Code Changes @@ -44,41 +44,31 @@ When people contribute, they shouldn't add "Telegram support alongside WhatsApp. ## RFS (Request for Skills) -Skills we'd love contributors to build: +Skills we'd like to see contributed: ### Communication Channels -Skills to add or switch to different messaging platforms: -- `/add-telegram` - Add Telegram as an input channel -- `/add-slack` - Add Slack as an input channel -- `/add-discord` - Add Discord as an input channel -- `/add-sms` - Add SMS via Twilio or similar -- `/convert-to-telegram` - Replace WhatsApp with Telegram entirely +- `/add-signal` - Add Signal as a channel +- `/add-matrix` - Add Matrix integration -### Container Runtime -The project uses Docker by default (cross-platform). For macOS users who prefer Apple Container: -- `/convert-to-apple-container` - Switch from Docker to Apple Container (macOS-only) - -### Platform Support -- `/setup-linux` - Make the full setup work on Linux (depends on Docker conversion) -- `/setup-windows` - Windows support via WSL2 + Docker +> **Note:** Telegram, Slack, Discord, Gmail, and Apple Container skills already exist. See the [skills documentation](https://docs.nanoclaw.dev/integrations/skills-system) for the full list. --- ## Vision -A personal Claude assistant accessible via WhatsApp, with minimal custom code. +A personal Claude assistant accessible via messaging, with minimal custom code. 
**Core components:** - **Claude Agent SDK** as the core agent - **Containers** for isolated agent execution (Linux VMs) -- **WhatsApp** as the primary I/O channel +- **Multi-channel messaging** (WhatsApp, Telegram, Discord, Slack, Gmail) — add exactly the channels you need - **Persistent memory** per conversation and globally - **Scheduled tasks** that run Claude and can message back - **Web access** for search and browsing - **Browser automation** via agent-browser **Implementation approach:** -- Use existing tools (WhatsApp connector, Claude Agent SDK, MCP servers) +- Use existing tools (channel libraries, Claude Agent SDK, MCP servers) - Minimal glue code - File-based systems where possible (CLAUDE.md for memory, folders for groups) @@ -87,7 +77,7 @@ A personal Claude assistant accessible via WhatsApp, with minimal custom code. ## Architecture Decisions ### Message Routing -- A router listens to WhatsApp and routes messages based on configuration +- A router listens to connected channels and routes messages based on configuration - Only messages from registered groups are processed - Trigger: `@Andy` prefix (case insensitive), configurable via `ASSISTANT_NAME` env var - Unregistered groups are ignored completely @@ -136,10 +126,11 @@ A personal Claude assistant accessible via WhatsApp, with minimal custom code. 
## Integration Points -### WhatsApp -- Using baileys library for WhatsApp Web connection +### Channels +- WhatsApp (baileys), Telegram (grammy), Discord (discord.js), Slack (@slack/bolt), Gmail (googleapis) +- Each channel lives in a separate fork repo and is added via skills (e.g., `/add-whatsapp`, `/add-telegram`) - Messages stored in SQLite, polled by router -- QR code authentication during setup +- Channels self-register at startup — unconfigured channels are skipped with a warning ### Scheduler - Built-in scheduler runs on the host, spawns containers for task execution @@ -170,12 +161,12 @@ A personal Claude assistant accessible via WhatsApp, with minimal custom code. - Each user gets a custom setup matching their exact needs ### Skills -- `/setup` - Install dependencies, authenticate WhatsApp, configure scheduler, start services -- `/customize` - General-purpose skill for adding capabilities (new channels like Telegram, new integrations, behavior changes) -- `/update` - Pull upstream changes, merge with customizations, run migrations +- `/setup` - Install dependencies, configure channels, start services +- `/customize` - General-purpose skill for adding capabilities +- `/update-nanoclaw` - Pull upstream changes, merge with customizations ### Deployment -- Runs on local Mac via launchd +- Runs on macOS (launchd), Linux (systemd), or Windows (WSL2) - Single Node.js process handles everything --- diff --git a/docs/SECURITY.md b/docs/SECURITY.md index 7fcee1b..7cf29f8 100644 --- a/docs/SECURITY.md +++ b/docs/SECURITY.md @@ -7,7 +7,7 @@ | Main group | Trusted | Private self-chat, admin control | | Non-main groups | Untrusted | Other users may be malicious | | Container agents | Sandboxed | Isolated execution environment | -| WhatsApp messages | User input | Potential prompt injection | +| Incoming messages | User input | Potential prompt injection | ## Security Boundaries @@ -64,23 +64,24 @@ Messages and task operations are verified against group identity: | View 
all tasks | ✓ | Own only | | Manage other groups | ✓ | ✗ | -### 5. Credential Handling +### 5. Credential Isolation (OneCLI Agent Vault) -**Mounted Credentials:** -- Claude auth tokens (filtered from `.env`, read-only) +Real API credentials **never enter containers**. NanoClaw uses [OneCLI's Agent Vault](https://github.com/onecli/onecli) to proxy outbound requests and inject credentials at the gateway level. + +**How it works:** +1. Credentials are registered once with `onecli secrets create`, stored and managed by OneCLI +2. When NanoClaw spawns a container, it calls `applyContainerConfig()` to route outbound HTTPS through the OneCLI gateway +3. The gateway matches requests by host and path, injects the real credential, and forwards +4. Agents cannot discover real credentials — not in environment, stdin, files, or `/proc` + +**Per-agent policies:** +Each NanoClaw group gets its own OneCLI agent identity. This allows different credential policies per group (e.g. your sales agent vs. support agent). OneCLI supports rate limits, and time-bound access and approval flows are on the roadmap. **NOT Mounted:** -- WhatsApp session (`store/auth/`) - host only -- Mount allowlist - external, never mounted +- Channel auth sessions (`store/auth/`) — host only +- Mount allowlist — external, never mounted - Any credentials matching blocked patterns - -**Credential Filtering:** -Only these environment variables are exposed to containers: -```typescript -const allowedVars = ['CLAUDE_CODE_OAUTH_TOKEN', 'ANTHROPIC_API_KEY']; -``` - -> **Note:** Anthropic credentials are mounted so that Claude Code can authenticate when the agent runs. However, this means the agent itself can discover these credentials via Bash or file operations. Ideally, Claude Code would authenticate without exposing credentials to the agent's execution environment, but I couldn't figure this out. **PRs welcome** if you have ideas for credential isolation. 
+- `.env` is shadowed with `/dev/null` in the project root mount ## Privilege Comparison @@ -98,7 +99,7 @@ const allowedVars = ['CLAUDE_CODE_OAUTH_TOKEN', 'ANTHROPIC_API_KEY']; ``` ┌──────────────────────────────────────────────────────────────────┐ │ UNTRUSTED ZONE │ -│ WhatsApp Messages (potentially malicious) │ +│ Incoming Messages (potentially malicious) │ └────────────────────────────────┬─────────────────────────────────┘ │ ▼ Trigger check, input escaping @@ -108,16 +109,16 @@ const allowedVars = ['CLAUDE_CODE_OAUTH_TOKEN', 'ANTHROPIC_API_KEY']; │ • IPC authorization │ │ • Mount validation (external allowlist) │ │ • Container lifecycle │ -│ • Credential filtering │ +│ • OneCLI Agent Vault (injects credentials, enforces policies) │ └────────────────────────────────┬─────────────────────────────────┘ │ - ▼ Explicit mounts only + ▼ Explicit mounts only, no secrets ┌──────────────────────────────────────────────────────────────────┐ │ CONTAINER (ISOLATED/SANDBOXED) │ │ • Agent execution │ │ • Bash commands (sandboxed) │ │ • File operations (limited to mounts) │ -│ • Network access (unrestricted) │ -│ • Cannot modify security config │ +│ • API calls routed through OneCLI Agent Vault │ +│ • No real credentials in environment or filesystem │ └──────────────────────────────────────────────────────────────────┘ ``` diff --git a/docs/SPEC.md b/docs/SPEC.md index b439012..598f34e 100644 --- a/docs/SPEC.md +++ b/docs/SPEC.md @@ -1,79 +1,81 @@ # NanoClaw Specification -A personal Claude assistant accessible via WhatsApp, with persistent memory per conversation, scheduled tasks, and email integration. +A personal Claude assistant with multi-channel support, persistent memory per conversation, scheduled tasks, and container-isolated agent execution. --- ## Table of Contents 1. [Architecture](#architecture) -2. [Folder Structure](#folder-structure) -3. [Configuration](#configuration) -4. [Memory System](#memory-system) -5. [Session Management](#session-management) -6. 
[Message Flow](#message-flow) -7. [Commands](#commands) -8. [Scheduled Tasks](#scheduled-tasks) -9. [MCP Servers](#mcp-servers) -10. [Deployment](#deployment) -11. [Security Considerations](#security-considerations) +2. [Architecture: Channel System](#architecture-channel-system) +3. [Folder Structure](#folder-structure) +4. [Configuration](#configuration) +5. [Memory System](#memory-system) +6. [Session Management](#session-management) +7. [Message Flow](#message-flow) +8. [Commands](#commands) +9. [Scheduled Tasks](#scheduled-tasks) +10. [MCP Servers](#mcp-servers) +11. [Deployment](#deployment) +12. [Security Considerations](#security-considerations) --- ## Architecture ``` -┌─────────────────────────────────────────────────────────────────────┐ -│ HOST (macOS) │ -│ (Main Node.js Process) │ -├─────────────────────────────────────────────────────────────────────┤ -│ │ -│ ┌──────────────┐ ┌────────────────────┐ │ -│ │ WhatsApp │────────────────────▶│ SQLite Database │ │ -│ │ (baileys) │◀────────────────────│ (messages.db) │ │ -│ └──────────────┘ store/send └─────────┬──────────┘ │ -│ │ │ -│ ┌────────────────────────────────────────┘ │ -│ │ │ -│ ▼ │ -│ ┌──────────────────┐ ┌──────────────────┐ ┌───────────────┐ │ -│ │ Message Loop │ │ Scheduler Loop │ │ IPC Watcher │ │ -│ │ (polls SQLite) │ │ (checks tasks) │ │ (file-based) │ │ -│ └────────┬─────────┘ └────────┬─────────┘ └───────────────┘ │ -│ │ │ │ -│ └───────────┬───────────┘ │ -│ │ spawns container │ -│ ▼ │ -├─────────────────────────────────────────────────────────────────────┤ -│ CONTAINER (Linux VM) │ -├─────────────────────────────────────────────────────────────────────┤ -│ ┌──────────────────────────────────────────────────────────────┐ │ -│ │ AGENT RUNNER │ │ -│ │ │ │ -│ │ Working directory: /workspace/group (mounted from host) │ │ -│ │ Volume mounts: │ │ -│ │ • groups/{name}/ → /workspace/group │ │ -│ │ • groups/global/ → /workspace/global/ (non-main only) │ │ -│ │ • data/sessions/{group}/.claude/ → 
/home/node/.claude/ │ │ -│ │ • Additional dirs → /workspace/extra/* │ │ -│ │ │ │ -│ │ Tools (all groups): │ │ -│ │ • Bash (safe - sandboxed in container!) │ │ -│ │ • Read, Write, Edit, Glob, Grep (file operations) │ │ -│ │ • WebSearch, WebFetch (internet access) │ │ -│ │ • agent-browser (browser automation) │ │ -│ │ • mcp__nanoclaw__* (scheduler tools via IPC) │ │ -│ │ │ │ -│ └──────────────────────────────────────────────────────────────┘ │ -│ │ -└──────────────────────────────────────────────────────────────────────┘ +┌──────────────────────────────────────────────────────────────────────┐ +│ HOST (macOS / Linux) │ +│ (Main Node.js Process) │ +├──────────────────────────────────────────────────────────────────────┤ +│ │ +│ ┌──────────────────┐ ┌────────────────────┐ │ +│ │ Channels │─────────────────▶│ SQLite Database │ │ +│ │ (self-register │◀────────────────│ (messages.db) │ │ +│ │ at startup) │ store/send └─────────┬──────────┘ │ +│ └──────────────────┘ │ │ +│ │ │ +│ ┌─────────────────────────────────────────┘ │ +│ │ │ +│ ▼ │ +│ ┌──────────────────┐ ┌──────────────────┐ ┌───────────────┐ │ +│ │ Message Loop │ │ Scheduler Loop │ │ IPC Watcher │ │ +│ │ (polls SQLite) │ │ (checks tasks) │ │ (file-based) │ │ +│ └────────┬─────────┘ └────────┬─────────┘ └───────────────┘ │ +│ │ │ │ +│ └───────────┬───────────┘ │ +│ │ spawns container │ +│ ▼ │ +├──────────────────────────────────────────────────────────────────────┤ +│ CONTAINER (Linux VM) │ +├──────────────────────────────────────────────────────────────────────┤ +│ ┌──────────────────────────────────────────────────────────────┐ │ +│ │ AGENT RUNNER │ │ +│ │ │ │ +│ │ Working directory: /workspace/group (mounted from host) │ │ +│ │ Volume mounts: │ │ +│ │ • groups/{name}/ → /workspace/group │ │ +│ │ • groups/global/ → /workspace/global/ (non-main only) │ │ +│ │ • data/sessions/{group}/.claude/ → /home/node/.claude/ │ │ +│ │ • Additional dirs → /workspace/extra/* │ │ +│ │ │ │ +│ │ Tools (all groups): │ │ +│ │ • Bash 
(safe - sandboxed in container!) │ │ +│ │ • Read, Write, Edit, Glob, Grep (file operations) │ │ +│ │ • WebSearch, WebFetch (internet access) │ │ +│ │ • agent-browser (browser automation) │ │ +│ │ • mcp__nanoclaw__* (scheduler tools via IPC) │ │ +│ │ │ │ +│ └──────────────────────────────────────────────────────────────┘ │ +│ │ +└───────────────────────────────────────────────────────────────────────┘ ``` ### Technology Stack | Component | Technology | Purpose | |-----------|------------|---------| -| WhatsApp Connection | Node.js (@whiskeysockets/baileys) | Connect to WhatsApp, send/receive messages | +| Channel System | Channel registry (`src/channels/registry.ts`) | Channels self-register at startup | | Message Storage | SQLite (better-sqlite3) | Store messages for polling | | Container Runtime | Containers (Linux VMs) | Isolated environments for agent execution | | Agent | @anthropic-ai/claude-agent-sdk (0.2.29) | Run Claude with tools and MCP servers | @@ -82,6 +84,158 @@ A personal Claude assistant accessible via WhatsApp, with persistent memory per --- +## Architecture: Channel System + +The core ships with no channels built in — each channel (WhatsApp, Telegram, Slack, Discord, Gmail) is installed as a [Claude Code skill](https://code.claude.com/docs/en/skills) that adds the channel code to your fork. Channels self-register at startup; installed channels with missing credentials emit a WARN log and are skipped. 
+

### System Diagram

```mermaid
graph LR
    subgraph Channels["Channels"]
        WA[WhatsApp]
        TG[Telegram]
        SL[Slack]
        DC[Discord]
        New["Other Channel (Signal, Gmail...)"]
    end

    subgraph Orchestrator["Orchestrator — index.ts"]
        ML[Message Loop]
        GQ[Group Queue]
        RT[Router]
        TS[Task Scheduler]
        DB[(SQLite)]
    end

    subgraph Execution["Container Execution"]
        CR[Container Runner]
        LC["Linux Container"]
        IPC[IPC Watcher]
    end

    %% Flow
    WA & TG & SL & DC & New -->|onMessage| ML
    ML --> GQ
    GQ -->|concurrency| CR
    CR --> LC
    LC -->|filesystem IPC| IPC
    IPC -->|tasks & messages| RT
    RT -->|Channel.sendMessage| Channels
    TS -->|due tasks| CR

    %% DB Connections
    DB <--> ML
    DB <--> TS

    %% Styling for the dynamic channel
    style New stroke-dasharray: 5 5,stroke-width:2px
```

### Channel Registry

The channel system is built on a factory registry in `src/channels/registry.ts`:

```typescript
export type ChannelFactory = (opts: ChannelOpts) => Channel | null;

const registry = new Map<string, ChannelFactory>();

export function registerChannel(name: string, factory: ChannelFactory): void {
  registry.set(name, factory);
}

export function getChannelFactory(name: string): ChannelFactory | undefined {
  return registry.get(name);
}

export function getRegisteredChannelNames(): string[] {
  return [...registry.keys()];
}
```

Each factory receives `ChannelOpts` (callbacks for `onMessage`, `onChatMetadata`, and `registeredGroups`) and returns either a `Channel` instance or `null` if that channel's credentials are not configured. 
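For orientation, `ChannelOpts` can be sketched roughly as follows. This is a hypothetical shape inferred from the description above; the real definition lives in `src/types.ts` and may differ in field names and payload types:

```typescript
// Hypothetical sketch of ChannelOpts. Field names and payload shapes are
// assumptions for illustration, not the actual src/types.ts definition.
interface ChannelOpts {
  // Called by the channel when a new inbound message arrives
  onMessage: (jid: string, message: { sender: string; text: string }) => void;
  // Called when chat metadata (e.g. a group name) becomes known
  onChatMetadata: (jid: string, metadata: { name: string }) => void;
  // Lets the channel look up which groups are registered
  registeredGroups: () => Record<string, { folder: string }>;
}
```

A channel calls `opts.onMessage(...)` for each inbound message; the orchestrator supplies the callbacks and owns what happens next.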
+

### Channel Interface

Every channel implements this interface (defined in `src/types.ts`):

```typescript
interface Channel {
  name: string;
  connect(): Promise<void>;
  sendMessage(jid: string, text: string): Promise<void>;
  isConnected(): boolean;
  ownsJid(jid: string): boolean;
  disconnect(): Promise<void>;
  setTyping?(jid: string, isTyping: boolean): Promise<void>;
  syncGroups?(force: boolean): Promise<void>;
}
```

### Self-Registration Pattern

Channels self-register using a barrel-import pattern:

1. Each channel skill adds a file to `src/channels/` (e.g. `whatsapp.ts`, `telegram.ts`) that calls `registerChannel()` at module load time:

   ```typescript
   // src/channels/whatsapp.ts
   import { registerChannel, ChannelOpts } from './registry.js';

   export class WhatsAppChannel implements Channel { /* ... */ }

   registerChannel('whatsapp', (opts: ChannelOpts) => {
     // Return null if credentials are missing
     if (!existsSync(authPath)) return null;
     return new WhatsAppChannel(opts);
   });
   ```

2. The barrel file `src/channels/index.ts` imports all channel modules, triggering registration:

   ```typescript
   import './whatsapp.js';
   import './telegram.js';
   // ... each skill adds its import here
   ```

3. 
At startup, the orchestrator (`src/index.ts`) loops through registered channels and connects whichever ones return a valid instance:

   ```typescript
   for (const name of getRegisteredChannelNames()) {
     const factory = getChannelFactory(name);
     const channel = factory?.(channelOpts);
     if (channel) {
       await channel.connect();
       channels.push(channel);
     }
   }
   ```

### Key Files

| File | Purpose |
|------|---------|
| `src/channels/registry.ts` | Channel factory registry |
| `src/channels/index.ts` | Barrel imports that trigger channel self-registration |
| `src/types.ts` | `Channel` interface, `ChannelOpts`, message types |
| `src/index.ts` | Orchestrator — instantiates channels, runs message loop |
| `src/router.ts` | Finds the owning channel for a JID, formats messages |

### Adding a New Channel

To add a new channel, contribute a skill to `.claude/skills/add-<channel>/` that:

1. Adds a `src/channels/<channel>.ts` file implementing the `Channel` interface
2. Calls `registerChannel(name, factory)` at module load
3. Returns `null` from the factory if credentials are missing
4. Adds an import line to `src/channels/index.ts`

See existing skills (`/add-whatsapp`, `/add-telegram`, `/add-slack`, `/add-discord`, `/add-gmail`) for the pattern. 
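The steps above can be sketched end to end for a hypothetical Signal channel. The registry is inlined here so the sketch is self-contained and the interface is abbreviated; in the real codebase you would import from `./registry.js`, implement the full `Channel` interface from `src/types.ts`, and add the import line to `src/channels/index.ts`:

```typescript
// Self-contained sketch of a hypothetical Signal channel skill.
// The inlined registry and the SIGNAL_ACCOUNT env var are assumptions
// for illustration, not part of the actual codebase.
interface Channel {
  name: string;
  connect(): Promise<void>;
  sendMessage(jid: string, text: string): Promise<void>;
  isConnected(): boolean;
  ownsJid(jid: string): boolean;
  disconnect(): Promise<void>;
}
type ChannelFactory = (opts: unknown) => Channel | null;
const registry = new Map<string, ChannelFactory>();
const registerChannel = (name: string, f: ChannelFactory) => registry.set(name, f);

class SignalChannel implements Channel {
  name = 'signal';
  private connected = false;
  async connect(): Promise<void> { this.connected = true; }
  async sendMessage(_jid: string, _text: string): Promise<void> { /* deliver via Signal API */ }
  isConnected(): boolean { return this.connected; }
  // Claim JIDs using a channel prefix convention
  ownsJid(jid: string): boolean { return jid.startsWith('signal:'); }
  async disconnect(): Promise<void> { this.connected = false; }
}

// Factory returns null when credentials are missing (step 3)
registerChannel('signal', () =>
  process.env.SIGNAL_ACCOUNT ? new SignalChannel() : null
);
```

At startup the orchestrator's loop would call this factory; with no `SIGNAL_ACCOUNT` configured it gets `null` and simply skips the channel.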
+ +--- + ## Folder Structure ``` @@ -100,7 +254,8 @@ nanoclaw/ ├── src/ │ ├── index.ts # Orchestrator: state, message loop, agent invocation │ ├── channels/ -│ │ └── whatsapp.ts # WhatsApp connection, auth, send/receive +│ │ ├── registry.ts # Channel factory registry +│ │ └── index.ts # Barrel imports for channel self-registration │ ├── ipc.ts # IPC watcher and task processing │ ├── router.ts # Message formatting and outbound routing │ ├── config.ts # Configuration constants @@ -141,10 +296,10 @@ nanoclaw/ │ ├── groups/ │ ├── CLAUDE.md # Global memory (all groups read this) -│ ├── main/ # Self-chat (main control channel) +│ ├── {channel}_main/ # Main control channel (e.g., whatsapp_main/) │ │ ├── CLAUDE.md # Main channel memory │ │ └── logs/ # Task execution logs -│ └── {Group Name}/ # Per-group folders (created on registration) +│ └── {channel}_{group-name}/ # Per-group folders (created on registration) │ ├── CLAUDE.md # Group-specific memory │ ├── logs/ # Task logs for this group │ └── *.md # Files created by the agent @@ -203,9 +358,9 @@ export const TRIGGER_PATTERN = new RegExp(`^@${ASSISTANT_NAME}\\b`, 'i'); Groups can have additional directories mounted via `containerConfig` in the SQLite `registered_groups` table (stored as JSON in the `container_config` column). Example registration: ```typescript -registerGroup("1234567890@g.us", { +setRegisteredGroup("1234567890@g.us", { name: "Dev Team", - folder: "dev-team", + folder: "whatsapp_dev-team", trigger: "@Andy", added_at: new Date().toISOString(), containerConfig: { @@ -221,6 +376,8 @@ registerGroup("1234567890@g.us", { }); ``` +Folder names follow the convention `{channel}_{group-name}` (e.g., `whatsapp_family-chat`, `telegram_dev-team`). The main group has `isMain: true` set during registration. + Additional mounts appear at `/workspace/extra/{containerPath}` inside the container. 
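As an illustration of how an extra mount entry could translate into container run arguments (the `hostPath`/`containerPath`/`readonly` field names are assumptions for this sketch, not the actual `containerConfig` schema):

```typescript
// Hypothetical sketch: map extra-mount entries to container runtime flags.
// Field names are assumed; the real containerConfig schema may differ.
interface ExtraMount {
  hostPath: string;
  containerPath: string; // appears under /workspace/extra/ in the container
  readonly?: boolean;
}

function mountArgs(mounts: ExtraMount[]): string[] {
  return mounts.flatMap((m) => {
    const target = `/workspace/extra/${m.containerPath}`;
    return m.readonly
      // readonly needs the long --mount form; ':ro' may not work on all runtimes
      ? ['--mount', `type=bind,source=${m.hostPath},target=${target},readonly`]
      : ['-v', `${m.hostPath}:${target}`];
  });
}
```

This also shows why the mount syntax note below matters: the read-write and readonly cases use different flag forms.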
**Mount syntax note:** Read-write mounts use `-v host:container`, but readonly mounts require `--mount "type=bind,source=...,target=...,readonly"` (the `:ro` suffix may not work on all runtimes). @@ -314,10 +471,10 @@ Sessions enable conversation continuity - Claude remembers what you talked about ### Incoming Message Flow ``` -1. User sends WhatsApp message +1. User sends a message via any connected channel │ ▼ -2. Baileys receives message via WhatsApp Web protocol +2. Channel receives message (e.g. Baileys for WhatsApp, Bot API for Telegram) │ ▼ 3. Message stored in SQLite (store/messages.db) @@ -349,7 +506,7 @@ Sessions enable conversation continuity - Claude remembers what you talked about └── Uses tools as needed (search, email, etc.) │ ▼ -9. Router prefixes response with assistant name and sends via WhatsApp +9. Router prefixes response with assistant name and sends via the owning channel │ ▼ 10. Router updates last agent timestamp and saves session ID @@ -473,7 +630,7 @@ The `nanoclaw` MCP server is created dynamically per agent call with the current | `pause_task` | Pause a task | | `resume_task` | Resume a paused task | | `cancel_task` | Delete a task | -| `send_message` | Send a WhatsApp message to the group | +| `send_message` | Send a message to the group via its channel | --- @@ -487,7 +644,8 @@ When NanoClaw starts, it: 1. **Ensures container runtime is running** - Automatically starts it if needed; kills orphaned NanoClaw containers from previous runs 2. Initializes the SQLite database (migrates from JSON files if they exist) 3. Loads state from SQLite (registered groups, sessions, router state) -4. Connects to WhatsApp (on `connection.open`): +4. **Connects channels** — loops through registered channels, instantiates those with credentials, calls `connect()` on each +5. 
Once at least one channel is connected: - Starts the scheduler loop - Starts the IPC watcher for container messages - Sets up the per-group queue with `processGroupMessages` diff --git a/docs/docker-sandboxes.md b/docs/docker-sandboxes.md new file mode 100644 index 0000000..e887bad --- /dev/null +++ b/docs/docker-sandboxes.md @@ -0,0 +1,359 @@ +# Running NanoClaw in Docker Sandboxes (Manual Setup) + +This guide walks through setting up NanoClaw inside a [Docker Sandbox](https://docs.docker.com/ai/sandboxes/) from scratch — no install script, no pre-built fork. You'll clone the upstream repo, apply the necessary patches, and have agents running in full hypervisor-level isolation. + +## Architecture + +``` +Host (macOS / Windows WSL) +└── Docker Sandbox (micro VM with isolated kernel) + ├── NanoClaw process (Node.js) + │ ├── Channel adapters (WhatsApp, Telegram, etc.) + │ └── Container spawner → nested Docker daemon + └── Docker-in-Docker + └── nanoclaw-agent containers + └── Claude Agent SDK +``` + +Each agent runs in its own container, inside a micro VM that is fully isolated from your host. Two layers of isolation: per-agent containers + the VM boundary. + +The sandbox provides a MITM proxy at `host.docker.internal:3128` that handles network access and injects your Anthropic API key automatically. + +> **Note:** This guide is based on a validated setup running on macOS (Apple Silicon) with WhatsApp. Other channels (Telegram, Slack, etc.) and environments (Windows WSL) may require additional proxy patches for their specific HTTP/WebSocket clients. The core patches (container runner, credential proxy, Dockerfile) apply universally — channel-specific proxy configuration varies. 
+ +## Prerequisites + +- **Docker Desktop v4.40+** with Sandbox support +- **Anthropic API key** (the sandbox proxy manages injection) +- For **Telegram**: a bot token from [@BotFather](https://t.me/BotFather) and your chat ID +- For **WhatsApp**: a phone with WhatsApp installed + +Verify sandbox support: +```bash +docker sandbox version +``` + +## Step 1: Create the Sandbox + +On your host machine: + +```bash +# Create a workspace directory +mkdir -p ~/nanoclaw-workspace + +# Create a shell sandbox with the workspace mounted +docker sandbox create shell ~/nanoclaw-workspace +``` + +If you're using WhatsApp, configure proxy bypass so WhatsApp's Noise protocol isn't MITM-inspected: + +```bash +docker sandbox network proxy shell-nanoclaw-workspace \ + --bypass-host web.whatsapp.com \ + --bypass-host "*.whatsapp.com" \ + --bypass-host "*.whatsapp.net" +``` + +Telegram does not need proxy bypass. + +Enter the sandbox: +```bash +docker sandbox run shell-nanoclaw-workspace +``` + +## Step 2: Install Prerequisites + +Inside the sandbox: + +```bash +sudo apt-get update && sudo apt-get install -y build-essential python3 +npm config set strict-ssl false +``` + +## Step 3: Clone and Install NanoClaw + +NanoClaw must live inside the workspace directory — Docker-in-Docker can only bind-mount from the shared workspace path. + +```bash +# Clone to home first (virtiofs can corrupt git pack files during clone) +cd ~ +git clone https://github.com/qwibitai/nanoclaw.git + +# Replace with YOUR workspace path (the host path you passed to `docker sandbox create`) +WORKSPACE=/Users/you/nanoclaw-workspace + +# Move into workspace so DinD mounts work +mv nanoclaw "$WORKSPACE/nanoclaw" +cd "$WORKSPACE/nanoclaw" + +# Install dependencies +npm install +npm install https-proxy-agent +``` + +## Step 4: Apply Proxy and Sandbox Patches + +NanoClaw needs several patches to work inside a Docker Sandbox. These handle proxy routing, CA certificates, and Docker-in-Docker mount restrictions. + +### 4a. 
Dockerfile — proxy args for container image build + +`npm install` inside `docker build` fails with `SELF_SIGNED_CERT_IN_CHAIN` because the sandbox's MITM proxy presents its own certificate. Add proxy build args to `container/Dockerfile`: + +Add these lines after the `FROM` line: + +```dockerfile +# Accept proxy build args +ARG http_proxy +ARG https_proxy +ARG no_proxy +ARG NODE_EXTRA_CA_CERTS +ARG npm_config_strict_ssl=true +RUN npm config set strict-ssl ${npm_config_strict_ssl} +``` + +And after the `RUN npm install` line: + +```dockerfile +RUN npm config set strict-ssl true +``` + +### 4b. Build script — forward proxy args + +Patch `container/build.sh` to pass proxy env vars to `docker build`: + +Add these `--build-arg` flags to the `docker build` command: + +```bash +--build-arg http_proxy="${http_proxy:-$HTTP_PROXY}" \ +--build-arg https_proxy="${https_proxy:-$HTTPS_PROXY}" \ +--build-arg no_proxy="${no_proxy:-$NO_PROXY}" \ +--build-arg npm_config_strict_ssl=false \ +``` + +### 4c. Container runner — proxy forwarding, CA cert mount, /dev/null fix + +Three changes to `src/container-runner.ts`: + +**Replace `/dev/null` shadow mount.** The sandbox rejects `/dev/null` bind mounts. Find where `.env` is shadow-mounted to `/dev/null` and replace it with an empty file: + +```typescript +// Create an empty file to shadow .env (Docker Sandbox rejects /dev/null mounts) +const emptyEnvPath = path.join(DATA_DIR, 'empty-env'); +if (!fs.existsSync(emptyEnvPath)) fs.writeFileSync(emptyEnvPath, ''); +// Use emptyEnvPath instead of '/dev/null' in the mount +``` + +**Forward proxy env vars** to spawned agent containers. Add `-e` flags for `HTTP_PROXY`, `HTTPS_PROXY`, `NO_PROXY` and their lowercase variants. 
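A minimal sketch of that forwarding step (the exact args-array plumbing inside `src/container-runner.ts` will differ; this only shows the shape of the `-e` flags to add):

```typescript
// Sketch: build `-e` flags so spawned agent containers inherit the
// sandbox proxy settings. The helper name is illustrative.
const proxyVars = [
  'HTTP_PROXY', 'HTTPS_PROXY', 'NO_PROXY',
  'http_proxy', 'https_proxy', 'no_proxy',
];

function proxyEnvArgs(env: Record<string, string | undefined>): string[] {
  return proxyVars
    .filter((v) => env[v])           // only forward vars that are actually set
    .flatMap((v) => ['-e', `${v}=${env[v]}`]);
}
```

The resulting flags are appended to the container run command alongside the existing mounts and environment.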
+ +**Mount CA certificate.** If `NODE_EXTRA_CA_CERTS` or `SSL_CERT_FILE` is set, copy the cert into the project directory and mount it into agent containers: + +```typescript +const caCertSrc = process.env.NODE_EXTRA_CA_CERTS || process.env.SSL_CERT_FILE; +if (caCertSrc) { + const certDir = path.join(DATA_DIR, 'ca-cert'); + fs.mkdirSync(certDir, { recursive: true }); + fs.copyFileSync(caCertSrc, path.join(certDir, 'proxy-ca.crt')); + // Mount: certDir -> /workspace/ca-cert (read-only) + // Set NODE_EXTRA_CA_CERTS=/workspace/ca-cert/proxy-ca.crt in the container +} +``` + +### 4d. Container runtime — prevent self-termination + +In `src/container-runtime.ts`, the `cleanupOrphans()` function matches containers by the `nanoclaw-` prefix. Inside a sandbox, the sandbox container itself may match (e.g., `nanoclaw-docker-sandbox`). Filter out the current hostname: + +```typescript +// In cleanupOrphans(), filter out os.hostname() from the list of containers to stop +``` + +### 4e. Credential proxy — route through MITM proxy + +In `src/credential-proxy.ts`, upstream API requests need to go through the sandbox proxy. Add `HttpsProxyAgent` to outbound requests: + +```typescript +import { HttpsProxyAgent } from 'https-proxy-agent'; + +const proxyUrl = process.env.HTTPS_PROXY || process.env.https_proxy; +const upstreamAgent = proxyUrl ? new HttpsProxyAgent(proxyUrl) : undefined; +// Pass upstreamAgent to https.request() options +``` + +### 4f. Setup script — proxy build args + +Patch `setup/container.ts` to pass the same proxy `--build-arg` flags as `build.sh` (Step 4b). 
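The orphan-cleanup fix from step 4d can be sketched as follows. This assumes `cleanupOrphans()` works from a list of container names; `filterOrphans` is a hypothetical helper, not the actual NanoClaw code.

```typescript
import os from 'node:os';

// Keep only nanoclaw-prefixed containers, but drop any entry matching this
// process's hostname, per step 4d — inside a sandbox, the sandbox container
// itself may otherwise match the prefix and be stopped.
function filterOrphans(names: string[], self: string = os.hostname()): string[] {
  return names.filter((name) => name.startsWith('nanoclaw-') && name !== self);
}
```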
+ +## Step 5: Build + +```bash +npm run build +bash container/build.sh +``` + +## Step 6: Add a Channel + +### Telegram + +```bash +# Apply the Telegram skill +npx tsx scripts/apply-skill.ts .claude/skills/add-telegram + +# Rebuild after applying the skill +npm run build + +# Configure .env +cat > .env << EOF +TELEGRAM_BOT_TOKEN= +ASSISTANT_NAME=nanoclaw +ANTHROPIC_API_KEY=proxy-managed +EOF +mkdir -p data/env && cp .env data/env/env + +# Register your chat +npx tsx setup/index.ts --step register \ + --jid "tg:" \ + --name "My Chat" \ + --trigger "@nanoclaw" \ + --folder "telegram_main" \ + --channel telegram \ + --assistant-name "nanoclaw" \ + --is-main \ + --no-trigger-required +``` + +**To find your chat ID:** Send any message to your bot, then: +```bash +curl -s --proxy $HTTPS_PROXY "https://api.telegram.org/bot/getUpdates" | python3 -m json.tool +``` + +**Telegram in groups:** Disable Group Privacy in @BotFather (`/mybots` > Bot Settings > Group Privacy > Turn off), then remove and re-add the bot. + +**Important:** If the Telegram skill creates `src/channels/telegram.ts`, you'll need to patch it for proxy support. Add an `HttpsProxyAgent` and pass it to grammy's `Bot` constructor via `baseFetchConfig.agent`. Then rebuild. + +### WhatsApp + +Make sure you configured proxy bypass in [Step 1](#step-1-create-the-sandbox) first. 
+ +```bash +# Apply the WhatsApp skill +npx tsx scripts/apply-skill.ts .claude/skills/add-whatsapp + +# Rebuild +npm run build + +# Configure .env +cat > .env << EOF +ASSISTANT_NAME=nanoclaw +ANTHROPIC_API_KEY=proxy-managed +EOF +mkdir -p data/env && cp .env data/env/env + +# Authenticate (choose one): + +# QR code — scan with WhatsApp camera: +npx tsx src/whatsapp-auth.ts + +# OR pairing code — enter code in WhatsApp > Linked Devices > Link with phone number: +npx tsx src/whatsapp-auth.ts --pairing-code --phone + +# Register your chat (JID = your phone number + @s.whatsapp.net) +npx tsx setup/index.ts --step register \ + --jid "@s.whatsapp.net" \ + --name "My Chat" \ + --trigger "@nanoclaw" \ + --folder "whatsapp_main" \ + --channel whatsapp \ + --assistant-name "nanoclaw" \ + --is-main \ + --no-trigger-required +``` + +**Important:** The WhatsApp skill files (`src/channels/whatsapp.ts` and `src/whatsapp-auth.ts`) also need proxy patches — add `HttpsProxyAgent` for WebSocket connections and a proxy-aware version fetch. Then rebuild. + +### Both Channels + +Apply both skills, patch both for proxy support, combine the `.env` variables, and register each chat separately. + +## Step 7: Run + +```bash +npm start +``` + +You don't need to set `ANTHROPIC_API_KEY` manually. The sandbox proxy intercepts requests and replaces `proxy-managed` with your real key automatically. + +## Networking Details + +### How the proxy works + +All traffic from the sandbox routes through the host proxy at `host.docker.internal:3128`: + +``` +Agent container → DinD bridge → Sandbox VM → host.docker.internal:3128 → Host proxy → api.anthropic.com +``` + +**"Bypass" does not mean traffic skips the proxy.** It means the proxy passes traffic through without MITM inspection. Node.js doesn't automatically use `HTTP_PROXY` env vars — you need explicit `HttpsProxyAgent` configuration in every HTTP/WebSocket client. 
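Because Node.js ignores the proxy env vars by default, every patched client should resolve the proxy URL the same way before constructing its agent. A minimal sketch of that shared pattern (`resolveProxyUrl` is a hypothetical helper name):

```typescript
// Resolve the proxy URL once; each HTTP/WebSocket client then receives an
// explicit agent, e.g. `new HttpsProxyAgent(proxyUrl)` from https-proxy-agent.
function resolveProxyUrl(env: Record<string, string | undefined>): string | undefined {
  return env.HTTPS_PROXY ?? env.https_proxy ?? env.HTTP_PROXY ?? env.http_proxy;
}
```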
+ +### Shared paths for DinD mounts + +Only the workspace directory is available for Docker-in-Docker bind mounts. Paths outside the workspace fail with "path not shared": +- `/dev/null` → replace with an empty file in the project dir +- `/usr/local/share/ca-certificates/` → copy cert to project dir +- `/home/agent/` → clone to workspace instead + +### Git clone and virtiofs + +The workspace is mounted via virtiofs. Git's pack file handling can corrupt over virtiofs during clone. Workaround: clone to `/home/agent` first, then `mv` into the workspace. + +## Troubleshooting + +### npm install fails with SELF_SIGNED_CERT_IN_CHAIN +```bash +npm config set strict-ssl false +``` + +### Container build fails with proxy errors +```bash +docker build \ + --build-arg http_proxy=$http_proxy \ + --build-arg https_proxy=$https_proxy \ + -t nanoclaw-agent:latest container/ +``` + +### Agent containers fail with "path not shared" +All bind-mounted paths must be under the workspace directory. Check: +- Is NanoClaw cloned into the workspace? (not `/home/agent/`) +- Is the CA cert copied to the project root? +- Has the empty `.env` shadow file been created? + +### Agent containers can't reach Anthropic API +Verify proxy env vars are forwarded to agent containers. Check container logs for `HTTP_PROXY=http://host.docker.internal:3128`. + +### WhatsApp error 405 +The version fetch is returning a stale version. Make sure the proxy-aware `fetchWaVersionViaProxy` patch is applied — it fetches `sw.js` through `HttpsProxyAgent` and parses `client_revision`. + +### WhatsApp "Connection failed" immediately +Proxy bypass not configured. From the **host**, run: +```bash +docker sandbox network proxy \ + --bypass-host web.whatsapp.com \ + --bypass-host "*.whatsapp.com" \ + --bypass-host "*.whatsapp.net" +``` + +### Telegram bot doesn't receive messages +1. Check the grammy proxy patch is applied (look for `HttpsProxyAgent` in `src/channels/telegram.ts`) +2. 
Check Group Privacy is disabled in @BotFather if using in groups + +### Git clone fails with "inflate: data stream error" +Clone to a non-workspace path first, then move: +```bash +cd ~ && git clone https://github.com/qwibitai/nanoclaw.git && mv nanoclaw /path/to/workspace/nanoclaw +``` + +### WhatsApp QR code doesn't display +Run the auth command interactively inside the sandbox (not piped through `docker sandbox exec`): +```bash +docker sandbox run shell-nanoclaw-workspace +# Then inside: +npx tsx src/whatsapp-auth.ts +``` diff --git a/docs/nanoclaw-architecture-final.md b/docs/nanoclaw-architecture-final.md deleted file mode 100644 index 103b38b..0000000 --- a/docs/nanoclaw-architecture-final.md +++ /dev/null @@ -1,1063 +0,0 @@ -# NanoClaw Skills Architecture - -## Core Principle - -Skills are self-contained, auditable packages that apply programmatically via standard git merge mechanics. Claude Code orchestrates the process — running git commands, reading skill manifests, and stepping in only when git can't resolve a conflict on its own. The system uses existing git features (`merge-file`, `rerere`, `apply`) rather than custom merge infrastructure. - -### The Three-Level Resolution Model - -Every operation in the system follows this escalation: - -1. **Git** — deterministic, programmatic. `git merge-file` merges, `git rerere` replays cached resolutions, structured operations apply without merging. No AI involved. This handles the vast majority of cases. -2. **Claude Code** — reads `SKILL.md`, `.intent.md`, migration guides, and `state.yaml` to understand context. Resolves conflicts that git can't handle programmatically. Caches the resolution via `git rerere` so it never needs to resolve the same conflict again. -3. **User** — Claude Code asks the user when it lacks context or intent. This happens when two features genuinely conflict at an application level (not just a text-level merge conflict) and a human decision is needed about desired behavior. 
- -The goal is that Level 1 handles everything on a mature, well-tested installation. Level 2 handles first-time conflicts and edge cases. Level 3 is rare and only for genuine ambiguity. - -**Important**: a clean merge (exit code 0) does not guarantee working code. Semantic conflicts — a renamed variable, a shifted reference, a changed function signature — can produce clean text merges that break at runtime. **Tests must run after every operation**, regardless of whether the merge was clean. A clean merge with failing tests escalates to Level 2. - -### Safe Operations via Backup/Restore - -Many users clone the repo without forking, don't commit their changes, and don't think of themselves as git users. The system must work safely for them without requiring any git knowledge. - -Before any operation, the system copies all files that will be modified to `.nanoclaw/backup/`. On success, the backup is deleted. On failure, the backup is restored. This provides rollback safety regardless of whether the user commits, pushes, or understands git. - ---- - -## 1. The Shared Base - -`.nanoclaw/base/` holds the clean core — the original codebase before any skills or customizations were applied. This is the stable common ancestor for all three-way merges, and it only changes on core updates. - -- `git merge-file` uses the base to compute two diffs: what the user changed (current vs base) and what the skill wants to change (base vs skill's modified file), then combines both -- The base enables drift detection: if a file's hash differs from its base hash, something has been modified (skills, user customizations, or both) -- Each skill's `modify/` files contain the full file as it should look with that skill applied (including any prerequisite skill changes), all authored against the same clean core base - -On a **fresh codebase**, the user's files are identical to the base. 
This means `git merge-file` always exits cleanly for the first skill — the merge trivially produces the skill's modified version. No special-casing needed. - -When multiple skills modify the same file, the three-way merge handles the overlap naturally. If Telegram and Discord both modify `src/index.ts`, and both skill files include the Telegram changes, those common changes merge cleanly against the base. The result is the base + all skill changes + user customizations. - ---- - -## 2. Two Types of Changes: Code Merges vs. Structured Operations - -Not all files should be merged as text. The system distinguishes between **code files** (merged via `git merge-file`) and **structured data** (modified via deterministic operations). - -### Code Files (Three-Way Merge) - -Source code files where skills weave in logic — route handlers, middleware, business logic. These are merged using `git merge-file` against the shared base. The skill carries a full modified version of the file. - -### Structured Data (Deterministic Operations) - -Files like `package.json`, `docker-compose.yml`, `.env.example`, and generated configs are not code you merge — they're structured data you aggregate. Multiple skills adding npm dependencies to `package.json` shouldn't require a three-way text merge. Instead, skills declare their structured requirements in the manifest, and the system applies them programmatically. - -**Structured operations are implicit.** If a skill declares `npm_dependencies`, the system handles dependency installation automatically. There is no need for the skill author to add `npm install` to `post_apply`. When multiple skills are applied in sequence, the system batches structured operations: merge all dependency declarations first, write `package.json` once, run `npm install` once at the end. 
- -```yaml -# In manifest.yaml -structured: - npm_dependencies: - whatsapp-web.js: "^2.1.0" - qrcode-terminal: "^0.12.0" - env_additions: - - WHATSAPP_TOKEN - - WHATSAPP_VERIFY_TOKEN - - WHATSAPP_PHONE_ID - docker_compose_services: - whatsapp-redis: - image: redis:alpine - ports: ["6380:6379"] -``` - -### Structured Operation Conflicts - -Structured operations eliminate text merge conflicts but can still conflict at a semantic level: - -- **NPM version conflicts**: two skills request incompatible semver ranges for the same package -- **Port collisions**: two docker-compose services claim the same host port -- **Service name collisions**: two skills define a service with the same name -- **Env var duplicates**: two skills declare the same variable with different expectations - -The resolution policy: - -1. **Automatic where possible**: widen semver ranges to find a compatible version, detect and flag port/name collisions -2. **Level 2 (Claude Code)**: if automatic resolution fails, Claude proposes options based on skill intents -3. **Level 3 (User)**: if it's a genuine product choice (which Redis instance should get port 6379?), ask the user - -Structured operation conflicts are included in the CI overlap graph alongside code file overlaps, so the maintainer test matrix catches these before users encounter them. - -### State Records Structured Outcomes - -`state.yaml` records not just the declared dependencies but the resolved outcomes — actual installed versions, resolved port assignments, final env var list. This makes structured operations replayable and auditable. - -### Deterministic Serialization - -All structured output (YAML, JSON) uses stable serialization: sorted keys, consistent quoting, normalized whitespace. This prevents noisy diffs in git history from non-functional formatting changes. - ---- - -## 3. Skill Package Structure - -A skill contains only the files it adds or modifies. 
For modified code files, the skill carries the **full modified file** (the clean core with the skill's changes applied). - -``` -skills/ - add-whatsapp/ - SKILL.md # Context, intent, what this skill does and why - manifest.yaml # Metadata, dependencies, env vars, post-apply steps - tests/ # Integration tests for this skill - whatsapp.test.ts - add/ # New files — copied directly - src/channels/whatsapp.ts - src/channels/whatsapp.config.ts - modify/ # Modified code files — merged via git merge-file - src/ - server.ts # Full file: clean core + whatsapp changes - server.ts.intent.md # "Adds WhatsApp webhook route and message handler" - config.ts # Full file: clean core + whatsapp config options - config.ts.intent.md # "Adds WhatsApp channel configuration block" -``` - -### Why Full Modified Files - -- `git merge-file` requires three full files — no intermediate reconstruction step -- Git's three-way merge uses context matching, so it works even if the user has moved code around — unlike line-number-based diffs that break immediately -- Auditable: `diff .nanoclaw/base/src/server.ts skills/add-whatsapp/modify/src/server.ts` shows exactly what the skill changes -- Deterministic: same three inputs always produce the same merge result -- Size is negligible since NanoClaw's core files are small - -### Intent Files - -Each modified code file has a corresponding `.intent.md` with structured headings: - -```markdown -# Intent: server.ts modifications - -## What this skill adds -Adds WhatsApp webhook route and message handler to the Express server. 
- -## Key sections -- Route registration at `/webhook/whatsapp` (POST and GET for verification) -- Message handler middleware between auth and response pipeline - -## Invariants -- Must not interfere with other channel webhook routes -- Auth middleware must run before the WhatsApp handler -- Error handling must propagate to the global error handler - -## Must-keep sections -- The webhook verification flow (GET route) is required by WhatsApp Cloud API -``` - -Structured headings (What, Key sections, Invariants, Must-keep) give Claude Code specific guidance during conflict resolution instead of requiring it to infer from unstructured text. - -### Manifest Format - -```yaml -# --- Required fields --- -skill: whatsapp -version: 1.2.0 -description: "WhatsApp Business API integration via Cloud API" -core_version: 0.1.0 # The core version this skill was authored against - -# Files this skill adds -adds: - - src/channels/whatsapp.ts - - src/channels/whatsapp.config.ts - -# Code files this skill modifies (three-way merge) -modifies: - - src/server.ts - - src/config.ts - -# File operations (renames, deletes, moves — see Section 5) -file_ops: [] - -# Structured operations (deterministic, no merge — implicit handling) -structured: - npm_dependencies: - whatsapp-web.js: "^2.1.0" - qrcode-terminal: "^0.12.0" - env_additions: - - WHATSAPP_TOKEN - - WHATSAPP_VERIFY_TOKEN - - WHATSAPP_PHONE_ID - -# Skill relationships -conflicts: [] # Skills that cannot coexist without agent resolution -depends: [] # Skills that must be applied first - -# Test command — runs after apply to validate the skill works -test: "npx vitest run src/channels/whatsapp.test.ts" - -# --- Future fields (not yet implemented in v0.1) --- -# author: nanoclaw-team -# license: MIT -# min_skills_system_version: "0.1.0" -# tested_with: [telegram@1.0.0] -# post_apply: [] -``` - -Note: `post_apply` is only for operations that can't be expressed as structured declarations. 
Dependency installation is **never** in `post_apply` — it's handled implicitly by the structured operations system. - ---- - -## 4. Skills, Customization, and Layering - -### One Skill, One Happy Path - -A skill implements **one way of doing something — the reasonable default that covers 80% of users.** `add-telegram` gives you a clean, solid Telegram integration. It doesn't try to anticipate every use case with predefined configuration options and modes. - -### Customization Is Just More Patching - -The entire system is built around applying transformations to a codebase. Customizing a skill after applying it is no different from any other modification: - -- **Apply the skill** — get the standard Telegram integration -- **Modify from there** — using the customize flow (tracked patch), direct editing (detected by hash tracking), or by applying additional skills that build on top - -### Layered Skills - -Skills can build on other skills: - -``` -add-telegram # Core Telegram integration (happy path) - ├── telegram-reactions # Adds reaction handling (depends: [telegram]) - ├── telegram-multi-bot # Multiple bot instances (depends: [telegram]) - └── telegram-filters # Custom message filtering (depends: [telegram]) -``` - -Each layer is a separate skill with its own `SKILL.md`, manifest (with `depends: [telegram]`), tests, and modified files. The user composes exactly what they want by stacking skills. - -### Custom Skill Application - -A user can apply a skill with their own modifications in a single step: - -1. Apply the skill normally (programmatic merge) -2. Claude Code asks if the user wants to make any modifications -3. User describes what they want different -4. Claude Code makes the modifications on top of the freshly applied skill -5. 
The modifications are recorded as a custom patch tied to this skill - -Recorded in `state.yaml`: - -```yaml -applied_skills: - - skill: telegram - version: 1.0.0 - custom_patch: .nanoclaw/custom/telegram-group-only.patch - custom_patch_description: "Restrict bot responses to group chats only" -``` - -On replay, the skill applies programmatically, then the custom patch applies on top. - ---- - -## 5. File Operations: Renames, Deletes, Moves - -Core updates and some skills will need to rename, delete, or move files. These are not text merges — they're structural changes handled as explicit scripted operations. - -### Declaration in Manifest - -```yaml -file_ops: - - type: rename - from: src/server.ts - to: src/app.ts - - type: delete - path: src/deprecated/old-handler.ts - - type: move - from: src/utils/helpers.ts - to: src/lib/helpers.ts -``` - -### Execution Order - -File operations run **before** code merges, because merges need to target the correct file paths: - -1. Pre-flight checks (state validation, core version, dependencies, conflicts, drift detection) -2. Acquire operation lock -3. **Backup** all files that will be touched -4. **File operations** (renames, deletes, moves) -5. Copy new files from `add/` -6. Three-way merge modified code files -7. Conflict resolution (rerere auto-resolve, or return with `backupPending: true`) -8. Apply structured operations (npm deps, env vars, docker-compose — batched) -9. Run `npm install` (once, if any structured npm_dependencies exist) -10. Update state (record skill application, file hashes, structured outcomes) -11. Run tests (if `manifest.test` defined; rollback state + backup on failure) -12. Clean up (delete backup on success, release lock) - -### Path Remapping for Skills - -When the core renames a file (e.g., `server.ts` → `app.ts`), skills authored against the old path still reference `server.ts` in their `modifies` and `modify/` directories. 
**Skill packages are never mutated on the user's machine.** - -Instead, core updates ship a **compatibility map**: - -```yaml -# In the update package -path_remap: - src/server.ts: src/app.ts - src/old-config.ts: src/config/main.ts -``` - -The system resolves paths at apply time: if a skill targets `src/server.ts` and the remap says it's now `src/app.ts`, the merge runs against `src/app.ts`. The remap is recorded in `state.yaml` so future operations are consistent. - -### Safety Checks - -Before executing file operations: - -- Verify the source file exists -- For deletes: warn if the file has modifications beyond the base (user or skill changes would be lost) - ---- - -## 6. The Apply Flow - -When a user runs the skill's slash command in Claude Code: - -### Step 1: Pre-flight Checks - -- Core version compatibility -- Dependencies satisfied -- No unresolvable conflicts with applied skills -- Check for untracked changes (see Section 9) - -### Step 2: Backup - -Copy all files that will be modified to `.nanoclaw/backup/`. If the operation fails at any point, restore from backup. - -### Step 3: File Operations - -Execute renames, deletes, or moves with safety checks. Apply path remapping if needed. - -### Step 4: Apply New Files - -```bash -cp skills/add-whatsapp/add/src/channels/whatsapp.ts src/channels/whatsapp.ts -``` - -### Step 5: Merge Modified Code Files - -For each file in `modifies` (with path remapping applied): - -```bash -git merge-file src/server.ts .nanoclaw/base/src/server.ts skills/add-whatsapp/modify/src/server.ts -``` - -- **Exit code 0**: clean merge, move on -- **Exit code > 0**: conflict markers in file, proceed to resolution - -### Step 6: Conflict Resolution (Three-Level) - -1. **Check shared resolution cache** (`.nanoclaw/resolutions/`) — load into local `git rerere` if a verified resolution exists for this skill combination. **Only apply if input hashes match exactly** (base hash + current hash + skill modified hash). -2. 
**`git rerere`** — checks local cache. If found, applied automatically. Done. -3. **Claude Code** — reads conflict markers + `SKILL.md` + `.intent.md` (Invariants, Must-keep sections) of current and previously applied skills. Resolves. `git rerere` caches the resolution. -4. **User** — if Claude Code cannot determine intent, it asks the user for the desired behavior. - -### Step 7: Apply Structured Operations - -Collect all structured declarations (from this skill and any previously applied skills if batching). Apply deterministically: - -- Merge npm dependencies into `package.json` (check for version conflicts) -- Append env vars to `.env.example` -- Merge docker-compose services (check for port/name collisions) -- Run `npm install` **once** at the end -- Record resolved outcomes in state - -### Step 8: Post-Apply and Validate - -1. Run any `post_apply` commands (non-structured operations only) -2. Update `.nanoclaw/state.yaml` — skill record, file hashes (base, skill, merged per file), structured outcomes -3. **Run skill tests** — mandatory, even if all merges were clean -4. If tests fail on a clean merge → escalate to Level 2 (Claude Code diagnoses the semantic conflict) - -### Step 9: Clean Up - -If tests pass, delete `.nanoclaw/backup/`. The operation is complete. - -If tests fail and Level 2 can't resolve, restore from `.nanoclaw/backup/` and report the failure. - ---- - -## 7. Shared Resolution Cache - -### The Problem - -`git rerere` is local by default. But NanoClaw has thousands of users applying the same skill combinations. Every user hitting the same conflict and waiting for Claude Code to resolve it is wasteful. - -### The Solution - -NanoClaw maintains a verified resolution cache in `.nanoclaw/resolutions/` that ships with the project. This is the shared artifact — **not** `.git/rr-cache/`, which stays local. 
- -``` -.nanoclaw/ - resolutions/ - whatsapp@1.2.0+telegram@1.0.0/ - src/ - server.ts.resolution - server.ts.preimage - config.ts.resolution - config.ts.preimage - meta.yaml -``` - -### Hash Enforcement - -A cached resolution is **only applied if input hashes match exactly**: - -```yaml -# meta.yaml -skills: - - whatsapp@1.2.0 - - telegram@1.0.0 -apply_order: [whatsapp, telegram] -core_version: 0.6.0 -resolved_at: 2026-02-15T10:00:00Z -tested: true -test_passed: true -resolution_source: maintainer -input_hashes: - base: "aaa..." - current_after_whatsapp: "bbb..." - telegram_modified: "ccc..." -output_hash: "ddd..." -``` - -If any input hash doesn't match, the cached resolution is skipped and the system proceeds to Level 2. - -### Validated: rerere + merge-file Require an Index Adapter - -`git rerere` does **not** natively recognize `git merge-file` output. This was validated in Phase 0 testing (`tests/phase0-merge-rerere.sh`, 33 tests). - -The issue is not about conflict marker format — `merge-file` uses filenames as labels (`<<<<<<< current.ts`) while `git merge` uses branch names (`<<<<<<< HEAD`), but rerere strips all labels and hashes only the conflict body. The formats are compatible. - -The actual issue: **rerere requires unmerged index entries** (stages 1/2/3) to detect that a merge conflict exists. A normal `git merge` creates these automatically. `git merge-file` operates on the filesystem only and does not touch the index. - -#### The Adapter - -After `git merge-file` produces a conflict, the system must create the index state that rerere expects: - -```bash -# 1. Run the merge (produces conflict markers in the working tree) -git merge-file current.ts .nanoclaw/base/src/file.ts skills/add-whatsapp/modify/src/file.ts - -# 2. 
If exit code > 0 (conflict), set up rerere adapter: - -# Create blob objects for the three versions -base_hash=$(git hash-object -w .nanoclaw/base/src/file.ts) -ours_hash=$(git hash-object -w skills/previous-skill/modify/src/file.ts) # or the pre-merge current -theirs_hash=$(git hash-object -w skills/add-whatsapp/modify/src/file.ts) - -# Create unmerged index entries at stages 1 (base), 2 (ours), 3 (theirs) -printf '100644 %s 1\tsrc/file.ts\0' "$base_hash" | git update-index --index-info -printf '100644 %s 2\tsrc/file.ts\0' "$ours_hash" | git update-index --index-info -printf '100644 %s 3\tsrc/file.ts\0' "$theirs_hash" | git update-index --index-info - -# Set merge state (rerere checks for MERGE_HEAD) -echo "$(git rev-parse HEAD)" > .git/MERGE_HEAD -echo "skill merge" > .git/MERGE_MSG - -# 3. Now rerere can see the conflict -git rerere # Records preimage, or auto-resolves from cache - -# 4. After resolution (manual or auto): -git add src/file.ts -git rerere # Records postimage (caches the resolution) - -# 5. Clean up merge state -rm .git/MERGE_HEAD .git/MERGE_MSG -git reset HEAD -``` - -#### Key Properties Validated - -- **Conflict body identity**: `merge-file` and `git merge` produce identical conflict bodies for the same inputs. Rerere hashes the body only, so resolutions learned from either source are interchangeable. -- **Hash determinism**: The same conflict always produces the same rerere hash. This is critical for the shared resolution cache. -- **Resolution portability**: Copying `preimage` and `postimage` files (plus the hash directory name) from one repo's `.git/rr-cache/` to another works. Rerere auto-resolves in the target repo. -- **Adjacent line sensitivity**: Changes within ~3 lines of each other are treated as a single conflict hunk by `merge-file`. Skills that modify the same area of a file will conflict even if they modify different lines. This is expected and handled by the resolution cache. 
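The hash enforcement from Section 7 can be sketched as a simple gate. This is hypothetical and uses simplified key names (the real `meta.yaml` records per-skill keys such as `current_after_whatsapp`):

```typescript
import { createHash } from 'node:crypto';

interface ResolutionMeta {
  input_hashes: { base: string; current: string; skill_modified: string };
}

const sha256 = (s: string) => createHash('sha256').update(s).digest('hex');

// A cached resolution is replayed only when all three merge inputs match
// the recorded hashes exactly; otherwise the system escalates to Level 2.
function canReplay(
  meta: ResolutionMeta,
  base: string,
  current: string,
  skillModified: string,
): boolean {
  return (
    meta.input_hashes.base === sha256(base) &&
    meta.input_hashes.current === sha256(current) &&
    meta.input_hashes.skill_modified === sha256(skillModified)
  );
}
```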
- -#### Implication: Git Repository Required - -The adapter requires `git hash-object`, `git update-index`, and `.git/rr-cache/`. This means the project directory must be a git repository for rerere caching to work. Users who download a zip (no `.git/`) lose resolution caching but not functionality — conflicts escalate directly to Level 2 (Claude Code resolves). The system should detect this case and skip rerere operations gracefully. - -### Maintainer Workflow - -When releasing a core update or new skill version: - -1. Fresh codebase at target core version -2. Apply each official skill individually — verify clean merge, run tests -3. Apply pairwise combinations **for skills that modify at least one common file or have overlapping structured operations** -4. Apply curated three-skill stacks based on popularity and high overlap -5. Resolve all conflicts (code and structured) -6. Record all resolutions with input hashes -7. Run full test suite for every combination -8. Ship verified resolutions with the release - -The bar: **a user with any common combination of official skills should never encounter an unresolved conflict.** - ---- - -## 8. State Tracking - -`.nanoclaw/state.yaml` records everything about the installation: - -```yaml -skills_system_version: "0.1.0" # Schema version — tooling checks this before any operation -core_version: 0.1.0 - -applied_skills: - - name: telegram - version: 1.0.0 - applied_at: 2026-02-16T22:47:02.139Z - file_hashes: - src/channels/telegram.ts: "f627b9cf..." - src/channels/telegram.test.ts: "400116769..." - src/config.ts: "9ae28d1f..." - src/index.ts: "46dbe495..." - src/routing.test.ts: "5e1aede9..." - structured_outcomes: - npm_dependencies: - grammy: "^1.39.3" - env_additions: - - TELEGRAM_BOT_TOKEN - - TELEGRAM_ONLY - test: "npx vitest run src/channels/telegram.test.ts" - - - name: discord - version: 1.0.0 - applied_at: 2026-02-17T17:29:37.821Z - file_hashes: - src/channels/discord.ts: "5d669123..." 
- src/channels/discord.test.ts: "19e1c6b9..." - src/config.ts: "a0a32df4..." - src/index.ts: "d61e3a9d..." - src/routing.test.ts: "edbacb00..." - structured_outcomes: - npm_dependencies: - discord.js: "^14.18.0" - env_additions: - - DISCORD_BOT_TOKEN - - DISCORD_ONLY - test: "npx vitest run src/channels/discord.test.ts" - -custom_modifications: - - description: "Added custom logging middleware" - applied_at: 2026-02-15T12:00:00Z - files_modified: - - src/server.ts - patch_file: .nanoclaw/custom/001-logging-middleware.patch -``` - -**v0.1 implementation notes:** -- `file_hashes` stores a single SHA-256 hash per file (the final merged result). Three-part hashes (base/skill_modified/merged) are planned for a future version to improve drift diagnosis. -- Applied skills use `name` as the key field (not `skill`), matching the TypeScript `AppliedSkill` interface. -- `structured_outcomes` stores the raw manifest values plus the `test` command. Resolved npm versions (actual installed versions vs semver ranges) are not yet tracked. -- Fields like `installed_at`, `last_updated`, `path_remap`, `rebased_at`, `core_version_at_apply`, `files_added`, and `files_modified` are planned for future versions. - ---- - -## 9. Untracked Changes - -If a user edits files directly, the system detects this via hash comparison. - -### When Detection Happens - -Before **any operation that modifies the codebase**: applying a skill, removing a skill, updating the core, replaying, or rebasing. - -### What Happens - -``` -Detected untracked changes to src/server.ts. -[1] Record these as a custom modification (recommended) -[2] Continue anyway (changes preserved, but not tracked for future replay) -[3] Abort -``` - -The system never blocks or loses work. Option 1 generates a patch and records it, making changes reproducible. Option 2 preserves the changes but they won't survive replay. 
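Detection itself is a straightforward hash comparison. A sketch, assuming the per-file hashes in `state.yaml` are SHA-256 digests of file contents (`detectDrift` is a hypothetical name):

```typescript
import { createHash } from 'node:crypto';

function sha256(content: string): string {
  return createHash('sha256').update(content).digest('hex');
}

// Compare each recorded hash from state.yaml against the file on disk;
// any mismatch (or missing file) is an untracked change.
function detectDrift(
  recorded: Record<string, string>, // file -> hash recorded in state.yaml
  current: Record<string, string>,  // file -> contents currently on disk
): string[] {
  return Object.keys(recorded).filter(
    (file) => sha256(current[file] ?? '') !== recorded[file],
  );
}
```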
- -### The Recovery Guarantee - -No matter how much a user modifies their codebase outside the system, the three-level model can always bring them back: - -1. **Git**: diff current files against base, identify what changed -2. **Claude Code**: read `state.yaml` to understand what skills were applied, compare against actual file state, identify discrepancies -3. **User**: Claude Code asks what they intended, what to keep, what to discard - -There is no unrecoverable state. - ---- - -## 10. Core Updates - -Core updates must be as programmatic as possible. The NanoClaw team is responsible for ensuring updates apply cleanly to common skill combinations. - -### Patches and Migrations - -Most core changes — bug fixes, performance improvements, new functionality — propagate automatically through the three-way merge. No special handling needed. - -**Breaking changes** — changed defaults, removed features, functionality moved to skills — require a **migration**. A migration is a skill that preserves the old behavior, authored against the new core. It's applied automatically during the update so the user's setup doesn't change. - -The maintainer's responsibility when making a breaking change: make the change in core, author a migration skill that reverts it, add the entry to `migrations.yaml`, test it. That's the cost of breaking changes. - -### `migrations.yaml` - -An append-only file in the repo root. Each entry records a breaking change and the skill that preserves the old behavior: - -```yaml -- since: 0.6.0 - skill: apple-containers@1.0.0 - description: "Preserves Apple Containers (default changed to Docker in 0.6)" - -- since: 0.7.0 - skill: add-whatsapp@2.0.0 - description: "Preserves WhatsApp (moved from core to skill in 0.7)" - -- since: 0.8.0 - skill: legacy-auth@1.0.0 - description: "Preserves legacy auth module (removed from core in 0.8)" -``` - -Migration skills are regular skills in the `skills/` directory. They have manifests, intent files, tests — everything. 
They're authored against the **new** core version: the modified file is the new core with the specific breaking change reverted, everything else (bug fixes, new features) identical to the new core. - -### How Migrations Work During Updates - -1. Three-way merge brings in everything from the new core — patches, breaking changes, all of it -2. Conflict resolution (normal) -3. Re-apply custom patches (normal) -4. **Update base to new core** -5. Filter `migrations.yaml` for entries where `since` > user's old `core_version` -6. **Apply each migration skill using the normal apply flow against the new base** -7. Record migration skills in `state.yaml` like any other skill -8. Run tests - -Step 6 is just the same apply function used for any skill. The migration skill merges against the new base: - -- **Base**: new core (e.g., v0.8 with Docker) -- **Current**: user's file after the update merge (new core + user's customizations preserved by the earlier merge) -- **Other**: migration skill's file (new core with Docker reverted to Apple, everything else identical) - -Three-way merge correctly keeps user's customizations, reverts the breaking change, and preserves all bug fixes. If there's a conflict, normal resolution: cache → Claude → user. - -For big version jumps (v0.5 → v0.8), all applicable migrations are applied in sequence. Migration skills are maintained against the latest core version, so they always compose correctly with the current codebase. - -### What the User Sees - -``` -Core updated: 0.5.0 → 0.8.0 - ✓ All patches applied - - Preserving your current setup: - + apple-containers@1.0.0 - + add-whatsapp@2.0.0 - + legacy-auth@1.0.0 - - Skill updates: - ✓ add-telegram 1.0.0 → 1.2.0 - - To accept new defaults: /remove-skill - ✓ All tests passing -``` - -No prompts, no choices during the update. The user's setup doesn't change. If they later want to accept a new default, they remove the migration skill. 
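The migration-filtering step (step 5) can be sketched in shell — a minimal illustration using `sort -V` for version comparison; the `migrations.yaml` entries come from the example above, and the exact parsing is an assumption:

```shell
set -eu
tmp=$(mktemp -d); cd "$tmp"

# migrations.yaml as shipped in the repo root (entries from the example above)
cat > migrations.yaml <<'EOF'
- since: 0.6.0
  skill: apple-containers@1.0.0
- since: 0.7.0
  skill: add-whatsapp@2.0.0
- since: 0.8.0
  skill: legacy-auth@1.0.0
EOF

old_core="0.6.0"   # the user's core_version from state.yaml before the update

# Pair up since/skill, keep entries where since > old_core
applicable=$(awk '/since:/ {s=$3} /skill:/ {print s, $2}' migrations.yaml |
  while read -r since skill; do
    lowest=$(printf '%s\n%s\n' "$old_core" "$since" | sort -V | head -n1)
    if [ "$since" != "$old_core" ] && [ "$lowest" = "$old_core" ]; then
      echo "$skill"
    fi
  done)
echo "$applicable"
```

Each skill printed here is then applied with the normal apply flow against the new base (step 6).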
- -### What the Core Team Ships With an Update - -``` -updates/ - 0.5.0-to-0.6.0/ - migration.md # What changed, why, and how it affects skills - files/ # The new core files - file_ops: # Any renames, deletes, moves - path_remap: # Compatibility map for old skill paths - resolutions/ # Pre-computed resolutions for official skills -``` - -Plus any new migration skills added to `skills/` and entries appended to `migrations.yaml`. - -### The Maintainer's Process - -1. **Make the core change** -2. **If it's a breaking change**: author a migration skill against the new core, add entry to `migrations.yaml` -3. **Write `migration.md`** — what changed, why, what skills might be affected -4. **Test every official skill individually** against the new core (including migration skills) -5. **Test pairwise combinations** for skills that share modified files or structured operations -6. **Test curated three-skill stacks** based on popularity and overlap -7. **Resolve all conflicts** -8. **Record all resolutions** with enforced input hashes -9. **Run full test suites** -10. **Ship everything** — migration guide, migration skills, file ops, path remap, resolutions - -The bar: **patches apply silently. Breaking changes are auto-preserved via migration skills. A user should never be surprised by a change to their working setup.** - -### Update Flow (Full) - -#### Step 1: Pre-flight - -- Check for untracked changes -- Read `state.yaml` -- Load shipped resolutions -- Parse `migrations.yaml`, filter for applicable migrations - -#### Step 2: Preview - -Before modifying anything, show the user what's coming. 
This uses only git commands — no files are opened or changed: - -```bash -# Compute common base -BASE=$(git merge-base HEAD upstream/$BRANCH) - -# Upstream commits since last sync -git log --oneline $BASE..upstream/$BRANCH - -# Files changed upstream -git diff --name-only $BASE..upstream/$BRANCH -``` - -Present a summary grouped by impact: - -``` -Update available: 0.5.0 → 0.8.0 (12 commits) - - Source: 4 files modified (server.ts, config.ts, ...) - Skills: 2 new skills added, 1 skill updated - Config: package.json, docker-compose.yml updated - - Migrations (auto-applied to preserve your setup): - + apple-containers@1.0.0 (container default changed to Docker) - + add-whatsapp@2.0.0 (WhatsApp moved from core to skill) - - Skill updates: - add-telegram 1.0.0 → 1.2.0 - - [1] Proceed with update - [2] Abort -``` - -If the user aborts, stop here. Nothing was modified. - -#### Step 3: Backup - -Copy all files that will be modified to `.nanoclaw/backup/`. - -#### Step 4: File Operations and Path Remap - -Apply renames, deletes, moves. Record path remap in state. - -#### Step 5: Three-Way Merge - -For each core file that changed: - -```bash -git merge-file src/server.ts .nanoclaw/base/src/server.ts updates/0.5.0-to-0.6.0/files/src/server.ts -``` - -#### Step 6: Conflict Resolution - -1. Shipped resolutions (hash-verified) → automatic -2. `git rerere` local cache → automatic -3. Claude Code with `migration.md` + skill intents → resolves -4. User → only for genuine ambiguity - -#### Step 7: Re-apply Custom Patches - -```bash -git apply --3way .nanoclaw/custom/001-logging-middleware.patch -``` - -Using `--3way` allows git to fall back to three-way merge when line numbers have drifted. If `--3way` fails, escalate to Level 2. - -#### Step 8: Update Base - -`.nanoclaw/base/` replaced with new clean core. This is the **only time** the base changes. 
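The Level-1 handoff in steps 5–6 turns on `git merge-file`'s exit status, which reports the number of conflicts found (0 means a clean merge). A minimal sketch of that decision — file names and contents are invented for illustration:

```shell
set -eu
tmp=$(mktemp -d); cd "$tmp"
git init -q

# Base (clean core), current (user's file), other (skill's version) — same line edited twice
printf 'port = 3000\nname = "core"\n' > base.conf
printf 'port = 4000\nname = "core"\n' > current.conf   # user changed the port
printf 'port = 5000\nname = "core"\n' > skill.conf     # skill also changed the port

# Level 1: git merge-file (-p writes the result to stdout instead of in place)
if git merge-file -p current.conf base.conf skill.conf > merged.conf; then
  echo "clean merge"
else
  echo "conflict -> escalate: shipped resolutions, git rerere, Claude Code, then user"
fi
```

A nonzero status leaves standard conflict markers in the output, which is exactly what the resolution levels (shipped resolutions → `git rerere` → Claude Code → user) consume.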
- -#### Step 9: Apply Migration Skills - -For each applicable migration (where `since` > old `core_version`), apply the migration skill using the normal apply flow against the new base. Record in `state.yaml`. - -#### Step 10: Re-apply Updated Skills - -Skills live in the repo and update alongside core files. After the update, compare the version in each skill's `manifest.yaml` on disk against the version recorded in `state.yaml`. - -For each skill where the on-disk version is newer than the recorded version: - -1. Re-apply the skill using the normal apply flow against the new base -2. The three-way merge brings in the skill's new changes while preserving user customizations -3. Re-apply any custom patches tied to the skill (`git apply --3way`) -4. Update the version in `state.yaml` - -Skills whose version hasn't changed are skipped — no action needed. - -If the user has a custom patch on a skill that changed significantly, the patch may conflict. Normal resolution: cache → Claude → user. - -#### Step 11: Re-run Structured Operations - -Recompute structured operations against the updated codebase to ensure consistency. - -#### Step 12: Validate - -- Run all skill tests — mandatory -- Compatibility report: - -``` -Core updated: 0.5.0 → 0.8.0 - ✓ All patches applied - - Migrations: - + apple-containers@1.0.0 (preserves container runtime) - + add-whatsapp@2.0.0 (WhatsApp moved to skill) - - Skill updates: - ✓ add-telegram 1.0.0 → 1.2.0 (new features applied) - ✓ custom/telegram-group-only — re-applied cleanly - - ✓ All tests passing -``` - -#### Step 13: Clean Up - -Delete `.nanoclaw/backup/`. - -### Progressive Core Slimming - -Migrations enable a clean path for slimming down the core over time. 

Each release can move more functionality to skills: - -- The breaking change removes the feature from core -- The migration skill preserves it for existing users -- New users start with a minimal core and add what they need -- Over time, `state.yaml` reflects exactly what each user is running - ---- - -## 11. Skill Removal (Uninstall) - -Removing a skill is not a reverse-patch operation. **Uninstall is a replay without the skill.** - -### How It Works - -1. Read `state.yaml` to get the full list of applied skills and custom modifications -2. Remove the target skill from the list -3. Backup the current codebase to `.nanoclaw/backup/` -4. **Replay from clean base** — apply each remaining skill in order, apply custom patches, using the resolution cache -5. Run all tests -6. If tests pass, delete backup and update `state.yaml` -7. If tests fail, restore from backup and report - -### Custom Patches Tied to the Removed Skill - -If the removed skill has a `custom_patch` in `state.yaml`, the user is warned: - -``` -Removing telegram will also discard custom patch: "Restrict bot responses to group chats only" -[1] Continue (discard custom patch) -[2] Abort -``` - ---- - -## 12. Rebase - -Flatten accumulated layers into a clean starting point. - -### What Rebase Does - -1. Takes the user's current actual files as the new reality -2. Updates `.nanoclaw/base/` to the current core version's clean files -3. For each applied skill, regenerates the modified file diffs against the new base -4. Updates `state.yaml` with `rebased_at` timestamp -5. Clears old custom patches (now baked in) -6. 
Clears stale resolution cache entries

### When to Rebase

- After a major core update
- When accumulated patches become unwieldy
- Before a significant new skill application
- Periodically as maintenance

### Tradeoffs

**Lose**: individual skill patch history, ability to cleanly remove a single old skill, old custom patches as separate artifacts

**Gain**: clean base, simpler future merges, reduced cache size, fresh starting point

---

## 13. Replay

Given `state.yaml`, reproduce the exact installation on a fresh machine with no AI intervention (assuming all resolutions are cached).

### Replay Flow

```text
# Pseudocode — fully programmatic, no Claude Code needed
# (bash-style commands, python-style control flow)

# 1. Install core at specified version
nanoclaw-init --version 0.5.0

# 2. Load shared resolutions into local rerere cache
load-resolutions .nanoclaw/resolutions/

# 3. For each skill in applied_skills (in order):
for skill in state.applied_skills:
    # File operations
    apply_file_ops(skill)

    # Copy new files (recursively, preserving directory structure)
    cp -r skills/${skill.name}/add/* .

    # Merge modified code files (with path remapping)
    for file in skill.files_modified:
        resolved_path = apply_remap(file, state.path_remap)
        git merge-file ${resolved_path} .nanoclaw/base/${resolved_path} skills/${skill.name}/modify/${file}
        # git rerere auto-resolves from shared cache if needed

    # Apply skill-specific custom patch if recorded
    if skill.custom_patch:
        git apply --3way ${skill.custom_patch}

# 4. Apply all structured operations (batched)
collect_all_structured_ops(state.applied_skills)
merge_npm_dependencies → write package.json once
npm install once
merge_env_additions → write .env.example once
merge_compose_services → write docker-compose.yml once

# 5. Apply standalone custom modifications
for custom in state.custom_modifications:
    git apply --3way ${custom.patch_file}

# 6. Run tests and verify hashes
run_tests && verify_hashes
```

---

## 14.
Skill Tests - -Each skill includes integration tests that validate the skill works correctly when applied. - -### Structure - -``` -skills/ - add-whatsapp/ - tests/ - whatsapp.test.ts -``` - -### What Tests Validate - -- **Single skill on fresh core**: apply to clean codebase → tests pass → integration works -- **Skill functionality**: the feature actually works -- **Post-apply state**: files in expected state, `state.yaml` correctly updated - -### When Tests Run (Always) - -- **After applying a skill** — even if all merges were clean -- **After core update** — even if all merges were clean -- **After uninstall replay** — confirms removal didn't break remaining skills -- **In CI** — tests all official skills individually and in common combinations -- **During replay** — validates replayed state - -Clean merge ≠ working code. Tests are the only reliable signal. - -### CI Test Matrix - -Test coverage is **smart, not exhaustive**: - -- Every official skill individually against each supported core version -- **Pairwise combinations for skills that modify at least one common file or have overlapping structured operations** -- Curated three-skill stacks based on popularity and high overlap -- Test matrix auto-generated from manifest `modifies` and `structured` fields - -Each passing combination generates a verified resolution entry for the shared cache. - ---- - -## 15. Project Configuration - -### `.gitattributes` - -Ship with NanoClaw to reduce noisy merge conflicts: - -``` -* text=auto -*.ts text eol=lf -*.json text eol=lf -*.yaml text eol=lf -*.md text eol=lf -``` - ---- - -## 16. 
Directory Structure - -``` -project/ - src/ # The actual codebase - server.ts - config.ts - channels/ - whatsapp.ts - telegram.ts - skills/ # Skill packages (Claude Code slash commands) - add-whatsapp/ - SKILL.md - manifest.yaml - tests/ - whatsapp.test.ts - add/ - src/channels/whatsapp.ts - modify/ - src/ - server.ts - server.ts.intent.md - config.ts - config.ts.intent.md - add-telegram/ - ... - telegram-reactions/ # Layered skill - ... - .nanoclaw/ - base/ # Clean core (shared base) - src/ - server.ts - config.ts - ... - state.yaml # Full installation state - backup/ # Temporary backup during operations - custom/ # Custom patches - telegram-group-only.patch - 001-logging-middleware.patch - 001-logging-middleware.md - resolutions/ # Shared verified resolution cache - whatsapp@1.2.0+telegram@1.0.0/ - src/ - server.ts.resolution - server.ts.preimage - meta.yaml - .gitattributes -``` - ---- - -## 17. Design Principles - -1. **Use git, don't reinvent it.** `git merge-file` for code merges, `git rerere` for caching resolutions, `git apply --3way` for custom patches. -2. **Three-level resolution: git → Claude → user.** Programmatic first, AI second, human third. -3. **Clean merges aren't enough.** Tests run after every operation. Semantic conflicts survive text merges. -4. **All operations are safe.** Backup before, restore on failure. No half-applied state. -5. **One shared base.** `.nanoclaw/base/` is the clean core before any skills or customizations. It's the stable common ancestor for all three-way merges. Only updated on core updates. -6. **Code merges vs. structured operations.** Source code is three-way merged. Dependencies, env vars, and configs are aggregated programmatically. Structured operations are implicit and batched. -7. **Resolutions are learned and shared.** Maintainers resolve conflicts and ship verified resolutions with hash enforcement. `.nanoclaw/resolutions/` is the shared artifact. -8. 
**One skill, one happy path.** No predefined configuration options. Customization is more patching. -9. **Skills layer and compose.** Core skills provide the foundation. Extension skills add capabilities. -10. **Intent is first-class and structured.** `SKILL.md`, `.intent.md` (What, Invariants, Must-keep), and `migration.md`. -11. **State is explicit and complete.** Skills, custom patches, per-file hashes, structured outcomes, path remaps. Replay is deterministic. Drift is instant to detect. -12. **Always recoverable.** The three-level model reconstructs coherent state from any starting point. -13. **Uninstall is replay.** Replay from clean base without the skill. Backup for safety. -14. **Core updates are the maintainers' responsibility.** Test, resolve, ship. Breaking changes require a migration skill that preserves the old behavior. The cost of a breaking change is authoring and testing the migration. Users should never be surprised by a change to their setup. -15. **File operations and path remapping are first-class.** Renames, deletes, moves in manifests. Skills are never mutated — paths resolve at apply time. -16. **Skills are tested.** Integration tests per skill. CI tests pairwise by overlap. Tests run always. -17. **Deterministic serialization.** Sorted keys, consistent formatting. No noisy diffs. -18. **Rebase when needed.** Flatten layers for a clean starting point. -19. **Progressive core slimming.** Breaking changes move functionality from core to migration skills. Existing users keep what they have automatically. New users start minimal and add what they need. \ No newline at end of file diff --git a/docs/nanorepo-architecture.md b/docs/nanorepo-architecture.md deleted file mode 100644 index 1365e9e..0000000 --- a/docs/nanorepo-architecture.md +++ /dev/null @@ -1,168 +0,0 @@ -# NanoClaw Skills Architecture - -## What Skills Are For - -NanoClaw's core is intentionally minimal. 
Skills are how users extend it: adding channels, integrations, cross-platform support, or replacing internals entirely. Examples: add Telegram alongside WhatsApp, switch from Apple Container to Docker, add Gmail integration, add voice message transcription. Each skill modifies the actual codebase, adding channel handlers, updating the message router, changing container configuration, and adding dependencies, rather than working through a plugin API or runtime hooks. - -## Why This Architecture - -The problem: users need to combine multiple modifications to a shared codebase, keep those modifications working across core updates, and do all of this without becoming git experts or losing their custom changes. A plugin system would be simpler but constrains what skills can do. Giving skills full codebase access means they can change anything, but that creates merge conflicts, update breakage, and state tracking challenges. - -This architecture solves that by making skill application fully programmatic using standard git mechanics, with AI as a fallback for conflicts git can't resolve, and a shared resolution cache so most users never hit those conflicts at all. The result: users compose exactly the features they want, customizations survive core updates automatically, and the system is always recoverable. - -## Core Principle - -Skills are self-contained, auditable packages applied via standard git merge mechanics. Claude Code orchestrates the process — running git commands, reading skill manifests, and stepping in only when git can't resolve a conflict. The system uses existing git features (`merge-file`, `rerere`, `apply`) rather than custom merge infrastructure. - -## Three-Level Resolution Model - -Every operation follows this escalation: - -1. **Git** — deterministic. `git merge-file` merges, `git rerere` replays cached resolutions, structured operations apply without merging. No AI. Handles the vast majority of cases. -2. 
**Claude Code** — reads `SKILL.md`, `.intent.md`, and `state.yaml` to resolve conflicts git can't handle. Caches resolutions via `git rerere` so the same conflict never needs resolving twice. -3. **Claude Code + user input** — when Claude Code lacks sufficient context to determine intent (e.g., two features genuinely conflict at an application level), it asks the user for a decision, then uses that input to perform the resolution. Claude Code still does the work — the user provides direction, not code. - -**Important**: A clean merge doesn't guarantee working code. Semantic conflicts can produce clean text merges that break at runtime. **Tests run after every operation.** - -## Backup/Restore Safety - -Before any operation, all affected files are copied to `.nanoclaw/backup/`. On success, backup is deleted. On failure, backup is restored. Works safely for users who don't use git. - -## The Shared Base - -`.nanoclaw/base/` holds a clean copy of the core codebase. This is the single common ancestor for all three-way merges, only updated during core updates. - -## Two Types of Changes - -### Code Files (Three-Way Merge) -Source code where skills weave in logic. Merged via `git merge-file` against the shared base. Skills carry full modified files. - -### Structured Data (Deterministic Operations) -Files like `package.json`, `docker-compose.yml`, `.env.example`. Skills declare requirements in the manifest; the system applies them programmatically. Multiple skills' declarations are batched — dependencies merged, `package.json` written once, `npm install` run once. - -```yaml -structured: - npm_dependencies: - whatsapp-web.js: "^2.1.0" - env_additions: - - WHATSAPP_TOKEN - docker_compose_services: - whatsapp-redis: - image: redis:alpine - ports: ["6380:6379"] -``` - -Structured conflicts (version incompatibilities, port collisions) follow the same three-level resolution model. - -## Skill Package Structure - -A skill contains only the files it adds or modifies. 
Modified code files carry the **full file** (clean core + skill's changes), making `git merge-file` straightforward and auditable. - -``` -skills/add-whatsapp/ - SKILL.md # What this skill does and why - manifest.yaml # Metadata, dependencies, structured ops - tests/whatsapp.test.ts # Integration tests - add/src/channels/whatsapp.ts # New files - modify/src/server.ts # Full modified file for merge - modify/src/server.ts.intent.md # Structured intent for conflict resolution -``` - -### Intent Files -Each modified file has a `.intent.md` with structured headings: **What this skill adds**, **Key sections**, **Invariants**, and **Must-keep sections**. These give Claude Code specific guidance during conflict resolution. - -### Manifest -Declares: skill metadata, core version compatibility, files added/modified, file operations, structured operations, skill relationships (conflicts, depends, tested_with), post-apply commands, and test command. - -## Customization and Layering - -**One skill, one happy path** — a skill implements the reasonable default for 80% of users. - -**Customization is more patching.** Apply the skill, then modify via tracked patches, direct editing, or additional layered skills. Custom modifications are recorded in `state.yaml` and replayable. - -**Skills layer via `depends`.** Extension skills build on base skills (e.g., `telegram-reactions` depends on `add-telegram`). - -## File Operations - -Renames, deletes, and moves are declared in the manifest and run **before** code merges. When core renames a file, a **path remap** resolves skill references at apply time — skill packages are never mutated. - -## The Apply Flow - -1. Pre-flight checks (compatibility, dependencies, untracked changes) -2. Backup -3. File operations + path remapping -4. Copy new files -5. Merge modified code files (`git merge-file`) -6. Conflict resolution (shared cache → `git rerere` → Claude Code → Claude Code + user input) -7. Apply structured operations (batched) -8. 
Post-apply commands, update `state.yaml` -9. **Run tests** (mandatory, even if all merges were clean) -10. Clean up (delete backup on success, restore on failure) - -## Shared Resolution Cache - -`.nanoclaw/resolutions/` ships pre-computed, verified conflict resolutions with **hash enforcement** — a cached resolution only applies if base, current, and skill input hashes match exactly. This means most users never encounter unresolved conflicts for common skill combinations. - -### rerere Adapter -`git rerere` requires unmerged index entries that `git merge-file` doesn't create. An adapter sets up the required index state after `merge-file` produces a conflict, enabling rerere caching. This requires the project to be a git repository; users without `.git/` lose caching but not functionality. - -## State Tracking - -`.nanoclaw/state.yaml` records: core version, all applied skills (with per-file hashes for base/skill/merged), structured operation outcomes, custom patches, and path remaps. This makes drift detection instant and replay deterministic. - -## Untracked Changes - -Direct edits are detected via hash comparison before any operation. Users can record them as tracked patches, continue untracked, or abort. The three-level model can always recover coherent state from any starting point. - -## Core Updates - -Most changes propagate automatically through three-way merge. **Breaking changes** require a **migration skill** — a regular skill that preserves the old behavior, authored against the new core. Migrations are declared in `migrations.yaml` and applied automatically during updates. - -### Update Flow -1. Preview changes (git-only, no files modified) -2. Backup → file operations → three-way merge → conflict resolution -3. Re-apply custom patches (`git apply --3way`) -4. **Update base** to new core -5. Apply migration skills (preserves user's setup automatically) -6. Re-apply updated skills (version-changed skills only) -7. 
Re-run structured operations → run all tests → clean up - -The user sees no prompts during updates. To accept a new default later, they remove the migration skill. - -## Skill Removal - -Uninstall is **replay without the skill**: read `state.yaml`, remove the target skill, replay all remaining skills from clean base using the resolution cache. Backup for safety. - -## Rebase - -Flatten accumulated layers into a clean starting point. Updates base, regenerates diffs, clears old patches and stale cache entries. Trades individual skill history for simpler future merges. - -## Replay - -Given `state.yaml`, reproduce the exact installation on a fresh machine with no AI (assuming cached resolutions). Apply skills in order, merge, apply custom patches, batch structured operations, run tests. - -## Skill Tests - -Each skill includes integration tests. Tests run **always** — after apply, after update, after uninstall, during replay, in CI. CI tests all official skills individually and pairwise combinations for skills sharing modified files or structured operations. - -## Design Principles - -1. **Use git, don't reinvent it.** -2. **Three-level resolution: git → Claude Code → Claude Code + user input.** -3. **Clean merges aren't enough.** Tests run after every operation. -4. **All operations are safe.** Backup/restore, no half-applied state. -5. **One shared base**, only updated on core updates. -6. **Code merges vs. structured operations.** Source code is merged; configs are aggregated. -7. **Resolutions are learned and shared** with hash enforcement. -8. **One skill, one happy path.** Customization is more patching. -9. **Skills layer and compose.** -10. **Intent is first-class and structured.** -11. **State is explicit and complete.** Replay is deterministic. -12. **Always recoverable.** -13. **Uninstall is replay.** -14. **Core updates are the maintainers' responsibility.** Breaking changes require migration skills. -15. 
**File operations and path remapping are first-class.** -16. **Skills are tested.** CI tests pairwise by overlap. -17. **Deterministic serialization.** No noisy diffs. -18. **Rebase when needed.** -19. **Progressive core slimming** via migration skills. \ No newline at end of file diff --git a/docs/skills-as-branches.md b/docs/skills-as-branches.md new file mode 100644 index 0000000..4a6db9b --- /dev/null +++ b/docs/skills-as-branches.md @@ -0,0 +1,677 @@ +# Skills as Branches + +## Overview + +This document covers **feature skills** — skills that add capabilities via git branch merges. This is the most complex skill type and the primary way NanoClaw is extended. + +NanoClaw has four types of skills overall. See [CONTRIBUTING.md](../CONTRIBUTING.md) for the full taxonomy: + +| Type | Location | How it works | +|------|----------|-------------| +| **Feature** (this doc) | `.claude/skills/` + `skill/*` branch | SKILL.md has instructions; code lives on a branch, applied via `git merge` | +| **Utility** | `.claude/skills//` with code files | Self-contained tools; code in skill directory, copied into place on install | +| **Operational** | `.claude/skills/` on `main` | Instruction-only workflows (setup, debug, update) | +| **Container** | `container/skills/` | Loaded inside agent containers at runtime | + +--- + +Feature skills are distributed as git branches on the upstream repository. Applying a skill is a `git merge`. Updating core is a `git merge`. Everything is standard git. + +This replaces the previous `skills-engine/` system (three-way file merging, `.nanoclaw/` state, manifest files, replay, backup/restore) with plain git operations and Claude for conflict resolution. 
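The whole model can be exercised locally before touching GitHub. A minimal end-to-end sketch with a simulated upstream — the repository layout, branch name, and file names are illustrative, not the real `qwibitai/nanoclaw` contents:

```shell
set -eu
tmp=$(mktemp -d); cd "$tmp"

# Simulated upstream repo: main plus one skill branch
git init -q upstream
cd upstream
git symbolic-ref HEAD refs/heads/main
git config user.email ci@example.com
git config user.name ci
echo 'core server' > server.ts
git add . && git commit -qm 'core'
git checkout -qb skill/demo
echo 'demo channel' > demo.ts
git add . && git commit -qm 'skill: demo channel'
git checkout -q main
cd "$tmp"

# User side: clone the fork, wire up upstream, merge the skill branch
git clone -q upstream fork
cd fork
git config user.email user@example.com
git config user.name user
git remote add upstream "$tmp/upstream"
git fetch -q upstream
git merge -q upstream/skill/demo
test -f demo.ts && echo 'skill applied'
```

In real use the `upstream` remote points at `qwibitai/nanoclaw` and the fork lives on GitHub; the mechanics are identical.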
+ +## How It Works + +### Repository structure + +The upstream repo (`qwibitai/nanoclaw`) maintains: + +- `main` — core NanoClaw (no skill code) +- `skill/discord` — main + Discord integration +- `skill/telegram` — main + Telegram integration +- `skill/slack` — main + Slack integration +- `skill/gmail` — main + Gmail integration +- etc. + +Each skill branch contains all the code changes for that skill: new files, modified source files, updated `package.json` dependencies, `.env.example` additions — everything. No manifest, no structured operations, no separate `add/` and `modify/` directories. + +### Skill discovery and installation + +Skills are split into two categories: + +**Operational skills** (on `main`, always available): +- `/setup`, `/debug`, `/update-nanoclaw`, `/customize`, `/update-skills` +- These are instruction-only SKILL.md files — no code changes, just workflows +- Live in `.claude/skills/` on `main`, immediately available to every user + +**Feature skills** (in marketplace, installed on demand): +- `/add-discord`, `/add-telegram`, `/add-slack`, `/add-gmail`, etc. +- Each has a SKILL.md with setup instructions and a corresponding `skill/*` branch with code +- Live in the marketplace repo (`qwibitai/nanoclaw-skills`) + +Users never interact with the marketplace directly. The operational skills `/setup` and `/customize` handle plugin installation transparently: + +```bash +# Claude runs this behind the scenes — users don't see it +claude plugin install nanoclaw-skills@nanoclaw-skills --scope project +``` + +Skills are hot-loaded after `claude plugin install` — no restart needed. This means `/setup` can install the marketplace plugin, then immediately run any feature skill, all in one session. + +### Selective skill installation + +`/setup` asks users what channels they want, then only offers relevant skills: + +1. "Which messaging channels do you want to use?" → Discord, Telegram, Slack, WhatsApp +2. 
User picks Telegram → Claude installs the plugin and runs `/add-telegram` +3. After Telegram is set up: "Want to add Agent Swarm support for Telegram?" → offers `/add-telegram-swarm` +4. "Want to enable community skills?" → installs community marketplace plugins + +Dependent skills (e.g., `telegram-swarm` depends on `telegram`) are only offered after their parent is installed. `/customize` follows the same pattern for post-setup additions. + +### Marketplace configuration + +NanoClaw's `.claude/settings.json` registers the official marketplace: + +```json +{ + "extraKnownMarketplaces": { + "nanoclaw-skills": { + "source": { + "source": "github", + "repo": "qwibitai/nanoclaw-skills" + } + } + } +} +``` + +The marketplace repo uses Claude Code's plugin structure: + +``` +qwibitai/nanoclaw-skills/ + .claude-plugin/ + marketplace.json # Plugin catalog + plugins/ + nanoclaw-skills/ # Single plugin bundling all official skills + .claude-plugin/ + plugin.json # Plugin manifest + skills/ + add-discord/ + SKILL.md # Setup instructions; step 1 is "merge the branch" + add-telegram/ + SKILL.md + add-slack/ + SKILL.md + ... +``` + +Multiple skills are bundled in one plugin — installing `nanoclaw-skills` makes all feature skills available at once. Individual skills don't need separate installation. + +Each SKILL.md tells Claude to merge the corresponding skill branch as step 1, then walks through interactive setup (env vars, bot creation, etc.). + +### Applying a skill + +User runs `/add-discord` (discovered via marketplace). Claude follows the SKILL.md: + +1. `git fetch upstream skill/discord` +2. `git merge upstream/skill/discord` +3. Interactive setup (create bot, get token, configure env vars, etc.) + +Or manually: + +```bash +git fetch upstream skill/discord +git merge upstream/skill/discord +``` + +### Applying multiple skills + +```bash +git merge upstream/skill/discord +git merge upstream/skill/telegram +``` + +Git handles the composition. 
If both skills modify the same lines, it's a real conflict and Claude resolves it.
+
+### Updating core
+
+```bash
+git fetch upstream main
+git merge upstream/main
+```
+
+Since skill branches are kept merged-forward with main (see CI section), the user's merged-in skill changes and upstream changes have proper common ancestors.
+
+### Checking for skill updates
+
+Users who previously merged a skill branch can check for updates. For each `upstream/skill/*` branch, check whether the branch has commits that aren't in the user's HEAD:
+
+```bash
+git fetch upstream
+for branch in $(git branch -r | grep 'upstream/skill/'); do
+  # Oldest commit unique to the skill branch (not on upstream/main).
+  # A plain merge-base can't detect a prior merge — every skill branch
+  # shares main's history — so anchor on the skill's first unique commit.
+  first=$(git rev-list --reverse "$branch" ^upstream/main | head -1)
+  [ -n "$first" ] || continue
+  # Skip skills the user never merged
+  git merge-base --is-ancestor "$first" HEAD 2>/dev/null || continue
+  # Check if the skill branch has new commits beyond what the user has
+  if ! git merge-base --is-ancestor "$branch" HEAD 2>/dev/null; then
+    echo "$branch has updates available"
+  fi
+done
+```
+
+This requires no state — it uses git history to determine which skills were previously merged and whether they have new commits.
+
+This logic is available in two ways:
+- Built into `/update-nanoclaw` — after merging main, optionally check for skill updates
+- Standalone `/update-skills` — check and merge skill updates independently
+
+### Conflict resolution
+
+At any merge step, conflicts may arise. Claude resolves them — reading the conflicted files, understanding the intent of both sides, and producing the correct result. This is what makes the branch approach viable at scale: conflict resolution that previously required human judgment is now automated.
+
+### Skill dependencies
+
+Some skills depend on other skills. E.g., `skill/telegram-swarm` requires `skill/telegram`. Dependent skill branches are branched from their parent skill branch, not from `main`.
+
+This means `skill/telegram-swarm` includes all of telegram's changes plus its own additions. 
When a user merges `skill/telegram-swarm`, they get both — no need to merge telegram separately.
+
+Dependencies are implicit in git history — `git merge-base --is-ancestor` determines whether one skill branch is an ancestor of another. No separate dependency file is needed.
+
+### Uninstalling a skill
+
+```bash
+# Find the merge commit
+git log --merges --oneline | grep discord
+
+# Revert it
+git revert -m 1 <merge-commit>
+```
+
+This creates a new commit that undoes the skill's changes. Claude can handle the whole flow.
+
+If the user has modified the skill's code since merging (custom changes on top), the revert might conflict — Claude resolves it.
+
+If the user later wants to re-apply the skill, they need to revert the revert first (git treats reverted changes as "already applied and undone"). Claude handles this too.
+
+## CI: Keeping Skill Branches Current
+
+A GitHub Action runs on every push to `main`:
+
+1. List all `skill/*` branches
+2. For each skill branch, merge `main` into it (merge-forward, not rebase)
+3. Run build and tests on the merged result
+4. If tests pass, push the updated skill branch
+5. If a skill fails (conflict, build error, test failure), open a GitHub issue for manual resolution
+
+**Why merge-forward instead of rebase:**
+- No force-push — preserves history for users who already merged the skill
+- Users can re-merge a skill branch to pick up skill updates (bug fixes, improvements)
+- Git has proper common ancestors throughout the merge graph
+
+**Why this scales:** With a few hundred skills and a few commits to main per day, the CI cost is trivial. Conflict resolution runs on Claude (Haiku), which is fast and cheap. The approach wouldn't have been feasible a year or two ago; it's practical now because Claude can resolve conflicts at scale.
+
+## Installation Flow
+
+### New users (recommended)
+
+1. Fork `qwibitai/nanoclaw` on GitHub (click the Fork button)
+2. Clone your fork:
+   ```bash
+   git clone https://github.com/<your-username>/nanoclaw.git
+   cd nanoclaw
+   ```
+3. 
Run Claude Code:
+   ```bash
+   claude
+   ```
+4. Run `/setup` — Claude handles dependencies, authentication, container setup, service configuration, and adds `upstream` remote if not present
+
+Forking is recommended because it gives users a remote to push their customizations to. Clone-only works for trying things out but provides no remote backup.
+
+### Existing users migrating from clone
+
+Users who previously ran `git clone https://github.com/qwibitai/nanoclaw.git` and have local customizations:
+
+1. Fork `qwibitai/nanoclaw` on GitHub
+2. Reroute remotes:
+   ```bash
+   git remote rename origin upstream
+   git remote add origin https://github.com/<your-username>/nanoclaw.git
+   git push --force origin main
+   ```
+   The `--force` is needed because the fresh fork's main is at upstream's latest, but the user wants their (possibly behind) version. The fork was just created so there's nothing to lose.
+3. From this point, `origin` = their fork, `upstream` = qwibitai/nanoclaw
+
+### Existing users migrating from the old skills engine
+
+Users who previously applied skills via the `skills-engine/` system have skill code in their tree but no merge commits linking to skill branches. Git doesn't know these changes came from a skill, so merging a skill branch on top would conflict or duplicate.
+
+**For new skills going forward:** just merge skill branches as normal. No issue.
+
+**For existing old-engine skills**, two migration paths:
+
+**Option A: Per-skill reapply (keep your fork)**
+1. For each old-engine skill: identify and revert the old changes, then merge the skill branch fresh
+2. Claude assists with identifying what to revert and resolving any conflicts
+3. Custom modifications (non-skill changes) are preserved
+
+**Option B: Fresh start (cleanest)**
+1. Create a new fork from upstream
+2. Merge the skill branches you want
+3. Manually re-apply your custom (non-skill) changes
+4. 
Claude assists by diffing your old fork against the new one to identify custom changes + +In both cases: +- Delete the `.nanoclaw/` directory (no longer needed) +- The `skills-engine/` code will be removed from upstream once all skills are migrated +- `/update-skills` only tracks skills applied via branch merge — old-engine skills won't appear in update checks + +## User Workflows + +### Custom changes + +Users make custom changes directly on their main branch. This is the standard fork workflow — their `main` IS their customized version. + +```bash +# Make changes +vim src/config.ts +git commit -am "change trigger word to @Bob" +git push origin main +``` + +Custom changes, skills, and core updates all coexist on their main branch. Git handles the three-way merging at each merge step because it can trace common ancestors through the merge history. + +### Applying a skill + +Run `/add-discord` in Claude Code (discovered via the marketplace plugin), or manually: + +```bash +git fetch upstream skill/discord +git merge upstream/skill/discord +# Follow setup instructions for configuration +git push origin main +``` + +If the user is behind upstream's main when they merge a skill branch, the merge might bring in some core changes too (since skill branches are merged-forward with main). This is generally fine — they get a compatible version of everything. + +### Updating core + +```bash +git fetch upstream main +git merge upstream/main +git push origin main +``` + +This is the same as the existing `/update-nanoclaw` skill's merge path. + +### Updating skills + +Run `/update-skills` or let `/update-nanoclaw` check after a core update. For each previously-merged skill branch that has new commits, Claude offers to merge the updates. 
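+
+Under the hood, accepting one skill's updates is just another branch merge (the branch name here is illustrative):
+
+```bash
+git fetch upstream
+git merge upstream/skill/discord   # picks up the skill's new commits (may include merged-forward core changes)
+git push origin main
+```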
+
+### Contributing back to upstream
+
+Users who want to submit a PR to upstream:
+
+```bash
+git fetch upstream main
+git checkout -b my-fix upstream/main
+# Make changes
+git push origin my-fix
+# Create PR from my-fix to qwibitai/nanoclaw:main
+```
+
+Standard fork contribution workflow. Their custom changes stay on their main and don't leak into the PR.
+
+## Contributing a Skill
+
+The flow below is for **feature skills** (branch-based). For utility skills (self-contained tools) and container skills, the contributor opens a PR that adds files directly to `.claude/skills/<skill-name>/` or `container/skills/<skill-name>/` — no branch extraction needed. See [CONTRIBUTING.md](../CONTRIBUTING.md) for all skill types.
+
+### Contributor flow (feature skills)
+
+1. Fork `qwibitai/nanoclaw`
+2. Branch from `main`
+3. Make the code changes (new channel file, modified integration points, updated package.json, .env.example additions, etc.)
+4. Open a PR to `main`
+
+The contributor opens a normal PR — they don't need to know about skill branches or marketplace repos. They just make code changes and submit.
+
+### Maintainer flow
+
+When a skill PR is reviewed and approved:
+
+1. Create a `skill/<skill-name>` branch from the PR's commits:
+   ```bash
+   git fetch origin pull/<pr-number>/head:skill/<skill-name>
+   git push origin skill/<skill-name>
+   ```
+2. Force-push to the contributor's PR branch, replacing it with a single commit that adds the contributor to `CONTRIBUTORS.md` (removing all code changes)
+3. Merge the slimmed PR into `main` (just the contributor addition)
+4. 
Add the skill's SKILL.md to the marketplace repo (`qwibitai/nanoclaw-skills`) + +This way: +- The contributor gets merge credit (their PR is merged) +- They're added to CONTRIBUTORS.md automatically by the maintainer +- The skill branch is created from their work +- `main` stays clean (no skill code) +- The contributor only had to do one thing: open a PR with code changes + +**Note:** GitHub PRs from forks have "Allow edits from maintainers" checked by default, so the maintainer can push to the contributor's PR branch. + +### Skill SKILL.md + +The contributor can optionally provide a SKILL.md (either in the PR or separately). This goes into the marketplace repo and contains: + +1. Frontmatter (name, description, triggers) +2. Step 1: Merge the skill branch +3. Steps 2-N: Interactive setup (create bot, get token, configure env vars, verify) + +If the contributor doesn't provide a SKILL.md, the maintainer writes one based on the PR. + +## Community Marketplaces + +Anyone can maintain their own fork with skill branches and their own marketplace repo. This enables a community-driven skill ecosystem without requiring write access to the upstream repo. + +### How it works + +A community contributor: + +1. Maintains a fork of NanoClaw (e.g., `alice/nanoclaw`) +2. Creates `skill/*` branches on their fork with their custom skills +3. 
Creates a marketplace repo (e.g., `alice/nanoclaw-skills`) with a `.claude-plugin/marketplace.json` and plugin structure + +### Adding a community marketplace + +If the community contributor is trusted, they can open a PR to add their marketplace to NanoClaw's `.claude/settings.json`: + +```json +{ + "extraKnownMarketplaces": { + "nanoclaw-skills": { + "source": { + "source": "github", + "repo": "qwibitai/nanoclaw-skills" + } + }, + "alice-nanoclaw-skills": { + "source": { + "source": "github", + "repo": "alice/nanoclaw-skills" + } + } + } +} +``` + +Once merged, all NanoClaw users automatically discover the community marketplace alongside the official one. + +### Installing community skills + +`/setup` and `/customize` ask users whether they want to enable community skills. If yes, Claude installs community marketplace plugins via `claude plugin install`: + +```bash +claude plugin install alice-skills@alice-nanoclaw-skills --scope project +``` + +Community skills are hot-loaded and immediately available — no restart needed. Dependent skills are only offered after their prerequisites are met (e.g., community Telegram add-ons only after Telegram is installed). + +Users can also browse and install community plugins manually via `/plugin`. + +### Properties of this system + +- **No gatekeeping required.** Anyone can create skills on their fork without permission. They only need approval to be listed in the auto-discovered marketplaces. +- **Multiple marketplaces coexist.** Users see skills from all trusted marketplaces in `/plugin`. 
+
+- **Community skills use the same merge pattern.** The SKILL.md just points to a different remote:
+  ```bash
+  git remote add alice https://github.com/alice/nanoclaw.git
+  git fetch alice skill/my-cool-feature
+  git merge alice/skill/my-cool-feature
+  ```
+- **Users can also add marketplaces manually.** Even without being listed in settings.json, users can run `/plugin marketplace add alice/nanoclaw-skills` to discover skills from any source.
+- **CI is per-fork.** Each community maintainer runs their own CI to keep their skill branches merged-forward. They can use the same GitHub Action as the upstream repo.
+
+## Flavors
+
+A flavor is a curated fork of NanoClaw — a combination of skills, custom changes, and configuration tailored for a specific use case (e.g., "NanoClaw for Sales," "NanoClaw Minimal," "NanoClaw for Developers").
+
+### Creating a flavor
+
+1. Fork `qwibitai/nanoclaw`
+2. Merge in the skills you want
+3. Make custom changes (trigger word, prompts, integrations, etc.)
+4. Your fork's `main` IS the flavor
+
+### Installing a flavor
+
+During `/setup`, users are offered a choice of flavors before any configuration happens. The setup skill reads `flavors.yaml` from the repo (shipped with upstream, always up to date) and presents options:
+
+AskUserQuestion: "Start with a flavor or default NanoClaw?"
+- Default NanoClaw
+- NanoClaw for Sales — Gmail + Slack + CRM (maintained by alice)
+- NanoClaw Minimal — Telegram-only, lightweight (maintained by bob)
+
+If a flavor is chosen:
+
+```bash
+git remote add <flavor> https://github.com/alice/nanoclaw.git
+git fetch <flavor> main
+git merge <flavor>/main
+```
+
+Then setup continues normally (dependencies, auth, container, service).
+
+**This choice is only offered on a fresh fork** — when the user's main matches or is close to upstream's main with no local commits. If `/setup` detects significant local changes (re-running setup on an existing install), it skips the flavor selection and goes straight to configuration. 
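+
+A minimal sketch of that freshness check (the heuristic is assumed here, not the actual `/setup` implementation):
+
+```bash
+git fetch upstream
+# No commits beyond upstream/main means a fresh fork
+if [ -z "$(git rev-list upstream/main..HEAD)" ]; then
+  echo "Fresh fork: offer flavor selection"
+else
+  echo "Existing install: skip to configuration"
+fi
+```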
+
+After installation, the user's fork has three remotes:
+- `origin` — their fork (push customizations here)
+- `upstream` — `qwibitai/nanoclaw` (core updates)
+- `<flavor>` — the flavor fork (flavor updates)
+
+### Updating a flavor
+
+```bash
+git fetch <flavor> main
+git merge <flavor>/main
+```
+
+The flavor maintainer keeps their fork updated (merging upstream, updating skills). Users pull flavor updates the same way they pull core updates.
+
+### Flavors registry
+
+`flavors.yaml` lives in the upstream repo:
+
+```yaml
+flavors:
+  - name: NanoClaw for Sales
+    repo: alice/nanoclaw
+    description: Gmail + Slack + CRM integration, daily pipeline summaries
+    maintainer: alice
+
+  - name: NanoClaw Minimal
+    repo: bob/nanoclaw
+    description: Telegram-only, no container overhead
+    maintainer: bob
+```
+
+Anyone can PR to add their flavor. The file is available locally when `/setup` runs since it's part of the cloned repo.
+
+### Discoverability
+
+- **During setup** — flavor selection is offered as part of the initial setup flow
+- **`/browse-flavors` skill** — reads `flavors.yaml` and presents options at any time
+- **GitHub topics** — flavor forks can tag themselves with `nanoclaw-flavor` for searchability
+- **Discord / website** — community-curated lists
+
+## Migration
+
+Migration from the old skills engine to branches is complete. All feature skills now live on `skill/*` branches, and the skills engine has been removed. 
+
+### Skill branches
+
+| Branch | Base | Description |
+|--------|------|-------------|
+| `skill/whatsapp` | `main` | WhatsApp channel |
+| `skill/telegram` | `main` | Telegram channel |
+| `skill/slack` | `main` | Slack channel |
+| `skill/discord` | `main` | Discord channel |
+| `skill/gmail` | `main` | Gmail channel |
+| `skill/voice-transcription` | `skill/whatsapp` | OpenAI Whisper voice transcription |
+| `skill/image-vision` | `skill/whatsapp` | Image attachment processing |
+| `skill/pdf-reader` | `skill/whatsapp` | PDF attachment reading |
+| `skill/local-whisper` | `skill/voice-transcription` | Local whisper.cpp transcription |
+| `skill/ollama-tool` | `main` | Ollama MCP server for local models |
+| `skill/apple-container` | `main` | Apple Container runtime |
+| `skill/reactions` | `main` | WhatsApp emoji reactions |
+
+### What was removed
+
+- `skills-engine/` directory (entire engine)
+- `scripts/apply-skill.ts`, `scripts/uninstall-skill.ts`, `scripts/rebase.ts`
+- `scripts/fix-skill-drift.ts`, `scripts/validate-all-skills.ts`
+- `.github/workflows/skill-drift.yml`, `.github/workflows/skill-pr.yml`
+- All `add/`, `modify/`, `tests/`, and `manifest.yaml` from skill directories
+- `.nanoclaw/` state directory
+
+Operational skills (`setup`, `debug`, `update-nanoclaw`, `customize`, `update-skills`) remain on main in `.claude/skills/`.
+
+## What Changes
+
+### README Quick Start
+
+Before:
+```bash
+git clone https://github.com/qwibitai/NanoClaw.git
+cd NanoClaw
+claude
+```
+
+After:
+```
+1. Fork qwibitai/nanoclaw on GitHub
+2. git clone https://github.com/<your-username>/nanoclaw.git
+3. cd nanoclaw
+4. claude
+5. /setup
+```
+
+### Setup skill (`/setup`)
+
+Updates to the setup flow:
+
+- Check if `upstream` remote exists; if not, add it: `git remote add upstream https://github.com/qwibitai/nanoclaw.git`
+- Check if `origin` points to the user's fork (not qwibitai). If it points to qwibitai, guide them through the fork migration. 
+- **Install marketplace plugin:** `claude plugin install nanoclaw-skills@nanoclaw-skills --scope project` — makes all feature skills available (hot-loaded, no restart) +- **Ask which channels to add:** present channel options (Discord, Telegram, Slack, WhatsApp, Gmail), run corresponding `/add-*` skills for selected channels +- **Offer dependent skills:** after a channel is set up, offer relevant add-ons (e.g., Agent Swarm after Telegram, voice transcription after WhatsApp) +- **Optionally enable community marketplaces:** ask if the user wants community skills, install those marketplace plugins too + +### `.claude/settings.json` + +Marketplace configuration so the official marketplace is auto-registered: + +```json +{ + "extraKnownMarketplaces": { + "nanoclaw-skills": { + "source": { + "source": "github", + "repo": "qwibitai/nanoclaw-skills" + } + } + } +} +``` + +### Skills directory on main + +The `.claude/skills/` directory on `main` retains only operational skills (setup, debug, update-nanoclaw, customize, update-skills). Feature skills (add-discord, add-telegram, etc.) live in the marketplace repo, installed via `claude plugin install` during `/setup` or `/customize`. + +### Skills engine removal + +The following can be removed: + +- `skills-engine/` — entire directory (apply, merge, replay, state, backup, etc.) +- `scripts/apply-skill.ts` +- `scripts/uninstall-skill.ts` +- `scripts/fix-skill-drift.ts` +- `scripts/validate-all-skills.ts` +- `.nanoclaw/` — state directory +- `add/` and `modify/` subdirectories from all skill directories +- Feature skill SKILL.md files from `.claude/skills/` on main (they now live in the marketplace) + +Operational skills (`setup`, `debug`, `update-nanoclaw`, `customize`, `update-skills`) remain on main in `.claude/skills/`. 
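+
+The removal list above could be scripted roughly like this (paths taken from this document; verify them against your tree before deleting):
+
+```bash
+git rm -r skills-engine/
+git rm scripts/apply-skill.ts scripts/uninstall-skill.ts \
+       scripts/fix-skill-drift.ts scripts/validate-all-skills.ts
+rm -rf .nanoclaw/   # state directory, assumed untracked
+git commit -m "Remove old skills engine"
+```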
+ +### New infrastructure + +- **Marketplace repo** (`qwibitai/nanoclaw-skills`) — single Claude Code plugin bundling SKILL.md files for all feature skills +- **CI GitHub Action** — merge-forward `main` into all `skill/*` branches on every push to `main`, using Claude (Haiku) for conflict resolution +- **`/update-skills` skill** — checks for and applies skill branch updates using git history +- **`CONTRIBUTORS.md`** — tracks skill contributors + +### Update skill (`/update-nanoclaw`) + +The update skill gets simpler with the branch-based approach. The old skills engine required replaying all applied skills after merging core updates — that entire step disappears. Skill changes are already in the user's git history, so `git merge upstream/main` just works. + +**What stays the same:** +- Preflight (clean working tree, upstream remote) +- Backup branch + tag +- Preview (git log, git diff, file buckets) +- Merge/cherry-pick/rebase options +- Conflict preview (dry-run merge) +- Conflict resolution +- Build + test validation +- Rollback instructions + +**What's removed:** +- Skill replay step (was needed by the old skills engine to re-apply skills after core update) +- Re-running structured operations (npm deps, env vars — these are part of git history now) + +**What's added:** +- Optional step at the end: "Check for skill updates?" which runs the `/update-skills` logic +- This checks whether any previously-merged skill branches have new commits (bug fixes, improvements to the skill itself — not just merge-forwards from main) + +**Why users don't need to re-merge skills after a core update:** +When the user merged a skill branch, those changes became part of their git history. When they later merge `upstream/main`, git performs a normal three-way merge — the skill changes in their tree are untouched, and only core changes are brought in. The merge-forward CI ensures skill branches stay compatible with latest main, but that's for new users applying the skill fresh. 
Existing users who already merged the skill don't need to do anything.
+
+Users only need to re-merge a skill branch if the skill itself was updated (not just merged-forward with main). The `/update-skills` check detects this.
+
+## Discord Announcement
+
+### For existing users
+
+> **Skills are now git branches**
+>
+> We've simplified how skills work in NanoClaw. Instead of a custom skills engine, skills are now git branches that you merge in.
+>
+> **What this means for you:**
+> - Applying a skill: `git fetch upstream skill/discord && git merge upstream/skill/discord`
+> - Updating core: `git fetch upstream main && git merge upstream/main`
+> - Checking for skill updates: `/update-skills`
+> - No more `.nanoclaw/` state directory or skills engine
+>
+> **We now recommend forking instead of cloning.** This gives you a remote to push your customizations to.
+>
+> **If you currently have a clone with local changes**, migrate to a fork:
+> 1. Fork `qwibitai/nanoclaw` on GitHub
+> 2. Run:
+> ```
+> git remote rename origin upstream
+> git remote add origin https://github.com/<your-username>/nanoclaw.git
+> git push --force origin main
+> ```
+> This works even if you're way behind — just push your current state.
+>
+> **If you previously applied skills via the old system**, your code changes are already in your working tree — nothing to redo. You can delete the `.nanoclaw/` directory. Future skills and updates use the branch-based approach.
+>
+> **Discovering skills:** Skills are now available through Claude Code's plugin marketplace. Run `/plugin` in Claude Code to browse and install available skills.
+
+### For skill contributors
+
+> **Contributing skills**
+>
+> To contribute a skill:
+> 1. Fork `qwibitai/nanoclaw`
+> 2. Branch from `main` and make your code changes
+> 3. Open a regular PR
+>
+> That's it. We'll create a `skill/<name>` branch from your PR, add you to CONTRIBUTORS.md, and add the SKILL.md to the marketplace. 
CI automatically keeps skill branches merged-forward with `main` using Claude to resolve any conflicts. +> +> **Want to run your own skill marketplace?** Maintain skill branches on your fork and create a marketplace repo. Open a PR to add it to NanoClaw's auto-discovered marketplaces — or users can add it manually via `/plugin marketplace add`. diff --git a/eslint.config.js b/eslint.config.js new file mode 100644 index 0000000..fa257de --- /dev/null +++ b/eslint.config.js @@ -0,0 +1,32 @@ +import globals from 'globals' +import pluginJs from '@eslint/js' +import tseslint from 'typescript-eslint' +import noCatchAll from 'eslint-plugin-no-catch-all' + +export default [ + { ignores: ['node_modules/', 'dist/', 'container/', 'groups/'] }, + { files: ['src/**/*.{js,ts}'] }, + { languageOptions: { globals: globals.node } }, + pluginJs.configs.recommended, + ...tseslint.configs.recommended, + { + plugins: { 'no-catch-all': noCatchAll }, + rules: { + 'preserve-caught-error': ['error', { requireCatchParameter: true }], + '@typescript-eslint/no-unused-vars': [ + 'error', + { + args: 'all', + argsIgnorePattern: '^_', + caughtErrors: 'all', + caughtErrorsIgnorePattern: '^_', + destructuredArrayIgnorePattern: '^_', + varsIgnorePattern: '^_', + ignoreRestSiblings: true, + }, + ], + 'no-catch-all/no-catch-all': 'warn', + '@typescript-eslint/no-explicit-any': 'warn', + }, + }, +] diff --git a/groups/global/CLAUDE.md b/groups/global/CLAUDE.md index 28e97a7..11988bc 100644 --- a/groups/global/CLAUDE.md +++ b/groups/global/CLAUDE.md @@ -49,10 +49,67 @@ When you learn something important: ## Message Formatting -NEVER use markdown. Only use WhatsApp/Telegram formatting: -- *single asterisks* for bold (NEVER **double asterisks**) -- _underscores_ for italic -- • bullet points -- ```triple backticks``` for code +Format messages based on the channel you're responding to. Check your group folder name: -No ## headings. No [links](url). No **double stars**. 
+
+### Slack channels (folder starts with `slack_`)
+
+Use Slack mrkdwn syntax. Run `/slack-formatting` for the full reference. Key rules:
+- `*bold*` (single asterisks)
+- `_italic_` (underscores)
+- `<url|text>` for links (NOT `[text](url)`)
+- `•` bullets (no numbered lists)
+- `:emoji:` shortcodes
+- `>` for block quotes
+- No `##` headings — use `*Bold text*` instead
+
+### WhatsApp/Telegram channels (folder starts with `whatsapp_` or `telegram_`)
+
+- `*bold*` (single asterisks, NEVER **double**)
+- `_italic_` (underscores)
+- `•` bullet points
+- ` ``` ` code blocks
+
+No `##` headings. No `[links](url)`. No `**double stars**`.
+
+### Discord channels (folder starts with `discord_`)
+
+Standard Markdown works: `**bold**`, `*italic*`, `[links](url)`, `# headings`.
+
+---
+
+## Task Scripts
+
+For any recurring task, use `schedule_task`. Frequent agent invocations — especially multiple times a day — consume API credits and can risk account restrictions. If a simple check can determine whether action is needed, add a `script` — it runs first, and the agent is only called when the check passes. This keeps invocations to a minimum.
+
+### How it works
+
+1. You provide a bash `script` alongside the `prompt` when scheduling
+2. When the task fires, the script runs first (30-second timeout)
+3. Script prints JSON to stdout: `{ "wakeAgent": true/false, "data": {...} }`
+4. If `wakeAgent: false` — nothing happens, task waits for next run
+5. 
If `wakeAgent: true` — you wake up and receive the script's data + prompt + +### Always test your script first + +Before scheduling, run the script in your sandbox to verify it works: + +```bash +bash -c 'node --input-type=module -e " + const r = await fetch(\"https://api.github.com/repos/owner/repo/pulls?state=open\"); + const prs = await r.json(); + console.log(JSON.stringify({ wakeAgent: prs.length > 0, data: prs.slice(0, 5) })); +"' +``` + +### When NOT to use scripts + +If a task requires your judgment every time (daily briefings, reminders, reports), skip the script — just use a regular prompt. + +### Frequent task guidance + +If a user wants tasks running more than ~2x daily and a script can't reduce agent wake-ups: + +- Explain that each wake-up uses API credits and risks rate limits +- Suggest restructuring with a script that checks the condition first +- If the user needs an LLM to evaluate data, suggest using an API key with direct Anthropic API calls inside the script +- Help the user find the minimum viable frequency diff --git a/groups/main/CLAUDE.md b/groups/main/CLAUDE.md index 9ae1b13..e99de77 100644 --- a/groups/main/CLAUDE.md +++ b/groups/main/CLAUDE.md @@ -43,15 +43,33 @@ When you learn something important: - Split files larger than 500 lines into folders - Keep an index in your memory for the files you create -## WhatsApp Formatting (and other messaging apps) +## Message Formatting -Do NOT use markdown headings (##) in WhatsApp messages. Only use: -- *Bold* (single asterisks) (NEVER **double asterisks**) -- _Italic_ (underscores) -- • Bullets (bullet points) -- ```Code blocks``` (triple backticks) +Format messages based on the channel. Check the group folder name prefix: -Keep messages clean and readable for WhatsApp. +### Slack channels (folder starts with `slack_`) + +Use Slack mrkdwn syntax. Run `/slack-formatting` for the full reference. 
Key rules:
+- `*bold*` (single asterisks)
+- `_italic_` (underscores)
+- `<url|text>` for links (NOT `[text](url)`)
+- `•` bullets (no numbered lists)
+- `:emoji:` shortcodes like `:white_check_mark:`, `:rocket:`
+- `>` for block quotes
+- No `##` headings — use `*Bold text*` instead
+
+### WhatsApp/Telegram (folder starts with `whatsapp_` or `telegram_`)
+
+- `*bold*` (single asterisks, NEVER **double**)
+- `_italic_` (underscores)
+- `•` bullet points
+- ` ``` ` code blocks
+
+No `##` headings. No `[links](url)`. No `**double stars**`.
+
+### Discord (folder starts with `discord_`)
+
+Standard Markdown: `**bold**`, `*italic*`, `[links](url)`, `# headings`.
 
 ---
 
@@ -59,6 +77,10 @@ Keep messages clean and readable for WhatsApp.
 
 This is the **main channel**, which has elevated privileges.
 
+## Authentication
+
+Anthropic credentials must be either an API key from console.anthropic.com (`ANTHROPIC_API_KEY`) or a long-lived OAuth token from `claude setup-token` (`CLAUDE_CODE_OAUTH_TOKEN`). Short-lived tokens from the system keychain or `~/.claude/.credentials.json` expire within hours and can cause recurring container 401s. The `/setup` skill walks through this. OneCLI manages credentials (including Anthropic auth) — run `onecli --help`. 
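+
+A quick sanity check for which credential is set (env var names from the paragraph above):
+
+```bash
+if [ -n "$ANTHROPIC_API_KEY" ]; then
+  echo "Using API key auth"
+elif [ -n "$CLAUDE_CODE_OAUTH_TOKEN" ]; then
+  echo "Using long-lived OAuth token"
+else
+  echo "No long-lived credential set; containers may hit 401s"
+fi
+```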
+ ## Container Mounts Main has read-only access to the project and read-write access to its group folder: @@ -119,13 +141,13 @@ sqlite3 /workspace/project/store/messages.db " ### Registered Groups Config -Groups are registered in `/workspace/project/data/registered_groups.json`: +Groups are registered in the SQLite `registered_groups` table: ```json { "1234567890-1234567890@g.us": { "name": "Family Chat", - "folder": "family-chat", + "folder": "whatsapp_family-chat", "trigger": "@Andy", "added_at": "2024-01-31T12:00:00.000Z" } @@ -133,32 +155,34 @@ Groups are registered in `/workspace/project/data/registered_groups.json`: ``` Fields: -- **Key**: The WhatsApp JID (unique identifier for the chat) +- **Key**: The chat JID (unique identifier — WhatsApp, Telegram, Slack, Discord, etc.) - **name**: Display name for the group -- **folder**: Folder name under `groups/` for this group's files and memory +- **folder**: Channel-prefixed folder name under `groups/` for this group's files and memory - **trigger**: The trigger word (usually same as global, but could differ) - **requiresTrigger**: Whether `@trigger` prefix is needed (default: `true`). Set to `false` for solo/personal chats where all messages should be processed +- **isMain**: Whether this is the main control group (elevated privileges, no trigger required) - **added_at**: ISO timestamp when registered ### Trigger Behavior -- **Main group**: No trigger needed — all messages are processed automatically +- **Main group** (`isMain: true`): No trigger needed — all messages are processed automatically - **Groups with `requiresTrigger: false`**: No trigger needed — all messages processed (use for 1-on-1 or solo chats) - **Other groups** (default): Messages must start with `@AssistantName` to be processed ### Adding a Group 1. Query the database to find the group's JID -2. Read `/workspace/project/data/registered_groups.json` -3. Add the new group entry with `containerConfig` if needed -4. Write the updated JSON back -5. 
Create the group folder: `/workspace/project/groups/{folder-name}/` -6. Optionally create an initial `CLAUDE.md` for the group +2. Use the `register_group` MCP tool with the JID, name, folder, and trigger +3. Optionally include `containerConfig` for additional mounts +4. The group folder is created automatically: `/workspace/project/groups/{folder-name}/` +5. Optionally create an initial `CLAUDE.md` for the group -Example folder name conventions: -- "Family Chat" → `family-chat` -- "Work Team" → `work-team` -- Use lowercase, hyphens instead of spaces +Folder naming convention — channel prefix with underscore separator: +- WhatsApp "Family Chat" → `whatsapp_family-chat` +- Telegram "Dev Team" → `telegram_dev-team` +- Discord "General" → `discord_general` +- Slack "Engineering" → `slack_engineering` +- Use lowercase, hyphens for the group name part #### Adding Additional Directories for a Group @@ -186,6 +210,37 @@ Groups can have extra directories mounted. Add `containerConfig` to their entry: The directory will appear at `/workspace/extra/webapp` in that group's container. +#### Sender Allowlist + +After registering a group, explain the sender allowlist feature to the user: + +> This group can be configured with a sender allowlist to control who can interact with me. There are two modes: +> +> - **Trigger mode** (default): Everyone's messages are stored for context, but only allowed senders can trigger me with @{AssistantName}. +> - **Drop mode**: Messages from non-allowed senders are not stored at all. +> +> For closed groups with trusted members, I recommend setting up an allow-only list so only specific people can trigger me. Want me to configure that? 
+ +If the user wants to set up an allowlist, edit `~/.config/nanoclaw/sender-allowlist.json` on the host: + +```json +{ + "default": { "allow": "*", "mode": "trigger" }, + "chats": { + "": { + "allow": ["sender-id-1", "sender-id-2"], + "mode": "trigger" + } + }, + "logDenied": true +} +``` + +Notes: +- Your own messages (`is_from_me`) explicitly bypass the allowlist in trigger checks. Bot messages are filtered out by the database query before trigger evaluation, so they never reach the allowlist. +- If the config file doesn't exist or is invalid, all senders are allowed (fail-open) +- The config file is on the host at `~/.config/nanoclaw/sender-allowlist.json`, not inside the container + ### Removing a Group 1. Read `/workspace/project/data/registered_groups.json` @@ -211,3 +266,42 @@ When scheduling tasks for other groups, use the `target_group_jid` parameter wit - `schedule_task(prompt: "...", schedule_type: "cron", schedule_value: "0 9 * * 1", target_group_jid: "120363336345536173@g.us")` The task will run in that group's context with access to their files and memory. + +--- + +## Task Scripts + +For any recurring task, use `schedule_task`. Frequent agent invocations — especially multiple times a day — consume API credits and can risk account restrictions. If a simple check can determine whether action is needed, add a `script` — it runs first, and the agent is only called when the check passes. This keeps invocations to a minimum. + +### How it works + +1. You provide a bash `script` alongside the `prompt` when scheduling +2. When the task fires, the script runs first (30-second timeout) +3. Script prints JSON to stdout: `{ "wakeAgent": true/false, "data": {...} }` +4. If `wakeAgent: false` — nothing happens, task waits for next run +5. 
If `wakeAgent: true` — you wake up and receive the script's data + prompt + +### Always test your script first + +Before scheduling, run the script in your sandbox to verify it works: + +```bash +bash -c 'node --input-type=module -e " + const r = await fetch(\"https://api.github.com/repos/owner/repo/pulls?state=open\"); + const prs = await r.json(); + console.log(JSON.stringify({ wakeAgent: prs.length > 0, data: prs.slice(0, 5) })); +"' +``` + +### When NOT to use scripts + +If a task requires your judgment every time (daily briefings, reminders, reports), skip the script — just use a regular prompt. + +### Frequent task guidance + +If a user wants tasks running more than ~2x daily and a script can't reduce agent wake-ups: + +- Explain that each wake-up uses API credits and risks rate limits +- Suggest restructuring with a script that checks the condition first +- If the user needs an LLM to evaluate data, suggest using an API key with direct Anthropic API calls inside the script +- Help the user find the minimum viable frequency diff --git a/package-lock.json b/package-lock.json index 353d39f..b1dd2ea 100644 --- a/package-lock.json +++ b/package-lock.json @@ -1,154 +1,35 @@ { "name": "nanoclaw", - "version": "1.1.3", + "version": "1.2.36", "lockfileVersion": 3, "requires": true, "packages": { "": { "name": "nanoclaw", - "version": "1.1.3", + "version": "1.2.36", "dependencies": { - "@whiskeysockets/baileys": "^7.0.0-rc.9", - "better-sqlite3": "^11.8.1", - "cron-parser": "^5.5.0", - "pino": "^9.6.0", - "pino-pretty": "^13.0.0", - "qrcode": "^1.5.4", - "qrcode-terminal": "^0.12.0", - "yaml": "^2.8.2", - "zod": "^4.3.6" + "@onecli-sh/sdk": "^0.2.0", + "better-sqlite3": "11.10.0", + "cron-parser": "5.5.0" }, "devDependencies": { + "@eslint/js": "^9.35.0", "@types/better-sqlite3": "^7.6.12", "@types/node": "^22.10.0", - "@types/qrcode-terminal": "^0.12.2", - "@vitest/coverage-v8": "^4.0.18", + "eslint": "^9.35.0", + "eslint-plugin-no-catch-all": "^1.1.0", + "globals": 
"^15.12.0", "husky": "^9.1.7", "prettier": "^3.8.1", "tsx": "^4.19.0", "typescript": "^5.7.0", + "typescript-eslint": "^8.35.0", "vitest": "^4.0.18" }, "engines": { "node": ">=20" } }, - "node_modules/@babel/helper-string-parser": { - "version": "7.27.1", - "resolved": "https://registry.npmjs.org/@babel/helper-string-parser/-/helper-string-parser-7.27.1.tgz", - "integrity": "sha512-qMlSxKbpRlAridDExk92nSobyDdpPijUq2DW6oDnUqd0iOGxmQjyqhMIihI9+zv4LPyZdRje2cavWPbCbWm3eA==", - "dev": true, - "license": "MIT", - "engines": { - "node": ">=6.9.0" - } - }, - "node_modules/@babel/helper-validator-identifier": { - "version": "7.28.5", - "resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.28.5.tgz", - "integrity": "sha512-qSs4ifwzKJSV39ucNjsvc6WVHs6b7S03sOh2OcHF9UHfVPqWWALUsNUVzhSBiItjRZoLHx7nIarVjqKVusUZ1Q==", - "dev": true, - "license": "MIT", - "engines": { - "node": ">=6.9.0" - } - }, - "node_modules/@babel/parser": { - "version": "7.29.0", - "resolved": "https://registry.npmjs.org/@babel/parser/-/parser-7.29.0.tgz", - "integrity": "sha512-IyDgFV5GeDUVX4YdF/3CPULtVGSXXMLh1xVIgdCgxApktqnQV0r7/8Nqthg+8YLGaAtdyIlo2qIdZrbCv4+7ww==", - "dev": true, - "license": "MIT", - "dependencies": { - "@babel/types": "^7.29.0" - }, - "bin": { - "parser": "bin/babel-parser.js" - }, - "engines": { - "node": ">=6.0.0" - } - }, - "node_modules/@babel/types": { - "version": "7.29.0", - "resolved": "https://registry.npmjs.org/@babel/types/-/types-7.29.0.tgz", - "integrity": "sha512-LwdZHpScM4Qz8Xw2iKSzS+cfglZzJGvofQICy7W7v4caru4EaAmyUuO6BGrbyQ2mYV11W0U8j5mBhd14dd3B0A==", - "dev": true, - "license": "MIT", - "dependencies": { - "@babel/helper-string-parser": "^7.27.1", - "@babel/helper-validator-identifier": "^7.28.5" - }, - "engines": { - "node": ">=6.9.0" - } - }, - "node_modules/@bcoe/v8-coverage": { - "version": "1.0.2", - "resolved": "https://registry.npmjs.org/@bcoe/v8-coverage/-/v8-coverage-1.0.2.tgz", - "integrity": 
"sha512-6zABk/ECA/QYSCQ1NGiVwwbQerUCZ+TQbp64Q3AgmfNvurHH0j8TtXa1qbShXA6qqkpAj4V5W8pP6mLe1mcMqA==", - "dev": true, - "license": "MIT", - "engines": { - "node": ">=18" - } - }, - "node_modules/@borewit/text-codec": { - "version": "0.2.1", - "resolved": "https://registry.npmjs.org/@borewit/text-codec/-/text-codec-0.2.1.tgz", - "integrity": "sha512-k7vvKPbf7J2fZ5klGRD9AeKfUvojuZIQ3BT5u7Jfv+puwXkUBUT5PVyMDfJZpy30CBDXGMgw7fguK/lpOMBvgw==", - "license": "MIT", - "funding": { - "type": "github", - "url": "https://github.com/sponsors/Borewit" - } - }, - "node_modules/@cacheable/memory": { - "version": "2.0.7", - "resolved": "https://registry.npmjs.org/@cacheable/memory/-/memory-2.0.7.tgz", - "integrity": "sha512-RbxnxAMf89Tp1dLhXMS7ceft/PGsDl1Ip7T20z5nZ+pwIAsQ1p2izPjVG69oCLv/jfQ7HDPHTWK0c9rcAWXN3A==", - "license": "MIT", - "dependencies": { - "@cacheable/utils": "^2.3.3", - "@keyv/bigmap": "^1.3.0", - "hookified": "^1.14.0", - "keyv": "^5.5.5" - } - }, - "node_modules/@cacheable/node-cache": { - "version": "1.7.6", - "resolved": "https://registry.npmjs.org/@cacheable/node-cache/-/node-cache-1.7.6.tgz", - "integrity": "sha512-6Omk2SgNnjtxB5f/E6bTIWIt5xhdpx39fGNRQgU9lojvRxU68v+qY+SXXLsp3ZGukqoPjsK21wZ6XABFr/Ge3A==", - "license": "MIT", - "dependencies": { - "cacheable": "^2.3.1", - "hookified": "^1.14.0", - "keyv": "^5.5.5" - }, - "engines": { - "node": ">=18" - } - }, - "node_modules/@cacheable/utils": { - "version": "2.3.4", - "resolved": "https://registry.npmjs.org/@cacheable/utils/-/utils-2.3.4.tgz", - "integrity": "sha512-knwKUJEYgIfwShABS1BX6JyJJTglAFcEU7EXqzTdiGCXur4voqkiJkdgZIQtWNFhynzDWERcTYv/sETMu3uJWA==", - "license": "MIT", - "dependencies": { - "hashery": "^1.3.0", - "keyv": "^5.6.0" - } - }, - "node_modules/@emnapi/runtime": { - "version": "1.8.1", - "resolved": "https://registry.npmjs.org/@emnapi/runtime/-/runtime-1.8.1.tgz", - "integrity": "sha512-mehfKSMWjjNol8659Z8KxEMrdSJDDot5SXMq00dM8BN4o+CLNXQ0xH2V7EchNHV4RmbZLmmPdEaXZc5H2FXmDg==", - "license": "MIT", - 
"optional": true, - "dependencies": { - "tslib": "^2.4.0" - } - }, "node_modules/@esbuild/aix-ppc64": { "version": "0.27.3", "resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.27.3.tgz", @@ -591,494 +472,210 @@ "node": ">=18" } }, - "node_modules/@hapi/boom": { - "version": "9.1.4", - "resolved": "https://registry.npmjs.org/@hapi/boom/-/boom-9.1.4.tgz", - "integrity": "sha512-Ls1oH8jaN1vNsqcaHVYJrKmgMcKsC1wcp8bujvXrHaAqD2iDYq3HoOwsxwo09Cuda5R5nC0o0IxlrlTuvPuzSw==", - "license": "BSD-3-Clause", + "node_modules/@eslint-community/eslint-utils": { + "version": "4.9.1", + "resolved": "https://registry.npmjs.org/@eslint-community/eslint-utils/-/eslint-utils-4.9.1.tgz", + "integrity": "sha512-phrYmNiYppR7znFEdqgfWHXR6NCkZEK7hwWDHZUjit/2/U0r6XvkDl0SYnoM51Hq7FhCGdLDT6zxCCOY1hexsQ==", + "dev": true, "dependencies": { - "@hapi/hoek": "9.x.x" + "eslint-visitor-keys": "^3.4.3" + }, + "engines": { + "node": "^12.22.0 || ^14.17.0 || >=16.0.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + }, + "peerDependencies": { + "eslint": "^6.0.0 || ^7.0.0 || >=8.0.0" } }, - "node_modules/@hapi/hoek": { - "version": "9.3.0", - "resolved": "https://registry.npmjs.org/@hapi/hoek/-/hoek-9.3.0.tgz", - "integrity": "sha512-/c6rf4UJlmHlC9b5BaNvzAcFv7HZ2QHaV0D4/HNlBdvFnvQq8RI4kYdhyPCl7Xj+oWvTWQ8ujhqS53LIgAe6KQ==", - "license": "BSD-3-Clause" + "node_modules/@eslint-community/eslint-utils/node_modules/eslint-visitor-keys": { + "version": "3.4.3", + "resolved": "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-3.4.3.tgz", + "integrity": "sha512-wpc+LXeiyiisxPlEkUzU6svyS1frIO3Mgxj1fdy7Pm8Ygzguax2N3Fa/D/ag1WqbOprdI+uY6wMUl8/a2G+iag==", + "dev": true, + "engines": { + "node": "^12.22.0 || ^14.17.0 || >=16.0.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + } }, - "node_modules/@img/colour": { - "version": "1.0.0", - "resolved": "https://registry.npmjs.org/@img/colour/-/colour-1.0.0.tgz", - "integrity": 
"sha512-A5P/LfWGFSl6nsckYtjw9da+19jB8hkJ6ACTGcDfEJ0aE+l2n2El7dsVM7UVHZQ9s2lmYMWlrS21YLy2IR1LUw==", - "license": "MIT", + "node_modules/@eslint-community/regexpp": { + "version": "4.12.2", + "resolved": "https://registry.npmjs.org/@eslint-community/regexpp/-/regexpp-4.12.2.tgz", + "integrity": "sha512-EriSTlt5OC9/7SXkRSCAhfSxxoSUgBm33OH+IkwbdpgoqsSsUg7y3uh+IICI/Qg4BBWr3U2i39RpmycbxMq4ew==", + "dev": true, + "engines": { + "node": "^12.0.0 || ^14.0.0 || >=16.0.0" + } + }, + "node_modules/@eslint/config-array": { + "version": "0.21.2", + "resolved": "https://registry.npmjs.org/@eslint/config-array/-/config-array-0.21.2.tgz", + "integrity": "sha512-nJl2KGTlrf9GjLimgIru+V/mzgSK0ABCDQRvxw5BjURL7WfH5uoWmizbH7QB6MmnMBd8cIC9uceWnezL1VZWWw==", + "dev": true, + "dependencies": { + "@eslint/object-schema": "^2.1.7", + "debug": "^4.3.1", + "minimatch": "^3.1.5" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + } + }, + "node_modules/@eslint/config-helpers": { + "version": "0.4.2", + "resolved": "https://registry.npmjs.org/@eslint/config-helpers/-/config-helpers-0.4.2.tgz", + "integrity": "sha512-gBrxN88gOIf3R7ja5K9slwNayVcZgK6SOUORm2uBzTeIEfeVaIhOpCtTox3P6R7o2jLFwLFTLnC7kU/RGcYEgw==", + "dev": true, + "dependencies": { + "@eslint/core": "^0.17.0" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + } + }, + "node_modules/@eslint/core": { + "version": "0.17.0", + "resolved": "https://registry.npmjs.org/@eslint/core/-/core-0.17.0.tgz", + "integrity": "sha512-yL/sLrpmtDaFEiUj1osRP4TI2MDz1AddJL+jZ7KSqvBuliN4xqYY54IfdN8qD8Toa6g1iloph1fxQNkjOxrrpQ==", + "dev": true, + "dependencies": { + "@types/json-schema": "^7.0.15" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + } + }, + "node_modules/@eslint/eslintrc": { + "version": "3.3.5", + "resolved": "https://registry.npmjs.org/@eslint/eslintrc/-/eslintrc-3.3.5.tgz", + "integrity": "sha512-4IlJx0X0qftVsN5E+/vGujTRIFtwuLbNsVUe7TO6zYPDR1O6nFwvwhIKEKSrl6dZchmYBITazxKoUYOjdtjlRg==", + "dev": true, 
+ "dependencies": { + "ajv": "^6.14.0", + "debug": "^4.3.2", + "espree": "^10.0.1", + "globals": "^14.0.0", + "ignore": "^5.2.0", + "import-fresh": "^3.2.1", + "js-yaml": "^4.1.1", + "minimatch": "^3.1.5", + "strip-json-comments": "^3.1.1" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + } + }, + "node_modules/@eslint/eslintrc/node_modules/globals": { + "version": "14.0.0", + "resolved": "https://registry.npmjs.org/globals/-/globals-14.0.0.tgz", + "integrity": "sha512-oahGvuMGQlPw/ivIYBjVSrWAfWLBeku5tpPE2fOPLi+WHffIWbuh2tCjhyQhTBPMf5E9jDEH4FOmTYgYwbKwtQ==", + "dev": true, "engines": { "node": ">=18" - } - }, - "node_modules/@img/sharp-darwin-arm64": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-darwin-arm64/-/sharp-darwin-arm64-0.34.5.tgz", - "integrity": "sha512-imtQ3WMJXbMY4fxb/Ndp6HBTNVtWCUI0WdobyheGf5+ad6xX8VIDO8u2xE4qc/fr08CKG/7dDseFtn6M6g/r3w==", - "cpu": [ - "arm64" - ], - "license": "Apache-2.0", - "optional": true, - "os": [ - "darwin" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" }, "funding": { - "url": "https://opencollective.com/libvips" - }, - "optionalDependencies": { - "@img/sharp-libvips-darwin-arm64": "1.2.4" + "url": "https://github.com/sponsors/sindresorhus" } }, - "node_modules/@img/sharp-darwin-x64": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-darwin-x64/-/sharp-darwin-x64-0.34.5.tgz", - "integrity": "sha512-YNEFAF/4KQ/PeW0N+r+aVVsoIY0/qxxikF2SWdp+NRkmMB7y9LBZAVqQ4yhGCm/H3H270OSykqmQMKLBhBJDEw==", - "cpu": [ - "x64" - ], - "license": "Apache-2.0", - "optional": true, - "os": [ - "darwin" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - }, - "optionalDependencies": { - "@img/sharp-libvips-darwin-x64": "1.2.4" - } - }, - "node_modules/@img/sharp-libvips-darwin-arm64": { - "version": "1.2.4", - 
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-arm64/-/sharp-libvips-darwin-arm64-1.2.4.tgz", - "integrity": "sha512-zqjjo7RatFfFoP0MkQ51jfuFZBnVE2pRiaydKJ1G/rHZvnsrHAOcQALIi9sA5co5xenQdTugCvtb1cuf78Vf4g==", - "cpu": [ - "arm64" - ], - "license": "LGPL-3.0-or-later", - "optional": true, - "os": [ - "darwin" - ], - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-libvips-darwin-x64": { - "version": "1.2.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-x64/-/sharp-libvips-darwin-x64-1.2.4.tgz", - "integrity": "sha512-1IOd5xfVhlGwX+zXv2N93k0yMONvUlANylbJw1eTah8K/Jtpi15KC+WSiaX/nBmbm2HxRM1gZ0nSdjSsrZbGKg==", - "cpu": [ - "x64" - ], - "license": "LGPL-3.0-or-later", - "optional": true, - "os": [ - "darwin" - ], - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-libvips-linux-arm": { - "version": "1.2.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm/-/sharp-libvips-linux-arm-1.2.4.tgz", - "integrity": "sha512-bFI7xcKFELdiNCVov8e44Ia4u2byA+l3XtsAj+Q8tfCwO6BQ8iDojYdvoPMqsKDkuoOo+X6HZA0s0q11ANMQ8A==", - "cpu": [ - "arm" - ], - "license": "LGPL-3.0-or-later", - "optional": true, - "os": [ - "linux" - ], - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-libvips-linux-arm64": { - "version": "1.2.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm64/-/sharp-libvips-linux-arm64-1.2.4.tgz", - "integrity": "sha512-excjX8DfsIcJ10x1Kzr4RcWe1edC9PquDRRPx3YVCvQv+U5p7Yin2s32ftzikXojb1PIFc/9Mt28/y+iRklkrw==", - "cpu": [ - "arm64" - ], - "license": "LGPL-3.0-or-later", - "optional": true, - "os": [ - "linux" - ], - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-libvips-linux-ppc64": { - "version": "1.2.4", - "resolved": 
"https://registry.npmjs.org/@img/sharp-libvips-linux-ppc64/-/sharp-libvips-linux-ppc64-1.2.4.tgz", - "integrity": "sha512-FMuvGijLDYG6lW+b/UvyilUWu5Ayu+3r2d1S8notiGCIyYU/76eig1UfMmkZ7vwgOrzKzlQbFSuQfgm7GYUPpA==", - "cpu": [ - "ppc64" - ], - "license": "LGPL-3.0-or-later", - "optional": true, - "os": [ - "linux" - ], - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-libvips-linux-riscv64": { - "version": "1.2.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-riscv64/-/sharp-libvips-linux-riscv64-1.2.4.tgz", - "integrity": "sha512-oVDbcR4zUC0ce82teubSm+x6ETixtKZBh/qbREIOcI3cULzDyb18Sr/Wcyx7NRQeQzOiHTNbZFF1UwPS2scyGA==", - "cpu": [ - "riscv64" - ], - "license": "LGPL-3.0-or-later", - "optional": true, - "os": [ - "linux" - ], - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-libvips-linux-s390x": { - "version": "1.2.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-s390x/-/sharp-libvips-linux-s390x-1.2.4.tgz", - "integrity": "sha512-qmp9VrzgPgMoGZyPvrQHqk02uyjA0/QrTO26Tqk6l4ZV0MPWIW6LTkqOIov+J1yEu7MbFQaDpwdwJKhbJvuRxQ==", - "cpu": [ - "s390x" - ], - "license": "LGPL-3.0-or-later", - "optional": true, - "os": [ - "linux" - ], - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-libvips-linux-x64": { - "version": "1.2.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-x64/-/sharp-libvips-linux-x64-1.2.4.tgz", - "integrity": "sha512-tJxiiLsmHc9Ax1bz3oaOYBURTXGIRDODBqhveVHonrHJ9/+k89qbLl0bcJns+e4t4rvaNBxaEZsFtSfAdquPrw==", - "cpu": [ - "x64" - ], - "license": "LGPL-3.0-or-later", - "optional": true, - "os": [ - "linux" - ], - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-libvips-linuxmusl-arm64": { - "version": "1.2.4", - "resolved": 
"https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-arm64/-/sharp-libvips-linuxmusl-arm64-1.2.4.tgz", - "integrity": "sha512-FVQHuwx1IIuNow9QAbYUzJ+En8KcVm9Lk5+uGUQJHaZmMECZmOlix9HnH7n1TRkXMS0pGxIJokIVB9SuqZGGXw==", - "cpu": [ - "arm64" - ], - "license": "LGPL-3.0-or-later", - "optional": true, - "os": [ - "linux" - ], - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-libvips-linuxmusl-x64": { - "version": "1.2.4", - "resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-x64/-/sharp-libvips-linuxmusl-x64-1.2.4.tgz", - "integrity": "sha512-+LpyBk7L44ZIXwz/VYfglaX/okxezESc6UxDSoyo2Ks6Jxc4Y7sGjpgU9s4PMgqgjj1gZCylTieNamqA1MF7Dg==", - "cpu": [ - "x64" - ], - "license": "LGPL-3.0-or-later", - "optional": true, - "os": [ - "linux" - ], - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-linux-arm": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm/-/sharp-linux-arm-0.34.5.tgz", - "integrity": "sha512-9dLqsvwtg1uuXBGZKsxem9595+ujv0sJ6Vi8wcTANSFpwV/GONat5eCkzQo/1O6zRIkh0m/8+5BjrRr7jDUSZw==", - "cpu": [ - "arm" - ], - "license": "Apache-2.0", - "optional": true, - "os": [ - "linux" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - }, - "optionalDependencies": { - "@img/sharp-libvips-linux-arm": "1.2.4" - } - }, - "node_modules/@img/sharp-linux-arm64": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-arm64/-/sharp-linux-arm64-0.34.5.tgz", - "integrity": "sha512-bKQzaJRY/bkPOXyKx5EVup7qkaojECG6NLYswgktOZjaXecSAeCWiZwwiFf3/Y+O1HrauiE3FVsGxFg8c24rZg==", - "cpu": [ - "arm64" - ], - "license": "Apache-2.0", - "optional": true, - "os": [ - "linux" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - }, - "optionalDependencies": { - 
"@img/sharp-libvips-linux-arm64": "1.2.4" - } - }, - "node_modules/@img/sharp-linux-ppc64": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-ppc64/-/sharp-linux-ppc64-0.34.5.tgz", - "integrity": "sha512-7zznwNaqW6YtsfrGGDA6BRkISKAAE1Jo0QdpNYXNMHu2+0dTrPflTLNkpc8l7MUP5M16ZJcUvysVWWrMefZquA==", - "cpu": [ - "ppc64" - ], - "license": "Apache-2.0", - "optional": true, - "os": [ - "linux" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - }, - "optionalDependencies": { - "@img/sharp-libvips-linux-ppc64": "1.2.4" - } - }, - "node_modules/@img/sharp-linux-riscv64": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-riscv64/-/sharp-linux-riscv64-0.34.5.tgz", - "integrity": "sha512-51gJuLPTKa7piYPaVs8GmByo7/U7/7TZOq+cnXJIHZKavIRHAP77e3N2HEl3dgiqdD/w0yUfiJnII77PuDDFdw==", - "cpu": [ - "riscv64" - ], - "license": "Apache-2.0", - "optional": true, - "os": [ - "linux" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - }, - "optionalDependencies": { - "@img/sharp-libvips-linux-riscv64": "1.2.4" - } - }, - "node_modules/@img/sharp-linux-s390x": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-s390x/-/sharp-linux-s390x-0.34.5.tgz", - "integrity": "sha512-nQtCk0PdKfho3eC5MrbQoigJ2gd1CgddUMkabUj+rBevs8tZ2cULOx46E7oyX+04WGfABgIwmMC0VqieTiR4jg==", - "cpu": [ - "s390x" - ], - "license": "Apache-2.0", - "optional": true, - "os": [ - "linux" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - }, - "optionalDependencies": { - "@img/sharp-libvips-linux-s390x": "1.2.4" - } - }, - "node_modules/@img/sharp-linux-x64": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linux-x64/-/sharp-linux-x64-0.34.5.tgz", - "integrity": 
"sha512-MEzd8HPKxVxVenwAa+JRPwEC7QFjoPWuS5NZnBt6B3pu7EG2Ge0id1oLHZpPJdn3OQK+BQDiw9zStiHBTJQQQQ==", - "cpu": [ - "x64" - ], - "license": "Apache-2.0", - "optional": true, - "os": [ - "linux" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - }, - "optionalDependencies": { - "@img/sharp-libvips-linux-x64": "1.2.4" - } - }, - "node_modules/@img/sharp-linuxmusl-arm64": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-arm64/-/sharp-linuxmusl-arm64-0.34.5.tgz", - "integrity": "sha512-fprJR6GtRsMt6Kyfq44IsChVZeGN97gTD331weR1ex1c1rypDEABN6Tm2xa1wE6lYb5DdEnk03NZPqA7Id21yg==", - "cpu": [ - "arm64" - ], - "license": "Apache-2.0", - "optional": true, - "os": [ - "linux" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - }, - "optionalDependencies": { - "@img/sharp-libvips-linuxmusl-arm64": "1.2.4" - } - }, - "node_modules/@img/sharp-linuxmusl-x64": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-x64/-/sharp-linuxmusl-x64-0.34.5.tgz", - "integrity": "sha512-Jg8wNT1MUzIvhBFxViqrEhWDGzqymo3sV7z7ZsaWbZNDLXRJZoRGrjulp60YYtV4wfY8VIKcWidjojlLcWrd8Q==", - "cpu": [ - "x64" - ], - "license": "Apache-2.0", - "optional": true, - "os": [ - "linux" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - }, - "optionalDependencies": { - "@img/sharp-libvips-linuxmusl-x64": "1.2.4" - } - }, - "node_modules/@img/sharp-wasm32": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-wasm32/-/sharp-wasm32-0.34.5.tgz", - "integrity": "sha512-OdWTEiVkY2PHwqkbBI8frFxQQFekHaSSkUIJkwzclWZe64O1X4UlUjqqqLaPbUpMOQk6FBu/HtlGXNblIs0huw==", - "cpu": [ - "wasm32" - ], - "license": "Apache-2.0 AND LGPL-3.0-or-later AND MIT", - "optional": true, - "dependencies": { - 
"@emnapi/runtime": "^1.7.0" - }, - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-win32-arm64": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-win32-arm64/-/sharp-win32-arm64-0.34.5.tgz", - "integrity": "sha512-WQ3AgWCWYSb2yt+IG8mnC6Jdk9Whs7O0gxphblsLvdhSpSTtmu69ZG1Gkb6NuvxsNACwiPV6cNSZNzt0KPsw7g==", - "cpu": [ - "arm64" - ], - "license": "Apache-2.0 AND LGPL-3.0-or-later", - "optional": true, - "os": [ - "win32" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-win32-ia32": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-win32-ia32/-/sharp-win32-ia32-0.34.5.tgz", - "integrity": "sha512-FV9m/7NmeCmSHDD5j4+4pNI8Cp3aW+JvLoXcTUo0IqyjSfAZJ8dIUmijx1qaJsIiU+Hosw6xM5KijAWRJCSgNg==", - "cpu": [ - "ia32" - ], - "license": "Apache-2.0 AND LGPL-3.0-or-later", - "optional": true, - "os": [ - "win32" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@img/sharp-win32-x64": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/@img/sharp-win32-x64/-/sharp-win32-x64-0.34.5.tgz", - "integrity": "sha512-+29YMsqY2/9eFEiW93eqWnuLcWcufowXewwSNIT6UwZdUUCrM3oFjMWH/Z6/TMmb4hlFenmfAVbpWeup2jryCw==", - "cpu": [ - "x64" - ], - "license": "Apache-2.0 AND LGPL-3.0-or-later", - "optional": true, - "os": [ - "win32" - ], - "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - } - }, - "node_modules/@jridgewell/resolve-uri": { - "version": "3.1.2", - "resolved": "https://registry.npmjs.org/@jridgewell/resolve-uri/-/resolve-uri-3.1.2.tgz", - "integrity": 
"sha512-bRISgCIjP20/tbWSPWMEi54QVPRZExkuD9lJL+UIxUKtwVJA8wW1Trb1jMs1RFXo1CBTNZ/5hpC9QvmKWdopKw==", + "node_modules/@eslint/eslintrc/node_modules/strip-json-comments": { + "version": "3.1.1", + "resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.1.1.tgz", + "integrity": "sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig==", "dev": true, - "license": "MIT", "engines": { - "node": ">=6.0.0" + "node": ">=8" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/@eslint/js": { + "version": "9.39.4", + "resolved": "https://registry.npmjs.org/@eslint/js/-/js-9.39.4.tgz", + "integrity": "sha512-nE7DEIchvtiFTwBw4Lfbu59PG+kCofhjsKaCWzxTpt4lfRjRMqG6uMBzKXuEcyXhOHoUp9riAm7/aWYGhXZ9cw==", + "dev": true, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://eslint.org/donate" + } + }, + "node_modules/@eslint/object-schema": { + "version": "2.1.7", + "resolved": "https://registry.npmjs.org/@eslint/object-schema/-/object-schema-2.1.7.tgz", + "integrity": "sha512-VtAOaymWVfZcmZbp6E2mympDIHvyjXs/12LqWYjVw6qjrfF+VK+fyG33kChz3nnK+SU5/NeHOqrTEHS8sXO3OA==", + "dev": true, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + } + }, + "node_modules/@eslint/plugin-kit": { + "version": "0.4.1", + "resolved": "https://registry.npmjs.org/@eslint/plugin-kit/-/plugin-kit-0.4.1.tgz", + "integrity": "sha512-43/qtrDUokr7LJqoF2c3+RInu/t4zfrpYdoSDfYyhg52rwLV6TnOvdG4fXm7IkSB3wErkcmJS9iEhjVtOSEjjA==", + "dev": true, + "dependencies": { + "@eslint/core": "^0.17.0", + "levn": "^0.4.1" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + } + }, + "node_modules/@humanfs/core": { + "version": "0.19.1", + "resolved": "https://registry.npmjs.org/@humanfs/core/-/core-0.19.1.tgz", + "integrity": "sha512-5DyQ4+1JEUzejeK1JGICcideyfUbGixgS9jNgex5nqkW+cY7WZhxBigmieN5Qnw9ZosSNVC9KQKyb+GUaGyKUA==", + "dev": true, + "engines": { + "node": 
">=18.18.0" + } + }, + "node_modules/@humanfs/node": { + "version": "0.16.7", + "resolved": "https://registry.npmjs.org/@humanfs/node/-/node-0.16.7.tgz", + "integrity": "sha512-/zUx+yOsIrG4Y43Eh2peDeKCxlRt/gET6aHfaKpuq267qXdYDFViVHfMaLyygZOnl0kGWxFIgsBy8QFuTLUXEQ==", + "dev": true, + "dependencies": { + "@humanfs/core": "^0.19.1", + "@humanwhocodes/retry": "^0.4.0" + }, + "engines": { + "node": ">=18.18.0" + } + }, + "node_modules/@humanwhocodes/module-importer": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/@humanwhocodes/module-importer/-/module-importer-1.0.1.tgz", + "integrity": "sha512-bxveV4V8v5Yb4ncFTT3rPSgZBOpCkjfK0y4oVVVJwIuDVBRMDXrPyXRL988i5ap9m9bnyEEjWfm5WkBmtffLfA==", + "dev": true, + "engines": { + "node": ">=12.22" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/nzakas" + } + }, + "node_modules/@humanwhocodes/retry": { + "version": "0.4.3", + "resolved": "https://registry.npmjs.org/@humanwhocodes/retry/-/retry-0.4.3.tgz", + "integrity": "sha512-bV0Tgo9K4hfPCek+aMAn81RppFKv2ySDQeMoSZuvTASywNTnVJCArCZE2FWqpvIatKu7VMRLWlR1EazvVhDyhQ==", + "dev": true, + "engines": { + "node": ">=18.18" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/nzakas" } }, "node_modules/@jridgewell/sourcemap-codec": { @@ -1088,109 +685,14 @@ "dev": true, "license": "MIT" }, - "node_modules/@jridgewell/trace-mapping": { - "version": "0.3.31", - "resolved": "https://registry.npmjs.org/@jridgewell/trace-mapping/-/trace-mapping-0.3.31.tgz", - "integrity": "sha512-zzNR+SdQSDJzc8joaeP8QQoCQr8NuYx2dIIytl1QeBEZHJ9uW6hebsrYgbz8hJwUQao3TWCMtmfV8Nu1twOLAw==", - "dev": true, - "license": "MIT", - "dependencies": { - "@jridgewell/resolve-uri": "^3.1.0", - "@jridgewell/sourcemap-codec": "^1.4.14" - } - }, - "node_modules/@keyv/bigmap": { - "version": "1.3.1", - "resolved": "https://registry.npmjs.org/@keyv/bigmap/-/bigmap-1.3.1.tgz", - "integrity": 
"sha512-WbzE9sdmQtKy8vrNPa9BRnwZh5UF4s1KTmSK0KUVLo3eff5BlQNNWDnFOouNpKfPKDnms9xynJjsMYjMaT/aFQ==", - "license": "MIT", - "dependencies": { - "hashery": "^1.4.0", - "hookified": "^1.15.0" - }, + "node_modules/@onecli-sh/sdk": { + "version": "0.2.0", + "resolved": "https://registry.npmjs.org/@onecli-sh/sdk/-/sdk-0.2.0.tgz", + "integrity": "sha512-u7PqWROEvTV9f0ADVkjigTrd2AZn3klbPrv7GGpeRHIJpjAxJUdlWqxr5kiGt6qTDKL8t3nq76xr4X2pxTiyBg==", "engines": { - "node": ">= 18" - }, - "peerDependencies": { - "keyv": "^5.6.0" + "node": ">=20" } }, - "node_modules/@keyv/serialize": { - "version": "1.1.1", - "resolved": "https://registry.npmjs.org/@keyv/serialize/-/serialize-1.1.1.tgz", - "integrity": "sha512-dXn3FZhPv0US+7dtJsIi2R+c7qWYiReoEh5zUntWCf4oSpMNib8FDhSoed6m3QyZdx5hK7iLFkYk3rNxwt8vTA==", - "license": "MIT" - }, - "node_modules/@pinojs/redact": { - "version": "0.4.0", - "resolved": "https://registry.npmjs.org/@pinojs/redact/-/redact-0.4.0.tgz", - "integrity": "sha512-k2ENnmBugE/rzQfEcdWHcCY+/FM3VLzH9cYEsbdsoqrvzAKRhUZeRNhAZvB8OitQJ1TBed3yqWtdjzS6wJKBwg==", - "license": "MIT" - }, - "node_modules/@protobufjs/aspromise": { - "version": "1.1.2", - "resolved": "https://registry.npmjs.org/@protobufjs/aspromise/-/aspromise-1.1.2.tgz", - "integrity": "sha512-j+gKExEuLmKwvz3OgROXtrJ2UG2x8Ch2YZUxahh+s1F2HZ+wAceUNLkvy6zKCPVRkU++ZWQrdxsUeQXmcg4uoQ==", - "license": "BSD-3-Clause" - }, - "node_modules/@protobufjs/base64": { - "version": "1.1.2", - "resolved": "https://registry.npmjs.org/@protobufjs/base64/-/base64-1.1.2.tgz", - "integrity": "sha512-AZkcAA5vnN/v4PDqKyMR5lx7hZttPDgClv83E//FMNhR2TMcLUhfRUBHCmSl0oi9zMgDDqRUJkSxO3wm85+XLg==", - "license": "BSD-3-Clause" - }, - "node_modules/@protobufjs/codegen": { - "version": "2.0.4", - "resolved": "https://registry.npmjs.org/@protobufjs/codegen/-/codegen-2.0.4.tgz", - "integrity": "sha512-YyFaikqM5sH0ziFZCN3xDC7zeGaB/d0IUb9CATugHWbd1FRFwWwt4ld4OYMPWu5a3Xe01mGAULCdqhMlPl29Jg==", - "license": "BSD-3-Clause" - }, - 
"node_modules/@protobufjs/eventemitter": { - "version": "1.1.0", - "resolved": "https://registry.npmjs.org/@protobufjs/eventemitter/-/eventemitter-1.1.0.tgz", - "integrity": "sha512-j9ednRT81vYJ9OfVuXG6ERSTdEL1xVsNgqpkxMsbIabzSo3goCjDIveeGv5d03om39ML71RdmrGNjG5SReBP/Q==", - "license": "BSD-3-Clause" - }, - "node_modules/@protobufjs/fetch": { - "version": "1.1.0", - "resolved": "https://registry.npmjs.org/@protobufjs/fetch/-/fetch-1.1.0.tgz", - "integrity": "sha512-lljVXpqXebpsijW71PZaCYeIcE5on1w5DlQy5WH6GLbFryLUrBD4932W/E2BSpfRJWseIL4v/KPgBFxDOIdKpQ==", - "license": "BSD-3-Clause", - "dependencies": { - "@protobufjs/aspromise": "^1.1.1", - "@protobufjs/inquire": "^1.1.0" - } - }, - "node_modules/@protobufjs/float": { - "version": "1.0.2", - "resolved": "https://registry.npmjs.org/@protobufjs/float/-/float-1.0.2.tgz", - "integrity": "sha512-Ddb+kVXlXst9d+R9PfTIxh1EdNkgoRe5tOX6t01f1lYWOvJnSPDBlG241QLzcyPdoNTsblLUdujGSE4RzrTZGQ==", - "license": "BSD-3-Clause" - }, - "node_modules/@protobufjs/inquire": { - "version": "1.1.0", - "resolved": "https://registry.npmjs.org/@protobufjs/inquire/-/inquire-1.1.0.tgz", - "integrity": "sha512-kdSefcPdruJiFMVSbn801t4vFK7KB/5gd2fYvrxhuJYg8ILrmn9SKSX2tZdV6V+ksulWqS7aXjBcRXl3wHoD9Q==", - "license": "BSD-3-Clause" - }, - "node_modules/@protobufjs/path": { - "version": "1.1.2", - "resolved": "https://registry.npmjs.org/@protobufjs/path/-/path-1.1.2.tgz", - "integrity": "sha512-6JOcJ5Tm08dOHAbdR3GrvP+yUUfkjG5ePsHYczMFLq3ZmMkAD98cDgcT2iA1lJ9NVwFd4tH/iSSoe44YWkltEA==", - "license": "BSD-3-Clause" - }, - "node_modules/@protobufjs/pool": { - "version": "1.1.0", - "resolved": "https://registry.npmjs.org/@protobufjs/pool/-/pool-1.1.0.tgz", - "integrity": "sha512-0kELaGSIDBKvcgS4zkjz1PeddatrjYcmMWOlAuAPwAeccUrPHdUqo/J6LiymHHEiJT5NrF1UVwxY14f+fy4WQw==", - "license": "BSD-3-Clause" - }, - "node_modules/@protobufjs/utf8": { - "version": "1.1.0", - "resolved": "https://registry.npmjs.org/@protobufjs/utf8/-/utf8-1.1.0.tgz", - "integrity": 
"sha512-Vvn3zZrhQZkkBE8LSuW3em98c0FwgO4nxzv6OdSxPKJIEKY2bGbHn+mhGIPerzI4twdxaP8/0+06HBpwf345Lw==", - "license": "BSD-3-Clause" - }, "node_modules/@rollup/rollup-android-arm-eabi": { "version": "4.57.1", "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.57.1.tgz", @@ -1548,29 +1050,6 @@ "dev": true, "license": "MIT" }, - "node_modules/@tokenizer/inflate": { - "version": "0.4.1", - "resolved": "https://registry.npmjs.org/@tokenizer/inflate/-/inflate-0.4.1.tgz", - "integrity": "sha512-2mAv+8pkG6GIZiF1kNg1jAjh27IDxEPKwdGul3snfztFerfPGI1LjDezZp3i7BElXompqEtPmoPx6c2wgtWsOA==", - "license": "MIT", - "dependencies": { - "debug": "^4.4.3", - "token-types": "^6.1.1" - }, - "engines": { - "node": ">=18" - }, - "funding": { - "type": "github", - "url": "https://github.com/sponsors/Borewit" - } - }, - "node_modules/@tokenizer/token": { - "version": "0.3.0", - "resolved": "https://registry.npmjs.org/@tokenizer/token/-/token-0.3.0.tgz", - "integrity": "sha512-OvjF+z51L3ov0OyAU0duzsYuvO01PH7x4t6DJx+guahgTnBHkhJdG7soQeTSFLWN3efnHyibZ4Z8l2EuWwJN3A==", - "license": "MIT" - }, "node_modules/@types/better-sqlite3": { "version": "7.6.13", "resolved": "https://registry.npmjs.org/@types/better-sqlite3/-/better-sqlite3-7.6.13.tgz", @@ -1606,57 +1085,287 @@ "dev": true, "license": "MIT" }, - "node_modules/@types/long": { - "version": "4.0.2", - "resolved": "https://registry.npmjs.org/@types/long/-/long-4.0.2.tgz", - "integrity": "sha512-MqTGEo5bj5t157U6fA/BiDynNkn0YknVdh48CMPkTSpFTVmvao5UQmm7uEF6xBEo7qIMAlY/JSleYaE6VOdpaA==", - "license": "MIT" + "node_modules/@types/json-schema": { + "version": "7.0.15", + "resolved": "https://registry.npmjs.org/@types/json-schema/-/json-schema-7.0.15.tgz", + "integrity": "sha512-5+fP8P8MFNC+AyZCDxrB2pkZFPGzqQWUzpSeuuVLvm8VMcorNYavBqoFcxK8bQz4Qsbn4oUEEem4wDLfcysGHA==", + "dev": true }, "node_modules/@types/node": { "version": "22.19.11", "resolved": 
"https://registry.npmjs.org/@types/node/-/node-22.19.11.tgz", "integrity": "sha512-BH7YwL6rA93ReqeQS1c4bsPpcfOmJasG+Fkr6Y59q83f9M1WcBRHR2vM+P9eOisYRcN3ujQoiZY8uk5W+1WL8w==", + "dev": true, "license": "MIT", "dependencies": { "undici-types": "~6.21.0" } }, - "node_modules/@types/qrcode-terminal": { - "version": "0.12.2", - "resolved": "https://registry.npmjs.org/@types/qrcode-terminal/-/qrcode-terminal-0.12.2.tgz", - "integrity": "sha512-v+RcIEJ+Uhd6ygSQ0u5YYY7ZM+la7GgPbs0V/7l/kFs2uO4S8BcIUEMoP7za4DNIqNnUD5npf0A/7kBhrCKG5Q==", + "node_modules/@typescript-eslint/eslint-plugin": { + "version": "8.57.2", + "resolved": "https://registry.npmjs.org/@typescript-eslint/eslint-plugin/-/eslint-plugin-8.57.2.tgz", + "integrity": "sha512-NZZgp0Fm2IkD+La5PR81sd+g+8oS6JwJje+aRWsDocxHkjyRw0J5L5ZTlN3LI1LlOcGL7ph3eaIUmTXMIjLk0w==", "dev": true, - "license": "MIT" - }, - "node_modules/@vitest/coverage-v8": { - "version": "4.0.18", - "resolved": "https://registry.npmjs.org/@vitest/coverage-v8/-/coverage-v8-4.0.18.tgz", - "integrity": "sha512-7i+N2i0+ME+2JFZhfuz7Tg/FqKtilHjGyGvoHYQ6iLV0zahbsJ9sljC9OcFcPDbhYKCet+sG8SsVqlyGvPflZg==", - "dev": true, - "license": "MIT", "dependencies": { - "@bcoe/v8-coverage": "^1.0.2", - "@vitest/utils": "4.0.18", - "ast-v8-to-istanbul": "^0.3.10", - "istanbul-lib-coverage": "^3.2.2", - "istanbul-lib-report": "^3.0.1", - "istanbul-reports": "^3.2.0", - "magicast": "^0.5.1", - "obug": "^2.1.1", - "std-env": "^3.10.0", - "tinyrainbow": "^3.0.3" + "@eslint-community/regexpp": "^4.12.2", + "@typescript-eslint/scope-manager": "8.57.2", + "@typescript-eslint/type-utils": "8.57.2", + "@typescript-eslint/utils": "8.57.2", + "@typescript-eslint/visitor-keys": "8.57.2", + "ignore": "^7.0.5", + "natural-compare": "^1.4.0", + "ts-api-utils": "^2.4.0" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" }, "funding": { - "url": "https://opencollective.com/vitest" + "type": "opencollective", + "url": "https://opencollective.com/typescript-eslint" }, 
"peerDependencies": { - "@vitest/browser": "4.0.18", - "vitest": "4.0.18" + "@typescript-eslint/parser": "^8.57.2", + "eslint": "^8.57.0 || ^9.0.0 || ^10.0.0", + "typescript": ">=4.8.4 <6.0.0" + } + }, + "node_modules/@typescript-eslint/eslint-plugin/node_modules/ignore": { + "version": "7.0.5", + "resolved": "https://registry.npmjs.org/ignore/-/ignore-7.0.5.tgz", + "integrity": "sha512-Hs59xBNfUIunMFgWAbGX5cq6893IbWg4KnrjbYwX3tx0ztorVgTDA6B2sxf8ejHJ4wz8BqGUMYlnzNBer5NvGg==", + "dev": true, + "engines": { + "node": ">= 4" + } + }, + "node_modules/@typescript-eslint/parser": { + "version": "8.57.2", + "resolved": "https://registry.npmjs.org/@typescript-eslint/parser/-/parser-8.57.2.tgz", + "integrity": "sha512-30ScMRHIAD33JJQkgfGW1t8CURZtjc2JpTrq5n2HFhOefbAhb7ucc7xJwdWcrEtqUIYJ73Nybpsggii6GtAHjA==", + "dev": true, + "dependencies": { + "@typescript-eslint/scope-manager": "8.57.2", + "@typescript-eslint/types": "8.57.2", + "@typescript-eslint/typescript-estree": "8.57.2", + "@typescript-eslint/visitor-keys": "8.57.2", + "debug": "^4.4.3" }, - "peerDependenciesMeta": { - "@vitest/browser": { - "optional": true - } + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/typescript-eslint" + }, + "peerDependencies": { + "eslint": "^8.57.0 || ^9.0.0 || ^10.0.0", + "typescript": ">=4.8.4 <6.0.0" + } + }, + "node_modules/@typescript-eslint/project-service": { + "version": "8.57.2", + "resolved": "https://registry.npmjs.org/@typescript-eslint/project-service/-/project-service-8.57.2.tgz", + "integrity": "sha512-FuH0wipFywXRTHf+bTTjNyuNQQsQC3qh/dYzaM4I4W0jrCqjCVuUh99+xd9KamUfmCGPvbO8NDngo/vsnNVqgw==", + "dev": true, + "dependencies": { + "@typescript-eslint/tsconfig-utils": "^8.57.2", + "@typescript-eslint/types": "^8.57.2", + "debug": "^4.4.3" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "type": "opencollective", + "url": 
"https://opencollective.com/typescript-eslint" + }, + "peerDependencies": { + "typescript": ">=4.8.4 <6.0.0" + } + }, + "node_modules/@typescript-eslint/scope-manager": { + "version": "8.57.2", + "resolved": "https://registry.npmjs.org/@typescript-eslint/scope-manager/-/scope-manager-8.57.2.tgz", + "integrity": "sha512-snZKH+W4WbWkrBqj4gUNRIGb/jipDW3qMqVJ4C9rzdFc+wLwruxk+2a5D+uoFcKPAqyqEnSb4l2ULuZf95eSkw==", + "dev": true, + "dependencies": { + "@typescript-eslint/types": "8.57.2", + "@typescript-eslint/visitor-keys": "8.57.2" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/typescript-eslint" + } + }, + "node_modules/@typescript-eslint/tsconfig-utils": { + "version": "8.57.2", + "resolved": "https://registry.npmjs.org/@typescript-eslint/tsconfig-utils/-/tsconfig-utils-8.57.2.tgz", + "integrity": "sha512-3Lm5DSM+DCowsUOJC+YqHHnKEfFh5CoGkj5Z31NQSNF4l5wdOwqGn99wmwN/LImhfY3KJnmordBq/4+VDe2eKw==", + "dev": true, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/typescript-eslint" + }, + "peerDependencies": { + "typescript": ">=4.8.4 <6.0.0" + } + }, + "node_modules/@typescript-eslint/type-utils": { + "version": "8.57.2", + "resolved": "https://registry.npmjs.org/@typescript-eslint/type-utils/-/type-utils-8.57.2.tgz", + "integrity": "sha512-Co6ZCShm6kIbAM/s+oYVpKFfW7LBc6FXoPXjTRQ449PPNBY8U0KZXuevz5IFuuUj2H9ss40atTaf9dlGLzbWZg==", + "dev": true, + "dependencies": { + "@typescript-eslint/types": "8.57.2", + "@typescript-eslint/typescript-estree": "8.57.2", + "@typescript-eslint/utils": "8.57.2", + "debug": "^4.4.3", + "ts-api-utils": "^2.4.0" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/typescript-eslint" + }, + "peerDependencies": { + "eslint": "^8.57.0 || ^9.0.0 || ^10.0.0", + 
"typescript": ">=4.8.4 <6.0.0" + } + }, + "node_modules/@typescript-eslint/types": { + "version": "8.57.2", + "resolved": "https://registry.npmjs.org/@typescript-eslint/types/-/types-8.57.2.tgz", + "integrity": "sha512-/iZM6FnM4tnx9csuTxspMW4BOSegshwX5oBDznJ7S4WggL7Vczz5d2W11ecc4vRrQMQHXRSxzrCsyG5EsPPTbA==", + "dev": true, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/typescript-eslint" + } + }, + "node_modules/@typescript-eslint/typescript-estree": { + "version": "8.57.2", + "resolved": "https://registry.npmjs.org/@typescript-eslint/typescript-estree/-/typescript-estree-8.57.2.tgz", + "integrity": "sha512-2MKM+I6g8tJxfSmFKOnHv2t8Sk3T6rF20A1Puk0svLK+uVapDZB/4pfAeB7nE83uAZrU6OxW+HmOd5wHVdXwXA==", + "dev": true, + "dependencies": { + "@typescript-eslint/project-service": "8.57.2", + "@typescript-eslint/tsconfig-utils": "8.57.2", + "@typescript-eslint/types": "8.57.2", + "@typescript-eslint/visitor-keys": "8.57.2", + "debug": "^4.4.3", + "minimatch": "^10.2.2", + "semver": "^7.7.3", + "tinyglobby": "^0.2.15", + "ts-api-utils": "^2.4.0" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/typescript-eslint" + }, + "peerDependencies": { + "typescript": ">=4.8.4 <6.0.0" + } + }, + "node_modules/@typescript-eslint/typescript-estree/node_modules/balanced-match": { + "version": "4.0.4", + "resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-4.0.4.tgz", + "integrity": "sha512-BLrgEcRTwX2o6gGxGOCNyMvGSp35YofuYzw9h1IMTRmKqttAZZVU67bdb9Pr2vUHA8+j3i2tJfjO6C6+4myGTA==", + "dev": true, + "engines": { + "node": "18 || 20 || >=22" + } + }, + "node_modules/@typescript-eslint/typescript-estree/node_modules/brace-expansion": { + "version": "5.0.5", + "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-5.0.5.tgz", + "integrity": 
"sha512-VZznLgtwhn+Mact9tfiwx64fA9erHH/MCXEUfB/0bX/6Fz6ny5EGTXYltMocqg4xFAQZtnO3DHWWXi8RiuN7cQ==", + "dev": true, + "dependencies": { + "balanced-match": "^4.0.2" + }, + "engines": { + "node": "18 || 20 || >=22" + } + }, + "node_modules/@typescript-eslint/typescript-estree/node_modules/minimatch": { + "version": "10.2.4", + "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-10.2.4.tgz", + "integrity": "sha512-oRjTw/97aTBN0RHbYCdtF1MQfvusSIBQM0IZEgzl6426+8jSC0nF1a/GmnVLpfB9yyr6g6FTqWqiZVbxrtaCIg==", + "dev": true, + "dependencies": { + "brace-expansion": "^5.0.2" + }, + "engines": { + "node": "18 || 20 || >=22" + }, + "funding": { + "url": "https://github.com/sponsors/isaacs" + } + }, + "node_modules/@typescript-eslint/utils": { + "version": "8.57.2", + "resolved": "https://registry.npmjs.org/@typescript-eslint/utils/-/utils-8.57.2.tgz", + "integrity": "sha512-krRIbvPK1ju1WBKIefiX+bngPs+odIQUtR7kymzPfo1POVw3jlF+nLkmexdSSd4UCbDcQn+wMBATOOmpBbqgKg==", + "dev": true, + "dependencies": { + "@eslint-community/eslint-utils": "^4.9.1", + "@typescript-eslint/scope-manager": "8.57.2", + "@typescript-eslint/types": "8.57.2", + "@typescript-eslint/typescript-estree": "8.57.2" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/typescript-eslint" + }, + "peerDependencies": { + "eslint": "^8.57.0 || ^9.0.0 || ^10.0.0", + "typescript": ">=4.8.4 <6.0.0" + } + }, + "node_modules/@typescript-eslint/visitor-keys": { + "version": "8.57.2", + "resolved": "https://registry.npmjs.org/@typescript-eslint/visitor-keys/-/visitor-keys-8.57.2.tgz", + "integrity": "sha512-zhahknjobV2FiD6Ee9iLbS7OV9zi10rG26odsQdfBO/hjSzUQbkIYgda+iNKK1zNiW2ey+Lf8MU5btN17V3dUw==", + "dev": true, + "dependencies": { + "@typescript-eslint/types": "8.57.2", + "eslint-visitor-keys": "^5.0.0" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "type": "opencollective", + "url": 
"https://opencollective.com/typescript-eslint" + } + }, + "node_modules/@typescript-eslint/visitor-keys/node_modules/eslint-visitor-keys": { + "version": "5.0.1", + "resolved": "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-5.0.1.tgz", + "integrity": "sha512-tD40eHxA35h0PEIZNeIjkHoDR4YjjJp34biM0mDvplBe//mB+IHCqHDGV7pxF+7MklTvighcCPPZC7ynWyjdTA==", + "dev": true, + "engines": { + "node": "^20.19.0 || ^22.13.0 || >=24" + }, + "funding": { + "url": "https://opencollective.com/eslint" } }, "node_modules/@vitest/expect": { @@ -1770,59 +1479,48 @@ "url": "https://opencollective.com/vitest" } }, - "node_modules/@whiskeysockets/baileys": { - "version": "7.0.0-rc.9", - "resolved": "https://registry.npmjs.org/@whiskeysockets/baileys/-/baileys-7.0.0-rc.9.tgz", - "integrity": "sha512-YFm5gKXfDP9byCXCW3OPHKXLzrAKzolzgVUlRosHHgwbnf2YOO3XknkMm6J7+F0ns8OA0uuSBhgkRHTDtqkacw==", - "hasInstallScript": true, - "license": "MIT", - "dependencies": { - "@cacheable/node-cache": "^1.4.0", - "@hapi/boom": "^9.1.3", - "async-mutex": "^0.5.0", - "libsignal": "git+https://github.com/whiskeysockets/libsignal-node.git", - "lru-cache": "^11.1.0", - "music-metadata": "^11.7.0", - "p-queue": "^9.0.0", - "pino": "^9.6", - "protobufjs": "^7.2.4", - "ws": "^8.13.0" + "node_modules/acorn": { + "version": "8.16.0", + "resolved": "https://registry.npmjs.org/acorn/-/acorn-8.16.0.tgz", + "integrity": "sha512-UVJyE9MttOsBQIDKw1skb9nAwQuR5wuGD3+82K6JgJlm/Y+KI92oNsMNGZCYdDsVtRHSak0pcV5Dno5+4jh9sw==", + "dev": true, + "bin": { + "acorn": "bin/acorn" }, "engines": { - "node": ">=20.0.0" - }, - "peerDependencies": { - "audio-decode": "^2.1.3", - "jimp": "^1.6.0", - "link-preview-js": "^3.0.0", - "sharp": "*" - }, - "peerDependenciesMeta": { - "audio-decode": { - "optional": true - }, - "jimp": { - "optional": true - }, - "link-preview-js": { - "optional": true - } + "node": ">=0.4.0" } }, - "node_modules/ansi-regex": { - "version": "5.0.1", - "resolved": 
"https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz", - "integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==", - "license": "MIT", - "engines": { - "node": ">=8" + "node_modules/acorn-jsx": { + "version": "5.3.2", + "resolved": "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-5.3.2.tgz", + "integrity": "sha512-rq9s+JNhf0IChjtDXxllJ7g41oZk5SlXtp0LHwyA5cejwn7vKmKp4pPri6YEePv2PU65sAsegbXtIinmDFDXgQ==", + "dev": true, + "peerDependencies": { + "acorn": "^6.0.0 || ^7.0.0 || ^8.0.0" + } + }, + "node_modules/ajv": { + "version": "6.14.0", + "resolved": "https://registry.npmjs.org/ajv/-/ajv-6.14.0.tgz", + "integrity": "sha512-IWrosm/yrn43eiKqkfkHis7QioDleaXQHdDVPKg0FSwwd/DuvyX79TZnFOnYpB7dcsFAMmtFztZuXPDvSePkFw==", + "dev": true, + "dependencies": { + "fast-deep-equal": "^3.1.1", + "fast-json-stable-stringify": "^2.0.0", + "json-schema-traverse": "^0.4.1", + "uri-js": "^4.2.2" + }, + "funding": { + "type": "github", + "url": "https://github.com/sponsors/epoberezkin" } }, "node_modules/ansi-styles": { "version": "4.3.0", "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz", "integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==", - "license": "MIT", + "dev": true, "dependencies": { "color-convert": "^2.0.1" }, @@ -1833,6 +1531,12 @@ "url": "https://github.com/chalk/ansi-styles?sponsor=1" } }, + "node_modules/argparse": { + "version": "2.0.1", + "resolved": "https://registry.npmjs.org/argparse/-/argparse-2.0.1.tgz", + "integrity": "sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==", + "dev": true + }, "node_modules/assertion-error": { "version": "2.0.1", "resolved": "https://registry.npmjs.org/assertion-error/-/assertion-error-2.0.1.tgz", @@ -1843,35 +1547,11 @@ "node": ">=12" } }, - "node_modules/ast-v8-to-istanbul": { - "version": "0.3.11", - "resolved": 
"https://registry.npmjs.org/ast-v8-to-istanbul/-/ast-v8-to-istanbul-0.3.11.tgz", - "integrity": "sha512-Qya9fkoofMjCBNVdWINMjB5KZvkYfaO9/anwkWnjxibpWUxo5iHl2sOdP7/uAqaRuUYuoo8rDwnbaaKVFxoUvw==", - "dev": true, - "license": "MIT", - "dependencies": { - "@jridgewell/trace-mapping": "^0.3.31", - "estree-walker": "^3.0.3", - "js-tokens": "^10.0.0" - } - }, - "node_modules/async-mutex": { - "version": "0.5.0", - "resolved": "https://registry.npmjs.org/async-mutex/-/async-mutex-0.5.0.tgz", - "integrity": "sha512-1A94B18jkJ3DYq284ohPxoXbfTA5HsQ7/Mf4DEhcyLx3Bz27Rh59iScbB6EPiP+B+joue6YCxcMXSbFC1tZKwA==", - "license": "MIT", - "dependencies": { - "tslib": "^2.4.0" - } - }, - "node_modules/atomic-sleep": { - "version": "1.0.0", - "resolved": "https://registry.npmjs.org/atomic-sleep/-/atomic-sleep-1.0.0.tgz", - "integrity": "sha512-kNOjDqAh7px0XWNI+4QbzoiR/nTkHAWNud2uvnJquD1/x5a7EQZMJT0AczqK0Qn67oY/TTQ1LbUKajZpp3I9tQ==", - "license": "MIT", - "engines": { - "node": ">=8.0.0" - } + "node_modules/balanced-match": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz", + "integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==", + "dev": true }, "node_modules/base64-js": { "version": "1.5.1", @@ -1924,6 +1604,16 @@ "readable-stream": "^3.4.0" } }, + "node_modules/brace-expansion": { + "version": "1.1.12", + "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz", + "integrity": "sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==", + "dev": true, + "dependencies": { + "balanced-match": "^1.0.0", + "concat-map": "0.0.1" + } + }, "node_modules/buffer": { "version": "5.7.1", "resolved": "https://registry.npmjs.org/buffer/-/buffer-5.7.1.tgz", @@ -1948,24 +1638,11 @@ "ieee754": "^1.1.13" } }, - "node_modules/cacheable": { - "version": "2.3.2", - "resolved": 
"https://registry.npmjs.org/cacheable/-/cacheable-2.3.2.tgz", - "integrity": "sha512-w+ZuRNmex9c1TR9RcsxbfTKCjSL0rh1WA5SABbrWprIHeNBdmyQLSYonlDy9gpD+63XT8DgZ/wNh1Smvc9WnJA==", - "license": "MIT", - "dependencies": { - "@cacheable/memory": "^2.0.7", - "@cacheable/utils": "^2.3.3", - "hookified": "^1.15.0", - "keyv": "^5.5.5", - "qified": "^0.6.0" - } - }, - "node_modules/camelcase": { - "version": "5.3.1", - "resolved": "https://registry.npmjs.org/camelcase/-/camelcase-5.3.1.tgz", - "integrity": "sha512-L28STB170nwWS63UjtlEOE3dldQApaJXZkOI1uMFfzf3rRuPegHaHesyee+YxQ+W6SvRDQV6UrdOdRiR153wJg==", - "license": "MIT", + "node_modules/callsites": { + "version": "3.1.0", + "resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz", + "integrity": "sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==", + "dev": true, "engines": { "node": ">=6" } @@ -1980,28 +1657,33 @@ "node": ">=18" } }, + "node_modules/chalk": { + "version": "4.1.2", + "resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz", + "integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==", + "dev": true, + "dependencies": { + "ansi-styles": "^4.1.0", + "supports-color": "^7.1.0" + }, + "engines": { + "node": ">=10" + }, + "funding": { + "url": "https://github.com/chalk/chalk?sponsor=1" + } + }, "node_modules/chownr": { "version": "1.1.4", "resolved": "https://registry.npmjs.org/chownr/-/chownr-1.1.4.tgz", "integrity": "sha512-jJ0bqzaylmJtVnNgzTeSOs8DPavpbYgEr/b0YL8/2GO3xJEhInFmhKMUnEJQjZumK7KXGFhUy89PrsJWlakBVg==", "license": "ISC" }, - "node_modules/cliui": { - "version": "6.0.0", - "resolved": "https://registry.npmjs.org/cliui/-/cliui-6.0.0.tgz", - "integrity": "sha512-t6wbgtoCXvAzst7QgXxJYqPt0usEfbgQdftEPbLL/cvv6HPE5VgvqCuAIDR0NgU52ds6rFwqrgakNLrHEjCbrQ==", - "license": "ISC", - "dependencies": { - "string-width": "^4.2.0", - "strip-ansi": "^6.0.0", - "wrap-ansi": "^6.2.0" - } - }, 
"node_modules/color-convert": { "version": "2.0.1", "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz", "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==", - "license": "MIT", + "dev": true, "dependencies": { "color-name": "~1.1.4" }, @@ -2013,22 +1695,13 @@ "version": "1.1.4", "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz", "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==", - "license": "MIT" + "dev": true }, - "node_modules/colorette": { - "version": "2.0.20", - "resolved": "https://registry.npmjs.org/colorette/-/colorette-2.0.20.tgz", - "integrity": "sha512-IfEDxwoWIjkeXL1eXcDiow4UbKjhLdq6/EuSVR9GMN7KVH3r9gQ83e73hsz1Nd1T3ijd5xv1wcWRYO+D6kCI2w==", - "license": "MIT" - }, - "node_modules/content-type": { - "version": "1.0.5", - "resolved": "https://registry.npmjs.org/content-type/-/content-type-1.0.5.tgz", - "integrity": "sha512-nTjqfcBFEipKdXCv4YDQWCfmcLZKm81ldF0pAopTvyrFGVbcR6P/VAAd5G7N+0tTr8QqiU0tFadD6FK4NtJwOA==", - "license": "MIT", - "engines": { - "node": ">= 0.6" - } + "node_modules/concat-map": { + "version": "0.0.1", + "resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz", + "integrity": "sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg==", + "dev": true }, "node_modules/cron-parser": { "version": "5.5.0", @@ -2042,25 +1715,25 @@ "node": ">=18" } }, - "node_modules/curve25519-js": { - "version": "0.0.4", - "resolved": "https://registry.npmjs.org/curve25519-js/-/curve25519-js-0.0.4.tgz", - "integrity": "sha512-axn2UMEnkhyDUPWOwVKBMVIzSQy2ejH2xRGy1wq81dqRwApXfIzfbE3hIX0ZRFBIihf/KDqK158DLwESu4AK1w==", - "license": "MIT" - }, - "node_modules/dateformat": { - "version": "4.6.3", - "resolved": "https://registry.npmjs.org/dateformat/-/dateformat-4.6.3.tgz", - "integrity": 
"sha512-2P0p0pFGzHS5EMnhdxQi7aJN+iMheud0UhG4dlE1DLAlvL8JHjJJTX/CSm4JXwV0Ka5nGk3zC5mcb5bUQUxxMA==", - "license": "MIT", + "node_modules/cross-spawn": { + "version": "7.0.6", + "resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz", + "integrity": "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==", + "dev": true, + "dependencies": { + "path-key": "^3.1.0", + "shebang-command": "^2.0.0", + "which": "^2.0.1" + }, "engines": { - "node": "*" + "node": ">= 8" } }, "node_modules/debug": { "version": "4.4.3", "resolved": "https://registry.npmjs.org/debug/-/debug-4.4.3.tgz", "integrity": "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA==", + "dev": true, "license": "MIT", "dependencies": { "ms": "^2.1.3" @@ -2074,15 +1747,6 @@ } } }, - "node_modules/decamelize": { - "version": "1.2.0", - "resolved": "https://registry.npmjs.org/decamelize/-/decamelize-1.2.0.tgz", - "integrity": "sha512-z2S+W9X73hAUUki+N+9Za2lBlun89zigOyGrsax+KUQ6wKW4ZoWpEYBkGhQjwAjjDCkWxhY0VKEhk8wzY7F5cA==", - "license": "MIT", - "engines": { - "node": ">=0.10.0" - } - }, "node_modules/decompress-response": { "version": "6.0.0", "resolved": "https://registry.npmjs.org/decompress-response/-/decompress-response-6.0.0.tgz", @@ -2107,6 +1771,12 @@ "node": ">=4.0.0" } }, + "node_modules/deep-is": { + "version": "0.1.4", + "resolved": "https://registry.npmjs.org/deep-is/-/deep-is-0.1.4.tgz", + "integrity": "sha512-oIPzksmTg4/MriiaYGO+okXDT7ztn/w3Eptv/+gSIdMdKsJo0u4CfYNFJPy+4SKMuCqGw2wxnA+URMg3t8a/bQ==", + "dev": true + }, "node_modules/detect-libc": { "version": "2.1.2", "resolved": "https://registry.npmjs.org/detect-libc/-/detect-libc-2.1.2.tgz", @@ -2116,18 +1786,6 @@ "node": ">=8" } }, - "node_modules/dijkstrajs": { - "version": "1.0.3", - "resolved": "https://registry.npmjs.org/dijkstrajs/-/dijkstrajs-1.0.3.tgz", - "integrity": 
"sha512-qiSlmBq9+BCdCA/L46dw8Uy93mloxsPSbwnm5yrKn2vMPiy8KyAskTF6zuV/j5BMsmOGZDPs7KjU+mjb670kfA==", - "license": "MIT" - }, - "node_modules/emoji-regex": { - "version": "8.0.0", - "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz", - "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==", - "license": "MIT" - }, "node_modules/end-of-stream": { "version": "1.4.5", "resolved": "https://registry.npmjs.org/end-of-stream/-/end-of-stream-1.4.5.tgz", @@ -2186,6 +1844,164 @@ "@esbuild/win32-x64": "0.27.3" } }, + "node_modules/escape-string-regexp": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-4.0.0.tgz", + "integrity": "sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA==", + "dev": true, + "engines": { + "node": ">=10" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/eslint": { + "version": "9.39.4", + "resolved": "https://registry.npmjs.org/eslint/-/eslint-9.39.4.tgz", + "integrity": "sha512-XoMjdBOwe/esVgEvLmNsD3IRHkm7fbKIUGvrleloJXUZgDHig2IPWNniv+GwjyJXzuNqVjlr5+4yVUZjycJwfQ==", + "dev": true, + "dependencies": { + "@eslint-community/eslint-utils": "^4.8.0", + "@eslint-community/regexpp": "^4.12.1", + "@eslint/config-array": "^0.21.2", + "@eslint/config-helpers": "^0.4.2", + "@eslint/core": "^0.17.0", + "@eslint/eslintrc": "^3.3.5", + "@eslint/js": "9.39.4", + "@eslint/plugin-kit": "^0.4.1", + "@humanfs/node": "^0.16.6", + "@humanwhocodes/module-importer": "^1.0.1", + "@humanwhocodes/retry": "^0.4.2", + "@types/estree": "^1.0.6", + "ajv": "^6.14.0", + "chalk": "^4.0.0", + "cross-spawn": "^7.0.6", + "debug": "^4.3.2", + "escape-string-regexp": "^4.0.0", + "eslint-scope": "^8.4.0", + "eslint-visitor-keys": "^4.2.1", + "espree": "^10.4.0", + "esquery": "^1.5.0", + "esutils": "^2.0.2", + "fast-deep-equal": "^3.1.3", + "file-entry-cache": 
"^8.0.0", + "find-up": "^5.0.0", + "glob-parent": "^6.0.2", + "ignore": "^5.2.0", + "imurmurhash": "^0.1.4", + "is-glob": "^4.0.0", + "json-stable-stringify-without-jsonify": "^1.0.1", + "lodash.merge": "^4.6.2", + "minimatch": "^3.1.5", + "natural-compare": "^1.4.0", + "optionator": "^0.9.3" + }, + "bin": { + "eslint": "bin/eslint.js" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://eslint.org/donate" + }, + "peerDependencies": { + "jiti": "*" + }, + "peerDependenciesMeta": { + "jiti": { + "optional": true + } + } + }, + "node_modules/eslint-plugin-no-catch-all": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/eslint-plugin-no-catch-all/-/eslint-plugin-no-catch-all-1.1.0.tgz", + "integrity": "sha512-VkP62jLTmccPrFGN/W6V7a3SEwdtTZm+Su2k4T3uyJirtkm0OMMm97h7qd8pRFAHus/jQg9FpUpLRc7sAylBEQ==", + "dev": true, + "peerDependencies": { + "eslint": ">=2.0.0" + } + }, + "node_modules/eslint-scope": { + "version": "8.4.0", + "resolved": "https://registry.npmjs.org/eslint-scope/-/eslint-scope-8.4.0.tgz", + "integrity": "sha512-sNXOfKCn74rt8RICKMvJS7XKV/Xk9kA7DyJr8mJik3S7Cwgy3qlkkmyS2uQB3jiJg6VNdZd/pDBJu0nvG2NlTg==", + "dev": true, + "dependencies": { + "esrecurse": "^4.3.0", + "estraverse": "^5.2.0" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + } + }, + "node_modules/eslint-visitor-keys": { + "version": "4.2.1", + "resolved": "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-4.2.1.tgz", + "integrity": "sha512-Uhdk5sfqcee/9H/rCOJikYz67o0a2Tw2hGRPOG2Y1R2dg7brRe1uG0yaNQDHu+TO/uQPF/5eCapvYSmHUjt7JQ==", + "dev": true, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + } + }, + "node_modules/espree": { + "version": "10.4.0", + "resolved": "https://registry.npmjs.org/espree/-/espree-10.4.0.tgz", + "integrity": 
"sha512-j6PAQ2uUr79PZhBjP5C5fhl8e39FmRnOjsD5lGnWrFU8i2G776tBK7+nP8KuQUTTyAZUwfQqXAgrVH5MbH9CYQ==", + "dev": true, + "dependencies": { + "acorn": "^8.15.0", + "acorn-jsx": "^5.3.2", + "eslint-visitor-keys": "^4.2.1" + }, + "engines": { + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" + }, + "funding": { + "url": "https://opencollective.com/eslint" + } + }, + "node_modules/esquery": { + "version": "1.7.0", + "resolved": "https://registry.npmjs.org/esquery/-/esquery-1.7.0.tgz", + "integrity": "sha512-Ap6G0WQwcU/LHsvLwON1fAQX9Zp0A2Y6Y/cJBl9r/JbW90Zyg4/zbG6zzKa2OTALELarYHmKu0GhpM5EO+7T0g==", + "dev": true, + "dependencies": { + "estraverse": "^5.1.0" + }, + "engines": { + "node": ">=0.10" + } + }, + "node_modules/esrecurse": { + "version": "4.3.0", + "resolved": "https://registry.npmjs.org/esrecurse/-/esrecurse-4.3.0.tgz", + "integrity": "sha512-KmfKL3b6G+RXvP8N1vr3Tq1kL/oCFgn2NYXEtqP8/L3pKapUA4G8cFVaoF3SU323CD4XypR/ffioHmkti6/Tag==", + "dev": true, + "dependencies": { + "estraverse": "^5.2.0" + }, + "engines": { + "node": ">=4.0" + } + }, + "node_modules/estraverse": { + "version": "5.3.0", + "resolved": "https://registry.npmjs.org/estraverse/-/estraverse-5.3.0.tgz", + "integrity": "sha512-MMdARuVEQziNTeJD8DgMqmhwR11BRQ/cBP+pLtYdSTnf3MIO8fFeiINEbX36ZdNlfU/7A9f3gUw49B3oQsvwBA==", + "dev": true, + "engines": { + "node": ">=4.0" + } + }, "node_modules/estree-walker": { "version": "3.0.3", "resolved": "https://registry.npmjs.org/estree-walker/-/estree-walker-3.0.3.tgz", @@ -2196,11 +2012,14 @@ "@types/estree": "^1.0.0" } }, - "node_modules/eventemitter3": { - "version": "5.0.4", - "resolved": "https://registry.npmjs.org/eventemitter3/-/eventemitter3-5.0.4.tgz", - "integrity": "sha512-mlsTRyGaPBjPedk6Bvw+aqbsXDtoAyAzm5MO7JgU+yVRyMQ5O8bD4Kcci7BS85f93veegeCPkL8R4GLClnjLFw==", - "license": "MIT" + "node_modules/esutils": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz", + "integrity": 
"sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g==", + "dev": true, + "engines": { + "node": ">=0.10.0" + } }, "node_modules/expand-template": { "version": "2.0.3", @@ -2221,17 +2040,23 @@ "node": ">=12.0.0" } }, - "node_modules/fast-copy": { - "version": "4.0.2", - "resolved": "https://registry.npmjs.org/fast-copy/-/fast-copy-4.0.2.tgz", - "integrity": "sha512-ybA6PDXIXOXivLJK/z9e+Otk7ve13I4ckBvGO5I2RRmBU1gMHLVDJYEuJYhGwez7YNlYji2M2DvVU+a9mSFDlw==", - "license": "MIT" + "node_modules/fast-deep-equal": { + "version": "3.1.3", + "resolved": "https://registry.npmjs.org/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz", + "integrity": "sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q==", + "dev": true }, - "node_modules/fast-safe-stringify": { - "version": "2.1.1", - "resolved": "https://registry.npmjs.org/fast-safe-stringify/-/fast-safe-stringify-2.1.1.tgz", - "integrity": "sha512-W+KJc2dmILlPplD/H4K9l9LcAHAfPtP6BY84uVLXQ6Evcz9Lcg33Y2z1IVblT6xdY54PXYVHEv+0Wpq8Io6zkA==", - "license": "MIT" + "node_modules/fast-json-stable-stringify": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/fast-json-stable-stringify/-/fast-json-stable-stringify-2.1.0.tgz", + "integrity": "sha512-lhd/wF+Lk98HZoTCtlVraHtfh5XYijIjalXck7saUtuanSDyLMxnHhSXEDJqHxD7msR8D0uCmqlkwjCV8xvwHw==", + "dev": true + }, + "node_modules/fast-levenshtein": { + "version": "2.0.6", + "resolved": "https://registry.npmjs.org/fast-levenshtein/-/fast-levenshtein-2.0.6.tgz", + "integrity": "sha512-DCXu6Ifhqcks7TZKY3Hxp3y6qphY5SJZmrWMDrKcERSOXWQdMhU9Ig/PYrzyw/ul9jOIyh0N4M0tbC5hodg8dw==", + "dev": true }, "node_modules/fdir": { "version": "6.5.0", @@ -2251,22 +2076,16 @@ } } }, - "node_modules/file-type": { - "version": "21.3.0", - "resolved": "https://registry.npmjs.org/file-type/-/file-type-21.3.0.tgz", - "integrity": "sha512-8kPJMIGz1Yt/aPEwOsrR97ZyZaD1Iqm8PClb1nYFclUCkBi0Ma5IsYNQzvSFS9ib51lWyIw5mIT9rWzI/xjpzA==", - 
"license": "MIT", + "node_modules/file-entry-cache": { + "version": "8.0.0", + "resolved": "https://registry.npmjs.org/file-entry-cache/-/file-entry-cache-8.0.0.tgz", + "integrity": "sha512-XXTUwCvisa5oacNGRP9SfNtYBNAMi+RPwBFmblZEF7N7swHYQS6/Zfk7SRwx4D5j3CH211YNRco1DEMNVfZCnQ==", + "dev": true, "dependencies": { - "@tokenizer/inflate": "^0.4.1", - "strtok3": "^10.3.4", - "token-types": "^6.1.1", - "uint8array-extras": "^1.4.0" + "flat-cache": "^4.0.0" }, "engines": { - "node": ">=20" - }, - "funding": { - "url": "https://github.com/sindresorhus/file-type?sponsor=1" + "node": ">=16.0.0" } }, "node_modules/file-uri-to-path": { @@ -2276,18 +2095,40 @@ "license": "MIT" }, "node_modules/find-up": { - "version": "4.1.0", - "resolved": "https://registry.npmjs.org/find-up/-/find-up-4.1.0.tgz", - "integrity": "sha512-PpOwAdQ/YlXQ2vj8a3h8IipDuYRi3wceVQQGYWxNINccq40Anw7BlsEXCMbt1Zt+OLA6Fq9suIpIWD0OsnISlw==", - "license": "MIT", + "version": "5.0.0", + "resolved": "https://registry.npmjs.org/find-up/-/find-up-5.0.0.tgz", + "integrity": "sha512-78/PXT1wlLLDgTzDs7sjq9hzz0vXD+zn+7wypEe4fXQxCmdmqfGsEPQxmiCSQI3ajFV91bVSsvNtrJRiW6nGng==", + "dev": true, "dependencies": { - "locate-path": "^5.0.0", + "locate-path": "^6.0.0", "path-exists": "^4.0.0" }, "engines": { - "node": ">=8" + "node": ">=10" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" } }, + "node_modules/flat-cache": { + "version": "4.0.1", + "resolved": "https://registry.npmjs.org/flat-cache/-/flat-cache-4.0.1.tgz", + "integrity": "sha512-f7ccFPK3SXFHpx15UIGyRJ/FJQctuKZ0zVuN3frBo4HnK3cay9VEW0R6yPYFHC0AgqhukPzKjq22t5DmAyqGyw==", + "dev": true, + "dependencies": { + "flatted": "^3.2.9", + "keyv": "^4.5.4" + }, + "engines": { + "node": ">=16" + } + }, + "node_modules/flatted": { + "version": "3.4.2", + "resolved": "https://registry.npmjs.org/flatted/-/flatted-3.4.2.tgz", + "integrity": "sha512-PjDse7RzhcPkIJwy5t7KPWQSZ9cAbzQXcafsetQoD7sOJRQlGikNbx7yZp2OotDnJyrDcbyRq3Ttb18iYOqkxA==", + "dev": true + }, 
"node_modules/fs-constants": { "version": "1.0.0", "resolved": "https://registry.npmjs.org/fs-constants/-/fs-constants-1.0.0.tgz", @@ -2309,15 +2150,6 @@ "node": "^8.16.0 || ^10.6.0 || >=11.0.0" } }, - "node_modules/get-caller-file": { - "version": "2.0.5", - "resolved": "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz", - "integrity": "sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg==", - "license": "ISC", - "engines": { - "node": "6.* || 8.* || >= 10.*" - } - }, "node_modules/get-tsconfig": { "version": "4.13.6", "resolved": "https://registry.npmjs.org/get-tsconfig/-/get-tsconfig-4.13.6.tgz", @@ -2337,6 +2169,30 @@ "integrity": "sha512-SyHy3T1v2NUXn29OsWdxmK6RwHD+vkj3v8en8AOBZ1wBQ/hCAQ5bAQTD02kW4W9tUp/3Qh6J8r9EvntiyCmOOw==", "license": "MIT" }, + "node_modules/glob-parent": { + "version": "6.0.2", + "resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-6.0.2.tgz", + "integrity": "sha512-XxwI8EOhVQgWp6iDL+3b0r86f4d6AX6zSU55HfB4ydCEuXLXc5FcYeOu+nnGftS4TEju/11rt4KJPTMgbfmv4A==", + "dev": true, + "dependencies": { + "is-glob": "^4.0.3" + }, + "engines": { + "node": ">=10.13.0" + } + }, + "node_modules/globals": { + "version": "15.15.0", + "resolved": "https://registry.npmjs.org/globals/-/globals-15.15.0.tgz", + "integrity": "sha512-7ACyT3wmyp3I61S4fG682L0VA2RGD9otkqGJIwNUMF1SWUombIIk+af1unuDYgMm082aHYwD+mzJvv9Iu8dsgg==", + "dev": true, + "engines": { + "node": ">=18" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, "node_modules/has-flag": { "version": "4.0.0", "resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz", @@ -2347,37 +2203,6 @@ "node": ">=8" } }, - "node_modules/hashery": { - "version": "1.4.0", - "resolved": "https://registry.npmjs.org/hashery/-/hashery-1.4.0.tgz", - "integrity": "sha512-Wn2i1In6XFxl8Az55kkgnFRiAlIAushzh26PTjL2AKtQcEfXrcLa7Hn5QOWGZEf3LU057P9TwwZjFyxfS1VuvQ==", - "license": "MIT", - "dependencies": { - 
"hookified": "^1.14.0" - }, - "engines": { - "node": ">=20" - } - }, - "node_modules/help-me": { - "version": "5.0.0", - "resolved": "https://registry.npmjs.org/help-me/-/help-me-5.0.0.tgz", - "integrity": "sha512-7xgomUX6ADmcYzFik0HzAxh/73YlKR9bmFzf51CZwR+b6YtzU2m0u49hQCqV6SvlqIqsaxovfwdvbnsw3b/zpg==", - "license": "MIT" - }, - "node_modules/hookified": { - "version": "1.15.1", - "resolved": "https://registry.npmjs.org/hookified/-/hookified-1.15.1.tgz", - "integrity": "sha512-MvG/clsADq1GPM2KGo2nyfaWVyn9naPiXrqIe4jYjXNZQt238kWyOGrsyc/DmRAQ+Re6yeo6yX/yoNCG5KAEVg==", - "license": "MIT" - }, - "node_modules/html-escaper": { - "version": "2.0.2", - "resolved": "https://registry.npmjs.org/html-escaper/-/html-escaper-2.0.2.tgz", - "integrity": "sha512-H2iMtd0I4Mt5eYiapRdIDjp+XzelXQ0tFE4JS7YFwFevXXMmOp9myNrUvCg0D6ws8iqkRPBfKHgbwig1SmlLfg==", - "dev": true, - "license": "MIT" - }, "node_modules/husky": { "version": "9.1.7", "resolved": "https://registry.npmjs.org/husky/-/husky-9.1.7.tgz", @@ -2414,6 +2239,40 @@ ], "license": "BSD-3-Clause" }, + "node_modules/ignore": { + "version": "5.3.2", + "resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.2.tgz", + "integrity": "sha512-hsBTNUqQTDwkWtcdYI2i06Y/nUBEsNEDJKjWdigLvegy8kDuJAS8uRlpkkcQpyEXL0Z/pjDy5HBmMjRCJ2gq+g==", + "dev": true, + "engines": { + "node": ">= 4" + } + }, + "node_modules/import-fresh": { + "version": "3.3.1", + "resolved": "https://registry.npmjs.org/import-fresh/-/import-fresh-3.3.1.tgz", + "integrity": "sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ==", + "dev": true, + "dependencies": { + "parent-module": "^1.0.0", + "resolve-from": "^4.0.0" + }, + "engines": { + "node": ">=6" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" + } + }, + "node_modules/imurmurhash": { + "version": "0.1.4", + "resolved": "https://registry.npmjs.org/imurmurhash/-/imurmurhash-0.1.4.tgz", + "integrity": 
"sha512-JmXMZ6wuvDmLiHEml9ykzqO6lwFbof0GG4IkcGaENdCRDDmMVnny7s5HsIgHCbaq0w2MyPhDqkhTUgS2LU2PHA==", + "dev": true, + "engines": { + "node": ">=0.8.19" + } + }, "node_modules/inherits": { "version": "2.0.4", "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz", @@ -2426,154 +2285,105 @@ "integrity": "sha512-JV/yugV2uzW5iMRSiZAyDtQd+nxtUnjeLt0acNdw98kKLrvuRVyB80tsREOE7yvGVgalhZ6RNXCmEHkUKBKxew==", "license": "ISC" }, - "node_modules/is-fullwidth-code-point": { - "version": "3.0.0", - "resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz", - "integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==", - "license": "MIT", + "node_modules/is-extglob": { + "version": "2.1.1", + "resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz", + "integrity": "sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==", + "dev": true, "engines": { - "node": ">=8" + "node": ">=0.10.0" } }, - "node_modules/istanbul-lib-coverage": { - "version": "3.2.2", - "resolved": "https://registry.npmjs.org/istanbul-lib-coverage/-/istanbul-lib-coverage-3.2.2.tgz", - "integrity": "sha512-O8dpsF+r0WV/8MNRKfnmrtCWhuKjxrq2w+jpzBL5UZKTi2LeVWnWOmWRxFlesJONmc+wLAGvKQZEOanko0LFTg==", + "node_modules/is-glob": { + "version": "4.0.3", + "resolved": "https://registry.npmjs.org/is-glob/-/is-glob-4.0.3.tgz", + "integrity": "sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==", "dev": true, - "license": "BSD-3-Clause", - "engines": { - "node": ">=8" - } - }, - "node_modules/istanbul-lib-report": { - "version": "3.0.1", - "resolved": "https://registry.npmjs.org/istanbul-lib-report/-/istanbul-lib-report-3.0.1.tgz", - "integrity": "sha512-GCfE1mtsHGOELCU8e/Z7YWzpmybrx/+dSTfLrvY8qRmaY6zXTKWn6WQIjaAFw069icm6GVMNkgu0NzI4iPZUNw==", - "dev": true, - "license": "BSD-3-Clause", "dependencies": { - 
"istanbul-lib-coverage": "^3.0.0", - "make-dir": "^4.0.0", - "supports-color": "^7.1.0" + "is-extglob": "^2.1.1" }, "engines": { - "node": ">=10" + "node": ">=0.10.0" } }, - "node_modules/istanbul-reports": { - "version": "3.2.0", - "resolved": "https://registry.npmjs.org/istanbul-reports/-/istanbul-reports-3.2.0.tgz", - "integrity": "sha512-HGYWWS/ehqTV3xN10i23tkPkpH46MLCIMFNCaaKNavAXTF1RkqxawEPtnjnGZ6XKSInBKkiOA5BKS+aZiY3AvA==", + "node_modules/isexe": { + "version": "2.0.0", + "resolved": "https://registry.npmjs.org/isexe/-/isexe-2.0.0.tgz", + "integrity": "sha512-RHxMLp9lnKHGHRng9QFhRCMbYAcVpn69smSGcq3f36xjgVVWThj4qqLbTLlq7Ssj8B+fIQ1EuCEGI2lKsyQeIw==", + "dev": true + }, + "node_modules/js-yaml": { + "version": "4.1.1", + "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-4.1.1.tgz", + "integrity": "sha512-qQKT4zQxXl8lLwBtHMWwaTcGfFOZviOJet3Oy/xmGk2gZH677CJM9EvtfdSkgWcATZhj/55JZ0rmy3myCT5lsA==", "dev": true, - "license": "BSD-3-Clause", "dependencies": { - "html-escaper": "^2.0.0", - "istanbul-lib-report": "^3.0.0" - }, - "engines": { - "node": ">=8" - } - }, - "node_modules/joycon": { - "version": "3.1.1", - "resolved": "https://registry.npmjs.org/joycon/-/joycon-3.1.1.tgz", - "integrity": "sha512-34wB/Y7MW7bzjKRjUKTa46I2Z7eV62Rkhva+KkopW7Qvv/OSWBqvkSY7vusOPrNuZcUG3tApvdVgNB8POj3SPw==", - "license": "MIT", - "engines": { - "node": ">=10" - } - }, - "node_modules/js-tokens": { - "version": "10.0.0", - "resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-10.0.0.tgz", - "integrity": "sha512-lM/UBzQmfJRo9ABXbPWemivdCW8V2G8FHaHdypQaIy523snUjog0W71ayWXTjiR+ixeMyVHN2XcpnTd/liPg/Q==", - "dev": true, - "license": "MIT" - }, - "node_modules/keyv": { - "version": "5.6.0", - "resolved": "https://registry.npmjs.org/keyv/-/keyv-5.6.0.tgz", - "integrity": "sha512-CYDD3SOtsHtyXeEORYRx2qBtpDJFjRTGXUtmNEMGyzYOKj1TE3tycdlho7kA1Ufx9OYWZzg52QFBGALTirzDSw==", - "license": "MIT", - "peer": true, - "dependencies": { - "@keyv/serialize": "^1.1.1" - } - }, - 
"node_modules/libsignal": { - "name": "@whiskeysockets/libsignal-node", - "version": "2.0.1", - "resolved": "git+ssh://git@github.com/whiskeysockets/libsignal-node.git#1c30d7d7e76a3b0aa120b04dc6a26f5a12dccf67", - "license": "GPL-3.0", - "dependencies": { - "curve25519-js": "^0.0.4", - "protobufjs": "6.8.8" - } - }, - "node_modules/libsignal/node_modules/@types/node": { - "version": "10.17.60", - "resolved": "https://registry.npmjs.org/@types/node/-/node-10.17.60.tgz", - "integrity": "sha512-F0KIgDJfy2nA3zMLmWGKxcH2ZVEtCZXHHdOQs2gSaQ27+lNeEfGxzkIw90aXswATX7AZ33tahPbzy6KAfUreVw==", - "license": "MIT" - }, - "node_modules/libsignal/node_modules/long": { - "version": "4.0.0", - "resolved": "https://registry.npmjs.org/long/-/long-4.0.0.tgz", - "integrity": "sha512-XsP+KhQif4bjX1kbuSiySJFNAehNxgLb6hPRGJ9QsUr8ajHkuXGdrHmFUTUUXhDwVX2R5bY4JNZEwbUiMhV+MA==", - "license": "Apache-2.0" - }, - "node_modules/libsignal/node_modules/protobufjs": { - "version": "6.8.8", - "resolved": "https://registry.npmjs.org/protobufjs/-/protobufjs-6.8.8.tgz", - "integrity": "sha512-AAmHtD5pXgZfi7GMpllpO3q1Xw1OYldr+dMUlAnffGTAhqkg72WdmSY71uKBF/JuyiKs8psYbtKrhi0ASCD8qw==", - "hasInstallScript": true, - "license": "BSD-3-Clause", - "dependencies": { - "@protobufjs/aspromise": "^1.1.2", - "@protobufjs/base64": "^1.1.2", - "@protobufjs/codegen": "^2.0.4", - "@protobufjs/eventemitter": "^1.1.0", - "@protobufjs/fetch": "^1.1.0", - "@protobufjs/float": "^1.0.2", - "@protobufjs/inquire": "^1.1.0", - "@protobufjs/path": "^1.1.2", - "@protobufjs/pool": "^1.1.0", - "@protobufjs/utf8": "^1.1.0", - "@types/long": "^4.0.0", - "@types/node": "^10.1.0", - "long": "^4.0.0" + "argparse": "^2.0.1" }, "bin": { - "pbjs": "bin/pbjs", - "pbts": "bin/pbts" + "js-yaml": "bin/js-yaml.js" + } + }, + "node_modules/json-buffer": { + "version": "3.0.1", + "resolved": "https://registry.npmjs.org/json-buffer/-/json-buffer-3.0.1.tgz", + "integrity": 
"sha512-4bV5BfR2mqfQTJm+V5tPPdf+ZpuhiIvTuAB5g8kcrXOZpTT/QwwVRWBywX1ozr6lEuPdbHxwaJlm9G6mI2sfSQ==", + "dev": true + }, + "node_modules/json-schema-traverse": { + "version": "0.4.1", + "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-0.4.1.tgz", + "integrity": "sha512-xbbCH5dCYU5T8LcEhhuh7HJ88HXuW3qsI3Y0zOZFKfZEHcpWiHU/Jxzk629Brsab/mMiHQti9wMP+845RPe3Vg==", + "dev": true + }, + "node_modules/json-stable-stringify-without-jsonify": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/json-stable-stringify-without-jsonify/-/json-stable-stringify-without-jsonify-1.0.1.tgz", + "integrity": "sha512-Bdboy+l7tA3OGW6FjyFHWkP5LuByj1Tk33Ljyq0axyzdk9//JSi2u3fP1QSmd1KNwq6VOKYGlAu87CisVir6Pw==", + "dev": true + }, + "node_modules/keyv": { + "version": "4.5.4", + "resolved": "https://registry.npmjs.org/keyv/-/keyv-4.5.4.tgz", + "integrity": "sha512-oxVHkHR/EJf2CNXnWxRLW6mg7JyCCUcG0DtEGmL2ctUo1PNTin1PUil+r/+4r5MpVgC/fn1kjsx7mjSujKqIpw==", + "dev": true, + "dependencies": { + "json-buffer": "3.0.1" + } + }, + "node_modules/levn": { + "version": "0.4.1", + "resolved": "https://registry.npmjs.org/levn/-/levn-0.4.1.tgz", + "integrity": "sha512-+bT2uH4E5LGE7h/n3evcS/sQlJXCpIp6ym8OWJ5eV6+67Dsql/LaaT7qJBAt2rzfoa/5QBGBhxDix1dMt2kQKQ==", + "dev": true, + "dependencies": { + "prelude-ls": "^1.2.1", + "type-check": "~0.4.0" + }, + "engines": { + "node": ">= 0.8.0" } }, "node_modules/locate-path": { - "version": "5.0.0", - "resolved": "https://registry.npmjs.org/locate-path/-/locate-path-5.0.0.tgz", - "integrity": "sha512-t7hw9pI+WvuwNJXwk5zVHpyhIqzg2qTlklJOf0mVxGSbe3Fp2VieZcduNYjaLDoy6p9uGpQEGWG87WpMKlNq8g==", - "license": "MIT", + "version": "6.0.0", + "resolved": "https://registry.npmjs.org/locate-path/-/locate-path-6.0.0.tgz", + "integrity": "sha512-iPZK6eYjbxRu3uB4/WZ3EsEIMJFMqAoopl3R+zuq0UjcAm/MO6KCweDgPfP3elTztoKP3KtnVHxTn2NHBSDVUw==", + "dev": true, "dependencies": { - "p-locate": "^4.1.0" + "p-locate": "^5.0.0" }, "engines": { - "node": 
">=8" + "node": ">=10" + }, + "funding": { + "url": "https://github.com/sponsors/sindresorhus" } }, - "node_modules/long": { - "version": "5.3.2", - "resolved": "https://registry.npmjs.org/long/-/long-5.3.2.tgz", - "integrity": "sha512-mNAgZ1GmyNhD7AuqnTG3/VQ26o760+ZYBPKjPvugO8+nLbYfX6TVpJPseBvopbdY+qpZ/lKUnmEc1LeZYS3QAA==", - "license": "Apache-2.0" - }, - "node_modules/lru-cache": { - "version": "11.2.6", - "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-11.2.6.tgz", - "integrity": "sha512-ESL2CrkS/2wTPfuend7Zhkzo2u0daGJ/A2VucJOgQ/C48S/zB8MMeMHSGKYpXhIjbPxfuezITkaBH1wqv00DDQ==", - "license": "BlueOak-1.0.0", - "engines": { - "node": "20 || >=22" - } + "node_modules/lodash.merge": { + "version": "4.6.2", + "resolved": "https://registry.npmjs.org/lodash.merge/-/lodash.merge-4.6.2.tgz", + "integrity": "sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==", + "dev": true }, "node_modules/luxon": { "version": "3.7.2", @@ -2594,43 +2404,6 @@ "@jridgewell/sourcemap-codec": "^1.5.5" } }, - "node_modules/magicast": { - "version": "0.5.2", - "resolved": "https://registry.npmjs.org/magicast/-/magicast-0.5.2.tgz", - "integrity": "sha512-E3ZJh4J3S9KfwdjZhe2afj6R9lGIN5Pher1pF39UGrXRqq/VDaGVIGN13BjHd2u8B61hArAGOnso7nBOouW3TQ==", - "dev": true, - "license": "MIT", - "dependencies": { - "@babel/parser": "^7.29.0", - "@babel/types": "^7.29.0", - "source-map-js": "^1.2.1" - } - }, - "node_modules/make-dir": { - "version": "4.0.0", - "resolved": "https://registry.npmjs.org/make-dir/-/make-dir-4.0.0.tgz", - "integrity": "sha512-hXdUTZYIVOt1Ex//jAQi+wTZZpUpwBj/0QsOzqegb3rGMMeJiSEu5xLHnYfBrRV4RH2+OCSOO95Is/7x1WJ4bw==", - "dev": true, - "license": "MIT", - "dependencies": { - "semver": "^7.5.3" - }, - "engines": { - "node": ">=10" - }, - "funding": { - "url": "https://github.com/sponsors/sindresorhus" - } - }, - "node_modules/media-typer": { - "version": "1.1.0", - "resolved": 
"https://registry.npmjs.org/media-typer/-/media-typer-1.1.0.tgz", - "integrity": "sha512-aisnrDP4GNe06UcKFnV5bfMNPBUw4jsLGaWwWfnH3v02GnBuXX2MCVn5RbrWo0j3pczUilYblq7fQ7Nw2t5XKw==", - "license": "MIT", - "engines": { - "node": ">= 0.8" - } - }, "node_modules/mimic-response": { "version": "3.1.0", "resolved": "https://registry.npmjs.org/mimic-response/-/mimic-response-3.1.0.tgz", @@ -2643,6 +2416,18 @@ "url": "https://github.com/sponsors/sindresorhus" } }, + "node_modules/minimatch": { + "version": "3.1.5", + "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.5.tgz", + "integrity": "sha512-VgjWUsnnT6n+NUk6eZq77zeFdpW2LWDzP6zFGrCbHXiYNul5Dzqk2HHQ5uFH2DNW5Xbp8+jVzaeNt94ssEEl4w==", + "dev": true, + "dependencies": { + "brace-expansion": "^1.1.7" + }, + "engines": { + "node": "*" + } + }, "node_modules/minimist": { "version": "1.2.8", "resolved": "https://registry.npmjs.org/minimist/-/minimist-1.2.8.tgz", @@ -2662,39 +2447,9 @@ "version": "2.1.3", "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz", "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==", + "dev": true, "license": "MIT" }, - "node_modules/music-metadata": { - "version": "11.12.0", - "resolved": "https://registry.npmjs.org/music-metadata/-/music-metadata-11.12.0.tgz", - "integrity": "sha512-9ChYnmVmyHvFxR2g0MWFSHmJfbssRy07457G4gbb4LA9WYvyZea/8EMbqvg5dcv4oXNCNL01m8HXtymLlhhkYg==", - "funding": [ - { - "type": "github", - "url": "https://github.com/sponsors/Borewit" - }, - { - "type": "buymeacoffee", - "url": "https://buymeacoffee.com/borewit" - } - ], - "license": "MIT", - "dependencies": { - "@borewit/text-codec": "^0.2.1", - "@tokenizer/token": "^0.3.0", - "content-type": "^1.0.5", - "debug": "^4.4.3", - "file-type": "^21.3.0", - "media-typer": "^1.1.0", - "strtok3": "^10.3.4", - "token-types": "^6.1.2", - "uint8array-extras": "^1.5.0", - "win-guid": "^0.2.1" - }, - "engines": { - "node": ">=18" - } - }, "node_modules/nanoid": { 
"version": "3.3.11", "resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.11.tgz", @@ -2720,6 +2475,12 @@ "integrity": "sha512-GEbrYkbfF7MoNaoh2iGG84Mnf/WZfB0GdGEsM8wz7Expx/LlWf5U8t9nvJKXSp3qr5IsEbK04cBGhol/KwOsWA==", "license": "MIT" }, + "node_modules/natural-compare": { + "version": "1.4.0", + "resolved": "https://registry.npmjs.org/natural-compare/-/natural-compare-1.4.0.tgz", + "integrity": "sha512-OWND8ei3VtNC9h7V60qff3SVobHr996CTwgxubgyQYEpg290h9J0buyECNNJexkFm5sOajh5G116RYA1c8ZMSw==", + "dev": true + }, "node_modules/node-abi": { "version": "3.87.0", "resolved": "https://registry.npmjs.org/node-abi/-/node-abi-3.87.0.tgz", @@ -2743,15 +2504,6 @@ ], "license": "MIT" }, - "node_modules/on-exit-leak-free": { - "version": "2.1.2", - "resolved": "https://registry.npmjs.org/on-exit-leak-free/-/on-exit-leak-free-2.1.2.tgz", - "integrity": "sha512-0eJJY6hXLGf1udHwfNftBqH+g73EU4B504nZeKpz1sYRKafAghwxEJunB2O7rDZkL4PGfsMVnTXZ2EjibbqcsA==", - "license": "MIT", - "engines": { - "node": ">=14.0.0" - } - }, "node_modules/once": { "version": "1.4.0", "resolved": "https://registry.npmjs.org/once/-/once-1.4.0.tgz", @@ -2761,66 +2513,61 @@ "wrappy": "1" } }, - "node_modules/p-limit": { - "version": "2.3.0", - "resolved": "https://registry.npmjs.org/p-limit/-/p-limit-2.3.0.tgz", - "integrity": "sha512-//88mFWSJx8lxCzwdAABTJL2MyWB12+eIY7MDL2SqLmAkeKU9qxRvWuSyTjm3FUmpBEMuFfckAIqEaVGUDxb6w==", - "license": "MIT", + "node_modules/optionator": { + "version": "0.9.4", + "resolved": "https://registry.npmjs.org/optionator/-/optionator-0.9.4.tgz", + "integrity": "sha512-6IpQ7mKUxRcZNLIObR0hz7lxsapSSIYNZJwXPGeF0mTVqGKFIXj1DQcMoT22S3ROcLyY/rz0PWaWZ9ayWmad9g==", + "dev": true, "dependencies": { - "p-try": "^2.0.0" + "deep-is": "^0.1.3", + "fast-levenshtein": "^2.0.6", + "levn": "^0.4.1", + "prelude-ls": "^1.2.1", + "type-check": "^0.4.0", + "word-wrap": "^1.2.5" }, "engines": { - "node": ">=6" + "node": ">= 0.8.0" + } + }, + "node_modules/p-limit": { + "version": "3.1.0", + 
"resolved": "https://registry.npmjs.org/p-limit/-/p-limit-3.1.0.tgz", + "integrity": "sha512-TYOanM3wGwNGsZN2cVTYPArw454xnXj5qmWF1bEoAc4+cU/ol7GVh7odevjp1FNHduHc3KZMcFduxU5Xc6uJRQ==", + "dev": true, + "dependencies": { + "yocto-queue": "^0.1.0" + }, + "engines": { + "node": ">=10" }, "funding": { "url": "https://github.com/sponsors/sindresorhus" } }, "node_modules/p-locate": { - "version": "4.1.0", - "resolved": "https://registry.npmjs.org/p-locate/-/p-locate-4.1.0.tgz", - "integrity": "sha512-R79ZZ/0wAxKGu3oYMlz8jy/kbhsNrS7SKZ7PxEHBgJ5+F2mtFW2fK2cOtBh1cHYkQsbzFV7I+EoRKe6Yt0oK7A==", - "license": "MIT", + "version": "5.0.0", + "resolved": "https://registry.npmjs.org/p-locate/-/p-locate-5.0.0.tgz", + "integrity": "sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw==", + "dev": true, "dependencies": { - "p-limit": "^2.2.0" + "p-limit": "^3.0.2" }, "engines": { - "node": ">=8" - } - }, - "node_modules/p-queue": { - "version": "9.1.0", - "resolved": "https://registry.npmjs.org/p-queue/-/p-queue-9.1.0.tgz", - "integrity": "sha512-O/ZPaXuQV29uSLbxWBGGZO1mCQXV2BLIwUr59JUU9SoH76mnYvtms7aafH/isNSNGwuEfP6W/4xD0/TJXxrizw==", - "license": "MIT", - "dependencies": { - "eventemitter3": "^5.0.1", - "p-timeout": "^7.0.0" - }, - "engines": { - "node": ">=20" + "node": ">=10" }, "funding": { "url": "https://github.com/sponsors/sindresorhus" } }, - "node_modules/p-timeout": { - "version": "7.0.1", - "resolved": "https://registry.npmjs.org/p-timeout/-/p-timeout-7.0.1.tgz", - "integrity": "sha512-AxTM2wDGORHGEkPCt8yqxOTMgpfbEHqF51f/5fJCmwFC3C/zNcGT63SymH2ttOAaiIws2zVg4+izQCjrakcwHg==", - "license": "MIT", - "engines": { - "node": ">=20" + "node_modules/parent-module": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/parent-module/-/parent-module-1.0.1.tgz", + "integrity": "sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g==", + "dev": true, + "dependencies": { + "callsites": "^3.0.0" 
}, - "funding": { - "url": "https://github.com/sponsors/sindresorhus" - } - }, - "node_modules/p-try": { - "version": "2.2.0", - "resolved": "https://registry.npmjs.org/p-try/-/p-try-2.2.0.tgz", - "integrity": "sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ==", - "license": "MIT", "engines": { "node": ">=6" } @@ -2829,7 +2576,16 @@ "version": "4.0.0", "resolved": "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz", "integrity": "sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w==", - "license": "MIT", + "dev": true, + "engines": { + "node": ">=8" + } + }, + "node_modules/path-key": { + "version": "3.1.1", + "resolved": "https://registry.npmjs.org/path-key/-/path-key-3.1.1.tgz", + "integrity": "sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q==", + "dev": true, "engines": { "node": ">=8" } @@ -2854,7 +2610,6 @@ "integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==", "dev": true, "license": "MIT", - "peer": true, "engines": { "node": ">=12" }, @@ -2862,85 +2617,6 @@ "url": "https://github.com/sponsors/jonschlinkert" } }, - "node_modules/pino": { - "version": "9.14.0", - "resolved": "https://registry.npmjs.org/pino/-/pino-9.14.0.tgz", - "integrity": "sha512-8OEwKp5juEvb/MjpIc4hjqfgCNysrS94RIOMXYvpYCdm/jglrKEiAYmiumbmGhCvs+IcInsphYDFwqrjr7398w==", - "license": "MIT", - "dependencies": { - "@pinojs/redact": "^0.4.0", - "atomic-sleep": "^1.0.0", - "on-exit-leak-free": "^2.1.0", - "pino-abstract-transport": "^2.0.0", - "pino-std-serializers": "^7.0.0", - "process-warning": "^5.0.0", - "quick-format-unescaped": "^4.0.3", - "real-require": "^0.2.0", - "safe-stable-stringify": "^2.3.1", - "sonic-boom": "^4.0.1", - "thread-stream": "^3.0.0" - }, - "bin": { - "pino": "bin.js" - } - }, - "node_modules/pino-abstract-transport": { - "version": "2.0.0", - "resolved": 
"https://registry.npmjs.org/pino-abstract-transport/-/pino-abstract-transport-2.0.0.tgz", - "integrity": "sha512-F63x5tizV6WCh4R6RHyi2Ml+M70DNRXt/+HANowMflpgGFMAym/VKm6G7ZOQRjqN7XbGxK1Lg9t6ZrtzOaivMw==", - "license": "MIT", - "dependencies": { - "split2": "^4.0.0" - } - }, - "node_modules/pino-pretty": { - "version": "13.1.3", - "resolved": "https://registry.npmjs.org/pino-pretty/-/pino-pretty-13.1.3.tgz", - "integrity": "sha512-ttXRkkOz6WWC95KeY9+xxWL6AtImwbyMHrL1mSwqwW9u+vLp/WIElvHvCSDg0xO/Dzrggz1zv3rN5ovTRVowKg==", - "license": "MIT", - "dependencies": { - "colorette": "^2.0.7", - "dateformat": "^4.6.3", - "fast-copy": "^4.0.0", - "fast-safe-stringify": "^2.1.1", - "help-me": "^5.0.0", - "joycon": "^3.1.1", - "minimist": "^1.2.6", - "on-exit-leak-free": "^2.1.0", - "pino-abstract-transport": "^3.0.0", - "pump": "^3.0.0", - "secure-json-parse": "^4.0.0", - "sonic-boom": "^4.0.1", - "strip-json-comments": "^5.0.2" - }, - "bin": { - "pino-pretty": "bin.js" - } - }, - "node_modules/pino-pretty/node_modules/pino-abstract-transport": { - "version": "3.0.0", - "resolved": "https://registry.npmjs.org/pino-abstract-transport/-/pino-abstract-transport-3.0.0.tgz", - "integrity": "sha512-wlfUczU+n7Hy/Ha5j9a/gZNy7We5+cXp8YL+X+PG8S0KXxw7n/JXA3c46Y0zQznIJ83URJiwy7Lh56WLokNuxg==", - "license": "MIT", - "dependencies": { - "split2": "^4.0.0" - } - }, - "node_modules/pino-std-serializers": { - "version": "7.1.0", - "resolved": "https://registry.npmjs.org/pino-std-serializers/-/pino-std-serializers-7.1.0.tgz", - "integrity": "sha512-BndPH67/JxGExRgiX1dX0w1FvZck5Wa4aal9198SrRhZjH3GxKQUKIBnYJTdj2HDN3UQAS06HlfcSbQj2OHmaw==", - "license": "MIT" - }, - "node_modules/pngjs": { - "version": "5.0.0", - "resolved": "https://registry.npmjs.org/pngjs/-/pngjs-5.0.0.tgz", - "integrity": "sha512-40QW5YalBNfQo5yRYmiw7Yz6TKKVr3h6970B2YE+3fQpsWcrbj1PzJgxeJ19DRQjhMbKPIuMY8rFaXc8moolVw==", - "license": "MIT", - "engines": { - "node": ">=10.13.0" - } - }, "node_modules/postcss": { "version": "8.5.6", 
"resolved": "https://registry.npmjs.org/postcss/-/postcss-8.5.6.tgz", @@ -2996,6 +2672,15 @@ "node": ">=10" } }, + "node_modules/prelude-ls": { + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/prelude-ls/-/prelude-ls-1.2.1.tgz", + "integrity": "sha512-vkcDPrRZo1QZLbn5RLGPpg/WmIQ65qoWWhcGKf/b5eplkkarX0m9z8ppCat4mlOqUsWpyNuYgO3VRyrYHSzX5g==", + "dev": true, + "engines": { + "node": ">= 0.8.0" + } + }, "node_modules/prettier": { "version": "3.8.1", "resolved": "https://registry.npmjs.org/prettier/-/prettier-3.8.1.tgz", @@ -3012,46 +2697,6 @@ "url": "https://github.com/prettier/prettier?sponsor=1" } }, - "node_modules/process-warning": { - "version": "5.0.0", - "resolved": "https://registry.npmjs.org/process-warning/-/process-warning-5.0.0.tgz", - "integrity": "sha512-a39t9ApHNx2L4+HBnQKqxxHNs1r7KF+Intd8Q/g1bUh6q0WIp9voPXJ/x0j+ZL45KF1pJd9+q2jLIRMfvEshkA==", - "funding": [ - { - "type": "github", - "url": "https://github.com/sponsors/fastify" - }, - { - "type": "opencollective", - "url": "https://opencollective.com/fastify" - } - ], - "license": "MIT" - }, - "node_modules/protobufjs": { - "version": "7.5.4", - "resolved": "https://registry.npmjs.org/protobufjs/-/protobufjs-7.5.4.tgz", - "integrity": "sha512-CvexbZtbov6jW2eXAvLukXjXUW1TzFaivC46BpWc/3BpcCysb5Vffu+B3XHMm8lVEuy2Mm4XGex8hBSg1yapPg==", - "hasInstallScript": true, - "license": "BSD-3-Clause", - "dependencies": { - "@protobufjs/aspromise": "^1.1.2", - "@protobufjs/base64": "^1.1.2", - "@protobufjs/codegen": "^2.0.4", - "@protobufjs/eventemitter": "^1.1.0", - "@protobufjs/fetch": "^1.1.0", - "@protobufjs/float": "^1.0.2", - "@protobufjs/inquire": "^1.1.0", - "@protobufjs/path": "^1.1.2", - "@protobufjs/pool": "^1.1.0", - "@protobufjs/utf8": "^1.1.0", - "@types/node": ">=13.7.0", - "long": "^5.0.0" - }, - "engines": { - "node": ">=12.0.0" - } - }, "node_modules/pump": { "version": "3.0.3", "resolved": "https://registry.npmjs.org/pump/-/pump-3.0.3.tgz", @@ -3062,49 +2707,15 @@ "once": "^1.3.1" } }, 
- "node_modules/qified": { - "version": "0.6.0", - "resolved": "https://registry.npmjs.org/qified/-/qified-0.6.0.tgz", - "integrity": "sha512-tsSGN1x3h569ZSU1u6diwhltLyfUWDp3YbFHedapTmpBl0B3P6U3+Qptg7xu+v+1io1EwhdPyyRHYbEw0KN2FA==", - "license": "MIT", - "dependencies": { - "hookified": "^1.14.0" - }, + "node_modules/punycode": { + "version": "2.3.1", + "resolved": "https://registry.npmjs.org/punycode/-/punycode-2.3.1.tgz", + "integrity": "sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg==", + "dev": true, "engines": { - "node": ">=20" + "node": ">=6" } }, - "node_modules/qrcode": { - "version": "1.5.4", - "resolved": "https://registry.npmjs.org/qrcode/-/qrcode-1.5.4.tgz", - "integrity": "sha512-1ca71Zgiu6ORjHqFBDpnSMTR2ReToX4l1Au1VFLyVeBTFavzQnv5JxMFr3ukHVKpSrSA2MCk0lNJSykjUfz7Zg==", - "license": "MIT", - "dependencies": { - "dijkstrajs": "^1.0.1", - "pngjs": "^5.0.0", - "yargs": "^15.3.1" - }, - "bin": { - "qrcode": "bin/qrcode" - }, - "engines": { - "node": ">=10.13.0" - } - }, - "node_modules/qrcode-terminal": { - "version": "0.12.0", - "resolved": "https://registry.npmjs.org/qrcode-terminal/-/qrcode-terminal-0.12.0.tgz", - "integrity": "sha512-EXtzRZmC+YGmGlDFbXKxQiMZNwCLEO6BANKXG4iCtSIM0yqc/pappSx3RIKr4r0uh5JsBckOXeKrB3Iz7mdQpQ==", - "bin": { - "qrcode-terminal": "bin/qrcode-terminal.js" - } - }, - "node_modules/quick-format-unescaped": { - "version": "4.0.4", - "resolved": "https://registry.npmjs.org/quick-format-unescaped/-/quick-format-unescaped-4.0.4.tgz", - "integrity": "sha512-tYC1Q1hgyRuHgloV/YXs2w15unPVh8qfu/qCTfhTYamaw7fyhumKa2yGpdSo87vY32rIclj+4fWYQXUMs9EHvg==", - "license": "MIT" - }, "node_modules/rc": { "version": "1.2.8", "resolved": "https://registry.npmjs.org/rc/-/rc-1.2.8.tgz", @@ -3143,30 +2754,15 @@ "node": ">= 6" } }, - "node_modules/real-require": { - "version": "0.2.0", - "resolved": "https://registry.npmjs.org/real-require/-/real-require-0.2.0.tgz", - "integrity": 
"sha512-57frrGM/OCTLqLOAh0mhVA9VBMHd+9U7Zb2THMGdBUoZVOtGbJzjxsYGDJ3A9AYYCP4hn6y1TVbaOfzWtm5GFg==", - "license": "MIT", + "node_modules/resolve-from": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/resolve-from/-/resolve-from-4.0.0.tgz", + "integrity": "sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g==", + "dev": true, "engines": { - "node": ">= 12.13.0" + "node": ">=4" } }, - "node_modules/require-directory": { - "version": "2.1.1", - "resolved": "https://registry.npmjs.org/require-directory/-/require-directory-2.1.1.tgz", - "integrity": "sha512-fGxEI7+wsG9xrvdjsrlmL22OMTTiHRwAMroiEeMgq8gzoLC/PQr7RsRDSTLUg/bZAZtF+TVIkHc6/4RIKrui+Q==", - "license": "MIT", - "engines": { - "node": ">=0.10.0" - } - }, - "node_modules/require-main-filename": { - "version": "2.0.0", - "resolved": "https://registry.npmjs.org/require-main-filename/-/require-main-filename-2.0.0.tgz", - "integrity": "sha512-NKN5kMDylKuldxYLSUfrbo5Tuzh4hd+2E8NPPX02mZtn1VuREQToYe/ZdlJy+J3uCpfaiGF05e7B8W0iXbQHmg==", - "license": "ISC" - }, "node_modules/resolve-pkg-maps": { "version": "1.0.0", "resolved": "https://registry.npmjs.org/resolve-pkg-maps/-/resolve-pkg-maps-1.0.0.tgz", @@ -3242,31 +2838,6 @@ ], "license": "MIT" }, - "node_modules/safe-stable-stringify": { - "version": "2.5.0", - "resolved": "https://registry.npmjs.org/safe-stable-stringify/-/safe-stable-stringify-2.5.0.tgz", - "integrity": "sha512-b3rppTKm9T+PsVCBEOUR46GWI7fdOs00VKZ1+9c1EWDaDMvjQc6tUwuFyIprgGgTcWoVHSKrU8H31ZHA2e0RHA==", - "license": "MIT", - "engines": { - "node": ">=10" - } - }, - "node_modules/secure-json-parse": { - "version": "4.1.0", - "resolved": "https://registry.npmjs.org/secure-json-parse/-/secure-json-parse-4.1.0.tgz", - "integrity": "sha512-l4KnYfEyqYJxDwlNVyRfO2E4NTHfMKAWdUuA8J0yve2Dz/E/PdBepY03RvyJpssIpRFwJoCD55wA+mEDs6ByWA==", - "funding": [ - { - "type": "github", - "url": "https://github.com/sponsors/fastify" - }, - { - "type": "opencollective", - 
"url": "https://opencollective.com/fastify" - } - ], - "license": "BSD-3-Clause" - }, "node_modules/semver": { "version": "7.7.4", "resolved": "https://registry.npmjs.org/semver/-/semver-7.7.4.tgz", @@ -3279,55 +2850,25 @@ "node": ">=10" } }, - "node_modules/set-blocking": { + "node_modules/shebang-command": { "version": "2.0.0", - "resolved": "https://registry.npmjs.org/set-blocking/-/set-blocking-2.0.0.tgz", - "integrity": "sha512-KiKBS8AnWGEyLzofFfmvKwpdPzqiy16LvQfK3yv/fVH7Bj13/wl3JSR1J+rfgRE9q7xUJK4qvgS8raSOeLUehw==", - "license": "ISC" - }, - "node_modules/sharp": { - "version": "0.34.5", - "resolved": "https://registry.npmjs.org/sharp/-/sharp-0.34.5.tgz", - "integrity": "sha512-Ou9I5Ft9WNcCbXrU9cMgPBcCK8LiwLqcbywW3t4oDV37n1pzpuNLsYiAV8eODnjbtQlSDwZ2cUEeQz4E54Hltg==", - "hasInstallScript": true, - "license": "Apache-2.0", - "peer": true, + "resolved": "https://registry.npmjs.org/shebang-command/-/shebang-command-2.0.0.tgz", + "integrity": "sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==", + "dev": true, "dependencies": { - "@img/colour": "^1.0.0", - "detect-libc": "^2.1.2", - "semver": "^7.7.3" + "shebang-regex": "^3.0.0" }, "engines": { - "node": "^18.17.0 || ^20.3.0 || >=21.0.0" - }, - "funding": { - "url": "https://opencollective.com/libvips" - }, - "optionalDependencies": { - "@img/sharp-darwin-arm64": "0.34.5", - "@img/sharp-darwin-x64": "0.34.5", - "@img/sharp-libvips-darwin-arm64": "1.2.4", - "@img/sharp-libvips-darwin-x64": "1.2.4", - "@img/sharp-libvips-linux-arm": "1.2.4", - "@img/sharp-libvips-linux-arm64": "1.2.4", - "@img/sharp-libvips-linux-ppc64": "1.2.4", - "@img/sharp-libvips-linux-riscv64": "1.2.4", - "@img/sharp-libvips-linux-s390x": "1.2.4", - "@img/sharp-libvips-linux-x64": "1.2.4", - "@img/sharp-libvips-linuxmusl-arm64": "1.2.4", - "@img/sharp-libvips-linuxmusl-x64": "1.2.4", - "@img/sharp-linux-arm": "0.34.5", - "@img/sharp-linux-arm64": "0.34.5", - "@img/sharp-linux-ppc64": "0.34.5", - 
"@img/sharp-linux-riscv64": "0.34.5", - "@img/sharp-linux-s390x": "0.34.5", - "@img/sharp-linux-x64": "0.34.5", - "@img/sharp-linuxmusl-arm64": "0.34.5", - "@img/sharp-linuxmusl-x64": "0.34.5", - "@img/sharp-wasm32": "0.34.5", - "@img/sharp-win32-arm64": "0.34.5", - "@img/sharp-win32-ia32": "0.34.5", - "@img/sharp-win32-x64": "0.34.5" + "node": ">=8" + } + }, + "node_modules/shebang-regex": { + "version": "3.0.0", + "resolved": "https://registry.npmjs.org/shebang-regex/-/shebang-regex-3.0.0.tgz", + "integrity": "sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A==", + "dev": true, + "engines": { + "node": ">=8" } }, "node_modules/siginfo": { @@ -3382,15 +2923,6 @@ "simple-concat": "^1.0.0" } }, - "node_modules/sonic-boom": { - "version": "4.2.1", - "resolved": "https://registry.npmjs.org/sonic-boom/-/sonic-boom-4.2.1.tgz", - "integrity": "sha512-w6AxtubXa2wTXAUsZMMWERrsIRAdrK0Sc+FUytWvYAhBJLyuI4llrMIC1DtlNSdI99EI86KZum2MMq3EAZlF9Q==", - "license": "MIT", - "dependencies": { - "atomic-sleep": "^1.0.0" - } - }, "node_modules/source-map-js": { "version": "1.2.1", "resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz", @@ -3401,15 +2933,6 @@ "node": ">=0.10.0" } }, - "node_modules/split2": { - "version": "4.2.0", - "resolved": "https://registry.npmjs.org/split2/-/split2-4.2.0.tgz", - "integrity": "sha512-UcjcJOWknrNkF6PLX83qcHM6KHgVKNkV62Y8a5uYDVv9ydGQVwAHMKqHdJje1VTWpljG0WYpCDhrCdAOYH4TWg==", - "license": "ISC", - "engines": { - "node": ">= 10.x" - } - }, "node_modules/stackback": { "version": "0.0.2", "resolved": "https://registry.npmjs.org/stackback/-/stackback-0.0.2.tgz", @@ -3433,60 +2956,6 @@ "safe-buffer": "~5.2.0" } }, - "node_modules/string-width": { - "version": "4.2.3", - "resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz", - "integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==", - "license": "MIT", - 
"dependencies": { - "emoji-regex": "^8.0.0", - "is-fullwidth-code-point": "^3.0.0", - "strip-ansi": "^6.0.1" - }, - "engines": { - "node": ">=8" - } - }, - "node_modules/strip-ansi": { - "version": "6.0.1", - "resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz", - "integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==", - "license": "MIT", - "dependencies": { - "ansi-regex": "^5.0.1" - }, - "engines": { - "node": ">=8" - } - }, - "node_modules/strip-json-comments": { - "version": "5.0.3", - "resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-5.0.3.tgz", - "integrity": "sha512-1tB5mhVo7U+ETBKNf92xT4hrQa3pm0MZ0PQvuDnWgAAGHDsfp4lPSpiS6psrSiet87wyGPh9ft6wmhOMQ0hDiw==", - "license": "MIT", - "engines": { - "node": ">=14.16" - }, - "funding": { - "url": "https://github.com/sponsors/sindresorhus" - } - }, - "node_modules/strtok3": { - "version": "10.3.4", - "resolved": "https://registry.npmjs.org/strtok3/-/strtok3-10.3.4.tgz", - "integrity": "sha512-KIy5nylvC5le1OdaaoCJ07L+8iQzJHGH6pWDuzS+d07Cu7n1MZ2x26P8ZKIWfbK02+XIL8Mp4RkWeqdUCrDMfg==", - "license": "MIT", - "dependencies": { - "@tokenizer/token": "^0.3.0" - }, - "engines": { - "node": ">=18" - }, - "funding": { - "type": "github", - "url": "https://github.com/sponsors/Borewit" - } - }, "node_modules/supports-color": { "version": "7.2.0", "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz", @@ -3528,15 +2997,6 @@ "node": ">=6" } }, - "node_modules/thread-stream": { - "version": "3.1.0", - "resolved": "https://registry.npmjs.org/thread-stream/-/thread-stream-3.1.0.tgz", - "integrity": "sha512-OqyPZ9u96VohAyMfJykzmivOrY2wfMSf3C5TtFJVgN+Hm6aj+voFhlK+kZEIv2FBh1X6Xp3DlnCOfEQ3B2J86A==", - "license": "MIT", - "dependencies": { - "real-require": "^0.2.0" - } - }, "node_modules/tinybench": { "version": "2.9.0", "resolved": "https://registry.npmjs.org/tinybench/-/tinybench-2.9.0.tgz", @@ 
-3581,37 +3041,24 @@ "node": ">=14.0.0" } }, - "node_modules/token-types": { - "version": "6.1.2", - "resolved": "https://registry.npmjs.org/token-types/-/token-types-6.1.2.tgz", - "integrity": "sha512-dRXchy+C0IgK8WPC6xvCHFRIWYUbqqdEIKPaKo/AcTUNzwLTK6AH7RjdLWsEZcAN/TBdtfUw3PYEgPr5VPr6ww==", - "license": "MIT", - "dependencies": { - "@borewit/text-codec": "^0.2.1", - "@tokenizer/token": "^0.3.0", - "ieee754": "^1.2.1" - }, + "node_modules/ts-api-utils": { + "version": "2.5.0", + "resolved": "https://registry.npmjs.org/ts-api-utils/-/ts-api-utils-2.5.0.tgz", + "integrity": "sha512-OJ/ibxhPlqrMM0UiNHJ/0CKQkoKF243/AEmplt3qpRgkW8VG7IfOS41h7V8TjITqdByHzrjcS/2si+y4lIh8NA==", + "dev": true, "engines": { - "node": ">=14.16" + "node": ">=18.12" }, - "funding": { - "type": "github", - "url": "https://github.com/sponsors/Borewit" + "peerDependencies": { + "typescript": ">=4.8.4" } }, - "node_modules/tslib": { - "version": "2.8.1", - "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.8.1.tgz", - "integrity": "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w==", - "license": "0BSD" - }, "node_modules/tsx": { "version": "4.21.0", "resolved": "https://registry.npmjs.org/tsx/-/tsx-4.21.0.tgz", "integrity": "sha512-5C1sg4USs1lfG0GFb2RLXsdpXqBSEhAaA/0kPL01wxzpMqLILNxIxIOKiILz+cdg/pLnOUxFYOR5yhHU666wbw==", "dev": true, "license": "MIT", - "peer": true, "dependencies": { "esbuild": "~0.27.0", "get-tsconfig": "^4.7.5" @@ -3638,6 +3085,18 @@ "node": "*" } }, + "node_modules/type-check": { + "version": "0.4.0", + "resolved": "https://registry.npmjs.org/type-check/-/type-check-0.4.0.tgz", + "integrity": "sha512-XleUoc9uwGXqjWwXaUTZAmzMcFZ5858QA2vvx1Ur5xIcixXIP+8LnFDgRplU30us6teqdlskFfu+ae4K79Ooew==", + "dev": true, + "dependencies": { + "prelude-ls": "^1.2.1" + }, + "engines": { + "node": ">= 0.8.0" + } + }, "node_modules/typescript": { "version": "5.9.3", "resolved": "https://registry.npmjs.org/typescript/-/typescript-5.9.3.tgz", @@ 
-3652,24 +3111,45 @@ "node": ">=14.17" } }, - "node_modules/uint8array-extras": { - "version": "1.5.0", - "resolved": "https://registry.npmjs.org/uint8array-extras/-/uint8array-extras-1.5.0.tgz", - "integrity": "sha512-rvKSBiC5zqCCiDZ9kAOszZcDvdAHwwIKJG33Ykj43OKcWsnmcBRL09YTU4nOeHZ8Y2a7l1MgTd08SBe9A8Qj6A==", - "license": "MIT", + "node_modules/typescript-eslint": { + "version": "8.57.2", + "resolved": "https://registry.npmjs.org/typescript-eslint/-/typescript-eslint-8.57.2.tgz", + "integrity": "sha512-VEPQ0iPgWO/sBaZOU1xo4nuNdODVOajPnTIbog2GKYr31nIlZ0fWPoCQgGfF3ETyBl1vn63F/p50Um9Z4J8O8A==", + "dev": true, + "dependencies": { + "@typescript-eslint/eslint-plugin": "8.57.2", + "@typescript-eslint/parser": "8.57.2", + "@typescript-eslint/typescript-estree": "8.57.2", + "@typescript-eslint/utils": "8.57.2" + }, "engines": { - "node": ">=18" + "node": "^18.18.0 || ^20.9.0 || >=21.1.0" }, "funding": { - "url": "https://github.com/sponsors/sindresorhus" + "type": "opencollective", + "url": "https://opencollective.com/typescript-eslint" + }, + "peerDependencies": { + "eslint": "^8.57.0 || ^9.0.0 || ^10.0.0", + "typescript": ">=4.8.4 <6.0.0" } }, "node_modules/undici-types": { "version": "6.21.0", "resolved": "https://registry.npmjs.org/undici-types/-/undici-types-6.21.0.tgz", "integrity": "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ==", + "dev": true, "license": "MIT" }, + "node_modules/uri-js": { + "version": "4.4.1", + "resolved": "https://registry.npmjs.org/uri-js/-/uri-js-4.4.1.tgz", + "integrity": "sha512-7rKUyy33Q1yc98pQ1DAmLtwX109F7TIfWlW1Ydo8Wl1ii1SeHieeh0HHfPeL2fMXK6z0s8ecKs9frCuLJvndBg==", + "dev": true, + "dependencies": { + "punycode": "^2.1.0" + } + }, "node_modules/util-deprecate": { "version": "1.0.2", "resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz", @@ -3682,7 +3162,6 @@ "integrity": 
"sha512-w+N7Hifpc3gRjZ63vYBXA56dvvRlNWRczTdmCBBa+CotUzAPf5b7YMdMR/8CQoeYE5LX3W4wj6RYTgonm1b9DA==", "dev": true, "license": "MIT", - "peer": true, "dependencies": { "esbuild": "^0.27.0", "fdir": "^6.5.0", @@ -3758,7 +3237,6 @@ "integrity": "sha512-hOQuK7h0FGKgBAas7v0mSAsnvrIgAvWmRFjmzpJ7SwFHH3g1k2u37JtYwOwmEKhK6ZO3v9ggDBBm0La1LCK4uQ==", "dev": true, "license": "MIT", - "peer": true, "dependencies": { "@vitest/expect": "4.0.18", "@vitest/mocker": "4.0.18", @@ -3831,11 +3309,20 @@ } } }, - "node_modules/which-module": { - "version": "2.0.1", - "resolved": "https://registry.npmjs.org/which-module/-/which-module-2.0.1.tgz", - "integrity": "sha512-iBdZ57RDvnOR9AGBhML2vFZf7h8vmBjhoaZqODJBFWHVtKkDmKuHai3cx5PgVMrX5YDNp27AofYbAwctSS+vhQ==", - "license": "ISC" + "node_modules/which": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz", + "integrity": "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==", + "dev": true, + "dependencies": { + "isexe": "^2.0.0" + }, + "bin": { + "node-which": "bin/node-which" + }, + "engines": { + "node": ">= 8" + } }, "node_modules/why-is-node-running": { "version": "2.3.0", @@ -3854,24 +3341,13 @@ "node": ">=8" } }, - "node_modules/win-guid": { - "version": "0.2.1", - "resolved": "https://registry.npmjs.org/win-guid/-/win-guid-0.2.1.tgz", - "integrity": "sha512-gEIQU4mkgl2OPeoNrWflcJFJ3Ae2BPd4eCsHHA/XikslkIVms/nHhvnvzIZV7VLmBvtFlDOzLt9rrZT+n6D67A==", - "license": "MIT" - }, - "node_modules/wrap-ansi": { - "version": "6.2.0", - "resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-6.2.0.tgz", - "integrity": "sha512-r6lPcBGxZXlIcymEu7InxDMhdW0KDxpLgoFLcguasxCaJ/SOIZwINatK9KY/tf+ZrlywOKU0UDj3ATXUBfxJXA==", - "license": "MIT", - "dependencies": { - "ansi-styles": "^4.0.0", - "string-width": "^4.1.0", - "strip-ansi": "^6.0.0" - }, + "node_modules/word-wrap": { + "version": "1.2.5", + "resolved": "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.5.tgz", + 
"integrity": "sha512-BN22B5eaMMI9UMtjrGd5g5eCYPpCPDUy0FJXbYsaT5zYxjFOckS53SQDE3pWkVoWpHXVb3BrYcEN4Twa55B5cA==", + "dev": true, "engines": { - "node": ">=8" + "node": ">=0.10.0" } }, "node_modules/wrappy": { @@ -3880,38 +3356,13 @@ "integrity": "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ==", "license": "ISC" }, - "node_modules/ws": { - "version": "8.19.0", - "resolved": "https://registry.npmjs.org/ws/-/ws-8.19.0.tgz", - "integrity": "sha512-blAT2mjOEIi0ZzruJfIhb3nps74PRWTCz1IjglWEEpQl5XS/UNama6u2/rjFkDDouqr4L67ry+1aGIALViWjDg==", - "license": "MIT", - "engines": { - "node": ">=10.0.0" - }, - "peerDependencies": { - "bufferutil": "^4.0.1", - "utf-8-validate": ">=5.0.2" - }, - "peerDependenciesMeta": { - "bufferutil": { - "optional": true - }, - "utf-8-validate": { - "optional": true - } - } - }, - "node_modules/y18n": { - "version": "4.0.3", - "resolved": "https://registry.npmjs.org/y18n/-/y18n-4.0.3.tgz", - "integrity": "sha512-JKhqTOwSrqNA1NY5lSztJ1GrBiUodLMmIZuLiDaMRJ+itFd+ABVE8XBjOvIWL+rSqNDC74LCSFmlb/U4UZ4hJQ==", - "license": "ISC" - }, "node_modules/yaml": { "version": "2.8.2", "resolved": "https://registry.npmjs.org/yaml/-/yaml-2.8.2.tgz", "integrity": "sha512-mplynKqc1C2hTVYxd0PU2xQAc22TI1vShAYGksCCfxbn/dFwnHTNi1bvYsBTkhdUNtGIf5xNOg938rrSSYvS9A==", + "dev": true, "license": "ISC", + "optional": true, "peer": true, "bin": { "yaml": "bin.mjs" @@ -3923,48 +3374,16 @@ "url": "https://github.com/sponsors/eemeli" } }, - "node_modules/yargs": { - "version": "15.4.1", - "resolved": "https://registry.npmjs.org/yargs/-/yargs-15.4.1.tgz", - "integrity": "sha512-aePbxDmcYW++PaqBsJ+HYUFwCdv4LVvdnhBy78E57PIor8/OVvhMrADFFEDh8DHDFRv/O9i3lPhsENjO7QX0+A==", - "license": "MIT", - "dependencies": { - "cliui": "^6.0.0", - "decamelize": "^1.2.0", - "find-up": "^4.1.0", - "get-caller-file": "^2.0.1", - "require-directory": "^2.1.1", - "require-main-filename": "^2.0.0", - "set-blocking": "^2.0.0", - "string-width": "^4.2.0", - 
"which-module": "^2.0.0", - "y18n": "^4.0.0", - "yargs-parser": "^18.1.2" - }, + "node_modules/yocto-queue": { + "version": "0.1.0", + "resolved": "https://registry.npmjs.org/yocto-queue/-/yocto-queue-0.1.0.tgz", + "integrity": "sha512-rVksvsnNCdJ/ohGc6xgPwyN8eheCxsiLM8mxuE/t/mOVqJewPuO1miLpTHQiRgTKCLexL4MeAFVagts7HmNZ2Q==", + "dev": true, "engines": { - "node": ">=8" - } - }, - "node_modules/yargs-parser": { - "version": "18.1.3", - "resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-18.1.3.tgz", - "integrity": "sha512-o50j0JeToy/4K6OZcaQmW6lyXXKhq7csREXcDwk2omFPJEwUNOVtJKvmDr9EI1fAJZUyZcRF7kxGBWmRXudrCQ==", - "license": "ISC", - "dependencies": { - "camelcase": "^5.0.0", - "decamelize": "^1.2.0" + "node": ">=10" }, - "engines": { - "node": ">=6" - } - }, - "node_modules/zod": { - "version": "4.3.6", - "resolved": "https://registry.npmjs.org/zod/-/zod-4.3.6.tgz", - "integrity": "sha512-rftlrkhHZOcjDwkGlnUtZZkvaPHCsDATp4pGpuOOMDaTdDDXF91wuVDJoWoPsKX/3YPQ5fHuF3STjcYyKr+Qhg==", - "license": "MIT", "funding": { - "url": "https://github.com/sponsors/colinhacks" + "url": "https://github.com/sponsors/sindresorhus" } } } diff --git a/package.json b/package.json index f33839f..081d2b4 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { "name": "nanoclaw", - "version": "1.1.3", + "version": "1.2.36", "description": "Personal Claude assistant. 
Lightweight, secure, customizable.", "type": "module", "main": "dist/index.js", @@ -8,36 +8,35 @@ "build": "tsc", "start": "node dist/index.js", "dev": "tsx src/index.ts", - "auth": "tsx src/whatsapp-auth.ts", "typecheck": "tsc --noEmit", "format": "prettier --write \"src/**/*.ts\"", "format:fix": "prettier --write \"src/**/*.ts\"", "format:check": "prettier --check \"src/**/*.ts\"", "prepare": "husky", "setup": "tsx setup/index.ts", + "auth": "tsx src/whatsapp-auth.ts", + "lint": "eslint src/", + "lint:fix": "eslint src/ --fix", "test": "vitest run", "test:watch": "vitest" }, "dependencies": { - "@whiskeysockets/baileys": "^7.0.0-rc.9", - "better-sqlite3": "^11.8.1", - "cron-parser": "^5.5.0", - "pino": "^9.6.0", - "pino-pretty": "^13.0.0", - "qrcode": "^1.5.4", - "qrcode-terminal": "^0.12.0", - "yaml": "^2.8.2", - "zod": "^4.3.6" + "@onecli-sh/sdk": "^0.2.0", + "better-sqlite3": "11.10.0", + "cron-parser": "5.5.0" }, "devDependencies": { + "@eslint/js": "^9.35.0", "@types/better-sqlite3": "^7.6.12", "@types/node": "^22.10.0", - "@types/qrcode-terminal": "^0.12.2", - "@vitest/coverage-v8": "^4.0.18", + "eslint": "^9.35.0", + "eslint-plugin-no-catch-all": "^1.1.0", + "globals": "^15.12.0", "husky": "^9.1.7", "prettier": "^3.8.1", "tsx": "^4.19.0", "typescript": "^5.7.0", + "typescript-eslint": "^8.35.0", "vitest": "^4.0.18" }, "engines": { diff --git a/repo-tokens/badge.svg b/repo-tokens/badge.svg index e59720d..6e1646a 100644 --- a/repo-tokens/badge.svg +++ b/repo-tokens/badge.svg @@ -1,5 +1,5 @@ - - 38.4k tokens, 19% of context window + + 42.0k tokens, 21% of context window @@ -15,8 +15,8 @@ tokens - - 38.4k + + 42.0k diff --git a/scripts/apply-skill.ts b/scripts/apply-skill.ts deleted file mode 100644 index db31bdc..0000000 --- a/scripts/apply-skill.ts +++ /dev/null @@ -1,24 +0,0 @@ -import { applySkill } from '../skills-engine/apply.js'; -import { initNanoclawDir } from '../skills-engine/init.js'; - -const args = process.argv.slice(2); - -// Handle --init flag: 
initialize .nanoclaw/ directory and exit -if (args.includes('--init')) { - initNanoclawDir(); - console.log(JSON.stringify({ success: true, action: 'init' })); - process.exit(0); -} - -const skillDir = args[0]; -if (!skillDir) { - console.error('Usage: tsx scripts/apply-skill.ts [--init] '); - process.exit(1); -} - -const result = await applySkill(skillDir); -console.log(JSON.stringify(result, null, 2)); - -if (!result.success) { - process.exit(1); -} diff --git a/scripts/fix-skill-drift.ts b/scripts/fix-skill-drift.ts deleted file mode 100644 index ffa5c35..0000000 --- a/scripts/fix-skill-drift.ts +++ /dev/null @@ -1,266 +0,0 @@ -#!/usr/bin/env npx tsx -/** - * Auto-fix drifted skills by three-way merging their modify/ files. - * - * For each drifted skill's `modifies` entry: - * 1. Find the commit where the skill's modify/ copy was last updated - * 2. Retrieve the source file at that commit (old base) - * 3. git merge-file - * - Clean merge → modify/ file is auto-updated - * - Conflicts → conflict markers left in place for human/Claude review - * - * The calling workflow should commit the resulting changes and create a PR. 
- * - * Sets GitHub Actions outputs: - * has_conflicts — "true" | "false" - * fixed_count — number of auto-fixed files - * conflict_count — number of files with unresolved conflict markers - * summary — human-readable summary for PR body - * - * Usage: npx tsx scripts/fix-skill-drift.ts add-telegram add-discord - */ -import { execFileSync, execSync } from 'child_process'; -import crypto from 'crypto'; -import fs from 'fs'; -import os from 'os'; -import path from 'path'; - -import { parse } from 'yaml'; -import type { SkillManifest } from '../skills-engine/types.js'; - -interface FixResult { - skill: string; - file: string; - status: 'auto-fixed' | 'conflict' | 'skipped' | 'error'; - conflicts?: number; - reason?: string; -} - -function readManifest(skillDir: string): SkillManifest { - const manifestPath = path.join(skillDir, 'manifest.yaml'); - return parse(fs.readFileSync(manifestPath, 'utf-8')) as SkillManifest; -} - -function fixSkill(skillName: string, projectRoot: string): FixResult[] { - const skillDir = path.join(projectRoot, '.claude', 'skills', skillName); - const manifest = readManifest(skillDir); - const results: FixResult[] = []; - - for (const relPath of manifest.modifies) { - const modifyPath = path.join(skillDir, 'modify', relPath); - const currentPath = path.join(projectRoot, relPath); - - if (!fs.existsSync(modifyPath)) { - results.push({ - skill: skillName, - file: relPath, - status: 'skipped', - reason: 'modify/ file not found', - }); - continue; - } - - if (!fs.existsSync(currentPath)) { - results.push({ - skill: skillName, - file: relPath, - status: 'skipped', - reason: 'source file not found on main', - }); - continue; - } - - // Find when the skill's modify file was last changed - let lastCommit: string; - try { - lastCommit = execSync(`git log -1 --format=%H -- "${modifyPath}"`, { - encoding: 'utf-8', - stdio: ['pipe', 'pipe', 'pipe'], - }).trim(); - } catch { - results.push({ - skill: skillName, - file: relPath, - status: 'skipped', - 
reason: 'no git history for modify file', - }); - continue; - } - - if (!lastCommit) { - results.push({ - skill: skillName, - file: relPath, - status: 'skipped', - reason: 'no commits found for modify file', - }); - continue; - } - - // Get the source file at that commit (the old base the skill was written against) - const tmpOldBase = path.join( - os.tmpdir(), - `nanoclaw-drift-base-${crypto.randomUUID()}`, - ); - try { - const oldBase = execSync(`git show "${lastCommit}:${relPath}"`, { - encoding: 'utf-8', - stdio: ['pipe', 'pipe', 'pipe'], - }); - fs.writeFileSync(tmpOldBase, oldBase); - } catch { - results.push({ - skill: skillName, - file: relPath, - status: 'skipped', - reason: `source file not found at commit ${lastCommit.slice(0, 7)}`, - }); - continue; - } - - // If old base == current main, the source hasn't changed since the skill was updated. - // The skill is already in sync for this file. - const currentContent = fs.readFileSync(currentPath, 'utf-8'); - const oldBaseContent = fs.readFileSync(tmpOldBase, 'utf-8'); - if (oldBaseContent === currentContent) { - fs.unlinkSync(tmpOldBase); - results.push({ - skill: skillName, - file: relPath, - status: 'skipped', - reason: 'source unchanged since skill update', - }); - continue; - } - - // Three-way merge: modify/file ← old_base → current_main - // git merge-file modifies first argument in-place - try { - execFileSync('git', ['merge-file', modifyPath, tmpOldBase, currentPath], { - stdio: 'pipe', - }); - results.push({ skill: skillName, file: relPath, status: 'auto-fixed' }); - } catch (err: any) { - const exitCode = err.status ?? 
-1; - if (exitCode > 0) { - // Positive exit code = number of conflicts, file has markers - results.push({ - skill: skillName, - file: relPath, - status: 'conflict', - conflicts: exitCode, - }); - } else { - results.push({ - skill: skillName, - file: relPath, - status: 'error', - reason: err.message, - }); - } - } finally { - try { - fs.unlinkSync(tmpOldBase); - } catch { - /* ignore */ - } - } - } - - return results; -} - -function setOutput(key: string, value: string): void { - const outputFile = process.env.GITHUB_OUTPUT; - if (!outputFile) return; - - if (value.includes('\n')) { - const delimiter = `ghadelim_${Date.now()}`; - fs.appendFileSync( - outputFile, - `${key}<<${delimiter}\n${value}\n${delimiter}\n`, - ); - } else { - fs.appendFileSync(outputFile, `${key}=${value}\n`); - } -} - -async function main(): Promise { - const projectRoot = process.cwd(); - const skillNames = process.argv.slice(2); - - if (skillNames.length === 0) { - console.error( - 'Usage: npx tsx scripts/fix-skill-drift.ts [skill2] ...', - ); - process.exit(1); - } - - console.log(`Attempting auto-fix for: ${skillNames.join(', ')}\n`); - - const allResults: FixResult[] = []; - - for (const skillName of skillNames) { - console.log(`--- ${skillName} ---`); - const results = fixSkill(skillName, projectRoot); - allResults.push(...results); - - for (const r of results) { - const icon = - r.status === 'auto-fixed' - ? 'FIXED' - : r.status === 'conflict' - ? `CONFLICT (${r.conflicts})` - : r.status === 'skipped' - ? 'SKIP' - : 'ERROR'; - const detail = r.reason ? 
` -- ${r.reason}` : ''; - console.log(` ${icon} ${r.file}${detail}`); - } - } - - // Summary - const fixed = allResults.filter((r) => r.status === 'auto-fixed'); - const conflicts = allResults.filter((r) => r.status === 'conflict'); - const skipped = allResults.filter((r) => r.status === 'skipped'); - - console.log('\n=== Summary ==='); - console.log(` Auto-fixed: ${fixed.length}`); - console.log(` Conflicts: ${conflicts.length}`); - console.log(` Skipped: ${skipped.length}`); - - // Build markdown summary for PR body - const summaryLines: string[] = []; - for (const skillName of skillNames) { - const skillResults = allResults.filter((r) => r.skill === skillName); - const fixedFiles = skillResults.filter((r) => r.status === 'auto-fixed'); - const conflictFiles = skillResults.filter((r) => r.status === 'conflict'); - - summaryLines.push(`### ${skillName}`); - if (fixedFiles.length > 0) { - summaryLines.push( - `Auto-fixed: ${fixedFiles.map((r) => `\`${r.file}\``).join(', ')}`, - ); - } - if (conflictFiles.length > 0) { - summaryLines.push( - `Needs manual resolution: ${conflictFiles.map((r) => `\`${r.file}\``).join(', ')}`, - ); - } - if (fixedFiles.length === 0 && conflictFiles.length === 0) { - summaryLines.push('No modify/ files needed updating.'); - } - summaryLines.push(''); - } - - // GitHub outputs - setOutput('has_conflicts', conflicts.length > 0 ? 
'true' : 'false'); - setOutput('fixed_count', String(fixed.length)); - setOutput('conflict_count', String(conflicts.length)); - setOutput('summary', summaryLines.join('\n')); -} - -main().catch((err) => { - console.error('Fatal error:', err); - process.exit(1); -}); diff --git a/scripts/post-update.ts b/scripts/post-update.ts deleted file mode 100644 index 83612b5..0000000 --- a/scripts/post-update.ts +++ /dev/null @@ -1,5 +0,0 @@ -#!/usr/bin/env tsx -import { clearBackup } from '../skills-engine/backup.js'; - -clearBackup(); -console.log('Backup cleared.'); diff --git a/scripts/rebase.ts b/scripts/rebase.ts deleted file mode 100644 index 047e07c..0000000 --- a/scripts/rebase.ts +++ /dev/null @@ -1,21 +0,0 @@ -#!/usr/bin/env npx tsx -import { rebase } from '../skills-engine/rebase.js'; - -async function main() { - const newBasePath = process.argv[2]; // optional - - if (newBasePath) { - console.log(`Rebasing with new base from: ${newBasePath}`); - } else { - console.log('Rebasing current state...'); - } - - const result = await rebase(newBasePath); - console.log(JSON.stringify(result, null, 2)); - - if (!result.success) { - process.exit(1); - } -} - -main(); diff --git a/scripts/run-migrations.ts b/scripts/run-migrations.ts index 355312a..b75c26e 100644 --- a/scripts/run-migrations.ts +++ b/scripts/run-migrations.ts @@ -3,7 +3,15 @@ import { execFileSync, execSync } from 'child_process'; import fs from 'fs'; import path from 'path'; -import { compareSemver } from '../skills-engine/state.js'; +function compareSemver(a: string, b: string): number { + const partsA = a.split('.').map(Number); + const partsB = b.split('.').map(Number); + for (let i = 0; i < Math.max(partsA.length, partsB.length); i++) { + const diff = (partsA[i] || 0) - (partsB[i] || 0); + if (diff !== 0) return diff; + } + return 0; +} // Resolve tsx binary once to avoid npx race conditions across migrations function resolveTsx(): string { diff --git a/scripts/uninstall-skill.ts 
b/scripts/uninstall-skill.ts deleted file mode 100644 index a3d6682..0000000 --- a/scripts/uninstall-skill.ts +++ /dev/null @@ -1,39 +0,0 @@ -#!/usr/bin/env npx tsx -import { uninstallSkill } from '../skills-engine/uninstall.js'; - -async function main() { - const skillName = process.argv[2]; - if (!skillName) { - console.error('Usage: npx tsx scripts/uninstall-skill.ts '); - process.exit(1); - } - - console.log(`Uninstalling skill: ${skillName}`); - const result = await uninstallSkill(skillName); - - if (result.customPatchWarning) { - console.warn(`\nWarning: ${result.customPatchWarning}`); - console.warn( - 'To proceed, remove the custom_patch from state.yaml and re-run.', - ); - process.exit(1); - } - - if (!result.success) { - console.error(`\nFailed: ${result.error}`); - process.exit(1); - } - - console.log(`\nSuccessfully uninstalled: ${skillName}`); - if (result.replayResults) { - console.log('Replay test results:'); - for (const [name, passed] of Object.entries(result.replayResults)) { - console.log(` ${name}: ${passed ? 
'PASS' : 'FAIL'}`); - } - } -} - -main().catch((err) => { - console.error(err); - process.exit(1); -}); diff --git a/scripts/update-core.ts b/scripts/update-core.ts deleted file mode 100644 index 1bc05e1..0000000 --- a/scripts/update-core.ts +++ /dev/null @@ -1,61 +0,0 @@ -#!/usr/bin/env tsx -import { applyUpdate, previewUpdate } from '../skills-engine/update.js'; - -const args = process.argv.slice(2); -const jsonMode = args.includes('--json'); -const previewOnly = args.includes('--preview-only'); -const newCorePath = args.find((a) => !a.startsWith('--')); - -if (!newCorePath) { - console.error( - 'Usage: tsx scripts/update-core.ts [--json] [--preview-only] ', - ); - process.exit(1); -} - -// Preview -const preview = previewUpdate(newCorePath); - -if (jsonMode && previewOnly) { - console.log(JSON.stringify(preview, null, 2)); - process.exit(0); -} - -function printPreview(): void { - console.log('=== Update Preview ==='); - console.log(`Current version: ${preview.currentVersion}`); - console.log(`New version: ${preview.newVersion}`); - console.log(`Files changed: ${preview.filesChanged.length}`); - if (preview.filesChanged.length > 0) { - for (const f of preview.filesChanged) { - console.log(` ${f}`); - } - } - if (preview.conflictRisk.length > 0) { - console.log(`Conflict risk: ${preview.conflictRisk.join(', ')}`); - } - if (preview.customPatchesAtRisk.length > 0) { - console.log( - `Custom patches at risk: ${preview.customPatchesAtRisk.join(', ')}`, - ); - } -} - -if (previewOnly) { - printPreview(); - process.exit(0); -} - -if (!jsonMode) { - printPreview(); - console.log(''); - console.log('Applying update...'); -} - -const result = await applyUpdate(newCorePath); - -console.log(JSON.stringify(result, null, 2)); - -if (!result.success) { - process.exit(1); -} diff --git a/scripts/validate-all-skills.ts b/scripts/validate-all-skills.ts deleted file mode 100644 index 5208a90..0000000 --- a/scripts/validate-all-skills.ts +++ /dev/null @@ -1,252 +0,0 @@ 
-#!/usr/bin/env npx tsx -/** - * Validate all skills by applying each in isolation against current main. - * - * For each skill: - * 1. Reset working tree to clean state - * 2. Initialize .nanoclaw/ (snapshot current source as base) - * 3. Apply skill via apply-skill.ts - * 4. Run tsc --noEmit (typecheck) - * 5. Run the skill's test command (from manifest.yaml) - * - * Sets GitHub Actions outputs: - * drifted — "true" | "false" - * drifted_skills — JSON array of drifted skill names, e.g. ["add-telegram"] - * results — JSON array of per-skill results - * - * Exit code 1 if any skill drifted, 0 otherwise. - * - * Usage: - * npx tsx scripts/validate-all-skills.ts # validate all - * npx tsx scripts/validate-all-skills.ts add-telegram # validate one - */ -import { execSync } from 'child_process'; -import fs from 'fs'; -import path from 'path'; - -import { parse } from 'yaml'; -import type { SkillManifest } from '../skills-engine/types.js'; - -interface SkillValidationResult { - name: string; - success: boolean; - failedStep?: 'apply' | 'typecheck' | 'test'; - error?: string; -} - -function discoverSkills( - skillsDir: string, -): { name: string; dir: string; manifest: SkillManifest }[] { - if (!fs.existsSync(skillsDir)) return []; - const results: { name: string; dir: string; manifest: SkillManifest }[] = []; - - for (const entry of fs.readdirSync(skillsDir, { withFileTypes: true })) { - if (!entry.isDirectory()) continue; - const manifestPath = path.join(skillsDir, entry.name, 'manifest.yaml'); - if (!fs.existsSync(manifestPath)) continue; - const manifest = parse( - fs.readFileSync(manifestPath, 'utf-8'), - ) as SkillManifest; - results.push({ - name: entry.name, - dir: path.join(skillsDir, entry.name), - manifest, - }); - } - - return results; -} - -/** Restore tracked files and remove untracked skill artifacts. */ -function resetWorkingTree(): void { - execSync('git checkout -- .', { stdio: 'pipe' }); - // Remove untracked files added by skill application (e.g. 
src/channels/telegram.ts) - // but preserve node_modules to avoid costly reinstalls. - execSync('git clean -fd --exclude=node_modules', { stdio: 'pipe' }); - // Clean skills-system state directory - if (fs.existsSync('.nanoclaw')) { - fs.rmSync('.nanoclaw', { recursive: true, force: true }); - } -} - -function initNanoclaw(): void { - execSync( - 'npx tsx -e "import { initNanoclawDir } from \'./skills-engine/index\'; initNanoclawDir();"', - { stdio: 'pipe', timeout: 30_000 }, - ); -} - -/** Append a key=value to $GITHUB_OUTPUT (no-op locally). */ -function setOutput(key: string, value: string): void { - const outputFile = process.env.GITHUB_OUTPUT; - if (!outputFile) return; - - if (value.includes('\n')) { - const delimiter = `ghadelim_${Date.now()}`; - fs.appendFileSync( - outputFile, - `${key}<<${delimiter}\n${value}\n${delimiter}\n`, - ); - } else { - fs.appendFileSync(outputFile, `${key}=${value}\n`); - } -} - -function truncate(s: string, max = 300): string { - return s.length > max ? s.slice(0, max) + '...' 
: s; -} - -async function main(): Promise { - const projectRoot = process.cwd(); - const skillsDir = path.join(projectRoot, '.claude', 'skills'); - - // Allow filtering to specific skills via CLI args - const filterSkills = process.argv.slice(2); - - let skills = discoverSkills(skillsDir); - if (filterSkills.length > 0) { - skills = skills.filter((s) => filterSkills.includes(s.name)); - } - - if (skills.length === 0) { - console.log('No skills found to validate.'); - setOutput('drifted', 'false'); - setOutput('drifted_skills', '[]'); - setOutput('results', '[]'); - process.exit(0); - } - - console.log( - `Validating ${skills.length} skill(s): ${skills.map((s) => s.name).join(', ')}\n`, - ); - - const results: SkillValidationResult[] = []; - - for (const skill of skills) { - console.log(`--- ${skill.name} ---`); - - // Clean slate - resetWorkingTree(); - initNanoclaw(); - - // Step 1: Apply skill - try { - const applyOutput = execSync( - `npx tsx scripts/apply-skill.ts "${skill.dir}"`, - { - encoding: 'utf-8', - stdio: ['pipe', 'pipe', 'pipe'], - timeout: 120_000, - }, - ); - // parse stdout to verify success - try { - const parsed = JSON.parse(applyOutput); - if (!parsed.success) { - console.log(` FAIL (apply): ${truncate(parsed.error || 'unknown')}`); - results.push({ - name: skill.name, - success: false, - failedStep: 'apply', - error: parsed.error, - }); - continue; - } - } catch { - // Non-JSON stdout with exit 0 is treated as success - } - } catch (err: any) { - const stderr = err.stderr?.toString() || ''; - const stdout = err.stdout?.toString() || ''; - let error = 'Apply failed'; - try { - const parsed = JSON.parse(stdout); - error = parsed.error || error; - } catch { - error = stderr || stdout || err.message; - } - console.log(` FAIL (apply): ${truncate(error)}`); - results.push({ - name: skill.name, - success: false, - failedStep: 'apply', - error, - }); - continue; - } - console.log(' apply: OK'); - - // Step 2: Typecheck - try { - execSync('npx tsc 
--noEmit', { - stdio: 'pipe', - timeout: 120_000, - }); - } catch (err: any) { - const error = err.stdout?.toString() || err.message; - console.log(` FAIL (typecheck): ${truncate(error)}`); - results.push({ - name: skill.name, - success: false, - failedStep: 'typecheck', - error, - }); - continue; - } - console.log(' typecheck: OK'); - - // Step 3: Skill's own test command - if (skill.manifest.test) { - try { - execSync(skill.manifest.test, { - stdio: 'pipe', - timeout: 300_000, - }); - } catch (err: any) { - const error = - err.stdout?.toString() || err.stderr?.toString() || err.message; - console.log(` FAIL (test): ${truncate(error)}`); - results.push({ - name: skill.name, - success: false, - failedStep: 'test', - error, - }); - continue; - } - console.log(' test: OK'); - } - - console.log(' PASS'); - results.push({ name: skill.name, success: true }); - } - - // Restore clean state - resetWorkingTree(); - - // Summary - const drifted = results.filter((r) => !r.success); - const passed = results.filter((r) => r.success); - - console.log('\n=== Summary ==='); - for (const r of results) { - const status = r.success ? 'PASS' : 'FAIL'; - const detail = r.failedStep ? ` (${r.failedStep})` : ''; - console.log(` ${status} ${r.name}${detail}`); - } - console.log(`\n${passed.length} passed, ${drifted.length} failed`); - - // GitHub Actions outputs - setOutput('drifted', drifted.length > 0 ? 
'true' : 'false'); - setOutput('drifted_skills', JSON.stringify(drifted.map((d) => d.name))); - setOutput('results', JSON.stringify(results)); - - if (drifted.length > 0) { - process.exit(1); - } -} - -main().catch((err) => { - console.error('Fatal error:', err); - process.exit(1); -}); diff --git a/setup.sh b/setup.sh index ef7d683..c37f143 100755 --- a/setup.sh +++ b/setup.sh @@ -79,8 +79,8 @@ install_deps() { log "Running as root, using --unsafe-perm" fi - log "Running npm install $npm_flags" - if npm install $npm_flags >> "$LOG_FILE" 2>&1; then + log "Running npm ci $npm_flags" + if npm ci $npm_flags >> "$LOG_FILE" 2>&1; then DEPS_OK="true" - log "npm install succeeded" + log "npm ci succeeded" else diff --git a/setup/environment.test.ts b/setup/environment.test.ts index b33f272..deda62f 100644 --- a/setup/environment.test.ts +++ b/setup/environment.test.ts @@ -104,9 +104,8 @@ describe('Docker detection logic', () => { }); }); -describe('WhatsApp auth detection', () => { - it('detects non-empty auth directory logic', () => { - // Simulate the check: directory exists and has files +describe('channel auth detection', () => { + it('detects non-empty auth directory', () => { const hasAuth = (authDir: string) => { try { return fs.existsSync(authDir) && fs.readdirSync(authDir).length > 0; @@ -119,3 +118,4 @@ describe('WhatsApp auth detection', () => { expect(hasAuth('/tmp/nonexistent_auth_dir_xyz')).toBe(false); }); }); + diff --git a/setup/groups.ts b/setup/groups.ts index d251d5d..6697029 100644 --- a/setup/groups.ts +++ b/setup/groups.ts @@ -1,5 +1,7 @@ /** - * Step: groups — Connect to WhatsApp, fetch group metadata, write to DB. + * Step: groups — Fetch group metadata from messaging platforms, write to DB. + * WhatsApp requires an upfront sync (Baileys groupFetchAllParticipating). + * Other channels discover group names at runtime — this step auto-skips for them.
* Replaces 05-sync-groups.sh + 05b-list-groups.sh */ import { execSync } from 'child_process'; @@ -62,6 +64,25 @@ async function listGroups(limit: number): Promise { } async function syncGroups(projectRoot: string): Promise { + // Only WhatsApp needs an upfront group sync; other channels resolve names at runtime. + // Detect WhatsApp by checking for auth credentials on disk. + const authDir = path.join(projectRoot, 'store', 'auth'); + const hasWhatsAppAuth = + fs.existsSync(authDir) && fs.readdirSync(authDir).length > 0; + + if (!hasWhatsAppAuth) { + logger.info('WhatsApp auth not found — skipping group sync'); + emitStatus('SYNC_GROUPS', { + BUILD: 'skipped', + SYNC: 'skipped', + GROUPS_IN_DB: 0, + REASON: 'whatsapp_not_configured', + STATUS: 'success', + LOG: 'logs/setup.log', + }); + return; + } + // Build TypeScript first logger.info('Building TypeScript'); let buildOk = false; @@ -85,7 +106,7 @@ async function syncGroups(projectRoot: string): Promise { process.exit(1); } - // Run inline sync script via node + // Run sync script via a temp file to avoid shell escaping issues with node -e logger.info('Fetching group metadata'); let syncOk = false; try { @@ -158,17 +179,20 @@ sock.ev.on('connection.update', async (update) => { }); `; - const output = execSync( - `node --input-type=module -e ${JSON.stringify(syncScript)}`, - { + const tmpScript = path.join(projectRoot, '.tmp-group-sync.mjs'); + fs.writeFileSync(tmpScript, syncScript, 'utf-8'); + try { + const output = execSync(`node ${tmpScript}`, { cwd: projectRoot, encoding: 'utf-8', timeout: 45000, stdio: ['ignore', 'pipe', 'pipe'], - }, - ); - syncOk = output.includes('SYNCED:'); - logger.info({ output: output.trim() }, 'Sync output'); + }); + syncOk = output.includes('SYNCED:'); + logger.info({ output: output.trim() }, 'Sync output'); + } finally { + try { fs.unlinkSync(tmpScript); } catch { /* ignore cleanup errors */ } + } } catch (err) { logger.error({ err }, 'Sync failed'); } diff --git a/setup/index.ts 
b/setup/index.ts index 287a790..7e10ddc 100644 --- a/setup/index.ts +++ b/setup/index.ts @@ -9,9 +9,9 @@ const STEPS: Record< string, () => Promise<{ run: (args: string[]) => Promise }> > = { + timezone: () => import('./timezone.js'), environment: () => import('./environment.js'), container: () => import('./container.js'), - 'whatsapp-auth': () => import('./whatsapp-auth.js'), groups: () => import('./groups.js'), register: () => import('./register.js'), mounts: () => import('./mounts.js'), diff --git a/setup/mounts.ts b/setup/mounts.ts index eb2a5f6..e14d23b 100644 --- a/setup/mounts.ts +++ b/setup/mounts.ts @@ -10,21 +10,23 @@ import { logger } from '../src/logger.js'; import { isRoot } from './platform.js'; import { emitStatus } from './status.js'; -function parseArgs(args: string[]): { empty: boolean; json: string } { +function parseArgs(args: string[]): { empty: boolean; json: string; force: boolean } { let empty = false; let json = ''; + let force = false; for (let i = 0; i < args.length; i++) { if (args[i] === '--empty') empty = true; + if (args[i] === '--force') force = true; if (args[i] === '--json' && args[i + 1]) { json = args[i + 1]; i++; } } - return { empty, json }; + return { empty, json, force }; } export async function run(args: string[]): Promise { - const { empty, json } = parseArgs(args); + const { empty, json, force } = parseArgs(args); const homeDir = os.homedir(); const configDir = path.join(homeDir, '.config', 'nanoclaw'); const configFile = path.join(configDir, 'mount-allowlist.json'); @@ -37,6 +39,21 @@ export async function run(args: string[]): Promise { fs.mkdirSync(configDir, { recursive: true }); + if (fs.existsSync(configFile) && !force) { + logger.info( + { configFile }, + 'Mount allowlist already exists — skipping (use --force to overwrite)', + ); + emitStatus('CONFIGURE_MOUNTS', { + PATH: configFile, + ALLOWED_ROOTS: 0, + NON_MAIN_READ_ONLY: 'unknown', + STATUS: 'skipped', + LOG: 'logs/setup.log', + }); + return; + } + let 
allowedRoots = 0; let nonMainReadOnly = 'true'; diff --git a/setup/register.test.ts b/setup/register.test.ts index 7258445..5a70740 100644 --- a/setup/register.test.ts +++ b/setup/register.test.ts @@ -1,4 +1,7 @@ -import { describe, it, expect, beforeEach } from 'vitest'; +import fs from 'fs'; +import os from 'os'; +import path from 'path'; +import { afterEach, describe, it, expect, beforeEach } from 'vitest'; import Database from 'better-sqlite3'; @@ -6,7 +9,7 @@ import Database from 'better-sqlite3'; * Tests for the register step. * * Verifies: parameterized SQL (no injection), file templating, - * apostrophe in names, .env updates. + * apostrophe in names, .env updates, CLAUDE.md template copy. */ function createTestDb(): Database.Database { @@ -18,7 +21,8 @@ function createTestDb(): Database.Database { trigger_pattern TEXT NOT NULL, added_at TEXT NOT NULL, container_config TEXT, - requires_trigger INTEGER DEFAULT 1 + requires_trigger INTEGER DEFAULT 1, + is_main INTEGER DEFAULT 0 )`); return db; } @@ -130,6 +134,49 @@ describe('parameterized SQL registration', () => { expect(row.requires_trigger).toBe(0); }); + it('stores is_main flag', () => { + db.prepare( + `INSERT OR REPLACE INTO registered_groups + (jid, name, folder, trigger_pattern, added_at, container_config, requires_trigger, is_main) + VALUES (?, ?, ?, ?, ?, NULL, ?, ?)`, + ).run( + '789@s.whatsapp.net', + 'Personal', + 'whatsapp_main', + '@Andy', + '2024-01-01T00:00:00.000Z', + 0, + 1, + ); + + const row = db + .prepare('SELECT is_main FROM registered_groups WHERE jid = ?') + .get('789@s.whatsapp.net') as { is_main: number }; + + expect(row.is_main).toBe(1); + }); + + it('defaults is_main to 0', () => { + db.prepare( + `INSERT OR REPLACE INTO registered_groups + (jid, name, folder, trigger_pattern, added_at, container_config, requires_trigger) + VALUES (?, ?, ?, ?, ?, NULL, ?)`, + ).run( + '123@g.us', + 'Some Group', + 'whatsapp_some-group', + '@Andy', + '2024-01-01T00:00:00.000Z', + 1, + ); + + 
const row = db + .prepare('SELECT is_main FROM registered_groups WHERE jid = ?') + .get('123@g.us') as { is_main: number }; + + expect(row.is_main).toBe(0); + }); + it('upserts on conflict', () => { const stmt = db.prepare( `INSERT OR REPLACE INTO registered_groups @@ -211,3 +258,207 @@ describe('file templating', () => { expect(envContent).toContain('ASSISTANT_NAME="Nova"'); }); }); + +describe('CLAUDE.md template copy', () => { + let tmpDir: string; + let groupsDir: string; + + // Replicates register.ts template copy + name update logic + function simulateRegister( + folder: string, + isMain: boolean, + assistantName = 'Andy', + ): void { + const folderDir = path.join(groupsDir, folder); + fs.mkdirSync(path.join(folderDir, 'logs'), { recursive: true }); + + // Template copy — never overwrite existing (register.ts lines 119-135) + const dest = path.join(folderDir, 'CLAUDE.md'); + if (!fs.existsSync(dest)) { + const templatePath = isMain + ? path.join(groupsDir, 'main', 'CLAUDE.md') + : path.join(groupsDir, 'global', 'CLAUDE.md'); + if (fs.existsSync(templatePath)) { + fs.copyFileSync(templatePath, dest); + } + } + + // Name update across all groups (register.ts lines 140-165) + if (assistantName !== 'Andy') { + const mdFiles = fs + .readdirSync(groupsDir) + .map((d) => path.join(groupsDir, d, 'CLAUDE.md')) + .filter((f) => fs.existsSync(f)); + + for (const mdFile of mdFiles) { + let content = fs.readFileSync(mdFile, 'utf-8'); + content = content.replace(/^# Andy$/m, `# ${assistantName}`); + content = content.replace( + /You are Andy/g, + `You are ${assistantName}`, + ); + fs.writeFileSync(mdFile, content); + } + } + } + + function readGroupMd(folder: string): string { + return fs.readFileSync( + path.join(groupsDir, folder, 'CLAUDE.md'), + 'utf-8', + ); + } + + beforeEach(() => { + tmpDir = fs.mkdtempSync(path.join(os.tmpdir(), 'nanoclaw-register-test-')); + groupsDir = path.join(tmpDir, 'groups'); + fs.mkdirSync(path.join(groupsDir, 'main'), { recursive: true }); 
+ fs.mkdirSync(path.join(groupsDir, 'global'), { recursive: true }); + fs.writeFileSync( + path.join(groupsDir, 'main', 'CLAUDE.md'), + '# Andy\n\nYou are Andy, a personal assistant.\n\n## Admin Context\n\nThis is the **main channel**.', + ); + fs.writeFileSync( + path.join(groupsDir, 'global', 'CLAUDE.md'), + '# Andy\n\nYou are Andy, a personal assistant.', + ); + }); + + afterEach(() => { + fs.rmSync(tmpDir, { recursive: true, force: true }); + }); + + it('copies global template for non-main group', () => { + simulateRegister('telegram_dev-team', false); + + const content = readGroupMd('telegram_dev-team'); + expect(content).toContain('You are Andy'); + expect(content).not.toContain('Admin Context'); + }); + + it('copies main template for main group', () => { + simulateRegister('whatsapp_main', true); + + expect(readGroupMd('whatsapp_main')).toContain('Admin Context'); + }); + + it('each channel can have its own main with admin context', () => { + simulateRegister('whatsapp_main', true); + simulateRegister('telegram_main', true); + simulateRegister('slack_main', true); + simulateRegister('discord_main', true); + + for (const folder of [ + 'whatsapp_main', + 'telegram_main', + 'slack_main', + 'discord_main', + ]) { + const content = readGroupMd(folder); + expect(content).toContain('Admin Context'); + expect(content).toContain('You are Andy'); + } + }); + + it('non-main groups across channels get global template', () => { + simulateRegister('whatsapp_main', true); + simulateRegister('telegram_friends', false); + simulateRegister('slack_engineering', false); + simulateRegister('discord_general', false); + + expect(readGroupMd('whatsapp_main')).toContain('Admin Context'); + for (const folder of [ + 'telegram_friends', + 'slack_engineering', + 'discord_general', + ]) { + const content = readGroupMd(folder); + expect(content).toContain('You are Andy'); + expect(content).not.toContain('Admin Context'); + } + }); + + it('custom name propagates to all channels and 
groups', () => { + // Register multiple channels, last one sets custom name + simulateRegister('whatsapp_main', true); + simulateRegister('telegram_main', true); + simulateRegister('slack_devs', false); + // Final registration triggers name update across all + simulateRegister('discord_main', true, 'Luna'); + + for (const folder of [ + 'main', + 'global', + 'whatsapp_main', + 'telegram_main', + 'slack_devs', + 'discord_main', + ]) { + const content = readGroupMd(folder); + expect(content).toContain('# Luna'); + expect(content).toContain('You are Luna'); + expect(content).not.toContain('Andy'); + } + }); + + it('never overwrites existing CLAUDE.md on re-registration', () => { + simulateRegister('slack_main', true); + // User customizes the file extensively (persona, workspace, rules) + const mdPath = path.join(groupsDir, 'slack_main', 'CLAUDE.md'); + fs.writeFileSync( + mdPath, + '# Gambi\n\nCustom persona with workspace rules and family context.', + ); + // Re-registering same folder (e.g. re-running /add-slack) + simulateRegister('slack_main', true); + + const content = readGroupMd('slack_main'); + expect(content).toContain('Custom persona'); + expect(content).not.toContain('Admin Context'); + }); + + it('never overwrites when non-main becomes main (isMain changes)', () => { + // User registers a family group as non-main + simulateRegister('whatsapp_casa', false); + // User extensively customizes it (PARA system, task management, etc.) 
+ const mdPath = path.join(groupsDir, 'whatsapp_casa', 'CLAUDE.md'); + fs.writeFileSync( + mdPath, + '# Casa\n\nFamily group with PARA system, task management, shopping lists.', + ); + // Later, user promotes to main (no trigger required) — CLAUDE.md must be preserved + simulateRegister('whatsapp_casa', true); + + const content = readGroupMd('whatsapp_casa'); + expect(content).toContain('PARA system'); + expect(content).not.toContain('Admin Context'); + }); + + it('preserves custom CLAUDE.md across channels when changing main', () => { + // Real-world scenario: WhatsApp main + customized Discord research channel + simulateRegister('whatsapp_main', true); + simulateRegister('discord_main', false); + const discordPath = path.join(groupsDir, 'discord_main', 'CLAUDE.md'); + fs.writeFileSync( + discordPath, + '# Gambi HQ — Research Assistant\n\nResearch workflows for Laura and Ethan.', + ); + + // Discord becomes main too — custom content must survive + simulateRegister('discord_main', true); + expect(readGroupMd('discord_main')).toContain('Research Assistant'); + // WhatsApp main also untouched + expect(readGroupMd('whatsapp_main')).toContain('Admin Context'); + }); + + it('handles missing templates gracefully', () => { + fs.unlinkSync(path.join(groupsDir, 'global', 'CLAUDE.md')); + fs.unlinkSync(path.join(groupsDir, 'main', 'CLAUDE.md')); + + simulateRegister('discord_general', false); + + expect( + fs.existsSync(path.join(groupsDir, 'discord_general', 'CLAUDE.md')), + ).toBe(false); + }); +}); diff --git a/setup/register.ts b/setup/register.ts index 55c3569..c08d910 100644 --- a/setup/register.ts +++ b/setup/register.ts @@ -1,25 +1,26 @@ /** * Step: register — Write channel registration config, create group folders. - * Replaces 06-register-channel.sh * - * Fixes: SQL injection (parameterized queries), sed -i '' (uses fs directly). + * Accepts --channel to specify the messaging platform (whatsapp, telegram, slack, discord). 
+ * Uses parameterized SQL queries to prevent injection. */ import fs from 'fs'; import path from 'path'; -import Database from 'better-sqlite3'; - -import { STORE_DIR } from '../src/config.js'; -import { isValidGroupFolder } from '../src/group-folder.js'; -import { logger } from '../src/logger.js'; -import { emitStatus } from './status.js'; +import { STORE_DIR } from '../src/config.ts'; +import { initDatabase, setRegisteredGroup } from '../src/db.ts'; +import { isValidGroupFolder } from '../src/group-folder.ts'; +import { logger } from '../src/logger.ts'; +import { emitStatus } from './status.ts'; interface RegisterArgs { jid: string; name: string; trigger: string; folder: string; + channel: string; requiresTrigger: boolean; + isMain: boolean; assistantName: string; } @@ -29,7 +30,9 @@ function parseArgs(args: string[]): RegisterArgs { name: '', trigger: '', folder: '', + channel: 'whatsapp', // backward-compat: pre-refactor installs omit --channel requiresTrigger: true, + isMain: false, assistantName: 'Andy', }; @@ -47,9 +50,15 @@ function parseArgs(args: string[]): RegisterArgs { case '--folder': result.folder = args[++i] || ''; break; + case '--channel': + result.channel = (args[++i] || '').toLowerCase(); + break; case '--no-trigger-required': result.requiresTrigger = false; break; + case '--is-main': + result.isMain = true; + break; case '--assistant-name': result.assistantName = args[++i] || 'Andy'; break; @@ -83,40 +92,23 @@ export async function run(args: string[]): Promise { logger.info(parsed, 'Registering channel'); - // Ensure data directory exists + // Ensure data and store directories exist (store/ may not exist on + // fresh installs that skip WhatsApp auth, which normally creates it) fs.mkdirSync(path.join(projectRoot, 'data'), { recursive: true }); + fs.mkdirSync(STORE_DIR, { recursive: true }); - // Write to SQLite using parameterized queries (no SQL injection) - const dbPath = path.join(STORE_DIR, 'messages.db'); - const timestamp = new 
Date().toISOString(); - const requiresTriggerInt = parsed.requiresTrigger ? 1 : 0; + // Initialize database (creates schema + runs migrations) + initDatabase(); - const db = new Database(dbPath); - // Ensure schema exists - db.exec(`CREATE TABLE IF NOT EXISTS registered_groups ( - jid TEXT PRIMARY KEY, - name TEXT NOT NULL, - folder TEXT NOT NULL UNIQUE, - trigger_pattern TEXT NOT NULL, - added_at TEXT NOT NULL, - container_config TEXT, - requires_trigger INTEGER DEFAULT 1 - )`); + setRegisteredGroup(parsed.jid, { + name: parsed.name, + folder: parsed.folder, + trigger: parsed.trigger, + added_at: new Date().toISOString(), + requiresTrigger: parsed.requiresTrigger, + isMain: parsed.isMain, + }); - db.prepare( - `INSERT OR REPLACE INTO registered_groups - (jid, name, folder, trigger_pattern, added_at, container_config, requires_trigger) - VALUES (?, ?, ?, ?, ?, NULL, ?)`, - ).run( - parsed.jid, - parsed.name, - parsed.folder, - parsed.trigger, - timestamp, - requiresTriggerInt, - ); - - db.close(); logger.info('Wrote registration to SQLite'); // Create group folders @@ -124,6 +116,30 @@ export async function run(args: string[]): Promise { recursive: true, }); + // Create CLAUDE.md in the new group folder from template if it doesn't exist. + // The agent runs with CWD=/workspace/group and loads CLAUDE.md from there. + // Never overwrite an existing CLAUDE.md — users customize these extensively + // (persona, workspace structure, communication rules, family context, etc.) + // and a stock template replacement would destroy that work. + const groupClaudeMdPath = path.join( + projectRoot, + 'groups', + parsed.folder, + 'CLAUDE.md', + ); + if (!fs.existsSync(groupClaudeMdPath)) { + const templatePath = parsed.isMain + ? 
path.join(projectRoot, 'groups', 'main', 'CLAUDE.md') + : path.join(projectRoot, 'groups', 'global', 'CLAUDE.md'); + if (fs.existsSync(templatePath)) { + fs.copyFileSync(templatePath, groupClaudeMdPath); + logger.info( + { file: groupClaudeMdPath, template: templatePath }, + 'Created CLAUDE.md from template', + ); + } + } + // Update assistant name in CLAUDE.md files if different from default let nameUpdated = false; if (parsed.assistantName !== 'Andy') { @@ -132,10 +148,11 @@ export async function run(args: string[]): Promise { 'Updating assistant name', ); - const mdFiles = [ - path.join(projectRoot, 'groups', 'global', 'CLAUDE.md'), - path.join(projectRoot, 'groups', 'main', 'CLAUDE.md'), - ]; + const groupsDir = path.join(projectRoot, 'groups'); + const mdFiles = fs + .readdirSync(groupsDir) + .map((d) => path.join(groupsDir, d, 'CLAUDE.md')) + .filter((f) => fs.existsSync(f)); for (const mdFile of mdFiles) { if (fs.existsSync(mdFile)) { @@ -174,6 +191,7 @@ export async function run(args: string[]): Promise { JID: parsed.jid, NAME: parsed.name, FOLDER: parsed.folder, + CHANNEL: parsed.channel, TRIGGER: parsed.trigger, REQUIRES_TRIGGER: parsed.requiresTrigger, ASSISTANT_NAME: parsed.assistantName, diff --git a/setup/service.test.ts b/setup/service.test.ts index eb15db8..9168fe1 100644 --- a/setup/service.test.ts +++ b/setup/service.test.ts @@ -62,6 +62,7 @@ ExecStart=${nodePath} ${projectRoot}/dist/index.js WorkingDirectory=${projectRoot} Restart=always RestartSec=5 +KillMode=process Environment=HOME=${homeDir} Environment=PATH=/usr/local/bin:/usr/bin:/bin:${homeDir}/.local/bin StandardOutput=append:${projectRoot}/logs/nanoclaw.log @@ -142,6 +143,16 @@ describe('systemd unit generation', () => { expect(unit).toContain('RestartSec=5'); }); + it('uses KillMode=process to preserve detached children', () => { + const unit = generateSystemdUnit( + '/usr/bin/node', + '/home/user/nanoclaw', + '/home/user', + false, + ); + expect(unit).toContain('KillMode=process'); + 
}); + it('sets correct ExecStart', () => { const unit = generateSystemdUnit( '/usr/bin/node', diff --git a/setup/service.ts b/setup/service.ts index 9e7932a..c385267 100644 --- a/setup/service.ts +++ b/setup/service.ts @@ -161,7 +161,7 @@ function setupLinux( /** * Kill any orphaned nanoclaw node processes left from previous runs or debugging. - * Prevents WhatsApp "conflict" disconnects when two instances connect simultaneously. + * Prevents connection conflicts when two instances connect to the same channel simultaneously. */ function killOrphanedProcesses(projectRoot: string): void { try { @@ -243,6 +243,7 @@ ExecStart=${nodePath} ${projectRoot}/dist/index.js WorkingDirectory=${projectRoot} Restart=always RestartSec=5 +KillMode=process Environment=HOME=${homeDir} Environment=PATH=/usr/local/bin:/usr/bin:/bin:${homeDir}/.local/bin StandardOutput=append:${projectRoot}/logs/nanoclaw.log @@ -262,9 +263,23 @@ WantedBy=${runningAsRoot ? 'multi-user.target' : 'default.target'}`; ); } - // Kill orphaned nanoclaw processes to avoid WhatsApp conflict errors + // Kill orphaned nanoclaw processes to avoid channel connection conflicts killOrphanedProcesses(projectRoot); + // Enable lingering so the user service survives SSH logout. + // Without linger, systemd terminates all user processes when the last session closes. + if (!runningAsRoot) { + try { + execSync('loginctl enable-linger', { stdio: 'ignore' }); + logger.info('Enabled loginctl linger for current user'); + } catch (err) { + logger.warn( + { err }, + 'loginctl enable-linger failed — service may stop on SSH logout', + ); + } + } + // Enable and start try { execSync(`${systemctlPrefix} daemon-reload`, { stdio: 'ignore' }); @@ -300,6 +315,7 @@ WantedBy=${runningAsRoot ? 'multi-user.target' : 'default.target'}`; UNIT_PATH: unitPath, SERVICE_LOADED: serviceLoaded, ...(dockerGroupStale ? 
{ DOCKER_GROUP_STALE: true } : {}), + LINGER_ENABLED: !runningAsRoot, STATUS: 'success', LOG: 'logs/setup.log', }); diff --git a/setup/timezone.ts b/setup/timezone.ts new file mode 100644 index 0000000..22c0394 --- /dev/null +++ b/setup/timezone.ts @@ -0,0 +1,67 @@ +/** + * Step: timezone — Detect, validate, and persist the user's timezone. + * Writes TZ to .env if a valid IANA timezone is resolved. + * Emits NEEDS_USER_INPUT=true when autodetection fails. + */ +import fs from 'fs'; +import path from 'path'; + +import { isValidTimezone } from '../src/timezone.js'; +import { logger } from '../src/logger.js'; +import { emitStatus } from './status.js'; + +export async function run(args: string[]): Promise<void> { + const projectRoot = process.cwd(); + const envFile = path.join(projectRoot, '.env'); + + // Check what's already in .env + let envFileTz: string | undefined; + if (fs.existsSync(envFile)) { + const content = fs.readFileSync(envFile, 'utf-8'); + const match = content.match(/^TZ=(.+)$/m); + if (match) envFileTz = match[1].trim().replace(/^["']|["']$/g, ''); + } + + const systemTz = Intl.DateTimeFormat().resolvedOptions().timeZone; + const envTz = process.env.TZ; + + // Accept --tz flag from CLI (used when setup skill collects from user) + const tzFlagIdx = args.indexOf('--tz'); + const userTz = tzFlagIdx !== -1 ? 
args[tzFlagIdx + 1] : undefined; + + // Resolve: user-provided > .env > process.env > system autodetect + let resolvedTz: string | undefined; + for (const candidate of [userTz, envFileTz, envTz, systemTz]) { + if (candidate && isValidTimezone(candidate)) { + resolvedTz = candidate; + break; + } + } + + const needsUserInput = !resolvedTz; + + if (resolvedTz && resolvedTz !== envFileTz) { + // Write/update TZ in .env + if (fs.existsSync(envFile)) { + let content = fs.readFileSync(envFile, 'utf-8'); + if (/^TZ=/m.test(content)) { + content = content.replace(/^TZ=.*$/m, `TZ=${resolvedTz}`); + } else { + content = content.trimEnd() + `\nTZ=${resolvedTz}\n`; + } + fs.writeFileSync(envFile, content); + } else { + fs.writeFileSync(envFile, `TZ=${resolvedTz}\n`); + } + logger.info({ timezone: resolvedTz }, 'Set TZ in .env'); + } + + emitStatus('TIMEZONE', { + SYSTEM_TZ: systemTz || 'unknown', + ENV_TZ: envTz || 'unset', + ENV_FILE_TZ: envFileTz || 'unset', + RESOLVED_TZ: resolvedTz || 'none', + NEEDS_USER_INPUT: needsUserInput, + STATUS: needsUserInput ? 
'needs_input' : 'success', + }); +} diff --git a/setup/verify.ts b/setup/verify.ts index a738b8c..e039e52 100644 --- a/setup/verify.ts +++ b/setup/verify.ts @@ -12,6 +12,7 @@ import path from 'path'; import Database from 'better-sqlite3'; import { STORE_DIR } from '../src/config.js'; +import { readEnvFile } from '../src/env.js'; import { logger } from '../src/logger.js'; import { getPlatform, @@ -68,9 +69,10 @@ export async function run(_args: string[]): Promise { const pidFile = path.join(projectRoot, 'nanoclaw.pid'); if (fs.existsSync(pidFile)) { try { - const pid = fs.readFileSync(pidFile, 'utf-8').trim(); - if (pid) { - execSync(`kill -0 ${pid}`, { stdio: 'ignore' }); + const raw = fs.readFileSync(pidFile, 'utf-8').trim(); + const pid = Number(raw); + if (raw && Number.isInteger(pid) && pid > 0) { + process.kill(pid, 0); service = 'running'; } } catch { @@ -99,18 +101,44 @@ export async function run(_args: string[]): Promise { const envFile = path.join(projectRoot, '.env'); if (fs.existsSync(envFile)) { const envContent = fs.readFileSync(envFile, 'utf-8'); - if (/^(CLAUDE_CODE_OAUTH_TOKEN|ANTHROPIC_API_KEY)=/m.test(envContent)) { + if (/^(CLAUDE_CODE_OAUTH_TOKEN|ANTHROPIC_API_KEY|ONECLI_URL)=/m.test(envContent)) { credentials = 'configured'; } } - // 4. Check WhatsApp auth - let whatsappAuth = 'not_found'; + // 4. 
Check channel auth (detect configured channels by credentials) + const envVars = readEnvFile([ + 'TELEGRAM_BOT_TOKEN', + 'SLACK_BOT_TOKEN', + 'SLACK_APP_TOKEN', + 'DISCORD_BOT_TOKEN', + ]); + + const channelAuth: Record = {}; + + // WhatsApp: check for auth credentials on disk const authDir = path.join(projectRoot, 'store', 'auth'); if (fs.existsSync(authDir) && fs.readdirSync(authDir).length > 0) { - whatsappAuth = 'authenticated'; + channelAuth.whatsapp = 'authenticated'; } + // Token-based channels: check .env + if (process.env.TELEGRAM_BOT_TOKEN || envVars.TELEGRAM_BOT_TOKEN) { + channelAuth.telegram = 'configured'; + } + if ( + (process.env.SLACK_BOT_TOKEN || envVars.SLACK_BOT_TOKEN) && + (process.env.SLACK_APP_TOKEN || envVars.SLACK_APP_TOKEN) + ) { + channelAuth.slack = 'configured'; + } + if (process.env.DISCORD_BOT_TOKEN || envVars.DISCORD_BOT_TOKEN) { + channelAuth.discord = 'configured'; + } + + const configuredChannels = Object.keys(channelAuth); + const anyChannelConfigured = configuredChannels.length > 0; + // 5. Check registered groups (using better-sqlite3, not sqlite3 CLI) let registeredGroups = 0; const dbPath = path.join(STORE_DIR, 'messages.db'); @@ -141,18 +169,19 @@ export async function run(_args: string[]): Promise { const status = service === 'running' && credentials !== 'missing' && - whatsappAuth !== 'not_found' && + anyChannelConfigured && registeredGroups > 0 ? 
'success' : 'failed'; - logger.info({ status }, 'Verification complete'); + logger.info({ status, channelAuth }, 'Verification complete'); emitStatus('VERIFY', { SERVICE: service, CONTAINER_RUNTIME: containerRuntime, CREDENTIALS: credentials, - WHATSAPP_AUTH: whatsappAuth, + CONFIGURED_CHANNELS: configuredChannels.join(','), + CHANNEL_AUTH: JSON.stringify(channelAuth), REGISTERED_GROUPS: registeredGroups, MOUNT_ALLOWLIST: mountAllowlist, STATUS: status, diff --git a/setup/whatsapp-auth.ts b/setup/whatsapp-auth.ts deleted file mode 100644 index feed41b..0000000 --- a/setup/whatsapp-auth.ts +++ /dev/null @@ -1,358 +0,0 @@ -/** - * Step: whatsapp-auth — Full WhatsApp auth flow with polling. - * Replaces 04-auth-whatsapp.sh - */ -import { execSync, spawn } from 'child_process'; -import fs from 'fs'; -import path from 'path'; - -import { logger } from '../src/logger.js'; -import { openBrowser, isHeadless } from './platform.js'; -import { emitStatus } from './status.js'; - -const QR_AUTH_TEMPLATE = ` -NanoClaw - WhatsApp Auth - - -
-  [QR auth page markup lost in extraction — visible text: "Scan with WhatsApp", "Expires in 60s", the {{QR_SVG}} placeholder, "Settings → Linked Devices → Link a Device"]
-`; - -const SUCCESS_HTML = ` -NanoClaw - Connected! - -
-  [Success page markup lost in extraction — visible text: "Connected to WhatsApp", "You can close this tab."]
-
-`;
-
-function parseArgs(args: string[]): { method: string; phone: string } {
-  let method = '';
-  let phone = '';
-  for (let i = 0; i < args.length; i++) {
-    if (args[i] === '--method' && args[i + 1]) {
-      method = args[i + 1];
-      i++;
-    }
-    if (args[i] === '--phone' && args[i + 1]) {
-      phone = args[i + 1];
-      i++;
-    }
-  }
-  return { method, phone };
-}
-
-function sleep(ms: number): Promise<void> {
-  return new Promise((resolve) => setTimeout(resolve, ms));
-}
-
-function readFileSafe(filePath: string): string {
-  try {
-    return fs.readFileSync(filePath, 'utf-8');
-  } catch {
-    return '';
-  }
-}
-
-function getPhoneNumber(projectRoot: string): string {
-  try {
-    const creds = JSON.parse(
-      fs.readFileSync(
-        path.join(projectRoot, 'store', 'auth', 'creds.json'),
-        'utf-8',
-      ),
-    );
-    if (creds.me?.id) {
-      return creds.me.id.split(':')[0].split('@')[0];
-    }
-  } catch {
-    // Not available yet
-  }
-  return '';
-}
-
-function emitAuthStatus(
-  method: string,
-  authStatus: string,
-  status: string,
-  extra: Record<string, string> = {},
-): void {
-  const fields: Record<string, string> = {
-    AUTH_METHOD: method,
-    AUTH_STATUS: authStatus,
-    ...extra,
-    STATUS: status,
-    LOG: 'logs/setup.log',
-  };
-  emitStatus('AUTH_WHATSAPP', fields);
-}
-
-export async function run(args: string[]): Promise<void> {
-  const projectRoot = process.cwd();
-  const { method, phone } = parseArgs(args);
-  const statusFile = path.join(projectRoot, 'store', 'auth-status.txt');
-  const qrFile = path.join(projectRoot, 'store', 'qr-data.txt');
-
-  if (!method) {
-    emitAuthStatus('unknown', 'failed', 'failed', {
-      ERROR: 'missing_method_flag',
-    });
-    process.exit(4);
-  }
-
-  // qr-terminal is a manual flow
-  if (method === 'qr-terminal') {
-    emitAuthStatus('qr-terminal', 'manual', 'manual', {
-      PROJECT_PATH: projectRoot,
-    });
-    return;
-  }
-
-  if (method === 'pairing-code' && !phone) {
-    emitAuthStatus('pairing-code', 'failed', 'failed', {
-      ERROR: 'missing_phone_number',
-    });
-    process.exit(4);
-  }
-
-  if (!['qr-browser', 'pairing-code'].includes(method)) {
-    emitAuthStatus(method, 'failed', 'failed', { ERROR: 'unknown_method' });
-    process.exit(4);
-  }
-
-  // Clean stale state
-  logger.info({ method }, 'Starting WhatsApp auth');
-  try {
-    fs.rmSync(path.join(projectRoot, 'store', 'auth'), {
-      recursive: true,
-      force: true,
-    });
-  } catch {
-    /* ok */
-  }
-  try {
-    fs.unlinkSync(qrFile);
-  } catch {
-    /* ok */
-  }
-  try {
-    fs.unlinkSync(statusFile);
-  } catch {
-    /* ok */
-  }
-
-  // Start auth process in background
-  const authArgs =
-    method === 'pairing-code'
-      ? ['src/whatsapp-auth.ts', '--pairing-code', '--phone', phone]
-      : ['src/whatsapp-auth.ts'];
-
-  const authProc = spawn('npx', ['tsx', ...authArgs], {
-    cwd: projectRoot,
-    stdio: ['ignore', 'pipe', 'pipe'],
-    detached: false,
-  });
-
-  const logFile = path.join(projectRoot, 'logs', 'setup.log');
-  const logStream = fs.createWriteStream(logFile, { flags: 'a' });
-  authProc.stdout?.pipe(logStream);
-  authProc.stderr?.pipe(logStream);
-
-  // Cleanup on exit
-  const cleanup = () => {
-    try {
-      authProc.kill();
-    } catch {
-      /* ok */
-    }
-  };
-  process.on('exit', cleanup);
-
-  try {
-    if (method === 'qr-browser') {
-      await handleQrBrowser(projectRoot, statusFile, qrFile);
-    } else {
-      await handlePairingCode(projectRoot, statusFile, phone);
-    }
-  } finally {
-    cleanup();
-    process.removeListener('exit', cleanup);
-  }
-}
-
-async function handleQrBrowser(
-  projectRoot: string,
-  statusFile: string,
-  qrFile: string,
-): Promise<void> {
-  // Poll for QR data (15s)
-  let qrReady = false;
-  for (let i = 0; i < 15; i++) {
-    const statusContent = readFileSafe(statusFile);
-    if (statusContent === 'already_authenticated') {
-      emitAuthStatus('qr-browser', 'already_authenticated', 'success');
-      return;
-    }
-    if (fs.existsSync(qrFile)) {
-      qrReady = true;
-      break;
-    }
-    await sleep(1000);
-  }
-
-  if (!qrReady) {
-    emitAuthStatus('qr-browser', 'failed', 'failed', { ERROR: 'qr_timeout' });
-    process.exit(3);
-  }
-
-  // Generate QR SVG and HTML
-  const qrData = fs.readFileSync(qrFile, 'utf-8');
-  try {
-    const svg = execSync(
-      `node -e "const QR=require('qrcode');const data='${qrData}';QR.toString(data,{type:'svg'},(e,s)=>{if(e)process.exit(1);process.stdout.write(s)})"`,
-      { cwd: projectRoot, encoding: 'utf-8' },
-    );
-    const html = QR_AUTH_TEMPLATE.replace('{{QR_SVG}}', svg);
-    const htmlPath = path.join(projectRoot, 'store', 'qr-auth.html');
-    fs.writeFileSync(htmlPath, html);
-
-    // Open in browser (cross-platform)
-    if (!isHeadless()) {
-      const opened = openBrowser(htmlPath);
-      if (!opened) {
-        logger.warn(
-          'Could not open browser — display QR in terminal as fallback',
-        );
-      }
-    } else {
-      logger.info(
-        'Headless environment — QR HTML saved but browser not opened',
-      );
-    }
-  } catch (err) {
-    logger.error({ err }, 'Failed to generate QR HTML');
-  }
-
-  // Poll for completion (120s)
-  await pollAuthCompletion('qr-browser', statusFile, projectRoot);
-}
-
-async function handlePairingCode(
-  projectRoot: string,
-  statusFile: string,
-  phone: string,
-): Promise<void> {
-  // Poll for pairing code (15s)
-  let pairingCode = '';
-  for (let i = 0; i < 15; i++) {
-    const statusContent = readFileSafe(statusFile);
-    if (statusContent === 'already_authenticated') {
-      emitAuthStatus('pairing-code', 'already_authenticated', 'success');
-      return;
-    }
-    if (statusContent.startsWith('pairing_code:')) {
-      pairingCode = statusContent.replace('pairing_code:', '');
-      break;
-    }
-    if (statusContent.startsWith('failed:')) {
-      emitAuthStatus('pairing-code', 'failed', 'failed', {
-        ERROR: statusContent.replace('failed:', ''),
-      });
-      process.exit(1);
-    }
-    await sleep(1000);
-  }
-
-  if (!pairingCode) {
-    emitAuthStatus('pairing-code', 'failed', 'failed', {
-      ERROR: 'pairing_code_timeout',
-    });
-    process.exit(3);
-  }
-
-  // Emit pairing code immediately so the caller can display it to the user
-  emitAuthStatus('pairing-code', 'pairing_code_ready', 'waiting', {
-    PAIRING_CODE: pairingCode,
-  });
-
-  // Poll for completion (120s)
-  await pollAuthCompletion(
-    'pairing-code',
-    statusFile,
-    projectRoot,
-    pairingCode,
-  );
-}
-
-async function pollAuthCompletion(
-  method: string,
-  statusFile: string,
-  projectRoot: string,
-  pairingCode?: string,
-): Promise<void> {
-  const extra: Record<string, string> = {};
-  if (pairingCode) extra.PAIRING_CODE = pairingCode;
-
-  for (let i = 0; i < 60; i++) {
-    const content = readFileSafe(statusFile);
-
-    if (content === 'authenticated' || content === 'already_authenticated') {
-      // Write success page if qr-auth.html exists
-      const htmlPath = path.join(projectRoot, 'store', 'qr-auth.html');
-      if (fs.existsSync(htmlPath)) {
-        fs.writeFileSync(htmlPath, SUCCESS_HTML);
-      }
-      const phoneNumber = getPhoneNumber(projectRoot);
-      if (phoneNumber) extra.PHONE_NUMBER = phoneNumber;
-      emitAuthStatus(method, content, 'success', extra);
-      return;
-    }
-
-    if (content.startsWith('failed:')) {
-      const error = content.replace('failed:', '');
-      emitAuthStatus(method, 'failed', 'failed', { ERROR: error, ...extra });
-      process.exit(1);
-    }
-
-    await sleep(2000);
-  }
-
-  emitAuthStatus(method, 'failed', 'failed', { ERROR: 'timeout', ...extra });
-  process.exit(3);
-}
diff --git a/skills-engine/__tests__/apply.test.ts b/skills-engine/__tests__/apply.test.ts
deleted file mode 100644
index bb41f32..0000000
--- a/skills-engine/__tests__/apply.test.ts
+++ /dev/null
@@ -1,157 +0,0 @@
-import fs from 'fs';
-import path from 'path';
-import { afterEach, beforeEach, describe, expect, it } from 'vitest';
-
-import { applySkill } from '../apply.js';
-import {
-  cleanup,
-  createMinimalState,
-  createSkillPackage,
-  createTempDir,
-  initGitRepo,
-  setupNanoclawDir,
-} from './test-helpers.js';
-import { readState, writeState } from '../state.js';
-
-describe('apply', () => {
-  let tmpDir: string;
-  const originalCwd = process.cwd();
-
-  beforeEach(() => {
-    tmpDir = createTempDir();
-    setupNanoclawDir(tmpDir);
-    createMinimalState(tmpDir);
-    initGitRepo(tmpDir);
-    process.chdir(tmpDir);
-  });
-
-  afterEach(() => {
-    process.chdir(originalCwd);
-    cleanup(tmpDir);
-  });
-
-  it('rejects when min_skills_system_version is too high', async () => {
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'future-skill',
-      version: '1.0.0',
-      core_version: '1.0.0',
-      adds: [],
-      modifies: [],
-      min_skills_system_version: '99.0.0',
-    });
-
-    const result = await applySkill(skillDir);
-    expect(result.success).toBe(false);
-    expect(result.error).toContain('99.0.0');
-  });
-
-  it('executes post_apply commands on success', async () => {
-    const markerFile = path.join(tmpDir, 'post-apply-marker.txt');
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'post-test',
-      version: '1.0.0',
-      core_version: '1.0.0',
-      adds: ['src/newfile.ts'],
-      modifies: [],
-      addFiles: { 'src/newfile.ts': 'export const x = 1;' },
-      post_apply: [`echo "applied" > "${markerFile}"`],
-    });
-
-    const result = await applySkill(skillDir);
-    expect(result.success).toBe(true);
-    expect(fs.existsSync(markerFile)).toBe(true);
-    expect(fs.readFileSync(markerFile, 'utf-8').trim()).toBe('applied');
-  });
-
-  it('rolls back on post_apply failure', async () => {
-    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
-    const existingFile = path.join(tmpDir, 'src/existing.ts');
-    fs.writeFileSync(existingFile, 'original content');
-
-    // Set up base for the modified file
-    const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src');
-    fs.mkdirSync(baseDir, { recursive: true });
-    fs.writeFileSync(path.join(baseDir, 'existing.ts'), 'original content');
-
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'bad-post',
-      version: '1.0.0',
-      core_version: '1.0.0',
-      adds: ['src/added.ts'],
-      modifies: [],
-      addFiles: { 'src/added.ts': 'new file' },
-      post_apply: ['false'], // always fails
-    });
-
-    const result = await applySkill(skillDir);
-    expect(result.success).toBe(false);
-    expect(result.error).toContain('post_apply');
-
-    // Added file should be cleaned up
-    expect(fs.existsSync(path.join(tmpDir, 'src/added.ts'))).toBe(false);
-  });
-
-  it('does not allow path_remap to write files outside project root', async () => {
-    const state = readState();
-    state.path_remap = { 'src/newfile.ts': '../../outside.txt' };
-    writeState(state);
-
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'remap-escape',
-      version: '1.0.0',
-      core_version: '1.0.0',
-      adds: ['src/newfile.ts'],
-      modifies: [],
-      addFiles: { 'src/newfile.ts': 'safe content' },
-    });
-
-    const result = await applySkill(skillDir);
-    expect(result.success).toBe(true);
-
-    // Remap escape is ignored; file remains constrained inside project root.
-    expect(fs.existsSync(path.join(tmpDir, 'src/newfile.ts'))).toBe(true);
-    expect(fs.existsSync(path.join(tmpDir, '..', 'outside.txt'))).toBe(false);
-  });
-
-  it('does not allow path_remap symlink targets to write outside project root', async () => {
-    const outsideDir = fs.mkdtempSync(
-      path.join(path.dirname(tmpDir), 'nanoclaw-remap-outside-'),
-    );
-    const linkPath = path.join(tmpDir, 'link-out');
-
-    try {
-      fs.symlinkSync(outsideDir, linkPath);
-    } catch (err) {
-      const code = (err as NodeJS.ErrnoException).code;
-      if (code === 'EPERM' || code === 'EACCES' || code === 'ENOSYS') {
-        fs.rmSync(outsideDir, { recursive: true, force: true });
-        return;
-      }
-      fs.rmSync(outsideDir, { recursive: true, force: true });
-      throw err;
-    }
-
-    try {
-      const state = readState();
-      state.path_remap = { 'src/newfile.ts': 'link-out/pwned.txt' };
-      writeState(state);
-
-      const skillDir = createSkillPackage(tmpDir, {
-        skill: 'remap-symlink-escape',
-        version: '1.0.0',
-        core_version: '1.0.0',
-        adds: ['src/newfile.ts'],
-        modifies: [],
-        addFiles: { 'src/newfile.ts': 'safe content' },
-      });
-
-      const result = await applySkill(skillDir);
-      expect(result.success).toBe(true);
-
-      expect(fs.existsSync(path.join(tmpDir, 'src/newfile.ts'))).toBe(true);
-      expect(fs.existsSync(path.join(outsideDir, 'pwned.txt'))).toBe(false);
-    } finally {
-      fs.rmSync(outsideDir, { recursive: true, force: true });
-    }
-  });
-});
diff --git a/skills-engine/__tests__/backup.test.ts b/skills-engine/__tests__/backup.test.ts
deleted file mode 100644
index aeeb6ee..0000000
--- a/skills-engine/__tests__/backup.test.ts
+++ /dev/null
@@ -1,87 +0,0 @@
-import { describe, it, expect, beforeEach, afterEach } from 'vitest';
-import fs from 'fs';
-import path from 'path';
-import { createBackup, restoreBackup, clearBackup } from '../backup.js';
-import { createTempDir, setupNanoclawDir, cleanup } from './test-helpers.js';
-
-describe('backup', () => {
-  let tmpDir: string;
-  const originalCwd = process.cwd();
-
-  beforeEach(() => {
-    tmpDir = createTempDir();
-    setupNanoclawDir(tmpDir);
-    process.chdir(tmpDir);
-  });
-
-  afterEach(() => {
-    process.chdir(originalCwd);
-    cleanup(tmpDir);
-  });
-
-  it('createBackup copies files and restoreBackup puts them back', () => {
-    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
-    fs.writeFileSync(path.join(tmpDir, 'src', 'app.ts'), 'original content');
-
-    createBackup(['src/app.ts']);
-
-    fs.writeFileSync(path.join(tmpDir, 'src', 'app.ts'), 'modified content');
-    expect(fs.readFileSync(path.join(tmpDir, 'src', 'app.ts'), 'utf-8')).toBe(
-      'modified content',
-    );
-
-    restoreBackup();
-    expect(fs.readFileSync(path.join(tmpDir, 'src', 'app.ts'), 'utf-8')).toBe(
-      'original content',
-    );
-  });
-
-  it('createBackup skips missing files without error', () => {
-    expect(() => createBackup(['does-not-exist.ts'])).not.toThrow();
-  });
-
-  it('clearBackup removes backup directory', () => {
-    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
-    fs.writeFileSync(path.join(tmpDir, 'src', 'app.ts'), 'content');
-    createBackup(['src/app.ts']);
-
-    const backupDir = path.join(tmpDir, '.nanoclaw', 'backup');
-    expect(fs.existsSync(backupDir)).toBe(true);
-
-    clearBackup();
-    expect(fs.existsSync(backupDir)).toBe(false);
-  });
-
-  it('createBackup writes tombstone for non-existent files', () => {
-    createBackup(['src/newfile.ts']);
-
-    const tombstone = path.join(
-      tmpDir,
-      '.nanoclaw',
-      'backup',
-      'src',
-      'newfile.ts.tombstone',
-    );
-    expect(fs.existsSync(tombstone)).toBe(true);
-  });
-
-  it('restoreBackup deletes files with tombstone markers', () => {
-    // Create backup first — file doesn't exist yet, so tombstone is written
-    createBackup(['src/added.ts']);
-
-    // Now the file gets created (simulating skill apply)
-    const filePath = path.join(tmpDir, 'src', 'added.ts');
-    fs.mkdirSync(path.dirname(filePath), { recursive: true });
-    fs.writeFileSync(filePath, 'new content');
-    expect(fs.existsSync(filePath)).toBe(true);
-
-    // Restore should delete the file (tombstone means it didn't exist before)
-    restoreBackup();
-    expect(fs.existsSync(filePath)).toBe(false);
-  });
-
-  it('restoreBackup is no-op when backup dir is empty or missing', () => {
-    clearBackup();
-    expect(() => restoreBackup()).not.toThrow();
-  });
-});
diff --git a/skills-engine/__tests__/constants.test.ts b/skills-engine/__tests__/constants.test.ts
deleted file mode 100644
index 4ceeb3d..0000000
--- a/skills-engine/__tests__/constants.test.ts
+++ /dev/null
@@ -1,41 +0,0 @@
-import { describe, it, expect } from 'vitest';
-import {
-  NANOCLAW_DIR,
-  STATE_FILE,
-  BASE_DIR,
-  BACKUP_DIR,
-  LOCK_FILE,
-  CUSTOM_DIR,
-  SKILLS_SCHEMA_VERSION,
-} from '../constants.js';
-
-describe('constants', () => {
-  const allConstants = {
-    NANOCLAW_DIR,
-    STATE_FILE,
-    BASE_DIR,
-    BACKUP_DIR,
-    LOCK_FILE,
-    CUSTOM_DIR,
-    SKILLS_SCHEMA_VERSION,
-  };
-
-  it('all constants are non-empty strings', () => {
-    for (const [name, value] of Object.entries(allConstants)) {
-      expect(value, `${name} should be a non-empty string`).toBeTruthy();
-      expect(typeof value, `${name} should be a string`).toBe('string');
-    }
-  });
-
-  it('path constants use forward slashes and .nanoclaw prefix', () => {
-    const pathConstants = [BASE_DIR, BACKUP_DIR, LOCK_FILE, CUSTOM_DIR];
-    for (const p of pathConstants) {
-      expect(p).not.toContain('\\');
-      expect(p).toMatch(/^\.nanoclaw\//);
-    }
-  });
-
-  it('NANOCLAW_DIR is .nanoclaw', () => {
-    expect(NANOCLAW_DIR).toBe('.nanoclaw');
-  });
-});
diff --git a/skills-engine/__tests__/customize.test.ts b/skills-engine/__tests__/customize.test.ts
deleted file mode 100644
index 1c055a2..0000000
--- a/skills-engine/__tests__/customize.test.ts
+++ /dev/null
@@ -1,146 +0,0 @@
-import { describe, it, expect, beforeEach, afterEach } from 'vitest';
-import fs from 'fs';
-import path from 'path';
-import {
-  isCustomizeActive,
-  startCustomize,
-  commitCustomize,
-  abortCustomize,
-} from '../customize.js';
-import { CUSTOM_DIR } from '../constants.js';
-import {
-  createTempDir,
-  setupNanoclawDir,
-  createMinimalState,
-  cleanup,
-  writeState,
-} from './test-helpers.js';
-import {
-  readState,
-  recordSkillApplication,
-  computeFileHash,
-} from '../state.js';
-
-describe('customize', () => {
-  let tmpDir: string;
-  const originalCwd = process.cwd();
-
-  beforeEach(() => {
-    tmpDir = createTempDir();
-    setupNanoclawDir(tmpDir);
-    createMinimalState(tmpDir);
-    fs.mkdirSync(path.join(tmpDir, CUSTOM_DIR), { recursive: true });
-    process.chdir(tmpDir);
-  });
-
-  afterEach(() => {
-    process.chdir(originalCwd);
-    cleanup(tmpDir);
-  });
-
-  it('startCustomize creates pending.yaml and isCustomizeActive returns true', () => {
-    // Need at least one applied skill with file_hashes for snapshot
-    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
-    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
-    fs.writeFileSync(trackedFile, 'export const x = 1;');
-    recordSkillApplication('test-skill', '1.0.0', {
-      'src/app.ts': computeFileHash(trackedFile),
-    });
-
-    expect(isCustomizeActive()).toBe(false);
-    startCustomize('test customization');
-    expect(isCustomizeActive()).toBe(true);
-
-    const pendingPath = path.join(tmpDir, CUSTOM_DIR, 'pending.yaml');
-    expect(fs.existsSync(pendingPath)).toBe(true);
-  });
-
-  it('abortCustomize removes pending.yaml', () => {
-    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
-    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
-    fs.writeFileSync(trackedFile, 'export const x = 1;');
-    recordSkillApplication('test-skill', '1.0.0', {
-      'src/app.ts': computeFileHash(trackedFile),
-    });
-
-    startCustomize('test');
-    expect(isCustomizeActive()).toBe(true);
-
-    abortCustomize();
-    expect(isCustomizeActive()).toBe(false);
-  });
-
-  it('commitCustomize with no changes clears pending', () => {
-    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
-    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
-    fs.writeFileSync(trackedFile, 'export const x = 1;');
-    recordSkillApplication('test-skill', '1.0.0', {
-      'src/app.ts': computeFileHash(trackedFile),
-    });
-
-    startCustomize('no-op');
-    commitCustomize();
-
-    expect(isCustomizeActive()).toBe(false);
-  });
-
-  it('commitCustomize with changes creates patch and records in state', () => {
-    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
-    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
-    fs.writeFileSync(trackedFile, 'export const x = 1;');
-    recordSkillApplication('test-skill', '1.0.0', {
-      'src/app.ts': computeFileHash(trackedFile),
-    });
-
-    startCustomize('add feature');
-
-    // Modify the tracked file
-    fs.writeFileSync(trackedFile, 'export const x = 2;\nexport const y = 3;');
-
-    commitCustomize();
-
-    expect(isCustomizeActive()).toBe(false);
-    const state = readState();
-    expect(state.custom_modifications).toBeDefined();
-    expect(state.custom_modifications!.length).toBeGreaterThan(0);
-    expect(state.custom_modifications![0].description).toBe('add feature');
-  });
-
-  it('commitCustomize throws descriptive error on diff failure', () => {
-    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
-    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
-    fs.writeFileSync(trackedFile, 'export const x = 1;');
-    recordSkillApplication('test-skill', '1.0.0', {
-      'src/app.ts': computeFileHash(trackedFile),
-    });
-
-    startCustomize('diff-error test');
-
-    // Modify the tracked file
-    fs.writeFileSync(trackedFile, 'export const x = 2;');
-
-    // Make the base file a directory to cause diff to exit with code 2
-    const baseFilePath = path.join(
-      tmpDir,
-      '.nanoclaw',
-      'base',
-      'src',
-      'app.ts',
-    );
-    fs.mkdirSync(baseFilePath, { recursive: true });
-
-    expect(() => commitCustomize()).toThrow(/diff error/i);
-  });
-
-  it('startCustomize while active throws', () => {
-    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
-    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
-    fs.writeFileSync(trackedFile, 'export const x = 1;');
-    recordSkillApplication('test-skill', '1.0.0', {
-      'src/app.ts': computeFileHash(trackedFile),
-    });
-
-    startCustomize('first');
-    expect(() => startCustomize('second')).toThrow();
-  });
-});
diff --git a/skills-engine/__tests__/fetch-upstream.test.ts b/skills-engine/__tests__/fetch-upstream.test.ts
deleted file mode 100644
index ca2f6ab..0000000
--- a/skills-engine/__tests__/fetch-upstream.test.ts
+++ /dev/null
@@ -1,240 +0,0 @@
-import { execFileSync, execSync } from 'child_process';
-import fs from 'fs';
-import os from 'os';
-import path from 'path';
-import { afterEach, beforeEach, describe, expect, it } from 'vitest';
-
-describe('fetch-upstream.sh', () => {
-  let projectDir: string;
-  let upstreamBareDir: string;
-  const scriptPath = path.resolve(
-    '.claude/skills/update/scripts/fetch-upstream.sh',
-  );
-
-  beforeEach(() => {
-    // Create a bare repo to act as "upstream"
-    upstreamBareDir = fs.mkdtempSync(
-      path.join(os.tmpdir(), 'nanoclaw-upstream-'),
-    );
-    execSync('git init --bare -b main', {
-      cwd: upstreamBareDir,
-      stdio: 'pipe',
-    });
-
-    // Create a working repo, add files, push to the bare repo
-    const seedDir = fs.mkdtempSync(path.join(os.tmpdir(), 'nanoclaw-seed-'));
-    execSync('git init -b main', { cwd: seedDir, stdio: 'pipe' });
-    execSync('git config user.email "test@test.com"', {
-      cwd: seedDir,
-      stdio: 'pipe',
-    });
-    execSync('git config user.name "Test"', { cwd: seedDir, stdio: 'pipe' });
-    fs.writeFileSync(
-      path.join(seedDir, 'package.json'),
-      JSON.stringify({ name: 'nanoclaw', version: '2.0.0' }),
-    );
-    fs.mkdirSync(path.join(seedDir, 'src'), { recursive: true });
-    fs.writeFileSync(path.join(seedDir, 'src/index.ts'), 'export const v = 2;');
-    execSync('git add -A && git commit -m "upstream v2.0.0"', {
-      cwd: seedDir,
-      stdio: 'pipe',
-    });
-    execSync(`git remote add origin ${upstreamBareDir}`, {
-      cwd: seedDir,
-      stdio: 'pipe',
-    });
-    execSync('git push origin main', {
-      cwd: seedDir,
-      stdio: 'pipe',
-    });
-
-    fs.rmSync(seedDir, { recursive: true, force: true });
-
-    // Create the "project" repo that will run the script
-    projectDir = fs.mkdtempSync(path.join(os.tmpdir(), 'nanoclaw-project-'));
-    execSync('git init -b main', { cwd: projectDir, stdio: 'pipe' });
-    execSync('git config user.email "test@test.com"', {
-      cwd: projectDir,
-      stdio: 'pipe',
-    });
-    execSync('git config user.name "Test"', {
-      cwd: projectDir,
-      stdio: 'pipe',
-    });
-    fs.writeFileSync(
-      path.join(projectDir, 'package.json'),
-      JSON.stringify({ name: 'nanoclaw', version: '1.0.0' }),
-    );
-    execSync('git add -A && git commit -m "init"', {
-      cwd: projectDir,
-      stdio: 'pipe',
-    });
-
-    // Copy skills-engine/constants.ts so fetch-upstream.sh can read BASE_INCLUDES
-    const constantsSrc = path.resolve('skills-engine/constants.ts');
-    const constantsDest = path.join(projectDir, 'skills-engine/constants.ts');
-    fs.mkdirSync(path.dirname(constantsDest), { recursive: true });
-    fs.copyFileSync(constantsSrc, constantsDest);
-
-    // Copy the script into the project so it can find PROJECT_ROOT
-    const skillScriptsDir = path.join(
-      projectDir,
-      '.claude/skills/update/scripts',
-    );
-    fs.mkdirSync(skillScriptsDir, { recursive: true });
-    fs.copyFileSync(
-      scriptPath,
-      path.join(skillScriptsDir, 'fetch-upstream.sh'),
-    );
-    fs.chmodSync(path.join(skillScriptsDir, 'fetch-upstream.sh'), 0o755);
-  });
-
-  afterEach(() => {
-    // Clean up temp dirs (also any TEMP_DIR created by the script)
-    for (const dir of [projectDir, upstreamBareDir]) {
-      if (dir && fs.existsSync(dir)) {
-        fs.rmSync(dir, { recursive: true, force: true });
-      }
-    }
-  });
-
-  function runFetchUpstream(): { stdout: string; exitCode: number } {
-    try {
-      const stdout = execFileSync(
-        'bash',
-        ['.claude/skills/update/scripts/fetch-upstream.sh'],
-        {
-          cwd: projectDir,
-          encoding: 'utf-8',
-          stdio: 'pipe',
-          timeout: 30_000,
-        },
-      );
-      return { stdout, exitCode: 0 };
-    } catch (err: any) {
-      return {
-        stdout: (err.stdout ?? '') + (err.stderr ?? ''),
-        exitCode: err.status ?? 1,
-      };
-    }
-  }
-
-  function parseStatus(stdout: string): Record<string, string> {
-    const match = stdout.match(/<<< STATUS\n([\s\S]*?)\nSTATUS >>>/);
-    if (!match) return {};
-    const lines = match[1].trim().split('\n');
-    const result: Record<string, string> = {};
-    for (const line of lines) {
-      const eq = line.indexOf('=');
-      if (eq > 0) {
-        result[line.slice(0, eq)] = line.slice(eq + 1);
-      }
-    }
-    return result;
-  }
-
-  it('uses existing upstream remote', () => {
-    execSync(`git remote add upstream ${upstreamBareDir}`, {
-      cwd: projectDir,
-      stdio: 'pipe',
-    });
-
-    const { stdout, exitCode } = runFetchUpstream();
-    const status = parseStatus(stdout);
-
-    expect(exitCode).toBe(0);
-    expect(status.STATUS).toBe('success');
-    expect(status.REMOTE).toBe('upstream');
-    expect(status.CURRENT_VERSION).toBe('1.0.0');
-    expect(status.NEW_VERSION).toBe('2.0.0');
-    expect(status.TEMP_DIR).toMatch(/^\/tmp\/nanoclaw-update-/);
-
-    // Verify extracted files exist
-    expect(fs.existsSync(path.join(status.TEMP_DIR, 'package.json'))).toBe(
-      true,
-    );
-    expect(fs.existsSync(path.join(status.TEMP_DIR, 'src/index.ts'))).toBe(
-      true,
-    );
-
-    // Cleanup temp dir
-    fs.rmSync(status.TEMP_DIR, { recursive: true, force: true });
-  });
-
-  it('uses origin when it points to qwibitai/nanoclaw', () => {
-    // Set origin to a URL containing qwibitai/nanoclaw
-    execSync(`git remote add origin https://github.com/qwibitai/nanoclaw.git`, {
-      cwd: projectDir,
-      stdio: 'pipe',
-    });
-    // We can't actually fetch from GitHub in tests, but we can verify
-    // it picks the right remote. We'll add a second remote it CAN fetch from.
-    execSync(`git remote add upstream ${upstreamBareDir}`, {
-      cwd: projectDir,
-      stdio: 'pipe',
-    });
-
-    const { stdout, exitCode } = runFetchUpstream();
-    const status = parseStatus(stdout);
-
-    // It should find 'upstream' first (checked before origin)
-    expect(exitCode).toBe(0);
-    expect(status.REMOTE).toBe('upstream');
-
-    if (status.TEMP_DIR) {
-      fs.rmSync(status.TEMP_DIR, { recursive: true, force: true });
-    }
-  });
-
-  it('adds upstream remote when none exists', { timeout: 15_000 }, () => {
-    // Remove origin if any
-    try {
-      execSync('git remote remove origin', {
-        cwd: projectDir,
-        stdio: 'pipe',
-      });
-    } catch {
-      // No origin
-    }
-
-    const { stdout } = runFetchUpstream();
-
-    // It will try to add upstream pointing to github (which will fail to fetch),
-    // but we can verify it attempted to add the remote
-    expect(stdout).toContain('Adding upstream');
-
-    // Verify the remote was added
-    const remotes = execSync('git remote -v', {
-      cwd: projectDir,
-      encoding: 'utf-8',
-    });
-    expect(remotes).toContain('upstream');
-    expect(remotes).toContain('qwibitai/nanoclaw');
-  });
-
-  it('extracts files to temp dir correctly', () => {
-    execSync(`git remote add upstream ${upstreamBareDir}`, {
-      cwd: projectDir,
-      stdio: 'pipe',
-    });
-
-    const { stdout, exitCode } = runFetchUpstream();
-    const status = parseStatus(stdout);
-
-    expect(exitCode).toBe(0);
-
-    // Check file content matches what was pushed
-    const pkg = JSON.parse(
-      fs.readFileSync(path.join(status.TEMP_DIR, 'package.json'), 'utf-8'),
-    );
-    expect(pkg.version).toBe('2.0.0');
-
-    const indexContent = fs.readFileSync(
-      path.join(status.TEMP_DIR, 'src/index.ts'),
-      'utf-8',
-    );
-    expect(indexContent).toBe('export const v = 2;');
-
-    fs.rmSync(status.TEMP_DIR, { recursive: true, force: true });
-  });
-});
diff --git a/skills-engine/__tests__/file-ops.test.ts b/skills-engine/__tests__/file-ops.test.ts
deleted file mode 100644
index bfb32e8..0000000
--- a/skills-engine/__tests__/file-ops.test.ts
+++ /dev/null
@@ -1,169 +0,0 @@
-import { describe, it, expect, beforeEach, afterEach } from 'vitest';
-import fs from 'fs';
-import path from 'path';
-import { executeFileOps } from '../file-ops.js';
-import { createTempDir, cleanup } from './test-helpers.js';
-
-function shouldSkipSymlinkTests(err: unknown): boolean {
-  return !!(
-    err &&
-    typeof err === 'object' &&
-    'code' in err &&
-    ((err as { code?: string }).code === 'EPERM' ||
-      (err as { code?: string }).code === 'EACCES' ||
-      (err as { code?: string }).code === 'ENOSYS')
-  );
-}
-
-describe('file-ops', () => {
-  let tmpDir: string;
-  const originalCwd = process.cwd();
-
-  beforeEach(() => {
-    tmpDir = createTempDir();
-    process.chdir(tmpDir);
-  });
-
-  afterEach(() => {
-    process.chdir(originalCwd);
-    cleanup(tmpDir);
-  });
-
-  it('rename success', () => {
-    fs.writeFileSync(path.join(tmpDir, 'old.ts'), 'content');
-    const result = executeFileOps(
-      [{ type: 'rename', from: 'old.ts', to: 'new.ts' }],
-      tmpDir,
-    );
-    expect(result.success).toBe(true);
-    expect(fs.existsSync(path.join(tmpDir, 'new.ts'))).toBe(true);
-    expect(fs.existsSync(path.join(tmpDir, 'old.ts'))).toBe(false);
-  });
-
-  it('move success', () => {
-    fs.writeFileSync(path.join(tmpDir, 'file.ts'), 'content');
-    const result = executeFileOps(
-      [{ type: 'move', from: 'file.ts', to: 'sub/file.ts' }],
-      tmpDir,
-    );
-    expect(result.success).toBe(true);
-    expect(fs.existsSync(path.join(tmpDir, 'sub', 'file.ts'))).toBe(true);
-    expect(fs.existsSync(path.join(tmpDir, 'file.ts'))).toBe(false);
-  });
-
-  it('delete success', () => {
-    fs.writeFileSync(path.join(tmpDir, 'remove-me.ts'), 'content');
-    const result = executeFileOps(
-      [{ type: 'delete', path: 'remove-me.ts' }],
-      tmpDir,
-    );
-    expect(result.success).toBe(true);
-    expect(fs.existsSync(path.join(tmpDir, 'remove-me.ts'))).toBe(false);
-  });
-
-  it('rename target exists produces error', () => {
-    fs.writeFileSync(path.join(tmpDir, 'a.ts'), 'a');
-    fs.writeFileSync(path.join(tmpDir, 'b.ts'), 'b');
-    const result = executeFileOps(
-      [{ type: 'rename', from: 'a.ts', to: 'b.ts' }],
-      tmpDir,
-    );
-    expect(result.success).toBe(false);
-    expect(result.errors.length).toBeGreaterThan(0);
-  });
-
-  it('delete missing file produces warning not error', () => {
-    const result = executeFileOps(
-      [{ type: 'delete', path: 'nonexistent.ts' }],
-      tmpDir,
-    );
-    expect(result.success).toBe(true);
-    expect(result.warnings.length).toBeGreaterThan(0);
-  });
-
-  it('move creates destination directory', () => {
-    fs.writeFileSync(path.join(tmpDir, 'src.ts'), 'content');
-    const result = executeFileOps(
-      [{ type: 'move', from: 'src.ts', to: 'deep/nested/dir/src.ts' }],
-      tmpDir,
-    );
-    expect(result.success).toBe(true);
-    expect(
-      fs.existsSync(path.join(tmpDir, 'deep', 'nested', 'dir', 'src.ts')),
-    ).toBe(true);
-  });
-
-  it('path escape produces error', () => {
-    fs.writeFileSync(path.join(tmpDir, 'file.ts'), 'content');
-    const result = executeFileOps(
-      [{ type: 'rename', from: 'file.ts', to: '../../escaped.ts' }],
-      tmpDir,
-    );
-    expect(result.success).toBe(false);
-    expect(result.errors.length).toBeGreaterThan(0);
-  });
-
-  it('source missing produces error for rename', () => {
-    const result = executeFileOps(
-      [{ type: 'rename', from: 'missing.ts', to: 'new.ts' }],
-      tmpDir,
-    );
-    expect(result.success).toBe(false);
-    expect(result.errors.length).toBeGreaterThan(0);
-  });
-
-  it('move rejects symlink escape to outside project root', () => {
-    const outsideDir = createTempDir();
-
-    try {
-      fs.symlinkSync(outsideDir, path.join(tmpDir, 'linkdir'));
-    } catch (err) {
-      cleanup(outsideDir);
-      if (shouldSkipSymlinkTests(err)) return;
-      throw err;
-    }
-
-    fs.writeFileSync(path.join(tmpDir, 'source.ts'), 'content');
-
-    const result = executeFileOps(
-      [{ type: 'move', from: 'source.ts', to: 'linkdir/pwned.ts' }],
-      tmpDir,
-    );
-
-    expect(result.success).toBe(false);
-    expect(result.errors.some((e) => e.includes('escapes project root'))).toBe(
-      true,
-    );
-    expect(fs.existsSync(path.join(tmpDir, 'source.ts'))).toBe(true);
-    expect(fs.existsSync(path.join(outsideDir, 'pwned.ts'))).toBe(false);
-
-    cleanup(outsideDir);
-  });
-
-  it('delete rejects symlink escape to outside project root', () => {
-    const outsideDir = createTempDir();
-    const outsideFile = path.join(outsideDir, 'victim.ts');
-    fs.writeFileSync(outsideFile, 'secret');
-
-    try {
-      fs.symlinkSync(outsideDir, path.join(tmpDir, 'linkdir'));
-    } catch (err) {
-      cleanup(outsideDir);
-      if (shouldSkipSymlinkTests(err)) return;
-      throw err;
-    }
-
-    const result = executeFileOps(
-      [{ type: 'delete', path: 'linkdir/victim.ts' }],
-      tmpDir,
-    );
-
-    expect(result.success).toBe(false);
-    expect(result.errors.some((e) => e.includes('escapes project root'))).toBe(
-      true,
-    );
-    expect(fs.existsSync(outsideFile)).toBe(true);
-
-    cleanup(outsideDir);
-  });
-});
diff --git a/skills-engine/__tests__/lock.test.ts b/skills-engine/__tests__/lock.test.ts
deleted file mode 100644
index 57840e6..0000000
--- a/skills-engine/__tests__/lock.test.ts
+++ /dev/null
@@ -1,60 +0,0 @@
-import { describe, it, expect, beforeEach, afterEach } from 'vitest';
-import fs from 'fs';
-import path from 'path';
-import { acquireLock, releaseLock, isLocked } from '../lock.js';
-import { LOCK_FILE } from '../constants.js';
-import { createTempDir, cleanup } from './test-helpers.js';
-
-describe('lock', () => {
-  let tmpDir: string;
-  const originalCwd = process.cwd();
-
-  beforeEach(() => {
-    tmpDir = createTempDir();
-    fs.mkdirSync(path.join(tmpDir, '.nanoclaw'), { recursive: true });
-    process.chdir(tmpDir);
-  });
-
-  afterEach(() => {
-    process.chdir(originalCwd);
-    cleanup(tmpDir);
-  });
-
-  it('acquireLock returns a release function', () => {
-    const release = acquireLock();
-    expect(typeof release).toBe('function');
-    expect(fs.existsSync(path.join(tmpDir, LOCK_FILE))).toBe(true);
-    release();
-  });
-
-  it('releaseLock removes the lock file', () => {
-    acquireLock();
-    expect(fs.existsSync(path.join(tmpDir, LOCK_FILE))).toBe(true);
-    releaseLock();
-    expect(fs.existsSync(path.join(tmpDir, LOCK_FILE))).toBe(false);
-  });
-
-  it('acquire after release succeeds', () => {
-    const release1 = acquireLock();
-    release1();
-    const release2 = acquireLock();
-    expect(typeof release2).toBe('function');
-    release2();
-  });
-
-  it('isLocked returns true when locked', () => {
-    const release = acquireLock();
-    expect(isLocked()).toBe(true);
-    release();
-  });
-
-  it('isLocked returns false when released', () => {
-    const release = acquireLock();
-    release();
-    expect(isLocked()).toBe(false);
-  });
-
-  it('isLocked returns false when no lock exists', () => {
-    expect(isLocked()).toBe(false);
-  });
-});
diff --git a/skills-engine/__tests__/manifest.test.ts b/skills-engine/__tests__/manifest.test.ts
deleted file mode 100644
index b5f695a..0000000
--- a/skills-engine/__tests__/manifest.test.ts
+++ /dev/null
@@ -1,355 +0,0 @@
-import { describe, it, expect, beforeEach, afterEach } from 'vitest';
-import fs from 'fs';
-import path from 'path';
-import { stringify } from 'yaml';
-import {
-  readManifest,
-  checkCoreVersion,
-  checkDependencies,
-  checkConflicts,
-  checkSystemVersion,
-} from '../manifest.js';
-import {
-  createTempDir,
-  setupNanoclawDir,
-  createMinimalState,
-  createSkillPackage,
-  cleanup,
-  writeState,
-} from './test-helpers.js';
-import { recordSkillApplication } from '../state.js';
-
-describe('manifest', () => {
-  let tmpDir: string;
-  const originalCwd = process.cwd();
-
-  beforeEach(() => {
-    tmpDir = createTempDir();
-    setupNanoclawDir(tmpDir);
-    createMinimalState(tmpDir);
-    process.chdir(tmpDir);
-  });
-
-  afterEach(() => {
-    process.chdir(originalCwd);
-    cleanup(tmpDir);
-  });
-
-  it('parses a valid manifest', () => {
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'telegram',
-      version: '2.0.0',
-      core_version: '1.0.0',
-      adds: ['src/telegram.ts'],
-      modifies: ['src/config.ts'],
-    });
-    const manifest = readManifest(skillDir);
-    expect(manifest.skill).toBe('telegram');
-    expect(manifest.version).toBe('2.0.0');
-    expect(manifest.adds).toEqual(['src/telegram.ts']);
-    expect(manifest.modifies).toEqual(['src/config.ts']);
-  });
-
-  it('throws on missing skill field', () => {
-    const dir = path.join(tmpDir, 'bad-pkg');
-    fs.mkdirSync(dir, { recursive: true });
-    fs.writeFileSync(
-      path.join(dir, 'manifest.yaml'),
-      stringify({
-        version: '1.0.0',
-        core_version: '1.0.0',
-        adds: [],
-        modifies: [],
-      }),
-    );
-    expect(() => readManifest(dir)).toThrow();
-  });
-
-  it('throws on missing version field', () => {
-    const dir = path.join(tmpDir, 'bad-pkg');
-    fs.mkdirSync(dir, { recursive: true });
-    fs.writeFileSync(
-      path.join(dir, 'manifest.yaml'),
-      stringify({
-        skill: 'test',
-        core_version: '1.0.0',
-        adds: [],
-        modifies: [],
-      }),
-    );
-    expect(() => readManifest(dir)).toThrow();
-  });
-
-  it('throws on missing core_version field', () => {
-    const dir = path.join(tmpDir, 'bad-pkg');
-    fs.mkdirSync(dir, { recursive: true });
-    fs.writeFileSync(
-      path.join(dir, 'manifest.yaml'),
-      stringify({
-        skill: 'test',
-        version: '1.0.0',
-        adds: [],
-        modifies: [],
-      }),
-    );
-    expect(() => readManifest(dir)).toThrow();
-  });
-
-  it('throws on missing adds field', () => {
-    const dir = path.join(tmpDir, 'bad-pkg');
-    fs.mkdirSync(dir, { recursive: true });
-    fs.writeFileSync(
-      path.join(dir, 'manifest.yaml'),
-      stringify({
-        skill: 'test',
-        version: '1.0.0',
-        core_version: '1.0.0',
-        modifies: [],
-      }),
-    );
-    expect(() => readManifest(dir)).toThrow();
-  });
-
-  it('throws on missing modifies field', () => {
-    const dir = path.join(tmpDir, 'bad-pkg');
-    fs.mkdirSync(dir, { recursive: true });
-    fs.writeFileSync(
-      path.join(dir, 'manifest.yaml'),
-      stringify({
-        skill: 'test',
-        version: '1.0.0',
-        core_version: '1.0.0',
-        adds: [],
-      }),
-    );
-    expect(() => readManifest(dir)).toThrow();
-  });
-
-  it('throws on path traversal in adds', () => {
-    const dir = path.join(tmpDir, 'bad-pkg');
-    fs.mkdirSync(dir, { recursive: true });
-    fs.writeFileSync(
-      path.join(dir, 'manifest.yaml'),
-      stringify({
-        skill: 'test',
-        version: '1.0.0',
-        core_version: '1.0.0',
-        adds: ['../etc/passwd'],
-        modifies: [],
-      }),
-    );
-    expect(() => readManifest(dir)).toThrow('Invalid path');
-  });
-
-  it('throws on path traversal in modifies', () => {
-    const dir = path.join(tmpDir, 'bad-pkg');
-    fs.mkdirSync(dir, { recursive: true });
-    fs.writeFileSync(
-      path.join(dir, 'manifest.yaml'),
-      stringify({
-        skill: 'test',
-        version: '1.0.0',
-        core_version: '1.0.0',
-        adds: [],
-        modifies: ['../../secret.ts'],
-      }),
-    );
-    expect(() => readManifest(dir)).toThrow('Invalid path');
-  });
-
-  it('throws on absolute path in adds', () => {
-    const dir = path.join(tmpDir, 'bad-pkg');
-    fs.mkdirSync(dir, { recursive: true });
-    fs.writeFileSync(
-      path.join(dir, 'manifest.yaml'),
-      stringify({
-        skill: 'test',
-        version: '1.0.0',
-        core_version: '1.0.0',
-        adds: ['/etc/passwd'],
-        modifies: [],
-      }),
-    );
-    expect(() => readManifest(dir)).toThrow('Invalid path');
-  });
-
-  it('defaults conflicts and depends to empty arrays', () => {
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'test',
-      version: '1.0.0',
-      core_version: '1.0.0',
-      adds: [],
-      modifies: [],
-    });
-    const manifest = readManifest(skillDir);
-    expect(manifest.conflicts).toEqual([]);
-    expect(manifest.depends).toEqual([]);
-  });
-
-  it('checkCoreVersion returns warning when manifest targets newer core', () => {
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'test',
-      version: '1.0.0',
-      core_version: '2.0.0',
-      adds: [],
-      modifies: [],
-    });
-    const manifest = readManifest(skillDir);
-    const result = checkCoreVersion(manifest);
-    expect(result.warning).toBeTruthy();
-  });
-
-  it('checkCoreVersion returns no warning when versions match', () => {
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'test',
-      version: '1.0.0',
-      core_version: '1.0.0',
-      adds: [],
-      modifies: [],
-    });
-    const manifest = readManifest(skillDir);
-    const result = checkCoreVersion(manifest);
-    expect(result.ok).toBe(true);
-    expect(result.warning).toBeFalsy();
-  });
-
-  it('checkDependencies satisfied when deps present', () => {
-    recordSkillApplication('dep-skill', '1.0.0', {});
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'test',
-      version: '1.0.0',
-      core_version: '1.0.0',
-      adds: [],
-      modifies: [],
-      depends: ['dep-skill'],
-    });
-    const manifest = readManifest(skillDir);
-    const result = checkDependencies(manifest);
-    expect(result.ok).toBe(true);
-    expect(result.missing).toEqual([]);
-  });
-
-  it('checkDependencies missing when deps not present', () => {
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'test',
-      version: '1.0.0',
-      core_version: '1.0.0',
-      adds: [],
-      modifies: [],
-      depends: ['missing-skill'],
-    });
-    const manifest = readManifest(skillDir);
-    const result = checkDependencies(manifest);
-    expect(result.ok).toBe(false);
-    expect(result.missing).toContain('missing-skill');
-  });
-
-  it('checkConflicts ok when no conflicts', () => {
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'test',
-      version: '1.0.0',
-      core_version: '1.0.0',
-      adds: [],
-      modifies: [],
-      conflicts: [],
-    });
-    const manifest = readManifest(skillDir);
-    const result = checkConflicts(manifest);
-    expect(result.ok).toBe(true);
-    expect(result.conflicting).toEqual([]);
-  });
-
-  it('checkConflicts detects conflicting skill', () => {
-    recordSkillApplication('bad-skill', '1.0.0', {});
-    const skillDir = createSkillPackage(tmpDir, {
-      skill: 'test',
-      version: '1.0.0',
-      core_version: '1.0.0',
-      adds: [],
-      modifies: [],
conflicts: ['bad-skill'], - }); - const manifest = readManifest(skillDir); - const result = checkConflicts(manifest); - expect(result.ok).toBe(false); - expect(result.conflicting).toContain('bad-skill'); - }); - - it('parses new optional fields (author, license, etc)', () => { - const dir = path.join(tmpDir, 'full-pkg'); - fs.mkdirSync(dir, { recursive: true }); - fs.writeFileSync( - path.join(dir, 'manifest.yaml'), - stringify({ - skill: 'test', - version: '1.0.0', - core_version: '1.0.0', - adds: [], - modifies: [], - author: 'tester', - license: 'MIT', - min_skills_system_version: '0.1.0', - tested_with: ['telegram', 'discord'], - post_apply: ['echo done'], - }), - ); - const manifest = readManifest(dir); - expect(manifest.author).toBe('tester'); - expect(manifest.license).toBe('MIT'); - expect(manifest.min_skills_system_version).toBe('0.1.0'); - expect(manifest.tested_with).toEqual(['telegram', 'discord']); - expect(manifest.post_apply).toEqual(['echo done']); - }); - - it('checkSystemVersion passes when not set', () => { - const skillDir = createSkillPackage(tmpDir, { - skill: 'test', - version: '1.0.0', - core_version: '1.0.0', - adds: [], - modifies: [], - }); - const manifest = readManifest(skillDir); - const result = checkSystemVersion(manifest); - expect(result.ok).toBe(true); - }); - - it('checkSystemVersion passes when engine is new enough', () => { - const dir = path.join(tmpDir, 'sys-ok'); - fs.mkdirSync(dir, { recursive: true }); - fs.writeFileSync( - path.join(dir, 'manifest.yaml'), - stringify({ - skill: 'test', - version: '1.0.0', - core_version: '1.0.0', - adds: [], - modifies: [], - min_skills_system_version: '0.1.0', - }), - ); - const manifest = readManifest(dir); - const result = checkSystemVersion(manifest); - expect(result.ok).toBe(true); - }); - - it('checkSystemVersion fails when engine is too old', () => { - const dir = path.join(tmpDir, 'sys-fail'); - fs.mkdirSync(dir, { recursive: true }); - fs.writeFileSync( - path.join(dir, 
'manifest.yaml'), - stringify({ - skill: 'test', - version: '1.0.0', - core_version: '1.0.0', - adds: [], - modifies: [], - min_skills_system_version: '99.0.0', - }), - ); - const manifest = readManifest(dir); - const result = checkSystemVersion(manifest); - expect(result.ok).toBe(false); - expect(result.error).toContain('99.0.0'); - }); -}); diff --git a/skills-engine/__tests__/merge.test.ts b/skills-engine/__tests__/merge.test.ts deleted file mode 100644 index 7d6ebb6..0000000 --- a/skills-engine/__tests__/merge.test.ts +++ /dev/null @@ -1,71 +0,0 @@ -import { describe, it, expect, beforeEach, afterEach } from 'vitest'; -import fs from 'fs'; -import path from 'path'; -import { isGitRepo, mergeFile } from '../merge.js'; -import { createTempDir, initGitRepo, cleanup } from './test-helpers.js'; - -describe('merge', () => { - let tmpDir: string; - const originalCwd = process.cwd(); - - beforeEach(() => { - tmpDir = createTempDir(); - process.chdir(tmpDir); - }); - - afterEach(() => { - process.chdir(originalCwd); - cleanup(tmpDir); - }); - - it('isGitRepo returns true in a git repo', () => { - initGitRepo(tmpDir); - expect(isGitRepo()).toBe(true); - }); - - it('isGitRepo returns false outside a git repo', () => { - expect(isGitRepo()).toBe(false); - }); - - describe('mergeFile', () => { - beforeEach(() => { - initGitRepo(tmpDir); - }); - - it('clean merge with no overlapping changes', () => { - const base = path.join(tmpDir, 'base.txt'); - const current = path.join(tmpDir, 'current.txt'); - const skill = path.join(tmpDir, 'skill.txt'); - - fs.writeFileSync(base, 'line1\nline2\nline3\n'); - fs.writeFileSync(current, 'line1-modified\nline2\nline3\n'); - fs.writeFileSync(skill, 'line1\nline2\nline3-modified\n'); - - const result = mergeFile(current, base, skill); - expect(result.clean).toBe(true); - expect(result.exitCode).toBe(0); - - const merged = fs.readFileSync(current, 'utf-8'); - expect(merged).toContain('line1-modified'); - 
expect(merged).toContain('line3-modified'); - }); - - it('conflict with overlapping changes', () => { - const base = path.join(tmpDir, 'base.txt'); - const current = path.join(tmpDir, 'current.txt'); - const skill = path.join(tmpDir, 'skill.txt'); - - fs.writeFileSync(base, 'line1\nline2\nline3\n'); - fs.writeFileSync(current, 'line1-ours\nline2\nline3\n'); - fs.writeFileSync(skill, 'line1-theirs\nline2\nline3\n'); - - const result = mergeFile(current, base, skill); - expect(result.clean).toBe(false); - expect(result.exitCode).toBeGreaterThan(0); - - const merged = fs.readFileSync(current, 'utf-8'); - expect(merged).toContain('<<<<<<<'); - expect(merged).toContain('>>>>>>>'); - }); - }); -}); diff --git a/skills-engine/__tests__/path-remap.test.ts b/skills-engine/__tests__/path-remap.test.ts deleted file mode 100644 index e37b82c..0000000 --- a/skills-engine/__tests__/path-remap.test.ts +++ /dev/null @@ -1,172 +0,0 @@ -import fs from 'fs'; -import path from 'path'; -import { afterEach, beforeEach, describe, expect, it } from 'vitest'; - -import { - loadPathRemap, - recordPathRemap, - resolvePathRemap, -} from '../path-remap.js'; -import { readState, writeState } from '../state.js'; -import { - cleanup, - createMinimalState, - createTempDir, - setupNanoclawDir, -} from './test-helpers.js'; - -describe('path-remap', () => { - let tmpDir: string; - const originalCwd = process.cwd(); - - beforeEach(() => { - tmpDir = createTempDir(); - setupNanoclawDir(tmpDir); - createMinimalState(tmpDir); - process.chdir(tmpDir); - }); - - afterEach(() => { - process.chdir(originalCwd); - cleanup(tmpDir); - }); - - describe('resolvePathRemap', () => { - it('returns remapped path when entry exists', () => { - const remap = { 'src/old.ts': 'src/new.ts' }; - expect(resolvePathRemap('src/old.ts', remap)).toBe('src/new.ts'); - }); - - it('returns original path when no remap entry', () => { - const remap = { 'src/old.ts': 'src/new.ts' }; - expect(resolvePathRemap('src/other.ts', 
remap)).toBe('src/other.ts'); - }); - - it('returns original path when remap is empty', () => { - expect(resolvePathRemap('src/file.ts', {})).toBe('src/file.ts'); - }); - - it('ignores remap entries that escape project root', () => { - const remap = { 'src/file.ts': '../../outside.txt' }; - expect(resolvePathRemap('src/file.ts', remap)).toBe('src/file.ts'); - }); - - it('ignores remap target that resolves through symlink outside project root', () => { - const outsideDir = fs.mkdtempSync( - path.join(path.dirname(tmpDir), 'nanoclaw-remap-outside-'), - ); - const linkPath = path.join(tmpDir, 'link-out'); - - try { - fs.symlinkSync(outsideDir, linkPath); - } catch (err) { - const code = (err as NodeJS.ErrnoException).code; - if (code === 'EPERM' || code === 'EACCES' || code === 'ENOSYS') { - fs.rmSync(outsideDir, { recursive: true, force: true }); - return; - } - fs.rmSync(outsideDir, { recursive: true, force: true }); - throw err; - } - - try { - const remap = { 'src/file.ts': 'link-out/pwned.txt' }; - expect(resolvePathRemap('src/file.ts', remap)).toBe('src/file.ts'); - } finally { - fs.rmSync(outsideDir, { recursive: true, force: true }); - } - }); - - it('throws when requested path itself escapes project root', () => { - expect(() => resolvePathRemap('../../outside.txt', {})).toThrow( - /escapes project root/i, - ); - }); - }); - - describe('loadPathRemap', () => { - it('returns empty object when no remap in state', () => { - const remap = loadPathRemap(); - expect(remap).toEqual({}); - }); - - it('returns remap from state', () => { - recordPathRemap({ 'src/a.ts': 'src/b.ts' }); - const remap = loadPathRemap(); - expect(remap).toEqual({ 'src/a.ts': 'src/b.ts' }); - }); - - it('drops unsafe remap entries stored in state', () => { - const state = readState(); - state.path_remap = { - 'src/a.ts': 'src/b.ts', - 'src/evil.ts': '../../outside.txt', - }; - writeState(state); - - const remap = loadPathRemap(); - expect(remap).toEqual({ 'src/a.ts': 'src/b.ts' }); - }); - - 
it('drops symlink-based escape entries stored in state', () => { - const outsideDir = fs.mkdtempSync( - path.join(path.dirname(tmpDir), 'nanoclaw-remap-outside-'), - ); - const linkPath = path.join(tmpDir, 'link-out'); - - try { - fs.symlinkSync(outsideDir, linkPath); - } catch (err) { - const code = (err as NodeJS.ErrnoException).code; - if (code === 'EPERM' || code === 'EACCES' || code === 'ENOSYS') { - fs.rmSync(outsideDir, { recursive: true, force: true }); - return; - } - fs.rmSync(outsideDir, { recursive: true, force: true }); - throw err; - } - - try { - const state = readState(); - state.path_remap = { - 'src/a.ts': 'src/b.ts', - 'src/evil.ts': 'link-out/pwned.txt', - }; - writeState(state); - - const remap = loadPathRemap(); - expect(remap).toEqual({ 'src/a.ts': 'src/b.ts' }); - } finally { - fs.rmSync(outsideDir, { recursive: true, force: true }); - } - }); - }); - - describe('recordPathRemap', () => { - it('records new remap entries', () => { - recordPathRemap({ 'src/old.ts': 'src/new.ts' }); - expect(loadPathRemap()).toEqual({ 'src/old.ts': 'src/new.ts' }); - }); - - it('merges with existing remap', () => { - recordPathRemap({ 'src/a.ts': 'src/b.ts' }); - recordPathRemap({ 'src/c.ts': 'src/d.ts' }); - expect(loadPathRemap()).toEqual({ - 'src/a.ts': 'src/b.ts', - 'src/c.ts': 'src/d.ts', - }); - }); - - it('overwrites existing key on conflict', () => { - recordPathRemap({ 'src/a.ts': 'src/b.ts' }); - recordPathRemap({ 'src/a.ts': 'src/c.ts' }); - expect(loadPathRemap()).toEqual({ 'src/a.ts': 'src/c.ts' }); - }); - - it('rejects unsafe remap entries', () => { - expect(() => - recordPathRemap({ 'src/a.ts': '../../outside.txt' }), - ).toThrow(/escapes project root/i); - }); - }); -}); diff --git a/skills-engine/__tests__/rebase.test.ts b/skills-engine/__tests__/rebase.test.ts deleted file mode 100644 index a7aaa3f..0000000 --- a/skills-engine/__tests__/rebase.test.ts +++ /dev/null @@ -1,389 +0,0 @@ -import fs from 'fs'; -import path from 'path'; -import { 
afterEach, beforeEach, describe, expect, it } from 'vitest'; -import { parse } from 'yaml'; - -import { rebase } from '../rebase.js'; -import { - cleanup, - createMinimalState, - createTempDir, - initGitRepo, - setupNanoclawDir, - writeState, -} from './test-helpers.js'; - -describe('rebase', () => { - let tmpDir: string; - const originalCwd = process.cwd(); - - beforeEach(() => { - tmpDir = createTempDir(); - setupNanoclawDir(tmpDir); - createMinimalState(tmpDir); - process.chdir(tmpDir); - }); - - afterEach(() => { - process.chdir(originalCwd); - cleanup(tmpDir); - }); - - it('rebase with one skill: patch created, state updated, rebased_at set', async () => { - // Set up base file - const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src'); - fs.mkdirSync(baseDir, { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'index.ts'), 'const x = 1;\n'); - - // Set up working tree with skill modification - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src', 'index.ts'), - 'const x = 1;\nconst y = 2; // added by skill\n', - ); - - // Write state with applied skill - writeState(tmpDir, { - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [ - { - name: 'test-skill', - version: '1.0.0', - applied_at: new Date().toISOString(), - file_hashes: { - 'src/index.ts': 'abc123', - }, - }, - ], - }); - - initGitRepo(tmpDir); - - const result = await rebase(); - - expect(result.success).toBe(true); - expect(result.filesInPatch).toBeGreaterThan(0); - expect(result.rebased_at).toBeDefined(); - expect(result.patchFile).toBeDefined(); - - // Verify patch file exists - const patchPath = path.join(tmpDir, '.nanoclaw', 'combined.patch'); - expect(fs.existsSync(patchPath)).toBe(true); - - const patchContent = fs.readFileSync(patchPath, 'utf-8'); - expect(patchContent).toContain('added by skill'); - - // Verify state was updated - const stateContent = fs.readFileSync( - path.join(tmpDir, '.nanoclaw', 
'state.yaml'), - 'utf-8', - ); - const state = parse(stateContent); - expect(state.rebased_at).toBeDefined(); - expect(state.applied_skills).toHaveLength(1); - expect(state.applied_skills[0].name).toBe('test-skill'); - - // File hashes should be updated to actual current values - const currentHash = state.applied_skills[0].file_hashes['src/index.ts']; - expect(currentHash).toBeDefined(); - expect(currentHash).not.toBe('abc123'); // Should be recomputed - - // Working tree file should still have the skill's changes - const workingContent = fs.readFileSync( - path.join(tmpDir, 'src', 'index.ts'), - 'utf-8', - ); - expect(workingContent).toContain('added by skill'); - }); - - it('rebase flattens: base updated to match working tree', async () => { - // Set up base file (clean core) - const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src'); - fs.mkdirSync(baseDir, { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'index.ts'), 'const x = 1;\n'); - - // Working tree has skill modification - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src', 'index.ts'), - 'const x = 1;\nconst y = 2; // skill\n', - ); - - writeState(tmpDir, { - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [ - { - name: 'my-skill', - version: '1.0.0', - applied_at: new Date().toISOString(), - file_hashes: { - 'src/index.ts': 'oldhash', - }, - }, - ], - }); - - initGitRepo(tmpDir); - - const result = await rebase(); - expect(result.success).toBe(true); - - // Base should now include the skill's changes (flattened) - const baseContent = fs.readFileSync( - path.join(tmpDir, '.nanoclaw', 'base', 'src', 'index.ts'), - 'utf-8', - ); - expect(baseContent).toContain('skill'); - expect(baseContent).toBe('const x = 1;\nconst y = 2; // skill\n'); - }); - - it('rebase with multiple skills + custom mods: all collapsed into single patch', async () => { - // Set up base files - const baseDir = path.join(tmpDir, 
'.nanoclaw', 'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'src', 'index.ts'), 'const x = 1;\n'); - fs.writeFileSync( - path.join(baseDir, 'src', 'config.ts'), - 'export const port = 3000;\n', - ); - - // Set up working tree with modifications from multiple skills - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src', 'index.ts'), - 'const x = 1;\nconst y = 2; // skill-a\n', - ); - fs.writeFileSync( - path.join(tmpDir, 'src', 'config.ts'), - 'export const port = 3000;\nexport const host = "0.0.0.0"; // skill-b\n', - ); - // File added by skill - fs.writeFileSync( - path.join(tmpDir, 'src', 'plugin.ts'), - 'export const plugin = true;\n', - ); - - // Write state with multiple skills and custom modifications - writeState(tmpDir, { - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [ - { - name: 'skill-a', - version: '1.0.0', - applied_at: new Date().toISOString(), - file_hashes: { - 'src/index.ts': 'hash-a1', - }, - }, - { - name: 'skill-b', - version: '2.0.0', - applied_at: new Date().toISOString(), - file_hashes: { - 'src/config.ts': 'hash-b1', - 'src/plugin.ts': 'hash-b2', - }, - }, - ], - custom_modifications: [ - { - description: 'tweaked config', - applied_at: new Date().toISOString(), - files_modified: ['src/config.ts'], - patch_file: '.nanoclaw/custom/001-tweaked-config.patch', - }, - ], - }); - - initGitRepo(tmpDir); - - const result = await rebase(); - - expect(result.success).toBe(true); - expect(result.filesInPatch).toBeGreaterThanOrEqual(2); - - // Verify combined patch includes changes from both skills - const patchContent = fs.readFileSync( - path.join(tmpDir, '.nanoclaw', 'combined.patch'), - 'utf-8', - ); - expect(patchContent).toContain('skill-a'); - expect(patchContent).toContain('skill-b'); - - // Verify state: custom_modifications should be cleared - const stateContent = fs.readFileSync( - 
path.join(tmpDir, '.nanoclaw', 'state.yaml'), - 'utf-8', - ); - const state = parse(stateContent); - expect(state.custom_modifications).toBeUndefined(); - expect(state.rebased_at).toBeDefined(); - - // applied_skills should still be present (informational) - expect(state.applied_skills).toHaveLength(2); - - // Base should be flattened — include all skill changes - const baseIndex = fs.readFileSync( - path.join(tmpDir, '.nanoclaw', 'base', 'src', 'index.ts'), - 'utf-8', - ); - expect(baseIndex).toContain('skill-a'); - - const baseConfig = fs.readFileSync( - path.join(tmpDir, '.nanoclaw', 'base', 'src', 'config.ts'), - 'utf-8', - ); - expect(baseConfig).toContain('skill-b'); - }); - - it('rebase with new base: base updated, changes merged', async () => { - // Set up current base (multi-line so changes don't conflict) - const baseDir = path.join(tmpDir, '.nanoclaw', 'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(baseDir, 'src', 'index.ts'), - 'line1\nline2\nline3\nline4\nline5\nline6\nline7\nline8\n', - ); - - // Working tree: skill adds at bottom - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src', 'index.ts'), - 'line1\nline2\nline3\nline4\nline5\nline6\nline7\nline8\nskill change\n', - ); - - writeState(tmpDir, { - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [ - { - name: 'my-skill', - version: '1.0.0', - applied_at: new Date().toISOString(), - file_hashes: { - 'src/index.ts': 'oldhash', - }, - }, - ], - }); - - initGitRepo(tmpDir); - - // New base: core update at top - const newBase = path.join(tmpDir, 'new-core'); - fs.mkdirSync(path.join(newBase, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(newBase, 'src', 'index.ts'), - 'core v2 header\nline1\nline2\nline3\nline4\nline5\nline6\nline7\nline8\n', - ); - - const result = await rebase(newBase); - - expect(result.success).toBe(true); - 
expect(result.patchFile).toBeDefined(); - - // Verify base was updated to new core - const baseContent = fs.readFileSync( - path.join(tmpDir, '.nanoclaw', 'base', 'src', 'index.ts'), - 'utf-8', - ); - expect(baseContent).toContain('core v2 header'); - - // Working tree should have both core v2 and skill changes merged - const workingContent = fs.readFileSync( - path.join(tmpDir, 'src', 'index.ts'), - 'utf-8', - ); - expect(workingContent).toContain('core v2 header'); - expect(workingContent).toContain('skill change'); - - // State should reflect rebase - const stateContent = fs.readFileSync( - path.join(tmpDir, '.nanoclaw', 'state.yaml'), - 'utf-8', - ); - const state = parse(stateContent); - expect(state.rebased_at).toBeDefined(); - }); - - it('rebase with new base: conflict returns backupPending', async () => { - // Set up current base — short file so changes overlap - const baseDir = path.join(tmpDir, '.nanoclaw', 'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'src', 'index.ts'), 'const x = 1;\n'); - - // Working tree: skill replaces the same line - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src', 'index.ts'), - 'const x = 42; // skill override\n', - ); - - writeState(tmpDir, { - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [ - { - name: 'my-skill', - version: '1.0.0', - applied_at: new Date().toISOString(), - file_hashes: { - 'src/index.ts': 'oldhash', - }, - }, - ], - }); - - initGitRepo(tmpDir); - - // New base: also changes the same line — guaranteed conflict - const newBase = path.join(tmpDir, 'new-core'); - fs.mkdirSync(path.join(newBase, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(newBase, 'src', 'index.ts'), - 'const x = 999; // core v2\n', - ); - - const result = await rebase(newBase); - - expect(result.success).toBe(false); - expect(result.mergeConflicts).toContain('src/index.ts'); - 
expect(result.backupPending).toBe(true); - expect(result.error).toContain('Merge conflicts'); - - // combined.patch should still exist - expect(result.patchFile).toBeDefined(); - const patchPath = path.join(tmpDir, '.nanoclaw', 'combined.patch'); - expect(fs.existsSync(patchPath)).toBe(true); - - // Working tree should have conflict markers (not rolled back) - const workingContent = fs.readFileSync( - path.join(tmpDir, 'src', 'index.ts'), - 'utf-8', - ); - expect(workingContent).toContain('<<<<<<<'); - expect(workingContent).toContain('>>>>>>>'); - - // State should NOT be updated yet (conflicts pending) - const stateContent = fs.readFileSync( - path.join(tmpDir, '.nanoclaw', 'state.yaml'), - 'utf-8', - ); - const state = parse(stateContent); - expect(state.rebased_at).toBeUndefined(); - }); - - it('error when no skills applied', async () => { - // State has no applied skills (created by createMinimalState) - initGitRepo(tmpDir); - - const result = await rebase(); - - expect(result.success).toBe(false); - expect(result.error).toContain('No skills applied'); - expect(result.filesInPatch).toBe(0); - }); -}); diff --git a/skills-engine/__tests__/replay.test.ts b/skills-engine/__tests__/replay.test.ts deleted file mode 100644 index 9d0aa34..0000000 --- a/skills-engine/__tests__/replay.test.ts +++ /dev/null @@ -1,297 +0,0 @@ -import fs from 'fs'; -import path from 'path'; -import { afterEach, beforeEach, describe, expect, it } from 'vitest'; - -import { findSkillDir, replaySkills } from '../replay.js'; -import { - cleanup, - createMinimalState, - createSkillPackage, - createTempDir, - initGitRepo, - setupNanoclawDir, -} from './test-helpers.js'; - -describe('replay', () => { - let tmpDir: string; - const originalCwd = process.cwd(); - - beforeEach(() => { - tmpDir = createTempDir(); - setupNanoclawDir(tmpDir); - createMinimalState(tmpDir); - initGitRepo(tmpDir); - process.chdir(tmpDir); - }); - - afterEach(() => { - process.chdir(originalCwd); - cleanup(tmpDir); - }); - 
- describe('findSkillDir', () => { - it('finds skill directory by name', () => { - const skillsRoot = path.join(tmpDir, '.claude', 'skills', 'telegram'); - fs.mkdirSync(skillsRoot, { recursive: true }); - const { stringify } = require('yaml'); - fs.writeFileSync( - path.join(skillsRoot, 'manifest.yaml'), - stringify({ - skill: 'telegram', - version: '1.0.0', - core_version: '1.0.0', - adds: [], - modifies: [], - }), - ); - - const result = findSkillDir('telegram', tmpDir); - expect(result).toBe(skillsRoot); - }); - - it('returns null for missing skill', () => { - const result = findSkillDir('nonexistent', tmpDir); - expect(result).toBeNull(); - }); - - it('returns null when .claude/skills does not exist', () => { - const result = findSkillDir('anything', tmpDir); - expect(result).toBeNull(); - }); - }); - - describe('replaySkills', () => { - it('replays a single skill from base', async () => { - // Set up base file - const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src'); - fs.mkdirSync(baseDir, { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'config.ts'), 'base content\n'); - - // Set up current file (will be overwritten by replay) - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src', 'config.ts'), - 'modified content\n', - ); - - // Create skill package - const skillDir = createSkillPackage(tmpDir, { - skill: 'telegram', - version: '1.0.0', - core_version: '1.0.0', - adds: ['src/telegram.ts'], - modifies: ['src/config.ts'], - addFiles: { 'src/telegram.ts': 'telegram code\n' }, - modifyFiles: { 'src/config.ts': 'base content\ntelegram config\n' }, - }); - - const result = await replaySkills({ - skills: ['telegram'], - skillDirs: { telegram: skillDir }, - projectRoot: tmpDir, - }); - - expect(result.success).toBe(true); - expect(result.perSkill.telegram.success).toBe(true); - - // Added file should exist - expect(fs.existsSync(path.join(tmpDir, 'src', 'telegram.ts'))).toBe(true); - expect( - 
fs.readFileSync(path.join(tmpDir, 'src', 'telegram.ts'), 'utf-8'), - ).toBe('telegram code\n'); - - // Modified file should be merged from base - const config = fs.readFileSync( - path.join(tmpDir, 'src', 'config.ts'), - 'utf-8', - ); - expect(config).toContain('telegram config'); - }); - - it('replays two skills in order', async () => { - // Set up base - const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src'); - fs.mkdirSync(baseDir, { recursive: true }); - fs.writeFileSync( - path.join(baseDir, 'config.ts'), - 'line1\nline2\nline3\nline4\nline5\n', - ); - - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src', 'config.ts'), - 'line1\nline2\nline3\nline4\nline5\n', - ); - - // Skill 1 adds at top - const skill1Dir = createSkillPackage(tmpDir, { - skill: 'telegram', - version: '1.0.0', - core_version: '1.0.0', - adds: ['src/telegram.ts'], - modifies: ['src/config.ts'], - addFiles: { 'src/telegram.ts': 'tg code' }, - modifyFiles: { - 'src/config.ts': - 'telegram import\nline1\nline2\nline3\nline4\nline5\n', - }, - dirName: 'skill-pkg-tg', - }); - - // Skill 2 adds at bottom - const skill2Dir = createSkillPackage(tmpDir, { - skill: 'discord', - version: '1.0.0', - core_version: '1.0.0', - adds: ['src/discord.ts'], - modifies: ['src/config.ts'], - addFiles: { 'src/discord.ts': 'dc code' }, - modifyFiles: { - 'src/config.ts': - 'line1\nline2\nline3\nline4\nline5\ndiscord import\n', - }, - dirName: 'skill-pkg-dc', - }); - - const result = await replaySkills({ - skills: ['telegram', 'discord'], - skillDirs: { telegram: skill1Dir, discord: skill2Dir }, - projectRoot: tmpDir, - }); - - expect(result.success).toBe(true); - expect(result.perSkill.telegram.success).toBe(true); - expect(result.perSkill.discord.success).toBe(true); - - // Both added files should exist - expect(fs.existsSync(path.join(tmpDir, 'src', 'telegram.ts'))).toBe(true); - expect(fs.existsSync(path.join(tmpDir, 'src', 'discord.ts'))).toBe(true); 
- - // Config should have both changes - const config = fs.readFileSync( - path.join(tmpDir, 'src', 'config.ts'), - 'utf-8', - ); - expect(config).toContain('telegram import'); - expect(config).toContain('discord import'); - }); - - it('stops on first conflict and does not process later skills', async () => { - // After reset, current=base. Skill 1 merges cleanly (changes line 1). - // Skill 2 also changes line 1 differently → conflict with skill 1's result. - // Skill 3 should NOT be processed due to break-on-conflict. - const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src'); - fs.mkdirSync(baseDir, { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'config.ts'), 'line1\n'); - - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(tmpDir, 'src', 'config.ts'), 'line1\n'); - - // Skill 1: changes line 1 — merges cleanly since current=base after reset - const skill1Dir = createSkillPackage(tmpDir, { - skill: 'skill-a', - version: '1.0.0', - core_version: '1.0.0', - adds: [], - modifies: ['src/config.ts'], - modifyFiles: { 'src/config.ts': 'line1-from-skill-a\n' }, - dirName: 'skill-pkg-a', - }); - - // Skill 2: also changes line 1 differently → conflict with skill-a's result - const skill2Dir = createSkillPackage(tmpDir, { - skill: 'skill-b', - version: '1.0.0', - core_version: '1.0.0', - adds: [], - modifies: ['src/config.ts'], - modifyFiles: { 'src/config.ts': 'line1-from-skill-b\n' }, - dirName: 'skill-pkg-b', - }); - - // Skill 3: adds a new file — should be skipped - const skill3Dir = createSkillPackage(tmpDir, { - skill: 'skill-c', - version: '1.0.0', - core_version: '1.0.0', - adds: ['src/newfile.ts'], - modifies: [], - addFiles: { 'src/newfile.ts': 'should not appear' }, - dirName: 'skill-pkg-c', - }); - - const result = await replaySkills({ - skills: ['skill-a', 'skill-b', 'skill-c'], - skillDirs: { - 'skill-a': skill1Dir, - 'skill-b': skill2Dir, - 'skill-c': skill3Dir, - }, - projectRoot: tmpDir, - }); - 
- expect(result.success).toBe(false); - expect(result.mergeConflicts).toBeDefined(); - expect(result.mergeConflicts!.length).toBeGreaterThan(0); - // Skill B caused the conflict - expect(result.perSkill['skill-b']?.success).toBe(false); - // Skill C should NOT have been processed - expect(result.perSkill['skill-c']).toBeUndefined(); - }); - - it('returns error for missing skill dir', async () => { - const result = await replaySkills({ - skills: ['missing'], - skillDirs: {}, - projectRoot: tmpDir, - }); - - expect(result.success).toBe(false); - expect(result.error).toContain('missing'); - expect(result.perSkill.missing.success).toBe(false); - }); - - it('resets files to base before replay', async () => { - // Set up base - const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src'); - fs.mkdirSync(baseDir, { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'config.ts'), 'base content\n'); - - // Current has drift - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src', 'config.ts'), - 'drifted content\n', - ); - - // Also a stale added file - fs.writeFileSync( - path.join(tmpDir, 'src', 'stale-add.ts'), - 'should be removed', - ); - - const skillDir = createSkillPackage(tmpDir, { - skill: 'skill1', - version: '1.0.0', - core_version: '1.0.0', - adds: ['src/stale-add.ts'], - modifies: ['src/config.ts'], - addFiles: { 'src/stale-add.ts': 'fresh add' }, - modifyFiles: { 'src/config.ts': 'base content\nskill addition\n' }, - }); - - const result = await replaySkills({ - skills: ['skill1'], - skillDirs: { skill1: skillDir }, - projectRoot: tmpDir, - }); - - expect(result.success).toBe(true); - - // The added file should have the fresh content (not stale) - expect( - fs.readFileSync(path.join(tmpDir, 'src', 'stale-add.ts'), 'utf-8'), - ).toBe('fresh add'); - }); - }); -}); diff --git a/skills-engine/__tests__/run-migrations.test.ts b/skills-engine/__tests__/run-migrations.test.ts deleted file mode 100644 
index bc208ac..0000000 --- a/skills-engine/__tests__/run-migrations.test.ts +++ /dev/null @@ -1,235 +0,0 @@ -import { execFileSync } from 'child_process'; -import fs from 'fs'; -import path from 'path'; -import { afterEach, beforeEach, describe, expect, it } from 'vitest'; - -import { cleanup, createTempDir } from './test-helpers.js'; - -describe('run-migrations', () => { - let tmpDir: string; - let newCoreDir: string; - const scriptPath = path.resolve('scripts/run-migrations.ts'); - const tsxBin = path.resolve('node_modules/.bin/tsx'); - - beforeEach(() => { - tmpDir = createTempDir(); - newCoreDir = path.join(tmpDir, 'new-core'); - fs.mkdirSync(newCoreDir, { recursive: true }); - }); - - afterEach(() => { - cleanup(tmpDir); - }); - - function createMigration(version: string, code: string): void { - const migDir = path.join(newCoreDir, 'migrations', version); - fs.mkdirSync(migDir, { recursive: true }); - fs.writeFileSync(path.join(migDir, 'index.ts'), code); - } - - function runMigrations( - from: string, - to: string, - ): { stdout: string; exitCode: number } { - try { - const stdout = execFileSync(tsxBin, [scriptPath, from, to, newCoreDir], { - cwd: tmpDir, - encoding: 'utf-8', - stdio: 'pipe', - timeout: 30_000, - }); - return { stdout, exitCode: 0 }; - } catch (err: any) { - return { stdout: err.stdout ?? '', exitCode: err.status ?? 
1 }; - } - } - - it('outputs empty results when no migrations directory exists', () => { - const { stdout, exitCode } = runMigrations('1.0.0', '2.0.0'); - const result = JSON.parse(stdout); - - expect(exitCode).toBe(0); - expect(result.migrationsRun).toBe(0); - expect(result.results).toEqual([]); - }); - - it('outputs empty results when migrations dir exists but is empty', () => { - fs.mkdirSync(path.join(newCoreDir, 'migrations'), { recursive: true }); - - const { stdout, exitCode } = runMigrations('1.0.0', '2.0.0'); - const result = JSON.parse(stdout); - - expect(exitCode).toBe(0); - expect(result.migrationsRun).toBe(0); - }); - - it('runs migrations in the correct version range', () => { - // Create a marker file when the migration runs - createMigration( - '1.1.0', - ` -import fs from 'fs'; -import path from 'path'; -const root = process.argv[2]; -fs.writeFileSync(path.join(root, 'migrated-1.1.0'), 'done'); -`, - ); - createMigration( - '1.2.0', - ` -import fs from 'fs'; -import path from 'path'; -const root = process.argv[2]; -fs.writeFileSync(path.join(root, 'migrated-1.2.0'), 'done'); -`, - ); - // This one should NOT run (outside range) - createMigration( - '2.1.0', - ` -import fs from 'fs'; -import path from 'path'; -const root = process.argv[2]; -fs.writeFileSync(path.join(root, 'migrated-2.1.0'), 'done'); -`, - ); - - const { stdout, exitCode } = runMigrations('1.0.0', '2.0.0'); - const result = JSON.parse(stdout); - - expect(exitCode).toBe(0); - expect(result.migrationsRun).toBe(2); - expect(result.results[0].version).toBe('1.1.0'); - expect(result.results[0].success).toBe(true); - expect(result.results[1].version).toBe('1.2.0'); - expect(result.results[1].success).toBe(true); - - // Verify the migrations actually ran - expect(fs.existsSync(path.join(tmpDir, 'migrated-1.1.0'))).toBe(true); - expect(fs.existsSync(path.join(tmpDir, 'migrated-1.2.0'))).toBe(true); - // 2.1.0 is outside range - expect(fs.existsSync(path.join(tmpDir, 
'migrated-2.1.0'))).toBe(false); - }); - - it('excludes the from-version (only runs > from)', () => { - createMigration( - '1.0.0', - ` -import fs from 'fs'; -import path from 'path'; -const root = process.argv[2]; -fs.writeFileSync(path.join(root, 'migrated-1.0.0'), 'done'); -`, - ); - createMigration( - '1.1.0', - ` -import fs from 'fs'; -import path from 'path'; -const root = process.argv[2]; -fs.writeFileSync(path.join(root, 'migrated-1.1.0'), 'done'); -`, - ); - - const { stdout } = runMigrations('1.0.0', '1.1.0'); - const result = JSON.parse(stdout); - - expect(result.migrationsRun).toBe(1); - expect(result.results[0].version).toBe('1.1.0'); - // 1.0.0 should NOT have run - expect(fs.existsSync(path.join(tmpDir, 'migrated-1.0.0'))).toBe(false); - }); - - it('includes the to-version (<= to)', () => { - createMigration( - '2.0.0', - ` -import fs from 'fs'; -import path from 'path'; -const root = process.argv[2]; -fs.writeFileSync(path.join(root, 'migrated-2.0.0'), 'done'); -`, - ); - - const { stdout } = runMigrations('1.0.0', '2.0.0'); - const result = JSON.parse(stdout); - - expect(result.migrationsRun).toBe(1); - expect(result.results[0].version).toBe('2.0.0'); - expect(result.results[0].success).toBe(true); - }); - - it('runs migrations in semver ascending order', () => { - // Create them in non-sorted order - for (const v of ['1.3.0', '1.1.0', '1.2.0']) { - createMigration( - v, - ` -import fs from 'fs'; -import path from 'path'; -const root = process.argv[2]; -const log = path.join(root, 'migration-order.log'); -const existing = fs.existsSync(log) ? 
fs.readFileSync(log, 'utf-8') : ''; -fs.writeFileSync(log, existing + '${v}\\n'); -`, - ); - } - - const { stdout } = runMigrations('1.0.0', '2.0.0'); - const result = JSON.parse(stdout); - - expect(result.migrationsRun).toBe(3); - expect(result.results.map((r: any) => r.version)).toEqual([ - '1.1.0', - '1.2.0', - '1.3.0', - ]); - - // Verify execution order from the log file - const log = fs.readFileSync( - path.join(tmpDir, 'migration-order.log'), - 'utf-8', - ); - expect(log.trim()).toBe('1.1.0\n1.2.0\n1.3.0'); - }); - - it('reports failure and exits non-zero when a migration throws', () => { - createMigration( - '1.1.0', - `throw new Error('migration failed intentionally');`, - ); - - const { stdout, exitCode } = runMigrations('1.0.0', '2.0.0'); - const result = JSON.parse(stdout); - - expect(exitCode).toBe(1); - expect(result.migrationsRun).toBe(1); - expect(result.results[0].success).toBe(false); - expect(result.results[0].error).toBeDefined(); - }); - - it('ignores non-semver directories in migrations/', () => { - fs.mkdirSync(path.join(newCoreDir, 'migrations', 'README'), { - recursive: true, - }); - fs.mkdirSync(path.join(newCoreDir, 'migrations', 'utils'), { - recursive: true, - }); - createMigration( - '1.1.0', - ` -import fs from 'fs'; -import path from 'path'; -const root = process.argv[2]; -fs.writeFileSync(path.join(root, 'migrated-1.1.0'), 'done'); -`, - ); - - const { stdout, exitCode } = runMigrations('1.0.0', '2.0.0'); - const result = JSON.parse(stdout); - - expect(exitCode).toBe(0); - expect(result.migrationsRun).toBe(1); - expect(result.results[0].version).toBe('1.1.0'); - }); -}); diff --git a/skills-engine/__tests__/state.test.ts b/skills-engine/__tests__/state.test.ts deleted file mode 100644 index e4cdbb1..0000000 --- a/skills-engine/__tests__/state.test.ts +++ /dev/null @@ -1,122 +0,0 @@ -import { describe, it, expect, beforeEach, afterEach } from 'vitest'; -import fs from 'fs'; -import path from 'path'; -import { - readState, - 
writeState, - recordSkillApplication, - computeFileHash, - compareSemver, - recordCustomModification, - getCustomModifications, -} from '../state.js'; -import { - createTempDir, - setupNanoclawDir, - createMinimalState, - writeState as writeStateHelper, - cleanup, -} from './test-helpers.js'; - -describe('state', () => { - let tmpDir: string; - const originalCwd = process.cwd(); - - beforeEach(() => { - tmpDir = createTempDir(); - setupNanoclawDir(tmpDir); - process.chdir(tmpDir); - }); - - afterEach(() => { - process.chdir(originalCwd); - cleanup(tmpDir); - }); - - it('readState/writeState roundtrip', () => { - const state = { - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }; - writeState(state); - const result = readState(); - expect(result.skills_system_version).toBe('0.1.0'); - expect(result.core_version).toBe('1.0.0'); - expect(result.applied_skills).toEqual([]); - }); - - it('readState throws when no state file exists', () => { - expect(() => readState()).toThrow(); - }); - - it('readState throws when version is newer than current', () => { - writeStateHelper(tmpDir, { - skills_system_version: '99.0.0', - core_version: '1.0.0', - applied_skills: [], - }); - expect(() => readState()).toThrow(); - }); - - it('recordSkillApplication adds a skill', () => { - createMinimalState(tmpDir); - recordSkillApplication('my-skill', '1.0.0', { 'src/foo.ts': 'abc123' }); - const state = readState(); - expect(state.applied_skills).toHaveLength(1); - expect(state.applied_skills[0].name).toBe('my-skill'); - expect(state.applied_skills[0].version).toBe('1.0.0'); - expect(state.applied_skills[0].file_hashes).toEqual({ - 'src/foo.ts': 'abc123', - }); - }); - - it('re-applying same skill replaces it', () => { - createMinimalState(tmpDir); - recordSkillApplication('my-skill', '1.0.0', { 'a.ts': 'hash1' }); - recordSkillApplication('my-skill', '2.0.0', { 'a.ts': 'hash2' }); - const state = readState(); - 
expect(state.applied_skills).toHaveLength(1); - expect(state.applied_skills[0].version).toBe('2.0.0'); - expect(state.applied_skills[0].file_hashes).toEqual({ 'a.ts': 'hash2' }); - }); - - it('computeFileHash produces consistent sha256', () => { - const filePath = path.join(tmpDir, 'hashtest.txt'); - fs.writeFileSync(filePath, 'hello world'); - const hash1 = computeFileHash(filePath); - const hash2 = computeFileHash(filePath); - expect(hash1).toBe(hash2); - expect(hash1).toMatch(/^[a-f0-9]{64}$/); - }); - - describe('compareSemver', () => { - it('1.0.0 < 1.1.0', () => { - expect(compareSemver('1.0.0', '1.1.0')).toBeLessThan(0); - }); - - it('0.9.0 < 0.10.0', () => { - expect(compareSemver('0.9.0', '0.10.0')).toBeLessThan(0); - }); - - it('1.0.0 = 1.0.0', () => { - expect(compareSemver('1.0.0', '1.0.0')).toBe(0); - }); - }); - - it('recordCustomModification adds to array', () => { - createMinimalState(tmpDir); - recordCustomModification('tweak', ['src/a.ts'], 'custom/001-tweak.patch'); - const mods = getCustomModifications(); - expect(mods).toHaveLength(1); - expect(mods[0].description).toBe('tweak'); - expect(mods[0].files_modified).toEqual(['src/a.ts']); - expect(mods[0].patch_file).toBe('custom/001-tweak.patch'); - }); - - it('getCustomModifications returns empty when none recorded', () => { - createMinimalState(tmpDir); - const mods = getCustomModifications(); - expect(mods).toEqual([]); - }); -}); diff --git a/skills-engine/__tests__/structured.test.ts b/skills-engine/__tests__/structured.test.ts deleted file mode 100644 index 1d98f27..0000000 --- a/skills-engine/__tests__/structured.test.ts +++ /dev/null @@ -1,243 +0,0 @@ -import { describe, it, expect, beforeEach, afterEach } from 'vitest'; -import fs from 'fs'; -import path from 'path'; -import { - areRangesCompatible, - mergeNpmDependencies, - mergeEnvAdditions, - mergeDockerComposeServices, -} from '../structured.js'; -import { createTempDir, cleanup } from './test-helpers.js'; - -describe('structured', () 
=> { - let tmpDir: string; - const originalCwd = process.cwd(); - - beforeEach(() => { - tmpDir = createTempDir(); - process.chdir(tmpDir); - }); - - afterEach(() => { - process.chdir(originalCwd); - cleanup(tmpDir); - }); - - describe('areRangesCompatible', () => { - it('identical versions are compatible', () => { - const result = areRangesCompatible('^1.0.0', '^1.0.0'); - expect(result.compatible).toBe(true); - }); - - it('compatible ^ ranges resolve to higher', () => { - const result = areRangesCompatible('^1.0.0', '^1.1.0'); - expect(result.compatible).toBe(true); - expect(result.resolved).toBe('^1.1.0'); - }); - - it('incompatible major ^ ranges', () => { - const result = areRangesCompatible('^1.0.0', '^2.0.0'); - expect(result.compatible).toBe(false); - }); - - it('compatible ~ ranges', () => { - const result = areRangesCompatible('~1.0.0', '~1.0.3'); - expect(result.compatible).toBe(true); - expect(result.resolved).toBe('~1.0.3'); - }); - - it('mismatched prefixes are incompatible', () => { - const result = areRangesCompatible('^1.0.0', '~1.0.0'); - expect(result.compatible).toBe(false); - }); - - it('handles double-digit version parts numerically', () => { - // ^1.9.0 vs ^1.10.0 — 10 > 9 numerically, but "9" > "10" as strings - const result = areRangesCompatible('^1.9.0', '^1.10.0'); - expect(result.compatible).toBe(true); - expect(result.resolved).toBe('^1.10.0'); - }); - - it('handles double-digit patch versions', () => { - const result = areRangesCompatible('~1.0.9', '~1.0.10'); - expect(result.compatible).toBe(true); - expect(result.resolved).toBe('~1.0.10'); - }); - }); - - describe('mergeNpmDependencies', () => { - it('adds new dependencies', () => { - const pkgPath = path.join(tmpDir, 'package.json'); - fs.writeFileSync( - pkgPath, - JSON.stringify( - { - name: 'test', - dependencies: { existing: '^1.0.0' }, - }, - null, - 2, - ), - ); - - mergeNpmDependencies(pkgPath, { newdep: '^2.0.0' }); - - const pkg = JSON.parse(fs.readFileSync(pkgPath, 
'utf-8')); - expect(pkg.dependencies.newdep).toBe('^2.0.0'); - expect(pkg.dependencies.existing).toBe('^1.0.0'); - }); - - it('resolves compatible ^ ranges', () => { - const pkgPath = path.join(tmpDir, 'package.json'); - fs.writeFileSync( - pkgPath, - JSON.stringify( - { - name: 'test', - dependencies: { dep: '^1.0.0' }, - }, - null, - 2, - ), - ); - - mergeNpmDependencies(pkgPath, { dep: '^1.1.0' }); - - const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf-8')); - expect(pkg.dependencies.dep).toBe('^1.1.0'); - }); - - it('sorts devDependencies after merge', () => { - const pkgPath = path.join(tmpDir, 'package.json'); - fs.writeFileSync( - pkgPath, - JSON.stringify( - { - name: 'test', - dependencies: {}, - devDependencies: { zlib: '^1.0.0', acorn: '^2.0.0' }, - }, - null, - 2, - ), - ); - - mergeNpmDependencies(pkgPath, { middle: '^1.0.0' }); - - const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf-8')); - const devKeys = Object.keys(pkg.devDependencies); - expect(devKeys).toEqual(['acorn', 'zlib']); - }); - - it('throws on incompatible major versions', () => { - const pkgPath = path.join(tmpDir, 'package.json'); - fs.writeFileSync( - pkgPath, - JSON.stringify( - { - name: 'test', - dependencies: { dep: '^1.0.0' }, - }, - null, - 2, - ), - ); - - expect(() => mergeNpmDependencies(pkgPath, { dep: '^2.0.0' })).toThrow(); - }); - }); - - describe('mergeEnvAdditions', () => { - it('adds new variables', () => { - const envPath = path.join(tmpDir, '.env.example'); - fs.writeFileSync(envPath, 'EXISTING_VAR=value\n'); - - mergeEnvAdditions(envPath, ['NEW_VAR']); - - const content = fs.readFileSync(envPath, 'utf-8'); - expect(content).toContain('NEW_VAR='); - expect(content).toContain('EXISTING_VAR=value'); - }); - - it('skips existing variables', () => { - const envPath = path.join(tmpDir, '.env.example'); - fs.writeFileSync(envPath, 'MY_VAR=original\n'); - - mergeEnvAdditions(envPath, ['MY_VAR']); - - const content = fs.readFileSync(envPath, 'utf-8'); - // Should not add 
duplicate - only 1 occurrence of MY_VAR= - const matches = content.match(/MY_VAR=/g); - expect(matches).toHaveLength(1); - }); - - it('recognizes lowercase and mixed-case env vars as existing', () => { - const envPath = path.join(tmpDir, '.env.example'); - fs.writeFileSync(envPath, 'my_lower_var=value\nMixed_Case=abc\n'); - - mergeEnvAdditions(envPath, ['my_lower_var', 'Mixed_Case']); - - const content = fs.readFileSync(envPath, 'utf-8'); - // Should not add duplicates - const lowerMatches = content.match(/my_lower_var=/g); - expect(lowerMatches).toHaveLength(1); - const mixedMatches = content.match(/Mixed_Case=/g); - expect(mixedMatches).toHaveLength(1); - }); - - it('creates file if it does not exist', () => { - const envPath = path.join(tmpDir, '.env.example'); - mergeEnvAdditions(envPath, ['NEW_VAR']); - - expect(fs.existsSync(envPath)).toBe(true); - const content = fs.readFileSync(envPath, 'utf-8'); - expect(content).toContain('NEW_VAR='); - }); - }); - - describe('mergeDockerComposeServices', () => { - it('adds new services', () => { - const composePath = path.join(tmpDir, 'docker-compose.yaml'); - fs.writeFileSync( - composePath, - 'version: "3"\nservices:\n web:\n image: nginx\n', - ); - - mergeDockerComposeServices(composePath, { - redis: { image: 'redis:7' }, - }); - - const content = fs.readFileSync(composePath, 'utf-8'); - expect(content).toContain('redis'); - }); - - it('skips existing services', () => { - const composePath = path.join(tmpDir, 'docker-compose.yaml'); - fs.writeFileSync( - composePath, - 'version: "3"\nservices:\n web:\n image: nginx\n', - ); - - mergeDockerComposeServices(composePath, { - web: { image: 'apache' }, - }); - - const content = fs.readFileSync(composePath, 'utf-8'); - expect(content).toContain('nginx'); - }); - - it('throws on port collision', () => { - const composePath = path.join(tmpDir, 'docker-compose.yaml'); - fs.writeFileSync( - composePath, - 'version: "3"\nservices:\n web:\n image: nginx\n ports:\n - "8080:80"\n', 
- ); - - expect(() => - mergeDockerComposeServices(composePath, { - api: { image: 'node', ports: ['8080:3000'] }, - }), - ).toThrow(); - }); - }); -}); diff --git a/skills-engine/__tests__/test-helpers.ts b/skills-engine/__tests__/test-helpers.ts deleted file mode 100644 index bd3db0b..0000000 --- a/skills-engine/__tests__/test-helpers.ts +++ /dev/null @@ -1,108 +0,0 @@ -import { execSync } from 'child_process'; -import fs from 'fs'; -import os from 'os'; -import path from 'path'; -import { stringify } from 'yaml'; - -export function createTempDir(): string { - return fs.mkdtempSync(path.join(os.tmpdir(), 'nanoclaw-test-')); -} - -export function setupNanoclawDir(tmpDir: string): void { - fs.mkdirSync(path.join(tmpDir, '.nanoclaw', 'base', 'src'), { - recursive: true, - }); - fs.mkdirSync(path.join(tmpDir, '.nanoclaw', 'backup'), { recursive: true }); -} - -export function writeState(tmpDir: string, state: any): void { - const statePath = path.join(tmpDir, '.nanoclaw', 'state.yaml'); - fs.writeFileSync(statePath, stringify(state), 'utf-8'); -} - -export function createMinimalState(tmpDir: string): void { - writeState(tmpDir, { - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); -} - -export function createSkillPackage( - tmpDir: string, - opts: { - skill?: string; - version?: string; - core_version?: string; - adds?: string[]; - modifies?: string[]; - addFiles?: Record; - modifyFiles?: Record; - conflicts?: string[]; - depends?: string[]; - test?: string; - structured?: any; - file_ops?: any[]; - post_apply?: string[]; - min_skills_system_version?: string; - dirName?: string; - }, -): string { - const skillDir = path.join(tmpDir, opts.dirName ?? 'skill-pkg'); - fs.mkdirSync(skillDir, { recursive: true }); - - const manifest: Record = { - skill: opts.skill ?? 'test-skill', - version: opts.version ?? '1.0.0', - description: 'Test skill', - core_version: opts.core_version ?? '1.0.0', - adds: opts.adds ?? 
[], - modifies: opts.modifies ?? [], - conflicts: opts.conflicts ?? [], - depends: opts.depends ?? [], - test: opts.test, - structured: opts.structured, - file_ops: opts.file_ops, - }; - if (opts.post_apply) manifest.post_apply = opts.post_apply; - if (opts.min_skills_system_version) - manifest.min_skills_system_version = opts.min_skills_system_version; - - fs.writeFileSync(path.join(skillDir, 'manifest.yaml'), stringify(manifest)); - - if (opts.addFiles) { - const addDir = path.join(skillDir, 'add'); - for (const [relPath, content] of Object.entries(opts.addFiles)) { - const fullPath = path.join(addDir, relPath); - fs.mkdirSync(path.dirname(fullPath), { recursive: true }); - fs.writeFileSync(fullPath, content); - } - } - - if (opts.modifyFiles) { - const modDir = path.join(skillDir, 'modify'); - for (const [relPath, content] of Object.entries(opts.modifyFiles)) { - const fullPath = path.join(modDir, relPath); - fs.mkdirSync(path.dirname(fullPath), { recursive: true }); - fs.writeFileSync(fullPath, content); - } - } - - return skillDir; -} - -export function initGitRepo(dir: string): void { - execSync('git init', { cwd: dir, stdio: 'pipe' }); - execSync('git config user.email "test@test.com"', { - cwd: dir, - stdio: 'pipe', - }); - execSync('git config user.name "Test"', { cwd: dir, stdio: 'pipe' }); - execSync('git config rerere.enabled true', { cwd: dir, stdio: 'pipe' }); - fs.writeFileSync(path.join(dir, '.gitignore'), 'node_modules\n'); - execSync('git add -A && git commit -m "init"', { cwd: dir, stdio: 'pipe' }); -} - -export function cleanup(dir: string): void { - fs.rmSync(dir, { recursive: true, force: true }); -} diff --git a/skills-engine/__tests__/uninstall.test.ts b/skills-engine/__tests__/uninstall.test.ts deleted file mode 100644 index 7bb24fd..0000000 --- a/skills-engine/__tests__/uninstall.test.ts +++ /dev/null @@ -1,259 +0,0 @@ -import fs from 'fs'; -import path from 'path'; -import { afterEach, beforeEach, describe, expect, it } from 'vitest'; 
-import { stringify } from 'yaml'; - -import { uninstallSkill } from '../uninstall.js'; -import { - cleanup, - createTempDir, - initGitRepo, - setupNanoclawDir, - writeState, -} from './test-helpers.js'; - -describe('uninstall', () => { - let tmpDir: string; - const originalCwd = process.cwd(); - - beforeEach(() => { - tmpDir = createTempDir(); - setupNanoclawDir(tmpDir); - initGitRepo(tmpDir); - process.chdir(tmpDir); - }); - - afterEach(() => { - process.chdir(originalCwd); - cleanup(tmpDir); - }); - - function setupSkillPackage( - name: string, - opts: { - adds?: Record; - modifies?: Record; - modifiesBase?: Record; - } = {}, - ): void { - const skillDir = path.join(tmpDir, '.claude', 'skills', name); - fs.mkdirSync(skillDir, { recursive: true }); - - const addsList = Object.keys(opts.adds ?? {}); - const modifiesList = Object.keys(opts.modifies ?? {}); - - fs.writeFileSync( - path.join(skillDir, 'manifest.yaml'), - stringify({ - skill: name, - version: '1.0.0', - core_version: '1.0.0', - adds: addsList, - modifies: modifiesList, - }), - ); - - if (opts.adds) { - const addDir = path.join(skillDir, 'add'); - for (const [relPath, content] of Object.entries(opts.adds)) { - const fullPath = path.join(addDir, relPath); - fs.mkdirSync(path.dirname(fullPath), { recursive: true }); - fs.writeFileSync(fullPath, content); - } - } - - if (opts.modifies) { - const modDir = path.join(skillDir, 'modify'); - for (const [relPath, content] of Object.entries(opts.modifies)) { - const fullPath = path.join(modDir, relPath); - fs.mkdirSync(path.dirname(fullPath), { recursive: true }); - fs.writeFileSync(fullPath, content); - } - } - } - - it('returns error for non-applied skill', async () => { - writeState(tmpDir, { - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - const result = await uninstallSkill('nonexistent'); - expect(result.success).toBe(false); - expect(result.error).toContain('not applied'); - }); - - it('blocks uninstall after 
rebase', async () => { - writeState(tmpDir, { - skills_system_version: '0.1.0', - core_version: '1.0.0', - rebased_at: new Date().toISOString(), - applied_skills: [ - { - name: 'telegram', - version: '1.0.0', - applied_at: new Date().toISOString(), - file_hashes: { 'src/config.ts': 'abc' }, - }, - ], - }); - - const result = await uninstallSkill('telegram'); - expect(result.success).toBe(false); - expect(result.error).toContain('Cannot uninstall'); - expect(result.error).toContain('after rebase'); - }); - - it('returns custom patch warning', async () => { - writeState(tmpDir, { - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [ - { - name: 'telegram', - version: '1.0.0', - applied_at: new Date().toISOString(), - file_hashes: {}, - custom_patch: '.nanoclaw/custom/001.patch', - custom_patch_description: 'My tweak', - }, - ], - }); - - const result = await uninstallSkill('telegram'); - expect(result.success).toBe(false); - expect(result.customPatchWarning).toContain('custom patch'); - expect(result.customPatchWarning).toContain('My tweak'); - }); - - it('uninstalls only skill → files reset to base', async () => { - // Set up base - const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src'); - fs.mkdirSync(baseDir, { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'config.ts'), 'base config\n'); - - // Set up current files (as if skill was applied) - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src', 'config.ts'), - 'base config\ntelegram config\n', - ); - fs.writeFileSync( - path.join(tmpDir, 'src', 'telegram.ts'), - 'telegram code\n', - ); - - // Set up skill package in .claude/skills/ - setupSkillPackage('telegram', { - adds: { 'src/telegram.ts': 'telegram code\n' }, - modifies: { - 'src/config.ts': 'base config\ntelegram config\n', - }, - }); - - writeState(tmpDir, { - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [ - { - name: 'telegram', 
- version: '1.0.0', - applied_at: new Date().toISOString(), - file_hashes: { - 'src/config.ts': 'abc', - 'src/telegram.ts': 'def', - }, - }, - ], - }); - - const result = await uninstallSkill('telegram'); - expect(result.success).toBe(true); - expect(result.skill).toBe('telegram'); - - // config.ts should be reset to base - expect( - fs.readFileSync(path.join(tmpDir, 'src', 'config.ts'), 'utf-8'), - ).toBe('base config\n'); - - // telegram.ts (add-only) should be removed - expect(fs.existsSync(path.join(tmpDir, 'src', 'telegram.ts'))).toBe(false); - }); - - it('uninstalls one of two → other preserved', async () => { - // Set up base - const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src'); - fs.mkdirSync(baseDir, { recursive: true }); - fs.writeFileSync( - path.join(baseDir, 'config.ts'), - 'line1\nline2\nline3\nline4\nline5\n', - ); - - // Current has both skills applied - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src', 'config.ts'), - 'telegram import\nline1\nline2\nline3\nline4\nline5\ndiscord import\n', - ); - fs.writeFileSync(path.join(tmpDir, 'src', 'telegram.ts'), 'tg code\n'); - fs.writeFileSync(path.join(tmpDir, 'src', 'discord.ts'), 'dc code\n'); - - // Set up both skill packages - setupSkillPackage('telegram', { - adds: { 'src/telegram.ts': 'tg code\n' }, - modifies: { - 'src/config.ts': 'telegram import\nline1\nline2\nline3\nline4\nline5\n', - }, - }); - - setupSkillPackage('discord', { - adds: { 'src/discord.ts': 'dc code\n' }, - modifies: { - 'src/config.ts': 'line1\nline2\nline3\nline4\nline5\ndiscord import\n', - }, - }); - - writeState(tmpDir, { - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [ - { - name: 'telegram', - version: '1.0.0', - applied_at: new Date().toISOString(), - file_hashes: { - 'src/config.ts': 'abc', - 'src/telegram.ts': 'def', - }, - }, - { - name: 'discord', - version: '1.0.0', - applied_at: new Date().toISOString(), - file_hashes: { 
- 'src/config.ts': 'ghi', - 'src/discord.ts': 'jkl', - }, - }, - ], - }); - - const result = await uninstallSkill('telegram'); - expect(result.success).toBe(true); - - // discord.ts should still exist - expect(fs.existsSync(path.join(tmpDir, 'src', 'discord.ts'))).toBe(true); - - // telegram.ts should be gone - expect(fs.existsSync(path.join(tmpDir, 'src', 'telegram.ts'))).toBe(false); - - // config should have discord import but not telegram - const config = fs.readFileSync( - path.join(tmpDir, 'src', 'config.ts'), - 'utf-8', - ); - expect(config).toContain('discord import'); - expect(config).not.toContain('telegram import'); - }); -}); diff --git a/skills-engine/__tests__/update-core-cli.test.ts b/skills-engine/__tests__/update-core-cli.test.ts deleted file mode 100644 index c95e65d..0000000 --- a/skills-engine/__tests__/update-core-cli.test.ts +++ /dev/null @@ -1,137 +0,0 @@ -import { execFileSync } from 'child_process'; -import fs from 'fs'; -import path from 'path'; -import { afterEach, beforeEach, describe, expect, it } from 'vitest'; -import { stringify } from 'yaml'; - -import { - cleanup, - createTempDir, - initGitRepo, - setupNanoclawDir, -} from './test-helpers.js'; - -describe('update-core.ts CLI flags', () => { - let tmpDir: string; - const scriptPath = path.resolve('scripts/update-core.ts'); - const tsxBin = path.resolve('node_modules/.bin/tsx'); - - beforeEach(() => { - tmpDir = createTempDir(); - setupNanoclawDir(tmpDir); - initGitRepo(tmpDir); - - // Write state file - const statePath = path.join(tmpDir, '.nanoclaw', 'state.yaml'); - fs.writeFileSync( - statePath, - stringify({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }), - ); - }); - - afterEach(() => { - cleanup(tmpDir); - }); - - function createNewCore(files: Record): string { - const dir = path.join(tmpDir, 'new-core'); - fs.mkdirSync(dir, { recursive: true }); - for (const [relPath, content] of Object.entries(files)) { - const fullPath = path.join(dir, 
relPath); - fs.mkdirSync(path.dirname(fullPath), { recursive: true }); - fs.writeFileSync(fullPath, content); - } - return dir; - } - - it('--json --preview-only outputs JSON preview without applying', () => { - const baseDir = path.join(tmpDir, '.nanoclaw', 'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'original'); - - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(tmpDir, 'src/index.ts'), 'original'); - - const newCoreDir = createNewCore({ - 'src/index.ts': 'updated', - 'package.json': JSON.stringify({ version: '2.0.0' }), - }); - - const stdout = execFileSync( - tsxBin, - [scriptPath, '--json', '--preview-only', newCoreDir], - { cwd: tmpDir, encoding: 'utf-8', stdio: 'pipe', timeout: 30_000 }, - ); - - const preview = JSON.parse(stdout); - - expect(preview.currentVersion).toBe('1.0.0'); - expect(preview.newVersion).toBe('2.0.0'); - expect(preview.filesChanged).toContain('src/index.ts'); - - // File should NOT have been modified (preview only) - expect(fs.readFileSync(path.join(tmpDir, 'src/index.ts'), 'utf-8')).toBe( - 'original', - ); - }); - - it('--preview-only without --json outputs human-readable text', () => { - const newCoreDir = createNewCore({ - 'src/new-file.ts': 'export const x = 1;', - 'package.json': JSON.stringify({ version: '2.0.0' }), - }); - - const stdout = execFileSync( - tsxBin, - [scriptPath, '--preview-only', newCoreDir], - { cwd: tmpDir, encoding: 'utf-8', stdio: 'pipe', timeout: 30_000 }, - ); - - expect(stdout).toContain('Update Preview'); - expect(stdout).toContain('2.0.0'); - // Should NOT contain JSON (it's human-readable mode) - expect(stdout).not.toContain('"currentVersion"'); - }); - - it('--json applies and outputs JSON result', () => { - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(tmpDir, 'src/index.ts'), 'original'); - - const newCoreDir = createNewCore({ - 
'src/index.ts': 'original', - 'package.json': JSON.stringify({ version: '2.0.0' }), - }); - - const stdout = execFileSync(tsxBin, [scriptPath, '--json', newCoreDir], { - cwd: tmpDir, - encoding: 'utf-8', - stdio: 'pipe', - timeout: 30_000, - }); - - const result = JSON.parse(stdout); - - expect(result.success).toBe(true); - expect(result.previousVersion).toBe('1.0.0'); - expect(result.newVersion).toBe('2.0.0'); - }); - - it('exits with error when no path provided', () => { - try { - execFileSync(tsxBin, [scriptPath], { - cwd: tmpDir, - encoding: 'utf-8', - stdio: 'pipe', - timeout: 30_000, - }); - expect.unreachable('Should have exited with error'); - } catch (err: any) { - expect(err.status).toBe(1); - expect(err.stderr).toContain('Usage'); - } - }); -}); diff --git a/skills-engine/__tests__/update.test.ts b/skills-engine/__tests__/update.test.ts deleted file mode 100644 index a4091ed..0000000 --- a/skills-engine/__tests__/update.test.ts +++ /dev/null @@ -1,418 +0,0 @@ -import fs from 'fs'; -import path from 'path'; -import { afterEach, beforeEach, describe, expect, it } from 'vitest'; -import { stringify } from 'yaml'; - -import { - cleanup, - createTempDir, - initGitRepo, - setupNanoclawDir, -} from './test-helpers.js'; - -let tmpDir: string; -const originalCwd = process.cwd(); - -describe('update', () => { - beforeEach(() => { - tmpDir = createTempDir(); - setupNanoclawDir(tmpDir); - initGitRepo(tmpDir); - process.chdir(tmpDir); - }); - - afterEach(() => { - process.chdir(originalCwd); - cleanup(tmpDir); - }); - - function writeStateFile(state: Record): void { - const statePath = path.join(tmpDir, '.nanoclaw', 'state.yaml'); - fs.writeFileSync(statePath, stringify(state), 'utf-8'); - } - - function createNewCoreDir(files: Record): string { - const newCoreDir = path.join(tmpDir, 'new-core'); - fs.mkdirSync(newCoreDir, { recursive: true }); - - for (const [relPath, content] of Object.entries(files)) { - const fullPath = path.join(newCoreDir, relPath); - 
fs.mkdirSync(path.dirname(fullPath), { recursive: true }); - fs.writeFileSync(fullPath, content); - } - - return newCoreDir; - } - - describe('previewUpdate', () => { - it('detects new files in update', async () => { - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - const newCoreDir = createNewCoreDir({ - 'src/new-file.ts': 'export const x = 1;', - }); - - const { previewUpdate } = await import('../update.js'); - const preview = previewUpdate(newCoreDir); - - expect(preview.filesChanged).toContain('src/new-file.ts'); - expect(preview.currentVersion).toBe('1.0.0'); - }); - - it('detects changed files vs base', async () => { - const baseDir = path.join(tmpDir, '.nanoclaw', 'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'original'); - - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - const newCoreDir = createNewCoreDir({ - 'src/index.ts': 'modified', - }); - - const { previewUpdate } = await import('../update.js'); - const preview = previewUpdate(newCoreDir); - - expect(preview.filesChanged).toContain('src/index.ts'); - }); - - it('does not list unchanged files', async () => { - const baseDir = path.join(tmpDir, '.nanoclaw', 'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'same content'); - - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - const newCoreDir = createNewCoreDir({ - 'src/index.ts': 'same content', - }); - - const { previewUpdate } = await import('../update.js'); - const preview = previewUpdate(newCoreDir); - - expect(preview.filesChanged).not.toContain('src/index.ts'); - }); - - it('identifies conflict risk with applied skills', async () => { - const baseDir = path.join(tmpDir, '.nanoclaw', 'base'); - 
fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'original'); - - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [ - { - name: 'telegram', - version: '1.0.0', - applied_at: new Date().toISOString(), - file_hashes: { 'src/index.ts': 'abc123' }, - }, - ], - }); - - const newCoreDir = createNewCoreDir({ - 'src/index.ts': 'updated core', - }); - - const { previewUpdate } = await import('../update.js'); - const preview = previewUpdate(newCoreDir); - - expect(preview.conflictRisk).toContain('src/index.ts'); - }); - - it('identifies custom patches at risk', async () => { - const baseDir = path.join(tmpDir, '.nanoclaw', 'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'src/config.ts'), 'original'); - - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - custom_modifications: [ - { - description: 'custom tweak', - applied_at: new Date().toISOString(), - files_modified: ['src/config.ts'], - patch_file: '.nanoclaw/custom/001-tweak.patch', - }, - ], - }); - - const newCoreDir = createNewCoreDir({ - 'src/config.ts': 'updated core config', - }); - - const { previewUpdate } = await import('../update.js'); - const preview = previewUpdate(newCoreDir); - - expect(preview.customPatchesAtRisk).toContain('src/config.ts'); - }); - - it('reads version from package.json in new core', async () => { - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - const newCoreDir = createNewCoreDir({ - 'package.json': JSON.stringify({ version: '2.0.0' }), - }); - - const { previewUpdate } = await import('../update.js'); - const preview = previewUpdate(newCoreDir); - - expect(preview.newVersion).toBe('2.0.0'); - }); - - it('detects files deleted in new core', async () => { - const baseDir = path.join(tmpDir, '.nanoclaw', 
'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'keep this'); - fs.writeFileSync(path.join(baseDir, 'src/removed.ts'), 'delete this'); - - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - // New core only has index.ts — removed.ts is gone - const newCoreDir = createNewCoreDir({ - 'src/index.ts': 'keep this', - }); - - const { previewUpdate } = await import('../update.js'); - const preview = previewUpdate(newCoreDir); - - expect(preview.filesDeleted).toContain('src/removed.ts'); - expect(preview.filesChanged).not.toContain('src/removed.ts'); - }); - }); - - describe('applyUpdate', () => { - it('rejects when customize session is active', async () => { - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - // Create the pending.yaml that indicates active customize - const customDir = path.join(tmpDir, '.nanoclaw', 'custom'); - fs.mkdirSync(customDir, { recursive: true }); - fs.writeFileSync(path.join(customDir, 'pending.yaml'), 'active: true'); - - const newCoreDir = createNewCoreDir({ - 'src/index.ts': 'new content', - }); - - const { applyUpdate } = await import('../update.js'); - const result = await applyUpdate(newCoreDir); - - expect(result.success).toBe(false); - expect(result.error).toContain('customize session'); - }); - - it('copies new files that do not exist yet', async () => { - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - const newCoreDir = createNewCoreDir({ - 'src/brand-new.ts': 'export const fresh = true;', - }); - - const { applyUpdate } = await import('../update.js'); - const result = await applyUpdate(newCoreDir); - - expect(result.error).toBeUndefined(); - expect(result.success).toBe(true); - expect( - fs.readFileSync(path.join(tmpDir, 'src/brand-new.ts'), 'utf-8'), - ).toBe('export const fresh = 
true;'); - }); - - it('performs clean three-way merge', async () => { - // Set up base - const baseDir = path.join(tmpDir, '.nanoclaw', 'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(baseDir, 'src/index.ts'), - 'line 1\nline 2\nline 3\n', - ); - - // Current has user changes at the bottom - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src/index.ts'), - 'line 1\nline 2\nline 3\nuser addition\n', - ); - - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - // New core changes at the top - const newCoreDir = createNewCoreDir({ - 'src/index.ts': 'core update\nline 1\nline 2\nline 3\n', - 'package.json': JSON.stringify({ version: '2.0.0' }), - }); - - const { applyUpdate } = await import('../update.js'); - const result = await applyUpdate(newCoreDir); - - expect(result.success).toBe(true); - expect(result.newVersion).toBe('2.0.0'); - - const merged = fs.readFileSync( - path.join(tmpDir, 'src/index.ts'), - 'utf-8', - ); - expect(merged).toContain('core update'); - expect(merged).toContain('user addition'); - }); - - it('updates base directory after successful merge', async () => { - const baseDir = path.join(tmpDir, '.nanoclaw', 'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'old base'); - - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(tmpDir, 'src/index.ts'), 'old base'); - - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - const newCoreDir = createNewCoreDir({ - 'src/index.ts': 'new base content', - }); - - const { applyUpdate } = await import('../update.js'); - await applyUpdate(newCoreDir); - - const newBase = fs.readFileSync( - path.join(tmpDir, '.nanoclaw', 'base', 'src/index.ts'), - 'utf-8', - ); - 
expect(newBase).toBe('new base content'); - }); - - it('updates core_version in state after success', async () => { - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - const newCoreDir = createNewCoreDir({ - 'package.json': JSON.stringify({ version: '2.0.0' }), - }); - - const { applyUpdate } = await import('../update.js'); - const result = await applyUpdate(newCoreDir); - - expect(result.success).toBe(true); - expect(result.previousVersion).toBe('1.0.0'); - expect(result.newVersion).toBe('2.0.0'); - - // Verify state file was updated - const { readState } = await import('../state.js'); - const state = readState(); - expect(state.core_version).toBe('2.0.0'); - }); - - it('restores backup on merge conflict', async () => { - const baseDir = path.join(tmpDir, '.nanoclaw', 'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(baseDir, 'src/index.ts'), - 'line 1\nline 2\nline 3\n', - ); - - // Current has conflicting change on same line - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync( - path.join(tmpDir, 'src/index.ts'), - 'line 1\nuser changed line 2\nline 3\n', - ); - - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - // New core also changes line 2 — guaranteed conflict - const newCoreDir = createNewCoreDir({ - 'src/index.ts': 'line 1\ncore changed line 2\nline 3\n', - }); - - const { applyUpdate } = await import('../update.js'); - const result = await applyUpdate(newCoreDir); - - expect(result.success).toBe(false); - expect(result.mergeConflicts).toContain('src/index.ts'); - expect(result.backupPending).toBe(true); - - // File should have conflict markers (backup preserved, not restored) - const content = fs.readFileSync( - path.join(tmpDir, 'src/index.ts'), - 'utf-8', - ); - expect(content).toContain('<<<<<<<'); - expect(content).toContain('>>>>>>>'); - }); - - 
it('removes files deleted in new core', async () => { - const baseDir = path.join(tmpDir, '.nanoclaw', 'base'); - fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'keep'); - fs.writeFileSync(path.join(baseDir, 'src/removed.ts'), 'old content'); - - // Working tree has both files - fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true }); - fs.writeFileSync(path.join(tmpDir, 'src/index.ts'), 'keep'); - fs.writeFileSync(path.join(tmpDir, 'src/removed.ts'), 'old content'); - - writeStateFile({ - skills_system_version: '0.1.0', - core_version: '1.0.0', - applied_skills: [], - }); - - // New core only has index.ts - const newCoreDir = createNewCoreDir({ - 'src/index.ts': 'keep', - }); - - const { applyUpdate } = await import('../update.js'); - const result = await applyUpdate(newCoreDir); - - expect(result.success).toBe(true); - expect(fs.existsSync(path.join(tmpDir, 'src/index.ts'))).toBe(true); - expect(fs.existsSync(path.join(tmpDir, 'src/removed.ts'))).toBe(false); - }); - }); -}); diff --git a/skills-engine/apply.ts b/skills-engine/apply.ts deleted file mode 100644 index 3c114df..0000000 --- a/skills-engine/apply.ts +++ /dev/null @@ -1,378 +0,0 @@ -import { execSync } from 'child_process'; -import crypto from 'crypto'; -import fs from 'fs'; -import os from 'os'; -import path from 'path'; - -import { clearBackup, createBackup, restoreBackup } from './backup.js'; -import { NANOCLAW_DIR } from './constants.js'; -import { copyDir } from './fs-utils.js'; -import { isCustomizeActive } from './customize.js'; -import { executeFileOps } from './file-ops.js'; -import { acquireLock } from './lock.js'; -import { - checkConflicts, - checkCoreVersion, - checkDependencies, - checkSystemVersion, - readManifest, -} from './manifest.js'; -import { loadPathRemap, resolvePathRemap } from './path-remap.js'; -import { mergeFile } from './merge.js'; -import { - computeFileHash, - readState, - recordSkillApplication, - 
writeState, -} from './state.js'; -import { - mergeDockerComposeServices, - mergeEnvAdditions, - mergeNpmDependencies, - runNpmInstall, -} from './structured.js'; -import { ApplyResult } from './types.js'; - -export async function applySkill(skillDir: string): Promise<ApplyResult> { - const projectRoot = process.cwd(); - const manifest = readManifest(skillDir); - - // --- Pre-flight checks --- - const currentState = readState(); // Validates state exists and version is compatible - - // Check skills system version compatibility - const sysCheck = checkSystemVersion(manifest); - if (!sysCheck.ok) { - return { - success: false, - skill: manifest.skill, - version: manifest.version, - error: sysCheck.error, - }; - } - - // Check core version compatibility - const coreCheck = checkCoreVersion(manifest); - if (coreCheck.warning) { - console.log(`Warning: ${coreCheck.warning}`); - } - - // Block if customize session is active - if (isCustomizeActive()) { - return { - success: false, - skill: manifest.skill, - version: manifest.version, - error: - 'A customize session is active. 
Run commitCustomize() or abortCustomize() first.', - }; - } - - const deps = checkDependencies(manifest); - if (!deps.ok) { - return { - success: false, - skill: manifest.skill, - version: manifest.version, - error: `Missing dependencies: ${deps.missing.join(', ')}`, - }; - } - - const conflicts = checkConflicts(manifest); - if (!conflicts.ok) { - return { - success: false, - skill: manifest.skill, - version: manifest.version, - error: `Conflicting skills: ${conflicts.conflicting.join(', ')}`, - }; - } - - // Load path remap for renamed core files - const pathRemap = loadPathRemap(); - - // Detect drift for modified files - const driftFiles: string[] = []; - for (const relPath of manifest.modifies) { - const resolvedPath = resolvePathRemap(relPath, pathRemap); - const currentPath = path.join(projectRoot, resolvedPath); - const basePath = path.join(projectRoot, NANOCLAW_DIR, 'base', resolvedPath); - - if (fs.existsSync(currentPath) && fs.existsSync(basePath)) { - const currentHash = computeFileHash(currentPath); - const baseHash = computeFileHash(basePath); - if (currentHash !== baseHash) { - driftFiles.push(relPath); - } - } - } - - if (driftFiles.length > 0) { - console.log(`Drift detected in: ${driftFiles.join(', ')}`); - console.log('Three-way merge will be used to reconcile changes.'); - } - - // --- Acquire lock --- - const releaseLock = acquireLock(); - - // Track added files so we can remove them on rollback - const addedFiles: string[] = []; - - try { - // --- Backup --- - const filesToBackup = [ - ...manifest.modifies.map((f) => - path.join(projectRoot, resolvePathRemap(f, pathRemap)), - ), - ...manifest.adds.map((f) => - path.join(projectRoot, resolvePathRemap(f, pathRemap)), - ), - ...(manifest.file_ops || []) - .filter((op) => op.from) - .map((op) => - path.join(projectRoot, resolvePathRemap(op.from!, pathRemap)), - ), - path.join(projectRoot, 'package.json'), - path.join(projectRoot, 'package-lock.json'), - path.join(projectRoot, '.env.example'), - 
path.join(projectRoot, 'docker-compose.yml'), - ]; - createBackup(filesToBackup); - - // --- File operations (before copy adds, per architecture doc) --- - if (manifest.file_ops && manifest.file_ops.length > 0) { - const fileOpsResult = executeFileOps(manifest.file_ops, projectRoot); - if (!fileOpsResult.success) { - restoreBackup(); - clearBackup(); - return { - success: false, - skill: manifest.skill, - version: manifest.version, - error: `File operations failed: ${fileOpsResult.errors.join('; ')}`, - }; - } - } - - // --- Copy new files from add/ --- - const addDir = path.join(skillDir, 'add'); - if (fs.existsSync(addDir)) { - for (const relPath of manifest.adds) { - const resolvedDest = resolvePathRemap(relPath, pathRemap); - const destPath = path.join(projectRoot, resolvedDest); - if (!fs.existsSync(destPath)) { - addedFiles.push(destPath); - } - // Copy individual file with remap (can't use copyDir when paths differ) - const srcPath = path.join(addDir, relPath); - if (fs.existsSync(srcPath)) { - fs.mkdirSync(path.dirname(destPath), { recursive: true }); - fs.copyFileSync(srcPath, destPath); - } - } - } - - // --- Merge modified files --- - const mergeConflicts: string[] = []; - - for (const relPath of manifest.modifies) { - const resolvedPath = resolvePathRemap(relPath, pathRemap); - const currentPath = path.join(projectRoot, resolvedPath); - const basePath = path.join( - projectRoot, - NANOCLAW_DIR, - 'base', - resolvedPath, - ); - // skillPath uses original relPath — skill packages are never mutated - const skillPath = path.join(skillDir, 'modify', relPath); - - if (!fs.existsSync(skillPath)) { - throw new Error(`Skill modified file not found: ${skillPath}`); - } - - if (!fs.existsSync(currentPath)) { - // File doesn't exist yet — just copy from skill - fs.mkdirSync(path.dirname(currentPath), { recursive: true }); - fs.copyFileSync(skillPath, currentPath); - continue; - } - - if (!fs.existsSync(basePath)) { - // No base — use current as base (first-time 
apply) - fs.mkdirSync(path.dirname(basePath), { recursive: true }); - fs.copyFileSync(currentPath, basePath); - } - - // Three-way merge: current ← base → skill - // git merge-file modifies the first argument in-place, so use a temp copy - const tmpCurrent = path.join( - os.tmpdir(), - `nanoclaw-merge-${crypto.randomUUID()}-${path.basename(relPath)}`, - ); - fs.copyFileSync(currentPath, tmpCurrent); - - const result = mergeFile(tmpCurrent, basePath, skillPath); - - if (result.clean) { - fs.copyFileSync(tmpCurrent, currentPath); - fs.unlinkSync(tmpCurrent); - } else { - // Conflict — copy markers to working tree - fs.copyFileSync(tmpCurrent, currentPath); - fs.unlinkSync(tmpCurrent); - mergeConflicts.push(relPath); - } - } - - if (mergeConflicts.length > 0) { - // Bug 4 fix: Preserve backup when returning with conflicts - return { - success: false, - skill: manifest.skill, - version: manifest.version, - mergeConflicts, - backupPending: true, - untrackedChanges: driftFiles.length > 0 ? driftFiles : undefined, - error: `Merge conflicts in: ${mergeConflicts.join(', ')}. Resolve manually then run recordSkillApplication(). 
Call clearBackup() after resolution or restoreBackup() + clearBackup() to abort.`, - }; - } - - // --- Structured operations --- - if (manifest.structured?.npm_dependencies) { - const pkgPath = path.join(projectRoot, 'package.json'); - mergeNpmDependencies(pkgPath, manifest.structured.npm_dependencies); - } - - if (manifest.structured?.env_additions) { - const envPath = path.join(projectRoot, '.env.example'); - mergeEnvAdditions(envPath, manifest.structured.env_additions); - } - - if (manifest.structured?.docker_compose_services) { - const composePath = path.join(projectRoot, 'docker-compose.yml'); - mergeDockerComposeServices( - composePath, - manifest.structured.docker_compose_services, - ); - } - - // Run npm install if dependencies were added - if ( - manifest.structured?.npm_dependencies && - Object.keys(manifest.structured.npm_dependencies).length > 0 - ) { - runNpmInstall(); - } - - // --- Post-apply commands --- - if (manifest.post_apply && manifest.post_apply.length > 0) { - for (const cmd of manifest.post_apply) { - try { - execSync(cmd, { stdio: 'pipe', cwd: projectRoot, timeout: 120_000 }); - } catch (postErr: any) { - // Rollback on post_apply failure - for (const f of addedFiles) { - try { - if (fs.existsSync(f)) fs.unlinkSync(f); - } catch { - /* best effort */ - } - } - restoreBackup(); - clearBackup(); - return { - success: false, - skill: manifest.skill, - version: manifest.version, - error: `post_apply command failed: ${cmd} — ${postErr.message}`, - }; - } - } - } - - // --- Update state --- - const fileHashes: Record<string, string> = {}; - for (const relPath of [...manifest.adds, ...manifest.modifies]) { - const resolvedPath = resolvePathRemap(relPath, pathRemap); - const absPath = path.join(projectRoot, resolvedPath); - if (fs.existsSync(absPath)) { - fileHashes[resolvedPath] = computeFileHash(absPath); - } - } - - // Store structured outcomes including the test command so applyUpdate() can run them - const outcomes: Record<string, unknown> = manifest.structured - ? 
{ ...manifest.structured } - : {}; - if (manifest.test) { - outcomes.test = manifest.test; - } - - recordSkillApplication( - manifest.skill, - manifest.version, - fileHashes, - Object.keys(outcomes).length > 0 ? outcomes : undefined, - ); - - // --- Bug 3 fix: Execute test command if defined --- - if (manifest.test) { - try { - execSync(manifest.test, { - stdio: 'pipe', - cwd: projectRoot, - timeout: 120_000, - }); - } catch (testErr: any) { - // Tests failed — remove added files, restore backup and undo state - for (const f of addedFiles) { - try { - if (fs.existsSync(f)) fs.unlinkSync(f); - } catch { - /* best effort */ - } - } - restoreBackup(); - // Re-read state and remove the skill we just recorded - const state = readState(); - state.applied_skills = state.applied_skills.filter( - (s) => s.name !== manifest.skill, - ); - writeState(state); - - clearBackup(); - return { - success: false, - skill: manifest.skill, - version: manifest.version, - error: `Tests failed: ${testErr.message}`, - }; - } - } - - // --- Cleanup --- - clearBackup(); - - return { - success: true, - skill: manifest.skill, - version: manifest.version, - untrackedChanges: driftFiles.length > 0 ? 
driftFiles : undefined, - }; - } catch (err) { - // Remove newly added files before restoring backup - for (const f of addedFiles) { - try { - if (fs.existsSync(f)) fs.unlinkSync(f); - } catch { - /* best effort */ - } - } - restoreBackup(); - clearBackup(); - throw err; - } finally { - releaseLock(); - } -} diff --git a/skills-engine/backup.ts b/skills-engine/backup.ts deleted file mode 100644 index d9fa307..0000000 --- a/skills-engine/backup.ts +++ /dev/null @@ -1,65 +0,0 @@ -import fs from 'fs'; -import path from 'path'; - -import { BACKUP_DIR } from './constants.js'; - -const TOMBSTONE_SUFFIX = '.tombstone'; - -function getBackupDir(): string { - return path.join(process.cwd(), BACKUP_DIR); -} - -export function createBackup(filePaths: string[]): void { - const backupDir = getBackupDir(); - fs.mkdirSync(backupDir, { recursive: true }); - - for (const filePath of filePaths) { - const absPath = path.resolve(filePath); - const relativePath = path.relative(process.cwd(), absPath); - const backupPath = path.join(backupDir, relativePath); - fs.mkdirSync(path.dirname(backupPath), { recursive: true }); - - if (fs.existsSync(absPath)) { - fs.copyFileSync(absPath, backupPath); - } else { - // File doesn't exist yet — write a tombstone so restore can delete it - fs.writeFileSync(backupPath + TOMBSTONE_SUFFIX, '', 'utf-8'); - } - } -} - -export function restoreBackup(): void { - const backupDir = getBackupDir(); - if (!fs.existsSync(backupDir)) return; - - const walk = (dir: string) => { - for (const entry of fs.readdirSync(dir, { withFileTypes: true })) { - const fullPath = path.join(dir, entry.name); - if (entry.isDirectory()) { - walk(fullPath); - } else if (entry.name.endsWith(TOMBSTONE_SUFFIX)) { - // Tombstone: delete the corresponding project file - const tombRelPath = path.relative(backupDir, fullPath); - const originalRelPath = tombRelPath.slice(0, -TOMBSTONE_SUFFIX.length); - const originalPath = path.join(process.cwd(), originalRelPath); - if 
(fs.existsSync(originalPath)) { - fs.unlinkSync(originalPath); - } - } else { - const relativePath = path.relative(backupDir, fullPath); - const originalPath = path.join(process.cwd(), relativePath); - fs.mkdirSync(path.dirname(originalPath), { recursive: true }); - fs.copyFileSync(fullPath, originalPath); - } - } - }; - - walk(backupDir); -} - -export function clearBackup(): void { - const backupDir = getBackupDir(); - if (fs.existsSync(backupDir)) { - fs.rmSync(backupDir, { recursive: true, force: true }); - } -} diff --git a/skills-engine/constants.ts b/skills-engine/constants.ts deleted file mode 100644 index 93bd5e1..0000000 --- a/skills-engine/constants.ts +++ /dev/null @@ -1,16 +0,0 @@ -export const NANOCLAW_DIR = '.nanoclaw'; -export const STATE_FILE = 'state.yaml'; -export const BASE_DIR = '.nanoclaw/base'; -export const BACKUP_DIR = '.nanoclaw/backup'; -export const LOCK_FILE = '.nanoclaw/lock'; -export const CUSTOM_DIR = '.nanoclaw/custom'; -export const SKILLS_SCHEMA_VERSION = '0.1.0'; - -// Top-level paths to include in base snapshot and upstream extraction. -// Add new entries here when new root-level directories/files need tracking. 
-export const BASE_INCLUDES = [ - 'src/', - 'package.json', - '.env.example', - 'container/', -]; diff --git a/skills-engine/customize.ts b/skills-engine/customize.ts deleted file mode 100644 index e7ec330..0000000 --- a/skills-engine/customize.ts +++ /dev/null @@ -1,152 +0,0 @@ -import { execFileSync, execSync } from 'child_process'; -import fs from 'fs'; -import path from 'path'; - -import { parse, stringify } from 'yaml'; - -import { BASE_DIR, CUSTOM_DIR } from './constants.js'; -import { - computeFileHash, - readState, - recordCustomModification, -} from './state.js'; - -interface PendingCustomize { - description: string; - started_at: string; - file_hashes: Record<string, string>; -} - -function getPendingPath(): string { - return path.join(process.cwd(), CUSTOM_DIR, 'pending.yaml'); -} - -export function isCustomizeActive(): boolean { - return fs.existsSync(getPendingPath()); -} - -export function startCustomize(description: string): void { - if (isCustomizeActive()) { - throw new Error( - 'A customize session is already active. Commit or abort it first.', - ); - } - - const state = readState(); - - // Collect all file hashes from applied skills - const fileHashes: Record<string, string> = {}; - for (const skill of state.applied_skills) { - for (const [relativePath, hash] of Object.entries(skill.file_hashes)) { - fileHashes[relativePath] = hash; - } - } - - const pending: PendingCustomize = { - description, - started_at: new Date().toISOString(), - file_hashes: fileHashes, - }; - - const customDir = path.join(process.cwd(), CUSTOM_DIR); - fs.mkdirSync(customDir, { recursive: true }); - fs.writeFileSync(getPendingPath(), stringify(pending), 'utf-8'); -} - -export function commitCustomize(): void { - const pendingPath = getPendingPath(); - if (!fs.existsSync(pendingPath)) { - throw new Error('No active customize session. 
Run startCustomize() first.'); - } - - const pending = parse( - fs.readFileSync(pendingPath, 'utf-8'), - ) as PendingCustomize; - const cwd = process.cwd(); - - // Find files that changed - const changedFiles: string[] = []; - for (const relativePath of Object.keys(pending.file_hashes)) { - const fullPath = path.join(cwd, relativePath); - if (!fs.existsSync(fullPath)) { - // File was deleted — counts as changed - changedFiles.push(relativePath); - continue; - } - const currentHash = computeFileHash(fullPath); - if (currentHash !== pending.file_hashes[relativePath]) { - changedFiles.push(relativePath); - } - } - - if (changedFiles.length === 0) { - console.log( - 'No files changed during customize session. Nothing to commit.', - ); - fs.unlinkSync(pendingPath); - return; - } - - // Generate unified diff for each changed file - const baseDir = path.join(cwd, BASE_DIR); - let combinedPatch = ''; - - for (const relativePath of changedFiles) { - const basePath = path.join(baseDir, relativePath); - const currentPath = path.join(cwd, relativePath); - - // Use /dev/null if either side doesn't exist - const oldPath = fs.existsSync(basePath) ? basePath : '/dev/null'; - const newPath = fs.existsSync(currentPath) ? currentPath : '/dev/null'; - - try { - const diff = execFileSync('diff', ['-ruN', oldPath, newPath], { - encoding: 'utf-8', - }); - combinedPatch += diff; - } catch (err: unknown) { - const execErr = err as { status?: number; stdout?: string }; - if (execErr.status === 1 && execErr.stdout) { - // diff exits 1 when files differ — that's expected - combinedPatch += execErr.stdout; - } else if (execErr.status === 2) { - throw new Error( - `diff error for ${relativePath}: diff exited with status 2 (check file permissions or encoding)`, - ); - } else { - throw err; - } - } - } - - if (!combinedPatch.trim()) { - console.log('Diff was empty despite hash changes. 
Nothing to commit.'); - fs.unlinkSync(pendingPath); - return; - } - - // Determine sequence number - const state = readState(); - const existingCount = state.custom_modifications?.length ?? 0; - const seqNum = String(existingCount + 1).padStart(3, '0'); - - // Sanitize description for filename - const sanitized = pending.description - .toLowerCase() - .replace(/[^a-z0-9]+/g, '-') - .replace(/^-|-$/g, ''); - const patchFilename = `${seqNum}-${sanitized}.patch`; - const patchRelPath = path.join(CUSTOM_DIR, patchFilename); - const patchFullPath = path.join(cwd, patchRelPath); - - fs.writeFileSync(patchFullPath, combinedPatch, 'utf-8'); - recordCustomModification(pending.description, changedFiles, patchRelPath); - fs.unlinkSync(pendingPath); -} - -export function abortCustomize(): void { - const pendingPath = getPendingPath(); - if (fs.existsSync(pendingPath)) { - fs.unlinkSync(pendingPath); - } -} diff --git a/skills-engine/file-ops.ts b/skills-engine/file-ops.ts deleted file mode 100644 index 6d656c5..0000000 --- a/skills-engine/file-ops.ts +++ /dev/null @@ -1,191 +0,0 @@ -import fs from 'fs'; -import path from 'path'; -import type { FileOperation, FileOpsResult } from './types.js'; - -function isWithinRoot(rootPath: string, targetPath: string): boolean { - return targetPath === rootPath || targetPath.startsWith(rootPath + path.sep); -} - -function nearestExistingPathOrSymlink(candidateAbsPath: string): string { - let current = candidateAbsPath; - while (true) { - try { - fs.lstatSync(current); - return current; - } catch { - const parent = path.dirname(current); - if (parent === current) { - throw new Error(`Invalid file operation path: "${candidateAbsPath}"`); - } - current = parent; - } - } -} - -function resolveRealPathWithSymlinkAwareAnchor( - candidateAbsPath: string, -): string { - const anchorPath = nearestExistingPathOrSymlink(candidateAbsPath); - const anchorStat = fs.lstatSync(anchorPath); - let realAnchor: string; - - if (anchorStat.isSymbolicLink()) { - 
const linkTarget = fs.readlinkSync(anchorPath); - const linkResolved = path.resolve(path.dirname(anchorPath), linkTarget); - realAnchor = fs.realpathSync(linkResolved); - } else { - realAnchor = fs.realpathSync(anchorPath); - } - - const relativeRemainder = path.relative(anchorPath, candidateAbsPath); - return relativeRemainder - ? path.resolve(realAnchor, relativeRemainder) - : realAnchor; -} - -function safePath(projectRoot: string, relativePath: string): string | null { - if (typeof relativePath !== 'string' || relativePath.trim() === '') { - return null; - } - - const root = path.resolve(projectRoot); - const resolved = path.resolve(root, relativePath); - if (!isWithinRoot(root, resolved)) { - return null; - } - if (resolved === root) { - return null; - } - - const realRoot = fs.realpathSync(root); - const realParent = resolveRealPathWithSymlinkAwareAnchor( - path.dirname(resolved), - ); - if (!isWithinRoot(realRoot, realParent)) { - return null; - } - - return resolved; -} - -export function executeFileOps( - ops: FileOperation[], - projectRoot: string, -): FileOpsResult { - const result: FileOpsResult = { - success: true, - executed: [], - warnings: [], - errors: [], - }; - - const root = path.resolve(projectRoot); - - for (const op of ops) { - switch (op.type) { - case 'rename': { - if (!op.from || !op.to) { - result.errors.push(`rename: requires 'from' and 'to'`); - result.success = false; - return result; - } - const fromPath = safePath(root, op.from); - const toPath = safePath(root, op.to); - if (!fromPath) { - result.errors.push(`rename: path escapes project root: ${op.from}`); - result.success = false; - return result; - } - if (!toPath) { - result.errors.push(`rename: path escapes project root: ${op.to}`); - result.success = false; - return result; - } - if (!fs.existsSync(fromPath)) { - result.errors.push(`rename: source does not exist: ${op.from}`); - result.success = false; - return result; - } - if (fs.existsSync(toPath)) { - 
result.errors.push(`rename: target already exists: ${op.to}`); - result.success = false; - return result; - } - fs.renameSync(fromPath, toPath); - result.executed.push(op); - break; - } - - case 'delete': { - if (!op.path) { - result.errors.push(`delete: requires 'path'`); - result.success = false; - return result; - } - const delPath = safePath(root, op.path); - if (!delPath) { - result.errors.push(`delete: path escapes project root: ${op.path}`); - result.success = false; - return result; - } - if (!fs.existsSync(delPath)) { - result.warnings.push( - `delete: file does not exist (skipped): ${op.path}`, - ); - result.executed.push(op); - break; - } - fs.unlinkSync(delPath); - result.executed.push(op); - break; - } - - case 'move': { - if (!op.from || !op.to) { - result.errors.push(`move: requires 'from' and 'to'`); - result.success = false; - return result; - } - const srcPath = safePath(root, op.from); - const dstPath = safePath(root, op.to); - if (!srcPath) { - result.errors.push(`move: path escapes project root: ${op.from}`); - result.success = false; - return result; - } - if (!dstPath) { - result.errors.push(`move: path escapes project root: ${op.to}`); - result.success = false; - return result; - } - if (!fs.existsSync(srcPath)) { - result.errors.push(`move: source does not exist: ${op.from}`); - result.success = false; - return result; - } - if (fs.existsSync(dstPath)) { - result.errors.push(`move: target already exists: ${op.to}`); - result.success = false; - return result; - } - const dstDir = path.dirname(dstPath); - if (!fs.existsSync(dstDir)) { - fs.mkdirSync(dstDir, { recursive: true }); - } - fs.renameSync(srcPath, dstPath); - result.executed.push(op); - break; - } - - default: { - result.errors.push( - `unknown operation type: ${(op as FileOperation).type}`, - ); - result.success = false; - return result; - } - } - } - - return result; -} diff --git a/skills-engine/fs-utils.ts b/skills-engine/fs-utils.ts deleted file mode 100644 index 
a957752..0000000 --- a/skills-engine/fs-utils.ts +++ /dev/null @@ -1,21 +0,0 @@ -import fs from 'fs'; -import path from 'path'; - -/** - * Recursively copy a directory tree from src to dest. - * Creates destination directories as needed. - */ -export function copyDir(src: string, dest: string): void { - for (const entry of fs.readdirSync(src, { withFileTypes: true })) { - const srcPath = path.join(src, entry.name); - const destPath = path.join(dest, entry.name); - - if (entry.isDirectory()) { - fs.mkdirSync(destPath, { recursive: true }); - copyDir(srcPath, destPath); - } else { - fs.mkdirSync(path.dirname(destPath), { recursive: true }); - fs.copyFileSync(srcPath, destPath); - } - } -} diff --git a/skills-engine/index.ts b/skills-engine/index.ts deleted file mode 100644 index 5c35ed2..0000000 --- a/skills-engine/index.ts +++ /dev/null @@ -1,70 +0,0 @@ -export { applySkill } from './apply.js'; -export { clearBackup, createBackup, restoreBackup } from './backup.js'; -export { - BACKUP_DIR, - BASE_DIR, - SKILLS_SCHEMA_VERSION, - CUSTOM_DIR, - LOCK_FILE, - NANOCLAW_DIR, - STATE_FILE, -} from './constants.js'; -export { - abortCustomize, - commitCustomize, - isCustomizeActive, - startCustomize, -} from './customize.js'; -export { executeFileOps } from './file-ops.js'; -export { initNanoclawDir } from './init.js'; -export { acquireLock, isLocked, releaseLock } from './lock.js'; -export { - checkConflicts, - checkCoreVersion, - checkDependencies, - checkSystemVersion, - readManifest, -} from './manifest.js'; -export { isGitRepo, mergeFile } from './merge.js'; -export { - loadPathRemap, - recordPathRemap, - resolvePathRemap, -} from './path-remap.js'; -export { rebase } from './rebase.js'; -export { findSkillDir, replaySkills } from './replay.js'; -export type { ReplayOptions, ReplayResult } from './replay.js'; -export { uninstallSkill } from './uninstall.js'; -export { initSkillsSystem, migrateExisting } from './migrate.js'; -export { applyUpdate, previewUpdate } from 
'./update.js'; -export { - compareSemver, - computeFileHash, - getAppliedSkills, - getCustomModifications, - readState, - recordCustomModification, - recordSkillApplication, - writeState, -} from './state.js'; -export { - areRangesCompatible, - mergeDockerComposeServices, - mergeEnvAdditions, - mergeNpmDependencies, - runNpmInstall, -} from './structured.js'; -export type { - AppliedSkill, - ApplyResult, - CustomModification, - FileOpsResult, - FileOperation, - MergeResult, - RebaseResult, - SkillManifest, - SkillState, - UninstallResult, - UpdatePreview, - UpdateResult, -} from './types.js'; diff --git a/skills-engine/init.ts b/skills-engine/init.ts deleted file mode 100644 index 9f43b5d..0000000 --- a/skills-engine/init.ts +++ /dev/null @@ -1,101 +0,0 @@ -import { execSync } from 'child_process'; -import fs from 'fs'; -import path from 'path'; - -import { - BACKUP_DIR, - BASE_DIR, - BASE_INCLUDES, - NANOCLAW_DIR, -} from './constants.js'; -import { isGitRepo } from './merge.js'; -import { writeState } from './state.js'; -import { SkillState } from './types.js'; - -// Directories/files to always exclude from base snapshot -const BASE_EXCLUDES = [ - 'node_modules', - '.nanoclaw', - '.git', - 'dist', - 'data', - 'groups', - 'store', - 'logs', -]; - -export function initNanoclawDir(): void { - const projectRoot = process.cwd(); - const nanoclawDir = path.join(projectRoot, NANOCLAW_DIR); - const baseDir = path.join(projectRoot, BASE_DIR); - - // Create structure - fs.mkdirSync(path.join(projectRoot, BACKUP_DIR), { recursive: true }); - - // Clean existing base - if (fs.existsSync(baseDir)) { - fs.rmSync(baseDir, { recursive: true, force: true }); - } - fs.mkdirSync(baseDir, { recursive: true }); - - // Snapshot all included paths - for (const include of BASE_INCLUDES) { - const srcPath = path.join(projectRoot, include); - if (!fs.existsSync(srcPath)) continue; - - const destPath = path.join(baseDir, include); - const stat = fs.statSync(srcPath); - - if 
(stat.isDirectory()) { - copyDirFiltered(srcPath, destPath, BASE_EXCLUDES); - } else { - fs.mkdirSync(path.dirname(destPath), { recursive: true }); - fs.copyFileSync(srcPath, destPath); - } - } - - // Create initial state - const coreVersion = getCoreVersion(projectRoot); - const initialState: SkillState = { - skills_system_version: '0.1.0', - core_version: coreVersion, - applied_skills: [], - }; - writeState(initialState); - - // Enable git rerere if in a git repo - if (isGitRepo()) { - try { - execSync('git config --local rerere.enabled true', { stdio: 'pipe' }); - } catch { - // Non-fatal - } - } -} - -function copyDirFiltered(src: string, dest: string, excludes: string[]): void { - fs.mkdirSync(dest, { recursive: true }); - - for (const entry of fs.readdirSync(src, { withFileTypes: true })) { - if (excludes.includes(entry.name)) continue; - - const srcPath = path.join(src, entry.name); - const destPath = path.join(dest, entry.name); - - if (entry.isDirectory()) { - copyDirFiltered(srcPath, destPath, excludes); - } else { - fs.copyFileSync(srcPath, destPath); - } - } -} - -function getCoreVersion(projectRoot: string): string { - try { - const pkgPath = path.join(projectRoot, 'package.json'); - const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf-8')); - return pkg.version || '0.0.0'; - } catch { - return '0.0.0'; - } -} diff --git a/skills-engine/lock.ts b/skills-engine/lock.ts deleted file mode 100644 index 20814c4..0000000 --- a/skills-engine/lock.ts +++ /dev/null @@ -1,106 +0,0 @@ -import fs from 'fs'; -import path from 'path'; - -import { LOCK_FILE } from './constants.js'; - -const STALE_TIMEOUT_MS = 5 * 60 * 1000; // 5 minutes - -interface LockInfo { - pid: number; - timestamp: number; -} - -function getLockPath(): string { - return path.join(process.cwd(), LOCK_FILE); -} - -function isStale(lock: LockInfo): boolean { - return Date.now() - lock.timestamp > STALE_TIMEOUT_MS; -} - -function isProcessAlive(pid: number): boolean { - try { - process.kill(pid, 
0); - return true; - } catch { - return false; - } -} - -export function acquireLock(): () => void { - const lockPath = getLockPath(); - fs.mkdirSync(path.dirname(lockPath), { recursive: true }); - - const lockInfo: LockInfo = { pid: process.pid, timestamp: Date.now() }; - - try { - // Atomic creation — fails if file already exists - fs.writeFileSync(lockPath, JSON.stringify(lockInfo), { flag: 'wx' }); - return () => releaseLock(); - } catch { - // Lock file exists — check if it's stale or from a dead process - try { - const existing: LockInfo = JSON.parse(fs.readFileSync(lockPath, 'utf-8')); - if (!isStale(existing) && isProcessAlive(existing.pid)) { - throw new Error( - `Operation in progress (pid ${existing.pid}, started ${new Date(existing.timestamp).toISOString()}). If this is stale, delete ${LOCK_FILE}`, - ); - } - // Stale or dead process — overwrite - } catch (err) { - if ( - err instanceof Error && - err.message.startsWith('Operation in progress') - ) { - throw err; - } - // Corrupt or unreadable — overwrite - } - - try { - fs.unlinkSync(lockPath); - } catch { - /* already gone */ - } - try { - fs.writeFileSync(lockPath, JSON.stringify(lockInfo), { flag: 'wx' }); - } catch { - throw new Error( - 'Lock contention: another process acquired the lock. 
Retry.', - ); - } - return () => releaseLock(); - } -} - -export function releaseLock(): void { - const lockPath = getLockPath(); - if (fs.existsSync(lockPath)) { - try { - const lock: LockInfo = JSON.parse(fs.readFileSync(lockPath, 'utf-8')); - // Only release our own lock - if (lock.pid === process.pid) { - fs.unlinkSync(lockPath); - } - } catch { - // Corrupt or missing — safe to remove - try { - fs.unlinkSync(lockPath); - } catch { - // Already gone - } - } - } -} - -export function isLocked(): boolean { - const lockPath = getLockPath(); - if (!fs.existsSync(lockPath)) return false; - - try { - const lock: LockInfo = JSON.parse(fs.readFileSync(lockPath, 'utf-8')); - return !isStale(lock) && isProcessAlive(lock.pid); - } catch { - return false; - } -} diff --git a/skills-engine/manifest.ts b/skills-engine/manifest.ts deleted file mode 100644 index 5522901..0000000 --- a/skills-engine/manifest.ts +++ /dev/null @@ -1,104 +0,0 @@ -import fs from 'fs'; -import path from 'path'; - -import { parse } from 'yaml'; - -import { SKILLS_SCHEMA_VERSION } from './constants.js'; -import { getAppliedSkills, readState, compareSemver } from './state.js'; -import { SkillManifest } from './types.js'; - -export function readManifest(skillDir: string): SkillManifest { - const manifestPath = path.join(skillDir, 'manifest.yaml'); - if (!fs.existsSync(manifestPath)) { - throw new Error(`Manifest not found: ${manifestPath}`); - } - - const content = fs.readFileSync(manifestPath, 'utf-8'); - const manifest = parse(content) as SkillManifest; - - // Validate required fields - const required = [ - 'skill', - 'version', - 'core_version', - 'adds', - 'modifies', - ] as const; - for (const field of required) { - if (manifest[field] === undefined) { - throw new Error(`Manifest missing required field: ${field}`); - } - } - - // Defaults - manifest.conflicts = manifest.conflicts || []; - manifest.depends = manifest.depends || []; - manifest.file_ops = manifest.file_ops || []; - - // Validate paths 
don't escape project root - const allPaths = [...manifest.adds, ...manifest.modifies]; - for (const p of allPaths) { - if (p.includes('..') || path.isAbsolute(p)) { - throw new Error( - `Invalid path in manifest: ${p} (must be relative without "..")`, - ); - } - } - - return manifest; -} - -export function checkCoreVersion(manifest: SkillManifest): { - ok: boolean; - warning?: string; -} { - const state = readState(); - const cmp = compareSemver(manifest.core_version, state.core_version); - if (cmp > 0) { - return { - ok: true, - warning: `Skill targets core ${manifest.core_version} but current core is ${state.core_version}. The merge might still work but there's a compatibility risk.`, - }; - } - return { ok: true }; -} - -export function checkDependencies(manifest: SkillManifest): { - ok: boolean; - missing: string[]; -} { - const applied = getAppliedSkills(); - const appliedNames = new Set(applied.map((s) => s.name)); - const missing = manifest.depends.filter((dep) => !appliedNames.has(dep)); - return { ok: missing.length === 0, missing }; -} - -export function checkSystemVersion(manifest: SkillManifest): { - ok: boolean; - error?: string; -} { - if (!manifest.min_skills_system_version) { - return { ok: true }; - } - const cmp = compareSemver( - manifest.min_skills_system_version, - SKILLS_SCHEMA_VERSION, - ); - if (cmp > 0) { - return { - ok: false, - error: `Skill requires skills system version ${manifest.min_skills_system_version} but current is ${SKILLS_SCHEMA_VERSION}. 
Update your skills engine.`, - }; - } - return { ok: true }; -} - -export function checkConflicts(manifest: SkillManifest): { - ok: boolean; - conflicting: string[]; -} { - const applied = getAppliedSkills(); - const appliedNames = new Set(applied.map((s) => s.name)); - const conflicting = manifest.conflicts.filter((c) => appliedNames.has(c)); - return { ok: conflicting.length === 0, conflicting }; -} diff --git a/skills-engine/merge.ts b/skills-engine/merge.ts deleted file mode 100644 index 11cd54a..0000000 --- a/skills-engine/merge.ts +++ /dev/null @@ -1,39 +0,0 @@ -import { execFileSync, execSync } from 'child_process'; - -import { MergeResult } from './types.js'; - -export function isGitRepo(): boolean { - try { - execSync('git rev-parse --git-dir', { stdio: 'pipe' }); - return true; - } catch { - return false; - } -} - -/** - * Run git merge-file to three-way merge files. - * Modifies currentPath in-place. - * Returns { clean: true, exitCode: 0 } on clean merge, - * { clean: false, exitCode: N } on conflict (N = number of conflicts). - */ -export function mergeFile( - currentPath: string, - basePath: string, - skillPath: string, -): MergeResult { - try { - execFileSync('git', ['merge-file', currentPath, basePath, skillPath], { - stdio: 'pipe', - }); - return { clean: true, exitCode: 0 }; - } catch (err: any) { - const exitCode = err.status ?? 
1; - if (exitCode > 0) { - // Positive exit code = number of conflicts - return { clean: false, exitCode }; - } - // Negative exit code = error - throw new Error(`git merge-file failed: ${err.message}`); - } -} diff --git a/skills-engine/migrate.ts b/skills-engine/migrate.ts deleted file mode 100644 index d604c23..0000000 --- a/skills-engine/migrate.ts +++ /dev/null @@ -1,70 +0,0 @@ -import { execFileSync } from 'child_process'; -import fs from 'fs'; -import path from 'path'; - -import { BASE_DIR, CUSTOM_DIR, NANOCLAW_DIR } from './constants.js'; -import { initNanoclawDir } from './init.js'; -import { recordCustomModification } from './state.js'; - -export function initSkillsSystem(): void { - initNanoclawDir(); - console.log('Skills system initialized. .nanoclaw/ directory created.'); -} - -export function migrateExisting(): void { - const projectRoot = process.cwd(); - - // First, do a fresh init - initNanoclawDir(); - - // Then, diff current files against base to capture modifications - const baseSrcDir = path.join(projectRoot, BASE_DIR, 'src'); - const srcDir = path.join(projectRoot, 'src'); - const customDir = path.join(projectRoot, CUSTOM_DIR); - const patchRelPath = path.join(CUSTOM_DIR, 'migration.patch'); - - try { - let diff: string; - try { - diff = execFileSync('diff', ['-ruN', baseSrcDir, srcDir], { - encoding: 'utf-8', - maxBuffer: 10 * 1024 * 1024, - }); - } catch (err: unknown) { - // diff exits 1 when files differ — that's expected - const execErr = err as { status?: number; stdout?: string }; - if (execErr.status === 1 && execErr.stdout) { - diff = execErr.stdout; - } else { - throw err; - } - } - - if (diff.trim()) { - fs.mkdirSync(customDir, { recursive: true }); - fs.writeFileSync(path.join(projectRoot, patchRelPath), diff, 'utf-8'); - - // Extract modified file paths from the diff - const filesModified = [...diff.matchAll(/^diff -ruN .+ (.+)$/gm)] - .map((m) => path.relative(projectRoot, m[1])) - .filter((f) => !f.startsWith('.nanoclaw')); - - 
// Record in state so the patch is visible to the tracking system - recordCustomModification( - 'Pre-skills migration', - filesModified, - patchRelPath, - ); - - console.log( - 'Custom modifications captured in .nanoclaw/custom/migration.patch', - ); - } else { - console.log('No custom modifications detected.'); - } - } catch { - console.log('Could not generate diff. Continuing with clean base.'); - } - - console.log('Migration complete. Skills system ready.'); -} diff --git a/skills-engine/path-remap.ts b/skills-engine/path-remap.ts deleted file mode 100644 index 2de54dc..0000000 --- a/skills-engine/path-remap.ts +++ /dev/null @@ -1,125 +0,0 @@ -import fs from 'fs'; -import path from 'path'; - -import { readState, writeState } from './state.js'; - -function isWithinRoot(rootPath: string, targetPath: string): boolean { - return targetPath === rootPath || targetPath.startsWith(rootPath + path.sep); -} - -function nearestExistingPathOrSymlink(candidateAbsPath: string): string { - let current = candidateAbsPath; - while (true) { - try { - fs.lstatSync(current); - return current; - } catch { - const parent = path.dirname(current); - if (parent === current) { - throw new Error(`Invalid remap path: "${candidateAbsPath}"`); - } - current = parent; - } - } -} - -function toSafeProjectRelativePath( - candidatePath: string, - projectRoot: string, -): string { - if (typeof candidatePath !== 'string' || candidatePath.trim() === '') { - throw new Error(`Invalid remap path: "${candidatePath}"`); - } - - const root = path.resolve(projectRoot); - const realRoot = fs.realpathSync(root); - const resolved = path.resolve(root, candidatePath); - if (!resolved.startsWith(root + path.sep) && resolved !== root) { - throw new Error(`Path remap escapes project root: "${candidatePath}"`); - } - if (resolved === root) { - throw new Error(`Path remap points to project root: "${candidatePath}"`); - } - - // Detect symlink escapes by resolving the nearest existing ancestor/symlink. 
- const anchorPath = nearestExistingPathOrSymlink(resolved); - const anchorStat = fs.lstatSync(anchorPath); - let realAnchor: string; - - if (anchorStat.isSymbolicLink()) { - const linkTarget = fs.readlinkSync(anchorPath); - const linkResolved = path.resolve(path.dirname(anchorPath), linkTarget); - realAnchor = fs.realpathSync(linkResolved); - } else { - realAnchor = fs.realpathSync(anchorPath); - } - - const relativeRemainder = path.relative(anchorPath, resolved); - const realResolved = relativeRemainder - ? path.resolve(realAnchor, relativeRemainder) - : realAnchor; - - if (!isWithinRoot(realRoot, realResolved)) { - throw new Error( - `Path remap escapes project root via symlink: "${candidatePath}"`, - ); - } - - return path.relative(realRoot, realResolved); -} - -function sanitizeRemapEntries( - remap: Record, - mode: 'throw' | 'drop', -): Record { - const projectRoot = process.cwd(); - const sanitized: Record = {}; - - for (const [from, to] of Object.entries(remap)) { - try { - const safeFrom = toSafeProjectRelativePath(from, projectRoot); - const safeTo = toSafeProjectRelativePath(to, projectRoot); - sanitized[safeFrom] = safeTo; - } catch (err) { - if (mode === 'throw') { - throw err; - } - } - } - - return sanitized; -} - -export function resolvePathRemap( - relPath: string, - remap: Record, -): string { - const projectRoot = process.cwd(); - const safeRelPath = toSafeProjectRelativePath(relPath, projectRoot); - const remapped = remap[safeRelPath] ?? remap[relPath]; - - if (remapped === undefined) { - return safeRelPath; - } - - // Fail closed: if remap target is invalid, ignore remap and keep original path. - try { - return toSafeProjectRelativePath(remapped, projectRoot); - } catch { - return safeRelPath; - } -} - -export function loadPathRemap(): Record { - const state = readState(); - const remap = state.path_remap ?? 
{}; - return sanitizeRemapEntries(remap, 'drop'); -} - -export function recordPathRemap(remap: Record): void { - const state = readState(); - const existing = sanitizeRemapEntries(state.path_remap ?? {}, 'drop'); - const incoming = sanitizeRemapEntries(remap, 'throw'); - state.path_remap = { ...existing, ...incoming }; - writeState(state); -} diff --git a/skills-engine/rebase.ts b/skills-engine/rebase.ts deleted file mode 100644 index 7b5d830..0000000 --- a/skills-engine/rebase.ts +++ /dev/null @@ -1,257 +0,0 @@ -import { execFileSync } from 'child_process'; -import crypto from 'crypto'; -import fs from 'fs'; -import os from 'os'; -import path from 'path'; - -import { clearBackup, createBackup, restoreBackup } from './backup.js'; -import { BASE_DIR, NANOCLAW_DIR } from './constants.js'; -import { copyDir } from './fs-utils.js'; -import { acquireLock } from './lock.js'; -import { mergeFile } from './merge.js'; -import { computeFileHash, readState, writeState } from './state.js'; -import type { RebaseResult } from './types.js'; - -function walkDir(dir: string, root: string): string[] { - const results: string[] = []; - if (!fs.existsSync(dir)) return results; - - for (const entry of fs.readdirSync(dir, { withFileTypes: true })) { - const fullPath = path.join(dir, entry.name); - if (entry.isDirectory()) { - results.push(...walkDir(fullPath, root)); - } else { - results.push(path.relative(root, fullPath)); - } - } - return results; -} - -function collectTrackedFiles(state: ReturnType): Set { - const tracked = new Set(); - - for (const skill of state.applied_skills) { - for (const relPath of Object.keys(skill.file_hashes)) { - tracked.add(relPath); - } - } - - if (state.custom_modifications) { - for (const mod of state.custom_modifications) { - for (const relPath of mod.files_modified) { - tracked.add(relPath); - } - } - } - - return tracked; -} - -export async function rebase(newBasePath?: string): Promise { - const projectRoot = process.cwd(); - const state = 
readState(); - - if (state.applied_skills.length === 0) { - return { - success: false, - filesInPatch: 0, - error: 'No skills applied. Nothing to rebase.', - }; - } - - const releaseLock = acquireLock(); - - try { - const trackedFiles = collectTrackedFiles(state); - const baseAbsDir = path.join(projectRoot, BASE_DIR); - - // Include base dir files - const baseFiles = walkDir(baseAbsDir, baseAbsDir); - for (const f of baseFiles) { - trackedFiles.add(f); - } - - // Backup - const filesToBackup: string[] = []; - for (const relPath of trackedFiles) { - const absPath = path.join(projectRoot, relPath); - if (fs.existsSync(absPath)) filesToBackup.push(absPath); - const baseFilePath = path.join(baseAbsDir, relPath); - if (fs.existsSync(baseFilePath)) filesToBackup.push(baseFilePath); - } - const stateFilePath = path.join(projectRoot, NANOCLAW_DIR, 'state.yaml'); - filesToBackup.push(stateFilePath); - createBackup(filesToBackup); - - try { - // Generate unified diff: base vs working tree (archival record) - let combinedPatch = ''; - let filesInPatch = 0; - - for (const relPath of trackedFiles) { - const basePath = path.join(baseAbsDir, relPath); - const workingPath = path.join(projectRoot, relPath); - - const oldPath = fs.existsSync(basePath) ? basePath : '/dev/null'; - const newPath = fs.existsSync(workingPath) ? 
workingPath : '/dev/null'; - - if (oldPath === '/dev/null' && newPath === '/dev/null') continue; - - try { - const diff = execFileSync('diff', ['-ruN', oldPath, newPath], { - encoding: 'utf-8', - }); - if (diff.trim()) { - combinedPatch += diff; - filesInPatch++; - } - } catch (err: unknown) { - const execErr = err as { status?: number; stdout?: string }; - if (execErr.status === 1 && execErr.stdout) { - combinedPatch += execErr.stdout; - filesInPatch++; - } else { - throw err; - } - } - } - - // Save combined patch - const patchPath = path.join(projectRoot, NANOCLAW_DIR, 'combined.patch'); - fs.writeFileSync(patchPath, combinedPatch, 'utf-8'); - - if (newBasePath) { - // --- Rebase with new base: three-way merge with resolution model --- - - // Save current working tree content before overwriting - const savedContent: Record = {}; - for (const relPath of trackedFiles) { - const workingPath = path.join(projectRoot, relPath); - if (fs.existsSync(workingPath)) { - savedContent[relPath] = fs.readFileSync(workingPath, 'utf-8'); - } - } - - const absNewBase = path.resolve(newBasePath); - - // Replace base - if (fs.existsSync(baseAbsDir)) { - fs.rmSync(baseAbsDir, { recursive: true, force: true }); - } - fs.mkdirSync(baseAbsDir, { recursive: true }); - copyDir(absNewBase, baseAbsDir); - - // Copy new base to working tree - copyDir(absNewBase, projectRoot); - - // Three-way merge per file: new-base ← old-base → saved-working-tree - const mergeConflicts: string[] = []; - - for (const relPath of trackedFiles) { - const newBaseSrc = path.join(absNewBase, relPath); - const currentPath = path.join(projectRoot, relPath); - const saved = savedContent[relPath]; - - if (!saved) continue; // No working tree content to merge - if (!fs.existsSync(newBaseSrc)) { - // File only existed in working tree, not in new base — restore it - fs.mkdirSync(path.dirname(currentPath), { recursive: true }); - fs.writeFileSync(currentPath, saved); - continue; - } - - const newBaseContent = 
fs.readFileSync(newBaseSrc, 'utf-8'); - if (newBaseContent === saved) continue; // No diff - - // Find old base content from backup - const oldBasePath = path.join( - projectRoot, - '.nanoclaw', - 'backup', - BASE_DIR, - relPath, - ); - if (!fs.existsSync(oldBasePath)) { - // No old base — keep saved content - fs.writeFileSync(currentPath, saved); - continue; - } - - // Three-way merge: current(new base) ← old-base → saved(modifications) - const tmpSaved = path.join( - os.tmpdir(), - `nanoclaw-rebase-${crypto.randomUUID()}-${path.basename(relPath)}`, - ); - fs.writeFileSync(tmpSaved, saved); - - const result = mergeFile(currentPath, oldBasePath, tmpSaved); - fs.unlinkSync(tmpSaved); - - if (!result.clean) { - mergeConflicts.push(relPath); - } - } - - if (mergeConflicts.length > 0) { - // Return with backup pending for Claude Code / user resolution - return { - success: false, - patchFile: patchPath, - filesInPatch, - mergeConflicts, - backupPending: true, - error: `Merge conflicts in: ${mergeConflicts.join(', ')}. 
Resolve manually then call clearBackup(), or restoreBackup() + clearBackup() to abort.`, - }; - } - } else { - // --- Rebase without new base: flatten into base --- - // Update base to current working tree state (all skills baked in) - for (const relPath of trackedFiles) { - const workingPath = path.join(projectRoot, relPath); - const basePath = path.join(baseAbsDir, relPath); - - if (fs.existsSync(workingPath)) { - fs.mkdirSync(path.dirname(basePath), { recursive: true }); - fs.copyFileSync(workingPath, basePath); - } else if (fs.existsSync(basePath)) { - // File was removed by skills — remove from base too - fs.unlinkSync(basePath); - } - } - } - - // Update state - const now = new Date().toISOString(); - - for (const skill of state.applied_skills) { - const updatedHashes: Record = {}; - for (const relPath of Object.keys(skill.file_hashes)) { - const absPath = path.join(projectRoot, relPath); - if (fs.existsSync(absPath)) { - updatedHashes[relPath] = computeFileHash(absPath); - } - } - skill.file_hashes = updatedHashes; - } - - delete state.custom_modifications; - state.rebased_at = now; - writeState(state); - - clearBackup(); - - return { - success: true, - patchFile: patchPath, - filesInPatch, - rebased_at: now, - }; - } catch (err) { - restoreBackup(); - clearBackup(); - throw err; - } - } finally { - releaseLock(); - } -} diff --git a/skills-engine/replay.ts b/skills-engine/replay.ts deleted file mode 100644 index 4f2f5e2..0000000 --- a/skills-engine/replay.ts +++ /dev/null @@ -1,270 +0,0 @@ -import crypto from 'crypto'; -import fs from 'fs'; -import os from 'os'; -import path from 'path'; - -import { BASE_DIR, NANOCLAW_DIR } from './constants.js'; -import { copyDir } from './fs-utils.js'; -import { readManifest } from './manifest.js'; -import { mergeFile } from './merge.js'; -import { loadPathRemap, resolvePathRemap } from './path-remap.js'; -import { - mergeDockerComposeServices, - mergeEnvAdditions, - mergeNpmDependencies, - runNpmInstall, -} from 
'./structured.js'; - -export interface ReplayOptions { - skills: string[]; - skillDirs: Record; - projectRoot?: string; -} - -export interface ReplayResult { - success: boolean; - perSkill: Record; - mergeConflicts?: string[]; - error?: string; -} - -/** - * Scan .claude/skills/ for a directory whose manifest.yaml has skill: . - */ -export function findSkillDir( - skillName: string, - projectRoot?: string, -): string | null { - const root = projectRoot ?? process.cwd(); - const skillsRoot = path.join(root, '.claude', 'skills'); - if (!fs.existsSync(skillsRoot)) return null; - - for (const entry of fs.readdirSync(skillsRoot, { withFileTypes: true })) { - if (!entry.isDirectory()) continue; - const dir = path.join(skillsRoot, entry.name); - const manifestPath = path.join(dir, 'manifest.yaml'); - if (!fs.existsSync(manifestPath)) continue; - - try { - const manifest = readManifest(dir); - if (manifest.skill === skillName) return dir; - } catch { - // Skip invalid manifests - } - } - - return null; -} - -/** - * Replay a list of skills from clean base state. - * Used by uninstall (replay-without) and rebase. - */ -export async function replaySkills( - options: ReplayOptions, -): Promise { - const projectRoot = options.projectRoot ?? process.cwd(); - const baseDir = path.join(projectRoot, BASE_DIR); - const pathRemap = loadPathRemap(); - - const perSkill: Record = {}; - const allMergeConflicts: string[] = []; - - // 1. 
Collect all files touched by any skill in the list
-  const allTouchedFiles = new Set<string>();
-  for (const skillName of options.skills) {
-    const skillDir = options.skillDirs[skillName];
-    if (!skillDir) {
-      perSkill[skillName] = {
-        success: false,
-        error: `Skill directory not found for: ${skillName}`,
-      };
-      return {
-        success: false,
-        perSkill,
-        error: `Missing skill directory for: ${skillName}`,
-      };
-    }
-
-    const manifest = readManifest(skillDir);
-    for (const f of manifest.adds) allTouchedFiles.add(f);
-    for (const f of manifest.modifies) allTouchedFiles.add(f);
-  }
-
-  // 2. Reset touched files to clean base
-  for (const relPath of allTouchedFiles) {
-    const resolvedPath = resolvePathRemap(relPath, pathRemap);
-    const currentPath = path.join(projectRoot, resolvedPath);
-    const basePath = path.join(baseDir, resolvedPath);
-
-    if (fs.existsSync(basePath)) {
-      // Restore from base
-      fs.mkdirSync(path.dirname(currentPath), { recursive: true });
-      fs.copyFileSync(basePath, currentPath);
-    } else if (fs.existsSync(currentPath)) {
-      // Add-only file not in base — remove it
-      fs.unlinkSync(currentPath);
-    }
-  }
-
-  // 3. Replay each skill in order
-  // Collect structured ops for batch application
-  const allNpmDeps: Record<string, string> = {};
-  const allEnvAdditions: string[] = [];
-  const allDockerServices: Record<string, unknown> = {};
-  let hasNpmDeps = false;
-
-  for (const skillName of options.skills) {
-    const skillDir = options.skillDirs[skillName];
-    try {
-      const manifest = readManifest(skillDir);
-
-      // Execute file_ops
-      if (manifest.file_ops && manifest.file_ops.length > 0) {
-        const { executeFileOps } = await import('./file-ops.js');
-        const fileOpsResult = executeFileOps(manifest.file_ops, projectRoot);
-        if (!fileOpsResult.success) {
-          perSkill[skillName] = {
-            success: false,
-            error: `File operations failed: ${fileOpsResult.errors.join('; ')}`,
-          };
-          return {
-            success: false,
-            perSkill,
-            error: `File ops failed for ${skillName}`,
-          };
-        }
-      }
-
-      // Copy add/ files
-      const addDir = path.join(skillDir, 'add');
-      if (fs.existsSync(addDir)) {
-        for (const relPath of manifest.adds) {
-          const resolvedDest = resolvePathRemap(relPath, pathRemap);
-          const destPath = path.join(projectRoot, resolvedDest);
-          const srcPath = path.join(addDir, relPath);
-          if (fs.existsSync(srcPath)) {
-            fs.mkdirSync(path.dirname(destPath), { recursive: true });
-            fs.copyFileSync(srcPath, destPath);
-          }
-        }
-      }
-
-      // Three-way merge modify/ files
-      const skillConflicts: string[] = [];
-
-      for (const relPath of manifest.modifies) {
-        const resolvedPath = resolvePathRemap(relPath, pathRemap);
-        const currentPath = path.join(projectRoot, resolvedPath);
-        const basePath = path.join(baseDir, resolvedPath);
-        const skillPath = path.join(skillDir, 'modify', relPath);
-
-        if (!fs.existsSync(skillPath)) {
-          skillConflicts.push(relPath);
-          continue;
-        }
-
-        if (!fs.existsSync(currentPath)) {
-          fs.mkdirSync(path.dirname(currentPath), { recursive: true });
-          fs.copyFileSync(skillPath, currentPath);
-          continue;
-        }
-
-        if (!fs.existsSync(basePath)) {
-          fs.mkdirSync(path.dirname(basePath), { recursive: true });
-          fs.copyFileSync(currentPath, basePath);
-        }
-
-        const tmpCurrent = path.join(
-          os.tmpdir(),
-          `nanoclaw-replay-${crypto.randomUUID()}-${path.basename(relPath)}`,
-        );
-        fs.copyFileSync(currentPath, tmpCurrent);
-
-        const result = mergeFile(tmpCurrent, basePath, skillPath);
-
-        if (result.clean) {
-          fs.copyFileSync(tmpCurrent, currentPath);
-          fs.unlinkSync(tmpCurrent);
-        } else {
-          fs.copyFileSync(tmpCurrent, currentPath);
-          fs.unlinkSync(tmpCurrent);
-          skillConflicts.push(resolvedPath);
-        }
-      }
-
-      if (skillConflicts.length > 0) {
-        allMergeConflicts.push(...skillConflicts);
-        perSkill[skillName] = {
-          success: false,
-          error: `Merge conflicts: ${skillConflicts.join(', ')}`,
-        };
-        // Stop on first conflict — later skills would merge against conflict markers
-        break;
-      } else {
-        perSkill[skillName] = { success: true };
-      }
-
-      // Collect structured ops
-      if (manifest.structured?.npm_dependencies) {
-        Object.assign(allNpmDeps, manifest.structured.npm_dependencies);
-        hasNpmDeps = true;
-      }
-      if (manifest.structured?.env_additions) {
-        allEnvAdditions.push(...manifest.structured.env_additions);
-      }
-      if (manifest.structured?.docker_compose_services) {
-        Object.assign(
-          allDockerServices,
-          manifest.structured.docker_compose_services,
-        );
-      }
-    } catch (err) {
-      perSkill[skillName] = {
-        success: false,
-        error: err instanceof Error ? err.message : String(err),
-      };
-      return {
-        success: false,
-        perSkill,
-        error: `Replay failed for ${skillName}: ${err instanceof Error ? err.message : String(err)}`,
-      };
-    }
-  }
-
-  if (allMergeConflicts.length > 0) {
-    return {
-      success: false,
-      perSkill,
-      mergeConflicts: allMergeConflicts,
-      error: `Unresolved merge conflicts: ${allMergeConflicts.join(', ')}`,
-    };
-  }
-
-  // 4. Apply aggregated structured operations (only if no conflicts)
-  if (hasNpmDeps) {
-    const pkgPath = path.join(projectRoot, 'package.json');
-    mergeNpmDependencies(pkgPath, allNpmDeps);
-  }
-
-  if (allEnvAdditions.length > 0) {
-    const envPath = path.join(projectRoot, '.env.example');
-    mergeEnvAdditions(envPath, allEnvAdditions);
-  }
-
-  if (Object.keys(allDockerServices).length > 0) {
-    const composePath = path.join(projectRoot, 'docker-compose.yml');
-    mergeDockerComposeServices(composePath, allDockerServices);
-  }
-
-  // 5. Run npm install if any deps
-  if (hasNpmDeps) {
-    try {
-      runNpmInstall();
-    } catch {
-      // npm install failure is non-fatal for replay
-    }
-  }
-
-  return { success: true, perSkill };
-}
diff --git a/skills-engine/state.ts b/skills-engine/state.ts
deleted file mode 100644
index 6754116..0000000
--- a/skills-engine/state.ts
+++ /dev/null
@@ -1,119 +0,0 @@
-import crypto from 'crypto';
-import fs from 'fs';
-import path from 'path';
-
-import { parse, stringify } from 'yaml';
-
-import {
-  SKILLS_SCHEMA_VERSION,
-  NANOCLAW_DIR,
-  STATE_FILE,
-} from './constants.js';
-import { AppliedSkill, CustomModification, SkillState } from './types.js';
-
-function getStatePath(): string {
-  return path.join(process.cwd(), NANOCLAW_DIR, STATE_FILE);
-}
-
-export function readState(): SkillState {
-  const statePath = getStatePath();
-  if (!fs.existsSync(statePath)) {
-    throw new Error(
-      '.nanoclaw/state.yaml not found. Run initSkillsSystem() first.',
-    );
-  }
-  const content = fs.readFileSync(statePath, 'utf-8');
-  const state = parse(content) as SkillState;
-
-  if (compareSemver(state.skills_system_version, SKILLS_SCHEMA_VERSION) > 0) {
-    throw new Error(
-      `state.yaml version ${state.skills_system_version} is newer than tooling version ${SKILLS_SCHEMA_VERSION}.
Update your skills engine.`,
-    );
-  }
-
-  return state;
-}
-
-export function writeState(state: SkillState): void {
-  const statePath = getStatePath();
-  fs.mkdirSync(path.dirname(statePath), { recursive: true });
-  const content = stringify(state, { sortMapEntries: true });
-  // Write to temp file then atomic rename to prevent corruption on crash
-  const tmpPath = statePath + '.tmp';
-  fs.writeFileSync(tmpPath, content, 'utf-8');
-  fs.renameSync(tmpPath, statePath);
-}
-
-export function recordSkillApplication(
-  skillName: string,
-  version: string,
-  fileHashes: Record<string, string>,
-  structuredOutcomes?: Record<string, unknown>,
-): void {
-  const state = readState();
-
-  // Remove previous application of same skill if exists
-  state.applied_skills = state.applied_skills.filter(
-    (s) => s.name !== skillName,
-  );
-
-  state.applied_skills.push({
-    name: skillName,
-    version,
-    applied_at: new Date().toISOString(),
-    file_hashes: fileHashes,
-    structured_outcomes: structuredOutcomes,
-  });
-
-  writeState(state);
-}
-
-export function getAppliedSkills(): AppliedSkill[] {
-  const state = readState();
-  return state.applied_skills;
-}
-
-export function recordCustomModification(
-  description: string,
-  filesModified: string[],
-  patchFile: string,
-): void {
-  const state = readState();
-
-  if (!state.custom_modifications) {
-    state.custom_modifications = [];
-  }
-
-  const mod: CustomModification = {
-    description,
-    applied_at: new Date().toISOString(),
-    files_modified: filesModified,
-    patch_file: patchFile,
-  };
-
-  state.custom_modifications.push(mod);
-  writeState(state);
-}
-
-export function getCustomModifications(): CustomModification[] {
-  const state = readState();
-  return state.custom_modifications || [];
-}
-
-export function computeFileHash(filePath: string): string {
-  const content = fs.readFileSync(filePath);
-  return crypto.createHash('sha256').update(content).digest('hex');
-}
-
-/**
- * Compare two semver strings. Returns negative if a < b, 0 if equal, positive if a > b.
- */
-export function compareSemver(a: string, b: string): number {
-  const partsA = a.split('.').map(Number);
-  const partsB = b.split('.').map(Number);
-  for (let i = 0; i < Math.max(partsA.length, partsB.length); i++) {
-    const diff = (partsA[i] || 0) - (partsB[i] || 0);
-    if (diff !== 0) return diff;
-  }
-  return 0;
-}
diff --git a/skills-engine/structured.ts b/skills-engine/structured.ts
deleted file mode 100644
index 2d64171..0000000
--- a/skills-engine/structured.ts
+++ /dev/null
@@ -1,201 +0,0 @@
-import { execSync } from 'child_process';
-import fs from 'fs';
-import { parse, stringify } from 'yaml';
-
-interface PackageJson {
-  dependencies?: Record<string, string>;
-  devDependencies?: Record<string, string>;
-  [key: string]: unknown;
-}
-
-interface DockerComposeFile {
-  version?: string;
-  services?: Record<string, unknown>;
-  [key: string]: unknown;
-}
-
-function compareVersionParts(a: string[], b: string[]): number {
-  const len = Math.max(a.length, b.length);
-  for (let i = 0; i < len; i++) {
-    const aNum = parseInt(a[i] ?? '0', 10);
-    const bNum = parseInt(b[i] ?? '0', 10);
-    if (aNum !== bNum) return aNum - bNum;
-  }
-  return 0;
-}
-
-export function areRangesCompatible(
-  existing: string,
-  requested: string,
-): { compatible: boolean; resolved: string } {
-  if (existing === requested) {
-    return { compatible: true, resolved: existing };
-  }
-
-  // Both start with ^
-  if (existing.startsWith('^') && requested.startsWith('^')) {
-    const eParts = existing.slice(1).split('.');
-    const rParts = requested.slice(1).split('.');
-    if (eParts[0] !== rParts[0]) {
-      return { compatible: false, resolved: existing };
-    }
-    // Same major — take the higher version
-    const resolved =
-      compareVersionParts(eParts, rParts) >= 0 ?
existing : requested;
-    return { compatible: true, resolved };
-  }
-
-  // Both start with ~
-  if (existing.startsWith('~') && requested.startsWith('~')) {
-    const eParts = existing.slice(1).split('.');
-    const rParts = requested.slice(1).split('.');
-    if (eParts[0] !== rParts[0] || eParts[1] !== rParts[1]) {
-      return { compatible: false, resolved: existing };
-    }
-    // Same major.minor — take higher patch
-    const resolved =
-      compareVersionParts(eParts, rParts) >= 0 ? existing : requested;
-    return { compatible: true, resolved };
-  }
-
-  // Mismatched prefixes or anything else (exact, >=, *, etc.)
-  return { compatible: false, resolved: existing };
-}
-
-export function mergeNpmDependencies(
-  packageJsonPath: string,
-  newDeps: Record<string, string>,
-): void {
-  const content = fs.readFileSync(packageJsonPath, 'utf-8');
-  const pkg: PackageJson = JSON.parse(content);
-
-  pkg.dependencies = pkg.dependencies || {};
-
-  for (const [name, version] of Object.entries(newDeps)) {
-    // Check both dependencies and devDependencies to avoid duplicates
-    const existing = pkg.dependencies[name] ?? pkg.devDependencies?.[name];
-    if (existing && existing !== version) {
-      const result = areRangesCompatible(existing, version);
-      if (!result.compatible) {
-        throw new Error(
-          `Dependency conflict: ${name} is already at ${existing}, skill wants ${version}`,
-        );
-      }
-      pkg.dependencies[name] = result.resolved;
-    } else {
-      pkg.dependencies[name] = version;
-    }
-  }
-
-  // Sort dependencies for deterministic output
-  pkg.dependencies = Object.fromEntries(
-    Object.entries(pkg.dependencies).sort(([a], [b]) => a.localeCompare(b)),
-  );
-
-  if (pkg.devDependencies) {
-    pkg.devDependencies = Object.fromEntries(
-      Object.entries(pkg.devDependencies).sort(([a], [b]) =>
-        a.localeCompare(b),
-      ),
-    );
-  }
-
-  fs.writeFileSync(
-    packageJsonPath,
-    JSON.stringify(pkg, null, 2) + '\n',
-    'utf-8',
-  );
-}
-
-export function mergeEnvAdditions(
-  envExamplePath: string,
-  additions: string[],
-): void {
-  let content = '';
-  if (fs.existsSync(envExamplePath)) {
-    content = fs.readFileSync(envExamplePath, 'utf-8');
-  }
-
-  const existingVars = new Set<string>();
-  for (const line of content.split('\n')) {
-    const match = line.match(/^([A-Za-z_][A-Za-z0-9_]*)=/);
-    if (match) existingVars.add(match[1]);
-  }
-
-  const newVars = additions.filter((v) => !existingVars.has(v));
-  if (newVars.length === 0) return;
-
-  if (content && !content.endsWith('\n')) content += '\n';
-  content += '\n# Added by skill\n';
-  for (const v of newVars) {
-    content += `${v}=\n`;
-  }
-
-  fs.writeFileSync(envExamplePath, content, 'utf-8');
-}
-
-function extractHostPort(portMapping: string): string | null {
-  const str = String(portMapping);
-  const parts = str.split(':');
-  if (parts.length >= 2) {
-    return parts[0];
-  }
-  return null;
-}
-
-export function mergeDockerComposeServices(
-  composePath: string,
-  services: Record<string, unknown>,
-): void {
-  let compose: DockerComposeFile;
-
-  if (fs.existsSync(composePath)) {
-    const content = fs.readFileSync(composePath, 'utf-8');
-    compose = (parse(content) as DockerComposeFile) || {};
-  } else {
-    compose = { version: '3' };
-  }
-
-  compose.services = compose.services || {};
-
-  // Collect host ports from existing services
-  const usedPorts = new Set<string>();
-  for (const [, svc] of Object.entries(compose.services)) {
-    const service = svc as Record<string, unknown>;
-    if (Array.isArray(service.ports)) {
-      for (const p of service.ports) {
-        const host = extractHostPort(String(p));
-        if (host) usedPorts.add(host);
-      }
-    }
-  }
-
-  // Add new services, checking for port collisions
-  for (const [name, definition] of Object.entries(services)) {
-    if (compose.services[name]) continue; // skip existing
-
-    const svc = definition as Record<string, unknown>;
-    if (Array.isArray(svc.ports)) {
-      for (const p of svc.ports) {
-        const host = extractHostPort(String(p));
-        if (host && usedPorts.has(host)) {
-          throw new Error(
-            `Port collision: host port ${host} from service "${name}" is already in use`,
-          );
-        }
-        if (host) usedPorts.add(host);
-      }
-    }
-
-    compose.services[name] = definition;
-  }
-
-  fs.writeFileSync(composePath, stringify(compose), 'utf-8');
-}
-
-export function runNpmInstall(): void {
-  execSync('npm install --legacy-peer-deps', {
-    stdio: 'inherit',
-    cwd: process.cwd(),
-  });
-}
diff --git a/skills-engine/tsconfig.json b/skills-engine/tsconfig.json
deleted file mode 100644
index cb99957..0000000
--- a/skills-engine/tsconfig.json
+++ /dev/null
@@ -1,16 +0,0 @@
-{
-  "compilerOptions": {
-    "target": "ES2022",
-    "module": "NodeNext",
-    "moduleResolution": "NodeNext",
-    "lib": ["ES2022"],
-    "strict": true,
-    "esModuleInterop": true,
-    "skipLibCheck": true,
-    "forceConsistentCasingInFileNames": true,
-    "resolveJsonModule": true,
-    "noEmit": true
-  },
-  "include": ["**/*.ts"],
-  "exclude": ["__tests__"]
-}
diff --git a/skills-engine/types.ts b/skills-engine/types.ts
deleted file mode 100644
index f177eda..0000000
--- a/skills-engine/types.ts
+++ /dev/null
@@ -1,115 +0,0 @@
-export interface SkillManifest {
-  skill: string;
-  version: string;
-  description:
string;
-  core_version: string;
-  adds: string[];
-  modifies: string[];
-  structured?: {
-    npm_dependencies?: Record<string, string>;
-    env_additions?: string[];
-    docker_compose_services?: Record<string, unknown>;
-  };
-  file_ops?: FileOperation[];
-  conflicts: string[];
-  depends: string[];
-  test?: string;
-  author?: string;
-  license?: string;
-  min_skills_system_version?: string;
-  tested_with?: string[];
-  post_apply?: string[];
-}
-
-export interface SkillState {
-  skills_system_version: string;
-  core_version: string;
-  applied_skills: AppliedSkill[];
-  custom_modifications?: CustomModification[];
-  path_remap?: Record<string, string>;
-  rebased_at?: string;
-}
-
-export interface AppliedSkill {
-  name: string;
-  version: string;
-  applied_at: string;
-  file_hashes: Record<string, string>;
-  structured_outcomes?: Record<string, unknown>;
-  custom_patch?: string;
-  custom_patch_description?: string;
-}
-
-export interface ApplyResult {
-  success: boolean;
-  skill: string;
-  version: string;
-  mergeConflicts?: string[];
-  backupPending?: boolean;
-  untrackedChanges?: string[];
-  error?: string;
-}
-
-export interface MergeResult {
-  clean: boolean;
-  exitCode: number;
-}
-
-export interface FileOperation {
-  type: 'rename' | 'delete' | 'move';
-  from?: string;
-  to?: string;
-  path?: string;
-}
-
-export interface FileOpsResult {
-  success: boolean;
-  executed: FileOperation[];
-  warnings: string[];
-  errors: string[];
-}
-
-export interface CustomModification {
-  description: string;
-  applied_at: string;
-  files_modified: string[];
-  patch_file: string;
-}
-
-export interface UpdatePreview {
-  currentVersion: string;
-  newVersion: string;
-  filesChanged: string[];
-  filesDeleted: string[];
-  conflictRisk: string[];
-  customPatchesAtRisk: string[];
-}
-
-export interface UpdateResult {
-  success: boolean;
-  previousVersion: string;
-  newVersion: string;
-  mergeConflicts?: string[];
-  backupPending?: boolean;
-  customPatchFailures?: string[];
-  skillReapplyResults?: Record<string, boolean>;
-  error?: string;
-}
-
-export interface UninstallResult {
-  success: boolean;
-  skill: string;
-  customPatchWarning?: string;
-  replayResults?: Record<string, boolean>;
-  error?: string;
-}
-
-export interface RebaseResult {
-  success: boolean;
-  patchFile?: string;
-  filesInPatch: number;
-  rebased_at?: string;
-  mergeConflicts?: string[];
-  backupPending?: boolean;
-  error?: string;
-}
diff --git a/skills-engine/uninstall.ts b/skills-engine/uninstall.ts
deleted file mode 100644
index 947574b..0000000
--- a/skills-engine/uninstall.ts
+++ /dev/null
@@ -1,231 +0,0 @@
-import { execFileSync, execSync } from 'child_process';
-import fs from 'fs';
-import path from 'path';
-
-import { clearBackup, createBackup, restoreBackup } from './backup.js';
-import { BASE_DIR, NANOCLAW_DIR } from './constants.js';
-import { acquireLock } from './lock.js';
-import { loadPathRemap, resolvePathRemap } from './path-remap.js';
-import { computeFileHash, readState, writeState } from './state.js';
-import { findSkillDir, replaySkills } from './replay.js';
-import type { UninstallResult } from './types.js';
-
-export async function uninstallSkill(
-  skillName: string,
-): Promise<UninstallResult> {
-  const projectRoot = process.cwd();
-  const state = readState();
-
-  // 1. Block after rebase — skills are baked into base
-  if (state.rebased_at) {
-    return {
-      success: false,
-      skill: skillName,
-      error:
-        'Cannot uninstall individual skills after rebase. The base includes all skill modifications. To remove a skill, start from a clean core and re-apply the skills you want.',
-    };
-  }
-
-  // 2. Verify skill exists
-  const skillEntry = state.applied_skills.find((s) => s.name === skillName);
-  if (!skillEntry) {
-    return {
-      success: false,
-      skill: skillName,
-      error: `Skill "${skillName}" is not applied.`,
-    };
-  }
-
-  // 3. Check for custom patch — warn but don't block
-  if (skillEntry.custom_patch) {
-    return {
-      success: false,
-      skill: skillName,
-      customPatchWarning: `Skill "${skillName}" has a custom patch (${skillEntry.custom_patch_description ?? 'no description'}). Uninstalling will lose these customizations. Re-run with confirmation to proceed.`,
-    };
-  }
-
-  // 4. Acquire lock
-  const releaseLock = acquireLock();
-
-  try {
-    // 5. Backup all files touched by any applied skill
-    const allTouchedFiles = new Set<string>();
-    for (const skill of state.applied_skills) {
-      for (const filePath of Object.keys(skill.file_hashes)) {
-        allTouchedFiles.add(filePath);
-      }
-    }
-    if (state.custom_modifications) {
-      for (const mod of state.custom_modifications) {
-        for (const f of mod.files_modified) {
-          allTouchedFiles.add(f);
-        }
-      }
-    }
-
-    const filesToBackup = [...allTouchedFiles].map((f) =>
-      path.join(projectRoot, f),
-    );
-    createBackup(filesToBackup);
-
-    // 6. Build remaining skill list (original order, minus removed)
-    const remainingSkills = state.applied_skills
-      .filter((s) => s.name !== skillName)
-      .map((s) => s.name);
-
-    // 7. Locate all skill dirs
-    const skillDirs: Record<string, string> = {};
-    for (const name of remainingSkills) {
-      const dir = findSkillDir(name, projectRoot);
-      if (!dir) {
-        restoreBackup();
-        clearBackup();
-        return {
-          success: false,
-          skill: skillName,
-          error: `Cannot find skill package for "${name}" in .claude/skills/. All remaining skills must be available for replay.`,
-        };
-      }
-      skillDirs[name] = dir;
-    }
-
-    // 8. Reset files exclusive to the removed skill; replaySkills handles the rest
-    const baseDir = path.join(projectRoot, BASE_DIR);
-    const pathRemap = loadPathRemap();
-
-    const remainingSkillFiles = new Set<string>();
-    for (const skill of state.applied_skills) {
-      if (skill.name === skillName) continue;
-      for (const filePath of Object.keys(skill.file_hashes)) {
-        remainingSkillFiles.add(filePath);
-      }
-    }
-
-    const removedSkillFiles = Object.keys(skillEntry.file_hashes);
-    for (const filePath of removedSkillFiles) {
-      if (remainingSkillFiles.has(filePath)) continue; // replaySkills handles it
-      const resolvedPath = resolvePathRemap(filePath, pathRemap);
-      const currentPath = path.join(projectRoot, resolvedPath);
-      const basePath = path.join(baseDir, resolvedPath);
-
-      if (fs.existsSync(basePath)) {
-        fs.mkdirSync(path.dirname(currentPath), { recursive: true });
-        fs.copyFileSync(basePath, currentPath);
-      } else if (fs.existsSync(currentPath)) {
-        // Add-only file not in base — remove
-        fs.unlinkSync(currentPath);
-      }
-    }
-
-    // 9. Replay remaining skills on clean base
-    const replayResult = await replaySkills({
-      skills: remainingSkills,
-      skillDirs,
-      projectRoot,
-    });
-
-    // 10. Check replay result before proceeding
-    if (!replayResult.success) {
-      restoreBackup();
-      clearBackup();
-      return {
-        success: false,
-        skill: skillName,
-        error: `Replay failed: ${replayResult.error}`,
-      };
-    }
-
-    // 11. Re-apply standalone custom_modifications
-    if (state.custom_modifications) {
-      for (const mod of state.custom_modifications) {
-        const patchPath = path.join(projectRoot, mod.patch_file);
-        if (fs.existsSync(patchPath)) {
-          try {
-            execFileSync('git', ['apply', '--3way', patchPath], {
-              stdio: 'pipe',
-              cwd: projectRoot,
-            });
-          } catch {
-            // Custom patch failure is non-fatal but noted
-          }
-        }
-      }
-    }
-
-    // 12. Run skill tests
-    const replayResults: Record<string, boolean> = {};
-    for (const skill of state.applied_skills) {
-      if (skill.name === skillName) continue;
-      const outcomes = skill.structured_outcomes as
-        | Record<string, unknown>
-        | undefined;
-      if (!outcomes?.test) continue;
-
-      try {
-        execSync(outcomes.test as string, {
-          stdio: 'pipe',
-          cwd: projectRoot,
-          timeout: 120_000,
-        });
-        replayResults[skill.name] = true;
-      } catch {
-        replayResults[skill.name] = false;
-      }
-    }
-
-    // Check for test failures
-    const testFailures = Object.entries(replayResults).filter(
-      ([, passed]) => !passed,
-    );
-    if (testFailures.length > 0) {
-      restoreBackup();
-      clearBackup();
-      return {
-        success: false,
-        skill: skillName,
-        replayResults,
-        error: `Tests failed after uninstall: ${testFailures.map(([n]) => n).join(', ')}`,
-      };
-    }
-
-    // 13. Update state
-    state.applied_skills = state.applied_skills.filter(
-      (s) => s.name !== skillName,
-    );
-
-    // Update file hashes for remaining skills
-    for (const skill of state.applied_skills) {
-      const newHashes: Record<string, string> = {};
-      for (const filePath of Object.keys(skill.file_hashes)) {
-        const absPath = path.join(projectRoot, filePath);
-        if (fs.existsSync(absPath)) {
-          newHashes[filePath] = computeFileHash(absPath);
-        }
-      }
-      skill.file_hashes = newHashes;
-    }
-
-    writeState(state);
-
-    // 14. Cleanup
-    clearBackup();
-
-    return {
-      success: true,
-      skill: skillName,
-      replayResults:
        Object.keys(replayResults).length > 0 ? replayResults : undefined,
-    };
-  } catch (err) {
-    restoreBackup();
-    clearBackup();
-    return {
-      success: false,
-      skill: skillName,
-      error: err instanceof Error ?
err.message : String(err), - }; - } finally { - releaseLock(); - } -} diff --git a/skills-engine/update.ts b/skills-engine/update.ts deleted file mode 100644 index 5d2e7f7..0000000 --- a/skills-engine/update.ts +++ /dev/null @@ -1,355 +0,0 @@ -import { execFileSync, execSync } from 'child_process'; -import crypto from 'crypto'; -import fs from 'fs'; -import os from 'os'; -import path from 'path'; - -import { parse as parseYaml } from 'yaml'; - -import { clearBackup, createBackup, restoreBackup } from './backup.js'; -import { BASE_DIR, NANOCLAW_DIR } from './constants.js'; -import { copyDir } from './fs-utils.js'; -import { isCustomizeActive } from './customize.js'; -import { acquireLock } from './lock.js'; -import { mergeFile } from './merge.js'; -import { recordPathRemap } from './path-remap.js'; -import { computeFileHash, readState, writeState } from './state.js'; -import { - mergeDockerComposeServices, - mergeEnvAdditions, - mergeNpmDependencies, - runNpmInstall, -} from './structured.js'; -import type { UpdatePreview, UpdateResult } from './types.js'; - -function walkDir(dir: string, root?: string): string[] { - const rootDir = root ?? dir; - const results: string[] = []; - for (const entry of fs.readdirSync(dir, { withFileTypes: true })) { - const fullPath = path.join(dir, entry.name); - if (entry.isDirectory()) { - results.push(...walkDir(fullPath, rootDir)); - } else { - results.push(path.relative(rootDir, fullPath)); - } - } - return results; -} - -export function previewUpdate(newCorePath: string): UpdatePreview { - const projectRoot = process.cwd(); - const state = readState(); - const baseDir = path.join(projectRoot, BASE_DIR); - - // Read new version from package.json in newCorePath - const newPkgPath = path.join(newCorePath, 'package.json'); - let newVersion = 'unknown'; - if (fs.existsSync(newPkgPath)) { - const pkg = JSON.parse(fs.readFileSync(newPkgPath, 'utf-8')); - newVersion = pkg.version ?? 
'unknown'; - } - - // Walk all files in newCorePath, compare against base to find changed files - const newCoreFiles = walkDir(newCorePath); - const filesChanged: string[] = []; - const filesDeleted: string[] = []; - - for (const relPath of newCoreFiles) { - const basePath = path.join(baseDir, relPath); - const newPath = path.join(newCorePath, relPath); - - if (!fs.existsSync(basePath)) { - filesChanged.push(relPath); - continue; - } - - const baseHash = computeFileHash(basePath); - const newHash = computeFileHash(newPath); - if (baseHash !== newHash) { - filesChanged.push(relPath); - } - } - - // Detect files deleted in the new core (exist in base but not in newCorePath) - if (fs.existsSync(baseDir)) { - const baseFiles = walkDir(baseDir); - const newCoreSet = new Set(newCoreFiles); - for (const relPath of baseFiles) { - if (!newCoreSet.has(relPath)) { - filesDeleted.push(relPath); - } - } - } - - // Check which changed files have skill overlaps - const conflictRisk: string[] = []; - const customPatchesAtRisk: string[] = []; - - for (const relPath of filesChanged) { - // Check applied skills - for (const skill of state.applied_skills) { - if (skill.file_hashes[relPath]) { - conflictRisk.push(relPath); - break; - } - } - - // Check custom modifications - if (state.custom_modifications) { - for (const mod of state.custom_modifications) { - if (mod.files_modified.includes(relPath)) { - customPatchesAtRisk.push(relPath); - break; - } - } - } - } - - return { - currentVersion: state.core_version, - newVersion, - filesChanged, - filesDeleted, - conflictRisk, - customPatchesAtRisk, - }; -} - -export async function applyUpdate(newCorePath: string): Promise { - const projectRoot = process.cwd(); - const state = readState(); - const baseDir = path.join(projectRoot, BASE_DIR); - - // --- Pre-flight --- - if (isCustomizeActive()) { - return { - success: false, - previousVersion: state.core_version, - newVersion: 'unknown', - error: - 'A customize session is active. 
Run commitCustomize() or abortCustomize() first.', - }; - } - - const releaseLock = acquireLock(); - - try { - // --- Preview --- - const preview = previewUpdate(newCorePath); - - // --- Backup --- - const filesToBackup = [ - ...preview.filesChanged.map((f) => path.join(projectRoot, f)), - ...preview.filesDeleted.map((f) => path.join(projectRoot, f)), - ]; - createBackup(filesToBackup); - - // --- Three-way merge --- - const mergeConflicts: string[] = []; - - for (const relPath of preview.filesChanged) { - const currentPath = path.join(projectRoot, relPath); - const basePath = path.join(baseDir, relPath); - const newCoreSrcPath = path.join(newCorePath, relPath); - - if (!fs.existsSync(currentPath)) { - // File doesn't exist yet — just copy from new core - fs.mkdirSync(path.dirname(currentPath), { recursive: true }); - fs.copyFileSync(newCoreSrcPath, currentPath); - continue; - } - - if (!fs.existsSync(basePath)) { - // No base — use current as base - fs.mkdirSync(path.dirname(basePath), { recursive: true }); - fs.copyFileSync(currentPath, basePath); - } - - // Three-way merge: current ← base → newCore - const tmpCurrent = path.join( - os.tmpdir(), - `nanoclaw-update-${crypto.randomUUID()}-${path.basename(relPath)}`, - ); - fs.copyFileSync(currentPath, tmpCurrent); - - const result = mergeFile(tmpCurrent, basePath, newCoreSrcPath); - - if (result.clean) { - fs.copyFileSync(tmpCurrent, currentPath); - fs.unlinkSync(tmpCurrent); - } else { - // Conflict — copy markers to working tree - fs.copyFileSync(tmpCurrent, currentPath); - fs.unlinkSync(tmpCurrent); - mergeConflicts.push(relPath); - } - } - - if (mergeConflicts.length > 0) { - // Preserve backup so user can resolve conflicts manually, then continue - // Call clearBackup() after resolution or restoreBackup() + clearBackup() to abort - return { - success: false, - previousVersion: preview.currentVersion, - newVersion: preview.newVersion, - mergeConflicts, - backupPending: true, - error: `Unresolved merge conflicts 
in: ${mergeConflicts.join(', ')}. Resolve manually then call clearBackup(), or restoreBackup() + clearBackup() to abort.`,
-      };
-    }
-
-    // --- Remove deleted files ---
-    for (const relPath of preview.filesDeleted) {
-      const currentPath = path.join(projectRoot, relPath);
-      if (fs.existsSync(currentPath)) {
-        fs.unlinkSync(currentPath);
-      }
-    }
-
-    // --- Re-apply custom patches ---
-    const customPatchFailures: string[] = [];
-    if (state.custom_modifications) {
-      for (const mod of state.custom_modifications) {
-        const patchPath = path.join(projectRoot, mod.patch_file);
-        if (!fs.existsSync(patchPath)) {
-          customPatchFailures.push(
-            `${mod.description}: patch file missing (${mod.patch_file})`,
-          );
-          continue;
-        }
-        try {
-          execFileSync('git', ['apply', '--3way', patchPath], {
-            stdio: 'pipe',
-            cwd: projectRoot,
-          });
-        } catch {
-          customPatchFailures.push(mod.description);
-        }
-      }
-    }
-
-    // --- Record path remaps from update metadata ---
-    const remapFile = path.join(
-      newCorePath,
-      '.nanoclaw-meta',
-      'path_remap.yaml',
-    );
-    if (fs.existsSync(remapFile)) {
-      const remap = parseYaml(fs.readFileSync(remapFile, 'utf-8')) as Record<
-        string,
-        string
-      >;
-      if (remap && typeof remap === 'object') {
-        recordPathRemap(remap);
-      }
-    }
-
-    // --- Update base ---
-    if (fs.existsSync(baseDir)) {
-      fs.rmSync(baseDir, { recursive: true, force: true });
-    }
-    fs.mkdirSync(baseDir, { recursive: true });
-    copyDir(newCorePath, baseDir);
-
-    // --- Structured ops: re-apply from all skills ---
-    const allNpmDeps: Record<string, string> = {};
-    const allEnvAdditions: string[] = [];
-    const allDockerServices: Record<string, unknown> = {};
-    let hasNpmDeps = false;
-
-    for (const skill of state.applied_skills) {
-      const outcomes = skill.structured_outcomes as
-        | Record<string, unknown>
-        | undefined;
-      if (!outcomes) continue;
-
-      if (outcomes.npm_dependencies) {
-        Object.assign(
-          allNpmDeps,
-          outcomes.npm_dependencies as Record<string, string>,
-        );
-        hasNpmDeps = true;
-      }
-      if (outcomes.env_additions) {
-        allEnvAdditions.push(...(outcomes.env_additions as string[]));
-      }
-      if (outcomes.docker_compose_services) {
-        Object.assign(
-          allDockerServices,
-          outcomes.docker_compose_services as Record<string, unknown>,
-        );
-      }
-    }
-
-    if (hasNpmDeps) {
-      const pkgPath = path.join(projectRoot, 'package.json');
-      mergeNpmDependencies(pkgPath, allNpmDeps);
-    }
-
-    if (allEnvAdditions.length > 0) {
-      const envPath = path.join(projectRoot, '.env.example');
-      mergeEnvAdditions(envPath, allEnvAdditions);
-    }
-
-    if (Object.keys(allDockerServices).length > 0) {
-      const composePath = path.join(projectRoot, 'docker-compose.yml');
-      mergeDockerComposeServices(composePath, allDockerServices);
-    }
-
-    if (hasNpmDeps) {
-      runNpmInstall();
-    }
-
-    // --- Run tests for each applied skill ---
-    const skillReapplyResults: Record<string, boolean> = {};
-
-    for (const skill of state.applied_skills) {
-      const outcomes = skill.structured_outcomes as
-        | Record<string, unknown>
-        | undefined;
-      if (!outcomes?.test) continue;
-
-      const testCmd = outcomes.test as string;
-      try {
-        execSync(testCmd, {
-          stdio: 'pipe',
-          cwd: projectRoot,
-          timeout: 120_000,
-        });
-        skillReapplyResults[skill.name] = true;
-      } catch {
-        skillReapplyResults[skill.name] = false;
-      }
-    }
-
-    // --- Update state ---
-    state.core_version = preview.newVersion;
-    writeState(state);
-
-    // --- Cleanup ---
-    clearBackup();
-
-    return {
-      success: true,
-      previousVersion: preview.currentVersion,
-      newVersion: preview.newVersion,
-      customPatchFailures:
-        customPatchFailures.length > 0 ? customPatchFailures : undefined,
-      skillReapplyResults:
-        Object.keys(skillReapplyResults).length > 0
-          ? skillReapplyResults
-          : undefined,
-    };
-  } catch (err) {
-    restoreBackup();
-    clearBackup();
-    return {
-      success: false,
-      previousVersion: state.core_version,
-      newVersion: 'unknown',
-      error: err instanceof Error ?
err.message : String(err), - }; - } finally { - releaseLock(); - } -} diff --git a/src/channels/index.ts b/src/channels/index.ts new file mode 100644 index 0000000..44f4f55 --- /dev/null +++ b/src/channels/index.ts @@ -0,0 +1,12 @@ +// Channel self-registration barrel file. +// Each import triggers the channel module's registerChannel() call. + +// discord + +// gmail + +// slack + +// telegram + +// whatsapp diff --git a/src/channels/registry.test.ts b/src/channels/registry.test.ts new file mode 100644 index 0000000..e89f62b --- /dev/null +++ b/src/channels/registry.test.ts @@ -0,0 +1,42 @@ +import { describe, it, expect } from 'vitest'; + +import { + registerChannel, + getChannelFactory, + getRegisteredChannelNames, +} from './registry.js'; + +// The registry is module-level state, so we need a fresh module per test. +// We use dynamic import with cache-busting to isolate tests. +// However, since vitest runs each file in its own context and we control +// registration order, we can test the public API directly. + +describe('channel registry', () => { + // Note: registry is shared module state across tests in this file. + // Tests are ordered to account for cumulative registrations. 
+
+  it('getChannelFactory returns undefined for unknown channel', () => {
+    expect(getChannelFactory('nonexistent')).toBeUndefined();
+  });
+
+  it('registerChannel and getChannelFactory round-trip', () => {
+    const factory = () => null;
+    registerChannel('test-channel', factory);
+    expect(getChannelFactory('test-channel')).toBe(factory);
+  });
+
+  it('getRegisteredChannelNames includes registered channels', () => {
+    registerChannel('another-channel', () => null);
+    const names = getRegisteredChannelNames();
+    expect(names).toContain('test-channel');
+    expect(names).toContain('another-channel');
+  });
+
+  it('later registration overwrites earlier one', () => {
+    const factory1 = () => null;
+    const factory2 = () => null;
+    registerChannel('overwrite-test', factory1);
+    registerChannel('overwrite-test', factory2);
+    expect(getChannelFactory('overwrite-test')).toBe(factory2);
+  });
+});
diff --git a/src/channels/registry.ts b/src/channels/registry.ts
new file mode 100644
index 0000000..ab871c3
--- /dev/null
+++ b/src/channels/registry.ts
@@ -0,0 +1,28 @@
+import {
+  Channel,
+  OnInboundMessage,
+  OnChatMetadata,
+  RegisteredGroup,
+} from '../types.js';
+
+export interface ChannelOpts {
+  onMessage: OnInboundMessage;
+  onChatMetadata: OnChatMetadata;
+  registeredGroups: () => Record<string, RegisteredGroup>;
+}
+
+export type ChannelFactory = (opts: ChannelOpts) => Channel | null;
+
+const registry = new Map<string, ChannelFactory>();
+
+export function registerChannel(name: string, factory: ChannelFactory): void {
+  registry.set(name, factory);
+}
+
+export function getChannelFactory(name: string): ChannelFactory | undefined {
+  return registry.get(name);
+}
+
+export function getRegisteredChannelNames(): string[] {
+  return [...registry.keys()];
+}
diff --git a/src/channels/whatsapp.test.ts b/src/channels/whatsapp.test.ts
deleted file mode 100644
index d7d0875..0000000
--- a/src/channels/whatsapp.test.ts
+++ /dev/null
@@ -1,949 +0,0 @@
-import { describe, it, expect, beforeEach, vi, afterEach } from 'vitest';
-import { EventEmitter } from 'events';
-
-// --- Mocks ---
-
-// Mock config
-vi.mock('../config.js', () => ({
-  STORE_DIR: '/tmp/nanoclaw-test-store',
-  ASSISTANT_NAME: 'Andy',
-  ASSISTANT_HAS_OWN_NUMBER: false,
-}));
-
-// Mock logger
-vi.mock('../logger.js', () => ({
-  logger: {
-    debug: vi.fn(),
-    info: vi.fn(),
-    warn: vi.fn(),
-    error: vi.fn(),
-  },
-}));
-
-// Mock db
-vi.mock('../db.js', () => ({
-  getLastGroupSync: vi.fn(() => null),
-  setLastGroupSync: vi.fn(),
-  updateChatName: vi.fn(),
-}));
-
-// Mock fs
-vi.mock('fs', async () => {
-  const actual = await vi.importActual('fs');
-  return {
-    ...actual,
-    default: {
-      ...actual,
-      existsSync: vi.fn(() => true),
-      mkdirSync: vi.fn(),
-    },
-  };
-});
-
-// Mock child_process (used for osascript notification)
-vi.mock('child_process', () => ({
-  exec: vi.fn(),
-}));
-
-// Build a fake WASocket that's an EventEmitter with the methods we need
-function createFakeSocket() {
-  const ev = new EventEmitter();
-  const sock = {
-    ev: {
-      on: (event: string, handler: (...args: unknown[]) => void) => {
-        ev.on(event, handler);
-      },
-    },
-    user: {
-      id: '1234567890:1@s.whatsapp.net',
-      lid: '9876543210:1@lid',
-    },
-    sendMessage: vi.fn().mockResolvedValue(undefined),
-    sendPresenceUpdate: vi.fn().mockResolvedValue(undefined),
-    groupFetchAllParticipating: vi.fn().mockResolvedValue({}),
-    end: vi.fn(),
-    // Expose the event emitter for triggering events in tests
-    _ev: ev,
-  };
-  return sock;
-}
-
-let fakeSocket: ReturnType<typeof createFakeSocket>;
-
-// Mock Baileys
-vi.mock('@whiskeysockets/baileys', () => {
-  return {
-    default: vi.fn(() => fakeSocket),
-    Browsers: { macOS: vi.fn(() => ['macOS', 'Chrome', '']) },
-    DisconnectReason: {
-      loggedOut: 401,
-      badSession: 500,
-      connectionClosed: 428,
-      connectionLost: 408,
-      connectionReplaced: 440,
-      timedOut: 408,
-      restartRequired: 515,
-    },
-    fetchLatestWaWebVersion: vi
-      .fn()
-      .mockResolvedValue({ version: [2, 3000, 0] }),
-    makeCacheableSignalKeyStore: vi.fn((keys: unknown) => keys),
-    useMultiFileAuthState: vi.fn().mockResolvedValue({
-      state: {
-        creds: {},
-        keys: {},
-      },
-      saveCreds: vi.fn(),
-    }),
-  };
-});
-
-import { WhatsAppChannel, WhatsAppChannelOpts } from './whatsapp.js';
-import { getLastGroupSync, updateChatName, setLastGroupSync } from '../db.js';
-
-// --- Test helpers ---
-
-function createTestOpts(
-  overrides?: Partial<WhatsAppChannelOpts>,
-): WhatsAppChannelOpts {
-  return {
-    onMessage: vi.fn(),
-    onChatMetadata: vi.fn(),
-    registeredGroups: vi.fn(() => ({
-      'registered@g.us': {
-        name: 'Test Group',
-        folder: 'test-group',
-        trigger: '@Andy',
-        added_at: '2024-01-01T00:00:00.000Z',
-      },
-    })),
-    ...overrides,
-  };
-}
-
-function triggerConnection(state: string, extra?: Record<string, unknown>) {
-  fakeSocket._ev.emit('connection.update', { connection: state, ...extra });
-}
-
-function triggerDisconnect(statusCode: number) {
-  fakeSocket._ev.emit('connection.update', {
-    connection: 'close',
-    lastDisconnect: {
-      error: { output: { statusCode } },
-    },
-  });
-}
-
-async function triggerMessages(messages: unknown[]) {
-  fakeSocket._ev.emit('messages.upsert', { messages });
-  // Flush microtasks so the async messages.upsert handler completes
-  await new Promise((r) => setTimeout(r, 0));
-}
-
-// --- Tests ---
-
-describe('WhatsAppChannel', () => {
-  beforeEach(() => {
-    fakeSocket = createFakeSocket();
-    vi.mocked(getLastGroupSync).mockReturnValue(null);
-  });
-
-  afterEach(() => {
-    vi.restoreAllMocks();
-  });
-
-  /**
-   * Helper: start connect, flush microtasks so event handlers are registered,
-   * then trigger the connection open event. Returns the resolved promise.
-   */
-  async function connectChannel(channel: WhatsAppChannel): Promise<void> {
-    const p = channel.connect();
-    // Flush microtasks so connectInternal completes its await and registers handlers
-    await new Promise((r) => setTimeout(r, 0));
-    triggerConnection('open');
-    return p;
-  }
-
-  // --- Version fetch ---
-
-  describe('version fetch', () => {
-    it('connects with fetched version', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-      await connectChannel(channel);
-
-      const { fetchLatestWaWebVersion } =
-        await import('@whiskeysockets/baileys');
-      expect(fetchLatestWaWebVersion).toHaveBeenCalledWith({});
-    });
-
-    it('falls back gracefully when version fetch fails', async () => {
-      const { fetchLatestWaWebVersion } =
-        await import('@whiskeysockets/baileys');
-      vi.mocked(fetchLatestWaWebVersion).mockRejectedValueOnce(
-        new Error('network error'),
-      );
-
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-      await connectChannel(channel);
-
-      // Should still connect successfully despite fetch failure
-      expect(channel.isConnected()).toBe(true);
-    });
-  });
-
-  // --- Connection lifecycle ---
-
-  describe('connection lifecycle', () => {
-    it('resolves connect() when connection opens', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      expect(channel.isConnected()).toBe(true);
-    });
-
-    it('sets up LID to phone mapping on open', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      // The channel should have mapped the LID from sock.user
-      // We can verify by sending a message from a LID JID
-      // and checking the translated JID in the callback
-    });
-
-    it('flushes outgoing queue on reconnect', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      // Disconnect
-      (channel as any).connected = false;
-
-      // Queue a message while disconnected
-      await channel.sendMessage('test@g.us', 'Queued message');
-      expect(fakeSocket.sendMessage).not.toHaveBeenCalled();
-
-      // Reconnect
-      (channel as any).connected = true;
-      await (channel as any).flushOutgoingQueue();
-
-      // Group messages get prefixed when flushed
-      expect(fakeSocket.sendMessage).toHaveBeenCalledWith('test@g.us', {
-        text: 'Andy: Queued message',
-      });
-    });
-
-    it('disconnects cleanly', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await channel.disconnect();
-      expect(channel.isConnected()).toBe(false);
-      expect(fakeSocket.end).toHaveBeenCalled();
-    });
-  });
-
-  // --- QR code and auth ---
-
-  describe('authentication', () => {
-    it('exits process when QR code is emitted (no auth state)', async () => {
-      vi.useFakeTimers();
-      const mockExit = vi
-        .spyOn(process, 'exit')
-        .mockImplementation(() => undefined as never);
-
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      // Start connect but don't await (it won't resolve - process exits)
-      channel.connect().catch(() => {});
-
-      // Flush microtasks so connectInternal registers handlers
-      await vi.advanceTimersByTimeAsync(0);
-
-      // Emit QR code event
-      fakeSocket._ev.emit('connection.update', { qr: 'some-qr-data' });
-
-      // Advance timer past the 1000ms setTimeout before exit
-      await vi.advanceTimersByTimeAsync(1500);
-
-      expect(mockExit).toHaveBeenCalledWith(1);
-      mockExit.mockRestore();
-      vi.useRealTimers();
-    });
-  });
-
-  // --- Reconnection behavior ---
-
-  describe('reconnection', () => {
-    it('reconnects on non-loggedOut disconnect', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      expect(channel.isConnected()).toBe(true);
-
-      // Disconnect with a non-loggedOut reason (e.g., connectionClosed = 428)
-      triggerDisconnect(428);
-
-      expect(channel.isConnected()).toBe(false);
-      // The channel should attempt to reconnect (calls connectInternal again)
-    });
-
-    it('exits on loggedOut disconnect', async () => {
-      const mockExit = vi
-        .spyOn(process, 'exit')
-        .mockImplementation(() => undefined as never);
-
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      // Disconnect with loggedOut reason (401)
-      triggerDisconnect(401);
-
-      expect(channel.isConnected()).toBe(false);
-      expect(mockExit).toHaveBeenCalledWith(0);
-      mockExit.mockRestore();
-    });
-
-    it('retries reconnection after 5s on failure', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      // Disconnect with stream error 515
-      triggerDisconnect(515);
-
-      // The channel sets a 5s retry — just verify it doesn't crash
-      await new Promise((r) => setTimeout(r, 100));
-    });
-  });
-
-  // --- Message handling ---
-
-  describe('message handling', () => {
-    it('delivers message for registered group', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-1',
-            remoteJid: 'registered@g.us',
-            participant: '5551234@s.whatsapp.net',
-            fromMe: false,
-          },
-          message: { conversation: 'Hello Andy' },
-          pushName: 'Alice',
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      expect(opts.onChatMetadata).toHaveBeenCalledWith(
-        'registered@g.us',
-        expect.any(String),
-        undefined,
-        'whatsapp',
-        true,
-      );
-      expect(opts.onMessage).toHaveBeenCalledWith(
-        'registered@g.us',
-        expect.objectContaining({
-          id: 'msg-1',
-          content: 'Hello Andy',
-          sender_name: 'Alice',
-          is_from_me: false,
-        }),
-      );
-    });
-
-    it('only emits metadata for unregistered groups', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-2',
-            remoteJid: 'unregistered@g.us',
-            participant: '5551234@s.whatsapp.net',
-            fromMe: false,
-          },
-          message: { conversation: 'Hello' },
-          pushName: 'Bob',
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      expect(opts.onChatMetadata).toHaveBeenCalledWith(
-        'unregistered@g.us',
-        expect.any(String),
-        undefined,
-        'whatsapp',
-        true,
-      );
-      expect(opts.onMessage).not.toHaveBeenCalled();
-    });
-
-    it('ignores status@broadcast messages', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-3',
-            remoteJid: 'status@broadcast',
-            fromMe: false,
-          },
-          message: { conversation: 'Status update' },
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      expect(opts.onChatMetadata).not.toHaveBeenCalled();
-      expect(opts.onMessage).not.toHaveBeenCalled();
-    });
-
-    it('ignores messages with no content', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-4',
-            remoteJid: 'registered@g.us',
-            fromMe: false,
-          },
-          message: null,
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      expect(opts.onMessage).not.toHaveBeenCalled();
-    });
-
-    it('extracts text from extendedTextMessage', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-5',
-            remoteJid: 'registered@g.us',
-            participant: '5551234@s.whatsapp.net',
-            fromMe: false,
-          },
-          message: {
-            extendedTextMessage: { text: 'A reply message' },
-          },
-          pushName: 'Charlie',
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      expect(opts.onMessage).toHaveBeenCalledWith(
-        'registered@g.us',
-        expect.objectContaining({ content: 'A reply message' }),
-      );
-    });
-
-    it('extracts caption from imageMessage', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-6',
-            remoteJid: 'registered@g.us',
-            participant: '5551234@s.whatsapp.net',
-            fromMe: false,
-          },
-          message: {
-            imageMessage: {
-              caption: 'Check this photo',
-              mimetype: 'image/jpeg',
-            },
-          },
-          pushName: 'Diana',
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      expect(opts.onMessage).toHaveBeenCalledWith(
-        'registered@g.us',
-        expect.objectContaining({ content: 'Check this photo' }),
-      );
-    });
-
-    it('extracts caption from videoMessage', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-7',
-            remoteJid: 'registered@g.us',
-            participant: '5551234@s.whatsapp.net',
-            fromMe: false,
-          },
-          message: {
-            videoMessage: { caption: 'Watch this', mimetype: 'video/mp4' },
-          },
-          pushName: 'Eve',
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      expect(opts.onMessage).toHaveBeenCalledWith(
-        'registered@g.us',
-        expect.objectContaining({ content: 'Watch this' }),
-      );
-    });
-
-    it('handles message with no extractable text (e.g. voice note without caption)', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-8',
-            remoteJid: 'registered@g.us',
-            participant: '5551234@s.whatsapp.net',
-            fromMe: false,
-          },
-          message: {
-            audioMessage: { mimetype: 'audio/ogg; codecs=opus', ptt: true },
-          },
-          pushName: 'Frank',
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      // Skipped — no text content to process
-      expect(opts.onMessage).not.toHaveBeenCalled();
-    });
-
-    it('uses sender JID when pushName is absent', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-9',
-            remoteJid: 'registered@g.us',
-            participant: '5551234@s.whatsapp.net',
-            fromMe: false,
-          },
-          message: { conversation: 'No push name' },
-          // pushName is undefined
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      expect(opts.onMessage).toHaveBeenCalledWith(
-        'registered@g.us',
-        expect.objectContaining({ sender_name: '5551234' }),
-      );
-    });
-  });
-
-  // --- LID ↔ JID translation ---
-
-  describe('LID to JID translation', () => {
-    it('translates known LID to phone JID', async () => {
-      const opts = createTestOpts({
-        registeredGroups: vi.fn(() => ({
-          '1234567890@s.whatsapp.net': {
-            name: 'Self Chat',
-            folder: 'self-chat',
-            trigger: '@Andy',
-            added_at: '2024-01-01T00:00:00.000Z',
-          },
-        })),
-      });
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      // The socket has lid '9876543210:1@lid' → phone '1234567890@s.whatsapp.net'
-      // Send a message from the LID
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-lid',
-            remoteJid: '9876543210@lid',
-            fromMe: false,
-          },
-          message: { conversation: 'From LID' },
-          pushName: 'Self',
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      // Should be translated to phone JID
-      expect(opts.onChatMetadata).toHaveBeenCalledWith(
-        '1234567890@s.whatsapp.net',
-        expect.any(String),
-        undefined,
-        'whatsapp',
-        false,
-      );
-    });
-
-    it('passes through non-LID JIDs unchanged', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-normal',
-            remoteJid: 'registered@g.us',
-            participant: '5551234@s.whatsapp.net',
-            fromMe: false,
-          },
-          message: { conversation: 'Normal JID' },
-          pushName: 'Grace',
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      expect(opts.onChatMetadata).toHaveBeenCalledWith(
-        'registered@g.us',
-        expect.any(String),
-        undefined,
-        'whatsapp',
-        true,
-      );
-    });
-
-    it('passes through unknown LID JIDs unchanged', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await triggerMessages([
-        {
-          key: {
-            id: 'msg-unknown-lid',
-            remoteJid: '0000000000@lid',
-            fromMe: false,
-          },
-          message: { conversation: 'Unknown LID' },
-          pushName: 'Unknown',
-          messageTimestamp: Math.floor(Date.now() / 1000),
-        },
-      ]);
-
-      // Unknown LID passes through unchanged
-      expect(opts.onChatMetadata).toHaveBeenCalledWith(
-        '0000000000@lid',
-        expect.any(String),
-        undefined,
-        'whatsapp',
-        false,
-      );
-    });
-  });
-
-  // --- Outgoing message queue ---
-
-  describe('outgoing message queue', () => {
-    it('sends message directly when connected', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await channel.sendMessage('test@g.us', 'Hello');
-      // Group messages get prefixed with assistant name
-      expect(fakeSocket.sendMessage).toHaveBeenCalledWith('test@g.us', {
-        text: 'Andy: Hello',
-      });
-    });
-
-    it('prefixes direct chat messages on shared number', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await channel.sendMessage('123@s.whatsapp.net', 'Hello');
-      // Shared number: DMs also get prefixed (needed for self-chat distinction)
-      expect(fakeSocket.sendMessage).toHaveBeenCalledWith(
-        '123@s.whatsapp.net',
-        { text: 'Andy: Hello' },
-      );
-    });
-
-    it('queues message when disconnected', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      // Don't connect — channel starts disconnected
-      await channel.sendMessage('test@g.us', 'Queued');
-      expect(fakeSocket.sendMessage).not.toHaveBeenCalled();
-    });
-
-    it('queues message on send failure', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      // Make sendMessage fail
-      fakeSocket.sendMessage.mockRejectedValueOnce(new Error('Network error'));
-
-      await channel.sendMessage('test@g.us', 'Will fail');
-
-      // Should not throw, message queued for retry
-      // The queue should have the message
-    });
-
-    it('flushes multiple queued messages in order', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      // Queue messages while disconnected
-      await channel.sendMessage('test@g.us', 'First');
-      await channel.sendMessage('test@g.us', 'Second');
-      await channel.sendMessage('test@g.us', 'Third');
-
-      // Connect — flush happens automatically on open
-      await connectChannel(channel);
-
-      // Give the async flush time to complete
-      await new Promise((r) => setTimeout(r, 50));
-
-      expect(fakeSocket.sendMessage).toHaveBeenCalledTimes(3);
-      // Group messages get prefixed
-      expect(fakeSocket.sendMessage).toHaveBeenNthCalledWith(1, 'test@g.us', {
-        text: 'Andy: First',
-      });
-      expect(fakeSocket.sendMessage).toHaveBeenNthCalledWith(2, 'test@g.us', {
-        text: 'Andy: Second',
-      });
-      expect(fakeSocket.sendMessage).toHaveBeenNthCalledWith(3, 'test@g.us', {
-        text: 'Andy: Third',
-      });
-    });
-  });
-
-  // --- Group metadata sync ---
-
-  describe('group metadata sync', () => {
-    it('syncs group metadata on first connection', async () => {
-      fakeSocket.groupFetchAllParticipating.mockResolvedValue({
-        'group1@g.us': { subject: 'Group One' },
-        'group2@g.us': { subject: 'Group Two' },
-      });
-
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      // Wait for async sync to complete
-      await new Promise((r) => setTimeout(r, 50));
-
-      expect(fakeSocket.groupFetchAllParticipating).toHaveBeenCalled();
-      expect(updateChatName).toHaveBeenCalledWith('group1@g.us', 'Group One');
-      expect(updateChatName).toHaveBeenCalledWith('group2@g.us', 'Group Two');
-      expect(setLastGroupSync).toHaveBeenCalled();
-    });
-
-    it('skips sync when synced recently', async () => {
-      // Last sync was 1 hour ago (within 24h threshold)
-      vi.mocked(getLastGroupSync).mockReturnValue(
-        new Date(Date.now() - 60 * 60 * 1000).toISOString(),
-      );
-
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await new Promise((r) => setTimeout(r, 50));
-
-      expect(fakeSocket.groupFetchAllParticipating).not.toHaveBeenCalled();
-    });
-
-    it('forces sync regardless of cache', async () => {
-      vi.mocked(getLastGroupSync).mockReturnValue(
-        new Date(Date.now() - 60 * 60 * 1000).toISOString(),
-      );
-
-      fakeSocket.groupFetchAllParticipating.mockResolvedValue({
-        'group@g.us': { subject: 'Forced Group' },
-      });
-
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await channel.syncGroupMetadata(true);
-
-      expect(fakeSocket.groupFetchAllParticipating).toHaveBeenCalled();
-      expect(updateChatName).toHaveBeenCalledWith('group@g.us', 'Forced Group');
-    });
-
-    it('handles group sync failure gracefully', async () => {
-      fakeSocket.groupFetchAllParticipating.mockRejectedValue(
-        new Error('Network timeout'),
-      );
-
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      // Should not throw
-      await expect(channel.syncGroupMetadata(true)).resolves.toBeUndefined();
-    });
-
-    it('skips groups with no subject', async () => {
-      fakeSocket.groupFetchAllParticipating.mockResolvedValue({
-        'group1@g.us': { subject: 'Has Subject' },
-        'group2@g.us': { subject: '' },
-        'group3@g.us': {},
-      });
-
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      // Clear any calls from the automatic sync on connect
-      vi.mocked(updateChatName).mockClear();
-
-      await channel.syncGroupMetadata(true);
-
-      expect(updateChatName).toHaveBeenCalledTimes(1);
-      expect(updateChatName).toHaveBeenCalledWith('group1@g.us', 'Has Subject');
-    });
-  });
-
-  // --- JID ownership ---
-
-  describe('ownsJid', () => {
-    it('owns @g.us JIDs (WhatsApp groups)', () => {
-      const channel = new WhatsAppChannel(createTestOpts());
-      expect(channel.ownsJid('12345@g.us')).toBe(true);
-    });
-
-    it('owns @s.whatsapp.net JIDs (WhatsApp DMs)', () => {
-      const channel = new WhatsAppChannel(createTestOpts());
-      expect(channel.ownsJid('12345@s.whatsapp.net')).toBe(true);
-    });
-
-    it('does not own Telegram JIDs', () => {
-      const channel = new WhatsAppChannel(createTestOpts());
-      expect(channel.ownsJid('tg:12345')).toBe(false);
-    });
-
-    it('does not own unknown JID formats', () => {
-      const channel = new WhatsAppChannel(createTestOpts());
-      expect(channel.ownsJid('random-string')).toBe(false);
-    });
-  });
-
-  // --- Typing indicator ---
-
-  describe('setTyping', () => {
-    it('sends composing presence when typing', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await channel.setTyping('test@g.us', true);
-      expect(fakeSocket.sendPresenceUpdate).toHaveBeenCalledWith(
-        'composing',
-        'test@g.us',
-      );
-    });
-
-    it('sends paused presence when stopping', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      await channel.setTyping('test@g.us', false);
-      expect(fakeSocket.sendPresenceUpdate).toHaveBeenCalledWith(
-        'paused',
-        'test@g.us',
-      );
-    });
-
-    it('handles typing indicator failure gracefully', async () => {
-      const opts = createTestOpts();
-      const channel = new WhatsAppChannel(opts);
-
-      await connectChannel(channel);
-
-      fakeSocket.sendPresenceUpdate.mockRejectedValueOnce(new Error('Failed'));
-
-      // Should not throw
-      await expect(
-        channel.setTyping('test@g.us', true),
-      ).resolves.toBeUndefined();
-    });
-  });
-
-  // --- Channel properties ---
-
-  describe('channel properties', () => {
-    it('has name "whatsapp"', () => {
-      const channel = new WhatsAppChannel(createTestOpts());
-      expect(channel.name).toBe('whatsapp');
-    });
-
-    it('does not expose prefixAssistantName (prefix handled internally)', () => {
-      const channel = new WhatsAppChannel(createTestOpts());
-      expect('prefixAssistantName' in channel).toBe(false);
-    });
-  });
-});
diff --git a/src/channels/whatsapp.ts b/src/channels/whatsapp.ts
deleted file mode 100644
index f603025..0000000
--- a/src/channels/whatsapp.ts
+++ /dev/null
@@ -1,378 +0,0 @@
-import { exec } from 'child_process';
-import fs from 'fs';
-import path from 'path';
-
-import makeWASocket, {
-  Browsers,
-  DisconnectReason,
-  WASocket,
-  fetchLatestWaWebVersion,
-  makeCacheableSignalKeyStore,
-  useMultiFileAuthState,
-} from '@whiskeysockets/baileys';
-
-import {
-  ASSISTANT_HAS_OWN_NUMBER,
-  ASSISTANT_NAME,
-  STORE_DIR,
-} from '../config.js';
-import { getLastGroupSync, setLastGroupSync, updateChatName } from '../db.js';
-import { logger } from '../logger.js';
-import {
-  Channel,
-  OnInboundMessage,
-  OnChatMetadata,
-  RegisteredGroup,
-} from '../types.js';
-
-const GROUP_SYNC_INTERVAL_MS = 24 * 60 * 60 * 1000; // 24 hours
-
-export interface WhatsAppChannelOpts {
-  onMessage: OnInboundMessage;
-  onChatMetadata: OnChatMetadata;
-  registeredGroups: () => Record<string, RegisteredGroup>;
-}
-
-export class WhatsAppChannel implements Channel {
-  name = 'whatsapp';
-
-  private sock!: WASocket;
-  private connected = false;
-  private lidToPhoneMap: Record<string, string> = {};
-  private outgoingQueue: Array<{ jid: string; text: string }> = [];
-  private flushing = false;
-  private groupSyncTimerStarted = false;
-
-  private opts: WhatsAppChannelOpts;
-
-  constructor(opts: WhatsAppChannelOpts) {
-    this.opts = opts;
-  }
-
-  async connect(): Promise<void> {
-    return new Promise<void>((resolve, reject) => {
-      this.connectInternal(resolve).catch(reject);
-    });
-  }
-
-  private async connectInternal(onFirstOpen?: () => void): Promise<void> {
-    const authDir = path.join(STORE_DIR, 'auth');
-    fs.mkdirSync(authDir, { recursive: true });
-
-    const { state, saveCreds } = await useMultiFileAuthState(authDir);
-
-    const { version } = await fetchLatestWaWebVersion({}).catch((err) => {
-      logger.warn(
-        { err },
-        'Failed to fetch latest WA Web version, using default',
-      );
-      return { version: undefined };
-    });
-    this.sock = makeWASocket({
-      version,
-      auth: {
-        creds: state.creds,
-        keys: makeCacheableSignalKeyStore(state.keys, logger),
-      },
-      printQRInTerminal: false,
-      logger,
-      browser: Browsers.macOS('Chrome'),
-    });
-
-    this.sock.ev.on('connection.update', (update) => {
-      const { connection, lastDisconnect, qr } = update;
-
-      if (qr) {
-        const msg =
-          'WhatsApp authentication required. Run /setup in Claude Code.';
-        logger.error(msg);
-        exec(
-          `osascript -e 'display notification "${msg}" with title "NanoClaw" sound name "Basso"'`,
-        );
-        setTimeout(() => process.exit(1), 1000);
-      }
-
-      if (connection === 'close') {
-        this.connected = false;
-        const reason = (
-          lastDisconnect?.error as { output?: { statusCode?: number } }
-        )?.output?.statusCode;
-        const shouldReconnect = reason !== DisconnectReason.loggedOut;
-        logger.info(
-          {
-            reason,
-            shouldReconnect,
-            queuedMessages: this.outgoingQueue.length,
-          },
-          'Connection closed',
-        );
-
-        if (shouldReconnect) {
-          logger.info('Reconnecting...');
-          this.connectInternal().catch((err) => {
-            logger.error({ err }, 'Failed to reconnect, retrying in 5s');
-            setTimeout(() => {
-              this.connectInternal().catch((err2) => {
-                logger.error({ err: err2 }, 'Reconnection retry failed');
-              });
-            }, 5000);
-          });
-        } else {
-          logger.info('Logged out. Run /setup to re-authenticate.');
-          process.exit(0);
-        }
-      } else if (connection === 'open') {
-        this.connected = true;
-        logger.info('Connected to WhatsApp');
-
-        // Announce availability so WhatsApp relays subsequent presence updates (typing indicators)
-        this.sock.sendPresenceUpdate('available').catch((err) => {
-          logger.warn({ err }, 'Failed to send presence update');
-        });
-
-        // Build LID to phone mapping from auth state for self-chat translation
-        if (this.sock.user) {
-          const phoneUser = this.sock.user.id.split(':')[0];
-          const lidUser = this.sock.user.lid?.split(':')[0];
-          if (lidUser && phoneUser) {
-            this.lidToPhoneMap[lidUser] = `${phoneUser}@s.whatsapp.net`;
-            logger.debug({ lidUser, phoneUser }, 'LID to phone mapping set');
-          }
-        }
-
-        // Flush any messages queued while disconnected
-        this.flushOutgoingQueue().catch((err) =>
-          logger.error({ err }, 'Failed to flush outgoing queue'),
-        );
-
-        // Sync group metadata on startup (respects 24h cache)
-        this.syncGroupMetadata().catch((err) =>
-          logger.error({ err }, 'Initial group sync failed'),
-        );
-        // Set up daily sync timer (only once)
-        if (!this.groupSyncTimerStarted) {
-          this.groupSyncTimerStarted = true;
-          setInterval(() => {
-            this.syncGroupMetadata().catch((err) =>
-              logger.error({ err }, 'Periodic group sync failed'),
-            );
-          }, GROUP_SYNC_INTERVAL_MS);
-        }
-
-        // Signal first connection to caller
-        if (onFirstOpen) {
-          onFirstOpen();
-          onFirstOpen = undefined;
-        }
-      }
-    });
-
-    this.sock.ev.on('creds.update', saveCreds);
-
-    this.sock.ev.on('messages.upsert', async ({ messages }) => {
-      for (const msg of messages) {
-        if (!msg.message) continue;
-        const rawJid = msg.key.remoteJid;
-        if (!rawJid || rawJid === 'status@broadcast') continue;
-
-        // Translate LID JID to phone JID if applicable
-        const chatJid = await this.translateJid(rawJid);
-
-        const timestamp = new Date(
-          Number(msg.messageTimestamp) * 1000,
-        ).toISOString();
-
-        // Always notify about chat metadata for group discovery
-        const isGroup = chatJid.endsWith('@g.us');
-        this.opts.onChatMetadata(
-          chatJid,
-          timestamp,
-          undefined,
-          'whatsapp',
-          isGroup,
-        );
-
-        // Only deliver full message for registered groups
-        const groups = this.opts.registeredGroups();
-        if (groups[chatJid]) {
-          const content =
-            msg.message?.conversation ||
-            msg.message?.extendedTextMessage?.text ||
-            msg.message?.imageMessage?.caption ||
-            msg.message?.videoMessage?.caption ||
-            '';
-
-          // Skip protocol messages with no text content (encryption keys, read receipts, etc.)
-          if (!content) continue;
-
-          const sender = msg.key.participant || msg.key.remoteJid || '';
-          const senderName = msg.pushName || sender.split('@')[0];
-
-          const fromMe = msg.key.fromMe || false;
-          // Detect bot messages: with own number, fromMe is reliable
-          // since only the bot sends from that number.
-          // With shared number, bot messages carry the assistant name prefix
-          // (even in DMs/self-chat) so we check for that.
-          const isBotMessage = ASSISTANT_HAS_OWN_NUMBER
-            ? fromMe
-            : content.startsWith(`${ASSISTANT_NAME}:`);
-
-          this.opts.onMessage(chatJid, {
-            id: msg.key.id || '',
-            chat_jid: chatJid,
-            sender,
-            sender_name: senderName,
-            content,
-            timestamp,
-            is_from_me: fromMe,
-            is_bot_message: isBotMessage,
-          });
-        }
-      }
-    });
-  }
-
-  async sendMessage(jid: string, text: string): Promise<void> {
-    // Prefix bot messages with assistant name so users know who's speaking.
-    // On a shared number, prefix is also needed in DMs (including self-chat)
-    // to distinguish bot output from user messages.
-    // Skip only when the assistant has its own dedicated phone number.
-    const prefixed = ASSISTANT_HAS_OWN_NUMBER
-      ? text
-      : `${ASSISTANT_NAME}: ${text}`;
-
-    if (!this.connected) {
-      this.outgoingQueue.push({ jid, text: prefixed });
-      logger.info(
-        { jid, length: prefixed.length, queueSize: this.outgoingQueue.length },
-        'WA disconnected, message queued',
-      );
-      return;
-    }
-    try {
-      await this.sock.sendMessage(jid, { text: prefixed });
-      logger.info({ jid, length: prefixed.length }, 'Message sent');
-    } catch (err) {
-      // If send fails, queue it for retry on reconnect
-      this.outgoingQueue.push({ jid, text: prefixed });
-      logger.warn(
-        { jid, err, queueSize: this.outgoingQueue.length },
-        'Failed to send, message queued',
-      );
-    }
-  }
-
-  isConnected(): boolean {
-    return this.connected;
-  }
-
-  ownsJid(jid: string): boolean {
-    return jid.endsWith('@g.us') || jid.endsWith('@s.whatsapp.net');
-  }
-
-  async disconnect(): Promise<void> {
-    this.connected = false;
-    this.sock?.end(undefined);
-  }
-
-  async setTyping(jid: string, isTyping: boolean): Promise<void> {
-    try {
-      const status = isTyping ? 'composing' : 'paused';
-      logger.debug({ jid, status }, 'Sending presence update');
-      await this.sock.sendPresenceUpdate(status, jid);
-    } catch (err) {
-      logger.debug({ jid, err }, 'Failed to update typing status');
-    }
-  }
-
-  /**
-   * Sync group metadata from WhatsApp.
-   * Fetches all participating groups and stores their names in the database.
- * Called on startup, daily, and on-demand via IPC. - */ - async syncGroupMetadata(force = false): Promise<void> { - if (!force) { - const lastSync = getLastGroupSync(); - if (lastSync) { - const lastSyncTime = new Date(lastSync).getTime(); - if (Date.now() - lastSyncTime < GROUP_SYNC_INTERVAL_MS) { - logger.debug({ lastSync }, 'Skipping group sync - synced recently'); - return; - } - } - } - - try { - logger.info('Syncing group metadata from WhatsApp...'); - const groups = await this.sock.groupFetchAllParticipating(); - - let count = 0; - for (const [jid, metadata] of Object.entries(groups)) { - if (metadata.subject) { - updateChatName(jid, metadata.subject); - count++; - } - } - - setLastGroupSync(); - logger.info({ count }, 'Group metadata synced'); - } catch (err) { - logger.error({ err }, 'Failed to sync group metadata'); - } - } - - private async translateJid(jid: string): Promise<string> { - if (!jid.endsWith('@lid')) return jid; - const lidUser = jid.split('@')[0].split(':')[0]; - - // Check local cache first - const cached = this.lidToPhoneMap[lidUser]; - if (cached) { - logger.debug( - { lidJid: jid, phoneJid: cached }, - 'Translated LID to phone JID (cached)', - ); - return cached; - } - - // Query Baileys' signal repository for the mapping - try { - const pn = await this.sock.signalRepository?.lidMapping?.getPNForLID(jid); - if (pn) { - const phoneJid = `${pn.split('@')[0].split(':')[0]}@s.whatsapp.net`; - this.lidToPhoneMap[lidUser] = phoneJid; - logger.info( - { lidJid: jid, phoneJid }, - 'Translated LID to phone JID (signalRepository)', - ); - return phoneJid; - } - } catch (err) { - logger.debug({ err, jid }, 'Failed to resolve LID via signalRepository'); - } - - return jid; - } - - private async flushOutgoingQueue(): Promise<void> { - if (this.flushing || this.outgoingQueue.length === 0) return; - this.flushing = true; - try { - logger.info( - { count: this.outgoingQueue.length }, - 'Flushing outgoing message queue', - ); - while (this.outgoingQueue.length > 0) { - 
const item = this.outgoingQueue.shift()!; - // Send directly — queued items are already prefixed by sendMessage - await this.sock.sendMessage(item.jid, { text: item.text }); - logger.info( - { jid: item.jid, length: item.text.length }, - 'Queued message sent', - ); - } - } finally { - this.flushing = false; - } - } -} diff --git a/src/claw-skill.test.ts b/src/claw-skill.test.ts new file mode 100644 index 0000000..24260c9 --- /dev/null +++ b/src/claw-skill.test.ts @@ -0,0 +1,45 @@ +import fs from 'fs'; +import os from 'os'; +import path from 'path'; +import { spawnSync } from 'child_process'; + +import { describe, expect, it } from 'vitest'; + +describe('claw skill script', () => { + it('exits zero after successful structured output even if the runtime is terminated', () => { + const tempDir = fs.mkdtempSync(path.join(os.tmpdir(), 'claw-skill-test-')); + const binDir = path.join(tempDir, 'bin'); + fs.mkdirSync(binDir, { recursive: true }); + + const runtimePath = path.join(binDir, 'container'); + fs.writeFileSync( + runtimePath, + `#!/bin/sh +cat >/dev/null +printf '%s\n' '---NANOCLAW_OUTPUT_START---' '{"status":"success","result":"4","newSessionId":"sess-1"}' '---NANOCLAW_OUTPUT_END---' +sleep 30 +`, + ); + fs.chmodSync(runtimePath, 0o755); + + const result = spawnSync( + 'python3', + ['.claude/skills/claw/scripts/claw', '-j', 'tg:123', 'What is 2+2?'], + { + cwd: process.cwd(), + encoding: 'utf8', + env: { + ...process.env, + NANOCLAW_DIR: tempDir, + PATH: `${binDir}:${process.env.PATH || ''}`, + }, + timeout: 15000, + }, + ); + + expect(result.status).toBe(0); + expect(result.signal).toBeNull(); + expect(result.stdout).toContain('4'); + expect(result.stderr).toContain('[session: sess-1]'); + }); +}); diff --git a/src/config.ts b/src/config.ts index 8a4cb92..e1cbe11 100644 --- a/src/config.ts +++ b/src/config.ts @@ -2,11 +2,15 @@ import os from 'os'; import path from 'path'; import { readEnvFile } from './env.js'; +import { isValidTimezone } from './timezone.js'; 
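The fake runtime in the test above emits the same `---NANOCLAW_OUTPUT_START---`/`---NANOCLAW_OUTPUT_END---` sentinel pair that the host side scans for. A minimal sketch of that scan, assuming a simple buffer-and-search approach (the helper name `extractOutput` is illustrative, not the project's actual parser):

```typescript
const START = '---NANOCLAW_OUTPUT_START---';
const END = '---NANOCLAW_OUTPUT_END---';

// Scan an accumulated stream buffer for one complete sentinel pair and
// return the payload between the markers, or null if the pair is not
// complete yet (keep buffering in that case).
function extractOutput(buffer: string): string | null {
  const start = buffer.indexOf(START);
  if (start === -1) return null;
  const end = buffer.indexOf(END, start + START.length);
  if (end === -1) return null; // end marker not seen yet
  return buffer.slice(start + START.length, end).trim();
}

const chunk =
  'runtime noise\n' +
  `${START}\n{"status":"success","result":"4","newSessionId":"sess-1"}\n${END}\n`;
const payload = extractOutput(chunk);
```

Because the payload sits between fixed markers, any noise the runtime prints before or after them (as in the `sleep 30` test script) is ignored.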
// Read config values from .env (falls back to process.env). -// Secrets are NOT read here — they stay on disk and are loaded only -// where needed (container-runner.ts) to avoid leaking to child processes. -const envConfig = readEnvFile(['ASSISTANT_NAME', 'ASSISTANT_HAS_OWN_NUMBER']); +const envConfig = readEnvFile([ + 'ASSISTANT_NAME', + 'ASSISTANT_HAS_OWN_NUMBER', + 'ONECLI_URL', + 'TZ', +]); export const ASSISTANT_NAME = process.env.ASSISTANT_NAME || envConfig.ASSISTANT_NAME || 'Andy'; @@ -27,10 +31,15 @@ export const MOUNT_ALLOWLIST_PATH = path.join( 'nanoclaw', 'mount-allowlist.json', ); +export const SENDER_ALLOWLIST_PATH = path.join( + HOME_DIR, + '.config', + 'nanoclaw', + 'sender-allowlist.json', +); export const STORE_DIR = path.resolve(PROJECT_ROOT, 'store'); export const GROUPS_DIR = path.resolve(PROJECT_ROOT, 'groups'); export const DATA_DIR = path.resolve(PROJECT_ROOT, 'data'); -export const MAIN_GROUP_FOLDER = 'main'; export const CONTAINER_IMAGE = process.env.CONTAINER_IMAGE || 'nanoclaw-agent:latest'; @@ -42,6 +51,8 @@ export const CONTAINER_MAX_OUTPUT_SIZE = parseInt( process.env.CONTAINER_MAX_OUTPUT_SIZE || '10485760', 10, ); // 10MB default +export const ONECLI_URL = + process.env.ONECLI_URL || envConfig.ONECLI_URL || 'http://localhost:10254'; export const IPC_POLL_INTERVAL = 1000; export const IDLE_TIMEOUT = parseInt(process.env.IDLE_TIMEOUT || '1800000', 10); // 30min default — how long to keep container alive after last result export const MAX_CONCURRENT_CONTAINERS = Math.max( @@ -53,12 +64,30 @@ function escapeRegex(str: string): string { return str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); } -export const TRIGGER_PATTERN = new RegExp( - `^@${escapeRegex(ASSISTANT_NAME)}\\b`, - 'i', -); +export function buildTriggerPattern(trigger: string): RegExp { + return new RegExp(`^${escapeRegex(trigger.trim())}\\b`, 'i'); +} -// Timezone for scheduled tasks (cron expressions, etc.) 
-// Uses system timezone by default -export const TIMEZONE = - process.env.TZ || Intl.DateTimeFormat().resolvedOptions().timeZone; +export const DEFAULT_TRIGGER = `@${ASSISTANT_NAME}`; + +export function getTriggerPattern(trigger?: string): RegExp { + const normalizedTrigger = trigger?.trim(); + return buildTriggerPattern(normalizedTrigger || DEFAULT_TRIGGER); +} + +export const TRIGGER_PATTERN = buildTriggerPattern(DEFAULT_TRIGGER); + +// Timezone for scheduled tasks, message formatting, etc. +// Validates each candidate is a real IANA identifier before accepting. +function resolveConfigTimezone(): string { + const candidates = [ + process.env.TZ, + envConfig.TZ, + Intl.DateTimeFormat().resolvedOptions().timeZone, + ]; + for (const tz of candidates) { + if (tz && isValidTimezone(tz)) return tz; + } + return 'UTC'; +} +export const TIMEZONE = resolveConfigTimezone(); diff --git a/src/container-runner.test.ts b/src/container-runner.test.ts index 67af8e2..64c3455 100644 --- a/src/container-runner.test.ts +++ b/src/container-runner.test.ts @@ -14,6 +14,7 @@ vi.mock('./config.js', () => ({ DATA_DIR: '/tmp/nanoclaw-test-data', GROUPS_DIR: '/tmp/nanoclaw-test-groups', IDLE_TIMEOUT: 1800000, // 30min + ONECLI_URL: 'http://localhost:10254', TIMEZONE: 'America/Los_Angeles', })); @@ -50,6 +51,17 @@ vi.mock('./mount-security.js', () => ({ validateAdditionalMounts: vi.fn(() => []), })); +// Mock OneCLI SDK +vi.mock('@onecli-sh/sdk', () => ({ + OneCLI: class { + applyContainerConfig = vi.fn().mockResolvedValue(true); + createAgent = vi.fn().mockResolvedValue({ id: 'test' }); + ensureAgent = vi + .fn() + .mockResolvedValue({ name: 'test', identifier: 'test', created: true }); + }, +})); + // Create a controllable fake ChildProcess function createFakeProcess() { const proc = new EventEmitter() as EventEmitter & { diff --git a/src/container-runner.ts b/src/container-runner.ts index 1af5b52..facc68c 100644 --- a/src/container-runner.ts +++ b/src/container-runner.ts @@ -13,19 +13,23 
@@ import { DATA_DIR, GROUPS_DIR, IDLE_TIMEOUT, + ONECLI_URL, TIMEZONE, } from './config.js'; -import { readEnvFile } from './env.js'; import { resolveGroupFolderPath, resolveGroupIpcPath } from './group-folder.js'; import { logger } from './logger.js'; import { CONTAINER_RUNTIME_BIN, + hostGatewayArgs, readonlyMountArgs, stopContainer, } from './container-runtime.js'; +import { OneCLI } from '@onecli-sh/sdk'; import { validateAdditionalMounts } from './mount-security.js'; import { RegisteredGroup } from './types.js'; +const onecli = new OneCLI({ url: ONECLI_URL }); + // Sentinel markers for robust output parsing (must match agent-runner) const OUTPUT_START_MARKER = '---NANOCLAW_OUTPUT_START---'; const OUTPUT_END_MARKER = '---NANOCLAW_OUTPUT_END---'; @@ -38,7 +42,7 @@ export interface ContainerInput { isMain: boolean; isScheduledTask?: boolean; assistantName?: string; - secrets?: Record<string, string>; + script?: string; } export interface ContainerOutput { @@ -74,6 +78,17 @@ function buildVolumeMounts( readonly: true, }); + // Shadow .env so the agent cannot read secrets from the mounted project root. + // Credentials are injected by the OneCLI gateway, never exposed to containers.
+ const envFile = path.join(projectRoot, '.env'); + if (fs.existsSync(envFile)) { + mounts.push({ + hostPath: '/dev/null', + containerPath: '/workspace/project/.env', + readonly: true, + }); + } + // Main also gets its group folder as the working directory mounts.push({ hostPath: groupDir, @@ -177,8 +192,17 @@ function buildVolumeMounts( group.folder, 'agent-runner-src', ); - if (!fs.existsSync(groupAgentRunnerDir) && fs.existsSync(agentRunnerSrc)) { - fs.cpSync(agentRunnerSrc, groupAgentRunnerDir, { recursive: true }); + if (fs.existsSync(agentRunnerSrc)) { + const srcIndex = path.join(agentRunnerSrc, 'index.ts'); + const cachedIndex = path.join(groupAgentRunnerDir, 'index.ts'); + const needsCopy = + !fs.existsSync(groupAgentRunnerDir) || + !fs.existsSync(cachedIndex) || + (fs.existsSync(srcIndex) && + fs.statSync(srcIndex).mtimeMs > fs.statSync(cachedIndex).mtimeMs); + if (needsCopy) { + fs.cpSync(agentRunnerSrc, groupAgentRunnerDir, { recursive: true }); + } } mounts.push({ hostPath: groupAgentRunnerDir, @@ -199,23 +223,34 @@ function buildVolumeMounts( return mounts; } -/** - * Read allowed secrets from .env for passing to the container via stdin. - * Secrets are never written to disk or mounted as files. - */ -function readSecrets(): Record<string, string> { - return readEnvFile(['CLAUDE_CODE_OAUTH_TOKEN', 'ANTHROPIC_API_KEY']); -} - -function buildContainerArgs( +async function buildContainerArgs( mounts: VolumeMount[], containerName: string, -): string[] { + agentIdentifier?: string, +): Promise<string[]> { const args: string[] = ['run', '-i', '--rm', '--name', containerName]; // Pass host timezone so container's local time matches the user's args.push('-e', `TZ=${TIMEZONE}`); + // OneCLI gateway handles credential injection — containers never see real secrets. + // The gateway intercepts HTTPS traffic and injects API keys or OAuth tokens.
+ const onecliApplied = await onecli.applyContainerConfig(args, { + addHostMapping: false, // Nanoclaw already handles host gateway + agent: agentIdentifier, + }); + if (onecliApplied) { + logger.info({ containerName }, 'OneCLI gateway config applied'); + } else { + logger.warn( + { containerName }, + 'OneCLI gateway not reachable — container will have no credentials', + ); + } + + // Runtime-specific args for host gateway resolution + args.push(...hostGatewayArgs()); + // Run as host user so bind-mounted files are accessible. // Skip when running as root (uid 0), as the container's node user (uid 1000), // or when getuid is unavailable (native Windows without WSL). @@ -253,7 +288,15 @@ export async function runContainerAgent( const mounts = buildVolumeMounts(group, input.isMain); const safeName = group.folder.replace(/[^a-zA-Z0-9-]/g, '-'); const containerName = `nanoclaw-${safeName}-${Date.now()}`; - const containerArgs = buildContainerArgs(mounts, containerName); + // Main group uses the default OneCLI agent; others use their own agent. + const agentIdentifier = input.isMain + ? undefined + : group.folder.toLowerCase().replace(/_/g, '-'); + const containerArgs = await buildContainerArgs( + mounts, + containerName, + agentIdentifier, + ); logger.debug( { @@ -293,12 +336,8 @@ export async function runContainerAgent( let stdoutTruncated = false; let stderrTruncated = false; - // Pass secrets via stdin (never written to disk or mounted as files) - input.secrets = readSecrets(); container.stdin.write(JSON.stringify(input)); container.stdin.end(); - // Remove secrets from input so they don't appear in logs - delete input.secrets; // Streaming output: parse OUTPUT_START/END marker pairs as they arrive let parseBuffer = ''; @@ -482,10 +521,20 @@ export async function runContainerAgent( const isError = code !== 0; if (isVerbose || isError) { + // On error, log input metadata only — not the full prompt. 
+ // Full input is only included at verbose level to avoid + // persisting user conversation content on every non-zero exit. + if (isVerbose) { + logLines.push(`=== Input ===`, JSON.stringify(input, null, 2), ``); + } else { + logLines.push( + `=== Input Summary ===`, + `Prompt length: ${input.prompt.length} chars`, + `Session ID: ${input.sessionId || 'new'}`, + ``, + ); + } logLines.push( - `=== Input ===`, - JSON.stringify(input, null, 2), - ``, `=== Container Args ===`, containerArgs.join(' '), ``, @@ -628,6 +677,7 @@ export function writeTasksSnapshot( id: string; groupFolder: string; prompt: string; + script?: string | null; schedule_type: string; schedule_value: string; status: string; @@ -663,7 +713,7 @@ export function writeGroupsSnapshot( groupFolder: string, isMain: boolean, groups: AvailableGroup[], - registeredJids: Set<string>, + _registeredJids: Set<string>, ): void { const groupIpcDir = resolveGroupIpcPath(groupFolder); fs.mkdirSync(groupIpcDir, { recursive: true }); diff --git a/src/container-runtime.test.ts b/src/container-runtime.test.ts index 08ffd59..d111bf6 100644 --- a/src/container-runtime.test.ts +++ b/src/container-runtime.test.ts @@ -41,7 +41,7 @@ describe('readonlyMountArgs', () => { describe('stopContainer', () => { it('returns stop command using CONTAINER_RUNTIME_BIN', () => { expect(stopContainer('nanoclaw-test-123')).toBe( - `${CONTAINER_RUNTIME_BIN} stop nanoclaw-test-123`, + `${CONTAINER_RUNTIME_BIN} stop -t 1 nanoclaw-test-123`, ); }); }); @@ -93,12 +93,12 @@ describe('cleanupOrphans', () => { expect(mockExecSync).toHaveBeenCalledTimes(3); expect(mockExecSync).toHaveBeenNthCalledWith( 2, - `${CONTAINER_RUNTIME_BIN} stop nanoclaw-group1-111`, + `${CONTAINER_RUNTIME_BIN} stop -t 1 nanoclaw-group1-111`, { stdio: 'pipe' }, ); expect(mockExecSync).toHaveBeenNthCalledWith( 3, - `${CONTAINER_RUNTIME_BIN} stop nanoclaw-group2-222`, + `${CONTAINER_RUNTIME_BIN} stop -t 1 nanoclaw-group2-222`, { stdio: 'pipe' }, ); expect(logger.info).toHaveBeenCalledWith( 
diff --git a/src/container-runtime.ts b/src/container-runtime.ts index 4d417ad..6326fde 100644 --- a/src/container-runtime.ts +++ b/src/container-runtime.ts @@ -3,12 +3,22 @@ * All runtime-specific logic lives here so swapping runtimes means changing one file. */ import { execSync } from 'child_process'; +import os from 'os'; import { logger } from './logger.js'; /** The container runtime binary name. */ export const CONTAINER_RUNTIME_BIN = 'docker'; +/** CLI args needed for the container to resolve the host gateway. */ +export function hostGatewayArgs(): string[] { + // On Linux, host.docker.internal isn't built-in — add it explicitly + if (os.platform() === 'linux') { + return ['--add-host=host.docker.internal:host-gateway']; + } + return []; +} + /** Returns CLI args for a readonly bind mount. */ export function readonlyMountArgs( hostPath: string, @@ -19,7 +29,7 @@ export function readonlyMountArgs( /** Returns the shell command to stop a container by name. */ export function stopContainer(name: string): string { - return `${CONTAINER_RUNTIME_BIN} stop ${name}`; + return `${CONTAINER_RUNTIME_BIN} stop -t 1 ${name}`; } /** Ensure the container runtime is running, starting it if needed. 
*/ @@ -56,7 +66,9 @@ export function ensureContainerRuntimeRunning(): void { console.error( '╚════════════════════════════════════════════════════════════════╝\n', ); - throw new Error('Container runtime is required but failed to start'); + throw new Error('Container runtime is required but failed to start', { + cause: err, + }); } } diff --git a/src/db-migration.test.ts b/src/db-migration.test.ts new file mode 100644 index 0000000..e26873d --- /dev/null +++ b/src/db-migration.test.ts @@ -0,0 +1,67 @@ +import Database from 'better-sqlite3'; +import fs from 'fs'; +import os from 'os'; +import path from 'path'; +import { describe, expect, it, vi } from 'vitest'; + +describe('database migrations', () => { + it('defaults Telegram backfill chats to direct messages', async () => { + const repoRoot = process.cwd(); + const tempDir = fs.mkdtempSync(path.join(os.tmpdir(), 'nanoclaw-db-test-')); + + try { + process.chdir(tempDir); + fs.mkdirSync(path.join(tempDir, 'store'), { recursive: true }); + + const dbPath = path.join(tempDir, 'store', 'messages.db'); + const legacyDb = new Database(dbPath); + legacyDb.exec(` + CREATE TABLE chats ( + jid TEXT PRIMARY KEY, + name TEXT, + last_message_time TEXT + ); + `); + legacyDb + .prepare( + `INSERT INTO chats (jid, name, last_message_time) VALUES (?, ?, ?)`, + ) + .run('tg:12345', 'Telegram DM', '2024-01-01T00:00:00.000Z'); + legacyDb + .prepare( + `INSERT INTO chats (jid, name, last_message_time) VALUES (?, ?, ?)`, + ) + .run('tg:-10012345', 'Telegram Group', '2024-01-01T00:00:01.000Z'); + legacyDb + .prepare( + `INSERT INTO chats (jid, name, last_message_time) VALUES (?, ?, ?)`, + ) + .run('room@g.us', 'WhatsApp Group', '2024-01-01T00:00:02.000Z'); + legacyDb.close(); + + vi.resetModules(); + const { initDatabase, getAllChats, _closeDatabase } = + await import('./db.js'); + + initDatabase(); + + const chats = getAllChats(); + expect(chats.find((chat) => chat.jid === 'tg:12345')).toMatchObject({ + channel: 'telegram', + is_group: 
0, + }); + expect(chats.find((chat) => chat.jid === 'tg:-10012345')).toMatchObject({ + channel: 'telegram', + is_group: 0, + }); + expect(chats.find((chat) => chat.jid === 'room@g.us')).toMatchObject({ + channel: 'whatsapp', + is_group: 1, + }); + + _closeDatabase(); + } finally { + process.chdir(repoRoot); + } + }); +}); diff --git a/src/db.test.ts b/src/db.test.ts index e7f772c..a40d376 100644 --- a/src/db.test.ts +++ b/src/db.test.ts @@ -5,9 +5,11 @@ import { createTask, deleteTask, getAllChats, + getAllRegisteredGroups, getMessagesSince, getNewMessages, getTaskById, + setRegisteredGroup, storeChatMetadata, storeMessage, updateTask, @@ -388,3 +390,95 @@ describe('task CRUD', () => { expect(getTaskById('task-3')).toBeUndefined(); }); }); + +// --- LIMIT behavior --- + +describe('message query LIMIT', () => { + beforeEach(() => { + storeChatMetadata('group@g.us', '2024-01-01T00:00:00.000Z'); + + for (let i = 1; i <= 10; i++) { + store({ + id: `lim-${i}`, + chat_jid: 'group@g.us', + sender: 'user@s.whatsapp.net', + sender_name: 'User', + content: `message ${i}`, + timestamp: `2024-01-01T00:00:${String(i).padStart(2, '0')}.000Z`, + }); + } + }); + + it('getNewMessages caps to limit and returns most recent in chronological order', () => { + const { messages, newTimestamp } = getNewMessages( + ['group@g.us'], + '2024-01-01T00:00:00.000Z', + 'Andy', + 3, + ); + expect(messages).toHaveLength(3); + expect(messages[0].content).toBe('message 8'); + expect(messages[2].content).toBe('message 10'); + // Chronological order preserved + expect(messages[1].timestamp > messages[0].timestamp).toBe(true); + // newTimestamp reflects latest returned row + expect(newTimestamp).toBe('2024-01-01T00:00:10.000Z'); + }); + + it('getMessagesSince caps to limit and returns most recent in chronological order', () => { + const messages = getMessagesSince( + 'group@g.us', + '2024-01-01T00:00:00.000Z', + 'Andy', + 3, + ); + expect(messages).toHaveLength(3); + 
expect(messages[0].content).toBe('message 8'); + expect(messages[2].content).toBe('message 10'); + expect(messages[1].timestamp > messages[0].timestamp).toBe(true); + }); + + it('returns all messages when count is under the limit', () => { + const { messages } = getNewMessages( + ['group@g.us'], + '2024-01-01T00:00:00.000Z', + 'Andy', + 50, + ); + expect(messages).toHaveLength(10); + }); +}); + +// --- RegisteredGroup isMain round-trip --- + +describe('registered group isMain', () => { + it('persists isMain=true through set/get round-trip', () => { + setRegisteredGroup('main@s.whatsapp.net', { + name: 'Main Chat', + folder: 'whatsapp_main', + trigger: '@Andy', + added_at: '2024-01-01T00:00:00.000Z', + isMain: true, + }); + + const groups = getAllRegisteredGroups(); + const group = groups['main@s.whatsapp.net']; + expect(group).toBeDefined(); + expect(group.isMain).toBe(true); + expect(group.folder).toBe('whatsapp_main'); + }); + + it('omits isMain for non-main groups', () => { + setRegisteredGroup('group@g.us', { + name: 'Family Chat', + folder: 'whatsapp_family-chat', + trigger: '@Andy', + added_at: '2024-01-01T00:00:00.000Z', + }); + + const groups = getAllRegisteredGroups(); + const group = groups['group@g.us']; + expect(group).toBeDefined(); + expect(group.isMain).toBeUndefined(); + }); +}); diff --git a/src/db.ts b/src/db.ts index 9d9a4d5..718bc60 100644 --- a/src/db.ts +++ b/src/db.ts @@ -93,6 +93,13 @@ function createSchema(database: Database.Database): void { /* column already exists */ } + // Add script column if it doesn't exist (migration for existing DBs) + try { + database.exec(`ALTER TABLE scheduled_tasks ADD COLUMN script TEXT`); + } catch { + /* column already exists */ + } + // Add is_bot_message column if it doesn't exist (migration for existing DBs) try { database.exec( @@ -106,6 +113,19 @@ function createSchema(database: Database.Database): void { /* column already exists */ } + // Add is_main column if it doesn't exist (migration for existing 
DBs) + try { + database.exec( + `ALTER TABLE registered_groups ADD COLUMN is_main INTEGER DEFAULT 0`, + ); + // Backfill: existing rows with folder = 'main' are the main group + database.exec( + `UPDATE registered_groups SET is_main = 1 WHERE folder = 'main'`, + ); + } catch { + /* column already exists */ + } + // Add channel and is_group columns if they don't exist (migration for existing DBs) try { database.exec(`ALTER TABLE chats ADD COLUMN channel TEXT`); @@ -121,7 +141,7 @@ function createSchema(database: Database.Database): void { `UPDATE chats SET channel = 'discord', is_group = 1 WHERE jid LIKE 'dc:%'`, ); database.exec( - `UPDATE chats SET channel = 'telegram', is_group = 1 WHERE jid LIKE 'tg:%'`, + `UPDATE chats SET channel = 'telegram', is_group = 0 WHERE jid LIKE 'tg:%'`, ); } catch { /* columns already exist */ @@ -145,6 +165,11 @@ export function _initTestDatabase(): void { createSchema(db); } +/** @internal - for tests only. */ +export function _closeDatabase(): void { + db.close(); +} + /** * Store chat metadata only (no message content). * Used for all chats to enable group discovery without storing sensitive content. @@ -263,7 +288,7 @@ export function storeMessage(msg: NewMessage): void { } /** - * Store a message directly (for non-WhatsApp channels that don't use Baileys proto). + * Store a message directly. */ export function storeMessageDirect(msg: { id: string; @@ -293,24 +318,29 @@ export function getNewMessages( jids: string[], lastTimestamp: string, botPrefix: string, + limit: number = 200, ): { messages: NewMessage[]; newTimestamp: string } { if (jids.length === 0) return { messages: [], newTimestamp: lastTimestamp }; const placeholders = jids.map(() => '?').join(','); // Filter bot messages using both the is_bot_message flag AND the content // prefix as a backstop for messages written before the migration ran. + // Subquery takes the N most recent, outer query re-sorts chronologically. 
const sql = ` - SELECT id, chat_jid, sender, sender_name, content, timestamp - FROM messages - WHERE timestamp > ? AND chat_jid IN (${placeholders}) - AND is_bot_message = 0 AND content NOT LIKE ? - AND content != '' AND content IS NOT NULL - ORDER BY timestamp + SELECT * FROM ( + SELECT id, chat_jid, sender, sender_name, content, timestamp, is_from_me + FROM messages + WHERE timestamp > ? AND chat_jid IN (${placeholders}) + AND is_bot_message = 0 AND content NOT LIKE ? + AND content != '' AND content IS NOT NULL + ORDER BY timestamp DESC + LIMIT ? + ) ORDER BY timestamp `; const rows = db .prepare(sql) - .all(lastTimestamp, ...jids, `${botPrefix}:%`) as NewMessage[]; + .all(lastTimestamp, ...jids, `${botPrefix}:%`, limit) as NewMessage[]; let newTimestamp = lastTimestamp; for (const row of rows) { @@ -324,20 +354,25 @@ export function getMessagesSince( chatJid: string, sinceTimestamp: string, botPrefix: string, + limit: number = 200, ): NewMessage[] { // Filter bot messages using both the is_bot_message flag AND the content // prefix as a backstop for messages written before the migration ran. + // Subquery takes the N most recent, outer query re-sorts chronologically. const sql = ` - SELECT id, chat_jid, sender, sender_name, content, timestamp - FROM messages - WHERE chat_jid = ? AND timestamp > ? - AND is_bot_message = 0 AND content NOT LIKE ? - AND content != '' AND content IS NOT NULL - ORDER BY timestamp + SELECT * FROM ( + SELECT id, chat_jid, sender, sender_name, content, timestamp, is_from_me + FROM messages + WHERE chat_jid = ? AND timestamp > ? + AND is_bot_message = 0 AND content NOT LIKE ? + AND content != '' AND content IS NOT NULL + ORDER BY timestamp DESC + LIMIT ? 
+ ) ORDER BY timestamp `; return db .prepare(sql) - .all(chatJid, sinceTimestamp, `${botPrefix}:%`) as NewMessage[]; + .all(chatJid, sinceTimestamp, `${botPrefix}:%`, limit) as NewMessage[]; } export function createTask( @@ -345,14 +380,15 @@ export function createTask( ): void { db.prepare( ` - INSERT INTO scheduled_tasks (id, group_folder, chat_jid, prompt, schedule_type, schedule_value, context_mode, next_run, status, created_at) - VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?) + INSERT INTO scheduled_tasks (id, group_folder, chat_jid, prompt, script, schedule_type, schedule_value, context_mode, next_run, status, created_at) + VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) `, ).run( task.id, task.group_folder, task.chat_jid, task.prompt, + task.script || null, task.schedule_type, task.schedule_value, task.context_mode || 'isolated', @@ -387,7 +423,12 @@ export function updateTask( updates: Partial< Pick< ScheduledTask, - 'prompt' | 'schedule_type' | 'schedule_value' | 'next_run' | 'status' + | 'prompt' + | 'script' + | 'schedule_type' + | 'schedule_value' + | 'next_run' + | 'status' > >, ): void { @@ -398,6 +439,10 @@ export function updateTask( fields.push('prompt = ?'); values.push(updates.prompt); } + if (updates.script !== undefined) { + fields.push('script = ?'); + values.push(updates.script || null); + } if (updates.schedule_type !== undefined) { fields.push('schedule_type = ?'); values.push(updates.schedule_type); @@ -530,6 +575,7 @@ export function getRegisteredGroup( added_at: string; container_config: string | null; requires_trigger: number | null; + is_main: number | null; } | undefined; if (!row) return undefined; @@ -551,6 +597,7 @@ export function getRegisteredGroup( : undefined, requiresTrigger: row.requires_trigger === null ? undefined : row.requires_trigger === 1, + isMain: row.is_main === 1 ? 
true : undefined, }; } @@ -559,8 +606,8 @@ export function setRegisteredGroup(jid: string, group: RegisteredGroup): void { throw new Error(`Invalid group folder "${group.folder}" for JID ${jid}`); } db.prepare( - `INSERT OR REPLACE INTO registered_groups (jid, name, folder, trigger_pattern, added_at, container_config, requires_trigger) - VALUES (?, ?, ?, ?, ?, ?, ?)`, + `INSERT OR REPLACE INTO registered_groups (jid, name, folder, trigger_pattern, added_at, container_config, requires_trigger, is_main) + VALUES (?, ?, ?, ?, ?, ?, ?, ?)`, ).run( jid, group.name, @@ -569,6 +616,7 @@ export function setRegisteredGroup(jid: string, group: RegisteredGroup): void { group.added_at, group.containerConfig ? JSON.stringify(group.containerConfig) : null, group.requiresTrigger === undefined ? 1 : group.requiresTrigger ? 1 : 0, + group.isMain ? 1 : 0, ); } @@ -581,6 +629,7 @@ export function getAllRegisteredGroups(): Record<string, RegisteredGroup> { added_at: string; container_config: string | null; requires_trigger: number | null; + is_main: number | null; }>; const result: Record<string, RegisteredGroup> = {}; for (const row of rows) { @@ -601,6 +650,7 @@ export function getAllRegisteredGroups(): Record<string, RegisteredGroup> { : undefined, requiresTrigger: row.requires_trigger === null ? undefined : row.requires_trigger === 1, + isMain: row.is_main === 1 ? 
true : undefined, }; } return result; diff --git a/src/formatting.test.ts b/src/formatting.test.ts index ea85b9d..a630f20 100644 --- a/src/formatting.test.ts +++ b/src/formatting.test.ts @@ -1,6 +1,10 @@ import { describe, it, expect } from 'vitest'; -import { ASSISTANT_NAME, TRIGGER_PATTERN } from './config.js'; +import { + ASSISTANT_NAME, + getTriggerPattern, + TRIGGER_PATTERN, +} from './config.js'; import { escapeXml, formatMessages, @@ -58,13 +62,14 @@ describe('escapeXml', () => { // --- formatMessages --- describe('formatMessages', () => { - it('formats a single message as XML', () => { - const result = formatMessages([makeMsg()]); - expect(result).toBe( - '\n' + - 'hello\n' + - '', - ); + const TZ = 'UTC'; + + it('formats a single message as XML with context header', () => { + const result = formatMessages([makeMsg()], TZ); + expect(result).toContain(''); + expect(result).toContain('hello'); + expect(result).toContain('Jan 1, 2024'); }); it('formats multiple messages', () => { @@ -73,11 +78,16 @@ describe('formatMessages', () => { id: '1', sender_name: 'Alice', content: 'hi', - timestamp: 't1', + timestamp: '2024-01-01T00:00:00.000Z', + }), + makeMsg({ + id: '2', + sender_name: 'Bob', + content: 'hey', + timestamp: '2024-01-01T01:00:00.000Z', }), - makeMsg({ id: '2', sender_name: 'Bob', content: 'hey', timestamp: 't2' }), ]; - const result = formatMessages(msgs); + const result = formatMessages(msgs, TZ); expect(result).toContain('sender="Alice"'); expect(result).toContain('sender="Bob"'); expect(result).toContain('>hi'); @@ -85,22 +95,35 @@ describe('formatMessages', () => { }); it('escapes special characters in sender names', () => { - const result = formatMessages([makeMsg({ sender_name: 'A & B ' })]); + const result = formatMessages([makeMsg({ sender_name: 'A & B ' })], TZ); expect(result).toContain('sender="A & B <Co>"'); }); it('escapes special characters in content', () => { - const result = formatMessages([ - makeMsg({ content: '' }), - ]); + const 
result = formatMessages( + [makeMsg({ content: '' })], + TZ, + ); expect(result).toContain( '<script>alert("xss")</script>', ); }); it('handles empty array', () => { - const result = formatMessages([]); - expect(result).toBe('\n\n'); + const result = formatMessages([], TZ); + expect(result).toContain(''); + expect(result).toContain('\n\n'); + }); + + it('converts timestamps to local time for given timezone', () => { + // 2024-01-01T18:30:00Z in America/New_York (EST) = 1:30 PM + const result = formatMessages( + [makeMsg({ timestamp: '2024-01-01T18:30:00.000Z' })], + 'America/New_York', + ); + expect(result).toContain('1:30'); + expect(result).toContain('PM'); + expect(result).toContain(''); }); }); @@ -142,6 +165,28 @@ describe('TRIGGER_PATTERN', () => { }); }); +describe('getTriggerPattern', () => { + it('uses the configured per-group trigger when provided', () => { + const pattern = getTriggerPattern('@Claw'); + + expect(pattern.test('@Claw hello')).toBe(true); + expect(pattern.test(`@${ASSISTANT_NAME} hello`)).toBe(false); + }); + + it('falls back to the default trigger when group trigger is missing', () => { + const pattern = getTriggerPattern(undefined); + + expect(pattern.test(`@${ASSISTANT_NAME} hello`)).toBe(true); + }); + + it('treats regex characters in custom triggers literally', () => { + const pattern = getTriggerPattern('@C.L.A.U.D.E'); + + expect(pattern.test('@C.L.A.U.D.E hello')).toBe(true); + expect(pattern.test('@CXLXAUXDXE hello')).toBe(false); + }); +}); + // --- Outbound formatting (internal tag stripping + prefix) --- describe('stripInternalTags', () => { @@ -188,7 +233,7 @@ describe('formatOutbound', () => { describe('trigger gating (requiresTrigger interaction)', () => { // Replicates the exact logic from processGroupMessages and startMessageLoop: - // if (!isMainGroup && group.requiresTrigger !== false) { check trigger } + // if (!isMainGroup && group.requiresTrigger !== false) { check group.trigger } function shouldRequireTrigger( 
isMainGroup: boolean, requiresTrigger: boolean | undefined, @@ -199,39 +244,51 @@ describe('trigger gating (requiresTrigger interaction)', () => { function shouldProcess( isMainGroup: boolean, requiresTrigger: boolean | undefined, + trigger: string | undefined, messages: NewMessage[], ): boolean { if (!shouldRequireTrigger(isMainGroup, requiresTrigger)) return true; - return messages.some((m) => TRIGGER_PATTERN.test(m.content.trim())); + const triggerPattern = getTriggerPattern(trigger); + return messages.some((m) => triggerPattern.test(m.content.trim())); } it('main group always processes (no trigger needed)', () => { const msgs = [makeMsg({ content: 'hello no trigger' })]; - expect(shouldProcess(true, undefined, msgs)).toBe(true); + expect(shouldProcess(true, undefined, undefined, msgs)).toBe(true); }); it('main group processes even with requiresTrigger=true', () => { const msgs = [makeMsg({ content: 'hello no trigger' })]; - expect(shouldProcess(true, true, msgs)).toBe(true); + expect(shouldProcess(true, true, undefined, msgs)).toBe(true); }); it('non-main group with requiresTrigger=undefined requires trigger (defaults to true)', () => { const msgs = [makeMsg({ content: 'hello no trigger' })]; - expect(shouldProcess(false, undefined, msgs)).toBe(false); + expect(shouldProcess(false, undefined, undefined, msgs)).toBe(false); }); it('non-main group with requiresTrigger=true requires trigger', () => { const msgs = [makeMsg({ content: 'hello no trigger' })]; - expect(shouldProcess(false, true, msgs)).toBe(false); + expect(shouldProcess(false, true, undefined, msgs)).toBe(false); }); it('non-main group with requiresTrigger=true processes when trigger present', () => { const msgs = [makeMsg({ content: `@${ASSISTANT_NAME} do something` })]; - expect(shouldProcess(false, true, msgs)).toBe(true); + expect(shouldProcess(false, true, undefined, msgs)).toBe(true); + }); + + it('non-main group uses its per-group trigger instead of the default trigger', () => { + const msgs = 
[makeMsg({ content: '@Claw do something' })]; + expect(shouldProcess(false, true, '@Claw', msgs)).toBe(true); + }); + + it('non-main group does not process when only the default trigger is present for a custom-trigger group', () => { + const msgs = [makeMsg({ content: `@${ASSISTANT_NAME} do something` })]; + expect(shouldProcess(false, true, '@Claw', msgs)).toBe(false); }); it('non-main group with requiresTrigger=false always processes (no trigger needed)', () => { const msgs = [makeMsg({ content: 'hello no trigger' })]; - expect(shouldProcess(false, false, msgs)).toBe(true); + expect(shouldProcess(false, false, undefined, msgs)).toBe(true); }); }); diff --git a/src/group-queue.test.ts b/src/group-queue.test.ts index b1a4f9c..d7de517 100644 --- a/src/group-queue.test.ts +++ b/src/group-queue.test.ts @@ -40,7 +40,7 @@ describe('GroupQueue', () => { let concurrentCount = 0; let maxConcurrent = 0; - const processMessages = vi.fn(async (groupJid: string) => { + const processMessages = vi.fn(async (_groupJid: string) => { concurrentCount++; maxConcurrent = Math.max(maxConcurrent, concurrentCount); // Simulate async work @@ -69,7 +69,7 @@ describe('GroupQueue', () => { let maxActive = 0; const completionCallbacks: Array<() => void> = []; - const processMessages = vi.fn(async (groupJid: string) => { + const processMessages = vi.fn(async (_groupJid: string) => { activeCount++; maxActive = Math.max(maxActive, activeCount); await new Promise<void>((resolve) => completionCallbacks.push(resolve)); @@ -104,7 +104,7 @@ describe('GroupQueue', () => { const executionOrder: string[] = []; let resolveFirst: () => void; - const processMessages = vi.fn(async (groupJid: string) => { + const processMessages = vi.fn(async (_groupJid: string) => { if (executionOrder.length === 0) { // First call: block until we release it await new Promise<void>((resolve) => { @@ -243,6 +243,41 @@ describe('GroupQueue', () => { expect(processed).toContain('group3@g.us'); }); + // --- Running task dedup (Issue #138) 
--- + + it('rejects duplicate enqueue of a currently-running task', async () => { + let resolveTask: () => void; + let taskCallCount = 0; + + const taskFn = vi.fn(async () => { + taskCallCount++; + await new Promise<void>((resolve) => { + resolveTask = resolve; + }); + }); + + // Start the task (runs immediately — slot available) + queue.enqueueTask('group1@g.us', 'task-1', taskFn); + await vi.advanceTimersByTimeAsync(10); + expect(taskCallCount).toBe(1); + + // Scheduler poll re-discovers the same task while it's running — + // this must be silently dropped + const dupFn = vi.fn(async () => {}); + queue.enqueueTask('group1@g.us', 'task-1', dupFn); + await vi.advanceTimersByTimeAsync(10); + + // Duplicate was NOT queued + expect(dupFn).not.toHaveBeenCalled(); + + // Complete the original task + resolveTask!(); + await vi.advanceTimersByTimeAsync(10); + + // Only one execution total + expect(taskCallCount).toBe(1); + }); + // --- Idle preemption --- it('does NOT preempt active container when not idle', async () => { diff --git a/src/group-queue.ts b/src/group-queue.ts index 06a56cc..a3b547d 100644 --- a/src/group-queue.ts +++ b/src/group-queue.ts @@ -18,6 +18,7 @@ interface GroupState { active: boolean; idleWaiting: boolean; isTaskContainer: boolean; + runningTaskId: string | null; pendingMessages: boolean; pendingTasks: QueuedTask[]; process: ChildProcess | null; @@ -41,6 +42,7 @@ export class GroupQueue { active: false, idleWaiting: false, isTaskContainer: false, + runningTaskId: null, pendingMessages: false, pendingTasks: [], process: null, @@ -90,7 +92,11 @@ export class GroupQueue { const state = this.getGroup(groupJid); - // Prevent double-queuing of the same task + // Prevent double-queuing: check both pending and currently-running task + if (state.runningTaskId === taskId) { + logger.debug({ groupJid, taskId }, 'Task already running, skipping'); + return; + } if (state.pendingTasks.some((t) => t.id === taskId)) { + logger.debug({ groupJid, taskId }, 'Task already
queued, skipping'); return; @@ -230,6 +236,7 @@ export class GroupQueue { state.active = true; state.idleWaiting = false; state.isTaskContainer = true; + state.runningTaskId = task.id; this.activeCount++; logger.debug( @@ -244,6 +251,7 @@ export class GroupQueue { } finally { state.active = false; state.isTaskContainer = false; + state.runningTaskId = null; state.process = null; state.containerName = null; state.groupFolder = null; @@ -343,7 +351,7 @@ export class GroupQueue { // via idle timeout or container timeout. The --rm flag cleans them up on exit. // This prevents WhatsApp reconnection restarts from killing working agents. const activeContainers: string[] = []; - for (const [jid, state] of this.groups) { + for (const [_jid, state] of this.groups) { if (state.process && !state.process.killed && state.containerName) { activeContainers.push(state.containerName); } diff --git a/src/index.ts b/src/index.ts index 278a7a7..bf57823 100644 --- a/src/index.ts +++ b/src/index.ts @@ -1,14 +1,23 @@ import fs from 'fs'; import path from 'path'; +import { OneCLI } from '@onecli-sh/sdk'; + import { ASSISTANT_NAME, + DEFAULT_TRIGGER, + getTriggerPattern, + GROUPS_DIR, IDLE_TIMEOUT, - MAIN_GROUP_FOLDER, + ONECLI_URL, POLL_INTERVAL, - TRIGGER_PATTERN, + TIMEZONE, } from './config.js'; -import { WhatsAppChannel } from './channels/whatsapp.js'; +import './channels/index.js'; +import { + getChannelFactory, + getRegisteredChannelNames, +} from './channels/registry.js'; import { ContainerOutput, runContainerAgent, @@ -38,6 +47,17 @@ import { GroupQueue } from './group-queue.js'; import { resolveGroupFolderPath } from './group-folder.js'; import { startIpcWatcher } from './ipc.js'; import { findChannel, formatMessages, formatOutbound } from './router.js'; +import { + restoreRemoteControl, + startRemoteControl, + stopRemoteControl, +} from './remote-control.js'; +import { + isSenderAllowed, + isTriggerAllowed, + loadSenderAllowlist, + shouldDropMessage, +} from 
'./sender-allowlist.js'; import { startSchedulerLoop } from './task-scheduler.js'; import { Channel, NewMessage, RegisteredGroup } from './types.js'; import { logger } from './logger.js'; @@ -51,10 +71,30 @@ let registeredGroups: Record<string, RegisteredGroup> = {}; let lastAgentTimestamp: Record<string, string> = {}; let messageLoopRunning = false; -let whatsapp: WhatsAppChannel; const channels: Channel[] = []; const queue = new GroupQueue(); +const onecli = new OneCLI({ url: ONECLI_URL }); + +function ensureOneCLIAgent(jid: string, group: RegisteredGroup): void { + if (group.isMain) return; + const identifier = group.folder.toLowerCase().replace(/_/g, '-'); + onecli.ensureAgent({ name: group.name, identifier }).then( + (res) => { + logger.info( + { jid, identifier, created: res.created }, + 'OneCLI agent ensured', + ); + }, + (err) => { + logger.debug( + { jid, identifier, err: String(err) }, + 'OneCLI agent ensure skipped', + ); + }, + ); +} + function loadState(): void { lastTimestamp = getRouterState('last_timestamp') || ''; const agentTs = getRouterState('last_agent_timestamp'); @@ -95,6 +135,29 @@ function registerGroup(jid: string, group: RegisteredGroup): void { // Create group folder fs.mkdirSync(path.join(groupDir, 'logs'), { recursive: true }); + // Copy CLAUDE.md template into the new group folder so agents have + // identity and instructions from the first run. (Fixes #1391) + const groupMdFile = path.join(groupDir, 'CLAUDE.md'); + if (!fs.existsSync(groupMdFile)) { + const templateFile = path.join( + GROUPS_DIR, + group.isMain ? 
'main' : 'global', + 'CLAUDE.md', + ); + if (fs.existsSync(templateFile)) { + let content = fs.readFileSync(templateFile, 'utf-8'); + if (ASSISTANT_NAME !== 'Andy') { + content = content.replace(/^# Andy$/m, `# ${ASSISTANT_NAME}`); + content = content.replace(/You are Andy/g, `You are ${ASSISTANT_NAME}`); + } + fs.writeFileSync(groupMdFile, content); + logger.info({ folder: group.folder }, 'Created CLAUDE.md from template'); + } + } + + // Ensure a corresponding OneCLI agent exists (best-effort, non-blocking) + ensureOneCLIAgent(jid, group); + logger.info( { jid, name: group.name, folder: group.folder }, 'Group registered', @@ -140,7 +203,7 @@ async function processGroupMessages(chatJid: string): Promise<boolean> { return true; } - const isMainGroup = group.folder === MAIN_GROUP_FOLDER; + const isMainGroup = group.isMain === true; const sinceTimestamp = lastAgentTimestamp[chatJid] || ''; const missedMessages = getMessagesSince( @@ -153,13 +216,17 @@ async function processGroupMessages(chatJid: string): Promise<boolean> { // For non-main groups, check if trigger is required and present if (!isMainGroup && group.requiresTrigger !== false) { - const hasTrigger = missedMessages.some((m) => - TRIGGER_PATTERN.test(m.content.trim()), + const triggerPattern = getTriggerPattern(group.trigger); + const allowlistCfg = loadSenderAllowlist(); + const hasTrigger = missedMessages.some( + (m) => + triggerPattern.test(m.content.trim()) && + (m.is_from_me || isTriggerAllowed(chatJid, m.sender, allowlistCfg)), ); if (!hasTrigger) return true; } - const prompt = formatMessages(missedMessages); + const prompt = formatMessages(missedMessages, TIMEZONE); // Advance cursor so the piping path in startMessageLoop won't re-fetch // these messages. Save the old cursor so we can roll back on error. @@ -200,7 +267,7 @@ async function processGroupMessages(chatJid: string): Promise<boolean> { : JSON.stringify(result.result); // Strip <internal>...</internal> 
blocks — agent uses these for internal reasoning const text = raw.replace(/<internal>[\s\S]*?<\/internal>/g, '').trim(); - logger.info({ group: group.name }, `Agent output: ${raw.slice(0, 200)}`); + logger.info({ group: group.name }, `Agent output: ${raw.length} chars`); if (text) { await channel.sendMessage(chatJid, text); outputSentToUser = true; @@ -250,7 +317,7 @@ async function runAgent( chatJid: string, onOutput?: (output: ContainerOutput) => Promise<void>, ): Promise<'success' | 'error'> { - const isMain = group.folder === MAIN_GROUP_FOLDER; + const isMain = group.isMain === true; const sessionId = sessions[group.folder]; // Update tasks snapshot for container to read (filtered by group) @@ -262,6 +329,7 @@ async function runAgent( id: t.id, groupFolder: t.group_folder, prompt: t.prompt, + script: t.script || undefined, schedule_type: t.schedule_type, schedule_value: t.schedule_value, status: t.status, @@ -332,7 +400,7 @@ async function startMessageLoop(): Promise<void> { } messageLoopRunning = true; - logger.info(`NanoClaw running (trigger: @${ASSISTANT_NAME})`); + logger.info(`NanoClaw running (default trigger: ${DEFAULT_TRIGGER})`); while (true) { try { @@ -371,15 +439,20 @@ async function startMessageLoop(): Promise<void> { continue; } - const isMainGroup = group.folder === MAIN_GROUP_FOLDER; + const isMainGroup = group.isMain === true; const needsTrigger = !isMainGroup && group.requiresTrigger !== false; // For non-main groups, only act on trigger messages. // Non-trigger messages accumulate in DB and get pulled as // context when a trigger eventually arrives. 
if (needsTrigger) { - const hasTrigger = groupMessages.some((m) => - TRIGGER_PATTERN.test(m.content.trim()), + const triggerPattern = getTriggerPattern(group.trigger); + const allowlistCfg = loadSenderAllowlist(); + const hasTrigger = groupMessages.some( + (m) => + triggerPattern.test(m.content.trim()) && + (m.is_from_me || + isTriggerAllowed(chatJid, m.sender, allowlistCfg)), ); if (!hasTrigger) continue; } @@ -393,7 +466,7 @@ async function startMessageLoop(): Promise<void> { ); const messagesToSend = allPending.length > 0 ? allPending : groupMessages; - const formatted = formatMessages(messagesToSend); + const formatted = formatMessages(messagesToSend, TIMEZONE); if (queue.sendMessage(chatJid, formatted)) { logger.debug( @@ -451,6 +524,14 @@ async function main(): Promise<void> { logger.info('Database initialized'); loadState(); + // Ensure OneCLI agents exist for all registered groups. + // Recovers from missed creates (e.g. OneCLI was down at registration time). + for (const [jid, group] of Object.entries(registeredGroups)) { + ensureOneCLIAgent(jid, group); + } + + restoreRemoteControl(); + // Graceful shutdown handlers const shutdown = async (signal: string) => { logger.info({ signal }, 'Shutdown signal received'); @@ -461,9 +542,78 @@ async function main(): Promise<void> { process.on('SIGTERM', () => shutdown('SIGTERM')); process.on('SIGINT', () => shutdown('SIGINT')); + // Handle /remote-control and /remote-control-end commands + async function handleRemoteControl( + command: string, + chatJid: string, + msg: NewMessage, + ): Promise<void> { + const group = registeredGroups[chatJid]; + if (!group?.isMain) { + logger.warn( + { chatJid, sender: msg.sender }, + 'Remote control rejected: not main group', + ); + return; + } + + const channel = findChannel(channels, chatJid); + if (!channel) return; + + if (command === '/remote-control') { + const result = await startRemoteControl( + msg.sender, + chatJid, + process.cwd(), + ); + if (result.ok) { + await channel.sendMessage(chatJid, 
result.url); + } else { + await channel.sendMessage( + chatJid, + `Remote Control failed: ${result.error}`, + ); + } + } else { + const result = stopRemoteControl(); + if (result.ok) { + await channel.sendMessage(chatJid, 'Remote Control session ended.'); + } else { + await channel.sendMessage(chatJid, result.error); + } + } + } + // Channel callbacks (shared by all channels) const channelOpts = { - onMessage: (_chatJid: string, msg: NewMessage) => storeMessage(msg), + onMessage: (chatJid: string, msg: NewMessage) => { + // Remote control commands — intercept before storage + const trimmed = msg.content.trim(); + if (trimmed === '/remote-control' || trimmed === '/remote-control-end') { + handleRemoteControl(trimmed, chatJid, msg).catch((err) => + logger.error({ err, chatJid }, 'Remote control command error'), + ); + return; + } + + // Sender allowlist drop mode: discard messages from denied senders before storing + if (!msg.is_from_me && !msg.is_bot_message && registeredGroups[chatJid]) { + const cfg = loadSenderAllowlist(); + if ( + shouldDropMessage(chatJid, cfg) && + !isSenderAllowed(chatJid, msg.sender, cfg) + ) { + if (cfg.logDenied) { + logger.debug( + { chatJid, sender: msg.sender }, + 'sender-allowlist: dropping message (drop mode)', + ); + } + return; + } + } + storeMessage(msg); + }, onChatMetadata: ( chatJid: string, timestamp: string, @@ -474,10 +624,26 @@ async function main(): Promise { registeredGroups: () => registeredGroups, }; - // Create and connect channels - whatsapp = new WhatsAppChannel(channelOpts); - channels.push(whatsapp); - await whatsapp.connect(); + // Create and connect all registered channels. + // Each channel self-registers via the barrel import above. + // Factories return null when credentials are missing, so unconfigured channels are skipped. 
+ for (const channelName of getRegisteredChannelNames()) { + const factory = getChannelFactory(channelName)!; + const channel = factory(channelOpts); + if (!channel) { + logger.warn( + { channel: channelName }, + 'Channel installed but credentials missing — skipping. Check .env or re-run the channel skill.', + ); + continue; + } + channels.push(channel); + await channel.connect(); + } + if (channels.length === 0) { + logger.fatal('No channels connected'); + process.exit(1); + } // Start subsystems (independently of connection handler) startSchedulerLoop({ @@ -504,11 +670,32 @@ async function main(): Promise { }, registeredGroups: () => registeredGroups, registerGroup, - syncGroupMetadata: (force) => - whatsapp?.syncGroupMetadata(force) ?? Promise.resolve(), + syncGroups: async (force: boolean) => { + await Promise.all( + channels + .filter((ch) => ch.syncGroups) + .map((ch) => ch.syncGroups!(force)), + ); + }, getAvailableGroups, writeGroupsSnapshot: (gf, im, ag, rj) => writeGroupsSnapshot(gf, im, ag, rj), + onTasksChanged: () => { + const tasks = getAllTasks(); + const taskRows = tasks.map((t) => ({ + id: t.id, + groupFolder: t.group_folder, + prompt: t.prompt, + script: t.script || undefined, + schedule_type: t.schedule_type, + schedule_value: t.schedule_value, + status: t.status, + next_run: t.next_run, + })); + for (const group of Object.values(registeredGroups)) { + writeTasksSnapshot(group.folder, group.isMain === true, taskRows); + } + }, }); queue.setProcessMessagesFn(processGroupMessages); recoverPendingMessages(); diff --git a/src/ipc-auth.test.ts b/src/ipc-auth.test.ts index e155d44..0adf899 100644 --- a/src/ipc-auth.test.ts +++ b/src/ipc-auth.test.ts @@ -14,9 +14,10 @@ import { RegisteredGroup } from './types.js'; // Set up registered groups used across tests const MAIN_GROUP: RegisteredGroup = { name: 'Main', - folder: 'main', + folder: 'whatsapp_main', trigger: 'always', added_at: '2024-01-01T00:00:00.000Z', + isMain: true, }; const OTHER_GROUP: 
RegisteredGroup = { @@ -58,9 +59,10 @@ beforeEach(() => { setRegisteredGroup(jid, group); // Mock the fs.mkdirSync that registerGroup does }, - syncGroupMetadata: async () => {}, + syncGroups: async () => {}, getAvailableGroups: () => [], writeGroupsSnapshot: () => {}, + onTasksChanged: () => {}, }; }); @@ -73,10 +75,10 @@ describe('schedule_task authorization', () => { type: 'schedule_task', prompt: 'do something', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', targetJid: 'other@g.us', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -93,7 +95,7 @@ describe('schedule_task authorization', () => { type: 'schedule_task', prompt: 'self task', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', targetJid: 'other@g.us', }, 'other-group', @@ -112,7 +114,7 @@ describe('schedule_task authorization', () => { type: 'schedule_task', prompt: 'unauthorized', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', targetJid: 'main@g.us', }, 'other-group', @@ -130,10 +132,10 @@ describe('schedule_task authorization', () => { type: 'schedule_task', prompt: 'no target', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', targetJid: 'unknown@g.us', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -149,11 +151,11 @@ describe('pause_task authorization', () => { beforeEach(() => { createTask({ id: 'task-main', - group_folder: 'main', + group_folder: 'whatsapp_main', chat_jid: 'main@g.us', prompt: 'main task', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', context_mode: 'isolated', next_run: '2025-06-01T00:00:00.000Z', status: 'active', @@ -165,7 +167,7 @@ describe('pause_task authorization', () => { chat_jid: 'other@g.us', prompt: 'other task', schedule_type: 'once', - schedule_value: 
'2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', context_mode: 'isolated', next_run: '2025-06-01T00:00:00.000Z', status: 'active', @@ -176,7 +178,7 @@ describe('pause_task authorization', () => { it('main group can pause any task', async () => { await processTaskIpc( { type: 'pause_task', taskId: 'task-other' }, - 'main', + 'whatsapp_main', true, deps, ); @@ -214,7 +216,7 @@ describe('resume_task authorization', () => { chat_jid: 'other@g.us', prompt: 'paused task', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', context_mode: 'isolated', next_run: '2025-06-01T00:00:00.000Z', status: 'paused', @@ -225,7 +227,7 @@ describe('resume_task authorization', () => { it('main group can resume any task', async () => { await processTaskIpc( { type: 'resume_task', taskId: 'task-paused' }, - 'main', + 'whatsapp_main', true, deps, ); @@ -263,7 +265,7 @@ describe('cancel_task authorization', () => { chat_jid: 'other@g.us', prompt: 'cancel me', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', context_mode: 'isolated', next_run: null, status: 'active', @@ -272,7 +274,7 @@ describe('cancel_task authorization', () => { await processTaskIpc( { type: 'cancel_task', taskId: 'task-to-cancel' }, - 'main', + 'whatsapp_main', true, deps, ); @@ -286,7 +288,7 @@ describe('cancel_task authorization', () => { chat_jid: 'other@g.us', prompt: 'my task', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', context_mode: 'isolated', next_run: null, status: 'active', @@ -305,11 +307,11 @@ describe('cancel_task authorization', () => { it('non-main group cannot cancel another groups task', async () => { createTask({ id: 'task-foreign', - group_folder: 'main', + group_folder: 'whatsapp_main', chat_jid: 'main@g.us', prompt: 'not yours', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + 
schedule_value: '2025-06-01T00:00:00', context_mode: 'isolated', next_run: null, status: 'active', @@ -356,7 +358,7 @@ describe('register_group authorization', () => { folder: '../../outside', trigger: '@Andy', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -397,8 +399,12 @@ describe('IPC message authorization', () => { } it('main group can send to any group', () => { - expect(isMessageAuthorized('main', true, 'other@g.us', groups)).toBe(true); - expect(isMessageAuthorized('main', true, 'third@g.us', groups)).toBe(true); + expect( + isMessageAuthorized('whatsapp_main', true, 'other@g.us', groups), + ).toBe(true); + expect( + isMessageAuthorized('whatsapp_main', true, 'third@g.us', groups), + ).toBe(true); }); it('non-main group can send to its own chat', () => { @@ -424,9 +430,9 @@ describe('IPC message authorization', () => { it('main group can send to unregistered JID', () => { // Main is always authorized regardless of target - expect(isMessageAuthorized('main', true, 'unknown@g.us', groups)).toBe( - true, - ); + expect( + isMessageAuthorized('whatsapp_main', true, 'unknown@g.us', groups), + ).toBe(true); }); }); @@ -442,7 +448,7 @@ describe('schedule_task schedule types', () => { schedule_value: '0 9 * * *', // every day at 9am targetJid: 'other@g.us', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -466,7 +472,7 @@ describe('schedule_task schedule types', () => { schedule_value: 'not a cron', targetJid: 'other@g.us', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -485,7 +491,7 @@ describe('schedule_task schedule types', () => { schedule_value: '3600000', // 1 hour targetJid: 'other@g.us', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -508,7 +514,7 @@ describe('schedule_task schedule types', () => { schedule_value: 'abc', targetJid: 'other@g.us', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -525,7 +531,7 @@ describe('schedule_task schedule types', () => { schedule_value: '0', targetJid: 'other@g.us', }, - 'main', + 'whatsapp_main', true, deps, 
); @@ -542,7 +548,7 @@ describe('schedule_task schedule types', () => { schedule_value: 'not-a-date', targetJid: 'other@g.us', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -560,11 +566,11 @@ describe('schedule_task context_mode', () => { type: 'schedule_task', prompt: 'group context', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', context_mode: 'group', targetJid: 'other@g.us', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -579,11 +585,11 @@ describe('schedule_task context_mode', () => { type: 'schedule_task', prompt: 'isolated context', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', context_mode: 'isolated', targetJid: 'other@g.us', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -598,11 +604,11 @@ describe('schedule_task context_mode', () => { type: 'schedule_task', prompt: 'bad context', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', context_mode: 'bogus' as any, targetJid: 'other@g.us', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -617,10 +623,10 @@ describe('schedule_task context_mode', () => { type: 'schedule_task', prompt: 'no context mode', schedule_type: 'once', - schedule_value: '2025-06-01T00:00:00.000Z', + schedule_value: '2025-06-01T00:00:00', targetJid: 'other@g.us', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -642,7 +648,7 @@ describe('register_group success', () => { folder: 'new-group', trigger: '@Andy', }, - 'main', + 'whatsapp_main', true, deps, ); @@ -663,7 +669,7 @@ describe('register_group success', () => { name: 'Partial', // missing folder and trigger }, - 'main', + 'whatsapp_main', true, deps, ); diff --git a/src/ipc.ts b/src/ipc.ts index 52cf7d7..043b07a 100644 --- a/src/ipc.ts +++ b/src/ipc.ts @@ -3,12 +3,7 @@ import path from 'path'; import { CronExpressionParser } from 'cron-parser'; -import { - DATA_DIR, - IPC_POLL_INTERVAL, - 
MAIN_GROUP_FOLDER, - TIMEZONE, -} from './config.js'; +import { DATA_DIR, IPC_POLL_INTERVAL, TIMEZONE } from './config.js'; import { AvailableGroup } from './container-runner.js'; import { createTask, deleteTask, getTaskById, updateTask } from './db.js'; import { isValidGroupFolder } from './group-folder.js'; @@ -19,7 +14,7 @@ export interface IpcDeps { sendMessage: (jid: string, text: string) => Promise<void>; registeredGroups: () => Record<string, RegisteredGroup>; registerGroup: (jid: string, group: RegisteredGroup) => void; - syncGroupMetadata: (force: boolean) => Promise<void>; + syncGroups: (force: boolean) => Promise<void>; getAvailableGroups: () => AvailableGroup[]; writeGroupsSnapshot: ( groupFolder: string, @@ -27,6 +22,7 @@ availableGroups: AvailableGroup[], registeredJids: Set<string>, ) => void; + onTasksChanged: () => void; } let ipcWatcherRunning = false; @@ -57,8 +53,14 @@ export function startIpcWatcher(deps: IpcDeps): void { const registeredGroups = deps.registeredGroups(); + // Build folder→isMain lookup from registered groups + const folderIsMain = new Map<string, boolean>(); + for (const group of Object.values(registeredGroups)) { + if (group.isMain) folderIsMain.set(group.folder, true); + } + for (const sourceGroup of groupFolders) { - const isMain = sourceGroup === MAIN_GROUP_FOLDER; + const isMain = folderIsMain.get(sourceGroup) === true; const messagesDir = path.join(ipcBaseDir, sourceGroup, 'messages'); const tasksDir = path.join(ipcBaseDir, sourceGroup, 'tasks'); @@ -160,6 +162,7 @@ export async function processTaskIpc( schedule_type?: string; schedule_value?: string; context_mode?: string; + script?: string; groupFolder?: string; chatJid?: string; targetJid?: string; @@ -235,18 +238,20 @@ } nextRun = new Date(Date.now() + ms).toISOString(); } else if (scheduleType === 'once') { - const scheduled = new Date(data.schedule_value); - if (isNaN(scheduled.getTime())) { + const date = new Date(data.schedule_value); + if (isNaN(date.getTime())) { 
logger.warn( { scheduleValue: data.schedule_value }, 'Invalid timestamp', ); break; } - nextRun = scheduled.toISOString(); + nextRun = date.toISOString(); } - const taskId = `task-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`; + const taskId = + data.taskId || + `task-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`; const contextMode = data.context_mode === 'group' || data.context_mode === 'isolated' ? data.context_mode @@ -256,6 +261,7 @@ group_folder: targetFolder, chat_jid: targetJid, prompt: data.prompt, + script: data.script || null, schedule_type: scheduleType, schedule_value: data.schedule_value, context_mode: contextMode, @@ -267,6 +273,7 @@ { taskId, sourceGroup, targetFolder, contextMode }, 'Task created via IPC', ); + deps.onTasksChanged(); } break; @@ -279,6 +286,7 @@ { taskId: data.taskId, sourceGroup }, 'Task paused via IPC', ); + deps.onTasksChanged(); } else { logger.warn( { taskId: data.taskId, sourceGroup }, @@ -297,6 +305,7 @@ { taskId: data.taskId, sourceGroup }, 'Task resumed via IPC', ); + deps.onTasksChanged(); } else { logger.warn( { taskId: data.taskId, sourceGroup }, @@ -315,6 +324,7 @@ { taskId: data.taskId, sourceGroup }, 'Task cancelled via IPC', ); + deps.onTasksChanged(); } else { logger.warn( { taskId: data.taskId, sourceGroup }, @@ -324,6 +334,72 @@ } break; + case 'update_task': + if (data.taskId) { + const task = getTaskById(data.taskId); + if (!task) { + logger.warn( + { taskId: data.taskId, sourceGroup }, + 'Task not found for update', + ); + break; + } + if (!isMain && task.group_folder !== sourceGroup) { + logger.warn( + { taskId: data.taskId, sourceGroup }, + 'Unauthorized task update attempt', + ); + break; + } + + const updates: Parameters<typeof updateTask>[1] = {}; + if (data.prompt !== undefined) 
updates.prompt = data.prompt; + if (data.script !== undefined) updates.script = data.script || null; + if (data.schedule_type !== undefined) + updates.schedule_type = data.schedule_type as + | 'cron' + | 'interval' + | 'once'; + if (data.schedule_value !== undefined) + updates.schedule_value = data.schedule_value; + + // Recompute next_run if schedule changed + if (data.schedule_type || data.schedule_value) { + const updatedTask = { + ...task, + ...updates, + }; + if (updatedTask.schedule_type === 'cron') { + try { + const interval = CronExpressionParser.parse( + updatedTask.schedule_value, + { tz: TIMEZONE }, + ); + updates.next_run = interval.next().toISOString(); + } catch { + logger.warn( + { taskId: data.taskId, value: updatedTask.schedule_value }, + 'Invalid cron in task update', + ); + break; + } + } else if (updatedTask.schedule_type === 'interval') { + const ms = parseInt(updatedTask.schedule_value, 10); + if (!isNaN(ms) && ms > 0) { + updates.next_run = new Date(Date.now() + ms).toISOString(); + } + } + } + + updateTask(data.taskId, updates); + logger.info( + { taskId: data.taskId, sourceGroup, updates }, + 'Task updated via IPC', + ); + deps.onTasksChanged(); + } + break; + case 'refresh_groups': // Only main group can request a refresh if (isMain) { @@ -331,7 +407,7 @@ export async function processTaskIpc( { sourceGroup }, 'Group metadata refresh requested via IPC', ); - await deps.syncGroupMetadata(true); + await deps.syncGroups(true); // Write updated snapshot immediately const availableGroups = deps.getAvailableGroups(); deps.writeGroupsSnapshot( @@ -365,6 +441,7 @@ export async function processTaskIpc( ); break; } + // Defense in depth: agent cannot set isMain via IPC deps.registerGroup(data.jid, { name: data.name, folder: data.folder, diff --git a/src/logger.ts b/src/logger.ts index 273dc0f..6b18a9b 100644 --- a/src/logger.ts +++ b/src/logger.ts @@ -1,11 +1,78 @@ -import pino from 'pino'; +const LEVELS = { debug: 20, info: 30, warn: 40, error: 50, 
fatal: 60 } as const; +type Level = keyof typeof LEVELS; -export const logger = pino({ - level: process.env.LOG_LEVEL || 'info', - transport: { target: 'pino-pretty', options: { colorize: true } }, -}); +const COLORS: Record<Level, string> = { + debug: '\x1b[34m', + info: '\x1b[32m', + warn: '\x1b[33m', + error: '\x1b[31m', + fatal: '\x1b[41m\x1b[37m', +}; +const KEY_COLOR = '\x1b[35m'; +const MSG_COLOR = '\x1b[36m'; +const RESET = '\x1b[39m'; +const FULL_RESET = '\x1b[0m'; -// Route uncaught errors through pino so they get timestamps in stderr +const threshold = + LEVELS[(process.env.LOG_LEVEL as Level) || 'info'] ?? LEVELS.info; + +function formatErr(err: unknown): string { + if (err instanceof Error) { + return `{\n "type": "${err.constructor.name}",\n "message": "${err.message}",\n "stack":\n ${err.stack}\n }`; + } + return JSON.stringify(err); +} + +function formatData(data: Record<string, unknown>): string { + let out = ''; + for (const [k, v] of Object.entries(data)) { + if (k === 'err') { + out += `\n ${KEY_COLOR}err${RESET}: ${formatErr(v)}`; + } else { + out += `\n ${KEY_COLOR}${k}${RESET}: ${JSON.stringify(v)}`; + } + } + return out; +} + +function ts(): string { + const d = new Date(); + return `${String(d.getHours()).padStart(2, '0')}:${String(d.getMinutes()).padStart(2, '0')}:${String(d.getSeconds()).padStart(2, '0')}.${String(d.getMilliseconds()).padStart(3, '0')}`; +} + +function log( + level: Level, + dataOrMsg: Record<string, unknown> | string, + msg?: string, +): void { + if (LEVELS[level] < threshold) return; + const tag = `${COLORS[level]}${level.toUpperCase()}${level === 'fatal' ? FULL_RESET : RESET}`; + const stream = LEVELS[level] >= LEVELS.warn ?
process.stderr : process.stdout; + if (typeof dataOrMsg === 'string') { + stream.write( + `[${ts()}] ${tag} (${process.pid}): ${MSG_COLOR}${dataOrMsg}${RESET}\n`, + ); + } else { + stream.write( + `[${ts()}] ${tag} (${process.pid}): ${MSG_COLOR}${msg}${RESET}${formatData(dataOrMsg)}\n`, + ); + } +} + +export const logger = { + debug: (dataOrMsg: Record<string, unknown> | string, msg?: string) => + log('debug', dataOrMsg, msg), + info: (dataOrMsg: Record<string, unknown> | string, msg?: string) => + log('info', dataOrMsg, msg), + warn: (dataOrMsg: Record<string, unknown> | string, msg?: string) => + log('warn', dataOrMsg, msg), + error: (dataOrMsg: Record<string, unknown> | string, msg?: string) => + log('error', dataOrMsg, msg), + fatal: (dataOrMsg: Record<string, unknown> | string, msg?: string) => + log('fatal', dataOrMsg, msg), +}; + +// Route uncaught errors through logger so they get timestamps in stderr process.on('uncaughtException', (err) => { logger.fatal({ err }, 'Uncaught exception'); process.exit(1); diff --git a/src/mount-security.ts b/src/mount-security.ts index 3dceea5..a724876 100644 --- a/src/mount-security.ts +++ b/src/mount-security.ts @@ -9,16 +9,10 @@ import fs from 'fs'; import os from 'os'; import path from 'path'; -import pino from 'pino'; - import { MOUNT_ALLOWLIST_PATH } from './config.js'; +import { logger } from './logger.js'; import { AdditionalMount, AllowedRoot, MountAllowlist } from './types.js'; -const logger = pino({ - level: process.env.LOG_LEVEL || 'info', - transport: { target: 'pino-pretty', options: { colorize: true } }, -}); - // Cache the allowlist in memory - only reloads on process restart let cachedAllowlist: MountAllowlist | null = null; let allowlistLoadError: string | null = null; diff --git a/src/remote-control.test.ts b/src/remote-control.test.ts new file mode 100644 index 0000000..7dbf69c --- /dev/null +++ b/src/remote-control.test.ts @@ -0,0 +1,397 @@ +import fs from 'fs'; +import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest'; + +// Mock config before importing the module under
test +vi.mock('./config.js', () => ({ + DATA_DIR: '/tmp/nanoclaw-rc-test', +})); + +// Mock child_process +const spawnMock = vi.fn(); +vi.mock('child_process', () => ({ + spawn: (...args: any[]) => spawnMock(...args), +})); + +import { + startRemoteControl, + stopRemoteControl, + restoreRemoteControl, + getActiveSession, + _resetForTesting, + _getStateFilePath, +} from './remote-control.js'; + +// --- Helpers --- + +function createMockProcess(pid = 12345) { + return { + pid, + unref: vi.fn(), + kill: vi.fn(), + stdin: { write: vi.fn(), end: vi.fn() }, + }; +} + +describe('remote-control', () => { + const STATE_FILE = _getStateFilePath(); + let readFileSyncSpy: ReturnType<typeof vi.spyOn>; + let writeFileSyncSpy: ReturnType<typeof vi.spyOn>; + let unlinkSyncSpy: ReturnType<typeof vi.spyOn>; + let _mkdirSyncSpy: ReturnType<typeof vi.spyOn>; + let openSyncSpy: ReturnType<typeof vi.spyOn>; + let closeSyncSpy: ReturnType<typeof vi.spyOn>; + + // Track what readFileSync should return for the stdout file + let stdoutFileContent: string; + + beforeEach(() => { + _resetForTesting(); + spawnMock.mockReset(); + stdoutFileContent = ''; + + // Default fs mocks + _mkdirSyncSpy = vi + .spyOn(fs, 'mkdirSync') + .mockImplementation(() => undefined as any); + writeFileSyncSpy = vi + .spyOn(fs, 'writeFileSync') + .mockImplementation(() => {}); + unlinkSyncSpy = vi.spyOn(fs, 'unlinkSync').mockImplementation(() => {}); + openSyncSpy = vi.spyOn(fs, 'openSync').mockReturnValue(42 as any); + closeSyncSpy = vi.spyOn(fs, 'closeSync').mockImplementation(() => {}); + + // readFileSync: return stdoutFileContent for the stdout file, state file, etc.
+ readFileSyncSpy = vi.spyOn(fs, 'readFileSync').mockImplementation((( + p: string, + ) => { + if (p.endsWith('remote-control.stdout')) return stdoutFileContent; + if (p.endsWith('remote-control.json')) { + throw Object.assign(new Error('ENOENT'), { code: 'ENOENT' }); + } + return ''; + }) as any); + }); + + afterEach(() => { + _resetForTesting(); + vi.restoreAllMocks(); + }); + + // --- startRemoteControl --- + + describe('startRemoteControl', () => { + it('spawns claude remote-control and returns the URL', async () => { + const proc = createMockProcess(); + spawnMock.mockReturnValue(proc); + + // Simulate URL appearing in stdout file on first poll + stdoutFileContent = + 'Session URL: https://claude.ai/code?bridge=env_abc123\n'; + vi.spyOn(process, 'kill').mockImplementation((() => true) as any); + + const result = await startRemoteControl('user1', 'tg:123', '/project'); + + expect(result).toEqual({ + ok: true, + url: 'https://claude.ai/code?bridge=env_abc123', + }); + expect(spawnMock).toHaveBeenCalledWith( + 'claude', + ['remote-control', '--name', 'NanoClaw Remote'], + expect.objectContaining({ cwd: '/project', detached: true }), + ); + expect(proc.unref).toHaveBeenCalled(); + }); + + it('uses file descriptors for stdout/stderr (not pipes)', async () => { + const proc = createMockProcess(); + spawnMock.mockReturnValue(proc); + stdoutFileContent = 'https://claude.ai/code?bridge=env_test\n'; + vi.spyOn(process, 'kill').mockImplementation((() => true) as any); + + await startRemoteControl('user1', 'tg:123', '/project'); + + const spawnCall = spawnMock.mock.calls[0]; + const options = spawnCall[2]; + // stdio[0] is 'pipe' so we can write 'y' to accept the prompt + expect(options.stdio[0]).toBe('pipe'); + expect(typeof options.stdio[1]).toBe('number'); + expect(typeof options.stdio[2]).toBe('number'); + }); + + it('closes file descriptors in parent after spawn', async () => { + const proc = createMockProcess(); + spawnMock.mockReturnValue(proc); + stdoutFileContent 
= 'https://claude.ai/code?bridge=env_test\n'; + vi.spyOn(process, 'kill').mockImplementation((() => true) as any); + + await startRemoteControl('user1', 'tg:123', '/project'); + + // Two openSync calls (stdout + stderr), two closeSync calls + expect(openSyncSpy).toHaveBeenCalledTimes(2); + expect(closeSyncSpy).toHaveBeenCalledTimes(2); + }); + + it('saves state to disk after capturing URL', async () => { + const proc = createMockProcess(99999); + spawnMock.mockReturnValue(proc); + stdoutFileContent = 'https://claude.ai/code?bridge=env_save\n'; + vi.spyOn(process, 'kill').mockImplementation((() => true) as any); + + await startRemoteControl('user1', 'tg:123', '/project'); + + expect(writeFileSyncSpy).toHaveBeenCalledWith( + STATE_FILE, + expect.stringContaining('"pid":99999'), + ); + }); + + it('returns existing URL if session is already active', async () => { + const proc = createMockProcess(); + spawnMock.mockReturnValue(proc); + stdoutFileContent = 'https://claude.ai/code?bridge=env_existing\n'; + vi.spyOn(process, 'kill').mockImplementation((() => true) as any); + + await startRemoteControl('user1', 'tg:123', '/project'); + + // Second call should return existing URL without spawning + const result = await startRemoteControl('user2', 'tg:456', '/project'); + expect(result).toEqual({ + ok: true, + url: 'https://claude.ai/code?bridge=env_existing', + }); + expect(spawnMock).toHaveBeenCalledTimes(1); + }); + + it('starts new session if existing process is dead', async () => { + const proc1 = createMockProcess(11111); + const proc2 = createMockProcess(22222); + spawnMock.mockReturnValueOnce(proc1).mockReturnValueOnce(proc2); + + // First start: process alive, URL found + const killSpy = vi + .spyOn(process, 'kill') + .mockImplementation((() => true) as any); + stdoutFileContent = 'https://claude.ai/code?bridge=env_first\n'; + await startRemoteControl('user1', 'tg:123', '/project'); + + // Old process (11111) is dead, new process (22222) is alive + 
killSpy.mockImplementation(((pid: number, sig: any) => { + if (pid === 11111 && (sig === 0 || sig === undefined)) { + throw new Error('ESRCH'); + } + return true; + }) as any); + + stdoutFileContent = 'https://claude.ai/code?bridge=env_second\n'; + const result = await startRemoteControl('user1', 'tg:123', '/project'); + + expect(result).toEqual({ + ok: true, + url: 'https://claude.ai/code?bridge=env_second', + }); + expect(spawnMock).toHaveBeenCalledTimes(2); + }); + + it('returns error if process exits before URL', async () => { + const proc = createMockProcess(33333); + spawnMock.mockReturnValue(proc); + stdoutFileContent = ''; + + // Process is dead (poll will detect this) + vi.spyOn(process, 'kill').mockImplementation((() => { + throw new Error('ESRCH'); + }) as any); + + const result = await startRemoteControl('user1', 'tg:123', '/project'); + expect(result).toEqual({ + ok: false, + error: 'Process exited before producing URL', + }); + }); + + it('times out if URL never appears', async () => { + vi.useFakeTimers(); + const proc = createMockProcess(44444); + spawnMock.mockReturnValue(proc); + stdoutFileContent = 'no url here'; + vi.spyOn(process, 'kill').mockImplementation((() => true) as any); + + const promise = startRemoteControl('user1', 'tg:123', '/project'); + + // Advance past URL_TIMEOUT_MS (30s), with enough steps for polls + for (let i = 0; i < 160; i++) { + await vi.advanceTimersByTimeAsync(200); + } + + const result = await promise; + expect(result).toEqual({ + ok: false, + error: 'Timed out waiting for Remote Control URL', + }); + + vi.useRealTimers(); + }); + + it('returns error if spawn throws', async () => { + spawnMock.mockImplementation(() => { + throw new Error('ENOENT'); + }); + + const result = await startRemoteControl('user1', 'tg:123', '/project'); + expect(result).toEqual({ + ok: false, + error: 'Failed to start: ENOENT', + }); + }); + }); + + // --- stopRemoteControl --- + + describe('stopRemoteControl', () => { + it('kills the process 
and clears state', async () => { + const proc = createMockProcess(55555); + spawnMock.mockReturnValue(proc); + stdoutFileContent = 'https://claude.ai/code?bridge=env_stop\n'; + const killSpy = vi + .spyOn(process, 'kill') + .mockImplementation((() => true) as any); + + await startRemoteControl('user1', 'tg:123', '/project'); + + const result = stopRemoteControl(); + expect(result).toEqual({ ok: true }); + expect(killSpy).toHaveBeenCalledWith(55555, 'SIGTERM'); + expect(unlinkSyncSpy).toHaveBeenCalledWith(STATE_FILE); + expect(getActiveSession()).toBeNull(); + }); + + it('returns error when no session is active', () => { + const result = stopRemoteControl(); + expect(result).toEqual({ + ok: false, + error: 'No active Remote Control session', + }); + }); + }); + + // --- restoreRemoteControl --- + + describe('restoreRemoteControl', () => { + it('restores session if state file exists and process is alive', () => { + const session = { + pid: 77777, + url: 'https://claude.ai/code?bridge=env_restored', + startedBy: 'user1', + startedInChat: 'tg:123', + startedAt: '2026-01-01T00:00:00.000Z', + }; + readFileSyncSpy.mockImplementation(((p: string) => { + if (p.endsWith('remote-control.json')) return JSON.stringify(session); + return ''; + }) as any); + vi.spyOn(process, 'kill').mockImplementation((() => true) as any); + + restoreRemoteControl(); + + const active = getActiveSession(); + expect(active).not.toBeNull(); + expect(active!.pid).toBe(77777); + expect(active!.url).toBe('https://claude.ai/code?bridge=env_restored'); + }); + + it('clears state if process is dead', () => { + const session = { + pid: 88888, + url: 'https://claude.ai/code?bridge=env_dead', + startedBy: 'user1', + startedInChat: 'tg:123', + startedAt: '2026-01-01T00:00:00.000Z', + }; + readFileSyncSpy.mockImplementation(((p: string) => { + if (p.endsWith('remote-control.json')) return JSON.stringify(session); + return ''; + }) as any); + vi.spyOn(process, 'kill').mockImplementation((() => { + throw new 
Error('ESRCH'); + }) as any); + + restoreRemoteControl(); + + expect(getActiveSession()).toBeNull(); + expect(unlinkSyncSpy).toHaveBeenCalled(); + }); + + it('does nothing if no state file exists', () => { + // readFileSyncSpy default throws ENOENT for .json + restoreRemoteControl(); + expect(getActiveSession()).toBeNull(); + }); + + it('clears state on corrupted JSON', () => { + readFileSyncSpy.mockImplementation(((p: string) => { + if (p.endsWith('remote-control.json')) return 'not json{{{'; + return ''; + }) as any); + + restoreRemoteControl(); + + expect(getActiveSession()).toBeNull(); + expect(unlinkSyncSpy).toHaveBeenCalled(); + }); + + // ** This is the key integration test: restore → stop must work ** + it('stopRemoteControl works after restoreRemoteControl', () => { + const session = { + pid: 77777, + url: 'https://claude.ai/code?bridge=env_restored', + startedBy: 'user1', + startedInChat: 'tg:123', + startedAt: '2026-01-01T00:00:00.000Z', + }; + readFileSyncSpy.mockImplementation(((p: string) => { + if (p.endsWith('remote-control.json')) return JSON.stringify(session); + return ''; + }) as any); + const killSpy = vi + .spyOn(process, 'kill') + .mockImplementation((() => true) as any); + + restoreRemoteControl(); + expect(getActiveSession()).not.toBeNull(); + + const result = stopRemoteControl(); + expect(result).toEqual({ ok: true }); + expect(killSpy).toHaveBeenCalledWith(77777, 'SIGTERM'); + expect(unlinkSyncSpy).toHaveBeenCalled(); + expect(getActiveSession()).toBeNull(); + }); + + it('startRemoteControl returns restored URL without spawning', () => { + const session = { + pid: 77777, + url: 'https://claude.ai/code?bridge=env_restored', + startedBy: 'user1', + startedInChat: 'tg:123', + startedAt: '2026-01-01T00:00:00.000Z', + }; + readFileSyncSpy.mockImplementation(((p: string) => { + if (p.endsWith('remote-control.json')) return JSON.stringify(session); + return ''; + }) as any); + vi.spyOn(process, 'kill').mockImplementation((() => true) as any); + 
+ restoreRemoteControl(); + + return startRemoteControl('user2', 'tg:456', '/project').then( + (result) => { + expect(result).toEqual({ + ok: true, + url: 'https://claude.ai/code?bridge=env_restored', + }); + expect(spawnMock).not.toHaveBeenCalled(); + }, + ); + }); + }); +}); diff --git a/src/remote-control.ts b/src/remote-control.ts new file mode 100644 index 0000000..2f0bdc4 --- /dev/null +++ b/src/remote-control.ts @@ -0,0 +1,224 @@ +import { spawn } from 'child_process'; +import fs from 'fs'; +import path from 'path'; + +import { DATA_DIR } from './config.js'; +import { logger } from './logger.js'; + +interface RemoteControlSession { + pid: number; + url: string; + startedBy: string; + startedInChat: string; + startedAt: string; +} + +let activeSession: RemoteControlSession | null = null; + +const URL_REGEX = /https:\/\/claude\.ai\/code\S+/; +const URL_TIMEOUT_MS = 30_000; +const URL_POLL_MS = 200; +const STATE_FILE = path.join(DATA_DIR, 'remote-control.json'); +const STDOUT_FILE = path.join(DATA_DIR, 'remote-control.stdout'); +const STDERR_FILE = path.join(DATA_DIR, 'remote-control.stderr'); + +function saveState(session: RemoteControlSession): void { + fs.mkdirSync(path.dirname(STATE_FILE), { recursive: true }); + fs.writeFileSync(STATE_FILE, JSON.stringify(session)); +} + +function clearState(): void { + try { + fs.unlinkSync(STATE_FILE); + } catch { + // ignore + } +} + +function isProcessAlive(pid: number): boolean { + try { + process.kill(pid, 0); + return true; + } catch { + return false; + } +} + +/** + * Restore session from disk on startup. + * If the process is still alive, adopt it. Otherwise, clean up. 
+ */ +export function restoreRemoteControl(): void { + let data: string; + try { + data = fs.readFileSync(STATE_FILE, 'utf-8'); + } catch { + return; + } + + try { + const session: RemoteControlSession = JSON.parse(data); + if (session.pid && isProcessAlive(session.pid)) { + activeSession = session; + logger.info( + { pid: session.pid, url: session.url }, + 'Restored Remote Control session from previous run', + ); + } else { + clearState(); + } + } catch { + clearState(); + } +} + +export function getActiveSession(): RemoteControlSession | null { + return activeSession; +} + +/** @internal — exported for testing only */ +export function _resetForTesting(): void { + activeSession = null; +} + +/** @internal — exported for testing only */ +export function _getStateFilePath(): string { + return STATE_FILE; +} + +export async function startRemoteControl( + sender: string, + chatJid: string, + cwd: string, +): Promise<{ ok: true; url: string } | { ok: false; error: string }> { + if (activeSession) { + // Verify the process is still alive + if (isProcessAlive(activeSession.pid)) { + return { ok: true, url: activeSession.url }; + } + // Process died — clean up and start a new one + activeSession = null; + clearState(); + } + + // Redirect stdout/stderr to files so the process has no pipes to the parent. + // This prevents SIGPIPE when NanoClaw restarts. + fs.mkdirSync(DATA_DIR, { recursive: true }); + const stdoutFd = fs.openSync(STDOUT_FILE, 'w'); + const stderrFd = fs.openSync(STDERR_FILE, 'w'); + + let proc; + try { + proc = spawn('claude', ['remote-control', '--name', 'NanoClaw Remote'], { + cwd, + stdio: ['pipe', stdoutFd, stderrFd], + detached: true, + }); + } catch (err: any) { + fs.closeSync(stdoutFd); + fs.closeSync(stderrFd); + return { ok: false, error: `Failed to start: ${err.message}` }; + } + + // Auto-accept the "Enable Remote Control?" 
prompt + if (proc.stdin) { + proc.stdin.write('y\n'); + proc.stdin.end(); + } + + // Close FDs in the parent — the child inherited copies + fs.closeSync(stdoutFd); + fs.closeSync(stderrFd); + + // Fully detach from parent + proc.unref(); + + const pid = proc.pid; + if (!pid) { + return { ok: false, error: 'Failed to get process PID' }; + } + + // Poll the stdout file for the URL + return new Promise((resolve) => { + const startTime = Date.now(); + + const poll = () => { + // Check if process died + if (!isProcessAlive(pid)) { + resolve({ ok: false, error: 'Process exited before producing URL' }); + return; + } + + // Check for URL in stdout file + let content = ''; + try { + content = fs.readFileSync(STDOUT_FILE, 'utf-8'); + } catch { + // File might not have content yet + } + + const match = content.match(URL_REGEX); + if (match) { + const session: RemoteControlSession = { + pid, + url: match[0], + startedBy: sender, + startedInChat: chatJid, + startedAt: new Date().toISOString(), + }; + activeSession = session; + saveState(session); + + logger.info( + { url: match[0], pid, sender, chatJid }, + 'Remote Control session started', + ); + resolve({ ok: true, url: match[0] }); + return; + } + + // Timeout check + if (Date.now() - startTime >= URL_TIMEOUT_MS) { + try { + process.kill(-pid, 'SIGTERM'); + } catch { + try { + process.kill(pid, 'SIGTERM'); + } catch { + // already dead + } + } + resolve({ + ok: false, + error: 'Timed out waiting for Remote Control URL', + }); + return; + } + + setTimeout(poll, URL_POLL_MS); + }; + + poll(); + }); +} + +export function stopRemoteControl(): + | { + ok: true; + } + | { ok: false; error: string } { + if (!activeSession) { + return { ok: false, error: 'No active Remote Control session' }; + } + + const { pid } = activeSession; + try { + process.kill(pid, 'SIGTERM'); + } catch { + // already dead + } + activeSession = null; + clearState(); + logger.info({ pid }, 'Remote Control session stopped'); + return { ok: true }; +} diff 
--git a/src/router.ts b/src/router.ts index 3c9fbc0..c14ca89 100644 --- a/src/router.ts +++ b/src/router.ts @@ -1,4 +1,5 @@ import { Channel, NewMessage } from './types.js'; +import { formatLocalTime } from './timezone.js'; export function escapeXml(s: string): string { if (!s) return ''; @@ -9,12 +10,18 @@ export function escapeXml(s: string): string { .replace(/"/g, '&quot;'); } -export function formatMessages(messages: NewMessage[]): string { - const lines = messages.map( - (m) => - `<message sender="${escapeXml(m.sender)}" time="${m.timestamp}">${escapeXml(m.content)}</message>`, - ); - return `<messages>\n${lines.join('\n')}\n</messages>`; +export function formatMessages( + messages: NewMessage[], + timezone: string, +): string { + const lines = messages.map((m) => { + const displayTime = formatLocalTime(m.timestamp, timezone); + return `<message sender="${escapeXml(m.sender)}" time="${displayTime}">${escapeXml(m.content)}</message>`; + }); + + const header = `<messages timezone="${escapeXml(timezone)}">`; + + return `${header}\n${lines.join('\n')}\n</messages>`; } export function stripInternalTags(text: string): string { diff --git a/src/routing.test.ts b/src/routing.test.ts index 32bfc1f..6e44586 100644 --- a/src/routing.test.ts +++ b/src/routing.test.ts @@ -1,6 +1,6 @@ import { describe, it, expect, beforeEach } from 'vitest'; -import { _initTestDatabase, getAllChats, storeChatMetadata } from './db.js'; +import { _initTestDatabase, storeChatMetadata } from './db.js'; import { getAvailableGroups, _setRegisteredGroups } from './index.js'; beforeEach(() => { diff --git a/src/sender-allowlist.test.ts b/src/sender-allowlist.test.ts new file mode 100644 index 0000000..5bb8569 --- /dev/null +++ b/src/sender-allowlist.test.ts @@ -0,0 +1,216 @@ +import fs from 'fs'; +import os from 'os'; +import path from 'path'; +import { afterEach, beforeEach, describe, expect, it } from 'vitest'; + +import { + isSenderAllowed, + isTriggerAllowed, + loadSenderAllowlist, + SenderAllowlistConfig, + shouldDropMessage, +} from './sender-allowlist.js'; + +let tmpDir: string; + +function cfgPath(name = 'sender-allowlist.json'): string { + return path.join(tmpDir, name); +} + +function writeConfig(config:
unknown, name?: string): string { + const p = cfgPath(name); + fs.writeFileSync(p, JSON.stringify(config)); + return p; +} + +beforeEach(() => { + tmpDir = fs.mkdtempSync(path.join(os.tmpdir(), 'allowlist-test-')); +}); + +afterEach(() => { + fs.rmSync(tmpDir, { recursive: true, force: true }); +}); + +describe('loadSenderAllowlist', () => { + it('returns allow-all defaults when file is missing', () => { + const cfg = loadSenderAllowlist(cfgPath()); + expect(cfg.default.allow).toBe('*'); + expect(cfg.default.mode).toBe('trigger'); + expect(cfg.logDenied).toBe(true); + }); + + it('loads allow=* config', () => { + const p = writeConfig({ + default: { allow: '*', mode: 'trigger' }, + chats: {}, + logDenied: false, + }); + const cfg = loadSenderAllowlist(p); + expect(cfg.default.allow).toBe('*'); + expect(cfg.logDenied).toBe(false); + }); + + it('loads allow=[] (deny all)', () => { + const p = writeConfig({ + default: { allow: [], mode: 'trigger' }, + chats: {}, + }); + const cfg = loadSenderAllowlist(p); + expect(cfg.default.allow).toEqual([]); + }); + + it('loads allow=[list]', () => { + const p = writeConfig({ + default: { allow: ['alice', 'bob'], mode: 'drop' }, + chats: {}, + }); + const cfg = loadSenderAllowlist(p); + expect(cfg.default.allow).toEqual(['alice', 'bob']); + expect(cfg.default.mode).toBe('drop'); + }); + + it('per-chat override beats default', () => { + const p = writeConfig({ + default: { allow: '*', mode: 'trigger' }, + chats: { 'group-a': { allow: ['alice'], mode: 'drop' } }, + }); + const cfg = loadSenderAllowlist(p); + expect(cfg.chats['group-a'].allow).toEqual(['alice']); + expect(cfg.chats['group-a'].mode).toBe('drop'); + }); + + it('returns allow-all on invalid JSON', () => { + const p = cfgPath(); + fs.writeFileSync(p, '{ not valid json }}}'); + const cfg = loadSenderAllowlist(p); + expect(cfg.default.allow).toBe('*'); + }); + + it('returns allow-all on invalid schema', () => { + const p = writeConfig({ default: { oops: true } }); + const 
cfg = loadSenderAllowlist(p); + expect(cfg.default.allow).toBe('*'); + }); + + it('rejects non-string allow array items', () => { + const p = writeConfig({ + default: { allow: [123, null, true], mode: 'trigger' }, + chats: {}, + }); + const cfg = loadSenderAllowlist(p); + expect(cfg.default.allow).toBe('*'); // falls back to default + }); + + it('skips invalid per-chat entries', () => { + const p = writeConfig({ + default: { allow: '*', mode: 'trigger' }, + chats: { + good: { allow: ['alice'], mode: 'trigger' }, + bad: { allow: 123 }, + }, + }); + const cfg = loadSenderAllowlist(p); + expect(cfg.chats['good']).toBeDefined(); + expect(cfg.chats['bad']).toBeUndefined(); + }); +}); + +describe('isSenderAllowed', () => { + it('allow=* allows any sender', () => { + const cfg: SenderAllowlistConfig = { + default: { allow: '*', mode: 'trigger' }, + chats: {}, + logDenied: true, + }; + expect(isSenderAllowed('g1', 'anyone', cfg)).toBe(true); + }); + + it('allow=[] denies any sender', () => { + const cfg: SenderAllowlistConfig = { + default: { allow: [], mode: 'trigger' }, + chats: {}, + logDenied: true, + }; + expect(isSenderAllowed('g1', 'anyone', cfg)).toBe(false); + }); + + it('allow=[list] allows exact match only', () => { + const cfg: SenderAllowlistConfig = { + default: { allow: ['alice', 'bob'], mode: 'trigger' }, + chats: {}, + logDenied: true, + }; + expect(isSenderAllowed('g1', 'alice', cfg)).toBe(true); + expect(isSenderAllowed('g1', 'eve', cfg)).toBe(false); + }); + + it('uses per-chat entry over default', () => { + const cfg: SenderAllowlistConfig = { + default: { allow: '*', mode: 'trigger' }, + chats: { g1: { allow: ['alice'], mode: 'trigger' } }, + logDenied: true, + }; + expect(isSenderAllowed('g1', 'bob', cfg)).toBe(false); + expect(isSenderAllowed('g2', 'bob', cfg)).toBe(true); + }); +}); + +describe('shouldDropMessage', () => { + it('returns false for trigger mode', () => { + const cfg: SenderAllowlistConfig = { + default: { allow: '*', mode: 'trigger' 
}, + chats: {}, + logDenied: true, + }; + expect(shouldDropMessage('g1', cfg)).toBe(false); + }); + + it('returns true for drop mode', () => { + const cfg: SenderAllowlistConfig = { + default: { allow: '*', mode: 'drop' }, + chats: {}, + logDenied: true, + }; + expect(shouldDropMessage('g1', cfg)).toBe(true); + }); + + it('per-chat mode override', () => { + const cfg: SenderAllowlistConfig = { + default: { allow: '*', mode: 'trigger' }, + chats: { g1: { allow: '*', mode: 'drop' } }, + logDenied: true, + }; + expect(shouldDropMessage('g1', cfg)).toBe(true); + expect(shouldDropMessage('g2', cfg)).toBe(false); + }); +}); + +describe('isTriggerAllowed', () => { + it('allows trigger for allowed sender', () => { + const cfg: SenderAllowlistConfig = { + default: { allow: ['alice'], mode: 'trigger' }, + chats: {}, + logDenied: false, + }; + expect(isTriggerAllowed('g1', 'alice', cfg)).toBe(true); + }); + + it('denies trigger for disallowed sender', () => { + const cfg: SenderAllowlistConfig = { + default: { allow: ['alice'], mode: 'trigger' }, + chats: {}, + logDenied: false, + }; + expect(isTriggerAllowed('g1', 'eve', cfg)).toBe(false); + }); + + it('logs when logDenied is true', () => { + const cfg: SenderAllowlistConfig = { + default: { allow: ['alice'], mode: 'trigger' }, + chats: {}, + logDenied: true, + }; + isTriggerAllowed('g1', 'eve', cfg); + // logger.debug is called on denial; here we just verify the logging path does not crash + }); +}); diff --git a/src/sender-allowlist.ts b/src/sender-allowlist.ts new file mode 100644 index 0000000..9cc2bde --- /dev/null +++ b/src/sender-allowlist.ts @@ -0,0 +1,128 @@ +import fs from 'fs'; + +import { SENDER_ALLOWLIST_PATH } from './config.js'; +import { logger } from './logger.js'; + +export interface ChatAllowlistEntry { + allow: '*' | string[]; + mode: 'trigger' | 'drop'; +} + +export interface SenderAllowlistConfig { + default: ChatAllowlistEntry; + chats: Record<string, ChatAllowlistEntry>; + logDenied: boolean; +} + +const DEFAULT_CONFIG:
SenderAllowlistConfig = { + default: { allow: '*', mode: 'trigger' }, + chats: {}, + logDenied: true, +}; + +function isValidEntry(entry: unknown): entry is ChatAllowlistEntry { + if (!entry || typeof entry !== 'object') return false; + const e = entry as Record<string, unknown>; + const validAllow = + e.allow === '*' || + (Array.isArray(e.allow) && e.allow.every((v) => typeof v === 'string')); + const validMode = e.mode === 'trigger' || e.mode === 'drop'; + return validAllow && validMode; +} + +export function loadSenderAllowlist( + pathOverride?: string, +): SenderAllowlistConfig { + const filePath = pathOverride ?? SENDER_ALLOWLIST_PATH; + + let raw: string; + try { + raw = fs.readFileSync(filePath, 'utf-8'); + } catch (err: unknown) { + if ((err as NodeJS.ErrnoException).code === 'ENOENT') return DEFAULT_CONFIG; + logger.warn( + { err, path: filePath }, + 'sender-allowlist: cannot read config', + ); + return DEFAULT_CONFIG; + } + + let parsed: unknown; + try { + parsed = JSON.parse(raw); + } catch { + logger.warn({ path: filePath }, 'sender-allowlist: invalid JSON'); + return DEFAULT_CONFIG; + } + + const obj = parsed as Record<string, unknown>; + + if (!isValidEntry(obj.default)) { + logger.warn( + { path: filePath }, + 'sender-allowlist: invalid or missing default entry', + ); + return DEFAULT_CONFIG; + } + + const chats: Record<string, ChatAllowlistEntry> = {}; + if (obj.chats && typeof obj.chats === 'object') { + for (const [jid, entry] of Object.entries( + obj.chats as Record<string, unknown>, + )) { + if (isValidEntry(entry)) { + chats[jid] = entry; + } else { + logger.warn( + { jid, path: filePath }, + 'sender-allowlist: skipping invalid chat entry', + ); + } + } + } + + return { + default: obj.default as ChatAllowlistEntry, + chats, + logDenied: obj.logDenied !== false, + }; +} + +function getEntry( + chatJid: string, + cfg: SenderAllowlistConfig, +): ChatAllowlistEntry { + return cfg.chats[chatJid] ??
cfg.default; +} + +export function isSenderAllowed( + chatJid: string, + sender: string, + cfg: SenderAllowlistConfig, +): boolean { + const entry = getEntry(chatJid, cfg); + if (entry.allow === '*') return true; + return entry.allow.includes(sender); +} + +export function shouldDropMessage( + chatJid: string, + cfg: SenderAllowlistConfig, +): boolean { + return getEntry(chatJid, cfg).mode === 'drop'; +} + +export function isTriggerAllowed( + chatJid: string, + sender: string, + cfg: SenderAllowlistConfig, +): boolean { + const allowed = isSenderAllowed(chatJid, sender, cfg); + if (!allowed && cfg.logDenied) { + logger.debug( + { chatJid, sender }, + 'sender-allowlist: trigger denied for sender', + ); + } + return allowed; +} diff --git a/src/task-scheduler.test.ts b/src/task-scheduler.test.ts index 62129e8..2032b51 100644 --- a/src/task-scheduler.test.ts +++ b/src/task-scheduler.test.ts @@ -3,6 +3,7 @@ import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'; import { _initTestDatabase, createTask, getTaskById } from './db.js'; import { _resetSchedulerLoopForTests, + computeNextRun, startSchedulerLoop, } from './task-scheduler.js'; @@ -50,4 +51,79 @@ describe('task scheduler', () => { const task = getTaskById('task-invalid-folder'); expect(task?.status).toBe('paused'); }); + + it('computeNextRun anchors interval tasks to scheduled time to prevent drift', () => { + const scheduledTime = new Date(Date.now() - 2000).toISOString(); // 2s ago + const task = { + id: 'drift-test', + group_folder: 'test', + chat_jid: 'test@g.us', + prompt: 'test', + schedule_type: 'interval' as const, + schedule_value: '60000', // 1 minute + context_mode: 'isolated' as const, + next_run: scheduledTime, + last_run: null, + last_result: null, + status: 'active' as const, + created_at: '2026-01-01T00:00:00.000Z', + }; + + const nextRun = computeNextRun(task); + expect(nextRun).not.toBeNull(); + + // Should be anchored to scheduledTime + 60s, NOT Date.now() + 60s + const 
expected = new Date(scheduledTime).getTime() + 60000; + expect(new Date(nextRun!).getTime()).toBe(expected); + }); + + it('computeNextRun returns null for once-tasks', () => { + const task = { + id: 'once-test', + group_folder: 'test', + chat_jid: 'test@g.us', + prompt: 'test', + schedule_type: 'once' as const, + schedule_value: '2026-01-01T00:00:00.000Z', + context_mode: 'isolated' as const, + next_run: new Date(Date.now() - 1000).toISOString(), + last_run: null, + last_result: null, + status: 'active' as const, + created_at: '2026-01-01T00:00:00.000Z', + }; + + expect(computeNextRun(task)).toBeNull(); + }); + + it('computeNextRun skips missed intervals without infinite loop', () => { + // Task was due 10 intervals ago (missed) + const ms = 60000; + const missedBy = ms * 10; + const scheduledTime = new Date(Date.now() - missedBy).toISOString(); + + const task = { + id: 'skip-test', + group_folder: 'test', + chat_jid: 'test@g.us', + prompt: 'test', + schedule_type: 'interval' as const, + schedule_value: String(ms), + context_mode: 'isolated' as const, + next_run: scheduledTime, + last_run: null, + last_result: null, + status: 'active' as const, + created_at: '2026-01-01T00:00:00.000Z', + }; + + const nextRun = computeNextRun(task); + expect(nextRun).not.toBeNull(); + // Must be in the future + expect(new Date(nextRun!).getTime()).toBeGreaterThan(Date.now()); + // Must be aligned to the original schedule grid + const offset = + (new Date(nextRun!).getTime() - new Date(scheduledTime).getTime()) % ms; + expect(offset).toBe(0); + }); }); diff --git a/src/task-scheduler.ts b/src/task-scheduler.ts index f6cfa72..f2b964d 100644 --- a/src/task-scheduler.ts +++ b/src/task-scheduler.ts @@ -2,12 +2,7 @@ import { ChildProcess } from 'child_process'; import { CronExpressionParser } from 'cron-parser'; import fs from 'fs'; -import { - ASSISTANT_NAME, - MAIN_GROUP_FOLDER, - SCHEDULER_POLL_INTERVAL, - TIMEZONE, -} from './config.js'; +import { ASSISTANT_NAME, 
SCHEDULER_POLL_INTERVAL, TIMEZONE } from './config.js'; import { ContainerOutput, runContainerAgent, @@ -26,6 +21,47 @@ import { resolveGroupFolderPath } from './group-folder.js'; import { logger } from './logger.js'; import { RegisteredGroup, ScheduledTask } from './types.js'; +/** + * Compute the next run time for a recurring task, anchored to the + * task's scheduled time rather than Date.now() to prevent cumulative + * drift on interval-based tasks. + * + * Co-authored-by: @community-pr-601 + */ +export function computeNextRun(task: ScheduledTask): string | null { + if (task.schedule_type === 'once') return null; + + const now = Date.now(); + + if (task.schedule_type === 'cron') { + const interval = CronExpressionParser.parse(task.schedule_value, { + tz: TIMEZONE, + }); + return interval.next().toISOString(); + } + + if (task.schedule_type === 'interval') { + const ms = parseInt(task.schedule_value, 10); + if (!ms || ms <= 0) { + // Guard against malformed interval that would cause an infinite loop + logger.warn( + { taskId: task.id, value: task.schedule_value }, + 'Invalid interval value', + ); + return new Date(now + 60_000).toISOString(); + } + // Anchor to the scheduled time, not now, to prevent drift. + // Skip past any missed intervals so we always land in the future. 
+ let next = new Date(task.next_run!).getTime() + ms; + while (next <= now) { + next += ms; + } + return new Date(next).toISOString(); + } + + return null; +} + export interface SchedulerDependencies { registeredGroups: () => Record<string, RegisteredGroup>; getSessions: () => Record<string, string>; @@ -94,7 +130,7 @@ async function runTask( } // Update tasks snapshot for container to read (filtered by group) - const isMain = task.group_folder === MAIN_GROUP_FOLDER; + const isMain = group.isMain === true; const tasks = getAllTasks(); writeTasksSnapshot( task.group_folder, @@ -103,6 +139,7 @@ async function runTask( id: t.id, groupFolder: t.group_folder, prompt: t.prompt, + script: t.script, schedule_type: t.schedule_type, schedule_value: t.schedule_value, status: t.status, @@ -143,6 +180,7 @@ async function runTask( isMain, isScheduledTask: true, assistantName: ASSISTANT_NAME, + script: task.script || undefined, }, (proc, containerName) => deps.onProcess(task.chat_jid, proc, containerName, task.group_folder), @@ -155,6 +193,7 @@ async function runTask( } if (streamedOutput.status === 'success') { deps.queue.notifyIdle(task.chat_jid); + scheduleClose(); // Close promptly even when result is null (e.g.
IPC-only tasks) } if (streamedOutput.status === 'error') { error = streamedOutput.error || 'Unknown error'; @@ -167,7 +206,7 @@ async function runTask( if (output.status === 'error') { error = output.error || 'Unknown error'; } else if (output.result) { - // Messages are sent via MCP tool (IPC), result text is just logged + // Result was already forwarded to the user via the streaming callback above result = output.result; } @@ -192,18 +231,7 @@ async function runTask( error, }); - let nextRun: string | null = null; - if (task.schedule_type === 'cron') { - const interval = CronExpressionParser.parse(task.schedule_value, { - tz: TIMEZONE, - }); - nextRun = interval.next().toISOString(); - } else if (task.schedule_type === 'interval') { - const ms = parseInt(task.schedule_value, 10); - nextRun = new Date(Date.now() + ms).toISOString(); - } - // 'once' tasks have no next run - + const nextRun = computeNextRun(task); const resultSummary = error ? `Error: ${error}` : result diff --git a/src/timezone.test.ts b/src/timezone.test.ts new file mode 100644 index 0000000..1003a61 --- /dev/null +++ b/src/timezone.test.ts @@ -0,0 +1,73 @@ +import { describe, it, expect } from 'vitest'; + +import { + formatLocalTime, + isValidTimezone, + resolveTimezone, +} from './timezone.js'; + +// --- formatLocalTime --- + +describe('formatLocalTime', () => { + it('converts UTC to local time display', () => { + // 2026-02-04T18:30:00Z in America/New_York (EST, UTC-5) = 1:30 PM + const result = formatLocalTime( + '2026-02-04T18:30:00.000Z', + 'America/New_York', + ); + expect(result).toContain('1:30'); + expect(result).toContain('PM'); + expect(result).toContain('Feb'); + expect(result).toContain('2026'); + }); + + it('handles different timezones', () => { + // Same UTC time should produce different local times + const utc = '2026-06-15T12:00:00.000Z'; + const ny = formatLocalTime(utc, 'America/New_York'); + const tokyo = formatLocalTime(utc, 'Asia/Tokyo'); + // NY is UTC-4 in summer (EDT), 
Tokyo is UTC+9 + expect(ny).toContain('8:00'); + expect(tokyo).toContain('9:00'); + }); + + it('does not throw on invalid timezone, falls back to UTC', () => { + expect(() => + formatLocalTime('2026-01-01T00:00:00.000Z', 'IST-2'), + ).not.toThrow(); + const result = formatLocalTime('2026-01-01T12:00:00.000Z', 'IST-2'); + // Should format as UTC (noon UTC = 12:00 PM) + expect(result).toContain('12:00'); + expect(result).toContain('PM'); + }); +}); + +describe('isValidTimezone', () => { + it('accepts valid IANA identifiers', () => { + expect(isValidTimezone('America/New_York')).toBe(true); + expect(isValidTimezone('UTC')).toBe(true); + expect(isValidTimezone('Asia/Tokyo')).toBe(true); + expect(isValidTimezone('Asia/Jerusalem')).toBe(true); + }); + + it('rejects invalid timezone strings', () => { + expect(isValidTimezone('IST-2')).toBe(false); + expect(isValidTimezone('XYZ+3')).toBe(false); + }); + + it('rejects empty and garbage strings', () => { + expect(isValidTimezone('')).toBe(false); + expect(isValidTimezone('NotATimezone')).toBe(false); + }); +}); + +describe('resolveTimezone', () => { + it('returns the timezone if valid', () => { + expect(resolveTimezone('America/New_York')).toBe('America/New_York'); + }); + + it('falls back to UTC for invalid timezone', () => { + expect(resolveTimezone('IST-2')).toBe('UTC'); + expect(resolveTimezone('')).toBe('UTC'); + }); +}); diff --git a/src/timezone.ts b/src/timezone.ts new file mode 100644 index 0000000..d8cc6cc --- /dev/null +++ b/src/timezone.ts @@ -0,0 +1,37 @@ +/** + * Check whether a timezone string is a valid IANA identifier + * that Intl.DateTimeFormat can use. + */ +export function isValidTimezone(tz: string): boolean { + try { + Intl.DateTimeFormat(undefined, { timeZone: tz }); + return true; + } catch { + return false; + } +} + +/** + * Return the given timezone if valid IANA, otherwise fall back to UTC. + */ +export function resolveTimezone(tz: string): string { + return isValidTimezone(tz) ? 
tz : 'UTC'; +} + +/** + * Convert a UTC ISO timestamp to a localized display string. + * Uses the Intl API (no external dependencies). + * Falls back to UTC if the timezone is invalid. + */ +export function formatLocalTime(utcIso: string, timezone: string): string { + const date = new Date(utcIso); + return date.toLocaleString('en-US', { + timeZone: resolveTimezone(timezone), + year: 'numeric', + month: 'short', + day: 'numeric', + hour: 'numeric', + minute: '2-digit', + hour12: true, + }); +} diff --git a/src/types.ts b/src/types.ts index 7038b3a..bcef463 100644 --- a/src/types.ts +++ b/src/types.ts @@ -39,6 +39,7 @@ export interface RegisteredGroup { added_at: string; containerConfig?: ContainerConfig; requiresTrigger?: boolean; // Default: true for groups, false for solo chats + isMain?: boolean; // True for the main control group (no trigger, elevated privileges) } export interface NewMessage { @@ -57,6 +58,7 @@ export interface ScheduledTask { group_folder: string; chat_jid: string; prompt: string; + script?: string | null; schedule_type: 'cron' | 'interval' | 'once'; schedule_value: string; context_mode: 'group' | 'isolated'; @@ -87,6 +89,8 @@ export interface Channel { disconnect(): Promise<void>; // Optional: typing indicator. Channels that support it implement it. setTyping?(jid: string, isTyping: boolean): Promise<void>; + // Optional: sync group/chat names from the platform. + syncGroups?(force: boolean): Promise<void>; } // Callback type that channels use to deliver inbound messages @@ -94,7 +98,7 @@ export type OnInboundMessage = (chatJid: string, message: NewMessage) => void; // Callback for chat metadata discovery. // name is optional — channels that deliver names inline (Telegram) pass it here; -// channels that sync names separately (WhatsApp syncGroupMetadata) omit it. +// channels that sync names separately (via syncGroups) omit it. 
export type OnChatMetadata = ( chatJid: string, timestamp: string, diff --git a/src/whatsapp-auth.ts b/src/whatsapp-auth.ts deleted file mode 100644 index 48545d1..0000000 --- a/src/whatsapp-auth.ts +++ /dev/null @@ -1,180 +0,0 @@ -/** - * WhatsApp Authentication Script - * - * Run this during setup to authenticate with WhatsApp. - * Displays QR code, waits for scan, saves credentials, then exits. - * - * Usage: npx tsx src/whatsapp-auth.ts - */ -import fs from 'fs'; -import path from 'path'; -import pino from 'pino'; -import qrcode from 'qrcode-terminal'; -import readline from 'readline'; - -import makeWASocket, { - Browsers, - DisconnectReason, - fetchLatestWaWebVersion, - makeCacheableSignalKeyStore, - useMultiFileAuthState, -} from '@whiskeysockets/baileys'; - -const AUTH_DIR = './store/auth'; -const QR_FILE = './store/qr-data.txt'; -const STATUS_FILE = './store/auth-status.txt'; - -const logger = pino({ - level: 'warn', // Quiet logging - only show errors -}); - -// Check for --pairing-code flag and phone number -const usePairingCode = process.argv.includes('--pairing-code'); -const phoneArg = process.argv.find((_, i, arr) => arr[i - 1] === '--phone'); - -function askQuestion(prompt: string): Promise<string> { - const rl = readline.createInterface({ - input: process.stdin, - output: process.stdout, - }); - return new Promise((resolve) => { - rl.question(prompt, (answer) => { - rl.close(); - resolve(answer.trim()); - }); - }); -} - -async function connectSocket( - phoneNumber?: string, - isReconnect = false, -): Promise<void> { - const { state, saveCreds } = await useMultiFileAuthState(AUTH_DIR); - - if (state.creds.registered && !isReconnect) { - fs.writeFileSync(STATUS_FILE, 'already_authenticated'); - console.log('✓ Already authenticated with WhatsApp'); - console.log( - ' To re-authenticate, delete the store/auth folder and run again.', - ); - process.exit(0); - } - - const { version } = await fetchLatestWaWebVersion({}).catch((err) => { - logger.warn( - { err }, 
'Failed to fetch latest WA Web version, using default', - ); - return { version: undefined }; - }); - const sock = makeWASocket({ - version, - auth: { - creds: state.creds, - keys: makeCacheableSignalKeyStore(state.keys, logger), - }, - printQRInTerminal: false, - logger, - browser: Browsers.macOS('Chrome'), - }); - - if (usePairingCode && phoneNumber && !state.creds.me) { - // Request pairing code after a short delay for connection to initialize - // Only on first connect (not reconnect after 515) - setTimeout(async () => { - try { - const code = await sock.requestPairingCode(phoneNumber!); - console.log(`\n🔗 Your pairing code: ${code}\n`); - console.log(' 1. Open WhatsApp on your phone'); - console.log(' 2. Tap Settings → Linked Devices → Link a Device'); - console.log(' 3. Tap "Link with phone number instead"'); - console.log(` 4. Enter this code: ${code}\n`); - fs.writeFileSync(STATUS_FILE, `pairing_code:${code}`); - } catch (err: any) { - console.error('Failed to request pairing code:', err.message); - process.exit(1); - } - }, 3000); - } - - sock.ev.on('connection.update', (update) => { - const { connection, lastDisconnect, qr } = update; - - if (qr) { - // Write raw QR data to file so the setup skill can render it - fs.writeFileSync(QR_FILE, qr); - console.log('Scan this QR code with WhatsApp:\n'); - console.log(' 1. Open WhatsApp on your phone'); - console.log(' 2. Tap Settings → Linked Devices → Link a Device'); - console.log(' 3. Point your camera at the QR code below\n'); - qrcode.generate(qr, { small: true }); - } - - if (connection === 'close') { - const reason = (lastDisconnect?.error as any)?.output?.statusCode; - - if (reason === DisconnectReason.loggedOut) { - fs.writeFileSync(STATUS_FILE, 'failed:logged_out'); - console.log('\n✗ Logged out. Delete store/auth and try again.'); - process.exit(1); - } else if (reason === DisconnectReason.timedOut) { - fs.writeFileSync(STATUS_FILE, 'failed:qr_timeout'); - console.log('\n✗ QR code timed out. 
Please try again.'); - process.exit(1); - } else if (reason === 515) { - // 515 = stream error, often happens after pairing succeeds but before - // registration completes. Reconnect to finish the handshake. - console.log('\n⟳ Stream error (515) after pairing — reconnecting...'); - connectSocket(phoneNumber, true); - } else { - fs.writeFileSync(STATUS_FILE, `failed:${reason || 'unknown'}`); - console.log('\n✗ Connection failed. Please try again.'); - process.exit(1); - } - } - - if (connection === 'open') { - fs.writeFileSync(STATUS_FILE, 'authenticated'); - // Clean up QR file now that we're connected - try { - fs.unlinkSync(QR_FILE); - } catch {} - console.log('\n✓ Successfully authenticated with WhatsApp!'); - console.log(' Credentials saved to store/auth/'); - console.log(' You can now start the NanoClaw service.\n'); - - // Give it a moment to save credentials, then exit - setTimeout(() => process.exit(0), 1000); - } - }); - - sock.ev.on('creds.update', saveCreds); -} - -async function authenticate(): Promise<void> { - fs.mkdirSync(AUTH_DIR, { recursive: true }); - - // Clean up any stale QR/status files from previous runs - try { - fs.unlinkSync(QR_FILE); - } catch {} - try { - fs.unlinkSync(STATUS_FILE); - } catch {} - - let phoneNumber = phoneArg; - if (usePairingCode && !phoneNumber) { - phoneNumber = await askQuestion( - 'Enter your phone number (with country code, no + or spaces, e.g. 
14155551234): ', - ); - } - - console.log('Starting WhatsApp authentication...\n'); - - await connectSocket(phoneNumber); -} - -authenticate().catch((err) => { - console.error('Authentication failed:', err.message); - process.exit(1); -}); diff --git a/vitest.config.ts b/vitest.config.ts index 354e6a5..a456d1c 100644 --- a/vitest.config.ts +++ b/vitest.config.ts @@ -2,6 +2,6 @@ import { defineConfig } from 'vitest/config'; export default defineConfig({ test: { - include: ['src/**/*.test.ts', 'setup/**/*.test.ts', 'skills-engine/**/*.test.ts'], + include: ['src/**/*.test.ts', 'setup/**/*.test.ts'], }, }); diff --git a/vitest.skills.config.ts b/vitest.skills.config.ts new file mode 100644 index 0000000..3be7fcd --- /dev/null +++ b/vitest.skills.config.ts @@ -0,0 +1,7 @@ +import { defineConfig } from 'vitest/config'; + +export default defineConfig({ + test: { + include: ['.claude/skills/**/tests/*.test.ts'], + }, +});
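Reviewer note: the interval branch of `computeNextRun` in this patch anchors the next run to the task's scheduled time rather than `Date.now()`. A minimal standalone sketch of that anchoring logic (not part of the patch; `nextRunAnchored` is a hypothetical helper that mirrors the loop in the diff):

```typescript
// Compute the next interval run anchored to the previously scheduled
// time, skipping any missed slots so the result is always in the future
// and always lands on the original schedule grid (no cumulative drift).
function nextRunAnchored(
  scheduledMs: number,
  intervalMs: number,
  nowMs: number,
): number {
  let next = scheduledMs + intervalMs;
  while (next <= nowMs) {
    next += intervalMs;
  }
  return next;
}

// A task due 10 intervals ago lands on the next future grid slot,
// not at now + interval (which would slide off the grid every run).
const interval = 60_000; // 1 minute
const scheduled = Date.UTC(2026, 0, 1, 0, 0, 0); // 00:00:00
const now = Date.UTC(2026, 0, 1, 0, 10, 30); // 00:10:30, 10 slots missed
const next = nextRunAnchored(scheduled, interval, now);
console.log(new Date(next).toISOString()); // → 2026-01-01T00:11:00.000Z
```

This is the same guard the patch's test exercises: the result is strictly in the future, and `(next - scheduled) % interval === 0`.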