nanoclaw/.claude/skills/add-ollama-tool/modify/container/agent-runner/src/index.ts.intent.md
daniviber 298c3eade4 feat: add /add-ollama skill for local model inference (#712)
* feat: add /add-ollama skill for local model inference

Adds a skill that integrates Ollama as an MCP server, allowing the
container agent to offload tasks to local models (summarization,
translation, general queries) while keeping Claude as orchestrator.

Skill contents:
- ollama-mcp-stdio.ts: stdio MCP server with ollama_list_models and
  ollama_generate tools
- ollama-watch.sh: macOS notification watcher for Ollama activity
- Modifications to index.ts (MCP config) and container-runner.ts
  (log surfacing)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: rename skill from /add-ollama to /add-ollama-tool

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: gavrielc <gabicohen22@yahoo.com>
2026-03-04 23:48:23 +02:00


Intent: container/agent-runner/src/index.ts modifications

What changed

Added Ollama MCP server configuration so the container agent can call local Ollama models as tools.

Key sections

allowedTools array (inside runQuery → options)

  • Added: 'mcp__ollama__*' to the allowedTools array (after 'mcp__nanoclaw__*')
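A minimal sketch of what the allowedTools change looks like; only `'mcp__nanoclaw__*'` and `'mcp__ollama__*'` come from this intent, and the surrounding entry is a hypothetical placeholder for the existing list:

```typescript
// Sketch of the allowedTools addition inside runQuery's options.
const allowedTools: string[] = [
  'Bash',             // hypothetical placeholder for pre-existing entries, unchanged
  'mcp__nanoclaw__*', // existing nanoclaw MCP wildcard, unchanged
  'mcp__ollama__*',   // new: permits every tool exposed by the ollama MCP server
];
```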

mcpServers object (inside runQuery → options)

  • Added: ollama entry as a stdio MCP server
    • command: 'node'
    • args: path.join(path.dirname(mcpServerPath), 'ollama-mcp-stdio.js') — resolves ollama-mcp-stdio.js in the same directory as ipc-mcp-stdio.js
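The path computation and server entry described above can be sketched as follows; the shape of the `ollama` entry and the `mcpServerPath` name come from this intent, while the sample path value and the `type: 'stdio'` field are assumptions for illustration:

```typescript
import * as path from 'path';

// Hypothetical location of the existing IPC MCP server bundle.
const mcpServerPath = '/app/dist/ipc-mcp-stdio.js';

// New: resolve ollama-mcp-stdio.js next to ipc-mcp-stdio.js,
// so both servers ship in the same output directory.
const ollamaServerPath = path.join(path.dirname(mcpServerPath), 'ollama-mcp-stdio.js');

const mcpServers = {
  // nanoclaw entry unchanged (elided here)
  ollama: {
    type: 'stdio' as const,
    command: 'node',
    args: [ollamaServerPath],
  },
};
```

Resolving the new server relative to the existing one keeps the config correct regardless of where the build output lands.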

Invariants (must-keep)

  • All existing allowedTools entries unchanged
  • nanoclaw MCP server config unchanged
  • All other query options (permissionMode, hooks, env, etc.) unchanged
  • MessageStream class unchanged
  • IPC polling logic unchanged
  • Session management unchanged