* feat: add /add-ollama skill for local model inference

  Adds a skill that integrates Ollama as an MCP server, allowing the container agent to offload tasks to local models (summarization, translation, general queries) while keeping Claude as orchestrator.

  Skill contents:
  - ollama-mcp-stdio.ts: stdio MCP server with ollama_list_models and ollama_generate tools
  - ollama-watch.sh: macOS notification watcher for Ollama activity
  - Modifications to index.ts (MCP config) and container-runner.ts (log surfacing)

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: rename skill from /add-ollama to /add-ollama-tool

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: gavrielc <gabicohen22@yahoo.com>
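The ollama_generate tool described above ultimately talks to Ollama's local HTTP API. A minimal sketch of how such a request could be built, assuming Ollama's default endpoint (http://localhost:11434) and its documented /api/generate route; the helper name and parameter interface here are hypothetical, and actually sending the request requires a running Ollama daemon:

```typescript
// Hypothetical parameters for a single non-streaming generation call.
interface GenerateParams {
  model: string;
  prompt: string;
}

// Builds (but does not send) a request to Ollama's /api/generate endpoint.
// Field names (model, prompt, stream) follow Ollama's documented API.
function buildGenerateRequest(base: string, p: GenerateParams) {
  return {
    url: `${base}/api/generate`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: p.model, prompt: p.prompt, stream: false }),
    },
  };
}

// Usage: pass the result to fetch(r.url, r.init) once a daemon is listening.
const r = buildGenerateRequest("http://localhost:11434", {
  model: "llama3",
  prompt: "Summarize this log.",
});
```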
Intent: container/agent-runner/src/index.ts modifications
What changed
Added Ollama MCP server configuration so the container agent can call local Ollama models as tools.
Key sections
allowedTools array (inside runQuery → options)
- Added 'mcp__ollama__*' to the allowedTools array (after 'mcp__nanoclaw__*')
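A sketch of the allowedTools change, assuming a conventional wildcard-suffix matching rule; the surrounding entries and the isAllowed helper are illustrative, not taken from index.ts — only the 'mcp__ollama__*' entry is the actual addition:

```typescript
// Hypothetical shape of the allowedTools array inside runQuery → options.
const allowedTools: string[] = [
  "Bash",
  "Read",
  "mcp__nanoclaw__*", // existing nanoclaw MCP tools (unchanged)
  "mcp__ollama__*",   // new: allow every tool the ollama MCP server exposes
];

// Illustrative matcher: a trailing '*' matches any tool name with that prefix,
// so 'mcp__ollama__*' covers e.g. mcp__ollama__ollama_generate.
const isAllowed = (tool: string): boolean =>
  allowedTools.some((p) =>
    p.endsWith("*") ? tool.startsWith(p.slice(0, -1)) : tool === p
  );
```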
mcpServers object (inside runQuery → options)
- Added ollama entry as a stdio MCP server
  - command: 'node'
  - args: resolves to ollama-mcp-stdio.js in the same directory as ipc-mcp-stdio.js
  - Uses path.join(path.dirname(mcpServerPath), 'ollama-mcp-stdio.js') to compute the path
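The path resolution and the resulting server entry can be sketched as follows; the mcpServerPath value and the exact entry shape ("type", "command", "args" keys) are assumptions for illustration, not copied from index.ts:

```typescript
import path from "node:path";

// Hypothetical value; in index.ts, mcpServerPath points at the compiled
// ipc-mcp-stdio.js for the existing nanoclaw server.
const mcpServerPath = "/app/dist/ipc-mcp-stdio.js";

// The ollama server script is resolved relative to the IPC server, so both
// compiled files only need to ship in the same directory.
const ollamaServerPath = path.join(
  path.dirname(mcpServerPath),
  "ollama-mcp-stdio.js"
);

// Sketch of the added mcpServers entry inside runQuery → options.
const mcpServers = {
  ollama: {
    type: "stdio",
    command: "node",
    args: [ollamaServerPath],
  },
};
```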
Invariants (must-keep)
- All existing allowedTools entries unchanged
- nanoclaw MCP server config unchanged
- All other query options (permissionMode, hooks, env, etc.) unchanged
- MessageStream class unchanged
- IPC polling logic unchanged
- Session management unchanged