* feat: add /add-ollama skill for local model inference
Adds a skill that integrates Ollama as an MCP server, allowing the
container agent to offload tasks to local models (summarization,
translation, general queries) while keeping Claude as the orchestrator.
Skill contents:
- ollama-mcp-stdio.ts: stdio MCP server with ollama_list_models and
ollama_generate tools
- ollama-watch.sh: macOS notification watcher for Ollama activity
- Modifications to index.ts (MCP config) and container-runner.ts
(log surfacing)
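Roughly, the two tools map onto Ollama's default REST API (`GET /api/tags` for listing models, `POST /api/generate` for completions). A minimal sketch of the handler logic, assuming the default endpoint on localhost:11434; function names and shapes here are illustrative, not the actual ollama-mcp-stdio.ts:

```typescript
// Hypothetical sketch of the tool handlers behind ollama_list_models
// and ollama_generate, against Ollama's REST API.

const OLLAMA_URL = "http://localhost:11434";

// Build the request body for /api/generate (pure, so it is easy to test).
// stream: false makes Ollama return one JSON object instead of NDJSON chunks.
function buildGenerateBody(model: string, prompt: string) {
  return { model, prompt, stream: false };
}

// ollama_list_models: GET /api/tags returns { models: [{ name, ... }] }.
async function listModels(): Promise<string[]> {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name);
}

// ollama_generate: POST /api/generate; the non-streaming response
// carries the full completion in the "response" field.
async function generate(model: string, prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateBody(model, prompt)),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

In the actual skill these handlers would be registered as MCP tools over a stdio transport, so Claude invokes them like any other tool call.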
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
* chore: rename skill from /add-ollama to /add-ollama-tool
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
---------
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: gavrielc <gabicohen22@yahoo.com>