feat: add /add-ollama skill for local model inference (#712)
* feat: add /add-ollama skill for local model inference

  Adds a skill that integrates Ollama as an MCP server, allowing the container
  agent to offload tasks to local models (summarization, translation, general
  queries) while keeping Claude as orchestrator.

  Skill contents:
  - ollama-mcp-stdio.ts: stdio MCP server with ollama_list_models and ollama_generate tools
  - ollama-watch.sh: macOS notification watcher for Ollama activity
  - Modifications to index.ts (MCP config) and container-runner.ts (log surfacing)

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: rename skill from /add-ollama to /add-ollama-tool

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: gavrielc <gabicohen22@yahoo.com>
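The commit message says ollama-mcp-stdio.ts is a stdio MCP server exposing ollama_list_models and ollama_generate. A minimal sketch of the tool-dispatch core such a server needs is below; this is not the actual file. The real implementation would frame these JSON-RPC messages over stdin/stdout (e.g. via the MCP SDK), and here the Ollama calls are injected so the dispatcher can be exercised offline.

```typescript
// Sketch of the JSON-RPC tool dispatch inside a stdio MCP server.
// The shapes follow MCP's tools/list and tools/call methods; the
// `ollama` argument stands in for the actual HTTP calls to Ollama.

type JsonRpcRequest = { jsonrpc: "2.0"; id: number; method: string; params?: any };
type JsonRpcResponse = {
  jsonrpc: "2.0";
  id: number;
  result?: any;
  error?: { code: number; message: string };
};

// The two tools the commit message says the server exposes.
const TOOLS = [
  { name: "ollama_list_models", description: "List locally installed Ollama models" },
  { name: "ollama_generate", description: "Run a prompt against a local Ollama model" },
];

async function handleRequest(
  req: JsonRpcRequest,
  ollama: {
    listModels: () => Promise<string[]>;
    generate: (model: string, prompt: string) => Promise<string>;
  },
): Promise<JsonRpcResponse> {
  const ok = (result: any): JsonRpcResponse => ({ jsonrpc: "2.0", id: req.id, result });
  const fail = (code: number, message: string): JsonRpcResponse => ({
    jsonrpc: "2.0",
    id: req.id,
    error: { code, message },
  });

  switch (req.method) {
    case "tools/list":
      // Advertise the two Ollama tools to the MCP client.
      return ok({ tools: TOOLS });
    case "tools/call": {
      const { name, arguments: args } = req.params ?? {};
      if (name === "ollama_list_models") {
        const models = await ollama.listModels();
        return ok({ content: [{ type: "text", text: models.join("\n") }] });
      }
      if (name === "ollama_generate") {
        const text = await ollama.generate(args.model, args.prompt);
        return ok({ content: [{ type: "text", text }] });
      }
      return fail(-32602, `unknown tool: ${name}`);
    }
    default:
      return fail(-32601, `method not found: ${req.method}`);
  }
}
```

With this split, the orchestrating agent only ever sees MCP tool results, and the choice of local model stays a per-call argument.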
.claude/skills/add-ollama-tool/manifest.yaml (new file, 17 additions)
@@ -0,0 +1,17 @@
+skill: ollama
+version: 1.0.0
+description: "Local Ollama model inference via MCP server"
+core_version: 0.1.0
+adds:
+  - container/agent-runner/src/ollama-mcp-stdio.ts
+  - scripts/ollama-watch.sh
+modifies:
+  - container/agent-runner/src/index.ts
+  - src/container-runner.ts
+structured:
+  npm_dependencies: {}
+  env_additions:
+    - OLLAMA_HOST
+conflicts: []
+depends: []
+test: "npm run build"
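The manifest's env_additions entry declares OLLAMA_HOST, which suggests the MCP server reaches Ollama over its HTTP API. A hedged sketch follows: the helper names are hypothetical, while /api/tags and /api/generate are Ollama's standard endpoints; fetch is injectable so the helpers can be exercised without a live daemon.

```typescript
// Hypothetical helpers showing how the MCP server could use the
// OLLAMA_HOST variable from env_additions to reach Ollama's HTTP API.

type FetchLike = (url: string, init?: any) => Promise<{ json(): Promise<any> }>;

// Fall back to Ollama's default bind address when the env var is unset.
const baseUrl = () => `http://${process.env.OLLAMA_HOST ?? "localhost:11434"}`;

// GET /api/tags returns the locally installed models.
async function listModels(f: FetchLike = fetch): Promise<string[]> {
  const body = await (await f(`${baseUrl()}/api/tags`)).json();
  return body.models.map((m: { name: string }) => m.name);
}

// POST /api/generate with stream:false returns a single JSON object
// whose `response` field holds the full completion.
async function generateOnce(
  model: string,
  prompt: string,
  f: FetchLike = fetch,
): Promise<string> {
  const res = await f(`${baseUrl()}/api/generate`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  return (await res.json()).response;
}
```

Since the manifest adds no npm dependencies, built-in fetch (Node 18+) is consistent with the skill's footprint.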