NewAITees_ollama-MCP-server
ollama-MCP-server
A Model Context Protocol (MCP) server that communicates with Ollama
Overview
This MCP server enables seamless integration between a local Ollama LLM instance and MCP-compatible applications, providing advanced task decomposition, result evaluation, and workflow management.
Key features:
- Task decomposition for complex problems
- Result evaluation and validation
- Ollama model management and execution
- Standardized communication via the MCP protocol
- Advanced error handling with detailed error messages
- Performance optimizations (connection pooling, LRU cache)
Components
Resources
The server implements the following resources:
- task:// - URI scheme for accessing individual tasks
- result:// - URI scheme for accessing evaluation results
- model:// - URI scheme for accessing available Ollama models
Each resource is configured with appropriate metadata and a MIME type for optimal interaction with the LLM.
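As an illustration, the identifier in these URIs can be extracted with standard URL parsing. A minimal sketch; the helper name and any behavior beyond the three schemes are hypothetical, not the server's actual implementation:

```python
# Minimal sketch: extracting the identifier from the resource URIs above.
# The helper name is illustrative, not part of the server's API.
from urllib.parse import urlparse

SCHEMES = {"task", "result", "model"}

def parse_resource_uri(uri: str) -> tuple[str, str]:
    """Split a resource URI such as task://123 into (scheme, identifier)."""
    parsed = urlparse(uri)
    if parsed.scheme not in SCHEMES:
        raise ValueError(f"unsupported resource scheme: {parsed.scheme!r}")
    # For task://123, urlparse places the identifier in the netloc component
    return parsed.scheme, parsed.netloc or parsed.path.lstrip("/")
```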
Relationship between prompts and tools
In an MCP server, prompts and tools are closely related but play different roles:
- Prompts: act like a schema, giving the LLM a specific structure and way of thinking
- Tools: act like a handler, actually executing actions
Each tool requires a corresponding schema (prompt); this lets the LLM's reasoning connect effectively with real system functionality.
Prompts
The server provides several specialized prompts:
- decompose-task - breaks a complex task into manageable subtasks
  - Takes a task description and an optional granularity parameter
  - Returns a structured breakdown including dependencies and estimated complexity
- evaluate-result - analyzes a task result against specified criteria
  - Takes the result content and evaluation parameters
  - Returns a detailed evaluation with scores and improvement suggestions
Tools
The server implements several powerful tools:
- add-task
  - Required parameters: name (string), description (string)
  - Optional parameters: priority (number), deadline (string), tags (array)
  - Creates a new task in the system and returns its identifier
  - Corresponding schema: data validation schema for task creation
- decompose-task
  - Required parameters: task_id (string), granularity (string: "high"|"medium"|"low")
  - Optional parameters: max_subtasks (number)
  - Uses Ollama to break a complex task into manageable subtasks
  - Corresponding schema: the decompose-task prompt above
- evaluate-result
  - Required parameters: result_id (string), criteria (object)
  - Optional parameters: detailed (boolean)
  - Evaluates a result against the specified criteria and provides feedback
  - Corresponding schema: the evaluate-result prompt above
- run-model
  - Required parameters: model (string), prompt (string)
  - Optional parameters: temperature (number), max_tokens (number)
  - Runs an Ollama model with the specified parameters
  - Corresponding schema: validation schema for Ollama model execution parameters
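To make the add-task parameter list concrete, here is a minimal sketch of the kind of validation it implies. The field names come from the list above; the validator itself is hypothetical and not the server's actual schema code:

```python
# Illustrative sketch of the validation implied by the add-task parameter list.
# Hypothetical helper, not the server's actual schema implementation.
REQUIRED = {"name": str, "description": str}
OPTIONAL = {"priority": (int, float), "deadline": str, "tags": list}

def validate_add_task(args: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the call is valid."""
    errors = []
    for field, expected in REQUIRED.items():
        if field not in args:
            errors.append(f"missing required parameter: {field}")
        elif not isinstance(args[field], expected):
            errors.append(f"parameter {field} has the wrong type")
    for field, expected in OPTIONAL.items():
        if field in args and not isinstance(args[field], expected):
            errors.append(f"parameter {field} has the wrong type")
    return errors
```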
New features and improvements
Enhanced error handling
The server provides more detailed, structured error messages, so client applications can handle errors more effectively. Example error response:
{
  "error": {
    "message": "Task not found: task-123",
    "status_code": 404,
    "details": {
      "provided_id": "task-123"
    }
  }
}
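A client can turn this structured format into a readable log line. A minimal sketch, assuming only the fields shown in the example above; the helper is not part of the server's API:

```python
# Sketch: formatting the structured error response above into a log line.
# Assumes only the fields shown in the example; hypothetical helper.
def describe_error(response: dict) -> str:
    """Turn a structured error response into a readable message."""
    err = response.get("error")
    if err is None:
        return "ok"
    details = ", ".join(f"{k}={v}" for k, v in err.get("details", {}).items())
    return f"[{err['status_code']}] {err['message']} ({details})"

resp = {
    "error": {
        "message": "Task not found: task-123",
        "status_code": 404,
        "details": {"provided_id": "task-123"},
    }
}
# describe_error(resp) -> "[404] Task not found: task-123 (provided_id=task-123)"
```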
Performance optimization
- Connection pooling: a shared HTTP connection pool improves request performance and reduces resource usage.
- LRU cache: caching responses to identical or similar requests shortens response times and reduces load on the Ollama server.
These settings can be adjusted in config.py:
# Performance-related settings
cache_size: int = 100  # Maximum number of entries kept in the cache
max_connections: int = 10  # Maximum number of concurrent connections
max_connections_per_host: int = 10  # Maximum connections per host
request_timeout: int = 60  # Request timeout in seconds
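The LRU behavior that cache_size controls can be sketched with an OrderedDict. This is illustrative only; the server's actual cache implementation may differ:

```python
# Minimal LRU cache sketch; cache_size mirrors the config value above.
# Illustrative only, not the server's actual implementation.
from collections import OrderedDict

class LRUCache:
    def __init__(self, cache_size: int = 100):
        self.cache_size = cache_size
        self._data: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        self._data[key] = value
        self._data.move_to_end(key)
        if len(self._data) > self.cache_size:
            self._data.popitem(last=False)  # evict the least recently used entry
```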
Model specification
Overview
Ollama-MCP-Server offers flexible ways to specify which Ollama model to use.
Model selection priority
Models are selected in the following order of precedence:
1. The model parameter passed in the tool call
2. The env section of the MCP configuration file
3. The OLLAMA_DEFAULT_MODEL environment variable
4. The default value (llama3)
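The priority order above can be sketched as a small resolver. A hypothetical helper, not the server's actual code:

```python
# Sketch of the documented model resolution order (hypothetical helper).
import os

def resolve_model(tool_param=None, env_config=None):
    """Pick a model following the documented priority order."""
    if tool_param:                                  # 1. tool-call parameter
        return tool_param
    if env_config and env_config.get("model"):      # 2. MCP config env section
        return env_config["model"]
    env_model = os.environ.get("OLLAMA_DEFAULT_MODEL")  # 3. environment variable
    if env_model:
        return env_model
    return "llama3"                                 # 4. default value
```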
Specifying a model in the MCP configuration file
When using a client such as Claude Desktop, you can specify the model in the MCP configuration file:
{
  "mcpServers": {
    "ollama-MCP-server": {
      "command": "python",
      "args": [
        "-m",
        "ollama_mcp_server"
      ],
      "env": {
        "model": "llama3:latest"
      }
    }
  }
}
Checking available models
At startup, the server checks whether the configured model exists and logs a warning if it cannot be found. The run-model tool also returns the list of available models, so users can pick a valid one.
Improved error handling
When the specified model does not exist or a communication error occurs, a detailed error message is returned. The message includes the list of available models, helping users resolve the problem quickly.
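The startup availability check described above boils down to logic like this. The helper is hypothetical; the real server's logging and model discovery may differ:

```python
# Sketch of the startup availability check (hypothetical helper).
import logging

def check_model_available(configured: str, available: list[str]) -> bool:
    """Warn and return False when the configured model is not installed."""
    if configured in available:
        return True
    logging.warning(
        "Model %r not found; available models: %s", configured, ", ".join(available)
    )
    return False
```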
Testing
The project includes a comprehensive test suite:
- Unit tests: exercise individual components
- Integration tests: exercise end-to-end workflows
To run the tests:
# Run all tests
python -m unittest discover
# Run a specific test
python -m unittest tests.test_integration
Configuration
Environment variables
OLLAMA_HOST=http://localhost:11434
DEFAULT_MODEL=llama3
LOG_LEVEL=info
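These variables might be resolved with defaults like so. A sketch: the defaults come from the values shown above, and normalizing LOG_LEVEL to upper case is an assumption, not documented behavior:

```python
# Sketch: resolving the environment variables above with assumed defaults.
def load_config(env: dict) -> dict:
    return {
        "ollama_host": env.get("OLLAMA_HOST", "http://localhost:11434"),
        "default_model": env.get("DEFAULT_MODEL", "llama3"),
        "log_level": env.get("LOG_LEVEL", "info").upper(),  # assumption: normalize case
    }

# Typically called as load_config(dict(os.environ))
```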
Setting up Ollama
Make sure Ollama is installed and running with the appropriate models:
# Install Ollama (if not already installed)
curl -fsSL https://ollama.com/install.sh | sh
# Download the recommended models
ollama pull llama3
ollama pull mistral
ollama pull qwen2
Quick start
Installation
pip install ollama-mcp-server
Claude Desktop configuration
MacOS
Path: ~/Library/Application\ Support/Claude/claude_desktop_config.json
Windows
Path: %APPDATA%/Claude/claude_desktop_config.json
Configuration for a development/unpublished server
"mcpServers": {
  "ollama-MCP-server": {
    "command": "uv",
    "args": [
      "--directory",
      "/path/to/ollama-MCP-server",
      "run",
      "ollama-MCP-server"
    ],
    "env": {
      "model": "deepseek:r14B"
    }
  }
}
Configuration for a published server
"mcpServers": {
"ollama-MCP-server": {
"command": "uvx",
"args": [
"ollama-MCP-server"
]
}
}
Usage examples
Task decomposition
To break a complex task into manageable subtasks:
result = await mcp.use_mcp_tool({
"server_name": "ollama-MCP-server",
"tool_name": "decompose-task",
"arguments": {
"task_id": "task://123",
"granularity": "medium",
"max_subtasks": 5
}
})
Result evaluation
To evaluate a result against specific criteria:
evaluation = await mcp.use_mcp_tool({
"server_name": "ollama-MCP-server",
"tool_name": "evaluate-result",
"arguments": {
"result_id": "result://456",
"criteria": {
"accuracy": 0.4,
"completeness": 0.3,
"clarity": 0.3
},
"detailed": true
}
})
Running an Ollama model
To run a query directly against an Ollama model:
response = await mcp.use_mcp_tool({
"server_name": "ollama-MCP-server",
"tool_name": "run-model",
"arguments": {
"model": "llama3",
"prompt": "Explain quantum computing in simple terms",
"temperature": 0.7
}
})
Development
Project setup
- Clone the repository:
git clone https://github.com/yourusername/ollama-MCP-server.git
cd ollama-MCP-server
- Create and activate a virtual environment:
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
- Install the development dependencies:
uv sync --dev --all-extras
Local development
The project includes convenient development scripts:
Running the server
./run_server.sh
Options:
- --debug: run in debug mode (log level: DEBUG)
- --log=LEVEL: set the log level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
Running the tests
./run_tests.sh
Options:
- --unit: run only the unit tests
- --integration: run only the integration tests
- --all: run all tests (default)
- --verbose: verbose test output
Building and publishing
To prepare the package for distribution:
- Sync dependencies and update the lock file:
uv sync
- Build the package distributions:
uv build
This creates source and wheel distributions in the dist/ directory.
- Publish to PyPI:
uv publish
Note: PyPI credentials must be set via environment variables or command-line flags:
- Token: --token or UV_PUBLISH_TOKEN
- Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD
Debugging
Because MCP servers communicate over stdio, debugging can be difficult. For the best debugging experience, we strongly recommend using the MCP Inspector.
To launch the MCP Inspector with npm, run the following command:
npx @modelcontextprotocol/inspector uv --directory /path/to/ollama-MCP-server run ollama-mcp-server
On startup, the Inspector prints a URL you can open in your browser to begin debugging.
Architecture
Contributing
Contributions are welcome! Feel free to submit a pull request.
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgements
- The Model Context Protocol team for the excellent protocol design
- The Ollama project for making local LLM execution accessible
- All contributors to this project