# morriartie/mcp_home_info
This project packages a custom Python MCP (Model Context Protocol) server together with a local Ollama LLM and the Open WebUI front end, all wired together with Docker Compose; an MCPO proxy exposes the server's MCP tools to the web UI over HTTP.
Tools: 2 · Resources: 0 · Prompts: 0
## Open WebUI with MCP Server Setup Guide

### Quick Start
```bash
# 1. Build and start all services
docker-compose up --build

# 2. Pull model and start servers
make setup
```
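Once the build finishes, you can confirm that all three containers are up:

```bash
# All services should show as running/Up
docker-compose ps
```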
### Service Management

Steps 2-4 below are the manual equivalent of `make setup` above.

#### Starting the LLM Service
```bash
# 2. Access the Ollama container and run the model
docker exec -it llm-local /bin/bash

# Inside the container:
ollama run qwen3:0.6b
```
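To verify the model answers outside of the interactive session, you can hit Ollama's standard REST API from the host (port 11434 is published by the compose file):

```bash
# One-off, non-streaming completion against the standard Ollama API
curl http://localhost:11434/api/generate \
  -d '{"model": "qwen3:0.6b", "prompt": "Say hello", "stream": false}'
```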
#### Starting the MCP Server

```bash
# 3. Start the Python MCP server
docker exec -it mcp-server /bin/bash

# Inside the container:
python mcp_main.py
```
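This guide does not show `mcp_main.py` itself; as a rough sketch of the shape such a server takes, assuming the official `mcp` Python SDK's FastMCP and an invented tool name, it might look like this:

```python
# Hypothetical sketch only: the real mcp_home_info tools are not shown in this guide.
from mcp.server.fastmcp import FastMCP

# Bind to 0.0.0.0 so the mcpo proxy on the same network can reach it.
mcp = FastMCP("home-info", host="0.0.0.0", port=8000)

@mcp.tool()
def get_home_info(field: str) -> str:
    """Invented example tool; the actual server exposes two real tools."""
    return f"home info for {field!r} would be looked up here"

if __name__ == "__main__":
    # Serve over SSE at http://localhost:8000/sse, which is the URL the
    # mcpo command in the next step points at.
    mcp.run(transport="sse")
```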
#### Starting the MCPO Proxy

```bash
# 4. Start the MCPO proxy server
docker exec -it mcp-server /bin/bash

# Inside the container (the MCP server exposes an SSE endpoint at /sse):
mcpo --port 8001 --api-key "top-secret" --server-type "sse" -- http://localhost:8000/sse
```
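MCPO turns each MCP tool into a REST endpoint and publishes generated OpenAPI docs. With the proxy up, a call from the host looks roughly like this (the tool name and payload are made up; check http://localhost:8001/docs for the real ones):

```bash
# Call a proxied tool through MCPO, authenticating with the API key from above
curl -X POST http://localhost:8001/get_home_info \
  -H "Authorization: Bearer top-secret" \
  -H "Content-Type: application/json" \
  -d '{"field": "address"}'
```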
### Service Overview

| Service | Port | Purpose | Access |
|---|---|---|---|
| Open WebUI | 3000 | Web Interface | http://localhost:3000 |
| Ollama | 11434 | LLM Service | http://localhost:11434 |
| MCP Server | 8000 | Python MCP Tools (SSE) | Internal |
| MCPO Proxy | 8001 | MCP-to-OpenAPI Proxy (mcpo) | http://localhost:8001 |
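With the ports above, a quick health check of each externally reachable service might look like this (endpoints beyond the root URL are each project's standard ones, but verify against your versions):

```bash
# Open WebUI should answer on its root URL
curl -sf http://localhost:3000 >/dev/null && echo "Open WebUI is up"

# Ollama's standard API lists the models it has pulled
curl -s http://localhost:11434/api/tags

# MCPO is FastAPI-based, so its generated docs live at /docs
curl -sf http://localhost:8001/docs >/dev/null && echo "MCPO proxy is up"
```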
Docker Compose Structure
services:
mcp-server:
# Python MCP server with custom tools
ports: ["8000:8000", "8001:8001"]
llm-local:
# Ollama LLM service
ports: ["11434:11434"]
open-webui:
# Web interface
ports: ["3000:8080"]
### Notes
- The MCP server must be running before starting the MCPO proxy
- Models are automatically pulled on first run via the model-puller service
- Open WebUI is configured for single-user mode (no authentication)