
MCP Server

MCP (Model Context Protocol) Server is a modular AI assistant engine that runs on your machine to extend what a language model can do beyond just generating text.

Features

  • 🧠 AI Router: Switch between online APIs (OpenRouter, OpenAI) and local models (via Ollama)
  • 🛠️ Tool Plugins: Poetry generation, code assistance, Mac cleanup, daily agent, and more
  • 📚 Knowledge & Memory: Vector database storage for personal data and conversations
  • 🎙️ Multimodal: Speech-to-text, text-to-speech, and image processing capabilities
  • 📶 API Clients: OpenRouter.ai, HuggingFace, Google Search, WolframAlpha

Use Cases

  • āœļø Poetry Mode: Generate, complete, or enhance verses based on your own poetry database
  • šŸ§‘ā€šŸ’» Coding Mode: Code generation, debugging, snippet search, and more
  • šŸŽÆ Life Assistant Mode: Daily briefings, schedule tracking, and task management
  • šŸ“š Education & Research: Topic explanations, research article summarization
  • 🧠 Custom Persona: Create a chatbot that reflects your unique style and preferences

Installation

  1. Clone this repository:
     git clone https://github.com/darshan-regmi/mcp-server.git
     cd mcp-server
  2. Install dependencies:
     pip install -r requirements.txt
  3. Create a .env file based on the example:
     cp .env.example .env
     # Edit .env with your API keys
  4. Start the server:
     python -m app.main
  5. Open your browser and navigate to:
     http://localhost:8000

API Keys

MCP Server can use various AI models and services. You'll need to obtain API keys for the services you want to use (for example OpenRouter, OpenAI, HuggingFace, Google Search, or WolframAlpha) and add them to your .env file.

For offline usage, you can use Ollama to run models locally.

Project Structure

mcp-server/
├── app/                  # Main application code
│   ├── routers/          # API routers
│   ├── static/           # Web interface
│   └── main.py           # Application entry point
├── tools/                # Tool plugins
├── memory/               # Vector database storage
├── config/               # Configuration files
├── requirements.txt      # Python dependencies
└── .env                  # Environment variables (not in repo)

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Additional Setup

  1. Install Ollama for local model support (optional). Visit https://ollama.ai/ for installation instructions.

Usage

Starting the Server

python -m app.main

You can configure the server using environment variables in your .env file:

  • MCP_HOST: Host address to bind to (default: 0.0.0.0)
  • MCP_PORT: Port to listen on (default: 8000)
  • MCP_MEMORY_DIR: Directory for storing memory data (default: memory)

The server will be available at http://localhost:8000 by default.
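A minimal .env sketch using the variables above (the values shown are the documented defaults; any API keys come from .env.example):

```
MCP_HOST=0.0.0.0
MCP_PORT=8000
MCP_MEMORY_DIR=memory
# Add the API keys for the services you use; see .env.example for the key names
```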

API Endpoints

Chat
curl -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Write a poem about dreams", "mode": "poetry"}'
Tools
curl -X POST http://localhost:8000/tools/execute \
  -H "Content-Type: application/json" \
  -d '{"tool_name": "poetry_gen", "parameters": {"topic": "dreams", "style": "sonnet"}}'
Memory
curl -X POST http://localhost:8000/memory/search \
  -H "Content-Type: application/json" \
  -d '{"query": "What did I write about dreams?", "collection": "notes"}'
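The curl calls above translate directly into a small Python client. This is a minimal sketch using only the standard library; the /chat endpoint and its "message"/"mode" fields are taken from the curl example, and the response is assumed to follow the "API Response Format" section below. The helper names are illustrative, not part of the project.

```python
import json
import urllib.request


def build_chat_request(message, mode="poetry", base_url="http://localhost:8000"):
    # Build the POST request matching the /chat curl example above.
    payload = json.dumps({"message": message, "mode": mode}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def chat(message, mode="poetry"):
    # Send the request and decode the JSON response (requires a running server).
    with urllib.request.urlopen(build_chat_request(message, mode)) as resp:
        return json.load(resp)
```

For example, `chat("Write a poem about dreams")` posts to http://localhost:8000/chat and returns the parsed response as a dict.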

Architecture

📦 MCP Server (FastAPI + Python)
├── 🧠 AI Router (API vs Local LLM via Ollama)
├── 🛠️ Tool Plugins
│   ├── poetry_gen.py
│   ├── code_assist.py
│   ├── mac_cleanup.py
│   ├── daily_agent.py
│   └── memory_manager.py
├── 📚 Knowledge & Memory
│   ├── Local Notes (MD, TXT, Notion export)
│   ├── Vector Store (ChromaDB / Qdrant)
│   └── Browser Tools / Docs Reader
├── 🎙️ Multimodal
│   ├── Speech-to-text (Whisper)
│   ├── TTS (Piper / ElevenLabs)
│   └── Image Tools
└── 📶 API Clients
    ├── OpenRouter.ai
    ├── HuggingFace
    ├── Google Search
    └── WolframAlpha

API Response Format

{
  "message": "Here's a sonnet about dreams...",
  "tool_calls": [
    {
      "name": "poetry_gen",
      "parameters": {"topic": "dreams", "style": "sonnet"}
    }
  ],
  "sources": [
    {"title": "My Dream Journal", "content": "Excerpt from your notes..."}
  ],
  "mode": "poetry",
  "model_used": "claude-3-opus-20240229",
  "processing_time": 1.25
}
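A client can unpack this structure with plain json handling. The sketch below parses the sample response above and pulls out the fields a caller typically inspects:

```python
import json

# Parse a response in the documented format (values from the sample above).
response = json.loads("""
{
  "message": "Here's a sonnet about dreams...",
  "tool_calls": [
    {"name": "poetry_gen", "parameters": {"topic": "dreams", "style": "sonnet"}}
  ],
  "sources": [
    {"title": "My Dream Journal", "content": "Excerpt from your notes..."}
  ],
  "mode": "poetry",
  "model_used": "claude-3-opus-20240229",
  "processing_time": 1.25
}
""")

# Which tools ran, and which of your documents were cited as sources.
tools_used = [call["name"] for call in response["tool_calls"]]
source_titles = [src["title"] for src in response["sources"]]
```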

Privacy and Security

  • 🔒 Your data stays local: personal information never leaves your machine unless you opt into an external API
  • 🔑 API keys live in your local .env file, which is not committed to the repository
  • 🛡️ Offline-capable with local models via Ollama
  • 🌐 Consider using SSL/TLS for secure communication in production
  • 🔄 Restrict access to the MCP server port using a firewall

Extending MCP Server

You can extend the MCP server by adding your own tool plugins:

  1. Create a new Python file in the tools directory
  2. Define a class that inherits from the Tool base class
  3. Implement the execute method
  4. Your tool will be automatically loaded when the server starts
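The steps above can be sketched as a minimal plugin. Note the `Tool` base class shown here is a stand-in for illustration (the real base class lives somewhere in this repo, and its exact `execute` signature may differ); `WordCountTool` is a hypothetical example, not a bundled tool.

```python
class Tool:
    """Stand-in for the server's Tool base class (assumed interface)."""
    name = "base"

    def execute(self, **parameters):
        raise NotImplementedError


class WordCountTool(Tool):
    """Hypothetical plugin: counts the words in a piece of text."""
    name = "word_count"

    def execute(self, text: str = "", **_):
        # Return a small result dict, mirroring the tool_calls shape
        # used elsewhere in the API.
        return {"tool": self.name, "words": len(text.split())}
```

Dropping a file like this into the tools directory would make the plugin available once the server restarts and auto-loads it.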

License

This project is open source and available under the MIT License.