MCP Server
MCP (Model Context Protocol) Server is a modular AI assistant engine that runs on your machine to extend what a language model can do beyond just generating text.
Features
- 🧠 AI Router: Switch between online APIs (OpenRouter, OpenAI) and local models (via Ollama)
- 🛠️ Tool Plugins: Poetry generation, code assistance, Mac cleanup, daily agent, and more
- 📚 Knowledge & Memory: Vector database storage for personal data and conversations
- 🎙️ Multimodal: Speech-to-text, text-to-speech, and image processing capabilities
- 📶 API Clients: OpenRouter.ai, HuggingFace, Google Search, WolframAlpha
Use Cases
- ✍️ Poetry Mode: Generate, complete, or enhance verses based on your own poetry database
- 🧑‍💻 Coding Mode: Code generation, debugging, snippet search, and more
- 🎯 Life Assistant Mode: Daily briefings, schedule tracking, and task management
- 📖 Education & Research: Topic explanations, research article summarization
- 🧠 Custom Persona: Create a chatbot that reflects your unique style and preferences
Installation
- Clone this repository:
git clone https://github.com/darshan-regmi/mcp-server.git
cd mcp-server
- Install dependencies:
pip install -r requirements.txt
- Create a .env file based on the example:
cp .env.example .env
# Edit .env with your API keys
- Start the server:
python -m app.main
- Open your browser and navigate to:
http://localhost:8000
API Keys
MCP Server can use various AI models and services. You'll need to obtain API keys for the services you want to use:
- OpenAI API: https://platform.openai.com/
- OpenRouter: https://openrouter.ai/
- HuggingFace: https://huggingface.co/
- Google Custom Search: https://developers.google.com/custom-search/
- Wolfram Alpha: https://developer.wolframalpha.com/
- ElevenLabs: https://elevenlabs.io/
For offline usage, you can use Ollama to run models locally.
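As a sketch, a .env file for the services above might look like the following; the variable names here are illustrative guesses, and the authoritative list is in the repository's .env.example:

```shell
# Illustrative .env sketch — key names are assumptions; see .env.example
OPENAI_API_KEY=your-openai-key
OPENROUTER_API_KEY=your-openrouter-key
HUGGINGFACE_API_KEY=your-huggingface-key
ELEVENLABS_API_KEY=your-elevenlabs-key
```

Only the keys for services you actually enable need to be set.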
Project Structure
mcp-server/
├── app/                 # Main application code
│   ├── routers/         # API routers
│   ├── static/          # Web interface
│   └── main.py          # Application entry point
├── tools/               # Tool plugins
├── memory/              # Vector database storage
├── config/              # Configuration files
├── requirements.txt     # Python dependencies
└── .env                 # Environment variables (not in repo)
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
Additional Setup
- Install Ollama for local model support (optional):
# Visit https://ollama.ai/ for installation instructions
Usage
Starting the Server
python -m app.main
You can configure the server using environment variables in your .env file:
- MCP_HOST: Host address to bind to (default: 0.0.0.0)
- MCP_PORT: Port to listen on (default: 8000)
- MCP_MEMORY_DIR: Directory for storing memory data (default: memory)
The server will be available at http://localhost:8000 by default.
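As a sketch of how these settings might be consumed (the actual code in app.main may structure this differently), reading them with os.getenv and the documented defaults looks like:

```python
import os

# Hypothetical settings loader — the variable names and defaults come
# from the README; the project's real code may differ.
HOST = os.getenv("MCP_HOST", "0.0.0.0")
PORT = int(os.getenv("MCP_PORT", "8000"))
MEMORY_DIR = os.getenv("MCP_MEMORY_DIR", "memory")

print(f"Serving on {HOST}:{PORT}, memory in {MEMORY_DIR}/")
```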
API Endpoints
Chat
curl -X POST http://localhost:8000/chat \
-H "Content-Type: application/json" \
-d '{"message": "Write a poem about dreams", "mode": "poetry"}'
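The same request can be issued from Python using only the standard library; the endpoint and payload shape below follow the curl example above:

```python
import json
import urllib.request


def build_chat_payload(message: str, mode: str) -> dict:
    """Build the JSON body expected by the /chat endpoint."""
    return {"message": message, "mode": mode}


def chat(message: str, mode: str, base_url: str = "http://localhost:8000") -> dict:
    """POST a chat message to a running MCP server and return the parsed reply."""
    req = urllib.request.Request(
        f"{base_url}/chat",
        data=json.dumps(build_chat_payload(message, mode)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Calling `chat("Write a poem about dreams", "poetry")` requires the server to be running locally.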
Tools
curl -X POST http://localhost:8000/tools/execute \
-H "Content-Type: application/json" \
-d '{"tool_name": "poetry_gen", "parameters": {"topic": "dreams", "style": "sonnet"}}'
Memory
curl -X POST http://localhost:8000/memory/search \
-H "Content-Type: application/json" \
-d '{"query": "What did I write about dreams?", "collection": "notes"}'
Architecture
📦 MCP Server (FastAPI + Python)
├── 🧠 AI Router (API vs Local LLM via Ollama)
├── 🛠️ Tool Plugins
│   ├── poetry_gen.py
│   ├── code_assist.py
│   ├── mac_cleanup.py
│   ├── daily_agent.py
│   └── memory_manager.py
├── 📚 Knowledge & Memory
│   ├── Local Notes (MD, TXT, Notion export)
│   ├── Vector Store (ChromaDB / Qdrant)
│   └── Browser Tools / Docs Reader
├── 🎙️ Multimodal
│   ├── Speech-to-text (Whisper)
│   ├── TTS (Piper / ElevenLabs)
│   └── Image Tools
└── 📶 API Clients
    ├── OpenRouter.ai
    ├── HuggingFace
    ├── Google Search
    └── WolframAlpha
API Response Format
{
"message": "Here's a sonnet about dreams...",
"tool_calls": [
{
"name": "poetry_gen",
"parameters": {"topic": "dreams", "style": "sonnet"}
}
],
"sources": [
{"title": "My Dream Journal", "content": "Excerpt from your notes..."}
],
"mode": "poetry",
"model_used": "claude-3-opus-20240229",
"processing_time": 1.25
}
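A client can parse that response with the standard json module, for example to extract the reply text and any tool calls:

```python
import json

# Parse a /chat response shaped like the example above and pull out
# the fields a client typically cares about.
raw = '''{
  "message": "Here's a sonnet about dreams...",
  "tool_calls": [
    {"name": "poetry_gen", "parameters": {"topic": "dreams", "style": "sonnet"}}
  ],
  "sources": [
    {"title": "My Dream Journal", "content": "Excerpt from your notes..."}
  ],
  "mode": "poetry",
  "model_used": "claude-3-opus-20240229",
  "processing_time": 1.25
}'''

reply = json.loads(raw)
print(reply["message"])
for call in reply.get("tool_calls", []):
    print(call["name"], call["parameters"])
```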
Privacy and Security
- 🔒 Local-first: your notes, memory, and conversations are stored on your machine
- 🔑 API keys live in your .env file, which is kept out of the repository
- 🛡️ Offline-capable with local models via Ollama
- 🌐 Consider using SSL/TLS for secure communication in production
- 🔥 Restrict access to the MCP server port using a firewall
Extending MCP Server
You can extend the MCP server by adding your own tool plugins:
- Create a new Python file in the tools directory
- Define a class that inherits from the Tool base class
- Implement the execute method
- Your tool will be automatically loaded when the server starts
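A minimal plugin might look like the sketch below. The exact Tool base-class interface is not shown in this README, so a stand-in base class is defined here to keep the example self-contained; in the real project you would import Tool from the server's tool framework instead:

```python
class Tool:
    """Stand-in for the project's Tool base class (interface assumed)."""

    name: str = "tool"

    def execute(self, **parameters):
        raise NotImplementedError


class WordCountTool(Tool):
    """Example plugin: counts the words in a piece of text."""

    name = "word_count"

    def execute(self, **parameters):
        text = parameters.get("text", "")
        return {"word_count": len(text.split())}
```

Dropped into the tools directory, a class like this would be picked up automatically at server start, per the steps above.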
License
This project is open source and available under the MIT License.