GenericMCP
🤖 A generic Model Context Protocol (MCP) server template - Build your own MCP integrations with local LLMs
🌟 Features
This generic MCP server provides a foundation for building custom AI integrations, using 100% local LLMs via Ollama for complete privacy:
- 🔍 Search Capabilities: Implement custom search across your data sources
- 🌐 Content Processing: Process and generate content with local LLMs
- 📝 Dynamic Generation: Create custom content using LLMs (Qwen, Llama, etc.)
- 🎯 Custom Workflows: Build your own AI-powered workflows
- 📊 Data Management: Track and manage your custom data
- 🚀 Extensible Framework: Easy to extend with your own tools
🏗️ Architecture
┌──────────────┐   MCP Protocol   ┌──────────────────┐
│  Claude/AI   │◄────────────────►│    GenericMCP    │
│  Assistant   │                  │      Server      │
└──────────────┘                  └────────┬─────────┘
                                           │
                ┌──────────────────────────┼──────────────────────────┐
                │                          │                          │
        ┌───────▼────────┐       ┌────────▼─────────┐       ┌────────▼─────────┐
        │   Your Data    │       │Ollama Local LLMs │       │   Custom Tools   │
        │    Sources     │       │Qwen/Llama/Mistral│       │  & Integrations  │
        └────────────────┘       └──────────────────┘       └──────────────────┘
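In practice, the middle box is a small stdio server that forwards tool calls to Ollama's HTTP API. Here is a minimal sketch of that wiring, assuming the official @modelcontextprotocol/sdk TypeScript package and Ollama's /api/generate endpoint; the tool name and defaults are illustrative, not the repository's actual code:

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const OLLAMA_HOST = process.env.OLLAMA_HOST ?? "http://localhost:11434";
const OLLAMA_MODEL = process.env.OLLAMA_MODEL ?? "qwen2.5:7b";

const server = new McpServer({ name: "generic-mcp", version: "0.1.0" });

// One Ollama-backed tool; a real server would also register search,
// workflows, data management, and so on.
server.tool("generate", { prompt: z.string() }, async ({ prompt }) => {
  const res = await fetch(`${OLLAMA_HOST}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: OLLAMA_MODEL, prompt, stream: false }),
  });
  const { response } = (await res.json()) as { response: string };
  return { content: [{ type: "text", text: response }] };
});

// Claude Desktop talks to the server over stdio (see the config below).
await server.connect(new StdioServerTransport());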
📦 Installation
Prerequisites
- Node.js 18+
- Ollama installed and running
- Your custom data sources (optional)
- Additional tools as needed for your use case
Quick Start
1. Clone this repository:
git clone https://github.com/yourusername/GenericMCP.git
cd GenericMCP
2. Install dependencies:
npm install
3. Install Ollama and pull a model:
# Install Ollama (macOS)
brew install ollama
# Start Ollama service
ollama serve
# Pull Qwen 2.5 (recommended) or any other model
ollama pull qwen2.5:7b
4. Test the server:
npm start
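If the server starts but tool calls fail, it usually helps to confirm Ollama itself is reachable first. A quick standalone check (a hypothetical helper script, not part of the repo) using Ollama's /api/tags model-listing endpoint:

// check-ollama.ts — run with: npx tsx check-ollama.ts
const host = process.env.OLLAMA_HOST ?? "http://localhost:11434";

const res = await fetch(`${host}/api/tags`); // lists locally installed models
if (!res.ok) throw new Error(`Ollama not reachable at ${host} (HTTP ${res.status})`);

const { models } = (await res.json()) as { models: { name: string }[] };
console.log("Installed models:", models.map((m) => m.name).join(", "));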
🔧 Configuration
Environment Variables
export DATA_PATH="/path/to/your/data"
export TOOLS_PATH="/path/to/your/tools"
export OLLAMA_HOST="http://localhost:11434"
export OLLAMA_MODEL="qwen2.5:7b"
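Inside the server, these variables would typically be read once at startup with sensible fallbacks. A sketch (the variable names match the exports above; the Config shape itself is illustrative):

interface Config {
  dataPath?: string;   // DATA_PATH — optional custom data sources
  toolsPath?: string;  // TOOLS_PATH — optional extra tool modules
  ollamaHost: string;  // defaults to the local Ollama daemon
  ollamaModel: string; // defaults to the recommended model
}

function loadConfig(): Config {
  return {
    dataPath: process.env.DATA_PATH,
    toolsPath: process.env.TOOLS_PATH,
    ollamaHost: process.env.OLLAMA_HOST ?? "http://localhost:11434",
    ollamaModel: process.env.OLLAMA_MODEL ?? "qwen2.5:7b",
  };
}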
Claude Desktop Integration
Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS path shown):
{
  "mcpServers": {
    "generic-mcp": {
      "command": "node",
      "args": ["/path/to/GenericMCP/index.js"],
      "env": {
        "DATA_PATH": "/path/to/your/data",
        "TOOLS_PATH": "/path/to/your/tools",
        "OLLAMA_MODEL": "qwen2.5:7b"
      }
    }
  }
}
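Restart Claude Desktop after editing this file; the configuration is only read at startup.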
🛠️ Available Tools
| Tool | Description | Example |
|---|---|---|
| search | Search your data sources | "Find relevant documents" |
| process_content | Process content with LLMs | "Analyze this text" |
| generate | Generate content using AI | "Create a summary" |
| custom_workflow | Execute custom workflows | "Run my automation" |
| manage_data | Manage your data | "Update records" |
| extend_tools | Add new capabilities | "Add custom function" |
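The extend_tools row is the point of the template: a new capability is just another tool registration. As an illustration, a naive substring search over DATA_PATH might look like the sketch below (a hypothetical implementation, not the repository's actual search tool; it assumes a flat directory of text files):

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { readdir, readFile } from "node:fs/promises";
import { join } from "node:path";
import { z } from "zod";

const server = new McpServer({ name: "generic-mcp", version: "0.1.0" });

// Naive substring search over the files in DATA_PATH.
server.tool("search", { query: z.string() }, async ({ query }) => {
  const dir = process.env.DATA_PATH ?? "./data";
  const matches: string[] = [];
  for (const name of await readdir(dir)) {
    const text = await readFile(join(dir, name), "utf8");
    if (text.toLowerCase().includes(query.toLowerCase())) matches.push(name);
  }
  return {
    content: [{ type: "text", text: matches.join("\n") || "No matches found" }],
  };
});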
🔐 Privacy & Security
- ✅ 100% Local LLMs - No cloud API calls
- ✅ Ollama Integration - Run on your hardware
- ✅ No Telemetry - Zero tracking
- ✅ Offline Capable - Works without internet
- ✅ Open Source - Fully auditable
🚀 Supported Models
| Model | Size | Best For |
|---|---|---|
| qwen2.5:7b | 4.7GB | General purpose |
| llama3.2:3b | 2.0GB | Lightweight |
| mistral:7b | 4.1GB | Code generation |
| deepseek-coder | 4.1GB | Programming |
📄 License
MIT License - see the LICENSE file
Built with ❤️ for privacy-conscious AI learners