learnforge-mcp

MarcoPWx/learnforge-mcp



GenericMCP is a template for building custom Model Context Protocol (MCP) servers using local LLMs for privacy-focused AI integrations.

Tools: 5 · Resources: 0 · Prompts: 0

GenericMCP

License: MIT · MCP Protocol · Ollama

🤖 A generic Model Context Protocol (MCP) server template - Build your own MCP integrations with local LLMs

🌟 Features

This generic MCP server provides a foundation for building custom AI integrations, using 100% local LLMs via Ollama for complete privacy:

  • 🔍 Search Capabilities: Implement custom search across your data sources
  • 🌐 Content Processing: Process and generate content with local LLMs
  • 📝 Dynamic Generation: Create custom content using LLMs (Qwen, Llama, etc.)
  • 🎯 Custom Workflows: Build your own AI-powered workflows
  • 📊 Data Management: Track and manage your custom data
  • 🚀 Extensible Framework: Easy to extend with your own tools

🏗️ Architecture

┌─────────────┐     MCP Protocol    ┌───────────────────┐
│Claude/AI    │◄───────────────────►│GenericMCP         │
│Assistant    │                     │Server             │
└─────────────┘                     └───────┬───────────┘
                                            │
                ┌───────────────────────────┼───────────────────────────┐
                │                           │                           │
        ┌───────▼────────┐       ┌─────────▼────────┐       ┌──────────▼──────┐
        │Your Data       │       │Ollama Local LLMs │       │Custom Tools     │
        │Sources         │       │Qwen/Llama/Mistral│       │& Integrations   │
        └────────────────┘       └──────────────────┘       └─────────────────┘
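The Ollama leg of this diagram is an HTTP call to Ollama's local REST API (`POST /api/generate`). A minimal sketch of that call, assuming Node.js 18+ (for global `fetch`); `buildGenerateRequest` and `generate` are illustrative helper names, not part of this repo:

```javascript
// Sketch of the server-to-Ollama leg of the diagram.
// Ollama's /api/generate endpoint returns a completion as JSON.

function buildGenerateRequest(model, prompt) {
  // stream: false asks Ollama for one JSON response instead of chunks
  return { model, prompt, stream: false };
}

async function generate(prompt, {
  host = process.env.OLLAMA_HOST ?? "http://localhost:11434",
  model = process.env.OLLAMA_MODEL ?? "qwen2.5:7b",
} = {}) {
  const res = await fetch(`${host}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(model, prompt)),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response; // the generated text
}
```

Because the model runs behind this local endpoint, swapping models is just a change to the `model` field.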

📦 Installation

Prerequisites

  • Node.js 18+
  • Ollama installed and running
  • Your custom data sources (optional)
  • Additional tools as needed for your use case

Quick Start

1. Clone this repository:

   ```bash
   git clone https://github.com/yourusername/GenericMCP.git
   cd GenericMCP
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Install Ollama and pull a model:

   ```bash
   # Install Ollama (macOS)
   brew install ollama

   # Start Ollama service
   ollama serve

   # Pull Qwen 2.5 (recommended) or any other model
   ollama pull qwen2.5:7b
   ```

4. Test the server:

   ```bash
   npm start
   ```

🔧 Configuration

Environment Variables

```bash
export DATA_PATH="/path/to/your/data"
export TOOLS_PATH="/path/to/your/tools"
export OLLAMA_HOST="http://localhost:11434"
export OLLAMA_MODEL="qwen2.5:7b"
```
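Inside a Node entry point, these variables can be read once at startup, falling back to the defaults above when unset. A minimal sketch; `loadConfig` and the `./data` / `./tools` fallback paths are illustrative assumptions, not repo code:

```javascript
// Sketch: read the environment variables above with sensible fallbacks.
function loadConfig(env = process.env) {
  return {
    dataPath: env.DATA_PATH ?? "./data",
    toolsPath: env.TOOLS_PATH ?? "./tools",
    ollamaHost: env.OLLAMA_HOST ?? "http://localhost:11434",
    ollamaModel: env.OLLAMA_MODEL ?? "qwen2.5:7b",
  };
}
```

Centralizing the defaults this way keeps the Claude Desktop `env` block below optional: anything it omits falls back to the same values.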

Claude Desktop Integration

Add to `~/Library/Application Support/Claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "generic-mcp": {
      "command": "node",
      "args": ["/path/to/GenericMCP/index.js"],
      "env": {
        "DATA_PATH": "/path/to/your/data",
        "TOOLS_PATH": "/path/to/your/tools",
        "OLLAMA_MODEL": "qwen2.5:7b"
      }
    }
  }
}
```

🛠️ Available Tools

| Tool | Description | Example |
|------|-------------|---------|
| `search` | Search your data sources | "Find relevant documents" |
| `process_content` | Process content with LLMs | "Analyze this text" |
| `generate` | Generate content using AI | "Create a summary" |
| `custom_workflow` | Execute custom workflows | "Run my automation" |
| `manage_data` | Manage your data | "Update records" |
| `extend_tools` | Add new capabilities | "Add custom function" |
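A tool table like this is commonly backed by a map from tool name to async handler that the MCP request handler dispatches into. A minimal sketch with hypothetical stub handlers; the actual server may wire tools differently (for example, through the official MCP SDK):

```javascript
// Sketch of a tool registry: each MCP tool name maps to an async handler.
// These handlers are stubs; real ones would hit data sources or Ollama.
const tools = {
  search: async ({ query }) => ({ results: [`stub result for: ${query}`] }),
  generate: async ({ prompt }) => ({ text: `stub completion for: ${prompt}` }),
};

async function callTool(name, args) {
  const handler = tools[name];
  if (!handler) throw new Error(`Unknown tool: ${name}`);
  return handler(args);
}
```

Adding a capability (the `extend_tools` idea above) then amounts to adding one more entry to the registry.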

🔐 Privacy & Security

  • 100% Local LLMs - No cloud API calls
  • Ollama Integration - Run on your hardware
  • No Telemetry - Zero tracking
  • Offline Capable - Works without internet
  • Open Source - Fully auditable

🚀 Supported Models

| Model | Size | Best For |
|-------|------|----------|
| qwen2.5:7b | 4.7GB | General purpose |
| llama3.2:3b | 2.0GB | Lightweight |
| mistral:7b | 4.1GB | Code generation |
| deepseek-coder | 4.1GB | Programming |

📄 License

MIT License - see the LICENSE file


Built with ❤️ for privacy-conscious AI learners