
🧠 Memory MCP Server — Orchestrator

License: MIT | Node.js | TypeScript

🚀 Your AI Agent's Persistent Brain: a backend for memory, task planning, and codebase intelligence.


⚠️ FOR AI AGENTS USING OI OS (Brain Trust 4)

If you are an AI agent installing this server into OI OS, see the OI.md file for complete installation instructions, including:

  • Intent mappings for natural language queries
  • Parameter rules for tool calls
  • Parameter extractors configuration
  • OI OS-specific setup and troubleshooting

The OI.md file contains all the SQL and TOML configurations needed for full OI OS integration.




🌟 Overview

Memory MCP Server (Orchestrator) is a state-of-the-art backend that transforms AI agents into persistent, context-aware, and deeply code-literate collaborators. With rich, multi-turn memory, AI-powered planning, and semantic understanding of your codebase, it unlocks intelligent workflows for everything from code review to project management.


✨ Features

  • Persistent Memory: Multi-user conversation sessions, versioned context, and reference keys.
  • Project & Task Planning: Manage plans, tasks, and subtasks; boost with AI-powered plan/task generation and analysis.
  • Knowledge Graph: Portable, human-readable codebase graph (JSONL); store and query entities & relationships.
  • Semantic Code Search: Embed and search code for conceptual matches, not just keywords.
  • Integrated AI Services: Google Gemini for planning, summarization, and code analysis; Tavily for grounded web search.
  • Data Validation & Utilities: Input schema validation, robust error handling, and database backup/restore tools.
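
The portable JSONL knowledge-graph format mentioned above can be sketched as follows. The record shape (`type`, `name`, `kind`, `from`, `to`, `rel`) is illustrative only, not the server's actual schema:

```typescript
// Sketch of a JSONL knowledge graph: one JSON object per line, mixing
// entities and relationships. Field names here are illustrative assumptions.
type Entity = { type: "entity"; name: string; kind: string };
type Relation = { type: "relation"; from: string; to: string; rel: string };
type GraphRecord = Entity | Relation;

const jsonl = [
  '{"type":"entity","name":"src/index.ts","kind":"file"}',
  '{"type":"entity","name":"createServer","kind":"function"}',
  '{"type":"relation","from":"src/index.ts","to":"createServer","rel":"defines"}',
].join("\n");

// Parse each line independently: a malformed line corrupts only itself,
// which is one reason JSONL suits an append-friendly, human-readable store.
const records: GraphRecord[] = jsonl
  .split("\n")
  .filter((line) => line.trim().length > 0)
  .map((line) => JSON.parse(line));

// Query: which entities does a given file define?
function definedBy(file: string): string[] {
  return records
    .filter(
      (r): r is Relation =>
        r.type === "relation" && r.from === file && r.rel === "defines"
    )
    .map((r) => r.to);
}

console.log(definedBy("src/index.ts")); // logs the single match: createServer
```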

🚀 Installation

Prerequisites

| Requirement | Version |
| --- | --- |
| Node.js | 18.x or higher |
| npm | Latest |
| Git | Any |

Installation Steps

```bash
git clone https://github.com/rashee1997/orchestrator.git
cd orchestrator
npm install
npm run build
```

⚙️ Configuration

API Keys Setup

The server requires API keys for external services. These are best configured in your MCP client's settings file to avoid exposing them in your shell environment. For Google Gemini, you can provide multiple API keys (e.g., from different projects or for failover/load balancing) by appending an underscore and a number (e.g., GEMINI_API_KEY_2, GOOGLE_API_KEY_3). The server will automatically use these in a round-robin fashion.

| Service | Environment Variable | Required | Get API Key |
| --- | --- | --- | --- |
| Google Gemini | `GEMINI_API_KEY` | Yes | Get Key |
| | `GEMINI_API_KEY_2`, etc. | 🔀 Optional | |
| | `GOOGLE_API_KEY` | ➡️ Alias | |
| | `GOOGLE_API_KEY_2`, etc. | 🔀 Optional | |
| Tavily Search | `TAVILY_API_KEY` | Yes | Get Key |
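
The multi-key, round-robin behaviour described above can be sketched with a simple selector. How the server actually discovers and rotates keys is not shown here, so the environment scan and rotation below are assumptions based only on the documented naming convention (`GEMINI_API_KEY`, `GEMINI_API_KEY_2`, `GOOGLE_API_KEY`, …):

```typescript
// Sketch: collect numbered GEMINI_API_KEY / GOOGLE_API_KEY variables and
// rotate through them. This is an assumption, not the server's actual code.
type Env = Record<string, string | undefined>;

function collectKeys(env: Env): string[] {
  const prefixes = ["GEMINI_API_KEY", "GOOGLE_API_KEY"];
  const keys: string[] = [];
  for (const prefix of prefixes) {
    if (env[prefix]) keys.push(env[prefix]!);
    // Numbered variants start at _2 and must be contiguous in this sketch.
    for (let i = 2; env[`${prefix}_${i}`]; i++) keys.push(env[`${prefix}_${i}`]!);
  }
  return keys;
}

function makeRoundRobin(keys: string[]): () => string {
  if (keys.length === 0) throw new Error("no API keys configured");
  let next = 0;
  return () => {
    const key = keys[next];
    next = (next + 1) % keys.length; // wrap around after the last key
    return key;
  };
}

const nextKey = makeRoundRobin(
  collectKeys({ GEMINI_API_KEY: "key-a", GEMINI_API_KEY_2: "key-b" })
);
console.log(nextKey(), nextKey(), nextKey()); // key-a key-b key-a
```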

MCP Client Configuration (VS Code Client Example)

  1. Locate the settings file:

    • Windows: %APPDATA%\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json
    • macOS: ~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json
    • Linux: ~/.config/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json
  2. Add the server configuration:

```json
{
  "memory-mcp-server": {
    "disabled": false,
    "autoApprove": [],
    "timeout": 120,
    "transportType": "stdio",
    "command": "node",
    "args": [
      "/absolute/path/to/memory-mcp-server/build/index.js"
    ],
    "env": {
      "GEMINI_API_KEY": "your-primary-gemini-api-key",
      "GEMINI_API_KEY_2": "your-secondary-gemini-api-key",
      "GOOGLE_API_KEY": "another-gemini-key-alias",
      "TAVILY_API_KEY": "your-tavily-api-key-here"
    }
  }
}
```

Replace /absolute/path/to/memory-mcp-server/ with your actual path.


🛠️ Available Tools

  • Conversation Management: Create, manage, and retrieve conversation sessions and messages for persistent, multi-user dialogue.
    • Examples: create_conversation_session, get_conversation_messages
  • Plan & Task Management: Organize and update project plans, tasks, and subtasks, assign tasks, and track progress.
    • Examples: create_task_plan, list_tasks, assign_task
  • Subtask Management: Break tasks into subtasks for finer granularity and progress tracking.
    • Examples: create_subtask, list_subtasks
  • Knowledge Graph Tools: Parse your codebase, build a knowledge graph, and query or update code entities and relationships.
    • Examples: ingest_codebase_structure, query_knowledge_graph
  • Embeddings & Semantic Search: Generate and query vector embeddings for conceptual code search.
    • Examples: ingest_codebase_embeddings, query_codebase_embeddings
  • AI-Enhanced Planning/Tasks: Use AI to decompose tasks, suggest details, or analyze plans for coherence and completeness.
    • Examples: ai_suggest_subtasks, ai_analyze_plan
  • Prompt Refinement & AI: Refine natural language prompts and generate answers with Gemini.
    • Examples: get_refined_prompt, ask_gemini
  • Web Search & Database Utilities: Integrate grounded results via Tavily, export data, and manage DB backups.
    • Examples: tavily_web_search, backup_database, list_tools
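
Each of the tools above is invoked over MCP with a standard `tools/call` request. A minimal sketch of the payload an agent might send for `create_conversation_session` follows; only `method`, `name`, and `arguments` mirror the MCP protocol, while the argument fields themselves are hypothetical and not this server's documented schema:

```typescript
// Sketch of an MCP tools/call request body. The "agent_id" and "title"
// parameters are invented for illustration; consult the tool's schema
// (e.g. via list_tools) for the real parameter names.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "create_conversation_session",
    arguments: {
      agent_id: "reviewer-1",      // hypothetical parameter
      title: "Code review session", // hypothetical parameter
    },
  },
};

console.log(request.method, request.params.name);
```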

⚡ Example Workflow

Here’s how you might orchestrate a multi-step AI workflow with these tools:

  1. Understand the Goal: Use ask_gemini (with execution_mode: plan_generation) to turn a high-level prompt into a structured project plan.
  2. Create the Plan: Call create_task_plan with the refined prompt to initialize a new plan.
  3. Analyze Codebase: Run ingest_codebase_structure to map code files and entities.
  4. Enrich Tasks: Use ai_suggest_subtasks to break complex tasks into actionable subtasks.
  5. Track Progress: Store and retrieve progress via get_task, update_task, and related tools.
  6. Search & Context: Use query_codebase_embeddings or tavily_web_search as context for tasks or code review.
  7. Audit & Export: Regularly export data with export_data_to_csv or back up the database.
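
The steps above can be sketched as an ordered sequence of tool calls. The `callTool` stub below stands in for a real MCP client (real calls are asynchronous), and every response shape is invented for illustration; only the tool names and the `execution_mode: plan_generation` setting come from the workflow itself:

```typescript
// Sketch: the workflow expressed as ordered tool calls. callTool is a
// synchronous stub that records call order and returns canned responses;
// a real agent would send these requests over MCP instead.
const calls: string[] = [];
function callTool(name: string, _args: Record<string, unknown>): any {
  calls.push(name);
  if (name === "create_task_plan") return { plan_id: "plan-1" };
  if (name === "ai_suggest_subtasks") return { subtasks: ["draft", "review"] };
  return {};
}

function runWorkflow(goal: string): string[] {
  // 1. Understand the goal with AI-powered plan generation.
  callTool("ask_gemini", { query: goal, execution_mode: "plan_generation" });
  // 2. Create the plan (the "refined_prompt" argument name is hypothetical).
  const { plan_id } = callTool("create_task_plan", { refined_prompt: goal });
  // 3. Map the codebase.
  callTool("ingest_codebase_structure", { root: "." });
  // 4. Break complex tasks into subtasks.
  const { subtasks } = callTool("ai_suggest_subtasks", { plan_id });
  return subtasks;
}

console.log(runWorkflow("Add OAuth login to the app")); // [ 'draft', 'review' ]
```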

See the docs/ directory or the project wiki for more workflow recipes and advanced usage.


🏗️ Architecture

Project Structure

```
memory-mcp-server/
├── src/
│   ├── database/        # Database schemas, services, and managers
│   │   ├── managers/    # Logic for managing specific data models
│   │   ├── parsers/     # Language parsers for codebase introspection
│   │   ├── services/    # Business logic (Gemini, Embeddings, etc.)
│   │   └── storage/     # Low-level storage (JSONL, Indexing)
│   ├── tools/           # MCP tool definitions and handlers
│   ├── types/           # Core TypeScript type definitions
│   └── index.ts         # Main server entry point
├── knowledge_graphs/    # JSONL for code graph
├── memory.db            # SQLite main db
├── vector_store.db      # SQLite for embeddings
└── README.md
```

Data Flow

```mermaid
flowchart TD
  Agent[AI Agent] -->|MCP Request| Server[Memory MCP Server]
  Server -->|Structured Data| SQLite[(SQLite memory.db)]
  Server -->|KG Operations| KG[(JSONL Knowledge Graph)]
  Server -->|Semantic Search| VecDB[(Vector Store)]
  Server -->|AI/Web Tasks| Ext{External Services}
  Ext --> Gemini[(Google Gemini)]
  Ext --> Tavily[(Tavily Search)]
```

💻 Development

```bash
npm install
npm run build    # Compile TypeScript
npm run watch    # Auto-rebuild on changes
npm test         # Run tests
```

  • Use npm run inspector for a web-based debugging UI.

🤝 Contributing

We love contributions! Fork, PR, and let’s build the future of intelligent agents together. Ensure you cover new features with tests and keep all existing tests green.


📄 License

MIT. See the LICENSE file for details.


Built with creativity and care for next-gen AI agents.