WhytCard

AI Infrastructure — RAG. MCP. Multi-Agent.

A high-performance AI infrastructure built with Rust. MCP Gateway for dynamic tool orchestration.

Rust

Overview

WhytCard is an AI infrastructure featuring:

  • CORTEX Engine — Cognitive orchestration with Perceive → Execute → Learn pipeline
  • Triple Memory System — Semantic (vectors), Episodic (events), Procedural (rules)
  • Knowledge Graph — Structured entities and relations via SurrealDB
  • MCP Gateway — Install and orchestrate any MCP server dynamically
  • RAG Pipeline — Document ingestion with FastEmbed embeddings
┌─────────────────────────────────────────────────────────┐
│                    CORTEX ENGINE                         │
│  ┌──────────┐  ┌──────────┐  ┌──────────┐              │
│  │ PERCEIVE │─▶│ EXECUTE  │─▶│  LEARN   │              │
│  └──────────┘  └──────────┘  └──────────┘              │
└─────────────────────────────────────────────────────────┘
┌─────────────────────────────────────────────────────────┐
│                  TRIPLE MEMORY                           │
│  ┌──────────┐  ┌──────────┐  ┌──────────┐              │
│  │ SEMANTIC │  │ EPISODIC │  │PROCEDURAL│              │
│  │ (Vectors)│  │ (Events) │  │ (Rules)  │              │
│  └──────────┘  └──────────┘  └──────────┘              │
└─────────────────────────────────────────────────────────┘
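The Perceive → Execute → Learn loop above can be sketched in a few lines of Rust. This is an illustrative sketch only — the `Cortex` struct and method names here are hypothetical, not the actual `whytcard-intelligence` API:

```rust
// Hypothetical sketch of the CORTEX pipeline: each input is normalized
// (perceive), acted on (execute), and the outcome is recorded (learn).
struct Cortex {
    lessons: Vec<String>, // procedural memory stand-in
}

impl Cortex {
    fn new() -> Self {
        Cortex { lessons: Vec::new() }
    }

    // PERCEIVE: turn raw input into a normalized intent.
    fn perceive(&self, input: &str) -> String {
        input.trim().to_lowercase()
    }

    // EXECUTE: act on the intent and produce an outcome.
    fn execute(&self, intent: &str) -> String {
        format!("handled: {intent}")
    }

    // LEARN: record the intent/outcome pair for later adaptation.
    fn learn(&mut self, intent: &str, outcome: &str) {
        self.lessons.push(format!("{intent} => {outcome}"));
    }

    // Full pipeline: perceive, then execute, then learn.
    fn process(&mut self, input: &str) -> String {
        let intent = self.perceive(input);
        let outcome = self.execute(&intent);
        self.learn(&intent, &outcome);
        outcome
    }
}

fn main() {
    let mut cortex = Cortex::new();
    let outcome = cortex.process(" Deploy App ");
    println!("{outcome}"); // prints "handled: deploy app"
    println!("lessons recorded: {}", cortex.lessons.len());
}
```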

Architecture

whytcard/
├── core/                    # Rust modules
│   ├── intelligence/        # MCP Server + CORTEX Engine
│   ├── database/            # SurrealDB (documents, vectors, graph)
│   ├── rag/                 # Retrieval-Augmented Generation
│   ├── llm/                 # Local LLM inference (GGUF)
│   └── mcp/                 # MCP Gateway configuration
│
└── data/                    # Runtime data
    ├── cortex/              # Memory storage
    ├── vectors/             # Vector embeddings
    └── models/              # LLM models

Quick Start

Prerequisites

  • Rust 1.75+
  • Node.js 20+ (for addons)
  • Python 3.10+ (for voice/ears services)

Run the MCP Server

cd core/intelligence
cargo run --release

The server starts on stdio for MCP protocol communication.
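To use it from an MCP client, point the client at the binary over stdio. A minimal client-side entry, assuming a Claude Desktop-style `mcpServers` config (exact field names vary by client):

```json
{
  "mcpServers": {
    "whytcard": {
      "command": "cargo",
      "args": ["run", "--release", "-p", "whytcard-intelligence"],
      "env": { "WHYTCARD_DATA_DIR": "/path/to/data" }
    }
  }
}
```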

With custom data directory

WHYTCARD_DATA_DIR=/path/to/data cargo run -p whytcard-intelligence

With namespace isolation

cargo run -p whytcard-intelligence -- --namespace copilot

MCP Tools

CORTEX (Cognitive Engine)

Tool              Description
cortex_process    Main Perceive → Execute → Learn pipeline
cortex_feedback   Feedback for adaptive learning
cortex_stats      Engine statistics
cortex_cleanup    Cleanup old data
cortex_execute    Execute shell commands

Memory

Tool            Description
memory_store    Store with semantic indexing
memory_search   Semantic search
memory_get      Retrieve by key
memory_delete   Delete by key
hybrid_search   Search across all memory types
get_context     Aggregated context for queries
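These tools map onto standard MCP `tools/call` requests. A sketch of a store request follows; the argument names (`key`, `content`) are illustrative assumptions, not the documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memory_store",
    "arguments": {
      "key": "deploy-notes",
      "content": "Use blue-green deploys for the API"
    }
  }
}
```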

Knowledge Graph

Tool                     Description
knowledge_add_entity     Add entity
knowledge_add_relation   Create relation
knowledge_search         Search graph
knowledge_get_entity     Get entity + relations
knowledge_find_path      Find path between entities
export_graph             Export full graph
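Relations connect two previously added entities. A sketch of a `knowledge_add_relation` call; the `from`/`relation`/`to` argument names are assumptions for illustration:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "knowledge_add_relation",
    "arguments": {
      "from": "WhytCard",
      "relation": "uses",
      "to": "SurrealDB"
    }
  }
}
```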

External Integrations

Tool                  Description
sequential_thinking   Problem decomposition
external_docs         Library documentation (Context7)
external_search       Web search (Tavily)

MCP Server Management

Tool                    Description
mcp_available_servers   List predefined MCP servers
mcp_list_installed      List installed servers
mcp_install             Install a server
mcp_uninstall           Uninstall a server
mcp_connect             Connect to a server
mcp_disconnect          Disconnect from a server
mcp_list_tools          List tools of a server
mcp_call                Call a tool on external server
mcp_status              Get connection status
mcp_configure           Configure server settings
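The typical gateway flow is install → connect → call. A proxied call might look like the sketch below; the `server`/`tool`/`arguments` parameter names are assumptions, not the gateway's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "mcp_call",
    "arguments": {
      "server": "tavily",
      "tool": "search",
      "arguments": { "query": "surrealdb vector index" }
    }
  }
}
```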

Configuration

Environment Variables

# Data directory
WHYTCARD_DATA_DIR=/path/to/data

# Namespace for isolation
WHYTCARD_NAMESPACE=copilot

# External APIs (optional)
TAVILY_API_KEY=your-key

Predefined MCP Servers

The following MCP servers are pre-configured and can be connected on demand:

Server                Description                        Requires API Key
sequential-thinking   Problem decomposition & analysis   No
context7              Library documentation lookup       No
playwright            Browser automation & testing       No
memory                Persistent memory storage          No
microsoft-learn       Microsoft/Azure documentation      No
markitdown            Document conversion to markdown    No
chrome-devtools       Chrome DevTools Protocol           No
tavily                Web search & research              Yes (TAVILY_API_KEY)

MCP Server Configuration

Servers are managed via core/mcp/servers.json:

{
  "sequential-thinking": {
    "command": "npx",
    "args": ["-y", "@anthropic/mcp-sequential-thinking"],
    "enabled": true
  },
  "context7": {
    "command": "npx",
    "args": ["-y", "@anthropic/context7-mcp"],
    "enabled": true
  },
  "playwright": {
    "command": "npx",
    "args": ["-y", "@anthropic/mcp-playwright"],
    "enabled": true
  },
  "memory": {
    "command": "npx",
    "args": ["-y", "@anthropic/mcp-memory"],
    "enabled": true
  },
  "microsoft-learn": {
    "command": "npx",
    "args": ["-y", "@anthropic/mcp-microsoft-learn"],
    "enabled": true
  },
  "markitdown": {
    "command": "npx",
    "args": ["-y", "@anthropic/mcp-markitdown"],
    "enabled": true
  },
  "chrome-devtools": {
    "command": "npx",
    "args": ["-y", "@anthropic/mcp-chrome-devtools"],
    "enabled": true
  },
  "tavily": {
    "command": "npx",
    "args": ["-y", "tavily-mcp"],
    "env": {
      "TAVILY_API_KEY": "${TAVILY_API_KEY}"
    },
    "enabled": true
  }
}
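To register your own server, add an entry with the same shape. The example below is hypothetical — it assumes the gateway accepts any command that speaks MCP over stdio:

```json
"my-local-server": {
  "command": "node",
  "args": ["./servers/my-local-server/index.js"],
  "enabled": false
}
```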

Development

Build all modules

# From the repository root (subshells so each cd is independent):

# Database
(cd core/database && cargo build)

# RAG
(cd core/rag && cargo build)

# Intelligence (MCP Server)
(cd core/intelligence && cargo build)

# LLM
(cd core/llm && cargo build)

Run tests

cd core/intelligence
cargo test

Clippy

cargo clippy -p whytcard-intelligence

Tech Stack

Component         Technology
Core Engine       Rust
Database          SurrealDB (embedded)
Embeddings        FastEmbed (ONNX)
LLM Inference     llama.cpp (GGUF)
MCP Protocol      rmcp SDK
IDE Integration   VS Code Extension (VSIX)

Roadmap

  • Triple Memory System
  • CORTEX Engine (Perceive, Execute, Learn)
  • Knowledge Graph
  • MCP Server
  • MCP Gateway (dynamic server management)
  • External Integrations (Context7, Tavily, MS Learn)
  • Multi-Agent System
  • Advanced RAG Pipeline

License

GPL-3.0 — see the LICENSE file for details.
