rameshpilli/MCP-Server
Model Context Protocol (MCP) Server with Agent Registration
This project implements a Model Context Protocol (MCP) server with a Chainlit UI and FastAPI backend that supports agent registration. It allows various tools, resources, and prompt templates to be registered and used by agents.
Architecture
The system follows a layered architecture:
```
┌─────────────────┐
│   Chainlit UI   │
└────────┬────────┘
         ▼
┌─────────────────┐
│   MCP Client    │
└────────┬────────┘
         ▼
┌─────────────────┐
│ FastAPI Server  │
└────────┬────────┘
         ▼
┌─────────────────┐
│   MCP Server    │
└────────┬────────┘
         │
    ┌────┴──────────────┐
    ▼                   ▼
┌───────────┐    ┌────────────┐
│ External  │    │  Internal  │
│  Agents   │    │ Components │
└─────┬─────┘    └─────┬──────┘
      ▼                ▼
┌───────────┬───────────┬──────────┐
│   Tools   │ Resources │ Prompts  │
└───────────┴───────────┴──────────┘
```
Components

- Chainlit UI (`ui/app.py`)
  - Web interface for user interaction
  - Sends messages to the MCP client
- MCP Client (`mcp_client.py`)
  - Processes user messages
  - Sends requests to the MCP server
  - Handles context retrieval from Cohere
- FastAPI Server (`app/main.py`)
  - API endpoints for agent registration
  - Forwards requests to the MCP Server
  - Manages agent metadata
- MCP Server (`app/mcp_server.py`)
  - Central orchestration engine
  - Executes tools and processes requests
  - Uses namespaced registries for tools, resources, and prompts
- Registry (`app/registry/`)
  - Management of tools, resources, and prompts
  - Namespace support for multi-agent environments
  - Dynamic registration and discovery
Flow

1. The user inputs a message in the Chainlit UI
2. The message is sent to the MCP Client
3. The MCP Client forwards the request to the FastAPI server
4. The FastAPI server forwards it to the MCP Server
5. The MCP Server processes the request, executing tools as needed
6. The response is returned back through the chain
7. The response is displayed in the Chainlit UI
Example tool chaining:

```
User Input
    |
    v
MCP Server
    |
    v
[Tool A] -> [Tool B] -> [Tool C]
    |
    v
Response
```
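This chaining can be sketched as a simple pipeline in which each tool consumes the previous tool's output. The tool bodies below are invented placeholders for illustration, not the project's actual tools:

```python
# Minimal sketch of sequential tool chaining: each tool receives the
# previous tool's result. Tool names and bodies are hypothetical.
def tool_a(text: str) -> str:
    return text.lower()          # e.g. normalize the input

def tool_b(text: str) -> list:
    return text.split()          # e.g. tokenize

def tool_c(tokens: list) -> int:
    return len(tokens)           # e.g. count tokens

def run_chain(user_input: str, chain=(tool_a, tool_b, tool_c)):
    result = user_input
    for tool in chain:
        result = tool(result)    # feed each tool the previous output
    return result

print(run_chain("Top Clients In Canada"))  # -> 4
```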
Agent Registration
The system supports registration of external agents, each with its own tools, resources, and prompts.
How to Register an Agent
```python
import requests

# Register a new agent
response = requests.post(
    "http://localhost:8000/api/v1/agents/register",
    json={
        "name": "MyAgent",
        "description": "My custom agent for data processing",
        "namespace": "myagent",
        "capabilities": ["search", "summarize", "analyze"],
    },
)
agent_id = response.json()["id"]
```
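Once registered, an agent can be listed, inspected, or unregistered through the same API. The helpers below are a sketch against the endpoints documented in this README; the function names are our own, not part of the project:

```python
import requests

BASE_URL = "http://localhost:8000/api/v1"

def agent_url(agent_id: str = "") -> str:
    # Build an agents endpoint URL; an empty agent_id targets the collection.
    return f"{BASE_URL}/agents/{agent_id}" if agent_id else f"{BASE_URL}/agents"

def list_agents() -> list:
    # GET /api/v1/agents - list all registered agents
    return requests.get(agent_url()).json()

def unregister_agent(agent_id: str) -> None:
    # DELETE /api/v1/agents/{agent_id} - unregister an agent
    requests.delete(agent_url(agent_id))
```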
Namespaced Components
All components (tools, resources, prompts) are namespaced to avoid conflicts:
```python
# Register a tool with a namespace
from app.registry.tools import register_tool

@register_tool(
    name="custom_search",
    description="Custom search implementation",
    namespace="myagent",
)
async def custom_search(query: str):
    # Search implementation
    pass
```
Setup

- Install dependencies from `pyproject.toml`:

```
pip install .
```

This project uses `pyproject.toml` as the single source of truth for dependencies. The provided Dockerfile and Kubernetes manifests install packages the same way, using `pip install .`.
- Configure environment variables (create a `.env` file). You can start by copying `.env.example` and then filling in the required values (e.g. your OpenAI API key):

```
# MCP Server
MCP_SERVER_HOST=localhost
MCP_SERVER_PORT=8080

# FastAPI Server
HOST=localhost
PORT=8000

# LLM Configuration
LLM_MODEL=claude-3-opus-20240229
LLM_BASE_URL=https://api.anthropic.com/v1/messages
# Add OAuth settings if needed

# Cohere Configuration (optional)
COHERE_INDEX_NAME=mcp_index
COHERE_SERVER_URL=
COHERE_SERVER_BEARER_TOKEN=
```
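These settings are typically read back with environment lookups that fall back to sane defaults. A sketch of how a config module might consume them (the fallback values mirror the `.env` example above; this is not the project's actual `app/config.py`):

```python
import os

# Read server settings from the environment, falling back to the
# defaults shown in the .env example above.
MCP_SERVER_HOST = os.getenv("MCP_SERVER_HOST", "localhost")
MCP_SERVER_PORT = int(os.getenv("MCP_SERVER_PORT", "8080"))
HOST = os.getenv("HOST", "localhost")
PORT = int(os.getenv("PORT", "8000"))
LLM_MODEL = os.getenv("LLM_MODEL", "claude-3-opus-20240229")
```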
Running the Application

Option 1: Start Both Servers with a Single Command

```
python run.py
```

Option 2: Start Each Server Separately

- Start the MCP server:

```
python app/mcp_server.py
```

To specify the server mode, use the `--mode` flag:

```
python app/mcp_server.py --mode http   # default, enables SSE
python app/mcp_server.py --mode stdio  # run in STDIO mode
```

SSE is available when running in `http` mode.

- Start the FastAPI server:

```
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
```

- Start the Chainlit UI:

```
chainlit run ui/app.py
```
Option 3: Use the `uvx` CLI

After installing the project with `pip install .`, you can start both servers and the Chainlit UI with a single command:

```
uvx run --host 0.0.0.0 --port 8000
```

The Chainlit interface will run on `port+1` (default `8001`).
Running the Dummy Financial Server

To test the financial tools without corporate services, start the mock API:

```
uvicorn examples.dummy_financial_server:app --host 0.0.0.0 --port 8001
```

Set `CLIENTVIEW_BASE_URL` to point the tools at this server:

```
export CLIENTVIEW_BASE_URL="http://localhost:8001"
```

- Access the UI at http://localhost:8501
- Access the API docs at http://localhost:8000/docs
Docker Usage

Build the container image:

```
docker build -t mcp-server .
```

The Dockerfile installs all dependencies using `pip install .`, so `pyproject.toml` is the single source of package versions.

Run the server using your `.env` file:

```
docker run --env-file .env -p 8000:8000 -p 8081:8081 -p 8501:8501 mcp-server
```
Adding Documents

Place documents in the `docs/` directory. The system supports Markdown (`.md`) and text (`.txt`) files.
API Endpoints
- GET /api/v1/health - Health check
- POST /api/v1/chat - Chat with the MCP server
- POST /api/v1/agents/register - Register a new agent
- GET /api/v1/agents - List all registered agents
- GET /api/v1/agents/{agent_id} - Get agent information
- DELETE /api/v1/agents/{agent_id} - Unregister an agent
Configuration

Edit `app/config.py` to change configuration settings.
Logging

Logging behavior is controlled by two environment variables:

- `LOG_LEVEL` sets the verbosity (default `INFO`).
- `LOG_TO_STDOUT_ONLY`: if set, disables file logging and writes logs only to stdout.
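A sketch of how these two variables could map onto Python's standard `logging` module (the handler setup and log file name are assumptions, not the project's actual code):

```python
import logging
import os

def configure_logging() -> logging.Logger:
    # LOG_LEVEL sets verbosity; defaults to INFO.
    level = os.getenv("LOG_LEVEL", "INFO").upper()
    # Always log to stdout; add a file handler unless LOG_TO_STDOUT_ONLY is set.
    handlers = [logging.StreamHandler()]
    if not os.getenv("LOG_TO_STDOUT_ONLY"):
        handlers.append(logging.FileHandler("mcp-server.log"))
    logger = logging.getLogger("mcp-server")
    logger.setLevel(level)
    for handler in handlers:
        logger.addHandler(handler)
    return logger
```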
Project Structure

```
mcp-app/
├── app/
│   ├── main.py            # FastAPI entrypoint that handles HTTP requests and routes them to the right components.
│   ├── mcp_client.py      # The brain of the operation: processes prompts and coordinates tool execution.
│   ├── mcp_bridge.py      # Connects to Cohere Compass to understand user intent and plan tool usage.
│   ├── registry/
│   │   ├── tools.py       # A catalog of available tools that can be used to help users.
│   │   ├── prompts.py     # Templates for consistent communication with users and tools.
│   │   └── resources.py   # External service connections such as CRM systems or databases.
│   ├── memory/
│   │   └── short_term.py  # Keeps track of conversation context using Redis.
│   ├── workers/
│   │   └── summarizer.py  # Example tool that processes and summarizes information.
│   └── chaining.py        # Orchestrates multiple tools working together.
├── ui/
│   └── app.py             # A friendly chat interface for users to interact with the system.
├── .env                   # Configuration secrets and API keys (keep this safe!).
├── Dockerfile             # Instructions for packaging the app into a container.
├── pyproject.toml         # Project metadata and dependencies.
└── kubernetes/
    ├── deployment.yaml    # Tells Kubernetes how to run multiple copies of the app.
    ├── service.yaml       # Sets up networking so other services can talk to the app.
    └── ingress.yaml       # Manages external access to the app.
```
Development Phases

- ✅ UI Setup (Chainlit)
- ⏳ FastAPI Server Setup
- ⏳ MCP Client Logic
- ⏳ Cohere Compass Integration
- ⏳ Tool Chaining Logic
- ⏳ Sample Tools Implementation
- ⏳ Session Memory
- ⏳ Testing Setup
- ⏳ Docker Configuration
- ⏳ Kubernetes Deployment
API Endpoints

- POST /chat - Main chat endpoint
- POST /register - Register new tools
- POST /tool/{tool_name} - Execute a specific tool
CRM MCP Server Request Flow

- Client Request
  - The user (Chainlit UI or API consumer) sends a request (e.g., a message or command).
- main.py (FastAPI)
  - Receives the HTTP request at /api/v1/chat
  - Calls mcp_client.process_message()
- mcp_bridge.py
  - Acts as a bridge between FastAPI and the MCP server.
  - Uses Cohere Compass to classify the intent and decide which tools to run.
  - Returns a routing plan (tools + parameters).
- mcp_server.py (FastMCP Server)
  - Receives the tool execution plan from the bridge.
  - Finds the matching tools in its registry.
  - Executes the tools (can support chaining).
- Tools
  - Tools do the real work (e.g., fetch financial data, read docs).
  - Return results to the MCP server.
- Backflow
  - Tools → MCP Server: results go back to the FastMCP server.
  - MCP Server → mcp_bridge.py: FastMCP hands over the tool results.
  - mcp_bridge.py → main.py: the bridge optionally uses generate_response() to format the result.
  - main.py → Client: the final response is returned to the UI/API caller.

🧠 Key Insight

The MCP Server is the brain, the Bridge is the router and formatter, and FastAPI is just the door.
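The routing plan the bridge hands to the server can be pictured as plain data: a list of tool names plus parameters that the server resolves from its registry and executes in order. The plan shape and tool names below are invented for illustration:

```python
# Sketch: a routing plan is a list of steps, each naming a tool and its
# parameters. The server resolves each tool and runs it in order.
# Tool names, behaviors, and the plan shape are hypothetical.
TOOLS = {
    "fetch_financials": lambda params: {"revenue": 100, **params},
    "summarize": lambda params: f"summary of {sorted(params)}",
}

def execute_plan(plan: list) -> list:
    results = []
    for step in plan:
        tool = TOOLS[step["tool"]]          # look the tool up by name
        results.append(tool(step.get("params", {})))
    return results

plan = [
    {"tool": "fetch_financials", "params": {"client": "ACME"}},
    {"tool": "summarize", "params": {"client": "ACME"}},
]
```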
Python SDK Usage

You can interact with the server from Python using the `MCPClient` class:

```python
from mcp_client import MCPClient

client = MCPClient(base_url="http://mcp-server:8000")
response = client.query_sync("Top clients in Canada")
print(response)
```

Use the `MCP_SERVER_URL` environment variable to configure the default server URL. When unset, it falls back to `http://localhost:8000`.
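That fallback can be expressed as a one-line resolver; a sketch, assuming the client reads the variable when it is constructed:

```python
import os

def resolve_base_url(explicit=None):
    # Precedence: explicit argument > MCP_SERVER_URL > localhost default.
    return explicit or os.getenv("MCP_SERVER_URL", "http://localhost:8000")
```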
For a more in-depth explanation of the codebase and example cURL usage, see the accompanying documentation.