# MCP Server and Client Setup

This project demonstrates a simple Model Context Protocol (MCP) server and client implementation using `fastmcp` and `mcp_use`.
## Project Structure

```
MCP-Server/
├── simple_mcp_server.py          # Simple MCP server with an "add" tool
├── simple_mcp_server_config.json # Configuration for the server
├── simple_mcp_client.py          # Basic client for direct tool calls
├── simple_client.py              # Enhanced simple client with multiple tool tests
├── client.py                     # Advanced client with agent capabilities
├── mcp-env/                      # Virtual environment with dependencies
└── README.md                     # This file
```
## Setup

1. **Virtual Environment**: The project uses a virtual environment located in `mcp-env/`.

2. **Dependencies**: The following packages are installed (a sketch of the install commands follows this list):
   - `fastmcp` - For creating MCP servers
   - `mcp_use` - For creating MCP clients and agents
   - `langchain-openai` - For OpenAI integration
   - `langchain-ollama` - For Ollama integration
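If you need to recreate the environment from scratch, a typical setup looks like the following (this assumes the packages are published on PyPI under these names; `mcp_use` installs as `mcp-use`):

```bash
# Create and activate the virtual environment
python3 -m venv mcp-env
source mcp-env/bin/activate

# Install the dependencies listed above
pip install fastmcp mcp-use langchain-openai langchain-ollama
```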
## Usage

### 1. Simple Direct Tool Calls

The `simple_mcp_client.py` script demonstrates basic direct tool calls:

```bash
python simple_mcp_client.py
```
### 2. Enhanced Simple Client

The `simple_client.py` script provides a more comprehensive demonstration:

```bash
python simple_client.py
```
This will:
- List all available tools
- Test the "add" tool with multiple parameter sets
- Show clear success/failure indicators
### 3. Advanced Client with Agent Capabilities

The `client.py` script demonstrates both direct tool calls and agent-based approaches:

```bash
python client.py
```
This will:
- Perform direct tool calls (works without an LLM)
- Attempt the agent-based approach if Ollama is available
- Provide helpful error messages and setup instructions
### 4. Using the Agent Approach

To use the agent approach with Ollama:

1. Install Ollama: https://ollama.ai/
2. Start Ollama:
   ```bash
   ollama serve
   ```
3. Pull the model:
   ```bash
   ollama pull llama3.1:8b
   ```
4. Run the client:
   ```bash
   python client.py
   ```
## Configuration

The `simple_mcp_server_config.json` file contains the server configuration:

```json
{
  "mcpServers": {
    "simple-server": {
      "command": "/Users/nandana/Documents/Projects/MCP-Server/mcp-env/bin/python",
      "args": ["simple_mcp_server.py"]
    }
  }
}
```
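The `mcpServers` map can hold more than one entry; each key is a server name that clients open sessions against. For illustration only, a second, hypothetical server (the `weather-server` name and `weather_mcp_server.py` script do not exist in this project) could be added like this:

```json
{
  "mcpServers": {
    "simple-server": {
      "command": "/Users/nandana/Documents/Projects/MCP-Server/mcp-env/bin/python",
      "args": ["simple_mcp_server.py"]
    },
    "weather-server": {
      "command": "/Users/nandana/Documents/Projects/MCP-Server/mcp-env/bin/python",
      "args": ["weather_mcp_server.py"]
    }
  }
}
```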
## Server Implementation

The `simple_mcp_server.py` script implements a simple MCP server with an "add" tool:

```python
#!/usr/bin/env python3
from mcp.server.fastmcp import FastMCP

# Create a FastMCP server named "simple-server"
app = FastMCP("simple-server")

@app.tool()
def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b

if __name__ == "__main__":
    app.run()
```
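For a quick smoke test, the server can also be started directly with the same interpreter the config file points at; assuming the default stdio transport, it will sit waiting for an MCP client to connect:

```bash
/Users/nandana/Documents/Projects/MCP-Server/mcp-env/bin/python simple_mcp_server.py
```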
## Client Examples

### Direct Tool Call Example

```python
import asyncio
from mcp_use import MCPClient

async def main():
    client = MCPClient("simple_mcp_server_config.json")
    try:
        session = await client.create_session("simple-server")
        result = await session.call_tool("add", {"a": 5, "b": 7})
        print(f"Result: {result.content[0].text}")
    finally:
        await client.close_all_sessions()

if __name__ == "__main__":
    asyncio.run(main())
```
### Enhanced Client Example

```python
import asyncio
import json
from mcp_use import MCPClient

async def main():
    with open("simple_mcp_server_config.json", "r") as f:
        config = json.load(f)
    client = MCPClient.from_dict(config)
    try:
        await client.create_all_sessions()
        for session_name in client.get_server_names():
            session = client.get_session(session_name)
            tools_response = await session.list_tools()
            tool_names = [tool.name for tool in tools_response]
            print(f"Available tools: {tool_names}")
            # Test tools
            for tool_name in tool_names:
                if tool_name == 'add':
                    result = await session.call_tool("add", {"a": 10, "b": 20})
                    print(f"add(10, 20) = {result.content[0].text}")
    finally:
        await client.close_all_sessions()

if __name__ == "__main__":
    asyncio.run(main())
```
### Agent Example

```python
import asyncio
from mcp_use import MCPClient, MCPAgent
from langchain_ollama import ChatOllama

async def main():
    client = MCPClient("simple_mcp_server_config.json")
    await client.create_all_sessions()
    # Requires Ollama running locally with the model already pulled
    llm = ChatOllama(model="llama3.1:8b")
    agent = MCPAgent(llm=llm, client=client, max_steps=5)
    result = await agent.run("What is the result of 98765 + 98767?")
    print(result)
    await client.close_all_sessions()

if __name__ == "__main__":
    asyncio.run(main())
```
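Since `langchain-openai` is also installed, the same agent can be driven by an OpenAI model instead of Ollama. A minimal sketch, assuming `OPENAI_API_KEY` is set in the environment (the `gpt-4o-mini` model name is just an example):

```python
import asyncio
from mcp_use import MCPClient, MCPAgent
from langchain_openai import ChatOpenAI

async def main():
    client = MCPClient("simple_mcp_server_config.json")
    try:
        await client.create_all_sessions()
        # Reads OPENAI_API_KEY from the environment
        llm = ChatOpenAI(model="gpt-4o-mini")
        agent = MCPAgent(llm=llm, client=client, max_steps=5)
        result = await agent.run("What is the result of 98765 + 98767?")
        print(result)
    finally:
        await client.close_all_sessions()

if __name__ == "__main__":
    asyncio.run(main())
```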
## Current Status

✅ **Working Features:**
- MCP server with "add" tool
- Direct tool calls via `mcp_use`
- Tool listing and discovery
- Multiple client implementations
- Proper error handling and cleanup

⚠️ **Requires Setup:**
- Ollama installation and model download for agent functionality
- OpenAI API key for OpenAI-based agents
## Troubleshooting

### Common Issues

1. **ModuleNotFoundError**: Make sure you're using the Python interpreter from the virtual environment:

   ```bash
   /Users/nandana/Documents/Projects/MCP-Server/mcp-env/bin/python your_script.py
   ```

2. **Connection Errors**: Ensure the server configuration points to the correct Python interpreter path.

3. **Agent Connection Issues**:
   - For Ollama: Install Ollama and pull the required model
   - For OpenAI: Set the `OPENAI_API_KEY` environment variable
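For example, in a shell session (the key value here is a placeholder):

```bash
export OPENAI_API_KEY="sk-..."
```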
### Debug Mode

Enable debug mode for more verbose output:

```bash
DEBUG=1 python your_script.py
```
## Extending the Server

To add more tools to your server, simply add more functions with the `@app.tool()` decorator:

```python
@app.tool()
def multiply(a: int, b: int) -> int:
    return a * b

@app.tool()
def divide(a: float, b: float) -> float:
    if b == 0:
        raise ValueError("Cannot divide by zero")
    return a / b
```
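Once the server is restarted, the new tools can be called by name just like `add`. A minimal sketch mirroring the direct tool call example above:

```python
import asyncio
from mcp_use import MCPClient

async def main():
    client = MCPClient("simple_mcp_server_config.json")
    try:
        session = await client.create_session("simple-server")
        # Call the newly added tools by name
        product = await session.call_tool("multiply", {"a": 6, "b": 7})
        print(f"multiply(6, 7) = {product.content[0].text}")
        quotient = await session.call_tool("divide", {"a": 10.0, "b": 4.0})
        print(f"divide(10.0, 4.0) = {quotient.content[0].text}")
    finally:
        await client.close_all_sessions()

if __name__ == "__main__":
    asyncio.run(main())
```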
## Next Steps

- **Add More Tools**: Extend the server with additional functionality
- **Multiple Servers**: Configure multiple MCP servers in the configuration
- **Custom Agents**: Build more sophisticated agents with different LLM providers
- **Error Handling**: Add comprehensive error handling and retry logic (a sketch follows this list)
- **Testing**: Add unit tests for your tools and integration tests for the client
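One possible shape for the retry logic mentioned above, as an illustrative helper around `session.call_tool` (the function name, attempt count, and backoff values are arbitrary choices, not part of this project):

```python
import asyncio

async def call_tool_with_retry(session, name, args, attempts=3, delay=1.0):
    """Retry a tool call with simple exponential backoff."""
    for attempt in range(1, attempts + 1):
        try:
            return await session.call_tool(name, args)
        except Exception as exc:
            if attempt == attempts:
                raise  # out of attempts, surface the last error
            print(f"Attempt {attempt} failed ({exc}); retrying in {delay}s...")
            await asyncio.sleep(delay)
            delay *= 2
```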