# Custom MCP Server Calculator with LangGraph and Ollama Integration

This project demonstrates a custom calculator built on an MCP (Model Context Protocol) server, integrated with LangChain's Ollama models and LangGraph to form an intelligent math assistant. The system performs math computations with predefined tools (add, subtract, multiply, divide, factorial, square_root) and orchestrates model and tool interactions step by step.
## 🚀 Features

- Implements a custom MCP server exposing calculator tools:
  - Addition
  - Subtraction
  - Multiplication
  - Division
  - Factorial
  - Square Root
- Optimized interaction with a local Ollama model (`qwen3:0.6b`).
- Uses LangGraph to define a computation workflow:
  - Adds a system prompt for tool-usage guidance
  - Calls the model and tools iteratively in a graph-based workflow
- Designed for step-by-step math problem solving.
## 📂 Project Structure

```
.
├── custom_mcp_server.py   # MCP server exposing calculator tools
├── main_client.py         # Main client that invokes the calculator tools
├── .env                   # Environment variables (if needed)
├── requirements.txt       # Project dependencies
└── README.md              # Project documentation
```
## ⚙️ Setup Instructions

### 1️⃣ Install Dependencies

```bash
pip install -r requirements.txt
```

Make sure you have:

- Python 3.10+
- langchain-ollama
- langgraph
- python-dotenv

(`asyncio` and `logging` ship with the Python standard library, so they need no separate install.)
### 2️⃣ Configure Environment

Create a `.env` file in the project root if you need to configure environment variables, e.g. for the Ollama model path or authentication.

### 3️⃣ Run the MCP Server

```bash
python custom_mcp_server.py
```

This starts the MCP server in stdio mode, exposing the math computation tools.
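For orientation, here is a minimal sketch of the math functions behind the six tools. The actual `custom_mcp_server.py` would additionally register each function with the MCP server (e.g. via a tool decorator) and serve it over stdio; those registration details are assumptions and are omitted here.

```python
# Hypothetical sketch of the calculator tools exposed by custom_mcp_server.py.
import math


def add(a: float, b: float) -> float:
    """Return the sum of a and b."""
    return a + b


def subtract(a: float, b: float) -> float:
    """Return a minus b."""
    return a - b


def multiply(a: float, b: float) -> float:
    """Return the product of a and b."""
    return a * b


def divide(a: float, b: float) -> float:
    """Return a divided by b, rejecting division by zero."""
    if b == 0:
        raise ValueError("Cannot divide by zero")
    return a / b


def factorial(n: int) -> int:
    """Return n! for a non-negative integer n."""
    if n < 0:
        raise ValueError("Factorial is undefined for negative numbers")
    return math.factorial(n)


def square_root(x: float) -> float:
    """Return the square root of a non-negative number."""
    if x < 0:
        raise ValueError("Cannot take the square root of a negative number")
    return math.sqrt(x)
```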
## ▶️ Running the Calculator Client

Run the main client script:

```bash
python main_client.py
```
It will:

1. Initialize the Ollama model.
2. Connect to the MCP server.
3. Fetch the available tools.
4. Execute a math question such as: "what's (3 + 5) x 12? Break it down step by step using available tools."
5. Print the final model result and the tool usage history.
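The step-by-step loop the client drives can be illustrated with a stripped-down, library-free sketch. The real `main_client.py` would use LangGraph's graph workflow and a `qwen3:0.6b` chat model to decide which tool to call next; here a hard-coded planner stands in for the model, so everything below is a hypothetical illustration of the orchestration pattern, not the actual client code.

```python
# Library-free sketch of the iterative model + tool loop for "(3 + 5) x 12".

# Two of the calculator tools, as plain callables.
TOOLS = {
    "add": lambda a, b: a + b,
    "multiply": lambda a, b: a * b,
}


def fake_model_plan(question: str):
    """Stand-in for the LLM: emit a fixed sequence of tool calls.

    "PREV" marks an argument to be filled with the previous tool's result.
    """
    return [("add", (3, 5)), ("multiply", ("PREV", 12))]


def run_workflow(question: str):
    """Run each planned tool call, feeding results forward, and keep a history."""
    history = []  # tool usage history, as the client prints at the end
    result = None
    for name, args in fake_model_plan(question):
        args = tuple(result if a == "PREV" else a for a in args)
        result = TOOLS[name](*args)
        history.append((name, args, result))
    return result, history


final, history = run_workflow("what's (3 + 5) x 12?")
# final == 96; history records the add step, then the multiply step
```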