langgraph-mcp-server


The LangGraph MCP Server is a specialized server designed to facilitate communication and data exchange between language models and various applications using the Model Context Protocol (MCP).

MCP Server using LangGraph

mcplanggraph is a Python project that integrates LangGraph, LangChain, and MCP (Model Context Protocol) to build powerful language model workflows with Groq LLM backend support.

Overview

This project enables coordination between multiple LLM-powered tools using MCP, Groq-backed LangChain agents, and LangGraph orchestration. It includes working tool servers, a LangChain agent client, and integration with Groq.

What is Model Context Protocol (MCP)?

The Model Context Protocol (MCP) is an open standard designed to enable seamless communication between LLM-based applications and external tools, data sources, or services. Introduced in 2024, it is built on top of JSON-RPC 2.0 and enables tools to be discovered, described, and invoked in a standardized way.

Key features of MCP:

  • Standardizes tool/plugin interaction across different LLM ecosystems.
  • Supports both local (stdio) and remote (streamable HTTP) transport layers.
  • Enables dynamic tool discovery and usage at runtime.
  • Facilitates chaining and coordination of multiple tools.

This protocol allows an LLM agent to invoke tools like math operations or weather services programmatically without hardcoding logic for each integration.
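As a rough illustration of the wire format, an MCP tool call travels as a JSON-RPC 2.0 request. The sketch below builds a hypothetical `tools/call` request for an `add` tool and a matching response; the method name follows MCP's general pattern, but the tool name, arguments, and result shape here are illustrative, not taken from a real server:

```python
import json

# A hypothetical MCP-style tool invocation, framed as JSON-RPC 2.0.
# The "add" tool and its arguments are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "add", "arguments": {"a": 2, "b": 3}},
}

# What a server's reply could look like: the same id, with the tool's
# output carried in the result field.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "5"}]},
}

# Serialize the request as it would travel over a stdio or HTTP transport.
wire = json.dumps(request)
print(wire)
```

Because the framing is standardized, the same request shape works whether the server is a local subprocess (stdio) or a remote HTTP endpoint.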

Project Structure

mcplanggraph/
├── client.py              # LangChain + Groq agent client using MCP tools
├── mathserver.py          # Math MCP tool server using stdio
├── weather.py             # Weather MCP tool server using HTTP
├── main.py                # Simple entrypoint script
├── requirements.txt       # Basic dependencies (alternative to pyproject.toml)
├── pyproject.toml         # Project metadata and dependencies
├── .gitignore             # Git ignore rules
├── .python-version        # Python version pin (3.13)
└── README.md              # Project documentation

Dependencies

Defined in pyproject.toml and requirements.txt:

  • langchain-groq: LangChain integration with Groq LLMs
  • langchain-mcp-adapters: Bridges LangChain tools with MCP servers
  • langgraph: Graph-based workflow orchestrator
  • mcp: Model Context Protocol for tool invocation

Getting Started

Prerequisites

  • Python 3.13+
  • uv (optional, for lockfile support)
  • Groq API key in your .env file

Installation

# Clone the repo
$ git clone https://github.com/atishay2411/langgraph-mcp-server
$ cd mcplanggraph

# Create a virtual environment
$ python3.13 -m venv .venv
$ source .venv/bin/activate

# Install dependencies
$ pip install -r requirements.txt
# Or use pyproject.toml
$ pip install .

Add a .env file with:

GROQ_API_KEY=your_api_key_here
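One way the client might read the key at startup is sketched below, using only the standard library. Note that `require_groq_key` is a hypothetical helper name, and the actual repo may load the `.env` file via python-dotenv instead:

```python
import os

def require_groq_key() -> str:
    """Fetch GROQ_API_KEY from the environment, failing fast if it is absent."""
    key = os.environ.get("GROQ_API_KEY")
    if not key:
        raise RuntimeError("GROQ_API_KEY is not set; add it to your .env file")
    return key
```

Failing fast here surfaces a missing key immediately, rather than as an opaque authentication error deep inside a Groq API call.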

Running the Example

Start the weather tool server (HTTP):

python weather.py

Run the client (spawns math tool subprocess):

python client.py

You should see the math and weather responses printed by the agent.
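Conceptually, the agent loop in client.py boils down to discovering the available tools and routing each LLM-chosen call to the right one. The pure-Python sketch below mimics that flow with two stand-in tools; no MCP, Groq, or LangGraph is involved, and all names are illustrative:

```python
# A toy registry standing in for MCP tool discovery: the real client pulls
# tool descriptions from the math (stdio) and weather (HTTP) servers at runtime.
def add(a: int, b: int) -> int:
    return a + b

def get_weather(city: str) -> str:
    # Placeholder; a real weather server would fetch live data.
    return f"It is sunny in {city}"

TOOLS = {"add": add, "get_weather": get_weather}

def invoke(tool_name: str, **kwargs):
    """Dispatch a call the way an agent routes a tool selected by the LLM."""
    if tool_name not in TOOLS:
        raise KeyError(f"unknown tool: {tool_name}")
    return TOOLS[tool_name](**kwargs)

print(invoke("add", a=2, b=3))              # 5
print(invoke("get_weather", city="Paris"))
```

In the real project, the LLM decides which tool to call and with what arguments; MCP supplies the discovery and transport layers that this dictionary lookup stands in for.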


Key Technologies

Tool       | Purpose
---------- | -------------------------------
LangGraph  | Stateful LLM workflows
MCP        | Protocol for tool orchestration
LangChain  | Framework for LLM apps
Groq       | High-speed inference backend
uv         | Fast dependency management

License

GNU

Contributing

Feel free to open issues or submit pull requests. Contributions are welcome!