
Kubernetes MCP Server-Client with LangGraph and Streamlit

This project provides a Kubernetes-focused MCP (Model Context Protocol) server and client built with LangGraph, MistralAI, and Streamlit for interactively querying and managing a cluster.

Environment Setup

  1. Clone the repository:
git clone https://github.com/sanjog-lama/k8s-agent-mcp-server-langgraph.git
cd k8s-agent-mcp-server-langgraph
  2. Create and activate a virtual environment:

python3 -m venv venv
source venv/bin/activate
  3. Install required packages:

pip install -r requirements.txt
  4. Create a .env file in the project root and add your Mistral API key:
MISTRAL_API_KEY=your_mistral_key
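
How the key is read is up to the project's scripts; a minimal sketch of loading it with python-dotenv (an assumption for illustration; the scripts may load it differently) looks like this:

# Illustrative only: the project's scripts may read the key another way.
import os
from dotenv import load_dotenv  # provided by the python-dotenv package

load_dotenv()                                    # reads .env from the current directory
MISTRAL_API_KEY = os.environ["MISTRAL_API_KEY"]  # raises KeyError if the key is missing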
  5. Run the project:
Open two separate terminals:
  • Terminal 1 - start the custom MCP server:
python k8_mcp_server.py

This will run your Kubernetes MCP server and expose the tools via streamable-http transport.
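The tools themselves live in k8_mcp_server.py and are not reproduced here. As a rough sketch, a server of this shape can be written with the MCP Python SDK's FastMCP helper; the server name, the list_pods tool, and the use of the official kubernetes client below are illustrative assumptions, not the project's actual code:

# Illustrative sketch only - the real k8_mcp_server.py may register different tools.
from kubernetes import client, config   # official Kubernetes Python client (assumed dependency)
from mcp.server.fastmcp import FastMCP  # FastMCP helper from the MCP Python SDK

mcp = FastMCP("k8s")  # hypothetical server name

@mcp.tool()
def list_pods(namespace: str = "default") -> list[str]:
    """Return the names of pods in the given namespace."""
    config.load_kube_config()  # uses the local kubeconfig
    pods = client.CoreV1Api().list_namespaced_pod(namespace)
    return [p.metadata.name for p in pods.items]

if __name__ == "__main__":
    # Serve the tools over the streamable-http transport, as described above.
    mcp.run(transport="streamable-http")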

  • Terminal 2 - run the MCP client

You can interact with the MCP server in two ways:

  • Option 1: Run the client script
python mcp_client_langgraph.py

This runs a CLI-based interaction using LangGraph.
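The script's internals are not shown here, but a client of this kind can be sketched with langchain-mcp-adapters and LangGraph's prebuilt ReAct agent. The server URL, port, model name, and sample question below are assumptions for illustration, not the project's actual configuration:

# Illustrative sketch only - the project's mcp_client_langgraph.py may be structured differently.
# Assumes langchain-mcp-adapters >= 0.1, where get_tools() is awaited on the client directly.
import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_mistralai import ChatMistralAI
from langgraph.prebuilt import create_react_agent

async def main() -> None:
    # Assumed server address; point this at wherever k8_mcp_server.py is listening.
    client = MultiServerMCPClient(
        {"k8s": {"url": "http://localhost:8000/mcp", "transport": "streamable_http"}}
    )
    tools = await client.get_tools()  # discover the server's Kubernetes tools
    agent = create_react_agent(ChatMistralAI(model="mistral-large-latest"), tools)
    result = await agent.ainvoke(
        {"messages": [("user", "List the pods in the default namespace")]}
    )
    print(result["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())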

  • Option 2: Run the Streamlit web app
streamlit run web_app.py
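
This serves a browser-based chat UI on top of the same agent. web_app.py is not reproduced here; the sketch below only shows the general shape of a Streamlit chat front end, with run_agent standing in as a hypothetical stub for the LangGraph client sketched above:

# Illustrative sketch only - the real web_app.py may differ.
import streamlit as st

def run_agent(prompt: str) -> str:
    # Hypothetical stub; in practice this would call the LangGraph + MCP client.
    return f"(agent reply for: {prompt})"

st.title("Kubernetes MCP Agent")

if "history" not in st.session_state:
    st.session_state.history = []  # list of (role, text) pairs

for role, text in st.session_state.history:
    st.chat_message(role).write(text)  # replay the conversation so far

if prompt := st.chat_input("Ask about your cluster"):
    st.chat_message("user").write(prompt)
    answer = run_agent(prompt)
    st.chat_message("assistant").write(answer)
    st.session_state.history += [("user", prompt), ("assistant", answer)]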