ASR Graph of Thoughts (GoT) Model Context Protocol (MCP) Server
The Advanced Scientific Research (ASR) Graph of Thoughts (GoT) MCP server is an implementation of the Model Context Protocol (MCP) that supports sophisticated reasoning workflows using graph-based representations.
Project Overview
This project implements a Model Context Protocol (MCP) server architecture that leverages a Graph of Thoughts approach to enhance AI reasoning capabilities. It can be connected to AI models and applications such as the Claude desktop app, or used through API-based integrations.
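To make the idea of graph-based reasoning concrete, the sketch below shows one possible way a thought graph could be represented in Python. The class and field names here are illustrative assumptions only; the project's actual data models live in src/asr_got/models/ and may differ.

```python
# Illustrative sketch only: the real data models live in src/asr_got/models/.
from dataclasses import dataclass, field


@dataclass
class ThoughtNode:
    """A single reasoning step (hypothesis, evidence item, conclusion, ...)."""
    node_id: str
    content: str
    node_type: str = "hypothesis"   # e.g. "hypothesis", "evidence", "conclusion"
    score: float = 0.0              # confidence assigned during pruning


@dataclass
class ThoughtGraph:
    """A directed graph of reasoning steps."""
    nodes: dict[str, ThoughtNode] = field(default_factory=dict)
    edges: list[tuple[str, str]] = field(default_factory=list)  # (source_id, target_id)

    def add_node(self, node: ThoughtNode) -> None:
        self.nodes[node.node_id] = node

    def connect(self, source_id: str, target_id: str) -> None:
        self.edges.append((source_id, target_id))
```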
Project Structure
asr-got-mcp/
├── docker-compose.yml           # Docker Compose configuration for multi-container setup
├── Dockerfile                   # Docker configuration for the backend
├── requirements.txt             # Python dependencies
├── src/                         # Source code
│   ├── server.py                # Main server implementation
│   ├── asr_got/                 # Core ASR-GoT implementation
│   │   ├── core.py              # Core functionality
│   │   ├── stages/              # Processing stages
│   │   │   ├── stage_1_initialization.py
│   │   │   ├── stage_2_decomposition.py
│   │   │   ├── stage_3_hypothesis.py
│   │   │   ├── stage_4_evidence.py
│   │   │   ├── stage_5_pruning.py
│   │   │   ├── stage_6_subgraph.py
│   │   │   ├── stage_7_composition.py
│   │   │   └── stage_8_reflection.py
│   │   ├── utils/               # Utility functions
│   │   └── models/              # Data models
│   └── api/                     # API implementation
│       ├── routes.py            # API routes
│       └── schema.py            # API schemas
├── config/                      # Configuration files
└── tests/                       # Test suite
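The eight stage modules under src/asr_got/stages/ suggest a sequential reasoning pipeline. The snippet below is a hypothetical sketch of how such a pipeline could be wired together; the actual orchestration lives in src/asr_got/core.py, and the function names and context shape shown here are assumptions.

```python
# Hypothetical orchestration sketch; the real implementation is in src/asr_got/core.py.
from typing import Any, Callable

Stage = Callable[[dict[str, Any]], dict[str, Any]]


def run_got_pipeline(query: str, stages: list[Stage]) -> dict[str, Any]:
    """Thread a shared reasoning context through each stage in order."""
    context: dict[str, Any] = {"query": query, "graph": {"nodes": [], "edges": []}}
    for stage in stages:
        context = stage(context)   # each stage reads and updates the thought graph
    return context


# Stand-in stages mirroring the module names above (initialization ... reflection).
def stage_1_initialization(ctx: dict[str, Any]) -> dict[str, Any]:
    ctx["graph"]["nodes"].append({"id": "root", "content": ctx["query"]})
    return ctx


def stage_8_reflection(ctx: dict[str, Any]) -> dict[str, Any]:
    ctx["answer"] = f"Reflected on {len(ctx['graph']['nodes'])} node(s)."
    return ctx


if __name__ == "__main__":
    result = run_got_pipeline("What drives antibiotic resistance?",
                              [stage_1_initialization, stage_8_reflection])
    print(result["answer"])
```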
Running the Project with Docker
This project provides a multi-container Docker setup for both the Python backend (FastAPI) and the static JavaScript client. The setup uses Docker Compose for orchestration.
Project-Specific Docker Requirements
- Python Version: 3.13-slim (as specified in the backend Dockerfile)
- System Dependencies: build-essential and curl (installed in the backend image)
- Non-root Users: Both the backend and client containers run as non-root users for security
- Virtual Environment: Python dependencies are installed in a virtual environment (/app/.venv)
- Static Client: Served via nginx (alpine) in a separate container
Environment Variables
The backend service sets the following environment variables (see Dockerfile):
- PYTHONUNBUFFERED=1
- MCP_SERVER_PORT=8082 (the FastAPI server port)
- LOG_LEVEL=INFO
Note: If you need to override or add environment variables, you can uncomment and use the env_file option in docker-compose.yml.
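As a hedged illustration, a FastAPI entry point could pick up these variables roughly as follows; the actual src/server.py may read its configuration differently, so treat this as a sketch rather than the project's code.

```python
# Illustrative only: shows how the documented environment variables could be consumed.
import os

import uvicorn
from fastapi import FastAPI

app = FastAPI(title="ASR-GoT MCP Server")

if __name__ == "__main__":
    port = int(os.environ.get("MCP_SERVER_PORT", "8082"))
    log_level = os.environ.get("LOG_LEVEL", "INFO").lower()
    uvicorn.run(app, host="0.0.0.0", port=port, log_level=log_level)
```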
Exposed Ports
- Backend (python-app):
  - Host: 8082 → Container: 8082 (FastAPI server)
- Client (js-client):
  - Host: 80 → Container: 80 (nginx static server)
Build and Run Instructions
- Build and start all services:
  docker compose up --build
  This will build both the backend and client images and start the containers.
- Access the services:
  - Backend API: http://localhost:8082
  - Static Client: http://localhost/
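For a quick programmatic check once the containers are up, something like the following could exercise the backend. The endpoint path and payload shown in the commented-out request are assumptions for illustration; consult src/api/routes.py and src/api/schema.py for the actual routes and request schema.

```python
# Assumes the containers are running; the reasoning endpoint and payload are hypothetical.
import requests

BASE_URL = "http://localhost:8082"

# A plain GET against the server root confirms the backend is reachable.
response = requests.get(BASE_URL, timeout=10)
print(response.status_code)

# Hypothetical reasoning request; check src/api/routes.py for the real route and schema.
# response = requests.post(f"{BASE_URL}/query", json={"question": "..."}, timeout=60)
# print(response.json())
```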
Integration with AI Models
This MCP server can be integrated with:
- Claude desktop application
- API-based integrations with AI models
- Other MCP-compatible clients
Development
To set up a development environment without Docker:
- Clone this repository
- Create a virtual environment:
  python -m venv venv
- Activate the virtual environment:
  - Windows: venv\Scripts\activate
  - Linux/Mac: source venv/bin/activate
- Install dependencies:
  pip install -r requirements.txt
- Run the server:
  python src/server.py
License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
If you update dependencies, remember to rebuild the images with docker compose build.