# Uniswap V3 Subgraph MCP Server
This is an MCP (Model Context Protocol) server that provides access to the Uniswap V3 Subgraph API. It enables AI agents and LLMs to interact with Uniswap V3 Subgraph through standardized tools.
## Features
- 🔧 MCP Protocol: Built on the Model Context Protocol for seamless AI integration
- 🌐 Full API Access: Provides tools for interacting with Uniswap V3 Subgraph endpoints
- 🐳 Docker Support: Easy deployment with Docker and Docker Compose
- ⚡ Async Operations: Built with FastMCP for efficient async handling
## API Documentation

- Uniswap V3 Subgraph endpoint: https://api.thegraph.com/subgraphs/name/uniswap/uniswap-v3
- API Documentation: https://docs.uniswap.org/api/subgraph/overview
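The server wraps this GraphQL endpoint. For orientation, a standalone query against the subgraph might look like the sketch below; it uses `httpx` as an assumed HTTP client and field names from the public Uniswap V3 subgraph schema, and is not part of this server's code.

```python
# Illustrative standalone query against the Uniswap V3 subgraph (not part of this server).
import httpx

SUBGRAPH_URL = "https://api.thegraph.com/subgraphs/name/uniswap/uniswap-v3"

QUERY = """
{
  pools(first: 5, orderBy: totalValueLockedUSD, orderDirection: desc) {
    id
    feeTier
    totalValueLockedUSD
    token0 { symbol }
    token1 { symbol }
  }
}
"""


def top_pools() -> list[dict]:
    # POST the GraphQL query and return the pool objects from the response.
    response = httpx.post(SUBGRAPH_URL, json={"query": QUERY}, timeout=30)
    response.raise_for_status()
    return response.json()["data"]["pools"]


if __name__ == "__main__":
    for pool in top_pools():
        print(pool["token0"]["symbol"], "/", pool["token1"]["symbol"], pool["totalValueLockedUSD"])
```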
## Available Tools
This server provides the following tools:
- `example_tool`: Placeholder tool (to be implemented)
- `get_api_info`: Get information about the API service and authentication status

Note: Replace `example_tool` with actual Uniswap V3 Subgraph API tools based on the documentation.
## Installation

### Using Docker (Recommended)

1. Clone this repository:

   ```bash
   git clone https://github.com/Traia-IO/uniswap-v3-subgraph-mcp-server.git
   cd uniswap-v3-subgraph-mcp-server
   ```

2. Run with Docker:

   ```bash
   ./run_local_docker.sh
   ```
### Using Docker Compose

1. Create a `.env` file with your configuration:

   ```bash
   PORT=8000
   ```

2. Start the server:

   ```bash
   docker-compose up
   ```
### Manual Installation

1. Install dependencies using `uv`:

   ```bash
   uv pip install -e .
   ```

2. Run the server:

   ```bash
   uv run python -m server
   ```
## Usage
### Health Check
Test if the server is running:
```bash
python mcp_health_check.py
```
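If the script is not available, a rough, hypothetical liveness probe is sketched below. It does not speak the MCP protocol; it only checks that the server answers HTTP on the expected port and path (exact status codes depend on the streamable-http transport), so treat it as a sanity check only.

```python
# Hypothetical liveness probe: verifies only that the server answers HTTP,
# not that MCP requests succeed.
import httpx


def is_reachable(url: str = "http://localhost:8000/mcp/") -> bool:
    try:
        response = httpx.get(url, timeout=5)
    except httpx.HTTPError:
        return False
    # Any HTTP response (even 4xx from a streaming endpoint that expects
    # different headers) means the server process is up and routing requests.
    return response.status_code < 500


if __name__ == "__main__":
    print("server reachable:", is_reachable())
```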
### Using with CrewAI
```python
import asyncio

from traia_iatp.mcp.traia_mcp_adapter import create_mcp_adapter


async def main():
    # Connect to the MCP server
    with create_mcp_adapter(url="http://localhost:8000/mcp/") as tools:
        # Use the tools
        for tool in tools:
            print(f"Available tool: {tool.name}")

        # Example usage
        result = await tool.example_tool(query="test")
        print(result)


asyncio.run(main())
```
## Development

### Testing the Server
1. Start the server locally
2. Run the health check:

   ```bash
   python mcp_health_check.py
   ```

3. Test individual tools using the CrewAI adapter
### Adding New Tools
To add new tools, edit `server.py` and:

1. Create API client functions for Uniswap V3 Subgraph endpoints
2. Add `@mcp.tool()` decorated functions (see the sketch after this list)
3. Update this README with the new tools
4. Update `deployment_params.json` with the tool names in the capabilities array
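As a starting point, a new tool could look like the sketch below. It assumes the FastMCP `@mcp.tool()` decorator mentioned above (via `mcp.server.fastmcp` from the official Python SDK) and `httpx` for the request; the `get_pool` name, parameter, and GraphQL fields are illustrative (following the public Uniswap V3 subgraph schema), not part of the existing server. In the real `server.py` the `FastMCP` instance already exists; it is re-declared here only to keep the sketch self-contained.

```python
# Hypothetical tool sketch for server.py; the tool name, parameter, and
# GraphQL fields are illustrative.
import httpx
from mcp.server.fastmcp import FastMCP

SUBGRAPH_URL = "https://api.thegraph.com/subgraphs/name/uniswap/uniswap-v3"

mcp = FastMCP("uniswap-v3-subgraph-mcp")


@mcp.tool()
async def get_pool(pool_id: str) -> dict:
    """Fetch basic stats for a Uniswap V3 pool by its contract address."""
    query = """
    query Pool($id: ID!) {
      pool(id: $id) {
        id
        feeTier
        liquidity
        totalValueLockedUSD
        token0 { symbol decimals }
        token1 { symbol decimals }
      }
    }
    """
    async with httpx.AsyncClient(timeout=30) as client:
        response = await client.post(
            SUBGRAPH_URL,
            json={"query": query, "variables": {"id": pool_id.lower()}},
        )
        response.raise_for_status()
        return response.json()["data"]["pool"]
```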
## Deployment

### Deployment Configuration

The `deployment_params.json` file contains the deployment configuration for this MCP server:
```json
{
  "github_url": "https://github.com/Traia-IO/uniswap-v3-subgraph-mcp-server",
  "mcp_server": {
    "name": "uniswap-v3-subgraph-mcp",
    "description": "Decentralized exchange protocol api providing on-chain trading data for ethereum's largest dex. query liquidity pools, token swaps, and concentrated liquidity positions across 3,000+ trading pairs with $4b+ tvl. access real-time pool reserves, token prices derived from pool ratios, 24hr volume statistics, and fee tier distributions (0.01%, 0.05%, 0.3%, 1%). track individual positions with range orders, liquidity provision history, earned fees, and impermanent loss calculations. historical swap data includes transaction hashes, block numbers, timestamps, input/output amounts, and price impact for every trade. monitor pool creation events, liquidity adds/removes, and flash loan activity. advanced queries support tick-level granularity for concentrated liquidity ranges, time-series aggregations for volume/tvl tracking, and multi-hop route discovery for optimal swap paths. factory contract data provides protocol-wide statistics: total volume, total tvl, unique traders count, and transaction counts. token analytics include price history, volume rankings, holder distributions, and cross-pool arbitrage opportunities. essential for mev bot development, arbitrage detection, liquidity mining optimization, impermanent loss analysis, and automated market maker strategies. graphql interface supports complex queries with filtering, sorting, and pagination. rate limits: 1,000 queries/day for free tier.",
    "server_type": "streamable-http",
    "capabilities": [
      // List all implemented tool names here
      "example_tool",
      "get_api_info"
    ]
  },
  "deployment_method": "cloud_run",
  "gcp_project_id": "traia-mcp-servers",
  "gcp_region": "us-central1",
  "tags": ["uniswap v3 subgraph", "api"],
  "ref": "main"
}
```
Important: Always update the capabilities array when you add or remove tools!
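One way to keep that in sync is a small consistency check like the hypothetical sketch below. It assumes `server.py` exposes its FastMCP instance as `mcp` and that FastMCP provides an async `list_tools()` method; the script is illustrative, not part of the repository.

```python
# Hypothetical check that registered tools match deployment_params.json.
import asyncio
import json

from server import mcp  # assumes server.py exposes the FastMCP instance as `mcp`


async def main() -> None:
    registered = {tool.name for tool in await mcp.list_tools()}
    with open("deployment_params.json") as fh:
        declared = set(json.load(fh)["mcp_server"]["capabilities"])

    if registered != declared:
        print("missing from capabilities:", sorted(registered - declared))
        print("declared but not registered:", sorted(declared - registered))
    else:
        print("capabilities array is in sync")


asyncio.run(main())
```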
### Google Cloud Run
This server is designed to be deployed on Google Cloud Run. The deployment will:
- Build a container from the Dockerfile
- Deploy to Cloud Run with the specified configuration
- Expose the `/mcp` endpoint for client connections
### Environment Variables

- `PORT`: Server port (default: 8000)
- `STAGE`: Environment stage (default: MAINNET; options: MAINNET, TESTNET)
- `LOG_LEVEL`: Logging level (default: INFO)
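For reference, reading these variables with their documented defaults might look like this minimal sketch (the constant names are illustrative):

```python
# Illustrative sketch: read the documented environment variables with their defaults.
import os

PORT = int(os.getenv("PORT", "8000"))
STAGE = os.getenv("STAGE", "MAINNET")      # MAINNET or TESTNET
LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")
```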
## Troubleshooting

1. Server not starting: Check Docker logs with `docker logs <container-id>`
2. Connection errors: Ensure the server is running on the expected port
3. Tool errors: Check the server logs for detailed error messages
## Contributing
- Fork the repository
- Create a feature branch
- Implement new tools or improvements
- Update the README and deployment_params.json
- Submit a pull request