# OpenResearch MCP Server
An AI-powered academic research platform that provides MCP (Model Context Protocol) server capabilities for structuring academic papers into knowledge graphs and enabling automated paper review and generation.
## Features
- Academic Paper Processing: Structure academic papers into knowledge graphs
- Automated Research Tools: AI-powered paper review and generation capabilities
- MCP Protocol Support: Full compatibility with Model Context Protocol
- Go Service Integration: High-performance backend services written in Go
- Asynchronous Architecture: Built with Python asyncio for optimal performance
- Comprehensive Logging: Structured logging with detailed error tracking
## Requirements
- Python 3.10
- Dependencies listed in `requirements.txt`
## Installation
- Clone the repository:

```bash
git clone https://github.com/yourusername/openresearch-mcp-server.git
cd openresearch-mcp-server
```

- Install Python dependencies:

```bash
pip install -r requirements.txt
```
## Quick Start
### Running the MCP Server

```bash
python src/main.py
```
The server will start and listen for MCP protocol connections via stdio.
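To verify the server end to end, a minimal client can launch it, initialize a session over stdio, and list the available tools. The sketch below is illustrative rather than part of this project; it assumes the official `mcp` Python SDK is installed and is run from the repository root.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server as a subprocess and talk to it over stdio.
    params = StdioServerParameters(command="python", args=["src/main.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


asyncio.run(main())
```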
### Configuration
The server uses configuration settings from `src/config.py`. Key settings include:

- `server_name`: Name of the MCP server
- Logging configuration
- Service endpoints and timeouts
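For illustration only, settings of this kind could be modeled roughly as below; the field names, defaults, and environment variables here are assumptions, and the actual definitions in `src/config.py` may differ.

```python
import os
from dataclasses import dataclass


@dataclass
class Settings:
    # Hypothetical fields -- check src/config.py for the real ones.
    server_name: str = "openresearch-mcp-server"
    log_level: str = os.getenv("LOG_LEVEL", "INFO")
    go_service_url: str = os.getenv("GO_SERVICE_URL", "http://localhost:8080")
    request_timeout_s: float = 30.0


settings = Settings()
```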
## Architecture
### Core Components

- `AcademicMCPServer`: Main server class handling the MCP protocol
- `GoServiceClient`: Client for communicating with Go backend services
- `DataProcessor`: Handles academic data processing and analysis
- Tool Registry: Dynamic tool registration and execution system
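As a rough sketch of the dynamic-registration idea (not the actual code in `src/core/tools.py`), a registry can map tool names to async handlers and dispatch calls by name; the `analyze_paper` handler below is a hypothetical example.

```python
from typing import Any, Awaitable, Callable, Dict

ToolHandler = Callable[[Dict[str, Any]], Awaitable[Any]]


class ToolRegistry:
    """Maps tool names to async handlers and dispatches calls by name."""

    def __init__(self) -> None:
        self._tools: Dict[str, ToolHandler] = {}

    def register(self, name: str) -> Callable[[ToolHandler], ToolHandler]:
        def decorator(handler: ToolHandler) -> ToolHandler:
            self._tools[name] = handler
            return handler
        return decorator

    async def execute(self, name: str, arguments: Dict[str, Any]) -> Any:
        if name not in self._tools:
            raise ValueError(f"Unknown tool: {name}")
        return await self._tools[name](arguments)


registry = ToolRegistry()


@registry.register("analyze_paper")  # hypothetical tool name
async def analyze_paper(arguments: Dict[str, Any]) -> str:
    return f"analyzed {arguments.get('paper_id')}"
```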
### Project Structure

```
openresearch-mcp-server/
├── src/
│   ├── server/
│   │   └── mcp_server.py       # Main MCP server implementation
│   ├── clients/
│   │   └── go_client.py        # Go service client
│   ├── services/
│   │   └── data_processor.py   # Data processing services
│   ├── core/
│   │   └── tools.py            # Tool definitions and registry
│   ├── utils/
│   │   └── logging_config.py   # Logging configuration
│   ├── config.py               # Application configuration
│   └── main.py                 # Application entry point
├── scripts/                    # Utility scripts
├── requirements.txt            # Python dependencies
└── README.md                   # This file
```
## Available Tools
The server provides various research tools accessible via the MCP protocol:
- Paper Analysis: Extract and analyze academic paper content
- Knowledge Graph Generation: Convert papers into structured knowledge graphs
- Research Synthesis: Automated literature review and synthesis
- Citation Analysis: Analyze citation networks and relationships
## MCP Protocol Support
### Supported Capabilities
- Tools: Dynamic tool listing and execution
- Error Handling: Comprehensive error reporting and recovery
- Async Operations: Full asynchronous operation support
### Tool Execution
Tools are executed via the MCP `call_tool` method:

```json
{
  "method": "tools/call",
  "params": {
    "name": "tool_name",
    "arguments": {
      "param1": "value1",
      "param2": "value2"
    }
  }
}
```
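From a Python client, the same request can be issued with `ClientSession.call_tool`. The snippet below is a sketch assuming the official `mcp` SDK; `tool_name` and its arguments are the placeholders from the JSON example above, so substitute a name reported by `tools/list`.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def call_tool_example() -> None:
    params = StdioServerParameters(command="python", args=["src/main.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Placeholder name/arguments -- use a tool listed by tools/list.
            result = await session.call_tool(
                "tool_name",
                arguments={"param1": "value1", "param2": "value2"},
            )
            for block in result.content:
                print(block)


asyncio.run(call_tool_example())
```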
## Logging and Monitoring
The server uses structured logging with the following features:
- Structured Logs: JSON-formatted logs with contextual information
- Error Tracking: Detailed error reporting with stack traces
- Performance Monitoring: Tool execution timing and performance metrics
- Debug Support: Configurable log levels for development and production
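The project's actual setup lives in `src/utils/logging_config.py`; as a minimal stand-alone sketch, JSON-formatted structured logging can be wired up with the standard library alone:

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON object."""

    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "time": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        }
        if record.exc_info:
            payload["exc_info"] = self.formatException(record.exc_info)
        return json.dumps(payload)


handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logging.basicConfig(level=logging.INFO, handlers=[handler])

logging.getLogger("openresearch").info("server started")
```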
## Error Handling
Robust error handling includes:
- Tool Validation: Verification of tool existence before execution
- Input Validation: Argument validation for all tool calls
- Graceful Degradation: Proper error responses via MCP protocol
- Resource Cleanup: Automatic cleanup of resources on shutdown
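The pattern is roughly validate, execute, then report failures as structured error payloads instead of crashing. The helper below is illustrative only; the real handling lives in `src/server/mcp_server.py` and returns errors through the MCP protocol.

```python
from typing import Any, Awaitable, Callable, Dict

Handler = Callable[[Dict[str, Any]], Awaitable[Any]]


async def safe_call_tool(
    tools: Dict[str, Handler], name: str, arguments: Dict[str, Any]
) -> Dict[str, Any]:
    if name not in tools:  # tool validation
        return {"isError": True, "message": f"Unknown tool: {name}"}
    try:
        result = await tools[name](arguments)  # execution
        return {"isError": False, "result": result}
    except Exception as exc:  # graceful degradation
        return {"isError": True, "message": f"{type(exc).__name__}: {exc}"}
```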
## Development Workflow
### Running in Development Mode

```bash
# Set development environment
export PYTHONPATH="${PYTHONPATH}:$(pwd)"
python src/main.py
```
### Testing Tools
Use an MCP client to list the available tools and to execute a specific tool with its parameters; the Python snippets in the Quick Start and Tool Execution sections can be reused for this.
## Performance Considerations
- Async Architecture: Non-blocking I/O operations
- Connection Pooling: Efficient Go service client connections (see the client sketch after this list)
- Resource Management: Proper cleanup and resource management
- Error Recovery: Automatic recovery from transient failures
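As an illustration of the pooling and recovery points above (and not the actual `src/clients/go_client.py`), a client might hold one pooled `aiohttp` session and retry transient failures with backoff; the HTTP transport and the `aiohttp` dependency are assumptions here.

```python
import asyncio
from typing import Optional

import aiohttp


class GoServiceClient:
    """Illustrative pooled HTTP client for the Go backend services."""

    def __init__(self, base_url: str, timeout_s: float = 30.0) -> None:
        self._base_url = base_url.rstrip("/")
        self._timeout = aiohttp.ClientTimeout(total=timeout_s)
        self._session: Optional[aiohttp.ClientSession] = None

    async def __aenter__(self) -> "GoServiceClient":
        # One pooled session is reused for every request.
        self._session = aiohttp.ClientSession(timeout=self._timeout)
        return self

    async def __aexit__(self, *exc_info) -> None:
        if self._session is not None:
            await self._session.close()

    async def get_json(self, path: str, retries: int = 3) -> dict:
        assert self._session is not None, "use 'async with GoServiceClient(...)'"
        for attempt in range(retries):
            try:
                async with self._session.get(self._base_url + path) as resp:
                    resp.raise_for_status()
                    return await resp.json()
            except aiohttp.ClientError:
                if attempt == retries - 1:
                    raise
                await asyncio.sleep(2 ** attempt)  # backoff on transient failures
```

Used as `async with GoServiceClient(base_url) as client:`, the pooled session is closed automatically on shutdown, which matches the resource-management point above.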
## Contributing
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Support
For support and questions:
- Create an issue on GitHub
- Check the documentation in the `docs/` directory
- Review the code comments for implementation details
## Roadmap
- Enhanced paper parsing capabilities
- Additional knowledge graph formats
- Real-time collaboration features
- Advanced citation analysis
- Integration with more academic databases
## Performance Tips
- Connection Management: The server uses async context managers for efficient resource handling
- Tool Caching: Frequently used tools benefit from result caching
- Batch Processing: Process multiple papers in batches for better performance
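For the batch-processing tip, one sketch (with a placeholder `process_paper` standing in for a real per-paper tool call) bounds concurrency with a semaphore while fanning out with `asyncio.gather`:

```python
import asyncio


async def process_paper(paper_id: str) -> str:
    # Placeholder for a real per-paper tool call.
    await asyncio.sleep(0.1)
    return f"processed {paper_id}"


async def process_batch(paper_ids: list[str], max_concurrency: int = 5) -> list[str]:
    semaphore = asyncio.Semaphore(max_concurrency)

    async def bounded(paper_id: str) -> str:
        async with semaphore:
            return await process_paper(paper_id)

    return list(await asyncio.gather(*(bounded(pid) for pid in paper_ids)))


print(asyncio.run(process_batch(["paper-1", "paper-2", "paper-3"])))
```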
Built with ❤️ for the academic research community