Rust Documentation Analyzer with Knowledge Graph & FastMCP Server
A powerful tool to analyze Rust JSON documentation, extract entities and semantic relationships, and manage them in a local SQLite Knowledge Graph for advanced analysis with FastMCP 2.0 integration. Seamlessly connect your Rust documentation to LLMs like Claude through the Model Context Protocol.
⨠Key Features
- Advanced Analysis - Automatically extract modules, structs, functions, and traits from Rust code
- Knowledge Graph - Store and query relationships between entities in a local SQLite database
- FastMCP 2.0 - High-performance MCP server implementation with async support
- Dual Transport - Support for both STDIO and HTTP transports
- Powerful Search - Search through entities with advanced filters and support for multiple libraries
- Optimized Performance - Handles large codebases with fast response times
- Extensible - Easy to integrate with other tools and workflows
- LLM Integration - Seamless integration with Claude Desktop and other MCP-compatible LLMs
Quick Start
Prerequisites
- Python 3.10+
- uv package manager
- Rust (for building documentation)
Generating Rust Documentation JSON
To analyze your Rust project, you first need to generate JSON documentation for it with rustdoc; one way to do this is sketched below.
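The exact command depends on your toolchain: rustdoc's JSON output is currently a nightly-only feature enabled through -Z unstable-options. The snippet below is a minimal sketch of one way to produce the JSON by shelling out to cargo; the flags and the output location (target/doc/<crate_name>.json) may vary between Rust versions.

```python
# generate_rustdoc_json.py - sketch: produce rustdoc JSON for a crate (assumes a nightly toolchain)
import os
import subprocess
import sys
from pathlib import Path

def generate_rustdoc_json(crate_dir: str) -> Path:
    """Run cargo doc with JSON output enabled and return the target/doc directory."""
    env = dict(os.environ)
    # JSON output is unstable, so it has to be switched on explicitly.
    env["RUSTDOCFLAGS"] = "-Z unstable-options --output-format json"
    subprocess.run(["cargo", "+nightly", "doc", "--no-deps"], cwd=crate_dir, env=env, check=True)
    return Path(crate_dir) / "target" / "doc"

if __name__ == "__main__":
    doc_dir = generate_rustdoc_json(sys.argv[1] if len(sys.argv) > 1 else ".")
    print("rustdoc JSON files:", [str(p) for p in doc_dir.glob("*.json")])
```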
Installation
- Clone the repository:
  git clone https://github.com/yourusername/mcp-py-json-doc.git
  cd mcp-py-json-doc
- Set up the virtual environment (recommended):
  # Create and activate the virtual environment
  python -m venv venv
  source venv/bin/activate   # Linux/Mac
  # .\venv\Scripts\activate  # Windows
- Install dependencies with uv (faster than pip):
  uv pip install -e .
  For development, install additional dependencies:
  uv pip install -e ".[dev]"
Importing Documentation
Loading JSON Files into the Knowledge Graph
To import Rust documentation JSON files into the Knowledge Graph, use the following command:
uv run python -m src.main <path_to_json> -o output [options]
Parameters:
- <path_to_json>: Path to the Rust documentation JSON file (required)
- -o, --output: Output directory where the Knowledge Graph database will be stored (default: output)
- --verbose, -v: Enable verbose logging for debugging
- --no-kg: Skip loading data into the Knowledge Graph (for testing)
- --no-progress: Disable progress bars
Examples:
- Basic usage (import a single file):
  uv run python -m src.main doc/serde.json -o output
- Import with verbose output:
  uv run python -m src.main doc/tokio.json -o output --verbose
- Import multiple files (run commands sequentially):
  uv run python -m src.main doc/serde.json -o output && \
  uv run python -m src.main doc/tokio.json -o output
Output:
The command will create the following files in the specified output directory:
- knowledge_graph.db: SQLite database containing all entities and relationships
- analysis_results.json: Summary of the import process
- full_analysis.json: Detailed analysis of the imported documentation
Notes:
- The database will be created if it doesn't exist
- Existing data will be appended to the database (use different output directories to keep datasets separate)
- For large documentation files, the import process might take several minutes
Verifying the Import
After importing, you can verify the contents of the Knowledge Graph using the interactive explorer:
python explore_kg.py output/knowledge_graph.db
This will start an interactive shell where you can query the imported data.
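If you prefer a non-interactive check, you can also open the SQLite file directly. The snippet below is a small sketch; the entities table and entity_type column are assumptions about the schema, so adjust the names to whatever the sqlite_master listing reports.

```python
# inspect_kg.py - sketch: sanity-check the Knowledge Graph database after an import
# NOTE: the 'entities' table and 'entity_type' column are assumed names; adjust to the real schema.
import sqlite3

conn = sqlite3.connect("output/knowledge_graph.db")
try:
    # List the tables the import created.
    tables = [row[0] for row in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")]
    print("tables:", tables)

    # If an entities table exists, show how many entities of each type were imported.
    if "entities" in tables:
        for entity_type, count in conn.execute(
            "SELECT entity_type, COUNT(*) FROM entities GROUP BY entity_type"
        ):
            print(f"{entity_type}: {count}")
finally:
    conn.close()
```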
Usage
1. Import Rust Documentation
First, import the Rust documentation into the knowledge graph:
uv run python -m src.main path/to/rust_docs.json -o output
2. Start the MCP Server
You can run the MCP server in two different transport modes:
STDIO Transport (for LLM integration)
PYTHONPATH=. python -c "from src.mcp_server.server import main; main()" --kg-db output/knowledge_graph.db --transport stdio
This mode is ideal for integration with Claude Desktop, Anthropic API, or other MCP-compatible LLM clients.
HTTP Transport (for API access)
PYTHONPATH=. python -c "from src.mcp_server.server import main; main()" --kg-db output/knowledge_graph.db --transport http --host 0.0.0.0 --port 8000
This mode provides a REST API interface for testing or integration with web applications.
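For a quick smoke test of the HTTP mode you can connect with FastMCP's own Python client. This is a sketch rather than part of the project: the /mcp path is the FastMCP default for the HTTP transport and may differ depending on how the server is configured.

```python
# http_smoke_test.py - sketch: connect to the running HTTP server with the FastMCP client
# Assumes the server was started with --transport http --port 8000; /mcp is the FastMCP default path.
import asyncio
from fastmcp import Client

async def main():
    async with Client("http://localhost:8000/mcp") as client:
        tools = await client.list_tools()
        print("available tools:", [tool.name for tool in tools])

asyncio.run(main())
```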
3. MCP Client Integration
The MCP server is compatible with a growing ecosystem of applications and platforms that support the Model Context Protocol:
AI Development Environments
- Windsurf - Modern IDE with built-in MCP support
- Cursor - AI-first code editor with MCP integration
- Zed - High-performance editor with native MCP capabilities
- Replit - Cloud-based development environment with MCP support
AI Assistants
- Claude Desktop - Anthropic's desktop application with MCP tool integration
- Claude API - Use with the Claude API's tool calling capabilities
- Other MCP Clients - Any application that implements the MCP specification
Configuration
Most MCP clients can be configured by adding the server details to their respective configuration files. Common locations include:
- ~/.config/mcp/config.json
- .codeium/windsurf/mcp_config.json
- .kiro/settings/mcp.json
For detailed setup instructions, refer to your client's documentation on MCP integration.
MCP Client Configuration
To use this server with any MCP-compatible client (like Codeium, Claude, or custom clients), add the following configuration to your MCP client's configuration file (typically mcp_config.json or similar):
{
"mcpServers": {
"rust-docs": {
"command": "/path/to/your/venv/bin/python",
"args": [
"/path/to/mcp-py-json-doc/run_mcp_server.py",
"--kg-db",
"/path/to/mcp-py-json-doc/output/knowledge_graph.db",
"--transport",
"stdio"
]
}
}
}
Example for a Local Development Setup
{
"mcpServers": {
"rust-docs": {
"command": "/home/username/mcp-py-json-doc/.venv/bin/python",
"args": [
"/home/username/mcp-py-json-doc/run_mcp_server.py",
"--kg-db",
"/home/username/mcp-py-json-doc/output/knowledge_graph.db",
"--transport",
"stdio"
]
}
}
}
Important Notes:
- Update all file paths to match your system's directory structure
- Ensure the Python path points to your virtual environment's Python executable
- The server will be available to MCP clients as rust-docs
- For production deployments, consider using absolute paths for reliability (the helper sketch below prints the values you need)
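Because the configuration needs absolute paths, a small helper like the one below (a convenience sketch, not part of the project) can print the values to paste into your client's config. Run it with the virtual environment's Python from the project root:

```python
# print_config_paths.py - convenience sketch: print the absolute paths needed by the MCP client config
import sys
from pathlib import Path

print("command:", sys.executable)                               # the venv's Python interpreter
print("server :", Path("run_mcp_server.py").resolve())          # path for the first args entry
print("kg-db  :", Path("output/knowledge_graph.db").resolve())  # path for the --kg-db argument
```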
4. Available MCP Tools
The following tools are available through the MCP server interface, enabling powerful code analysis and documentation retrieval. A programmatic example of calling them follows the tool list.
1. Search Entities
Search for entities (functions, structs, traits, etc.) in the Rust documentation.
Parameters:
- query: Search term to match against entity names
- entity_type: (Optional) Filter by type (functions, structs, traits, impls, enums, constants, modules)
- limit: Maximum number of results to return (default: 10)
Example:
{
"query": "Serialize",
"entity_type": "trait",
"limit": 5
}
2. Get Entity Details
Retrieve comprehensive information about a specific entity, including its documentation, attributes, and relationships.
Parameters:
- entity_id: Fully qualified name of the entity (e.g., serde::ser::Serialize)
Example:
{
"entity_id": "serde::ser::Serialize"
}
3. Find Related Entities
Discover entities that have relationships with the specified entity, such as implementations, definitions, or dependencies.
Parameters:
- entity_id: ID of the source entity
- relation_type: (Optional) Type of relationship (implements, defines, contains, depends_on)
- depth: How many levels of relationships to traverse (default: 1)
Example:
{
"entity_id": "serde::ser::Serialize",
"relation_type": "implements",
"depth": 2
}
4. Analyze Trait Implementations
Analyze how a specific trait is implemented across the codebase, including all implementing types and their locations.
Parameters:
- trait_name: Name of the trait to analyze
Example:
{
"trait_name": "serde::ser::Serialize"
}
5. Get Module Structure
Retrieve the hierarchical structure of modules, showing the organization of code at different levels.
Parameters:
- module_name: (Optional) Name of a specific module to inspect. If not provided, returns the complete module hierarchy.
Example:
{
"module_name": "serde"
}
6. Search Documentation
Perform semantic search across documentation, including function/method documentation and inline code comments.
Parameters:
- query: Search query string
- include_code_examples: (Optional) Whether to include code examples in search results (default: false)
Example:
{
"query": "serialization",
"include_code_examples": true
}
7. Get Graph Statistics
Retrieve statistics and metrics about the knowledge graph, including entity counts, relationship types, and database metrics.
Parameters: None
Example:
{}
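The JSON payloads above map directly onto MCP tool calls. The snippet below sketches how a client could invoke two of them programmatically with the FastMCP client against the HTTP server from step 2; the snake_case tool names are a guess based on the section headings, so check list_tools() for the real ones.

```python
# call_tools_example.py - sketch: invoke the documented tools from Python via the FastMCP client
# Assumes the server is running with --transport http --port 8000; tool names below are assumed.
import asyncio
from fastmcp import Client

async def main():
    async with Client("http://localhost:8000/mcp") as client:
        # Tool 1: Search Entities
        search = await client.call_tool(
            "search_entities",
            {"query": "Serialize", "entity_type": "trait", "limit": 5},
        )
        print(search)

        # Tool 2: Get Entity Details
        details = await client.call_tool(
            "get_entity_details",
            {"entity_id": "serde::ser::Serialize"},
        )
        print(details)

asyncio.run(main())
```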
Development
Project Structure
mcp-py-json-doc/
├── src/
│   ├── mcp_server/
│   │   ├── __init__.py
│   │   ├── server.py      # FastAPI server implementation
│   │   ├── handlers.py    # Request handlers
│   │   ├── tools.py       # Tool definitions
│   │   └── models.py      # Data models
│   └── native_kg/         # Knowledge graph implementation
├── tests/                 # Test files
├── pyproject.toml         # Project configuration
└── README.md              # This file
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.
FastMCP Integration
This project uses FastMCP 2.0 to implement the Model Context Protocol (MCP), providing a standardized way for LLMs to interact with Rust documentation.
Why FastMCP 2.0?
FastMCP 2.0 offers several advantages over the official MCP SDK:
- More Features: Comprehensive toolkit beyond the core MCP specification
- Active Maintenance: Regularly updated with new features and improvements
- Production Ready: Designed for deployment in production environments
- Pythonic API: Clean, decorator-based approach for defining tools (see the sketch after this list)
- Dual Transport: Support for both STDIO and HTTP transports
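To illustrate what the decorator-based approach looks like, here is a generic FastMCP 2.0 sketch; it is not the project's actual server code, and the tool body is a placeholder:

```python
# fastmcp_sketch.py - generic FastMCP 2.0 tool definition, for illustration only
from fastmcp import FastMCP

mcp = FastMCP("rust-docs-example")

@mcp.tool()
def search_entities(query: str, entity_type: str | None = None, limit: int = 10) -> list[dict]:
    """Search the knowledge graph for matching entities (placeholder logic)."""
    # A real implementation would query the SQLite knowledge graph here.
    return [{"name": query, "type": entity_type or "any"}][:limit]

if __name__ == "__main__":
    # STDIO is the default transport; an HTTP transport can be selected via mcp.run()'s arguments.
    mcp.run()
```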
Acknowledgments
- FastMCP for the high-performance MCP server implementation
- Model Context Protocol for standardizing LLM tool interactions
- The Rust community for amazing documentation tools
- All contributors who helped improve this project