OnCore Protocol MCP

A modern Model Context Protocol (MCP) server for the OnCore clinical trial management system, providing retrieval-focused access to protocol, subject, and staff data.

🎉 Implementation Status: AGENT-FRIENDLY MCP SERVER READY

  • ✅ Snowflake-powered analytics with private key authentication
  • ✅ Agent-friendly 4-tool interface for AI integration
  • ✅ Advanced protocol slicing with 20+ filter attributes
  • ✅ Case-insensitive Principal Investigator search
  • ✅ High-performance concurrent SQL queries
  • ✅ Comprehensive data models with Pydantic v2
  • ✅ Async operations with connection pooling
  • ✅ FastMCP server framework
  • ✅ Type-safe configuration management
  • ✅ Privacy-compliant data handling
  • ✅ Code quality improvements and performance optimization

Features

  • 🤖 Agent-Friendly Interface: 4 optimized tools designed specifically for AI agents
  • ⚡ Snowflake Analytics: High-performance direct database queries with concurrent execution
  • 🔍 Advanced Protocol Search: Support for complex filtering with case-insensitive PI search
  • 📊 Real-time Analytics: Live enrollment metrics, department breakdowns, timeline analysis
  • 🔐 Private Key Authentication: Secure Snowflake connection with industry-standard private key auth
  • 🎯 Intelligent Filtering: 20+ filter attributes with automatic value normalization
  • ⚙️ High Performance: Concurrent SQL queries eliminate API pagination limits (see the sketch after this list)
  • 🔒 Privacy Compliant: PHI filtering framework with audit logging
  • 📈 Business Intelligence: Statistical analytics, enrollment tracking, department metrics
  • 🏗️ Modern Architecture: FastMCP, async operations, structured logging
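
To make the concurrency claim concrete, here is a minimal sketch of running independent Snowflake queries in parallel with asyncio. The table name, SQL, and function names are illustrative assumptions, not the project's code.

# Illustrative sketch: independent Snowflake queries run in worker threads so
# they overlap. The SQL and the PROTOCOLS table name are assumptions.
import asyncio

import snowflake.connector


def run_query(conn, sql: str) -> list[tuple]:
    # Blocking cursor call, executed off the event loop in a worker thread.
    with conn.cursor() as cur:
        cur.execute(sql)
        return cur.fetchall()


async def fetch_protocol_slices(conn) -> list[list[tuple]]:
    # Hypothetical queries; the real server builds these from filter attributes.
    queries = [
        "SELECT COUNT(*) FROM PROTOCOLS WHERE STATUS = 'OPEN TO ACCRUAL'",
        "SELECT DEPARTMENT, COUNT(*) FROM PROTOCOLS GROUP BY DEPARTMENT",
    ]
    # Each query gets its own cursor in its own thread, so they run concurrently.
    return await asyncio.gather(
        *(asyncio.to_thread(run_query, conn, sql) for sql in queries)
    )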

📚 Documentation

Core Documentation

  • Comprehensive project understanding
  • Configuration Guide - Environment setup and Snowflake authentication
  • Tool Reference - Complete API reference for all 4 agent-friendly tools

Architecture Overview

  • Agent-Friendly Server: Primary entry point optimized for AI agents
  • Snowflake Analytics Engine: High-performance database operations
  • Value Normalization: Automatic field value standardization (see the sketch after this list)
  • Privacy Framework: PHI filtering and audit logging
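
As a rough illustration of what value normalization might involve, filter values supplied by an agent can be mapped onto the canonical forms stored in the warehouse. The alias tables and function name below are hypothetical, not the project's actual normalization rules.

# Hypothetical sketch of value normalization; the alias tables and function
# name are illustrative assumptions.
_PHASE_ALIASES = {
    "2": "Phase II",
    "ii": "Phase II",
    "phase 2": "Phase II",
}

_STATUS_ALIASES = {
    "open": "OPEN TO ACCRUAL",
    "open to accrual": "OPEN TO ACCRUAL",
}


def normalize_filter(field: str, value: str) -> str:
    """Map a user- or agent-supplied filter value to its canonical form."""
    key = value.strip().lower()
    if field == "phase":
        return _PHASE_ALIASES.get(key, value)
    if field == "status":
        return _STATUS_ALIASES.get(key, value)
    return value

With rules like these, normalize_filter("phase", "phase 2") would yield "Phase II" before the SQL filter is built.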

Quick Start

Prerequisites

  • Python 3.12+
  • Snowflake access with OnCore clinical trial data
  • Private key file for Snowflake authentication (recommended)

Installation

# Using uv (recommended)
uv venv
uv pip install -e ".[dev]"

# Or using pip
pip install -e ".[dev]"

Configuration

Configure Snowflake connection with environment variables:

# Snowflake Configuration (Required)
SNOWFLAKE_ACCOUNT=your-account-name
SNOWFLAKE_USER=your-username  
SNOWFLAKE_AUTH_TYPE=private_key                    # or password
SNOWFLAKE_PRIVATE_KEY_FILE=path/to/rsa_key.p8     # for private key auth
SNOWFLAKE_PASSWORD=your_password                   # for password auth
SNOWFLAKE_WAREHOUSE=your-warehouse
SNOWFLAKE_DATABASE=your-database
SNOWFLAKE_SCHEMA=your-schema
SNOWFLAKE_ROLE=your-role
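
The project advertises type-safe configuration management; one plausible shape for that, using pydantic-settings, is sketched below. The class name, field names, and defaults are assumptions, not the project's actual code; only the environment variable names come from this README.

# Hedged sketch of type-safe settings loading with pydantic-settings.
from typing import Literal, Optional

from pydantic import Field
from pydantic_settings import BaseSettings, SettingsConfigDict


class SnowflakeSettings(BaseSettings):
    """Reads the SNOWFLAKE_* variables above from the environment or a .env file."""

    model_config = SettingsConfigDict(env_prefix="SNOWFLAKE_", env_file=".env")

    account: str
    user: str
    auth_type: Literal["private_key", "password"] = "private_key"
    private_key_file: Optional[str] = None
    password: Optional[str] = None
    warehouse: str
    database: str
    # env_prefix does not apply to aliased fields, so spell the variable out
    db_schema: str = Field(validation_alias="SNOWFLAKE_SCHEMA")
    role: str


settings = SnowflakeSettings()  # raises a validation error if required values are missing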

Running the Agent-Friendly Server

# Start the optimized MCP server for AI agents  
python -m oncore_mcp.agent_friendly_server

# Note: CLI entry point will be available after Phase 1 completion
# oncore-mcp serve

Usage Examples

Agent-Friendly Tool Examples

# Search protocols with Principal Investigator
await search_protocols({
    "query": "COVID",
    "pi_name": "Nikolich-Zugich"  # Case-insensitive, supports partial matches
})

# Advanced protocol filtering
await find_protocols({
    "principal_investigator": "janko",  # Case-insensitive search
    "phase": "Phase II",
    "department": "Oncology", 
    "status": "OPEN TO ACCRUAL",
    "min_enrollment": 25
})

# Get detailed protocol information
await get_protocol_info({
    "protocol_no": "STUDY00000323"
})

# Statistical analytics with filtering
await get_protocol_stats({
    "department": "Oncology",
    "status": "Active"
})

Common Use Cases

  • "Find all protocols by Dr. Smith" ➡️ search_protocols({"pi_name": "Smith"})
  • "Show oncology Phase II studies" ➡️ find_protocols({"department": "Oncology", "phase": "Phase II"})
  • "Get enrollment stats for active protocols" ➡️ get_protocol_stats({"status": "Active"})
  • "Protocol details for STUDY123" ➡️ get_protocol_info({"protocol_no": "STUDY123"})

Available Tools

Agent-Friendly Tools (4 Core Tools)

1. search_protocols - Simple Protocol Search
  • Purpose: Text-based protocol search with optional PI filtering
  • Key Parameters:
    • query: Search term for titles, objectives, protocol numbers
    • pi_name: Principal Investigator name (case-insensitive, partial matching)
    • limit: Number of results to return
  • Returns: Standardized response with protocols array and metadata
2. find_protocols - Advanced Protocol Filtering
  • Purpose: Complex multi-attribute protocol filtering
  • Key Parameters:
    • principal_investigator: PI name search (case-insensitive)
    • department: Department filter (single value)
    • phase: Phase filter (single value)
    • status: Protocol status filter
    • min_enrollment: Minimum current enrollment
    • limit: Maximum number of results
  • Returns: Filtered protocols with basic information
3. get_protocol_info - Detailed Protocol Information
  • Purpose: Retrieve complete protocol details by protocol number
  • Key Parameters:
    • protocol_no: Exact protocol number (required)
  • Returns: Full protocol information including enrollment, status, dates
4. get_protocol_stats - Statistical Analytics
  • Purpose: Generate statistical breakdowns and metrics
  • Key Parameters:
    • department: Filter by department
    • status: Filter by protocol status
  • Returns: Comprehensive analytics with enrollment metrics and department, phase, and status breakdowns

Key Features of All Tools

  • Standardized Response Format: All tools return a consistent {data, success, error, count, message} structure (see the example after this list)
  • Case-Insensitive PI Search: Automatic wildcard matching for Principal Investigator names
  • High Performance: Direct Snowflake queries with concurrent execution
  • Error Handling: Graceful error handling with informative messages
  • Privacy Compliant: PHI filtering and audit logging
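
For illustration, a successful search_protocols call might return a payload shaped like this. The record contents are invented; only the field names come from the standardized response format described above.

# Illustrative response shape only; the record contents are invented.
example_response = {
    "data": [
        {
            "protocol_no": "STUDY00000323",
            "title": "...",
            "principal_investigator": "...",
            "status": "OPEN TO ACCRUAL",
        }
    ],
    "success": True,
    "error": None,
    "count": 1,
    "message": "Found 1 matching protocol",
}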

Development

Setup Development Environment

# Clone repository
git clone https://github.com/oncore-ai/oncore-protocol-mcp.git
cd oncore-protocol-mcp

# Install with development dependencies
uv pip install -e ".[dev]"

# Install pre-commit hooks
pre-commit install

Code Quality

# Format and lint
ruff format .
ruff check .

# Type checking
mypy src/

# Run tests
pytest

# With coverage
pytest --cov

Testing

# Unit tests
pytest tests/unit/

# Integration tests (requires OnCore API access)
pytest tests/integration/

# All tests
pytest

Architecture

  • Agent-Friendly Server: Primary MCP server optimized for AI agents (agent_friendly_server.py)
  • Snowflake Analytics Engine: High-performance direct database queries with concurrent execution
  • FastMCP: Modern MCP server framework with async operations (see the sketch after this list)
  • Pydantic v2: Data validation and comprehensive data models
  • Private Key Authentication: Secure Snowflake connections
  • Value Normalization: Automatic field value standardization for consistent queries
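
For orientation, here is a stripped-down sketch of how a FastMCP tool registration might look. The import path, tool signature, defaults, and the query_protocols stub are assumptions for illustration; agent_friendly_server.py is the authoritative implementation.

# Minimal FastMCP sketch; the query_protocols stub and the tool body are
# illustrative assumptions, not the project's actual code.
from fastmcp import FastMCP

mcp = FastMCP("oncore-protocol-mcp")


async def query_protocols(query: str, pi_name: str | None, limit: int) -> list[dict]:
    # Stand-in for the Snowflake-backed lookup the real server performs.
    return []


@mcp.tool()
async def search_protocols(query: str, pi_name: str | None = None, limit: int = 25) -> dict:
    """Text-based protocol search with optional case-insensitive PI filtering."""
    protocols = await query_protocols(query, pi_name, limit)
    return {
        "data": protocols,
        "success": True,
        "error": None,
        "count": len(protocols),
        "message": f"Found {len(protocols)} matching protocols",
    }


if __name__ == "__main__":
    mcp.run()  # stdio transport by default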

Privacy & Compliance

  • Staff Data: Personal information excluded (names, emails, phones)
  • Subject Data: PHI included temporarily, marked for future filtering
  • Authentication: OAuth2 client credentials flow
  • Audit Trail: All API interactions logged

Configuration

Required Environment Variables

| Variable | Description | Required |
|----------|-------------|----------|
| SNOWFLAKE_ACCOUNT | Snowflake account identifier | Yes |
| SNOWFLAKE_USER | Snowflake username | Yes |
| SNOWFLAKE_AUTH_TYPE | Authentication type (private_key or password) | Yes |
| SNOWFLAKE_PRIVATE_KEY_FILE | Path to private key file (.p8) | Yes* |
| SNOWFLAKE_PASSWORD | Snowflake password | Yes* |
| SNOWFLAKE_WAREHOUSE | Snowflake warehouse name | Yes |
| SNOWFLAKE_DATABASE | Snowflake database name | Yes |
| SNOWFLAKE_SCHEMA | Snowflake schema name | Yes |
| SNOWFLAKE_ROLE | Snowflake role | Yes |
| LOG_LEVEL | Logging level | No |

*Either SNOWFLAKE_PRIVATE_KEY_FILE or SNOWFLAKE_PASSWORD is required, depending on SNOWFLAKE_AUTH_TYPE.

Authentication Methods

Private Key Authentication (Recommended):

SNOWFLAKE_AUTH_TYPE=private_key
SNOWFLAKE_PRIVATE_KEY_FILE=/path/to/rsa_key.p8
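
The Snowflake connector expects the private key as DER-encoded bytes rather than a file path, so key loading presumably follows the standard snowflake-connector-python pattern sketched below. This assumes an unencrypted PKCS#8 key and is not the project's actual code.

# Sketch of key-pair authentication with snowflake-connector-python.
# Environment variable names match this README; everything else is illustrative.
import os

import snowflake.connector
from cryptography.hazmat.primitives import serialization

# Load the PKCS#8 private key and convert it to the DER bytes the connector expects.
# password=None assumes an unencrypted key; pass the passphrase bytes otherwise.
with open(os.environ["SNOWFLAKE_PRIVATE_KEY_FILE"], "rb") as key_file:
    private_key = serialization.load_pem_private_key(key_file.read(), password=None)

private_key_der = private_key.private_bytes(
    encoding=serialization.Encoding.DER,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.NoEncryption(),
)

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    private_key=private_key_der,
    warehouse=os.environ["SNOWFLAKE_WAREHOUSE"],
    database=os.environ["SNOWFLAKE_DATABASE"],
    schema=os.environ["SNOWFLAKE_SCHEMA"],
    role=os.environ["SNOWFLAKE_ROLE"],
)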

Password Authentication (Fallback):

SNOWFLAKE_AUTH_TYPE=password
SNOWFLAKE_PASSWORD=your_password

License

MIT License - see the LICENSE file for details.

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make changes with tests
  4. Run code quality checks
  5. Submit a pull request

Support