# Read.ai MCP Server

A production-ready MCP (Model Context Protocol) server that integrates with Read.ai's webhook system to store and query meeting data.
## Features
- Webhook Integration: Receives and stores Read.ai meeting data via webhook
- SQLite Database: Persistent storage of meetings, participants, action items, topics, and transcripts (see the model sketch after this list)
- 7 MCP Tools: Natural language querying through Tasklet/Claude
- FastAPI: Modern, fast web framework with automatic API documentation
- Docker Support: Easy deployment with Docker and Docker Compose
- Railway/Fly.io Ready: One-click deployment configurations included
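
For orientation, here is a minimal sketch of what such a persistence layer could look like with SQLAlchemy. The table and column names are assumptions chosen to match the feature list above, not the exact schema in `readai_mcp/database.py`.

```python
# Hypothetical sketch of the persistence layer; names are illustrative,
# not the actual schema shipped in readai_mcp/database.py.
from sqlalchemy import Column, DateTime, ForeignKey, Integer, String, Text, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()


class Meeting(Base):
    __tablename__ = "meetings"

    id = Column(Integer, primary_key=True)
    session_id = Column(String, unique=True, index=True)  # Read.ai session_id
    title = Column(String, index=True)
    start_time = Column(DateTime)
    end_time = Column(DateTime)
    summary = Column(Text)
    participants = relationship("Participant", back_populates="meeting")


class Participant(Base):
    __tablename__ = "participants"

    id = Column(Integer, primary_key=True)
    meeting_id = Column(Integer, ForeignKey("meetings.id"))
    name = Column(String, index=True)
    email = Column(String, index=True)
    meeting = relationship("Meeting", back_populates="participants")


# Create the SQLite file configured via DATABASE_URL (default: sqlite:///./meetings.db).
engine = create_engine("sqlite:///./meetings.db")
Base.metadata.create_all(engine)
SessionLocal = sessionmaker(bind=engine)
```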
## MCP Tools Available
- `search_meetings` - Search meetings by title, participant, or date range
- `get_meeting_details` - Get full details of a specific meeting
- `get_action_items` - Retrieve action items with optional filtering
- `search_transcript` - Search meeting transcripts by content or speaker
- `get_participant_meetings` - Get all meetings for a specific participant
- `get_topics` - Get topics discussed across meetings
- `get_meeting_summary` - Get summary statistics and insights
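
As a rough illustration of how one of these tools could be wired up, the sketch below registers a simplified `search_meetings` with the `FastMCP` helper from the official MCP Python SDK. The function body, SQL, and parameters are assumptions for the example, not the implementation in `readai_mcp/mcp_tools.py`.

```python
# Illustrative sketch only; the real tools live in readai_mcp/mcp_tools.py.
import sqlite3

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Read.ai MCP Server")


@mcp.tool()
def search_meetings(query: str, limit: int = 10) -> list[dict]:
    """Search stored meetings by title (simplified example)."""
    conn = sqlite3.connect("meetings.db")
    conn.row_factory = sqlite3.Row
    rows = conn.execute(
        "SELECT session_id, title, start_time FROM meetings "
        "WHERE title LIKE ? ORDER BY start_time DESC LIMIT ?",
        (f"%{query}%", limit),
    ).fetchall()
    conn.close()
    return [dict(row) for row in rows]
```

Tasklet/Claude then calls the tool by name with JSON arguments; the SDK derives the tool's input schema from the type hints and docstring.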
## Quick Start

### Local Development
- Clone the repository:

```bash
git clone https://github.com/candlefish-ai/readai-mcp-server.git
cd readai-mcp-server
```

- Install dependencies:

```bash
pip install -r requirements.txt
```

- Copy the environment file:

```bash
cp .env.example .env
```

- Run the server:

```bash
uvicorn readai_mcp.main:app --reload
```

The server will be available at http://localhost:8000.
### Docker Deployment

```bash
docker-compose up -d
```
### Railway Deployment

- Install the Railway CLI:

```bash
npm install -g @railway/cli
```

- Initialize and deploy:

```bash
railway login
railway init
railway variables set WEBHOOK_TOKEN=1df1a04b5ec6ba110262d321e8cb1202
railway up
```

- Get your deployment URL:

```bash
railway domain
```
## Read.ai Webhook Configuration

- Log into Read.ai
- Navigate to Settings → Integrations → Webhooks
- Click "Add Webhook"
- Configure:
  - URL: `https://your-deployment-url/webhook`
  - Token: `1df1a04b5ec6ba110262d321e8cb1202`
  - Event: `meeting_end`
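
On the server side, the webhook handler might look roughly like the sketch below: it checks the `X-Webhook-Token` header against `WEBHOOK_TOKEN` before persisting the payload. This is a simplified assumption of what `readai_mcp/main.py` does; the `store_meeting` helper is hypothetical.

```python
# Simplified sketch of a token-checked webhook endpoint; the real handler in
# readai_mcp/main.py will differ, and store_meeting() is a hypothetical helper.
import os

from fastapi import FastAPI, Header, HTTPException, Request

app = FastAPI()
WEBHOOK_TOKEN = os.environ["WEBHOOK_TOKEN"]


@app.post("/webhook")
async def readai_webhook(request: Request, x_webhook_token: str | None = Header(default=None)):
    if x_webhook_token != WEBHOOK_TOKEN:
        raise HTTPException(status_code=401, detail="Invalid webhook token")

    payload = await request.json()
    if payload.get("trigger") == "meeting_end":
        # store_meeting(payload)  # persist meeting, participants, action items, ...
        pass
    return {"status": "ok"}
```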
## Connecting to Tasklet/Claude

Once deployed, connect via MCP:

```python
setup_mcp_server_connection(
    displayName="Read.ai Meetings",
    serverUrl="https://your-deployment-url/mcp"
)
```
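
If you want to exercise the server outside Tasklet/Claude, any MCP client can connect. The sketch below uses the official MCP Python SDK and assumes the server exposes the streamable HTTP transport at `/mcp`; adjust the transport (e.g. SSE) if the server is configured differently.

```python
# Minimal sketch of calling a tool with the MCP Python SDK client.
# Assumes the server speaks the streamable HTTP transport at /mcp.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    url = "https://your-deployment-url/mcp"
    async with streamablehttp_client(url) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            result = await session.call_tool("search_meetings", {"query": "PromoterOS"})
            print(result)


asyncio.run(main())
```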
## API Documentation

FastAPI automatically generates interactive API documentation:

- Swagger UI: `http://your-server/docs`
- ReDoc: `http://your-server/redoc`
## Testing

Run the test suite:

```bash
pytest tests/ -v
```

Test the webhook manually:

```bash
curl -X POST https://your-server/webhook \
  -H "Content-Type: application/json" \
  -H "X-Webhook-Token: 1df1a04b5ec6ba110262d321e8cb1202" \
  -d '{
    "session_id": "test-123",
    "trigger": "meeting_end",
    "title": "Test Meeting",
    "start_time": "2025-11-01T10:00:00Z",
    "end_time": "2025-11-01T11:00:00Z",
    "owner": {"name": "Test User", "email": "test@example.com"},
    "participants": [],
    "summary": "Test summary",
    "action_items": [],
    "key_questions": [],
    "topics": [],
    "report_url": "https://test.com"
  }'
```
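
A webhook test in `tests/test_webhook.py` could be structured roughly like this sketch, which drives the endpoint with FastAPI's `TestClient`. The payloads, status codes, and the assumption that `readai_mcp.main` exports `app` are illustrative, not the project's actual tests.

```python
# Illustrative test sketch; the real tests live in tests/test_webhook.py.
from fastapi.testclient import TestClient

from readai_mcp.main import app  # assumes the FastAPI app is exported as `app`

client = TestClient(app)


def test_webhook_rejects_bad_token():
    response = client.post(
        "/webhook",
        json={"session_id": "test-123", "trigger": "meeting_end"},
        headers={"X-Webhook-Token": "wrong-token"},
    )
    assert response.status_code == 401


def test_webhook_accepts_meeting_end():
    payload = {
        "session_id": "test-123",
        "trigger": "meeting_end",
        "title": "Test Meeting",
        "participants": [],
        "action_items": [],
    }
    response = client.post(
        "/webhook",
        json=payload,
        headers={"X-Webhook-Token": "1df1a04b5ec6ba110262d321e8cb1202"},
    )
    assert response.status_code == 200
```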
## Example Queries (via MCP)
Once connected to Tasklet/Claude, you can ask:
- "Search for meetings with Jesse about PromoterOS"
- "Get all action items from last week's meetings"
- "Find transcript mentions of Watsonx Orchestrate"
- "Show me all meetings Alice participated in this month"
- "What topics were discussed in product meetings?"
- "Give me a summary of meeting activity for October"
## Project Structure

```
readai-mcp-server/
├── readai_mcp/
│   ├── __init__.py
│   ├── main.py            # FastAPI app + MCP server
│   ├── database.py        # SQLAlchemy models & DB operations
│   ├── models.py          # Pydantic models
│   ├── mcp_tools.py       # MCP tool implementations
│   └── config.py          # Configuration
├── tests/
│   ├── test_webhook.py    # Webhook endpoint tests
│   └── test_mcp_tools.py  # MCP tools tests
├── Dockerfile
├── docker-compose.yml
├── railway.json           # Railway deployment config
├── fly.toml               # Fly.io deployment config
├── requirements.txt
└── README.md
```
## Environment Variables

| Variable | Default | Description |
|---|---|---|
| `WEBHOOK_TOKEN` | (required) | Authentication token for Read.ai webhook |
| `DATABASE_URL` | `sqlite:///./meetings.db` | Database connection string |
| `MCP_SERVER_NAME` | `Read.ai MCP Server` | Server name for MCP |
| `MCP_SERVER_VERSION` | `0.1.0` | Server version |
| `API_HOST` | `0.0.0.0` | API host binding |
| `API_PORT` | `8000` | API port |
| `LOG_LEVEL` | `INFO` | Logging level |
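
Presumably `readai_mcp/config.py` loads these values from the environment; a minimal sketch with `pydantic-settings` is shown below. The class and field names are assumptions derived from the table above, not the project's exact code.

```python
# Hypothetical sketch of readai_mcp/config.py using pydantic-settings;
# field names mirror the environment variables in the table above.
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    webhook_token: str                              # WEBHOOK_TOKEN (required)
    database_url: str = "sqlite:///./meetings.db"   # DATABASE_URL
    mcp_server_name: str = "Read.ai MCP Server"     # MCP_SERVER_NAME
    mcp_server_version: str = "0.1.0"               # MCP_SERVER_VERSION
    api_host: str = "0.0.0.0"                       # API_HOST
    api_port: int = 8000                            # API_PORT
    log_level: str = "INFO"                         # LOG_LEVEL


settings = Settings()
```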
## Monitoring

View logs (Railway):

```bash
railway logs --follow
```

View logs (Docker):

```bash
docker logs readai-mcp-server -f
```

Check database:

```bash
sqlite3 meetings.db "SELECT * FROM meetings LIMIT 5;"
```
## Contributing

- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
## License
MIT License - see LICENSE file for details
## Support
For issues or questions:
- GitHub Issues: github.com/candlefish-ai/readai-mcp-server/issues
- Linear Issue: CAN-17
## Acknowledgments
- Built for PromoterOS/Facture discovery sessions
- Powered by FastAPI and MCP
- Integrated with Read.ai's webhook system