AI CoE Trilogy MCP Server
A Model Context Protocol (MCP) server that connects to Substack feeds to provide AI assistants with access to articles, authors, and topics from the AI Center of Excellence at Trilogy.
What is MCP?
The Model Context Protocol (MCP) is an open standard that enables AI assistants to securely connect to external data sources and tools. Instead of building custom integrations for each AI platform, MCP provides a universal interface that works across Claude Desktop, Cursor, Windsurf, and other MCP-compatible applications.
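Concretely, MCP traffic is JSON-RPC 2.0. As an illustration (the method and envelope follow the MCP specification; the tool name and arguments anticipate this server's list_articles tool described below), a client invoking a tool sends a message like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "list_articles",
    "arguments": { "limit": 5, "author": "John Smith" }
  }
}
```

The server replies with a result payload on the same id, which the assistant then turns into a natural-language answer.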
Features
- List Articles: Browse available articles from the Substack feed
- List Authors: View all authors who have contributed content
- List Topics: Explore articles by topic/category
- Read Articles: Access full article content
- Filter Content: Search by author, topic, or keywords
- Universal Compatibility: Works with any MCP-compatible AI assistant
- Fast & Reliable: Built-in caching and error handling
- Easy Setup: Simple installation and configuration
- Node.js Compatible: Includes polyfills for web API compatibility
Quick Start
1. Install Dependencies

```bash
git clone https://github.com/dp-pcs/Trilogy-AI-CoE-MCP.git
cd Trilogy-AI-CoE-MCP
npm install
```

2. Build the Server

```bash
npm run build
```

3. Test the Installation

```bash
npm test
```
4. Configure Your AI Assistant
Add this configuration to your AI assistant's MCP settings:
```json
{
  "mcpServers": {
    "trilogy-ai-coe": {
      "command": "node",
      "args": ["/path/to/your/Trilogy-AI-CoE-MCP/dist/index.js"],
      "env": {
        "SUBSTACK_FEED_URL": "https://trilogyai.substack.com"
      }
    }
  }
}
```
5. Start Using
Ask your AI assistant:
- "List the latest articles from the AI CoE"
- "Show me all authors"
- "What topics are covered?"
- "Read the article about [topic]"
Detailed Installation
For step-by-step installation instructions, see INSTALLATION.md.
Available Tools
The server provides these tools to AI assistants:
list_articles
Get a list of available articles with optional filtering.
Parameters:
- limit (optional): Maximum number of articles to return (default: 10)
- author (optional): Filter articles by author name
- topic (optional): Filter articles by topic
Example usage:
- "List the 5 most recent articles"
- "Show me articles by John Smith"
- "Find articles about AI governance"
list_authors
Get all authors who have written articles.
Returns: List of authors with article counts and latest publication dates.
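A minimal sketch of that aggregation, assuming feed items carry an author and an ISO publication date (the Post type and sample data are made up for illustration):

```typescript
// Hypothetical sketch of list_authors-style aggregation over feed items.
interface Post {
  author: string;
  publishedAt: string; // ISO 8601 date, so string comparison orders correctly
}

function listAuthors(posts: Post[]): { author: string; count: number; latest: string }[] {
  const byAuthor = new Map<string, { count: number; latest: string }>();
  for (const p of posts) {
    const entry = byAuthor.get(p.author) ?? { count: 0, latest: p.publishedAt };
    entry.count += 1;
    if (p.publishedAt > entry.latest) entry.latest = p.publishedAt;
    byAuthor.set(p.author, entry);
  }
  return [...byAuthor.entries()].map(([author, v]) => ({ author, ...v }));
}

const posts: Post[] = [
  { author: "Jane Doe", publishedAt: "2024-01-10" },
  { author: "Jane Doe", publishedAt: "2024-03-05" },
  { author: "John Smith", publishedAt: "2024-02-01" },
];
console.log(listAuthors(posts));
```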
list_topics
Get available topics/categories covered in articles.
Returns: List of topics with article counts and associated articles.
read_article
Read the full content of a specific article.
Parameters:
- articleId (optional): The ID of the article to read
- url (optional): The URL of the article to read
- title (optional): Search for article by title
Example usage:
- "Read the article about AI strategy"
- "Show me the full content of the latest article"
Configuration
Environment Variables
Create a .env file in the project root:
```bash
# Required: Substack feed URL
SUBSTACK_FEED_URL=https://trilogyai.substack.com

# Optional: Enable debug logging
DEBUG=false

# Optional: Server port for testing
PORT=3000
```
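How the server consumes these variables can be sketched as follows; the variable names come from this README, but the loader function, validation, and defaults are illustrative assumptions:

```typescript
// Hypothetical config loader for the variables documented above.
function loadConfig(env: Record<string, string | undefined>) {
  const feedUrl = env.SUBSTACK_FEED_URL;
  if (!feedUrl) {
    throw new Error("SUBSTACK_FEED_URL is required"); // fail fast on a missing feed
  }
  return {
    feedUrl,
    debug: env.DEBUG === "true",    // anything but "true" leaves debug logging off
    port: Number(env.PORT ?? 3000), // PORT is only used for testing
  };
}

console.log(loadConfig({ SUBSTACK_FEED_URL: "https://trilogyai.substack.com" }));
```

Failing fast on a missing required variable surfaces misconfiguration at startup rather than as empty results later.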
Custom Substack Feed
To use a different Substack publication:
1. Update SUBSTACK_FEED_URL in your .env file
2. Rebuild the server: npm run build
3. Restart your AI assistant
Development
Running in Development Mode
```bash
# Watch mode with auto-rebuild
npm run dev

# Run tests
npm test

# Build for production
npm run build
```
Project Structure
```
Trilogy-AI-CoE-MCP/
├── src/
│   ├── index.ts          # Main server implementation
│   └── polyfill.js       # Node.js web API compatibility polyfill
├── dist/                 # Compiled JavaScript (generated)
├── package.json          # Dependencies and scripts
├── tsconfig.json         # TypeScript configuration
├── env.example           # Environment variables template
├── test-server.js        # Test script
├── README.md             # This file
├── INSTALLATION.md       # Detailed installation guide
└── DEMO_SCRIPT.md        # Demo recording script
```
Technical Notes
Node.js Compatibility: This server includes a polyfill (src/polyfill.js) that provides web APIs (ReadableStream, Blob, DOMException) required by the cheerio and undici dependencies. This ensures compatibility across different Node.js environments and versions.
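A polyfill of this kind usually amounts to copying Node's own implementations onto globalThis when they are not already present. The sketch below shows the idea, assuming src/polyfill.js behaves along these lines (the actual file may differ):

```typescript
// Sketch: install web APIs globally only when the runtime lacks them.
import { ReadableStream } from "node:stream/web";
import { Blob } from "node:buffer";

const g = globalThis as unknown as Record<string, unknown>;
if (typeof g.ReadableStream === "undefined") g.ReadableStream = ReadableStream;
if (typeof g.Blob === "undefined") g.Blob = Blob;
// DOMException has been a Node.js global since v17, so it rarely needs shimming.

console.log(typeof g.ReadableStream); // "function"
```

Guarding each assignment keeps the polyfill harmless on newer Node versions where the globals already exist.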
Supported AI Assistants
This MCP server works with:
- Claude Desktop - Full support with easy configuration
- Cursor - Full support via MCP settings
- Windsurf - Full support via MCP configuration
- Any MCP-compatible application - Universal protocol support
Troubleshooting
Common Issues
- Server won't start: Check Node.js version (18+ required)
- No articles found: Verify Substack feed URL and internet connection
- Permission errors: Ensure proper file permissions for the server executable
- Path issues: Use absolute paths in AI assistant configuration
Debug Mode
Enable detailed logging by setting DEBUG=true in your environment or configuration.
Getting Help
- Check the INSTALLATION.md guide
- Review DEMO_SCRIPT.md for examples
- Run npm test to verify your setup
- Open an issue on GitHub for additional support
Demo
See DEMO_SCRIPT.md for a complete recording script demonstrating all features.
Contributing
1. Fork the repository
2. Create a feature branch: git checkout -b feature-name
3. Make your changes
4. Test thoroughly: npm test
5. Submit a pull request
License
MIT License - see LICENSE file for details.
About
This project demonstrates how to create a Model Context Protocol server that connects AI assistants to external data sources. It is part of an initiative by the AI Center of Excellence at Trilogy to showcase practical AI integration patterns.