# Trilogy AI CoE MCP Server
A universal Model Context Protocol (MCP) server demonstrating how one server can connect to multiple AI assistants. This server provides access to Trilogy's AI Center of Excellence Substack content and showcases MCP's true power: write once, use everywhere.
## Universal Compatibility Showcase
This project demonstrates MCP's core value proposition:
- ✅ One Server → Multiple AI Clients
- ✅ Claude Desktop - full MCP integration
- ✅ Cursor - same configuration works
- ✅ ChatGPT Deep Research - HTTP endpoint compatibility
- ✅ Any MCP Client - standard protocol compliance
## Live Demo Server
Server URL: https://ai-coe-mcp.latentgenius.ai
Ready to use immediately - no installation required!
## What is MCP?
The Model Context Protocol (MCP) is an open standard that enables AI assistants to securely connect to external data sources and tools. This project showcases how a single MCP server can serve multiple AI clients with different integration approaches.
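Concretely, every MCP exchange is a JSON-RPC 2.0 message. As a rough sketch, the envelopes below use the standard `tools/list` and `tools/call` methods (the same format shown in the curl test later in this README); the `search` arguments mirror this server's search tool:

```javascript
// Build the JSON-RPC 2.0 envelopes an MCP client sends.
// tools/list and tools/call are standard MCP methods.
function mcpRequest(method, params, id) {
  return { jsonrpc: "2.0", method, params, id };
}

// Ask the server which tools it exposes.
const listTools = mcpRequest("tools/list", {}, 1);

// Invoke the search tool with a keyword query.
const callSearch = mcpRequest(
  "tools/call",
  { name: "search", arguments: { query: "agentic frameworks" } },
  2
);

console.log(JSON.stringify(listTools));
console.log(JSON.stringify(callSearch));
```

Whichever transport a client uses (stdio or HTTP), it is these same messages that flow back and forth.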
## Quick Start - Universal Setup
### Option 1: Claude Desktop & Cursor (Native MCP)
1. Download the client script: save `mcp-remote-client.js` to your local machine.
2. Configure your AI assistant:

Claude Desktop (`claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "trilogy-ai-coe-remote": {
      "command": "node",
      "args": ["/path/to/mcp-remote-client.js"]
    }
  }
}
```
Cursor (same configuration in MCP settings):
```json
{
  "mcpServers": {
    "trilogy-ai-coe-remote": {
      "command": "node",
      "args": ["/path/to/mcp-remote-client.js"]
    }
  }
}
```
3. Restart your AI assistant and start using the tools!
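The client script itself is not reproduced in this README, but a stdio-to-HTTP bridge of this kind can be sketched as follows: it relays newline-delimited JSON-RPC messages from the assistant's stdin to the remote `/mcp` endpoint and writes responses back to stdout. Only the endpoint URL comes from this README; the structure, the injectable `post` helper, and the `MCP_BRIDGE` guard are illustrative assumptions, not the actual `mcp-remote-client.js`:

```javascript
// Hypothetical sketch of a stdio -> HTTP MCP bridge (not the real
// mcp-remote-client.js). It forwards one JSON-RPC message per stdin
// line to the remote server and echoes each response on stdout.
const ENDPOINT = "https://ai-coe-mcp.latentgenius.ai/mcp";

// POST one JSON-RPC message to the remote /mcp endpoint (Node 18+ fetch).
async function postToServer(body) {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  return res.json();
}

// Parse a stdin line, relay it, and return the serialized response.
// `post` is injectable so the relay logic can be exercised offline.
async function forward(line, post = postToServer) {
  const request = JSON.parse(line);
  const response = await post(request);
  return JSON.stringify(response);
}

// Only attach to stdin when run as the actual bridge process.
if (process.env.MCP_BRIDGE === "1") {
  let buffer = "";
  process.stdin.setEncoding("utf8");
  process.stdin.on("data", (chunk) => {
    buffer += chunk;
    let nl;
    while ((nl = buffer.indexOf("\n")) !== -1) {
      const line = buffer.slice(0, nl);
      buffer = buffer.slice(nl + 1);
      if (line.trim()) {
        forward(line).then((out) => process.stdout.write(out + "\n"));
      }
    }
  });
}
```

Because the bridge only shuttles bytes, Claude Desktop and Cursor see an ordinary local stdio MCP server while the tools actually run remotely.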
### Option 2: ChatGPT Deep Research (HTTP Endpoint)
1. Go to ChatGPT Settings → Connectors
2. Add MCP Server:
   - Name: `Trilogy AI CoE MCP Server`
   - URL: `https://ai-coe-mcp.latentgenius.ai/mcp`
   - Authentication: No authentication
3. Use Deep Research with Trilogy AI CoE content!
## Available Tools
All AI assistants get access to these tools:
### `search`
Search through Trilogy AI CoE articles by keywords, topics, or authors.
Example: "Search for articles about agentic frameworks"
### `list_recent`
Get the latest articles sorted by publication date.
Example: "Show me the 5 most recent articles"
### `fetch`
Retrieve full article content by ID for detailed analysis.
Example: "Fetch the full content of article-4"
## Example Usage
Once connected, try these queries in any supported AI assistant:
- "Search for articles about agentic frameworks"
- "What are the latest AI strategy insights?"
- "Find articles by David Proctor"
- "Show me recent content on machine learning"
- "Fetch the full article about AI transformation"
## Architecture: One Server, Multiple Clients
```
┌──────────────────────────────────────────────────────────┐
│                       MCP Server                         │
│              (ai-coe-mcp.latentgenius.ai)                │
│                                                          │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐       │
│  │   Search    │  │ List Recent │  │    Fetch    │       │
│  │    Tool     │  │    Tool     │  │    Tool     │       │
│  └─────────────┘  └─────────────┘  └─────────────┘       │
└──────────────────────────────────────────────────────────┘
                           │
                           │  Multiple Protocol Support
                           │
        ┌──────────────────┼──────────────────┐
        │                  │                  │
        ▼                  ▼                  ▼
┌───────────────┐  ┌───────────────┐  ┌───────────────┐
│    Claude     │  │    Cursor     │  │    ChatGPT    │
│    Desktop    │  │               │  │ Deep Research │
│               │  │               │  │               │
│  (stdio MCP)  │  │  (stdio MCP)  │  │(HTTP JSON-RPC)│
└───────────────┘  └───────────────┘  └───────────────┘
```
## Testing the Server
You can test the server directly using HTTP endpoints:
```bash
# Health check
curl https://ai-coe-mcp.latentgenius.ai/health

# List available tools
curl https://ai-coe-mcp.latentgenius.ai/tools

# Search for articles
curl -X POST https://ai-coe-mcp.latentgenius.ai/tools/search \
  -H "Content-Type: application/json" \
  -d '{"query": "agentic frameworks"}'

# Test MCP protocol (ChatGPT format)
curl -X POST https://ai-coe-mcp.latentgenius.ai/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","method":"tools/list","params":{},"id":1}'
```
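The same checks can be scripted with Node's built-in `fetch` (Node 18+). The endpoints and the JSON-RPC body come from the curl commands above; the `RUN_LIVE` guard is an illustrative convention so the sketch can be read or dry-run without network access:

```javascript
// Reproduce the curl checks above with Node's global fetch (Node 18+).
const BASE = "https://ai-coe-mcp.latentgenius.ai";

// Build the JSON-RPC body for the ChatGPT-style /mcp endpoint.
function toolsListBody() {
  return { jsonrpc: "2.0", method: "tools/list", params: {}, id: 1 };
}

// GET /health and report whether the server responded 2xx.
async function healthCheck() {
  const res = await fetch(`${BASE}/health`);
  return res.ok;
}

// POST tools/list to /mcp and return the parsed JSON-RPC response.
async function listTools() {
  const res = await fetch(`${BASE}/mcp`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(toolsListBody()),
  });
  return res.json();
}

// Hit the live server only when explicitly requested.
if (process.env.RUN_LIVE === "1") {
  healthCheck().then((ok) => console.log("health:", ok));
  listTools().then((tools) => console.log(JSON.stringify(tools, null, 2)));
}
```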
## Local Development
Want to run your own instance?
```bash
# Clone and set up
git clone https://github.com/dp-pcs/Trilogy-AI-CoE-MCP-Remote.git
cd trilogy-ai-coe-mcp-remote
npm install

# Configure environment
cp env.example .env
# Edit .env with your settings

# Build and run
npm run build
npm start

# Development mode
npm run dev
```
## Key Features
- **Remote Deployment**: Cloud-hosted, no local installation needed
- **Universal Protocol**: Works with any MCP-compatible client
- **Multiple Interfaces**: stdio MCP + HTTP JSON-RPC
- **Real-time Data**: Live access to Trilogy AI CoE content
- **Secure**: HTTPS endpoints with proper CORS
- **Reliable**: Built-in caching and error handling
## Documentation

- Detailed ChatGPT Deep Research setup
- Local installation instructions
- Deploy your own instance
## About This Project
This project demonstrates the power of the Model Context Protocol to create universal AI tool integrations. By implementing both stdio MCP (for Claude/Cursor) and HTTP JSON-RPC (for ChatGPT), a single server can serve multiple AI assistants with their preferred integration methods.
**Key Insight**: MCP enables developers to write tools once and use them across the entire AI ecosystem, rather than building separate integrations for each AI assistant.
## Contributing

1. Fork the repository
2. Create a feature branch: `git checkout -b feature-name`
3. Make your changes and test thoroughly
4. Submit a pull request
## License
MIT License - see file for details.
Built by the AI Center of Excellence at Trilogy to showcase practical MCP implementations and universal AI assistant integrations.