MCP Server Client Boilerplate
A comprehensive Model Context Protocol (MCP) implementation featuring a TypeScript server with AI company data tools and a modern React client with OpenAI integration.
Overview
This project demonstrates a complete MCP ecosystem with:
- MCP Server: A TypeScript-based server providing tools for querying AI company information from a PostgreSQL database
- MCP Client: A modern React application with Material-UI components, OpenAI integration, and real-time chat functionality
Server Component (`mcp_server`)
Features
- Database Integration: Uses Prisma ORM with PostgreSQL to store and retrieve AI company and model data
- MCP Tools: Provides comprehensive tools for querying AI company information:
  - `getCompanies`: Retrieves all companies with their chatbots and LLM models
  - `getChats`: Gets chatbots for a specific company
  - `getLLMs`: Gets LLM models for a specific company
  - `diagnostic`: Health check and diagnostic tool
- Type Safety: Built with TypeScript for robust development experience
- Database Seeding: Includes comprehensive AI company data from CSV files
- Docker Support: Containerized deployment with health checks
- Express Integration: HTTP server with health endpoints
- MCP Inspector: Built-in debugging and inspection tools
Database Schema
The application uses a relational schema with three main entities:
- Company: Stores AI company information (id, company name, description) with one-to-many relationships
- Chat: Stores chatbot information (chatbot name) linked to companies
- LLM: Stores LLM model information with specializations linked to companies
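A minimal sketch of this schema in Prisma (field names are illustrative; the actual definition lives in `prisma/schema.prisma`):

```prisma
model Company {
  id          Int    @id @default(autoincrement())
  name        String
  description String
  chats       Chat[]
  llms        LLM[]
}

model Chat {
  id        Int     @id @default(autoincrement())
  name      String
  company   Company @relation(fields: [companyId], references: [id])
  companyId Int
}

model LLM {
  id             Int     @id @default(autoincrement())
  name           String
  specialization String
  company        Company @relation(fields: [companyId], references: [id])
  companyId      Int
}
```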
Quick Start (Server)
1. Install dependencies:

   ```bash
   cd mcp_server
   yarn install
   ```

2. Set up environment (create a `.env` file):

   ```bash
   DATABASE_URL="postgresql://username:password@localhost:5432/your_database_name"
   PORT=3100
   NODE_ENV=development
   ```

3. Set up the database:

   ```bash
   yarn prisma:generate
   yarn prisma:migrate
   yarn prisma:seed
   ```

4. Run the server:

   ```bash
   yarn dev
   ```
Available Scripts (Server)
- `yarn dev` - Start development server with hot reload
- `yarn build` - Build TypeScript to JavaScript
- `yarn start` - Start production server
- `yarn inspect` - Launch MCP Inspector for debugging
- `yarn inspect:unsafe` - Launch MCP Inspector without authentication
- `yarn prisma:studio` - Open Prisma Studio database GUI
- `yarn prisma:seed-chats` - Seed additional chat data
- `yarn prisma:populate-chats` - Populate chat relationships
Docker Deployment
```bash
cd mcp_server
docker-compose build
docker-compose up -d
```
Client Component (`mcp_client`)
Features
- React 19 Application: Built with the latest React and TypeScript for type safety
- Vite Build Tool: Lightning-fast development with instant HMR and optimized builds
- Material-UI v7: Modern UI components with custom styling architecture
- OpenAI Integration: Direct integration with OpenAI API for chat functionality
- MCP Service Integration: Connects to MCP server for AI company data retrieval
- Real-time Chat Interface: Interactive chat with suggested questions and message history
- Component Architecture: Clean separation of concerns with custom styling hooks
- Error Handling: Comprehensive error handling with user-friendly alerts
- Connection Status: Real-time MCP server connection monitoring
- CORS-Free Operation: Vite proxy configuration eliminates CORS issues
- Remote Server Support: Seamless connection to remote MCP servers via proxy
- Flexible Configuration: Environment-based proxy targeting for different deployment scenarios
Key Components
- Chat: Main chat interface with message handling and OpenAI integration
- Header: Application header with connection status indicators
- MessageList: Scrollable message history with user/assistant message styling
- ChatInput: Multi-line input with keyboard shortcuts (Enter to send, Shift+Enter for new line)
- SuggestedQuestions: Pre-defined questions to help users get started
- ErrorAlert: User-friendly error display and handling
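The ChatInput keyboard behavior (Enter to send, Shift+Enter for a new line) boils down to a small predicate, sketched here with a hypothetical `shouldSend` helper; the actual component logic may differ:

```typescript
// Returns true when the keystroke should submit the message:
// plain Enter sends, Shift+Enter falls through to insert a newline.
function shouldSend(key: string, shiftKey: boolean): boolean {
  return key === "Enter" && !shiftKey;
}

console.log(shouldSend("Enter", false)); // → true
console.log(shouldSend("Enter", true)); // → false
```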
Quick Start (Client)
1. Install dependencies:

   ```bash
   cd mcp_client
   npm install
   ```

2. Set up environment (optional, create a `.env.local` file for the OpenAI API key):

   ```bash
   VITE_OPENAI_API_KEY=your_openai_api_key_here
   ```

3. Start the development server:

   ```bash
   npm run dev
   ```

4. Create a production build:

   ```bash
   npm run build
   npm run preview
   ```

The development server will be available at http://localhost:5173 with instant hot module replacement.
Project Structure
```
mcp_server_client_boilerplate/
├── mcp_server/                  # MCP Server with database tools
│   ├── server/                  # Server implementation
│   │   ├── server.ts            # Main server entry point
│   │   └── tools/               # MCP tool implementations
│   ├── prisma/                  # Database schema and migrations
│   │   ├── schema.prisma        # Database schema definition
│   │   ├── migrations/          # Database migration files
│   │   └── seed-lastone.*       # Database seeding scripts
│   ├── scripts/                 # Utility scripts for data population
│   ├── docker-compose.yml       # Docker configuration
│   ├── Dockerfile               # Docker image definition
│   └── package.json             # Server dependencies
├── mcp_client/                  # React client application (Vite)
│   ├── src/                     # React source code
│   │   ├── main.tsx             # Application entry point
│   │   ├── App.tsx              # Main app component
│   │   ├── Chat.tsx             # Main chat interface
│   │   ├── components/          # Reusable UI components
│   │   │   ├── *Styles.tsx      # Component styling hooks
│   │   │   └── *.tsx            # Component implementations
│   │   ├── services/            # API and service integrations
│   │   │   ├── mcpService.ts    # MCP server communication
│   │   │   └── openaiService.ts # OpenAI API integration
│   │   ├── types/               # TypeScript type definitions
│   │   └── constants.ts         # Application constants
│   ├── index.html               # HTML entry point
│   ├── vite.config.ts           # Vite configuration
│   ├── tsconfig.json            # TypeScript configuration
│   └── package.json             # Client dependencies
└── README.md                    # This file
```
Technologies Used
Server
- MCP SDK: `@modelcontextprotocol/sdk` v1.17.1 for building MCP servers
- Prisma: Database ORM v6.13.0 for PostgreSQL
- TypeScript: Type-safe JavaScript development v5.9.2
- PostgreSQL: Relational database for data storage
- Express: HTTP server framework v5.1.0
- Docker: Containerization for deployment
- Zod: Runtime type validation v3.25.67
- Faker.js: Test data generation v9.9.0
Client
- React 19: Latest React v19.1.1 with concurrent features
- Vite 7: Next-generation frontend tooling v7.0.6 with instant HMR
- TypeScript 5.9: Latest TypeScript v5.9.2 with advanced type features
- Material-UI v7: Modern React UI component library v7.2.0
- Emotion: CSS-in-JS styling v11.14.0
- OpenAI: Direct API integration v5.11.0
- ESLint 9: Code quality and consistency v9.32.0
- npm: Package management
Sample Data
The server includes comprehensive AI company data with:
- 22+ unique companies (OpenAI, Google, Meta AI, Anthropic, Cohere, etc.)
- Chat platforms (ChatGPT, Gemini, Claude, Copilot, etc.)
- LLM models with specializations (GPT-4, PaLM 2, LLaMA, Claude-3, etc.)
How It Works
MCP Integration Flow
1. Client Request: User asks a question about AI companies
2. OpenAI Processing: The client sends the query to OpenAI with function calling enabled
3. MCP Tool Invocation: OpenAI determines which MCP tools to call based on the query
4. Server Query: The MCP client calls the appropriate server endpoints
5. Database Retrieval: The server queries the PostgreSQL database using Prisma
6. Response Assembly: Data is returned through the MCP protocol
7. AI Response: OpenAI generates a natural language response using the retrieved data
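As an illustration of steps 3-4, an OpenAI function call can be translated into an MCP `tools/call` request roughly like this (the `toMcpRequest` helper and argument shapes are assumptions, not the actual `mcpService.ts` implementation):

```typescript
// Shape of a tool call as returned by the OpenAI chat completions API;
// `arguments` arrives as a JSON-encoded string.
interface OpenAIToolCall {
  function: { name: string; arguments: string };
}

// Build the JSON-RPC 2.0 payload the MCP server expects for tools/call.
function toMcpRequest(call: OpenAIToolCall, id: number) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: {
      name: call.function.name,
      arguments: JSON.parse(call.function.arguments),
    },
  };
}

const req = toMcpRequest(
  { function: { name: "getChats", arguments: '{"companyName":"OpenAI"}' } },
  1
);
console.log(req.method, req.params.name); // → tools/call getChats
```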
Architecture Benefits
- Separation of Concerns: MCP server handles data, client handles UI/UX
- Type Safety: End-to-end TypeScript ensures robust development
- Real-time Updates: Instant feedback on connection status and errors
- Scalable Design: Easy to add new tools and extend functionality
Development
Client Development with Vite
Built with Vite for superior development experience:
Performance Benefits:
- Lightning Fast Startup: Dev server starts in ~400ms
- Instant HMR: Hot Module Replacement updates in milliseconds
- Optimized Builds: Modern bundling with tree-shaking and code splitting
- Native ESM: Leverages browser's native ES modules for faster loading
Modern Architecture:
- Component Styling: Separated styling logic using custom hooks
- Service Layer: Clean separation between UI and API logic
- Error Boundaries: Comprehensive error handling throughout the app
- TypeScript Integration: Full type safety across all components
Adding New Tools (Server)
1. Create a new tool file in `server/tools/`
2. Implement the tool following the MCP SDK patterns
3. Register the tool in `server/tools/index.ts`
4. Export the registration function
Example tool structure:

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { CallToolRequestSchema } from "@modelcontextprotocol/sdk/types.js";

export const registerMyTool = (server: Server) => {
  server.setRequestHandler(CallToolRequestSchema, async (request) => {
    // Tool implementation: inspect request.params.name / arguments
    // and return { content: [{ type: "text", text: "..." }] }
  });
};
```
Database Changes (Server)
1. Modify `prisma/schema.prisma`
2. Run `yarn prisma:migrate` to create a migration
3. Update the client: `yarn prisma:generate`
4. Update seed data if needed
Customizing Components (Client)
- Components: Modify files in `src/components/`
- Styles: Update the corresponding `*Styles.tsx` files
- Types: Update type definitions in `src/types/`
- Services: Extend API integrations in `src/services/`
Adding New UI Features
The client uses a clean architecture pattern:
- Styling: Use custom hooks (e.g., `useMessageListStyles`)
- State Management: React hooks with proper TypeScript typing
- API Integration: Extend `mcpService.ts` or `openaiService.ts`
Testing
Server Testing
- MCP Inspector: `yarn inspect` (interactive tool debugging)
- MCP Inspector (Unsafe): `yarn inspect:unsafe` (no auth required)
- Health Check: `GET http://localhost:3100/health`
- Direct Tool Testing: Use Postman with JSON-RPC requests
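For direct tool testing, a JSON-RPC body along these lines can be POSTed to the server's MCP endpoint (the exact endpoint path depends on your deployment; `getCompanies` is one of the tools listed above):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "getCompanies",
    "arguments": {}
  }
}
```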
Client Testing
- Development Server: `npm run dev` (runs on http://localhost:5173 with HMR)
- Production Build: `npm run build` (TypeScript compilation + Vite build)
- Preview Build: `npm run preview` (preview production build on http://localhost:4173)
- Linting: `npm run lint` (ESLint code quality checks)
- Manual Testing: Test MCP server connection, OpenAI integration, and chat functionality
Getting Started (Full Setup)
Prerequisites
- Node.js 18+ and npm/yarn
- PostgreSQL database
- OpenAI API key (optional, for client chat functionality)
Complete Setup
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd mcp_server_client_boilerplate
   ```

2. Set up the MCP server:

   ```bash
   cd mcp_server
   yarn install
   # Create .env with DATABASE_URL
   yarn prisma:generate
   yarn prisma:migrate
   yarn prisma:seed
   yarn dev
   ```

3. Set up the MCP client (in a new terminal):

   ```bash
   cd mcp_client
   npm install
   # Create .env.local with VITE_OPENAI_API_KEY (optional)
   npm run dev
   ```

4. Access the application:
   - Client: http://localhost:5173
   - Server health: http://localhost:3100/health
   - MCP Inspector: `yarn inspect` (from the server directory)
Configuration
Environment Variables
Server (`.env`):

```bash
DATABASE_URL="postgresql://username:password@localhost:5432/database_name"
PORT=3100
NODE_ENV=development
```
Client (`.env.local`):

```bash
# OpenAI API key (required for chat functionality)
VITE_OPENAI_API_KEY=your_openai_api_key_here

# MCP server URL (uses the Vite proxy to avoid CORS)
VITE_MCP_SERVER_URL=/api/mcp/mcp

# Optional: for remote server connections
# VITE_MCP_SERVER_URL=https://your-remote-server.com/mcp
```
CORS Resolution
The client uses Vite proxy configuration to eliminate CORS issues when connecting to MCP servers:
How it works:
- The client makes requests to `/api/mcp/*` (same-origin)
- The Vite proxy intercepts them and forwards to the actual MCP server
- Server-to-server communication bypasses browser CORS restrictions
Proxy Configuration (`vite.config.ts`):

```typescript
export default defineConfig(() => {
  const proxyTarget = process.env.VITE_PROXY_TARGET || "https://mcp.bmcom.ca";
  return {
    // ... other config
    server: {
      proxy: {
        "/api/mcp": {
          target: proxyTarget,
          changeOrigin: true,
          secure: proxyTarget.startsWith("https://"),
          rewrite: (path) => path.replace(/^\/api\/mcp/, ""),
        },
      },
    },
  };
});
```
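The `rewrite` option strips the `/api/mcp` prefix before forwarding, so a client request to `/api/mcp/mcp` reaches the target as `/mcp`:

```typescript
// Same rewrite rule as in the proxy configuration above.
const rewrite = (path: string) => path.replace(/^\/api\/mcp/, "");

console.log(rewrite("/api/mcp/mcp")); // → "/mcp"
console.log(rewrite("/api/mcp/health")); // → "/health"
```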
Benefits:
- ✅ No CORS configuration needed on the server
- ✅ Works with any MCP server (local or remote)
- ✅ Seamless development experience
- ✅ Production-ready proxy setup
Troubleshooting
Common Issues
- MCP Server Connection Failed: Ensure the server is running on port 3100
- Database Connection Error: Check that PostgreSQL is running and DATABASE_URL is correct
- OpenAI API Error: Verify the API key is set and has sufficient credits
- Port Conflicts: The client runs on 5173 and the server on 3100 - ensure both ports are available
- CORS Errors:
  - Use the proxy configuration: set `VITE_MCP_SERVER_URL=/api/mcp/mcp` in the client `.env.local`
  - Update the `vite.config.ts` proxy target to your MCP server URL
  - Restart the development server after proxy changes
- Remote Server Connection Issues:
  - Verify the remote server is accessible (test with Postman)
  - Check the proxy target URL in `vite.config.ts`
  - Ensure the remote server has proper CORS headers (if not using the proxy)
  - Test the health endpoint: `https://your-server.com/health`
License
ISC (Server) / MIT (Client)
Contributing
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Test both server and client thoroughly
5. Submit a pull request with a detailed description