
MCP Server Client Boilerplate

A comprehensive Model Context Protocol (MCP) implementation featuring a TypeScript server with AI company data tools and a modern React client with OpenAI integration.

Overview

This project demonstrates a complete MCP ecosystem with:

  • MCP Server: A TypeScript-based server providing tools for querying AI company information from a PostgreSQL database
  • MCP Client: A modern React application with Material-UI components, OpenAI integration, and real-time chat functionality

Server Component (mcp_server)

Features

  • Database Integration: Uses Prisma ORM with PostgreSQL to store and retrieve AI company and model data
  • MCP Tools: Provides comprehensive tools for querying AI company information:
    • getCompanies: Retrieves all companies with their chatbots and LLM models
    • getChats: Gets chatbots for a specific company
    • getLLMs: Gets LLM models for a specific company
    • diagnostic: Health check and diagnostic tool
  • Type Safety: Built with TypeScript for robust development experience
  • Database Seeding: Includes comprehensive AI company data from CSV files
  • Docker Support: Containerized deployment with health checks
  • Express Integration: HTTP server with health endpoints
  • MCP Inspector: Built-in debugging and inspection tools

Database Schema

The application uses a relational schema with three main entities:

  • Company: Stores AI company information (id, company name, description) with one-to-many relationships to chats and LLM models
  • Chat: Stores chatbot information (chatbot name) linked to companies
  • LLM: Stores LLM model information with specializations linked to companies
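The three entities above map to a Prisma schema roughly like the following. This is an illustrative sketch only — field names and types are assumptions; see prisma/schema.prisma for the actual definition.

```prisma
// Sketch of the relational schema described above (field names assumed)
model Company {
  id          Int    @id @default(autoincrement())
  name        String
  description String
  chats       Chat[]
  llms        LLM[]
}

model Chat {
  id        Int     @id @default(autoincrement())
  name      String
  company   Company @relation(fields: [companyId], references: [id])
  companyId Int
}

model LLM {
  id             Int     @id @default(autoincrement())
  name           String
  specialization String
  company        Company @relation(fields: [companyId], references: [id])
  companyId      Int
}
```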

Quick Start (Server)

  1. Install dependencies:

    cd mcp_server
    yarn install
    
  2. Set up environment:

    # Create .env file with:
    DATABASE_URL="postgresql://username:password@localhost:5432/your_database_name"
    PORT=3100
    NODE_ENV=development
    
  3. Setup database:

    yarn prisma:generate
    yarn prisma:migrate
    yarn prisma:seed
    
  4. Run the server:

    yarn dev
    

Available Scripts (Server)

  • yarn dev - Start development server with hot reload
  • yarn build - Build TypeScript to JavaScript
  • yarn start - Start production server
  • yarn inspect - Launch MCP Inspector for debugging
  • yarn inspect:unsafe - Launch MCP Inspector without authentication
  • yarn prisma:studio - Open Prisma Studio database GUI
  • yarn prisma:seed-chats - Seed additional chat data
  • yarn prisma:populate-chats - Populate chat relationships

Docker Deployment

cd mcp_server
docker-compose build
docker-compose up -d
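The health checks mentioned under Features are configured in docker-compose.yml. A typical stanza looks like the following — service name, interval, and timings here are illustrative, not the repository's exact configuration:

```yaml
services:
  mcp_server:
    build: .
    ports:
      - "3100:3100"
    healthcheck:
      # Poll the Express /health endpoint
      test: ["CMD", "curl", "-f", "http://localhost:3100/health"]
      interval: 30s
      timeout: 5s
      retries: 3
```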

Client Component (mcp_client)

Features

  • React 19 Application: Built with the latest React and TypeScript for type safety
  • Vite Build Tool: Lightning-fast development with instant HMR and optimized builds
  • Material-UI v7: Modern UI components with custom styling architecture
  • OpenAI Integration: Direct integration with OpenAI API for chat functionality
  • MCP Service Integration: Connects to MCP server for AI company data retrieval
  • Real-time Chat Interface: Interactive chat with suggested questions and message history
  • Component Architecture: Clean separation of concerns with custom styling hooks
  • Error Handling: Comprehensive error handling with user-friendly alerts
  • Connection Status: Real-time MCP server connection monitoring
  • CORS-Free Operation: Vite proxy configuration eliminates CORS issues
  • Remote Server Support: Seamless connection to remote MCP servers via proxy
  • Flexible Configuration: Environment-based proxy targeting for different deployment scenarios

Key Components

  • Chat: Main chat interface with message handling and OpenAI integration
  • Header: Application header with connection status indicators
  • MessageList: Scrollable message history with user/assistant message styling
  • ChatInput: Multi-line input with keyboard shortcuts (Enter to send, Shift+Enter for new line)
  • SuggestedQuestions: Pre-defined questions to help users get started
  • ErrorAlert: User-friendly error display and handling
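The Enter/Shift+Enter behavior in ChatInput reduces to a single check. A framework-agnostic sketch — the helper name is hypothetical; the real component wires this logic into its keydown handler:

```typescript
// Hypothetical helper: Enter sends the message, Shift+Enter inserts a newline.
// ChatInput would call this from its onKeyDown handler and, when it returns
// true, prevent the default newline and submit instead.
function shouldSend(key: string, shiftKey: boolean): boolean {
  return key === "Enter" && !shiftKey;
}
```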

Quick Start (Client)

  1. Install dependencies:

    cd mcp_client
    npm install
    
  2. Set up environment (optional):

    # Create .env.local file for OpenAI API key:
    VITE_OPENAI_API_KEY=your_openai_api_key_here
    
  3. Development server:

    npm run dev
    
  4. Production build:

    npm run build
    npm run preview
    

The development server will be available at http://localhost:5173 with instant hot module replacement.

Project Structure

mcp_server_client_boilerplate/
├── mcp_server/              # MCP Server with database tools
│   ├── server/              # Server implementation
│   │   ├── server.ts        # Main server entry point
│   │   └── tools/           # MCP tool implementations
│   ├── prisma/              # Database schema and migrations
│   │   ├── schema.prisma    # Database schema definition
│   │   ├── migrations/      # Database migration files
│   │   └── seed-lastone.*   # Database seeding scripts
│   ├── scripts/             # Utility scripts for data population
│   ├── docker-compose.yml   # Docker configuration
│   ├── Dockerfile           # Docker image definition
│   └── package.json         # Server dependencies
├── mcp_client/              # React client application (Vite)
│   ├── src/                 # React source code
│   │   ├── main.tsx         # Application entry point
│   │   ├── App.tsx          # Main app component
│   │   ├── Chat.tsx         # Main chat interface
│   │   ├── components/      # Reusable UI components
│   │   │   ├── *Styles.tsx  # Component styling hooks
│   │   │   └── *.tsx        # Component implementations
│   │   ├── services/        # API and service integrations
│   │   │   ├── mcpService.ts    # MCP server communication
│   │   │   └── openaiService.ts # OpenAI API integration
│   │   ├── types/           # TypeScript type definitions
│   │   └── constants.ts     # Application constants
│   ├── index.html           # HTML entry point
│   ├── vite.config.ts       # Vite configuration
│   ├── tsconfig.json        # TypeScript configuration
│   └── package.json         # Client dependencies
└── README.md               # This file

Technologies Used

Server

  • MCP SDK: @modelcontextprotocol/sdk v1.17.1 for building MCP servers
  • Prisma: Database ORM v6.13.0 for PostgreSQL
  • TypeScript: Type-safe JavaScript development v5.9.2
  • PostgreSQL: Relational database for data storage
  • Express: HTTP server framework v5.1.0
  • Docker: Containerization for deployment
  • Zod: Runtime type validation v3.25.67
  • Faker.js: Test data generation v9.9.0

Client

  • React 19: Latest React v19.1.1 with concurrent features
  • Vite 7: Next-generation frontend tooling v7.0.6 with instant HMR
  • TypeScript 5.9: Latest TypeScript v5.9.2 with advanced type features
  • Material-UI v7: Modern React UI component library v7.2.0
  • Emotion: CSS-in-JS styling v11.14.0
  • OpenAI: Direct API integration v5.11.0
  • ESLint 9: Code quality and consistency v9.32.0
  • npm: Package management

Sample Data

The server includes comprehensive AI company data with:

  • 22+ unique companies (OpenAI, Google, Meta AI, Anthropic, Cohere, etc.)
  • Chat platforms (ChatGPT, Gemini, Claude, Copilot, etc.)
  • LLM models with specializations (GPT-4, PaLM 2, LLaMA, Claude-3, etc.)

How It Works

MCP Integration Flow

  1. Client Request: User asks a question about AI companies
  2. OpenAI Processing: Client sends query to OpenAI with function calling enabled
  3. MCP Tool Invocation: OpenAI determines which MCP tools to call based on the query
  4. Server Query: MCP client calls the appropriate server endpoints
  5. Database Retrieval: Server queries PostgreSQL database using Prisma
  6. Response Assembly: Data is returned through the MCP protocol
  7. AI Response: OpenAI generates a natural language response using the retrieved data
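Step 4 of the flow above can be sketched as a pure translation from an OpenAI function call into an MCP tools/call JSON-RPC request. The types and function names here are illustrative, not the actual mcpService.ts API:

```typescript
// Illustrative only: the two fields of an OpenAI function call
// that the MCP client needs.
interface ToolCall {
  name: string;      // e.g. "getChats"
  arguments: string; // JSON-encoded arguments produced by the model
}

// Build the MCP tools/call JSON-RPC request (step 4 of the flow).
function toMcpRequest(call: ToolCall, id: number) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name: call.name, arguments: JSON.parse(call.arguments) },
  };
}
```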

Architecture Benefits

  • Separation of Concerns: MCP server handles data, client handles UI/UX
  • Type Safety: End-to-end TypeScript ensures robust development
  • Real-time Updates: Instant feedback on connection status and errors
  • Scalable Design: Easy to add new tools and extend functionality

Development

Client Development with Vite

Built with Vite for a superior development experience:

Performance Benefits:

  • Lightning Fast Startup: Dev server starts in ~400ms
  • Instant HMR: Hot Module Replacement updates in milliseconds
  • Optimized Builds: Modern bundling with tree-shaking and code splitting
  • Native ESM: Leverages browser's native ES modules for faster loading

Modern Architecture:

  • Component Styling: Separated styling logic using custom hooks
  • Service Layer: Clean separation between UI and API logic
  • Error Boundaries: Comprehensive error handling throughout the app
  • TypeScript Integration: Full type safety across all components

Adding New Tools (Server)

  1. Create new tool file in server/tools/
  2. Implement the tool following the MCP SDK patterns
  3. Register the tool in server/tools/index.ts
  4. Export the registration function

Example tool structure:

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { CallToolRequestSchema } from "@modelcontextprotocol/sdk/types.js";

export const registerMyTool = (server: Server) => {
  server.setRequestHandler(CallToolRequestSchema, async (request) => {
    // Inspect request.params.name / request.params.arguments and
    // return { content: [{ type: "text", text: ... }] }
  });
};

Database Changes (Server)

  1. Modify prisma/schema.prisma
  2. Run: yarn prisma:migrate to create migration
  3. Update client: yarn prisma:generate
  4. Update seed data if needed

Customizing Components (Client)

  1. Components: Modify files in src/components/
  2. Styles: Update corresponding *Styles.tsx files
  3. Types: Update type definitions in src/types/
  4. Services: Extend API integrations in src/services/

Adding New UI Features

The client uses a clean architecture pattern:

  • Styling: Use custom hooks (e.g., useMessageListStyles)
  • State Management: React hooks with proper TypeScript typing
  • API Integration: Extend mcpService.ts or openaiService.ts
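A styling hook like the useMessageListStyles mentioned above typically just returns a styles object. A hypothetical sketch — the real hooks in src/components/*Styles.tsx may return MUI sx props and differ in shape:

```typescript
// Hypothetical shape of a *Styles.tsx hook: one object per styled region,
// keeping style definitions out of the component's render logic.
function useMessageListStyles() {
  return {
    container: { display: "flex", flexDirection: "column", gap: 8 },
    userMessage: { alignSelf: "flex-end" },      // user messages on the right
    assistantMessage: { alignSelf: "flex-start" }, // assistant messages on the left
  };
}
```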

Testing

Server Testing

  • MCP Inspector: yarn inspect (interactive tool debugging)
  • MCP Inspector (Unsafe): yarn inspect:unsafe (no auth required)
  • Health Check: GET http://localhost:3100/health
  • Direct Tool Testing: Use Postman with JSON-RPC requests
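A direct tool invocation from Postman is a standard MCP tools/call JSON-RPC request. Assuming the server accepts MCP over HTTP, the body would look roughly like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "getCompanies",
    "arguments": {}
  }
}
```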

Client Testing

  • Development Server: npm run dev (runs on http://localhost:5173 with HMR)
  • Production Build: npm run build (TypeScript compilation + Vite build)
  • Preview Build: npm run preview (preview production build on http://localhost:4173)
  • Linting: npm run lint (ESLint code quality checks)
  • Manual Testing: Test MCP server connection, OpenAI integration, and chat functionality

Getting Started (Full Setup)

Prerequisites

  • Node.js 18+ and npm/yarn
  • PostgreSQL database
  • OpenAI API key (optional, for client chat functionality)

Complete Setup

  1. Clone the repository:

    git clone <repository-url>
    cd mcp_server_client_boilerplate
    
  2. Setup MCP Server:

    cd mcp_server
    yarn install
    # Create .env with DATABASE_URL
    yarn prisma:generate
    yarn prisma:migrate
    yarn prisma:seed
    yarn dev
    
  3. Setup MCP Client (in new terminal):

    cd mcp_client
    npm install
    # Create .env.local with VITE_OPENAI_API_KEY (optional)
    npm run dev
    
  4. Access the application: open http://localhost:5173 in your browser (the server health endpoint is at http://localhost:3100/health)

Configuration

Environment Variables

Server (.env):

DATABASE_URL="postgresql://username:password@localhost:5432/database_name"
PORT=3100
NODE_ENV=development

Client (.env.local):

# OpenAI API key (needed for chat functionality)
VITE_OPENAI_API_KEY=your_openai_api_key_here

# MCP Server URL (uses proxy to avoid CORS)
VITE_MCP_SERVER_URL=/api/mcp/mcp

# Optional: For remote server connections
# VITE_MCP_SERVER_URL=https://your-remote-server.com/mcp

CORS Resolution

The client uses Vite proxy configuration to eliminate CORS issues when connecting to MCP servers:

How it works:

  1. Client makes requests to /api/mcp/* (same-origin)
  2. Vite proxy intercepts and forwards to the actual MCP server
  3. Server-to-server communication bypasses browser CORS restrictions

Proxy Configuration (vite.config.ts):

export default defineConfig(() => {
  const proxyTarget = process.env.VITE_PROXY_TARGET || "https://mcp.bmcom.ca";

  return {
    // ... other config
    server: {
      proxy: {
        "/api/mcp": {
          target: proxyTarget,
          changeOrigin: true,
          secure: proxyTarget.startsWith("https://"),
          rewrite: (path) => path.replace(/^\/api\/mcp/, ""),
        },
      },
    },
  };
});

Benefits:

  • ✅ No CORS configuration needed on the server
  • ✅ Works with any MCP server (local or remote)
  • ✅ Seamless development experience
  • ✅ Production-ready proxy setup

Troubleshooting

Common Issues

  1. MCP Server Connection Failed: Ensure server is running on port 3100
  2. Database Connection Error: Check PostgreSQL is running and DATABASE_URL is correct
  3. OpenAI API Error: Verify API key is set and has sufficient credits
  4. Port Conflicts: Client runs on 5173, server on 3100 - ensure ports are available
  5. CORS Errors:
    • Use proxy configuration: Set VITE_MCP_SERVER_URL=/api/mcp/mcp in client .env
    • Update vite.config.ts proxy target to your MCP server URL
    • Restart development server after proxy changes
  6. Remote Server Connection Issues:
    • Verify remote server is accessible (test with Postman)
    • Check proxy target URL in vite.config.ts
    • Ensure remote server has proper CORS headers (if not using proxy)
    • Test health endpoint: https://your-server.com/health

License

ISC (Server) / MIT (Client)

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Test both server and client thoroughly
  5. Submit a pull request with detailed description