MCP Server Node Starter

A starter template for building MCP (Model Context Protocol) servers with Node.js, TypeScript, and Express. It features Streamable HTTP transport, a web search tool powered by Tavily (the single tool included), and a demo AI client that uses OpenAI to drive tool calls.

License: MIT Ā· TypeScript Ā· Node.js

Features

  • āœ… Streamable HTTP Transport - Web-accessible MCP server
  • āœ… Web Search Tool - Powered by Tavily AI search
  • āœ… AI Client - OpenAI integration with tool use
  • āœ… Scalable Architecture - Clean folder structure for growth
  • āœ… TypeScript - Full type safety with strict mode
  • āœ… Production Ready - Logging, error handling, configuration

Quick Start

1. Clone and Install

git clone git@github.com:charlesgobina/mcp-server-node-starter.git
cd mcp-server-node-starter
npm install

2. Configure Environment

Copy the example environment file:

cp .env.example .env

Edit .env with your API keys:

OPENAI_API_KEY=sk-your-openai-key
TAVILY_API_KEY=tvly-your-tavily-key

Get API keys: an OpenAI key from https://platform.openai.com and a Tavily key from https://tavily.com.

3. Build and Start the Server

npm run build
npm start

You should see:

šŸš€ MCP Server running on http://localhost:3000
šŸ“” MCP endpoint: http://localhost:3000/mcp
šŸ’š Health check: http://localhost:3000/health

4. Test with AI Client

In a new terminal:

npm run ask "Search for the latest AI developments"

Or for development with auto-reload:

npm run dev

Project Structure

src/
ā”œā”€ā”€ config/              # Configuration and environment
│   └── index.ts
ā”œā”€ā”€ types/               # TypeScript type definitions
│   └── index.ts
ā”œā”€ā”€ utils/               # Utilities (logger, etc.)
│   └── logger.ts
ā”œā”€ā”€ tools/               # MCP tools
│   ā”œā”€ā”€ index.ts         # Tool registration
│   └── web-search.tool.ts
ā”œā”€ā”€ services/            # Business logic
│   ā”œā”€ā”€ tool-registry.ts # Tool management
│   └── ai-client.ts     # AI integration (for client)
ā”œā”€ā”€ controllers/         # Request handlers
│   └── mcp.controller.ts
ā”œā”€ā”€ middleware/          # Express middleware (future)
ā”œā”€ā”€ server/              # Server entry point
│   └── index.ts
ā”œā”€ā”€ client/              # Client applications
│   └── cli.ts           # CLI client
└── app.ts               # Express app setup

Available Scripts

# Production
npm run build       # Compile TypeScript to dist/
npm start           # Start the built server
npm run ask "..."   # Ask a question using AI client + tools

# Development
npm run dev         # Build + start server with nodemon (auto-reload)
npm run start:dev   # Start server in dev mode (tsx, no build needed)
npm run ask:dev "..." # Ask using dev client (tsx, no build needed)
npm run build:watch # Watch mode for TypeScript compilation

# Utilities
npm run clean       # Remove dist folder

How It Works

Architecture

ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”      ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”      ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
│   AI Client │ ───► │  MCP Server  │ ───► │  Tavily API  │
│  (OpenAI)   │ ◄─── │   (Express)  │ ◄─── │ (Web Search) │
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜      ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜      ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜

Request Flow

  1. The user asks a question via the CLI client
  2. The client connects to the MCP server at http://localhost:3000/mcp
  3. The client fetches the list of available tools from the server
  4. The client sends the question to OpenAI along with the tools list
  5. OpenAI decides to call the web_search tool
  6. The client invokes the tool via the MCP protocol
  7. The server runs the web search using Tavily
  8. The results are sent back to OpenAI
  9. OpenAI formats the final answer for the user
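
The CLI client in src/client/cli.ts implements this flow. For reference, here is a minimal sketch of steps 2, 3, and 6 using the official @modelcontextprotocol/sdk client over Streamable HTTP. The endpoint URL and the web_search argument name (query) are assumptions based on this template, so treat it as illustrative rather than the bundled implementation.

import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

async function main() {
  // Connect to the MCP server over Streamable HTTP (step 2)
  const transport = new StreamableHTTPClientTransport(new URL('http://localhost:3000/mcp'));
  const client = new Client({ name: 'demo-client', version: '1.0.0' });
  await client.connect(transport);

  // Discover the tools the server exposes (step 3)
  const { tools } = await client.listTools();
  console.log('Available tools:', tools.map((t) => t.name).join(', '));

  // Call a tool directly over MCP (step 6), skipping the OpenAI decision step
  const result = await client.callTool({
    name: 'web_search',
    arguments: { query: 'latest AI developments' }, // argument name is an assumption
  });
  console.log(JSON.stringify(result, null, 2));

  await client.close();
}

main().catch(console.error);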

Adding New Tools

Step 1: Create Tool File

Create src/tools/your-tool.tool.ts:

import type { Tool, ToolResult } from '../types/index.js';
import { logger } from '../utils/logger.js';

async function yourToolHandler(args: { param1: string }): Promise<ToolResult> {
  try {
    // Your tool logic here
    const result = doSomething(args.param1);

    return {
      content: [{
        type: 'text',
        text: `Result: ${result}`,
      }],
    };
  } catch (error: any) {
    logger.error('Tool error:', error.message);
    return {
      content: [{ type: 'text', text: `Error: ${error.message}` }],
      isError: true,
    };
  }
}

export const yourTool: Tool = {
  name: 'your_tool',
  description: 'What your tool does',
  inputSchema: {
    type: 'object',
    properties: {
      param1: {
        type: 'string',
        description: 'Parameter description',
      },
    },
    required: ['param1'],
  },
  handler: yourToolHandler,
};

Step 2: Register Tool

In src/tools/index.ts:

import { yourTool } from './your-tool.tool.js';

export function registerAllTools() {
  logger.info('Registering tools...');

  toolRegistry.register(webSearchTool);
  toolRegistry.register(yourTool);  // Add this line

  logger.info(`Total tools registered: ${toolRegistry.count()}`);
}
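
The toolRegistry used above keeps tools in a map keyed by name so the MCP controller can list and invoke them. The sketch below shows roughly what such a registry can look like; the real implementation lives in src/services/tool-registry.ts and may differ in detail.

import type { Tool } from '../types/index.js';

// Illustrative sketch of a tool registry keyed by tool name.
// The actual class in src/services/tool-registry.ts may differ.
class ToolRegistry {
  private tools = new Map<string, Tool>();

  register(tool: Tool): void {
    this.tools.set(tool.name, tool);
  }

  get(name: string): Tool | undefined {
    return this.tools.get(name);
  }

  list(): Tool[] {
    return [...this.tools.values()];
  }

  count(): number {
    return this.tools.size;
  }
}

export const toolRegistry = new ToolRegistry();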

Step 3: Restart Server

# Stop server (Ctrl+C)
npm start

Your tool is now available to the AI client!

Configuration

Environment Variables

Variable          Required  Description
PORT              No        Server port (default: 3000)
NODE_ENV          No        Environment (development/production)
LOG_LEVEL         No        Logging level (debug/info/warn/error)
OPENAI_API_KEY    Yes       OpenAI API key
OPENAI_MODEL      No        Model to use (default: gpt-4o-mini)
TAVILY_API_KEY    Yes       Tavily API key for web search
SERVER_NAME       No        Server name for MCP
CORS_ORIGIN       No        CORS origin (default: *)

Server Configuration

Edit src/config/index.ts to modify server behavior.
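
That file typically gathers environment variables into a single typed object with sensible defaults. A rough sketch of that pattern, with property names chosen for illustration rather than taken from the actual file:

import 'dotenv/config';

// Illustrative shape of a centralized config object; the real
// src/config/index.ts may use different property names.
export const config = {
  port: Number(process.env.PORT ?? 3000),
  nodeEnv: process.env.NODE_ENV ?? 'development',
  logLevel: process.env.LOG_LEVEL ?? 'info',
  openaiApiKey: process.env.OPENAI_API_KEY ?? '',
  openaiModel: process.env.OPENAI_MODEL ?? 'gpt-4o-mini',
  tavilyApiKey: process.env.TAVILY_API_KEY ?? '',
  serverName: process.env.SERVER_NAME ?? 'mcp-server',
  corsOrigin: process.env.CORS_ORIGIN ?? '*',
};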

API Endpoints

Health Check

GET /health

Response:

{
  "status": "ok",
  "server": "mcp-server",
  "version": "1.0.0",
  "timestamp": "2025-10-04T17:39:00.000Z"
}

Server Info

GET /

Response:

{
  "name": "mcp-server",
  "version": "1.0.0",
  "status": "running",
  "transport": "Streamable HTTP",
  "endpoints": {
    "mcp": "POST /mcp - MCP protocol endpoint",
    "health": "GET /health - Health check"
  }
}

MCP Protocol

POST /mcp
Content-Type: application/json

{
  "jsonrpc": "2.0",
  "method": "tools/list",
  "id": 1
}
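
For a tools/list request like the one above, the server responds with a JSON-RPC result listing the registered tools, roughly of the following shape (the tool metadata shown here is illustrative):

{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "web_search",
        "description": "Search the web using Tavily",
        "inputSchema": {
          "type": "object",
          "properties": { "query": { "type": "string" } },
          "required": ["query"]
        }
      }
    ]
  }
}

Note that the Streamable HTTP transport expects clients to send an Accept header covering both application/json and text/event-stream and to perform the MCP initialize handshake before other calls; an MCP SDK client handles this automatically.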

Deployment

Docker (Coming Soon)

docker build -t mcp-server .
docker run -p 3000:3000 --env-file .env mcp-server

Cloud Platforms

Deploy to:

  • Railway: railway up
  • Render: Connect GitHub repo
  • Fly.io: fly deploy
  • Vercel: Serverless functions

Development

File Naming Conventions

  • Controllers: *.controller.ts
  • Services: *.service.ts
  • Tools: *.tool.ts
  • Types: *.types.ts
  • Utils: *.util.ts

Best Practices

  1. Always use type imports for TypeScript types
  2. Log important operations using the logger
  3. Handle errors gracefully in tools
  4. Validate inputs in tool handlers (see the sketch after this list)
  5. Keep tools focused - one tool, one purpose
  6. Document your code with JSDoc comments
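
As an example of point 4, a handler can reject malformed arguments before doing any work. A minimal sketch, reusing the param1 name from the earlier example tool:

import type { ToolResult } from '../types/index.js';

// Validate arguments up front and return an MCP-style error result
// instead of throwing (sketch only).
async function yourToolHandler(args: { param1?: unknown }): Promise<ToolResult> {
  if (typeof args.param1 !== 'string' || args.param1.trim().length === 0) {
    return {
      content: [{ type: 'text', text: 'Error: param1 must be a non-empty string' }],
      isError: true,
    };
  }

  // Safe to use args.param1 as a string from here on
  return {
    content: [{ type: 'text', text: `Received: ${args.param1}` }],
  };
}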

Troubleshooting

Server won't start

# Kill processes on port 3000
lsof -ti:3000 | xargs kill -9

# Restart
npm start

"Configuration errors"

Check .env file has required API keys:

  • OPENAI_API_KEY
  • TAVILY_API_KEY

Tools not appearing

  1. Check tool is registered in src/tools/index.ts
  2. Check server logs for registration confirmation
  3. Restart server after adding tools

AI not using tools

  • Make your question explicit (e.g., "Search for...")
  • Check the logs to see which tools the AI receives
  • Try a simpler question first

Learning Resources

Examples

Check the learn/ folder for tutorial examples and documentation.

Why Use This Template?

  • šŸš€ Quick Start - Get your MCP server running in minutes
  • šŸ“¦ Best Practices - Production-ready architecture and error handling
  • šŸ”§ Easy to Extend - Simple tool registration system
  • šŸ“š Well Documented - Comprehensive README and code comments
  • šŸŽÆ Type Safe - Full TypeScript support
  • šŸ”Œ Plug & Play - Works with any MCP-compatible client

Use Cases

  • AI Assistants - Give AI models access to real-time data
  • Research Tools - Search, analyze, and process information
  • Automation - Build AI-powered workflows and pipelines
  • Integration - Connect AI to your existing APIs and services

What is MCP?

The Model Context Protocol (MCP) is an open standard that enables AI models to securely interact with external tools and data sources. Think of it as a USB port for AI - a universal way to plug in new capabilities.

Learn more: https://modelcontextprotocol.io

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-tool)
  3. Commit your changes (git commit -m 'Add amazing tool')
  4. Push to the branch (git push origin feature/amazing-tool)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

⭐ Star this repo if you found it helpful!