MCP Server Node Starter
A starter template for building MCP (Model Context Protocol) servers with Node.js, TypeScript, and Express. It features a Streamable HTTP transport, a Tavily-powered web search tool, and a demo AI client that shows OpenAI tool use end to end.
Features
- Streamable HTTP Transport - Web-accessible MCP server
- Web Search Tool - Powered by Tavily AI search
- AI Client - OpenAI integration with tool use
- Scalable Architecture - Clean folder structure for growth
- TypeScript - Full type safety with strict mode
- Production Ready - Logging, error handling, configuration
Quick Start
1. Clone and Install
git clone git@github.com:charlesgobina/mcp-server-node-starter.git
cd mcp-server-node-starter
npm install
2. Configure Environment
Copy .env.example to .env:
cp .env.example .env
Then edit .env and add your API keys:
OPENAI_API_KEY=sk-your-openai-key
TAVILY_API_KEY=tvly-your-tavily-key
Get API Keys:
- OpenAI: https://platform.openai.com/api-keys
- Tavily: https://app.tavily.com/
3. Build and Start the Server
npm run build
npm start
You should see:
MCP Server running on http://localhost:3000
MCP endpoint: http://localhost:3000/mcp
Health check: http://localhost:3000/health
4. Test with AI Client
In a new terminal:
npm run ask "Search for the latest AI developments"
Or for development with auto-reload:
npm run dev
Project Structure
src/
├── config/                # Configuration and environment
│   └── index.ts
├── types/                 # TypeScript type definitions
│   └── index.ts
├── utils/                 # Utilities (logger, etc.)
│   └── logger.ts
├── tools/                 # MCP tools
│   ├── index.ts           # Tool registration
│   └── web-search.tool.ts
├── services/              # Business logic
│   ├── tool-registry.ts   # Tool management
│   └── ai-client.ts       # AI integration (for client)
├── controllers/           # Request handlers
│   └── mcp.controller.ts
├── middleware/            # Express middleware (future)
├── server/                # Server entry point
│   └── index.ts
├── client/                # Client applications
│   └── cli.ts             # CLI client
└── app.ts                 # Express app setup
Available Scripts
# Production
npm run build # Compile TypeScript to dist/
npm start # Start the built server
npm run ask "..." # Ask a question using AI client + tools
# Development
npm run dev # Build + start server with nodemon (auto-reload)
npm run start:dev # Start server in dev mode (tsx, no build needed)
npm run ask:dev "..." # Ask using dev client (tsx, no build needed)
npm run build:watch # Watch mode for TypeScript compilation
# Utilities
npm run clean # Remove dist folder
How It Works
Architecture
┌─────────────┐      ┌──────────────┐      ┌──────────────┐
│  AI Client  │ ───▶ │  MCP Server  │ ───▶ │  Tavily API  │
│  (OpenAI)   │ ◀─── │  (Express)   │ ◀─── │ (Web Search) │
└─────────────┘      └──────────────┘      └──────────────┘
Request Flow
1. User asks a question via the CLI client
2. Client connects to the MCP server at http://localhost:3000/mcp
3. Client fetches the available tools from the server
4. Client sends the question to OpenAI along with the tools list
5. OpenAI decides to use the web_search tool
6. Client calls the tool via the MCP protocol
7. Server executes the web search using Tavily
8. Results are sent back to OpenAI
9. OpenAI formats the final answer for the user
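Step 4 of the flow hinges on translating MCP tool definitions into OpenAI's function-calling format. The sketch below illustrates that conversion in isolation; the names McpTool, OpenAiTool, and toOpenAiTools are hypothetical and not taken from the template's ai-client.ts.

```typescript
// Minimal sketch: adapting MCP tool definitions to OpenAI's
// function-calling format. All names here are illustrative.
interface McpTool {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>;
}

interface OpenAiTool {
  type: 'function';
  function: {
    name: string;
    description: string;
    parameters: Record<string, unknown>;
  };
}

function toOpenAiTools(tools: McpTool[]): OpenAiTool[] {
  // Each MCP tool maps 1:1 onto an OpenAI "function" tool;
  // the JSON Schema inputSchema becomes the parameters field.
  return tools.map((tool) => ({
    type: 'function' as const,
    function: {
      name: tool.name,
      description: tool.description,
      parameters: tool.inputSchema,
    },
  }));
}

const openAiTools = toOpenAiTools([{
  name: 'web_search',
  description: 'Search the web via Tavily',
  inputSchema: {
    type: 'object',
    properties: { query: { type: 'string' } },
    required: ['query'],
  },
}]);
console.log(openAiTools[0].function.name); // "web_search"
```

The inverse direction (an OpenAI tool call back into an MCP tools/call request) is just as mechanical, which is what keeps the client thin.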
Adding New Tools
Step 1: Create Tool File
Create src/tools/your-tool.tool.ts:
import type { Tool, ToolResult } from '../types/index.js';
import { logger } from '../utils/logger.js';
async function yourToolHandler(args: { param1: string }): Promise<ToolResult> {
  try {
    // Your tool logic here
    const result = doSomething(args.param1);
    return {
      content: [{
        type: 'text',
        text: `Result: ${result}`,
      }],
    };
  } catch (error: any) {
    logger.error('Tool error:', error.message);
    return {
      content: [{ type: 'text', text: `Error: ${error.message}` }],
      isError: true,
    };
  }
}

export const yourTool: Tool = {
  name: 'your_tool',
  description: 'What your tool does',
  inputSchema: {
    type: 'object',
    properties: {
      param1: {
        type: 'string',
        description: 'Parameter description',
      },
    },
    required: ['param1'],
  },
  handler: yourToolHandler,
};
Step 2: Register Tool
In src/tools/index.ts:
import { logger } from '../utils/logger.js';
import { toolRegistry } from '../services/tool-registry.js';
import { webSearchTool } from './web-search.tool.js';
import { yourTool } from './your-tool.tool.js'; // Add this import

export function registerAllTools() {
  logger.info('Registering tools...');
  toolRegistry.register(webSearchTool);
  toolRegistry.register(yourTool); // Add this line
  logger.info(`Total tools registered: ${toolRegistry.count()}`);
}
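Under the hood, a tool registry can be as simple as a Map keyed by tool name. The sketch below is illustrative of the register/count API used above; the template's actual src/services/tool-registry.ts may differ in detail.

```typescript
// Illustrative Map-based registry; the template's real
// src/services/tool-registry.ts may be implemented differently.
interface RegisteredTool {
  name: string;
  description: string;
  handler: (args: Record<string, unknown>) => Promise<unknown>;
}

class ToolRegistry {
  private tools = new Map<string, RegisteredTool>();

  register(tool: RegisteredTool): void {
    // Refuse duplicate names so two tools can't silently shadow each other.
    if (this.tools.has(tool.name)) {
      throw new Error(`Tool already registered: ${tool.name}`);
    }
    this.tools.set(tool.name, tool);
  }

  get(name: string): RegisteredTool | undefined {
    return this.tools.get(name);
  }

  list(): RegisteredTool[] {
    return [...this.tools.values()];
  }

  count(): number {
    return this.tools.size;
  }
}
```

A Map keeps lookup by name O(1) when the server routes an incoming tools/call to its handler.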
Step 3: Restart Server
# Stop server (Ctrl+C)
npm start
Your tool is now available to the AI client!
Configuration
Environment Variables
| Variable | Required | Description |
|---|---|---|
| PORT | No | Server port (default: 3000) |
| NODE_ENV | No | Environment (development/production) |
| LOG_LEVEL | No | Logging level (debug/info/warn/error) |
| OPENAI_API_KEY | Yes | OpenAI API key |
| OPENAI_MODEL | No | Model to use (default: gpt-4o-mini) |
| TAVILY_API_KEY | Yes | Tavily API key for web search |
| SERVER_NAME | No | Server name for MCP |
| CORS_ORIGIN | No | CORS origin (default: *) |
Server Configuration
Edit src/config/index.ts to modify server behavior.
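As a rough sketch, a config module reads each variable from the table above and applies the documented defaults. The loadConfig helper below is hypothetical; the real src/config/index.ts may be structured differently.

```typescript
// Hypothetical config loader mirroring the environment-variable table.
// In the app you would call loadConfig(process.env).
interface AppConfig {
  port: number;
  nodeEnv: string;
  logLevel: string;
  openaiModel: string;
  corsOrigin: string;
}

function loadConfig(env: Record<string, string | undefined>): AppConfig {
  return {
    // Defaults match the documented values above.
    port: Number(env.PORT ?? 3000),
    nodeEnv: env.NODE_ENV ?? 'development',
    logLevel: env.LOG_LEVEL ?? 'info',
    openaiModel: env.OPENAI_MODEL ?? 'gpt-4o-mini',
    corsOrigin: env.CORS_ORIGIN ?? '*',
  };
}
```

Centralizing defaults in one place means a missing optional variable never crashes startup, while required keys (OPENAI_API_KEY, TAVILY_API_KEY) can be validated separately with a clear error.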
API Endpoints
Health Check
GET /health
Response:
{
"status": "ok",
"server": "mcp-server",
"version": "1.0.0",
"timestamp": "2025-10-04T17:39:00.000Z"
}
Server Info
GET /
Response:
{
"name": "mcp-server",
"version": "1.0.0",
"status": "running",
"transport": "Streamable HTTP",
"endpoints": {
"mcp": "POST /mcp - MCP protocol endpoint",
"health": "GET /health - Health check"
}
}
MCP Protocol
POST /mcp
Content-Type: application/json
{
"jsonrpc": "2.0",
"method": "tools/list",
"id": 1
}
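A successful tools/list response follows the JSON-RPC result shape from the MCP specification. The exact contents depend on what the server registers; with only the web search tool registered, it looks roughly like this (the description and schema text here are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "web_search",
        "description": "Search the web for current information",
        "inputSchema": {
          "type": "object",
          "properties": {
            "query": { "type": "string", "description": "The search query" }
          },
          "required": ["query"]
        }
      }
    ]
  }
}
```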
Deployment
Docker (Coming Soon)
docker build -t mcp-server .
docker run -p 3000:3000 --env-file .env mcp-server
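Official Docker support is still marked as coming soon, but a minimal Dockerfile along these lines should cover the common case. It assumes npm run build emits dist/ and that the compiled entry point is dist/server/index.js (inferred from the src/ layout above); adjust paths to match your build output.

```dockerfile
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
EXPOSE 3000
# Entry point path is an assumption based on the src/ layout.
CMD ["node", "dist/server/index.js"]
```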
Cloud Platforms
Deploy to:
- Railway: railway up
- Render: Connect GitHub repo
- Fly.io: fly deploy
- Vercel: Serverless functions
Development
File Naming Conventions
- Controllers: *.controller.ts
- Services: *.service.ts
- Tools: *.tool.ts
- Types: *.types.ts
- Utils: *.util.ts
Best Practices
- Always use type imports for TypeScript types
- Log important operations using the logger
- Handle errors gracefully in tools
- Validate inputs in tool handlers
- Keep tools focused - one tool, one purpose
- Document your code with JSDoc comments
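The input-validation point deserves a concrete sketch. The validateQuery helper and safeSearchHandler below are hypothetical (not part of the template) and show one way to reject malformed arguments before they reach a tool's real logic:

```typescript
// Hypothetical example of defensive validation in a tool handler.
interface ToolResult {
  content: Array<{ type: 'text'; text: string }>;
  isError?: boolean;
}

function validateQuery(args: unknown): string {
  // Reject non-object arguments outright.
  if (typeof args !== 'object' || args === null) {
    throw new Error('arguments must be an object');
  }
  const query = (args as Record<string, unknown>).query;
  // Require a non-empty string before doing any work.
  if (typeof query !== 'string' || query.trim().length === 0) {
    throw new Error('query must be a non-empty string');
  }
  return query.trim();
}

async function safeSearchHandler(args: unknown): Promise<ToolResult> {
  try {
    const query = validateQuery(args);
    // Real tool logic would run here; we just echo the validated input.
    return { content: [{ type: 'text', text: `Searching for: ${query}` }] };
  } catch (error: unknown) {
    const message = error instanceof Error ? error.message : String(error);
    // Return an MCP-style error result instead of throwing,
    // so the client gets a well-formed response.
    return { content: [{ type: 'text', text: `Error: ${message}` }], isError: true };
  }
}
```

Returning isError rather than throwing keeps the failure inside the protocol, where the AI client can read it and retry with better arguments.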
Troubleshooting
Server won't start
# Kill processes on port 3000
lsof -ti:3000 | xargs kill -9
# Restart
npm start
"Configuration errors"
Check that your .env file contains the required API keys:
OPENAI_API_KEY
TAVILY_API_KEY
Tools not appearing
- Check the tool is registered in src/tools/index.ts
- Check server logs for registration confirmation
- Restart server after adding tools
AI not using tools
- Make your question explicit (e.g., "Search for...")
- Check logs to see what tools AI sees
- Try with a simpler question first
Learning Resources
- MCP Specification: https://modelcontextprotocol.io/specification
- TypeScript Docs: https://www.typescriptlang.org/docs
- OpenAI Function Calling: https://platform.openai.com/docs/guides/function-calling
- Tavily API: https://docs.tavily.com
Examples
Check the learn/ folder for tutorial examples and documentation.
Why Use This Template?
- Quick Start - Get your MCP server running in minutes
- Best Practices - Production-ready architecture and error handling
- Easy to Extend - Simple tool registration system
- Well Documented - Comprehensive README and code comments
- Type Safe - Full TypeScript support
- Plug & Play - Works with any MCP-compatible client
Use Cases
- AI Assistants - Give AI models access to real-time data
- Research Tools - Search, analyze, and process information
- Automation - Build AI-powered workflows and pipelines
- Integration - Connect AI to your existing APIs and services
What is MCP?
The Model Context Protocol (MCP) is an open standard that enables AI models to securely interact with external tools and data sources. Think of it as a USB port for AI - a universal way to plug in new capabilities.
Learn more: https://modelcontextprotocol.io
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
1. Fork the repository
2. Create your feature branch (git checkout -b feature/amazing-tool)
3. Commit your changes (git commit -m 'Add amazing tool')
4. Push to the branch (git push origin feature/amazing-tool)
5. Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- Built with the Model Context Protocol SDK
- Powered by OpenAI and Tavily
Star this repo if you found it helpful!