LLM MCP Server

A modern Model Context Protocol (MCP) server providing access to OpenAI's GPT-5 models with optional web search capabilities.

🚀 Features

  • Two Powerful Tools: Standard queries and web-enhanced queries with real-time search
  • Latest GPT-5 Models: Support for gpt-5, gpt-5-mini, and gpt-5-nano
  • Proper MCP Implementation: Uses stdio transport following MCP best practices
  • Type-Safe & Tested: Full TypeScript support with comprehensive test coverage
  • Easy Integration: Works with Claude Desktop, Claude Code, and custom MCP clients

๐Ÿ› ๏ธ Quick Start

  1. Install and setup:

    npm install
    cp .env.example .env
    # Add your OPENAI_API_KEY to .env
    
  2. Test it works:

    npx tsx examples/cli-client.ts "Hello! Introduce yourself."
    
  3. Run tests:

    npm test
    

🎯 Usage Examples

# Basic query
npx tsx examples/cli-client.ts "What is TypeScript?"

# Different models
npx tsx examples/cli-client.ts "Explain React hooks" --model gpt-5-mini

# Web search for current info
npx tsx examples/cli-client.ts "What's the weather in Paris?" --web-search

# Advanced reasoning
npx tsx examples/cli-client.ts "Solve this math problem: ..." --reasoning-effort high

See examples/README.md for more detailed usage examples.

🔧 Available Tools

llm_query - Standard Queries

Perfect for coding help, analysis, explanations, and general tasks.

Parameters:

  • prompt (required) - Your question or request
  • modelVariant - gpt-5 (default), gpt-5-mini, gpt-5-nano
  • reasoningEffort - minimal (default), low, medium, high
  • verbosity - low, medium (default), high
  • max_tokens - Maximum response length
  • instructions - Custom system instructions
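
As a rough illustration, the parameters above map onto the following TypeScript shape. This is a sketch of the documented arguments only (the interface name is made up here), not the server's actual type definitions:

interface LlmQueryArgs {
  prompt: string;                                          // required: your question or request
  modelVariant?: "gpt-5" | "gpt-5-mini" | "gpt-5-nano";    // defaults to "gpt-5"
  reasoningEffort?: "minimal" | "low" | "medium" | "high"; // defaults to "minimal"
  verbosity?: "low" | "medium" | "high";                   // defaults to "medium"
  max_tokens?: number;                                     // maximum response length
  instructions?: string;                                   // custom system instructions
}

// Example arguments for a standard query
const args: LlmQueryArgs = {
  prompt: "Explain React hooks",
  modelVariant: "gpt-5-mini",
  reasoningEffort: "low",
};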

llm_query_web - Web-Enhanced Queries

Same as llm_query but with real-time web search for current information.

Perfect for:

  • Weather and news
  • Current events
  • Live statistics
  • Recent documentation updates

🔗 Integration

This server integrates with various MCP clients:

  • Claude Desktop - Native MCP support
  • Claude Code - Professional development environment
  • Editors and IDEs with MCP extensions
  • Custom MCP clients - Build your own

See INTEGRATIONS.md for detailed setup instructions.
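
For the custom-client case, a minimal stdio client might look like the sketch below. It assumes the official @modelcontextprotocol/sdk package and uses illustrative values (the client name, the npx tsx launch command); the server still reads OPENAI_API_KEY from its own .env file:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server as a child process and talk to it over stdin/stdout
const transport = new StdioClientTransport({
  command: "npx",
  args: ["tsx", "src/index.ts"],
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Discover the server's tools (llm_query, llm_query_web)
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

// Call the web-enhanced tool for current information
const result = await client.callTool({
  name: "llm_query_web",
  arguments: { prompt: "What's the weather in Paris?" },
});
console.log(result.content);

await client.close();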

🧪 Development

Commands

npm run dev          # Development with file watching
npm run build        # Compile to JavaScript
npm run type-check   # TypeScript validation
npm run start        # Run the server directly
npm test             # Run tests (interactive)
npm run test:run     # Run tests once

Project Structure

├── src/
│   └── index.ts              # MCP server implementation
├── examples/
│   ├── cli-client.ts         # Feature-complete CLI client
│   └── README.md             # Detailed usage examples
├── tests/
│   └── server.test.ts        # Test suite (24+ tests)
├── .env.example              # Environment template
├── README.md                 # This file
├── INTEGRATIONS.md           # Integration guides
└── package.json              # ES module configuration

Testing

The project includes comprehensive test coverage:

  • ✅ Tool schema validation
  • ✅ Parameter processing
  • ✅ OpenAI API integration
  • ✅ Error handling
  • ✅ Response formatting
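
As a flavor of what these tests look like, here is an illustrative sketch only: the watch-mode npm test vs. npm run test:run split suggests Vitest, and the schema object below is a stand-in shaped like the documented parameters, not the server's real definition.

import { describe, expect, it } from "vitest";

// Stand-in for the llm_query input schema, mirroring the documented parameters
const llmQueryInputSchema = {
  type: "object",
  properties: {
    prompt: { type: "string" },
    modelVariant: { enum: ["gpt-5", "gpt-5-mini", "gpt-5-nano"] },
    reasoningEffort: { enum: ["minimal", "low", "medium", "high"] },
  },
  required: ["prompt"],
};

describe("llm_query schema", () => {
  it("requires only the prompt", () => {
    expect(llmQueryInputSchema.required).toEqual(["prompt"]);
  });

  it("restricts modelVariant to the documented GPT-5 models", () => {
    expect(llmQueryInputSchema.properties.modelVariant.enum).toHaveLength(3);
  });
});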

โš™๏ธ Configuration

Create a .env file:

# Required: Your OpenAI API key
OPENAI_API_KEY=your-openai-api-key-here
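
A minimal sketch of how the key might be loaded and validated at startup, assuming the dotenv package is used (the actual src/index.ts may differ):

import "dotenv/config"; // reads .env into process.env

const apiKey = process.env.OPENAI_API_KEY;
if (!apiKey) {
  // Fail fast with a clear message instead of a confusing OpenAI auth error later
  console.error("Missing OPENAI_API_KEY. Copy .env.example to .env and add your key.");
  process.exit(1);
}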

🚨 Troubleshooting

Common Issues:

  • "Cannot find module" โ†’ Run npm install
  • "Must use import to load ES Module" โ†’ Use tsx instead of node
  • Environment variables not loading โ†’ Verify .env file exists with OPENAI_API_KEY
  • TypeScript errors โ†’ Run npm run type-check

Test manually:

echo '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}' | npx tsx src/index.ts

๐Ÿ—๏ธ Architecture

This is a proper MCP server, not an HTTP API:

✅ MCP Protocol Compliant - Standard stdio transport
✅ Tool Discovery - Dynamic tool listing and schema validation
✅ Process Isolation - Each client gets its own server instance
✅ Type Safety - Full TypeScript integration
✅ No Network Dependencies - Works anywhere, no ports needed

❌ Not HTTP-based - No REST endpoints or web server
❌ Not WebSocket-based - Uses standard input/output

๐Ÿค Contributing

  1. Make your changes
  2. Run tests: npm run test:run
  3. Type check: npm run type-check
  4. Build: npm run build
  5. Test manually: npx tsx examples/cli-client.ts "test query"

📄 License

MIT License - see the LICENSE file for details.


A modern MCP server built with TypeScript and OpenAI's GPT-5 models