Model Context Protocol Demo: MCP Server + Interactive Client
This repository demonstrates a minimal-but-practical setup for the Model Context Protocol (MCP): a TypeScript MCP server that exposes resources, tools, and prompts; and an interactive CLI client that connects to the server, lists capabilities, and lets you invoke them. The client additionally uses Google Gemini via the ai SDK to run inference for prompts and free-form queries.
Key Capabilities
- MCP Server (TypeScript)
  - Resources: read data from a local JSON database (users.json)
    - users://all – returns all users
    - users://{userId}/profile – returns details for a single user
  - Tools:
    - create-user – creates a user from provided fields
    - create-random-user – asks the client to sample a fake user via LLM, then persists it
  - Prompts:
    - generate-fake-user – produces a prompt template to create fake user data
- Interactive MCP Client (TypeScript)
  - Connects to the server over stdio
  - Lists and invokes Tools, Resources, and Prompts via a terminal menu
  - Runs free-form queries with LLM tool use (the model can call server tools during generation)
  - Supports sampling requests from the server
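On the server side, these capabilities are registered with the MCP TypeScript SDK. The following is a minimal sketch of what such registrations typically look like, assuming the official @modelcontextprotocol/sdk and zod; the repo's tools/, resources/, and prompts/ modules may organize this differently:

```typescript
import { McpServer, ResourceTemplate } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "ai-mcp-demo", version: "1.0.0" });

// Resource with a path parameter, e.g. users://{userId}/profile
server.resource(
  "user-profile",
  new ResourceTemplate("users://{userId}/profile", { list: undefined }),
  async (uri, { userId }) => ({
    // In the real server this would be looked up in users.json
    contents: [{ uri: uri.href, text: JSON.stringify({ id: userId }) }],
  })
);

// Tool with typed arguments, e.g. create-user
server.tool(
  "create-user",
  { name: z.string(), email: z.string(), address: z.string(), phone: z.string() },
  async (args) => {
    // ...persist the user to users.json here...
    return { content: [{ type: "text", text: `Created user ${args.name}` }] };
  }
);

// Prompt template, e.g. generate-fake-user
server.prompt("generate-fake-user", () => ({
  messages: [
    { role: "user", content: { type: "text", text: "Generate a realistic fake user as JSON." } },
  ],
}));

// server.ts bootstrap: expose everything over stdio so a client can spawn this process
await server.connect(new StdioServerTransport());
```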
Project Structure
6_AI_model_context_protocol/
  mcp_server/
    src/
      data/        # Simple JSON "database"
      helpers/     # User management utilities
      prompts/     # Prompt template definitions
      resources/   # Resource implementations
      tools/       # Tool implementations
      types/       # Shared type definitions
      server.ts    # Server bootstrap
  mcp_client/
    src/
      config/      # Global configuration
      handlers/    # MCP interaction handlers
      mappers/     # Type validation utilities
      sampling/    # Sampling request handling
      types/       # TypeScript type definitions
      ai.ts        # AI/LLM integration
      client.ts    # CLI client entry point
      mcp.ts       # MCP client management
      menu.ts      # Interactive menu system
  package.json     # Scripts for dev/build/run
  tsconfig.json    # TypeScript config
Requirements
- Node.js 20+ (recommended)
- npm 9+
- A Google Gemini API key for the client
  - Set GEMINI_API_KEY in your environment (e.g., a .env file)
  - You can obtain a key from Google AI Studio (see Google's documentation)
Setup
- Install dependencies:
  npm install
- Configure environment variables for the client:
  - Create a .env file at the project root:
    GEMINI_API_KEY=your_google_gemini_api_key
  - You can obtain a key from Google AI Studio (see Google's documentation).
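The client reads this key from the environment at startup. Below is a minimal sketch of a config module that loads and validates it, assuming the dotenv package; the repo's config/ may do this differently and use different names:

```typescript
// Hypothetical config module – loads .env and fails fast if the key is missing
import "dotenv/config";

const apiKey = process.env.GEMINI_API_KEY;
if (!apiKey) {
  throw new Error("GEMINI_API_KEY is not set – add it to your .env file");
}

export const GEMINI_API_KEY = apiKey;
```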
Build and Run
The client spawns the server from the compiled output, so build the server first.
- Build the server:
  npm run server:build
- Run the interactive client (this will connect to the built server):
  npm run client:dev

Optional (standalone server run and inspection):

- Run the server directly in dev mode (for debugging):
  npm run server:dev
- Inspect the MCP server with the MCP Inspector:
  npm run server:inspect
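For orientation, "the client spawns the server" typically means something like the sketch below: the compiled server is launched as a child process and the MCP client connects to it over stdio. This assumes the official @modelcontextprotocol/sdk and an illustrative output path; it is not the repo's exact mcp.ts:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the compiled server as a child process (output path is an assumption)
const transport = new StdioClientTransport({
  command: "node",
  args: ["mcp_server/build/server.js"],
});

const client = new Client({ name: "ai-mcp-client", version: "1.0.0" });
await client.connect(transport);

// Discover what the server exposes – this is what the interactive menu is built from
const { tools } = await client.listTools();
const { resources } = await client.listResources();
const { prompts } = await client.listPrompts();
console.log(tools.map((t) => t.name), resources.length, prompts.length);
```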
Using the Client
When the client starts you'll see a menu with four options:
- Tools
  - Select a tool to run. You'll be prompted for its arguments.
  - Provided tools:
    - create-user: Enter name, email, address, and phone to create a user.
    - create-random-user: The server requests a sampled user from the client's LLM, then saves it.
- Resources
  - Browse server resources. If a resource has path parameters (e.g., {userId}), the client will prompt you for values.
  - Examples:
    - users://all
    - users://{userId}/profile
- Prompts
  - Pick a server prompt and (optionally) execute it locally with Gemini. The client prints the generated text.
- Query
  - Enter a free-form query. The client runs Gemini with tool support, so the model can call server tools during generation. The final text or tool result is printed.
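The Query option roughly corresponds to the flow below: the MCP server's tools are handed to Gemini, which may call them mid-generation. This is a sketch assuming the Vercel ai SDK (v4-style tool()/maxSteps API) and @ai-sdk/google; the repo's ai.ts and handlers/ may differ in structure and in how tools are mapped:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import { createGoogleGenerativeAI } from "@ai-sdk/google";
import { generateText, tool } from "ai";
import { z } from "zod";

// Connected MCP client (see the connection sketch in Build and Run)
const client = new Client({ name: "ai-mcp-client", version: "1.0.0" });
await client.connect(
  new StdioClientTransport({ command: "node", args: ["mcp_server/build/server.js"] })
);

const google = createGoogleGenerativeAI({ apiKey: process.env.GEMINI_API_KEY });

const result = await generateText({
  model: google("gemini-2.0-flash"),
  prompt: "Create a user named Ada with plausible contact details",
  maxSteps: 5, // let the model call a tool, read the result, then finish its answer
  tools: {
    // One server tool exposed to the model; the real client maps every listed tool
    "create-user": tool({
      description: "Creates a user from provided fields",
      parameters: z.object({
        name: z.string(),
        email: z.string(),
        address: z.string(),
        phone: z.string(),
      }),
      // Forward the model's call to the MCP server
      execute: async (args) => client.callTool({ name: "create-user", arguments: args }),
    }),
  },
});

console.log(result.text);
```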
Data Persistence
- User data is stored in mcp_server/src/data/users.json.
- Tools that modify users will persist changes back to this file.
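A helper that reads and writes this file might look like the sketch below; the function names and id scheme are assumptions, not the repo's actual helpers/ code:

```typescript
// Hypothetical user helpers backed by the JSON "database"
import fs from "node:fs/promises";

const DB_PATH = new URL("../data/users.json", import.meta.url);

type User = { id: number; name: string; email: string; address: string; phone: string };

export async function readUsers(): Promise<User[]> {
  return JSON.parse(await fs.readFile(DB_PATH, "utf-8"));
}

export async function createUser(user: Omit<User, "id">): Promise<User> {
  const users = await readUsers();
  const created = { id: users.length + 1, ...user };
  users.push(created);
  await fs.writeFile(DB_PATH, JSON.stringify(users, null, 2)); // persist back to users.json
  return created;
}
```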
Architecture Details
Client Architecture
The client is organized into several key modules:
- handlers/: Contains logic for handling different types of MCP interactions (tools, resources, prompts, queries)
- config/: Global configuration including AI model settings
- types/: TypeScript type definitions for the application
- mappers/: Utility functions for type validation and conversion
- sampling/: Handles sampling requests from the server
- ai.ts: Integration with Google Gemini AI
- mcp.ts: MCP client initialization and connection management
- menu.ts: Interactive menu system for user interaction
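The sampling/ module is what lets the server delegate text generation back to the client (this is how create-random-user gets its fake user). Below is a sketch of such a handler, assuming the official @modelcontextprotocol/sdk types and the ai SDK; the repo's actual structure may differ:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { CreateMessageRequestSchema } from "@modelcontextprotocol/sdk/types.js";
import { createGoogleGenerativeAI } from "@ai-sdk/google";
import { generateText } from "ai";

const google = createGoogleGenerativeAI({ apiKey: process.env.GEMINI_API_KEY });

// The client must advertise the sampling capability for the server to be able to use it
const client = new Client(
  { name: "ai-mcp-client", version: "1.0.0" },
  { capabilities: { sampling: {} } }
);

// Answer the server's sampling/createMessage requests with Gemini-generated text
client.setRequestHandler(CreateMessageRequestSchema, async (request) => {
  const prompt = request.params.messages
    .map((m) => (m.content.type === "text" ? m.content.text : ""))
    .join("\n");

  const { text } = await generateText({ model: google("gemini-2.0-flash"), prompt });

  return {
    role: "assistant",
    content: { type: "text", text },
    model: "gemini-2.0-flash",
  };
});
```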
Server Architecture
The server follows a modular structure:
- tools/: Tool implementations (create-user, create-random-user)
- resources/: Resource implementations (user data access)
- prompts/: Prompt template definitions
- helpers/: Utility functions for data management
- data/: Simple JSON-based data storage
- types/: Shared type definitions
AI Integration
The client uses the Google Gemini 2.0 Flash model for:
- Free-form queries with tool support
- Prompt execution
- Sampling requests from the server
The integration is handled through the ai SDK and configured via environment variables.
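Prompt execution follows the same path as queries: the client fetches a prompt's messages from the server and forwards them to Gemini. A rough sketch, again assuming @modelcontextprotocol/sdk and the ai SDK rather than the repo's exact code:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import { createGoogleGenerativeAI } from "@ai-sdk/google";
import { generateText } from "ai";

const client = new Client({ name: "ai-mcp-client", version: "1.0.0" });
await client.connect(
  new StdioClientTransport({ command: "node", args: ["mcp_server/build/server.js"] })
);
const google = createGoogleGenerativeAI({ apiKey: process.env.GEMINI_API_KEY });

// Fetch the prompt template's messages from the server...
const { messages } = await client.getPrompt({ name: "generate-fake-user" });

// ...flatten them to text and run them through Gemini locally
const promptText = messages
  .map((m) => (m.content.type === "text" ? m.content.text : ""))
  .join("\n");

const { text } = await generateText({
  model: google("gemini-2.0-flash"),
  prompt: promptText,
});

console.log(text);
```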