shantanujumde/mcp-client-server
MCP Server & Consumer
This workspace contains a minimal Model Context Protocol (MCP) server (`mcp-server`) and a companion consumer (`mcp-consumer`). The server exposes a simple `get_length` tool over Server-Sent Events (SSE). The consumer connects to the server, enumerates the available tools, and calls the length tool through OpenAI's `gpt-4o` model using the `ai` SDK.
Project Layout
- `mcp-server` – Express + MCP server that registers the `get_length` tool and serves an SSE endpoint on port `4001`.
- `mcp-consumer` – Node.js script that connects to the MCP server, lists the available tools, and requests the length of `"Hello World"` via OpenAI.
Prerequisites
- Node.js 22 or later (tested with Node 22).
- npm 10 or later (bundled with Node 22).
- An OpenAI API key with access to the `gpt-4o` model.
Setup
- Install dependencies for each package:

  ```bash
  cd mcp-server
  npm install
  cd ../mcp-consumer
  npm install
  ```

- Create a `.env` file in `mcp-consumer/` with the required environment variables (see below).
Environment Variables
Create `mcp-consumer/.env` containing the following key:

| Key | Required | Description |
|---|---|---|
| `OPENAI_API_KEY` | Yes | OpenAI API key used by the consumer to call `gpt-4o` via the `ai` SDK. |
Optional overrides:

- `MCP_SERVER_URL` – Set to a custom SSE endpoint (defaults to `http://localhost:4001/sse`). If you use this variable, update `index.ts` accordingly (see the sketch after this list).
- `PORT` – Change the HTTP port used by `mcp-server` (defaults to `4001`). Update the consumer to match if you change it.
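A minimal sketch of how the consumer could load these values with `dotenv`, falling back to the documented defaults; the actual handling in `mcp-consumer/index.ts` may differ, and the variable names here are illustrative:

```ts
// Hypothetical sketch of environment handling in the consumer.
import "dotenv/config";

const OPENAI_API_KEY = process.env.OPENAI_API_KEY;
if (!OPENAI_API_KEY) {
  // Fail fast so a missing key surfaces before any MCP traffic starts.
  throw new Error("OPENAI_API_KEY is missing; add it to mcp-consumer/.env");
}

// Fall back to the documented default when the override is not set.
// (PORT is read by mcp-server in the same way.)
const MCP_SERVER_URL = process.env.MCP_SERVER_URL ?? "http://localhost:4001/sse";
```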
Running the Server
From `mcp-server/`:
- `npm run dev` – Start the MCP server with `nodemon` (auto-restarts on file changes).
- `npm run build` – Compile TypeScript to `dist/`.
- `npm start` – Run the built server from `dist/index.js`.
The server logs the listening URL and exposes:
- SSE endpoint: `http://localhost:4001/sse`
- Message relay endpoint: `http://localhost:4001/messages`
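For reference, here is a minimal sketch of how an Express-based MCP server could register `get_length` and expose these two endpoints. It assumes the `@modelcontextprotocol/sdk` and `zod` packages; the actual `mcp-server/index.ts` may be structured differently.

```ts
import express from "express";
import { z } from "zod";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { SSEServerTransport } from "@modelcontextprotocol/sdk/server/sse.js";

const server = new McpServer({ name: "mcp-server", version: "1.0.0" });

// Register the get_length tool: returns the character count of the input text.
server.tool("get_length", { text: z.string() }, async ({ text }) => ({
  content: [{ type: "text", text: String(text.length) }],
}));

const app = express();
const port = Number(process.env.PORT ?? 4001);
let transport: SSEServerTransport | undefined;

// SSE endpoint: the client opens a long-lived event stream here.
app.get("/sse", async (_req, res) => {
  transport = new SSEServerTransport("/messages", res);
  await server.connect(transport);
});

// Message relay endpoint: the client POSTs JSON-RPC messages here.
app.post("/messages", async (req, res) => {
  if (!transport) {
    res.status(400).send("No active SSE connection");
    return;
  }
  await transport.handlePostMessage(req, res);
});

app.listen(port, () => {
  console.log(`MCP server listening on http://localhost:${port}`);
});
```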
Running the Consumer
From `mcp-consumer/`:
- `npm run dev` – Start the consumer with `ts-node` and `nodemon`.
- `npm run build` – Compile TypeScript to `dist/`.
- `npm start` – Run the compiled consumer.
When the consumer runs, it:

- Connects to the SSE endpoint.
- Lists the available MCP tools in the console.
- Uses `gpt-4o` to call the `get_length` tool and prints the tool output.
- Closes the MCP client gracefully.
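A condensed sketch of that flow, assuming the consumer uses the AI SDK's experimental MCP client; the real `mcp-consumer/index.ts` may differ in details such as the prompt text:

```ts
import { experimental_createMCPClient, generateText } from "ai";
import { openai } from "@ai-sdk/openai";

async function main() {
  // Connect to the MCP server over SSE.
  const mcpClient = await experimental_createMCPClient({
    transport: { type: "sse", url: "http://localhost:4001/sse" },
  });

  try {
    // Enumerate the tools exposed by the server (just get_length here).
    const tools = await mcpClient.tools();
    console.log("Available tools:", Object.keys(tools));

    // Ask gpt-4o to answer using the MCP tools; toolChoice: "required"
    // forces a tool call instead of a free-form answer.
    const result = await generateText({
      model: openai("gpt-4o"),
      tools,
      toolChoice: "required",
      prompt: 'What is the length of "Hello World"?',
    });

    console.log("Tool results:", JSON.stringify(result.toolResults, null, 2));
  } finally {
    // Close the MCP client gracefully.
    await mcpClient.close();
  }
}

main().catch(console.error);
```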
Development Notes
- TypeScript configs in each package emit CommonJS output under `dist/`.
- The server currently registers a single tool. Add additional tools via `server.tool(...)` in `mcp-server/index.ts` (see the sketch after this list).
- The consumer invokes `generateText` with `toolChoice: "required"`, forcing the model to use MCP tools for each request.
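As an illustration, a hypothetical second tool (the name and schema below are not part of the repository) could be registered the same way `get_length` is:

```ts
import { z } from "zod";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

const server = new McpServer({ name: "mcp-server", version: "1.0.0" });

// Hypothetical reverse_text tool, registered via the same server.tool(...) API
// used for get_length in mcp-server/index.ts.
server.tool("reverse_text", { text: z.string() }, async ({ text }) => ({
  content: [{ type: "text", text: text.split("").reverse().join("") }],
}));
```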
Troubleshooting
- Connection errors – Ensure the server is running on `http://localhost:4001` before starting the consumer.
- 401/403 responses – Verify that `OPENAI_API_KEY` is present and valid.
- Model mismatch – Adjust the model name in `mcp-consumer/index.ts` if your key does not support `gpt-4o`.