Pimmetjeoss/MCP_server_azure
The Model Context Protocol (MCP) server is a framework designed to facilitate interaction between large language models (LLMs) and various tools, resources, and prompts. It provides a structured way to extend LLM capabilities by integrating external functionality and data sources.
Quick Start
Running the MCP server
Run your MCP server locally:
pnpm exec nx start example-azure-functions-ts
Testing with the MCP Inspector
Run the MCP Inspector locally:
npx -y @modelcontextprotocol/inspector@latest
Then, connect to your MCP server using Streamable HTTP transport (default URL: http://localhost:7071/mcp).
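Besides the Inspector, you can exercise the server programmatically. The following is a minimal client sketch, assuming the TypeScript MCP SDK's Client and StreamableHTTPClientTransport APIs and the default local URL above; the client name and version strings are placeholders.

// Minimal MCP client sketch (assumes @modelcontextprotocol/sdk is installed).
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  const client = new Client({ name: "local-test-client", version: "1.0.0" });
  const transport = new StreamableHTTPClientTransport(
    new URL("http://localhost:7071/mcp"),
  );

  await client.connect(transport);

  // List the tools the server exposes, then close the session.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);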
Project Structure
example-azure-functions-ts/
├── src/
│ ├── index.ts # Azure Functions entry point
│ └── server.ts # MCP server implementation
├── host.json # Azure Functions host configuration
├── local.settings.json # Azure Functions local settings
├── tsconfig.json
├── package.json
└── README.md
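The examples in the next section register features on a server instance exported from src/server.ts. A minimal sketch of that file, assuming the TypeScript MCP SDK and zod; the name and version strings are placeholders:

// src/server.ts: minimal MCP server sketch.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod"; // used by the inputSchema/argsSchema examples below

export const server = new McpServer({
  name: "example-azure-functions-ts",
  version: "1.0.0",
});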
Adding Features
Tools
Tools provide executable functions to LLMs:
server.registerTool(
  "my_tool",
  {
    title: "My Tool",
    description: "What this tool does",
    inputSchema: { param: z.string() }, // Zod shape for the tool's arguments
  },
  // Handler receives the validated arguments and returns MCP content blocks
  ({ param }) => ({
    content: [
      {
        type: "text",
        text: `Result for ${param}`,
      },
    ],
  }),
);
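Once registered, a connected client can invoke the tool by name. A short sketch using the client from the Quick Start section; the argument value is arbitrary:

// Call my_tool through a connected MCP client.
const result = await client.callTool({
  name: "my_tool",
  arguments: { param: "hello" },
});
console.log(result.content);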
Resources
Resources expose data and content to LLMs:
server.registerResource(
  "my_resource",
  "resource://my-resource", // static URI for this resource
  {
    title: "My Resource",
    description: "What this resource does",
    mimeType: "text/plain",
  },
  // Read callback receives the resource URI and returns its contents
  (uri) => ({
    contents: [
      {
        uri: uri.href,
        text: "Content of my resource",
      },
    ],
  }),
);
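A client reads the resource by its URI. A short sketch using the same connected client as above:

// Read my_resource through a connected MCP client.
const resource = await client.readResource({ uri: "resource://my-resource" });
console.log(resource.contents[0].text);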
Prompts
Prompts are reusable templates for interacting with LLMs:
server.registerPrompt(
  "my_prompt",
  {
    title: "My Prompt",
    description: "What this prompt does",
    argsSchema: { arg: z.string() }, // Zod shape for the prompt arguments
  },
  // Handler returns the messages that make up the prompt
  ({ arg }) => ({
    messages: [
      {
        role: "user",
        content: {
          type: "text",
          text: `Prompt with ${arg}`,
        },
      },
    ],
  }),
);
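A client retrieves the rendered prompt by name, passing values for its arguments. A short sketch using the same connected client; the argument value is arbitrary:

// Fetch my_prompt through a connected MCP client.
const prompt = await client.getPrompt({
  name: "my_prompt",
  arguments: { arg: "example" },
});
console.log(prompt.messages);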