Redoc MCP Azure
An MCP server built with ModelFetch, designed to facilitate interaction with LLMs through tools, resources, and prompts.
Quick Start
Running the MCP server
Run your MCP server locally:
npm start
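The start script is expected to build the TypeScript sources and launch the Azure Functions Core Tools host, which serves the function app on port 7071 by default. The actual scripts in this repository's package.json may differ; a typical Azure Functions TypeScript setup looks roughly like this:

{
  "scripts": {
    "build": "tsc",
    "prestart": "npm run build",
    "start": "func start"
  }
}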
Testing with the MCP Inspector
Run the MCP Inspector locally:
npx -y @modelcontextprotocol/inspector@latest
Then connect to your MCP server using the Streamable HTTP transport (default URL: http://localhost:7071/mcp).
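You can also exercise the server programmatically with the MCP TypeScript SDK client. The sketch below is illustrative only; the tool name my_tool is the placeholder used later in this README, not a tool this server necessarily exposes:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Connect to the locally running server over Streamable HTTP
const client = new Client({ name: "smoke-test-client", version: "1.0.0" });
await client.connect(
  new StreamableHTTPClientTransport(new URL("http://localhost:7071/mcp")),
);

// List whatever tools the server registered, then call one by name
console.log(await client.listTools());
// const result = await client.callTool({ name: "my_tool", arguments: { param: "hello" } });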
Project Structure
redoc-mcp-azure/
├── src/
│   ├── index.ts            # Azure Functions entry point
│   └── server.ts           # MCP server implementation
├── host.json               # Azure Functions host configuration
├── local.settings.json     # Azure Functions local settings
├── tsconfig.json
├── package.json
└── README.md
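server.ts holds the McpServer instance and the tools, resources, and prompts registered on it, while index.ts adapts that server to the Azure Functions runtime via ModelFetch. A rough sketch of server.ts (the name and version are placeholders, and how the server is exported and wired into index.ts depends on ModelFetch's Azure adapter):

// src/server.ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod"; // provides the `z` schema helpers used in the snippets below

const server = new McpServer({
  name: "redoc-mcp-azure",
  version: "1.0.0",
});

// Tools, resources, and prompts are registered on this instance,
// as shown in the sections that follow.

export default server;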
Adding Features
Tools
Tools provide executable functions to LLMs:
server.registerTool(
"my_tool",
{
title: "My Tool",
description: "What this tool does",
inputSchema: { param: z.string() },
},
({ param }) => ({
content: [
{
type: "text",
text: `Result for ${param}`,
},
],
}),
);
Resources
Resources expose data and content to LLMs:
server.registerResource(
"my_resource",
"resource://my-resource",
{
title: "My Resource",
description: "What this resource does",
mimeType: "text/plain",
},
(uri) => ({
contents: [
{
uri: uri.href,
text: "Content of my resource",
},
],
}),
);
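Resources can also be parameterized. Using the SDK's ResourceTemplate, a URI pattern such as greeting://{name} (a hypothetical scheme used here purely for illustration) maps path variables into the read callback:

import { ResourceTemplate } from "@modelcontextprotocol/sdk/server/mcp.js";

server.registerResource(
  "greeting",
  new ResourceTemplate("greeting://{name}", { list: undefined }),
  {
    title: "Greeting Resource",
    description: "Dynamic greeting generator",
  },
  async (uri, { name }) => ({
    contents: [
      {
        uri: uri.href,
        text: `Hello, ${name}!`,
      },
    ],
  }),
);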
Prompts
Prompts are reusable templates for interacting with LLMs:
server.registerPrompt(
"my_prompt",
{
title: "My Prompt",
description: "What this prompt does",
argsSchema: { arg: z.string() },
},
({ arg }) => ({
messages: [
{
role: "user",
content: {
type: "text",
text: `Prompt with ${arg}`,
},
},
],
}),
);
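From the client side (the MCP Inspector, or the SDK client shown in the Quick Start section), a registered prompt is retrieved by name with its arguments, and the returned messages can then be passed to an LLM. A minimal sketch, assuming the client created earlier:

// Fetch the rendered prompt; `arg` must satisfy the argsSchema above
const { messages } = await client.getPrompt({
  name: "my_prompt",
  arguments: { arg: "example value" },
});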