nascoder-azure-ai-foundry-mcp

The Nascoder Azure AI Foundry MCP Server is an intelligent server designed for seamless integration with Azure AI Foundry projects, featuring auto-routing, multi-service integration, and intelligent intent detection.

Tools
  1. ask_azure_ai - Chat with deployed AI models.
  2. analyze_image - Computer vision and image analysis.
  3. translate_text - Multi-language text translation.
  4. analyze_document - Document processing and extraction.
  5. check_content_safety - Content moderation and safety.
  6. analyze_language - Sentiment analysis and language processing.
  7. list_capabilities - Show all available capabilities.
  8. health_check - Service health monitoring.
  9. get_model_info - Model deployment information.

🛡️ NasCoder Azure AI MCP Server

Professional Azure AI Integration for Model Context Protocol

🎯 Overview

A comprehensive Model Context Protocol (MCP) server that provides seamless integration with Azure AI services. Built for production use with intelligent routing and comprehensive error handling.

✅ Available Tools

  1. ask_azure_ai - Intelligent chat with auto-routing to the best available model
  2. get_model_info - Real-time model deployment information
  3. health_check - Service health monitoring and diagnostics
  4. list_capabilities - Available capabilities and features
  5. analyze_image - Computer vision analysis
  6. translate_text - Multi-language text translation
  7. check_content_safety - Content moderation and safety analysis
  8. analyze_language - Language detection and sentiment analysis
  9. analyze_document - Document intelligence and analysis

🚀 Installation

npm install nascoder-azure-ai-mcp
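
After installing, the server is normally started on demand by your MCP client (see the configuration below); it can also be launched directly for a quick smoke test using the same command the client configuration invokes:

npx nascoder-azure-ai-mcp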

⚙️ Configuration

Environment Variables

Create a .env file with your Azure credentials:

AZURE_AI_INFERENCE_API_KEY=your_azure_api_key_here
AZURE_AI_INFERENCE_ENDPOINT=https://your-endpoint.cognitiveservices.azure.com/
AZURE_AI_PROJECTS_CONNECTION_STRING=your_connection_string_here
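
A minimal startup check is sketched below; it assumes the dotenv package and simply verifies that the variables from the .env example above are present before continuing. The variable names come from the list above; everything else is illustrative.

import "dotenv/config";

// Variables taken from the .env example above.
const required = [
  "AZURE_AI_INFERENCE_API_KEY",
  "AZURE_AI_INFERENCE_ENDPOINT",
  // Add AZURE_AI_PROJECTS_CONNECTION_STRING here if your setup uses it.
];

const missing = required.filter((name) => !process.env[name]);
if (missing.length > 0) {
  console.error(`Missing Azure environment variables: ${missing.join(", ")}`);
  process.exit(1);
}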

MCP Client Integration

Add to your MCP client configuration:

{
  "mcpServers": {
    "nascoder-azure-ai": {
      "command": "npx",
      "args": ["nascoder-azure-ai-mcp"],
      "env": {
        "AZURE_AI_INFERENCE_API_KEY": "your_api_key_here",
        "AZURE_AI_INFERENCE_ENDPOINT": "your_endpoint_here"
      }
    }
  }
}
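
Outside of a desktop MCP client, the same server can be driven programmatically. The sketch below assumes the official @modelcontextprotocol/sdk for TypeScript (client name, import paths, and version are illustrative) and mirrors the stdio configuration above:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, just like the MCP client configuration above.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["nascoder-azure-ai-mcp"],
  env: {
    AZURE_AI_INFERENCE_API_KEY: process.env.AZURE_AI_INFERENCE_API_KEY ?? "",
    AZURE_AI_INFERENCE_ENDPOINT: process.env.AZURE_AI_INFERENCE_ENDPOINT ?? "",
  },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// List the tools the server exposes (ask_azure_ai, analyze_image, ...).
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

await client.close();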

🔧 Usage Examples

Basic Chat

{
  "query": "Explain quantum computing",
  "context": "Educational content for beginners"
}

Image Analysis

{
  "imageUrl": "https://example.com/image.jpg",
  "features": "objects,text,faces"
}

Text Translation

{
  "text": "Hello world",
  "targetLanguage": "es"
}
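
Each of the JSON objects above is the arguments payload for the corresponding tool. As a sketch (again assuming the @modelcontextprotocol/sdk client API), the translation example would be invoked like this:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["nascoder-azure-ai-mcp"],
  // Forward the AZURE_* credentials from the parent environment to the server.
  env: { ...process.env } as Record<string, string>,
});
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// The usage examples above map directly onto the tool's arguments.
const result = await client.callTool({
  name: "translate_text",
  arguments: { text: "Hello world", targetLanguage: "es" },
});
console.log(result.content);

await client.close();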

🛡️ Security Features

  • Environment variable-based configuration
  • No hardcoded credentials
  • Comprehensive error handling
  • Rate limiting and retry logic
  • Input validation and sanitization

📋 Requirements

  • Node.js >= 18.0.0
  • Valid Azure AI services subscription
  • Azure AI Foundry access

🔗 Links

  • GitHub repository: freelancernasimofficial/nascoder-azure-ai-foundry-mcp
  • npm package: nascoder-azure-ai-mcp

📄 License

MIT License - see LICENSE file for details.