freelancernasimofficial/nascoder-azure-ai-foundry-mcp
The Nascoder Azure AI Foundry MCP Server is an intelligent server designed for seamless integration with Azure AI Foundry projects, featuring auto-routing, multi-service integration, and intelligent intent detection.
🛡️ NasCoder Azure AI MCP Server
Professional Azure AI Integration for Model Context Protocol
🎯 Overview
A comprehensive Model Context Protocol (MCP) server that provides seamless integration with Azure AI services. Built for production use with intelligent routing and comprehensive error handling.
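The package does not document its routing internals, but keyword-based intent detection is one common way such auto-routing works. A hypothetical sketch (names like `routeIntent` and `INTENT_RULES` are illustrative, not the package's actual API):

```javascript
// Hypothetical keyword-based intent router (illustrative only, not the
// package's actual implementation): map a free-form query to the most
// likely tool, falling back to general chat.
const INTENT_RULES = [
  { tool: "translate_text", keywords: ["translate", "translation"] },
  { tool: "analyze_image", keywords: ["image", "photo", "picture"] },
  { tool: "analyze_document", keywords: ["document", "pdf", "invoice"] },
  { tool: "check_content_safety", keywords: ["moderate", "safety", "offensive"] },
  { tool: "analyze_language", keywords: ["sentiment", "detect language"] },
];

function routeIntent(query) {
  const q = query.toLowerCase();
  for (const rule of INTENT_RULES) {
    if (rule.keywords.some((k) => q.includes(k))) return rule.tool;
  }
  return "ask_azure_ai"; // default: general-purpose chat
}
```

A query like "Please translate this to Spanish" would route to `translate_text`, while anything unmatched falls through to `ask_azure_ai`.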
✅ Available Tools
- `ask_azure_ai` - Intelligent chat with auto-routing to the best available model
- `get_model_info` - Real-time model deployment information
- `health_check` - Service health monitoring and diagnostics
- `list_capabilities` - Available capabilities and features
- `analyze_image` - Computer vision analysis
- `translate_text` - Multi-language text translation
- `check_content_safety` - Content moderation and safety analysis
- `analyze_language` - Language detection and sentiment analysis
- `analyze_document` - Document intelligence and analysis
🚀 Installation
```bash
npm install nascoder-azure-ai-mcp
```
⚙️ Configuration
Environment Variables
Create a `.env` file with your Azure credentials:

```env
AZURE_AI_INFERENCE_API_KEY=your_azure_api_key_here
AZURE_AI_INFERENCE_ENDPOINT=https://your-endpoint.cognitiveservices.azure.com/
AZURE_AI_PROJECTS_CONNECTION_STRING=your_connection_string_here
```
MCP Client Integration
Add to your MCP client configuration:
```json
{
  "mcpServers": {
    "nascoder-azure-ai": {
      "command": "npx",
      "args": ["nascoder-azure-ai-mcp"],
      "env": {
        "AZURE_AI_INFERENCE_API_KEY": "your_api_key_here",
        "AZURE_AI_INFERENCE_ENDPOINT": "your_endpoint_here"
      }
    }
  }
}
```
🔧 Usage Examples
Basic Chat
```json
{
  "query": "Explain quantum computing",
  "context": "Educational content for beginners"
}
```
Image Analysis
```json
{
  "imageUrl": "https://example.com/image.jpg",
  "features": "objects,text,faces"
}
```
Text Translation
```json
{
  "text": "Hello world",
  "targetLanguage": "es"
}
```
🛡️ Security Features
- Environment variable based configuration
- No hardcoded credentials
- Comprehensive error handling
- Rate limiting and retry logic
- Input validation and sanitization
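The retry logic mentioned above is not documented in detail; a minimal sketch of retry with exponential backoff, assuming a generic async call (the package's actual internals may differ):

```javascript
// Hedged sketch of retry-with-exponential-backoff (illustrative, not the
// package's actual code): retry a failing async call, doubling the delay
// between attempts.
async function withRetry(fn, { retries = 3, baseDelayMs = 100 } = {}) {
  let lastError;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < retries) {
        const delay = baseDelayMs * 2 ** attempt; // 100ms, 200ms, 400ms, ...
        await new Promise((resolve) => setTimeout(resolve, delay));
      }
    }
  }
  throw lastError; // all attempts exhausted
}
```

Wrapping an Azure call as `withRetry(() => callAzure(payload))` would absorb transient failures such as rate-limit responses while still surfacing persistent errors.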
📋 Requirements
- Node.js >= 18.0.0
- Valid Azure AI services subscription
- Azure AI Foundry access
🔗 Links
- Repository: GitHub
- Issues: GitHub Issues
- NPM: Package
📄 License
MIT License - see LICENSE file for details.