moreLLMMCP

An MCP server written in Python and implemented as Azure Functions, exposing LLM endpoints (such as Azure OpenAI) intended to be consumed via GitHub Copilot Chat.
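
As a rough illustration of the shape of such an endpoint (not the repository's actual code), the sketch below shows an Azure Functions HTTP route that forwards a prompt to an Azure OpenAI chat deployment. The route name, app settings, and deployment name are assumptions made for the example.

    # Hypothetical endpoint sketch; route, settings, and deployment name are illustrative.
    import json
    import os

    import azure.functions as func
    from openai import AzureOpenAI

    app = func.FunctionApp()

    # Azure OpenAI client configured from function app settings.
    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-06-01",
    )

    @app.route(route="chat", auth_level=func.AuthLevel.FUNCTION)
    def chat(req: func.HttpRequest) -> func.HttpResponse:
        """Accept {"prompt": "..."} and return the model's reply as JSON."""
        prompt = req.get_json().get("prompt", "")
        completion = client.chat.completions.create(
            model=os.environ.get("AZURE_OPENAI_DEPLOYMENT", "gpt-4o"),
            messages=[{"role": "user", "content": prompt}],
        )
        return func.HttpResponse(
            json.dumps({"reply": completion.choices[0].message.content}),
            mimetype="application/json",
        )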

Project Highlights

  • Minimal, maintainable, and production-ready design
  • Canonical handler layer for easy LLM provider extension (sketched below)
  • Secure, Azure-only deployment (no local emulator)
  • Atomic, testable, and observable endpoints
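
The canonical handler layer mentioned above is sketched here in broad strokes; the class and method names are illustrative assumptions, not the project's actual interfaces. The idea is that each LLM provider implements one small interface, so endpoint code never changes when a provider is added.

    # Illustrative handler-layer sketch; all names here are hypothetical.
    from abc import ABC, abstractmethod


    class LLMHandler(ABC):
        """Canonical interface that every LLM provider handler implements."""

        @abstractmethod
        def complete(self, prompt: str) -> str:
            """Return the model's reply for a single prompt."""


    class AzureOpenAIHandler(LLMHandler):
        """Azure OpenAI implementation of the canonical interface."""

        def __init__(self, client, deployment: str):
            self._client = client          # e.g. an openai.AzureOpenAI instance
            self._deployment = deployment  # Azure OpenAI deployment name

        def complete(self, prompt: str) -> str:
            result = self._client.chat.completions.create(
                model=self._deployment,
                messages=[{"role": "user", "content": prompt}],
            )
            return result.choices[0].message.content


    # Adding another provider means adding another LLMHandler subclass;
    # the endpoint code that calls handler.complete(...) stays unchanged.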

Quick Start

  1. Clone the repo
  2. See scratchpad/design-considerations.md for the architecture and implementation plan
  3. Follow the Implementation Plan phases for setup and deployment

Documentation

  • Design & Architecture: see scratchpad/design-considerations.md
  • Contributing: see the contributing guidelines in the repository
  • Code of Conduct: see the code of conduct in the repository

Community & Support

  • Open issues for bugs, questions, or feature requests
  • PRs welcome! Please read the contributing guidelines first

License