omnillm-mcp
OmniLLM is an MCP server that allows Claude to query and integrate responses from other large language models (LLMs) like ChatGPT, Azure OpenAI, and Google Gemini, creating a unified access point for all your AI needs.
OmniLLM serves as a universal bridge for Claude: from a single access point, users can query OpenAI's ChatGPT, Azure OpenAI, and Google's Gemini, and compare their responses side by side. Setup requires only API keys for the services you want to use. Once configured, OmniLLM integrates with the Claude Desktop application, letting Claude consult other LLMs during a conversation, for example to compare opinions or gather information from multiple models.
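Registering the server with Claude Desktop follows the standard `claude_desktop_config.json` convention for MCP servers. A minimal sketch is below; the server path, command, and environment variable names are assumptions for illustration, not confirmed by the project:

```json
{
  "mcpServers": {
    "omnillm": {
      "command": "python",
      "args": ["/path/to/omnillm-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "AZURE_OPENAI_API_KEY": "your-azure-key",
        "GOOGLE_API_KEY": "your-gemini-key"
      }
    }
  }
}
```

Only the keys for the services you intend to use need to be set; unset services are simply reported as unavailable.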
Features
- Query OpenAI's ChatGPT models
- Query Azure OpenAI services
- Query Google's Gemini models
- Get responses from all LLMs for comparison
- Check which LLM services are configured and available
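The last feature above, checking which services are configured, typically amounts to testing which API keys are present in the environment. A minimal sketch of that logic, assuming hypothetical environment variable names (the actual names used by the project may differ):

```python
import os

# Hypothetical mapping from service name to its API-key environment variable.
# These variable names are assumptions for illustration, not confirmed
# by the OmniLLM project.
SERVICE_KEYS = {
    "openai": "OPENAI_API_KEY",
    "azure_openai": "AZURE_OPENAI_API_KEY",
    "gemini": "GOOGLE_API_KEY",
}


def available_services(env=os.environ):
    """Return the names of LLM services whose API keys are set and non-empty."""
    return [name for name, var in SERVICE_KEYS.items() if env.get(var)]
```

A service is treated as configured as soon as its key is present; actual request failures (expired or invalid keys) would still surface at query time.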