omnillm-mcp

sabpap/omnillm-mcp

If you are the rightful owner of omnillm-mcp and would like to certify it and/or have it hosted online, please send an email to henry@mcphub.com.

OmniLLM is an MCP server that allows Claude to query and integrate responses from other large language models (LLMs) like ChatGPT, Azure OpenAI, and Google Gemini, creating a unified access point for all your AI needs.
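Because MCPHub could not extract the tool schema automatically, one way to see what the server exposes is to connect to it locally with the official MCP Python SDK and list its tools. The sketch below is a minimal example under stated assumptions: the launch command (`python server.py`), the script name, and the environment variable used for API keys are guesses for illustration, not taken from the repository.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed launch command and script name; adjust to match how the
# sabpap/omnillm-mcp server is actually started per its README.
server_params = StdioServerParameters(
    command="python",
    args=["server.py"],
    # Hypothetical environment variable; the real key names for the
    # ChatGPT, Azure OpenAI, and Gemini backends may differ.
    env={"OPENAI_API_KEY": "sk-..."},
)

async def main() -> None:
    # Spawn the server over stdio and open an MCP client session.
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Ask the server which tools it advertises.
            result = await session.list_tools()
            for tool in result.tools:
                print(f"{tool.name}: {tool.description}")

if __name__ == "__main__":
    asyncio.run(main())
```

Running this script prints the name and description of each tool the server registers, which is the same schema information Claude Desktop would receive when the server is configured as an MCP server.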

MCPHub score: 3.17

Has a README: The GitHub repo has a README.md.

Has a License: The GitHub repo has license information.

Server can be inspected: Currently cannot be tried on MCPHub.

Server schema can be extracted: Tool information cannot be extracted from the README or the server.

Online hosted on MCPHub: More deployment information is needed.

Has social accounts: No social accounts are listed.

Claimed by the author or certified by MCPHub: If you are the author, claim authorship.