jfuller1275/multi-llm-mcp-server
If you are the rightful owner of multi-llm-mcp-server and would like to certify it and/or have it hosted online, please leave a comment or send an email to henry@mcphub.com.
The Multi-LLM MCP Server is a versatile server that supports multiple language models, including Llama, Gemini, OpenAI, and Copilot, providing a single platform for a range of AI-driven applications.
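The repository's tool schema could not be extracted, so the exact interface is unknown. As a rough illustration only, a multi-provider MCP server is typically a thin routing layer in front of several model backends. The sketch below assumes the official MCP Python SDK (FastMCP); the tool name "ask_model" and the provider routing are illustrative assumptions, not this repository's actual implementation.

```python
# Minimal sketch of a multi-provider MCP server, assuming the official MCP
# Python SDK (FastMCP). The tool name "ask_model" and the provider routing
# are illustrative assumptions, not this repository's actual interface.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("multi-llm")


@mcp.tool()
def ask_model(provider: str, prompt: str) -> str:
    """Route a prompt to the requested LLM provider and return its reply."""
    if provider == "openai":
        # Call the OpenAI API here (e.g. via the openai SDK).
        return f"[openai reply to: {prompt}]"
    if provider == "gemini":
        # Call the Gemini API here (e.g. via the google-genai SDK).
        return f"[gemini reply to: {prompt}]"
    if provider in ("llama", "copilot"):
        # Call a local Llama runtime or the Copilot backend here.
        return f"[{provider} reply to: {prompt}]"
    raise ValueError(f"Unknown provider: {provider!r}")


if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```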
MCPHub score: 3.17
Has a README: The GitHub repo has a README.md.
Has a License: The GitHub repo does not have a valid license.
Server can be inspected: Currently cannot be tried on MCPHub.
Server schema can be extracted: Cannot extract tools info from the README or server (a client-side sketch for listing tool schemas follows this checklist).
Online hosted on MCPHub: More deployment information is needed.
Has social accounts: Does not have any social accounts.
Claimed by the author or certified by MCPHub: If you are the author, claim authorship.
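Since tool info could not be extracted automatically, one way to check a server's schema locally is to connect with the MCP Python client and call list_tools. The launch command below ("python server.py") is an assumption; substitute the repository's actual entry point.

```python
# Sketch of listing a server's tool schemas locally with the MCP Python
# client. The launch command ("python server.py") is an assumption; use the
# repository's actual entry point.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            for tool in result.tools:
                # Each tool carries a name, a description, and an input schema.
                print(tool.name, "-", tool.description)


if __name__ == "__main__":
    asyncio.run(main())
```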