llm-bridge-mcp
LLM Bridge MCP lets AI agents interact with multiple large language models through a standardized interface.
LLM Bridge MCP is a server that provides a unified interface to various large language models (LLMs). It uses the Model Context Protocol (MCP) to connect with different LLM providers, making it easy to switch between models or use several concurrently within the same application. The server is built with Pydantic AI for type safety and validation, supports customizable parameters such as temperature and max tokens, and provides usage tracking and metrics for insight into model performance and resource consumption.
Features
- Unified interface to multiple LLM providers including OpenAI, Anthropic, Google, and DeepSeek.
- Built with Pydantic AI for type safety and validation.
- Supports customizable parameters like temperature and max tokens.
- Provides usage tracking and metrics.
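The type-safe, customizable parameters listed above can be sketched roughly as follows. This is an illustrative request schema, not the server's actual one: the field names, defaults, and bounds are assumptions, and Pydantic-style validation is approximated here with a stdlib dataclass so the sketch stays self-contained.

```python
from dataclasses import dataclass

@dataclass
class LLMRequest:
    # Hypothetical request schema mirroring the customizable
    # parameters the server exposes (names/bounds are illustrative).
    prompt: str
    model: str = "openai:gpt-4o"
    temperature: float = 0.7
    max_tokens: int = 1024

    def __post_init__(self) -> None:
        # Reject out-of-range values at construction time,
        # in the spirit of Pydantic's validation.
        if not 0.0 <= self.temperature <= 2.0:
            raise ValueError("temperature must be in [0.0, 2.0]")
        if self.max_tokens <= 0:
            raise ValueError("max_tokens must be positive")
```

Validating at the boundary like this is what makes a multi-provider bridge robust: a bad parameter fails fast with a clear error instead of being forwarded to a provider API.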
Tools
run_llm
Sends a prompt to the selected LLM and returns its response.
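The provider routing behind a tool like this can be sketched as a simple dispatch on a `provider:model` identifier. This is a minimal stub under assumed naming conventions, not the server's real implementation; the actual tool forwards the prompt to the provider's API rather than returning a placeholder string.

```python
def run_llm(prompt: str, model: str = "openai:gpt-4o") -> str:
    """Hypothetical sketch of a run_llm tool: route a 'provider:model'
    identifier to the matching backend (identifier format is assumed)."""
    provider, _, model_name = model.partition(":")
    supported = {"openai", "anthropic", "google", "deepseek"}
    if provider not in supported:
        raise ValueError(f"unsupported provider: {provider}")
    # A real server would call the provider's API here; this stub
    # only demonstrates the routing step of the unified interface.
    return f"{provider}/{model_name} response to: {prompt}"
```

Because every model is reached through the same call shape, switching providers is just a change to the `model` string, which is the core convenience a bridge like this offers.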