otterbridge
OtterBridge is a lightweight MCP server for connecting applications to various Large Language Model (LLM) providers.
OtterBridge is a flexible server designed to connect applications with various Large Language Model (LLM) providers. It follows the principles of simplicity and composability, providing a clean interface to LLMs while remaining adaptable to different use cases. It currently supports the Ollama provider, with plans to expand to other providers such as ChatGPT and Claude. Built with FastMCP for a reliable, efficient server implementation, OtterBridge offers easy access to model information and capabilities, making it a versatile tool for developers integrating LLM functionality into their applications.
Features
- Provider-Agnostic: Designed to work with multiple LLM providers (currently Ollama, with ChatGPT and Claude coming soon).
- Simple, Composable Design: Follows best practices for LLM agent architecture.
- Lightweight Server: Built with FastMCP for reliable, efficient server implementation.
- Model Management: Easy access to model information and capabilities.
Tools
chat
Send messages to LLMs and get AI-generated responses.
list_models
Retrieve information about available language models.
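As an illustration, a minimal sketch of what these two tools might do under the hood when backed by a local Ollama instance. The endpoint paths (`/api/chat`, `/api/tags`) come from Ollama's public HTTP API; the function names, the payload helper, and the default port are assumptions for this sketch, not OtterBridge's actual implementation.

```python
# Sketch of chat / list_models against Ollama's HTTP API.
# Helper and function names are illustrative, not OtterBridge's code.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local port (assumed)


def build_chat_payload(model: str, messages: list[dict]) -> dict:
    """Assemble the JSON body Ollama's /api/chat endpoint expects."""
    return {"model": model, "messages": messages, "stream": False}


def chat(model: str, messages: list[dict]) -> str:
    """Send a conversation to Ollama and return the assistant's reply text."""
    body = json.dumps(build_chat_payload(model, messages)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]


def list_models() -> list[str]:
    """Return the names of models available on the Ollama server."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return [m["name"] for m in json.load(resp)["models"]]
```

In an actual OtterBridge deployment, functions like these would be registered as FastMCP tools so that any MCP client can call them, rather than being invoked directly.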