ArtemisAI/LiteLLM-MCP-Server
LiteLLM MCP Server is a Model Context Protocol (MCP) server that connects Claude to a LiteLLM proxy instance, exposing tools for model discovery, virtual key management, and spend monitoring.
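Like other MCP servers, it is registered in the Claude client's `mcpServers` configuration. The snippet below is only a hedged sketch: the launch command, file path, and environment variable names are assumptions for illustration and will differ depending on how the repository is built and run.

```json
{
  "mcpServers": {
    "litellm": {
      "command": "node",
      "args": ["/path/to/LiteLLM-MCP-Server/build/index.js"],
      "env": {
        "LITELLM_API_URL": "http://localhost:4000",
        "LITELLM_API_KEY": "sk-your-litellm-master-key"
      }
    }
  }
}
```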
Tools
Functions exposed to the LLM so it can take actions (a usage sketch follows the list below).
list_models
List all available models in your LiteLLM instance.
get_model_info
Retrieve detailed information about a specific model.
create_virtual_key
Generate a new virtual API key for rate limiting and user management.
get_spend
Monitor API usage and costs for a specific user.
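These tools correspond to operations on the LiteLLM proxy's management REST API. As a rough illustration only, the sketch below shows the kind of HTTP calls two of the tools (list_models and create_virtual_key) could be wrapping; the base URL, environment variable names, and request parameters are assumptions, not taken from the repository.

```python
"""Hedged sketch of the LiteLLM proxy calls these tools plausibly wrap."""
import os

import requests

# Assumptions: the proxy is reachable at LITELLM_API_URL (default
# http://localhost:4000) and authenticated with a key in LITELLM_API_KEY.
# The exact routes and parameters this server uses are not documented here;
# the paths below follow LiteLLM's public proxy API.
BASE_URL = os.environ.get("LITELLM_API_URL", "http://localhost:4000")
HEADERS = {"Authorization": f"Bearer {os.environ['LITELLM_API_KEY']}"}


def list_models() -> list[dict]:
    """Roughly what list_models surfaces: the proxy's available models."""
    resp = requests.get(f"{BASE_URL}/v1/models", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["data"]


def create_virtual_key(max_budget: float | None = None) -> dict:
    """Sketch of create_virtual_key: ask the proxy to mint a scoped key."""
    payload = {} if max_budget is None else {"max_budget": max_budget}
    resp = requests.post(
        f"{BASE_URL}/key/generate", json=payload, headers=HEADERS, timeout=30
    )
    resp.raise_for_status()
    return resp.json()  # response includes the new key, e.g. {"key": "sk-..."}
```

In normal use Claude invokes these tools over MCP rather than hitting the proxy directly; the direct calls above are only meant to show what data the tools surface.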
Prompts
Interactive templates invoked by user choice
No prompts
Resources
Contextual data attached and managed by the client