llm-mcp-server

mbosley/llm-mcp-server



The LLM MCP Server integrates multiple LLM APIs and exposes them as tools for Claude Code.

Tools

Functions exposed to the LLM to take actions

analyze_with_gemini

Analyze large codebases or documents with Gemini 2.5 Pro's large (1M-token) context window.

quick_gpt

Fast responses using GPT-4.1-nano for simple tasks.

balanced_llm

Use Gemini 2.5 Flash or GPT-4.1-mini for balanced tasks.

route_to_best_model

Automatically choose the best model based on the task.
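The routing logic is not documented here, but a tool like this typically applies simple heuristics over the request before dispatching. A minimal sketch, assuming routing by prompt length and a context-size flag (the thresholds and the mapping to the model tiers above are illustrative assumptions, not the server's actual rules):

```python
# Hypothetical routing heuristic for route_to_best_model.
# Model tiers mirror the tools above; thresholds are illustrative assumptions.

def route_to_best_model(prompt: str, needs_large_context: bool = False) -> str:
    """Pick a model name using simple, illustrative heuristics."""
    if needs_large_context or len(prompt) > 100_000:
        return "gemini-2.5-pro"    # analyze_with_gemini tier: huge context
    if len(prompt) < 500:
        return "gpt-4.1-nano"      # quick_gpt tier: fast, cheap
    return "gemini-2.5-flash"      # balanced_llm tier: middle ground
```

In a real router, the decision would also weigh task type (code analysis vs. quick Q&A) and per-model cost, but length-based tiering is a common first cut.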

check_costs

Check cumulative costs for all LLM usage in this session.
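Session cost tracking like this usually accumulates per-call charges from token counts and per-model rates. A minimal sketch, assuming a per-million-token price table (the rates and model names below are illustrative placeholders, not the server's actual pricing):

```python
# Hypothetical session cost tracker behind check_costs.
# Rates are illustrative placeholders (USD per 1M tokens), not real pricing.

PRICE_PER_1M_TOKENS = {            # model -> (input rate, output rate)
    "gemini-2.5-pro": (1.25, 10.00),
    "gpt-4.1-nano": (0.10, 0.40),
}

class CostTracker:
    def __init__(self) -> None:
        self.total_usd = 0.0
        self.calls: list[tuple[str, float]] = []

    def record(self, model: str, input_tokens: int, output_tokens: int) -> float:
        """Accumulate the cost of one LLM call and return it."""
        in_rate, out_rate = PRICE_PER_1M_TOKENS[model]
        cost = (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000
        self.total_usd += cost
        self.calls.append((model, cost))
        return cost

    def check_costs(self) -> str:
        """Summarize cumulative spend for the session."""
        return f"{len(self.calls)} calls, ${self.total_usd:.4f} total"
```

Each tool call would invoke `record` with the token counts returned by the provider API, and `check_costs` reports the running total.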

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources