Cursor-Local-llm-MCP-proxy

Davz33/Cursor-Local-llm-MCP-proxy



This server implements speculative decoding with a local LLM: it prioritizes the local model's responses and falls back to other models when necessary.
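
The local-first behavior can be pictured as a try-local-then-fall-back flow. Below is a minimal sketch of that pattern; the Ollama-style endpoint, the payload shape, and the function names are illustrative assumptions, not code from this repository.

```python
# A minimal sketch of the local-first fallback flow described above.
# The endpoint, payload shape, and helper names are assumptions for
# illustration, not the proxy's actual implementation.
from collections.abc import Awaitable, Callable

import httpx

LOCAL_ENDPOINT = "http://localhost:11434/api/generate"  # e.g. an Ollama-style API (assumption)


async def generate_local_first(
    prompt: str,
    fallback: Callable[[str], Awaitable[str]],
) -> str:
    """Try the local model first; defer to `fallback` when it fails or returns nothing."""
    try:
        async with httpx.AsyncClient(timeout=30.0) as client:
            resp = await client.post(
                LOCAL_ENDPOINT,
                json={"model": "local-model", "prompt": prompt, "stream": False},
            )
            resp.raise_for_status()
            text = resp.json().get("response", "")
            if text.strip():
                return text
    except httpx.HTTPError:
        pass  # Treat any transport/HTTP failure as "local model unavailable".
    return await fallback(prompt)
```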

Tools

Functions exposed to the LLM so it can take actions

generate_text

Generate text using the local LLM, with fallback to other models.
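
As a rough illustration of invoking this tool from an MCP client, here is a sketch using the official MCP Python SDK. The launch command is a placeholder and the `prompt` argument name is an assumption; inspect the tool's input schema via `list_tools()` for the real parameters.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder command; substitute however you actually start the proxy.
server = StdioServerParameters(command="node", args=["path/to/proxy/index.js"])


async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # The input schema is not documented here, so "prompt" is a guess;
            # check (await session.list_tools()) for the real field names.
            result = await session.call_tool(
                "generate_text",
                arguments={"prompt": "Summarize speculative decoding in one sentence."},
            )
            print(result.content)


asyncio.run(main())
```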

chat_completion

Chat completion using the local LLM, with fallback.
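
A companion sketch for this tool, written as a helper that plugs into the session from the generate_text example above; the OpenAI-style `messages` shape is an assumption.

```python
from mcp import ClientSession


async def chat(session: ClientSession):
    # "messages" mirrors common chat-completion APIs; the real schema may differ.
    return await session.call_tool(
        "chat_completion",
        arguments={
            "messages": [
                {"role": "system", "content": "You are a concise assistant."},
                {"role": "user", "content": "What does this proxy do?"},
            ]
        },
    )
```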

generate_with_context

Generate text with automatic context gathering from available MCP servers.
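
Likewise, a sketch for this tool; the single `prompt` argument is an assumption, and the context gathering itself happens on the proxy side.

```python
from mcp import ClientSession


async def generate_with_context(session: ClientSession):
    # Context from other MCP servers is gathered by the proxy itself;
    # the client only supplies the prompt (argument name is a guess).
    return await session.call_tool(
        "generate_with_context",
        arguments={"prompt": "Refactor the selected function for readability."},
    )
```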

validate_response

Validate a local LLM response using Cursor agent capabilities.
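
Finally, a sketch for validation, passing a locally generated draft back for checking; both argument names are assumptions.

```python
from mcp import ClientSession


async def validate(session: ClientSession, prompt: str, draft: str):
    # Argument names are guesses; the tool presumably needs the original
    # prompt plus the local model's draft in order to validate it.
    return await session.call_tool(
        "validate_response",
        arguments={"prompt": prompt, "response": draft},
    )
```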

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources