georgepok/local-llm-mcp-server
A Model Context Protocol (MCP) server that bridges local LLMs running in LM Studio with Claude Desktop and other MCP clients, ensuring privacy by processing AI tasks locally.
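To make the server available in Claude Desktop, it needs an entry under mcpServers in claude_desktop_config.json. The snippet below is a minimal sketch of that registration, assuming a Python entry point, the macOS config location, and an LM_STUDIO_URL environment variable pointing at LM Studio's local OpenAI-compatible endpoint; the actual command, file path, and variable names should come from the repository's own setup instructions.

```python
import json
from pathlib import Path

# Assumed macOS location of the Claude Desktop config file.
CONFIG_PATH = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"

# Hypothetical launch command and environment; adjust to the repository's README.
entry = {
    "local-llm": {
        "command": "python",
        "args": ["/path/to/local_llm_mcp_server.py"],
        "env": {"LM_STUDIO_URL": "http://localhost:1234/v1"},  # LM Studio's default local server
    }
}

config = json.loads(CONFIG_PATH.read_text()) if CONFIG_PATH.exists() else {}
config.setdefault("mcpServers", {}).update(entry)
CONFIG_PATH.write_text(json.dumps(config, indent=2))
print(f"Registered local-llm under mcpServers in {CONFIG_PATH}")
```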
Tools
Functions the server exposes so the LLM can take actions; a client-side call sketch follows the list.
local_reasoning
Use the local LLM for specialized reasoning tasks while keeping data private.
private_analysis
Analyze sensitive content locally without cloud exposure.
secure_rewrite
Rewrite or transform text locally for privacy.
code_analysis
Analyze code locally for security, quality, or documentation.
template_completion
Complete templates or forms using the local LLM.
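These tools can also be exercised from any MCP client outside Claude Desktop. Below is a minimal sketch using the official Python MCP SDK over stdio, assuming a Python entry point for the server; the "prompt" argument name is illustrative, since each tool's real input schema is advertised by the server through list_tools().

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Assumed launch command; use whatever actually starts this server locally.
    server = StdioServerParameters(
        command="python", args=["/path/to/local_llm_mcp_server.py"]
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools listed above.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call one of them; the request is handled by the local LLM in LM Studio.
            result = await session.call_tool(
                "local_reasoning",
                {"prompt": "Outline the trade-offs of local inference."},
            )
            print(result.content)


asyncio.run(main())
```

Once the server is registered in Claude Desktop, the client performs the equivalent calls automatically; a script like this is mainly useful for testing the tools in isolation.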
Prompts
Interactive templates invoked by user choice
No prompts
Resources
Contextual data attached and managed by the client