local-llm-mcp-server

georgepok/local-llm-mcp-server



A Model Context Protocol (MCP) server that bridges local LLMs running in LM Studio with Claude Desktop and other MCP clients, ensuring privacy by processing AI tasks locally.
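The bridging pattern described above is straightforward: the MCP server exposes one or more tools to Claude Desktop (or any other MCP client) and, when a tool is invoked, forwards the request to the local model that LM Studio serves through its OpenAI-compatible HTTP API, so the prompt never leaves the machine. The repository's own implementation is not reproduced here; the following is a minimal illustrative sketch in Python using the official MCP SDK's FastMCP helper. The tool name ask_local_llm, the placeholder model identifier, and the endpoint URL (LM Studio's default, http://localhost:1234/v1) are assumptions made for the example.

```python
# Minimal sketch of an MCP server that proxies prompts to a local LM Studio
# instance. Assumes the `mcp` and `requests` packages are installed and that
# LM Studio is serving its OpenAI-compatible API on the default port 1234.
import requests
from mcp.server.fastmcp import FastMCP

LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"  # LM Studio default

mcp = FastMCP("local-llm")


@mcp.tool()
def ask_local_llm(prompt: str) -> str:
    """Send a prompt to the model loaded in LM Studio and return its reply."""
    response = requests.post(
        LM_STUDIO_URL,
        json={
            # LM Studio routes requests to whichever model is loaded;
            # "local-model" is only a placeholder name.
            "model": "local-model",
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # stdio transport is how Claude Desktop talks to locally launched MCP servers.
    mcp.run()
```

A client such as Claude Desktop would then launch this script over stdio, typically via an entry under "mcpServers" in its claude_desktop_config.json; the exact command depends on how the repository packages its server.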

MCPHub score: 3.17

Has a README: the GitHub repo includes a README.md.

Has a license: the GitHub repo includes license information.

Server can be inspected: the server cannot currently be tried on MCPHub.

Server schema can be extracted: at least one tool's information can be obtained from the README or the server.

Online hosted on MCPHub: more deployment information is needed.

Has social accounts: no social accounts are listed.

Claimed by the author or certified by MCPHub: not yet claimed or certified.