mcp-llm

An MCP server that provides access to LLMs using the LlamaIndexTS library.

The MCP LLM server facilitates interaction with large language models (LLMs) through the LlamaIndexTS library. It exposes a set of tools that let developers generate code snippets, write generated code directly to files, produce documentation in formats like JSDoc, and ask questions of the underlying LLM. The server can be installed via Smithery or built manually from source, providing flexibility in deployment and making it a versatile way to leverage AI in development workflows.
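Assuming the server communicates over stdio like most MCP servers, an MCP client (for example, Claude Desktop) could register it with a configuration entry along these lines. The command and path are illustrative assumptions, not taken from the project's actual install instructions:

```json
{
  "mcpServers": {
    "mcp-llm": {
      "command": "node",
      "args": ["/path/to/mcp-llm/build/index.js"]
    }
  }
}
```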

Features

  • generate_code: Generate code based on a description
  • generate_code_to_file: Generate code and write it directly to a file at a specific line number
  • generate_documentation: Generate documentation for code
  • ask_question: Ask a question to the LLM
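As a sketch of how a client would invoke one of these tools, here is a hypothetical MCP `tools/call` request for `generate_code`. The `tools/call` method comes from the MCP specification; the argument name `description` is an assumption based on the feature summary above, not the server's published schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_code",
    "arguments": {
      "description": "A TypeScript function that debounces another function"
    }
  }
}
```

The server would respond with a `result` containing the generated code as tool output content, which the client can then display or, in the case of `generate_code_to_file`, have written to the target file.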