mcp-llama-github-integration


This repository contains a Model Context Protocol (MCP) server implementation that integrates with a locally running Llama model and provides GitHub repository file listing capabilities.

The MCP server enhances AI applications by pairing a local Llama model with GitHub repository file listing. A FastAPI-based server implements the Model Context Protocol, forwards queries to a locally running Llama model, and lists files from GitHub repositories via the GitHub API; a sample Python client application demonstrates how to interact with it. The server requires Python 3.7 or higher, a running Llama model server, internet access for the GitHub API, and Git installed. It runs on localhost and exposes endpoints for querying context and listing GitHub files. Both the Llama model and the GitHub API integration are customizable, and the server includes error handling for model and API failures.

Features

  • Integration with Llama model for AI query processing
  • GitHub repository file listing capabilities
  • FastAPI-based server implementation
  • Sample Python client application
  • Customizable Llama model and GitHub API integration
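A client along the lines of the bundled sample can be written with the standard library alone. The base URL and the `/query` endpoint path below are assumptions for illustration; adjust them to match the running server.

```python
import json
import urllib.request

# Hypothetical base URL; the sample client in the repository may differ.
BASE_URL = "http://localhost:8000"


def build_query_request(prompt: str) -> urllib.request.Request:
    """Construct the JSON POST request sent to the server's /query endpoint."""
    data = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/query",
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def ask(prompt: str) -> dict:
    """Send a prompt to the MCP server and return the decoded JSON reply."""
    with urllib.request.urlopen(build_query_request(prompt)) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    print(ask("Summarize the repository README."))
```

Separating request construction (`build_query_request`) from transport (`ask`) keeps the payload format easy to inspect and test without a running server.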