ghcontext: Supercharge Your LLMs with Real-time GitHub Context

"But my GitHub repo changed yesterday..." - Never worry about outdated information in your AI assistants again.

ghcontext (GitHub Context Provider) bridges the gap between GitHub and Large Language Models, giving AI assistants real-time access to repository information through the standardized Model Context Protocol (MCP).

🔥 Why ghcontext?

  • Accurate, Real-time Information: LLMs often have outdated knowledge about repositories. ghcontext provides the latest API docs, README contents, and codebase structure, including for private repositories.
  • Deeper Understanding: Help LLMs grasp your project's architecture, design principles, and API usage patterns.
  • Seamless Integration: Compatible with any MCP-enabled model, including Claude, GPT, and others.
  • Highly Efficient: Intelligent caching reduces API calls while keeping information fresh.

✨ Key Features

  • API Documentation Extraction: Automatically identifies and extracts API documentation from READMEs and dedicated documentation files
  • Repository Structure Analysis: Provides a map of your codebase's organization
  • README Content Retrieval: Gets the latest documentation directly from GitHub
  • File Content Search: Find and extract specific files or code snippets
  • Repository Search: Discover repositories matching specific criteria

🚀 Quick Start

A note on tokens

ghcontext requires a GitHub token for authentication. You are responsible for storing the token securely, and you should grant it only the scopes your use case actually requires. For example, if you only need to read public repositories, a token with the public_repo scope is sufficient. ghcontext does not need write access to your repositories.

Installation

Method 1: Run without Installation using npx
# Run directly without installation (GitHub token is REQUIRED)
npx ghcontext --GITHUB_TOKEN your_github_token

# OR using pnpm
pnpm dlx ghcontext --GITHUB_TOKEN your_github_token

This is the preferred way to expose ghcontext to a Claude agent: it requires no installation, and you can run it directly from the command line.
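
For example, if you are configuring Claude Desktop (or another MCP client that uses the common mcpServers configuration format), an entry along these lines is a reasonable starting point. This is only a sketch: it assumes ghcontext can speak MCP over stdio when launched this way, and the config file location varies by client and platform. If your client connects over HTTP instead, point it at the endpoint described under "Usage with LLMs" below.

{
  "mcpServers": {
    "ghcontext": {
      "command": "npx",
      "args": ["ghcontext", "--GITHUB_TOKEN", "your_github_token"]
    }
  }
}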

Method 2: Global Installation from npm
# Install globally using npm
npm install -g ghcontext

# OR using pnpm
pnpm add -g ghcontext

# Run ghcontext with your GitHub token (REQUIRED)
ghcontext --GITHUB_TOKEN your_github_token

Method 3: Manual Installation (Development)
# Clone the repository
git clone https://github.com/MarcoMuellner/ghcontext.git
cd ghcontext

# Install dependencies
pnpm install

# Start the server with GitHub token (REQUIRED)
pnpm start --GITHUB_TOKEN your_github_token

Usage with LLMs

Connect your MCP-compatible LLM to the ghcontext server endpoint:

http://localhost:3000/api/mcp

Your LLM will now have access to tools like:

  • get-repository-info: Get detailed information about a repository
  • get-repository-readme: Retrieve the current README content
  • get-repository-api-docs: Extract API documentation
  • search-repository-files: Find files in a repository
  • get-file-content: Retrieve specific file contents
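
If you want to exercise the server programmatically rather than through an assistant, a minimal client sketch using the MCP TypeScript SDK might look like the following. It assumes the endpoint above is served over SSE and that the @modelcontextprotocol/sdk package is installed; adjust the transport if ghcontext uses a different one.

// demo-client.ts - a hedged sketch; the transport choice is an assumption
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const transport = new SSEClientTransport(new URL("http://localhost:3000/api/mcp"));
const client = new Client({ name: "ghcontext-demo", version: "0.1.0" }, { capabilities: {} });

await client.connect(transport);

// List the tools the server exposes (get-repository-info, get-repository-readme, ...)
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));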

🔍 Example Scenario

Ask your MCP-enabled AI assistant:

"What are the available methods in the axios library for handling request interceptors?"

Instead of getting outdated or generic information, your assistant can:

  1. Use get-repository-api-docs to fetch the latest axios API documentation
  2. Analyze the current documentation for interceptor methods
  3. Provide you with accurate, up-to-date information
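
Using the client sketched above, that first step roughly corresponds to a single tool call. The argument names here (owner, repo) are guesses, since this README does not document the tool schemas; check the tool's input schema returned by listTools first.

// Hedged example: argument names are assumptions, not the documented schema
const result = await client.callTool({
  name: "get-repository-api-docs",
  arguments: { owner: "axios", repo: "axios" },
});
console.log(result.content);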

🧰 Architecture

ghcontext follows a modular design:

┌─────────────────┐       ┌──────────────┐       ┌────────────────┐
│   MCP Server    │◄─────►│  GitHub API  │◄─────►│  GitHub.com    │
│  (TypeScript)   │       │    Client    │       │                │
└────────┬────────┘       └──────────────┘       └────────────────┘
         │
         │
┌────────▼────────┐       ┌──────────────┐
│ Context         │       │   Caching    │
│ Processors      │◄─────►│   System     │
└─────────────────┘       └──────────────┘

  • MCP Server: Handles the Model Context Protocol communication
  • GitHub API Client: Manages GitHub REST and GraphQL API interactions
  • Context Processors: Extract and organize relevant information
  • Caching System: Improves performance and reduces API load
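
The caching system is what keeps repeated questions about the same repository cheap. This README does not describe its internals, but a minimal TTL-based sketch of the idea in TypeScript could look like this; the cache key, TTL, and the GitHub call shown are illustrative assumptions, not ghcontext's actual implementation.

// Illustrative sketch of a TTL cache wrapping GitHub API calls
type CacheEntry<T> = { value: T; expiresAt: number };

class TtlCache {
  private store = new Map<string, CacheEntry<unknown>>();
  constructor(private ttlMs: number) {}

  async getOrFetch<T>(key: string, fetchValue: () => Promise<T>): Promise<T> {
    const hit = this.store.get(key);
    if (hit && hit.expiresAt > Date.now()) return hit.value as T;
    const value = await fetchValue();
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
    return value;
  }
}

// Usage: reuse a README for 5 minutes before hitting the GitHub API again
const cache = new TtlCache(5 * 60 * 1000);
const readme = await cache.getOrFetch("readme:axios/axios", () =>
  fetch("https://api.github.com/repos/axios/axios/readme", {
    headers: { Accept: "application/vnd.github.raw+json" },
  }).then((r) => r.text())
);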

🧠 Why It Matters

Traditional AI assistants struggle with:

  • Outdated knowledge of repositories
  • Incomplete understanding of project structure
  • Inability to see recent changes and updates

ghcontext solves these problems by giving LLMs a direct line to GitHub's latest information, making your AI assistants more accurate, more helpful, and more in sync with your evolving codebase.

🛠️ Development

# Build the project
pnpm run build

# Run tests
pnpm test

# Lint your code
pnpm run lint

# Format your code
pnpm run format

📦 Publishing to npm

If you're a maintainer of this package and need to publish a new version:

  1. Update the version in package.json:

    # For patch releases (bug fixes)
    npm version patch
    
    # For minor releases (new features, no breaking changes)
    npm version minor
    
    # For major releases (breaking changes)
    npm version major
    
  2. Publish to npm:

    # The prepublishOnly script will run linting, tests, and build automatically
    npm publish
    
  3. Push tags to GitHub:

    git push --follow-tags
    

📝 License

This project is MIT licensed - see the LICENSE file for details.


ghcontext: Because your AI assistant should understand your code as well as you do.