jinni
Jinni is a tool designed to efficiently provide Large Language Models (LLMs) with the context of your projects by consolidating relevant project files.
Jinni is a Model Context Protocol (MCP) server and command-line utility that integrates project context into AI tools. Rather than having an LLM read files one at a time, it produces a consolidated view of the relevant project files in a single operation. The MCP server plugs into MCP-compatible AI tools, while the CLI supports manual context generation.

Jinni is designed to work out of the box for most projects: binary files, dotfiles, and common temporary files are excluded automatically, and a `.gitignore`-style rule system lets you customize inclusions and exclusions via `.contextfiles`. This makes it well suited to feeding context to LLMs so they can better understand and assist with project-related tasks.
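As a sketch of the MCP integration described above, an MCP client is typically pointed at the server through a JSON configuration entry. The command name `jinni-server` and the `uvx` launcher below are assumptions for illustration, not taken from this document; consult the project's own install instructions for the exact invocation.

```
{
  "mcpServers": {
    "jinni": {
      "command": "uvx",
      "args": ["jinni-server"]
    }
  }
}
```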
Features
- Efficient Context Gathering: Reads and concatenates relevant project files in one operation.
- Intelligent Filtering: Uses a system based on `.gitignore` syntax for inclusion and exclusion of files.
- Customizable Configuration: Allows precise control over which files and directories to include or exclude.
- Large Context Handling: Aborts with a `DetailedContextSizeError` if the total size of included files exceeds a configurable limit.
- Metadata Headers: Prefixes each included file with a header giving its path; the `list_only` option skips file contents and outputs paths alone.
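Because `.contextfiles` use `.gitignore`-style syntax, a rule file might look like the sketch below. The specific patterns, and the assumption that plain patterns include files while `!` patterns exclude them, are illustrative only; check the project's documentation for the exact rule semantics.

```
# Include Python sources and markdown docs
src/**/*.py
docs/*.md

# Exclude generated artifacts and test fixtures
!src/tests/fixtures/
!*.tmp
```

Placing such a file in a directory scopes its rules to that directory and below, in the same way nested `.gitignore` files scope ignore rules in git.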