Jinni is an MCP server that gives large language models consolidated project context by reading relevant source files and concatenating them, each preceded by a path header. It exposes a `read_context` tool that excludes binary files, dotfiles, build artifacts, and other non-essential files by default, while respecting `.gitignore` and custom `.contextfiles` rules for fine-grained control. This avoids the inefficiency of reading project files one at a time: the assistant receives a complete project overview in a single context window, so it can better understand and assist with code tasks.
To register the server with the Claude Code CLI over stdio:

```shell
claude mcp add --transport stdio smat-dev-jinni uvx jinni
```
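Inclusion rules can be tuned per directory with a `.contextfiles` file. The sketch below is purely illustrative and assumes gitignore-style patterns; the exact rule syntax is an assumption, so consult the Jinni documentation before relying on it:

```
# Hypothetical .contextfiles (syntax assumed gitignore-style)
# Include Python sources
*.py
# Include markdown documentation
docs/**/*.md
# Exclude bulky test fixtures
!tests/fixtures/
```

Placing such a file in a subdirectory would, under this assumption, scope the rules to that subtree, in the same way nested `.gitignore` files do.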