Wraps Mozilla Readability and Turndown to strip web pages down to clean Markdown, cutting token usage for LLM processing. Exposes a read_website tool that fetches URLs and converts HTML to Markdown while preserving links, plus cache management resources. Built for Claude Desktop, VS Code, and other MCP clients when you need to feed web content to AI without the noise. Handles robots.txt, includes smart caching with SHA-256 hashing, and supports concurrent crawling with configurable depth. Reach for this when existing web scrapers are too slow or token-heavy for your AI workflows.
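The SHA-256-based caching mentioned above can be sketched roughly as follows. This is an illustrative example only, not the server's actual implementation; `cache_key` is a hypothetical helper name:

```python
import hashlib

def cache_key(url: str) -> str:
    # Hypothetical helper: derive a stable cache key from a URL
    # by hashing it with SHA-256, so identical URLs always map to
    # the same cache entry and repeat fetches can be served locally.
    return hashlib.sha256(url.encode("utf-8")).hexdigest()

# The same URL always yields the same 64-character hex key.
key = cache_key("https://example.com/article")
```

Keying the cache on a hash rather than the raw URL keeps filenames filesystem-safe regardless of query strings or unusual characters in the URL.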
```shell
claude mcp add --transport stdio just-every-mcp-read-website-fast uvx mcp-read-website-fast
```