A straightforward wrapper around wget that lets Claude download entire websites for local browsing or archival. Exposes a single download_website tool that recursively crawls a site, preserves its directory structure, converts links to work offline, and restricts downloads to the target domain. You can control crawl depth and output location. Handy when you need to grab documentation sites, create offline copies of web resources, or analyze site structure. Requires wget to be installed locally; works across macOS, Linux, and Windows. The tool handles all the wget flags for you, so you get proper recursive downloading with converted links without wrestling with command-line options.
claude mcp add --transport stdio pskill9-website-downloader uvx website-downloader
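For a sense of what the tool is doing under the hood, here is a minimal sketch of how such a wrapper might assemble the wget invocation. This is not the server's actual implementation; the function name, parameter names, and defaults are assumptions for illustration, though the wget flags themselves are standard.

```python
import subprocess

def download_website(url: str, depth: int = 5, output_dir: str = "./site") -> None:
    """Mirror a site with wget, roughly as the download_website tool does (hypothetical sketch)."""
    cmd = [
        "wget",
        "--recursive",                        # follow links and crawl the site
        f"--level={depth}",                   # limit crawl depth
        "--convert-links",                    # rewrite links so pages work offline
        "--page-requisites",                  # fetch the CSS/JS/images each page needs
        "--adjust-extension",                 # add .html extensions for local browsing
        "--no-parent",                        # don't ascend above the starting path
        f"--directory-prefix={output_dir}",   # where to store the mirror
        url,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    download_website("https://example.com/docs/", depth=3, output_dir="./docs-mirror")
```

Note that wget stays on the starting host by default (it only spans hosts with -H), which is how the target-domain restriction falls out without extra flags.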