This gives you production-ready data quality validation using Great Expectations, dbt tests, and data contracts. You get concrete expectation suites for completeness, uniqueness, and range validation, plus automated checkpoints that can fail your pipeline when data quality drops. The patterns cover everything from basic null checks to statistical validation and cross-table relationship tests. Honestly, the Great Expectations setup is a bit verbose, but once it's configured it catches data issues before they hit production dashboards. Most useful when you need bulletproof data pipelines and can't afford bad data making it to stakeholders.
npx skills add https://github.com/wshobson/agents --skill data-quality-frameworks
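To make the three expectation types concrete, here is a library-free Python sketch of the kinds of checks the skill configures. The `check_*` helper names and the `run_checkpoint` function are hypothetical stand-ins, not Great Expectations API (whose interface varies across versions); in GE these roughly correspond to expectations such as `expect_column_values_to_not_be_null`, `expect_column_values_to_be_unique`, and `expect_column_values_to_be_between`, wired into a checkpoint that halts the pipeline on failure.

```python
# Minimal sketch of completeness, uniqueness, and range checks plus a
# pipeline-failing "checkpoint". All names here are illustrative, not GE API.

def check_completeness(rows, column):
    """Completeness: fail if any row has a missing (None) value in the column."""
    nulls = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"success": not nulls, "unexpected_rows": nulls}

def check_uniqueness(rows, column):
    """Uniqueness: fail if the column contains duplicate values."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        (dupes if v in seen else seen).add(v)
    return {"success": not dupes, "duplicates": sorted(dupes)}

def check_range(rows, column, low, high):
    """Range validation: fail if any non-null value falls outside [low, high]."""
    bad = [r[column] for r in rows
           if r.get(column) is not None and not (low <= r[column] <= high)]
    return {"success": not bad, "out_of_range": bad}

def run_checkpoint(rows, checks):
    """Run all checks; raise (i.e. fail the pipeline) if any check fails."""
    results = [check(rows) for check in checks]
    if not all(r["success"] for r in results):
        raise ValueError(f"Data quality checkpoint failed: {results}")
    return results

rows = [
    {"id": 1, "age": 34},
    {"id": 2, "age": 29},
    {"id": 3, "age": 41},
]
run_checkpoint(rows, [
    lambda r: check_completeness(r, "id"),
    lambda r: check_uniqueness(r, "id"),
    lambda r: check_range(r, "age", 0, 120),
])  # passes; a None id or an age of 200 would raise and halt the pipeline
```

The point of routing everything through a single `run_checkpoint` call is the "fail your pipeline when data quality drops" behavior described above: one failing expectation stops bad data before it reaches downstream dashboards.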