Automates app testing and UI interaction across iOS, Android, tvOS, and macOS by taking snapshots, extracting UI elements, and executing taps, scrolls, and text input. It starts with deterministic setup via bootstrap routines, then performs read-only inspection before making any UI changes. It handles React Native debug overlays and keyboard dismissal, and uses reference-based targeting instead of raw coordinates. The exploration workflow verifies app state before each mutation, reducing the risk of acting on stale or incorrect UI. Well suited to QA automation, bug reproduction, and any scenario where you need to programmatically drive a mobile or desktop app interface without manual interaction.
npx skills add https://github.com/callstackincubator/agent-device --skill agent-device
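The snapshot-inspect-act loop described above can be sketched in miniature. Note this is a hypothetical illustration of the workflow pattern only: the `Driver` and `Element` classes and all method names are invented for this sketch and are not the agent-device API.

```python
# Hypothetical sketch of the snapshot -> inspect -> act loop.
# Names are illustrative, NOT the real agent-device API.
from dataclasses import dataclass, field

@dataclass
class Element:
    ref: str              # stable reference id, used instead of raw coordinates
    label: str

@dataclass
class Driver:
    elements: list = field(default_factory=list)
    log: list = field(default_factory=list)

    def bootstrap(self):
        # Deterministic setup: put the app in a known state before interacting.
        self.elements = [Element("e1", "Login"), Element("e2", "Username field")]
        self.log.append("bootstrap")

    def snapshot(self):
        # Read-only inspection: list current UI elements without mutating state.
        self.log.append("snapshot")
        return list(self.elements)

    def tap(self, ref):
        # Mutate only after verifying the target exists in a fresh snapshot.
        if not any(e.ref == ref for e in self.snapshot()):
            raise ValueError(f"unknown ref {ref}")
        self.log.append(f"tap {ref}")

driver = Driver()
driver.bootstrap()
# Target by reference resolved from a snapshot, never by raw coordinates.
targets = [e for e in driver.snapshot() if e.label == "Login"]
driver.tap(targets[0].ref)
```

The key design point mirrored here is ordering: setup first, read-only snapshots next, and mutations only against references validated in the latest snapshot.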