Handles the Swift side of Core ML integration: loading models, configuring compute units (CPU, GPU, Neural Engine), running predictions, and working with MLTensor and MLMultiArray. Covers both auto-generated model classes and manual MLFeatureProvider paths, plus batch inference, stateful predictions for sequence models, and Vision framework integration. The skill draws a clear line between Swift-side integration (what's here) and Python-side conversion and optimization (handled elsewhere). Includes decision tables for compute-unit selection and concrete examples of async loading, image preprocessing, and multi-model pipelines. Targets iOS 14+, with newer features flagged by version. Use this when you're wiring up Core ML models in an iOS app and need the boilerplate right.
npx skills add https://github.com/dpearson2699/swift-ios-skills --skill coreml
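As a taste of what the skill covers, here is a minimal sketch of async loading with a compute-unit preference and a manual MLFeatureProvider prediction. The model URL, the `"text"` input name, and the `"label"` output name are placeholders for illustration; a real model's feature names come from its interface.

```swift
import CoreML

// Hypothetical example: load a compiled .mlmodelc and run one prediction.
func loadAndPredict(modelURL: URL) {
    let config = MLModelConfiguration()
    config.computeUnits = .all  // let Core ML choose CPU, GPU, or Neural Engine

    // Async load keeps model compilation off the main thread (iOS 14+).
    MLModel.load(contentsOf: modelURL, configuration: config) { result in
        switch result {
        case .success(let model):
            do {
                // Manual MLFeatureProvider path; "text" is a placeholder input name.
                let input = try MLDictionaryFeatureProvider(
                    dictionary: ["text": "great app" as NSString])
                let output = try model.prediction(from: input)
                // "label" is a placeholder output name.
                print(output.featureValue(for: "label") ?? "no label")
            } catch {
                print("Prediction failed: \(error)")
            }
        case .failure(let error):
            print("Model load failed: \(error)")
        }
    }
}
```

The skill's decision tables cover when to prefer `.cpuAndNeuralEngine` or `.cpuOnly` over `.all`; this sketch just shows where that choice plugs in.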