apple-on-device-ai

Summary

Deploy on-device AI across Apple platforms using Foundation Models, Core ML, MLX Swift, and llama.cpp.

  • Choose Foundation Models for zero-setup text generation and structured output on iOS 26+; Core ML for custom vision and NLP models; MLX Swift for maximum throughput on Apple Silicon; and llama.cpp for cross-platform GGUF inference
  • Foundation Models includes session management, @Generable macros for type-safe structured output, tool calling, and streaming with always-enforced guardrails
  • Core ML supports PyTorch, TensorFlow, and scikit-learn conversion via coremltools, with quantization, palettization, and pruning for Neural Engine optimization
  • Multi-backend architecture patterns, memory management rules (60% RAM limit on iOS), and 10 common mistakes to avoid, such as skipping availability checks, failing to budget the context window, and mishandling concurrent requests
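To illustrate the Foundation Models bullet above, here is a minimal sketch of a session with `@Generable` structured output and an availability check, based on the iOS 26 API as presented at WWDC25. The `BookRecommendation` type and prompt text are illustrative, not part of this skill.

```swift
import FoundationModels

// Illustrative @Generable type: the macro derives a schema the model is
// constrained to follow, yielding type-safe output instead of raw text.
@Generable
struct BookRecommendation {
    @Guide(description: "Title of the recommended book")
    var title: String
    @Guide(description: "One-sentence reason for the recommendation")
    var reason: String
}

func recommendBook() async throws {
    // Availability check: the on-device model may be unavailable
    // (unsupported hardware, Apple Intelligence disabled, model downloading).
    guard case .available = SystemLanguageModel.default.availability else {
        return
    }

    // Sessions carry conversation context; guardrails are always enforced.
    let session = LanguageModelSession(
        instructions: "You recommend books concisely."
    )
    let response = try await session.respond(
        to: "Suggest one science-fiction novel.",
        generating: BookRecommendation.self
    )
    print(response.content.title, response.content.reason)
}
```

This only runs on an iOS 26+ device or simulator with Apple Intelligence enabled; the availability guard is the pattern the skill's "common mistakes" section warns about skipping.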
SKILL.md

On-Device AI for Apple Platforms

Guide for selecting, deploying, and optimizing on-device ML models. Covers Apple Foundation Models, Core ML, MLX Swift, and llama.cpp.

Contents

Framework Selection Router


Repository: dpearson2699/swift-ios-skills
Installs: 1.5K
GitHub Stars: 572
First Seen: Mar 3, 2026