# iOS On-Device AI Models (`local-ai-models`)

Production-ready guide for implementing on-device AI models in iOS apps using Apple's Foundation Models framework and MLX Swift.
## When to Use This Skill
- Implementing local LLM inference in iOS apps
- Building chat interfaces with Foundation Models
- Integrating Vision Language Models (VLMs)
- Adding text embeddings or image generation
- Implementing tool/function calling with LLMs
- Managing multi-turn conversations
- Optimizing memory usage for on-device models
- Supporting internationalization in AI features
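As a minimal sketch of the chat and multi-turn use cases above, the snippet below checks model availability, opens a `LanguageModelSession`, and sends two prompts that share one transcript. Type and method names follow Apple's Foundation Models framework as introduced at WWDC 2025 (`SystemLanguageModel`, `LanguageModelSession`, `respond(to:)`); the instruction string and prompts are illustrative, and the exact API surface should be verified against current Apple documentation.

```swift
import FoundationModels

func demoChat() async throws {
    // The on-device model requires Apple Intelligence to be enabled
    // and the model assets to be downloaded on the device.
    switch SystemLanguageModel.default.availability {
    case .available:
        break
    case .unavailable(let reason):
        print("On-device model unavailable: \(reason)")
        return
    }

    // A session owns the conversation transcript; instructions
    // steer the model's behavior across all turns.
    let session = LanguageModelSession(
        instructions: "You are a concise assistant for a note-taking app."
    )

    // Each respond(to:) call appends to the same transcript, so a
    // follow-up prompt sees earlier turns -- multi-turn for free.
    let summary = try await session.respond(
        to: "Summarize: the team meeting moved to Friday at 3pm."
    )
    print(summary.content)

    let title = try await session.respond(
        to: "Now phrase that as a short calendar event title."
    )
    print(title.content)
}
```

Because the session object carries the transcript, multi-turn conversation management (one of the use cases listed above) reduces to reusing a single session rather than replaying history manually.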
## Core Principles