Flash-MoE Inference Engine
Skill by ara.so — Daily 2026 Skills collection.
Flash-MoE is a pure C/Objective-C/Metal inference engine that runs Qwen3.5-397B-A17B (397B parameter Mixture-of-Experts) on a MacBook Pro with 48GB RAM at 4.4+ tokens/second. It streams 209GB of expert weights from NVMe SSD on demand — no Python, no ML frameworks, just C, Objective-C, and hand-tuned Metal shaders.
Requirements
- Hardware: Apple Silicon Mac (M3 Max or similar), 48GB+ unified memory, 1TB+ SSD with ~210GB free
- OS: macOS 26+ (Darwin 25+)
- Tools: Xcode Command Line Tools, Python 3.x (for weight extraction only)
- Model: Qwen3.5-397B-A17B safetensors weights (download separately from HuggingFace)
Installation & Build
```sh
# Clone the repo
git clone https://github.com/danveloper/flash-moe
cd flash-moe/metal_infer
```