# Local Coding Assistant — Code Models Across Your Fleet
Run the best open-source coding models on your own hardware. DeepSeek-Coder, Codestral, StarCoder2, and Qwen2.5-Coder are routed across your devices; the fleet picks the best machine for every code-generation request.
Your code never leaves your network. No GitHub Copilot subscription, no cloud API costs.
## Coding models available
| Model | Parameters | Ollama name | Strengths |
|---|---|---|---|
| Codestral | 22B | codestral | 80+ languages, fill-in-the-middle, Mistral's code specialist |
| DeepSeek-Coder-V2 | 236B MoE (21B active) | deepseek-coder-v2 | Matches GPT-4 Turbo on code tasks |
| DeepSeek-Coder | 6.7B, 33B | deepseek-coder:33b | Purpose-built for code (87% code training data) |
| Qwen2.5-Coder | 7B, 32B | qwen2.5-coder:32b | Strong multi-language code generation |
| StarCoder2 | 3B, 7B, 15B | starcoder2:15b | Trained on The Stack v2, 600+ languages |
| CodeGemma | 7B | codegemma | Google's code-focused Gemma variant |
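Any model in the table can be queried through Ollama's standard HTTP API. The sketch below, a minimal illustration rather than part of this project, builds a request body for Ollama's `/api/generate` endpoint; the optional `suffix` field enables fill-in-the-middle on models that support it (such as Codestral). The localhost URL and model tag are the Ollama defaults, not anything fleet-specific.

```python
import json

# Default Ollama endpoint on the local machine (assumption: standard port).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_code_request(model: str, prompt: str, suffix: str = "") -> dict:
    """Assemble a payload for Ollama's /api/generate endpoint.

    If `suffix` is given, FIM-capable models (e.g. codestral) will
    complete the gap between `prompt` and `suffix` instead of simply
    continuing the prompt.
    """
    body = {
        "model": model,       # an Ollama name from the table above
        "prompt": prompt,
        "stream": False,      # single JSON response instead of chunks
    }
    if suffix:
        body["suffix"] = suffix
    return body

# Example: ask Codestral to fill in a function body.
payload = build_code_request(
    "codestral",
    "def fibonacci(n):",
    suffix="\n    return result",
)
print(json.dumps(payload, indent=2))
```

To actually run it, POST the serialized payload to `OLLAMA_URL` with any HTTP client and read the generated code from the `response` field of the JSON reply (this requires a running Ollama server with the model pulled).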