mistral
Mistral
Mistral AI focuses on efficiency and coding capabilities. Its Mixtral models popularized the sparse "Mixture of Experts" (MoE) architecture among open-weight LLMs.
When to Use
- Coding: Codestral (distinct from the general-purpose Mistral Large 2) is specifically optimized for code generation.
- Efficiency: Mixtral 8x7B offers roughly GPT-3.5-level performance at a fraction of the inference cost (see the API sketch after this list).
- Open Weights: Apache 2.0 licensed weights for the smaller models (e.g., Mistral 7B, Mixtral 8x7B).
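For the efficiency case, here is a minimal sketch of a chat completion request against Mistral's public API. It assumes the https://api.mistral.ai/v1/chat/completions endpoint and the open-mixtral-8x7b model identifier; check the current docs, since model names and endpoints change.

```python
# Minimal sketch: chat completion against the Mistral API.
# The endpoint URL and "open-mixtral-8x7b" model id are assumptions based on
# Mistral's public docs; verify both before relying on them.
import os
import requests

API_URL = "https://api.mistral.ai/v1/chat/completions"

def ask_mixtral(prompt: str) -> str:
    """Send a single-turn prompt and return the model's reply text."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
        json={
            "model": "open-mixtral-8x7b",
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_mixtral("Summarize the benefits of a mixture-of-experts model."))
```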
Core Concepts
MoE (Mixture of Experts)
Each token is routed to only a small subset of expert sub-networks, so only a fraction of the model's parameters are active per token: large-model quality at small-model compute.
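As a rough illustration (a toy sketch, not Mixtral's actual implementation), the PyTorch layer below routes each token to its top-k scoring experts, so only those experts' parameters do any work for that token:

```python
# Toy top-k expert routing: a gating network scores experts per token, and
# only the top-k experts run, keeping per-token compute small while total
# parameter count stays large.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    def __init__(self, dim: int = 64, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router that scores each expert for each token.
        self.gate = nn.Linear(dim, n_experts)
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
             for _ in range(n_experts)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim)
        scores = self.gate(x)                            # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # pick top-k experts per token
        weights = F.softmax(weights, dim=-1)             # normalize over chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                 # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

x = torch.randn(16, 64)          # 16 tokens, 64-dim hidden states
print(ToyMoELayer()(x).shape)    # torch.Size([16, 64])
```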
Codestral
A code-specialized model trained on 80+ programming languages, with support for fill-in-the-middle (FIM) completion (see the sketch below).
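A sketch of a fill-in-the-middle request to Codestral: the model receives the code before the cursor as the prompt and the code after it as the suffix, and completes the gap. The endpoint path, field names, and the codestral-latest model id are assumptions drawn from Mistral's public docs and should be verified before use.

```python
# Sketch of a FIM (fill-in-the-middle) request to Codestral.
# Endpoint, request fields, and "codestral-latest" are assumptions from
# Mistral's public docs; verify against the current API reference.
import os
import requests

FIM_URL = "https://api.mistral.ai/v1/fim/completions"

payload = {
    "model": "codestral-latest",
    "prompt": "def fibonacci(n: int) -> int:\n",   # code before the gap
    "suffix": "\nprint(fibonacci(10))",            # code after the gap
    "max_tokens": 128,
}

response = requests.post(
    FIM_URL,
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json())  # the generated middle section is in the response body
```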