JAX
JAX is "NumPy on steroids": it combines Autograd-style automatic differentiation with XLA compilation. As of 2025, Flax NNX (a PyTorch-style object-oriented API) is becoming the standard way to build neural networks on JAX.
When to Use
- TPU Training: JAX runs natively on Google TPUs.
- Research: If you need higher-order derivatives (say, 10th-order) or otherwise unconventional math.
- Massive Scale: DeepMind and OpenAI use JAX for training frontier models.
Core Concepts
Functional Transformations
grad() for automatic differentiation, jit() for XLA compilation, vmap() for automatic vectorization, and pmap() for multi-device parallelism.
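A minimal sketch of the first three transformations (pmap is omitted because it needs multiple devices); the loss function here is an illustrative placeholder, not from the original text:

```python
import jax
import jax.numpy as jnp

def loss(w):
    # A toy quadratic loss: f(w) = sum(w**2)
    return jnp.sum(w ** 2)

# grad(): returns a function computing df/dw; here 2*w
g = jax.grad(loss)(jnp.array([1.0, 2.0]))        # -> [2., 4.]

# jit(): compiles the function with XLA for fast repeated calls
fast_loss = jax.jit(loss)

# vmap(): maps a per-example function over a leading batch axis
doubled = jax.vmap(lambda x: x * 2.0)(jnp.arange(3.0))  # -> [0., 2., 4.]
```

Transformations compose: `jit(vmap(grad(f)))` gives compiled per-example gradients, which is the idiom behind much of JAX's research appeal.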
Flax (NNX)
Flax is JAX's neural network library. Its NNX API introduces mutable, object-oriented module state, so model code reads much like PyTorch.