pytorch-lightning

Summary

Streamlined PyTorch training framework with automatic distributed training, callbacks, and minimal boilerplate.

  • Organizes PyTorch code into LightningModule with training_step, validation_step, and test_step methods; Trainer class handles device management, mixed precision, checkpointing, and logging automatically
  • Supports distributed training strategies including DDP, FSDP, and DeepSpeed with single-line configuration; scales from laptop to multi-node clusters without code changes
  • Built-in callbacks system for ModelCheckpoint, EarlyStopping, LearningRateMonitor, and custom extensions; integrates with TensorBoard and popular logging platforms
  • Handles gradient accumulation, learning rate scheduling, and precision modes (FP32, FP16, BF16, FP8); works across GPU, TPU, CPU, and Apple MPS accelerators
SKILL.md

PyTorch Lightning - High-Level Training Framework

Quick start

PyTorch Lightning organizes PyTorch code to eliminate boilerplate while maintaining flexibility.

Installation:

pip install lightning

Convert PyTorch to Lightning (3 steps):

import lightning as L
import torch
from torch import nn
from torch.utils.data import DataLoader, Dataset