pytorch-lightning

Overview

PyTorch Lightning is a lightweight wrapper around PyTorch that decouples research code from engineering code. It automates 40+ engineering details, such as epoch loops, optimization, and hardware acceleration, while leaving researchers full control over the model logic.

When to Use

Use Lightning when you want to scale models to multi-GPU or multi-node environments without changing training code, or when you want to eliminate boilerplate for logging, checkpointing, and reproducibility.

Decision Tree

  1. Are you testing code logic on a small subset?
    • YES: Use Trainer(fast_dev_run=True).
  2. Do you need to scale to multiple GPUs?
    • YES: Set accelerator='gpu' and devices=N in the Trainer.
  3. Do you have logic that is non-essential to the model (e.g., special logging)?
    • YES: Implement it as a Callback.
