laravel:data-chunking-large-datasets
Data Chunking for Large Datasets
Process large datasets efficiently by breaking them into manageable chunks to reduce memory consumption and improve performance.
The Problem: Memory Exhaustion
// BAD: Loading all records into memory
$users = User::all(); // Could be millions of records!
foreach ($users as $user) {
    $user->sendNewsletter();
}
// BAD: Even with pluck, still loads everything
$emails = User::pluck('email'); // Collection of millions of emails
foreach ($emails as $email) {
    Mail::to($email)->send(new Newsletter());
}
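A sketch of the chunked alternatives using Laravel's built-in `chunkById()` and `lazy()` Eloquent methods; the `sendNewsletter()` method and `Newsletter` mailable are carried over from the examples above:

```php
// GOOD: chunkById() runs one query per batch of 500 rows, keyed on the
// primary key, so only one batch is held in memory at a time.
User::chunkById(500, function ($users) {
    foreach ($users as $user) {
        $user->sendNewsletter();
    }
});

// GOOD: lazy() returns a LazyCollection that fetches rows in chunks
// behind the scenes while you iterate a single flat stream.
User::lazy(500)->each(function ($user) {
    Mail::to($user->email)->send(new Newsletter());
});
```

Prefer `chunkById()` over plain `chunk()` when the loop also updates the rows being iterated, since offset-based chunking can skip or repeat records as the result set shifts.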