Processing API Batches

Overview

Optimize bulk API operations with batch request endpoints, parallel execution with concurrency control, partial failure handling, and progress tracking. Implement batch processing patterns that accept arrays of operations in a single request, execute them efficiently with database bulk operations, and return per-item results with individual success/failure status.
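The "parallel execution with concurrency control" mentioned above can be sketched with a semaphore that caps in-flight operations while still returning per-item success/failure results. This is a minimal illustration, not a prescribed implementation; `handle_op` and the limit of 10 are assumptions:

```python
import asyncio

async def run_with_concurrency(operations, handle_op, limit=10):
    """Run operations concurrently, capped by a semaphore.

    `handle_op` is a caller-supplied async function; `limit` (here 10)
    would be tuned to downstream capacity in a real service.
    """
    sem = asyncio.Semaphore(limit)

    async def run_one(op):
        async with sem:  # at most `limit` operations execute at once
            try:
                return {"id": op.get("id"), "status": "ok",
                        "result": await handle_op(op)}
            except Exception as exc:
                # Per-item failure handling: one bad op never aborts the batch
                return {"id": op.get("id"), "status": "error",
                        "error": str(exc)}

    # gather preserves input order, so results line up with operations
    return await asyncio.gather(*(run_one(op) for op in operations))
```

Because `gather` preserves order and each item catches its own exceptions, the response array maps one-to-one onto the request array even under partial failure.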

Prerequisites

  • Web framework capable of handling large request bodies (configure body size limits: 10MB+ for batch payloads)
  • Database with bulk operation support (bulk insert, bulk update, transactions)
  • Queue system for async batch processing: Bull/BullMQ (Node.js), Celery (Python), or SQS
  • Progress tracking store (Redis) for long-running batch status polling
  • Rate limiting aware of batch operations (count individual operations, not just requests)

Instructions

  1. Examine existing API endpoints using Read and Grep to identify operations frequently called in loops by consumers, which are candidates for batch equivalents.
  2. Design the batch request format: accept an array of operations in the request body, each with an optional client-provided id for result correlation, e.g., POST /batch with {operations: [{method: "POST", path: "/users", body: {...}, id: "op1"}]}.
  3. Implement synchronous batch processing for small batches (100 items or fewer): validate all items, execute them in a single database transaction, and return per-item results in the form {id, status, result|error}.
  4. Add asynchronous batch processing for large batches (more than 100 items): accept the batch, return 202 Accepted with a batchId and a status polling URL, process the items in a background worker, and update progress in Redis.
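A minimal sketch of the synchronous path in step 3, using SQLite for the transaction; the users table, the single "create user" operation type, and the handler name are illustrative assumptions:

```python
import sqlite3

def process_batch_sync(conn, operations):
    """Execute a small batch in one transaction, returning per-item results.

    Each operation is {"id": ..., "body": {...}}; only a hypothetical
    "create user" operation is modeled. Failed items are reported
    individually while successful items still commit together.
    """
    results = []
    with conn:  # sqlite3 connection context manager: commit on success
        for op in operations:
            try:
                body = op["body"]
                cur = conn.execute(
                    "INSERT INTO users (name, email) VALUES (?, ?)",
                    (body["name"], body["email"]),
                )
                results.append({"id": op.get("id"), "status": "ok",
                                "result": {"userId": cur.lastrowid}})
            except (KeyError, sqlite3.Error) as exc:
                # Catching per item keeps the transaction alive, so the
                # batch commits partial successes rather than all-or-nothing
                results.append({"id": op.get("id"), "status": "error",
                                "error": str(exc)})
    return results
```

Whether a failed item should roll back the whole batch or only itself is a design choice to document in the API contract; this sketch commits the successes and reports the failures.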
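The asynchronous path in step 4 can be sketched as an accept/worker/poll trio. A plain dict stands in for Redis so the sketch is self-contained, and the worker is called directly rather than through a real queue (BullMQ, Celery, SQS); the function names and URL shape are assumptions:

```python
import uuid

# Stand-in for Redis; a real service would use e.g. a hash per batch ID
progress_store = {}

def submit_batch(operations):
    """Accept a large batch: record initial state, return a 202-style payload."""
    batch_id = str(uuid.uuid4())
    progress_store[batch_id] = {"status": "queued",
                                "total": len(operations),
                                "completed": 0,
                                "results": []}
    # A real implementation would enqueue a job here instead of
    # processing inline; the worker below is invoked directly for illustration
    return {"batchId": batch_id, "statusUrl": f"/batches/{batch_id}"}

def run_batch_worker(batch_id, operations, handle_op):
    """Background worker: process each item, updating progress as it goes."""
    state = progress_store[batch_id]
    state["status"] = "running"
    for op in operations:
        try:
            result = handle_op(op)
            state["results"].append({"id": op.get("id"), "status": "ok",
                                     "result": result})
        except Exception as exc:
            state["results"].append({"id": op.get("id"), "status": "error",
                                     "error": str(exc)})
        state["completed"] += 1  # incremental progress for pollers
    state["status"] = "done"

def get_batch_status(batch_id):
    """Body for the status polling endpoint, e.g. GET /batches/{batchId}."""
    return progress_store[batch_id]
```

Clients poll `statusUrl` and read `completed`/`total` for progress; in Redis the state would typically carry a TTL so finished batches expire.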