What can you do with it?
Create batch jobs to process large datasets in parallel through queues. Each row in your CSV data becomes a separate work item that is processed by your chosen workflow.

How to use it?
The batch command supports two main operations:

1. Start Batch Job
Create and start a new batch job to process data through queues.

2. Get Batch Status
Check the status of an existing batch job.

Parameters
For Starting a Batch Job:
- queueId: The queue where batch items will be placed (required)
- triggerId: The workflow trigger to execute for each item (required)
- csvData OR csvUrl: The data to process (required, choose one):
  - csvData: Inline CSV data with headers
  - csvUrl: URL pointing to a CSV file
- inputGroupSize (optional): Controls how many CSV rows are sent to each workflow run (default: 1)
  - Size = 1: Each row becomes a separate run
  - Size > 1: Rows are grouped into arrays for batch processing
- onDoneTriggerId (optional): Workflow to trigger when batch completes
- onDoneWebhookUrl (optional): Webhook to call when batch completes
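To see what inputGroupSize does to your dataset, here is a minimal sketch in plain Python (illustrative only; the platform performs this grouping server-side) of parsed CSV rows being chunked into work items:

```python
# Sketch of how inputGroupSize turns CSV rows into work items.
# The platform does this for you -- this just illustrates the effect.

def group_rows(rows, input_group_size=1):
    """Chunk parsed CSV rows into work items of input_group_size rows each."""
    return [rows[i:i + input_group_size]
            for i in range(0, len(rows), input_group_size)]

rows = [{"sku": "A1"}, {"sku": "B2"}, {"sku": "C3"}, {"sku": "D4"}, {"sku": "E5"}]

# Size = 1: each row becomes a separate run (5 work items).
print(len(group_rows(rows, 1)))  # 5
# Size = 2: rows are grouped into arrays (3 work items; the last holds 1 row).
print(len(group_rows(rows, 2)))  # 3
```

Larger groups mean fewer workflow runs, at the cost of each run having to loop over its rows itself.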
For Getting Batch Status:
- queueId: The queue containing the batch (required)
- batchId: The ID of the batch to check (required)
Data Input Options
- Inline CSV Data: Paste CSV directly in the prompt
- CSV URL: Provide a URL to fetch CSV data
- Step Data: Reference CSV data from previous workflow steps
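Whichever option you use, the CSV's first line must be a header row; each data line then becomes one JSON-like object keyed by those headers. This is standard CSV behavior, sketched here with Python's csv module purely for illustration:

```python
import csv
import io

# Inline CSV data: the first line is the header row, and each
# subsequent line becomes one work item keyed by those headers.
csv_data = "sku,price\nA1,9.99\nB2,4.50"

rows = list(csv.DictReader(io.StringIO(csv_data)))
print(rows[0])  # {'sku': 'A1', 'price': '9.99'}
```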
Response Format
Both operations return batch information for tracking progress. Starting a batch returns the newly created batch object, while getting batch status returns the current state of an existing batch. Individual items are processed asynchronously through the queue.

Sample Response: an array containing a single batch object with a batchId.

Examples
Starting a Batch Job
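A minimal start request can be sketched as the payload below. The dict form is an assumption for illustration; the field names (queueId, triggerId, csvData) come from the Parameters section, but the exact invocation depends on how you call the batch command:

```python
# Hypothetical start-batch payload -- field names follow the
# Parameters section above; values are placeholders.
start_batch = {
    "queueId": "queue_123",      # queue where batch items will be placed
    "triggerId": "trigger_abc",  # worker workflow trigger to run per item
    "csvData": "email,name\njo@example.com,Jo\nsam@example.com,Sam",
}
```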
Simple batch with inline data.

Getting Batch Status
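Checking on an existing batch needs only the two required parameters. A hypothetical payload (placeholder values):

```python
# Hypothetical get-batch-status payload: both fields are required.
get_status = {
    "queueId": "queue_123",  # the queue containing the batch
    "batchId": "batch_456",  # the batch to check
}
```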
Check the progress of a running batch.

Advanced Batch Creation
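Combining a remote CSV with a completion trigger can be sketched as the hypothetical payload below (field names from the Parameters section; values are placeholders):

```python
# Hypothetical payload: fetch CSV from a URL and fire a completion
# trigger once every item has been processed.
advanced_batch = {
    "queueId": "queue_123",
    "triggerId": "trigger_abc",
    "csvUrl": "https://example.com/data.csv",  # fetched instead of inline csvData
    "onDoneTriggerId": "trigger_done",         # workflow to run when the batch completes
}
```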
Batch with URL and completion trigger.

Batch Processing with Input Grouping
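Adding inputGroupSize to the payload groups rows into each run instead of one run per row; a hypothetical sketch:

```python
# Hypothetical payload: send 10 CSV rows to each workflow run
# rather than one run per row (default inputGroupSize is 1).
grouped_batch = {
    "queueId": "queue_123",
    "triggerId": "trigger_abc",
    "csvUrl": "https://example.com/large-dataset.csv",
    "inputGroupSize": 10,  # rows per workflow run
}
```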
Process multiple CSV rows together in each workflow run.

Specific Use Case
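A product-update run with a completion webhook can be sketched as the hypothetical payload below (identifiers and URL are placeholders):

```python
# Hypothetical payload: product price updates, with a webhook
# called once the whole batch completes.
product_batch = {
    "queueId": "queue_products",
    "triggerId": "trigger_update_product",
    "csvData": "sku,price\nA1,19.99\nB2,5.25",
    "onDoneWebhookUrl": "https://example.com/hooks/batch-done",
}
```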
Processing product updates with webhook notification.

Using Step Data
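If a previous workflow step produced the CSV, you would pass that step's output as the csvData value. The template expression below is purely a placeholder, not real reference syntax; check how your workflow tool exposes step output:

```python
# Hypothetical: csvData populated from an earlier step's output.
# "{{steps.fetch_csv.output}}" is placeholder syntax for illustration only.
step_batch = {
    "queueId": "queue_123",
    "triggerId": "trigger_abc",
    "csvData": "{{steps.fetch_csv.output}}",
}
```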
Reference data from previous steps.

What is sent to your Worker Workflow
When the batch is running, it calls the worker workflow that you specify for every row or group of rows, passing the data in the input variable. This is what you can expect that input to look like:
Individual Processing (inputGroupSize = 1)
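For a CSV with headers email,name, the worker's input for one row might look like the sketch below (values are illustrative):

```python
# Sketch of the worker's input when inputGroupSize = 1: a single
# CSV row as a JSON-style object, keyed by the CSV headers.
worker_input = {"email": "jo@example.com", "name": "Jo"}
```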
Each workflow run receives a single CSV row as a JSON object.

Grouped Processing (inputGroupSize > 1)
Each workflow run receives multiple CSV rows in a structured format:
- batch: Always true when using input grouping
- batchId: Unique identifier for the entire batch
- batchIndex: Index of this group within the batch (0, 1, 2, etc.)
- totalRows: Total number of CSV rows in this group
- rows: Array containing the actual CSV data as JSON objects
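Putting those fields together, the input for one group might look like the sketch below (identifiers and row values are illustrative):

```python
# Sketch of the worker's input for one group when inputGroupSize > 1,
# using the fields described above.
worker_input = {
    "batch": True,           # always true when input grouping is used
    "batchId": "batch_456",  # identifier for the entire batch
    "batchIndex": 0,         # first group in the batch
    "totalRows": 2,          # number of CSV rows in this group
    "rows": [                # the actual CSV data as JSON-style objects
        {"email": "jo@example.com", "name": "Jo"},
        {"email": "sam@example.com", "name": "Sam"},
    ],
}
```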
Notes
- Each CSV row becomes a separate queue item (when inputGroupSize = 1)
- With inputGroupSize > 1, multiple rows are grouped together in each queue item
- Headers are included with each item for context
- Batch completion triggers fire after all items process
- Monitor batch progress through queue dashboards
- Failed items can be retried individually
- Batches support thousands of items efficiently
- Input grouping reduces the number of workflow runs for large datasets