What can you do with it?

Parses and processes CSV files with dynamic column handling: the command adapts automatically to the file's structure and supports filtering, sorting, and data transformation operations.
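Internally, this kind of dynamic handling can be pictured with Python's `csv.DictReader`, which reads the column names from the header row instead of assuming a fixed schema. A minimal sketch, the function name is illustrative and not the command's actual implementation:

```python
import csv
import io

def discover_columns(text):
    """Read CSV text and report its columns and rows without
    assuming any schema up front (illustrative sketch only)."""
    reader = csv.DictReader(io.StringIO(text))
    rows = list(reader)          # each row is a dict keyed by header name
    return reader.fieldnames, rows

cols, rows = discover_columns("name,city\nAda,London\nAlan,Cambridge\n")
```

Whatever header the file carries becomes the set of keys for every row, which is what "dynamic column handling" amounts to in practice.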

How to use it?

Basic Command Structure

/csv
file: the CSV file to process
action: what to do with the data

Parameters

  • file: CSV file from uploads or previous steps
  • action: Operation to perform (filter, sort, aggregate, transform)
  • filter criteria (optional): Column and value to filter by
  • sort column (optional): Column to sort by
  • output format (optional): How to save results (csv, json)
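The filter and sort actions above can be pictured as a small dispatcher. `run_action` and its parameter names are hypothetical, chosen only to mirror the documented options:

```python
import csv
import io

def run_action(text, action, column=None, value=None):
    """Hypothetical dispatcher mirroring the documented actions;
    only filter and sort are sketched here."""
    rows = list(csv.DictReader(io.StringIO(text)))
    if action == "filter":
        # keep rows whose cell in `column` equals `value`
        return [r for r in rows if r.get(column) == value]
    if action == "sort":
        # stable sort by the named column's string value
        return sorted(rows, key=lambda r: r.get(column, ""))
    return rows
```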

Response Format

Returns the processed data together with metadata: row count, column names, and sample rows.
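A response of that shape can be sketched as a plain dictionary; `summarize` and `sample_size` are illustrative names, not part of the command:

```python
import csv
import io

def summarize(text, sample_size=3):
    """Build a response-shaped dict: row count, column names,
    and a few sample rows (names are illustrative)."""
    reader = csv.DictReader(io.StringIO(text))
    rows = list(reader)
    return {
        "row_count": len(rows),
        "columns": reader.fieldnames,
        "sample_rows": rows[:sample_size],
    }
```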

Examples

Basic Usage

Parse and analyze a CSV file:
/csv
file: customers.csv
action: analyze structure and show sample data

Advanced Usage

Filter data by column value:
/csv
file: sales_data.csv
action: filter rows where region equals "North"
output format: json

Specific Use Case

Sort and transform data:
/csv
file: inventory.csv
action: sort by price descending, keep only top 100 items
output format: csv
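The sort-and-truncate step in this example can be sketched in plain Python; `top_by_price` is a hypothetical helper, and the `price` column name is taken from the example above:

```python
import csv
import io

def top_by_price(text, n=100):
    """Sort rows by a numeric 'price' column, highest first,
    and keep the top n (hypothetical helper)."""
    rows = list(csv.DictReader(io.StringIO(text)))
    rows.sort(key=lambda r: float(r["price"]), reverse=True)
    return rows[:n]
```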

Notes

  • Automatically handles various CSV formats and delimiters
  • Column names are discovered dynamically; no fixed schema is assumed
  • Supports case-insensitive column matching
  • Works with files from file-inputs or artifacts
  • Built-in normalization for common column name variations
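The delimiter handling, case-insensitive matching, and name normalization described above can be approximated with the standard library's `csv.Sniffer` plus a small alias map; the `ALIASES` table and function names here are hypothetical:

```python
import csv
import io

# Hypothetical alias table for common column-name variations
ALIASES = {"e-mail": "email", "e_mail": "email"}

def normalize(name):
    """Lowercase, trim, and map known aliases to a canonical name."""
    key = name.strip().lower()
    return ALIASES.get(key, key)

def read_any(text):
    """Sniff the delimiter, then read with normalized column names."""
    dialect = csv.Sniffer().sniff(text)
    reader = csv.DictReader(io.StringIO(text), dialect=dialect)
    return [{normalize(k): v for k, v in row.items()} for row in reader]
```

Lowercasing both the stored column names and the lookup keys is what makes case-insensitive matching fall out for free.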