CSV Parse with Streaming
Parse large CSV files using Node.js streams with row-by-row processing and backpressure handling.
import { createReadStream } from 'fs';
import { createInterface } from 'readline';

interface CsvRow {
  [key: string]: string;
}

export async function* parseCsv(filePath: string): AsyncGenerator<CsvRow> {
  const stream = createReadStream(filePath, { encoding: 'utf-8' });
  // crlfDelay: Infinity treats \r\n as a single line break (Windows files).
  const rl = createInterface({ input: stream, crlfDelay: Infinity });

  let headers: string[] = [];
  let isFirst = true;

  for await (const line of rl) {
    if (isFirst) {
      // First line holds the column names; strip surrounding quotes.
      headers = line.split(',').map((h) => h.trim().replace(/^"|"$/g, ''));
      isFirst = false;
      continue;
    }
    // Note: a plain split(',') does not handle commas inside quoted fields.
    const values = line.split(',').map((v) => v.trim().replace(/^"|"$/g, ''));
    const row: CsvRow = {};
    headers.forEach((h, i) => (row[h] = values[i] ?? ''));
    yield row; // for await...of pauses the underlying stream between rows (backpressure)
  }
}
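The plain line.split(',') in the parser breaks on fields that contain commas inside quotes. A quote-aware splitter can be swapped in where split(',') is used; this is a minimal sketch assuming RFC 4180-style quoting (splitCsvLine is a name introduced here, not part of the snippet):

```typescript
// Hypothetical quote-aware splitter (assumes RFC 4180-style quoting).
// Handles commas inside quoted fields and doubled quotes ("") as escapes,
// which a simple split(',') does not.
function splitCsvLine(line: string): string[] {
  const fields: string[] = [];
  let current = '';
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') {
          current += '"'; // doubled quote inside a quoted field
          i++;
        } else {
          inQuotes = false; // closing quote
        }
      } else {
        current += ch;
      }
    } else if (ch === '"') {
      inQuotes = true; // opening quote
    } else if (ch === ',') {
      fields.push(current); // unquoted comma ends the field
      current = '';
    } else {
      current += ch;
    }
  }
  fields.push(current); // last field has no trailing comma
  return fields;
}
```

Replacing both split(',') calls with splitCsvLine(line) makes the parser safe for quoted values such as "Smith, Jane".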
// Usage:
// for await (const row of parseCsv('data.csv')) {
// console.log(row);
// }
Use Cases
- Data import pipelines
- ETL processing
- Log file analysis
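For the data-import use case above, a common pattern is grouping parsed rows into fixed-size batches for bulk inserts. A minimal sketch (the batch helper, the 500-row size, and the db.bulkInsert call are assumptions for illustration, not part of the snippet):

```typescript
// Hypothetical helper: groups items from any async iterable into fixed-size
// batches, e.g. rows from parseCsv() feeding a bulk database insert.
export async function* batch<T>(
  source: AsyncIterable<T>,
  size: number
): AsyncGenerator<T[]> {
  let buffer: T[] = [];
  for await (const item of source) {
    buffer.push(item);
    if (buffer.length === size) {
      yield buffer;
      buffer = [];
    }
  }
  if (buffer.length > 0) yield buffer; // flush the final partial batch
}

// Usage with the parser above:
// for await (const rows of batch(parseCsv('data.csv'), 500)) {
//   await db.bulkInsert(rows); // hypothetical bulk-insert call
// }
```

Because batch also consumes its source with for await...of, the file stream stays paused while each batch is being written, preserving the backpressure behavior of the parser.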