Python CSV Processing Examples
Read, write, and transform CSV files using the csv module and pandas with encoding and dialect handling.
import csv
from pathlib import Path
import pandas as pd
# Read CSV with the csv module
with open("input.csv", newline="", encoding="utf-8") as f:
    reader = csv.DictReader(f)
    rows = list(reader)
print(f"Read {len(rows)} rows, columns: {list(rows[0].keys()) if rows else 'none'}")
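The description mentions dialect handling, which the snippet above does not show. One way to cope with an unfamiliar delimiter is `csv.Sniffer`, which guesses the dialect from a text sample. A minimal sketch, using a hypothetical semicolon-delimited sample in place of a real file:

```python
import csv
from io import StringIO

# Hypothetical sample standing in for the contents of a real file.
sample = "id;name;amount\n1;Alice;100.50\n2;Bob;200.75\n"

# Sniffer inspects the sample and returns a Dialect describing what it found.
dialect = csv.Sniffer().sniff(sample, delimiters=",;\t")
reader = csv.DictReader(StringIO(sample), dialect=dialect)
rows = list(reader)
```

For a real file, read the first few kilobytes with `f.read(4096)`, sniff that, then `f.seek(0)` before constructing the reader.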
# Write CSV with headers
output_rows = [
{"id": 1, "name": "Alice", "amount": 100.50},
{"id": 2, "name": "Bob", "amount": 200.75},
]
with open("output.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "name", "amount"])
    writer.writeheader()
    writer.writerows(output_rows)
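The same `DictWriter` pattern extends to other dialects by passing formatting parameters. A minimal sketch writing the rows above as tab-separated output; the `report.tsv` path is an assumption for illustration:

```python
import csv

output_rows = [
    {"id": 1, "name": "Alice", "amount": 100.50},
    {"id": 2, "name": "Bob", "amount": 200.75},
]

# "report.tsv" is a hypothetical output path.
with open("report.tsv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(
        f,
        fieldnames=["id", "name", "amount"],
        delimiter="\t",                # tab-separated instead of comma
        quoting=csv.QUOTE_MINIMAL,     # quote only fields that need it
    )
    writer.writeheader()
    writer.writerows(output_rows)
```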
# Pandas: read with options
df = pd.read_csv(
    "input.csv",
    encoding="utf-8",
    dtype={"id": str, "amount": float},
    parse_dates=["date"],
    na_values=["", "N/A", "null"],
    usecols=["id", "name", "amount", "date"],
)
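When the encoding of an incoming file is unknown, a common pattern is to try a short list of encodings in order. A minimal sketch; the function name and encoding list are assumptions, not part of the original snippet:

```python
import pandas as pd

def read_csv_any_encoding(path, encodings=("utf-8", "utf-8-sig", "latin-1")):
    """Try each encoding in order and return the first DataFrame that decodes."""
    for enc in encodings:
        try:
            return pd.read_csv(path, encoding=enc)
        except UnicodeDecodeError:
            continue
    # latin-1 maps every byte, so this line is only reached if it is not listed
    raise ValueError(f"none of {encodings} could decode {path}")
```

Keep `latin-1` last: it never raises a decode error, so it guarantees a result, but it may mis-render text that was actually written in another encoding.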
# Pandas: write with options
df.to_csv("clean_output.csv", index=False, encoding="utf-8", float_format="%.2f")
# Process large CSV in chunks
for chunk in pd.read_csv("large_file.csv", chunksize=10_000):
    processed = chunk[chunk["status"] == "active"]
    processed.to_csv(
        "filtered.csv", mode="a", header=not Path("filtered.csv").exists(), index=False
    )

Use Cases
- Reading and cleaning CSV data files
- Processing large CSV files in streaming chunks
- Converting between CSV formats and encodings
Related Snippets
Similar patterns you can reuse in the same workflow.
- CSV Parse with Streaming: Parse large CSV files using Node.js streams with row-by-row processing and backpressure handling.
- Pandas DataFrame Transformations: Common pandas DataFrame transformations including column operations, type casting, and string methods.
- Python ETL Pipeline Example: Complete extract-transform-load pipeline with error handling, logging, and incremental processing.
- Python Batch Processing Script: Process large files in configurable batches with progress tracking, error handling, and resume support.