Data exports sound simple but always have edge cases around large datasets, memory limits, and encoding. Let Claude write the streaming version from the start — before you discover the memory issue at 3am.
"Write a Laravel data export for the orders table:
- Export filtered results (date range, status, customer) as CSV and Excel
- Stream large exports using LazyCollection to avoid memory exhaustion
- Queue exports over 50k rows and email a download link when ready
- Include relationships: customer name, line items, shipping address
- Handle currency formatting, timezones, and UTF-8 encoding correctly
Add Pest tests that assert the correct headers and row count."
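Before the streaming part, the formatting bullets are worth pinning down. Here is a minimal plain-PHP sketch of what "handle currency, timezones, and UTF-8 correctly" tends to mean in practice; the helper names and the BOM choice are illustrative, not part of the prompt:

```php
<?php
// Illustrative formatting helpers (names are hypothetical).

// Money stored in minor units (cents) becomes a fixed two-decimal string.
function formatCurrency(int $cents): string
{
    return number_format($cents / 100, 2, '.', '');
}

// Timestamps stored in UTC are converted to the viewer's timezone for display.
function formatOrderDate(string $utc, string $tz): string
{
    $dt = new DateTime($utc, new DateTimeZone('UTC'));
    $dt->setTimezone(new DateTimeZone($tz));
    return $dt->format('Y-m-d H:i');
}

// A UTF-8 BOM up front makes Excel detect the encoding instead of
// mangling non-ASCII customer names.
$handle = fopen('php://memory', 'w+');
fwrite($handle, "\xEF\xBB\xBF");
fputcsv($handle, ['ID', 'Customer', 'Total', 'Date']);
fputcsv($handle, [
    1042,
    'Müller GmbH',
    formatCurrency(129900),                                  // "1299.00"
    formatOrderDate('2024-03-01 14:00:00', 'Europe/Berlin'), // "2024-03-01 15:00"
]);
rewind($handle);
echo stream_get_contents($handle);
fclose($handle);
```

Getting these helpers into the prompt explicitly (minor units or floats? which timezone wins?) saves a round of corrections later.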
The streaming part is where most hand-written exports fall over. Claude reaches for Eloquent's lazy(), which returns a LazyCollection that runs chunked queries under the hood, keeping memory flat regardless of result size:
```php
return response()->streamDownload(function () use ($filters) {
    $handle = fopen('php://output', 'w');

    fputcsv($handle, ['ID', 'Customer', 'Total', 'Status', 'Date']);

    // lazy() pulls rows in chunks behind the scenes, so memory stays flat
    // no matter how many orders match the filters.
    Order::with(['customer', 'items'])
        ->filter($filters)
        ->lazy()
        ->each(fn ($order) => fputcsv($handle, $order->toCsvRow()));

    fclose($handle);
}, 'orders-export.csv', ['Content-Type' => 'text/csv']);
```
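The 50k-row queue threshold from the prompt sits in front of this: count first, and only stream synchronously below the cutoff. A hedged sketch of the controller branch, where ExportOrdersJob and streamCsv() are hypothetical names rather than Laravel built-ins:

```php
$count = Order::filter($filters)->count();

if ($count > 50_000) {
    // Heavy exports run on the queue; the job writes the file to storage
    // and emails a signed download link when it finishes.
    ExportOrdersJob::dispatch($filters, $request->user());

    return back()->with('status', 'Export queued; you will receive a download link by email.');
}

return $this->streamCsv($filters); // the streamDownload closure above
```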
For the Excel variant, specify the package you want, Laravel Excel (maatwebsite/excel) or raw PhpSpreadsheet; Claude then generates the correct export class with proper column widths, date formatting, and a styled header row.
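With Laravel Excel, for instance, the generated class typically implements the package's FromQuery and WithHeadings concerns, so the query is still chunked by the package rather than loaded whole. A sketch, with the column list abbreviated:

```php
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\ShouldAutoSize;
use Maatwebsite\Excel\Concerns\WithHeadings;

class OrdersExport implements FromQuery, WithHeadings, ShouldAutoSize
{
    public function __construct(private array $filters) {}

    public function query()
    {
        // FromQuery hands the package a query builder, letting it chunk
        // the rows itself, so Excel exports stay memory-safe like the CSV path.
        return Order::query()->with(['customer', 'items'])->filter($this->filters);
    }

    public function headings(): array
    {
        return ['ID', 'Customer', 'Total', 'Status', 'Date'];
    }
}

// Usage: return Excel::download(new OrdersExport($filters), 'orders.xlsx');
```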
Export features that crash on large datasets aren't features — let Claude write the streaming version first.