https://packagist.org/packages/bayareawebpro/laravel-simple-csv
- Import to LazyCollection.
- Export from Collection, LazyCollection, Iterable, Generator, Array.
- Low(er) Memory Consumption through LazyCollection Generators.
- Uses Native PHP SplFileObject.
- Facade Included.
Require the package via Composer and Laravel will auto-discover the service provider.
composer require bayareawebpro/laravel-simple-csv
Invokable classes can be passed to the import method, allowing you to customize how each row is processed. Two classes are supplied: one for handling numeric values and one for converting empty values to null.
use BayAreaWebPro\SimpleCsv\SimpleCsv;
use BayAreaWebPro\SimpleCsv\Casts\EmptyValuesToNull;
use BayAreaWebPro\SimpleCsv\Casts\NumericValues;
$lazyCsvCollection = SimpleCsv::import(storage_path('collection.csv'), [
    EmptyValuesToNull::class,
    NumericValues::class,
]);
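Because import returns a LazyCollection, rows can be streamed straight into batched inserts. A minimal sketch, where the chunk size and the records table are illustrative and not part of the package:

use Illuminate\Support\Facades\DB;

// Stream rows and insert them in batches to keep memory usage low.
$lazyCsvCollection
    ->chunk(500)
    ->each(fn ($rows) => DB::table('records')->insert($rows->all()));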
Dependency Injection: invokable classes can type-hint required dependencies in their constructor when one is defined.
<?php declare(strict_types=1);

namespace App\Csv\Casts;

use Carbon\Carbon;

class Timestamps
{
    /** Invoked for each row in the import collection. */
    public function __invoke(array $item): array
    {
        foreach ($item as $key => $value) {
            if (in_array($key, ['created_at', 'updated_at'])) {
                $item[$key] = Carbon::parse($value);
            }
        }

        return $item;
    }
}
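A custom cast is passed to the import method in the same way as the bundled ones; a short sketch, with an illustrative file name:

use App\Csv\Casts\Timestamps;
use BayAreaWebPro\SimpleCsv\SimpleCsv;

$rows = SimpleCsv::import(storage_path('users.csv'), [
    Timestamps::class,
]);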
use BayAreaWebPro\SimpleCsv\SimpleCsv;
use Illuminate\Support\Collection;
use Illuminate\Support\LazyCollection;
// Collection
SimpleCsv::export(
    Collection::make(...),
    storage_path('collection.csv')
);

// LazyCollection
SimpleCsv::export(
    LazyCollection::make(...),
    storage_path('collection.csv')
);

// Generator (Cursor)
SimpleCsv::export(
    User::query()->where(...)->limit(500)->cursor(),
    storage_path('collection.csv')
);

// Array
SimpleCsv::export(
    [...],
    storage_path('collection.csv')
);
use BayAreaWebPro\SimpleCsv\SimpleCsv;
return SimpleCsv::download([...], 'download.csv');
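A minimal sketch of returning the download from a route; the route path and row data are illustrative:

use BayAreaWebPro\SimpleCsv\SimpleCsv;
use Illuminate\Support\Facades\Route;

Route::get('/users/export', function () {
    // Streams the rows back as a CSV download response.
    return SimpleCsv::download([
        ['name' => 'Ada Lovelace', 'email' => 'ada@example.com'],
        ['name' => 'Alan Turing', 'email' => 'alan@example.com'],
    ], 'download.csv');
});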
use Illuminate\Support\Facades\Config;
Config::set('simple-csv.delimiter', ...);
Config::set('simple-csv.enclosure', ...);
Config::set('simple-csv.escape', ...);
config/simple-csv.php
return [
    'delimiter' => '?',
    'enclosure' => '?',
    'escape' => '?',
];
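For example, to import a semicolon-delimited file you might override the delimiter at runtime before calling import. The delimiter value is illustrative, and the casts argument is assumed to be optional:

use BayAreaWebPro\SimpleCsv\SimpleCsv;
use Illuminate\Support\Facades\Config;

// Applies to this request only; publish the config file for a permanent change.
Config::set('simple-csv.delimiter', ';');

$rows = SimpleCsv::import(storage_path('semicolon-separated.csv'));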
A file-splitting utility is included that breaks large CSV files into chunks (while retaining column headers), which you can move or delete after importing. This can help automate the import of large data sets.
Tip: find your shell binary path with: which sh
/bin/sh vendor/bayareawebpro/laravel-simple-csv/split-csv.sh /Projects/laravel/storage/big-file.csv 5000
File Output:
/Projects/laravel/storage/big-file-chunk-1.csv (chunk of 5000)
/Projects/laravel/storage/big-file-chunk-2.csv (chunk of 5000)
/Projects/laravel/storage/big-file-chunk-3.csv (chunk of 5000)
etc...
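Once split, the chunks might be imported in a simple loop and removed afterwards. A minimal sketch, assuming the chunks live in storage, the records table is illustrative, and the casts argument to import is optional:

use BayAreaWebPro\SimpleCsv\SimpleCsv;
use Illuminate\Support\Facades\DB;

foreach (glob(storage_path('big-file-chunk-*.csv')) as $chunk) {
    // Each chunk retains the header row, so it imports like a normal file.
    DB::table('records')->insert(SimpleCsv::import($chunk)->all());

    // Remove the chunk once it has been imported.
    unlink($chunk);
}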
- Using Lazy Collections is the preferred method for large data sets.
- Using the queue worker, you can import several thousand rows at a time without much impact.
- Be sure to use "Database Transactions" and "Timeout Detection" to ensure safe imports (see the sketch after this list).
- Article: How to Insert & Update Many at Once
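A minimal sketch of a queued import job that wraps the insert in a database transaction and sets a job timeout; the job name, table, chunk size, and timeout value are illustrative, not part of the package:

<?php declare(strict_types=1);

namespace App\Jobs;

use BayAreaWebPro\SimpleCsv\SimpleCsv;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class ImportCsvChunk implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /** Fail the job if it runs longer than two minutes. */
    public $timeout = 120;

    public function __construct(private string $path)
    {
    }

    public function handle(): void
    {
        // Roll back the whole chunk if any insert fails.
        DB::transaction(function () {
            SimpleCsv::import($this->path)
                ->chunk(500)
                ->each(fn ($rows) => DB::table('records')->insert($rows->all()));
        });
    }
}

Dispatching one job per chunk file keeps each transaction small: ImportCsvChunk::dispatch($chunkPath);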