Laravel import large data CSV file into MySQL database example. In this tutorial you will learn how to import a huge number of records from a CSV file into a Laravel application using a database seeder.
Often we have thousands or millions of records in a CSV file that we want to store in our database, and for recurring imports we would use cron or queue jobs. But if we only need to import the CSV file once, a database seeder is enough to get the records into the database.
When we use an import feature to store large data sets, we often hit the maximum execution time, because the file is large and the application cannot handle that much work in a single request. So here we have added an example that inserts the lines of a large CSV file into the database in just 2-3 seconds with a Laravel PHP application.
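If the seeder still hits PHP's limits on a very large file, you can temporarily relax them inside the seeder before importing. This is only an optional sketch, not part of the tutorial's code; the values below are assumptions you should tune to your own server.
// Optional: relax PHP limits before the import (example values only).
ini_set('memory_limit', '512M'); // allow more memory for reading and inserting
set_time_limit(300);             // allow the script to run for up to 5 minutes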
Step 1: Install Laravel App
First, create a new Laravel application by running the following command in the terminal:
composer create-project --prefer-dist laravel/laravel laravel-app
Then go into the application directory:
cd laravel-app
Step 2: Connect App to Database
Open the .env file and add your database credentials, such as the database name, username, and password:
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=db name
DB_USERNAME=db user name
DB_PASSWORD=db password
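If the application has a cached configuration from an earlier php artisan config:cache run, the new .env values may not be picked up. Clearing the config cache is a safe extra step:
php artisan config:clear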
Step 3: Create Migration and Model
Next, execute the following command in the terminal to generate the model and migration file:
php artisan make:model Community -m
Add the following code to the up() method in the generated database/migrations/*_create_communities_table.php file (the file name is prefixed with a timestamp):
public function up()
{
    Schema::create('communities', function (Blueprint $table) {
        $table->bigIncrements('id');
        $table->string('state');
        $table->string('community');
        $table->string('district');
        $table->timestamps();
    });
}
Next, run the following command in the terminal to create the table in the database:
php artisan migrate
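You can optionally confirm that the migration ran by checking the migration status:
php artisan migrate:status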
Place the code below in the app/Models/Community.php file:
<?php
namespace App\Models;
use Illuminate\Database\Eloquent\Factories\HasFactory;
use Illuminate\Database\Eloquent\Model;
class Community extends Model
{
    use HasFactory;
    protected $guarded = [];
}
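With $guarded set to an empty array, every column on the model is open to mass assignment. The seeder in the next steps inserts through the query builder, so this mainly matters when you create records through the model later. As a quick, hypothetical illustration (not one of the tutorial's steps), you could run something like this in php artisan tinker; the values are placeholders:
// Any column can be mass assigned because $guarded is empty.
App\Models\Community::create([
    'state'     => 'Example State',
    'community' => 'Example Community',
    'district'  => 'Example District',
]);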
Step 4: Make Seeder
Run the following command to generate a seeder file; this seeder will load the large CSV file into the database in just a few seconds.
php artisan make:seeder CommunitySeeder
Step 5: Update Code in Seeder
Now update the seeder code to import the large CSV file into the Laravel application. Open your database/seeders/CommunitySeeder.php file and put the code below in it:
<?php
namespace Database\Seeders;
use Illuminate\Database\Seeder;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\LazyCollection;
class CommunitySeeder extends Seeder
{
    /**
     * Run the database seeds.
     *
     * @return void
     */
    public function run()
    {
        // Disable the query log so it does not grow in memory during the bulk insert.
        DB::disableQueryLog();
        // Start from an empty table so the seeder can be re-run safely.
        DB::table('communities')->truncate();
        LazyCollection::make(function () {
            // Read the semicolon-delimited CSV line by line and yield one row at a time.
            $handle = fopen(public_path("communities.csv"), 'r');
            while (($row = fgetcsv($handle, 4096, ';')) !== false) {
                yield $row;
            }
            fclose($handle);
        })
        ->skip(1) // skip the header row
        ->chunk(1000) // insert 1,000 rows per query
        ->each(function (LazyCollection $chunk) {
            $records = $chunk->map(function ($row) {
                return [
                    "state" => $row[0],
                    "community" => $row[1],
                    "district" => $row[2]
                ];
            })->toArray();
            DB::table('communities')->insert($records);
        });
    }
}
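For reference, the seeder above expects a semicolon-delimited file named communities.csv in the public/ directory, with a header row followed by the state, community and district columns. The tutorial does not include the actual file, so the rows below are only placeholder data showing the expected layout:
state;community;district
Bavaria;Munich;Upper Bavaria
Hesse;Frankfurt;Darmstadt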
Step 6: Run Seeder and Test
Now, run your application using the following command:
php artisan serve
Next, run the seeder using the command below:
php artisan db:seed --class=CommunitySeeder
Now check your database; it should contain all of the records from the CSV file.
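As an optional sanity check, you can count the imported rows from php artisan tinker (assuming the Community model from Step 3):
// Inside php artisan tinker
App\Models\Community::count(); // total number of imported rows
App\Models\Community::first(); // inspect the first imported record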