
The file I am importing has thousands of records, which is slowing down my network. I want to read only the columns I need before inserting the data into the database. When the file is processed, it should look up only those columns rather than scanning the whole file, and fetch the rows for those columns.

Excel::load($path, function ($reader) {

    // Getting the headers from the first row
    $headers = $reader->first()->keys()->toArray();

    // This is the array of required columns
    $headings = array('registrant_name', 'registrant_address', 'registrant_phone', 'registrant_zip',
                      'registrant_email', 'registrant_country', 'registrant_state', 'registrant_city');

});

Data insertion after the file is read:

if ($data->count() > 0)
{
    foreach ($data->toArray() as $value)
    {
        $insert[] = array(
            'registrant_name'    => $value['registrant_name'],
            'registrant_address' => $value['registrant_address'],
            'registrant_phone'   => $value['registrant_phone'],
            'registrant_zip'     => $value['registrant_zip'],
            'registrant_email'   => $value['registrant_email'],
            'registrant_country' => $value['registrant_country'],
            'registrant_state'   => $value['registrant_state'],
            'registrant_city'    => $value['registrant_city']
        );
    }
}

if (!empty($insert))
{
    DB::table('customers')->insert($insert);
}
Jo-ji

1 Answer


On collections, you can use the only method — check the documentation.

i.e.

$headers = $reader->first()->only([
    'registrant_name', 'registrant_address', 'registrant_phone', 'registrant_zip',
    'registrant_email', 'registrant_country', 'registrant_state', 'registrant_city',
]);
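As a plain-PHP illustration of what only() does for a single row (a sketch — the sample row and the shortened column list are made up), array_intersect_key keeps just the required keys:

```php
<?php
// Keep only the required columns from a parsed row.
// array_intersect_key mirrors Collection::only() for plain arrays.
$required = array_flip(['registrant_name', 'registrant_email']);

$row = [
    'registrant_name'  => 'Jane',
    'registrant_email' => 'jane@example.com',
    'extra_column'     => 'ignored',
];

$filtered = array_intersect_key($row, $required);
print_r(array_keys($filtered)); // registrant_name, registrant_email
```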

You might also use chunk to insert the data in smaller batches rather than in one big insert.
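The batched insert can be sketched in plain PHP: array_chunk splits the $insert array (as built in the question) into fixed-size batches. The batch size of 500 and the fake rows are assumptions for illustration — inside the Laravel app, each batch would go to DB::table('customers')->insert($batch):

```php
<?php
// Sketch: split 1200 fake rows into batches of 500 before inserting.
$insert  = array_fill(0, 1200, ['registrant_name' => 'example']);
$batches = array_chunk($insert, 500);

foreach ($batches as $batch) {
    // In the real app:
    // DB::table('customers')->insert($batch);
}

echo count($batches) . "\n";      // 3 batches
echo count(end($batches)) . "\n"; // last batch holds the remaining 200 rows
```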

Could you update the post with your App\Import class so we can help you further?

Nikolas