
I have a table that must be optimized for execution time when bulk-inserting over 10 000 rows.

Table columns

I try to insert the data using PHP, where each row is an element of an array:

   $dataset = [["columnindex" => 1, "rowindex" => 2, "type" => "num", "value" => 400], ...]

The problem is that when I try to insert an array with 100 rows, Postgres does not insert anything, and PDO does not return any errors.

I use Laravel's insert:

 SessionPrepared::insert($dataset);

If I slice the array, the data is added to the database:

 $dataset = array_slice($dataset, 0, 10);
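
The slicing workaround above can be generalized by inserting the dataset in chunks. This is only a sketch, assuming the same `SessionPrepared` model from the question; the chunk size of 1000 is an arbitrary illustrative value kept well below Postgres's limit of 65 535 bound parameters per prepared statement (with 4 columns per row, that limit is reached at roughly 16 000 rows):

```php
// Postgres's extended query protocol caps a single prepared statement
// at 65 535 bound parameters, so very large arrays must be split.
// 1000 rows per chunk is an arbitrary choice for illustration.
foreach (array_chunk($dataset, 1000) as $chunk) {
    SessionPrepared::insert($chunk);
}
```

Wrapping the loop in `DB::transaction(function () { ... })` usually speeds this up noticeably, since otherwise each chunk is committed separately.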
Famida