
I've got an array of strings, each of which is a SQL "INSERT INTO ..." query. I'm looping through this array, combining every 50 of them into a new string, and then sending that to the MySQL database. After that I reset my temporary string and prepare it for the next queries.

But my loop stops after the first batch and doesn't send everything, so on the DB server I always end up with only 50 records. How can I fix my code so that everything is sent?

code:

$data; // array of SQL queries
$queryPartial = ''; // accumulates 50 queries to send at once

foreach ($data as $index => $queryToSend) {

    $queryPartial .= $queryToSend;

    // send partial batch of queries
    if ($index % 50 == 0) {
        if ($connection->multi_query($queryPartial)) {
            $this->output->writeln('successful query number: ' . $index);
        }
        $queryPartial = ''; // clear string for the next batch
    }
}

// send the rest of the remaining queries
if ($queryPartial !== '') {
    if ($connection->multi_query($queryPartial)) {
        $this->output->writeln('rest of the queries sent');
    }
}

$connection->close();
Bill Karwin
ampher911

2 Answers


To address why your code isn't working, I'd suggest looking at the full string of $queryPartial before you submit it as a query. I bet there's a syntax error because you're not separating the statements with ; or something like that.

Looking at the query itself, instead of your code, will probably make it more clear where the mistake is.

You should also check for error status returned by the mysqli functions. Most mysqli functions return FALSE if an error occurs, and you should check $connection->error for the error message. If you don't do that, errors might occur, but you'll never know.

I would guess that you're using multi-query to save on database calls, assuming that making fewer calls will help performance.

But the cost of making the call is small compared to the cost of the transactions that insert data. There's practically no benefit to using multi_query().

The following code does the same thing, has virtually the same performance, and is much simpler:

foreach ($data as $index => $queryToSend) {
    $result = $connection->query($queryToSend);
    if ($result === false) {
        error_log("SQL: $queryToSend ERROR: {$connection->error}");
    }
}

(always check for error status returned by mysqli functions!)

If you're trying to speed up performance of bulk inserts, using multi_query() has insignificant benefit compared to other optimizations:

  • Combine multiple rows into a single INSERT statement
  • Combine multiple statements into a single transaction
  • Use LOAD DATA INFILE instead of INSERT
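
To illustrate the second optimization, here's a minimal sketch of wrapping the per-row INSERTs in one transaction (assuming a connected mysqli `$connection` and the `$data` array from the question; autocommit is on by default, so without this each INSERT is its own transaction):

```php
// Sketch: run all the single-row INSERTs inside one transaction,
// so the server commits once instead of once per row.
$connection->begin_transaction();
$ok = true;

foreach ($data as $queryToSend) {
    if ($connection->query($queryToSend) === false) {
        // always check the error status returned by mysqli functions
        error_log("SQL: $queryToSend ERROR: {$connection->error}");
        $ok = false;
        break;
    }
}

// commit everything together, or roll back if any statement failed
$ok ? $connection->commit() : $connection->rollback();
```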

See my presentation Load Data Fast! for performance comparisons.

Bill Karwin
  • First, thank you for such a comprehensive response. I'm sure that the queries themselves are not a problem, since all of them are generated automatically by my previous code and possess a semicolon at the end. As you suggested, I've added a check of $connection->error and get multiple "Commands out of sync; you can't run this command now" messages with each loop iteration. What does it mean? – ampher911 Jan 19 '18 at 02:40
  • See https://stackoverflow.com/questions/4801166/commands-out-of-sync-you-cant-run-this-command-now-caused-by-mysqlimulti – Bill Karwin Jan 19 '18 at 02:48

I realize that this is old, but this isn't how mysqli_multi_query works and no one has posted the actual problem with the original code. Although this comment alludes to it.

What you would actually need to do is to process each result from the first fifty inserts. That looks like this or this. Because you aren't doing that, it blocks all future queries on the connection until you process those results. This is why you can only do one batch. It's waiting for you to finish the first batch by calling next_result forty-nine times (usually done in a loop). Until then, the connection is blocked. You can't run query or multi_query on that connection.
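A minimal sketch of that result-draining loop (assuming a connected mysqli `$connection` and a batch string `$queryPartial` of `;`-separated statements):

```php
// After multi_query(), every statement's result must be consumed before the
// connection will accept another query() or multi_query() call.
if ($connection->multi_query($queryPartial)) {
    do {
        // INSERTs produce no result set, but store_result() must still be
        // called for each statement; free any result set it does return.
        if ($result = $connection->store_result()) {
            $result->free();
        }
        // advance to the next statement's result until none remain
    } while ($connection->more_results() && $connection->next_result());
}
if ($connection->error) {
    error_log("ERROR: {$connection->error}");
}
```

Skipping this loop is exactly what produces the "Commands out of sync" error mentioned in the comments above.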

However, if you want to do multiple inserts at once, you don't need multi_query. As already noted, MySQL supports

INSERT INTO table (column1, column2) VALUES (1, 'A'), (2, 'A'), (3, 'A'), (4, 'A'), (5, 'A')

That would insert five rows into the table. Then to insert five more rows, just do it again.

INSERT INTO table (column1, column2) VALUES (6, 'A'), (7, 'A'), (8, 'A'), (9, 'A'), (10, 'A')

This seems closest to the original intent of your code. And it's better than how multi_query works, because it executes just one statement per batch, while your multi_query version still executes one statement per row. multi_query merely lets you bundle the fifty statements into one send; to receive the results back, you still have to add quite a bit of boilerplate code, because each statement's response has to be processed.
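
As a sketch of that intent, here's a small helper that chunks rows into batches and builds one multi-row INSERT per batch. The table and column names (`my_table`, `column1`, `column2`) and the `[int, string]` row shape are assumptions for illustration; `$escape` lets the caller pass in `$connection->real_escape_string`:

```php
// Build one multi-row INSERT statement per batch of rows.
function buildBatchedInserts(array $rows, int $batchSize, callable $escape): array
{
    $statements = [];
    foreach (array_chunk($rows, $batchSize) as $batch) {
        $values = [];
        foreach ($batch as $row) {
            // first column is an integer, second is an escaped string
            $values[] = sprintf("(%d, '%s')", $row[0], $escape($row[1]));
        }
        $statements[] = 'INSERT INTO my_table (column1, column2) VALUES '
            . implode(', ', $values);
    }
    return $statements;
}

// Usage against a live connection: one plain query() per batch, no multi_query,
// so there are no extra result sets to drain.
// foreach (buildBatchedInserts($rows, 50, [$connection, 'real_escape_string']) as $sql) {
//     if ($connection->query($sql) === false) {
//         error_log("SQL: $sql ERROR: {$connection->error}");
//     }
// }
```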

If you want to increase performance by inserting multiple rows at once, that is possible. That's just not what multi_query does for you. The only time that multi_query makes some sense is when you have a file containing a list of SQL commands. Then you can pass that to multi_query and save some processing. If you are generating the SQL yourself, there's no need for multi_query. Just generate it in a format that lends itself to how MySQL works.

mdfst13