
I keep running into the "max_allowed_packet" error message, and to date I've just guessed at what the value should be....

I'd like to stop guessing and learn how to determine what the value should be -- and it appears that info might be in the table status output, in the "Data_length" column.

Is that correct, and if so, how do I calculate the value to enter in the MySQL config so I avoid the error before it occurs?

(If it matters, I keep getting these errors on DB restores.)
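(For reference -- a sketch, not an answer to my own question: the value currently in effect can be checked with `SHOW VARIABLES LIKE 'max_allowed_packet';`, and it's set in the `[mysqld]` section of my.cnf. The 128M below is just an example value, not a recommendation.)

```ini
[mysqld]
# example only -- needs to be larger than the biggest single statement/packet
max_allowed_packet = 128M
```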

Questions, feedback, requests -- just comment, thanks!!


Additional info: The database is InnoDB, not MyISAM.


UPDATE (1):

In an effort to find a way to benchmark "max_allowed_packet", I did the following: ran the restore, got the error along with the line number that caused it, then dumped that line into a text file to measure its size.

ERROR:

ERROR 1153 (09S01) at line 2742: Got a packet bigger than "max_allowed_packet" bytes

SED code:

sed 'INSERTLINENUMBERq;d' INSERTDATABASEDUMPFILENAME.sql > INSERTLINENUMBER.txt

Problem is... that line is only 8MB, while "max_allowed_packet" is set to 100MB.
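(A sketch of the same measurement across the whole dump, not just the failing line -- `dump.sql` is a placeholder filename. Since mysqldump's default extended inserts put each INSERT statement on one line, the longest line approximates the largest single statement:)

```shell
# Print the length in bytes of the longest line in the dump.
# LC_ALL=C makes awk's length() count bytes rather than multi-byte characters.
LC_ALL=C awk '{ if (length($0) > max) max = length($0) } END { print max }' dump.sql
```

If the longest line is well under the configured limit and the error still fires, the limit may be set differently on the server side than on the client side.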

blunders

1 Answer


I'm not 100% sure how to profile a max_allowed_packet situation, but I know I have gotten a ~3 GB database export to import with a max_allowed_packet size of 100MB. What I usually do is just set it arbitrarily high, import the data, then set it back down to 1M and restart mysqld.
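(A sketch of that approach -- the values here are illustrative, not recommendations. `SET GLOBAL` requires the SUPER privilege and only affects new connections, so reconnect before importing; editing my.cnf and restarting mysqld, as described above, achieves the same thing.)

```sql
-- temporarily raise the limit (1 GiB, the maximum MySQL allows)
SET GLOBAL max_allowed_packet = 1073741824;
-- ... run the import from a fresh connection ...
-- then drop it back down
SET GLOBAL max_allowed_packet = 1048576;  -- 1 MiB
```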

lilott8
  • @lilott8: Interesting, but that's still just guessing -- and I'm guessing that if you're resetting it to 1M after the import, there's no need to set it to 100mb for the import; the reason being that the error doesn't just happen on import, but also on INSERT, based on my experience. All that said, I really have no idea -- hence the request for a way to find the answer. – blunders Mar 04 '11 at 15:42
  • Right, it would happen on both an import and an INSERT. The mysqldump script will be utilizing the INSERT command. We can start to think through this.... In your error logs there should be something regarding the problematic INSERT statements. If not, turn all logging on full force through your my.cnf. Run a few test cases and see what's in those logs. You can then start setting your max_allowed_packet size based on those INSERTs. – lilott8 Mar 04 '11 at 15:46
  • @lilott8: +1 That's a pretty brute-force approach, plus I'm loading almost 100 million rows, which is taking a few hours; I don't have time to be debugging the max-packet-size that way. Thanks though!!! – blunders Mar 04 '11 at 15:58
  • Well, everything I read just says "put it as high as needed," so I don't know that there is a way to calculate this value. I think best-guessing might be the only way to go, which is a bummer. I'll keep looking, but it doesn't look too promising :( Are you getting any other errors that might suggest key or index buffer sizes could be the problem? Or just what you've described already? – lilott8 Mar 04 '11 at 16:44
  • @lilott8: +1 Thanks, I've in fact only seen the same; that is, just put in a 100mb+ amount. And yes, that's the only error. I did end up setting a 120mb max, but I have no idea what that means, or for that matter the true source of the error; I'm guessing it has to do with the max length of a field, data-wise. Again, thanks! – blunders Mar 04 '11 at 16:53
  • The max_allowed_packet defines the (SELECT or INSERT) size of a query statement. So you're effectively allowing a query to take place that is 120MB big. An example: if you have a large 70M file stored in the DB as a BLOB, you can select it and insert it with a max_allowed_packet size of 100MB. Does that make sense or help? – lilott8 Mar 04 '11 at 17:03
  • @lilott8: +2 Yes, that does help -- in fact enough that I'm selecting you as the answer for now. Thanks!! – blunders Mar 04 '11 at 18:18
  • @lilott8: Based on your info on the source of the error being the query statement I ran a test. The results are posted as "UPDATE(1)" in the description of my question. Let me know if you have any suggestions, thanks! – blunders Mar 04 '11 at 18:47