
I'm trying to work out why my database dump won't import on my new server. I'm using:

mysql --max_allowed_packet=512M -uDB_USERNAME -pPASS DB_NAME < /root/backup.sql

After it has been running for a while, I get this in /var/log/mysql/error.log:

2017-09-19T15:00:06.077222Z 130 [Note] Aborted connection 130 to db: 'DB_NAME' user: 'DB_USER' host: 'localhost' (Got a packet bigger than 'max_allowed_packet' bytes)

and the "mysql" CLI call comes back with:

ERROR 2006 (HY000) at line 5995: MySQL server has gone away

Here is some of the other MySQL config:

key_buffer_size     = 16M
max_allowed_packet  = 512M
thread_stack        = 192K
thread_cache_size       = 8
# This replaces the startup script and checks MyISAM tables if needed
# the first time they are touched
myisam-recover-options  = BACKUP
#max_connections        = 100
#table_cache            = 64
#thread_concurrency     = 10
#

I'm at a bit of a loss as to what to try. Any suggestions? The SQL file is 7 GB, and the table it keeps dying on is 5 GB on the old server.
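In case it matters, this is the sort of check I can run on the new server to see what limit the running mysqld actually has in effect (the 1 GB value below is only an example, not something I've settled on):

SHOW VARIABLES LIKE 'max_allowed_packet';
SET GLOBAL max_allowed_packet = 1073741824;  -- 1 GB; new client connections pick this up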

Thanks!

Andrew Newby
  • well, if your SQL file is 7 GB, why are you trying to import it with a 512M limit? – bocian85 Sep 19 '17 at 15:20
  • see if this answer helps you: https://stackoverflow.com/questions/25663786/import-db-with-max-allowed-packet-limitation-issue-and-no-admin-access – bocian85 Sep 19 '17 at 15:23
  • @bocian85 - why would it matter how large the SQL file is? The max_allowed_packet is for the QUERY, no? So in my case, if there was a query in the SQL file that tried to run, and it was > 512M, it would fail... correct? – Andrew Newby Sep 19 '17 at 15:50
  • `The max_allowed_packet is for the QUERY, no?` No, it applies to any SQL statement sent to the MySQL server, including the `insert into` statements your SQL file certainly has. You may have some big BLOB columns, we don't know. It could also be a buffer-length limit. But first, try increasing `max_allowed_packet` to, say, 1GB and see what happens. – krisFR Sep 19 '17 at 17:32
  • @krisFR - the columns are just plain TEXT columns, and at most a couple of MB (just text versions of emails sent). I'm going to try `--skip-extended-insert` on that one table, to see if that helps (a rough sketch of the command is below these comments). I don't really want to do it on the other tables as well, as that will make the SQL file even larger. – Andrew Newby Sep 20 '17 at 05:55
  • OK, well this isn't really the solution to the problem I was having - but I've found a workaround. I found out there was a `message_body` column in the table. This was basically a slurp of the incoming email (in its entirety). As such, there were some pretty large things in there (as it also holds the base64 encoding for attachments, such as images). My guess would be that someone sent some photos along in their reply, and this was then entered into the table. Seeing as we don't actually use that data, I've just deleted that column, and tweaked the email import script so it doesn't add it again. – Andrew Newby Sep 20 '17 at 08:16
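Rough sketch of the commands mentioned in the last two comments. The table name `emails` is only a placeholder (the real table isn't named above), and the credentials are the same placeholders as in the original import command:

mysqldump -uDB_USERNAME -pPASS --skip-extended-insert DB_NAME emails > /root/emails_one_insert_per_row.sql
mysql -uDB_USERNAME -pPASS DB_NAME -e "ALTER TABLE emails DROP COLUMN message_body;"

The first writes one INSERT per row for just that table; the second drops the unused `message_body` column so the huge email bodies never end up in a dump again.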

1 Answer


At line 5995 of your input you should find a statement longer than 512M. Adjust max_allowed_packet to cover it, times 1.25 for a little breathing space.
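If a viewer won't open the file, something along these lines from the shell should report lengths without loading the whole file (an untested sketch, using the dump path from your question):

# print the length of line 5995, then stop reading
awk 'NR==5995 { print length($0); exit }' /root/backup.sql

# print the line number and length of the longest line in the dump
awk 'length($0) > max { max = length($0); n = NR } END { print n, max }' /root/backup.sql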

Wilson Hauck
  • thanks, the problem is that I can't even see that line in LTFViewer (for viewing large files). The furthest it gets is line 800, and then it dies on me (probably too wide a line). Eugh – Andrew Newby Sep 20 '17 at 06:35
  • @Wilson_Hauck - eugh, this is doing my head in. Even LTFViewer crashes when I try to find the offending line. I've googled around loads, but can't find any decent freeware that can read a very long line (it must be a long line to be crashing LTFViewer, as I'm on a decent spec machine) – Andrew Newby Sep 20 '17 at 07:41
  • 1
    @AndrewNewby editpadlite.com software will allow you to see the long line. And much more. – Wilson Hauck Sep 20 '17 at 12:53