I have been using a MySQL database backup script written in PHP, and backing up my databases takes 21 hours. I want to run it as a daily backup through the CLI and cron, and was wondering if you guys could take a look at it and see if there is any way I can optimize it.
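For reference, the daily cron entry would look something like this (the script path and log location are placeholders, not my real paths):

```shell
# Run the backup every night at 02:30 and keep the output for troubleshooting.
30 2 * * * php /path/to/backup.php >> /var/log/db-backup.log 2>&1
```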
<?php
//Timer start
$start = microtime(true);

ini_set('memory_limit', '4000M');
//ini_set('max_execution_time', 300);

$host = "host";
$user = "user";
$pass = "pass";

$link = mysql_connect($host, $user, $pass) or die('Connect failed: ' . mysql_error() . "\n");

$dbs = array();
$result = mysql_query("SHOW DATABASES LIKE 'tag%'"); // we only want the tag* databases
while ($row = mysql_fetch_row($result)) {
    $dbs[] = $row[0];
}

foreach ($dbs as $db) {
    if (strlen($db) == 10 || $db == "tagdb" || $db == "tagui") {
        echo $db . "\n";
        backup_tables($host, $user, $pass, $db);
    }
}
//backup_tables($host,$username,$password,$db);
/* backup the db OR just a table */
function backup_tables($host, $user, $pass, $name, $tables = '*')
{
    $fname = 'db-backup-' . $name . '_' . time() . '.sql';
    echo $fname . "\n";

    // Create (or truncate) the dump file up front.
    $handle = fopen($fname, 'w+');
    fclose($handle);

    $link = mysql_connect($host, $user, $pass);
    mysql_select_db($name, $link);

    // Get all of the tables.
    if ($tables == '*') {
        $tables = array();
        $result = mysql_query('SHOW TABLES');
        while ($row = mysql_fetch_row($result)) {
            $tables[] = $row[0];
        }
    } else {
        $tables = is_array($tables) ? $tables : explode(',', $tables);
    }
    $handle = fopen($fname, 'a');
    foreach ($tables as $table) {
        fwrite($handle, 'DROP TABLE IF EXISTS `' . $table . '`;');
        $row2 = mysql_fetch_row(mysql_query('SHOW CREATE TABLE `' . $table . '`'));
        fwrite($handle, "\n\n" . $row2[1] . ";\n\n");

        // Dump the table in chunks of $offset rows so the whole table
        // is never held in memory at once.
        $offset = 10000;
        $row_start = 0;
        do {
            $result = mysql_query('SELECT * FROM `' . $table . '` LIMIT ' . $row_start . ', ' . $offset);
            $row_start += $offset;

            // Check for failure BEFORE touching $result.
            if (false === $result) {
                // Log the error next to the dump file and give up on this table.
                $errf = fopen($fname . '.ERROR', 'a');
                fwrite($errf, mysql_error());
                fclose($errf);
                break;
            }

            $num_rows = mysql_num_rows($result);
            while ($row = mysql_fetch_row($result)) {
                $line = 'INSERT INTO `' . $table . '` VALUES(';
                foreach ($row as $value) {
                    $value = addslashes($value);
                    $value = str_replace("\n", "\\n", $value); // ereg_replace is deprecated
                    $line .= '"' . $value . '",';
                }
                $line = substr($line, 0, -1); // cut the final ','
                $line .= ");\n";
                fwrite($handle, $line);
            }
            mysql_free_result($result); // release this chunk before fetching the next one
        } while ($num_rows !== 0);
    }

    fwrite($handle, "\n\n\n");
    fclose($handle);
}
//End timer and output the elapsed time
$total_time = round(microtime(true) - $start, 4);
echo "\nBackup completed in " . $total_time . " seconds.\n";
?>
I was thinking about splitting the backups into parallel processes, but my databases are huge and I'm not sure whether PHP will run out of memory; it has before. Suggestions are welcome.
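If splitting does turn out to be safe, my rough plan is a small shell wrapper that backgrounds one PHP process per database, so each one gets its own memory space and its own memory_limit instead of a single process holding everything. A sketch of the idea (the echo is a stand-in; the real call would be the PHP script, adapted to take the database name as an argument):

```shell
#!/bin/sh
# Sketch: dump each database in its own background process.
dump_db() {
    echo "dumping $1"    # stand-in for: php backup.php "$1" (hypothetical argument handling)
}
for db in tagdb tagui; do
    dump_db "$db" &      # each db gets its own process and memory space
done
wait                     # don't exit until every dump has finished
```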
Thanks!
P.S. I know there are better ways to do a backup, but this particular script works well for me; the other options, such as mysqldump, are not available to me.