
I've got a script to back up a directory on a hosted webserver. The directory to compress is about 1 GB.

<?php
error_reporting(E_ALL);
ini_set('display_errors', 1);
require_once 'Archive/Tar.php'; // PEAR Archive_Tar

// Directory to back up, resolved to an absolute path
$dir = realpath(dirname(__FILE__) . '/../../dir_to_zip/');
// Timestamped archive file next to this script
$archivname = dirname(__FILE__) . '/backup_' . basename($dir) . date('_Y-m-d_His') . '.tar.gz';

$archiv = new Archive_Tar($archivname, 'gz');
$archiv->createModify($dir, '');
echo 'done';

However, when I call the script via browser (Firefox) or via the hoster's cron job, multiple partial archives are created:

File                                          Size
backup_dir_to_zip_2023-04-01_143924.tar.gz   129 MB
backup_dir_to_zip_2023-04-01_143954.tar.gz   130 MB
backup_dir_to_zip_2023-04-01_144024.tar.gz   135 MB
backup_dir_to_zip_2023-04-01_144054.tar.gz   139 MB
backup_dir_to_zip_2023-04-01_144125.tar.gz   159 MB
backup_dir_to_zip_2023-04-01_144156.tar.gz   140 MB

It is very suspicious that the partial archives are created every 30 seconds. It looks as if some keep-alive/refresh mechanism in the browser restarts the script again and again, so each run starts a fresh archive before the previous one finishes.
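If overlapping invocations are indeed the cause, my idea is to guard the script with a lock file and lift the execution time limit, so only one run can create an archive at a time. A minimal sketch (the lock file name backup.lock is my own choice, not part of the original script):

<?php
// Take an exclusive, non-blocking lock; a second invocation exits at once
$lock = fopen(dirname(__FILE__) . '/backup.lock', 'c');
if ($lock === false || !flock($lock, LOCK_EX | LOCK_NB)) {
    exit('backup already running');
}
set_time_limit(0);        // lift PHP's execution time limit for the long run
ignore_user_abort(true);  // keep running even if the browser disconnects

// ... create the tar.gz archive here, as in the script above ...

flock($lock, LOCK_UN);
fclose($lock);

With this guard, any repeated request would print "backup already running" instead of starting another archive.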

I am also happy to hear about alternatives for compressing a large directory (I know, a very common problem). I have tried ZipArchive, but I get a ZipArchive::ER_MULTIDISK error when using the addFile method, and a memory overflow when using the addFromString method.
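One alternative I have been considering is to shell out to the system tar, which streams straight to disk and avoids PHP's memory limits entirely. A rough sketch, assuming the hoster allows exec() and provides a tar binary (not guaranteed on shared hosting):

<?php
$dir = realpath(dirname(__FILE__) . '/../../dir_to_zip/');
$archivname = dirname(__FILE__) . '/backup_' . basename($dir) . date('_Y-m-d_His') . '.tar.gz';

// -C changes into the parent directory so the archive stores relative paths
$cmd = sprintf(
    'tar -czf %s -C %s %s',
    escapeshellarg($archivname),
    escapeshellarg(dirname($dir)),
    escapeshellarg(basename($dir))
);
exec($cmd, $output, $status);
echo $status === 0 ? 'done' : 'tar failed with status ' . $status;

Would that be a reasonable approach, or is there a pure-PHP way that handles 1 GB reliably?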

Chris
