My current code:
$fileone= fopen("fileone.sql", "r"); //opens file fileone.sql
$fileonewrite = fopen("fileone.rdf", "w"); //this is the file to write to
$fileNum=1;
$i=0;
while (!feof($fileone) ) { //feof = while not end of file
if ($contents = fread($fileonewrite ,53687091200)); //if file is more than 50gb, write to new file (below) .. this doesnt seem to work properly
{ file_put_contents('fileone'.$fileNum.'.rdf',$contents);
$fileNum++;
}
$fileoneRow[] = fgets($fileone); //fgets gets line
$fileoneParts = explode("\t", $fileoneRow[$i]); //explode using tab delimiter
fwrite( " lots of stuff" );
unset($fileoneParts);
}
fclose($fileonetype);
fclose($fileonewrite);
Im reading lots of data and outputting even more, the file created easily gets upto >200GB. This causes a memory problem. So what i would like to do, is when the file being written, e.g, fileone.rdf, gets to 50gb, i want to start writing to filetwo. My code atm, doesnt work quite well as it seems to output thousands of empty files.
Thanks for reading my query, any help, as always is much appreciated.