
"receiver.php" file receives ±1000 Ajax post requests per second with $array data which is written with following code to a file.csv:

$file = new SplFileObject( __DIR__ . '/file.csv', 'a' ); // note the '/' before the file name
$file->fputcsv( $array, "|", "'" );
$file = null;

Questions:

  1. Will each request's data be properly appended to file.csv? Or, if several requests coincide at the moment of writing, will their data be lost because of "file locking"?

  2. What happens if, while those 1,000 requests per second are writing to file.csv, another process started by the cron service steps in and begins reading file.csv? Will the 1,000 requests still be able to keep appending data, or will they be "hitting a wall" for as long as the cron process is working with file.csv, so that their data is not inserted and is lost?

Overall, I am simply interested in whether data could be lost in such cases or not.

Sid

1 Answer


PHP does not lock files by default (if I am not mistaken); you can lock the file yourself with PHP's flock() function.

docs on flock()

File locking makes each process wait for the lock to be released before it starts its own operation on the file.
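
For illustration, here is a minimal sketch of what a locking writer in receiver.php could look like. The plain fopen()/flock() handle, the fflush() call, and the path are my own illustrative choices, not something the question or answer specifies:

    // Sketch only: append one record under an exclusive lock.
    $fh = fopen(__DIR__ . '/file.csv', 'a');
    if ($fh !== false) {
        if (flock($fh, LOCK_EX)) {          // blocks until the exclusive lock is granted
            fputcsv($fh, $array, '|', "'"); // append one CSV record
            fflush($fh);                    // flush buffered output before releasing the lock
            flock($fh, LOCK_UN);            // release the lock
        }
        fclose($fh);
    }

The cron job that reads the file could take a shared lock with flock($fh, LOCK_SH), so readers and writers wait for each other instead of interleaving.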

There is, however, a problem if the HTTP request times out before the file's lock is released. You can prevent this from occurring by setting the following environment options:

 set_time_limit(0);       // removes the time limit for running the script
 ignore_user_abort(true); // keeps the script running even if the client aborts the request

At 1,000+ requests per second, however, this approach is nearing the limit of its viability. I would suggest using a queuing system for the incoming data and decoupling the file updates from the incoming requests, as sketched below.
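
As one possible way to decouple the two sides, here is a rough sketch that uses a Redis list as the queue. The phpredis extension, the server address, and the csv_queue key are assumptions of mine, not part of the original answer:

    // receiver.php: push the incoming data onto a queue instead of writing the CSV directly.
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);
    $redis->rPush('csv_queue', json_encode($array));

    // worker.php (run by cron or as a long-running process): drain the queue so that
    // all CSV writes happen from this single process, with no write contention.
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);
    $file = new SplFileObject(__DIR__ . '/file.csv', 'a');
    while (($json = $redis->lPop('csv_queue')) !== false) {
        $file->fputcsv(json_decode($json, true), '|', "'");
    }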

YouriKoeman