I am using the Amazon S3 API with the client set up to read objects as streams. Reading the whole file with file_get_contents("s3://{bucket}/{key}") works fine (I am using a video file and testing on my local system). However, I am trying to optimize the script's memory usage, so instead of reading the full file at once I am trying to read and return the data in chunks, as below (my client and wrapper setup is sketched at the end of this question):
$stream = fopen("s3://{bucket}/{key}", 'r');
if ($stream === false) {
    die('Could not open stream for reading');
}
$buffer = 1024;
// Send the object to the client in 1 KB chunks instead of loading it all into memory
while (!feof($stream)) {
    echo fread($stream, $buffer);
    flush();
}
fclose($stream);
This is not working on my local system, and I am wondering what the issue with this technique might be. From searching, it seems to be a widely used approach. If anybody can suggest what might be wrong here, or another approach I should try, that would be very helpful. Thanks.
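For reference, the client and the s3:// stream wrapper are set up roughly like this. This is a minimal sketch assuming the AWS SDK for PHP v3; the region value and the autoloader path are placeholders and may differ from my actual configuration:

<?php
require 'vendor/autoload.php'; // placeholder path to the Composer autoloader

use Aws\S3\S3Client;

// Create the S3 client; the region is a placeholder assumption,
// and credentials are picked up from the environment
$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// Register the s3:// wrapper so fopen() and file_get_contents()
// can address objects as s3://{bucket}/{key}
$client->registerStreamWrapper();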