
I am using the Amazon S3 API with the client set up to read as a stream. Using file_get_contents("s3://{bucket}/{key}") works fine for me and reads the full file (I am using a video file and testing on my local system). However, I am trying to optimize the memory used by the script, so I am trying to read and return the data in chunks, as below:

$stream = @fopen("s3://{bucket}/{key}", 'r');
$buffer = 1024;
while (!feof($stream)) {
    echo @fread($stream, $buffer);
    flush();
}

This is not working on my local system, and I am wondering what the issue with this technique might be. From searching, it seems to be a very widely used approach. If anybody can suggest what might be wrong here, or another approach I should try, it would be very helpful. Thanks.
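For context, the `s3://` wrapper above only works after it has been registered with the SDK client. A minimal setup sketch, assuming the AWS SDK for PHP v2 (the credentials, bucket, and key shown are placeholders):

```php
<?php
// Placeholder setup for the s3:// stream wrapper (AWS SDK for PHP v2).
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'key'    => 'YOUR_ACCESS_KEY',    // placeholder credentials
    'secret' => 'YOUR_SECRET_KEY',
));

// After this call, fopen("s3://...") / file_get_contents("s3://...") work.
$client->registerStreamWrapper();
```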

Rana
  • Start by getting rid of the `@`, so you'll see any errors being displayed – Mark Baker Nov 10 '13 at 17:12
  • But [readfile()](http://php.net/manual/en/function.readfile.php) is probably a better function to use if all you're doing with the file is spooling it direct to the browser – Mark Baker Nov 10 '13 at 17:18
  • yes, I know. However, I will have to write the buffer to somewhere else as well. that's why I am using reading it manually rather than using 'readfile' – Rana Nov 10 '13 at 17:42
  • Just to check, I tried with readfile and, interestingly, it is not working either. However, all three ways work just fine if I give a path to a local version of the file. But when it tries to retrieve from S3, only 'file_get_contents' is working; the other two ways aren't. Just wondering, does this have anything to do with the fact that I am trying from my local system? Might it work if I try on an EC2 server instance? Do you think so? – Rana Nov 10 '13 at 17:57
  • I've been working with s3 streaming over the last few days myself, specifically using readfile(), and it's worked from both my local development laptop and from my EC2s, so it should work... but is the fopen() working? Don't suppress errors with `@`, and see if you get any error messages – Mark Baker Nov 10 '13 at 18:12
  • No, as my main concern is fopen, it is still not working. Even without '@', it isn't showing any error. If I remove the 'header(..' part, then on the browser I can see the binary data, but with 'header' it's not showing the video it is supposed to. Now I am a little curious, do you think there is any way my data could have got corrupted? Thanks. – Rana Nov 10 '13 at 20:04
  • 1
    If you're seeing the binary data with the headers removed, then it suggests that the problem is with your headers, not with the file access from S3 – Mark Baker Nov 10 '13 at 20:08
  • Hmm, I didn't think of it that way, as it was working with the local file. If the header had some issue, the path to the local file wouldn't work either, right? However, here is the header I am using: header('Content-Type: video/quicktime'); my video is in '.mov' format and I am testing on Safari, as other browsers don't support this format, I guess. Any idea if I need to put anything else? – Rana Nov 10 '13 at 20:21

1 Answer


OK, finally got the solution. Somehow, some other output was being added to the output buffer. I had to put:

ob_get_clean();
header('Content-Type: video/quicktime');

This cleans out anything that had already been added to the output buffer before the headers and the binary data are sent. Now it's working fine.
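Putting the pieces together, a minimal sketch of the working pattern (assuming the S3 stream wrapper is registered as in the question; `streamInChunks` is a helper name introduced here for illustration, and it returns the byte count only so the loop is easy to verify locally):

```php
<?php
// Read a stream in fixed-size chunks and echo each chunk, instead of
// loading the whole file into memory. Returns total bytes sent.
function streamInChunks($stream, int $chunkSize = 1024): int
{
    $sent = 0;
    while (!feof($stream)) {
        $chunk = fread($stream, $chunkSize);
        echo $chunk;
        flush();
        $sent += strlen($chunk);
    }
    return $sent;
}

// In the real script (assumes $bucket/$key exist and the S3 wrapper
// is registered), the order of operations is the key point:
//   ob_get_clean();                           // discard stray buffered output
//   header('Content-Type: video/quicktime');  // headers before any body bytes
//   $stream = fopen("s3://{$bucket}/{$key}", 'r');
//   streamInChunks($stream);
//   fclose($stream);
```

The crucial detail is that `ob_get_clean()` runs before `header()` and before the first `echo`, so no stray buffered bytes end up prepended to the binary stream.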

Thanks Mark Baker for your valuable support through the debugging process.

Rana
  • This article on the AWS PHP Development Blog might be helpful as well: [Streaming Amazon S3 Objects From a Web Server](http://blogs.aws.amazon.com/php/post/Tx2C4WJBMSMW68A/Streaming-Amazon-S3-Objects-From-a-Web-Server) – Jeremy Lindblom Nov 11 '13 at 08:04