11

I am running multiple websites with high traffic. As a requirement, all images are served via image.php?id=IMAGE_ID_HERE. If you have ever done that before, you know that the script reads the image file and echoes it to the browser with special headers.

My problem is that the load on the server is very high (150-200), and top shows multiple instances of image.php, so image.php is running slowly!

The problem is probably fopen loading the image into memory before sending it to the client. How can I read a file and pass it through to the client directly?
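
For reference, here is a minimal sketch of the kind of image.php I am describing (the storage path, MIME type and chunk size are assumptions, not the actual code):

<?php
// Hypothetical sketch of an image.php that reads the file and echoes it
// to the browser with special headers; path and headers are assumed.
$id   = (int) $_GET['id'];
$path = '/var/www/images/' . $id . '.jpg';   // assumed storage layout

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path));

$fp = fopen($path, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192);   // read a chunk and echo it to the client
}
fclose($fp);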

Thank you guys


UPDATE

After you have optimized the code and used caching wherever possible, do create a CDN: a couple of servers, sync methods, load balancers, and there is no need to worry about requests anymore :)

Rami Dabain
  • What's wrong with `file_get_contents`? Or why don't you just use `fpassthru` or `readfile`? And if this is just about custom headers, consider using `mod_cern_meta` or `mod_headers`. – mario Oct 18 '11 at 11:37
  • 1
    For all questions asking whether fnA() or fnB() is faster: just profile your code. it will give you the answer and chances are you will find that something else is slowing you down. – Gordon Oct 18 '11 at 11:38
  • @Gordon, assuming you have checked all of the code, optimized everything, and got rid of other things until nothing was left? – Rami Dabain Oct 18 '11 at 11:44
  • Why not implement the special headers in Apache or Nginx conf files and serve the images right from the files? No solution touching the PHP processor would match it. – Halil Özgür Oct 18 '11 at 11:44
  • I don't think you can appreciably increase your speed by changing `fopen` to `file_get_contents`; I think you should maybe pay attention to caching and architecture changes instead. – Oct 18 '11 at 11:46
  • 1
    @RonanDejhero Not sure what you are trying to say but in any case profiling is the way to go instead of asking for a synthetic a() vs b(). – Gordon Oct 18 '11 at 11:46
  • Might be on the sideline, but using JS and Ajax you can download the pictures in the background, so the user doesn't have to wait for all of them to download. A common trick on pages with huge and detailed pictures. – OptimusCrime Oct 18 '11 at 11:52
  • @Optimus that's completely unrelated to the problem – Your Common Sense Oct 18 '11 at 11:54
  • @OptimusCrime ... I am more worried about the server load that is caused by image.php – Rami Dabain Oct 18 '11 at 12:00
  • @RonanDejhero: then you might have to rethink the idea with image.php; can't you use any other solutions? – OptimusCrime Oct 18 '11 at 12:03
  • @OptimusCrime limited by requirements. I can only edit the code inside image.php – Rami Dabain Oct 18 '11 at 12:05
  • @Gordon he has a problem. Like many others he has no clue where it is or even how to phrase it, so he is asking a quite pointless question, but *the problem* can be seen behind it. – Your Common Sense Oct 18 '11 at 12:05
  • @Gordon after investigating, the problem was the slow speed of fopen, the while loop for reading the file, and then the echo of the contents (echoing takes more time than you think; in online algorithm tests, using echo might give you a 25% lower mark). I'll be trying `readfile`. – Rami Dabain Oct 18 '11 at 12:32

2 Answers

17

fopen and file_get_contents are nearly equivalent

To speed up the page load consistently you can use

http://www.php.net/fpassthru

or, even better

http://www.php.net/readfile

With those functions, the content of the file is written directly to the output, chunk by chunk, without being held in a PHP variable first,

as opposed to file_get_contents, for example, where you store the whole data inside a variable:

$var = file_get_contents($filename);   // the entire file ends up in memory

So, to make these work correctly, you will need to disable output buffering (otherwise it would make readfile() pointless) in the page that serves the images.
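
As a rough sketch (the storage path and MIME type below are assumptions, not your actual setup), the serving script could look something like this:

<?php
// Sketch of image.php using readfile(): the file is streamed straight to
// the output and never held in a PHP variable. Path and headers assumed.
$id   = (int) $_GET['id'];
$path = '/var/www/images/' . $id . '.jpg';

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path));

// drop any active output buffers so readfile() can stream directly
while (ob_get_level() > 0) {
    ob_end_flush();
}

readfile($path);
exit;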

hope this helps!

  • Let me guess: you don't have a top populated with image.php, nor have you ever had one. – Your Common Sense Oct 18 '11 at 11:53
  • I'll try that, hope the load drops :) – Rami Dabain Oct 18 '11 at 12:01
  • @RonanDejhero yes, do not use fopen or file_get_contents; try fpassthru for a week and monitor the server load – Oct 18 '11 at 12:02
  • I am doing that; the load should drop in a minute – Rami Dabain Oct 18 '11 at 12:04
  • Nopaste the image.php code; I want to hear Col. Shrapnel's point of view – Oct 18 '11 at 12:08
  • 2
    if you still having issues with serverload after using fpassthru and you are able to install apache modules, you can try mod_xsendfile see this link for more info http://codeutopia.net/blog/2009/03/06/sending-files-better-apache-mod_xsendfile-and-php/ – bumperbox Oct 18 '11 at 12:34
  • with `file_get_contents()` you don't _have_ to store the contents in a variable, for example when used as such: `file_put_contents('images/', file_get_contents($imgURL));` – pulsar Apr 03 '14 at 10:20
4

Why don't you cache the image content with APC?

if (!apc_exists('img_'.$id)) {
    // cache the raw image bytes in APC shared memory
    apc_store('img_'.$id, file_get_contents(...));
}

echo apc_fetch('img_'.$id);

This way the image content will not be read from your disk more than once.
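
To put it together, here is a rough sketch of how the serving script could combine APC with the response headers (the storage path, TTL and MIME type are assumptions):

<?php
// Rough sketch: serve the image from APC, falling back to disk on a miss.
// Storage path, TTL and content type are assumptions.
$id  = (int) $_GET['id'];
$key = 'img_' . $id;

$data = apc_fetch($key, $hit);
if (!$hit) {
    $data = file_get_contents('/var/www/images/' . $id . '.jpg');
    apc_store($key, $data, 3600);   // keep it cached for an hour
}

header('Content-Type: image/jpeg');
header('Content-Length: ' . strlen($data));
echo $data;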

Paté
  • APC is hard to set up correctly, but it could be a good solution! – Oct 18 '11 at 12:14
  • Well, if the website has lots of hits then I/O on disk is high due to concurrent file_get_contents; storing the content in RAM would get rid of that – Paté Oct 18 '11 at 12:29
  • @Col.Shrapnel reading from memory is faster than reading from disk – Oct 18 '11 at 12:43
  • There is not a sign of an I/O peak in the question. Go figure. It clearly says that it's CPU utilization above the limits. – Your Common Sense Oct 18 '11 at 12:56
  • In my experience lots of I/O is never good for load performance. If this guy has a load of 200 on this particular script then he has high I/O because of the way the files are served. I'm not saying it will fix his issues, but it's worth a shot IMO. – Paté Oct 18 '11 at 13:01
  • If you're getting something like 5000 image requests/second, that would be a problem for the memory if APC is a memory-based cache, especially if you have thousands of images! – Rami Dabain Oct 18 '11 at 13:04
  • I believe the issue is not the number of requests per second but rather the total size of all the images; you could also couple that with a round-robin solution. I know Facebook uses memcache to cache some of their profile images, so there is a reason why... – Paté Oct 18 '11 at 13:06
  • With the 5000 images/second ... I meant different images, not the same one; otherwise I would have cached them – Rami Dabain Aug 31 '12 at 19:00