
In my cache system, when a new page is requested I check whether a cached file already exists. If it doesn't, a copy is stored on the server; if it does exist, it must not be overwritten.

The problem I have is that I may be using functions designed to be slow.

This is part of my current implementation to save files:

if (!file_exists($filename)) {
    $h = fopen($filename, "wb");
    if ($h) {
        fwrite($h, $c);
        fclose($h);
    }
}

This is part of my implementation to load files:

if (($m = @filemtime($file)) !== false) {
    if ($m >= filemtime("sitemodification.file")) {
        $outp = file_get_contents($file);
        header("Content-length: " . strlen($outp), true);
        echo $outp;
        flush();
        exit();
    }
}

What I want to do is replace this with a better-performing set of functions while keeping the same functionality. All cache files, including sitemodification.file, reside on a ramdisk. I added a flush before exit in the hope that content will be output faster.

I can't use direct memory addressing at this time because the file sizes to be stored are all different.

Is there a set of functions that can execute the code I provided faster by at least a few milliseconds, especially the file-loading code?

I'm trying to keep my time to first byte low.

1 Answer


First, prefer is_file to file_exists and use file_put_contents:

if ( !is_file($filename) ) {
    file_put_contents($filename,$c);
}
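One caveat (my own addition, not part of the question): if two concurrent requests both miss the cache, both can pass the `is_file()` check and write at the same time. Passing `LOCK_EX` to `file_put_contents` at least serializes the writers so partial writes don't interleave, a minimal sketch using `$filename` and `$c` from the question:

```php
<?php
// Write the cached copy only if it does not already exist.
// LOCK_EX acquires an exclusive lock while writing, so concurrent
// writers cannot interleave their output in the same file.
if (!is_file($filename)) {
    file_put_contents($filename, $c, LOCK_EX);
}
```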

Then, use the proper function for this kind of work, readfile:

if (($m = @filemtime($file)) !== false && $m >= filemtime('sitemodification.file')) {
    header('Content-length: ' . filesize($file));
    readfile($file);
}
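Putting it together with the flush-and-exit behavior from the question, the whole loading path might look like this (a sketch; `$file` and `sitemodification.file` are taken from the question):

```php
<?php
// Serve the cached copy if it is at least as new as the
// site-wide modification marker.
$m = @filemtime($file);
if ($m !== false && $m >= filemtime('sitemodification.file')) {
    header('Content-length: ' . filesize($file));
    readfile($file);   // streams the file straight to the output buffer
    flush();
    exit();
}
```

The advantage of `readfile()` over `file_get_contents()` + `echo` is that it writes the file to the output buffer without first building the whole contents as a PHP string.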

You should see a small improvement, but keep in mind that file access is slow, and this code still performs three file checks (two filemtime calls and one filesize) before sending any content.

  • thanks, but don't filesize and readfile both check that the file exists? –  May 21 '15 at 03:10