
I'm running PHP 5.4.

This is my setup:

../checker.php

../index.php

I use a cronjob to run checker.php every 5 minutes or so, but I want the echo of the cURL output to display on index.php. How do I receive that echo at index.php?

It would also be possible to let cURL store the results from checker.php in a file, say results.php, and let index.php grab/iframe the contents from results.php, but I have no idea how.

TehEnforce
  • Store the cURL result in a file and use file_get_contents() or another file function from index.php to read and echo it. – OBV Sep 28 '13 at 18:29
  • I believe [this](http://stackoverflow.com/a/865669/1438393) is what you're looking for. – Amal Murali Sep 28 '13 at 18:30
  • @user2745919 file_get_contents() I could try, but how would I store the cURL result in a file? The way it is now, I visit the page, it runs the cURL script, and it shows the result with echo. – TehEnforce Sep 28 '13 at 19:38
  • Cron a PHP script that cURLs the page and saves it to a local file. You only want to echo it when someone visits index.php, so place the read/echo code there. – OBV Sep 28 '13 at 19:42
  • @AmalMurali That does not seem to work. It just shows the plain page (index.php) without the results. No errors or anything. Strange. – TehEnforce Sep 28 '13 at 19:47
  • @user2745919 The problem is, I don't want cURL to crawl a page, I want it to check the page's uptime. Storing the echo results in a file could work, but I have no idea how to make cURL store the echo results in another file. Or maybe you mean I should cURL the cURL result using a cron every 5 minutes and iframe it into index.php. Either way, I have no idea how. – TehEnforce Sep 28 '13 at 19:58
  • See my answer; you may need to edit it for your use. Don't forget to vote up correct answers! – OBV Sep 28 '13 at 20:08

1 Answer


cronCheckUrls.php

<?php
// Name of the text file that will hold the results
$file = "urlUpResults.txt";

// Create a blank file / empty an existing file before the checks run
file_put_contents($file, "");

// Build the array of sites to check
$sitesArray = array("http://www.minecraft.net", "http://www.4stats.tk", "http://www.google.com", "http://www.stackoverflow.com", "http://www.brokentestwebsite.com");

// Check each site and append its up/down status to the results file
foreach ($sitesArray as $value) {
    if (TPBUC($value)) {
        $upStatus = "<p>" . $value . " is up.</p>";
    } else {
        $upStatus = "<p>" . $value . " is down.</p>";
    }
    file_put_contents($file, $upStatus, FILE_APPEND);
}

// cURL-based check: returns true when the URL answers with a 2xx/3xx status code
function TPBUC($url) {
    $agent = "Mozilla/5.0 (compatible; TPBUC/1.0;)";
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_USERAGENT, $agent);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_VERBOSE, false);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_SSLVERSION, 3); // forces SSLv3; drop this line if a site only speaks TLS
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
    $page = curl_exec($ch);
    $httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return ($httpcode >= 200 && $httpcode < 400);
}
?>
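
To run the script every 5 minutes as described in the question, a crontab entry along these lines should work (the PHP binary location and script path below are placeholders; adjust them for your server):

*/5 * * * * /usr/bin/php /path/to/cronCheckUrls.php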

index.php

<?php
$file = "urlUpResults.txt";
echo "<h1>UpCheck results:</h1><br>" . file_get_contents($file);
?>
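
If the cron job has not produced the file yet, file_get_contents() will raise a warning, so a slightly more defensive variant (just a sketch, assuming the same file name) could be:

<?php
$file = "urlUpResults.txt";
if (is_readable($file)) {
    echo "<h1>UpCheck results:</h1><br>" . file_get_contents($file);
} else {
    echo "<p>No results yet.</p>";
}
?>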
OBV
  • That seems to do the job! Awesome, thanks. However, in the txt it just says get_content for some reason. – TehEnforce Sep 28 '13 at 20:42
  • Still seems to just say the class name, though. It might be conflicting because of the way the cURL is done; not too sure. This is the code I'm using combined with yours: http://pastebin.com/UMJXQCts – TehEnforce Sep 28 '13 at 21:01
  • Ok, actually tested it now; there was a syntax error, now fixed. Will check your code. – OBV Sep 28 '13 at 21:07
  • Do you actually need the site pages (HTML) to be saved? If you just need to know whether a website is up or not, here is: http://pastebin.com/TsG0uLhA – OBV Sep 28 '13 at 21:16
  • I don't want the HTML pages of those sites saved. I do want the results saved in a TXT or something, and then converted back to HTML at the index. That way the cURL doesn't run on every refresh and spam the sites each time the checker loads. Here is the setup: cURL main script checks uptime every 5 min --> the results from the main script's echo (online or offline) are saved in a TXT or another format --> index reads the TXT and parses it onto the main page --> shows users whether the site is down or not. – TehEnforce Sep 28 '13 at 23:05
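
Following up on the last two comments: since the goal is only an up/down check, the page body does not have to be downloaded at all; cURL can send a HEAD request via CURLOPT_NOBODY instead. Below is a minimal sketch of that idea (TPBUC_head is a hypothetical name, and some servers answer HEAD requests differently than GET, so test it against your sites):

<?php
// Variant of TPBUC that sends a HEAD request, so no page body is transferred.
function TPBUC_head($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: headers only
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo anything
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
    curl_exec($ch);
    $httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return ($httpcode >= 200 && $httpcode < 400);
}
?>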