
I use a PHP function to check the status of more than 1,000 websites (i.e. whether each website is up or down).

function curlCheck($nodes) {

    $results = array();
    $node_count = count($nodes);

    $curl_arr = array();
    $master = curl_multi_init();

    for ($i = 0; $i < $node_count; $i++) {
        $url = $nodes[$i];
        $curl_arr[$i] = curl_init($url);
        curl_setopt($curl_arr[$i], CURLOPT_RETURNTRANSFER, true);
        curl_setopt($curl_arr[$i], CURLOPT_NOBODY, true); // HEAD request only
        curl_setopt($curl_arr[$i], CURLOPT_TIMEOUT, 5);
        curl_multi_add_handle($master, $curl_arr[$i]);
    }

    $running = null;
    do {
        curl_multi_exec($master, $running);
        // Wait for activity instead of busy-looping at 100% CPU.
        if ($running > 0) {
            curl_multi_select($master);
        }
    } while ($running > 0);

    for ($i = 0; $i < $node_count; $i++) {
        $results[$i]['url'] = curl_getinfo($curl_arr[$i], CURLINFO_EFFECTIVE_URL);
        $results[$i]['code'] = curl_getinfo($curl_arr[$i], CURLINFO_HTTP_CODE);
        $results[$i]['time'] = curl_getinfo($curl_arr[$i], CURLINFO_PRETRANSFER_TIME);
        curl_multi_remove_handle($master, $curl_arr[$i]);
        curl_close($curl_arr[$i]);
    }
    curl_multi_close($master);
    echo 'done';
    return $results;
}

$nodes = array('http://google.com', 'http://yahoo.com', 'http://msn.com');
$result = curlCheck($nodes);
print_r($result);

I already use curl_multi, but the process still takes a long time. How can I run these checks in parallel more efficiently?

Foad Tahmasebi

2 Answers


According to https://bugs.php.net/bug.php?id=61141:

On Windows setups using libcurl version 7.24 or later (which seems to correspond to PHP 5.3.10 or later), you may find that curl_multi_select() always returns -1, causing the example code in the documentation to timeout. This is, apparently, not strictly a bug: according to the libcurl documentation, you should add your own sleep if curl_multi_select returns -1.
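That workaround can be sketched as follows. This is my own illustration of the pattern the bug report describes, not code from the answer; the function name driveMulti and the 100 ms back-off are assumptions:

```php
<?php
// Drive a curl_multi handle to completion, sleeping briefly whenever
// curl_multi_select() returns -1 (as libcurl's docs suggest) so the
// loop does not spin at 100% CPU.
function driveMulti($mh)
{
    // Start the transfers.
    do {
        $status = curl_multi_exec($mh, $running);
    } while ($status === CURLM_CALL_MULTI_PERFORM);

    while ($running > 0 && $status === CURLM_OK) {
        if (curl_multi_select($mh, 1.0) === -1) {
            // select() failed (seen on Windows with libcurl >= 7.24):
            // add our own short sleep instead of retrying immediately.
            usleep(100000); // 100 ms
        }
        do {
            $status = curl_multi_exec($mh, $running);
        } while ($status === CURLM_CALL_MULTI_PERFORM);
    }
}
```

Call it in place of the `do { curl_multi_exec(...); } while ($running > 0);` loop from the question.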

captain_a

Try this solution.

First call set_time_limit(0); under XAMPP or WAMP your script may otherwise stop abruptly when it hits the maximum execution time.

class myclass
{

    public $multi_exec_curl_files = array();

    // Default cURL options applied to every handle.
    public static $options = array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_NOBODY         => true,
        CURLOPT_TIMEOUT        => 5,
    );

    //...

    public function name()
    {

        // Add URLs to the list ($array is your website list, e.g. from a DB query).
        foreach ($array as $id => $value) {
            $this->multi_exec_curl_files[] = array('link' => $value['link']);
        }

        //...

        $this->multiCurl($this->multi_exec_curl_files);

        //...

    }

    public function multiCurl($res = array(), $options = "") {

        if (count($res) <= 0) {
            return false;
        }

        $handles = array();

        if (!$options) { // fall back to the default options
            $options = self::$options;
        }

        // Create one handle per URL and apply the options.
        foreach ($res as $k => $row) {
            $ch[$k] = curl_init();
            $options[CURLOPT_URL] = $row['link'];
            curl_setopt_array($ch[$k], $options);
            $handles[$k] = $ch[$k];
        }

        $mh = curl_multi_init();

        foreach ($handles as $k => $handle) {
            curl_multi_add_handle($mh, $handle);
        }

        $running_handles = null;
        // Execute the handles.
        do {
            $status_cme = curl_multi_exec($mh, $running_handles);
        } while ($status_cme == CURLM_CALL_MULTI_PERFORM);

        while ($running_handles && $status_cme == CURLM_OK) {
            if (curl_multi_select($mh) != -1) {
                do {
                    $status_cme = curl_multi_exec($mh, $running_handles);
                } while ($status_cme == CURLM_CALL_MULTI_PERFORM);
            }
        }

    }
}

Take it from here; this works, I run it daily. Adapt the function as you need.

Guide:

  1. Instantiate the class (the name is case sensitive):

    $app = new myclass();

  2. Run the main function:

    $app->name();

In this function you must run a query, extract all of your website URLs from the database, and store the links in the property $this->multi_exec_curl_files. I used $array as my website list.

  3. After you have loaded all the website URLs, call the multi-cURL method:

    $this->multiCurl($this->multi_exec_curl_files);

This starts an execution line (handle) for each website URL and runs whatever you tell it to do.

I use this method to download pages from multiple websites at the same time, and you can use it the same way to check whether multiple websites are online.

$handles is the collection of cURL handles; $handle is a single URL's handle.

Once the process is started, it waits until all handles are done.
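Once all handles are done, the up/down status can be read off each handle. This is my own sketch, not part of the answer above; the function name checkResults and the "2xx/3xx means up" rule are assumptions you should adapt:

```php
<?php
// After the multi loop finishes, inspect each handle: an HTTP code in
// the 200-399 range is treated as "up", anything else (including 0,
// which means the connection failed) as "down". Also release resources.
function checkResults($mh, array $handles)
{
    $status = array();
    foreach ($handles as $k => $ch) {
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        $status[$k] = array(
            'url'  => curl_getinfo($ch, CURLINFO_EFFECTIVE_URL),
            'code' => $code,
            'up'   => $code >= 200 && $code < 400,
        );
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $status;
}
```

You would call this at the end of multiCurl(), passing it $mh and $handles.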

Ionut Flavius Pogacian