
I want the following curl code to stay stable for up to 50 connections from different IPs, so it can handle up to 50 simultaneous requests without hanging or putting much load on the server.

I am on shared hosting, but I want this curl script to put as little load on the server as possible, even if it gets more than 50 or 100 requests at once in the future. Otherwise my hosting resources may be limited by the admin for putting high load on a shared server.

One more thing: each request fetches a file of about 30 KB on average from the remote server, so I think each request should complete in a few seconds, less than 3, because the file size is very small.

Please also tell me: does this script need any modification (such as curl_multi) to handle 50 to 100 small requests at once? Or is it fine as-is with no modification needed? Or do I just need to change the shared hosting php.ini settings via cPanel?

$userid = basename($_GET['id']); // strip path components so the id cannot escape the cache directory
$cache  = $userid . ".txt";

if (file_exists($cache) && (filemtime($cache) > (time() - 3600 * $ttime))) {
    $ffile = file_get_contents($cache);
} else {
    $dcurl = curl_init();
    $fh    = fopen($cache, "w+"); // separate variable: don't overwrite the handle with curl_exec()'s return value
    curl_setopt($dcurl, CURLOPT_URL, "http://remoteserver.com/data/$userid");
    curl_setopt($dcurl, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_0);
    curl_setopt($dcurl, CURLOPT_TIMEOUT, 50);
    curl_setopt($dcurl, CURLOPT_FILE, $fh); // write straight to the file; CURLOPT_RETURNTRANSFER is redundant here
    curl_exec($dcurl);

    if (curl_errno($dcurl)) { // check for execution errors
        echo 'Script error: ' . curl_error($dcurl);
        exit;
    }
    curl_close($dcurl);
    fclose($fh); // flush and close before reading the file back
    $ffile = file_get_contents($cache);
}
smallbee
  • The script seems to be synchronous and I guess that's why it's not scalable, you need to make it async, so that it potentially can handle 100's of requests easily. Just execute that piece of code in a `fork`ed processes. – Boynux Dec 27 '15 at 08:16
  • You first need to identify why it's hanging. Most likely it isn't curl or this script. Is your web server configured to serve 500 simultaneous clients? Is your OS open file & socket limit high enough to allow all these connections at once? – drew010 Dec 27 '15 at 20:31

1 Answer


You can use curl_multi:
http://php.net/manual/en/function.curl-multi-init.php - description and example
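As a sketch of what that could look like here (the `fetchAll()` helper name and the timeout default are my own, not from the manual page), curl_multi lets one PHP process drive several transfers concurrently instead of waiting on each one in turn:

```php
<?php
// Fetch several URLs in parallel with curl_multi.
// Returns an array keyed like $urls: response body on success, null on failure.
function fetchAll(array $urls, int $timeout = 10): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for socket activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    // Mark handles whose transfer ended in an error.
    $failed = [];
    while ($info = curl_multi_info_read($mh)) {
        if ($info['result'] !== CURLE_OK) {
            foreach ($handles as $key => $ch) {
                if ($ch === $info['handle']) {
                    $failed[$key] = true;
                }
            }
        }
    }

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = isset($failed[$key]) ? null : curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

Note that curl_multi helps when *one* script has to fetch many remote files in the same request; it does not change how many separate visitors your web server can handle at once, which is what the comments under the question are pointing at.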

Max P.
    Do you want me to do your work for free? I've posted link with description and example. Use it as base for your example. – Max P. Dec 27 '15 at 15:10
  • Your link-only hint would be better positioned on this page as a comment under the question. The OP has presented a coding attempt (not all questions provide this). Please be more generous / educational if you are going to post an answer. – mickmackusa Jan 14 '20 at 07:05