I have 7-8 PHP scripts that pull data from a remote server and store it on our server. Each script inserts/updates around 3000-4000 records at a time. When I hit any one script from the browser it works fine, but if I try to call all the files together by chaining header('Location: http://www.example.com/') redirects, it breaks. Can anyone suggest a better way to do this? Someone suggested multi-threading, but I have never used threading, so I would appreciate help finding a better approach/solution. TIA.
1 Answer
Note: your current code doesn't work because header('Location: example.com') redirects the browser to example.com, which means your PHP script has finished running and the browser is now on example.com.
Solution 1:
If allow_url_fopen is "On" in php.ini, you can execute them using file_get_contents():
<?php
$url1 = file_get_contents('http://www.example.com/1.php');
$url2 = file_get_contents('http://www.example.com/2.php');
?>
and so on...
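If all the importer scripts live on the same host, a loop keeps this readable. This is a minimal sketch; the URL list and the error_log() handling are only placeholders for whatever your scripts actually use:

<?php
// Hypothetical list of importer scripts - swap in your real URLs
$scripts = array(
    'http://www.example.com/1.php',
    'http://www.example.com/2.php',
    'http://www.example.com/3.php',
);

foreach ($scripts as $url) {
    // @ suppresses the PHP warning on failure; checking for false
    // lets one broken script be logged without stopping the rest
    $result = @file_get_contents($url);
    if ($result === false) {
        error_log('Import script failed: ' . $url);
    }
}
?>

Note that the calls run one after another, so the total run time is the sum of all the scripts; with 3000-4000 records per script you may also need to raise max_execution_time in the calling script.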
Solution 2:
function initCURL($url) {
    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, $url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // return the response instead of printing it
    curl_setopt($curl, CURLOPT_HEADER, false);        // don't include HTTP headers in the output
    $data = curl_exec($curl);
    curl_close($curl);
    return $data;
}
Use it as follows:
<?php
$url1 = initCURL('http://www.example.com/1.php');
$url2 = initCURL('http://www.example.com/2.php');
?>
In these examples, $url1 and $url2 will carry whatever data is returned by the scripts.
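PHP has no built-in threads for this, but if the multi-threading suggestion was about running the imports concurrently rather than one by one, the curl_multi_* functions can fire all the requests in parallel from a single controller script. A rough sketch, again using placeholder URLs:

<?php
$urls = array(
    'http://www.example.com/1.php',
    'http://www.example.com/2.php',
);

$multi = curl_multi_init();
$handles = array();

// One easy handle per script, configured like initCURL() above
foreach ($urls as $url) {
    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, $url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curl, CURLOPT_HEADER, false);
    curl_multi_add_handle($multi, $curl);
    $handles[$url] = $curl;
}

// Drive all transfers until every request has finished
do {
    $status = curl_multi_exec($multi, $running);
    curl_multi_select($multi); // wait for network activity instead of busy-looping
} while ($running > 0 && $status === CURLM_OK);

// Collect each script's output and clean up
$results = array();
foreach ($handles as $url => $curl) {
    $results[$url] = curl_multi_getcontent($curl);
    curl_multi_remove_handle($multi, $curl);
    curl_close($curl);
}
curl_multi_close($multi);
?>

Whether running the imports in parallel is safe depends on whether the scripts write to the same database tables; if they do, the sequential calls in Solution 1 or 2 are the simpler choice.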

Jaswinder
- Hey thanks Jaswinder Singh for your help. I will try this code and will let you know. Thank you so much. – Shweta Shinde Oct 03 '18 at 10:34
- Hey Jaswinder Singh Thank you so much. Solution 1 worked perfectly. – Shweta Shinde Oct 05 '18 at 10:57