I need to run one PHP file 100,000 times in parallel. To do that I call exec() in a loop from another PHP file (runmyfile.php), which I run from a PuTTY session. runmyfile.php contains the following code:
    // Launch myfile.php in the background 100000 times
    for ($i = 0; $i < 100000; $i++) {
        exec('php -f /home/myserver/test/myfile.php > /dev/null &');
    }
This executes myfile.php 100,000 times in parallel. Each instance of myfile.php fetches rows from a MySQL table, performs some calculations, and inserts the results into another table.
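In outline it does something like this (the host, credentials, table and column names here are placeholders, not the real ones):

    <?php
    // Rough outline only -- host, credentials, table and column
    // names below are placeholders, not the real ones.
    $db = new mysqli('localhost', 'dbuser', 'dbpass', 'mydb');

    $rows = $db->query('SELECT id, value FROM source_table');
    while ($row = $rows->fetch_assoc()) {
        $result = $row['value'] * 2;   // some calculation

        $stmt = $db->prepare('INSERT INTO result_table (source_id, result_value) VALUES (?, ?)');
        $stmt->bind_param('id', $row['id'], $result);
        $stmt->execute();
        $stmt->close();
    }

    $db->close();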
But when I run it 100,000 times it hangs the server (I'm running CentOS). Sometimes I also get a resource unavailable error. If I run it 1,000 times it works fine.
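Since 1,000 at a time works, one workaround I've been considering is throttling the launcher so only a limited number of copies run at once, something like the sketch below (the 1,000 limit and the pgrep-based check are just my assumptions, not something I've tested at full scale):

    <?php
    // Throttled launcher sketch: keep at most $maxConcurrent copies
    // of myfile.php running at any one time.
    $total         = 100000;
    $maxConcurrent = 1000;   // 1000 at a time already works on this server

    for ($i = 0; $i < $total; $i++) {
        // Wait until fewer than $maxConcurrent copies are running.
        while ((int) exec('pgrep -c -f "/home/myserver/test/myfile.php"') >= $maxConcurrent) {
            usleep(100000); // sleep 0.1 seconds before checking again
        }
        exec('php -f /home/myserver/test/myfile.php > /dev/null &');
    }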
When I check ulimit -a I get the following:
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 514889
max locked memory (kbytes, -l) unlimited
max memory size (kbytes, -m) unlimited
open files (-n) 1000000
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 1024
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
My MySQL max_connections is set to 200000.
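I notice that max user processes is only 1024 in the ulimit output above. If that limit needs to be raised, I assume it would go in /etc/security/limits.conf, something like this (the username and values are placeholders, and I'm not sure whether this is the right fix):

    # /etc/security/limits.conf -- placeholder username and values
    myserver  soft  nproc  100000
    myserver  hard  nproc  100000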
Are there any settings I need to change so that I can execute my PHP file 100,000 times properly?