
I have a new server install of CentOS 6.4 64bit on a VM with 2 cpu and 8gb of ram.

The server has nothing installed other than apache and php. The server is hosting a simple web API that writes data to a RabbitMQ queue on another server.

I'm running 10 instances of a simple script that does a curl call to this web server which I've looped 100,000 times each.
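
For reference, a minimal sketch of what one such load-generator instance might look like (the URL and function name are assumptions, not the actual script):

```shell
#!/bin/sh
# Hypothetical load generator: loop a curl call against the web API.
# The question's setup runs 10 copies of this in parallel, 100,000
# iterations each. Written as a function so the loop is reusable.
hammer() {
  url=$1
  count=$2
  i=0
  while [ "$i" -lt "$count" ]; do
    # -s: silent, -o /dev/null: discard the response body
    curl -s -o /dev/null "$url"
    i=$((i + 1))
  done
}

# Example invocation (one of the 10 instances):
#   hammer http://webserver/api/endpoint 100000
```

Because there is no think time between iterations, each instance issues requests back-to-back as fast as the server can answer them.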

This has caused the web server to use 100% CPU but very little RAM (about 500 MB). I've played around with the Apache config and even set MaxClients to 1000, but I get the same result.

Is there an issue with Apache doing a lot of work even with only a few clients?

Thanks

tdbui22

1 Answer


You've got 10 threads making a million requests between them, as quickly as they can, from the local system (no network latency slowing down how quickly they can hit the server). I'm not sure how this would constitute the workload of only a few users? Maybe a few users sitting right next to the server with something heavy resting on their F5 keys.

You will hit a bottleneck on one resource or another - in this case it's CPU, simply due to the execution of PHP code. However simple the page is, as soon as the server finishes one render, it has to start the next one.

This shouldn't be a performance concern in itself; you should be looking at things like "how long does each page render take under load", or "does a single request take more resources than expected". This test doesn't really give you any indication of that. Take a look at ab (ApacheBench) - it'll show you what kind of performance you're actually getting while pushing your server to its limit.
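
As a sketch, an ab run equivalent to the test described in the question might look like this (the URL is a placeholder; `-n` is total requests, `-c` is concurrency; the invocation is guarded so it is a no-op where ab or the server isn't reachable):

```shell
# ApacheBench ships with httpd. This mirrors the original workload:
# 100,000 requests at a concurrency of 10, but with latency and
# requests-per-second reporting. URL is hypothetical.
URL="http://webserver/api/endpoint"
if command -v ab >/dev/null 2>&1 \
   && curl -s -o /dev/null --max-time 2 "$URL"; then
  ab -n 100000 -c 10 "$URL"
fi
```

The output includes requests per second, mean time per request, and a latency percentile table, which tells you far more than watching `top` during a raw curl loop.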

Shane Madden