I need to find a solution for a website that is struggling under load. The site gets ~500 simultaneous connections at peak time and serves around 42k hits per day.
It's a WordPress-based site bridged with a vBulletin forum, with a lot of content and a fairly complex structure that makes intensive use of the database. I've already implemented code-level full-page caching (without it the server simply crashes) and configured all the other caching directives, as well as combining CSS files and the like to limit HTTP requests as much as possible.
The server still goes down occasionally during peak times, so I need to understand whether there is more that can be done in software, or whether the load is simply too much for the server to handle and it needs to be upgraded.
I can't access the server right now, but it's a dedicated CentOS machine (I think 4 GB of RAM; I can't say what CPU) running Apache/MySQL.
So back to the main question: how can I know when the users are just too many?
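For what it's worth, the only rough check I know of is comparing the size of an Apache child against the RAM left over after MySQL, something like this (assuming a prefork MPM, processes named httpd, and roughly 1 GB set aside for MySQL and the OS; those figures are just my guesses):

ps -ylC httpd | awk '{ sum += $8; n++ } END { print "average child size (MB): " sum / ((n - 1) * 1024) }'   # $8 is RSS in KB; n-1 skips the header line
# rough ceiling: MaxClients ~= (4096 MB - 1024 MB) / average child size in MB

Is that even the right way to reason about it, or is there a better metric to watch?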
EDIT
I got access to the logs. According to error.log, during yesterday's outage Apache was segfaulting:
[Mon Apr 19 18:26:51 2010] [notice] child pid 4825 exit signal Segmentation fault (11)
[Mon Apr 19 18:26:53 2010] [notice] child pid 4794 exit signal Segmentation fault (11)
[Mon Apr 19 18:27:08 2010] [notice] child pid 4595 exit signal Segmentation fault (11)
[Mon Apr 19 18:27:11 2010] [notice] child pid 4826 exit signal Segmentation fault (11)
.....
How can I tell what's causing these segfaults?
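In case it helps, this is what I'm planning to try next so the next crash leaves something to inspect (assuming I can edit httpd.conf and restart Apache; the /tmp/apache-cores path and the core file name are just placeholders I picked):

# httpd.conf: tell Apache where crashing children may write core files
CoreDumpDirectory /tmp/apache-cores

# shell, as root: create the directory, allow core files, restart Apache
mkdir /tmp/apache-cores && chown apache:apache /tmp/apache-cores
echo "DAEMON_COREFILE_LIMIT='unlimited'" >> /etc/sysconfig/httpd   # I think this is how the CentOS init script raises the core size limit
service httpd restart

# after the next segfault: load the core into gdb and get a backtrace
gdb /usr/sbin/httpd /tmp/apache-cores/core.12345   # core.12345 is a placeholder name
(gdb) bt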