
I have a Drupal website with many modules (don't ask how many). The site had been stable for six months, but recently the servers have started to seize up. Typically MySQL hits its maximum number of concurrent connections (1000) and the website crashes.
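
For reference, this is roughly how I catch it in the act when the site starts to lag (a minimal sketch, assuming shell access to the database server and a MySQL account that can read the process list; the credentials are placeholders):

    # How close are we to max_connections right now?
    mysql -u root -p -e "SHOW STATUS LIKE 'Threads_connected'; SHOW VARIABLES LIKE 'max_connections';"

    # Who is holding the connections, and what are they running?
    mysql -u root -p -e "SHOW FULL PROCESSLIST\G"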

I want to find out what web pages within the site are being visited, or what cron or drush processes are running that are bringing the site down.

What is the best strategy for finding out this information?

Do I parse the Apache logs to see which pages were visited, then benchmark the last 100 pages in the log and see how much memory each one consumes, for example?
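
To make it concrete, the kind of parsing I have in mind is roughly this (a sketch only; the log path and the combined log format are assumptions, and the timestamp is just an example):

    # Top 20 most-requested paths in the access log
    awk '{print $7}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -20

    # The same, restricted to the minutes leading up to a crash
    grep '04/Mar/2011:21' /var/log/apache2/access.log \
      | awk '{print $7}' | sort | uniq -c | sort -rn | head -20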

Or is there a more accurate way of saying "this particular page or process brought down your site"?

I know that there's the PHP log, the Apache log, the MySQL log, and the top command, but it seems like too much inconclusive information.


1 Answer


I don't work with Drupal or MySQL, but you seem to have all the parts you need to look at to start solving a problem like this.

Since the DB is the point of failure (just an assumption), I would suggest working backwards: MySQL > PHP > Apache > OS > network. Look at the timestamp and the error at the moment of failure at every layer, then go back in time a little bit. Does your hosting service provide network logs/stats? See if you can get that data as well.
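
A very rough sketch of what "starting backwards" might look like at the shell right after a failure (the log paths assume a Debian/Ubuntu-style layout and are only guesses; substitute your own):

    # MySQL layer: anything logged around the crash? (connection-limit and aborted-connection errors land here)
    sudo tail -n 200 /var/log/mysql/error.log

    # PHP/Apache layer: fatal errors, memory_limit and max_execution_time hits usually end up here
    sudo tail -n 200 /var/log/apache2/error.log

    # OS layer: did the kernel OOM killer fire, is the box swapping?
    sudo dmesg | grep -iE 'oom|killed process'
    sudo tail -n 200 /var/log/syslog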

Also, have you heard of New Relic? They have a free version of their diagnostic tool: check out http://newrelic.com . They also seem to be running a 7-day promo of their "Gold" tier -- might help ... ?

Good luck!

KM
  • I took a look at the New Relic program and it's pretty cool; the price, not so much. The free version is pretty limited compared to the Bronze one. It does hit the sweet spot, though: it tells you which pages are loading slower than others, and it keeps track of the database queries. – amateur barista Mar 04 '11 at 21:17
  • Neither the MySQL, PHP, nor Apache log says how much time a page took to load. It's all data scattered around, without any context. I like how newrelic.com ties this all together and tells you the speed of each page (which you can then use to benchmark troublesome pages). Is there anything like this that is open source? – amateur barista Mar 04 '11 at 21:24
  • @amateur barista: For slow loading pages, if you haven't already, add "%T" to Apache's LogFormat string -- this will record the time taken to serve each request, in seconds -- might give you some insight on slow pages (see the sketch after these comments) ... ? http://httpd.apache.org/docs/2.0/mod/mod_log_config.html#formats – KM. Mar 04 '11 at 21:41
  • That sounds more like what I'm looking for; that will tell me, for instance, which pages are the slowest. I can drill down from there with a profiler/debugger. Thanks. – amateur barista Mar 04 '11 at 22:47
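
To spell out the %T suggestion from the comments (a minimal sketch; the log name, path, and format are assumptions, and %D gives microsecond resolution on Apache 2.x if whole seconds are too coarse):

    # In the Apache config, append %T to a LogFormat line, e.g.:
    #   LogFormat "%h %l %u %t \"%r\" %>s %b %T" timed
    #   CustomLog /var/log/apache2/access_timed.log timed
    # After some traffic, pull out the slowest requests (seconds, then path);
    # the field positions match the example format above.
    awk '{print $NF, $7}' /var/log/apache2/access_timed.log | sort -rn | head -20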