
Hi, I have an Apache 2.2 server running PHP (Magento) scripts.

In normal times a PHP page renders in 1-2 seconds, which is OK.

Sometimes, at high traffic or when crude spider bots hit the site, all Apache slots get blocked.

The individual requests then run very slowly and use more and more memory, until MySQL's memory pressure triggers the OOM killer, which kills my memory-hungry Tomcat.

First I tried mod_evasive. But if I configure it too strictly I can't browse snappily, and if I configure it too loosely, requests can come in faster than they can be processed.
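The mod_evasive knobs in question look roughly like this (illustrative values only, assuming mod_evasive20 is loaded; the exact thresholds are exactly the problem being described):

```apache
# Illustrative mod_evasive settings only -- too low blocks normal
# browsing, too high lets request bursts through.
<IfModule mod_evasive20.c>
    DOSPageCount        5      # max requests for the same page per interval
    DOSPageInterval     1      # ...per this many seconds
    DOSSiteCount        50     # max requests to the whole site per interval
    DOSSiteInterval     1
    DOSBlockingPeriod   10     # seconds an offending IP stays blocked
</IfModule>
```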

The problem is the PHP files. Other resources, like images, don't clog the slots.

Second, I tried lowering MaxClients. But now a single client can occupy all slots.

Any idea how to limit the maximum connections per client, or better, the maximum simultaneous PHP scripts per IP?

How are other Apache servers configured to avoid accepting more requests than they can process, without favoring any client?

Tom O'Connor
wutzebaer

2 Answers


Put your site behind an origin-pull CDN (Cloudflare springs to mind as one option; there are others).

In normal times a PHP page renders in 1-2 seconds, which is OK.

No it's not.

Optimise your code so that it doesn't take up to two seconds to generate the page.

Then cache all the things.

  • Cache page requests with Varnish.
  • Cache database lookups with Memcached.
  • Cache images on a CDN.
  • Cache the PHP level with APC or eAccelerator (or a similar opcode cache).
  • Cache whole pages on a CDN too, for that matter.
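For the Varnish item, a minimal sketch of what a Varnish 3 era config could look like in front of a Magento shop -- the backend host/port and the `frontend` cookie check are assumptions, not a drop-in config:

```vcl
# Sketch: Varnish in front of Apache (assumed moved to port 8080).
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Don't cache sessions that carry a Magento frontend cookie
    # (logged-in users, carts) -- pass them through to Apache.
    if (req.http.Cookie ~ "frontend=") {
        return (pass);
    }
    # Strip cookies from static assets so they become cacheable.
    if (req.url ~ "\.(png|gif|jpg|css|js|ico)$") {
        unset req.http.Cookie;
    }
    return (lookup);
}
```

The practical effect is that anonymous page views and static files never reach a PHP slot at all, which is exactly the resource being exhausted here.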

If you haven't already, spin your database server off onto a separate server, give it a metric pantload of memory and uber-fast disks, then cache the hell out of your tables. Magento is a hog when it comes to joins and so on, so you'll need your database server to be painfully fast.

That way, when a user views your site, the content is pulled out of a cache somewhere rather than being generated every time.


I took the liberty of running YSlow against your site. Here's a summary of the findings (although you should do this yourself..)

  1. Make fewer HTTP requests. Basically, you should combine your JavaScript and CSS files into one each, to reduce the overhead of making multiple requests.
  2. Use a Content Delivery Network (CDN). I've already said this above, but YSlow tells me there are 60 static components not on a CDN. 1.4 MB of these are coming from www.brainyoo.de. Which brings me to..
  3. Move assets to multiple cookie-free domains. 60 assets are being loaded from the same domain as the rest of the site, which means the browser has to wait for each one to load before loading the next. A common way around this is to put static assets on separate, cookie-less domains (so the requests are smaller, faster, and don't carry cookie data).
  4. Move JavaScript to the bottom of the document. Considered best practice for scalable, fast websites.
  5. Minify JavaScript and CSS. Seriously. You're sending nearly a megabyte of JS and CSS that could be minified, and...
  6. Compress components with gzip. There are 39 plain-text components that could be gzipped.
  7. Add Expires headers. There are 63 static files without far-future expiration dates, so they're not being cached effectively by the browser.
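Points 6 and 7 are the cheapest wins, since they're pure Apache config. A sketch, assuming mod_deflate and mod_expires are enabled (the exact lifetimes are illustrative, not recommendations):

```apache
# Gzip plain-text responses on the fly.
AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript

# Far-future Expires headers for static assets, so browsers cache them.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png               "access plus 1 month"
    ExpiresByType image/jpeg              "access plus 1 month"
    ExpiresByType text/css                "access plus 1 week"
    ExpiresByType application/javascript  "access plus 1 week"
</IfModule>
```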

Total YSlow grade: D. I'm starting to wonder whether your problems are entirely server-side; the front end could do with a damn good fiddling to get your YSlow rating up. Because that way: a) less traffic is served from your Apache server; b) pages load quicker, so your visitors aren't locking up an Apache process for as long.

Tom O'Connor

generally, if magento runs slowly or kills your server every now and then, your server would run better with more RAM/CPU power. you can tune your server to run magento smoothly, but it takes a) some time and b) the necessary (server-side) resources.

how much db-tuning have you done? magento is a PITA but can be configured to be fast, given enough RAM.

use a tool like mysqltuner or the mysql tuning primer to tune your db.
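These tools mostly report on a handful of my.cnf variables. A sketch of the kind of knobs involved -- the sizes below are assumptions for illustration; the tuner's recommendations for your actual workload take precedence:

```ini
# Illustrative my.cnf fragment (MySQL 5.x era) -- sizes are guesses.
[mysqld]
innodb_buffer_pool_size = 2G     # biggest single win for Magento's InnoDB tables
query_cache_size        = 64M    # query cache (later removed in MySQL 8.0)
table_open_cache        = 1024
tmp_table_size          = 64M    # Magento's big joins spill to tmp tables
max_heap_table_size     = 64M
```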

and use as much cache as you can. i don't remember whether APC is supported; memcache is better.

you should also keep an eye on IOWAIT, which is a hint of very bad disk performance, which can have different causes.
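A quick way to eyeball this without installing anything (Linux only; `iostat` or `vmstat` give you the same number over time). This reads the cumulative iowait share from /proc/stat, where field 6 of the "cpu" line is iowait jiffies:

```shell
#!/bin/sh
# Print the share of CPU time spent in iowait since boot (Linux only).
awk '/^cpu /{
    total = 0
    for (i = 2; i <= NF; i++) total += $i
    printf "iowait: %.1f%% of CPU time since boot\n", 100 * $6 / total
}' /proc/stat
```

If that number (or the live one from `iostat -x 1`) is consistently high while the slots fill up, the bottleneck is the disks, not PHP itself.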


EDIT:

How are other Apache servers configured to avoid accepting more requests than they can process, without favoring any client? / I want to know how other Apaches handle the situation when they get more requests than they can process

usually you'll scale your setup depending on expected traffic. if your server goes down when spiders & crawlers hit it, either you have some very badly performing scripts or your server is too small.

when i expect 1000 users on my server, i use a server that can handle 5000 clients (and i monitor error.log for "MaxClients reached")

rate-limiting is possible, but you'll have the same experience as with mod_evasive. it really helps against things like DDoS, but NOT if you need to tune your setup.
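that said, the per-IP limit asked for in the question is what mod_qos does. a hedged sketch, assuming the module is installed and loaded -- the limits are guesses you'd have to tune:

```apache
# mod_qos sketch: cap per-IP concurrency instead of request rate.
<IfModule mod_qos.c>
    # at most 10 concurrent TCP connections per client IP
    QS_SrvMaxConnPerIP 10
    # at most 5 concurrent requests to URLs ending in .php
    QS_LocRequestLimitMatch "\.php$" 5
</IfModule>
```

unlike rate-limiting, this bounds how many slots one client can hold at once, so a single crawler can no longer occupy every worker.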

  • hi, i already use APC, and i know i could get more performance, but this isn't an answer to my question. I want to know how other Apaches handle the situation when they get more requests than they can process – wutzebaer Sep 06 '13 at 08:45
  • what you said ("The problem is the PHP files") is not correct. PHP is usually not the issue with Magento; db performance is. better performance means your website can handle more requests in the same time. you get better performance with tuning and more resources. for the rest, see my edit – that guy from over there Sep 06 '13 at 09:49
  • wutzebaer, how much RAM do you have on your server? – that guy from over there Sep 06 '13 at 09:57