
I have noticed that a new server I am setting up is noticeably slower than the current server, so I ran some stress tests/benchmarks to study the problem.

But the tests are giving me contradictory results.

The machine details:

CentOS-6.4 (i386)
Apache 2.4.4
PHP 5.4.17
MySQL 5.6.12
8GB RAM
No cache

This is a Joomla site.

One note: This machine is currently only accessed via VPN (this may be relevant to the slowness)
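
To check how much of the gap is the VPN itself, one approach is to time the bare TCP handshake separately from the full HTTP round trip: if the handshake already eats most of the time, the network is the problem; if not, it's the stack. A minimal sketch (it spins up a local stand-in server so it runs as-is; for the real measurement, point HOST/PORT/URL at the new machine over the VPN):

```python
import socket
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Local stand-in server so the snippet is self-contained; replace
# HOST/PORT/URL below with the new machine's address over the VPN.
class _Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
HOST, PORT = server.server_address
URL = "http://%s:%d/" % (HOST, PORT)

def connect_time(host, port):
    """Time the bare TCP handshake: network/VPN latency only."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass
    return time.perf_counter() - start

def request_time(url):
    """Time a full HTTP round trip: latency plus server processing."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=15) as resp:
        resp.read()
    return time.perf_counter() - start

net = connect_time(HOST, PORT)
full = request_time(URL)
print("TCP connect: %.1f ms, full request: %.1f ms" % (net * 1000, full * 1000))
server.shutdown()
```

If the connect time accounts for most of the total, the VPN is the culprit; if the gap is mostly inside the request itself, the server stack is. ab reports the same split as its "Connect" and "Processing" rows.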

I restarted httpd and mysql before doing the tests.

The tests:

  1. Browsing:

    It is slower (not just a feeling: it is measurably slower when browsing, with any browser (Safari, Firefox, Chrome, IE). I didn't time it, but it is slower than the current site, even with no other users on it).

  2. Debug via Joomla:

    I turned on debug for the homepage, and it takes an average of 1.294 seconds to render (according to the debug log), whereas on the current server (same configuration but with CentOS 6.3 and PHP 5.3) it takes an average of 0.762 seconds.

  3. ab:

    I tried 5 concurrent users doing 1000 requests; this was run from another machine.

    I really struggled with this one, because for a static text file or a simple `echo` in a PHP file it gave me this:

    Connection Times (ms)
                  min  mean[+/-sd] median   max
    Connect:        3    5   1.7      5      17
    Processing:     4    7   1.5      6      20
    Waiting:        4    6   1.5      6      19
    Total:          8   11   2.5     11      27
    

    And the Joomla homepage:

    Connection Times (ms)
                  min  mean[+/-sd] median   max
    Connect:        0    0   0.3      0       9
    Processing:   321  423 115.8    403    1737
    Waiting:      309  406 114.0    386    1706
    Total:        322  423 115.9    403    1737
    

    Anyway, this is faster than the >1 second I was getting when browsing (test 1).

  4. jmeter (2.9):

    I made a test with 5 concurrent users, doing 100 requests with 2 seconds ramp-up.

    A static page gave me this average:

    label   # Samples   Average   Median   90% Line   Min   Max   Error %   Throughput          KB/sec
    TOTAL   500         8         8        11         7     21    0.0       92.66123054114159   36.55051195329874
    

    And the homepage gave me this:

    label   # Samples   Average   Median   90% Line   Min   Max   Error %   Throughput           KB/sec
    TOTAL   500         445       436      507        366   969   0.0       10.331645831180907   856.8125096859179
    
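
For what it's worth, the kind of numbers ab and jmeter report can be cross-checked with a small hand-rolled harness. A sketch (it targets a local stand-in server so the snippet is self-contained; swap URL for the Joomla homepage to reproduce the figures above):

```python
import statistics
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor
from http.server import BaseHTTPRequestHandler, HTTPServer

# Local stand-in page; replace URL with the real homepage to test it.
class _Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"hello")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
URL = "http://%s:%d/" % server.server_address

CONCURRENCY, REQUESTS = 5, 100  # roughly: ab -c 5 -n 100

def timed_get(_):
    """Fetch URL once and return the elapsed time in milliseconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=10) as resp:
        resp.read()
    return (time.perf_counter() - start) * 1000

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    times = list(pool.map(timed_get, range(REQUESTS)))

print("min %.1f  mean %.1f  median %.1f  max %.1f (ms)" % (
    min(times), statistics.mean(times),
    statistics.median(times), max(times)))
server.shutdown()
```

If a harness like this agrees with ab and jmeter but disagrees with the browser experience, the difference is in what browsers do on top of the raw request.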

Finally, the question:

Why is browsing so slow, when the stress tests (both ab and jmeter) gave much faster results? (Are these tests adequate, or should I try something else?)

(I don't know if the VPN is at fault here, but that wouldn't explain the difference between browsing speed and test speed, since both go through the VPN.)

jackJoe

3 Answers


I suspect the answer lies in the things that browsers do, that jmeter and ab don't.

Namely cookies and JavaScript (and images, to some extent).

Similarly, I'd argue that your benchmarks aren't really representative of real browsing.

When I visit a site, I don't go to the homepage and reload it over and over; I click through lots of different pages. The important distinction here is that if your server is set up correctly, much of the generated PHP opcode, page fragments, etc. ends up cached in memory, so the first hit is slow, but all subsequent ones are fast.

You should try to find a way to simulate "real world" browsing. One idea off the top of my head is to run Selenium IDE, record your keystrokes and clicks while browsing, and then replay them time and again, across multiple hosts.

The VPN might be responsible, but in theory it should add a similar overhead to both kinds of client, ab et al. and browsers alike.

I suspect that if you run tcpdump/wireshark/ngrep etc. on your server while you ab it, you'll find that far fewer page assets are loaded than when you do the same with a browser.
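
For example, a quick way to see what ab is skipping is to count the embedded assets in the HTML it downloads. A rough sketch (the markup below is made-up; feed it the real homepage source):

```python
from html.parser import HTMLParser

class AssetCounter(HTMLParser):
    """Collects the extra URLs a browser would fetch after the HTML."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("href"):
            self.assets.append(attrs["href"])

# Made-up sample markup; in practice, feed in the homepage HTML.
html = """<html><head>
<link rel="stylesheet" href="/templates/site/css/template.css">
<script src="/media/system/js/mootools.js"></script>
</head><body><img src="/images/logo.png"></body></html>"""

parser = AssetCounter()
parser.feed(html)
print("%d embedded assets a browser fetches but ab does not:" % len(parser.assets))
for url in parser.assets:
    print(" ", url)
```

Each of those assets is an extra request (and, for js, extra client-side work) that ab and jmeter in their default setup never see.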

Tom O'Connor
  • Thanks for the reply. Testing the homepage was just an example (for the sake of simplifying this question); the full test includes several other pages. I agree that `jmeter` and `ab` may not be reliable as a "true" browsing test, but the Joomla log, for example, is reliable in terms of speed, and it is slower than on another, less powerful server. That puzzles me, especially when `jmeter` and `ab` *prove* that it is indeed faster. – jackJoe Sep 16 '13 at 14:40

Check your page with a tool like YSlow; that might give a hint as to where the time lags begin.

And I'd always check for MySQL slow queries and use a tuning script to evaluate whether your bottleneck is the database. If not, then (as you already did) check the bare performance of your webserver with a simple static file; then with ab; then investigate the whole page with a browser and some webmaster plugins.
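
For the slow-query part, a minimal my.cnf fragment for MySQL 5.6 would look something like this (the log path and the 1-second threshold are just illustrative, tune them to taste):

```ini
[mysqld]
# log statements that take longer than long_query_time seconds
slow_query_log      = 1
slow_query_log_file = /var/log/mysql/slow.log
long_query_time     = 1
```

Then feed the resulting log to a tuning script such as mysqltuner or pt-query-digest to see whether the database is where the time goes.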


As Tom said, you should simulate real browser behavior. For JMeter it's quite easy:

  1. Add HTTP Cookie Manager
  2. Add HTTP Cache Manager
  3. Add HTTP Header Manager, and configure the User-Agent of the browser you usually use.
  4. In HTTP Request or HTTP Request Defaults, under the Optional tasks settings, check "Retrieve All Embedded Resources from HTML Files" to simulate a browser retrieving embedded resources (such as gifs, css, js, etc.).
  5. In the same Optional tasks settings, check "Use concurrent pool. Size: n" and use 2-4 threads.
Jay