
I have a hosted environment that is migrating to some new hardware. I need to stress-test this new implementation remotely. Is there a quick and clean way to do this? I don't need anything fancy, just page hits, and it doesn't need massive volume.

Thanks!

Craig
  • Went with the Web Application Stress Tool. Simple and straightforward. Thanks for the answers, guys!! – Craig Jun 09 '09 at 21:29

5 Answers

You can use the stress tools Microsoft provides:

either the Web Capacity Analysis Tool (WCAT, from the IIS Resource Kit) or the Web Application Stress Tool.

Jim B

I assume you have a Linux machine around. If you don't, install one. :)

This will time how long it takes to pull 1000 pages from the server:

time for i in `seq 1000`; do wget http://127.0.0.1/~elcuco/test.php; done
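
One caveat: each iteration saves the page to disk (test.php, test.php.1, and so on), so local disk writes can skew the timing. A variant of the same loop (still GNU wget) that discards the body instead:

time for i in `seq 1000`; do wget -q -O /dev/null http://127.0.0.1/~elcuco/test.php; done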

Now, how about concurrent loads? There is a utility called ApacheBench ("ab"); let's run a test:

ab -c 20 -n 100  http://127.0.0.1/~elcuco/test.php

This pulls 100 pages while keeping 20 downloads running concurrently. Here is a real-life demo; the output is self-explanatory.

[elcuco@pinky ~]$ /usr/sbin/ab -c 20 -n 100 http://serverfault.com/questions/22785/stress-testing-a-hosted-iis-server
This is ApacheBench, Version 2.0.40-dev <$Revision: 1.146 $> apache-2.0
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Copyright 2006 The Apache Software Foundation, http://www.apache.org/

Benchmarking serverfault.com (be patient).....done


Server Software:        Microsoft-IIS/7.0
Server Hostname:        serverfault.com
Server Port:            80

Document Path:          /questions/22785/stress-testing-a-hosted-iis-server
Document Length:        30691 bytes

Concurrency Level:      20
Time taken for tests:   19.642924 seconds
Complete requests:      100
Failed requests:        0
Write errors:           0
Total transferred:      3151576 bytes
HTML transferred:       3129271 bytes
Requests per second:    5.09 [#/sec] (mean)
Time per request:       3928.585 [ms] (mean)
Time per request:       196.429 [ms] (mean, across all concurrent requests)
Transfer rate:          156.65 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:      266  480 506.2    397    3328
Processing:  1227 3236 1559.4   2963    9943
Waiting:      287  495 277.6    443    1836
Total:       1499 3716 1613.2   3433   10285

Percentage of the requests served within a certain time (ms)
  50%   3433
  66%   4040
  75%   4321
  80%   4953
  90%   6056
  95%   6795
  98%   9139
  99%  10285
 100%  10285 (longest request)
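
If you would rather load the server for a fixed amount of time than a fixed number of requests, ab can do that as well. A minimal sketch (same placeholder URL as above; the numbers are only examples):

# run for 60 seconds at 20 concurrent connections,
# reusing connections with HTTP keep-alive (-k)
ab -k -c 20 -t 60 http://127.0.0.1/~elcuco/test.php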
elcuco

Here ya go. http://support.microsoft.com/kb/231282

MathewC

  • Application Center Test (ACT), if you have Visual Studio

  • the Web Application Stress tool, and if you want something non-Microsoft,

  • ANTS from Red Gate, which has done well for us in the past.

  • NUnitASP, if you want something free that will fire HTTP requests at a website.

    All of these let you build scripts that fire requests at a web server and measure the response time. I like ACT because its runner shows an instantaneous readout of requests/second as the script runs.
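
If you just want to eyeball response times without installing any of these, a minimal sketch with curl (the URL is a placeholder) prints the status code and total time for each request:

# fire 100 sequential requests, printing status and total time for each
for i in $(seq 100); do curl -o /dev/null -s -w '%{http_code} %{time_total}s\n' http://www.example.com/; done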

MikeJ

Multiple parallel "wget --mirror" instances? >smile< Can't get much simpler than that.
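
A minimal sketch of that idea (the host is a placeholder; each instance mirrors into its own directory so they don't clobber each other):

# launch 4 mirror crawls in parallel, one output directory each
for i in $(seq 4); do
  wget --mirror -q -P crawl-$i http://www.example.com/ &
done
wait   # block until all the crawls finish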

Without knowing more about the site's use of server-side resources (database queries, etc.), it's difficult to give you a "generic" test idea that's very concrete. There are lots of HTTP load-testing tools out there, commercial and open source. If you know which pages generate the most server-side load, you could focus on those pages with one of those tools.

Evan Anderson