Using the metrics shown below, I applied a workload modeling formula (Little's Law) to arrive at what I believe are the correct settings to adequately load test the application in question.
From Google Analytics:
- Users: 2,159
- Pageviews: 4,856
- Avg. Session Duration: 0:02:44
- Pages / Session: 2.21
- Sessions: 2,199
The formula is N = Throughput * (Response Time + Think Time), where N is the number of concurrent users.
- We calculated Throughput as 1.35 pages per second (4,856 pageviews / 3,600 seconds in the peak hour)
- We calculated (Response Time + Think Time) as 74.21 seconds per page (164 seconds avg. session duration / 2.21 pages per session)
Using the formula, N = 1.35 * 74.21 ≈ 100.
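For reference, here is the same arithmetic as a small Python sketch (the variable names are my own, and I'm treating the pageview count as the peak-hour figure from the Google Analytics data above):

```python
# Little's Law: N = Throughput * (Response Time + Think Time)
pageviews = 4856             # pageviews during the peak hour (Google Analytics)
avg_session_seconds = 164    # 0:02:44 average session duration
pages_per_session = 2.21     # pages / session

throughput = pageviews / 3600                            # ~1.35 pages per second
time_per_page = avg_session_seconds / pages_per_session  # ~74.21 s (response + think time)

n_concurrent_users = throughput * time_per_page          # ~100
print(f"Throughput:            {throughput:.2f} pages/s")
print(f"Response + think time: {time_per_page:.2f} s")
print(f"N (concurrent users):  {n_concurrent_users:.0f}")
```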
Therefore, according to my calculations, we can simulate the load the server experienced during the peak hour of the peak day with 100 virtual users running the business processes at a pacing of roughly 75 seconds per iteration (no separate think time, since the pacing already accounts for it).
So, to determine how the system responds under a heavier-than-normal load, we can double (200 users) or triple (300 users) the value of N and record the average response time for each transaction.
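To make the pacing concrete, here is a rough sketch of how this could be set up in Locust (just one possible tool, not something prescribed above; the host and endpoint are placeholders for the real business process):

```python
# Rough sketch assuming Locust as the load tool (any tool that supports pacing would work).
# Run the baseline and the heavier scenarios with:
#   locust -u 100   (1x N)
#   locust -u 200   (2x N)
#   locust -u 300   (3x N)
from locust import HttpUser, task, constant_pacing

class BusinessProcessUser(HttpUser):
    host = "https://example.com"  # placeholder for the application under test

    # Each virtual user starts a new iteration every ~75 seconds,
    # matching the (Response Time + Think Time) derived above.
    wait_time = constant_pacing(75)

    @task
    def business_process(self):
        self.client.get("/")  # placeholder for the real business process pages
```

With the pacing held at 75 seconds, doubling or tripling the user count roughly doubles or triples the offered throughput (about 2.7 and 4.0 pages per second by Little's Law), which is what makes the 200- and 300-user runs a heavier-than-normal load.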
Is this all correct?