Background:
We have a web application that is also packaged as a hybrid mobile app. The phones access the application, which is hosted on public cloud infrastructure, over 4G / LTE SIM connections.
Problem at hand:
We need to calculate the minimum data speed required for this mobile app to meet an end-to-end response time of about 5 seconds. To meet that target, the mobile service provider's signal strength and data speed must be good enough.
What we have done:
- Measured the HTTP response sizes of all the pages and took the largest one, say 2000 KB. If the largest response fits within the budget, all smaller responses are covered as well.
- Allocated 1 second for the response to travel from the server to the mobile device after it is put on the wire: of the 5-second end-to-end budget, 1 second goes to the client-to-server request, 3 seconds to the backend processing the request and computing the response, leaving 1 second for the mobile client to receive the response (see the sketch after this list).
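
To make the budget arithmetic explicit, here is a minimal sketch (the variable names are just placeholders; the numbers are our own measurements):

```python
# Time budget for the end-to-end response (all values are our own measurements).
total_budget_s = 5.0     # required end-to-end response time
request_time_s = 1.0     # client-to-server request time
processing_time_s = 3.0  # backend processing / response computation

# Whatever remains is the time the mobile client has to download the response.
download_time_s = total_budget_s - request_time_s - processing_time_s
print(f"Download budget: {download_time_s} s")  # -> Download budget: 1.0 s
```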
What is the correct way to compute the data speed in kilobits or megabits per second, given the following inputs:
- Data size to download: 2000 kilobytes (KB)
- Time to download: 1 second
Dividing 2000 KB by 1 second gives 2000 KBps, but to get kbps, is it as simple as multiplying by 8 (converting bytes to bits)?
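
For concreteness, here is a sketch of the conversion I have in mind, assuming 1 KB = 1000 bytes (use 1024 if kibibytes are meant):

```python
# Convert a download size and time budget into the required link speed.
size_kB = 2000          # response size in kilobytes (assuming 1 KB = 1000 bytes)
download_time_s = 1.0   # time left for the client to receive the response

speed_kBps = size_kB / download_time_s  # kilobytes per second
speed_kbps = speed_kBps * 8             # kilobits per second (1 byte = 8 bits)
speed_Mbps = speed_kbps / 1000          # megabits per second (1 Mbit = 1000 kbit)

print(f"{speed_kBps:.0f} KBps = {speed_kbps:.0f} kbps = {speed_Mbps:.0f} Mbps")
# -> 2000 KBps = 16000 kbps = 16 Mbps
```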
I understand the practical factors that affect data speed, such as the number of devices in the usage area, signal coverage, and availability. Here we are only interested in computing the required data speed.
Can someone help with the correct method, please? Thanks!