I'm developing a network support framework (initially for Android) that includes three basic services: Host Discovery, Network Communication, and QoS Monitor.
For the last service, I'm trying to implement a method that returns the maximum number of Messages Per Second (MPS) a single host can periodically send to another host.
Based on the size of the object to be sent and the network speed, I can easily get a rough estimate of the ideal MPS the network can carry. The problem appears when I try to include the Signal Strength (SS) in the equation.
protected int getMPS(NetworkApplicationData message, Context context) {
    int messageSizeBits = MemoryUtils.sizeOf(message) * 8;
    int networkSpeedMbps = getNetworkSpeed(context);
    float signalStrength = getNetworkSignalStrength(context);
    // FIXME: what about signalStrength?
    // 1 Mbps = 10^6 bits per second, so this is the ideal (best-case) MPS.
    return networkSpeedMbps * 1000000 / messageSizeBits;
}
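For context, the two helpers are thin wrappers around the Android Wi-Fi API. A minimal sketch, assuming a Wi-Fi connection and the ACCESS_WIFI_STATE permission (the normalization over 100 levels is my own choice):

import android.content.Context;
import android.net.wifi.WifiInfo;
import android.net.wifi.WifiManager;

protected int getNetworkSpeed(Context context) {
    WifiManager wifi = (WifiManager) context.getSystemService(Context.WIFI_SERVICE);
    WifiInfo info = wifi.getConnectionInfo();
    // Negotiated link speed in Mbps, e.g. 54 for 802.11g at full rate.
    return info.getLinkSpeed();
}

protected float getNetworkSignalStrength(Context context) {
    WifiManager wifi = (WifiManager) context.getSystemService(Context.WIFI_SERVICE);
    WifiInfo info = wifi.getConnectionInfo();
    // Normalize the RSSI (dBm) to [0, 1) over 100 discrete levels.
    return WifiManager.calculateSignalLevel(info.getRssi(), 100) / 100f;
}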
So the basic question here is: is there any established study on the impact of signal strength on the speed of a wireless network?
In some tests, I've noticed that the network speed reported by the API changes with the signal strength. For instance, with a "normalized" 100% SS, the Android API returns a network speed of 54 Mbps; with a 40% SS, it returns 7 Mbps. Should I rely only on the network speed value the Android API returns? In that case, I would mostly get an overestimated MPS.
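One option I'm considering, since the reported link speed already drops with the signal (rate adaptation), is to keep that value as the base and derate it with an empirical efficiency factor instead of folding the raw SS percentage into the formula. A sketch of that idea (the method name and the 0.5 constant are placeholders I would have to calibrate with real measurements):

// Fraction of the nominal link speed that typically survives as application
// throughput (MAC/PHY overhead, contention, retransmissions). Placeholder
// value; it would need to be calibrated against real measurements.
private static final float LINK_EFFICIENCY = 0.5f;

protected int getEffectiveMPS(NetworkApplicationData message, Context context) {
    int messageSizeBits = MemoryUtils.sizeOf(message) * 8;
    int networkSpeedMbps = getNetworkSpeed(context); // already signal-adapted
    // 1 Mbps = 10^6 bits per second; derate the ideal rate by the efficiency.
    return (int) (networkSpeedMbps * 1000000L * LINK_EFFICIENCY / messageSizeBits);
}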
In any case, I need to know what the correct approach to this problem is, and I would like to base the calculations on formal studies.