To check the throughput I have modified the Bluetooth chat example.
I added a Send button in the UI that sends a predefined number of bytes to the server socket and then waits for an acknowledgement.
The server socket waits for the data and, once it has received it, replies with an acknowledgement.
I calculate the throughput for this connection as follows:
I record the start time when the data is sent.
On receiving the acknowledgement I record the end time.
So throughput = (data sent size + ack received size) / time taken.
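The calculation above can be sketched as a small helper. The method name, the ack size, and the timing values are my own illustration, not taken from the BluetoothChat code; on Android the timestamps would come from something like SystemClock.elapsedRealtime() around the socket I/O.

```java
public class ThroughputCalc {

    /**
     * Round-trip throughput in kilobits per second:
     * (data sent + ack received) / time taken, as described above.
     */
    static double throughputKbps(long dataSentBytes, long ackBytes, long elapsedMillis) {
        long totalBits = (dataSentBytes + ackBytes) * 8;  // payload + ack, in bits
        double seconds = elapsedMillis / 1000.0;          // full round-trip time
        return (totalBits / seconds) / 1000.0;            // bits/s -> kbit/s
    }

    public static void main(String[] args) {
        // Hypothetical numbers: 10 KB payload, 16-byte ack, 120 ms round trip.
        System.out.println(throughputKbps(10 * 1024, 16, 120));
    }
}
```

Note that because the timer only stops when the ack comes back, any delay on the receiver (parsing, garbage collection, the ack write itself) is folded into the measured time.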
The results are:
dataSent      Throughput (kbit/s)
1 KB          ~200 Kbps
5 KB          ~560 Kbps
10 KB         ~688 Kbps
50 KB         ~512 Kbps
According to the data from the application, the behavior is that throughput is low for small payloads like 1 KB or 5 KB, increases up to around 40 KB, and then starts decreasing again after about 50 KB. I can also see some garbage collection happening on the receiving side, which adds to the delay.
I want to know whether this is the correct behavior. Why is throughput lower for small payloads like 1 KB or 5 KB, and why does it then increase? What factors should I consider that can add to the delay? Right now the throughput is calculated when the sender receives the acknowledgement from the receiver (a full round-trip measurement). Can I make it a single trip, i.e. send the data and have the receiver calculate the throughput once it has received everything? I tried this, but the clocks of the two phones differ by a few milliseconds, which gave me wrong results, sometimes even negative values. Please help me understand the correct behavior.
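One common way around the two-clock problem is to keep both timestamps on the sender's clock and estimate the one-way time as half the round trip. This is only an approximation: it assumes the link is roughly symmetric and that the ack is tiny compared to the payload. The sketch below is my own illustration with hypothetical numbers, not code from the BluetoothChat example:

```java
public class OneWayEstimate {

    /**
     * Estimated one-way throughput in kbit/s, derived from a round-trip
     * measurement taken entirely on the sender's clock (no cross-device
     * clock comparison, so no negative results from clock skew).
     */
    static double oneWayKbps(long dataSentBytes, long roundTripMillis) {
        // Assume the forward and return paths take roughly equal time,
        // so one-way time ~ RTT / 2.
        double oneWaySeconds = (roundTripMillis / 2.0) / 1000.0;
        return (dataSentBytes * 8 / oneWaySeconds) / 1000.0;  // bits/s -> kbit/s
    }

    public static void main(String[] args) {
        // Hypothetical: 10 KB payload, 120 ms measured round trip.
        System.out.println(oneWayKbps(10 * 1024, 120));
    }
}
```

The alternative, comparing a send timestamp on one phone with a receive timestamp on the other, only works if the clocks are synchronized to well below the transfer time, which is exactly what fails here.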