
I send data to my server over a TCP socket every 5 seconds. How much data would be consumed in an hour at this sending rate?

Each time, a socket is opened and data is pumped through it from the client to the server. I am using a 3G GSM modem on the client side.

My message is ID1$Socket$Open$timestamp. All fields are strings.
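
For reference, a minimal sketch of what my client loop does (the host, port, and timestamp format here are placeholders, not my real values):

    import socket
    import time

    SERVER = ("example.com", 5000)  # placeholder host/port

    while True:
        # a new TCP connection is opened for every message
        with socket.create_connection(SERVER) as sock:
            message = "ID1$Socket$Open$" + str(int(time.time()))
            sock.sendall(message.encode("ascii"))
        time.sleep(5)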

bhuvan

1 Answer


ID1$Socket$Open$timestamp is 25 bytes, assuming an 8-bit character encoding (one byte per character) and that you are not sending any other data (headers, delimiters, etc.) between messages.

So, one message every 5 seconds is 60 / 5 = 12 messages per minute, which is 12 * 60 = 720 messages per hour. At 25 bytes per message, that is 720 * 25 = 18000 bytes per hour, plus overhead for TCP/IP headers and framing on each message, and ACKs for each TCP segment.
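
As a quick sanity check, here is that arithmetic in Python (the 40-byte figure is just the minimum IPv4 + TCP header size per segment, an assumption; real overhead also depends on options, ACKs, and connection setup/teardown):

    message = "ID1$Socket$Open$timestamp"
    payload = len(message)                 # 25 bytes at one byte per character

    msgs_per_hour = (60 // 5) * 60         # 12 per minute * 60 minutes = 720
    print(msgs_per_hour * payload)         # 18000 payload bytes per hour

    # Rough lower bound on overhead: 20-byte IPv4 + 20-byte TCP headers per
    # segment, no options, ignoring ACKs and connection setup/teardown.
    print(msgs_per_hour * (payload + 40))  # 46800 bytes per hour, minimum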

Remy Lebeau
  • Somewhere there is a paper that shows the maximum bandwidth utilization of TCP is either 83% or 87%, I forget which, the rest being headers and maybe ACKs. – user207421 Mar 31 '15 at 23:36
  • TCP alone doesn't provide sufficient information to calculate such. Now if you say TCP/IP over 10baseT, then you have a 1500 byte maximum MTU with minimum 20 byte TCP and IP headers so 1460 payload bytes. Then ethernet adds a 14-byte header, a 4-byte CRC, 7-byte preamble, and 1-byte start-of-frame. There is also a 12-byte inter-frame-gap. Thus, assuming no breaks you have 1460/1538 == 94.9%. ACKs take no additional space and bandwidth is usually bidirectional. Other network technologies have different framing and different MTU thus changing the maximum efficiency. – Brian White Apr 02 '15 at 03:10
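
For what it's worth, the efficiency figure in the last comment checks out; a quick calculation under the 10baseT framing sizes it quotes:

    mtu = 1500
    payload = mtu - 20 - 20            # minus IPv4 and TCP headers -> 1460
    frame = mtu + 14 + 4 + 7 + 1 + 12  # Ethernet header, CRC, preamble,
                                       # start-of-frame, inter-frame gap -> 1538
    print(payload / frame)             # 0.9493..., i.e. ~94.9% efficiency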