I'm working on a project for sending large objects over the UDP protocol (written in C#), and it all works. The problem is that there is a difference in connection speed between the client and the server, which causes a loss of about 70% of the packets. The solution I'm working on is to sleep between sending packets to slow down the emission process. For that, I want to know whether there is any method for getting the optimal sleep time between two consecutive emissions, because I need the maximum possible speed. If you need more details, just ask.
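One way to implement the "sleep between sends" idea is to pace packets against a target byte rate rather than sleeping a fixed amount, so that the time spent in `Send` itself is not added on top of the delay. A minimal sketch, assuming a hypothetical `targetBytesPerSecond` that you would still have to tune (e.g. by trial, or from feedback sent back by the receiver); `SendPaced`, `packetSize`, and `remote` are illustrative names, not part of any existing API:

```csharp
using System;
using System.Diagnostics;
using System.Net;
using System.Net.Sockets;
using System.Threading;

class PacedUdpSender
{
    // Sends 'data' in chunks of up to 'packetSize' bytes, pacing so the
    // average rate does not exceed 'targetBytesPerSecond'.
    static void SendPaced(UdpClient client, IPEndPoint remote,
                          byte[] data, int packetSize, double targetBytesPerSecond)
    {
        // Ideal time budget per packet, in seconds.
        double secondsPerPacket = packetSize / targetBytesPerSecond;
        var clock = Stopwatch.StartNew();
        int sent = 0, packetIndex = 0;

        while (sent < data.Length)
        {
            int len = Math.Min(packetSize, data.Length - sent);
            byte[] chunk = new byte[len];
            Buffer.BlockCopy(data, sent, chunk, 0, len);
            client.Send(chunk, len, remote);
            sent += len;
            packetIndex++;

            // Sleep only for the unused part of this packet's time budget,
            // measured against wall-clock time since the first send.
            double due = packetIndex * secondsPerPacket;
            double ahead = due - clock.Elapsed.TotalSeconds;
            if (ahead > 0)
                Thread.Sleep(TimeSpan.FromSeconds(ahead));
        }
    }
}
```

Note that `Thread.Sleep` has roughly 1–15 ms granularity on Windows, so very high per-packet rates cannot be paced this way; in that case, sending small bursts per sleep interval is a common workaround. Finding the *optimal* rate still requires some feedback loop (e.g. the receiver reporting its loss rate), since the sender cannot observe loss on its own over UDP.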
Any process that requires 100% delivery is a bad choice for UDP. Use TCP instead. – Ross Patterson Jul 30 '13 at 10:55
It doesn't require 100%, just minimal loss. And I don't have a choice; I have to use UDP. This is my question: "For that, I want to know whether there is any method for getting the optimal sleep time between two consecutive emissions, because I need the maximum possible speed." – user2492258 Jul 30 '13 at 11:21