I'm writing a client-server application and I have a problem sending files: sometimes the transfer completes correctly, but very often it seems like the server (the receiver) receives fewer bytes than expected and keeps waiting for more data, even though the client has already finished sending the whole file.
This is the sender:
using System;
using System.IO;
using System.Net.Sockets;

public void sendFile(String path) {
    FileInfo fi = new FileInfo(path);
    long fileDim = fi.Length;
    //Console.ReadLine();
    // send the file size as a text line, then stream the raw file bytes
    writer.WriteLine(Convert.ToString(fileDim));
    socket.SendFile(path);
}
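I haven't shown how writer and socket are created; on the client they are set up roughly like this (simplified sketch; serverEndPoint stands for the server's IPEndPoint):

    // client side: connect, then wrap the socket in a NetworkStream
    // and build the line-oriented writer on top of it (roughly)
    Socket socket = new Socket(AddressFamily.InterNetwork,
                               SocketType.Stream, ProtocolType.Tcp);
    socket.Connect(serverEndPoint);      // serverEndPoint: the server's address (placeholder)
    StreamWriter writer = new StreamWriter(new NetworkStream(socket));
    writer.AutoFlush = true;             // assuming AutoFlush so the size line goes out right away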
This is the receiver:
public bool receiveFile(String path) {
    byte[] receivedData = new byte[1024];
    int bytesRead;
    // read the file size that the sender wrote as a text line
    long fileDim = Convert.ToInt64(reader.ReadLine());
    if (!Directory.Exists(Path.GetDirectoryName(path)))
        Directory.CreateDirectory(Path.GetDirectoryName(path));
    BinaryWriter bw = new BinaryWriter(File.Open(path, FileMode.Append));
    try {
        // keep reading raw bytes from the socket until fileDim bytes have arrived
        while (fileDim > 0) {
            bytesRead = socket.Receive(receivedData, receivedData.Length, SocketFlags.None);
            bw.Write(receivedData, 0, bytesRead);
            fileDim -= bytesRead;
        }
    }
    finally {
        bw.Close();
    }
    return true;
}
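Likewise, reader on the server is a StreamReader built over the accepted socket, roughly like this (simplified; listener is the bound, listening Socket):

    // server side: accept the connection and build the
    // line-oriented reader over the same socket (roughly)
    Socket socket = listener.Accept();   // listener: a bound, listening Socket (placeholder)
    StreamReader reader = new StreamReader(new NetworkStream(socket));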
If I uncomment the Console.ReadLine() in the sender (so that it pauses for a few seconds before sending the file), everything works. So it seems that if the sender is too fast, some data gets lost, but I know that is impossible over a TCP stream... so why does it happen?
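To reproduce the timing dependency without pressing a key, the pause could presumably be replaced with a fixed delay, for example (test code only; 3 seconds is an arbitrary value):

    //Console.ReadLine();
    System.Threading.Thread.Sleep(3000);   // arbitrary pause before sending the size line + file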
Please help me, thanks in advance.