I implemented a TCP client-server model to measure the bandwidth between my machine and the server: the client sends a number of packets of different sizes, measures the RTT for each one, and then estimates the bandwidth with a linear regression over the results. Here is the server code:
import java.io.*;
import java.net.*;

public class Server implements Runnable {
    ServerSocket welcomeSocket;
    String clientSentence;
    Thread thread;
    Socket connectionSocket;
    BufferedReader inFromClient;
    DataOutputStream outToClient;

    public Server() throws IOException {
        welcomeSocket = new ServerSocket(6588);
        connectionSocket = welcomeSocket.accept();
        inFromClient = new BufferedReader(new InputStreamReader(connectionSocket.getInputStream()));
        outToClient = new DataOutputStream(connectionSocket.getOutputStream());
        thread = new Thread(this);
        thread.start();
    }

    @Override
    public void run() {
        while (true) {
            try {
                // echo every line straight back to the client
                clientSentence = inFromClient.readLine();
                if (clientSentence == null) {
                    break; // client closed the connection
                }
                System.out.println("Received: " + clientSentence);
                outToClient.writeBytes(clientSentence + '\n');
            } catch (IOException e) {
                e.printStackTrace();
                break;
            }
        }
    }

    public static void main(String[] args) throws IOException {
        new Server();
    }
}
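For reference, the rest of my Client class sets up the connection and the regression roughly like this (a simplified sketch; the field names match the method below, but the exact packet sizes and the commons-lang3/commons-math3 imports shown here are just what I assume to keep it self-contained):

import java.io.*;
import java.net.*;
import org.apache.commons.lang3.StringUtils;                      // used by getResponseTime() below
import org.apache.commons.math3.stat.regression.SimpleRegression;

public class Client {
    Socket clientSocket;
    DataOutputStream outToServer;
    BufferedReader inFromServer;
    SimpleRegression simpleReg;        // Apache Commons Math simple linear regression
    int[] sizes;                       // payload sizes to test, in characters
    int[] timeArray;                   // measured RTT per payload, in nanoseconds
    String sentence, modifiedSentence;

    public Client(String host) throws IOException {
        clientSocket = new Socket(host, 6588);                    // same port as the server
        outToServer = new DataOutputStream(clientSocket.getOutputStream());
        inFromServer = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
        simpleReg = new SimpleRegression();
        sizes = new int[] {1000, 2000, 4000, 8000, 16000, 32000}; // example values only
    }
}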
And this is the method in the Client class that returns an array with the RTT for each packet:
public int[] getResponseTime() throws UnknownHostException, IOException {
    timeArray = new int[sizes.length];
    for (int i = 0; i < sizes.length; i++) {
        // build a payload of sizes[i] '*' characters
        sentence = StringUtils.leftPad("", sizes[i], '*');
        long start = System.nanoTime();
        outToServer.writeBytes(sentence + '\n');
        modifiedSentence = inFromServer.readLine(); // wait for the echo
        long end = System.nanoTime();
        System.out.println("FROM SERVER: " + modifiedSentence);
        timeArray[i] = (int) (end - start);         // RTT in nanoseconds
        // x = RTT in seconds, y = payload size in bytes
        simpleReg.addData(timeArray[i] * Math.pow(10, -9), sizes[i] * 2); // each char is 2 bytes
    }
    return timeArray;
}
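After the loop, I read the bandwidth off the regression slope, roughly like this (simpleReg is a Commons Math SimpleRegression, so with x in seconds and y in bytes the slope should come out in bytes per second; the unit conversion is just how I print it):

double bytesPerSecond = simpleReg.getSlope();     // slope = dy/dx = bytes per second
System.out.println("Estimated bandwidth: "
        + bytesPerSecond / 1024.0 + " KB/s ("
        + bytesPerSecond * 8 / 1e6 + " Mbit/s)");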
When I take the slope, the estimated bandwidth comes out on the order of kilobytes per second, even though both machines are on the same network and the bandwidth should be much higher. What am I doing wrong?