I have a Java program which sends a UDP packet to a server when the user presses a button. The client then waits for a response from the server and aims to record the time in ms between when the UDP packet was sent and when the response was received.
I'm having a problem whereby the timing does not seem to be accurate. It works fine most of the time, giving a value of around 160ms (which is what I would expect). However, it sometimes enters a phase of giving values that are way too low (i.e. under 5ms).
I know the messages are being sent, as I can see the result appear on the server (and it is definitely more than a 1ms delay). This problem seems to occur if I spam the button many times.
My code is as follows:
    public String sendMessage(String message){
        long startTime = System.currentTimeMillis();
        sendData = message.getBytes();
        try{
            DatagramPacket sendPacket = new DatagramPacket(sendData, sendData.length, IPAddress, port);
            clientSocket.send(sendPacket);
            DatagramPacket receivePacket = new DatagramPacket(receiveData, receiveData.length);
            clientSocket.receive(receivePacket);
            String returnString = new String(receivePacket.getData());
            //arg1 message, arg2 - transmit time
            addConsoleLine(returnString, System.currentTimeMillis() - startTime);
            return returnString;
        }catch (Exception e){
            return "error";
        }
    }
It's possible your transmits and receives are overlapping (either because your sendMessage() is being invoked from more than one thread, or because a packet was dropped).
That is, you send the current request but receive the response to a previous request, which gives the illusion of a very fast response time.
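One way to guard against this is to tag each request with a sequence number and discard any reply whose tag doesn't match the request you just sent. Below is a minimal sketch of that idea; the class name, the "seq|message" wire format, and the loopback echo server standing in for your real server are all my assumptions, not part of your code. Note it also passes getLength() when decoding the reply, since the receive buffer can hold stale bytes from an earlier, longer datagram.

```java
import java.net.*;
import java.nio.charset.StandardCharsets;

public class SeqUdpClient {

    /** Sends one request tagged with a sequence number and returns only the
     *  reply carrying the same tag; late replies to earlier requests are
     *  skipped instead of being mistaken for a fast response. */
    public static String roundTrip(int serverPort, int seq, String message) throws Exception {
        try (DatagramSocket client = new DatagramSocket()) {
            client.setSoTimeout(2000); // don't block forever if the reply is lost
            byte[] out = (seq + "|" + message).getBytes(StandardCharsets.UTF_8);
            long start = System.currentTimeMillis();
            client.send(new DatagramPacket(out, out.length,
                    InetAddress.getLoopbackAddress(), serverPort));
            byte[] buf = new byte[1024];
            while (true) {
                DatagramPacket reply = new DatagramPacket(buf, buf.length);
                client.receive(reply);
                // use getLength(): the buffer may contain leftover bytes
                String text = new String(reply.getData(), 0, reply.getLength(),
                        StandardCharsets.UTF_8);
                if (Integer.parseInt(text.split("\\|", 2)[0]) == seq) {
                    // matched: only now is the timing meaningful
                    System.out.println("rtt=" + (System.currentTimeMillis() - start) + "ms");
                    return text;
                }
                // mismatched tag: stale reply to an earlier request, ignore it
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // stand-in echo server on localhost (assumes the real server echoes the tag back)
        DatagramSocket server = new DatagramSocket(0);
        Thread echo = new Thread(() -> {
            try {
                byte[] buf = new byte[1024];
                DatagramPacket p = new DatagramPacket(buf, buf.length);
                server.receive(p);
                server.send(new DatagramPacket(p.getData(), p.getLength(),
                        p.getAddress(), p.getPort()));
            } catch (Exception ignored) { }
        });
        echo.start();
        System.out.println(roundTrip(server.getLocalPort(), 1, "ping"));
        echo.join();
        server.close();
    }
}
```

The same fix also works on the server side: have the server echo whatever tag it receives, so the client never has to guess which request a datagram answers.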