I am trying to measure the ping of a socket connection.
Like so:
Server: Sends ping to client
Client: Receives ping and encodes current time as response
Server: Decodes the response and calculates ping = current time - response time
In theory this should give me a fairly accurate measurement of the time it takes for data to travel from the client -> server.
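Roughly, the exchange looks like this (a simplified sketch, not my exact code; the port and the DataStream framing are just illustrative):

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.time.Instant;

// Server: sends a one-byte "ping", reads the client's epoch millis,
// and computes (server now - client reply time).
class PingServer {
    public static void main(String[] args) throws Exception {
        try (ServerSocket listener = new ServerSocket(9000);
             Socket client = listener.accept();
             DataOutputStream out = new DataOutputStream(client.getOutputStream());
             DataInputStream in = new DataInputStream(client.getInputStream())) {
            out.writeByte(1);                              // ping
            out.flush();
            long clientMillis = in.readLong();             // client's Instant.now().toEpochMilli()
            long delta = Instant.now().toEpochMilli() - clientMillis;
            // Only meaningful if both system clocks agree; any skew lands in delta.
            System.out.println("client -> server (ms): " + delta);
        }
    }
}

// Client: waits for the ping, replies with its current epoch millis.
class PingClient {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("localhost", 9000);
             DataInputStream in = new DataInputStream(socket.getInputStream());
             DataOutputStream out = new DataOutputStream(socket.getOutputStream())) {
            in.readByte();                                 // wait for ping
            out.writeLong(Instant.now().toEpochMilli());   // encode current time
            out.flush();
        }
    }
}
```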
The ISSUE is that the encoded time (millis) from the client (a Linux VM) is ~4s earlier than the time the server cached when it sent the ping.
It would appear that Instant.now() is returning inconsistent results across machines.
I've confirmed this by simply outputting Instant.now().toEpochMilli().
Running both tests at the "same time", the time on the VM is several seconds behind. What is going on here?
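The check is essentially just this, run on each machine at roughly the same moment:

```java
import java.time.Instant;

// Print epoch millis; compare the outputs across the two machines.
public class ClockCheck {
    public static void main(String[] args) {
        System.out.println(Instant.now().toEpochMilli());
    }
}
```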
CodePudding user response:
A VM typically takes its time from its host by default, so first check the clock on the host machine.
More generally, the server and client are two different machines, and their clocks are not in sync. Check the time on both machines and correct whichever is wrong (e.g. by enabling NTP synchronization). This has nothing to do with Java: Instant.now() simply reads the system clock, so if the clocks disagree, so will the measurements.
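If the goal is latency rather than wall-clock comparison, one way to sidestep the sync problem entirely is to measure the round trip with a single clock. A minimal sketch (an alternative approach, not a fix for the clocks), assuming the client simply echoes one byte back for every byte it receives:

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class RoundTrip {
    // Measures round-trip time using only the caller's clock, so the two
    // machines' clocks never need to agree.
    static long roundTripMillis(DataOutputStream out, DataInputStream in) throws IOException {
        long start = System.nanoTime();   // monotonic clock, ideal for elapsed time
        out.writeByte(1);                 // ping
        out.flush();
        in.readByte();                    // pong (client echoes the byte)
        return (System.nanoTime() - start) / 1_000_000;
    }
}
```

One-way latency is then roughly half the round trip, assuming a symmetric network path.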