console.log(new Date().getTime());
<iframe name="sif1" sandbox="allow-forms allow-modals allow-scripts" frameborder="0"></iframe>
I have a silly question: in the modern era, is the value of JS new Date().getTime() always 13 digits long? Both JS and Java give timestamps of this length.
My issue is that I have a random JS string created with
function generateUniqueID() {
  // millisecond timestamp followed by a random number of up to 6 digits
  return new Date().getTime() + '' + Math.round(Math.random() * Math.pow(10, 6));
}
This string concatenates a millisecond timestamp with a random number. I need to compare 2 IDs to see which one was created at an earlier date. But the random part isn't always a 6-digit number, so I can't just strip the trailing 6 digits and treat the remainder as the timestamp. Instead, I can take the first 13 digits as the timestamp, since that seems to be the length of the Unix epoch time in milliseconds in the modern era.
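A minimal sketch of that comparison, assuming both IDs come from generateUniqueID above (extractTimestamp and isEarlier are just illustrative names):

// Extract the 13-digit millisecond timestamp from an ID and compare two IDs.
// Relies on getTime() staying 13 digits long.
function extractTimestamp(id) {
  return Number(id.slice(0, 13)); // first 13 characters = ms since the Unix epoch
}

function isEarlier(idA, idB) {
  return extractTimestamp(idA) < extractTimestamp(idB);
}

const a = generateUniqueID();
const b = generateUniqueID();
console.log(isEarlier(a, b)); // true if a was created in an earlier millisecond than b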
But is that reliable? When would getTime()
start giving 14 digits?
Also, if I look up Unix Epoch Time on Wikipedia, I see a 10-digit number as the current Unix time, not a 13-digit one: 1637093681.
https://en.wikipedia.org/wiki/Unix_time
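Presumably that is because Wikipedia shows Unix time in seconds while getTime() returns milliseconds; a quick check of the two:

console.log(Math.floor(Date.now() / 1000)); // 10-digit Unix time in seconds, e.g. 1637093681
console.log(Date.now());                    // 13-digit timestamp in milliseconds, e.g. 1637093681000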
Updated
I see the length changes between 2286-11-20 and 2286-11-21:
console.log(new Date('2286-11-20').getTime().toString().length);
console.log(new Date('2286-11-21').getTime().toString().length);
<iframe name="sif2" sandbox="allow-forms allow-modals allow-scripts" frameborder="0"></iframe>
CodePudding user response:
The timestamp is milliseconds since 1970-01-01 00:00:00 UTC (the same epoch as Unix time). Subtract the current timestamp from 10000000000000, the smallest 14-digit number, to get the number of milliseconds until the length rolls over to 14 digits, which is 8,362,906,319,000. Then divide that by the number of milliseconds in a year, roughly 31,557,600,000, to get the number of years until that happens.
This is about 265 years.
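The same arithmetic as a quick sketch (the 1637093681000 starting timestamp is taken from the question):

const now = 1637093681000;                      // timestamp corresponding to 1637093681 in the question
const limit = 10000000000000;                   // smallest 14-digit millisecond value
const msPerYear = 365.25 * 24 * 60 * 60 * 1000; // 31,557,600,000
console.log(limit - now);                       // 8362906319000 ms remaining
console.log((limit - now) / msPerYear);         // ≈ 265 years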