Today, while writing new Date(), I stumbled across a very interesting method, getTime(), and looked it up on Baidu. The answers say it returns the number of milliseconds from January 1, 1970 to now.
Why is it 1970?
new Date().getTime();
// xxxxxxxxxxx (the number of milliseconds elapsed since 1970-01-01 00:00:00 UTC)
This goes back to the birth of Unix. Unix was developed in 1969 and officially released in 1971; before then, no machine had any need to represent a time earlier than 1970-01-01 00:00:00. Many later languages adopted the same convention, and JavaScript simply followed suit.
Of course, this choice has drawbacks today: for example, it is awkward for representing earlier dates, and the precision is limited.
With time defined as starting from January 1, 1970, it suddenly occurred to me that in Java, and in Oracle databases too, time is counted from January 1, 1970.
For example, in Java:
import java.util.Date;

Date date = new Date(0);
System.out.println(date);
// Prints: Thu Jan 01 08:00:00 CST 1970
That is January 1, 1970, and the actual time of day is 0:00:00 (it prints as 8:00, which I will explain at the end).
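As a quick sanity check of my own (not from the original post), the getTime() value of this Date is exactly 0, i.e. zero milliseconds after the epoch; the class name EpochCheck here is just for illustration:

import java.util.Date;

public class EpochCheck {
    public static void main(String[] args) {
        Date epoch = new Date(0);
        // getTime() returns milliseconds since 1970-01-01 00:00:00 UTC,
        // so for the epoch itself it is exactly 0
        System.out.println(epoch.getTime()); // 0
    }
}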
Why was this time defined as January 1, 1970?
So I started Googling, but couldn't find the answer on Chinese web pages. I then tried English keywords and finally found a relevant post on the Sun Java forum:
http://forums.sun.com/thread.jspa?threadID=595140&start=15
There was a reply:
I suspect that Java was born and raised on a UNIX system.
UNIX considers the epoch (when did time begin) to be midnight, January 1, 1970.
In other words, Java was born and raised on UNIX systems, and UNIX treats midnight on January 1, 1970 as the start of time, the epoch.
But this still doesn't explain "why". Out of curiosity, I continued Googling and finally found the answer:
http://en.wikipedia.org/wiki/Unix_time
The explanation here is:
At the time, operating systems were 32-bit, and time was also stored in 32 bits, as a signed count of seconds.
System.out.println(Integer.MAX_VALUE);
// 2147483647
In Java, an Integer is 32 bits, so the largest value it can hold is 2147483647. A year contains 31536000 seconds (365 × 24 × 3600), and 2147483647 / 31536000 ≈ 68.1, so a 32-bit count of seconds can span at most about 68 years. Concretely, the maximum is reached at 03:14:07 on January 19, 2038. One second later, the counter on a 32-bit system wraps around to the bit pattern 10000000 00000000 00000000 00000000, which corresponds to 20:45:52 on December 13, 1901. Time jumps backwards, and a lot of software will misbehave.
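To see those two boundary instants for yourself, here is a minimal sketch of my own (the class name Year2038Demo is made up): it feeds the 32-bit limits into java.util.Date, which counts in milliseconds, hence the multiplication by 1000. The result prints in your local time zone, so the clock time shown may differ from the UTC values quoted above.

import java.util.Date;

public class Year2038Demo {
    public static void main(String[] args) {
        // The extremes of a signed 32-bit second counter
        long maxSeconds = Integer.MAX_VALUE; //  2147483647
        long minSeconds = Integer.MIN_VALUE; // -2147483648, bit pattern 10000000 00000000 00000000 00000000

        // java.util.Date works in milliseconds, so multiply by 1000
        System.out.println(new Date(maxSeconds * 1000L)); // 2038-01-19 03:14:07 UTC, shown in local time
        System.out.println(new Date(minSeconds * 1000L)); // 1901-12-13 20:45:52 UTC, shown in local time
    }
}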
At this point, I think the answer is clear:
Because a 32-bit count of seconds spans at most about 68 years, the earliest UNIX developers weighed the era in which the system was created against the expected lifetime of their applications and chose January 1, 1970 as the UNIX TIME epoch (the start of time), and Java naturally inherited this convention.
As for the time-wraparound problem, I believe it will gradually go away as 64-bit systems take over, because a 64-bit time value can reach as far as 15:30:08 on December 4 of the year 292,277,026,596. Even on the day the Earth is destroyed, our descendants will not have to worry about it running out, because that date lies hundreds of billions of years in the future.
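As a rough sanity check on that figure (my own back-of-the-envelope sketch, not from the article), dividing the largest signed 64-bit value, read as seconds, by the approximate number of seconds in an average year gives roughly 292 billion years after 1970, which lines up with the year 292,277,026,596 above:

public class SixtyFourBitRange {
    public static void main(String[] args) {
        long maxSeconds = Long.MAX_VALUE; // 9223372036854775807, a signed 64-bit counter of seconds
        long secondsPerYear = 31556952L;  // 365.2425 days, the average Gregorian year
        // Roughly 292 billion years after 1970
        System.out.println(maxSeconds / secondsPerYear + " years after 1970");
    }
}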
The last question:
In the code above, System.out.println(new Date(0)) prints 8:00 rather than 0:00. The reason is the difference between system time and local time: the underlying instant really is 0:00 (UTC), but my computer's time zone is set to GMT+8, so it is displayed as 8:00.
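To make the difference explicit, here is a small sketch of my own (the class name EpochTimeZoneDemo is made up) that formats the very same instant, new Date(0), once in UTC and once in GMT+8:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class EpochTimeZoneDemo {
    public static void main(String[] args) {
        Date epoch = new Date(0); // the instant 1970-01-01 00:00:00 UTC

        SimpleDateFormat utcFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        utcFormat.setTimeZone(TimeZone.getTimeZone("UTC"));

        SimpleDateFormat gmt8Format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        gmt8Format.setTimeZone(TimeZone.getTimeZone("GMT+8"));

        System.out.println(utcFormat.format(epoch));  // 1970-01-01 00:00:00
        System.out.println(gmt8Format.format(epoch)); // 1970-01-01 08:00:00
    }
}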