When UNIX was first being written back around 1970, engineers at AT&T/Bell Labs needed a general-purpose function to provide the current time, so they wrote one and logically called it “time”. This function initially returned the current time as the number of 1/60th-second ticks since Jan 1, 1971, which caused the 32-bit counter to wrap after roughly 2.27 years. The function was patched several times, and eventually took its most enduring form, returning a 32-bit signed value holding the number of seconds since the epoch of Jan 1, 1970.
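Both limits are easy to verify with a little arithmetic. The following Python sketch (Python is used purely for illustration; the original function was of course written in assembly and later C) computes when each representation runs out of room:

```python
from datetime import datetime, timedelta, timezone

# A 32-bit counter of 1/60th-second ticks: how long until it wraps?
ticks_wrap_seconds = 2**32 / 60                    # ~71.6 million seconds
ticks_wrap_years = ticks_wrap_seconds / (365.25 * 24 * 3600)
print(f"60 Hz tick counter wraps after ~{ticks_wrap_years:.2f} years")
# → 60 Hz tick counter wraps after ~2.27 years

# A 32-bit *signed* count of seconds since the epoch: the last
# representable moment before the value overflows to negative.
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
last_moment = epoch + timedelta(seconds=2**31 - 1)
print("Signed 32-bit seconds counter overflows after", last_moment.isoformat())
# → Signed 32-bit seconds counter overflows after 2038-01-19T03:14:07+00:00
```

The second calculation is, of course, the origin of the “Year 2038” date that gives this problem its name.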
You might wonder why 32 bits were chosen to represent time instead of 64. After all, there’s no technical reason the UNIX designers couldn’t have chosen 64 bits from the start. The main reason was cost: at the time, computer memory and storage were extremely expensive and scarce, so limiting the representation of time to 32 bits seemed like an acceptable trade-off. In retrospect, it was clearly not the best choice. Those early designers surely must not have realized the magnitude of the problem their decision would cause. Either that, or they had a very warped sense of humor.
As the saying goes, the rest is history. AT&T/Bell Labs UNIX became a de facto standard, and most major vendors in the computer industry adopted the same representation of time as AT&T UNIX. These included all UNIX platforms, Linux, the IBM PC, DOS, all versions of Windows prior to the 64-bit versions, and the compilers and other tools that support these platforms.