The Hitchhiker’s Guide to the Galaxy opens with the following quote:
Far out in the uncharted backwaters of the unfashionable end of the western spiral arm of the Galaxy lies a small unregarded yellow sun.
Orbiting this at a distance of roughly ninety-two million miles is an utterly insignificant little blue green planet whose ape-descended life forms are so amazingly primitive that they still think digital watches are a pretty neat idea.
Though the book was published three decades ago, the quote is still relevant today: not only because of our continuing enthusiasm for digital watches, in all their forms, but because our neat little clocks are still amazingly, stunningly primitive.
While some tech bloggers blame poor datetime programming practices, I think the problem is more fundamental than that. Why else would an established software platform vendor like Microsoft still be struggling to come up with a decent datetime implementation?
The problem is simply that computers can’t tell time. Or rather, that computers worldwide can’t agree on what time it is.
In the early prehistory of the Information Age, prior to 1970, computers were large, unwieldy assemblages of equipment; ENIAC filled an entire room. Mainframes and supercomputers, descendants of those early digital beasts, inherited a defining trait from them: each machine keeps its own time, on its own internal clock, and assumes that clock is correct.
Personal computers, even today, have yet to shake some of this legacy. The upshot is that your computer thinks its clock is always right. It never checks the position of the sun, or, better yet, with the International Bureau of Weights and Measures to get Coordinated Universal Time (UTC). As a result, most computers today rely on the accuracy of their internal clocks, which are typically set to local time, or to the time at the chip factory.
Some operating systems will adjust this for you. If you’re running Microsoft Windows, for example, you’ll default to U.S. Pacific Standard Time. Which is wrong for most of us on the planet, but you have to pick somewhere to start, right?
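You can watch this self-assurance in action. Here’s a minimal sketch using Python’s standard library: the timestamp your machine hands out by default is “naive,” carrying no time zone information at all, so the clock simply asserts itself with no frame of reference.

    from datetime import datetime, timezone
    import time

    # The default: a "naive" local timestamp with no time zone attached.
    # The machine is, in effect, declaring "my clock is right."
    local_now = datetime.now()
    print(local_now.tzinfo)          # None, no frame of reference at all

    # An "aware" timestamp, by contrast, states its reference explicitly.
    utc_now = datetime.now(timezone.utc)
    print(utc_now.tzinfo)            # UTC

    # The gap between the two is whatever offset the OS happens to believe.
    print(time.timezone / 3600)      # hours west of UTC, per local settings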
I know, you’re thinking, “All that crappy legacy left over from the mainframe days. Can’t we scrap it and move on?” Well, buckle up, friends, because there’s far more legacy thinking involved in timekeeping than that! Our system for time is based on work originally done in ancient Sumer.
The Sumerians used a sexagesimal number system, which is almost as interesting as it sounds. Programmers are used to the everyday decimal number system, the binary number system, and hexadecimal. But base-60 is a complicated system indeed, with many fascinating properties, including the fact that 60 is evenly divisible by each of the first six counting numbers.
So the reason why hours have 60 minutes and minutes have 60 seconds is all the Sumerians’ fault. As is the convention that a circle has 360 degrees.
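To see the Sumerians’ handiwork in your own code, note that every time you break a count of seconds into hours, minutes, and seconds, you’re doing base-60 positional arithmetic. A quick illustration in Python (the function name here is my own):

    # 60 divides evenly by each of the first six counting numbers,
    # which is what makes halves, thirds, quarters, fifths, and
    # sixths of an hour all come out whole.
    print([n for n in range(1, 7) if 60 % n == 0])   # [1, 2, 3, 4, 5, 6]

    def to_sexagesimal(total_seconds):
        """Split a count of seconds into base-60 'digits': (h, m, s)."""
        minutes, seconds = divmod(total_seconds, 60)
        hours, minutes = divmod(minutes, 60)
        return hours, minutes, seconds

    print(to_sexagesimal(4503))   # (1, 15, 3), i.e. 1:15:03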
And yes, they tried hard, but 360 is not quite the number of days in a calendar year. They figured it was close enough. Centuries of accumulated drift later, though, the calendar had slipped so far out of step with the sun that Pope Gregory XIII had to delete ten days from October 1582 to square things up, making April Fools of many Catholics in the process.
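The back-of-the-envelope arithmetic behind that deletion is simple enough. The Julian calendar’s average year of 365.25 days overshoots the actual tropical year (about 365.2422 days) by a hair, and Gregory’s reform measured the drift from the Council of Nicaea in the year 325:

    # Julian calendar year vs. the tropical (solar) year, in days.
    julian_year = 365.25
    tropical_year = 365.2422

    drift_per_year = julian_year - tropical_year   # about 0.0078 days/year
    years_elapsed = 1582 - 325                     # Nicaea to Gregory's reform
    print(drift_per_year * years_elapsed)          # roughly 9.8: the ten days dropped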
My point? Math is hard; converting between different numeric systems makes it harder; and when real-life calendars don’t quite fit into a neat mathematical box, you’ve got the makings of a complicated system that’s ripe for failure. So fail we do, and often!
In the interest of making one tiny, tiny optimization in this all-too-complicated matter of time (and we haven’t even factored in special or general relativity yet!), I propose that we all adopt UTC immediately. Forget daylight saving time, forget time zones: datetime calculations are troublesome enough without having to worry about localization on top of everything else.
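What would that look like in practice? A minimal sketch, again in Python’s standard library: produce, store, and exchange timestamps in UTC, and localize, if you must, only at the final moment of display.

    from datetime import datetime, timezone

    # Record time in UTC: unambiguous anywhere on the planet.
    event_time = datetime.now(timezone.utc)
    print(event_time.isoformat())   # e.g. 2009-06-01T12:34:56.789012+00:00

    # An ISO 8601 string in UTC round-trips between machines with no
    # time zone guesswork on either end.
    parsed = datetime.fromisoformat(event_time.isoformat())
    assert parsed == event_time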
We now have a globe-spanning GPS system, constantly beaming timing signals to a wide array of portable devices. Our laptops, workstations, minicomputers and mainframes ought to use these as well. Let’s all synchronize our neat digital watches.
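The plumbing for this largely exists already: NTP servers, themselves disciplined by reference clocks such as GPS, will tell any machine how wrong its clock is. Here’s a minimal sketch using the third-party ntplib package (the server and the tolerance threshold are my own arbitrary choices):

    import ntplib   # third-party: pip install ntplib

    # Ask a public NTP server what time it really is.
    client = ntplib.NTPClient()
    response = client.request("pool.ntp.org", version=3)

    # response.offset estimates, in seconds, how far this machine's
    # clock is from the server's.
    print(f"Local clock is off by {response.offset:+.3f} seconds")

    if abs(response.offset) > 1.0:   # arbitrary tolerance, for illustration
        print("Time to synchronize that neat digital watch.")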