Recently, I made a plea for the adoption of Coordinated Universal Time (UTC) in computer applications. It’s a sensible recommendation, and I stand behind it.
The folks working on HTML 5 are proposing a <time> element for the new standard. This makes sense to me. It will help eliminate some of the objections people have raised to the datetime design pattern proposed by the microformats team.
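As I understand the debate, the main objection to the microformats pattern is that it hides the machine-readable date in an abbr title, which some screen readers announce verbatim. A dedicated element sidesteps that. Here’s a rough sketch of the two approaches side by side; the markup is illustrative, and the draft’s exact attributes may still change:

```html
<!-- Proposed HTML 5 approach: a dedicated element carrying a
     machine-readable attribute alongside human-readable content -->
<p>Posted on <time datetime="1970-01-01">January 1st, 1970</time>.</p>

<!-- Microformats datetime design pattern: the ISO 8601 value rides
     along in an abbr title, which some screen readers may read
     aloud ("nineteen seventy dash zero one dash zero one") -->
<p>Posted on <abbr class="published" title="1970-01-01">January 1st, 1970</abbr>.</p>
```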
So, problem solved, right? We use UTC for time, and the usual calendar notation for dates. Neat.
Wait a minute. By “the usual” calendar notation, do we mean the modern Gregorian calendar, or…
PPK provides an overview of the major calendar reforms in the Western world and points out that the reforms were adopted by different countries at different times. Forming a consistent timeline therefore requires knowledge of both time and place.
And many important historical dates float. The rules that determine when Easter occurs in the church calendar are complicated, and the Orthodox and Catholic calendars have disagreed for centuries. In the medieval period, years were often numbered according to the local monarch’s reign. In Roman times, extra days were added to the official calendar by decree to prevent the seasons from drifting too far out of line.
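How complicated? The standard computus for Western Easter (the anonymous Gregorian algorithm, often credited to Meeus, Jones, and Butcher) is a dozen lines of opaque arithmetic, and it only answers for churches on the reformed calendar; the Orthodox computation runs on the Julian calendar and usually lands on a different Sunday. A sketch in Python:

```python
def gregorian_easter(year: int) -> tuple[int, int]:
    """(month, day) of Western Easter via the anonymous Gregorian
    algorithm. Valid only for years on the Gregorian calendar."""
    a = year % 19                        # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)             # century and year-within-century
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # relates to the Paschal full moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

print(gregorian_easter(2024))  # (3, 31) -- March 31, 2024
```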
If we want to make the <time> element safe for historical use, programmers will have to deal with this mess.
As useful as having universal, consistent <time> element metadata would be, that’s just too hard. Frankly, I skimmed the last bits of PPK’s article myself, even though I’m actually interested in the issue. Most working programmers won’t bother.
While it’d be nice to have trustworthy time data, we’re not likely to get it, and the standard should reflect that. I vote for assigning a cutoff date for the <time> element. January 1, 1970 (the Unix epoch) works for me.
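To be concrete about what I’m proposing, a conforming checker would be a few lines. This sketch is mine, not anything from the draft spec; the cutoff constant and function name are hypothetical:

```python
from datetime import date

# Hypothetical cutoff from the proposal above: the Unix epoch.
TIME_ELEMENT_CUTOFF = date(1970, 1, 1)

def is_valid_time_value(value: str) -> bool:
    """True if an ISO 8601 date string would be acceptable for a
    hypothetical cutoff-limited <time> element."""
    try:
        d = date.fromisoformat(value)
    except ValueError:
        return False  # not a well-formed YYYY-MM-DD date
    return d >= TIME_ELEMENT_CUTOFF

print(is_valid_time_value("1970-01-01"))  # True
print(is_valid_time_value("1969-07-20"))  # False: before the cutoff
print(is_valid_time_value("1752-09-10"))  # False, and a date that never
                                          # existed in the British calendar
```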