How saving a few bits almost sank the earth and we still learned nothing ;-)

Remember the Y2K problem? We all filled our bathtubs with drinking water and bought loads of candles before New Year's Eve 1999 because we feared the world would simply explode or something even worse. We're probably laughing about it today, but we weren't back then.

The whole reason for the near extinction of mankind and all life on earth was the need to save storage when writing bytes onto tapes. Back in the 1960s (or so), people decided that storing only the last two digits of a year would save a few bits per record, since all the numbers between zero and 99 fit into 7 bits, whereas you need a few more bits for the century (depending on how many years you plan for as a minimum/maximum).

When the end of the century came closer, people thought: well, we still have thousands of tapes and files with only one byte for the year, so let’s assume that if that number is smaller than, say, 30, it must mean a year in the 21st century and still keep that date format.
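That windowing trick can be sketched in a few lines. This is a minimal illustration, not code from any real system; the pivot value 30 and the function name are made up for the example:

```python
# Illustrative sketch of two-digit year "windowing": any value below the
# pivot is assumed to lie in the 21st century, everything else in the 20th.
PIVOT = 30  # arbitrary cut-off; real systems picked different pivots

def expand_two_digit_year(yy):
    """Map a two-digit year (0-99) to a four-digit year using a pivot."""
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return 2000 + yy if yy < PIVOT else 1900 + yy

# expand_two_digit_year(5)  -> 2005
# expand_two_digit_year(75) -> 1975
```

The flaw is built right in: the pivot encodes an assumption about what dates the system will ever see, and that assumption silently breaks for birthdays before 1930 or contracts running past 2029.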

This was okay for most purposes, like birthdays of people signing a contract, because very few people grew older than 99 years.

Beginning in the 1990s, some people worried that this couldn't go on forever and decided that it's better to invest another byte in the year than to risk the end of the planet. This would at least work for a few thousand years from now.

In languages like Smalltalk, there is an even better answer to that: Integers in Smalltalk are of arbitrary size, so you could just manage years as Integers (which internally may become LargeIntegers) and be safe until shortly after the next big bang or so.
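Python behaves the same way as Smalltalk here: its `int` type is arbitrary-precision, so a year stored as a plain integer can never overflow a fixed-width format. A tiny illustration:

```python
# Like Smalltalk's Integer/LargeInteger, a Python int grows as large as
# memory allows -- a "year" value is never clipped to a fixed bit width.
year = 10 ** 100          # a googol -- far beyond any 32- or 64-bit range
bits = year.bit_length()  # how many bits this number actually needs
print(bits)               # 333 -- no fixed-size field would hold it
```

The point is not that anyone needs googol-year dates, but that the overflow problem disappears entirely once the representation is not fixed-width.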

But our habits still haven't changed, even in the face of THE END: we're still talking about '08 or '05 and mostly mean 2008 or 2005, because nobody we talk to was alive back in 1908 or 1905.

Why am I writing such a long story which is nothing new to anybody?

Because Boris Popov just sent a bug report to the VisualWorks NC mailing list:

(Date newDay: 1 monthNumber: 1 year: 10) year

7.6 = 2010
7.7 = 10
Couple of things,

– this isn’t mentioned in release notes (but I did find AR 57002 Date>>newDay:monthNumber:year: converts years <100 in to the current century) and affects quite a chunk of existing code, at least for us

– the comment inside the method hadn't been updated to match the new functionality, it still says years since the beginning of the century are okay:

Answer with an instance of Date which is the day'th day of the month numbered monthNumber in the year'th year. The year may be specified as the actual number of years since the beginning of the Roman calendar or the number of years since the beginning of the current century.

I am *not* posting this here to bash any particular product or developer, just to point out how we sometimes seem unable to get rid of bad habits. Maybe this was even just a stupid bug with no real intention behind it.

But we still try to find smart algorithms for determining what a user is really talking about. Shouldn't we just change our habits and expect users to type in a full date? If a user tells me about a year 10, he's simply not giving me enough information. If he's an archaeologist, he might mean 1510, 10 AD or even 10 BC or 310 BC.

Of course, the above-mentioned problems with breaking existing code wouldn't go away, but on the other hand every approach to solving this problem will be wrong for either an insurance agent or a paleontologist. So in the end, whatever fix one might come up with here, it will always be insufficient unless it means that the user has to enter complete information.

Funny that we never really got used to talking in full dates after all the panic and fear back in 1999…