Y2K -- Disaster in a Byte

The Good Old Days -- The Computer Dark Ages

Here is the deal.  In the 1960's IBM introduced the IBM 360 and sold/leased thousands of these business computers.  In the 1970's they sold/leased thousands more of the next generation (IBM 370) using the same programming.  The same held for the 1980's.  By 1990, the world had gone to servers and networks and had a different culture.

But the culture of the IBM 360's hung on for a long time.  If I buy a PC today, it comes with 2-4 Gigabytes of memory and the internal registers are 64 bits (8 ASCII characters) wide.  In the 1960's, I grew up on an IBM 360-40 with 256K of memory and internal registers 32 bits wide.  The current PC runs 1000 instructions in a microsecond.  The 360-40 took 4 microseconds to execute one instruction.  Ratio: about 4,000 to 1.
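
For the arithmetic-minded, here is that ratio worked out from the numbers above; a quick sketch in Python, with variable names of my own choosing, just for illustration.

    # Back-of-the-envelope check of the speed ratio quoted above.
    pc_instr_per_usec = 1000               # modern PC: roughly 1000 instructions per microsecond
    s360_usec_per_instr = 4                # IBM 360-40: one instruction every 4 microseconds
    s360_instr_per_usec = 1 / s360_usec_per_instr
    print(pc_instr_per_usec / s360_instr_per_usec)   # 4000.0 -- about 4,000 to 1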

So.  Today we have memory to spare.  Speed to spare.  And graphics that make our TVs turn green with envy.  In the 1960's we fought for every second, every byte (character) of data space, and every operation.  Early computers worked either in decimal or in binary.  Binary is compact but requires conversion to user-readable decimal.  Decimal is great for people, hard on computers (it costs both space and operations).

The 360 had a new data format: packed decimal.  An 8-bit byte can hold a binary value between 0 and 255, while a character byte holds only a single digit, '0' to '9'.  If I want to store a date in character format, I need 8 bytes minimum (yyyymmdd) -- but I can read it without conversion.  If I want to store a date in binary notation, 4 bytes will give me a number big enough to cover the written history of the world, but it has to be converted before anyone can read the date in question.  Packed decimal gave programmers an out.  It put 2 decimal digits in each byte except the last, which held 1 digit and a sign flag.  Instead of 8 bytes, the date could now be placed in 4 bytes if you dropped the '19' at the front of the year.

Oh.  Why does readability matter?  People lived with memory dumps, sometimes inches deep.  If you had an error, you got a dump, and you read the dump to find the problem.  If you used binary data, you could not see the date causing the problem.  If you used decimal, even packed, you could find your records by date.
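
To make the byte-counting concrete, here is a small sketch in Python rather than the COBOL or assembler of the day.  The pack_decimal routine is my own illustration of the nibble layout, assuming the usual convention of a trailing sign nibble (hex C for positive).

    # A sketch of packed decimal: two digits per byte, sign nibble at the end.
    def pack_decimal(digits: str) -> bytes:
        nibbles = [int(d) for d in digits] + [0xC]   # 0xC marks a positive value
        if len(nibbles) % 2:                         # pad on the left to fill whole bytes
            nibbles.insert(0, 0)
        return bytes((nibbles[i] << 4) | nibbles[i + 1]
                     for i in range(0, len(nibbles), 2))

    # 'yymmdd' with the century dropped: 6 digits + sign = 7 nibbles -> 4 bytes.
    print(pack_decimal("671231").hex())     # 0671231c -- December 31, 1967, readable in a dump

    # Keeping the century costs an extra byte: 8 digits + sign -> 5 bytes.
    print(pack_decimal("19671231").hex())   # 019671231c

Four bytes instead of eight, and the digits are still visible in a hex dump -- exactly the trade the programmers of the day were making.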

Grace Hopper, USN, was the driving force behind a language for business usage: COBOL.  She became the data processing hero of the 20th Century.  The language was the proverbial camel of computer programming -- a horse designed by a committee.  It was verbose, and it defined data records and fields and automatic conversion of formats.  One little change from "Computational" (binary) to "Computational-3" got you packed decimal.  What a language.  Data manipulation and record handling became a snap.  The language, implemented by committee, also included some of the worst possible programming abilities.  Imagine a high-level language program with self-modifying code!

Commercial programmers were notoriously lazy and undereducated.  Undereducated?  Universities had just started turning out Computer Science degrees, and most of these, and I emphasize most, came from scientific and mathematical backgrounds.  At the University of Wisconsin, there was one 3-credit course in the entire Computer Science department devoted to non-mathematical programming, and that course spent almost all of its time on esoteric languages (SLIP, LISP, etc.).  Taking a school of business computer course was like taking Meteorology 101: a study of clouds and a reading of a novel about the weather.  And your CS advisor ridiculed your choice.

So business computer departments stole people from other company departments who wanted to become part of the secret empire.  Programmers were looked upon as magicians in those days, and they guarded that image jealously.  And what did they do?  They fought for data bytes and disk space and wanted programs they could debug easily.  They flocked to the packed decimal 4-byte date.  Proudly they had saved 4 bytes per date and could still read their memory dumps.

But now they had a problem.  With only two digits stored in the year portion, the computer could not tell 1989 from 2089, or 1900 from 2000.  In 1967, who thought their programs would still be in operation in the year 2000?  But the 360 configuration lasted into the 1990s without the programs being converted.  New techniques had evolved so that dates were now stored in better formats, but old programs and old computers last a long time.
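
Here is the failure in miniature, again in Python for readability; the age calculation simply stands in for the kind of date arithmetic buried in thousands of COBOL programs.

    # Two-digit years work fine until the century rolls over.
    birth_yy = 67                  # customer born in 1967, stored as the two digits '67'
    current_yy = 99                # the program runs in 1999
    print(current_yy - birth_yy)   # 32 -- the right age

    current_yy = 0                 # the same program runs in 2000; the year field holds '00'
    print(current_yy - birth_yy)   # -67 -- the customer has not been born yet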

This was really a disaster waiting to happen.  If the computing world had not jumped when it did, Y2K would have been the disaster predicted.  One solution was simple and took no more space: store the full year in binary notation rather than the truncated year in packed notation.  But this meant going through every COBOL program and making the conversion.  It also meant going through every data file and making the same conversion.  The two conversions are separate jobs, but they had to be cut over together, since the new programs could not use the old data and vice versa.  Another approach: add a subroutine call at each date reference.  If the two-digit year were smaller than 70, treat it as 20xx; otherwise treat it as 19xx.  There are some other methods, but you can understand these two easily.  The others are just variations on a theme.
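
The subroutine approach (usually called windowing) looks roughly like this.  A minimal sketch in Python, assuming the pivot year of 70 mentioned above; real shops picked their own pivot.

    PIVOT = 70   # assumed cutoff: two-digit years below 70 are treated as 20xx

    def expand_year(yy: int) -> int:
        # Expand a stored two-digit year into a full four-digit year.
        return 2000 + yy if yy < PIVOT else 1900 + yy

    print(expand_year(67))   # 1967
    print(expand_year(5))    # 2005

The catch, of course, is that the window only buys time: a pivot of 70 fails again for anything before 1970 or after 2069.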

But who to hire for this gigantic effort?  That is easy: hire the retired programmers who created the problem in the first place and pay them extravagant consulting fees and feel like you did your part.

This happened, and there was no disaster.  Had it not happened, there would have been disasters, although I think airplanes falling out of the sky is a little far-fetched.  We avoided a paralysis by paying for the same job twice.  Cheap at twice the price.

By the way.  The networking people at the universities referred to IBM as the center of the Computer Dark Ages, since its success came at the expense of two decades of delay in arriving at today's networking systems.

Questions?  Comments?  Push the Home/eMail above.
Written:  2007          Updated:  March 15, 2008