The Trouble with Gerrold: Dead languages
October 17, 2012
The world is not going to end on Dec. 21, 2012.
The operative meme here is that the great stone wheel of the Mayan calendar ends on that date. Therefore, the world is going to end.
Actually, the Mayans had a 400-year calendar cycle very much like our own. We add one day to our calendar every four years. Every hundred years, we don’t add a leap day, but every 400 years, we do. So 1900 didn’t have a leap day, but 2000 did. We diddle the calendar like this to keep it in sync with the actual progression of the Earth around the sun, so that our solstices and equinoxes always occur on the same day of our calendar.
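The leap-day rule described above is easy to state in code. Here is a minimal sketch in Python (the function name is mine, not anything from the column):

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: add a leap day every 4th year,
    skip it every 100th year, but restore it every 400th year."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 1900 fails the rule (divisible by 100 but not 400); 2000 passes.
print(is_leap_year(1900))  # False
print(is_leap_year(2000))  # True
```

This is exactly why 1900 skipped its leap day while 2000 kept it.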
The Mayans did the same thing, and every 400 years they started a new cultural cycle. Their civilization eventually collapsed, evaporated, vanished when they were no longer able to irrigate and fertilize their crops. They left behind some stepped pyramids and some calendar wheels. The Illiterati promptly assumed that the ancients knew some vast profound secrets that have remained unknown to humankind for thousands of years, and are still a mystery to modern science and technology. Ancient aliens usually figure into this meme, sometimes with crystal skulls.
The world did not end at the stroke of midnight on Dec. 31, 1999 either. That was a far more real and knowable threat to the stability of the information age. But we survived that one too.
On Jan. 1, 2000, airplanes did not fall out of the sky. Nuclear reactors did not melt down. Electrical grids did not go dark. Cell phones did not go dead. The Internet did not disappear. The predicted apocalypse did not occur.
Does anyone remember the Y2K panic? Or why it happened in the first place? Let's recap.
Back in the Mesozoic era of computing, bytes were expensive, so programs had to be small. Whether you were using Fortran or COBOL or SNOBOL or hand-coding in assembly language, you had to be efficient.
A single byte can contain a numeric value between 0 and 255. That could have been enough to store 256 year values, but it would have required extra lines of code to translate the value into readable digits. So programmers stored the year value in two bytes, each byte holding one decimal digit, 0 to 9. This gave the programmer 100 numeric values, 00 to 99. This was generally considered an efficient use of RAM and a good way to save space on precious storage media. The first 8-inch floppies, introduced in 1971, could hold only about 80K. (That’s K as in Kilobytes.)
With memory so scarce, storing a year value as two digits made more sense than using four and wasting two of them on a redundant “19” century prefix. So year values were stored as 63 and 75 and 81 instead of 1963 and 1975 and 1981. At the time, 2000 was so far off that programmers operated under the assumption that everything then current would have been replaced by far more efficient machines and better software. It was a fair assumption. Moore’s Law was in high gear. Chip speeds and RAM capacities were doubling, and the price per megabyte was falling. And we didn’t hit the heat ceiling until after the millennium. The word “legacy” was not part of the conversation, because most people in the industry were looking ahead, and not very many seemed to be considering the baggage we were dragging along from the past.
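The failure mode baked into that two-digit scheme is easy to demonstrate. A minimal sketch in Python rather than period COBOL, with a hypothetical function name of my own:

```python
def years_elapsed(start_yy: str, end_yy: str) -> int:
    """Subtract two two-digit year fields, exactly as a
    century-blind legacy program would."""
    return int(end_yy) - int(start_yy)

# Works fine within the century...
print(years_elapsed("63", "99"))  # 36

# ...but at the rollover, "00" reads as year 0, not year 2000:
print(years_elapsed("63", "00"))  # -63
```

Every age calculation, interest accrual, and expiration check built on this arithmetic was poised to go negative on Jan. 1, 2000. That is the whole Y2K bug in two lines.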