The Trouble with Gerrold: The part you left out
February 1, 2013
Not too long ago, while browsing through Slashdot, I saw an interesting request for advice from a programmer: How do you tell someone he writes terrible code?
“I have a coworker who, despite being very smart and even very knowledgeable about software, writes the most horrible code imaginable. Entire programs are stuffed into single functions, artificially stretched thanks to relentless repetition; variable and class names so uninformative as to make grown men weep; basic language features ignored, when they could make everything shorter and more readable; and OOP abuse so sick and twisted that it may be considered a war crime.”
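To make the complaint concrete, here is a hypothetical sketch (my own invention, not the coworker's actual code) of two of the anti-patterns named above, relentless repetition and uninformative names, next to a version that uses basic language features:

```python
# A hypothetical "before": repetition instead of a loop, and a name
# that tells the reader nothing about what the function does.
def f(a):
    x = 0
    x = x + a[0] * a[0]
    x = x + a[1] * a[1]
    x = x + a[2] * a[2]
    return x

# The same logic, shorter and more readable, using the language's
# built-in iteration and a name that explains itself.
def sum_of_squares(values):
    """Return the sum of the squares of the given numbers."""
    return sum(v * v for v in values)

print(f([1, 2, 3]))               # prints 14
print(sum_of_squares([1, 2, 3]))  # prints 14
```

Both functions compute the same answer; only one of them can be read, extended, or handed to a colleague without weeping.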
Coding style is as personal as a fingerprint. Programmers all believe that their way is the best. “It works, doesn’t it?” can stop most complaints. And, “I don’t have the time to rewrite it, you do it,” will likely stop the rest.
But this is a continuing conversation, one that has been going on probably since Bletchley. Over the years, many good and thoughtful people have attempted to codify good coding practices—where to indent, how far, how to structure an if-then statement, and so on. But sometimes it seems as if programmers, as soon as they get a little confidence in what they’re doing, also invent their own coding styles. Some are efficient. Others, not so much.
In the glory days of newspaper and magazine publishing, writers had style guides telling them where capital letters are necessary, when punctuation goes inside the quotation marks and when outside, when to use italics, when to spell out numbers and when to use actual digits, where to place the hyphen, and so on. Even today, the most important resource book on any writer’s shelf is Strunk and White’s “Elements of Style,” also known as “the little book.”
There is no such definitive style guide for programmers, possibly because there may be no right way to write code. Programming is a craft requiring high skill; occasionally it is even an art. And at the very bleeding edge, there may even be a bit of science involved, because when you construct a new algorithm, you might also be expanding the domain of human thought.