In recent columns, I’ve praised the low-level pleasures of assembly language and C. My argument is that, even if you never use these languages professionally, it’s important to broaden your mind and problem-solving techniques. That holds not just for low-level approaches, but for other programming models whose high-level abstractions are different from those you’re likely to work with every day.

Prolog stands for “Programming in Logic.” The heyday of Prolog—to the extent it had one—was the mid-1980s, when it was briefly in vogue in the artificial intelligence community (Prolog was to Lisp what Ruby is to C# and Java? Maybe). Japan picked Prolog as the implementation language of its “Fifth-Generation” computing project, and Borland’s Turbo Prolog was a low-cost IDE and compiler that made the language accessible to a broader audience. Today, there are a few commercial Prologs, but the simplest way to explore the language is with GNU Prolog.

Program execution in Prolog is considerably different from that in any of today’s mainstream languages. The Prolog interpreter (or compiler runtime) actually seeks a goal: a clause (the equivalent of a function or block) for which the current data makes every statement in the clause true. Execution can be halfway down a clause before it hits a statement that is false, at which point it backtracks and tries to find a different clause that is true for the data.

For instance, if you were trying to categorize animals in the traditional manner, you might say that a mammal is an animal that has a backbone, breathes air and has hair, while a bony fish is an animal that has a backbone, breathes with gills, has a bony skeleton, and so forth. Programming consists of defining a bunch of such clauses and running them against a database of, in this case, specimens in your zoo.
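A minimal sketch of such a rule base, written for GNU Prolog, might look something like this (the specimen names and facts are my own illustration, invented to match the walkthrough below):

    % Facts: what we know about each specimen in the zoo database.
    is_an_animal(specimen1).
    is_an_animal(specimen2).

    has_a_backbone(specimen1).
    has_a_backbone(specimen2).

    breathes_with_gills(specimen1).
    has_a_bony_skeleton(specimen1).

    breathes_air(specimen2).
    has_hair(specimen2).

    % Rules: a clause succeeds only if every subgoal in it succeeds.
    is_a_mammal(X) :-
        is_an_animal(X),
        has_a_backbone(X),
        breathes_air(X),
        has_hair(X).

    is_a_bony_fish(X) :-
        is_an_animal(X),
        has_a_backbone(X),
        breathes_with_gills(X),
        has_a_bony_skeleton(X).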

The interpreter in essence does a search: It begins iterating over the database, taking an item (a_specimen) and a clause (“is_a_mammal”). The first subgoals of “is_a_mammal” check “is_an_animal” and “has_a_backbone” (both true for the first specimen in the database), but the next subgoal, “has_hair,” is false. The entire clause “is_a_mammal” is therefore false for the first item in the database.

But execution doesn’t stop. Rather, it backtracks and tries to find another clause (“is_a_bony_fish”) that is true for the current data, until finally it says “Specimen 1 is an Atlantic Herring (Clupea harengus): true.”
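Loaded into GNU Prolog, queries against my toy rule base look something like this at the interactive prompt (the exact output format varies from one Prolog system to another, and naming the species would take further rules; the yes/no search and the backtracking are the point):

    | ?- is_a_mammal(specimen1).
    no

    | ?- is_a_bony_fish(specimen1).
    yes

    | ?- is_a_bony_fish(X).
    X = specimen1 ? ;
    no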

Clearly, this execution model is attractive for problems of classification and teasing out the rules that distinguish things of one type from things of another. So in that period of optimism about the power of “rule bases” to tame computation in the same way that relational databases tamed data, Prolog was a natural fit. It was less obvious that logic programming was attractive for the day-to-day tasks of generating reports and printing mailing labels, much less programming this new-fangled “Windows 3” operating system, for which this “object-oriented programming” was said to be a panacea.

There was a period in the late 1980s when I divided my time more or less evenly between programming Prolog, C and a 4GL called PAL (the power of 4GLs being a worthy topic for another day). I do not think that the quality of different programming paradigms is absolute. I’m quite convinced that it would be a bad idea to force every programmer to use academically beloved languages like LISP or Haskell, and I know for a fact that not every COBOL programmer became better after taking courses in object-orientation. But for me, the logic programming of Prolog “clicked” powerfully, and I found that if I was stuck understanding a problem in C, I could often figure it out in Prolog, and vice versa.

Fast-forward 20 years (or, sigh, more) to a world in which unit testing is near-universal, LINQ is widespread and Reactive Programming is making some noise. The paradigm of solving problems by accumulating examples and counter-examples, and of thinking of execution as a query or search, is not nearly as esoteric as it once was. Writing a lengthy LINQ expression isn’t exactly like writing a Prolog program, but it tickles the same neurons.

Once you start using LINQ extensively, you naturally fall into the Prolog-like (and functional programming-like) way of thinking of the “foreach” as the One True Control Structure. From there it’s a slippery slope to “programs as queries,” unification and backtracking. Well, maybe that’s overstating things a bit; it turns out that few real-world programs can afford to totally ignore control flow. Most Prolog programs end up with considerable amounts of code devoted to manipulating runtime behavior, and such control is easier to read and maintain using traditional imperative control-flow statements.
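In Prolog, the classic tool for that kind of control is the cut (written “!”), which commits the interpreter to the choices it has made so far and prunes any remaining backtracking. A sketch, again building on my hypothetical zoo rule base (classify/2 is my own invention for illustration):

    % classify(+Specimen, -Class): try each class in order; the cut
    % commits to the first class that fits, so the interpreter won't
    % backtrack into the later clauses looking for another answer.
    classify(X, mammal)    :- is_a_mammal(X), !.
    classify(X, bony_fish) :- is_a_bony_fish(X), !.
    classify(_, unknown).

Without the cuts, a query such as classify(specimen1, Class) would, on backtracking, also report “unknown,” which is rarely what you want.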

Prolog is always foremost in my mind when the topic turns to the CLR’s ability to combine multiple languages in a single application. “Business rules in Prolog, UI and the rest in C#” would be a great combination, but unfortunately the market for it never emerged.

If you find yourself drawn to “clever” LINQ queries and pattern matching, do yourself a favor and check out Prolog. It might not be the most practical thing for your career, but it just might click with you.

Larry O’Brien is a technology consultant, analyst and writer. Read his blog at www.knowing.net.