Ritchie and McCarthy
In the past two weeks, there has - justifiably - been a lot of coverage of the death of Steve Jobs. However, two other computing pioneers have also died during that time. Dennis Ritchie and John McCarthy were not household names in the same way, but their contributions were arguably more fundamental. Whilst Jobs’s work made modern computing accessible, Ritchie’s and McCarthy’s made it possible. They created, respectively, C and Lisp, which are essentially the Greek and Latin of programming [1].
In a field which values the new over the old to an almost pathological extent, these two languages have lasted for decades. If someone embarked on a new software project tomorrow using, say, PL/I, it would seem like an obtuse exercise in nostalgia. Use C, on the other hand, and nobody would raise an eyebrow. Lisp’s influence is less obvious, but arguably more profound. Whilst you can still use it in unadulterated form, relatively few people do. However, the popular dynamic languages that all the cool kids are using - Python, Ruby, JavaScript - are essentially Lisp in new clothes [2]. A small sketch of the family resemblance follows below.
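To make that resemblance concrete, here is a small illustrative sketch in Python (one of the languages mentioned above). The snippets are my own examples rather than anything from the post, but first-class functions, anonymous lambdas and closures are all ideas that entered mainstream practice largely via Lisp.

```python
# A few Lisp ideas wearing Python clothes: first-class functions,
# anonymous lambdas, closures and higher-order list processing.

# map and filter with lambda: a functional style that Lisp made mainstream.
squares = list(map(lambda x: x * x, range(5)))          # [0, 1, 4, 9, 16]
evens = list(filter(lambda x: x % 2 == 0, range(10)))   # [0, 2, 4, 6, 8]

# Closures: a function that captures its defining environment,
# another idea with deep Lisp roots.
def make_adder(n):
    return lambda x: x + n

add3 = make_adder(3)
print(squares, evens, add3(4))  # -> [0, 1, 4, 9, 16] [0, 2, 4, 6, 8] 7
```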
The origins of the two languages are instructively different. C was created to solve an immediate practical problem - writing Unix in a portable way. Moreover, it was targeting relatively modest hardware, the PDP-11 [3]. It therefore lacks some of the bells and whistles that, even in 1973, “real” languages were thought to need. However, far from limiting the language, this resulted in an elegant simplicity that has seen it adapt to four decades of hardware improvements whilst many “real” languages have fallen by the wayside.
In contrast, Lisp wasn’t intended to be a language to solve a particular practical problem. It wasn’t intended to be a programming language at all. McCarthy had written a paper outlining a theoretical system for formally describing algorithms. Fortuitously, he didn’t make the “theoretical” part clear to one of his grad students, who promptly went off and implemented it on the department’s computer, creating the first Lisp interpreter [4]. What had started as a mathematical model of programs turned out to be an amazingly flexible and powerful way to actually write them.
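To give a rough flavour of what that grad student built, here is a deliberately tiny, illustrative evaluator, written in Python rather than Lisp or IBM 704 assembly, and very much a sketch of the idea rather than McCarthy’s original 1960 definition: programs are just nested lists, and evaluating one is a short recursive function.

```python
# A toy evaluator in the spirit of McCarthy's eval: programs are nested
# lists, and running them is one short recursive function. This is an
# illustrative sketch, not the original 1960 definition.

def lisp_eval(expr, env):
    if isinstance(expr, str):          # a symbol: look it up in the environment
        return env[expr]
    if not isinstance(expr, list):     # a literal (e.g. a number) evaluates to itself
        return expr
    op, *args = expr
    if op == "quote":                  # (quote x) returns x unevaluated
        return args[0]
    if op == "if":                     # (if test then else)
        test, then, alt = args
        return lisp_eval(then if lisp_eval(test, env) else alt, env)
    if op == "lambda":                 # (lambda (params) body) builds a closure
        params, body = args
        return lambda *vals: lisp_eval(body, {**env, **dict(zip(params, vals))})
    fn = lisp_eval(op, env)            # otherwise: evaluate operator and operands, then apply
    return fn(*[lisp_eval(a, env) for a in args])

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
# ((lambda (x) (* x x)) (+ 3 4))  ->  49
print(lisp_eval([["lambda", ["x"], ["*", "x", "x"]], ["+", 3, 4]], env))
```

The point isn’t the details; it’s that the whole interpreter fits in a handful of lines, which is exactly why a “theoretical” notation could be turned into a working language almost by accident.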
What both languages have in common is that they each have a small, conceptually coherent core. Because it’s small and coherent, the programmer is able to internalise the basic concepts of the language. This done, they’re free to concentrate on the task at hand, rather than the details of the language. This is also why they’ve lasted so long - simple, clear ideas date far less quickly than specific technologies.
You might wonder why, in a post about two men who have recently died, I’ve not said much about their lives. The reason is simple: I didn’t know them, and so I’ll leave the biography to those who did. Like millions of other people, though, I do know their work. Both have had an enormous impact on a field that touches the lives of almost everyone, and will continue to do so for a long time to come.
[2] Smug Lisp Weenies, as they are known, would claim that these more modern languages are merely partial, cargo-cult facsimiles of the real thing. There’s something in this, but they also succeed in ways that Lisp doesn’t. That’s a subject for a different day.

[3] Not, as I said in the original version of this post, a PDP-7; porting the nascent operating system from the PDP-7 to the PDP-11 was one factor that led to the development of C.

[4] Wikipedia’s Lisp page cites the following from Hackers & Painters by Paul Graham (unfortunately, I don’t have my copy to hand to find the original source). McCarthy said: “Steve Russell said, look, why don’t I program this eval…, and I said to him, ho, ho, you’re confusing theory with practice, this eval is intended for reading, not for computing. But he went ahead and did it. That is, he compiled the eval in my paper into IBM 704 machine code, fixing bugs, and then advertised this as a Lisp interpreter, which it certainly was. So at that point Lisp had essentially the form that it has today…”