Hacker News

The term "axiomatic system" is consequently misused in the article. The author probably had a "formal system" in mind: an axiomatic system is one in which theorems are deduced from axioms, and I don't think the lambda calculus is such a system. Many of the statements in the article are so vague and imprecise that it is hard to assign any meaning to them. Just look at this fragment of the article:

"Common Lisp is one answer to this question. Its core is an axiomatic system, the lambda calculus, which John McCarthy extended just enough to write an interpreter for the resulting calculus in itself."

Lambda calculus is not at the core of Common Lisp, or even of Lisp; it was just an inspiration for Lisp's model of computation.

"Common Lisp augments the interpreter with other automata on the same substrate which make construction straightforwards. Though the choice of automata may vary, the essential idea of reifying the act of construction is universal."

Can someone explain what he is trying to say here? You can build languages with extremely different semantics while still keeping S-expressions and centering the language around function application and closures: choose lazy evaluation over eager evaluation, for example, and you end up with a very different language. In no way is the lambda calculus some immutable core of Lisp; Lisp's model of computation is much more complex and arbitrary.
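To make the lazy-vs-eager point concrete, here is a minimal sketch (all names are my own, purely illustrative): a conditional built on thunks forces only the branch it needs, while an eager conditional has both arguments evaluated by the caller before it even runs. The same surface idea, a very different language.

```python
def eager_if(cond, then_val, else_val):
    # Both branches were already evaluated by the caller.
    return then_val if cond else else_val

def lazy_if(cond, then_thunk, else_thunk):
    # Branches are thunks (zero-argument functions); only one is forced.
    return then_thunk() if cond else else_thunk()

def diverge():
    # Stand-in for a non-terminating or erroring computation.
    raise RecursionError("diverges")

# eager_if(True, 1, diverge())  # would raise before eager_if runs
print(lazy_if(True, lambda: 1, lambda: diverge()))  # 1
```

Under eager semantics the commented-out call can never return; under lazy semantics the diverging branch is simply never touched.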

"Common Lisp and Haskell are built around variants of the lambda calculus. Each substrate makes certain approaches to problems natural, though neither is a natural vehicle for human thought. The brain’s adaptation to the lambda calculus is usually experienced as an epiphany."

Seldom, if ever, does programming in Lisp or Haskell have anything to do with the lambda calculus, unless of course you mean it in a way so vague the term loses its meaning. He then goes on to essentially describe how Haskell is referentially transparent and Common Lisp is not, but he makes it sound like some philosophical difference deeply rooted in the ideas behind Lisp or Haskell, when it's just a design decision where each option has specific tradeoffs. To me all this is just "metaphysical" mumbo-jumbo. The metaphor he starts from, comparing a programming language to an artist's medium, doesn't bring any practical insight into anything. I suspect some people like it only because the greatest painters are widely admired by the general public, while this is rarely so even for the greatest programmers, so someone might feel his job is a little cooler when reading this. I do not see what real insight anyone could gain from this article.



Totally agree with you on how difficult this is to read. The author is writing in some form of high academic CS, and it isn't obvious the terminology is being used correctly.

"Common Lisp augments the interpreter with other automata on the same substrate which make construction straightforwards. Though the choice of automata may vary, the essential idea of reifying the act of construction is universal."

Here he is basically saying that you've turned the elements of program construction into a data structure (the s-expression). I believe "automata" just refers to any program in this context (think computability and complexity: languages executable by a computer can be described in terms of different types of automata, such as NFAs, DFAs, and PDAs). So you are making program construction into a data structure which can be used to manipulate a program's construction, or can be manipulated itself.
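A hedged sketch of that "program as data structure" idea, using Python nested lists as a stand-in for s-expressions (the function name and the expression are my own illustrations, not from the article): because the program is plain data, other code can walk it and rewrite it before it is ever evaluated.

```python
# The expression (* (+ 1 2) (+ 1 2)) represented as plain data:
expr = ["*", ["+", 1, 2], ["+", 1, 2]]

def constant_fold(e):
    """Rewrite the program-as-data: collapse constant arithmetic subtrees."""
    if not isinstance(e, list):
        return e  # an atom: a number or a symbol, returned unchanged
    op, *args = [constant_fold(x) for x in e]
    if all(isinstance(a, (int, float)) for a in args):
        if op == "+":
            return sum(args)
        if op == "*":
            result = 1
            for a in args:
                result *= a
            return result
    return [op, *args]

print(constant_fold(expr))  # 9
```

The transformation never runs the program; it only manipulates its representation, which is exactly what "reifying the act of construction" buys you.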

I agree with you about the mumbo-jumbo.


In the PL context, "axiomatic system" has been used for decades, especially in the Lispy corners of the field, in a loosely metaphorical way: it refers to languages that start from a small core of primitive built-ins taken as givens (the "axioms"), and then build up the rest of the language by composing those "axioms" into "theorems" within the language that define the higher-level functionality. John McCarthy introduced the axiom/theorem terminology in his 1960 Lisp paper. Paul Graham is possibly responsible for re-popularizing it, since he uses that terminology a lot when discussing the design of Arc.
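A minimal sketch of what that metaphor cashes out as, with Python lists standing in for s-expressions (the function names follow McCarthy's primitives, but the encoding is my own illustration): a handful of primitive operators act as the "axioms", and higher-level functionality is derived purely by composing them.

```python
# "Axioms": primitive operators, taken as given.
def atom(x):    return not isinstance(x, list)
def eq(x, y):   return atom(x) and atom(y) and x == y
def car(x):     return x[0]
def cdr(x):     return x[1:]
def cons(x, y): return [x] + y

# "Theorems": defined only in terms of the primitives above,
# never in terms of the host language's own list machinery.
def cadr(x): return car(cdr(x))
def null(x): return not atom(x) and x == []

def append(x, y):
    # List concatenation derived from cons/car/cdr alone.
    return y if null(x) else cons(car(x), append(cdr(x), y))

print(append(["a", "b"], ["c"]))  # ['a', 'b', 'c']
```

Whether deriving `append` from `cons` really deserves the word "theorem" is, of course, exactly what the comments below dispute.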


Can you back up what you said with a link or quotation? I did not find any mention of the word "axiom" in McCarthy's 1960 paper ("Recursive Functions of Symbolic Expressions and Their Computation by Machine"). A Google search for "axiomatic system lisp" pops up my own comment above as the first result.


I found what you are probably referring to:

http://www.paulgraham.com/arcchallenge.html

I doubt this is in any way established terminology; it sounds again like a very sloppy metaphor on the part of the author. Is building complex pieces from simple primitives really enough to justify the words "theorems" and "axioms"?


I don't disagree. The use of technical terms is sloppy, and some of the examples are weird (at the end, Prolog is somehow said to be an experiment in the same direction as Scheme...).

But I thought this was smart:

"C was shaped by poets. Kernighan, Ritchie, Pike, and Thompson built the language alongside the idiom. This had an effect equivalent to Petrarch writing in Italian. The language suddenly found itself in possession of existence and poetry at once. It has no theoretical basis. It has no overarching logic. [...] but in the hands of a master, it scintillates as its brethren do not."

It gets at the way the concise coding style in the K&R book guided people to a programming idiom that is pleasurable to write and read in a way that comparable languages, say Pascal or Basic, are not.



