I think there are really two types of programming languages, "idea" languages and "practical" languages.
"Idea" languages take a concept and try to build almost everything out of it. So like, lisp with lambdas, or smalltalk with message passing, or prolog with logic statements, etc.
Then there are "practical" languages, which don't necessarily have a "core" idea; more likely they're defined by the domains they're useful in and have a grab bag of ideas. So like C++, typescript, etc.
I think "idea" languages have their appeal because they seem elegant, and as coders, we often try to take a novel idea as far as we can. Usually, that ends up being too far. Like, if you're building a React app, and you go overboard, maybe everything becomes a reactive component even when expressing things in that way can be awkward and your component isn't really a ui "component". Or if you're writing in a functional language, all the sudden loops become recursion and you're praying that it has tail call optimization. If you're writing java, maybe you see dependency injection everywhere. I think we just have this urge to see how many nails our shiny hammer can hit, even to the point of diminishing returns.
Anyway, I guess all I'm saying is that even if you can express something as turtles all the way down, you might not necessarily want to build your house out of turtles.
I've never really liked that quote. It's just a way of saying that nothing's perfect, and of course nothing is perfect, but that doesn't mean you shouldn't try to improve things. The quote has always struck me as defensive in a way that stops discussion. It's okay (in fact, more than okay) to say that a complaint is founded on an unsound premise or that a proposed change is a bad idea, but calling other languages useless or unwanted (or implying something to that effect) in response to criticisms of C++ is a bit of deflection, as if saying "well, at least people are productive in it", and not, I think, particularly constructive. I think you can start a constructive comment this way, though: "despite its problems, people are still productive in C++ and I think x, y and z are reasons why, so x, y, and z ought to be kept, &c". And in some other context, I think this observation is meaningful: all the languages that are commonly used are complained about, so a complaint about C++ does not necessarily imply that it's worse than any other commonly used language.
> You are reading way more into his comment than I think was intended or implied.
You're likely correct (I'm unable to find the original context of the comment), but it's rarely employed with its original context. Indeed, I was more trying to say that I disliked the quotation itself (and how it can be used) than whatever Stroustrup meant by it.
> C++ is one of the most successful programming languages in history. There is a reason why.
And I said as much in my comment.
(I would also point out that some of the reasons may not be good reasons. But some certainly are!)
> Stroustrup has spent much of his career working to improve C++.
To improve the current situation, we may want to abandon C++ altogether for new programs rather than stubbornly hold on to it.
(In the back of my mind is Stroustrup's recent reaction to the NSA report on memory safety.)
I'm under no illusion that programmers will begin formally proving properties of their programs, as interested as I am in that. But people seem to rarely avail themselves of the tools that help them write correct C++, and rarely manage to use only modern C++, which suggests we may want to simply let go of it. Or radically change it – making safety features opt-out rather than opt-in.
I routinely deploy very large C++ applications used by international airlines around the world. And I haven't had a single bug in production for 5+ years. Not even one.
The secret is 9000+ auto tests pushing the code way beyond real world usage.
Changing from C++ to another language won't change the fact that those tests are needed. Memory leaks and multi-threading issues are simply non-issues for me. I use tools like Helgrind and Valgrind to find and fix those.
In other words, new languages like Rust don't solve any problems I have.
That's very impressive! I desperately wish more people did their production work like you do.
I genuinely can believe that Rust, Carbon, &c wouldn't make much of a difference for you. But for others who don't exercise the same discipline (for whatever reason, internal or external) I think they can. I agree there are no silver bullets; that's why each little bit helps.
edit: I think our software woes have mostly to do with human nature; solutions (if they exist) will primarily be educational and social, not technical. But universal computers are less than a century old, and we can certainly do better on both fronts. If we've learned something about the way most people write programs and interact with existing tools, we perhaps ought to make use of that experience to design better languages and tools.
> I think there are really two types of programming languages, "idea" languages and "practical" languages.
Idea languages invent features (first-class functions, closures, garbage collection, algebraic types, structural pattern matching, type inference, ...) and 10-20 years later these start to appear in practical languages, too.
And then we look at language implementation and find message passing can be optimized away —
The control structures that are so optimized by the compiler are the conditional selection messages to Booleans (ifTrue:, ifFalse:, and ifTrue:ifFalse:), some of the logical operation messages to Booleans (and: and or:), and the conditional repetition messages to blocks (whileTrue: and whileFalse:).
Essentially all of our biggest software systems are made with PHP, C++, Java, etc. Elegance and beauty somehow always lose to the hack-together-itude of other languages.
Platonic ideals are great for thinking clearly. Doing things in the messy reality often requires compromising. It's impossible to solve all corner cases a priori...
> Surprisingly, these particles were found decades ago by Alonzo Church, the inventor of Lambda calculus long before electronic computers even existed!!
Lambda calculus is not the simplest computational device based on functions; with SKI calculus, you only need to define two combinators (S and K) to perform any conceivable computation. Going further, you can define the full SKI calculus with only a single combinator called Iota, so in fact, you only need one function to perform any computation. Yes, Turing tarpits exist; why should that be a revelation? Practical programming languages (functional, imperative, and object-oriented alike) tend to find a compromise between a minimal core and ergonomics; otherwise the result will be disastrous.
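To make that concrete, here is a rough Python sketch of S and K (the curried style and names are just for illustration, not anything from the comment above):

```
# S and K, curried one argument at a time
S = lambda x: lambda y: lambda z: x(z)(y(z))
K = lambda x: lambda y: x

# The identity combinator need not be primitive: I = S K K, since S K K z = K z (K z) = z.
I = S(K)(K)
assert I(42) == 42
```

Everything else (Booleans, numerals, recursion) can likewise be assembled from S and K alone, which is exactly what makes it a tarpit: minimal, but not ergonomic.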
> Surprisingly, these particles were found decades ago by Alonzo Church, the inventor of Lambda calculus long before electronic computers even existed!!
John McCarthy introduced his lambda notation in October 1958, inspired by the work of Alonzo Church, but it had very little to do with Lambda calculus.
The essential innovation that John McCarthy brought to programming languages at that time concerned the status of a function definition. In earlier programming languages like Fortran, a function definition was a separate section of a program source file, and an identifier had to be bound to it. A McCarthy function definition, by contrast, behaves just like a constant value, very similarly to a character-string, integer, or floating-point constant: it can appear in any place where a constant can appear, it can be assigned to an identifier, and it can also be used directly as an argument in the invocation of another function.
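In modern terms (a Python analogy of my own, not the LISP of 1958), the point is simply that a function definition is a value like any other:

```
square = lambda x: x * x                        # bound to an identifier, like any constant
print(list(map(lambda x: x * x, [1, 2, 3])))    # or used directly as an argument -> [1, 4, 9]
```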
Once the new semantics of a function definition was established, the syntactic notation for it was a minor detail. A possible notation would have been to enclose any function definition in a special pair of parentheses, like a character string constant that is enclosed in quotation marks. Because this was not feasible due to the limited character set that was available, the chosen syntax was that of a LISP special form, which needs a keyword, for which the choice was "lambda".
To distinguish in a function definition the parameters used for function invocation, Alonzo Church had enclosed them in a pair of special parentheses, where the leading parenthesis was a small Greek lambda character and the trailing parenthesis was a raised middle dot.
This provided the inspiration for the "lambda" keyword. Perhaps McCarthy was also inspired by Church's work to change the handling of function definitions to be like that of any other values, but that is the maximum extent of the relationship between Lambda calculus and LISP.
Nevertheless, the LISP "lambda" notation played an important role in attracting attention to the work of Alonzo Church, so in the following decade there was a great deal of research on functional languages that was much more strongly influenced by Lambda calculus than LISP was.
Came here to recommend that the author check out Combinatory Logic (SKI, Iota, Jot, etc). Although not practical, it is pretty incredible how simple/minimal systems can serve as the axiomatic foundation for essentially all of math/logic.
Lambda calculus is somewhat interesting in my opinion because in addition to being really simple, it’s actually a pretty usable language. You could use it as a basic template language without much modification. The same cannot be said for Turing machines or SKI calculus (the latter in particular since it has no natural IO mechanism).
I distinctly remember the moment 25 years ago, at around 2:00 am in bed in my dorm, the semester I was taking SICP:
“cons doesn’t need to be a special form! Closures are general purpose data structures!”
I raced (all of 6 feet) to my… hmm at that point must have been the old Gateway 2000 with an AMD K5 brain transplant… and spun up the Edwin environment to confirm my hypothesis.
Up there with getting married and having kids as the great life events of all time.
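That realization, sketched roughly in Python rather than Scheme (not the actual Edwin session, just an illustration): cons, car, and cdr can be ordinary procedures, with the pair living inside a closure.

```
def cons(x, y):
    # the "pair" is just a closure that remembers x and y
    return lambda pick: x if pick == 0 else y

def car(pair):
    return pair(0)

def cdr(pair):
    return pair(1)

p = cons(1, cons(2, None))
assert car(p) == 1 and car(cdr(p)) == 2
```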
> He made an error in the code: evcon. and evlis. should come before eval..
A note: That's not an error. The code works just fine in Common Lisp because CL doesn't care about the order of definitions so long as the definitions are made prior to execution. Given that both `evcon.` and `evlis.` also call `eval.`, there is no way to create a canonical linear ordering of the three functions aside from aesthetic preference. You will get similar warnings if you put them before or put them after `eval.`.
It would work fine in Lisp in general, since Lisp is typically late bound and can call global functions via their symbols. Thus, inside a function, writing a call to an undefined function is not an error. The function only needs to exist at runtime; even then, in many debuggers it could be added and the computation resumed.
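Python happens to resolve global names at call time as well, which makes the point easy to demonstrate (the function names here are just illustrative stand-ins, not the article's):

```
def evcon_(x):
    return eval_(x)      # eval_ is not defined yet; the name is looked up when evcon_ runs

def eval_(x):
    return x

assert evcon_(7) == 7    # fine: eval_ exists by the time evcon_ is actually called
```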
This article uses the factorial function on Church numerals as motivation for discussing recursion and the Y-combinator.
Yet Church numerals have enough recursion "built-in" to compute factorials directly:
fac = λn.λf. n F (λx.f) (λx.x) where F = (λc.λn.n(c(λf.λx.n f(f x))))
For example, fac 3 evaluates as (writing Church numerals simply as 1,2...)
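For anyone who wants to check it mechanically, the definition above transcribes almost directly into Python closures; zero, succ, and to_int are helpers I added just for the test.

```
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
one, two, three = succ(zero), succ(succ(zero)), succ(succ(succ(zero)))

F   = lambda c: lambda n: n(c(lambda f: lambda x: n(f)(f(x))))
fac = lambda n: lambda f: n(F)(lambda x: f)(lambda x: x)

to_int = lambda n: n(lambda k: k + 1)(0)    # convert a Church numeral back to an int
assert to_int(fac(one)) == 1 and to_int(fac(two)) == 2 and to_int(fac(three)) == 6
```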
Thanks, everyone, for showing interest in my article. The traffic quickly exceeded my Cloudflare account limits, resulting in it being inaccessible for a while. I've upgraded my plan now and the link works.
Unrelated to the content of this post... this looks to be hosted on Notion. Is that a reasonable option for a personal blog? I do a lot of writing in Notion for work and I would love to transition into something more public. I don't really want to mess around with Jekyll, Medium, or Substack. Public Notion feels sort of compelling.
Church-encodings are cute but closures take space, more specifically an unbounded amount of it (for the code, unless you defunctionalize it)! Hence it is much more practical to have a primitive notion of inert data that can be inspected and which has a fixed or at least predictable size.
Although there are also other more efficient lambda-encodings like Scott-encoding / Mendler-style algebras.
See https://firsov.ee/efficient-lambda/, it's a very cool paper. But it also shows it's very tricky to give a type to these encodings. In the end you're almost always better off building "real" datatypes with records and ADTs unless your only goal is to have a minimal theory.
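For a flavour of the Scott encoding mentioned above, here is a rough Python sketch (list constructors only, names are mine): a value is represented by how it dispatches on its own constructors, so "pattern matching" is a single call rather than the full fold of a Church encoding.

```
nil  = lambda if_nil: lambda if_cons: if_nil
cons = lambda h, t: lambda if_nil: lambda if_cons: if_cons(h, t)

def head(xs, default=None):
    return xs(default)(lambda h, t: h)

xs = cons(1, cons(2, nil))
assert head(xs) == 1
```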
If anyone is looking for a book on lambda calculus, I really enjoyed An Introduction to Functional Programming Through Lambda Calculus by Greg Michaelson [1]
Every time I read a post about the sublime nature of Lisp I end up feeling stupid. I read SICP and didn't really "get" it. I felt like it was just another language, but the structure of the language makes it a bit more obvious how scoping and closures work-- as well as being easier to compose functions of functions. Forth gave me a little bit more of an "aha" moment, but still no enlightenment.
I have never used either language to actually build anything useful, so maybe that's part of it.
Another revelation: lambda calculus can be reduced down to 4 primitive operations[0].
I had this revelation after a pilgrimage into the land of Binary Lambda Calculus[1], a binary encoding of lambda calculus that represents variables with numerical (de Bruijn) indices[2] in unary.
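If I remember Tromp's encoding right, it is tiny: abstraction is written 00, application 01, and a variable with de Bruijn index n is n ones followed by a zero. A rough sketch (the term representation is my own):

```
def encode(term):
    kind = term[0]
    if kind == "var":              # ("var", n) with de Bruijn index n >= 1
        return "1" * term[1] + "0"
    if kind == "lam":              # ("lam", body)
        return "00" + encode(term[1])
    if kind == "app":              # ("app", fun, arg)
        return "01" + encode(term[1]) + encode(term[2])

identity = ("lam", ("var", 1))     # the term λx.x
assert encode(identity) == "0010"
```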
No, it's not lambdas all the way down. Lisp machines are long gone.
I really wish people would stop teaching the "metacircular" stuff so early. McCarthy even missed metacircularity. It took a genuine genius on the level of Stephen Russell to "get it" and convert everything to a functional programming language.
Tell people to implement a Lisp down on the imperative language of the machine. Suddenly, you will get it. Mutability is a pain in the ass. Recursion has issues. Garbage collection isn't optional--it's required because all of your "lambda" shenanigans are beating the hell out of the allocator and cons-ing up trash like a victim of "Hoarders".
Once you see the guts, suddenly metacircularity makes sense. It also isn't particularly interesting anymore.
They also had assembly and microcode (and Maclisp got scoping wrong until CL, anyway). So those weren't much better if you want to measure "purity".
> Tell people to implement a Lisp down on the imperative language of the machine. ...
Somehow there is no end to the difficulty of mapping an imperative language to another, as evidenced by the size of LLVM. One picks one's poison, I guess.
You can hack up eval, map, and apply in C without too much headache. I'd recommend int, char, and some sort of string representation as primitives. I'd also recommend not garbage collecting, and just malloc'ing everything. Like, a weekend of effort for a really, really bad implementation. So much magic disappears. Well, I guess this sort of assumes you know what a pointer is. If not, it's going to be a tough slog regardless.
As a second pass, check out the later parts of SICP, expose some array type, and implement the GC in Scheme.
> Tell people to implement a Lisp down on the imperative language of the machine.
FWIW, my understanding is that the statement "God exists" in Aristotle is equivalent to the statement: "You cannot say it is Xs all the way down; there must be a base layer which is unchangeable (which we call God)."
Could there be a hint here that programming-deity is either: a) imperative at heart; or b) inordinately fond of garbage collection?
> Could there be a hint here that programming-deity is either: a) imperative at heart; or b) inordinately fond of garbage collection?
Or, perhaps, programming-devil is and we've all been tempted wrongly? :)
One problem with programming history is that it is all about compiling. And that is definitely the work of programming-devil.
People are trying to shake that off, but it's difficult. Programming languages advance one popular programming language at a time--and that progress is slow.
Definitely A: imperative at heart. Ultimately, the modern electronic digital computer executes instructions in sequence. Whilst there are other kinds of computers that can be inherently functional (like analogue or quantum computers), a conventional CPU is not.
I realized that when I call a bunch of functions one after the other, I want to be able to reference them. I would have to put them in an array.
Then I thought why do I have to make this choice case by case. If only everything was a list…
I now want to see how I can shoehorn this into TypeScript. Maybe with macros or transpilation or something.
Then I want to see what cool things I can now do.
The only thing that gives me pause is that the language is insane to read and when you look at the implementation of Lisp defun and defmacro the code gets pretty wacky.
Figuring out what is a primitive and what is just lists is very difficult.
I wonder if there is a less purist Lisp out there that looks nicer.
This comes up a lot. Dylan did this. I guess the curse and the salvation is that you get used to it pretty quickly. I really don't think syntax is all that fundamental - it's all fine up to a point (looking at you, Perl).
Maybe off topic....what emerges when we don't go down to boolean as atomic and instead have a multivariate logical foundation that contains uncertainty?
That's not to say that you can't get some value out of it. SQL is probably the closest existing language to implementing TVL, and there has been recent work on Datalog as well:
https://arxiv.org/pdf/2202.01718.pdf
Datalog generally has better-defined reasoning than SQL, and can kinda serve duty for typechecking and formal verification.
It's worth noting that many languages have the (disjointed) concept of uninitialised values, which is equivalent to TVL (also note nullable types in e.g. C#). As such, from a practical perspective, TVL is pushed out of code as much as possible; I suspect it's of more use for data.
I'm "A disciple of another sect", a Prolog programmer and I'm always a bit nonplussed when LISP adepts praise the simplicity of their preferred language for taking only (checks again) seven primitives to represent every function. That sounds like 6 primitives too many. In Prolog, there is one "primitive", the definite clause, and it is all the syntax of the language.
Quickly-quickly, a logic program is a conjunction of definite clauses, a definite clause is a clause with at most one positive literal, a clause is a disjunction of literals, a literal is an atom or its negation, an atom is a predicate symbol followed by n terms, a term is a constant, a variable, or a function symbol followed by m terms, where n, m are the arities of the atom and the term.
That's all you need to express any computable program. And note well that Prolog is a language created to be usable, and used in practice to write everyday programs such as the web server running the SWI-Prolog documentation website [1]; not some Turing tarpit that only exists to be studied under glass, or put on a pedestal and worshipped for its useless purity. It is not Brainfuck, it is not Haskell, it is, I dare say, the C of the non-imperative languages: simple, pragmatic, usable, to the point.
Now, I understand why LISP adepts praise the abstraction and power of their language: compared to, say, Java, or, from its own time, COBOL, or FORTRAN, LISP is certainly a leap ahead in modernity, in thinking about programming as computation. And, garbage collection, of course. Yes yes yes. But - all the special syntax! Seven primitives! Why is all that necessary?
Imperative languages are so exasperating to use sometimes _because_ they have all that special syntax, purportedly to "make it easy" to fulfill everyday programming tasks. But, may the gods help the unwary programmer who strays from the beaten path, as laid out in Stack Overflow. Special-purpose syntax is a straitjacket, an oxymoron, an attempt to liberate the programmer by tying her hands behind her back. A language like LISP, which prides itself on being a completely different paradigm, should not brag about its special-purpose syntax, but try to get rid of it. Lest it become Haskell.
I am not at all convinced that it is useful to have closures in a programming language.
The implementation of closures is completely unnecessary for any of the features that exist in functional languages or in any other programming languages.
The avoidance of closures becomes possible when the programming language includes the restriction that inside a function body the only external variables (i.e. which are neither parameters nor local variables) that are visible are those that are allocated statically, either as process-local or as thread-local.
This restriction is similar to the restrictions of C or of traditional C++, but it is less constraining, because nested functions are allowed; they just cannot see the stack-allocated variables from enclosing blocks unless those are passed as parameters.
Any useful thing that can be implemented with stack-allocated variables captured inside a closure can also be done with statically-allocated variables, but in a much more efficient way and in a way that is less prone to subtle bugs. Other applications of closures in LISPy languages are just workarounds for the impossibility of defining record or structure types. In any language with user-defined data types such closure applications are not needed.
The only significant difference between a language with closures and one without closures, from the programmer POV, is that in a language without closures for each variable that is accessed inside a function body without being passed as a parameter, the programmer must add an allocation-class specifier, such as "static" or "thread_local".
This is a good thing in my opinion, because sometimes bugs are caused by unintentionally accessing external variables, which are automatically captured in closures.
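The classic Python illustration of that kind of accidental capture (just an example, not anything from the comment above): every closure ends up sharing the loop variable instead of copying its value.

```
fns = [lambda: i for i in range(3)]
print([f() for f in fns])                  # [2, 2, 2] -- all three closures captured the same i

fns = [lambda i=i: i for i in range(3)]    # pass the value as a parameter instead
print([f() for f in fns])                  # [0, 1, 2]
```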
A "purely" functional program must not access inside function bodies any variables that are not passed as parameters. Avoiding to write explicit "static" annotations, in order to make closures masquerade as functions, does not change the fact that such functions are no longer pure functions, but automata, which is good, whenever this is necessary.
When there are no closures, any function pointer can be just a simple pointer and any higher-order functions can have a faster implementation, which is better for any functional language. Closures add a completely unnecessary overhead.
FYI the message you are responding to is a reference to this ancient copypasta:
```
The venerable master Qc Na was walking with his student, Anton. Hoping to prompt the master into a discussion, Anton said "Master, I have heard that objects are a very good thing - is this true?" Qc Na looked pityingly at his student and replied, "Foolish pupil - objects are merely a poor man's closures."
Chastised, Anton took his leave from his master and returned to his cell, intent on studying closures. He carefully read the entire "Lambda: The Ultimate..." series of papers and its cousins, and implemented a small Scheme interpreter with a closure-based object system. He learned much, and looked forward to informing his master of his progress.
On his next walk with Qc Na, Anton attempted to impress his master by saying "Master, I have diligently studied the matter, and now understand that objects are truly a poor man's closures." Qc Na responded by hitting Anton with his stick, saying "When will you learn? Closures are a poor man's object." At that moment, Anton became enlightened.
```
Note that Java fits your definition (though it allows one extra type of variable to be accessed, members of the current instance). It is probably the only popular language that supports local functions but not closures.
Also, C++ doesn't suffer from the problem of accidentally capturing variables, since it requires you to explicitly state which variables from the external env you want to reference (with the [a, b] syntax at the start of a lambda declaration; a lambda starting with [] fits your desired constraints).
Thanks for pointing to Java. I have not thought about it, because I use Java only very seldom.
The members of the current instance are not really an extra type of variable.
They are the members of one hidden function parameter. While they are distinguished in the source code, in the translated machine code there is no difference between them and the explicit function parameters.
I agree that this is a good feature of C++, that it avoids the bugs caused by accidental captures by giving control to the programmer on what may be captured.
Nevertheless, perhaps it would have been possible to have a simpler implementation of the C++ lambda-declared functions than with the current function objects, by simply disallowing the capture of stack-allocated local variables.
The thing is, all of these languages have adopted something like closures because they're just very useful. Creating different copies of a function with different values for each item in some local variable/collection comes up very often. Even Java actually supports it, since it does allow you to copy values from the local scope to an inner function scope, even though it doesn't allow you to capture variables. And, since this is a GC language, those values can be references to local objects that can survive after the outer scope exits if they are referenced from the inner scope.
Even in Java, this was there as part of Java 1.1, in the form of instance-scoped inner objects, in a form like this:
for (int i = 0; i < arr.length; i++) {
    final int copy = i;                          // copy the loop index into a final local
    listeners[i] = new ActionListener() {        // anonymous inner class captures 'copy' by value
        public void actionPerformed(ActionEvent e) {
            System.out.println("pressed button " + copy);
        }
    };
}
For all such examples of using closures there are alternative implementations that use more efficient mechanisms.
For examples like this, if it is not desired to use a single function with one extra parameter, then multiple functions that differ in using some different constants or different private static variables can be generated by some template mechanism, without using closures.
The problem with closures is that they introduce an overhead in any function that is passed as a parameter or used for some other reason through a pointer, by adding an extra indirection, even if in most cases this is not needed, instead of adding overhead only in the few cases when capturing local variables is really useful (unless something like whole program optimization is used).
C++11 closures are very thin sugar over locally defined structures with an overloaded operator(), which you have been able to define since forever, so I think they do not present any significant implementation complexity.
Different languages deal with the constraint you describe in interesting ways.
PHP has the “use” keyword, which captures variables explicitly. You can’t capture anything in the outer frame otherwise.
I wonder now if this has any performance implications.
In Rust, every captured variable is statically checked. The compiler knows that you capture a mutable reference for example. This often means that you need to express additional constraints in order to satisfy ownership semantics.
Now again, I wonder what the implications are on the generated code.
"Idea" languages take a concept and try to build almost everything out of it. So like, lisp with lambdas, or smalltalk with message passing, or prolog with logic statements, etc.
Then there are "practical" languages, which don't necessarily have a "core" idea; more likely they're defined by the domains they're useful in and have a grab bag of ideas. So like C++, typescript, etc.
I think "idea" languages have their appeal because they seem elegant, and as coders, we often try to take a novel idea as far as we can. Usually, that ends up being too far. Like, if you're building a React app, and you go overboard, maybe everything becomes a reactive component even when expressing things in that way can be awkward and your component isn't really a ui "component". Or if you're writing in a functional language, all the sudden loops become recursion and you're praying that it has tail call optimization. If you're writing java, maybe you see dependency injection everywhere. I think we just have this urge to see how many nails our shiny hammer can hit, even to the point of diminishing returns.
Anyway, I guess all I'm saying is that even if you can express something as turtles all the way down, you might not necessarily want to build your house out of turtles.