Esoteric Topics in Computer Programming (2002) (archive.org)
126 points by pmoriarty on Sept 16, 2016 | 87 comments

I've said it before and I'll say it again: go and learn Brainfuck. Just write a program that outputs your name or does FizzBuzz or something trivial, that's enough to get a grasp of it.

Brainfuck is to Turing machines what Lisp is to Lambda calculus. Minimum viable amount of syntactic sugar (and a bit of "semantic sugar" as well, it's nicer to work in bytes than bits) to make it usable as a programming language.

Lambda calculus is just a bit more practical formalism than Turing machines, that's why Lisp is genuinely useful as a programming language, unlike Brainfuck. That doesn't make learning about Turing machines any less useful as a brain teaser exercise.
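(If you want to try it: the whole language fits in a screenful. Below is a minimal interpreter sketch in Python; cell width, tape length, and EOF behaviour are implementation choices of mine, not part of any official spec.)

```python
def brainfuck(code, inp=b""):
    """Minimal Brainfuck interpreter: 8 commands, byte cells, one tape."""
    tape, dp, ip, out, inp = [0] * 30000, 0, 0, [], list(inp)
    # Pre-compute matching brackets so '[' and ']' can jump in O(1).
    stack, jumps = [], {}
    for i, c in enumerate(code):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while ip < len(code):
        c = code[ip]
        if c == '>': dp += 1
        elif c == '<': dp -= 1
        elif c == '+': tape[dp] = (tape[dp] + 1) % 256
        elif c == '-': tape[dp] = (tape[dp] - 1) % 256
        elif c == '.': out.append(tape[dp])
        elif c == ',': tape[dp] = inp.pop(0) if inp else 0
        elif c == '[' and tape[dp] == 0: ip = jumps[ip]
        elif c == ']' and tape[dp] != 0: ip = jumps[ip]
        ip += 1
    return bytes(out)

# 8 * 8 + 1 = 65 = ASCII 'A'
print(brainfuck("++++++++[>++++++++<-]>+."))  # b'A'
```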

> what Lisp is to Lambda calculus

It's more like what Binary Lambda Calculus is to lambda calculus. See my IOCCC entry



which happens to include a Brainfuck interpreter...

>> Brainfuck is to Turing machines what Lisp is to Lambda calculus.

Indeed, the lambda calculus was never meant as a real programming language.

> Indeed, the lambda calculus was never meant as a real programming language.

In fact, not even Lisp itself was originally meant as a real programming language! McCarthy invented it as a notation for mathematical proofs, and one of his grad students thought it might make a good language.

In hindsight it's no coincidence, since proofs and programs turned out to be the same thing :)


Never knew that. That's a trip. Great accident to have in one's career given all that came of it.


Look for the section heading "Catching Up with Math".

That was a good read. Appreciate the link.

BrainFuck is quite far removed from a Turing Machine.

In BrainFuck, the "data" (the memory cells being mutated by the program) are completely separate from the code. As a consequence, BrainFuck code cannot self-modify and it has to maintain two pointers (instruction pointer and memory pointer). BrainFuck also allows the instruction pointer to jump arbitrarily far (i.e. from a "]" back to the corresponding "[").

In contrast, a Turing Machine makes no separation between code and data; it just has a tape (the machine's transition table is more like a language's interpreter than a program). Not only can a program modify itself, for many tasks it must: since the tape only advances one cell at a time, moving between two arbitrary cells requires the contribution of all those in between; whenever we traverse a cell, we must also modify it in preparation for any subsequent traversal.

Whilst BrainFuck is certainly a nice idea, I don't think it's the best example of anything in particular.

I wrote up some thoughts on this a few years ago http://chriswarbo.net/blog/2014-12-22-minimal_languages.html

> In contrast, a Turing Machine makes no separation between code and data; it just has a tape (the machine's transition table is more like a language's interpreter than a program).

I think you're conflating "Turing machine" with "universal Turing machine".

In a standard Turing machine, the state transition table is the "code". That code could be an interpreter for another language, including for arbitrary Turing machines, or it could be something entirely different. Either way, it's static, just like a Brainfuck program. In fact, the translation from Brainfuck to a Turing machine is trivial: if each cell of the tape is a byte, then each character of the program becomes one state of the machine.
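To make the distinction concrete, here is a sketch (in Python, with an ad-hoc encoding of my own) of a plain Turing machine: the transition table is the fixed "code", the tape is the only mutable data. This one increments a binary number.

```python
def run_tm(table, tape, state="carry", halt="done"):
    """A plain Turing machine. table maps (state, symbol) ->
    (new_state, written_symbol, head_move); "_" is the blank symbol."""
    tape = list(tape)
    head = len(tape) - 1          # start at the rightmost cell
    while state != halt:
        sym = tape[head] if 0 <= head < len(tape) else "_"
        state, write, move = table[(state, sym)]
        if head < 0:              # extend the tape on the left as needed
            tape.insert(0, write); head = 0
        else:
            tape[head] = write
        head += move
    return "".join(tape)

# Transition table for "increment a binary number" -- this IS the program.
increment = {
    ("carry", "1"): ("carry", "0", -1),  # 1 + carry = 0, propagate carry left
    ("carry", "0"): ("done",  "1",  0),  # 0 + carry = 1, halt
    ("carry", "_"): ("done",  "1",  0),  # ran off the left edge: new leading 1
}

print(run_tm(increment, "1011"))  # 1100
```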

This sounds like an excellent excuse to plug my Brainfuck IDE.


Ahh... it's a work in progress. I should probably add an option to disable my optimization passes, for starters...

I've always thought the Turing LISP was Forth. It's a tiny stack machine with few ops plus some LISP capabilities. Also implemented in efficient hardware.

Is Turing Lisp a flavor of LISP? This is the first I've heard of it.

Turing Machines and Lambda Calculus are equivalent in power, but opposite in style, as ways of formally describing computation. Imperative programming came from one and functional programming from the other. One of the simplest... conceptually and in implementation... versions of Lambda Calculus is LISP. I'm guessing you already know what it can do. In Turing/Imperative Land, I suggested Forth comes closest to being the simplest of the very powerful languages.

There are probably implementations of all of these in all the others out there. I'm just saying Forth is a Turing-style language with LISP-like simplicity and benefits.

Ah OK, thanks for the clarification. That dichotomy makes perfect sense.

I haven't really looked at Forth; is your comment about its "LISP-like simplicity" why it's such a perennial favorite amongst language aficionados?

Any recommended readings on the subject (Forth)?

Intro here:


Yosefk's write-up is the definitive one for me, though, showing an intro, some amazing stuff about it, weird stuff, and reasons for ultimately not going with it. Link:


No, he means Turing machines are to FORTH as Lambda Calculus is to Lisp.


Chris' site didn't go anywhere, it just changed URLs: http://catseye.tc/

Thanks for the link!

However, it seems like the page linked in the post is not available anymore.

Well, you are talking about a gap of 14 years: the site's been reorganised since then, but all the content is still there.

One of my favourite examples is the blank C program that won the "Worst Abuse of the Rules" award in the 1994 Obfuscated C Code contest (last entry here):


It's a Quine, a Palindrome and a Null Program all in one!

I'm disappointed LOLCODE [1][2] is not on the list. [1] http://lolcode.org/ [2] https://en.wikipedia.org/wiki/LOLCODE

Can someone provide examples of programming forms, practices, or languages intentionally born esoteric that eventually turned mainstream? Nothing from this page popped out to me. It seems like sometimes an idea is so odd that it gets widespread attention just due to its oddness.

There's at least one on that list that's (at least in my experience) gone from relatively mainstream to esoteric - self-modification.

Back in the day when both memory and clock cycles were very precious, it wasn't unknown to use self-modifying code as a performance optimisation trick. I did it at least once in the late 80s, when I was working on comms software that had to be as fast as possible in order to avoid missing incoming data.

There was a check that needed to be done on every byte - I think it was whether I was now processing graphics characters or not - but the check was taking valuable time, and the value didn't change very often.

So the most efficient way I found to do it was to wait until I got a "switch to/from graphics" byte in the input stream and then update the instruction at a given location to either be "unconditional jump to graphics routine" or a "no operation (NOP)", which passed straight through to the routine for normal characters.

It was a horrible hack, but it worked.

Thankfully, I've not felt the need to even consider this approach for the past 20 years.
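(For illustration, a rough modern stand-in for that trick, without real self-modification: patch the dispatch target once when the mode-switch byte arrives, instead of branching on the mode for every byte. The byte values below are made up; Python sketch.)

```python
GFX_ON, GFX_OFF = 0x0E, 0x0F   # hypothetical mode-switch bytes

graphics_out, normal_out = [], []

def handle_graphics(b): graphics_out.append(b)
def handle_normal(b):   normal_out.append(b)

# Instead of asking "am I in graphics mode?" on every byte, rebind the
# dispatch target when the mode actually changes -- the moral equivalent
# of overwriting a jump/NOP instruction in place.
handler = handle_normal

def process(stream):
    global handler
    for b in stream:
        if b == GFX_ON:    handler = handle_graphics
        elif b == GFX_OFF: handler = handle_normal
        else:              handler(b)

process([1, 2, GFX_ON, 3, 4, GFX_OFF, 5])
print(normal_out, graphics_out)  # [1, 2, 5] [3, 4]
```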

That eventually turned into JITs, and it's still a very powerful technique for tight loops on modern processors, although the pipeline means the benefit happens with more iterations than on the old non-pipelined/cacheless CPUs. It can even be done across multiple cores, as I coincidentally explained here a short while ago:


I don't think SMC has ever been "relatively mainstream", at least after HLLs gained popularity over Asm. But in Asm, it still has its uses where a full JIT would be far too much overhead.

Very long ago, before index registers were a fancy new feature for the cool kids, one would modify the address fields of load/store instructions in order to stride through memory or traverse a list.

> Thankfully, I've not felt the need to even consider this approach for the past 20 years.

That's because of all the branch predictors, probably. ;)

Does Functional Programming qualify?

It is mainstream now that it is integrated into Java, C#, etc.

It was considered esoteric 20 years ago. Haskell initially had the slogan "avoid success at all costs".

SQL, HTML, CSS were probably esoteric at some point.

To clarify, the Haskell quote is usually parsed as 'avoid "success at all costs"', rather than 'avoid "success" at all costs'. Part of its rising popularity is probably the infrastructure work; despite the rough edges of Cabal and friends, I struggle to imagine what it must have been like before them!

Perhaps a more interesting example is the MLs, which have been around far longer than Haskell and seem to be gaining popularity for implementing and transforming other languages (that was their original purpose, after all!).

Scheme's also managed to maintain a minor presence since the 70s, especially in education and as an extension/configuration language.

Functional programming was definitely pretty esoteric when I learned about it during a CS degree in the mid 1980s!

HTML was regarded as too mundane to be interesting when it first came out - a lot of people working on hypertext at the time (early 90's) regarded it as a simplistic toy.

It's still a simplistic toy, albeit a highly formalized one nowadays.


Five or so years ago R was an esoteric language for statisticians, which had been around forever (especially if you count the predecessor S language).

Now the IEEE ranks it the 5th(!) most important (or something?) language[1], and it's the fastest growing language on StackOverflow[2].

That's a pretty amazing change for a language which has few redeeming factors (ha!) as a language, apart from lots of very useful libraries.

[1] http://blog.revolutionanalytics.com/2016/07/r-moves-up-to-5t...

[2] http://jkunst.com/r/what-do-we-ask-in-stackoverflow/

FWIW I was using R to produce Unix system-utilisation charts about 15 years ago — I never thought it was particularly esoteric.

Object Oriented Programming? I'm only half-kidding/-serious, btw. But it seems to me that there are a lot of things that become mainstream after an initial bump of esoteria .. React? Lua? Isn't this sort of the norm for new technologies ..

Javascript? j/k

Duff's Device comes close. It was born out of a real engineering problem, but spread as esoterica and I'm sure I'm far from the only one to learn it as esoterica and later use it in production.

If you're going to list Duff's Device, we should also mention Carmack's fast square root (despite it not actually being invented by Carmack). I dunno if you'd call it mainstream but for a while there it was in every wannabe game engine coder's repertoire.

Yeah, you should really call it SGI's FastInvSqrt. It's still a pretty cool hack.
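For anyone who hasn't seen it: the hack bit-casts the float to an integer, subtracts half of it from a magic constant, and polishes the result with one Newton step. Below is a Python transliteration of mine of the classic C, using struct for the type-punning the C does with pointers:

```python
import struct

def fast_inv_sqrt(x):
    """The 0x5f3759df hack: approximate 1/sqrt(x) for a positive float."""
    i = struct.unpack("<I", struct.pack("<f", x))[0]  # reinterpret float bits as uint32
    i = 0x5F3759DF - (i >> 1)                         # magic initial guess
    y = struct.unpack("<I", struct.pack("<I", i)) and \
        struct.unpack("<f", struct.pack("<I", i))[0]  # bits back to float
    return y * (1.5 - 0.5 * x * y * y)                # one Newton iteration

print(fast_inv_sqrt(4.0))  # close to 0.5 (within roughly 0.2%)
```

After the single Newton step the relative error stays under about 0.2%, which was good enough for lighting calculations and far cheaper than a divide plus square root on 90s hardware.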

UNIX was born as what you could consider a relatively esoteric response to the more popular MULTICS project. Same with B/C and BCPL. But I think that considering C and UNIX esoteric is a bit of a stretch, taking into account that they were developed at Bell Labs and almost everything related with computers could be considered esoteric back then.

Oh, I forgot the most prominent one as of late: Urbit, which started as a functional esoteric language with a Martian aesthetic... written by a guy famous for... other things... and... somehow hopped on the blockchain gravy train? last I heard it was... auctioning address space? for surprisingly large amounts of money?

Urbit is probably one of my favorite things. The language is insane. The base virtual machine it runs on is insane. The entire stack rewrites everything for the sake of doing everything perfectly. It redefines whatever it wants. And somehow, it's actually usable.

Hoon is a programming language that uses two-character runes instead of reserved words. It's purely functional and vaguely Lisp-like. It compiles down to a minimal combinator-based virtual machine built entirely out of cons cells and bignums, where the only operator is addition. And yet throughout all of that, it's actually possible (and scarily easy) to code in, and I'm somehow fluent in it. That kinda scares me.

Even within the blockchain space it's still quite esoteric.

Personally I think it looks like a great opportunity for an interesting weekend.

INTERCAL is famous for having a COMEFROM instruction, as an inverse of GOTO.

With GOTO, we can read a section of code and if it contains "GOTO N" then we know the execution will jump to location "N", so we can look up that location and keep reading. We have no idea if any other code will jump into the code we're currently reading, unless we search for "GOTO <location we're reading>".

COMEFROM is the opposite: when we're reading a section of code and we see "COMEFROM N", we know that the code at location "N" will jump to this section. We have no idea if the code we're reading will jump to somewhere else, unless we search for "COMEFROM <location we're reading>".

Despite being invented as a joke, this is very popular in mainstream programming, under the name "exception handling" ;)
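The analogy, sketched in Python with toy names of my own: a raise site can't tell you where control will land, while an except clause effectively declares "control comes from any matching raise below me".

```python
def parse(token):
    # This code cannot tell you where control goes on failure...
    if not token.isdigit():
        raise ValueError(token)   # ...it just vanishes "upward".
    return int(token)

def main(tokens):
    out = []
    for t in tokens:
        try:
            out.append(parse(t))
        except ValueError:        # "COMEFROM any ValueError raised below"
            out.append(None)
    return out

print(main(["1", "x", "3"]))  # [1, None, 3]
```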

Exception handling requires some piece of code to actually throw the exception though. A more thorough instantiation of the idea would be aspect-oriented programming.

There was some Unix utility (`yes`?) that was implemented on many systems as an empty shell script. I read about it in an HN comment; can't remember which.

Possibly ":" (do nothing successfully) before it was a shell builtin?

I think you're thinking of `true`.

Oh, and I wrote small web apps in Camping.rb. Working with _why's code was a great pleasure, especially in those rare cases when the documentation wasn't good enough and I had to dig through the code.

Yeah, that squares with what I've heard about _why: Great ideas, not-so-great code.

Nononononono. The documentation was great, the times where I had to read the code were rare, and they were a pleasure. His code is not always the most pragmatic or straightforward, but it is packed with beauty, and I much prefer it to digging through heaps of very readable, character-less code.


I've heard tales of some of his code being absolutely unmaintainable. Guess I was wrong.

Every language and every practice that is now popular was once obscure.

Sure, but most weren't designed to be intentionally indecipherable or obtuse.


Missing from this list: FRACTRAN, a Turing-complete language based on fractions, invented by John Conway.
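For the curious: a FRACTRAN program is just an ordered list of fractions. At each step, multiply the current integer n by the first fraction that gives an integer result; halt when none does. A sketch, using Conway's addition example (the one-fraction program [3/2] turns 2^a * 3^b into 3^(a+b)):

```python
from fractions import Fraction

def fractran(program, n, max_steps=10_000):
    """Run a FRACTRAN program on integer n: repeatedly replace n with
    n*f for the first fraction f such that n*f is an integer."""
    for _ in range(max_steps):
        for f in program:
            if (n * f.numerator) % f.denominator == 0:
                n = n * f.numerator // f.denominator
                break
        else:
            return n          # no fraction applies: the program halts
    raise RuntimeError("step limit reached")

# Conway's addition: encode (a, b) as 2**a * 3**b; running [3/2]
# strips the 2s one at a time and leaves 3**(a+b).
add = [Fraction(3, 2)]
print(fractran(add, 2**3 * 3**4) == 3**7)  # True (a=3, b=4 -> 7)
```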


On the palindrome section:

> I'd like to see one where the entire code is executed, or one done in Brainf*ck, as well.

I might be wrong, but this seems a really bad example for palindromes. Aren't all palindromic strings made of `+-.,<>` valid Brainfuck programs? I know that this subset is not Turing-complete, but I also think that any valid Brainfuck program with `[]` can't be a palindrome, which again would make this language a bad example.
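That intuition seems right, and it's checkable: the six commands `+-.,<>` have no syntax at all, so any string of them is a valid program, while a palindrome that contains brackets can never have them balanced (look at its last bracket: if it's `]`, mirroring forces the first bracket to be a `]`; if it's `[`, it can never be closed). A brute-force confirmation of my own, in Python:

```python
from itertools import product

CMDS = "+-.,<>[]"

def balanced(code):
    """The only Brainfuck syntax rule: every ']' closes an earlier '['."""
    depth = 0
    for c in code:
        if c == '[':
            depth += 1
        elif c == ']':
            depth -= 1
            if depth < 0:
                return False
    return depth == 0

def palindromes(length):
    # Generate every palindrome of the given length over the 8 commands
    # by mirroring each possible first half.
    half = (length + 1) // 2
    for h in product(CMDS, repeat=half):
        yield "".join(h) + "".join(reversed(h[: length // 2]))

# Exhaustively confirm, for short programs, that no palindrome which
# actually uses brackets is syntactically valid Brainfuck.
for n in range(1, 7):
    for p in palindromes(n):
        if '[' in p or ']' in p:
            assert not balanced(p), p
print("no balanced bracketed palindrome up to length 6")
```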

No love for ternary computers?

You're looking for Malbolge: https://esolangs.org/wiki/malbolge

I was actually thinking of the Setun: https://en.wikipedia.org/wiki/Setun and DSSP. But thanks for the reference; I did not know Malbolge (though it looks like it uses the set 0, 1, 2 instead of the IMO more useful -1, 0, 1).

Malbolge (and Dis) are the classic ternary esolangs. I fiddled around with that in a long lost esolang back in the day.

Keep in mind that Malbolge is meant to be awkward, so while (-1, 0, 1) might be more useful, (0, 1, 2) suits Malbolge's perverseness much better, though I doubt that was a conscious decision.

This is when you know you're about to experience a steep learning curve:

"language designed to be as difficult to program in as possible. The first "Hello, world!" program written in it was produced by a Lisp program using a local beam search of the space of all possible programs."

And why Dis exists.

templeOS should be mentioned here :) (written by a genuine bigot)

Not to excuse his occasional behavior, but he has his good moments. Some of his writings and interviews are pretty ingenious. Not to mention his projects; I sometimes enjoy studying them.

You gotta give him some benefit of the doubt. He has a genuine, diagnosed mental illness. And even if he did say racist, misogynist and other derogatory things, some of it was years ago and he might have changed as a person (and hopefully for the better). The memory of the Internet is sometimes too long.

I gotta agree with him about one thing: computers are best when they have 16 colors (I spend most of my time in xterm with my favorite sixteen colors).

> And even if he did say racist, misogynist and other derogatory things, some of it was years ago and he might have changed as a person (and hopefully for the better). The memory of the Internet is sometimes too long.

While this charitable impulse is to be admired, I'm not sure "that was a long time ago and he might have changed" is inherently exculpatory.

He hasn't changed, and it's particularly difficult for him to change given his condition. From everything I've come across that he's written over the years, the racist, misogynistic, and generally derogatory stuff he writes seems to come out when he's not lucid and is having an episode. Schizophrenia isn't something you can really change, just at best manage with the help of others, and that, while maybe not totally exculpatory, does at least require us to be more tolerant of him than we would be of somebody who wasn't dealing with schizophrenia.

> exculpatory


It means 'tending to clear of charges'. Google (or a dictionary) is your friend.

> some of it was years ago

Check his post history, he's still doing it. (I don't remember his current username otherwise I'd post links)

yes, he also has some beauty in his mind :)

I think, however, it's not just years ago. I wrote some more about this in the comment below ( https://news.ycombinator.com/item?id=12513807 )

Ignoring the guy himself, I think the unique thing about it is why he did it. The motivation to achieve spiritual goals through programming is one that doesn't get tapped a lot despite its potential to bring in contributions. There are definitely software developers making products to serve religious markets. I'm talking more about tapping into that desire to make a difference and do positive things with one's time, with an extra layer of benefits from religious belief.

The moderators or leaders need above-average talent for such places, though. ;)

that guy is insane - it's a constant reminder to never get too overzealous about the value of your work. But I do have to admit, gotta love his intensity.... but he's a bigot so fuck him lol.

To be fair, the reason for a lot of that is his schizophrenia, so cut him a bit of slack.

This is an interesting topic in itself. As soon as you are diagnosed you are excused. I don't generally oppose that, but would want to extend it to all people.

As a believer in causality and a bit of chaos, I think we are 100% a product of our surroundings and in that all innocent in all behavior.

In my home country (Germany) this is luckily somewhat founded in the legislative system. If I'm not mistaken, nobody here will get punished for the sake of revenge, but just to protect society. No matter what you do, you deserve the right to live an OK life. If you are a danger to society, its safety might however stand above your freedom of movement.

It's all about prevention through detention and deterrence. I don't believe much in deterrence, but I want to keep up the corrective of society and make no exception for the bad behavior of this guy. If I addressed him directly I would aim for a more diplomatic approach, though. In general, I want to express that I'm very much against some of his views and found it difficult to promote his work without distancing myself from his behavior and criticizing it.

> As soon as you are diagnosed you are excused.

Not excused, but allowances made. Unless somebody is an actual danger to others, if they exhibit some behaviour they have little or no conscious control over (such as is the case in schizophrenia, Tourette's, &c.), then they deserve at least some leeway. If it's possible for them to consciously work towards controlling such behaviour, then they deserve some leeway, just not as much. Regular assholes don't get a free pass in my book, though. Terry Davis has delusions he can't do much about, and that makes him different from somebody not dealing with schizophrenia, whose problem isn't due to delusions but attitudes. This is an important difference, regardless of whether you consider free will to be a thing or not.

You might find this episode of the Rationally Speaking podcast interesting, as the guest aligns in some interesting ways with your point of view, and diverges in some even more interesting ones: http://rationallyspeakingpodcast.org/show/rs-163-gregg-carus...

Thanks, will give it a listen. And yes, my point of view becomes more difficult the more rational the person acts when doing harm. The stereotypical evil banker is such a case, I find it much harder to find sympathy for him than for the crazy guy who does something horrible in a delusional state of mind.

I think it also makes sense to punish the rational criminal harder, because it actually might help, while the delusional person needs another kind of help.

I'm not sure how much you can do about your attitudes, I think freedom of will is an illusion. If you decide to turn your life around and become a good person, what made you do that? What made another one not do that? I think it's, as said, just the consequence of everything that ever happened before.

I sadly can't find the article anymore in which it said that the illusion of a self and free will is important for staying the same person over time, and that only then can someone be held responsible for something, and that society can only work if you have something to fear from doing wrong. The article even claimed this was demonstrated by some simple evolutionary algorithm.

I disagree, not necessarily with your premise (that we are 100% a product of our surroundings) but with the conclusion (that we are all innocent in all behaviour)—since, if we're all innocent, then the word (or at least the distinction between innocence and guilt) loses its meaning, and it seems like a meaning that I want to preserve.

However, I like your vision even if I find it unrealistically utopian, and I upvoted you for the sentiment "No matter what you do, you deserve the right to live an OK life", which is curiously heartwarming.

I don't want to argue about words :). Guilty, as in "doing bad to society" or "disobedient to the law" or "doing despite better knowledge of the contract of society" is something I still could get behind, just how to react to it and what is the cause I find debatable.

About the unrealistic utopia: as said, some of this is fixed in German law. I think there is a maximum of about 25 consecutive years of jail and of course no death sentence. More jail time is only lawful as detention to protect society, for which you have to be diagnosed as mentally ill (highly likely to do very bad things again).

edit: actually 15 years is the German "for life", and you can't get two life sentences at once, even for multiple offenses. There are just a few exceptions extending it to about 23-25 years. The ruling that led to this law says that a human is nothing more than a wreck after 20 years, that a longer sentence helps neither resocialization nor protection of the law, and that the total psychological destruction would strip the affected of their basic human rights. (German source, no idea how to deep-link: http://www.servat.unibe.ch/dfr/bv045187.html search for "Wissenschaftliche Untersuchungen" to find the passage)

Right, and so far from 'Separate the artist from the art' that when OP refers to TempleOS, he replies "That guy..."

And bookending it with 'lol'.

ok I am cutting the bastard some slack - god speed to this proud bigot.

If I ever start working on an OS as more than a practice exercise, I will have to ask friends and family for an honest assessment of my sanity.

Does denouncing someone as a bigot make you yourself a bigot or is it okay because _obviously_ their views are bad and it's okay to take a moral high ground?

I don't really care to spend that much time considering the political correctness of my opinions - I use my best judgement.

If you put your opinions up on youtube (whether you have a mental illness or not), you open yourself up to public criticism. Maybe 'bigot' is a shitty word in general, and I was being partly facetious - but I've watched a few of the full templeOS youtube videos and they're disturbing - that's just my opinion.

It might just do when you don't take said person's extenuating circumstances (schizophrenia, in this case) into account when judging their actions.

Funny how non-imperative programming languages, FP for example, are in the "esoteric" group. Looking at the industry today, that no longer seems to be true.

Beautiful and lots of fun. Thanks for sharing.
