I know we shouldn't care about the source, crazy-insane as he obviously is, but FWIW, the author is Yusuke Endoh, one of the Ruby core members. RubySource had an interview with him here:
This is truly baffling! I had to run it on my machine, and it really works as advertised.
For the curious who can't (or don't want to) install the compilers for all those languages, here's a zip file with all the intermediate files generated https://dl.dropboxusercontent.com/u/2683925/bigquine.zip
I've started looking through it. It's very clever, most of it beyond my ken, but even the technique with the image template of '#'s is a treat when you start to see how it works.
And there are interpreter implementations for both Whitespace and Unlambda.
There's even a Rake task that generates a Makefile to run all 50 steps, using make's dependency tracking to determine the order of execution instead of running the 50 commands one after another in a single rule.
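The generated Makefile presumably has roughly this shape (a hedged sketch; the file names and commands below are illustrative, not the actual generated rules):

```make
# Illustrative only: each stage's output depends on the previous stage's,
# so make's dependency graph fixes the execution order for us.
QR2.scala: QR.rb
	ruby QR.rb > QR2.scala

QR3.scheme: QR2.scala
	scala QR2.scala > QR3.scheme

# ... one rule per remaining stage, looping back to Ruby at the end ...
```

Because every rule names its prerequisite, make can also resume a partially completed relay without redoing finished stages.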
It's like everything in this project is incredibly elegant and self-referential.
Perhaps my favorite paragraph on all of wikipedia:
INTERCAL has many other features designed to make it even more aesthetically unpleasing to the programmer: it uses statements such as "READ OUT", "IGNORE", "FORGET", and modifiers such as "PLEASE". This last keyword provides two reasons for the program's rejection by the compiler: if "PLEASE" does not appear often enough, the program is considered insufficiently polite, and the error message says this; if too often, the program could be rejected as excessively polite. Although this feature existed in the original INTERCAL compiler, it was undocumented.
> COME FROM also allows elegant aspect-oriented programming :)
I used to make fun of Intercal, until I had to work with a Simulink model that made use of FROM blocks.
It's just as retarded as you'd expect. It's as if someone intentionally decided to take a step back from structured programming, and then some more steps back.
Or J to keep things ascii. Though I would have loved to have seen Malbolge. (http://en.wikipedia.org/wiki/Malbolge) It's already super impressive, but with some extra work it could be hyper impressive!
The code that generates the quine is included. There's still lots of super impressive stuff here, but don't be intimidated thinking he wrote that monster by hand!
Wow, even the source code of the original file is arranged to look like the Star of David surrounded by the dragon-eel thing, Obfuscated-C-contest style:
My thoughts went from "This is amazing" to "But he must have taken a lot of time and brainpower to do this" to "This is useless really" to "So what? If I wanted to do that I would take a year perhaps if I ever could" to "This is amazing"
That is both the most incredible thing I've ever seen, yet also by far the saddest thing I've ever seen. Props to the author for taking dedication to a whole new level.
It's not sad at all. Really good programmers can do things that ordinary programmers think impossible. This code was probably generated. Here is a paper on quine generation [1], and the authors have a very popular talk where they livecode a quine generator; that is, they livecode a program that generates an infinite list of quines [2] (source code from the talk: [3]).
I think he's referring to the relative uselessness of the code. Yes, it would have taken a lot of time, talent, and intellect to produce, but from what I understand it's of relatively little use?
There has been a spate of comments criticizing stories along the lines of "what use does it have?" As pointed out previously on all of them, and at the risk of sounding redundant, this is "Hacker News", not "Only Useful Things News" or even "Mildly Interesting News".
Looking at the front page of HN, I think it's pretty safe to say it's moved a fair distance from news of only VCs, startups and such to the realm of what we find interesting. This is interesting. It's not useful, particularly well built or even presentable at a sales meeting. But it is interesting.
possibly that someone clearly very smart wasted his/her intellect doing this, rather than advancing the species by curing obesity, heart disease and cancer?
If I make a ranked list of all the ways that people spend their time, I cannot imagine a sensible scoring algorithm that puts making a quine relay lower in value than posting an internet comment bitching about somebody making a quine relay.
Very funny. I think it might be possible to come up with a sensible algorithm that yields such a result. If so, it would definitely have $time_spent in a denominator somewhere :)
I also know plenty of people who waste their intellect painting canvases, when they could be advancing the species by doing something practical. Shall we shed a tear for their wasted efforts as well? But of course not. Art is its own justification.
Art that only exists for the sake of existing, without intent to express, explore, or otherwise stimulate the human mind - art of that sort is garbage. There are plenty of people who vehemently defend that garbage, but that doesn't make it any less a wasteful use of human time and attention. Personally, I will never respect it simply because there was a lot of effort involved.
"I will never respect it simply because there was a lot of effort involved."
Nor will I, or much of anyone else, I imagine. I respect it because it stimulates my mind. My imagination is moved by it in much the same way that it's moved by holding an acorn in my hand while contemplating a fully grown oak tree. Or studying the math that causes z <- z^2 + C to describe a ridiculously intricate shape. Or implementing my own minimal Lisp interpreter from scratch. Or trying to understand the Banach-Tarski paradox. These things stimulate my sense of wonder because they're like a magic trick, but there's no trickery involved. And when I contemplate them, I learn more about the world. Art cannot aspire to much more than that.
Yet people gladly upvote stories about computers implemented inside Minecraft and life-size spaceships made out of Lego blocks. Are you saying it has nothing to do with the difficulty involved? I doubt this story would be equally well received if there were some easy way to achieve the same result. If you take the difficulty of the task out of the equation, what will be left? How much of what's left is genuine creativity and skill, and how much is mere patience?
Eli Siegel explains in his great essay "The Serious Aspect of Snobbery":
Snobbery is the unwillingness to like something, unless at the same time it
makes one feel more important;...
[It] is the inability or lack of desire to appreciate justly.
You're using a quote of a quote to launch a personal attack, while explicitly doing the very thing you're accusing me of. That is, you assume that the motivation behind my comment is to feel superior to certain other people by excluding some things from the realm of "good art". Yet your comment explicitly divides people into "snobs" and "not snobs", makes a fairly obvious statement about the superiority of one group, and hints that you belong to the other.
>...art of that sort is garbage. There are plenty of people
who vehemently defend that garbage, but it doesn't make
it any less wasteful use of human time and attention.
...is your opinion. An opinion that, as the downvoters have noted, is full of snobbery. Not even the mildly vindicated snobbery that's tolerated on this forum, like picking your religion (i.e., favorite IDE, programming language, etc.), which someone could at least assert with reasonable confidence from anecdotes of personal experience; it was wholly devoid of intellectual merit... and it was snobbery.
That's not a personal attack. I don't know you personally, nor was it an attack; I felt it was merely an observation. But if you took an observation as an attack, doesn't that speak to the validity of the quote?
I don't appreciate a whole swath of what is considered "modern art" either (though I do like Dada, if only for its quirkiness, which I find interesting). But never have I called someone else's creation, born of time and effort, "garbage" purely due to my lack of appreciation.
I think the notion that the purpose and intent behind art are far more important than the amount of effort involved in its creation has a considerable amount of intellectual merit. Even more so considering how unpopular it is. Whether people choose to read the whole comment, or prefer to see "bla, bla, bla, garbage!" is a different matter.
>Whether people choose to read the whole comment, or prefer to see "bla, bla, bla, garbage!" is a different matter.
Interesting that you assume we didn't read the rest of the comment. I assure you most of us do prefer to take things in context. It's OK to look up once instead of looking down all the time.
Purpose is ancillary to art. Art without purpose, art that merely is as art, has been a cornerstone of creative expression for as long as art has existed. The fact that you prefer to inject purpose into it now has no bearing whatsoever on what people will consider artistic. You're not the arbiter of worth or taste.
Unlike the others, I completely agree. Art that does not "express, explore, or otherwise stimulate the human mind"[1] is the definition of boring.
I would however claim that the vast majority of art does stimulate the mind, though it may not stimulate yours without some additional effort. Indeed just about anything in the world is capable of stimulating your mind if you look closely enough at it, to paraphrase John Cage.
[1] ... or, if we take a 'vibrant matter' / 'object oriented ontology' view of the world, stimulate something.
No, I wasn't quoting anybody, I was paraphrasing John Cage. Here is the quote:
“If something is boring after two minutes, try it for four. If still boring, then eight. Then sixteen. Then thirty-two. Eventually one discovers that it is not boring at all.” ― John Cage
On the contrary, to the extent something intends to "express, explore, or otherwise stimulate human mind," it isn't art - it's rhetoric, or science, or entertainment. Obviously lots of works of art were created for some end (renaissance paintings functioned as religious and political propaganda; Shakespeare's plays are entertaining), but they are art to the extent that they step back from that end, and display what Kant calls "purposiveness without particular purpose."
I think that people view art as a side effect of abundance. It is something that naturally arises given a surplus of some thing.
The point of contention here is that someone of that level of intellect has "wasted" their talent on art, rather than productivity, which "should" come first.
For varying definitions of "should" and "wasted", of course.
The effort involved in doing this is incomparable with the things you just mentioned. This is nothing like baking a cake, more like spending thousands of dollars to build a life-sized Eiffel Tower out of matchsticks.
You have no idea how long it took to write this program. I bet it's less than the amount of time a typical baseball fan spends watching games over the course of a season.
P.S. Knitting a sweater takes forever too. Especially if it has buttons.
Who cares about whether a person should build an Eiffel Tower out of matchsticks or go try to solve cancer? Pushing humanity forward is great - and so is finding personal happiness and growth through exploration. Don't knock one or the other; in fact, don't knock any at all. Both are ways to ascribe meaning to this meaningless place.
The attitude he is demonstrating is backed by some perverse notion that "smart" people owe so much more to society than the rest of humanity that they shouldn't have hobbies, because that would delay them paying back their debt.
I don't know what could cause somebody to adopt such a twisted notion, but I think the roots of it go deeper than I care to know.
It depends on the quality of the cake you are eating. There's a massive difference between a chocolate birthday cake for a 4-year-old and profiteroles: http://en.wikipedia.org/wiki/Profiterole
s = (something that generates your own source code)
puts s
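For a concrete version of that sketch, here is one classic way to do the "(something that generates your own source code)" part in Ruby, using a format string whose `%p` slot receives the string itself (a standard textbook technique, not the relay's actual code):

```ruby
s = "s = %p\nputs s %% s\n"
puts s % s
```

Running it prints exactly those two lines: `%p` inserts `s.inspect` (the string re-quoted and re-escaped as a Ruby literal), and `%%` becomes a single `%`. Comments are left out deliberately, since they would not survive the round trip.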
At the same time, a Ruby program that outputs a Perl program looks like this:
puts "print(\"...\");"
If you combine these with quine methods, you get a multi-quine that goes between Ruby and Perl:
s = (something that generates your own source code)
puts "print(\"#{ s }\");"
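Filling in that sketch gives a working two-language version (a minimal illustration, not the relay's actual code; comments are omitted because they would not survive the round trip). This Ruby program prints a Perl program, and running that Perl program prints the original Ruby source:

```ruby
s = "s = %p\nputs \"print(\" + (s %% s).inspect + \");\"\n"
puts "print(" + (s % s).inspect + ");"
```

Here `s % s` reconstructs the Ruby source itself, and `.inspect` re-escapes it as a string literal that happens to be valid in Perl too (it contains no `$` or `@`, so Perl's interpolation does nothing to it).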
Theoretically you just repeat this process. Except, there are the following practical problems:
- Need proper escaping
+ Lots of old languages do not support C-style escape sequences.
+ Depending on languages, you need to escape interpolations.
- Cobol and Fortran programs have troublesome constraints
+ Need 7 spaces on the left and less than 80 columns per line, etc.
- Naive escaping will cause intermediate files to get too large.
+ JVM strings have a 64K size limit.
+ Languages like Intercal and Whitespace will become unbearably slow.
To avoid these, Perl requires something like ppencode, and Java needs LZW compression to stay under the string size limit. Cobol's line-length limit was satisfied using the Clojure generator, and Fortran's line-length limit using the CoffeeScript generator. Anyone interested should look at the generated code.
In general, there isn't any standard method to making quines. You just keep on tweaking the code until the input and output matches, mostly by trial and error.
I don't think people want to spend all their time trying to fight cancer or heart disease. Once in a while, you really need to get your mind to work on entirely useless things. IMO, that sort of mental workout makes people tackle the real hard problems better.
So are you saying that if he had not spent his time doing this, he would have cured cancer? You're forgetting the number of people this will inspire to go on and create amazing software.
Saying that hacking is "sad" is arrogantly elevating your own limited worldview over the worldview of hackers. Perhaps someday you could hope to become as awesome as the person you're criticizing, but first you have to realize that being a hacker is something worth aspiring to.
Working on it now. I started a repo where I document the things I had to do to get it to work, i.e. where I had to do something other than what was in Mame's README:
Edit2: I ended up copying so much of what was already there that I re-did this as a fork, adding my contributions in two directories (installation and intermediate):
So far I'm stuck at an issue with the Pascal compiler not accepting long strings, which can be fixed either with a compiler switch or by editing the source earlier in the relay.
I'm grabbing the packages (in Ubuntu 12.04, just sub "clojure1.4" with "clojure" and you should be able to grab them all)... will attempt to run it once I get this all set up.
The symbol of note is not "The Star of David", but instead Ouroboros, the eternal snake... symbolized by a snake eating his own tail.
I believe he chose the symbol Ouroboros to represent the "immortality through change" of the code. In one story... Ouroboros was an immortal snake who constantly shed his skin, and took on many forms.
Similarly, this code constantly changes form, yet is immortal. While it changes form, it manages to keep its identity.
The Star of David is incidental. As far as I know... it was added to the Ouroboros symbol in the anime/manga Full Metal Alchemist, but that rendition is perhaps the most famous artistic rendition of Ouroboros in recent culture.
Ouroboros had some connotations in the classical pseudo-science of alchemy as well. So perhaps there is an earlier version of Ouroboros + Star of David.
> The Star of David is incidental. As far as I know... it was added to the Ouroboros symbol in the anime/manga Full Metal Alchemist, but that rendition is perhaps the most famous artistic rendition of Ouroboros in recent culture.
The "Star of David", like many Jewish symbols, has long been associated with alchemy and magic arts (and other less savoury enterprises) in classic European cultures. See for example [1] from a basic google search. This is likely why it was used in the anime, probably just to avoid the usual boring pentagram.
Which means it gets shown in anime for pure Rule of Cool reasons, and then Japanese people have not the slightest idea that it's actually the flag-symbol of Jews.
It's not like I would know the standard symbol for Shintoism unless I googled it.
Actually, it's the Ouroboros tattoo on the villains Lust / Gluttony / Greed / etc. from Full Metal Alchemist, who were created using the human transmutation circle (getting into the details of which would be spoilers...).
NB: I originally posted this with the link pointing to the QR.rb file, which is the main code file and also a piece of Ascii art. Looks like the mods switched the link back to the main project to provide more context. But to have people looking at QR.rb first and then reading about what it does afterwards was the idea. Amazing piece of work by @hirekoke.
This reminds me very much of the various mathematical exercises that seemed very esoteric and academic but turned out to be very useful in some solution in physics (e.g. Lorentz contractions). I would not be surprised if some insight indirectly comes from this insanely amazing exercise.
If computer languages were evolving organisms, might this be a metaphor for how genetic/chromosome encoding reshuffles itself? Does it approximate some common, minimalist packaging of the DNA shared among all these fifty(++?) programming languages? Can we consider this Quine Relay a `prequel' ancestral "Hello Dad" gene? And is its wonderful aesthetic creativity (layperson-recognizable, understandable information, and charming symbols at myriad levels of representation) enhancing its survival as a persistent executable (living?) `gene-meme'? E.g., is anyone else posting some of this on Facebook?
Per the explanation linked in the discussion [1], the initial file generates code in another language. That code, when executed, generates code in yet another language. The process continues until the output is the code of the original file.
What is unclear to me is how the final program manages to represent the entire code (thus the entire language loop). Wouldn't doing that make the source code recursive and infinite? Or am I missing something important?
If you look at the ruby source file[1], you'll see that the first thing it does is take all of its source and put it in a variable $s. Then it evaluates the source code in that variable, with that source having access to itself as a string. All of the intermediate representations could easily contain this string in its entirety (and they likely do). Hopefully you can see now how it's not so impossible (but still ridiculously awesome!)
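That trick can be sketched in a few lines (hypothetical code, not the actual QR.rb, which additionally packs the string into ASCII art): assign the program body to a global string and then eval it, so the running code holds its own text.

```ruby
# The body is stored in $s *before* it runs, so the evaluated code
# can refer to its own source as a string.
eval $s = %q{
  puts "I am #{$s.size} characters of Ruby, and here is my source:"
  puts $s
}
```

Note that `%q{...}` does not interpolate, so `#{$s.size}` stays literal in the string and is only expanded when the string is eval'd, at which point `$s` is already set.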
Representing self is actually not that difficult or mind-bending; you just need a representation that is isomorphic to the program source. For instance, you could store the ASCII symbols in an integer array:
int array = { ... }  // holds the source code,
                     // except its own representation
int array_index      // the index in the source code where the
                     // integers of `array`, interpreted as source, appear

for i = 0 to array_index:
    print(array[i] as an ASCII character)
for i = 0 to array_length:
    print(array[i] as integer ++ ", ")
for i = array_index + 1 to array_length:
    print(array[i] as an ASCII character)
The core idea is that you can interpret `array` in two ways, as an array of integers or an array of ascii characters representing the program source. The only difficult part is adjusting array_index. With a little effort, this can be scaled to a chain of languages.
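Using the same idea, here is a runnable Ruby sketch (names hypothetical): encode a template with a placeholder as byte values, then substitute the byte list back into the placeholder. The string it builds is itself a standalone quine.

```ruby
# Template of the quine, with "@" marking where the byte list goes.
# sub replaces only the first "@", so the one inside the sub call survives.
template = "d=[@]\n" + "puts d.pack(\"C*\").sub(\"@\",d.join(\",\"))\n"
d = template.bytes
# Substituting the numbers for "@" yields the quine's full source:
quine = "d=[#{d.join(",")}]\nputs d.pack(\"C*\").sub(\"@\",d.join(\",\"))\n"
puts quine
```

When the generated program runs, `d.pack("C*")` turns the numbers back into the template text, and the `sub` re-inserts the numbers, reproducing the source exactly.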
We already have lots of single-language quines, which are programs that output exactly their own source code when evaluated. In other words, there's a text C and a function F such that F(C) == C (i.e. C is a fixed point of F), where F is a single programming language evaluator and C is a source code file.
In this case, however, the function F happens to be the composition of a bunch of individual functions. So F is equivalent to Rexx ∘ R ∘ Python ∘ Prolog, but with many more functions composed than I am willing to type right now. This doesn't change the nature of what we're searching for, which is simply a fixed point of a given function. Realistically, of course, it makes it more difficult and impressive for someone to actually accomplish, but mathematically/computationally speaking it's the same thing.
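A toy illustration of the fixed-point view (Ruby, with trivial stand-in "evaluators"; in the real relay each stage would be "run this program, capture stdout"):

```ruby
# Two pretend "languages": each evaluator is just a string transformer.
e1 = ->(src) { src.swapcase }          # stage 1 flips the case
e2 = ->(src) { src.swapcase }          # stage 2 flips it back
f  = ->(src) { e2.call(e1.call(src)) } # F = E2 ∘ E1, the composed relay
c  = "a fixed point of f"
raise unless f.call(c) == c            # C survives the whole loop unchanged
```

The intermediate form `e1.call(c)` differs from `c`, just as each intermediate program in the relay looks nothing like QR.rb; only the full composition maps the source back onto itself.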
This is a neat project but somehow the implementation is a lot less mind bending than the definition. Which is rather the opposite of what I experienced when I first heard of a Quine and figured out how to write one (http://en.wikipedia.org/wiki/Quine_(computing), Turbo Pascal in my case, heh).
No more recursive or infinite than any other self-replicator that has intermediate "bodies".
EDIT: To clarify, there's actually a wrapper script on top of it all that tells the first script to execute, storing its output in file 2, then execute file 2 with language 2, then execute file 3 with language 3, and so on.
Quines work the same way as cellular reproduction. I don't think there are any life forms that carry alternation of generations out to 50 different stages, though; the most I've heard of is two.
Awe inspiring stuff! But as a Ruby illiterate, I am left wondering what it is about the Ruby language that draws in such a level of creative / abstract / esoteric genius to use it as a starting point. Is there something about Ruby in particular, or is just a case of an individual obsessing over a craft?
Remember those delightful self-referential Scott Kim creations in Godel Escher Bach? Here one is made alive, and it becomes most everything in the known coder's world. A super-chameleon quine mime.
The arrangement of the languages is alphabetical, did this complicate things at all? Would it have been simpler in a different order (though less elegant)?
I think the code has a generator framework, so the fact that they are alphabetical may just be convenient/arbitrary; the code could be generated to run in any order (I think).
http://rubysource.com/meet-fifteen-ruby-core-committers/
He lists his "hobby in programming" as: "writing a Quine and enjoying esoteric programming."