The Story Of Mel (catb.org)
169 points by jacquesm 1053 days ago | 77 comments



The Wikipedia entry (http://en.wikipedia.org/wiki/The_Story_of_Mel) even has a link to a picture of Mel, whose full name was discovered in the preface to the manual for the Royal McBee LGP-30 ACT 1 (Algebraic Compiler and Translator) compiler (dated 1959). Then somebody found the group picture at http://www.librascopememories.com/Librascope_Memories/1950_-...

and there it is:

http://zappa.brainiac.com/MelKaye.png


I was the first to find that image of Mel in the Librazette archives. In fact, the following post was my first on HN: https://news.ycombinator.com/item?id=3110883

More context: http://librascopememories.blogspot.com/2012/03/59-mel-kaye-l...

I did eventually manage to get in contact with Mel, but I scared him away, unfortunately. That's a story for another day... :-/


> I did eventually manage to get in contact with Mel, but I scared him away, unfortunately. That's a story for another day... :-/

Tell us today! Tell us! Please?


I third Stratoscope's and jacquesm's request for that story.

Also interesting to note: that picture you found is from his first month on the job, yes? The "Welcome" section at the bottom of the PDF claims he just joined in July in "Engineering--Commercial," and the text above the picture says that the three instructors pictured (Hazlett, Behr, Kaye) will transfer to Royal-McBee payroll "as announced in the July Librazette." The July edition claims Hazlett, Behr, and Fred Flannel are the engineers to be transferred.


Oh wow. Well consider this another day! That's amazing.


Please do tell us about what he said!

I did upvote, but that doesn't communicate our desire to hear the story. This is one of those rare situations where a "me too" post serves a legitimate purpose.


He looks much younger - and less bearded - than I'd imagined when reading The Story of Mel countless times. This changes my whole perspective on the story; the character of Mel transitions from a scruffy and disillusioned veteran to something of a youthful and rebellious hotshot.

Thanks for the Librazette link and photo; that's awesome.


The same, although come to think of it, in the 50s, how many scruffy, bearded veteran programmers could there have been?


That's true; he's probably a proto-scruffy-bearded hacker in that photo. I'd be interested to see him 10 years later; he'd probably put Thompson and Ritchie to shame in terms of beard mass. ;)


And somebody even discovered a dump of the BlackJack for LGP-30, but nobody has managed to disassemble it yet.

http://www.jamtronix.com/blog/2011/03/25/on-the-trail-of-a-r...

There's also the BlackJack for RPC-4000 manual written by Mel Kaye(!)

http://bitsavers.trailing-edge.com/pdf/royalPrecision/RPC-40...


According to the caption, there's a guy from the NSA in the group.


> the picture of Mel

hacker bigfoot ???



I am working on an x86 code generator today for a small language I have been working on. This story seems like a timely re-read, because it is so difficult to generate really good x86 code. Even though I basically know what I am doing when writing a code generator at this point, and I have previous ones to reference, there are so many edge cases that it is hard to get right.[1]

It used to be said that a compiler would never beat a programmer at assembly. Now we say the compiler is usually going to "do the right thing." I think there are two things going on here: 1) Our code generators are much better than they were in the bad old days. 2) Most people don't understand the ins and outs of their machines like Mel did (I certainly don't). This means that the compiler has "institutional" knowledge from a few really knowledgeable people and the rest of us just use that knowledge.

[1] See, for a simple example, `imul`, which has 5 different forms: http://docs.oracle.com/cd/E19455-01/806-3773/instructionset-... . According to the Intel optimization manual[2], the 16-bit forms suffer from "false LCP" stalls. The recommendation is to cast to 32 bits before performing the `imul`.

[2] http://www.intel.com/content/dam/www/public/us/en/documents/... page 103 in the PDF. See also page 506 for timings on the Atom architecture and page 517 for Silvermont.
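
To make the recommendation in [1] concrete, here's a minimal sketch of the widening idea (the assembly is only illustrative, the exact penalty varies by microarchitecture, and the C helper name is made up):

    #include <stdint.h>

    /* A 16-bit imul with an immediate, e.g.
     *     imul  ax, word ptr [mem], 1001    ; 66h prefix + 16-bit immediate
     * carries a length-changing prefix, which the manual flags as a source
     * of "false LCP" stalls. Widening to 32 bits avoids the prefix:
     *     movzx eax, word ptr [mem]
     *     imul  eax, eax, 1001              ; plain 32-bit form
     * The same idea expressed in C:
     */
    static inline uint16_t scale_u16(uint16_t x)
    {
        /* multiply at 32-bit width, keep only the low 16 bits */
        return (uint16_t)((uint32_t)x * 1001u);
    }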


I find this trend slightly scary. The more certain bits of code are used, the more we tend to trust that they are good. Yet, the more time passes, the more complex systems get and the fewer people truly understand those core bits of code.

I've been in a few situations where I've seen core bits of code with glaring inefficiencies, but nobody even bothered to check them before, because we all trusted the group of people in charge of writing and maintaining that code. Turns out, those are "mere humans", just like the rest of us, and they can make mistakes too.


I think you are right to be fearful. Trust isn't good enough. As our machines become more complicated, formal methods of verification and validation will become necessary for the everyday programmer. We cannot merely trust that code is correct; we must know with certainty. We can already deliver software and hardware solutions that are formally verified to perform correctly and not kill anyone. The knowledge and know-how just aren't widespread, largely due to prohibitive cost and the arcane nature of the subject material. I believe increasing complexity and security consciousness will be the primary drivers for mainstream adoption of these technologies. We are already starting to see articles about formal methods appear more frequently on Hacker News, which has a more general programming and entrepreneurial audience.


An important addition/corollary/reason for (2) is that modern machines are so much more complex. It's no longer possible for any one person to have that level of understanding of their machine, because there's too much to it. There are probably orders of magnitude more transistors in a single Intel CPU today than had ever been produced in the entire world at the time Mel was doing his thing.

It's overall a good thing. It's what lets us do so much and have machines that are so fast. But it does imply a lot of necessary abstraction to get stuff done.


> It used to be said that a compiler would never beat a programmer at assembly.

Probably still couldn't. But with gigs of RAM, terabytes of hard drive space, and gazillions of FLOPS of CPU time available, no one cares. No one can wait for Mel to optimize blackjack; your boss wants the new build 5-15 minutes after you've written the code, and writing it shouldn't take more than an hour itself.

It's not that the compiler got better than programmers... it just became cheaper. Hell, how many Mels are there out there anyway? Surely the number of people that clever and devoted to the bare metal of a CPU has never exceeded a few thousand out of the entire population of Earth. Rarity makes them expensive, but it also means that shitty software shop in St. Louis or Tampa could never have such a person.

And since technology drives the economy, everything would crash if we had to rely on the availability of super-programmers.

Optimizing compilers are slightly better than they were long ago... but this isn't because they're all that impressive. We've just had thousands of monkeys pounding away over decades, hammering out one little optimization or another, and the storage to allow compilers to grow big enough to have a library of those to rely on.


It depends greatly on the programmer, and how much time you allot them. GCC will produce vastly better code than any human programmer, given a specification in C and a time bound of 1 second. Give them ten minutes? An hour? A week?

At the extreme, a sufficiently adept assembly programmer (and such programmers probably still exist) will still beat the compiler, given enough time, though the effort can be helped by the ability to reference the code the compiler is generating.


Not any human programmer, at least not for some tasks. The handcoded x86 SIMD code in x264 (the software video encoder used by everyone who cares about video quality per bitrate) beats the output of any compiler trying to compile the pure-C fallback function. Same for a bunch of other code you use (indirectly) like memcpy or strlen.

But the optimized handcoded routines take weeks to develop while a compiler would be done in less than a second. For almost all code, a programmer's time would be better spent on something else.
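
For a feel of the gap, here's a toy sum-of-absolute-differences kernel with a scalar fallback and an SSE2 version (a sketch in the spirit of x264's SAD routines, not its actual code; the real versions are hand-written assembly and cover many block sizes):

    #include <stdint.h>
    #include <stdlib.h>
    #include <emmintrin.h>   /* SSE2 */

    /* Scalar fallback: sum of absolute differences over a 16x16 block. */
    static int sad_16x16_c(const uint8_t *a, const uint8_t *b, int stride)
    {
        int sum = 0;
        for (int y = 0; y < 16; y++, a += stride, b += stride)
            for (int x = 0; x < 16; x++)
                sum += abs(a[x] - b[x]);
        return sum;
    }

    /* SSE2 version: psadbw handles 16 bytes per instruction. */
    static int sad_16x16_sse2(const uint8_t *a, const uint8_t *b, int stride)
    {
        __m128i acc = _mm_setzero_si128();
        for (int y = 0; y < 16; y++, a += stride, b += stride) {
            __m128i va = _mm_loadu_si128((const __m128i *)a);
            __m128i vb = _mm_loadu_si128((const __m128i *)b);
            acc = _mm_add_epi64(acc, _mm_sad_epu8(va, vb)); /* two 64-bit partial sums */
        }
        return _mm_cvtsi128_si32(acc) +
               _mm_cvtsi128_si32(_mm_srli_si128(acc, 8));
    }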


You seem to have misread what I said, and are making basically the same point I did. I said that given only a single second in which to work (and given a problem specification in the form of a correct C program) then literally any human programmer will lose out to GCC. That seems unequivocally the case, and you seem to agree.

I also said that there still exist programmers who, given enough time, can probably beat the compiler. Certainly there are tasks where this is easier, but no one is getting it done in one second.


Personally, I felt beaten by the compiler with the advent of superscalar CPUs (around the mid-'90s for x86), the Pentium P5, AMD K6 and the like.

In some particular loops, hand-written assembly is bound to outperform C code, but overall it's an uphill, fruitless battle.




Linked from Ed Thelen's site is http://www.bemorehealthy.com/LGP-30Computer/The30.htm which has a couple photographs of Mel's handwritten code.


Stories like this are why I'm inclined to consider programming as much an art as it is a science. Artists, too, can use their media (such as brushes or musical instruments) in a completely novel and unusual way. The end result is often not pleasant to casual viewers or conventional artists (that would be coders who write clean, maintainable, kludge-free code), but it can just as often leave you awestruck by how interestingly and differently a tool or a feature or a language construct has been used.

For a philosophical discussion on this, see [1] and [2], both from PG, who actively advocates for a playful, prototype-first approach to coding, which more resembles a painter painting a picture than, say, a sociologist conducting systematic research. ([1] is a transcription of a talk by Don Knuth, [2] is Paul's own essay.)

[1]: http://www.paulgraham.com/knuth.html

[2]: http://www.paulgraham.com/hp.html


As much as I like pg's writings (honestly, I (usually) (kinda) like/appreciate his writing style, content, etc.), re: hackers&painters, obligatory link to Maciej's retort/comments "Smushing Paul Graham":

http://idlewords.com/2005/04/dabblers_and_blowhards.htm

It's a very nice read, actually.

(You might remember Maciej from "The Internet With A Human Face"[1] / as the dude behind pinboard.in / etc.)

[1]: http://idlewords.com/bt14.htm


I'd actually argue that an analogy with architecture is a much stronger one than the one with painting. In both architecture and hacking, the creative output has to hold up to some objective standards (to compile, run and solve the problem/to adhere to the laws of physics and provide a purpose) but it also allows degrees of freedom not commonly thought about when one thinks of architecture or coding.

This is, of course, implying that an architect is also capable of "implementing" his solution (which should, strictly speaking, be an engineer's job), but you could also argue that you can devise a system where a hacker would write a functional prototype and then a software engineer would reimplement it rigorously, with specs, TDD or whatever. This is where the analogy starts being a little far-fetched, so I admit it's not the best one, but I feel that an architect's thought process very closely follows a hacker's.

Maciej's text is very nice, and I must say I'm torn between PG's original text and this one right now. I guess I just need to give it a little more thought and reread both of them :)

On a side note, I sometimes think about whether programming is really "just engineering" and whether I'm being delusional and merely eager to call myself an "artist" because I love hacking. It's an interesting subject to ponder, but I can't say I've made up my mind.


I think it's both an art and a science. The art part is often under-emphasized... horrible code tends to be produced by people whose grasp is only of the science end. It's "correct" but it's not beautiful, elegant, or creative. It's not just how the code looks -- those that only understand the science end often produce "clunky" solutions that solve the textbook case but do not perform elegantly and yield a poor user experience.


This story gets posted every now and then, and I'm always surprised that people react positively to it. Doing clever tricks like these is NOT what a good programmer is supposed to do. Especially when they're not documented. This is just obfuscation for obfuscation's sake.


I don't think I can enumerate the number of ways in which I disagree with you.

- It's only obfuscation if you have the luxury of solving it in another way.

- the times have changed (1956), but the methods have not

- sure, it may be obscure but it shows perfect control of the machine and the issue at hand

- agreed, it is not maintainable, but back then feature changes usually meant re-writes and people thought longer about what they wanted before they started building. Machine time was more expensive than programmer time.

- being clever is (to me at least) exactly what hacking is all about, finding a solution to a problem where you thought none existed. Ask yourself this: if Mel's code was so useless how come some hotshot didn't re-write it using 'best practices' and zero 'clever tricks'? The answer: likely because it could not be done. Mel delivered.

- the rebel in me likes Mel's attitude towards management. Sure, if you're in management you'd hate him twice weekly, but I note that Mel did not get fired; instead he moved on to greener pastures

- I compare this with the way a jeweler would make his little works of art versus the way a production welder connects pieces of pipe. Mel is the jeweler, you'd rather we all just join pieces of pipe.

- I've gone through some of the assembly code for 80's video games and that stuff was full of tricks like this (and without them it would never have gotten off the ground). Making a computer output the right results within the margin of error visible to the user, while lacking the 'proper' resources, is where creativity comes in. And in that case I'd much rather have a guy like Mel by my side than any number of {insert high level language of choice here} pushers, because if it is at all possible, guys like Mel will find a way.

Anyway, as you can see, we're likely not going to agree on this one.


My professor said it this way: Engineering begins when the problem constraints admit of no solution.


Really nice quote, what is the name of your professor? :-)


Harry Kane


Bending the limits of the machine and doing the unexpected may not necessarily be great engineering, but it's amazing art.

It's not uncommon for the best programs to have a bit of art inside them that make them work. Especially in high performance timing critical operations close to the hardware. Very few people work in those realms these days and these types of things aren't often necessary, but they're still beautiful and occasionally outright necessary.


Considering that this was the 50s and the first compiler was written by Grace Hopper in 1952, I think judging that guy is a little unfair.

We are at a point where hardware is (to a degree) cheaper than manpower. You might think that one cost can simply be traded for the other, which often is the case, yet there are moments where the assembly programmer might miss the asymptotically more efficient solution that saves orders of magnitude of time, and situations where a cache miss in one of the inner loops just kills performance (and maybe kills that startup, because the application just stalls).

While I would never ever want to work like Mel, I share the admiration for Mel's work and propose that "Best Practices" be renamed "Somewhat-Optimal Practices" in his honor.


I sort of agree with you. Mel's story is colorful and I definitely love it as a piece of hacker culture. But we should be thankful we do NOT have to work with Mel in the same team. He could probably out-program us all (at least back in the day), but I doubt his trick would be the most effective use of time in today's application programming.

Mel was a real programmer because he did the best with the limited tools of the day, but nowadays... "it was hard to write; it should be hard to read", while funny, is something to be avoided.


"I compared Mel's hand-optimized programs with the same code massaged by the optimizing assembler program, and Mel's always ran faster."

I think that's a big part of why people like the story so much.


It's a very "John Henry" story. A man fights the machine, and wins, but the victory is shallow because the machine continues to improve with every passing year. Eventually, people like the man are made obsolete.

http://en.wikipedia.org/wiki/John_Henry_%28folklore%29#Legen...


It's the story of a man who did a beautiful thing. It wasn't the _right_ thing, but it was still beautiful.

Also, it helps that the stakes are so low. Mel's optimized program cheats at Blackjack, and the protagonist is trying to make it cheat in a different way. We're not invested in his success, so we can enjoy his failure, and maybe even root for it. If the same story were set in the context of the Therac-25, it would be received very differently.


People like to watch James Bond even though they don't jump off a cliff with a wingsuit. So I love this story of optimizing and pessimizing machine code even though I don't write machine code.


This makes me think of demoscene code. Squeeze the most out of a tiny amount of code.

http://en.wikipedia.org/wiki/Demoscene


>is NOT what a good programmer is supposed to do

Doing what you're "supposed to be doing" is not always what life is about.

People also love anti-heroes.


"I found an innocent loop that had no test in it... It took me two weeks to figure it out."

I think you're right. How much time/money did that clever loop trick waste?


Don't forget the other side of that question, though: how much time/money did it save?

Best practice today is to waste computer time to save programmer time. It's absolutely right, too. In 99% of cases, computer time is plentiful and programmer time is scarce.

But that's the result of decades of hardware advances. Mel's computer had a clock speed measured in kilohertz and a memory access time measured in milliseconds. It might be literally a million times slower than what we're using today. The dirty tricks we shy away from today can be the only way to get some things done on hardware like that.

Finally, we're talking about the 1950s. The profession was pretty much brand new. Where do you think modern best practices came from? They're what came out of people finding out, by experience, what works and what doesn't.


Mel was simply pushing the envelope, doing things that other people couldn't using the exact same hardware. Just look at the machine (linked elsewhere in the thread) and imagine creating a poker game for it.

I remember making 'games' (for some definition of the word game that you probably would laugh at today) on a KIM-1 in 6502 machine language (no, I did not have an assembler yet, that was a luxury) and that was a very advanced environment compared to what Mel was working with.

Computer time is only plentiful when you're not pushing the boundaries.


> Where do you think modern best practices came from? They're what came out of people finding out, by experience, what works and what doesn't.

Definitely. But I think this is a pretty clear example of what doesn't work. "What, make a change? Sorry, nobody can figure out how to do that." So yay for Mel for doing what people thought couldn't be done, but let's never do it again.


According to Gordon Bell's Computer Structures: Readings and Examples, the best-case memory access time of the LGP-30 was 2.34 milliseconds†. Most instructions required two memory accesses (the instruction itself, and an operand), which gives you around 200 instructions per second.

The machine had only two registers: the accumulator, and the program counter. The only place you could use a pointer was in the address field of an instruction. In order to increment a pointer, you would load an instruction into the accumulator, add to it, and store it back. Best case, you could increment 67 pointers per second. Worst case, nine.
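
A quick back-of-the-envelope check of those figures (this sketch assumes best-case access throughout and ignores drum scheduling, which is presumably where the exact 67 and 9 per-second numbers come from):

    #include <stdio.h>

    int main(void)
    {
        double access_ms   = 2.34;             /* best-case memory access         */
        double instr_ms    = 2.0 * access_ms;  /* instruction fetch + one operand */
        double ptr_incr_ms = 3.0 * instr_ms;   /* load instruction, add, store    */

        printf("instructions/sec (best case): %.0f\n", 1000.0 / instr_ms);    /* ~214 */
        printf("pointer increments/sec:       %.0f\n", 1000.0 / ptr_incr_ms); /* ~71  */
        return 0;
    }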

Mel's job, in this instance, was not to write beautiful maintainable code. It was to write a demo that would sell the machine to potential customers, rather than boring them to death.

http://research.microsoft.com/en-us/um/people/gbell/computer...


Why do you say this? I mean, it's true, but do you think there are people who disagree, who think that writing that kind of code on a modern computer is a good idea, and further that they are participating in this discussion?

I find this reaction confusing. Nobody is holding Mel up as an example for us to emulate in the specifics of how he worked.

It's like telling the story of Tex Johnston doing a barrel roll in the 707 prototype filled with VIPs, and someone comes along and interrupts with, "I'm surprised that people react positively. Doing clever tricks like these is NOT what a good pilot is supposed to do." He'd be right; airline pilots shouldn't be doing barrel rolls. But it's a pointless thing to say, because nobody is telling that story to convince airline pilots that they should be doing barrel rolls.


It's great to be able to handle stuff like this, when it actually does prove necessary (and the degree to which Mel could apparently handle it is impressive). I agree that it is not what you should be doing if there's any way around it (as there more often is, these days), and that the apparent lack of documentation is appalling (though depressingly common).


Back in the late 80s and early 90s it was still possible to regularly beat the stock compilers with hand-written assembler (I could do that personally). The story used to produce a lot of awe, but now it gets doubted because of those pesky best practices.


I agree that doing clever tricks is not good software development. But that does not diminish at all the delightfulness of this story nor Mel's awesome mastery.


When I'm in a place to interview programmers again, I should have them read this story. If they say they like it, I'll know not to hire them...


Please do. I'll know not to work with you or your company.


Send them to me.


Last time this story was posted here, I commented "I hope I never have to work with this guy, he sounds ghastly!"

This was a new account and I suddenly found myself with -7 karma points. I had to consider if I should throw that account away or keep it and earn those points back.

I kept it and now have a little under 3k points. He still sounds ghastly!


For anyone interested in how machines like the LGP-30 worked, there's a book titled Basics of Digital Computers by John S Murphy, nominally in 3 volumes, though you'll most likely find the combined edition. Though the title sounds similar to modern yellow-jacketed dross, this book was published in 1958 and the ‘basics’ go down to vacuum-tube circuits for arithmetic and control.


My favorite quote:

"I have often felt that programming is an art form, whose real value can only be appreciated by another versed in the same arcane art; there are lovely gems and brilliant coups hidden from human view and admiration, sometimes forever, by the very nature of the process."


The other funny story about this machine is pretty much the flip side:

Shop practices dictated that when you scheduled debugging time, you couldn't be bumped off as long as your program was still running.

This led some clever fellow to write a deliberate loop that kept his program "running" -- panel lights a-flashing -- while he fashioned a patch for the most recently noticed bug, which he could then dial in to the memory in-flight without losing his place in line.

When his ruse was discovered, he was chided for wasting lab resources, to which his response was: "Wasting resources? Nonsense! I used the optimising assembler!"


Do you have a source for that story?


I've seen this story around before, but I'm always curious if anyone has managed to track him down. It would be a very well-read interview if someone could speak with him. Seems likely he could still be alive.


https://spreadsheets.google.com/pub?hl=en&hl=en&key=0Aqz4Zqs...

Lists him as still missing. He might very well still be alive, but the clock is ticking; that Librascope archive is full of 'in memoriams'. Tough to see all these pioneers pass away almost unnoticed.


“If a program can't rewrite its own code”, he asked, “what good is it?”

Amen.


Needs a [1992] in the title.


That's a classic.


A classic, but hard to believe anyone here hasn't seen it before!


There are new developers, like me, who haven't been around for that long.


In that case it's worth reading the whole Jargon File: http://www.catb.org/jargon/ (in its print version, The New Hacker's Dictionary).


Unless I'm missing something, it doesn't seem to have had a discussion here for a few years.

https://hn.algolia.com/?q=%22story+of+mel%22#!/story/sort_by...


and yet it's been posted 6 times in the past year?

well the photo of mel here in the comments is new to me so we'll see what happens...


Proof positive lots of good stuff gets pushed off the new page very quickly. I absolutely love this story, especially the bit about the loop without a visible termination. Epic stuff.


I just don't get how the simple 'hot' (or 'news') algorithm, which only takes a single link's elapsed time and raw number of votes(?) into account, is the best we have for a community like this.

And I'm not talking about reddit's 'infinite but disjointed sub-communities'...
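
For reference, the commonly cited approximation of that ranking (based on the published Arc source; presumably the live site tweaks the constants and adds penalties not modeled here) is just raw votes decayed by age:

    #include <math.h>

    /* Sketch of an HN-style "hot" score: votes divided by an age term
     * raised to a "gravity" exponent (1.8 in the published news.arc). */
    double hot_score(int points, double age_hours)
    {
        return (points - 1.0) / pow(age_hours + 2.0, 1.8);
    }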


If you have better ideas, list 'em. HN has been consistently good for the past seven-ish years.


tags, weighting different votes differently, linking old conversations with new ones instead of 'archiving' them into oblivion...

HN might be some sort of local maximum atm, but more experimentation is warranted on these fronts.


Feel free!

The source is out there. https://github.com/arclanguage/anarki


To test on what audience? There are hundreds of things I'd rather be hacking than a personal copy of some forum software.

> You're submitting too fast. Please slow down. Thanks.

fuckin thing won't even let you have a conversation ...


Well, if you can think of a better way to find out that it's Tuesday (at least in my timezone) again than seeing this reposted I'd very much like to know about it.



