John McCarthy Has Died (wikipedia.org)
1619 points by apgwoz on Oct 24, 2011 | 218 comments



For a fun portrait of McCarthy's early life and aspirations, see the passages about him in "What the Dormouse Said". [1]

- Born to communist activists in 1920s Boston

- Studied math with John Nash at Princeton

- Organized the first computer-chess match in 1965, pitting his own algorithm against one created by computer scientists in the USSR, with moves communicated by telegraph.

While many will remember him for Lisp and his contributions to AI, perhaps equally important was his role in running a research lab in what became a characteristic West Coast fashion: finding intelligent people, setting them free to do what they wanted, and obdurately arguing with them at the slightest provocation. He was far enough from the mainstream to be familiar with, and even sympathetic to, the 60s counterculture, but also cynical enough to embrace technology, rather than revolution, as a way forward. If this sounds a bit like running a startup today, there's one main reason it does: John McCarthy.

[1] http://books.google.com/books?id=cTyfxP-g2IIC&lpg=PT113&...


Excellent book BTW, would recommend it to anyone interested in the history of not just computing but what we think of as UI/UX.

It's almost depressing to see the same problems we keep (re-)solving have had great minds beating on them for quite a while now :)


Whether or not he has passed, if anyone has the book and fancies spending a little time adding interesting material to the WP article about him, that would be nice :)


I wonder if the chess match had any influence on the movie "Colossus: The Forbin Project"


Unfortunately it's true. I just heard back from Peter Norvig, who says "He died peacefully in his sleep last night."


Replying to your original reference: http://twitter.com/#!/wendyg/status/128584398642757632

Just to add to that - I've had a message from Wendy saying that McCarthy's daughter had 'phoned her with the news, and Wendy is the ultimate source of the wikipedia edit/entry.


October 2011 is turning out to be one sad month.


We should declare tomorrow November minus 6th...


Bring on November



What sad news. I do hope that Stanford makes a statement, just to clarify.


What a sad month


Sad to hear about this. I love LISP and discovered the beauty of it when I was part of the Gensym product team - a large-scale real-time system with AI and NN.


What a pity.

I sense a black bar coming.

The last few weeks have been terrible, one icon after another has fallen. What is different from other fields is that computing is still a young enough field that almost all of its luminaries except for the earliest ones are still alive.

I fear the avalanche has only barely begun, and at some point the frequency of these will become high enough that either we will stop noticing or we will have that black bar on here more or less permanently.


Death is the only guarantee in life and in fact is necessary for human ecology. I too am deeply saddened by this loss, but am supremely grateful that a man like John McCarthy was able to contribute to humanity's knowledge. May his work be the foundation of many more advances in human understanding, and may his void be filled by someone who respects this journey.

RIP, John.


> Death is necessary for human ecology

Ugh, not now. References or stop talking. I have my fingers crossed he was signed up for cryonics. This is tragic.


Even if you can live for a million years from advanced technologies, you still inevitably will die. We all do.


While I'm pretty confident in humanity's current understanding of physics, I'm not confident enough to say that, given a million (more like several billion) years and Jupiter brains, we won't figure out a way out of whatever the fate (if there is one) of the universe is. The more important aspect, however, is how I can actually get to the point of living a million+ years. When I'm there I can argue over some supposed necessity or long-term inevitability of death; right now it doesn't have to be that way.


It's how any species gets to iterate. The old have to die off to make room for the young.


We're not Drosophila, we're people with minds capable of exploring and understanding a world that goes beyond biological evolution. We don't have to use death to iterate, we do it using general intelligence (it outperforms evolution by a factor of a bazillion). Also, not all organisms use preprogrammed death as a mechanism to iterate, ask an amoeba sometime.

All this talk about the inevitability of death and how that's supposedly a good thing is basically just pseudo-religious reasoning in defense of a status quo that is rapidly crumbling away.


Right, but if there is actually any selection for aging and death in organisms (maybe there is, I'm not sure), it's not useful anymore. There's a lot of evidence that there's no longer genetic selection acting on the human population, and it's therefore no longer true that our children should be genetically superior.


>There's a lot of evidence that there's no longer genetic selection acting on the human population

Sure there is. The fitness function is just different from what you or I consider 'optimal' or 'superior'. We're still becoming more resistant to disease, even if the future may have more in common with trailer park people than urban elites.


Maybe not genetically superior, but certainly culturally stronger. Old people have old ways of thinking.


And the way of thinking isn't constant. It changes over time. A 15-year-old you, a 25-year-old you and a 50-year-old you are all different people. A 500-year-old you will most probably be a different person.

There are people who resist change and disapprove of all viewpoints other than their own, but I doubt it's an age thing. I see a lot of young people opposing gay marriage, for instance.


We think differently as we age. Have you spent any amount of time arguing with old people? You can't teach an old dog new tricks, etc.

Ideas tend to propagate in generations - it's often necessary for the old guard to die off before a new idea will be accepted, whether this is in art, or science, or politics or whatever.

As new people crop up that aren't reputationally invested in a certain idea it allows more ideas to be considered seriously, etc.


> You can't teach an old dog new tricks

Good to see ageism is alive and well.

I know a guy who did a bunch of market research for a senior care product. They found technology use, on average, dips in the 60s and then starts increasing again in the 70s.


different != superior.


What you're seeing is actually the death of the second generation of great comp sci minds, IMO. The first modern generation included von Neumann (b. 1903), Hopper (b. 1906), Turing (b. 1912), and others. McCarthy (1927), Knuth (1932), Hoare (1934) etc. are of the next generation. Obviously, you can cut generations on any arbitrary line and YMMV, but the 15-20 year gap seems a reasonable grouping, IMO.


Would that make Leibniz part of the zeroth generation?

http://en.wikipedia.org/wiki/Gottfried_Leibniz#Computation


He's pre-generational, along with people like Ramon Llull and Hero of Alexandria (who, however, did not have such a heroically unusual distribution of letters in his last name).


Eventually, I think, there will cease to be icons within the industry. What we have now are the people who invented the internet/computers and the basic components.

The people who are doing the inventing now are just as smart, but they aren't creating fundamental things, they are improving on an existing system.

We don't say, "engineer team Y invented the iCore series of processors." We sort of just say "Intel is coming out with a new set of processors."

I'm not sure if I am making my point very clear. :\


Between Gates, Zuckerberg, Larry and Sergey, Torvalds, RMS, Knuth, and a few others, there're still plenty of icons left and so I don't foresee your scenario happening anytime soon. I am generally an optimist, and think the number of icons to come is far greater than those who have come and gone.


Is it really fair to compare Zuckerberg to Knuth? Zuckerberg made a PHP site for people to stalk their classmates. Knuth wrote TeX, hundreds of papers, and a series of books considered among the 100 most important scientific works of the 20th century.

http://www.americanscientist.org/bookshelf/pub/100-or-so-boo...

Zuckerberg is definitely a successful dude that one can look up to. But in terms of advancing the field of computing, he hasn't done a whole lot yet.


Your comment presupposes that advances in mathematics are more important than advances in social communication. Is Mozart less iconic because his achievements were in music?


I think it's more about how hard it is than how important. Facebook represents a lot of work but nothing that seems to demand superhuman insight; an army of Stanford CS grads supplied with pot and Red Bull could probably make a tolerable substitute in five or ten years. But it's far from certain any of them could ever produce the kind of work Turing or McCarthy brought us. (And I say this knowing I won't, either.)


I think an army of top end CS grads with caffeinated beverages is what Facebook currently employs.


> Your comment presupposes that advances in mathematics are more important than advances in social communication.

I don't see that... he just said it wasn't a fair comparison. If it comes off like he thinks advancements in mathematics are more important that other advances, it might be because he does think so, but that's not exactly what he's saying.


Not sure why you're being downvoted. Somehow people have this picture of mathematics as this universal body of knowledge, but that couldn't be further from the truth. They are as earthly as anything else we've been doing as a human race. For an interesting perspective on mathematics from a cognitive-scientific point of view, check out George Lakoff's book "Where Mathematics Comes From."


> Gates, Zuckerberg, Larry and Sergey, Torvalds, RMS, Knuth

It's not too hard to imagine a world without facebook, but I'd find it very hard indeed to imagine a world without Microsoft, Google, Linux, Open Source or the Art of Computer Programming.

It's almost like this is one of those trick questions: "which name does not belong in this list". Zuckerberg pops out at me immediately.

All the rest is timeless, Facebook will one day be gone again, I'm fairly sure of that.


I have to disagree on this one. If there is one who does not belong in the list, it would be Knuth. Zuckerberg, as well as Gates, Larry and Sergey, has created a technology company that changed the lives of millions. They are all well known by the "civilians". Torvalds less so, but still, the Linux project has had a relevance that transcends mere technology.

Mr. Knuth's contributions, on the other hand, are much more fundamental. But, because of this, his name remains unknown for the general population. For us, he is a titan amongst men; for the world, he'd be an old, retired professor who cannot even figure out email.


> in terms of advancing the field of computing, he hasn't done a whole lot yet.

Theory, no. Practice, oh Hell yes. Facebook changed the game, for everyone. The "Social Revolution" has changed the face of business, investing, interpersonal communication, friendships, revolutions... the list goes on. Maybe Zuck was just in the right place at the right time, but he made the most of it and his company's impact has been enormous.


In terms of science, absolutely not. But impact, yes. Facebook changes everything. From now on, most people will be a part of an online social network. It's as important to human communication as the telephone was. Know how we use letters from the Civil War era to understand what that society was like? Imagine what will be possible in that regard in 100, 200 or 500 years.


I hate to sound negative, but things won't get better. OK, this month has been a shocker, but next year might be just as bad.

The real fundamentals of computing happened a long time ago. Hero of Alexandria, some Chinese and Islamic scholars, Albertus Magnus, Roger Bacon, Charles Babbage, Ada Lovelace, Joseph Marie Jacquard, Turing, von Neumann. But progress was linear, not exponential - the tools they made were so impractical that they didn't make further tools easier. You used a computer as a curiosity, or to solve a specific problem, not to build a better computer.

Then the singularity hit. Computers got faster, IO got richer, and you could practically write computer programs that would help you write computer programs. It's hard to say exactly when that happened, but the inventions of C, UNIX, and Lisp were certainly part of that critical period in history. This shift started around the mid-1960s to early 70s.

The people who were working on it are now in their 60s and 70s. The next generation (who are in their 50s and 60s) wasn't just inspired by them, they were given very useful tools, which made later advancements easier. It's easier to write a C++ compiler if you already have a C compiler. It's easier to write git if you have a CVS repo to store your work in. People no longer had to bootstrap using nothing but assembly, they started with a sound foundation, and could iterate.

Because of the work of these founders, I think there are a lot more icons left.


    It's easier to write git if you have a CVS repo 
    to store your work in
Not to diminish your point or anything, but regarding Git, I'll just quote Linus Torvalds:

    For the first 10 years of kernel maintenance, we 
    literally used tarballs and patches, which is a much
    superior source control management system than CVS is
Basically Git started as a bunch of scripts written by Linus for dealing with "tarballs and patches", and nobody starts something like this unless there's lots of hatred involved. What you get if you're a CVS user and want something better is SVN. You get Git when you consider CVS such an abomination that you'd rather do without any existing VCS altogether.

And I really think the world needs more people like Linus, as some wheels are worth reinventing.

    The real fundamentals of computing happened a long 
    time ago.
In the words of Alan Kay: The computer revolution hasn't happened yet ... http://video.google.com/videoplay?docid=-2950949730059754521 (listen to this talk because it's great).


I think it's worth noting that git did not replace tarballs and patches. Git replaced Bitkeeper. Bitkeeper, as far as I know, replaced tarballs and patches. And as far as I know, git did not start as a bunch of scripts for dealing with tarballs and patches. The design was inspired by his knowledge of and experience with Bitkeeper.


The workflow was indeed inspired by Bitkeeper, nothing is truly revolutionary, but Bitkeeper itself was not the status quo in a world dominated by CVS, SVN and Perforce.

On "tarballs and patches", I was referring to this famous quote from http://marc.info/?l=linux-kernel&m=111288700902396 ...

     So I'm writing some scripts to try to track 
     things a whole lot faster. Initial indications
     are that I should be able to do it almost as 
     quickly as I can just apply the patch, but 
     quite frankly, I'm at most half done, and if I 
     hit a snag maybe that's not true at all. 
     Anyway, the reason I can do it quickly is that
     my scripts will _not_ be an SCM, they'll be a 
     very specific "log Linus' state" kind of thing. 
     That will make the linear patch merge a lot more
     time-efficient, and thus possible.


Bitkeeper was not the status quo, but I don't think that's important in the overall point. The original point was that git was inspired by previous software. The original software pointed at, cvs, was wrong, but I don't think the original point changes when we put the correct software, Bitkeeper, in its place in the sentence.


Yes, but I included a "not to diminish your point" right in the first sentence.

This was just a facts correction because I find the story of Git being fascinating.


Then I suppose my comment was a correction to the correction.


I think Alan Kay would disagree with you: http://stackoverflow.com/questions/432922/significant-new-in...

   The basic idea here is not to bash the current state 
   of things but to try to understand something about the 
   progress of coming up with fundamental new ideas and 
   principles. 
  
   I claim that we need really new ideas in most areas of 
   computing, and I would like to know of any important and 
   powerful ones that have been done recently. If we can't 
   really find them, then we should ask "Why?" and "What 
   should we be doing?"
Which sounds to me like "The real fundamentals of computing happened a long time ago."

Perhaps there will be a more significant revolution in our future but that doesn't contradict the above statement.


OMG, Alan Kay is on Stack Overflow? That's freaking awesome :)


If any of those people die before October ends I'm blaming you directly.


I disagree, here's why:

First, the software engineering field is still very immature. The tools and terminology in use are still very primitive. Studies seeking to identify best practices via evidence are still few and far between. Worse yet, known best practices with substantial evidence behind them are widely ignored in the industry on average. I believe there is still a lot of room for new techniques, and new models of software engineering, which will serve to guide developers for generations.

Second, we are at a critical juncture in programming languages. There are a lot of new languages gaining popularity, and a lot of older languages seeing substantial change. The fundamental paradigms in use by the average programmer are changing (becoming increasingly more functional, generally). Meanwhile, there are even more advanced concepts on the horizon (monads, for example) which are still not accessible to most programmers. There's a huge opportunity there for the next generation of languages to increase the accessibility of those advanced techniques. Additionally, we're in the midst of a revolution in terms of compiler technology. We are seeing things like modular compilers, very advanced run-times capable of incredible speed improvements, cutting edge garbage collectors, and the potential for massive parallelization on the desktop. Again, there's a huge potential there for some very fundamental innovations.

Third, we are nowhere near the end of the line in computing hardware. Over the next few decades we could see some very crazy things. 3D micro-chips putting trillions of transistors into every CPU. Even more massive parallelization than what we've seen today. RSFQ processors that could put petaflops of computing power on a single chip. MRAM married to processor cores leading to breakthroughs in performance. Non-von Neumann architectures enabled by memory and processing power being mixed together. FPGAs with billions of gates with the capability to operate and reconfigure at multi-gigahertz speeds. Asynchronous CPUs.

And these are just things at a very fundamental level. There's still a huge potential for lots of innovations in the software and hardware space closer to the end-user. There's plenty of room for more icons in the future.


I think the distinction you're highlighting is between the people who start the giant companies of the industry, and the people who keep those companies successful. Everyone knows that Noyce/Moore/Grove built Intel; not many people know who runs it today.

The '50s computing leaders (mostly academics) are in their old age, the '70s leaders (academics and businessmen) are heading into retirement (except for Jobs unfortunately), and the later leaders (Google, Facebook) are still going strong.


People have been about equally intelligent for a long time. Steve, Dennis, and John were no smarter than their predecessors; they were just applying their intelligence in an area other smart people weren't looking at. Woz was designing calculators before he designed the Apple II. So what's the next area a smart person can make a big difference in?


Fundamental inventions and discoveries are never recognized right away. That's why Nobel prizes are usually awarded for a work done several decades ago. The guys who do revolutionary work in computing right now do exist, we just haven't recognized them yet.


For nearly two decades I was a diehard LISP advocate. I even forced all my programmers to code three Crash Bandicoot and four Jak & Daxter games in custom LISP dialects that I wrote the compilers for (an article on one here http://all-things-andy-gavin.com/2011/03/12/making-crash-ban...). But by the mid 2000s I started doing the kind of programming I used to do in LISP in Ruby. It's not that Ruby is a better language; mostly it was the momentum factor and the availability of modern libraries for interfacing with the vast array of services out there. Using the crappy, unreliable, or outdated LISP libraries -- if they worked at all -- was tedious. Plus the LISP implementations were so outmoded. It was very hard to get other programmers (except a couple of enthusiasts) to work that way. Ruby struck a decent compromise. And its type system and object model are better than CL anyway. The syntax is more inconsistent, and the macro model nowhere near as good. But it turns out libraries and implementation matter a lot. Still, you can feel lots and lots of LISP influence in all the new runtime-typed languages (Ruby, Python, etc). And 30 years later, listeners still rule!

Anyway I bundled up some of my thoughts on this on my blog: http://all-things-andy-gavin.com/2011/10/25/lispings-ala-joh...


With respect to the Library Problem -

Have you investigated Quicklisp[1] using SBCL? I have found Quicklisp generally a better experience than CPAN, pip, Cabal, and the other package managers I've used.

SBCL (a fork of CMUCL) is maintained monthly and recently had a very successful crowdfunding campaign.

Of course, if you find CL itself to be weaker in areas important to you than (say) Ruby, none of the above matters. But I thought you might like to know. :-)

http://www.quicklisp.org/beta/
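For what it's worth, the day-to-day workflow is about this simple (a minimal sketch, assuming a default ~/quicklisp install; drakma is just an example library):

    ;; at the SBCL REPL, after installing Quicklisp:
    (load "~/quicklisp/setup.lisp")     ; default install location
    (ql:quickload "drakma")             ; fetches the library and its dependencies
    (drakma:http-request "http://www.quicklisp.org/")  ; e.g. talking to a web service

Compared to hunting down tarballs for a half-maintained CL library, that goes a long way toward the library pain the parent describes.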


You!

Reading about all the cool stuff you guys did with GOOL is the reason I got into Lisp.

Awesome to see that you post here on HN. :)


Same here!


“And its type system and object model are better than CL anyway.”

Would be nice if you could expand on this, as it surprises me that an [ex-]Lisper would not prefer CLOS above all! :)

Also, we call it Lisp nowadays. <g>


Here lies a Lisper / Uninterned from this mortal package / Yet not gc'd / While we retain pointers to his memory


You've missed your calling, unless you are a visiting poet.

That was beautiful.


Splendid!!!


"If computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility... The computer utility could become the basis of a new and important industry."

— John McCarthy (speaking at the MIT Centennial in 1961).

It's been pointed out that McCarthy used the term "utility" rather than "cloud", but many of us would argue they are one and the same. Indeed Wikipedia defines it as:

"Cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to computers and other devices as a utility (like the electricity grid) over a network (typically the Internet)."

In any case, can anyone think of anyone more deserving of the honour of the title "father of cloud computing" than McCarthy? I certainly can't.


Just the cloud? It seems a prediction of the Internet.


This really needs a better source. There's no citation provided, and Wikipedia can't keep any details of his death up without one.

Basically, I have a spoonful of salt to hand.


Agreed. I did a quick Google News search and turned up nothing. Could be legit, or vandalism, but there's really no way to know until we have confirmation.

-

--EDIT-- Since I wrote this, I've seen one source appear, in Portuguese[1]. It's still unclear whether this story is legitimate, or if they simply copied wikipedia / the buzz surrounding the story.

-

--EDIT 2-- Globo is apparently the largest media conglomerate in South America[2] and thus unlikely to publish unsourced stories, so this looks (sadly) to be true.

[1] http://g1.globo.com/tecnologia/noticia/2011/10/morre-john-mc...

[2]http://en.wikipedia.org/wiki/Organizações_Globo


*Portuguese


Fixed, much obliged.


This news site supports the claim (if you translate it): http://g1.globo.com/tecnologia/noticia/2011/10/morre-john-mc...


They also provide no source, and they published after it was published here. Not reliable.


"Não há detalhes sobre a causa de sua morte" = "There are no details about the cause of death".


Ok, come on. Both here and on Wikipedia people are posting links to blog posts or minor news sites that supposedly "support" the claim. But these are clearly instigated by the original Wikipedia edit itself, so they don't provide any additional proof.


Yes, there isn't any news about this.

This might be a good way to make people read about this kind of person.



Which cites... this thread. Hello TC!


In response to the skepticism being expressed, I've just had a message from Wendy Grossman saying:

   "... his daughter phoned me. I'm the
    ultimate source for the wikipedia entry."


Looks like someone has been doing "original research" on Wikipedia again.


I'm not sure what you're saying or implying here, but it feels like an insult directed either at me or at Wendy, I'm not sure which. So let me state this a little more clearly.

  I, Colin D Wright, hereby state that my colleague,
  Wendy M Grossman, personally sent me a message stating
  explicitly that she had been phoned personally by the
  daughter of John McCarthy and told of his death.
Now, in what way does that involve wikipedia, and who are you accusing of using WP for "original research"?


I did not in any way wish to insult you or Wendy. I was just making a sarcastic comment about the fact that Wikipedia discourages one from entering information on the site even if one has first-hand, direct knowledge of said information.

This is one of the more controversial and often debated Wikipedia rules.


I suspect hristov was snarkily pointing out that Wikipedia does not actually allow first-hand knowledge as a source.


I got a call this morning from his daughter saying that he has died. Sorry to have him go.


Are there any other verifiable sources?


his daughter... assuming you mean Susan.

Does that make her AI, or merely AI's sister?


)))))))))))))))))))))))))))))))))))))))))))))))))))))))


]


What a terrible month for computing...


My thoughts exactly. It's been a black month for computing.


With all these great minds from our field passing away, I'm beginning to think we should pause for a moment and think about what we could do to thank all those legendary computing figures still with us. Knuth, Kernighan and Woz come to mind; certainly there are a lot more.


I have a feeling this will start being a lot more common in the computer world. The founders of everything are starting to reach their later years. :(


Yes. One of the more exciting things about computer science, a few decades ago, was that it was all brand new. Even the pioneers were contemporaries. Those were exciting times to live in, since it made me feel like we were on the edge of a brand new technology.

But now, the pioneers have gotten old. And the excitement is slowly ebbing away.


Still awaiting a better source. The Wikipedia entry asserting his death has been reverted as of this writing...


Yeah, I'm still waiting on a source other than Twitter too.


Time for the black bar to appear at the top of Hacker News?


Yes. Unless the information is actually inaccurate, there's not even any question.

What an amazing pioneer. He will be missed.


Does anyone have a better article? I can't find any other information about it. I found out from the NYC Lisp mailing list, and Wikipedia confirmed it.

Sad, sad, sad day. :(


For what it's worth, it was just posted to the University of Waterloo CS announcement list:

"I just received news that John McCarthy passed away in his sleep a few hours ago; he was at home."


Do you have a source (i.e. link to the mailing list)? Unfortunately the person who added it to the Wiki article didn't add one.


The archives to the mailing list are private, so I can't link. You can see in the wikipedia history that whoever made the edit claims to have received word from McCarthy's sister.


Yeh; unfortunately that's not a strong enough source, particularly for this sort of edit :)

Thanks anyway - I am sure it will appear in time.

Usually stuff like this is legit; but sometimes it isn't. So the policy is to have a reliable source or nothing.


I tweeted at @stanfordeng, hoping that they can confirm.


Death date removed from Wikipedia.

Latest edit shows:

"(Protected John McCarthy (computer scientist): Violations of the biographies of living persons policy: Rumors of his death added to page. May be true, but better to wait than edit war."


That was me. I saw the laughing squid link, followed it here, then discovered that their ultimate reference was the wikipedia article. I'm keeping tabs on the news so if an article pops up I'll add it or unprotect the article.


Thank you.


Death date now reinstated. (22:09 BST, 21:09 Zulu)


It [edit: the Brazilian article] cites Steven Levy, senior writer of Wired, as the source. It does not say where the source is found, but I assume it's this tweet:

http://twitter.com/#!/stevenjayl/status/128568370055491584

He may be a writer for Wired, but he doesn't seem to have a solid source either.

EDIT: There seems to be someone that claims Stanford has confirmed it by phone, as you can see in the Discussion page of Wikipedia, though:

"I spoke to the Associate Director of Communications at Stanford School of Engineering, and Dan Stober of Stanford News Service. Both have confirmed that Professor McCarthy passed away this weekend. They both said there were not a lot of details available at this time. An obituary will be issued by Stanford soon. http://chime.in/user/toybuilder/chime/65979187159736320 -- I apologize for bad formatting. I'm not a regular Wiki editor. Joseph Chiu 24 October 2011, 2:11 PDT. I have not edited the article page."


Stanford tweeted that they're still in the process of gathering information https://twitter.com/#!/stanfordeng/status/128580022675054592


I reverted it again. I feel terrible doing that but the alternative is sourcing our article from personal communications or a tweet. Have yet to see a reference which is reliable (in the wikipedia sense) or not ultimately derived from the original edit to wikipedia.


http://i.imgur.com/Y02DM.jpg

RIP. Thanks for inventing all that awesome stuff modern languages are constantly rediscovering.


I have this sign on my office door. I'll be putting his timeline on it tomorrow.


McCarthy's wife died in 1978 in a climbing accident

http://www.mcjones.org/System_R/SQL_Reunion_95/sqlr95-Vera.h...


Very interesting. Off-topic thought, but...

"So she joined the group to climb Annapurna, and was part of the second team to attempt the summit. You go up in pairs, so you do pairwise summit attempts - these Himalayan style things where you do base camps. So she was working her way to the upper camp as the first summit team was coming down between the topmost and the second topmost. They passed, and then she was lost - she and her partner were lost. We're quite sure they fell. They were roped together; we think one fell and took the other with them."

As I was reading that I was thinking, 'why twos? that's not enough to anchor someone if they fall into a crevasse or something; you need at least three for that.' And then I get to the end, and that's exactly what happened. I wonder if people still attempt Himalayan mountains in twos like that.


Crevasses are normally encountered during glacier travel. These aren't particularly common on steep slopes.

Moreover, climbing as a trio tends to be (much) slower than traveling as a pair. When weather and hypoxia are your biggest threat, speed can literally be the only thing keeping you alive.

</climber notes="got sick around 4100m, big wimp">


This is my favorite John McCarthy story.

http://news.ycombinator.com/item?id=1803627

I like how one simple, succinct, relevant question can say so much about Norvig's claims. It seems appropriate coming from the creator of LISP. And, of course, the mutual respect and professionalism they both had to leave it at that.


This seems to be the original tweet http://twitter.com/#!/wendyg/status/128554733714669568


Not to be "that guy", but does she have any connection to him? Why should I believe she isn't just some random that made that up?


Looking at her linked webpage from Twitter, she's a freelance journalist who has written on tech topics, including articles about McCarthy.

http://www.pelicancrossing.net/credits.htm


As a guy on the internet I am pretty sure she didn't make it up.

I think we will have to wait for confirmation, but not long.


He has a string of accomplishments that still live on in active development... and he made it to 84. That's a good run.


I got into programming through Logo in elementary school. Logo is a Lisp dialect. Even if it took me 15 years to rediscover Lisp, I am thankful for the good beginnings.

Thanks to John McCarthy for making such huge contributions to the computing world, and to my hobby and career.


"Program designers have a tendency to think of the users as idiots who need to be controlled. They should rather think of their program as a servant, whose master, the user, should be able to control it. If designers and programmers think about the apparent mental qualities that their programs will have, they'll create programs that are easier and pleasanter — more humane — to deal with."

Thank you John, RIP


Steve Jobs and Dennis Ritchie were obviously major pioneers (and I admired them both), but of the three, McCarthy's work impresses me the most. I still marvel at the profound degree of lateral thinking evident in his articles.

I suspect this will be the least well reported obituary, which is kind of sad. He was surely one of the greats.


Visionary. Shine On!

"His 2001 short story The Robot and the Baby[9] lightheartedly explored the question of whether robots should have (or simulate having) emotions, and anticipated aspects of Internet culture and social networking that became more prominent in the ensuing decade."



thanks a lot. will read soon.



Long live the first Knight of the Lambda Calculus!


John McCarthy would say we are doing news completely wrong.


I think one of the most important things you can do, while you're alive, is to change the world...

Steve Jobs, Dennis Ritchie and John McCarthy managed to do it. May they rest in peace now, and be remembered forever.


WP article is now updated. The techcrunch piece cited HN but the author also contacted Stanford faculty. Gamasutra also ran a short piece. The article is no longer protected if you guys want to update it or clean it up.


(he (will (be (dearly (missed)))))


Whenever people make Lisp jokes by nesting parentheses without regard for the semantics that they denote, I get a little disappointed. It's not anger, nor contempt; just a feeling that we can do better.

    (will-be 'john-mccarthy (modifier 'missed 'dearly))
Incidentally, "I feel like we can do better" would probably be a pretty good summary of what John McCarthy was about.


Or,

    (assert (equal 'dearly (missed-amount 'john-mccarthy)))


I thought about doing something like that, but decided against it. I'm expressing condolences, not writing an obituary program!


It would at least be nice to have the parentheses in some sense represent the parse tree for the sentence, like:

(He (will be) (dearly missed))

That's not very Lispy, and I'm sure some linguists would have a thing or two to say about my tree, but it at least has a structure related to the meaning.


It's maybe not a very well known fact, but McCarthy's original LISP proposal used M-expressions instead of S-expressions; M-expressions were then transformed into S-expressions. Using S-expressions directly, however, turned out to be more popular amongst programmers.

With M-expressions, this would look like: he[will[be[dearly[missed]]]]
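From memory (worth double-checking against the paper itself), the translation for a real definition looked roughly like this; subst is renamed here only to avoid the standard CL function of the same name:

    ;; M-expression, approximately as written in the 1960 paper:
    ;;   subst[x; y; z] = [atom[z] -> [eq[z; y] -> x; T -> z];
    ;;                     T -> cons[subst[x; y; car[z]]; subst[x; y; cdr[z]]]]
    ;; The same definition written as S-expressions:
    (defun subst* (x y z)
      (cond ((atom z) (cond ((eq z y) x) (t z)))
            (t (cons (subst* x y (car z))
                     (subst* x y (cdr z))))))

Square brackets and semicolons for application, parentheses only for data - but once the S-expression form existed for the machine anyway, people just wrote that directly.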


I wish they had gone with M-expressions; I'd be able to make Obj-C and Lisp jokes at the same time!


((will-be (dearly missed)) he)

FTFY


(will-miss 'john-mccarthy :dearly t)


You should put more effort into your Lisp parody.


((not (parody?)) 'homage)


I never understood why McCarthy was never featured at YC or Startup School. Listening to McCarthy & being able to ask him questions would have been like peering into the past & future at the same time.


Here's a recent interview with him -- an excellent way to understand his perspective on the future of programming languages:

http://www.infoq.com/interviews/mccarthy-elephant-2000


Here's the 1960 paper describing the original Lisp: http://www-formal.stanford.edu/jmc/recursive/recursive.html


My primary development computer, over many generations of system builds, has always been named mccarthy.

Rest In Peace.


I'm the guy that called Stanford. Look, it's not that hard -- someone here must have talked to a Bay Area reporter about their company or their job at some point -- can you please tell her/him to follow up with Stanford and talk to Dan Stober at Stanford News Service and/or Andy Meyer in the Engineering Department? http://chime.in/user/toybuilder/chime/65979187159736320



Anyone got a reliable source corroborating this?


The Wikipedia history states that there was some personal communication with his sister.


I am deeply saddened by this loss to our community. He was such an influence on computer science as a whole.


Well shit. I said it about a week back [1] (maybe a little indelicate, but in my defence, I was a little overwrought):

The technology pioneers who were in their 20s/30s in the 60s/70s are now really old. The next few years will likely see a bloodbath.

Granted, he wasn't that young in those times, but the point holds. We're going to see a bunch of luminaries 'move on' in the coming months :(

[1] https://twitter.com/#!/shr1k/status/125712526397816832


Bloodbath? C'mon... that's a word reserved for the deaths of a bunch of beings, in a short time span, from an 'unnatural' cause. He apparently died peacefully and naturally.


Here is another link (via Google Translate) with the news, although with no source listed; they may have just been going off wendyg's tweet.

http://translate.google.com/translate?sl=auto&tl=en&...


It reminded me of the death of Push Singh a few years ago, just after I had started to know him personally. Rather than focusing on the woe, we tried to celebrate his life, his achievements, and the great inspiration he had brought us. It's really difficult to handle such sadness, but I guess it's very important for the rest of us to think about how we may continue their good will to advance the way we live and think, to inspire others, and to change the world into a better place.


Sorry, but what is your source?


This has got to be the most depressing month in a while.


If this is true I may have to get CL or Racket set up tonight and do some coding in his memory. Ungh, what a crappy month for Computer Science.


This is selfish... but Dr. Knuth, please oh please pick up the pace a notch and select a sidekick to finish your work. Please!


This is an understandable impulse, but completely ignores the nature of the work itself, which is essentially Knuthian. I don't think there's anyone else in the world, let alone an associate of his, who could continue this work.

It's been years since I was around him (I worked at a Stanford spin-off startup in the early 80's that commercialized some TeX printing technologies), but it's hard to overestimate what he's accomplished. What a wonderful man!


Progress and its Sustainability

http://www-formal.stanford.edu/jmc/progress/

Elephant 2000: A Programming Language Based on Speech Acts

http://www-formal.stanford.edu/jmc/elephant/elephant.html


I saw him speak a few times at AAAI conferences - few people, if any, contributed more to the field of AI.


You're absolutely right. He was a living legend.


Confirmed by Stanford Engineering: https://twitter.com/#!/stanfordeng/status/128615022044790784

I've changed the attribution on the TechCrunch story by the way, though it still gives this thread a shout out.


I saw the black band at the top of the page and I knew another great had left us - it's the black band of sorrow.

R.I.P


Seems like October 2011 has taken the best of tech, each great in their own way... RIP John McCarthy


John McCarthy was an exceptional man. I wrote a short article on AI, Siri, and the fourth interface. His concept of engineering an artificial intelligence is now materializing. http://blabeat.com


I'll cross my fingers and hope it's just a hoax in really, really, really bad taste.


It is truly a sad day for AI and LISP. I can only hope he did not suffer in his last moments.

RIP


Damn, with all these people dying I always have to think of American Pie. https://www.youtube.com/watch?v=tr-BYVeCv6U


He was deeply excited about the neutrino experiments reported in the last month and was following them and other such news till the end.

His daughter told me he actually died while in the bathroom, sitting on the pot.


The death has not been added to nndb, so I'm slow to believe it. nndb, for those of you who don't know, is the information source used by Wolfram Alpha for its people results.


How quickly would it be added to nndb?


nndb has been very fast for reporting deaths in the last few months. A few hours at most.


Recently published article with some citations of McCarthy's contributions: http://goo.gl/luVMq [The Australian Eye News]


This is very sad. John was a brilliant man who always made himself available not only to his students but to random people (myself included) emailing him for research advice.


Thank you, John McCarthy. For your work, for your relentless curiosity, and for the inventiveness that helped us all. Without knowing it, you set the path for many of us.


I totally called it. "They always go in threes."

God damn it though. And my fellow programming students have no idea whom I'm talking about. Grar. (RIP '(JOHN MCCARTHY))


Is there any way to implement these memorial black bars in a way that doesn't break the mobile HN apps? I assume they're scraping.


I really hate waking up to a black bar at the top. RIP McCarthy, you are a legend through and through.


Earlier Ritchie and now McCarthy. "The old order changeth, yielding place to new."

I wonder who the "new" will be?


MULTIVAC aka Wolfram Alpha


Oh no! Lisp has had a huge impact on computing. It's hard to imagine how old it is.


It is 53 years old. You don't have to "imagine" how old it is.


53 years is like an eon in computing - probably older than most of us. It's hard to imagine because it is still very much relevant today despite its age.

RIP.


)


(let ((john-mccarthy '())) (with-backtracking john-mccarthy))


He left a legacy that touched my intellectual life.

In that he will live on.


[deleted]


That lists its source as this post, which lists its source as Wikipedia. Still uncorroborated in my mind.


What a grim year. Jobs, Ritchie, and now McCarthy.


What, no HN black ribbon for McCarthy?


There was one for a little while, but it apparently got removed as there was uncertainty about the claim. I'm sure it'll be back soon.


I am without words, rest in peace.


To those who know LISP, here's a response from benfitz which sums it up

))))))))))))))))))))))))))))))))))))))))))))))))))))))))))) :-(


Idols died one by one in a dark October. God bless Knuth; I want to see Vol. 7.


We need a better source.


worst year for CompSci.


Wikipedia has removed this claim now.


Are there any modern people who make great contributions to science who are not Jewish? As a non-Jewish person, I'm thankful for what Jews have given the world. Thank you Mr. McCarthy.


Death always comes in threes.


What a supremely provincial comment.


Oh, yes. Terribly, terribly provincial.


Isn't there something about deaths happening in 3s?

1. Steve Jobs

2. Dennis Ritchie

3. John McCarthy


Let's hope that it is limited to 3. I get terribly antsy just thinking about who else in computing has reached their 70's and 80's.


Agreed, and the close proximity of their deaths makes me think that that's it for a set of 3.


You can divide any list of more than three things into 3s.


How can you divide a list of 4 things into 3s?


You can divide it 4 different ways (4! / 3! = 4). Since the list is necessarily infinite, two more will join it soon enough.


Steve Jobs, Dennis Ritchie, and now John McCarthy - maybe there's some truth to the idea that deaths always come in 3's. Quite unnerving.

We should propose National Computing Month or something for October.


You can take any arbitrary, long running sequence and partition it into threes. Also: selection bias.


To quote the wise _Principia Discordia_:

> "I find the Law of Fives to be more and more manifest the harder I look."


I confess that when the news about Dennis Ritchie was posted, I immediately flashed to the thought, "who's number three?"

Thing is, so did a lot of other people, and tagging John McCarthy's WP page in this way would be just the sort of prank/hoax that thought would inspire. So the "deaths come in threes" aphorism makes me more suspicious of this, not less.


Poisson distribution? I wish I knew more probability and statistics to understand why this happens.


The explanation could be simpler than that: you just don't notice when notable deaths don't clump together.
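To put a rough number on the parent's Poisson guess: even with no clustering mechanism at all, clumps of three aren't rare. A back-of-the-envelope sketch (the one-death-per-month rate is a made-up, purely illustrative number):

    ;; P(k events in a month) = e^-rate * rate^k / k!
    (defun factorial (n)
      (if (< n 2) 1 (* n (factorial (1- n)))))
    (defun poisson-pmf (k rate)
      (/ (* (exp (- rate)) (expt rate k)) (factorial k)))
    ;; chance of 3 or more notable deaths in a single month at rate = 1/month:
    (- 1 (loop for k from 0 to 2 sum (poisson-pmf k 1.0)))
    ;; => ~0.08, i.e. roughly one month in twelve, with no "rule of threes" needed

So a three-in-a-month cluster shows up about once a year by chance alone; we just remember the months where it happens to hit famous names.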


Why in G-d's name are you people hemorrhaging about who or where the source came from? As great a man as he was, this doesn't warrant conniption fits about who knew what first. You're acting like he held national secrets that upon his death would lead to some catastrophe. Give it a rest. The man was in his 80s and led a wonderfully productive life. Get back to work, assholes.


While there may be a bit too much bickering going on about it - having a factual source is important instead of just believing a single tweet or Wikipedia edit.


A bit too much? Half of the posts in this thread are debating this very topic. It's like everyone missed the forest for the trees.


It's not about "who knew what first." It's about figuring out if it's actually true.



