- Born to communist activists in 1920s Boston
- Studied math with John Nash at Princeton
- Organized the first computer-chess match in 1966, pitting his own program against one created by computer scientists in the USSR, with moves communicated by telegraph.
While many will remember him for Lisp and his contributions to AI, perhaps equally important was his role in running a research lab in what became a characteristic West Coast fashion: finding intelligent people, setting them free to do what they wanted, and stubbornly arguing with them at the slightest provocation. He was far enough from the mainstream to be familiar with, and even sympathetic to, the 60s counterculture, but also cynical enough to embrace technology, rather than revolution, as the way forward. If this sounds a bit like running a startup today, there's one main reason it does: John McCarthy.
It's almost depressing to see that the same problems we keep (re-)solving have had great minds beating on them for quite a while now :)
Just to add to that - I've had a message from Wendy saying that McCarthy's daughter had 'phoned her with the news, and Wendy is the ultimate source of the wikipedia edit/entry.
I sense a black bar coming.
The last few weeks have been terrible; one icon after another has fallen. What makes computing different from other fields is that it is still young enough that almost all of its luminaries, except for the earliest ones, are still alive.
I fear the avalanche has only barely begun, and at some point the frequency of these will become high enough that either we will stop noticing or we will have that black bar here more or less permanently.
Ugh, not now. References or stop talking. I have my fingers crossed he was signed up for cryonics. This is tragic.
All this talk about the inevitability of death and how that's supposedly a good thing is basically just pseudo-religious reasoning in defense of a status quo that is rapidly crumbling away.
Sure there is. The fitness function is just different from what you or I consider 'optimal' or 'superior'. We're still becoming more resistant to disease, even if the future may have more in common with trailer park people than urban elites.
There are people who resist change and disapprove of all viewpoints other than their own, but I doubt it's an age thing. I see a lot of young people opposing gay marriage, for instance.
Ideas tend to propagate in generations - it's often necessary for the old guard to die off before a new idea will be accepted, whether this is in art, or science, or politics or whatever.
As new people crop up who aren't reputationally invested in a certain idea, more ideas can be considered seriously, and so on.
Good to see ageism is alive and well.
I know a guy who did a bunch of market research for a senior care product. They found that technology use, on average, dips when people are in their sixties and then starts increasing again in their seventies.
The people who are doing the inventing now are just as smart, but they aren't creating fundamental things, they are improving on an existing system.
We don't say, "engineering team Y invented the iCore series of processors." We sort of just say, "Intel is coming out with a new set of processors."
I'm not sure if I am making my point very clear. :\
Zuckerberg is definitely a successful dude that one can look up to. But in terms of advancing the field of computing, he hasn't done a whole lot yet.
I don't see that... he just said it wasn't a fair comparison. If it comes off like he thinks advancements in mathematics are more important than other advances, it might be because he does think so, but that's not exactly what he's saying.
It's not too hard to imagine a world without Facebook, but I'd find it very hard indeed to imagine a world without Microsoft, Google, Linux, Open Source, or The Art of Computer Programming.
It's almost like this is one of those trick questions: "which name does not belong in this list". Zuckerberg pops out at me immediately.
All the rest is timeless; Facebook will one day be gone again, I'm fairly sure of that.
Mr. Knuth's contributions, on the other hand, are much more fundamental. But, because of this, his name remains unknown to the general population. For us, he is a titan amongst men; for the world, he'd be an old, retired professor who cannot even figure out email.
Theory, no. Practice, oh Hell yes. Facebook changed the game, for everyone. The "Social Revolution" has changed the face of business, investing, interpersonal communication, friendships, revolutions... the list goes on. Maybe Zuck was just in the right place at the right time, but he made the most of it and his company's impact has been enormous.
The real fundamentals of computing happened a long time ago. Hero of Alexandria, some Chinese and Islamic scholars, Albertus Magnus, Roger Bacon, Charles Babbage, Ada Lovelace, Joseph Marie Jacquard, Turing, von Neumann. But progress was linear, not exponential - the tools they made were so impractical that they didn't make further tools easier. You used a computer as a curiosity, or to solve a specific problem, not to build a better computer.
Then the singularity hit. Computers got faster, IO got richer, and you could practically write computer programs that would help you write computer programs. It's hard to say exactly when that happened, but the inventions of C, UNIX, and Lisp were certainly part of that critical period in history. This shift started around the mid-1960s to early 70s.
The people who were working on it are now in their 60s and 70s. The next generation (who are in their 50s and 60s) wasn't just inspired by them, they were given very useful tools, which made later advancements easier. It's easier to write a C++ compiler if you already have a C compiler. It's easier to write git if you have a CVS repo to store your work in. People no longer had to bootstrap using nothing but assembly, they started with a sound foundation, and could iterate.
Because of the work of these founders, I think there are a lot more icons left.
It's easier to write git if you have a CVS repo
to store your work in
For the first 10 years of kernel maintenance, we
literally used tarballs and patches, which is a much
superior source control management system than CVS is
And I really think the world needs more people like Linus, as some wheels are worth reinventing.
The real fundamentals of computing happened a long
On "tarballs and patches", I was referring to this famous quote from http://marc.info/?l=linux-kernel&m=111288700902396 ...
So I'm writing some scripts to try to track
things a whole lot faster. Initial indications
are that I should be able to do it almost as
quickly as I can just apply the patch, but
quite frankly, I'm at most half done, and if I
hit a snag maybe that's not true at all.
Anyway, the reason I can do it quickly is that
my scripts will _not_ be an SCM, they'll be a
very specific "log Linus' state" kind of thing.
That will make the linear patch merge a lot more
time-efficient, and thus possible.
This was just a factual correction; I find the story of Git fascinating.
The basic idea here is not to bash the current state
of things but to try to understand something about the
progress of coming up with fundamental new ideas and
I claim that we need really new ideas in most areas of
computing, and I would like to know of any important and
powerful ones that have been done recently. If we can't
really find them, then we should ask "Why?" and "What
should we be doing?"
Perhaps there will be a more significant revolution in our future, but that doesn't contradict the above statement.
First, the software engineering field is still very immature. The tools and terminology in use are still very primitive. Studies seeking to identify best practices via evidence are still few and far between. Worse yet, the known best practices that do have substantial evidence behind them are still widely ignored across the industry. I believe there is still a lot of room for new techniques, and new models of software engineering, which will serve to guide developers for generations.
Second, we are at a critical juncture in programming languages. There are a lot of new languages gaining popularity, and a lot of older languages seeing substantial change. The fundamental paradigms in use by the average programmer are changing (becoming increasingly more functional, generally). Meanwhile, there are even more advanced concepts on the horizon (monads, for example) which are still not accessible to most programmers. There's a huge opportunity there for the next generation of languages to increase the accessibility of those advanced techniques. Additionally, we're in the midst of a revolution in terms of compiler technology. We are seeing things like modular compilers, very advanced run-times capable of incredible speed improvements, cutting edge garbage collectors, and the potential for massive parallelization on the desktop. Again, there's a huge potential there for some very fundamental innovations.
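To make the point about monads still being inaccessible concrete, here's a minimal sketch of the Maybe idea in Common Lisp (maybe-bind and safe-div are made-up names for illustration, not any real library; NIL stands in for "Nothing"):

  ;; Bind for the Maybe monad: NIL short-circuits the chain.
  (defun maybe-bind (value fn)
    (if (null value) nil (funcall fn value)))

  ;; A computation that can fail: returns NIL on division by zero.
  (defun safe-div (x y)
    (unless (zerop y) (/ x y)))

  ;; Chaining: a NIL anywhere propagates to the final result.
  (maybe-bind (safe-div 10 2)
              (lambda (v) (safe-div v 0)))  ; => NIL

The point is that today the average programmer hand-rolls this plumbing; a next-generation language that makes it first-class is what would make the technique broadly accessible.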
Third, we are nowhere near the end of the line in computing hardware. Over the next few decades we could see some very crazy things. 3D microchips putting trillions of transistors into every CPU. Even more massive parallelization than what we've seen to date. RSFQ processors that could put petaflops of computing power on a single chip. MRAM married to processor cores, leading to breakthroughs in performance. Non-von Neumann architectures enabled by memory and processing power being mixed together. FPGAs with billions of gates, capable of operating and reconfiguring at multi-gigahertz speeds. Asynchronous CPUs.
And these are just things at a very fundamental level. There's still a huge potential for lots of innovations in the software and hardware space closer to the end-user. There's plenty of room for more icons in the future.
The '50s computing leaders (mostly academics) are in their old age, the '70s leaders (academics and businessmen) are heading into retirement (except for Jobs unfortunately), and the later leaders (Google, Facebook) are still going strong.
Anyway I bundled up some of my thoughts on this on my blog: http://all-things-andy-gavin.com/2011/10/25/lispings-ala-joh...
Have you investigated Quicklisp using SBCL? I have found Quicklisp generally a better experience than CPAN, pip, Cabal, and the other package managers I've used.
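For anyone who hasn't tried it, the workflow is roughly this (after Quicklisp's one-time install; alexandria here is just an arbitrary example library):

  ;; One-time setup, per the Quicklisp instructions:
  ;;   (load "quicklisp.lisp")
  ;;   (quicklisp-quickstart:install)
  ;; After that, fetching and loading a library is a single form:
  (ql:quickload "alexandria")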
SBCL (a fork of CMUCL) has monthly maintenance releases and recently had a very successful crowdfunding campaign.
Of course, if you find CL itself to be weaker in areas important to you than (say) Ruby, none of the above matters. But I thought you might like to know. :-)
Reading about all the cool stuff you guys did with GOOL is the reason I got into Lisp.
Awesome to see that you post here on HN. :)
Would be nice if you could expand on this, as it surprises me that an [ex-]Lisper would not prefer CLOS above all! :)
Also, we call it Lisp nowadays. <g>
That was beautiful.
— John McCarthy (speaking at the MIT Centennial in 1961).
It's been pointed out that McCarthy used the term "utility" rather than "cloud", but many of us would argue they are one and the same. Indeed Wikipedia defines it as:
"Cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to computers and other devices as a utility (like the electricity grid) over a network (typically the Internet)."
In any case, can anyone think of someone more deserving of the honour of the title "father of cloud computing" than McCarthy? I certainly can't.
Basically, I have a spoonful of salt to hand.
--EDIT-- Since I wrote this, I've seen one source appear, in Portuguese. It's still unclear whether this story is legitimate, or if they simply copied wikipedia / the buzz surrounding the story.
--EDIT 2-- Globo is apparently the largest media conglomerate in South America and thus unlikely to publish unsourced stories, so this looks (sadly) to be true.
This might be a good way to make people read about this kind of person.
"... his daughter phoned me. I'm the
ultimate source for the wikipedia entry."
I, Colin D Wright, hereby state that my colleague,
Wendy M Grossman, personally sent me a message stating
explicitly that she had been phoned personally by the
daughter of John McCarthy and told of his death.
This is one of the more controversial and often debated Wikipedia rules.
Does that make her AI, or merely AI's sister?
But now, the pioneers have gotten old. And the excitement is slowly ebbing away.
What an amazing pioneer. He will be missed.
Sad, sad, sad day. :(
"I just received news that John McCarthy passed away in his sleep a few hours ago; he was at home."
Thanks anyway - I am sure it will appear in time.
Usually stuff like this is legit, but sometimes it isn't. So the policy is to have a reliable source or nothing.
Latest edit shows:
"(Protected John McCarthy (computer scientist): Violations of the biographies of living persons policy: Rumors of his death added to page. May be true, but better to wait than edit war."
He may be a writer for Wired, but he doesn't seem to have a solid source either.
EDIT: There seems to be someone who claims Stanford has confirmed it by phone, as you can see on the Discussion page of Wikipedia, though:
"I spoke to the Associate Director of Communications at Stanford School of Engineering, and Dan Stober of Stanford News Service. Both have confirmed that Professor McCarthy passed away this weekend. They both said there were not a lot of details available at this time. An obituary will be issued by Stanford soon. http://chime.in/user/toybuilder/chime/65979187159736320 -- I apologize for bad formatting. I'm not a regular Wiki editor. Joseph Chiu 24 October 2011, 2:11 PDT. I have not edited the article page."
RIP. Thanks for inventing all that awesome stuff modern languages are constantly rediscovering.
"So she joined the group to climb Annapurna, and was part of the second team to attempt the summit. You go up in pairs, so you do pairwise summit attempts - these Himalayan style things where you do base camps. So she was working her way to the upper camp as the first summit team was coming down between the topmost and the second topmost. They passed, and then she was lost - she and her partner were lost. We're quite sure they fell. They were roped together; we think one fell and took the other with them."
As I was reading that I was thinking, 'why twos? that's not enough to anchor someone if they fall into a crevasse or something, you need at least three for that.' And then I get to the end, and that's exactly what happened. I wonder if people still attempt Himalayan mountains in twos like that.
Moreover, climbing as a trio tends to be (much) slower than traveling as a pair. When weather and hypoxia are your biggest threat, speed can literally be the only thing keeping you alive.
</climber notes="got sick around 4100m, big wimp">
I like how one simple, succinct, relevant question can say so much about Norvig's claims. It seems appropriate coming from the creator of LISP. And, of course, it shows the mutual respect and professionalism they both had, to leave it at that.
I think we will have to wait for confirmation, but not long.
Thanks to John McCarthy for making such huge contributions to the computing world, and to my hobby and career.
Thank you John, RIP
I suspect this will be the least well reported obituary, which is kind of sad. He was surely one of the greats.
"His 2001 short story The Robot and the Baby lightheartedly explored the question of whether robots should have (or simulate having) emotions, and anticipated aspects of Internet culture and social networking that became more prominent in the ensuing decade."
Steve Jobs, Dennis Ritchie, and John McCarthy managed to do it. May they rest in peace now, and be remembered forever.
(will-be 'john-mccarthy (modifier 'missed 'dearly))
(assert (equal 'dearly (missed-amount 'john-mccarthy)))
(He (will be) (dearly missed))
That's not very Lispy, and I'm sure some linguists would have a thing or two to say about my tree, but it at least has a structure related to the meaning.
With M-expressions, this would look like: he[will[be[dearly[missed]]]]
Rest In Peace.
The technology pioneers who were in their 20s/30s in the 60s/70s are now really old. The next few years will likely see a bloodbath.
Granted, he wasn't that young in those times, but the point holds. We're going to see a bunch of luminaries 'move on' in the coming months :(
It's been years since I was around him (I worked at a Stanford spin-off startup in the early 80's that commercialized some TeX printing technologies), but it's hard to overestimate what he's accomplished. What a wonderful man!
Elephant 2000: A Programming Language Based on Speech Acts
I've changed the attribution on the TechCrunch story by the way, though it still gives this thread a shout out.
His daughter told me he actually died while in the bathroom, sitting on the pot.
God damn it though. And fellow programmer students have no idea whom I'm talking about. Grar. (RIP '(JOHN MCCARTHY))
I wonder who the "new" ones will be?
In that he will live on.
1. Steve Jobs
2. Dennis Ritchie
3. John McCarthy
We should propose National Computing Month or something for October.
> "I find the Law of Fives to be more and more manifest the harder I look."
Thing is, so did a lot of other people, and tagging John McCarthy's WP page in this way would be just the sort of prank/hoax that it would inspire. So the "deaths come in threes" aphorism makes me more suspicious of this, not less.