For a fun portrait of McCarthy's early life and aspirations, see the passages about him in "What the Dormouse Said". [1]
- Born to communist activists in 1920s Boston
- Studied math with John Nash at Princeton
- Organized the first computer-chess match in 1965, pitting his own algorithm against one created by computer scientists in the USSR, with moves communicated by telegraph.
While many will remember him for Lisp and his contributions to AI, perhaps equally important was his role in running a research lab in what became a characteristically West Coast fashion: finding intelligent people, setting them free to do what they wanted, and obdurately arguing with them over very little provocation. He was far enough outside the mainstream to be familiar with, and even sympathetic to, the 60s counterculture, but also cynical enough to embrace technology, rather than revolution, as the way forward. If this sounds a bit like running a startup today, there's one main reason it does: John McCarthy.
Whether or not he has passed, if anyone has the book and fancies spending a little time adding interesting material to the WP article about him, that would be nice :)
Just to add to that - I've had a message from Wendy saying that McCarthy's daughter had 'phoned her with the news, and Wendy is the ultimate source of the wikipedia edit/entry.
Sad to hear about this. I love LISP and discovered the beauty of it when I was part of the Gensym product team - a large-scale real-time system with AI and neural networks.
The last few weeks have been terrible; one icon after another has fallen. What is different from other fields is that computing is still young enough that almost all of its luminaries, except for the earliest ones, are still alive.
I fear the avalanche has only barely begun, and at some point the frequency of these will become high enough that either we will stop noticing or we will have that black bar on here more or less permanently.
Death is the only guarantee in life and in fact is necessary for human ecology. I too am deeply saddened by this loss, but am supremely grateful that a man like John McCarthy was able to contribute to humanity's knowledge. May his work be the foundation for many more advances in human understanding, and may his void be filled by someone who respects this journey.
While I'm pretty confident in humanity's current understanding of physics, I'm not confident enough to say that, given a million (more like several billion) years and Jupiter brains, we won't figure out a way out of whatever the fate (if there is one) of the universe is. The more important question, however, is how I can actually get to the point of living a million-plus years. Once I'm there, I can argue over some supposed necessity or long-term inevitability of death; right now it doesn't have to be that way.
We're not Drosophila; we're people with minds capable of exploring and understanding a world that goes beyond biological evolution. We don't have to use death to iterate; we do it using general intelligence (it outperforms evolution by a factor of a bazillion). Also, not all organisms use preprogrammed death as a mechanism to iterate - ask an amoeba sometime.
All this talk about the inevitability of death and how that's supposedly a good thing is basically just pseudo-religious reasoning in defense of a status quo that is rapidly crumbling away.
Right, but if there is actually any selection for aging and death in organisms (maybe there is, I'm not sure), it's not useful anymore. There's a lot of evidence that there's no longer genetic selection acting on the human population, and it's therefore no longer true that our children should be genetically superior.
>There's a lot of evidence that there's no longer genetic selection acting on the human population
Sure there is. The fitness function is just different from what you or I consider 'optimal' or 'superior'. We're still becoming more resistant to disease, even if the future may have more in common with trailer park people than urban elites.
And the way of thinking isn't constant. It changes over time. A 15-year-old you, a 25-year-old you, and a 50-year-old you are all different people. A 500-year-old you will most probably be a different person.
There are people who resist change and disapprove of all viewpoints other than their own, but I doubt it's an age thing. I see a lot of young people opposing gay marriage, for instance.
We think differently as we age. Have you spent any amount of time arguing with old people? You can't teach an old dog new tricks, etc.
Ideas tend to propagate in generations - it's often necessary for the old guard to die off before a new idea will be accepted, whether this is in art, or science, or politics or whatever.
As new people crop up that aren't reputationally invested in a certain idea it allows more ideas to be considered seriously, etc.
I know a guy who did a bunch of market research for a senior care product. They found technology use, on average, dips in the 60s and then starts increasing again in the seventies.
What you're seeing is actually the death of the second generation of great comp sci minds, IMO. The first modern generation included von Neumann (b. 1903), Hopper (b. 1906), Turing (b. 1912), and others. McCarthy (1927), Knuth (1932), Hoare (1934) etc. are of the next generation. Obviously, you can cut generations on any arbitrary line and YMMV, but the 15-20 year gap seems a reasonable grouping, IMO.
He's pre-generational, along with people like Ramon Llull and Hero of Alexandria (who, however, did not have such a heroically unusual distribution of letters in his last name).
Eventually, I think, there will cease to be icons within the industry. What we have now are the people who invented the internet/computers and the basic components.
The people who are doing the inventing now are just as smart, but they aren't creating fundamental things, they are improving on an existing system.
We don't say, "engineering team Y invented the iCore series of processors." We sort of just say, "Intel is coming out with a new set of processors."
I'm not sure if I am making my point very clear. :\
Between Gates, Zuckerberg, Larry and Sergey, Torvalds, RMS, Knuth, and a few others, there are still plenty of icons left, so I don't foresee your scenario happening anytime soon. I am generally an optimist, and think the number of icons to come is far greater than those who have come and gone.
Is it really fair to compare Zuckerberg to Knuth? Zuckerberg made a PHP site for people to stalk their classmates. Knuth wrote TeX, hundreds of papers, and a series of books considered among the 100 most important scientific works of the 20th century.
Your comment presupposes that advances in mathematics are more important than advances in social communication. Is Mozart less iconic because his achievements were in music?
I think it's more about how hard it is than how important. Facebook represents a lot of work but nothing that seems to demand superhuman insight; an army of Stanford CS grads supplied with pot and Red Bull could probably make a tolerable substitute in five or ten years. But it's far from certain any of them could ever produce the kind of work Turing or McCarthy brought us. (And I say this knowing I won't, either.)
> Your comment presupposes that advances in mathematics are more important than advances in social communication.
I don't see that... he just said it wasn't a fair comparison. If it comes off like he thinks advancements in mathematics are more important that other advances, it might be because he does think so, but that's not exactly what he's saying.
Not sure why you're being downvoted. Somehow people have this picture of mathematics as this universal body of knowledge, but that couldn't be further from the truth. They are as earthly as anything else we've been doing as a human race. For an interesting perspective on mathematics from a cognitive-scientific point of view, check out George Lakoff's book "Where Mathematics Comes From."
> Gates, Zuckerberg, Larry and Sergey, Torvalds, RMS, Knuth
It's not too hard to imagine a world without facebook, but I'd find it very hard indeed to imagine a world without Microsoft, Google, Linux, Open Source or the Art of Computer Programming.
It's almost like this is one of those trick questions: "which name does not belong in this list". Zuckerberg pops out at me immediately.
All the rest is timeless, Facebook will one day be gone again, I'm fairly sure of that.
I have to disagree on this one. If there is one who does not belong in the list, it would be Knuth. Zuckerberg, as well as Gates, Larry and Sergey, has created a technology company that changed the lives of millions. They are all well known by the "civilians". Torvalds less so, but still, the Linux project has had a relevance that transcends mere technology.
Mr. Knuth's contributions, on the other hand, are much more fundamental. But, because of this, his name remains unknown for the general population. For us, he is a titan amongst men; for the world, he'd be an old, retired professor who cannot even figure out email.
> in terms of advancing the field of computing, he hasn't done a whole lot yet.
Theory, no. Practice, oh Hell yes. Facebook changed the game, for everyone. The "Social Revolution" has changed the face of business, investing, interpersonal communication, friendships, revolutions... the list goes on. Maybe Zuck was just in the right place at the right time, but he made the most of it and his company's impact has been enormous.
In terms of science, absolutely not. But impact, yes. Facebook changes everything. From now on, most people will be a part of an online social network. It's as important to human communication as the telephone was. Know how we use letters from the Civil War era to understand what that society was like? Imagine what will be possible in that regard in 100, 200 or 500 years.
I hate to sound negative, but things won't get better. OK, this month has been a shocker, but next year might be just as bad.
The real fundamentals of computing happened a long time ago. Hero of Alexandria, some Chinese and Islamic scholars, Albertus Magnus, Roger Bacon, Charles Babbage, Ada Lovelace, Joseph Marie Jacquard, Turing, von Neumann. But progress was linear, not exponential - the tools they made were so impractical that they didn't make further tools easier. You used a computer as a curiosity, or to solve a specific problem, not to build a better computer.
Then the singularity hit. Computers got faster, IO got richer, and you could practically write computer programs that would help you write computer programs. It's hard to say exactly when that happened, but the inventions of C, UNIX, and Lisp were certainly part of that critical period in history. This shift started around the mid-1960s to early 70s.
The people who were working on it are now in their 60s and 70s. The next generation (who are in their 50s and 60s) wasn't just inspired by them, they were given very useful tools, which made later advancements easier. It's easier to write a C++ compiler if you already have a C compiler. It's easier to write git if you have a CVS repo to store your work in. People no longer had to bootstrap using nothing but assembly, they started with a sound foundation, and could iterate.
Because of the work of these founders, I think there are a lot more icons left.
It's easier to write git if you have a CVS repo
to store your work in
Not to diminish your point or anything, but regarding Git, I'll just quote Linus Torvalds:
For the first 10 years of kernel maintenance, we
literally used tarballs and patches, which is a much
superior source control management system than CVS is
Basically Git started as a bunch of scripts written by Linus for dealing with "tarballs and patches", and nobody starts something like this unless there's a lot of hatred involved. What you get if you're a CVS user and want something better is SVN. You get Git when you consider CVS such an abomination that you'd rather do without any existing VCS altogether.
And I really think the world needs more people like Linus, as some wheels are worth reinventing.
The real fundamentals of computing happened a long
time ago.
I think it's worth noting that git did not replace tarballs and patches. Git replaced Bitkeeper. Bitkeeper, as far as I know, replaced tarballs and patches. And as far as I know, git did not start as a bunch of scripts for dealing with tarballs and patches. The design was inspired by his knowledge of and experience with Bitkeeper.
The workflow was indeed inspired by Bitkeeper, nothing is truly revolutionary, but Bitkeeper itself was not the status quo in a world dominated by CVS, SVN and Perforce.
So I'm writing some scripts to try to track
things a whole lot faster. Initial indications
are that I should be able to do it almost as
quickly as I can just apply the patch, but
quite frankly, I'm at most half done, and if I
hit a snag maybe that's not true at all.
Anyway, the reason I can do it quickly is that
my scripts will _not_ be an SCM, they'll be a
very specific "log Linus' state" kind of thing.
That will make the linear patch merge a lot more
time-efficient, and thus possible.
Bitkeeper was not the status quo, but I don't think that's important in the overall point. The original point was that git was inspired by previous software. The original software pointed at, cvs, was wrong, but I don't think the original point changes when we put the correct software, Bitkeeper, in its place in the sentence.
The basic idea here is not to bash the current state
of things but to try to understand something about the
progress of coming up with fundamental new ideas and
principles.
I claim that we need really new ideas in most areas of
computing, and I would like to know of any important and
powerful ones that have been done recently. If we can't
really find them, then we should ask "Why?" and "What
should we be doing?"
Which sounds to me like "The real fundamentals of computing happened a long time ago."
Perhaps there will be a more significant revolution in our future but that doesn't contradict the above statement.
First, the software engineering field is still very immature. The tools and terminology in use are still very primitive. Studies seeking to identify best practices via evidence are still few and far between. Worse yet, known best practices with substantial evidence behind them are widely ignored in the industry on average. I believe there is still a lot of room for new techniques, and new models of software engineering, which will serve to guide developers for generations.
Second, we are at a critical juncture in programming languages. There are a lot of new languages gaining popularity, and a lot of older languages seeing substantial change. The fundamental paradigms in use by the average programmer are changing (becoming increasingly more functional, generally). Meanwhile, there are even more advanced concepts on the horizon (monads, for example) which are still not accessible to most programmers. There's a huge opportunity there for the next generation of languages to increase the accessibility of those advanced techniques. Additionally, we're in the midst of a revolution in terms of compiler technology. We are seeing things like modular compilers, very advanced run-times capable of incredible speed improvements, cutting edge garbage collectors, and the potential for massive parallelization on the desktop. Again, there's a huge potential there for some very fundamental innovations.
Third, we are nowhere near the end of the line in computing hardware. Over the next few decades we could see some very crazy things: 3D microchips putting trillions of transistors into every CPU; even more massive parallelization than what we've seen today; RSFQ processors that could put petaflops of computing power on a single chip; MRAM married to processor cores, leading to breakthroughs in performance; non-von Neumann architectures enabled by memory and processing power being mixed together; FPGAs with billions of gates capable of operating and reconfiguring at multi-gigahertz speeds; asynchronous CPUs.
And these are just things at a very fundamental level. There's still a huge potential for lots of innovations in the software and hardware space closer to the end-user. There's plenty of room for more icons in the future.
I think the distinction you're highlighting is between the people who start the giant companies of the industry and the people who keep those companies successful. Everyone knows that Noyce/Moore/Grove built Intel; not many people know who runs it today.
The '50s computing leaders (mostly academics) are in their old age, the '70s leaders (academics and businessmen) are heading into retirement (except for Jobs unfortunately), and the later leaders (Google, Facebook) are still going strong.
People have been about the same intelligence for a long time. Steve, Dennis, and John were no smarter than their predecessors, they were just applying their intelligence in an area other smart people weren't looking at. Woz was designing calculators before he designed the Apple II. So what's the next area a smart person can make a big difference in?
Fundamental inventions and discoveries are never recognized right away. That's why Nobel prizes are usually awarded for work done several decades ago. The people doing revolutionary work in computing right now do exist; we just haven't recognized them yet.
For nearly two decades I was a diehard LISP advocate. I even forced all my programmers to code three Crash Bandicoot and four Jak & Daxter games in custom LISP dialects that I wrote the compilers for (an article on one here http://all-things-andy-gavin.com/2011/03/12/making-crash-ban...).
But by the mid-2000s I had started doing in Ruby the kind of programming I used to do in LISP. It's not that Ruby is a better language; it was mostly the momentum factor and the availability of modern libraries for interfacing with the vast array of services out there. Using the crappy, unreliable, or outdated LISP libraries -- if they worked at all -- was tedious. Plus the LISP implementations were so outmoded. It was very hard to get other programmers (except a couple of enthusiasts) to work that way.
Ruby struck a decent compromise. And its type system and object model are better than CL's anyway. The syntax is more inconsistent, and the macro model nowhere near as good. But it turns out libraries and implementation matter a lot. Still, you can feel lots and lots of LISP influence in all the new runtime-typed languages (Ruby, Python, etc). And 30 years later, listeners still rule!
Have you investigated Quicklisp[1] using SBCL? I have found Quicklisp generally a better experience than CPAN, pip, Cabal, and the other package managers I've used.
SBCL (a fork of CMUCL) sees monthly maintenance releases and recently had a very successful crowdfunding campaign.
Of course, if you find CL itself to be weaker in areas important to you than (say) Ruby, none of the above matters. But I thought you might like to know. :-)
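For anyone who hasn't tried it, here's a minimal sketch of that Quicklisp workflow under SBCL; the library loaded below is just an example of mine, not one the parent comment named:

    ;; One-time setup, per quicklisp.org: load quicklisp.lisp into SBCL, then run
    ;;   (quicklisp-quickstart:install)
    ;;   (ql:add-to-init-file)
    ;; After that, pulling a library plus its dependencies into the image is one call.
    ;; "alexandria" is just an illustrative choice of library.
    (ql:quickload "alexandria")
    (alexandria:flatten '(1 (2 (3 4)) 5))   ; => (1 2 3 4 5)

Once a library is installed it loads from the local cache, so subsequent quickloads are fast and work offline.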
"If computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility... The computer utility could become the basis of a new and important industry."
— John McCarthy (speaking at the MIT Centennial in 1961).
It's been pointed out that McCarthy used the term "utility" rather than "cloud", but many of us would argue they are one and the same. Indeed Wikipedia defines it as:
"Cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to computers and other devices as a utility (like the electricity grid) over a network (typically the Internet)."
In any case, can anyone think of anyone more deserving of the honour of the title "father of cloud computing" than McCarthy? I certainly can't.
Agreed. I did a quick Google News search and turned up nothing. Could be legit, or vandalism, but there's really no way to know until we have confirmation.
--EDIT-- Since I wrote this, I've seen one source appear, in Portuguese[1]. It's still unclear whether this story is legitimate, or if they simply copied wikipedia / the buzz surrounding the story.
--EDIT 2-- Globo is apparently the largest media conglomerate in South America[2] and thus unlikely to publish unsourced stories, so this looks (sadly) to be true.
Ok, come on. Both here and on Wikipedia people are posting links to blog posts or minor news sites that supposedly "support" the claim. But these are clearly instigated by the original Wikipedia edit itself, so that they don't provide any additional proof.
I'm not sure what you're saying or implying here, but it feels like an insult directed either at me or at Wendy, I'm not sure which. So let me state this a little more clearly.
I, Colin D Wright, hereby state that my colleague,
Wendy M Grossman, personally sent me a message stating
explicitly that she had been phoned personally by the
daughter of John McCarthy and told of his death.
Now, in what way does that involve wikipedia, and who are you accusing of using WP for "original research"?
I did not in any way wish to insult you or Wendy. I was just making a sarcastic comment about the fact that Wikipedia discourages one from entering information on the site even if one has first-hand, direct knowledge of said information.
This is one of the more controversial and often debated Wikipedia rules.
With all these great minds from our field passing away, I'm beginning to think we should pause for a moment and think what we could do to thank all those legendary computing figures still with us? Knuth, Kernighan and Woz come to mind, certainly there are a lot more.
Yes. One of the more exciting things about computer science, a few decades ago, was that it was all brand new. Even the pioneers were contemporaries. Those were exciting times to live in, since it made me feel like we were on the edge of a brand new technology.
But now, the pioneers have gotten old. And the excitement is slowly ebbing away.
Does anyone have a better article? I can't find any other information about it. I found out from the NYC Lisp mailing list, and Wikipedia confirmed it.
The archives of the mailing list are private, so I can't link. You can see in the wikipedia history that whoever made the edit claims to have received word from McCarthy's sister.
"(Protected John McCarthy (computer scientist): Violations of the biographies of living persons policy: Rumors of his death added to page. May be true, but better to wait than edit war."
That was me. I saw the laughing squid link, followed it here, then discovered that their ultimate reference was the wikipedia article. I'm keeping tabs on the news so if an article pops up I'll add it or unprotect the article.
It [edit: the Brazilian article] cites Steven Levy, senior writer of Wired, as the source. It does not say where the source is found, but I assume it's this tweet:
He may be a writer for Wired, but he doesn't seem to have a solid source either.
EDIT: There seems to be someone who claims Stanford has confirmed it by phone, as you can see on the Discussion page of Wikipedia, though:
"I spoke to the Associate Director of Communications at Stanford School of Engineering, and Dan Stober of Stanford News Service. Both have confirmed that Professor McCarthy passed away this weekend. They both said there were not a lot of details available at this time. An obituary will be issued by Stanford soon. http://chime.in/user/toybuilder/chime/65979187159736320 -- I apologize for bad formatting. I'm not a regular Wiki editor. Joseph Chiu 24 October 2011, 2:11 PDT. I have not edited the article page."
I reverted it again. I feel terrible doing that but the alternative is sourcing our article from personal communications or a tweet. Have yet to see a reference which is reliable (in the wikipedia sense) or not ultimately derived from the original edit to wikipedia.
"So she joined the group to climb Annapurna, and was part of the second team to attempt the summit. You go up in pairs, so you do pairwise summit attempts - these Himalayan style things where you do base camps. So she was working her way to the upper camp as the first summit team was coming down between the topmost and the second topmost. They passed, and then she was lost - she and her partner were lost. We're quite sure they fell. They were roped together; we think one fell and took the other with them."
As I was reading that I was thinking, 'why twos? that's not enough to anchor someone if they fall through an ice crevice or something, need at least three for that.' And then I get to the end, and that's exactly what happened. I wonder if people still attempt Himalayan mountains in twos like that.
Crevasses are normally encountered during glacier travel. These aren't particularly common on steep slopes.
Moreover, climbing as a trio tends to be (much) slower than traveling as a pair. When weather and hypoxia are your biggest threat, speed can literally be the only thing keeping you alive.
</climber notes="got sick around 4100m, big wimp">
I like how one simple, succinct, relevant question can say so much about Norvig's claims. It seems appropriate coming from the creator of LISP. And, of course, there's the mutual respect and professionalism they both had to leave it at that.
I got into programming through Logo in elementary school. Logo is a Lisp dialect. Even if it took me 15 years to rediscover Lisp, I am thankful for the good beginnings.
Thanks to John McCarthy for making such huge contributions to the computing world, and to my hobby and career.
"Program designers have a tendency to think of the users as idiots who need to be controlled. They should rather think of their program as a servant, whose master, the user, should be able to control it. If designers and programmers think about the apparent mental qualities that their programs will have, they'll create programs that are easier and pleasanter — more humane — to deal with."
Steve Jobs and Dennis Ritchie were obviously major pioneers (and I admired them both), but of the three, McCarthy's work impresses me the most. I still marvel at the profound degree of lateral thinking evident in his articles.
I suspect this will be the least well reported obituary, which is kind of sad. He was surely one of the greats.
"His 2001 short story The Robot and the Baby[9] lightheartedly explored the question of whether robots should have (or simulate having) emotions, and anticipated aspects of Internet culture and social networking that became more prominent in the ensuing decade."
WP article is now updated. The techcrunch piece cited HN but the author also contacted Stanford faculty. Gamasutra also ran a short piece. The article is no longer protected if you guys want to update it or clean it up.
Whenever people make Lisp jokes by nesting parentheses without regard for the semantics that they denote, I get a little disappointed. It's not anger, nor contempt; just a feeling that we can do better.
It would at least be nice to have the parentheses in some sense represent the parse tree for the sentence, like:
(He (will be) (dearly missed))
That's not very Lispy, and I'm sure some linguists would have a thing or two to say about my tree, but it at least has a structure related to the meaning.
It's perhaps not a very well-known fact, but McCarthy's original LISP proposal used M-expressions instead of S-expressions; M-expressions were to be transformed into S-expressions. Using S-expressions directly, however, turned out to be more popular amongst programmers.
With M-expressions, this would look like: he[will[be[dearly[missed]]]]
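For anyone who hasn't seen the notation, here's a small sketch of the correspondence using the classic first-atom example from McCarthy's 1960 paper (the M-expression line is notation only; the version below it is ordinary Common Lisp):

    ;; M-expression notation, with square brackets and semicolons
    ;; rather than nested parentheses:
    ;;   ff[x] = [atom[x] -> x; T -> ff[car[x]]]
    ;; The S-expression form programmers actually ended up writing:
    (defun ff (x)
      (cond ((atom x) x)
            (t (ff (car x)))))

The S-expression version is also itself data, which is a big part of why programmers stuck with it.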
I never understood why McCarthy was never featured at YC or Startup School. Listening to him and being able to ask him questions would have been like peering into the past and the future at the same time.
I'm the guy that called Stanford. Look, it's not that hard -- someone here must have talked to a Bay Area reporter about their company or their job at some point -- can you please tell her/him to follow up with Stanford and talk to Dan Stober at Stanford News Service and/or Andy Meyer in the Engineering Department? http://chime.in/user/toybuilder/chime/65979187159736320
Bloodbath? C'mon... that's a word reserved for the death of a bunch of beings, in a short time span, by an 'unnatural' cause. He apparently died peacefully and naturally.
It reminded me of the death of Push Singh a few years ago, just after I had started to know him personally. Rather than focusing on the woe, we tried to celebrate his life, his achievements, and the great inspiration he brought us. It's really difficult to handle such sadness, but I guess it's very important for the rest of us to think about how we may continue their good will to advance the way we live and think, to inspire others, and to change the world for the better.
This is an understandable impulse, but completely ignores the nature of the work itself, which is essentially Knuthian. I don't think there's anyone else in the world, let alone an associate of his, who could continue this work.
It's been years since I was around him (I worked at a Stanford spin-off startup in the early 80's that commercialized some TeX printing technologies), but it's hard to overestimate what he's accomplished. What a wonderful man!
John McCarthy was an exceptional man. I wrote a short article on AI, Siri, and the fourth interface. His concept of engineering an artificial intelligence will soon materialize. http://blabeat.com
The death has not been added to NNDB, so I'm slow to believe it. NNDB, for those of you who don't know, is the information source used by WolframAlpha for its people results.
This is very sad. John was a brilliant man who always made himself available not only to his students but also to random people (myself included) emailing him for research advice.
Thank you, John McCarthy. For your work, for your relentless curiosity, and for the inventiveness that helped us all. Without knowing it, you set the path for many of us.
53 years is like an eon in computing - probably older than most of us. It's hard to imagine, because it is still very much relevant today despite its old age.
Are there any modern people who make great contributions to science who are not Jewish? As a non-Jewish person, I'm thankful for what Jews have given the world. Thank you Mr. McCarthy.
I confess that when the news about Dennis Ritchie was posted, I immediately flashed to the thought, "who's number three?"
Thing is, so did a lot of other people, and tagging John McCarthy's WP page in this way would be just the sort of prank/hoax that would inspire. So the "deaths come in threes" aphorism makes me more suspicious of this, not less.
Why in G-d's name are you people hemorrhaging about who or where the source came from? As great a man as he was, this doesn't warrant conniption fits about who knew what first. You're acting like he held national secrets that upon his death would lead to some catastrophe. Give it a rest. The man was in his 80s and led a wonderfully productive life. Get back to work, assholes.
While there may be a bit too much bickering going on about it, having a factual source is important, rather than just believing a single tweet or Wikipedia edit.
[1] http://books.google.com/books?id=cTyfxP-g2IIC&lpg=PT113&...