Interview with Alan Kay (drdobbs.com)
203 points by gits1225 1979 days ago | 108 comments



Alan Kay is a brilliant guy, but I really wish he could open his mouth without shooting it off. In particular -- to speak to my own domain -- he bleats about "feature creep" in an operating system with absolutely no demonstrated understanding of a modern OS. Is KVM "feature creep"? Is ZFS? Is DTrace? He shows not so much as an ounce of understanding of why these things exist or empathy with those for whom they were created. I have great reverence for history (and Kay and I share an intense love for the B5000[1]), but I also think it's a mistake to romanticize the past. Viz., when he wistfully recalls the Unix kernel having "1,000 lines of code", he can only be talking about Sixth Edition: even by Seventh Edition (circa 1979!), the kernel had (conservatively) over ten times that amount.[2] And this is still a system that lacked a VM system and a TCP/IP stack -- and panicked if you created the 151st process![3] If he wants to criticize the path that history has taken, fine -- but when he is so ignorant of the specifics of that path, it's very hard to treat him as anything but a crank, albeit an eminent one.

[1] http://news.ycombinator.com/item?id=4010407

[2] http://minnie.tuhs.org/cgi-bin/utree.pl?file=V7

[3] newproc() in http://minnie.tuhs.org/cgi-bin/utree.pl?file=V7/usr/sys/sys/...


Cranks don't really put their money where their mouth is. If you haven't seen it yet, I'm sure you'll enjoy looking at the STEPS project: http://www.vpri.org/html/writings.php

In summary, the goal is a user-facing system in 25k SLOC, from kernel to GUI apps, including all the code for the compilers for the custom languages they made. They call the languages "runnable maths" or "active maths": they think up a minimal notation that specifies how a system should behave, and then implement a compiler to run that notation as code. They manage this through a compiler-writing language, OMeta.

For example, to implement TCP/IP, they parse the ASCII tables from the RFCs as code. Seriously, they just take them as is, and have a compiler to run them as code. The implementation is supposed to be around 200 LOC (including the code written in OMeta for their compiler).
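As a rough illustration of that idea (my own toy sketch in Python, not the VPRI/OMeta code): treat the RFC-style ASCII header diagram itself as the specification and "compile" it into field names and bit widths. The 32-bits-per-row convention and the parsing approach are assumptions for illustration only.

    # Toy sketch: derive field layouts from an RFC-style ASCII header diagram.
    DIAGRAM = """
    |          Source Port          |       Destination Port        |
    |                        Sequence Number                        |
    """

    def parse_header_diagram(diagram):
        """Turn each '|'-delimited cell into a (field name, width in bits) pair."""
        fields = []
        for line in diagram.strip().splitlines():
            cells = [c for c in line.split("|") if c.strip()]
            row_chars = sum(len(c) for c in cells)
            for cell in cells:
                bits = round(32 * len(cell) / row_chars)  # cell width ~ bit width
                fields.append((cell.strip(), bits))
        return fields

    for name, bits in parse_header_diagram(DIAGRAM):
        print(f"{name}: {bits} bits")   # e.g. Source Port: 16 bits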

There's also a graphics subsystem, apparently with all the functionality of Cairo (which is 20k LOC) in about 800 LOC.

Crucially, all this code is supposed to be very readable. Not line noise like APL. The expressiveness comes from having custom languages.


There was a previous discussion of the 200 line TCP/IP stack on hacker news: https://news.ycombinator.com/item?id=846028

I believe most of the code is visible in the linked article. The packet diagrams were redone but are very nearly identical to the RFCs, just some minor formatting tweaks. They are very readable.

The compiler (in OMeta) is a very BNF-like grammar.

The readability is striking. The first time I heard of it, I was expecting it to be extra dense and hard to parse. It turns out they have entire lines of just separators counting against their limit, which seems useless until you realize one of the main goals is readability.

Incredible work, I'm looking forward to the end result.


This style seems amazing, but I haven't seen any tutorials for people interested in learning how to do this themselves. Do you know of any such resources?


They are doing the real-deal thing - for each problem, designing a language in which to express the solution, and then writing the solution. Part of the point is that building such solutions isn't really teachable in the context of a tutorial. If a tutorial can teach it, then it's probably a pretty shallow skill, and shallow skills should get designed away into a solution-oriented programming language, with any tutorials written for THAT language, etc. etc...

The language they use to do that (OMeta) is then a language-defining language, and designing THAT means balancing on a stack of abstractions that is at least three levels deep, where each of those abstraction levels is hard and necessary. They achieve great simplicity by thinking damn hard.

On the other hand, the mailing list (as noted by the other respondent) is a pretty interesting place with few flamewars.


I think this tutorial does a pretty good job: http://news.ycombinator.com/item?id=4230995


This is exactly what I was looking for. Thanks.


Thank you - the light just went on for me! Is this what is meant by Domain-Specific Languages (DSLs)?


Ask around on their mailing list; they are quite helpful. I also asked a similar question from the viewpoint of the TECS book: http://www.mail-archive.com/fonc@vpri.org/msg01614.html


STEPS is cool, but IMO he should badmouth the entire industry after he ships it.


That would be nice, but I think STEPS is a research project that looks into how things can be done differently. It would be nice if this particular implementation resulted in a shippable product, but I doubt that is their primary goal.


Kay and Ingalls et al. had a very specific vision for personal computing as liberating, and the keys to that vision were full access, full comprehensibility and making no distinction between users and programmers. The market has clearly shown that people would rather not have full access, don't care about comprehending the system and desperately want a firmly regimented distinction between users and programmers.

For me personally, the thought of giving up entirely on this vision is repugnant, and it's important, even if such systems aren't going to make a grand reappearance anytime soon, to at least make sure that people who care about computing know that they once existed. I see that reminding as Kay's primary role. I'm not switching to Pharo for my day-to-day work (in my defense, I did try), but I find the philosophy appealing and don't see a compelling philosophical disproof. So it seems to me it could re-emerge someday, and a world built on these ideas would have different tradeoffs, and may be preferable.


Don't mistake the fact that normal users aren't interested in current toolchains and coding for a sign that they aren't interested in being able to solve problems creatively. Kay mentions HyperCard in this interview, which had a long history of drawing normal people into designing solutions to their problems from scratch.

Back in the mid/late 80s I used to do work for a public library and helped set up several public Macintosh computers that had, in addition to word processing and spreadsheet software, HyperCard installed. I was amazed how quickly people "got it". From kids, natch, to street people to soccer moms. And by "got it" I don't mean running the built-in stacks, but creating their own.

Before long I was helping people make some really interesting projects. Some of them were beautiful monsters, but others were very well thought out. And all of them solved specific problems, many of which would not have had enough general appeal to be marketable or even transferable to anyone else.

And there's real value in having that ability. But these days most people simply can't get past the gatekeeper of beginning complexity. To be sure, in HyperCard you quickly ran up against a wall of limited extensibility in terms of its objects. I can easily imagine better.

But Kay is right: the web and most modern OOP aren't it. And modern tools for software development are a nightmare.


The market has no incentive to create a system that hands users a microscope showing what is happening at the cellular level -- one that can demonstrate what is happening as a natural science, not as programming. To say the market has demonstrated that people don't care about comprehending is false: the market has not found a marketable way to sell explodable systems, and CS has not created runtimes that can illustrate their own functionality or be introspected into at a high-level, compelling systems view.

_The big problem with our culture is that it's being dominated, because the electronic media we have is so much better suited for transmitting pop-culture content than it is for high-culture content. I consider jazz to be a developed part of high culture. Anything that's been worked on and developed and you [can] go to the next couple levels._

I'm not sure that our media is necessarily better suited to transmission: I suspect we just lack mediums that make looking at what is happening evident and interesting (even with X years of programming experience, watching another person's webapp run is rarely fun or an obvious exercise). This is a failure of our runtimes and of the weak structures we use to build code.


I think implicitly Kay is talking about personal computing, not servers, and that in isolation is a very narrow perspective. But following his thinking for a moment: does a 'PC' need VMs, ZFS and DTrace? Apple has shown that, by removing many things that seemed essential, like a user-visible file system, you can build a personal computer that appeals to a lot of people. I firmly believe this is not the right tradeoff to make, even for personal computing in general, but people are throwing loads of money at them...


If you are talking about iOS... it still has the concept of a filesystem [1], albeit in a slightly different form than normal.

Even if Alan Kay is talking about PCs and not servers, you still need important things like the ability to connect to the Internet (TCP/IP), IPC, VMM, threading and multi-tasking, etc. I suspect I've not understood your point though, so feel free to clarify what I've missed :-)

1. http://developer.apple.com/library/mac/#documentation/FileMa...

2. https://developer.apple.com/library/mac/#documentation/perfo...


When I started using Unix in the mid-90s, RAM was still expensive, and the rule of thumb was that to get the best use of a workstation you should have 2x main memory as swap, 4x for a server. This was reckoned to give the best trade-off between the number of processes running and the working set you actually needed.

Shortly afterwards this stopped making sense because RAM got cheap. Should a 4G PC have a 16G swapfile? Should a 256G server have a 1T swapfile?
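For what it's worth, the arithmetic behind those numbers (a throwaway sketch of the rule of thumb quoted above; the 2x/4x multipliers are just the old heuristics, nothing more):

    # Old swap-sizing rule of thumb applied to the RAM sizes in question.
    for ram_gb in (4, 256):
        for role, multiplier in (("workstation", 2), ("server", 4)):
            print(f"{ram_gb} GB RAM, {role} rule -> {ram_gb * multiplier} GB swap")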

In short, virtual memory was a hack to get around economic constraints. If you have the main memory, use it! Were it not baked into the OS at a deep level, I'd have stopped using swap 10 years ago. It could be taken out of the next version of all the main OSs altogether, and no-one would even notice.


I didn't actually mean to add that second link in there (I realised you were talking about virtual machines and edited my comment accordingly... But forgot to remove the second URL!)

However, disk-backed page faults are but one component of a virtual memory subsystem. The real genius is that the OS can provide access to memory uniformly, in a way that is transparent to processes.


Indeed, but in practice paging to disk and memory protection are quite entwined now.


Not really. In Windows it's just a change to a system setting to disable the swap file, and in a Unix it's a config file change or a command like swapoff to turn it off.

The fact that the two are entwined now is because of system defaults and decisions by OS developers on how to implement the page replacement algorithms in MM subsystems.


swapoff -a


Bitter experience has taught me to have a swapfile of 1x main memory and to set swappiness=0 in the kernel.


Apple doesn't "remove a user visible file system", they just make direct file system manipulation not as much of a necessity. And yet, this is not why their OS is good: it's because, if you want to, you can manipulate the file system and view it in as much detail as a BSD (OpenBSD) server, with just as much ease, combined with great hardware-software compatibility.


I think konstruktor was talking about iOS.


The system still has the complete filesystem implementation, and despite it not being visible to end users I imagine life would be much harder for developers without it, at least without replacing it with something likely to actually be even more complex (in code, not in UI). I mean, it isn't like VM or DTrace is "user-visible"; if anything, it is less user-visible/intrusive than its predecessors.


Even 6th Edition has more than 10,000 lines, but I'm inclined to think the writer accidentally dropped a zero.


"A lot of people go into computing just because they are uncomfortable with other people. So it is no mean task to put together five different kinds of Asperger's syndrome and get them to cooperate. American business is completely fucked up because it is all about competition. Our world was built for the good from cooperation."

So much insight, so few words.


Kay may think that "American business is completely fucked up", but it's also undeniable that American business has been the greatest engine for technological progress the world has ever seen. Kay's anti-capitalism is showing, and it isn't particularly glorious to behold.


World War 1, 2 and the Cold War have been the greatest engines for technological progress the world has ever seen.


Ah yeah? How do you demonstrate that?

Looks like the industrial revolution had nothing to do with wars at all... strange, somehow stuff got invented anyway.

I'd be very careful about making the implication that "wars are good for technology". We have seen more advances in technology in the past 20 years than at any time before, yet that was already after the end of the Cold War. How do you explain that?


>wyclif: American business has been the greatest engine for technological progress the world has ever seen

>ekianjo: We have seen more advances in the past 20 years in technology than any time before, yet it was already the end of the Cold war. How do you explain that??

The US has kept itself pretty busy with warfare over the last 20 years[1]. The Cold War ended ages ago, and wars have been waged for quite some time since the end of it.

[1] http://en.wikipedia.org/wiki/List_of_wars_involving_the_Unit...


"Defense spending" as it is now, probably doesn't require actual war. The whole "war drives tech" idea is fueled by the idea that war causes society to funnel cash towards a singular purpose... RADAR, Manhattan Project, NASA, etc. Nowadays we are still spending a ton of money on the military, but think of what that spending would be like if WW3 were to break out (barring nuclear holocaust). That's the generic idea behind the 'war <=> tech' relationship.


Necessity is the mother of all innovation.

Wars just ensure such necessities occur at faster rates.


I disagree. Wars focus only on how to make destruction faster, more effective, and deadlier. They're not focused on saving people or making lives better.

The necessity from War is to stay alive and beat your foes. To this purpose, you burn your own economy to the ground (and the rest of the world with it) by going into massive spending and carnage.

Whatever innovation you actually get comes from a massive, unnatural diversion of resources that were in the pockets of free people before the war. So, of course, innovation can happen at an accelerated pace, but compared to innovation during peacetime it is far more costly and less effective during wartime, because you do not focus on the quality of your investments, and decisions are not taken on economic, rational grounds.

The cost of war is huge on society, and it's no surprise the US was almost bankrupt by 1944 (remember the pressure of war bonds?) and wanted to end it as soon as possible.

I'd be more interested in a reckoning of the technological opportunities missed BECAUSE of war. There are numerous cases of scientists who were disrupted by wartime events and had to drop their work in order to save themselves. There are things that you see and things you do NOT see.


  > Wars focus only on how to make destruction
There are lots of side benefits to the things that aren't directly weapons (e.g. RADAR). I'm definitely not encouraging war, but I think it's a fallacy to imply that the only things that come out of 'war research' are weapons. There are plenty of ways to make "destruction faster and more effective" without developing an actual weapon (e.g. making the supply chain more efficient, finding better fuels to power vehicles, etc).


Also examples of some of the greatest human meatgrinders and public-private war profiteering the world has ever seen. As a society we shouldn't remember the sweet without also remembering the sour.


I would be curious to know what makes you think that technological progress is happening because of competition. It might as well be despite competition, I see no rational way of telling for sure, as there are too many interdependent factors at work.

But I will say this: the academic world as I've been witnessing it is a world built on cooperation, and it seems to me to be the real driver of strong innovation. Another lucrative social website is not "strong innovation", but a new research algorithm that pushes the boundaries of what we can do, is. One is driven by profit, the other by curiosity. One happens because of competition, the other because of cooperation.


Academia is rife with competition. As they say, the battles are so fierce because the stakes are so small.

The open source community and Wikipedia are great examples of cooperation.

The biggest advances in business are usually made by people who are completely focused on customer needs and ignore the competition (Google up until its Google+ fascination, Elon Musk, Apple prior to Android's success. Note how Google hasn't made any advances from its competition with Facebook, and how Apple hasn't advanced the iPhone much since it started competing with Android).


I don't believe that competition is the primary driver of innovation either. But competition is an inevitable consequence of comparability. If you cannot compare outcomes of different approaches to the same problem you cannot improve the solutions. I also think that competition mitigates some social developments that lead to stagnation, like cronyism and lazy establishments holding on to resources without delivering any results.

But at the end of the day I think you are right to dispute the popular notion that somehow a cut-throat environment that pits everyone against everyone else is the most conducive environment for innovation. It just causes fear, and fear makes people stupid.


There's plenty of competition in academia, and plenty of cooperation in the industry. And they cooperate between each other a lot, too.

And while lucrative social websites are not innovation per se, you should consider they may be funding the next SpaceX.


There's capitalism in general, and then there's our particular implementation. And just because we've (mostly) done well thus far, doesn't mean we've done nearly as well as we could, or that our manner of making progress is economically/ecologically sustainable. (Disclaimer: I believe culture trumps both economic and political systems in measuring long-term societal success.)


Agreed for some value of progress. Some worrying aspects.

http://news.cnet.com/8301-13579_3-57468103-37/apple-bows-out...


"American business has been the greatest engine for technological progress the world has ever seen"

That doesn't mean it couldn't be far better. Which is the point Kay was getting at.

Lose your vision and striving for the ideal, and you inevitably find yourself sliding backwards.


I think it seems so poignant because the general phenomenon has been corrupting our society for a long time. I can't imagine that this is the first time disruptive and powerful technologies have subverted the moral codes that their inventors were ostensibly upholding.


"A side note: Re-creating Kay's answers to interview questions was particularly difficult. Rather than the linear explanation in response to an interview question, his answers were more of a cavalcade of topics, tangents, and tales threaded together, sometimes quite loosely — always rich, and frequently punctuated by strong opinions. The text that follows attempts to create somewhat more linearity to the content. — ALB"

Audio would be ace.


It seems like he's still stuck thinking about computers the way he was many years ago. Wikipedia isn't amazing because it doesn't have a WYSIWYG editor and can't run code on the (very few) pages where that's even applicable?

He doesn't seem to get that what makes the web amazing is the data and people connected to it. That's generally far more interesting to users than novel interfaces or whiz-bang features.

Google, Facebook, Twitter, Wikipedia, Yelp, etc aren't technically all that interesting or innovative. They're infinitely more transformative than nearly all software that came before them though.


> It seems like he's still stuck thinking about computers the way he was many years ago.

> Google, Facebook, Twitter, Wikipedia, Yelp, etc. ... infinitely more transformative than nearly all software that came before

Ah, pop culture, you make me laugh... you make me cry...


Calling something a name isn't much of an argument.


Why so?


Not the GP, but the quote gives undue importance to a number of websites that have been successful. You have to consider whether some earlier software (say, Unix and the C compiler) was more instrumental to the transformative effects of software on humanity, or, say, the software written at Xerox PARC, or the original TCP/IP work.


It depends on your measure. It could be argued that it's not undue weight, as these sites have had an impact on such a massive proportion of the world's society. For instance, gcc and Unix didn't allow for the sort of direct communication that Twitter enabled - look at the Arab Spring, for example.

That the technologies you mention enabled the websites mentioned is not in doubt, but without the websites and related technologies would things like TCP/IP and Unix have had such an impact?


Seems that my comment is controversial, as someone has voted me down. I'm curious to hear opposing views - I love technologies like TCP/IP, gcc, Unix, C, etc. - but I think they wouldn't be much use unless they were used to enable applications like Twitter, Google, Nethack and Wikipedia.

I do want to clarify that I'm not saying that the underlying technologies aren't amazing.


I think the debate is whether TCP/IP would be useful without Google, or Wikipedia would have happened without TCP/IP. Personally I would say that IP changed the way people think about information, which made Wikipedia seem plausible.


> He doesn't seem to get that what makes the web amazing is the data and people connected to it. That's generally far more interesting to users than novel interfaces or whiz-bang features.

Oh, the web has got plenty of whiz-bang features, all right. My problem with it as a technology is that, compared to the Internet, it feels like an ad hoc concoction of such: clunky, buggy, and a PITA to work with.

I shudder to think what my programming experience would be if the other layers in the stack (PC hardware, Unix, TCP/IP, etc.) were built the same way.


> I shudder to think what my programming experience would be if the other layers in the stack (PC hardware, Unix, TCP/IP, etc.) were built the same way.

But it is!!

I'd speculate you just don't know it as well as you know web programming.

PC hardware.. do I really need to rehash the whole "the x86 architecture is horrible compared to Alpha/ARM/68000" discussion?

Unix.. The original "Worse is Better"[1] essay was written about how Unix beat Lisp machines because it was worse!

TCP/IP.. Well, there's that whole IPv6 thing where they try to fix some of the problems with IP[2]. Even that ignores many of the fundamental problems, though.

All software is hacky when you get close enough. Once it has been around long enough the terrible corners get broken off and people get used to the problems, and can't imagine another way.

[1] http://dreamsongs.com/WorseIsBetter.html

[2] http://en.wikipedia.org/wiki/IPv6


> I'd speculate you just don't know it as well as you know web programming.

My background is in systems programming and low level stuff; I've only been a web-programmer for two years.

I agree with your points; all those technologies are not perfect. Their limitations are mostly the result of unexpected explosive growth (the 640KB barrier, depletion of the IP address space, etc.). But they feel soundly engineered overall.

I just don't get the same feeling working with the web-related things (browsers, JS, AJAX, CSS, cross-domain policies and hacks to bypass them, etc, etc.). They all feel ad-hoc and it shows in the codebase I have to work with nowadays: it's often ugly, buggy and hard to maintain.

As a systems programmer, I've been amazed by the ingenuity of things like process forking, pipes, the proc file system, regular expressions, lexer and parser generators (the list goes on). But I can't remember a single web-related technology that made my jaw drop in amazement.

And I don't even want to get started on the difference in the quality standards and average mindset of hardware/OS design and web development (the whole "agile" and "extreme" programming thing and such).


>>But they feel soundly engineered overall.

They 'feel' so because you are made to believe so. We haven't seen anything better to compare them with. But those who have, talk about it.

>>As a systems programmer, I've been amazed by the ingenuity of things like process forking, pipes, the proc file system, regular expressions, lexer and parser generators (the list goes on). But I can't remember a single web-related technology that made my jaw drop in amazement.

What does "web" mean here? Back-end engineering is very much a part of the web. And most of the things that you talk of, like parsing, regular expressions, and pipes, are all there in languages like Perl.

Perl is known as the Swiss Army knife of the Internet.


> But I can't remember a single web-related technology that made my jaw drop in amazement.

http://bellard.org/jslinux/

Maybe not what you meant, but still pretty jaw dropping.


Jaw-dropping for the wrong reasons IMO. Reinventing the virtual machine poorly in a web browser is an impressive feat given the technical limitations but what does that actually prove? The demoscene is far more impressive in that regard.


All of the companies you cite ARE technically innovative - Google created map-reduce (amongst many other things); Facebook stores, indexes and retrieves billions of images; Twitter contributed real-time graph processing (Storm) to the open-source world, and mostly solved fan-out on a massive scale; Wikipedia - well, it's Wikipedia, and it's done on a tight budget, to boot.

Everyone's standing on the shoulders of giants, but the idea that the contributions of top engineers and scientists today are somehow less than those that came before is ridiculous - Kay seems hurt that people have taken the ideas and run with them, and doesn't seem to get that not everyone is as smart as he is - and that's fine, we can still have something to offer.


> Google created map-reduce

Only if they also created a time machine; map and reduce are in the Common Lisp spec from '84 and doubtless pre-date that by a decade or two.

> mostly solved fan-out on a massive scale

Well, if you mean created a poor copy of what Teknekron (now known as Tibco) were doing in '86, and they were mainly commercializing research from the '70s.


It's worth noting that Google's "MapReduce" is not type-equivalent to the map (roughly 'a list -> 'b list) and reduce ('a list -> 'b) found in Lisp and functional languages. It is actually, in a way, an opposite. That is, the map (roughly 'a -> 'b list) in MapReduce is more like the dual/opposite of reduce (i.e., an unfold). And reduce is, well, a complicated beast.

I think the ideas behind MapReduce are not difficult, but nor are they necessarily inevitable. The innovative part of MapReduce is Google getting to a place where the constraints force that approach as the obvious one. MapReduce is impressive.
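To make the type distinction concrete, here is a small sketch in Python (my own illustration, not Google's API): functional map/reduce folds a list down to one value, whereas a MapReduce-style mapper emits a list of key/value pairs per input, and the reducer then runs once per key over the grouped values.

    from functools import reduce
    from collections import defaultdict

    # Functional style: map :: (a -> b) -> [a] -> [b]; reduce folds [b] -> c.
    print(reduce(lambda acc, x: acc + x, map(lambda x: x * x, [1, 2, 3]), 0))  # 14

    # MapReduce style: the "map" step emits a list of (key, value) pairs,
    # and the "reduce" step runs per key over the grouped values.
    def mapper(line):
        return [(word, 1) for word in line.split()]

    def reducer(key, values):
        return key, sum(values)

    def map_reduce(inputs, mapper, reducer):
        groups = defaultdict(list)
        for item in inputs:
            for key, value in mapper(item):   # "shuffle": group values by key
                groups[key].append(value)
        return dict(reducer(k, vs) for k, vs in groups.items())

    print(map_reduce(["to be or not to be"], mapper, reducer))
    # {'to': 2, 'be': 2, 'or': 1, 'not': 1}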


Doing search well is really hard. Google's simple interface hides some incredibly challenging and innovative work. And not just search, it's the integration of all the pieces, like translate, maps, etc. I'm sure it increases world GDP.

As for the others, yes, technically they are just plumbing; their value is more of a social innovation, and very important in that way.


A nitpick - "world GDP" is an oxymoron.


? World GDP is a well-defined economic measure, used by economists all the time.


GDP expands to Gross Domestic Product which doesn't make sense if you're talking about more than one country at a time, since most of that number will really be Gross Foreign Product :)


http://www.economist.com/blogs/dailychart/2011/05/world_gdp

Better go correct those guys at the Economist too.


Done. The Economist is wrong too.


During the course of the interview, it was clear Kay was very much talking about computers as consumer devices. He's really fixated on making them far easier to interact with and giving them UIs that do not require users to adopt the computer user mindset to get work done. Getting around that need is part of his attachment to WYSIWYG, ease of use, simplicity of design, and the rest. That quest for ease of use was a theme present in many one-off remarks on unrelated topics and tangents.


Wikipedia is great as an information resource. But what does its format do that Gutenberg didn't do 400 years ago? It has color pictures - that's all that separates it from a woodblock print encyclopedia.

It's running on a ####ing computer and yet can't use any of the features of that computer - just because the browser is an app and the job of the OS is to protect itself from apps.


> It has color pictures - that's all that separates it from a woodblock print encyclopedia.

Well, that and it fits on a desktop instead of in 100 monasteries connected by donkey carts.


And it's very up to date!

Usually.


I fully agree with him.

The web was designed for reading documents, not for shoehorning an operating system into the browser.

Thankfully the mobile world is bringing back the desktop applications concept, while using the network for communication protocols as it is supposed to be.


That's the exact opposite position from what Kay said:

"...what you definitely don't want in a Web browser is any features.... You want to get those from the objects. You want it to be a mini-operating system..."

Essentially he's saying that the way we think of browsers is holding the web back. We could have collaborative WYSIWYG editing of Wikipedia or blogs, for example, but instead we're stuck with just reading documents.

Perhaps it would be correct to say that the web was designed for reading documents, but it shouldn't have been (at least according to Kay).


Excellent interview - Kay has some hard truths to tell and calls out a lot of BS.


Totally agree with you, but looking at the comments above it seems like we've been imprisoned for so long in this glorious cave called the World Wide Web that we've forgotten how we could have transformed the real outside world. (http://en.wikipedia.org/wiki/Allegory_of_the_Cave)


I think I can guess the answer, but do you think that perhaps there could be any other reason, just any at all, that people don't see it your way?

Once you've started explaining away why people disagree with you using that kind of rhetoric, it's really hard to learn anything from them.


> but do you think that perhaps there could be any other reason, just any at all, that people don't see it your way?

Of course I'm aware of that; I'm not that self-centered :)

> Once you've started explaining away why people disagree you with that kind of rhetoric, it's really hard to learn anything from them.

English is not my primary language, so I'm not 100% sure I understand what you're saying, but in case you're suggesting that I'm full of what will turn out to be empty rhetoric, then of course you have the right to your opinion.

Anyway, I find it crazy that a guy like me (I think I'm in HN's bottom bracket when it comes to programming or CS ability) is somehow sort of "defending" a guy like Alan Kay. I'm only doing it for the energy, the passion, the whatever-you-want-to-call-it that always gets to me when I'm reading Alan Kay's interviews. I agree it's something subjective.


There's a compelling quote within the first page: "But the thing that traumatized me occurred a couple years later, when I found an old copy of Life magazine that had the Margaret Bourke-White photos from Buchenwald. This was in the 1940s — no TV, living on a farm. That's when I realized that adults were dangerous. Like, really dangerous. I forgot about those pictures for a few years, but I had nightmares. But I had forgotten where the images came from. Seven or eight years later, I started getting memories back in snatches, and I went back and found the magazine. That probably was the turning point that changed my entire attitude toward life. It was responsible for getting me interested in education. My interest in education is unglamorous. I don't have an enormous desire to help children, but I have an enormous desire to create better adults."

Here's a link to the photos (some unpublished) that he refers to: http://life.time.com/history/behind-the-picture-bourke-white...


Why was the title of the post renamed? Should titles be boring and dry instead of an informative one-line summary of the article that people might find interesting enough to read?

Look what happened to the previous submission of the same article here on HN: http://news.ycombinator.com/item?id=4224966. Such a good article went unnoticed because the title was plain unattractive.

Can we have a little bit of 'submitters freedom' here?

P.S: Original title was: "Internet by Brilliant minds, Web by Amateurs, Browser a lament"


I actually think it was a good rename. He said a lot more than that, and sensational titles aren't really needed on HN.


I do agree about 'cheap' sensational titles, but I don't think mine was one. I think it was a better title, more in line with what Alan Kay emphasized in the article, giving it a bit of context to interest someone in reading a great interview, instead of just a generic 'Click Here' like the previous submitter's, which led people to think it was not worth clicking through.


Fair comment - on further reflection I agree: the title was reasonable, as it highlighted the emphasis placed by Alan Kay. It wasn't the title that was sensational, it was what Alan Kay said in the interview. Apologies for suggesting otherwise!


Although the new title is bland, it got me to read it. I likely would have skipped the original unless "Alan Kay" was in there somewhere.


Yes it was "Internet by Brilliant minds, Web by Amateurs, Browser a lament - Alan Kay"


Browsers were built around the idea of web pages, not web apps. The security model works for web pages but precludes powerful web apps. There needs to be a way for a user to grant UDP, TCP or POSIX power to a web app, because WebSockets are not TCP, and IndexedDB is not POSIX and never can be. Just UDP, TCP and POSIX and we're done. Let the community build everything better and faster on top.


If they reinvented what Engelbart did, we'd be way ahead of where we are now.

I give you http://www.dougengelbart.org/about/hyperscope.html, implemented in 2006 on that horrible web thing Kay complains about. So there is that...


If the web had proper time to incubate without the pressures of the dotcom craze, I suspect it would have developed in a much more controlled, and systematic way. As we know, development was very chaotic and competitive.

I do think we will get to a better place eventually. After all, the web is still a relatively new medium, especially when compared to the "Internet".


"You can't go to heaven unless you're baptized." Actually, no. Perhaps if you are Roman Catholic (and even then that's questionable); Evangelical Christians either believe that you don't do anything to get to heaven and Jesus saves you, or you choose to believe that Jesus Christ died as one perfect sacrifice for the forgiveness of those who follow Jesus.

You may not agree, but I'm pointing out that the statement by Kay is pretty inaccurate.


There are many interpretations of what it means to follow Christ. The rational interpretation is Christ's own interpretation, the historical gospel story, from the beginning of the Old Testament right through to the end of the New Testament. The Scriptures are historical, they provide the facts of the events. Thankfully they are also theological and interpret the meaning of the events. Did it happen? What does it mean?


Fundamentalist radical literalist Christian here :) It's simple: when Jesus and two thieves were being crucified together, he told one of them that he would see him in paradise that day. Ergo, baptism is not a necessary requirement of getting to heaven.

https://www.biblegateway.com/passage/?search=luke%2023&v... (I would start around verse 32)


Agreed - however, I was merely pointing out that Alan Kay stated that Christians believe "You can't go to heaven unless you're baptized", which is about as far as possible from the position of such a considerable number of Christians that it's unfunny!

So it's not that I disagree with any assertions that there are many interpretations of what it means to follow Christ, but that I disagree that the statement was accurate for all Christians. It is my opinion that a very small minority of Christians really believe that.

I realise that I've been voted down for pointing this out, but I thought it was an important viewpoint to bring forth in the discussion, given that Alan Kay made it the basis for his "Socrates should go to heaven" argument. Alan Kay might be excellent as a Computer Scientist, but his digression into the Christian religion was remarkably uninformed.


The problem with arguing over what Christians actually believe is that there are so many different beliefs that all call themselves Christianity. You can find some group that believes almost anything, and wonders why everyone else doesn't believe it. Even really unusual beliefs most people never even consider, like the idea that the Old Testament "God" is Satan in disguise, have adherents, or at least had them at some point. (I'm guessing other popular religions are similarly hard to argue about, but Christianity is the one I'm most familiar with.)


Is this how he really said it, or was the author paraphrasing?


I haven't read too much Alan Kay before, but this interview made me realize that he's not really the inventor of object-orientation as we currently think of it. He says as much directly in the interview:

"And that's true of most of the things that are called object-oriented systems today. None of them are object-oriented systems according to my definition. Objects were a radical idea, then they got retrograded."

It sounds like his conception of "objects" is a more user-facing thing; an object is something you see on your screen and can interact with and manipulate by programming. This view blurs the line between users and programmers. I'm not sure this is a terribly realistic model for how normal people want to interact with computers. For all his disdain for the web, it has succeeded in bringing content to over 2B people worldwide, most of whom wouldn't have the first idea what to do with a Smalltalk environment.


> I haven't read too much Alan Kay before, but this interview made me realize that he's not really the inventor of object-orientation as we currently think of it.

Alan Kay really is the inventor of object-orientation. But when the concept of "type-bound procedures" was added to compiled languages like C (making it C++), they called it object-orientation, even though type-bound procedures are a different thing from object-orientation. Real object orientation is about sending messages between objects; those messages do not have to be predefined, they are not dependent on type hierarchies, they can be arbitrary, they can be relayed, ignored, broadcast, etc. You do not need classes or types at all to be object-oriented by Kay's definition.
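A minimal sketch of that message-passing view (in Python, with made-up names; Smalltalk's actual mechanism is its own thing): the receiver decides at runtime what to do with any message, including ones nobody predefined, and nothing depends on a type hierarchy.

    # Objects receive arbitrary messages and decide how (or whether) to respond.
    class Account:
        def __init__(self, balance=0):
            self.balance = balance

        def receive(self, message, *args):
            handler = getattr(self, "msg_" + message, None)
            if handler is None:
                # Roughly analogous to Smalltalk's doesNotUnderstand: the object
                # itself chooses what to do with an unknown message.
                return f"message {message!r} not understood (ignored)"
            return handler(*args)

        def msg_deposit(self, amount):
            self.balance += amount
            return self.balance

    def send(receiver, message, *args):
        # Late-bound dispatch: the message is just data until the receiver acts on it.
        return receiver.receive(message, *args)

    acct = Account()
    print(send(acct, "deposit", 100))  # 100
    print(send(acct, "fly"))           # message 'fly' not understood (ignored)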


Nothing you said contradicts my comment at all. I said "he's not really the inventor of object-orientation as we currently think of it." What we currently think of as object-orientation is not what he invented. I don't see what's controversial about this, since he said exactly this in the interview, and I'm not understanding the downvotes to -3. The second part of my comment was expressing basically the same sentiment as this comment, which was not downvoted into oblivion: http://news.ycombinator.com/item?id=4229788


You also said

> It sounds like his conception of "objects" is a more user-facing thing; an object is something you see on your screen and can interact with and manipulate by programming. This view blurs the line between users and programmers. I'm not sure this is a terribly realistic model for how normal people want to interact with computers.

Which misrepresents his contribution rather badly.


From the interview:

KAY: We didn't use an operating system at PARC. We didn't have applications either.

BINSTOCK: So it was just an object loader?

KAY: An object exchanger, really. The user interface's job was to ask objects to show themselves and to composite those views with other ones.

BINSTOCK: You really radicalized the idea of objects by making everything in the system an object.

KAY: No, I didn't. I mean, I made up the term "objects." Since we did objects first, there weren't any objects to radicalize. We started off with that view of objects, which is exactly the same as the view we had of what the Internet had to be, except in software.

I realize that objects in Smalltalk were not all graphical, but this concept of objects as graphical entities seems to be near and dear to his heart.


The interview does not discuss Kay's contributions in any depth. You have misunderstood them as a result of making weak inferences and not doing any further research.


You're right that I don't know his work that well and probably extrapolated too much from this interview. I just get a little irritated at people (even smart, famous people) who criticize successful projects like the Web or Wikipedia for not being good enough, or inferior to their own work, without acknowledging that their success in the marketplace shows that they must have done something right.

I'm glad he's working on STEPS; I'm eager to see him push the boundaries of what is possible, and if it succeeds, it will validate his ideas. But Smalltalk and Squeak have been around for decades, and yet the Web and Wikipedia are orders of magnitude more popular. So why does he have to bash their creators as "amateurs" or "lacking imagination" when their ideas have caught hold in a way that his work has not? What does he have to back up this criticism? Sure, a lot of the ideas from Smalltalk and his early work on object-oriented design have influenced other programming languages, but he himself says that the way in which object-oriented design evolved runs counter to his vision, not in support of it.


You need to read Alan Kay.


Maybe so, but this interview didn't really inspire me to. Almost nothing he had to say made any sense whatsoever. Wikipedia has a failure of imagination because the Logo page doesn't let you write Logo programs? What's wrong with Wikipedia being an encyclopedia and linking to some web page that is a full-on Logo programming environment?

I'm not really sure what I got wrong that explains the downvotes to -3.


You're mistaking this particular example for the general idea. What he means is that computers can do so much more than display text-based content. And he not only wants you to be able to interact with information, he wants you to be able to create that kind of interactivity without becoming a real programmer first. To give a different example than the one about Logo: imagine a Wikipedia page about a well-known mathematical principle. You can explain it with text and some images, but you could have explained it just as well on paper. Instead you could explain things in ways that only a computer enables you to, for example like this: http://worrydream.com/#!/KillMath But that still scratches the surface compared to the stuff that Alan Kay wants to be the norm. I advise you to read Alan Kay's work with that in mind: http://www.vpri.org/pdf/tr2011004_steps11.pdf


Sure, it would be amazing if the knowledge on Wikipedia could be presented to me in a more interactive and enlightening way. But there are a lot of hurdles between here and there. Interactions take a lot of talent, skill, and work to design -- will Wikipedia contributors have this skill? The possibilities are much more open-ended than a simple encyclopedia article, will Wikipedia contributors be able to collaboratively design and refine such a thing as easily as they write and revise a simple article? The code that implements those interactions needs to be sandboxed -- how do you prevent a random Wikipedia contributor from modifying the interaction's code to steal the viewer's Wikipedia credentials?

I'm all for Alan trying to make his vision a reality with the STEPS project, and I truly am interested to see if he demonstrates a new way of thinking about computing. But when it comes to ideas, the proof is in the pudding: the web is enormously successful, Wikipedia is enormously successful, so his criticism of them rings hollow.


> will Wikipedia contributors have this skill?

The problem is that right now it is simply too hard to create this kind of interactivity. But in my opinion it does not necessarily have to be more complex than creating or maintaining a corporate spreadsheet.

> will Wikipedia contributors be able to collaboratively design and refine such a thing as easily as they write and revise a simple article?

Absolutely.

> how do you prevent a random Wikipedia contributor from modifying the interaction's code to steal the viewer's Wikipedia credentials?

The same ways as you would for text or image based content.

> and I truly am interested to see if he demonstrates a new way of thinking about computing

Even when he is unable to demonstrate a new way of thinking about computing, I am sure we all can agree that a lot of things can still be improved. It would be a shame if the best we can do on computers is presenting information like we do on paper.


> Maybe so, but this interview didn't really inspire me to.

Try this one: http://www.tele-task.de/de/archive/video/flash/14029/


Thanks for that, I've bookmarked it and will watch it.



