The Deep Insights of Alan Kay (servicestack.net)
191 points by mythz on Feb 27, 2013 | 95 comments



Thanks for this post; the importance of Alan Kay's work can hardly be overstated. Here's another quote by him that I find interesting:

I invented the term Object-Oriented and I can tell you I did not have C++ in mind.

Whether or not you share the disdain for C++ that is implicit in this quote, I believe it is important to understand the difference between the way OO was originally conceived of and the way C++ interprets it. I believe the best way to understand the difference is to look at object-variable binding. Smalltalk and almost every other OO language that followed it uses reference semantics: the assignment

x = y

causes the variable x to release its binding to whatever object it is currently referring to, and to refer to the object that y is referring to instead. C++, on the other hand, is pretty much the only OO language that uses value semantics: the intended meaning of the assignment

x = y

is to copy the state of the object to which y is referring over to the object to which x is referring. The older and wiser I get, the more I am inclined to believe that this attempt to marry OO and value semantics was an experiment that has failed. Even if you do not believe this, you'd have to admit that the collective amount of time and effort that the C++ community has spent on dealing with the assignment operator is simply staggering. (I would have made that last statement even two years ago, and now we have rvalue references.)
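
To make the distinction concrete, here is a minimal sketch in Python, which (like Smalltalk) uses reference semantics; the explicit copy at the end stands in for what a C++ copy assignment does implicitly. The Point class and the variable names are just made up for illustration:

    import copy

    class Point:
        def __init__(self, x, y):
            self.x, self.y = x, y

    a = Point(1, 2)
    b = Point(3, 4)

    # Reference semantics (Smalltalk, Python, Java objects, ...):
    # assignment rebinds the name; no object state is copied.
    x = a
    x = b             # x now refers to the same object as b
    x.x = 99
    print(b.x)        # 99 -- x and b are one and the same object

    # Value semantics (what C++'s "x = y" means via the copy-assignment
    # operator): the state of one object is copied into the other.
    x = copy.copy(a)  # the explicit copy stands in for C++'s implicit one
    x.x = 0
    print(a.x)        # 1 -- a is untouched; x is an independent copy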


The object model of C++ comes almost entirely from Simula rather than from Smalltalk. Simula's object model is less dynamic, and the language itself is based on Algol rather than being built from scratch like Smalltalk. In many ways, Simula's object model, rather than Smalltalk's, has been the most influential, through C++ and later Java and C#.


Can't be. Simula wasn't invented by Alan Kay.


Yeah, but Java has primitives and thus the value semantics you mention. Also, Java is much cleaner than C++, and on the scale of object-ness Java ranks higher than C++, but only a little bit.


True, Java has primitives, but I believe it is pretty much decided that they will be discontinued beginning with Java 8. (The fact that the wrapper classes like Double and Integer are immutable makes them behave sort of as if they had value semantics, but that's really an orthogonal issue.)


I've admired Alan Kay's work for years, but I wish he had another Jobs/Apple to turn his ideas into products. His work on the GUI was put into the Mac, Smalltalk directly influenced Objective-C, and the Dynabook is clearly a predecessor to the iPad. I don't think VPRI is a similar vehicle for new work.

It's a really tough dichotomy: people who invent world-changing things very rarely have the skills (or desire) to spend years polishing and shipping. I think almost all of academia is a great example of this, where invention is praised over innovation.

I'm not saying this is bad. Researchers excel at doing research, and should focus in that domain. But papers and conference talks don't change the world. It takes real products, which are the result of substantial non-idea work. Even Doug Engelbart's famous demo took decades to actually build at scale.

One of the most important things about PARC may have been that it was a pairing of research-like projects with real world engineering accomplishments. For several years, I've felt like Google is on the verge of this as well. Glass may be their first shipping experimental product, so we'll see.

It'd be interesting if Alan took a research position at Google. Hal Abelson (co-author of SICP, founder of Creative Commons, prof at MIT, etc.) spent a sabbatical there and shipped App Inventor for Android, which essentially took ideas from the Scratch programming environment and made them work for Android. Something like Smalltalk on top of App Engine could be really awesome.

    When the Mac first came out, Newsweek asked me what I [thought] 
    of it. I said: Well, it’s the first personal computer worth 
    criticizing. So at the end of the presentation, Steve came up 
    to me and said: Is the iPhone worth criticizing? And I said: 
    Make the screen five inches by eight inches, and you’ll rule the world.

    -- Alan Kay


Why would it help to have Smalltalk on an opaque, proprietary server platform which absolutely locks in every project that is coded for it?


Where on the web can you run Smalltalk right now? Where is the Heroku/Rails for Squeak?

And before you write it, I'll save you the time. Here's the typical HN response to this question:

    Just go get an EC2 instance and apt-get a bunch of shit and 
    then figure out how to use a CLI to push your code there and 
    add framework dependencies and learn Postgres and configure it 
    for DNS and oh and you probably need a credit card. Not that 
    hard, srsly guys. Also, Dropbox is just git with a shiny frontend.

There is an unfathomably monumental difference between not being able to do something at all and doing it in a way that is constricted and limited from the perspective of an expert. That difference is most dramatic when it comes to learning, exploration, experimentation, and imagination.

Lego Mindstorms was proprietary and I loved the shit out of that-- probably was the primary reason I dove into science and engineering. And to your point, everything I built with Lego was (literally) locked into the Lego world. But by the time it actually mattered, I had moved to a legit machine shop. But those plastic blocks laid the foundation for construction. That's what I really want Alan Kay's work to do: lay a new foundation for teaching people how to think about computation and symbolic manipulation with computers.


Have you ever heard of AppScale?


Not many people know this, but the iPhone was a stop-gap on the way to developing the iPad. I know a chap who was on the iPad R&D team, as an intern, before the iPhone came out.


You don't need an insider, Steve was saying it publicly years ago:

http://arstechnica.com/apple/2010/06/jobs-the-ipad-begat-the...


It's also detailed in Jobs' biography.


It's funny how various folk cite Alan Kay's Dynabook "vision" but plenty of people had this "vision" (witness the "prior art" for the iPad in "2001: A Space Odyssey" or Hari Seldon's electronic notepad in Isaac Asimov's "Foundation"). Steve Jobs was designing notebooks and tablets on a sketchpad when the Mac was being developed. Bill Atkinson had a similar "vision" which led to his developing HyperCard (because it could be done at the time).


Get your history straight:

"In 1968 — three years before the invention of the microprocessor — Alan Kay stumbled across Don Bitzer's early flat-panel display. Its resolution was 16 pixels by 16 pixels — an impressive improvement over their earlier 4 pixel by 4 pixel display. Alan saw those 256 glowing orange squares, and he went home, and he picked up a pen, and he drew a picture of a goddamn iPad. And then he chased that carrot through decades of groundbreaking research, much of which is responsible for the hardware and software that you're currently reading this with."

http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesi...

Alan Kay is the goddamn Carl Friedrich Gauss of Interaction Design (Leonhard Euler was already taken by Douglas Engelbart).


Go read "Foundation" by Isaac Asimov and note the description of Hari Seldon's notepad. I believed it was written in the 50s, but Wikipedia says 1942. (Maybe 1942 refers to a short story that became part of it.)

In WWII Douglas Engelbart was trained as a radar operator. After WWII he was exposed to computers (driven by punch cards and tape, and emitting printed output) and immediately thought "these should interact via cathode ray tubes".

It seems to me that anyone with imagination who could grok what a computer was immediately imagined the computer being embedded in any information device they could think of -- whether it's a watch, a notepad, a telephone, or the human brain.

The "light pen" was invented in 1952. Do you think the inventor's "vision" was that it be part of a monstrously big, complex, and expensive piece of hardware? (Do you think he/she hadn't read "Foundation"?)

http://en.wikipedia.org/wiki/Light_pen

Alan Kay has done many amazing things, we don't need to fight about this specific thing.


Writing science fiction is one thing, pursuing your vision and getting it made into a reality is entirely another. But you're right - we are probably more in agreement than disagreement.


I was mainly overreacting to the original article, which is excessively worshipful. It's hard to argue against Alan Kay's importance, but likewise Donald Knuth, Engelbart, and Kernighan (among the living). All these guys relentlessly pursued pieces of the vision that makes the Macbook Pro I'm typing this on possible. The visible form factor is, I think, almost the least important (and most obvious) piece of the puzzle. After all, the typewriter and the notepad are the same form factors as my laptop/PC and iPad.


Not that funny -- it was Alan's dissertation, a quite serious work, and it was done in... 1968!

In fact, you could say it was his push for the Dynabook that helped usher in the personal computer as well as WYSIWYG, etc.


When Alan Kay started grad school at Utah, he was handed a bunch of papers on Sketchpad, which was done in 1963 :) So the dominoes started falling... Sketchpad inspires both Smalltalk and GUI computing.


Do you know of any modern attempts at SketchPad?


There is a Squeak-based faithful recreation of Sketchpad at around 12:30 in Alan Kay's talk here: http://amturing.acm.org/acm_tcc_webcasts.cfm.


The tech startup system is how publicly-funded research gets monetized. As I understand it, you don't get funding for competing with companies in industry. At least I've read things from people like Knuth, who mentioned that a company was angry that TeX was backed by public funding and competed with their product.

Many academics do desire to build product. But would they get funding? Maybe someone in academia could chime in.


Alan is the most important person alive in Computer Science, showing us -loudly- the way forward, and he's largely ignored. I don't know of a more damning indictment than that.


This is all too common in the IT community. Instead of learning what those before us learnt, we get hung up arguing over made-up differences between industry and academia, or worry about the minutes it takes us to set up a new project in a given language.

We jump on what's popular for the sake of being "cool as well." We get hung up on minor syntax differences in languages. Or worse, in some cases we ignore real benefits of syntax simply because we are too lazy to take the week it takes to retrain ourselves to learn it. We fight about editors! Editors!

If all of that wasn't bad enough, the more ignorant we are the more we seem to aggressively champion a given technology and attack others without the backing of any serious thought. We become so attached to our favorite tools that they become beyond criticism of any form.


> Pop culture is all about identity and feeling like you're participating. It has nothing to do with cooperation, the past or the future — it's living in the present. I think the same is true of most people who write code for money. They have no idea where [their culture came from] — and the Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs.


To quote Alan Kay, we are too busy trying to reinvent the flat tire.


I agree that Alan Kay is brilliant, but to be completely fair, if Squeak is the way forward, I think I'm happy where I am. It's not bad, but the tradeoffs are such that it's hard to imagine ever using it for production software.


Note: Squeak/Smalltalk were revolutionary at the time and influenced many of the modern languages we use today. The way forward isn't to use Squeak/Smalltalk (which Alan also wanted to kill off), it's to be inspired by their ideas, embrace the good, rework the bad and to build something better.


Yes. Smalltalk was a work of genius. It seems like every good idea in modern languages I learn about turns out to have been poached from either Smalltalk or Lisp.


Ever use Objective-C? It's pretty much Smalltalk with C backwards compatibility.


I've been using Objective-C for about 12 years now. It is similar to Smalltalk in its message system and naming conventions — which, to be fair, were at the core of Kay's Smalltalk vision — but otherwise the experience of programming in Objective-C is not much like programming in Squeak.


That is no bad thing, because programming in Squeak absolutely blows programming in Xcode away.


It's Smalltalk but without the image, without browsers, without inspectors, without live objects, without...

Well... you get it...


If you're interested in seeing what Objective-C would be like with a more Smalltalky environment, you might find F-Script interesting.


"but to be completely fair, if Squeak is the way forward, I think I'm happy where I am."

Well, it's still the language from the year 1980. The very fact that you judge a 1980-vintage system as "not bad" compared to existing alternatives is telling in itself. Alan is working on a 2010 version even as we speak. :-)


I admire Alan Kay as much as the next guy and a lot of the stuff in this article looks great, but he doesn't need his lily gilded:

* Laser Printers

Yes -- invented at Xerox PARC.

* Object Orientated Programming / Smalltalk

No. OOP was invented in Norway (Simula) / Yes

* Personal Computers

Um... really? No.

* Ethernet / Distributed Computing

Yes Xerox PARC. / Not really

* GUI / Mouse / WYSIWYG

No (Engelbart et al) / No (Engelbart et al) / Yes

See: http://en.wikipedia.org/wiki/History_of_the_graphical_user_i...


See: http://www.parc.com/about/ and http://en.wikipedia.org/wiki/PARC_(company)

There is no question that Alan did invent the term "Object Oriented Programming" (and was rewarded with a Turing Award for his work) and also invented Smalltalk, which was well known to be inspired by Lisp, Simula, Logo, and Sketchpad.

The GUI, overlapping windows on a bitmapped screen, the WYSIWYG editor, etc. -- i.e. the personal computer as we know it today -- was first created at PARC; it later influenced Apple, which commercialized it.


Naming something isn't inventing it.

Kay's Turing Award citation was "For pioneering many of the ideas at the root of contemporary object-oriented programming languages", which might or might not include inventing "Object Oriented Programming" -- and, as it happens, it does not.

See: http://amturing.acm.org/award_winners/kay_3972189.cfm

Key quote from the article:

"While working on FLEX, Kay witnessed Douglas Engelbart’s demonstration of interactive computing designed to support collaborative work groups. Engelbart’s vision influenced Kay to adopt graphical interfaces, hypertext, and the mouse."

Incidentally, Doug Engelbart is still alive and a Turing award winner:

http://amturing.acm.org/award_winners/engelbart_7929781.cfm

"Engelbart slipped into relative obscurity after 1976 due to various misfortunes and misunderstandings. Several of Engelbart's best researchers had become alienated from him and left his organization when Xerox PARC was created in 1970."

History of OOP including Simula 67: http://en.wikipedia.org/wiki/Object-oriented_programming

Go look at the "Mother of All Demos" -- http://en.wikipedia.org/wiki/The_Mother_of_All_Demos -- which introduces the mouse, a MacDraw-like drawing program, and a mouse-driven word processor. This is the invention of the GUI that not only predates Xerox PARC but students from the lab that did this work went on to work at Xerox PARC. (And the Xerox PARC mouse was a clumsy, expensive, and unreliable device very similar to the crude device from the older demo -- the modern mouse was actually invented by an engineer contracted by Steve Jobs).

"the personal computer as we know it today was first created at PARC"

The personal computer as we know it today was invented by a lot of people over a period of time, although it is usually ascribed to Apple, Altair, IMSAI, etc. -- not Xerox. Xerox added a GUI to their $10,000 "personal computer" and that is a feature of today's personal computer, but so are scalable fonts (btw: that's Donald Knuth who deserves as much fanboi love as Alan Kay) and web browsers, neither of which were created by Xerox.


At MIT we had perfectly usable and reliable mice on Lisp Machines long before the Mac or Lisa existed. It is true that they were not mass-produced, however.


Interesting: when did the Lisp Machine actually ship as a product? And did it have a mouse from day one? According to the Wikipedia article it looks like something may have shipped in 1980 for $70,000 per unit...


MIT never made Lisp Machines as a product. The ones made at MIT were hand-made for use only at MIT. (Well, there was a robotic wire-wrapping machine that did a lot of the work automatically, and there was another robot that would automatically test the connectivity of all the wires. IIRC, it would take a couple of weeks for the robotic connectivity tester just to test one backplane.)

Most of the people involved with designing the Lisp Machines ultimately left MIT, however, to start companies to manufacture and sell them. The most prominent of these was Symbolics. There was also LMI.

And yes, Lisp Machines were very expensive personal computers. But computers for "professionals" were generally pretty expensive at the time. A timesharing computer that could handle 20 or 30 users could easily cost $1 million.

How much did the mouse for a Lisp Machine cost? IIRC, around $250. Economies of scale, and all that.... Yes, Lisp Machines always had mice.

Btw, it was the founding of Symbolics and LMI that prompted Richard Stallman to become a free software radical. He stayed at MIT, and spent a lot of his time porting back to MIT's Lisp Machine Lisp the improvements made to it by Symbolics. This is why Stallman came up with the GPL. He was galled that Symbolics had hijacked MIT's open code and had made it proprietary.


@nessus42: Thanks for the info. Another question is how intrinsic the mouse was to use of the machine and did it have more than one button? Part of the genius of the Mac design was doing with one button what Xerox did (badly) with three. (Yes, other companies later added more buttons all over again, but the Xerox UI, which I used briefly, required using different buttons for distinct operations in a way that, once you'd used a Mac mouse, made no sense at all.)


The mouse was completely intrinsic to the Lisp Machine. Though I'm sure that most people would consider the GUI rather ugly and primitive by today's standards. Think Emacs for X11.

The mouse was a three button mouse. The way it worked certainly made sense to me at the time, but it was made for nerds. Apple certainly did a ton of great work to make personal computers usable (and affordable) by normal people.


Apple added other buttons, but you had to use your other hand to press them to get shift-click, etc.


Commercially, the MIT-derived Lisp Machine was first sold in 1981, with a mouse. You could not use it without the mouse.

At that time Xerox was also using their own Lisp Machines internally -- Alan Kay knew those. They were later also sold as a product.


The personal computer first created at PARC was the Alto, not the Star, and from what I've read/seen, I agree with the quote.

E.g. https://www.youtube.com/watch?v=AYlYSzMqGR8 (I only watched the first minute or so)


Overlapping windows were first done by Apple. (See your own references)


Actually I tried chasing this up. Xerox did overlapping windows first and they performed abominably (they drew everything using painter's algorithm). They dropped overlapping windows as a performance optimization.

Then Andy Hertzfeld or Bill Atkinson implemented clipping in such a way that most drawing commands became no-ops when they were outside a clipping region (if you set a clip rect then call "fillrect" and it's outside the clipping rect then you do nothing, right?), allowing highly performant overlapping windows. There's a bit somewhere (folklore, Hackers? can't remember) where they showed this to Xerox folk who couldn't believe how well it worked.
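
Roughly, the trick described above is that once a clip rectangle is in force, a drawing call that falls entirely outside it can return immediately, so redrawing occluded windows costs almost nothing. A minimal sketch in Python of that idea (hypothetical names, not QuickDraw's actual API):

    # Sketch of the optimization: intersect every drawing call with the
    # current clip rect; if the intersection is empty, the call is a no-op.
    class Canvas:
        def __init__(self, width, height):
            self.pixels = [[0] * width for _ in range(height)]
            self.clip = (0, 0, width, height)        # left, top, right, bottom

        def set_clip(self, left, top, right, bottom):
            self.clip = (left, top, right, bottom)

        def fill_rect(self, left, top, right, bottom, color):
            cl, ct, cr, cb = self.clip
            left, top = max(left, cl), max(top, ct)
            right, bottom = min(right, cr), min(bottom, cb)
            if left >= right or top >= bottom:
                return                               # entirely clipped out: do nothing
            for y in range(top, bottom):
                for x in range(left, right):
                    self.pixels[y][x] = color

    c = Canvas(100, 100)
    c.set_clip(0, 0, 50, 50)
    c.fill_rect(60, 60, 90, 90, 1)    # outside the clip: no work at all
    c.fill_rect(10, 10, 20, 20, 1)    # inside the clip: drawn normally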


Actually, Bill Atkinson saw the PARC demo along with Jobs, and he thought Xerox had come up with an efficient implementation for overlapping regions. This inspired him to work on his own implementation for a problem he previously had considered too difficult. Only after Atkinson had finished his implementation did he learn that PARC was using a brute-force technique. (The Alto was a more powerful and expensive computer than the Lisa and Macintosh, so it could get away with a less efficient implementation.)


To me, one of the nice things about the semantics of real objects is that they are “real computers all the way down (RCATWD)” – this always retains the full ability to represent anything.

I remember 15 years ago when Philip Greenspun (http://philip.greenspun.com) first introduced me to the idea that a Web service is like an object -- it may seem obvious today, but at the time it completely changed my Web view:

The challenge is in realizing that the Web service itself is an object. The object has state, typically stored in a relational database management system. The object has methods (the URLs) and arguments to those methods (the inputs of the forms that target the URLs). The engineering challenges of Web development are (a) coming up with the correct data model for the object state, (b) coming up with a correct and maintainable organization of URLs, and (c) defining the semantics of each URL. By the time an individual page is constructed, the engineering challenge is over and it doesn't really matter whether you build that script in a simple language (e.g., Perl or Tcl) or a complex powerful language (e.g., Common Lisp or Java).

http://philip.greenspun.com/wtr/aolserver/introduction-1.htm...

Philip also discusses this idea in chapter 13 of his classic book, "Philip and Alex's Guide to Web Publishing" (http://philip.greenspun.com/panda/).
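
Here is a toy sketch of that framing in Python, with a dict standing in for the relational database and a hand-rolled dispatch table standing in for the web server's URL routing (BulletinBoard, /post, and /thread are made-up names, not anything from Greenspun's stack):

    # The whole service is one object: its state would live in an RDBMS
    # (a dict stands in here), its methods are URLs, and its arguments
    # are the parsed form inputs.
    class BulletinBoard:
        def __init__(self):
            self.state = {}            # stand-in for the relational database

        def dispatch(self, url, args):
            handlers = {'/post': self.post, '/thread': self.thread}
            return handlers[url](**args)

        def post(self, thread_id, text):
            self.state.setdefault(thread_id, []).append(text)
            return 'ok'

        def thread(self, thread_id):
            return self.state.get(thread_id, [])

    board = BulletinBoard()
    board.dispatch('/post', {'thread_id': '42', 'text': 'hello'})
    print(board.dispatch('/thread', {'thread_id': '42'}))   # ['hello']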

But as Alan Kay points out, this RPC-type model still hasn't evolved to a point where you can pass around self-contained objects that are RCATWD.


There's really no way to adequately summarize such a varied and content-rich article. You should really read it in full for yourself. But here are the section headings:

* On Software Engineering

* On Object Orientated Programming

* Tear it down and build something better

* On Messaging

* On LISP

* The Unknown side of Alan Kay


Anything I've ever read that Alan Kay has written either inspires me greatly or depresses me deeply. His "point of view is worth 80 IQ points" quote is something I live by. It applies to _every_ aspect of life. Then you listen to him talk about computer science, computing in general, and what the future might hold, and you end up wondering why we're using languages today that are essentially prettied-up versions of languages 40 and 50 years old (sweeping generalisation!)


reading that back, damn, the "depresses me deeply" is a terrible exaggeration, making light of anyone who actually suffers from genuine depression.


come now... not all depression is clinical.


I don’t really understand what he means by the POV quote. Could you explain it?


If "message passing between objects" is so great, then why do those distributed object frameworks (CORBA,(D)COM) not rule the world? Bad execution?


CORBA/DCOM/SOAP-RPC are examples of RPC-based services, i.e. exactly what Alan is advocating against.

Here's my definition of a message-based service: https://github.com/ServiceStack/ServiceStack/wiki/What-is-a-...

Here are some of the advantages: https://github.com/ServiceStack/ServiceStack/wiki/Advantages...

Here is the difference between WCF RPC methods and a message-based ServiceStack service: http://www.servicestack.net/files/slide-35.png (i.e. 6 RPC methods vs 1 Msg service)

Here is the difference between WebApi RPC methods vs an equivalent message-based ServiceStack service: http://www.servicestack.net/files/slide-34.png (e.g. 5 RPC vs 2 msg ones)
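
To illustrate the shape difference in the rough -- a hypothetical Python sketch, not ServiceStack's actual API (which is C#) -- an RPC-style service exposes one method per way of asking a question, while a message-based service exposes a single entry point that accepts a message DTO:

    from dataclasses import dataclass
    from typing import Optional

    class CustomersRpcService:
        # RPC style: one method signature per query variation.
        def get_customer_by_id(self, customer_id): ...
        def get_customers_by_age(self, age): ...
        def get_customers_by_city(self, city): ...

    @dataclass
    class GetCustomers:
        # Message style: a single DTO captures all the query variations.
        customer_id: Optional[int] = None
        age: Optional[int] = None
        city: Optional[str] = None

    class CustomersMessageService:
        def any(self, request: GetCustomers):
            # One handler interprets whichever fields the message sets.
            filters = {k: v for k, v in vars(request).items() if v is not None}
            return f'querying customers where {filters}'

    svc = CustomersMessageService()
    print(svc.any(GetCustomers(city='Oslo')))   # one entry point, many queries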


Those aren't exactly what Kay has in mind. And as a matter of fact, message passing between objects does rule the world in the form of the internet. In CORBA and other RPC mechanisms, both sides have some kind of description of the protocol from which they generate methods. If the interface changes, then the code must be regenerated on both sides. On the other hand, Kay believes in individual nodes having full freedom to handle any kind of message and decide what they do with it, not necessarily strictly conforming to a decided-on protocol.

It's analogous to the idea of objects adopted by C++ and Smalltalk. With C++, method calls are essentially procedure calls with a specific signature that are dispatched based on a special parameter (i.e. this). It makes no sense to invoke a method that doesn't exist. In Smalltalk, you can send any message you'd like to an object, and the object can choose to deal with it in any way it wants, up to and including catch-all code for dealing with any kind of message.

CORBA and its ilk work with a protocol, and messages not conforming to the protocol are nonsensical. Computers on the internet can send any message to any computer, and each can deal with the messages in any way it wants. The internet is a network of message-passing objects.
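
A rough analogy in Python, with __getattr__ playing the role of Smalltalk's doesNotUnderstand: -- the receiver accepts any message and decides for itself what to do with it, instead of exposing a fixed, pre-agreed interface the way an RPC stub does (names are illustrative):

    class Receiver:
        def __init__(self):
            self.log = []

        def ping(self):                      # a message it understands natively
            return 'pong'

        def __getattr__(self, selector):     # catch-all for unknown messages
            def handle(*args):
                self.log.append((selector, args))
                return f'deferred: {selector}{args}'
            return handle

    r = Receiver()
    print(r.ping())                          # 'pong'
    print(r.renegotiate_protocol('v2'))      # handled without a predeclared method
    print(r.log)                             # [('renegotiate_protocol', ('v2',))]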


Those are pretty much RPC (remote procedure call), which Kay names as the opposite of message passing. My take on what Kay is saying is that he is in favor of something much more like "service oriented architecture" -- but certainly without all the buttloads of cruft that have come to be associated with that phrase.


> The ARPA/PARC history shows that a combination of vision, a modest amount of funding, with a felicitous context and process can almost magically give rise to new technologies that not only amplify civilization, but also produce tremendous wealth for the society.

[...] felicitous context and process [...]

I am interested in what the process is he's referring to that PARC used.

I think software engineering as a discipline is too young to have decent processes figured out, as is evident with the takeover of the "agile/scrum" process (I don't mean this in a bad way, but before this it was either NO process or a bug tracker with dates in it).


He's thinking process at a higher level than you are. He means process in terms of funding, choice of vision and who chooses to commit people and resources. It's his contention that ARPA in the early to mid 60s had a funding/research process that really worked. The nut of it was to pick productive people with vision and get out of their way and let them drive.


While there's no question that some of Alan Kay's achievements are very important, I cannot help thinking that our industry's tendency to surrender to 'fanism' (e.g. Douglas Crockford, Linus Torvalds) is ridiculous.


Learning the educated insights of the brilliant minds in history is not 'fanism'; it's benefiting from and building off their experiences, and it's what Alan refers to when he says: "Point of view is worth 80 IQ points"

In his talk he uses the example of Leonardo da Vinci, who despite being one of the smartest people in existence wasn't able to invent a motor for any of his machines -- a feat accomplished by a less capable Mr Ford, who did it en masse.

Not learning from history is one of the reasons why the state of the industry is where it's at today.


Don't get me wrong, I'm not contradicting that, nor am I advocating not learning from history. I just think that stuff like that:

"The Deep Insights of Alan Kay" "Alan is the most important person alive in Computer Science""

is over the top.


I never said "Alan is the most important person alive in Computer Science" (though he's obviously a contender)

And I have no regrets over the title "The Deep Insights of Alan Kay"...

Alan has generated a wealth of intellect, experience and insights in his works that most IT people still don't know about or even know who he is. The objective of the post is to bring some of his meaningful and powerful ideas to the surface and summarize them so it's more palatable and more accessible to more people - I think the title accurately captures this.


You have to admit, Fascism is a great system for getting things done. Oh my bad, you said fanism.


Exactly. There are loads of people with much more interesting ideas, for example the author of this article: http://awelonblue.wordpress.com/2012/07/01/why-not-events/.


Some of those points are a bit misleading. Sure, you can define a lisp interpreter in very few lines of code. But to build a decent standard library and optimizing compiler/JIT, that is where the LOCs rack up.


Attitudes like this are short-sighted, and part of the problem. Finding innovative and elegant solutions to problems is more scalable than the brute-force approach of throwing resources at them and having people churn out more LOCs. Not saying it's easier or requires less effort to do, just that it results in fewer LOCs, which scales better and provides better building blocks to build on.

Look into the special properties of LISP which Alan refers to as the Maxwell’s equations of software: http://www.michaelnielsen.org/ddi/lisp-as-the-maxwells-equat...

It's more powerful and extensible than most languages despite requiring orders of magnitude less effort to implement (see: Peter Norvig's Lisp interpreter in 60 lines of Python http://norvig.com/lispy.html, and re-written in 40 lines of Ruby http://stuff.thedeemon.com/rispy/).
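
For a feel of how small the kernel can be, here's a toy evaluator in the spirit of Norvig's lispy -- my own sketch, not his code -- covering quote, if, define, lambda and a few built-ins:

    import math
    import operator as op

    def tokenize(src):
        return src.replace('(', ' ( ').replace(')', ' ) ').split()

    def parse(tokens):
        tok = tokens.pop(0)
        if tok == '(':
            lst = []
            while tokens[0] != ')':
                lst.append(parse(tokens))
            tokens.pop(0)          # discard ')'
            return lst
        try:
            return int(tok)
        except ValueError:
            try:
                return float(tok)
            except ValueError:
                return tok         # a symbol

    GLOBAL = {'+': op.add, '-': op.sub, '*': op.mul, '/': op.truediv,
              '<': op.lt, '>': op.gt, '=': op.eq, 'pi': math.pi}

    def evaluate(x, env=GLOBAL):
        if isinstance(x, str):                 # symbol lookup
            return env[x]
        if not isinstance(x, list):            # literal number
            return x
        head = x[0]
        if head == 'quote':
            return x[1]
        if head == 'if':
            _, test, conseq, alt = x
            return evaluate(conseq if evaluate(test, env) else alt, env)
        if head == 'define':
            env[x[1]] = evaluate(x[2], env)
            return None
        if head == 'lambda':
            params, body = x[1], x[2]
            return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
        proc = evaluate(head, env)             # procedure call
        return proc(*[evaluate(arg, env) for arg in x[1:]])

    evaluate(parse(tokenize('(define square (lambda (x) (* x x)))')))
    print(evaluate(parse(tokenize('(square 7)'))))   # -> 49

Of course, this only reinforces the shape of the argument either way: the kernel is tiny, and everything beyond it (a real standard library, an optimizing compiler) is where the remaining effort goes.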


I'm familiar with the LISP metacircular interpreter and programming language theory. I know you can define all sorts of things with a very small set of primitives.

It's one thing to build a minimal implementation where your main constraints are that it works and it's concise. It gets a lot messier once you start worrying about constraints like efficiency on real machines -- there are limits to how much you can abstract out the heuristics and special cases.


And its macro support has a major impact on the small code-base size of their class libraries.

Right, if you need it to be efficient then mechanical sympathy starts creeping into your code-base, though languages should optimally be designed to capture human thought -- that's the part that doesn't scale well and isn't solved by adding more machines.

The Nile/OMeta work is an example of using DSLs as one way to reduce the LOCs required: it captures the required top-level math functions for image processing and uses OMeta to convert them to runnable source code. DSLs might not be the best way forward in all situations, but investing in research into different ways to solve the problem can only help.


IIRC STEPS includes compilers and optimizers that are written in fairly few lines of code using stacks of DSLs.


It remains to be seen whether they succeed in producing a decent optimizing compiler and a decent standard library.


As far as an optimizing compiler goes... Maru might not yet qualify as decent, but it's a good start.

http://piumarta.com/software/maru/

"The whole thing is 1750 lines of code (about twice what it should be) and includes a garbage collector and user-level implementations of user-defined types and structures, single-dispatch object-oriented "messaging", and generic "multimethods" (all of which are needed by the compiler). The compiled binary code runs at about 30% the speed of aggressively-optimised C."


Maru (a Lisp) looks awesome! However, unoptimized C runs at about 25% the speed of aggressively-optimized C.

The current version of Maru seems to be about 9000 lines of code (almost half the entire complexity budget for STEPS!) I think this reinforces army's original point, "Sure, you can define a lisp interpreter in very few lines of code. But to build a decent standard library and optimizing compiler/JIT, that is where the LOCs rack up."


The OMeta project is available to play with now: http://tinlizzie.org/ometa/

Which also has a JS port http://tinlizzie.org/ometa-js/ so you can create new languages/DSLs and run them in your browser.


"Object Orientated"? Really?

How does someone read so much material on Alan Kay and still manage to have "object orientated" stuck in their head? It's like putting together an overview of Edward Said's "Orientatilism".


I feel the same way about "orientated". But my British friends tell me that's the normal usage of the word in the UK and "oriented" sounds as strange to them as "orientated" does to me.

That said, Kay did call it object-oriented programming, so it makes sense to stick with his name.


I'm British and I think your British friends are wrong. I do agree that most of the people who use "orientated" are British but I don't understand where they get that from.

"Orientation" is the term for the process of orienting something. So a design process in which, say, user stories are turned into object-oriented designs could be called "object-orientation". But it does not follow that the result is "object-orientated" - we'd just call it "object-oriented", in the same way that something which has been through a transformation is not "transformated" but "transformed".


Updated, apologies for the misspelling.


... orientated around “visions rather than goals” and “funded people, not projects”

being exclusively goal oriented constrains the originality of the high level ideas that are produced.

on the other hand, being constrained by specific goals (eg: put a person on the moon) can produce a lot of creativity as people's efforts are focused on a narrow domain of problems to solve.


Surprised not to see any quotes from Dr. Dobb's interview with Kay from 2012 ( http://www.drdobbs.com/240003442 )

Some quotes from that interview:

"My interest in education is unglamorous. I don't have an enormous desire to help children, but I have an enormous desire to create better adults."

"Extracting patterns from today's programming practices ennobles them in a way they don't deserve."

"I like to say that in the old days, if you reinvented the wheel, you would get your wrist slapped for not reading. But nowadays people are reinventing the flat tire. I'd personally be happy if they reinvented the wheel, because at least we'd be moving forward. If they reinvented what Engelbart did we'd be way ahead of where we are now."

and so on...


Alan Kay is a voice that we need to listen to more... we are distracted by the fact that our current industry generates a lot of money... too distracted to look at the fact that our craft is in the pyramid-construction age... we need real architecture and real engineering.


I don't think the dig on the "web amateurs" is fair. It seems clear to me that the simple and, perhaps, amateurish underpinnings of the web (http & HTML) ended up being a great strength.


I disagree.

HTTP is defensible, HTML also has its time and place.

But what it evolved into (HTML/CSS/JS) is pretty obviously a dead-end waiting to happen. The number of hacks we pile on one another for a "modern web-application" is nothing short of ridiculous.

Realistically we still live in the internet stone-age, helplessly piling up stones and wooden sticks...


Oh, how I would like to base everything on simple abstractions like “real computers all the way down (RCATWD)”. However, reality seems to think differently. James Hague recently said it well: "Simplicity is Wonderful, But Not a Requirement"

http://prog21.dadgum.com/167.html


Good article. I just translated it into Chinese and posted it on my blog (http://ned11.iteye.com/blog/1828088). Thanks for the introduction to such a great person.


It's a small thing, I know, but the author's use of "orientated" for "oriented" was a bit annoying.

Other than that, it's a nice collection of quotes and links. Good to see.


I'd love to get a hold of that Bob Barton article Alan Kay talks about, but strangely it seems like all the PDFs of it have been taken down...


The revolutionary 1961 Bob Barton paper Alan Kay was referring to was called "A new approach to the functional design of a digital computer".

Here's a copy of it on Scribd: http://www.scribd.com/doc/61812037/Barton-B5000


Great!


Has anyone seen him talking about a Tim Gallwey video? Kathy Sierra mentioned one of Alan Kay's talks called "Doing with Images Makes Symbols"... I'm trying to find this one.


What network protocol did the ARPA network at PARC use?


The author lost me here: "Which didn’t even require inheritance, which is not like we know it today".


My new favorite from Alan Kay, which I just learned from TFA:

"Lisp: greatest single programming language ever designed"

My previous favorite quote attributed to Alan Kay, which is not in TFA:

"I made up the term "object-oriented", and I can tell you I did not have C++ in mind."

: )



