This business of underplaying or demeaning the importance of icons, fonts, and familiar metaphors in favour of some abstract notion of an interface that no one has ever succeeded in implementing (not least because no one can agree on what it should be) is a running theme. It is often accompanied by banging on about the importance of what are really just clerical operations.
PARC, like any office of a large organisation, was invariably subject to revisionist history, office politics and all sorts of other nonsense, but attempting to deny that it produced the direction for our last 30 years (and possibly into the near future) is ludicrous, much as a similar rant against Bell Labs would be misguided.
Most specifically, he glosses over the fact that applications in the PARC environments had far blurrier lines around the edges than, say, iOS apps do. They had things like OLE, which the web is clearly trending toward recreating in the form of web components, because even though the implementations have sucked, the idea itself makes sense to people.
Paying close attention to the ideas from PARC, you can see that the real philosophical difference is that Nelson is closer to the idea of an all-ruling single document format, whereas the vision PARC believed in was an all-ruling single code format. Once more the parallel to the evolution of the web is clear: what began as a way to distribute documents is morphing, slowly, into a method for distributing rich code objects.
Start with Computer Lib to get the historical context. Then Dream Machines to get his original ideas and visions for computers as tools. Then Literary Machines, Geeks Bearing Gifts, and Possiplex (all worth it if you appreciate his work; if you don't, their level of interest decreases as the publication dates increase). And finally, if you really like the guy, you can watch his YouTube lectures.
It's easy to dismiss Ted Nelson as some old crazy man who spent his life selling snake oil when you're behind your computer, but the reality of it is that he contributed a lot to the field and is deeply respected amongst his peers. He was a close friend of Engelbart, his work was valued by his contemporaries, and there's a reason he was invited to the Homebrew Computer Club reunion a few months ago. Nelson's work gives us an insight into what computing could have looked like, in some weird parallel world.
He is controversial, perhaps, and some of his ideas and claims may be impractical or unrealistic - but ignoring and dismissing him only sends the message that you're not as much of an expert as your rhetoric makes you out to be.
In Computer Lib/Dream Machines, one of the systems Nelson describes and lauds is Calvin Mooers's TRAC macro language. Mooers took his IP seriously and defended it seriously. You can read his argument in favor of copyright to protect software in a Computing Surveys issue from some decades ago. Mooers protected TRAC right into oblivion--if even 1% of those who read HN have ever heard of it, I'll be amazed.
David Turner, who has done very important work on implementing applicative languages, similarly protected Miranda, with the result that the major work is being done elsewhere, on Haskell et al.
Project Xanadu is "worse is better" taken to the limit, in which the better doesn't even exist save perhaps as a prototype that you can at best see a short video of. The story of Project Xanadu is a tragedy of seeking funding and resources, exhausting them, and then having to look for more. How much better off would the world--including Nelson himself!--be had he decided to open it up to the bazaar?
Yeah, quasi-stealth mode vs. bazaar was a killer.
Even when one approach wins, it is foolish to avoid looking at what other approaches offer. And that is all he is doing (from his particular, biased-but-interesting viewpoint, of course).
I do think Nelson underestimates the importance of icon and metaphor, but given that he has a specific paradigm in mind, I think that's an honest mistake, not a mean-spirited effort to denigrate someone (in contrast to, say, your post).
I agree that he's clear that the PARC view won, but he (and this is the critical Raskin similarity) is deluded about why it won: the PARC-type ideas are what people want. Additionally, I would argue his view is "merely" an application layer you could build atop the PARC-derived structure, while the opposite is not the case, assuming it could be implemented at all.
The main lesson I take from this kind of thing is that a life spent pontificating without delivering products, even in the form of research prototypes, is basically a waste. Even if those products are in some sense misguided, it's better to have something concrete to argue about, something that can be built upon, stolen from, or discarded, than a lot of ideas that never get done. With that, I'll go back to my other windows and get back to work, as this is a trap that is very tempting for me to fall into.
The thing is that he is not wrong. Computer documents are, for the most part, paper-simulators and that paradigm wastes a computer's potential. You're right that there is disagreement about what we can do to improve things, but he is saying that perhaps if the alternatives weren't fighting the conceptual fast-food that is WYSIWYG, we might have figured it out at some point over the past 30 years.
Ted Nelson's vision is attractive and brilliant and fundamentally misguided. I remember reading one of his pieces on hypertext and how there was this idea of everything being, in essence, part of a disambiguated database of information so that any version of, say, Macbeth, would be linked to the canonical Macbeth... And I thought -- hang on, does he not realize that when it comes to almost any creative work there is no canonical version, ever? (Actually at the time I thought Shakespeare's works -- where in most cases we don't have any of his original manuscripts and rely on warring fragments of revisions -- were unusual, but in fact almost anything has these issues of provenance, even in the wholly digital world.)
I would love to be able to link from one data element to another. I believe it would eliminate a whole class of applications.
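Element-level linking of this sort can be sketched in a few lines. This is a purely hypothetical toy (the names Cell and Link are mine, not any real system's API): one element holds the value, every other place refers to it instead of copying it, which is the class of duplication-syncing work the commenter suggests would disappear.

```python
# Hypothetical sketch of element-to-element links: a Cell owns a value,
# and a Link refers to the cell rather than copying its value, so every
# reader always sees the current value at the source.

class Cell:
    def __init__(self, value):
        self.value = value

class Link:
    def __init__(self, cell):
        self.cell = cell           # a reference, not a copy
    def get(self):
        return self.cell.value     # the current value at the source

price = Cell(100)
invoice_total = Link(price)        # the invoice shows the price by reference
report_figure = Link(price)        # so does the report

price.value = 120                  # update the source once...
print(invoice_total.get())         # 120 -- every view reflects it
print(report_figure.get())         # 120
```

The "whole class of applications" eliminated here is everything whose job is re-copying the same value into N places and keeping the copies in sync.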
One of the reasons I think the Android ecosystem will win out in the end is Intents; I don't see any other operating system even attempting something that cool.
His book, The Humane Interface (http://www.amazon.com/The-Humane-Interface-Directions-Intera...) does have some interesting thoughts on how novice and expert users have completely different ways of interacting with machines.
Unfortunately he was convinced his ideas for interfaces should be protected by patents, virtually guaranteeing that things like his "leap" keys would never be adopted.
Back in the 1960s and 1970s there was a tremendous sense of possibility for what computers could do for us on the personal level. I think Nelson, like Alan Kay, is worth revisiting on that basis: reminding us of possibilities, not so much who was right and wrong or who was first to think of X.
There's a reason that he is famous, and it's not because his Xanadu project failed in many senses.
Edit: If you watch "Computers for Cynics 4 - The Dance of Apple and Microsoft", he's much less cranky and sings the praises of Steve Jobs in a way that made me re-evaluate my opinion of Jobs.
It was similarly cranky, and littered with statements that were either poor descriptions of the concepts they aimed to explain, highly misleading, or just flat-out incorrect (if he has any idea how filesystems actually work, he's doing a damn good job hiding it).
Among other things, it seems like one could apply this argument to claim that the-web-as-interface is strictly better than the PARC-style GUI.
It is well known, however, that Nelson objects to the web nearly as much as he objects to the "PUI". And this is because the web is at best a leaky simulation of a multi-part document, rather than a real implementation of Nelson's Xanadu vision (articulated in the '60s).
The thing is, the Xanadu vision of multi-part documents with living links (distributed version control, permissions, etc.) is more or less impractical to implement fully.
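The core of that vision, transclusion, is easy to sketch even if the full system is impractical. Here is a toy illustration (the data layout and names are mine, not Xanadu's actual design): a document is a list of spans pointing into source documents, and rendering pulls the current text of each source, so an edit to the source shows up live in every document that quotes it.

```python
# Toy transclusion: quoted text is stored as a reference into the source,
# never as a copy, so the link stays "living".

sources = {
    "essay": "Everything is deeply intertwingled.",
}

# Each entry is (source_name, start, end) for a transcluded span,
# or (None, None, literal_text) for text owned by this document.
quoting_doc = [
    ("essay", 0, 10),              # transclude "Everything" from the essay
    (None, None, " changes."),     # literal text local to this document
]

def render(doc):
    parts = []
    for src, start, end in doc:
        if src is None:
            parts.append(end)                  # local literal text
        else:
            parts.append(sources[src][start:end])  # fetch current source text
    return "".join(parts)

print(render(quoting_doc))                     # "Everything changes."
sources["essay"] = "EVERYTHING is deeply intertwingled."
print(render(quoting_doc))                     # "EVERYTHING changes." -- live link
```

What makes the real thing hard is everything this toy ignores: spans must survive edits that move offsets, sources live on other machines, and permissions and versioning apply per span.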
On the other hand, there is the point that a single-user word processor certainly could use a series of pieces, a la the pre-computer method he describes.
As Nelson mentions, one of the key ways the PUI was able to dominate was by organizing a variety of operations with a single metaphor. I suspect that Nelson is underestimating how important that is for making computers accessible to people.
It's really interesting to see, but in order for any of his ideas to actually work, it seems you need to rethink the entire operating system - not just the UI. He wants all content to have history, which would fundamentally break copy and paste (one of his points, actually). The problem I see with that is you'd still have individual files (actual bits) on the hard drive; they'd just all carry some metadata describing their origin. Or you'd need some sort of database file system (a la WinFS).
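The "metadata describing its origin" idea can be made concrete with a small sketch (all names here are illustrative, not any real clipboard API): copy produces the bytes plus a provenance record, instead of an anonymous blob, so pasted content still knows where it came from.

```python
# Hypothetical provenance-carrying copy: the clipboard payload records
# which document the text came from and the span it occupied there.

from dataclasses import dataclass

@dataclass
class Snippet:
    text: str            # the copied bytes
    origin_doc: str      # which document it was copied from
    origin_span: tuple   # (start, end) offsets in that document

def copy(doc_name, doc_text, start, end):
    return Snippet(doc_text[start:end], doc_name, (start, end))

report = "The Alto shipped with a bitmapped display."
clip = copy("report.txt", report, 4, 8)
print(clip.text)         # "Alto"
print(clip.origin_doc)   # "report.txt"
```

This is exactly the part that is easy; the hard part the comment points at is making every application and the filesystem preserve such records instead of flattening everything back to anonymous bytes.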
I'd like to see technology trending more this guy's way, but the reality is that the technological challenges seem to be too great at the moment.
I noted a couple of apparent mistakes: at https://www.youtube.com/watch?v=c6SUOeAqOjU#t=4m26s Nelson says that Xerox paid Jobs to look at the PARC work, while AFAIK Apple in fact paid Xerox, in pre-IPO Apple stock. He also suggested that Xerox tried to bring in Jobs, when it seems to have been Apple who made the approach (at the instigation of Jef Raskin at Apple). https://www.youtube.com/watch?v=c6SUOeAqOjU#t=2m15s is misleading. PARC was more-or-less forbidden to buy a DEC PDP-10, which would have reflected badly on the competing machines from Xerox's new acquisition Scientific Data Systems, so PARC built a PDP-10 clone (the MAXC) in-house for their own use. That had very little to do with why they later designed and built the Altos: there was nothing remotely similar to an Alto out there that they could have bought (Nelson knows this).
I especially like the last line:
At 13:52: "in order to sell printers they threw away the universe"
Nelson is a fascinating character for his idealism concerning what computer interfaces should do.
It's easy to see how that idealism might be serving as a screen hiding the inherent barriers to such a system ever existing - especially since, as technology allows one to give data more structure and inter-relations, there's no guarantee that the tools for dealing with those structures will keep up with their complexity.
I'm not saying he doesn't have valid criticisms, but he's not making it very easy to sympathize with him.
Bringing a complex tangle of code to heel becomes tractable if you just print it out, cut it up, and rearrange the parts.
It would be fairly sexy to be able to do this on your computer... granted, you need a lot of screen space to pull it off.
Oh, and I'm ignorant of Ted Nelson.... But it seems fairly obvious these talks are meant to be taken with some humour.
Ted Nelson has yet to admit missing the boat by never being able to release anything in over 40 years. He is a super-consultant and the polar opposite of what YC stands for: a great visionary with zero implementation skill. Time to die.
> "I sometimes find myself wondering if the invention of the high-level compiler was a fundamental and grave (if perhaps inevitable) mistake, not unlike, say, leaded gasoline."
As I gather, he's spent about the last five years or so trying to build a LISP machine from scratch on an FPGA.
(PS I see from your profile that you're an iOS dev too... I would think that anybody who writes Objective-C for a living would typically sustain an intense and burning hatred for the state of contemporary programming, and be otherwise eternally fascinated by any proposed alternatives. How do you do it?)