I wouldn't put Ted Nelson on the same page as the other ones. The other three actually built stuff that is now everywhere. Ted never got anywhere and is now bitching all the time about how much better his ideas were than what we actually got.
Hmm. I could show where your dashed-off summary is wrong either by knocking down the other three or by educating you on Ted's context. I'll go with education, as Ted would prefer.
You are confusing "I heard their names" with "they made the thing I use". Berners-Lee didn't write the browser you use. Kay isn't hard at work on the new iPhone. And Engelbart doesn't pop into the Logitech offices to see how the new controls are coming along.
Nelson wrote multiple critical works of theory and practice, edited very influential magazines, coded prototypes and processes that were re-used or modified as time went on, and has continually spoken about his principles and ideas on the interface between humans and machines. He has stressed, for over half a century, the importance of how information is treated and accessed, which turns out to be the top priority of the human race in the contemporary sphere.
Ted's name has been brought up in meetings you're not invited to and events you can't attend, as well as ones you could. His influence is gigantic, and the fact that he freely shares his ideas past the age of 80 is a gift some of us appreciate very much.
He doesn't strut across a stage next to a product he yelled at others to make. That doesn't make him less of a giant.
> He has stressed, for over half a century, the importance of the treatment and access of information, which turns out to be the top priority of the human race in the contemporary sphere.
Jason was being as polite as he could to someone coming after Ted. It's not an unreasonable thing to emphasize. More politely: please catch up.
Thanks for putting so well what I lacked the patience to write. I don't think the OP deserves it; the comment borders on trolling, and is unsubstantiated, poorly conceived, and poorly researched by someone who claims to have an interest in these things.
I think his biggest problem is that he refuses to collaborate with other people, or build on top of current technology.
He's had a lot of great, important, inspirational ideas, but his implementations of those ideas didn't go anywhere; he's angry and bitter, and he hasn't bothered re-implementing them with any of the "inferior technologies" that he rejects.
Back in 1999, Project Xanadu released their source code as open source. It was a classic example of "open sourcing" something that was never going to ship otherwise, and that nobody could actually use or improve, just to get some attention ("open source" was a huge fad at the time).
>Register believe it or not factoid: Nelson's book Computer Lib was at one point published by Microsoft Press. Oh yes. ®
They originally wrote Xanadu in Smalltalk, then implemented a Smalltalk-to-C++ compiler, and finally released the machine-generated output of that compiler, which was unreadable and practically useless. It completely missed the point and purpose of "open source software".
I looked at the code when it was released in 1999 and wrote up some initial reactions that Dave Winer asked me to post to his UserLand Frontier discussion group:
A few excerpts (remember I wrote this in 1999 so some of the examples are dated):
>Sheez. You don't actually believe anybody will be able to do anything useful with all that source code, do you? Take a look at the code. It's mostly uncommented glue gluing glue to glue. Nothing reusable there.
>Have you gotten it running? The documentation included was not very helpful. Is there a web page that tells me how to run Xanadu? Did you have to install Python, and run it in a tty window?
>What would be much more useful would be some well written design documents and post-mortems, comparisons with current technologies like DHTML, XML, XLink, XPath, HyTime, XSL, etc, and proposals for extending current technologies and using them to capture the good ideas of Xanadu.
>Has Xanadu been used to document its own source code? How does it compare to, say, the browseable cross-referenced mozilla source code? Or Knuth's classic Literate Programming work with TeX?
>Last time I saw Ted Nelson talk (a few years ago at Ted Selker's NPUC workshop at IBM Almaden), he was quite bitter, but he didn't have anything positive to contribute. He talked about how he invented everything before anyone else, but everyone thought he was crazy, and how the world wide web totally sucks, but it's not his fault, if only they would have listened to him. And he verbally attacked a nice guy from Netscape (Martin Haeberli -- Paul's brother) for lame reasons, when there were plenty of other perfectly valid things to rag the poor guy about.
>Don't get me wrong -- I've got my own old worn-out copy of the double sided Dream Machines / Computer Lib, as well as Literary Machines, which I enjoyed and found very inspiring. I first met the Xanadu guys some time ago in the 80's, when they were showing off Xanadu at the MIT AI lab.
>I was a "random turist" high school kid visiting the AI lab on a pilgrimage. That was when I first met Hugh Daniel: this energetic excited big hairy hippie guy in a Xanadu baseball cap with wings, who I worked with later, hacking NeWS. Hugh and I worked together for two different companies porting NeWS to the Mac.
>I "got" the hypertext demo they were showing (presumably the same code they've finally released -- that they were running on an Ann Arbor Ambassador, of course). I thought Xanadu was neat and important, but an obvious idea that had been around in many forms, that a lot of people were working on. It reminded me of the "info" documentation browser in emacs (but it wasn't programmable).
>The fact that Xanadu didn't have a built-in extension language was a disappointment, since extensibility was an essential ingredient to the success of Emacs, HyperCard, Director, and the World Wide Web.
>I would be much more interested in reading about why Xanadu failed, and how it was found to be inadequate, than how great it would have been if only it had taken over the world.
>Anyway, my take on all this hyper-crap is that it's useless without a good scripting language. I think that's why Emacs was so successful, why HyperCard was so important, what made NeWS so interesting, why HyperLook was so powerful, why Director has been so successful, how it's possible for you to read this discussion board served by Frontier, and what made the World Wide Web what it is today: they all had extension languages built into them.
>So what's Xanadu's scripting language story? Later on, in the second version, they obviously recognized the need for an interactive programming language like Smalltalk, for development.
>But a real-world system like the World Wide Web is CONSTANTLY in development (witness all the stupid "under construction" icons), so the Xanadu back and front end developers aren't the only people who need the flexibility that only an extension language can provide. As JavaScript and the World Wide Web have proven, authors (the many people writing web pages) need extension languages at least as much as developers (the few people writing browsers and servers).
>Ideally, an extension language should be designed into the system from day one. JavaScript kind of fits the bill, but was really just nailed onto the side of HTML as an afterthought, and is pretty kludgey compared to how it could have been.
>That's Xanadu's problem too -- it tries to explain the entire universe from creation to collapse in terms of one grand unified theory, when all we need now are some practical techniques for rubbing sticks together to make fire, building shelters over our heads to keep the rain out, and convincing people to be nice and stop killing each other. The grandiose theories of Xanadu were certainly ahead of their time.
>It's the same old story of gross practicality winning out over pure idealism.
>Anyway, my point, as it relates to Xanadu, and is illustrated by COM (which has its own, more down-to-earth set of ideals), is that it's the interfaces, and the ideas and protocols behind them, that are important. Not the implementation. Code is (and should be) throw-away.
>There's nothing wrong with publishing old code for educational purposes, to learn from its successes and mistakes, but don't waste your time trying to make it into something it's not.
Your preface to the UserLand thread was fantastic. "We haven't even INVENTED twitter yet, but I promise y'all will ENJOY the hell out of this series of comments-with-character-limits, but you're gonna do it IRONICALLY."
It's been about a year since I talked to Ted Nelson, and I suspect the talk you saw at Almaden would've been close to the height of his bitterness about HTTP. He also speaks frankly about the deeper aspect of his bitterness, retrospecting on the '70s back in 1990.
I am curious about your current assessment that he refuses to collaborate with other people. Obviously he worked with other people to get Xanadu to where it got. And I don't have enough inside knowledge about the competitive strategy of Netscape or other commercial hypertext vendors, but I'm sure there was a lot more going on around the open sourcing decision. At least with nearly 20 years of hindsight since the open sourcing, he seems perfectly clear-eyed about the failure of the Xanadu project.
I would love to hear that postmortem more than I want to read the released code, too. But when I asked him about it, Ted just seemed a lot more interested in talking about the good ideas from Xanadu. Sure, he strikes me as a little grumpy, but I don't know anyone my age or older that isn't a little bit grumpy about computing. After I watched that WGBH interview, I developed an enormous amount of empathy for the fact that he had to live through the 70s.
> Anyway, my take on all this hyper-crap is that it's useless without a good scripting language. I think that's why Emacs was so successful, why HyperCard was so important, what made NeWS so interesting, why HyperLook was so powerful, why Director has been so successful, how it's possible for you to read this discussion board served by Frontier, and what made the World Wide Web what it is today: they all had extension languages built into them.
I'm wondering what your wiser, older self has to say about this 20 years on. Isn't it useful that documents you wrote 20 years ago can still be read?
From my memories, the Web craze started well before JavaScript, and JavaScript really only jumped on the bandwagon; so how could it be the critical success factor for the Web?
The success of the Web and JavaScript in the last two decades speaks for itself; but in 2018, JavaScript and the procedural Web could very well be its undoing when considering the original goals of the Web, couldn't it?
I don't think Don meant that JS was the critical success factor for the Web. But that extensible scripting is crucial to the kind of Web Ted Nelson wanted in the first place.
From my lived experience, the Web craze would be better termed the Modem craze. And the critical success factor that turned it into the Web, was NSF removing the restrictions on commerce in 1995.
JavaScript is just what got HTML closer to some ideals of Xanadu. Not close enough for Ted's vision, but that is a broad sociopolitical vision.
Server side scripting languages were critical to the success of the web, before browser-side JavaScript became available and matured.
Simple stateless Perl CGI scripts forked from Apache, talking to text databases or MySQL, were the first and simplest step, but things got much more interesting with long-running stateful application servers like Zope (Python), Java, Radio UserLand, HyperCard, node, etc.
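To make the stateless/stateful contrast concrete, here's a toy sketch (the port and behavior are made up, and it's written for node since that's where the thread ends up): in a long-running application server, state survives between requests because the process does; a CGI script forked fresh for every request had to push all of its state out to files or a database.

    import http from "node:http";

    // A long-running stateful application server: the counter lives in
    // the process across requests. A forked-per-request CGI script would
    // have to store it in a file or database instead.
    let hits = 0;

    http
      .createServer((req, res) => {
        hits += 1; // in-process state shared across all requests
        res.writeHead(200, { "Content-Type": "text/plain" });
        res.end(`You are visitor number ${hits}\n`);
      })
      .listen(8080);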
My favorite thing about node is that it lets you use the same language and libraries and data on both the client and server side. That's an enormous advantage that far outweighs JavaScript's disadvantages. But some people just can't see or believe that, for whatever reason, and they're fine with flipping and flopping back and forth between different languages, and hiring different people to write multiple subtly divergent versions of everything in different languages.
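A minimal sketch of what that sharing looks like in practice (the module and validation rule are hypothetical, just to illustrate the shape of it): one TypeScript module, bundled for the browser and imported by the node server alike.

    // validate.ts - a hypothetical module shared by client and server.
    export interface SignupForm {
      email: string;
      name: string;
    }

    // One validation routine, one source of truth: the browser calls it
    // before submitting, and the server calls it again on the same payload.
    export function validateSignup(form: SignupForm): string[] {
      const errors: string[] = [];
      if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(form.email)) {
        errors.push("invalid email address");
      }
      if (form.name.trim().length === 0) {
        errors.push("name must not be empty");
      }
      return errors;
    }

The point isn't the validation rule itself; it's that there is exactly one of it, instead of two subtly divergent versions in two languages maintained by two different people.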
Face it: for all its faults, JavaScript won. I will always have a place in my heart for FORTH, PostScript, MockLisp, ScriptX, TCL, Python, HyperTalk, UserTalk, CFML, Java, and all those other weird obsolete scripting languages, but it's soooo much easier to program in one language without switching context all the time, even if it's not the best language in the universe. And TypeScript is a pretty darn good way of writing JavaScript.
You're right, the web was held back until it was finally considered "ok" to use it for commercial activity!
I'd say JavaScript is just what got HTML closer to implementing any ideal you want, and there's no reason Xanadu couldn't be implemented on top of current web technologies (except that Ted doesn't want to). But I don't think extensibility and scripting itself was part of Ted's original vision or implementation.
Just as so much has happened since MVC was invented (yet it's still religiously applied by cargo-cult programmers), so much has also happened since Xanadu was invented (like distributed source code control, for example), which requires a total rethinking from basic principles. We also have the benefit of a lot of really terrible examples and disastrous experiments to learn from (wikipedia markup language, wordpress, etc). Many of Ted's principles should be among those basic principles considered, but they're not the only ones.
Hmm, HyperCard in the same list as Zope and node? Interesting. :-)
The idea that JavaScript "won" is a little controversial to me. I think it's huge and important, but the world is still changing. Embedded Python goes places that Node still can't. I absolutely see the value you describe in sticking to one ecosystem, but I don't think JavaScript/TypeScript/Node is the only way to get those benefits. (See also: Transcrypt) I really enjoyed the PyCon 2014 talk on the general subject: https://www.destroyallsoftware.com/talks/the-birth-and-death...
The most recent conversation I had with Ted was after someone had just demonstrated the HoloLens for him and a few others. Ted had some feedback for the UI developer, and it didn't have anything to do with JavaScript or that level of implementation detail at all. It was all about the user experience. I don't want to put words into his mouth, but like he says in this recent interview, this is all hard to talk about because it really has changed so quickly.
I do think you're right that a lot of what Ted wanted to see could be implemented today in JavaScript and Git. But I think of the technical meat of that vision as being about data-driven interfaces. I am simply not old enough to really understand how notions of "scripting" changed between the 60s and the 80s. But the fact that Xanadu was started in Smalltalk suggests to me that scripting was part of the vision, even if a notion like "browser extensions" might not have been in mind.
Completely agree that there are other voices to learn from, and other important mistakes that have been made since Xanadu! (I think Ted would agree, too.)
Reading documents from 20 years ago is a mixed bag. Links usually fail horribly, which was something Xanadu was trying to solve, but I'm not convinced they could have solved it so well that 20-year-old links would still actually work in practice.
I've always tried to write documents in a simple format that's easy to translate to newer formats, and minimizes noise and scaffolding and boilerplate.
When we were developing the HyperTIES hypermedia browser in 1988 [1] at the UMD HCIL, we considered using SGML as the markup language, but decided against it, because we were focusing on designing a system that made it easy for normal people to author documents, and working with SGML took a lot of tooling at the time. (It was great for publishing Boeing's 747 reference manual, but not for publishing poetry and cat pictures.) So we designed our own markup language. [2]
It's not which scripting language you have, it's that you have a scripting language at all that's important. HyperTIES was actually implemented in C, plus 3 different scripting languages: FORTH for the markup language interpreter and formatter [3], PostScript for the user interface and display driver and embedded applets [4], and Emacs MockLisp for the authoring tool [5].
When you try to design something from the start without a scripting language, like a hypermedia browser or authoring tool, or even a window system or user interface toolkit, you end up getting fucked by Greenspun's Tenth Rule [6].
[6] Greenspun's Tenth Rule: Any sufficiently complicated C or Fortran program contains an ad-hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp. https://en.wikipedia.org/wiki/Greenspun%27s_tenth_rule
But when you start from day one with a scripting language, you can relegate all the flexible scripty stuff to that language, and don't have to implement a bunch of incoherent lobotomized almost-but-not-quite-turing-complete kludgy mechanisms (like using X Resources for event handler bindings and state machines, or the abomination that is XSLT, etc).
TCL/Tk really hit the nail on the head in that respect. TCL isn't a great language design (although it does have its virtues: a clean, simple C API, excellent string processing, and a well written implementation of a mediocre language design), but its ubiquitous presence made the design of the Tk user interface toolkit MUCH simpler yet MUCH more extensible, by orders of magnitude compared to all existing X11 toolkits of the time, since Tk could just seamlessly call back into TCL with strings as event handlers and data, with no need for any of the ridiculous, useless, brittle contraptions that the X Toolkit Intrinsics tried to provide.
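For illustration, here's a tiny sketch of the principle (a made-up toolkit, not the real Tk or Xt API): when the host language has first-class functions, an event binding is just a closure, and no separate resource/translation-table mini-language is needed.

    // A hypothetical widget toolkit where handlers are plain closures.
    type Handler = (event: { x: number; y: number }) => void;

    class Button {
      private handlers: Handler[] = [];
      constructor(public label: string) {}

      // Binding an event is just passing a function; the "extension
      // language" is the host language itself.
      onClick(handler: Handler): void {
        this.handlers.push(handler);
      }

      click(x: number, y: number): void {
        for (const h of this.handlers) h({ x, y });
      }
    }

    let count = 0;
    const button = new Button("Press me");
    button.onClick(() => console.log(`pressed ${++count} time(s)`));
    button.click(10, 20); // prints: pressed 1 time(s)

Anything the toolkit authors didn't anticipate is still expressible, because a handler can run arbitrary code; with a fixed set of built-in actions bound through resource files, it takes a recompile.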
The web was pretty crippled before JavaScript and DHTML came along. Before there was client side JavaScript, there were server side scripting languages, like Perl, PHP, Python, Frontier (Radio Userland) [7], HyperTalk, etc.
Frontier / Manila / Radio UserLand was a programmable authoring tool, content management system, and web server, with a built-in scripting language (UserTalk, integrated with an outliner and object database). That scriptability enabled Dave Winer and others to rapidly prototype and pioneer technologies such as blogging, RSS, podcasting, XML-RPC, SOAP, OPML, serving dynamic web sites and services, exporting static web sites and content, etc.
One of the coolest early applications of server side scripting was integrating HyperCard with MacHTTP/WebStar, such that you could publish live interactive HyperCard stacks on the web! Since it was based on good old HyperCard, it was one of the first scriptable web authoring tools that normal people and even children could actually use! [8]
I guess it's a matter of perspective whether you like the procedural Web (the developer/creative perspective) or not (the perspective of the consumer who gets all kinds of scripts for tracking, mining, phishing, and other nefarious purposes, all the while not being able to save something for later reading).
I have no doubt JavaScript was absolutely necessary to develop the Web to the point it is today. But I had hoped that development of HTML (the markup language) would keep up to eventually provide declarative means to achieve some of what only JavaScript can do, by sort of consolidating UI idioms and practices based on experience gained from JavaScript. But by and large this hasn't happened.
What has happened instead is that JavaScript-first development has taken over the Web since about 2010 (I like react myself when it's a good fit, so I'm not saying this as a grumpy old man or something). And today there's no coherent vision as to what the Web should be; there's no initiative left to drive the Web forward, except for the very few parties/monopolies who benefit from the Web's shortcomings (in terms of privacy, lack of security, its requirement of a Turing-complete scripting environment for even the most basic UI tasks, etc).
> the abomination that is XSLT
Not trying to defend XSLT (which I find to be a mixed bag), but you're aware that its precursor was DSSSL (Scheme), with pretty much a one-to-one correspondence of language constructs and symbol names, aren't you?
In the ideal world we would all be using s-expressions and Lisp, but now XML and JSON fill the need of language-independent data formats.
>Not trying to defend XSLT (which I find to be a mixed bag), but you're aware that its precursor was DSSSL (Scheme), with pretty much a one-to-one correspondence of language constructs and symbol names, aren't you?
The mighty programmer James Clark wrote the de-facto reference SGML parser and DSSSL implementation, was technical lead of the XML working group, and also helped design and implement XSLT and XPath (not to mention expat, TREX / RELAX NG, etc)! DSSSL was totally flexible and incredibly powerful, but massively complicated, and you had to know Scheme, which blew a lot of people's minds. But the major factor that killed SGML and DSSSL was the emergence of HTML, XML and XSLT, which were orders of magnitude simpler.
There's a wonderful DDJ interview with James Clark called "A Triumph of Simplicity: James Clark on Markup Languages and XML" where he explains how a standard has failed if everyone just uses the reference implementation, because the point of a standard is to be crisp and simple enough that many different implementations can interoperate perfectly.
I think it's safe to say that SGML and DSSSL fell short of that sought-after simplicity, and XML and XSLT were the answer to that.
"The standard has to be sufficiently simple that it makes sense to have multiple implementations." -James Clark
My (completely imaginary) impression of the XSLT committee is that there must have been representatives of several different programming languages (Lisp, Prolog, C++, RPG, Brainfuck, etc) sitting around the conference table facing off with each other, and each managed to get a caricature of their language's cliche cool programming technique hammered into XSLT, but without the other context and support it needed to actually be useful. So nobody was happy!
Then Microsoft came out with MSXML, with an XSL processor that let you include <script> tags in your XSLT documents to do all kinds of magic stuff by dynamically accessing the DOM and performing arbitrary computation (in VBScript, JavaScript, C#, or any IScriptingEngine-compatible language). Once you hit a wall with XSLT, you could drop down to JavaScript and actually get some work done. But after you got used to manipulating the DOM in JavaScript with XPath, you begin to wonder what you ever needed XSLT for in the first place, and why you don't just write a nice flexible XML transformation library in JavaScript and forget about XSLT.
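For example, here's roughly what that drop-to-JavaScript pattern looks like in a browser today (the document content is made up), using the standard DOMParser and document.evaluate XPath APIs instead of an XSLT template:

    // Pull out //book/title, the kind of job an XSLT template would do,
    // using XPath plus ordinary JavaScript.
    const xml = new DOMParser().parseFromString(
      "<library><book><title>Computer Lib</title></book>" +
        "<book><title>Literary Machines</title></book></library>",
      "application/xml"
    );

    const result = xml.evaluate(
      "//book/title", // the same XPath you'd put in an XSLT select=
      xml,
      null,
      XPathResult.ORDERED_NODE_SNAPSHOT_TYPE,
      null
    );

    const titles: string[] = [];
    for (let i = 0; i < result.snapshotLength; i++) {
      titles.push(result.snapshotItem(i)?.textContent ?? "");
    }
    console.log(titles); // ["Computer Lib", "Literary Machines"]

Once the selection is an ordinary array, the "transformation" is whatever JavaScript you like.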
Excerpts from the DDJ interview (it's fascinating -- read the whole thing!):
>DDJ: You're well known for writing very good reference implementations for SGML and XML Standards. How important is it for these reference implementations to be good implementations as opposed to just something that works?
>JC: Having a reference implementation that's too good can actually be a negative in some ways.
>DDJ: Why is that?
>JC: Well, because it discourages other people from implementing it. If you've got a standard, and you have only one real implementation, then you might as well not have bothered having a standard. You could have just defined the language by its implementation. The point of standards is that you can have multiple implementations, and they can all interoperate.
>You want to make the standard sufficiently easy to implement so that it's not so much work to do an implementation that people are discouraged by the presence of a good reference implementation from doing their own implementation.
>DDJ: Is that necessarily a bad thing? If you have a single implementation that's good enough so that other people don't feel like they have to write another implementation, don't you achieve what you want with a standard in that all implementations — in this case, there's only one of them — work the same?
>JC: For any standard that's really useful, there are different kinds of usage scenarios and different classes of users, and you can't have one implementation that fits all. Take SGML, for example. Sometimes you want a really heavy-weight implementation that does validation and provides lots of information about a document. Sometimes you'd like a much lighter weight implementation that just runs as fast as possible, doesn't validate, and doesn't provide much information about a document apart from elements and attributes and data. But because it's so much work to write an SGML parser, you end up having one SGML parser that supports everything needed for a huge variety of applications, which makes it a lot more complicated. It would be much nicer if you had one SGML parser that is perfect for this application, and another SGML parser that is perfect for this other application. To make that possible, the standard has to be sufficiently simple that it makes sense to have multiple implementations.
>DDJ: Is there any markup software out there that you like to use and that you haven't written yourself?
>JC: The software I probably use most often that I haven't written myself is Microsoft's XML parser and XSLT implementation. Their current version does a pretty credible job of doing both XML and XSLT. It's remarkable, really. If you said, back when I was doing SGML and DSSSL, that one day, you'd find as a standard part of Windows this DLL that did pretty much the same thing as SGML and DSSSL, I'd think you were dreaming. That's one thing I feel very happy about, that this formerly niche thing is now available to everybody.
> But the major factor that killed SGML and DSSSL was the emergence of HTML, XML and XSLT, which were orders of magnitude simpler.
That interview is wonderful, but in 2018, while XML has been successful in lots of fields, it has failed on the Web. SGML remains the only standardized and broadly applicable technique to parse HTML (short of ad-hoc HTML parser libraries) [1]. HTML isn't really simple; it requires full SGML tag inference (as in, you can leave out many tags, and HTML or SGML will infer their presence), SGML attribute minimization (as in `<option selected>`), and other forms of minimization only possible in the presence of a DTD (e.g. declarations for the markup to parse).
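Tag inference is easy to see from script (a browser-only sketch; the fragment is made up): feed the HTML parser markup with the optional tags left out and inspect what it inferred.

    // The HTML parser infers the tags the author left out: html, head,
    // body, tbody, and the close tags of each td and tr.
    const doc = new DOMParser().parseFromString(
      "<table><tr><td>one<td>two",
      "text/html"
    );
    console.log(doc.body.innerHTML);
    // <table><tbody><tr><td>one</td><td>two</td></tr></tbody></table>

An XML parser would reject the same input outright, which is why "just parse HTML with the XML parser" never worked for real-world pages.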
> JC: [...] But because it's so much work to write an SGML parser, you end up having one SGML parser that supports everything needed for a huge variety of applications.
Well, I've got news: there's a new implementation of SGML (mine) at [2].
> But after you got used to manipulating the DOM in JavaScript with XPath, you begin to wonder what you ever needed XSLT for in the first place, and why you don't just write a nice flexible XML transformation library in JavaScript and forget about XSLT
My thoughts exactly. Though I've done pretty complicated XSLTs (and occasionally still do), JavaScript was designed for DOM manipulation, and given that XSLT is Turing-complete anyway, there's not much benefit in using it over JavaScript, except for XML literals and, if we're being generous, maybe as a target language for code generation, it being itself based on XML. Ironically, the newest Web frameworks have all invented their own HTML-in-JavaScript notation, e.g. react's JSX to drive virtual DOM creation, even though JavaScript started from day one with the principal design goal of being a DOM manipulation language.
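The irony is easy to see side by side (a minimal, made-up component): the hand-written DOM equivalent of a small JSX template like <ul class="books"><li>{title}</li></ul> is just the element-building JavaScript the language has had since the beginning.

    // Building the same tree a JSX template describes, directly
    // against the DOM API.
    function bookList(titles: string[]): HTMLUListElement {
      const ul = document.createElement("ul");
      ul.className = "books";
      for (const title of titles) {
        const li = document.createElement("li");
        li.textContent = title;
        ul.appendChild(li);
      }
      return ul;
    }

    document.body.appendChild(bookList(["Computer Lib", "Dream Machines"]));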
> My (completely imaginary) impression of the XSLT committee is that there must have been representatives of several different programming languages (Lisp, Prolog, C++, RPG, Brainfuck, etc) sitting around the conference table facing off with each other, and each managed to get a caricature of their language's cliche cool programming technique hammered into XSLT
+1. Though to be fair, XSLT has worked well for the things I did with it, and version 1 at least is very portable. These days XSLT at W3C seems more like a one-man show, where Michael Kay is both the language specification lead and the provider of the only implementation (I wonder what has happened to W3C's stance of requiring at least two interoperable implementations). The user audience (publishing houses, mostly), however, seems OK with it, as I witnessed at a conference last year; and there's no doubt Michael really provides tons of benefit to the community.
The 1999 "source code" referred to above is in two parts: xu88, the design my group worked out in 1979, now called "Xanadu Green", described in my book "Literary Machines"; and a later design I repudiate, called "Udanax Gold", which the team at XOC (not under direction of Roger Gregory or myself) redesigned for four years until terminated by Autodesk. That's the one with the dual implementation in Smalltalk. They tried hard but not on my watch. Please distinguish between these two endeavors.
Glad to hear from you, and I welcome your details and corrections to the record!
What are your opinions about scripting languages (not just for implementation, but for runtime extensibility)? Are they necessary from the start? Is JavaScript or TypeScript sufficient?
Ted Nelson's ideas continue to point a road to the future, to a better internet. Sir Tim's Web has been creaking along almost since its inception, due to its fragile design. That you don't rank him alongside the other three only says that you don't understand - which is fine.
I didn't say his ideas aren't good. I'm saying I value execution more than ideas. Everybody has good ideas; few execute well. It's like somebody who had the idea to create a teleportation device, got absolutely nowhere with it in 50 years' time, and is now grumpy that the planes everybody uses are too slow. "If only they'd followed my idea of a teleportation device, everything would've been much better."
Wrong, as in: a false statement. There is evidence Ted has shipped software, so your statement is wrong. "The other three actually built stuff that is now everywhere. Ted never got anywhere." This is patently ridiculous; it's like accusing Socrates of not creating the perfect society. I'm proud to wear a -1 on this; it means I'm annoying at least one other ill-informed person.
What, like Google? It's Ted Nelson and we are on Hacker News, and the OP posted a comment showing they obviously already know how to use the internet, so what would be the point of me linking to sources? Ted Nelson, Computer Lib and Dream Machines: if you've not read it, what have you read?