Hacker News

Reading documents from 20 years ago is a mixed bag. Links usually fail horribly, which was something Xanadu was trying to solve, but I'm not convinced they could have solved it so well that 20-year-old links would still actually work in practice.

I've always tried to write documents in a simple format that's easy to translate to newer formats, and minimizes noise and scaffolding and boilerplate.

When we were developing the HyperTIES hypermedia browser in 1988 [1] at the UMD HCIL, we considered using SGML as the markup language, but decided against it, because we were focusing on designing a system that made it easy for normal people to author documents, and working with SGML took a lot of tooling at the time. (It was great for publishing Boeing's 747 reference manual, but not for publishing poetry and cat pictures.) So we designed our own markup language. [2]

[1] Designing to Facilitate Browsing: A Look Back at the Hyperties Workstation Browser: http://www.donhopkins.com/drupal/node/102

[2] HyperTIES documentation directory: http://donhopkins.com/home/ties/doc/ typical document files: http://donhopkins.com/home/ties/doc/whyanew.st0 http://donhopkins.com/home/ties/doc/formatcommand.st0

It's not which scripting language you have, it's that you have a scripting language at all that's important. HyperTIES was actually implemented in C, plus 3 different scripting languages: FORTH for the markup language interpreter and formatter [3], PostScript for the user interface and display driver and embedded applets [4], and Emacs MockLisp for the authoring tool [5].

[3] HyperTIES Forth code: http://donhopkins.com/home/ties/doc/formatter.st0 http://donhopkins.com/home/ties/fmt.f

[4] HyperTIES PostScript code: http://donhopkins.com/home/ties/doc/tnformat.st0 http://donhopkins.com/home/ties/fmt.ps http://donhopkins.com/home/ties/target.ps

[5] HyperTIES MockLisp code: http://donhopkins.com/home/ties/yahtittie.ml

When you try to design something from the start without a scripting language, like a hypermedia browser or authoring tool, or even a window system or user interface toolkit, you end up getting fucked by Greenspun's Tenth Rule. [6]

[6] Greenspun's Tenth Rule: Any sufficiently complicated C or Fortran program contains an ad-hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp. https://en.wikipedia.org/wiki/Greenspun%27s_tenth_rule

But when you start from day one with a scripting language, you can relegate all the flexible scripty stuff to that language, and don't have to implement a bunch of incoherent lobotomized almost-but-not-quite-turing-complete kludgy mechanisms (like using X Resources for event handler bindings and state machines, or the abomination that is XSLT, etc).

Tcl/Tk really hit the nail on the head in that respect. Tcl isn't a great language design (although it does have its virtues: a clean, simple C API, excellent string processing, and a well-written implementation of a mediocre design), but its ubiquitous presence made the Tk user interface toolkit MUCH simpler yet MUCH more extensible, by orders of magnitude compared to all the X11 toolkits of the time. Tk can seamlessly call back into Tcl with strings as event handlers and data, so there's no need for any of the ridiculous, useless, brittle contraptions that the X Toolkit Intrinsics tried to provide.
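That string bridge is still visible today in Python's stdlib tkinter, which embeds a real Tcl interpreter. A minimal sketch (no GUI needed; the `greet` proc is just an invented example):

```python
import tkinter

# tkinter.Tcl() gives a bare Tcl interpreter without initializing the Tk GUI.
tcl = tkinter.Tcl()

# In Tcl, code and data are both strings, which is what made it so easy for
# Tk (or any C host) to hand event handlers and values back and forth.
tcl.eval('proc greet {name} { return "hello, $name" }')
print(tcl.eval('greet world'))  # hello, world
```

Registering a Tk event handler works the same way underneath: the binding is just a string of Tcl handed back to the interpreter when the event fires.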

The web was pretty crippled before JavaScript and DHTML came along. Before there was client-side JavaScript, there were server-side scripting languages, like Perl, PHP, Python, Frontier (Radio UserLand) [7], HyperTalk, etc.

Frontier / Manila / Radio UserLand was a programmable authoring tool, content management system, and web server with a built-in scripting language (UserTalk, integrated with an outliner and object database). That scriptability enabled Dave Winer and others to rapidly prototype and pioneer technologies such as blogging, RSS, podcasting, XML-RPC, SOAP, OPML, serving dynamic web sites and services, exporting static web sites and content, etc.

[7] Frontier: https://en.wikipedia.org/wiki/UserLand_Software#Frontier Manila: https://en.wikipedia.org/wiki/UserLand_Software#Manila Radio UserLand: https://en.wikipedia.org/wiki/Radio_UserLand

One of the coolest early applications of server side scripting was integrating HyperCard with MacHTTP/WebStar, such that you could publish live interactive HyperCard stacks on the web! Since it was based on good old HyperCard, it was one of the first scriptable web authoring tools that normal people and even children could actually use! [8]

[8] MacHTTP / WebStar from StarNine by Chuck Shotton, and LiveCard HyperCard stack publisher: https://news.ycombinator.com/item?id=7865263 CGI and AppleScript: http://www.drdobbs.com/web-development/cgi-and-applescript/1...

That inspired me to do some similar stuff with another ill-fated scripting language, Kaleida ScriptX. [9]

[9] ScriptX and the World Wide Web: "Link Globally, Interact Locally": http://www.art.net/~hopkins/Don/lang/scriptx/scriptx-www.htm... Demo by Don Hopkins of DreamScape on Kaleida Labs ScriptX presented at the 1995 Apple World Wide Developer Conference. https://www.youtube.com/watch?v=5NytloOy7WM Kaleida ScriptX: https://en.wikipedia.org/wiki/Kaleida_Labs#ScriptX




Thanks for your comprehensive answer.

I guess it's a matter of perspective whether you like the procedural Web (the developer/creative perspective) or not (the perspective of the consumer who gets all kinds of scripts for tracking, mining, phishing, and other nefarious purposes, all the while not being able to save something for later reading).

I have no doubt JavaScript was absolutely necessary to develop the Web to the point it is today. But I had hoped that development of HTML (the markup language) would keep up to eventually provide declarative means to achieve some of what only JavaScript can do, by sort of consolidating UI idioms and practices based on experience gained from JavaScript. But by and large this hasn't happened.

What has happened instead is that JavaScript-first development has taken over the Web since about 2010 (I like React myself when it's a good fit, so I'm not saying this as a grumpy old man). And today there's no coherent vision of what the Web should be; there's no initiative left to drive the Web forward, except for a very few parties/monopolies who benefit from the Web's shortcomings (in terms of privacy, lack of security, its requirement of a Turing-complete scripting environment for even the most basic UI tasks, etc.).

> the abomination that is XSLT

Not trying to defend XSLT (which I find to be a mixed bag), but you're aware that its precursor was DSSSL (Scheme), with pretty much a one-to-one correspondence of language constructs and symbol names, aren't you?


In an ideal world we would all be using s-expressions and Lisp, but for now XML and JSON fill the need for language-independent data formats.

>Not trying to defend XSLT (which I find to be a mixed bag), but you're aware that its precursor was DSSSL (Scheme), with pretty much a one-to-one correspondence of language constructs and symbol names, aren't you?

The mighty programmer James Clark wrote the de facto reference SGML parser and DSSSL implementation, was technical lead of the XML working group, and also helped design and implement XSLT and XPath (not to mention expat, TREX / RELAX NG, etc.)! DSSSL was totally flexible and incredibly powerful, but massively complicated, and you had to know Scheme, which blew a lot of people's minds. But the major factor that killed SGML and DSSSL was the emergence of HTML, XML and XSLT, which were orders of magnitude simpler.

James Clark: http://www.jclark.com/ https://en.wikipedia.org/wiki/James_Clark_(programmer)

There's a wonderful DDJ interview with James Clark called "A Triumph of Simplicity: James Clark on Markup Languages and XML" where he explains how a standard has failed if everyone just uses the reference implementation, because the point of a standard is to be crisp and simple enough that many different implementations can interoperate perfectly.

A Triumph of Simplicity: James Clark on Markup Languages and XML: http://www.drdobbs.com/a-triumph-of-simplicity-james-clark-o...

I think it's safe to say that SGML and DSSSL fell short of that sought-after simplicity, and XML and XSLT were the answer to that.

"The standard has to be sufficiently simple that it makes sense to have multiple implementations." -James Clark

My (completely imaginary) impression of the XSLT committee is that there must have been representatives of several different programming languages (Lisp, Prolog, C++, RPG, Brainfuck, etc) sitting around the conference table facing off with each other, and each managed to get a caricature of their language's cliche cool programming technique hammered into XSLT, but without the other context and support it needed to actually be useful. So nobody was happy!

Then Microsoft came out with MSXML, with an XSL processor that let you include <script> tags in your XSLT documents to do all kinds of magic stuff by dynamically accessing the DOM and performing arbitrary computation (in VBScript, JavaScript, C#, or any IScriptingEngine compatible language). Once you hit a wall with XSLT you could drop down to JavaScript and actually get some work done. But after you got used to manipulating the DOM in JavaScript with XPath, you begin to wonder what you ever needed XSLT for in the first place, and why you don't just write a nice flexible XML transformation library in JavaScript and forget about XSLT.

XSLT Stylesheet Scripting Using <msxsl:script>: https://docs.microsoft.com/en-us/dotnet/standard/data/xml/xs...
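For reference, the escape hatch looked roughly like this (a sketch from memory of the MSXML extension; the `user` prefix, its URN, and the `shout` function are made-up examples):

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:msxsl="urn:schemas-microsoft-com:xslt"
    xmlns:user="urn:my-scripts">

  <!-- Arbitrary script, callable from XPath expressions below. -->
  <msxsl:script language="JScript" implements-prefix="user">
    function shout(s) { return s.toUpperCase(); }
  </msxsl:script>

  <xsl:template match="title">
    <h1><xsl:value-of select="user:shout(string(.))"/></h1>
  </xsl:template>
</xsl:stylesheet>
```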

Excerpts from the DDJ interview (it's fascinating -- read the whole thing!):

>DDJ: You're well known for writing very good reference implementations for SGML and XML Standards. How important is it for these reference implementations to be good implementations as opposed to just something that works?

>JC: Having a reference implementation that's too good can actually be a negative in some ways.

>DDJ: Why is that?

>JC: Well, because it discourages other people from implementing it. If you've got a standard, and you have only one real implementation, then you might as well not have bothered having a standard. You could have just defined the language by its implementation. The point of standards is that you can have multiple implementations, and they can all interoperate.

>You want to make the standard sufficiently easy to implement so that it's not so much work to do an implementation that people are discouraged by the presence of a good reference implementation from doing their own implementation.

>DDJ: Is that necessarily a bad thing? If you have a single implementation that's good enough so that other people don't feel like they have to write another implementation, don't you achieve what you want with a standard in that all implementations — in this case, there's only one of them — work the same?

>JC: For any standard that's really useful, there are different kinds of usage scenarios and different classes of users, and you can't have one implementation that fits all. Take SGML, for example. Sometimes you want a really heavy-weight implementation that does validation and provides lots of information about a document. Sometimes you'd like a much lighter weight implementation that just runs as fast as possible, doesn't validate, and doesn't provide much information about a document apart from elements and attributes and data. But because it's so much work to write an SGML parser, you end up having one SGML parser that supports everything needed for a huge variety of applications, which makes it a lot more complicated. It would be much nicer if you had one SGML parser that is perfect for this application, and another SGML parser that is perfect for this other application. To make that possible, the standard has to be sufficiently simple that it makes sense to have multiple implementations.

>DDJ: Is there any markup software out there that you like to use and that you haven't written yourself?

>JC: The software I probably use most often that I haven't written myself is Microsoft's XML parser and XSLT implementation. Their current version does a pretty credible job of doing both XML and XSLT. It's remarkable, really. If you said, back when I was doing SGML and DSSSL, that one day, you'd find as a standard part of Windows this DLL that did pretty much the same thing as SGML and DSSSL, I'd think you were dreaming. That's one thing I feel very happy about, that this formerly niche thing is now available to everybody.


> But the major factor that killed SGML and DSSSL was the emergence of HTML, XML and XSLT, which were orders of magnitude simpler.

That interview is wonderful, but in 2018, while XML has been successful in lots of fields, it has failed on the Web. SGML remains the only standardized and broadly applicable technique to parse HTML (short of ad-hoc HTML parser libraries) [1]. HTML isn't really simple; it requires full SGML tag inference (as in, you can leave out many tags, and the parser must infer their presence), SGML attribute minimization (as in `<option selected>`), and other forms of minimization only possible in the presence of a DTD (e.g. declarations for the markup to parse).
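You can see the inference burden with Python's stdlib html.parser, which reports only what's literally in the markup and leaves the implied `</option>` end tags (and the value of `selected`) for the application to reconstruct; a small sketch:

```python
from html.parser import HTMLParser

class ShowEvents(HTMLParser):
    def handle_starttag(self, tag, attrs):
        print('start', tag, attrs)
    def handle_endtag(self, tag):
        print('end', tag)

# Minimized HTML: no </option> end tags, no value for "selected".
ShowEvents().feed('<select><option selected>A<option>B</select>')
# start select []
# start option [('selected', None)]
# start option []
# end select
```

Only one end event fires, for the one end tag actually present; an SGML parser with a DTD would instead synthesize the missing `</option>` events for you.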

> JC: [...] But because it's so much work to write an SGML parser, you end up having one SGML parser that supports everything needed for a huge variety of applications.

Well, I've got news: there's a new implementation of SGML (mine) at [2].

> But after you got used to manipulating the DOM in JavaScript with XPath, you begin to wonder what you ever needed XSLT for in the first place, and why you don't just write a nice flexible XML transformation library in JavaScript, and forget about XSLT

My thoughts exactly. Though I've done pretty complicated XSLTs (and occasionally still do), JavaScript was designed for DOM manipulation, and given that XSLT is Turing-complete anyway, there's not that much benefit in using it over JavaScript, except for XML literals and, if we're being generous, maybe as a target language for code generation, it being itself based on XML. Ironically, the newest Web frameworks have all invented their own HTML-in-JavaScript notation, e.g. React's JSX to drive virtual DOM creation, even though JavaScript started from day one with the principal design goal of being a DOM manipulation language.
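To make the comparison concrete, here's the kind of template-style transform XSLT is built around, written directly against a DOM-ish API instead; a stdlib Python sketch (ElementTree standing in for the browser DOM, input invented):

```python
import xml.etree.ElementTree as ET

# Toy source document; an XSLT stylesheet would match each <item> with a template.
src = ET.fromstring('<items><item>alpha</item><item>beta</item></items>')

# The "template": build the output tree with ordinary code.
ul = ET.Element('ul')
for item in src.findall('.//item'):  # ElementTree's limited XPath subset
    ET.SubElement(ul, 'li').text = item.text

print(ET.tostring(ul, encoding='unicode'))
# <ul><li>alpha</li><li>beta</li></ul>
```

A dozen lines of host-language code covers what the equivalent stylesheet would need templates, `xsl:for-each`, and `xsl:value-of` to say.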

> My (completely imaginary) impression of the XSLT committee is that there must have been representatives of several different programming languages (Lisp, Prolog, C++, RPG, Brainfuck, etc) sitting around the conference table facing off with each other, and each managed to get a caricature of their language's cliche cool programming technique hammered into XSLT

+1. Though to be fair, XSLT has worked well for the things I did with it, and version 1 at least is very portable. These days XSLT at W3C seems more like a one-man show, where Michael Kay is both the language specification lead and the provider of the only implementation (I wonder what has happened to W3C's stance of requiring at least two interoperable implementations). The user audience (publishing houses, mostly), however, seems OK with it, as I witnessed at a conference last year; and there's no doubt Michael provides tons of benefit to the community.

[1]: http://sgmljs.net/blog/blog1701.html

[2]: http://sgmljs.net/docs/sgmlrefman.html



