Saturday, August 10
10:00am David Fuchs
What six orders of magnitude of space-time buys you
> TeX and MF were designed to run acceptably fast on computers with less than 1/1000th the memory and 1/1000th the processing power of modern devices. Many of the design trade-offs that were made are no longer required or even appropriate.
An absolutely plain-vanilla TeX, exactly as Knuth wrote it and as my tool chain compiles it, composes all 495 pages of The TeXbook in 0.300 seconds on a 2012 MacBook Pro laptop (the first "Retina" model). Single-threaded, composing pages in 0.6 msec each, running in well under 1 megabyte total for code and data. Back on the SAIL mainframe (a DEC PDP-10) that Knuth used to develop TeX, it was almost exactly 1000 times slower: the pages would tick by every half-second or so (at night, anyway).
Of course, nowadays we also have lots more memory to throw around. One cool idea is to modify TeX to retain all the internal data structures for the pages it has created, and run a nice screen viewer directly from that. Doug McKenna gave a slick presentation at the recent Palo Alto / Stanford TUG meeting of just such a system, which he created to enable live viewing of his Hilbert Curves textbook; it can display figures of space-filling fractal curves that can be zoomed arbitrarily, which is simply impossible to do via PDF.
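Roughly, the idea looks like this (a toy sketch with made-up names, not McKenna's code or mine): instead of discarding each page's node list at ship-out time, keep it, and let the viewer draw from the live geometry, so zooming is just a scale factor rather than re-rasterizing fixed output.

```c
/* Toy sketch of a "retained pages" viewer (all names hypothetical).
   Real TeX keeps boxes, glue, and chars in one big memory array;
   here each page is just a linked list of positioned items. */
#include <stdlib.h>

typedef struct Item {
    double x, y, w, h;      /* position and size, in points */
    int    what;            /* character, rule, image, ... */
    struct Item *next;
} Item;

typedef struct { Item *head; } Page;

static Page  *pages;
static size_t n_pages;

/* Where ship_out would serialize to DVI/PDF, retain the list instead. */
void retain_page(Item *head) {
    pages = realloc(pages, (n_pages + 1) * sizeof *pages);
    pages[n_pages++].head = head;
}

/* The viewer re-draws from exact geometry at any zoom, so a fractal
   figure built from TeX rules stays sharp at any magnification. */
void draw_page(size_t p, double zoom, void (*draw)(const Item *, double)) {
    for (const Item *it = pages[p].head; it; it = it->next)
        draw(it, zoom);
}
```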
Going further, you can additionally modify TeX so it takes snapshots of its internal state after every page, and is able to pop back to any of these states. Presto, now if the user makes an edit on page 223, TeX can quickly back up to the state of the world just before page 223, and continue on forward with the modified input. Page 223 gets recomposed and immediately redisplayed, essentially in real-time. Of course, the trick here is creating and storing the snapshots efficiently; the TUG demo I gave using The TeXbook runs in a few hundred megabytes, and does the whole "pop back, recompose a page, redisplay it" rigamarole in milliseconds.
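To make that concrete, here's a deliberately naive sketch (illustrative names only; the real work is in avoiding these full copies, e.g. by sharing unchanged state between snapshots):

```c
/* Naive per-page checkpointing. A real implementation would use
   structure sharing or copy-on-write instead of copying everything. */
#include <stdlib.h>
#include <string.h>

#define STATE_BYTES (1u << 20)     /* TeX's mutable arrays fit in ~1 MB */
#define MAX_PAGES   10000

static unsigned char state[STATE_BYTES];    /* stand-in for eqtb, mem, ... */
static unsigned char *snap[MAX_PAGES + 1];  /* snap[k] = state after page k;
                                               snap[0] = initial state */
static int last_page;

/* Called right after page `p` is shipped out. */
void take_snapshot(int p) {
    unsigned char *copy = malloc(STATE_BYTES);
    memcpy(copy, state, STATE_BYTES);
    snap[p] = copy;
    last_page = p;
}

/* An edit first affects page p: restore the world as of the end of
   page p-1, discard the now-stale snapshots, and recompose forward. */
void rollback_and_recompose(int p) {
    memcpy(state, snap[p - 1], STATE_BYTES);
    for (int k = p; k <= last_page; k++) { free(snap[k]); snap[k] = NULL; }
    last_page = p - 1;
    /* compose_pages_from(p);   hypothetical: run the composer forward */
}
```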
The bad news is that my stuff is still in the proof-of-concept stage, as there's no support for the well-established extensions to Knuth's TeX (importing graphics, using system fonts, etc.) that are required by the vast majority of LaTeX users. I don't expect any of these features to slow things down appreciably, but time will tell. I intend to do a "Show HN" by and by, with lots more details, when it's able to handle real-world documents.
My apologies for failing to successfully fly under the radar until things were ready for prime time. My premature TUG demo was intended to wow Prof. Knuth sufficiently that he'd approve of a decades-late Ph.D. for me. (Happily, he did agree, contingent on just one additional feature being added...)
> Presto, now if the user makes an edit on page 223, TeX can quickly back up to the state of the world just before page 223, and continue on forward with the modified input. Page 223 gets recomposed and immediately redisplayed, essentially in real-time.
Isn't it possible, in the worst case, that editing the source that maps to page 223 could force re-rendering to start arbitrarily far before page 223? Like if you wrote all 223 pages without any chapters, parts, \newpage, etc. How does your program handle this?
Sure. It seems best to redisplay quickly, then update the screen again once everything is quiescent (the user hasn’t typed anything for a few tenths of a second, and the whole document has been fully recompiled with no changes detected). Usually it’s not even noticeable, though of course there are degenerate cases where a document oscillates between two layouts; that gets called out in the UI in the unlikely case it happens.
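Concretely, the policy looks something like this (a rough sketch; all the hook names are made up):

```c
/* Sketch of the display policy: show the edited page immediately,
   keep recomposing in the background, and do a final repaint only
   once the user is idle and a full pass over the document produces
   no further page changes. */
#include <stdbool.h>

#define QUIET_MS   200   /* "a few tenths of a second" of keyboard idle */
#define PASS_LIMIT 8     /* give up expecting the layout to settle */

/* Hypothetical hooks into the editor and the composing engine. */
extern long now_ms(void);
extern long last_keystroke_ms(void);
extern int  total_pages(void);
extern bool recompose_page(int p);   /* true if page p's output changed */
extern void repaint_all(void);
extern void warn_oscillating(void);

static int passes_since_edit;        /* reset to 0 on every keystroke */

void background_pass(void) {
    bool changed = false;
    for (int p = 1; p <= total_pages(); p++)
        changed |= recompose_page(p);
    passes_since_edit++;

    if (!changed && now_ms() - last_keystroke_ms() > QUIET_MS)
        repaint_all();               /* quiescent: final, settled repaint */
    else if (changed && passes_since_edit > PASS_LIMIT)
        warn_oscillating();          /* layout never settles; flag it in the UI */
}
```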