
X: The First Fully Modular Software Disaster - handpickednames
http://www.art.net/~hopkins/Don/unix-haters/x-windows/disaster.html
======
pjc50
Ah, this is entertaining to contrast with
[https://news.ycombinator.com/item?id=15031814](https://news.ycombinator.com/item?id=15031814)
"Why did software go off the rails". The "UNIX haters handbook" is extremely
old, as if you couldn't tell from the references to Reagan and the 50-MIPS
workstation (roughly equivalent to a $1 Cortex-M0 in today's money).

And in many ways they're not wrong:

\- "X has defeated us by switching the meaning of client and server"

\- "most computers running X run just four programs: xterm, xload, xclock, and
a window manager" : the intervening 20 years have added a web browser. Almost
all software run by the user is in either the xterm or the browser.

\- ICCCM is hilariously complicated, although cut-and-paste largely works now.
It's just that there are two different cut-and-paste mechanisms in play.
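The two mechanisms are the PRIMARY selection (select to copy, middle-click to paste) and the CLIPBOARD selection (explicit copy/paste commands); they hold independent contents at the same time. With the third-party `xclip` tool you can poke at both, assuming a running X session:

```shell
# Fill the two selections with different contents
echo "middle-click paste" | xclip -selection primary
echo "ctrl-v paste" | xclip -selection clipboard

# Read them back independently
xclip -o -selection primary
xclip -o -selection clipboard
```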

\- client/server division of labour is _still_ being fought over on the web

\- X authentication is painful to do with the original mechanism, but was
eventually fixed by ssh X forwarding
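Concretely, ssh's X forwarding tunnels the X11 connection over the ssh channel
and sets up DISPLAY and the xauth cookie on the remote side automatically; a
minimal sketch, with placeholder host and user names:

```shell
# One-off: run a remote X client, displayed locally
ssh -X user@remote.example.com xclock

# Or enable it per host in ~/.ssh/config:
# Host remote.example.com
#     ForwardX11 yes
```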

\- Xdefaults is a great mystery, largely obsoleted by the GNOME people with
their own mysterious pseudo-registries and, god help you, polkit

\- X still has trouble with the wide variety of graphics hardware out there,
although you can usually get 2 monitors working eventually.

\- "NeWS and NEXTSTEP were political failures because they suffer from the
same two problems: oBNoXiOuS capitalization, and Amiga Persecution
Attitude(TM)." : basically true, until they dropped the attitude and capitals
and became Cocoa.

~~~
astrodust
It also ignited debate over what it was even called. Is it the "X-Window
System", or abbreviated, "X-Window" or, like the British and their "maths", is
it "X-Windows"? Not to be confused with "Windows®" of course.

~~~
tranquilpig
I think the US is the odd one out here with your one unit of 'math'.

Mathematic _s_ -> math _s_

You don't say, "I am learning mathematic", unless you are a cretin.

~~~
DSMan195276
It has an 's' at the end, but it is still treated as a singular noun in most
contexts. You don't say "My favorite subjects are mathematics", you say "My
favorite subject is mathematics".

~~~
mort96
To be fair, you would also not say “My favorite subjects are games”; there’s
only one subject, called “games”, even though the word “games” is a plural
form of “game”.

I can’t really come up with a situation where you would want to refer to
multiple “mathematicses”; it kind of just is something ethereal, not a _thing_
where it makes sense to refer to one or many.

~~~
DanBC
Hypothetical_Bob: "I like applied and pure maths. Mathss are my favourite
subjects. I don't know which one to pick for my A levels".

------
bonyt
> X has had its share of $5,000 toilet seats -- like Sun's Open Look clock
> tool, which gobbles up 1.4 megabytes of real memory! If you sacrificed all
> the RAM from 22 Commodore 64s to clock tool, it still wouldn't have enough
> to tell you the time. Even the vanilla X11R4 "xclock" utility consumed 656K
> to run. And X's memory usage is increasing.

I just opened up "clocktab.com" on Chrome, the first simple clock I could find
by googling "clock," and Chrome's task manager shows it using a full 42
megabytes of memory.

~~~
pvillano
We should have a competition for smallest analog clock in terms of running
memory.

~~~
c-smile
Here is one, as Sciter renders it: [https://sciter.com/temp/sciter-analog-clock.png](https://sciter.com/temp/sciter-analog-clock.png)
\- it takes 50 MB of memory for the whole process.

This clock sample ( [https://github.com/c-smile/sciter-
sdk/blob/master/samples/gr...](https://github.com/c-smile/sciter-
sdk/blob/master/samples/graphics/analog-clock.htm) ) is a port of Mozilla's
clock :
[https://codepen.io/anon/pen/vJpqOY](https://codepen.io/anon/pen/vJpqOY)

Chrome needs 6 separate processes to run this sample, with a total memory
consumption of 575 MB.

~~~
bonyt
I fired up Windows 2.03 in DOSBox, with its clock app, available on
archive.org[1], and it took up 60.1 MB. This is fun! I think the OS would run
fine (at least, enough for the clock) with as little as 512K of RAM[2].

Screenshot: [https://i.imgur.com/09xjyPs.png](https://i.imgur.com/09xjyPs.png)

[1]:
[https://archive.org/details/msdos_win2_03](https://archive.org/details/msdos_win2_03)

[2]: [https://support.microsoft.com/en-us/help/32905/windows-
versi...](https://support.microsoft.com/en-us/help/32905/windows-version-
history) ("512K of memory or greater" for 2.03).

~~~
c-smile
If you have a 640*480 screen and render without anti-aliasing, then you can
render on the CPU. Otherwise, with modern hardware and 200 ppi monitors, you
will need GPU rendering, with the whole infrastructure that entails.

------
paol
The Unix Haters Handbook is pretty interesting reading, but I remember that
the chapter on X was one of the weakest. It doesn't help that it starts with
the hoary old myth that X named the 'server' and 'client' backwards. More
relevant however is that from the vantage point of 2017, it becomes clear that
X is one of the most unreasonably successful software architectures of all
time.

Think about it: how many pieces of software design have remained in heavy use
worldwide for 30 years. The fact that no one bothered to replace it in all
that time is proof enough that whatever warts it has can't be _that_ painful.
(Now, finally, it seems like it's going to happen with Wayland, but it's not
there yet.)

Not only that, but computer display technology has changed massively over X's
lifetime. I'm writing this in a composited window manager hardware accelerated
via natively-3D hardware outputting to multiple monitors of different
resolutions over as many different digital display links. How much of this was
even imagined in the 1980s? Yet the X protocol, via its extension mechanisms,
has proven adaptable enough to take in massive changes in the technology, and
still keeps on ticking.

~~~
fsloth
"how many pieces of software design have remained in heavy use worldwide for
30 years. The fact that no one bothered to replace it in all that time is
proof enough that whatever warts it has can't be that painful. "

I'm not convinced modern software ecosystem optimizes for quality. Many CAD
packages used today have roots as old or older. The dominant CAD format (at
least in construction) is DWG which - I can tell you from experience - is
about as nice to work with on programmatic level as trying to skin rotten cod.
The data contained in this steaming heap of obfuscation is mostly trivial in
complexity. Just because we have tools in use does not mean they could not be
better - it's rather that once something is in use, social proof, sunk cost
fallacy and some practical reasons kick in and development stagnates to
dealing with the kinks in the established system.

~~~
ansible
I was just ranting to some people yesterday about how with HDMI there are
still remnants of the original NTSC broadcast standard from nearly 80 years
ago. Even though we've (finally!) reached the point where my digital computer
is sending a digital image to my digital monitor with no unnecessary analog
conversions in between.

~~~
jandrese
One of my "favorite" parts of HDMI is that you can specify the resolution in
two different ways depending on if you are connected to a computer monitor or
a TV. Choosing the right one is sometimes important: a resolution might not
work on your display if you specify it using the other format.

~~~
wnoise
That sounds absolutely crazy, so I believe it. Can you elaborate though?

~~~
jandrese
Look up CEA vs. DMT video modes. Typically this can become a problem when you
need to specify the video mode before connecting the display, or if you're
running through a KVM.
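To make the split concrete: the same 1920x1080@60 panel can be addressed
either as a TV (CEA) mode or as a monitor (DMT) mode. The Raspberry Pi
firmware's config.txt exposes this directly; the mode numbers below are from
its documentation as I recall them, so treat them as an assumption:

```
# /boot/config.txt -- pick ONE of the two blocks

# 1080p60 as a CEA ("TV") mode
hdmi_group=1
hdmi_mode=16

# 1080p60 as a DMT ("monitor") mode
hdmi_group=2
hdmi_mode=82
```

If the display only implements one of the two tables, picking the wrong group
is exactly the "resolution might not work" failure described above.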

~~~
mietek
Dimethyltryptamine video modes?… Sounds about right.

------
ethbro
I'd say that X is the foremost example of actual software evolution.

In the sense that (in the words of my pathologist father when I asked him how
hard drug discovery and design was),

 _" Nature is lazy before it's intelligent. At every turn, it would rather
reuse a system designed for a completely different purpose than design
something new from the ground up. Which is how we end up with single pathways
affecting six completely unrelated systems in the body."_

~~~
krylon
> Nature is lazy before it's intelligent. At every turn, it would rather reuse
> a system designed for a completely different purpose than design something
> new from the ground up.

So we can blame all the crazy things we do with computers and software on
nature? ;-)

But seriously, that is a great insight. Both where evolution is concerned, and
software development. (And other branches of engineering, too, I bet.)

------
jandrese
I'm probably one of the only people who liked the .Xresources file.

Sure, it's a text file and you use wildcard patterns to specify objects, but I
found it easy to customize, and because much of it was automatic there would
be a ton of UI
variables in it. You want a different color on the background of a particular
text box? No problem. You want to change the font used in the app? No problem.
You need to customize the contents of a menu? That might be possible! Every
application used the same file so you only had to figure it out once.

Granted, you would leave 99% of the variables alone, but it was nice to have
the option if you needed it.

Interestingly enough, I still have the file on my box, although I doubt it
gets used much anymore. Looking at the file I forgot how you could do a sort
of primitive themeing by specifying broad wildcards like "*
MenuButton.Background" and "* Toggle.Font"[1]. That's probably a bit dangerous
but I can't remember it ever exploding on me spectacularly. Probably because
Athena widgets were so damn minimal to begin with.
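That broad-wildcard matching can be sketched in a few lines; this is a toy
illustration of the loose (`*`) versus tight (`.`) binding idea, not the real
Xlib resource matcher (which also distinguishes class from instance names and
supports `?` wildcards):

```python
import re

COMPONENT = r'[A-Za-z0-9_-]+'

def pattern_to_regex(pattern: str) -> str:
    """Translate an X-resource-style pattern into a regex.

    '.' is a tight binding (names exactly one level apart);
    '*' is a loose binding (any number of components in between).
    """
    parts = re.split(r'([*.])', pattern)
    out = []
    for i, part in enumerate(parts):
        if part == '.':
            out.append(r'\.')                         # exactly one level
        elif part == '*':
            if i == 1 and parts[0] == '':
                out.append(rf'(?:{COMPONENT}\.)*')    # leading '*'
            else:
                out.append(rf'\.(?:{COMPONENT}\.)*')  # interior '*'
        elif part:
            out.append(re.escape(part))
    return ''.join(out)

def matches(pattern: str, resource: str) -> bool:
    return re.fullmatch(pattern_to_regex(pattern), resource) is not None

# A broad wildcard themes every matching widget, however deeply nested:
assert matches("*MenuButton.background", "xterm.form.MenuButton.background")
assert matches("Netscape4*blinkingEnabled", "Netscape4.navigator.blinkingEnabled")
# A tight binding never reaches across levels:
assert not matches("xterm.font", "xterm.vt100.font")
```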

Oh, man it even still has:

    
    
       Netscape4*blinkingEnabled: False
       Netscape4*myshopping.isEnabled: False
    

[1] The spaces aren't there in the file, they're just necessary to work around
the way HN's markdown does italics.

------
PLenz
Excerpted from Eric S. Raymond's The Unix Hater’s Handbook, Reconsidered
(2008)[[http://esr.ibiblio.org/?p=538](http://esr.ibiblio.org/?p=538)]: This
chapter begins unfortunately, with complaints about X’s performance and memory
usage that seem rather quaint when comparing it to the applications of 14
years later. It continues with a fling at the sparseness of X applications
circa 1990 which is unintentionally funny when read from within evince on a
Linux desktop also hosting the Emacs instance I’m writing in, a toolbar with
about a dozen applets on it, and a Web browser.

I judge that the authors’ rejection of mechanism/policy separation as a
guiding principle of X was foundationally mistaken. I argued in The Art of
Unix Programming that this principle gives X an ability to adapt to new
technologies and new thinking about UI that no competitor has ever matched. I
still think that’s true.

But not all the feces flung in this chapter is misdirected; Motif really did
suck pretty badly, it’s a good thing it’s dead. ICCCM is about as horrible as
the authors describe, but that’s hard to notice these days because modern
toolkits and window managers do a pretty good job of hiding the ugliness from
applications.

Though it’s not explicitly credited, I’m fairly sure most of this chapter was
written by Don Hopkins. Don is a wizard hacker and a good man who got caught
on the wrong side of history, investing a lot of effort in Sun’s NeWS just
before it got steamrollered by X, and this chapter is best read as the same
bitter lament for NeWS I heard from him face to face in the late 1980s.

Don may have been right, architecturally speaking. But X did not win by
accident; it clobbered NeWS essentially because it was open source while NeWS
was not. In the 20 years after 1987 that meant enough people put in enough
work that X got un-broken, notably when Keith Packard came back after 2001 and
completely rewrote the rendering core. The nasty resources system is pretty
much bypassed by modern toolkits. X-extension hell and the device portability
problems the authors were so aggrieved by turned out to be a temporary
phenomenon while people were still working on understanding the 2D-graphics
problem space.

That having been said, Olin Shivers’s rant about xauth is still pretty funny
and I’m glad I haven’t had to use it in years.

------
avolcano
Uh, as a frontend developer, I am kind of _shocked_ I've never read this
before. There are a lot of fascinating parallels to modern web development.
zitterbewegung noted the similarity to complaints about Electron apps' memory
consumption, for example, but:

> The right graphical client/server model is to have an extensible server.
> Application programs on remote machines can download their own special
> extension on demand and share libraries in the server. Downloaded code can
> draw windows, track input events, provide fast interactive feedback, and
> minimize network traffic by communicating with the application using a
> dynamic, high-level protocol.

Certainly sounds a heck of a lot like how web applications work (even though
we're currently terrible at sharing libraries, heh).

> X gave programmers a way to display windows and pixels, but it didn't speak
> to buttons, menus, scroll bars, or any of the other necessary elements of a
> graphical user interface. Programmers invented their own. Soon the Unix
> community had six or so different interface standards.

Now _that's_ certainly familiar. Sure, the DOM is a heck of a lot closer to a
platform for displaying complex UIs than X was, but it still falls so far
short of what developers need that a plethora of frameworks, UI libraries,
etc. have appeared and fragmented the community. You could also stretch a bit
and say things like Google's work on web components are an attempt at a Motif-
like standardization around one questionable standard, but I don't know if I'm
quite cynical enough to make that jump.

> Even if you can get an X program to compile, there's no guarantee it'll work
> with your server. If an application requires an X extension that your server
> doesn't provide, then it fails. X applications can't extend the server
> themselves -- the extension has to be compiled and linked into the server.

While a lot of new browser features are polyfillable, a lot of the more
advanced ones (e.g. service workers) are not, and users and developers are at
the mercy of their browsers, much as users would be with their X servers.

> Myth: X is "Device Independent"

The quirks discussed in this section apply to responsive web apps too. There's
actually quite a bit of nuance in making fancy canvas, WebGL, or CSS
transforms that look good on retina screens, etc.

I'm sure none of these comparisons truly map 1:1 to X development (having
never done it myself), but damn if it doesn't remind me how cyclical software
development has been over the past few decades. Not that that's a bad thing,
just that some things are very, very hard :)

~~~
DonHopkins
>Certainly sounds a heck of a lot like how web applications work (even though
we're currently terrible at sharing libraries, heh).

NeWS was architecturally similar to what is now called AJAX, except that NeWS
coherently:

\+ used PostScript code instead of JavaScript for programming.

\+ used PostScript graphics instead of DHTML and CSS for rendering.

\+ used PostScript data instead of XML and JSON for data representation.

[https://en.wikipedia.org/wiki/NeWS](https://en.wikipedia.org/wiki/NeWS)

~~~
wolfgang42
In an alternate universe in which NeWS won out, someone developed a standard
viewer for networked documents which provided standard routines for layout and
document linking. Eventually, as more people got viewers, many applications
began to be distributed directly as client-side interactive PostScript
documents, eschewing the NeWS network protocol altogether in favor of PSON, a
text-based protocol which had the advantage of working correctly through
corporate firewalls. Someone developed a server-side runtime engine called
Node.ps, and many people jumped on the bandwagon, claiming that it made
sharing code between the client and server much easier. As PostScript
development became more complex, preprocessor tools began proliferating,
including a strongly-typed version of PostScript, known as TypeScript. Due to
PostScript's lack of a good standard library, a company named "NPM" started a
package repository, which was soon filled with tiny libraries for each
PostScript procedure, eventually leading to the "string-length debacle" when
an upset developer unpublished a five-line package...

~~~
DonHopkins
Arthur van Hoff wrote an object oriented C to PostScript compiler called PdB,
which is kind of like TypeScript!

[https://compilers.iecc.com/comparch/article/93-01-152](https://compilers.iecc.com/comparch/article/93-01-152)

------
kronos29296
So this means we really need Wayland. Hope that somebody puts an EOL on X like
Flash's. Only then will all those projects move to Wayland. Even the Python 2
to 3 migration took well over a decade, and we still haven't reached EOL.

~~~
nly
I dunno... modern X (XCB) doesn't seem so different to Wayland.

Both are asynchronous RPC protocols with bindings generated from a bunch of
XML specs. Both use shared memory aggressively. Both share a lot of concepts.
X just has more back-compat to deal with.

~~~
nhaehnle
People over-emphasize a lot of the problems X has. A lot of the real original
problems like high latency were evolved away. And while backwards
compatibility can be annoying, that's just what you get for trying to do
serious software engineering. So I agree with you that the differences between
X and Wayland are greatly exaggerated.

However, there is one _significant_ architectural difference between X and
Wayland that is quite difficult to evolve away. In modern X, there are
actually three processes involved: the client (app), the X server, and the
compositor. In Wayland, there are only two: the client (app) and the
compositor.

The X design is more fragile (state managed in 3 places instead of 2) hence
ICCCM, and has more latency and overhead. However, it made a lot of sense back
when the X server talked to the hardware directly and therefore needed both an
integrated driver and special privileges. It'd be silly to implement drivers
as part of compositors, so the separation of mechanism and policy was the
right way to go back then.

Since then, however, the mechanism has basically entirely moved into the
kernel (DRM/KMS) and Mesa (OpenGL). This happened in bits and pieces first
with the evolution of the DRI protocol and then the jump to kernel mode
setting.

 _That_ is the evolutionary development which led to the move to Wayland
making sense.

I suppose this 3-to-2-process transition _could_ have been evolved by
refactoring the core of the X server source (the whole protocol handling etc.)
into a library that compositors then simply link against. But that would have
been a truly herculean task with not very many natural intermediate steps.
Implementing something like Wayland from scratch on the modern Linux graphics
stack is actually _much_ less work -- except perhaps for transitioning all the
toolkits. Then again, the X server source lends itself fairly well to writing
various "X-on-something-else" servers (people have been doing that for a
while), so there's a natural if slightly awkward solution for backwards
compatibility.

So hacking up an initial prototype for Wayland could be done very quickly, but
note that actual adoption still took a long time. But the point is that the
Wayland path had more presentable intermediate steps (unlike the X server
refactor), so that's the path that software evolution took.

(Man, this got a lot longer than originally intended...)

~~~
nly
> In modern X, there are actually three processes involved: the client (app),
> the X server, and the compositor.

Is the compositor not optional?

~~~
nhaehnle
Even if you don't have a compositor you still have at least a window manager,
which has much the same issues (high latency, ICCCM).

------
gozur88
I liked xwindows very much. Still do. What other system allows you to use your
local window manager to control windows from multiple different machines?

~~~
pjmlp
RDP does.

~~~
Zardoz84
It isn't the same. With RDP, you are sharing a whole desktop. With the X
protocol, you can share WINDOWS! You don't need to run the whole window
manager/desktop on the remote machine. You can simply launch the X11
application and let the local client manage the windows. It's far more
efficient and far more responsive (if it's compressed and cached, like NX
does).

I remember connecting to my desktop from a computer at the university using
the NX protocol and running remote apps as if they were local. And I didn't
have a superb internet connection.

~~~
wbkang
Quick note, RDP has extensions to forward just the app. It has been able to do
that for a while.

------
2sk21
One of the amazing things about X was how easy it was to snoop on others.
There was one prof at my university who appeared to be working long hours
until a grad student came up with the idea of dumping out the prof's frame
buffer - and it turned out he was playing Tetris! Default settings were very
permissive...

~~~
teddyh
Dumping the frame buffer has nothing to do with X; you can’t blame X for bad
permissions on the underlying device node in Unix. X didn’t create the device
node, and is therefore not responsible for the permissions on it.

~~~
emmelaich
Maybe the GP means the DISPLAY? People couldn't be bothered using xauth and
often just ran _xhost +_ (allow anyone).
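Side by side, the wide-open host-based setting and the cookie-based mechanism
look like this (hostnames are placeholders):

```shell
# Host-based: let ANY client from ANY host connect -- the infamous setting
xhost +

# Slightly better: allow a single trusted host
xhost +trusted.example.edu

# Cookie-based: copy this display's MIT-MAGIC-COOKIE-1 to another account
xauth extract - "$DISPLAY" | ssh me@other.example.edu xauth merge -
```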

When we got internet connectivity at Sydney University we used to just dump
X-Windows displays from all over the world. Some in particular were displays
in Sweden. I forget what was actually on them, though.

~~~
buckminster
It worked the other way round too. You could overwrite the screen buffer of
anyone on the network. Usually with something offensive. Sun even provided a
command to do it. This was hilarious in 1989.

------
jandrese
Ages ago when I read this article I thought they missed the one major feature
that X totally flubbed. IMHO X should have exported some way to do sound along
with the graphics. Sound support on Unix was a nightmare for most of the 80s
and 90s, and even into the 2000s. Every system was different and broken in its
own special way, some were proprietary, and very few (Sun's half-assed network
audio being the sole exception) worked over the network.

Most of the rest of the complaints come off as someone with an axe to grind,
especially all of the hand-wringing over flipping the client and server model
around. It's not like X was the first system to do that; active mode FTP does
exactly the same thing. Back in the days before firewalls and NAT this was a
legitimate design decision, they just ended up on the wrong side of history.

Besides, from a network architecture standpoint it makes sense. Your local X
display doesn't know that you've started an app on a remote box until
something tells it. Having it make the connection out would require some local
broker to inform it that it needs to make the connection.

I will agree with one point, ICCCM sucks and has been overdue for a complete
rework for at least 20 years now. That said, I'm not a fan of Gnome's Ctrl-
Shift-C Ctrl-Shift-V workaround either.

------
sidlls
"Self abuse kit" is such an appropriate way to describe Motif.

~~~
kabdib
A few years ago I found a book on Motif user interface programming in a used
bookstore, and started leafing through it. The entire book had exactly ONE
picture. ONE. And it was some architectural diagram describing X, depicting a
network of terminals and servers and happy little cloud things. The rest of
the book (asymptotically, _all_ of the book) described APIs and widgets and
protocols and whatnot, all without the benefit of showing what the user would
be interacting with.

That someone could write a book on UI programming and someone else would
publish a book on UI programming without any actual depictions of UI is pretty
much all I need to know about X and its community (although I know a lot more,
and my actual exposure to X goes back to early versions that came on 9-track
tapes directly from MIT and supported mostly just frame buffers on DEC Vaxen).

(For years -- and this may still be true -- the software to manage the
equipment in Comcast's head end datacenters was controlled by some X-based UI.
Now, I ain't gonna say "Them folks surely deserve it, because they done me
wrong" but that miserable train-wreck of a system goes a long way towards
explaining why it's often hard for Comcast to fix their stuff. I have un-fond
memories of taking down whole QAMs because I was fool enough to click some
check boxes in a dialog in the wrong order...)

You can write hideous UI in just about anything, but X seems to have a special
place in the ecosystem.

~~~
DonHopkins
It's because Motif is so ugly, they were embarrassed to include a screen
snapshot.

Then Tcl/Tk came along and emulated Motif's look and feel, only better because
its default color was bisque.

"Bisque is Beautiful"

[http://www.ucolick.org/~de/Tcl/pictures/](http://www.ucolick.org/~de/Tcl/pictures/)

"The procedure tk_bisque is provided for backward compatibility: it restores
the application's colors to the light brown (“bisque”) color scheme used in Tk
3.6 and earlier versions."

[https://www.tcl.tk/man/tcl/TkCmd/palette.htm](https://www.tcl.tk/man/tcl/TkCmd/palette.htm)

[http://mars.cs.utu.fi/BioInfer/files/doc/public/Tkinter.Misc...](http://mars.cs.utu.fi/BioInfer/files/doc/public/Tkinter.Misc-
class.html#tk_bisque)

------
amelius
I love Unix, but I have to admit there is some truth to this article.

------
fergie
You can still use X to run a local GUI hooked up to a remote machine, and 15
years ago that was pure voodoo magic.

~~~
jandrese
Even today that's not exactly a common feature. Usually you need to start a
full desktop (RDP, VNC, etc...) to do anything similar.

I still use X display forwarding regularly. Often for something like sending a
Firefox window back so I can look at a page on a test network without needing
a direct route to it.

~~~
tatersolid
RDP has supported running single apps remotely for a decade. Windows Server
2008 had this feature.

------
mikhailfranco
PEX and GLX were really quite cool, and only recently matched by WebGL. Native
3D apps really could be opened remotely, then download display lists to the
local store, and use local 3D rendering hardware.

There just wasn't very much PEX interoperability between different
manufacturers, because of all the different extensions. Only a rare vanilla
app could be remoted in a heterogeneous environment.

Some vendors, like E&S, had PEX as the native graphics API. You could _not_
write a purely local app: everything could be opened remotely, including
stereo displays (CrystalEyes) and dialbox or spaceball input devices.

~~~
pjmlp
RDP supports networked DirectX, with the GPGPU on the server or client side.

And DirectX offers so much more than downgraded OpenGL ES 3.0 (aka WebGL 2.0).

------
maxxxxx
I am not sure if the whole idea is considered bad or the way it has been
implemented. In general I think X is very useful.

------
dgfgfdagasdfgfa
Why is configuring displays so damn difficult? It was never clear to me why
there needed to be so many tools (xrandr, xorg, X, startx, xinit, xinerama) to
just get a basic display working at the max resolution when other operating
systems manage to have some sane plug-and-play behavior.

All of this seems to render linux pretty useless for hot-pluggable displays.
I'm sure ubuntu has some sort of solution (I never use the desktop version);
why can't this be integrated at the level of the X server (or hell, the
graphics driver) itself? Is X too firmly baked to adjust to the needs of its
users? Will wayland address this?

------
zitterbewegung
Reading this reminds me of people complaining about Electron for desktop apps.

------
rbanffy
Dammit... Now I want to install CDE again...

------
jdalgetty
To me this is always what made linux fun.

------
Cockbrand
It's called X Window, man. Singular, not plural.

~~~
lioeters
From the bottom of the page:

To annoy X fanatics, Don specifically asked that we include the hyphen after
the letter "X," as well as the plural of the word "Windows," in his chapter
title.

