This might be a bit broader, but I enjoyed "How We Got Here" by Andy Kessler, which is available for free (via emailed link, or directly if you prefer). It's more about how the systems and engineering that predated computing influenced its development to where it is now. Actually, its argument is that it's been computing all along, since before we had vacuum tubes. Worth a read.
There are lots of websynths that respond to MIDI note messages, and I've seen some that respond to CCs as well -- e.g. the midi-synth by Chris Wilson (editor of the WebAudio and WebMIDI specs) -- but in that case, the CC messages are hardcoded to the UI. (It also doesn't support 14-bit messages.)
In 106.js, every UI control is arbitrarily assignable to any MIDI CC message, so you can map it to your device however you like. That's a feature ("MIDI learn") I'm used to from desktop DAWs, but haven't seen in any other websynths yet.
You won't be able to see the UI for this feature at all unless you've got a MIDI controller plugged in, though.
Hm, yeah, that would be a cool thing to publish separately. The MIDI mapping code is pretty much isolated to these two objects, but I'd have to do a little thinking about how people would want to use it, and separating it from Backbone would take some work.
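For anyone curious what the mapping involves, here's a minimal sketch of CC "MIDI learn" on top of the Web MIDI API. To be clear, this is not the 106.js implementation -- names like armControl, mappings, and cutoffSlider are made up for illustration:

    // Minimal "MIDI learn" sketch using the Web MIDI API (not the 106.js code).
    const mappings = new Map();   // CC number -> function that updates a UI control
    let learning = null;          // control callback waiting to be bound to the next CC

    // Put a control into learn mode; the next CC message received gets bound to it.
    function armControl(updateControl) {
      learning = updateControl;
    }

    function onMIDIMessage(event) {
      const [status, ccNumber, value] = event.data;
      if ((status & 0xf0) !== 0xb0) return;     // only Control Change messages

      if (learning) {
        mappings.set(ccNumber, learning);       // bind this CC to the armed control
        learning = null;
        return;
      }
      const update = mappings.get(ccNumber);
      if (update) update(value / 127);          // normalize the 7-bit value to 0..1
      // (14-bit CCs pair controller N as the MSB with controller N+32 as the LSB,
      //  so supporting them means combining two messages instead of this one.)
    }

    navigator.requestMIDIAccess().then((access) => {
      for (const input of access.inputs.values()) {
        input.onmidimessage = onMIDIMessage;
      }
    });

    // Usage: arm a control from the UI, then wiggle a knob on the hardware.
    armControl((v) => { cutoffSlider.value = v; });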
It's usually high-density traffic and someone pulling a move like a forced merge, which causes the person now behind them to slam on their brakes. I've seen this hyperactive "weaving" happen too many times for it to be explained only by "slow human reaction times".
I heard a few years back (no source) that when they model traffic they can design systems that work perfectly; it's human error (not hitting the gas fast enough after a green light, changing lanes too often, 18 wheelers getting in the far left lane to turn right) that makes the models break down.
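To make the reaction-delay point concrete, here's a toy simulation -- emphatically not a real traffic model (nothing like the Intelligent Driver Model), and every number in it is made up. Each car matches the speed of the car ahead, but only after a one-second reaction delay, and brakes a bit harder whenever the gap has gotten too small:

    // Toy braking-cascade sketch: a line of cars following a leader.
    const dt = 0.1;               // seconds per simulation step
    const delaySteps = 10;        // 1 second reaction time
    const cars = Array.from({ length: 8 }, (_, i) => ({
      pos: -30 * i,               // 30 m initial spacing
      speed: 30,                  // m/s (~108 km/h)
      seen: [],                   // delayed view of the speed of the car ahead
    }));
    const minSpeed = cars.map(() => Infinity);

    for (let step = 0; step < 600; step++) {
      // The lead car eases off for a few seconds (someone forced a merge in front of it).
      cars[0].speed = step > 100 && step < 150 ? 20 : 30;

      for (let i = 1; i < cars.length; i++) {
        const ahead = cars[i - 1];
        cars[i].seen.push(ahead.speed);
        if (cars[i].seen.length > delaySteps) {
          let target = cars[i].seen.shift();              // react to what happened ~1 s ago
          if (ahead.pos - cars[i].pos < 25) target -= 3;  // too close: brake harder to rebuild the gap
          cars[i].speed = Math.max(0, target);
        }
      }
      cars.forEach((c, i) => {
        c.pos += c.speed * dt;
        minSpeed[i] = Math.min(minSpeed[i], c.speed);
      });
    }

    // Lowest speed each car reached, front of the line first.
    console.log(minSpeed.map((v) => v.toFixed(0)).join(' '));

In this run the minimum speed keeps falling the farther back a car sits, and the last car comes to a complete stop even though the lead car only eased from 30 to 20 m/s -- the "slam on the brakes" cascade, with nobody in the chain doing anything unreasonable.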