Nice historical nugget from wikipedia on dithering:
> Etymology
> …[O]ne of the earliest [applications] of dither came in World War II. Airplane bombers used mechanical computers to perform navigation and bomb trajectory calculations. Curiously, these computers (boxes filled with hundreds of gears and cogs) performed more accurately when flying on board the aircraft, and less well on ground. Engineers realized that the vibration from the aircraft reduced the error from sticky moving parts. Instead of moving in short jerks, they moved more continuously. Small vibrating motors were built into the computers, and their vibration was called dither from the Middle English verb "didderen," meaning "to tremble." Today, when you tap a mechanical meter to increase its accuracy, you are applying dither, and modern dictionaries define dither as a highly nervous, confused, or agitated state. In minute quantities, dither successfully makes a digitization system a little more analog in the good sense of the word.
"Dithering Heights" was the name of the ColorSync team's lab at Apple in the 1990's.
Conference rooms and lab rooms often had amusing names — as I am sure is true at other companies besides Apple. "Rock" and "Hard Place" were two neighboring conference rooms.
Another floor as I recall had conference rooms named with oxymorons, "Military Intelligence", "Jumbo Shrimp" and — was it "Lawrence Expressway"? — something funny to Bay Area residents anyway.
Dithering is also used in signal acquisition: adding some (known) noise before the sensor can help the signal reach the sensor's threshold. It can also help with quantization noise. Of course, that degrades the signal-to-noise ratio, but the SNR can be improved later by filtering out the spectral density of the noise, averaging multiple captures, making use of CRC, etc.
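The averaging trick can be shown in a toy Python model (the names `quantize` and `dithered_capture` are illustrative, not any real ADC API): a 0.3-unit signal sits below a hard quantizer's 0.5 threshold and always reads as 0, but with known uniform dither added before quantization, the mean of many captures converges back toward the true value.

```python
import random

def quantize(x, step=1.0):
    """Hard quantizer: round to the nearest multiple of `step`."""
    return round(x / step) * step

def dithered_capture(signal, step=1.0, n_samples=1000):
    """Add known zero-mean uniform dither before quantizing,
    then average the captures to recover sub-threshold detail."""
    total = 0.0
    for _ in range(n_samples):
        noise = random.uniform(-step / 2, step / 2)
        total += quantize(signal + noise, step)
    return total / n_samples

# Without dither, a 0.3 signal always quantizes to 0.0;
# with dither, the average of many captures approaches 0.3.
```

This is just the averaging part of the comment above; the spectral-filtering variant would shape the dither's noise spectrum instead.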
I don't buy that etymology, even in the expanded context. It seems like a Dutch pronunciation of titter/teeter, and an amalgamation of a few words that became really popular and "cool".
I might dig out my Norse etymology dictionary and a magnifying glass and try to trace it back further.
-- From Etymonline:
Teeter:
1843, "to seesaw," alteration of Middle English titter "move unsteadily," probably from a Scandinavian source akin to Old Norse titra "to shake, shiver, totter, tremble," from Proto-Germanic *ti-tra- (source also of German zittern "to tremble"). Meaning "move unsteadily, be on the edge of imbalance" is from 1844.
[...] Noun teeter-totter "see-saw" is attested from 1871 (earlier simply teeter, 1855, and titter-totter in same sense is from *1520s*). Totter (n.) "board swing" is recorded from late 14c.;
Dutch people are notorious for having issues with the th sound, which does not occur in the Dutch language and requires practice. They also don't "soften" leading consonants; if anything, they "harden" trailing consonants, like zand pronounced zant. So I don't see why your explanation would be likely.
Source: am Dutch, would never pronounce titter as dither even with a bad accent. More likely the other way around actually.
I think the "Hacker News readers" and "People who like dithering" Venn diagram has converged slowly over time. I'm a happy resident of the overlapping zone.
Obligatory link to Obra Dinn's fascinating dev log post, regarding the challenge of spatial and temporal coherence when dithering realtime 3D (for aesthetic reasons, i.e. deliberately noticeable to the player):
https://forums.tigsource.com/index.php?topic=40832.msg136374...
Lately I've been attempting to add the original dithering from Excel 97's easter egg to my online reproduction. In the era of indexed-color graphics, developers had to dither efficiently to reduce banding. Compare these two rendering techniques of the same subject, one with 16 shades of gray and dithering, and the other with 256 shades of gray:
Definitely check out the Obra Dinn dev log post, it shows gifs of the different approaches Lucas tried, and highlights how difficult it is to make dithering in 3d appear at all decent while the camera is moving.
Shameless plug, but after reading this blog post the last time it was posted, I tried to implement some of the same dithering methods in Futhark[0]. You can see the results here, if you're interested: https://munksgaard.github.io/bluenoise/
By varying the dither pattern each frame they are basically doing the equivalent of the hardware FRC in monitors to give you an extra fake 9th bit (for 8-bit rendering) of precision - as long as the framerate is high enough, the dithering becomes nearly invisible. Not that it's super visible to begin with. I'm personally using Dither17.
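A minimal sketch of that temporal idea in Python (hypothetical helper, not from any real renderer or monitor firmware): spread the fractional part of a higher-precision value across successive 8-bit frames so that their time-average approximates the extra bit(s) of precision, much like FRC.

```python
def temporal_dither_frames(value, bits=8, n_frames=4):
    """Emit a sequence of `bits`-bit frame values whose time-average
    approximates a higher-precision `value` in [0, 1)."""
    levels = (1 << bits) - 1
    scaled = value * levels
    lo = int(scaled)
    frac = scaled - lo
    frames = []
    err = 0.0  # accumulate the fractional part across frames
    for _ in range(n_frames):
        err += frac
        if err >= 0.5:
            frames.append(min(lo + 1, levels))  # round up this frame
            err -= 1.0
        else:
            frames.append(lo)
    return frames
```

A real implementation would vary the pattern spatially per pixel as well, so neighbouring pixels don't flicker in lockstep.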
I remember coming upon this when trying to use PCoIP for remote access to workstations after the win7 to win 10 transition. It's a setting somewhere in Windows10 (maybe with specific video cards?) that uses dithering to improve image quality when you have a monitor that doesn't support HDR.
This absolutely wrecked the bandwidth as the whole screen was trying to refresh at 60Hz in a way that the PCoIP algorithm wasn't prepared for.
They're usually flipping the least significant bit, so the video compressor will probably ignore the dithering. It does turn a perfectly stable image into an unstable one, though, so it could potentially be harmful if you're compressing video of static text and geometry. I haven't tested this, so it's an interesting point.
Even though it has been posted more than once on Hacker News, there is an interesting forum post [1] from Lucas Pope, the author of Obra Dinn, describing dither mapping in 3D. [2] (2017) has some more links.
Every time I read a Lucas Pope post, I just want to clap. He is someone who truly grasps the intersection of experience and ingestion. If anyone could embody a digital artist, it would be him, since he seems to work to create things with the full intention of how they're seen. A true hacker.
I remember dithering from Samantha Fox Strip Poker on the C64. It wasn't very good, but when you have an active mind, who needs photorealistic graphics?
From the pictures of error diffusion dithering, it looks like they are consistently applied row by row in the same direction (probably left to right).
There is a variation of these where every odd row is processed in one direction (left to right) and every even row is processed in the other direction (right to left). It usually produces a more visually appealing result.
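That alternating-direction variant (sometimes called serpentine scanning) can be sketched with Floyd-Steinberg weights in Python; the diffusion kernel is mirrored on reverse rows. This is an illustrative sketch, not code from the article.

```python
def serpentine_fs_dither(img, levels=2):
    """Floyd-Steinberg error diffusion with serpentine scanning:
    even rows scan left-to-right, odd rows right-to-left.
    `img` is a list of rows of floats in [0, 1]; returns a quantized copy."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    step = 1.0 / (levels - 1)
    for y in range(h):
        forward = (y % 2 == 0)
        xs = range(w) if forward else range(w - 1, -1, -1)
        dx = 1 if forward else -1  # diffusion direction follows the scan
        for x in xs:
            old = out[y][x]
            new = min(max(round(old / step) * step, 0.0), 1.0)
            out[y][x] = new
            err = old - new
            # Floyd-Steinberg weights 7/16, 3/16, 5/16, 1/16,
            # mirrored on reverse rows via dx
            if 0 <= x + dx < w:
                out[y][x + dx] += err * 7 / 16
            if y + 1 < h:
                if 0 <= x - dx < w:
                    out[y + 1][x - dx] += err * 3 / 16
                out[y + 1][x] += err * 5 / 16
                if 0 <= x + dx < w:
                    out[y + 1][x + dx] += err * 1 / 16
    return out
```

The mirroring keeps the error from always marching the same way, which is what breaks up the directional "worm" artifacts of one-way scans.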
I thought I had commented on that thread but ... apparently not.
Anyway. I built my own hand-rolled dither filter for my canvas library - see the CodePen demo to see it attempt to dither a live video stream (warning: site will ask for permission to use your device's camera before displaying the canvas) - https://codepen.io/kaliedarik/pen/OJOaOZz
The problem I have with the filter is that it's not consistent across video frames (the dither jiggles around, even when nothing is moving between frames). It's really annoying and I don't know how to solve it. Does anyone know of any research/solutions for this sort of thing?
I have seen that page before - an excellent writeup on how to morph the dither pattern to a sphere and match it to camera rotation to get rid of the dither jank as a character/camera moves through the game scene. But I don't think it applies to my particular issue of seeing dither jank on a webcam livestream.
Maybe there's a way of adapting my algorithm to give it some memory of previous frames to help minimise the jank that occurs as the current frame's output is calculated. Or identify static parts of the current frame compared to the previous frame and only recalculate the dither for the pixels that have changed beyond a minimal color distance threshold?
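That second idea could look something like this sketch (all names hypothetical): keep the previous frame's dithered output for pixels that haven't moved beyond the threshold, and re-dither only the changed ones. Note it only works with per-pixel quantizers (e.g. ordered dithering), since with error diffusion one changed pixel perturbs its neighbours' results.

```python
def stable_dither_frame(frame, prev_frame, prev_output,
                        threshold=0.04, dither=None):
    """Reuse the previous frame's dither for static pixels; re-dither
    only pixels whose value changed by more than `threshold`.
    Frames are lists of rows of floats in [0, 1]."""
    if dither is None:
        # Simple 1-bit per-pixel quantizer as a stand-in
        dither = lambda v: 1.0 if v >= 0.5 else 0.0
    out = []
    for row_new, row_old, row_out in zip(frame, prev_frame, prev_output):
        out_row = []
        for v_new, v_old, v_out in zip(row_new, row_old, row_out):
            if abs(v_new - v_old) < threshold:
                out_row.append(v_out)           # static pixel: keep old dither
            else:
                out_row.append(dither(v_new))   # changed pixel: re-dither
        out.append(out_row)
    return out
```

Webcam sensor noise alone will push many pixels over any small threshold, so in practice you'd probably pair this with a little temporal smoothing of the input.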
Back when image viewers changed the screen palette, resolution, and color depth just to show a picture, and decoding compressed images came with a progress bar attached, people regularly wanted dithered black-and-white or 16- and 256-color images made from better and bigger sources. Desktops only became True Color when VRAM started to be measured in megabytes, and there was no point in storing or transferring enormous files to use as illustrations or wallpapers. I remembered an artifact from the palette-color era, a shareware application called "Graphic Workshop" (GWS) with extensive dithering options, and found it among the conveniently emulated software on archive.org.
It would be great to load other files instead of the example images. In fact, archive.org stores virtual file system changes on the client, and files remain available across sessions, but there seems to be no way to interact with those virtual DOSBox mounts, or to transfer any data except by typing it.
I made a C++ project[1] (using OpenCL and blue noise) to generate dithered images/video for a course project (the course taught how to program for the GPU). If anyone wants to just jump straight into generating dithered images, you can try it out yourself. (It only works if OpenCL is configured for your system. It works on my desktop with a GPU and does not work on my laptop without a separate GPU.)
EDIT: Oh right, it also requires FFmpeg to be installed as it uses it. And libpng. And CMake. And PkgConfig.
Seems like a good application for deep learning. None of the dither algorithms are a match for what seems to be the handmade dither in the game screenshot.
Or at least, we could boost the high frequencies of the image somehow before feeding it in, to preserve detail, and add some randomness to the diffusion algorithms?
See also https://en.wikipedia.org/wiki/Halftone, a technique used in the print world to achieve a similar range of shades and colour using a small base set.