Show HN: Rainbow – an attempt to display colour on a B&W monitor (anfractuosity.com)
389 points by anfractuosity on June 27, 2020 | 58 comments



Very neat, the outcome probably isn't what you were hoping for, but it has a beautiful and ethereal quality to it. It's like an art piece.

From the photos and video, only a small section of the screen has colour. Does that change as you move around, or is the coloured section static?

Off topic, but the visual effect seen here is a lot like how my memories and dreams feel; only the thing I'm currently focussing on has any colour or detail, whilst everything else is muted and fuzzy.


Yeah I would like to improve the effect. I think the main colour region seems to stay the same.

I'm wondering if I can get acetate that's ever so slightly adhesive, so it sticks better to the monitor; that might help the colours show better all over.


Acetate sheets for use on overhead projectors (OHP) used to come on a plain white backing sheet. The top was glossy smooth, and the back was a little tacky; if you can find those ...


Also curious to know if you can get slightly better color spread by looking at it from far away; I assumed the misaligned light or distortion through the glass might be enough to throw things off if you're working with tolerances as small as it looks.

Or, maybe the electron beam scatters more as it's deflected to hit the outer pixels, possibly giving you a signal too muddy for your magic potato pixels to work with. (These are obviously just vague guesses; I didn't even know this was possible before today.)


Huh, my memories have full color and some detail. I wonder if that's why I have problems remembering things.


I get full color, smells, taste and sensations.

I am annoyed when I get woken up from a good meal. :)

(On the flip side, I can technically breathe underwater in my dreams, but I feel the water going into my lungs so it isn't pleasant!)


Tektronix did it with a totally wild set of LCD color filter shutters over the whole face of the CRT, and then sequential color fields. It meant you got the whole spatial resolution of the display, but one third the temporal resolution. Similar to color-wheels in DLP projectors, really.

The system was called NuColor, if you want more detail.

The Bayer filter method in TFA needs near-perfect alignment between the filter and the image, which proved impractical in the real world. Sure makes for a neat experiment, though.


JVC built a working model with that approach. There's a neat breakdown of the tech on this YT channel:

https://youtu.be/z-q8ehzHeQQ


Man, that was the golden age of analog video right there. In the 90s it became just barely possible to do a lot of processing of analog video using a shitload of expensive electronics, and people did. Those were the days of "line doubling" deinterlacers that were as big as shoeboxes and as expensive as cars.


An LCD panel backlit by a CRT or a DLP projector was/is a way to make a high-dynamic-range display.

http://anyhere.com/gward/papers/Siggraph04.pdf


I've never forgotten this idea. I'd love to make one someday.


Can someone explain how this works? I have frustratingly been googling for 30 minutes now trying to learn how autochrome works. Being tech oriented, it's so frustrating when people sum it up in one sentence: "it's made of potato starch and magically it makes color!"

How does it work? Do they hand paint the screen? How does it actually add red or blue to an arbitrary shade of gray which lacks any chroma? Does it work with video too?


The trick is that the shade of grey DOES have chroma. You can't just apply any autochrome panel to the image (unlike what the OP is trying to do with Rainbow); instead, the negative is generated using the chrominance filter.

Imagine that each grain is blocking a 'pixel' of the negative. For simplicity of this explanation, I'll pretend the grains are red, green, and blue. If there's a blue grain, only blue light can get through, and thus the luminance is baked in specific to the amount of blue in that region. The same is true for the red, and the green.

Thus, each 'pixel' of the negative isn't recording an overall monochromatic luminosity (i.e. black and white), but rather the luminosity of a single channel: red, green, or blue.

When the grain filter and the developed negative are combined, we get natural colour. The same filter is used for generation and display.

Another way to think of it is like this: Each pixel on our screen is actually 3 subpixels, each defining red, green, and blue, and we format our image to alternate between telling the red, green, and blue channels how bright to be. This is doing the same, but the subpixels are distributed randomly because that's much easier to make.

EDIT:

If you organise the grains in a geometric pattern, you get Dufaycolor[1].

[1] https://en.wikipedia.org/wiki/Dufaycolor
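
To make that concrete, here's a rough numpy simulation of the simplified model above (red, green, and blue grains, one grain per 'pixel' of the plate). This is just my own illustration of the idea, not code from the article:

    import numpy as np

    def make_grain_mask(h, w, seed=0):
        # Random "starch grain" mask: each position passes exactly one of R, G, B.
        rng = np.random.default_rng(seed)
        channel = rng.integers(0, 3, size=(h, w))
        mask = np.zeros((h, w, 3))
        mask[np.arange(h)[:, None], np.arange(w)[None, :], channel] = 1.0
        return mask

    def capture(scene_rgb, mask):
        # Expose the plate through the grains: each point records one channel's luminosity.
        return (scene_rgb * mask).sum(axis=2)          # greyscale plate, shape (h, w)

    def view(plate_grey, mask):
        # Shine light back through the SAME mask: the grey values come out tinted.
        return plate_grey[..., None] * mask            # rough colour reconstruction

    scene = np.random.rand(32, 32, 3)                  # stand-in for the original scene
    mask = make_grain_mask(32, 32)
    reconstructed = view(capture(scene, mask), mask)

Each grain position only reproduces one channel correctly; your eye averages over neighbouring grains, exactly like it does over a screen's subpixels.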


Does it mean the filter has to be very, very precisely aligned when both shooting and printing from the negative?

I may be daft, and your explanation sounds reasonable but I still feel "magic + potato starch" :-D


The filter is part of the photographic plate, and the plate is developed directly into a positive. So there's no alignment issue.


What TFA is doing is basically the inverse of a digital camera sensor. They're essentially putting a coloured tint in front of each pixel (actually a block of 4 pixels) in a Bayer pattern (https://en.wikipedia.org/wiki/Bayer_filter), and using software to mosaic (convert) a colour image into its Bayer components, if you will, so that a block of four pixels under a green tint, for example, is lit only with its green channel, and so on for each colour. You lose some spatial resolution this way, of course.
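
For the display direction, that mosaic step is basically "keep only the channel matching the tint over each block". A minimal sketch of the idea (my own; the 2x2-pixel blocks and RGGB layout here are assumptions, and the project's actual pattern may differ):

    import numpy as np

    BAYER = np.array([[0, 1],      # channel index per tint cell:  R G
                      [1, 2]])     #                               G B

    def mosaic_for_display(img_rgb, block=2):
        # For each `block` x `block` group of pixels, keep only the channel whose
        # tint sits over it; the printed filter re-colours that grey on the way out.
        h, w, _ = img_rgb.shape
        ys, xs = np.indices((h, w))
        tint = BAYER[(ys // block) % 2, (xs // block) % 2]        # channel per pixel
        return np.take_along_axis(img_rgb, tint[..., None], axis=2)[..., 0]

    frame = mosaic_for_display(np.random.rand(8, 8, 3))           # greyscale frame to show

`frame` is what you would actually draw on the B&W monitor behind the printed filter.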


Autochrome was an early technique for color photography. Wikipedia has a pretty good explanation [1].

The starch grains were randomly scattered on one side of the photographic plate, and acted as filters so that small portions of the plate would capture certain colors. The plate was then developed as a positive image.

When you would shine light through the developed plate it would go through the starch grains (still on the plate) as well as the developed image. The developed image controls the luminosity while the colored starch grains again act as a filter and control the hue of the transmitted light, roughly reproducing the original scene.

[1] https://en.wikipedia.org/wiki/Autochrome_Lumi%C3%A8re


> go through the starch grains (still on the plate)

This part is key to understanding. The plate with the starch-based colored mosaic is not simply a filter used upon capture to be removed after (as most filters are today), but also a filter to be used upon viewing. Since the same exact “filter” is used for both capturing and viewing of a given photograph, it allows the mosaic to be random, simplifying creation of the filter.


So... I wonder if the Bayer matrix on digital sensors was inspired by this. If each sensor was also a display, and the Bayer matrix was random, it would be the same thing...


Autochrome is pretty much just subpixels. You capture and view a greyscale image through the same dots-of-color filter plate, and thus you see color, just like an image sensor uses a Bayer filter to deconstruct color into greyscale (with three times as many pixels), and then your screen has three subpixels per pixel which you perceive as a solid color because they're sorta far away and kinda small.


Analog displays don't have pixels, just dots.


This was actually a "thing" back in the 1950s. Apparently the results were awful:

http://www.earlytelevision.org/color_filter.html


They appear to just be green on the bottom, yellow in the middle, and blue on the top.

So it might be plausible for outdoor scenery, but useless otherwise.


Yes, that's how they were. I believe it worked for the occasional western or other outdoor filming.


This is also how automatic digital photo enhancement worked for a while.


Thanks for posting that. My grandmother told me about having that type of "color" tv before and I had never been able to find it. I had decided it must have just been some little cereal box kit with stick on color filters or something.


The Vectrex video game console had special overlays that matched the game's layout, for example: http://forum.arcadecontrols.com/index.php/topic,154425.msg16...


Nice ads :) They claim it gives less eye strain; is that true?


And if so, should we write a custom .css for HN with a similar effect?


Funny ads! And my ad blocker didn't block them :)


Oh, I hoped to see something based on Fechner colours (https://en.m.wikipedia.org/wiki/Fechner_color). It gave me the creeps when a misconfigured XFree86 server managed to draw a bright red line right across the black and white LCD screen of my first laptop...


Wikipedia has it animated, too

EDIT: warning! strobing!

https://en.m.wikipedia.org/wiki/Benham%27s_top#/media/File%3...


Does anyone have an explanation of what I should be seeing here? I've been staring at it for a while (both full-screen and in the article), and I haven't noticed anything other than black and white.


In the spinning animation, the black bands in the white parts quickly turn another colour (brown for me, apparently blue or other colours for others) while the image is spinning.

The effect is much reduced if I tap on the image to make it full screen on mobile. The larger size seems to reduce the velocity of the spin, which seems to be an essential part of it. Could this be the reason you don't see it? Are you perhaps on a low-performance device, for example?

Just for the sake of completeness, the effect is incomplete for me - the outer black bands are always brown, but the inner two switch between black and brown. The middle one is more often black than brown, but it seems when that one's brown, the innermost one instead turns black.


When I look at that animation, I see brown bands. My girlfriend sees dark blue. Why?


I see brown too. Are you and your girlfriend using the same device?


I see brown. My wife and 13-yr-old son both see green, blue, and pink. Same iPhone, same time.


The Wikipedia article says that not everyone sees the same color. I wonder if it is because of our eyes or how our brains process the signal.


Here is a good short video about how JVC did this with color shutters in the front: https://youtu.be/z-q8ehzHeQQ


When I had a Commodore PET computer I tried making colour by flashing areas at different rates. You can get very limited colour by changing the length of the flashes, but not only did I get it to work (poorly), I also got terrible eye strain looking at those flashing squares.


I had no idea B&W LCD monitors even existed. It looks like they are primarily marketed for medical imaging purposes; I'm curious what advantage they have over displaying grey-scale images on a good color monitor. Judging by the sub-pixels, I'm guessing it's made from mostly the same panel as a color model.


I perused Eizo's website and found three B&W monitors they currently sell, their "GX" series. Looking at the specs, the thing that stands out to me is that they're really bright: 1,200 cd/m^2 for the GX240 and GX340, and a whopping 2,500 cd/m^2 for the GX560! I bet these monitors are brighter than equivalently-backlit color displays from not having a Bayer filter in front.

Speculations on why medical monitors might be that bright: they can probably be better read in brightly-lit medical settings, shine through overlays and negatives placed over them, and be more easily read by some patients with poor vision.

They also offer palettized grayscale: up to 1,024 tones from a palette of 16,369. I'm not sure how useful that is in practice, but it is a unique feature.

https://www.eizo.com/products/radiforce/gx240/

https://www.eizo.com/products/radiforce/gx340/

https://www.eizo.com/products/radiforce/gx560/


It reminds me of back in the 68k Mac days, when you could set your screen to 256 colors, thousands of colors, or thousands of greys, and you had to choose a tradeoff with resolution. The greyscale mode was primarily for desktop publishing applications... ah, QuarkXPress and PageMaker.

From their website --

> 10-Bit Simultaneous Grayscale Display

> 10-bit (1,024 tones) simultaneous grayscale display extends grayscale fidelity to the boundaries of human visual perception abilities and helps radiologists discern the finest nuances within an image.


The gentle gradients and transformations from black & white to color are charming. It made me think of the scene in Wizard of Oz where Dorothy steps into Technicolor.


One reason color TV wasn't implemented like this is the need for a backwards-compatible signal and the ability to decode that signal with analog circuitry.


I bet you could get a lot better color if you used the screen itself to generate the bayer pattern on film rather than printing it with a printer.

Get some large format color negative film and thin cyan, magenta, and yellow filters. Put a cyan filter on the LCD, followed by the film on top. In a darkroom, light up the "red" pixels to expose that part of the film. Repeat with the magenta filter and "green" pixels, and finally the yellow filter and "blue" pixels. Develop the film and reattach to the LCD.

This is kind of how the phosphor screen was made in color CRT monitors. A photographic process was used to put the red, green, and blue phosphor dots on the screen by shining light through the shadow mask from the same angles the electron guns would later illuminate the phosphors. This ensures that the red, green, and blue phosphor dot pattern lines up (fairly) accurately with the shadow mask.


I'm reminded a bit of the JVC TM-L450TU LCCS (http://www.earlytelevision.org/jvc_tm-l450tu.html), which used some clever tricks to make a fundamentally B&W LCD show colour.


That's awesome! I guess diffraction of light between the original pixel and the filter is also a reason for the low quality.

It would be nice if the author explained a bit more how he printed those PDFs to such a resolution and precision. Is a home printer sufficient?


The acetate is around A2 size; I got the three acetate sheets printed by a printing service for around £35.

https://github.com/anfractuosity/rainbow/blob/master/rainbow... is the code that I created to generate the pattern.

A home printer may also work, as they're pretty high resolution these days; I just don't have an inkjet printer at the moment.

I'm kind of curious about https://en.wikipedia.org/wiki/Duratrans too, instead of acetate and ink, but they seem very dear.
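
For anyone who just wants to play with the idea, the overlay is conceptually just coloured squares tiled at the monitor's pixel pitch. Here's a rough raster sketch of that (the pitch, block size, and pure-RGB tints are placeholders, not the values my script actually uses; it outputs a PDF):

    from PIL import Image
    import numpy as np

    PITCH_MM = 0.270      # physical width of one monitor pixel -- check your panel's spec
    BLOCK = 2             # monitor pixels per tint cell, must match the mosaic software
    DPI = 1200            # print resolution

    TINTS = np.array([[[255, 0, 0], [0, 255, 0]],     # R G
                      [[0, 255, 0], [0, 0, 255]]],    # G B
                     dtype=np.uint8)

    def make_overlay(px_w, px_h):
        # One tint per BLOCK x BLOCK monitor pixels, laid out in a Bayer-style tile.
        ys, xs = np.indices((px_h, px_w))
        per_pixel = TINTS[(ys // BLOCK) % 2, (xs // BLOCK) % 2]   # (px_h, px_w, 3)
        img = Image.fromarray(per_pixel)
        # Scale so each monitor pixel comes out at the right physical size when printed.
        dots_per_px = PITCH_MM / 25.4 * DPI
        size = (round(px_w * dots_per_px), round(px_h * dots_per_px))
        return img.resize(size, Image.NEAREST)

    make_overlay(1920, 1080).save("overlay.png", dpi=(DPI, DPI))

At full monitor size the raster gets huge, which is part of why a vector PDF is nicer to send to a print shop.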


An aside: I saw "Up" (the movie excerpted in the demo video) years ago, but I never stopped and thought about the physics until today. The house would have started rising as soon as sufficient balloons were inflated inside the house. Releasing the balloons doesn't magically start making them buoyant-- they're already buoyant when they're inside the house.


The balloons are under a tarp behind the house :)


Such a strong tarp!


I used to make screen captures of my Apple ][ monitor using Glow-In-The-Dark Silly Putty.

It took a minute or so to capture the image, but then it was like Kai's Power Goo in non-Newtonian hardware!

https://www.macworld.com/article/3005783/an-ode-to-kais-powe...


I wonder if a b/w CRT might work better.


I think the limiting factor of the color intensity is the dye/ink used to make the mask. Probably each pixel is only blocking like 50% of the light in its stop band.


When my LHON was really bad, this is how colors looked to me.


In an age of RGB monitors and displays, I don't see how such a project could be useful.


It's called 'hacker' news. Not everything has to be useful. It's interesting, at least to me. :)



