Why do color TVs make black and white snow? (stackexchange.com)
255 points by bluedino 9 days ago | 78 comments





The other question is “why don’t you see that noise when the TV is tuned in?”

The TV has automatic gain control. When the signal is weak, it will amplify it up to the right level. If the signal is just background noise, then it will amplify that background noise until it’s at the “right level” for a proper TV signal. So, the gain is lower for stronger signals, and very high when there is no signal at all.
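The AGC behavior described above can be sketched as a toy model (the target level, maximum gain, and function name here are illustrative, not from any real chassis):

```python
TARGET_LEVEL = 1.0  # desired video level, arbitrary units

def agc_gain(measured_level, max_gain=1000.0):
    """Gain is inversely proportional to the measured signal level,
    clamped at whatever maximum the amplifier can provide."""
    if measured_level <= 0:
        return max_gain
    return min(TARGET_LEVEL / measured_level, max_gain)

# Strong broadcast signal: modest gain.
print(agc_gain(0.5))    # 2.0

# No station tuned in: only faint background noise remains, so the
# AGC cranks the gain all the way up and you see amplified noise ("snow").
print(agc_gain(0.001))  # 1000.0 (clamped at max_gain)
```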

Fun fact: B&W TVs are more suitable as computer terminals because they have more bandwidth for the luma signal. Color TVs have to filter out the chroma signal, which reduces the bandwidth for the luma. If you are stuck with a color TV for your terminal, consider modifying it to accept a signal after the low-pass filter. This is not too hard for someone comfortable with a soldering iron, someone who isn’t scared of drilling a hole in their TV.

Thank you for coming to my TED talk.


For those who are interested: automatic gain control on VHS players was exploited by Macrovision to prevent the copying of commercial tapes. Analog video signals contain synchronizing pulses, which are necessary for correct sweeps. These pulses are not part of the visible lines shown by the TV. A Macrovision tape contains spikes of varying amplitudes within the synchronization pulses. The spikes are placed far enough from the visible lines (in the vertical blanking interval at the start of a frame) that the TV's AGC is not affected, and the picture remains stable.

When a VCR is receiving a Macrovision signal to record, its AGC (which, unlike the TV's, does measure the signal during the vertical blanking interval) amplifies the signal, causing the visible lines to be heavily altered in an erratic fashion. This sometimes appears as "snow" on the duplicated tape.

Some modern "VHS-To-PC" device drivers/conversion software will detect a Macrovision signal and refuse to accept it.

It was defeated in numerous ways, but the hassle-free way to duplicate a Macrovision tape was simply to use an older VCR without an AGC circuit.
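The TV-vs-VCR AGC interaction can be sketched as a toy model (all names and signal levels here are illustrative, not Macrovision's actual values):

```python
def agc_output(frame, vcr_style=False):
    """Toy model of one video frame as (vbi_samples, picture_samples).
    A TV-style AGC measures signal level only from the picture region;
    a VCR-style AGC also measures the vertical blanking interval (VBI),
    which is exactly where the varying-amplitude pulses are hidden."""
    vbi, picture = frame
    reference = max(picture + (vbi if vcr_style else []))
    gain = 1.0 / reference
    return [round(s * gain, 3) for s in picture]

clean     = ([0.3, 0.3], [0.2, 0.5, 1.0])
protected = ([2.5, 0.3], [0.2, 0.5, 1.0])  # AGC-confusing spike in the VBI

# A TV ignores the VBI, so the protected tape displays identically...
print(agc_output(clean) == agc_output(protected))  # True
# ...but a recording VCR sets its gain from the 2.5 spike,
# crushing the visible picture on the copy.
print(agc_output(protected, vcr_style=True))       # [0.08, 0.2, 0.4]
```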


Username checks out, except that LaserDisc didn't have Macrovision.

Growing up we had a store-brand Montgomery Ward VCR with manual gain controls (video and audio). Great for copying tapes from Blockbuster.


> If you are stuck with a color TV for your terminal, consider modifying it to accept a signal after the low-pass filter. This is not too hard for someone comfortable with a soldering iron, someone who isn’t scared of drilling a hole in their TV.

Even better, if you want a color terminal, feed the original RGB signal (or its approximation) directly to your TV rather than through an RF modulator or composite input, so there's minimal degradation of video quality. It should be easy on a relatively modern TV. Follow this flowchart.

1. Is it a European TV with a SCART connector? If yes, it already has RGB input. [0]

2. Does the TV include a YPbPr component input? If yes, read Linear Technology Application Note #57 [1], which shows a circuit to convert RGB to YPbPr using the LT6550 chip (or find a similar commercial converter box).

3. Does the TV have an S-video input? If yes, read the Analog Devices AD725 datasheet [2] and use this chip to convert RGB to the luma/chroma signal pair for S-Video (or find a similar commercial converter box).

4. Does the TV have onscreen display (i.e. menus overlaid on top of the image)? If yes, there already exists an RGB signal in the TV for feeding the OSD. You can hack your TV and expose the RGB signal input for your own use [3]. Caution: high voltage, discharge the tube, and never work on it with power on.

5. If everything else fails, don't forget that a TV's electron guns are ultimately driven by an RGB signal. You can design a simple driving amplifier to feed the RGB signal directly to the electron guns. Unfortunately, I cannot find a reference design at the moment. But it's doable.

[0] https://en.wikipedia.org/wiki/SCART

[1] https://www.analog.com/media/en/technical-documentation/appl...

[2] https://www.analog.com/media/en/technical-documentation/data...

[3] https://hackaday.com/2014/09/21/component-video-input-hack-i...
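For step 2, the RGB-to-YPbPr conversion such a chip performs is just the standard Rec. 601 matrix; here is a minimal sketch of the math in Python (the function name is mine, and real converters implement the same arithmetic in the analog domain):

```python
def rgb_to_ypbpr(r, g, b):
    """Rec. 601 RGB -> YPbPr for normalized inputs in 0..1."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b   # luma
    pb = 0.5 * (b - y) / (1.0 - 0.114)       # scaled blue difference
    pr = 0.5 * (r - y) / (1.0 - 0.299)       # scaled red difference
    return y, pb, pr

print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # white: full luma, zero color difference
print(rgb_to_ypbpr(0.0, 0.0, 1.0))  # pure blue: low luma, Pb at its maximum
```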


My advice was aimed at people living in 1980 who had a spare TV they could use for the computer. I suppose your advice is good for people living in 1990 who don’t have a computer monitor yet.

Well timed. The appnote [1] I referred to is a collection of video circuits published in January 1994. Linear's effort to republish its old application notes at the beginning of the 21st century is respectable; lots of valuable documentation and tutorials from the 80s are still available in high-quality digital formats.

Once the Hobart Phase kicks in and has a chance to work its magic for a few decades, your sagely advice will be relevant once again.

https://en.wikipedia.org/wiki/Counter-Clock_World#The_Hobart...


Wow so that episode of Red Dwarf was actually based on this? My respect for PKD rises.

I just happened to watch a YouTube video on this topic, and I can only wish I had understood it back in the 2000s, or even five years ago.

I get you're excited, but old school CRTs can easily kill a person with the discharge off a capacitor, even when they've been unplugged for a while. It may not be a good idea to casually encourage people to go start hacking on them.

Note that suggestions #1, #2 and #3 do not involve modifying the TV set (and the voltage is a small video signal), so they're risk-free and should already cover the vast majority of cases. As for #4 and #5, they are more of a possibility and require knowledge of the internal TV circuitry; those who already have the knowledge to do them should be well aware of the dangers involved.

Yeah, anything CRT is bad news to go tinkering with. There are high-voltage components, and the tubes can violently explode if you compromise them. I tossed a monitor into an empty dumpster once and very nearly soiled myself as it happened to fail just right. If you have one open and the CRT tube fails, you could very well be picking tiny shards of glass out of your person.

>relatively modern TV

Will accept HDMI. The OSSC is an HDMI-outputting device that handles analog input well, as long as it is RGB or YCbCr.

The future OSSC Pro will handle composite and S-video via an add-on board.


>This is not too hard for someone comfortable with a soldering iron, someone who isn’t scared of drilling a hole in their TV.

And who is comfortable working with EHT. Let's not forget that.


Event Horizon Telescope?

Extra High Tension

From roughly 10 kV in a B&W TV up to 30 kV in a color TV.

Non-negligible amperage too, so this is really dangerous stuff. Never tweak it while it is plugged in.


Actually, the high-voltage anode lead is very well insulated; if it isn't any more, you'll be sure to notice quickly (smell, arcing noises and lights, TV stops working eventually). However, in a typical CRT drive circuit you may still find somewhat high-power 100-200 V DC supplies (for the deflection drivers, and also what powers the LOPT) and low-power ~1-3 kV supplies for focus control and such.

Don't the (big) capacitors inside stay charged for a couple of hours after the TV is disconnected?

A decent TV/CRT monitor should have high-value bleed resistors to discharge the capacitors, but since high-voltage resistors cost money, they probably stopped fitting them about the same time as TVs stopped coming with circuit diagrams.

Even if they are installed, you can't know whether the resistor has failed open until you check. Always assume the tube is energized until you've grounded it.

In the later Apple ][ / C64 / etc. era, most of the color composite monitors also had that B&W switch built in. That way you could flip it when editing text vs. playing games. The one I had also had a green-screen button, which turned it into a green/black monitor.

The 8-Bit Guy did a great piece on CGA that goes into detail about how old developers abused the fuzzy, smeary nature of composite signals to output somewhat decent graphics. The tricks, however, didn't work when people upgraded to proper monitors, and as a result a lot of CGA games look much worse today than they did on the intended equipment of the day.

https://www.youtube.com/watch?v=niKblgZupOc


He also explained how to mod a consumer TV to use RGB input (provided there is no SCART input). And he explains how to discharge the capacitors.

https://www.youtube.com/watch?v=DLz6pgvsZ_I


I have wondered about this many times, ever since I was a kid, but even now I think I would have had a hard time asking the question right.

Thanks!


I didn't understand your explanation, but I think there is something (similar to car radios today) that detects whether you are on a signal or not. If you are not on a signal, or it's too weak, it turns the screen black automatically (or mutes the radio). I have seen this happen on a TV when a bad storm weakens the signal from an antenna, for example.

Thanks, I found your presentation even more interesting than the article!

Forward and backward compatibility around the transition to color television was one of the most amazing innovations ever.

Hiding the color subcarrier in-band, but largely invisible on black and white sets, was a work of collective engineering genius.


Your comment made me think of a personal anecdote. File it under "Why I love living in Silicon Valley".

Our neighborhood blocked off the street for a Fourth of July picnic a few years ago. Sitting in a lawn chair talking to a long-retired neighbor over a beer, we got to talking about old analog TV standards. (We are both hams, so conversation tended to turn to radio-ish things.)

I remarked about how clever I thought it was mixing the video carrier and audio subcarrier to get the audio IF. And he says:

"Oh, it wasn't always like that. When I was a kid, there was a TV station in Philadelphia that transmitted 2 1/2 hours two nights a week. They taped up the schematics for a receiver inside a big picture window at their station, and we would all go down with notebooks and copy down the latest changes so that we could update our receivers. The audio subcarrier went through a lot of changes. AM, FM, and they moved the frequency around."

So... I realized that here was a guy who was watching television back when the way you got a TV receiver was to build it from scratch yourself. Wow number 1.

I said: "I suppose when the war came along you were sent to radar school like almost everybody else that worked on television?"

Answer: "No. Did you ever hear of Eckert and Mauchly?"

Me: stunned.


If you don't already know about it, you might be interested in "From Dits to Bits", the autobiography of Herman Lukoff, an engineer who worked with Eckert and Mauchly.

It's hard to find these days, but a competent university library may have a copy (where I read it). This ref may be helpful: https://dl.acm.org/doi/book/10.5555/539966

A quick link to his papers: https://archives.upenn.edu/collections/finding-aid/upt50l694


So, let me get this straight - he worked on the design or construction of the ENIAC? :)

I believe so. Or the war-time version they did for the Army. I don't remember all the history of those machines exactly.

While not in Silicon Valley, but the other Valley down in Burbank, I had a similar conversation. I was introduced to this person as we had a common interest in the proper methods of 24fps->29.97fps->23.976 video conversions. He told me of the summer he and his brother created a new method to replace the old school flying spot scanning technique that was common at the time. The new process included the ability of one of the first video noise removal capabilities. The coolest part of the story to me was that the first TV broadcast from the moon was sent through the bit of equipment he had built before being broadcast to the public.

The lunar TV system was fascinating in itself - narrow bandwidth forced the use of "slow scan" at about 10Hz. The conversion was done by having a long-retention-phosphor TV and... pointing a regular TV camera at it.

Some related engineering genius: the artefacts created in the black and white picture by the colour signal are hardly noticeable, but they are enough to recover the colour from a black and white recording!

https://colour-recovery.fandom.com/wiki/The_Unofficial_Colou...


That's amazing! I wonder if there are still photographs floating around which could be colour decoded too.

Maybe a recorded broadcast of a photo?

Only if the photo itself was in colour and broadcast in colour. It's the other way around I was thinking of - a still shot (in black and white) of a PAL or NTSC signal would still contain enough information to decode the colour now, in retrospect.

For instance, a high resolution (still) film photo of a large high resolution TV from back in the day.


Though there was a time when presenters knew not to wear anything with narrow vertical or fine chequerboard patterns, because the resulting high-frequency luminance signal would bleed through into the colour signal, causing phantom colour effects. Interestingly, the effect occasionally occurs today, here in the UK, where people use old Sky boxes to provide an RF output to a TV in another room.

It wasn't that long ago - and it changed fashion: all those houndstooth jackets, paisley ties, etc. stopped being seen in public (on TV) and were replaced by solid colors.

Do you mean like when a moire pattern may sometimes appear to have a ghostly coloured after-image? I've seen this in old black-and-white films shown on colour TV. I've been trying to describe that to people to learn more about what it is but I always fail to. I feared there might be something wrong with my eyes. Is there a name for this effect?

For TVs, the colour comes about because the colour signal is hidden as a high-frequency component that older B&W TVs would ignore. Conversely, a high-frequency signal caused by closely spaced light and dark patterns fools colour TVs into thinking there's a colour signal present. (As the filters in TVs improved, the magnitude of the effect was reduced, but it's there to this day in analogue TV.)

What you're seeing is probably Fechner Color [0]. I've seen the effect myself, as I recall there was a bit of a craze for demoing it some years back.

[0] https://en.wikipedia.org/wiki/Fechner_color


Thanks. It could well be Fechner Colour.

Back in the 1980s, D-MAC was touted as the system that was going to eliminate that problem. Separation of analogue chrominance and luminance was specifically called out as a feature in popular science explanations.

I too hold this 30fps->29.97fps solution to be quite ingenious. I wonder whether, if they had had any insight into the future of broadcast formats, they might have chosen a different path? I have no idea if anything else was possible, but looking back Monday morning blah blah blah

This backwards compatibility has been the bane of my career. The 29.97 or 30000/1001 frequency, the interlacing, and the effects it had on 24fps film transfer has given me a niche career path, but one so frustrating when dealing with media content butchered by people that had zero understanding of the history of why film/video content was formatted the way it was.


And a curse long after frame modes lost any real-world purpose. The workflows ended up surviving much longer than the technology that justified them.

Well, let's talk about gamma correction - originally created because B/W TV tubes were not linear. Rather than doing the linear-to-tube conversion in each TV (which typically had ~10 tubes in it - think 10 transistors), they did the conversion in the TV studio for ALL the TV receivers and just standardized the non-linearity of the picture tube's response.

We kept that same gamma when we switched to digital color spaces (and MPEG compression). It's why noirish dark movies (think of Blade Runner, with people smoking in dark spaces) tend to come out blocky: because of gamma correction there are few digital codes available in the dark corner of the digital color cube, and the result tends to be blocky simply because the color space can't represent those colors... all because of a technical choice made in the late 30s to reduce the cost of TVs.
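The studio-side scheme can be sketched as follows; the 2.2 exponent is only an approximation of a real tube's curve, and the function names are mine:

```python
GAMMA = 2.2  # approximate exponent of a CRT's brightness-vs-voltage curve

def studio_encode(linear_light):
    """Gamma pre-correction, done once at the studio for all receivers."""
    return linear_light ** (1.0 / GAMMA)

def crt_display(voltage):
    """The picture tube's inherent nonlinearity decodes it for free --
    no correction circuitry needed in each individual set."""
    return voltage ** GAMMA

# End to end, the system comes out linear.
for light in (0.1, 0.5, 0.9):
    assert abs(crt_display(studio_encode(light)) - light) < 1e-9
print("round trip is linear")
```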


Yeah I vividly recall encountering the same front/back porch and horizontal and vertical synchronization pulses in an LVDS output to a touchscreen as would be seen in analog television. Simply because you would need that data to work with monitors that have backward compatibility. This was on the TI AM335X so a fairly recent microcontroller.

They even shimmed closed-caption data and widescreen signalling into that same signal they've been using since the 40s. It's honestly beautiful.

Not to mention timecode, Macrovision, and a slew of other proprietary information was stored in the vertical blanking part of the signal. I really miss the hackable nature of analog signals.

There are two other tricks like that: stereo radio transmission and stereo gramophone records.

They didn't do it without slightly modifying the existing standard (luckily still within tolerance). That is the source of that infernal 1000/1001 factor, which pushed field rates from 60 Hz to approximately 59.94 Hz.
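The arithmetic behind that factor can be checked exactly: NTSC color kept the 4.5 MHz sound intercarrier from the B&W standard and redefined the line rate as 4.5 MHz / 286, which drops the field rate from 60 Hz by exactly 1000/1001:

```python
from fractions import Fraction

# Line rate chosen so the chroma subcarrier, sound carrier and line
# frequency interleave with minimal visible beating.
line_rate  = Fraction(4_500_000, 286)      # ~15734.27 lines/s
field_rate = line_rate / Fraction(525, 2)  # 262.5 lines per interlaced field

print(field_rate)         # 60000/1001
print(float(field_rate))  # ~59.9400599...
print(field_rate / 60)    # the infernal 1000/1001
```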

Analog backward compatibility seems more complicated now.

After dealing with video, anything audio related seems way less complicated.

What a coincidence, just this morning I was reading a list of best opening sentences in novels, and smiled at this one:

"The sky above the port was the color of television, tuned to a dead channel." - William Gibson, Neuromancer.

But then I thought sadly that more and more people won't understand that reference as time goes on. I can't remember the last time I saw real channel snow on TV - probably the last time I owned a tube TV, which was easily a decade or so ago. Digital TVs automatically blank a bad channel out, so the experience of flipping to a blank channel and seeing/hearing snow is something few people will have nowadays. Like hearing a record scratch, a real bell on a telephone, or a fax screeching.


Chances are you have a different image in mind than what Gibson had, since the way a dead channel appears on a CRT has changed over the years, and Gibson mentioned that he was thinking of some very old TVs from his childhood. Here is a relevant thread on the sci-fi Stack Exchange, including a video of a TV from the era Gibson described:

https://scifi.stackexchange.com/questions/163304/what-color-...

But in practice I think everyone would understand even without having seen it - it is like phonographs/gramophones: most people nowadays haven't used one in real life, but they do know what they look like.


I must admit I ctrl-F'ed this thread to see if my question from scifi.stackexchange was here. Thanks for making my day :)

It was only 12 years later that Neil Gaiman wrote:

"The sky was the perfect untroubled blue of a television screen, tuned to a dead channel."

-"Neverwhere" (1996)


> Digital TVs will automatically blank the bad channel out

Some digital TVs will put fake snow up when the signal is lost. I think I've seen my Samsung do it. First it says "No signal" but I think if you leave it long enough it goes to snow.


The snow was one thing that was better about analog TV. A very weak signal would show up as snow with some hints of picture. You could reposition your antenna and see if you were making progress or not.

With digital, at least on my Samsung, you don't get anything unless you have a fairly decent signal. This makes positioning an antenna a lot harder.

In theory, this can be addressed by going to the signal information screen in the diagnostics. But on my Samsung it will only show signal information for channels that it has found via its "scan for channels" function.

If you tell it to tune to a channel by number, it will do so even if that channel has never been found via a scan, but then the signal information option is greyed out on the menu.

You can change channels while on the signal information screen, but only via the channel up/down buttons on the remote, and they only step through the scanned channels.

How did it not occur to Samsung that customers might want to see the signal strength for a channel that was manually tuned to?

Further increasing my annoyance with Samsung, I have a Samsung monitor (SyncMaster T240) as my second monitor on my iMac, in the same room as the TV. If the Samsung monitor is on, I cannot get upper VHF or lower UHF channels on the TV. That's because the Samsung monitor spews a lot of strong radio interference in the low to mid 200 MHz range and in the 400-500 MHz range. On an SDR spectrum analyzer this takes the form of a bunch of narrow tall spikes evenly spread across the band, close enough together that dozens of them stomp on any TV channel in those frequency ranges.


Theoretically this is why TV manufacturers include dB meters in almost all TVs now. My experience, however, is that those meters are not very useful, because they jump around like crazy and it's hard to get a sense of just how strong the signal is.

It also seems to me that while it was pretty easy to tolerate some fuzz on an old analog station, digital artifacts and stalls pretty much ruin a show. We lost some old marginal stations with the "upgrade" to digital TV. Digital is perfect when the signal is good, but degrades much worse than analog.


I've seen B&W snow but on the other hand, the description seems to apply equally to a bright uniform blue. However, I may have a different perspective because in the 80s and 90s, the "TV" in my house was a computer monitor connected to a VCR with a tuner, connected to a cheap antenna. Any time there was no signal, it was blue.

https://www.youtube.com/watch?v=hz54Ij1fYNQ

The thing is, it doesn't make sense to me in the way it was apparently intended, because I've never seen a sky that literally looks like B&W static. Today it's snowing where I am, but the sky itself is a uniform blank light gray, as I would expect.


The line is poetic for sure. It evokes not only the cold grey that you'd get if you averaged out the static, but also the emptiness of having no signal. You aren't supposed to read it literally, it is intended to capture the ennui of staring into an empty void in an uncaring world.

The static disappeared with the transition to DTV, not the switch away from CRTs. Analog signals can be "fuzzy", digital signals are either received or not.

Digital TV signals can have "blocky" artifacts when the reception is poor. I've seen them myself.

Digital reception will start to store the received signal within a buffer, and the decoder will interpret the signal from within that buffer. With poor reception of a digital signal, the buffer starts/stops being filled. Once the signal is back, it starts filling the buffer again with data missing from when the signal was lost. The blockiness you are seeing is the decoder attempting to decode that incomplete data.

The difference between a poor analog signal and a poor digital one is total. Analog was much more forgiving of less-than-full signal strength. With digital, the signal stops, starts, stops; there is no partial reception of the signal. Lost packets are not retransmitted like TCP/IP packets.


My old digital TV still has an analog coax input that displays snow, with very loud static, when I'm trying to switch inputs and don't get past it quickly enough.

This is exacerbated by the fact that it has no remote (possibly lost?) and only capacitive buttons on the front with nothing like a bump or a light to identify where the buttons are.

I hate that thing.


How do you know Gibson didn't mean a clear blue sky on a perfectly cloudless day? :)

When the topic came up, I showed my kids the black and white snow (or "war of the ants" as the local term would be translated) and a test pattern. On YouTube, of course. The idea of "there is nothing on" is really foreign.

I used to have a Philips 100Hz CRT TV, which used digital processing to 'double' the frame rate. One nice feature was it had a freeze-frame feature, back when such things were pretty rare.

So the party trick was to leave it on showing frozen black and white 'snow', and when someone mentioned it just say 'oh, it's stopped again' and bang on top of it with my fist. The snow would then start moving again, in the usual manner.

The trick was, of course, furtively pressing the 'unfreeze' button on the remote control in my other hand.


Just want to take this opportunity to remind everyone that a portion of the noise power in "black and white snow" is the Cosmic Microwave Background from the Big Bang.

http://www.astro.yale.edu/vdbosch/cmb_Osher.pdf


The first answer is right. I have some caveman-era video experience. Color is a clever add-on hack to the black-and-white transmission standard that allowed forward and backward compatibility with all sets.

It wasn't entirely black and white snow - there was a bit of pink and green too. It's been a while since I saw any, though.


I think I remember seeing colorful noise on top of a weak color signal. Makes sense given the color burst is there.

Wouldn't random colours average to greys to the HVS, pretty quickly, anyway?

No color burst on the back porch, so the color killer circuit disables color.
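A color killer can be sketched as a threshold on the measured burst amplitude (the threshold value and names below are illustrative, not from any real chassis):

```python
BURST_THRESHOLD = 0.1  # minimum burst amplitude to trust, arbitrary units

def chroma_enabled(back_porch_samples):
    """Color-killer sketch: enable the chroma path only if the back
    porch carries a sustained burst. Random noise almost never
    sustains one, so snow decodes as pure B&W."""
    amplitude = (max(back_porch_samples) - min(back_porch_samples)) / 2
    return amplitude >= BURST_THRESHOLD

print(chroma_enabled([0.3, -0.3, 0.3, -0.3]))  # real burst -> True
print(chroma_enabled([0.02, -0.01, 0.03]))     # noise      -> False (forced B&W)
```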

They originally did it all with vacuum tubes no less.

The short and academically correct answer: a random noise signal is random. The lack of organization means there is no separate color carrier present, and the set goes, as already answered, into B/W mode. For split seconds something that looks like a color carrier might randomly be present, and then there might be a quick colorish flicker.

Yes, your eyes are not deceiving you: while the generated signal is strictly B/W, there are reasons - technical and physiological - why the academic requirement of purely B/W noise doesn't really hold.

A little ramble on why we still sometimes see color in real random noise:

There are:

physiological reasons: 'optical illusion'

reasons that have to do with improper function of an analog TV

reasons that have to do with imperfect analog-to-digital conversion, which can make random noise appear tinted on a digital TV (a 'should not happen' that actually happens all the time on cheap TVs; there is a reason for 'cheap' and 'costly' digital TVs...)

Let's go - or just stop here if you like... everything has basically been said; this is just for your enjoyment:

pink artifacts:

'Snow', when we see it: our eyes add color tints, just as they add 'correlations'. Some see faces and stuff (TV people... maybe talk to someone about it... ;-D)

There are some optical illusions out there that move B/W patterns, and we (at least I do - don't have my bionic eyes yet...) see color even though there is none at all in the moving pattern. Google 'optical illusions'.

Technical reasons why 'should-be-B/W' noise may have some hues and colored pixels:

color in random noise on an analog color TV (or computer monitor):

A reminder: noise from 'no signal', or from a blank analog magnetic medium, cannot carry a color signal on an analog TV, since that would require the noise to be organized into frames of certain durations...

There are instances where a color TV will render random noise (b/w) as color:

The color convergence is out of alignment, and the three (R/G/B) guns focus slightly off-center of the color triads on the screen (instead of 100% R/G/B you get a little tint), either always, or per pixel and randomly. Stationary tints mean you need to demagnetize; random tints per pixel mean there is noise on the beam control.

"I see color noise on a digital TV..."

A digital TV has no analog noise of its own; it works or it doesn't.

A rendered analog signal (from a VCR empty tape or from an analog tuner / receiver) will be b/w at the origin.

Then the analog signal is converted to a digital signal and here trouble can lurk in the detail:

Before the conversion from analog to digital, the above rules for analog signals do apply: no blanking period means no color signal went in.

Your analog inputs to a digital TV or monitor can come from

dual tuners:

even if there are no analog TV stations, you might have an analog signal from an arcade game, from a camera abusing an analog TV channel, or from an old VCR you feed into your digital TV via analog channel 2 or 3 (you need to know what a rotary-dial phone is to understand this...).

Again: academically these should be strictly b/w after conversion from analog to digital.

Some reasons why conversion imprecision can add color to 'white' noise:

Bad analog-to-digital converters can include 'dust' in the rapidly generated digital pixel values.

That is a software issue in the converter. A 0 is not exactly zero, a 1 is a little less than all bits on, and some bits are sometimes picked in the middle. When I say '0' and '1' here and from here on, I am talking about 'all bits 0' and 'all bits 1' in the per-pixel color-depth value.

Bad video A/D converter designs use FP units on fast, cheap general-purpose chips. Again: 0 is never exactly zero, and 1 is at most 0.999999. They do some mantissa trickery to process, e.g., a 24-bit pixel signal through an FP unit, which is a cheaper (and faster, vectorized) general-purpose chip than a real video converter chip.

This creates a hue that differs per pixel and per instance (the same digital pixel lights up in a slightly different hue each time it is used).




