The TV has automatic gain control. When the signal is weak, it will amplify it up to the right level. If the signal is just background noise, then it will amplify that background noise until it’s at the “right level” for a proper TV signal. So, the gain is lower for stronger signals, and very high when there is no signal at all.
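A minimal sketch of the idea (all numbers invented for illustration; a real AGC is an analog feedback loop, not per-sample division):

    # Toy model of a TV's automatic gain control (AGC).
    TARGET_LEVEL = 1.0   # the "right level" the TV wants to see
    MAX_GAIN = 1000.0    # the amplifier can't boost forever

    def agc_gain(measured_level: float) -> float:
        """Gain needed to bring the measured signal up to the target."""
        if measured_level <= TARGET_LEVEL / MAX_GAIN:
            return MAX_GAIN          # essentially no signal: gain pegged at max
        return TARGET_LEVEL / measured_level

    for level, label in [(1.0, "strong station"),
                         (0.1, "weak station"),
                         (0.001, "background noise only")]:
        print(f"{label}: gain = {agc_gain(level):g}")

Strong station gets gain 1, the noise-only case gets the maximum boost, which is exactly why a dead channel looks like full-strength snow.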
Fun fact: B&W TVs are more suitable as computer terminals because they have more bandwidth for the luma signal. Color TVs have to filter out the chroma signal, which reduces the bandwidth for the luma (a sketch of what that filter costs you is below). If you are stuck with a color TV for your terminal, consider modifying it to accept a signal after the low-pass filter. This is not too hard for someone who is comfortable with a soldering iron and isn't scared of drilling a hole in their TV.
Thank you for coming to my TED talk.
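To make the bandwidth point concrete, here is a rough sketch (my own illustration; the Q and filter topology are invented, not from any real chassis) of the notch a color set puts in the luma path around the 3.58 MHz NTSC chroma subcarrier. A B&W set skips this and keeps all that detail:

    # Sketch of the chroma notch filter in a color TV's luma path.
    import numpy as np
    from scipy import signal

    FS = 14_318_182.0        # sample rate: 4x the NTSC color subcarrier
    F_SC = 3_579_545.0       # NTSC chroma subcarrier (Hz)

    # A notch centered on the subcarrier; Q chosen arbitrarily for the demo.
    b, a = signal.iirnotch(w0=F_SC, Q=2.0, fs=FS)

    # How much luma detail survives at a few frequencies:
    freqs = np.array([0.5e6, 2.0e6, 3.0e6, F_SC, 4.0e6])
    _, h = signal.freqz(b, a, worN=freqs, fs=FS)
    for f, mag in zip(freqs, np.abs(h)):
        print(f"{f/1e6:5.2f} MHz: {20*np.log10(mag + 1e-12):6.1f} dB")

Everything near the subcarrier, which is where your terminal's fine text detail lives, gets attenuated.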
When a VCR is receiving a Macrovision signal to record, the AGC (which, unlike the TV's, does pay attention to the vertical blanking interval at the start of each frame) amplifies the signal, causing the visible lines to be heavily altered in an erratic fashion. This sometimes appears as "snow" on the duplicated tape.
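As I understand it (a toy model, not Macrovision's actual spec; pulse heights and timing are invented), the trick is extra pseudo-sync/AGC pulses hidden in the blanking interval. The recording VCR's AGC samples them, thinks the signal is too hot, and keeps yanking the gain around:

    # Toy model of why Macrovision confuses a recording VCR's AGC.
    import random

    NORMAL_PEAK = 0.7          # peak video level of an ordinary frame
    MACROVISION_PULSE = 1.4    # bogus AGC pulse hidden in the blanking interval
    TARGET = 0.7

    def vcr_agc_gain(frame_peak: float) -> float:
        # The VCR measures the peak including the blanking interval.
        return TARGET / frame_peak

    # Macrovision varies the pulse height frame to frame, so the gain
    # (and hence picture brightness on the copy) wobbles erratically.
    random.seed(1)
    for frame in range(5):
        pulse = MACROVISION_PULSE * random.uniform(0.5, 1.0)
        peak = max(NORMAL_PEAK, pulse)
        print(f"frame {frame}: gain = {vcr_agc_gain(peak):.2f} "
              f"(picture at {vcr_agc_gain(peak) * NORMAL_PEAK:.2f} of normal)")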
Some modern "VHS-To-PC" device drivers/conversion software will detect a Macrovision signal and refuse to accept it.
It was defeated in numerous ways, but the hassle-free way to duplicate a Macrovision tape was simply to use an older VCR without an AGC circuit.
Growing up we had a store-brand Montgomery Ward VCR with manual gain controls (video and audio). Great for copying tapes from Blockbuster.
Even better, if you want a color terminal, feed the original RGB signal (or an approximation of it) directly to your TV; don't use an RF modulator or a composite input, so there's minimal degradation of video quality. It should be easy on a relatively modern TV. Follow this flowchart.
1. Is it a European TV with a SCART connector? If yes, it already has RGB input.
2. Does the TV include a YPbPr component input? If yes, read Linear Technology Application Note #57, which shows you a circuit to convert RGB to YPbPr using the LT6550 chip (or find a similar commercial converter box). The math the circuit implements is sketched after this list.
3. Does the TV have an S-Video input? If yes, read the Analog Devices AD725 datasheet and use this chip to convert RGB to the separate luma/chroma signals S-Video expects (or find a similar commercial converter box).
4. Does the TV have an onscreen display (i.e. menus overlaid on top of the image)? If yes, there already exists an RGB signal inside the TV for feeding the OSD. You can hack your TV and expose that RGB input for your own use. Caution: high voltage; discharge the tube, and never work on it with the power on.
5. If everything else fails, don't forget that a CRT's electron guns are ultimately driven by an RGB signal. You can design a simple driving amplifier to feed the RGB signal directly to the electron guns. Unfortunately, I cannot find a reference design at the moment, but it's doable.
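For step 2, the RGB-to-YPbPr conversion is just a fixed linear matrix. A minimal sketch using the standard-definition Rec. 601 coefficients (my own illustration of the math, not the app note's actual circuit):

    # RGB -> YPbPr using Rec. 601 coefficients (SDTV). Inputs are 0..1.
    def rgb_to_ypbpr(r: float, g: float, b: float) -> tuple[float, float, float]:
        y = 0.299 * r + 0.587 * g + 0.114 * b   # luma
        pb = 0.564 * (b - y)                    # blue color difference
        pr = 0.713 * (r - y)                    # red color difference
        return y, pb, pr

    print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # white: (1.0, 0.0, 0.0)
    print(rgb_to_ypbpr(1.0, 0.0, 0.0))  # pure red: Y = 0.299, Pr high

A hardware converter box realizes these same weighted sums in the analog domain.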
Or: the TV will accept HDMI. The OSSC is an HDMI-outputting device that handles analog input well, as long as it is RGB or YPbPr.
The future OSSC Pro will handle composite and S-Video via an add-on board.
And who is comfortable working with EHT (extra-high tension). Let's not forget that.
From roughly 10 kV in a B&W TV up to 30 kV in a color TV.
Non-negligible current, too, so this is really dangerous stuff. Never tweak it while it is plugged in.
Hiding the color subcarrier in-band, yet largely invisible on black-and-white sets, was a work of collective engineering genius.
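The trick, as I understand it: the subcarrier was placed at an odd multiple (455) of half the line rate, so its phase inverts on every successive scan line and the dot pattern visually averages away on a B&W set. A quick back-of-envelope check with the NTSC numbers:

    # Why the NTSC color subcarrier is nearly invisible on a B&W set.
    f_line = 15_734.2657           # NTSC horizontal line rate (Hz)
    f_sc = 455 * f_line / 2        # color subcarrier, ~3.579545 MHz

    cycles_per_line = f_sc / f_line
    print(f"subcarrier      : {f_sc:.0f} Hz")
    print(f"cycles per line : {cycles_per_line}")  # 227.5
    # The leftover half cycle means the subcarrier's phase flips on each
    # successive line, so the interference pattern cancels out to the eye.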
Our neighborhood blocked off the street for a Fourth of July picnic a few years ago. Sitting in a lawn chair talking to a long-retired neighbor over a beer, we got to talking about old analog TV standards. (We are both hams, so conversation tended to turn to radio-ish things.)
I remarked about how clever I thought it was mixing the video carrier and audio subcarrier to get the audio IF. And he says:
"Oh, it wasn't always like that. When I was a kid, there was a TV station in Philadelphia that transmitted 2 1/2 hours two nights a week. They taped up the schematics for a receiver inside a big picture window at their station, and we would all go down with notebooks and copy down the latest changes so that we could update our receivers. The audio subcarrier went through a lot of changes. AM, FM, and they moved the frequency around."
So... I realized that here was a guy who was watching television back when the way you got a TV receiver was to build it from scratch yourself. Wow number 1.
I said: "I suppose when the war came along you were sent to radar school like almost everybody else that worked on television?"
Answer: "No. Did you ever hear of Eckert and Mauchly?"
It's hard to find these days, but a competent university library may have a copy (where I read it). This ref may be helpful: https://dl.acm.org/doi/book/10.5555/539966
A quick link to his papers: https://archives.upenn.edu/collections/finding-aid/upt50l694
For instance, a high resolution (still) film photo of a large high resolution TV from back in the day.
What you're seeing is probably Fechner color. I've seen the effect myself; as I recall there was a bit of a craze for demoing it some years back.
This backwards compatibility has been the bane of my career. The 29.97 (30000/1001) frame rate, the interlacing, and the effects they had on 24 fps film transfer have given me a niche career path, but one that is deeply frustrating when dealing with media content butchered by people who had zero understanding of the history of why film/video content was formatted the way it was.
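For anyone wondering where 30000/1001 comes from: when color was added, the line rate was nudged so the chroma subcarrier and the 4.5 MHz sound carrier wouldn't beat against each other visibly, which dragged the frame rate from 30 down to 29.97. A quick derivation:

    # Where 29.97 (30000/1001) comes from.
    f_sound = 4.5e6                 # sound carrier offset (Hz), kept from B&W
    f_line = f_sound / 286          # new color line rate
    fps = f_line / 525              # 525 lines per frame

    print(f"line rate : {f_line:.4f} Hz")    # ~15734.2657
    print(f"frame rate: {fps:.6f} fps")      # 29.970030
    print(f"30000/1001 = {30000/1001:.6f}")  # identical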
We kept that same gamma when we switched to digital color spaces (and MPEG compression). It's why noirish dark movies (think of Blade Runner, with people smoking in dark rooms) tend to look blocky: because of gamma correction there are few digital codes available in the dark corner of the digital color cube, and the color space simply can't represent those colors well. All because of a technical choice made in the late '30s to reduce the cost of TVs.
"The sky above the port was the color of television, tuned to a dead channel." - William Gibson, Neuromancer.
But then I thought sadly, that more and more people won't understand that reference as time goes on. I can't remember the last time I saw real channel snow on TV. Probably the last time I owned a tube TV, which is easily a decade or so now... Digital TVs will automatically blank the bad channel out, so the experience of flipping to a blank channel and seeing/hearing snow is something few people will experience nowadays. Like hearing a record scratch, a real bell on a telephone or a fax screeching.
But in practice I think everyone would understand it even without having seen it firsthand. It's like phonographs/gramophones: most people nowadays haven't used one in real life, but they do know what they look like.
"The sky was the perfect untroubled blue of a television screen, tuned to a dead channel."
Some digital TVs will put fake snow up when the signal is lost. I think I've seen my Samsung do it. First it says "No signal" but I think if you leave it long enough it goes to snow.
With digital, at least on my Samsung, you don't get anything unless you have a fairly decent signal. This makes positioning an antenna a lot harder.
In theory, this can be addressed by going to the signal information screen in the diagnostics. But on my Samsung it will only show signal information for channels that it has found via its "scan for channels" function.
If you tell it to tune to a channel by number, it will do so even if that channel has never been found via a scan, but then the signal information option is greyed out on the menu.
You can change channels while on the signal information screen, but only via the channel up/down buttons on the remote, and they only step through the scanned channels.
How did it not occur to Samsung that customers might want to see the signal strength for a channel that was manually tuned to?
Further increasing my annoyance with Samsung, I have a Samsung monitor (SyncMaster T240) as my second monitor on my iMac, in the same room as the TV. If the Samsung monitor is on, I cannot get upper VHF or lower UHF channels on the TV. That's because the Samsung monitor spews a lot of strong radio interference in the low-to-mid 200 MHz range and in the 400-500 MHz range. On an SDR spectrum analyzer this takes the form of a bunch of narrow tall spikes evenly spread across the band, close enough together that dozens of them stomp on any TV channel in those frequency ranges.
It also seems to me that while it was pretty easy to tolerate some fuzz on an old analog station, digital artifacts and stalls pretty much ruin a show. We lost some old marginal stations with the "upgrade" to digital TV. Digital is perfect when the signal is good, but degrades much worse than analog.
The thing is, it doesn't make sense to me in the way it was apparently intended, because I've never seen a sky that literally looks like B&W static. Today it's snowing where I am, but the sky itself is a uniform blank light gray, as I would expect.
Poor analog reception and poor digital reception are just totally different. Analog was much more forgiving of less-than-full signal strength. With digital, the signal stops, starts, stops; there is no partial reception. Lost packets are not re-transmitted the way TCP/IP packets are.
This is exacerbated by the fact that it has no remote (possibly lost?) and only capacitive buttons on the front with nothing like a bump or a light to identify where the buttons are.
I hate that thing.
So the party trick was to leave it on showing frozen black and white 'snow', and when someone mentioned it just say 'oh, it's stopped again' and bang on top of it with my fist. The snow would then start moving again, in the usual manner.
The trick was, of course, furtively pressing the 'unfreeze' button on the remote control in my other hand.
Yes, your eyes do not deceive you:
While the generated signal is strictly b/w, there are reasons the textbook expectation of pure b/w noise doesn't hold in practice: technical reasons and physiological reasons.
A little ramble on why we sometimes still see color in real random noise:
physiological reasons: 'optical illusion'
reasons that have to do with improper function of an analog TV
reasons that have to do with improper analog-to-digital conversion, which can make random noise appear tinted on a digital TV (a 'should not happen' that actually happens all the time on cheap TVs; there is a reason for 'cheap' and 'costly' digital TVs...)
Let's go, or just stop here if you like... everything has basically been said, this is just for your enjoyment:
'snow', when we see it: our eyes do add color tints, as they do add 'correlations'. Some see faces and stuff (TV people... maybe talk to someone about it... ;-D)
there are some optical illusions out there that move b/w patterns, and we (at least I do; I don't have my bionic eyes yet...) see color even though there is none at all in the moving pattern. Google 'optical illusions'.
Technical reasons why 'should be b/w noise' may have some hues and colored pixels:
color in random noise on an analog color TV (or computer monitor):
A reminder: noise from 'no signal', or from a blank analog magnetic medium, on an analog TV cannot carry a color signal, which would require the noise to be organized into frames of specific timing...
There are instances where a color TV will render random noise (b/w) as color:
The color convergence is out of alignment and the three (R/G/B) guns land slightly off-center on the phosphor triads on the screen (instead of 100% R/G/B you get a little tint), either always, or per pixel and randomly. Stationary tints mean you need to degauss; random tints per pixel mean there is noise on the beam control.
"I see color noise on a digital TV..."
A digital TV has no noise of its own; it either works or it doesn't.
A rendered analog signal (from a VCR's blank tape or from an analog tuner/receiver) will be b/w at its origin.
Then the analog signal is converted to a digital signal and here trouble can lurk in the detail:
Before the conversion from analog to digital, the above rules for analog signals do apply: no blanking period means no color signal went in.
Your analog inputs to a digital TV or monitor can come from several places: even if there are no analog TV stations, you might have an analog signal from an arcade game, from a camera commandeering an analog TV channel, or from an old VCR you feed into your digital TV via analog channel 2 or 3 (you need to know what a rotary dial phone is to understand this...).
Again: academically these should be strictly b/w after conversion from analog to digital.
Some ways conversion imprecision can add color to 'white' noise:
Bad analog-to-digital converters can include 'dust' in the rapidly generated digital pixel values.
That is a software issue in the converter: a 0 is not exactly zero, a 1 is a little less than all bits on, and sometimes bits in the middle get picked. (Here and from here on, '0' and '1' mean 'all bits 0' and 'all bits 1' in the per-pixel color value.)
Bad video A/D converter designs use FP units on fast, cheap general-purpose chips: again, 0 is never exactly zero and 1 is at most 0.999999. They do some mantissa trickery to push, e.g., a 24-bit pixel signal through an FP unit, because a general-purpose chip is cheaper (and faster, vectorized) than a real video converter chip.
This creates a hue that is different per pixel and per instance (the same digital pixel lights up in slightly different hues each time it is used).
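A toy illustration of the kind of thing being described (the per-channel error terms are invented; this is not any particular converter's behavior): push a neutral gray through a sloppy float pipeline where each channel rounds slightly differently, and the output is no longer neutral:

    # Toy model of how sloppy float math in a video A/D path can tint
    # neutral (gray) noise. The error magnitudes are made up.
    import random

    random.seed(42)

    def sloppy_convert(value_8bit: int, channel_error: float) -> int:
        # Normalize, pass through an imprecise float stage, re-quantize.
        x = value_8bit / 255.0
        x = x * (0.999999 + channel_error)   # "1" is never quite 1.0
        return round(x * 255.0)

    for _ in range(3):
        gray = random.randint(0, 255)        # one "pixel" of b/w noise
        r = sloppy_convert(gray, random.uniform(-0.004, 0.004))
        g = sloppy_convert(gray, random.uniform(-0.004, 0.004))
        b = sloppy_convert(gray, random.uniform(-0.004, 0.004))
        tag = "" if r == g == b else "  <- tinted!"
        print(f"in gray={gray:3d} -> out RGB=({r},{g},{b}){tag}")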